otokonoko's secret base semireal_v4
sample prompt (negative prompt is empty): 1boy, solo, light smile, black hair, playing game around their secret base,
Sep.14th,2024
Semi-real model (checkpoint) for SD1.5, specialized for trap/femboy/otoko no ko; version 4.
Merge recipe (for supermerger):
MBW + elemental merge
Elemental merge
A: otokonoko-secret-base-semireal-mix_v2
B: realisticFitGirls_nellie
otokonoko-secret-base-semireal-mix_v2 x 1 + realisticFitGirls_nellie x 0
OUT11:ff.net norm2 2.bias 2.weight:1
OUT10:NOT in_layers.0 attn1.to_out.0 attn2:1
OUT09:NOT out_layers.3 skip_connection proj_in proj_out attn1 attn2 ff.net.0.proj:1
OUT08:NOT out_layers.3 skip_connection norm. attn1 attn2 ff.net.0.proj:1
OUT07:in_layers.0 attn1.to_k.weight ff.net.0.proj:1
OUT06:in_layers.2 out_layers.0 attn1.to_k.weight attn1.to_q.weight attn1.to_v.weight attn2.to_k.weight norm2:1
OUT05:emb_layers.1 in_layers out_layers.0 norm proj_out:1
OUT04:emb_layers.1 in_layers.0 out_layers.0 norm. norm1:1
OUT03:in_layers.0 out_layers attn1.to_out.0 attn1.to_q.weight attn1.to_v.weight attn2.to_k.weight attn2.to_out.0 attn2.to_q.weight norm3:1
OUT02:NOT in_layers.2 out_layers.3 skip_connection conv:1
OUT01:NOT in_layers.2 out_layers.3 skip_connection:1
OUT00:NOT in_layers.2 out_layers.3 skip_connection:1
M00:0.emb_layers.1 0.in_layers 0.out_layers.0 1.norm. 1.proj_out attn1.to_k.weight norm1:1
IN11:NOT in_layers.2 out_layers.3:1
IN10:NOT in_layers.2 out_layers.3:1
IN08:in_layers.2 out_layers norm. attn1.to_k.weight norm1 norm2 norm3:1
IN07:emb_layers.1 in_layers.0 out_layers.0 norm. attn1.to_v.weight norm1 norm2 norm3:1
IN05:emb_layers.1 in_layers.0 out_layers.0 skip_connection proj_in proj_out attn1.to_k.weight:1
IN04:in_layers out_layers.3 skip_connection norm. proj_in proj_out attn1.to_out.0 attn1.to_q.weight norm1 norm2 norm3:1
IN02:out_layers.0 skip_connection attn1.to_k.weight attn1.to_q.weight attn2.to_k.weight ff.net.2 norm1 norm3:1
IN01:skip_connection proj_out attn1.to_k.weight attn1.to_q.weight attn1.to_v.weight attn2.to_out.0 attn2.to_q.weight ff.net.2 norm3:1
-> A
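The elemental lines above override the base ratio ("A x 1 + B x 0") per tensor: each line names a block (e.g. OUT07), a list of element patterns, and a ratio, and a leading NOT inverts the pattern match. A minimal sketch of that selection logic, assuming supermerger's substring-style matching; the helper name and block-to-key mapping (OUT07 → `output_blocks.7`) are illustrative, not supermerger's actual API:

```python
# Sketch of supermerger-style elemental ratio selection for SD1.5 UNet keys.
# Each rule is (block_prefix, patterns, invert, ratio); a rule applies when the
# key belongs to the block and (some pattern occurs in the key) != invert,
# so "NOT" rules apply to every element that matches none of the patterns.

def elemental_ratio(key: str, rules, default: float) -> float:
    """Return the merge ratio (weight of model B) for one tensor key."""
    ratio = default
    for block, patterns, invert, r in rules:
        if block not in key:
            continue
        hit = any(p in key for p in patterns)
        if hit != invert:
            ratio = r  # later rules override earlier ones
    return ratio

# Two rules transcribed from the recipe lines above:
#   OUT07:in_layers.0 attn1.to_k.weight ff.net.0.proj:1
#   OUT10:NOT in_layers.0 attn1.to_out.0 attn2:1
rules = [
    ("output_blocks.7.", ["in_layers.0", "attn1.to_k.weight", "ff.net.0.proj"], False, 1.0),
    ("output_blocks.10.", ["in_layers.0", "attn1.to_out.0", "attn2"], True, 1.0),
]

# With the base ratio 0, only the listed elements take model B's weights:
key = "model.diffusion_model.output_blocks.7.0.in_layers.0.weight"
assert elemental_ratio(key, rules, default=0.0) == 1.0
```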
MBW + Elemental merge
A: otokonoko-secret-base-semireal-mix_v2
B: chaosdreamRealistic_v10
otokonoko-secret-base-semireal-mix_v2 x (1-alpha) + chaosdreamRealistic_v10 x alpha (0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0)
OUT11:emb_layers.1:0
OUT11:in_layers.0:1
OUT11:in_layers.2:1
OUT11:out_layers.0:1
OUT11:out_layers.3:0
OUT11:skip_connection:0
OUT11:norm.:0
OUT11:proj_in:0
OUT11:proj_out:1
OUT11:transformer_blocks.0.attn1.to_k.weight:1
OUT11:transformer_blocks.0.attn1.to_out.0:1
OUT11:transformer_blocks.0.attn1.to_q.weight:0
OUT11:transformer_blocks.0.attn1.to_v.weight:0
OUT11:transformer_blocks.0.attn2.to_k.weight:0
OUT11:transformer_blocks.0.attn2.to_out.0:1
OUT11:transformer_blocks.0.attn2.to_q.weight:0
OUT11:transformer_blocks.0.attn2.to_v.weight:1
OUT11:transformer_blocks.0.ff.net.0.proj:0
OUT11:transformer_blocks.0.ff.net.2:1
OUT11:transformer_blocks.0.norm1:0
OUT11:transformer_blocks.0.norm2:1
OUT11:transformer_blocks.0.norm3:1
OUT11:0.bias:0
OUT11:0.weight:0
OUT11:2.bias:1
OUT11:2.weight:1
OUT10::1
OUT09:emb_layers.1:1
OUT09:in_layers.0:1
OUT09:in_layers.2:1
OUT09:out_layers.0:1
OUT09:out_layers.3:1
OUT09:skip_connection:1
OUT09:norm.:1
OUT09:proj_in:1
OUT09:proj_out:1
OUT09:transformer_blocks.0.attn1.to_k.weight:1
OUT09:transformer_blocks.0.attn1.to_out.0:1
OUT09:transformer_blocks.0.attn1.to_q.weight:1
OUT09:transformer_blocks.0.attn1.to_v.weight:1
OUT09:transformer_blocks.0.attn2.to_k.weight:0
OUT09:transformer_blocks.0.attn2.to_out.0:0
OUT09:transformer_blocks.0.attn2.to_q.weight:0
OUT09:transformer_blocks.0.attn2.to_v.weight:1
OUT09:transformer_blocks.0.ff.net.0.proj:1
OUT09:transformer_blocks.0.ff.net.2:1
OUT09:transformer_blocks.0.norm1:1
OUT09:transformer_blocks.0.norm2:1
OUT09:transformer_blocks.0.norm3:1
OUT08:emb_layers.1:0
OUT08:in_layers.0:1
OUT08:in_layers.2:0
OUT08:out_layers.0:1
OUT08:out_layers.3:0
OUT08:skip_connection:0
OUT08:norm.:1
OUT08:proj_in:1
OUT08:proj_out:0
OUT08:transformer_blocks.0.attn1.to_k.weight:0
OUT08:transformer_blocks.0.attn1.to_out.0:0
OUT08:transformer_blocks.0.attn1.to_q.weight:0
OUT08:transformer_blocks.0.attn1.to_v.weight:0
OUT08:transformer_blocks.0.attn2.to_k.weight:1
OUT08:transformer_blocks.0.attn2.to_out.0:0
OUT08:transformer_blocks.0.attn2.to_q.weight:0
OUT08:transformer_blocks.0.attn2.to_v.weight:0
OUT08:transformer_blocks.0.ff.net.0.proj:0
OUT08:transformer_blocks.0.ff.net.2:1
OUT08:transformer_blocks.0.norm1:1
OUT08:transformer_blocks.0.norm2:1
OUT08:transformer_blocks.0.norm3:1
OUT08:conv:0
OUT07:emb_layers.1:1
OUT07:in_layers.0:1
OUT07:in_layers.2:1
OUT07:out_layers.0:0
OUT07:out_layers.3:0
OUT07:skip_connection:0
OUT07:norm.:1
OUT07:proj_in:0
OUT07:proj_out:0
OUT07:transformer_blocks.0.attn1.to_k.weight:0
OUT07:transformer_blocks.0.attn1.to_out.0:0
OUT07:transformer_blocks.0.attn1.to_q.weight:0
OUT07:transformer_blocks.0.attn1.to_v.weight:0
OUT07:transformer_blocks.0.attn2.to_k.weight:0
OUT07:transformer_blocks.0.attn2.to_out.0:0
OUT07:transformer_blocks.0.attn2.to_q.weight:1
OUT07:transformer_blocks.0.attn2.to_v.weight:0
OUT07:transformer_blocks.0.ff.net.0.proj:0
OUT07:transformer_blocks.0.ff.net.2:0
OUT07:transformer_blocks.0.norm1:0
OUT07:transformer_blocks.0.norm2:0
OUT07:transformer_blocks.0.norm3:0
OUT06:emb_layers.1:0
OUT06:in_layers.0:0
OUT06:in_layers.2:0
OUT06:out_layers.0:0
OUT06:out_layers.3:0
OUT06:skip_connection:0
OUT06:norm.:0
OUT06:proj_in:0
OUT06:proj_out:0
OUT06:transformer_blocks.0.attn1.to_k.weight:0
OUT06:transformer_blocks.0.attn1.to_out.0:0
OUT06:transformer_blocks.0.attn1.to_q.weight:0
OUT06:transformer_blocks.0.attn1.to_v.weight:0
OUT06:transformer_blocks.0.attn2.to_k.weight:1
OUT06:transformer_blocks.0.attn2.to_out.0:0
OUT06:transformer_blocks.0.attn2.to_q.weight:1
OUT06:transformer_blocks.0.attn2.to_v.weight:0
OUT06:transformer_blocks.0.ff.net.0.proj:0
OUT06:transformer_blocks.0.ff.net.2:0
OUT06:transformer_blocks.0.norm1:0
OUT06:transformer_blocks.0.norm2:0
OUT06:transformer_blocks.0.norm3:0
OUT05:emb_layers.1:0
OUT05:in_layers.0:0
OUT05:in_layers.2:0
OUT05:out_layers.0:0
OUT05:out_layers.3:0
OUT05:skip_connection:0
OUT05:norm.:1
OUT05:proj_in:0
OUT05:proj_out:0
OUT05:transformer_blocks.0.attn1.to_k.weight:0
OUT05:transformer_blocks.0.attn1.to_out.0:0
OUT05:transformer_blocks.0.attn1.to_q.weight:0
OUT05:transformer_blocks.0.attn1.to_v.weight:0
OUT05:transformer_blocks.0.attn2.to_k.weight:0
OUT05:transformer_blocks.0.attn2.to_out.0:0
OUT05:transformer_blocks.0.attn2.to_q.weight:0
OUT05:transformer_blocks.0.attn2.to_v.weight:0
OUT05:transformer_blocks.0.ff.net.0.proj:0
OUT05:transformer_blocks.0.ff.net.2:0
OUT05:transformer_blocks.0.norm1:0
OUT05:transformer_blocks.0.norm2:0
OUT05:transformer_blocks.0.norm3:0
OUT05:conv:0
OUT04:emb_layers.1:0
OUT04:in_layers.0:0
OUT04:in_layers.2:1
OUT04:out_layers.0:0
OUT04:out_layers.3:0
OUT04:skip_connection:0
OUT04:norm.:0
OUT04:proj_in:0
OUT04:proj_out:0
OUT04:transformer_blocks.0.attn1.to_k.weight:0
OUT04:transformer_blocks.0.attn1.to_out.0:0
OUT04:transformer_blocks.0.attn1.to_q.weight:0
OUT04:transformer_blocks.0.attn1.to_v.weight:0
OUT04:transformer_blocks.0.attn2.to_k.weight:0
OUT04:transformer_blocks.0.attn2.to_out.0:0
OUT04:transformer_blocks.0.attn2.to_q.weight:0
OUT04:transformer_blocks.0.attn2.to_v.weight:0
OUT04:transformer_blocks.0.ff.net.0.proj:0
OUT04:transformer_blocks.0.ff.net.2:0
OUT04:transformer_blocks.0.norm1:1
OUT04:transformer_blocks.0.norm2:1
OUT04:transformer_blocks.0.norm3:0
OUT03:emb_layers.1:0
OUT03:in_layers.0:0
OUT03:in_layers.2:0
OUT03:out_layers.0:1
OUT03:out_layers.3:0
OUT03:skip_connection:0
OUT03:norm.:1
OUT03:proj_in:0
OUT03:proj_out:0
OUT03:transformer_blocks.0.attn1.to_k.weight:0
OUT03:transformer_blocks.0.attn1.to_out.0:0
OUT03:transformer_blocks.0.attn1.to_q.weight:0
OUT03:transformer_blocks.0.attn1.to_v.weight:0
OUT03:transformer_blocks.0.attn2.to_k.weight:0
OUT03:transformer_blocks.0.attn2.to_out.0:1
OUT03:transformer_blocks.0.attn2.to_q.weight:0
OUT03:transformer_blocks.0.attn2.to_v.weight:0
OUT03:transformer_blocks.0.ff.net.0.proj:0
OUT03:transformer_blocks.0.ff.net.2:0
OUT03:transformer_blocks.0.norm1:1
OUT03:transformer_blocks.0.norm2:1
OUT03:transformer_blocks.0.norm3:1
OUT01:emb_layers.1:0
OUT01:in_layers.0:1
OUT01:in_layers.2:0
OUT01:out_layers.0:0
OUT01:out_layers.3:0
OUT01:skip_connection:0
OUT00:emb_layers.1:0
OUT00:in_layers.0:0
OUT00:in_layers.2:0
OUT00:out_layers.0:1
OUT00:out_layers.3:0
OUT00:skip_connection:0
M00:0.emb_layers.1:0
M00:0.in_layers.0:1
M00:0.in_layers.2:0
M00:0.out_layers.0:1
M00:0.out_layers.3:1
M00:1.norm.:0
M00:1.proj_in:0
M00:1.proj_out:0
M00:1.transformer_blocks.0.attn1.to_k.weight:0
M00:1.transformer_blocks.0.attn1.to_out.0:0
M00:1.transformer_blocks.0.attn1.to_q.weight:0
M00:1.transformer_blocks.0.attn1.to_v.weight:0
M00:1.transformer_blocks.0.attn2.to_k.weight:0
M00:1.transformer_blocks.0.attn2.to_out.0:0
M00:1.transformer_blocks.0.attn2.to_q.weight:0
M00:1.transformer_blocks.0.attn2.to_v.weight:0
M00:1.transformer_blocks.0.ff.net.0.proj:0
M00:1.transformer_blocks.0.ff.net.2:1
M00:1.transformer_blocks.0.norm1:1
M00:1.transformer_blocks.0.norm2:0
M00:1.transformer_blocks.0.norm3:0
M00:2.emb_layers.1:1
M00:2.in_layers.0:1
M00:2.in_layers.2:0
M00:2.out_layers.0:0
M00:2.out_layers.3:0
IN11:emb_layers.1:0
IN11:in_layers.0:0
IN11:in_layers.2:0
IN11:out_layers.0:1
IN11:out_layers.3:0
IN10:emb_layers.1:0
IN10:in_layers.0:0
IN10:in_layers.2:0
IN10:out_layers.0:0
IN10:out_layers.3:0
IN09:op:0
IN08:emb_layers.1:0
IN08:in_layers.0:1
IN08:in_layers.2:0
IN08:out_layers.0:1
IN08:out_layers.3:0
IN08:skip_connection:1
IN08:norm.:1
IN08:proj_in:0
IN08:proj_out:0
IN08:transformer_blocks.0.attn1.to_k.weight:0
IN08:transformer_blocks.0.attn1.to_out.0:0
IN08:transformer_blocks.0.attn1.to_q.weight:0
IN08:transformer_blocks.0.attn1.to_v.weight:0
IN08:transformer_blocks.0.attn2.to_k.weight:0
IN08:transformer_blocks.0.attn2.to_out.0:0
IN08:transformer_blocks.0.attn2.to_q.weight:0
IN08:transformer_blocks.0.attn2.to_v.weight:0
IN08:transformer_blocks.0.ff.net.0.proj:0
IN08:transformer_blocks.0.ff.net.2:0
IN08:transformer_blocks.0.norm1:1
IN08:transformer_blocks.0.norm2:1
IN08:transformer_blocks.0.norm3:1
IN07:emb_layers.1:0
IN07:in_layers.0:0
IN07:in_layers.2:0
IN07:out_layers.0:0
IN07:out_layers.3:0
IN07:skip_connection:0
IN07:norm.:0
IN07:proj_in:0
IN07:proj_out:0
IN07:transformer_blocks.0.attn1.to_k.weight:0
IN07:transformer_blocks.0.attn1.to_out.0:0
IN07:transformer_blocks.0.attn1.to_q.weight:0
IN07:transformer_blocks.0.attn1.to_v.weight:0
IN07:transformer_blocks.0.attn2.to_k.weight:0
IN07:transformer_blocks.0.attn2.to_out.0:0
IN07:transformer_blocks.0.attn2.to_q.weight:0
IN07:transformer_blocks.0.attn2.to_v.weight:0
IN07:transformer_blocks.0.ff.net.0.proj:0
IN07:transformer_blocks.0.ff.net.2:0
IN07:transformer_blocks.0.norm1:0
IN07:transformer_blocks.0.norm2:0
IN07:transformer_blocks.0.norm3:1
IN06:op:0
IN05:emb_layers.1:1
IN05:in_layers.0:1
IN05:in_layers.2:0
IN05:out_layers.0:0
IN05:out_layers.3:0
IN05:skip_connection:1
IN05:norm.:1
IN05:proj_in:0
IN05:proj_out:0
IN05:transformer_blocks.0.attn1.to_k.weight:0
IN05:transformer_blocks.0.attn1.to_out.0:0
IN05:transformer_blocks.0.attn1.to_q.weight:0
IN05:transformer_blocks.0.attn1.to_v.weight:0
IN05:transformer_blocks.0.attn2.to_k.weight:0
IN05:transformer_blocks.0.attn2.to_out.0:0
IN05:transformer_blocks.0.attn2.to_q.weight:0
IN05:transformer_blocks.0.attn2.to_v.weight:0
IN05:transformer_blocks.0.ff.net.0.proj:0
IN05:transformer_blocks.0.ff.net.2:0
IN05:transformer_blocks.0.norm1:1
IN05:transformer_blocks.0.norm2:1
IN05:transformer_blocks.0.norm3:1
IN04:emb_layers.1:0
IN04:in_layers.0:0
IN04:in_layers.2:0
IN04:out_layers.0:0
IN04:out_layers.3:0
IN04:skip_connection:0
IN04:norm.:0
IN04:proj_in:0
IN04:proj_out:0
IN04:transformer_blocks.0.attn1.to_k.weight:0
IN04:transformer_blocks.0.attn1.to_out.0:0
IN04:transformer_blocks.0.attn1.to_q.weight:0
IN04:transformer_blocks.0.attn1.to_v.weight:0
IN04:transformer_blocks.0.attn2.to_k.weight:1
IN04:transformer_blocks.0.attn2.to_out.0:0
IN04:transformer_blocks.0.attn2.to_q.weight:1
IN04:transformer_blocks.0.attn2.to_v.weight:0
IN04:transformer_blocks.0.ff.net.0.proj:0
IN04:transformer_blocks.0.ff.net.2:0
IN04:transformer_blocks.0.norm1:1
IN04:transformer_blocks.0.norm2:0
IN04:transformer_blocks.0.norm3:1
IN03:op:0
IN02:emb_layers.1:1
IN02:in_layers.0:0
IN02:in_layers.2:0
IN02:out_layers.0:0
IN02:out_layers.3:0
IN02:skip_connection:1
IN02:norm.:0
IN02:proj_in:0
IN02:proj_out:0
IN02:transformer_blocks.0.attn1.to_k.weight:1
IN02:transformer_blocks.0.attn1.to_out.0:0
IN02:transformer_blocks.0.attn1.to_q.weight:0
IN02:transformer_blocks.0.attn1.to_v.weight:0
IN02:transformer_blocks.0.attn2.to_k.weight:1
IN02:transformer_blocks.0.attn2.to_out.0:1
IN02:transformer_blocks.0.attn2.to_q.weight:0
IN02:transformer_blocks.0.attn2.to_v.weight:1
IN02:transformer_blocks.0.ff.net.0.proj:0
IN02:transformer_blocks.0.ff.net.2:1
IN02:transformer_blocks.0.norm1:0
IN02:transformer_blocks.0.norm2:0
IN02:transformer_blocks.0.norm3:1
IN01:emb_layers.1:0
IN01:in_layers.0:0
IN01:in_layers.2:1
IN01:out_layers.0:1
IN01:out_layers.3:0
IN01:skip_connection:1
IN01:norm.:1
IN01:proj_in:0
IN01:proj_out:0
IN01:transformer_blocks.0.attn1.to_k.weight:0
IN01:transformer_blocks.0.attn1.to_out.0:0
IN01:transformer_blocks.0.attn1.to_q.weight:0
IN01:transformer_blocks.0.attn1.to_v.weight:0
IN01:transformer_blocks.0.attn2.to_k.weight:1
IN01:transformer_blocks.0.attn2.to_out.0:0
IN01:transformer_blocks.0.attn2.to_q.weight:1
IN01:transformer_blocks.0.attn2.to_v.weight:0
IN01:transformer_blocks.0.ff.net.0.proj:0
IN01:transformer_blocks.0.ff.net.2:0
IN01:transformer_blocks.0.norm1:0
IN01:transformer_blocks.0.norm2:0
IN01:transformer_blocks.0.norm3:0
IN00::0
-> B
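The MBW part of the step above interpolates per UNet block. A sketch under the usual supermerger convention, which is an assumption here: the 26 weights map to BASE, IN00-IN11, M00, OUT00-OUT11 in order, and each tensor is merged as A*(1-alpha) + B*alpha with the alpha of its block. Scalars stand in for tensors, and `block_of` is a simplified key-to-block mapping, not supermerger's own code:

```python
# MBW (merge block weighted) sketch: 26 per-block alphas, linear interpolation.
BLOCKS = (["BASE"] + [f"IN{i:02d}" for i in range(12)] + ["M00"]
          + [f"OUT{i:02d}" for i in range(12)])

def block_of(key: str) -> str:
    """Map a state-dict key to its MBW block (simplified name matching)."""
    for marker, prefix in (("input_blocks.", "IN"), ("output_blocks.", "OUT")):
        if marker in key:
            return f"{prefix}{int(key.split(marker)[1].split('.')[0]):02d}"
    return "M00" if "middle_block." in key else "BASE"

def mbw_merge(a: dict, b: dict, weights) -> dict:
    """Per-block interpolation: A*(1-alpha) + B*alpha."""
    alphas = dict(zip(BLOCKS, weights))
    return {k: (1 - alphas[block_of(k)]) * a[k] + alphas[block_of(k)] * b[k]
            for k in a}

# A weight vector with a single 1.0 hands exactly one block to model B
# wholesale, before the elemental overrides adjust individual tensors:
w = [0.0] * 26
w[BLOCKS.index("OUT02")] = 1.0
a = {"model.diffusion_model.output_blocks.2.0.in_layers.0.weight": 1.0}
b = {"model.diffusion_model.output_blocks.2.0.in_layers.0.weight": 3.0}
assert mbw_merge(a, b, w) == b
```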
MBW + Elemental merge
A: otokonoko-secret-base-semireal-mix_v2
B: aoimix_v1Asian
otokonoko-secret-base-semireal-mix_v2 x (1-alpha) + aoimix_v1Asian x alpha (0.0,0.0,1.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0)
OUT10:NOT emb_layers.1 in_layers.0 out_layers.0 attn1.to_out.0 attn1.to_q.weight attn1.to_v.weight attn2.to_k.weight attn2.to_out.0 attn2.to_q.weight ff.net.0.proj norm2:1
OUT08:emb_layers.1 in_layers.2 out_layers.3 skip_connection proj_in proj_out attn1.to_k.weight attn1.to_out.0:1
OUT07:proj_in attn1.to_k.weight ff.net.0.proj:1
OUT06:NOT attn1.to_k.weight attn2.to_k.weight attn2.to_out.0 attn2.to_q.weight ff.net norm1 norm2 norm3:1
OUT05:attn2:1
OUT04:emb_layers.1 in_layers.2 out_layers.0 skip_connection norm. proj_in attn2.to_q.weight:1
OUT03:in_layers out_layers.0 skip_connection ff.net.2 norm1 norm2 norm3:1
OUT01:in_layers out_layers.3 skip_connection:1
OUT00:in_layers.2 skip_connection:1
M00:0.emb_layers.1 1.norm. proj_out attn1.to_k.weight attn1.to_out.0 attn1.to_q.weight norm1 norm2 norm3 2.emb_layers.1 2.in_layers.0 2.out_layers.0:1
IN11:emb_layers.1 in_layers.0 out_layers.0:1
IN09:op:1
IN08:NOT proj_out attn1.to_v.weight attn2.to_k.weight ff.net:1
IN07:norm. norm1 norm2 norm3:1
IN05:emb_layers.1 in_layers out_layers skip_connection norm. proj_in attn1.to_k.weight attn1.to_out.0 attn1.to_q.weight attn2.to_k.weight:1
IN02:in_layers.0 out_layers skip_connection norm. proj_in attn1.to_k.weight attn1.to_q.weight attn1.to_v.weight attn2.to_k.weight ff.net.2 norm1:1
-> C
Elemental merge
A: otokonoko-secret-base-semireal-mix_v2
B: syluxMixV1_v10
otokonoko-secret-base-semireal-mix_v2 x 1 + syluxMixV1_v10 x 0
OUT11:emb_layers.1:1
OUT11:in_layers.0:1
OUT11:in_layers.2:1
OUT11:out_layers.0:1
OUT11:out_layers.3:0
OUT11:skip_connection:0
OUT11:norm.:0
OUT11:proj_in:0
OUT11:proj_out:0
OUT11:transformer_blocks.0.attn1.to_k.weight:0
OUT11:transformer_blocks.0.attn1.to_out.0:1
OUT11:transformer_blocks.0.attn1.to_q.weight:0
OUT11:transformer_blocks.0.attn1.to_v.weight:0
OUT11:transformer_blocks.0.attn2.to_k.weight:0
OUT11:transformer_blocks.0.attn2.to_out.0:1
OUT11:transformer_blocks.0.attn2.to_q.weight:0
OUT11:transformer_blocks.0.attn2.to_v.weight:0
OUT11:transformer_blocks.0.ff.net.0.proj:0
OUT11:transformer_blocks.0.ff.net.2:1
OUT11:transformer_blocks.0.norm1:0
OUT11:transformer_blocks.0.norm2:1
OUT11:transformer_blocks.0.norm3:0
OUT11:0.bias:0
OUT11:0.weight:0
OUT11:2.bias:0
OUT11:2.weight:0
OUT09:emb_layers.1:1
OUT09:in_layers.0:0
OUT09:in_layers.2:0
OUT09:out_layers.0:0
OUT09:out_layers.3:0
OUT09:skip_connection:0
OUT09:norm.:0
OUT09:proj_in:0
OUT09:proj_out:0
OUT09:transformer_blocks.0.attn1.to_k.weight:0
OUT09:transformer_blocks.0.attn1.to_out.0:0
OUT09:transformer_blocks.0.attn1.to_q.weight:0
OUT09:transformer_blocks.0.attn1.to_v.weight:0
OUT09:transformer_blocks.0.attn2.to_k.weight:0
OUT09:transformer_blocks.0.attn2.to_out.0:0
OUT09:transformer_blocks.0.attn2.to_q.weight:0
OUT09:transformer_blocks.0.attn2.to_v.weight:0
OUT09:transformer_blocks.0.ff.net.0.proj:0
OUT09:transformer_blocks.0.ff.net.2:0
OUT09:transformer_blocks.0.norm1:0
OUT09:transformer_blocks.0.norm2:0
OUT09:transformer_blocks.0.norm3:0
OUT08:emb_layers.1:0
OUT08:in_layers.0:0
OUT08:in_layers.2:0
OUT08:out_layers.0:0
OUT08:out_layers.3:0
OUT08:skip_connection:0
OUT08:norm.:0
OUT08:proj_in:0
OUT08:proj_out:0
OUT08:transformer_blocks.0.attn1.to_k.weight:1
OUT08:transformer_blocks.0.attn1.to_out.0:0
OUT08:transformer_blocks.0.attn1.to_q.weight:0
OUT08:transformer_blocks.0.attn1.to_v.weight:0
OUT08:transformer_blocks.0.attn2.to_k.weight:0
OUT08:transformer_blocks.0.attn2.to_out.0:0
OUT08:transformer_blocks.0.attn2.to_q.weight:0
OUT08:transformer_blocks.0.attn2.to_v.weight:0
OUT08:transformer_blocks.0.ff.net.0.proj:0
OUT08:transformer_blocks.0.ff.net.2:0
OUT08:transformer_blocks.0.norm1:0
OUT08:transformer_blocks.0.norm2:0
OUT08:transformer_blocks.0.norm3:0
OUT08:conv:0
OUT07:emb_layers.1:0
OUT07:in_layers.0:0
OUT07:in_layers.2:0
OUT07:out_layers.0:0
OUT07:out_layers.3:0
OUT07:skip_connection:0
OUT07:norm.:0
OUT07:proj_in:0
OUT07:proj_out:0
OUT07:transformer_blocks.0.attn1.to_k.weight:0
OUT07:transformer_blocks.0.attn1.to_out.0:0
OUT07:transformer_blocks.0.attn1.to_q.weight:0
OUT07:transformer_blocks.0.attn1.to_v.weight:0
OUT07:transformer_blocks.0.attn2.to_k.weight:0
OUT07:transformer_blocks.0.attn2.to_out.0:0
OUT07:transformer_blocks.0.attn2.to_q.weight:0
OUT07:transformer_blocks.0.attn2.to_v.weight:0
OUT07:transformer_blocks.0.ff.net.0.proj:0
OUT07:transformer_blocks.0.ff.net.2:0
OUT07:transformer_blocks.0.norm1:0
OUT07:transformer_blocks.0.norm2:0
OUT07:transformer_blocks.0.norm3:0
OUT06:emb_layers.1:0
OUT06:in_layers.0:0
OUT06:in_layers.2:0
OUT06:out_layers.0:0
OUT06:out_layers.3:0
OUT06:skip_connection:0
OUT06:norm.:0
OUT06:proj_in:1
OUT06:proj_out:1
OUT06:transformer_blocks.0.attn1.to_k.weight:0
OUT06:transformer_blocks.0.attn1.to_out.0:0
OUT06:transformer_blocks.0.attn1.to_q.weight:0
OUT06:transformer_blocks.0.attn1.to_v.weight:1
OUT06:transformer_blocks.0.attn2.to_k.weight:0
OUT06:transformer_blocks.0.attn2.to_out.0:0
OUT06:transformer_blocks.0.attn2.to_q.weight:0
OUT06:transformer_blocks.0.attn2.to_v.weight:0
OUT06:transformer_blocks.0.ff.net.0.proj:0
OUT06:transformer_blocks.0.ff.net.2:0
OUT06:transformer_blocks.0.norm1:1
OUT06:transformer_blocks.0.norm2:1
OUT06:transformer_blocks.0.norm3:1
OUT05:emb_layers.1:1
OUT05:in_layers.0:1
OUT05:in_layers.2:1
OUT05:out_layers.0:1
OUT05:out_layers.3:1
OUT05:skip_connection:1
OUT05:norm.:1
OUT05:proj_in:0
OUT05:proj_out:0
OUT05:transformer_blocks.0.attn1.to_k.weight:0
OUT05:transformer_blocks.0.attn1.to_out.0:0
OUT05:transformer_blocks.0.attn1.to_q.weight:0
OUT05:transformer_blocks.0.attn1.to_v.weight:0
OUT05:transformer_blocks.0.attn2.to_k.weight:0
OUT05:transformer_blocks.0.attn2.to_out.0:0
OUT05:transformer_blocks.0.attn2.to_q.weight:0
OUT05:transformer_blocks.0.attn2.to_v.weight:0
OUT05:transformer_blocks.0.ff.net.0.proj:0
OUT05:transformer_blocks.0.ff.net.2:0
OUT05:transformer_blocks.0.norm1:1
OUT05:transformer_blocks.0.norm2:1
OUT05:transformer_blocks.0.norm3:1
OUT05:conv:1
OUT04:emb_layers.1:1
OUT04:in_layers.0:0
OUT04:in_layers.2:0
OUT04:out_layers.0:0
OUT04:out_layers.3:0
OUT04:skip_connection:0
OUT04:norm.:1
OUT04:proj_in:0
OUT04:proj_out:0
OUT04:transformer_blocks.0.attn1.to_k.weight:0
OUT04:transformer_blocks.0.attn1.to_out.0:0
OUT04:transformer_blocks.0.attn1.to_q.weight:0
OUT04:transformer_blocks.0.attn1.to_v.weight:0
OUT04:transformer_blocks.0.attn2.to_k.weight:0
OUT04:transformer_blocks.0.attn2.to_out.0:0
OUT04:transformer_blocks.0.attn2.to_q.weight:0
OUT04:transformer_blocks.0.attn2.to_v.weight:0
OUT04:transformer_blocks.0.ff.net.0.proj:0
OUT04:transformer_blocks.0.ff.net.2:0
OUT04:transformer_blocks.0.norm1:1
OUT04:transformer_blocks.0.norm2:1
OUT04:transformer_blocks.0.norm3:0
OUT03:emb_layers.1:0
OUT03:in_layers.0:0
OUT03:in_layers.2:0
OUT03:out_layers.0:1
OUT03:out_layers.3:0
OUT03:skip_connection:0
OUT03:norm.:1
OUT03:proj_in:0
OUT03:proj_out:0
OUT03:transformer_blocks.0.attn1.to_k.weight:1
OUT03:transformer_blocks.0.attn1.to_out.0:0
OUT03:transformer_blocks.0.attn1.to_q.weight:0
OUT03:transformer_blocks.0.attn1.to_v.weight:0
OUT03:transformer_blocks.0.attn2.to_k.weight:0
OUT03:transformer_blocks.0.attn2.to_out.0:0
OUT03:transformer_blocks.0.attn2.to_q.weight:0
OUT03:transformer_blocks.0.attn2.to_v.weight:0
OUT03:transformer_blocks.0.ff.net.0.proj:0
OUT03:transformer_blocks.0.ff.net.2:0
OUT03:transformer_blocks.0.norm1:0
OUT03:transformer_blocks.0.norm2:1
OUT03:transformer_blocks.0.norm3:1
OUT02:emb_layers.1:0
OUT02:in_layers.0:0
OUT02:in_layers.2:0
OUT02:out_layers.0:0
OUT02:out_layers.3:0
OUT02:skip_connection:0
OUT02:conv:0
OUT01:emb_layers.1:0
OUT01:in_layers.0:0
OUT01:in_layers.2:0
OUT01:out_layers.0:0
OUT01:out_layers.3:0
OUT01:skip_connection:0
OUT00:emb_layers.1:0
OUT00:in_layers.0:0
OUT00:in_layers.2:0
OUT00:out_layers.0:0
OUT00:out_layers.3:1
OUT00:skip_connection:0
M00:0.emb_layers.1:1
M00:0.in_layers.0:1
M00:0.in_layers.2:0
M00:0.out_layers.0:0
M00:0.out_layers.3:0
M00:1.norm.:0
M00:1.proj_in:0
M00:1.proj_out:0
M00:1.transformer_blocks.0.attn1.to_k.weight:0
M00:1.transformer_blocks.0.attn1.to_out.0:0
M00:1.transformer_blocks.0.attn1.to_q.weight:0
M00:1.transformer_blocks.0.attn1.to_v.weight:0
M00:1.transformer_blocks.0.attn2.to_k.weight:0
M00:1.transformer_blocks.0.attn2.to_out.0:0
M00:1.transformer_blocks.0.attn2.to_q.weight:0
M00:1.transformer_blocks.0.attn2.to_v.weight:0
M00:1.transformer_blocks.0.ff.net.0.proj:0
M00:1.transformer_blocks.0.ff.net.2:0
M00:1.transformer_blocks.0.norm1:1
M00:1.transformer_blocks.0.norm2:1
M00:1.transformer_blocks.0.norm3:1
M00:2.emb_layers.1:0
M00:2.in_layers.0:0
M00:2.in_layers.2:0
M00:2.out_layers.0:0
M00:2.out_layers.3:0
IN11:emb_layers.1:0
IN11:in_layers.0:0
IN11:in_layers.2:0
IN11:out_layers.0:1
IN11:out_layers.3:0
IN10:emb_layers.1:1
IN10:in_layers.0:0
IN10:in_layers.2:0
IN10:out_layers.0:0
IN10:out_layers.3:0
IN09:op:0
IN08:emb_layers.1:0
IN08:in_layers.0:0
IN08:in_layers.2:0
IN08:out_layers.0:0
IN08:out_layers.3:0
IN08:skip_connection:0
IN08:norm.:0
IN08:proj_in:0
IN08:proj_out:0
IN08:transformer_blocks.0.attn1.to_k.weight:0
IN08:transformer_blocks.0.attn1.to_out.0:0
IN08:transformer_blocks.0.attn1.to_q.weight:0
IN08:transformer_blocks.0.attn1.to_v.weight:0
IN08:transformer_blocks.0.attn2.to_k.weight:0
IN08:transformer_blocks.0.attn2.to_out.0:0
IN08:transformer_blocks.0.attn2.to_q.weight:0
IN08:transformer_blocks.0.attn2.to_v.weight:0
IN08:transformer_blocks.0.ff.net.0.proj:0
IN08:transformer_blocks.0.ff.net.2:0
IN08:transformer_blocks.0.norm1:1
IN08:transformer_blocks.0.norm2:0
IN08:transformer_blocks.0.norm3:0
IN07:emb_layers.1:0
IN07:in_layers.0:0
IN07:in_layers.2:0
IN07:out_layers.0:0
IN07:out_layers.3:0
IN07:skip_connection:0
IN07:norm.:0
IN07:proj_in:0
IN07:proj_out:0
IN07:transformer_blocks.0.attn1.to_k.weight:0
IN07:transformer_blocks.0.attn1.to_out.0:0
IN07:transformer_blocks.0.attn1.to_q.weight:0
IN07:transformer_blocks.0.attn1.to_v.weight:0
IN07:transformer_blocks.0.attn2.to_k.weight:0
IN07:transformer_blocks.0.attn2.to_out.0:0
IN07:transformer_blocks.0.attn2.to_q.weight:0
IN07:transformer_blocks.0.attn2.to_v.weight:0
IN07:transformer_blocks.0.ff.net.0.proj:0
IN07:transformer_blocks.0.ff.net.2:0
IN07:transformer_blocks.0.norm1:0
IN07:transformer_blocks.0.norm2:0
IN07:transformer_blocks.0.norm3:0
IN06:op:0
IN05:emb_layers.1:0
IN05:in_layers.0:1
IN05:in_layers.2:0
IN05:out_layers.0:1
IN05:out_layers.3:0
IN05:skip_connection:1
IN05:norm.:0
IN05:proj_in:0
IN05:proj_out:0
IN05:transformer_blocks.0.attn1.to_k.weight:1
IN05:transformer_blocks.0.attn1.to_out.0:0
IN05:transformer_blocks.0.attn1.to_q.weight:0
IN05:transformer_blocks.0.attn1.to_v.weight:0
IN05:transformer_blocks.0.attn2.to_k.weight:0
IN05:transformer_blocks.0.attn2.to_out.0:0
IN05:transformer_blocks.0.attn2.to_q.weight:0
IN05:transformer_blocks.0.attn2.to_v.weight:0
IN05:transformer_blocks.0.ff.net.0.proj:0
IN05:transformer_blocks.0.ff.net.2:0
IN05:transformer_blocks.0.norm1:1
IN05:transformer_blocks.0.norm2:0
IN05:transformer_blocks.0.norm3:0
IN04:emb_layers.1:0
IN04:in_layers.0:0
IN04:in_layers.2:0
IN04:out_layers.0:0
IN04:out_layers.3:0
IN04:skip_connection:0
IN04:norm.:0
IN04:proj_in:0
IN04:proj_out:0
IN04:transformer_blocks.0.attn1.to_k.weight:0
IN04:transformer_blocks.0.attn1.to_out.0:0
IN04:transformer_blocks.0.attn1.to_q.weight:0
IN04:transformer_blocks.0.attn1.to_v.weight:0
IN04:transformer_blocks.0.attn2.to_k.weight:0
IN04:transformer_blocks.0.attn2.to_out.0:0
IN04:transformer_blocks.0.attn2.to_q.weight:0
IN04:transformer_blocks.0.attn2.to_v.weight:0
IN04:transformer_blocks.0.ff.net.0.proj:0
IN04:transformer_blocks.0.ff.net.2:1
IN04:transformer_blocks.0.norm1:0
IN04:transformer_blocks.0.norm2:0
IN04:transformer_blocks.0.norm3:0
IN03:op:0
IN02:emb_layers.1:0
IN02:in_layers.0:0
IN02:in_layers.2:1
IN02:out_layers.0:0
IN02:out_layers.3:0
IN02:skip_connection:1
IN02:norm.:0
IN02:proj_in:0
IN02:proj_out:0
IN02:transformer_blocks.0.attn1.to_k.weight:0
IN02:transformer_blocks.0.attn1.to_out.0:0
IN02:transformer_blocks.0.attn1.to_q.weight:0
IN02:transformer_blocks.0.attn1.to_v.weight:0
IN02:transformer_blocks.0.attn2.to_k.weight:0
IN02:transformer_blocks.0.attn2.to_out.0:0
IN02:transformer_blocks.0.attn2.to_q.weight:1
IN02:transformer_blocks.0.attn2.to_v.weight:0
IN02:transformer_blocks.0.ff.net.0.proj:0
IN02:transformer_blocks.0.ff.net.2:0
IN02:transformer_blocks.0.norm1:0
IN02:transformer_blocks.0.norm2:1
IN02:transformer_blocks.0.norm3:0
IN01:emb_layers.1:0
IN01:in_layers.0:0
IN01:in_layers.2:0
IN01:out_layers.0:0
IN01:out_layers.3:0
IN01:skip_connection:1
IN01:norm.:0
IN01:proj_in:0
IN01:proj_out:0
IN01:transformer_blocks.0.attn1.to_k.weight:0
IN01:transformer_blocks.0.attn1.to_out.0:0
IN01:transformer_blocks.0.attn1.to_q.weight:0
IN01:transformer_blocks.0.attn1.to_v.weight:0
IN01:transformer_blocks.0.attn2.to_k.weight:0
IN01:transformer_blocks.0.attn2.to_out.0:0
IN01:transformer_blocks.0.attn2.to_q.weight:0
IN01:transformer_blocks.0.attn2.to_v.weight:0
IN01:transformer_blocks.0.ff.net.0.proj:0
IN01:transformer_blocks.0.ff.net.2:0
IN01:transformer_blocks.0.norm1:0
IN01:transformer_blocks.0.norm2:0
IN01:transformer_blocks.0.norm3:1
IN00::0
-> D
triple sum
D x -1 + C x 1 + B x 1
-> E
C x -1 + D x 1 + B x 1
-> F
A x -1 + B x 1 + C x 1
-> G
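Each of the three-model sums above has coefficients totaling 1 (-1 + 1 + 1), so they are add-difference merges in disguise: for example, E = B + (C - D). A minimal sketch with scalars standing in for tensors:

```python
# Linear combination of state dicts; because the coefficients sum to 1,
# "D x -1 + C x 1 + B x 1" is equivalent to B plus the difference (C - D).

def linear_sum(coeffs_models):
    """Sum c_i * model_i over the shared set of keys."""
    keys = coeffs_models[0][1].keys()
    return {k: sum(c * m[k] for c, m in coeffs_models) for k in keys}

B = {"w": 2.0}
C = {"w": 5.0}
D = {"w": 4.0}
E = linear_sum([(-1.0, D), (1.0, C), (1.0, B)])  # D x -1 + C x 1 + B x 1
assert E["w"] == B["w"] + (C["w"] - D["w"])  # add-difference view of the same sum
```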
Automerge (triple sum)
BASE: otokonoko-secret-base-semireal-mix_v3.safetensors
A: E
B: F
Text Encoder: 0.03875220182141603 (beta): 0.5308011920116564
Unet: [0.34021,0.91867,0.85377,0.25792,0.54982,0.82135,0.14058,0.09644,0.05686,0.84635,0.53843,0.34813,0.44800,0.27635,0.65344,0.75621,0.35122,0.85256,0.20701,0.96390,0.33374,0.10616,0.39868,0.51263,0.97148]
beta: [0.07921,0.34016,0.67765,0.22431,0.86667,0.16542,0.74430,0.78042,0.57138,0.80406,0.75323,0.06256,0.91250,0.99311,0.90435,0.73734,0.24236,0.18824,0.03431,0.39357,0.41758,0.74082,0.06481,0.67429,0.28595]
-> H
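A plausible reading of these Automerge steps, stated here as an assumption rather than supermerger's documented behavior: each is a triple sum BASE*(1 - alpha - beta) + A*alpha + B*beta, with one (alpha, beta) pair for the text encoder and one per UNet block (the 25 listed values covering IN00-IN11, M00, OUT00-OUT11). A one-tensor sketch of that formula:

```python
def triple_sum(base: float, a: float, b: float, alpha: float, beta: float) -> float:
    """One tensor's merge under the assumed formula BASE*(1-alpha-beta) + A*alpha + B*beta."""
    return base * (1 - alpha - beta) + a * alpha + b * beta

# With alpha = beta = 0.5 the BASE term vanishes and A and B are averaged:
assert triple_sum(10.0, 2.0, 4.0, 0.5, 0.5) == 3.0
# With beta = 0 it degenerates to a plain two-model interpolation:
assert triple_sum(10.0, 2.0, 4.0, 0.25, 0.0) == 8.0
```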
BASE: otokonoko-secret-base-semireal-mix_v3.safetensors
A: G
B: E
best score: 0.47007594040284556
Text Encoder: 0.7420431787229468 (beta): 0.01231069067457713
Unet: [0.84575,0.73948,0.47977,0.30135,0.90921,0.05428,0.97686,0.05404,0.72837,0.93789,0.79190,0.12184,0.08773,0.77615,0.73348,0.49570,0.30365,0.77406,0.26801,0.34967,0.82517,0.51515,0.41934,0.31408,0.05834]
beta: [0.06349,0.02390,0.46547,0.85676,0.90323,0.93590,0.77836,0.71765,0.11191,0.40041,0.10161,0.81416,0.44256,0.14426,0.42450,0.81324,0.72221,0.92838,0.39904,0.77984,0.73841,0.36544,0.12059,0.83840,0.95843]
-> I
BASE: otokonoko-secret-base-semireal-mix_v3.safetensors
A: G
B: E
Text Encoder: 0.201512778733328 (beta): 0.39412072122813513
Unet: [0.93833,0.19855,0.79529,0.82809,0.24539,0.86785,0.67109,0.98634,0.39248,0.34531,0.46492,0.78631,0.29698,0.38897,0.51185,0.66543,0.05099,0.26588,0.14134,0.40607,0.51686,0.47360,0.76646,0.44647,0.17860]
beta: [0.57737,0.92845,0.11049,0.79594,0.97508,0.74118,0.07102,0.13934,0.76618,0.04411,0.28695,0.44725,0.66752,0.74964,0.74384,0.95856,0.82877,0.61748,0.31968,0.00021,0.38138,0.47904,0.06326,0.88030,0.02327]
-> J
BASE: H
A: I
B: J
Text Encoder: 0.2588833698094659 (beta): 0.5641609616268523
Unet: [0.98686,0.96684,0.96039,0.92923,0.59602,0.38548,0.37115,0.55681,0.87190,0.98331,0.75018,0.48088,0.38593,0.98141,0.00720,0.42322,0.04285,0.11588,0.26396,0.98806,0.15259,0.85913,0.19947,0.96126,0.15852]
beta: [0.20089,0.12688,0.71303,0.89244,0.09426,0.56121,0.08034,0.13752,0.00430,0.17179,0.01820,0.64070,0.70330,0.55442,0.27579,0.04465,0.31771,0.53924,0.09435,0.81457,0.25725,0.17980,0.24745,0.27349,0.50954]
-> K
MBW
K x (1-alpha) + real_model_N x alpha (0.0,1.0,0.6,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0)
-> L
Automerge (triple sum)
BASE: L
A: otokonoko-secret-base-semireal-mix_v3.safetensors
B: K
Text Encoder: 0.21880113464734166 (beta): 0.7575120092035388
Unet: [0.13401,0.56025,0.90851,0.51918,0.98051,0.95255,0.91346,0.08047,0.61222,0.34481,0.08821,0.41845,0.00643,0.01860,0.26430,0.18762,0.00819,0.32160,0.95536,0.49778,0.55641,0.05603,0.32808,0.17249,0.60647]
beta: [0.03588,0.08299,0.28922,0.43944,0.47845,0.72615,0.96740,0.08624,0.83121,0.06913,0.30102,0.57011,0.63715,0.01368,0.59036,0.30198,0.59828,0.59528,0.74419,0.76726,0.67732,0.45506,0.86220,0.34206,0.16428]
-> M
BASE: L
A: otokonoko-secret-base-semireal-mix_v3.safetensors
B: K
Text Encoder: 0.9381644592596778 (beta): 0.6449940363272264
Unet: [0.93455,0.99149,0.76623,0.03922,0.50838,0.42038,0.88188,0.06559,0.67997,0.40039,0.39755,0.30087,0.32224,0.65819,0.33124,0.27013,0.22219,0.30569,0.76007,0.48331,0.04491,0.70018,0.66576,0.50355,0.55120]
beta: [0.23965,0.01426,0.35527,0.41830,0.70163,0.38381,0.10368,0.54067,0.94902,0.31433,0.62891,0.30392,0.37772,0.65661,0.16153,0.05259,0.63196,0.40263,0.93653,0.62578,0.87396,0.97899,0.16461,0.36063,0.47256]
-> N
MBW
M x (1-alpha) + N x alpha (0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0)
-> otokonoko-secret-base-semireal-mix_v4
License: creativeml-openrail-m
For personal use (not for commercial use).
OK: Use the model without crediting the creator
NG: Sell images generated with the model
NG: Run the model on services that generate images for money
OK: Share merges using this model
NG: Sell this model or merges using this model
OK: Have different permissions when sharing merges
Thanks to the creators for the great models and tools used in this model!
Notes (originally in Japanese):
This version is a jumble of the results of elemental merges with various models.
At first I intended it as a substitute for v3, but before I knew it, it was built on top of v3.
Well, the recipe ended up long...
I and J were, by mistake, made with the same models (I noticed this while organizing the recipe).
I had been picking results that come out reasonably well even on webui 1.5.x,
but the very last sample I made turned out inhuman...
On civitai the mainstream has shifted to XL, and Flux is gradually growing, so new SD1.5 models are becoming scarce and I may soon run out of material to merge...
Eventually I expect this will turn into making LoRAs from XL or Flux outputs and merging those in,
but before that I should probably buy a new graphics card... my current PC is 7-8 years old, and I sense it is nearing its limit.
otokonoko's secret base semireal_v3
sample prompt (negative prompt is empty): 1boy, solo, light smile, black hair, playing game around their secret base,
Sep.14th,2024
Semi-real model (checkpoint) for SD1.5, specialized for trap/femboy/otoko no ko; version 3.
Merge recipe (for supermerger):
MBW + elemental merge
otokonoko-secret-base-semireal-mix_v2 x (1-alpha) + vanillanudes_v11 x alpha (0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,1.0)
OUT08:emb_layers out_layers norm proj_out attn1.to_k.weight attn1.to_out attn1.to_v.weight norm1 norm2 norm3:1
OUT07:NOT out_layers.3 norm.weight norm.bias proj_out skip_connection proj_in attn1.to_q.weight attn2.to_out.0 attn2.to_v.weight ff.net.0.proj ff.net.2:1
OUT06:NOT out_layers.3 attn1.to_k.weight attn1.to_out.0 attn1.to_q.weight attn2.to_q.weight:1
OUT05:NOT emb_layers.1 out_layers.3 skip_connection proj_in proj_out attn1 attn2 ff.net conv:1
OUT04:NOT in_layers.0 out_layers.3 skip_connection proj_out attn1.to_out.0 attn1.to_q.weight attn2.to_k.weight attn2.to_v.weight ff.net:1
OUT03:out_layers.0 norm2 norm3:1
OUT01:NOT in_layers.2:1
OUT00:emb_layers.1 in_layers.0 out_layers.0:1
M00:NOT emb_layers.1 in_layers.0 out_layers.0 1.proj_out attn1.to_k.weight attn2 ff.net 0.norm1 0.norm2 2.in_layers.2 2.out_layers:1
IN11:emb_layers.1 in_layers.0 out_layers.0:1
IN08:NOT emb_layers.1 skip_connection attn1.to_v.weight attn2.to_out ff.net norm3:1
IN07:NOT out_layers.3 skip_connection attn1.to_out.0 attn2 ff.net.2:1
IN05:NOT skip_connection proj_in ff.net.0.proj norm1:1
IN04:NOT skip_connection norm. proj_out attn1.to_out.0 attn1.to_v.weight attn2.to_q.weight attn2.to_v.weight ff.net.0.proj:1
IN02:NOT norm. proj_in attn1.to_out.0 attn2.to_out.0 attn2.to_v.weight ff.net.0.proj:1
IN01:NOT emb_layers.1 in_layers.0 attn1.to_v.weight attn2.to_v.weight:1
-> A
otokonoko-secret-base-semireal-mix_v2 x (1-alpha) + aoimix_v1Asian x alpha (0.0,0.0,1.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0)
OUT10:NOT emb_layers.1 in_layers.0 out_layers.0 attn1.to_out.0 attn1.to_q.weight attn1.to_v.weight attn2.to_k.weight attn2.to_out.0 attn2.to_q.weight ff.net.0.proj norm2:1
OUT08:emb_layers.1 in_layers.2 out_layers.3 skip_connection proj_in proj_out attn1.to_k.weight attn1.to_out.0:1
OUT07:proj_in attn1.to_k.weight ff.net.0.proj:1
OUT06:NOT attn1.to_k.weight attn2.to_k.weight attn2.to_out.0 attn2.to_q.weight ff.net norm1 norm2 norm3:1
OUT05:attn2:1
OUT04:emb_layers.1 in_layers.2 out_layers.0 skip_connection norm. proj_in attn2.to_q.weight:1
OUT03:in_layers out_layers.0 skip_connection ff.net.2 norm1 norm2 norm3:1
OUT01:in_layers out_layers.3 skip_connection:1
OUT00:in_layers.2 skip_connection:1
M00:0.emb_layers.1 1.norm. proj_out attn1.to_k.weight attn1.to_out.0 attn1.to_q.weight norm1 norm2 norm3 2.emb_layers.1 2.in_layers.0 2.out_layers.0:1
IN11:emb_layers.1 in_layers.0 out_layers.0:1
IN09:op:1
IN08:NOT proj_out attn1.to_v.weight attn2.to_k.weight ff.net:1
IN07:norm. norm1 norm2 norm3:1
IN05:emb_layers.1 in_layers out_layers skip_connection norm. proj_in attn1.to_k.weight attn1.to_out.0 attn1.to_q.weight attn2.to_k.weight:1
IN02:in_layers.0 out_layers skip_connection norm. proj_in attn1.to_k.weight attn1.to_q.weight attn1.to_v.weight attn2.to_k.weight ff.net.2 norm1:1
-> B
Automerge(triple_sum)
BASE: otokonoko-secret-base-semireal-mix_v2.safetensors
model_a: A
model_b: B
Text Encoder: 0.4882705838709901 (beta): 0.7342000343828106
Unet: [0.89717,0.42486,0.86468,0.12965,0.06892,0.67682,0.72991,0.30634,0.81824,0.01922,0.56748,0.53107,0.65859,0.55852,0.73945,0.18946,0.68306,0.94639,0.74036,0.54746,0.22457,0.03538,0.35899,0.12982,0.32126]
beta: [0.43734,0.85824,0.96926,0.93165,0.56361,0.77132,0.66043,0.12415,0.26958,0.24984,0.74546,0.34073,0.36298,0.36945,0.58034,0.74196,0.92616,0.04630,0.24256,0.19629,0.04524,0.00867,0.13057,0.59019,0.57832]
-> otokonoko-secret-base-semireal-mix_v3
Note, however, that one of the models merged here has since been made private, so the recipe can no longer be reproduced exactly.
License: creativeml-openrail-m
For personal use (not commercial).
OK: Use the model without crediting the creator
NG: Sell images the model generates
NG: Run on services that generate images for money
OK: Share merges using this model
NG: Sell this model or merges using this model
OK: Have different permissions when sharing merges
Thanks to the creators for the great models and tools used in this model!
Notes (originally in Japanese):
This time it's an elemental merge.
I had shamelessly kept on making these, but vanillanudes_v11, used in A, has been made private (so reproducibility can't be verified),
and after some hesitation I'm releasing it anyway (the model itself had been finished around May).
This model is really just "a fairly decent thing that appeared along the way", so the tuning may be rough.
It tends to go a bit strange in the environment I kept around on webui 1.5.2;
on webui 1.7.0 or 1.10.x it may be fairly good.
otokonoko's secret base semireal_v2
sample prompt(negative prompt is empty): 1boy, solo, light smile, black hair, playing game around their secret base,
Mar.20th,2024
Semi-real model (checkpoint) for SD1.5, specialized for trap/femboy/otoko no ko, version 2.
Merge recipe(for supermerger):
pre-merge process:
xxx-pruned: models pruned with the webui model toolkit.
A-D use MBW triple sum, E-G use weight sum, H uses MBW weight sum.
A = otokonoko-secret-base-2.5D-mix x (1-alpha-beta) + real_model_N x alpha + realcoharumix_v10-pruned x beta (alpha = 0.0,1.0,1.0,1.0,1.0,1.0,0.0,1.0,0.0,0.5,1.0,1.0,1.0,0.0,1.0,1.0,1.0,0.0,0.0,0.0,1.0,0.0,1.0,1.0,1.0,1.0,beta = 0.0,1.0,1.0,1.0,1.0,1.0,0.0,1.0,0.0,0.5,1.0,1.0,1.0,0.0,1.0,1.0,1.0,0.3,0.0,0.3,1.0,0.0,1.0,1.0,1.0,1.0)
B = otokonoko-secret-base-2.5D-mix x (1-alpha-beta) + isenganmixRealism_v10 x alpha + kawaiiRealisticAsian_v04 x beta (alpha = 0.0,1.0,1.0,0.0,0.0,0.5,0.5,0.5,0.0,0.0,1.0,1.0,1.0,0.0,1.0,1.0,1.0,0.0,0.0,0.0,1.0,1.0,1.0,1.0,1.0,1.0,beta = 0.0,1.0,1.0,0.0,0.0,0.5,0.5,0.5,0.0,0.0,1.0,1.0,1.0,0.0,1.0,1.0,1.0,0.0,0.0,0.3,1.0,1.0,1.0,1.0,1.0,1.0)
C = otokonoko-secret-base-2.5D-mix x (1-alpha-beta) + lunareality_typed x alpha + realanimemix_v10-pruned x beta (alpha = 0.0,1.0,1.0,0.0,0.0,0.5,0.5,0.5,0.0,0.0,1.0,1.0,1.0,0.0,1.0,1.0,1.0,0.0,0.0,0.3,1.0,1.0,1.0,1.0,1.0,1.0,beta = 0.0,1.0,1.0,0.0,0.0,0.5,0.5,0.5,0.0,0.0,1.0,1.0,1.0,0.0,1.0,1.0,1.0,0.0,0.0,0.0,1.0,1.0,1.0,1.0,1.0,1.0)
D = otokonoko-secret-base-2.5D-mix x (1-alpha-beta) + kuronekomix_v10 x alpha + hardcoreAsianPorn_v20 x beta (alpha = 0.0,1.0,1.0,1.0,1.0,1.0,0.0,1.0,0.0,0.0,1.0,1.0,1.0,0.0,1.0,1.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,1.0,1.0,beta = 0.0,1.0,1.0,1.0,1.0,1.0,0.0,1.0,0.0,1.0,1.0,1.0,1.0,0.0,1.0,1.0,1.0,0.3,0.0,0.3,1.0,0.0,0.0,1.0,1.0,1.0)
E = C x 0.1 + B x 0.9
F = A x 0.3 + E x 0.7
G = C x 0.35 + D x 0.65
H = F x (1-alpha) + G x alpha (0.0,0.5,0.0,0.3,0.5,0.0,0.3,0.5,0.5,0.3,0.5,0.5,0.5,0.0,0.5,0.5,0.5,0.5,0.5,0.5,0.5,0.5,0.5,0.5,0.5,0.5)
H -> otokonoko-secret-base-semireal-mix_v2
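The A-D lines above use triple sum: each block is base × (1 − alpha − beta) + model1 × alpha + model2 × beta. A minimal sketch with hypothetical scalar weights standing in for block tensors:

```python
def triple_sum(base, model_a, model_b, alpha, beta):
    """Triple sum for one block: the base keeps whatever share
    alpha and beta leave over (negative if alpha + beta > 1)."""
    return base * (1 - alpha - beta) + model_a * alpha + model_b * beta

# With alpha = beta = 0.5 the base contributes nothing:
triple_sum(10.0, 2.0, 4.0, 0.5, 0.5)  # -> 3.0
```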
License: creativeml-openrail-m
For personal use (not commercial).
OK: Use the model without crediting the creator
NG: Sell images the model generates
NG: Run on services that generate images for money
OK: Share merges using this model
NG: Sell this model or merges using this model
OK: Have different permissions when sharing merges
Thanks to the creators for the great models and tools used in this model!
Notes (originally in Japanese):
After making v1 I kept running auto merges, but they didn't go well and I was stuck.
As a result I had a fair pile of merged models, and rough patterns had become visible (e.g. this block must be 0).
So I redid the merge by hand on top of the 2.5D model, and this is the result.
I wanted an excess of realism, so the triple sums push it in strongly, but it stops at semi-real (one factor is that male information has crept into the OUT03-OUT05 blocks (especially OUT04), so the face can't be made very realistic).
Tuned on webui 1.7.0. I've also checked output on 1.5.2, 1.8.0, and forge, but odd images occasionally appear.
The samples on this page were made with webui 1.5.2 (to compare with past versions using the same seed and conditions).
I'd like to try merging XL models soon, but I suspect my current GTX 1070 (8GB) can't take it...
Incidentally, when I tried remaking an SD1.5 LoRA I'd been quietly building on XL, at fp8 and batch 1 one epoch was 1570 steps; 8 epochs (12560 steps) took some 27 hours, so I've given up on improving it and will stick to image generation for the time being...
otokonoko's secret base semireal
sample prompt(negative prompt is empty): 1boy, solo, light smile, black hair, playing game around their secret base,
Dec.20th,2023
Semi-real model (checkpoint) for SD1.5, specialized for trap/femboy/otoko no ko.
How it was created:
I used the auto-merge tool from the nanJ NVA thread on 5ch, with some customization.
-> ”自動マージ_otokonoko-semireal特化.zip”
base: otokonoko-secret-base-2.5D-mix.safetensors
modelA: hardcoreAsianPorn_v20.safetensors
modelB: real_model_N.safetensors
mode: triple_sum
type: default(MBW)
result
Text Encoder: 0.0 (beta): 0.0
Unet: [1.00000,0.00000,0.00000,0.00000,1.00000,0.00000,1.00000,0.00000,0.00000,0.00000,0.27908,0.93074,0.86015,0.51705,1.00000,0.00000,0.00000,0.00000,0.48520,1.00000,0.00000,1.00000,0.29926,0.59512,0.00000]
beta: [1.00000,0.84878,0.73963,0.34385,0.00000,1.00000,1.00000,1.00000,0.41966,0.00000,0.31770,1.00000,0.00000,1.00000,0.73915,1.00000,0.41583,0.00000,0.00000,0.69422,1.00000,0.35436,0.00000,0.00000,0.00000]
-> A
base: otokonoko-secret-base-2.5D-mix.safetensors
modelA: real_model_N.safetensors
mode: sum
type: default(MBW)
global_min = 0.0
global_max = 2.0
result
Text Encoder: 0.3366062123160801
Unet: [1.41119,0.69936,0.54283,0.76309,1.48250,1.06948,1.11390,1.16857,1.52035,1.38986,0.39688,0.45548,0.00000,0.79838,0.25670,1.85624,0.00000,0.00000,0.39115,2.00000,1.23593,1.13210,1.16706,0.57141,2.00000]
-> B
base: otokonoko-secret-base-2.5D-mix.safetensors
modelA: real_model_N.safetensors
mode: sum
type: default(MBW)
global_min = 0.0
global_max = 2.0
result
Text Encoder: 0.051471927683630125
Unet: [1.15342,0.67836,0.06363,0.98149,0.84862,1.30120,1.29625,1.02234,0.86429,0.61801,1.14635,1.03788,0.00000,0.88142,1.15137,1.05049,0.00000,0.00000,0.92134,1.03317,0.97142,1.21514,0.00000,0.93322,1.13380]
-> C
The following steps use supermerger:
D = A x 0.5 + B x 0.5
E = D x 0.7 + C x 0.3
E -> otokonoko-secret-base-semireal-mix_v1
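Note that the auto-merged sums above allow per-block alphas up to global_max = 2.0, i.e. extrapolation past model A rather than plain interpolation; several entries in the Unet lists do exceed 1.0. A sketch with hypothetical scalar weights:

```python
def weight_sum(base, model_a, alpha, global_min=0.0, global_max=2.0):
    """base*(1-alpha) + model_a*alpha; with alpha > 1 the result
    is pushed past model_a, away from the base (extrapolation)."""
    assert global_min <= alpha <= global_max
    return base * (1 - alpha) + model_a * alpha

weight_sum(1.0, 2.0, 2.0)  # -> 3.0: twice as far from base as model_a
```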
License: creativeml-openrail-m
For personal use (not commercial).
OK: Use the model without crediting the creator
NG: Sell images the model generates
NG: Run on services that generate images for money
OK: Share merges using this model
NG: Sell this model or merges using this model
OK: Have different permissions when sharing merges
Thanks to the creators for the great models and tools used in this model!
Notes (originally in Japanese):
I slightly modified the auto-merge tool posted in the nanJ NVA thread (indented scores.append(score) at line 408 of bayse_auto_merge.py one level deeper, so that all images are scored when generating at batch size 2),
and built a score model for otoko no ko:
Fix a few prompts; the first evaluation doesn't matter, so save every image once (save threshold set to -1). Make true/false directories somewhere and sort the resulting images: good ones into true, bad ones into false.
Attach a text file (the prompt) to each image when building the score model (the text files in true/false just contain the exact prompt used to generate the image; make one per prompt pattern, then copy, paste, and rename).
Build the score model with train_rf_score.py, the script on the evaluation-model side.
Generate again using that score model. For images scored wrongly (bad but rated high go to false; good but rated low go to true), add text files, then rebuild the score model. Repeat this over and over.
Once the score model judges reasonably soundly, try out the merged results, merge further, and keep the good ones.
The auto-merge tool's files allowed modification and redistribution, so I've uploaded it for reference.
A-C were made with the auto-merge tool; I've posted the parameters used at creation time.
Usage is almost the same as the 2.5D model.
Pushing photo-style prompts a bit hard does give realistic-looking output, but on close inspection it's quite soft, so I'm calling it semi-real.
Making it with supermerger gives something similar, but the merge tool seems to carry more significant digits(?), so the results were probably not identical...
I've only been able to try auto-merging with certain models and would like to mix in all sorts, but I think this came out fairly decent, so I'm uploading it.
otokonoko's secret base 2.5D
sample prompt(negative prompt is empty): 1boy, solo, light smile, black hair, playing game around their secret base,
Aug.19th,2023
2.5D model (checkpoint) for SD1.5, specialized for trap/femboy/otoko no ko.
It's a very peaky model... take care, especially with OUT04-06.
recipe for supermerger:
LECO is a kind of LoRA.
pre-merge process
- Create a LECO to erase muscle (like muscular\prompt.yaml).
LECO training is quite random, so I've uploaded the exact ones I made in the same place.
One LECO is needed for the merge:
- LECO_muscular_bm25dbc_last
OK, let's merge.
A-H and N use LoRA merges, I-K use sum twice, L uses weight sum, M uses MBW weight sum.
- A = boxmix25DMale_v1025DBoxcat + LECO_muscular_bm25dbc_last:7.2:0,0,1,1,1,1,1,1,1,1,1,0,0,0,0,0,0
- B = boxmix25DMale_v1025DBoxcat + LECO_muscular_bm25dbc_last:7.2:0,0,1,1,1,1,1,1,1,1,1,1,0,0,0,0,0
- C = boxmix25DMale_v10NSFW + LECO_muscular_bm25dbc_last:9:0,0,1,1,1,1,1,1,1,1,1,0,0,0,0,0,0
- D = boxmix25DMale_v10NSFW + LECO_muscular_bm25dbc_last:9:0,0,1,1,1,1,1,1,1,1,1,1,0,0,0,0,0
- E = homodiffusion2_FP32 + LECO_muscular_bm25dbc_last:10:0,0,1,1,1,1,1,1,1,1,1,0.5,0,0,0,0,0
- F = homodiffusion2_FP32 + LECO_muscular_bm25dbc_last:9.2:0,0,1,1,1,1,1,1,1,1,1,1,0,0,0,0,0
- G = musesErato_v30 + LECO_muscular_bm25dbc_last:14:0,0,1,1,1,1,1,1,1,1,1,0.5,0,0,0,0,0
- H = prismaboysmix_v50FinalBakedVAE + LECO_muscular_bm25dbc_last:7:0,0,1,1,1,1,1,1,1,1,1,0.5,0,0,0,0,0
- I = (C x 0.2 + G x 0.8) x 0.6 + H x 0.4
- J = (A x 0.5 + D x 0.5) x 0.5 + E x 0.5
- K = (B x 0.5 + C x 0.5) x 0.5 + F x 0.5
- L = J x 0.5 + K x 0.5
- M = L x (1-alpha) + I x alpha (0.9,1.0,1.0,1.0,1.0,1.0,1.0,1.0,1.0,1.0,1.0,1.0,1.0,0.9,1.0,1.0,1.0,0.5,1.0,1.0,1.0,1.0,0.9,1.0,0.9,1.0)
- N = M + femboiFullFemboyTrap_v10:0.3
N -> otokonoko-secret-base-2.5D-mix
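The `model + LoRA:strength:block_weights` lines above fold a LoRA (or LECO) into the checkpoint: the low-rank delta is scaled by the global strength and a per-block weight before being added. A scalar sketch of the idea (real LoRA deltas are per-layer matrix products of the up and down matrices):

```python
def merge_lora_weight(w, lora_down, lora_up, strength, block_weight):
    """One-tensor view of a LoRA merge: add the (scalar stand-in)
    low-rank delta, scaled by global strength and per-block weight.
    A block weight of 0 leaves the checkpoint tensor untouched."""
    delta = lora_up * lora_down
    return w + strength * block_weight * delta

merge_lora_weight(1.0, 0.5, 2.0, 7.2, 0.0)  # -> 1.0 (block masked out)
```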
License: creativeml-openrail-m
For personal use (not commercial).
OK: Use the model without crediting the creator
NG: Sell images the model generates
NG: Run on services that generate images for money
OK: Share merges using this model
NG: Sell this model or merges using this model
OK: Have different permissions when sharing merges
Thanks to the creators for the great models and LoRAs used in this model!
Notes (originally in Japanese):
- A model for generating so-called otoko no ko (femboys).
- Using it myself, it has turned out to be quite a peaky model (male components in the face blocks..., and OUT04-06 are better left alone).
- It's about 2.5D, so piling quality terms into the prompt gives images with a fair sense of depth.
- The keyword(?) is 1boy, equivalent to 1girl in ordinary models (1girl still produces ordinary women).
- The aim was a character with a nearly female face, a low-muscle male body, and the parts still attached.
- It does still tend to add a bit of muscle, but this model already has muscle/muscular stripped out as far as LECO would go, so what further reduction does is unknown.
- While building the model, raising the LECO weight too far made the attached parts vanish or the body feminize, so it landed on these values (some are applied in excess so the abs look natural).
- LECO itself has randomness (frankly, a gacha), so making a LECO for the same model under the same conditions will not give the exact same result.
- So this time I've also put the LECOs I used in the same place.
General notes
With 1boy you get boys who look almost entirely like girls, the type commonly called otoko no ko. If that's not for you, best not to use this.
- Undressed, it may or may not appear (what may? you know what). On seeds where it doesn't, add various things to the prompt.
- If you want a bulge while clothed but nothing shows, add bulge; if it still doesn't appear, raise the weight.
- The clothed state might conversely be usable for flat-chested girls (no one can tell if there's no bulge? though the skeleton gets quite male).
- Characters tend to bundle up on the upper body. (bare breasts attaches breasts, so use bare chest if it works.)
- Prompt emphasis seems to hit too hard, so use it rather sparingly.
- Likewise on the negative side: overusing things like (xxxx:1.9) quickly makes the image fall apart.
- Intended for 512x768. Slightly weak at landscape images.
- bad anatomy in the negative prompt may be essential.
- Because of heavy use of my homemade LECOs, it tends toward blonde hair. Hair, skin, and eye color mostly come out if you put them in the prompt or use a LoRA that carries that information.
- Also, perhaps from the same LECOs, age specification may not work well. Youthful even at 80... one can only hope to be like that.
- CFG is mostly fine (even 30 works after a fashion); I tested output with DPM++ 2M SDE Karras at 20 and 50 steps. Odd results appear now and then, but it's mostly sane.
- For wide shots (full standing figures etc.), fixing with Hires. fix is recommended.
- LoRAs mostly seem to work, but depending on the LoRA and seed the crotch area can end up a mess.
- I haven't tested NSFW, but NSFW models are in the merge, so I assume it appears when it appears.
- I wanted to push it more toward 2D, but it wasn't going well, so I'm releasing it at this point for now.
Impressions
- The idle thought "could I get otoko no ko out of manly-man models..." was the beginning of the end.
- Also, I'd rarely seen otoko no ko models, so I just tried making one; honestly it was quite difficult.
- Change the ratio wanting it to grow, and it doesn't grow but instead feminizes; ease off and it grows but gets buff. This model is what emerged from that struggle.
- At the very end I was stuck and brought in femboi's model for adjustment.
- I started on a whim and the verification snowballed; I don't want to sink into this swamp again for a while...
- prismaboysmix_v50FinalBakedVAE has a slightly dark palette, but character LoRAs work on it quite reliably, which I liked.
Merge policy (how it was put together)
- Grab male models, and for the ones whose licenses look OK, check costume reproduction etc. with character LoRAs to shortlist the models to use (pre-merge).
- Subtract the muscle from models that produce muscular men, to neutralize them (LECO creation and A-H).
- Check each output and set rough ratios (this one in strong, that one a bit weak), then mix (I-L).
- Fix the seed, adjust the output, settle ratios, and mix (M).
- Finally, watching the output, blend in the otoko no ko component (N).