lora key not loaded

#5
by Theoldsong - opened

lora key not loaded: transformer.transformer_blocks.59.attn.add_k_proj.lora_A.weight
lora key not loaded: transformer.transformer_blocks.59.attn.add_q_proj.lora_A.weight
lora key not loaded: transformer.transformer_blocks.59.attn.add_v_proj.lora_A.weight
lora key not loaded: transformer.transformer_blocks.59.attn.to_add_out.lora_A.weight
lora key not loaded: transformer.transformer_blocks.59.attn.to_k.lora.down.weight
lora key not loaded: transformer.transformer_blocks.59.attn.to_out.0.lora.down.weight
lora key not loaded: transformer.transformer_blocks.59.attn.to_q.lora.down.weight
lora key not loaded: transformer.transformer_blocks.59.attn.to_v.lora.down.weight
lora key not loaded: transformer.transformer_blocks.59.img_mlp.net.2.lora_A.weight
lora key not loaded: transformer.transformer_blocks.59.txt_mlp.net.2.lora_A.weight

Also, in the official workflow, where does the beta57 scheduler under the sampler come from? Even on the latest ComfyUI I don't have that option.

This is a minor bug. I froze block 59 during training, so whether or not block 59's keys load makes no difference to the output.
beta57 is a scheduler from a plugin; if you don't have it, plain beta works fine too.
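Since block 59 was frozen during training, those warnings are harmless, but if you want to silence them you could strip the block-59 entries from the LoRA file before loading. A minimal sketch, assuming the LoRA ships as a .safetensors file (the file paths here are hypothetical placeholders):

```python
# Minimal sketch: drop the frozen block-59 LoRA entries so loaders stop
# warning about them. The file paths are placeholders, not real names.
from safetensors.torch import load_file, save_file

src = "my_lora.safetensors"           # hypothetical input path
dst = "my_lora.filtered.safetensors"  # hypothetical output path

state = load_file(src)
# Drop every tensor under transformer_blocks.59; per the author, that
# block was frozen during training, so its weights carry no effect.
filtered = {k: v for k, v in state.items()
            if ".transformer_blocks.59." not in k}
save_file(filtered, dst)
print(f"removed {len(state) - len(filtered)} block-59 keys")
```

Alternatively, just ignoring the warnings is fine, since the skipped keys contain no trained deltas.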

Is it a merge?


Got it, thanks for the explanation, 志佬!
