Shadowed-Wyvern-12B?

#1 by MrDevolver - opened

Hello,

What happened to Shadowed-Wyvern-12B? I downloaded it just yesterday; it seemed to be a combination of just the right models, and now it's gone. 😧

I thought there was an issue with it stopping early or something; it might have been a problem with my context in SillyTavern (I am a perfectionist). When I tested it in the llama.cpp web UI, it was not happening. I saw you made a quant; has it been stable for you?

I ended up using your quant, and it was okay in Chub. There was only one problematic character, but it turned out to be just that character, because other models had the same issue with it. Testing yesterday with other characters went very well, though. I feel the models you chose for it were a very good choice, because I had tested them individually before and imagined that a model merging them all would be the perfect model. Except for Scarlet Eclipse, because I wasn't aware of its existence during my testing.

As for the issue with stopping early, that may be related to the chat template, or the lack of one. Models usually behave differently depending on the chat template used, and I did notice your models usually don't ship a Jinja-based chat template, relying instead on the old method of manual settings. I admit I sometimes have to tweak it here and there to make it work the way that seems correct, but that's a small price for good RP models at this size.
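
For anyone hitting the same thing, here's a minimal sketch of attaching a Jinja template yourself with transformers. The ChatML format and the repo id are my assumptions, not something from the model card; the merge's intended prompt format may well be different (e.g. Mistral [INST] tags for a Nemo-based 12B):

```python
from transformers import AutoTokenizer

# Repo id is a placeholder; substitute the actual model path.
tok = AutoTokenizer.from_pretrained("Vortex5/Shadowed-Wyvern-12B")

# A minimal ChatML-style template, assuming the merge expects ChatML;
# swap in the correct tokens if the base uses another format.
tok.chat_template = (
    "{% for message in messages %}"
    "{{ '<|im_start|>' + message['role'] + '\\n' + message['content'] + '<|im_end|>' + '\\n' }}"
    "{% endfor %}"
    "{% if add_generation_prompt %}{{ '<|im_start|>assistant\\n' }}{% endif %}"
)

# Check the rendered prompt before relying on it:
print(tok.apply_chat_template(
    [{"role": "user", "content": "Hello"}],
    tokenize=False,
    add_generation_prompt=True,
))

# tok.save_pretrained("./with-template")  # persists the template locally
```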

Shadowed-Wyvern-12B seems to be unstable at longer context where the others are not, giving clipped responses. I am guessing this came from setting a weight for lm_head in the merge.
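
For reference, this is the kind of per-tensor override I mean. Everything below is a guess, not the actual recipe: the merge method, model names, and values are placeholders; the point is mergekit's `filter:` syntax, which lets a parameter apply only to tensors matching a name like lm_head:

```python
# Hypothetical mergekit recipe illustrating a per-tensor lm_head override.
# All model ids and values are placeholders, not the real merge config.
config = """\
merge_method: slerp                   # assumption; actual method unknown
base_model: placeholder/base-12b      # placeholder
models:
  - model: placeholder/base-12b
  - model: placeholder/rp-finetune-12b
parameters:
  t:
    - filter: lm_head                 # applies only to the output head
      value: 0.9                      # skewing this could alter stop behavior
    - value: 0.5                      # default for every other tensor
dtype: bfloat16
"""

with open("merge-config.yaml", "w") as f:
    f.write(config)

# then run: mergekit-yaml merge-config.yaml ./merged-model
```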

Vortex5 changed discussion status to closed
