This is a merge of pre-trained language models
This merge of finetunes generates less slop and more creative output, while keeping good logic and instruction-following abilities.
Due to its components, this merge has a neutral or even negative bias. If you ask it to insult you, you will be pleasantly surprised.
Tested on my brand-new grimdark scenario: it follows instructions creatively, with plenty of description and emotion in its responses.
Tested in Russian and English; very good in both.
Tested at temperature 1.01.
Use the ChatML prompt format.
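
A minimal usage sketch with Hugging Face transformers, assuming the tokenizer ships a ChatML chat template; the repo ID and the prompts below are placeholders, and the sampling settings mirror the tested temperature of 1.01.

```python
# Minimal sketch: load the merge and prompt it in ChatML via the tokenizer's chat template.
# "your-name/this-merge" is a placeholder; substitute the actual repo ID.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "your-name/this-merge"  # placeholder repo ID
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

messages = [
    {"role": "system", "content": "You are a grimdark storyteller."},
    {"role": "user", "content": "Describe the ruined city at dawn."},
]
# apply_chat_template renders the ChatML layout, e.g.:
# <|im_start|>system ... <|im_end|>
# <|im_start|>user ... <|im_end|>
# <|im_start|>assistant
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

# Sampling at the temperature the card reports testing with.
outputs = model.generate(inputs, max_new_tokens=512, do_sample=True, temperature=1.01)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```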