---
datasets:
- lemonilia/LimaRP
- grimulkan/theory-of-mind
- jeiku/gnosisreformatted
tags:
- not-for-all-audiences
---
# Fett-uccine
This model was created by training the Mistral base model on LimaRP (ShareGPT format provided by SAO), theory-of-mind, and gnosis (provided by jeiku) datasets.
The 8-bit LoRA was then merged into Mistral Instruct, resulting in what you see here.
Works best with the ChatML instruct format.
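For reference, a minimal sketch of the ChatML prompt layout this model expects (the system message shown is just a placeholder, not a tuned prompt):

```
<|im_start|>system
You are a helpful assistant.<|im_end|>
<|im_start|>user
Hello!<|im_end|>
<|im_start|>assistant
```

Generation should be stopped on the `<|im_end|>` token.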
This model is in honor of the SillyTavern community, keep being awesome!