---
datasets:
- lemonilia/LimaRP
- grimulkan/theory-of-mind
- jeiku/gnosisreformatted
tags:
- not-for-all-audiences
---
# Fett-uccine
This model was created by training a LoRA on the Mistral base model using LimaRP (in the ShareGPT format provided by SAO), theory of mind, and gnosis reformatted datasets.
The 8-bit LoRA was then merged into Mistral Instruct, resulting in the model you see here.
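Below is a minimal sketch of what merging a LoRA adapter into an Instruct base with PEFT looks like. The model ID and adapter path are placeholders, not the exact artifacts or settings used to produce this model.

```python
from transformers import AutoModelForCausalLM
from peft import PeftModel

# Load the instruct base model (exact Mistral Instruct version is an assumption).
base = AutoModelForCausalLM.from_pretrained("mistralai/Mistral-7B-Instruct-v0.2")

# Attach the trained LoRA adapter (placeholder path).
model = PeftModel.from_pretrained(base, "path/to/lora-adapter")

# Fold the LoRA weights into the base weights and save the standalone model.
merged = model.merge_and_unload()
merged.save_pretrained("Fett-uccine-7B")
```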