This cross-architecture distillation, with Phi? · 2 replies · #14 opened 21 days ago by sometimesanotion
Template · 1 reply · #13 opened 3 months ago by isr431
Update SuperNova-Medius with a merge with Qwen/Qwen2.5-Coder-14B-Instruct + Further Training · 11 replies · #12 opened 3 months ago by Joseph717171
max output tokens? · 1 reply · #11 opened 3 months ago by sirus
Is there any example tutorial on mergekit-tokensurgeon? · #10 opened 3 months ago by win10
We distilled the logits of Llama 3.1 405B using an offline approach. · #9 opened 3 months ago by sirus
Unusual tokenizer.json file size · #8 opened 3 months ago by AuriAetherwiing
How about a 3-way merge with a distillation from Mistral Large? :D · #7 opened 3 months ago by DreamGenX
Ideal quantization levels · 2 replies · #6 opened 4 months ago by jadbox
Multilingual, uncensored, and extensive vocabulary. · 5 replies · #4 opened 4 months ago by Kukedlc
2 base models = a nice merge UI on the model page · 2 replies · #1 opened 4 months ago by victor