---
license: other
license_name: yi-license
license_link: https://huggingface.co/01-ai/Yi-34B-200K/blob/main/LICENSE
base_model: jondurbin/bagel-34b-v0.2
---
![image/png](https://cdn-uploads.huggingface.co/production/uploads/64c14f6b02e1f8f67c73bd05/pf4d6FA7DriRtVq5HCkxd.png)

![image/png](https://cdn-uploads.huggingface.co/production/uploads/64c14f6b02e1f8f67c73bd05/e4u8VYfDBh11u60rFYJHF.png)
This model is a finetune of jondurbin's excellent [bagel](https://huggingface.co/jondurbin/bagel-34b-v0.2) model.
It was trained with new datasets and a new technique, which we will share with the community soon.
No form of model merging was used.
### Evaluation Results

| Average | ARC | HellaSwag | MMLU | TruthfulQA | Winogrande | GSM8K |
| --- | --- | --- | --- | --- | --- | --- |
| 77.29 | 74.23 | 86.76 | 76.66 | 70.22 | 83.66 | 72.18 |
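The Average column is the unweighted mean of the six benchmark scores, per the usual Open LLM Leaderboard convention; a quick sanity check (values taken from the table above):

```python
# Per-benchmark scores from the evaluation table above.
scores = {
    "ARC": 74.23,
    "HellaSwag": 86.76,
    "MMLU": 76.66,
    "TruthfulQA": 70.22,
    "Winogrande": 83.66,
    "GSM8K": 72.18,
}

# Unweighted mean over the six benchmarks.
average = sum(scores.values()) / len(scores)
print(average)  # within rounding of the reported 77.29
```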
### Contamination Results

With reference model jondurbin/bagel-34b-v0.2:

| ARC | TruthfulQA | GSM8K |
| --- | --- | --- |
| 0.08 | 0.38 | 0.88 |
***

Vanilla quantization by [nold](https://huggingface.co/nold); original model: [abacusai/Smaug-34B-v0.1](https://huggingface.co/abacusai/Smaug-34B-v0.1). Created using the [llm-quantizer](https://github.com/Nold360/llm-quantizer) pipeline, commit 465d7970507dcaac4cb50221157a68c840965774.