Update README.md
README.md CHANGED

@@ -15,6 +15,8 @@ base_model:
 
 # L3.1-Moe-4x8B-v0.2
 
+![cover](https://github.com/moeru-ai/L3.1-Moe/blob/main/cover/v0.2.png?raw=true)
+
 This model is a Mixture of Experts (MoE) made with mergekit-moe. It uses the following base models:
 
 - [Joseph717171/Llama-3.1-SuperNova-8B-Lite_TIES_with_Base](https://huggingface.co/Joseph717171/Llama-3.1-SuperNova-8B-Lite_TIES_with_Base)
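For context, a mergekit-moe merge like the one this README describes is driven by a YAML config listing the expert models and their routing prompts. A minimal sketch of such a config follows; the `gate_mode`, `dtype`, and prompt text are illustrative assumptions, not the actual config used for this model:

```yaml
# Hypothetical mergekit-moe config for a 4x8B MoE of this kind.
# The base_model / gate_mode / experts keys follow mergekit-moe's
# documented schema; the specific values here are illustrative only.
base_model: Joseph717171/Llama-3.1-SuperNova-8B-Lite_TIES_with_Base
gate_mode: hidden        # route tokens via hidden-state similarity to the prompts
dtype: bfloat16
experts:
  - source_model: Joseph717171/Llama-3.1-SuperNova-8B-Lite_TIES_with_Base
    positive_prompts:
      - "reasoning and general assistance"
  # ...one entry per remaining base model, each with its own positive_prompts
```

The merge itself is then produced with mergekit's CLI, e.g. `mergekit-moe config.yaml ./output-model`.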