DraftReasoner-2x7B-MoE-v0.1

An experimental 2-expert Mixture-of-Experts (MoE) merge using mlabonne/Marcoro14-7B-slerp as the base model.
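
MoE merges of this kind are commonly defined with a mergekit-moe style configuration that names the base model and per-expert positive prompts. The sketch below is only an assumption about how such a merge could be specified: the expert repo IDs and the second expert's keywords are hypothetical placeholders; only the base model and the math keywords come from this card.

```python
# Hypothetical sketch of a mergekit-moe configuration for a 2-expert merge.
# Only the base model and the math activation keywords are taken from this card;
# the expert source models below are placeholders, not the actual experts used.
import yaml

config = {
    "base_model": "mlabonne/Marcoro14-7B-slerp",
    "gate_mode": "hidden",   # route tokens via hidden-state similarity to the prompts
    "dtype": "bfloat16",
    "experts": [
        {
            "source_model": "<math-expert-repo>",      # placeholder
            "positive_prompts": ["math", "reason", "solve", "count"],
        },
        {
            "source_model": "<general-expert-repo>",   # placeholder
            "positive_prompts": ["chat", "assistant"], # placeholder keywords
        },
    ],
}

with open("draftreasoner-moe.yaml", "w") as f:
    yaml.safe_dump(config, f, sort_keys=False)
```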

Notes

Please evaluate the model before using it in any application pipeline. The math expert is activated by the keywords 'math', 'reason', 'solve', and 'count'.
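
A minimal evaluation sketch using the standard transformers text-generation API; the repo ID below is a placeholder, since the full Hub path is not given in this card.

```python
# Minimal local evaluation sketch with Hugging Face transformers.
# The repo ID is a placeholder -- substitute the actual Hub path of this model.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "<user>/DraftReasoner-2x7B-MoE-v0.1"  # placeholder repo ID

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # weights are stored in BF16
    device_map="auto",
)

# A math-flavored prompt; keywords like "solve" should route tokens to the math expert.
prompt = "Solve step by step: what is 17 * 24?"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```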

Model size: 12.9B params (Safetensors, BF16)