This is a GPT-2 (1558M) model trained with llm.c for 100B tokens on DCLM-Baseline, using a cosine learning-rate schedule.
More detailed information and observations are available here: https://x.com/Yuchenj_UW/status/1813260100192334108
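As a quick start, here is a minimal sketch of sampling from the model with Hugging Face `transformers`, assuming the weights in this repo are in (or have been converted to) the standard GPT-2 format; the repo id string below is a placeholder to replace with this repo's actual path.

```python
from transformers import GPT2LMHeadModel, GPT2TokenizerFast

# Placeholder repo id: replace with this model's actual Hugging Face path.
model_id = "<this-repo-id>"

tokenizer = GPT2TokenizerFast.from_pretrained(model_id)
model = GPT2LMHeadModel.from_pretrained(model_id)

# Sample a short continuation from a prompt.
prompt = "The meaning of life is"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(
    **inputs,
    max_new_tokens=64,
    do_sample=True,
    top_k=50,
    temperature=0.8,
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```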