# GPT2 Ukrainian
A generative language model for Ukrainian that follows the GPT-2 architecture (124M parameters). The key hyperparameters are listed below, followed by a short loading sketch.
- hidden size: 768
- number of heads: 12
- number of layers: 12
- seq length: 1024
- tokens: 11,238,113,280 (3 epochs)
- steps: 57167
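
The snippet below is a minimal sketch of how a model with this configuration can be instantiated and used with the `transformers` library. The repo id `username/gpt2-uk`, the prompt, and the generation settings are placeholders, not values taken from this card.

```python
# Minimal sketch using Hugging Face transformers.
# "username/gpt2-uk" is a placeholder repo id, not the actual model id.
from transformers import GPT2Config, GPT2LMHeadModel, AutoTokenizer

# Configuration matching the hyperparameters listed above (GPT-2 small).
config = GPT2Config(
    n_embd=768,        # hidden size
    n_head=12,         # attention heads
    n_layer=12,        # transformer layers
    n_positions=1024,  # maximum sequence length
)
model = GPT2LMHeadModel(config)  # randomly initialised model with this shape

# Loading the trained checkpoint from the Hub (placeholder repo id):
tokenizer = AutoTokenizer.from_pretrained("username/gpt2-uk")
model = GPT2LMHeadModel.from_pretrained("username/gpt2-uk")

prompt = "Київ — столиця"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=40, do_sample=True, top_p=0.95)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```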
## Training data
The model was trained on the Ukrainian portions of the following corpora; a loading sketch follows the list.
- OSCAR
- Wikimedia dumps
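
A minimal sketch of pulling these corpora with the `datasets` library is shown below. The exact dump dates and dataset configurations used for training are not stated in this card, so the config names here (`unshuffled_deduplicated_uk`, `wikimedia/wikipedia` with `20231101.uk`) are assumptions.

```python
# Hedged sketch of loading Ukrainian corpora with the `datasets` library.
# The specific configs/dump dates are assumptions, not the ones used for training.
from datasets import load_dataset

# Ukrainian split of OSCAR (deduplicated config of the original OSCAR release);
# newer `datasets` versions may additionally require trust_remote_code=True.
oscar_uk = load_dataset("oscar", "unshuffled_deduplicated_uk",
                        split="train", streaming=True)

# Ukrainian Wikipedia from the Wikimedia dumps, as packaged on the Hub.
wiki_uk = load_dataset("wikimedia/wikipedia", "20231101.uk",
                       split="train", streaming=True)

print(next(iter(oscar_uk))["text"][:200])
```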
## License
MIT