GPT2-Small-Arabic
Model description
A GPT-2 model trained on the Arabic Wikipedia dataset, based on gpt2-small (using Fastai2).
Intended uses & limitations
How to use
An example is provided in this Colab notebook. Both text generation and poetry generation (with a fine-tuned model) are included.
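Outside the notebook, the model can also be loaded with the Hugging Face transformers library. The sketch below is a minimal, illustrative example: the model id "akhooli/gpt2-small-arabic" is an assumption, so verify the repository name on the Hub before use.

```python
# Minimal sketch: load the model into a text-generation pipeline.
# The model id below is an assumption -- verify it on https://hf.co/models.
from transformers import pipeline

MODEL_ID = "akhooli/gpt2-small-arabic"

def build_generator(model_id: str = MODEL_ID):
    """Create a text-generation pipeline (downloads the weights on first call)."""
    return pipeline("text-generation", model=model_id)

# Example usage (downloads the model weights on first run):
#   generator = build_generator()
#   print(generator("عاصمة فلسطين هي", max_length=30)[0]["generated_text"])
```

The prompt in the usage comment is a hypothetical example; any Arabic text works as a prompt.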
Limitations and bias
GPT2-small-arabic (trained on Arabic Wikipedia) has several limitations in coverage (Arabic Wikipedia quality, no diacritics) and in training performance. Use it for demonstrations or proofs of concept, not in production.
Training data
This model was pretrained on an Arabic Wikipedia dump (around 900 MB).
Training procedure
Training was done with the Fastai2 library on Kaggle, using a free GPU.
Eval results
Final perplexity: 72.19; loss: 4.28; accuracy: 0.307.
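As a sanity check, the reported perplexity is consistent with the reported cross-entropy loss, since perplexity is the exponential of the loss:

```python
import math

loss = 4.28  # final cross-entropy loss reported above
perplexity = math.exp(loss)
print(round(perplexity, 2))  # ~72.24, close to the reported 72.19 (the loss is rounded)
```

The small gap between 72.24 and 72.19 comes from the loss being reported to only two decimal places.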
BibTeX entry and citation info
@inproceedings{abedkhooli2020,
  author = {Abed Khooli},
  year = {2020}
}