research-dump / Fine_tuned_albert-large-v2_TAQA_extension
Hsuvas's Research Backups
Text Classification · Transformers · Safetensors · albert · Inference Endpoints · arxiv:1910.09700
main / Fine_tuned_albert-large-v2_TAQA_extension / tokenizer_config.json
Commit History
Upload tokenizer · 0c4f935 (verified) · hsuvaskakoty committed on Jan 20