Indonesian T5 Abstractive Summarization Base Model

Hello everyone! We are the SumText Group from Bina Nusantara University: Stevan Pohan, Joseph Vincent Liem, and Yongky Alexander Tristan. This model is the result of our fine-tuning work for abstractive summarization of Indonesian text.

Load the Fine-Tuned Model

  import torch
  from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

  # Select GPU if available, otherwise fall back to CPU
  device = "cuda" if torch.cuda.is_available() else "cpu"

  model_path = "migz117/T5-Abstractive"
  model = AutoModelForSeq2SeqLM.from_pretrained(model_path).to(device)
  tokenizer = AutoTokenizer.from_pretrained(model_path)
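Once the model and tokenizer are loaded, summaries are produced with the standard `generate`/`decode` pattern of the `transformers` library. The sketch below is a minimal, self-contained example; the sample article text and the generation parameters (`num_beams`, `max_length`) are illustrative choices, not settings published by the authors.

```python
import torch
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

device = "cuda" if torch.cuda.is_available() else "cpu"
model_path = "migz117/T5-Abstractive"
model = AutoModelForSeq2SeqLM.from_pretrained(model_path).to(device)
tokenizer = AutoTokenizer.from_pretrained(model_path)

# Illustrative Indonesian input text (replace with your own article)
article = (
    "Pemerintah mengumumkan pembangunan jalan tol baru yang akan "
    "menghubungkan beberapa kota besar di Pulau Jawa. Proyek ini "
    "diharapkan selesai dalam tiga tahun dan mengurangi kemacetan."
)

# Tokenize, generate with beam search, and decode the summary
inputs = tokenizer(article, return_tensors="pt",
                   truncation=True, max_length=512).to(device)
summary_ids = model.generate(**inputs, max_length=150,
                             num_beams=4, early_stopping=True)
summary = tokenizer.decode(summary_ids[0], skip_special_tokens=True)
print(summary)
```

Beam search with a modest `num_beams` is a common default for summarization; sampling-based decoding can also be used if more varied output is desired.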
Model size: 223M parameters (F32 tensors, Safetensors format)

Dataset used to train migz117/T5-Abstractive