Baseline for the preprint Unlimiformer: Long-Range Transformers with Unlimited Length Input.

This model is a baseline: a BART-base model finetuned on the BookSum dataset (full-book setting).
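A minimal usage sketch, assuming the standard Hugging Face transformers summarization API (this snippet is an illustration, not an official example from the paper; note that as a plain BART-base baseline the model truncates inputs to 1024 tokens):

```python
# Sketch: load the baseline and summarize a text with the standard
# transformers seq2seq API. Input is truncated to BART's 1024-token limit,
# since this baseline (unlike Unlimiformer) has a bounded context window.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_name = "abertsch/bart-base-booksum"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)

text = "Replace this with the book text to summarize."
inputs = tokenizer(text, return_tensors="pt", truncation=True, max_length=1024)
summary_ids = model.generate(**inputs, max_length=128, num_beams=4)
summary = tokenizer.decode(summary_ids[0], skip_special_tokens=True)
print(summary)
```

Generation parameters such as `max_length` and `num_beams` are illustrative defaults, not values reported by the authors.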


Model identifier: abertsch/bart-base-booksum. Training dataset: BookSum.