---
license: cc-by-sa-4.0
language:
  - en
metrics:
  - bleu
base_model:
  - PhilTa
library_name: transformers
datasets:
  - mrapacz/greek-interlinear-translations
---

# Model Card for Ancient Greek to English Interlinear Translation Model

This model performs interlinear translation from Ancient Greek to English, maintaining word-level alignment between source and target texts.

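To illustrate what "word-level alignment" means here (this helper is not part of the model; the glosses below are hypothetical examples, not model output), an interlinear rendering pairs each source token with exactly one target gloss:

```python
def format_interlinear(source_tokens, glosses):
    """Render a two-row interlinear text: source tokens above, one gloss per token below.

    Columns are padded to equal width so the rows stay visually aligned.
    """
    if len(source_tokens) != len(glosses):
        raise ValueError("interlinear output requires exactly one gloss per source token")
    widths = [max(len(s), len(g)) for s, g in zip(source_tokens, glosses)]
    src_row = "  ".join(s.ljust(w) for s, w in zip(source_tokens, widths))
    gloss_row = "  ".join(g.ljust(w) for g, w in zip(glosses, widths))
    return src_row + "\n" + gloss_row

# John 1:1 opening, with illustrative glosses:
print(format_interlinear(
    ["Ἐν", "ἀρχῇ", "ἦν", "ὁ", "Λόγος"],
    ["In", "beginning", "was", "the", "Word"],
))
```

The model's task is to produce the gloss row so that this one-to-one correspondence with the source tokens is preserved.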
## Model Details

### Model Description

- **Developed By:** Maciej Rapacz, AGH University of Kraków
- **Model Type:** Neural machine translation (T5-based)
- **Base Model:** PhilTa
- **Tokenizer:** PhilTa
- **Language(s):** Ancient Greek (source) → English (target)
- **License:** CC BY-NC-SA 4.0
- **Tag Set:** OB (Oblubienica)
- **Text Preprocessing:** Normalized
- **Morphological Encoding:** emb-sum

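The card does not spell out `emb-sum`; the name suggests that each token's embedding is fused with an embedding of its morphological tag by element-wise summation before entering the encoder. A minimal NumPy sketch of that idea, under that assumption (all names, sizes, and values below are illustrative, not taken from the model):

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions (not from the model card)
vocab_size, tag_count, d_model = 100, 20, 8

token_emb = rng.normal(size=(vocab_size, d_model))  # subword embedding table
tag_emb = rng.normal(size=(tag_count, d_model))     # morphological-tag embedding table

def embed_emb_sum(token_ids, tag_ids):
    """emb-sum (assumed meaning): add the morphological-tag embedding
    to the corresponding token embedding, position by position."""
    return token_emb[token_ids] + tag_emb[tag_ids]

x = embed_emb_sum(np.array([3, 7, 7]), np.array([1, 4, 2]))
print(x.shape)  # → (3, 8): one fused d_model-sized vector per input token
```

The appeal of a sum over, say, concatenation is that the fused vector keeps the base model's embedding dimensionality, so the pretrained encoder can be reused unchanged.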
### Model Performance

- **BLEU Score:** 55.49
- **SemScore:** 0.86

### Model Sources