Pilota model for dialogs

A model for Pilota trained on the Accommodation Search Dialog Corpus and additional examples

Usage

  1. Install Pilota

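    • Command (a sketch only: it assumes Pilota is installed directly from its GitHub repository at https://github.com/megagonlabs/pilota; see that repository for the exact, version-pinned installation instructions)

      pip install git+https://github.com/megagonlabs/pilota
      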
  2. Prepare inputs

    • Command

      echo -e 'ใ”่ฆๆœ›ใ‚’ใŠ็Ÿฅใ‚‰ใ›ใใ ใ•ใ„\tใฏใ„ใ€‚้ƒจๅฑ‹ใ‹ใ‚‰ๅฏŒๅฃซๅฑฑใŒ่ฆ‹ใˆใฆใ€ๅคœๆ™ฏใ‚’่ฆ‹ใชใŒใ‚‰้ฃŸไบ‹ใฎใงใใ‚‹ใƒ›ใƒ†ใƒซใŒใ„ใ„ใชใ€‚\nใ“ใ‚“ใซใกใฏ\tใ“ใ‚“ใซใกใฏ' | python -m pilota.convert.plain2request | tee input.jsonl
      
    • Output

      {"context": [{"name": "agent", "text": "ใ”่ฆๆœ›ใ‚’ใŠ็Ÿฅใ‚‰ใ›ใใ ใ•ใ„"}], "utterance": "ใฏใ„ใ€‚้ƒจๅฑ‹ใ‹ใ‚‰ๅฏŒๅฃซๅฑฑใŒ่ฆ‹ใˆใฆใ€ๅคœๆ™ฏใ‚’่ฆ‹ใชใŒใ‚‰้ฃŸไบ‹ใฎใงใใ‚‹ใƒ›ใƒ†ใƒซใŒใ„ใ„ใชใ€‚", "sentences": null, "meta": {}}
      {"context": [{"name": "agent", "text": "ใ“ใ‚“ใซใกใฏ"}], "utterance": "ใ“ใ‚“ใซใกใฏ", "sentences": null, "meta": {}}
      
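    • Alternative in Python (a minimal sketch using only the standard library; it writes the same input.jsonl, assuming the request schema shown in the output above)

      import json
      
      # Each request pairs the agent's previous utterance (context) with the
      # user's reply (utterance), matching the plain2request output above.
      examples = [
          ("ใ”่ฆๆœ›ใ‚’ใŠ็Ÿฅใ‚‰ใ›ใใ ใ•ใ„", "ใฏใ„ใ€‚้ƒจๅฑ‹ใ‹ใ‚‰ๅฏŒๅฃซๅฑฑใŒ่ฆ‹ใˆใฆใ€ๅคœๆ™ฏใ‚’่ฆ‹ใชใŒใ‚‰้ฃŸไบ‹ใฎใงใใ‚‹ใƒ›ใƒ†ใƒซใŒใ„ใ„ใชใ€‚"),
          ("ใ“ใ‚“ใซใกใฏ", "ใ“ใ‚“ใซใกใฏ"),
      ]
      
      with open("input.jsonl", "w", encoding="utf-8") as f:
          for agent_text, user_text in examples:
              request = {
                  "context": [{"name": "agent", "text": agent_text}],
                  "utterance": user_text,
                  "sentences": None,  # left null, as in the requests produced by plain2request
                  "meta": {},
              }
              f.write(json.dumps(request, ensure_ascii=False) + "\n")
      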
  3. Feed input.jsonl to Pilota

    • Command

      pilota -m megagonlabs/pilota_dialog --batch_size 1 --outlen 60 --nbest 1 --beam 5 < input.jsonl
      
    • Output

      [{"scuds_nbest": [[]], "original_ranks": [0], "scores": [0.9911208689212798], "scores_detail": [{"OK": 0.9704028964042664, "incorrect_none": 0.04205145686864853, "lack": 0.0007874675211496651, "limited": 0.0003119863977190107, "non_fluent": 0.0002362923405598849, "untruth": 0.0013080810895189643}], "sentence": "ใฏใ„ใ€‚"}, {"scuds_nbest": [["้ƒจๅฑ‹ใ‹ใ‚‰ๅฏŒๅฃซๅฑฑใŒ่ฆ‹ใˆใ‚‹ใƒ›ใƒ†ใƒซใŒ่‰ฏใ„ใ€‚", "ๅคœๆ™ฏใ‚’่ฆ‹ใชใŒใ‚‰้ฃŸไบ‹ใฎใงใใ‚‹ใƒ›ใƒ†ใƒซใŒ่‰ฏใ„ใ€‚"]], "original_ranks": [0], "scores": [0.9952289938926696], "scores_detail": [{"OK": 0.9840966463088989, "incorrect_none": 0.010280555114150047, "lack": 0.0032871251460164785, "limited": 0.00041511686868034303, "non_fluent": 0.0002954243100248277, "untruth": 0.003289491171017289}], "sentence": "้ƒจๅฑ‹ใ‹ใ‚‰ๅฏŒๅฃซๅฑฑใŒ่ฆ‹ใˆใฆใ€ๅคœๆ™ฏใ‚’่ฆ‹ใชใŒใ‚‰้ฃŸไบ‹ใฎใงใใ‚‹ใƒ›ใƒ†ใƒซใŒใ„ใ„ใชใ€‚"}]
      [{"scuds_nbest": [[]], "original_ranks": [0], "scores": [0.9831213414669036], "scores_detail": [{"OK": 0.9704028964042664, "incorrect_none": 0.04205145686864853, "lack": 0.0007874675211496651, "limited": 0.0003119863977190107, "non_fluent": 0.0002362923405598849, "untruth": 0.0013080810895189643}], "sentence": "ใ“ใ‚“ใซใกใฏ"}]
      
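    • Reading the output in Python (a minimal sketch; it assumes the pilota command above was redirected to output.jsonl)

      import json
      
      # Each output line corresponds to one request; each entry in the line's
      # JSON array corresponds to one sentence of the utterance.
      with open("output.jsonl", encoding="utf-8") as f:
          for line in f:
              for entry in json.loads(line):
                  best_scuds = entry["scuds_nbest"][0]  # SCUDs of the top-ranked hypothesis
                  score = entry["scores"][0]            # its overall score
                  print(entry["sentence"], best_scuds, round(score, 3))
      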

License

Apache License 2.0
