Justus-Jonas committed (verified)
Commit 1828d76 · Parent(s): 10d5c2f

adding paper

Files changed (1): README.md (+9 -8)

README.md CHANGED
@@ -12,7 +12,7 @@ pipeline_tag: conversational
 </p>
 
 <p align="center">
-🤗 <a href="anonymous" target="_blank">Models</a> | 📊 <a href="anonymous" target="_blank">Datasets</a> | 📃 <a href="anonymous" target="_blank">Paper</a>
+🤗 <a href="https://huggingface.co/UKPLab/triple-encoders-dailydialog" target="_blank">Models</a> | 📃 <a href="https://arxiv.org/pdf/2402.12332.pdf" target="_blank">Paper</a>
 </p>
 
 `triple-encoders` are models for contextualizing distributed [Sentence Transformers](https://sbert.net/) representations. This model was trained on the [DailyDialog](https://huggingface.co/datasets/daily_dialog) dataset and can be used for conversational sequence modeling and short-term planning via sequential modular late-interaction:
@@ -176,12 +176,13 @@ trainer.train("output/path/to/save/model")
 ## Citation
 If you use triple-encoders in your research, please cite the following paper:
 ```
-% todo
-@article{anonymous,
-title={Triple Encoders: Represenations That Fire Together, Wire Together},
-author={Justus-Jonas Erker, Florian Mai, Nils Reimers, Gerasimos Spanakis, Iryna Gurevych},
-journal={axiv},
-year={2024}
+@misc{erker2024tripleencoders,
+title={Triple-Encoders: Representations That Fire Together, Wire Together},
+author={Justus-Jonas Erker and Florian Mai and Nils Reimers and Gerasimos Spanakis and Iryna Gurevych},
+year={2024},
+eprint={2402.12332},
+archivePrefix={arXiv},
+primaryClass={cs.CL}
 }
 ```
 # Contact
@@ -199,4 +200,4 @@ triple-encoders is licensed under the Apache License, Version 2.0. See [LICENSE]
 
 
 ### Acknowledgement
-this package is based upon the [imaginaryNLP](https://github.com/Justus-Jonas/imaginaryNLP) and [Sentence Transformers](https://sbert.net/).
+our package is based upon the [imaginaryNLP](https://github.com/Justus-Jonas/imaginaryNLP) and [Sentence Transformers](https://sbert.net/).
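For context on the README line about "sequential modular late-interaction": the idea is that each utterance is embedded independently (so embeddings can be precomputed and cached), and context embeddings are only combined at scoring time. The toy sketch below illustrates that data flow only; the fake `embed` encoder, the pairwise-averaging `contextualize` step, and the dimension are assumptions for illustration, not the actual triple-encoders method (which uses learned mixing on top of a Sentence Transformers encoder).

```python
import hashlib
import numpy as np

DIM = 8  # toy embedding size (assumption, for illustration only)

def embed(utterance: str) -> np.ndarray:
    # Stand-in for a real Sentence Transformers encoder: a deterministic
    # pseudo-random unit vector derived from the utterance text.
    seed = int.from_bytes(hashlib.sha256(utterance.encode()).digest()[:4], "big")
    v = np.random.default_rng(seed).normal(size=DIM)
    return v / np.linalg.norm(v)

def contextualize(context_embs: list[np.ndarray]) -> np.ndarray:
    # Toy "modular" mixing: average every pair of context embeddings.
    # The real model uses learned transformations here, not a plain mean.
    pairs = [(a + b) / 2
             for i, a in enumerate(context_embs)
             for b in context_embs[i + 1:]]
    mixed = np.mean(pairs, axis=0) if pairs else context_embs[0]
    return mixed / np.linalg.norm(mixed)

def score(context: list[str], candidate: str) -> float:
    # "Late interaction": utterances are embedded independently (cacheable);
    # the context representation and the comparison happen only now.
    ctx = contextualize([embed(u) for u in context])
    return float(ctx @ embed(candidate))

s = score(["Hi, how are you?", "Great, thanks!"], "Glad to hear it.")
```

Because `embed` runs per utterance, a dialogue system can reuse cached utterance embeddings across turns and only pay the cheap mixing/scoring cost online, which is the efficiency argument behind this family of models.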