adding paper

README.md CHANGED

@@ -12,7 +12,7 @@ pipeline_tag: conversational
 </p>
 
 <p align="center">
-🤗 <a href="
+🤗 <a href="https://huggingface.co/UKPLab/triple-encoders-dailydialog" target="_blank">Models</a> | 📃 <a href="https://arxiv.org/pdf/2402.12332.pdf" target="_blank">Paper</a>
 </p>
 
 `triple-encoders` are models for contextualizing distributed [Sentence Transformers](https://sbert.net/) representations. This model was trained on the [DailyDialog](https://huggingface.co/datasets/daily_dialog) dataset and can be used for conversational sequence modeling and short-term planning via sequential modular late-interaction:

@@ -176,12 +176,13 @@ trainer.train("output/path/to/save/model")
 ## Citation
 If you use triple-encoders in your research, please cite the following paper:
 ```
-
-
-
-
-
-
+@misc{erker2024tripleencoders,
+  title={Triple-Encoders: Representations That Fire Together, Wire Together},
+  author={Justus-Jonas Erker and Florian Mai and Nils Reimers and Gerasimos Spanakis and Iryna Gurevych},
+  year={2024},
+  eprint={2402.12332},
+  archivePrefix={arXiv},
+  primaryClass={cs.CL}
 }
 ```
 # Contact

@@ -199,4 +200,4 @@ triple-encoders is licensed under the Apache License, Version 2.0. See [LICENSE]
 
 
 ### Acknowledgement
-
+Our package is based upon [imaginaryNLP](https://github.com/Justus-Jonas/imaginaryNLP) and [Sentence Transformers](https://sbert.net/).