---
library_name: transformers.js
tags:
  - feature-extraction
  - sentence-similarity
  - mteb
  - sentence_transformers
  - transformers
language:
  - de
  - en
inference: false
license: apache-2.0
---

https://huggingface.co/jinaai/jina-embeddings-v2-base-de with ONNX weights to be compatible with Transformers.js.

## Usage (Transformers.js)

If you haven't already, you can install the [Transformers.js](https://huggingface.co/docs/transformers.js) JavaScript library from [NPM](https://www.npmjs.com/package/@xenova/transformers) using:
```bash
npm i @xenova/transformers
```
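
Alternatively, Transformers.js can be loaded straight from a CDN in the browser, with no build step (a sketch based on the Transformers.js docs; pinning a specific version is recommended in production):

```js
// Inside a <script type="module"> tag, import directly from a CDN instead of NPM
import { pipeline, cos_sim } from 'https://cdn.jsdelivr.net/npm/@xenova/transformers';
```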

You can then use the model to compute embeddings, as follows:

```js
import { pipeline, cos_sim } from '@xenova/transformers';

// Create a feature extraction pipeline
const extractor = await pipeline('feature-extraction', 'Xenova/jina-embeddings-v2-base-de', {
    quantized: false, // Comment out this line to use the quantized version
});

// Compute sentence embeddings
const texts = ['How is the weather today?', 'Wie ist das Wetter heute?'];
const output = await extractor(texts, { pooling: 'mean', normalize: true });
// Tensor {
//   dims: [2, 768],
//   type: 'float32',
//   data: Float32Array(1536) [...],
//   size: 1536
// }

// Compute cosine similarity between the two embeddings
const score = cos_sim(output[0].data, output[1].data);
console.log(score);
// ≈ 0.96 (high, since the two sentences are translations of each other)
```
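
The same pipeline extends naturally to cross-lingual semantic search: embed a query together with a set of candidate passages, then rank the passages by cosine similarity. A minimal sketch (the query and passages below are illustrative, not from the model card):

```js
import { pipeline, cos_sim } from '@xenova/transformers';

// Uses the quantized weights by default
const extractor = await pipeline('feature-extraction', 'Xenova/jina-embeddings-v2-base-de');

// Illustrative query and candidate passages, mixing German and English
const query = 'Wie wird das Wetter morgen?';
const passages = [
    'The forecast predicts rain tomorrow.',
    'Berlin ist die Hauptstadt von Deutschland.',
    'Das Wetter wird morgen sonnig.',
];

// Embed the query and all passages in a single batch
const output = await extractor([query, ...passages], { pooling: 'mean', normalize: true });
const [queryEmbedding, ...passageEmbeddings] = output.tolist();

// Rank passages by cosine similarity to the query
const ranked = passages
    .map((text, i) => ({ text, score: cos_sim(queryEmbedding, passageEmbeddings[i]) }))
    .sort((a, b) => b.score - a.score);
console.log(ranked);
```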

---

Note: Having a separate repo for ONNX weights is intended to be a temporary solution until WebML gains more traction. If you would like to make your models web-ready, we recommend converting to ONNX using [🤗 Optimum](https://huggingface.co/docs/optimum/index) and structuring your repo like this one (with ONNX weights located in a subfolder named `onnx`).
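
For reference, a conversion along those lines might look like the following (a sketch; exact flags depend on the model, and the output folder name is illustrative):

```bash
pip install "optimum[exporters]"

# --trust-remote-code is assumed to be required here, since the Jina
# embedding models ship custom modeling code
optimum-cli export onnx \
  --model jinaai/jina-embeddings-v2-base-de \
  --trust-remote-code \
  jina-embeddings-v2-base-de-onnx/
```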