Xenova (HF staff) committed
Commit db6426b · verified · 1 Parent(s): 1b52f8d

Update README.md

Files changed (1):
  1. README.md +3 -4
README.md CHANGED
@@ -28,12 +28,12 @@ You can then use the model to compute embeddings, as follows:
 import { pipeline, cos_sim } from '@xenova/transformers';
 
 // Create a feature extraction pipeline
-const extractor = await pipeline('feature-extraction', 'Xenova/jina-embeddings-v2-base-zh', {
+const extractor = await pipeline('feature-extraction', 'Xenova/jina-embeddings-v2-base-de', {
     quantized: false, // Comment out this line to use the quantized version
 });
 
 // Compute sentence embeddings
-const texts = ['How is the weather today?', '今天天气怎么样?'];
+const texts = ['How is the weather today?', 'Wie ist das Wetter heute?'];
 const output = await extractor(texts, { pooling: 'mean', normalize: true });
 // Tensor {
 //   dims: [2, 768],
@@ -45,10 +45,9 @@ const output = await extractor(texts, { pooling: 'mean', normalize: true });
 // Compute cosine similarity between the two embeddings
 const score = cos_sim(output[0].data, output[1].data);
 console.log(score);
-// 0.7860610759096025
+// 0.9602110344414481
 ```
 
 ---
 
 Note: Having a separate repo for ONNX weights is intended to be a temporary solution until WebML gains more traction. If you would like to make your models web-ready, we recommend converting to ONNX using [🤗 Optimum](https://huggingface.co/docs/optimum/index) and structuring your repo like this one (with ONNX weights located in a subfolder named `onnx`).
-
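
For reference, a minimal sketch of the updated snippet using the default quantized weights, i.e. with the `quantized: false` option simply omitted as the in-code comment suggests; the printed similarity score will differ slightly from the unquantized value shown in the diff above.

```js
import { pipeline, cos_sim } from '@xenova/transformers';

// Create a feature extraction pipeline (quantized weights are loaded by default)
const extractor = await pipeline('feature-extraction', 'Xenova/jina-embeddings-v2-base-de');

// Compute sentence embeddings with mean pooling and L2 normalization
const texts = ['How is the weather today?', 'Wie ist das Wetter heute?'];
const output = await extractor(texts, { pooling: 'mean', normalize: true });

// Cosine similarity between the two embeddings; expect a value close to,
// but not exactly equal to, the 0.9602... reported for the unquantized model
const score = cos_sim(output[0].data, output[1].data);
console.log(score);
```

The quantized weights are typically smaller and faster to load in the browser, at the cost of a small loss in embedding fidelity.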