---
language:
- en
license: apache-2.0
tags:
- sentence-transformers
- sentence-similarity
- feature-extraction
- generated_from_trainer
- dataset_size:6300
- loss:MatryoshkaLoss
- loss:MultipleNegativesRankingLoss
base_model: BAAI/bge-base-en-v1.5
widget:
- source_sentence: Favorable resolution of tax positions would be recognized as a
reduction to the effective income tax rate in the period of resolution.
sentences:
- What was the operational trend for voice connections from 2021 to 2023?
- How is a favorable resolution of a tax position recognized in financial terms?
- What was the amount of cash generated from operations by the company in fiscal
year 2023?
- source_sentence: The cumulative basis adjustments associated with these hedging
relationships are a reduction of the amortized cost basis of the closed portfolios
of $19 million.
sentences:
- What was the reduction in the amortized cost basis of the closed portfolios due
to cumulative basis adjustments in these hedging relationships?
- How is the inclusion of the financial statements in the IBM's Form 10-K described?
- What was the percentage increase in Electronic Arts' diluted earnings per share
in the fiscal year ended March 31, 2023?
- source_sentence: Walmart's fintech venture, ONE, provides financial services such
as money orders, prepaid access, money transfers, check cashing, bill payment,
and certain types of installment lending.
sentences:
- What types of financial services are offered through Walmart's fintech venture,
ONE?
- How much cash did FedEx have at the end of May 2023?
- What is the purpose of Visa according to the overview provided?
- source_sentence: Medicare Star Ratings - A portion of each Medicare Advantage plan’s
reimbursement is tied to the plan’s “star ratings.” The star rating system considers
a variety of measures adopted by CMS, including quality of preventative services,
chronic illness management, compliance and overall customer satisfaction. Only
Medicare Advantage plans with an overall star rating of 4 or more stars (out of
5 stars) are eligible for a quality bonus in their basic premium rates.
sentences:
- What authority does the Macao government have over VML's recapitalization plans?
- How do Medicare Advantage star ratings impact the financial reimbursements of
plans?
- How are the adjusted income from operations and gross profit figures calculated
for the company?
- source_sentence: For the year ended December 31, 2022, the free cash flow reported
was -$11,569 million.
sentences:
- What new initiative did Dollar Tree announce in September 2021?
- What was the amount of deferred net loss on derivatives included in accumulated
other comprehensive income as of December 31, 2023?
- What was the free cash flow reported for the year ended December 31, 2022?
pipeline_tag: sentence-similarity
library_name: sentence-transformers
metrics:
- cosine_accuracy@1
- cosine_accuracy@3
- cosine_accuracy@5
- cosine_accuracy@10
- cosine_precision@1
- cosine_precision@3
- cosine_precision@5
- cosine_precision@10
- cosine_recall@1
- cosine_recall@3
- cosine_recall@5
- cosine_recall@10
- cosine_ndcg@10
- cosine_mrr@10
- cosine_map@100
model-index:
- name: BGE base Financial Matryoshka
results:
- task:
type: information-retrieval
name: Information Retrieval
dataset:
name: dim 768
type: dim_768
metrics:
- type: cosine_accuracy@1
value: 0.7028571428571428
name: Cosine Accuracy@1
- type: cosine_accuracy@3
value: 0.8242857142857143
name: Cosine Accuracy@3
- type: cosine_accuracy@5
value: 0.8585714285714285
name: Cosine Accuracy@5
- type: cosine_accuracy@10
value: 0.8942857142857142
name: Cosine Accuracy@10
- type: cosine_precision@1
value: 0.7028571428571428
name: Cosine Precision@1
- type: cosine_precision@3
value: 0.2747619047619047
name: Cosine Precision@3
- type: cosine_precision@5
value: 0.1717142857142857
name: Cosine Precision@5
- type: cosine_precision@10
value: 0.08942857142857143
name: Cosine Precision@10
- type: cosine_recall@1
value: 0.7028571428571428
name: Cosine Recall@1
- type: cosine_recall@3
value: 0.8242857142857143
name: Cosine Recall@3
- type: cosine_recall@5
value: 0.8585714285714285
name: Cosine Recall@5
- type: cosine_recall@10
value: 0.8942857142857142
name: Cosine Recall@10
- type: cosine_ndcg@10
value: 0.8013128307721423
name: Cosine Ndcg@10
- type: cosine_mrr@10
value: 0.7712681405895693
name: Cosine Mrr@10
- type: cosine_map@100
value: 0.7753497571186561
name: Cosine Map@100
- task:
type: information-retrieval
name: Information Retrieval
dataset:
name: dim 512
type: dim_512
metrics:
- type: cosine_accuracy@1
value: 0.7071428571428572
name: Cosine Accuracy@1
- type: cosine_accuracy@3
value: 0.8228571428571428
name: Cosine Accuracy@3
- type: cosine_accuracy@5
value: 0.8571428571428571
name: Cosine Accuracy@5
- type: cosine_accuracy@10
value: 0.8928571428571429
name: Cosine Accuracy@10
- type: cosine_precision@1
value: 0.7071428571428572
name: Cosine Precision@1
- type: cosine_precision@3
value: 0.2742857142857143
name: Cosine Precision@3
- type: cosine_precision@5
value: 0.1714285714285714
name: Cosine Precision@5
- type: cosine_precision@10
value: 0.08928571428571427
name: Cosine Precision@10
- type: cosine_recall@1
value: 0.7071428571428572
name: Cosine Recall@1
- type: cosine_recall@3
value: 0.8228571428571428
name: Cosine Recall@3
- type: cosine_recall@5
value: 0.8571428571428571
name: Cosine Recall@5
- type: cosine_recall@10
value: 0.8928571428571429
name: Cosine Recall@10
- type: cosine_ndcg@10
value: 0.8020949349247011
name: Cosine Ndcg@10
- type: cosine_mrr@10
value: 0.7728367346938774
name: Cosine Mrr@10
- type: cosine_map@100
value: 0.7769793740668877
name: Cosine Map@100
- task:
type: information-retrieval
name: Information Retrieval
dataset:
name: dim 256
type: dim_256
metrics:
- type: cosine_accuracy@1
value: 0.7057142857142857
name: Cosine Accuracy@1
- type: cosine_accuracy@3
value: 0.8185714285714286
name: Cosine Accuracy@3
- type: cosine_accuracy@5
value: 0.85
name: Cosine Accuracy@5
- type: cosine_accuracy@10
value: 0.8914285714285715
name: Cosine Accuracy@10
- type: cosine_precision@1
value: 0.7057142857142857
name: Cosine Precision@1
- type: cosine_precision@3
value: 0.27285714285714285
name: Cosine Precision@3
- type: cosine_precision@5
value: 0.16999999999999998
name: Cosine Precision@5
- type: cosine_precision@10
value: 0.08914285714285713
name: Cosine Precision@10
- type: cosine_recall@1
value: 0.7057142857142857
name: Cosine Recall@1
- type: cosine_recall@3
value: 0.8185714285714286
name: Cosine Recall@3
- type: cosine_recall@5
value: 0.85
name: Cosine Recall@5
- type: cosine_recall@10
value: 0.8914285714285715
name: Cosine Recall@10
- type: cosine_ndcg@10
value: 0.7985100284142371
name: Cosine Ndcg@10
- type: cosine_mrr@10
value: 0.768848072562358
name: Cosine Mrr@10
- type: cosine_map@100
value: 0.7728585204433037
name: Cosine Map@100
- task:
type: information-retrieval
name: Information Retrieval
dataset:
name: dim 128
type: dim_128
metrics:
- type: cosine_accuracy@1
value: 0.6842857142857143
name: Cosine Accuracy@1
- type: cosine_accuracy@3
value: 0.8085714285714286
name: Cosine Accuracy@3
- type: cosine_accuracy@5
value: 0.8442857142857143
name: Cosine Accuracy@5
- type: cosine_accuracy@10
value: 0.8842857142857142
name: Cosine Accuracy@10
- type: cosine_precision@1
value: 0.6842857142857143
name: Cosine Precision@1
- type: cosine_precision@3
value: 0.26952380952380955
name: Cosine Precision@3
- type: cosine_precision@5
value: 0.16885714285714284
name: Cosine Precision@5
- type: cosine_precision@10
value: 0.08842857142857141
name: Cosine Precision@10
- type: cosine_recall@1
value: 0.6842857142857143
name: Cosine Recall@1
- type: cosine_recall@3
value: 0.8085714285714286
name: Cosine Recall@3
- type: cosine_recall@5
value: 0.8442857142857143
name: Cosine Recall@5
- type: cosine_recall@10
value: 0.8842857142857142
name: Cosine Recall@10
- type: cosine_ndcg@10
value: 0.7851722154382534
name: Cosine Ndcg@10
- type: cosine_mrr@10
value: 0.753300453514739
name: Cosine Mrr@10
- type: cosine_map@100
value: 0.7577938812506425
name: Cosine Map@100
- task:
type: information-retrieval
name: Information Retrieval
dataset:
name: dim 64
type: dim_64
metrics:
- type: cosine_accuracy@1
value: 0.6685714285714286
name: Cosine Accuracy@1
- type: cosine_accuracy@3
value: 0.78
name: Cosine Accuracy@3
- type: cosine_accuracy@5
value: 0.8114285714285714
name: Cosine Accuracy@5
- type: cosine_accuracy@10
value: 0.8628571428571429
name: Cosine Accuracy@10
- type: cosine_precision@1
value: 0.6685714285714286
name: Cosine Precision@1
- type: cosine_precision@3
value: 0.26
name: Cosine Precision@3
- type: cosine_precision@5
value: 0.16228571428571426
name: Cosine Precision@5
- type: cosine_precision@10
value: 0.08628571428571427
name: Cosine Precision@10
- type: cosine_recall@1
value: 0.6685714285714286
name: Cosine Recall@1
- type: cosine_recall@3
value: 0.78
name: Cosine Recall@3
- type: cosine_recall@5
value: 0.8114285714285714
name: Cosine Recall@5
- type: cosine_recall@10
value: 0.8628571428571429
name: Cosine Recall@10
- type: cosine_ndcg@10
value: 0.7647477473058039
name: Cosine Ndcg@10
- type: cosine_mrr@10
value: 0.7335680272108841
name: Cosine Mrr@10
- type: cosine_map@100
value: 0.7387414091286255
name: Cosine Map@100
---
# BGE base Financial Matryoshka
This is a [sentence-transformers](https://www.SBERT.net) model fine-tuned from [BAAI/bge-base-en-v1.5](https://huggingface.co/BAAI/bge-base-en-v1.5) on the json dataset (6,300 financial question-answer pairs). It maps sentences & paragraphs to a 768-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.
## Model Details
### Model Description
- **Model Type:** Sentence Transformer
- **Base model:** [BAAI/bge-base-en-v1.5](https://huggingface.co/BAAI/bge-base-en-v1.5)
- **Maximum Sequence Length:** 512 tokens
- **Output Dimensionality:** 768 dimensions
- **Similarity Function:** Cosine Similarity
- **Training Dataset:**
- json
- **Language:** en
- **License:** apache-2.0
### Model Sources
- **Documentation:** [Sentence Transformers Documentation](https://sbert.net)
- **Repository:** [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers)
- **Hugging Face:** [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers)
### Full Model Architecture
```
SentenceTransformer(
(0): Transformer({'max_seq_length': 512, 'do_lower_case': True}) with Transformer model: BertModel
(1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
(2): Normalize()
)
```
## Usage
### Direct Usage (Sentence Transformers)
First install the Sentence Transformers library:
```bash
pip install -U sentence-transformers
```
Then you can load this model and run inference.
```python
from sentence_transformers import SentenceTransformer
# Download from the 🤗 Hub
model = SentenceTransformer("kenoc/bge-base-financial-matryoshka")
# Run inference
sentences = [
'For the year ended December 31, 2022, the free cash flow reported was -$11,569 million.',
'What was the free cash flow reported for the year ended December 31, 2022?',
'What was the amount of deferred net loss on derivatives included in accumulated other comprehensive income as of December 31, 2023?',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# (3, 768)
# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# torch.Size([3, 3])
```
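Since the model targets financial question answering, it can also be used for retrieval: encode a query and a set of candidate passages, then rank the passages by similarity. A minimal sketch (the query and passages are taken from the examples above):
```python
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("kenoc/bge-base-financial-matryoshka")

query = "What was the free cash flow reported for the year ended December 31, 2022?"
passages = [
    "For the year ended December 31, 2022, the free cash flow reported was -$11,569 million.",
    "Walmart's fintech venture, ONE, provides financial services such as money orders, prepaid access, money transfers, check cashing, bill payment, and certain types of installment lending.",
]

# Encode the query and the candidate passages separately
query_embedding = model.encode(query)
passage_embeddings = model.encode(passages)

# Cosine similarity scores, shape [1, len(passages)]
scores = model.similarity(query_embedding, passage_embeddings)

# Print passages from most to least similar
for idx in scores[0].argsort(descending=True):
    i = int(idx)
    print(f"{scores[0][i]:.4f} | {passages[i]}")
```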
## Evaluation
### Metrics
#### Information Retrieval
* Datasets: `dim_768`, `dim_512`, `dim_256`, `dim_128` and `dim_64`
* Evaluated with [`InformationRetrievalEvaluator`](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.InformationRetrievalEvaluator)
| Metric | dim_768 | dim_512 | dim_256 | dim_128 | dim_64 |
|:--------------------|:-----------|:-----------|:-----------|:-----------|:-----------|
| cosine_accuracy@1 | 0.7029 | 0.7071 | 0.7057 | 0.6843 | 0.6686 |
| cosine_accuracy@3 | 0.8243 | 0.8229 | 0.8186 | 0.8086 | 0.78 |
| cosine_accuracy@5 | 0.8586 | 0.8571 | 0.85 | 0.8443 | 0.8114 |
| cosine_accuracy@10 | 0.8943 | 0.8929 | 0.8914 | 0.8843 | 0.8629 |
| cosine_precision@1 | 0.7029 | 0.7071 | 0.7057 | 0.6843 | 0.6686 |
| cosine_precision@3 | 0.2748 | 0.2743 | 0.2729 | 0.2695 | 0.26 |
| cosine_precision@5 | 0.1717 | 0.1714 | 0.17 | 0.1689 | 0.1623 |
| cosine_precision@10 | 0.0894 | 0.0893 | 0.0891 | 0.0884 | 0.0863 |
| cosine_recall@1 | 0.7029 | 0.7071 | 0.7057 | 0.6843 | 0.6686 |
| cosine_recall@3 | 0.8243 | 0.8229 | 0.8186 | 0.8086 | 0.78 |
| cosine_recall@5 | 0.8586 | 0.8571 | 0.85 | 0.8443 | 0.8114 |
| cosine_recall@10 | 0.8943 | 0.8929 | 0.8914 | 0.8843 | 0.8629 |
| **cosine_ndcg@10** | **0.8013** | **0.8021** | **0.7985** | **0.7852** | **0.7647** |
| cosine_mrr@10 | 0.7713 | 0.7728 | 0.7688 | 0.7533 | 0.7336 |
| cosine_map@100 | 0.7753 | 0.777 | 0.7729 | 0.7578 | 0.7387 |
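Because the model was trained with MatryoshkaLoss, the first *n* dimensions of each embedding remain useful on their own, which is what the columns above measure: truncating from 768 to 256 dimensions costs less than 0.003 nDCG@10. A minimal sketch of loading the model with truncated embeddings via the `truncate_dim` argument (available in sentence-transformers v2.7 and later):
```python
from sentence_transformers import SentenceTransformer

# Keep only the first 256 Matryoshka dimensions of every embedding
model = SentenceTransformer("kenoc/bge-base-financial-matryoshka", truncate_dim=256)

embeddings = model.encode([
    "How do Medicare Advantage star ratings impact the financial reimbursements of plans?",
])
print(embeddings.shape)
# (1, 256)
```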
## Training Details
### Training Dataset
#### json
* Dataset: json
* Size: 6,300 training samples
* Columns: `positive` and `anchor`
* Approximate statistics based on the first 1000 samples:
| | positive | anchor |
|:--------|:---------|:-------|
| type | string | string |
* Samples:
| positive | anchor |
|:---------|:-------|
| <code>During fiscal year 2023, 276 billion payments and cash transactions with Visa’s brand were processed by Visa or other networks.</code> | <code>What significant milestone of transactions did Visa reach during fiscal year 2023?</code> |
| <code>The AMPTC for microinverters decreases by 25% each year beginning in 2030 and ending after 2032.</code> | <code>What is the trajectory of the AMPTC for microinverters starting in 2030?</code> |
| <code>Revenue increased in 2023 driven by increased volume and higher realized prices. The increase in revenue in 2023 was primarily driven by sales of Mounjaro®, Verzenio®, Jardiance®, as well as the sales of the rights for the olanzapine portfolio, including Zyprexa®, and for Baqsimi®, partially offset by the absence of revenue from COVID-19 antibodies and lower sales of Alimta® following the entry of multiple generics in the first half of 2022.</code> | <code>What factors contributed to the increase in revenue in 2023?</code> |
* Loss: [`MatryoshkaLoss`](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#matryoshkaloss) with these parameters:
```json
{
"loss": "MultipleNegativesRankingLoss",
"matryoshka_dims": [
768,
512,
256,
128,
64
],
"matryoshka_weights": [
1,
1,
1,
1,
1
],
"n_dims_per_step": -1
}
```
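In code, this configuration corresponds to wrapping an in-batch-negatives loss in `MatryoshkaLoss`, roughly as follows (a sketch of the training-time setup, not part of this repository):
```python
from sentence_transformers import SentenceTransformer
from sentence_transformers.losses import MatryoshkaLoss, MultipleNegativesRankingLoss

model = SentenceTransformer("BAAI/bge-base-en-v1.5")

# MultipleNegativesRankingLoss treats other in-batch anchors as negatives;
# MatryoshkaLoss applies it at each truncated dimension with equal weight.
base_loss = MultipleNegativesRankingLoss(model)
loss = MatryoshkaLoss(
    model,
    base_loss,
    matryoshka_dims=[768, 512, 256, 128, 64],
    matryoshka_weights=[1, 1, 1, 1, 1],
    n_dims_per_step=-1,  # use every dimension at every training step
)
```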
### Training Hyperparameters
#### Non-Default Hyperparameters
- `eval_strategy`: epoch
- `per_device_train_batch_size`: 16
- `per_device_eval_batch_size`: 16
- `gradient_accumulation_steps`: 8
- `learning_rate`: 2e-05
- `num_train_epochs`: 4
- `lr_scheduler_type`: cosine
- `warmup_ratio`: 0.1
- `bf16`: True
- `load_best_model_at_end`: True
- `optim`: adamw_torch_fused
- `batch_sampler`: no_duplicates
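Expressed as `SentenceTransformerTrainingArguments`, these values look roughly like the sketch below (the `output_dir` is a placeholder):
```python
from sentence_transformers import SentenceTransformerTrainingArguments
from sentence_transformers.training_args import BatchSamplers

args = SentenceTransformerTrainingArguments(
    output_dir="bge-base-financial-matryoshka",  # placeholder
    eval_strategy="epoch",
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    gradient_accumulation_steps=8,  # effective batch size of 128
    learning_rate=2e-5,
    num_train_epochs=4,
    lr_scheduler_type="cosine",
    warmup_ratio=0.1,
    bf16=True,
    load_best_model_at_end=True,
    optim="adamw_torch_fused",
    batch_sampler=BatchSamplers.NO_DUPLICATES,  # avoid duplicate in-batch negatives
)
```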
#### All Hyperparameters