---
language:
  - en
license: apache-2.0
tags:
  - sentence-transformers
  - sentence-similarity
  - feature-extraction
  - generated_from_trainer
  - dataset_size:6300
  - loss:MatryoshkaLoss
  - loss:MultipleNegativesRankingLoss
base_model: BAAI/bge-base-en-v1.5
widget:
  - source_sentence: >-
      As of the end of 2023, Hilton's development pipeline included projects in
      118 countries and territories.
    sentences:
      - >-
        What was the total net income attributed to AT&T common stockholders in
        2023?
      - >-
        How many countries and territories did Hilton's development pipeline
        encompass as of the end of 2023?
      - >-
        What caused the increase in Medicare receivables in 2023 compared to
        2022?
  - source_sentence: >-
      Alex G. Balazs was appointed as the Executive Vice President and Chief
      Technology Officer effective September 5, 2023.
    sentences:
      - What page of IBM's Form 10-K contains the Financial Statement Schedule?
      - >-
        When was Alex G. Balazs appointed as the Executive Vice President and
        Chief Technology Officer?
      - >-
        How much were the valuation allowances provided for deferred tax assets
        related to loss carryforwards as of December 31, 2023?
  - source_sentence: >-
      HP's global wellness program emphasizes five pillars of wellness:
      physical, financial, emotional, life balance, and social/community.
    sentences:
      - >-
        What are the five pillars of wellness emphasized in HP's global wellness
        program?
      - >-
        What was the fair value of money market mutual funds measured at as of
        January 31, 2023 and how was it categorized in the fair value hierarchy?
      - >-
        What amount was authorized for future share repurchases by the company
        as of October 31, 2023?
  - source_sentence: >-
      Item 3, titled 'Legal Proceedings' in a 10-K filing, directs to Note 16
      where specific information is further detailed in Item 8 of Part II.
    sentences:
      - >-
        What was the grant date fair value of options vested for HP in fiscal
        years 2023, 2022, and 2021?
      - >-
        What is the balance at the end of the year for Comcast's Total Equity in
        2023?
      - What is indicated by Item 3, 'Legal Proceedings', in a 10-K filing?
  - source_sentence: >-
      During 2023, we received approximately $220 of cash collateral, on a net
      basis.
    sentences:
      - How much cash collateral did AT&T receive on a net basis during 2023?
      - >-
        What percentage of FedEx Corporation's consolidated revenues did jet
        fuel costs represent in 2023?
      - >-
        What measures has Bank of America taken to streamline its organizational
        structure?
pipeline_tag: sentence-similarity
library_name: sentence-transformers
metrics:
  - cosine_accuracy@1
  - cosine_accuracy@3
  - cosine_accuracy@5
  - cosine_accuracy@10
  - cosine_precision@1
  - cosine_precision@3
  - cosine_precision@5
  - cosine_precision@10
  - cosine_recall@1
  - cosine_recall@3
  - cosine_recall@5
  - cosine_recall@10
  - cosine_ndcg@10
  - cosine_mrr@10
  - cosine_map@100
model-index:
  - name: BGE base Financial Matryoshka
    results:
      - task:
          type: information-retrieval
          name: Information Retrieval
        dataset:
          name: dim 768
          type: dim_768
        metrics:
          - type: cosine_accuracy@1
            value: 0.7128571428571429
            name: Cosine Accuracy@1
          - type: cosine_accuracy@3
            value: 0.8428571428571429
            name: Cosine Accuracy@3
          - type: cosine_accuracy@5
            value: 0.8842857142857142
            name: Cosine Accuracy@5
          - type: cosine_accuracy@10
            value: 0.92
            name: Cosine Accuracy@10
          - type: cosine_precision@1
            value: 0.7128571428571429
            name: Cosine Precision@1
          - type: cosine_precision@3
            value: 0.28095238095238095
            name: Cosine Precision@3
          - type: cosine_precision@5
            value: 0.17685714285714288
            name: Cosine Precision@5
          - type: cosine_precision@10
            value: 0.09199999999999998
            name: Cosine Precision@10
          - type: cosine_recall@1
            value: 0.7128571428571429
            name: Cosine Recall@1
          - type: cosine_recall@3
            value: 0.8428571428571429
            name: Cosine Recall@3
          - type: cosine_recall@5
            value: 0.8842857142857142
            name: Cosine Recall@5
          - type: cosine_recall@10
            value: 0.92
            name: Cosine Recall@10
          - type: cosine_ndcg@10
            value: 0.8195233962517928
            name: Cosine Ndcg@10
          - type: cosine_mrr@10
            value: 0.7870022675736963
            name: Cosine Mrr@10
          - type: cosine_map@100
            value: 0.7905145024165581
            name: Cosine Map@100
      - task:
          type: information-retrieval
          name: Information Retrieval
        dataset:
          name: dim 512
          type: dim_512
        metrics:
          - type: cosine_accuracy@1
            value: 0.7157142857142857
            name: Cosine Accuracy@1
          - type: cosine_accuracy@3
            value: 0.8457142857142858
            name: Cosine Accuracy@3
          - type: cosine_accuracy@5
            value: 0.8814285714285715
            name: Cosine Accuracy@5
          - type: cosine_accuracy@10
            value: 0.9228571428571428
            name: Cosine Accuracy@10
          - type: cosine_precision@1
            value: 0.7157142857142857
            name: Cosine Precision@1
          - type: cosine_precision@3
            value: 0.2819047619047619
            name: Cosine Precision@3
          - type: cosine_precision@5
            value: 0.1762857142857143
            name: Cosine Precision@5
          - type: cosine_precision@10
            value: 0.09228571428571428
            name: Cosine Precision@10
          - type: cosine_recall@1
            value: 0.7157142857142857
            name: Cosine Recall@1
          - type: cosine_recall@3
            value: 0.8457142857142858
            name: Cosine Recall@3
          - type: cosine_recall@5
            value: 0.8814285714285715
            name: Cosine Recall@5
          - type: cosine_recall@10
            value: 0.9228571428571428
            name: Cosine Recall@10
          - type: cosine_ndcg@10
            value: 0.821183673183428
            name: Cosine Ndcg@10
          - type: cosine_mrr@10
            value: 0.7884829931972789
            name: Cosine Mrr@10
          - type: cosine_map@100
            value: 0.7916656681436871
            name: Cosine Map@100
      - task:
          type: information-retrieval
          name: Information Retrieval
        dataset:
          name: dim 256
          type: dim_256
        metrics:
          - type: cosine_accuracy@1
            value: 0.7114285714285714
            name: Cosine Accuracy@1
          - type: cosine_accuracy@3
            value: 0.8414285714285714
            name: Cosine Accuracy@3
          - type: cosine_accuracy@5
            value: 0.8842857142857142
            name: Cosine Accuracy@5
          - type: cosine_accuracy@10
            value: 0.9157142857142857
            name: Cosine Accuracy@10
          - type: cosine_precision@1
            value: 0.7114285714285714
            name: Cosine Precision@1
          - type: cosine_precision@3
            value: 0.28047619047619043
            name: Cosine Precision@3
          - type: cosine_precision@5
            value: 0.17685714285714285
            name: Cosine Precision@5
          - type: cosine_precision@10
            value: 0.09157142857142858
            name: Cosine Precision@10
          - type: cosine_recall@1
            value: 0.7114285714285714
            name: Cosine Recall@1
          - type: cosine_recall@3
            value: 0.8414285714285714
            name: Cosine Recall@3
          - type: cosine_recall@5
            value: 0.8842857142857142
            name: Cosine Recall@5
          - type: cosine_recall@10
            value: 0.9157142857142857
            name: Cosine Recall@10
          - type: cosine_ndcg@10
            value: 0.8157881706696753
            name: Cosine Ndcg@10
          - type: cosine_mrr@10
            value: 0.7834812925170066
            name: Cosine Mrr@10
          - type: cosine_map@100
            value: 0.7870779881453726
            name: Cosine Map@100
      - task:
          type: information-retrieval
          name: Information Retrieval
        dataset:
          name: dim 128
          type: dim_128
        metrics:
          - type: cosine_accuracy@1
            value: 0.6957142857142857
            name: Cosine Accuracy@1
          - type: cosine_accuracy@3
            value: 0.82
            name: Cosine Accuracy@3
          - type: cosine_accuracy@5
            value: 0.8685714285714285
            name: Cosine Accuracy@5
          - type: cosine_accuracy@10
            value: 0.9057142857142857
            name: Cosine Accuracy@10
          - type: cosine_precision@1
            value: 0.6957142857142857
            name: Cosine Precision@1
          - type: cosine_precision@3
            value: 0.2733333333333333
            name: Cosine Precision@3
          - type: cosine_precision@5
            value: 0.1737142857142857
            name: Cosine Precision@5
          - type: cosine_precision@10
            value: 0.09057142857142857
            name: Cosine Precision@10
          - type: cosine_recall@1
            value: 0.6957142857142857
            name: Cosine Recall@1
          - type: cosine_recall@3
            value: 0.82
            name: Cosine Recall@3
          - type: cosine_recall@5
            value: 0.8685714285714285
            name: Cosine Recall@5
          - type: cosine_recall@10
            value: 0.9057142857142857
            name: Cosine Recall@10
          - type: cosine_ndcg@10
            value: 0.8018105093606251
            name: Cosine Ndcg@10
          - type: cosine_mrr@10
            value: 0.7683497732426302
            name: Cosine Mrr@10
          - type: cosine_map@100
            value: 0.7722509873826792
            name: Cosine Map@100
      - task:
          type: information-retrieval
          name: Information Retrieval
        dataset:
          name: dim 64
          type: dim_64
        metrics:
          - type: cosine_accuracy@1
            value: 0.6528571428571428
            name: Cosine Accuracy@1
          - type: cosine_accuracy@3
            value: 0.7942857142857143
            name: Cosine Accuracy@3
          - type: cosine_accuracy@5
            value: 0.8314285714285714
            name: Cosine Accuracy@5
          - type: cosine_accuracy@10
            value: 0.8757142857142857
            name: Cosine Accuracy@10
          - type: cosine_precision@1
            value: 0.6528571428571428
            name: Cosine Precision@1
          - type: cosine_precision@3
            value: 0.26476190476190475
            name: Cosine Precision@3
          - type: cosine_precision@5
            value: 0.1662857142857143
            name: Cosine Precision@5
          - type: cosine_precision@10
            value: 0.08757142857142856
            name: Cosine Precision@10
          - type: cosine_recall@1
            value: 0.6528571428571428
            name: Cosine Recall@1
          - type: cosine_recall@3
            value: 0.7942857142857143
            name: Cosine Recall@3
          - type: cosine_recall@5
            value: 0.8314285714285714
            name: Cosine Recall@5
          - type: cosine_recall@10
            value: 0.8757142857142857
            name: Cosine Recall@10
          - type: cosine_ndcg@10
            value: 0.7667522193115596
            name: Cosine Ndcg@10
          - type: cosine_mrr@10
            value: 0.7315833333333331
            name: Cosine Mrr@10
          - type: cosine_map@100
            value: 0.7359673420065519
            name: Cosine Map@100
---

BGE base Financial Matryoshka

This is a sentence-transformers model finetuned from BAAI/bge-base-en-v1.5 on the json dataset. It maps sentences & paragraphs to a 768-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.

Model Details

Model Description

  • Model Type: Sentence Transformer
  • Base model: BAAI/bge-base-en-v1.5
  • Maximum Sequence Length: 512 tokens
  • Output Dimensionality: 768 dimensions
  • Similarity Function: Cosine Similarity
  • Training Dataset:
    • json
  • Language: en
  • License: apache-2.0

Model Sources

  • Documentation: Sentence Transformers Documentation (https://sbert.net)
  • Repository: Sentence Transformers on GitHub (https://github.com/UKPLab/sentence-transformers)
  • Hugging Face: Sentence Transformers on Hugging Face (https://huggingface.co/models?library=sentence-transformers)
Full Model Architecture

SentenceTransformer(
  (0): Transformer({'max_seq_length': 512, 'do_lower_case': True}) with Transformer model: BertModel 
  (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
  (2): Normalize()
)
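
The pipeline above encodes text with a BERT backbone, pools by taking the CLS-token embedding, and L2-normalizes the result. For reference, a minimal sketch of the equivalent steps using only the transformers library (assuming this repository's checkpoint name; the Sentence Transformers usage below is the supported path):

import torch
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("sud-962081/bge-base-financial-matryoshka")
model = AutoModel.from_pretrained("sud-962081/bge-base-financial-matryoshka")

inputs = tokenizer(
    ["During 2023, we received approximately $220 of cash collateral, on a net basis."],
    padding=True,
    truncation=True,
    max_length=512,
    return_tensors="pt",
)
with torch.no_grad():
    outputs = model(**inputs)

cls = outputs.last_hidden_state[:, 0]  # CLS-token pooling (pooling_mode_cls_token)
embeddings = torch.nn.functional.normalize(cls, p=2, dim=1)  # the Normalize() module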

Usage

Direct Usage (Sentence Transformers)

First install the Sentence Transformers library:

pip install -U sentence-transformers

Then you can load this model and run inference.

from sentence_transformers import SentenceTransformer

# Download from the 🤗 Hub
model = SentenceTransformer("sud-962081/bge-base-financial-matryoshka")
# Run inference
sentences = [
    'During 2023, we received approximately $220 of cash collateral, on a net basis.',
    'How much cash collateral did AT&T receive on a net basis during 2023?',
    "What percentage of FedEx Corporation's consolidated revenues did jet fuel costs represent in 2023?",
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# (3, 768)

# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# torch.Size([3, 3])
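
Because the model was trained with MatryoshkaLoss, the leading slice of each embedding is itself a usable embedding. A minimal sketch of encoding at a smaller Matryoshka dimension via the truncate_dim argument (assumed available in recent sentence-transformers releases):

from sentence_transformers import SentenceTransformer

# truncate_dim keeps only the first 256 dimensions of every embedding
model = SentenceTransformer("sud-962081/bge-base-financial-matryoshka", truncate_dim=256)
embeddings = model.encode([
    "During 2023, we received approximately $220 of cash collateral, on a net basis.",
    "How much cash collateral did AT&T receive on a net basis during 2023?",
])
print(embeddings.shape)
# (2, 256)

The truncated vectors are no longer unit-norm after slicing, but model.similarity computes cosine similarity, which renormalizes internally.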

Evaluation

Metrics

Information Retrieval

Metric dim_768 dim_512 dim_256 dim_128 dim_64
cosine_accuracy@1 0.7129 0.7157 0.7114 0.6957 0.6529
cosine_accuracy@3 0.8429 0.8457 0.8414 0.82 0.7943
cosine_accuracy@5 0.8843 0.8814 0.8843 0.8686 0.8314
cosine_accuracy@10 0.92 0.9229 0.9157 0.9057 0.8757
cosine_precision@1 0.7129 0.7157 0.7114 0.6957 0.6529
cosine_precision@3 0.281 0.2819 0.2805 0.2733 0.2648
cosine_precision@5 0.1769 0.1763 0.1769 0.1737 0.1663
cosine_precision@10 0.092 0.0923 0.0916 0.0906 0.0876
cosine_recall@1 0.7129 0.7157 0.7114 0.6957 0.6529
cosine_recall@3 0.8429 0.8457 0.8414 0.82 0.7943
cosine_recall@5 0.8843 0.8814 0.8843 0.8686 0.8314
cosine_recall@10 0.92 0.9229 0.9157 0.9057 0.8757
cosine_ndcg@10 0.8195 0.8212 0.8158 0.8018 0.7668
cosine_mrr@10 0.787 0.7885 0.7835 0.7683 0.7316
cosine_map@100 0.7905 0.7917 0.7871 0.7723 0.736
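
These scores are the standard sentence-transformers retrieval metrics, computed at each Matryoshka dimension. A sketch of how such numbers can be reproduced with InformationRetrievalEvaluator; the tiny queries/corpus here are placeholders drawn from the widget examples above, whereas the reported figures use the held-out evaluation split:

from sentence_transformers import SentenceTransformer
from sentence_transformers.evaluation import InformationRetrievalEvaluator

model = SentenceTransformer("sud-962081/bge-base-financial-matryoshka", truncate_dim=256)

queries = {"q1": "How much cash collateral did AT&T receive on a net basis during 2023?"}
corpus = {"d1": "During 2023, we received approximately $220 of cash collateral, on a net basis."}
relevant_docs = {"q1": {"d1"}}  # each query id maps to the set of relevant document ids

evaluator = InformationRetrievalEvaluator(
    queries=queries, corpus=corpus, relevant_docs=relevant_docs, name="dim_256"
)
results = evaluator(model)  # dict of accuracy/precision/recall/NDCG/MRR/MAP@k scores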

Training Details

Training Dataset

json

  • Dataset: json
  • Size: 6,300 training samples
  • Columns: positive and anchor
  • Approximate statistics based on the first 1000 samples:
    • positive: string; min: 4 tokens, mean: 44.47 tokens, max: 260 tokens
    • anchor: string; min: 9 tokens, mean: 20.17 tokens, max: 43 tokens
  • Samples:
    • positive: SmartFlex benefits and the 'Best of Both' work model at The Hershey Company allow employees to balance professional and personal demands through flexible work arrangements.
      anchor: How does The Hershey Company ensure flexibility and work-life balance for its employees?
    • positive: In February 2024, our Board authorized an additional $2.0 billion stock repurchase program, with no expiration from the date of authorization.
      anchor: What amount was authorized for common stock repurchase by the company's Board in February 2024?
    • positive: Beginning in 2025, the first GM EVs will be constructed using the North American Charging Standard (NACS) hardware.
      anchor: What significant change is set for General Motors' EVs starting in 2025 regarding charging hardware?
  • Loss: MatryoshkaLoss with these parameters:
    {
        "loss": "MultipleNegativesRankingLoss",
        "matryoshka_dims": [
            768,
            512,
            256,
            128,
            64
        ],
        "matryoshka_weights": [
            1,
            1,
            1,
            1,
            1
        ],
        "n_dims_per_step": -1
    }
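
In code, this corresponds to wrapping MultipleNegativesRankingLoss in MatryoshkaLoss, so the in-batch ranking objective is applied at every listed dimension with equal weight. A minimal sketch of the loss construction:

from sentence_transformers import SentenceTransformer
from sentence_transformers.losses import MatryoshkaLoss, MultipleNegativesRankingLoss

model = SentenceTransformer("BAAI/bge-base-en-v1.5")
inner_loss = MultipleNegativesRankingLoss(model)  # ranks the paired anchor against in-batch negatives
loss = MatryoshkaLoss(
    model,
    inner_loss,
    matryoshka_dims=[768, 512, 256, 128, 64],  # dimensions listed above; weights default to 1
)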
    

Training Hyperparameters

Non-Default Hyperparameters

  • eval_strategy: epoch
  • per_device_train_batch_size: 32
  • per_device_eval_batch_size: 16
  • gradient_accumulation_steps: 16
  • learning_rate: 2e-05
  • num_train_epochs: 4
  • lr_scheduler_type: cosine
  • warmup_ratio: 0.1
  • bf16: True
  • load_best_model_at_end: True
  • optim: adamw_torch_fused
  • batch_sampler: no_duplicates
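
Together these give an effective training batch size of 512 (32 per device × 16 accumulation steps). A hedged sketch of the corresponding sentence-transformers 3.x trainer setup; output_dir and the data file path are placeholders, and save_strategy is an assumption needed to satisfy load_best_model_at_end:

from datasets import load_dataset
from sentence_transformers import (
    SentenceTransformer,
    SentenceTransformerTrainer,
    SentenceTransformerTrainingArguments,
)
from sentence_transformers.losses import MatryoshkaLoss, MultipleNegativesRankingLoss
from sentence_transformers.training_args import BatchSamplers

model = SentenceTransformer("BAAI/bge-base-en-v1.5")
loss = MatryoshkaLoss(
    model, MultipleNegativesRankingLoss(model), matryoshka_dims=[768, 512, 256, 128, 64]
)
train_dataset = load_dataset("json", data_files="train.json", split="train")  # hypothetical path

args = SentenceTransformerTrainingArguments(
    output_dir="bge-base-financial-matryoshka",  # placeholder
    num_train_epochs=4,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=16,
    gradient_accumulation_steps=16,  # effective batch size: 32 * 16 = 512
    learning_rate=2e-5,
    lr_scheduler_type="cosine",
    warmup_ratio=0.1,
    bf16=True,
    eval_strategy="epoch",
    save_strategy="epoch",  # assumed: must match eval_strategy when load_best_model_at_end=True
    load_best_model_at_end=True,
    optim="adamw_torch_fused",
    batch_sampler=BatchSamplers.NO_DUPLICATES,  # avoids duplicate texts within a batch
)

trainer = SentenceTransformerTrainer(
    model=model,
    args=args,
    train_dataset=train_dataset,
    loss=loss,
)
trainer.train()

The no_duplicates batch sampler matters here because MultipleNegativesRankingLoss treats every other in-batch example as a negative; duplicate texts would create false negatives.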

All Hyperparameters

  • overwrite_output_dir: False
  • do_predict: False
  • eval_strategy: epoch
  • prediction_loss_only: True
  • per_device_train_batch_size: 32
  • per_device_eval_batch_size: 16
  • per_gpu_train_batch_size: None
  • per_gpu_eval_batch_size: None
  • gradient_accumulation_steps: 16
  • eval_accumulation_steps: None
  • torch_empty_cache_steps: None
  • learning_rate: 2e-05
  • weight_decay: 0.0
  • adam_beta1: 0.9
  • adam_beta2: 0.999
  • adam_epsilon: 1e-08
  • max_grad_norm: 1.0
  • num_train_epochs: 4
  • max_steps: -1
  • lr_scheduler_type: cosine
  • lr_scheduler_kwargs: {}
  • warmup_ratio: 0.1
  • warmup_steps: 0
  • log_level: passive
  • log_level_replica: warning
  • log_on_each_node: True
  • logging_nan_inf_filter: True
  • save_safetensors: True
  • save_on_each_node: False
  • save_only_model: False
  • restore_callback_states_from_checkpoint: False
  • no_cuda: False
  • use_cpu: False
  • use_mps_device: False
  • seed: 42
  • data_seed: None
  • jit_mode_eval: False
  • use_ipex: False
  • bf16: True
  • fp16: False
  • fp16_opt_level: O1
  • half_precision_backend: auto
  • bf16_full_eval: False
  • fp16_full_eval: False
  • tf32: None
  • local_rank: 0
  • ddp_backend: None
  • tpu_num_cores: None
  • tpu_metrics_debug: False
  • debug: []
  • dataloader_drop_last: False
  • dataloader_num_workers: 0
  • dataloader_prefetch_factor: None
  • past_index: -1
  • disable_tqdm: False
  • remove_unused_columns: True
  • label_names: None
  • load_best_model_at_end: True
  • ignore_data_skip: False
  • fsdp: []
  • fsdp_min_num_params: 0
  • fsdp_config: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
  • fsdp_transformer_layer_cls_to_wrap: None
  • accelerator_config: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
  • deepspeed: None
  • label_smoothing_factor: 0.0
  • optim: adamw_torch_fused
  • optim_args: None
  • adafactor: False
  • group_by_length: False
  • length_column_name: length
  • ddp_find_unused_parameters: None
  • ddp_bucket_cap_mb: None
  • ddp_broadcast_buffers: False
  • dataloader_pin_memory: True
  • dataloader_persistent_workers: False
  • skip_memory_metrics: True
  • use_legacy_prediction_loop: False
  • push_to_hub: False
  • resume_from_checkpoint: None
  • hub_model_id: None
  • hub_strategy: every_save
  • hub_private_repo: None
  • hub_always_push: False
  • gradient_checkpointing: False
  • gradient_checkpointing_kwargs: None
  • include_inputs_for_metrics: False
  • include_for_metrics: []
  • eval_do_concat_batches: True
  • fp16_backend: auto
  • push_to_hub_model_id: None
  • push_to_hub_organization: None
  • mp_parameters:
  • auto_find_batch_size: False
  • full_determinism: False
  • torchdynamo: None
  • ray_scope: last
  • ddp_timeout: 1800
  • torch_compile: False
  • torch_compile_backend: None
  • torch_compile_mode: None
  • dispatch_batches: None
  • split_batches: None
  • include_tokens_per_second: False
  • include_num_input_tokens_seen: False
  • neftune_noise_alpha: None
  • optim_target_modules: None
  • batch_eval_metrics: False
  • eval_on_start: False
  • use_liger_kernel: False
  • eval_use_gather_object: False
  • average_tokens_across_devices: False
  • prompts: None
  • batch_sampler: no_duplicates
  • multi_dataset_batch_sampler: proportional

Training Logs

Epoch Step Training Loss dim_768_cosine_ndcg@10 dim_512_cosine_ndcg@10 dim_256_cosine_ndcg@10 dim_128_cosine_ndcg@10 dim_64_cosine_ndcg@10
1.0 7 - 0.8036 0.8049 0.7942 0.7726 0.7375
1.4848 10 2.2028 - - - - -
2.0 14 - 0.8169 0.8173 0.8127 0.8000 0.7602
2.9697 20 0.9836 - - - - -
3.0 21 - 0.8187 0.8214 0.8142 0.8017 0.7658
3.4848 24 - 0.8195 0.8212 0.8158 0.8018 0.7668
  • The row for epoch 3.4848 (step 24) denotes the saved checkpoint; its scores match the Evaluation section above.

Framework Versions

  • Python: 3.10.12
  • Sentence Transformers: 3.3.1
  • Transformers: 4.47.0
  • PyTorch: 2.5.1+cu121
  • Accelerate: 1.2.1
  • Datasets: 3.2.0
  • Tokenizers: 0.21.0

Citation

BibTeX

Sentence Transformers

@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084",
}

MatryoshkaLoss

@misc{kusupati2024matryoshka,
    title={Matryoshka Representation Learning},
    author={Aditya Kusupati and Gantavya Bhatt and Aniket Rege and Matthew Wallingford and Aditya Sinha and Vivek Ramanujan and William Howard-Snyder and Kaifeng Chen and Sham Kakade and Prateek Jain and Ali Farhadi},
    year={2024},
    eprint={2205.13147},
    archivePrefix={arXiv},
    primaryClass={cs.LG}
}

MultipleNegativesRankingLoss

@misc{henderson2017efficient,
    title={Efficient Natural Language Response Suggestion for Smart Reply},
    author={Matthew Henderson and Rami Al-Rfou and Brian Strope and Yun-hsuan Sung and Laszlo Lukacs and Ruiqi Guo and Sanjiv Kumar and Balint Miklos and Ray Kurzweil},
    year={2017},
    eprint={1705.00652},
    archivePrefix={arXiv},
    primaryClass={cs.CL}
}