---
tags:
- setfit
- sentence-transformers
- text-classification
- generated_from_setfit_trainer
widget:
- text: i’ve just been making sure that it is healthier food and not unhealthy food.
- text: 28 male, history of smoking but quit last year, no major health issues, history
of pretty bad acne on back as a teen - was on acutane as a teen, 6ft something,
healthy average weight.
- text: is this expected of a fairly healthy young person just due to getting covid?
- text: we never said no matter the cost, we always said as long as mom and baby are
healthy.
- text: for how many days in succession, can one healthy individual take a single
dose of 500mg paracetamol, without causing liver damage?
metrics:
- accuracy
- precision
- recall
- f1
pipeline_tag: text-classification
library_name: setfit
inference: true
base_model: sentence-transformers/paraphrase-mpnet-base-v2
model-index:
- name: SetFit with sentence-transformers/paraphrase-mpnet-base-v2
results:
- task:
type: text-classification
name: Text Classification
dataset:
name: Unknown
type: unknown
split: test
metrics:
- type: accuracy
value: 0.75
name: Accuracy
- type: precision
value: 0.75
name: Precision
- type: recall
value: 0.75
name: Recall
- type: f1
value: 0.75
name: F1
---
# SetFit with sentence-transformers/paraphrase-mpnet-base-v2
This is a [SetFit](https://github.com/huggingface/setfit) model that can be used for Text Classification. This SetFit model uses [sentence-transformers/paraphrase-mpnet-base-v2](https://huggingface.co/sentence-transformers/paraphrase-mpnet-base-v2) as the Sentence Transformer embedding model. A [LogisticRegression](https://scikit-learn.org/stable/modules/generated/sklearn.linear_model.LogisticRegression.html) instance is used for classification.
The model has been trained using an efficient few-shot learning technique that involves:
1. Fine-tuning a [Sentence Transformer](https://www.sbert.net) with contrastive learning.
2. Training a classification head with features from the fine-tuned Sentence Transformer.
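As a rough illustration (not the exact script used to train this model), the two stages map onto the `setfit` `Trainer` API roughly as follows; the tiny inline dataset is purely illustrative:
```python
from datasets import Dataset
from setfit import SetFitModel, Trainer, TrainingArguments

# Purely illustrative few-shot data; replace with your own labelled examples.
train_dataset = Dataset.from_dict({
    "text": [
        "as long as mom and baby are healthy.",
        "the birth plan is ultimately having a healthy baby and mom.",
        "my heart is perfectly healthy otherwise.",
        "all other blood work they did this year was unremarkable.",
    ],
    "label": ["no", "no", "time", "time"],
})

# Wrap the Sentence Transformer body with the default LogisticRegression head.
model = SetFitModel.from_pretrained("sentence-transformers/paraphrase-mpnet-base-v2")

trainer = Trainer(
    model=model,
    args=TrainingArguments(batch_size=16, num_epochs=1),
    train_dataset=train_dataset,
)
trainer.train()  # stage 1: contrastive fine-tuning of the body; stage 2: fitting the head
```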
## Model Details
### Model Description
- **Model Type:** SetFit
- **Sentence Transformer body:** [sentence-transformers/paraphrase-mpnet-base-v2](https://huggingface.co/sentence-transformers/paraphrase-mpnet-base-v2)
- **Classification head:** a [LogisticRegression](https://scikit-learn.org/stable/modules/generated/sklearn.linear_model.LogisticRegression.html) instance
- **Maximum Sequence Length:** 512 tokens
- **Number of Classes:** 2 classes
### Model Sources
- **Repository:** [SetFit on GitHub](https://github.com/huggingface/setfit)
- **Paper:** [Efficient Few-Shot Learning Without Prompts](https://arxiv.org/abs/2209.11055)
- **Blogpost:** [SetFit: Efficient Few-Shot Learning Without Prompts](https://huggingface.co/blog/setfit)
### Model Labels
| Label | Examples |
|:------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| time | <ul><li>"i mean it hasn't been that long, my heart is perfectly healthy otherwise."</li><li>'otherwise i’m been healthy and all other blood work they did this year was unremarkable. \n'</li><li>'but i can not run 3 seconds without breathing for 10 minutes that should say how unhealthy i am.'</li></ul> |
| no | <ul><li>'one of the ob doctors i work with likes to emphasize these lists are birth "preferences" as the birth plan is ultimately having a healthy baby and mom.'</li><li>'some who may seem “soft” to you enjoy the challenge and reward of safely delivering tens of thousands of healthy babies in their career and putting them in their mother’s arms. \n\n'</li><li>'so you are right, he is just going to wipe out normal healthy flora , and this includes that wee innocent little lactobacillus the gp wants to put down like old yeller.'</li></ul> |
## Evaluation
### Metrics
| Label | Accuracy | Precision | Recall | F1 |
|:--------|:---------|:----------|:-------|:-----|
| **all** | 0.75 | 0.75 | 0.75 | 0.75 |
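These figures come from a held-out test split that is not published with the model, and the averaging strategy behind the reported precision, recall, and F1 is not recorded here. If you have your own labelled data, the same four metrics can be recomputed with scikit-learn; a minimal sketch, assuming weighted averaging (the texts, gold labels, and model id are placeholders):
```python
from setfit import SetFitModel
from sklearn.metrics import accuracy_score, precision_recall_fscore_support

model = SetFitModel.from_pretrained("setfit_model_id")  # placeholder Hub id

# Placeholder evaluation data; substitute your own texts and gold labels.
texts = ["is this expected of a fairly healthy young person just due to getting covid?"]
gold = ["time"]

preds = model.predict(texts)
accuracy = accuracy_score(gold, preds)
precision, recall, f1, _ = precision_recall_fscore_support(
    gold, preds, average="weighted", zero_division=0
)
print(f"accuracy={accuracy:.2f} precision={precision:.2f} recall={recall:.2f} f1={f1:.2f}")
```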
## Uses
### Direct Use for Inference
First install the SetFit library:
```bash
pip install setfit
```
Then you can load this model and run inference.
```python
from setfit import SetFitModel
# Download from the 🤗 Hub
model = SetFitModel.from_pretrained("setfit_model_id")  # replace with this model's Hub id
# Run inference
preds = model("i’ve just been making sure that it is healthier food and not unhealthy food.")
```
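If you also need class probabilities from the classification head rather than only the predicted label, `SetFitModel.predict_proba` can be used alongside `predict`; a short sketch (the model id is the same placeholder as above):
```python
from setfit import SetFitModel

model = SetFitModel.from_pretrained("setfit_model_id")  # placeholder Hub id, as above

texts = [
    "i’ve just been making sure that it is healthier food and not unhealthy food.",
    "we never said no matter the cost, we always said as long as mom and baby are healthy.",
]
labels = model.predict(texts)        # predicted label per input, e.g. "time" or "no"
probas = model.predict_proba(texts)  # per-class probabilities from the LogisticRegression head
for text, label, proba in zip(texts, labels, probas):
    print(label, proba, text)
```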
## Training Details
### Training Set Metrics
| Training set | Min | Median | Max |
|:-------------|:----|:-------|:----|
| Word count   | 12  | 25.325 | 60  |

| Label | Training Sample Count |
|:------|:----------------------|
| no | 36 |
| time | 44 |
### Training Hyperparameters
- batch_size: (16, 16)
- num_epochs: (10, 10)
- max_steps: -1
- sampling_strategy: oversampling
- num_iterations: 20
- body_learning_rate: (2e-05, 2e-05)
- head_learning_rate: 2e-05
- loss: CosineSimilarityLoss
- distance_metric: cosine_distance
- margin: 0.25
- end_to_end: False
- use_amp: False
- warmup_proportion: 0.1
- l2_weight: 0.01
- seed: 3786
- eval_max_steps: -1
- load_best_model_at_end: False
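These values are fields of `setfit.TrainingArguments`. A hedged sketch of how the listed configuration could be reconstructed (tuples give separate values for the embedding and classifier phases; `distance_metric` and `margin` only apply to triplet-style losses and are left at their defaults here):
```python
from sentence_transformers.losses import CosineSimilarityLoss
from setfit import TrainingArguments

args = TrainingArguments(
    batch_size=(16, 16),            # (embedding phase, classifier phase)
    num_epochs=(10, 10),
    max_steps=-1,
    sampling_strategy="oversampling",
    num_iterations=20,
    body_learning_rate=(2e-05, 2e-05),
    head_learning_rate=2e-05,
    loss=CosineSimilarityLoss,      # SetFit's default contrastive loss
    end_to_end=False,
    use_amp=False,
    warmup_proportion=0.1,
    l2_weight=0.01,
    seed=3786,
    load_best_model_at_end=False,
)
```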
### Training Results
| Epoch | Step | Training Loss | Validation Loss |
|:-----:|:----:|:-------------:|:---------------:|
| 0.005 | 1 | 0.2939 | - |
| 0.25 | 50 | 0.2641 | - |
| 0.5 | 100 | 0.195 | - |
| 0.75 | 150 | 0.0162 | - |
| 1.0 | 200 | 0.0007 | - |
| 1.25 | 250 | 0.0003 | - |
| 1.5 | 300 | 0.0002 | - |
| 1.75 | 350 | 0.0001 | - |
| 2.0 | 400 | 0.0001 | - |
| 2.25 | 450 | 0.0002 | - |
| 2.5 | 500 | 0.0013 | - |
| 2.75 | 550 | 0.0002 | - |
| 3.0 | 600 | 0.0006 | - |
| 3.25 | 650 | 0.0015 | - |
| 3.5 | 700 | 0.0008 | - |
| 3.75 | 750 | 0.0001 | - |
| 4.0 | 800 | 0.0001 | - |
| 4.25 | 850 | 0.0007 | - |
| 4.5 | 900 | 0.0001 | - |
| 4.75 | 950 | 0.003 | - |
| 5.0 | 1000 | 0.0001 | - |
| 5.25 | 1050 | 0.0018 | - |
| 5.5 | 1100 | 0.0001 | - |
| 5.75 | 1150 | 0.0001 | - |
| 6.0 | 1200 | 0.0014 | - |
| 6.25 | 1250 | 0.0001 | - |
| 6.5 | 1300 | 0.0009 | - |
| 6.75 | 1350 | 0.0001 | - |
| 7.0 | 1400 | 0.0002 | - |
| 7.25 | 1450 | 0.0 | - |
| 7.5 | 1500 | 0.0 | - |
| 7.75 | 1550 | 0.0002 | - |
| 8.0 | 1600 | 0.0 | - |
| 8.25 | 1650 | 0.0006 | - |
| 8.5 | 1700 | 0.0 | - |
| 8.75 | 1750 | 0.0 | - |
| 9.0 | 1800 | 0.0 | - |
| 9.25 | 1850 | 0.0 | - |
| 9.5 | 1900 | 0.0 | - |
| 9.75 | 1950 | 0.0 | - |
| 10.0 | 2000 | 0.0 | - |
### Framework Versions
- Python: 3.11.7
- SetFit: 1.1.1
- Sentence Transformers: 3.3.1
- Transformers: 4.44.2
- PyTorch: 2.5.1
- Datasets: 2.19.0
- Tokenizers: 0.19.1
## Citation
### BibTeX
```bibtex
@article{https://doi.org/10.48550/arxiv.2209.11055,
doi = {10.48550/ARXIV.2209.11055},
url = {https://arxiv.org/abs/2209.11055},
author = {Tunstall, Lewis and Reimers, Nils and Jo, Unso Eun Seo and Bates, Luke and Korat, Daniel and Wasserblat, Moshe and Pereg, Oren},
keywords = {Computation and Language (cs.CL), FOS: Computer and information sciences, FOS: Computer and information sciences},
title = {Efficient Few-Shot Learning Without Prompts},
publisher = {arXiv},
year = {2022},
copyright = {Creative Commons Attribution 4.0 International}
}
```