Update README.md
README.md CHANGED

@@ -74,7 +74,7 @@ Swap:
 CDS differs from swaps as it focuses on credit risk protection, while swaps involve cash flow exchange.
 ```
 
-# Example with Transformers:
+# Example with Transformers and pipeline:
 
 See the snippet below for usage with Transformers:
 
@@ -178,3 +178,21 @@ The least profitable strategy is the inflation-linked bond spread strangle strat
 The most risky strategies are the inflation-linked swap spread strangle strategy and the inflation-linked bond spread strangle strategy, with a risk score of -1 and -2, respectively.
 The least risky strategy is the inflation-linked credit spread straddle strategy, with a risk score of 0.
 ```
+
+
+# Example with Transformers:
+
+```python
+from transformers import AutoTokenizer, AutoModelForCausalLM
+
+tokenizer = AutoTokenizer.from_pretrained("baconnier/finance_dolphin_orpo_llama3_8B_r64_51K")
+model = AutoModelForCausalLM.from_pretrained("baconnier/finance_dolphin_orpo_llama3_8B_r64_51K")
+
+
+prompt = "What is CDS compare it to a swap"
+inputs = tokenizer(prompt, return_tensors="pt")
+
+# Generate
+generate_ids = model.generate(inputs.input_ids, max_length=200)
+tokenizer.batch_decode(generate_ids, skip_special_tokens=True, clean_up_tokenization_spaces=False)[0]
+```
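The renamed heading mentions the `pipeline` helper, but the added snippet only shows the `AutoTokenizer`/`AutoModelForCausalLM` path. A minimal sketch of equivalent usage through Transformers' high-level `pipeline` API, assuming the same model id and prompt as the snippet (the generation call is kept under a `__main__` guard because it downloads the full 8B checkpoint):

```python
# Sketch only: the same generation via the high-level pipeline API.
# MODEL_ID and PROMPT are copied from the README snippet above.
MODEL_ID = "baconnier/finance_dolphin_orpo_llama3_8B_r64_51K"
PROMPT = "What is CDS compare it to a swap"

if __name__ == "__main__":
    # Heavy step: fetches the full 8B checkpoint on first use.
    from transformers import pipeline

    generator = pipeline("text-generation", model=MODEL_ID)
    # Returns a list of dicts; "generated_text" holds prompt + completion.
    result = generator(PROMPT, max_new_tokens=200)
    print(result[0]["generated_text"])
```

Compared with the explicit tokenizer/model version, the pipeline handles tokenization, generation, and decoding in one call.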