Update README.md
README.md
@@ -83,7 +83,7 @@ print(transnormer(sentence, num_beams=4, max_length=128))
 
 The model was trained using a maximum input length of 512 bytes (~70 words).
 Inference is generally possible for longer sequences, but may be worse than for shorter sequences.
-Generally, shorter sequences
+Generally, passing shorter sequences makes inference faster and less computationally expensive.
 Consider splitting long sequences to process them separately.
 
 
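A minimal sketch of the splitting advice above, assuming `transnormer` is a Hugging Face `text2text-generation` pipeline as suggested by the `print(transnormer(...))` call in the hunk header. The checkpoint name and the helper `normalize_in_chunks` below are placeholders, not part of this repository: the idea is simply to chunk the input at roughly the ~70-word training limit, normalize each chunk separately, and rejoin the results.

```python
from transformers import pipeline

# Placeholder checkpoint name; substitute the model referenced in this README.
transnormer = pipeline("text2text-generation", model="your/transnormer-checkpoint")

def normalize_in_chunks(text, max_words=70, **generate_kwargs):
    """Split `text` into chunks of roughly `max_words` words (the ~512-byte
    training limit), normalize each chunk separately, and rejoin the output."""
    words = text.split()
    chunks = [" ".join(words[i:i + max_words])
              for i in range(0, len(words), max_words)]
    normalized = [transnormer(chunk, **generate_kwargs)[0]["generated_text"]
                  for chunk in chunks]
    return " ".join(normalized)

long_text = "..."  # any input longer than ~70 words
print(normalize_in_chunks(long_text, num_beams=4, max_length=128))
```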