---
license: cc-by-nc-4.0
language:
- en
---
Fimbulvetr-v2, but extended to 16K context with PoSE. A sane context limit is ~12K before quality degrades.
<br>I get consistent, reliable answers at ~11K context.
<br>The model stays coherent up to the full 16K, though quality drops off there.
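The numbers above boil down to one practical rule: keep the prompt under roughly 12K tokens even though the model accepts 16K. A minimal sketch of enforcing that cap before generation (the helper name and constants are illustrative, not part of this card):

```python
# Illustrative helper: trim a token sequence to the recommended usable window.
# The limits come from the notes above; the names are not from the model card.
MAX_TRAINED = 16384   # PoSE-extended context the model supports
RECOMMENDED = 12288   # ~12K, where quality reportedly holds up

def clamp_context(token_ids, limit=RECOMMENDED):
    """Keep only the most recent `limit` tokens so generation stays in the sweet spot."""
    if limit > MAX_TRAINED:
        raise ValueError("limit exceeds the model's 16K extended context")
    return token_ids[-limit:]
```

You would run your tokenized chat history through something like this before each generation call, so long conversations silently drop their oldest turns instead of pushing past the degradation point.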
Notes:
<br> \- Some people reported serious issues with quants, whether GGUF or others, at 8-bit or below. It's an odd issue; I had little to no trouble during testing at full precision.
<br> \- Results differ slightly from base Fimbulvetr-v2, but in my tests they are similar enough. The vibes are still there.
<br> \- Formatting issues happen occasionally; in my tests, a reroll / regenerate fixes them.
![Needle](https://huggingface.co/Sao10K/Fimbulvetr-11B-v2.1-16K/resolve/main/output.png)