SmolLM2: When Smol Goes Big -- Data-Centric Training of a Small Language Model
Paper: arXiv 2502.02737 · Published
float16. However, there's some precision loss somewhere, and generation doesn't work in float16 mode yet. I'm looking into this and will keep you posted! Or take a look at this issue if you'd like to help: https://github.com/huggingface/swift-transformers/issues/95
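As a hypothetical illustration (not taken from the swift-transformers code) of how this kind of precision loss can break generation: float16 has only ~10 mantissa bits, so two float32 logits that are close together can round to the same float16 value, and greedy decoding then picks a different token. A minimal NumPy sketch:

```python
import numpy as np

# Two logits that are distinct in float32...
logits32 = np.array([10.0001, 10.0002], dtype=np.float32)

# ...collapse to the same value in float16, since the float16
# spacing near 10.0 is about 0.008 -- far coarser than the gap.
logits16 = logits32.astype(np.float16)

print(np.argmax(logits32))  # 1: the second logit wins in float32
print(np.argmax(logits16))  # 0: a tie in float16, so argmax returns the first index
```

Divergences like this compound over autoregressive steps, which is one reason a model can look fine layer-by-layer yet generate garbage end-to-end in half precision.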