---
library_name: transformers
tags: []
---
Install: https://github.com/KONAKONA666/q8_kernels

# Usage
```python
import torch
from q8_kernels.models.T5EncoderFP8 import T5EncoderModelFP8

# Load the FP8-quantized T5-XXL text encoder; the module interface uses bfloat16.
text_encoder = T5EncoderModelFP8.from_pretrained(
    "konakona/t5xxl_encoder_fp8", torch_dtype=torch.bfloat16
)
```
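With the encoder loaded, a prompt can be turned into embeddings. This is a minimal sketch, not an official recipe: it assumes the tokenizer from `google/t5-v1_1-xxl` matches this encoder, that the model follows the standard `transformers` T5 encoder forward signature, and that it needs to be moved to the GPU explicitly.

```python
from transformers import T5Tokenizer

# Assumptions: the original T5-XXL tokenizer matches this encoder, and the
# model must be moved to the GPU before the FP8 kernels can run.
tokenizer = T5Tokenizer.from_pretrained("google/t5-v1_1-xxl")
text_encoder = text_encoder.to("cuda")

inputs = tokenizer(
    "a photo of a cat",
    return_tensors="pt",
    padding="max_length",
    max_length=128,
    truncation=True,
).to("cuda")

with torch.no_grad():
    # Assumption: the model returns last_hidden_state in bf16, like the
    # standard transformers T5 encoder.
    prompt_embeds = text_encoder(
        input_ids=inputs.input_ids,
        attention_mask=inputs.attention_mask,
    ).last_hidden_state
```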
Provides the input gradients (dX) needed for token training (PTI and textual inversion) in LTX. \
dX is kept in bf16, while the forward calculations run in FP8. \
Requires an Ada-generation GPU for FP8 support.
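To illustrate the bf16 gradient path, here is a hypothetical sketch of a single textual-inversion-style optimization step. It reuses `text_encoder` and `tokenizer` from the usage sketch above; the `get_input_embeddings()` accessor and the placeholder loss are assumptions for illustration, not part of this repository's training code.

```python
# Hypothetical sketch: only the bf16 embedding table receives gradients,
# while the FP8-quantized transformer weights stay frozen.
# Assumption: T5EncoderModelFP8 keeps the standard transformers accessor
# get_input_embeddings().
text_encoder.requires_grad_(False)
embedding_layer = text_encoder.get_input_embeddings()
embedding_layer.weight.requires_grad_(True)

optimizer = torch.optim.AdamW([embedding_layer.weight], lr=1e-4)

inputs = tokenizer("a photo of a cat", return_tensors="pt").to("cuda")
hidden = text_encoder(**inputs).last_hidden_state  # dX flows back in bf16
loss = hidden.pow(2).mean()  # placeholder loss, not the LTX training objective
loss.backward()
optimizer.step()
optimizer.zero_grad()
```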