This repository provides an int8-quantized, ahead-of-time (AoT) compiled binary of Flux.1-Dev.
Follow this gist for details on how the binary was obtained and how to run inference with it.
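As a rough sketch of how such an artifact can be consumed, the snippet below downloads a compiled shared object from this repository with `huggingface_hub` and loads it as a callable via PyTorch's AOTInductor loader. The filename used here is a placeholder assumption (check the repository's file listing for the real artifact name), and the exact loading flow may differ from what the gist describes.

```python
import torch
from huggingface_hub import hf_hub_download

REPO_ID = "sayakpaul/flux.1-dev-int8-aot-compiled"


def load_compiled_transformer(filename: str, device: str = "cuda"):
    """Download the AoT-compiled artifact and load it as a callable.

    `filename` is a hypothetical placeholder -- inspect the repo's
    file listing for the actual artifact name.
    """
    # Fetch the compiled shared object from the Hub (cached locally).
    so_path = hf_hub_download(repo_id=REPO_ID, filename=filename)
    # torch._export.aot_load wraps an AOTInductor-compiled .so
    # as a Python callable on the given device (PyTorch >= 2.2).
    return torch._export.aot_load(so_path, device=device)
```

Loading an AoT-compiled binary this way skips Python-side model construction and `torch.compile` warm-up, which is the main motivation for shipping the compiled artifact.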
Model tree for sayakpaul/flux.1-dev-int8-aot-compiled
- Base model: black-forest-labs/FLUX.1-dev