medmekk/Minitron-4B-Base.GGUF
Tags: GGUF · Inference Endpoints · imatrix
Branch: main · 1 contributor · History: 2 commits
Latest commit: d175dbd (verified) by medmekk (HF staff): "Upload quantized models", 12 days ago
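
If you want to enumerate the repository contents programmatically rather than through the web UI, the Hugging Face Hub client can list them. A minimal sketch, assuming the `huggingface_hub` package is installed; the repo id is the one shown above:

```python
from huggingface_hub import HfApi

api = HfApi()

# List every file tracked on the main branch of this repository.
for path in api.list_repo_files("medmekk/Minitron-4B-Base.GGUF"):
    print(path)
```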
Files (all last updated by the "Upload quantized models" commit, 12 days ago):

File                                  Size      LFS
.gitattributes                        2.85 kB
Minitron-4B-Base-IQ3_M_imat.gguf      2.18 GB   LFS
Minitron-4B-Base-IQ3_XXS_imat.gguf    1.88 GB   LFS
Minitron-4B-Base-IQ4_NL_imat.gguf     2.57 GB   LFS
Minitron-4B-Base-IQ4_XS_imat.gguf     2.46 GB   LFS
Minitron-4B-Base-Q2_K.gguf            1.9 GB    LFS
Minitron-4B-Base-Q3_K_L.gguf          2.45 GB   LFS
Minitron-4B-Base-Q3_K_M.gguf          2.3 GB    LFS
Minitron-4B-Base-Q3_K_S.gguf          2.12 GB   LFS
Minitron-4B-Base-Q4_0.gguf            2.57 GB   LFS
Minitron-4B-Base-Q4_K_M.gguf          2.7 GB    LFS
Minitron-4B-Base-Q4_K_M_imat.gguf     2.7 GB    LFS
Minitron-4B-Base-Q4_K_S.gguf          2.58 GB   LFS
Minitron-4B-Base-Q4_K_S_imat.gguf     2.58 GB   LFS
Minitron-4B-Base-Q5_0.gguf            2.99 GB   LFS
Minitron-4B-Base-Q5_K_M.gguf          3.06 GB   LFS
Minitron-4B-Base-Q5_K_M_imat.gguf     3.06 GB   LFS
Minitron-4B-Base-Q5_K_S.gguf          2.99 GB   LFS
Minitron-4B-Base-Q5_K_S_imat.gguf     2.99 GB   LFS
Minitron-4B-Base-Q6_K.gguf            3.45 GB   LFS
Minitron-4B-Base-Q8_0.gguf            4.46 GB   LFS
README.md                             1.22 kB
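
To try one of these quantizations locally, you can fetch a single GGUF file and load it with a llama.cpp-based runtime. The sketch below is one possible flow, not an official recipe from this repository: it assumes the `huggingface_hub` and `llama-cpp-python` packages are installed, and it picks the Q4_K_M file from the table above as an example; any other filename in the listing works the same way. Minitron-4B-Base is a base (non-instruct) model, so the call is a plain text completion.

```python
from huggingface_hub import hf_hub_download
from llama_cpp import Llama

# Download a single quantized file from the repo (cached locally by the Hub client).
model_path = hf_hub_download(
    repo_id="medmekk/Minitron-4B-Base.GGUF",
    filename="Minitron-4B-Base-Q4_K_M.gguf",  # ~2.7 GB per the listing above
)

# Load the GGUF file; n_ctx here is just a reasonable guess at a context window.
llm = Llama(model_path=model_path, n_ctx=2048)

# Base model: plain completion, no chat template.
out = llm("The key idea behind model quantization is", max_tokens=64)
print(out["choices"][0]["text"])
```

Alternatively, the same downloaded file can be passed to llama.cpp's command-line tools (for example `llama-cli -m <path-to-gguf>`).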