---
license: mit
language:
- en
base_model:
- deepseek-ai/DeepSeek-R1-Distill-Qwen-7B
---

AWQ 4-bit quantization of DeepSeek-R1-Distill-Qwen-7B at commit 393119fcd6a873e5776c79b0db01c96911f5f0fc.

```python
from awq import AutoAWQForCausalLM
from transformers import AutoTokenizer

model_name = "deepseek-ai/DeepSeek-R1-Distill-Qwen-7B"
commit_hash = "393119fcd6a873e5776c79b0db01c96911f5f0fc"

# Download the model and tokenizer pinned to the specific commit
model = AutoAWQForCausalLM.from_pretrained(model_name, revision=commit_hash)
tokenizer = AutoTokenizer.from_pretrained(model_name, revision=commit_hash)

quant_config = {
    "zero_point": True,   # asymmetric quantization with a per-group zero point
    "q_group_size": 128,  # weights are quantized in groups of 128
    "w_bit": 4,           # 4-bit weights
    "version": "GEMM",    # GEMM kernel variant
}

model.quantize(tokenizer, quant_config=quant_config)

# Persist the quantized weights and tokenizer to a local directory
model.save_quantized("DeepSeek-R1-Distill-Qwen-7B-AWQ")
tokenizer.save_pretrained("DeepSeek-R1-Distill-Qwen-7B-AWQ")
```
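The `quant_config` above asks for 4-bit weights quantized in groups of 128 with a zero point (asymmetric quantization). The NumPy sketch below illustrates what that scheme does to a single weight group; it is an illustration only, not AutoAWQ's actual packed kernels, and the function names are hypothetical:

```python
import numpy as np

def quantize_group(w, n_bit=4):
    """Asymmetric (zero-point) quantization of one weight group."""
    qmax = 2**n_bit - 1                     # 15 for 4-bit weights
    scale = (w.max() - w.min()) / qmax      # step size covering the group's range
    zero = np.round(-w.min() / scale)       # zero point maps w.min() near 0
    q = np.clip(np.round(w / scale) + zero, 0, qmax)
    return q.astype(np.uint8), scale, zero

def dequantize_group(q, scale, zero):
    """Recover approximate float weights from the 4-bit codes."""
    return (q.astype(np.float32) - zero) * scale

rng = np.random.default_rng(0)
w = rng.normal(size=128).astype(np.float32)  # one group of q_group_size = 128 weights
q, scale, zero = quantize_group(w)
w_hat = dequantize_group(q, scale, zero)

print("code range:", q.min(), "to", q.max())        # stays within 0..15
print("max reconstruction error:", np.abs(w - w_hat).max())
```

Each group stores only its own `scale` and `zero` alongside the 4-bit codes, which is why larger `q_group_size` values trade accuracy for a smaller memory footprint.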