Are there any examples or notebooks showing how to use AWQ when LoRA fine-tuning an LLM? Or how to use an AWQ model from Hugging Face directly? I'm asking because neither the docs nor the release notes explain this.
Add `use_dora=True` to your `LoraConfig`. Find out more about this method here: https://arxiv.org/abs/2402.09353