Rotary position embeddings not loaded
#39
by cwbc - opened
When I load the model weights into transformers.LlavaLlamaForCausalLM, it says the rotary position embeddings rotary_emb.inv_freq are not loaded from the checkpoint. Does this affect the model performance?
No
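Presumably this is because inv_freq is not a learned weight: LLaMA-style implementations recompute it at initialization from the head dimension and the RoPE base alone, so nothing from the checkpoint is needed. A minimal sketch of that computation, assuming a LLaMA-7B-style configuration (hidden_size 4096, 32 attention heads, RoPE base 10000); substitute your model's actual config values if they differ:

```python
import torch

# Assumed LLaMA-7B-style config values, for illustration only.
hidden_size = 4096
num_attention_heads = 32
rope_base = 10000.0

head_dim = hidden_size // num_attention_heads  # 128

# inv_freq[i] = 1 / base^(2i / head_dim) for i = 0, 1, ..., head_dim/2 - 1.
# It depends only on the config, not on any trained weights, so it can be
# regenerated at init even when the checkpoint does not provide it.
inv_freq = 1.0 / (rope_base ** (torch.arange(0, head_dim, 2).float() / head_dim))

print(inv_freq.shape)  # torch.Size([64])
print(inv_freq[:4])    # approx. tensor([1.0000, 0.8660, 0.7499, 0.6494])
```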
Do you know how I can get the values being used for rotary_emb.inv_freq?
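One way to inspect the values actually in use is to read the buffer off the instantiated model. The sketch below makes a few assumptions: the class is imported from the LLaVA repo's llava.model package (the post mentions transformers.LlavaLlamaForCausalLM, but the attribute layout is the standard LLaMA one), "liuhaotian/llava-v1.5-7b" is only a placeholder checkpoint, and the rotary module lives on each attention layer, as in the transformers versions LLaVA targeted (newer versions keep a single rotary_emb on the base model instead).

```python
import torch
from llava.model import LlavaLlamaForCausalLM  # import path assumed from the LLaVA repo

# Placeholder checkpoint id for illustration; point this at your own weights.
model = LlavaLlamaForCausalLM.from_pretrained("liuhaotian/llava-v1.5-7b")

# When each attention layer owns its rotary module, the buffer can be read
# from the first decoder layer (it is identical across layers).
inv_freq = model.model.layers[0].self_attn.rotary_emb.inv_freq
print(inv_freq)

# The same values derived from the config, which is what the library computes at init.
cfg = model.config
head_dim = cfg.hidden_size // cfg.num_attention_heads
base = getattr(cfg, "rope_theta", 10000.0)  # older configs may not define rope_theta
print(1.0 / (base ** (torch.arange(0, head_dim, 2).float() / head_dim)))
```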