How do I resolve this issue? (1 reply) · #11 opened 11 months ago by anshumankmr
codeLlama-70b-hf · #10 opened 12 months ago by raviald
[AUTOMATED] Model Memory Requirements · #9 opened 12 months ago by model-sizer-bot
Adding Evaluation Results · #8 opened 12 months ago by leaderboard-pr-bot
rope_theta and max_position_embeddings · #7 opened 12 months ago by zchenyu
cuda out of memory exceptions (2 replies) · #6 opened 12 months ago by sanipanwala
How come the total size of the model is 138GB? (1 reply) · #5 opened about 1 year ago by gagan001
Provide prompt examples (3 replies) · #3 opened about 1 year ago by tangles