Runtime error
Exit code: 1. Reason:
model-00004-of-00004.safetensors:  57%|█████▋    | 2.45G/4.32G [00:07<00:04, 436MB/s]
model-00004-of-00004.safetensors:  67%|██████▋   | 2.90G/4.32G [00:09<00:03, 399MB/s]
model-00004-of-00004.safetensors:  80%|████████  | 3.47G/4.32G [00:10<00:01, 443MB/s]
model-00004-of-00004.safetensors:  92%|█████████▏| 4.00G/4.32G [00:11<00:00, 467MB/s]
model-00004-of-00004.safetensors: 100%|██████████| 4.32G/4.32G [00:11<00:00, 368MB/s]
Downloading shards: 100%|██████████| 4/4 [00:49<00:00, 12.44s/it]
Loading checkpoint shards:   0%|          | 0/4 [00:01<?, ?it/s]
Traceback (most recent call last):
  File "/home/user/app/app.py", line 50, in <module>
    model = AutoModelForCausalLM.from_pretrained("unsloth/DeepSeek-R1-Distill-Qwen-32B-bnb-4bit", device_map="auto")  # to("cuda:0")
  File "/usr/local/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py", line 564, in from_pretrained
    return model_class.from_pretrained(
  File "/usr/local/lib/python3.10/site-packages/transformers/modeling_utils.py", line 4224, in from_pretrained
    ) = cls._load_pretrained_model(
  File "/usr/local/lib/python3.10/site-packages/transformers/modeling_utils.py", line 4794, in _load_pretrained_model
    new_error_msgs, offload_index, state_dict_index = _load_state_dict_into_meta_model(
  File "/usr/local/lib/python3.10/site-packages/transformers/modeling_utils.py", line 873, in _load_state_dict_into_meta_model
    set_module_tensor_to_device(model, param_name, param_device, **set_module_kwargs)
  File "/usr/local/lib/python3.10/site-packages/accelerate/utils/modeling.py", line 287, in set_module_tensor_to_device
    raise ValueError(
ValueError: Trying to set a tensor of shape torch.Size([70778880, 1]) in "weight" (which has shape torch.Size([5120, 27648])), this looks incorrect.
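A likely cause (an assumption, not confirmed by the log): the `unsloth/...-bnb-4bit` checkpoint stores its weights already quantized by bitsandbytes, which packs two 4-bit values into each uint8 byte, so each weight tensor is saved as a flat `[n_bytes, 1]` buffer. If the environment's `transformers`/`bitsandbytes` versions don't recognize the checkpoint's quantization config, the loader compares that packed buffer against the unpacked layer shape and raises exactly this `ValueError`. The arithmetic in the message supports this reading, as a minimal sketch shows:

```python
# Sketch: check whether torch.Size([70778880, 1]) is the bitsandbytes
# 4-bit packed form of the expected weight shape torch.Size([5120, 27648]).
# This is a diagnostic of the error message, not code from the failing app.

expected_shape = (5120, 27648)   # shape transformers expects for "weight"
packed_shape = (70778880, 1)     # shape actually stored in the checkpoint

n_params = expected_shape[0] * expected_shape[1]  # total weight elements
packed_bytes = n_params // 2     # bnb 4-bit: two values per uint8 byte

# The packed byte count matches the stored tensor's first dimension,
# so the checkpoint tensor is a 4-bit packed buffer of this layer.
print(n_params, packed_bytes, packed_bytes == packed_shape[0])
```

If that hypothesis holds, upgrading `transformers` and `bitsandbytes` to versions that support prequantized bnb-4bit checkpoints (or loading an unquantized variant of the model with a `BitsAndBytesConfig`) would be the direction to investigate.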