2025-01-28 00:52:04 | INFO | model_worker | args: Namespace(host='0.0.0.0', port=40000, worker_address='http://localhost:40000', controller_address='http://localhost:10000', model_path='.', model_base=None, model_name=None, device='cuda', multi_modal=True, limit_model_concurrency=5, stream_interval=1, no_register=False, load_8bit=False, load_4bit=False)
2025-01-28 00:52:04 | WARNING | model_worker | Multimodal mode is automatically detected with model name, please make sure `llava` is included in the model path.
2025-01-28 00:52:04 | INFO | model_worker | Loading the model . on worker f9540c ...
2025-01-28 00:52:04 | ERROR | stderr | Traceback (most recent call last):
2025-01-28 00:52:04 | ERROR | stderr |   File "C:\Users\admin\AppData\Local\Programs\Python\Python310\lib\runpy.py", line 196, in _run_module_as_main
2025-01-28 00:52:04 | ERROR | stderr |     return _run_code(code, main_globals, None,
2025-01-28 00:52:04 | ERROR | stderr |   File "C:\Users\admin\AppData\Local\Programs\Python\Python310\lib\runpy.py", line 86, in _run_code
2025-01-28 00:52:04 | ERROR | stderr |     exec(code, run_globals)
2025-01-28 00:52:04 | ERROR | stderr |   File "C:\Users\admin\LLaVA-Med\llava\serve\model_worker.py", line 275, in <module>
2025-01-28 00:52:04 | ERROR | stderr |     worker = ModelWorker(args.controller_address,
2025-01-28 00:52:04 | ERROR | stderr |   File "C:\Users\admin\LLaVA-Med\llava\serve\model_worker.py", line 65, in __init__
2025-01-28 00:52:04 | ERROR | stderr |     self.tokenizer, self.model, self.image_processor, self.context_len = load_pretrained_model(
2025-01-28 00:52:04 | ERROR | stderr |   File "C:\Users\admin\LLaVA-Med\llava\model\builder.py", line 57, in load_pretrained_model
2025-01-28 00:52:04 | ERROR | stderr |     model = AutoModelForCausalLM.from_pretrained(model_path, low_cpu_mem_usage=True, **kwargs)
2025-01-28 00:52:04 | ERROR | stderr |   File "C:\Users\admin\AppData\Local\Programs\Python\Python310\lib\site-packages\transformers\models\auto\auto_factory.py", line 526, in from_pretrained
2025-01-28 00:52:04 | ERROR | stderr |     config, kwargs = AutoConfig.from_pretrained(
2025-01-28 00:52:04 | ERROR | stderr |   File "C:\Users\admin\AppData\Local\Programs\Python\Python310\lib\site-packages\transformers\models\auto\configuration_auto.py", line 1082, in from_pretrained
2025-01-28 00:52:04 | ERROR | stderr |     config_dict, unused_kwargs = PretrainedConfig.get_config_dict(pretrained_model_name_or_path, **kwargs)
2025-01-28 00:52:04 | ERROR | stderr |   File "C:\Users\admin\AppData\Local\Programs\Python\Python310\lib\site-packages\transformers\configuration_utils.py", line 644, in get_config_dict
2025-01-28 00:52:04 | ERROR | stderr |     config_dict, kwargs = cls._get_config_dict(pretrained_model_name_or_path, **kwargs)
2025-01-28 00:52:04 | ERROR | stderr |   File "C:\Users\admin\AppData\Local\Programs\Python\Python310\lib\site-packages\transformers\configuration_utils.py", line 699, in _get_config_dict
2025-01-28 00:52:04 | ERROR | stderr |     resolved_config_file = cached_file(
2025-01-28 00:52:04 | ERROR | stderr |   File "C:\Users\admin\AppData\Local\Programs\Python\Python310\lib\site-packages\transformers\utils\hub.py", line 360, in cached_file
2025-01-28 00:52:04 | ERROR | stderr |     raise EnvironmentError(
2025-01-28 00:52:04 | ERROR | stderr | OSError: . does not appear to have a file named config.json. Checkout 'https://huggingface.co/./None' for available files.
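
The traceback points at the actual failure: the worker was started with model_path='.' (the current working directory), so AutoConfig.from_pretrained looks for ./config.json, finds nothing, and raises the OSError above. The earlier WARNING is related: multimodal mode is only enabled when `llava` appears in the model path. Below is a minimal pre-flight check, sketched in Python, assuming the LLaVA-Med weights have been downloaded to a local folder; the checkpoint path used here is hypothetical and should be replaced with the real location.

    # Sketch: verify the directory intended for --model-path before launching
    # the worker. The path below is an assumed example, not the real one.
    import os

    model_path = r"C:\Users\admin\LLaVA-Med\checkpoints\llava-med-v1.5-mistral-7b"  # hypothetical

    # from_pretrained fails exactly as in the traceback if config.json is missing.
    config_file = os.path.join(model_path, "config.json")
    if not os.path.isfile(config_file):
        raise FileNotFoundError(
            f"{model_path} does not contain config.json; "
            "point the worker at a full Hugging Face checkpoint directory."
        )

    # Per the WARNING in the log, multimodal mode is detected from the name.
    if "llava" not in os.path.basename(model_path).lower():
        print("Note: 'llava' is not in the model path; multimodal mode will not be auto-detected.")

Once a valid checkpoint directory is in place, relaunching the worker with that directory instead of `.` should get past this error; the dest name model_path in the Namespace above suggests the corresponding CLI flag is --model-path, but check the launch command against the LLaVA-Med README for your version.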