sanchit-gandhi committed on
Commit 9d8c837 · verified · 1 Parent(s): ab7be36

End of training
README.md CHANGED
@@ -1,12 +1,16 @@
 ---
 base_model: sanchit-gandhi/Mistral-7B-v0.1-6-layer
 tags:
+- alignment-handbook
+- trl
+- sft
+- generated_from_trainer
 - trl
 - sft
 - alignment-handbook
 - generated_from_trainer
 datasets:
-- generator
+- stingning/ultrachat
 model-index:
 - name: sanchit-gandhi/Mistral-7B-v0.1-6-layer
   results: []
@@ -17,7 +21,7 @@ should probably proofread and complete it, then remove this comment. -->
 
 # sanchit-gandhi/Mistral-7B-v0.1-6-layer
 
-This model is a fine-tuned version of [sanchit-gandhi/Mistral-7B-v0.1-6-layer](https://huggingface.co/sanchit-gandhi/Mistral-7B-v0.1-6-layer) on the generator dataset.
+This model is a fine-tuned version of [sanchit-gandhi/Mistral-7B-v0.1-6-layer](https://huggingface.co/sanchit-gandhi/Mistral-7B-v0.1-6-layer) on the stingning/ultrachat dataset.
 It achieves the following results on the evaluation set:
 - Loss: 1.0042
 
config.json CHANGED
@@ -21,6 +21,6 @@
   "tie_word_embeddings": false,
   "torch_dtype": "bfloat16",
   "transformers_version": "4.40.1",
-  "use_cache": false,
+  "use_cache": true,
   "vocab_size": 32000
 }
wandb/debug-internal.log CHANGED
@@ -86,3 +86,9 @@
 2024-04-25 23:49:36,063 DEBUG HandlerThread:212584 [handler.py:handle_request():146] handle_request: stop_status
 2024-04-25 23:49:36,063 DEBUG SenderThread:212584 [sender.py:send_request():406] send_request: stop_status
 2024-04-25 23:49:37,798 INFO Thread-12 :212584 [dir_watcher.py:_on_file_modified():288] file/dir modified: /fsx/sanchit/distil-zephyr-1.5b-ssft-ultrachat/wandb/run-20240425_234916-ozdw63qu/files/output.log
+2024-04-25 23:49:39,684 DEBUG HandlerThread:212584 [handler.py:handle_request():146] handle_request: status_report
+2024-04-25 23:49:39,800 INFO Thread-12 :212584 [dir_watcher.py:_on_file_modified():288] file/dir modified: /fsx/sanchit/distil-zephyr-1.5b-ssft-ultrachat/wandb/run-20240425_234916-ozdw63qu/files/output.log
+2024-04-25 23:49:40,801 INFO Thread-12 :212584 [dir_watcher.py:_on_file_modified():288] file/dir modified: /fsx/sanchit/distil-zephyr-1.5b-ssft-ultrachat/wandb/run-20240425_234916-ozdw63qu/files/output.log
+2024-04-25 23:49:41,802 INFO Thread-12 :212584 [dir_watcher.py:_on_file_modified():288] file/dir modified: /fsx/sanchit/distil-zephyr-1.5b-ssft-ultrachat/wandb/run-20240425_234916-ozdw63qu/files/output.log
+2024-04-25 23:49:42,804 INFO Thread-12 :212584 [dir_watcher.py:_on_file_modified():288] file/dir modified: /fsx/sanchit/distil-zephyr-1.5b-ssft-ultrachat/wandb/run-20240425_234916-ozdw63qu/files/output.log
+2024-04-25 23:49:45,208 DEBUG HandlerThread:212584 [handler.py:handle_request():146] handle_request: status_report
wandb/run-20240425_234916-ozdw63qu/files/output.log CHANGED
@@ -37,3 +37,21 @@ Training completed. Do not forget to share your model on huggingface.co/models =
 [INFO|tokenization_utils_base.py:2488] 2024-04-25 23:49:35,768 >> tokenizer config file saved in ./tokenizer_config.json
 [INFO|tokenization_utils_base.py:2497] 2024-04-25 23:49:35,769 >> Special tokens file saved in ./special_tokens_map.json
 [INFO|modelcard.py:450] 2024-04-25 23:49:35,816 >> Dropping the following result as it does not have all the necessary fields:
+{'task': {'name': 'Causal Language Modeling', 'type': 'text-generation'}, 'dataset': {'name': 'generator', 'type': 'generator', 'config': 'default', 'split': 'train', 'args': 'default'}}
+events.out.tfevents.1714088955.ip-26-0-167-177.211869.0: 100%|██████████| 5.07k/5.07k [00:00<00:00, 36.5kB/s]
+events.out.tfevents.1714088965.ip-26-0-167-177.211869.1: 100%|██████████| 364/364 [00:00<00:00, 2.20kB/s]
+training_args.bin: 100%|██████████| 4.98k/4.98k [00:00<00:00, 25.1kB/s]
+run-etajcxpg.wandb: 100%|██████████| 9.71M/9.71M [00:00<00:00, 30.9MB/s]
+Upload 4 LFS files: 100%|██████████| 4/4 [00:00<00:00, 8.49it/s]
+2024-04-25 23:49:40 - INFO - __main__ - Model saved to ./
+[INFO|modelcard.py:450] 2024-04-25 23:49:40,249 >> Dropping the following result as it does not have all the necessary fields:
+{'task': {'name': 'Causal Language Modeling', 'type': 'text-generation'}, 'dataset': {'name': 'stingning/ultrachat', 'type': 'stingning/ultrachat', 'config': 'default', 'split': 'train', 'args': 'default'}}
+[INFO|configuration_utils.py:471] 2024-04-25 23:49:40,253 >> Configuration saved in ./config.json
+[INFO|trainer.py:3305] 2024-04-25 23:49:40,254 >> Saving model checkpoint to ./
+[INFO|configuration_utils.py:471] 2024-04-25 23:49:40,255 >> Configuration saved in ./config.json
+[INFO|configuration_utils.py:697] 2024-04-25 23:49:40,257 >> Configuration saved in ./generation_config.json
+2024-04-25 23:49:40 - INFO - __main__ - Pushing to hub...
+[INFO|modeling_utils.py:2590] 2024-04-25 23:49:45,207 >> Model weights saved in ./model.safetensors
+[INFO|tokenization_utils_base.py:2488] 2024-04-25 23:49:45,209 >> tokenizer config file saved in ./tokenizer_config.json
+[INFO|tokenization_utils_base.py:2497] 2024-04-25 23:49:45,211 >> Special tokens file saved in ./special_tokens_map.json
+[INFO|modelcard.py:450] 2024-04-25 23:49:45,255 >> Dropping the following result as it does not have all the necessary fields:
wandb/run-20240425_234916-ozdw63qu/logs/debug-internal.log CHANGED
@@ -86,3 +86,9 @@
 2024-04-25 23:49:36,063 DEBUG HandlerThread:212584 [handler.py:handle_request():146] handle_request: stop_status
 2024-04-25 23:49:36,063 DEBUG SenderThread:212584 [sender.py:send_request():406] send_request: stop_status
 2024-04-25 23:49:37,798 INFO Thread-12 :212584 [dir_watcher.py:_on_file_modified():288] file/dir modified: /fsx/sanchit/distil-zephyr-1.5b-ssft-ultrachat/wandb/run-20240425_234916-ozdw63qu/files/output.log
+2024-04-25 23:49:39,684 DEBUG HandlerThread:212584 [handler.py:handle_request():146] handle_request: status_report
+2024-04-25 23:49:39,800 INFO Thread-12 :212584 [dir_watcher.py:_on_file_modified():288] file/dir modified: /fsx/sanchit/distil-zephyr-1.5b-ssft-ultrachat/wandb/run-20240425_234916-ozdw63qu/files/output.log
+2024-04-25 23:49:40,801 INFO Thread-12 :212584 [dir_watcher.py:_on_file_modified():288] file/dir modified: /fsx/sanchit/distil-zephyr-1.5b-ssft-ultrachat/wandb/run-20240425_234916-ozdw63qu/files/output.log
+2024-04-25 23:49:41,802 INFO Thread-12 :212584 [dir_watcher.py:_on_file_modified():288] file/dir modified: /fsx/sanchit/distil-zephyr-1.5b-ssft-ultrachat/wandb/run-20240425_234916-ozdw63qu/files/output.log
+2024-04-25 23:49:42,804 INFO Thread-12 :212584 [dir_watcher.py:_on_file_modified():288] file/dir modified: /fsx/sanchit/distil-zephyr-1.5b-ssft-ultrachat/wandb/run-20240425_234916-ozdw63qu/files/output.log
+2024-04-25 23:49:45,208 DEBUG HandlerThread:212584 [handler.py:handle_request():146] handle_request: status_report