sanchit-gandhi committed on
Commit 382a3cb · verified · 1 Parent(s): 0a3a72c

End of training

README.md CHANGED
@@ -1,11 +1,15 @@
 ---
 base_model: sanchit-gandhi/Mistral-7B-v0.1-6-layer
 tags:
+- alignment-handbook
+- trl
+- sft
+- generated_from_trainer
 - trl
 - sft
 - generated_from_trainer
 datasets:
-- generator
+- stingning/ultrachat
 model-index:
 - name: sanchit-gandhi/Mistral-7B-v0.1-6-layer
   results: []
@@ -16,7 +20,7 @@ should probably proofread and complete it, then remove this comment. -->
 
 # sanchit-gandhi/Mistral-7B-v0.1-6-layer
 
-This model is a fine-tuned version of [sanchit-gandhi/Mistral-7B-v0.1-6-layer](https://huggingface.co/sanchit-gandhi/Mistral-7B-v0.1-6-layer) on the generator dataset.
+This model is a fine-tuned version of [sanchit-gandhi/Mistral-7B-v0.1-6-layer](https://huggingface.co/sanchit-gandhi/Mistral-7B-v0.1-6-layer) on the stingning/ultrachat dataset.
 It achieves the following results on the evaluation set:
 - Loss: 1.0042
 
config.json CHANGED
@@ -21,6 +21,6 @@
   "tie_word_embeddings": false,
   "torch_dtype": "bfloat16",
   "transformers_version": "4.40.1",
-  "use_cache": false,
+  "use_cache": true,
   "vocab_size": 32000
 }
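
The one-line change in config.json above flips `use_cache` from `false` to `true`. A minimal sketch of reading that flag (field values copied from the diff; the rationale in the comments is a common convention, not something the commit itself states):

```python
import json

# Fields from the config.json diff above (values taken from the commit).
config = json.loads("""
{
  "tie_word_embeddings": false,
  "torch_dtype": "bfloat16",
  "transformers_version": "4.40.1",
  "use_cache": true,
  "vocab_size": 32000
}
""")

# use_cache was flipped from false back to true at the end of training:
# caching past key/values is typically disabled during training (it conflicts
# with gradient checkpointing) and re-enabled for fast autoregressive decoding.
print(config["use_cache"])  # → True
```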
wandb/debug-internal.log CHANGED
@@ -37480,3 +37480,9 @@
 2024-04-25 23:47:32,219 DEBUG HandlerThread:156911 [handler.py:handle_request():146] handle_request: keepalive
 2024-04-25 23:47:32,264 INFO Thread-12 :156911 [dir_watcher.py:_on_file_modified():288] file/dir modified: /fsx/sanchit/distil-zephyr-1.5b-ssft-ultrachat/wandb/run-20240425_134518-etajcxpg/files/output.log
 2024-04-25 23:47:33,304 DEBUG HandlerThread:156911 [handler.py:handle_request():146] handle_request: status_report
+2024-04-25 23:47:36,269 INFO Thread-12 :156911 [dir_watcher.py:_on_file_modified():288] file/dir modified: /fsx/sanchit/distil-zephyr-1.5b-ssft-ultrachat/wandb/run-20240425_134518-etajcxpg/files/output.log
+2024-04-25 23:47:37,260 DEBUG HandlerThread:156911 [handler.py:handle_request():146] handle_request: keepalive
+2024-04-25 23:47:38,271 INFO Thread-12 :156911 [dir_watcher.py:_on_file_modified():288] file/dir modified: /fsx/sanchit/distil-zephyr-1.5b-ssft-ultrachat/wandb/run-20240425_134518-etajcxpg/files/output.log
+2024-04-25 23:47:38,974 DEBUG HandlerThread:156911 [handler.py:handle_request():146] handle_request: status_report
+2024-04-25 23:47:41,480 DEBUG HandlerThread:156911 [handler.py:handle_request():146] handle_request: internal_messages
+2024-04-25 23:47:42,275 INFO Thread-12 :156911 [dir_watcher.py:_on_file_modified():288] file/dir modified: /fsx/sanchit/distil-zephyr-1.5b-ssft-ultrachat/wandb/run-20240425_134518-etajcxpg/files/output.log
wandb/run-20240425_134518-etajcxpg/files/output.log CHANGED
@@ -18858,3 +18858,19 @@ Training completed. Do not forget to share your model on huggingface.co/models =
 [INFO|tokenization_utils_base.py:2488] 2024-04-25 23:47:31,220 >> tokenizer config file saved in ./tokenizer_config.json
 [INFO|tokenization_utils_base.py:2497] 2024-04-25 23:47:31,221 >> Special tokens file saved in ./special_tokens_map.json
 [INFO|modelcard.py:450] 2024-04-25 23:47:31,303 >> Dropping the following result as it does not have all the necessary fields:
+{'task': {'name': 'Causal Language Modeling', 'type': 'text-generation'}, 'dataset': {'name': 'generator', 'type': 'generator', 'config': 'default', 'split': 'train', 'args': 'default'}}
+events.out.tfevents.1714088841.ip-26-0-167-177.156194.1: 100%|██████████| 364/364 [00:00<00:00, 2.49kB/s]
+run-etajcxpg.wandb: 100%|██████████| 9.70M/9.70M [00:00<00:00, 42.3MB/s]
+Upload 2 LFS files: 100%|██████████| 2/2 [00:00<00:00, 5.07it/s] | 0.00/364 [00:00<?, ?B/s]
+2024-04-25 23:47:35 - INFO - __main__ - Model saved to ./
+2024-04-25 23:47:35 - INFO - __main__ - Pushing to hub...
+[INFO|modelcard.py:450] 2024-04-25 23:47:35,964 >> Dropping the following result as it does not have all the necessary fields:
+{'task': {'name': 'Causal Language Modeling', 'type': 'text-generation'}, 'dataset': {'name': 'stingning/ultrachat', 'type': 'stingning/ultrachat', 'config': 'default', 'split': 'train', 'args': 'default'}}
+[INFO|configuration_utils.py:471] 2024-04-25 23:47:35,968 >> Configuration saved in ./config.json
+[INFO|trainer.py:3305] 2024-04-25 23:47:35,970 >> Saving model checkpoint to ./
+[INFO|configuration_utils.py:471] 2024-04-25 23:47:35,972 >> Configuration saved in ./config.json
+[INFO|configuration_utils.py:697] 2024-04-25 23:47:35,973 >> Configuration saved in ./generation_config.json
+[INFO|modeling_utils.py:2590] 2024-04-25 23:47:40,802 >> Model weights saved in ./model.safetensors
+[INFO|tokenization_utils_base.py:2488] 2024-04-25 23:47:40,804 >> tokenizer config file saved in ./tokenizer_config.json
+[INFO|tokenization_utils_base.py:2497] 2024-04-25 23:47:40,806 >> Special tokens file saved in ./special_tokens_map.json
+[INFO|modelcard.py:450] 2024-04-25 23:47:40,849 >> Dropping the following result as it does not have all the necessary fields:
wandb/run-20240425_134518-etajcxpg/logs/debug-internal.log CHANGED
@@ -37480,3 +37480,9 @@
 2024-04-25 23:47:32,219 DEBUG HandlerThread:156911 [handler.py:handle_request():146] handle_request: keepalive
 2024-04-25 23:47:32,264 INFO Thread-12 :156911 [dir_watcher.py:_on_file_modified():288] file/dir modified: /fsx/sanchit/distil-zephyr-1.5b-ssft-ultrachat/wandb/run-20240425_134518-etajcxpg/files/output.log
 2024-04-25 23:47:33,304 DEBUG HandlerThread:156911 [handler.py:handle_request():146] handle_request: status_report
+2024-04-25 23:47:36,269 INFO Thread-12 :156911 [dir_watcher.py:_on_file_modified():288] file/dir modified: /fsx/sanchit/distil-zephyr-1.5b-ssft-ultrachat/wandb/run-20240425_134518-etajcxpg/files/output.log
+2024-04-25 23:47:37,260 DEBUG HandlerThread:156911 [handler.py:handle_request():146] handle_request: keepalive
+2024-04-25 23:47:38,271 INFO Thread-12 :156911 [dir_watcher.py:_on_file_modified():288] file/dir modified: /fsx/sanchit/distil-zephyr-1.5b-ssft-ultrachat/wandb/run-20240425_134518-etajcxpg/files/output.log
+2024-04-25 23:47:38,974 DEBUG HandlerThread:156911 [handler.py:handle_request():146] handle_request: status_report
+2024-04-25 23:47:41,480 DEBUG HandlerThread:156911 [handler.py:handle_request():146] handle_request: internal_messages
+2024-04-25 23:47:42,275 INFO Thread-12 :156911 [dir_watcher.py:_on_file_modified():288] file/dir modified: /fsx/sanchit/distil-zephyr-1.5b-ssft-ultrachat/wandb/run-20240425_134518-etajcxpg/files/output.log