---
base_model:
- microsoft/phi-3.5-mini-instruct
- AXCXEPT/Borea-Phi-3.5-mini-Instruct-Common
library_name: transformers
tags:
- mergekit
- merge
license: mit
license_link: https://huggingface.co/microsoft/Phi-3.5-mini-instruct/resolve/main/LICENSE
language:
- en
- ja
inference: true
pipeline_tag: text-generation
widget:
- messages:
  - role: user
    content: こんにちは!
- messages:
  - role: user
    content: ステンレスってどうやって作るのでしょうか?
- messages:
  - role: user
    content: hello!
- messages:
  - role: user
    content: Guten Tag!
---

# merge

This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).

## Merge Details

### Merge Method

This model was merged using the [TIES](https://arxiv.org/abs/2306.01708) merge method, with [microsoft/phi-3.5-mini-instruct](https://huggingface.co/microsoft/phi-3.5-mini-instruct) as the base model.

### Models Merged

The following models were included in the merge:

* [AXCXEPT/Borea-Phi-3.5-mini-Instruct-Common](https://huggingface.co/AXCXEPT/Borea-Phi-3.5-mini-Instruct-Common)

### Configuration

The following YAML configuration was used to produce this model:

```yaml
models:
  - model: AXCXEPT/Borea-Phi-3.5-mini-Instruct-Common
    parameters:
      weight: 1
      density: 1
merge_method: ties
base_model: microsoft/phi-3.5-mini-instruct
parameters:
  weight: 1
  density: 1
  normalize: true
  int8_mask: true
dtype: bfloat16
```
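
To reproduce the merge, the configuration above can be saved to a file and passed to mergekit's `mergekit-yaml` command-line entry point along with an output directory.

### Usage

Below is a minimal inference sketch using 🤗 Transformers, assuming the merged weights are available on the Hub or locally. The repository id shown is a placeholder, not the actual repository name, and generation settings are illustrative only.

```python
# Minimal inference sketch for the merged model (placeholder repo id; replace
# "your-namespace/phi-3.5-mini-ties-merge" with this repository's actual id).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "your-namespace/phi-3.5-mini-ties-merge"  # placeholder

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # matches the dtype used for the merge
    device_map="auto",
)

# Build a chat prompt with the model's chat template.
messages = [{"role": "user", "content": "こんにちは!"}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

# Generate and decode only the newly produced tokens.
outputs = model.generate(input_ids, max_new_tokens=256, do_sample=False)
print(tokenizer.decode(outputs[0][input_ids.shape[-1]:], skip_special_tokens=True))
```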