---
license: llama3
language:
- en
base_model:
- meta-llama/Meta-Llama-3-8B
tags:
- materials science
- large language model
---
# Model Card for LLaMat-3

**LLaMat-3** is a large language model adapted from LLaMA-3 to serve as a foundation model for materials science.

---

## Overview

- **Model Type:** Large Language Model (LLM)  
- **Base Model:** meta-llama/Meta-Llama-3-8B (LLaMat-3 is obtained by continued pretraining of LLaMA-3 on materials science data)  
- **Language:** English  
- **License:** LLaMA-3 License  
- **Tags:** Materials Science, Domain Adaptation, Table Understanding, Scientific Data Parsing, Materials Copilot  

---

## Model Details

### Key Features

- **Applications:** Can be fine-tuned for information extraction, table understanding, parsing scientific data for research tasks, and crystal structure generation (a minimal fine-tuning sketch follows below).  
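
The sketch below illustrates one way such a downstream fine-tune could be set up with Hugging Face Transformers (the stack listed under Software Stack). The repo id `m3rg-iitd/llamat-3`, the data file `extraction_train.jsonl`, and all hyperparameters are placeholders, not values confirmed by this card; consult the GitHub repository for the released checkpoints and training recipes.

```python
# Minimal supervised fine-tuning sketch for a downstream task such as
# information extraction. Names marked "placeholder" are assumptions.
import torch
from datasets import load_dataset
from transformers import (
    AutoModelForCausalLM,
    AutoTokenizer,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

model_id = "m3rg-iitd/llamat-3"  # placeholder repo id
tokenizer = AutoTokenizer.from_pretrained(model_id)
tokenizer.pad_token = tokenizer.eos_token
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype=torch.bfloat16)

# Placeholder dataset: JSONL records of the form {"text": "<prompt + answer>"}.
dataset = load_dataset("json", data_files="extraction_train.jsonl", split="train")

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=1024)

tokenized = dataset.map(tokenize, batched=True, remove_columns=dataset.column_names)

trainer = Trainer(
    model=model,
    args=TrainingArguments(
        output_dir="llamat3-finetuned",
        per_device_train_batch_size=1,
        gradient_accumulation_steps=8,
        num_train_epochs=1,
        learning_rate=2e-5,
        bf16=True,
        logging_steps=10,
    ),
    train_dataset=tokenized,
    # Causal-LM collator: pads batches and copies input ids to labels.
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()
```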

### Development and Support
- **Developed by:** [M3RG, IIT Delhi](https://github.com/M3RG-IITD/) & [DAIR, IIT Delhi](https://github.com/dair-iitd)
- **Compute Support:**  
  - **Edinburgh International Data Facility (EIDF):** Provided access to Cerebras CS-2 clusters for pretraining.  
  - **IIT Delhi High-Performance Computing Cluster:** Supported fine-tuning and inference stages.  

---

## Technical Specifications

### Hardware Infrastructure
- **Pretraining:** 2 Cerebras CS-2 Wafer-Scale Engines (WSE-2)  

### Software Stack
- **Frameworks:** PyTorch, Hugging Face Transformers  
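
A minimal loading and inference sketch with this stack is shown below. The repo id `m3rg-iitd/llamat-3` and the example prompt are placeholders, not values confirmed by this card; see the GitHub repository linked under Model Sources for the released checkpoint location.

```python
# Minimal inference sketch with PyTorch + Hugging Face Transformers.
# "m3rg-iitd/llamat-3" is a placeholder repo id (an assumption, not confirmed).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "m3rg-iitd/llamat-3"  # placeholder
device = "cuda" if torch.cuda.is_available() else "cpu"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype=torch.bfloat16).to(device)
model.eval()

prompt = (
    "Extract the composition and synthesis temperature from the abstract:\n"
    "We synthesized BaTiO3 ceramics by solid-state reaction at 1350 C.\n"
)
inputs = tokenizer(prompt, return_tensors="pt").to(device)
with torch.no_grad():
    outputs = model.generate(**inputs, max_new_tokens=128, do_sample=False)

# Print only the newly generated tokens (the completion after the prompt).
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))
```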

---

## Model Sources
- **Repository:** [LLaMat on GitHub](https://github.com/M3RG-IITD/llamat)  
- **Compute Resources:** [EIDF Cerebras CS Clusters](https://edinburgh-international-data-facility.ed.ac.uk/services/computing/cerebras-cs)

---