---
library_name: transformers
base_model:
- Esperanto/Protein-Llama-3-8B
tags:
- biology
- medical
---

## Model Details

Protein-Llama-3-8B is a specialized version of the [Llama-3-8B](https://huggingface.co/meta-llama/Meta-Llama-3-8B) language model, fine-tuned for protein language modeling.
The model was continually pre-trained with the [LoRA](https://huggingface.co/docs/diffusers/en/training/lora) technique on large datasets of protein sequences, enabling it to generate novel protein sequences from natural language prompts.
It supports both uncontrollable and controllable protein generation, allowing users to specify desired characteristics for the generated proteins.
The model is designed to facilitate advances in protein engineering, making it a valuable tool for drug development, chemical synthesis, and other biotechnological applications.
For full details, please read [our paper](https://arxiv.org/abs/2411.05966).
This is the ONNX version of the model in fp16 precision with key-value caching (KVC).

### Model Description

Generating novel protein sequences with desired properties, termed protein engineering, is crucial for industries such as drug development and chemical synthesis. Traditional protein engineering techniques typically introduce random mutations into the gene encoding the protein of interest, followed by expression and screening to identify variants with improved or novel functions, which are then reproduced. While effective, these approaches are labor-intensive and time-consuming, as they rely on iterating over known protein sequences. This limits their ability to generate diverse protein sequences with entirely new capabilities, since they are constrained by existing protein templates. Moreover, the need to analyze numerous protein variants can waste valuable experimental resources.

However, leveraging a Large Language Model (LLM) that has learned the "protein language" significantly accelerates this process. An LLM can generate and evaluate protein sequences in a matter of seconds. The inherent randomness of LLM-generated sequences enhances diversity, enabling the creation of completely novel proteins with potentially unprecedented functions. This not only streamlines the discovery and development process but also expands the scope of possibilities in protein engineering.

This model is based on the Llama-3-8B architecture and is capable of generating proteins based on user-defined characteristics.

[Energy Efficient Protein Language Models: Leveraging Small Language Models with LoRA for Controllable Protein Generation](https://huggingface.co/papers/2411.05966)

## Usage

To download and use the Protein-Llama-3 model for inference, follow these steps:

### Installation

Ensure you have the `transformers` library installed. You can install it using pip:

```bash
pip install transformers
```

### Uncontrollable Generation

Uncontrollable (unconditional) generation is triggered by prompting the model with the phrase `Seq=<`:

```python
from transformers import pipeline

generator = pipeline('text-generation', model="Esperanto/Protein-Llama-3-8B")

sequences = generator("Seq=<",
                      temperature=0.2,
                      top_k=40,
                      top_p=0.9,
                      do_sample=True,
                      repetition_penalty=1.2,
                      max_new_tokens=30,
                      num_return_sequences=500)

for sequence in sequences:
    print(sequence['generated_text'])
```
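Each returned `generated_text` still includes the prompt prefix. Below is a small post-processing helper to recover the bare sequence — a sketch under the assumption, suggested by the `Seq=<` prompt syntax, that a completed sequence is closed with `>`; `extract_sequence` is a hypothetical helper, not part of the model's API:

```python
def extract_sequence(generated_text: str) -> str:
    """Strip the 'Seq=<' prompt prefix and any trailing '>' terminator,
    returning only the amino-acid sequence itself."""
    # partition() keeps everything after the first 'Seq=<' marker
    _, _, seq = generated_text.partition("Seq=<")
    # drop anything at or after the assumed '>' terminator
    return seq.split(">", 1)[0].strip()

print(extract_sequence("Seq=<MKTAYIAKQR>"))  # -> MKTAYIAKQR
```

If the model emits no closing `>` within `max_new_tokens`, the helper simply returns the truncated sequence as generated.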

### Controllable Generation

Controllable generation is done by prompting the model with `[Generate xxx protein] Seq=<`, where `xxx` is any family from the 10 classes supported by this model:

```python
from transformers import pipeline

generator = pipeline('text-generation', model="Esperanto/Protein-Llama-3-8B")

sequences = generator("[Generate Ligase enzyme protein] Seq=<",
                      temperature=0.2,
                      top_k=40,
                      top_p=0.9,
                      do_sample=True,
                      repetition_penalty=1.2,
                      max_new_tokens=30,
                      num_return_sequences=500)

for sequence in sequences:
    print(sequence['generated_text'])
```
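The two prompt styles above can be assembled with one small helper — a sketch only; `build_prompt` is a hypothetical convenience function, and since this card does not enumerate the 10 supported family names, any family string passed in is an assumption to be checked against the paper:

```python
from typing import Optional

def build_prompt(family: Optional[str] = None) -> str:
    """Return a generation prompt: controllable when a protein
    family name is given, plain uncontrollable 'Seq=<' otherwise."""
    if family:
        return f"[Generate {family} protein] Seq=<"
    return "Seq=<"

print(build_prompt("Ligase enzyme"))  # -> [Generate Ligase enzyme protein] Seq=<
print(build_prompt())                 # -> Seq=<
```

The returned string can be passed directly as the first argument to the `generator` pipeline shown above.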

### Contributors

Aayush Shah, Shankar Jayaratnam