---
license: other
---

# AIDO.Protein-16B-v1
AIDO.Protein-16B-v1 continues the pre-training of [AIDO.Protein-16B](https://huggingface.co/genbio-ai/AIDO.Protein-16B) using an additional 100 billion amino acids from UniRef90.

## How to Use

### Build any downstream model from this backbone with ModelGenerator

For more information, visit [ModelGenerator](https://github.com/genbio-ai/modelgenerator).

```bash
# Finetune a sequence-level classifier on your dataset
mgen fit --model SequenceClassification --model.backbone aido_protein_16b_v1 --data SequenceClassificationDataModule --data.path <hf_or_local_path_to_your_dataset>

# Evaluate the finetuned model
mgen test --model SequenceClassification --model.backbone aido_protein_16b_v1 --data SequenceClassificationDataModule --data.path <hf_or_local_path_to_your_dataset>
```
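As the placeholder suggests, `--data.path` can point either to a dataset on the Hugging Face Hub or to a local copy of your dataset.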
### Or use directly in Python

#### Embedding

```python
from modelgenerator.tasks import Embed

# Load the embedding task with the AIDO.Protein-16B-v1 backbone
model = Embed.from_config({"model.backbone": "aido_protein_16b_v1"}).eval()

# Prepare a batch from raw amino acid sequences
transformed_batch = model.transform({"sequences": ["HELLQ", "WRLD"]})

# Compute embeddings
embedding = model(transformed_batch)
print(embedding.shape)
print(embedding)
```
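If you need one embedding per sequence rather than per token, the output can be pooled. The snippet below is a minimal sketch, continuing the example above, and assumes the returned tensor is shaped `(batch, sequence_length, hidden_size)`:

```python
# Minimal sketch, continuing the Embedding example above.
# Assumption: `embedding` is token-level with shape (batch, seq_len, hidden).
sequence_embedding = embedding.mean(dim=1)  # mean-pool over the length dimension
print(sequence_embedding.shape)
```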
#### Sequence Level Classification

```python
import torch

from modelgenerator.tasks import SequenceClassification

# Load the sequence classification task with a two-class head on the AIDO.Protein-16B-v1 backbone
model = SequenceClassification.from_config({"model.backbone": "aido_protein_16b_v1", "model.n_classes": 2}).eval()

# Prepare a batch from raw amino acid sequences
transformed_batch = model.transform({"sequences": ["HELLQ", "WRLD"]})

# Class logits, one row per sequence
logits = model(transformed_batch)
print(logits)
print(torch.argmax(logits, dim=-1))
```
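To read the raw logits as class probabilities, apply a softmax over the class dimension (continuing the example above):

```python
# Continuing the Sequence Level Classification example above.
probabilities = torch.softmax(logits, dim=-1)
print(probabilities)
```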
#### Token Level Classification

```python
import torch

from modelgenerator.tasks import TokenClassification

# Load the token classification task with a three-class head on the AIDO.Protein-16B-v1 backbone
model = TokenClassification.from_config({"model.backbone": "aido_protein_16b_v1", "model.n_classes": 3}).eval()

# Prepare a batch from raw amino acid sequences
transformed_batch = model.transform({"sequences": ["HELLQ", "WRLD"]})

# Class logits for each token position
logits = model(transformed_batch)
print(logits)
print(torch.argmax(logits, dim=-1))
```
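The per-position predictions can be read back against the input residues. The sketch below continues the example above and assumes the logits are shaped `(batch, sequence_length, n_classes)`; depending on the tokenizer, some positions may correspond to padding or special tokens rather than residues.

```python
# Continuing the Token Level Classification example above.
# Assumption: logits are shaped (batch, seq_len, n_classes); some positions may
# correspond to padding or special tokens depending on the tokenizer.
predictions = torch.argmax(logits, dim=-1)
for sequence, labels in zip(["HELLQ", "WRLD"], predictions):
    print(sequence, labels.tolist())
```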
#### Regression

```python
from modelgenerator.tasks import SequenceRegression

# Load the sequence regression task on the AIDO.Protein-16B-v1 backbone
model = SequenceRegression.from_config({"model.backbone": "aido_protein_16b_v1"}).eval()

# Prepare a batch from raw amino acid sequences
transformed_batch = model.transform({"sequences": ["HELLQ", "WRLD"]})

# Predicted value(s) for each sequence
logits = model(transformed_batch)
print(logits)
```
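For a quick sanity check, the predictions can be compared against reference values with a standard loss. The targets below are placeholders purely for illustration (continuing the example above):

```python
import torch

# Continuing the Regression example above; the targets are made up for illustration.
targets = torch.zeros_like(logits)
print(torch.nn.functional.mse_loss(logits, targets))
```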
## Citation

Please cite AIDO.Protein using the following BibTeX entry:

```bibtex
@inproceedings{sun_mixture_2024,
    title = {Mixture of Experts Enable Efficient and Effective Protein Understanding and Design},
    url = {https://www.biorxiv.org/content/10.1101/2024.11.29.625425v1},
    doi = {10.1101/2024.11.29.625425},
    publisher = {bioRxiv},
    author = {Sun, Ning and Zou, Shuxian and Tao, Tianhua and Mahbub, Sazan and Li, Dian and Zhuang, Yonghao and Wang, Hongyi and Cheng, Xingyi and Song, Le and Xing, Eric P.},
    year = {2024},
    booktitle = {NeurIPS 2024 Workshop on AI for New Drug Modalities},
}
```