Mol ID
A transformer encoder model pretrained on 50M ZINC SMILES strings using Flash Attention 2.
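A minimal loading and encoding sketch, assuming the model is published as a standard transformers checkpoint. The repo id `user/mol-id`, the `AutoModel`/`AutoTokenizer` classes, and the toy SMILES input are placeholders, not confirmed by this card:

```python
import torch
from transformers import AutoModel, AutoTokenizer

model_id = "user/mol-id"  # hypothetical repo id; replace with the real one
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModel.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,               # bf16, matching pretraining
    attn_implementation="flash_attention_2",  # requires a supported GPU
).to("cuda").eval()

smiles = "CCO"  # ethanol, a toy SMILES input
inputs = tokenizer(smiles, return_tensors="pt").to("cuda")
with torch.no_grad():
    hidden = model(**inputs).last_hidden_state  # per-token embeddings
```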
Hardware:
- a GPU that supports Flash Attention 2 and bf16
Software:
- Flash Attention 2
- PyTorch Lightning for mixed precision (bf16-mixed); a setup sketch follows this list
- wandb for logging
- Hugging Face transformers
- tokenizers
- datasets
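A minimal sketch of the precision and logging setup listed above. Only `precision="bf16-mixed"` and the wandb logger come from this card; the `MolIDPretrainModule` LightningModule, the project name, and the dataloader are hypothetical stand-ins:

```python
import lightning as L
from lightning.pytorch.loggers import WandbLogger

# wandb logging; the project name is an assumption
logger = WandbLogger(project="mol-id")

trainer = L.Trainer(
    accelerator="gpu",
    devices=1,
    precision="bf16-mixed",  # mixed precision as stated in the card
    logger=logger,
)

# Hypothetical module and dataloader, shown only to indicate usage:
# trainer.fit(MolIDPretrainModule(), train_dataloaders=train_loader)
```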
GitHub repo: link