---
license: mit
---
This repo contains a low-rank adapter (LoRA) for LLaMA-65B,
fine-tuned on the Stanford Alpaca dataset translated into Japanese.
It does not contain the foundation model itself, which is why it can be released under the MIT license.
Instructions for running it can be found at https://github.com/kunishou/Japanese-Alpaca-LoRA.
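Below is a minimal sketch of how such an adapter is typically loaded with the Hugging Face `transformers` and `peft` libraries. The base-model and adapter identifiers are placeholders, not the actual paths for this repo; see the linked GitHub repository for the authoritative instructions.

```python
# Hypothetical usage sketch: load a LLaMA base model and apply a LoRA adapter with PEFT.
import torch
from transformers import LlamaForCausalLM, LlamaTokenizer
from peft import PeftModel

base_model_id = "path/to/llama-65b"        # placeholder: local path or hub id of the LLaMA-65B weights
adapter_id = "path/to/this-adapter-repo"   # placeholder: this adapter repo

tokenizer = LlamaTokenizer.from_pretrained(base_model_id)
model = LlamaForCausalLM.from_pretrained(
    base_model_id,
    torch_dtype=torch.float16,
    device_map="auto",
)
# Attach the low-rank adapter weights on top of the frozen base model.
model = PeftModel.from_pretrained(model, adapter_id)

prompt = "以下の指示に従ってください。"  # example Japanese prompt
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```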