# RakutenAI-7B
RakutenAI-7B is a systematic initiative that brings the latest technologies to the world of Japanese LLMs. Among similar open models such as OpenCalm, Elyza, Youri, Nekomata, and Swallow, RakutenAI-7B achieves the best scores on Japanese language understanding benchmarks while maintaining competitive performance on English test sets. RakutenAI-7B leverages the Mistral model architecture and is based on the Mistral-7B-v0.1 pre-trained checkpoint, exemplifying a successful retrofitting of pre-trained model weights. Moreover, we extend Mistral's vocabulary from 32k to 48k tokens to achieve a better character-per-token rate for Japanese.
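The character-per-token rate mentioned above measures how many characters of input text each token covers on average; a higher rate means Japanese text is split into fewer tokens, so more text fits in the same context window. A minimal sketch of the metric, using a stand-in whitespace tokenizer (the helper function and toy tokenizer are illustrative, not part of the release):

```python
def chars_per_token(text: str, tokenize) -> float:
    """Average number of characters covered by one token.

    `tokenize` is any callable mapping a string to a list of tokens;
    in practice this would be the model's actual tokenizer.
    """
    tokens = tokenize(text)
    return len(text) / len(tokens)

# Toy whitespace tokenizer, purely for demonstration.
toy_tokenize = str.split

rate = chars_per_token("tokenizers trade vocabulary size for sequence length", toy_tokenize)
```

With a real tokenizer, comparing this rate on Japanese text before and after the vocabulary extension shows the efficiency gain the larger vocabulary buys.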
Model evaluation results can be found in our HuggingFace repository.
## Model Downloads
| Model | Type | Download |
|---|---|---|
| RakutenAI-7B | Foundation Model | 🤗 HuggingFace |
| RakutenAI-7B-instruct | Instruction-tuned Model | 🤗 HuggingFace |
| RakutenAI-7B-chat | Chat-tuned Model | 🤗 HuggingFace |
## Getting Started
You can refer to our "Getting Started" guide for a step-by-step tutorial on how to incorporate our models into your own applications.
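As a quick orientation before the full guide, the sketch below shows one way to load the foundation model with the Hugging Face `transformers` library. The repository id, dtype, and generation settings are assumptions for illustration; consult the model card and the "Getting Started" guide for the exact values.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer


def generate(prompt: str,
             model_id: str = "Rakuten/RakutenAI-7B",  # assumed repo id
             max_new_tokens: int = 64) -> str:
    """Load the model and return a completion for `prompt`.

    Sketch only: loading a 7B checkpoint requires substantial memory,
    and bf16 / device_map settings here are illustrative defaults.
    """
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(
        model_id,
        torch_dtype=torch.bfloat16,  # assumed precision for a 7B model
        device_map="auto",
    )
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    outputs = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(outputs[0], skip_special_tokens=True)
```

The instruction-tuned and chat-tuned variants are loaded the same way under their own repository ids, with the prompt formatted according to the template documented on their model cards.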