RakutenAI-2.0-mini

RakutenAI-2.0-mini is a lightweight Japanese language model trained from scratch on a transformer architecture and designed for efficient performance in resource-constrained environments. As a foundation model, it serves as the backbone for instruction-tuned variants such as RakutenAI-2.0-mini-instruct.
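As a minimal sketch, the foundation model can be loaded for plain text completion with the Hugging Face transformers library. The repository ID below is an assumption inferred from the model name; use the ID from the download table in this page.

```python
# Minimal sketch: plain text completion with the foundation model.
# "Rakuten/RakutenAI-2.0-mini" is an assumed HuggingFace repo ID.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Rakuten/RakutenAI-2.0-mini"  # assumed repo ID
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype="auto")
model.eval()

# A foundation model continues a prompt; it is not instruction-tuned.
inputs = tokenizer("楽天グループは", return_tensors="pt")
with torch.no_grad():
    outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```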

RakutenAI-2.0-mini-instruct is a lightweight yet powerful fine-tuned variant of RakutenAI-2.0-mini, specifically designed for edge devices and resource-constrained environments. While compact in size, this model delivers efficient, high-quality instruction-following capabilities, making it an ideal choice for on-device AI applications, low-latency inference, and cost-effective deployment. It achieves competitive performance within the sub-2B parameter category on Japanese MT Bench, offering a balance of speed, efficiency, and accuracy for real-world use cases.
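Below is a minimal sketch of instruction-following inference with the transformers chat-template API, assuming the instruct model ships a chat template in its tokenizer config; the repository ID is again an assumption based on the model name.

```python
# Minimal sketch: chat-style inference with the instruct variant.
# "Rakuten/RakutenAI-2.0-mini-instruct" is an assumed repo ID, and we
# assume the tokenizer defines a chat template.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Rakuten/RakutenAI-2.0-mini-instruct"  # assumed repo ID
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype="auto")
model.eval()

messages = [{"role": "user", "content": "日本の首都はどこですか？"}]
# apply_chat_template wraps the conversation in the model's expected
# instruction markers and appends the assistant prompt.
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
)
with torch.no_grad():
    output = model.generate(input_ids, max_new_tokens=128)
# Decode only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(output[0][input_ids.shape[-1]:], skip_special_tokens=True))
```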

Model evaluation results can be found in our HuggingFace repository.

Model Downloads

| Model | Type | Download |
|---|---|---|
| RakutenAI-2.0-mini | Foundation Model | 🤗 HuggingFace |
| RakutenAI-2.0-mini-instruct | Instruction-tuned Model | 🤗 HuggingFace |

Getting Started

You can refer to our "Getting Started" guide for a step-by-step tutorial on how to incorporate our models into your own applications.
