A team of researchers from the Tokyo Institute of Technology, Fujitsu and other organizations has announced the development of a large language model that can serve as a foundation for generative artificial intelligence, built using the Japanese supercomputer Fugaku.

Trained extensively on Japanese-language data, which accounts for 60% of its total training data, the Fugaku-LLM model unveiled Friday is expected to spur research on generative AI tailored to domestic needs.

The researchers — also including members from Tohoku University, Nagoya University, the government-backed research institute Riken, CyberAgent and Kotoba Technologies — launched the project in May 2023, employing the supercomputer jointly developed by Fujitsu and Riken.