Anticipating the arrival of the AI inference era, SoftBank has announced plans to invest 3 billion yen in the development of high-performance, power-efficient next-generation memory.
🟦 SoftBank to invest 3 billion yen in next-generation memory development: An “unexpected challenge” in anticipation of the AI era
At its fiscal 2024 financial results briefing, SoftBank announced plans to invest approximately 3 billion yen in developing next-generation memory technology for AI data centers. The aim is to create a new memory structure that can draw out the full performance of GPUs in inference-centric AI processing.
Current HBM (High Bandwidth Memory) is said to consume a large amount of power while offering limited processing capability, and SoftBank is considering a new memory architecture that uses patented university technology to overcome this problem. Commercialization will be left to semiconductor companies whose core business it is, while SoftBank focuses on acquiring and utilizing the intellectual property (IP).
🟦 Why are telecommunications companies now “developing memory”?
Behind this move is the rapid growth of AI model inference workloads: the efficiency of AI data centers is increasingly what determines competitiveness in the industry. In particular, there was concern that existing memory architectures would become a bottleneck for the advanced training and retrieval processing of generative AI, such as the "Sarashina" series and "Cristal Intelligence" that SoftBank is developing in-house.
While GPU performance improves year after year, there is a structural problem: if the memory supporting the GPU does not evolve, the capability of AI systems as a whole will plateau. In light of this situation, President Miyakawa says he decided to pursue the project on the principle that "it will not be realized unless someone takes the risk first."
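The memory-bottleneck argument above can be sketched with a back-of-the-envelope roofline calculation. All figures below are illustrative assumptions for a hypothetical accelerator, not numbers from SoftBank or any real product:

```python
# Roofline sketch: when is an AI workload limited by memory bandwidth
# rather than by peak compute? (Illustrative numbers only.)

def attainable_tflops(peak_tflops, mem_bw_tb_s, arithmetic_intensity):
    """Roofline model: achievable throughput is capped by the lower of
    peak compute and (memory bandwidth x arithmetic intensity in FLOPs/byte)."""
    return min(peak_tflops, mem_bw_tb_s * arithmetic_intensity)

# Hypothetical accelerator: 1000 TFLOPS peak compute, 3 TB/s HBM bandwidth.
peak, bw = 1000.0, 3.0

# LLM inference (token-by-token decoding) streams the model weights for
# every token, so its arithmetic intensity is low -- assume ~2 FLOPs per byte.
print(attainable_tflops(peak, bw, 2.0))    # memory-bound: 6.0 TFLOPS used

# A compute-dense workload with heavy data reuse (large batched matmuls):
print(attainable_tflops(peak, bw, 500.0))  # compute-bound: full 1000.0 TFLOPS
```

Under these assumed numbers, low-intensity inference uses well under 1% of the GPU's peak compute, which is why faster or more efficient memory, rather than a faster GPU, is the lever for inference-era data centers.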
🟦 Summary
To support the evolution of AI inference processing, SoftBank has announced a 3 billion yen investment in next-generation memory development. The aim is to realize a new architecture that goes beyond the limits of HBM and contributes to more efficient AI data centers.
AI requires high-speed, high-capacity memory, and this trend will continue for some time unless model architectures change significantly. SoftBank Group's move into the memory space is reminiscent of its investment in Kingston Technology in the 1990s.