Kioxia Announces New SSD Model Optimised for GPU-Driven AI Workloads
Kioxia Corporation has announced the development of SSD Super High IOPS, a new type of SSD that enables GPUs to access high-speed flash memory directly, expanding the memory available beyond High Bandwidth Memory (HBM) in AI systems. The new SSD Super High IOPS, which will form the basis of the Kioxia GP Series, has been specifically designed to meet the growing performance demands of AI and high-performance computing. The SSD gives GPUs access to greater memory capacity, enabling faster data access for AI workloads. Evaluation samples of the Kioxia GP Series will be available to selected customers by the end of 2026.
NVIDIA’s Storage-Next initiative anticipates the shift from compute-intensive to data-intensive workloads and addresses the growing need for GPU-accessible memory space, which is currently limited by HBM capacity. Expanding the memory space available to GPUs enables access to larger datasets and improves GPU utilisation by moving more data closer to computing resources.
The NVIDIA Storage-Next initiative calls upon SSD vendors to design drives optimised for GPU-driven AI workloads. This initiative effectively expands HBM capacity by enabling GPUs to access flash-based memory. Kioxia supports the NVIDIA initiative with the Kioxia GP Series SSD, which utilises Kioxia’s high-performance, low-latency XL-FLASH class storage memory. These SSDs are uniquely positioned for this architecture, delivering higher IOPS, finer-grained data access (512-byte), and lower power consumption per IO than conventional Kioxia TLC SSDs.
“Kioxia fully supports the NVIDIA Storage-Next initiative and will deliver SSDs specifically designed to effectively meet the needs of GPU-accessible memory,” said Makoto Hamada, Senior Director of the SSD Division at Kioxia Corporation. “This collaboration is crucial in shaping the future of AI storage architecture.”
Kioxia reaffirms its commitment to advancing AI and high-performance computing technology through continuous innovation and strategic collaboration. The Kioxia GP Series SSD product line has been designed to meet the evolving needs of AI workloads.
Furthermore, AI models are rapidly expanding towards trillions of parameters, whilst context windows are growing to millions of tokens. This is driving unprecedented growth in KV (key-value) cache requirements. Architectures such as NVIDIA’s Context Memory Storage (CMX) recognise the need to extend the memory hierarchy beyond GPU memory using high-performance storage. The Kioxia CM9 Series PCIe 5.0 E3.S SSD, which features 25.6 TB TLC capacity with 3 DWPD durability, delivers the performance, capacity, and durability needed to support large-scale inference environments. Kioxia believes that this class of storage will play an important role in enabling efficient and cost-effective AI inference infrastructure. Samples will begin shipping in the third quarter of 2026.
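To illustrate the scale involved, the back-of-envelope sketch below estimates the KV cache footprint of a single million-token request for a hypothetical large transformer model. The model dimensions used (80 layers, 8 KV heads, 128-dimensional heads, FP16 precision) are illustrative assumptions, not figures from Kioxia or NVIDIA.

```python
def kv_cache_bytes(layers: int, kv_heads: int, head_dim: int,
                   seq_len: int, dtype_bytes: int = 2) -> int:
    """Estimate per-request KV cache size for a transformer decoder.

    Each layer stores one key and one value vector per token per KV
    head, hence the factor of 2. dtype_bytes=2 assumes FP16/BF16.
    """
    return 2 * layers * kv_heads * head_dim * seq_len * dtype_bytes

# Hypothetical model: 80 layers, 8 KV heads (grouped-query attention),
# 128-dim heads, FP16, serving a 1-million-token context window.
size = kv_cache_bytes(layers=80, kv_heads=8, head_dim=128,
                      seq_len=1_000_000)
print(f"{size / 1e9:.1f} GB")  # ~327.7 GB for a single request
```

Even under these modest assumptions, one request's cache already exceeds the HBM of any single current GPU, which is why tiering the KV cache out to high-performance, low-latency flash is attractive.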
Kioxia will demonstrate an emulator of the SSD Super High IOPS and other technology innovations at NVIDIA GTC at booth 3522.