
Storage platform provider StorONE has announced an expansion of its strategic partnership with Phison Electronics, a developer of NAND flash technologies, to deliver AI-native, intelligent on-premises storage that combines Phison’s aiDAPTIV+ memory extension technology with StorONE’s high-performance storage platform.
The partnership will enable enterprises and research teams to train and deploy large language models (LLMs) on-premises, using secure, high-throughput storage enhanced with conversational AI interfaces for simpler management and optimization.
The expanded partnership integrates Phison’s aiDAPTIV+ GPU memory extension technology with StorONE’s high-performance software-defined storage platform. aiDAPTIV+ extends GPU VRAM onto Phison’s SSDs, reducing the number of data center GPUs a deployment requires.
The idea is to let customers train large language models of up to several billion parameters on-premises with less hardware, while a built-in conversational chatbot helps users manage, query, and optimize storage configurations and performance in plain English.
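Neither company has published a programming interface for the combined offering, but the underlying technique, spilling model weights and optimizer state from GPU VRAM onto NVMe flash, is conceptually similar to publicly documented NVMe-offload training setups. The sketch below uses DeepSpeed's ZeRO-Infinity NVMe offload as a stand-in to illustrate the idea; it is not aiDAPTIV+ or StorONE software, and the model, path, and hyperparameters are placeholders.

```python
# Conceptual sketch only: DeepSpeed ZeRO-Infinity NVMe offload stands in for the
# general "extend GPU memory onto flash" technique; it is not Phison's aiDAPTIV+
# or StorONE's platform. Paths and model sizes below are illustrative placeholders.
import torch
import deepspeed

ds_config = {
    "train_micro_batch_size_per_gpu": 1,
    "bf16": {"enabled": True},
    "optimizer": {"type": "Adam", "params": {"lr": 1e-5}},
    "zero_optimization": {
        "stage": 3,                            # partition params, grads, and optimizer state
        "offload_param": {                     # push parameters out of VRAM onto NVMe...
            "device": "nvme",
            "nvme_path": "/mnt/fast_ssd",      # placeholder: an NVMe-backed volume
        },
        "offload_optimizer": {                 # ...and optimizer state as well
            "device": "nvme",
            "nvme_path": "/mnt/fast_ssd",
        },
    },
}

# Placeholder model; in practice this would be a multi-billion-parameter LLM.
model = torch.nn.Sequential(
    torch.nn.Linear(4096, 4096),
    torch.nn.ReLU(),
    torch.nn.Linear(4096, 4096),
)

engine, optimizer, _, _ = deepspeed.initialize(
    model=model,
    model_parameters=model.parameters(),
    config=ds_config,
)
# Training then proceeds with engine.backward(loss) and engine.step(), with much
# of the memory pressure shifted from GPU VRAM to flash storage.
```

The design point this illustrates is the same one the partnership targets: when parameters and optimizer state live on fast SSDs rather than exclusively in VRAM, the GPU count needed for a given model size drops, at the cost of making storage throughput part of the training critical path.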
“In today’s AI-first world, infrastructure must evolve beyond raw capacity,” Phison wrote in a blog post announcing the deal. “This partnership aims to reimagine storage as a smart, responsive platform that actively supports AI model development, training, and real-time inferencing.”
The joint solution will be available in Q2 2025, offering customers a transformative combination of high-speed, scalable storage and AI embedding acceleration that supports LLM training at scale.
Source: Network World