Computational SSDs: moving compute to where the data lives

In the age of AI, real-time analytics, and high-performance computing (HPC), data is both a competitive differentiator and a crushing challenge. Enterprises are drowning in petabytes of information, while traditional architectures struggle to move and process that data fast enough. CPUs and DRAM remain overloaded with repetitive filtering, scanning, and query operations, while PCIe buses and network links strain under relentless traffic.

Manufacturers are increasingly turning to computational storage as a different approach to help break this bottleneck. By embedding compute directly into solid-state drives (SSDs), organisations can shift data processing closer to where the data actually resides. The result is faster insights, lower power consumption, simplified infrastructure, and a more sustainable path forward for modern data centres.

Why computational SSDs matter

In a conventional CPU-centric model, the CPU requests data, storage delivers it, and DRAM manages the back-and-forth. This approach made sense when datasets were small and workloads predictable. Today, that paradigm is under pressure like never before.

When every query, analytic, or AI training pass requires multiple round trips between storage and CPU, throughput suffers. For example, in a CPU-centred design the DDR channel carries two read/write loads: one from the CPU itself and another from SSD data movement. Tasks become limited by DRAM bandwidth, effectively cutting throughput in half – from a theoretical 96GB/s to closer to 48GB/s in practice.

Now contrast that with the bandwidth available inside SSDs themselves. Each SSD carries about 28GB/s of internal bandwidth, so a server equipped with 32 SSDs has roughly 896GB/s of ‘free’ computational bandwidth. That translates to nearly a terabyte per second of scanning and filtering capability, without overwhelming PCIe lanes or the network interface.
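The arithmetic behind these two paragraphs can be sketched in a few lines. The 96GB/s DDR figure, the 28GB/s per-drive figure, and the 32-drive server are taken from the text; the halving of DRAM bandwidth is the simplified two-load model described above, not a measured result:

```python
# Back-of-the-envelope bandwidth arithmetic, using the figures from the article.

DDR_THEORETICAL_GBPS = 96    # theoretical DRAM bandwidth
PER_SSD_INTERNAL_GBPS = 28   # internal bandwidth of one SSD
SSDS_PER_SERVER = 32

# In a CPU-centred design the DDR channel carries two read/write loads
# (CPU traffic plus SSD data movement), roughly halving what any one
# task can use in practice.
effective_ddr_gbps = DDR_THEORETICAL_GBPS / 2

# Aggregate in-drive bandwidth available for scanning and filtering,
# none of which has to cross PCIe lanes or the network interface.
aggregate_ssd_gbps = PER_SSD_INTERNAL_GBPS * SSDS_PER_SERVER

print(effective_ddr_gbps)   # 48.0
print(aggregate_ssd_gbps)   # 896
```

The gap between those two numbers – roughly 48GB/s of contended host bandwidth versus 896GB/s of uncontended in-drive bandwidth – is the opportunity computational SSDs target.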

By allowing SSDs to offload repetitive, low-level work such as scanning, filtering, or pre-processing, the CPU is liberated to focus on higher-order operations. This not only speeds up performance but also boosts overall system efficiency and scalability.

From storage devices to data engines

Computational SSDs mark a profound shift in how enterprises think about storage. Instead of being a passive repository, the SSD becomes an active data engine. Embedded compute resources filter queries, pre-sort data, and accelerate workloads before the information ever leaves the drive.

For AI and analytics-heavy environments, this is a game changer. Training a model or running a real-time dashboard often requires brute-force passes through massive datasets. By handling those repetitive sweeps directly in the storage layer, computational SSDs dramatically reduce data movement, improve latency, and increase throughput.
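The offload pattern described above can be illustrated with a short sketch. This is not Phison's actual interface (the article does not specify one) – the `ComputationalDrive` class and its `scan` method are hypothetical, modelling only the idea that a filter predicate is pushed down so non-matching records never leave the drive:

```python
from dataclasses import dataclass
from typing import Callable, Iterator

Record = dict  # a row of data stored on the drive


@dataclass
class ComputationalDrive:
    """Hypothetical model of an SSD with in-drive filtering.

    Real computational-storage interfaces differ; this only
    illustrates the pattern of pushing a predicate down to the
    storage layer instead of filtering on the host CPU.
    """
    records: list

    def scan(self, predicate: Callable[[Record], bool]) -> Iterator[Record]:
        # Filtering happens "inside the drive": only matching
        # records are transferred back to the host.
        return (r for r in self.records if predicate(r))


# Host-side code asks the drive for pre-filtered rows rather than
# reading the full dataset and filtering it in DRAM.
drive = ComputationalDrive(records=[
    {"user": "a", "latency_ms": 12},
    {"user": "b", "latency_ms": 340},
    {"user": "c", "latency_ms": 7},
])

slow_requests = list(drive.scan(lambda r: r["latency_ms"] > 100))
print(slow_requests)  # [{'user': 'b', 'latency_ms': 340}]
```

In a real deployment the predicate would run on the drive's embedded compute rather than the host, but the host-visible effect is the same: only one record crosses the bus instead of three.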

The advantages extend across the entire system:

  • Performance – database processing speeds can increase by up to 20x compared to CPU-only designs
  • Density – higher user counts per server translate to more efficient infrastructure and lower cost per seat
  • Sustainability – with fewer CPU cycles wasted and less data moving across buses, computational SSDs reduce both power draw and cooling requirements

Business benefits beyond speed

While performance metrics often grab headlines, the real value lies in operational and business outcomes. Enterprises deploying computational SSDs are seeing:

  • Simplified infrastructure – by reducing reliance on complex data pipelines between processors and storage, organisations can cut system complexity and associated costs
  • Lower power and smaller footprint – less data movement means less energy expended, fewer racks required, and better sustainability scores, which is an increasingly important metric for boards and regulators
  • Enhanced agility and scalability – with compute capacity distributed at the storage layer, enterprises can scale workloads more flexibly and respond faster to business demands

In industries where time-to-insight is critical, such as finance, healthcare, research, and manufacturing, this flexibility can be the difference between leading the market and lagging behind.

Overcoming the adoption curve

Of course, no transformational technology comes without hurdles. Computational SSDs require shifts in software architecture and, in some cases, hardware integration. APIs may need to be updated to fully exploit in-drive compute, and IT teams will need to adapt operational practices.

Natural resistance can also arise when introducing this novel approach into mission-critical environments. Enterprises want assurance that new hardware won’t compromise reliability or require wholesale redesigns.

Here is where vendor collaboration is key. Phison, for example, has developed its Pascari enterprise SSD line with computational features baked in from the controller up, ensuring seamless integration. Through its IMAGIN+ customisation services, Phison works directly with OEMs and integrators to tailor solutions for each environment, from pre-installation testing and validation to ongoing optimisation.

This minimises disruption and accelerates time-to-value, which is critical for organisations wary of adoption risks.

A strategic evolution for data infrastructure

At its core, computational storage requires organisations to rethink their data infrastructure. By shifting processing to where the data resides, enterprises can sidestep bottlenecks that have plagued traditional architectures for decades.

This evolution is particularly well suited for three demanding environments. In AI workloads, models require rapid, repeated passes through massive datasets and computational SSDs can dramatically reduce overhead. In high-performance computing, simulations and scientific applications depend on sustained throughput, making storage-level acceleration a natural fit. And in real-time analytics, where milliseconds directly affect customer experience and operational efficiency, processing data in place ensures speed and responsiveness at scale.

Phison has positioned itself at the forefront of this shift, leveraging decades of controller design expertise to create SSDs that not only store data but also serve as computational accelerators.

Looking ahead to a storage-centric future

As data growth continues, the question is not whether enterprises should adopt computational SSDs but how quickly they can do so. Analysts predict that the most competitive data centres of the next decade will be those that distribute compute intelligently – pushing it closer to data and reducing reliance on overburdened CPUs.

For enterprises, this can solve bottlenecks and also unlock new business models. Faster analytics enable smarter personalisation. Accelerated AI training leads to quicker innovation cycles. Leaner infrastructure reduces both capital and operational costs.

Computational SSDs represent a rare triple win of better performance, lower cost, and stronger sustainability. Embedding intelligence at the storage layer allows organisations to move beyond storage and treat data as a true strategic asset.

About the author:

Michael Wu, President and General Manager, Phison US
