Software runs on data, and data is often regarded as the new oil. So it makes sense to put data as close as possible to where it is processed, in order to reduce latency for performance-hungry processing tasks.
Some architectures call for big chunks of memory-like storage located near the compute function, while in other cases it makes more sense to move the compute nearer to the bulk storage.
In this series of articles we explore the architectural decisions driving modern data processing… and, specifically, we look at computational storage.
The Storage Network Industry Association (SNIA) defines computational storage as follows:
“Computational storage is defined as architectures that provide Computational Storage Functions (CSF) coupled to storage, offloading host processing or reducing data movement. These architectures enable improvements in application performance and/or infrastructure efficiency through the integration of compute resources (outside of the traditional compute & memory architecture) either directly with storage or between the host and the storage. The goal of these architectures is to enable parallel computation and/or to alleviate constraints on existing compute, memory, storage and I/O.”
This post is written by Randy Kerns, senior analyst and strategist, Evaluator Group – an organisation that describes itself as committed to delivering unbiased, comprehensive research and information on information management, data storage, data protection and IT infrastructure solutions for modern datacentres.
Kerns writes as follows…
Jumping to conclusions over ‘the next big thing’ may be a natural reaction to new ideas. But it can also be a disservice to the ongoing development required: setting expectations and timing that lead to disappointment can diminish the long-term potential value, leaving us all too soon moving on to ‘the next, next big thing’… or perhaps worse.
Such is the case with the advent of computational storage.
Solid-state storage technology removes the need to control the servo positioning and rotational mechanics of spinning devices, so the multi-core processor on the device presents an opportunity to add value in areas with great potential benefit. The commonly cited benefits are apparent: operating on data where it resides reduces data movement, frees the server processors for other computation, and offers possible scaling efficiencies from multiple devices operating in parallel.
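To make the data-movement argument concrete, here is a minimal sketch in Python. It is purely illustrative – the class and method names are invented for this example and do not correspond to any real device interface – but it shows why running a filter on the device, rather than on the host, shrinks the traffic that crosses the storage bus.

```python
class SimulatedDrive:
    """Toy model of a drive holding a list of fixed-size records.

    All names here are hypothetical; real computational storage devices
    expose vendor- or standards-specific interfaces, not a Python class.
    """

    def __init__(self, records):
        self.records = records  # list of bytes objects

    def read_all(self):
        """Conventional path: every byte crosses the bus to the host."""
        return self.records, sum(len(r) for r in self.records)

    def filtered_read(self, predicate):
        """Computational path: the predicate runs 'on the device',
        so only matching records cross the bus to the host."""
        hits = [r for r in self.records if predicate(r)]
        return hits, sum(len(r) for r in hits)


drive = SimulatedDrive([f"record-{i:04d}".encode() for i in range(1000)])

# Host-side filtering: move everything, then select.
all_recs, moved_all = drive.read_all()
wanted = [r for r in all_recs if r.endswith(b"7")]

# On-device filtering: select first, move only the results.
hits, moved_hits = drive.filtered_read(lambda r: r.endswith(b"7"))

assert wanted == hits
print(f"bytes moved with host filtering:   {moved_all}")
print(f"bytes moved with device filtering: {moved_hits}")
```

The same selection result is produced either way; only the volume of data moved differs, which is the efficiency the architecture aims at.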
With that frame of reference, we should perhaps ask… what are the potential areas that can be addressed to provide benefit?
Speculations, expectations & extrapolations
This is where speculation transitions into expectation and loses perspective on the challenges and the timing involved.
The use of computational storage currently has two defined approaches: one where a fixed set of operations – such as transcoding, encryption or compression – can be performed on the stored data; and another where a programme can be downloaded onto the device to perform a specific task on the data, such as searching or directed access.
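The two approaches can be sketched side by side. The Python below is a hypothetical illustration only – the `ComputationalDevice` class and its methods are invented for this example and come from neither the SNIA model nor any real product – using the standard library's `zlib` to stand in for a built-in compression function.

```python
import zlib


class ComputationalDevice:
    """Hypothetical device illustrating the article's two approaches."""

    # Approach 1: a fixed set of built-in operations the host invokes by name.
    FIXED_FUNCTIONS = {
        "compress": zlib.compress,
        "decompress": zlib.decompress,
    }

    def __init__(self, data):
        self.data = data        # the data stored on the device
        self.programs = {}      # programmes downloaded by the host

    def invoke_fixed(self, name):
        """Run one of the device's fixed functions on the stored data."""
        return self.FIXED_FUNCTIONS[name](self.data)

    # Approach 2: the host downloads a programme to run against the data.
    def download_program(self, name, fn):
        self.programs[name] = fn

    def run_program(self, name, *args):
        return self.programs[name](self.data, *args)


dev = ComputationalDevice(b"payload " * 512)

# Fixed function: compression happens where the data lives.
packed = dev.invoke_fixed("compress")
assert zlib.decompress(packed) == dev.data

# Downloaded programme: a search that returns only an offset, not the data.
dev.download_program("find", lambda data, needle: data.find(needle))
offset = dev.run_program("find", b"payload")
print(offset)  # 0
```

In both cases the host receives a small result (compressed bytes, an offset) rather than the full stored data, which is the common thread between the two approaches.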
Some vendors of computational storage already have solutions available or demonstrable. Speculation about this becoming general-purpose opens up a discussion of both the challenges presented and the potential. There are many challenges, but none that cannot be overcome with time and effort. The environment – processing and data-handling technology – is not static, so changes there will affect both design and potential gain. The benefits delivered by near-term capabilities will be measured and will give an indication of that potential. Even if these implementations are special-purpose and more applicable in some vertical markets, they have real value. Expanding usage to add incremental value is the developmental approach.
So then, setting expectations for broad, general-purpose usage may overlook the value gained from these defined solutions.
Customer evaluation and choice are another set of considerations that will need to be examined in greater detail. For now, setting near-term expectations and explaining the value clearly is important both for vendors and for continued investment in the development of computational storage.