The appliance comes as pre-configured hardware with Pentaho software that ingests data from multiple sources and formats so that analytics can be run across them.
Suggested workloads are those where an organisation would want to run analytics across a variety of datasets to identify patterns, including in real time.
These could include banking, with use cases such as fraud detection, and retail, where web analytics and the identification of purchasing patterns would allow targeted offers to be generated. HDS also has internet of things (IoT) use cases in mind.
The HSP 400 series appliance comes as a node with two 12-core Intel CPUs and 12 x 6TB SAS drives.
HSP 400 nodes can be combined into clusters of up to 20, providing 1.4PB of capacity and 240 CPU cores.
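The quoted cluster capacity follows from the per-node drive count and drive size. A minimal back-of-envelope sketch (illustrative arithmetic only; the figures are raw capacity, before data protection and filesystem overheads reduce what is usable):

```python
# Raw capacity arithmetic for a full HSP 400 cluster,
# using the per-node specs quoted above.
DRIVES_PER_NODE = 12   # SAS drives per node
DRIVE_TB = 6           # capacity per drive, in TB
MAX_NODES = 20         # maximum cluster size

raw_tb = MAX_NODES * DRIVES_PER_NODE * DRIVE_TB
print(f"{raw_tb} TB = {raw_tb / 1000:.2f} PB raw")  # 1440 TB = 1.44 PB raw
```

This lines up with the quoted 1.4PB figure for a 20-node cluster.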
HSP nodes and clusters can be added non-disruptively in scale-out fashion.
Pentaho can also manage data platforms such as Hadoop, Apache Spark and Hortonworks, as well as NoSQL stores such as Cassandra.
Pentaho is an open source data analytics platform that was acquired by HDS in June 2015. It allows users to create a “data lake”, with data ingested from object, file and block access systems, on which data analytics can be run.