Hewlett Packard Enterprise (HPE) has announced what it regards as a milestone for software application developers interested in cutting-edge infrastructure, data-centric computing and software-defined computing frameworks: a working demonstration of Memory-Driven Computing (MDC).
As a software methodology and a concept in its own right, memory-driven computing describes an architecture that puts memory, rather than processing, at the centre of the computing platform, where the processor has traditionally sat, in an attempt to realise higher performance and efficiency.
Welcome to The Machine
Developed as part of The Machine research program, HPE says this proof-of-concept prototype represents a big shift in the fundamental architecture on which all computers have been built for the past 60 years. So does it?
“The beauty of memory driven computing is that it is infinitely scalable from devices the size of a fingernail right up to the size of a supercomputer,” said HPE CEO Meg Whitman during the HPE Discover 2016 event in London this week.
The Machine research project itself is said to be one of the largest and most complex research projects in HPE’s history.
Antonio Neri, executive vice president and general manager of the Enterprise Group at HPE, has said that this prototype, which was brought online in October, shows the fundamental building blocks of the new architecture working together.
What’s inside memory-driven?
- Compute nodes accessing a shared pool of Fabric-Attached Memory;
- An optimised Linux-based operating system (OS) running on a customised System on a Chip (SOC);
- Photonics/optical communication links, including the new X1 photonics module, which are online and operational; and
- New software programming tools designed to take advantage of abundant persistent memory.
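The first of these building blocks, many compute nodes addressing one shared memory pool directly rather than exchanging copies over a network, can be loosely illustrated on a conventional operating system. The following is a minimal sketch using Python's standard `multiprocessing.shared_memory` module; it is an analogy only, not HPE's fabric-attached memory interface, and all names are illustrative:

```python
from multiprocessing import shared_memory

# One shared byte-addressable "pool", plus two independent handles
# standing in for two compute nodes attached to the same fabric.
pool = shared_memory.SharedMemory(create=True, size=1024)
node_a = shared_memory.SharedMemory(name=pool.name)
node_b = shared_memory.SharedMemory(name=pool.name)

node_a.buf[0:5] = b"hello"       # "node A" stores bytes in place
result = bytes(node_b.buf[0:5])  # "node B" loads the same bytes directly,
                                 # with no copy or message-passing step

node_a.close()
node_b.close()
pool.close()
pool.unlink()
print(result)
```

The point of the analogy is that both handles see one pool: a store by one node is immediately a load for the other, which is the programming model a shared pool of fabric-attached memory generalises across many physical nodes.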
During the design phase of the prototype, simulations predicted that the architecture would deliver dramatic speed-ups. Specifically, the company has run new software programming tools on existing products, demonstrating execution speeds improved by up to 8,000 times on a variety of workloads.
The Machine research project will also increase focus on exascale computing. Exascale is a developing area of High Performance Computing (HPC) that aims to create computers several orders of magnitude more powerful than any system online today.
Memory-Driven Computing Commercialisation
“HPE is committed to rapidly commercialising the technologies developed under The Machine research project into new and existing products. These technologies currently fall into four categories: Non-volatile memory, fabric (including photonics), ecosystem enablement and security,” said the company, in a press statement.
Non-Volatile Memory (NVM)
HPE is working to bring byte-addressable NVM to market and plans to introduce it as soon as 2018/2019. Using technologies from The Machine project, the company developed HPE Persistent Memory – a step on the path to byte-addressable non-volatile memory, which aims to approach the performance of DRAM while offering the capacity and persistence of traditional storage.
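Byte-addressability is the key property here: code manipulates durable data with ordinary load/store operations instead of block I/O. On today's hardware the idea can be approximated by memory-mapping a file; the sketch below (illustrative only, not HPE Persistent Memory's actual interface, with a made-up file name) writes through the mapping, flushes, and shows the bytes surviving in the backing store:

```python
import mmap
import os
import tempfile

path = os.path.join(tempfile.mkdtemp(), "pmem.img")

# "Provision" a 4 KB persistent region.
with open(path, "wb") as f:
    f.write(b"\x00" * 4096)

# Map it byte-addressably and store into it like ordinary memory.
with open(path, "r+b") as f:
    with mmap.mmap(f.fileno(), 4096) as pmem:
        pmem[0:7] = b"durable"  # a plain in-place store, no write() call
        pmem.flush()            # ensure the stores reach the medium

# Read the region back through normal I/O: the bytes persisted.
with open(path, "rb") as f:
    survived = f.read(7)
print(survived)
```

Real non-volatile memory removes the block device underneath this picture, which is why it can aim for DRAM-like access speeds while keeping storage-like persistence.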
Fabric (including Photonics)
Drawing on its photonics research, HPE has taken steps to future-proof products; for example, HPE Synergy systems that will be available next year will be able to accept future photonics/optics technologies currently in advanced development.
Ecosystem Enablement

Much work has already been completed to build software for future memory-driven systems. HPE launched a Hortonworks/Spark collaboration this year to bring software built for Memory-Driven Computing to market. In June 2016, the company also began releasing code packages on GitHub to familiarise developers with programming on the new memory-driven architecture.
According to HPE, “The company plans to put this code into existing systems within the next year and will develop next-generation analytics and applications into new systems as soon as 2018/2019. As part of the Gen-Z Consortium HPE plans to start integrating ecosystem technology and specifications from this industry collaboration into a range of products during the next few years.”
Security

With this prototype, HPE demonstrated new, secure memory interconnects in line with its vision to embed security throughout the entire hardware and software stack. HPE plans to further this work with new hardware security features in the next year.