Intel’s managing director of worldwide professional services Aaron Davies-Morris has been speaking at the Hadoop Summit 2013 in San Jose about his firm’s drive to work with the open source, Java-based Hadoop programming framework.
Hadoop’s key strength is its support for processing large data sets across a distributed computing environment — it is part of the Apache project.
NOTE: Hadoop makes it possible to run applications on systems with thousands of nodes and thousands of terabytes of data — massive High Performance Computing (HPC) territory that Intel is also keen to focus on to (obviously) help drive high-margin chip sales.
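The programming model Hadoop distributes across those nodes is MapReduce: a map phase emits key/value pairs, a shuffle groups them by key, and a reduce phase aggregates each group. As a rough illustration only — a plain-Java sketch of the pattern, not Hadoop’s actual API (a real job would extend Hadoop’s Mapper and Reducer classes and run on a cluster) — the classic word-count looks like:

```java
import java.util.*;
import java.util.stream.*;

public class WordCountSketch {

    // "Map" phase: each node turns its split of the input into (word, 1) pairs.
    static List<Map.Entry<String, Integer>> map(String line) {
        return Arrays.stream(line.toLowerCase().split("\\s+"))
                .filter(w -> !w.isEmpty())
                .map(w -> Map.entry(w, 1))
                .collect(Collectors.toList());
    }

    // "Shuffle + reduce" phase: group the pairs by word and sum the counts.
    static Map<String, Integer> reduce(List<Map.Entry<String, Integer>> pairs) {
        return pairs.stream().collect(Collectors.groupingBy(
                Map.Entry::getKey,
                Collectors.summingInt(Map.Entry::getValue)));
    }

    public static void main(String[] args) {
        // Two "splits" standing in for blocks of a file spread across nodes.
        List<String> splits = List.of("big data big ideas", "big cluster");
        List<Map.Entry<String, Integer>> mapped = splits.stream()
                .flatMap(s -> map(s).stream())
                .collect(Collectors.toList());
        System.out.println(reduce(mapped)); // "big" maps to 3
    }
}
```

Hadoop’s value is that it runs this same split/shuffle/aggregate shape over files too large for any single machine, handling data placement and node failure along the way.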
Intel will also look to sell services around Hadoop to augment its core chip business.
Aaron Davies-Morris asserts that “Intel is committed to open source” but with an important caveat …
… open source technology like Hadoop must not be extensively deployed (given the weight of information a typical use case would see put “inside” it) without the right security controls in place.
The intel-hadoop / project-rhino GitHub page details the following:
As Hadoop extends into new markets and sees new use cases with security and compliance challenges, the benefits of processing sensitive and legally protected data with all Hadoop projects and HBase must be coupled with protection for private information that limits performance impact. Project Rhino is our open source effort to enhance the existing data protection capabilities of the Hadoop ecosystem to address these challenges, and contribute the code back to Apache.
Intel’s Davies-Morris says that Intel will retain its security focus on Hadoop by virtue of its partner framework. “We think our partner strategy helps us meet large enterprise needs,” he said.
“Having a layered defence of security is [the only smart way forward],” added Davies-Morris.
Intel says it will now move forward with its work on encryption, access control and auditing (as one set of layers) and also firewalls and IDS (intrusion detection system) technology (as another layer).
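For a sense of what that first layer touches in practice, stock Hadoop already exposes authentication and wire-encryption settings in its XML configuration files. The fragment below is purely illustrative — these are standard Hadoop property names (Kerberos authentication and RPC/data-transfer encryption), shown as a sketch of the kind of controls involved rather than anything specific to Intel’s distribution or Project Rhino:

```xml
<!-- core-site.xml: authenticate users via Kerberos and encrypt RPC traffic -->
<property>
  <name>hadoop.security.authentication</name>
  <value>kerberos</value>
</property>
<property>
  <name>hadoop.rpc.protection</name>
  <value>privacy</value> <!-- authentication | integrity | privacy -->
</property>

<!-- hdfs-site.xml: encrypt the HDFS data-transfer protocol between nodes -->
<property>
  <name>dfs.encrypt.data.transfer</name>
  <value>true</value>
</property>
```

Rhino’s stated aim is to push this kind of protection deeper into the stack — and back upstream to Apache — while limiting the performance cost.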
Is Intel being opportunistic and jumping on the big data bandwagon, or conscientious and forward-thinking? My money is on the latter.