Pentaho thinks its new Streamlined Data Refinery solution architecture (optimised for the HP Vertica Analytics Platform) could provide a better, more "refined and sophisticated" route to big data analytics.
This software is designed to help create analytic datasets within Hadoop and immediately push that data to HP Vertica.
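The "create in Hadoop, push to Vertica" flow can be illustrated with a bulk-load statement. This is a minimal sketch only: the table, columns, and HDFS path are hypothetical, and the exact COPY syntax depends on the Vertica version and HDFS connector in use.

```sql
-- Hypothetical example: bulk-load a refined dataset from HDFS into HP Vertica.
-- Table name, columns, and path are illustrative, not from the article.
CREATE TABLE refined_sales (
    order_id   INT,
    sale_date  DATE,
    revenue    NUMERIC(12, 2)
);

COPY refined_sales
FROM 'hdfs:///refinery/output/sales/*.csv'
DELIMITER ','
DIRECT;  -- write straight to Vertica's columnar storage for large loads
```

Once loaded, the data can be queried with ordinary SQL against Vertica's column store rather than through Hadoop batch jobs.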
The firms' vision is one of analytics in Hadoop where users can analyse and visualise all of their data.
Users are saying that while Hadoop is a "great landing zone" for big data, the real analysis of granular data requires additional purpose-built tooling.
“We’re using Pentaho and HP Vertica to quickly slice and dice terabytes of data and millions of daily records, plus record click stream data and pixel logs,” said Jaiesh Khavani, senior manager, BI & data warehousing at Santa Monica-based e-commerce fashion company Beachmint.
“While Hadoop is a great landing zone for our big data requirements, to truly engineer data sets for predictive analytics, we need purpose-built platforms like HP Vertica and Pentaho to provide the data integration, reporting, and visualization capabilities to drive meaningful insights,” added Khavani.
Through this collaboration, analytics-ready datasets are blended in Hadoop and delivered to HP Vertica for analysis against minimally modelled data.
According to Pentaho’s Christopher Dziekan, as one of the four big data blueprints built by his firm, the Streamlined Data Refinery for HP Vertica offers data developers in-cluster data integration capabilities to power scalable and interactive analytics for end users.