Supreet Oberoi, vice president of field engineering at Concurrent, Inc., says that “2015 is the ‘show me the money’ year for Hadoop” — his firm offers an application development platform for big data applications.
The following text is all attributed to Oberoi, writing as a guest contributor on Computer Weekly’s Open Source Insider blog.
Hadoop has been rapidly adopted by enterprises as “the way” to execute a big data strategy. However, enterprises must now show ROI by making these data applications operational, so that businesses can use them to make decisions. They can do so by selecting the right platform for developing on Hadoop.
When building applications on Hadoop, enterprises have struggled to choose between easy-to-use tools and the frameworks they actually need — and the easy way almost always falls short.
Additionally, ease of development is only a small part of the overall effort to operationalise applications on Hadoop – enterprises need tools to debug, tune, deploy, monitor, govern and provide compliance.
When setting the foundation of a new platform architecture with Hadoop, enterprises must choose an architecture that scales with data, absorbs the complexity in applications, integrates with legacy systems and operationalises development with minimal effort — all while ensuring that compliance and governance needs are met. In addition, best practices must be shared and components reused across teams.
Selecting the right platform is not just a technical decision — technology leaders also need a path for training their existing organisations.
Leveraging only existing skill sets may limit options, while selecting easy-to-use technology may yield an overly simple approach to application development that fails to meet the needs of the enterprise. What is required to show ROI, then, is a platform that leverages existing development skill sets, such as Java, yet can scale to meet enterprise demands.
This is why 2015 will be the “show me the money” year for Hadoop.
NOTE: Concurrent builds application infrastructure products that are designed to help create, deploy, run and manage data applications at scale on Apache Hadoop.