June 2012 Archives

The storage revolution will be driven by software

bridgwatera

When will the storage industry experience its next 'paradigm shift'? This is what we want to know.

But why do we need to know this?

Well, here are a few pointers for you that came out of a discussion with Red Hat's Ranga Rangachari, the company's VP & GM of its storage business unit:

...we are experiencing a data explosion

1. 70,000 images are uploaded to Facebook every minute, so we are experiencing a data explosion.

2. As much as 95% of new data is unstructured, do you realise that we are experiencing a data explosion?

3. A new level of data monitoring exists and connected "instrumented" devices that form part of the so-called "Internet of Things" are fuelling data growth so that... guess what... we are experiencing a data explosion.

4. Disk speeds today are argued to run at roughly the same rate as they did a decade ago, which is a concern: capacity has soared, but speed has not followed a Moore's Law path of progress. At the same time, you guessed it, we are experiencing a data explosion.
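Point four can be put in rough numbers. The figures below are ballpark assumptions for illustration only, not vendor specs, but they show how a drive's capacity has outrun its throughput:

```python
# Rough illustration of the capacity-vs-speed gap in disk drives.
# All figures are ballpark assumptions, not vendor specifications.

def full_scan_hours(capacity_gb, throughput_mb_s):
    """Hours needed to read an entire drive end to end."""
    return (capacity_gb * 1024) / throughput_mb_s / 3600

# ~2002: a 120 GB drive at ~50 MB/s sequential throughput
then_h = full_scan_hours(120, 50)

# ~2012: a 3 TB drive at ~150 MB/s -- roughly 25x the capacity
# for only about 3x the speed
now_h = full_scan_hours(3072, 150)

print(f"2002 full scan: {then_h:.1f} h, 2012 full scan: {now_h:.1f} h")
```

In other words, the time just to read a whole drive has grown several-fold even as per-gigabyte cost fell, which is exactly why software (caching, tiering, distribution) rather than the spindle itself has become the interesting part of storage.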

The cloud computing model is of course here now, and Red Hat's Rangachari argues that we need to work to align storage with the virtualisation techniques and processes now so prevalently implemented. In other words: we are indeed experiencing a data explosion, but there is still a lot of work to do.

The killer change-maker here then?

I asked Rangachari where to look for the next sea-change revolution in storage and he confirmed that the storage revolution will be driven by software, rather than hardware advancements in the first instance. That being said, he does also acknowledge the work of the Open Compute Project and other working groups in this sector and insists that development and innovation at the hardware end will not get left behind.

TechTarget's own SearchStorage.com features a piece on this topic written by Rich Castagna, who also alludes to a new "storage revolution" that may be on the way.

Castagna writes, "The rumblings are unmistakable. Something's afoot with data storage, and it looks like some big changes may be looming on the not-too-distant horizon. It's not just about cool new products from brash startups that may catch the storage market behemoths napping or the latest new twists on old technologies."

Data storage will never be the same again, and soon. We're just not all quite sure how yet.

Progress Software: the "invisible" cloud is the "performant" cloud

bridgwatera

Boston-headquartered Progress Software has been through various reincarnations since the 1980s, during which time it has variously been known as a business application infrastructure software company, a data integration specialist and (if the company's marketing machine pulls it off) a competent player in the new arena of cloud computing.

Hugely sceptical as we should rightly remain of cloudwashing and the current trend to position "just about anything"-as-a-Service down some level of Internet pipe connectivity, the company's senior architect for SaaS and cloud computing strategy Mike Omerod has recently spoken of the need to defuse talk of different cloud architectures and focus on what data really needs in order to be "computed" properly.

Public Cloud, Private Cloud or Hybrid Cloud?

Omerod says that although we have these current structural options to choose from in terms of virtualised hosted cloud services, all that really matters is that an application is available to the right set of users in the most "performant" way possible.

Editorial note: Is performant an Americanism derived as some sort of portmanteau of 'performance' and 'conformant'? OK, I guess we know what he means.

Omerod says that with this determined focus on performant-ness (is that a word either?) we can move to a point where we view the cloud as a more transparent entity.

"In a perfect world, I shouldn't need to care about a physical implementation, or about whether or not it's a public or private implementation. When all's said and done, I simply want to be able to provide governance parameters around the use of my application and let the platform make the decisions about the optimum way to then serve up my app," said Omerod.

Here's the concept once again...

SCENARIO #1 -- Imagine setting governance parameters around the sensitivity of an application's data and having the platform determine that in the instance of highly sensitive data -- the app should run in a private cloud.

SCENARIO #2 -- If a key metric is "low latency performance" for users in a particular geographic region, again the platform should then determine to run the app in a data centre closest to users and this could possibly be public or private.

SCENARIO #3 -- Imagine a scenario where the platform is frequently checking the usage rates for various different public cloud vendors and "seamlessly moving the application" to the lowest rate provider, again based upon my specified criteria. This is, in real terms, Omerod's "invisible cloud" in motion.
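The three scenarios above can be sketched as a toy placement policy. Every rule, name and rate below is a hypothetical illustration of Omerod's "invisible cloud" idea, not any real platform's API: the developer supplies governance parameters, and the platform picks the target.

```python
# Toy sketch of governance-driven placement: the platform, not the
# developer, decides where the app runs. All names/rules are invented.

def place_app(sensitivity, max_latency_ms, provider_rates):
    """Pick a deployment target from simple governance rules."""
    if sensitivity == "high":
        # Scenario 1: highly sensitive data stays in the private cloud
        return "private-cloud"
    if max_latency_ms is not None and max_latency_ms < 50:
        # Scenario 2: a tight latency budget trumps cost, so run in
        # the data centre nearest the users (public or private)
        return "nearest-region-dc"
    # Scenario 3: otherwise chase the cheapest public cloud rate
    return min(provider_rates, key=provider_rates.get)

rates = {"provider-a": 0.12, "provider-b": 0.09, "provider-c": 0.11}
print(place_app("high", None, rates))   # private-cloud
print(place_app("low", 30, rates))      # nearest-region-dc
print(place_app("low", None, rates))    # provider-b (cheapest)
```

The point of the sketch is that the application code never mentions public, private or hybrid at all; only the governance parameters do.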

"We may be a little way away from being able to achieve this, but I believe that the future of cloud will be the 'transparent cloud', where I can focus on building the best business applications driving the best business value rather than having to worry about the whole public/private/hybrid cloud debate, or being locked into a particular cloud vendor that may at some future time not meet my requirements," said Omerod.

Is Progress' cloud evangelist being too fanciful here?

Do we still need to be conceptualising and theorising around the practical real-world use cases of cloud computing in this way in 2012?

The short answer is yes, or you wouldn't have read this far, right?

Steven Gerrard and cloud computing

bridgwatera

Everything happens in the cloud now doesn't it?

It appears to be an almost pre-ordained destiny today that once a piece of software has been developed, its form and function must ultimately form part of a service-based delivery option, one that essentially depends on one hosting provider or another to virtualise its existence before it ends up on users' desktops.

So how far can the cloud go into industry?

Could venue and event management be delivered via a Software-as-a-Service (SaaS) based platform with a fully integrated CRM option to keep customers happy?

Could the cloud provide enough software management capability to help Steven Gerrard's Liverpool Football Club operate its venue, track sales and enquiries... and even formulate optimal room configurations for large and small organisations?

Obviously it can.

Priava's SaaS platform is based on Oracle and the company currently has clients on its books including the Williams Formula One Team, Liverpool Football Club, Sydney Olympic Park, The Institute of Directors and The Royal Botanic Gardens Kew.

"We believe Priava will set the standard of usability, technology and functionality in the venue and events software market globally. It will also strengthen and centralise how our services are delivered to customers across the globe, including key growth markets in APAC, EMEA and Latin America, " said James Pegum, CEO, Priava. "We want Priava to become synonymous with quality solutions as we create further improvements to our cloud based venues and event management solution."

Editorial Comment: OK, so there is a whole lot of "cloudwashing" going on at the moment and it's obviously pretty easy to say that playing Sonic The Hedgehog online is basically Sonic and chums in the cloud. Priava doesn't talk too deeply about the backend engineering of its product, or about how its applications work in an intelligent way that is "contextually aware" of the kind of data needed to run an events management operation; then again, the firm clearly relies on its Oracle backend for most of the guts of what it does. This is not simply cloudwashing, but it does show that if companies are going to "cloud-label" their software services now, they need to tell the technology story clearly and upfront, not just assume that customers or industry-watchers will say "that's a great, cool product and it just happens to fit new cloud delivery models". Nice work guys, more blood and technical guts next time please.

Application virtualisation simulation calms developer frustration

bridgwatera

How many challenges do software application developers face today? It's an impossible question to answer and if anything the challenges are becoming more complex all the time.

Shridhar Mittal is general manager of the ITKO customer solutions unit at CA Technologies (the artist formerly known as Computer Associates). Mittal summarises some of the main challenges in "app" development just now under seven main headings.

Developers, says Mittal, are tasked with delivering complex applications today and presented with many real challenges such as:

1. The demand for more releases per year than ever before
2. An increased level of functionality (of all kinds) within applications
3. End-users are now expecting "Facebook" standard application experiences
4. The total number of platforms applications have to run on is increasing all the time
5. There is always the challenge of the downtime developers face whilst they wait for testing teams to have access to related systems
6. Parallel development cycles increasing demands on mainframe and other back end systems for testing
7. Cloud migration may now be impacting all of the above and providing new challenges throughout

CA's Mittal argues that "service virtualisation" can address all of these issues by decoupling the dependencies that developers face in building composite applications, including mainframes, ERP & third party systems and databases.

According to Mittal, "Service virtualisation removes the constraints that cause the usual 'Dev/Test Speed Trap', where developers are asked to create 'stubs' or 'mock-ups' for the testing teams and/or wait for testers to have all dependent applications and systems available before testing the next build. If a development shop is mocking a single binary transaction -- that is a simple enough problem to solve with a bit of code. But as soon as multiple systems talk to each other and multiple business scenarios need to be validated, the manual stub approach quickly starts to trip over its own weight, so the process of mocking and massaging test data and environments can drastically slow down delivery."

The argument here is that a virtual service environment means each development/test team can work in a simulated environment that behaves just like the real thing.

In simple terms, service virtualisation is a capability that allows testers to remove constraints from the software development lifecycle process.

It allows developers and testers to test an application on a virtual infrastructure that has been configured to imitate a real production environment.
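The core of a virtual service can be sketched in a few lines: something that replays recorded responses so the app under test never has to wait for the live dependency. The endpoints and recordings below are invented for illustration; CA's actual product works at the protocol level against real mainframe, ERP and database traffic rather than as a simple lookup.

```python
# Minimal sketch of a "virtual service": a stand-in that replays
# recorded responses in place of a live backend. Endpoint names and
# payloads here are hypothetical illustrations.

class VirtualService:
    """Replays canned responses as if it were the live dependency."""

    def __init__(self, recorded):
        self.recorded = recorded

    def call(self, endpoint, payload=None):
        if endpoint not in self.recorded:
            raise LookupError(f"no recording for {endpoint}")
        return self.recorded[endpoint]

# Responses captured (hypothetically) from the real billing system
billing = VirtualService({
    "/invoice/123": {"status": "PAID", "amount": 250.0},
    "/invoice/456": {"status": "OVERDUE", "amount": 90.0},
})

# The app under test talks to the stub exactly as it would the real
# system -- no mainframe time slot or test-environment booking needed
assert billing.call("/invoice/123")["status"] == "PAID"
print(billing.call("/invoice/456"))
```

Mittal's point is that maintaining such recordings by hand stops scaling once many systems and scenarios interact, which is what a service virtualisation product automates.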

"Service virtualisation enables testing teams to change the variables to prepare for different scenarios as well. What it doesn't do is require large upfront capital in terms of infrastructure investment to create an adequate test environment to simulate the real enterprise on which the application will run. It also doesn't force those testing the app to prioritise or compromise in terms of the three critical criteria in application development and testing: cost, quality or schedule," added Mittal.

Magic Software: HTML5 is "very susceptible" to SQL injection attacks

bridgwatera

The security of the applications we use on our mobile devices is a subject of increasing concern.

As smartphones get smarter, tablets get easier to swallow and the cloud powers all of these "slim" devices with all the back-end power that they do not naturally come with at birth due to their small form factor and scaled down dimensions -- the issue of mobile security warms up further still.

So should this be of concern to software application developers?

This is the question posed by UK MD of Magic Software David Akka, who says that the answer is most definitely yes.

"Take, for example one of the key vulnerabilities of web applications: SQL injection. A recent report highlights that SQL injection is the number one risk for web applications and HTML5 is very susceptible to these kind of attacks as, quite simply, it has not been designed with security in mind," asserts Akka.

NOTE: SQL injection is a type of security exploit in which the attacker adds Structured Query Language (SQL) code to a Web form input box to gain access to resources or make changes to data.
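The mechanics of the attack are simple to demonstrate with Python's standard-library sqlite3 module. The table, data and inputs below are invented for illustration:

```python
# SQL injection in miniature, using the standard-library sqlite3 module.
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE users (name TEXT, secret TEXT)")
db.execute("INSERT INTO users VALUES ('alice', 's3cret')")

# Attacker types this into a web form's "name" field
evil = "nobody' OR '1'='1"

# VULNERABLE: string concatenation lets the input become part of the SQL
leaked = db.execute(
    "SELECT secret FROM users WHERE name = '" + evil + "'").fetchall()
print(leaked)   # leaks alice's secret

# SAFE: a parameterised query treats the input as data, never as SQL
safe = db.execute(
    "SELECT secret FROM users WHERE name = ?", (evil,)).fetchall()
print(safe)     # no rows match the literal string
```

The fix is the same regardless of whether the front end is HTML5 or anything else: never build queries by gluing user input into SQL text.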

"Past experience should play a role here as this is the third time we are going through a security debate related to HTML. It was built and then later, when security mattered, patches were created to fix those problems; but this of course left loopholes, which are open to exploitation. As developers, we may embrace a tool for its practicality, but looking at it from a different perspective, it's not yet mature enough to provide the robust levels of security required for enterprise application development," he said.

Akka rightly points out that vulnerabilities exist everywhere, of course, and that mobility is just the new line of attack; however, it does seem to be one for which we are less prepared.

"This is also due in part to increased use of collaborative applications. Although viruses are nothing new, people are perhaps less 'virus aware' when downloading applications than they should be, and in my mind, mobile collaborative apps could be a significant threat. The solution boils down to getting the right tools for the job, which can take a huge amount of work and forethought, and right now, I believe HTML5 is not one of those tools," said Akka.

Magic's Akka believes that given the current state of uncertainty (and until HTML5 reaches full maturity) it's more sensible to use a Mobile Enterprise Application Platform (MEAP), which allows you to develop once and then deliver to multiple platforms.

"This is a far safer option where security is concerned," he said.

BlackBerry Jam preserves fruits of new GUI flavour enhancements

bridgwatera

BlackBerry's rocky road of technology evolution has continued this month with a UK developer tour event aimed at silencing a few critics, previewing the forthcoming BlackBerry 10 platform and showcasing the Cascades user interface design tool.

BlackBerry maker Research in Motion (RIM) stated that 500 developers registered to attend the BlackBerry 10 Jam World Tour event in London last week.

NOTE: BlackBerry 10 is the forthcoming BlackBerry operating system for RIM mobile handheld smartphones and tablets based on the QNX Unix-like real-time operating system for embedded systems, which was acquired by RIM in April 2010.

RIM says it has done nothing less than "transform the developer experience" with BlackBerry 10, creating a platform for developers to use the native (C/C++ with Cascades and QML development) and web (HTML5 with BlackBerry WebWorks) development environments.

Prototyping jam recipe
Each qualified developer received a BlackBerry 10 Dev Alpha device to enable them to test the apps they develop.

Each programmer will also have been pointed to the BlackBerry 10 developer toolkit for native and HTML5 software development, which has been out since May this year.

Further, programmers have been directed to play with the new Cascades user interface design tool -- a functional route (as described by RIM VP of developer relations Alec Saunders) to "graphically-rich, high-performance apps" all round.

Reports from the BlackBerry Jam itself suggest that there are some significant developments to the way "screen gestures" are interpreted with the new GUI and that there is a new "flow" about the whole user experience.

James Richardson writes on CrackBerry.com detailing the following experiences with the new GUI and OS, "The new operating system looks beautiful. I was a little unsure in advance if I would like the touch screen keyboard but after a little practice it all flows very naturally. The 'flicking' of the predicted words upwards works well as do the other keyboard gestures such as swiping left across the keyboard to delete a word, and swiping downwards to switch to the further screens that contain numbers and symbols."

Photo credit: CrackBerry.com


40:Love, David Lloyd serves Brit-born M# for .NET

bridgwatera

Any technology vendor that makes disproportionate assertions claiming to be able to "slash development times" does of course need to be approached with caution, shrewd scepticism and nervous apprehension.

But a new software tool designed by a British firm has promised to do just that and claims to already have energy services company Mitie and gym, tennis and leisure club David Lloyd on its books.

The M# programming language has been developed by Geeks Ltd in the Morden region of London, just a racket's twang away from the courts and strawberries at Wimbledon itself.

The company claims to be able to cut .NET development time by more than half with this tool, which is designed to enable .NET developers to build business applications and complex websites.

NOTE: The M# compiler transforms programs written in M# into standard ASP.NET and C# code.

"Good developers are expensive. Every day spent running a software project imposes significant cost on a business, so cutting project time without compromising on build quality is important to keep costs low and meet expectations," said Paymon Khamooshi, director at Geeks Ltd.


"Throwing manpower at it is commonly not the solution either. According to Brooks' Law, adding manpower to a late project just makes it later, so we've devised a tool that will solve this issue and also put an end to late .NET projects. M# can cut dev time and cost significantly. The typical time/cost savings are about 4x."

Based on the 4x principle, comparisons for a typical enterprise software development project are:

  • Onshore: £400,000 (typical consultancy or in-house).
  • Offshore: £280,000 plus overhead costs (onshore management, quality control) of £40,000.
  • M#: £100,000.

"Given these numbers we believe that outsourcing .NET development projects should be a thing of the past," added Khamooshi. "If we can cut development time and costs by as much as 4x then there surely has to be scope for keeping projects in Britain -- we are prepared to license the M# technology to enable UK developers to meet this challenge."

What can software application developers expect from Cloud Computing World Forum?

bridgwatera

There's a cloud computing industry exhibition/conference/symposium/jamboree (pick any one you like) on this week in London!

You don't say?

Hang on -- these things get staged all the time now and they are just a hotchpotch of industry vendors with enough money to spend on stand space, booth babes and give-away promotional packs of jellybeans, right?

There's no real news of value surely?

This is all about knocking up an intra-industry get-together and a chance to open a crate of cold Becks "biers" on the stand at 5pm to show what a cool bunch of guys the marketing evangelists are, as far as I understand it. Right?

So what's making HARD news this year?

It may be more a case of hard facts than hard news, but in terms of what that means for software application developers that may be no bad thing.

Cloud consumerisation & contextualisation

According to the Cloud Industry Forum, it is the consumerisation and contextualisation of the cloud that is changing the basis of expectation for software application developers' responsibilities in the service-based computing model.

"The real world impact of legacy applications and their regulation plus levels of customisation/integration of applications now call into check how IT services can be deployed. As the market is arguably still nascent, the number of new entrants moving into the market is driving a level of divergence as each aims to get its value proposition communicated. This noise is one of the key issues the Cloud Industry Forum is committed to providing guidance on by providing clarity on best practice for cloud service delivery," said Andy Burton, chair of the Cloud Industry Forum and CEO of Fasthosts.

So not only are consumers changing the way they use the cloud every day, we also have a "still-nascent" industry where best practices and standards are not yet set in stone.

Are software developers using cloud technologies not then caught in the middle?

"The challenge for software professionals in these 'ever-shifting' cloud-centric environments today is to look at the tools they have around them now. Companies like Rackspace are known for our branded 'Fanatical Support' function for a reason; we offer technical engineering care and tuition tailored to specific application deployment types whether they reside on private, public or hybrid clouds," said Nigel Beighton, VP of technology international at Rackspace.

"What's important for programmers in the cloud arena now is that they grasp an appreciation for what 'type' of applications (and data) are well suited to the cloud model. Yes it's all about standards now, but every solution needs to be viewed at a more granular level too," added Beighton.

For another viewpoint...

Sacha Labourey, CEO of CloudBees, has said that the cloud represents one of the largest IT paradigm shifts ever.

"From redefining the concepts of operating systems and middleware, to revolutionising the way IT services are built and consumed, the cloud is ushering in change. Platform as a Service (PaaS) provides developers with the ability to develop applications without having to maintain the underlying infrastructure. PaaS accelerates time to market. As a result, businesses are more agile and able to respond to market pressures more quickly than ever before."

Dell's VP of cloud solutions Ricky Santos shows off his media training and goes on the record to say that, "At Dell we believe that cloud isn't just a technology."

Pardon?

"It's a corporate strategy focused on business outcomes," he said.

Oh, I see, tell me more.

"Many customers and cloud vendors adopt cloud solutions but fail to get the full benefit because they operate it separately from the rest of the IT. The real benefit of the cloud comes when it's integrated with IT and leveraged across all environments. Services consultants should work to understand a customer's business and help them plan, build, deploy, manage and access clouds that meet their specific needs," said Santos.

... and finally

Karl Stevens, EMEA cloud architect at Red Hat has suggested that to ensure cloud implementations remain truly open, not only in licence, but also in governance, it is important that developers "consider users and code ahead of infrastructure" -- to keep up with "growing standards and consumer-based innovation" as he puts it.

"The cloud should not be a catalyst for lock-in, and open clouds enable the simple development of any application model on public, private and hybrid platforms. For example, Red Hat sponsors OpenShift Origin, the open source project behind its OpenShift product whereby it lets developers take advantage of the OpenShift Platform-as-a-Service (PaaS) by running their version on their own infrastructure. This means programmers can run the code behind the firewall and tweak it to work with existing applications to keep cloud as simple as possible," said Stevens.

Our cloud-based challenges then, are still manifold, multiplex and multifarious. Will we get all the answers from Cloud Computing World Forum and their list of hot topics as listed here for our consideration?

No -- it's unlikely.

Will we get free jellybeans?

Yes -- it's a certainty.

OK - I'll go.

Just how big is "big data" anyway?

bridgwatera

We hear a lot of talk concerning big data, but just how big is it and how much is data actually growing by?

For that matter, do we all agree on exactly what big data actually is right now?

Search Cloud Computing defines big data (also spelled Big Data) as a general term used to describe the voluminous amount of unstructured and semi-structured data a company creates -- although big data doesn't refer to any specific quantity, the term is often used when speaking about petabytes and exabytes of data.

Beware the big data scaremongers

You don't have to read far to find reports claiming that we currently have in excess of 10 times more data than we did three years ago, but it's important to hold fire on this kind of claim.

Beware the big data scaremongers; we don't have 10 times more data than we did three years ago.
Robin Bloor uses his Inside Analysis blog to remind us that IDC has estimated a compound annual growth rate (CAGR) of 40% for data storage, which would "suggest" that we in fact have less than three times as much data as we did three years ago.

Bloor also points to estimates from Dell, which suggest similar growth once you take into account the huge swathes of unstructured data being created in the form of videos, sound, images, text and so on.
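Bloor's arithmetic is easy to check for yourself: three years of 40% compound growth comes to well under 10x.

```python
# Three years of growth at a 40% compound annual growth rate (CAGR):
# each year multiplies the total by 1.40.
growth = 1.40 ** 3

print(f"{growth:.2f}x")   # ~2.74x -- "less than three times as much data"
```

So the IDC figure and the "10x in three years" claim cannot both be right; to hit 10x over three years you would need a CAGR of roughly 115%.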

If we can finally "get over" the big data hype, then we must start turning our attention to what we actually do with the data that we do have (however big it is), which is why you will see ANALYTICS mentioned repeatedly as the key so-what factor when it comes to big data and the IT assets it represents.

It is important at this stage to define the size and scope of big data, so that we know whether we really do have ten times more data (or a whole lot less) than we did three (or even five) years ago, and so that developers can architect applications to the correct grade in terms of robustness, resilience and overall roundness of vision.


About this Archive

This page is an archive of entries from June 2012 listed from newest to oldest.

May 2012 is the previous archive.

July 2012 is the next archive.
