IT does matter, but not as we know it

Cliff Saran

Three years after taking IT down several pegs, business pundit Nicholas Carr sees a healthy future for the industry, albeit in a less glamorous form

"IT doesn't matter," proclaimed business writer Nicholas Carr in his famous or infamous (depending on your niche in the IT industry) article for the Harvard Business Review. The questioning of the strategic value of IT to the business sent shockwaves throughout the industry and among IT directors.

Carr's analysis raised concern in boardrooms far and wide that IT might not give a company any competitive edge. His take on IT went completely against the industry's marketing message that investing in IT was the most important strategic investment a company could make - a message taken seriously by most business executives who saw themselves as forward-thinking.

Three years down the line, Carr is unrepentant, reiterating to an audience of CIOs in London his message that IT is a business utility, much like telephones and faxes, heating and lighting.

He does, however, sniff change in the air, forecasting tough times ahead for the familiar client/server approach as businesses commit to utility computing instead.

"There have been a lot of assumptions on the role of IT," Carr said. "We are at a major turning point, which will have profound effects on IT management."

He now sees IT as being at the heart of business infrastructure, and believes IT directors at larger companies will need to invest in datacentre facilities to support a utility computing approach for internal users.

Carr believes that, by using virtualisation technology, datacentres can be run as utilities. On-demand IT drastically reduces maintenance and enables servers to run more efficiently, with less downtime.

But Carr warned his London audience that moving to a utility computing model requires a cultural change for the chief information officer, and means dismantling the client/server empire the IT department has assembled in favour of standards-based computing architectures.

The questions are whether there is a place for innovative use of IT and whether the current interest in web collaboration technologies will fuel another dotcom spurt.

But if Carr is right and there is no strategic advantage in fancy IT systems, what about companies such as Dell, whose innovative real-time, IT-powered supply chain has made it the world's largest PC maker?

Carr told Computer Weekly that Dell was certainly an example of a company that had reaped a lot of rewards by being the first to automate its supply chain in a thorough way. But he questioned whether such a business advantage was still open to other companies. "It is rare to see a Wal-Mart or Dell able to sustain a distinctive technology and keep it out of competitors' hands for five years," he said.

Carr said it was still immensely difficult for IT to be truly innovative within business. "It is very, very hard for most companies to use IT as a factor that differentiates them from other companies and thus underpins superior revenue growth. The ability for competitors to copy innovation just keeps getting faster." In other words, the window of opportunity is shrinking rapidly.

Nor does he regard enterprise applications, which may once have been seen as strategic, as anything more than prerequisites for staying in business. Deploying enterprise resource planning (ERP) did not give the business a competitive edge, he said. "In order to compete, businesses have to run an ERP system."

According to Carr, companies do not think they will gain a competitive advantage by being the first with ERP. "Businesses are implementing what is a commodity capability at the lowest price and the lowest risk and as fast as possible," he said.

But he acknowledged that there are exceptions - for instance, where a company has a distinctive product or process which can be encapsulated in software, solidifying the competitive advantage.

"Goldman Sachs, for instance, has a sophisticated pricing algorithm for derivatives which can be encapsulated into software and kept secret, because it is building it on its own and the software represents its own intellectual property, so it is very hard for competitors to copy," he said.

Owning the intellectual property of software is key, according to Carr. "For stuff that is supplied by vendors, obviously the vendor wants to sell to your company and other companies." That makes it very difficult to maintain a competitive edge with commercial software.

But even when a company has seen a business benefit from custom software development in the past, Carr is not convinced that the benefit applies now.

For those who hark back to the days when any self-respecting company had armies of programmers beavering away on in-house systems, Carr doubts whether the supposedly unique business processes encoded in legacy applications were ever really that unique.

"Most companies spent a lot of time over the years developing customised software and it is just a big headache," he said. They thought they were going to be able to run their business better than competitors, "but now most of this stuff is fairly routine and all they get by maintaining it themselves and continuing to customise the code is higher costs".

Carr said that as the IT industry matures, suppliers will also have to adjust to being treated more like utility providers. "The IT industry is not growing as fast as it was in the 1990s. Companies are battling more for market share among themselves than just to pull in new growth."

Some current attitudes towards IT are fall-out from the fact that the dotcom frenzy, Y2K and the transition to the euro all happened at around the same time. All had big systems implications, which led to disappointment about the value of IT to the business, Carr said. "Ever since then, companies have focused more on getting the full potential out of the stuff they had already bought and less on buying the next big thing."

But a failure to confer competitive advantage does not, in Carr's opinion, necessarily mean all IT is the same. "There is certainly a role for cheap Wintel and Linux boxes, and for high-end boxes built for virtualisation to support high levels of throughput."

One example of the utility model, according to Carr, is Google, which has built what he describes as a massively efficient, scalable centralised plant for internet searching. "When you do a Google search, obviously it is not going on in your computer, it is supplied by this big [Google] utility."

Significantly, Carr said, Google did not have to charge for its utility service as it made money through advertising, a revenue model at odds with that of traditional utilities.

The question is whether other companies could replicate Google's success as an IT utility.

"I think so, and some will be within big companies, but not all will go down the parallel computing route of Google," he said.

One of the key challenges for Carr's grand vision of utility computing is ownership of the data and where it resides. "One of the barriers to the adoption of the utility model is concern about data security," he said.

However, users were already giving data to third-party businesses, Carr pointed out. "I have a sense that the fears about security, while real, are overplayed a bit," he said, taking as an example payroll service providers such as ADP. "You are giving ADP all your payroll data - extremely sensitive information - and nobody cares."

But broader questions on the safety of data should be raised at a governmental and national security level, he said. "We have never seen a basic commercial infrastructure that could be supplied from another country." Utility computing is not like roads or the electricity grid, which are controlled within the country in which they operate.

"You are able to run the datacentres that underpin a lot of commercial activity of the country from another country, which definitely raises a lot of issues around security and variations in law relating to data protection."

Moreover, the politicians have taken "a very laissez-faire attitude" regarding data legislation in the light of utility-based computing. "The politicians do not really understand what is going on," he said.

As for the next wave of web developments, Carr does not expect a rush of investment in start-ups. "In the first dotcom bubble every man in the street got carried away and was investing, which you do not see anymore," he said. "With certain Web 2.0 companies there is probably more venture capital being invested in them than is warranted, but it is still a fairly narrow phenomenon as far as I can see."

What is utility computing?

Utility computing is a provisioning model in which a service provider makes computing resources and infrastructure management available to customers as needed, and charges them for specific usage.
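The pay-per-use idea behind that definition can be illustrated with a minimal sketch, in which a provider meters each customer's consumption and bills for specific usage rather than fixed capacity. All names and rates here are hypothetical, not drawn from any real provider:

```python
class UtilityProvider:
    """Toy model of utility computing: customers draw on computing
    resources as needed and are charged only for what they use."""

    def __init__(self, rate_per_cpu_hour):
        self.rate = rate_per_cpu_hour
        self.usage = {}  # customer name -> cpu-hours consumed so far

    def record_usage(self, customer, cpus, hours):
        # Metered consumption: each session adds cpus * hours to the meter.
        self.usage[customer] = self.usage.get(customer, 0) + cpus * hours

    def invoice(self, customer):
        # The charge reflects actual usage, not owned infrastructure.
        return self.usage.get(customer, 0) * self.rate


provider = UtilityProvider(rate_per_cpu_hour=0.10)
provider.record_usage("acme", cpus=4, hours=10)   # short burst workload
provider.record_usage("acme", cpus=1, hours=100)  # steady background load
print(provider.invoice("acme"))  # bill for 140 metered cpu-hours
```

The contrast with the client/server model Carr describes is that the customer in this sketch never provisions for peak capacity up front; the burst and the background load are simply metered as they occur.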


