Long voyage to the land of SOA

Forty years ago, hardware was the main concern of IT specialists. Now the focus is firmly on the end-user and the strategic needs of the business. We report on the evolution of software development

In the old days, computer specialists had to know their hardware better than they knew their end-users; today it is the other way around.

Programmers still beaver away at code, but industry observers see business analysts replacing systems analysts, and service-oriented architecture (SOA) promising a future in which system components are mixed and matched to meet business needs.

The prospect of such a future has emerged thanks to an evolution of software, hardware and telecommunications that was unimaginable 40 years ago.

"We used to base our system delivery dates on two compilation turnarounds a week, and the machine time cost £1 a minute," says Stephanie Shirley, who in the 1960s was establishing the software company that became today's Xansa IT services group. "That certainly concentrated the mind on getting it bug-free first time. These days you can do compilations any time, and so quickly.

"Knowing the hardware was certainly important. When I started programming, different sorts of memory were still being tried out. These days you do not have to understand how the hardware works at all."

Others agree, remembering having to write their own basic software such as read, write and seek routines before they could add new-fangled 2Mbyte disc drives to their computers.

By the late 1960s, high-level languages, notably Cobol, were making their mark. Enthusiastic programmers dived in and produced "terrible birds' nests of programs", according to Shirley. This brought a move towards modular programming to split systems into manageable, understandable chunks.

But using the hardware as efficiently as possible was still the overriding concern of programmers. Even by the mid-1970s, discs cost £55 per megabyte, many companies measured their central mainframe memory in kilobytes, and 2,400 bits a second was considered fast data communication.

Huge advances in these and other areas - notably processor microtechnology and systems software - have removed the need for developers to be concerned about hardware, and have helped bring software development to the edge of a new era.

Yet this future could be based on the systems of the past for some time: 90% of application development spending is on maintaining existing systems, according to consultancy Northdoor.

Legacy software provider Micro Focus, which continues to make a good living from Cobol, points to studies showing that in 70% of organisations the main systems were originally written in that language.

"Many organisations accept there is no need to move away from systems that are doing a perfectly good job, just because they are 20 or 30 years old," says Butler Group research director Tim Jennings. "It is better to let them continue, and use new technologies to integrate them into new applications.

"Only a relatively small percentage of development now is new code. A lot of effort is going into integration, whether it is integrating a new application into existing systems or integrating existing systems with each other.

"The forward direction lies here: a business process-driven approach to new applications, with integration using SOA principles to combine new and legacy systems to support that business process.

"Traditional software development will still exist, but it will become more specialised and the number of people doing it will become smaller.

"Business analysts will sit with a user department to get to grips with a process, produce a process model that the department director can understand, and then work with the developers to specify which components can be provided by existing systems, what changes might be needed to those systems, and what new software needs to be developed or bought in."
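The mix-and-match approach Jennings outlines can be sketched in code: a process model names the steps, and each step is bound either to a wrapped legacy routine or to a newly developed component behind the same interface. This is an illustrative sketch only; the `ServiceRegistry` class and the service names are invented for the example, not drawn from any product mentioned here.

```python
# A minimal sketch, assuming services can be modelled as functions that
# take and return a plain record. All names here are hypothetical.

from typing import Callable, Dict, List

class ServiceRegistry:
    """Maps named business-process steps to service implementations,
    whether wrapped legacy code or newly written components."""

    def __init__(self) -> None:
        self._services: Dict[str, Callable[[dict], dict]] = {}

    def register(self, step: str, service: Callable[[dict], dict]) -> None:
        self._services[step] = service

    def run_process(self, steps: List[str], order: dict) -> dict:
        # Execute each step of the process model in sequence,
        # passing the evolving order record between services.
        for step in steps:
            order = self._services[step](order)
        return order

# An existing (legacy) routine, wrapped so it presents the same
# interface as any new component.
def legacy_credit_check(order: dict) -> dict:
    order["credit_ok"] = order.get("amount", 0) <= 1000
    return order

# A newly developed component filling a gap in the legacy estate.
def new_dispatch_service(order: dict) -> dict:
    order["dispatched"] = order.get("credit_ok", False)
    return order

registry = ServiceRegistry()
registry.register("credit_check", legacy_credit_check)
registry.register("dispatch", new_dispatch_service)

result = registry.run_process(["credit_check", "dispatch"], {"amount": 250})
```

The point of the sketch is that the process model (the list of step names) can be reworked without touching the components themselves, which is the reconfigurability the SOA proponents quoted here are describing.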

According to Jennings, "The holy grail will be for the IT department to be able to say, 'We have already got systems that can support these 20 functions in this new process; we can develop these three functions and buy in these four.'

"Packages now dominate new development and we will see the likes of SAP and Oracle decomposing their integrated suites into service components, with companies combining these with in-house development.

"Open source will gain further ground and will coexist with traditional licensing, enabling users to innovate around suppliers' products. So open source and traditional licensing will offer different benefits that suit different usage scenarios."

This could mean a return to the days before big integrated packages, says Mike Small, director of security strategy at software supplier CA.

"People used to go for the best of breed for different applications but they often did not interoperate. So SAP and others said, 'Come to us - we might not be best in every area, but it is all integrated.' In future people can go back to picking best-of-breed components and they will interoperate."

This SOA vision is shared by Alex Drobik, a European vice-president at Gartner, who sees big benefits for business and IT departments.

"The shift from hard-coded monolithic applications and all-embracing packages towards the SOA vision of software components combined and then combined again, means processes and their IT can be built around people, not the other way around," he says.

"There is also an opportunity to build and rebuild business processes quite quickly as circumstances change, without having to reinvent the wheel. Big package suppliers and others are getting into this - SAP, Oracle, IBM.

"All this means there is an excitement in the industry that we have not seen since the Y2K project and the early days of e-commerce."

This time around, however, there is less urgency and more caution, says Drobik. This is because those old systems are still working well, cost is an issue, skills need to be developed, and users need to be prepared too.

"Very few organisations have moved to a full SOA component-based software architecture that they can reconfigure at will," says Drobik.

"But it has only emerged in the past three or four years and there is no great rush. And if everyone did it tomorrow, there would not be enough skills in users or suppliers. That is why big suppliers are not forcing it along. In maybe 20 years' time everyone will be doing it, but in the next five to 10 years people will be at different stages.

"Now is the time for CIOs to look at the issues. What is the business need? Do we need to change our processes? What performance improvement will we get? What skills do we need? Can we get these components from package suppliers?

"There is also a user skills issue. If we give the marketing team advanced data mining, can they use it? If you ask companies to identify their top 10 business processes and how these might change in the next two or three years, most will have difficulty answering."

Another view of the SOA approach is that it will raise project success rates. "Software development is no longer considered a black art," says David Roberts, chief executive of IT directors' group The Corporate IT Forum.

"Its increasingly important role is bringing growing responsibility to deliver to professional, measurable standards. Being fit for purpose will replace speed to market as a prerequisite, and leaps of faith will be replaced by small, sure steps to change."

Others agree. "Engineering principles and good practice in software development that evolved in the 1970s were thrown to the wind with Unix and Windows," says Small.

"Microsoft made a good job of producing a standard system that people could use, but the focus was on function, not reliability, and they have had to fix that later.

"Unix was developed as an engineering workstation system, not as a multi-user financial transaction system. Some of its early shortcomings did not matter when it was just a desktop workstation, but they have had to be fixed in retrospect."

Professional standards are coming to the fore, and this may result in the licensing of individuals or companies. The British Computer Society has seen its membership grow by nearly 40% in just over two years. It puts this down partly to its new membership structure, but also to growing demand for better qualified specialists in the wake of high profile project failures.

The BCS is leading a professionalism in IT initiative involving other IT bodies, including E-skills UK and Intellect. Any licensing or regulation is likely to be led by the industry or specialist groups within it.

"Legislation takes a long time, and the IT industry changes quickly," says BCS president Charles Hughes. "The support for the professionalism initiative suggests that the industry should be sufficiently enthused to put in place what is necessary for self-regulation.

"This is already happening. For example, a register of health informaticians is being produced through the new UK Council of Health Informatics Professionals. This demonstrates what can be achieved by like-minded people working together to put in place a mechanism driven by the profession and its users."

Whatever the past progress and future of software development, some things never change. "Software development has been totally transformed over 40 years, but a few things have not changed at all," says Andrew White, a principal consultant at Deloitte.

"Success is still critically dependent on stable and clearly expressed requirements that are understandable and endorsed by the senior user. Bad software engineering discipline leads to more bugs and higher costs. And testing always pays off."




This was last published in October 2006
