How the web was spun

Examine the evolution of Web 1.0 into Web 2.0, and what is most striking is the continuity, rather than any revolutionary changes. The principle applies to personalities as well as technology. Vint Cerf, co-developer of TCP/IP, is now vice-president and chief internet evangelist for Google, regarded by Tim O'Reilly, who defined Web 2.0, as the archetypal Web 2.0 company.

Many characteristics of the web can be found in its forerunner, Arpanet, which was a collaboration between very able people united by a vision of an interlinked community, with no immediate profit in mind, and a firm understanding of the need for standards that were open and accessible. The best introduction to this is A Brief History of the Internet, written by several of the key figures of those pre-web days, including Cerf, Barry Leiner and Bob Kahn.

Stable, open standards that avoid needless complexity are still the building blocks of the web. One of the characteristics of Web 2.0 is the ever more creative ways in which these standards are being used.

Arpanet

The forerunner and in many respects the foundation of the internet was Arpanet, developed by the Advanced Research Projects Agency (ARPA) of the US Department of Defense. It was the first packet switching network: lines could be shared by many users, with many destinations, in contrast to circuit switching, which required the line to be dedicated to connecting two nodes for the duration of a session.
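The distinction can be illustrated with a toy sketch (not real Arpanet code, and the host names are invented): each packet carries a destination address and a sequence number, so many conversations can share one line and still be reassembled correctly at the far end.

```python
# Toy illustration of packet switching: messages from several conversations
# are chopped into addressed, numbered packets, interleaved on one shared
# line, and reassembled per destination at the far end.

def packetise(dest, message, size=4):
    """Split a message into (destination, sequence, chunk) packets."""
    return [(dest, i, message[i:i + size])
            for i in range(0, len(message), size)]

def reassemble(packets):
    """Group packets by destination, order by sequence, rejoin the text."""
    inbox = {}
    for dest, seq, chunk in packets:
        inbox.setdefault(dest, []).append((seq, chunk))
    return {dest: "".join(chunk for _, chunk in sorted(chunks))
            for dest, chunks in inbox.items()}

# Two conversations share the same line; their packets arrive interleaved.
line = packetise("hostA", "hello world") + packetise("hostB", "lo! arpanet")
line.sort(key=lambda packet: packet[1])  # crude interleaving
print(reassemble(line))
```

A circuit-switched line, by contrast, would have had to be dedicated to one of those conversations for its whole duration.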

According to A Brief History of the Internet, "The idea of open-architecture networking was first introduced by Kahn shortly after having arrived at DARPA in 1972... At the time, the program was called 'internetting'."

The Arpanet team designed a communications protocol, Network Control Program or NCP, that could be implemented in the very different host computers which were being linked. The first ever e-mail was sent over Arpanet, and the Arpanet team also developed the File Transfer Protocol (FTP).

The Arpanet developers had a radically new view of the role of computers. An Arpanet completion report draft in 1977 stated, "The ARPA theme is that the promise offered by the computer as a communication medium between people dwarfs into relative insignificance the historical beginnings of the computer as an arithmetic engine."

TCP/IP

In 1983, NCP was replaced on Arpanet by Transmission Control Protocol/Internet Protocol, TCP/IP, largely the work of Kahn and Cerf.

According to A Brief History of the Internet, "A key concept of the internet is that it was not designed for just one application, but as a general infrastructure on which new applications could be conceived. It is the general purpose nature of the service provided by TCP and IP that makes this possible."

The most widely used current version of TCP/IP, IPv4, uses 32-bit addresses and provides around four billion of them, which is considerably fewer than the number of people on the planet. And those addresses have been largely collared by the developed world, leaving no room for India and China. IPv6, adopted by the Internet Engineering Task Force in 1994, introduces 128-bit addresses, expanding the potential address pool to about 340 undecillion (3.4 x 10^38). Computers and other devices are now supplied IPv6-ready. But rolling the version out universally will cost tens of billions of pounds and is likely to be done only when networks come up for renewal.
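The address arithmetic behind those figures is easy to check; this is simply the maths of the paragraph above, not any particular implementation:

```python
# 32-bit IPv4 addresses versus 128-bit IPv6 addresses.
ipv4_addresses = 2 ** 32
ipv6_addresses = 2 ** 128

print(f"IPv4: {ipv4_addresses:,} addresses")    # roughly 4.3 billion
print(f"IPv6: {ipv6_addresses:.2e} addresses")  # roughly 3.40e+38

# Python's standard library already parses both address families.
import ipaddress
print(ipaddress.ip_address("2001:db8::1").version)  # 6
```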

The World Wide Web

In 1980, while working at Cern, the European particle physics laboratory in Geneva, Tim Berners-Lee wrote a program for storing information using random associations, which he called Enquire. This formed the conceptual basis for the global hypertext project which Berners-Lee proposed in 1989, to be known as the World Wide Web.

Frustrated by constantly needing to write programs to access and convert data on the different computers which Cern staff had brought with them, Berners-Lee came up with the idea of making each computer look like "part of some imaginary information system which everyone can read. And that became the WWW."

WorldWideWeb was also the name of the hypertext browser that Berners-Lee created, along with the initial specifications of URIs, HTTP and HTML. At first used only within Cern, these were soon being discussed and refined on the global internet.

Berners-Lee adds, "Inventing it was easy... the difficult bit was getting people to agree to all use the same sort of HTTP, and URLs, and HTML."
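What that agreement buys is visible in how little it takes to speak the shared protocol. A minimal sketch (example.org is just a placeholder host) of the same few lines of HTTP that any web server understands:

```python
# Build the text of an HTTP/1.1 GET request by hand, using only the
# agreed standards: a URL to name the resource, HTTP to ask for it.
from urllib.parse import urlparse

url = urlparse("http://example.org/index.html")
request = (
    f"GET {url.path} HTTP/1.1\r\n"
    f"Host: {url.hostname}\r\n"
    "Connection: close\r\n"
    "\r\n"
)
print(request)
```

The server's reply would be HTML, closing the loop between the three standards.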

As director of the World Wide Web Consortium (W3C), which co-ordinates web standards, getting agreement is something that Berners-Lee is still doing.

Mosaic

Mosaic was not, as often assumed, the first publicly available graphical interface web browser: that distinction goes (arguably) to Pei Wei's ViolaWWW.

But Mosaic was the first to appeal to the general user, rather than techies, and its co-developer (with Eric Bina) at the National Center for Supercomputing Applications (NCSA) was the entrepreneurial Marc Andreessen, who in the rough draft of a strategy that has since become ubiquitous, wrote, "NCSA Mosaic is free for internal use by commercial organisations, and is also available for licensing by commercial organisations for modification and/or distribution."

Andreessen left NCSA to found Netscape, hoping to capitalise on Mosaic, but NCSA licensed the code instead to a company called Spyglass. Netscape Navigator (1994) and Internet Explorer (1995) both drew heavily on Mosaic. On his personal pages, Berners-Lee writes, "When the Netscape brand appeared, people realised the difference between the general World Wide Web concept and specific software."

Work on Mosaic ceased in 1997, but you can still download it.

Wikis

Wikis - collections of pages that can be edited by anyone, at any time, from anywhere - can be found all over the web, from Wikipedia to wikis run by scripting language projects, not to mention faith groups, Star Trek fans and collaborative novelists.

The name was conceived by Ward Cunningham (now with Microsoft), from a Hawaiian word meaning quick. He developed the concept from his work with Apple's Hypercard.

Cunningham and Cunningham, a consultancy specialising in object-oriented programming, set up a wiki site in 1994 as a discussion and collaboration site for people seriously interested in patterns in software development.

Cunningham explains, "In its simplest form, a wiki is a combination of a CGI script and a bunch of plain text files". He has written FAQs for getting a wiki going.
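That description can be sketched in a few lines. This is a hypothetical miniature, not Cunningham's original code: pages live as plain text files, and CamelCase words in a page become links to other pages.

```python
# A miniature wiki in the spirit of "a CGI script and a bunch of plain
# text files": pages are stored as text files, and CamelCase words are
# rendered as hyperlinks to the pages they name.
import re
import tempfile
from pathlib import Path

pages = Path(tempfile.mkdtemp())  # stand-in for the wiki's data directory

def save(name, text):
    """Store a page as a plain text file."""
    (pages / name).write_text(text)

def render(name):
    """Read a page and turn CamelCase words into hyperlinks."""
    text = (pages / name).read_text()
    return re.sub(r"\b([A-Z][a-z]+(?:[A-Z][a-z]+)+)\b",
                  r'<a href="?page=\1">\1</a>', text)

save("FrontPage", "Welcome. See DesignPatterns for more.")
print(render("FrontPage"))
```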

There are now dozens of wiki software systems for people working with Java, .Net, Perl, Python, Ruby and other languages.

According to Cunningham: "Wiki is the original Web 2.0 application. It is the first popular application to allow a community to contribute and organise content."

Social networking

Social networking through the use of connected computers pre-dates the web: community communication was one of Arpanet's goals.

The first recognisable social networking site appeared in 1995, the same year that the Journal of Computer-Mediated Communication was founded. The journal contains papers such as Email Flaming Behaviors and Organizational Conflict, and Nicknames in a German Forum on Eating Disorders.

According to a recent JCMC paper, "What makes social network sites unique is not that they allow individuals to meet strangers, but rather that they enable users to articulate and make visible their social networks." Racking up lists of "friends" is not the point.

A lot more of this kind of research is likely to appear. The same paper points out, "The fact that participation on social network sites leaves online traces offers unprecedented opportunities for researchers."

Web 2.0

Does Web 2.0 facilitate real progress by aggregating individual insights (the wisdom of crowds), or are we witnessing digital maoism, "a loss of insight and subtlety, a disregard for the nuances of considered opinions, and an increased tendency to enshrine the official or normative beliefs of an organisation"?

There have been many attempts to explain how the free collaborative spirit that drives so many Web 2.0 projects can be harnessed by a business culture that, in its attitude to intellectual property, is stuck well before Web 1.0. The introduction to Don Tapscott and Anthony Williams' book Wikinomics, for example, says, "Companies that engage with these exploding web-enabled communities are already discovering the true dividends of collective capability and genius."

The case against Web 2.0 has been put by Andrew Keen, in his book The Cult of the Amateur, and in the Wall Street Journal, saying, "The impartiality of the authoritative, accountable expert is replaced by murkiness of the anonymous amateur."

A correction published in USA Today on 3 January 2007 stated, "A review of Wikinomics on Tuesday should have said, 'This is not another book about profitless internet start-ups.' The word 'not' was inadvertently omitted."

Rest

Rest is one of those technologies that are supposedly characteristic of Web 2.0 but which turn out to be firmly rooted in Web 1.0.

Rest, or Representational State Transfer, was first described in a 2000 doctoral dissertation by Roy Fielding, one of the authors of the W3C HTTP specification.

Contrasting the Restful approach to web services with cumbersome standards such as Soap and Web Services Definition Language (WSDL), champions claim that "rather than inventing an exhaustive list of standards, Rest uses existing internet standards, including HTTP, XML and TCP/IP".
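The contrast can be sketched with a toy dispatcher. This is a hypothetical illustration, not a real web framework: each URL path names a resource, and the ordinary HTTP verbs are the entire interface.

```python
# Restful sketch: resources are named by paths, and the standard HTTP
# verbs (PUT, GET, DELETE) do all the work. No Soap envelope, no WSDL.
store = {}

def handle(method, path, body=None):
    """Dispatch an HTTP-style request against an in-memory resource store."""
    if method == "PUT":
        store[path] = body
        return 201, body                   # Created
    if method == "GET":
        return (200, store[path]) if path in store else (404, None)
    if method == "DELETE":
        return 204, store.pop(path, None)  # No Content
    return 405, None                       # Method Not Allowed

print(handle("PUT", "/articles/42", "Rest is rooted in Web 1.0"))
print(handle("GET", "/articles/42"))
```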

Fielding says the Rest architectural style has guided the design and development of the web's architecture since 1994, and served as the framework for web standards such as HTTP and URIs.

He also says the principles behind Rest can be used to explain why some web technologies have been more widely adopted than others, even when (as with the generally derided Javascript against Java applets) the balance of developer opinion is not in their favour.

Ajax

There are a number of ways of creating rich internet applications, which provide the same kind of user experience and responsiveness as desktop applications by doing as much processing as possible in the web client. The leading proprietary approaches are Adobe's Flash and Microsoft's Silverlight.

In 2005, Jesse James Garrett of Adaptive Path published Ajax: A New Approach to Web Applications, which described how existing web standards, specifically XHTML and CSS, the Document Object Model, XML and XSLT, XMLHttpRequest and JavaScript, could allow a user to interact with an application asynchronously, independent of communication with the server. "So the user is never staring at a blank browser window and an hourglass icon, waiting around for the server to do something."

Despite criticisms from purists, Ajax (or Asynchronous Javascript and XML) is becoming ubiquitous. Google is a big user, and the Google Web Toolkit is Ajax-based. Microsoft has recently integrated Ajax into ASP.Net 3.5, with support in Visual Studio. There are dozens of other Ajax frameworks and toolkits.
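Ajax itself is browser JavaScript built on XMLHttpRequest, but the asynchronous pattern Garrett describes can be mirrored in a short Python sketch (the URL and timing here are invented): fire the request in the background, keep working, and let a callback handle the response when it arrives.

```python
# Mirror of the Ajax pattern: the "user interface" (the main thread) is
# never blocked while a slow request completes in the background.
import threading
import time

events = []

def fetch_async(url, on_done):
    """Start a request in a background thread; call on_done with the result."""
    def worker():
        time.sleep(0.1)                  # stand-in for a slow server round trip
        on_done(f"response from {url}")  # the moment the callback fires
    thread = threading.Thread(target=worker)
    thread.start()
    return thread

request = fetch_async("/data.xml", events.append)
events.append("user keeps working")      # happens before the response lands
request.join()
print(events)
```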

Foaf and the semantic web

Membership of a social networking site can involve a lot of work. And moving your allegiance to another site can mean doing that work all over again.

On a larger scale, there is a risk that centralised and proprietary sites will fragment the web into silos of data, akin to the enterprise applications that began to be broken open in the 1980s.

Foaf, or Friend of a Friend, provides a way of adding machine-readable characteristics to your personal home page, so that you can link up with people who share your interests without locking yourself in to a proprietary centralised database.

Foaf uses the W3C's Resource Description Framework (RDF) and XML. Begun in 2000 by Libby Miller and Dan Brickley, and strongly endorsed by Berners-Lee, Foaf is in many ways the prototype application of the semantic web (the W3C's common, open framework that will allow data to be shared and reused across application, enterprise, and community boundaries).
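A minimal Foaf document can be produced with nothing more than an XML library. The people and addresses below are invented for illustration; only the RDF and Foaf namespace URLs are the real ones.

```python
# Build a tiny Foaf description in RDF/XML: a person, a mailbox, and a
# machine-readable "knows" link to another person.
import xml.etree.ElementTree as ET

RDF = "http://www.w3.org/1999/02/22-rdf-syntax-ns#"
FOAF = "http://xmlns.com/foaf/0.1/"
ET.register_namespace("rdf", RDF)
ET.register_namespace("foaf", FOAF)

doc = ET.Element(f"{{{RDF}}}RDF")
person = ET.SubElement(doc, f"{{{FOAF}}}Person")
ET.SubElement(person, f"{{{FOAF}}}name").text = "Alice Example"
ET.SubElement(person, f"{{{FOAF}}}mbox",
              {f"{{{RDF}}}resource": "mailto:alice@example.org"})
knows = ET.SubElement(person, f"{{{FOAF}}}knows")
friend = ET.SubElement(knows, f"{{{FOAF}}}Person")
ET.SubElement(friend, f"{{{FOAF}}}name").text = "Bob Example"

print(ET.tostring(doc, encoding="unicode"))
```

Because the "knows" relationship is machine-readable, software can follow it from one person's description to another without any central database.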

Foaf has inevitably attracted the label Web 3.0, but in many ways, it is no more than a refinement of the ideas that have driven the World Wide Web from the outset.

