Web 2.0: What does it constitute?


Four years on from the first serious attempt to define it, there is no firm consensus about what constitutes Web 2.0. Within the past few months, it has been described by senior industry commentators as "a major trend that is building steadily" and "a bagful of old technologies under a new name".

Companies identified by analysts such as Forrester and Gartner as eager to harness Web 2.0 seem less interested in the technology itself than in the potential of "social networking", modelled on Facebook and its peers, to improve collaboration within the company and to encourage customers to volunteer information about themselves and feedback about products and services.

But other companies, facing an expected cut in IT budgets, are excited by the prospect of equipping end-users with the power to write their own applications using freely downloadable tools and components.

Inventor of the World Wide Web Tim Berners-Lee has said that some of the supposedly defining characteristics of Web 2.0, such as collaboration and user involvement, are what the web was supposed to be about all along. He has pointed out, as others have, that Amazon was incorporating user-generated content (book reviews began in 1996) and REST (Representational State Transfer) in its developments long before the term Web 2.0 came into use. So in effect, we are simply at a later stage of Web 1.0.

Meanwhile the dawn of Web 3.0 is already being proclaimed by excitable technology columnists, marketing people clutching at the next big thing, and developers who want the Web 2.0 brand buried.

In 2005, respected developers' handbook publisher Tim O'Reilly set down in detail what separated Web 1.0 from Web 2.0.

However, he had to acknowledge sufficient exceptions, Amazon among them, to suggest that something less than a full generational shift had occurred: "There are many areas of Web 2.0, where the '2.0-ness' is not something new, but rather a fuller realisation of the true potential of the web platform".

O'Reilly identified Google as "the standard bearer for Web 2.0", and pointed out the differences between it and predecessors such as Netscape, which tried to adapt for the web the business model established by Microsoft and other PC software suppliers.

Google "began its life as a native web application, never sold or packaged, but delivered as a service, with customers paying, directly or indirectly... none of the trappings of the old software industry are present." There were no scheduled software releases, "just continuous improvement" (or perpetual beta, as O'Reilly later dubbed it).

Perhaps the most important breakthrough was Google's willingness to relinquish control of the user end of the transaction, instead of trying to lock users in with proprietary technology and restrictive licensing.

O'Reilly took a second Web 2.0 principle from peer-to-peer pioneer BitTorrent, which works by completely decentralising the delivery of files, with every client also functioning as a server. The more popular a file is, the faster it can be served, since there are more users providing bandwidth and fragments of the file. Thus, "the service automatically gets better the more people use it".
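To see the principle in outline, here is a toy model, not BitTorrent's actual protocol: if every downloader also uploads, the swarm's total serving capacity grows with the number of peers, so the popular file is the fast file.

```typescript
// Toy model of the scaling O'Reilly describes, not the real BitTorrent protocol:
// every peer that joins the swarm adds upload capacity as well as demand.
interface Swarm {
  seeders: number;          // peers holding the complete file
  leechers: number;         // peers still downloading, but also uploading pieces
  perPeerUploadKBps: number;
}

// Aggregate capacity available to serve the file grows with participation,
// so "the service automatically gets better the more people use it".
function totalUploadCapacity(s: Swarm): number {
  return (s.seeders + s.leechers) * s.perPeerUploadKBps;
}

console.log(totalUploadCapacity({ seeders: 50, leechers: 950, perPeerUploadKBps: 64 })); // popular file
console.log(totalUploadCapacity({ seeders: 2, leechers: 8, perPeerUploadKBps: 64 }));    // obscure file
```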

BitTorrent also illustrates the way that customers will willingly serve themselves, and in doing so serve others. But Web 2.0 goes much further. Taking another model from open source, it treats users as "co-developers", who are actively encouraged to contribute and monitored in real time to see what they are using and how they are using it.

Taking advantage of this, instead of working to a three-year release cycle, Web 2.0 developers can come up with a feature in the morning, write it in the afternoon, put it out overnight, and wait for the response the next day to decide whether to continue with it or withdraw it. O'Reilly says scripting languages like Perl, Python and Ruby are optimised for this kind of programming.
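A hedged sketch of that working style, with invented names, might look like this: the new feature sits behind a flag, is shown to a fraction of users, and its usage is counted so the next morning's keep-or-withdraw decision rests on real data.

```typescript
// Hypothetical sketch of "perpetual beta": put the new feature behind a flag,
// expose it to a slice of users, and count usage so tomorrow's decision
// (keep or withdraw) is made on real numbers. All names are illustrative.
const ROLLOUT_FRACTION = 0.1; // show the feature to 10% of users
const usageCounts = new Map<string, number>();

function featureEnabled(userId: string): boolean {
  // Deterministic bucketing: a given user always sees the same variant.
  const hash = [...userId].reduce((h, c) => (h * 31 + c.charCodeAt(0)) >>> 0, 0);
  return hash % 100 < ROLLOUT_FRACTION * 100;
}

function recordUse(feature: string): void {
  usageCounts.set(feature, (usageCounts.get(feature) ?? 0) + 1);
}

if (featureEnabled("user-42")) {
  recordUse("new-search-panel");
}
```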

"Until Web 2.0 the learning curve to creating websites was quite high, complex, and a definite barrier to entry," says the third of our triumvirate of Tims, Tim Bray, director of Web Technologies at Sun Microsystems. "Now the new tools that are web oriented, like PHP and Rails, have no other objective than building websites, easier and faster, and the effect is that the number of potential developers is growing."

Web 2.0 takes some of its philosophical underpinning from James Surowiecki's book The Wisdom of Crowds, which asserts that the aggregated insights of large groups of diverse people can provide better answers and innovations than individual experts.

Jakob Nielsen, regarded as the foremost authority on human-computer interaction ("usability"), has no patience with the wisdom of crowds approach, when it comes to business application development. "Employees are pre-vetted: they've been hired and thus presumably have a minimum quality level. In contrast, on the Web, most people are bozos and not worth listening to".

The active contribution users make should not be over-estimated. Nielsen quotes figures from an AT&T study which found that 90% of users are "lurkers", who consume but don't produce, and 9% make occasional contributions, meaning that just 1% do most of the work.

And as AT&T researcher Will Hill observed, "unfortunately, those people who have nothing better to do than post on the internet all day long are rarely the ones who have the most insights."

In practice, even fewer than 1% of people may be making a useful contribution - but these may be the most energetic and able members of a very large community. In 2006, 1,000 people - just 0.003% of its users - contributed around two-thirds of Wikipedia's edits.

Nielsen's December 2007 Alertbox is a long and intemperate attack on the dangers he believes Web 2.0 presents for good interface design, good development, and business integrity. Nielsen is also notable for earlier attacks on Ajax (Asynchronous JavaScript and XML), one of the pillars of Web 2.0 development.

Ajax speeds up response times by enabling just part of a page to be updated, instead of downloading a whole new page. Nielsen's objections include that this breaks the "back" button - the ability to return to where you have just been - which he says is the second most used feature in web navigation.
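A stripped-down illustration of the technique (not any particular site's code) shows both the appeal and Nielsen's complaint: the page updates in place, and precisely because no new page is loaded, no history entry is created.

```typescript
// Minimal Ajax pattern: fetch a fragment and replace one region of the page
// instead of loading a whole new page. Because only this element changes,
// the browser records no new history entry, so the back button returns to
// the previous page rather than the previous state.
function refreshPanel(url: string): void {
  const xhr = new XMLHttpRequest();
  xhr.open("GET", url);
  xhr.onload = () => {
    const panel = document.getElementById("results");
    if (panel) panel.innerHTML = xhr.responseText;
  };
  xhr.send();
}

refreshPanel("/search?q=web+2.0&page=2"); // URL is illustrative
```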

Ajax is one of the technologies used to create Rich Internet Applications - closer to the desktop experience than the traditional web - which are characteristic of Web 2.0. Others include Adobe Flash and Microsoft Silverlight. Whatever Nielsen's criticisms (and he is even harder on Flash), Ajax is at least based on international standards. JavaScript/JScript is maintained by Ecma International (formerly the European Computer Manufacturers Association); XML, HTML and XHTML, the XMLHttpRequest API, and Cascading Style Sheets (CSS, used to separate presentation from content) are maintained by the W3C.

"Everybody who has a Web browser has got that platform," says Berners-Lee, in a podcast available on IBM's developerWorks site. "So the nice thing about it is when you do code up an Ajax implementation, other people can take it and play with it."

REST, increasingly used in web service development in preference to the more cumbersome SOAP, is frequently cited as another characteristic of Web 2.0. First defined in 2000, it uses HTTP (Hypertext Transfer Protocol) to interface with resources identified by their URIs. For Tim Bray, "integration is probably the number one problem we face, and REST is probably the solution."
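The idea is small enough to show in a few lines. In this sketch, against a hypothetical URI, the resource is addressed directly and manipulated with ordinary HTTP verbs rather than wrapped in a SOAP envelope.

```typescript
// REST in miniature: the book is a resource identified by its URI, and plain
// HTTP verbs (GET, PUT, DELETE) operate on its representation - no SOAP
// envelope, no separate operation vocabulary. The URI and fields are hypothetical.
async function getBook(id: string) {
  const response = await fetch(`https://api.example.com/books/${id}`, {
    headers: { Accept: "application/json" },
  });
  if (!response.ok) throw new Error(`HTTP ${response.status}`);
  return response.json(); // e.g. { id: "42", title: "...", author: "..." }
}

getBook("42").then(book => console.log(book.title));
```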

But REST is far from new: it has been pointed out that the World Wide Web itself is just one very large RESTful application.

So it is easy to sympathise with Berners-Lee when he says (in the same podcast), that Web 2.0 means "building stuff using the Web standards which have been produced by all these people working on Web 1.0, plus Javascript of course. Web 2.0 for some people means moving some of the thinking client-side, so making it more immediate. But the idea of the Web as interaction between people is really what it was designed to be."

Which has not stopped his W3C colleagues taking to the road with a hectic schedule of presentations in the early months of this year, all arguing that Web 2.0 is a step on the way to the Semantic Web, a long-standing W3C initiative to create a standards-based framework able to understand the links between data which is related in the real world, and follow that data wherever it resides, regardless of application and database boundaries.

They point out that Web 2.0 applications combine all kinds of data from all kinds of places - so-called mashing up. Unfortunately, W3C member Steven Pemberton says, "by putting a lot of work into a website, you commit yourself to it, and lock yourself into their data formats too. This is similar to data lock-in when you use a proprietary program. Moving comes at great cost. Try installing a new server, or different Wiki software."

The problem with Web 2.0, Pemberton says, is that it "partitions the web into a number of topical sub-webs, and locks you in, thereby reducing the value of the network as a whole."

How do you decide which social networking site to join? he asks. "Do you join several and repeat the work?" With the Semantic Web's Resource Description Framework (RDF), you won't need to sign up to separate networks, and can keep ownership of your data. "You could describe it as a CSS for meaning: it allows you to add a small layer of markup to your page that adds machine-readable semantics."
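Pemberton's "CSS for meaning" can be pictured with RDFa, one W3C syntax for embedding RDF in ordinary pages; the FOAF vocabulary below is real, but the profile itself is invented for illustration.

```typescript
// A profile page annotated with RDFa: the same markup humans read carries
// machine-readable statements that any site or tool can extract, so the data
// can follow the person instead of being locked into one network.
const profileMarkup = `
  <div xmlns:foaf="http://xmlns.com/foaf/0.1/" about="#me" typeof="foaf:Person">
    <span property="foaf:name">Alice Example</span>
    knows
    <a rel="foaf:knows" href="https://example.org/bob#me">Bob</a>
  </div>`;

// An RDFa-aware parser would read the triples:
//   <#me> foaf:name "Alice Example" ; foaf:knows <https://example.org/bob#me> .
console.log(profileMarkup);
```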

The problems with Web 2.0 lock-in which Pemberton describes were illustrated when a prominent member of the active 1%, Robert Scoble, ran a script from the contacts service Plaxo to try to extract details of his 5,000 contacts from Facebook, in breach of the site's terms of use, and had his account disabled. Although the account has apparently been reinstated, the furore has made the issue of Web 2.0 data ownership and portability fiercely topical.

These issues, along with privacy, are being tackled by Dataportability.org, which aims to put existing standards "in context of each other so that consumers, suppliers and developers can more easily understand and implement them as an end-to-end data portability solution".

Facebook has opened itself up to the extent of supplying APIs, which allow external developers to provide applications for its users. But when Google announced its OpenSocial set of APIs, which will enable developers to create portable applications and bridges between social networking websites, Facebook was not among those taking part. Four years after O'Reilly attempted to define Web 2.0, Google, it seems, remains the standard-bearer, while others are forgetting what it was supposed to be about.


This was first published in February 2008
