As early as Pei Wei's Viola browser in 1992, the web was being used to deliver "applets" and other kinds of active content within the web browser. Java's introduction in 1995 was framed around the delivery of such applets. JavaScript and then DHTML were introduced as lightweight ways to provide client-side programmability and richer user experiences. Several years ago, Macromedia coined the term "Rich Internet Applications" (which has also been picked up by open source Flash competitor Laszlo Systems) to highlight the capabilities of Flash to deliver not just multimedia content but also GUI-style application experiences.
However, the potential of the web to deliver full-scale applications didn't hit the mainstream until Google introduced Gmail, quickly followed by Google Maps: web-based applications with rich user interfaces and PC-equivalent interactivity. The collection of technologies used by Google was christened AJAX in a seminal essay by Jesse James Garrett of web design firm Adaptive Path. He wrote:
"Ajax isn't a technology. It's really several technologies, each flourishing in its own right, coming together in powerful new ways. Ajax incorporates:
standards-based presentation using XHTML and CSS;
dynamic display and interaction using the Document Object Model;
data interchange and manipulation using XML and XSLT;
asynchronous data retrieval using XMLHttpRequest;
and JavaScript binding everything together."
In his book, A Pattern Language, Christopher Alexander prescribes a format for the concise description of the solution to architectural problems. He writes: "Each pattern describes a problem that occurs over and over again in our environment, and then describes the core of the solution to that problem, in such a way that you can use this solution a million times over, without ever doing it the same way twice."
The Long Tail
Small sites make up the bulk of the internet's content; narrow niches make up the bulk of the internet's possible applications. Therefore: Leverage customer self-service and algorithmic data management to reach out to the entire web, to the edges and not just the center, to the long tail and not just the head.
Data is the Next Intel Inside
Applications are increasingly data-driven. Therefore: For competitive advantage, seek to own a unique, hard-to-recreate source of data.
Users Add Value
The key to competitive advantage in internet applications is the extent to which users add their own data to that which you provide. Therefore: Don't restrict your "architecture of participation" to software development. Involve your users both implicitly and explicitly in adding value to your application.
Network Effects by Default
Only a small percentage of users will go to the trouble of adding value to your application. Therefore: Set inclusive defaults for aggregating user data as a side-effect of their use of the application.
Some Rights Reserved
Intellectual property protection limits re-use and prevents experimentation. Therefore: When benefits come from collective adoption, not private restriction, make sure that barriers to adoption are low. Follow existing standards, and use licenses with as few restrictions as possible. Design for "hackability" and "remixability."
The Perpetual Beta
When devices and programs are connected to the internet, applications are no longer software artifacts, they are ongoing services. Therefore: Don't package up new features into monolithic releases, but instead add them on a regular basis as part of the normal user experience. Engage your users as real-time testers, and instrument the service so that you know how people use the new features.
Cooperate, Don't Control
Web 2.0 applications are built of a network of cooperating data services. Therefore: Offer web services interfaces and content syndication, and re-use the data services of others. Support lightweight programming models that allow for loosely-coupled systems.
Software Above the Level of a Single Device
The PC is no longer the only access device for internet applications, and applications that are limited to a single device are less valuable than those that are connected. Therefore: Design your application from the get-go to integrate services across handheld devices, PCs, and internet servers.
AJAX is also a key component of Web 2.0 applications such as Flickr, now part of Yahoo!, 37signals' applications Basecamp and Backpack, as well as other Google applications such as Gmail and Orkut. We're entering an unprecedented period of user interface innovation, as web developers are finally able to build web applications as rich as local PC-based applications.
Interestingly, many of the capabilities now being explored have been around for many years. In the late '90s, both Microsoft and Netscape had a vision of the kind of capabilities that are now finally being realized, but their battle over the standards to be used made cross-browser applications difficult. It was only when Microsoft definitively won the browser wars, and there was a single de-facto browser standard to write to, that this kind of application became possible. And while Firefox has reintroduced competition to the browser market, at least so far we haven't seen the destructive competition over web standards that held back progress in the '90s.
We expect to see many new web applications over the next few years, both truly novel applications, and rich web reimplementations of PC applications. Every platform change to date has also created opportunities for a leadership change in the dominant applications of the previous platform.
Gmail has already provided some interesting innovations in email, combining the strengths of the web (accessible from anywhere, deep database competencies, searchability) with user interfaces that approach PC interfaces in usability. Meanwhile, other mail clients on the PC platform are nibbling away at the problem from the other end, adding IM and presence capabilities. How far are we from an integrated communications client combining the best of email, IM, and the cell phone, using VoIP to add voice capabilities to the rich capabilities of web applications? The race is on.
It's easy to see how Web 2.0 will also remake the address book. A Web 2.0-style address book would treat the local address book on the PC or phone merely as a cache of the contacts you've explicitly asked the system to remember. Meanwhile, a web-based synchronization agent, Gmail-style, would remember every message sent or received, every email address and every phone number used, and build social networking heuristics to decide which ones to offer up as alternatives when an answer wasn't found in the local cache. Lacking an answer there, the system would query the broader social network.
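The cascading lookup described above can be sketched in a few lines. This is a hypothetical illustration only; the class and field names are invented, not drawn from any real product:

```python
# A minimal sketch of a Web 2.0-style address book: the device's
# contact list is treated as a mere cache, with fallbacks to a
# web-based synchronization agent's history and, finally, the
# broader social network. All names here are hypothetical.

class Web2AddressBook:
    def __init__(self, local_cache, sync_history, social_network):
        self.local_cache = local_cache        # contacts explicitly saved
        self.sync_history = sync_history      # every address ever sent/received
        self.social_network = social_network  # friends-of-friends data

    def lookup(self, name):
        # 1. The local address book is just a cache.
        if name in self.local_cache:
            return self.local_cache[name]
        # 2. Fall back to addresses harvested from past messages.
        if name in self.sync_history:
            return self.sync_history[name]
        # 3. Lacking an answer there, query the broader social network.
        return self.social_network.get(name)
```

The design choice this sketch captures is that no single tier is authoritative: the richer, server-side tiers quietly backstop the sparse local one.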
A Web 2.0 word processor would support wiki-style collaborative editing, not just standalone documents. But it would also support the rich formatting we've come to expect in PC-based word processors. Writely is a good example of such an application, although it hasn't yet gained wide traction.
Nor will the Web 2.0 revolution be limited to PC applications. Salesforce.com demonstrates how the web can be used to deliver software as a service, in enterprise scale applications such as CRM.
The competitive opportunity for new entrants is to fully embrace the potential of Web 2.0. Companies that succeed will create applications that learn from their users, using an architecture of participation to build a commanding advantage not just in the software interface, but in the richness of the shared data.
Core Competencies of Web 2.0 Companies
In exploring the seven principles above, we've highlighted some of the principal features of Web 2.0. Each of the examples we've explored demonstrates one or more of those key principles, but may miss others. Let's close, therefore, by summarizing what we believe to be the core competencies of Web 2.0 companies:
Services, not packaged software, with cost-effective scalability
Control over unique, hard-to-recreate data sources that get richer as more people use them
Trusting users as co-developers
Harnessing collective intelligence
Leveraging the long tail through customer self-service
Software above the level of a single device
Lightweight user interfaces, development models, AND business models
The next time a company claims that it's "Web 2.0," test its features against the list above. The more points it scores, the more it is worthy of the name. Remember, though, that excellence in one area may be more telling than small steps in all seven.
Tim O'Reilly
O'Reilly Media, Inc., tim@oreilly.com
President and CEO
Questions or comments for Tim about this article?
Excellent Matter on Web 2.0
2006-02-01 00:31:11
EldoItteera
I felt this is excellent material on Web 2.0, especially in the scenario where most products are consumed as services rather than applications or components. It's true that Google has added a lot of new things to Web 2.0. Let's wait and watch for what comes next!
Eldo Itteera
Web Services Center of Excellence
Infosys Technologies Ltd.
Value of software
2006-01-15 05:37:03
Teun
I took the liberty of taking one of Tim's statements from the paragraph 'the web as a platform' and rewriting it to: "The value of the software is proportional to the scale and dynamism of the data it helps to manage, and to the richness of the user experience and the amount of satisfaction it offers."
I think this 'definition' almost covers the whole Web 2.0 concept. Please comment.
Regards, Teun
Web 2.0 Article By Tim O'Reilly
2006-01-08 09:52:47
RFHJ
Slammin' article, Tim! Insightful, lucid, and well cross-referenced to pertinent, supporting articles.
I'm on the "guru" team at my shop and will forward the URL to my CIO and walk up the hall to tell him in person that this is a presentation he NEEDS to see.
Great article! Need a similar one for comparing Web 1.0 vs 2.0 business models
2006-01-03 23:30:32
Usha_K
I've spent the last 10 years in the Enterprise Infrastructure (read J2EE, ESBs, BPM) world servicing classic large-scale IT enterprises. I found your article on Web 2.0 quite refreshing and lucid, especially in understanding how our traditional software building, bundling and collaboration paradigms compare to the emerging next-generation software paradigms.
What is additionally interesting for me, however, is to know how the business models compare for Web 1.0 versus Web 2.0 - with respect to the past failures and successes, and emerging next-gen business models.
In one of your sidebars you mention Christopher Alexander's A Pattern Language. I'd like to add that A Pattern Language was published almost 30 years ago, and Christopher Alexander has continued to do interesting work in that time. He has recently published a four-part series entitled The Nature of Order in which he argues for, and presents concepts relevant to, a more organic design process -- something very much in tune with Web 2.0.
A recent post of mine inspired by his and others' work in the area:
just a contribution
2005-12-13 17:43:37
GiselaGiardino
Hi Tim,
I am happy to have come to this article, as I needed to hear from *you* what the concept 'Web 2.0' means. It is clear enough to me, though I might dispute some minor details. But that's of no importance. I just want to contribute one variable that I consider crucial to understanding the whole phenomenon, and unless I have overlooked it (possible), it is not mentioned in the article. What made possible the transition from Web 1.0 to Web 2.0? The changes, widening, and improvements in ISPs and Internet connections.
It looks like old-school companies and software makers were blind to what looks so obvious today under the lens of Web 2.0 builders, namely the concept of the Internet as a community. Years ago, and not too many, this was, I think, envisioned, but the connectivity of the net couldn't yet make it possible.
So software companies and incoming Internet services companies (say, Netscape) were driven to build applications intended for users who were connected to the internet for only a couple of hours a day, on poor dial-up.
The advantages of high-speed connectivity through broadband, wi-fi, and other connections made real-time Internet usage possible: phone calls, chat, downloads, uploads, role-playing games, online creation, the blogosphere, P2P sharing, domestic computers as servers... and several other possibilities you can think of. *There* is where Web 2.0, I believe, was born as a tangible reality.
Google's APIs, Yahoo's, P2P, Napster, BitTorrent, Skype... all of them are based on the high-speed connectivity of the critical mass of users. You actually mention that BitTorrent feeds itself from the share each new user brings.
Maybe this is not treated seriously as a variable for the business, but I think it is a subtle yet concrete booster of all this massive change.
Ok, enough. I would go further, but I know you understand my point. I just wanted to contribute this, which I think is important to understanding why Web 1.0 companies had one business model (based on treating each computer as a standalone entity), and why the transition to Web 2.0 happens when companies *can* treat computers as nodes of a network, and can expect them connected 24/7.
The difference between a sum of elements and a community of them is the level of connection between them. We are highly interconnected now => we are working on Web 2.0. I *love* that. I am a humble missionary of this movement, too. Thank you, Tim.
Gisela Yer Alieness |-)
What is Web 2.0
2005-10-26 05:14:29
perfected
Hi Tim, as I mentioned on radar, I do not believe that Web 2.0 is about user interfaces, but rather making it easier for applications to understand other applications.
For this reason, I don't believe that Flickr is a Web 2.0 application. A Web 2.0 application is one that does a small task and does it well, and is then re-used (a bit like using standard libraries in development). A great example of this is Salesforce's AppExchange.
I describe more of my point of view as well as examples in my post What Does Web 2.0 Mean for Business (http://www.nik.com.au/archives/2005/10/26/what-web-20-means-for-business/)
What is Web 2.0
2005-11-07 06:19:28
minoopy
Response to this comment and "Knowledge as the next 'Next Intel Inside'" from Tim Finin.
As described in this comment, Web 2.0 should make applications understand each other. Is the Web developing too fast at the current stage? Borrowing from Tim Finin's comment "Knowledge as the next 'Next Intel Inside'": if RDF data standards were adopted on top of XML/XSLT in the Web 2.0 era, semantics between applications could be built up, and only then could real "understanding" between applications be achieved.
This is purely my personal view point. Welcome all kinds of comments!
What is Web 2.0
2005-10-26 17:29:41
timoreilly
Maybe you don't know, then, about the Flickr API, which makes it easy to reuse the Flickr database in other applications. That's part of what has made it a Web 2.0 poster child, with lots of innovation and re-use. Google, for instance, for the Flickr color picker...
Japanese Localize
2005-10-16 07:26:52
huehara88
Please permit my Japanese localization of this Article.
If not, let me know.
FolkMind – a killer app for the Web 2.0 era
2005-10-15 13:33:08
GeorgeChiramattel
Hi,
First of all let me congratulate you on this beautiful article.
I would also like to add to this discussion.
If the Internet represents the 'collective intelligence of humanity' then in my opinion we require better tooling to utilize it. I wouldn't expect the 'virtual brain of humanity' to come with a 'search box' as its primary interface :-)
At the following URL, I have described how we can build a better tool to handle the huge volume of information that is getting published on the net. I call this tool FolkMind.
Interesting Mathematical Definition of Web 2.0
2005-10-13 09:50:37
Owen
In our complexity group, Friam.org, we've been discussing whether or not there is an emergent property to Web 2.0. One contender is Reed's Law
http://www.reed.com/dprframeweb/dprframe.asp?section=gfn
Basically, a component of Web 2.0 is a migration from Metcalfe's Law to Reed's Law. Metcalfe's Law states that the value of a network varies as the number of pair-wise connections between nodes (the complete graph of the nodes). This varies as n^2.
Reed's Law states that the value of a network varies as the number of subgroups within that network. This varies as 2^n, a much, much larger number.
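The contrast can be made concrete with a quick numerical sketch (an editorial illustration, not from the original comment, using the standard counts: n(n-1)/2 pair-wise connections, and 2^n - n - 1 non-trivial subgroups):

```python
# Compare Metcalfe's Law (pair-wise connections, ~n^2) with
# Reed's Law (group-forming subsets, ~2^n) for a network of n nodes.

def metcalfe_value(n):
    # Number of pair-wise connections in a complete graph of n nodes.
    return n * (n - 1) // 2

def reed_value(n):
    # Number of possible non-trivial subgroups of n members:
    # all 2^n subsets, minus the empty set and the n singletons.
    return 2 ** n - n - 1

for n in (10, 20, 30):
    print(n, metcalfe_value(n), reed_value(n))
```

Even at n = 30, the subgroup count is already several orders of magnitude beyond the pair count, which is the "sneaky exponential" the comment refers to.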
This transition is occurring due to the migration of the web from a publishing technology to a community.
A good article on the idea is "That Sneaky Exponential—Beyond Metcalfe's Law to the Power of Community Building"
http://www.reed.com/Papers/GFN/reedslaw.html
Owen
Web 2.0 v. Web 1.0
2005-10-13 01:41:51
joelcere
The evolution to Web 2.0, for lack of a better term, is about attitude and expectation. Whether it was technology that led to a change of attitude, or a shift in our relation to the web that led to new technology, is an academic debate which I will leave to the more technically endowed.
In the 90s, the web was driven by companies seeking to turn it into a giant shopping mall. Consumers are now reclaiming the web for what it was intended for: a collective space bringing people together so that they could share experience and information. Just picture this: a collection of mega websites competing to attract eyeballs v. loose networks accessible by search engines, tags and connections where you can share information, engage in conversations and co-create. I am caricaturing here but the change is quite noticeable...
This is how I understand it: Web 2.0. is a different way of looking at the web.
Joel
http://beyondpr.blogspot.com
Napster - The inside story
2005-10-04 11:23:18
DonDodge
The original Napster was a Web 2.0-style company, back in the 1.0 world. We were too far ahead of the curve (business and legal) to make it a successful business. I just did a post, "Napster - the inside story," that gives an insider's view of what we were trying to do and what went wrong. You can see it here:
http://dondodge.typepad.com/the_next_big_thing/2005/10/napster_the_ins.html
Business Applications
2005-10-04 08:28:32
DemianE
What's missing in the article is a discussion of true apps on the web. Where does Salesforce fit in? NetSuite? Oracle OnDemand? Employease? They are not "consumer sites" like Yahoo and Google and Wikipedia - they are not advertising-driven... could they be? How do the economics work when you have 50,000 users instead of 50,000,000?
I must admit I've been wondering about the "free software" mindset that is funded by other means, but I have yet to have an aha moment. Perhaps there is a new model that will fly...
Demian Entrekin
http://www.projectarena.com
Also check out: Putting some meat on the Web 2.0 bones.
2005-10-03 23:31:04
hypermark
For what it's worth, I have written a few posts that attempt to make sense of the WHAT, WHY and HOW of Web 2.0, the most recent of which is called, "Putting some meat on the Web 2.0 bones." If interested in such things, check it out:
I contend that it's more that these things came to prominence at the same time (i.e. zeitgeist) rather than some well-defined, technical commonality forming this group of terms.
Similarly, I look at the big bunch of stuff that Tim agglomerates into the topic "Web 2.0" and see more zeitgeist than sharp concept delineation. So maybe Web 2.0 is just Web 2005? If so, then I look forward to Web 2.1.3.12.
Here is an example of what the Ajax concept may bring in the area of catalogs, in other words large-database browsing applications:
http://www.abaqueinside.com/IntuiCatAjaxDemoVerif.asp
(currently in French, but fairly intuitive)
Knowledge as the next "Next Intel Inside"
2005-10-02 15:37:47
finin
While the use of RDF is not part of the current Web 2.0 model, I'm hopeful that it will develop a key role, especially for web applications that want to flexibly import and export data and knowledge to other applications on the web. Since the current model makes use of data interchange and manipulation using XML and XSLT, and asynchronous data retrieval via XMLHttpRequest, the pathway is there for the data to be expressed in RDF's XML encoding. -- Tim Finin, http://ebiquity.umbc.edu/
Missing Link? and a Great ThankYou
2005-10-02 06:08:44
RandomProp
Wonderful article. Very clarifying, even for a novice such as myself. Thank you so much.
Question: Is there a missing link at bottom of p.2, section 2, last bullet on peer-production where you write "There are more than 100,000 open source software projects listed on (???)."?
Where (???) are these software projects listed? Thanks again for an illuminating article.
Missing Link? and a Great ThankYou
2005-10-02 08:58:03
timoreilly
Oops. Sourceforge.net
Tim O'Reilly Sets the Record Straight on Web 2.0
2005-10-02 01:25:55
errorter
As simple as possible and no simpler
2005-10-01 02:02:03
jbond
One key factor in all this is APIs and data formats that are as simple as possible and no simpler. That is:
- REST, not XML-RPC; XML-RPC, not SOAP; SOAP, not WS-*
- RSS, not custom XML schemas; simple XML schemas, not obfuscated RDF
http://ww.voidstar.com
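The "simple formats win" point can be illustrated with a short sketch (an editorial example; the sample feed is invented): a plain RSS 2.0 document can be consumed with nothing but a standard library XML parser, no SOAP toolkit or RDF triple store required.

```python
# Parse a minimal RSS 2.0 feed using only Python's standard library,
# illustrating how far a deliberately simple format gets you.
import xml.etree.ElementTree as ET

RSS_SAMPLE = """<?xml version="1.0"?>
<rss version="2.0">
  <channel>
    <title>Example Feed</title>
    <item><title>First post</title><link>http://example.com/1</link></item>
    <item><title>Second post</title><link>http://example.com/2</link></item>
  </channel>
</rss>"""

def read_feed(xml_text):
    # Return (title, link) pairs for every item in the feed.
    root = ET.fromstring(xml_text)
    return [(item.findtext("title"), item.findtext("link"))
            for item in root.iter("item")]

print(read_feed(RSS_SAMPLE))
```

Because the schema is shallow and predictable, any consumer in any language can extract the items in a handful of lines, which is exactly the low barrier to adoption the comment argues for.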
As simple as possible and no simpler
2005-10-01 10:10:14
timoreilly
Julian -- I completely agree. I wrote "lightweight programming models" but I should have used your formulation. It's central to the success of the internet as a whole.
I remember bringing Fred Baker, the chair of the IETF, to the second open source summit and asking him what advice he could give to the open source community. It was essentially what you said above: standardize as little as possible, just enough to ensure interoperability.