Summer has gone by so quickly. What happened to June? I've been culling readings from all over, aggregating the best definitions of
Web 2.0. Notice there are a lot:
twenty-five in all. I tried making sense of everything, even arranging and shuffling for a catchy acronym (think
ROY G. BIV). I challenge all librarians and other information professionals interested in
Web 2.0 to do the same: find a catchy acronym and share it with us all. I will share
my own in one month's time.
(1) Social Networks - The content of a site should comprise user-provided information that attracts members of an ever-expanding network. (example:
Facebook)
(2) Wisdom of Crowds - Group judgments are surprisingly accurate, and the aggregation of input is facilitated by the ready availability of social networking sites. (example:
Wikipedia)
(3) Loosely Coupled APIs - Short for "Application Programming Interface," an API provides a set of instructions (messages) that a programmer can use to communicate between applications, thus allowing programmers to incorporate (code) one piece of software directly into another. (example:
Google Maps)
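To make the idea concrete, here is a minimal sketch of loose coupling, not any real service's API: the caller and the service exchange only JSON messages, so neither depends on the other's internals. The service name, the place names, and the coordinates are all hypothetical.

```python
import json

# A "loosely coupled" interface: the service only understands JSON
# messages, never the caller's internal objects (hypothetical example).
def geocode_service(request_json: str) -> str:
    request = json.loads(request_json)
    # A toy lookup table standing in for a real geocoding backend.
    known = {"Calgary": {"lat": 51.05, "lng": -114.07}}
    place = request.get("place")
    return json.dumps({"place": place, "location": known.get(place)})

# Any program, in any language, can use the service by exchanging messages.
reply = json.loads(geocode_service(json.dumps({"place": "Calgary"})))
print(reply["location"])  # {'lat': 51.05, 'lng': -114.07}
```

Because the only contract is the message format, either side can be rewritten, or reused in a mashup, without the other noticing.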
(4) Mashups - They are combinations of APIs and data that result in new information resources and services. (example:
Calgary Mapped)
(5) Permanent Betas - The idea is that no software is ever truly complete so long as the user community is still commenting upon it, and thus, improving it. (example:
Google Labs)
(6) Software Gets Better the More People Use It - Because all social networking sites seek to capitalize on user input, the true value of each site is defined by the number of people it can bring together. (example:
Windows Live Messenger)
(7) Folksonomies - A folksonomy is a classification system created in a bottom-up fashion with no central coordination. Entirely different from traditional classification schemes such as the Dewey Decimal or Library of Congress Classifications, folksonomies allow any user to "social tag" an object with whatever phrase they deem appropriate. (example:
Flickr and
YouTube)
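A folksonomy is simple enough to sketch in a few lines. This is a toy model, with made-up users, photo IDs, and tags: anyone attaches any tag to any object, with no controlled vocabulary, and browsing by tag emerges from the aggregate.

```python
from collections import defaultdict

# tag -> set of object ids; no central coordination, any user may tag anything
tags = defaultdict(set)

def social_tag(user, obj, tag):
    tags[tag.lower()].add(obj)   # normalize case, nothing more

social_tag("alice", "photo_42", "sunset")
social_tag("bob",   "photo_42", "Calgary")
social_tag("carol", "photo_7",  "sunset")

# Browsing by tag: the bottom-up equivalent of a catalogue search.
print(sorted(tags["sunset"]))    # ['photo_42', 'photo_7']
```

Contrast this with Dewey or LC: there is no authority file and no hierarchy, only whatever vocabulary the users converge on.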
(8) Individual Production and User Generated Content - Free
social software tools such as blogs and wikis have lowered the barrier to entry, following in the footsteps of the 1980s self-publishing revolution sparked by the advent of the office laser printer and desktop publishing software. In the world of Web 2.0, with a few clicks of the mouse, a user can upload videos or photos from a digital camera into their own media space, tag them with keywords, and make the content available to everyone in the world.
(9) Harness the Power of the Crowd - Harnessing not the "intellectual" power, but the power of the "
wisdom of the crowds," "
crowd-sourcing" and "
folksonomies."
(10) Data on an Epic Scale - Google has a total database measured in hundreds of petabytes (a petabyte is a million billion bytes), swelled each day by terabytes of new information. Much of this is collected indirectly from users and aggregated as a side effect of the ordinary use of major Internet services and applications such as Google, Amazon, and eBay. In a sense, these services are 'learning' every time they are used, mining and sifting data to provide better services.
(11) Architecture of Participation - Through the use of the application or service, the service itself gets better. Simply put, the more you use it - and the more other people use it - the better it gets. Web 2.0 technologies are designed to take user interactions and use them to improve the service itself. (e.g.
Google search).
(12) Network Effects - This is a general economic term often used to describe the increase in value, to the existing users of a service in which there is some form of interaction with others, as more and more people start to use it. As the Internet is, at heart, a telecommunications network, it is subject to the network effect. In Web 2.0, new software services are being made available which, due to their social nature, rely a great deal on the network effect for their adoption.
eBay is one example of how the application of this concept works so successfully.
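One common way to formalize the network effect (not mentioned in the definitions above, but a standard rule of thumb) is Metcalfe's law: the value of a network grows with the number of possible connections between its users, n(n-1)/2, i.e. roughly n squared.

```python
# Metcalfe's law sketch: possible pairwise connections among n users.
def possible_connections(n):
    return n * (n - 1) // 2

for n in (10, 100, 1000):
    print(n, possible_connections(n))
# 10 users -> 45 connections; 100 -> 4950; 1000 -> 499500.
```

Doubling the user base roughly quadruples the possible connections, which is why adoption of services like eBay feeds on itself.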
(13) Openness - Web 2.0 places an emphasis on making use of the information in the vast databases that these services help to populate. This means Web 2.0 is about working with open standards, using open source software, making use of free data, re-using data, and working in a spirit of open innovation.
(14) The Read/Write Web - A term given to describe the main differences between Old Media (newspaper, radio, and TV) and New Media (e.g. blogs, wikis, RSS feeds), the new Web is dynamic in that it allows consumers of the web to alter and add to the pages they visit -
information flows in all directions.
(15) The Web as a Platform - Closely tied to the "perpetual beta," the idea behind Web 2.0 services is that they need to be constantly updated. This includes experimenting with new features in a live environment to see how customers react.
(16) The Long Tail - The new Web lowers the barriers for publishing anything (including media) related to a specific interest because it empowers writers to connect directly with international audiences interested in extremely narrow topics, whereas originally it was difficult to publish a book related to a very specific interest because its audience would be too limited to justify the publisher's investment.
(17) Harnessing Collective Intelligence - Google, Amazon, and Wikipedia are good examples of how successful Web 2.0-centric companies use the collective intelligence of users in order to continually improve services based on user contributions. Google's
PageRank examines how many links point to a page, and from what sites those links come, in order to determine its relevancy, instead of evaluating the relevance of websites based solely on their content.
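The intuition behind PageRank can be sketched in a few lines of power iteration. This is a toy illustration of the idea, not Google's actual implementation; the three-page link graph is entirely hypothetical.

```python
# A page is important if important pages link to it.
links = {              # page -> pages it links to (hypothetical graph)
    "A": ["B", "C"],
    "B": ["C"],
    "C": ["A"],
}
damping = 0.85
rank = {p: 1 / len(links) for p in links}   # start with equal ranks

for _ in range(50):    # iterate until the ranks settle
    new = {}
    for page in links:
        # Each page shares its rank equally among the pages it links to.
        incoming = sum(rank[q] / len(links[q])
                       for q in links if page in links[q])
        new[page] = (1 - damping) / len(links) + damping * incoming
    rank = new

# C collects links from both A and B, so it ends up ranked highest.
print(max(rank, key=rank.get))  # C
```

Notice that C wins without anyone inspecting the content of the pages: the ranking is extracted entirely from how users of the web link to one another.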
(18) Science of Networks - To truly understand Web 2.0, one must not only understand web networks, but also human and scientific networks. Ever heard of
six degrees of separation and the
small world phenomenon? Knowing how to open up a
Facebook account isn't good enough; we must know what goes on behind the scenes in the interconnectedness of networks - socially and scientifically.
(19) Core Datasets from User Contributions - One way Web 2.0 companies collect unique datasets is through user contributions. However, collecting is only half the picture;
using the datasets is the key. These contributions are organized into databases and analyzed to uncover the collective intelligence hidden in the data, which can then be applied to the direct improvement of the website or web service.
(20) Lightweight Programming Models - The move toward database-driven web services has been accompanied by new software development models that often lead to greater flexibility. By sharing and processing datasets between partners, these models enable mashups and remixes of data.
Google Maps is a common example as it allows people to combine its data and application with other geographic datasets and applications.
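At its core, a mashup is just a join of two independent datasets on a shared key, the way a map mashup overlays its own records on someone else's geographic service. A minimal sketch, with entirely made-up city data and branch counts:

```python
# Dataset one: coordinates, as a mapping service might supply them.
coordinates = {"Calgary": (51.05, -114.07), "Edmonton": (53.55, -113.49)}
# Dataset two: our own records (hypothetical library branch counts).
library_branches = {"Calgary": 18, "Edmonton": 21}

# The "mashup": join the two on the shared key (city name).
mashup = [
    {"city": city, "lat": lat, "lng": lng,
     "branches": library_branches.get(city, 0)}
    for city, (lat, lng) in coordinates.items()
]
for row in mashup:
    print(row["city"], row["branches"])
```

Neither dataset was designed with the other in mind; the lightweight model (plain keys and plain records) is what makes the remix possible.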
(21) The Wisdom of the Crowds - Not only has the new Web blurred the boundary between amateur and professional status; in a connected world, ordinary people often have access to better information than officials do. As an example, the
collective intelligence of the evacuees of the towers saved numerous lives when they disobeyed the authorities who told them to stay put.
(22) Digital Natives - Because a generation (mostly the under-25s) has grown up surrounded by developing technologies, those fully at home in a digital environment aren't worried about information overload; rather, they crave it.
(23) Internet Economics - Small is the new big. Unlike the past when publishing was controlled by publishers, Web 2.0's read/write web has opened up markets to a far bigger range of supply and demand. The amateur who writes one book has access to the same shelf space as the professional author.
(24) "Wirelessness" - Digital natives are less attached to computers and are more interested in accessing information through mobile devices, when and where they need it. Hence, traditional client applications designed to run on a specific platform, will struggle if not disappear in the long run.
(25) Who Will Rule? - This will be the ultimate question (and prize). As
Sharon Richardson argues, whoever rules "may not even exist yet."