Saturday, April 14, 2007

A Master Definition

As promised, here's my analysis of Michael Habib's Master's thesis, "Toward Academic Library 2.0: Development and Application of a Library 2.0 Methodology," after a second, more careful reading. Habib's thesis astutely asserts that Web 2.0 rests on seven main concepts. Here they are:

(1) The Read/Write Web - A term describing the main difference between Old Media (newspapers, radio, and TV) and New Media (e.g. blogs, wikis, RSS feeds): the new Web is dynamic in that it allows consumers to alter and add to the pages they visit, so information flows in all directions.

(2) The Web as a Platform - Better known as "perpetual beta": Web 2.0 services need to be constantly updated, which includes experimenting with new features in a live environment to see how customers react.

(3) The Long Tail - It was once difficult to publish a book on a very specific interest, because its audience would be too limited to justify the publisher's investment. The new Web lowers the barriers to publishing anything (including media) related to a niche interest, because it empowers writers to connect directly with international audiences interested in extremely narrow topics.

(4) Harnessing Collective Intelligence - Google, Amazon, and Wikipedia are good examples of how successful Web 2.0-centric companies use the collective intelligence of their users to continually improve services based on user contributions. Google's PageRank examines how many links point to a page, and from which sites those links come, to determine its relevance, instead of evaluating websites based solely on their content. (A quick sketch of the idea follows after this list.)

(5) Network Effects - A concept explaining why social technologies benefit from an economy that awards value to a service as more people join it. eBay is one example of how the application of this concept works so successfully. (See the back-of-the-envelope calculation after this list.)

(6) Core Datasets from User Contributions - One way Web 2.0 companies collect unique datasets is through user contributions. However, collecting is only half the picture; using the datasets is the key. These contributions are organized into databases and analyzed to extract the collective intelligence hidden in the data, which can then be applied directly to improving the website or web service. (A toy example follows after this list.)

(7) Lightweight Programming Models - The move toward database-driven web services has been accompanied by new software development models that often allow greater flexibility. Because partners can share and process each other's datasets, these models enable mashups and remixes of data. Google Maps is a common example, as it allows people to combine its data and application with other geographic datasets and applications. (A minimal mashup sketch follows below.)
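
To make the PageRank idea concrete, here is a toy sketch in Python (my own illustration, not Google's actual algorithm, which is far more sophisticated): each page's score is repeatedly redistributed along its outgoing links, so pages linked to by many well-linked pages float to the top.

```python
# Toy PageRank: scores flow along links; the damping factor models a
# surfer who occasionally jumps to a random page instead of a link.
links = {
    "home": ["about", "blog"],
    "about": ["home"],
    "blog": ["home", "about"],
}

damping = 0.85
pages = list(links)
rank = {p: 1.0 / len(pages) for p in pages}

for _ in range(50):  # iterate until the scores settle
    new_rank = {p: (1.0 - damping) / len(pages) for p in pages}
    for page, outgoing in links.items():
        share = damping * rank[page] / len(outgoing)
        for target in outgoing:
            new_rank[target] += share
    rank = new_rank

print(sorted(rank.items(), key=lambda kv: -kv[1]))
```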
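
The economics behind network effects are often summarized by Metcalfe's law: a network's value grows roughly with the number of possible connections, n(n-1)/2, rather than with the number of members n. A quick calculation (my own illustration) shows how lopsided that gets:

```python
# Metcalfe's law: n members yield n*(n-1)/2 possible pairwise
# connections, so value outpaces headcount dramatically.
for members in (10, 100, 1000, 10000):
    connections = members * (members - 1) // 2
    print(f"{members:>6} members -> {connections:>10} possible connections")
```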
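
As a toy example of extracting collective intelligence from user contributions, here is a sketch (with made-up data) of the classic "customers who bought this also bought" calculation: raw orders are aggregated into co-occurrence counts, and those counts drive the recommendations.

```python
from collections import Counter
from itertools import combinations

# Raw user contributions: each order is a set of items bought together.
orders = [
    {"camera", "memory card"},
    {"camera", "tripod"},
    {"camera", "memory card", "tripod"},
    {"memory card", "card reader"},
]

# Organize the contributions: count how often each pair co-occurs.
pair_counts = Counter()
for order in orders:
    for pair in combinations(sorted(order), 2):
        pair_counts[pair] += 1

# The extracted collective intelligence: recommend frequent partners.
def also_bought(item, top=3):
    partners = Counter()
    for (a, b), n in pair_counts.items():
        if a == item:
            partners[b] += n
        elif b == item:
            partners[a] += n
    return partners.most_common(top)

print(also_bought("camera"))  # [('memory card', 2), ('tripod', 2)]
```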
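
And here is what a minimal mashup amounts to in code, sketched in Python with invented data: one service's list of places is joined to another service's geocoding data on a shared key, producing markers ready to plot on a map.

```python
# Dataset A (one service): places and their street addresses.
places = [
    {"name": "Cafe Uno", "address": "123 Main St"},
    {"name": "Cafe Due", "address": "456 Oak Ave"},
]

# Dataset B (another service): addresses geocoded to coordinates.
geocodes = {
    "123 Main St": (49.2827, -123.1207),
    "456 Oak Ave": (49.2606, -123.2460),
}

# The mashup: join the two datasets on the shared address key.
markers = []
for place in places:
    lat, lng = geocodes[place["address"]]
    markers.append({"name": place["name"], "lat": lat, "lng": lng})

print(markers)
```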

Friday, April 13, 2007

From Web 2.0 to Web 2.0

I've been reading up on Web 2.0-related material and have noticed that, although the viewpoints appear somewhat disjointed, they nonetheless point toward a consensus. Over the next while, I will be analyzing these differences and will come up with my own "theory" as to what Web 2.0 is (or is not). To start off, I'm going to compare two definitions of Web 2.0 from opposite ends of the chronological spectrum: an "old" entry written back in 2005 by Sharon Richardson, and a "new" one by Michael Habib, a Master's thesis published in 2007 (analyzed in a later entry).

Here are Richardson's main points about Web 2.0.

(1) The Wisdom of Crowds - Not only has the new Web blurred the boundary between amateur and professional; in a connected world, ordinary people often have access to better information than officials do. As an example, the collective intelligence of the evacuees of the World Trade Center towers saved numerous lives when they disobeyed the authorities who told them to stay put.

(2) Digital Natives - Because a generation (mostly the under-25s) has grown up surrounded by developing technologies, those fully at home in a digital environment aren't worried about information overload; rather, they crave it.

(3) Internet Economics - Small is the new big. Unlike the past, when publishing was controlled by publishers, Web 2.0's read/write web has opened up markets to a far bigger range of supply and demand. The amateur who writes one book has access to the same shelf space as the professional author.

(4) "Wirelessness" - Digital natives are less attached to computers and are more interested in accessing information through mobile devices, when and where they need it. Hence, traditional client applications designed to run on a specific platform, will struggle if not disappear in the long run.

(5) Who Will Rule? - This will be the ultimate question (and prize). As Richardson argues, whoever rules "may not even exist yet."

Wednesday, April 11, 2007

Introducing Google Maps (Personalized)

Google has gone ahead and introduced a new feature that is eerily similar to what a simple mashup can do; in fact, it's pretty much what a mashup is. With Google Maps, Google has saved us the trouble of API coding and programming and simply allowed us to personalize our own "Google Map." Give it a try: first, pick a theme (e.g. vacation spots you'd like to visit). Second, fill in the required information (addresses). Third, add photos, pictures, descriptions, anything you'd like, to customize your map. Voila! A personalized map that you can share with your clients, acquaintances, friends, family, and just about anyone else you can think of. Yes, this is Web 2.0: we're doing just fine, thank you very much.
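
For the do-it-yourself crowd, the hand-rolled equivalent of this feature is a KML file, the XML format that Google Maps can display. Here is a minimal sketch in Python (placemark data invented for illustration) that writes one placemark per stop on a hypothetical vacation map:

```python
# Build a minimal KML file -- one Placemark per stop. Note that KML
# coordinates are written longitude,latitude,altitude.
stops = [
    ("Eiffel Tower", "Day 1: picnic spot", 2.2945, 48.8584),
    ("Louvre", "Day 2: museum morning", 2.3376, 48.8606),
]

placemarks = "".join(
    f"""
  <Placemark>
    <name>{name}</name>
    <description>{desc}</description>
    <Point><coordinates>{lng},{lat},0</coordinates></Point>
  </Placemark>""" for name, desc, lng, lat in stops
)

kml = f"""<?xml version="1.0" encoding="UTF-8"?>
<kml xmlns="http://www.opengis.net/kml/2.2">
  <Document>
    <name>My Vacation Map</name>{placemarks}
  </Document>
</kml>"""

with open("vacation.kml", "w") as f:
    f.write(kml)
```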