Sunday, April 22, 2007

An Early Web 2.0 Definition

Although most people point to Tim O'Reilly's "What Is Web 2.0: Design Patterns and Business Models for the Next Generation of Software" as the foremost definition of Web 2.0, one article worth taking notice of is Paul Graham's "Web 2.0" from November 2005, published just a month after O'Reilly's article came out. Shorter and simpler in scope than O'Reilly's biblical explanation, Graham's definition nonetheless offers an equally salient view of Web 2.0. Graham asserts that regardless of new technologies, there is a common thread: Web 2.0 means using the web the way it's meant to be used. The "trends" we're seeing now are simply the inherent nature of the web emerging from under the broken models that got imposed on it during the Dot-com Bubble.
Here are Graham's main points about Web 2.0:

(1) Ajax - Short for "Javascript now works," Ajax programming allows web-based applications to work much more like desktop ones (a minimal sketch of the technique appears after this list). A whole new generation of software is being written to take advantage of Ajax. There hasn't been such a wave of new applications since microcomputers first appeared. Even Microsoft sees it, but it's too late for them to do anything more than leak "internal" documents designed to give the impression they're on top of this new trend.

(2) Democracy - Even amateurs can surpass professionals when they have the right kind of system to channel their efforts, whether it's the news or academic writing. Wikipedia may be the most famous example. The most dramatic example of Web 2.0 democracy is not in the selection of ideas, but their production. The top links on aggregator sites are often links to individual people's sites rather than to magazine articles or news stories.

(3) Don't Maltreat Users - During the Bubble a lot of popular sites were quite high-handed with users, and not just in obvious ways like making them register or subjecting them to annoying ads. The very design of the average site in the late 90s was an abuse. Many of the most popular sites were loaded with obtrusive branding that made them slow to load and sent the user the message: this is our site, not yours. Because sites were offering free things, companies felt entitled to make users jump through hoops to get them. Web 2.0 frowns upon that mentality.
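To make Graham's first point concrete, here is a minimal sketch of the kind of in-page request Ajax enables, written as browser-side TypeScript. This is not code from Graham's essay; the /api/greeting endpoint and the greeting element are hypothetical, invented for illustration.

```typescript
// Minimal Ajax sketch: request data in the background and update the
// page without a reload. The "/api/greeting" endpoint and the
// "greeting" element are hypothetical.
function loadGreeting(): void {
  const xhr = new XMLHttpRequest();
  xhr.open("GET", "/api/greeting");
  xhr.onload = () => {
    if (xhr.status === 200) {
      const target = document.getElementById("greeting");
      if (target) {
        // Swap in the response in place: no full-page round trip,
        // which is what makes the app feel like desktop software.
        target.textContent = xhr.responseText;
      }
    }
  };
  xhr.send();
}

loadGreeting();
```

The key design point is that the round trip happens in the background: only part of the page changes, rather than the whole page reloading on every click.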

Saturday, April 21, 2007

BCLA Conference: Day #2

Day #2 of the BCLA Conference has just wrapped up. Once again, the sessions were fascinating and the catering first class. I'm having a wonderful time. The highlight of the day was the session on The Electronic Health Library of BC (eHLbc): Expanding Access to Health Information Trends. I thought I needed a break from Web 2.0, and luckily I took one, because this session reinforced my appreciation of the need for collaboration and cooperation in bringing the best information services to users. After all, as librarians, isn't it our duty to gather, organize, and disseminate the best information possible as quickly as possible? Libraries of the future are best served by collaborative action and the pooling of resources. British Columbia is only beginning to catch up: after more than two years of assiduous effort by a working group of academic and health librarians, in partnership with the BC Academic Health Council, the innovative provincial database consortium known as the Electronic Health Library of BC (eHLbc) went live on April 1, 2006.

It was a particularly interesting session in that it provided an account of the process that brought the eHLbc vision to life, from drafting a request for proposals and striking steering and planning committees to identifying the next steps now being planned. In providing the entire BC academic and health care community with high-quality, cost-effective, equitable, and easily accessible health library resources that support and improve practice, education, and research, eHLbc appears to be taking a huge step forward for health practitioners.

Friday, April 20, 2007

BCLA Conference: Day #1

Day #1 of the British Columbia Library Association Conference, Beyond 20/20: Envisioning the Future, at the Hilton in Burnaby, BC, has just come to a close. The BCLA does a good job of cultivating the next generation of librarians and information specialists by offering volunteer work in exchange for paid conference hours: students at SLAIS and Langara's Library Tech program not only get valuable experience in behind-the-scenes organizing, but also get much-needed conference time that they otherwise likely couldn't afford.

Highlight of the day? Speaking with people from Andornot. In a twenty-minute discussion, I learned not only about the consulting business, but also about the implementation of innovative technologies for library catalogues and databases. Andornot is a Vancouver, B.C. company that specializes in database design and application development, data conversion, search and report form design and optimization, web hosting, and training sessions.

What impressed me most? Web 2.0 technologies. Rex Turgano, one of the consultants at Andornot, showed me some of the high-end (yet incredibly simple and straightforward) technologies that he uses not only for Andornot projects, but also for his own personal hobbies. He showed me how easily a blog service such as Blogger or Movable Type could be used as a full content management system. Anyone with a little knowledge of HTML and some creativity can make the most of RSS feeds, a blog, and even a wiki, "mashing" them together into a homepage at very little cost. A rough sketch of the RSS half of that idea appears below.
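Here is what pulling a blog's headlines into a homepage might look like as a TypeScript sketch; this is my own illustration, not anything Andornot-specific. The feed URL is hypothetical, and the code assumes the feed is reachable from the page (same origin or CORS-enabled).

```typescript
// Sketch: pull headlines from an RSS 2.0 feed into a homepage mashup.
// The feed URL is hypothetical; same-origin (or CORS) access is assumed.
async function listFeedTitles(feedUrl: string): Promise<string[]> {
  const response = await fetch(feedUrl);
  const xml = await response.text();
  const doc = new DOMParser().parseFromString(xml, "application/xml");
  // Each RSS <item> carries one post; its <title> is the headline.
  return Array.from(doc.querySelectorAll("item > title")).map(
    (node) => node.textContent ?? ""
  );
}

listFeedTitles("https://example.com/blog/rss.xml").then((titles) =>
  titles.forEach((title) => console.log(title))
);
```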

Saturday, April 14, 2007

A Master Definition

As promised, here's my analysis of Michael Habib's Master's thesis, "Toward Academic Library 2.0: Development and Application of a Library 2.0 Methodology," after a second, more careful reading. Habib's thesis astutely asserts that Web 2.0 has seven main concepts. Here they are:

(1) The Read/Write Web - A term describing the main difference between Old Media (newspaper, radio, and TV) and New Media (e.g. blogs, wikis, RSS feeds): the new Web is dynamic in that it allows consumers of the web to alter and add to the pages they visit, so information flows in all directions.

(2) The Web as a Platform - Better known as "perpetual beta," the idea behind Web 2.0 services is that they are never finished and must be constantly updated. This includes experimenting with new features in a live environment to see how customers react (see the first sketch after this list).

(3) The Long Tail - It used to be difficult to publish a book on a very narrow interest because its audience was too limited to justify the publisher's investment. The new Web lowers that barrier for publishing anything (including media) by letting writers connect directly with international audiences interested in extremely narrow topics.

(4) Harnessing Collective Intelligence - Google, Amazon, and Wikipedia are good examples of how successful Web 2.0-centric companies use the collective intelligence of users to continually improve services based on user contributions. Instead of evaluating the relevance of websites based solely on their content, Google's PageRank examines how many links point to a page, and from what sites those links come, in order to determine its relevancy (a toy version appears after this list).

(5) Network Effects - This concept explains why social technologies benefit from an economy in which the service becomes more valuable as more people join it. eBay is one example of how the application of this concept works so successfully (a back-of-the-envelope illustration also follows the list).

(6) Core Datasets from User Contributions - One way Web 2.0 companies collect unique datasets is through user contributions. However, collecting is only half the picture; using the datasets is the key. These contributions are organized into databases and analyzed to extract the collective intelligence hidden in the data, which can then be applied to the direct improvement of the website or web service.

(7) Lightweight Programming Models - The move toward database-driven web services has been accompanied by new software development models that often lead to greater flexibility. By making it easy to share and process datasets between partners, these models enable mashups and remixes of data (a sketch closes out the examples below). Google Maps is a common example, as it allows people to combine its data and application with other geographic datasets and applications.
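To make a few of these concepts concrete, some hedged sketches follow; none of them come from Habib's thesis. First, the live experimentation behind "perpetual beta" in point (2): one common approach is to bucket users stably, so that a fixed percentage sees a new feature on every visit. The hashing scheme and percentage here are invented for illustration.

```typescript
// Sketch of "perpetual beta" experimentation: stable bucketing of users
// into a control group and a new-feature group. Details are invented.
function inNewFeatureGroup(userId: string, rolloutPercent: number): boolean {
  // Hash the user id so each user lands in the same group on every visit.
  let hash = 0;
  for (const ch of userId) {
    hash = (hash * 31 + ch.charCodeAt(0)) % 100;
  }
  return hash < rolloutPercent;
}

console.log(inNewFeatureGroup("user-42", 10)); // 10% see the new feature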
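Next, a toy version of the link-counting idea behind PageRank from point (4): a page's score depends on how many pages link to it and on the scores of those linking pages. The 0.85 damping factor is the commonly cited value; the three-page link graph is invented.

```typescript
// Toy PageRank over an invented three-page link graph.
type LinkGraph = Record<string, string[]>; // page -> pages it links to

function pageRank(
  graph: LinkGraph,
  damping = 0.85,
  iterations = 50
): Record<string, number> {
  const pages = Object.keys(graph);
  const n = pages.length;
  let ranks: Record<string, number> = {};
  pages.forEach((p) => (ranks[p] = 1 / n));

  for (let i = 0; i < iterations; i++) {
    const next: Record<string, number> = {};
    pages.forEach((p) => (next[p] = (1 - damping) / n));
    for (const page of pages) {
      const outLinks = graph[page];
      for (const target of outLinks) {
        // A link passes along a share of the linking page's rank.
        next[target] += (damping * ranks[page]) / outLinks.length;
      }
    }
    ranks = next;
  }
  return ranks;
}

const ranks = pageRank({
  home: ["about"],
  about: ["home"],
  blog: ["home"],
});
console.log(ranks); // "home" has the most inbound links, so it scores highest
```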
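For the network effects of point (5), Metcalfe's law, which Habib doesn't name but which is a standard rough formalization, holds that a network's value grows with the number of possible pairwise connections, i.e. roughly with the square of the user count.

```typescript
// Metcalfe's-law-style illustration (not from the thesis): possible
// pairwise connections grow with the square of the user count.
function possibleConnections(users: number): number {
  return (users * (users - 1)) / 2;
}

[10, 100, 1000].forEach((n) =>
  console.log(`${n} users -> ${possibleConnections(n)} possible connections`)
);
// 10 users -> 45, 100 users -> 4950, 1000 users -> 499500
```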
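Finally, for the lightweight programming models of point (7), rather than reproduce the actual Google Maps API, here is a generic sketch of a mashup: joining two hypothetical JSON feeds from different services into a single combined view. Both URLs and record shapes are invented.

```typescript
// Sketch of the mashup idea: remix two hypothetical JSON feeds from
// different services into one combined dataset. Both URLs are invented.
interface Branch { name: string; city: string }
interface CityInfo { city: string; population: number }

async function mashup(): Promise<void> {
  const [branches, cities] = await Promise.all([
    fetch("https://example.com/library/branches.json").then(
      (r) => r.json() as Promise<Branch[]>
    ),
    fetch("https://example.org/stats/cities.json").then(
      (r) => r.json() as Promise<CityInfo[]>
    ),
  ]);
  // Remix: annotate each branch with population data from the other feed.
  for (const branch of branches) {
    const info = cities.find((c) => c.city === branch.city);
    console.log(`${branch.name}: population ${info?.population ?? "unknown"}`);
  }
}

mashup();
```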

Friday, April 13, 2007

From Web 2.0 to Web 2.0

I've been reading up on Web 2.0-related material and have noticed that although the viewpoints appear somewhat disjointed, they nonetheless point to a consensus. Over the next while, I will be analyzing these differences and will come up with my own "theory" as to what Web 2.0 is (or is not). To start off, I'm going to compare two definitions of Web 2.0 from opposite ends of the chronological spectrum: an "old" one written back in 2005 by Sharon Richardson and a "new" one from Michael Habib's Master's thesis, published in 2007 (analyzed in a later entry).

Here are Richardson's main points about Web 2.0.

(1) The Wisdom of the Crowds - Not only has the crowd blurred the boundary between amateur and professional; in a connected world, ordinary people often have access to better information than officials do. For example, the collective intelligence of the evacuees of the World Trade Center towers saved numerous lives when they disobeyed the authorities who told them to stay put.

(2) Digital Natives - Because a generation (mostly the under-25s) has grown up surrounded by developing technologies, those fully at home in a digital environment aren't worried about information overload; rather, they crave it.

(3) Internet Economics - Small is the new big. Unlike the past when publishing was controlled by publishers, Web 2.0's read/write web has opened up markets to a far bigger range of supply and demand. The amateur who writes one book has access to the same shelf space as the professional author.

(4) "Wirelessness" - Digital natives are less attached to computers and are more interested in accessing information through mobile devices, when and where they need it. Hence, traditional client applications designed to run on a specific platform, will struggle if not disappear in the long run.

(5) Who Will Rule? - This will be the ultimate question (and prize). As Richardson argues, whoever rules "may not even exist yet."