Tuesday, May 15, 2007

Mr. Bean Goes to the Library

Nothing like a little midweek humour to cure the malaise and pass the time. Every now and then, I try to post something different to "mix things up" a bit and give this blog a different flavour.

I just couldn't help sharing this intriguingly funny Mr. Bean clip, starring Rowan Atkinson (one of my favourite comedians). It's one of the rare sketches that didn't quite make it to the big screen and got left on the cutting room floor, only to be released when the DVD format came out. Just in time, too, for Mr. Bean's return to the big screen with his latest movie later this summer. This clip will certainly have librarians holding their breath. Enjoy!

Sunday, May 13, 2007

The 10 Forces

Thomas Friedman's The World is Flat is a must-read for information professionals. Although the term "Web 2.0" is not found anywhere in the book, because it was written before 2005, the concepts are there. The ideas presented in this book are exceptionally juicy and thought-provoking for those interested in understanding not only how we got to where we are in the information world, but also where we are going. Friedman identifies ten "flatteners" that define our 21st-century world:

#1: Collapse of the Berlin Wall: The fall of the Berlin Wall is the starting point for the leveling of the global playing field. Because the event became the ultimate symbol of the end of the Cold War, it allowed people from the other side of the wall to join the global economy.

#2: Netscape: With Netscape, the World Wide Web broadened the audience for the Internet from its roots as a communications medium used primarily by scientists, to everyone with an internet connection.

#3: Workflow software: Technically, what makes this possible is the development of a new data description language, called XML, and its related transport protocol, called SOAP. Together they form a vast network of underground plumbing that enables web and software applications to communicate with each other seamlessly.
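To make that "plumbing" concrete: a SOAP message is just structured XML that any two applications can agree to parse. Here's a minimal sketch using Python's standard library; the operation name and payload are invented for illustration, not part of any real service.

```python
import xml.etree.ElementTree as ET

SOAP_NS = "http://schemas.xmlsoap.org/soap/envelope/"

def build_soap_request(operation, params):
    """Build a minimal SOAP envelope carrying one operation and its parameters."""
    envelope = ET.Element(f"{{{SOAP_NS}}}Envelope")
    body = ET.SubElement(envelope, f"{{{SOAP_NS}}}Body")
    op = ET.SubElement(body, operation)
    for name, value in params.items():
        child = ET.SubElement(op, name)
        child.text = str(value)
    return ET.tostring(envelope, encoding="unicode")

# A hypothetical catalogue lookup: any system that speaks XML can read this.
request = build_soap_request("GetBookByISBN", {"isbn": "0-374-29288-4"})
```

The point is not the code itself but that the wire format is self-describing text: the sender and receiver need only share the vocabulary, not the same software or platform.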

#4: Open sourcing: New open source software such as blogs and wikis has allowed communities to upload and collaborate on online projects. Free software has leveled the playing field for all, preventing big businesses from monopolizing as they could in the past.

#5: Outsourcing: Outsourcing has allowed companies to split service and manufacturing activities into components, with each component performed in the most efficient, cost-effective way. At the same time, poorer countries such as India benefit: not only can workers achieve a better lifestyle and higher pay without leaving their homes, but India as a whole becomes a global economic power, as these new technologies prevent a brain drain.

#6: Offshoring: Similar to outsourcing, countries that could not produce certain products in the past suddenly can do so and become global players. Offshoring allows countries such as China to manufacture the very same product in the very same way, only with cheaper labour, lower taxes, subsidized energy, and lower health-care costs. How? With the internet, anyone from anywhere can get fast, free blueprints to build just about anything, anywhere.

#7: Supply chaining: Using Wal-Mart as its primary example, supply chaining allows horizontal collaboration among suppliers, retailers, and customers to create value at a more efficient pace and at a lower price, thus resulting in the adoption of common standards between companies and more efficient global collaboration.

#8: Insourcing: Using UPS as a prime example, insourcing is about one company performing services on behalf of another company. For example, UPS itself repairs Toshiba computers on behalf of Toshiba. The work is done at the UPS hub, by UPS employees. Instead of being competitors, businesses are actually collaborating with each other in order to maximize profits and efficiency through the use of greater communication technologies.

#9: In-forming: With the advent of Google, Yahoo!, and MSN Search, everyone who can type has the same basic access to research information. Search engines have become a total equalizer. In-forming is the ability not only to build and deploy one's own personal supply chain - a supply chain of information, knowledge, and entertainment - but also to self-collaborate: to become your own self-directed and self-empowered researcher, editor, and selector of entertainment, without having to leave the house or office.

#10: "The Steroids": "Wirelessness" is the ultimate flattener because it amplifies and turbocharges all the other flatteners, making it possible to do each and every one of them in a way that is "digital, mobile, virtual and personal." Some of these new technologies are already a big part of our lives, including cell phones, iPods, personal digital assistants, instant messaging, and voice over IP (VoIP). These are but the early technologies: the best is yet to come.

Tuesday, May 08, 2007

The Mashup Competition for '06

Paul Miller has written an interesting article on creating mashups for a library contest. A total of eighteen entries were received for the competition, spanning everything from very simple enhancements to existing library functions right through to a collaborative effort to provide library services inside the Second Life 3D online digital world. Entries came in from public and academic libraries, as well as from the commercial sector. As is the trend with mashups more generally, map-based mashups proved common. It's a fun read! Here's an excerpt:
The 'mashup' is a point in time; a means to an end. Our purpose is not, necessarily, to encourage the neverending development of small tweaks and hacks around existing systems. Our purpose is to create a safe and incentivised environment within which the whole sector can begin to give serious thought to what they actually want in the future. Should we continue to change the systems we have incrementally, or are we approaching the point at which some revolutionary change is required? Mashups are 'easy', mashups are quick. Mashups free their creator to think differently, and to try the unexpected. Some of that which they learn will inform our collective thinking as we move forward.

Sunday, May 06, 2007

Globalization 3.0

I've got some time now and am finally catching up on some reading. I've got my hands on Thomas Friedman's The World Is Flat: A Brief History of the Twenty-First Century, which has been on my wish list since around this time last year. One thing that stands out is his argument that we are in Globalization 3.0. As an information professional, I find this immensely intriguing. Does Web 2.0 fit into this rubric? Is it merely a small piece of a much larger picture? I thought I'd share this interesting chronological framework:

Globalization 1.0 (1492 - 1800) - The world shrank from size large to size medium. It was about countries and muscles. The key driving force was how much muscle, horsepower, wind power, and steam power a country had and how creatively it deployed it. The main question was: Where does my country fit into global competition and opportunities?

Globalization 2.0 (1800 - 2000) - This era shrank the world from size medium to size small. The key agents of change were multinational corporations (MNCs), which went global for markets and labour, spearheaded by the Industrial Revolution. The key dynamic force behind this era of globalization was technology: steamships, railroads, telephones, then mainframe computers. The main question was: Where does my company fit into the global economy?

Globalization 3.0 (2000 - present) - We've entered the era in which size small has shrunk to size tiny, flattening the playing field at the same time. The dynamic force behind our unique era is the power of individuals to collaborate and compete globally, driven by software in conjunction with the creation of a global fibre-optic network that has made us all next-door neighbours. The question now is: Where do I fit into the global competition and opportunities of the day, and how can I collaborate with others globally?

Saturday, May 05, 2007

Web 2.0 Course this Summer at University of Western Ontario

Web 2.0 is slowly emerging in the LIS curricula. Amanda Etches-Johnson, the User Experience Librarian at McMaster University Library, is teaching an innovative course at the University of Western Ontario LIS school called LIS 757: Social Software and Libraries. Here is a brief description of what the class entails:
The term “social software” has been applied to Web-based software tools that facilitate communication, collaboration, and network/community-building. This course will explore social software applications such as blogs, RSS, wikis, social bookmarking, tagging, and online social networks within the context of library services.

What do you think? Is it time that LIS faculties make Web 2.0 courses mandatory, or at least integrated into the curricula? Here is a schedule of the weekly topics.
  • Week 1: Introduction to social software
  • Weeks 2 & 3: Blogs - introduction to technology, terminology & software options. Discussion of blog content, design, usability, and library case studies.
  • Weeks 4 & 5: RSS - introduction to RSS technology and specifications. Discussion of RSS trends and current issues, review of RSS aggregators, hands-on, and library case studies.
  • Week 6: Wikis – technology, software options, hands-on, and library case studies.
  • Week 7-8: Social bookmarking, tagging, folksonomies – technology, trends and current issues, hands-on, and case studies.
  • Week 9-10: Online communities and social networks – trends and current issues, exploration of various online communities, hands-on, library case studies.
  • Week 11: Gaming and virtual worlds.
  • Weeks 12-13: best practices, discussion, evaluation.

Friday, May 04, 2007

Search Engines for '06

Search engine guru Phil Bradley has written an excellent column about the latest and greatest search engines in 2006. I've been keeping track of Mr. Bradley's blog, and it's a hub of fantastic information geared towards librarians, information staff, information professionals and web designers. Here is an excerpt of his article in Ariadne:
It's very easy simply to concentrate on the 'Big Four' search engines - Ask, Google, Live and Yahoo, while missing out on what is happening elsewhere. I know that I'm as guilty of that as anyone else and so for this column I thought I would look back over 2006 and see which search engines have come to my attention, what I think of them, and see how well they have actually fared. This is of course by no means a comprehensive list, and I will inevitably have missed out some but I hope I will have caught the main contenders.

My search engine of the year? Cha Cha. Why? I wrote a post about it a while back. It's a superb complement to searching for those "needle in a haystack" reference inquiries.

Sunday, April 29, 2007

Top 10 Library Blogs

I have ten bloggers that I follow. To me, they are the future of librarianship. Their blogs are not just random musings, but instead are thoughtful, reflective, and fresh in content. In my opinion, these blogging librarians represent the next generation of the profession: they are creative, technologically savvy, and passionate about what they do. You can easily tell they do this because they truly love it.

(10) Blog on the Side - Darlene Fichter, Data Library Coordinator at the University of Saskatchewan, never fails to offer fascinating insight into the technological side of the information profession. Each post offers a little something different. Hence, it makes my top 10.

(9) McMaster University Library - This one's kind of unfair. A university librarian shouldn't be ranked so highly. (Doesn't he have better things to do?) And that's exactly why Jeffrey Trzeciak's blog is so exciting. He gives us a glimpse into the inner workings of a university librarian's viewpoint. Indeed, there is management-speak, but underneath the marketing and formality lies a hub of fascinating ideas and a fabulous vision of Library 2.0.

(8) Michael Habib - I'm just astounded at how far Mr. Habib will go. The sky's the limit for this man, and he only finished his MLS a few months ago. I consider Habib one of the foremost experts on Library 2.0 theory, as he wrote his Master's thesis on it.

(7) Library Crunch - Michael Casey's blog about Library 2.0-related issues in LIS. Casey is the progenitor of the term "Library 2.0" and, not surprisingly, his blog offers the most innovative insights into the profession.

(6) davidrothman.net - A highly technology-charged blog with superb insight into the latest medical library-related happenings.

(5) Vancouver Law Librarian - Humorous and enlightening, he offers mostly tech-related posts on the legal information profession.

(4) Meredith Farkas - One of the up-and-coming stars in the library world, Meredith has already published articles, contributes frequently to the blogosphere with thoughtful analysis, and even built the inaugural Five Weeks to a Social Library free online course for working librarians.

(3) Krafty Librarian - Michelle Kraft is, in my mind, one of the top health librarians in the field, and her blog posts show her knowledge of and passion for her profession. She is also very up to date on the technology side of her area of librarianship, which is a challenge, since hospital libraries, with their data-privacy concerns, are often not the most receptive places for technology.

(2) The Google Scholar Blog - This one's definitely a biased decision, but not, I think, overly so. I am confident that many will agree with me that the information in this blog serves not only the medical community, but the information profession. The Google Scholar blogger is on sabbatical at the moment, but his year of material is worth the price of admission alone.

(1) Tame the Web - I rank according to the following criteria: visually creative interface; length of existence; originality of posts; and quantity/quality of feedback. Michael Stephens, a professor and professional librarian, has a blog that meets all of these criteria. It's definitely worth checking out.

I believe blogging is a new beginning for librarians. Whereas in the past, discourse was confined to monthly journal articles, which could only draw response sporadically through conferences, workshops, and the occasional phone call, the blogosphere has transformed and leveled the playing field. Librarians are actually ahead of the game now; we can exchange our views within seconds. I'm proud to be a part of this profession, and excited about where it's going.

Sunday, April 22, 2007

An Early Web 2.0 Definition

Although most people identify Tim O'Reilly's "What Is Web 2.0: Design Patterns and Business Models for the Next Generation of Software" as the foremost definition of Web 2.0, one article worth taking notice of is Paul Graham's "Web 2.0" from November 2005, published shortly after O'Reilly's article came out. Shorter and simpler in scope than O'Reilly's biblical explanation, Graham's definition nonetheless offers an equally salient view of Web 2.0. Graham asserts that regardless of new technologies, there is a common thread: Web 2.0 means using the web the way it's meant to be used. The "trends" we're seeing now are simply the inherent nature of the web emerging from under the broken models that got imposed on it during the dot-com bubble.
Here are Graham's main points about Web 2.0:

(1) Ajax - Short for "Javascript now works," Ajax programming allows web-based applications to work much more like desktop ones. A whole new generation of software is being written to take advantage of Ajax. There hasn't been such a wave of new applications since microcomputers first appeared. Even Microsoft sees it, but it's too late for them to do anything more than leak "internal" documents designed to give the impression they're on top of this new trend.

(2) Democracy - Even amateurs can surpass professionals when they have the right kind of system to channel their efforts, whether in news or academic writing; Wikipedia may be the most famous example. The most dramatic example of Web 2.0 democracy is not in the selection of ideas, but in their production. The top links are often links to individual people's sites rather than to magazine articles or news stories.

(3) Don't Maltreat Users - During the Bubble, a lot of popular sites were quite high-handed with users, and not just in obvious ways like making them register or subjecting them to annoying ads. The very design of the average site in the late 90s was an abuse. Many of the most popular sites were loaded with obtrusive branding that made them slow to load and sent the user the message: this is our site, not yours. Because sites were offering free things, companies felt they needed to make users jump through hoops of fire to get them. Web 2.0 frowns upon that mentality.

Saturday, April 21, 2007

BCLA Conference: Day #2

Day #2 of the BCLA Conference has just wrapped up. Once again, the sessions were fascinating and the catering first class. I'm having a wonderful time. The highlight of the day was the session on The Electronic Health Library of BC (eHLbc): Expanding Access to Health Information Trends. I thought I needed a break from Web 2.0, and luckily I took one, because this session reinforced for me the need for collaboration and cooperation in order to bring users the best information services available. After all, as librarians, isn't it our duty to gather, organize, and disseminate the best information possible, as quickly as possible? Hence, libraries of the future are best served by collaborative action and the pooling of resources. British Columbia is only beginning to catch up: after over two years of assiduous effort by a working group of academic and health librarians, in partnership with the BC Academic Health Council, the innovative provincial database consortium known as the Electronic Health Library of BC (eHLbc) went live on April 1, 2006.

It was a particularly interesting session in that it provided an account of the process that brought the eHLbc vision to life, from creating a request for proposals and forming steering and planning committees to identifying the next steps being planned. In providing the entire BC academic and health care community with high-quality, cost-effective, equitable, and easily accessible health library resources that support and improve practice, education, and research, eHLbc appears to be taking a huge step for health practitioners.

Friday, April 20, 2007

BCLA Conference: Day #1

Day #1 of the British Columbia Library Association Conference, Beyond 20/20: Envisioning the Future, at the Hilton in Burnaby, BC, has just wrapped up. The BCLA does a good job of cultivating the next generation of librarians and information specialists by offering volunteer work for paid conference hours: students at SLAIS and Langara's Library Tech program not only get valuable experience in behind-the-scenes organizing, but also get much-needed conference time that they otherwise likely couldn't afford.

Highlight of the day? Speaking with people from Andornot. In a twenty minute discussion, not only did I learn more about the consulting business, but also about the implementation of innovative technologies for library catalogues and databases. Andornot is a Vancouver, B.C. company that specializes in database design and application development, data conversion, search and report form design and optimization, web hosting, and training sessions.

What am I impressed about? Web 2.0 technologies. Rex Turgano, one of the consultants at Andornot, showed me some of the high-end (yet incredibly simple and straightforward) technologies that he uses not only for Andornot projects, but also for his own personal hobbies. He showed me how easily a blog service such as Blogger or Movable Type can be used as a full content management system. Hence, anyone with a little knowledge of HTML and some creativity can make the most of RSS feeds, a blog, and even a wiki to "mash up" a homepage at very little cost.
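The RSS piece of that approach really is as simple as it sounds: a feed is just XML, and a few lines of Python's standard library can pull headlines out of one for re-use on a homepage. A minimal sketch, using an invented sample feed rather than any real blog's output:

```python
import xml.etree.ElementTree as ET

def extract_headlines(rss_xml):
    """Parse an RSS 2.0 document and return (title, link) pairs for each item."""
    root = ET.fromstring(rss_xml)
    items = []
    for item in root.iter("item"):
        title = item.findtext("title", default="")
        link = item.findtext("link", default="")
        items.append((title, link))
    return items

# A tiny, invented feed standing in for any blog's RSS output.
sample_feed = """<rss version="2.0"><channel>
  <title>Example Library Blog</title>
  <item><title>New databases added</title><link>http://example.org/1</link></item>
  <item><title>Web 2.0 workshop</title><link>http://example.org/2</link></item>
</channel></rss>"""

headlines = extract_headlines(sample_feed)
```

Once the headlines are in a list like this, dropping them into any HTML template turns a blog into a lightweight content management system.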

Saturday, April 14, 2007

A Master Definition

As promised, here's my analysis of Michael Habib's Master's thesis, "Toward Academic Library 2.0: Development and Application of a Library 2.0 Methodology," after a second, more careful reading. Habib astutely asserts that Web 2.0 has seven main concepts. Here they are:

(1) The Read/Write Web - A term given to describe the main differences between Old Media (newspaper, radio, and TV) and New Media (e.g. blogs, wikis, RSS feeds), the new Web is dynamic in that it allows consumers of the web to alter and add to the pages they visit - information flows in all directions.

(2) The Web as a Platform - Better known as "perpetual beta," the idea behind Web 2.0 services is that they are constantly updated. This includes experimenting with new features in a live environment to see how customers react.

(3) The Long Tail - Originally it was difficult to publish a book on a very specific interest because its audience would be too limited to justify the publisher's investment. The new web lowers the barriers to publishing anything (including media) on a specific interest, because it empowers writers to connect directly with international audiences interested in extremely narrow topics.

(4) Harnessing Collective Intelligence - Google, Amazon, and Wikipedia are good examples of how successful Web 2.0-centric companies use the collective intelligence of users to continually improve services based on user contributions. Google's PageRank examines how many links point to a page, and from what sites those links come, to determine its relevancy, instead of evaluating the relevance of websites based solely on their content.
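The intuition behind PageRank can be sketched in a few lines: each page's score is fed by the scores of the pages linking to it, iterated until the numbers settle. This is a simplified power-iteration sketch for illustration, not Google's actual implementation:

```python
def pagerank(links, damping=0.85, iterations=50):
    """Simplified PageRank: `links` maps each page to the pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        # Every page keeps a small baseline, then receives shares
        # of rank from each page that links to it.
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if not outlinks:
                continue
            share = damping * rank[page] / len(outlinks)
            for target in outlinks:
                new_rank[target] += share
        rank = new_rank
    return rank

# Three pages: A and C both link to B, so B ends up with the highest rank.
ranks = pagerank({"A": ["B"], "B": ["C"], "C": ["B"]})
```

Notice that B outranks C not because of anything on B's page, but purely because more of the link structure points at it; that is the "collective intelligence" Habib describes.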

(5) Network Effects - This concept explains why social technologies benefit from an economy in which a service gains value as more people join it. eBay is one example of how the application of this concept works so successfully.
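The arithmetic behind network effects is often summarized by Metcalfe's observation: the number of possible connections in a network, and hence its rough value, grows with the square of its membership. A quick sketch:

```python
def possible_connections(members):
    """Number of distinct pairwise connections in a network of `members` people."""
    return members * (members - 1) // 2

# Doubling membership roughly quadruples the possible connections,
# which is why each new user makes a service like eBay more valuable to everyone.
small = possible_connections(10)   # 45 possible pairings
large = possible_connections(20)   # 190 possible pairings
```

This is, of course, a back-of-the-envelope model rather than anything Habib specifies, but it captures why growth compounds value for social services.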

(6) Core Datasets from User Contributions - One way Web 2.0 companies collect unique datasets is through user contributions. However, collecting is only half the picture; using the datasets is the key. Contributions are organized into databases and analyzed to extract the collective intelligence hidden in the data, which can then be applied to the direct improvement of the website or web service.

(7) Lightweight Programming Models - The move toward database-driven web services has been accompanied by new software development models that often lead to greater flexibility. Sharing and processing datasets between partners enables mashups and remixes of data. Google Maps is a common example, as it allows people to combine its data and application with other geographic datasets and applications.
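Conceptually, a mashup of this kind is just a join between two independently maintained datasets on a shared key. A minimal sketch, with entirely invented data, of attaching coordinates from one source to records from another:

```python
# Two independently maintained datasets, joined on a shared key (the city name).
# All data below is invented for illustration.
coordinates = {
    "Vancouver": (49.28, -123.12),
    "Toronto": (43.65, -79.38),
}

branch_hours = [
    {"city": "Vancouver", "branch": "Central", "hours": "9-9"},
    {"city": "Toronto", "branch": "Reference", "hours": "10-6"},
]

def mash_up(points, records, key="city"):
    """Attach lat/long from one dataset to records from another."""
    combined = []
    for record in records:
        lat, lon = points[record[key]]
        combined.append({**record, "lat": lat, "lon": lon})
    return combined

plotted = mash_up(coordinates, branch_hours)
```

Whether the join happens in a script like this or through a mapping API, the lightweight part is that neither dataset's owner had to anticipate the combination in advance.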

Friday, April 13, 2007

From Web 2.0 to Web 2.0

I've been reading up on Web 2.0-related material and have noticed that although the viewpoints appear somewhat disjointed, they nonetheless point to a consensus. Over the next while, I will be analyzing these differences and will come up with my own "theory" as to what Web 2.0 is (or is not). To start off, I'm going to compare two definitions of Web 2.0 from both ends of the chronological spectrum: an "old" entry written back in 2005 by Sharon Richardson, and a "new" one, Michael Habib's Master's thesis published in 2007 (covered in a later entry).

Here are Richardson's main points about Web 2.0.

(1) The Wisdom of the Crowds - Not only has the new web blurred the boundary between amateur and professional status; in a connected world, ordinary people often have access to better information than officials do. As an example, the collective intelligence of the evacuees of the World Trade Center towers saved numerous lives when they disobeyed authorities who told them to stay put.

(2) Digital Natives - Because a generation (mostly the under-25s) has grown up surrounded by developing technologies, those fully at home in a digital environment aren't worried about information overload; rather, they crave it.

(3) Internet Economics - Small is the new big. Unlike the past when publishing was controlled by publishers, Web 2.0's read/write web has opened up markets to a far bigger range of supply and demand. The amateur who writes one book has access to the same shelf space as the professional author.

(4) "Wirelessness" - Digital natives are less attached to computers and more interested in accessing information through mobile devices, when and where they need it. Hence, traditional client applications designed to run on a specific platform will struggle, if not disappear, in the long run.

(5) Who Will Rule? - This will be the ultimate question (and prize). As Richardson argues, whoever rules "may not even exist yet."

Wednesday, April 11, 2007

Introducing Google Maps (Personalized)

Google has just introduced a new feature that is eerily similar to what a simple mashup can do: in fact, it's pretty much what a mashup is. Using Google Maps, Google has saved us the trouble of API coding and programming and simply allowed us to personalize our own "Google Map." Give it a try: first, pick a theme (e.g. vacation spots you'd like to visit). Second, fill in the required information (addresses). Third, add photos, pictures, descriptions, anything you'd like, to "customize" your map. Voila! A personalized map that you can share with your clients, acquaintances, friends, family, and just about anyone else you can think of. Yes, this is Web 2.0: we're doing just fine, thank you very much.

Monday, April 02, 2007

Wrapping Up on Five Weeks

The Five Weeks to a Social Library course wrapped up on March 17, 2007. As the first free, grassroots, completely online course devoted to teaching librarians about social software and how to use it in their libraries, it provided a comprehensive and social online learning opportunity for librarians interested in learning more about Web 2.0 technologies.

The course’s strength was that it appropriately used social software to deliver much of its own content. Not only did it use a wiki (hosted on Drupal) as the platform for its main page, it also capitalized on blogs, allowing discussion virtually any time anyone felt like posting a message; this enabled two-way communication between participants and instructors. Moreover, presentations creatively combined streaming audio, text chat, and presentation slides, which truly made online learning an interactive experience. Even better, much of the material is archived, allowing anyone with an interest to take the course long after its completion, anytime and anywhere.

However, if the course has a drawback, it would be some of the content. For a more experienced user of social software, some of it might be considered somewhat basic. Because blogs, wikis, Flickr, and social bookmarking are becoming ever more commonly used, focusing too much on them runs the risk of narrowing social software to just these tools. The course could have placed more emphasis on “theory” and “concepts” such as collective intelligence, open platforms, and open collaboration than on the tools themselves.

With the advent of the so-called “Web 3.0,” or semantic web, blogs and wikis will come and go, likely replaced by newer and even more advanced tools in the near future. While the scope of Five Weeks to a Social Library was certainly limited by its relatively short time frame, and because it was meant as an introduction to social software for the novice, I felt the course could have provided a glimpse of things to come by offering a presentation or two from a more advanced “techie”, librarian or not, to underscore that we are only on the cusp of something great, and that the social software phenomenon will look vastly different even a year from now, let alone many years from now.

Nonetheless, the course was an ambitious first step toward providing the knowledge librarians need to advance the profession and provide better services for patrons. While it could have done more to maximize its potential, the course passes with flying colours in meeting the goals and expectations it set from Day 1, when it announced such an exciting venture: an entirely free online course for anyone interested in social software.

Sunday, April 01, 2007

Happy April Fool's

The good folks at Google pulled a wonderfully brilliant and witty April Fools' joke which certainly pulled my leg. The joke is Google TiSP, a new FREE in-home wireless broadband service complete with a self-installation kit: setup guide, fiber-optic cable, spindle, wireless router, and installation CD. What do you need? Simple: a toilet and a pair of gloves. Then you're ready for high-speed internet. According to Google TiSP, actual speeds will vary, depending on network traffic and sewer line conditions. Users with low-flow toilets may simultaneously experience a saving-the-environment glow and slower-data-speed blues. Here's more:

How can Google offer this service for free? We believe that all users deserve free, fast and sanitary online access. To offset the cost of providing the TiSP service, we use information gathered by discreet DNA sequencing of your personal bodily output to display online ads that are contextually relevant to your culinary preferences, current health status and likelihood of developing particular medical conditions going forward. Google also offers premium levels of service for a monthly fee.

Best of all, there is also great technical service support, including on-site technical support in the event of "backup problems, brownouts and data wipes."