I have ten bloggers that I follow. To me, they are the future of librarianship. Their blogs are not just random musings, but instead are thoughtful, reflective, and fresh in content. In my opinion, these blogging librarians represent the next generation of the profession: they are creative, technologically savvy, and passionate about what they do. You can easily tell that they do this because they truly love it.
(10) Blog on the Side - Darlene Fichter, Data Library Coordinator at the University of Saskatchewan, never fails to offer fascinating insight into the technological side of the information profession. Each post offers a little something different. Hence, it makes my top 10.
(9) McMaster University Library - This one's kind of unfair. A university librarian shouldn't be ranked so highly. (Doesn't he have better things to do?) And that's exactly why Jeffrey Trzeciak offers such an exciting blog. He gives us a glimpse of the inner workings of a university librarian's viewpoint. Indeed, there is management-speak, but hidden underneath the marketing and formality is a hub of fascinating ideas and a fabulous vision of Library 2.0.
(8) Michael Habib - I'm just astounded at how far Mr. Habib will go. The sky's the limit for this man, and he only finished his MLS a few months ago. I consider Habib to be one of the foremost experts on Library 2.0 theory, as he wrote his Master's thesis on it.
(7) Library Crunch - Michael Casey's blog about Library 2.0-related issues in LIS. Casey is the progenitor of the term "Library 2.0," and not surprisingly, his blog offers some of the most innovative insights into the profession.
(6) davidrothman.net - A highly technology-charged blog with superb insight into the latest medical library-related happenings.
(5) Vancouver Law Librarian - Humorous and enlightening, he offers mostly tech-related posts on the legal information profession.
(4) Meredith Farkas - One of the up-and-coming stars in the library world, Meredith has already published articles, contributes frequently to the blogosphere with thoughtful analysis, and even built the inaugural Five Weeks to a Social Library free online course for working librarians.
(3) Krafty Librarian - Michelle Kraft is, in my mind, one of the top health librarians in the field, and her blog posts demonstrate her knowledge of and passion for her profession. She also keeps very current on the technology side of her area of librarianship, which is a challenge, since hospital libraries, due to data privacy, are not often the most receptive places for technology.
(2) The Google Scholar Blog - This one's definitely a biased decision, but not, I think, overly so. I am confident that many will agree with me that the information in this blog serves not only the medical community but the information profession as a whole. The Google Scholar blogger is on sabbatical at the moment, but his year of material is worth the price of admission alone.
(1) Tame the Web - I rank according to the following criteria: visually creative interface; length of existence; originality of posts; and quantity/quality of feedback. Michael Stephens, a professor and professional librarian, has a blog that meets all of these criteria. It's definitely worth checking out.
I believe blogging is a new beginning for librarians. Whereas in the past, discourse was confined to monthly journal articles, which could only draw responses sporadically through conferences, workshops, and the occasional phone call, the blogosphere has transformed and leveled the playing field. Librarians are actually ahead of the game now; we can exchange our views within seconds. I'm proud to be a part of this profession, and excited about where it's going.
Sunday, April 29, 2007
Sunday, April 22, 2007
An Early Web 2.0 Definition
Although most people point to Tim O'Reilly's "What Is Web 2.0: Design Patterns and Business Models for the Next Generation of Software" as the foremost definition of Web 2.0, one article worth taking notice of is Paul Graham's "Web 2.0" from November 2005, just a month after O'Reilly's article came out. Shorter and simpler in scope than O'Reilly's biblical explanation, Graham's definition nonetheless offers an equally salient view of Web 2.0. Graham asserts that regardless of new technologies, there is a common thread: Web 2.0 means using the web the way it's meant to be used. The "trends" we're seeing now are simply the inherent nature of the web emerging from under the broken models that got imposed on it during the Dot-com Bubble.
Here are Graham's main points about Web 2.0:
(1) Ajax - Short for "Javascript now works," Ajax programming allows web-based applications to work much more like desktop ones. A whole new generation of software is being written to take advantage of Ajax. There hasn't been such a wave of new applications since microcomputers first appeared. Even Microsoft sees it, but it's too late for them to do anything more than leak "internal" documents designed to give the impression they're on top of this new trend.
(2) Democracy - Even amateurs can surpass professionals when they have the right kind of system to channel their efforts, whether in news or academic writing; Wikipedia may be the most famous example. The most dramatic example of Web 2.0 democracy is not in the selection of ideas, but in their production. The top links on aggregator sites are often links to individual people's sites rather than to magazine articles or news stories.
(3) Don't Maltreat Users - During the Bubble, a lot of popular sites were quite high-handed with users, and not just in obvious ways like making them register or subjecting them to annoying ads. The very design of the average site in the late 90s was an abuse. Many of the most popular sites were loaded with obtrusive branding that made them slow to load and sent the user the message: this is our site, not yours. Because sites were offering free things, companies felt they could make users jump through hoops of fire to get them. Web 2.0 frowns upon that mentality.
Saturday, April 21, 2007
BCLA Conference: Day #2
Day #2 of the BCLA Conference has just wrapped up. Once again, the sessions were fascinating and the catering first class. I'm having a wonderful time. The highlight of the day was the session on The Electronic Health Library of BC (eHLbc): Expanding Access to Health Information Trends. I thought I needed a break from Web 2.0, and luckily I took one, because this session reinforced the need for collaboration and cooperation in bringing users the best information services available. After all, as librarians, isn't it our duty to gather, organize, and disseminate the best information possible as quickly as possible? Libraries of the future are best served by collaborative action and the pooling of resources. British Columbia is only beginning to catch up: after over two years of assiduous effort by a working group of academic and health librarians, in partnership with the BC Academic Health Council, the innovative provincial database consortium known as the Electronic Health Library of BC (eHLbc) went live on April 1, 2006.
It was a particularly interesting session in that it provided an account of the process that brought the eHLbc vision to life: creating a request for proposals, forming steering and planning committees, and identifying the next steps being planned. In providing the entire BC academic and health care community with high-quality, cost-effective, equitable, and easily accessible health library resources that support and improve practice, education, and research, eHLbc appears to be taking a huge step for health practitioners.
Friday, April 20, 2007
BCLA Conference: Day #1
Day #1 of the British Columbia Library Association Conference, Beyond 20/20: Envisioning the Future, at the Hilton in Burnaby, BC has just wrapped up. The BCLA does a good job of cultivating the next generation of librarians and information specialists by offering volunteer work for paid conference hours: students at SLAIS and Langara's Library Tech program not only get valuable experience in behind-the-scenes organizing, but also get much-needed conference time that they otherwise likely couldn't afford.
Highlight of the day? Speaking with people from Andornot. In a twenty minute discussion, not only did I learn more about the consulting business, but also about the implementation of innovative technologies for library catalogues and databases. Andornot is a Vancouver, B.C. company that specializes in database design and application development, data conversion, search and report form design and optimization, web hosting, and training sessions.
What am I impressed about? Web 2.0 technologies. Rex Turgano, one of the consultants at Andornot, showed me some of the high-end (yet incredibly simple and straightforward) technologies that he uses not only for Andornot projects, but also for his own personal hobbies. He showed me how easily a blog service such as Blogger or Movable Type could be used as a full content management system. Hence, anyone with a little knowledge of HTML and some creativity can easily make the most of RSS feeds, a blog, and even a wiki, "mashing" them up into a homepage at very little cost.
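To make the mashup idea concrete, here is a minimal Python sketch of one piece of it: turning a blog's RSS feed into an HTML list that could be dropped into a homepage. The feed content, titles, and URLs below are invented for illustration; a real script would fetch the XML from a live feed URL.

```python
import xml.etree.ElementTree as ET

# A tiny sample RSS 2.0 feed; in practice this XML would be fetched
# from a blog's feed URL (e.g. with urllib.request).
SAMPLE_FEED = """<rss version="2.0"><channel>
  <title>My Library Blog</title>
  <item><title>Web 2.0 and Libraries</title><link>http://example.org/post1</link></item>
  <item><title>RSS for Beginners</title><link>http://example.org/post2</link></item>
</channel></rss>"""

def feed_to_html(feed_xml: str) -> str:
    """Turn an RSS feed into an HTML list suitable for embedding in a homepage."""
    channel = ET.fromstring(feed_xml).find("channel")
    items = [
        f'<li><a href="{item.findtext("link")}">{item.findtext("title")}</a></li>'
        for item in channel.findall("item")
    ]
    return "<ul>\n" + "\n".join(items) + "\n</ul>"

print(feed_to_html(SAMPLE_FEED))
```

A few lines like these, pasted into a page template, are essentially what a "mashed-up" homepage amounts to: someone else's feed, restyled as your own content.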
Saturday, April 14, 2007
A Master Definition
As promised, here's my analysis of Michael Habib's Master's thesis, "Toward Academic Library 2.0: Development and Application of a Library 2.0 Methodology," after a second, more careful reading. Habib astutely asserts that Web 2.0 has seven main concepts. Here they are:
(1) The Read/Write Web - A term describing the main difference between Old Media (newspaper, radio, and TV) and New Media (e.g. blogs, wikis, RSS feeds): the new Web is dynamic in that it allows consumers of the web to alter and add to the pages they visit; information flows in all directions.
(2) The Web as a Platform - Better known as "perpetual beta," the idea behind Web 2.0 services is that they are never finished: they are constantly updated, which includes experimenting with new features in a live environment to see how customers react.
(3) The Long Tail - The new Web lowers the barriers for publishing anything (including media) related to a specific interest because it empowers writers to connect directly with international audiences interested in extremely narrow topics, whereas originally it was difficult to publish a book related to a very specific interest because its audience would be too limited to justify the publisher's investment.
(4) Harnessing Collective Intelligence - Google, Amazon, and Wikipedia are good examples of how successful Web 2.0-centric companies use the collective intelligence of users in order to continually improve services based on user contributions. Google's PageRank examines how many links point to a page, and from what sites those links come, in order to determine its relevancy, instead of evaluating the relevance of websites based solely on their content.
(5) Network Effects - This concept explains why social technologies benefit from an economy in which a service gains value as more people join it. eBay is one example of how successfully this concept can be applied.
(6) Core Datasets from User Contributions - One way Web 2.0 companies collect unique datasets is through user contributions. However, collecting is only half the picture; using the datasets is the key. Contributions are organized into databases and analyzed to extract the collective intelligence hidden in the data, which can then be applied to the direct improvement of the website or web service.
(7) Lightweight Programming Models - The move toward database driven web services has been accompanied by new software development models that often lead to greater flexibility. In sharing and processing datasets between partners, this enables mashups and remixes of data. Google Maps is a common example as it allows people to combine its data and application with other geographic datasets and applications.
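Habib's point (4) on collective intelligence is easiest to appreciate in miniature. The sketch below is a toy version of the PageRank idea, not Google's actual implementation: each page's rank is fed by the pages that link to it, so heavily linked-to pages float to the top. The three-page link graph is invented for the example.

```python
def pagerank(links, damping=0.85, iterations=50):
    """Iteratively compute PageRank for a small link graph.

    links maps each page to the list of pages it links to.
    Every page here has at least one outlink, so rank mass is conserved.
    """
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}  # start with rank spread evenly
    for _ in range(iterations):
        new_rank = {}
        for p in pages:
            # A page's rank is the damped sum of rank flowing in
            # from each page that links to it.
            incoming = sum(rank[q] / len(links[q]) for q in pages if p in links[q])
            new_rank[p] = (1 - damping) / n + damping * incoming
        rank = new_rank
    return rank

# Hypothetical three-page web: a and b both link to c, so c should rank highest.
graph = {"a": ["c"], "b": ["c"], "c": ["a"]}
ranks = pagerank(graph)
```

The point of the exercise is the one Habib makes: relevance emerges from what users (here, page authors) collectively do, not from inspecting content.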
Friday, April 13, 2007
From Web 2.0 to Web 2.0
I've been reading up on Web 2.0-related material, and have noticed that although the viewpoints appear somewhat disjointed, they nonetheless point to a consensus. Over the next while, I will be analyzing these differences, and will come up with my own "theory" as to what Web 2.0 is (or is not). To start off, I'm going to compare two definitions of Web 2.0 from opposite ends of the chronological spectrum: an "old" entry written back in 2005 by Sharon Richardson, and a "new" one from Michael Habib's Master's thesis, published in 2007 (covered in a later entry).
Here are Richardson's main points about Web 2.0.
(1) The Wisdom of the Crowds - Not only has it blurred the boundary between amateur and professional status; in a connected world, ordinary people often have access to better information than officials do. As an example, the collective intelligence of the evacuees of the World Trade Center towers saved numerous lives when they disobeyed the authorities who told them to stay put.
(2) Digital Natives - Because a generation (mostly the under-25s) has grown up surrounded by developing technologies, those fully at home in a digital environment aren't worried about information overload; rather, they crave it.
(3) Internet Economics - Small is the new big. Unlike the past when publishing was controlled by publishers, Web 2.0's read/write web has opened up markets to a far bigger range of supply and demand. The amateur who writes one book has access to the same shelf space as the professional author.
(4) "Wirelessness" - Digital natives are less attached to computers and more interested in accessing information through mobile devices, when and where they need it. Hence, traditional client applications designed to run on a specific platform will struggle, if not disappear, in the long run.
(5) Who Will Rule? - This will be the ultimate question (and prize). As Richardson argues, whoever rules "may not even exist yet."
Wednesday, April 11, 2007
Introducing Google Maps (Personalized)
Google has just introduced a new feature that is eerily similar to what a simple mashup can do: in fact, it's pretty much what a mashup is. With Google Maps, Google has saved us the trouble of API coding and programming and simply allowed us to personalize our own "Google Map." Give it a try: first, you need a theme (e.g. vacation spots you'd like to visit). Second, fill in the required information (addresses). Third, add photos, pictures, and descriptions to "customize" your map. Voila! A personalized map that you can share with your clients, acquaintances, friends, family, and just about anyone else you can think of. Yes, this is Web 2.0: we're doing just fine, thank you very much.
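For anyone who would rather script a map than click one together, a rough sketch follows. It assumes you would build your placemarks as KML, the XML format that mapping tools, including Google's, can import; the place name and coordinates below are purely illustrative, not taken from any real map.

```python
import xml.etree.ElementTree as ET

def make_kml(places):
    """Build a minimal KML document from (name, description, lon, lat) tuples.

    KML places longitude before latitude in its coordinate strings.
    """
    kml = ET.Element("kml", xmlns="http://www.opengis.net/kml/2.2")
    doc = ET.SubElement(kml, "Document")
    for name, desc, lon, lat in places:
        pm = ET.SubElement(doc, "Placemark")
        ET.SubElement(pm, "name").text = name
        ET.SubElement(pm, "description").text = desc
        # Each Point carries a single "lon,lat" coordinate pair.
        ET.SubElement(ET.SubElement(pm, "Point"), "coordinates").text = f"{lon},{lat}"
    return ET.tostring(kml, encoding="unicode")

# Illustrative example: one placemark for a downtown library.
vacation = [("Vancouver Public Library", "Central branch", -123.1158, 49.2796)]
kml_text = make_kml(vacation)
```

The resulting file can be saved with a .kml extension and imported into a mapping tool, which is the scripted equivalent of the point-and-click personalization described above.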
Monday, April 02, 2007
Wrapping Up on Five Weeks
The Five Weeks to a Social Library course wrapped up on March 17, 2007. As the first free, grassroots, completely online course devoted to teaching librarians about social software and how to use it in their libraries, it provided a comprehensive and social online learning opportunity for librarians interested in learning more about Web 2.0 technologies.
The course’s strength was the very fact that it appropriately used social software to deliver much of its content. Not only did it use a wiki (hosted by Drupal) as the platform for its main page, it also capitalized on blogs, allowing discussion virtually anytime anyone felt like posting a message; this enabled two-way communication between participants and instructors. Moreover, presentations creatively combined streaming audio, text chat, and presentation slides, which truly made online learning an interactive experience. Even better, much of the material is archived, allowing anyone with an interest to take the course long after its completion, anytime and anywhere.
However, if the course has a drawback, it is some of the content. For a more experienced user of social software, some of it might be considered somewhat basic. Because blogs, wikis, Flickr, and social bookmarking are becoming ever more commonly used, focusing too heavily on them runs the risk of narrowing social software to just these tools. The course could have placed more emphasis on "theory" and "concepts" such as collective intelligence, open platforms, and open collaboration than on the tools themselves.
With the advent of the so-called “Web 3.0” or semantic web, blogs and wikis will come and go, likely replaced with newer and even more advanced tools in the near future. The scope of Five Weeks to a Social Library is certainly limited by its relatively short time frame and by its role as an introduction to social software for the novice. Even so, I felt the course could have offered a glimpse of things to come through a presentation or two from a more advanced “techie,” librarian or not, making the point that we are only on the cusp of something great and that the social software phenomenon will look vastly different even a year from now, let alone many years from now.
Nonetheless, the course was an ambitious first step toward providing the knowledge librarians need to advance the profession and provide better services for patrons. While it could have done more to maximize its potential, it passes with flying colours in meeting the goals and expectations it set from Day 1, when it announced such an exciting venture: an entirely free online course for anyone interested in social software.
Sunday, April 01, 2007
Happy April Fool's
The good folks at Google pulled a wonderfully brilliant and witty April Fools' joke which certainly pulled my leg. The joke is Google TiSP, a new FREE in-home wireless broadband service that comes with a self-installation kit: setup guide, fiber-optic cable, spindle, wireless router, and installation CD. What do you need? Simple: a toilet. And a pair of gloves. Then you're ready for high-speed internet. According to Google TiSP, actual speeds will vary, depending on network traffic and sewer line conditions. Users with low-flow toilets may simultaneously experience a saving-the-environment glow and slower-data-speed blues. Here's more:
How can Google offer this service for free? We believe that all users deserve free, fast and sanitary online access. To offset the cost of providing the TiSP service, we use information gathered by discreet DNA sequencing of your personal bodily output to display online ads that are contextually relevant to your culinary preferences, current health status and likelihood of developing particular medical conditions going forward. Google also offers premium levels of service for a monthly fee.
Best of all, there is also great technical service support, including on-site technical support in the event of "backup problems, brownouts and data wipes."