Saturday, March 14, 2009

The Search Continues . . . .

New Approach to Search is a must-read for those interested in search technology. Joe Weinman goes into the nitty-gritty of search algorithms, but boils them down into easily understandable (and fun) analogies for the layperson. As Weinman argues,

Search algorithms today are largely based on a common paradigm: link
analysis. But they've ignored a mother lode of data: The network.

Nicely said. Although there are a multitude of variations of search algorithms, architectures and tweaks, search technology has been based largely on three canonical approaches. In a nutshell, here they are:

1) Human-powered directories -
Hierarchically organized into taxonomies (e.g. Yahoo!)

2) Crawler-based index -
Generates results largely prioritized by link analysis. (e.g. Google)

3) Collaborative tagging -
Users tag pages with keywords so that future searchers can find
those pages by entering those tags (e.g. Technorati and Del.icio.us)

However, these three approaches still fail to prevent click fraud, and they leave content in the Deep Web unreachable. Weinman proposes network service providers as a fourth option: using data and metadata associated with the actual network transport of Web content—HTML pages, documents, spreadsheets, almost anything—to replace and/or augment traditional Web crawlers, improve the relevance and currency of search results ranking, and reduce click fraud. A network service provider could better determine aggregate surfing behavior and hold times at sites or pages, in a way sensitive to the peculiarities of browser preferences and regardless of whether a search engine is used.
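To make the contrast concrete, here is a minimal sketch (my own illustration, not Weinman's proposal or any search engine's actual code) of how a network-level signal might augment link analysis: a toy link-analysis score is blended with the aggregate dwell time a network provider could observe. The pages, numbers and weighting are all invented.

# Toy illustration (Python): blend a link-analysis score with a
# network-level "dwell time" signal to re-rank results. All data invented.
links = {
    "a.com": ["b.com", "c.com"],
    "b.com": ["c.com"],
    "c.com": ["a.com"],
}
# Aggregate seconds visitors spend on each page, as a network
# service provider might observe them (hypothetical numbers)
dwell_seconds = {"a.com": 12, "b.com": 95, "c.com": 40}

def link_scores(links, iterations=20, damping=0.85):
    # Very small PageRank-style score, for illustration only
    pages = list(links)
    score = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        new = {p: (1 - damping) / len(pages) for p in pages}
        for p, outs in links.items():
            share = score[p] / len(outs) if outs else 0
            for q in outs:
                new[q] += damping * share
        score = new
    return score

def blended_rank(links, dwell_seconds, weight=0.5):
    # Mix the link score with normalized dwell time (a made-up blend)
    ls = link_scores(links)
    max_dwell = max(dwell_seconds.values())
    blend = {p: (1 - weight) * ls[p] + weight * dwell_seconds[p] / max_dwell
             for p in links}
    return sorted(blend, key=blend.get, reverse=True)

print(blended_rank(links, dwell_seconds))  # pages ordered by blended score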

Weinman's proposal is an interesting deviation from the thinking of Semantic Web enthusiasts, and it throws a wrinkle into speculation about the future of Web search technology. And so the search continues . . .

Monday, March 09, 2009

Searching Search Like a Yandex

Let me introduce Yandex. It's an interesting search engine because it precedes Google. In fact, Yandex's roots go back to the late 1980s, before the advent of the Web. What is interesting is that Yandex is a classic case study showing that Google is not the be-all and end-all of search. Google may be good in English, but how does it fare in multilingual searching? (Remember: English is only a fraction of the Internet's languages.)

What is interesting is that Yandex's search algorithm is rooted in the highly inflected and very peculiar Russian language. Words can take on some 20 different endings to indicate their relationships to one another. As with many other non-English languages, this inflection makes Russian precise, but it makes search extremely difficult. Google fetches the exact word combination you enter into the search bar, leaving out slightly different forms that mean similar things. Yandex is unique in that it does catch the inflection. Fortune has written an interesting article on Yandex, and my favourite part is its examination of the unique features of this Russian search giant:

While some of its services are similar to offerings available in the U.S. (blog rankings, online banking), it also has developed some applications that only Russians can enjoy, such as an image search engine that eliminates repeated images, a portrait filter that ferrets out faces in an image search, and a real-time traffic report that taps into users' roving cellphone signals to monitor how quickly people are moving through crowded roads in more than a dozen Russian cities.
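As a rough illustration of why inflection matters for search (my own toy example in Python, not Yandex's actual algorithm), here is a naive morphological normalizer that strips a few common endings so different forms of a word map to the same stem. Real lemmatizers are far more sophisticated.

# Toy morphological normalizer: strip a few common Russian-style endings
# so inflected forms of a word match the same stem. Purely illustrative.
ENDINGS = ["ами", "ями", "ов", "ей", "ам", "ах", "ой", "ы", "и", "а", "у", "е"]

def stem(word):
    # Remove the longest matching ending, keeping at least three letters
    for ending in sorted(ENDINGS, key=len, reverse=True):
        if word.endswith(ending) and len(word) - len(ending) >= 3:
            return word[: -len(ending)]
    return word

docs = {
    1: "книга на столе",        # "the book is on the table"
    2: "много книг и газет",    # "many books and newspapers"
}

def search(query):
    # Match documents whose stemmed words include the stemmed query
    q = stem(query)
    return [doc_id for doc_id, text in docs.items()
            if q in {stem(w) for w in text.split()}]

print(search("книги"))  # matches both documents despite different endings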



Thursday, March 05, 2009

BBC's Semantic Web

BBC gets it. In the latest issue of Nodalities magazine (one of my favourite reads), the BBC reveals how it is applying a bottom-up approach in its contribution to realizing the SemWeb. To make this happen, its web programmers broke with BBC tradition by designing from the domain model up rather than the interface down. As they put it, "the domain model provided us with a set of objects (brands, series, episodes, versions, ondemands, broadcasts, etc.) and their sometimes tangled interrelationships."

This is exciting stuff. Without ever explicitly talking RDF, the team had built a site that complied with Tim Berners-Lee's four principles for Linked Data:

(1) Use URIs as names for things.

(2) Use HTTP URIs so that people can look up those names.

(3) When someone looks up a URI, provide useful information.

(4) Include links to other URIs.
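These principles are easy to see in action. Here is a minimal sketch, assuming the Python rdflib library and a publisher that serves RDF at its HTTP URIs (as the BBC's /programmes pages did); the URI shown is only an example and would need to be swapped for a live one.

# Minimal Linked Data lookup: dereference an HTTP URI and list the
# statements (triples) the publisher makes about it.
# Assumes rdflib and a URI that actually returns RDF.
from rdflib import Graph, URIRef

thing = URIRef("http://www.bbc.co.uk/programmes/b006q2x0#programme")  # example only

g = Graph()
g.parse("http://www.bbc.co.uk/programmes/b006q2x0.rdf")  # principle 2: look the name up over HTTP

# Principle 3: the lookup yields useful information about the thing...
for predicate, obj in g.predicate_objects(subject=thing):
    print(predicate, obj)
# ...and principle 4: some of those objects are themselves URIs,
# linking this resource to others that can be looked up in turn.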

In fact, as the BBC web developers argue, 
When considering how best to build websites, we'd recommend you throw out the Photoshop and embrace Domain Driven Design and the Linked Data approach every time. Even if you never intend to publish RDF, it just works. The longer-term aim of this work is not only to expose BBC data but to ensure that it is contextually linked to the wider web.
The idea is to set the data free: a web of data.

BBC Gets It.

Monday, February 23, 2009

Shame on You Wall Street Journal

It is regrettable indeed. I was deeply saddened and somewhat enraged by the Wall Street Journal's closing of its library. In an information age that depends so much on knowledge workers, the Journal has decided it can cut back by taking away a vital piece of its news operation: the gathering, organizing, and dissemination of up-to-the-minute information. Can news reporters be expected to do all that work themselves? Can they properly search for relevant and pertinent information? Is that even their job?

Could we insert librarians and information professionals into the jobs of news journalists? Of course not. Wall Street Journal, give your head a shake. A knowledge centre, particularly in a top-notch media giant such as the Journal, requires expert searchers. When asked, a spokesperson responded,

It is regrettable. Our reporters do have access to multiple databases including Factiva, and this migration to digital databases, as you know, has been happening for many years.

Sure. Good luck with having your reporters spend up to ten times as long finding information that a trained information professional could obtain in a fraction of the time. A librarian is the glue that holds the house together. You can only go so far and so long without a librarian's information retrieval skills before the infrastructure cracks and crumbles. Particularly in our emerging Web 2.0 world of social media and open access resources, can a company survive without expert information and knowledge management? Best of luck, Wall Street Journal.

Saturday, February 21, 2009

Video Sharing for Librarians




I recently presented at TOTS. What is video sharing? Why should we care? How can it be of use for information professionals? What are some issues for us to consider? Let's take a look together.

Monday, February 16, 2009

Who Video Shares? Barack Obama Does!




Who uses Web 2.0 to its fullest capacity? Barack Obama does. The President posts regularly to Vimeo. Vimeo is different in that it offers high-definition content: on October 17, 2007, Vimeo announced support for high-definition playback at 1280x720 (720p), becoming the first video-sharing site to support consumer HD.

Wednesday, February 11, 2009

Mashups at PSP 2009


I recently gave a presentation as part of a panel at the Association of American Publishers' Professional Scholarly Publishing (PSP) 2009 joint pre-conference with the National Library of Medicine, titled "MashUp at the Library: Managing Colliding User Needs, Technologies, and the Ability to Deliver." Here are the slides I used - any comments are most appreciated.

Thursday, January 29, 2009

Is YouTube the New Search?

Information professionals everywhere, take note: Google is uncomfortably sliding. Gone are the days when we simply 'google' for information. Now YouTube, conceived as a video hosting and sharing site, has become a bona fide search tool. Searches on it in the United States recently edged out those on Yahoo, which had long been the No. 2 search engine behind Google. Interesting that Google owns YouTube, isn't it? In November, Americans conducted nearly 2.8 billion searches on YouTube, about 200 million more than on Yahoo, according to comScore. Here is what one nine-year-old reveals about his information search behaviour in a New York Times article:

“I found some videos that gave me pretty good information about how it mates, how it survives, what it eats,” Tyler said. Similarly, when Tyler gets stuck on one of his favorite games on the Wii, he searches YouTube for tips on how to move forward. And when he wants to explore the ins and outs of collecting Bakugan Battle Brawlers cards, which are linked to a Japanese anime television series, he goes to YouTube again. . .

“When they don’t have really good results on YouTube, then I use Google."

What does this mean? Are Facebook, YouTube, and Twitter going to take down the venerable goliath Google? Not really. I argued in an article that this is the phenomenon of social search. Are things finally catching up?

Monday, January 26, 2009

Ushahidi as a Mashup



I'm going to be talking soon about mashups. (And getting nervous about it, too.) One mashup that I will be discussing is Ushahidi. It's an excellent example of how Web 2.0 is saving lives: using technology to harness peace. More to come. Here is an excellent slide show on Ushahidi.
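For a sense of what a crisis-mapping mashup does under the hood, here is a minimal sketch (my own illustration in Python, not Ushahidi's code) that takes citizen reports with coordinates and turns them into GeoJSON, the kind of feed a map layer can plot. The reports and field names are invented.

# Toy crisis-mapping mashup: convert citizen reports into GeoJSON
# that a web map (Google Maps, OpenLayers, etc.) could display.
import json

reports = [
    {"text": "Road blocked near market", "lat": -1.2921, "lon": 36.8219},
    {"text": "Clinic needs supplies", "lat": -1.3000, "lon": 36.8000},
]

def to_geojson(reports):
    # Wrap each report in a GeoJSON Feature with a Point geometry
    features = []
    for r in reports:
        features.append({
            "type": "Feature",
            "geometry": {"type": "Point", "coordinates": [r["lon"], r["lat"]]},
            "properties": {"description": r["text"]},
        })
    return {"type": "FeatureCollection", "features": features}

print(json.dumps(to_geojson(reports), indent=2))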

Wednesday, January 21, 2009

Nova on the Future of the Web

I heart Nova Spivack. The grandson of management professor Peter Drucker, Spivack is an intellectual in his own right. Not only is he a semantic web pioneer and technology visionary, he also founded Twine, one of the first semantic web services out there. I think he's one of the brightest minds today regarding the future of the Web. Here's a synopsis of Spivack's treatise, Future of the Desktop:

(1) The desktop of the future is going to be a hosted web service

(2) The browser is going to swallow up the desktop

(3) The focus of the desktop will shift from information to attention

(4) Users are going to shift from acting as librarians to acting as daytraders

(5) The Webtop will be more social and will leverage and integrate collective intelligence

(6) The desktop of the future is going to have powerful semantic search and social search capabilities built in

(7) Interactive shared spaces will replace folders

(8) The portable desktop

(9) The smart desktop

(10) Federated, open policies and permissions

(11) The personal cloud

(12) The WebOS (Web operating system)

(13) Who is most likely to own the future desktop?

Saturday, January 17, 2009

Topic Maps and the SemWeb

Half a year ago, I had a posting discussing Katherine Adams' seminal article about librarians and the SemWeb. Adams made a point about Topic Maps, which she believes will ultimately point the way to the next stage of the Web's development; they represent a new international standard (ISO 13250). In fact, even OCLC is looking to topic maps in its Dublin Core initiative to organize the Web by subject.

In the same posting, Steve Pepper, an independent researcher, writer and lecturer who has worked with open standards for structured information for over two decades, made a very interesting comment. He argues that:

What Topic Maps is really spearheading is nothing short of a paradigm shift in computing -- the notion of subject-centric computing -- which will affect far more than just the Web.

We've let programs, applications, and even documents occupy centre-stage for far too long. This is topsy-turvy: users are primarily interested in subjects (what the information is about), not how it was created or where it lives. We need to recognize this, and effect the same kind of change in information management that object-orientation effected in programming; hence the need for a subject-centric revolution.
Indeed, the Topic Maps 2008 Conference in Oslo, Norway, April 2-4 has just concluded. So what are topic maps, and why are they relevant for libraries and information organizations? The basic idea is simple: the organizing principle of information should not be where it lives or how it was created, but what it is about. Organize information by subject and it will be easier to integrate, reuse and share – and (not least) easier for users to find. The increased awareness of the importance of metadata and ontologies, the popularity of tagging, and a growing interest in semantic interoperability are part and parcel of the new trend towards subject-centric computing.

This conference brings together these disparate threads by focusing on an open international standard that is subject-centric to its very core: ISO 13250 Topic Maps, which is, interestingly, what Katherine Adams pointed out eight years ago. We're getting closer. The pieces are in place. We just need a good frame to put the picture together.
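To make the subject-centric idea concrete, here is a minimal sketch (my own toy model in Python, not an ISO 13250 implementation) of the three core Topic Maps constructs: topics, associations between them, and occurrences that point to where information about a subject lives. All names and URLs are invented.

# Toy topic map: subjects (topics), relationships (associations),
# and pointers to resources (occurrences). Illustrative only.
topics = {
    "puccini": {"name": "Giacomo Puccini", "type": "composer"},
    "tosca": {"name": "Tosca", "type": "opera"},
}
associations = [
    {"type": "composed-by", "roles": {"work": "tosca", "composer": "puccini"}},
]
occurrences = [
    {"topic": "tosca", "kind": "article", "locator": "http://example.org/tosca"},
]

def about(topic_id):
    # Gather everything the map says about one subject
    related = [a for a in associations if topic_id in a["roles"].values()]
    resources = [o["locator"] for o in occurrences if o["topic"] == topic_id]
    return {"topic": topics[topic_id], "associations": related, "occurrences": resources}

print(about("tosca"))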

Monday, January 12, 2009

hakia and Librarians' Race to End the Search Wars

I've always been intrigued by hakia, which is considered the first SemWeb search engine of its kind. It is said that for the next-generation web to exist, there needs to be a more precise way for users to find information on the web. hakia is working with librarians to help make its results even more credible in the attempt to win the race to oust Google in the current search engine wars.

Besides its QDEX (Query Detection and Extraction) technology, which indexes the Web using the SemanticRank algorithm (a solution mix drawn from ontological semantics, fuzzy logic, computational linguistics, and mathematics), hakia also relies on the subject knowledge of expert professionals. By combining technology and human expertise, it attempts to completely redefine the search process and experience. Take a look at my article, hakia, Search Engines, and Librarians: How Expert Searchers Are Building the Next Generation Web, for a deeper analysis of what hakia is trying to do with librarians. Hopefully it offers more food for thought.

Thursday, January 08, 2009

A New Web 2.0 Journal

Web 2.0 The Magazine: A Journal for Exploring New Internet Frontiers is an important new journal that librarians and information professionals should take a serious look at. It attempts to fill the information gap in the area of Web 2.0 by focusing on new developments, the most used tools, trends, and reviews of books, articles, sites, and systems themselves so as to make Web 2.0 a useful part of the reader’s technology experience. Here is what Web 2.0 The Magazine attempts to do:

Admittedly, Web 2.0 is a hard concept to get one’s arms totally around as it means anything involving “user content”. This broad definition covers everything from social networks, such as Facebook, to 3D Virtual Reality Worlds, such as Second Life and World of Warcraft, with many, many stops in between. The unifying feature in all of the Web 2.0 systems and tools is that they differ fundamentally from Web 1.0, which is a one-way connection, in which information sources, vendors, advertisers, etc. present information for the reader to consume and / or respond to (the fact that a user may choose to buy on-line from Amazon or Sears does not make those sites something other than Web 1.0 since the user was not the one to initiate the content).

Thanks Dean for recommending this journal to me. It's an excellent read so far.

Wednesday, December 31, 2008

Stephen Abram and the World of Libraries

Stephen Abram is a smart guy. The first time I heard him speak was at the 2008 SLA Conference in Seattle. He was brilliant, to say the least. Abram is almost everywhere you turn your head; he is a workaholic to the nth degree. An innovative librarian, he invests his energies in technology and trend forecasting, and he has more than 25 years in libraries as a practicing librarian and in the information industry. In other words, I trust this guy.

Abram has also been highly acclaimed, earning numerous awards and leadership positions. He was named by Library Journal in 2002 as one of the key people influencing the future of libraries and librarianship, and he has served as President of both the Canadian Library Association (CLA) and the Special Libraries Association (SLA). Here is a candid interview that Abram gave a year ago, in which he reveals that he had to apply twice to get into library school and explains how he learned the craft of public speaking.



Tuesday, December 30, 2008

Season's Greetings

Season's greetings, everyone. This holiday, as you are enjoying Christmas at home, please take some time to consider contributing to a worthwhile campaign: Books for Darfur Refugees, which gives $1 for each book received as a holiday gift. The website is http://holiday.bookwish.org/.

Books for Darfur Refugees would certainly appreciate your help in spreading the word, too. The campaign is 100% volunteer-staffed, and 100% of the funds it raises go to direct book-related aid for Darfur refugees. The good-news story here is the inspiration of the Darfuris who self-organized their own English classes in the refugee camps; they view learning English as their "road to freedom."

Since two shipments of specifically requested ESL books were sent to the camps in May 2008, the number of refugees learning English has jumped from 400 to 800 (as of July 2008) and now stands at more than 1,100! The campaign is partnering with the British NGO CORD, which runs education programs for UNHCR and UNICEF in the Bredjing, Treguine, and Gaga camps (60,000 refugees, about 20,000 students, about half of them girls).
The website (http://www.bookwish.org/) shows inspiring photos of the refugees smiling and holding up the ESL books that were sent to them. Happy New Year, everyone.

Monday, December 22, 2008

Professor Jerry Newman on Management

A while back, I wrote about a book I recommended as a must-read for those interested in management techniques and the ways people interact in a fast-paced workplace. The book, My Secret Life on the McJob, explores this: Jerry Newman, a management professor at SUNY Buffalo, conducted an experiment by working at seven stores over 14 months – two McDonald's, two Burger Kings, one Wendy's, one Arby's and one Krystal (a fast-food chain in the South) – with the stores located across America, in Michigan, Florida and New York. Newman worked all the jobs (grill, register, custodian) and observed and documented his experiences first-hand.

Newman's case study was so fascinating that I wanted to interview him and ask him more about his book and whether it applied to libraries, which often resemble retail fast-food chains in their frenetic pace with customers and the rigid tension between management and staff. Here is our interview:

Question: Libraries are every bit as dysfunctional as any organization. What can libraries learn from McJob? Is your book written for fast food and retail only? Does it apply to all?

Newman: The book is relevant to any organization that has multiple shifts in the course of a day, or that has multiple units within the organization. I think libraries qualify on both counts. The biggest problem in multi-unit operations (and this isn't just me speaking; McDonald's agrees with this) is inconsistency across time and units. To be great, first you must be consistent. This isn't always "sexy", hence the low interest.

Question: What can managers learn from your book? If there is one thing they can take away from your book, what would it be?
Newman:
  • Fast food jobs are HARD – both physically and mentally
  • These jobs provide opportunity to learn important life skills
    • Dealing with pressure situations
    • Communicating with peers
    • Managing conflict (with customers, peers)
  • Fast food is more representative of our country’s diversity and makeup than other industries
  • MOST INTERESTING: The store’s manager (and not corporate operations procedures and values) determines the climate and ultimately the success of the workplace

Question: What works? You had mentioned the four R's. What are they?

Newman:
  • Realism…People like predictability, set boundaries and expectations
  • Recognition…Be an ego-architect – reinforce self worth
  • Relationships…Build a social web, identify those employees that connect with others and use them to cultivate camaraderie among the troops
  • Rewards…Gold stars still work

Question: What were some challenges you found?

Newman: How to reward your employees when money is not an option.

  • Provide constructive feedback: Gold stars worked in elementary school, still work now
  • Recognize job proficiency by making an example of a strong employee
  • Offer flexible hours and job security
  • Facilitate social interaction – build a social web, make the work-place a fun-place to be
  • Advertise opportunities to advance
  • Build positive manager/employee relationships

Question: What are some key takeaways from your research in this book?

Newman:

  • Hiring decisions are key to store success and employee retention
  • Culture has the strongest impact on workers’ behavior – and managers are in control
  • Camaraderie and strong work ethic are a winning combination

Question: Were there any surprises during the extent of your experiences?

Newman:

  • Fast food is not an easy job
  • There is no forum for employee feedback, and unsolicited feedback on operations/best practices is not welcome
  • Wide disparities exist across stores – even those with the same name
  • Women are better managers
  • Recognition is a powerful motivator

Thursday, December 18, 2008

New Gen-Archivaria

Archival programs in North America are few and far between. With only a handful of programs available, the majority of them are narrowly focused on records management techniques. Unfortunately for social and cultural historians, this narrow approach has its limitations. Although archivists as a profession have worked side by side with historians through the ages, archival science is still a young academic field. As Alex Ben's Excluding Archival Silences: Oral History and Historical Absence argues,


archives remain, largely, material repositories of cultural memory. It is an accepted historical problematic, however, that culture is often resistant to material preservation. There exists an undeniable and profound tension between scholarly efforts to reconstruct history and interpret cultural traditions and the fragmentary, and often limited, material record. That is to say, scholarship is shaped by a sinuous negotiation around the historical silences that encompass all of material culture. Historical silences, however, can at times be marginalized (or at best excluded) by a sensitive configuration of material evidence with oral history.

The new-generation archivist should be motivated by the long-term preservation of moving images and by the invention of new paradigms for access to celluloid, tape, bits and bytes. This work should be rooted in historical, practical and theoretical study, and rather than limiting itself to one methodology, it needs to assign equal importance to heritage collections and emerging media types.

One example of an innovative way of recording the past is UBC's First Nations Studies Program and its oral history archive projects. In particular, the Interactive Video/Transcript Viewer (IVT) is a web-based tool that synchronizes a video with its transcript, so that as users play the video, the transcript updates automatically. In addition to searching a video's transcript for key words and phrases and then playing the video from that point, IVT includes a tool that allows users to create a playlist of clips from interviews for use in meetings. While it once took historians thousands of hours of transcription work, IVT keeps the transcript in step with the video in real time. These are the types of technologies archivists need to be aware of in order to create active archives. And this is where information professionals must be alert: anticipating the needs of their users.
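Here is a minimal sketch of the core idea behind a tool like IVT (my own illustration in Python, not UBC's actual code): store transcript segments with start times, search them for a phrase, and return the offset at which the video should start playing. The segment text and timestamps are invented.

# Toy transcript/video synchronizer: each transcript segment carries a
# start time, so a text search can jump the video to the right moment.
segments = [
    {"start": 0.0, "text": "My grandmother told this story every winter"},
    {"start": 42.5, "text": "The village moved to the river in spring"},
    {"start": 118.0, "text": "We fished at the river until the fall"},
]

def find_playback_points(query):
    # Return the start times of segments containing the query phrase
    q = query.lower()
    return [s["start"] for s in segments if q in s["text"].lower()]

def current_segment(position_seconds):
    # Return the transcript segment to highlight at a playback position
    current = segments[0]
    for s in segments:
        if s["start"] <= position_seconds:
            current = s
    return current["text"]

print(find_playback_points("river"))  # [42.5, 118.0]
print(current_segment(60))            # segment highlighted at the one-minute mark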

Monday, December 15, 2008

Web 2.0 and its Identity Crisis



Web 2.0 seems to be facing an identity crisis. We don't quite know what to do with it anymore. We're talking a lot about information overload. Web 2.0 is said to be passé. Web 2.0, in fact, might never have existed at all. (It's just a fabrication of the imagination.) Whatever Web 2.0 is, it's certainly an evolution of the World Wide Web, which is a reflection of human civilization. We live in a period of globalization, and the web is a manifestation of this. Take, for instance, Queen Rania, who is launching her presence on YouTube and maximizing the power of the social web through her video sharing.

On March 30, 2008, Queen Rania of Jordan launched her own channel on YouTube with a video in which she asked people to send her their questions about Islam and the Arab world until August 12, 2008 (International Youth Day). By responding to those questions and addressing stereotypes about Arabs and Muslims, Queen Rania hopes to help heal cultural misunderstandings. She continues to post daily videos on subjects including honor killings, terrorism and the rights of Arab women. Over the five-month conversation, her YouTube channel drew more than 3 million views. Her success shows us the power of social media. True, Web 2.0 might have a fractured identity, but it has left its imprint on globalization and our world.

Saturday, December 06, 2008

The Road to Web 3.0 for Librarians

Web 3.0 (Presentation)



Recently, I presented to a SLAIS class, LIBR 534: Health and Information Services. I gave a talk about Web 3.0, and more specifically, the continuum from Web 2.0 to Web 3.0. I strongly believe that the road to Web 3.0 is linear, and that in between lies the Semantic Web. While many use Web 3.0 and the Semantic Web interchangeably, I differentiate the two and contend that only by harnessing Web 2.0's social and collective collaboration and applying it with the Semantic Web's intelligent technologies can we realize the potential of Web 3.0.

Monday, December 01, 2008

Early Learning and Libraries

This is Malcolm Gladwell. His new book, Outliers, is an excellent read and in my opinion, confirms Gladwell as a public intellectual. His book makes a number of insightful findings, but perhaps the most mind-boggling is the argument that cultural heritage plays a strong part in a person's educational abilities. In his argument that Asians perform better at mathematics, Gladwell surmises that it is the inherited working culture of rice paddies which makes all the difference.

Perhaps most controversial is the assertion that upper-middle-class children often score better on standardized tests because their backgrounds allow for concerted cultivation of their abilities. It's the summer time that makes the difference. Rather than looking at test scores at one point in time, we need to look at scores over an entire year and examine the difference in improvement across that year. What we find is astonishing: the reason for the disparity between the social classes is that privileged children are given more resources to practice and study during the summer. Perhaps this is not surprising, as libraries play a huge role in the lives of young children. I certainly remember that as a young boy, I went to the local library often. (I only wish I had gone more, now that I know how much difference a summer makes.)

Libraries are seminal institutions in a child's early learning and educational experience. I like American Libraries' 12 Ways Libraries Are Good for the Country. It's an excellent thesis on why libraries are important to society:

1. Libraries inform citizens

2. Libraries break down boundaries

3. Libraries level the playing field

4. Libraries value the individual

5. Libraries nourish creativity

6. Libraries open kids’ minds

7. Libraries return high dividends

8. Libraries build communities

9. Libraries make families friendlier

10. Libraries offend everyone

11. Libraries offer sanctuary

12. Libraries preserve the past

Thursday, November 27, 2008

PR 2.0 for Information Pro's

Brian Solis, Principal of FutureWorks, an innovative Public Relations and New Media agency in Silicon Valley, along with Jesse Thomas of JESS3, has created a new graphic that helps chart online conversations between the people that populate communities as well as the networks that connect the Social Web. The Conversation Prism is free to use and share. It's their contribution to a new era of media education and literacy.

The conversation map is a living representation of how Social Media evolves as services and conversation channels emerge, fuse, and dissipate. As the authors ask philosophically, if a conversation takes place online and you're not there to hear or see it, did it actually happen? Indeed. Conversations are taking place with or without you, and this map will help you visualize the potential extent and pervasiveness of the online conversations that can impact and influence your business and brand.

As communications, service, and information professionals, we should find ourselves at the center of the prism, whether we are observing, listening or participating. Solis and Thomas' visual map is an excellent complement to The Essential Guide to Social Media and the Social Media Manifesto, which will help us all better understand how to listen and, in turn, participate in the Web 2.0 world. A new, braver world.

Monday, November 24, 2008

The Christmas gift from Malcolm Gladwell came early this year, and I just bought a copy. His new book, Outliers, is a magnificent read. In Outliers, Gladwell, the ever-curious mind, examines why some people succeed, living remarkably productive and impactful lives, while so many more never reach their potential. Analyzing historical nuances from Asian rice paddies to the birthdates of Canadian junior hockey players, Gladwell forces us to re-examine our cherished belief in the "self-made man" and throws out the long-held notion that "superstars" come from nowhere. Whatever their innate genius and talent, successful people are invariably the beneficiaries of hidden advantages, extraordinary opportunities and cultural legacies that allow them to learn, work hard and make sense of the world in ways others cannot.

While there are a plethora of intellectual points for discussion, 'practical intelligence' is, in my opinion, the key new term to take away from Gladwell's book. PQ is a term that psychologist Robert J. Sternberg proposed when he argued that there are three intelligences in human cognition:

(1) Analytical intelligence - the ability to analyze and evaluate ideas, solve problems and make decisions

(2) Creative intelligence - involves going beyond what is given to generate novel and interesting ideas

(3) Practical intelligence - the ability that individuals use to find the best fit between themselves and the demands of the environment.

The three intelligences, or as he also calls them, three abilities, comprise what Sternberg calls Successful Intelligence: "the integrated set of abilities needed to attain success in life, however an individual defines it, within his or her sociocultural context." While society tends to have bought into the idea that innate talent, as measured by devices such as IQ tests, can predict a person's success, Gladwell re-examines this piece of wisdom and argues otherwise. This book will be useful for anyone with a curiosity about success. It gives us a better, more complex inquiry into what fuels success. And it's not just about brains, you know.

Saturday, November 22, 2008

Calling All Librarians - Reference Extract

Calling all librarians: Reference Extract is coming to you. It is envisioned as a web search engine, like Google, Yahoo and MSN, but unlike other search engines, Reference Extract will be built for maximum credibility by relying on the expertise and credibility judgments of librarians from around the globe. Users enter a search term and get results weighted towards sites most often referred to by librarians at institutions such as the Library of Congress, the University of Washington, the State of Maryland, and over 1,400 libraries worldwide. The Reference Extract project is being developed by the Online Computer Library Center (OCLC) and the information schools of Syracuse University and the University of Washington. With a $100,000 grant from the John D. and Catherine T. MacArthur Foundation, Reference Extract strives to build the foundation necessary to implement it as a large-scale, general user service.
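The weighting idea is easy to sketch. Below is a toy illustration in Python (my own, not OCLC's implementation) of ranking results by how often librarian-curated sources refer to each site; the domains and referral counts are invented.

# Toy "librarian-weighted" ranking: results are ordered by how often
# librarian reference transactions cite each domain. Counts are invented.
librarian_referrals = {
    "loc.gov": 120,
    "nih.gov": 95,
    "example-blog.com": 2,
}

def rank(candidate_domains):
    # Sort candidate result domains by librarian referral count
    return sorted(candidate_domains,
                  key=lambda domain: librarian_referrals.get(domain, 0),
                  reverse=True)

print(rank(["example-blog.com", "nih.gov", "loc.gov"]))
# ['loc.gov', 'nih.gov', 'example-blog.com']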

My thoughts? It's not unlike similar attempts to outdo Google. Have you heard of RefSeek? RefSeek does not claim to offer more results than Google; instead, it strips out any results not related to science, research and academia. It differs from Google Scholar in that it indexes documents that include web pages, books, encyclopedias, journals, and newspapers. It also surfaces more results from .edu and .org sites, as well as various online encyclopedias such as Wikipedia and Answers.com. With RefSeek and Reference Extract, are we getting much of the same, just in a different shape and size? We'll see...

Sunday, November 16, 2008

My Secret Life on the McJob

If there is one book you need to read this Christmas holiday, make it My Secret Life on the McJob. I just couldn't put it down after picking it up from my local library. My first position out of university was at a big-box retail bookstore, and it was tough. (It brings back haunting memories that resonate today.) Retail is tough. And Professor Jerry Newman of the University at Buffalo, State University of New York, explains this in pristine detail: he worked undercover in the lowest-rung, minimum-wage labour world of fast-food restaurants to reveal insightful, and at times disturbing, practices in retail culture.

In my opinion, My Secret Life on the McJob represents a paradigm shift in the fieldwork analysis of organizations. Too often, Library and Information Science educators are narrowly confined to questionnaires and quantitative analyses, churning out generic, boring, and unusable data about user statistics. Instead of viewing from the top down, Newman does the exact opposite. He turns a stunted methodology of interviewing and statistical analysis on its head by making a personal sacrifice (physical risk included): experiencing the problems and flaws of organizational behaviour first-hand as a covert fast-food worker. What does he discover? The inefficiencies of retail, fast food, and traditional hierarchical management techniques passed down from the Ford assembly-line era are not working in our globalized, mobile-workforce era.

What Newman forces us to recognize about our workplaces is that people are important. It's about the people. Good ideas come from the front lines. This applies not only to the retail world, but to businesses of any kind, and especially to libraries.

Friday, November 14, 2008

Are Libraries Knowledge Cafes?

World Cafés are an emerging phenomenon. A World Café is a conversational process based on an innovative yet simple methodology for hosting conversations about questions that matter. These conversations link and build on each other as people move between groups, cross-pollinate ideas, and discover new insights into the questions or issues that are most important in their life, work, or community. As a process, the World Café nurtures the collective intelligence of any group, and in doing so increases people's capacity for effective action in pursuit of common or similar aims.

The World Café also refers to a living network of conversations that is continually co-evolving as we explore questions that matter with our family, friends, colleagues, and community. In helping us notice these invisible webs of dialogue and personal relationships that enable us to learn, create shared purpose, and shape life-affirming futures together, the metaphor of the "World as Café" points to a growing global community of people, groups, organizations, and networks using World Café principles and processes to harness the wisdom of crowds.

As information professionals and librarians, we need to take notice of such trends and see how they can be applied in our own work spaces. Many knowledge managers today are introducing what they call knowledge commons, in which employees can chat freely among themselves as they come and go during the day. As a result, this space becomes a knowledge hub where the gossip, conversation, and useful ideas normally trapped within the confines of cubicle and office walls are broken free and released into the workplace, making for a healthy work culture and environment.

In a way, this happens every day through Web 2.0 technologies: social networking and instant messaging programs such as Facebook and Twitter, and blogs. Employers, especially of knowledge workers, must find a way to integrate this into their working spaces. In my opinion, libraries and information centres need to look towards the knowledge café model: libraries must become information cafés and act less as gatekeepers of information.

Monday, November 10, 2008

Web 2.0 Publishing

In Vancouver, there are two publications with very divergent approaches not only to Asian Canadian issues, but also to the use of media and the web. Ricepaper Magazine, established in 1994 as a forum for up-and-coming Asian writers and artists in Canada, limits its definition of "Asian" to Pacific Rim ethnic groups: Chinese, Japanese, and Korean. Focusing mainly on writers of these ethnic origins, Ricepaper depends largely on its quarterly print publication as its main point of distribution and has a very static website with limited updates.

Contrast that with Schema Magazine. Schema strives to reflect the most culturally mobile and diverse generation of Canadians, the generation it coins "cultural navigators," and showcases their unique sensibilities, interests and pursuit of ethnic cool. As Schema's coverage of the Vancouver Asian Film Festival shows, its focus on "Asian" is broad and widely interpretable. Schema also uses Web 2.0 technologies as its main channel of communication: not only does it use a content management system for its webpage, it also has a YouTube channel for Schema's interviews.

The two rival Asian Canadian organizations offer an insightful examination of the changing landscape of media and publishing. Staff-wise, both are similar; yet when it comes to coverage and reach of audience, Web 2.0 simply wins out.

Friday, October 31, 2008

Web 3.0 in the Era of Pledging

Are you ready to be tracked, monitored, and followed? Every step of the way? Well, you had better get ready. That, I predict, is what Web 3.0 technology will be about. That's where we're going, and that's where we'll be. But is PledgeBank a Web 3.0 service?

PledgeBank is a service that helps people get things done, especially things that require several people. How does it do this? Heather Cronk argues in Pushing Towards Web 3.0 Organizing Tools that PledgeBank.com is a Web 3.0 tool. That couldn't be farther from the truth. If it looks like Web 2.0, smells like Web 2.0, and quacks like Web 2.0 . . . then it's likely Web 2.0. Which is exactly what PledgeBank.com is. No matter how many ways you analyze it and dissect its features, it's simply an aggregated social networking engine. Perhaps not even that.

PledgeBank allows users to set up pledges and then encourages other people to sign up to them. A pledge is a statement of the form 'I will do something, if a certain number of people will help me do it'. The creator of the pledge then publicises their pledge and encourages people to sign up. Two outcomes are possible – either the pledge fails to get enough subscribers before it expires (in which case, we contact everyone and tell them 'better luck next time'), or, the better possibility, the pledge attracts enough people that they are all sent a message saying 'Well done—now get going!'
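The mechanics are simple enough to sketch. Here is a toy model in Python (my own illustration, not PledgeBank's code) of a pledge with a signer threshold and the two possible outcomes described above; the names, target and dates are invented.

# Toy pledge model: a pledge succeeds only if enough people sign up
# before it expires.
from dataclasses import dataclass, field
from datetime import date

@dataclass
class Pledge:
    statement: str      # "I will do X, if N people will help me do it"
    target: int         # number of signers needed
    expires: date
    signers: list = field(default_factory=list)

    def sign(self, person):
        self.signers.append(person)

    def outcome(self, today):
        # The message everyone receives once the pledge closes
        if len(self.signers) >= self.target:
            return "Well done - now get going!"
        if today > self.expires:
            return "Better luck next time."
        return "Still open."

pledge = Pledge("I will start a book club, if 3 people will join me",
                target=3, expires=date(2009, 1, 31))
for person in ["Ana", "Ben", "Chen"]:
    pledge.sign(person)
print(pledge.outcome(date(2009, 2, 1)))  # the pledge met its target, so: "Well done - now get going!"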

That's not Web 3.0. That's simply wishful thinking. Web 3.0 is about third-generation web computing. It's about the webtop. It's about digital outreach in its purest form. It's about having the intelligent web at your fingertips, with your settings uniquely tailored to you. It's beyond the Web and into our daily lives. Something that PledgeBank simply is not. So . . . back to the drawing board . . .

Saturday, October 25, 2008

Information Architecture for LIS Educators

I remember back in LIS school how a prof told the class that LIS was no longer 'library' school. We didn't need to think so narrowly about working in physical 'libraries.' I didn't quite believe him, and didn't quite understand what options were available for someone with an LIS degree who wanted to pursue other fields. I ended up in an academic library, but that's because I enjoy the university environment and still get to play with emerging technologies in my position. But what my prof said is true. A friend of mine is currently working in Japan as a virtual librarian for a North American-based company. I think it goes to show that the world-is-flat theory is even more true: wirelessness is enabling the world to communicate and collaborate in ways never imaginable before. This is where information professionals come in. Take a look at the job description below. It's a perfect fit for an LIS grad who has the skills, flexibility, and foresight to go far.

Interactive Information Architect - Carlson Marketing Canada - Toronto

As an Interactive Information Architect (IA), you will be responsible for designing new and enhanced functionality for new and existing Client sites, with an emphasis on usability. The role requires well-demonstrated skills in interaction design, solidly informed by usability principles, user interface design standards, and best practices. To be successful, you must quickly understand current applications and new requirements, be able to derive the IA from documented functional requirements, and collaborate with fellow designers, account managers and programmers. Multiple stakeholders will have input and feedback on design output. Expect work to be highly interactive.

Responsibilities:

(1) Must communicate clearly and effectively; strong analytical and oral communication skills, able to collaborate actively with cross-functional teams.

(2) Must be organized, independent, and able to switch rapidly between different projects in a fast-paced and exciting environment.

(3) Must be able to develop new approaches to complex design problems and meet aggressive deadlines.
(4) Must have an eye for detail and can put ideas into a tangible form.

Requirements:
(1) Must have experience in E-commerce, custom application development, brand sites and consumer promotional environments (game theory background an asset)
(2) Thorough knowledge of the web site design process: creative brief, user interface design, task modeling, wire frame and user flow diagramming, usability testing, etc. Be prepared to show interim deliverables, rather than final work in the interview process.
(3) Proven skills in information synthesis, conceptual modeling, task modeling, UI design principles, human factors, User Centered Design, interaction design, usability methodologies, industry standards and trends, platform standards, and software development process.
(4) Strong understanding and experience with HTML, Java, JavaScript, Flash, Adobe InDesign, Adobe Illustrator, Microsoft Visio, Dreamweaver, Axure
(5) Capable of adhering to project schedules and effectively tracking progress to meet challenging deadlines and corporate initiatives.
(6) Ability to work both independently and as part of a team
(7) Proven track record of successful IA deliverables.
(8) Designing for wireless devices a plus

Monday, October 20, 2008

Calling all Librarians and Info Pro's

Calling all those who want to make a difference in this up and coming new Web. Now's your chance to say what you need to say. I don't usually make announcements, but this is one worth the call.

"SemTech 2009:
THE SEMANTIC TECHNOLOGY CONFERENCE"

June 14-18, 2009
Fairmont Hotel, San Jose, California

CALL FOR PRESENTATIONS

Start the Submission Process...

Interested practitioners, developers and researchers are hereby invited to present a paper at the fifth annual conference focused on the application of Semantic Technologies to Information Systems and the Web. The event will be held on June 14-18, 2009 at the Fairmont Hotel in San Jose, California.

The conference will comprise multiple educational sessions, including tutorials, technical topics, business topics, and case studies. We are particularly seeking presentations on currently implemented applications of semantic technology in both the enterprise and internet environments.

A number of appropriate topic areas are identified below. Speakers are invited to offer additional topic areas related to the subject of Semantic Technology if they see fit.

The conference is designed to maximize cross-fertilization between those who are building semantically-based products and those who are implementing them. Therefore, we will consider research and/or academic treatments, vendor and/or analyst reports on the state of the commercial marketplace, and case study presentations from developers and corporate users. For some topics we will include introductory tutorials.

The conference is produced by Semantic Universe, a joint venture of Wilshire Conferences, Inc. and Semantic Arts, Inc.

Audience

The 2008 conference drew over 1000 attendees. We expect to increase that attendance in 2009. The attendees, most of whom were senior and mid-level managers, came from a wide range of industries and disciplines. About half were new to Semantics, and we expect that ratio to be the same this year. When you respond, indicate whether your presentation is appropriate for those new to the field or only for experienced practitioners, and whether it is more technical or business-focused (we're looking for a mix).

Tracks (Topic Areas)

The conference program will include 60-minute, six-hour, and three-hour presentations on the following topics:

Business and Marketplace
Industry trends, market outlook, business and investment opportunities.

Collaboration and Social Networks
Leveraging Web 2.0 in semantic systems. FOAF, Semantically-Interlinked Online Communities (SIOC), wikis, tagging, folksonomies, data portability.

Data Integration and Mashups
Web-scale data integration, semantic mashups, disparate data access, scalability, database requirements, Linked Data, data transformations, XML.

Developing Semantic Applications
Experience reports or prototypes of specific applications that demonstrate automated semantic inference. Frameworks, platforms, and tools used could include: Wikis, Jena, Redland, JADE, NetKernel, OWL API, RDF, GRDDL, Ruby On Rails, AJAX, JSON, Microformats, Process Specification Language (PSL), Atom, Yahoo! Pipes, Freebase, Powerset, and Twine.

Foundational Topics
This will include the basics of Semantic Technology for the beginner and/or business user including knowledge representation, open world reasoning, logical theory, inference engines, formal semantics, ontologies, taxonomies, folksonomies, vocabularies, assertions, triples, description logic, semantic models.

Knowledge Engineering and Management
Knowledge management concepts, knowledge acquisition, organization and use, building knowledge apps, artificial intelligence.

Ontologies and Ontology Concepts
Ontology definitions, reasoning, upper ontologies, formal ontologies, ontology standards, linking and reuse of ontologies, and ontology design principles.

Semantic Case Studies and Web 3.0
Report on applications that use explicit semantic information to change their appearance or behavior, aka "dynamic apps". Web 3.0 applications. Consumer apps, business apps, research apps.

Semantic Integration
Includes semantic enhancement of Web services, standards such as OWL/S, WSDL/S, WSMO and USDL, semantic brokers.

Semantic Query
Advances in semantically-based federated query, query languages such as SWRL, SPARQL, query performance, faceted query, triple stores, scalability issues.

Semantic Rules
Business Rules, logic programming, production rules, Prolog-like systems, use of Horn rules, inference rules, RuleML, Semantics of Business Vocabulary and Business Rules (SBVR).

Semantic Search
Different approaches to semantic search in the enterprise and on the web, successful application examples, tools (such as Sesame), performance and relevance/accuracy measures, natural language search, faceted search, visualization.

Semantic SOA (Service Oriented Architectures)
Semantic requirements within SOA, message models and design, canonical model development, defining service contracts, shared business services, discovery processes.

Semantic Web
OWL/RDF and Semantic Web rule and query languages such as SWRL, SPARQL and the like. Includes linked data. Also progress of policy and trust.

Semantics for Enterprise Information Management (EIM)
Where and how semantic technology can be used in Enterprise Information Management. Applications such as governance, data quality, decision automation, reporting, publishing, search, enterprise ontologies.

Business Ontologies
Design and deployment methods, best practices, industry-specific ontologies, case studies, ontology-based application development, ontology design tools, ontology-based integration.

Taxonomies
Design and development approaches, tools, underlying disciplines for practitioners, vocabularies, taxonomy representation, taxonomy integration, relationship to ontologies.

Unstructured Information
This will include entity extraction, Natural Language Processing, social tagging, content aggregation, knowledge extraction, metadata acquisition, text analytics, content and document management, multi-language processing, GRDDL.

Other
You are welcome to suggest other topic areas.

Key Dates & Speaker Deadlines

Proposal submissions due: November 24, 2008 (all proposals must be submitted via the online Call for Papers process)
Speakers notified of selection: December 16, 2008
Speaker PowerPoint files due: May 18, 2009

Wednesday, October 15, 2008

Hakia and the Semantic Search

Good for you, Hakia. Don't try to beat Google at its own game. Make your own rules instead. Collaborate with librarians.

Sunday, October 12, 2008

Talis' Integration




Talis is an innovator of information technologies for libraries. Richard Wallis, of Panlibus and a contributor to Nodalities' podcasts, explains how Talis can easily integrate its APIs into applications.

Thursday, October 09, 2008

Financial Crisis 2.0

I am re-reading Thomas Friedman's The World Is Flat. With the recent financial crisis, this is an appropriate time to examine the world's political and economic infrastructure. Fortuitously, Yihong Ding has written an interesting entry on the financial crisis' effect on China and the United States. Ding, who is not only a computer scientist but also a philosopher, historian, and political commentator, offers a unique blend of intellectualism and insight in arguing that the crisis in the markets will actually benefit the US while hurting the Chinese economy. As someone who is deeply interested in Chinese history, I am intrigued by Ding's insight, particularly with regard to how the financial crisis is interconnected with Web 2.0 and technology. He points out that:

By studying the dot-com bubble, researchers have found that the optical network built during the hype period had become the foundation of the following economic boom at the Web industry, namely the Web 2.0 hype. Without the investment of these optical networks and without the bankrupt of the original optical network investors, we were not able to obtain the cheap price of network usage which is an essential reason behind the Web 2.0 hype. By this mean, it was the IT crisis that constructed the foundation of the new Web-based industry. . .

. . . In comparison we may watch China. The future is, however, not optimistic at all because of this financial crisis. The deep drop of the stock market will greatly hurt the industrial innovation. Moreover, western investors are going to invade China on its debt market and real estate market to cause severe economic inflation in China. As we have discussed, the high price of real estate in China will hurt the formation of Chinese Web-based small businesses. As the result, the technological distance between USA and China will not decrease but increase. As a Chinese myself, I am quite sad on this prediction of the future. However, be honest I would say that it is the future most likely to happen.

Friedman's thesis stands in stark contrast to the contentions of Ding and Chinese economist Junluo Liu. According to the flat-world premise, developing countries such as India and China are quickly catching up to the US thanks to their increasingly educated and dedicated workforces. Entrepreneurs, particularly in the wireless telecommunications industries, no longer require real estate; everything can be done remotely in the era of Globalization 3.0. Indian entrepreneurs are very happy to stay in Bombay as America supplies them with outsourced work. True, nothing can replace land; but then again, nothing can replace talent and creativity.

China had fallen behind after ten years of a disastrous Cultural Revolution, on top of being trampled by a century of civil war and foreign invasion. But the past is behind us. With a workforce that continues to grow not only in talent but also in fierce nationalism, can China overcome this looming crisis?

Monday, October 06, 2008

Project10X is a Washington, DC-based research consultancy specializing in next-wave semantic technologies, solutions, and business models. The firm's clients include technology manufacturers, Global 2000 corporations, government agencies, and Web 3.0 start-ups. The semantic wave embraces four stages of internet growth. The first stage, Web 1.0, was about connecting information and getting on the net. Web 2.0 is about connecting people — putting the "I" in user interface, and the "we" into a web of social participation. The next stage, Web 3.0, is starting now. It is about representing meanings, connecting knowledge, and putting them to work in ways that make our experience of the internet more relevant, useful, and enjoyable. Web 4.0 will come later. It is about connecting intelligences in a ubiquitous web where both people and things can reason and communicate together.

Over the next decade, the semantic wave will spawn multi-billion-dollar technology markets that drive trillion-dollar global economic expansion and transform industries as well as our experience of the internet. Drivers and market forces for the adoption of semantic technologies in Web 3.0 are building. Project10X has come out with Semantic Wave 2008: Industry Roadmap to Web 3.0, and its executive summary is worth a read.

Friday, October 03, 2008

The Hakia Question

Calling all librarians: the Semantic Web is looking for you. Everyone on board! Right? Well, maybe. My colleague, The Google Scholar, has mixed feelings about hakia's call for free service from librarians and information professionals. And he has a right to. hakia's call is suspiciously similar to Google's asking librarians to help with Google Co-op, and not surprisingly, that failed miserably.

But at the same time, I see it as an opportunity for librarians to make a case for their expertise in information retrieval. We can keep quiet and let others do the work for us, but that only leads to further marginalization. And we'll be left out again, as we were with Web 2.0.

What we librarians should do is not only learn about the SemWeb and come up with solutions, but also offer our knowledge and recommendations, as librarians do in their everyday work. If search engine companies are intelligent enough to realize the importance of what librarians offer in search and information retrieval, they'll realize librarians are partners in this race to the SemWeb. Librarians must step up to the plate; it's an opportunity, and not one to take lightly either. Here is what hakia has issued:

Yesterday we issued an open call to librarians and information professionals for credible Website submissions at the WebSearch University in Washington D.C. We are glad to report that the immediate feedback is overwhelmingly positive.

Currently, hakia is generating credibility-stamped results for health and medical searches to guide users towards credible Web content. These results come from credible Websites vetted by the Medical Library Association. For an example of a credibility-stamped result, search for "What causes heart disease?" and mouse over the top search results. We are now aiming to expand our coverage to all topics.

Librarians and information professionals can now suggest URLs of credible Websites on a given topic by joining the hClub. Our definition of a credible site is transparent: a credible site fulfills most of the following criteria:

Peer review. The publisher of the site must have a peer review process or strict editorial controls to ensure the accuracy, dependability and merit of the published information. Most government institutions, academic journals, and news channels have such review mechanisms in place.
No commercial bias. The publisher of the site shall have no commercial intent or bias. For example, for travel-related recommendations, consider the U.S. Department of State travel portal rather than Travelocity.
Currency. The information on the site should be current and links should be working.
Source authenticity. The publisher (preferably) should be the owner/producer of the content.

Upon submission, hakia will process the suggested sites with QDEX (Query Detection and Extraction) technology and make them available to Web searchers in credibility-stamped search results. Each month we will give away thank-you prizes, ranging from a book donation to two conference grants, to participants. To learn more or suggest credible Web sites, please visit http://club.hakia.com/lib/

We are looking forward to hearing your feedback! This is just the beginning of a long journey.
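
Out of curiosity, here is a rough, hypothetical sketch of how criteria like hakia's might be encoded as a simple checklist score for a submitted site. The field names and the one-point-per-criterion scoring are my own invention; this says nothing about how hakia's QDEX technology actually works.

```python
# Hypothetical checklist scoring of a submitted site against four credibility
# criteria (peer review, no commercial bias, currency, source authenticity).
# All names and weights are illustrative only.
from dataclasses import dataclass

@dataclass
class SiteSubmission:
    url: str
    peer_reviewed: bool       # editorial or peer-review process in place
    commercial_bias: bool     # publisher has a commercial intent or bias
    last_updated_days: int    # rough proxy for currency
    original_publisher: bool  # publisher owns/produces the content

def credibility_score(site: SiteSubmission) -> int:
    """Count how many of the four criteria the site satisfies."""
    score = 0
    score += site.peer_reviewed
    score += not site.commercial_bias
    score += site.last_updated_days <= 365
    score += site.original_publisher
    return score

# Example: a government travel portal would satisfy most criteria.
portal = SiteSubmission("https://travel.state.gov", True, False, 30, True)
print(credibility_score(portal))  # -> 4
```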

Friday, September 26, 2008

The Future of the Semantic Web . . . Is Here?

The future of the Web is here. Based in Los Angeles, Cognition Technologies has developed innovative Semantic Natural Language Processing (NLP) technology which adds word and phrase meaning and "understanding" to computer applications, enabling them to be more human-like in their processing of information. Applications and technologies which utilize Cognition's Semantic NLP(TM) technology are positioned to take full advantage of Web 3.0 (the Semantic Web).

Market Watch has published an interesting article, Cognition Creates World's Largest Semantic Map of the English Language With More Than 10 Million Semantic Connections, discussing Cognition Technologies' release of the largest commercially available Semantic Map of the English language. The scope of Cognition's Semantic Map is more than double that of any other computational linguistic dictionary for English, and it includes over 10 million semantic connections comprising semantic contexts, meaning representations, taxonomy and word-meaning distinctions. Technologies incorporating Cognition's Semantic Map will be able to provide users with more accurate and complete Search capabilities and the ability to personalize and filter content, and they will improve the user experience by significantly reducing the amount of irrelevant information presented. Cognition Technologies' lexical resources encode a wealth of semantic, morphological and syntactic information about the words contained within documents and their relationships to each other. These resources were created, codified and reviewed by lexicographers and linguists over a span of 24 years.

Cognition's Semantic Map provides software applications with an "understanding" of more than four million semantic contexts (word meanings that create contexts for specific meanings of other related words). It encompasses over 536,000 word senses (word and phrase meanings); 75,000 concept classes (or synonym classes of word meanings); 7,500 nodes in the technology's ontology, or classification scheme; and 506,000 word stems (roots of words) for the English language. This enables applications to gain a more accurate and relevant understanding of content and user interaction, and it can be deployed in a wide variety of markets, including Search, Web-based advertising and machine-translation augmentation, to name just a few.
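
To get a feel for what numbers like these imply, here is a toy sketch of the kind of structure a semantic map suggests: word senses grouped under concept classes and ontology nodes, with context words used to disambiguate. The entries are invented; Cognition's actual Semantic Map is proprietary and vastly larger.

```python
# Toy illustration of a semantic map: each word maps to several senses, each
# sense belongs to a concept class and an ontology node, and context words
# help pick the right sense. Entries are invented for this sketch.
SEMANTIC_MAP = {
    "bank": [
        {"sense": "bank/finance", "concept_class": "financial_institution",
         "ontology_node": "organization",
         "context_words": {"loan", "deposit", "interest"}},
        {"sense": "bank/river", "concept_class": "landform",
         "ontology_node": "geography",
         "context_words": {"river", "shore", "erosion"}},
    ],
}

def disambiguate(word: str, context: set[str]) -> str:
    """Pick the word sense whose context words overlap most with the query."""
    senses = SEMANTIC_MAP.get(word, [])
    if not senses:
        return word
    best = max(senses, key=lambda s: len(s["context_words"] & context))
    return best["sense"]

print(disambiguate("bank", {"river", "fishing"}))  # -> bank/river
print(disambiguate("bank", {"loan", "mortgage"}))  # -> bank/finance
```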

Cognition's comprehensive Semantic Map is a critical component for the next phase of the Web's evolution, a.k.a. the Semantic Web, or Web 3.0, because it gives the computer a depth of knowledge and understanding of language far beyond the keyword and pattern-matching technologies currently in place. As Nova Spivack has said, the future of information gathering will involve a combination of the Web and the desktop, or 'Webtop' content. Cognition claims its Semantic Map will enable these technologies to be more efficient and effective intermediaries in that process, through applications such as Semantic Search, sentiment extraction and business analytics. I'm excited. Are you? I just wish somebody would try to distinguish Web 3.0 from the Semantic Web, though. . .

Wednesday, September 24, 2008

Web 2.0 and 'Live' Videos


We've heard of real-time video, but this really takes it to another level. Yahoo Live! might just be onto something here. In many ways, it combines all the elements of Web 2.0 PLUS being live. Think about it -- you get to social-network with friends (or at least users you permit to see you), you customize your own content, and it's dynamic, with embedding and mashup capabilities through its API. Watch New York City from sunrise to sunset -- 24/7.

Y! Live is a community of broadcasters. It’s a place to socialize around live video content through broadcasting, viewing, and embedding. These guidelines are a structure for maintaining the creative environment and positive community vibe of Y! Live.

Let's put aside the privacy issues for a moment. And think of all the marketing possibilities this offers. It's like . . . Facebook with real faces :)

Monday, September 22, 2008

Minding the Planet



One of my favourite thinkers on the Web, Nova Spivack, moderates this panel of visionaries and experts as they share their ideas on the evolution of the Web.

Friday, September 19, 2008

Cultural Diversity in a world of Web 2.0

We often assume that the Web is streamlined under the guise of a common language. But that is simply not the case: the Web is multilingual, enriched with the distinct flavours of different nationalities and ethnicities. We mustn't forget that behind the layers of technology and programming are people: real human beings who navigate the web through their own cultural lenses and perceptions. Patrick Chau's "Cultural Differences in Diffusion, Adoption, and Infusion" of Web 2.0 is certainly worth a read.

While most cross-cultural studies in information systems are based on Hofstede's cultural dimensions, little -- if any -- inquiry has been made into the state of Web 2.0. This is particularly ironic considering Web 2.0 is pegged to be a "social web." How can that be? Chau delves into these issues and re-examines five cultural dimensions that distinguish individualistic (Western) cultures from collectivist (Eastern) ones. It's certainly food for thought for those of us mired in the enthusiasm of Web 2.0 and the Semantic Web. While a great deal has been written about social networks geographically, not enough emphasis has been put on how transnational flows of people, as global citizens, vary in their use of Web 2.0 technologies. Can these differences be measured? If so, how? The five dimensions are listed below; a toy sketch of how they might be compared follows the list.

(1) Personality orientation - Idiocentric or allocentric?

(2) Self-construal - Independent or interdependent?

(3) Communication style - Low-context or high-context communication?

(4) Time orientation - Monochronic or polychronic?

(5) Cultural framework - Long-term orientation or short-term orientation?
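
And here is the toy sketch promised above: one back-of-the-envelope way such dimensions might be scored and compared for two hypothetical user groups. The numeric scores are placeholders, not Hofstede's or Chau's data, and the profiles are invented for illustration.

```python
# Hypothetical 0-100 scores for the five dimensions listed above, compared
# for two made-up user groups. Values are placeholders, not real data.
from math import sqrt

WESTERN = {   # hypothetical individualistic profile
    "personality_orientation": 80,  # more idiocentric
    "self_construal": 75,           # more independent
    "communication_style": 30,      # low-context
    "time_orientation": 70,         # monochronic
    "long_term_orientation": 40,
}
EASTERN = {   # hypothetical collectivist profile
    "personality_orientation": 30,
    "self_construal": 25,
    "communication_style": 80,      # high-context
    "time_orientation": 35,         # polychronic
    "long_term_orientation": 85,
}

def cultural_distance(a: dict, b: dict) -> float:
    """Euclidean distance across the five dimensions."""
    return sqrt(sum((a[k] - b[k]) ** 2 for k in a))

print(round(cultural_distance(WESTERN, EASTERN), 1))
```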