Monday, June 24, 2013

Gesture-Based Computing Technology Comes in Leaps and Bounds



The Horizon Report 2012 accurately forecast that gesture-based computing would be an important technology to watch. Gamers are already familiar with the Nintendo Wii and Microsoft’s Kinect system, which extend control to hand and arm motions and full-body movement, but these first-generation technologies were often clunky, with a limited range of motion. Gesture-based technology (also known as motion control) is about much more than gaming: it promises to revolutionize the way we interact with computing technology.

One of the first companies out of the gate is Leap Motion, which has designed, and will soon launch, a device that brings “motion control” to ordinary computers. How will it look? Perhaps something like the way Chief John Anderton manipulated multiple computer screens with sweeping hand gestures in Minority Report. Imagine it: we could be witnessing the beginning of the end of the keyboard and mouse.

Leap Motion makes this a reality with a device the size of a matchbox: the sensor sits on the user’s desk in front of the screen and connects via USB. Once connected, users can navigate their screens by waving their hands in the air, and can launch and play PC games without ever touching the keyboard or mouse. Gesture-based computing lets users carry out virtual activities with motions and movements similar to those they would use in the real world, manipulating content intuitively.
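For the technically curious, here is a minimal sketch of what programming against the device might look like, assuming the Leap Motion v1 Python SDK (the Leap module) is installed; the polling loop and print-out are illustrative assumptions, not an official example:

```python
# A minimal sketch, assuming the Leap Motion v1 Python SDK ("Leap" module)
# is installed. It polls the controller and prints palm positions; the
# loop structure and output format are illustrative assumptions.
import sys
import time

import Leap  # Leap Motion SDK Python bindings


def main():
    controller = Leap.Controller()
    try:
        while True:
            frame = controller.frame()  # most recent tracking frame
            for hand in frame.hands:
                pos = hand.palm_position  # millimeters relative to the sensor
                print("Hand at x=%.0f, y=%.0f, z=%.0f" % (pos.x, pos.y, pos.z))
            time.sleep(0.2)  # poll a few times per second
    except KeyboardInterrupt:
        sys.exit(0)


if __name__ == "__main__":
    main()
```

The key point is how little plumbing is involved: the sensor streams tracking frames over USB, and an application simply reads hand and finger positions out of each frame.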

What kinds of learning applications could gesture-based computing serve? In medicine, for example, gesture-based motion control already enables virtual autopsy on a multi-touch table. Detailed CT scans are taken of a living (or deceased) person and transferred to the table, where forensic scientists manipulate them with gestures to examine the body, make virtual cross-sections, and peel back layers including skin, muscle, blood vessels, and bone (a rough sketch of that layer-peeling interaction follows below). Can you imagine what libraries and museum collections could do by adopting gesture-based computing?
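To make the layer-peeling idea concrete, here is a hypothetical sketch of how such an interface might map swipe gestures to anatomical layers; the layer names and gesture callbacks are assumptions for illustration, not the actual table’s software:

```python
# A hypothetical sketch of the layer-peeling interaction described above:
# each downward swipe reveals the next anatomical layer of the scan.
# The layer names and callback names are illustrative assumptions.
LAYERS = ["skin", "muscle", "blood vessels", "bone"]


class LayerViewer:
    """Tracks which anatomical layer of the virtual body is visible."""

    def __init__(self, layers=LAYERS):
        self.layers = layers
        self.index = 0  # start at the outermost layer (skin)

    def on_swipe_down(self):
        """Peel away the current layer to reveal the one beneath it."""
        if self.index < len(self.layers) - 1:
            self.index += 1

    def on_swipe_up(self):
        """Restore the most recently removed layer."""
        if self.index > 0:
            self.index -= 1

    @property
    def current_layer(self):
        return self.layers[self.index]


viewer = LayerViewer()
viewer.on_swipe_down()       # a downward swipe peels away the skin
print(viewer.current_layer)  # -> muscle
```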

More Resources:

Neßelrath, R., Lu, C., Schulz, C. H., Frey, J., & Alexandersson, J. (2011). A gesture-based system for context-sensitive interaction with smart homes. In Ambient Assisted Living (pp. 209-219). Springer Berlin Heidelberg. [Link]

Johnson, L., Adams Becker, S., Cummins, M., Estrada, V., Freeman, A., & Ludgate, H. (2013). NMC Horizon Report: 2013 Higher Education Edition. Austin, TX: The New Media Consortium. [Link]

Maiti, A. (2013, February). Interactive remote laboratories with gesture-based interface through Microsoft Kinect. In 2013 10th International Conference on Remote Engineering and Virtual Instrumentation (REV) (pp. 1-4). IEEE. [Link]

Mistry, P., & Maes, P. (2009, December). SixthSense: a wearable gestural interface. In ACM SIGGRAPH ASIA 2009 Sketches (p. 11). ACM. [Link]