Gesture-Based Computing
2010 ANZ Horizon Report Short List
2010 Final Topic: Time-to-Adoption: Four to Five Years
It is already common to interact with a new class of devices entirely by using natural gestures. The Microsoft Surface, the iPhone, iPad, and iPod Touch, the Nintendo Wii, and other gesture-based systems accept input in the form of taps, swipes, and other ways of touching, hand and arm motions, or body movement. These are the first in a growing array of alternative input devices that allow computers to recognize and interpret natural physical gestures as a means of control. We are seeing a gradual shift towards interfaces that adapt to — or are built for — humans and human movements. New interface technologies like Kinect, Sixth Sense, and Tamper are using very intuitive approaches to how we connect with our computers, allowing users to engage in virtual activities with motions and movements similar to what they would use in the real world, manipulating content intuitively. The idea that natural, comfortable motions can be used to control computers is opening the way to a host of input devices that look and feel very different from the keyboard and mouse — and that enable our devices to infer meaning from the movements and gestures we make.
Relevance for Teaching, Learning & Creative Enquiry
Gesture-based games like those developed by researchers at Georgia Tech can help deaf children build language skills at a critical stage of language development.
Gesture-based interfaces like MIT’s Sixth Sense project can overlay virtual information onto real-world spaces.
After discovering the significant improvement in dexterity (48%) that surgeons-in-training gained from warming up with the Wii, researchers are developing a set of Wii-based medical training materials.
Gesture-Based Computing in Practice
This innovative project at the Auckland Museum uses touch-screen interfaces to allow visitors to create custom virtual orchids in lifelike detail.
Researchers at MIT are developing inexpensive gesture-based interfaces that track the entire hand.
Dutch company Silverfit uses a gesture-based system to deliver fitness games designed for the elderly.
For Further Reading
Touchy Feely Future for Tech Users
This BBC video gives an overview of touch-based technologies that provide haptic feedback to users, such as a touch screen with a variety of buttons that feel different when pressed.
University Offers New Technology to Help Students Study
(1 October 2009.) The Mathewson-IGT Knowledge Center at the University of Nevada, Reno purchased two Microsoft Surfaces. In addition to maps and games, the university added an anatomy study guide.
Why Desktop Touch Screens Don't Really Work Well For Humans
(The Washington Post, 12 October 2009.) A desktop touch screen isn't comfortable: a more ergonomic design, like an architect's drafting board, would relieve arm fatigue.
The New Media Consortium is an international 501(c)3 not-for-profit consortium of hundreds of learning-focused organizations dedicated to the exploration and use of new media and new technologies.