
Gestures and natural language to operate my laptop?

The popularity of the Wii game console, which can sense gestures and voices, suggests that this may become the way of human/computer interaction as soon as next year. Microsoft and HP are already well advanced: HP plans to incorporate more gesture features into its TouchSmart PCs later this year, and the TouchSmart tx2z Tablet PC can already do some amazing touch-screen gesture recognition.

A system that can recognize human gestures could provide a new way for people with physical disabilities to interact with computers. A related system for the able-bodied could also be used to make virtual worlds more realistic.

Gesture recognition technology powers GestureTek's touchless interactive displays, point-and-click presentation systems and mouse-replacement solutions. Is this a disruptive technology or what?

The core of gesture recognition is the visual processing and two- or three-dimensional modeling of the structure and movement of the human body (e.g. the hand, arm, face, lips, or entire body). This draws heavily on the fields of human anatomy, kinesiology, and human visual processing.
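To make the idea concrete, here is a minimal, purely illustrative sketch of that modeling step: reducing a tracked hand path to a recognized gesture. The function name, threshold, and net-displacement approach are my own assumptions for the example, not any vendor's actual algorithm.

```python
# Toy illustration of gesture modeling: classify a simple hand gesture
# from a tracked sequence of 2-D positions. Real systems model the full
# body; this sketch reduces a gesture to its net on-screen displacement.

def classify_swipe(points, threshold=50.0):
    """Classify a tracked hand path as a swipe gesture.

    points: list of (x, y) hand positions over time, in pixels.
    Returns "swipe-left", "swipe-right", "swipe-up", "swipe-down",
    or "none" if the net movement is below the threshold.
    """
    if len(points) < 2:
        return "none"
    # Net displacement from the first to the last tracked position.
    dx = points[-1][0] - points[0][0]
    dy = points[-1][1] - points[0][1]
    if max(abs(dx), abs(dy)) < threshold:
        return "none"  # too small to count as a deliberate gesture
    # Dominant axis decides the gesture direction.
    if abs(dx) >= abs(dy):
        return "swipe-right" if dx > 0 else "swipe-left"
    return "swipe-down" if dy > 0 else "swipe-up"

# Example: a hand moving mostly rightward across the screen.
path = [(10, 100), (60, 102), (120, 98), (200, 105)]
print(classify_swipe(path))  # prints "swipe-right"
```

A production system would of course work from camera frames, segment the hand, and compare the path against trained gesture templates; the point here is only that "gesture recognition" ultimately means turning tracked body motion into discrete commands.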

This is not just about what computer manufacturers may offer us, or how many device manufacturers may disappear and new ones be created, but also about the way we work and who will have easy access to the computing power growing in the clouds.

Linked to my post about netbooks, these developments may drive the next major game-changer in the business world.

