eyeSight Moves The Touchpad To The Air

Gesture recognition has fully moved into the public realm, thanks in no small part to Microsoft's Kinect, but eyeSight's impressive technology allows users to interact with an accuracy and naturalness of gesture that show the potential of “device-less” input.

eyeSight's Touch Free technology uses the company's own machine vision algorithms and advanced real-time image processing to analyze gestures and convert them into commands. The technology is completely independent of the underlying processor and camera hardware, requiring only a standard 2D camera and the eyeSight software. Of course, the system can be enhanced with 3D stereoscopic sensors and IR illumination.
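
eyeSight's algorithms are proprietary, so the details are not public, but the basic idea of software-only gesture control from a standard 2D camera can be illustrated with a minimal OpenCV sketch. Everything here, including the skin-color thresholds and the mapping from fingertip position to a "command", is an illustrative assumption rather than eyeSight's actual method.

```python
# Illustrative sketch only: not eyeSight's code. Shows how a plain 2D webcam
# feed can be turned into simple gesture "commands" in software alone.
import cv2
import numpy as np

cap = cv2.VideoCapture(0)  # any standard 2D camera

while True:
    ok, frame = cap.read()
    if not ok:
        break

    # Crude hand segmentation by skin tone in HSV space (a stand-in for a
    # real machine-vision hand detector); thresholds are assumptions.
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, np.array((0, 40, 60)), np.array((25, 180, 255)))
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN,
                            np.ones((3, 3), np.uint8), iterations=2)

    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if contours:
        hand = max(contours, key=cv2.contourArea)
        if cv2.contourArea(hand) > 5000:
            # Treat the topmost point of the largest contour as the fingertip
            # and map its horizontal position to a toy command.
            x, y = min(hand[:, 0, :], key=lambda p: p[1])
            h, w = frame.shape[:2]
            command = ("LEFT" if x < w // 3
                       else "RIGHT" if x > 2 * w // 3
                       else "SELECT")
            cv2.circle(frame, (int(x), int(y)), 8, (0, 255, 0), -1)
            cv2.putText(frame, command, (10, 30),
                        cv2.FONT_HERSHEY_SIMPLEX, 1, (0, 255, 0), 2)

    cv2.imshow("touch-free sketch", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()
```

A production system like eyeSight's would replace the color threshold with far more robust hand and fingertip tracking, but the pipeline shape is the same: camera frames in, image analysis in the middle, device commands out.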

Earlier this year, eyeSight revealed commercially available gesture recognition that lets users control digital devices with a fingertip, even from across a room.

The success of any gesture system will be dictated by how ‘natural’ it feels. Just as Apple made control via touchscreens feel completely natural a few years ago, we’re hoping to do the same thing with gesture: Our fingertip tracking makes gesture feel so intuitive that eventually people will use it without even thinking about it.
Gideon Shmuel, CEO of eyeSight

Potential beneficiaries of the eyeSight technology today include televisions and set-top boxes, PCs, mobile phones and tablets, in-car applications, digital signage and appliances.  The near-future possibilities opened up by such accurate recognition of real-world gestures go far beyond “Minority Report”-style interfaces, breaking the key limitation of our smallest personal devices: the fact that we must interact with them, most commonly through a touchscreen.  With gesture recognition like eyeSight's, we could make the entire world our input device.
