Smartphones aren’t just smart, they’re also needy. They demand user attention and interaction, not only through multi-touch interfaces, accelerometers and cameras, but through all kinds of sensitive sensors.
Indeed, today’s phones don’t just want to be touched, they want to see you too, to read their owners’ body language.
With this in mind, Israeli firm EyeSight has developed a technology that uses the device’s camera to interact with the user by recognizing hand gestures. The firm’s Natural User Interface (NUI) adds a layer of gesture control to your phone.
Originally built for the Symbian platform, EyeSight’s product portfolio – which includes a dozen or so games and applications relying on optical recognition via the phone’s camera – will now also be coming to Android.
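EyeSight hasn’t published how its recognition actually works, but as a rough illustration of the general principle – camera frames in, gesture events out – here is a minimal desktop sketch using OpenCV that treats the horizontal drift of a motion blob between frames as a left/right swipe. Everything in it (thresholds, function names, the frame-differencing approach itself) is our own assumption for illustration, not EyeSight’s algorithm or API.

```python
# Toy camera-based swipe detector using frame differencing.
# Illustrative only: this is NOT EyeSight's implementation, just a sketch of
# turning webcam motion into simple "swipe left"/"swipe right" events.
import cv2


def detect_swipes(camera_index=0, motion_threshold=30, min_shift=25):
    cap = cv2.VideoCapture(camera_index)
    prev_gray = None
    prev_cx = None
    try:
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
            gray = cv2.GaussianBlur(gray, (21, 21), 0)
            if prev_gray is not None:
                # Pixels that changed between frames are treated as "hand motion".
                diff = cv2.absdiff(prev_gray, gray)
                _, mask = cv2.threshold(diff, motion_threshold, 255, cv2.THRESH_BINARY)
                m = cv2.moments(mask)
                if m["m00"] > 0:
                    # Horizontal centre of the motion blob in this frame.
                    cx = int(m["m10"] / m["m00"])
                    if prev_cx is not None:
                        shift = cx - prev_cx
                        if shift > min_shift:
                            print("swipe right")
                        elif shift < -min_shift:
                            print("swipe left")
                    prev_cx = cx
            prev_gray = gray
    except KeyboardInterrupt:
        pass  # stop with Ctrl+C
    finally:
        cap.release()


if __name__ == "__main__":
    detect_swipes()
```

A production system would obviously need far more than this – hand segmentation, tracking that survives lighting changes, and a richer gesture vocabulary – but the sketch shows why a plain phone camera is enough hardware for the idea to work at all.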
EyeSight’s strategic choice to develop for the Android platform is based largely on the massive momentum Google’s OS has gained over the past few months. Not only have handset makers been flocking to add the OS to their smartphones, several are even building their own personalized UIs on top of the Android base – HTC Sense and Motorola Blur, for example.
Even for the all-seeing EyeSight, however, it remains to be seen whether the firm will be able to get its wares onto the upcoming iPhone 4, which boasts a front-facing camera. EyeSight told RCR it is preparing for the possibility, but much depends on whether Apple will allow the software on its phone.
One thing we’re fairly sure of is that the fruit-themed gadget maker won’t allow EyeSight to create a new UI for the iPhone, but it would certainly be nice to see how EyeSight could be integrated through apps.