Excited about how gesture-, voice- and eye-control are changing the future of computing? Why not be a part of its creation?

Garnering considerable attention at this year’s Consumer Electronics Show (CES) in Las Vegas, the Intel Perceptual Computing initiative is showcasing how the PCs and laptops of the future could be directed by the way we move, what we say, our facial characteristics, hand articulation and more.

Where devices such as Leap Motion, GestIC, Ubi and Tobii Rex demonstrate how motion, voice and eye-tracking can individually control computing functions, an integrated approach (which Intel calls “perceptual computing”) draws on a wide range of human senses to make human-to-computer interaction more like instinctive human-to-human communication. The Samsung Galaxy Camera, for example, combines touchscreen, gesture control and voice commands in a way that is, arguably, natural and intuitive computing.
