April 2016: When Kinect first came out for the Xbox 360 — the motion-sensing technology that lets you interact with games using body gestures — you were probably impressed. No buttons to push? No joystick? No Wii controller to keep in your pocket? Oh, the freedom!
But did you know that gaming is just the start? The same technology that captures and interprets gestures will soon transform the way you interact with your devices. Here’s a peek at freedoms yet to come:
Gesture recognition isn’t the recognition of the touching, tapping or swiping you do on your smartphone or tablet, even though those touch motions are sometimes called gestures. Rather, it’s the ability to control electronic devices without using your hands at all. Gestures are captured by a sophisticated 3D infrared camera (or cameras) that scans a limited area, locating and analyzing any human movement within it. That movement is then interpreted as computer commands: in other words, your motion itself is what controls the computer.
There’s a lot of fancy tech that happens between the camera and command. Sometimes the goal is to interpret broad, whole-body movements, as with Xbox Kinect. And sometimes it’s to interpret extremely fine movements, as with facial expressions or eye movements.
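To make that camera-to-command pipeline a little more concrete, here's a minimal, hypothetical sketch in Python. A real system relies on depth cameras and machine learning; this toy version just looks at a tracked hand's path and classifies it by net horizontal motion (the gesture names, command names, and threshold are all illustrative assumptions, not any vendor's actual API):

```python
# Toy sketch of the camera-to-command pipeline: a real system uses
# depth sensing and ML models; here we classify a tracked hand's
# (x, y) path purely by its net horizontal displacement.

def classify_gesture(path, threshold=0.2):
    """Return 'swipe_right', 'swipe_left', or 'none' for a hand path."""
    if len(path) < 2:
        return "none"
    dx = path[-1][0] - path[0][0]  # net horizontal movement
    if dx > threshold:
        return "swipe_right"
    if dx < -threshold:
        return "swipe_left"
    return "none"

# Map recognized gestures to device commands (names assumed for the sketch).
COMMANDS = {"swipe_right": "next_page", "swipe_left": "previous_page"}

path = [(0.0, 0.5), (0.2, 0.5), (0.5, 0.5)]  # hand moving to the right
gesture = classify_gesture(path)
print(gesture, "->", COMMANDS.get(gesture, "no_op"))
```

The point of the sketch is the shape of the pipeline, not the math: capture positions over time, reduce them to a labeled gesture, then translate that label into a command the device understands.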
What can it be used for?
Gesture recognition is part of a larger tech area that includes things like robotics, machine learning, artificial intelligence, smart cameras and computer vision. The possibilities for its use are endless — over time, it will transform the way you communicate with your machines.
For example, at home and work you’ll be able to use the same types of gestures you use right now with your mouse — scrolling, double-clicking, pointing — except you’ll do them in the air. For logging into accounts, instead of the ubiquitous username/password process we all know and don’t love, you’ll be able to simply look at a device’s camera and smile or make some other gesture you’ve chosen. Even just walking into a room could do the trick. Physical attributes like your gait and posture are extraordinarily distinctive, so you can be recognized and granted access based on completely normal behaviors you do anyway.
If you need specialized training, you’ll use gesture-based commands to interact with the material virtually before you do it in real life — this will be especially useful for medical training and training for dangerous jobs. If you want to learn an instrument, you may not need to buy one. If you need surgery, your doctor may never touch you or the device that controls the robot that operates on you. If you’re an engineer or a designer, you’ll find the best solutions by moving things around in “space.” Eventually, your kids (or grandkids) will use gestures at school and will look at you funny when you tell them about the days when you had to touch the computer.
Where can I use it right now?
You can use it with your Xbox 360, of course! But now you can only use it for playing games, not for navigating the dashboard: Microsoft recently removed Kinect gesture support for dashboard navigation because people weren’t using it for that purpose. You can also check out Leap Motion for PCs and Macs or the nPointer app for PCs.
Or you can buy a new car. Or test drive one, anyway. Last year, the BMW 7 Series was the first to introduce gesture control, and this year’s 6 Series features BMW’s more sophisticated AirTouch. Volkswagen’s e-Golf Touch is rolling out gesture-based controls soon, too.
So the next time you see a video game that’s controlled by gestures, you’ll know it’s not just a gimmick. Gesture recognition is real science with real benefits that you’ll soon wave hello to.