Developers: Gesture and Audio Input Are in Your Future
It may not happen tomorrow, but sooner or later you're going to find yourself writing multitouch, gesture-, and audio-input-based applications, Tim Huckaby declared during his day-two keynote at the Las Vegas edition of the Visual Studio Live! 2012 developer conference series.
"I'm old enough that I remember when using a mouse was an unnatural act!" Huckaby told a packed auditorium at the Mirage hotel on Wednesday. "Now it's second nature. I'd argue that some of this voice- and gesture-capable stuff will be just as natural in a few short years."
Huckaby's keynote focused on human interactions with computers in non-traditional "natural-type" ways -- sometimes referred to as the Natural User Interface, or NUI -- and how it will impact the lives of .NET developers. That's something of a specialty of his Carlsbad, Calif.-based company, InterKnowlogy, which has delivered dozens of large WPF, Silverlight, Surface and Windows 7 Touch applications to clients across the country. He also founded a company, Actus, that specializes in interactive kiosk applications.
In a lively keynote during which he interacted with various gesture- and audio-based applications by flailing his arms and shouting commands, Huckaby argued that multitouch is now cheap, consumer-grade technology that everyone already wants.
"It's now cheap to do multitouch," Huckaby said. "And it improves usability, incredibly. You will see every computing device from here on in -- whether it's a smart phone or your desktop -- every one of them will be multitouch enabled."
To illustrate the pace of NUI evolution, Huckaby demonstrated a 3D application built by his company in early 2007 for cardiac surgeons that allows the user to manipulate the heart image via a touch screen. He contrasted that app with a similar one InterKnowlogy developed recently based on Microsoft's Kinect motion sensing input device.
"This was prototyped in a couple of weeks, and it's just .NET," Huckaby said.
He also demonstrated a touch-screen craps table built by his company that interacts with real-world objects. Bets were placed with physical chips laid down on the screen and "dialed" to set their size, and the dice were actual transparent cubes that, when tossed, registered on the board.
The keynoter drew good-natured laughter from his audience as he waved his arms and strained a damaged rotator cuff to demo a physical therapy application designed to track a patient's movements through prescribed exercises and display them on a screen in real time. The application provided feedback to help the patient get the movements right. It was based on Kinect, which Huckaby said is currently the world's fastest-selling consumer electronics device.
The audience was also treated to a video about a neural computer interface, a spider-like contraption worn on the head, which was used to send commands to a wheelchair. Huckaby said the software for the device could be built with .NET right now. He wrapped up his demos with a video of an application that supported physical interactions with virtual objects. He called the C3-based app "a first go at the Holodeck" from Star Trek. He also showed off a game-based app developed for NASA.
"It's time for all of you to start thinking about building applications that use a Natural User Interface," Huckaby told the crowd. "Gesture is coming, fast; multitouch is here. And we might not be thinking commands at computers just yet, but we'll be doing that, too. It's just a matter of time."
Posted by John K. Waters on 03/29/2012 at 9:56 AM