I just published a Kinect mapping tool for PowerPoint that lets users navigate through a slide deck using gestures. It’s here on CodePlex: https://k4wppt.codeplex.com/ . There are already a lot of these out there, by the way – one of my favorites is the one Josh Blake published.
So why did I think the world needed one more?
The main thing is that, prior to the release of the Kinect SDK 1.7, controlling a slide deck with a Kinect was prone to error and absurdity. Because they were almost universally built around the swipe gesture, earlier Kinect-based PowerPoint controllers had a tendency to recognize any sort of hand waving as a navigation command. Consequently, as a speaker innocently gesticulated through his point, the slides would begin to wander on their own.
The Kinect for Windows team added the grip gesture as well as the push gesture in the SDK 1.7. It took several months of machine learning work to get these recognizers working effectively in a wide variety of circumstances, and they are extremely solid at this point.
The Kinect PowerPoint Mapper I just uploaded to CodePlex takes advantage of the grip gesture to implement a grab-and-throw for PowerPoint navigation. This effectively disambiguates navigation gestures from other symbolic gestures a presenter might use during the course of a talk.
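To give a feel for why grip disambiguates so well, here’s a toy sketch in C# – a hypothetical class with an assumed threshold, not the Mapper’s actual algorithm – in which lateral hand motion only counts as a swipe while the hand is gripped:

using System;

class GraspSwipeDetector
{
    const float SwipeDistance = 0.3f;   // meters of lateral travel – an assumed value, not the Mapper's
    float? _gripStartX;                 // hand X position when the grip began

    public event Action SwipedRight;
    public event Action SwipedLeft;

    // Call once per frame with the current grip state and the hand's X position.
    public void Update(bool isGripped, float handX)
    {
        if (!isGripped) { _gripStartX = null; return; }   // ungripped waving is ignored
        if (_gripStartX == null) _gripStartX = handX;     // grip just began: remember where

        float travel = handX - _gripStartX.Value;
        if (travel > SwipeDistance) Fire(SwipedRight);
        else if (travel < -SwipeDistance) Fire(SwipedLeft);
    }

    void Fire(Action handler)
    {
        _gripStartX = null;             // re-arm: one throw per grip
        if (handler != null) handler();
    }
}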
I see the Kinect PowerPoint Mapper serving several audiences:
1. It’s for people who just want a more usable Kinect-navigation tool for PowerPoint.
2. It’s a reference application for developers who want to learn how to pull the grip and push recognizers out of the Microsoft Kinect controls and use them in combination with other gestures. (A word of warning, though – while double grip works really well in this project, double push seems a little flaky.) One of the peculiarities of the underlying interfaces is that the push notification is a state, when for most purposes it needs to be an event. The grip, on the other hand, is basically a pair of events (grip and ungrip) that needs to be translated into a state. The source code for the Mapper demonstrates how these translations can be implemented; a simplified sketch follows.
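Here is that sketch – hypothetical types rather than the SDK’s actual interfaces – showing a push state edge surfaced as a one-shot event, and the grip/ungrip event pair folded into a queryable state:

using System;

class HandStateTranslator
{
    bool _wasPushed;                                 // push state from the previous frame
    public bool IsGripped { get; private set; }      // grip/ungrip events folded into a state

    public event EventHandler Pushed;                // fired once per push, on the rising edge

    // Call once per frame with the raw push state reported by the recognizer.
    public void UpdatePush(bool isPushed)
    {
        if (isPushed && !_wasPushed)                 // state just flipped from false to true
        {
            var handler = Pushed;
            if (handler != null) handler(this, EventArgs.Empty);
        }
        _wasPushed = isPushed;
    }

    // Wire these to the SDK's grip and grip-release event handlers.
    public void OnGrip() { IsGripped = true; }
    public void OnGripRelease() { IsGripped = false; }
}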
3. The Mapper is configuration-based, so users can actually use it with PC apps other than PowerPoint simply by remapping gestures to keystrokes. The current mappings in KinectKeyMapper.exe.config look like this:
<add key="DoubleGraspAction" value="{F5}" />
<add key="DoublePushAction" value="{Esc}" />
<add key="RightSwipeWithGraspAction" value="{Right}" />
<add key="LeftSwipeWithGraspAction" value="{Left}" />
<add key="RightSwipeNoGraspAction" value="" />
<add key="LeftSwipeNoGraspAction" value="" />
<add key="RightPush" value="" />
<add key="LeftPush" value="" />
<add key="TargetApplicationProcessName" value="POWERPNT"/>
Behind the scenes, this is basically translating the output of gesture recognition algorithms (some complex, some not so much) into keystrokes. To map a gesture to a different keystroke, just change the value associated with that gesture – making sure to include the curly brackets. If the value is left blank, the gesture will be ignored. Finally, TargetApplicationProcessName tells the application which process to send keystrokes to if multiple applications are open at the same time. To find a process name in Windows, just go into the Task Manager and look under the Processes tab. The process names for all currently running applications are listed there – just remove the .exe at the end of the name. A stripped-down sketch of this mapping loop follows.
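For the curious, here is what that sketch might look like – my own illustration, not the Mapper’s actual source – reading the mapping with ConfigurationManager and delivering the keystroke with SendKeys (this assumes references to System.Configuration and System.Windows.Forms):

using System;
using System.Configuration;
using System.Diagnostics;
using System.Runtime.InteropServices;
using System.Windows.Forms;

static class KeystrokeSender
{
    [DllImport("user32.dll")]
    static extern bool SetForegroundWindow(IntPtr hWnd);

    // Look up the keystroke mapped to a gesture, e.g. SendMappedKey("DoubleGraspAction"),
    // and send it to the configured target process.
    public static void SendMappedKey(string gestureKey)
    {
        string keystroke = ConfigurationManager.AppSettings[gestureKey];
        if (string.IsNullOrEmpty(keystroke)) return;          // blank value: gesture is ignored

        string target = ConfigurationManager.AppSettings["TargetApplicationProcessName"];
        Process[] candidates = Process.GetProcessesByName(target);
        if (candidates.Length == 0) return;                   // target app isn't running

        SetForegroundWindow(candidates[0].MainWindowHandle);  // focus the target app
        SendKeys.SendWait(keystroke);                         // "{F5}", "{Right}", etc.
    }
}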
4. The project ought to be extended as more gesture recognizers become available from Microsoft, or as people find good recognition algorithms of their own over time. Ideally, there will eventually be enough gestures to map onto your favorite MMO. A key mapper created by the media lab at USC was actually one of the first Kinect apps I started following back in 2010. It seemed like a cool idea then, and it still seems cool to me today.