As we get more and more attached to our portable electronics (although addicted might be a little more accurate; my husband now refers to his smartphone as The Leash), it's interesting to see how new ways of interacting with our devices are developing. Two very intriguing projects, PocketTouch and OmniTouch, are underway at Microsoft Research.
Smartphones' greatest benefits are their portability, their utility, and their ability to give us ubiquitous connectivity. That same ubiquity can also be one of the great frustrations if the smartphone interrupts real-world, in-person conversations or events. The PocketTouch project wants you to be able to interact with your device without having to remove it from your pocket or bag. Using a custom multitouch capacitive sensor on the back of the smartphone, you can control the phone by stroking it through the fabric (which is going to look mighty odd, but hey). The goal, as stated by researchers T. Scott Saponas, Chris Harrison, and Hrvoje Benko, was to implement eyes-free multitouch input on the device through the user's pocket, purse, bag, or shirt. (The PDF of the full paper is available here.) Rather than being limited to a single physical button, you can control the device, send messages, or do whatever else you'd like by touching the device without removing it from wherever you've stashed it and having it sense and respond to your gestures. I think that's pretty nifty, I have to say. Also, yet another clever use of capacitive sensing!
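To give a flavor of the kind of eyes-free gesture input described above, here's a minimal sketch of classifying a finger stroke from a touch trace into a swipe direction. The sample format, the normalized coordinates, and the noise threshold are all my own assumptions for illustration, not anything from the PocketTouch implementation.

```python
# Hypothetical sketch: turn a single-finger trace from a capacitive
# sensor into one of four swipe commands. Coordinates and thresholds
# are illustrative assumptions, not the PocketTouch design.

def classify_stroke(trace, min_travel=0.2):
    """trace: list of (x, y) samples in normalized [0, 1] coordinates.

    Returns 'left', 'right', 'up', or 'down', or None when the total
    movement is too small to be a deliberate gesture (e.g. the fabric
    shifting rather than a stroke).
    """
    if len(trace) < 2:
        return None
    dx = trace[-1][0] - trace[0][0]  # net horizontal travel
    dy = trace[-1][1] - trace[0][1]  # net vertical travel
    if max(abs(dx), abs(dy)) < min_travel:
        return None  # below the noise floor; ignore it
    if abs(dx) >= abs(dy):
        return 'right' if dx > 0 else 'left'
    return 'down' if dy > 0 else 'up'

# A rightward swipe sampled through a pocket:
print(classify_stroke([(0.1, 0.5), (0.4, 0.52), (0.8, 0.5)]))  # right
```

The rejection threshold matters more through fabric than on glass: incidental contact is constant, so the recognizer has to discard small movements rather than treat every touch as intentional.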
The second project, OmniTouch, uses a shoulder-mounted pico-projector and a depth camera to let you turn any surface into a multitouch user interface. While the current prototype looks fairly unwieldy, the video showing users interacting via any convenient surface, up to and including their own hands and handy walls, is really exciting. (The research paper (PDF) is available here.) Rather than trying to cram a greater range of user input behaviors and methods into the device itself (as happens with gesture recognition, for example), the OmniTouch project wants to expand your interactions to include ... whatever surface is handiest, most of which (as the researchers note in the paper) are bigger than the smartphone's screen. The smartphone may still power whatever specific functionality you're using, but your interaction with the device won't necessarily involve you holding it in your hand and bending over its small screen. It's also a self-contained system in that you don't have to salt the environment or the user with additional electronics to make it work.
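The core trick that makes an arbitrary surface clickable is the depth camera: a hovering fingertip reads as nearer to the camera than the surface behind it, and a touch closes that gap to roughly zero. Here's a minimal sketch of that idea; the millimeter values and tolerance are assumptions of mine, not OmniTouch's actual pipeline.

```python
# Hypothetical sketch of depth-based touch detection: a fingertip
# counts as "touching" when its depth reading is within a small
# tolerance of the surface behind it. All numbers are illustrative
# assumptions, not values from the OmniTouch paper.

def is_touch(finger_depth_mm, surface_depth_mm, tolerance_mm=10):
    """True when the finger-to-surface gap is small enough to call
    a contact. A hovering finger is nearer the camera, so it reads
    a smaller depth than the surface; contact shrinks the gap."""
    gap = surface_depth_mm - finger_depth_mm
    return 0 <= gap <= tolerance_mm

print(is_touch(592, 600))  # True: finger within 10 mm of the wall
print(is_touch(550, 600))  # False: hovering 50 mm in front of it
```

Because the surface depth is measured live rather than calibrated in advance, the same test works whether the "screen" is a wall, a notepad, or the user's own palm.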
The confluence of relatively inexpensive sensors, high-performance signal conditioning, and powerful processing capabilities tailored for low-power mobile devices is allowing very creative people to greatly expand the way we interact with the devices we love. Who knows how things will look in five years?