A new wearable projection system can turn any surface into an ad-hoc interactive touchscreen, from the palm of your hand to an entire wall. It pairs a mini projector with a Kinect-style depth camera to capture a user's interaction with a virtual screen.
It doesn't need any physical buttons or keys (though you could probably draw your own if you wanted); instead, it superimposes virtual ones onto any surface you choose. A Kinect-style infrared depth-sensing camera builds a dynamic 3-D map of your environment, using reflected infrared light to measure surface geometry even as you move around. The laser pico-projector adjusts accordingly, compensating for the surface's shape and size to prevent distortion.
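To get a feel for what "compensating for the surface's shape" involves, here is a minimal Python/OpenCV sketch of a perspective pre-warp: the flat UI image is distorted before projection so that it lands looking rectangular on a tilted surface. The corner coordinates below are hypothetical stand-ins for values the depth camera's 3-D map would supply; this is an illustration of the general technique, not OmniTouch's actual pipeline.

```python
import cv2
import numpy as np

# The flat UI frame we want to project (a placeholder "keypad" label).
ui = np.zeros((480, 640, 3), dtype=np.uint8)
cv2.putText(ui, "keypad", (200, 250), cv2.FONT_HERSHEY_SIMPLEX, 2.0,
            (255, 255, 255), 3)

# Corners of the flat UI image, in UI pixel coordinates.
src = np.float32([[0, 0], [640, 0], [640, 480], [0, 480]])

# Where those corners must land in the projector's frame so the image
# appears rectangular on the angled surface. These are made-up values;
# in practice they would be derived from the depth camera's surface map.
dst = np.float32([[60, 40], [600, 90], [580, 460], [40, 430]])

H = cv2.getPerspectiveTransform(src, dst)        # 3x3 homography
warped = cv2.warpPerspective(ui, H, (640, 480))  # pre-distorted frame to project
```

Projecting `warped` instead of `ui` cancels the keystone distortion the slanted surface would otherwise introduce, which is the same basic trick keystone-correcting projectors use.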
The system can pick out finger-shaped cylindrical objects in the depth image, and can sense whether fingers are "clicking" a surface or merely hovering over it, according to a news release from Carnegie Mellon University. You can type or swipe just as you would on a regular touchscreen.
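A simplified sketch of the kind of depth-image heuristic this implies: scan a row of depth readings for finger-width "dips" nearer than the background, then label each one a click or a hover by how far the fingertip sits above the surface. The widths and thresholds here are illustrative guesses, not the published OmniTouch parameters.

```python
import numpy as np

FINGER_WIDTH_PX = (5, 25)   # plausible finger widths in pixels (assumed)
CLICK_MM = 15               # fingertip this close to the surface = "click" (assumed)
HOVER_MM = 60               # closer than this, but above CLICK_MM = "hover" (assumed)

def classify_fingers(scanline_mm, surface_mm):
    """Find finger-like dips in one depth-image row and label each click/hover.

    scanline_mm: 1-D array of depth readings (mm) along one image row.
    surface_mm:  depth (mm) of the background surface along that row.
    """
    above = scanline_mm < surface_mm - 5   # pixels clearly nearer than the surface
    events, start = [], None
    for i, flag in enumerate(np.append(above, False)):  # sentinel closes a final run
        if flag and start is None:
            start = i
        elif not flag and start is not None:
            width = i - start
            if FINGER_WIDTH_PX[0] <= width <= FINGER_WIDTH_PX[1]:
                height = surface_mm - scanline_mm[start:i].min()
                if height <= CLICK_MM:
                    events.append((start + width // 2, "click"))
                elif height <= HOVER_MM:
                    events.append((start + width // 2, "hover"))
            start = None
    return events

# Example: a flat surface 800 mm away with one finger pressed against it.
row = np.full(320, 800.0)
row[150:162] = 790.0                 # a 12-px-wide dip, 10 mm above the surface
print(classify_fingers(row, 800.0))  # -> [(156, 'click')]
```

Running the same scan over every row, and tracking the resulting points across frames, is roughly how a depth camera can turn fingertip positions into touchscreen-style tap and swipe events.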
For now, it's a bulky setup worn on the shoulder, but it could eventually be miniaturized to the size of a deck of cards. Co-inventor Chris Harrison, a researcher at CMU who developed the system while an intern at Microsoft, said it could one day be integrated into handheld devices.
Harrison also worked on a similar system we told you about last year, Skinput, which turned a person's hand or arm into a touchscreen. That system used acoustic sensors to detect the location of finger taps on the hand or forearm, so it was limited to the skin, which seemed cool but not necessarily as useful. By contrast, the OmniTouch system can use any surface of any size, which could make it much more practical.
The team is presenting its OmniTouch research this week at the ACM Symposium on User Interface Software and Technology in Santa Barbara.