A sub-centimeter mapping technology used by the Mars rovers and other robots is now on the iPad, where it can be used to build real-time 3-D maps of your environment. What better use for this than an augmented-reality first-person shooter game?
Swedish startup 13th Lab has implemented a computer vision technique called Simultaneous Localization and Mapping (SLAM), which constructs a 3-D map of a local environment in real time while calculating the camera's current position within it. The result is a new iPad app called Ball Invasion, wherein the camera's view becomes a playing field. Instead of advanced robotic sensors and controls, the app needs only the camera and other sensors native to mobile devices.
This is quite a feat, and a potential new avenue for augmented reality. Unlike most AR systems, it doesn’t require previously known markers to trigger the virtual display — it can augment any environment, previously seen or unseen, simply by using the iPad 2’s camera.
The goal of the game is to shoot malicious balls hiding in the real world, which becomes part of the playing field — you can bounce virtual items off the actual walls, for instance. The video below shows it in action.
SLAM was developed to help robots figure out where they are: a robot builds a 3-D model of its surroundings while simultaneously estimating its own position within that model. It’s a tough chicken-and-egg problem — you need a map to localize, and a position to map — and one of the most complicated topics in robotic sensing. But, as GigaOM points out, 13th Lab has figured out how to compress this complex capability into a consumer device. So far, it’s only possible with the iPad 2’s powerful dual-core A5 CPU, 13th Lab says (though they probably haven’t tried using the next-gen quad-core Kal-El yet).
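To make the chicken-and-egg structure concrete, here is a deliberately minimal sketch of one SLAM-style cycle: predict the new pose from odometry, then fold each landmark observation into the map. This is purely illustrative — 13th Lab’s actual vision-based system is far more sophisticated — and the `slam_step` function, its running-average landmark update, and its range/bearing observation format are all assumptions made for this toy example.

```python
import math

def slam_step(pose, control, observations, landmarks):
    """One cycle of a toy SLAM loop (illustrative only, not 13th Lab's method).

    pose:         (x, y, heading) estimate of the robot/camera
    control:      (distance, turn) odometry since the last step
    observations: list of (range, bearing, landmark_id) measurements
    landmarks:    dict mapping landmark_id -> estimated (x, y) position
    """
    # Prediction: dead-reckon the new pose from the control input.
    x, y, theta = pose
    dist, dturn = control
    theta += dturn
    x += dist * math.cos(theta)
    y += dist * math.sin(theta)

    # Update: project each observation into world coordinates and blend
    # it with the prior map estimate. A real system would weight by
    # uncertainty (e.g. a Kalman filter); averaging is a crude stand-in.
    for rng, bearing, lid in observations:
        lx = x + rng * math.cos(theta + bearing)
        ly = y + rng * math.sin(theta + bearing)
        if lid in landmarks:
            px, py = landmarks[lid]
            landmarks[lid] = ((px + lx) / 2, (py + ly) / 2)
        else:
            landmarks[lid] = (lx, ly)

    return (x, y, theta), landmarks
```

The circularity shows up directly: the landmark positions written into the map depend on the pose estimate, and any drift in the pose corrupts the map that future localization relies on — which is why real SLAM systems must refine both jointly.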
13th Lab’s overall goal is far broader than 3-D games: they want to build a 3-D toolkit for other app developers, according to GigaOM. Ball Invasion is simply the first example.
This type of technology could conceivably be used for many other things, from architectural design to augmented-reality tours.