Microsoft's Surface may be our favourite desk-sized touch-screen device, but the company's researchers have just taken the wraps off its successor. LightSpace takes all the best parts of the table-top computer and adds a few cool new features of its own.
Whereas Surface projects an image onto the display from below, LightSpace uses projectors mounted overhead on the ceiling. By combining these with a set of depth-sensing cameras and elements of augmented reality, the researchers have created a fully interactive 3D space that extends above, around and even between surfaces.
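A rough sketch of the kind of fusion this implies: each depth camera's pixels are back-projected into a shared room coordinate system, so the combined point cloud covers the whole space. The intrinsics and camera-to-room transforms below are illustrative assumptions, not details of Microsoft's actual rig.

```python
import numpy as np

def depth_to_room_points(depth_m, fx, fy, cx, cy, cam_to_room):
    """Back-project one depth image (in metres) into room coordinates."""
    h, w = depth_m.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth_m
    x = (u - cx) * z / fx                      # camera-space X
    y = (v - cy) * z / fy                      # camera-space Y
    pts = np.stack([x, y, z, np.ones_like(z)], axis=-1).reshape(-1, 4)
    pts = pts[pts[:, 2] > 0]                   # drop pixels with no depth reading
    return (cam_to_room @ pts.T).T[:, :3]      # rigid 4x4 transform into the room frame

def fuse_cameras(frames, calibrations):
    """Merge the views from all overhead cameras into one room-wide point cloud."""
    clouds = [depth_to_room_points(d, **c) for d, c in zip(frames, calibrations)]
    return np.vstack(clouds)
```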
The result is that an image can be projected onto a 'dumb' table and manipulated using typical multitouch gestures in much the same way as on the original Surface. However, the system's awareness of the entire space means that objects can be picked up and transferred between two different displays just by touching both areas. They can also be 'swept' into a person's hand, carried around the room and handed to another person before being 'dropped' back onto a screen.
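One plausible (and deliberately simplified) way to trigger that transfer is to attribute each contact to a tracked person and treat a simultaneous touch on two different displays as the cue to move the object; the data structures here are invented for illustration rather than taken from the LightSpace pipeline.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Touch:
    user_id: int   # which tracked person made the contact
    surface: str   # e.g. "table" or "wall"

def transfers(touches):
    """Return the users currently touching two or more displays at once;
    that simultaneous contact is treated as the cue to move an object."""
    by_user = {}
    for t in touches:
        by_user.setdefault(t.user_id, set()).add(t.surface)
    return {uid: surfaces for uid, surfaces in by_user.items() if len(surfaces) >= 2}

# Example: user 7 has one hand on the table and one on the wall display,
# so whatever they are holding would move between those two surfaces.
print(transfers([Touch(7, "table"), Touch(7, "wall"), Touch(9, "table")]))
```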
The demonstrators also showed a menu system in which options could be cycled through by raising or lowering a hand.
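That height-driven menu could be as simple as mapping the hand's height within a fixed vertical band onto a list of options; the band limits and menu entries below are made up for the sake of the example.

```python
MENU = ["Photos", "Video", "Music", "Settings"]
LOW_M, HIGH_M = 0.9, 1.7        # hand heights (metres) that span the whole menu

def option_for_height(hand_height_m):
    """Map the hand's height within the band onto a menu entry."""
    t = (hand_height_m - LOW_M) / (HIGH_M - LOW_M)
    t = min(max(t, 0.0), 1.0)                         # clamp to the band
    return MENU[min(int(t * len(MENU)), len(MENU) - 1)]

print(option_for_height(1.0))   # low hand  -> "Photos"
print(option_for_height(1.6))   # high hand -> "Settings"
```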
LightSpace works by tracking people and objects through a real-world space and building a detailed 3D computer model of it. It can even detect interactions between people, such as handshakes and other physical contact. Though this means the installation must be calibrated specifically for the room it's in, it allows for detailed and accurate interaction with the environment.
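As a speculative sketch of how such contact detection might look, one could test whether the point clouds of two tracked people come within a few centimetres of each other; the threshold and the nearest-neighbour test are assumptions, not the published method.

```python
import numpy as np
from scipy.spatial import cKDTree

def people_in_contact(cloud_a, cloud_b, threshold_m=0.03):
    """True if any point of one person's cloud lies within threshold_m of the other's."""
    nearest, _ = cKDTree(cloud_b).query(cloud_a, k=1)   # distance to closest point in cloud_b
    return bool(np.any(nearest <= threshold_m))
```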
Like so many Microsoft Research projects, LightSpace is likely to be confined to the lab for the time being. For those interested in a glimpse into the future, a video demonstrating the capabilities of the system is available here.