Microsoft is set to demonstrate its LightSpace natural user interface at an event this week, showing how the technology can turn any flat surface into a display and track users' movements, letting them manipulate documents and virtual objects.
The technology is being demonstrated at the ACM Symposium on User Interface Software and Technology in New York this week, and builds on features in the earlier Microsoft Surface project, which is based on an interactive tabletop display.
LightSpace takes the concept further by allowing any tabletop to become a computer interface, enabling users to touch virtual objects and even pick them up and carry them to another screen or surface, according to Microsoft.
The technology is being demonstrated by Andy Wilson, a senior researcher at Microsoft Research, who will also present a new research paper on the topic.
LightSpace uses a combination of multiple projectors and depth-sensing cameras to track the user's body movements.
Such natural user interfaces employ "touch, face and voice recognition, movement sensors, our location, context and even our mood to deliver more natural, human interfaces", according to Microsoft, which said that today's technologies, such as its own Kinect sensor for the Xbox game console, are just the tip of the iceberg when it comes to what could be possible.