Microsoft MirageTable For Augmented Reality - Virtual And Real Worlds Merge Here!
If you are looking for the point where the real and the virtual worlds merge, you are almost there. Microsoft Research has come up with an interactive system called 'MirageTable' that sits on top of any table and seamlessly blends our real world with the virtual one. It needs just three things - a Kinect camera, a stereoscopic projector and a curved screen - and opens the door to a variety of interesting applications. We recently saw the #-Link-Snipped-#. This ingenious system from Microsoft gives us that option and more. From virtual 3D model creation to interactive gaming with real and virtual objects, MirageTable doesn't fail to impress. When used for a 3D conference with a remote person, the system not only presents a 3D view of the person you are talking to, but also provides a seamless 3D shared task space.
![MirageTable](http://www.crazyengineers.com/wp-content/uploads/2012/05/Screen-shot-2012-05-13-at-6.39.07-PM.png)
*MirageTable is a curved projection-based augmented reality system (A), which digitizes any object on the surface (B), presenting correct perspective views accounting for real objects (C) and supporting freehand physics-based interactions (D).*
The MirageTable works by tracking the user's eyes and performing a real-time capture of the shape and appearance of anything placed in front of the camera, including the user's body and hands. This captured geometry is used to render perspective-correct stereoscopic 3D visualizations for the user. What we loved about this system is that users are free to interact with the virtual objects without any gloves, trackers or instruments. What's more, users can perceive these objects correctly even when they are projected over different background colors and geometries, such as gaps or drops.
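The paper itself does not ship code, but the "perspective-correct" rendering described above boils down to a classic graphics trick: recomputing an off-axis projection every frame from the tracked eye position. Here is a minimal illustrative sketch of that idea in Python/NumPy; the function, the screen-corner layout and the near/far values are our own assumptions for illustration, not code from the MirageTable project.

```python
import numpy as np

def off_axis_projection(pa, pb, pc, pe, near, far):
    """Off-axis (generalized) perspective projection for a planar screen.

    pa, pb, pc: lower-left, lower-right, upper-left screen corners (world space)
    pe:         tracked eye position (world space)
    Returns a 4x4 OpenGL-style projection matrix. Illustrative sketch only,
    not Microsoft's implementation.
    """
    pa, pb, pc, pe = (np.asarray(v, dtype=float) for v in (pa, pb, pc, pe))

    # Orthonormal basis of the screen plane.
    vr = pb - pa; vr /= np.linalg.norm(vr)            # screen "right"
    vu = pc - pa; vu /= np.linalg.norm(vu)            # screen "up"
    vn = np.cross(vr, vu); vn /= np.linalg.norm(vn)   # screen normal, toward the viewer

    # Vectors from the eye to the screen corners.
    va, vb, vc = pa - pe, pb - pe, pc - pe

    # Distance from the eye to the screen plane.
    d = -np.dot(va, vn)

    # Frustum extents on the near plane.
    l = np.dot(vr, va) * near / d
    r = np.dot(vr, vb) * near / d
    b = np.dot(vu, va) * near / d
    t = np.dot(vu, vc) * near / d

    # Standard asymmetric frustum matrix.
    P = np.array([
        [2 * near / (r - l), 0,                  (r + l) / (r - l),            0],
        [0,                  2 * near / (t - b), (t + b) / (t - b),            0],
        [0,                  0,                  -(far + near) / (far - near), -2 * far * near / (far - near)],
        [0,                  0,                  -1,                           0],
    ])

    # Rotate the world into screen-aligned space and move the eye to the origin.
    M = np.eye(4); M[:3, :3] = np.stack([vr, vu, vn])
    T = np.eye(4); T[:3, 3] = -pe
    return P @ M @ T
```

In a stereoscopic setup like MirageTable's, something along these lines would be evaluated twice per frame - once per eye, each offset from the Kinect-tracked head position by roughly half the interpupillary distance - so the projected imagery stays correct as the user moves.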
Take a look at the demonstration from Microsoft researchers and feel awed as correct perspective views are presented to a single user on top of the dynamically changing geometry of the real world -
https://www.youtube.com/watch?v=EaCjTog0u40
You may also want to read the full Research Paper PDF here : #-Link-Snipped-#