User interaction via hand gestures is added to a real-time data acquisition, image reconstruction, and mixed-reality display system, allowing the user to interact more flexibly with the rendering. Images at precalibrated slice locations are acquired and displayed to the user in real time; the user can toggle between viewing a few slices or many, as well as rotate, resize, and dynamically adjust the window and level of the rendering.
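As a rough illustration of the kind of gesture-driven control described, the sketch below maps a hypothetical pinch-drag gesture to window/level updates on a displayed slice. The function names, gesture parameters, and gains are assumptions for illustration only, not the authors' implementation.

```python
# Hypothetical sketch (not from the abstract): mapping a hand-gesture drag to
# window/level adjustment of a displayed slice. Assumes normalized drag deltas
# from a gesture tracker and an 8-bit display mapping.
import numpy as np

def apply_window_level(slice_img: np.ndarray, window: float, level: float) -> np.ndarray:
    """Clip the slice to [level - window/2, level + window/2] and rescale to 0..255."""
    lo, hi = level - window / 2.0, level + window / 2.0
    out = np.clip(slice_img, lo, hi)
    return ((out - lo) / max(hi - lo, 1e-6) * 255.0).astype(np.uint8)

def update_window_level(window: float, level: float,
                        dx: float, dy: float,
                        gain: float = 500.0) -> tuple[float, float]:
    """Horizontal drag widens/narrows the window; vertical drag shifts the level."""
    window = max(1.0, window + dx * gain)
    level = level + dy * gain
    return window, level

# Example: a small rightward/upward pinch-drag widens the window and raises the level.
img = np.random.default_rng(0).normal(1000.0, 200.0, (256, 256))
w, l = 400.0, 1000.0
w, l = update_window_level(w, l, dx=0.10, dy=0.05)
display = apply_window_level(img, w, l)
```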