Ming Lin was the keynote speaker at the IEEE VR Conference on March 19, 2016. Lin’s talk, “Towards Immersive Multimodal Display: Interactive Auditory Rendering for Complex Virtual Environments,” presented an overview of recent work on interactive auditory display, covering sound synthesis and sound propagation. Lin also presented new techniques for cross-modal interaction in VR that greatly improve the state of the art in sound rendering.
Doctoral student Peter Lincoln demonstrates a head-tracked augmented reality display presented at IEEE VR 2016.
UNC CS faculty and grad students presented four papers at the conference – more than any other university. Two of the papers won awards.
Best Paper went to “From Motion to Photons in 80 Microseconds: Towards Minimal Latency for Virtual and Augmented Reality” presented by Peter Lincoln, Alex Blate, Montek Singh, Turner Whitted, Andrei State, Anselmo Lastra and Henry Fuchs.
The paper described an augmented reality, see-through, dot matrix display with an extremely fast update rate, mounted in a head-tracked rig. Augmented reality displays overlay graphics on the user’s environment; this display keeps those graphics registered to the physical environment smoothly and with very low latency. A video demonstration can be found on Lincoln’s YouTube page.
Best Paper Honorable Mention was awarded to “Interactive Coupled Sound Synthesis-Propagation using Single Point Multipole Expansion” presented by Atul Rungta, Carl Schissler, Ravish Mehra, Chris Malloy, Ming Lin and Dinesh Manocha.
The GAMMA group demonstrated its technique for coupled sound synthesis-propagation using virtual environments in a cathedral, a Tuscan villa, and a research lab.
Where existing sound simulation research has focused on either sound synthesis (generating sound computationally) or sound propagation (simulating sound as it travels through an environment), this paper presented a technique that couples synthesis and propagation to support dynamic sources, dynamic listeners, and directivity simultaneously. For more information on this paper, including a video demonstration using the Unity game engine, visit gamma.cs.unc.edu/syncopation.
The department’s other papers included:
- “Interactive and Adaptive Data-Driven Crowd Simulation”
Presented by: Sujeong Kim, Aniket Bera, Andrew Best, Rohan Chabra, Dinesh Manocha
- “Efficient HRTF-based Spatial Audio for Area and Volumetric Sources”
Presented by: Carl Schissler, Aaron Nicholls, Ravish Mehra
For more information about the conference visit the IEEE VR Conference website.