
March 6, 2025
Computer science researchers won a Best Paper Honorable Mention at IEEE Virtual Reality 2025 for their paper, “Multimodal Neural Acoustic Fields for Immersive Virtual Reality and Telepresence.”
This work, published in IEEE Transactions on Visualization and Computer Graphics (TVCG), was led by an incredible team of students—Guansen Tong, Jonathan Chi-Ho Leung, Xi Peng, Haosheng Shi, Liujie Zheng, Shengze Wang, Arryn O’Brien, Ashley Neall, Grace Fei, and Martim Gaspar—under the guidance of Assistant Professor Praneeth Chakravarthula at UNC Chapel Hill. Tong, Leung, Peng, Shi, and Wang are graduate students, while Zheng, O’Brien, Neall, Fei, and Gaspar are undergraduate students.
The research enhances immersive audio in augmented and virtual reality, making virtual experiences feel more natural and lifelike. Imagine watching a live concert in augmented reality from your couch. If you move around the room, the sound dynamically adapts based on distance, direction, and surrounding materials to match your perspective.
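The paper's neural acoustic fields learn this listener-dependent sound mapping from data. As a toy illustration only (not the team's method), distance-dependent attenuation can be sketched with a simple inverse-distance gain model, where moving twice as far from a source halves the perceived level:

```python
import math

def spatial_gain(listener, source, ref_dist=1.0):
    """Toy inverse-distance attenuation: gain falls off as the
    listener moves away from the source, clamped at ref_dist
    so very close positions don't blow up the gain."""
    d = math.dist(listener, source)
    return ref_dist / max(d, ref_dist)

# Listener 2 m from the source vs. 4 m away:
near = spatial_gain((0.0, 2.0), (0.0, 0.0))  # 0.5
far = spatial_gain((0.0, 4.0), (0.0, 0.0))   # 0.25
```

Real spatial audio engines, and the neural acoustic fields in this work, additionally account for direction and surface materials, which a simple gain model cannot capture.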
Chakravarthula expressed his excitement at seeing his team's efforts rewarded.
“We’re thrilled to see the work recognized and can’t wait to push the boundaries of spatial audio for AR/VR even further!”
Chakravarthula’s research interests lie at the intersection of optics, perception, graphics, optimization, and machine learning. He has previously received Best Paper awards at graphics and vision conferences including ACM SIGGRAPH and ISMAR.
The full paper is available online, and more information can be found on the project website.