UNC researchers earn Best Paper at IEEE VR 2021

The system is demonstrated by a wearer being coached in a 3D virtual environment by a trainer. The user (top left) has his motion and environment recorded by the wearable system. The trainer (top right) can then interact with the 3D reconstruction in a virtual environment (center) to provide feedback without being in the same physical location.

A paper co-authored by students Young-Woon Cha, Husam Shaik, Qian Zhang, and Fan Feng; research scientists Andrei State and Adrian Ilie; and professor Henry Fuchs received the Best Paper award at the IEEE Conference on Virtual Reality and 3D User Interfaces (IEEE VR) 2021, which was held virtually in March and April 2021.

The awarded paper, titled “Mobile, Egocentric Human Body Motion Reconstruction Using Only Eyeglasses-mounted Cameras and a Few Body-worn Inertial Sensors,” presents a real-time system for dynamic 3D capture of a person using hardware conveniently worn on common items like eyeglasses, wristwatches, and shoes. The system converts the captured visual and movement data into a high-fidelity 3D reconstruction of the wearer. Where high-fidelity motion capture would typically be possible only in a studio full of mounted cameras, the presented system enables motion capture anywhere. When combined with existing telepresence and virtual reality technology, such a system has applications in healthcare, education, athletics, and many other fields.

The UNC Graphics and Virtual Reality Group, led by Fuchs, has been frequently recognized by the conference for its research in virtual and augmented reality, winning Best Paper in 2017 and 2016 in addition to 2021.

Three egocentric cameras worn by the user record both the wearer and the surrounding environment. The drone footage (bottom right) shows how accurately the user’s movement is reconstructed in the virtual environment.