From Reality to Virtual Reality: Digitally Preserving Nepal’s Ancient Swayambhu Temple

December 5, 2023

Perched atop a hill in the Kathmandu Valley, the ancient Swayambhu Temple proudly reflects Nepal’s rich cultural tapestry. In a groundbreaking initiative this past October, Religious Studies Professor Lauren Leve and Computer Science Research Scientist Jim Mahaney traveled to Swayambhu to start the long process of capturing the data needed to create an accurate 3D model of the entire temple complex.

Ego-Exo4D Project gives AI training a human touch

November 30, 2023

UNC CS announces its participation in the Ego-Exo4D project, an innovative venture to revolutionize AI. An international consortium of 14 universities, in partnership with Meta's FAIR team, will create a first-of-its-kind, large-scale, multimodal, multiview dataset that enhances AI's perception, responsiveness, and understanding of human skill in real-world settings.

HackNC & Carolina Data Challenge: Where code meets innovation

November 20, 2023

This fall, UNC Computer Science transformed into a hub of technological innovation and collaboration with two major student events: HackNC and the Carolina Data Challenge. These hackathons, led by student teams with department support, brought together more than 700 students from various colleges across North Carolina and beyond.

Bansal receives Indian Institute of Technology Kanpur Young Alumnus Award

November 14, 2023

Professor Mohit Bansal earned the IIT Kanpur Young Alumnus Award, conferred annually on two alumni under 40 years of age who have achieved exemplary recognition and distinction in their careers. IIT Kanpur is one of the most prestigious academic institutions in India, and thousands of its alumni are eligible for the award.

Researchers from Meta and UNC-Chapel Hill Introduce Branch-Solve-Merge: A Revolutionary Program Enhancing Large Language Models’ Performance in Complex Language Tasks

October 31, 2023

Branch-Solve-Merge (BSM) is a program for enhancing Large Language Models (LLMs) on complex natural language tasks. BSM includes branching, solving, and merging modules to plan, solve, and combine sub-tasks. Applied to LLM response evaluation and constrained text generation with several models, BSM boosts human-LLM agreement, reduces biases, and enables LLaMA-2-chat to match or surpass GPT-4 in most domains.
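
To make the branch-solve-merge structure concrete, here is a minimal sketch of the pattern in Python. The `call_llm` function and the prompt wording are illustrative assumptions, not the paper's actual implementation or API; any chat-completion backend could be substituted.

```python
def call_llm(prompt: str) -> str:
    """Placeholder for an LLM completion call (hypothetical; plug in any chat API)."""
    raise NotImplementedError


def branch(task: str) -> list[str]:
    # Branch module: ask the model to decompose the task into independent sub-tasks.
    plan = call_llm(
        f"Break the following task into independent sub-tasks, one per line:\n{task}"
    )
    return [line.strip() for line in plan.splitlines() if line.strip()]


def solve(sub_task: str) -> str:
    # Solve module: answer each sub-task on its own.
    return call_llm(f"Solve this sub-task:\n{sub_task}")


def merge(task: str, solutions: list[str]) -> str:
    # Merge module: fuse the sub-task solutions into one final answer.
    joined = "\n".join(solutions)
    return call_llm(
        f"Combine these partial solutions into a final answer for the task '{task}':\n{joined}"
    )


def branch_solve_merge(task: str) -> str:
    sub_tasks = branch(task)
    solutions = [solve(s) for s in sub_tasks]
    return merge(task, solutions)
```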

Oliva given NSF grant to enhance machine learning extrapolation

October 17, 2023

Assistant Professor Junier Oliva received a two-year NSF grant to improve the ability of machine learning models to extrapolate beyond the scope of their training data. The project is expected to benefit scientific tasks across numerous disciplines, including chemical discovery and safety assessment.