
UROP Spotlight: Computational Basis of Everyday Action Planning

To explore the way humans complete simple tasks, Beckett Roberge '28 spent a UROP in the Kanwisher Lab creating a virtual reality simulation of catching a ball. He is now working with Aryan Zoroufi on tracking subjects' movements and comparing them to how different computational techniques catch the ball in the same simulation.
Aryan Zoroufi helping Quest UROP Beckett Roberge put on a VR headset.
All photos courtesy of Bearwalk Cinema.

How does the human brain plan its course of action in everyday scenarios, like catching a ball or spotting your keys on a crowded table? During the spring 2025 semester, undergraduate researcher Beckett Roberge explored this question in the lab of Walter A. Rosenblith Professor of Cognitive Neuroscience Nancy Kanwisher, assisting graduate student Aryan Zoroufi in examining how the brain plans and executes everyday actions, and how those processes compare to computer models and deep neural networks (DNNs).

This project builds on Beckett’s research from previous semesters working with the Kanwisher Lab: last year, Beckett developed the code for a 3D simulation that asks human participants to move a virtual basket to catch a falling object. The challenge this spring was to adapt that model to a virtual reality (VR) environment, which would ultimately allow researchers to collect richer, more immersive data about how people perceive and respond to moving objects—and to test which computer models most closely mirror human performance.
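For readers curious what such a task might look like in code, the sketch below is a minimal, hypothetical Python version of a single falling-object catching trial. It is not the lab's actual code, and every name and parameter (frame rate, basket speed, catch radius) is an assumption chosen for illustration; in the real experiment, a human participant's movements, or a candidate computational model, would take the place of the toy controller.

```python
import random

# Minimal sketch (illustrative only, not the lab's code) of a catching trial:
# an object falls under gravity and a "basket" moves horizontally to catch it.

GRAVITY = -9.8       # m/s^2
DT = 1 / 60          # simulation step, assuming a 60 Hz frame rate
BASKET_SPEED = 2.0   # assumed maximum horizontal basket speed, m/s
CATCH_RADIUS = 0.15  # assumed catch tolerance at ground level, m

def run_trial(drop_height=2.0, drop_x=None):
    """Simulate one trial: a ball drops from (drop_x, drop_height) while a
    simple controller moves the basket toward the ball's horizontal position."""
    drop_x = random.uniform(-1.0, 1.0) if drop_x is None else drop_x
    ball_x, ball_y, ball_vy = drop_x, drop_height, 0.0
    basket_x = 0.0

    while ball_y > 0.0:
        # Ball physics: constant-acceleration free fall (Euler integration).
        ball_vy += GRAVITY * DT
        ball_y += ball_vy * DT

        # Toy controller: step the basket toward the ball, capped at max speed.
        # A human participant or a DNN policy would replace this rule.
        error = ball_x - basket_x
        step = max(-BASKET_SPEED * DT, min(BASKET_SPEED * DT, error))
        basket_x += step

    caught = abs(basket_x - ball_x) <= CATCH_RADIUS
    return caught, basket_x, ball_x

if __name__ == "__main__":
    results = [run_trial() for _ in range(100)]
    print(f"caught {sum(c for c, *_ in results)} of {len(results)} trials")
```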
 

Beckett wearing a VR headset and glove.

Each week, Beckett met with Aryan to review project progress and attended Kanwisher Lab meetings to share updates and troubleshoot technical challenges. He tested VR hardware, fixed bugs, developed experiment code in Python, and guided participants through experimental trials to gather data. Beyond building the catching task, Beckett also helped to review prior studies to better situate the team’s findings within the broader field of computational neuroscience.

For Beckett, this UROP was a hands-on introduction to the nuances of neuroscience research, and he hopes to pursue these topics further with a career working on brain-computer interfaces (BCIs). “I have learned a lot about the structure of the brain, the different techniques we have at our disposal to study it, and what the state of current research is,” he says. “I love learning about the intersection of brains and computers, how they are different and how they are similar. It would be incredible to see this paper go to fruition and to be able to say that I contributed to it.”

Beckett and Aryan smiling outside of the Quest offices.

Learn more about the UROPs supported by the Quest.