Inside a rehearsal room overlooking the green expanse of Schenley Park, countertenor Ricky Owens burst into song.
His dazzling rendition of a Russian aria was full of power and emotion.
It also contained crucial data for a team of engineering students studying focus, distraction and flow.
As Mr. Owens sang, pianist I-Hsiang Chao added accompanying countermelodies.
“He’s basically sight reading,” said Jocelyn Dueck, an assistant professor of collaborative piano at Carnegie Mellon University.
It was the students’ first time practicing together, and Ms. Dueck could tell when the partnership faltered, even from subtleties like the arch of Mr. Chao’s back.
She marked those moments as “distracted states” on a laptop monitoring the performance in real time, with help from an artificial intelligence model and a wireless headset worn by Mr. Chao. The headset is a simplified version of the electroencephalogram, or EEG, hardware used to monitor brain activity. It sends information directly to a computer, where software can predict when a performer is dialed in and when their mind might be wandering.
The research is part of a new interdisciplinary course, now in its second semester at CMU, that Ms. Dueck developed to blend science and music. The first group paired cameras with AI software to explore how eye tracking could help pianists flip the pages of their sheet music on a tablet.
While the new AI modeling is still in its infancy, it is helping paint a picture of an elusive brain state known as flow.
Experienced by musicians, runners and even gamers, flow state is often described as a feeling of effortlessness brought on by intense attention to the task at hand. The experience can be powerful enough to make people forget that time is passing.
“We’re hoping to get this high quality label data here in the music setting, but then apply it to other settings, such as the work setting, or even sports,” said Rohan Sonecha, an engineering student who is using the class as his capstone.
He said Ms. Dueck’s expertise brings a level of precision to the “ground truth” data that helps the team refine the AI model. Other research on flow states has required study participants to self-report.
Students are tackling other approaches too, like an AI-powered camera that takes a picture when you yawn. Karen Li said the goal there is to help people understand how distractions could impact their flow.
The partnership was a natural fit for a multidisciplinary campus, said Larry Pileggi, who leads CMU’s Department of Electrical and Computer Engineering.
“Our students love when there’s any kind of real world application. And for many of them, the crazier the better,” he said.
It also serves as a boot camp for the real world, he said.
“I think the students we put out there today have to be broader than they ever were, and that's a good thing.”
A separate student group is using the data to build a decibel monitoring device that could be worn as a bracelet and flash red when a person’s environment gets too loud. While Mr. Owens’ performance was certainly piercing, it would likely take about eight hours of continuous opera to hurt someone’s ears, they said.
Emily Garcia, a soprano in the class, sees benefits for performers too. She said making music is an “intimate” collaboration that demands shared focus between musicians. The emerging view into how the brain achieves such teamwork is “amazing,” she said.
The engineering team will show off their results at the department’s capstone showcase on May 3.
Other projects at the demo include a facial recognition tool for pets that could help make doggie doors raccoon-proof, Mr. Pileggi said.
Evan Robinson-Johnson: ejohnson@post-gazette.com and @sightsonwheels
First Published: April 20, 2024, 9:30 a.m.
Updated: April 21, 2024, 12:37 a.m.