10/21/2025 Jeanie Chung
How Ryan Corey and his Listening Technology Lab at UIC and DPI are out to improve hearing assistance technology
Ryan M. Corey (’14 M.S., ’19 Ph.D.), research scientist at Discovery Partners Institute and assistant professor of electrical and computer engineering at the University of Illinois Chicago, may just change the way we hear.
In September, Corey received an Early Career Research Award from the National Institute on Deafness and Other Communication Disorders (NIDCD), a division of the National Institutes of Health. The grant provides more than $500,000 over three years to Corey and his Listening Technology Lab for the project, “Perceptual effects of processing and transmission delay in hearing technology.”
The award, one of 26 given out in fiscal year 2025, is Corey’s first major grant as sole principal investigator, and will allow his team to study the effects of delay in wireless hearing assistive devices, which are found in theaters, classrooms, places of worship, and many other public spaces.
By transmitting clear sound directly to the listener’s ears, these wireless systems can have dramatic benefits for people with hearing loss, but they are often inconvenient and cumbersome to use. The Listening Technology Lab team aims to build more convenient and affordable hearing systems using ordinary consumer devices and standard digital wireless protocols like Wi-Fi and Bluetooth.
Toward better hearing technology
Today, most high-end hearing aids already have Bluetooth connectivity, and many ADA-compliant assistive listening devices offer a Wi-Fi option. However, these digital wireless protocols cause delays of tens or even hundreds of milliseconds between when sound is captured and when it reaches the listener. That delay would not be noticeable for a phone call or music listening, but it would cause a disturbing echo in a hearing device. Current design constraints for hearing aids allow about ten milliseconds for signal processing. Do the same limits apply to wireless listening systems?
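The echo problem can be sketched numerically: when a listener hears both the direct sound and a delayed copy from the device, the delayed copy leaves an unmistakable signature in the mixture. A minimal NumPy illustration, with all parameters (16 kHz sample rate, a 50 ms delay in the range the article describes) chosen as hypothetical examples:

```python
import numpy as np

# Hypothetical example: a listener hears direct sound plus the same sound
# arriving later through a high-latency hearing device.
fs = 16_000                      # sample rate in Hz (illustrative choice)
delay_ms = 50                    # within the tens-to-hundreds-of-ms range
delay_samples = int(fs * delay_ms / 1000)

rng = np.random.default_rng(0)
direct = rng.standard_normal(fs)             # one second of "direct" sound

# The device's output is the same sound, shifted by the processing delay.
processed = np.zeros_like(direct)
processed[delay_samples:] = direct[:-delay_samples]

heard = direct + processed                   # listener hears both paths

# The delayed copy shows up as a strong autocorrelation peak at exactly
# the delay lag -- the signature of an audible echo.
lags = np.arange(1, 2 * delay_samples)
autocorr = np.array([np.dot(heard[:-lag], heard[lag:]) for lag in lags])
peak_lag = lags[np.argmax(autocorr)]
print(peak_lag == delay_samples)   # the echo sits at the 50 ms lag
```

This is only a toy model of the perceptual problem Corey's team will study with human listeners, but it shows why latency that is harmless for music streaming becomes disruptive when the listener also hears the original sound acoustically.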
In the new project, pairs of subjects will hold unscripted conversations in the Listening Technology Lab acoustic studio using hearing devices with variable delays. The researchers will measure how much delay listeners can tolerate with both wearable hearing aids and wireless remote microphones under different noise and reverberation conditions. The results will help engineers develop innovative digital algorithms and system architectures for hearing technology, which could dramatically improve quality of life for people with hearing loss.
As with many other fields, hearing technology could benefit from AI. And while Corey's project is not strictly an AI or machine learning project, its findings could help improve those algorithms.
“The more delay you give them,” he said, “the better they work.”
Corey’s research could lead to a system where “maybe it’s processing your own voice and nearby sounds really quickly, but then for the background noise it’s bringing in all these other devices; it’s running these really big AI algorithms that take a long time.”
Solving unsolved problems
The NIDCD grant is Corey’s third in 2025 — in July he received a grant from the DEVCOM Army Research Lab to study sensor networks that could enable people to hear through hundreds of ears instead of two, and in August he and Justin Aronoff of UIUC’s Binaural Hearing Lab received funding from the National Institutes of Health to improve binaural beamforming algorithms for cochlear implants in partnership with Vortant Technologies.
Corey, the first faculty member jointly appointed by UIC and DPI, does work that reflects DPI's mission to use cutting-edge research to address community needs and unsolved problems in the marketplace. DPI has supported Corey's research through Science Team funding and has helped him make connections outside academia, including at the Army Research Lab and with Chicago-area hearing technology startups.
Corey’s team pursues “use-inspired research,” looking for high-impact problems that have been overlooked by other researchers and industry. He’s addressed practical questions, like: which face mask allows the user to be heard best? And he’s taken bigger swings, with wearable microphone arrays. His research is practical, but also really cool.
The hear and now
As with any scientific experiment, the Listening Technology Lab's work must be done under replicable and tightly controlled conditions. A single decibel or a fraction of a millimeter of variation can throw off an experiment, especially when dealing with motion-sensitive spatial audio.
Thus, the most prominent feature in the Listening Technology Lab, located on the fourth floor of UIC's Science and Engineering Laboratory East, is the Variable Reverberation Time (VRT) chamber, a soundproof booth built by Chicagoland-based IAC Acoustics. Its walls are covered in acoustic panels, each attached with magnets so the team can move or remove them to vary the reverb. Some of the steel walls that hold the panels are so large they had to come in through the window.
Inside the VRT room, there need to be equally precise devices making and transmitting sound. The industry standard for audiology labs is a KEMAR-brand acoustic mannequin, which can both send and receive sound to simulate a human speaking and hearing. The Listening Technology Lab has one — but a single mannequin cannot simulate a conversation among multiple people, and each KEMAR costs thousands of dollars. Ergo, an elegant solution: 3D-printed heads, funded by a DPI Science Team grant and fitted with speakers and microphones, originally developed under Corey at UIUC.
But what really makes the lab run is the people.
The gang’s all hear
The four students in the Listening Technology Lab each found their way from a different place.
Rashen Fernando, a third-year PhD student, is a music aficionado who played guitar when he was young, then got into mixing and mastering. As an undergraduate electrical and electronic engineering major at the University of Peradeniya in Sri Lanka, he did a project on music source separation, separating different sources like drums and vocals. When he applied to UIC's PhD program, Corey, who was then on his way from Urbana, saw his application and recruited him.
Fernando is currently working on a paper on sound-field interpolation, which can help hearing systems adapt quickly when people move around. He's also developing new beamforming algorithms to remix real-life sounds for hearing aids, and he's part of Corey's collaboration with Aronoff and Vortant Technologies.
Rajesh Rameshbabu is a second-year PhD student who has a long-standing interest in audio signal processing, especially projects that combine real-world applications with strong mathematical foundations. He discovered Corey’s work at an industry conference in 2021 and was inspired by his research, his enthusiasm, and the clarity with which he presented ideas. He finds working on the sensor network project “particularly exciting, because it combines both theoretical and practical challenges, with the potential for broad impact in hearing technology and beyond.”
Yuezhong Xu, a second-year PhD student, wanted to study human-computer interaction. In particular, he was fascinated by an interactive wall at Carnegie Mellon University. Going down a research rabbit hole, he found that Corey’s interests included VR and human-computer interactions, “so I reached out to him.”
Xu is responsible for the lab’s robot heads and is also building a virtual-reality research system that records and plays back high-resolution 3D sound, allowing the team to simulate future algorithms in immersive virtual acoustic settings.
Nefty Lara, an electrical engineering major who will graduate in December, is the only undergrad in the group. She took Corey’s elective, Audio and Acoustic Signal Processing — as did Fernando and Rameshbabu — and asked him to advise her group’s senior design project, which was an intercom system for the lab’s variable-reverberation chamber. Shaped like the Hancock Tower, the speaker doesn’t quite work — most senior design projects don’t — but she loved working with Corey.
“At a bigger university, it can get difficult to find time to meet with your professors,” she said. “I was really, really touched by how he went out of the way to help us, answer our questions, give advice.” At the end of the project, she “basically begged him to let me come work with him.”
Now, Lara is working on finding a standardized way to evaluate assistive listening devices as part of the NIDCD grant.
Rameshbabu describes Corey as “not only deeply knowledgeable but also kind and patient in mentoring,” and all his students give him high marks as a leader.
“Especially as an assistant professor, there’s a lot of pressure on me to produce results,” Corey said. “But I also want to be a good mentor, like I had.”
How we got hear
A lifelong hearing aid user, Corey had always been interested in electrical engineering and systems. But it took him a few years to put the two ideas together. In graduate school at UIUC, he worked in communication systems and signal processing with Andrew Singer, now dean of the College of Engineering and Applied Sciences at SUNY Stony Brook.
While working on quantization and signal processing in Singer’s lab, Corey realized that the sensor arrays that deliver clear signals to a cellphone could also work together to deliver clearer signals for hearing aids.
He told Singer, “We can take some of this math, some of these ideas that other people aren’t looking at in the hearing aid space, and apply them and do some big, bold things.”
Impressed by Corey’s enthusiasm, Singer supported him in every way he could, working with funders and giving Corey nearly free rein to set up what became the Augmented Listening Laboratory. By the time Corey was a postdoc, the lab attracted not only graduate students, but undergraduates from disciplines across UIUC — engineering, acoustics, industrial design, fine and applied arts, and more. The Augmented Listening Lab and UIC’s Listening Technology Lab still work closely together, as they will with the new lab Singer is setting up at Stony Brook.
Singer calls Corey “wildly creative and brilliant,” noting, “he brings a broad perspective from the mathematical to the socially conscious, in a way that is exactly what the world needs.”
Where we go from here, to hear
Corey sees future hearing technology taking more advantage of connectivity. You walk into a room, and your hearing aid connects to other people’s hearing aids, to infrastructure in the room, to smartphones — securely and without violating anyone’s privacy.
“We can pick and choose what you hear, basically a mixing board for real life,” Corey said.
It’s an advantage for everyone who struggles to hear in a noisy world. One problem: who chooses what you hear? You may not want to hear anything but your conversation with your friends at a table in a crowded restaurant, but you need to hear someone yelling “Fire!” or an unexpected guest calling your name. Equally important, you don’t want to spend a lot of time curating your own sound mix.
It’s tricky, Corey said, “but we can start to poke at the edges of it by having certain modes, like you’d listen within a certain distance — maybe if you’re at a networking event, you just hear people nearby.”
That’s where human factors start to enter the picture. In graduate school and as a postdoc, Corey was focused on engineering and hardware. With his lab set up at UIC now, he’s planning to work more with local clinicians, because their patients need to inform the technology as much as benefit from it.
For example, in the NIDCD study, on the human side, Corey’s lab wants to know: how much delay can people tolerate? And on the engineering side: can they build clever signal processing strategies that get around that delay?
“We need to go both directions with that interchange,” Corey said. “And that’s why collaboration is going to be really important.”