In progress
A bird chirping, a glass breaking, an ambulance passing by. Listening to sounds helps us recognize events and objects, even when they are out of sight, in the dark, or behind a wall. Yet we know very little about how the brain transforms sound into meaningful information. In this project we combine Artificial Intelligence and neuroscience to learn more about how the brain does this. This knowledge contributes to the development of artificial hearing systems and to the study of soundscapes.
Do you want to know more? Contact us!
Publications
Team
Prof. Elia Formisano
Full Professor | Principal Investigator
Dr. Yenisel Plasencia-Calaña
Assistant Professor
Chris van der Lans
Front-end Developer