NYU’s “Sounds of New York City” project listens to the city—and then, with the help of citizen scientists, teaches machines to decode the soundscape. Jim Daley reports.
No wonder they call New York the city that never sleeps. In fact, noise is one of the biggest civic complaints made by denizens of the Big Apple. Now a project that uses citizen science and artificial intelligence (A.I.) is trying to help. Called Sounds of New York City, or SONYC, the effort combines a network of sensors that constantly monitor ambient noise with machine learning and human volunteers.
“The SONYC project has two main goals: we want to advance the science and engineering of machine listening, and we want to help monitor and mitigate noise pollution in urban areas.”
Oded Nov, a professor of Technology Management and Innovation at NYU’s Tandon School of Engineering.
“Over the past two years, our sensors collected huge amounts of urban sound data.” But computers don’t know what different sounds mean—until they’re trained by people.
That’s where citizen science comes in: SONYC needs members of the public to listen to ambient sounds picked up by noise monitors and label the sounds so the computers can learn to independently recognize them.
“Labeling sound is harder than labeling images because sound is invisible and ephemeral.”
But once people label sounds and enter them into a computer, the machines have an easier time telling, say, a jackhammer from an idling truck.
“Anyone with a computer or a smartphone can participate in this research project. Search for SONYC NYU, and start labeling short sound recordings online. The more labeled examples we give our computers, the better they become at recognizing sounds.”
The information could help inform city agencies about where they should try hardest to cut the noise. With a little help from citizen scientists, SONYC just might let the city that doesn’t sleep finally get a little shut-eye.
[The above text is a transcript of this podcast.]