Imagine yourself blindfolded and placed in a strange room; you snap your fingers or click your tongue, and listen closely to the echoes. Can you hear the shape of the room? Does the sound tell you if a door is open?
While it's commonly known that bats, dolphins and some birds can determine their surroundings by echolocation, some human beings are also able to map their environment through reflected sound. Daniel Kish, a Southern Californian who lost his vision to cancer, has been studied for his ability to navigate the world by making clicking noises with his tongue. (Click here to read a Men's Journal profile of Kish.)
Recently, scientists attempted to see whether they could simulate this ability using a computer algorithm, a speaker and four microphones. Their efforts, detailed in this PNAS study, may open the door to future applications in architectural acoustics and audio forensics.
In a series of experiments, the researchers were able to construct three-dimensional maps of a lecture room with a movable divider, and of a section of a Swiss cathedral, according to Ivan Dokmanic, the lead author and a PhD student in the audiovisual communications laboratory of the Ecole Polytechnique Federale de Lausanne, in Switzerland.
By emitting sounds from a speaker and measuring how long the echoes took to reach the microphones, the researchers were able to map the distances to the room's walls, even when the walls met at irregular angles. The method was not perfect, however: the algorithm was unable to describe the cathedral's arched roof, the authors said.
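The basic time-of-flight idea can be sketched in a few lines of code: a pulse travels from the speaker to a wall and back, so the round-trip delay, multiplied by the speed of sound and halved, gives the distance to that wall. This is a minimal illustration only, with made-up delay values; the study's actual algorithm is far more involved, matching echoes across four microphones to recover full 3-D room geometry.

```python
# Minimal sketch of the time-of-flight principle behind echo mapping.
# A pulse leaves the speaker, reflects off a wall, and returns; the
# round trip covers twice the wall distance. Illustrative only -- not
# the reconstruction algorithm described in the PNAS study.

SPEED_OF_SOUND = 343.0  # meters per second, in air at roughly 20 C

def wall_distance(echo_delay_s):
    """Distance to a reflecting wall, given the round-trip echo delay in seconds."""
    return SPEED_OF_SOUND * echo_delay_s / 2.0

# Hypothetical echo delays picked out of a microphone recording (seconds).
delays = [0.012, 0.029, 0.035]
distances = [wall_distance(t) for t in delays]
print([round(d, 2) for d in distances])  # distances in meters
```

With several microphones at known positions, the same delays can be combined geometrically to locate each reflecting wall in 3-D, which is the harder problem the researchers' algorithm solves.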
"Our algorithm opens the way for different applications in virtual reality, auralization, architectural acoustics and audio forensics," wrote Dokmanic and colleagues. "As an extension of our method, a person walking around the room and talking into a cellphone could enable us to both hear the room and find the person's location."
Follow me on Twitter @montemorin