Echolocation is a method of perceiving the world by emitting sounds and then listening to the reflections of those sounds off objects in the environment. Animals use echolocation for hunting and navigation, and visually impaired humans also employ it as part of their orienting repertoire while navigating the world. A few rare individuals can echolocate very well without assistance, but researchers at Boston University have developed a prototype device that can enhance these auditory cues for anyone navigating an environment. The device emits an ultrasonic click, inaudible to humans, several times per second, and each click reflects off objects in the environment. The reflections are detected by special head-mounted microphones, and computer processing converts the ultrasonic signals into audible ones, which the user hears through custom open-ear earphones.
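The article does not describe the signal processing in detail, but the ultrasonic-to-audible conversion it mentions can be illustrated with a minimal sketch. The heterodyning approach, carrier frequency, and filter cutoff below are assumptions chosen for illustration, not details of the BU prototype.

import numpy as np
from scipy.signal import butter, sosfilt

def heterodyne_to_audible(echo, fs, carrier_hz=40_000.0, audio_cutoff_hz=8_000.0):
    # Mix the recorded ultrasonic echo with a local oscillator at the
    # carrier frequency; the difference-frequency components land in the
    # audible band, and a low-pass filter removes the sum-frequency terms.
    t = np.arange(len(echo)) / fs
    mixed = echo * np.cos(2.0 * np.pi * carrier_hz * t)
    sos = butter(4, audio_cutoff_hz, btype="low", fs=fs, output="sos")
    return sosfilt(sos, mixed)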
The end result is an "auditory image" in which objects in the environment seem to emit "sounds" to the user, with objects of different shapes and textures emitting subtly different sounds, so that the user can distinguish between them. According to BU researcher Cameron Morland (cjmorlan@bu.edu), the unique acoustic characteristics of the reflections let the user better judge the location, size, and "surface" properties of objects. For instance, sounds emitted by an object to the left will arrive at the left ear slightly sooner and slightly louder than at the right (the interaural time difference and interaural level difference).
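The interaural time difference mentioned here can be approximated with a standard textbook formula: for a distant source, the extra path to the far ear is roughly the ear spacing times the sine of the source's angle off center. This is a generic illustration, not part of the BU system, and the ear spacing is an assumed typical value.

import math

SPEED_OF_SOUND = 343.0   # m/s in air at roughly 20 C
EAR_SPACING = 0.18       # meters between the ears (assumed typical value)

def interaural_time_difference(azimuth_deg):
    # For a distant source, the near ear leads by about d * sin(azimuth) / c.
    return EAR_SPACING * math.sin(math.radians(azimuth_deg)) / SPEED_OF_SOUND

# A source 45 degrees off center arrives about 0.37 ms sooner at the near ear.
print(f"{interaural_time_difference(45.0) * 1e3:.2f} ms")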
Furthermore, sweeping the device across a surface while keeping the same distance from it will produce a reflection of unchanged pitch if the surface is flat. If the surface is tilted so that it moves closer to the user, it will sound higher in pitch; tilted the other way, it will sound lower in pitch (a Doppler shift). A roughly textured surface has some regions that are closer and others that are farther away, and users can quickly learn to discern the resulting pattern of rising and falling pitch. "Venetian blinds sound quite different than a flat surface, or a bookshelf packed with different-sized books," says Morland.
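The pitch change described here is the ordinary Doppler shift for an echo off a surface patch that is effectively approaching or receding as the device sweeps. The formula is standard sonar physics; the emitted frequency and sweep speed below are assumed values, not specifications of the device.

SPEED_OF_SOUND = 343.0  # m/s in air

def reflected_frequency(f_emit_hz, closing_speed_mps):
    # Echo frequency from a reflector approaching the listener at
    # closing_speed; a negative speed means the patch is receding.
    return f_emit_hz * (SPEED_OF_SOUND + closing_speed_mps) / (SPEED_OF_SOUND - closing_speed_mps)

# Sweeping toward a tilted patch at about 0.5 m/s raises a 40 kHz click by roughly 117 Hz.
print(f"{reflected_frequency(40_000.0, 0.5) - 40_000.0:.0f} Hz")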
The BU team has built a prototype capable of simple detection of objects and open spaces, and preliminary tests show that most people can echolocate a little using the device and improve quickly with practice. They are now refining the prototype to function in more complex, real-world environments. Morland believes that, given enough practice, people should be able to echolocate very well using the device, perhaps better than they could unassisted, since higher frequencies outside the normal range of human hearing are more useful for echolocation. (Movies of the device can be found at cns.bu.edu/~cjmorlan/research.)
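One reason ultrasonic frequencies help, which the article only hints at, is that higher frequencies have shorter wavelengths and so reflect off finer surface detail. A quick worked comparison, with frequencies chosen purely for illustration:

SPEED_OF_SOUND = 343.0  # m/s in air

def wavelength_mm(freq_hz):
    # Acoustic wavelength; features much smaller than this reflect sound poorly.
    return SPEED_OF_SOUND / freq_hz * 1e3

for f in (4_000, 40_000):  # an audible click versus an ultrasonic one
    print(f"{f} Hz -> {wavelength_mm(f):.1f} mm")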
"What it is like to be a bat: A sonar system for humans" was presented on Tuesday, July 1, 2008 at the Acoustics '08 meeting in Paris.
Adapted from materials provided by American Institute of Physics