Echolocation (human)


Human echolocation is the ability of humans to detect objects in their environment by sensing echoes from those objects. By actively creating sounds – for example, tapping their canes, lightly stomping a foot, snapping their fingers, or making clicking noises with their mouths – people trained to orient by echolocation can interpret the sound waves reflected by nearby objects, accurately identifying their location and size. This ability is used by some blind people for acoustic wayfinding, or navigating within their environment using auditory rather than visual cues. It is similar in principle to active sonar and to animal echolocation, which is employed by bats, dolphins and toothed whales to find prey.

The term "echolocation" was coined by zoologist Donald Griffin in 1944; however, reports of blind humans being able to locate silent objects date back to 1749. Human echolocation has been known and formally studied since at least the 1950s. In earlier times, human echolocation was sometimes described as "facial vision" or "obstacle sense," as it was believed that the proximity of nearby objects caused pressure changes on the skin. Only in the 1940s did a series of experiments performed in the Cornell Psychological Laboratory show that sound and hearing, rather than pressure changes on the skin, were the mechanisms driving this ability. The field of human and animal echolocation was surveyed in book form as early as 1959. See also White, et al. (1970)

Many blind individuals passively use natural environmental echoes to sense details about their environment; however, others actively produce mouth clicks and are able to gauge information about their environment using the echoes from those clicks. Both passive and active echolocation help blind individuals learn about their environments.

Because sighted individuals learn about their environments using vision, they often do not readily perceive echoes from nearby objects. This is due to an echo suppression phenomenon brought on by the precedence effect. However, with training, sighted individuals with normal hearing can learn to avoid obstacles using only sound, showing that echolocation is a general human ability.

Vision and hearing are closely related in that they can process reflected waves of energy. Vision processes light waves as they travel from their source, bounce off surfaces throughout the environment and enter the eyes. Similarly, the auditory system processes sound waves as they travel from their source, bounce off surfaces and enter the ears. Both systems can extract a great deal of information about the environment by interpreting the complex patterns of reflected energy that they receive. In the case of sound, these waves of reflected energy are called "echoes".
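As a rough illustration of the physics involved (an assumption-laden sketch, not a description of how echolocators consciously reason), the distance to a reflecting object is related to the delay between an emitted click and its returning echo by the speed of sound, roughly 343 m/s in air at room temperature:

# Illustrative sketch only: estimating the distance to a reflecting surface
# from the round-trip delay between an emitted click and its echo.
# Assumes a speed of sound of about 343 m/s (air, ~20 °C).

SPEED_OF_SOUND_M_PER_S = 343.0

def distance_from_echo_delay(delay_seconds: float) -> float:
    """Return the one-way distance to a reflector given the round-trip echo delay."""
    # The click travels to the object and back, so halve the round-trip path.
    return SPEED_OF_SOUND_M_PER_S * delay_seconds / 2.0

# Example: an echo arriving 10 milliseconds after the click implies a
# reflector roughly 1.7 metres away.
print(distance_from_echo_delay(0.010))  # ≈ 1.7

In practice, listeners do not compute such values explicitly; the auditory system extracts distance, direction and surface properties from cues such as echo delay, loudness and timbre.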

