The exotic sensory capabilities of humans
Lawrence D. Rosenblum and Michael S. Gordon look beyond the traditional five senses, to echolocation and more.
09 December 2012
I expect some stares as we walk into the bike shop, and we get them.* My companions are both blind, leading with white canes, and one is wheeling in his ailing mountain bike. I'm also not surprised when the salesman approaches me to ask what we need. But then one of my companions, Daniel Kish, answers that he's looking for a new tube: '24 inches, latex, with a Presta valve'. The salesman quickly realises that, despite appearances, Daniel is the experienced rider.
In fact, Daniel has been leading his group of blind mountain bikers and hikers – Team Bat – for over 15 years. Today's group is small: Daniel, his co-leader Brian Bushway, Daniel's intern Megan O'Rourke and myself, the only sighted rider. I'm along to document the experience for my book (Rosenblum, 2010).
We fix Daniel's bike, and meet Brian at his home in Mission Viejo, California. Once equipped, we leave the safety of Brian's driveway and turn onto the residential street leading to the mountain trail. That's when the tongue-clicking begins. Daniel, Brian and Megan are making loud, sharp clicking sounds with their tongues so that they can hear what I can see. Clicking in this way allows them to produce a sound that can be reflected from parked cars, trash cans and other objects along the street. These reflected sounds can tell Daniel and his friends the location of these silent obstacles, so that they can avoid them on their ride.
The technique Daniel and his friends are using is known as echolocation. Using the same basic methods as bats, dolphins and other echolocating species, many blind individuals are known to navigate in this way (Griffin, 1944; Rice, 1967). However, human echolocation is not restricted to the blind. Research in our lab and others has shown that with just 10 minutes of practice, blindfolded sighted subjects with no echolocation experience can use the skill to walk towards a wall and stop just before making contact (Rosenblum et al., 2000). This suggests that echolocation reflects a general sensitivity to reflected sound, one that gives us all an auditory sense of the space we occupy at any given moment. Consider, for example, the audible difference between a stairwell and a walk-in closet, a difference based on how the two settings reflect sound. In fact, our sensitivity to the way sound reflects through different kinds of acoustic environments has forced the movie and television industry to exert considerable effort acoustically modifying soundstage sets so that they sound like they look on screen.

But our ability to echolocate may also typify a more general strategy of the brain: maintaining a set of exotic skills that typically operate at an unconscious, implicit level. While we are mostly unaware of these hidden skills, they can be refined and made more prominent for a variety of purposes, including compensating for sensory loss, as in Daniel Kish's case. There is also evidence that our brains are designed to incorporate input via these exotic channels using the same mechanisms that serve our more mundane perceptual skills.
You hear like a bat
Reports of the ability of the blind to sense objects remotely appeared throughout the 19th and early 20th centuries, along with explanations ranging from a presumed sensitivity to magnetism to outright clairvoyance. The most prevalent early theory held that the blind could sense subtle changes in air pressure on their faces (and other exposed skin) resulting from the presence of objects. This 'facial vision' theory was based largely on the introspective reports of the blind themselves (Dresslar, 1893; Supa et al., 1944).
Despite these introspections, it is now known that this remote sense is based on reflected sound. A definitive test was conducted in the 1940s in one of Cornell University's old stone and wood buildings. Karl Dallenbach's lab was on the top floor and consisted of a large room with a vaulted wood-beam ceiling. Two blind and two sighted men were each asked to walk blindfolded toward a large masonite board and stop just before making contact. They repeated this task multiple times and were asked to remain quiet as they walked. All four could perform the task with some accuracy, rarely colliding with the board. When asked how they did it, three of them reported feeling changes in air pressure – facial vision – and none thought that they were using sound (Supa et al., 1944).
But Dallenbach noticed that, though they tried to remain quiet as they walked, the men were inadvertently making a great deal of noise. As was fashionable in the 1940s, the participants all wore hard-soled shoes, which produced a noticeable sound with each step on the hardwood floor. To control for this sound, the hardwood was covered with plush carpeting and the four were asked to take off their shoes and walk in stocking feet. They also wore headphones emitting a loud tone to mask environmental sounds. Now, when they walked toward the wall, they collided with it on every trial. Follow-up experiments using methods to neutralise air-pressure changes on the skin confirmed that hearing external sound was both necessary and sufficient for performing the task.
Since Dallenbach's work, other laboratories have shown that humans can use echolocation to hear more detailed properties of objects, including an object's horizontal position, relative distance and relative size (Ashmead et al., 1998; Kellogg, 1962; Rice, 1967; Teng & Whitney, 2011). Astonishingly, humans can also identify the general shape of an object (square, triangle, disc), and even an object's material composition (wood, metal, cloth), using echolocation (Rice, 1967; Schwitzgebel & Gordon, 2011; Thaler et al., 2010). Blind people are generally better echolocators, but untrained sighted people are also able to perform all of these tasks with some success and to improve their accuracy with practice.
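To get a feel for the most basic cue on offer here, distance, consider a small illustrative calculation. This is our own sketch of the textbook acoustics, not a figure from the studies cited: the delay between a click and its returning echo is simply the round-trip distance to the reflecting surface divided by the speed of sound.

```python
# Round-trip delay of a click's echo. Illustrative textbook acoustics
# only; these numbers are not reported in the studies cited above.

SPEED_OF_SOUND = 343.0  # metres per second, in air at about 20 °C

def echo_delay_ms(distance_m):
    """Milliseconds between emitting a click and hearing its echo
    from a reflecting surface distance_m metres away."""
    return 2.0 * distance_m / SPEED_OF_SOUND * 1000.0

for d in (0.5, 1.0, 2.0, 5.0):
    print(f"surface at {d:.1f} m -> echo after {echo_delay_ms(d):.1f} ms")
```

A wall five metres away returns its echo in under 30 milliseconds, and the gap shrinks with every step toward it; echo loudness and timbre change in similarly lawful ways, which is presumably the raw material the listeners described above are exploiting.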
In fact, recent research testing echolocation experts, including Daniel Kish, has revealed that the experts' spatial precision for sound-reflecting objects can rival that of bats, and the precision most of us display in locating sound-emitting objects. This acquired precision may be partly a consequence of experts recruiting the visual cortex for the skill. Multiple studies have shown that when blind experts echolocate, their visual cortices are active. Moreover, this activation seems to occur in ways that closely parallel cortical responses in typical visual perception (e.g. Milne et al., 2012; Teng & Whitney, 2011; Thaler et al., 2012). For example, when asked to listen to echoes from moving objects, Kish and other experts showed activity in cortical regions typically associated with visual motion detection (the middle temporal area, MT/V5). In another recent imaging study, experts were scanned while listening to echoes and judging either the location, shape or surface material of objects. Depending on the task, the experts showed cortical activity in the neural areas typically used for the visual apprehension of each of these same three characteristics.
The fact that the same specific brain regions may be used to determine an object's properties, via either vision or expert echolocation, supports an emerging idea about the brain: the perceptual brain may be organised around perceptual function more than around specific sensory systems as such. An organisation emphasising function over sensory modality could more easily take advantage of the redundant information available across hearing and seeing. This architecture might also better position the brain to deal with the compensatory plasticity often observed after sensory loss. Thus, the perceptual brain may be organised so that the implicit skills it harbours, including echolocation, are ready to take on a more prominent role when necessary. But exotic perceptual skills are not limited to sight and sound. It turns out that many of our implicit perceptual skills are nose-related.
You smell like a dog
I allow my students to blindfold and disorient me, occlude my ears with plugs and industrial ear protectors, and place thick gardening gloves on my hands. After this preparation, they place me about two metres away from a 12-metre rope they've laid across the ground and secured with garden stakes. This rope has been soaked in peppermint oil for a few days. It is my task to crawl to the rope and then follow its angular path using only my sense of smell.
As I slowly crawl with my nose about 10 cm from the ground, I get a very strong odour of grass and earth. It's a very familiar and comforting scent, reminiscent of childhood summers. But no peppermint. I lift my head and stick my nose in the air as I've seen dogs do, but it doesn't help. I place my nose back down and continue crawling forward.
Then I get a brief whiff of peppermint. It seems far off and ephemeral, but noticeably spicy, and very different from the earth and grass I've been smelling. I continue forward and then I realise that I've arrived: I am over the rope. I move my head just beyond the point of strongest smell and notice the odour weaken a little. I move my head back, turn my body parallel with what I believe is the rope, and start crawling along the line of strongest peppermint odour.
As I crawl forward, I have an almost tangible experience of being inside a shallow trench, or gutter, whose shape is defined by the strength of smell. If my nose moves too far to the side of the trench, it's as if the gradient of smell draws my nose back down the side slope, toward the path of strongest scent – the rope. Perhaps this is what it's truly like to smell like a dog.
Sniffing like a dog puts me in good company not only with dogs, but with some of California's brightest young minds. This human scent-tracking experiment is borrowed from work conducted at UC Berkeley (Porter et al., 2007). The undergraduates who performed the tracking task found it relatively easy, if embarrassing. With practice, they were able to improve their scent-tracking skills substantially, often doubling their speed at following the trail.
But perhaps the most interesting finding from the scent-tracking study revealed something astonishing about our noses: we compare smells across our two nostrils to determine an odour's location. In fact, comparative studies have shown that organisms ranging from mice and rats to ants and Drosophila smell in stereo, comparing bilateral sensory signals in order to track a scent (Rajan et al., 2006; Steck et al., 2010; Wallace et al., 2002). In this sense, our noses join our eyes and ears in making use of two inputs to help us locate where things are (at least for odorants providing some trigeminal nerve stimulation, e.g. Frasnelli et al., 2011). Consider that our auditory system attends to the small differences in when, and how much, sound reaches each of our ears to perceive a source's location: the ear closer to the source receives the sound slightly sooner and slightly louder than the farther ear. A similar process operates on the similarities and differences between the two eyes. And it's likely that the brain does something similar with the nose, comparing the amount of odour across the nostrils to determine where an odour originates.
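The logic of that bilateral comparison is simple enough to simulate. The sketch below is our own illustration, not a model from Porter et al. or any of the studies cited: a simulated tracker with two laterally offset 'nostrils' simply turns toward whichever side samples the stronger concentration, and that alone is enough to find and then follow a scent trail.

```python
import math

# A toy 'stereo sniffing' tracker -- an illustrative sketch, not a model
# from Porter et al. (2007). Two laterally offset sensors sample a scent
# field, and the tracker turns toward whichever side smells stronger.

def concentration(x, y):
    """Odour concentration: a Gaussian ridge along the line y = 0 (the
    'rope'); x is irrelevant for this idealised, perfectly straight trail."""
    return math.exp(-y**2 / (2 * 0.5**2))

def track(steps=300, offset=0.2, step=0.1, gain=1.0):
    x, y, heading = 0.0, 1.0, 0.0          # start one unit off the trail
    for _ in range(steps):
        # Left and right 'nostrils', held perpendicular to the heading.
        left = (x + offset * math.cos(heading + math.pi / 2),
                y + offset * math.sin(heading + math.pi / 2))
        right = (x + offset * math.cos(heading - math.pi / 2),
                 y + offset * math.sin(heading - math.pi / 2))
        # The bilateral comparison: steer toward the stronger sample.
        heading += gain * (concentration(*left) - concentration(*right))
        x += step * math.cos(heading)
        y += step * math.sin(heading)
    return x, y

x, y = track()
print(f"final perpendicular distance from the trail: {abs(y):.2f}")
```

Notice that such a tracker does not home in smoothly: the comparison only says which side is stronger, so it repeatedly overshoots the line of strongest scent and corrects, producing the weaving, zigzag paths seen in tracking dogs and in Berkeley's human trackers alike.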
We find that our noses can provide spatial navigation information that functions in parallel with what we gain visually and auditorily. Our sensory capacities allow us to behave successfully within our spatial environments by tracking visual objects and motion; by using echolocation and sound reflections; and by picking up and following the paths of odorants using the partially redundant information from our olfactory systems. Each of these sensory interactions with the world provides a distinct method of gaining the critical information needed to locate objects and move towards them successfully.
The human scent-tracking study helps underscore a larger point: perception has developed rich and redundant means of extracting critical information about the world. Our ability to navigate through space and accomplish spatially relevant tasks (like finding food) is too important to be left to a single sensory system or type of sensory information. We have developed a perceptual brain that can confirm information across channels, or find alternative routes to success when one channel fails. Our brains share resources across all of the sensory interactions available to us to create our experience of the world. And to maintain this spatial sensitivity, the brain retains a degree of functional plasticity that allows sensory inputs to compensate for one another.
Your plastic brain
Imagine yourself in this experiment. You don a specialised blindfold that prevents your eyes from receiving any light. You then check into a hospital room where you'll live, blindfolded, for the next five days. You are supervised by nurses and researchers, but being in a new environment without the benefit of sight does take some getting used to – as do the multiple tests you will be subjected to during the five days. Early on the first day, you are placed into a large brain scanner, and asked to touch a series of raised dot arrays with your fingers. Your task is to determine whether these dot patterns, presented sequentially in pairs, are the same or different. After this initial test, you begin your first six-hour Braille lesson, a lesson that will be repeated over the next four days.
Incidentally, by the second day you would start experiencing a number of side-effects of continuous blindfolding. You would have visual hallucinations of both amorphous and recognisable images, as well as an initial dullness of flavour and an over-sensitivity to temperature and sound. By the final day of the test (day five), you would no doubt notice that Braille recognition is much easier than it had been on day one. But before you get cocky about your improvement, you are subjected to one last manipulation that immediately renders you incompetent at the task. For this last test, a device held over the back of your head makes periodic clicks as you try to match the dot patterns. Strangely, you find the task nearly impossible, and have trouble telling the patterns apart at all. The phenomena described are based on the research of Pascual-Leone and colleagues (Pascual-Leone & Hamilton, 2001).
Five days of blindfolding would change you and your brain in fascinating ways. Most obvious would be your substantial improvement in discriminating the dot characters: the intensive Braille training would seem to have helped. But you may be disappointed to learn that you would have improved these touch skills even without the Braille training: five days of blindfolding alone can enhance basic touch skills. In addition, the final brain scan would reveal that when you touch complex patterns, your visual cortex is now activated much as it is in an individual who is truly blind. For participants who were not blindfolded, these brain changes would not occur, even with the intensive Braille training. Five days of visual deprivation is enough to establish substantially greater recruitment of the brain's visual processing areas in somatosensory tasks, along with the performance advantages such involvement provides.
As extra support for this conclusion, recall the (imagined) manipulation that left you unable to perform the task, the one in which a clicking device held to the back of your head completely disrupted your touch skills. That device induced a 'virtual lesion' in visual cortical regions using transcranial magnetic stimulation (TMS), a technology that uses localised magnetic pulses to temporarily disrupt neural processing in a small section of the cortex (see Rossini & Rossi, 2007, for a review). In this case, the somatosensory areas more traditionally recognised as the brain's territory for processing touch-based information were unaffected by the TMS; the stimulation was applied directly to visual cortex. It was these disruptions of visual cortex that upset the newfound touch skills acquired over the blindfolding period.
While it is somewhat striking how quickly our brains can adapt to new stimulation, perhaps the underlying mechanisms supporting these changes should not be all that surprising. Our sensory receptors respond to a wide variety of energy in the environment. Yet there is evidence to suggest that, for all that diversity of stimulation, once the energy is transduced into a biological signal it may function as part of a common language within our bodies, e.g. for speech (McGurk & MacDonald, 1976; Rosenblum, 2008), for detection of approaching objects (Gordon & Rosenblum, 2005; Morrongiello & Fenwick, 1991), and in cross-modal neural studies (Matteau et al., 2010; Sur et al., 1988). In the Braille example, evidence supports the notion that somatosensory information bearing on object detection creates a pattern of brain activation that is, in part, functionally equivalent to visual information for spatial patterns (Cheung et al., 2009; Ptito et al., 2008). One might conclude that the brain seeks out functionally similar and redundant patterns so as to process information more effectively (Anderson, 2010; Peelen & Kastner, 2009; Reich et al., 2012; Rosenblum, 2008; Sansom & Livesey, 2009). The sensory origin of that information becomes pragmatically irrelevant if it is useful in solving a specific perceptual problem.
Our perceptual world
The human perceptual world is rich with information, and with the perceptual abilities to explore that information. Humans are visually dominant creatures; vision is principal in our phenomenology, and more cortex is devoted to vision than to any other single function. Nonetheless, our perceptual interactions with the world draw on whatever variety of signals we can detect. Echolocation, whether as a supplement or a primary source, supports spatial navigation, as does olfaction, if we seek to encounter the world in that manner. Moreover, with each of these spatial senses we use similar methods of detection, comparing the gradations of stimulation across symmetrical, bilateral receptive areas: our two eyes, two ears and two nostrils. With respect to somatosensory and visual detection of objects, one might also argue that we use similar methods to explore the contours and edges of an object's surfaces.

It seems that by invoking these common methods of detection we are able to organise the sensory signals in our brain around common functions. This is not a subtle point: perception may have the capacity to organise by function rather than by the sensory categories derived from the sense organs. As suggested, whatever the stimulating source of our senses (e.g. lights, sounds), it is possible that once that energy is transduced by the receptors it becomes part of a common biochemical communication system. Our introspective experience may be of sound, smell or touch, but our perceptual interactions, and the brain's organisational principle, are of spaces, objects and events.
Lawrence D. Rosenblum
is in the Department of Psychology, University of California, Riverside
[email protected]
Michael S. Gordon
is Assistant Professor at William Paterson University, New Jersey, USA
[email protected]
*Vignettes are told from the perspective of the first author and have been adapted from his book: Rosenblum, L.D. (2010). See what I'm saying: The extraordinary power of our five senses. New York: Norton.
References
Anderson, M.L. (2010). Neural reuse: A fundamental organizational principle of the brain. Behavioral and Brain Sciences, 33, 245–313.
Ashmead, D.H., Wall, R.S., Ebinger, K.A. et al. (1998). Spatial hearing in children with visual disabilities. Perception, 27(1), 105–122.
Cheung, S-H., Fang, F., He, S. & Legge, G.E. (2009). Retinotopically specific reorganization of visual cortex for tactile pattern recognition. Current Biology, 19, 596–601.
Dresslar, F.B. (1893). On the pressure sense of the drum of the ear and 'Facial-Vision'. American Journal of Psychology, 5(3), 344–350.
Frasnelli, J., Hummel, T., Berg, J. et al. (2011). Intranasal localizability of odorants: Influence of stimulus volume. Chemical Senses, 36, 405–410.
Gordon, M.S. & Rosenblum, L.D. (2005). Effects of intra-stimulus modality change on audiovisual time-to-arrival judgments. Perception & Psychophysics, 67, 580–594.
Griffin, D.R. (1944). Echolocation by blind men, bats, and radar. Science, 100(2609), 589–590.
Kellogg, W.N. (1962). Sonar system of the blind. Science, 137, 399–404.
Matteau, I., Kupers, R., Ricciardi, E. et al. (2010). Beyond visual, aural, and haptic movement perception: hMT+ is activated by electrotactile motion stimulation of the tongue in sighted and in congenitally blind individuals. Brain Research Bulletin, 82, 264–270.
McGurk, H. & MacDonald, J.W. (1976). Hearing lips and seeing voices. Nature, 264, 746–748.
Milne, J.L., Goodale, M.A., Arnott, S.R. et al. (2012). Parahippocampal cortex is involved in material processing through echolocation in blind echolocation experts. Journal of Vision, 12, 581.
Morrongiello, B.A. & Fenwick, K.D. (1991). Infants' coordination of auditory and visual depth information. Journal of Experimental Child Psychology, 52, 277–296.
Pascual-Leone, A. & Hamilton, R. (2001). The metamodal organization of the brain. In C. Casanova & M. Ptito (Eds.) Vision: From neurons to cognition. Progress in brain research (Vol. 134, pp.1–19). Amsterdam: Elsevier.
Peelen, M.V. & Kastner, S. (2009). A nonvisual look at the functional organization of visual cortex. Neuron, 63, 284–286.
Porter, J., Craven, B., Khan, R.M. et al. (2007). Mechanisms of scent-tracking in humans. Nature Neuroscience, 10, 27–29.
Ptito, M., Schneider, F.C.G., Paulson, O.B. & Kupers, R. (2008). Alterations of the visual pathways in congenital blindness. Experimental Brain Research, 187, 41–49.
Rajan, R., Clement, J.P. & Bhalla, U.S. (2006). Rats smell in stereo. Science, 311, 666–670.
Reich, L., Maidenbaum, S. & Amedi, A. (2012). The brain as a flexible task machine: Implications for visual rehabilitation using invasive vs. non-invasive approaches. Current Opinion in Neurology, 25, 86–95.
Rice, C.E. (1967). Human echo perception. Science, 155, 656–664.
Rosenblum, L.D. (2008). Speech perception as a multimodal phenomenon. Current Directions in Psychological Science, 17, 405–409.
Rosenblum, L.D. (2010). See what I'm saying: The extraordinary power of our five senses. New York: Norton.
Rosenblum, L.D., Gordon, M.S. & Jarquin, L. (2000). Echolocation by moving and stationary listeners. Ecological Psychology, 12(3), 181–206.
Rossini, P.M. & Rossi, S. (2007). Transcranial magnetic stimulation. Neurology, 68(7), 484–488.
Sansom, S.N. & Livesey, F.J. (2009). Gradients in the brain: The control of the development of form and function in the cerebral cortex. Cold Spring Harbor Perspectives in Biology, 1, 1–16.
Schwitzgebel, E. & Gordon, M.S. (2011). Human echolocation. In E. Schwitzgebel, Perplexities of consciousness (pp.57–70). Cambridge, MA: MIT Press.
Steck, K., Knaden, M. & Hansson, B.S. (2010). Do desert ants smell the scenery in stereo? Animal Behaviour, 79, 939–945.
Supa, M., Cotzin, M. & Dallenbach, K.M. (1944). Facial vision: The perception of obstacles by the blind. American Journal of Psychology, 57(2), 133–183.
Sur, M., Garraghty, P.E. & Roe, A.W. (1988). Experimentally induced visual projections into auditory thalamus and cortex. Science, 242, 1437–1441.
Teng, S. & Whitney, D. (2011). The acuity of echolocation: Spatial resolution in sighted persons compared to the performance of an expert who is blind. Journal of Visual Impairment & Blindness, 105, 20–32.
Thaler, L., Arnott, S.R. & Goodale, M.A. (2010). Human echolocation I. Journal of Vision, 10(7), 1050.
Thaler, L., Milne, J., Arnott, S.R. & Goodale, M.A. (2012). Brain areas involved in echolocation motion processing in blind echolocation experts. Seeing and Perceiving, 25, 140.
Wallace, D.G., Gorny, B. & Whishaw, I.Q. (2002). Rats can track odors, other rats and themselves: Implications for the study of spatial behavior. Behavioural Brain Research, 131, 185–192.