Mind Hacks™: Tips & Tools for Using Your Brain

Authors: Tom Stafford, Matt Webb
Combine Modalities to Increase Intensity
Events that affect more than one sense feel more intense in both of them.

The vision and audition chapters (Chapter 2 and Chapter 4, respectively) of this book
look at the senses individually, just as a lot of psychologists have over the years. But
interesting things begin to happen when you look at the senses as they interact with one
another.[1]

Multisensory information is the norm in the real world, after all. Tigers smell strong
and rustle as they creep through the undergrowth toward you. Fire shines and crackles as it
burns. Your child says your name as she shakes your shoulder to wake you up.

These examples all suggest that the most basic kind of interaction between two senses
should be the enhanced response to an event that generates two kinds of stimulation rather
than just one. Information from one sense alone is more likely to be a coincidence; simultaneous
information in two senses is a good clue that you have detected a real event.

In Action

We can see the interaction of information hitting two senses at once in all sorts of
situations. People sound clearer when we can see their lips [Hear with Your Eyes: The
McGurk Effect]. Movies feel more
impressive when they have a sound track. If someone gets a tap on one hand as they
simultaneously see two flashes of light, one on each side, the light on the same side as
the hand tap will appear brighter.

Helge Gillmeister and Martin Eimer of Birkbeck College, University of London, have
found that people experience sounds as louder if a small vibration is applied to their
index finger at the same time.[2] Although the vibration didn’t convey any extra information, subjects rated
sounds as up to twice as loud when they occurred at the same time as a finger vibration.
The effect was biggest for quieter sounds.

How It Works

Recent research on such situations shows that the combination of information
is wired into the early stages of sensory processing in the cortex. Areas of the cortex
traditionally thought to respond to only a single sense (e.g., parts of the visual cortex)
do actually respond to stimulation of the other senses too. This makes sense of the fact
that many of these effects occur preconsciously, without any sense of effort or
decision-making. They are preconscious because they are occurring in the parts of the
brain responsible for initial representation and processing of sensation — another example
(as in
To See, Act
) of our perception not being passive but being
actively constructed by our brains in ways we aren’t always aware of.

Macaluso et al.[3] showed that the effect can work the other way round from the one discussed
here: touch can enhance visual discrimination. They don’t suggest that integration is
happening in the visual cortex initially, but instead that parietal cortex areas
responsible for multisensory integration send feedback signals down to visual areas, and
it is this that allows enhanced visual sensitivity.

For enhancement to happen, the information has to be labeled as belonging to the same event,
and this is done primarily by the two inputs arriving simultaneously. Individual neurons
[The Neuron] are already set up to
respond to timing information and frequently respond strongest to inputs from different
sources arriving simultaneously. If information arrives at different times, it can
suppress the activity of cells responsible for responding to inputs across senses (senses
are called modalities, in the jargon).
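
As a toy sketch of that coincidence-detection idea (the 100-millisecond window and the gain
factors here are invented for illustration; they are not figures from the research), a cell
like this might be modeled as boosting its response when its two inputs arrive close together
in time and damping it when they don't:

    # Toy coincidence detector: an illustration, not a model from this book.
    WINDOW = 0.100  # seconds within which two inputs count as "the same event" (assumed)

    def multisensory_response(visual_t, auditory_t, unisensory_response=1.0):
        """Response to one visual and one auditory input arriving at the given
        times (seconds). Coincident inputs are enhanced, mismatched ones suppressed."""
        if abs(visual_t - auditory_t) <= WINDOW:
            return unisensory_response * 2.0   # enhancement for near-simultaneous inputs
        return unisensory_response * 0.5       # suppression for clearly asynchronous inputs

    print(multisensory_response(0.00, 0.03))   # within the window: boosted
    print(multisensory_response(0.00, 0.40))   # well outside it: suppressed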

So, what makes information from two modalities appear simultaneous? Obviously arriving
at the exact same time is not possible; there must be a resolution of
the senses in time below which two events appear to be simultaneous.

Although light moves a million times faster than sound, sound is processed faster once
it gets to the ear [Detect Timing with Your Ears] than light is processed once it gets to
the eye. The relative speed of
processing of each sense, coupled with the speed at which light and sound travel, leads to
a “horizon of simultaneity”[4] at about 10 meters — where visual and auditory signals from the same source
reach the cortex at the same time.
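
That 10-meter figure can be sanity-checked with a back-of-the-envelope calculation. Treat
light's travel time as negligible, assume sound travels at roughly 340 meters per second,
and assume audition beats vision through the early processing stages by about 30
milliseconds (those two numbers are assumptions for this sketch; the hack itself only
quotes the 10-meter result):

    # Back-of-the-envelope "horizon of simultaneity". The 340 m/s and 30 ms
    # figures are assumptions; the text only gives the ~10 m result.
    SPEED_OF_SOUND = 340.0        # metres per second, in air
    PROCESSING_ADVANTAGE = 0.030  # seconds by which audition reaches cortex before vision

    def cortical_arrival_gap(distance_m):
        """Visual minus auditory cortical arrival time (seconds) for an event
        at the given distance; zero means the two signals arrive together."""
        sound_travel = distance_m / SPEED_OF_SOUND   # light's travel time is negligible here
        return PROCESSING_ADVANTAGE - sound_travel

    horizon = PROCESSING_ADVANTAGE * SPEED_OF_SOUND  # where the gap crosses zero
    print(round(horizon, 1))                         # about 10.2 metres

    for d in (1, 10, 100):
        print(d, round(cortical_arrival_gap(d) * 1000))  # gap in milliseconds

Closer than the horizon, the ear's processing head start wins; farther away, sound's slower
travel dominates, which is where the distance compensation described next comes in.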

Most events don’t occur just on this 10-meter line, of course, so there must be some
extra mechanisms at work in the brain to allow sound and light events to appear
simultaneous. Previously, researchers had assumed that the calculation of simultaneity was
approximate enough that time difference due to arrival time could be ignored (until you
get to events very far away — like lightning that arrives before thunder, for example). But
now it appears that our brains make a preconscious adjustment for how far away something is
when calculating whether the sound and the light are arriving at the same time.[5]
Another mechanism that operates is simply to override the timing information that comes
from vision with the timing information from audition [Put Timing Information into Sound
and Location Information into Light].

End Notes
  1. To follow up the research on crossmodal interactions, you could start by reading
    Crossmodal Space and Crossmodal Attention, edited by Charles Spence and Jon Driver,
    with contributions from many of the people at the forefront of the field. You can
    read more about the Oxford University crossmodal research group on its home page:
    http://psyweb.psy.ox.ac.uk/xmodal/default.htm.
  2. Gillmeister, H., & Eimer, M. (submitted). Multisensory
    integration in perception: tactile enhancement of perceived loudness.
  3. Macaluso, E., Frith, C. D., & Driver, J. (2000). Modulation
    of human visual cortex by crossmodal spatial attention.
    Science, 289, 1206–1208.
  4. Pöppel, E. (1988).
    Mindworks: Time and Conscious Experience. New York: Harcourt Brace Jovanovich.
  5. Sugita, Y., & Suzuki, Y. (2003). Audiovisual perception: Implicit
    estimation of sound-arrival time. Nature, 421, 911.
Watch Yourself to Feel More
Looking at your skin makes it more sensitive, even if you can’t see what it is you’re
feeling. Look through a magnifying glass and it becomes even more sensitive.

The skin is the shortest-range interface we have with the world. It is the only sense
that doesn’t provide any information about distant objects. If you can feel something on
your skin, it is next to you right now.

Body parts exist as inward-facing objects — they provide touch information — but they also
exist as external objects — we can feel them with other body parts, see them, and (if you’re
lucky) feel and see those of other people.
Mold Your Body Schema and Understand What Makes Faces Special explore how we use vision
to update our
internal model of our body parts. But the integration of the two senses goes deeper, so much
so that looking at a body part enhances the sensitivity of that body part, even if you
aren’t getting any useful visual information to illuminate what’s happening on your
skin.

In Action

Kennett et al.[1] tested how sensitive people were to touch on their forearms. In controlled
conditions, people were asked to judge if they were feeling two tiny rods pressed against
their skin or just one. The subjects made these judgments in three conditions. The first
two are the most important, providing the basic comparison. Subjects were either in the
dark or in the light and looking at their arm — but with a brief moment of darkness so they
couldn’t actually see their arm as the pins touched it. Subjects allowed to look at their
arms were significantly more accurate, indicating that looking at the arm, even though it
didn’t provide any useful information, improved tactile sensitivity.

The third condition is the most interesting and shows exactly how pervasive the effect
can be. Subjects were shown their forearm through a magnifying glass (still with darkness
at the actual instant of the pinprick). In this condition, their sensitivity was nearly
twice as precise as their sensitivity in the dark!

This is astounding for at least two reasons. First, it shows that visual attention can
improve our sensitivity in another domain, in this case touch. There is no necessity for
touch to interact like this with vision. The senses could be independent until far later
in processing. Imagine if the double-click rate setting on your mouse changed depending on
what was coming down your Internet connection. You’d think it was pretty odd. But for the
brain this kind of interaction makes sense because we control where we look and events
often spark input to more than one of our senses at a time.

The second reason this is astounding is because it shows how a piece of technology
(the magnifying glass) can be used to adjust our neural processing at a very fundamental
level.

How It Works

Touch information is gathered together in the parietal cortex (consult the crib notes
in Get Acquainted with the Central Nervous System if you want to know where that is), in
an area called the primary somatosensory cortex. You’ll find neurons here arranged into a
map representing the surface of your body [Build Your Own Sensory Homunculus], and you’ll
find polysensory neurons. These respond in particular when visual and tactile inputs
synchronize and are suppressed when the two inputs are discordant; it
seems there’s a network here that integrates information from both senses, either within
the somatosensory map of the body or in a similar map nearby.

This theory explains why brain damage to the parietal cortex can result in distortions
of body image. Some patients with damaged parietal lobes will point to the doctor’s elbow
when asked to point to their own elbow, for example.

This hack and Mold Your Body Schema show that short-term
changes in our representation of our body are possible. Individual neurons in the cortex
that respond to stimulation of the skin can be shown to change what area of skin they are
responsible for very rapidly. If, for example, you anesthetize one finger so that it is no
longer providing touch sensation to the cortical cells previously responsible for
responding to sensation there, these cells will begin to respond to sensations on the
other fingers.[2]
In the magnifying glass condition, the expanded resolution of vision
appears to cause the resources devoted to tactile sensitivity of the skin to adjust,
adding resolution to match what the magnifying glass has artificially given vision.

In Real Life

This experiment explains why in general we like to look at things as we do them with
our hands or listen to them with our ears — like watching the band at a gig. We don’t just
want to see what’s going on — it actually enhances the other senses as well.

Perhaps this is also why first-person shooter games have hit upon showing an image of
the player’s hands on the display. Having hands where you can see them may actually remap
your bodily representation to make the screen part of your personal — or
near-personal — space, and hence give all the benefits of attention
[Don’t Divide Attention Across Locations] and
multimodal integration (such as the better sense discrimination shown in this hack) that
you get there.

End Notes
  1. Kennett, S., Taylor-Clarke, M., & Haggard, P. (2001).
    Noninformative vision improves the spatial resolution of touch in humans.
    Current Biology, 11, 1188–1191.
  2. Calford, M. B., & Tweedale, R. (1991). Acute changes in
    cutaneous receptive fields in primary somatosensory cortex after digit denervation in
    adult flying fox. Journal of Neurophysiology, 65, 178–187.
Hear with Your Eyes: The McGurk Effect
Listen with your eyes closed and you’ll hear one sound; listen and watch the speaker
at the same time and you’ll hear another.

If there were ever a way of showing that your senses combine to completely change your
ultimate experience, it’s the McGurk Effect. This classic illusion, invented by Harry McGurk
(and originally published in 1976[1]), makes you hear different sounds being spoken depending
on whether or not you can see the speaker’s lips. Knowing what’s going to happen doesn’t help:
the effect is just as strong.

In Action

Watch Arnt Maasø’s McGurk Effect video (http://www.media.uio.no/personer/arntm/McGurk_english.html;
QuickTime with sound). You can see a freeze frame of the video in Figure 5-3.

Figure 5-3. Arnt Maasø’s McGurk Effect video

When you play it with your eyes closed, the voice says “ba ba.” Play the video again,
and watch the mouth: the voice says “da da.” Try to hear “ba ba” while you watch the lips
move. It can’t be done.

How It Works

The illusion itself can’t happen in real life. McGurk made it by splicing the sound of
someone saying “ba ba” over a video of him making a different sound, “ga ga.” When you’re
not watching the video, you hear what’s actually being spoken. But when you see the
speaker too, the two bits of information clash. The position of the lips is key in telling
what sound someone’s making, especially for distinguishing between speech sounds (called
phonemes) like “ba,” “ga,” “pa,” and “da” (those which you make by popping air
out).

Visual information is really important for listening to people speak. It’s a cliché,
but I know I can’t understand people as well when I don’t have my glasses on.

— M.W.

We use both visual and auditory information when figuring out what sound a
person is making, and they usually reinforce each other, but when the two conflict, the
brain has to find a resolution. In the world the brain’s used to, objects don’t usually
look as if they’re doing one thing but sound as if they’re doing another.

Since visually you’re seeing “ga ga” and audition is hearing “ba ba,” these are
averaged out and you perceive “da da” instead, a sound that sits equally well with both
information cues. In other situations, visual information will dominate completely and
change a heard syllable to the one seen in the lip movements.[2]
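
One way to picture that “averaging” (purely an illustration, not the authors’ model or the
actual mechanism) is to place the three phonemes on a single made-up place-of-articulation
scale, with “ba” at the lips, “ga” at the back of the mouth, and “da” in between, then take
a weighted blend of the heard and seen cues and snap the result to the nearest category:

    # Toy illustration of cue "averaging"; scale values and weights are invented.
    PLACE = {"ba": 0.0, "da": 0.5, "ga": 1.0}  # made-up place-of-articulation scale

    def fuse(heard, seen, visual_weight=0.5):
        """Blend the auditory and visual cues, then snap to the nearest phoneme
        (perception is categorical, so there are no in-between sounds)."""
        estimate = (1 - visual_weight) * PLACE[heard] + visual_weight * PLACE[seen]
        return min(PLACE, key=lambda p: abs(PLACE[p] - estimate))

    print(fuse(heard="ba", seen="ga"))                     # 'da': the McGurk fusion
    print(fuse(heard="ba", seen="ga", visual_weight=0.9))  # 'ga': vision dominates

The snap-to-nearest-category step echoes the point about categorical perception in end note 2.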

Remarkably, you don’t notice the confusion. Sensory information is combined before
language processing is reached, and language processing tunes into only certain phonemes
[Speech Is Broadband Input to Your Head]. The decision as to what you hear is outside your voluntary control. The
McGurk Effect shows integration of information across the senses at a completely
preconscious level. You don’t get to make any decisions about this; what you hear is
affected by what goes in through your eyes. It’s a good thing that in most circumstances
the visual information you get matches what you need to hear.

End Notes
  1. McGurk, H., & MacDonald, J. (1976). Hearing lips and seeing
    voices. Nature, 264, 746–747.
  2. Fusion of the sound and sight information is possible only when you
    have experience with a suitable compromise phoneme. One of the interesting things
    about phonemes is that they are perceived as either one thing or the other, but not as
    in-between values. So although there exists a continuum of physical sounds in between
    “ba” and “da,” all positions along this spectrum will be perceived as either “ba” or
    “da,” not as in-between sounds (unlike, say, colors, which have continuous physical
    values that you can also perceive). This is called categorical perception.