03 - Sensation and Perception

ENDURING ISSUES IN SENSATION AND PERCEPTION

Two key questions we address in this chapter concern the extent to which our perceptual experiences accurately reflect what is in the outside world (Person-Situation) and the ways in which our experiences depend on biological processes (Mind-Body). We will also examine the extent to which people around the world perceive events in the same way (Diversity-Universality) and the ways that our experience of the outside world changes as a result of experience over the course of our lives (Stability-Change and Nature-Nurture).

THE NATURE OF SENSATION

Sensation begins when energy stimulates a receptor cell in one of the sense organs, such as the eye or the ear. Each receptor cell responds to one particular form of energy - light waves (in the case of vision) or vibration of air molecules (in the case of hearing). When there is sufficient energy, the receptor cell "fires" and sends to the brain a coded signal that varies according to the characteristics of the stimulus. The process of converting physical energy, such as light or sound, into electrochemical codes is called transduction. For instance, a very bright light might be coded by the rapid firing of a set of nerve cells, but a dim light would set off a much slower firing sequence. The neural signal is coded still further as it passes along the sensory nerves to the central nervous system, so the message that reaches the brain is precise and detailed.

The impulses on the optic nerve reliably produce an experience we call vision, just as impulses moving along the auditory nerve produce the experience we call hearing, or audition. The one-to-one relationship between stimulation of a specific nerve and the resulting sensory experience is known as the doctrine of specific nerve energies. Even if the impulses on the optic nerve are caused by something other than light, the result is still a visual experience. Gentle pressure on an eye, for instance, results in signals from the optic nerve that the brain interprets as visual patterns - the visual pattern of "seeing stars" when we're hit in the eye is so familiar that even cartoons depict it.

Sensory Thresholds

To produce any sensation at all, the physical energy reaching a receptor cell must achieve minimum intensity, or absolute threshold. Any stimulation below the absolute threshold will not be experienced. But how much sensory stimulation is enough?

To answer such a question, psychologists present a stimulus at different intensities and ask people whether they sense anything. You might expect that there would come a point at which people would suddenly say, "Now I see the flash" or "Now I hear a sound." But actually, there is a range of intensities over which a person sometimes - but not always - can sense a stimulus. The absolute threshold is defined as the point at which a person can detect the stimulus 50% of the time that it is presented.
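The 50% definition can be made concrete with a small simulation (the intensity scale and noise level here are arbitrary illustrations, not values from any study): a noisy observer detects a stimulus on more and more trials as its intensity rises, and the absolute threshold is the intensity detected about half the time.

```python
import random

def detects(intensity, threshold=10.0, noise=2.0):
    """Simulated observer: moment-to-moment sensory noise means detection
    shifts gradually from 'never' to 'always' instead of jumping at one point."""
    return random.gauss(intensity, noise) > threshold

def detection_rate(intensity, trials=1000):
    """Fraction of presentations on which the stimulus is detected."""
    return sum(detects(intensity) for _ in range(trials)) / trials

# Method of constant stimuli: present several fixed intensities many times.
# The absolute threshold is the intensity detected on about 50% of trials.
for intensity in [6, 8, 10, 12, 14]:
    print(f"intensity {intensity:>2}: detected on {detection_rate(intensity):.0%} of trials")
```

Run repeatedly, the 50% point settles near intensity 10, while nearby intensities are detected sometimes but not always, matching the range of uncertain detection described above.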

Although there are differences among people, the absolute threshold for each of our senses is remarkably low. The approximate absolute thresholds under ideal circumstances are as follows (McBurney & Collings, 1984):

    • Hearing: The tick of a watch from 6 m (20 feet) in very quiet conditions

    • Vision: A candle flame seen from 50 km (30 miles) on a clear, dark night

    • Taste: 1 g (0.0356 ounces) of table salt in 500 L (529 quarts) of water

    • Smell: One drop of perfume diffused throughout a three-room apartment

    • Touch: The wing of a bee falling on the cheek from a height of 1 cm (0.39 inches)

Under normal conditions, absolute thresholds vary according to the level and nature of ongoing sensory stimulation. For example, your threshold for the taste of salt would be considerably higher after you ate salted peanuts. In this case, the absolute threshold would rise because of sensory adaptation, in which our senses automatically adjust to the overall average level of stimulation in a particular setting. When confronted by a great deal of stimulation, they become much less sensitive than when the overall level of stimulation is low. Similarly, when the level of stimulation drops, our sensory apparatus becomes much more sensitive than under conditions of high stimulation. This process of adaptation allows all of our senses to be keenly attuned to a multitude of environmental cues without getting overloaded.

Imagine now that you can hear a particular sound. How much stronger must the sound become before you notice that it has grown louder? The smallest change in stimulation that you can detect 50% of the time is called the difference threshold, or the just-noticeable difference (jnd). Like the absolute threshold, the difference threshold varies from person to person and from moment to moment. And, like absolute thresholds, difference thresholds tell us something about the flexibility of sensory systems. For example, adding 1 pound to a 5-pound load will certainly be noticed, so we might assume that the difference threshold must be considerably less than 1 pound. Yet adding 1 pound to a 100-pound load probably would not make much of a difference, so we might conclude that the difference threshold must be considerably more than 1 pound. But how can the jnd be both less than and greater than 1 pound? It turns out that the difference threshold varies according to the strength or intensity of the original stimulus. The greater the stimulus, the greater the change necessary to produce a jnd.

In the 1830s, Ernst Weber concluded that the difference threshold is a constant fraction, or proportion, of the original stimulus, a relationship known as Weber's law.
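Weber's law can be written as ΔI = k × I, where I is the intensity of the original stimulus and k is the Weber fraction for that sense. A minimal sketch, using an illustrative fraction of 0.02 for lifted weights (the exact value is an assumption, not given in the text), reproduces the 5-pound versus 100-pound example:

```python
def jnd(intensity, weber_fraction=0.02):
    """Weber's law: the just-noticeable difference is a constant
    fraction (here 0.02, an illustrative value) of the original stimulus."""
    return weber_fraction * intensity

# Adding 1 lb to a 5-lb load far exceeds the jnd, so it is noticed;
# adding 1 lb to a 100-lb load falls short of the jnd, so it is not.
print(f"jnd at   5 lb: {jnd(5):.2f} lb")    # 0.10 lb
print(f"jnd at 100 lb: {jnd(100):.2f} lb")  # 2.00 lb
```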

Subliminal Perception

The idea of an absolute threshold implies that some events occur subliminally - below our level of awareness. Can subliminal messages used in advertisements and self-help tapes change people's behavior? For decades, the story has circulated that refreshment sales increased dramatically when a movie theater in New Jersey flashed subliminal messages telling people to "Drink Coca-Cola" and "Eat Popcorn." In fact, sales of Coke and popcorn did not change (Dijksterhuis, Aarts, & Smith, 2005).

Similarly, audiotapes with subliminal self-help messages often promise more than they deliver (Dijksterhuis, Aarts, & Smith, 2005). In one series of studies, volunteers used such tapes for several weeks. About half said they had improved as a result of listening to the tapes, but objective tests detected no measurable change. Moreover, the perceived improvement had more to do with the label on the tape than its subliminal content: About half the people who received a tape labeled "Improve Memory" said that their memory had improved even though many of them had actually received a tape intended to boost self-esteem (Greenwald, Spangenberg, Pratkanis, & Eskenazi, 1991).

Nevertheless, there is clear evidence that under carefully controlled conditions, people can be influenced by information outside their awareness. In one study, a group of people was subliminally exposed to words conveying honesty (a positive trait), while another group was subliminally exposed to words conveying hostility (a negative trait). Subsequently, all the participants read a description of a woman whose behavior could be looked at as either honest or hostile. When asked to assess various personality characteristics of the woman, the people who had been subliminally exposed to "honest" words rated her as more honest, and those who had been subliminally exposed to "hostile" words judged her as being hostile (Erdley & D'Agostino, 1988).

These studies and others like them indicate that, in a controlled laboratory setting, people can process and respond to information outside of awareness (Van den Bussche, Van den Noortgate, & Reynvoet, 2009). But this does not mean that people automatically or mindlessly "obey" subliminal messages. To the contrary, independent scientific studies show that hidden messages outside the laboratory have no significant effect on behavior (Dijksterhuis, Aarts, & Smith, 2005).

We have discussed the general characteristics of sensation; however, each of the body's sensory systems works a little differently. We will now examine the unique features of each of the major sensory systems.

VISION

Different animal species depend more on some senses than on others. Bats rely heavily on hearing and some fish rely on taste. But for humans, vision is the most important sense; hence, it has received the most attention from psychologists. To understand vision, we need to look at the parts of the visual system.

The Visual System

Light enters the eye through the cornea, the transparent protective coating over the front part of the eye. It then passes through the pupil, the opening in the center of the iris (the colored part of the eye). In very bright light, the muscles in the iris contract to make the pupil smaller and thus protect the eye from damage. This contraction also helps us to see better in bright light. In dim light, the muscles relax to open the pupil wider and let in as much light as possible.

Inside the pupil, light moves through the lens, which focuses it onto the retina, the light-sensitive inner lining of the back of the eyeball. Directly behind the lens is a depressed spot in the retina called the fovea. The fovea occupies the center of the visual field, and images that pass through the lens are in sharpest focus here. Thus, the words you are now reading are hitting the fovea, while the rest of what you see - a desk, walls, or whatever - is striking other areas of the retina.

The Receptor Cells

The retina contains the receptor cells responsible for vision. These cells are sensitive to only one small part of the spectrum of electromagnetic energy known as visible light. Energies in the electromagnetic spectrum are referred to by their wavelength. Although we receive light waves from across the full spectrum, only a narrow band of wavelengths is visible to us. There are two kinds of receptor cells in the retina - rods and cones - named for their characteristic shapes. About 120 million rods and 8 million cones are present in the retina of each eye. Rods and cones differ from each other in a number of ways. Rods, chiefly responsible for night vision, respond only to varying degrees or intensities of light and dark. Cones, in contrast, allow us to see colors. Operating chiefly in daylight, cones are also less sensitive to light than rods are.

Rods and cones differ in other ways as well. Cones are found mainly in the fovea, which contains no rods. Rods predominate just outside the fovea. The greater the distance from the fovea, the sparser both rods and cones become. Rods and cones also differ in the ways that they connect to nerve cells leading to the brain. Both rods and cones connect to specialized neurons called bipolar cells, which have only one axon and one dendrite. In the fovea, cones generally connect with only one bipolar cell, while several rods share a single bipolar cell.

The one-to-one connection between cones and bipolar cells in the fovea allows for maximum visual acuity. Because rods normally pool their signals to the bipolar cells, they send a less-detailed message to the brain. As a consequence, outside the fovea visual acuity drops by as much as 50%.

In the dark, the fovea is almost useless because it contains no light-sensitive rods. To see a dim object, we have to look slightly to one side so that the image falls just outside the fovea where there are lots of highly sensitive rods. Conversely, if you want to examine something closely and in detail, move it into the sunlight or under a bright lamp. Up to a point, the more light, the better: Stronger light stimulates more cones in the fovea, increasing the likelihood that bipolar cells will start a precise, detailed message on its way to the brain.

Adaptation

Adaptation is the process by which our senses adjust to different levels of stimulation. In the case of vision, adaptation occurs as the sensitivity of rods and cones changes according to how much light is available. In bright light, the rods and cones become less sensitive to light. So when you go from bright sunlight into a dimly lit theater, you can see very little as you look for a seat. First the cones and then the rods slowly adapt until they reach their maximum sensitivity, a process that takes about 30 minutes. Since there is usually not enough energy in very dim light to stimulate many cones, you see the inside of the theater and the people around you primarily in black, white, and gray. The process by which rods and cones become more sensitive to light in response to lowered levels of illumination is called dark adaptation.

By the time you leave the theater, your rods and cones have grown very sensitive. As a result, all the neurons fire at once when you go into bright outdoor light. You shield your eyes and your irises contract in order to reduce the amount of light entering your pupils and striking your retinas. As this process of light adaptation proceeds, the rods and cones become less sensitive to stimulation. Within about a minute, both rods and cones are fully adapted and you no longer need to shield your eyes.

Dark and light adaptation can cause an afterimage.

From Eye To Brain

We have so far directed our attention to the eye, but messages from the eye must travel to the brain in order for a visual experience to occur. The first step in this process is for rods and cones to connect to bipolar cells. The bipolar cells then hook up with the ganglion cells, which lead out of the eye. The axons of the ganglion cells then join to form the optic nerve, which carries messages from each eye to the brain. The place on the retina where the axons of the ganglion cells join to form the optic nerve is called the blind spot. This area contains no receptor cells, so when light from a small object is focused directly on the blind spot, the object will not be seen.

After the nerve fibers that make up the optic nerves leave the eyes, they separate, and some of them cross to the other side of the head at the optic chiasm. The nerve fibers from the right side of each eye travel to the right hemisphere of the brain; those from the left side of each eye travel to the left hemisphere. Therefore, visual information about any object in the left visual field, the area to the left of the viewer, will go to the right hemisphere. Similarly, information about any object in the right visual field, the area to the right of the viewer, will go to the left hemisphere.

The optic nerves carry their messages to various parts of the brain. Some messages reach the area of the brain that controls the reflex movements that adjust the size of the pupil. Others go to the region that directs the eye muscles to change the shape of the lens. Still others go to lower brain centers, rather than the visual cortex. As a result, some people who are temporarily or permanently blind can describe various visual stimuli around them even though they say they "saw" nothing (Ffytche & Zeki, 2011). But the main destinations for messages from the retina are the visual projection areas of the cerebral cortex - the occipital lobe, where the complex coded messages from the retina are registered and interpreted.

How does the brain register and interpret these signals, "translating" light into visual images? First, the ganglion cells in the retina do some preliminary coding of the information entering the eye. Some ganglion cells send messages about the edges of objects. Others convey information about motion. Still others convey information about shadows or highlights. In the brain itself, certain brain cells - called feature detectors - are highly specialized to detect particular elements of the visual field, such as horizontal or vertical lines. Other feature-detector cells register more complex information, with some being sensitive to movement, depth, or color. These different types of feature detectors send messages to specific, but nearby, regions of the cortex. Visual experience, then, depends on the brain's ability to combine these pieces of information into a meaningful image.

Color Vision

Humans, like many other animals, see in color, at least during the day. Color vision is highly adaptive for an animal that needs to know when fruit is ripe or how to avoid poisonous plants and berries (which tend to be brightly hued). There are competing theories, however, about how we are able to see colors.

Properties of Color

We call different colors hues, and to a great extent, the hues you see depend on the wavelength of the light reaching your eyes. The vividness or richness of a hue is its saturation. The dimension of brightness depends largely on the strength of the light entering your eyes. If you squint and look at the color solid, you will reduce the apparent brightness of all the colors in the solid, and many of them will appear to become black.

Hue, saturation, and brightness are three separate aspects of our experience of color. Although people can distinguish only about 150 hues, gradations of saturation and brightness within those 150 hues allow us to see more than 2 million different colors (Travis, 2003).

Theories of Color Vision

If you look closely at a color television screen, you will see that the picture is actually made up of tiny red, green, and blue dots. The same principle is at work in our own ability to see thousands of colors. Specifically, red, green, and blue lights - the primary colors for light mixtures - can be combined to create any hue. This process is called additive color mixing, because each light adds additional wavelengths to the overall mix.

Color mixing with paint follows different rules than does color mixing with light. The color of paint depends not on which wavelengths are present, but rather on which are absorbed and which are reflected. For example, red paint absorbs light from the blue end of the spectrum and reflects light from the red end. Since paint mixing depends on what colors are absorbed, or subtracted, the process is called subtractive color mixing.
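The two mixing rules can be sketched with RGB triples standing in for wavelength mixtures (a deliberate simplification; real lights and pigments involve continuous bands of wavelengths):

```python
RED, GREEN, BLUE = (255, 0, 0), (0, 255, 0), (0, 0, 255)

def mix_lights(*colors):
    """Additive mixing: each light contributes its own wavelengths,
    so the channel values accumulate (capped at full intensity)."""
    return tuple(min(255, sum(c[i] for c in colors)) for i in range(3))

def mix_pigments(*colors):
    """Subtractive mixing (simplified): each pigment reflects only a
    fraction of each channel, so the reflectances multiply."""
    out = [255, 255, 255]  # start from white light
    for c in colors:
        out = [out[i] * c[i] // 255 for i in range(3)]
    return tuple(out)

print(mix_lights(RED, GREEN))        # (255, 255, 0): red + green light = yellow
print(mix_lights(RED, GREEN, BLUE))  # (255, 255, 255): all three lights = white
print(mix_pigments((255, 255, 0), (0, 255, 255)))  # yellow + cyan paint = (0, 255, 0), green
```

Note the contrast: mixing more lights moves the result toward white, while mixing more pigments absorbs more wavelengths and moves the result toward black.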

In the mid-1800s, the German physiologist Hermann von Helmholtz proposed a theory of color vision based on additive color mixing. Helmholtz reasoned that the eye must contain three types of cones that are sensitive to red, green, or blue-violet light. According to this view, color experiences come from mixing the signals from the three receptors. Helmholtz's explanation of color vision is known as trichromatic (or three-color) theory.

Trichromatic theory explains how three primary colors can be combined to produce any other hue. Trichromatic theory does not, however, explain some aspects of normal color vision. Why, for example, don't people with normal color vision ever see a light or pigment that can be described as "reddish green" or "yellowish blue"? And what accounts for color afterimages?

In the later 19th century, another German scientist, Ewald Hering, proposed an alternative theory of color vision that can explain these phenomena. Hering proposed the existence of three pairs of color receptors: a yellow-blue pair and a red-green pair that determine the hue you see; and a black-white pair that determines the brightness of the colors you see. Hering's theory is now known as the opponent-process theory.

Opponent-process theory does a good job of explaining color afterimages. Today, psychologists believe that both the trichromatic and opponent-process theories are valid, but at different stages of the visual process. As trichromatic theory asserts, there are three kinds of cones for color. Thus, trichromatic theory corresponds fairly closely to the types of color receptors that actually exist in the retina. The opponent-process theory closely reflects what happens along the neural pathways that connect the eye and the brain. Together these two theories account for most color phenomena.

Color Vision In Other Species

Most of us assume that color is in the environment. But studies of other species show that, to a great extent, color is in the eye of the beholder. Humans and most other primates perceive a wide range of colors. Most other mammals experience the world only in reds and greens or blues and yellows (Travis, 2003). Hamsters, rats, squirrels, and other rodents are completely color blind. However, some animals can see colors that we can't. Bees can see ultraviolet light. To a bee's eyes, flowers with white petals flash like neon signs pointing the way to nectar.

HEARING

If you had to make a choice, would you give up your sight or your hearing? Presented with this hypothetical choice, most people say they would give up hearing first. But the great teacher and activist Helen Keller, who was both blind and deaf from infancy, regretted her inability to hear more than anything else.

I am just as deaf as I am blind. The problems of deafness are deeper and more complex, if not more important than those of blindness. Deafness is a much worse misfortune. For it means the loss of the most vital stimulus - the sound of the voice that brings language, sets thoughts astir and keeps us in the intellectual company of man. (Keller, 1948; quoted in D. Ackerman, 1995, pp. 191-192)

Sound

The sensation we call sound is our brain's interpretation of the ebb and flow of air molecules pounding on our eardrums. When something in the environment moves, pressure is caused as molecules of air or fluid collide with one another and then move apart again. This pressure transmits energy at every collision, creating sound waves. The frequency of the waves is measured in cycles per second, expressed in a unit called hertz (Hz). Frequency primarily determines the pitch of the sound - how high or how low it is. The human ear responds to frequencies from approximately 20 Hz to 20,000 Hz. Cats can hear noises as high as 64,000 Hz and mice can hear sounds up to 100,000 Hz.

The height of the sound wave represents its amplitude, which, together with frequency, determines the perceived loudness of a sound. Sound intensity is measured in units called decibels. As we grow older, we lose some of our ability to hear soft sounds; however, we can hear loud sounds as well as ever.
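The decibel scale is logarithmic: each tenfold increase in sound pressure adds 20 dB. A short sketch using the standard acoustics formula (the 20-micropascal reference pressure is the conventional threshold-of-hearing value from acoustics, not a figure from this chapter):

```python
import math

REF_PRESSURE_PA = 20e-6  # 20 micropascals, the conventional hearing-threshold reference

def sound_pressure_level(pressure_pa):
    """Sound pressure level in decibels: 20 * log10(p / p0)."""
    return 20 * math.log10(pressure_pa / REF_PRESSURE_PA)

print(f"{sound_pressure_level(20e-6):.0f} dB")  # 0 dB: the reference pressure itself
print(f"{sound_pressure_level(20e-3):.0f} dB")  # 60 dB: pressure 1,000 times the reference
```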

The sounds that we hear seldom result from pure tones. Unlike a tuning fork, musical instruments produce overtones - accompanying sound waves that are different multiples of the frequency of the basic tone. This complex pattern of overtones determines the timbre, or texture, of the sound. A note played on the piano sounds different from the same note played on a violin because of the differing overtones of the two instruments. Music synthesizers can mimic different instruments electronically because they are able to produce the timbre of different musical instruments.

Like our other senses, hearing undergoes adaptation and can function optimally under a wide variety of conditions. City residents enjoying a weekend in the country may be struck at how quiet everything seems. However, after they have adapted to the quieter environment, they may find that the country starts to sound very noisy.

The Ear

Hearing begins when sound waves are gathered by the outer ear and passed along to the eardrum causing it to vibrate. The quivering of the eardrum prompts three tiny bones in the middle ear - the hammer, the anvil, and the stirrup - to hit each other in sequence and thus carry the vibrations to the inner ear. The stirrup is attached to a membrane called the oval window; vibrations of the oval window, in turn, are transmitted to the fluid inside a snail-shaped structure called the cochlea. The cochlea is divided lengthwise by the basilar membrane, which is stiff near the oval window but gradually becomes more flexible toward its other end. When the fluid in the cochlea begins to move, the basilar membrane ripples in response.

Lying on top of the basilar membrane and moving in sync with it is the organ of Corti. Here the messages from the sound waves finally reach the receptor cells for the sense of hearing: thousands of tiny hair cells that are embedded in the organ of Corti. Each hair cell is topped by a bundle of fibers. These fibers are pushed and pulled by the vibrations of the basilar membrane. When these fibers move, the receptor cells send a signal through afferent nerve endings that join to form the auditory nerve to the brain. The brain pools the information from thousands of hair cells to create sounds.

Neural Connections

The sense of hearing is truly bilateral: Each ear sends messages to both cerebral hemispheres. The switching station where the nerve fibers from the ears cross over is in the medulla, part of the brain stem. From the medulla, other nerve fibers carry the messages from the ears to the higher parts of the brain. Some messages go to the brain centers that coordinate the movements of the eyes, head, and ears. Others travel through the reticular formation. But the primary destinations for these auditory messages are the auditory areas in the temporal lobes of the two cerebral hemispheres. En route to the temporal lobes, auditory messages pass through at least four lower brain centers where auditory information becomes more precisely coded.

Theories of Hearing

Thousands of tiny hair cells send messages about the infinite variations in the frequency, amplitude, and overtones of sound waves. But how are the different sound-wave patterns coded into neural messages? One aspect of sound - loudness - seems to depend primarily on how many neurons are activated: The more cells that fire, the louder the sound. The coding of messages regarding pitch is more complicated. There are two basic views of pitch discrimination: place theory and frequency theory. According to place theory, the brain determines pitch by noting the place on the basilar membrane at which the message is strongest. High-frequency sounds cause the greatest vibration at the stiff base of the basilar membrane; low-frequency sounds resonate most strongly at the opposite end. The brain detects the location of the most intense nerve-cell activity and uses this to determine the pitch of a sound.

The frequency theory of pitch discrimination holds that the frequency of vibrations of the basilar membrane as a whole is translated into an equivalent frequency of nerve impulses. Thus, if a hair bundle is pulled or pushed rapidly, its hair cell fires rapidly, sending a rush of signals to the brain. Because neurons cannot fire as rapidly as the frequency of the highest pitched sound that can be heard, theorists have modified the frequency theory to include a volley principle. According to this view, auditory neurons can fire in sequence: One neuron fires, then a second one, and then a third. By then, the first neuron has had time to recover and can fire again. In this way, a set of neurons together, firing in sequence, can send a more rapid series of impulses to the brain than any single neuron could send by itself.
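The volley principle can be sketched as a rotation schedule (the 1,000-impulses-per-second cap is an illustrative round number for a single neuron's maximum firing rate, not a figure from the text):

```python
SINGLE_NEURON_MAX_HZ = 1000  # illustrative cap set by the refractory period

def can_follow(sound_freq_hz, neurons):
    """With n neurons firing in rotation, each neuron fires on only every
    nth cycle, so the squad can track n times the single-neuron limit."""
    return sound_freq_hz / neurons <= SINGLE_NEURON_MAX_HZ

def volley_schedule(cycles, neurons):
    """Which neuron fires on each successive sound-wave cycle."""
    return [cycle % neurons for cycle in range(cycles)]

print(can_follow(3000, neurons=1))           # False: one neuron cannot fire 3,000 times/sec
print(can_follow(3000, neurons=3))           # True: each neuron fires at only 1,000 Hz
print(volley_schedule(cycles=6, neurons=3))  # [0, 1, 2, 0, 1, 2]
```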

Because neither place theory nor frequency theory alone fully explains pitch discrimination, some combination of the two is necessary. Frequency theory appears to account for the ear's responses to frequencies up to about 4000 Hz; above that, place theory provides a better explanation of what is happening.

Hearing Disorders

Because the mechanisms that allow us to hear are so complicated, many different problems can interfere with hearing. Of the 28 million Americans with hearing loss, about 10 million are victims of exposure to noise. About 6.5 million teenagers have some hearing loss, an increase of nearly one-third from the levels in 1988-1994 (Shargorodsky, Curhan, Curhan, & Eavey, 2010). The chief culprits are leaf blowers, chain saws, and snowmobiles (Biassoni et al., 2005).

For people with irreversible hearing loss, a number of remedies are available. New digital technology has made hearing aids, which amplify sound, more precise by enhancing speech perception and reducing background noise. Surgery can help people with conductive hearing loss due to a stiffening of the connections between the bones of the middle ear. Cochlear implants offer hope to people who suffer from deafness due to cochlear damage.

Far from not hearing enough sound, millions of people hear too much of the wrong kind of sound and suffer greatly because of it. Almost everybody has at some time heard a steady, high-pitched hum that persists even in the quietest room. This sound, which seems to come from inside the head, is called tinnitus, and it is estimated to afflict approximately one out of every eight persons (Koehler & Shore, 2013). In some people, it becomes unbearably loud - like the screeching of subway brakes - and does not go away. In most cases, tinnitus results from irritation or damage to the hair cells. Prolonged exposure to loud sound or toxins, certain medical conditions, and even some antibiotics can cause permanent damage to the hair cells. In many cases, drug therapies, implants that create "white noise" (or sound blockage), and biofeedback can provide relief. The environment can also provide levels of sound that are harmful. It is estimated that more than 100 million people are exposed to sound levels high enough to contribute to serious health problems such as heart disease, hypertension, sleep disturbances, stress, and learning problems (Hammer, Swinburn, & Neitzel, 2014).

THE OTHER SENSES

Researchers have focused most of their attention on vision and hearing because humans rely primarily on these two senses to gather information about their environment. Our other senses - including smell, taste, balance, motion, pressure, temperature, and pain - are also at play, even when we are less conscious of them. We turn first to chemical senses: smell and taste.

Smell

Although the sense of smell in humans is much weaker than in most animals, it is still about 10,000 times more acute than taste. Like our other senses, smell undergoes adaptation.

Research has unlocked many of the mysteries of our other senses, but exactly how we smell is still an open question. Scientists do know that when we breathe in, air flows over the roughly 12 million odor-detecting cells high up in the nasal cavity. Each cell is specialized to respond to only some odorant molecules. The axons from millions of these receptors go directly to the olfactory bulb, where some recoding takes place. Then messages are routed to the olfactory cortex in the temporal lobes of the brain, resulting in our ability to recognize and remember about 10,000 different smells. There our understanding comes to a halt: Exactly how the coded messages from the nose result in the sensation of smell is still a mystery.

Odor sensitivity is related to gender. Numerous studies confirm that women generally have a better sense of smell than men (Chrisler & McCreary, 2010). Age also makes a difference since adults aged 20-40 have the sharpest sense of smell (Doty, 1989; Schiffman, 1997). Of the people tested by Doty and his colleagues, half of those 65-80 years old and three-quarters of those over the age of 80 had meaningful loss of smell (Doty, 2006). Anosmia, the complete loss of smell, can be devastating, but some experimental treatments hold out promise that smell, once lost, can be restored (Raloff, 2007).

Most mammals, including humans, have a second sensory system devoted to the sense of smell - which some animals use in daily living for such important things as marking their territory, identifying sexually receptive mates, and recognizing members of their group. Receptors located in the roof of the nasal cavity detect chemicals called pheromones, which can have quite specific and powerful effects on behavior. Humans also have receptors for pheromones and, like other mammals, there is evidence that pheromones can significantly affect behavior. For example, studies have demonstrated that pheromones can affect the menstrual cycles in women (McClintock, 1999; K. Stern & McClintock, 1998). Researchers have also shown that when males and lesbians are exposed to a natural female pheromone, their general mood is elevated and their ratings of the sexual attractiveness of females described in a story are enhanced. Findings are similar when females and gay men are exposed to a natural male pheromone (Berglund, Lindstrom, & Savic, 2006; Savic, Berglund, & Lindstrom, 2005; Scholey, Bosworth, & Dimitrakaki, 1999; Thorne, Neave, Scholey, Moss, & Fink, 2002; Thorne, Scholey, & Neave, 2000). One study of three dozen heterosexual female university students showed that women who used a perfume laced with a synthetic pheromone engaged in significantly more sexual behavior, although they were not approached more often by men nor did they have more informal dates compared with women who used the same perfume, but without the pheromone (McCoy & Pitino, 2002). Similar findings were reported in an earlier study of 17 heterosexual men (W. B. Cutler, Friedmann, & McCoy, 1998).

Taste

To understand taste, we must distinguish it from flavor - a complex interaction of taste and smell (Bartoshuk, 2009). Try holding your nose when you eat. You will notice that most of the food's flavor will disappear, and you will experience only the basic taste qualities: sweet, sour, salty, bitter, and umami (umami accounts for our sensitivity to monosodium glutamate - MSG - and related proteins).

The receptor cells for the sense of taste are housed in roughly 10,000 taste buds, most of which are found on the tip, sides, and back of the tongue. Each area of the tongue can distinguish all taste qualities, though some areas may be more sensitive to certain tastes than others.

The taste buds are embedded in the tongue's papillae, bumps that you can see if you look at your tongue in the mirror. When we eat something, the chemical substances in the food dissolve in saliva and go into the crevices between the papillae, where they come into contact with the taste buds. In turn, the taste buds release a neurotransmitter that causes adjacent neurons to fire, sending a nerve impulse to the parietal lobe of the brain and to the limbic system.

Taste, like the other senses, undergoes adaptation. When you first start eating salted peanuts or potato chips, the saltiness is quite strong, but after a while it becomes less noticeable. Furthermore, exposure to one taste quality can modify other taste sensations - after brushing your teeth in the morning, you may notice that your orange juice has lost its sweetness. Because the number of taste buds decreases with age, older people often lose interest in food because they cannot taste it as well as they used to.

Kinesthetic and Vestibular Senses

The kinesthetic senses provide information about the speed and direction of our movement in space. More specifically, they relay information about muscle movement, changes in posture, and strain on muscles and joints. Specialized receptors provide constant feedback from the stretching and contraction of individual muscles. The information from those receptors travels via the spinal cord to the cortex of the parietal lobes, the same brain area that perceives the sense of touch.

The vestibular senses provide information about our orientation or position in space that helps determine which way is up and which way is down. Like hearing, the vestibular senses originate in the inner ear, where hair cells serve as the sense organs. The impulses from these hair cells travel to the brain along the vestibular branch of the auditory nerve, but their ultimate destinations in the brain are still something of a mystery. Certain messages from the vestibular system go to the cerebellum, which controls many of the reflexes involved in coordinated movement. Others reach the areas that regulate the internal body organs, and some find their way to the parietal lobe of the cerebral cortex for analysis and response.

Perhaps we are most acutely aware of our vestibular senses when we experience motion sickness. Certain kinds of motion, such as riding in ships, trigger strong reactions in some people. According to one theory, motion sickness stems from discrepancies between visual information and vestibular sensations (Bubka & Bonato, 2003; R. M. Stern & Koch, 1996). In other words, our eyes and our body are sending our brain contradictory information. Our eyes tell our brain that we are moving, but the organs in our inner ear insist that we are sitting still. Susceptibility to motion sickness appears to be related to both race and genetics: People of Asian ancestry are particularly susceptible to motion sickness (Muth, Stern, Uijtdehaage, & Koch, 1994).

The Skin Senses

Our skin is our largest sense organ - a person 6 feet tall has about 21 square feet of skin. Skin contains receptors for our sense of touch, which plays an important role in human interaction and emotion. In most societies, hellos and good-byes are accompanied by gestures involving touch, like shaking hands. Touching and being touched by others bridges, at least momentarily, our isolation. In fact, research shows that touching someone for just a few seconds can successfully convey emotions such as anger, fear, disgust, love, gratitude, sympathy, happiness, and sadness (Hertenstein, Holmes, McCullough, & Keltner, 2009).

The skin's numerous nerve receptors, distributed in varying concentrations throughout its surface, send nerve fibers to the brain by two routes. Some information goes through the medulla and the thalamus and from there to the sensory cortex in the parietal lobe of the brain - which is presumably where our experiences of touch, pressure, and so on arise. Other information goes through the thalamus and then on to the reticular formation, which is responsible for arousing the nervous system or quieting it down.

Skin receptors give rise to sensations of pressure, temperature, and pain, but the relationship between the receptors and our sensory experiences is subtle. Researchers believe that our brains draw on complex information about the patterns of activity received from many different receptors to detect and discriminate among skin sensations. For example, our skin has "cold fibers" that increase their firing rate as the skin cools down and that slow their firing when the skin heats up. Conversely, we also have "warm fibers" that accelerate their firing rate when the skin gets warm and that slow down when the skin cools. The brain apparently uses the combined information from these two sets of fibers as the basis for determining skin temperature. If both sets are activated at once, the brain may read their combined pattern of firings as "hot." Thus, you might think that you are touching something hot when you are really touching something warm and something cool at the same time, a phenomenon known as paradoxical heat.

The skin senses are remarkably sensitive. For example, skin displacement of as little as 0.0000025 of an inch can result in a sensation of pressure. Moreover, various parts of the body differ greatly in their sensitivity to pressure: Your face and fingertips are extremely sensitive, whereas your legs, feet, and back are much less so.

Like other senses, the skin senses undergo various kinds of sensory adaptation. When we first get into a bath, it may be uncomfortably hot; but in a few minutes we adapt to the heat. Skin senses are also influenced by our expectations. When someone tickles us, our skin senses respond with excitement, but tickling ourselves produces no effect. Clearly, the brain draws on many sources of information in interpreting the sense of touch.

Pain

Although pain plays an important role in everyday life - alerting us to both minor and major injuries - more people visit doctors for relief of pain than for any other reason. The economic impact of pain is more than $100 billion a year in the United States alone (Mackey, 2005). Yet, to a great extent, pain remains a puzzle. What is the purpose of pain? An old adage holds that pain is nature's way of telling you that something is wrong; and it does seem reasonable to assume that damage to the body causes pain. But in many cases, actual physical injury is not accompanied by pain. Conversely, some people feel pain without having been injured or long after an injury has healed. One of the most perplexing examples of this is the phantom limb phenomenon, which occurs in at least 90% of amputees and perhaps half of the people who experience a stroke (Antoniello, Kluger, Sahlein, & Heilman, 2010). After amputation of an arm or a leg, a patient often continues to feel the missing limb; in fact some patients report that they can actually move their phantom limb (Ramachandran & Rogers-Ramachandran, 2007). Surprisingly, children who are born without arms or legs also often report having phantom limbs (Melzack, Israel, Lacroix, & Schultz, 1997). And therein lies an important clue to how the sensation of a phantom limb arises: not in the damaged nerves at the site of the amputation (which the children don't have), but higher up in the brain itself where there apparently is some kind of neural "picture" of what the intact human body should be like. As the brain slowly reorganizes itself to reflect the missing limb, the pain and other sensations often subside with time, demonstrating yet again the force of neural plasticity (Flor, Nikolajsen, & Jensen, 2006).

Gate-Control Theory

The sensation of pain in many ways remains mysterious, but some progress has been made in understanding why and how pain occurs. According to gate-control theory, a neurological "gate" in the spinal cord controls the transmission of pain messages to the brain: When the gate is open, pain impulses pass through and pain is felt; when the gate is closed, the messages are blocked and little or no pain is experienced.

When pain messages reach the brain, a complex series of reactions begins. The sympathetic nervous system springs into action. The nervous system and endocrine system go on alert to help deal with the crisis. Meanwhile, chemicals to reduce or stop the pain messages may be released both in the brain and in the spinal cord. Certain areas of the brain stem may also reduce the flow of incoming pain information by sending signals to fibers in the spinal cord to partially or completely close the "gate." These processes account for the fact that, despite an injury, little or no pain may be experienced.

Biopsychosocial Theory

Some psychologists believe that the gate-control theory oversimplifies the complex experience we call pain. According to biopsychosocial theory, pain sensations involve three interrelated phenomena: biological mechanisms, psychological mechanisms, and social mechanisms (Edwards, Campbell, Jamison, & Wiech, 2009).

Biological mechanisms involve the degree to which tissue is injured and our pain pathways have adapted. For example, chronic pain can alter pathways in the nervous system. As a result, the nerves in the spinal cord can become hypersensitive. Let's say you break a bone in your foot and don't get medical attention until pain prevents you from walking. Even after the break heals, a mild blow to your foot may be painful.

Genetics also appears to play a role. Scientists have identified a small genetic variation that accounts partially for individual differences in the experience of pain (Cox et al., 2006). Mutations of this gene can cause individuals to experience no pain, while other mutations can lead to hypersensitivity to pain.

Psychological mechanisms - our thoughts, beliefs, and emotions - also can affect our experience of pain (Wickelgren, 2009). In one study, heat pulses were administered to the lower right leg. Participants who expected only moderate pain reported much less pain when the stimulus was actually severe. Moreover, there was much lower activity in pain-related brain areas; and participants' expectations of lower pain were almost as effective as morphine in relieving physical pain (Koyama, McHaffie, Laurienti, & Coghill, 2005).

Some people make an active effort to cope with pain: Confident that they can overcome pain, they avoid negative feelings, engage in diverting activities, and refuse to let pain interfere with their daily lives. By contrast, others with the same injuries or disorders can be overwhelmed: They feel victimized, believing that their pain is ruling their lives and that no one understands. Studies indicate that believing in one's ability to cope may actually cause higher brain centers to reduce or block pain signals (Padhi, 2005; Wall & Melzack, 1996). Even temporary psychological states can have an impact; researchers have found that distracting people with sounds or pleasant aromas can reduce the sensation of pain (Moont, Pud, Sprecher, Sharvit, & Yarnitsky, 2010).

Social mechanisms, such as the degree of family support, also play a role (Master et al., 2009). In one large study of chronic pain patients, those who described their families as being supportive reported significantly less pain intensity, less reliance on medication, and greater activity levels than patients who reported family disharmony and limited support (Jamison & Virts, 1990). Cultural expectations can also affect the experience of pain as well as ways of coping with pain (Bonham, 2001; D. Gordon & Bidar-Sielaff, 2006; Sullivan, 2004).

Alternative Treatments

There is a vast array of treatments for reducing pain, though none of them is fully effective against all kinds of pain. Some of them work by affecting the gate-control mechanisms in the spinal cord. Others work by destroying the neurons responsible for pain (Weintraub, 2013). Still others, such as pain relievers, work directly in the brain. But the ways in which other treatments work remain a mystery. For example, many studies have shown that if you give pain sufferers a chemically inert pill, or placebo, but tell them that it is an effective pain reducer, they often report some relief (National Institutes of Health, 2007). There is no doubt that many home remedies rely on the placebo effect. Research indicates that both placebos and acupuncture, which involves the insertion of thin needles into parts of the body, work in part through the release of endorphins, pain-blocking neurotransmitters (Hollins, 2010). But recent research shows that even for placebos and acupuncture, endorphin release alone does not account for pain reduction (Kong et al., 2006; Matre, Casey, & Knardahl, 2006). Moreover, some other pain-reduction techniques - such as hypnosis or concentration exercises (as in the Lamaze birth technique) - appear to have nothing at all to do with endorphins, but rely on some other means of reducing the pain sensation (deCharms et al., 2005). Clearly, much more research is needed before we fully understand the sensation of pain.

PERCEPTION

Our senses provide us with raw data about the external world. But unless we interpret this raw information, it is nothing more than what William James (1890) called a "blooming, buzzing confusion." The eye records patterns of lightness and darkness, but it does not "see" a bird flitting from branch to branch. Deciphering meaningful patterns in the jumble of sensory information is what we mean by perception. But how does perception differ from sensation?

Perception is the brain's process of organizing and making sense of sensory information. Using sensory information as raw material, the brain creates perceptual experiences that go beyond what is sensed directly.

How do we see objects and shapes? Psychologists assume that perception begins with some real-world object with real-world properties "out there." Psychologists call that object, along with its important perceptual properties, the distal stimulus. We never experience the distal stimulus directly. Energy from it (or in the case of our chemical senses, molecules from it) must activate our sensory system. We call the information that reaches our sensory receptors the proximal stimulus. Remarkably, although the distal stimulus and the proximal stimulus are never the same thing, our perception of the distal stimulus is usually very accurate.

However, sometimes you perceive things that could not possibly exist, such as the "devil's tuning fork" illusion. In such cases, the brain actively creates and organizes perceptual experiences out of raw sensory data - sometimes even from data we are not aware of receiving.

Perceptual Organization

Early in the 20th century, a group of German psychologists calling themselves Gestalt psychologists set out to discover the principles through which we interpret sensory information. They demonstrated that the brain creates a coherent perceptual experience that is more than simply the sum of the available sensory information, and that it does so in regular and predictable ways.

In one important facet of the perceptual process, we distinguish figures from the ground against which they appear. A colorfully upholstered chair stands out from the bare walls of a room. We can distinguish a violin solo against the ground of a symphony orchestra or a single voice amid cocktail-party chatter. Sometimes, however, there are not enough cues in a pattern to permit us to easily distinguish a figure from its ground. This is the principle behind camouflage: to make a figure blend into its background.

Sometimes a figure with clear contours can be perceived in two very different ways because it is unclear which part of the stimulus is the figure and which is the ground. The artwork doesn't change, but your perception may.

In its search for meaning, our brain tries to fill in missing information, to see whole objects and to hear meaningful sounds, rather than just random bits and pieces of raw, sensory data.

Perceptual Constancies

When anthropologist Colin Turnbull (1961) studied the Mbuti pygmies of Zaire, most of them had never left the dense Ituri rain forest and had rarely encountered objects that were more than a few feet away. On one occasion, Turnbull took a pygmy guide named Kenge on a trip onto the African plains. When Kenge looked across the plain and saw a distant herd of buffalo, he asked what kind of insects they were. He refused to believe that the tiny black spots he saw were buffalo. As he and Turnbull drove toward the herd, Kenge believed that magic was making the animals grow larger. Because he had no experience of distant objects, he could not perceive the buffalo as having constant size.

Perceptual constancy refers to the tendency to perceive objects as relatively stable and unchanging despite changing sensory information. Without this ability, we would find the world very confusing. For example, a house looks like the same house day or night and from any angle. The sensory information changes as illumination and perspective change, but the object is perceived as constant.

We tend to perceive familiar objects at their true size regardless of the size of the image that they cast on the retina. The farther away an object is, the smaller the retinal image it casts. We might guess that a woman some distance away is 5 feet 4 inches tall when she is really 5 feet 8 inches, but hardly anyone would perceive her as being 3 feet tall, no matter how far away she is. We know from experience that adults are seldom that short. Size constancy depends partly on experience - information about the relative sizes of objects stored in memory - and partly on distance cues.
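The inverse relation between distance and retinal image size can be put in rough numbers. The sketch below is an illustration, not part of the original material: the 1.73 m height (about 5 feet 8 inches) and the standard visual-angle formula are assumptions chosen for the example.

```python
import math

def visual_angle_deg(object_size_m: float, distance_m: float) -> float:
    """Visual angle subtended at the eye by an object, in degrees."""
    return math.degrees(2 * math.atan(object_size_m / (2 * distance_m)))

# A 1.73 m adult viewed from increasingly far away: the image cast on
# the retina shrinks steadily, yet size constancy leads us to perceive
# a full-sized adult at every distance.
for d in (2, 10, 50):
    print(round(visual_angle_deg(1.73, d), 2))
```

Between 2 m and 50 m the subtended angle shrinks by a factor of more than 20, which is why distance cues and stored knowledge about typical adult height are needed to keep perceived size constant.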

Familiar objects also tend to be seen as having a constant shape, even though the retinal images they cast can change as they are viewed from different angles. This is called shape constancy. A rectangular door will project a rectangular image on the retina only when it is viewed directly from the front. From any other angle, it casts a trapezoidal image on the retina, but it is not perceived as having suddenly become a trapezoidal door.

Similarly, we tend to perceive familiar objects as keeping their colors, regardless of changes in the sensory information that reaches the eye. If you own a red automobile, you will see it as red whether it is on a brightly lit street or in a dark garage. But color constancy does not always hold true. When objects are unfamiliar or there are no customary color cues to guide us, color constancy may be distorted - as when you buy a sweater in a brightly lit store, only to discover that in ordinary daylight, it is not the shade you thought it was.

Brightness constancy means that even though the amount of light available to our eyes varies greatly over the course of a day, the perceived brightness of familiar objects hardly varies at all. We perceive a sheet of white paper as brighter than a piece of coal whether we see these objects in candlelight or under bright sunlight. Brightness constancy occurs because an object reflects the same percentage of the light falling on it whether that light comes from a candle or the sun. Rather than basing our judgment of brightness on the absolute amount of light an object reflects, we assess its brightness relative to the light reflected by surrounding objects.
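A toy calculation makes the point about relative reflectance. In the sketch below, the reflectance figures of roughly 90% for white paper and 5% for coal are assumed values chosen for illustration; the absolute light reaching the eye changes a thousandfold between the two lighting conditions, but the paper-to-coal ratio the brain can compare does not change at all.

```python
paper_reflectance, coal_reflectance = 0.90, 0.05  # assumed typical reflectances

for illumination in (10.0, 10_000.0):  # candlelight vs. sunlight, arbitrary units
    paper_light = paper_reflectance * illumination  # light reaching the eye from the paper
    coal_light = coal_reflectance * illumination    # light reaching the eye from the coal
    print(paper_light / coal_light)                 # relative comparison stays constant
```

Both iterations print the same ratio (18.0), which is consistent with the paper looking equally "white" by candlelight and in sunlight.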

Memory and experience play important roles in perceptual constancies, as well.

Perception of Distance and Depth

We are constantly judging the distance between ourselves and other objects. When we walk through a classroom, our perception of distance helps us to avoid bumping into desks or tripping over the wastebasket. We also assess the depth of objects - how much total space they occupy. We use many cues to determine the distance and the depth of objects. Some of these cues depend on visual messages that one eye alone can transmit; these are called monocular cues. Others, known as binocular cues, require the use of both eyes. Having two eyes allows us to make more accurate judgments about distance and depth, particularly when objects are relatively close. But monocular cues alone are often sufficient to allow us to judge distance and depth quite accurately (Vishwanath & Hibbard, 2013).

Monocular Cues

One important monocular distance cue that provides us with information about relative position is called interposition. Interposition occurs when one object partly blocks a second object. The first object is perceived as being closer, the second as more distant.

As art students learn, there are several ways in which perspective can help in estimating distance and depth. In linear perspective, two parallel lines that extend into the distance seem to come together at some point on the horizon. In aerial perspective, distant objects have a hazy appearance and a somewhat blurred outline. On a clear day, mountains often seem to be much closer than on a hazy day, when their outlines become blurred. The elevation of an object also serves as a perspective cue to depth: An object that is on a higher horizontal plane seems to be farther away than one on a lower plane.

Another useful monocular cue to distance and depth is texture gradient. An object that is close seems to have a rough or detailed texture. As distance increases, the texture becomes finer, until finally the original texture cannot be distinguished clearly, if at all. For example, when standing on a pebbly beach, you can distinguish among the gray stones and the gravel in front of your feet. However, as you look down the beach, you cannot make out individual stones at all. Shadowing is another important cue to the distance, depth, and solidity of an object.

Bus or train passengers often notice that nearby trees or telephone poles seem to flash past the windows, whereas buildings and other objects farther away seem to move slowly. These differences in the speeds of movement of images across the retina as you move give an important cue to distance and depth. You can observe the same effect if you stand still and move your head from side to side as you focus your gaze on something in the middle distance: Objects close to you seem to move in the direction opposite to the direction in which your head is moving, whereas objects far away seem to move in the same direction as your head. This distance cue is known as motion parallax.

Binocular Cues

All the visual cues examined so far depend on the action of only one eye. Many animals - such as horses, deer, and fish - rely entirely on monocular cues. Although they have two eyes, the two visual fields do not overlap, because their eyes are located on the sides of the head rather than in front. Humans, apes, and many predatory animals - such as lions and wolves - have a distinct physical advantage. Since both eyes are set in the front of the head, the visual fields overlap. The stereoscopic vision derived from combining the two retinal images - one from each eye - makes the perception of depth and distance more accurate.

Because our eyes are set approximately 2 1/2 inches apart, each one has a slightly different view of things. The difference between the two images is known as binocular disparity. The left eye receives more information about the left side of an object, and the right eye receives more information about the right side. Here's how to test this: Close one eye and line up a finger with some vertical line, like the edge of a door. Then open that eye and close the other one. Your finger will appear to have moved. When you look at the finger with both eyes, however, the two different images become one.

An important binocular cue to distance comes from the muscles that control the convergence of the eyes. When we look at objects that are fairly close to us, our eyes tend to converge - to run slightly inward toward each other. The sensations from the muscles that control the movement of the eyes thus provide a cue to distance. If the object is very close, such as at the end of the nose, the eyes cannot converge, and two separate images are perceived. If the object is more than a few yards (meters) away, the sight lines of the eyes are more or less parallel, and there is no convergence.

Location Of Sounds

Just as we use monocular and binocular cues to establish visual depth and distance, we draw on monaural (single-ear) and binaural (two-ear) cues to locate the source of sounds. In one monaural cue, loud sounds are perceived as closer than faint sounds, with changes in loudness translating into changes in distance. Binaural cues work on the principle that because sounds off to one side of the head reach one ear slightly ahead of the other, the time difference between sound waves reaching the two ears helps us make accurate judgments about location.
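The time-difference cue can be put in rough numbers. The sketch below uses a simplified path-length model; the 18 cm ear-to-ear distance and the 343 m/s speed of sound are assumed round figures, not values from the text.

```python
import math

HEAD_WIDTH_M = 0.18      # assumed ear-to-ear distance
SPEED_OF_SOUND = 343.0   # speed of sound in air, m/s

def interaural_time_difference_s(azimuth_deg: float) -> float:
    """Extra travel time (seconds) to the far ear for a sound source at the
    given azimuth (0 = straight ahead, 90 = directly to one side)."""
    return HEAD_WIDTH_M * math.sin(math.radians(azimuth_deg)) / SPEED_OF_SOUND

print(interaural_time_difference_s(0))    # straight ahead: no delay between ears
print(interaural_time_difference_s(90))   # directly to the side: maximal delay
```

For a source directly to one side, this model gives a delay of roughly half a millisecond (about 525 microseconds) - a tiny interval, but one the auditory system can exploit to localize the source.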

In a second binaural cue, sound signals arriving from a source off to one side of you are slightly louder in the nearer ear than in the ear farther from the source. The slight difference occurs because your head, in effect, blocks the sound, reducing the intensity of sound in the opposite ear. This relative loudness difference between signals heard separately by the two ears is enough for the brain to locate the sound source and to judge its distance.

Most of us rely so heavily on visual cues that we seldom pay much attention to the rich array of auditory information available around us. But people who have been blind since birth often compensate for their lack of vision by sharpening their awareness of sounds. As a result, they can figure out where obstacles lie in their paths by listening to the echoes from a cane and their own voices. In fact, many blind people can judge the size and distance of one object in relation to another by using nothing more than sound cues. This increased sensitivity to sound is weaker among people who became blind as small children, and it is often lacking in people who become blind after the age of 10 (Gougoux et al., 2004; Gougoux, Zatorre, Lassonde, Voss, & Lepore, 2005).

Perception of Movement

The perception of movement is a complicated process involving both visual information from the retina and messages from the muscles around the eyes as they follow an object. On occasion, our perceptual processes play tricks on us, and we think we perceive movement when the objects that we are looking at are in fact stationary. We must distinguish, therefore, between real and apparent movement.

Real movement refers to the physical displacement of an object from one position to another. The perception of real movement depends only in part on movement of images across the retina of the eye. If you stand still and move your head to look around you, the images of all the objects in the room will pass across your retina. But messages from the eye muscles counteract the changing information from the retina, so the objects in the room will be perceived as motionless.

The perception of real movement is determined by how the position of objects changes in relation to a background that is perceived as stationary. When we perceive a car moving along a street, for example, we see the street, the buildings, and the sidewalk as a stationary background and the car as a moving object.

Apparent movement occurs when we perceive movement in objects that are actually standing still. One form of apparent movement is referred to as the autokinetic illusion - the perceived motion created by the absence of visual cues surrounding a single stationary object. If you stand in a room that is absolutely dark except for one tiny spot of light and stare at the light for a few seconds, you will begin to see the light drift. In the darkened room, your eyes have no visible framework; there are no cues telling you that the light is really stationary. The slight movements of the eye muscles, which go unnoticed most of the time, make the light appear to move.

Another form of apparent movement is stroboscopic motion - the apparent motion created by a rapid series of still images. This form of apparent movement is illustrated best by a motion picture, which is not in motion at all. The film consists of a series of still pictures showing people and objects in slightly different positions. When the separate images are projected sequentially onto a screen at a specific rate of speed, the people and objects seem to be moving because of the rapid change from one still picture to the next.

Another common perceptual illusion, known as the phi phenomenon, occurs as a result of stroboscopic motion. When a light is flashed on at a certain point in a darkened room, then flashed off, and a second light is flashed on a split second later at a point a short distance away, most people will perceive these two separate lights as a single spot of light moving from one point to another. This perceptual process causes us to see motion in neon signs, where words appear to move across the sign, from one side to the other, as the different combinations of stationary lights are flashed on and off.

Visual Illusions

Visual illusions graphically demonstrate the ways in which we use sensory cues to create perceptual experiences that may (or may not) correspond to what is out there in the real world. By understanding how we are fooled into "seeing" something that isn't there, psychologists can figure out how perceptual processes work in the everyday world and under normal circumstances.

Psychologists generally distinguish between physical and perceptual illusions. One example of a physical illusion is the bent appearance of a stick when it is placed in water - an illusion easily understood because the water acts like a prism, bending the light waves before they reach our eyes. Perceptual illusions occur because the stimulus contains misleading cues that give rise to inaccurate or impossible perceptions.

Artists rely on many perceptual phenomena both to represent reality accurately and to distort it deliberately. In paintings and sketches drawn on a two-dimensional surface, it is almost always necessary to distort objects for them to be perceived correctly by viewers. For example, in representational art, the railroad tracks, sidewalks, and tunnels are always drawn closer together in the distance. Thus, our understanding of perceptual illusions enables us to manipulate images for deliberate effect - and to delight in the results.

Observer Characteristics

Our desires and needs shape our perceptions. To a degree, people tend to see the world the way they want it to be (Dunning & Balcetis, 2013). For example, dieters perceived a muffin as being larger than did non-dieters (van Koningsbruggen, Stroebe, & Aarts, 2011), and people who could win a $100 bill perceived it as being closer than one they could not win (Balcetis & Dunning, 2010). People in need are likely to perceive something that they think will satisfy that need (Balcetis & Dunning, 2007). The best-known example of this, at least in fiction, is a mirage: People lost in the desert have visual fantasies of an oasis over the next dune. In real life, most people have a fear of snakes. When shown an array of pictures, even very young children locate pictures of snakes more quickly than nonthreatening pictures (LoBue & DeLoache, 2008). Thus, people with strong emotions toward certain objects are likely to detect those objects more rapidly than other people do.

Values

Values also affect perception. In one classic experiment, nursery school children were shown a poker chip. Each child was asked to compare the size of the chip with the size of an adjustable circle of light until the child said that the chip and the circle of light were the same size. The children were then brought to a machine that produced a poker chip that could be exchanged for candy. Thus, the children were taught to value the poker chips more highly than they had before. After the children had been rewarded with candy for the poker chips, they were again asked to compare the size of the chips with a circle of light. This time the chips seemed larger to the children (W. W. Lambert, Solomon, & Watson, 1949).

Expectations

Preconceptions about what we are supposed to perceive can influence perception by causing us to delete, insert, transpose, or otherwise modify what we see (Esterman & Yantis, 2010). S. J. Lachman (1984) demonstrated this phenomenon by asking people to copy a group of stimuli similar to this one:

PARIS IN THE

THE SPRING

People often omitted the "extra" words and reported seeing more familiar (and more normal) expressions, such as PARIS IN THE SPRING. This phenomenon reflects a strong tendency to see what we expect to see, even if our expectation conflicts with external reality.

Cognitive Style

As we mature, we develop a cognitive style - our own way of dealing with the environment - that also affects how we see the world. Some psychologists distinguish between two general approaches that people use in perceiving the world. People taking the field-dependent approach tend to perceive the environment as a whole and do not clearly delineate in their minds the shape, color, size, or other qualities of individual items. If field-dependent people are asked to draw a human figure, they generally draw it so that it blends into the background. By contrast, people who are field independent are more likely to perceive the elements of the environment as separate and distinct from one another and to draw each element as standing out from the background.

Experience and Culture

Cultural background also influences people's perceptions. For example, Westerners' perceptions tend to focus on salient foreground objects, whereas Asians are more inclined to focus on contexts (Kitayama, Duffy, Kawamura, & Larsen, 2003; Miyamoto, Nisbett, & Masuda, 2006). Shown a scene of an elephant in a jungle: "An Asian would see a jungle that happened to have an elephant in it...a Westerner would see the elephant and might notice the jungle." (Denise Park quoted in Binns, 2007, p. 9). There is increasing evidence that cultural differences in people's experiences can actually rewire the nervous system (Goh et al., 2007; Park & Huang, 2010).

Personality

A number of researchers have shown that our individual personalities influence perception. For example, one study compared clinically depressed college students to students with eating disorders in terms of their ability to identify words related to depression and food (von Hippel, Hawkins, & Narayan, 1994). The students saw a series of words very quickly (for less than one-tenth of a second each). In general, students with an eating disorder were faster at identifying words that referred to foods that they commonly thought about than they were at identifying foods that they rarely thought about. Similarly, depressed students were faster at identifying adjectives describing personality traits that they commonly thought about (such as "timid") than adjectives that described traits that they rarely thought about.