B.F. Skinner is perhaps the predominant figure in American psychology. He is an experimental psychologist at Harvard who has developed behaviorism as a position in learning (he remains hesitant to use the term “theory”).
Skinner emphasizes observable behavior in the study of humans—hence the term “behaviorism.” He rejects any attempt at introspection or use of hypothetical internal processes or structures to account for learning. Instead, Skinner uses the consequences of a behavior to explain why the behavior continues or fades.
Many of Skinner’s ideas are built upon Thorndike’s law of effect. Stated briefly, Skinner believes (or has been conditioned to say?) that behavior that is followed by reinforcement (positive or negative) has an increased probability of recurrence. Behavior followed by extinction or punishment has a decreased probability of recurrence.
Since learning is implied by a change in behavior, a teacher must first determine what behavioral change is desirable, then manipulate the consequences to alter the probability of the behavior recurring. Through proper use of shaping, the teacher can promote the development of new behaviors. In concept, this is quite simple. In practice, it is a bit more difficult, but quite within grasp, as research and experience with programmed instruction and behavior modification show.
Skinner’s ideas about instruction have been very influential on education. After a period of almost total domination, behaviorism is beginning to wane, yet its impact will continue to be felt.
Here is a comprehensive set of objectives for Skinner, along with points based on these objectives:
Partially in reaction to the field of psychoanalysis and the work of people like Freud, Skinner thought the best way to advance the field of psychology was through application of the scientific method based on observable experiments, not speculation or theoretical musings. Skinner held firm to the logical positivist position that all we can really know is that which we can learn through direct observation using our senses. He was not inclined to speculate about things nor to hypothesize about why something might have happened. He conducted experiments, observed, and recorded the results. Nothing else. He did this because of his belief that the only stable knowledge comes from direct observation, not from speculation about internal matters or things that are not directly observable.
For Skinner, the dependent variable involved in learning is a change in behavior, because that's the only thing that is directly observable. More specifically, the frequency with which a specific behavior occurs is the dependent variable. Obviously, Skinner would not award anything like "style points" to behavior, because of the qualitative dimension involved. Rather, he would specifically define and describe a behavior, then count the frequency with which it occurs following some event. The main independent variable involved in learning for Skinner is whether that behavior is reinforced. More specifically, Skinner would say that the independent variable influencing learning is the consequence of the behavior. This could include any of the four consequences described above.
Skinner didn't object to other theories of learning so much because he thought they were wrong, but rather because he thought they added nothing to our understanding of learning, since they were not based on tangible, observable, and repeatable factors. Any theory that looked to an internal process to account for learning, such as developmental and cognitive theories, was readily discounted by Skinner as non-scientific and thus of limited value. According to Skinner, these theories did not generate substantial knowledge that could be used to explain learning or to cause learning to happen. Thus, such theories were not worthy of consideration.
In Skinner's view of learning, a person, or an animal for that matter, must first emit or demonstrate a specific behavior that is subsequently reinforced and thus becomes learned. Reinforcement can follow only after the new behavior occurs. This raises the question, "How can you get a new behavior to occur so you can reinforce it and cause it to be learned?" The answer, according to Skinner, involves shaping. If a person is not capable of the desired behavior then, of course, you can't reinforce the desired behavior in that person. Instead, through the process of shaping, you begin by reinforcing any approximation of the desired behavior. If a child can't pronounce the word you want them to pronounce, you begin by reinforcing any approximation of that word that they can make. Then you slowly provide reinforcement only for those approximations that get closer and closer to the actual pronunciation of the word. Gradually the pronunciation of the word is shaped until it becomes the correct pronunciation. Then the student is reinforced only for the correct pronunciation. Shaping works, according to behaviorism, because the desired behavior is reinforced and thus more likely to occur. By reinforcing only closer and closer approximations to the desired behavior, the student gradually learns to emit the desired behavior.
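The shaping loop just described can be sketched as a toy simulation. Everything here is invented for illustration, not anything Skinner specified: responses are scored from 0 to 100 (100 standing for the desired behavior, such as a correct pronunciation), and the teacher's criterion for reinforcement is tightened toward the target after each reinforced attempt.

```python
import random

def shape(target=100, start=20, trials=200, seed=1):
    """Toy model of shaping: reinforce successive approximations.

    Only responses meeting the current criterion are reinforced, and
    the criterion is raised toward the target after each reinforcement.
    """
    rng = random.Random(seed)
    ability = start       # the learner's current best approximation
    criterion = start     # minimum score that earns reinforcement
    for _ in range(trials):
        # the learner emits a response near its current ability
        response = ability + rng.uniform(-5, 5)
        if response >= criterion:
            # reinforcement makes the emitted behavior more likely...
            ability = max(ability, min(response, target))
            # ...and the teacher raises the bar slightly
            criterion = min(target, criterion + 1)
    return ability, criterion

ability, criterion = shape()
```

Early on, almost any attempt clears the criterion; as the criterion rises, only closer approximations earn reinforcement, so the learner's typical response is pulled toward the target.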
Operant behavior, which is fundamental in Skinner's version of behaviorism, is behavior that simply occurs and operates on the environment, for which you can't identify a specific cause that forces the behavior. Most human behavior would be considered operant behavior. Respondent behavior is behavior that happens as a result of reflexes rather than as a result of being rewarded. For example, blinking our eyes when we get something in them is a respondent behavior. It is based on a reflex, and there is a clear cause of that behavior.
Recalling that Skinner is a behaviorist who limits himself to describing observable phenomena rather than speculating on any internal activities of the human mind, learning is considered a change in behavior, nothing more, nothing less. So when talking about the act of learning, Skinner is really talking about the act of behavioral change. Behavior changes depending upon the consequences of that behavior. If a specific behavior is reinforced, then the probability of that behavior occurring again is increased. This, then, is how the act of learning occurs, because learning is nothing more than a change in behavior in Skinner's view.
Stimuli play a key role in behaviorism. Stimuli that follow a behavior and increase the probability of that behavior occurring again are called reinforcing stimuli. These reinforcing stimuli, or reinforcers, play a central role in learning, without which learning would not occur. Another type of stimulus is also important in a behavioral view: stimuli that set the occasion for, or signal, that a certain behavior will be followed by reinforcement. These are called discriminative stimuli. Think of them as traffic lights for behaviors, where a green light indicates that a behavior is acceptable and will be reinforced, while a red light indicates the behavior is not acceptable and will not be reinforced. This is one manner in which we can control the behaviors of others to some extent. An obvious example is the bell that rings at the end of class to signal it's over. If the bell rings, a student can get up and run out of class to join his friends in the hall or head to the playground. However, if the bell has not rung and the student gets up and runs out of class, the consequence is very different. Likewise, a teacher can let students know that if they ask a question after she says "Are there any questions?" then she will answer that question thoroughly and completely. However, if she has not said "Are there any questions?" and a student asks a question, she will ignore it and not respond. Her statement "Are there any questions?" is a discriminative stimulus that sets the occasion for reinforcement.
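The classroom example above amounts to a simple gate: the same behavior meets a different consequence depending on whether the discriminative stimulus has been presented. A minimal sketch, with function name and labels invented for the example:

```python
def consequence(behavior, signal_present):
    """Return the consequence of a behavior, gated by whether the
    discriminative stimulus ("Are there any questions?") is present."""
    if behavior == "ask question" and signal_present:
        return "reinforced"       # the teacher answers thoroughly
    return "not reinforced"       # the teacher ignores the question

with_signal = consequence("ask question", signal_present=True)
without_signal = consequence("ask question", signal_present=False)
```

The behavior itself is identical in both calls; only the presence of the signal determines whether reinforcement follows, which is exactly what makes the stimulus "discriminative."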
It is not necessary, or even desirable, to reinforce every occurrence of a behavior in order for learning to occur. Rather, behavior can be reinforced on an intermittent schedule in which not every response results in reinforcement. For example, you may provide reinforcement to a learner after every third correct response rather than after each correct response. This would be an example of a ratio schedule of reinforcement because it is based on providing reinforcement after a certain number of responses. In the example given, reinforcement is provided for every third correct response, so this could be considered a 1:3 ratio schedule of reinforcement. It would be a fixed ratio schedule because the ratio of every third response remains constant. Another type of ratio schedule results when you vary how many responses have to happen before one is reinforced. You may provide reinforcement after the sixth response, then after the fourth response, then after the fifth response, then after the second response, then after the eighth response, and so forth. Overall this might average out to a reinforcement after every fourth response, in which case it would be called a 1:4 variable ratio schedule of reinforcement. Another way to provide reinforcement is based not on the number of responses but rather on the amount of time that has elapsed since the last reinforcement was given. These schedules can be based on a fixed time interval or a variable time interval. In a fixed interval schedule you would reinforce the next response after that interval had passed. Thus, on a two-minute fixed interval schedule you would reinforce the next response that occurs after a two-minute interval has passed. The amount of time in an interval schedule can also vary, just as the number of responses in a variable ratio schedule is not constant.
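The two ratio schedules above can be sketched as dispensers that decide, response by response, whether reinforcement is delivered. The function names and the 1:3 and 1:4 parameters are just illustrations; interval schedules would work the same way but would track elapsed time rather than counting responses.

```python
import random

def fixed_ratio(n):
    """Reinforce every n-th response (a 1:n fixed ratio schedule)."""
    count = 0
    def respond():
        nonlocal count
        count += 1
        if count == n:
            count = 0
            return True       # this response is reinforced
        return False
    return respond

def variable_ratio(mean, seed=0):
    """Reinforce after a varying number of responses averaging `mean`."""
    rng = random.Random(seed)
    count, required = 0, rng.randint(1, 2 * mean - 1)
    def respond():
        nonlocal count, required
        count += 1
        if count >= required:
            count, required = 0, rng.randint(1, 2 * mean - 1)
            return True
        return False
    return respond

fr3 = fixed_ratio(3)
pattern = [fr3() for _ in range(9)]        # every third response reinforced
vr4 = variable_ratio(4)
reinforced = sum(vr4() for _ in range(100))
```

On the fixed schedule the reinforced responses fall in a perfectly regular pattern, while on the variable schedule the learner cannot predict which response will pay off, which is why variable schedules tend to sustain steadier responding.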
Primary reinforcers are things like food, water, and air; they don't have to be learned in order to work as reinforcers. Secondary reinforcers are things like praise and money that have to be learned, through being paired with primary reinforcers, before they have reinforcing value.
For Skinner behavior is behavior, and all behavior follows the same laws or principles. He makes no distinction between verbal behavior and motor behavior. We learn to talk and communicate the same way we learn motor behavior like how to walk. There is nothing special about our verbal behavior. It's just another form of behavior.
Skinner accounts for creativity in the same manner that he accounts for all other behavior. There's nothing special about creativity in Skinner's system except that it's much less likely to be seen in the general public. In the past, some people have been reinforced for behaviors that we define as creative. Because these behaviors have been reinforced, these people persist with them, and thus we call them creative people. Likewise with people who have considerable drive and whom we are inclined to call highly motivated. This is not some innate, internal characteristic unique to these few people. Rather, it's the logical consequence of a history in which these people have been reinforced for setting goals, persisting with tasks, and achieving those goals.
Skinner has traveled from being the most influential theorist in education in the 1950s and 60s to perhaps being among the least influential today. At least this is true as far as reputation goes. It is very difficult to find educators today who claim to be Skinnerians, whereas 40 years ago almost all teachers embraced his ideas. Although his star has faded and his theory is very much out of favor, today you still find many teaching practices that have their origins in Skinner's theory. Today we don't claim to be Skinnerians, but our teaching practices would often indicate otherwise.
Skinner gave us behavioral objectives, the use of reinforcers (rewards), individualized instruction, the simple-to-complex sequencing of content, the use of active practice by students, the use of frequent feedback to students, criterion-referenced testing, self-pacing, mastery learning, and many more concepts in wide use in education today. Skinner remains very influential in schools although we no longer consider his theory to be important or even correct. We don't write about using Skinner's ideas and we don't talk about them, but when you look into classrooms you see a lot of Skinner today.
©Copyright 2015 Wallace Hannum