In contrast to the minimal impact of Watson’s work on behavioral learning theory, it is the work of Burrhus Frederic Skinner that dominates textbook accounts and popular understanding of behavioral learning theory. Skinner’s studies of operant conditioning, contingencies of reinforcement, and schedules of reward and punishment have played a major role in the design of instruction for nearly a century. During his career, and following his retirement, Skinner published regularly (as evidenced by the 63-page bibliography compiled by Smith and Morris, 2003), producing over 20 books and nearly 200 articles between 1930 and 1993 (“Articles,” 2009). Though Skinner wrote on a variety of topics, the primary focus of his research remained operant conditioning and contingencies of reinforcement. As a result of this enduring focus, and his relentless commitment to the experimental study of the principles of behaviorism, Skinner is largely responsible for bringing behavior-based principles of learning into the American classroom.
Skinner’s dogmatic and unrelenting commitment to the scientific investigation of empirical principles can be understood, in part, through a formative experience in his teenage years. While Skinner was reading Shakespeare for a school class, his father commented on the disputed authorship of Shakespeare’s plays. This in turn led Skinner to the writings of Francis Bacon (Skinner, 1967, pp. 388-389), which he believed likely influenced the philosophical position he took later in life as a scientist (p. 409).
Skinner (1979) attributed his conversion to behaviorism to another philosopher, Bertrand Russell (p. 5), who, in a review of The Meaning of Meaning by C. K. Ogden, had referred to Watson’s Behaviorism as “massively impressive.” Skinner subsequently bought both Watson’s Behaviorism and Russell’s Philosophy. These two books, along with a copy of Pavlov’s Conditioned Reflexes, formed the start of a small personal library that he kept in his rented room. He read the first third of Philosophy, but abandoned his reading in the middle third when he lost interest in Russell’s rehash of his views on nature. By doing so, he missed the last third of the book, in which Russell undertakes to disprove the behavioristic view expressed in the first third by talking about “man from within.” The ironic result was that Skinner became converted to the behaviorist perspective and stayed true to that course throughout his entire life. Additional influences on his conversion included a weekly seminar in animal behavior given by Walter S. Hunter of Clark University, a graduate student by the name of Charles K. Trueblood, Fred Keller (who had particular sway in Skinner’s resisting the mentalistic predispositions of his department and remaining a behaviorist), and most especially Pavlov:
The International Congress of Physiology met at the Harvard Medical School in August 1929, and Ivan Petrovich Pavlov gave the principal address!…I heard Pavlov’s address (in German) but did not try to shake his hand. I did get his autograph. A photographer was taking orders for a portrait and had asked Pavlov to write his name on a slip of paper so that his signature could appear on each print. I offered to buy a copy if I could have the slip of paper when the photographer was through with it, and he sent it to me. (Skinner, 1979, pp. 42-43)
While studying psychology at Harvard, Skinner seriously considered transferring to the physiology department (Skinner, 1979, pp. 25, 38), but his decision to stick with psychology was made firm by the availability of a machine shop in the department of psychology. He was able to use the shop as he pleased, and it was there that he built various apparatuses (e.g., a silent release door) for use in his experiments (p. 32). These experiments were not only the core of Skinner’s research but also his primary source of learning, since he claims to have “never learned how to read the ‘literature’ in psychology” (p. 34). What Skinner likely meant was that he took little interest in the contemporary literature of psychology. He certainly did read, though: the background research for the experiments that laid the foundation for his life’s work included a review of the experimental work on reflexes from the middle of the seventeenth century down through Magnus and Pavlov (p. 67). He also benefited from more practical books as he grew into his research through his experiments at Harvard:
If my rats were to get all their food in the apparatus, I could no longer go on using pearl barley. In a book on the breeding and care of the white rat I found a formula for a balanced diet: wheat, corn, flax seed, and bone meal, with a bit of salt, cooked in a double boiler. The mixture would have to be converted into pellets of uniform size, and I consulted a druggist, who showed me his pill machine. (Skinner, 1979, p. 59)
Through his experiments Skinner felt he had discovered a “new theory of conditioning” that was “different from Pavlov’s and much more like most learning in daily life” (p. 89). He differentiated his research from other studies of learning by focusing on the maintenance of behavior strength:
Up to that time the study of learning had been concerned almost exclusively with acquisition and forgetting, but I had stumbled onto the maintenance of behavior in strength. My rats acquired the response of pressing the lever with almost embarrassing speed. Thereafter I was looking at the conditions under which its strength was sustained. (p. 99)
In The Behavior of Organisms (1938), Skinner made a distinction between two types of behavior: respondent behavior, or involuntary reflex behavior elicited by a known stimulus, and operant behavior, or behavior that is simply emitted by an organism in response to a stimulus that is unknown to the observer. Operant behaviors appear to be spontaneous because the stimulus is not known, and, according to Skinner, it is “not important to know its cause” (Hergenhahn, 1982, p. 84). Operant behaviors include most of the things we do in our daily lives.
Skinner (1938) also distinguished between two types of conditioning: Type S and Type R. Through Type S conditioning—identical to Pavlov’s classical conditioning—a stimulus to be conditioned (e.g., an assistant wearing a lab coat) is repeatedly paired with an unconditioned stimulus (e.g., acid) until it comes to elicit the same response (e.g., salivation) that is made when the unconditioned stimulus is presented. In Type S conditioning, the strength of conditioning is usually determined by the magnitude of the response. Type R conditioning refers to the conditioning of operant behavior, in which responses (i.e., behaviors emitted in response to unknown stimuli) are reinforced. This type of conditioning is comparable to Thorndike’s law of effect: “If the occurrence of the operant is followed by presentation of a reinforcing stimulus, the strength is increased” (p. 21). In Type R conditioning, the strength of conditioning is usually measured by response rate. Skinner’s operant conditioning is based entirely on Type R conditioning.
Though Skinner did not believe that theories of learning are necessary, and made an argument for why this is so (1961a), his practice of operant conditioning in the experimental analysis of behavior was based on a clearly defined set of principles:
1. Positive reinforcement – a response that is followed by the presentation of a satisfying stimulus tends to be repeated.
2. Negative reinforcement – a response that is followed by the removal of an aversive stimulus tends to be repeated.
3. Punishment – a response that is followed by the presentation of an aversive stimulus becomes less frequent.
4. Reinforcement removal – a response that is followed by the removal of a satisfying stimulus (i.e., a reinforcer) becomes less frequent.
5. Discrimination – discriminations are learned when a behavior is reinforced in the presence of one stimulus but not another, or when a behavior is punished in the presence of one stimulus but not another.
6. Shaping – a new behavior can be learned through the reinforcement of successive approximations to the goal behavior:
The whole process of becoming competent in any field must be divided into a very large number of very small steps, and reinforcement must be contingent upon the accomplishment of each step. (Skinner, 1961g, p. 153)
7. Chaining – complex behavior can be established by linking together a series of simple behaviors already known to the learner, where the response of each link brings the learner into contact with discriminative stimuli that serve as cues for subsequent responses.
8. Priming – various methods, such as showing or telling, can be used to get a learner to behave in a given way for the first time so that the behavior can be reinforced.
9. Prompting – certain discriminative stimuli may be used to provide a guide to prompt behavior that is to be learned.
10. Vanishing (i.e., fading) – the concept of vanishing refers to the gradual fading out of discriminative stimuli initially used to prompt a behavior. Skinner (1986) provided a practical example:
My daughter Deborah once came home from school complaining that she had been assigned to learn 15 lines of Longfellow’s “Evangeline.” (“Those are very long lines,” she said.) I told her I would show her how she could learn them quite easily. I wrote the lines on a chalkboard and asked her to read them. Then I erased a few words and asked her to read them again. She did so correctly in spite of the omissions. I erased a few more words, and she could still “read” them. After five or six erasures, she “read” them although there was nothing on the chalkboard. At first, the words were primes. By reading them, she engaged in the required behavior – but not yet for the right reasons. The words I left on the chalkboard functioned as slowly vanishing prompts. We do something of the same sort when we learn a poem by ourselves. We prime our behavior by reading a line, and then we turn away from the text and say as much of the line as we can, looking back and prompting ourselves if necessary. By looking back less and less often, we slowly vanish the prompts. (p. 107)
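The first four principles above all describe how a consequence changes the future strength of a response. They can be sketched as a toy model; the function name, update rule, and step size below are illustrative assumptions for exposition, not anything drawn from Skinner’s own formulations:

```python
# Toy model of operant consequences. Each consequence type nudges the
# strength (here, a probability-like value in [0, 1]) of a response up
# or down. The step size of 0.1 is an arbitrary illustrative choice.

def update_strength(strength, consequence, step=0.1):
    """Return the new response strength after one consequence."""
    effects = {
        "positive_reinforcement": +step,  # satisfying stimulus presented
        "negative_reinforcement": +step,  # aversive stimulus removed
        "punishment": -step,              # aversive stimulus presented
        "reinforcement_removal": -step,   # satisfying stimulus removed
    }
    new = strength + effects[consequence]
    return max(0.0, min(1.0, new))  # clamp strength to [0, 1]

s = 0.5
s = update_strength(s, "positive_reinforcement")  # strength rises
s = update_strength(s, "punishment")              # strength falls back
```

Note that both kinds of reinforcement raise response strength, while punishment and reinforcement removal lower it; the labels “positive” and “negative” refer to presenting versus removing a stimulus, not to the direction of the effect.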
One of Skinner’s greatest and most distinctive contributions to behavioral learning theory is his research on schedules of reinforcement. Skinner first became interested in schedules of reinforcement when the magazines used to automatically deliver food pellets in response to a bar press jammed or otherwise failed to operate. Under these conditions rats would continue to press the bar even though food was not delivered with every bar press. Skinner took advantage of this as a way to reduce laboratory costs by using less food, and also to initiate a program of study of intermittent reinforcement schedules. Though only four schedules are well known (fixed ratio, fixed interval, variable ratio, and variable interval), Skinner also explored tandem schedules, differential reinforcement of rate, multiple schedules, mixed schedules, chained schedules, and concurrent schedules (Ferster & Skinner, 1957). His second major contribution was the practical implementation of behavioral principles of learning in the classroom using programmed instruction and teaching machines (Skinner, 1960; 1961i; 1961j; 1986). Skinner was not the first to conceive of a teaching machine, but his program of practical application and research paved the way for the modern era of computer-based instruction.
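The idea of a reinforcement schedule lends itself to a small sketch. The two toy functions below (their names, structure, and parameter values are my own illustration, not Skinner’s apparatus) contrast a fixed-ratio schedule, which reinforces every n-th response, with a variable-ratio schedule, which reinforces responses unpredictably at the same average rate:

```python
import random

# A reinforcement schedule decides, response by response, whether a
# reinforcer (e.g., a food pellet) is delivered. These are illustrative
# toy versions of two of the schedules studied by Ferster and Skinner
# (1957); the ratio values are arbitrary.

def fixed_ratio(n):
    """FR-n: reinforce every n-th response."""
    count = 0
    def respond():
        nonlocal count
        count += 1
        if count == n:
            count = 0
            return True
        return False
    return respond

def variable_ratio(n, rng=None):
    """VR-n: reinforce each response with probability 1/n (every n-th on average)."""
    rng = rng or random.Random()
    def respond():
        return rng.random() < 1.0 / n
    return respond

fr5 = fixed_ratio(5)
outcomes = [fr5() for _ in range(10)]  # the 5th and 10th responses are reinforced
```

The behavioral contrast the schedules produce follows from their structure: under the fixed ratio an animal can “predict” the reinforcer and pause after each one, whereas under the variable ratio any response may be reinforced, which is why variable-ratio schedules sustain high, steady response rates.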