
B. F. Skinner

Article snapshot taken from Wikipedia under the Creative Commons Attribution-ShareAlike license.
American psychologist and social philosopher (1904–1990)

B. F. Skinner
Skinner, c. 1950
Born: Burrhus Frederic Skinner, March 20, 1904, Susquehanna, Pennsylvania, U.S.
Died: August 18, 1990 (aged 86), Cambridge, Massachusetts, U.S.
Alma mater: Hamilton College (AB), Harvard University (PhD)
Known for: Behavior analysis, operant conditioning, radical behaviorism, Verbal Behavior (1957)
Spouse: Yvonne (Eve) Blue (m. 1936)
Children: Julie and Deborah
Awards: National Medal of Science (1968)
Scientific career
Fields: Psychology, linguistics, philosophy
Institutions: University of Minnesota, Indiana University, Harvard University

Burrhus Frederic Skinner (March 20, 1904 – August 18, 1990) was an American psychologist, behaviorist, inventor, and social philosopher. He was the Edgar Pierce Professor of Psychology at Harvard University from 1958 until his retirement in 1974.

Skinner developed behavior analysis, especially the philosophy of radical behaviorism, and founded the experimental analysis of behavior, a school of experimental research psychology. He also used operant conditioning to strengthen behavior, considering the rate of response to be the most effective measure of response strength. To study operant conditioning, he invented the operant conditioning chamber (also known as the Skinner box), and to measure rate he invented the cumulative recorder. Using these tools, he and Charles Ferster produced Skinner's most influential experimental work, outlined in their 1957 book Schedules of Reinforcement.

Skinner was a prolific author, publishing 21 books and 180 articles. He imagined the application of his ideas to the design of a human community in his 1948 utopian novel, Walden Two, while his analysis of human behavior culminated in his 1957 work, Verbal Behavior.

Skinner, John B. Watson, and Ivan Pavlov are considered the pioneers of modern behaviorism. Accordingly, a June 2002 survey listed Skinner as the most influential psychologist of the 20th century.

Biography

Skinner was born in Susquehanna, Pennsylvania, to Grace and William Skinner, the latter of whom was a lawyer. Skinner became an atheist after a Christian teacher tried to assuage his fear of the hell that his grandmother described. His brother Edward, two and a half years younger, died at age 16 of a cerebral hemorrhage.

Skinner's closest friend as a young boy was Raphael Miller, whom he called Doc because his father was a doctor. Doc and Skinner became friends through their parents' shared religiousness, and both had an interest in contraptions and gadgets. They set up a telegraph line between their houses to send messages to each other, although they often had to call each other on the telephone to sort out the garbled messages. One summer, Doc and Skinner started an elderberry business, gathering berries and selling them door to door. They found that when they picked the ripe berries, the unripe ones came off the branches too, so they built a device to separate them: a piece of metal bent to form a trough. They would pour water down the trough into a bucket; the ripe berries would sink into the bucket while the unripe ones were pushed over the edge to be thrown away.

Education

Skinner attended Hamilton College in Clinton, New York, with the intention of becoming a writer. He found himself at a social disadvantage at the college because of his intellectual attitude. He was a member of Lambda Chi Alpha fraternity.

He wrote for the school paper, but, as an atheist, he was critical of the traditional mores of his college. After receiving his Bachelor of Arts in English literature in 1926, he attended Harvard University, where he would later research and teach. While attending Harvard, a fellow student, Fred S. Keller, convinced Skinner that he could make an experimental science of the study of behavior. This led Skinner to invent a prototype for the Skinner box and to join Keller in the creation of other tools for small experiments.

After graduation, Skinner unsuccessfully tried to write a novel while he lived with his parents, a period that he later called the "Dark Years". He became disillusioned with his literary skills despite encouragement from the renowned poet Robert Frost, concluding that he had little world experience and no strong personal perspective from which to write. His encounter with John B. Watson's behaviorism led him into graduate study in psychology and to the development of his own version of behaviorism.

Later life

The gravestone of B. F. Skinner and his wife Eve at Mount Auburn Cemetery

Skinner received a PhD from Harvard in 1931, and remained there as a researcher for some years. In 1936, he went to the University of Minnesota in Minneapolis to teach. In 1945, he moved to Indiana University, where he was chair of the psychology department from 1946 to 1947, before returning to Harvard as a tenured professor in 1948. He remained at Harvard for the rest of his life. In 1973, Skinner was one of the signers of the Humanist Manifesto II.

In 1936, Skinner married Yvonne "Eve" Blue. The couple had two daughters, Julie (later Vargas) and Deborah (later Buzan; married Barry Buzan). Yvonne died in 1997, and is buried in Mount Auburn Cemetery, Cambridge, Massachusetts.

Skinner's public exposure increased in the 1970s, and he remained active even after his retirement in 1974, until his death. In 1989, Skinner was diagnosed with leukemia; he died on August 18, 1990, in Cambridge, Massachusetts. Ten days before his death, he was given a lifetime achievement award by the American Psychological Association and gave a talk concerning his work.

Contributions to psychology

Behaviorism

Main articles: Behaviorism and Radical behaviorism

Skinner referred to his approach to the study of behavior as radical behaviorism, which originated in the early 1900s as a reaction to depth psychology and other traditional forms of psychology, which often had difficulty making predictions that could be tested experimentally. This philosophy of behavioral science assumes that behavior is a consequence of environmental histories of reinforcement (see applied behavior analysis). In his words:

The position can be stated as follows: what is felt or introspectively observed is not some nonphysical world of consciousness, mind, or mental life but the observer's own body. This does not mean, as I shall show later, that introspection is a kind of physiological research, nor does it mean (and this is the heart of the argument) that what are felt or introspectively observed are the causes of the behavior. An organism behaves as it does because of its current structure, but most of this is out of reach of introspection. At the moment we must content ourselves, as the methodological behaviorist insists, with a person's genetic and environmental histories. What are introspectively observed are certain collateral products of those histories.... In this way we repair the major damage wrought by mentalism. When what a person does is attributed to what is going on inside him, investigation is brought to an end. Why explain the explanation? For twenty-five hundred years people have been preoccupied with feelings and mental life, but only recently has any interest been shown in a more precise analysis of the role of the environment. Ignorance of that role led in the first place to mental fictions, and it has been perpetuated by the explanatory practices to which they gave rise.

Foundations of Skinner's behaviorism

Skinner's ideas about behaviorism were largely set forth in his first book, The Behavior of Organisms (1938). Here, he gives a systematic description of the manner in which environmental variables control behavior. He distinguished two sorts of behavior which are controlled in different ways:

  • Respondent behaviors are elicited by stimuli, and may be modified through respondent conditioning, often called classical (or pavlovian) conditioning, in which a neutral stimulus is paired with an eliciting stimulus. Such behaviors may be measured by their latency or strength.
  • Operant behaviors are 'emitted', meaning that initially they are not induced by any particular stimulus. They are strengthened through operant conditioning (also known as instrumental conditioning), in which the occurrence of a response yields a reinforcer. Such behaviors may be measured by their rate.

Both of these sorts of behavior had already been studied experimentally, most notably: respondents, by Ivan Pavlov; and operants, by Edward Thorndike. Skinner's account differed in some ways from earlier ones, and was one of the first accounts to bring them under one roof.

The idea that behavior is strengthened or weakened by its consequences raises several questions. Among the most commonly asked are these:

  1. Operant responses are strengthened by reinforcement, but where do they come from in the first place?
  2. Once it is in the organism's repertoire, how is a response directed or controlled?
  3. How can very complex and seemingly novel behaviors be explained?

1. Origin of operant behavior

Skinner's answer to the first question was very much like Darwin's answer to the question of the origin of a 'new' bodily structure, namely, variation and selection. Similarly, the behavior of an individual varies from moment to moment; a variation that is followed by reinforcement is strengthened and becomes prominent in that individual's behavioral repertoire. Shaping was Skinner's term for the gradual modification of behavior by the reinforcement of desired variations. Skinner believed that 'superstitious' behavior can arise when a response happens to be followed by reinforcement to which it is actually unrelated.
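The selectionist logic of shaping can be illustrated with a toy simulation. This is a minimal sketch under assumed numbers: the target value, the amount of moment-to-moment variation, and the learning step are illustrative, not Skinner's parameters.

```python
import random

def shape(target=10.0, trials=200, seed=1):
    """Toy shaping-by-successive-approximation simulation.

    The 'organism' emits a response magnitude that varies around its current
    tendency; a variation is reinforced only if it comes closer to the target
    than anything reinforced before, and reinforced variations shift the
    tendency toward themselves, becoming prominent in the repertoire.
    """
    random.seed(seed)
    tendency = 0.0      # central value of emitted behavior
    best = 0.0          # closest approximation reinforced so far
    for _ in range(trials):
        response = tendency + random.gauss(0, 1.0)       # moment-to-moment variation
        if abs(target - response) < abs(target - best):  # a closer approximation...
            best = response                              # ...is reinforced,
            tendency += 0.5 * (response - tendency)      # and the repertoire shifts toward it
    return tendency

if __name__ == "__main__":
    print(f"final response tendency: {shape():.2f}")     # drifts toward the target of 10
```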

2. Control of operant behavior

The second question, "how is operant behavior controlled?" arises because, to begin with, the behavior is "emitted" without reference to any particular stimulus. Skinner answered this question by saying that a stimulus comes to control an operant if it is present when the response is reinforced and absent when it is not. For example, if lever-pressing only brings food when a light is on, a rat, or a child, will learn to press the lever only when the light is on. Skinner summarized this relationship by saying that a discriminative stimulus (e.g. light or sound) sets the occasion for the reinforcement (food) of the operant (lever-press). This three-term contingency (stimulus-response-reinforcer) is one of Skinner's most important concepts, and sets his theory apart from theories that use only pair-wise associations.
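The three-term contingency can be made concrete with a small simulation. The following is an illustrative abstraction rather than a model of real rat learning; the learning rate and trial count are arbitrary assumptions.

```python
import random

def simulate_discrimination(trials=2000, lr=0.05, seed=0):
    """Toy three-term contingency: discriminative stimulus -> response -> reinforcer.

    Pressing is reinforced only when the light is on; the probability of
    pressing in each stimulus condition is nudged up after reinforcement
    and down after a non-reinforced press (extinction).
    """
    random.seed(seed)
    p_press = {"light_on": 0.5, "light_off": 0.5}   # response tendencies per stimulus
    for _ in range(trials):
        stimulus = random.choice(["light_on", "light_off"])
        if random.random() < p_press[stimulus]:      # the operant is emitted
            reinforced = (stimulus == "light_on")    # food only when the light is on
            delta = lr if reinforced else -lr
            p_press[stimulus] = min(1.0, max(0.0, p_press[stimulus] + delta))
    return p_press

if __name__ == "__main__":
    print(simulate_discrimination())
    # pressing becomes likely when the light is on and rare when it is off
```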

3. Explaining complex behavior

Most behavior of humans cannot easily be described in terms of individual responses reinforced one by one, and Skinner devoted a great deal of effort to the problem of behavioral complexity. Some complex behavior can be seen as a sequence of relatively simple responses, and here Skinner invoked the idea of "chaining". Chaining is based on the fact, experimentally demonstrated, that a discriminative stimulus not only sets the occasion for subsequent behavior, but it can also reinforce a behavior that precedes it. That is, a discriminative stimulus is also a "conditioned reinforcer". For example, the light that sets the occasion for lever pressing may also be used to reinforce "turning around" in the presence of a noise. This results in the sequence "noise – turn-around – light – press lever – food." Much longer chains can be built by adding more stimuli and responses.
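The chain in the example above can be represented as an ordered list of discriminative-stimulus/response links, with each stimulus after the first doubling as a conditioned reinforcer for the response that produced it. A minimal sketch under that reading:

```python
# A behavior chain as an ordered sequence of (discriminative stimulus, response)
# links, ending in a primary reinforcer. Each stimulus after the first doubles
# as a conditioned reinforcer for the response that produced it.
CHAIN = [
    ("noise", "turn around"),
    ("light", "press lever"),
]
PRIMARY_REINFORCER = "food"

def run_chain(chain, terminal_reinforcer):
    """Report what sets the occasion for, and what reinforces, each response."""
    for i, (stimulus, response) in enumerate(chain):
        # The next link's stimulus (or the terminal reinforcer) follows the response.
        consequence = chain[i + 1][0] if i + 1 < len(chain) else terminal_reinforcer
        print(f"{stimulus} sets the occasion for '{response}', "
              f"which is reinforced by {consequence}")

if __name__ == "__main__":
    run_chain(CHAIN, PRIMARY_REINFORCER)
```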

However, Skinner recognized that a great deal of behavior, especially human behavior, cannot be accounted for by gradual shaping or the construction of response sequences. Complex behavior often appears suddenly in its final form, as when a person first finds his way to the elevator by following instructions given at the front desk. To account for such behavior, Skinner introduced the concept of rule-governed behavior. First, relatively simple behaviors come under the control of verbal stimuli: the child learns to "jump," "open the book," and so on. After a large number of responses come under such verbal control, a sequence of verbal stimuli can evoke an almost unlimited variety of complex responses.

Reinforcement

Main article: Reinforcement

Reinforcement, a key concept of behaviorism, is the primary process that shapes and controls behavior, and occurs in two ways: positive and negative. In The Behavior of Organisms (1938), Skinner defined negative reinforcement as synonymous with punishment, i.e. the presentation of an aversive stimulus. He subsequently revised this definition in Science and Human Behavior (1953).

In what has now become the standard set of definitions, positive reinforcement is the strengthening of behavior by the occurrence of some event (e.g., praise after some behavior is performed), whereas negative reinforcement is the strengthening of behavior by the removal or avoidance of some aversive event (e.g., opening and raising an umbrella over your head on a rainy day is reinforced by the cessation of rain falling on you).

Both types of reinforcement strengthen behavior, or increase the probability of a behavior reoccurring; the difference being in whether the reinforcing event is something applied (positive reinforcement) or something removed or avoided (negative reinforcement). Punishment can be the application of an aversive stimulus/event (positive punishment or punishment by contingent stimulation) or the removal of a desirable stimulus (negative punishment or punishment by contingent withdrawal). Though punishment is often used to suppress behavior, Skinner argued that this suppression is temporary and has a number of other, often unwanted, consequences. Extinction is the absence of a rewarding stimulus, which weakens behavior.
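These definitions amount to a two-by-two classification: whether a stimulus is presented or removed after a response, and whether the behavior is strengthened or weakened as a result. A small lookup-table sketch, with illustrative everyday examples:

```python
# The standard 2 x 2 taxonomy: whether a stimulus is presented or removed after
# a response, and whether the effect is to strengthen or weaken the behavior.
CONTINGENCIES = {
    ("stimulus presented", "behavior strengthened"): "positive reinforcement (e.g., praise after the behavior)",
    ("stimulus removed",   "behavior strengthened"): "negative reinforcement (e.g., the umbrella stops rain falling on you)",
    ("stimulus presented", "behavior weakened"):     "positive punishment (an aversive event is applied)",
    ("stimulus removed",   "behavior weakened"):     "negative punishment (a desirable stimulus is withdrawn)",
}

def classify(stimulus_change: str, effect_on_behavior: str) -> str:
    """Name the operant contingency for a given stimulus change and behavioral effect."""
    return CONTINGENCIES[(stimulus_change, effect_on_behavior)]

if __name__ == "__main__":
    print(classify("stimulus removed", "behavior strengthened"))
```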

Writing in 1981, Skinner pointed out that Darwinian natural selection is, like reinforced behavior, "selection by consequences". Though, as he said, natural selection has now "made its case," he regretted that essentially the same process, "reinforcement", was less widely accepted as underlying human behavior.

Schedules of reinforcement

Main article: Schedules of reinforcement

Skinner recognized that behavior is typically reinforced more than once, and, together with Charles Ferster, he carried out an extensive analysis of the various ways in which reinforcements could be arranged over time, which he called schedules of reinforcement.

The most notable schedules of reinforcement studied by Skinner were continuous, interval (fixed or variable), and ratio (fixed or variable). All are methods used in operant conditioning; a brief code sketch of these schedules follows the list below.

  • Continuous reinforcement (CRF): each time a specific action is performed the subject receives a reinforcement. This method is effective when teaching a new behavior because it quickly establishes an association between the target behavior and the reinforcer.
  • Interval schedule: based on the time intervals between reinforcements.
    • Fixed interval schedule (FI): A procedure in which reinforcements are presented at fixed time periods, provided that the appropriate response is made. This schedule yields a response rate that is low just after reinforcement and becomes rapid just before the next reinforcement is scheduled.
    • Variable interval schedule (VI): A procedure in which behavior is reinforced after scheduled but unpredictable time durations following the previous reinforcement. This schedule yields the most stable rate of responding, with the average frequency of reinforcement determining the frequency of response.
  • Ratio schedules: based on the ratio of responses to reinforcements.
    • Fixed ratio schedule (FR): A procedure in which reinforcement is delivered after a specific number of responses have been made.
    • Variable ratio schedule (VR): A procedure in which reinforcement comes after a number of responses that is randomized from one reinforcement to the next (e.g. slot machines). The lower the number of responses required, the higher the response rate tends to be. Variable ratio schedules tend to produce very rapid and steady responding rates in contrast with fixed ratio schedules where the frequency of response usually drops after the reinforcement occurs.
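The sketch below is illustrative, not a description of Skinner's apparatus: each schedule object decides whether a response made at time t earns a reinforcer, and continuous reinforcement is simply the fixed-ratio case with a requirement of one.

```python
import random

# Continuous reinforcement (CRF) is the fixed-ratio case with a requirement of 1.
class FixedRatio:
    """FR n: reinforce every nth response."""
    def __init__(self, n):
        self.n, self.count = n, 0
    def response(self, t):
        self.count += 1
        if self.count >= self.n:
            self.count = 0
            return True
        return False

class VariableRatio:
    """VR n: reinforce after a randomized number of responses averaging n."""
    def __init__(self, n):
        self.mean = n
        self._new_requirement()
    def _new_requirement(self):
        self.required = random.randint(1, 2 * self.mean - 1)
        self.count = 0
    def response(self, t):
        self.count += 1
        if self.count >= self.required:
            self._new_requirement()
            return True
        return False

class FixedInterval:
    """FI t: reinforce the first response after a fixed time has elapsed."""
    def __init__(self, interval):
        self.interval, self.available_at = interval, interval
    def response(self, t):
        if t >= self.available_at:
            self.available_at = t + self.interval
            return True
        return False

class VariableInterval:
    """VI t: reinforce the first response after an unpredictable time averaging t."""
    def __init__(self, interval):
        self.mean = interval
        self.available_at = random.uniform(0, 2 * interval)
    def response(self, t):
        if t >= self.available_at:
            self.available_at = t + random.uniform(0, 2 * self.mean)
            return True
        return False

if __name__ == "__main__":
    random.seed(0)
    schedule = VariableRatio(5)
    # One response per time step for 100 steps; under VR 5, roughly 20 reinforcers.
    reinforcers = sum(schedule.response(t) for t in range(100))
    print(f"reinforcers earned: {reinforcers}")
```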

Token economy

"Skinnerian" principles have been used to create token economies in a number of institutions, such as psychiatric hospitals. When participants behave in desirable ways, their behavior is reinforced with tokens that can be changed for such items as candy, cigarettes, coffee, or the exclusive use of a radio or television set.

Verbal Behavior

Main article: Verbal Behavior

Challenged by Alfred North Whitehead during a casual discussion while at Harvard to provide an account of a randomly provided piece of verbal behavior, Skinner set about attempting to extend his then-new functional, inductive approach to the complexity of human verbal behavior. Developed over two decades, his work appeared in the book Verbal Behavior. Although Noam Chomsky was highly critical of Verbal Behavior, he conceded that Skinner's "S-R psychology" was worth a review. Behavior analysts reject Chomsky's appraisal of Skinner's work as merely "stimulus-response psychology," and some have argued that this mischaracterization highlights a poor understanding of Skinner's work and the field of behavior analysis as a whole.

Verbal Behavior had an uncharacteristically cool reception, partly as a result of Chomsky's review and partly because of Skinner's failure to address or rebut any of Chomsky's criticisms. Skinner's peers may also have been slow to adopt the ideas presented in Verbal Behavior because it lacked the experimental evidence and empirical density that marked Skinner's other work.

Scientific inventions

Operant conditioning chamber

Main article: Operant conditioning chamber

An operant conditioning chamber (also known as a "Skinner box") is a laboratory apparatus used in the experimental analysis of animal behavior. It was invented by Skinner while he was a graduate student at Harvard University. As used by Skinner, the box had a lever (for rats), or a disk in one wall (for pigeons). A press on this "manipulandum" could deliver food to the animal through an opening in the wall, and responses reinforced in this way increased in frequency. By controlling this reinforcement together with discriminative stimuli such as lights and tones, or punishments such as electric shocks, experimenters have used the operant box to study a wide variety of topics, including schedules of reinforcement, discriminative control, delayed response ("memory"), punishment, and so on. By channeling research in these directions, the operant conditioning chamber has had a huge influence on the course of research in animal learning and its applications. It enabled great progress on problems that could be studied by measuring the rate, probability, or force of a simple, repeatable response. However, it discouraged the study of behavioral processes not easily conceptualized in such terms—spatial learning, in particular, which is now studied in quite different ways, for example, by the use of the water maze.

Cumulative recorder

The cumulative recorder makes a pen-and-ink record of simple repeated responses. Skinner designed it for use with the operant chamber as a convenient way to record and view the rate of responses such as a lever press or a key peck. In this device, a sheet of paper gradually unrolls over a cylinder. Each response steps a small pen across the paper, starting at one edge; when the pen reaches the other edge, it quickly resets to the initial side. The slope of the resulting ink line graphically displays the rate of the response; for example, rapid responses yield a steeply sloping line on the paper, slow responding yields a line of low slope. The cumulative recorder was a key tool used by Skinner in his analysis of behavior, and it was very widely adopted by other experimenters, gradually falling out of use with the advent of the laboratory computer and use of line graphs. Skinner's major experimental exploration of response rates, presented in his book with Charles Ferster, Schedules of Reinforcement, is full of cumulative records produced by this device.
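The record the device produces can be emulated numerically. This is a sketch of the arithmetic only (hypothetical response times, one-second ticks), not a description of the original hardware:

```python
def cumulative_record(response_times, session_length):
    """Return the cumulative number of responses at each whole-second tick.

    The slope of this record over any stretch of time is the response rate,
    which is what the pen-and-ink cumulative recorder displayed graphically.
    """
    counts, total, times = [], 0, sorted(response_times)
    for t in range(session_length + 1):
        while times and times[0] <= t:
            times.pop(0)
            total += 1
        counts.append(total)
    return counts

def local_rate(record, start, end):
    """Responses per second between two ticks (the slope of the record)."""
    return (record[end] - record[start]) / (end - start)

if __name__ == "__main__":
    presses = [1, 2, 3, 5, 8, 13, 21, 34, 55]   # hypothetical lever-press times (seconds)
    record = cumulative_record(presses, 60)
    print(f"rate over first 10 s: {local_rate(record, 0, 10):.2f} responses/s")   # steep slope
    print(f"rate over last 30 s:  {local_rate(record, 30, 60):.2f} responses/s")  # shallow slope
```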

Air crib

The air crib is an easily cleaned, temperature- and humidity-controlled box-bed intended to replace the standard infant crib. After raising one baby, Skinner felt that he could simplify the process for parents and improve the experience for children. He thought of the idea primarily to help his wife cope with the day-to-day tasks of child rearing. Skinner had some specific concerns about raising a baby in the rough climate of Minnesota, where he lived; keeping the child warm was a central priority (Faye, 2010). Though warmth was the main goal, the crib was also designed to reduce laundry, diaper rash, and cradle cap, while still allowing the baby to be more mobile and comfortable. Reportedly it had some success toward these goals: the crib was advertised commercially, and an estimated 300 children were raised in air cribs. Psychology Today tracked down 50 of these children and ran a short piece on the effects of the air crib; the reports were positive, with children and parents saying they had enjoyed using it (Epstein, 1995). One of these air cribs resides in the gallery at the Center for the History of Psychology in Akron, Ohio (Faye, 2010).

The air crib was designed with three solid walls and a safety-glass panel at the front which could be lowered to move the baby in and out of the crib. The floor was stretched canvas. Sheets were intended to be used over the canvas and were easily rolled off when soiled. Addressing Skinner's concern for temperature, a control box on top of the crib regulated temperature and humidity. Filtered air flowed through the crib from below. This crib was higher than most standard cribs, allowing easier access to the child without the need to bend over (Faye, 2010).

The air crib was a controversial invention. It was popularly characterized as a cruel pen, and it was often compared to Skinner's operant conditioning chamber (or "Skinner box"). Skinner's article in Ladies' Home Journal, titled "Baby in a Box", caught the eye of many and contributed to skepticism about the device (Bjork, 1997). A picture published with the article showed the Skinners' daughter, Deborah, peering out of the crib with her hands and face pressed upon the glass. Skinner also used the term "experiment" when describing the crib, and this association with laboratory animal experimentation discouraged the crib's commercial success, although several companies attempted to produce and sell it.

In 2004, therapist Lauren Slater repeated a claim that Skinner may have used his baby daughter in some of his experiments. His outraged daughter publicly accused Slater of not making a good-faith effort to check her facts before publishing. Writing in The Guardian, Deborah said: "According to Opening Skinner's Box: Great Psychological Experiments of the Twentieth Century, my father, who was a psychologist based at Harvard from the 1950s to the 90s, 'used his infant daughter, Deborah, to prove his theories by putting her for a few hours a day in a laboratory box . . . in which all her needs were controlled and shaped'. But it's not true. My father did nothing of the sort."

Teaching machine

The teaching machine, a mechanical invention to automate the task of programmed learning

The teaching machine was a mechanical device whose purpose was to administer a curriculum of programmed learning. The machine embodied key elements of Skinner's theory of learning and had important implications for education in general and classroom instruction in particular.

In one incarnation, the machine was a box that housed a list of questions that could be viewed one at a time through a small window (see picture). There was also a mechanism through which the learner could respond to each question. Upon delivering a correct answer, the learner would be rewarded.

Skinner advocated the use of teaching machines for a broad range of students (e.g., preschool aged to adult) and instructional purposes (e.g., reading and music). For example, one machine that he envisioned could teach rhythm. He wrote:

A relatively simple device supplies the necessary contingencies. The student taps a rhythmic pattern in unison with the device. "Unison" is specified very loosely at first (the student can be a little early or late at each tap) but the specifications are slowly sharpened. The process is repeated for various speeds and patterns. In another arrangement, the student echoes rhythmic patterns sounded by the machine, though not in unison, and again the specifications for an accurate reproduction are progressively sharpened. Rhythmic patterns can also be brought under the control of a printed score.

The instructional potential of the teaching machine stemmed from several factors: it provided automatic, immediate and regular reinforcement without the use of aversive control; the material presented was coherent, yet varied and novel; the pace of learning could be adjusted to suit the individual. As a result, students were interested, attentive, and learned efficiently by producing the desired behavior, "learning by doing."

Teaching machines, though perhaps rudimentary, were not rigid instruments of instruction. They could be adjusted and improved based upon the students' performance. For example, if a student made many incorrect responses, the machine could be reprogrammed to provide less advanced prompts or questions—the idea being that students acquire behaviors most efficiently if they make few errors. Multiple-choice formats were not well-suited for teaching machines because they tended to increase student mistakes, and the contingencies of reinforcement were relatively uncontrolled.

Not only useful in teaching explicit skills, machines could also promote the development of a repertoire of behaviors that Skinner called self-management. Effective self-management means attending to stimuli appropriate to a task, avoiding distractions, reducing the opportunity of reward for competing behaviors, and so on. For example, machines encourage students to pay attention before receiving a reward. Skinner contrasted this with the common classroom practice of initially capturing students' attention (e.g., with a lively video) and delivering a reward (e.g., entertainment) before the students have actually performed any relevant behavior. This practice fails to reinforce correct behavior and actually counters the development of self-management.

Skinner pioneered the use of teaching machines in the classroom, especially at the primary level. Today computers run software that performs similar teaching tasks, and there has been a resurgence of interest in the topic related to the development of adaptive learning systems.

Pigeon-guided missile

Main article: Project Pigeon

During World War II, the US Navy required a weapon effective against surface ships, such as the German Bismarck-class battleships. Although missile and TV technology existed, the size of the primitive guidance systems available rendered automatic guidance impractical. To solve this problem, Skinner initiated Project Pigeon, which was intended to provide a simple and effective guidance system. Skinner trained pigeons through operant conditioning to peck at incoming targets shown on a camera obscura screen (Schultz-Figueroa, 2019). The nose cone of the missile was divided into three compartments, with a pigeon placed in each. Within the nose cone, three lenses projected an image of distant objects onto a screen in front of each bird, so that when the missile was launched from an aircraft within sight of an enemy ship, an image of the ship would appear on the screen. Each screen was mounted on a hinged frame connected to the bomb's guidance system: four small rubber pneumatic tubes attached to the sides of the frame directed a constant airflow to a pneumatic pickup system that controlled the thrusters of the bomb, so that the pigeons' pecks alone guided the missile toward the targeted ship (Schultz-Figueroa, 2019).

Despite an effective demonstration, the project was abandoned, and eventually more conventional solutions, such as those based on radar, became available. Skinner complained that "our problem was no one would take us seriously." Before the project was completely abandoned, it was tested extensively in the laboratory. After the United States Army ultimately turned the project down, the United States Naval Research Laboratory picked up Skinner's research and renamed it Project ORCON, a contraction of "organic" and "control". Skinner worked closely with the Naval Research Laboratory, continuously testing the pigeons' capacity to track targets and guide missiles to them. In the end, the pigeons' performance and accuracy depended on so many uncontrollable factors that Project ORCON, like Project Pigeon before it, was discontinued. It was never used in the field.

Verbal summator

Early in his career Skinner became interested in "latent speech" and experimented with a device he called the verbal summator. This device can be thought of as an auditory version of the Rorschach inkblots. When using the device, human participants listened to incomprehensible auditory "garbage" but often read meaning into what they heard. Thus, as with the Rorschach blots, the device was intended to yield overt behavior that projected subconscious thoughts. Skinner's interest in projective testing was brief, but he later used observations with the summator in creating his theory of verbal behavior. The device also led other researchers to invent new tests such as the tautophone test, the auditory apperception test, and the Azzageddi test.

Influence on teaching

Along with psychology, education has also been influenced by Skinner's views, which are extensively presented in his book The Technology of Teaching, as well as reflected in Fred S. Keller's Personalized System of Instruction and Ogden R. Lindsley's Precision Teaching.

Skinner argued that education has two major purposes:

  1. to teach repertoires of both verbal and nonverbal behavior; and
  2. to interest students in learning.

He recommended bringing students' behavior under appropriate control by providing reinforcement only in the presence of stimuli relevant to the learning task. Because he believed that human behavior can be affected by small consequences, something as simple as "the opportunity to move forward after completing one stage of an activity" can be an effective reinforcer. Skinner was convinced that, to learn, a student must engage in behavior, and not just passively receive information.

Skinner believed that effective teaching must be based on positive reinforcement, which is, he argued, more effective at changing and establishing behavior than punishment. He suggested that the main thing people learn from being punished is how to avoid punishment. For example, if a child is forced to practice playing an instrument, the child comes to associate practicing with punishment and thus develops feelings of dread and wishes to avoid practicing the instrument. This view had obvious implications for the then widespread practice of rote learning and punitive discipline in education. The use of educational activities as punishment may induce rebellious behavior such as vandalism or absenteeism.

Because teachers are primarily responsible for modifying student behavior, Skinner argued that teachers must learn effective ways of teaching. In The Technology of Teaching (1968), Skinner has a chapter on why teachers fail: He says that teachers have not been given an in-depth understanding of teaching and learning. Without knowing the science underpinning teaching, teachers fall back on procedures that work poorly or not at all, such as:

  • using aversive techniques (which produce escape and avoidance and undesirable emotional effects);
  • relying on telling and explaining ("Unfortunately, a student does not learn simply when he is shown or told.");
  • failing to adapt learning tasks to the student's current level; and
  • failing to provide positive reinforcement frequently enough.

Skinner suggests that any age-appropriate skill can be taught; a brief code sketch of the procedure follows the list below. The steps are:

  1. Clearly specify the action or performance the student is to learn.
  2. Break down the task into small achievable steps, going from simple to complex.
  3. Let the student perform each step, reinforcing correct actions.
  4. Adjust so that the student is always successful until finally the goal is reached.
  5. Shift to intermittent reinforcement to maintain the student's performance.
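A toy rendering of this procedure in code, under assumed numbers: the starting success probability, mastery criterion, and hypothetical task steps are illustrative, not from Skinner.

```python
import random

def teach(task_steps, mastery=3, seed=0):
    """Toy version of the programmed-instruction steps listed above.

    Each small step is practiced until the student responds correctly several
    times in a row (continuous reinforcement while acquiring), then the program
    moves on; once every step is mastered, reinforcement is thinned to an
    intermittent schedule to maintain performance.
    """
    random.seed(seed)
    skill = {step: 0.3 for step in task_steps}        # probability of a correct response
    for step in task_steps:                           # simple before complex
        streak = 0
        while streak < mastery:
            correct = random.random() < skill[step]
            if correct:
                skill[step] = min(1.0, skill[step] + 0.2)   # reinforce the correct action
                streak += 1
            else:
                streak = 0                            # adjust: stay on this step until successful
        print(f"mastered: {step}")
    print("shifting to intermittent reinforcement to maintain performance")

if __name__ == "__main__":
    teach(["hold the pencil", "copy a letter", "write a word"])   # hypothetical task analysis
```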

Contributions to social theory

Skinner is popularly known mainly for his books Walden Two (1948) and Beyond Freedom and Dignity (1971), for which he made the cover of Time magazine. The former describes a fictional "experimental community" in 1940s United States. The productivity and happiness of citizens in this community are far greater than in the outside world because the residents practice scientific social planning and use operant conditioning in raising their children.

Walden Two, like Thoreau's Walden, champions a lifestyle that does not support war, or foster competition and social strife. It encourages a lifestyle of minimal consumption, rich social relationships, personal happiness, satisfying work, and leisure. In 1967, Kat Kinkade and others founded the Twin Oaks Community, using Walden Two as a blueprint. The community still exists and continues to use the Planner-Manager system and other aspects of the community described in Skinner's book, though behavior modification is not a community practice.

In Beyond Freedom and Dignity, Skinner suggests that a technology of behavior could help to make a better society. We would, however, have to accept that an autonomous agent is not the driving force of our actions. Skinner offers alternatives to punishment, and challenges his readers to use science and modern technology to construct a better society.

Political views

Skinner's political writings emphasized his hopes that an effective and human science of behavioral control – a technology of human behavior – could help with problems as yet unsolved and often aggravated by advances in technology such as the atomic bomb. Indeed, one of Skinner's goals was to prevent humanity from destroying itself. He saw political activity as the use of aversive or non-aversive means to control a population. Skinner favored the use of positive reinforcement as a means of control, citing Jean-Jacques Rousseau's novel Emile: or, On Education as an example of literature that "did not fear the power of positive reinforcement."

Skinner's book, Walden Two, presents a vision of a decentralized, localized society, which applies a practical, scientific approach and behavioral expertise to deal peacefully with social problems. (For example, his views led him to oppose corporal punishment in schools, and he wrote a letter to the California Senate that helped lead it to a ban on spanking.) Skinner's utopia is both a thought experiment and a rhetorical piece. In Walden Two, Skinner answers the problem that exists in many utopian novels – "What is the Good Life?" The book's answer is a life of friendship, health, art, a healthy balance between work and leisure, a minimum of unpleasantness, and a feeling that one has made worthwhile contributions to a society in which resources are ensured, in part, by minimizing consumption.

If the world is to save any part of its resources for the future, it must reduce not only consumption but the number of consumers.

— B. F. Skinner, Walden Two (1948), p. xi

Skinner described his novel as "my New Atlantis", in reference to Bacon's utopia.

When Milton's Satan falls from heaven, he ends in hell. And what does he say to reassure himself? 'Here, at least, we shall be free.' And that, I think, is the fate of the old-fashioned liberal. He's going to be free, but he's going to find himself in hell.

— B. F. Skinner, from William F. Buckley Jr, On the Firing Line, p. 87.

"'Superstition' in the Pigeon" experiment

One of Skinner's experiments examined the formation of superstition in one of his favorite experimental animals, the pigeon. Skinner placed a series of hungry pigeons in a cage attached to an automatic mechanism that delivered food to the pigeon "at regular intervals with no reference whatsoever to the bird's behavior." He discovered that the pigeons associated the delivery of the food with whatever chance actions they had been performing as it was delivered, and that they subsequently continued to perform these same actions.

One bird was conditioned to turn counter-clockwise about the cage, making two or three turns between reinforcements. Another repeatedly thrust its head into one of the upper corners of the cage. A third developed a 'tossing' response, as if placing its head beneath an invisible bar and lifting it repeatedly. Two birds developed a pendulum motion of the head and body, in which the head was extended forward and swung from right to left with a sharp movement followed by a somewhat slower return.

Skinner suggested that the pigeons behaved as if they were influencing the automatic mechanism with their "rituals", and that this experiment shed light on human behavior:

The experiment might be said to demonstrate a sort of superstition. The bird behaves as if there were a causal relation between its behavior and the presentation of food, although such a relation is lacking. There are many analogies in human behavior. Rituals for changing one's fortune at cards are good examples. A few accidental connections between a ritual and favorable consequences suffice to set up and maintain the behavior in spite of many unreinforced instances. The bowler who has released a ball down the alley but continues to behave as if she were controlling it by twisting and turning her arm and shoulder is another case in point. These behaviors have, of course, no real effect upon one's luck or upon a ball half way down an alley, just as in the present case the food would appear as often if the pigeon did nothing—or, more strictly speaking, did something else.
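Skinner's adventitious-reinforcement account (disputed later, as described below) can be caricatured in a few lines: food arrives on a fixed-time clock with no reference to behavior, yet whatever response happens to precede a delivery is strengthened, so one arbitrary "ritual" comes to dominate. The response names, delivery interval, and learning rule are illustrative assumptions.

```python
import random

def superstition(responses=("turn", "head thrust", "toss", "peck floor"),
                 steps=3000, food_every=15, lr=0.3, seed=2):
    """Toy simulation of adventitious reinforcement: food is delivered on a
    fixed-time clock with no reference to behavior, but whatever response
    happened just before a delivery is strengthened anyway."""
    random.seed(seed)
    weights = {r: 1.0 for r in responses}             # strength of each response
    last_response = None
    for t in range(1, steps + 1):
        total = sum(weights.values())
        pick = random.uniform(0, total)               # emit a response in proportion to its strength
        for r, w in weights.items():
            pick -= w
            if pick <= 0:
                last_response = r
                break
        if t % food_every == 0:                       # food arrives regardless of behavior...
            weights[last_response] *= (1 + lr)        # ...but the preceding response is strengthened
    return max(weights, key=weights.get)

if __name__ == "__main__":
    print("dominant 'ritual':", superstition())       # an arbitrary response comes to dominate
```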

Modern behavioral psychologists have disputed Skinner's "superstition" explanation for the behaviors he recorded. Subsequent research (e.g. Staddon and Simmelhag, 1971), while finding similar behavior, failed to find support for Skinner's "adventitious reinforcement" explanation for it. By looking at the timing of different behaviors within the interval, Staddon and Simmelhag were able to distinguish two classes of behavior: the terminal response, which occurred in anticipation of food, and interim responses, that occurred earlier in the interfood interval and were rarely contiguous with food. Terminal responses seem to reflect classical (as opposed to operant) conditioning, rather than adventitious reinforcement, guided by a process like that observed in 1968 by Brown and Jenkins in their "autoshaping" procedures. The causation of interim activities (such as the schedule-induced polydipsia seen in a similar situation with rats) also cannot be traced to adventitious reinforcement and its details are still obscure (Staddon, 1977).

Criticism

Noam Chomsky

American linguist Noam Chomsky published a review of Skinner's Verbal Behavior in the linguistics journal Language in 1959. Chomsky argued that Skinner's attempt to use behaviorism to explain human language amounted to little more than word games. Conditioned responses could not account for a child's ability to create or understand an infinite variety of novel sentences. Chomsky's review has been credited with launching the cognitive revolution in psychology and other disciplines. Skinner, who rarely responded directly to critics, never formally replied to Chomsky's critique, but endorsed Kenneth MacCorquodale's 1970 reply.

I read half a dozen pages, saw that it missed the point of my book, and went no further. My reasons, I am afraid, show a lack of character. In the first place I should have had to read the review, and I found its tone distasteful. It was not really a review of my book but of what Chomsky took, erroneously, to be my position.

Many academics in the 1960s believed that Skinner's silence on the question meant Chomsky's criticism had been justified. But MacCorquodale wrote that Chomsky's criticism did not focus on Skinner's Verbal Behavior, but rather attacked a confusion of ideas from behavioral psychology. MacCorquodale also regretted Chomsky's aggressive tone. Furthermore, Chomsky had aimed at delivering a definitive refutation of Skinner by citing dozens of animal instinct and animal learning studies. On the one hand, he argued that the studies on animal instinct proved that animal behavior is innate, and therefore Skinner was mistaken. On the other, Chomsky's opinion of the studies on learning was that one cannot draw an analogy from animal studies to human behavior—or, that research on animal instinct refutes research on animal learning.

Chomsky also reviewed Skinner's Beyond Freedom and Dignity, using the same basic motives as his Verbal Behavior review. Among Chomsky's criticisms were that Skinner's laboratory work could not be extended to humans, that when it was extended to humans it represented "scientistic" behavior attempting to emulate science but which was not scientific, that Skinner was not a scientist because he rejected the hypothetico-deductive model of theory testing, and that Skinner had no science of behavior.

Psychodynamic psychology

Skinner has been repeatedly criticized for his supposed animosity towards Sigmund Freud, psychoanalysis, and psychodynamic psychology. Some have argued, however, that Skinner shared several of Freud's assumptions, and that he was influenced by Freudian points of view in more than one field, among them the analysis of defense mechanisms, such as repression. To study such phenomena, Skinner even designed his own projective test, the "verbal summator" described above.

J. E. R. Staddon

As understood by Skinner, ascribing dignity to individuals involves giving them credit for their actions. To say "Skinner is brilliant" means that Skinner is an originating force. If Skinner's determinist theory is right, however, he is merely the focus of his environment: he is not an originating force, and he had no choice in saying the things he said or doing the things he did. On this view, Skinner's environment and genetics both allowed and compelled him to write his book; similarly, the environment and genetic potentials of the advocates of freedom and dignity cause them to resist the reality that their own activities are deterministically grounded. J. E. R. Staddon has argued the compatibilist position: Skinner's determinism is not in any way contradictory to traditional notions of reward and punishment, as Skinner believed it was.

Professional career

Roles

Awards

Honorary degrees

Skinner received honorary degrees from:

Honorary societies

Skinner was inducted into the following honorary societies:

Bibliography

See also

References

Notes

  1. A free copy of this book (in a 1.6 MB .pdf file) may be downloaded at the B. F. Skinner Foundation web site BFSkinner.org.

Citations

  1. Sobel, Dava (August 20, 1990). "B. F. Skinner, the Champion Of Behaviorism, Is Dead at 86". The New York Times. Archived from the original on August 6, 2010. Retrieved August 30, 2015.
  2. Smith, L. D.; Woodward, W. R. (1996). B. F. Skinner and Behaviorism in American Culture. Bethlehem, Pennsylvania: Lehigh University Press. ISBN 978-0-934223-40-9.
  3. ^ Skinner, B. F. (1948). Walden Two. New York: Macmillan Publishers. ISBN 0-87220-779-X.
  4. Skinner, B. F. (1972). Beyond Freedom and Dignity. Vintage Books. ISBN 978-0-553-14372-0. OCLC 34263003.
  5. "Skinner, Burrhus Frederic". History of Behavior Analysis. Retrieved July 29, 2021.
  6. Swenson, Christa (May 1999). "Burrhus Frederick Skinner". History of Psychology Archives. Archived from the original on April 4, 2007.
  7. Skinner, B. F. (1974). About Behaviorism. Random House. ISBN 0-394-71618-3.
  8. ^ Schacter, Daniel L.; Gilbert, Daniel T.; Wegner, Daniel M. (2011). Psychology (2nd ed.). New York: Worth Publishers. p. 17. ISBN 978-1-4292-3719-2.
  9. ^ Skinner, B. F. (1938). The Behavior of Organisms. New York: Appleton-Century-Crofts. ISBN 1-58390-007-1.
  10. ^ Ferster, Charles B.; Skinner, B. F. (1957). Schedules of Reinforcement. New York: Appleton-Century-Crofts. ISBN 0-13-792309-0.
  11. Smith, Nathaniel G.; Morris, Edward K. (2021). "Full Bibliography". B. F. Skinner Foundation. Retrieved July 29, 2021. Also available as a PDF.
  12. Skinner, B. F. (1958). Verbal Behavior. Acton, Massachusetts: Copley Publishing Group. ISBN 1-58390-021-7.
  13. Haggbloom, Steven J.; Warnick, Renee; Warnick, Jason E.; Jones, Vinessa K.; et al. (June 1, 2002). "The 100 most eminent psychologists of the 20th century". Review of General Psychology. 6 (2): 139–52. CiteSeerX 10.1.1.586.1913. doi:10.1037/1089-2680.6.2.139. S2CID 145668721.
  14. Skinner, B. F. (1967). "B. F. Skinner". In Boring, E. G.; Lindzey, G. (eds.). A History of Psychology in Autobiography. Vol. 5. New York: Appleton-Century-Crofts. pp. 387–413. doi:10.1037/11579-014. Within a year I had gone to Miss Graves to tell her that I no longer believed in God. 'I know,' she said, 'I have been through that myself.' But her strategy misfired: I never went through it.
  15. Mahoney, Michael J. (October 1991). "B. F. Skinner: A Collective Tribute". Canadian Psychology. 32 (4): 628–635. doi:10.1037/h0084641.
  16. ^ Skinner, B. F (1976). Particulars of My Life (1st ed.). New York: Knopf. ISBN 978-0-394-40071-6.
  17. ^ Bjork, Daniel W. (2013). B. F. Skinner: A Life. American Psychological Association. ISBN 978-1-55798-416-6.
  18. "Establishment History". University of Minnesota. Retrieved December 16, 2020.
  19. Vargas, Julie (February 6, 2014). "Biographical Information". B. F. Skinner Foundation. Retrieved December 16, 2020.
  20. "Humanist Manifesto II". American Humanist Association. Archived from the original on October 20, 2012. Retrieved October 9, 2012.
  21. Skinner, Deborah. "About Skinner". Horses by Skinner. Archived from the original on May 30, 2015. Retrieved September 4, 2014.
  22. Buzan, Deborah Skinner (March 12, 2004). "I was not a lab rat". The Guardian. Retrieved September 4, 2014.
  23. "Skinner, Yvonne, 1911–1997. Papers of Yvonne Skinner, ca.1916–1977: A Finding Aid". Harvard University Library. Archived from the original on July 3, 2018. Retrieved July 30, 2021.
  24. The Famous People. (2017). B. F. Skinner biography
  25. ^ Skinner, B. F. 1974. "Causes of Behavior." Pp. 16–18 in About Behaviorism. ISBN 0-394-71618-3. section 3, "Radical Behaviorism." https://archive.org/stream/aboutbehaviorism00skin#page/16/mode/2up
  26. Pavlov, Ivan (1927). Conditioned Reflexes. Oxford: Oxford University Press.
  27. Thorndike, Edward L. (1911). Animal Intelligence: Experimental Studies. New York: Macmillan.
  28. ^ Jenkins, H. M. 1979. "Animal Learning & Behavior." Ch. 5 in The First Century of Experimental Psychology, edited by E. Hearst. Hillsdale, NJ: Erlbaum.
  29. ^ Skinner, B. F. 1966. Contingencies of Reinforcement. New York: Appleton-Century-Crofts.
  30. Skinner, B. F. 1953. Science and Human Behavior. New York: Macmillan.
  31. Skinner, B. F. (1981). "Selection by Consequences" (PDF). Science. 213 (4507): 501–04. Bibcode:1981Sci...213..501S. doi:10.1126/science.7244649. PMID 7244649. Archived from the original (PDF) on July 2, 2010. Retrieved August 14, 2010.
  32. "Different Types of Reinforcement Schedules" (PDF). autismpdc.fpg.unc.edu. National Professional Development Center for Autism Spectrum Disorders. Archived (PDF) from the original on October 9, 2022. Retrieved February 14, 2015.
  33. Hergenhahn, B. R. (2009). An Introduction to the History of Psychology. United States: Wadsworth Cengage Learning. p. 449. ISBN 978-0-495-50621-8.
  34. Skinner, B. F. (1957). Verbal Behavior. The account in the appendix is that Whitehead asked Skinner to explain why he said "No black scorpion is falling upon this table."
  35. "Skinner, Burrhus Frederick(1904–1990)". Credo Reference, Gale. Retrieved October 1, 2013.
  36. ^ Chomsky, Noam (1967). "A Review of B. F. Skinner's Verbal Behavior" (PDF). In Jakobovits, L. A.; Miron, M. S. (eds.). Readings in the Psychology of Language. Prentice-Hall. pp. 48–63. Archived (PDF) from the original on October 9, 2022. Retrieved July 29, 2021.
  37. Palmer, David C. (October 2006). "On Chomsky's appraisal of Skinner's Verbal Behavior: A half century of misunderstanding". The Behavior Analyst. 29 (2): 253–267. doi:10.1007/BF03392134. PMC 2223153.
  38. Richelle, M. 1993. B. F. Skinner: A Reappraisal. Hillsdale: Lawrence Erlbaum Associates.
  39. Michael, J. (1984). "Verbal Behavior". Journal of the Experimental Analysis of Behavior. 42 (3): 363–376. doi:10.1901/jeab.1984.42-363. PMC 1348108. PMID 16812395.
  40. Kubina, Richard M.; Kostewicz, Douglas E.; Brennan, Kaitlyn M.; King, Seth A. (September 2017). "A Critical Review of Line Graphs in Behavior Analytic Journals". Educational Psychology Review. 29 (3): 583–598. doi:10.1007/s10648-015-9339-x. ISSN 1040-726X. S2CID 142317036.
  41. ^ Joyce, Nick & Faye, Cathy (September 1, 2010). "Skinner Air Crib". Aps Observer. 23.
  42. Epstein, Robert (November 1, 1995). "Babies in Boxes". Psychology Today.
  43. Bjork. "B. F. Skinner: A life". Washington, DC: American Psychological Association.
  44. Buzan, Deborah Skinner (March 12, 2004). "I was not a lab rat". the Guardian. Retrieved January 21, 2023.
  45. ^ Skinner, B. F. (1961). "Why we need teaching machines". Harvard Educational Review. 31: 377–398.
  46. "Programmed Instruction and Task Analysis". College of Education, University of Houston. Archived from the original on June 1, 2019. Retrieved September 24, 2012.
  47. Skinner, B.F. (1961). "Teaching machines". Scientific American. 205 (3): 90–112. doi:10.2307/1926170. JSTOR 1926170. PMID 13913636.
  48. Skinner, B. F., and J. Holland. 1961. The Analysis of Behavior: A Program for Self Instruction. p. 387.
  49. "Rebirth of the Teaching Machine through the Seduction of Data Analytics: This Time It's Personal". Philip McRae, Ph.D. April 14, 2013.
  50. ^ Schultz-Figueroa. "Project Pigeon: Rendering the War Animal through Optical Technology". JCMS: Journal of Cinema and Media Studies.
  51. ^ Skinner, B. F. (1936). "The Verbal Summator and a Method for the Study of Latent Speech". Journal of Psychology. 2 (1): 71–107. doi:10.1080/00223980.1936.9917445. hdl:11858/00-001M-0000-002D-7E05-E. S2CID 144303708.
  52. Rutherford, A. 2003. "B. F. Skinner and the auditory inkblot: The rise and fall of the verbal summator as a projective technique." History of Psychology 4:362–78.
  53. Holland, J. 1992. "B. F Skinner." American Psychologist.
  54. ^ Skinner, B. F. 1968. The Technology of Teaching. New York: Appleton-Century-Crofts. LCCN 68-12340.
  55. "B.F. Skinner Sep. 20, 1971". Time. Archived from the original on September 30, 2007.
  56. Skinner, B. F. 1968. "The Design of Experimental Communities." Pp. 271–75 in International Encyclopedia of the Social Sciences 16, edited by S. Darity. New York.
  57. Ramsey, Richard David. 1979. "Morning Star: The Values-Communication of Skinner's 'Walden Two'" (Ph.D. dissertation). Troy, NY: Rensselaer Polytechnic Institute. – via University Microfilms, Ann Arbor, MI. (Ramsey attempts to analyze Walden Two, Beyond Freedom and Dignity, and other Skinner works in the context of Skinner's life; lists over 500 sources.)
  58. Kuhlman, Hilke (October 1, 2010). Living Walden Two: B. F. Skinner's Behaviorist Utopia and Experimental Communities. University of Illinois Press. p. 87.
  59. see Beyond Freedom and Dignity, 1974 for example
  60. Asimov, Nanette (January 30, 1996). "Spanking Debate Hits Assembly". SFGate. San Francisco Chronicle. Retrieved March 2, 2008.
  61. A matter of Consequences, p. 412.
  62. ^ Skinner, B. F. (1948). "'Superstition' in the Pigeon". Journal of Experimental Psychology. 38 (2): 168–172. doi:10.1037/h0055873. PMID 18913665. S2CID 22577459.
  63. Timberlake, W; Lucas, G A (November 1, 1985). "The basis of superstitious behavior: chance contingency, stimulus substitution, or appetitive behavior?". J Exp Anal Behav. 44 (3): 279–299. doi:10.1901/jeab.1985.44-279. PMC 1348192. PMID 4086972.
  64. ^ Chomsky, Noam (1959). "Reviews: Verbal Behavior by B. F. Skinner". Language. 35 (1): 26–58. doi:10.2307/411334. JSTOR 411334. Archived from the original on September 29, 2015. Retrieved May 20, 2007.
  65. ^ MacCorquodale, Kenneth (January 1, 1970). "On Chomsky's review of Skinner's Verbal Behavior". Journal of the Experimental Analysis of Behavior. 13 (1): 83–99. doi:10.1901/jeab.1970.13-83. ISSN 1938-3711. PMC 1333660.
  66. Skinner, B. F. (1972). "A Lecture on 'Having' a Poem". In Skinner, B. F. (ed.). Cumulative Record (PDF) (3rd ed.). Appleton-Century-Crofts. pp. 345–355. ISBN 978-0-9899839-9-0. Archived from the original (PDF) on August 7, 2021. Retrieved August 7, 2021.
  67. Palmer, David C. (2006). "On Chomsky's appraisal of Skinner's Verbal Behavior: a half century of misunderstanding". The Behavior Analyst. 29 (2): 253–267. doi:10.1007/BF03392134. PMC 2223153. PMID 22478467.
  68. Chomsky, Noam (1971). "The Case Against B. F. Skinner". New York Review of Books.
  69. Toates, F. (2009). Burrhus F. Skinner: The shaping of behavior. Houndmills, Basingstoke, England: Palgrave Macmillan.
  70. Overskeid, Geir (September 2007). "Looking for Skinner and Finding Freud". American Psychologist. 62 (6): 590–595. CiteSeerX 10.1.1.321.6288. doi:10.1037/0003-066x.62.6.590. PMID 17874899. S2CID 4610708.
  71. Rutherford, A. (2003). "B. F. Skinner and the auditory inkblot: The rise and fall of the verbal summator as a projective technique". History of Psychology. 6 (4): 362–378. doi:10.1037/1093-4510.6.4.362. PMID 14735913.
  72. Staddon, J. E. R. 2014. The New Behaviorism (2nd ed.).
  73. Staddon, J. E. R. 1995. "On Responsibility and Punishment." The Atlantic Monthly 1995(2):88–94.
  74. Staddon, J. E. R. 1999. "On Responsibility in Science and Law." Social Philosophy and Policy 16:146–74. Reprinted as pp. 146–74 in Responsibility, edited by E. F. Paul, F. D. Miller, and J. Paul. Cambridge: Cambridge University Press.
  75. "The Pantheon of Skeptics". CSI. Committee for Skeptical Inquiry. Archived from the original on January 31, 2017. Retrieved April 30, 2017.
  76. "The winners of the 2024 Ig Nobel awards". The Tartan. Retrieved October 24, 2024.
  77. "APS Member History". search.amphilsoc.org. Retrieved February 27, 2023.
  78. "Burrhus Frederic Skinner". American Academy of Arts & Sciences. February 9, 2023. Retrieved February 27, 2023.
  79. "B. F. Skinner". www.nasonline.org. Retrieved February 27, 2023.

Further reading

  • Chiesa, M. (2004). Radical Behaviorism: The Philosophy and the Science.
  • Epstein, Robert (1997). "Skinner as self-manager." Journal of Applied Behavior Analysis 30:545–69. Retrieved 2 June 2005 – via ENVMED.rochester.edu
  • Pauly, Philip Joseph (1987). Controlling Life: Jacques Loeb and the Engineering Ideal in Biology. Oxford, UK: Oxford University Press. ISBN 978-0-19-504244-3. Retrieved August 14, 2010.
  • Sundberg, M. L. (2008) The VB-MAPP: The Verbal Behavior Milestones Assessment and Placement Program
  • Basil-Curzon, L. (2004) Teaching in Further Education: An Outline of Principles and Practice
  • Hardin, C.J. (2004) Effective Classroom Management
  • Kaufhold, J. A. (2002) The Psychology of Learning and the Art of Teaching
  • Bjork, D. W. (1993) B. F. Skinner: A Life
  • Dews, P. B., ed. (1970) Festschrift For B. F. Skinner. New York: Appleton-Century-Crofts.
  • Evans, R. I. (1968) B. F. Skinner: the man and his ideas
  • Nye, Robert D. (1979) What Is B. F. Skinner Really Saying? Englewood Cliffs, NJ: Prentice-Hall.
  • Rutherford, A. (2009) Beyond the box: B. F. Skinner's technology of behavior from laboratory to life, 1950s–1970s. Toronto: University of Toronto Press.
  • Sagal, P. T. (1981) Skinner's Philosophy. Washington, DC: University Press of America.
  • Smith, D. L. (2002). On Prediction and Control: B. F. Skinner and the Technological Ideal of Science. In W. E. Pickren & D. A. Dewsbury (Eds.), Evolving Perspectives on the History of Psychology. Washington, D.C.: American Psychological Association.
  • Swirski, Peter (2011) "How I Stopped Worrying and Loved Behavioural Engineering or Communal Life, Adaptations, and B.F. Skinner's Walden Two". American Utopia and Social Engineering in Literature, Social Thought, and Political History. New York: Routledge.
  • Wiener, D. N. (1996) B. F. Skinner: benign anarchist
  • Wolfgang, C. H. and Glickman, Carl D. (1986) Solving Discipline Problems. Allyn and Bacon, Inc.
