The experimental analysis of behavior is a science that studies the behavior of individuals across a variety of species. A key early scientist was B. F. Skinner, who discovered operant behavior, reinforcers, secondary reinforcers, contingencies of reinforcement, stimulus control, shaping, intermittent schedules, discrimination, and generalization. A central method was the examination of functional relations between environment and behavior,[1] as opposed to the hypothetico-deductive learning theory[2] that had grown up in the comparative psychology of the 1920–1950 period. Skinner's approach was characterized by observation of measurable behavior that could be predicted and controlled. It owed its early success to the effectiveness of Skinner's procedures of operant conditioning, both in the laboratory and in behavior therapy.
See main article: Classical conditioning. In classical or respondent conditioning, a neutral stimulus (the conditioned stimulus) is delivered just before a reflex-eliciting stimulus (the unconditioned stimulus) such as food or pain. This is typically done by pairing the two stimuli, as in Pavlov's experiments with dogs, where a bell was followed by food delivery. After repeated pairings, the conditioned stimulus comes to elicit the response.[3]
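The gradual acquisition produced by repeated pairings was later formalized in models such as the Rescorla–Wagner rule, which is not part of Pavlov's original work but offers a simple sketch of how associative strength grows toward an asymptote with each pairing:

```python
def rescorla_wagner(pairings, alpha=0.3, lam=1.0):
    """Rescorla-Wagner update: on each CS-US pairing, associative
    strength V grows toward the asymptote lambda in proportion to
    the prediction error (lam - V).  Parameter values here are
    arbitrary choices for illustration."""
    v = 0.0
    history = []
    for _ in range(pairings):
        v += alpha * (lam - v)  # prediction error drives learning
        history.append(v)
    return history

# Associative strength rises quickly at first, then levels off,
# matching the negatively accelerated acquisition curves seen in
# conditioning experiments.
strengths = rescorla_wagner(10)
```

The learning-rate parameter `alpha` and asymptote `lam` are free parameters; the characteristic shape of the curve is what matters here.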
See main article: Operant conditioning. Operant conditioning (also "instrumental conditioning") is a learning process in which behavior is sensitive to, or controlled by, its consequences. Specifically, behavior followed by certain consequences becomes more frequent (positive reinforcement), behavior followed by other consequences becomes less frequent (punishment), and behavior that removes or avoids an aversive consequence becomes more frequent (negative reinforcement). For example, in a food-deprived subject, when lever-pressing is followed by food delivery, lever-pressing increases in frequency (positive reinforcement). Likewise, when stepping off a treadmill is followed by delivery of electric shock, stepping off the treadmill becomes less frequent (punishment). And when lever-pressing prevents or terminates shock, lever-pressing is maintained or increased (negative reinforcement). Many variations and details of this process may be found in the main article.
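The core contingencies above can be sketched as a toy simulation. This is an illustrative learning rule invented for this example, not a model from the operant-conditioning literature: the probability of a lever press is simply nudged up or down by the consequence that follows it.

```python
import random

def simulate(trials, p_press, consequence, delta=0.05, seed=0):
    """Toy model: each lever press shifts its own future probability.
    'food' stands in for positive reinforcement (probability rises);
    'shock' stands in for punishment (probability falls).  The update
    rule and delta are arbitrary illustrative choices."""
    rng = random.Random(seed)
    presses = 0
    for _ in range(trials):
        if rng.random() < p_press:          # the subject presses the lever
            presses += 1
            if consequence == "food":       # positive reinforcement
                p_press = min(1.0, p_press + delta)
            elif consequence == "shock":    # punishment
                p_press = max(0.0, p_press - delta)
    return presses, p_press

# Reinforced pressing ends up more probable than punished pressing.
_, p_reinforced = simulate(200, 0.5, "food")
_, p_punished = simulate(200, 0.5, "shock")
```

Real operant behavior depends on deprivation level, schedule of reinforcement, and many other variables; the sketch only shows the directional effect of the two consequences.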
See main article: Operant conditioning chamber. The most commonly used tool in animal behavioral research is the operant conditioning chamber—also known as a Skinner box. The chamber is an enclosure designed to hold a test animal (often a rodent, pigeon, or primate). The interior of the chamber contains some type of device that serves as a discriminative stimulus, at least one mechanism to measure the subject's behavior as a rate of response—such as a lever or key-peck switch—and a mechanism for the delivery of consequences—such as a food pellet dispenser or a token reinforcer such as an LED light.
Of historical interest is the cumulative recorder, an instrument used to record the responses of subjects graphically. Traditionally, its graphing mechanism has consisted of a rotating drum of paper equipped with a marking needle. The needle would start at the bottom of the page and the drum would turn the roll of paper horizontally. Each subject response would result in the marking needle moving vertically along the paper one tick. This makes the rate of response the slope of the graph. For example, a regular rate of response would cause the needle to move vertically at a regular rate, resulting in a straight diagonal line rising towards the right. An accelerating or decelerating rate of response would lead to a quadratic (or similar) curve. For the most part, cumulative records are no longer graphed using rotating drums, but are recorded electronically instead.
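The recorder's pen mechanism can be mimicked in a few lines: the cumulative count steps up by one at each response and is otherwise flat, so a steady response rate traces a straight diagonal line and the rate of response appears as the curve's slope.

```python
def cumulative_record(response_times, duration, step=1.0):
    """Return (time, cumulative count) points, mimicking a cumulative
    recorder: the count ticks up by one at each response, so the
    response rate is the slope of the resulting curve."""
    points = []
    count = 0
    times = sorted(response_times)
    idx = 0
    t = 0.0
    while t <= duration:
        # advance past every response that has occurred by time t
        while idx < len(times) and times[idx] <= t:
            count += 1
            idx += 1
        points.append((t, count))
        t += step
    return points

# A steady rate of one response per second yields a straight diagonal line.
record = cumulative_record([1, 2, 3, 4, 5], duration=5)
```

Plotting `record` for an accelerating or decelerating subject would bend the line upward or flatten it, just as described above for the paper-and-needle instrument.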
Laboratory methods employed in the experimental analysis of behavior are based upon B. F. Skinner's philosophy of radical behaviorism.
The idea that Skinner's position is anti-theoretical is probably inspired by the arguments he put forth in his article "Are Theories of Learning Necessary?"[4] However, that article did not argue against the use of theory as such, only against certain theories in certain contexts. Skinner argued that many theories did not explain behavior, but simply offered another layer of structure that itself had to be explained in turn. If an organism is said to have a drive, which causes its behavior, what then causes the drive? Skinner argued that many theories had the effect of halting research or generating useless research.
Skinner's work did have a basis in theory, though his theories were different from those that he criticized. Mecca Chiesa notes that Skinner's theories are inductively derived, while those that he attacked were deductively derived.[5] The theories that Skinner opposed often relied on mediating mechanisms and structures—such as a mechanism for memory as a part of the mind—which were not measurable or observable. Skinner's theories form the basis for two of his books: Verbal Behavior and Science and Human Behavior. These two texts represent considerable theoretical extensions of his basic laboratory work into the realms of political science, linguistics, sociology, and other fields.