Showing papers in "Journal of the Experimental Analysis of Behavior in 1969"


Journal ArticleDOI
TL;DR: The results indicate that pecking can be established and maintained by certain stimulus-reinforcer relationships, independent of explicit or adventitious contingencies between response and reinforcer.
Abstract: If a response key is regularly illuminated for several seconds before food is presented, pigeons will peck it after a moderate number of pairings; this “auto-shaping” procedure of Brown and Jenkins (1968) was explored further in the present series of four experiments. The first showed that pecking was maintained even when pecks turned off the key and prevented reinforcement (auto-maintenance); the second controlled for possible effects of generalization and stimulus change. Two other experiments explored procedures that manipulated the tendency to peck the negatively correlated key by introducing alternative response keys which had no scheduled consequences. The results indicate that pecking can be established and maintained by certain stimulus-reinforcer relationships, independent of explicit or adventitious contingencies between response and reinforcer.

721 citations


Journal ArticleDOI
TL;DR: The present results, together with related research, suggest that the ratio of time spent in two activities equals the ratio of the "values" of the activities.
Abstract: When pigeons' standing on one or the other side of a chamber was reinforced on two concurrent variable-interval schedules, the ratio of time spent on the left to time spent on the right was directly proportional to the ratio of reinforcements produced by standing on the left to reinforcements produced by standing on the right. The constant of proportionality was less than unity for all pigeons, indicating a bias toward the right side of the chamber. The biased matching relation obtained here is comparable to the matching relation obtained with concurrent reinforcement of key pecks. The present results, together with related research, suggest that the ratio of time spent in two activities equals the ratio of the "values" of the activities. The value of an activity is the product of several parameters, such as rate and amount of reinforcement, contingent on that activity.
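As a compact illustration of the biased matching relation described above, the result can be written as follows; the symbols (T for times, R for reinforcements, b for the bias constant) are introduced here for exposition and are not the authors' notation:

```latex
% Biased matching of time allocation (illustrative notation, not the authors' own)
% T_L, T_R : time spent standing on the left and right sides
% R_L, R_R : reinforcements produced on the left and right sides
% b        : bias constant; b < 1 reflects the reported right-side bias
\[
  \frac{T_L}{T_R} \;=\; b \,\frac{R_L}{R_R}, \qquad 0 < b < 1 .
\]
```

Under this reading, the paper's further suggestion is that the reinforcement ratio generalizes to a ratio of "values," each value being a product of parameters such as rate and amount of reinforcement.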

705 citations


Journal ArticleDOI
TL;DR: Pigeons' responses in the presence of two concurrently available (initial-link) stimuli produced one of two different (terminal-link) stimuli; a formulation consistent with extant data states that choice behavior depends on the amount of reduction in the expected time to primary reinforcement signified by entry into one terminal link, relative to the amount of reduction signified by entry into the other.
Abstract: Pigeons' responses in the presence of two concurrently available (initial-link) stimuli produced one of two different (terminal-link) stimuli. The rate of reinforcement in the presence of one terminal-link stimulus was three times that of the other. Three different pairs of identical but independent variable-interval schedules controlled entry into the terminal links. When the intermediate pair was in effect, the pigeons distributed their (choice) responses in the presence of the concurrently available stimuli of the initial links in the same proportion as reinforcements were distributed in the mutually exclusive terminal links. This finding was consistent with those of earlier studies. When either the pair of larger or smaller variable-interval schedules was in effect, however, proportions of choice responses did not match proportions of reinforcements. In addition, matching was not obtained when entry into the terminal links was controlled by unequal variable-interval schedules. A formulation consistent with extant data states that choice behavior is dependent upon the amount of reduction in the expected time to primary reinforcement, as signified by entry into one terminal link, relative to the amount of reduction in expected time to reinforcement signified by entry into the other terminal link.
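The formulation summarized in the last sentence is commonly rendered as a delay-reduction ratio; the version below is a hedged sketch using symbols not given in the abstract (T is the expected time to primary reinforcement measured from the onset of the initial links, t_1 and t_2 are the expected times to reinforcement once each terminal link is entered, and B_1, B_2 are choice responses in the initial links):

```latex
% Delay-reduction sketch (symbols are assumptions for exposition, not the authors' notation)
\[
  \frac{B_1}{B_1 + B_2} \;=\; \frac{T - t_1}{(T - t_1) + (T - t_2)} ,
\]
% i.e., choice is apportioned according to the reduction in expected time to
% primary reinforcement signified by entering each terminal link.
```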

502 citations


Journal ArticleDOI
TL;DR: Responding by pigeons on one key of a two-key chamber alternated the color of the second key, on which responding produced food according to a variable-interval schedule of reinforcement; relative overall rates of responding and relative times in the presence of a key color approximated the proportions of reinforcements obtained in the presence of that color.
Abstract: Responding by pigeons on one key of a two-key chamber alternated the color of the second key, on which responding produced food according to a variable-interval schedule of reinforcement. From time to time, reinforcement would be available for a response, but only in the presence of a particular stimulus, either a red or green light on the key. Red or green was chosen irregularly from reinforcement to reinforcement, so that a proportion of the total number of reinforcements could be specified for each color. Experimental manipulations involved variations of (1) the proportions for each color, (2) changeover delay, or, alternatively, (3) a fixed-ratio changeover requirement. The main findings were: (1) relative overall rates of responding and relative times in the presence of a key color approximated the proportions of reinforcements obtained in the presence of that color, while relative local rates of responding changed little; (2) changeover rate decreased as the proportions diverged from 0.50; (3) relative overall rate of responding and relative time remained constant as the changeover delay was increased from 2 to 32 sec, with reinforcement proportions for red and green of 0.75 and 0.25, but they increased above 0.90 when a fixed-ratio changeover of 20 responses replaced the changeover delay; (4) changeover rate decreased as the delay or fixed-ratio was increased.

390 citations


Journal ArticleDOI
TL;DR: The behavior of pigeons on six geometrically spaced fixed-interval schedules ranging from 16 to 512 sec is described as a two-state process: in the first state response rate is low and constant, and in the second state it is an increasing, negatively accelerated function of rate of reinforcement.
Abstract: The behavior of pigeons on six geometrically spaced fixed-interval schedules ranging from 16 to 512 sec is described as a two-state process. In the first state, which begins immediately after reinforcement, response rate is low and constant. At some variable time after reinforcement there is an abrupt transition to a high and approximately constant rate. The point of rapid transition occurs, on the average, at about two-thirds of the way through the interval. Response rate in the second state is an increasing, negatively accelerated function of rate of reinforcement in the second state.
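A minimal simulation sketch of this two-state ("break-and-run") description is given below in Python; the placement of the transition at about two-thirds of the interval follows the abstract, while the specific rate values, the variability of the transition point, and the function names are illustrative assumptions:

```python
import random

def simulate_fixed_interval(interval_s=64, low_rate=0.1, high_rate=2.0, n_intervals=5):
    """Sketch of two-state responding on a fixed-interval schedule.

    Each interval has a low, roughly constant response rate after
    reinforcement, then an abrupt transition to a high, roughly constant
    rate. The transition point varies around two-thirds of the interval,
    as the abstract reports; all numeric values are arbitrary.
    """
    records = []
    for i in range(n_intervals):
        # Transition ("break") point: about two-thirds through the interval.
        break_s = interval_s * random.gauss(2 / 3, 0.05)
        break_s = min(max(break_s, 0.0), interval_s)  # keep it inside the interval
        responses = low_rate * break_s + high_rate * (interval_s - break_s)
        records.append({"interval": i + 1,
                        "break_s": round(break_s, 1),
                        "responses": round(responses, 1)})
    return records

if __name__ == "__main__":
    for record in simulate_fixed_interval():
        print(record)
```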

355 citations


Journal ArticleDOI
TL;DR: Two rough-toothed porpoises were individually trained to emit novel responses by reinforcing a different response to the same set of stimuli in each of a series of training sessions, and a technique was developed for transcribing a complex series of behaviors on to a single cumulative record.
Abstract: Two rough-toothed porpoises (Steno bredanensis) were individually trained to emit novel responses, which were not developed by shaping and which were not previously known to occur in the species, by reinforcing a different response to the same set of stimuli in each of a series of training sessions. A technique was developed for transcribing a complex series of behaviors on to a single cumulative record so that the training sessions of the second animal could be fully recorded. Cumulative records are presented for a session in which the criterion that only novel behaviors would be reinforced was abruptly met with four new types of responses, and for typical preceding and subsequent sessions. Some analogous techniques in the training of pigeons, horses, and humans are discussed.

351 citations


Journal ArticleDOI
TL;DR: This article showed that subjects with a history under ratio conditioning schedules typically produce high and relatively constant rates of responding under fixed-interval (FI) schedules; this responding does not change systematically with changes in FI value.
Abstract: Both high and relatively constant rates of responding without post-reinforcement pauses and lower rates with pauses after reinforcement are produced by human subjects under fixed-interval (FI) schedules. Such FI rates and patterns may be controlled when subjects are provided with different histories of conditioning and different conditions of response cost (reinforcement penalties per response). Subjects with a conditioning history under ratio schedules typically produce high and relatively constant rates of responding under FI schedules; this responding does not change systematically with changes in FI value. In contrast, subjects with a history under schedules which produce little or no responding between reinforcements [such as differential-reinforcement-of-low-rate (DRL) schedules] tend to pause after reinforcement and respond at low rates under FI schedules, whether or not they also have ratio conditioning histories; cost increases the likelihood of this type of performance. For DRL-history subjects, post-reinforcement pauses increase and response rates decrease as FI values increase.

218 citations


Journal ArticleDOI
TL;DR: It was concluded that instructions can have major influences on the establishment and maintenance of human operant behavior.
Abstract: In three experiments, human subjects were trained on a five-component multiple schedule with different fixed intervals of monetary reinforcement scheduled in the different components. Subjects uninstructed about the fixed-interval schedules manifested high and generally equivalent rates regardless of the particular component. By comparison, subjects given instructions about the schedules showed orderly progressions of rates and temporal patterning as a function of the interreinforcement intervals, particularly when feedback about reinforcement was delivered but also when reinforcement-feedback was withheld. Administration of the instructions-reinforcement combination to subjects who had already developed poorly differentiated behavior, however, did not make their behavior substantially better differentiated. When cost was imposed for responding, both instructed and uninstructed subjects showed low and differentiated rates regardless of their prior histories. It was concluded that instructions can have major influences on the establishment and maintenance of human operant behavior.

181 citations




Journal ArticleDOI
TL;DR: Pigeons were trained on a two-link concurrent chain schedule in which responses on two keys were reinforced according to independent variable-interval schedules by the production of a change in key color.
Abstract: Pigeons were trained on a two-link concurrent chain schedule in which responses on two keys were reinforced according to independent variable-interval schedules by the production of a change in key color. Further responses on the key on which the stimulus change had been produced gave a single food reinforcement and a return to concurrent variable-interval conditions. On one key the terminal link was a two-valued mixed-interval schedule, while on the other, the terminal link was a fixed-interval schedule. When the mixed-interval values were kept constant and the fixed-interval values varied, relative response rates in the initial concurrent links matched relative reinforcement rates in the terminal links when these were computed from cubic transformations of the reciprocals of the intervals comprising the terminal link schedules.


Journal ArticleDOI
TL;DR: Variability in response locus decreased to a low value during training in which each response produced reinforcement, and increased when fixed intervals, variable intervals, random intervals, or extinction were scheduled.
Abstract: The effect of several reinforcement schedules on the variability in topography of a pigeon's key-peck response was determined. The measure of topography was the location of a key peck within a 10-in. wide by 0.75-in. high response key. Food reinforcement was presented from a magazine located below the center of the response key. Variability in response locus decreased to a low value during training in which each response produced reinforcement. Variability increased when fixed intervals, variable intervals, random intervals, or extinction were scheduled.

Journal ArticleDOI
TL;DR: Conditioned suppression, a reduction in response rate during a signal, occurs when the signal is paired with noncontingent positive reinforcers as well as with aversive stimuli.
Abstract: Research has revealed the phenomenon of conditioned suppression in which the rate of responding is reduced during a stimulus that is paired with noncontingent shock. The present study replicated this procedure, but used noncontingent positive reinforcers instead of the aversive shock. The lever-pressing responses of rats were reinforced with food or water. While the rats were responding, a stimulus was occasionally presented and paired with the delivery of a noncontingent positive reinforcer, which was either food, water, or brain stimulation for different rats. The result was a reduction in the rate of responding during the conditioned stimulus. This finding shows that conditioned suppression occurs during a signal for reinforcing as well as aversive stimuli.

Journal ArticleDOI
TL;DR: It appears that the rat can better space its responses appropriately when concurrently performing some overt collateral activity, and the amount of this activity apparently comes to serve as a discriminative stimulus.
Abstract: When the lever-pressing behavior of five rats was maintained by a DRL schedule (reinforcement was scheduled only when a specified waiting time between successive responses was exceeded), collateral behavior developed that apparently served a mediating function. In two cases this behavior did not arise until the experimental environment included pieces of wood that the rats started to nibble. When collateral behavior first appeared, it was always accompanied by an increase in responses spaced far enough apart to earn reinforcement. If collateral behavior was prevented, the number of reinforced responses always decreased. Extinction of lever pressing extinguished the collateral behavior. Adding a limited-hold contingency to the schedule did not extinguish collateral behavior. It appears that the rat can better space its responses appropriately when concurrently performing some overt collateral activity. The amount of this activity apparently comes to serve as a discriminative stimulus. To assume the existence of internal events that serve as discriminative stimuli in temporal discriminations is, at least under some circumstances, unnecessary.

Journal ArticleDOI
TL;DR: Pigeons were trained to peck at red or green keys presented simultaneously in discrete trials, and choices of green on the first response after reinforcement matched the proportion of reinforcements for pecking green, extending the generality of overall matching under concurrent reinforcement.
Abstract: Pigeons were trained to peck at red or green keys presented simultaneously in discrete trials. In one experiment, reinforcements were arranged by concurrent variable-interval schedules. The proportion of responses to green approximately matched the proportion of reinforcements produced by pecking green. Detailed analysis of responding revealed a systematic decrease in the probability of switching from green to red within sequences of trials after reinforcement. This trend corresponded to sequential changes in the relative frequency of reinforcement, and not to sequential changes in probability of reinforcement. In a second experiment, reinforcements were scheduled probabilistically every seventh trial. Even though there were no contingencies on pecking during the first six post-reinforcement trials, choices of green on the first response after reinforcement matched the proportion of reinforcements for pecking green. These results extend the generality of overall matching under concurrent reinforcement.

Journal ArticleDOI
TL;DR: When a brief blackout was presented in lieu of reinforcement at the end of 25% of intervals on a fixed-interval 2-min schedule, response rate was reliably and persistently higher during the following 2-min intervals (omission effect).
Abstract: Experiments with pigeons and rats showed that: (1) When a brief blackout was presented in lieu of reinforcement at the end of 25% of intervals on a fixed-interval 2-min schedule, response rate was reliably and persistently higher during the following 2-min intervals (omission effect). This effect was largely due to a decrease in time to first response after reinforcement omission. (2) When blackout duration was varied, within sessions, over the range 2 to 32 sec, time to first response was inversely related to the duration of the preceding blackout, for pigeons, and for rats during the first few sessions after the transition from FI 2-min to FI 2-min with reinforcement omission. Post-blackout pause was independent of blackout duration for rats at asymptote. These results were interpreted in terms of differential depressive effects of reinforcement and blackout on subsequent responding.

Journal ArticleDOI
TL;DR: Post-reinforcement pause was approximately equal for the yoked and ratio pigeons, and was relatively insensitive to changes in the tandem requirement, but terminal response rate increased with increases in the tandem requirement, even though reinforcement rate was invariant.
Abstract: Two variables often confounded in fixed-ratio schedules are reinforcement frequency and response requirement. These variables were isolated by a technique that yoked the distributions of reinforcements in time for one group of pigeons to those of pigeons responding on various fixed-ratio schedules. The contingencies for the yoked birds were then manipulated by adding various tandem fixed-ratio requirements to their schedules. Post-reinforcement pause was approximately equal for the yoked and ratio pigeons, and was relatively insensitive to changes in the tandem requirement. Terminal response rate increased with increases in the tandem requirement, even though reinforcement rate was invariant. This increase was attributed to the progressive interference of the tandem requirement with the differential reinforcement of long interresponse times.

Journal ArticleDOI
TL;DR: The use of a changeover delay between crossover eye movements and reinforcement changed the pattern of scanning from fixating the four dials in succession or in a Z-shaped pattern to scanning the dials on either side vertically, with fewer crossovers.
Abstract: Human macrosaccadic eye movements to two areas of a four-dial display were conditioned by concurrent variable-interval schedules of signals. Reinforcers (signals) were delivered to the two right-hand dials on one schedule and to the two left-hand dials on another, independent schedule. The use of a changeover delay between crossover eye movements and reinforcement had the effect of changing the pattern of scanning from fixating four dials in succession or in a Z-shaped pattern to scanning vertically the dials on either side with fewer crossovers. In the presence of a changeover delay, subjects matched relative eye-movement rates and relative reinforcement rates on each schedule. Rate of crossover eye movements, with a changeover delay in effect, was also inversely related to the difference in reinforcements arranged by the concurrent schedules. The results suggest that for stimuli whose critical components are arranged spatially, conditioned eye movements play an important part in selective stimulus control.

Journal ArticleDOI
TL;DR: The results suggest that in a multiple schedule, the stimulus correlated with extinction, or the lower response rate, functions as a conditioned aversive stimulus.
Abstract: Experiment I sought to determine if the stimulus correlated with extinction in a successive discrimination was an aversive stimulus. An escape response provided an index of aversive control. Two groups of pigeons were exposed to a multiple variable-interval 30-sec extinction schedule. For the experimental group, a single peck on a second key produced a timeout during which all lights in the chamber were dark. For the control group, pecks on the second key had no contingency. The rate of responding on the timeout key during extinction for the experimental group was higher than that of the control group during all sessions of discrimination training except the first. In Exp. II, green was correlated with variable-interval 30-sec and red was correlated with variable-interval 5-min. Timeouts were obtained from variable-interval 5-min. There were more timeouts from extinction in Exp. I than from variable-interval 5-min in Exp. II. Experiment III showed that not presenting the positive stimulus reduced the number of timeouts from the negative stimulus for the two birds from Exp. I that had the highest rate of timeouts from extinction, but had little effect on the two birds that had the lowest rate of timeouts. These results suggest that in a multiple schedule, the stimulus correlated with extinction, or the lower response rate, functions as a conditioned aversive stimulus. Explanations of the timeout response in terms of extinction-produced variability, displaced aggression, and stimulus change were considered but found inadequate.

Journal ArticleDOI
TL;DR: Pigeons were exposed to a procedure in which food was presented after a fixed period of time had elapsed, provided no attack against a nearby stuffed pigeon had occurred during the last 15 sec of the period.
Abstract: Pigeons were exposed to a procedure in which food was presented after a fixed period of time had elapsed, provided no attack against a nearby stuffed pigeon had occurred during the last 15 sec of the period. As the minimum inter-food interval was increased logarithmically through seven values from 15 sec to 960 sec, attack increased to a maximum and then decreased. For both pigeons, attack predominantly occurred after, rather than shortly before, food deliveries.

Journal ArticleDOI
TL;DR: The results suggest that the effects of delayed reinforcement on prior responding can be reproduced by imposing a temporally equal fixed-interval schedule in place of the delay; and, therefore, that the time between a response and reinforcement controls the probability of that response, whether other responses intervene or not.
Abstract: When interreinforcement intervals were equated, pigeons demonstrated little or no preference between reinforcement after a delay interval and reinforcement presented on a fixed-interval schedule. The small preferences sometimes found for the fixed interval (a) were considerably smaller than when the delay and fixed intervals differed in duration, and (b) were caused by the absence of light during the delay. These results suggest that the effects of delayed reinforcement on prior responding can be reproduced by imposing a temporally equal fixed-interval schedule in place of the delay; and, therefore, that the time between a response and reinforcement controls the probability of that response, whether other responses intervene or not.

Journal ArticleDOI
TL;DR: Responding under fixed-ratio schedules was studied, and only one of the four pigeons studied showed a consistently higher response rate, exclusive of post-reinforcement pause, as a function of the longer access to food.
Abstract: Responding under fixed-ratio schedules was studied as a function of two durations of food presentation. Latency of the first response after food presentation (post-reinforcement pause) was consistently shorter when food was presented for the longer duration. Only one of the four pigeons studied showed a consistently higher response rate, exclusive of post-reinforcement pause, as a function of the longer access to food. When ratio size was reduced, pause durations decreased, and the differences related to the two durations of food presentations became progressively smaller.

Journal ArticleDOI
TL;DR: In squirrel monkeys responding under a schedule in which responding postponed the delivery of electric shock, the presentation of response-dependent shock under a fixed-interval (FI) schedule increased the rate of responding and raised fundamental questions about the traditional classification of stimuli as reinforcers or punishers.
Abstract: In squirrel monkeys responding under a schedule in which responding postponed the delivery of electric shock, the presentation of response-dependent shock under a fixed-interval (FI) schedule increased the rate of responding. When the schedule of shock-postponement was eliminated, so that the only shocks delivered were those produced by responses under the FI schedule, a pattern of positively accelerated responding developed and was maintained over an extended period. When responses did not produce shocks (extinction), responding decreased. When shocks were again presented under the FI schedule, the previous pattern of responding quickly redeveloped. In general, response rates were directly related to the intensity of the shock presented, and inversely related to the duration of the fixed interval. These results raise fundamental questions about the traditional classification of stimuli as reinforcers or punishers. The basic similarities among FI schedules of food presentation, shock termination, and shock presentation strengthen the conclusion that the schedule under which an event is presented, and the characteristics of the behavior at the time the event is presented, are of overriding importance in determining the effect of that event on behavior.


Journal ArticleDOI
TL;DR: Four rhesus monkeys learned both a color and a tilt discrimination; when the stimuli were combined to produce incompatible behavior and the behavior controlled by one set was reinforced until "errors" virtually disappeared, subsequent tests indicated that the stimuli producing "errors" were ignored.
Abstract: Four rhesus monkeys learned both a color and tilt discrimination. The stimuli were combined to produce incompatible behavior. The behavior controlled by one set of stimuli was reinforced until “errors” virtually disappeared. The stimuli were tested separately again. Sixteen replications of the entire procedure indicated that the stimuli producing “errors” were ignored.

Journal ArticleDOI
TL;DR: Reinforcement of a response, but not necessarily the occurrence of the response, inhibits other reinforced responses; this inhibitory account contrasts with accounts in terms of excitatory effects of extinction.
Abstract: In an analysis of interactions between concurrent performances, variable-interval reinforcement was scheduled, in various sequences, for both keys, for only one key, or for neither key of a two-key pigeon chamber. With changeover delays of 0.5 or 1.0 sec, and with each key's reinforcements discriminated on the basis of key-correlated feeder stimuli, reinforcement of pecks on one key reduced the pecking maintained by reinforcement on the other key. The decrease in pecking early after reinforcement was discontinued on one key was not substantially affected by whether pecks on the other key were reinforced, but after reinforcement was discontinued on both keys, reinstatement of reinforcement for one key sometimes produced transient increases in pecking on the other key. Correlating the availability of right-key reinforcements with a stimulus, which maintained right-key reinforcement while reducing right-key pecking to negligible levels, demonstrated that these interactions depended on concurrent reinforcement, not concurrent responding. Thus, reinforcement of a response, but not necessarily the occurrence of the response, inhibits other reinforced responses. Compared with accounts in terms of excitatory effects of extinction, often invoked in treatments of behavioral contrast, this inhibitory account has the advantage of dealing only with observed dimensions of behavior.

Journal ArticleDOI
TL;DR: Pigeons' pecks at one or two wavelengths were reinforced intermittently and gradients of responding around the reinforced wavelengths were allowed to stabilize over a number of sessions.
Abstract: Pigeons' pecks at one or two wavelengths were reinforced intermittently. Random series of adjacent wavelengths appeared without reinforcement. Gradients of responding around the reinforced wavelengths were allowed to stabilize over a number of sessions. The single (one reinforced stimulus) and summation (two reinforced stimuli) gradients were consistent with a statistical decision account of the generalization process.

Journal ArticleDOI
TL;DR: Key-pressing responses in the cat were maintained under conditions in which brief electric shock was first postponed by responses, then periodically presented independently of responses, and finally produced by responses on a fixed-interval schedule of 15 min (FI 15-min).
Abstract: Key-pressing responses in the cat were maintained under conditions in which brief electric shock was first postponed by responses (avoidance), then periodically presented independently of responses, and finally produced by responses on a fixed-interval schedule of 15 min (FI 15-min). A steady rate of responding occurred under shock avoidance and under response-independent shock; positively accelerated responding was engendered by the FI 15-min schedule. A second experiment studied responding under second-order schedules composed of three FI 5-min components. Responding was suppressed when a stimulus was presented briefly at completion of each FI 5-min component and a shock followed the brief stimulus at completion of the third component. Responding was maintained when each of the first two components was completed either with or without presentation of a brief stimulus and a shock alone was presented at completion of the third FI 5-min component.

Journal ArticleDOI
TL;DR: The present work showed that extinction is not a necessary determinant of inhibitory stimulus control; inhibitory stimulus control about the stimulus correlated with the differential reinforcement of low rate was obtained.
Abstract: Interspersed reinforcement and extinction during discrimination learning generate a U-shaped gradient of inhibition about the stimulus correlated with extinction. The present work showed that extinction is not a necessary determinant of inhibitory stimulus control. In Exp. I, a reduction in the rate of reinforcement, through a shift from a multiple variable-interval 1-min variable-interval 1-min schedule to a multiple variable-interval 1-min variable-interval 5-min schedule, resulted in a post-discrimination line orientation gradient of inhibition about the stimulus correlated with the variable-interval 5-min schedule. In Exp. II, the rates of reinforcement, correlated with a pair of stimuli, were held constant during a shift from a multiple variable-interval 1-min variable-interval 1-min schedule to a multiple variable-interval 1-min differential-reinforcement-of-low-rate schedule. Inhibitory stimulus control about the stimulus correlated with the differential reinforcement of low rate was obtained. In both experiments, a reduction in the rate of responding during one stimulus and behavioral contrast during the other stimulus preceded the observation of inhibitory stimulus control.