Distinguish between the different schedules of reinforcement
What will be an ideal response?
Under continuous reinforcement, every occurrence of the targeted behavior is reinforced. For example, the rat in the Skinner box receives one food pellet every time it presses the lever. Learning occurs quickly under continuous reinforcement, but extinction also occurs quickly.

Under a fixed-ratio schedule, the reinforcer is given after a predetermined number of responses has been made. For example, under an FR-3 schedule, the rat in the Skinner box receives a food pellet after every third bar press.

Under a variable-ratio schedule, the number of responses required for reinforcement varies around some predetermined number. For example, under a VR-5 schedule, the rat may have to press the lever 8 times to earn a reinforcer on one occasion, while on another occasion the first bar press is reinforced; over a large number of trials, the required number of bar presses averages to 5. Variable-ratio schedules usually produce a very high, steady rate of responding and are resistant to extinction.

Under a fixed-interval schedule, reinforcement is given for the first correct response after a fixed amount of time has passed. For example, under an FI-15 schedule, the rat receives a food pellet for the first bar press after a 15-second interval has elapsed. Fixed-interval schedules frequently produce a "scalloped" response pattern, in which the frequency of responding drops after a reinforcer is delivered and then increases near the end of the interval. Elderly people sometimes display this pattern when checking the mail: they watch for the letter carrier and check their mailboxes several times as delivery time approaches, but once the letter carrier has come, they stop checking until it is almost time for the next day's delivery.

Under a variable-interval schedule, the amount of time that must pass before a response is reinforced varies from occasion to occasion. For example, a VI-30 schedule means that the interval between reinforcements varies around an average of 30 seconds; on some trials the interval is shorter, on others longer. VI schedules tend to produce slow, steady response rates and tend to be more resistant to extinction than behaviors reinforced on a fixed-interval schedule.
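For readers who find a concrete model helpful, the short Python sketch below simulates the four partial schedules described above. The class name, parameters, and the uniform distributions used to make the "variable" schedules average out to their target values are illustrative assumptions, not part of the answer itself.

    import random

    class Schedule:
        """Minimal sketch of FR, VR, FI, and VI reinforcement schedules.
        Notation such as ("FR", 3) or ("VI", 30) is illustrative."""

        def __init__(self, kind, n):
            self.kind, self.n = kind, n      # schedule type and its parameter
            self.responses = 0               # responses since the last reinforcer
            self.last_reward_time = 0.0      # used by the interval schedules
            self._reset_requirement()

        def _reset_requirement(self):
            # Set the requirement for the next reinforcer.
            if self.kind == "FR":            # fixed ratio: exactly n responses
                self.required = self.n
            elif self.kind == "VR":          # variable ratio: averages n responses
                self.required = random.randint(1, 2 * self.n - 1)
            elif self.kind == "FI":          # fixed interval: exactly n seconds
                self.required = self.n
            elif self.kind == "VI":          # variable interval: averages n seconds
                self.required = random.uniform(0, 2 * self.n)

        def respond(self, now):
            """Return True if the response made at time `now` (seconds) is reinforced."""
            self.responses += 1
            if self.kind in ("FR", "VR"):
                met = self.responses >= self.required
            else:  # FI or VI: first response after the required time has elapsed
                met = (now - self.last_reward_time) >= self.required
            if met:
                self.responses = 0
                self.last_reward_time = now
                self._reset_requirement()
            return met

    # Usage: a rat pressing the lever once per second under an FR-3 schedule
    rat = Schedule("FR", 3)
    print([rat.respond(t) for t in range(1, 10)])
    # Reinforcement follows every third press: [False, False, True, False, False, True, ...]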