
The Architect of Response Rates: Schedules of Reinforcement
The frequency and pattern with which reinforcement is delivered significantly impact the acquisition,
maintenance, and extinction of behaviors. Skinner meticulously studied these schedules of
reinforcement, categorizing them into continuous and intermittent types.
Continuous Reinforcement (CRF): Every single instance of the desired behavior is
reinforced.
Effect: Leads to very rapid acquisition of a new behavior.
Vulnerability: Also leads to rapid extinction if reinforcement stops. (e.g., a vending
machine that always dispenses a drink when money is inserted; if it fails even once,
people quickly stop putting money in).
Intermittent (Partial) Reinforcement: Only some instances of the desired behavior are
reinforced. This leads to slower initial learning but, crucially, much greater resistance to
extinction. The four intermittent schedules below are illustrated in the simulation sketch
after this list.
Fixed Ratio (FR): Reinforcement is delivered after a fixed, predictable number of
responses.
Effect: Produces a high rate of response, often with a brief pause after each
reinforcement. (e.g., a garment worker paid for every 10 shirts sewn).
Variable Ratio (VR): Reinforcement is delivered after an unpredictable, varying
number of responses.
Effect: Generates a very high, steady rate of response with no predictable
pauses, making it highly resistant to extinction. This is the schedule
underlying gambling. (e.g., slot machines, lottery tickets).
Fixed Interval (FI): Reinforcement is delivered for the first response after a fixed,
predictable amount of time has passed.
Effect: Produces a "scalloped" pattern of responding, with low response rates
immediately after reinforcement and increasing rates as the time for the next
reinforcement approaches. (e.g., studying only right before a weekly quiz).
Variable Interval (VI): Reinforcement is delivered for the first response after an
unpredictable, varying amount of time has passed.
Effect: Produces a moderate, steady rate of response. (e.g., checking email
or social media, where new messages arrive at unpredictable times).
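
To make the four intermittent rules concrete, the following is a minimal Python sketch, not part of the original discussion, that simulates when each schedule delivers a reinforcer for a steady stream of responses. The class names and parameter values (a ratio averaging 10 responses, an interval averaging 30 time units) are illustrative assumptions; continuous reinforcement (CRF) would simply be FixedRatio(1).

```python
"""Illustrative simulation of reinforcement schedules (a sketch, not the
canonical formulation). Each schedule is a small stateful object whose
respond(t) method reports whether a response made at time t is reinforced."""
import random


class FixedRatio:
    """FR-n: reinforce every n-th response, regardless of timing."""
    def __init__(self, n):
        self.n = n
        self.count = 0

    def respond(self, t):
        self.count += 1
        if self.count == self.n:
            self.count = 0
            return True          # e.g., pay after every 10 shirts sewn
        return False


class VariableRatio:
    """VR-n: reinforce after an unpredictable number of responses
    averaging n (the slot-machine schedule)."""
    def __init__(self, n):
        self.n = n
        self._draw_requirement()

    def _draw_requirement(self):
        # Required count varies unpredictably around the mean n.
        self.required = random.randint(1, 2 * self.n - 1)
        self.count = 0

    def respond(self, t):
        self.count += 1
        if self.count >= self.required:
            self._draw_requirement()
            return True
        return False


class FixedInterval:
    """FI-t: reinforce the first response after a fixed time has elapsed
    since the last reinforcer."""
    def __init__(self, interval):
        self.interval = interval
        self.available_at = interval

    def respond(self, t):
        if t >= self.available_at:
            self.available_at = t + self.interval
            return True          # e.g., the weekly quiz finally arrives
        return False


class VariableInterval:
    """VI-t: reinforce the first response after an unpredictable delay
    averaging t (e.g., a new message arriving at a random time)."""
    def __init__(self, mean_interval):
        self.mean_interval = mean_interval
        self.available_at = random.uniform(0, 2 * mean_interval)

    def respond(self, t):
        if t >= self.available_at:
            self.available_at = t + random.uniform(0, 2 * self.mean_interval)
            return True
        return False


if __name__ == "__main__":
    # One response per time unit for 200 units; count reinforcers earned.
    schedules = {
        "FR-10": FixedRatio(10),
        "VR-10": VariableRatio(10),
        "FI-30": FixedInterval(30),
        "VI-30": VariableInterval(30),
    }
    for name, schedule in schedules.items():
        reinforcers = sum(schedule.respond(t) for t in range(1, 201))
        print(f"{name}: {reinforcers} reinforcers for 200 responses")
```

Running the script makes the key contrast visible: the ratio schedules count responses, so responding faster earns reinforcers sooner, while the interval schedules track elapsed time, so extra responses during the waiting period earn nothing. This is why ratio schedules sustain higher response rates than interval schedules.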