2. Secondary/conditioned reinforcer – a previously neutral stimulus that acquires the ability to strengthen responses because the stimulus has been paired with a primary reinforcer.
(For example, money itself isn't satisfying to eat, but we learn that money can be used to buy food.)
Schedules of Reinforcement
1. Continuous Reinforcement
In continuous reinforcement, the desired behavior is reinforced every single time it occurs. Generally, this schedule is best used during the initial stages of learning, in order to create a strong association between the behavior and its reinforcement.
2. Partial Reinforcement
In partial reinforcement, the response is reinforced only part of the time. Learned behaviors are acquired more slowly with partial reinforcement, but the response is more resistant to extinction.
Four Schedules of Partial Reinforcement
1. Fixed Ratio (FR) – reinforcement is delivered after every nth response.
Example (many factory workers are paid according to the number of items they produce: a worker might be paid $10.00 for every 100 widgets he makes, which would be an FR 100 schedule).
2. Variable Ratio (VR) – the same as FR except that the ratio varies instead of staying fixed. Reinforcement is given after every nth response, where n is an average; the exact number of responses required changes from one reinforcement to the next.
Example (slot machines: though the probability of hitting the jackpot is constant, the number of lever presses needed to hit the jackpot is variable).
3. Fixed Interval (FI) – a designated amount of time must pass, and then a certain response must be made, in order to obtain reinforcement.
Example (a washing machine cycle: the clothes are ready only after a fixed amount of time has passed, and you must then respond by unloading the machine).
4. Variable Interval (VI) – the same as FI, except that the time interval varies around an average.
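To make the four schedules concrete, here is a minimal Python sketch (not part of the original notes) that simulates when each schedule would deliver reinforcement. The class names, the ratio of 5 responses, and the interval of 10 time units are arbitrary illustrative choices, not values from the text.

import random

# Each class answers one question: is the current response reinforced?
# respond() accepts the current time so all four schedules share an interface,
# but ratio schedules only count responses and ignore the clock.

class FixedRatio:
    """FR n: reinforce after every nth response."""
    def __init__(self, n=5):
        self.n = n
        self.count = 0

    def respond(self, now):
        self.count += 1
        if self.count >= self.n:
            self.count = 0
            return True
        return False

class VariableRatio:
    """VR n: reinforce after every nth response on average; the exact ratio varies."""
    def __init__(self, n=5):
        self.n = n
        self.required = random.randint(1, 2 * n - 1)  # averages n
        self.count = 0

    def respond(self, now):
        self.count += 1
        if self.count >= self.required:
            self.count = 0
            self.required = random.randint(1, 2 * self.n - 1)
            return True
        return False

class FixedInterval:
    """FI t: the first response made after t time units have passed is reinforced."""
    def __init__(self, t=10.0):
        self.t = t
        self.last = 0.0

    def respond(self, now):
        if now - self.last >= self.t:
            self.last = now
            return True
        return False

class VariableInterval:
    """VI t: same as FI, but the required wait varies around an average of t."""
    def __init__(self, t=10.0):
        self.t = t
        self.last = 0.0
        self.required = random.uniform(0, 2 * t)  # averages t

    def respond(self, now):
        if now - self.last >= self.required:
            self.last = now
            self.required = random.uniform(0, 2 * self.t)
            return True
        return False

if __name__ == "__main__":
    schedules = {"FR 5": FixedRatio(5), "VR 5": VariableRatio(5),
                 "FI 10": FixedInterval(10), "VI 10": VariableInterval(10)}
    # One response per time unit for 30 time units; print which schedules pay off.
    for now in range(1, 31):
        reinforced = [name for name, s in schedules.items() if s.respond(now)]
        if reinforced:
            print(f"t={now:2d}: reinforced on {', '.join(reinforced)}")

Running the script shows the contrast described above: FR 5 and FI 10 reinforce at perfectly regular points, while VR 5 and VI 10 reinforce at irregular, unpredictable points, which is one reason behavior learned on partial schedules is more resistant to extinction.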