Sensitivity Analysis
1 Introduction
When you use a mathematical model to describe reality, you must make approximations. The world is more complicated than the kinds of optimization problems that we are able to solve. Linearity assumptions usually are significant approximations. Another important approximation arises because you cannot be sure of the data that you put into the model. Your knowledge of the relevant technology may be imprecise, forcing you to approximate values in A, b, or c.
Moreover, information may change. Sensitivity analysis is a systematic study of how sensitive (duh) solutions are to (small) changes in the data. The basic idea is to be able to give answers to questions of the form:
1. If the objective function changes, how does the solution change?
2. If resources available change, how does the solution change?
3. If a constraint is added to the problem, how does the solution change?
One approach to these questions is to solve lots of linear programming problems. For example, if you think that the price of your primary output will be between $100 and $120 per unit, you can solve twenty-one different problems (one for each whole number between $100 and $120). This method would work, but it is inelegant and (for large problems) would involve a large amount of computation time. (In fact, computation time is cheap, and computing solutions to similar problems is a standard technique for studying sensitivity in practice.)
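To make the brute-force approach concrete, here is a minimal sketch that re-solves a small LP once for each candidate price. The technology matrix, resource levels, and the second objective coefficient are hypothetical numbers chosen only for illustration; the sketch assumes SciPy's linprog is available.

```python
# Brute-force sensitivity: re-solve a small (hypothetical) LP once for each
# candidate price of the primary output, $100 through $120.
import numpy as np
from scipy.optimize import linprog

# Hypothetical technology: two products, two resource constraints.
A = np.array([[2.0, 1.0],    # resource 1 used per unit of each product
              [1.0, 3.0]])   # resource 2 used per unit of each product
b = np.array([100.0, 90.0])  # resource availabilities

for price in range(100, 121):        # whole-number prices from $100 to $120
    c = np.array([price, 60.0])      # objective: price of product 1 varies
    # linprog minimizes, so negate c to maximize revenue.
    res = linprog(-c, A_ub=A, b_ub=b, bounds=[(0, None), (0, None)],
                  method="highs")
    print(f"price = {price:3d}  optimal plan = {res.x}  revenue = {-res.fun:.2f}")
```

Each pass through the loop is an independent solve; nothing learned from one solution is reused in the next, which is exactly the inefficiency that the approach described next avoids.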
The approach that I will describe in these notes takes full advantage of the structure of linear programming problems and their solutions. It turns out that you can often figure out what happens in “nearby” linear programming problems just by thinking and by examining the information provided by the simplex algorithm. In this section, I will describe the sensitivity analysis information provided in Excel computations. I will also try to give an intuition for the results.
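As a hedged illustration of where that same information lives outside Excel, the sketch below pulls the two main columns of a sensitivity report, shadow prices of constraints and reduced costs of variables, out of a solver's output. The LP data are the hypothetical numbers from the previous sketch, and the attribute names assume SciPy 1.7 or later with the default HiGHS method.

```python
# Hedged sketch: shadow prices and reduced costs from a solver's output,
# analogous to the Shadow Price and Reduced Cost columns of Excel's
# Sensitivity Report.  All numbers below are hypothetical.
import numpy as np
from scipy.optimize import linprog

A = np.array([[2.0, 1.0], [1.0, 3.0]])   # hypothetical technology matrix
b = np.array([100.0, 90.0])              # hypothetical resource levels
c = np.array([110.0, 60.0])              # hypothetical per-unit prices

# Maximize c.x by minimizing -c.x.
res = linprog(-c, A_ub=A, b_ub=b, bounds=[(0, None), (0, None)],
              method="highs")

# Shadow prices: change in the optimal (maximized) value per extra unit of
# each resource; the sign flip undoes the negation of the objective.
shadow_prices = -res.ineqlin.marginals
# Reduced costs: marginals of the nonnegativity bounds, again sign-flipped
# so they refer to the original maximization problem.
reduced_costs = -res.lower.marginals

print("optimal plan :", res.x)
print("optimal value:", -res.fun)
print("shadow prices:", shadow_prices)
print("reduced costs:", reduced_costs)
```

These numbers, and the ranges over which they remain valid, are what let you answer questions about “nearby” problems without re-solving from scratch.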