Learning outcomes
• The Big M Method to solve a linear programming problem.
In the previous discussions of the Simplex algorithm we have seen that the method must start with a basic feasible solution. In the examples so far, we have looked at problems that, when put into standard LP form, conveniently have an all-slack starting solution. An all-slack solution is only a possibility when all of the constraints in the problem are ≤ constraints; when a problem has ≥ or = constraints, a starting basic feasible solution may not be readily apparent. The Big M method is a version of the Simplex algorithm that first finds a basic feasible solution by adding "artificial" variables to the problem. The objective function of the original LP must, of course, be modified to ensure that the artificial variables are all equal to 0 at the conclusion of the simplex algorithm.
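As a quick illustration (the numbers are my own, not an example from the text): the equality constraint x1 + x2 = 10 contains no variable that can serve as a starting basic variable, so an artificial variable a1 is added to give x1 + x2 + a1 = 10. In a min problem the penalty term M a1 is then added to the objective; because M is very large, any solution that keeps a1 > 0 is heavily penalised, so the simplex is pushed toward solutions in which a1 = 0 and the original constraint holds exactly.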
Steps

1. Modify the constraints so that the RHS of each constraint is nonnegative. (This requires that each constraint with a negative RHS be multiplied by -1. Remember that if you multiply an inequality by a negative number, the direction of the inequality is reversed!) After modification, identify each constraint as a ≤, ≥, or = constraint.
2. Convert each inequality constraint to standard form: if constraint i is a ≤ constraint, add a slack variable s_i; if constraint i is a ≥ constraint, subtract an excess variable e_i.
3. Add an artificial variable a_i to each constraint identified as a ≥ or = constraint at the end of Step 1. Also add the sign restriction a_i ≥ 0.
4. If the LP is a max problem, add (for each artificial variable) -M a_i to the objective function, where M denotes a very large positive number.
5. If the LP is a min problem, add (for each artificial variable) +M a_i to the objective function.
6. Now solve the transformed problem by the simplex algorithm. Since each artificial variable will be in the starting basis, all artificial variables must be eliminated from row 0 before beginning the simplex. (A small computational sketch of these steps is given below.)
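To make Steps 1-6 concrete, here is a minimal sketch in Python/NumPy, assuming a small illustrative min problem; the data, the value M = 1e6, and the function name big_m_simplex are mine, not from the text. The constraint matrix is already in the form produced by Steps 1-3: a slack s1 for the ≤ row, an excess e2 and artificial a2 for the ≥ row, and an artificial a3 for the = row, with the artificials carrying cost +M as in Step 5. Rather than maintaining an explicit row 0, the sketch recomputes the reduced costs from the current basis at every iteration, which has the same effect as eliminating the artificial variables from row 0 before the first pivot.

import numpy as np

def big_m_simplex(c, A, b, basis):
    """Minimise c @ x subject to A @ x = b, x >= 0, starting from `basis`,
    a list of column indices (slack/artificial columns) forming an identity.
    The Big M penalties are assumed to be already embedded in c."""
    m, n = A.shape
    T = np.hstack([A.astype(float), b.astype(float).reshape(-1, 1)])
    while True:
        # Reduced costs c_j - c_B B^{-1} A_j; T[:, :n] holds B^{-1} A.
        reduced = c - c[basis] @ T[:, :n]
        j = int(np.argmin(reduced))
        if reduced[j] >= -1e-9:              # no improving column: optimal
            x = np.zeros(n)
            x[basis] = T[:, n]
            return x, float(c @ x)
        # Ratio test chooses the leaving row.
        ratios = np.where(T[:, j] > 1e-9, T[:, n] / T[:, j], np.inf)
        r = int(np.argmin(ratios))
        if not np.isfinite(ratios[r]):
            raise ValueError("LP is unbounded")
        T[r] /= T[r, j]                      # pivot on (r, j)
        for i in range(m):
            if i != r:
                T[i] -= T[i, j] * T[r]
        basis[r] = j

# Illustrative problem: min 2x1 + 3x2
#   s.t. 0.5x1 + 0.25x2 <= 4,  x1 + 3x2 >= 20,  x1 + x2 = 10,  x1, x2 >= 0.
# Columns: x1, x2, s1 (slack), e2 (excess), a2, a3 (artificials).
M = 1e6
c = np.array([2.0, 3.0, 0.0, 0.0, M, M])
A = np.array([[0.5, 0.25, 1,  0, 0, 0],
              [1.0, 3.00, 0, -1, 1, 0],
              [1.0, 1.00, 0,  0, 0, 1]])
b = np.array([4.0, 20.0, 10.0])
x, z = big_m_simplex(c, A, b, basis=[2, 4, 5])
print(x[:2], z)                 # expected: x1 = 5, x2 = 5, z = 25
print("artificials:", x[4:])    # both should be (numerically) zero

If any artificial variable is still positive at the optimum, the original LP has no feasible solution. In practice, a literal numerical value for M can cause rounding trouble, which is why production solvers usually prefer a two-phase approach; the sketch above is only meant to mirror the steps described in this section.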