Planning in the Paper Industry
K. Karen Yin
Dept. of Bio-based Products, University of Minnesota, St. Paul, MN 55108
G. George Yin
Dept. of Mathematics, Wayne State University, Detroit, MI 48202
Hu Liu
3M Commerce Services, 3M Center, St. Paul, MN 55144
DOI 10.1002/aic.10251
Published online in Wiley InterScience (www.interscience.wiley.com).
Problem formulations and solution procedures for production planning and inventory management of manufacturing systems under uncertainty are discussed. Markov decision processes and controlled Markovian dynamic systems are used in the models. Considering an inventory problem in discrete time and formulating it by a finite-state Markov chain lead to a Markov decision process model. Using the policy-improvement algorithm yields the optimal inventory policy. In controlled dynamic system modeling, the random demand and capacity processes involved in planning are described by two finite-state continuous-time Markov chains. Such an approach enables us to embed the randomness in the differential equations of the system. The optimal production rates that minimize an expected cost are obtained by numerically solving the corresponding Hamilton–Jacobi–Bellman (HJB) equations. To overcome the so-called curse of dimensionality, frequently encountered in computation, we resort to a hierarchical approach. Illustrative examples using data collected from a large paper manufacturer are provided. © 2004 American Institute of Chemical Engineers AIChE J, 50: 2877–2890, 2004
Keywords: inventory management; production planning; Markov chain; optimal policy; hierarchical approach
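The discrete-time formulation summarized in the abstract leads to a finite-state Markov decision process solved by the policy-improvement algorithm. As a rough illustration of that technique (not the authors' actual model), the following sketch applies policy iteration to a toy single-product inventory problem; the demand distribution, cost coefficients, storage capacity, and discount factor are invented for the example and are not data from the paper.

```python
# A minimal sketch of the policy-improvement algorithm for a discounted
# finite-state Markov decision process, in the spirit of the discrete-time
# inventory model summarized above. All numbers below are illustrative
# assumptions, not data from the paper.

MAX_INV = 3                               # inventory levels 0..MAX_INV
DEMAND_PMF = {0: 0.3, 1: 0.4, 2: 0.3}     # assumed one-period demand distribution
HOLD, SHORT, ORDER = 1.0, 4.0, 2.0        # assumed unit holding/shortage/ordering costs
BETA = 0.9                                # discount factor

STATES = list(range(MAX_INV + 1))

def actions(s):
    # feasible order quantities: on-hand stock plus order cannot exceed capacity
    return range(MAX_INV - s + 1)

def step(s, a):
    """Expected one-stage cost and transition probabilities for (state, action)."""
    cost = ORDER * a
    probs = [0.0] * (MAX_INV + 1)
    for d, p in DEMAND_PMF.items():
        on_hand = s + a
        nxt = max(on_hand - d, 0)         # unmet demand is lost, not backlogged
        cost += p * (HOLD * nxt + SHORT * max(d - on_hand, 0))
        probs[nxt] += p
    return cost, probs

def evaluate(policy, tol=1e-10):
    # policy evaluation by successive approximation of v = c + BETA * P v
    v = [0.0] * (MAX_INV + 1)
    while True:
        new_v = []
        for s in STATES:
            cost, probs = step(s, policy[s])
            new_v.append(cost + BETA * sum(p * v[t] for t, p in enumerate(probs)))
        if max(abs(a - b) for a, b in zip(new_v, v)) < tol:
            return new_v
        v = new_v

def q_value(s, a, v):
    cost, probs = step(s, a)
    return cost + BETA * sum(p * v[t] for t, p in enumerate(probs))

def policy_improvement():
    policy = {s: 0 for s in STATES}       # initial policy: never order
    while True:
        v = evaluate(policy)
        improved = {s: min(actions(s), key=lambda a: q_value(s, a, v))
                    for s in STATES}
        if improved == policy:            # no state changes its action: optimal
            return policy, v
        policy = improved

policy, v = policy_improvement()
print(policy)  # optimal order quantity for each inventory level
```

Because the state and action sets are finite and each improvement step strictly lowers the discounted cost of at least one state, the iteration terminates at an optimal stationary policy after finitely many passes.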
Introduction
In the pulp and paper industry, the search for good inventory control policies and production plans started several decades ago. Many models have been developed and used (see Leiviskä, 1999, and references therein). Nevertheless, similar to the situation in many other
Literature Cited
Applequist, G., O. Samikoglu, J. Pekny, and G. Reklaitis, "Issues in the Use, Design and Evolution of Process Scheduling and Planning Systems," ISA Trans., 36(2), 81 (1997).
Bertsekas, D., Dynamic Programming and Stochastic Control, Academic Press, New York (1976).
Bertsekas, D., Dynamic Programming: Deterministic and Stochastic Models, Prentice-Hall, Upper Saddle River, NJ (1987).
(1995).
Buffa, E. S., Modern Production/Operations Management, 6th Edition, Wiley, New York (1980).
Davis, M. H. A., Markov Models and Optimization, Chapman & Hall, New York (1993).
Fleming, W. H., and R. W. Rishel, Deterministic and Stochastic Optimal Control, Springer-Verlag, New York (1975).
"… and Inventory Management," Comput. Chem. Eng., 24(12), 2613 (2000).
Hillier, F. S., and G. J. Lieberman, Introduction to Operations Research, 5th Edition, Holden-Day, Oakland, CA (1990).
(1996).
Leiviskä, K., Process Control, Fapet Oy, Helsinki, Finland (1999).
Ross, S., Introduction to Stochastic Dynamic Programming, Academic Press, New York (1983).
Ross, S., Introduction to Probability Models, Academic Press, New York (2000).
Sethi, S. P., and Q. Zhang, Hierarchical Decision Making in Stochastic Manufacturing Systems, Birkhäuser, Boston (1994).
Rev., 13(1), 116 (1934).
"… Systems," J. Optim. Theory Appl., 83, 511 (1994).
Yin, G., and Q. Zhang, Continuous-Time Markov Chains and Applications: A Singular Perturbation Approach, Springer-Verlag, New York (1998).
RI (1997).
"… Systems," Proc. IFAC Conf. of Youth Automation YAC '95, pp. 450–454 (1995).
(2002).
Zipkin, P. H., Foundations of Inventory Management, McGraw-Hill, Boston (2000).

Manuscript received Jan. 31, 2003, and revision received Feb. 11, 2004.