Computers & Chemical Engineering, Vol. 21, Suppl., pp. 445-450, 1997
Global Optimization of MINLP Problems in Process Synthesis and Design
Two new methodologies for the global optimization of MINLP models, the Special structure Mixed Integer Nonlinear αBB (SMIN-αBB) and the General structure Mixed Integer Nonlinear αBB (GMIN-αBB), are presented. Their theoretical foundations guarantee that the global optimum solution of MINLPs involving twice-differentiable nonconvex functions in the continuous variables can be identified. The conditions imposed on the participation of the binary variables differ for each method: linear and mixed bilinear terms can be treated with the SMIN-αBB, while mixed nonlinear terms whose continuous relaxation is twice-differentiable are handled by the GMIN-αBB. Although both algorithms use the concept of a branch-and-bound tree, they rely on fundamentally different bounding and branching strategies. In the SMIN-αBB algorithm, lower (upper) bounds at each node result from the solution of convex (nonconvex) MINLPs derived from the original problem. The construction of the convex lower bounding MINLPs, based on techniques recently developed for the generation of valid convex underestimators for twice-differentiable functions (Adjiman et al., 1996; Adjiman and Floudas, 1996), is an essential task, as it allows the underestimating problems to be solved to global optimality with the GBD or OA algorithm, provided that the binary variables participate separably and linearly. Moreover, the inherent structure of the MINLP problem can be fully exploited, since branching is performed on both the binary and the continuous variables. In the GMIN-αBB algorithm, the lower and upper bounds are obtained by solving continuous relaxations of the original MINLP. These nonconvex NLPs are solved as global optimization problems with the αBB algorithm, so that valid lower bounds are generated. Since branching is performed exclusively on the binary variables, the maximum size of the branch-and-bound tree is smaller than that of the SMIN-αBB.
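The convex underestimators at the heart of both methods can be illustrated in one dimension: for a twice-differentiable f on [x^L, x^U], the αBB underestimator f(x) + α(x^L − x)(x^U − x) is convex whenever α ≥ max(0, −½ min f''), and it never exceeds f on the box. The following is a minimal sketch, in which the test function, the value of α, and the grid search standing in for a convex NLP solver are all illustrative choices, not the paper's implementation:

```python
import numpy as np

def alpha_bb_underestimator(f, alpha, xL, xU):
    """One-dimensional alpha BB underestimator of f on the box [xL, xU].

    L(x) = f(x) + alpha * (xL - x) * (xU - x) is convex whenever
    alpha >= max(0, -0.5 * min f''(x) over [xL, xU]), and L(x) <= f(x)
    holds everywhere on the box, so min L is a valid lower bound on min f.
    """
    def L(x):
        return f(x) + alpha * (xL - x) * (xU - x)
    return L

# Illustrative nonconvex function: f(x) = sin(x) on [0, 2*pi].
# Here f''(x) = -sin(x), so min f'' = -1 and alpha = 0.5 suffices.
f = np.sin
xL, xU = 0.0, 2.0 * np.pi
L = alpha_bb_underestimator(f, alpha=0.5, xL=xL, xU=xU)

# A dense grid stands in for a convex NLP solver in this sketch:
x = np.linspace(xL, xU, 100001)
lower_bound = L(x).min()   # valid lower bound on min f over the box
incumbent = f(x).min()     # upper bound from evaluating the original f

assert np.all(L(x) <= f(x) + 1e-12)  # underestimation property
assert lower_bound <= incumbent      # lower bound never exceeds upper bound
```

Minimizing the convex underestimator over the current box yields the valid lower bound used to fathom nodes of the branch-and-bound tree; as the box shrinks, the quadratic perturbation vanishes and the bound tightens.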
The two proposed approaches are used to generate computational results for various nonconvex MINLP problems arising in the areas of process synthesis and design.
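The strategy of branching exclusively on the binary variables, with a continuous relaxation supplying the lower bound at each node, can be sketched on a toy model. In this sketch a min-cost covering knapsack and its greedy fractional relaxation are illustrative stand-ins for the nonconvex continuous relaxations that GMIN-αBB bounds with the αBB algorithm; the instance data are assumptions for demonstration only:

```python
import math

def relaxation_bound(c, w, W, fixed):
    """LP relaxation of: min sum c[j]*y[j] s.t. sum w[j]*y[j] >= W, y in [0,1].

    'fixed' maps a variable index to 0 or 1 for branching decisions made so
    far. Returns (bound, frac_j), where frac_j is None when the relaxed
    optimum is already integral. Solved greedily by cost/weight ratio.
    """
    cost = sum(c[j] for j in fixed if fixed[j] == 1)
    need = W - sum(w[j] for j in fixed if fixed[j] == 1)
    if need <= 0:
        return cost, None
    free = sorted((j for j in range(len(c)) if j not in fixed),
                  key=lambda j: c[j] / w[j])
    for j in free:
        if w[j] >= need:
            frac = need / w[j]
            return cost + frac * c[j], (None if frac == 1.0 else j)
        cost += c[j]
        need -= w[j]
    return math.inf, None  # node infeasible even with all free variables at 1

def branch_and_bound(c, w, W):
    """Branch only on the binary variables; bound each node by its relaxation."""
    best, stack = math.inf, [{}]
    while stack:
        fixed = stack.pop()
        bound, frac_j = relaxation_bound(c, w, W, fixed)
        if bound >= best:
            continue  # fathom: this node cannot improve the incumbent
        if frac_j is None:
            best = bound  # relaxation is integral -> new incumbent
        else:
            stack.append({**fixed, frac_j: 0})
            stack.append({**fixed, frac_j: 1})
    return best

# Tiny covering-knapsack instance (illustrative data):
c = [4.0, 3.0, 5.0, 6.0]
w = [2.0, 3.0, 4.0, 5.0]
print(branch_and_bound(c, w, W=7.0))  # optimum selects items 1 and 2
```

Because only the binary variables are branched on, the tree has at most 2^n leaves, which mirrors the observation above that the GMIN-αBB tree is bounded more tightly than the SMIN-αBB tree, where the continuous variables are branched on as well.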