THE MM PROXIMAL DISTANCE ALGORITHM FOR CONSTRAINED OPTIMIZATION
Kevin Keys, Kenneth Lange.
University of California, Los Angeles, Los Angeles, CA.
The majorization-minimization (MM) principle is a framework for creating iterative optimization algorithms that satisfy a monotonic descent property. The descent property lends MM algorithms remarkable stability, and majorization introduces desirable features such as parameter separation and smoothing into the minimization process. Instead of minimizing an objective function directly, an MM algorithm minimizes a surrogate function that majorizes the objective and is tangent to it at the current iterate. Thus, by minimizing the surrogate, an MM algorithm drives the objective downward. The current survey emphasizes the role of the MM principle in nonlinear programming. For convex objective functions subject to nonsmooth or nonconvex constraints, one can combine an exact penalty method with distance majorization to create versatile proximal distance algorithms that are effective in both discrete and continuous optimization. These proximal distance algorithms are highly modular and easy to implement. They rely on set projections and proximal mappings, both well-understood techniques in optimization. We test MM proximal distance algorithms in linear programming, binary piecewise-linear programming, sparse regression, matrix completion, and sparse precision matrix fitting. In statistical settings, our algorithm tends to recover true models more reliably than current software, and we sketch some future statistical applications where our MM algorithm holds promise.
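The proximal distance idea described above can be illustrated on a toy problem. The sketch below is an assumption-laden illustration, not the authors' implementation: it minimizes f(x) = ½‖x − b‖² subject to x ≥ 0 by penalizing the squared distance to the constraint set C, majorizing dist(x, C)² at the current iterate via the projection P_C(x_k), and then applying the proximal map of f. The penalty parameter rho and its growth schedule are arbitrary illustrative choices.

```python
import numpy as np

def proximal_distance(b, rho=1.0, iters=200, rho_growth=1.05):
    """Illustrative MM proximal distance iteration (a sketch, not the
    authors' code): minimize f(x) = 0.5*||x - b||^2 subject to x >= 0.

    The penalized objective f(x) + (rho/2)*dist(x, C)^2 is majorized at
    x_k by f(x) + (rho/2)*||x - P_C(x_k)||^2, whose minimizer is the
    proximal map x_{k+1} = prox_{f/rho}(P_C(x_k)).
    """
    x = b.copy()
    for _ in range(iters):
        p = np.maximum(x, 0.0)           # set projection onto C = {x >= 0}
        x = (b + rho * p) / (1.0 + rho)  # closed-form prox of this quadratic f
        rho *= rho_growth                # gradually tighten the exact penalty
    return x

b = np.array([2.0, -1.0, 0.5])
x = proximal_distance(b)
# As rho grows, x approaches the constrained minimizer [2.0, 0.0, 0.5].
```

Note the modularity the abstract emphasizes: swapping in a different constraint set only changes the projection step, and swapping in a different objective only changes the proximal step.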