A Block Successive Upper-Bound Minimization Method of Multipliers for Linearly Constrained Convex Optimization

Mingyi Hong, Tsung-Hui Chang, Xiangfeng Wang, Meisam Razaviyayn, Shiqian Ma and Zhi-Quan Luo
Additional contact information
Mingyi Hong: University of Minnesota, Minneapolis, Minnesota 55455;
Tsung-Hui Chang: The Chinese University of Hong Kong, 518172 Shenzhen, People’s Republic of China; Shenzhen Research Institute of Big Data, 518172 Shenzhen, People’s Republic of China;
Xiangfeng Wang: Shanghai Key Laboratory for Trustworthy Computing, East China Normal University, 200062 Shanghai, People’s Republic of China;
Meisam Razaviyayn: University of Southern California, Los Angeles, California 90007;
Shiqian Ma: Department of Mathematics, University of California, Davis, Davis, California 95616;
Zhi-Quan Luo: The Chinese University of Hong Kong, 518172 Shenzhen, People’s Republic of China; Shenzhen Research Institute of Big Data, 518172 Shenzhen, People’s Republic of China;

Mathematics of Operations Research, 2020, vol. 45, issue 3, 833-861

Abstract: Consider the problem of minimizing the sum of a smooth convex function and a separable nonsmooth convex function subject to linear coupling constraints. Problems of this form arise in many contemporary applications, including signal processing, wireless networking, and smart grid provisioning. Motivated by the huge size of these applications, we propose a new class of first-order primal–dual algorithms called the block successive upper-bound minimization method of multipliers (BSUM-M) to solve this family of problems. The BSUM-M updates the primal variable blocks successively by minimizing locally tight upper bounds of the augmented Lagrangian of the original problem, followed by a gradient-type update for the dual variable in closed form. We show that under certain regularity conditions, and when the primal block variables are updated in either a deterministic or a random fashion, the BSUM-M converges to a point in the set of optimal solutions. Moreover, in the absence of linear constraints and under similar conditions as in the previous result, we show that the randomized BSUM-M (which reduces to the randomized block successive upper-bound minimization method) converges at an asymptotically linear rate without relying on strong convexity.
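The update scheme described in the abstract can be illustrated with a small numerical sketch. The code below is an illustrative toy instance, not the paper's implementation: it minimizes a smooth quadratic plus a separable ℓ1 term subject to linear constraints, updating each scalar block by minimizing a quadratic upper bound of the augmented Lagrangian (a proximal-linearized step), then taking a closed-form gradient step on the dual variable. All parameter values (`lam`, `rho`, `alpha`) and the problem data are assumed for illustration.

```python
import numpy as np

# Toy instance of the BSUM-M idea (illustrative only):
#   minimize 0.5*||x - c||^2 + lam*||x||_1   subject to  A x = b
# Primal blocks are updated successively via a locally tight quadratic
# upper bound of the augmented Lagrangian; the dual update is a
# closed-form gradient step.

rng = np.random.default_rng(0)
m, n = 2, 5
A = rng.standard_normal((m, n))
b = rng.standard_normal(m)
c = rng.standard_normal(n)
lam, rho, alpha = 0.1, 1.0, 0.2   # penalty and dual stepsize (assumed values)

def soft(z, t):
    """Proximal operator of t*|.| (soft-thresholding)."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

x, y = np.zeros(n), np.zeros(m)
col_norm2 = (A ** 2).sum(axis=0)

for _ in range(5000):
    for k in range(n):                       # successive block updates
        r = A @ x - b
        # gradient of the smooth part of the augmented Lagrangian in x_k
        grad_k = (x[k] - c[k]) + A[:, k] @ y + rho * (A[:, k] @ r)
        L_k = 1.0 + rho * col_norm2[k]       # coordinate Lipschitz bound
        x[k] = soft(x[k] - grad_k / L_k, lam / L_k)
    y += alpha * (A @ x - b)                 # closed-form dual gradient step

print(np.linalg.norm(A @ x - b))             # feasibility residual
```

On this small strongly convex instance the feasibility residual shrinks toward zero; the paper's convergence guarantees cover the general convex (not necessarily strongly convex) case under the stated regularity conditions.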

Keywords: block successive upper-bound minimization; alternating direction method of multipliers; randomized block coordinate descent
Date: 2020

Full text (PDF): https://doi.org/10.1287/moor.2019.1010


Persistent link: https://EconPapers.repec.org/RePEc:inm:ormoor:v:45:y:2020:i:3:p:833-861


More articles in Mathematics of Operations Research from INFORMS.
Bibliographic data for series maintained by Chris Asher.

Handle: RePEc:inm:ormoor:v:45:y:2020:i:3:p:833-861