Abstract
In this paper we consider a class of separable convex optimization problems with linearly coupled constraints that arise in many applications. Based on Nesterov’s smoothing technique and a fast gradient scheme, we present a fast dual proximal-gradient method to solve this class of problems. Under suitable conditions, we give a convergence analysis of the proposed method and show that its computational complexity bound for achieving an ε-optimal feasible solution is O(1 / ε). To further accelerate the algorithm, we apply a restart technique that successively decreases the smoothing parameter. An advantage of our algorithms is that they produce approximate dual and primal solutions simultaneously and can be implemented in a parallel fashion. Several numerical experiments illustrate the effectiveness of the proposed algorithms and validate their efficiency. © 2016, Springer Science+Business Media New York.
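The paper's method combines Nesterov smoothing with an accelerated (fast) proximal-gradient iteration applied to the dual. As a minimal illustration of the underlying fast proximal-gradient scheme only (not the paper's dual algorithm), the sketch below runs a generic FISTA-style iteration on a simple composite problem, min over x of 0.5‖Ax − b‖² + λ‖x‖₁; all names and the test problem here are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def soft_threshold(v, tau):
    # Proximal operator of tau * ||.||_1 (soft-thresholding).
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def fast_prox_grad(A, b, lam, num_iters=500):
    """Generic accelerated proximal-gradient (FISTA-style) iteration for
    min_x 0.5*||Ax - b||^2 + lam*||x||_1.

    Illustrative sketch of the fast proximal-gradient building block;
    the paper applies such a scheme to a smoothed dual problem instead.
    """
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the smooth gradient
    x = np.zeros(A.shape[1])
    y = x.copy()
    t = 1.0
    for _ in range(num_iters):
        grad = A.T @ (A @ y - b)                       # gradient of the smooth part at y
        x_new = soft_threshold(y - grad / L, lam / L)  # proximal (soft-threshold) step
        t_new = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
        y = x_new + ((t - 1.0) / t_new) * (x_new - x)  # momentum extrapolation
        x, t = x_new, t_new
    return x
```

This momentum-extrapolated iteration is what yields the accelerated O(1/k²) rate on the smoothed dual, which translates into the O(1/ε) complexity bound stated in the abstract.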
| Original language | English |
|---|---|
| Pages (from-to) | 671-697 |
| Journal | Computational Optimization and Applications |
| Volume | 64 |
| Issue number | 3 |
| DOIs | |
| Publication status | Published - 1 Jul 2016 |
| Externally published | Yes |
Bibliographical note
Publication details (e.g. title, author(s), publication statuses and dates) are captured on an “AS IS” and “AS AVAILABLE” basis at the time of record harvesting from the data source. Suggestions for further amendments or supplementary information can be sent to [email protected].

Research Keywords
- Convex optimization
- Dual decomposition
- Fast proximal-gradient method
- Parallel computation
- Smoothing technique