Convex optimization with linear ascending constraints arises widely in applications. In this work, we propose two algorithms for solving convex optimization problems with linear ascending constraints. When the objective function is separable, we propose a dual method that terminates in a finite number of iterations. In particular, the worst-case complexity of our dual method improves on the best-known result for this problem, due to Padakandla and Sundaresan [SIAM J. Optim., 20 (2009), pp. 1185-1204]. We then propose a gradient projection method for a more general class of problems. The gradient projection method uses the dual method as a subroutine in each projection step and, unlike most dual methods, does not need to evaluate inverse gradient functions. Numerical experiments show that both algorithms perform well on test problems.
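For concreteness, the problem class referred to above can be sketched as follows; this is a hedged rendering of the standard separable formulation with linear ascending constraints studied by Padakandla and Sundaresan, with illustrative symbols $f_i$ and $\alpha_i$:

```latex
\begin{align*}
\min_{x \in \mathbb{R}^n} \quad & \sum_{i=1}^{n} f_i(x_i) \\
\text{s.t.} \quad & \sum_{i=1}^{l} x_i \ge \sum_{i=1}^{l} \alpha_i,
    \quad l = 1, \dots, n-1, \\
& \sum_{i=1}^{n} x_i = \sum_{i=1}^{n} \alpha_i, \\
& x_i \ge 0, \quad i = 1, \dots, n,
\end{align*}
```

where each $f_i$ is convex and the constraints are "ascending" in the sense that the partial sums of the decision vector $x$ must dominate the corresponding partial sums of a given vector $\alpha$, with equality enforced at the full sum.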