We first consider subgradient algorithms for non-smooth convex optimization problems over sufficiently simple feasible regions. We propose an extended version of the Mirror-Descent (MD) algorithm, originally proposed by Nemirovsky and Yudin, and also incorporate some ideas from Nesterov's Dual-Averaging (DA) algorithm. The resulting algorithms attain iteration-complexity bounds similar to those of the original ones. Our approach provides a simpler analysis of the DA algorithm than the original one and yields similar results for the extended MD algorithm; the same holds for the algorithms we develop for other problem classes. Second, these new algorithms can solve structured convex optimization problems, including smooth problems whose objective functions have Lipschitz continuous gradients. Each algorithm solves only one auxiliary subproblem per iteration.
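As a point of reference for the class of methods discussed above, the following is a minimal sketch of the classical entropic Mirror-Descent method on the probability simplex, applied to the non-smooth convex function f(x) = ||x - c||_1. This is an illustration of the standard MD scheme, not the paper's extended algorithm; the target point `c`, the step-size rule, and the objective are assumptions chosen for the example. Note that each iteration solves exactly one simple auxiliary subproblem (a multiplicative update followed by renormalization).

```python
# Hypothetical illustration: classical entropic mirror descent on the
# probability simplex, minimizing the non-smooth convex f(x) = ||x - c||_1.
import math

def mirror_descent_simplex(subgrad, x0, step, iters):
    """Entropic MD: x_i <- x_i * exp(-step * g_i), then renormalize.
    This closed-form update is the single auxiliary subproblem per iteration.
    Returns the uniform average of the iterates (standard for subgradient MD)."""
    x = list(x0)
    avg = [0.0] * len(x)
    for _ in range(iters):
        g = subgrad(x)                                   # a subgradient of f at x
        x = [xi * math.exp(-step * gi) for xi, gi in zip(x, g)]
        s = sum(x)
        x = [xi / s for xi in x]                         # project back onto the simplex
        avg = [a + xi for a, xi in zip(avg, x)]
    return [a / iters for a in avg]

c = [0.2, 0.3, 0.5]                                      # assumed target inside the simplex
f = lambda x: sum(abs(xi - ci) for xi, ci in zip(x, c))
# sign vector is a valid subgradient of the l1 distance (1.0 is valid at ties)
subgrad = lambda x: [1.0 if xi >= ci else -1.0 for xi, ci in zip(x, c)]

n, iters = 3, 5000
step = math.sqrt(2.0 * math.log(n) / iters)              # step from the standard MD bound
x_avg = mirror_descent_simplex(subgrad, [1.0 / n] * n, step, iters)
```

With this step size, the classical analysis bounds the averaged iterate's gap by sqrt(2 ln n / N), which matches the O(1/sqrt(N)) complexity typical of subgradient methods on non-smooth problems.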