We present a random coordinate descent method suited for large-scale problems with a composite objective function. In particular, we focus on linearly coupled constrained optimization problems, i.e., problems whose constraint set is coupled through linear equalities. We prove that our method attains an expected convergence rate of order $O(N^2/k)$, where $N$ is the number of blocks and $k$ is the iteration counter. We show that for functions with cheap coordinate derivatives the new method is much faster, both in worst-case complexity analysis and in numerical implementation, than schemes based on full gradient information. Our method also offers other important advantages: due to the randomization, it is easier to analyze and implement, it produces more robust output, and it is well suited to modern computational architectures (e.g., parallel or distributed architectures). An analysis of the rate of convergence in probability is also provided. For strongly convex functions we prove that the new method converges linearly. We also provide extensive numerical simulations on support vector machine problems, comparing our algorithm against state-of-the-art methods from the literature.
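To illustrate the kind of scheme the abstract describes, the following is a minimal sketch (not the paper's exact algorithm) of a randomized two-coordinate descent step on a toy smooth problem with a single linear coupling constraint $\sum_i x_i = c$: updating a randomly chosen pair of coordinates along the direction $e_i - e_j$ keeps the equality constraint satisfied. The problem data, step rule, and all names here are illustrative assumptions.

```python
import numpy as np

# Toy problem: min 0.5*||A x - b||^2  s.t.  sum(x) = c  (one linear coupling).
# All data below is synthetic, for illustration only.
rng = np.random.default_rng(0)
n = 20
A = rng.standard_normal((50, n))
b = rng.standard_normal(50)
c = 1.0

x = np.full(n, c / n)               # feasible start: sum(x) = c
for _ in range(5000):
    i, j = rng.choice(n, size=2, replace=False)
    d = A[:, i] - A[:, j]           # direction e_i - e_j mapped through A
    g = d @ (A @ x - b)             # directional derivative along e_i - e_j
    L = d @ d                       # curvature along that direction
    t = -g / L if L > 0 else 0.0    # exact minimization along the direction
    x[i] += t
    x[j] -= t                       # sum(x) is unchanged: constraint preserved

print(abs(x.sum() - c) < 1e-9)      # feasibility check
```

Each iteration touches only two columns of $A$, which is the source of the cheap per-iteration cost that the abstract contrasts with full-gradient schemes; the pairwise update is what makes randomization compatible with the coupling constraint.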