In recent years, sparsity and structured sparsity have emerged as major tools for handling statistical and machine learning problems in high dimensions. It has been shown that in many cases one can formulate heuristics based on non-smooth convex optimization problems. These problems often combine several penalties, each of which promotes a certain desired structure. In this talk we focus on penalties obtained by composing a simple convex function with a linear transformation. This setting includes, in particular, group/fused lasso methods, multi-task learning, system identification/realization techniques based on the nuclear norm, and learning problems where data can be expressed as tensors. We discuss the solution of the associated proximal problems and present splitting techniques suitable for solving constrained convex optimization problems involving composite penalties.
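As a concrete illustration of the proximal machinery mentioned above (not taken from the talk; the function names and the lasso test problem are illustrative), the proximal operator of the ℓ1 penalty has the well-known closed form of soft-thresholding, and iterating a gradient step on the smooth loss followed by this proximal map gives the basic proximal-gradient (ISTA) scheme:

```python
import numpy as np

def prox_l1(v, t):
    """Proximal operator of t * ||.||_1, i.e. soft-thresholding:
    argmin_x 0.5 * ||x - v||^2 + t * ||x||_1 (closed form)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def ista_lasso(A, b, lam, step, n_iter=500):
    """Proximal-gradient (ISTA) sketch for
    min_x 0.5 * ||A x - b||^2 + lam * ||x||_1."""
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - b)        # gradient of the smooth term
        x = prox_l1(x - step * grad, step * lam)  # proximal step
    return x
```

For a composite penalty of the form h(Bx) with a nontrivial linear map B, the proximal operator generally has no closed form, which is what motivates the splitting techniques the abstract refers to: they only require the (simple) proximal operator of h itself.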