Thanks to the multilinearity of tensor representations (canonical format, hierarchical Tucker format, TT format), block coordinate techniques that act on the separate factors of tensor products are of particular interest in tensor optimization, even more so than in general optimization. The simplest and best-known example is the alternating least squares algorithm for tensor approximation (PARAFAC-ALS), but other methods, such as the recently proposed maximum block improvement, also have their merits. The convergence analysis can adopt either a viewpoint in the redundant parameter space of the factors (owing to the nonuniqueness of tensor representations) or one on the set/manifold of parametrized tensors. Within the latter viewpoint, other techniques such as the projected gradient method can also be treated. The talk surveys some recent local convergence results in this field.
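
The block coordinate idea behind PARAFAC-ALS can be sketched as follows: each factor of a canonical (CP) representation is updated in turn by solving a linear least squares problem while the other factors are held fixed. The sketch below is a minimal illustration, not the implementation discussed in the talk; the helper names (`khatri_rao`, `unfold`, `cp_als`) and all parameter choices are assumptions for this example.

```python
import numpy as np

def khatri_rao(A, B):
    # Column-wise Kronecker product of A (I x R) and B (J x R) -> (I*J x R).
    return np.einsum('ir,jr->ijr', A, B).reshape(-1, A.shape[1])

def unfold(T, mode):
    # Mode-n matricization: move the given axis to the front and flatten the rest.
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def cp_als(T, rank, iters=500, seed=0):
    # Alternating least squares for a rank-`rank` CP approximation of a 3-way tensor.
    rng = np.random.default_rng(seed)
    factors = [rng.standard_normal((s, rank)) for s in T.shape]
    for _ in range(iters):
        for mode in range(3):
            others = [factors[m] for m in range(3) if m != mode]
            # Least squares update of one factor with the other two fixed:
            # solve the normal equations via the Gram matrices of the fixed factors.
            kr = khatri_rao(others[0], others[1])
            gram = (others[0].T @ others[0]) * (others[1].T @ others[1])
            factors[mode] = unfold(T, mode) @ kr @ np.linalg.pinv(gram)
    return factors
```

Each inner update is a linear least squares problem because the CP parametrization is linear in each factor separately; this is exactly the multilinearity that makes block coordinate methods attractive here.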