Volume 48, pp. 1-14, 2018.

On accelerating the regularized alternating least-squares algorithm for tensors

Xiaofei Wang, Carmeliza Navasca, and Stefan Kindermann

Abstract

In this paper, we discuss the acceleration of the regularized alternating least-squares (RALS) algorithm for tensor approximations. We propose a fast iterative method that applies an Aitken-Steffensen-like update to the regularized algorithm. Numerical experiments demonstrate a faster convergence rate for the accelerated version than for both the standard and the regularized alternating least-squares algorithms. In addition, we analyze global convergence based on the Kurdyka-Łojasiewicz inequality, and we show that the RALS algorithm has a linear local convergence rate.
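The paper's Aitken-Steffensen-like update for RALS is not reproduced in this abstract. As background, the classical scalar Aitken Δ² process it generalizes can be sketched as follows; the fixed-point iteration x_{k+1} = cos(x_k) used here is purely illustrative and not taken from the paper:

```python
import math

def aitken(seq):
    """Aitken's delta-squared process: accelerate a linearly convergent
    scalar sequence. Returns a sequence two terms shorter than the input."""
    out = []
    for x0, x1, x2 in zip(seq, seq[1:], seq[2:]):
        denom = x2 - 2.0 * x1 + x0
        # Fall back to the plain iterate if the second difference vanishes.
        out.append(x2 if denom == 0.0 else x0 - (x1 - x0) ** 2 / denom)
    return out

# A linearly convergent fixed-point iteration (illustrative example).
xs = [0.5]
for _ in range(10):
    xs.append(math.cos(xs[-1]))

acc = aitken(xs)
root = 0.7390851332151607  # fixed point of cos(x)
print(abs(xs[-1] - root), abs(acc[-1] - root))
```

The accelerated sequence reaches the fixed point with a much smaller error than the plain iteration after the same number of function evaluations, which is the effect the paper pursues for the RALS iterates.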


Key words

alternating least-squares, Kurdyka-Łojasiewicz inequality, tensor approximation

AMS subject classifications

15A69, 65F30
