Affiliation:
1. DATA61 CSIRO Melbourne Victoria Australia
2. School of Mathematics and Physics University of Queensland Brisbane Queensland Australia
Abstract
The canonical polyadic (CP) decomposition of tensors is one of the most important tensor decompositions. While the well‐known alternating least squares (ALS) algorithm is often considered the workhorse algorithm for computing the CP decomposition, it is known to suffer from slow convergence in many cases, and various algorithms have been proposed to accelerate it. In this article, we propose a new accelerated ALS algorithm that accelerates ALS in a blockwise manner using a simple momentum‐based extrapolation technique and a random perturbation technique. Specifically, our algorithm updates one factor matrix (i.e., block) at a time, as in ALS, with each update consisting of a minimization step that directly reduces the reconstruction error, an extrapolation step that moves the factor matrix along the previous update direction, and a random perturbation step for breaking convergence bottlenecks. Our extrapolation strategy takes a simpler form than the state‐of‐the‐art extrapolation strategies and is easier to implement. Our algorithm has negligible computational overhead relative to ALS and is simple to apply. Empirically, our proposed algorithm shows strong performance compared to state‐of‐the‐art acceleration techniques on both simulated and real tensors.
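The three-step blockwise update described in the abstract (least-squares minimization, momentum extrapolation along the previous update direction, and a small random perturbation) can be sketched as follows for a 3-way tensor. This is an illustrative sketch of the general idea under assumed details (extrapolation weight `beta`, perturbation scale `perturb`, and initialization are placeholders), not the paper's exact algorithm or parameter choices.

```python
import numpy as np

def khatri_rao(U, V):
    """Column-wise Kronecker product of U (J x R) and V (K x R) -> (J*K) x R."""
    R = U.shape[1]
    return np.einsum('jr,kr->jkr', U, V).reshape(-1, R)

def unfold(T, mode):
    """Mode-n unfolding, remaining axes kept in their original order."""
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def cp_als_momentum(T, R, n_iter=200, beta=0.3, perturb=1e-8, seed=0):
    """Blockwise ALS for a CP decomposition with momentum extrapolation and
    random perturbation (a hedged sketch of the idea, not the paper's scheme)."""
    rng = np.random.default_rng(seed)
    N = T.ndim
    factors = [rng.standard_normal((d, R)) for d in T.shape]
    prev = [f.copy() for f in factors]          # last least-squares iterates
    for _ in range(n_iter):
        for n in range(N):                      # one block (factor matrix) at a time
            others = [factors[m] for m in range(N) if m != n]
            kr = others[0]
            for U in others[1:]:
                kr = khatri_rao(kr, U)
            gram = np.ones((R, R))
            for U in others:
                gram *= U.T @ U                 # Hadamard product of Gram matrices
            # 1) Minimization step: exact least-squares update for block n
            A_ls = unfold(T, n) @ kr @ np.linalg.pinv(gram)
            # 2) Extrapolation step: move along the previous update direction
            factors[n] = A_ls + beta * (A_ls - prev[n])
            # 3) Random perturbation step: tiny noise to break bottlenecks
            factors[n] += perturb * rng.standard_normal(factors[n].shape)
            prev[n] = A_ls
    # Relative reconstruction error from the un-extrapolated factors (3-way case)
    kr = khatri_rao(prev[1], prev[2])
    err = np.linalg.norm(unfold(T, 0) - prev[0] @ kr.T) / np.linalg.norm(T)
    return prev, err
```

The extrapolated factors are fed into the subsequent block updates, which is what distinguishes this scheme from plain ALS while keeping the per-iteration cost essentially unchanged.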
Subject
Applied Mathematics, Algebra and Number Theory