Foundations and Trends® in Machine Learning
Vol. 9, No. 4-5 (2016) 249–429
© 2017 A. Cichocki et al.
DOI: 10.1561/2200000059

Tensor Networks for Dimensionality Reduction and Large-Scale Optimization
Part 1: Low-Rank Tensor Decompositions

Andrzej Cichocki
RIKEN Brain Science Institute (BSI), Japan, and Skolkovo Institute of Science and Technology (SKOLTECH)
[email protected]

Namgil Lee
RIKEN BSI
[email protected]

Ivan Oseledets
Skolkovo Institute of Science and Technology (SKOLTECH) and Institute of Numerical Mathematics of Russian Academy of Sciences
[email protected]

Anh-Huy Phan
RIKEN BSI
[email protected]

Qibin Zhao
RIKEN BSI
[email protected]

Danilo P. Mandic
Department of Electrical and Electronic Engineering, Imperial College London
[email protected]

Contents

1 Introduction and Motivation 250
1.1 Challenges in Big Data Processing 251
1.2 Tensor Notations and Graphical Representations 252
1.3 Curse of Dimensionality and Generalized Separation of Variables for Multivariate Functions 260
1.4 Advantages of Multiway Analysis via Tensor Networks 268
1.5 Scope and Objectives 269

2 Tensor Operations and Tensor Network Diagrams 272
2.1 Basic Multilinear Operations 272
2.2 Graphical Representation of Fundamental Tensor Networks 292
2.3 Generalized Tensor Network Formats 310

3 Constrained Tensor Decompositions: From Two-way to Multiway Component Analysis 314
3.1 Constrained Low-Rank Matrix Factorizations 314
3.2 The CP Format 317
3.3 The Tucker Tensor Format 323
3.4 Higher Order SVD (HOSVD) for Large-Scale Problems 332
3.5 Tensor Sketching Using Tucker Model