Posted Content

A block-sparse Tensor Train Format for sample-efficient high-dimensional Polynomial Regression

TL;DR: A block-sparse tensor train format for polynomial regression is proposed, in which each block-sparsity pattern corresponds to a subspace of homogeneous multivariate polynomials; this is used to adapt the ansatz space to align better with known sample complexity results.
Abstract: Low-rank tensors are an established framework for high-dimensional least-squares problems. We propose to extend this framework by including the concept of block-sparsity. In the context of polynomial regression each sparsity pattern corresponds to some subspace of homogeneous multivariate polynomials. This allows us to adapt the ansatz space to align better with known sample complexity results. The resulting method is tested in numerical experiments and demonstrates improved computational resource utilization and sample efficiency.
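
As a rough illustration of the block-sparsity idea in the abstract, the Python sketch below (a hypothetical construction, not the authors' reference implementation; the monomial basis, the sizes d, p, g and the degree labels are assumptions) attaches an accumulated polynomial degree to each tensor-train rank index and zeroes every core block that violates those labels, so the represented polynomial is homogeneous of a prescribed total degree:

```python
import numpy as np

# Hypothetical sketch (not the authors' reference code): a multivariate
# polynomial in tensor-train (TT) format with a monomial basis, where every
# TT-rank index carries a "degree" label.  A core entry G[a, i, b] is kept
# only if  label(b) = label(a) + i,  i.e. the block-sparsity pattern forces
# the represented polynomial to be homogeneous of total degree g.

d, p, g = 4, 3, 2                     # d variables, univariate degrees 0..p-1, total degree g
rng = np.random.default_rng(0)
labels = np.arange(g + 1)             # degree labels attached to the TT-rank indices

def block_sparse_core(left, right):
    """Random TT-core with the degree-consistency mask applied."""
    core = rng.standard_normal((len(left), p, len(right)))
    mask = (left[:, None, None] + np.arange(p)[None, :, None] == right[None, None, :])
    return core * mask

cores = [block_sparse_core(np.array([0]), labels)]            # start at degree 0
cores += [block_sparse_core(labels, labels) for _ in range(d - 2)]
cores += [block_sparse_core(labels, np.array([g]))]           # end at degree g

def evaluate(x):
    """Evaluate the TT-polynomial at a point x in R^d."""
    v = np.ones(1)
    for core, xi in zip(cores, x):
        basis = xi ** np.arange(p)                            # monomial basis (1, x, x^2, ...)
        v = v @ np.einsum("aib,i->ab", core, basis)
    return v.item()

# Homogeneity check: f(t*x) == t**g * f(x) for the block-sparse TT.
x = rng.standard_normal(d)
print(np.isclose(evaluate(2.0 * x), 2.0 ** g * evaluate(x)))  # True
```

This is meant only to show how a sparsity pattern on the cores selects a polynomial subspace; relaxing the mask (for example, summing several homogeneous components) would bound the total degree instead of fixing it.
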
References
Journal ArticleDOI
TL;DR: A generalization of the numerical renormalization-group procedure used first by Wilson for the Kondo problem is presented and it is shown that this formulation is optimal in a certain sense.
Abstract: A generalization of the numerical renormalization-group procedure used first by Wilson for the Kondo problem is presented. It is shown that this formulation is optimal in a certain sense. As a demonstration of the effectiveness of this approach, results from numerical real-space renormalization-group calculations for Heisenberg chains are presented.

5,625 citations
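
The "optimal in a certain sense" statement above refers to truncating each block's state space to the dominant eigenvectors of its reduced density matrix. A minimal NumPy sketch (illustrative only, not White's original code; the random bipartite state and the number of kept states are assumptions) of why that truncation is optimal in the Frobenius norm:

```python
import numpy as np

# Minimal NumPy illustration (not White's original code): for a normalized
# bipartite state psi[a, b], projecting block A onto the m eigenvectors of its
# reduced density matrix with the largest eigenvalues is the best rank-m
# truncation in the Frobenius norm, and the error equals the discarded weight.

rng = np.random.default_rng(1)
nA, nB, m = 16, 16, 4                          # assumed block sizes and number of kept states

psi = rng.standard_normal((nA, nB))
psi /= np.linalg.norm(psi)                     # normalized bipartite wavefunction

rho_A = psi @ psi.T                            # reduced density matrix of block A
evals, evecs = np.linalg.eigh(rho_A)
order = np.argsort(evals)[::-1]
kept = evecs[:, order[:m]]                     # m dominant density-matrix eigenvectors

psi_trunc = kept @ (kept.T @ psi)              # project block A onto the kept states

discarded_weight = evals[order[m:]].sum()      # weight of the discarded eigenvalues
print(np.isclose(np.linalg.norm(psi - psi_trunc) ** 2, discarded_weight))   # True
```
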

Book
23 Jun 1995
TL;DR: This book develops semigroup theory and the theory of infinite-dimensional state linear systems, covering controllability and observability, stability, stabilizability and detectability, linear quadratic optimal control, frequency-domain descriptions, Hankel operators and the Nehari problem, and robust finite-dimensional controller synthesis.
Abstract: Table of contents:
1. Introduction: motivation; systems theory concepts in finite dimensions; aims of this book.
2. Semigroup Theory: strongly continuous semigroups; contraction and dual semigroups; Riesz-spectral operators; delay equations; invariant subspaces.
3. The Cauchy Problem: the abstract Cauchy problem; perturbations and composite systems; boundary control systems.
4. Inputs and Outputs: controllability and observability; tests for approximate controllability and observability; input-output maps.
5. Stability, Stabilizability, and Detectability: exponential stability; exponential stabilizability and detectability; compensator design.
6. Linear Quadratic Optimal Control: the problem on a finite-time interval; the problem on the infinite-time interval.
7. Frequency-Domain Descriptions: the Callier-Desoer class of scalar transfer functions; the multivariable extension; state-space interpretations.
8. Hankel Operators and the Nehari Problem: frequency-domain formulation; Hankel operators in the time domain; the Nehari extension problem for state linear systems.
9. Robust Finite-Dimensional Controller Synthesis: closed-loop stability and coprime factorizations; robust stabilization of uncertain systems; robust stabilization under additive uncertainty; robust stabilization under normalized left-coprime-factor uncertainty; robustness in the presence of small delays.
Each of chapters 2-9 closes with exercises and notes and references.
A. Mathematical Background: complex analysis; normed linear spaces (general theory, Hilbert spaces); operators on normed linear spaces (general theory, operators on Hilbert spaces); spectral theory (general spectral theory, spectral theory for compact normal operators); integration and differentiation theory; frequency-domain spaces (Laplace and Fourier transforms, frequency-domain spaces, the Hardy spaces); algebraic concepts (general definitions, coprime factorizations over principal ideal domains, coprime factorizations over commutative integral domains).
References. Notation.

2,923 citations
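
The contents above include linear quadratic optimal control (Chapter 6). As a hedged, finite-dimensional analogue only (the system matrices below are an assumed toy example, not taken from the book), the optimal state feedback can be computed from the continuous-time algebraic Riccati equation with SciPy:

```python
import numpy as np
from scipy.linalg import solve_continuous_are

# Hedged finite-dimensional analogue (assumed toy system, not from the book):
# for x' = Ax + Bu with cost  integral of (x'Qx + u'Ru) dt,  the optimal
# feedback is u = -Kx with K = R^{-1} B' P, where P solves the continuous-time
# algebraic Riccati equation.

A = np.array([[0.0, 1.0],
              [0.0, -0.5]])
B = np.array([[0.0],
              [1.0]])
Q = np.eye(2)                              # state weighting
R = np.array([[1.0]])                      # control weighting

P = solve_continuous_are(A, B, Q, R)
K = np.linalg.solve(R, B.T @ P)            # optimal state-feedback gain

# The LQ-optimal feedback is stabilizing: all closed-loop eigenvalues lie in
# the open left half-plane.
print(np.linalg.eigvals(A - B @ K).real.max() < 0)   # True
```
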

Journal ArticleDOI
TL;DR: This work develops a framework for discovering the governing equations of a dynamical system directly from data measurements by combining sparsity-promoting techniques and machine learning; sparse regression determines the fewest terms in the governing equations required to accurately represent the data.
Abstract: Extracting governing equations from data is a central challenge in many diverse areas of science and engineering. Data are abundant whereas models often remain elusive, as in climate science, neuroscience, ecology, finance, and epidemiology, to name only a few examples. In this work, we combine sparsity-promoting techniques and machine learning with nonlinear dynamical systems to discover governing equations from noisy measurement data. The only assumption about the structure of the model is that there are only a few important terms that govern the dynamics, so that the equations are sparse in the space of possible functions; this assumption holds for many physical systems in an appropriate basis. In particular, we use sparse regression to determine the fewest terms in the dynamic governing equations required to accurately represent the data. This results in parsimonious models that balance accuracy with model complexity to avoid overfitting. We demonstrate the algorithm on a wide range of problems, from simple canonical systems, including linear and nonlinear oscillators and the chaotic Lorenz system, to the fluid vortex shedding behind an obstacle. The fluid example illustrates the ability of this method to discover the underlying dynamics of a system that took experts in the community nearly 30 years to resolve. We also show that this method generalizes to parameterized systems and systems that are time-varying or have external forcing.

2,784 citations
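
The sparse-regression step described in the abstract can be sketched compactly. The following is an illustrative reimplementation of sequentially thresholded least squares, not the authors' code; the damped linear oscillator, the candidate library, and the threshold value are assumed choices:

```python
import numpy as np
from scipy.integrate import solve_ivp

# Illustrative sketch of sequentially thresholded least squares (not the
# authors' code).  The damped linear oscillator, the candidate library, and
# the threshold are assumed choices.

def rhs(t, z):                                   # ground truth: x' = -0.1x + 2y, y' = -2x - 0.1y
    x, y = z
    return [-0.1 * x + 2.0 * y, -2.0 * x - 0.1 * y]

t = np.linspace(0.0, 10.0, 2000)
sol = solve_ivp(rhs, (t[0], t[-1]), [2.0, 0.0], t_eval=t)
X = sol.y.T                                      # state snapshots
dX = np.array([rhs(0.0, z) for z in X])          # derivatives (finite differences in practice)

x, y = X[:, 0], X[:, 1]
Theta = np.column_stack([np.ones_like(x), x, y, x * x, x * y, y * y])   # candidate terms
names = ["1", "x", "y", "x^2", "xy", "y^2"]

def stls(Theta, dX, threshold=0.05, iters=10):
    """Sequentially thresholded least squares: prune small coefficients, refit."""
    Xi = np.linalg.lstsq(Theta, dX, rcond=None)[0]
    for _ in range(iters):
        Xi[np.abs(Xi) < threshold] = 0.0
        for k in range(dX.shape[1]):
            big = np.abs(Xi[:, k]) >= threshold
            Xi[big, k] = np.linalg.lstsq(Theta[:, big], dX[:, k], rcond=None)[0]
    return Xi

Xi = stls(Theta, dX)
for k, lhs in enumerate(["x'", "y'"]):
    terms = [f"{Xi[j, k]:+.2f}*{names[j]}" for j in range(len(names)) if Xi[j, k] != 0.0]
    print(lhs, "=", " ".join(terms))             # recovers the sparse dynamics
```

Because the candidate library here contains the true terms, the printed models recover the sparse dynamics x' = -0.1x + 2y and y' = -2x - 0.1y while zeroing the remaining coefficients.
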

Journal ArticleDOI
TL;DR: The proposed nonrecursive tensor decomposition (the tensor-train format) gives a clear and convenient way to implement all basic operations efficiently; its efficiency is demonstrated by computing the smallest eigenvalue of a 19-dimensional operator.
Abstract: A simple nonrecursive form of the tensor decomposition in $d$ dimensions is presented. It does not inherently suffer from the curse of dimensionality, it has asymptotically the same number of parameters as the canonical decomposition, but it is stable and its computation is based on low-rank approximation of auxiliary unfolding matrices. The new form gives a clear and convenient way to implement all basic operations efficiently. A fast rounding procedure is presented, as well as basic linear algebra operations. Examples showing the benefits of the decomposition are given, and the efficiency is demonstrated by the computation of the smallest eigenvalue of a 19-dimensional operator.

2,127 citations
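
The construction described in the abstract, successive low-rank approximations of auxiliary unfolding matrices, can be illustrated by a simplified TT-SVD sketch (the relative truncation tolerance and the test tensor are assumptions; this is not the paper's exact rounding procedure):

```python
import numpy as np

# Simplified TT-SVD sketch: successive truncated SVDs of the unfolding
# matrices.  The relative tolerance handling is an assumption, not the
# paper's exact rounding procedure.

def tt_svd(tensor, eps=1e-10):
    """Decompose a d-dimensional array into tensor-train cores."""
    dims = tensor.shape
    cores, rank = [], 1
    C = tensor.reshape(rank * dims[0], -1)
    for k in range(len(dims) - 1):
        U, s, Vt = np.linalg.svd(C, full_matrices=False)
        r = max(1, int(np.sum(s > eps * s[0])))               # truncate small singular values
        cores.append(U[:, :r].reshape(rank, dims[k], r))
        C = (s[:r, None] * Vt[:r]).reshape(r * dims[k + 1], -1)
        rank = r
    cores.append(C.reshape(rank, dims[-1], 1))
    return cores

def tt_to_full(cores):
    """Contract the cores back into a full array (for checking only)."""
    result = cores[0]
    for core in cores[1:]:
        result = np.tensordot(result, core, axes=([-1], [0]))
    return result.reshape(result.shape[1:-1])

# Example: a 4-dimensional tensor of low TT-rank is recovered exactly.
rng = np.random.default_rng(0)
a, b = rng.standard_normal((2, 5)), rng.standard_normal((2, 5))
T = np.einsum("i,j,k,l->ijkl", a[0], a[1], b[0], b[1]) \
  + np.einsum("i,j,k,l->ijkl", b[0], b[1], a[0], a[1])
print(np.allclose(tt_to_full(tt_svd(T)), T))                  # True
```
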

Book
15 Sep 2015
TL;DR: This is the second edition of Travis Oliphant's A Guide to NumPy, designed to be a reference that can be used by practitioners who are familiar with Python but want to learn more about NumPy and related tools.
Abstract: This is the second edition of Travis Oliphant's A Guide to NumPy, originally published electronically in 2006. It is designed to be a reference that can be used by practitioners who are familiar with Python but want to learn more about NumPy and related tools. This updated edition shares new perspectives, describes new distributed processing tools in the ecosystem, and explains how Numba can be used to compile code that uses NumPy arrays. Travis Oliphant is the co-founder and CEO of Continuum Analytics. Continuum Analytics develops Anaconda, the leading modern open source analytics platform powered by Python. Travis, who is a passionate advocate of open source technology, has a Ph.D. from Mayo Clinic and B.S. and M.S. degrees in Mathematics and Electrical Engineering from Brigham Young University. Since 1997, he has worked extensively with Python for computational and data science. He was the primary creator of the NumPy package and a founding contributor to the SciPy package. He was also a co-founder and past board member of NumFOCUS, a non-profit for reproducible and accessible science that supports the PyData stack. He also served on the board of the Python Software Foundation.

1,351 citations
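
As a small illustration of the Numba-with-NumPy workflow mentioned in the abstract (the function below is an assumed example, not taken from the book), numba.njit compiles a plain Python loop over a NumPy array to machine code:

```python
import numpy as np
from numba import njit

# Assumed illustrative example (not taken from the book): numba.njit compiles
# a plain Python loop over a NumPy array to machine code.

@njit
def max_abs_step(a):
    """Largest absolute difference between consecutive entries."""
    best = 0.0
    for i in range(a.shape[0] - 1):
        d = abs(a[i + 1] - a[i])
        if d > best:
            best = d
    return best

x = np.random.default_rng(0).standard_normal(10_000)
print(max_abs_step(x))                 # matches np.max(np.abs(np.diff(x)))
```
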