Open Access Posted Content

Pricing high-dimensional Bermudan options with hierarchical tensor formats

TLDR
In this paper, an efficient compression technique based on hierarchical tensors is presented for popular option pricing methods; it can be used for the computation of Bermudan option prices with the Monte Carlo least-squares approach as well as the dual martingale method, both using high-dimensional tensorized polynomial expansions.
Abstract
An efficient compression technique based on hierarchical tensors for popular option pricing methods is presented. It is shown that the "curse of dimensionality" can be alleviated for the computation of Bermudan option prices with the Monte Carlo least-squares approach as well as the dual martingale method, both using high-dimensional tensorized polynomial expansions. This discretization allows for a simple and computationally cheap evaluation of conditional expectations. Complexity estimates are provided as well as a description of the optimization procedures in the tensor train format. Numerical experiments illustrate the favourable accuracy of the proposed methods. The dynamic programming method yields results comparable to recent neural-network-based methods.
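To illustrate why the tensor train discretization makes conditional expectations cheap to evaluate, the sketch below contracts a TT-compressed coefficient tensor of a tensorized polynomial against univariate basis evaluations, so the cost grows linearly in the number of assets rather than exponentially. This is a minimal illustration assuming a monomial basis and random cores; the function name `tt_polynomial_eval` and all parameter values are hypothetical and not taken from the paper.

```python
import numpy as np

def tt_polynomial_eval(cores, x, degree):
    """Evaluate a d-variate polynomial whose coefficient tensor is stored in
    tensor train (TT) format: cores[k] has shape (r_{k-1}, degree + 1, r_k).
    The full coefficient tensor would hold (degree + 1)**d entries; contracting
    core by core keeps the cost linear in the dimension d."""
    result = np.ones((1,))                          # running 1 x r_k vector
    for k, core in enumerate(cores):
        basis = x[k] ** np.arange(degree + 1)       # univariate monomials at x_k
        result = result @ np.einsum('i,aib->ab', basis, core)
    return result.item()

# Example: d = 5 assets, degree-3 polynomials, all TT ranks equal to 2.
rng = np.random.default_rng(0)
d, deg, r = 5, 3, 2
ranks = [1] + [r] * (d - 1) + [1]
cores = [rng.standard_normal((ranks[k], deg + 1, ranks[k + 1])) for k in range(d)]
print(tt_polynomial_eval(cores, rng.standard_normal(d), deg))
```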


Citations
Posted Content

Solving high-dimensional parabolic PDEs using the tensor train format

TL;DR: In this article, the authors argue that tensor trains provide an appealing approximation framework for parabolic PDEs: the combination of reformulations in terms of backward stochastic differential equations and regression-type methods in the tensor format holds the promise of leveraging latent low-rank structures, enabling both compression and efficient computation.
Posted Content

Dynamical low-rank approximations of solutions to the Hamilton-Jacobi-Bellman equation

TL;DR: In this paper, a low-rank tensor train (TT) decomposition based on the Dirac-Frenkel variational principle is proposed for nonlinear optimal control.
Posted Content

Convergence bounds for nonlinear least squares and applications to tensor recovery

TL;DR: In this paper, the problem of approximating a function in general nonlinear subsets of $L^2$, when only a weighted Monte Carlo estimate of the norm can be computed, is considered.
Posted Content

Asymptotic Log-Det Sum-of-Ranks Minimization via Tensor (Alternating) Iteratively Reweighted Least Squares

TL;DR: In this paper, it was shown that iteratively reweighted least squares with weight strength $p = 0$ remains a viable method for affine sum-of-ranks minimization.
Posted Content

A block-sparse Tensor Train Format for sample-efficient high-dimensional Polynomial Regression

TL;DR: In this paper, it is shown that a block sparsity pattern in the tensor train format corresponds to a subspace of homogeneous multivariate polynomials, which is used to adapt the ansatz space to align better with known sample complexity results.
References
Book

Monte Carlo Methods in Financial Engineering

TL;DR: This book presents Monte Carlo simulation methods for financial engineering, covering the generation of random numbers and random variables, variance reduction techniques, and the pricing of derivative securities, including American options, by simulation.
Journal Article

Valuing American Options by Simulation: A Simple Least-Squares Approach

TL;DR: In this paper, a new approach for approximating the value of American options by simulation is presented, using least squares to estimate the conditional expected payoff to the optionholder from continuation.
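A minimal single-asset sketch of the least-squares regression step described above: discounted future cashflows are regressed on polynomials of the spot price, and a path is exercised where the immediate payoff exceeds the fitted continuation value. The helper name `lsm_bermudan_put` and all market parameters are illustrative assumptions; in the high-dimensional setting of the main paper, the plain polynomial basis would be replaced by the tensorized expansion.

```python
import numpy as np

def lsm_bermudan_put(S0=100.0, K=100.0, r=0.06, sigma=0.2, T=1.0,
                     steps=50, paths=100_000, basis_deg=4, seed=0):
    """Least-squares Monte Carlo price of a Bermudan put on one GBM asset:
    at each exercise date, regress discounted future cashflows on polynomials
    of the spot to estimate the continuation value."""
    rng = np.random.default_rng(seed)
    dt = T / steps
    z = rng.standard_normal((paths, steps))
    S = S0 * np.exp(np.cumsum((r - 0.5 * sigma**2) * dt + sigma * np.sqrt(dt) * z, axis=1))
    payoff = np.maximum(K - S, 0.0)
    cash = payoff[:, -1]                            # cashflow if held to maturity
    for t in range(steps - 2, -1, -1):
        cash *= np.exp(-r * dt)                     # discount one step back
        itm = payoff[:, t] > 0.0                    # regress on in-the-money paths only
        if itm.sum() > basis_deg + 1:
            coeffs = np.polyfit(S[itm, t], cash[itm], basis_deg)
            continuation = np.polyval(coeffs, S[itm, t])
            exercise_idx = np.where(itm)[0][payoff[itm, t] > continuation]
            cash[exercise_idx] = payoff[exercise_idx, t]   # exercise now on those paths
    return np.exp(-r * dt) * cash.mean()

print(lsm_bermudan_put())
```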
Book

Optimization Algorithms on Matrix Manifolds

TL;DR: Optimization Algorithms on Matrix Manifolds offers techniques with broad applications in linear algebra, signal processing, data mining, computer vision, and statistical analysis, and will be of interest to applied mathematicians, engineers, and computer scientists.
Journal Article

Tensor-Train Decomposition

TL;DR: The new form gives a clear and convenient way to implement all basic operations efficiently, and the efficiency is demonstrated by the computation of the smallest eigenvalue of a 19-dimensional operator.
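A compact sketch of the TT-SVD construction behind the tensor-train decomposition: successive reshapes and truncated SVDs peel off one core per dimension. The function name `tt_svd` and the truncation parameters are illustrative assumptions, not part of the cited paper.

```python
import numpy as np

def tt_svd(tensor, max_rank=None, tol=1e-12):
    """TT-SVD: decompose a full d-way tensor into tensor-train cores by
    successive reshapes and truncated SVDs. Core k has shape (r_{k-1}, n_k, r_k)."""
    dims = tensor.shape
    cores, r_prev = [], 1
    mat = tensor.reshape(dims[0], -1)
    for k in range(len(dims) - 1):
        u, s, vt = np.linalg.svd(mat, full_matrices=False)
        r = int((s > tol * s[0]).sum())             # drop negligible singular values
        if max_rank is not None:
            r = min(r, max_rank)
        cores.append(u[:, :r].reshape(r_prev, dims[k], r))
        mat = (np.diag(s[:r]) @ vt[:r]).reshape(r * dims[k + 1], -1)
        r_prev = r
    cores.append(mat.reshape(r_prev, dims[-1], 1))
    return cores

# Decompose a small rank-1 tensor and check the reconstruction error.
rng = np.random.default_rng(1)
A = np.einsum('i,j,k->ijk', rng.random(4), rng.random(5), rng.random(6))
cores = tt_svd(A)
recon = np.einsum('aib,bjc,ckd->ijk', *cores)
print([c.shape for c in cores], np.linalg.norm(A - recon) / np.linalg.norm(A))
```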
Journal Article

Efficient classical simulation of slightly entangled quantum computations

TL;DR: The results imply that a necessary condition for an exponential computational speedup is that the amount of entanglement increases with the size n of the computation, and provide an explicit lower bound on the required growth.