# Interpolation

About: Interpolation is a research topic. Over its lifetime, 54,021 publications have been published within this topic, receiving 904,265 citations. The topic is also known as numerical interpolation.

##### Papers

TL;DR: It is demonstrated that arbitrary accuracy can be achieved, independent of system size N, at a cost that scales as N log(N), comparable to that of a simple truncation method with a cutoff of 10 Å or less.

Abstract: The previously developed particle mesh Ewald method is reformulated in terms of efficient B-spline interpolation of the structure factors. This reformulation allows a natural extension of the method to potentials of the form 1/r^p with p ≥ 1. Furthermore, efficient calculation of the virial tensor follows. Use of B-splines in place of Lagrange interpolation leads to analytic gradients as well as a significant improvement in the accuracy. We demonstrate that arbitrary accuracy can be achieved, independent of system size N, at a cost that scales as N log(N). For biomolecular systems with many thousands of atoms this method permits the use of Ewald summation at a computational cost comparable to that of a simple truncation method of 10 Å or less.

17,897 citations
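As a concrete illustration of the interpolation step, below is a minimal 1-D sketch of B-spline charge spreading in the spirit of smooth particle mesh Ewald. The function names, the order-4 default, and the 1-D periodic setting are illustrative assumptions, not the paper's implementation (which works in 3-D and feeds the gridded charges through an FFT to obtain the structure factors).

```python
import numpy as np

def bspline(u, n):
    """Cardinal B-spline M_n(u): piecewise polynomial, nonzero only on (0, n)."""
    if u <= 0.0 or u >= n:
        return 0.0
    if n == 2:
        return 1.0 - abs(u - 1.0)
    # Standard recursion: M_n(u) = u/(n-1) M_{n-1}(u) + (n-u)/(n-1) M_{n-1}(u-1)
    return u / (n - 1) * bspline(u, n - 1) + (n - u) / (n - 1) * bspline(u - 1, n - 1)

def spread_charges(positions, charges, n_grid, box, order=4):
    """Spread point charges onto a periodic 1-D grid using order-`order` B-splines."""
    grid = np.zeros(n_grid)
    h = box / n_grid                       # grid spacing
    for x, q in zip(positions, charges):
        u = x / h                          # particle position in grid units
        k0 = int(np.floor(u)) - order + 1  # leftmost grid point the spline touches
        for j in range(order):
            k = k0 + j
            grid[k % n_grid] += q * bspline(u - k, order)  # periodic wrap
    return grid

# The spline weights sum to one, so total charge is conserved on the grid.
grid = spread_charges([1.3, 7.9], [1.0, -1.0], n_grid=32, box=10.0)
assert abs(grid.sum()) < 1e-12
```

Because the cardinal B-splines form a smooth partition of unity, the gridded charge is differentiable in the particle positions, which is what makes the analytic gradients mentioned in the abstract possible.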

---

Nvidia

TL;DR: This paper proposes an alternative generator architecture for GANs, borrowing from the style transfer literature, which leads to an automatically learned, unsupervised separation of high-level attributes (e.g., pose and identity when trained on human faces) and stochastic variation in the generated images.

Abstract: We propose an alternative generator architecture for generative adversarial networks, borrowing from style transfer literature. The new architecture leads to an automatically learned, unsupervised separation of high-level attributes (e.g., pose and identity when trained on human faces) and stochastic variation in the generated images (e.g., freckles, hair), and it enables intuitive, scale-specific control of the synthesis. The new generator improves the state-of-the-art in terms of traditional distribution quality metrics, leads to demonstrably better interpolation properties, and also better disentangles the latent factors of variation. To quantify interpolation quality and disentanglement, we propose two new, automated methods that are applicable to any generator architecture. Finally, we introduce a new, highly varied and high-quality dataset of human faces.

6,564 citations
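The "interpolation properties" evaluated here refer to walking between points in the generator's latent space. Below is a generic sketch of the two standard ways to do that, linear and spherical-linear interpolation; this is not the paper's proposed metric (perceptual path length), and the 512-dimensional latent size is a common convention used here only for illustration.

```python
import numpy as np

def lerp(z0, z1, t):
    """Straight-line interpolation between two latent codes."""
    return (1.0 - t) * z0 + t * z1

def slerp(z0, z1, t):
    """Spherical linear interpolation: follows a great-circle arc, which better
    preserves the norm of Gaussian latents than a straight line does."""
    cos_omega = np.dot(z0, z1) / (np.linalg.norm(z0) * np.linalg.norm(z1))
    omega = np.arccos(np.clip(cos_omega, -1.0, 1.0))
    if np.isclose(omega, 0.0):
        return lerp(z0, z1, t)  # nearly parallel vectors: fall back to lerp
    return (np.sin((1.0 - t) * omega) * z0 + np.sin(t * omega) * z1) / np.sin(omega)

# A short latent walk; each intermediate code would be fed to the generator.
rng = np.random.default_rng(0)
z0, z1 = rng.standard_normal(512), rng.standard_normal(512)
path = [slerp(z0, z1, t) for t in np.linspace(0.0, 1.0, 8)]
```

A style-based generator can additionally mix such codes at different resolutions, which is what enables the scale-specific control described in the abstract.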

---

TL;DR: In this article, the authors present finite-difference schemes for the evaluation of first-, second-, and higher-order derivatives that yield improved representation of a range of scales and may be used on nonuniform meshes.

Abstract: The present finite-difference schemes for the evaluation of first-order, second-order, and higher-order derivatives yield improved representation of a range of scales and may be used on nonuniform meshes. Various boundary conditions may be invoked, and both accurate interpolation and spectral-like filtering can be accomplished by means of schemes for derivatives at mid-cell locations. This family of schemes reduces to the Padé schemes when the maximal formal accuracy constraint is imposed with a specific computational stencil. Attention is given to illustrative applications of these schemes in fluid dynamics.

5,832 citations
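To make the idea concrete, here is a minimal sketch of the best-known member of this family: the fourth-order tridiagonal (Padé) scheme for the first derivative, with periodic boundaries assumed for simplicity. The dense solve and the sanity check are illustrative; a production code would use a cyclic tridiagonal solver and proper one-sided boundary closures.

```python
import numpy as np

def compact_first_derivative(f, h):
    """Fourth-order Padé (compact) first derivative on a periodic uniform grid.

    Solves the tridiagonal relation
        (1/4) f'_{i-1} + f'_i + (1/4) f'_{i+1} = (3/2) (f_{i+1} - f_{i-1}) / (2h),
    i.e. the alpha = 1/4 member of the tridiagonal family, using a dense
    solve for clarity.
    """
    n = len(f)
    A = np.eye(n) + 0.25 * (np.eye(n, k=1) + np.eye(n, k=-1))
    A[0, -1] = A[-1, 0] = 0.25                      # periodic wrap-around
    rhs = 1.5 * (np.roll(f, -1) - np.roll(f, 1)) / (2.0 * h)
    return np.linalg.solve(A, rhs)

# Sanity check: on sin(x) the scheme matches cos(x) to ~1e-5 with 32 points.
x = np.linspace(0.0, 2.0 * np.pi, 33)[:-1]
d = compact_first_derivative(np.sin(x), x[1] - x[0])
assert np.max(np.abs(d - np.cos(x))) < 1e-4
```

The implicit coupling of the unknown derivatives is what buys the "spectral-like" resolution: an explicit central difference of the same stencil width is only second-order accurate.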

---

01 May 1992

TL;DR: The Bayesian approach to regularization and model comparison is demonstrated by studying the inference problem of interpolating noisy data, with regularizing constants and noise levels set by examining their posterior probability distributions.

Abstract: Although Bayesian analysis has been in use since Laplace, the Bayesian method of model comparison has only recently been developed in depth. In this paper, the Bayesian approach to regularization and model comparison is demonstrated by studying the inference problem of interpolating noisy data. The concepts and methods described are quite general and can be applied to many other data modeling problems. Regularizing constants are set by examining their posterior probability distribution. Alternative regularizers (priors) and alternative basis sets are objectively compared by evaluating the evidence for them. Occam's razor is automatically embodied by this process. The way in which Bayes infers the values of regularizing constants and noise levels has an elegant interpretation in terms of the effective number of parameters determined by the data set. This framework is due to Gull and Skilling.

4,194 citations
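The sketch below shows the core computation of this framework for a linear-in-parameters interpolation model with a Gaussian prior and Gaussian noise: the log evidence used to compare regularizers, and the effective number of parameters γ. Variable names and the polynomial basis are illustrative assumptions; the formulas are the standard ones for this setting.

```python
import numpy as np

def log_evidence(Phi, y, alpha, beta):
    """Log evidence for y = Phi @ w + noise, with prior w ~ N(0, I/alpha) and
    noise precision beta. Returns (log evidence, MAP weights, gamma)."""
    N, M = Phi.shape
    A = alpha * np.eye(M) + beta * Phi.T @ Phi        # posterior precision
    w_mp = beta * np.linalg.solve(A, Phi.T @ y)       # most probable weights
    E = 0.5 * beta * np.sum((y - Phi @ w_mp) ** 2) + 0.5 * alpha * np.sum(w_mp ** 2)
    _, logdet_A = np.linalg.slogdet(A)
    log_Z = (0.5 * M * np.log(alpha) + 0.5 * N * np.log(beta)
             - E - 0.5 * logdet_A - 0.5 * N * np.log(2.0 * np.pi))
    gamma = M - alpha * np.trace(np.linalg.inv(A))    # well-determined parameters
    return log_Z, w_mp, gamma

# Compare regularizing constants on a noisy interpolation problem: the alpha
# with the highest evidence wins, embodying Occam's razor automatically.
rng = np.random.default_rng(1)
x = np.linspace(-1.0, 1.0, 25)
y = np.sin(3.0 * x) + 0.1 * rng.standard_normal(x.size)
Phi = np.vander(x, 10)                                # polynomial basis
best = max((log_evidence(Phi, y, a, beta=100.0)[0], a)
           for a in [1e-3, 1e-2, 1e-1, 1.0, 10.0])
```

The re-estimation formulas α = γ/‖w‖² and β = (N − γ)/‖y − Φw‖² can then be iterated to set both constants from the data, matching the "effective number of parameters" interpretation in the abstract.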