Topic
Bicubic interpolation
About: Bicubic interpolation is a research topic. Over its lifetime, 3,348 publications have been published within this topic, receiving 73,126 citations.
Papers published on a yearly basis
Papers
28 Sep 2009
TL;DR: The proposed EHFC algorithm not only improves the quality of enlarged images produced by zero-order, bilinear, or bicubic interpolation methods, but also accelerates the convergence speed of IBP.
Abstract: In this paper, we propose an Estimated High Frequency Compensated (EHFC) algorithm for super-resolution images. It is based on the Iterative Back Projection (IBP) method combined with compensated high-frequency models according to different applications. The proposed algorithm not only improves the quality of enlarged images produced by zero-order, bilinear, or bicubic interpolation methods, but also accelerates the convergence speed of IBP. In experiments on standard test images, the EHFC method increases speed by 1–6.5 times and gains 0.4–0.7 dB in PSNR. On text images, it achieves a 1.5–6.5× speedup and a 1.2–8.3 dB PSNR improvement.
14 citations
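The IBP loop that EHFC builds on can be sketched as follows. This is a minimal baseline sketch, not the authors' EHFC implementation with high-frequency compensation: the function name, step size, and iteration count are illustrative assumptions, and `scipy.ndimage.zoom` with `order=3` stands in for bicubic interpolation.

```python
import numpy as np
from scipy.ndimage import zoom

def ibp_superres(lr, scale=2, n_iter=10, step=1.0):
    """Iterative Back Projection: start from a bicubic-style upscale,
    then repeatedly back-project the low-resolution reconstruction error."""
    hr = zoom(lr, scale, order=3)                    # initial cubic-spline estimate
    for _ in range(n_iter):
        simulated_lr = zoom(hr, 1 / scale, order=3)  # simulate the acquisition
        residual = lr - simulated_lr                 # low-resolution error
        hr = hr + step * zoom(residual, scale, order=3)  # back-project the error
    return hr
```

EHFC additionally compensates estimated high-frequency content before back-projection, which is what yields the reported speed and PSNR gains over this plain loop.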
01 May 2016
TL;DR: This work reports on the rest of the computation, which consists of two mappings: charges onto a grid and a potential grid onto the particles, and enables the building of a balanced accelerator for the entire long-range electrostatics computation on a single FPGA.
Abstract: Computing the forces derived from long-range electrostatics is a critical application and also a central part of Molecular Dynamics. Part of that computation, the transformation of a charge grid to a potential grid via a 3D FFT, has received some attention recently and has been found to work extremely well on FPGAs. Here we report on the rest of the computation, which consists of two mappings: charges onto a grid and a potential grid onto the particles. These mappings are interesting in their own right as they are far more compute intensive than the FFTs; each is typically done using tricubic interpolation. We believe that these mappings have been studied only once previously for FPGAs and then found to be exorbitantly expensive, i.e., only bicubic would fit on the chip. In the current work we find that, when using the Altera Arria 10, not only do both mappings fit, but also an appropriately sized 3D FFT. This enables the building of a balanced accelerator for the entire long-range electrostatics computation on a single FPGA. This design scales directly to FPGA clusters. Other contributions include a new mapping scheme based on table lookup and a measure of the utility of the floating-point support of the Arria 10.
14 citations
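In software, the charge-to-grid mapping the paper accelerates amounts to spreading each charge over nearby grid points with cubic weights; tricubic interpolation is the tensor product of such 1D weights in x, y, and z. A minimal 1D sketch is below; the Catmull-Rom weights and the periodic grid are illustrative assumptions, not the paper's table-lookup scheme:

```python
import numpy as np

def catmull_rom_weights(t):
    """Cubic weights for the four nearest grid points at fractional offset t in [0, 1)."""
    return np.array([
        0.5 * (-t**3 + 2 * t**2 - t),
        0.5 * (3 * t**3 - 5 * t**2 + 2),
        0.5 * (-3 * t**3 + 4 * t**2 + t),
        0.5 * (t**3 - t**2),
    ])

def spread_charge_1d(grid, x, q):
    """Deposit charge q at continuous position x onto a 1D periodic grid."""
    i = int(np.floor(x))
    w = catmull_rom_weights(x - i)
    for k, wk in enumerate(w, start=-1):   # neighbors i-1 .. i+2
        grid[(i + k) % len(grid)] += q * wk
    return grid
```

Because the weights sum to one, total charge is conserved; in 3D this spreading touches 4×4×4 = 64 grid points per particle, which is why the mappings dominate the FFT cost.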
01 Oct 2006
TL;DR: In this article, a causality-constrained interpolation procedure is introduced, with the aim of reconstructing missing samples in the data, such as the DC point, via a sound numerical procedure that does not compromise the self-consistency and the causality of the entire dataset.
Abstract: We apply a recently developed formulation of the generalized Hilbert transform to the processing of tabulated and finite-bandwidth frequency responses. A causality-constrained interpolation procedure is introduced, with the aim of reconstructing missing samples in the data, such as the DC point, via a sound numerical procedure that does not compromise the self-consistency and the causality of the entire dataset.
13 citations
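The causality constraint rests on the Hilbert-transform relation tying the real and imaginary parts of a causal response together, which is what lets missing samples be filled in consistently. A minimal discrete sketch, assuming an impulse response supported on the first half of the DFT window (this is not the paper's generalized Hilbert transform formulation):

```python
import numpy as np

def causal_from_real_part(R):
    """Reconstruct a causal impulse response, and hence the full
    frequency response, from the real part R of its DFT alone."""
    N = len(R)
    he = np.fft.ifft(R).real      # even part of the impulse response
    h = np.zeros(N)
    h[0] = he[0]
    h[1:N // 2] = 2 * he[1:N // 2]  # causality folds the even part forward
    h[N // 2] = he[N // 2]
    return np.fft.fft(h)          # imaginary part is recovered automatically
```

For a truly causal, band-unlimited response this recovery is exact; the paper's contribution is handling tabulated, finite-bandwidth data where these assumptions fail.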
TL;DR: This article presents a two-step calendar-time interpolation method for 2D seismic amplitude maps: the contour interpolation step is formulated as a quadratic programming problem, while the amplitude-value interpolation is based on a conditional probability formulation.
Abstract: A two-step calendar-time interpolation method for 2D seismic amplitude maps is presented. The contour interpolation part is formulated as a quadratic programming problem, whereas the amplitude-value interpolation is based on a conditional probability formulation. The method is applied to field data from the Sleipner CO2 storage project. The output is a continuous image (movie) of the CO2 plume. Besides visualization, the output can be used to better couple 4D seismic to other types of data acquired. The interpolation uncertainty increases with the time gap between consecutive seismic surveys and is estimated by leaving a survey out (blind test). Errors from such tests can be used to identify problems in understanding the flow and possibly improve the interpolation scheme for a given case. Field-life costs of various acquisition systems and repeat frequencies are linked to the time-lapse interpolation errors. The error in interpolated amplitudes increased by 3%–4% per year of interpolation gap for the Sleipner case. Interpolation can never fully replace measurements.
13 citations
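As a point of reference for the leave-one-survey-out blind test described above, a plain linear calendar-time interpolation baseline can be sketched as follows. This simple baseline is an assumption for illustration, not the paper's quadratic-programming contour method:

```python
import numpy as np

def interp_in_time(maps, times, t):
    """Linearly interpolate between amplitude maps at calendar time t.
    maps: list of 2D arrays; times: matching list of survey times."""
    times = np.asarray(times, dtype=float)
    i = int(np.clip(np.searchsorted(times, t) - 1, 0, len(times) - 2))
    w = (t - times[i]) / (times[i + 1] - times[i])
    return (1 - w) * maps[i] + w * maps[i + 1]

def blind_test_error(maps, times, k):
    """Leave survey k out, interpolate it from its neighbors, return RMS error."""
    pred = interp_in_time(maps[:k] + maps[k + 1:], times[:k] + times[k + 1:], times[k])
    return float(np.sqrt(np.mean((pred - maps[k]) ** 2)))
```

Comparing such blind-test errors across acquisition repeat frequencies is the mechanism the paper uses to link survey cost to interpolation accuracy.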