scispace - formally typeset
Topic

Bicubic interpolation

About: Bicubic interpolation is a research topic. Over the lifetime, 3348 publications have been published within this topic receiving 73126 citations.
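For readers new to the topic, the core technique can be sketched directly: bicubic interpolation estimates an image value at a fractional position as a weighted average over the surrounding 4x4 pixel neighbourhood, with separable weights from a cubic convolution kernel. A minimal NumPy sketch (function names are illustrative; a = -0.5 gives the common Keys/Catmull-Rom variant):

```python
import numpy as np

def cubic_kernel(t, a=-0.5):
    # Keys cubic convolution kernel; a = -0.5 is the Catmull-Rom choice
    t = abs(t)
    if t <= 1:
        return (a + 2) * t**3 - (a + 3) * t**2 + 1
    if t < 2:
        return a * t**3 - 5 * a * t**2 + 8 * a * t - 4 * a
    return 0.0

def bicubic_sample(img, x, y):
    # Evaluate img at fractional coordinates (x, y) by separable
    # cubic convolution over the surrounding 4x4 pixel neighbourhood.
    h, w = img.shape
    x0, y0 = int(np.floor(x)), int(np.floor(y))
    value = 0.0
    for j in range(-1, 3):
        for i in range(-1, 3):
            px = min(max(x0 + i, 0), w - 1)   # clamp at image borders
            py = min(max(y0 + j, 0), h - 1)
            weight = cubic_kernel(x - (x0 + i)) * cubic_kernel(y - (y0 + j))
            value += weight * img[py, px]
    return value

# On a linear ramp, bicubic interpolation reproduces exact values:
img = np.add.outer(np.arange(6.0), np.arange(6.0))  # img[r, c] = r + c
print(bicubic_sample(img, 2.5, 2.5))  # 5.0 up to rounding
```

The kernel weights sum to one, so constant and linear images are reproduced exactly; the 4x4 support is what distinguishes bicubic from the 2x2 support of bilinear interpolation.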


Papers
Journal ArticleDOI
TL;DR: In this article, a reverse engineering technique is proposed for modelling composite sculptured surfaces; it comprises a surface-fitting algorithm, formulated as a least-squares minimisation problem whose solution yields the optimised control-point positions and parameter values for the non-uniform rational B-spline patch, and a surface-blending algorithm.
Abstract: A reverse engineering technique is proposed in this study for modelling composite sculptured surfaces. The proposed technique includes a surface fitting algorithm and a surface blending algorithm. The former is formulated as a least-squares minimisation problem, in which an error expression is minimised to yield the optimised positions of the control points and the parameter values for the non-uniform rational B-spline patch. Cubic spline and bicubic surface algorithms are developed for two-patch and four-patch blending, respectively. Computer simulation results clearly demonstrate that the proposed technique is useful for composite surface modelling applications. Factors affecting the convergence speed and surface accuracy in the optimisation process are also discussed.
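The least-squares fitting step described above is linear in the control points once parameter values are assigned to the sample points. A minimal sketch of that idea for a non-rational cubic B-spline curve (a simplification of the paper's NURBS-patch setting; knot choice and names are illustrative):

```python
import numpy as np

def bspline_basis(i, k, t, knots):
    # Cox-de Boor recursion: i-th B-spline basis of order k (degree k-1)
    if k == 1:
        return 1.0 if knots[i] <= t < knots[i + 1] else 0.0
    left = right = 0.0
    d1 = knots[i + k - 1] - knots[i]
    if d1 > 0:
        left = (t - knots[i]) / d1 * bspline_basis(i, k - 1, t, knots)
    d2 = knots[i + k] - knots[i + 1]
    if d2 > 0:
        right = (knots[i + k] - t) / d2 * bspline_basis(i + 1, k - 1, t, knots)
    return left + right

# Fit the control points of a clamped cubic B-spline to samples (q_j, t_j)
# by minimising sum_j ||C(t_j) - q_j||^2, which is linear least squares.
n_ctrl, degree = 6, 3
knots = np.concatenate([[0.0] * degree, np.linspace(0, 1, n_ctrl - degree + 1), [1.0] * degree])
t = np.linspace(0, 1 - 1e-9, 40)               # parameter values assigned to samples
q = np.column_stack([t, np.sin(2 * np.pi * t)])  # 2D sample points to fit
B = np.array([[bspline_basis(i, degree + 1, tj, knots) for i in range(n_ctrl)] for tj in t])
ctrl, *_ = np.linalg.lstsq(B, q, rcond=None)   # optimised control points
print(np.linalg.norm(B @ ctrl - q))            # small fitting residual
```

In the paper's full setting, the parameter values are themselves optimised along with the control points, which makes the problem nonlinear and is why convergence speed becomes a concern.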

23 citations

Journal ArticleDOI
TL;DR: The global look-up table strategy with cubic B-spline interpolation significantly improves the accuracy of the IC-GN-algorithm-based DIC method compared with the one using bicubic interpolation, at a trivial cost in computational efficiency.
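The "compute once, query many times" idea behind such a look-up-table strategy can be sketched in 1D: solve once for cubic B-spline coefficients that interpolate the samples, then each subpixel query is only a 4-tap blend. A minimal sketch (simplified end conditions; not the paper's actual DIC implementation):

```python
import numpy as np

def bspline_coeffs(samples):
    # "Global look-up table" step: solve once for cubic B-spline
    # coefficients c such that the spline interpolates every sample
    # (tridiagonal system with weights 1/6, 4/6, 1/6 at interior nodes).
    n = len(samples)
    A = np.zeros((n, n))
    i = np.arange(1, n - 1)
    A[i, i - 1] = A[i, i + 1] = 1 / 6
    A[i, i] = 4 / 6
    A[0, 0] = A[-1, -1] = 1.0   # crude end conditions, for the sketch only
    return np.linalg.solve(A, samples)

def bspline_eval(c, x):
    # Cheap per-query step: blend 4 neighbouring coefficients with the
    # uniform cubic B-spline basis at fractional position x.
    k = int(np.floor(x)); t = x - k
    b = np.array([(1 - t)**3, 3*t**3 - 6*t**2 + 4, -3*t**3 + 3*t**2 + 3*t + 1, t**3]) / 6
    idx = np.clip(np.arange(k - 1, k + 3), 0, len(c) - 1)
    return b @ c[idx]

samples = np.sin(np.linspace(0, np.pi, 20))
c = bspline_coeffs(samples)      # computed once, reused for every query
print(bspline_eval(c, 9.5))      # close to sin(pi/2) = 1 at that position
```

Precomputing the coefficient table is what makes per-iteration subpixel evaluation in iterative registration (such as IC-GN) cheap.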

23 citations

Proceedings ArticleDOI
16 Jun 2019
TL;DR: This work proposes a fractal residual network (FRN) for SISR, which extends the residual-in-residual structure by adding new residual shells; the structure is named FRN because of its fractal-like self-similarity.
Abstract: The degradation function in single image super-resolution (SISR) is usually bicubic downsampling with an integer scale factor. However, bicubic degradation is not realistic, and the scale factor is not always an integer in the real world. We introduce some solutions that are appropriate for realistic SR. First, we propose a down-upsampling module that allows a general SR network to use GPU memory efficiently. With the module, we can stack more convolutional layers, resulting in higher performance. We also adopt a new regularization loss, the auto-encoder loss, which generalizes the down-upsampling module. Furthermore, we propose a fractal residual network (FRN) for SISR. We extend the residual-in-residual structure by adding new residual shells and name the structure FRN because of its fractal-like self-similarity. We show that our proposed model outperforms state-of-the-art methods and demonstrate the effectiveness of our solutions through several experiments on the NTIRE 2019 dataset.
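The bicubic degradation with a possibly non-integer scale factor can be sketched as follows: a minimal NumPy implementation of centre-aligned bicubic resampling (it omits the kernel widening that anti-aliased resizers such as MATLAB's imresize apply when downscaling):

```python
import numpy as np

def cubic(t, a=-0.5):
    # Keys cubic convolution kernel
    t = abs(t)
    if t <= 1:
        return (a + 2) * t**3 - (a + 3) * t**2 + 1
    if t < 2:
        return a * t**3 - 5 * a * t**2 + 8 * a * t - 4 * a
    return 0.0

def bicubic_resize(img, scale):
    # Map each output pixel centre back into the source grid
    # (centre-aligned), then take a cubic-weighted 4x4 average.
    h, w = img.shape
    oh, ow = int(round(h * scale)), int(round(w * scale))
    out = np.zeros((oh, ow))
    for r in range(oh):
        for c in range(ow):
            y = (r + 0.5) / scale - 0.5   # fractional source coordinates
            x = (c + 0.5) / scale - 0.5
            y0, x0 = int(np.floor(y)), int(np.floor(x))
            acc = wsum = 0.0
            for j in range(-1, 3):
                for i in range(-1, 3):
                    py = min(max(y0 + j, 0), h - 1)
                    px = min(max(x0 + i, 0), w - 1)
                    wgt = cubic(y - (y0 + j)) * cubic(x - (x0 + i))
                    acc += wgt * img[py, px]
                    wsum += wgt
            out[r, c] = acc / wsum
    return out

# An HR/LR training pair with a NON-integer scale factor (x1.5):
hr = np.add.outer(np.arange(12.0), np.arange(12.0))
lr = bicubic_resize(hr, 1 / 1.5)
print(lr.shape)  # (8, 8)
```

Nothing in the coordinate mapping requires the scale to be an integer, which is the point the abstract makes about realistic SR settings.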

23 citations

Proceedings ArticleDOI
01 Oct 2006
TL;DR: A novel method for interpolating images is presented and the concept of non-local interpolation is introduced; it exploits the repetitive character of the image and is superior to other interpolation methods at very large magnifications.
Abstract: In this paper we present a novel method for interpolating images and introduce the concept of non-local interpolation. Unlike conventional interpolation methods, the estimation of the unknown pixel values is based not only on their local neighbourhood but on the whole image (non-locally). In particular, we exploit the repetitive character of the image. A great advantage of our proposed approach is that more information is at our disposal, which leads to better estimates of the unknown pixel values. Results show the effectiveness of non-local interpolation and its superiority over other interpolation methods at very large magnifications.
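The non-local idea can be illustrated with an NL-means-style sketch (an illustration of the concept, not the paper's exact algorithm): an unknown pixel is estimated by comparing its surrounding patch, centre excluded, against patches from the whole image and averaging the centre pixels of similar ones.

```python
import numpy as np

def non_local_estimate(img, r, c, patch=1, h=0.2):
    # Estimate pixel (r, c) from the WHOLE image (non-locally): compare its
    # surrounding patch (centre excluded, since that value is unknown) with
    # every other patch and take a similarity-weighted average of those
    # patches' centre pixels.
    H, W = img.shape
    mask = np.ones((2 * patch + 1, 2 * patch + 1), bool)
    mask[patch, patch] = False
    ref = img[r - patch:r + patch + 1, c - patch:c + patch + 1][mask]
    num = den = 0.0
    for i in range(patch, H - patch):
        for j in range(patch, W - patch):
            if (i, j) == (r, c):
                continue
            cand = img[i - patch:i + patch + 1, j - patch:j + patch + 1][mask]
            w = np.exp(-np.sum((ref - cand)**2) / h**2)  # patch similarity
            num += w * img[i, j]
            den += w
    return num / den

# A repetitive (periodic) image: the missing pixel is recovered from
# identical patches found elsewhere in the image.
img = np.fromfunction(lambda i, j: np.sin(2*np.pi*i/5) + np.cos(2*np.pi*j/5), (20, 20))
true = img[7, 7]
img[7, 7] = 0.0                                   # pretend it is unknown
print(abs(non_local_estimate(img, 7, 7) - true))  # small error
```

On this periodic image the estimate is driven by the exactly matching patches five pixels away, which is precisely the "repetitive character" the abstract exploits; a purely local method would be misled by the corrupted centre value.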

23 citations

Journal ArticleDOI
TL;DR: The method may yield a multiple of the implicit equation; this situation is characterised and quantified by relating the nullspace dimension to the predicted support and its geometry, yielding a method of sparse approximate implicitization that is important for tackling larger problems.
Abstract: We revisit implicitization by interpolation in order to examine its properties in the context of sparse elimination theory. Based on the computation of a superset of the implicit support, implicitization is reduced to computing the nullspace of a numeric matrix. The approach is applicable to polynomial and rational parameterizations of curves and (hyper)surfaces of any dimension, including the case of parameterizations with base points. Our support prediction is based on sparse (or toric) resultant theory, in order to exploit the sparsity of the input and the output. Our method may yield a multiple of the implicit equation: we characterize and quantify this situation by relating the nullspace dimension to the predicted support and its geometry. In this case, we obtain more than one multiple of the implicit equation; the latter can be obtained via multivariate polynomial GCD (or factoring). All of the above techniques extend to the case of approximate computation, thus yielding a method of sparse approximate implicitization, which is important in tackling larger problems. We discuss our publicly available Maple implementation through several examples, including the benchmark of a bicubic surface. For a novel application, we focus on computing the discriminant of a multivariate polynomial, which characterizes the existence of multiple roots and generalizes the resultant of a polynomial system. This yields an efficient, output-sensitive algorithm for computing the discriminant polynomial.
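The interpolation-based pipeline described above (support superset, numeric matrix, nullspace) can be illustrated on a toy curve instead of the paper's bicubic-surface benchmark. Assuming a unit circle and the full degree-2 monomial basis as the support superset:

```python
import numpy as np

# Implicitization by interpolation: evaluate a superset of the implicit
# support (here all monomials of degree <= 2) at sampled parameter values,
# then read the implicit equation off the nullspace of the resulting matrix.
t = np.linspace(0.1, 3.0, 12)
x, y = np.cos(t), np.sin(t)        # parametrisation of the unit circle
monomials = np.column_stack([np.ones_like(t), x, y, x**2, x*y, y**2])
_, s, Vt = np.linalg.svd(monomials)
coeffs = Vt[-1]                    # nullspace vector = implicit coefficients
coeffs /= coeffs[0]                # normalise the constant term to 1
print(np.round(coeffs, 6))         # approx [1, 0, 0, -1, 0, -1]: 1 - x^2 - y^2 = 0
```

Here the nullspace is one-dimensional, so the implicit equation is recovered up to scale; when the predicted support is larger than necessary, the nullspace dimension grows and one obtains several multiples of the implicit equation, which is exactly the situation the paper quantifies.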

23 citations


Network Information
Related Topics (5)
Image segmentation: 79.6K papers, 1.8M citations, 84% related
Image processing: 229.9K papers, 3.5M citations, 83% related
Feature extraction: 111.8K papers, 2.1M citations, 82% related
Feature (computer vision): 128.2K papers, 1.7M citations, 80% related
Convolutional neural network: 74.7K papers, 2M citations, 79% related
Performance
Metrics
No. of papers in the topic in previous years
Year    Papers
2023    50
2022    118
2021    87
2020    87
2019    122
2018    92