
Showing papers in "Journal of Computer Science in 2023"



Journal ArticleDOI
TL;DR: A new class of fully connected V-cycle MgNet (FV-MgNet) is proposed for long-term time series forecasting, one of the most difficult forecasting tasks.
Abstract: By investigating iterative methods for a constrained linear model, we propose a new class of fully connected V-cycle MgNet for long-term time series forecasting, which is one of the most difficult tasks in forecasting. MgNet is a CNN model that was proposed for image classification based on the multigrid (MG) methods for solving discretized partial differential equations (PDEs). We replace the convolutional operations in the existing MgNet with fully connected operations and then apply the resulting model to forecasting problems. Motivated by the V-cycle structure in MG, we further propose the FV-MgNet, a V-cycle version of the fully connected MgNet, to extract features hierarchically. By evaluating the performance of FV-MgNet on popular data sets and comparing it with state-of-the-art models, we show that FV-MgNet achieves better results with less memory usage and faster inference speed. In addition, we conduct ablation experiments to demonstrate that the structure of FV-MgNet is the best choice among the many variants.
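The V-cycle idea described above can be illustrated with a small sketch: features are restricted down a hierarchy of narrowing fully connected layers, then prolongated back up with skip connections to the stored fine-level features. This is only a hedged illustration of the general multigrid-style pattern, not the authors' implementation; the function name `fv_mgnet_vcycle`, the layer widths, and the use of random untrained weights are all assumptions for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

def dense(in_dim, out_dim):
    # Random weights stand in for trained parameters in this sketch.
    return rng.standard_normal((in_dim, out_dim)) * np.sqrt(2.0 / in_dim)

def relu(x):
    return np.maximum(x, 0.0)

def fv_mgnet_vcycle(x, dims):
    """One V-cycle pass with fully connected layers (illustrative only):
    a downward restriction leg narrows the feature width level by level,
    then an upward prolongation leg widens it again, adding the features
    saved on the way down as skip connections."""
    # Downward (restriction) leg: store features at every level.
    feats = [x]
    for lo, hi in zip(dims[:-1], dims[1:]):
        feats.append(relu(feats[-1] @ dense(lo, hi)))
    # Upward (prolongation) leg: coarse corrections are mapped back to the
    # finer width and combined with the stored fine-level features.
    u = feats[-1]
    for level in range(len(dims) - 2, -1, -1):
        u = feats[level] + relu(u @ dense(dims[level + 1], dims[level]))
    return u

# Toy input: a batch of 4 series embedded into 64 features,
# passed through a 3-level hierarchy of widths 64 -> 32 -> 16 -> 32 -> 64.
x = rng.standard_normal((4, 64))
out = fv_mgnet_vcycle(x, dims=[64, 32, 16])
print(out.shape)  # (4, 64)
```

The output keeps the input width, so several such V-cycles could be stacked, mirroring how multigrid methods iterate cycles until convergence.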

2 citations

Journal ArticleDOI
TL;DR: A method of model regularization is presented that preserves significant features of a data set while minimizing artificial oscillations: the strength of a smoothing parameter is varied automatically across the domain, removing artifacts in poorly constrained regions while leaving other regions unchanged.
Abstract: B-spline models are a powerful way to represent scientific data sets with a functional approximation. However, these models can suffer from spurious oscillations when the data to be approximated are not uniformly distributed. Model regularization (i.e., smoothing) has traditionally been used to minimize these oscillations; unfortunately, it is sometimes impossible to sufficiently remove unwanted artifacts without smoothing away key features of the data set. In this article, we present a method of model regularization that preserves significant features of a data set while minimizing artificial oscillations. Our method varies the strength of a smoothing parameter throughout the domain automatically, removing artifacts in poorly-constrained regions while leaving other regions unchanged. The proposed method selectively incorporates regularization terms based on first and second derivatives to maintain model accuracy while minimizing numerical artifacts. The behavior of our method is validated on a collection of two- and three-dimensional data sets produced by scientific simulations. In addition, a key tuning parameter is highlighted and the effects of this parameter are presented in detail. This paper is an extension of our previous conference paper at the 2022 International Conference on Computational Science (ICCS) [Lenz et al. 2022].
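The core idea of spatially varying regularization can be sketched in simplified form: a penalized least-squares fit on a uniform grid, where the curvature penalty is strengthened only in intervals that contain no data. This is a hedged stand-in for the paper's B-spline method, not its actual algorithm; the function name `adaptive_smooth_fit`, the linear-interpolation collocation, and the parameters `base_lam` / `strong_lam` are all illustrative assumptions.

```python
import numpy as np

def adaptive_smooth_fit(x, y, n_knots=40, base_lam=1e-3, strong_lam=10.0):
    """Fit grid values to scattered (x, y) data by penalized least squares,
    with the second-difference (curvature) penalty made strong only where
    the data leave the model poorly constrained (illustrative sketch)."""
    grid = np.linspace(x.min(), x.max(), n_knots)
    # Collocation matrix A: linear interpolation from grid values to samples.
    idx = np.clip(np.searchsorted(grid, x) - 1, 0, n_knots - 2)
    t = (x - grid[idx]) / (grid[1] - grid[0])
    A = np.zeros((len(x), n_knots))
    A[np.arange(len(x)), idx] = 1 - t
    A[np.arange(len(x)), idx + 1] = t
    # Second-difference operator: a discrete curvature penalty.
    D = np.diff(np.eye(n_knots), n=2, axis=0)
    # Count samples near each interior knot; empty neighborhoods are
    # "poorly constrained" and receive the strong smoothing weight.
    counts = np.histogram(x, bins=grid)[0]
    lam = np.where(counts[:-1] + counts[1:] == 0, strong_lam, base_lam)
    W = np.diag(lam)
    # Normal equations of the penalized least-squares problem.
    coef = np.linalg.solve(A.T @ A + D.T @ W @ D, A.T @ y)
    return grid, coef
```

With data clustered in two clumps and a gap between them, the gap intervals get the large penalty and come out smooth, while the data-rich regions are fit with the weak penalty and keep their features, which is the qualitative behavior the abstract describes.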