
Showing papers in "Technometrics in 2021"


Journal ArticleDOI
TL;DR: In this article, an optimal method referred to as SPlit is proposed for splitting a dataset into training and testing sets. SPlit is based on the method of support points (SP), which was originally developed for finding optimal representative points of a continuous distribution.
Abstract: In this article, we propose an optimal method referred to as SPlit for splitting a dataset into training and testing sets. SPlit is based on the method of support points (SP), which was initially d...

35 citations
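As a rough illustration of the data-splitting idea above, the sketch below selects a test set by a greedy max-min (space-filling) criterion so that the held-out points cover the predictor space evenly. It is only a simplified stand-in for SPlit and does not implement the actual support-points optimization; the function name and settings are illustrative.

```python
import numpy as np

def spacefilling_split(X, test_frac=0.2, seed=0):
    """Greedy max-min selection of a space-filling test set.

    Simplified stand-in for the support-points idea behind SPlit:
    test points are spread evenly over the predictor space instead
    of being drawn at random.  Not the actual SPlit algorithm.
    """
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    n_test = max(1, int(round(test_frac * n)))
    Z = (X - X.mean(axis=0)) / X.std(axis=0)      # standardize the columns
    chosen = [int(rng.integers(n))]               # random starting point
    d = np.linalg.norm(Z - Z[chosen[0]], axis=1)  # distance to the chosen set
    while len(chosen) < n_test:
        nxt = int(np.argmax(d))                   # farthest remaining point
        chosen.append(nxt)
        d = np.minimum(d, np.linalg.norm(Z - Z[nxt], axis=1))
    test_idx = np.array(chosen)
    train_idx = np.setdiff1d(np.arange(n), test_idx)
    return train_idx, test_idx

# usage
X = np.random.default_rng(1).normal(size=(200, 3))
train_idx, test_idx = spacefilling_split(X, test_frac=0.2)
```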


Journal ArticleDOI
TL;DR: In recent years, measurement or collection of heterogeneous sets of data, such as those containing scalars, waveform signals, images, and even structured point clouds, has become more common.
Abstract: In recent years, measurement or collection of heterogeneous sets of data such as those containing scalars, waveform signals, images, and even structured point clouds, has become more common. Statis...

28 citations


Journal ArticleDOI
TL;DR: The proposed strategy of minimizing false negatives in conservative estimation achieves competitive performance both in terms of model-based and model-free indicators.
Abstract: We consider the problem of estimating the set of all inputs that lead a system to some particular behavior. The system is modeled by an expensive-to-evaluate function, such as a computer experiment, and we are interested in its excursion set, i.e., the set of points where the function takes values above or below some prescribed threshold. The objective function is emulated with a Gaussian Process (GP) model based on an initial design of experiments enriched with evaluation results at (batch-)sequentially determined input points. The GP model provides conservative estimates for the excursion set, which control false positives while minimizing false negatives. We introduce adaptive strategies that sequentially select new evaluations of the function by reducing the uncertainty on conservative estimates. Following the Stepwise Uncertainty Reduction approach, we obtain new evaluations by minimizing adapted criteria. Tractable formulae for the conservative criteria are derived, which allow more convenient optimization. The method is benchmarked on random functions generated under the model assumptions in different scenarios of noise and batch size. We then apply it to a reliability engineering test case. Overall, the proposed strategy of minimizing false negatives in conservative estimation achieves competitive performance both in terms of model-based and model-free indicators.

26 citations
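To make the excursion-set idea above concrete, here is a minimal sketch that fits a Gaussian process emulator to a toy 1-D function and keeps only the grid points whose posterior probability of exceeding the threshold is very high. It uses pointwise probabilities rather than the joint conservative estimates or the sequential batch criteria of the paper; the toy function, threshold, and 0.95 level are assumptions for illustration.

```python
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

f = lambda x: np.sin(3 * x) + 0.5 * x      # toy expensive-to-evaluate function
threshold = 0.8                            # excursion threshold

# small initial design of experiments
X_train = np.linspace(0.0, 3.0, 8).reshape(-1, 1)
y_train = f(X_train).ravel()

gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.5), alpha=1e-6)
gp.fit(X_train, y_train)

# pointwise posterior probability that f(x) exceeds the threshold
X_grid = np.linspace(0.0, 3.0, 400).reshape(-1, 1)
mu, sd = gp.predict(X_grid, return_std=True)
prob_above = 1.0 - norm.cdf((threshold - mu) / np.maximum(sd, 1e-12))

# keep only points with very high excursion probability: this controls
# false positives at the cost of possibly missing part of the true set
conservative_estimate = X_grid[prob_above >= 0.95]
```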


Journal ArticleDOI
TL;DR: This article presents a dynamic subspace learning method for multivariate functional data modeling, in which different functions are assumed to come from different subspaces and only functions from the same subspace have cross-correlations with each other.
Abstract: Multivariate functional data from a complex system are naturally high-dimensional and have a complex cross-correlation structure. The complexity of data structure can be observed as that (1) some f...

25 citations


Journal ArticleDOI
TL;DR: This section reviews books whose content and level reflect the general editorial policy of Technometrics; publishers should send books for review to Ejaz Ahmed, Department of Mathematics and Science.
Abstract: This section will review those books whose content and level reflect the general editorial policy of Technometrics. Publishers should send books for review to Ejaz Ahmed, Department of Mathematics ...

25 citations


Journal ArticleDOI
TL;DR: A faster version of the DetectDeviatingCells method for detecting cellwise outliers is constructed; it can deal with much higher dimensions and is illustrated on genomic data with 12,600 variables and color video data with 920,000 dimensions.
Abstract: The product moment covariance matrix is a cornerstone of multivariate data analysis, from which one can derive correlations, principal components, Mahalanobis distances and many other results. Unfo...

24 citations
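For context on what a cellwise outlier is in the entry above, the sketch below flags individual cells whose robust z-score (based on the column median and MAD) is extreme. This is only a first, univariate step; the actual DetectDeviatingCells method also uses the correlations between variables to predict each cell, which the sketch does not attempt.

```python
import numpy as np

def flag_cells(X, cutoff=3.5):
    """Flag individual cells with extreme robust z-scores.

    Simplified illustration of cellwise outlier detection; unlike
    DetectDeviatingCells, it ignores the correlations between columns.
    """
    med = np.median(X, axis=0)
    mad = 1.4826 * np.median(np.abs(X - med), axis=0)  # robust column scale
    z = (X - med) / np.where(mad > 0, mad, 1.0)
    return np.abs(z) > cutoff

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 6))
X[3, 2] = 8.0                      # contaminate a single cell
print(np.argwhere(flag_cells(X)))  # flagged (row, column) pairs
```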


Journal ArticleDOI
TL;DR: A novel function-on-function kriging model for efficient emulation and tissue-mimicking optimization, which captures important spectral differences between two functional inputs and adopts shrinkage priors on both the input spectra and the output co-kriging covariance matrix.
Abstract: Three-dimensional printed medical prototypes, which use synthetic metamaterials to mimic biological tissue, are becoming increasingly important in urgent surgical applications. However, the mimicki...

24 citations


Journal ArticleDOI
TL;DR: A multivariate general path model for analyzing degradation data with multiple degradation characteristics (DCs) is proposed and includes random effects that are correlated among the multiple DCs to capture the unit-to-unit variation in the individual degradation paths and to model the interdependence among the multivariate measurements.
Abstract: Degradation data have been broadly used for assessing product and system reliability. Most existing work focuses on modeling and analysis of degradation data with a single characteristic. In some d...

21 citations


Journal ArticleDOI
TL;DR: A common challenge in computer experiments and related fields is to efficiently explore the input space using a small number of samples, that is, the experimental design problem.
Abstract: A common challenge in computer experiments and related fields is to efficiently explore the input space using a small number of samples, that is, the experimental design problem. Much of the recent...

21 citations


Journal ArticleDOI
TL;DR: A new type of design, called a component orthogonal array, is proposed as a fraction of the full design for order-of-addition experiments, and it is shown that component orthogonal arrays have the same D-efficiency as the full design under the proposed model.
Abstract: An order-of-addition experiment is a kind of experiment in which the response is affected by the addition order of materials or components. In many situations, performing the full design with all p...

21 citations


Journal ArticleDOI
TL;DR: A new family of depth measures, called the elastic depths, is proposed that can be used to greatly improve shape anomaly detection in functional data; the elastic depths are assessed on simulated shape outlier scenarios and against popular shape anomaly detectors.
Abstract: We propose a new family of depth measures called the elastic depths that can be used to greatly improve shape anomaly detection in functional data. Shape anomalies are functions that have considera...

Journal ArticleDOI
TL;DR: A new functional control chart is built on the residuals obtained from a function-on-function linear regression of the quality characteristic profile on the functional covariates; it is applied to data from the shipping industry, with particular regard to detecting a reduction in the monitored profiles after a specific energy efficiency initiative.
Abstract: The modern development of data acquisition technologies in many industrial processes is facilitating the collection of quality characteristics that are apt to be modeled as functions, which are usu...

Journal ArticleDOI
TL;DR: An active learning approach is proposed to estimate the unknown differential equations accurately with a reduced experimental data size, together with an adaptive design criterion combining D-optimality and the maximin space-filling criterion.
Abstract: In many areas of science and engineering, discovering the governing differential equations from the noisy experimental data is an essential challenge. It is also a critical step in understanding th...

Journal ArticleDOI
TL;DR: This research presents a novel and scalable approach based on reinforcement learning to solve statistical process control (SPC) problems with real-time requirements.
Abstract: Machine learning methods have been widely used in different applications, including process control and monitoring. For handling statistical process control (SPC) problems, conventional supervised ...

Journal ArticleDOI
TL;DR: A new variable selection method is proposed based on individually penalized ridge regression, a slightly generalized version of ridge regression; it is shown to perform competitively in simulations and on a real data example.
Abstract: Ridge regression was introduced to deal with the instability issue of the ordinary least squares estimate due to multicollinearity. It essentially penalizes the least squares loss by applying a rid...
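A minimal sketch of the estimator underlying the entry above: ridge regression with an individual penalty on each coefficient, solved in closed form. How the individual penalties are chosen and turned into a variable selection rule is the contribution of the paper and is not reproduced here; the penalty values below are arbitrary.

```python
import numpy as np

def individually_penalized_ridge(X, y, penalties):
    """Ridge estimate with a separate penalty for each coefficient.

    Solves (X'X + diag(penalties)) beta = X'y, a slight generalization
    of ordinary ridge regression (which uses one common penalty).
    """
    A = X.T @ X + np.diag(np.asarray(penalties, dtype=float))
    return np.linalg.solve(A, X.T @ y)

# usage with correlated predictors
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))
X[:, 1] = X[:, 0] + 0.05 * rng.normal(size=100)   # strong multicollinearity
y = 2.0 * X[:, 0] - X[:, 3] + rng.normal(scale=0.5, size=100)
beta = individually_penalized_ridge(X, y, penalties=[0.1, 10.0, 1.0, 0.1, 1.0])
```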

Journal ArticleDOI
TL;DR: In this article, a combination of Voronoi tessellation, to partition the input space, and separate Gaussian processes is used to model the function over the different regions of the partition. The proposed method is highly flexible since it allows the Voronoi cells to combine to form regions, which enables nonconvex and disconnected regions to be considered.
Abstract: Many methods for modeling functions over high-dimensional spaces assume global smoothness properties; such assumptions are often violated in practice. We introduce a method for modeling functions that display heterogeneity or contain discontinuities. The heterogeneity is dealt with by using a combination of Voronoi tessellation, to partition the input space, and separate Gaussian processes to model the function over different regions of the partitioned space. The proposed method is highly flexible since it allows the Voronoi cells to combine to form regions, which enables nonconvex and disconnected regions to be considered. In such problems, identifying the borders between regions is often of great importance and we propose an adaptive sampling method to gain extra information along such borders. The method is illustrated by simulated examples and an application to real data, in which we see improvements in prediction error over the commonly used stationary Gaussian process and other nonstationary variations. In our application, a computationally expensive computer model that simulates the formation of clouds is investigated; the proposed method more accurately predicts the underlying process at unobserved locations than existing emulation methods. Supplementary materials for this article are available online.
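A minimal sketch of the partition-then-emulate idea in the entry above: k-means centres induce a Voronoi partition of the input space, and an independent Gaussian process is fitted in each cell. The paper's method additionally lets cells combine into regions and samples adaptively along region borders, which this sketch omits; the toy function and number of cells are assumptions.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

# toy discontinuous function on the unit square
rng = np.random.default_rng(0)
X = rng.uniform(size=(300, 2))
y = np.where(X[:, 0] + X[:, 1] > 1.0, np.sin(6 * X[:, 0]), 2.0 + X[:, 1])

# k-means centres define Voronoi cells of the input space
k = 4
km = KMeans(n_clusters=k, n_init=10, random_state=0).fit(X)

# fit one independent GP per Voronoi cell
gps = []
for j in range(k):
    gp = GaussianProcessRegressor(
        kernel=RBF(length_scale=0.3) + WhiteKernel(noise_level=1e-4),
        normalize_y=True,
    )
    gp.fit(X[km.labels_ == j], y[km.labels_ == j])
    gps.append(gp)

def predict(X_new):
    """Route each point to its Voronoi cell and use that cell's GP."""
    cell = km.predict(X_new)
    out = np.empty(len(X_new))
    for j in range(k):
        mask = cell == j
        if mask.any():
            out[mask] = gps[j].predict(X_new[mask])
    return out

print(predict(rng.uniform(size=(5, 2))))
```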

Journal ArticleDOI
TL;DR: In this article, the functional linear regression (FLR) model, a widely used approach for modeling functional responses with respect to functional inputs, is considered.
Abstract: Functional linear regression is a widely used approach to model functional responses with respect to functional inputs. However, classical functional linear regression models can be severely affect...

Journal ArticleDOI
TL;DR: Recent developments in advanced imaging systems spur their applications in many areas, ranging from satellite remote sensing for geographic information to thermal imaging analysis for manufacturing.
Abstract: Recent developments of advanced imaging systems spur their applications in many areas, ranging from satellite remote sensing for geographic information to thermal imaging analysis for manufacturing...

Journal ArticleDOI
TL;DR: In this article, a new method for statistical process control (SPC) of a discrete part manufacturing system based on intrinsic geometrical properties of the parts, estimated from three-dimensional sensor data, is presented.
Abstract: We present a new method for statistical process control (SPC) of a discrete part manufacturing system based on intrinsic geometrical properties of the parts, estimated from three-dimensional sensor...

Journal ArticleDOI
TL;DR: A Bayesian model is proposed that capitalizes on the interpolation property of predictive distributions from Gaussian processes while still preserving the flexibility found in modern registration techniques.
Abstract: In experiments where observations on each experimental unit are functional in nature, it is often the case that, in addition to variability along the horizontal axis (height or amplitude variabilit...

Journal ArticleDOI
TL;DR: This work develops a real-time anomaly detection method for directed activity on large, sparse networks and is able to identify a red team attack with half the detection rate required of the model without latent interaction terms.
Abstract: We develop a real-time anomaly detection method for directed activity on large, sparse networks. We model the propensity for future activity using a dynamic logistic model with interaction terms fo...

Journal ArticleDOI
TL;DR: An effective exponentially weighted moving average chart is developed in which the weighting parameter is chosen to be large if the related covariates included in the collected data tend to have a shift and small otherwise; the chart is designed solely for detecting shifts in the quality/performance variables.
Abstract: Statistical process control (SPC) charts provide a powerful tool for monitoring production lines in manufacturing industries. They are also used widely in other applications, such as sequential mon...
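For readers unfamiliar with the chart type in the entry above, here is a plain EWMA control chart with time-varying limits. The paper's contribution, adapting the weighting parameter using covariate information, is not reproduced; the shift location and chart constants below are illustrative.

```python
import numpy as np

def ewma_chart(x, lam=0.2, mu0=0.0, sigma=1.0, L=3.0):
    """Standard two-sided EWMA chart statistic and control limits.

    Uses the recursion z_t = lam * x_t + (1 - lam) * z_{t-1}; the
    covariate-adaptive choice of lam from the paper is not included.
    """
    z = np.empty(len(x))
    prev = mu0
    for t, xt in enumerate(x):
        prev = lam * xt + (1.0 - lam) * prev
        z[t] = prev
    t_idx = np.arange(1, len(x) + 1)
    var = (lam / (2.0 - lam)) * (1.0 - (1.0 - lam) ** (2 * t_idx)) * sigma**2
    return z, mu0 - L * np.sqrt(var), mu0 + L * np.sqrt(var)

rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(0, 1, 50), rng.normal(0.8, 1, 30)])  # shift at t = 50
z, lcl, ucl = ewma_chart(x)
print(np.where((z < lcl) | (z > ucl))[0])   # indices where the chart signals
```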

Journal ArticleDOI
TL;DR: An efficient imputation mechanism is introduced that allows the practical implementation of co-kriging when the experimental design is not hierarchically nested, by enabling the specification of semiconjugate priors.
Abstract: Motivated by a multi-fidelity Weather Research and Forecasting (WRF) climate model application where the available simulations are not generated based on hierarchically nested experimental design, ...

Journal ArticleDOI
TL;DR: In this article, accelerated repeated measures degradation tests are used to assess product or component reliability when there would be few or even no failures during a traditional life test.
Abstract: Accelerated repeated measures degradation tests are often used to assess product or component reliability when there would be few or even no failures during a traditional life test. Such tests are ...

Journal ArticleDOI
TL;DR: Theoretically, it is established that the posterior predictive density from the proposed model is “close” to the true data generating density, the closeness being measured by the Hellinger distance between these two densities, which scales at a rate very close to the finite-dimensional optimal rate, depending on how the number of tensor nodes grows with the sample size.
Abstract: Motivated by brain connectome datasets acquired using diffusion weighted magnetic resonance imaging (DWI), this article proposes a novel generalized Bayesian linear modeling framework with a symmet...

Journal ArticleDOI
TL;DR: A statistical emulator is proposed to facilitate large-scale OSUEs in the OCO-2 mission; it outperforms other competing statistical methods and a reduced-order model that approximates the full-physics forward model.
Abstract: Observing system uncertainty experiments (OSUEs) have been recently proposed as a cost-effective way to perform probabilistic assessment of retrievals for NASA’s Orbiting Carbon Observatory-2 (OCO-...

Journal ArticleDOI
TL;DR: Constrained positive Var(s)-optimal designs, when paired with the Dantzig selector, are recommended when effect directions can be credibly specified in advance; this strategy reasonably controls Type I error rates while still identifying a high proportion of active factors.
Abstract: Despite the vast amount of literature on supersaturated designs (SSDs), there is a scant record of their use in practice. We contend this imbalance is due to conflicting recommendations regarding S...

Journal ArticleDOI
Abstract: Bayesian calibration of a functional input/parameter to a time-consuming simulator based on a Gaussian process (GP) emulator involves two challenges that distinguish it from other parameter calibra...

Journal ArticleDOI
TL;DR: The reconstruction approach uses an interpolator to parameterize the regression function by its values at a finite set of knots, and then estimates these values by (regularized) least squares, which makes it very suitable for large datasets.
Abstract: This article introduces an interpolation-based method, called the reconstruction approach, for nonparametric regression. Based on the fact that interpolation usually has negligible errors compared ...
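A minimal 1-D sketch of the reconstruction idea in the entry above: the regression function is parameterized by its values at a small set of knots through linear interpolation, and those values are then estimated by ridge-regularized least squares. The knot count, penalty, and toy data are assumptions, and the paper's choice of interpolator and regularization may differ.

```python
import numpy as np

def hat_basis(x, knots):
    """Linear-interpolation (hat function) basis evaluated at x."""
    B = np.empty((len(x), len(knots)))
    for k in range(len(knots)):
        e_k = np.zeros(len(knots))
        e_k[k] = 1.0
        B[:, k] = np.interp(x, knots, e_k)   # interpolate the k-th indicator
    return B

# noisy observations of a smooth function
rng = np.random.default_rng(0)
x = np.sort(rng.uniform(0.0, 1.0, 500))
y = np.sin(2.0 * np.pi * x) + rng.normal(scale=0.2, size=len(x))

# estimate the function values at the knots by regularized least squares
knots = np.linspace(0.0, 1.0, 15)
B = hat_basis(x, knots)
lam = 1e-3
f_at_knots = np.linalg.solve(B.T @ B + lam * np.eye(len(knots)), B.T @ y)

# the fitted regression function is the interpolator of those values
f_hat = lambda x_new: hat_basis(np.atleast_1d(x_new), knots) @ f_at_knots
print(f_hat([0.25, 0.5, 0.75]))
```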

Journal ArticleDOI
TL;DR: This study proposes a methodology that applies to complex shapes represented in the form of triangulated meshes, which is the current standard AM data format, and combines a novel bi-directional way to model the deviation between the reconstructed geometry and the nominal geometry with a profile monitoring approach for the detection of out-of-control shapes.
Abstract: The industrial development of new production processes like additive manufacturing (AM) is making available novel types of complex shapes that go beyond traditionally manufactured geometries and 2.5D free-form surfaces. New challenges must be faced to characterize, model and monitor the natural variability of such complex shapes, since previously proposed methods based on parametric models are not applicable. The present study proposes a methodology that applies to complex shapes represented in the form of triangulated meshes, which is the current standard for AM data format. The method combines a novel bi-directional way to model the deviation between the reconstructed geometry (e.g., via x-ray computed tomography) and the nominal geometry (i.e., the originating 3D model) with a profile monitoring approach for the detection of out-of-control shapes. A paradigmatic example consisting of an egg-shaped trabecular shell representative of real parts produced via AM is used to illustrate the methodology and to test its effectiveness in detecting real geometrical distortions. Supplementary materials for the work are available online.