Author

Cheng-Kai Chen

Other affiliations: National Chiao Tung University
Bio: Cheng-Kai Chen is an academic researcher from the University of California, Davis. The author has contributed to research in the topics of data visualization and visualization. The author has an h-index of 7 and has co-authored 20 publications receiving 291 citations. Previous affiliations of Cheng-Kai Chen include National Chiao Tung University.

Papers
Journal ArticleDOI
TL;DR: This paper introduces a new streamline placement and selection algorithm for 3D vector fields that dynamically determines a set of streamlines which contributes to data understanding without cluttering the view.
Abstract: This paper introduces a new streamline placement and selection algorithm for 3D vector fields. Instead of considering the problem as a simple feature search in data space, we base our work on the observation that most streamline fields generate a lot of self-occlusion which prevents proper visualization. In order to avoid this issue, we approach the problem in a view-dependent fashion and dynamically determine a set of streamlines which contributes to data understanding without cluttering the view. Since our technique couples flow characteristic criteria and view-dependent streamline selection, we are able to achieve the best of both worlds: relevant flow description and intelligible, uncluttered pictures. We detail an efficient GPU implementation of our algorithm, show comprehensive visual results on multiple datasets, and compare our method with existing flow depiction techniques. Our results show that our technique greatly improves the readability of streamline visualizations on different datasets without requiring user intervention.
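To make the view-dependent selection idea concrete, below is a minimal Python sketch assuming a simple greedy scheme: each streamline is projected into a screen-space occupancy buffer and kept only if it overlaps little with lines already selected. The orthographic projection, the arc-length importance proxy, and the select_streamlines helper are illustrative assumptions, not the paper's GPU implementation.

    # Hedged sketch: greedy view-dependent streamline selection (not the paper's algorithm).
    import numpy as np

    def select_streamlines(streamlines, res=256, max_overlap=0.3):
        """Keep streamlines that add flow information without cluttering the projected view."""
        all_xy = np.vstack([s[:, :2] for s in streamlines])
        lo, hi = all_xy.min(axis=0), all_xy.max(axis=0)
        span = np.where(hi - lo > 0, hi - lo, 1.0)            # guard degenerate extents
        occupancy = np.zeros((res, res), dtype=bool)          # screen-space coverage so far
        # rank candidates by a crude flow-importance proxy: total arc length
        order = sorted(range(len(streamlines)),
                       key=lambda i: -np.linalg.norm(np.diff(streamlines[i], axis=0), axis=1).sum())
        selected = []
        for i in order:
            pix = ((streamlines[i][:, :2] - lo) / span * (res - 1)).astype(int)
            if occupancy[pix[:, 1], pix[:, 0]].mean() <= max_overlap:   # low occlusion -> keep
                occupancy[pix[:, 1], pix[:, 0]] = True
                selected.append(i)
        return selected

    # toy usage: three random 3D walks standing in for integrated streamlines
    rng = np.random.default_rng(0)
    lines = [np.cumsum(rng.normal(size=(100, 3)), axis=0) for _ in range(3)]
    print(select_streamlines(lines))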

93 citations

Proceedings ArticleDOI
21 Jun 2010
TL;DR: It is argued that a multi-GPU MapReduce library is a good fit for parallel volume rendering because it is easy to program for, scales well, and eliminates the need to focus on I/O algorithms, allowing the focus to be on visualization algorithms instead.
Abstract: In this paper we present a multi-GPU parallel volume rendering implementation built using the MapReduce programming model. We give implementation details of the library, including specific optimizations made for our rendering and compositing design. We analyze the theoretical peak performance and bottlenecks for all tasks required and show that our system significantly reduces computation as a bottleneck in the ray-casting phase. We demonstrate that our rendering speeds are adequate for interactive visualization (our system is capable of rendering a 1024³ floating-point sampled volume in under one second using 8 GPUs), and that our system is capable of delivering both in-core and out-of-core visualizations. We argue that a multi-GPU MapReduce library is a good fit for parallel volume rendering because it is easy to program for, scales well, and eliminates the need to focus on I/O algorithms, allowing the focus to be on visualization algorithms instead. We show that our system scales with respect to the size of the volume and (given enough work) the number of GPUs.
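A minimal sketch of the MapReduce structure described above, assuming a toy CPU renderer: map() turns one volume brick into a partial image and reduce() composites the partials. The real system distributes these stages across GPUs and uses proper ray casting with front-to-back compositing; the max-intensity projection here only illustrates the data flow.

    # Hedged sketch of the map/reduce data flow only (CPU, max-intensity projection).
    import numpy as np

    def map_render(brick):
        """Map: render one volume brick to a partial image (toy: max projection along z)."""
        return brick.max(axis=2)

    def reduce_composite(partials):
        """Reduce: composite partial images into the final frame (toy: per-pixel max)."""
        return np.maximum.reduce(partials)

    def render_volume(volume, n_bricks=4):
        """Split the volume along z into bricks, map each, then reduce."""
        bricks = np.array_split(volume, n_bricks, axis=2)
        partials = [map_render(b) for b in bricks]   # in the paper this work is spread over GPUs
        return reduce_composite(partials)

    vol = np.random.rand(64, 64, 64).astype(np.float32)
    print(render_volume(vol).shape)  # (64, 64)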

61 citations

Journal ArticleDOI
TL;DR: A novel visualization framework is presented that combines the advantages of clustering methods and illustrative rendering techniques to generate a concise and informative depiction of complex flow structures.
Abstract: Most 3D vector field visualization techniques suffer from the problem of visual clutter, and it remains a challenging task to effectively convey both directional and structural information of 3D vector fields. In this paper, we present a novel visualization framework that combines the advantages of clustering methods and illustrative rendering techniques to generate a concise and informative depiction of complex flow structures. Given a 3D vector field, we first generate a number of streamlines covering the important regions based on an entropy measurement. Then we decompose the streamlines into different groups based on a categorization of vector information, wherein the streamline pattern in each group is ensured to be coherent or nearly coherent. For each group, we select a set of representative streamlines and render them in an illustrative fashion to enhance depth cues and succinctly show local flow characteristics. The results demonstrate that our approach can generate a visualization that is relatively free of visual clutter while facilitating perception of salient information of complex vector fields.
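The entropy-based seeding step lends itself to a small illustration. The sketch below, assuming a 2D slice of the vector field for brevity, quantizes local flow directions into angular histogram bins and reports Shannon entropy per block, so seeds could be concentrated in high-entropy regions. The bin count, block size, and function names are illustrative choices rather than the paper's settings.

    # Hedged sketch of an entropy measurement for flow direction (2D slice for brevity).
    import numpy as np

    def direction_entropy(vectors, n_bins=16):
        """Shannon entropy of flow directions quantized into angular bins."""
        angles = np.arctan2(vectors[..., 1], vectors[..., 0]).ravel()
        hist, _ = np.histogram(angles, bins=n_bins, range=(-np.pi, np.pi))
        p = hist / max(hist.sum(), 1)
        p = p[p > 0]
        return float(-(p * np.log2(p)).sum())

    def block_entropy_map(field, block=8):
        """Entropy of each non-overlapping block of a (h, w, 2) vector field slice."""
        h, w = field.shape[:2]
        out = np.zeros((h // block, w // block))
        for i in range(out.shape[0]):
            for j in range(out.shape[1]):
                out[i, j] = direction_entropy(field[i*block:(i+1)*block, j*block:(j+1)*block])
        return out

    field = np.random.default_rng(1).normal(size=(64, 64, 2))   # toy 2D vector field
    print(block_entropy_map(field).round(2))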

38 citations

Proceedings ArticleDOI
01 Mar 2011
TL;DR: This paper classifies sample voxels to produce a static visualization that succinctly summarizes the connection among all correlation volumes with respect to various reference locations, and investigates the error introduced by each step of the sampling scheme in terms of classification accuracy.
Abstract: Finding correlations among data is one of the most essential tasks in many scientific investigations and discoveries. This paper addresses the issue of creating a static volume classification that summarizes the correlation connection in time-varying multivariate data sets. In practice, computing all temporal and spatial correlations for large 3D time-varying multivariate data sets is prohibitively expensive. We present a sampling-based approach to classifying correlation patterns. Our sampling scheme consists of three steps: selecting important samples from the volume, prioritizing distance computation for sample pairs, and approximating volume-based correlation with sample-based correlation. We classify sample voxels to produce a static visualization that succinctly summarizes the connection among all correlation volumes with respect to various reference locations. We also investigate the error introduced by each step of our sampling scheme in terms of classification accuracy. Domain scientists participated in this work and helped us select samples and evaluate results. Our approach is generally applicable to the analysis of other scientific data where correlation study is relevant.
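As an illustration of the third step, approximating volume-based correlation with sample-based correlation, the sketch below correlates the time series of a random subset of voxels against a reference voxel. Uniform random sampling and the sample_correlation name are assumptions; the paper selects samples by importance and prioritizes pairwise distance computations.

    # Hedged sketch: Pearson correlation of sampled voxel time series vs. a reference voxel.
    import numpy as np

    def sample_correlation(data, ref_idx, n_samples=100, seed=0):
        """data: (time, n_voxels) array. Returns sampled voxel indices and their correlations."""
        rng = np.random.default_rng(seed)
        idx = rng.choice(data.shape[1], size=n_samples, replace=False)
        ref = data[:, ref_idx]
        smp = data[:, idx]
        ref_z = (ref - ref.mean()) / ref.std()
        smp_z = (smp - smp.mean(axis=0)) / smp.std(axis=0)
        corr = (smp_z * ref_z[:, None]).mean(axis=0)   # Pearson correlation per sampled voxel
        return idx, corr

    data = np.random.default_rng(2).normal(size=(50, 10000))  # 50 time steps, 10k voxels
    idx, corr = sample_correlation(data, ref_idx=1234)
    print(corr.round(2)[:5])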

32 citations

Proceedings ArticleDOI
17 Jun 2006
TL;DR: This approach achieves on-target design performance while significantly reducing threshold-voltage fluctuation, and is believed to provide a novel way to accelerate the tuning of process parameters and benefit nanodevice technology.
Abstract: In this paper, a TCAD-simulation-based optimization methodology for nanoscale CMOS device fabrication is advanced. Fluctuation in electrical characteristics is considered and minimized in the optimization process. Device and process simulation are integrated to evaluate device performance, and a hybrid intelligent approach enables us to extract optimal recipes subject to the specified device specifications. Production of CMOS devices is now in the sub-65 nm regime; therefore, fluctuation in electrical characteristics should be considered simultaneously when extracting a set of optimal process parameters. The efficiency and accuracy of the proposed computational methodology are verified on a 65 nm CMOS device. Compared with real fabricated and measured data, this approach achieves on-target design performance while significantly reducing threshold-voltage fluctuation. We believe this approach provides a novel way to accelerate the tuning of process parameters and benefit nanodevice technology.
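The trade-off at the heart of the methodology, hitting target electrical specs while suppressing threshold-voltage fluctuation, can be expressed as a single cost function. The sketch below is only a stand-in: toy_simulate replaces the TCAD process/device simulation, plain random search replaces the hybrid intelligent optimizer, and all parameter names and numbers are assumptions.

    # Hedged sketch: cost = |Vth - target| + weight * Vth fluctuation, minimized by random search.
    import numpy as np

    def toy_simulate(recipe, rng, n_mc=20):
        """Stand-in for TCAD simulation: returns mean threshold voltage and its spread."""
        dose, anneal = recipe
        vth = 0.30 + 0.05 * dose - 0.02 * anneal + rng.normal(0, 0.01 + 0.02 * dose, n_mc)
        return vth.mean(), vth.std()

    def cost(recipe, rng, vth_target=0.35, w_fluct=5.0):
        mean_vth, sigma_vth = toy_simulate(recipe, rng)
        return abs(mean_vth - vth_target) + w_fluct * sigma_vth

    rng = np.random.default_rng(3)
    candidates = rng.uniform(0.0, 1.0, size=(200, 2))      # normalized (dose, anneal) pairs
    best = min(candidates, key=lambda r: cost(r, rng))
    print("best recipe:", best.round(3))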

21 citations


Cited by
Journal ArticleDOI
TL;DR: Roughly one in six of Walsh's 281 publications are included, photographically reproduced; reproduction is excellent except for one paper from 1918. Three brief papers about Walsh and his work are also reproduced, the third of which is an obituary.
Abstract: a 'sleeper', receiving only modest attention for 50 years before emerging as a cornerstone of communications engineering in more recent times. Roughly one in six of Walsh's 281 publications are included, photographically reproduced. Reproduction is excellent except for one paper from 1918. The book also reproduces three brief papers about Walsh and his work, by W. E. Sewell, D. V. Widder and Morris Marden. The first two were written for a special issue of the SIAM Journal celebrating Walsh's 70th birthday; the third is an obituary.

676 citations

Journal ArticleDOI
TL;DR: In this paper, the authors consider how one of the oldest and most widely applied statistical methods, principal components analysis (PCA), is employed with spatial data, and identify four main methodologies, defined as (1) PCA applied to spatial objects, (2) PCA applied to raster data, (3) atmospheric science PCA, and (4) PCA on flows.
Abstract: This article considers critically how one of the oldest and most widely applied statistical methods, principal components analysis (PCA), is employed with spatial data. We first provide a brief guide to how PCA works: This includes robust and compositional PCA variants, links to factor analysis, latent variable modeling, and multilevel PCA. We then present two different approaches to using PCA with spatial data. First we look at the nonspatial approach, which avoids challenges posed by spatial data by using a standard PCA on attribute space only. Within this approach we identify four main methodologies, which we define as (1) PCA applied to spatial objects, (2) PCA applied to raster data, (3) atmospheric science PCA, and (4) PCA on flows. In the second approach, we look at PCA adapted for effects in geographical space by looking at PCA methods adapted for first-order nonstationary effects (spatial heterogeneity) and second-order stationary effects (spatial autocorrelation). We also describe how PCA can be...
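As a concrete instance of methodology (2), PCA applied to raster data, the sketch below treats each pixel as an observation and each band as a variable, runs an ordinary (non-spatial) PCA via SVD, and reshapes the component scores back onto the grid. The raster_pca helper and the synthetic six-band raster are illustrative assumptions.

    # Hedged sketch: standard PCA applied to a multi-band raster (pixels = observations).
    import numpy as np

    def raster_pca(raster, n_components=2):
        """raster: (rows, cols, bands) array. Returns score rasters and loadings."""
        rows, cols, bands = raster.shape
        X = raster.reshape(-1, bands).astype(float)
        X -= X.mean(axis=0)                                 # center each band
        U, S, Vt = np.linalg.svd(X, full_matrices=False)
        scores = (U[:, :n_components] * S[:n_components]).reshape(rows, cols, n_components)
        return scores, Vt[:n_components]                    # loadings: components x bands

    raster = np.random.default_rng(4).normal(size=(50, 60, 6))   # e.g. six spectral bands
    scores, loadings = raster_pca(raster)
    print(scores.shape, loadings.shape)   # (50, 60, 2) (2, 6)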

331 citations

Journal ArticleDOI
TL;DR: The concept of 3D space-time density of trajectories is introduced to solve the problem of cluttering in the space-time cube, and an application to real-time movement data is presented: vessel movement trajectories acquired using Automatic Identification System (AIS) equipment on ships in the Gulf of Finland.
Abstract: Modern positioning and identification technologies enable tracking of almost any type of moving object. A remarkable amount of new trajectory data is thus available for the analysis of various phenomena. In cartography, a typical way to visualise and explore such data is to use a space-time cube, where trajectories are shown as 3D polylines through space and time. With increasingly large movement datasets becoming available, this type of display quickly becomes cluttered and unclear. In this article, we introduce the concept of 3D space-time density of trajectories to solve the problem of cluttering in the space-time cube. The space-time density is a generalisation of standard 2D kernel density around 2D point data into 3D density around 3D polyline data (i.e. trajectories). We present the algorithm for space-time density, test it on simulated data, show some basic visualisations of the resulting density volume and observe particular types of spatio-temporal patterns in the density that are specific to trajectory data. We also present an application to real-time movement data, that is, vessel movement trajectories acquired using the Automatic Identification System (AIS) equipment on ships in the Gulf of Finland. Finally, we consider the wider ramifications to spatial analysis of using this novel type of spatio-temporal visualisation.
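A minimal sketch of the space-time density idea, under simplifying assumptions: each (x, y, t) trajectory polyline is densely resampled and its samples are binned into a 3D voxel grid, so stacked trajectories accumulate into a density volume. The article computes a true kernel density around the polylines; the histogram here only illustrates the construction of the space-time cube.

    # Hedged sketch: histogram-based stand-in for 3D space-time density of trajectories.
    import numpy as np

    def space_time_density(trajectories, bins=(32, 32, 32), samples_per_seg=10):
        """trajectories: list of (n_i, 3) arrays with columns (x, y, t)."""
        pts = []
        for traj in trajectories:
            for a, b in zip(traj[:-1], traj[1:]):
                w = np.linspace(0.0, 1.0, samples_per_seg)[:, None]
                pts.append(a * (1 - w) + b * w)             # resample so long segments count fully
        density, _ = np.histogramdd(np.vstack(pts), bins=bins)
        return density

    rng = np.random.default_rng(5)
    trajs = [np.cumsum(rng.normal(size=(200, 3)), axis=0) for _ in range(20)]   # toy trajectories
    print(space_time_density(trajs).shape)   # (32, 32, 32)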

224 citations

Journal ArticleDOI
TL;DR: A novel, nonparametric method for summarizing ensembles of 2D and 3D curves is presented: an extension of data depth, a method from descriptive statistics, to curves, together with rendering strategies for showing rank statistics of an ensemble of curves, generalizing traditional whisker plots or boxplots to multidimensional curves.
Abstract: In simulation science, computational scientists often study the behavior of their simulations by repeated solutions with variations in parameters and/or boundary values or initial conditions. Through such simulation ensembles, one can try to understand or quantify the variability or uncertainty in a solution as a function of the various inputs or model assumptions. In response to a growing interest in simulation ensembles, the visualization community has developed a suite of methods for allowing users to observe and understand the properties of these ensembles in an efficient and effective manner. An important aspect of visualizing simulations is the analysis of derived features, often represented as points, surfaces, or curves. In this paper, we present a novel, nonparametric method for summarizing ensembles of 2D and 3D curves. We propose an extension of a method from descriptive statistics, data depth, to curves. We also demonstrate a set of rendering and visualization strategies for showing rank statistics of an ensemble of curves, which is a generalization of traditional whisker plots or boxplots to multidimensional curves. Results are presented for applications in neuroimaging, hurricane forecasting and fluid dynamics.
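The data-depth statistic behind curve boxplots can be illustrated for 1D curves: a curve is deep if it lies inside the envelope of many pairs of other curves, and the deepest curve plays the role of the median. The sketch below implements the simplest pairwise band-depth variant; the paper's contribution generalizes such rank statistics to 2D and 3D curves.

    # Hedged sketch: pairwise band depth for an ensemble of 1D curves.
    import numpy as np
    from itertools import combinations

    def band_depth(curves):
        """curves: (n_curves, n_points) array. Returns a depth value per curve."""
        n = curves.shape[0]
        depth = np.zeros(n)
        for i, j in combinations(range(n), 2):
            lo = np.minimum(curves[i], curves[j])
            hi = np.maximum(curves[i], curves[j])
            depth += np.all((curves >= lo) & (curves <= hi), axis=1)   # inside the band?
        return depth / (n * (n - 1) / 2)

    rng = np.random.default_rng(6)
    x = np.linspace(0, 1, 50)
    ensemble = np.sin(2 * np.pi * x) + rng.normal(0, 0.3, size=(30, 50))   # 30 noisy curves
    print("deepest (median) curve index:", int(band_depth(ensemble).argmax()))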

162 citations

Journal ArticleDOI
TL;DR: In this paper, the authors examine the predictability of the intrinsically generated component of ENSO modulation, using a 4000-yr unforced control run from a global coupled GCM [GFDL Climate Model, version 2.1 (CM2.1)] with a fairly realistic representation of ENSO.
Abstract: Observations and climate simulations exhibit epochs of extreme El Niño–Southern Oscillation (ENSO) behavior that can persist for decades. Previous studies have revealed a wide range of ENSO responses to forcings from greenhouse gases, aerosols, and orbital variations, but they have also shown that interdecadal modulation of ENSO can arise even without such forcings. The present study examines the predictability of this intrinsically generated component of ENSO modulation, using a 4000-yr unforced control run from a global coupled GCM [GFDL Climate Model, version 2.1 (CM2.1)] with a fairly realistic representation of ENSO. Extreme ENSO epochs from the unforced simulation are reforecast using the same ("perfect") model but slightly perturbed initial conditions. These 40-member reforecast ensembles display potential predictability of the ENSO trajectory, extending up to several years ahead. However, no decadal-scale predictability of ENSO behavior is found. This indicates that multidecadal epochs of extreme ENSO behavior can arise not only intrinsically but also delicately and entirely at random. Previous work had shown that CM2.1 generates strong, reasonably realistic, decadally predictable high-latitude climate signals, as well as tropical and extratropical decadal signals that interact with ENSO. However, those slow variations appear not to lend significant decadal predictability to this model's ENSO behavior, at least in the absence of external forcings. While the potential implications of these results are sobering for decadal predictability, they also offer an expedited approach to model evaluation and development, in which large ensembles of short runs are executed in parallel, to quickly and robustly evaluate simulations of ENSO. Further implications are discussed for decadal prediction, attribution of past and future ENSO variations, and societal vulnerability.

144 citations