scispace - formally typeset
Author

Xin Huang

Bio: Xin Huang is an academic researcher from the Chinese Academy of Sciences. The author has contributed to research in the topics of Medicine and Chemistry, has an h-index of 68, and has co-authored 585 publications receiving 17,105 citations. Previous affiliations of Xin Huang include the Hong Kong University of Science and Technology and the University of Edinburgh.


Papers
Journal ArticleDOI
TL;DR: This work comprehensively reviews studies of biomass burning (BB) published in China, including literature on field measurements, laboratory studies, and the indoor and outdoor impacts of BB, to provide a basis for the formulation of policies and regulations by Chinese policy makers.

772 citations

Journal ArticleDOI
TL;DR: In this article, the authors show that black carbon aerosols play a key role in modifying the PBL meteorology and hence enhancing the haze pollution in megacities in China.
Abstract: Aerosol-planetary boundary layer (PBL) interactions have been found to enhance air pollution in megacities in China. We show that black carbon (BC) aerosols play the key role in modifying the PBL meteorology and hence enhancing the haze pollution. With model simulations and data analysis from various field observations in December 2013, we demonstrate that BC induces heating in the PBL, particularly in the upper PBL, and the resulting decreased surface heat flux substantially depresses the development of PBL and consequently enhances the occurrences of extreme haze pollution episodes. We define this process as the “dome effect” of BC and suggest an urgent need for reducing BC emissions as an efficient way to mitigate the extreme haze pollution in megacities of China.

563 citations

Journal ArticleDOI
TL;DR: It is shown that the haze during the COVID lockdown was driven by enhancements of secondary pollution, and that haze mitigation depends upon a coordinated and balanced strategy for controlling multiple pollutants.
Abstract: To control the spread of the 2019 novel coronavirus (COVID-19), China imposed nationwide restrictions on the movement of its population (lockdown) after the Chinese New Year of 2020, leading to large reductions in economic activities and associated emissions. Despite such large decreases in primary pollution, there were nonetheless several periods of heavy haze pollution in eastern China, raising questions about the well-established relationship between human activities and air quality. Here, using comprehensive measurements and modeling, we show that the haze during the COVID lockdown was driven by enhancements of secondary pollution. In particular, large decreases in NOx emissions from transportation increased ozone and nighttime NO3 radical formation, and these increases in atmospheric oxidizing capacity in turn facilitated the formation of secondary particulate matter. Our results, afforded by the tragic natural experiment of the COVID-19 pandemic, indicate that haze mitigation depends upon a coordinated and balanced strategy for controlling multiple pollutants.

529 citations

Journal ArticleDOI
TL;DR: In this article, the authors monitor land use through a nested hierarchy of land cover, combining change vectors of Tasseled Cap brightness, greenness, and wetness from Landsat Thematic Mapper (TM) images to map four stable classes and five change classes.
Abstract: The Pearl River Delta in the People's Republic of China is experiencing rapid rates of economic growth. Government directives in the late 1970s and early 1980s spurred economic development that has led to widespread land conversion. In this study, we monitor land-use through a nested hierarchy of land-cover. Change vectors of Tasseled Cap brightness, greenness and wetness of Landsat Thematic Mapper (TM) images are combined with the brightness, greenness, wetness values from the initial date of imagery to map four stable classes and five change classes. Most of the land-use change is conversion from agricultural land to urban areas. Results indicate that urban areas have increased by more than 300% between 1988 and 1996. Field assessments confirm a high overall accuracy of the land-use change map (93.5%) and support the use of change vectors and multidate Landsat TM imagery to monitor land-use change. Results confirm the importance of field-based accuracy assessment to identify problems in a land-use map and to improve area estimates for each class.
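The change-vector step described above can be sketched in a few lines of NumPy; everything here (array shapes, the random data, the stability threshold) is invented for illustration and is not taken from the paper:

```python
import numpy as np

# Hypothetical Tasseled Cap bands (brightness, greenness, wetness)
# for two image dates, shape (3, rows, cols).
rng = np.random.default_rng(0)
tc_1988 = rng.random((3, 4, 4))
tc_1996 = rng.random((3, 4, 4))

# Change vector: per-pixel difference of the three TC components.
delta = tc_1996 - tc_1988

# Magnitude of change; thresholding it separates stable pixels
# from changed pixels before change classes are assigned.
magnitude = np.sqrt((delta ** 2).sum(axis=0))
stable = magnitude < 0.5  # illustrative threshold, not the paper's

print(magnitude.shape)  # (4, 4)
```

In the paper the change vectors are further combined with the initial-date brightness/greenness/wetness values to label the change classes; this sketch only shows the magnitude step.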

455 citations

Journal ArticleDOI
TL;DR: A new multifeature model, aiming to construct a support vector machine (SVM) ensemble combining multiple spectral and spatial features at both pixel and object levels is proposed, which provides more accurate classification results compared to the voting and probabilistic models.
Abstract: In recent years, the resolution of remotely sensed imagery has become increasingly high in both the spectral and spatial domains, which simultaneously provides more plentiful spectral and spatial information. Accordingly, the accurate interpretation of high-resolution imagery depends on effective integration of the spectral, structural and semantic features contained in the images. In this paper, we propose a new multifeature model, aiming to construct a support vector machine (SVM) ensemble combining multiple spectral and spatial features at both pixel and object levels. The features employed in this study include a gray-level co-occurrence matrix, differential morphological profiles, and an urban complexity index. Subsequently, three algorithms are proposed to integrate the multifeature SVMs: certainty voting, probabilistic fusion, and an object-based semantic approach, respectively. The proposed algorithms are compared with other multifeature SVM methods including the vector stacking, feature selection, and composite kernels. Experiments are conducted on the hyperspectral digital imagery collection experiment DC Mall data set and two WorldView-2 data sets. It is found that the multifeature model with semantic-based postprocessing provides more accurate classification results (an accuracy improvement of 1-4% for the three experimental data sets) compared to the voting and probabilistic models.
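The two fusion rules named above, certainty (hard) voting and probabilistic (soft) fusion, can be illustrated with a toy NumPy sketch; the per-pixel class probabilities below are invented stand-ins for the outputs of SVMs trained on different feature sets, not the paper's data or exact algorithms:

```python
import numpy as np

# Hypothetical per-pixel class probabilities from three SVMs, each
# trained on a different feature set (e.g. spectral bands, GLCM
# texture, morphological profiles). Shape: (n_pixels, n_classes).
p_spectral = np.array([[0.7, 0.2, 0.1], [0.3, 0.4, 0.3]])
p_texture  = np.array([[0.6, 0.3, 0.1], [0.2, 0.5, 0.3]])
p_morph    = np.array([[0.8, 0.1, 0.1], [0.1, 0.6, 0.3]])

# Probabilistic fusion: average the class probabilities, then take
# the most likely class per pixel.
fused = (p_spectral + p_texture + p_morph) / 3.0
labels_soft = fused.argmax(axis=1)

# Certainty voting: each SVM casts one hard vote per pixel; the
# majority class wins.
votes = np.stack([p.argmax(axis=1) for p in (p_spectral, p_texture, p_morph)])
labels_hard = np.array([np.bincount(col).argmax() for col in votes.T])

print(labels_soft, labels_hard)  # [0 1] [0 1]
```

The paper's third, object-based semantic approach additionally post-processes such per-pixel labels within image objects, which this sketch does not attempt.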

408 citations


Cited by
01 May 1993
TL;DR: Comparing the results to the fastest reported vectorized Cray Y-MP and C90 algorithm shows that the current generation of parallel machines is competitive with conventional vector supercomputers even for small problems.
Abstract: Three parallel algorithms for classical molecular dynamics are presented. The first assigns each processor a fixed subset of atoms; the second assigns each a fixed subset of inter-atomic forces to compute; the third assigns each a fixed spatial region. The algorithms are suitable for molecular dynamics models which can be difficult to parallelize efficiently: those with short-range forces where the neighbors of each atom change rapidly. They can be implemented on any distributed-memory parallel machine which allows for message-passing of data between independently executing processors. The algorithms are tested on a standard Lennard-Jones benchmark problem for system sizes ranging from 500 to 100,000,000 atoms on several parallel supercomputers: the nCUBE 2, Intel iPSC/860 and Paragon, and Cray T3D. Comparing the results to the fastest reported vectorized Cray Y-MP and C90 algorithm shows that the current generation of parallel machines is competitive with conventional vector supercomputers even for small problems. For large problems, the spatial algorithm achieves parallel efficiencies of 90%, and an 1840-node Intel Paragon performs up to 165 times faster than a single Cray C90 processor. Trade-offs between the three algorithms and guidelines for adapting them to more complex molecular dynamics simulations are also discussed.
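The spatial-decomposition idea (the third algorithm) can be sketched serially: each "processor" owns a fixed region of the simulation box and whatever atoms currently sit inside it. The box size, atom count, and grid below are invented for illustration, and real implementations exchange boundary atoms via message passing, which this sketch omits:

```python
import numpy as np

# Invented toy system: 1000 atoms in a cubic box of side 10.
box = 10.0
rng = np.random.default_rng(1)
positions = rng.random((1000, 3)) * box

# Divide the box into a 2x2x2 grid of regions (8 "processors").
grid = 2
cell = box / grid
region = (positions // cell).astype(int)  # per-axis region index, 0 or 1

# Flatten the 3-D region index into a single owner id in 0..7.
owner = region[:, 0] * grid**2 + region[:, 1] * grid + region[:, 2]

# Each processor computes forces only for its own atoms,
# communicating with neighboring regions for atoms near boundaries.
counts = np.bincount(owner, minlength=grid**3)
print(counts.sum())  # 1000
```

Because atoms move, the owner assignment is recomputed as atoms cross region boundaries, which is what keeps each processor's work local for short-range forces.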

29,323 citations

Journal ArticleDOI
TL;DR: Machine learning addresses many of the same research questions as the fields of statistics, data mining, and psychology, but with differences of emphasis.
Abstract: Machine Learning is the study of methods for programming computers to learn. Computers are applied to a wide range of tasks, and for most of these it is relatively easy for programmers to design and implement the necessary software. However, there are many tasks for which this is difficult or impossible. These can be divided into four general categories. First, there are problems for which there exist no human experts. For example, in modern automated manufacturing facilities, there is a need to predict machine failures before they occur by analyzing sensor readings. Because the machines are new, there are no human experts who can be interviewed by a programmer to provide the knowledge necessary to build a computer system. A machine learning system can study recorded data and subsequent machine failures and learn prediction rules. Second, there are problems where human experts exist, but where they are unable to explain their expertise. This is the case in many perceptual tasks, such as speech recognition, hand-writing recognition, and natural language understanding. Virtually all humans exhibit expert-level abilities on these tasks, but none of them can describe the detailed steps that they follow as they perform them. Fortunately, humans can provide machines with examples of the inputs and correct outputs for these tasks, so machine learning algorithms can learn to map the inputs to the outputs. Third, there are problems where phenomena are changing rapidly. In finance, for example, people would like to predict the future behavior of the stock market, of consumer purchases, or of exchange rates. These behaviors change frequently, so that even if a programmer could construct a good predictive computer program, it would need to be rewritten frequently. A learning program can relieve the programmer of this burden by constantly modifying and tuning a set of learned prediction rules. 
Fourth, there are applications that need to be customized for each computer user separately. Consider, for example, a program to filter unwanted electronic mail messages. Different users will need different filters. It is unreasonable to expect each user to program his or her own rules, and it is infeasible to provide every user with a software engineer to keep the rules up-to-date. A machine learning system can learn which mail messages the user rejects and maintain the filtering rules automatically. Machine learning addresses many of the same research questions as the fields of statistics, data mining, and psychology, but with differences of emphasis. Statistics focuses on understanding the phenomena that have generated the data, often with the goal of testing different hypotheses about those phenomena. Data mining seeks to find patterns in the data that are understandable by people. Psychological studies of human learning aspire to understand the mechanisms underlying the various learning behaviors exhibited by people (concept learning, skill acquisition, strategy change, etc.).
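The mail-filtering example above can be sketched as a toy word-count model; the messages, labels, and scoring rule are invented and far simpler than any real learned filter:

```python
from collections import Counter

# Invented training data: messages the user rejected vs. kept.
rejected = ["win money now", "cheap money offer"]
kept = ["meeting at noon", "project status update"]

# "Learn" filtering rules by counting words in each class.
reject_words = Counter(w for m in rejected for w in m.split())
keep_words = Counter(w for m in kept for w in m.split())

def score(message):
    # Positive score: the message looks more like rejected mail.
    return sum(reject_words[w] - keep_words[w] for w in message.split())

print(score("cheap money"))    # 3  -> filter it
print(score("status update"))  # -2 -> keep it
```

As new examples of rejected mail arrive, re-counting updates the rules automatically, which is the point of the passage: the user supplies labels, not hand-written rules.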

13,246 citations

Journal Article
Fumio Tajima
30 Oct 1989 - Genomics
TL;DR: It is suggested that the natural selection against large insertion/deletion is so weak that a large amount of variation is maintained in a population.

11,521 citations

Journal Article
TL;DR: In this article, the authors present the document drafted, voted on, and published by the IPCC (Intergovernmental Panel on Climate Change), which summarizes the research carried out on this important topic.
Abstract: Causes, consequences and mitigation strategies. We present the first of a series of articles in which we address the current problem of climate change. We present the document drafted, voted on, and published by the IPCC (Intergovernmental Panel on Climate Change), which summarizes the research carried out on this important topic.

4,187 citations

01 Jan 1989
TL;DR: In this article, a two-dimensional version of the Pennsylvania State University mesoscale model has been applied to Winter Monsoon Experiment data in order to simulate the diurnally occurring convection observed over the South China Sea.
Abstract: A two-dimensional version of the Pennsylvania State University mesoscale model has been applied to Winter Monsoon Experiment data in order to simulate the diurnally occurring convection observed over the South China Sea. The domain includes a representation of part of Borneo as well as the sea so that the model can simulate the initiation of convection. Also included in the model are parameterizations of mesoscale ice phase and moisture processes and longwave and shortwave radiation with a diurnal cycle. This allows use of the model to test the relative importance of various heating mechanisms to the stratiform cloud deck, which typically occupies several hundred kilometers of the domain. Frank and Cohen's cumulus parameterization scheme is employed to represent vital unresolved vertical transports in the convective area. The major conclusions are: Ice phase processes are important in determining the level of maximum large-scale heating and vertical motion because there is a strong anvil component...

3,813 citations