
Showing papers by "Nanjing University of Information Science and Technology published in 2016"


Journal ArticleDOI
TL;DR: This paper constructs a special tree-based index structure and proposes a “Greedy Depth-first Search” algorithm to provide efficient multi-keyword ranked search over encrypted cloud data, which simultaneously supports dynamic update operations like deletion and insertion of documents.
Abstract: Due to the increasing popularity of cloud computing, more and more data owners are motivated to outsource their data to cloud servers for great convenience and reduced cost in data management. However, sensitive data must be encrypted before outsourcing to meet privacy requirements, which renders traditional data utilization, such as keyword-based document retrieval, unusable. In this paper, we present a secure multi-keyword ranked search scheme over encrypted cloud data that simultaneously supports dynamic update operations such as deletion and insertion of documents. Specifically, the vector space model and the widely used TF×IDF model are combined in index construction and query generation. We construct a special tree-based index structure and propose a “Greedy Depth-first Search” algorithm to provide efficient multi-keyword ranked search. The secure kNN algorithm is utilized to encrypt the index and query vectors while ensuring accurate relevance score calculation between the encrypted index and query vectors. To resist statistical attacks, phantom terms are added to the index vector to blind the search results. Owing to the special tree-based index structure, the proposed scheme achieves sub-linear search time and handles the deletion and insertion of documents flexibly. Extensive experiments are conducted to demonstrate the efficiency of the proposed scheme.
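
To make the ranking concrete, here is a minimal sketch of the plaintext TF×IDF/vector-space scoring that the encrypted index and query vectors preserve; the toy corpus and weighting details are illustrative assumptions, not the paper's exact construction.

```python
import math
from collections import Counter

# Hypothetical toy corpus; the scheme builds one index vector per document.
docs = {
    "d1": "cloud data encryption cloud",
    "d2": "keyword search over encrypted data",
}
query = ["cloud", "search"]

vocab = sorted({w for text in docs.values() for w in text.split()})
n_docs = len(docs)

def tf_idf_vector(text):
    """TF x IDF weights over the shared vocabulary (one index vector)."""
    counts = Counter(text.split())
    vec = []
    for w in vocab:
        tf = counts[w] / len(text.split())
        df = sum(1 for t in docs.values() if w in t.split())
        idf = math.log(1 + n_docs / df) if df else 0.0
        vec.append(tf * idf)
    return vec

def query_vector(keywords):
    """Binary query vector over the same vocabulary."""
    return [1.0 if w in keywords else 0.0 for w in vocab]

# Relevance score = inner product of index and query vectors, which is
# exactly the quantity the secure kNN encryption is designed to preserve.
q = query_vector(query)
for name, text in docs.items():
    score = sum(a * b for a, b in zip(tf_idf_vector(text), q))
    print(name, round(score, 4))
```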

976 citations


Journal ArticleDOI
TL;DR: An extended TOPSIS method and an aggregation-based method are developed for multi-attribute group decision making (MAGDM) with probabilistic linguistic information, and applied to a practical case concerning strategy initiatives.

807 citations


Journal ArticleDOI
TL;DR: A Stacked Sparse Autoencoder (SSAE), an instance of a deep learning strategy, is presented for efficient nuclei detection on high-resolution histopathological images of breast cancer, and is shown to outperform nine other state-of-the-art nuclear detection strategies.
Abstract: Automated nuclear detection is a critical step for a number of computer-assisted pathology image analysis algorithms, such as automated grading of breast cancer tissue specimens. The Nottingham Histologic Score system is highly correlated with the shape and appearance of breast cancer nuclei in histopathological images. However, automated nucleus detection is complicated by 1) the large number of nuclei and the size of high-resolution digitized pathology images, and 2) the variability in size, shape, appearance, and texture of the individual nuclei. Recently there has been interest in the application of “Deep Learning” strategies for classification and analysis of big image data. Histopathology, given its size and complexity, represents an excellent use case for application of deep learning strategies. In this paper, a Stacked Sparse Autoencoder (SSAE), an instance of a deep learning strategy, is presented for efficient nuclei detection on high-resolution histopathological images of breast cancer. The SSAE learns high-level features from pixel intensities alone in order to identify distinguishing features of nuclei. A sliding window operation is applied to each image in order to represent image patches via the high-level features obtained via the autoencoder, which are then fed to a classifier that categorizes each image patch as nuclear or non-nuclear. Across a cohort of 500 histopathological images (2200 × 2200) and approximately 3500 manually segmented individual nuclei serving as the ground truth, SSAE achieved an improved F-measure of 84.49% and an average area under the precision-recall curve (AveP) of 78.83%. The SSAE approach also outperformed nine other state-of-the-art nuclear detection strategies.
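
A schematic of the sliding-window detection loop described above, with placeholder encoder and classifier; the window size, stride, and stand-in models are illustrative assumptions, not the paper's settings.

```python
import numpy as np

def sliding_window_detect(image, encoder, classifier, win=34, stride=17):
    """Classify each patch as nuclear / non-nuclear.

    `encoder` maps a flattened patch to high-level (SSAE-style) features
    and `classifier` is any trained binary classifier; both are assumed
    to be given (placeholders below).
    """
    h, w = image.shape
    detections = []
    for y in range(0, h - win + 1, stride):
        for x in range(0, w - win + 1, stride):
            patch = image[y:y + win, x:x + win].reshape(1, -1)
            features = encoder(patch)          # high-level patch features
            if classifier(features) == 1:      # 1 = nuclear patch
                detections.append((y, x))
    return detections

# Toy usage with stand-in encoder/classifier:
img = np.random.rand(128, 128)
dets = sliding_window_detect(
    img,
    encoder=lambda p: p.mean(axis=1, keepdims=True),  # placeholder
    classifier=lambda f: int(f[0, 0] > 0.5),          # placeholder
)
print(len(dets), "candidate nuclear patches")
```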

735 citations


Journal ArticleDOI
TL;DR: This paper studies and solves the problem of personalized multi-keyword ranked search over encrypted data (PRSE) while preserving privacy in cloud computing, with the help of the semantic ontology WordNet, and proposes two PRSE schemes for different search intentions.
Abstract: In cloud computing, searchable encryption over outsourced data is a hot research field. However, most existing works on encrypted search over outsourced cloud data follow the model of “one size fits all” and ignore personalized search intention. Moreover, most of them support only exact keyword search, which greatly affects data usability and user experience. How to design a searchable encryption scheme that supports personalized search and improves the user search experience thus remains a very challenging task. In this paper, for the first time, we study and solve the problem of personalized multi-keyword ranked search over encrypted data (PRSE) while preserving privacy in cloud computing. With the help of the semantic ontology WordNet, we build a user interest model for each individual user by analyzing the user's search history, and adopt a scoring mechanism to express user interest. To address the limitations of the “one size fits all” model and exact keyword search, we propose two PRSE schemes for different search intentions. Extensive experiments on a real-world dataset validate our analysis and show that our proposed solution is very efficient and effective.
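
As a hint of the WordNet-based semantic support such schemes build on, here is a keyword-expansion snippet using NLTK's real WordNet interface (run `nltk.download('wordnet')` once beforehand); the paper's interest model and scoring mechanism are not shown and would sit on top of this.

```python
# Query-keyword expansion via WordNet synonyms.
from nltk.corpus import wordnet as wn

def expand(keyword):
    """Collect WordNet synonyms (lemma names) of a query keyword."""
    return {lemma.name().replace("_", " ")
            for syn in wn.synsets(keyword)
            for lemma in syn.lemmas()}

print(expand("vehicle"))
```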

665 citations


Journal ArticleDOI
TL;DR: A comprehensive review of studies on Asian aerosols, monsoons, and their interactions is provided in this article, and a new paradigm is proposed for investigating aerosol-monsoon interactions, in which natural aerosols such as desert dust, black carbon from biomass burning, and biogenic aerosols from vegetation are considered integral components of an intrinsic aerosol-monsoon climate system, subject to the external forcings of global warming, anthropogenic aerosols, and land use change.
Abstract: The increasing severity of droughts/floods and worsening air quality from increasing aerosols in Asian monsoon regions are the two gravest threats facing the more than 60% of the world's population living there. These dual threats have fueled a large body of research in the last decade on the roles of aerosols in impacting Asian monsoon weather and climate. This paper provides a comprehensive review of studies on Asian aerosols, monsoons, and their interactions. The Asian monsoon region is a primary source of emissions of diverse species of aerosols from both anthropogenic and natural origins. The distributions of aerosol loading are strongly influenced by distinct weather and climatic regimes, which are, in turn, modulated by aerosol effects. On a continental scale, aerosols reduce surface insolation and weaken the land-ocean thermal contrast, thus inhibiting the development of monsoons. Locally, aerosol radiative effects alter the thermodynamic stability and convective potential of the lower atmosphere, leading to reduced temperatures, increased atmospheric stability, and weakened wind and atmospheric circulations. The atmospheric thermodynamic state, which determines the formation of clouds, convection, and precipitation, may also be altered by aerosols serving as cloud condensation nuclei or ice nuclei. Absorbing aerosols such as black carbon and desert dust in Asian monsoon regions may also induce dynamical feedback processes, leading to a strengthening of the early monsoon and affecting the subsequent evolution of the monsoon. Many mechanisms have been put forth regarding how aerosols modulate the amplitude, frequency, intensity, and phase of different monsoon climate variables. A wide range of theoretical, observational, and modeling findings on the Asian monsoon, aerosols, and their interactions are synthesized. A new paradigm is proposed for investigating aerosol-monsoon interactions, in which natural aerosols such as desert dust, black carbon from biomass burning, and biogenic aerosols from vegetation are considered integral components of an intrinsic aerosol-monsoon climate system, subject to external forcing of global warming, anthropogenic aerosols, and land use change. Future research on aerosol-monsoon interactions calls for an integrated approach and international collaborations based on long-term sustained observations, process measurements, and improved models, as well as using observations to constrain model simulations and projections.

585 citations


Journal ArticleDOI
TL;DR: A unique watermark is directly embedded into the encrypted images by the cloud server before the images are sent to the query user, and when an image copy is found, the unlawful query user who distributed it can be traced by watermark extraction.
Abstract: With the increasing importance of images in people's daily life, content-based image retrieval (CBIR) has been widely studied. Compared with text documents, images consume much more storage space; hence, their maintenance is considered a typical application of cloud storage outsourcing. For privacy-preserving purposes, sensitive images, such as medical and personal images, need to be encrypted before outsourcing, which makes plaintext-domain CBIR technologies unusable. In this paper, we propose a scheme that supports CBIR over encrypted images without leaking the sensitive information to the cloud server. First, feature vectors are extracted to represent the corresponding images. After that, the pre-filter tables are constructed by locality-sensitive hashing to increase search efficiency. Moreover, the feature vectors are protected by the secure kNN algorithm, and image pixels are encrypted by a standard stream cipher. In addition, considering that authorized query users may illegally copy and distribute the retrieved images to unauthorized parties, we propose a watermark-based protocol to deter such illegal distributions. In our watermark-based protocol, a unique watermark is directly embedded into the encrypted images by the cloud server before the images are sent to the query user. Hence, when an image copy is found, the unlawful query user who distributed the image can be traced by watermark extraction. The security analysis and the experiments show the security and efficiency of the proposed scheme.
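
A sketch of the locality-sensitive-hashing pre-filter idea on plaintext feature vectors; random-hyperplane LSH is used here for concreteness, while the paper's exact hash family and table layout may differ.

```python
import numpy as np

rng = np.random.default_rng(0)

def lsh_signature(vec, planes):
    """Random-hyperplane LSH: one bit per hyperplane."""
    return tuple((planes @ vec > 0).astype(int))

# Build a pre-filter table: bucket image ids by LSH signature so that
# only a small candidate set is compared against the query vector.
dim, n_bits = 64, 8
planes = rng.standard_normal((n_bits, dim))
features = {i: rng.standard_normal(dim) for i in range(1000)}

table = {}
for img_id, vec in features.items():
    table.setdefault(lsh_signature(vec, planes), []).append(img_id)

query = rng.standard_normal(dim)
candidates = table.get(lsh_signature(query, planes), [])
print(len(candidates), "candidates instead of", len(features))
```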

563 citations


Journal ArticleDOI
TL;DR: A new method of keyword transformation based on the uni-gram is developed, which simultaneously improves the accuracy and creates the ability to handle other spelling mistakes; the keyword weight is also considered when selecting an adequate matching file set.
Abstract: Keyword-based search over encrypted outsourced data has become an important tool in the current cloud computing scenario. The majority of existing techniques focus on multi-keyword exact match or single-keyword fuzzy search. However, those techniques are of less practical significance in real-world applications than the multi-keyword fuzzy search technique over encrypted data. The first attempt to construct such a multi-keyword fuzzy search scheme was reported by Wang et al., who used locality-sensitive hashing functions and Bloom filtering to meet the goal of multi-keyword fuzzy search. Nevertheless, Wang's scheme was effective only for a one-letter mistake in a keyword and not for other common spelling mistakes. Moreover, Wang's scheme was vulnerable to server out-of-order problems during the ranking process and did not consider keyword weight. In this paper, we propose an efficient multi-keyword fuzzy ranked search scheme, built on Wang et al.'s scheme, that addresses the aforementioned problems. First, we develop a new method of keyword transformation based on the uni-gram, which simultaneously improves the accuracy and creates the ability to handle other spelling mistakes. In addition, keywords with the same root can be queried using the stemming algorithm. Furthermore, we consider the keyword weight when selecting an adequate matching file set. Experiments using real-world data show that our scheme is practically efficient and achieves high accuracy.
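
The uni-gram transformation is easy to illustrate: a sketch of the keyword-to-uni-gram-set mapping and a set-similarity check follows. In the real scheme these sets are then packed into Bloom filters via LSH before encryption, which is omitted here.

```python
from collections import Counter

def unigram_set(keyword):
    """Map a keyword to its uni-gram multiset, e.g.
    'google' -> {'g1', 'o1', 'o2', 'g2', 'l1', 'e1'}, so that a
    single-letter typo changes at most one element of the set."""
    counts = Counter()
    grams = set()
    for ch in keyword.lower():
        counts[ch] += 1
        grams.add(f"{ch}{counts[ch]}")
    return grams

def similarity(k1, k2):
    """Jaccard similarity between uni-gram sets; tolerant to common
    misspellings such as swapped or repeated letters."""
    a, b = unigram_set(k1), unigram_set(k2)
    return len(a & b) / len(a | b)

print(similarity("network", "netwrok"))   # letter swap: identical sets
print(similarity("database", "databse"))  # missing letter: still high
print(similarity("network", "database"))  # unrelated: low
```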

464 citations


Journal ArticleDOI
TL;DR: A Deep Convolutional Neural Network (DCNN) based feature learning method is presented to automatically segment or classify epithelial (EP) and stromal (ST) regions from digitized tumor tissue microarrays (TMAs), and is shown to outperform three handcrafted-feature-based approaches in classifying EP and ST regions.

403 citations


Journal ArticleDOI
TL;DR: In this article, the authors examined the contribution from open-field straw burning during harvest or other active burning periods, and showed that a substantial reduction of such burning would dramatically improve air quality in many Chinese regions during those periods.
Abstract: PM2.5 inventories have been developed in major Chinese cities to quantify the contributions from various sources based on annual emissions. This approach, however, could substantially underestimate the contribution from open straw burning during the harvest or other active burning periods. This study examines this issue by estimating monthly and annual straw-burning PM2.5 emissions in China and comparing them with the corresponding emissions from other anthropogenic sources. Annual straw-burning PM2.5 emissions during 1997–2013 for 31 Chinese provinces were calculated for each of the 12 months based on crop and related burning information and on satellite detection of agricultural burning. Annual emissions from other anthropogenic sources were collected from the literature and allocated to monthly values using air pollution index measurements. The results indicate that the annual PM2.5 emissions from open straw burning in China were 1.036 million tons. The monthly PM2.5 emission ratios of straw burning to other anthropogenic sources during June, the harvest period for many regions, were several times larger than the annual ratios at the national, regional, and provincial levels, suggesting that, in contrast to the annual emissions used in the PM2.5 inventories of Chinese cities to assess the contributions from other sources, monthly emissions should be used to assess the contributions from straw burning during the harvest or other active burning periods. The larger contributions from straw burning shown in this study also suggest that a substantial reduction of open-field straw burning would dramatically improve air quality in many Chinese regions during the harvest or other active burning periods.
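
Such inventories are typically built from a simple product of activity data and emission factors; a minimal sketch of that arithmetic follows, with all numbers illustrative placeholders rather than the paper's values.

```python
# Generic field-burning emission estimate of the kind used to build
# straw-burning inventories (illustrative numbers only):
#   E = production * straw_to_grain_ratio * fraction_burned
#       * combustion_efficiency * emission_factor
production = 2.0e8          # t of grain in a province-month (assumed)
straw_to_grain_ratio = 1.1  # t straw per t grain (assumed)
fraction_burned = 0.25      # share burned in the open field (assumed)
combustion_eff = 0.9        # share actually combusted (assumed)
ef_pm25 = 8.3e-3            # t PM2.5 per t straw burned (assumed)

emission = (production * straw_to_grain_ratio * fraction_burned
            * combustion_eff * ef_pm25)
print(f"{emission:.3e} t PM2.5")
```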

378 citations


Journal ArticleDOI
TL;DR: This paper models the messages embedded by spatial least significant bit (LSB) matching as independent noises to the cover image, and reveals that the histogram of the differences between pixel gray values is smoothed by the stego bits despite a large distance between the pixels.
Abstract: This paper models the messages embedded by spatial least significant bit (LSB) matching as independent noises to the cover image, and reveals that the histogram of the differences between pixel gray values is smoothed by the stego bits despite a large distance between the pixels. Using the characteristic function of the difference histogram (DHCF), we prove that the center of mass of the DHCF (DHCF COM) decreases after messages are embedded. Accordingly, the DHCF COMs are calculated as distinguishing features from the pixel pairs with different distances. The features are calibrated with an image generated by an average operation, and then used to train a support vector machine (SVM) classifier. The experimental results prove that the features extracted from the differences between nonadjacent pixels can help to tackle LSB matching as well.
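
A minimal sketch of the DHCF COM feature on synthetic data; the bin count and COM definition follow the common formulation, and the paper's calibration step (against an averaged image) is omitted.

```python
import numpy as np

def dhcf_com(image, distance=1):
    """Center of mass of the characteristic function of the difference
    histogram (DHCF COM) for horizontal pixel pairs `distance` apart."""
    diffs = image[:, distance:].astype(int) - image[:, :-distance].astype(int)
    hist, _ = np.histogram(diffs, bins=511, range=(-255.5, 255.5))
    cf = np.abs(np.fft.fft(hist))      # characteristic function magnitude
    half = cf[: len(cf) // 2]          # non-redundant frequencies
    freqs = np.arange(len(half))
    return (freqs * half).sum() / half.sum()

rng = np.random.default_rng(1)
cover = rng.integers(0, 256, (64, 64))
# LSB matching modeled as adding independent +/-1 noise:
stego = np.clip(cover + rng.integers(-1, 2, cover.shape), 0, 255)
# The paper proves the stego COM decreases after embedding; on this tiny
# synthetic cover the gap may be small and noisy.
print(dhcf_com(cover), dhcf_com(stego))
```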

359 citations


Journal ArticleDOI
TL;DR: This paper shows that, even without offline training with a large amount of auxiliary data, simple two-layer convolutional networks can be powerful enough to learn robust representations for visual tracking.
Abstract: Deep networks have been successfully applied to visual tracking by learning a generic representation offline from numerous training images. However, the offline training is time-consuming and the learned generic representation may be less discriminative for tracking specific objects. In this paper, we show that, even without offline training with a large amount of auxiliary data, simple two-layer convolutional networks can be powerful enough to learn robust representations for visual tracking. In the first frame, we extract a set of normalized patches from the target region as fixed filters, which integrate a series of adaptive contextual filters surrounding the target to define a set of feature maps in the subsequent frames. These maps measure similarities between each filter and useful local intensity patterns across the target, thereby encoding its local structural information. Furthermore, all the maps together form a global representation, via which the inner geometric layout of the target is also preserved. A simple soft shrinkage method that suppresses noisy values below an adaptive threshold is employed to de-noise the global representation. Our convolutional networks have a lightweight structure and perform favorably against several state-of-the-art methods on the recent tracking benchmark dataset with 50 challenging videos.
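
The de-noising step lends itself to a short sketch; the adaptive threshold below (median absolute value) is an assumption for illustration, not necessarily the paper's choice.

```python
import numpy as np

def soft_shrink(feature_map, tau=None):
    """Soft shrinkage used to de-noise the global representation:
    values whose magnitude falls below a threshold are driven to zero,
    the rest are shrunk toward zero by the same amount."""
    if tau is None:
        tau = np.median(np.abs(feature_map))  # assumed adaptive threshold
    return np.sign(feature_map) * np.maximum(np.abs(feature_map) - tau, 0.0)

fmap = np.random.default_rng(2).standard_normal((8, 8))
print(np.count_nonzero(soft_shrink(fmap)), "of", fmap.size, "values kept")
```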

Journal ArticleDOI
TL;DR: The proposed level set method can be directly applied to simultaneous segmentation and bias correction for 3T and 7T magnetic resonance images, and extensive evaluations demonstrate its superiority over other representative algorithms.
Abstract: It is often a difficult task to accurately segment images with intensity inhomogeneity, because most representative algorithms are region-based and depend on intensity homogeneity of the object of interest. In this paper, we present a novel level set method for image segmentation in the presence of intensity inhomogeneity. The inhomogeneous objects are modeled as Gaussian distributions of different means and variances, and a sliding window is used to map the original image into another domain, where the intensity distribution of each object is still Gaussian but better separated. The means of the Gaussian distributions in the transformed domain can be adaptively estimated by multiplying a bias field with the original signal within the window. A maximum likelihood energy functional is then defined on the whole image region, which combines the bias field, the level set function, and the piecewise constant function approximating the true image signal. The proposed level set method can be directly applied to simultaneous segmentation and bias correction for 3T and 7T magnetic resonance images. Extensive evaluations on synthetic and real images demonstrate the superiority of the proposed method over other representative algorithms.

Journal ArticleDOI
TL;DR: This paper redefines some more logical operational laws for linguistic terms, hesitant fuzzy linguistic elements (HFLEs) and probabilistic linguistic term sets (PLTSs), based on two equivalent transformation functions, to make the operation results more reasonable in decision making with linguistic information.

Journal ArticleDOI
TL;DR: In this article, satellite data are used to monitor long-term air quality trends in China, and the effect of air quality regulations is distinguished from that of economic growth by comparing pollutant levels relative to fossil fuel consumption.
Abstract: Air quality observations by satellite instruments are global and have a regular temporal resolution, which makes them very useful in studying long-term trends in atmospheric species. To monitor air quality trends in China for the period 2005–2015, we derive SO2 columns and NOx emissions on a provincial level with improved accuracy. To put these trends into perspective, they are compared with public data on energy consumption and the environmental policies of China. We distinguish the effect of air quality regulations from economic growth by comparing them relative to fossil fuel consumption. Pollutant levels, per unit of fossil fuel, are used to assess the effectiveness of air quality regulations. We note that the desulfurization regulations enforced in 2005–2006 only had a significant effect in 2008–2009, when much stricter control of the actual use of the installations began. For national NOx emissions, a distinct decreasing trend is only visible from 2012 onwards, but the emission peak year differs from province to province. Unlike SO2, emissions of NOx are highly related to traffic. Furthermore, regulations for NOx emissions are partly decided on a provincial level. The last 3 years show a reduction in both SO2 and NOx emissions per fossil fuel unit, since the authorities have implemented several new environmental regulations. Despite increasing fossil fuel consumption and a growing transport sector, the effects of air quality policy in China are clearly visible. Without the air quality regulations, the concentration of SO2 would be about 2.5 times higher and the NO2 concentrations would be at least 25 % higher than they are today in China.

Journal ArticleDOI
TL;DR: In this paper, the authors provide evidence for a long-held hypothesis that the biogeochemical effect of urban aerosol or haze pollution is also a contributor to the urban heat island.
Abstract: The urban heat island (UHI), the phenomenon of higher temperatures in urban land than the surrounding rural land, is commonly attributed to changes in biophysical properties of the land surface associated with urbanization. Here we provide evidence for a long-held hypothesis that the biogeochemical effect of urban aerosol or haze pollution is also a contributor to the UHI. Our results are based on satellite observations and urban climate model calculations. We find that a significant factor controlling the nighttime surface UHI across China is the urban–rural difference in the haze pollution level. The average haze contribution to the nighttime surface UHI is 0.7±0.3 K (mean±1 s.e.) for semi-arid cities, which is stronger than that in the humid climate due to a stronger longwave radiative forcing of coarser aerosols. Mitigation of haze pollution therefore provides a co-benefit of reducing heat stress on urban residents. The impact of locally-sourced aerosols on the Urban Heat Island (UHI) effect has been difficult to quantify due to opposing long and shortwave radiation effects. Here, using satellite observations and climate model simulations, the authors reveal that urban haze pollution intensifies the nighttime UHI in China.

Journal ArticleDOI
TL;DR: In this paper, the sources, variations and processes of submicron aerosols measured by an Aerodyne high-resolution aerosol mass spectrometer from 17 December 2013 to 17 January 2014, along with offline filter analysis by gas chromatography/mass spectrometry, are comprehensively characterized.
Abstract: Winter has the worst air pollution of the year in the megacity of Beijing. Despite extensive winter studies in recent years, our knowledge of the sources, formation mechanisms and evolution of aerosol particles is not complete. Here we present a comprehensive characterization of the sources, variations and processes of submicron aerosols that were measured by an Aerodyne high-resolution aerosol mass spectrometer from 17 December 2013 to 17 January 2014, along with offline filter analysis by gas chromatography/mass spectrometry. Our results suggest that submicron aerosol composition was generally similar across the winters of different years and was mainly composed of organics (60 %), sulfate (15 %) and nitrate (11 %). Positive matrix factorization of high- and unit-mass resolution spectra identified four primary organic aerosol (POA) factors from traffic, cooking, biomass burning (BBOA) and coal combustion (CCOA) emissions, as well as two secondary OA (SOA) factors. POA dominated OA, on average accounting for 56 %, with CCOA being the largest contributor (20 %). Both CCOA and BBOA showed distinct polycyclic aromatic hydrocarbon (PAH) spectral signatures, indicating that PAHs in winter were mainly from coal combustion (66 %) and biomass burning emissions (18 %). BBOA was highly correlated with levoglucosan, a tracer compound for biomass burning (r^2 = 0.93), and made a considerable contribution to OA in winter (9 %). An aqueous-phase-processed SOA (aq-OOA) that was strongly correlated with particle liquid water content, sulfate and S-containing ions (e.g. CH2SO2+) was identified. On average, aq-OOA contributed 12 % to the total OA and played a dominant role in increasing the oxidation degree of OA at high RH levels (> 50 %). Our results illustrate that aqueous-phase processing can enhance both SOA production and the oxidation state of OA in winter. Further episode analyses highlighted the significant impacts of meteorological parameters on aerosol composition, size distributions, oxidation states of OA and the evolutionary processes of secondary aerosols.

Journal ArticleDOI
TL;DR: In this article, the authors investigated the groundwater quality in the Faridpur district of central Bangladesh based on 60 preselected sample points, using water evaluation indices and a number of statistical approaches.
Abstract: This study investigates the groundwater quality in the Faridpur district of central Bangladesh based on 60 preselected sample points. Water evaluation indices and a number of statistical approaches...

Journal ArticleDOI
TL;DR: In this article, a comprehensive field campaign was carried out in summer 2014 in Wangdu, located in the North China Plain, where a month of continuous OH, HO2 and RO2 measurements was achieved.
Abstract: A comprehensive field campaign was carried out in summer 2014 in Wangdu, located in the North China Plain. A month of continuous OH, HO2 and RO2 measurements was achieved. Observations of radicals by the laser-induced fluorescence (LIF) technique revealed daily maximum concentrations between (5–15) × 10^6 cm^−3, (3–14) × 10^8 cm^−3 and (3–15) × 10^8 cm^−3 for OH, HO2 and RO2, respectively. Measured OH reactivities (inverse OH lifetime) were 10 to 20 s^−1 during daytime. The chemical box model RACM 2, including the Leuven isoprene mechanism (LIM), was used to interpret the observed radical concentrations. As in previous field campaigns in China, modeled and measured OH concentrations agree for NO mixing ratios higher than 1 ppbv, but systematic discrepancies are observed in the afternoon for NO mixing ratios of less than 300 pptv (the model–measurement ratio is between 1.4 and 2 in this case). If additional OH recycling equivalent to 100 pptv NO is assumed, the model is capable of reproducing the observed OH, HO2 and RO2 concentrations for conditions of high volatile organic compound (VOC) and low NOx concentrations. For HO2, good agreement is found between modeled and observed concentrations during day and night. In the case of RO2, the agreement between model calculations and measurements is good in the late afternoon when NO concentrations are below 0.3 ppbv. A significant model underprediction of RO2 by a factor of 3 to 5 is found in the morning at NO concentrations higher than 1 ppbv, which can be explained by a missing RO2 source of 2 ppbv h^−1. As a consequence, the model underpredicts the photochemical net ozone production by 20 ppbv per day, which is a significant portion of the daily integrated ozone production (110 ppbv) derived from the measured HO2 and RO2. The additional RO2 production from the photolysis of ClNO2 and missing reactivity can explain about 10 % and 20 % of the discrepancy, respectively. The underprediction of the photochemical ozone production at high NOx found in this study is consistent with the results from other field campaigns in urban environments, which underlines the need for better understanding of the peroxy radical chemistry for high NOx conditions.
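
For context, the net photochemical ozone production referred to above is conventionally derived from the measured peroxy radicals; a standard, simplified formulation (with all loss terms lumped into a single term) is:

```latex
% Net photochemical ozone production from measured peroxy radicals;
% k denotes the usual HO2+NO and RO2+NO rate constants, and L(O3)
% lumps together the ozone loss terms (simplification assumed here).
P(\mathrm{O_3}) = k_{\mathrm{HO_2+NO}}[\mathrm{HO_2}][\mathrm{NO}]
                + k_{\mathrm{RO_2+NO}}[\mathrm{RO_2}][\mathrm{NO}]
                - L(\mathrm{O_3})
```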

Journal ArticleDOI
TL;DR: This paper modifies the existing score function and accuracy function for Pythagorean fuzzy numbers to make them conform to PFSs, and defines some novel Pythagorean fuzzy weighted geometric/averaging operators for Pythagorean fuzzy information, which can neutrally treat the membership degree and the nonmembership degree.
Abstract: Pythagorean fuzzy sets (PFSs), originally proposed by Yager, are a new tool to deal with vagueness, with the square sum of the membership degree and the nonmembership degree equal to or less than 1; they have a much stronger ability than Atanassov's intuitionistic fuzzy sets to model such uncertainty. In this paper, we modify the existing score function and accuracy function for Pythagorean fuzzy numbers to make them conform to PFSs. Associated with the given operational laws, we define some novel Pythagorean fuzzy weighted geometric/averaging operators for Pythagorean fuzzy information, which can neutrally treat the membership degree and the nonmembership degree, and investigate the relationships among these operators and the existing ones. Finally, a practical example is provided to illustrate the developed operators and to make a comparative analysis.
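
For orientation, here are the baseline PFS definitions that the paper modifies: Yager's constraint and the commonly used score and accuracy functions (these are the standard forms, not the paper's new ones).

```python
# Baseline Pythagorean fuzzy number (PFN) operations for reference;
# the paper above *modifies* the score/accuracy functions, so these
# are the commonly used starting definitions only.
def is_pfn(mu, nu):
    return mu**2 + nu**2 <= 1.0   # defining PFS constraint

def score(mu, nu):
    return mu**2 - nu**2          # baseline score function

def accuracy(mu, nu):
    return mu**2 + nu**2          # baseline accuracy function

p = (0.8, 0.5)  # valid PFN: 0.64 + 0.25 <= 1, yet invalid as an
                # intuitionistic fuzzy number since 0.8 + 0.5 > 1,
                # illustrating the "stronger ability" noted above
print(is_pfn(*p), score(*p), accuracy(*p))
```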

Journal ArticleDOI
TL;DR: The results demonstrate the important role of regional transport, largely from the southwest but also from the east, and of coal combustion emissions in winter haze formation in Beijing, as well as an important downward mixing pathway during the severe haze of 2015 that can lead to rapid increases in certain aerosol species.
Abstract: Rapid formation and evolution of an extreme haze episode in Northern China during winter 2015

Journal ArticleDOI
TL;DR: Experimental results demonstrate that the proposed software-based liveness detection approach using multi-scale local phase quantization (LPQ) and principal component analysis (PCA) can detect the liveness of users' fingerprints and achieve high recognition accuracy.
Abstract: A fingerprint authentication system verifies users' identities according to the characteristics of their fingerprints. However, such a system has some security and privacy problems. For example, artificial fingerprints can trick the authentication system into granting access under a real user's identity. Therefore, a fingerprint liveness detection algorithm needs to be designed to prevent illegal users from accessing private information. In this paper, a new software-based liveness detection approach using multi-scale local phase quantization (LPQ) and principal component analysis (PCA) is proposed. The feature vectors of a fingerprint are constructed through multi-scale LPQ. PCA is also introduced to reduce the dimensionality of the feature vectors and obtain more effective features. Finally, a model is trained using a support vector machine classifier, and the liveness of a fingerprint is detected on the basis of this model. Experimental results demonstrate that our proposed method can detect the liveness of users' fingerprints and achieve high recognition accuracy. This study also confirms that multi-resolution analysis is a useful method for texture feature extraction during fingerprint liveness detection.
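
A sketch of the PCA-plus-SVM stage of this pipeline, with a placeholder standing in for real multi-scale LPQ features (computing actual LPQ codes is beyond this sketch); data and dimensions are synthetic assumptions.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC

# Placeholder for multi-scale LPQ features: any per-image descriptor
# works here structurally.
def fake_multiscale_lpq(images):
    return images.reshape(len(images), -1)

rng = np.random.default_rng(3)
live = rng.normal(0.6, 0.1, (50, 16, 16))   # synthetic "live" prints
fake = rng.normal(0.4, 0.1, (50, 16, 16))   # synthetic "fake" prints
X = fake_multiscale_lpq(np.concatenate([live, fake]))
y = np.array([1] * 50 + [0] * 50)            # 1 = live fingerprint

# PCA reduces descriptor dimensionality, then an SVM decides liveness,
# mirroring the pipeline described above.
model = make_pipeline(PCA(n_components=20), SVC(kernel="rbf"))
model.fit(X, y)
print(model.score(X, y))
```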

Journal ArticleDOI
TL;DR: Based on the high-resolution gridded data (CN05) from 2416 station observations, a grid dataset of temperature and precipitation extreme indices at a resolution of 0.5° × 0.5° for the China region was developed using the approach recommended by the Expert Team on Climate Change Detection and Indices.

Abstract: Based on the high-resolution gridded data (CN05) from 2416 station observations, a grid dataset of temperature and precipitation extreme indices with a resolution of 0.5° × 0.5° for the China region was developed using the approach recommended by the Expert Team on Climate Change Detection and Indices. This article comprehensively presents the temporal and spatial changes of these indices for the time period 1961–2010. Results showed widespread significant changes in temperature extremes consistent with warming, for instance, decreases in cold extremes and increases in warm extremes over China. The warming in the coldest day and night is larger than in the warmest day and night, respectively, with the warming in the coldest night larger than in the coldest day and in the warmest night larger than in the warmest day. Changes in the number of cold and warm nights are more remarkable than those in cold and warm days. Changes in precipitation extremes are, in general, spatially more complex and exhibit a less widespread spatial coverage than the temperature indices; for instance, the patterns of annual total precipitation amount, average daily precipitation rate, and the proportion of heavy precipitation in total annual precipitation are similar, with negative trends in a southwest–northeast belt from Southwest China to Northeast China and positive trends in eastern China and northwestern China. The consistency of changes in climate extremes from CN05 with other datasets based on stations and reanalyses is also analysed.
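
As an illustration of how one such ETCCDI-style index is computed, here is a simplified warm-extreme example (TX90p) using a single fixed base-period percentile, rather than the calendar-day climatology the full recommendation prescribes; all data are synthetic.

```python
import numpy as np

def tx90p(daily_tmax, base_percentile):
    """Percentage of days with Tmax above the base-period 90th
    percentile, one of the ETCCDI warm extremes (simplified: a single
    fixed percentile instead of a calendar-day climatology)."""
    return 100.0 * np.mean(daily_tmax > base_percentile)

rng = np.random.default_rng(4)
base = rng.normal(20, 5, 365 * 30)   # synthetic base-period daily Tmax
p90 = np.percentile(base, 90)
year = rng.normal(21, 5, 365)        # one warmer synthetic year
print(f"TX90p = {tx90p(year, p90):.1f} % of days")
```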

Journal ArticleDOI
TL;DR: An overview of Big Data is presented, covering four issues, namely: the concepts, characteristics and processing paradigms of Big Data; the state-of-the-art techniques for decision making in Big Data; felicitous decision making applications of Big Data in social science; and the current challenges of Big Data as well as possible future directions.

Journal ArticleDOI
TL;DR: This paper constructs a special model known as RELAX-RSMN with a totally unimodular constraint coefficient matrix to solve the relaxed 0-1 ILP rapidly through linear programming.
Abstract: Barrier coverage of wireless sensor networks is an important issue in the detection of intruders who are attempting to cross a region of interest. However, in certain applications, barrier coverage cannot be satisfied after random deployment. In this paper, we study how mobile sensors can be efficiently relocated to achieve k-barrier coverage. In particular, two problems are studied: relocation of sensors with a minimum number of mobile sensors and formation of k-barrier coverage with minimum energy cost. These two problems are formulated as 0–1 integer linear programs (ILPs). The formulation is computationally intractable because of the integrality and complicated constraints. Therefore, we relax the integrality and complicated constraints of the formulation and construct a special model known as RELAX-RSMN with a totally unimodular constraint coefficient matrix to solve the relaxed 0–1 ILP rapidly through linear programming. Theoretical analysis and simulation were performed to verify the effectiveness of our approach.
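
The key point, that a totally unimodular constraint matrix lets the relaxed LP return integral solutions, can be illustrated on a toy assignment-style instance; this is not the paper's RELAX-RSMN model, just a minimal demonstration of the relaxation idea.

```python
import numpy as np
from scipy.optimize import linprog

# Toy instance: assign 3 mobile sensors to 3 barrier slots at minimum
# movement cost. The constraint matrix of an assignment problem is
# totally unimodular, so relaxing the 0-1 variables to [0, 1] still
# yields an integral LP optimum.
cost = np.array([[4.0, 1.0, 3.0],
                 [2.0, 0.0, 5.0],
                 [3.0, 2.0, 2.0]])
n = 3
c = cost.ravel()

A_eq = np.zeros((2 * n, n * n))
for i in range(n):
    A_eq[i, i * n:(i + 1) * n] = 1   # each sensor assigned exactly once
    A_eq[n + i, i::n] = 1            # each slot filled exactly once
b_eq = np.ones(2 * n)

res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=[(0, 1)] * (n * n))
print(res.x.reshape(n, n).round(3))  # a 0/1 matrix despite the relaxation
print("total cost:", res.fun)
```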

Journal ArticleDOI
TL;DR: A fast motion estimation (ME) method is proposed to reduce the encoding complexity of the H.265/HEVC encoder, based on the correlation of best motion vector selection among the different-size prediction modes; it achieves an average of 20% ME time saving compared with the original HM-TZSearch.
Abstract: High definition (HD) and ultra-HD videos are widely used in broadcasting applications. However, with the increased video resolution, the volume of raw HD visual data increases significantly, which becomes a challenge for storing, processing, and transmitting the HD visual data. The state-of-the-art video compression standard, H.265/High Efficiency Video Coding (HEVC), compresses raw HD visual data efficiently, but the high compression rate comes at the cost of a heavy computation load. Hence, reducing the encoding complexity becomes vital for the H.265/HEVC encoder to be used in broadcasting applications. In this paper, based on the correlation of best motion vector selection among the different-size prediction modes, we propose a fast motion estimation (ME) method to reduce the encoding complexity of the H.265/HEVC encoder. First, according to the prediction unit (PU) partition type, all PUs are classified into two classes, parent PUs and children PUs. Then, based on the correlation of best motion vector selection between the parent PU and children PUs, the block matching search process of the children PUs is adaptively skipped if their parent PU chooses the initial search point as its final optimal motion vector in the ME process. Experimental results show that the proposed method achieves an average of 20% ME time saving as compared with the original HM-TZSearch, while the rate-distortion performance degradation is negligible.
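
A simplified sketch of the skip rule described above, with a stand-in search function; the real decision operates on HEVC PU partitions inside HM's TZSearch, so treat this only as a structural illustration.

```python
def choose_children_mvs(tz_search, parent_pu, children_pus, init_mv=(0, 0)):
    """Simplified skip rule: when the parent PU's best motion vector is
    just the initial search point, the children PUs skip their own
    block-matching search and reuse it (an assumption of this sketch)."""
    parent_mv = tz_search(parent_pu)
    if parent_mv == init_mv:
        return {c: init_mv for c in children_pus}      # search skipped
    return {c: tz_search(c) for c in children_pus}     # full search

# Toy usage with a stand-in search function:
fake_tz = lambda block: (0, 0) if block == "static" else (3, -1)
print(choose_children_mvs(fake_tz, "static", ["c0", "c1"]))
print(choose_children_mvs(fake_tz, "moving", ["c0", "c1"]))
```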

Journal ArticleDOI
TL;DR: In this paper, the authors performed first-principles calculations to investigate the mechanical properties of monolayer borophene, including its ideal tensile strength and critical strain.
Abstract: Very recently, two-dimensional (2D) boron sheets (borophene) with rectangular structures were grown successfully on single-crystal Ag(111) substrates (Mannix et al 2015 Science 350 1513). The fabricated borophene is predicted to have unusual mechanical properties. We performed first-principles calculations to investigate the mechanical properties of monolayer borophene, including ideal tensile strength and critical strain. It was found that monolayer borophene can withstand stress up to 20.26 N m^−1 and 12.98 N m^−1 in the a and b directions, respectively. However, its critical strain was found to be small. In the a direction, the critical value is only 8%, which, to the best of our knowledge, is the lowest among all studied 2D materials. Our numerical results show that tensile strain applied in the b direction enhances the buckling height of borophene, resulting in an out-of-plane negative Poisson's ratio, which makes the boron sheet show superior mechanical flexibility along the b direction. The failure mechanism and phonon instability of monolayer borophene were also explored.

Journal ArticleDOI
TL;DR: PM2.5 pollution in China has incurred great health risks that are even worse than those of tobacco smoking, although there is still great potential to improve future air quality.

Journal ArticleDOI
TL;DR: An adaptive method aiming at spatial-temporal efficiency in a heterogeneous cloud environment is presented; a prediction model based on an optimized Kernel-based Extreme Learning Machine algorithm enables faster forecasting of job execution duration and space occupation, achieving a 26.6% improvement over the original scheme.
Abstract: A heterogeneous cloud system, for example, a Hadoop 2.6.0 platform, provides distributed but cohesive services with rich features on large-scale management, reliability, and error tolerance. Where big data processing is concerned, newly built cloud clusters meet the challenges of performance optimization focusing on faster task execution and more efficient usage of computing resources. Presently proposed approaches concentrate on temporal improvement, that is, shortening MapReduce time, but seldom focus on storage occupation; however, unbalanced cloud storage strategies could exhaust those nodes with heavy MapReduce cycles and further challenge the security and stability of the entire cluster. In this paper, an adaptive method is presented aiming at spatial-temporal efficiency in a heterogeneous cloud environment. A prediction model based on an optimized Kernel-based Extreme Learning Machine algorithm is proposed for faster forecasting of job execution duration and space occupation, which consequently facilitates task scheduling through a multi-objective algorithm called time- and space-optimized NSGA-II (TS-NSGA-II). Experimental results have shown that, compared with the original load-balancing scheme, our approach can save approximately 47–55 s on average per task execution. Simultaneously, differences of only 1.254 in hard disk occupation were achieved among all scheduled reducers, which is a 26.6% improvement over the original scheme. Copyright © 2016 John Wiley & Sons, Ltd.
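
For reference, here is a Kernel-based Extreme Learning Machine regressor in its common closed form (the formulation usually attributed to Huang et al.); the paper uses an optimized variant, so this baseline and its synthetic job-duration data are assumptions for illustration only.

```python
import numpy as np

def rbf_kernel(A, B, gamma=0.1):
    """Gaussian (RBF) kernel matrix between row vectors of A and B."""
    d = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d)

class KELM:
    """Baseline kernel ELM: beta = (I/C + K)^(-1) y,
    prediction f(x) = K(x, X_train) @ beta."""
    def __init__(self, C=1.0, gamma=0.1):
        self.C, self.gamma = C, gamma

    def fit(self, X, y):
        self.X = X
        K = rbf_kernel(X, X, self.gamma)
        self.beta = np.linalg.solve(np.eye(len(X)) / self.C + K, y)
        return self

    def predict(self, X_new):
        return rbf_kernel(X_new, self.X, self.gamma) @ self.beta

rng = np.random.default_rng(5)
X = rng.uniform(0, 1, (200, 4))    # hypothetical job features
y = 30 + 50 * X[:, 0] + 5 * rng.standard_normal(200)  # duration (s)
model = KELM(C=10.0).fit(X, y)
print(model.predict(X[:3]).round(1), y[:3].round(1))
```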

Journal ArticleDOI
TL;DR: This paper automatically learns spatio-temporal motion features for action recognition via an evolutionary method, i.e., genetic programming (GP), which evolves the motion feature descriptor on a population of primitive 3D operators (e.g., 3D-Gabor and wavelet).
Abstract: Extracting discriminative and robust features from video sequences is the first and most critical step in human action recognition. In this paper, instead of using handcrafted features, we automatically learn spatio-temporal motion features for action recognition. This is achieved via an evolutionary method, i.e., genetic programming (GP), which evolves the motion feature descriptor on a population of primitive 3D operators (e.g., 3D-Gabor and wavelet). In this way, scale- and shift-invariant features can be effectively extracted from both color and optical flow sequences. We intend to learn data-adaptive descriptors for different datasets with multiple layers, which makes full use of the knowledge to mimic the physical structure of the human visual cortex for action recognition and simultaneously reduces the GP search space to effectively accelerate the convergence of optimal solutions. In our evolutionary architecture, the average cross-validation classification error, which is calculated by a support-vector-machine classifier on the training set, is adopted as the evaluation criterion for the GP fitness function. After the entire evolution procedure finishes, the best-so-far solution selected by GP is regarded as the (near-)optimal action descriptor obtained. The GP-evolving feature extraction method is evaluated on four popular action datasets, namely KTH, HMDB51, UCF YouTube, and Hollywood2. Experimental results show that our method significantly outperforms other types of features, either hand-designed or machine-learned.

Journal ArticleDOI
TL;DR: A thorough experimental evaluation of 20 state-of-the-art tracking algorithms is presented with detailed analysis using different metrics and a large-scale database which contains 365 challenging image sequences of pedestrians and rigid objects is proposed.
Abstract: Numerous approaches on object tracking have been proposed during the past decade with demonstrated success. However, most tracking algorithms are evaluated on limited video sequences and annotations. For thorough performance evaluation, we propose a large-scale database which contains 365 challenging image sequences of pedestrians and rigid objects. The database covers 12 kinds of objects, and most of the sequences are captured from moving cameras. Each sequence is annotated with target location and occlusion level for evaluation. A thorough experimental evaluation of 20 state-of-the-art tracking algorithms is presented with detailed analysis using different metrics. The database is publicly available and evaluation can be carried out online for fair assessments of visual tracking algorithms.