
Showing papers on "Sorting" published in 2018


Journal ArticleDOI
20 Sep 2018-Cell
TL;DR: This work presents a machine-intelligence technology based on a radically different architecture that realizes real-time image-based intelligent cell sorting at an unprecedented rate and is expected to enable machine-based scientific discovery in biological, pharmaceutical, and medical sciences.

366 citations


Journal ArticleDOI
TL;DR: A feature selection approach is proposed based on a new multi-objective artificial bee colony algorithm integrated with a non-dominated sorting procedure and genetic operators; it outperformed the other methods in terms of both dimensionality reduction and classification accuracy.

236 citations


Journal ArticleDOI
TL;DR: In this paper, a theory of the location choices of heterogeneous firms in a variety of sectors across cities is proposed to account for the uneven distribution of economic activity in space, and the authors find that nearly half of the productivity advantage of large cities is due to firm sorting, the rest coming from agglomeration economies.
Abstract: To account for the uneven distribution of economic activity in space, I propose a theory of the location choices of heterogeneous firms in a variety of sectors across cities. In equilibrium, the distribution of city sizes and the sorting patterns of firms are uniquely determined and affect aggregate TFP and welfare. I estimate the model using French firm-level data and find that nearly half of the productivity advantage of large cities is due to firm sorting, the rest coming from agglomeration economies. I quantify the general equilibrium effects of place-based policies: policies that subsidize smaller cities have negative aggregate effects.

131 citations


Journal ArticleDOI
TL;DR: A priority sorting approach based on simplified model predictive control (MPC) is proposed for the modular multilevel converter (MMC), aimed at reducing the computational burden of the conventional MPC method while maintaining system performance, especially under high voltage levels.
Abstract: In this paper, a priority sorting approach based on simplified model predictive control (MPC) is proposed for modular multilevel converter (MMC). It aims at reducing the computational burden of conventional MPC method while maintaining the system performance, especially under high voltage levels. The proposed approach mainly consists of three parts, i.e., grid-side current control (GCC), circulating current control (CCC), and capacitor voltage balancing control (CVBC). The GCC and CCC are separately designed with simplified MPCs, avoiding the weight factor. Meanwhile, the redundant calculations are eliminated in GCC by considering the desired predicted output voltage of equivalent MMC model. To further minimize the optional combinations of the switching states, the CCC is constructed by utilizing the output of GCC and the arm current. Besides, a novel priority sorting approach is proposed for the CVBC to alleviate the sorting operation. The submodules are divided into three groups according to the detected capacitor voltages. Moreover, the groups are assigned with different priorities based on the arm current, and only one group needs the sorting process. Additionally, a reduced frequency approach is introduced to decrease the power loss in the steady state. The effectiveness of the proposed approach is validated by both simulation and experimental results.
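The capacitor voltage balancing step described above groups submodules by detected capacitor voltage and sorts only one group. A rough sketch of that idea, under assumed thresholds and a simplified selection rule (not the paper's exact scheme), could look like this:

```python
# Illustrative sketch (not the paper's exact rules): group submodules by
# capacitor voltage and sort only the prioritized boundary group for insertion.

def select_submodules(v_cap, i_arm, n_insert, v_low, v_high):
    """v_cap: list of capacitor voltages indexed by submodule.
    i_arm: arm current (>0 assumed to charge inserted capacitors).
    n_insert: number of submodules to insert this cycle.
    v_low, v_high: grouping thresholds (assumed, design-specific)."""
    low  = [i for i, v in enumerate(v_cap) if v < v_low]
    mid  = [i for i, v in enumerate(v_cap) if v_low <= v <= v_high]
    high = [i for i, v in enumerate(v_cap) if v > v_high]

    # Charging current: prefer low-voltage capacitors; discharging: the reverse.
    order = [low, mid, high] if i_arm > 0 else [high, mid, low]

    selected = []
    for group in order:
        if len(selected) + len(group) <= n_insert:
            selected.extend(group)          # whole group fits, no sorting needed
        else:
            remaining = n_insert - len(selected)
            # Only this boundary group is actually sorted.
            group.sort(key=lambda i: v_cap[i], reverse=(i_arm <= 0))
            selected.extend(group[:remaining])
            break
    return selected
```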

119 citations


Journal ArticleDOI
TL;DR: The results compared to other algorithms demonstrate the effectiveness of the MOFOA with knowledge-guided search in solving the multi-objective MSRCPSP.
Abstract: In this paper, a knowledge-guided multi-objective fruit fly optimization algorithm (MOFOA) is proposed for the multi-skill resource-constrained project scheduling problem (MSRCPSP) with the criteria of minimizing the makespan and the total cost simultaneously. First, a solution is represented by two lists, i.e., a resource list and a task list. Second, a minimum-total-cost rule is designed for the initialization according to the properties of the problem. Third, the smell-based search is implemented via neighborhood-based search operators specially designed for the MSRCPSP, while the vision-based search adopts the technique for order preference by similarity to an ideal solution (TOPSIS) together with non-dominated sorting to complete the multi-objective evaluation. In addition, a knowledge-guided search procedure is introduced to enhance the exploration of the FOA. Finally, the design-of-experiments (DOE) method is used to investigate the effect of parameter settings, and numerical tests based on benchmark instances are carried out. Comparisons with other algorithms demonstrate the effectiveness of the MOFOA with knowledge-guided search in solving the multi-objective MSRCPSP.
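The vision-based search above uses TOPSIS together with non-dominated sorting for multi-objective evaluation. Below is a minimal, generic TOPSIS sketch with equal weights and all criteria treated as "to be minimized"; the paper's exact weighting and normalization are not reproduced here and may differ.

```python
# Generic TOPSIS ranking sketch (equal weights, all criteria minimized).
import math

def topsis_scores(matrix):
    """matrix: rows = candidate solutions, columns = objective values (minimize)."""
    ncols = len(matrix[0])
    norms = [math.sqrt(sum(row[j] ** 2 for row in matrix)) for j in range(ncols)]
    R = [[row[j] / norms[j] for j in range(ncols)] for row in matrix]
    ideal = [min(R[i][j] for i in range(len(R))) for j in range(ncols)]  # best (min)
    worst = [max(R[i][j] for i in range(len(R))) for j in range(ncols)]  # worst (max)
    scores = []
    for row in R:
        d_best  = math.sqrt(sum((row[j] - ideal[j]) ** 2 for j in range(ncols)))
        d_worst = math.sqrt(sum((row[j] - worst[j]) ** 2 for j in range(ncols)))
        scores.append(d_worst / (d_best + d_worst))  # closer to 1 = better
    return scores

# Hypothetical (makespan, total cost) values for three candidate schedules.
print(topsis_scores([(30, 900), (35, 700), (50, 650)]))
```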

102 citations


Journal ArticleDOI
TL;DR: A new non-dominated sorting multi-objective whale optimization algorithm (NSMOWOA) is proposed for content-based image retrieval and shows good performance on the content-based image retrieval problem in terms of recall and precision.
Abstract: In recent years, massive digital image collections have accumulated in many fields of our lives, which has driven the search for methods to search and retrieve these images efficiently. Content-based retrieval is one of the popular methods for retrieving images; it depends on color, texture and shape descriptors to extract features from images. However, the performance of content-based image retrieval methods depends on the size of the feature set extracted from images and on the classification accuracy. Therefore, this problem is treated as multi-objective, and several methods such as NSGA-II and NSMOPSO have been used to handle it. However, these methods have drawbacks: their time and space complexity are large because they use traditional non-dominated sorting methods. In this paper, a new non-dominated sorting multi-objective whale optimization algorithm (NSMOWOA) is proposed for content-based image retrieval. The proposed method avoids the drawbacks of other non-dominated sorting multi-objective methods that have been used for content-based image retrieval by reducing the space and time complexity. The results of the NSMOWOA showed good performance on the content-based image retrieval problem in terms of recall and precision.
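For context, "non-dominated sorting" means repeatedly extracting the set of solutions not dominated by any other; a naive version costs quadratic time per front, which is the overhead such papers try to reduce. A minimal generic illustration (not the NSMOWOA procedure itself) for two minimization objectives:

```python
# Minimal illustration of non-dominated sorting for minimization objectives.
# Generic background only, not the NSMOWOA procedure from the paper.

def dominates(a, b):
    """True if solution a dominates b (no worse in all objectives, better in one)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def non_dominated_fronts(objs):
    """Naive O(N^2) front extraction; this quadratic cost is what cheaper
    sorting schemes try to avoid."""
    remaining = list(range(len(objs)))
    fronts = []
    while remaining:
        front = [i for i in remaining
                 if not any(dominates(objs[j], objs[i]) for j in remaining if j != i)]
        fronts.append(front)
        remaining = [i for i in remaining if i not in front]
    return fronts

# Example: (feature-subset size, classification error) pairs to be minimized.
print(non_dominated_fronts([(10, 0.12), (25, 0.08), (10, 0.20), (40, 0.08)]))
```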

99 citations


Journal ArticleDOI
TL;DR: A novel multi-objective distributed generation planning methodology for distribution networks is proposed that considers correlations among uncertainties, i.e., wind speed, light intensity and load demand; a balance between economy and security is achieved by the non-dominated sorting genetic algorithm II.

89 citations


Journal ArticleDOI
TL;DR: In the proposed algorithm, an elitist non-dominated sorting method and a modified crowding-distance sorting method are introduced to acquire an evenly distributed Pareto-optimal front and to enhance the learning ability of the population.

87 citations


Journal ArticleDOI
TL;DR: In this article, the authors analyse university-level factors that affect the sorting of Chinese international students in higher education and propose a method to identify the factors that influence the mobility of international students.
Abstract: This research contributes to the booming literature on the mobility of international students in higher education. We analyse university-level factors that affect the sorting of Chinese internation...

87 citations


Journal ArticleDOI
Yan Zheng, Jiarui Bai, Jingna Xu, Xiayang Li, Yimin Zhang
TL;DR: A hyperspectral imaging system (HIS) combined with the identification model was verified for practical application on unknown plastics, suggesting that the discrimination model has potential for an on-line characterization and sorting platform for waste plastics based on HIS.

82 citations


Journal ArticleDOI
TL;DR: This paper presents an application of nondominated sorting genetic algorithm II (NSGA-II) for multiobjective feature selection in power quality disturbances classification and shows quick convergence, admirable accuracy, and reduced computational time.
Abstract: This paper presents an application of nondominated sorting genetic algorithm II (NSGA-II) for multiobjective feature selection in power quality disturbances classification. Classification error and number of features are collectively minimized to ensure good accuracy and feasible computation time. NSGA-II gives different Pareto-optimal solutions based on the combination of objectives. Considering equal priority for both the objectives, a fitness function is provided to retrieve the best solution set from the first Pareto-front. S-transform and time–time transform are employed for detection and feature extraction. Decision tree is used for classification. The proposed technique is tested on disturbances simulated as per IEEE-1159 standards and real disturbances acquired from an experimental setup. The results show quick convergence, admirable accuracy, and reduced computational time.
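One common way to realize "equal priority for both objectives" when retrieving a solution from the first Pareto front is to minimize an equally weighted sum of normalized objectives. The paper's actual fitness function is not reproduced here, so the following is an assumed illustration:

```python
# One common compromise rule for the first Pareto front when classification
# error and feature count get equal priority. The paper's exact fitness
# function may differ; this is an assumed illustration.

def best_compromise(front):
    """front: list of (error, n_features) tuples from the first Pareto front."""
    errs  = [e for e, _ in front]
    feats = [f for _, f in front]

    def norm(x, lo, hi):
        return 0.0 if hi == lo else (x - lo) / (hi - lo)

    scores = [0.5 * norm(e, min(errs), max(errs)) + 0.5 * norm(f, min(feats), max(feats))
              for e, f in front]
    return front[scores.index(min(scores))]

print(best_compromise([(0.04, 18), (0.05, 11), (0.09, 6)]))
```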

Journal ArticleDOI
TL;DR: This study proposes a novel VIKOR-based green supplier sorting methodology called VIKORSORT, which evaluates the environmental performance of suppliers and sorts them into the predefined ordered classes.
Abstract: Depleting natural resources and limited amount of landfill areas have forced many governments to impose stricter measures on environmental performance. In order to comply with those measures and to have a better environmental image, companies are investing heavily in environmental, social and economic responsibility issues. Moreover, they continuously track the environmental performance of their suppliers. Many green supplier evaluation and ranking methodologies were proposed in the literature in order to assist companies in the environmental performance evaluation of suppliers. However, the number of studies on the sorting of suppliers based on environmental criteria is very limited. In this study, we fill this research gap by proposing a novel VIKOR-based green supplier sorting methodology called VIKORSORT. This methodology evaluates the environmental performance of suppliers and sorts them into the predefined ordered classes. The proposed methodology can easily be embedded into an expert system which can suggest a suitable green supplier development program for each class.

Journal ArticleDOI
TL;DR: A novel hybrid multi-objective self-adaptive differential evolution algorithm is developed that benefits from variable neighborhood search with fuzzy dominance sorting; the results show that the solution efficiency of a network can be balanced with its effectiveness through customer satisfaction.

Journal ArticleDOI
TL;DR: The NSTLBO algorithm is applied to solve the multi-objective optimization problems of three machining processes, namely turning, wire-electric-discharge machining and laser cutting, and two micro-machining processes, and the Pareto-optimal set of solutions for each optimization problem is obtained.
Abstract: Selection of optimum machining parameters is vital to machining processes in order to ensure product quality, reduce machining cost, increase productivity and conserve resources for sustainability. Hence, in this work an a posteriori multi-objective optimization algorithm named Non-dominated Sorting Teaching–Learning-Based Optimization (NSTLBO) is applied to solve the multi-objective optimization problems of three machining processes, namely turning, wire-electric-discharge machining and laser cutting, and two micro-machining processes, namely focused ion beam micro-milling and micro wire-electric-discharge machining. The NSTLBO algorithm incorporates a non-dominated sorting approach and a crowding distance computation mechanism to maintain a diverse set of solutions and to provide a Pareto-optimal set of solutions in a single simulation run. The results of the NSTLBO algorithm are compared with those obtained using GA, NSGA-II, PSO, an iterative search method and MOTLBO, and are found to be competitive. The Pareto-optimal set of solutions for each optimization problem is obtained and reported. These Pareto-optimal sets of solutions will help the decision maker in volatile scenarios and are useful for real production systems.
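The crowding-distance mechanism mentioned above is the standard NSGA-II-style diversity measure. A compact sketch of that computation (generic, not specific to NSTLBO) follows:

```python
# Standard NSGA-II-style crowding distance for one non-dominated front,
# used to keep the retained solutions spread out along the front.

def crowding_distance(front):
    """front: list of objective tuples; returns one distance per solution."""
    n = len(front)
    m = len(front[0])
    dist = [0.0] * n
    for k in range(m):
        order = sorted(range(n), key=lambda i: front[i][k])
        lo, hi = front[order[0]][k], front[order[-1]][k]
        dist[order[0]] = dist[order[-1]] = float("inf")  # boundary points always kept
        if hi == lo:
            continue
        for rank in range(1, n - 1):
            i = order[rank]
            dist[i] += (front[order[rank + 1]][k] - front[order[rank - 1]][k]) / (hi - lo)
    return dist

# Example with (makespan-like, cost-like) objectives to be minimized.
print(crowding_distance([(1.0, 9.0), (2.0, 6.0), (4.0, 4.0), (7.0, 1.0)]))
```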

Journal ArticleDOI
TL;DR: The performance results show that the SGA initiated with connection demands sorted by an increasing number of possible space and spectrum assignment layouts (SAL) achieves the best near-optimal solution for a small network experiment, and the ascending SAL number (ASN) policy shows the best performance for realistic networks.
Abstract: Space division multiplexing (SDM) and elastic optical networking (EON) have been proposed to increase the transmission capacity and flexibility of optical transport networks. The problem of allocating available resources over the EON network is called routing and spectrum assignment (RSA); it is called routing, modulation level, and spectrum assignment (RMLSA) if modulation adaptivity is enabled. Considering SDM adds more flexibility to the resource allocation problem. In this paper, we formulate the routing, modulation level, space, and spectrum assignment (RMLSSA) problem as an integer linear program (ILP) in a path-based manner for static traffic. Next, the stepwise greedy algorithm (SGA) and four different sorting policies to initiate the algorithm are proposed as a heuristic method to find a near-optimal solution of the RMLSSA problem. Finally, the paper evaluates the effectiveness of the sorting policies and the SGA algorithm with different metrics. The performance results show that the SGA initiated with connection demands sorted by an increasing number of possible space and spectrum assignment layouts (SAL) can achieve the best near-optimal solution for a small network experiment. Moreover, the ascending SAL number (ASN) policy shows the best performance for realistic networks.

Journal ArticleDOI
01 Feb 2018
TL;DR: Coconut, as presented in this paper, organizes data series based on a z-order curve, keeping similar series close to each other in the sorted order, and uses bulk-loading techniques that rely on sorting to quickly build a contiguous index using large sequential disk I/Os.
Abstract: Many modern applications produce massive amounts of data series that need to be analyzed, requiring efficient similarity search operations. However, the state-of-the-art data series indexes that are used for this purpose do not scale well for massive datasets in terms of performance, or storage costs. We pinpoint the problem to the fact that existing summarizations of data series used for indexing cannot be sorted while keeping similar data series close to each other in the sorted order. This leads to two design problems. First, traditional bulk-loading algorithms based on sorting cannot be used. Instead, index construction takes place through slow top-down insertions, which create a non-contiguous index that results in many random I/Os. Second, data series cannot be sorted and split across nodes evenly based on their median value; thus, most leaf nodes are in practice nearly empty. This further slows down query speed and amplifies storage costs. To address these problems, we present Coconut. The first innovation in Coconut is an inverted, sortable data series summarization that organizes data series based on a z-order curve, keeping similar series close to each other in the sorted order. As a result, Coconut is able to use bulk-loading techniques that rely on sorting to quickly build a contiguous index using large sequential disk I/Os. We then explore prefix-based and median-based splitting policies for bottom-up bulk-loading, showing that median-based splitting outperforms the state of the art, ensuring that all nodes are densely populated. Overall, we show analytically and empirically that Coconut dominates the state-of-the-art data series indexes in terms of construction speed, query speed, and storage costs.
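The key idea of a z-order (Morton) key is to interleave the bits of quantized per-segment summaries so that numeric order on the key keeps similar summaries adjacent. The sketch below illustrates only that keying idea; it is not Coconut's actual summarization format:

```python
# Generic illustration of a z-order (Morton) sort key: interleave the bits of
# quantized per-segment summaries so that series with similar summaries end up
# near each other when sorted.

def z_order_key(symbols, bits=8):
    """symbols: per-segment quantized values, each in [0, 2**bits)."""
    key = 0
    for b in reversed(range(bits)):            # from most- to least-significant bit
        for s in symbols:
            key = (key << 1) | ((s >> b) & 1)  # take bit b of every segment in turn
    return key

# Hypothetical quantized summaries for three data series.
series_summaries = {"a": [3, 200, 17], "b": [5, 198, 20], "c": [250, 10, 90]}
for name in sorted(series_summaries, key=lambda n: z_order_key(series_summaries[n])):
    print(name)  # 'a' and 'b' (similar summaries) sort next to each other
```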

Journal ArticleDOI
TL;DR: The results show that taking into account the processes of integration and consolidation at the cross-dock, together with the processes of collecting products from suppliers and delivering them to customers, makes it possible to obtain more accurate estimates of the final date of delivering the products to the customers.

Journal ArticleDOI
TL;DR: In this paper, the authors propose a unifying theory of production where management resolves a tradeoff between hiring more versus better workers, and provide a condition for sorting that captures this tradeoff.
Abstract: Two cornerstones of empirical and policy analysis of firms, in macro, labor and industrial organization, are the determinants of the firm size distribution and the determinants of sorting between workers and firms. We propose a unifying theory of production where management resolves a tradeoff between hiring more versus better workers. The span of control or size is therefore intimately intertwined with the sorting pattern. We provide a condition for sorting that captures this tradeoff between the quantity and quality of workers and that generalizes Becker's sorting condition. A system of differential equations determines the equilibrium allocation, the firm size, and wages, and allows us to characterize the allocation of the quality and quantity of labor to firms of different productivity. We show that our model nests a large number of widely used existing models. We also augment the model to incorporate labor market frictions in the presence of sorting with large firms.

Journal ArticleDOI
TL;DR: A novel area-efficient and power-efficient approach to sorting networks, based on "unary processing," is proposed and validated with two implementations of an important application of sorting, median filtering; the result is a low-cost, energy-efficient implementation of median filtering with only a slight accuracy loss.
Abstract: Sorting is a common task in a wide range of applications from signal and image processing to switching systems. For applications that require high performance, sorting is often performed in hardware with application-specific integrated circuits or field-programmable gate arrays. Hardware cost and power consumption are the dominant concerns. The usual approach is to wire up a network of compare-and-swap units in a configuration called the Batcher (or bitonic) network. Such networks can readily be pipelined. This paper proposes a novel area-efficient and power-efficient approach to sorting networks, based on “unary processing.” In unary processing, numbers are encoded uniformly by a sequence of one value (say 1) followed by a sequence of the other value (say 0) in a stream of 0’s and 1’s, with the value defined by the fraction of 1’s in the stream. Synthesis results of complete sorting networks show up to 92% area and power savings compared to conventional binary implementations. However, the latency increases. To mitigate the increased latency, this paper uses a novel time-encoding of data. The approach is validated with two implementations of an important application of sorting: median filtering. The result is a low-cost, energy-efficient implementation of median filtering with only a slight accuracy loss compared to conventional implementations.
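With the unary encoding described above, a value is a run of 1's followed by 0's, so a compare-and-swap unit degenerates to a bitwise AND (minimum) and OR (maximum) of the two streams, which is where the area and power savings come from. A small software illustration follows; hardware details such as the time-encoding are omitted:

```python
# Software illustration of unary processing: a value v in [0, 1] becomes a stream
# of ones followed by zeros, and a compare-and-swap unit reduces to bitwise
# AND (min) and OR (max) on the two streams.

N = 16  # stream length (resolution)

def encode(v):
    ones = round(v * N)
    return [1] * ones + [0] * (N - ones)

def decode(stream):
    return sum(stream) / N

def compare_and_swap(sa, sb):
    lo = [a & b for a, b in zip(sa, sb)]   # AND gate -> min of the two values
    hi = [a | b for a, b in zip(sa, sb)]   # OR gate  -> max of the two values
    return lo, hi

lo, hi = compare_and_swap(encode(0.75), encode(0.25))
print(decode(lo), decode(hi))  # 0.25 0.75
```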

Journal ArticleDOI
01 Aug 2018-Ecology
TL;DR: It is demonstrated that metacommunities driven by neutral dynamics, alone or in combination with species sorting, lead to inflated estimates and Type I error rates when testing for the importance of species sorting, and a general and flexible new variation partitioning procedure is proposed to adjust the environmental fraction for spurious contributions due to spatial autocorrelation.
Abstract: The methods of direct gradient analysis and variation partitioning are the most widely used frameworks to evaluate the contributions of species sorting to metacommunity structure. In many cases, however, species are also driven by spatial processes that are independent of environmental heterogeneity (e.g., neutral dynamics). As such, spatial autocorrelation can occur independently in both species (due to limited dispersal) and the environmental data, leading to spurious correlations between species distributions and the spatialized (i.e., spatially autocorrelated) environment. In these cases, the method of variation partitioning may present high Type I error rates (i.e., reject the null hypothesis more often than the pre-established critical level) and inflated estimates regarding the environmental component that is used to estimate the importance of species sorting. In this paper, we (1) demonstrate that metacommunities driven by neutral dynamics (via limited dispersal) alone or in combination with species sorting leads to inflated estimates and Type I error rates when testing for the importance of species sorting; and (2) propose a general and flexible new variation partitioning procedure to adjust for spurious contributions due to spatial autocorrelation from the environmental fraction. We used simulated metacommunity data driven by pure neutral, pure species sorting, and mixed (i.e., neutral + species sorting dynamics) processes to evaluate the performances of our new methodological framework. We also demonstrate the utility of the proposed framework with an empirical plant dataset in which we show that half of the variation initially due to the environment by the standard variation partitioning framework was due to spurious correlations.

Journal ArticleDOI
TL;DR: PLO-WNSGA II is effective and efficient for the control of basin-wide floods in a mixed reservoir system and is more resilient under inflow uncertainty.

Journal ArticleDOI
TL;DR: This review outlines contemporary techniques for cell sorting and manipulation, and provides an in-depth view into the existing and prospective uses of light for cell sorting and manipulation.
Abstract: Contemporary biomedical research requires the development of novel techniques for sorting and manipulation of cells within the framework of a microfluidic chip. The desired functions of a microfluidic chip are achieved by combining and integrating passive methods that utilize the channel geometry and structure, as well as active methods that include magnetic, electrical, acoustic and optical forces. The application of magnetic, electric and acoustics-based methods for sorting and manipulation has been, and remains, under continuous scrutiny. Optics-based methods, in contrast, have not been explored to the same extent, largely because of the complicated, expensive and bulky setup required for carrying out such studies. However, advances in optical beam shaping, computer hardware and software have opened up new opportunities for applying light to the development of advanced sorting and manipulation techniques. This review outlines contemporary techniques for cell sorting and manipulation, and provides an in-depth view into the existing and prospective uses of light for cell sorting and manipulation.

Journal ArticleDOI
TL;DR: The proposed NSGWO algorithm first collects all non-dominated Pareto-optimal solutions in an archive until the final iteration limit is reached; its efficiency is validated in terms of Execution Time (ET) and its effectiveness in terms of Generalized Distance (GD) and Diversity Metric (DM) on standard unconstrained, constrained and engineering design problems.

Journal ArticleDOI
TL;DR: A novel channel design with a series of reverse wavy channel structures is presented for sheathless inertial particle focusing and cell sorting, enabling the sorting of cancer cells from whole blood without the use of sheath flows; it could allow fast and efficient cell sorting in many biomedical applications.
Abstract: Inertial microfluidics utilizing passive hydrodynamic forces has been attracting significant attention in the field of precise microscale manipulation owing to its low cost, simplicity and high throughput. In this paper, we present a novel channel design with a series of reverse wavy channel structures for sheathless inertial particle focusing and cell sorting. A single wavy channel unit consists of four semicircular segments, which produce periodically reversed Dean secondary flow along the cross-section of the channel. The balance between the inertial lift force and the Dean drag force results in deterministic equilibrium focusing positions, which also depends on the size of the flow-through particles and cells. Six sizes of fluorescent microspheres (15, 10, 7, 5, 3 and 1 μm) were used to study the size-dependent inertial focusing behavior. Our novel design with sharp-turning subunits could effectively focus particles as small as 3 μm, the average size of platelets, enabling the sorting of cancer cells from whole blood without the use of sheath flows. Utilizing an optimized channel design, we demonstrated the size-based sorting of MCF-7 breast cancer cells spiked in diluted whole blood samples without using sheath flows. A single sorting process was able to recover 89.72% of MCF-7 cells from the original mixture and enrich MCF-7 cells from an original purity of 5.3% to 68.9% with excellent cell viability. A technique for sorting tiny objects based on their size as they flow through narrow channels offers a simple system that could be used to separate cells and smaller bodies such as blood platelets. Separating different cells and other components of biological fluids is a vital aspect of medical research toward the development of new therapies. All existing methods have limitations, and improved techniques are eagerly sought. Ye Ai and colleagues at Singapore University of Technology and Design developed and tested a simple method based on the forces particles experience as they flow through channels with semi-circular sections linked in repeatedly reversing directions. The researchers demonstrated the general performance of their technique using fluorescent microspheres. They then successfully separated cancer cells from blood. The method could allow fast and efficient cell-sorting in many biomedical applications.

Posted Content
TL;DR: In this paper, the authors study optimal spatial policies in quantitative trade and geography frameworks with spillovers and sorting of heterogeneous workers and quantify the aggregate and distributional effects of implementing these policies in the U.S. economy.
Abstract: We study optimal spatial policies in quantitative trade and geography frameworks with spillovers and sorting of heterogeneous workers. We first characterize efficient spatial transfers and the labor subsidies that would implement them. Then, we quantify the aggregate and distributional effects of implementing these policies in the U.S. economy. Under homogeneous workers and constant-elasticity spillovers, a constant labor subsidy over space restores efficiency regardless of micro heterogeneity in fundamentals and trade costs. In that case, the quantification suggests that the observed spatial transfers in the U.S. are close to efficient. Spillovers across heterogeneous workers create an additional rationale for place-specific subsidies to attain optimal sorting. Under heterogeneous workers, the quantification suggests that optimal spatial policies may require stronger redistribution towards low-wage cities than in the data, reduce wage inequality in larger cities, weaken spatial sorting by skill, and lead to significant welfare gains. Spillovers across different types of workers are a key driving force behind these results.

Journal ArticleDOI
TL;DR: The application of the hybrid AHP–TOPSIS-2N model proved to be consistent and robust, generating two priority sorting possibilities aligned with the strategic situation of the organization and a range of improvements in terms of governance and processes for the ITGC of the company.
Abstract: The purpose of this paper is to analyze the results obtained by the information technology (IT) governance committee (ITGC) of a company undergoing a strategic realignment in the sorting and priori...

Journal ArticleDOI
TL;DR: It is demonstrated that the present method/algorithm performs better at classifying spikes and neurons and at assessing their modulating properties than other methods currently used in neurophysiology.
Abstract: Spike sorting is one of the most important data analysis problems in neurophysiology. The precision in all steps of the spike-sorting procedure critically affects the accuracy of all subsequent analyses. After data preprocessing and spike detection have been carried out properly, both feature extraction and spike clustering are the most critical subsequent steps of the spike-sorting procedure. The proposed spike sorting approach comprised a new feature extraction method based on shape, phase, and distribution features of each spike (hereinafter SS-SPDF method), which reveal significant information of the neural events under study. In addition, we applied an efficient clustering algorithm based on K-means and template optimization in phase space (hereinafter K-TOPS) that included two integrative clustering measures (validity and error indices) to verify the cohesion-dispersion among spike events during classification and the misclassification of clustering, respectively. The proposed method/algorithm was tested on both simulated data and real neural recordings. The results obtained for these datasets suggest that our spike sorting approach provides an efficient way for sorting both single-unit spikes and overlapping waveforms. By analyzing raw extracellular recordings collected from the rostral-medial prefrontal cortex (rmPFC) of behaving rabbits during classical eyeblink conditioning, we have demonstrated that the present method/algorithm performs better at classifying spikes and neurons and at assessing their modulating properties than other methods currently used in neurophysiology.
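The clustering stage described above builds on K-means over extracted spike features. The sketch below shows only that generic K-means step on hypothetical two-dimensional features; it omits the paper's template optimization in phase space and the validity/error indices:

```python
# Generic k-means sketch on 2-D spike features (e.g., two principal components).
# The paper's K-TOPS algorithm adds template optimization in phase space on top
# of this basic clustering step; that part is not reproduced here.
import random

def kmeans(points, k, iters=50, seed=0):
    random.seed(seed)
    centers = random.sample(points, k)
    for _ in range(iters):
        # Assign each spike feature vector to its nearest center.
        labels = [min(range(k),
                      key=lambda c: sum((p - q) ** 2 for p, q in zip(pt, centers[c])))
                  for pt in points]
        # Recompute each center as the mean of its assigned points.
        for c in range(k):
            members = [pt for pt, l in zip(points, labels) if l == c]
            if members:
                centers[c] = tuple(sum(v) / len(members) for v in zip(*members))
    return labels, centers

features = [(0.1, 0.2), (0.0, 0.3), (2.1, 1.9), (2.0, 2.2)]
print(kmeans(features, k=2))
```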

Journal ArticleDOI
TL;DR: This letter proposes an iterative algorithm, without any sorting operation, for projection onto the parity-check polytope; it has a worst-case complexity linear in the input dimension, compared with the super-linear complexity of existing algorithms.
Abstract: Alternating direction method of multipliers (ADMM) is a popular technique for linear-programming decoding of low-density parity-check codes. The computational complexity of ADMM is dominated by the Euclidean projection of a real-valued vector onto a parity-check polytope. Existing algorithms for such a projection all require sorting operations, which happen to be the most complex part of the projection. In this letter, we propose an iterative algorithm, without sorting operation, for projection onto the parity-check polytope. The proposed algorithm has a worst case complexity linear in the input dimension compared with the super-linear complexity of existing algorithms.
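As background on why sorting dominates the cost of existing projection algorithms: a classic example is the sort-based Euclidean projection onto the probability simplex, the kind of O(n log n) building block that such polytope projections typically rely on. The sketch below is that textbook routine only, not the letter's sorting-free algorithm:

```python
# Background only: the classic sort-based Euclidean projection onto the
# probability simplex {x : x_i >= 0, sum(x) = 1}. Shown to illustrate the
# sorting step that the letter's iterative method avoids; this is NOT the
# proposed parity-check polytope projection.

def project_simplex(y):
    u = sorted(y, reverse=True)          # the O(n log n) sorting step
    css = 0.0
    tau = 0.0
    for i, ui in enumerate(u, start=1):
        css += ui
        if ui - (css - 1.0) / i > 0:     # keep the largest index satisfying this
            tau = (css - 1.0) / i
    return [max(v - tau, 0.0) for v in y]

print(project_simplex([0.6, 1.3, -0.2]))  # -> [0.15, 0.85, 0.0]
```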

Journal ArticleDOI
TL;DR: Computational comparisons on benchmark instances indicate the superiority of Im-NSGA-II over NSGA-II and MOGA, and an empirical study in Chongqing, China confirms the practicability of the solution approach.

Journal ArticleDOI
TL;DR: A new standard for defining and assessing categories and qualities of used textiles is suggested, adapted to real contemporary sorting technologies, and tested on waste samples.