Author

Praveen Kumar Tripathi

Bio: Praveen Kumar Tripathi is an academic researcher from the University of Texas at Arlington. The author has contributed to research in topics: Cluster analysis & Metaheuristic. The author has an h-index of 7 and has co-authored 17 publications receiving 609 citations. Previous affiliations of Praveen Kumar Tripathi include the Indian Statistical Institute & Stony Brook University.

Papers
Journal ArticleDOI
TL;DR: A new diversity parameter has been used to ensure sufficient diversity amongst the solutions of the non-dominated front, while at the same time retaining convergence to the Pareto-optimal front.

482 citations

01 Jan 2007
TL;DR: The AMOPSO algorithm's novelty lies in its adaptive nature, which is attained by incorporating the inertia and acceleration coefficients as control variables alongside the usual optimization variables and evolving these through the swarming procedure.
Abstract: In this article we describe a novel Particle Swarm Optimization (PSO) approach to Multi-objective Optimization (MOO) called Adaptive Multi-objective Particle Swarm Optimization (AMOPSO). The AMOPSO algorithm's novelty lies in its adaptive nature, which is attained by incorporating the inertia and acceleration coefficients as control variables alongside the usual optimization variables and evolving these through the swarming procedure. A new diversity parameter has been used to ensure sufficient diversity amongst the solutions of the non-dominated front. AMOPSO has been compared with some recently developed multi-objective PSO techniques and evolutionary algorithms on nine function optimization problems, using different performance measures.
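A minimal sketch of the adaptive mechanism this abstract describes, assuming a plain single-objective PSO loop for brevity (AMOPSO itself is multi-objective and maintains an archive of non-dominated solutions). This is not the authors' code; the function names, parameter ranges, and bounds below are illustrative assumptions. The key point is that each particle's inertia weight and acceleration coefficients are appended to its position vector, so they are updated by the same swarming procedure as the ordinary decision variables:

```python
import numpy as np

# Illustrative sketch only, not the authors' AMOPSO implementation.
# Each particle's vector is [x_1 ... x_d, w, c1, c2]: the inertia weight and
# acceleration coefficients are treated as extra decision variables and are
# therefore evolved together with the swarm.

def sphere(x):
    """Toy single objective used only to make the sketch runnable."""
    return float(np.sum(x ** 2))

def adaptive_pso(obj, dim=5, n_particles=20, iters=200, seed=0):
    rng = np.random.default_rng(seed)
    pos = rng.uniform(-5.0, 5.0, size=(n_particles, dim + 3))
    # Last three columns hold the per-particle control parameters w, c1, c2.
    pos[:, dim:] = rng.uniform([0.4, 1.0, 1.0], [0.9, 2.5, 2.5], size=(n_particles, 3))
    vel = np.zeros_like(pos)
    pbest = pos.copy()
    pbest_val = np.array([obj(p[:dim]) for p in pos])
    gbest = pbest[np.argmin(pbest_val)].copy()

    for _ in range(iters):
        for i in range(n_particles):
            w, c1, c2 = pos[i, dim:]          # adaptive control variables of particle i
            r1, r2 = rng.random(dim + 3), rng.random(dim + 3)
            vel[i] = w * vel[i] + c1 * r1 * (pbest[i] - pos[i]) + c2 * r2 * (gbest - pos[i])
            pos[i] += vel[i]
            # Keep the control parameters inside a sensible range.
            pos[i, dim:] = np.clip(pos[i, dim:], [0.1, 0.5, 0.5], [1.0, 3.0, 3.0])
            val = obj(pos[i, :dim])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i].copy(), val
                if val < obj(gbest[:dim]):
                    gbest = pos[i].copy()
    return gbest[:dim], obj(gbest[:dim])

if __name__ == "__main__":
    best_x, best_f = adaptive_pso(sphere)
    print(best_f)
```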

63 citations

Proceedings ArticleDOI
01 Sep 2007
TL;DR: The AMOPSO algorithm's novelty lies in its adaptive nature, which is attained by incorporating the inertia and acceleration coefficients as control variables alongside the usual optimization variables and evolving these through the swarming procedure.
Abstract: In this article we describe a novel Particle Swarm Optimization (PSO) approach to Multi-objective Optimization (MOO) called Adaptive Multi-objective Particle Swarm Optimization (AMOPSO). The AMOPSO algorithm's novelty lies in its adaptive nature, which is attained by incorporating the inertia and acceleration coefficients as control variables alongside the usual optimization variables and evolving these through the swarming procedure. A new diversity parameter has been used to ensure sufficient diversity amongst the solutions of the non-dominated front. AMOPSO has been compared with some recently developed multi-objective PSO techniques and evolutionary algorithms on nine function optimization problems, using different performance measures.

28 citations

Proceedings ArticleDOI
01 Sep 2015
TL;DR: K-DBSCAN is a novel density-based spatial clustering algorithm whose main focus is identifying clusters of points with similar spatial density, in contrast to many other approaches whose main focus is spatial contiguity.
Abstract: Spatial clustering is a very important tool in the analysis of spatial data. In this paper, we propose a novel density-based spatial clustering algorithm called K-DBSCAN, whose main focus is identifying clusters of points with similar spatial density. This contrasts with many other approaches, whose main focus is spatial contiguity. The strength of K-DBSCAN lies in finding arbitrarily shaped clusters in variable-density regions. Moreover, it can also discover clusters with overlapping spatial regions but differing density levels. The goal is to differentiate the most dense regions from lower-density regions, with spatial contiguity as the secondary goal. The original DBSCAN fails to discover clusters with variable density and overlapping regions. The OPTICS and Shared Nearest Neighbour (SNN) algorithms can cluster variable-density datasets, but both have their own limitations: both fail to detect overlapping clusters, and while handling varying density both merge points from different density levels. K-DBSCAN has two phases: first, it divides all data objects into different density levels to identify the natural densities present in the dataset; then, it extracts the clusters using a modified version of DBSCAN. Experimental results on both synthetic data and a real-world spatial dataset demonstrate the effectiveness of our clustering algorithm.
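A rough sketch of the two-phase structure described above, assuming a k-nearest-neighbour density estimate and a k-means split into density levels; the actual K-DBSCAN phase definitions, its density-level identification, and its modified DBSCAN are the paper's own and are not reproduced here, so every parameter below is an assumption:

```python
import numpy as np
from sklearn.cluster import DBSCAN, KMeans
from sklearn.neighbors import NearestNeighbors

# Sketch of the two-phase idea only, not the authors' K-DBSCAN:
# (1) estimate each point's local density and split the data into density
# levels, (2) cluster each level separately with DBSCAN.

def density_level_dbscan(X, n_levels=2, k=5, min_samples=5):
    # Phase 1: k-NN based density estimate (inverse of mean k-NN distance).
    nn = NearestNeighbors(n_neighbors=k + 1).fit(X)
    dists, _ = nn.kneighbors(X)
    density = 1.0 / (dists[:, 1:].mean(axis=1) + 1e-12)

    # Partition points into density levels by clustering the density values.
    levels = KMeans(n_clusters=n_levels, n_init=10).fit_predict(density.reshape(-1, 1))

    # Phase 2: run DBSCAN inside each density level with a level-specific eps.
    labels = np.full(len(X), -1)
    next_label = 0
    for lvl in range(n_levels):
        idx = np.where(levels == lvl)[0]
        if len(idx) < min_samples:
            continue
        eps = np.median(dists[idx, 1:].mean(axis=1)) * 1.5   # heuristic eps per level
        sub = DBSCAN(eps=eps, min_samples=min_samples).fit_predict(X[idx])
        for j, lab in zip(idx, sub):
            labels[j] = next_label + lab if lab != -1 else -1
        next_label += sub.max() + 1 if sub.max() >= 0 else 0
    return labels
```

Because each density level gets its own eps, points from a sparse region cannot be merged into a dense cluster, which is the behaviour the abstract contrasts with DBSCAN, OPTICS, and SNN.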

19 citations

Proceedings ArticleDOI
06 Nov 2018
TL;DR: This paper proposes recommending time-aware and preference-aware travel routes consisting of a sequence of POI locations with corresponding time information, together with an efficient solution for generating such routes, including the time to visit each location.
Abstract: There have been vast advances and rapid growth in Location-Based Social Networking (LBSN) services in recent years. Travel route recommendation is one of the most important applications of LBSN services. Travel route recommendation provides users a sequence of POIs (Points of Interest) as a route to visit. In this paper, we propose to recommend time-aware and preference-aware travel routes consisting of a sequence of POI locations with corresponding time information. This helps users not only to explore interesting locations in a new city, but also to plan the entire trip with those locations and approximate time information under specific time constraints. First, we find interesting POI locations by considering the following factors: the user's categorical preferences, temporal activities, and the popularity of each location. Then, we propose an efficient solution to generate travel routes with those locations, including the time to visit each location. These travel routes inform users where and when to visit. We evaluate the efficiency and effectiveness of our solution on a real-life LBSN dataset.
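An illustrative sketch of the two-step pipeline the abstract describes: score candidate POIs by categorical preference, temporal activity, and popularity, then greedily assemble a route under a total time budget. This is not the paper's model; the scoring weights, data fields, and straight-line travel-time function are assumptions made only to keep the example self-contained:

```python
from dataclasses import dataclass

# Hypothetical POI record and greedy route builder; field names and weights
# are illustrative, not the paper's.

@dataclass
class POI:
    name: str
    category: str
    popularity: float        # normalized global check-in popularity
    temporal: dict           # hour -> activity level at this POI
    visit_minutes: float
    x: float
    y: float

def score(poi, user_pref, hour, w=(0.4, 0.3, 0.3)):
    """Weighted mix of categorical preference, temporal activity, popularity."""
    return (w[0] * user_pref.get(poi.category, 0.0)
            + w[1] * poi.temporal.get(hour, 0.0)
            + w[2] * poi.popularity)

def travel_minutes(a, b, speed=1.0):
    """Toy straight-line travel time between two POIs."""
    return ((a.x - b.x) ** 2 + (a.y - b.y) ** 2) ** 0.5 / speed

def recommend_route(pois, user_pref, start_hour, budget_minutes):
    route, clock, current = [], 0.0, None
    candidates = list(pois)
    while candidates:
        hour = (start_hour + int(clock // 60)) % 24
        best = max(candidates, key=lambda p: score(p, user_pref, hour))
        extra = (travel_minutes(current, best) if current else 0.0) + best.visit_minutes
        if clock + extra > budget_minutes:
            break
        clock += extra
        route.append((best.name, start_hour + clock / 60))   # POI and approximate finish time
        current = best
        candidates.remove(best)
    return route
```

A real system would also use map-based travel times and opening hours; the greedy insertion here only illustrates how a time-annotated route can be produced under a time constraint.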

15 citations


Cited by
01 Jan 2002

9,314 citations

Journal ArticleDOI
TL;DR: A new optimization algorithm based on the law of gravity and mass interactions is introduced and the obtained results confirm the high performance of the proposed method in solving various nonlinear functions.

5,501 citations

Journal ArticleDOI
TL;DR: The qualitative and quantitative results prove the efficiency of SSA and MSSA and demonstrate the merits of the algorithms proposed in solving real-world problems with difficult and unknown search spaces.

3,027 citations

Journal ArticleDOI
TL;DR: This paper surveys the development of MOEAs primarily during the last eight years and covers algorithmic frameworks such as decomposition-based MOEAs (MOEA/Ds), memetic MOEAs, coevolutionary MOEAs, selection and offspring reproduction operators, MOEAs with specific search methods, MOEAs for multimodal problems, and constraint handling and MOEAs.
Abstract: A multiobjective optimization problem involves several conflicting objectives and has a set of Pareto optimal solutions. By evolving a population of solutions, multiobjective evolutionary algorithms (MOEAs) are able to approximate the Pareto optimal set in a single run. MOEAs have attracted a lot of research effort during the last 20 years, and they are still one of the hottest research areas in the field of evolutionary computation. This paper surveys the development of MOEAs primarily during the last eight years. It covers algorithmic frameworks such as decomposition-based MOEAs (MOEA/Ds), memetic MOEAs, coevolutionary MOEAs, selection and offspring reproduction operators, MOEAs with specific search methods, MOEAs for multimodal problems, constraint handling and MOEAs, computationally expensive multiobjective optimization problems (MOPs), dynamic MOPs, noisy MOPs, combinatorial and discrete MOPs, benchmark problems, performance indicators, and applications. In addition, some future research issues are also presented.
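Since every algorithm family in this survey revolves around approximating the Pareto-optimal set with a population, a small generic helper makes the underlying dominance relation concrete. This is textbook code assuming minimization, not taken from any specific MOEA discussed in the paper:

```python
# Pareto dominance and non-dominated filtering (minimization assumed).

def dominates(a, b):
    """True if objective vector a Pareto-dominates b."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def non_dominated(population):
    """Return the non-dominated front of a list of objective vectors."""
    return [p for p in population
            if not any(dominates(q, p) for q in population if q is not p)]

# Example with two conflicting objectives: (4, 4) is dominated by (2, 2).
pts = [(1, 5), (2, 2), (3, 1), (4, 4)]
print(non_dominated(pts))   # [(1, 5), (2, 2), (3, 1)]
```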

1,842 citations

Journal ArticleDOI
01 Dec 2009
TL;DR: An adaptive particle swarm optimization that features better search efficiency than classical particle swarm optimization (PSO) is presented and can perform a global search over the entire search space with faster convergence speed.
Abstract: An adaptive particle swarm optimization (APSO) that features better search efficiency than classical particle swarm optimization (PSO) is presented. More importantly, it can perform a global search over the entire search space with faster convergence speed. The APSO consists of two main steps. First, by evaluating the population distribution and particle fitness, a real-time evolutionary state estimation procedure is performed to identify, in each generation, one of four defined evolutionary states: exploration, exploitation, convergence, and jumping out. This enables the automatic control of the inertia weight, acceleration coefficients, and other algorithmic parameters at run time to improve the search efficiency and convergence speed. Then, an elitist learning strategy is performed when the evolutionary state is classified as the convergence state. The strategy acts on the globally best particle to jump out of likely local optima. The APSO has been comprehensively evaluated on 12 unimodal and multimodal benchmark functions. The effects of parameter adaptation and elitist learning are studied. Results show that APSO substantially enhances the performance of the PSO paradigm in terms of convergence speed, global optimality, solution accuracy, and algorithm reliability. As APSO introduces only two new parameters to the PSO paradigm, it does not introduce any additional design or implementation complexity.
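A compressed sketch of the control flow summarized above, intended to slot into a standard PSO loop. The evolutionary-factor formula, the state thresholds, and the sigmoid mapping below are coarse stand-ins for the paper's state classification and parameter-adaptation rules, so treat them as assumptions rather than the published APSO:

```python
import numpy as np

# Simplified illustration of the APSO ideas: estimate the swarm's evolutionary
# state from the population distribution, adapt the inertia weight accordingly,
# and apply elitist learning (a Gaussian perturbation of the global best) when
# the state is "convergence". Thresholds and formulas are approximations.

def evolutionary_factor(pos, gbest_idx):
    # Mean distance of each particle to all others; compare the best particle's
    # mean distance with the swarm's minimum and maximum mean distances.
    d = np.array([np.mean(np.linalg.norm(pos - p, axis=1)) for p in pos])
    return (d[gbest_idx] - d.min()) / (d.max() - d.min() + 1e-12)

def classify_state(f):
    if f < 0.25:
        return "convergence"
    if f < 0.5:
        return "exploitation"
    if f < 0.75:
        return "exploration"
    return "jumping-out"

def adapt_and_learn(pos, gbest, obj, rng, bounds):
    f = evolutionary_factor(pos, int(np.argmin([obj(p) for p in pos])))
    state = classify_state(f)
    w = 1.0 / (1.0 + 1.5 * np.exp(-2.6 * f))   # map f to an inertia weight in (0.4, 0.9)
    if state == "convergence":
        trial = gbest + rng.normal(0.0, 0.1 * (bounds[1] - bounds[0]), size=gbest.shape)
        if obj(trial) < obj(gbest):            # elitist learning on the globally best particle
            gbest = trial
    return w, gbest, state

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    swarm = rng.uniform(-5.0, 5.0, size=(10, 3))
    sphere = lambda x: float(np.sum(x ** 2))
    best = swarm[np.argmin([sphere(p) for p in swarm])].copy()
    w, best, state = adapt_and_learn(swarm, best, sphere, rng, (-5.0, 5.0))
    print(state, round(w, 3))
```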

1,713 citations