
Showing papers in "International Journal of Systems Assurance Engineering and Management in 2018"


Journal ArticleDOI
TL;DR: The paper reports a host-based intrusion detection model for the Cloud computing environment, along with its implementation and analysis; the model provides security as a service (SecaaS) in the infrastructure layer of the Cloud environment.
Abstract: The paper reports a host-based intrusion detection model for the Cloud computing environment, along with its implementation and analysis. This model alerts the Cloud user to malicious activities within the system by analyzing system call traces. The method analyses only selected system call traces, namely the failed system calls, rather than all of them. This feature enables early detection of intrusions with a reduced computational burden. The reported model provides security as a service (SecaaS) in the infrastructure layer of the Cloud environment. Implementation results show a 96% average intrusion detection sensitivity.
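As an illustration only (not the paper's implementation), the selective-trace idea can be sketched as a simple scan that counts failed system calls per window of the trace; the window size, threshold and trace format here are assumptions.

```python
from collections import Counter

# Hypothetical trace format: each entry is (pid, syscall, return_code); a negative
# return code is treated as a failed call. Window size and threshold are illustrative.
def detect_anomalies(trace, window=100, threshold=0.2):
    """Flag windows in which the fraction of failed system calls exceeds a threshold."""
    alerts = []
    for start in range(0, len(trace), window):
        chunk = trace[start:start + window]
        failed = [call for (_pid, call, ret) in chunk if ret < 0]
        if chunk and len(failed) / len(chunk) > threshold:
            alerts.append((start, Counter(failed).most_common(3)))
    return alerts

# Example usage with a synthetic trace
trace = [(1234, "open", -2)] * 30 + [(1234, "read", 0)] * 70
print(detect_anomalies(trace))
```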

89 citations


Journal ArticleDOI
TL;DR: In this paper, a technique for de-noising the stator current signal is presented, based on a series of wavelet decompositions that are compared with one another; the discrete wavelet transform is an appropriate tool for studying transient phenomena and non-stationary signals.
Abstract: Motor current signature analysis has been used for many years, but the fast Fourier transform (FFT) technique has disadvantages when the speed and the load torque are not constant. The FFT has problems with non-stationary signals when the frequency characteristics of the defects must be reported accurately. The discrete wavelet transform (DWT) handles the non-stationary stator current signal, which becomes complex when it contains noise. In this paper, a technique for de-noising the stator current signal is presented, based on a series of decompositions that are compared with one another. We studied normal bearings and bearings with outer- and inner-race faults. The decomposition order was chosen for the Daubechies, Symlets and Meyer wavelets. The limiting point for determining the number of levels is presented. In addition, we look for information about the basic defect signal in the energy stored in each decomposition level. The DWT allows simultaneous time-frequency analysis, so it is an appropriate tool for studying transient phenomena and non-stationary signals.
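As a rough sketch of this kind of analysis (not the authors' code), PyWavelets can decompose a stator-current-like signal and report the relative energy stored in each decomposition level; the wavelet family ('db8'), the level count and the synthetic signal are assumptions.

```python
import numpy as np
import pywt

def wavelet_level_energies(signal, wavelet="db8", level=6):
    """Decompose the signal with a discrete wavelet transform and return the
    relative energy stored in the approximation and each detail level."""
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    energies = np.array([np.sum(c ** 2) for c in coeffs])
    return energies / energies.sum()

# Synthetic stator-current-like signal: 50 Hz fundamental plus noise
t = np.linspace(0, 1, 10_000)
current = np.sin(2 * np.pi * 50 * t) + 0.1 * np.random.randn(t.size)
print(wavelet_level_energies(current))
```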

61 citations


Journal ArticleDOI
TL;DR: Experiments demonstrated that the proposed approach outperforms the traditional user-based, item-based and some state-of-the-art recommendation approaches in terms of prediction accuracy and quality of recommendations.
Abstract: Memory-based algorithms, generally referred to as similarity-based Collaborative Filtering (CF) algorithms, are among the most widely accepted approaches for providing service recommendations. They provide personalized, automated suggestions that help customers select from a variety of products. Memory-based algorithms come in two main kinds: user-based and item-based. The user-based CF algorithm recommends items by finding similar users; conversely, an item-based CF algorithm recommends items by finding similar items. The core of memory-based CF technologies is the calculation of similarity among users or items. However, due to inherent sparsity, a large number of entries (ratings) in the user-item rating matrix are missing, leaving only a few available ratings for predicting the unknown ones, which degrades the prediction quality of the CF algorithm. In this paper a hybrid approach is presented that combines user-based and item-based CF. It also leverages a biclustering technique to reduce dimensionality. Biclustering groups all users/items into several clusters, which are then used to measure user/item similarities based on their respective parent groups. To obtain individual predictions, the approach applies the user-based and item-based CF schemes with the computed similarities, and finally combines the resulting predictions of each model into the final prediction. Experiments demonstrated that the proposed approach outperforms the traditional user-based, item-based and some state-of-the-art recommendation approaches in terms of prediction accuracy and quality of recommendations.
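A minimal sketch of blending user-based and item-based predictions (the paper's biclustering step is omitted); cosine similarity on co-rated entries and the mixing weight alpha are assumptions.

```python
import numpy as np

def cosine_sim(a, b):
    """Cosine similarity computed only over entries rated by both sides."""
    mask = (a > 0) & (b > 0)
    if not mask.any():
        return 0.0
    return float(a[mask] @ b[mask] / (np.linalg.norm(a[mask]) * np.linalg.norm(b[mask]) + 1e-9))

def predict(R, u, i, alpha=0.5):
    """Blend a user-based and an item-based estimate of rating R[u, i]."""
    user_sims = np.array([cosine_sim(R[u], R[v]) if v != u else 0.0 for v in range(R.shape[0])])
    item_sims = np.array([cosine_sim(R[:, i], R[:, j]) if j != i else 0.0 for j in range(R.shape[1])])
    user_based = user_sims @ R[:, i] / (user_sims[R[:, i] > 0].sum() + 1e-9)
    item_based = item_sims @ R[u] / (item_sims[R[u] > 0].sum() + 1e-9)
    return alpha * user_based + (1 - alpha) * item_based

R = np.array([[5, 3, 0, 1],
              [4, 0, 0, 1],
              [1, 1, 0, 5],
              [0, 1, 5, 4]], dtype=float)   # 0 means "no rating"
print(round(predict(R, u=0, i=2), 2))
```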

49 citations


Journal ArticleDOI
TL;DR: A new algorithm based on the discrete glowworm swarm optimization algorithm is applied to the 3-dimensional path planning problem for a flying vehicle, whose task is to generate a viable trajectory from a source point to a destination point while keeping a safe distance from the obstacles in the way.
Abstract: Robot path planning is the task of determining the most viable path between a source and a destination while preventing collisions in the underlying environment. This task has always been characterized as a high-dimensional optimization problem and is considered NP-hard. Several algorithms have been proposed that solve the path planning problem in deterministic and non-deterministic ways. The problem, however, remains open to new algorithms with the potential to obtain better-quality solutions with lower time complexity. The paper presents a new approach to the 3-dimensional path planning problem for a flying vehicle, whose task is to generate a viable trajectory from a source point to a destination point while keeping a safe distance from the obstacles in the way. A new algorithm based on the discrete glowworm swarm optimization algorithm is applied to the problem. The modified algorithm is then compared with Dijkstra's algorithm and meta-heuristic algorithms such as PSO, IBA and BBO, and their performance on the path optimization problem is evaluated.

43 citations


Journal ArticleDOI
TL;DR: The proposed internet of things enabled monitoring and tracking sensor is specially designed to cater to the safety requirements of soldiers on the battlefield and provides the accurate location of the human subject in terms of the longitude and latitude of the place.
Abstract: The paper reports an internet of things enabled monitoring and tracking sensor for military applications. The proposed sensor is specially designed to cater to the safety requirements of soldiers on the battlefield. It employs an Arduino board for its operation, along with various sensors to gauge the remote human subject's vital signs. With the help of global positioning system based location tracking, the sensor provides the accurate location of the human subject in terms of the longitude and latitude of the place. Further, the designed sensor accurately reports the body temperature of the subject under test. The sensor is a low-cost, portable and reliable solution for military applications.

42 citations


Journal ArticleDOI
TL;DR: This paper presents an in-depth review of SMO, its variants, applications and relative performance with respect to other algorithms.
Abstract: Algorithms inspired by the intelligent social behavior of simple agents have become popular among researchers in recent years. These algorithms are able to solve real-world optimization problems that cannot easily be solved by deterministic techniques. Spider Monkey Optimization (SMO) is one such algorithm, inspired by the intelligent behavior of spider monkeys. SMO and its variants have been successful and effective in dealing with complex real-world optimization problems due to their high efficacy. This paper presents an in-depth review of SMO, its variants, applications and relative performance with respect to other algorithms.

39 citations


Journal ArticleDOI
TL;DR: A review of four common and applicable variants of the vehicle routing problem, namely, the capacitated vehicle routing problem, the vehicle routing problem with time windows, the periodic vehicle routing problem and the dynamic vehicle routing problem, considered in terms of formulation techniques, methods of solution and areas of application.
Abstract: A vehicle routing problem involves finding a set of optimal routes for a fleet of capacitated vehicles available at a location to service the demands of a set of customers. In its simplest form, each customer is required to be visited once and the capacity of a vehicle must not be exceeded. In this paper, four common and applicable variants of the vehicle routing problem, namely, the capacitated vehicle routing problem, the vehicle routing problem with time windows, the periodic vehicle routing problem and the dynamic vehicle routing problem, are reviewed in terms of formulation techniques, methods of solution and areas of application. A summary table is presented for each variant to emphasize key features that represent directions of current research.

39 citations


Journal ArticleDOI
TL;DR: A novel variant of the Bat algorithm based on dynamic frequency is introduced and hybridized with K-means to present a new approach for clustering in a distributed environment; it achieves significant speedup on massive datasets as the number of nodes increases.
Abstract: Over the past decade there has been a significant increase in the growth of digital data, so good data mining techniques are important for better decision making. Clustering is one of the key elements in the field of data mining. K-means is a very popular clustering algorithm in the literature. However, the k-means algorithm can get stuck in a local optimum because of its dependency on the random initialization of the initial cluster centers. In this paper a novel variant of the Bat algorithm based on dynamic frequency is introduced. The proposed variant is then hybridized with K-means to present a new approach for clustering in a distributed environment. Since evolutionary computation is very computationally intensive, traditional sequential algorithms cannot provide satisfactory results within a reasonable amount of time for large-scale data problems. To mitigate this, the proposed variant is parallelized using the MapReduce model in the Hadoop framework. The experimental results show that the proposed algorithm outperforms K-means, PSO and the Bat algorithm on eighty percent of the benchmark datasets in terms of intra-cluster distance. Further, DBPKBA also achieves significant speedup on massive datasets as the number of nodes increases.

36 citations


Journal ArticleDOI
TL;DR: An evolutionary algorithm, called the cuckoo search (CS), is introduced in this paper for finding the optimal scaling factors (SFs) in digital image watermarking to improve robustness and imperceptibility.
Abstract: An evolutionary algorithm, the cuckoo search (CS), is introduced in this paper for finding the optimal scaling factors (SFs) in digital image watermarking to improve robustness and imperceptibility. It is the first application of the CS technique to the image watermarking problem. The basic idea is to treat digital image watermarking as an optimization problem and then solve it using CS. A one- or two-level discrete wavelet transform is applied to the host image, and the coefficients of the low- and high-frequency sub-bands are modified by embedding the watermark multiplied by the SFs. The SFs are optimized using the CS algorithm to obtain the highest possible robustness without compromising quality. To investigate the robustness of the scheme, several attacks are applied to seriously distort the watermarked image. Empirical analysis of the results demonstrates the efficiency of the proposed technique.
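A minimal sketch of the embedding step described above, with the cuckoo-search optimization of the scaling factor omitted; the Haar wavelet, the additive rule and the single-level decomposition are assumptions.

```python
import numpy as np
import pywt

def embed_watermark(host, watermark, sf=0.05, wavelet="haar"):
    """Embed a watermark into the approximation sub-band of a one-level DWT,
    scaled by the factor sf (the quantity the paper optimizes with cuckoo search)."""
    cA, (cH, cV, cD) = pywt.dwt2(host, wavelet)
    cA_marked = cA + sf * watermark          # additive embedding in the LL band
    return pywt.idwt2((cA_marked, (cH, cV, cD)), wavelet)

host = np.random.rand(64, 64)
watermark = np.random.rand(32, 32)           # LL band of a 64x64 image is 32x32
marked = embed_watermark(host, watermark)
print(marked.shape)
```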

35 citations


Journal ArticleDOI
TL;DR: The proposed ABC variant, named ABC with Global and Local Neighborhoods (ABCGLN), aims to establish a trade-off between exploration and exploitation and thereby increase the convergence rate of ABC.
Abstract: Artificial Bee Colony (ABC) is a well-known, efficient population-based algorithm for global optimization. Although ABC is competitive compared with many other optimization techniques, it also has drawbacks, such as favoring exploration at the cost of exploitation and slow convergence. In this article, the basic ABC algorithm is modified by changing its position update equation using a concept of food-source neighborhoods similar to differential evolution with global and local neighborhoods. The neighborhood of each colony member includes 10% of the members of the whole colony, based on the index-graph of solution vectors. The proposed ABC is named ABC with Global and Local Neighborhoods (ABCGLN); it aims to establish a trade-off between exploration and exploitation and thereby increase the convergence rate of ABC. To validate its performance, ABCGLN is tested over 24 benchmark optimization functions and compared with standard ABC as well as its recent popular variants, namely Gbest-guided ABC, Best-So-Far ABC and Modified ABC. Intensive statistical analyses of the results show that ABCGLN is significantly better and requires, on average, half as many function evaluations as the other algorithms considered.

32 citations


Journal ArticleDOI
TL;DR: An enhanced ant colony optimization algorithm is presented in this article to solve the optimal power flow problem with ecological emission, reducing the total fuel cost of fossil thermal power generators and the ecological emission.
Abstract: An enhanced ant colony optimization algorithm is presented in this article to solve the optimal power flow problem with ecological emission. The chief objective is to reduce the total fuel cost of fossil thermal power generators and the ecological emission. The presented technique keeps the bus voltages, real and reactive power and transmission line power flows at satisfactory levels. It minimizes computational time while satisfying all the power flow constraints. The proposed technique is validated on the IEEE 30- and 118-bus test systems. The simulation results are promising when compared with other techniques. Further, the outcomes make clear that this technique is effective for solving power systems with large networks.

Journal ArticleDOI
TL;DR: Analysis indicates that the CSA-based PID controller provides a better response compared to the GA-, PSO- and FA-based PI/PID controllers and the CSA-based PI controller.
Abstract: Large-scale integration of renewable energy in a hybrid power system operating in isolated mode makes frequency control a challenging task. This paper investigates the performance of a Cuckoo Search Algorithm (CSA) and Firefly Algorithm (FA) based frequency control strategy for such a hybrid power system, which is a unique contribution. The generating units of the system are plug-in hybrid electric vehicles (PHEV), wind turbine generators, a diesel engine generator (DEG) and a battery energy storage system (BESS). Proportional plus integral (PI)/proportional integral derivative (PID) controllers are employed with the PHEV, DEG and BESS to adjust the total active power generation in accordance with the load demand. The addition of the PHEV reduces the reliance on the DEG or BESS caused by the variability and uncertainty of wind power. Different disturbance conditions, such as step perturbations and random variations of load as well as wind output power, have been considered in case studies under Matlab simulation to assess the performance of the CSA- and FA-based control strategy. The analysis indicates that the CSA-based PID controller provides a better response compared to the GA-, PSO- and FA-based PI/PID controllers and the CSA-based PI controller. Sensitivity analysis has been carried out to check the robustness of the FA- and CSA-optimized PI/PID controller gains.

Journal ArticleDOI
TL;DR: The article begins by constructing the main key technologies of a geological disaster real-time monitoring and warning system and a post-disaster rescue system, summarizes the main features of new-generation information technology, and proposes approaches and methods for monitoring data acquisition, remote wireless transmission, architecture optimization and integration.
Abstract: Geological disaster monitoring and warning and post-disaster rescue system support are important means of improving the ability to cope with geological disasters actively. Geological disasters are increasingly aggravated by engineering construction and excessive development of resources, so it is necessary to utilize reasonable monitoring technology for scientific prevention. The article begins by constructing the main key technologies of a geological disaster real-time monitoring and warning system and a post-disaster rescue system, summarizes the main features of new-generation information technology, proposes approaches and methods for monitoring data acquisition, remote wireless transmission, architecture optimization and integration, quick mining of key data, the post-disaster rescue system and systematic management of rescue supplies based on technologies such as the Internet of things (IOT), new-generation information technology, cloud computing and big data, and discusses the realization scheme of key technologies such as the integration of RFID technology with sensor network technology and an expert decision-making model. As has been proved in practice, the technology is feasible and the systems are reliable and advanced, and they can provide considerable technological support for disaster reduction and prevention decision-making.

Journal ArticleDOI
TL;DR: This study designs a time series model for predicting monthly municipal solid waste generation in Faridabad city of Haryana State (India) using an artificial neural network (ANN) time series autoregressive approach and concludes that the proposed ANN model gives accurate predictive results.
Abstract: Accurate prediction of municipal solid waste generation plays an important role in future planning and waste management systems. The characteristics of the generated solid waste differ from place to place (municipality to municipality or country to country). Accurate prediction of municipal solid waste (MSW) generation has become a crucial task in the modern era, and it requires accurate MSW data. The aim of the present study is to design a time series model for predicting monthly municipal solid waste generation in Faridabad city of Haryana State (India) using an artificial neural network (ANN) time series autoregressive approach. The collected municipal solid waste observations are arranged monthly from 2010 to 2014. The 60-month data set is divided into 42 training, 9 testing and 9 validation samples. Various ANN structures have been investigated by changing the number of hidden layer neurons, and the best optimized network structure is found. The proposed model is validated by the low values of performance parameters such as mean square error (0.0003714) and root mean square error (0.01927) and the high value of the coefficient of regression (0.8385). On the basis of these performance parameters it is concluded that the proposed ANN model gives accurate predictive results.
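A sketch of the autoregressive neural-network idea on synthetic data (not the authors' model); the lag order, network size and the rough 42/9 train/test split are assumptions taken loosely from the abstract.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
# Synthetic monthly MSW series with a yearly seasonal pattern plus noise
waste = 1000 + 50 * np.sin(np.arange(60) * 2 * np.pi / 12) + rng.normal(0, 10, 60)

lags = 3
X = np.column_stack([waste[i:60 - lags + i] for i in range(lags)])   # lagged inputs
y = waste[lags:]                                                     # next-month target

# roughly mirrors the 42/9/9 split described in the abstract (lagging loses a few samples)
X_train, y_train = X[:42], y[:42]
X_test, y_test = X[42:51], y[42:51]

model = MLPRegressor(hidden_layer_sizes=(10,), max_iter=5000, random_state=0).fit(X_train, y_train)
print("test RMSE:", np.sqrt(np.mean((model.predict(X_test) - y_test) ** 2)))
```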

Journal ArticleDOI
TL;DR: The paper reports a meticulous review in the field of Cloud computing with a focus on the security risk assessment and service assurance.
Abstract: Cloud security and service assurance is a wide research area with a broad range of concerns, spanning equipment and platform technologies for securing information and asset access. In spite of the colossal advantages of the Cloud computing paradigm, security and service concerns have consistently been a central worry of Cloud clients and an obstruction to its widespread acceptance. The paper reports a meticulous review of the field of Cloud computing with a focus on security risk assessment and service assurance. This effort will serve as a ready reckoner for research aspirants seeking a general view of the risk factors in security and the service assurance in a Cloud environment.

Journal ArticleDOI
TL;DR: The suggested approach guides the diagnosis of the root causes of a fault and is helpful not only to maintenance personnel in effective diagnosis but also in guiding designers in the development of reliable automobile systems, accident investigations of automobiles, etc.
Abstract: Fault diagnosis of automobile systems is critical, as it adds to repair and maintenance time; it is therefore desirable to make it efficient and effective. One of the conventional approaches is to use the fault tree diagram, but this approach is inadequate because the system structure is only implicit. The structure of the system means the system elements and their interrelations. To alleviate this limitation, a new approach is suggested in which the structure is built in, i.e. incorporated explicitly, through digraph modeling that employs the systems approach of graph theory. A system digraph is developed by considering relationships among the input and output parameters of the subsystems/components of the automobile system in normal and failed conditions. The fault tree of a failure symptom, which represents an abnormality or breakdown of the automobile system, is obtained from the system digraph. The novelty is the extension of this structural approach, which has been successfully applied to chemical and process systems, to automobile systems using the digraph model. A step-by-step methodology of the structural approach is presented. Its two main steps are Step 1, 'Development of the fault tree diagram', and Step 2, 'Diagnosis of faults using the tree diagram'. The suggested approach is illustrated for hydraulic power steering, an automobile system fitted on all current automobiles and particularly on special purpose vehicles such as heavy-duty trucks, earthmovers, dumpers, etc. The suggested approach guides the diagnosis of the root causes of a fault. It is helpful not only to maintenance personnel in effective diagnosis but also in guiding designers in the development of reliable automobile systems, accident investigations of automobiles, etc.

Book ChapterDOI
TL;DR: It was observed that ANFIS yields better results and predicts reliability more accurately and precisely than all the other techniques considered, and a comparative analysis between cumulative failure data and inter-failure time data found that cumulative failure data give better and more promising results than inter-failure time data.
Abstract: Software reliability is an indispensable part of software quality. The software industry endures various challenges in developing highly reliable software. The application of machine learning (ML) techniques for software reliability prediction has shown meticulous and remarkable results. In this paper, we propose the use of machine learning techniques for software reliability prediction and evaluate them based on selected performance criteria. We have applied ML techniques including the adaptive neuro fuzzy inference system (ANFIS), feed forward backpropagation neural network (FFBPNN), general regression neural network (GRNN), support vector machines (SVM), multilayer perceptron (MLP), bagging, cascading forward backpropagation neural network (CFBPNN), instance-based learning (IBK), linear regression (Lin Reg), M5P, reduced error pruning tree (reptree), and M5Rules to predict software reliability on various datasets chosen from industrial software. Based on the experiments conducted, it was observed that ANFIS yields better results and predicts reliability more accurately and precisely than all the above-mentioned techniques. In this study, we also made a comparative analysis between cumulative failure data and inter-failure time data and found that cumulative failure data give better and more promising results than inter-failure time data.

Journal ArticleDOI
TL;DR: A novel clustering method based on Biogeography based optimization is proposed to extend the capabilities of traditional clustering methods when clustering high-dimensional datasets, viz. microarray datasets.
Abstract: Unsupervised data classification (data clustering) is one of the most widely used data analysis methods; it groups unlabeled data into homogeneous clusters (groups). Classical clustering methods do not perform effectively when clustering high-dimensional datasets, viz. microarray datasets. Therefore, a novel clustering method based on Biogeography based optimization is proposed to extend the capabilities of traditional clustering methods. The performance of the proposed method has been tested on four microarray datasets, and experimental results validate its effectiveness.

Journal ArticleDOI
TL;DR: Simulation results indicate that amongst the 15 simulated models, backward feed with the added arrangements of feed split, steam split and feed preheating showed the best steam economy.
Abstract: Mathematical models have been studied and solved for evaluating an optimized process configuration for the energy-intensive black liquor concentrating Kraft recovery process in paper mills. In the present study, a seven-effect (heptad) evaporator system is considered and modeled first on the basis of three possible flow directions of black liquor feed and heating steam, i.e. backward, forward or mixed feed. Further, live steam split, liquor feed split, feed preheating and hybrids of these energy saving schemes are coupled with the basic backward, forward and mixed feed arrangements. The systematically evaluated material and heat balance equations evolve into the main model equations, which are then represented in matrix form to generalize the models mathematically. The advantage of the studied models is their simplicity, with a linear representation of the equations in matrix form and ease of numerical solution. The proposed mathematical models are iteratively solved using different numerical techniques: Gauss-Jordan, Gauss elimination, Gauss-Seidel, Jacobi, successive over-relaxation and interior-point methods. The simulation results indicate that among the 15 simulated models, backward feed with the added arrangements of feed split, steam split and feed preheating shows the best steam economy. The studied models can also be applied and easily extended to problems with different operating conditions once the liquor, steam and other evaporator effect parameters are known.
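For reference, a compact Gauss-Seidel iteration of the kind named among the solution techniques; this is a generic solver sketch for a small test system, not the evaporator model itself.

```python
import numpy as np

def gauss_seidel(A, b, x0=None, tol=1e-10, max_iter=1000):
    """Solve A x = b iteratively, updating each component in place."""
    n = len(b)
    x = np.zeros(n) if x0 is None else x0.astype(float).copy()
    for _ in range(max_iter):
        x_old = x.copy()
        for i in range(n):
            s = A[i, :i] @ x[:i] + A[i, i + 1:] @ x[i + 1:]
            x[i] = (b[i] - s) / A[i, i]
        if np.linalg.norm(x - x_old, np.inf) < tol:
            break
    return x

# Diagonally dominant test system
A = np.array([[4.0, -1.0, 0.0],
              [-1.0, 4.0, -1.0],
              [0.0, -1.0, 4.0]])
b = np.array([15.0, 10.0, 10.0])
print(gauss_seidel(A, b))   # close to np.linalg.solve(A, b)
```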

Journal ArticleDOI
TL;DR: A new algorithm for anomaly detection is introduced which is a hybridization of K-Means and the Firefly Algorithm, one of the new metaheuristic algorithms for optimization problems, inspired by the flashing behavior of fireflies.
Abstract: During the last decade, anomaly detection has attracted the attention of many researchers seeking to overcome the weakness of signature-based IDSs in detecting novel attacks. Indeed, it is difficult to provide secure information systems and to maintain them in a secure state during their lifetime. An IDS is a device or software application that monitors network or system activities for malicious activity or policy violations and produces reports to a management station. A metaheuristic is a high-level, problem-independent algorithmic framework; such techniques do not take advantage of any specificity of the problem. The main aim of metaheuristic algorithms is to find a solution to a problem quickly. This solution may not be the best of all possible solutions, but it is still valid because it does not require an excessively long time to obtain. The Firefly Algorithm is one of the new metaheuristic algorithms for optimization problems, inspired by the flashing behavior of fireflies. In this work, a new algorithm for anomaly detection is introduced which is a hybridization of K-Means and the Firefly Algorithm. The algorithm uses clustering to build the training model and uses classification to evaluate it on the test set. The algorithm is evaluated on the NSL-KDD dataset with quite impressive results. Further, a comparison study has been performed between the newly developed algorithm and other clustering algorithms, including K-Means + Cuckoo, K-Means + Bat, K-Means, K-Means++, Canopy and Farthest First. The results show that K-Means + Firefly and K-Means + Bat outperform the others by a large margin.
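An illustrative hybrid along the lines described (not the paper's algorithm): firefly moves refine candidate sets of cluster centres, with the intra-cluster sum of squared distances as the brightness; all parameters and the synthetic data are assumptions.

```python
import numpy as np

def sse(centers, X):
    """Intra-cluster sum of squared distances (lower is 'brighter')."""
    d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
    return float(np.sum(d.min(axis=1) ** 2))

def firefly_kmeans(X, k=2, n_fireflies=10, iters=50, beta0=1.0, gamma=1.0, alpha=0.1, seed=0):
    rng = np.random.default_rng(seed)
    # each firefly is a full set of k candidate centres, seeded from the data
    flies = X[rng.choice(len(X), size=(n_fireflies, k))]
    for _ in range(iters):
        costs = np.array([sse(f, X) for f in flies])
        for i in range(n_fireflies):
            for j in range(n_fireflies):
                if costs[j] < costs[i]:                      # j is brighter, so i moves toward j
                    r2 = np.sum((flies[i] - flies[j]) ** 2)
                    beta = beta0 * np.exp(-gamma * r2)
                    flies[i] = flies[i] + beta * (flies[j] - flies[i]) \
                               + alpha * rng.normal(size=flies[i].shape)
                    costs[i] = sse(flies[i], X)
        alpha *= 0.97                                        # gradually reduce randomness
    return flies[np.argmin([sse(f, X) for f in flies])]

X = np.vstack([np.random.randn(50, 2), np.random.randn(50, 2) + 5])
print(firefly_kmeans(X, k=2))
```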

Journal ArticleDOI
TL;DR: The collaboration model is introduced, how it addresses the collaboration challenges between research and practice and how it has evolved is discussed, and the lessons learned from the experience are described.
Abstract: There is wide acceptance in the software engineering field that industry and research can gain significantly from each other, and there have been several initiatives to encourage collaboration between the two. However, there are some often-quoted challenges in this kind of collaboration. For example, that the timescales of research and practice are incompatible, that research is not seen as relevant for practice, and that research demands a different kind of rigour than practice supports. These are complex challenges that are not always easy to overcome. Since the beginning of 2013 we have been using an approach designed to address some of these challenges and to bridge the gap between research and practice, specifically in the agile software development arena. So far we have collaborated successfully with three partners and have investigated three practitioner-driven challenges with agile. The model of collaboration that we adopted evolved with the lessons learned in the first two collaborations and was modified for the third. In this paper we introduce the collaboration model, discuss how it addresses the collaboration challenges between research and practice and how it has evolved, and describe the lessons learned from our experience.

Journal ArticleDOI
TL;DR: It can be concluded that FWA, a class of population-based search methods that imitates the explosion process of real fireworks at night, could be adopted as one of the new template algorithms for the training of ANNs.
Abstract: The challenge of training artificial neural networks (ANNs), which are frequently used for classification purposes, has been growing consistently over the last few years, probably due to the high-dimensional and multi-modal nature of the search space. Nature-inspired metaheuristic algorithms have been successfully employed for weight training in such complex continuous optimization problems. In this paper, the recently proposed fireworks algorithm (FWA) is presented for training the parameters of ANNs. FWA is a class of population-based search methods that imitates the explosion process of real fireworks at night. In order to investigate the performance of the proposed method, experiments were conducted on seven benchmark problem instances from the UCI machine learning repository, and the results obtained by the proposed method were compared with those obtained by the krill herd algorithm, harmony search algorithm and genetic algorithm. The evaluation showed the superiority of the proposed algorithm in both SSE and training classification accuracy (CA) and comparable performance in testing CA, and it can thus be concluded that FWA could be adopted as one of the new template algorithms for the training of ANNs.
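A toy sketch of the fireworks idea applied to weight training (not the paper's FWA): each firework is a packed weight vector of a tiny network and explodes into sparks, with better fireworks getting more sparks of smaller amplitude, and a simplified greedy selection keeping the best candidates; the network, spark counts and amplitudes are assumptions.

```python
import numpy as np

def loss(w, X, y):
    """SSE of a tiny one-hidden-layer network whose weights are packed in w (length 17)."""
    W1, b1, W2, b2 = w[:8].reshape(2, 4), w[8:12], w[12:16].reshape(4, 1), w[16]
    h = np.tanh(X @ W1 + b1)
    out = 1 / (1 + np.exp(-(h @ W2 + b2)))
    return float(np.sum((out.ravel() - y) ** 2))

def fireworks_train(X, y, dim=17, n_fireworks=5, iters=100, seed=0):
    rng = np.random.default_rng(seed)
    pop = rng.normal(0, 1, (n_fireworks, dim))
    for _ in range(iters):
        fits = np.array([loss(w, X, y) for w in pop])
        sparks = []
        for i, w in enumerate(pop):
            rank = np.argsort(np.argsort(fits))[i]      # 0 = best firework
            n_sparks = 10 - 2 * rank                    # better fireworks explode more...
            amp = 0.1 + 0.2 * rank                      # ...with smaller amplitude
            for _ in range(max(n_sparks, 1)):
                sparks.append(w + amp * rng.normal(size=dim))
        allw = np.vstack([pop, np.array(sparks)])
        allfits = np.array([loss(w, X, y) for w in allw])
        pop = allw[np.argsort(allfits)[:n_fireworks]]   # greedy selection for the next generation
    return pop[0]

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0, 1, 1, 0], dtype=float)                 # XOR toy problem
w = fireworks_train(X, y)
print("final SSE:", round(loss(w, X, y), 4))
```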

Journal ArticleDOI
TL;DR: A Special Session on Soft Computing and Cryptography is planned under SocProS 2015 to share and exchange knowledge about developments in soft computing techniques and their use in the field of cryptography for various security applications.
Abstract: Several threats to information and computer network infrastructure are being reported, and they are increasing continuously all around the globe. Protection of such vital information and networks is considered one of the most important and challenging tasks, and much attention is being paid to providing adequate security solutions. A Special Session on Soft Computing and Cryptography is planned under SocProS 2015 to share and exchange knowledge about developments in soft computing techniques and their use in the field of cryptography for various security applications. The session offers an opportunity to explore and examine how soft computing can be applied to solve various cryptographic problems.

Journal ArticleDOI
Harish Garg
TL;DR: From the computed results, it is concluded that various expressions of the system, such as failure rate, repair time, reliability, availability etc., are obtained corresponding to different types of numbers, namely gamma, normal, Cauchy and triangular, used to represent uncertainties.
Abstract: Uncertainties play a dominant role in the performance analysis of a system. To manage them, fuzzy set theory and the corresponding triangular fuzzy numbers have been utilized by most researchers for quantifying the data. In this manuscript, however, this assumption is relaxed by defining different types of numbers, namely gamma, normal, Cauchy and triangular, for representing uncertainties. Based on these, the behavior, performance and sensitivity of the system have been investigated at different levels of confidence and preference as provided by the decision makers towards the data. Various expressions of the system, such as failure rate, repair time, reliability, availability etc., are obtained corresponding to these different types of numbers. From the computed results, it is concluded that these indices have a reduced range of prediction compared with existing approaches. A numerical example is given to demonstrate the approach.

Journal ArticleDOI
TL;DR: This paper describes the necessary conditions on bandwidth use that make a routing solution feasible, then estimates link bandwidth and applies an ant colony optimization technique to optimize energy.
Abstract: A collection of small sensor nodes capable of sensing, processing and wirelessly transmitting data related to some physical phenomenon is called a wireless sensor network (WSN). Some severe constraints of sensor nodes are low network bandwidth, short wireless communication range, limited CPU processing capacity, memory storage and energy. Efficient bandwidth utilization and increased lifetime of the wireless sensor network are essential for the proliferation of WSNs in various applications. Many research papers have aimed to maximize network lifetime while optimistically assuming the wireless link bandwidth to be sufficient. In this paper, we describe the necessary conditions on bandwidth use that make a routing solution feasible, then estimate link bandwidth and apply an ant colony optimization technique to optimize energy. The first part of the paper describes the performance of an energy-aware routing protocol without the optimization technique, calculating the energy utilization over the sensor network area and estimating bandwidth. In the second part, ant colony optimization (ACO) is applied to optimize energy consumption and compute the optimized path bandwidth. The bit error rate has been analyzed to verify network performance. Results show that the energy-aware routing protocol with ACO provides more feasible routing solutions and a significant improvement in the lifetime of the sensor network.

Journal ArticleDOI
TL;DR: A model is proposed that identifies the percentage division of data that gives the maximum possible accuracy for a particular dataset in a long-term ECG database.
Abstract: Electrocardiogram (ECG) data is one of the most important physiological parameters for detecting the heartbeat, emotions and stress levels of patients. The problem is to develop a model that can diagnose ECG data efficiently with higher accuracy over time. In this paper, the authors propose a model that identifies the percentage division of data that gives the maximum possible accuracy for a particular dataset. For experimental purposes, the authors have used neural networks for the analysis of standard and raw data taken from the MIT-BIH long-term ECG database, using R as the platform. The database is divided into different ratios of training and testing data, and the model is trained to attain the best percentage division of the particular patient's data based on its accuracy.
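An illustrative loop of the kind the abstract describes: train on several train/test ratios and keep the split with the best test accuracy; the classifier and the synthetic features are placeholders for the MIT-BIH records.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 20))                # placeholder for ECG feature vectors
y = (X[:, 0] + X[:, 1] > 0).astype(int)       # placeholder beat labels

best = None
for train_frac in (0.5, 0.6, 0.7, 0.8, 0.9):
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, train_size=train_frac, random_state=0)
    acc = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000,
                        random_state=0).fit(X_tr, y_tr).score(X_te, y_te)
    if best is None or acc > best[1]:
        best = (train_frac, acc)

print(f"best split: {best[0]:.0%} training data, accuracy {best[1]:.3f}")
```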

Journal ArticleDOI
TL;DR: The objective is to measure the efficiency of 19 departments of IIT Roorkee, a reputed higher education institute of India, by embedding supply chain management concepts into the education sector.
Abstract: The present study focuses on embedding supply chain management concepts into the education sector, i.e. educational supply chain management. This is an important area of research, as it helps in the management of one of the most important sectors of a country, viz. the education sector. The objective here is to measure the efficiency of 19 departments of the Indian Institute of Technology Roorkee (IIT Roorkee), a reputed higher education institute of India.

Journal ArticleDOI
TL;DR: In the proposed approach, artificial bee colony (ABC) algorithm inspired fitness-based solution search process is incorporated with the PSO algorithm to balance the organization of the individuals in the search space.
Abstract: In the field of swarm intelligence inspired algorithms, particle swarm optimization (PSO) is a renowned meta-heuristic due to its simplicity, performance, and ease of implementation. However, PSO also has some downsides, such as stagnation and slow convergence, due to an improper balance between the diversification and convergence abilities of the population. Therefore, in this paper, the solution search process of the PSO algorithm is modified to balance the organization of the individuals in the search space. In the proposed approach, an artificial bee colony (ABC) algorithm inspired, fitness-based solution search process is incorporated into the PSO algorithm. The proposed approach is tested over 20 unbiased benchmark functions, and the reported results are compared with the PSO 2011, ABC, differential evolution, self-adaptive acceleration factor PSO, and Mean PSO algorithms through proper statistical analyses.

Journal ArticleDOI
TL;DR: The results of the experiment indicate that the proposed chaotic flower pollination variant CFPA2 can increase the precision of the minimized function value and reduce the CPU time needed to run the algorithm.
Abstract: The flower pollination algorithm (FPA) is susceptible to local optima and substandard calculation precision. A chaotic operator (CO), used locally to optimize the best individuals in the population, can successfully enhance the properties of the flower pollination algorithm. A new chaotic flower pollination algorithm (CFPA) is proposed in this work. FPA and its four proposed variants, obtained using different chaotic maps, are tested on nine high-dimensional mathematical benchmark functions. The proposed CFPA variants are CFPA1, CFPA2, CFPA3 and CFPA4. The experimental results indicate that the proposed chaotic flower pollination variant CFPA2 can increase the precision of the minimized function value and reduce the CPU time needed to run the algorithm.
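A sketch of the chaotic-operator idea (not the paper's CFPA variants): a logistic map drives the local pollination step of a bare-bones FPA; the map, switch probability, the Cauchy stand-in for the Levy flight and the sphere objective are assumptions.

```python
import numpy as np

def logistic_map(x):
    """One step of the logistic chaotic map on (0, 1)."""
    return 4.0 * x * (1.0 - x)

def chaotic_fpa(objective, dim=10, n_flowers=20, iters=200, p_switch=0.8, seed=1):
    rng = np.random.default_rng(seed)
    pop = rng.uniform(-5, 5, (n_flowers, dim))
    fits = np.array([objective(x) for x in pop])
    chaos = 0.7                                        # chaotic variable seed
    for _ in range(iters):
        best = pop[np.argmin(fits)]
        for i in range(n_flowers):
            chaos = logistic_map(chaos)
            if rng.random() < p_switch:                # global pollination toward the best flower
                step = rng.standard_cauchy(dim) * 0.01 # crude stand-in for a Levy flight
                cand = pop[i] + step * (best - pop[i])
            else:                                      # local pollination driven by the chaotic map
                j, k = rng.integers(n_flowers, size=2)
                cand = pop[i] + chaos * (pop[j] - pop[k])
            f = objective(cand)
            if f < fits[i]:
                pop[i], fits[i] = cand, f
    return pop[np.argmin(fits)], fits.min()

sol, val = chaotic_fpa(lambda x: float(np.sum(x ** 2)))   # sphere function
print(round(val, 6))
```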

Journal ArticleDOI
TL;DR: The homotopy analysis method (HAM) is applied to obtain approximate solutions of seventh-order KdV equations, namely the Sawada–Kotera–Ito equation, the Lax equation, and the Kaup–Kuperschmidt equation.
Abstract: In this paper, the homotopy analysis method (HAM) is applied to obtain approximate solutions of seventh-order KdV equations, namely the Sawada–Kotera–Ito equation, the Lax equation, and the Kaup–Kuperschmidt equation. The convergence of the homotopy analysis method is discussed with the help of the auxiliary parameter h, which controls the convergence of the method and is also called the convergence-control parameter. The results obtained by the HAM are compared with the exact solutions, with the value of the arbitrary constant fixed. It is found that the HAM is a very robust and elegant method, and by choosing a suitable value of h we can obtain the approximate solution in very few iterations. Computations are performed with the help of the symbolic computation package MATHEMATICA.
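For context, the construction at the heart of HAM is the zeroth-order deformation equation (the standard general form, not the paper's specific seventh-order derivation): as the embedding parameter q moves from 0 to 1, the solution deforms from the initial guess u0 to the true solution u, and the approximation is read off from the resulting series.

```latex
% Zeroth-order deformation equation with convergence-control parameter h,
% linear operator L and nonlinear operator N of the governing equation
(1 - q)\,\mathcal{L}\!\left[\phi(x,t;q) - u_0(x,t)\right]
  = q\,h\,\mathcal{N}\!\left[\phi(x,t;q)\right], \qquad q \in [0,1].

% As q goes from 0 to 1, \phi deforms from u_0 to the solution u, giving the series
u(x,t) = u_0(x,t) + \sum_{m=1}^{\infty} u_m(x,t),
\qquad
u_m(x,t) = \frac{1}{m!}\left.\frac{\partial^m \phi(x,t;q)}{\partial q^m}\right|_{q=0}.
```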