
Showing papers in "International Journal of Hybrid Information Technology in 2015"


Journal ArticleDOI
TL;DR: In this paper, an Artificial Neural Network (ANN) model for predicting the performance of sophomore students enrolled in engineering majors in the Faculty of Engineering and Information Technology at Al-Azhar University of Gaza was developed and tested.
Abstract: In this paper, an Artificial Neural Network (ANN) model for predicting the performance of a sophomore student enrolled in an engineering major in the Faculty of Engineering and Information Technology at Al-Azhar University of Gaza was developed and tested. A number of factors that may possibly influence the performance of a student were outlined. Factors such as high school score, scores in subjects such as Math I, Math II, Electrical Circuit I, and Electronics I taken during the freshman year, number of credits passed, the student's cumulative grade point average for the freshman year, type of high school attended, and gender, among others, were then used as input variables for the ANN model. A model based on the Multilayer Perceptron topology was developed and trained using data spanning five generations of graduates from the Engineering Department of Al-Azhar University, Gaza. Test data evaluation shows that the ANN model is able to correctly predict the performance of more than 80% of prospective students.
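The predictor described above is a multilayer perceptron. As a generic illustration only (the paper's actual layer sizes, features, weights, and activation are not given, so everything below is an assumption), a one-hidden-layer forward pass producing a probability-like score looks like this:

```python
import math

def mlp_predict(x, w1, b1, w2, b2):
    # One-hidden-layer perceptron forward pass with a logistic sigmoid
    # hidden layer and a sigmoid output in (0, 1).
    hidden = [1 / (1 + math.exp(-(sum(wi * xi for wi, xi in zip(row, x)) + bi)))
              for row, bi in zip(w1, b1)]
    z = sum(w * h for w, h in zip(w2, hidden)) + b2
    return 1 / (1 + math.exp(-z))
```

In practice the weights would be learned by backpropagation on the five graduating classes used as training data.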

143 citations


Journal ArticleDOI
TL;DR: A new method for self-adaptive image blocking based on a threshold value, and a new hybrid filter bank of self-adaptive median and morphological filters, which is used to smooth the noisy image.
Abstract: To address the defects of the traditional Canny operator, this paper puts forward an improved edge-detection algorithm. First, it gives a new method for self-adaptive image blocking based on a threshold value. Next, it proposes a new hybrid filter bank combining a self-adaptive median filter with morphological filtering, which is used to smooth the noisy image. Then, gradient information in the two diagonal directions is added so that the gradient information is more complete. Finally, the image edges are obtained by thresholding the gradient image after non-maximum suppression. For noisy images, the improved algorithm not only filters out noise well but also produces edges that are continuous, smooth, and clear. The experimental results show that the improved algorithm is effective for edge detection and has strong noise immunity; the objective evaluation and visual effect are good as well.
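The "two additional diagonal directions" idea can be sketched with simple difference kernels; the exact kernels and the weighting of the diagonal terms below are illustrative assumptions, not the paper's definitions:

```python
import numpy as np

def gradient_with_diagonals(img):
    # Gradient magnitude from four directions (0, 90, 45, 135 degrees)
    # instead of the traditional two used by basic Canny.
    img = img.astype(float)
    gx = img[1:-1, 2:] - img[1:-1, :-2]      # horizontal
    gy = img[2:, 1:-1] - img[:-2, 1:-1]      # vertical
    g45 = img[2:, 2:] - img[:-2, :-2]        # one diagonal
    g135 = img[2:, :-2] - img[:-2, 2:]       # other diagonal
    # Halve the diagonal contribution so a pure ramp is not overweighted.
    return np.sqrt(gx ** 2 + gy ** 2 + (g45 ** 2 + g135 ** 2) / 2)
```

Non-maximum suppression and thresholding would then be applied to this magnitude map as in standard Canny.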

77 citations


Journal ArticleDOI
TL;DR: Huge amounts of potential feature information, represented as word vectors, are generated by neural networks from unlabeled biomedical text files; the result is close to the state-of-the-art performance with only the POS (part-of-speech) feature and shows that deep learning can be effectively applied to biomedical NER.
Abstract: Many machine learning methods have been applied to biomedical named entity recognition and achieve good results on the GENIA corpus. However, most of those methods rely on feature engineering, which is labor-intensive. In this paper, huge amounts of potential feature information, represented as word vectors, are generated by neural networks from unlabeled biomedical text files. We propose a Biomedical Named Entity Recognition (Bio-NER) method based on a deep neural network architecture with multiple layers, where each layer abstracts features from the features generated by lower layers. Our system achieved an F-score of 71.01% on the GENIA regular test corpus, and the F-score for 5-fold cross-validation is also 71.01%. This result is close to the state-of-the-art performance with only the POS (part-of-speech) feature and shows that deep learning can be effectively applied to biomedical NER.

66 citations


Journal ArticleDOI
TL;DR: This paper studies and analyzes the application of cloud computing and the Internet of Things in the medical environment and proposes a hospital medical information service cloud system for monitoring and management applications.
Abstract: With the fast development of cloud computing and computer science, combining the IoT and cloud computing in medical-assistance environments is urgently needed. Prior research has focused more on the individual development of each single technique; quite little research has been conducted on medical monitoring and management service applications. Therefore, in this paper we study and analyze the application of cloud computing and the Internet of Things in the medical environment, aiming to combine the two technologies in a hospital monitoring and management information system. A remote monitoring cloud platform for hospital information (RMCPHI) architecture model was established first, and the RMCPHI architecture was then analyzed. Finally, an effective PSOSAA algorithm was proposed for the hospital medical information service cloud system monitoring and management application. Experimental simulation illustrates that the proposed algorithm outperforms other state-of-the-art algorithms. Further potential research areas are also discussed.

66 citations


Journal ArticleDOI
TL;DR: An algorithm based on integrating Genetic Algorithms and Simulated Annealing is presented to solve the Job Shop Scheduling problem; it is an approximation algorithm for the optimization problem, i.e., obtaining the minimum makespan in a job shop.
Abstract: The Job-Shop Scheduling Problem (JSSP) is a well-known and challenging combinatorial optimization problem that falls in the NP-complete problem class. This paper presents an algorithm that integrates Genetic Algorithms and Simulated Annealing to solve the Job Shop Scheduling problem. The procedure is an approximation algorithm for the optimization problem, i.e., obtaining the minimum makespan in a job shop. SA is a well-known iterative improvement method for combinatorial optimization problems; the procedure accepts cost-increasing solutions with a nonzero probability in order to escape local minima. The problem studied in this paper revolves around allocating different operations to machines and sequencing those operations under specific sequence constraints.
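The key SA ingredient the abstract mentions — accepting cost-increasing solutions with a nonzero probability — is the Metropolis criterion. A minimal sketch (the function names, the toy cost, and the geometric cooling schedule are illustrative assumptions, not the paper's hybrid GA-SA procedure):

```python
import math
import random

def acceptance_probability(delta, temperature):
    # Metropolis criterion: always accept improvements; accept a
    # cost-increasing move (delta > 0) with probability exp(-delta/T).
    if delta <= 0:
        return 1.0
    return math.exp(-delta / temperature)

def anneal(cost, neighbour, state, t0=10.0, cooling=0.95, steps=500, seed=1):
    # Generic simulated-annealing loop with geometric cooling.
    rng = random.Random(seed)
    current, best = state, state
    t = t0
    for _ in range(steps):
        candidate = neighbour(current, rng)
        delta = cost(candidate) - cost(current)
        if rng.random() < acceptance_probability(delta, t):
            current = candidate
            if cost(current) < cost(best):
                best = current
        t *= cooling
    return best

# Toy usage: minimise (x - 3)^2 over the integers, stepping +/-1.
best = anneal(lambda x: (x - 3) ** 2,
              lambda x, rng: x + rng.choice([-1, 1]),
              state=20)
```

In the paper's hybrid, a GA would evolve a population of schedules while SA refines individuals; only the SA half is sketched here.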

39 citations


Journal ArticleDOI
TL;DR: This review discusses the description and categorization of available security attributes, including durability, which refers to the capability of software to keep delivering its intended service over time, and how software security is affected by security attributes as well as durability.
Abstract: Security is a highly significant quality factor in the field of software engineering. Software security improvement is aided by security factors, models, and metrics, and software security should be analyzed in terms of its security factors. The security dimension is the main attribute in evaluating, implementing, and measuring security so as to organize and develop software quality. The qualification of security factors is improved by inspecting damage and identifying vulnerabilities and attacks during the design and development process. This review discusses the description and categorization of available security attributes. Durability is a security attribute that refers to the capability of software to keep delivering its intended service over its expected lifetime. Software security is affected by security attributes as well as durability, and a stable state of secure software provides additional security.

34 citations


Journal ArticleDOI
TL;DR: The existing evaluation metrics, with their respective advantages and disadvantages, are discussed, and the results show the applicability of the Matthews correlation coefficient to the comparative evaluation of recommendation algorithms.
Abstract: Personalized recommendation systems can improve personalized service for network users and alleviate the problem of information overload on the Internet. As is well known, the key to a successful recommendation system is the performance of its recommendation algorithm. When scholars put forward new recommendation algorithms, they claim that the new algorithms improve on previous ones in some respects, so evaluation metrics are needed to assess algorithm performance. However, scholars have not fully understood the evaluation mechanisms of recommendation algorithms and have mainly emphasized specific metrics such as accuracy and diversity. Moreover, the academic community has not established a complete, unified, and credible evaluation system for recommendation algorithms, so evaluating them objectively and reasonably remains a challenging task. In this article, we discuss the existing evaluation metrics with their respective advantages and disadvantages. We then propose using the Matthews Correlation Coefficient to evaluate recommendation algorithm performance. This work is based on an open-source project called Mahout, which provides a rich set of components for constructing classic recommendation algorithms. The experimental results show the applicability of the Matthews correlation coefficient to the comparative evaluation of recommendation algorithms.
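For reference, the Matthews Correlation Coefficient the authors advocate is computed from the binary confusion matrix; a straightforward sketch (returning 0 on a zero denominator is a common convention, not something the paper specifies):

```python
import math

def matthews_corrcoef(tp, fp, tn, fn):
    # MCC = (TP*TN - FP*FN) / sqrt((TP+FP)(TP+FN)(TN+FP)(TN+FN)),
    # ranging from -1 (total disagreement) to +1 (perfect prediction).
    num = tp * tn - fp * fn
    den = math.sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    return num / den if den else 0.0
```

Unlike plain accuracy, MCC stays informative when the two classes (e.g. "relevant" vs. "not relevant" items) are heavily imbalanced, which is the usual case in recommendation.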

31 citations


Journal ArticleDOI
TL;DR: In this article, water samples collected from two sites (Urdana Nala and Moti Nala) in Jabalpur city were analyzed for their physico-chemical characteristics.
Abstract: Water samples collected from two sites (Urdana Nala and Moti Nala) in Jabalpur city were analyzed for their physico-chemical characteristics. The analysis results were compared with the WHO and United States Salinity Laboratory standards for irrigation/drinking water quality using the following parameters: pH, electrical conductivity, Cu, Cr, SO4, Fe, NO3, chloride, TH, TA, and Na. Statistical analyses, namely Factor Analysis (FA) and Principal Component Analysis (PCA), of the obtained data were carried out; PCA was used to extract and identify the parameters responsible for the main variability in water quality for Jabalpur city. Moti Nala and Urdana Nala water contains high Na ion concentrations, but Cu, Cr, SO4, Fe, NO3, Cl, and TH are within limits, while TA is higher in most of the samples. PCA identified NO3, Cr, Fe, TH, Na, EC, and SO4 as the more important parameters at the Urdana Nala site, and NO3, Cu, Na, Fe, TA, pH, SO4, TH, and Cr at the Moti Nala site. Finally, the PCA results provide useful insight into water quality monitoring and the interpretation of surface water data.
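Principal Component Analysis, as used here to find the parameters carrying the main variance, reduces to an eigen-decomposition of the covariance matrix of the measurements. A minimal sketch (variable names are illustrative, and the paper's exact preprocessing is not specified):

```python
import numpy as np

def pca(X, k=1):
    # Center the data, then project onto the top-k eigenvectors of the
    # covariance matrix; also report each component's share of variance.
    Xc = X - X.mean(axis=0)
    cov = np.cov(Xc, rowvar=False)
    vals, vecs = np.linalg.eigh(cov)          # eigenvalues in ascending order
    order = np.argsort(vals)[::-1][:k]
    components = vecs[:, order]
    explained = vals[order] / vals.sum()
    return Xc @ components, components, explained
```

The loadings in `components` indicate which water-quality parameters (Na, NO3, Cr, ...) dominate each principal component.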

27 citations


Journal ArticleDOI
TL;DR: An overview of Wireless Body Sensor Networks is given, covering the devices used, architecture, protocol stack, issues, topology, WBSN standards, and challenges.
Abstract: With the rise in population and the increase in the number of older people, Wireless Body Sensor Networks can prove beneficial in providing medical service to people who require continuous monitoring and care. Developments in wireless communication technologies have led to sensor nodes which can be worn on, implanted in, or embedded in the body. These are small devices which can process the signals sensed from the human body and then communicate them to the required destination, where the data can be used for research purposes or diagnosis. Much of the past research on Body Area Networks deals with the design of sensor nodes, miniaturizing the nodes, and various communication and routing protocols. This paper gives an overview of Wireless Body Sensor Networks: the devices used, their architecture, protocol stack, issues, topology, WBSN standards, and challenges. Some protocols and security schemes for Wireless Body Sensor Networks are also discussed.

25 citations


Journal ArticleDOI
TL;DR: In this article, a Double Unscented Kalman Filtering (D-UKF) algorithm is designed to calculate the real values of the state of charge (SOC) and the Ohmic resistance of a lithium battery at the same time.
Abstract: This paper considers an accurate estimation method for the State of Health (SOH) of lithium batteries. An improved battery model is proposed based on an equivalent circuit model and the battery's internal electrochemical characteristics. In our study, a Double Unscented Kalman Filtering (D-UKF) algorithm is designed to calculate the State of Charge (SOC) and SOH of a lithium battery at the same time. The distinctive feature of our method is that the SOH estimation model is derived from the battery's internal resistances; the Ohmic resistance (one of the internal resistances) can be identified online with the D-UKF algorithm. Two filters, UKF1 and UKF2, work together to calculate the real values of the SOC and the Ohmic resistance and thus obtain the final SOH value. The experimental results indicate that our new battery model accounts for the different values of the internal resistances under different working conditions (different voltages and currents). Our study also verifies the performance and feasibility of the new D-UKF-based estimation method, and the algorithm has practical value for further study of other types of lithium battery.

24 citations


Journal ArticleDOI
TL;DR: A novel hybrid Bat Algorithm with a Differential Evolution strategy using feasibility-based rules, namely BADE, is proposed to deal with constrained optimization problems; experiments demonstrate that BADE performs more efficiently, accurately, and robustly than the original BA, DE, and some other optimization methods.
Abstract: A novel hybrid Bat Algorithm (BA) with a Differential Evolution (DE) strategy using feasibility-based rules, namely BADE, is proposed to deal with constrained optimization problems. Sound interference from other objects is inevitable for bats, which rely on echolocation to detect and localize objects. By integrating the DE strategy into BA, such interference from insects can be effectively mimicked in BADE. Moreover, the mean velocity of the bat swarm is used to simulate the other bats' effect on each bat. By taking into account the environments that bats inhabit, the virtual bats become more lifelike. Experiments on some benchmark problems and engineering designs demonstrate that BADE performs more efficiently, accurately, and robustly than the original BA, DE, and some other optimization methods.
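The DE strategy folded into BADE centers on differential mutation. A sketch of the classic DE/rand/1 operator (the exact DE variant and F value used in the paper are not stated, so treat these as assumptions):

```python
import random

def de_rand_1(population, f=0.5, seed=0):
    # DE/rand/1 mutation: for each target vector, pick three distinct
    # other vectors r1, r2, r3 and form v = x_r1 + F * (x_r2 - x_r3).
    rng = random.Random(seed)
    n = len(population)
    mutants = []
    for i in range(n):
        r1, r2, r3 = rng.sample([j for j in range(n) if j != i], 3)
        v = [population[r1][d] + f * (population[r2][d] - population[r3][d])
             for d in range(len(population[i]))]
        mutants.append(v)
    return mutants
```

In the hybrid, such difference vectors play the role of the "sound interference" that perturbs each bat's echolocation-guided move.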

Journal ArticleDOI
TL;DR: Two modified versions of CSA, in which new solutions are generated using Gaussian and Cauchy distributions in addition to a bound-by-best-solutions mechanism, are proposed for solving the economic load dispatch (ELD) problem with multiple fuel options.
Abstract: The Cuckoo Search Algorithm (CSA), a meta-heuristic based on the natural behavior of cuckoo species and Lévy-flight random walks, has been widely and successfully applied to several optimization problems. In this paper, two modified versions of CSA, in which new solutions are generated using Gaussian and Cauchy distributions in addition to a bound-by-best-solutions mechanism, are proposed for solving the economic load dispatch (ELD) problem with multiple fuel options. The advantage of CSA with the Gaussian distribution (CSA-Gauss) and CSA with the Cauchy distribution (CSA-Cauchy) over CSA with the Lévy distribution and other meta-heuristics is that they require fewer parameters. The proposed CSA methods are tested on two systems with several load cases, and the obtained results are compared with those of other methods. The comparisons show that the proposed methods are highly effective for solving the ELD problem with multiple fuel options and/or the valve-point effect.
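The Gaussian and Cauchy replacements for the Lévy step can be sketched as follows; scaling the perturbation by the distance to the best solution is an illustrative assumption, not the paper's exact update rule:

```python
import math
import random

def new_solution_gauss(x, best, sigma=0.1, seed=0):
    # Candidate around x using Gaussian perturbations scaled by the
    # distance to the best-known solution (CSA-Gauss idea, sketched).
    rng = random.Random(seed)
    return [xi + sigma * rng.gauss(0.0, 1.0) * (bi - xi)
            for xi, bi in zip(x, best)]

def new_solution_cauchy(x, best, gamma=0.1, seed=0):
    # Same, but with heavy-tailed Cauchy steps sampled via the inverse
    # CDF: gamma * tan(pi * (u - 0.5)) for uniform u.
    rng = random.Random(seed)
    return [xi + gamma * math.tan(math.pi * (rng.random() - 0.5)) * (bi - xi)
            for xi, bi in zip(x, best)]
```

The Cauchy version takes occasional very large steps (heavy tails, aiding escape from local optima), while the Gaussian version concentrates on local refinement.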

Journal ArticleDOI
TL;DR: This paper analyzes the growth of ten releases of five open source projects from different domains, shows how complexity evolves over time, and examines how these systems conform to Lehman's second law of software evolution.
Abstract: As a software system evolves, its scale grows to the degree where it becomes very hard to handle. Measuring the internal quality of the source code is one of the goals of making software development an engineering practice. Source Lines of Code (SLOC) and Cyclomatic Complexity (CC) are usually considered indicators of the complexity of a software system. Software complexity is an essential characteristic of a software system and plays an important role in its success or failure. Although understanding complexity is very important, it is not yet clear how complexity evolves in open source systems. In this paper, we study the complexity evolution of five open source projects from different domains. We analyze the growth of ten releases of these systems and show how complexity evolves over time. We then show how these systems conform to Lehman's second law of software evolution.
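Of the two indicators, SLOC is a direct count, while Cyclomatic Complexity is commonly approximated as the number of decision points plus one. A small sketch for Python source using the standard `ast` module (counting each `BoolOp` once is a simplification of McCabe's definition):

```python
import ast

def cyclomatic_complexity(source):
    # Approximate McCabe complexity: 1 + number of decision points.
    tree = ast.parse(source)
    decisions = sum(isinstance(node, (ast.If, ast.For, ast.While,
                                      ast.ExceptHandler, ast.BoolOp))
                    for node in ast.walk(tree))
    return decisions + 1

def sloc(source):
    # Source lines of code: non-blank, non-comment lines.
    return sum(1 for line in source.splitlines()
               if line.strip() and not line.strip().startswith("#"))
```

Tracking these two numbers per release is exactly the kind of measurement the paper performs across its ten releases.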

Journal ArticleDOI
TL;DR: In this paper, a review of the use of metamaterials in microstrip patch antennas is presented, where the authors first overview the metammaterials, its types and then apply them in Microstrip Patch antennas over the last 13-15 years.
Abstract: Metamaterial is the arrangement of "artificial" elements in a periodic manner providing unusual electromagnetic properties. This unusual property has made it an area of interest for last few decades. It has wide applications in antennas. Gain, directivity, bandwidth, efficiency, and many other parameters of microstrip patch antenna can be improved using metamaterials. In this review paper, we first overview the metamaterials, its types and then the application of metamaterials in Microstrip patch antennas over the last 13-15 years.

Journal ArticleDOI
TL;DR: Different techniques of live analysis are examined and critically reviewed by identifying their benefits and limitations; the key areas of focus in this study are virtualization, pagefile extraction, and identification of encryption keys.
Abstract: The widespread availability and extensive use of the Internet across the world has caught the attention of criminals, and digital crimes are now occurring at an epidemic scale. The field of digital forensics is constantly evolving, employing new tools and techniques to counter the novel approaches employed by criminals, to investigate the nature of criminal activity, and to bring culprits to justice. Traditionally, static analysis was used to investigate digital incidents. But due to advancements in technology and the fact that hackers are developing malware that does not leave a footprint on the hard disk, performing live digital forensic analysis in addition to static analysis has become imperative. Live forensic analysis techniques have evolved during the last decade to analyze memory content and obtain a better picture of running application programs, processes, and active binaries. In this study, we look into different techniques of live analysis and critically review them by identifying their benefits and limitations. The key areas of focus in this study are virtualization, pagefile extraction, and identification of encryption keys.

Journal ArticleDOI
TL;DR: Experiments show that this method of ontology mapping and merging based on a rough concept lattice isomorphism model is better than the traditional method in semantic annotation accuracy and breadth.
Abstract: Semantic annotation is the process of annotating web resources and their various parts with concept classes, attributes, and other metadata based on an ontology. Ontology mapping calculates the similarity between the elements of two ontologies, and ontology merging combines two or more source ontologies into a goal ontology. The basic principle of isomorphic concept lattice generation is that isomorphic formal contexts yield isomorphic concept lattices, so the concept lattice can be generated from an isomorphic context. This paper analyzes methods of ontology mapping and merging based on a rough concept lattice isomorphism model and presents semantic annotation of ontologies using this model. Experiments show that this method is better than the traditional method in semantic annotation accuracy and breadth.


Journal ArticleDOI
TL;DR: The improved K-means algorithm with density constraints can reduce processing time; in particular, as the value of K (the number of clusters) increases, the computation time of the clustering algorithm decreases greatly, and the stability of the improved algorithm is verified as the data size increases.
Abstract: With the development of 3D scanners, it has become more convenient to acquire point data. However, processing large-scale point clouds raises a new challenge for computer graphics. This paper places an emphasis on the characteristics of the point data themselves: the point data are divided into point sets, i.e., clusters, by a clustering algorithm. To suit point data organization and space division, the clustering algorithm is improved, and this paper provides a new K-means algorithm with density constraints. Before the point cloud is processed by this algorithm, the density of the point cloud is defined, and this density is used to quantify convergence. Finally, the K-means algorithm with density constraints is verified by experiments. Our experiments showed that the improved K-means can reduce processing time; in particular, as the value of K (the number of clusters) increases, the computation time of the clustering algorithm decreases greatly. In addition, the stability of the improved K-means algorithm was verified as the data size increases.
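For context, the baseline the paper builds on is Lloyd's K-means; the density-constraint step is specific to the paper and omitted here. A minimal pure-Python sketch:

```python
import random

def kmeans(points, k, iters=20, seed=0):
    # Lloyd's algorithm: assign each point to its nearest centroid,
    # then move each centroid to the mean of its assigned points.
    rng = random.Random(seed)
    centroids = [list(p) for p in rng.sample(points, k)]
    clusters = [[] for _ in range(k)]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k),
                          key=lambda c: sum((a - b) ** 2
                                            for a, b in zip(p, centroids[c])))
            clusters[nearest].append(p)
        for i, members in enumerate(clusters):
            if members:
                centroids[i] = [sum(dim) / len(members)
                                for dim in zip(*members)]
    return centroids, clusters
```

The paper's variant would additionally use its point-cloud density definition to constrain assignments and to decide convergence.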

Journal ArticleDOI
TL;DR: This paper proposes and implements a cloud-based electronic medical record (CloudeMR) system to improve the delivery of healthcare in the rural communities of Nigeria.
Abstract: The goal of utilizing modern information technology in the delivery of healthcare is to enhance the availability and reliability of improved healthcare services for patients at a reduced cost. The alternative in this context is to outsource computing and storage resources with the help of cloud infrastructure. The drastic reduction in the cost of healthcare services, better utilization of resources, maintainability, and the adoption of new technologies are some of the benefits that healthcare centers in rural areas can get from a cloud-based medical information system. In addition, new prospects such as easy and ever-present access to medical records, and the chance to use the services of physicians who are not readily available in rural areas, are among the opportunities offered by such a system. This paper proposes and implements a cloud-based electronic medical record (CloudeMR) system to improve the delivery of healthcare in the rural communities of Nigeria.

Journal ArticleDOI
TL;DR: A new method to counter SSDF attacks by excluding malicious users in cognitive radio is proposed; simulations show that both the detection probability and the false alarm probability are significantly improved compared to the case when all users are by default trusted as normal users.
Abstract: Cognitive radio (CR) can improve the utilization of the spectrum by making use of licensed spectrum in an opportunistic manner. However, the security aspects of cognitive radio networks have garnered little attention. In this paper, we identify a threat to cognitive radio networks which we call the spectrum sensing data falsification (SSDF) attack; an SSDF attack can hugely degrade the achievable detection accuracy. To counter this threat, we propose a new method that confronts SSDF attacks by excluding malicious users in cognitive radio. In detail, the proposed scheme defends against SSDF attacks by calculating and updating the credit values of the Secondary Users (SUs); malicious users are excluded so that their attacks do not affect cooperative spectrum sensing. Simulation results show that both the detection probability and the false alarm probability are significantly improved compared to the case when all users are by default trusted as normal users.
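The credit-value idea can be illustrated with a toy scheme: fuse binary sensing reports with a credit-weighted vote, then raise the credit of SUs who agreed with the fused decision and lower the credit of those who did not. The update rule and thresholds below are illustrative assumptions, not the paper's exact scheme:

```python
def fuse_reports(reports, credits):
    # Credit-weighted majority vote over binary sensing reports
    # (1 = "primary user present", 0 = "absent").
    total = sum(credits[u] for u in reports)
    votes_for = sum(credits[u] for u, r in reports.items() if r == 1)
    return 1 if votes_for >= total / 2 else 0

def update_credits(reports, decision, credits, step=0.1, floor=0.0):
    # Reward agreement with the fused decision, penalise disagreement;
    # users whose credit decays to the floor can be excluded from fusion.
    for u, r in reports.items():
        credits[u] = max(floor, credits[u] + (step if r == decision else -step))
    return credits
```

An SU that persistently falsifies its reports sees its credit decay toward the floor and stops influencing the fused decision.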

Journal ArticleDOI
TL;DR: This survey paper describes the architecture of the traditional data warehouse and the ways in which data warehouse techniques are used to support academic decision making, and defines the different data warehouse types and techniques used in educational environments to extract, transform, and load data.
Abstract: Data warehousing and data mining are technologies that deliver optimally valuable information to enable effective decision making. This survey paper describes the architecture of the traditional data warehouse and the ways in which data warehouse techniques are used to support academic decision making. The paper covers the different data warehouse types and the techniques used in educational environments to extract, transform, and load data, as well as ways to improve these techniques to gain the maximum benefit from a data warehouse in an educational environment. Further, this paper describes different data warehouse frameworks for different situations.

Journal ArticleDOI
TL;DR: This paper argues that the main reason to prefer a larger page is to increase the speed of virtual-to-physical translation: because the size of a TLB is limited, we have to use larger pages to increase TLB coverage.
Abstract: Choosing the best page size for virtual memory requires considering several factors. A smaller page size reduces the amount of internal fragmentation; on the other hand, a larger page needs smaller page tables. This paper argues, however, that the main reason to prefer a larger page is to increase the speed of virtual-to-physical translation: because the size of a TLB is limited, we have to use larger pages to increase TLB coverage.
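The TLB-coverage argument is easy to quantify: coverage is simply the number of TLB entries times the page size. A quick sketch:

```python
def tlb_coverage(entries, page_size):
    # Memory reachable without a TLB miss = number of entries x page size.
    return entries * page_size

# A 64-entry TLB covers 256 KiB with 4 KiB pages,
# but 128 MiB with 2 MiB pages -- a 512x increase.
```

Since the entry count is fixed by hardware, growing the page size is the only lever for coverage, which is the paper's point.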

Journal ArticleDOI
TL;DR: This paper discusses some of the parameters used to enhance the performance of the Particle Swarm Optimization technique, such as the constriction coefficient and the inertia weight.
Abstract: Particle swarm optimization (PSO) is an artificial intelligence (AI) technique that can be used to find approximate solutions to extremely difficult or impossible numeric maximization and minimization problems. It is a stochastic optimization algorithm based on swarm intelligence. Optimization problems arise in many fields of science and technology and can be complex due to their practical nature. PSO is a very good technique for such problems, but it still has the drawback of getting stuck in local minima. To improve the performance of PSO, researchers have proposed several variants: some improve the initialization of the swarm; some introduce new parameters like the constriction coefficient and the inertia weight; some define different methods of setting the inertia weight; and some work on the global and local bests. This paper discusses some of the parameters used to enhance the performance of the Particle Swarm Optimization technique.
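The two parameters the survey highlights both enter the velocity update. With an inertia weight w, the update is v' = w*v + c1*r1*(pbest - x) + c2*r2*(gbest - x); a sketch (r1 and r2 would normally be fresh uniform random numbers each step, fixed here for reproducibility):

```python
def update_velocity(v, x, pbest, gbest, w=0.7, c1=1.5, c2=1.5, r1=0.5, r2=0.5):
    # Inertia-weight PSO velocity update: momentum term plus pulls
    # toward the particle's own best and the swarm's global best.
    return [w * vi + c1 * r1 * (pi - xi) + c2 * r2 * (gi - xi)
            for vi, xi, pi, gi in zip(v, x, pbest, gbest)]
```

The constriction-coefficient variant instead multiplies the whole right-hand side by a factor chi derived from c1 + c2, which guarantees convergence without velocity clamping.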

Journal ArticleDOI
TL;DR: There are some security and privacy issues in utilizing any form of virtualization, and the aim of this survey is to highlight such threats and the techniques for resolving them.
Abstract: Virtualization is a term that refers to the abstraction of computer resources. Virtualization has many applications within any organization: it makes virtual storage networks possible and allows hardware resources to be utilized efficiently. Virtualization also forms the foundation of cloud computing services, allowing users to use hardware as an on-demand service. With all these advantages, there are also some security and privacy issues in utilizing any form of virtualization. The aim of this survey is to highlight such threats and the techniques for resolving them.

Journal ArticleDOI
TL;DR: An improved structure for cat swarm optimization (ICSO) is presented, capable of improving search efficiency within the problem space under the conditions of a small population size and few iterations; it achieves higher accuracy than existing methods and requires less computational time.
Abstract: Cat swarm optimization (CSO) is a meta-heuristic evolutionary optimization algorithm based on swarm intelligence. CSO imitates the behavior of cats through two sub-modes: seeking and tracing. Previous studies have indicated that CSO algorithms outperform other well-known meta-heuristics, such as genetic algorithms and particle swarm optimization; however, because of its complexity, pure CSO sometimes takes a long time to converge to the optimal solution. To improve the convergence of CSO with better accuracy and less computational time, this study presents an improved structure for cat swarm optimization (ICSO), capable of improving search efficiency within the problem space under the conditions of a small population size and few iterations. The improved algorithm mixes two concepts. The first is found in the parallel cat swarm optimization (PCSO) method, an optimization algorithm designed to solve numerical optimization problems based on cats' cooperation and competition, which improves the convergence of CSO. The second is found in Average-Inertia Weighted CSO (AICSO), which adds a new parameter to the velocity update equation as an inertia weight and uses a new form of the position update equation in the tracing mode of the algorithm. The performance of ICSO is sensitive to the selection of the control parameters. The experimental results show that the proposed algorithm achieves higher accuracy than existing methods, requires less computational time, and converges much better than pure CSO; it can provide optimal block matching in a very short time, finding the best solution in fewer iterations, and is suitable for video tracking applications.

Journal ArticleDOI
TL;DR: This paper analyzes the aggregation and propagation of self-similar traffic between nodes in a satellite network and models a special kind of network node called the ground gateway, based on which the characteristics of the output traffic produced when input traffic from the terrestrial network passes through the gateway into the satellite network are analyzed.
Abstract: It has already been confirmed that traffic in high-speed terrestrial networks exhibits self-similarity, but there is little research on the self-similarity of traffic in satellite networks. Considering the time-varying network topology and link status, this paper analyzes the aggregation and propagation of self-similar traffic between nodes in a satellite network. Furthermore, a special kind of network node called the ground gateway is modeled, based on which the characteristics of the output traffic produced when input traffic from the terrestrial network passes through the gateway into the satellite network are analyzed. Theoretical analyses demonstrate that traffic is still self-similar after aggregation and propagation between satellite nodes, and that the self-similarity of the output traffic generated by the gateway from the terrestrial network to the satellite network is more often than not weakened.

Journal ArticleDOI
TL;DR: Experimental results show that the global search capability of IPSO is significantly improved and that IPSO can effectively avoid the premature convergence problem and solve the multi-agent coalition formation problem effectively and efficiently.
Abstract: How to generate the task-oriented optimal agent coalition is a key issue in multi-agent systems and a typical optimization problem. In this paper, an improved particle swarm optimization (IPSO) is proposed to solve this problem. To overcome the premature convergence and local optimization problems of traditional particle swarm optimization (PSO), we propose a PSO variant with a varying inertia weight, derived by analyzing the particle optimization process in PSO. Compared with several well-known algorithms such as PSO and ACO, experimental results show that the global search capability of IPSO is significantly improved and that IPSO can effectively avoid the premature convergence problem. It can also solve the multi-agent coalition formation problem effectively and efficiently.

Journal ArticleDOI
TL;DR: A survey on trust-based routing in AODV-based VANETs for finding secure locations is presented, along with a review of various works on trust-based VANETs.
Abstract: During the last few years, vehicular ad hoc networks (VANETs) have been an extensive focus of researchers. A VANET is a subclass of mobile ad hoc networks built to ensure the safety of traffic. It is a type of mobile peer-to-peer network, although it exhibits some distinct characteristics (fast movement, short-lived connections, etc.). VANETs differ from MANETs in their large-scale networks, higher node mobility, geographically constrained topology, and frequent network partitioning. In this paper, we present a survey on trust-based routing in AODV-based VANETs for finding secure locations. We first discuss VANETs, their applications, characteristics, attacks, and routing protocols, and then present a review of various works on trust-based VANETs.

Journal ArticleDOI
TL;DR: A double mutation cuckoo search algorithm (DMCS) is presented to overcome the disadvantages of traditional cuckoo search algorithms, such as poor accuracy, low convergence rate, and a tendency to fall into local optima.
Abstract: This paper presents a double mutation cuckoo search algorithm (DMCS) to overcome the disadvantages of traditional cuckoo search algorithms, such as poor accuracy, low convergence rate, and a tendency to fall into local optima. The algorithm mutates the parasitic nests with the best fitness using a small probability, which enhances the local search around the optimal solution and improves search accuracy. Meanwhile, it mutates parasitic nests in poor positions with a large probability, which enlarges the search space and benefits global convergence. The experimental results show that the algorithm can effectively improve convergence speed and optimization accuracy when applied to standard test functions and systems of nonlinear equations.
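The two mutation rates can be sketched as follows: nests ranked in the better half are perturbed with a small probability (fine local search), the rest with a large probability (global exploration). The Gaussian perturbation, the half-split, and the default probabilities are illustrative assumptions, not the paper's exact rule:

```python
import random

def double_mutation(nests, scores, p_best=0.1, p_worst=0.7, scale=0.5, seed=0):
    # Rank nests by fitness (lower score = better); mutate good nests
    # rarely and with small effect, poor nests often, to balance
    # exploitation and exploration.
    rng = random.Random(seed)
    order = sorted(range(len(nests)), key=lambda i: scores[i])
    half = len(nests) // 2
    out = [list(n) for n in nests]
    for rank, i in enumerate(order):
        p = p_best if rank < half else p_worst
        if rng.random() < p:
            out[i] = [x + scale * rng.gauss(0.0, 1.0) for x in out[i]]
    return out
```

This operator would replace (or supplement) the plain Lévy-flight step of standard cuckoo search inside the usual generate-evaluate-abandon loop.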

Journal ArticleDOI
TL;DR: Wang et al. as discussed by the authors conducted an empirical study of the factors behind user adoption of mobile games and found that perceived usefulness, perceived entertainment, economic cost, and subjective norm significantly affect the attitude toward use, which leads to the behavioral intention to adopt a mobile game.
Abstract: Mobile games are experiencing rapid development and are among the most popular mobile applications. This article focuses on the Chinese mobile game market and conducts an empirical study of the factors behind user adoption of mobile games. The results show that perceived usefulness, perceived entertainment, economic cost, and subjective norm significantly affect the attitude toward use, which leads to the behavioral intention to adopt a mobile game, while perceived ease of use, similarity, and brand trust, which the related literature proposed as affecting user adoption, have an insignificant effect on the adoption of mobile games.