
Showing papers in "Journal of Computer Applications in 2010"


Journal Article•DOI•
TL;DR: An improved artificial potential field was used to solve the problem of Goals Nonreachable with Obstacles Nearby (GNON); the improved potential function ensures that the goal is the global minimum of the field, so the robot can always reach it.
Abstract: This paper utilized an improved artificial potential field to solve the problem of Goals Nonreachable with Obstacles Nearby (GNON). The improved field adopted a modified potential function that ensures the goal is the global minimum, so the robot can reach the goal freely. To address the local-minimum problem of the classical potential field method, a method combining obstacle connection with a discrete model of the robot's sensors was proposed. The improved method is well suited to path planning for robots in complex indoor environments. The effectiveness of the improved algorithm was verified by simulation.

45 citations
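A minimal sketch of a gradient-descent step on an improved potential field, assuming the common remedy in which the repulsive force is scaled by the robot-goal distance so that the goal stays the global minimum; the gains, influence range, and the paper's obstacle-connection step are illustrative, not taken from the paper.

```python
import numpy as np

def apf_step(pos, goal, obstacles, k_att=1.0, k_rep=100.0, d0=2.0, step=0.05):
    """One gradient-descent step on an improved artificial potential field.

    The repulsive force is scaled by the distance to the goal, so the combined
    field keeps its global minimum at the goal (a common remedy for the
    goals-nonreachable-with-obstacles-nearby problem).
    """
    to_goal = goal - pos
    d_goal = np.linalg.norm(to_goal)
    f_att = k_att * to_goal                      # attractive force

    f_rep = np.zeros(2)
    for obs in obstacles:
        diff = pos - obs
        d = np.linalg.norm(diff)
        if d < d0:                               # obstacle within influence range
            # classic repulsion, multiplied by d_goal so it vanishes at the goal
            f_rep += k_rep * (1.0/d - 1.0/d0) / d**2 * (diff / d) * d_goal

    force = f_att + f_rep
    return pos + step * force / (np.linalg.norm(force) + 1e-9)

# usage: iterate apf_step until the robot is close enough to the goal
pos, goal = np.array([0.0, 0.0]), np.array([10.0, 10.0])
obstacles = [np.array([5.0, 5.2])]
for _ in range(2000):
    pos = apf_step(pos, goal, obstacles)
    if np.linalg.norm(goal - pos) < 0.1:
        break
```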


Journal Article•DOI•
TL;DR: A new clustering validity index was designed from the standpoint of sample geometry, and based on this index a new method for determining the optimal number of clusters in the K-means algorithm was proposed.
Abstract: The K-means algorithm clusters a dataset into a given number of clusters k; however, k cannot be determined beforehand. A new clustering validity index was designed from the standpoint of sample geometry, and based on this index a new method for determining the optimal number of clusters in the K-means algorithm was proposed. Theoretical analysis and experimental results demonstrate the validity and good performance of the proposed method.

43 citations
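A minimal sketch of choosing the cluster number by scanning k and maximizing a validity index. The paper defines its own geometry-based index, which is not reproduced here; the Calinski-Harabasz score is used purely as a stand-in.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.datasets import make_blobs
from sklearn.metrics import calinski_harabasz_score

X, _ = make_blobs(n_samples=500, centers=4, random_state=0)

best_k, best_score = None, -np.inf
for k in range(2, 11):                               # candidate cluster numbers
    labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(X)
    score = calinski_harabasz_score(X, labels)       # stand-in validity index
    if score > best_score:
        best_k, best_score = k, score

print("estimated optimal k:", best_k)
```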


Journal Article•
Tang Jin1•
TL;DR: This paper surveys recent techniques for image dehazing from the point of view of the physical model and of digital image processing; the fundamental principles of typical methods are summarized and state-of-the-art progress is presented.
Abstract: Image dehazing is an important issue for both digital image processing and computer vision. This paper surveyed recent techniques for image dehazing from the point of view of the physical model and of digital image processing. The fundamental principles of typical methods were summarized and state-of-the-art progress was presented. For several typical, recent haze-removal algorithms, both the perceptual visual effect and objective evaluation data were presented to illustrate their performance. Finally, some future research topics on image dehazing were suggested.

35 citations
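For orientation, most physics-based dehazing methods surveyed here start from the atmospheric scattering model I(x) = J(x)·t(x) + A·(1 − t(x)), where I is the hazy image, J the scene radiance, t the transmission and A the airlight. A minimal sketch of the final inversion step is shown below; how t and A are estimated is method-specific and not covered here.

```python
import numpy as np

def recover_radiance(I, t, A, t_min=0.1):
    """Invert the atmospheric scattering model I = J*t + A*(1 - t).

    I: hazy image (H x W x 3, float in [0, 1]); t: transmission map (H x W);
    A: airlight (3-vector). t is clamped to avoid amplifying noise.
    """
    t = np.clip(t, t_min, 1.0)[..., None]
    return np.clip((I - A) / t + A, 0.0, 1.0)
```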


Journal Article•
TL;DR: Using Matlab as the simulator, it is shown that the localization accuracy of the improved algorithm is better than that of the original algorithm and of several existing improved algorithms.
Abstract: In wireless sensor networks, node localization is one of the key technologies. The authors theoretically analyzed the DV-Hop algorithm and pointed out the main source of its error. An improved algorithm was then proposed that uses a correction value as the estimated distance between anchor nodes and unknown nodes; this value combines a multi-hop correction with the anchor nodes' average distance error. Meanwhile, Total Least Squares (TLS) was applied to node localization to enhance accuracy. Using Matlab as the simulator, it is shown that the localization accuracy of the improved algorithm is better than that of the original algorithm and of several existing improved algorithms.

30 citations
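A minimal sketch of the final DV-Hop positioning stage, assuming hop counts and an average hop distance are already available: the trilateration equations are linearized and solved with a total-least-squares (SVD) step. The paper's correction of the estimated distances is not reproduced, and the example data are illustrative.

```python
import numpy as np

def dvhop_estimate(anchors, hops, hop_dist):
    """Estimate an unknown node's position from anchor positions and hop counts.

    anchors: (n, 2) anchor coordinates; hops: (n,) hop counts to the unknown node;
    hop_dist: estimated average distance per hop. The trilateration equations are
    linearized against the last anchor and solved by total least squares (SVD).
    """
    d = hops * hop_dist                          # estimated anchor-to-node distances
    x_n, y_n, d_n = anchors[-1, 0], anchors[-1, 1], d[-1]
    A = 2 * (anchors[:-1] - anchors[-1])         # linearized coefficient matrix
    b = (anchors[:-1, 0]**2 - x_n**2 + anchors[:-1, 1]**2 - y_n**2
         + d_n**2 - d[:-1]**2)
    # total least squares via SVD of the augmented matrix [A | b]
    _, _, Vt = np.linalg.svd(np.hstack([A, b[:, None]]))
    v = Vt[-1]
    return -v[:2] / v[2]

anchors = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0], [10.0, 10.0]])
hops = np.array([2, 3, 3, 4])
print(dvhop_estimate(anchors, hops, hop_dist=2.5))
```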


Journal Article•
TL;DR: A novel no-reference image quality assessment index called No-Reference Structural Sharpness (NRSS) was proposed for quality evaluation of blurred images; experimental results show that the new index agrees well with both subjective evaluation and full-reference quality assessment methods.
Abstract: Based on an analysis of image blur using the imaging model, a method was proposed for constructing reference images, and the Structural Similarity (SSIM) index was introduced into no-reference image quality assessment. A novel no-reference image quality assessment index called No-Reference Structural Sharpness (NRSS) was then proposed for quality evaluation of blurred images. The method constructs a reference image with a low-pass filter and assesses image quality by computing the SSIM between the original image and this reference, thus exploiting the mathematical model of the imaging system as well as the advantages of SSIM. Experimental results show that the new index agrees well with the quality assessments of both subjective evaluation and full-reference methods.

26 citations
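A minimal sketch of the NRSS idea: build a reference by low-pass filtering the image, then measure with SSIM how much gradient structure the blur removes. The block-selection details of the published index are omitted, so the values are only indicative.

```python
import numpy as np
from scipy.ndimage import gaussian_filter, sobel
from skimage.metrics import structural_similarity

def nrss(image, sigma=3.0):
    """No-reference sharpness: 1 - SSIM between gradient maps of the image
    and of its low-pass-filtered reference (higher means sharper)."""
    image = image.astype(np.float64)
    reference = gaussian_filter(image, sigma)        # constructed reference image
    g = np.hypot(sobel(image, 0), sobel(image, 1))   # gradient magnitude
    g_ref = np.hypot(sobel(reference, 0), sobel(reference, 1))
    data_range = g.max() - g.min()
    return 1.0 - structural_similarity(g, g_ref, data_range=data_range)

# a sharp random image scores higher than its blurred copy
rng = np.random.default_rng(0)
sharp = rng.random((128, 128))
print(nrss(sharp), nrss(gaussian_filter(sharp, 2.0)))
```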


Journal Article•
TL;DR: It is shown that the proposed short-term traffic flow composite forecasting model has high forecasting precision, with forecasting accuracy remaining at 88% or above, and good practicality.
Abstract: To handle the missing-data problem in traffic detection, this paper proposed a new short-term traffic flow composite forecasting model. The model adopts a reconstruction method to deal with missing data and uses improved Kalman smoothing to perform short-term traffic flow forecasting. It overcomes the defect of traditional forecasting methods, which cannot handle missing data, and also attains high forecasting precision. Validation on Shenzhen data and comparison with traditional methods show that the new method has high forecasting precision, with forecasting accuracy remaining at 88% or above, and good practicality.

22 citations
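A minimal sketch of Kalman filtering plus Rauch-Tung-Striebel smoothing over a flow series in which missing detector samples (NaN) simply skip the measurement update. The paper's reconstruction method and its exact state model are not reproduced; the random-walk model and noise levels below are assumptions.

```python
import numpy as np

def kalman_smooth(z, q=1.0, r=10.0):
    """Random-walk Kalman filter + RTS smoother for a 1-D flow series.
    z: observations with np.nan marking missing samples."""
    n = len(z)
    x_f, p_f = np.zeros(n), np.zeros(n)          # filtered mean / variance
    x_p, p_p = np.zeros(n), np.zeros(n)          # predicted mean / variance
    x, p = z[~np.isnan(z)][0], 1.0               # init from first valid sample
    for k in range(n):
        x_p[k], p_p[k] = x, p + q                # predict (random walk)
        if np.isnan(z[k]):                       # missing data: no update
            x, p = x_p[k], p_p[k]
        else:
            g = p_p[k] / (p_p[k] + r)            # Kalman gain
            x = x_p[k] + g * (z[k] - x_p[k])
            p = (1 - g) * p_p[k]
        x_f[k], p_f[k] = x, p
    x_s = x_f.copy()                             # RTS backward smoothing pass
    for k in range(n - 2, -1, -1):
        c = p_f[k] / p_p[k + 1]
        x_s[k] = x_f[k] + c * (x_s[k + 1] - x_p[k + 1])
    return x_s

flow = np.array([52, 55, np.nan, 61, 64, np.nan, np.nan, 70, 68, 66], float)
print(np.round(kalman_smooth(flow), 1))
```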


Journal Article•
Zhao Xue-quan1•
TL;DR: This paper proposed using the silhouette coefficient to evaluate the K-means algorithm; it achieves better judgement of clustering quality than the other methods compared.
Abstract: Several methods for assessing the validity of clustering results were studied. Based on a comparison of these methods, this paper proposed using the silhouette coefficient to evaluate the K-means algorithm; it achieves better judgement of clustering quality than the other methods. Finally, extensive experiments on standard datasets verify the effectiveness of the proposed approach.

22 citations
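A minimal sketch of evaluating K-means clusterings with the silhouette coefficient using scikit-learn; the dataset and the range of k are illustrative.

```python
from sklearn.cluster import KMeans
from sklearn.datasets import load_iris
from sklearn.metrics import silhouette_score

X = load_iris().data                              # standard benchmark dataset
for k in (2, 3, 4, 5):
    labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(X)
    # silhouette close to 1: compact, well-separated clusters
    print(k, round(silhouette_score(X, labels), 3))
```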


Journal Article•
Xu Chen1•
TL;DR: An Enhanced Particle Swarm Optimization (EPSO) algorithm was proposed to overcome the tendency of PSO to fall into local optima in the later stages of evolution.
Abstract: An Enhanced Particle Swarm Optimization (EPSO) algorithm was proposed to overcome the tendency of PSO to fall into local optima in the later stages of evolution. In this algorithm, when a particle falls into a local extremum, its ability to search for the global optimum is strengthened by enhancing the particle's self-learning ability, the other particles' ability to explore new regions of the search space, and the information exchange among particles. Experimental results indicate that the new method has a good ability to find the optimum.
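A minimal PSO sketch with one simple enhancement in the spirit of the abstract: particles that have stagnated for a while are re-scattered around the global best to restore exploration. EPSO's specific self-learning and communication mechanisms are not reproduced; the hyper-parameters and test function are illustrative.

```python
import numpy as np

def sphere(x):                                   # test objective to minimize
    return np.sum(x**2, axis=-1)

def epso(f, dim=10, n=30, iters=300, w=0.7, c1=1.5, c2=1.5, patience=20):
    rng = np.random.default_rng(0)
    x = rng.uniform(-5, 5, (n, dim))
    v = np.zeros((n, dim))
    pbest, pbest_val = x.copy(), f(x)
    stall = np.zeros(n, int)
    for _ in range(iters):
        g = pbest[np.argmin(pbest_val)]          # global best position
        r1, r2 = rng.random((n, dim)), rng.random((n, dim))
        v = w*v + c1*r1*(pbest - x) + c2*r2*(g - x)
        x = x + v
        val = f(x)
        improved = val < pbest_val
        pbest[improved], pbest_val[improved] = x[improved], val[improved]
        stall = np.where(improved, 0, stall + 1)
        # "enhancement": re-scatter stagnant particles around the global best
        stuck = stall > patience
        if stuck.any():
            x[stuck] = g + rng.normal(0, 1.0, (stuck.sum(), dim))
            v[stuck] = 0
            stall[stuck] = 0
    return pbest[np.argmin(pbest_val)], pbest_val.min()

best, best_val = epso(sphere)
print(round(best_val, 6))
```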

Journal Article•
LI Qing1•
TL;DR: A fast template matching algorithm with high accuracy and high speed that can meet real-time requirements.
Abstract: The traditional template matching method suffers from low efficiency and low speed. This paper proposed a fast template matching algorithm. At first only a small subset of points takes part in the matching, and more and more points are gradually brought in. By comparing correlation coefficients, the algorithm decides whether to increase the number of matching points or to abandon the current position and move on to a new match. When the correlation coefficient is updated, only the newly added points are computed and the result is merged with the previous correlation coefficient, which greatly reduces the amount of computation. The points used to compute the correlation coefficient are always distributed uniformly over the template, which ensures the accuracy of the method. The proposed algorithm achieves high accuracy and high speed and can meet real-time requirements.
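A minimal sketch of the coarse-to-fine idea: each candidate position is first scored with a sparse, uniformly spread subset of template pixels, and denser correlations are computed only where the coarse score is promising. The incremental merging of correlation coefficients described in the abstract is simplified away; each level here recomputes the score.

```python
import numpy as np

def ncc(a, b):
    """Normalized cross-correlation of two equally shaped pixel vectors."""
    a, b = a - a.mean(), b - b.mean()
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

def coarse_to_fine_match(image, template, strides=(8, 4, 1), threshold=0.7):
    th, tw = template.shape
    best, best_score = None, -1.0
    for y in range(image.shape[0] - th + 1):
        for x in range(image.shape[1] - tw + 1):
            window = image[y:y+th, x:x+tw]
            score = -1.0
            for s in strides:                    # fewer points first, then denser
                score = ncc(window[::s, ::s].ravel(), template[::s, ::s].ravel())
                if score < threshold:            # abandon this position early
                    break
            if score > best_score:
                best, best_score = (x, y), score
    return best, best_score

rng = np.random.default_rng(1)
img = rng.random((60, 80))
tpl = img[20:36, 30:46].copy()                   # known ground-truth location (30, 20)
print(coarse_to_fine_match(img, tpl))
```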

Journal Article•
TL;DR: Experimental results show that the packet loss rate and response time of the new gateway are lower than those of other gateways such as wired gateways and Wi-Fi/Bluetooth wireless gateways, and that it can be applied to smart home systems.
Abstract: To solve the interconnection problem between different wireless communication technologies in conventional smart home systems, a new directly connected intelligent home gateway bridging two wireless communication technologies was designed. The gateway is based on the ARM920T embedded processor S3C2440A and the ARM Linux operating system, and combines Zigbee and Wireless Fidelity (Wi-Fi) to support remote monitoring, home security, home appliance control and similar personalized functions. An embedded Web server was built on the ARM9 processor, and the Wi-Fi and Zigbee modules were added to construct the wireless network and connect it to the Internet; the hardware structure and software workflow are also described. Experimental results show that the packet loss rate and response time of the new gateway are lower than those of other gateways such as wired gateways and Wi-Fi/Bluetooth wireless gateways, and that it can be applied to smart home systems.
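The paper's gateway runs an embedded Web server on ARM Linux; purely as an illustration of the bridging idea (not the authors' implementation), the sketch below relays frames between a Zigbee coordinator on a serial port and a TCP client reachable over Wi-Fi. The pyserial dependency, port names and framing are assumptions.

```python
# a toy bridge: frames read from a Zigbee coordinator on a serial port are
# forwarded to a TCP client (e.g. a Wi-Fi phone app); port names are assumptions
import socket
import serial   # pyserial

ZIGBEE_PORT, BAUD, TCP_PORT = "/dev/ttyUSB0", 115200, 9000

def run_bridge():
    zigbee = serial.Serial(ZIGBEE_PORT, BAUD, timeout=0.1)
    server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    server.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    server.bind(("0.0.0.0", TCP_PORT))
    server.listen(1)
    client, _ = server.accept()                  # one Wi-Fi client for simplicity
    client.settimeout(0.1)
    while True:
        data = zigbee.read(256)                  # sensor frames from the Zigbee side
        if data:
            client.sendall(data)
        try:
            cmd = client.recv(256)               # control commands from the app
            if cmd:
                zigbee.write(cmd)
        except socket.timeout:
            pass

if __name__ == "__main__":
    run_bridge()
```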


Journal Article•
TL;DR: The improved Apriori algorithm refines the classic algorithm in three ways: the join-and-prune strategy used when candidate frequent (k+1)-itemsets are generated from frequent k-itemsets, the handling of transactions so as to reduce the time spent on pattern matching, and the handling of the database so that it is scanned only once.
Abstract: The classic Apriori algorithm for discovering frequent itemsets scans the database many times and repeatedly performs pattern matching between candidate itemsets and transactions, so a large number of candidate itemsets are produced, which results in low efficiency. The improved Apriori algorithm refines the classic algorithm in three ways: first, the strategy of the join step and the prune step is improved when candidate frequent (k+1)-itemsets are generated from frequent k-itemsets; second, the handling of transactions is improved to reduce the time spent on pattern matching; finally, the handling of the database is improved so that it is scanned only once during the whole run of the algorithm. Based on these improvements, an improved algorithm was introduced whose efficiency is better in both time and space. Experimental results show that the improved algorithm is more efficient than the original.
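For orientation, a minimal textbook Apriori sketch showing the join and prune steps that the paper optimizes; the single-scan database handling and transaction-reduction improvements are not reproduced.

```python
from itertools import combinations

def apriori(transactions, min_support=2):
    """Return all frequent itemsets (as frozensets) with their support counts."""
    transactions = [frozenset(t) for t in transactions]
    # frequent 1-itemsets
    counts = {}
    for t in transactions:
        for item in t:
            counts[frozenset([item])] = counts.get(frozenset([item]), 0) + 1
    frequent = {s: c for s, c in counts.items() if c >= min_support}
    result, k = dict(frequent), 1
    while frequent:
        prev = list(frequent)
        # join step: merge k-itemsets sharing k-1 items into candidate (k+1)-itemsets
        candidates = {a | b for a, b in combinations(prev, 2) if len(a | b) == k + 1}
        # prune step: drop candidates that have an infrequent k-subset
        candidates = {c for c in candidates
                      if all(frozenset(s) in frequent for s in combinations(c, k))}
        counts = {c: sum(1 for t in transactions if c <= t) for c in candidates}
        frequent = {c: n for c, n in counts.items() if n >= min_support}
        result.update(frequent)
        k += 1
    return result

data = [{"a", "b", "c"}, {"a", "b"}, {"a", "c"}, {"b", "c"}, {"a", "b", "c", "d"}]
for itemset, support in sorted(apriori(data).items(), key=lambda kv: (len(kv[0]), -kv[1])):
    print(sorted(itemset), support)
```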

Journal Article•
TL;DR: An image is first smoothed while preserving boundaries by the mean shift algorithm, initial segmented regions are then obtained with the K-means clustering algorithm in the feature space, and the final segmentation is formed by a new region merging strategy.
Abstract: This paper proposed a novel color image segmentation algorithm based on clustering and region merging. First, the image is smoothed while preserving boundaries using the mean shift algorithm. Second, initial segmented regions are obtained using the K-means clustering algorithm in the feature space. Finally, the initial regions are merged into the final segmentation result by a new region merging strategy. Simulation results show that the color image segmentation results of the proposed approach agree well with human perception.
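A minimal sketch of the first two stages (edge-preserving mean-shift smoothing, then K-means on color features) using OpenCV and scikit-learn; the region-merging strategy is omitted, and the parameters and input file are illustrative.

```python
import cv2
import numpy as np
from sklearn.cluster import KMeans

def segment(bgr_image, n_clusters=5, spatial_radius=10, color_radius=20):
    # 1) edge-preserving smoothing with mean shift
    smoothed = cv2.pyrMeanShiftFiltering(bgr_image, spatial_radius, color_radius)
    # 2) initial regions: K-means on Lab color features
    lab = cv2.cvtColor(smoothed, cv2.COLOR_BGR2LAB).reshape(-1, 3).astype(np.float32)
    labels = KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit_predict(lab)
    return labels.reshape(bgr_image.shape[:2])   # 3) region merging omitted here

img = cv2.imread("input.jpg")                    # hypothetical input file
if img is not None:
    seg = segment(img)
    print(seg.shape, np.unique(seg))
```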

Journal Article•DOI•
TL;DR: Focusing on the low-cost and low-power requirements of Wireless Sensor Networks (WSN), the paper introduces a method that uses Particle Swarm Optimization to correct the position estimated by DV-Hop in the third stage of the DV-Hop algorithm, based on the estimated distances between nodes and the positions of the anchor nodes.
Abstract: Focusing on the low-cost and low-power requirements of Wireless Sensor Networks (WSN), the paper introduced a new method based on the DV-Hop algorithm that uses Particle Swarm Optimization (PSO) to correct the position estimated by DV-Hop during the third stage of the algorithm, using the estimated distances between nodes and the positions of the anchor nodes. The algorithm requires neither additional hardware nor extra communication traffic. Simulations show that the improved algorithm can decrease the average localization error by up to 30% and effectively reduce the cost.
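A minimal sketch of the correction stage only: starting from a DV-Hop estimate, a small particle swarm minimizes the residual between hop-based distance estimates and the distances to the anchors. The swarm parameters and the residual form are assumptions, not the paper's.

```python
import numpy as np

def pso_refine(p0, anchors, dists, n=20, iters=100):
    """Refine a DV-Hop position estimate p0 by minimizing the ranging residual."""
    rng = np.random.default_rng(0)
    err = lambda p: np.sum((np.linalg.norm(p - anchors, axis=1) - dists) ** 2)
    x = p0 + rng.normal(0, 2.0, (n, 2))          # swarm seeded around the estimate
    v = np.zeros((n, 2))
    pbest = x.copy()
    pbest_val = np.array([err(p) for p in x])
    for _ in range(iters):
        g = pbest[np.argmin(pbest_val)]
        r1, r2 = rng.random((n, 2)), rng.random((n, 2))
        v = 0.7 * v + 1.5 * r1 * (pbest - x) + 1.5 * r2 * (g - x)
        x = x + v
        val = np.array([err(p) for p in x])
        better = val < pbest_val
        pbest[better], pbest_val[better] = x[better], val[better]
    return pbest[np.argmin(pbest_val)]

anchors = np.array([[0.0, 0.0], [20.0, 0.0], [0.0, 20.0]])
dists = np.array([14.1, 14.5, 14.3])             # hop-count-based distance estimates
print(np.round(pso_refine(np.array([9.0, 9.0]), anchors, dists), 2))
```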


Journal Article•
TL;DR: An algorithm based on the preying, following and swarming behaviors of artificial fish was proposed for searching for the optimal solution, and computational results show that it can quickly find the optimal solution.
Abstract: The Multiple Knapsack Problem (MKP) is an NP-hard combinatorial optimization problem with many real-world applications. An algorithm based on the preying, following and swarming behaviors of artificial fish was proposed for searching for the optimal solution. Because the multiple constraints cause many infeasible solutions to be produced when individuals are initialized and when the artificial fish behaviors are executed, which degrades the algorithm's performance, an adjusting operator based on a heuristic rule was designed to keep all individuals within the feasible region. Computational results show that the algorithm can quickly find the optimal solution, and the proposed approach can also be applied to other constrained combinatorial optimization problems.
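A minimal sketch of the kind of heuristic adjusting operator the abstract describes: an infeasible 0-1 selection is repaired by greedily dropping the items with the worst value-to-weight ratio until all constraints hold, then greedily re-adding items that still fit. The full artificial fish swarm algorithm is not reproduced, and this particular rule is only one plausible choice.

```python
import numpy as np

def repair(x, values, weights, capacities):
    """Make a 0-1 selection feasible for a multi-constraint knapsack.

    x: binary item-selection vector; weights: (m, n) resource use per item;
    capacities: (m,) limits. Items with the worst value-to-weight ratio are
    dropped until every constraint holds, then items are greedily re-added.
    """
    x = x.astype(bool).copy()
    density = values / (weights.sum(axis=0) + 1e-12)   # heuristic item quality
    # drop phase: remove worst selected items while any constraint is violated
    for i in np.argsort(density):
        if np.all(weights @ x <= capacities):
            break
        x[i] = False
    # add phase: greedily re-add best items that still fit
    for i in np.argsort(-density):
        if not x[i] and np.all(weights @ x + weights[:, i] <= capacities):
            x[i] = True
    return x

rng = np.random.default_rng(0)
values = rng.integers(10, 100, 8).astype(float)
weights = rng.integers(1, 30, (3, 8)).astype(float)
capacities = np.array([60.0, 60.0, 60.0])
x0 = np.ones(8, dtype=int)                              # clearly infeasible start
x = repair(x0, values, weights, capacities)
print(x.astype(int), values @ x, weights @ x)
```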

Journal Article•
TL;DR: To improve the speed of image registration, a fast method based on an improved RANdom SAmple Consensus (RANSAC) algorithm was proposed, in which a pre-detection step discards candidate models that fail a preliminary check.
Abstract: To improve the speed of image registration, a fast method based on an improved RANdom SAmple Consensus (RANSAC) algorithm was proposed. First, the Harris corner detector is used to extract feature points in the reference and target images. Next, the feature points are matched based on their proximity and the similarity of their intensities. Finally, the improved RANSAC algorithm is used to estimate the transformation matrix more quickly and accurately. To speed up the computation, a pre-detection step discards candidate models that fail a preliminary check, and to remove outliers a random block-selection method is used to choose samples, which improves precision. Experiments show that the algorithm greatly reduces the amount of computation and improves the speed of image registration while the precision remains essentially unchanged.
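A minimal sketch of RANSAC with a pre-detection step: each candidate model is first scored on a small random subset of matches and discarded early if too few of them agree, before the full inlier count is computed. The affine model, thresholds and synthetic data are illustrative; the paper's Harris-corner matching front end is not included.

```python
import numpy as np

def fit_affine(src, dst):
    """Least-squares 2-D affine transform mapping src points to dst points."""
    A = np.hstack([src, np.ones((len(src), 1))])
    M, _, _, _ = np.linalg.lstsq(A, dst, rcond=None)   # (3, 2) parameter matrix
    return M

def ransac_affine(src, dst, iters=500, tol=2.0, precheck=20, seed=0):
    """RANSAC with a pre-detection step: a candidate model is scored on a small
    random subset first, and only promising candidates are scored on all points."""
    rng = np.random.default_rng(seed)
    best_M, best_inliers = None, 0
    for _ in range(iters):
        idx = rng.choice(len(src), 3, replace=False)   # minimal sample
        M = fit_affine(src[idx], dst[idx])
        probe = rng.choice(len(src), min(precheck, len(src)), replace=False)
        err = np.linalg.norm(np.hstack([src[probe], np.ones((len(probe), 1))]) @ M
                             - dst[probe], axis=1)
        if np.mean(err < tol) < 0.3:                   # pre-check failed: discard early
            continue
        err_all = np.linalg.norm(np.hstack([src, np.ones((len(src), 1))]) @ M
                                 - dst, axis=1)
        inliers = int(np.sum(err_all < tol))
        if inliers > best_inliers:
            best_M, best_inliers = M, inliers
    return best_M, best_inliers

rng = np.random.default_rng(1)
src = rng.uniform(0, 100, (200, 2))
true_M = np.array([[0.9, -0.1], [0.1, 0.9], [5.0, -3.0]])   # rotation-ish + shift
dst = np.hstack([src, np.ones((200, 1))]) @ true_M
dst[:40] += rng.uniform(-30, 30, (40, 2))                   # 20% outliers
M, n = ransac_affine(src, dst)
print(n, "inliers"); print(np.round(M, 2))
```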


Journal Article•DOI•
TL;DR: In this article, an Integer Programming (IP) model based on a time-slot representation of each firm's available scheduling periods was designed to describe the scheduling problem, and a One-by-One Selection Heuristic (OOSH) algorithm was proposed to solve the IP model.
Abstract: To solve the insertion-order scheduling problem in agile supply chain production planning, a two-stage supply chain composed of one factory and many suppliers was studied. Taking minimization of the total supply chain cost as the objective, an Integer Programming (IP) model was designed to describe the scheduling problem based on a time-slot representation of each firm's available scheduling periods, and a One-by-One Selection Heuristic (OOSH) algorithm was proposed to solve the IP model. Finally, comparison with the results of Distance Priority (DP) and Cycle Time Priority (CTP) algorithms in scheduling experiments verified the feasibility and effectiveness of the model and algorithm. The experimental results also show that the agile supply chain form is competitive.
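As a rough illustration of a time-slot based integer program (not the paper's model, whose cost terms and constraints are not given here), the sketch below uses the PuLP library, assumed as a dependency, to assign hypothetical orders to supplier time slots at minimum total cost.

```python
# a toy time-slot assignment IP; costs and availability are hypothetical
from pulp import LpProblem, LpMinimize, LpVariable, lpSum, LpBinary, value

orders, suppliers, slots = range(3), range(2), range(4)
cost = {(o, s, t): 10 + 3*o + 2*s + t for o in orders for s in suppliers for t in slots}
available = {(s, t): True for s in suppliers for t in slots}    # supplier free slots

prob = LpProblem("insertion_order_scheduling", LpMinimize)
x = {(o, s, t): LpVariable(f"x_{o}_{s}_{t}", cat=LpBinary)
     for o in orders for s in suppliers for t in slots if available[(s, t)]}

prob += lpSum(cost[k] * x[k] for k in x)                        # total cost
for o in orders:                                                # each order scheduled once
    prob += lpSum(x[k] for k in x if k[0] == o) == 1
for s in suppliers:                                             # one order per slot
    for t in slots:
        prob += lpSum(x[k] for k in x if k[1] == s and k[2] == t) <= 1

prob.solve()
print(value(prob.objective), [k for k in x if x[k].value() == 1])
```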

Journal Article•
TL;DR: The proposed feature vector enables the Gaussian Mixture Model (GMM) classifier to outperform MFCC and differential MFCC features in classification, achieving an average recognition rate of more than 90% with low computational complexity.
Abstract: To address the high complexity and low recognition rate of abnormal audio recognition, an abnormal audio recognition system based on Mel-Frequency Cepstrum Coefficients (MFCC) and short-term energy was proposed. The combined feature vector enables the Gaussian Mixture Model (GMM) classifier to outperform MFCC and differential MFCC features in classification, achieving an average recognition rate of more than 90% with low computational complexity. The implementation steps of the system are elaborated, and simulation results prove the effectiveness of the proposed algorithm.
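A minimal sketch of the feature-plus-classifier pipeline: MFCCs and short-time energy are stacked per frame, one Gaussian mixture model is fitted per audio class, and classification picks the class whose GMM gives the highest average log-likelihood. The librosa dependency, file lists and model sizes are assumptions.

```python
import numpy as np
import librosa
from sklearn.mixture import GaussianMixture

def features(path, sr=16000, n_mfcc=13):
    """Per-frame feature matrix: MFCCs plus log short-time energy."""
    y, sr = librosa.load(path, sr=sr)
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=n_mfcc)       # (n_mfcc, frames)
    energy = librosa.feature.rms(y=y)                            # (1, frames)
    return np.vstack([mfcc, np.log(energy + 1e-10)]).T           # (frames, n_mfcc+1)

# hypothetical training data: {"scream": [paths], "glass_break": [paths], ...}
def train(class_files, n_components=8):
    models = {}
    for label, paths in class_files.items():
        X = np.vstack([features(p) for p in paths])
        models[label] = GaussianMixture(n_components, covariance_type="diag",
                                        random_state=0).fit(X)
    return models

def classify(models, path):
    X = features(path)
    # pick the class whose GMM assigns the highest average log-likelihood
    return max(models, key=lambda label: models[label].score(X))
```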

Journal Article•
TL;DR: The algorithm uses an improved weighted Bayesian method to predict the ratings of unrated items; experiments show that the approach provides better recommendation results for the system.
Abstract: Collaborative filtering is used extensively in personalized recommendation systems. With the development of e-commerce, the numbers of users and commodities grow rapidly, which makes the user rating data extremely sparse. To address this problem, a collaborative filtering recommendation algorithm based on the naive Bayesian method was proposed. The algorithm uses an improved weighted Bayesian method to predict the ratings of unrated items. By predicting the unrated data, the sparseness of the rating data is alleviated and the accuracy of finding nearest-neighbor items is improved at the same time. Experiments show that the approach provides better recommendation results for the system.
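A minimal sketch of predicting a missing rating with naive Bayes over a user's other ratings: the class is the rating value for the target item, and each of the user's known ratings contributes a conditional likelihood estimated from co-rating users. The paper's weighting scheme is not reproduced; this sketch weights all items equally.

```python
import numpy as np

def predict_rating(R, u, j, ratings=(1, 2, 3, 4, 5), alpha=1.0):
    """Naive Bayes prediction of user u's rating for item j.

    R: (users, items) matrix with 0 meaning 'not rated'. The class prior and
    conditional probabilities are estimated from users who rated item j,
    with Laplace smoothing alpha; weighting is uniform in this sketch.
    """
    rated = np.nonzero(R[u])[0]                    # items u has rated
    raters_j = R[:, j] > 0
    log_post = []
    for c in ratings:
        in_class = raters_j & (R[:, j] == c)
        prior = (in_class.sum() + alpha) / (raters_j.sum() + alpha * len(ratings))
        logp = np.log(prior)
        for i in rated:
            num = np.sum(in_class & (R[:, i] == R[u, i])) + alpha
            den = in_class.sum() + alpha * len(ratings)
            logp += np.log(num / den)              # P(r_i = r_ui | r_j = c)
        log_post.append(logp)
    return ratings[int(np.argmax(log_post))]

R = np.array([[5, 3, 0, 4],
              [4, 0, 4, 5],
              [5, 4, 5, 0],
              [2, 1, 2, 0],
              [5, 4, 0, 5]])
print(predict_rating(R, u=0, j=2))                 # fill user 0's missing rating
```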


Journal Article•
TL;DR: The experimental results show that the presented real-time traffic light recognition method is effective, and the results are validated by statistical methods.
Abstract: Traffic light recognition is important for intelligent vehicle research. A real-time traffic light recognition method was presented. First, the image is pre-processed by morphological operations. Then candidate areas are found according to samples of the three kinds of traffic lights taken from the Lab color space, which describes red, green and yellow colors more effectively. The candidate areas are then filtered by the shape characteristics of traffic lights. After that, three traffic light templates, each surrounded by a black rectangular box, are designed and used to confirm the candidate areas. Finally, the results are validated by statistical methods. Experimental results show that the method is effective.
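A minimal sketch of the color-thresholding and shape-filtering stages in the Lab color space with OpenCV; the threshold values are illustrative, and the template confirmation and statistical validation stages are omitted.

```python
import cv2
import numpy as np

def candidate_lights(bgr):
    """Return bounding boxes of red/green/yellow blobs that look like lamps."""
    lab = cv2.cvtColor(bgr, cv2.COLOR_BGR2LAB)
    _, a, b = cv2.split(lab)
    # rough Lab thresholds (illustrative): red has high a*, green low a*,
    # yellow has high b* with moderate a* (OpenCV stores a*, b* offset by 128)
    masks = {
        "red": (a > 165).astype(np.uint8) * 255,
        "green": (a < 100).astype(np.uint8) * 255,
        "yellow": ((b > 165) & (a > 120) & (a < 165)).astype(np.uint8) * 255,
    }
    kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5))
    boxes = []
    for color, mask in masks.items():
        mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)   # pre-processing
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                       cv2.CHAIN_APPROX_SIMPLE)
        for c in contours:
            x, y, w, h = cv2.boundingRect(c)
            area = cv2.contourArea(c)
            # shape filter: small, roughly square/circular blobs
            if 20 < area < 2000 and 0.6 < w / float(h) < 1.6:
                boxes.append((color, x, y, w, h))
    return boxes

frame = cv2.imread("frame.jpg")                    # hypothetical camera frame
if frame is not None:
    print(candidate_lights(frame))
```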

Journal Article•
Zhao Peng1•
TL;DR: A majority transform rule modified by cell persistency was proposed through a new equation, together with an algorithm that considers the informational influence of cells moving across the lattice space; a series of simulations proves that the proposed method is valid and effective.
Abstract: A Cellular Automata (CA) structure for the dissemination of network public sentiment was designed, comprising the choice of cellular state data, lattice space, and neighborhood. A majority transform rule modified by cell persistency was then proposed through a new equation, together with an algorithm that considers the informational influence of cells moving across the lattice space. A series of simulations was run to evaluate the model. Four parameters were defined for the discussion: inclination intensity, inclination focusing degree, peak cell number, and the ratio of cell numbers across inclination classes. The iterative computation showed that the results exhibit different transition processes, patterns and meanings depending on whether a cell has variable persistency and whether it is static or moving. Finally, it is suggested that cells moving across the lattice with persistency deserve special attention, as this case best matches the reality of the Internet. The simulation results prove that the proposed method is valid and effective.
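A minimal sketch of a majority-transform cellular automaton with per-cell persistency: a cell keeps its state with a probability given by its persistency and otherwise adopts the clear majority opinion of its Moore neighborhood. The states, lattice size and persistency values are illustrative, and cell movement across the lattice is not modeled.

```python
import numpy as np

def step(state, persistency, rng):
    """One update of a 3-state (-1, 0, +1) majority CA with per-cell persistency.

    A cell keeps its state with probability `persistency`; otherwise it adopts
    the majority opinion of its 8-cell Moore neighborhood (ties keep the state).
    """
    n = state.shape[0]
    new = state.copy()
    for i in range(n):
        for j in range(n):
            if rng.random() < persistency[i, j]:
                continue                          # persistent cell: no change
            neigh = [state[(i+di) % n, (j+dj) % n]
                     for di in (-1, 0, 1) for dj in (-1, 0, 1)
                     if (di, dj) != (0, 0)]
            counts = {s: neigh.count(s) for s in (-1, 0, 1)}
            best = max(counts.values())
            winners = [s for s, c in counts.items() if c == best]
            if len(winners) == 1:                 # clear majority only
                new[i, j] = winners[0]
    return new

rng = np.random.default_rng(0)
n = 50
state = rng.choice([-1, 0, 1], size=(n, n))       # opinion inclination per cell
persistency = rng.uniform(0.1, 0.9, (n, n))
for _ in range(30):
    state = step(state, persistency, rng)
print({s: int(np.sum(state == s)) for s in (-1, 0, 1)})   # inclination distribution
```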