
Showing papers in "Computer Engineering in 2011"


Journal Article
TL;DR: Experimental results show that the scheme can detect moving vehicles completely, and has good reliability and robustness, meeting the requirements of intelligent transportation systems.
Abstract: Considering the temporal and spatial correlation of moving-vehicle images, a method combining the three-frame-difference method and the two-dimensional cross-entropy threshold method is used to detect moving vehicles. The three-frame-difference method is applied to the video image to obtain a gray-scale difference image, and the two-dimensional cross-entropy threshold method is then used to binarize the difference image. Experimental results show that the scheme can detect moving vehicles completely, and has good reliability and robustness, meeting the requirements of intelligent transportation systems.
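As a rough illustration of the frame-differencing core (not the authors' code), the sketch below combines three consecutive grayscale frames; a fixed threshold stands in for the paper's two-dimensional cross-entropy threshold selection, and all names are illustrative.

```python
import numpy as np

def three_frame_difference(f1, f2, f3, threshold=25):
    """Minimal sketch: detect moving pixels from three consecutive
    grayscale frames (uint8 arrays of equal shape). The fixed
    `threshold` stands in for the 2D cross-entropy threshold step."""
    d12 = np.abs(f2.astype(np.int16) - f1.astype(np.int16))
    d23 = np.abs(f3.astype(np.int16) - f2.astype(np.int16))
    # Keep only pixels that changed in both intervals, which suppresses
    # the trailing "ghost" left by simple two-frame differencing.
    mask = (d12 > threshold) & (d23 > threshold)
    return mask.astype(np.uint8)
```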

32 citations


Journal Article
TL;DR: Experimental results show that the expert evaluation method based on the Delphi method can collect, express and integrate expert knowledge effectively.
Abstract: When problems are evaluated by experts, the expression and integration of expert knowledge are both important and difficult. To address this, an expert evaluation method based on the Delphi method is applied to collect expert knowledge and perform the evaluation. Based on an analysis of the basic theory of the Delphi method, related definitions of expert knowledge and an expression for the result of expert knowledge integration are proposed, a reliability function of expert knowledge is given, and a method of expert knowledge integration is put forward. Experimental results show that the method can collect, express and integrate expert knowledge effectively.

19 citations


Journal Article
TL;DR: This paper builds an SIR model and discusses rumor propagation taking into account the topology of the microblogging communication network and the rules of rumor spreading, revealing that propagation is affected by the infection rate and the degree distribution entropy.
Abstract: The propagation of rumors is similar to virus spreading in social networks. This paper builds an SIR model and discusses rumor propagation taking into account the topology of the microblogging communication network and the rules of rumor spreading. Mathematical derivation and computer simulation reveal that propagation is affected by the infection rate and the degree distribution entropy: the higher the infection rate, the larger the scale of infection, and the smaller the degree distribution entropy, the more easily rumors spread.
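For readers unfamiliar with the model, a minimal sketch of the SIR dynamics underlying this analysis is shown below. It is a mean-field simulation, not the authors' network-topology-aware model; beta and gamma are illustrative rate parameters.

```python
import numpy as np

def simulate_sir(beta=0.4, gamma=0.1, s0=0.99, i0=0.01, steps=500, dt=0.1):
    """Euler integration of mean-field SIR dynamics: S (ignorant),
    I (spreader), R (stifler). A higher beta (infection rate) yields a
    larger final infected scale, matching the trend in the abstract."""
    s, i, r = s0, i0, 0.0
    history = []
    for _ in range(steps):
        ds = -beta * s * i
        di = beta * s * i - gamma * i
        dr = gamma * i
        s, i, r = s + ds * dt, i + di * dt, r + dr * dt
        history.append((s, i, r))
    return np.array(history)   # shape (steps, 3)
```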

13 citations


Journal Article
He Chong1
TL;DR: An improved real-coded genetic algorithm is proposed that adopts a smoothing function as the fitness function and can effectively avoid premature convergence while improving the accuracy and convergence rate of the genetic algorithm.
Abstract: The simple genetic algorithm suffers from defects such as premature convergence, a large amount of calculation and slow convergence when solving multi-peak, multi-dimensional global function optimization problems, so an improved real-coded genetic algorithm is proposed. The algorithm adopts several improved genetic mechanisms, such as an improved crossover operator and adaptive scaling of the mutation range, and uses a smoothing function as the fitness function. Numerical experiments show that the algorithm can effectively avoid premature convergence and improve the accuracy and convergence rate of the genetic algorithm.
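As one hedged reading of the adaptive mutation-scaling mechanism (the crossover improvement and the smoothing fitness function are not shown, and the decay schedule is an assumption), a sketch:

```python
import numpy as np

def adaptive_mutation(population, gen, max_gen, bounds, p_m=0.1, rng=None):
    """Illustrative adaptive mutation for a real-coded GA: the mutation
    range shrinks as the generation counter grows, so search is global
    early on and local near the end. `population` has shape (n, dim)."""
    rng = rng or np.random.default_rng()
    lo, hi = bounds
    scale = (hi - lo) * (1.0 - gen / max_gen) ** 2   # shrinking step size
    mask = rng.random(population.shape) < p_m
    noise = rng.uniform(-scale, scale, population.shape)
    return np.clip(population + mask * noise, lo, hi)
```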

13 citations


Journal Article
Yan Li1
TL;DR: The method proposed in this paper can not only suppress noise to the greatest degree, but also detect more low-intensity edges.
Abstract: Aiming at the inability of the traditional Canny operator to suppress noise and detect low-intensity edges, this paper proposes an edge detection method combining the LoG operator and the Canny operator. The LoG operator is applied to the image for noise filtering, and the Canny operator is improved in the following three aspects to perform edge detection: (1) a Gaussian smoothing kernel is designed to strengthen the edges of the noise-filtered image, which makes low-intensity edges easier to detect; (2) gradient magnitude and direction are calculated over pixels within an M-by-N neighborhood; (3) the gradient direction is integrated into the calculation of gradient magnitude, which provides the basis for using gradient magnitude in edge detection. Experiments on images corrupted with salt-and-pepper noise show that the proposed method can not only suppress noise to the greatest degree but also detect more low-intensity edges.
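A minimal sketch of the LoG-plus-Canny front end described above (gradients computed with Sobel kernels; the M-by-N neighborhood refinement, non-maximum suppression and hysteresis steps are omitted, and parameter values are illustrative):

```python
import numpy as np
from scipy import ndimage

def log_canny_front_end(image, log_sigma=1.0, smooth_sigma=1.4):
    """Apply a Laplacian-of-Gaussian pass for noise filtering, then the
    Gaussian smoothing and gradient magnitude/direction computation that
    open the Canny pipeline."""
    img = image.astype(float)
    filtered = ndimage.gaussian_laplace(img, sigma=log_sigma)
    smoothed = ndimage.gaussian_filter(filtered, sigma=smooth_sigma)
    gx = ndimage.sobel(smoothed, axis=1)   # horizontal gradient
    gy = ndimage.sobel(smoothed, axis=0)   # vertical gradient
    magnitude = np.hypot(gx, gy)
    direction = np.arctan2(gy, gx)
    return magnitude, direction
```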

13 citations


Journal Article
TL;DR: Two parameters, attribute importance and the number of attribute values, are introduced to improve the existing information-gain formula of the ID3 algorithm, enhancing the importance of critical attributes with fewer values and making the algorithm better reflect the actual decision-making situation.
Abstract: The ID3 algorithm tends to choose attributes with more values as splitting attributes. Aiming at this problem, this paper introduces two parameters, attribute importance and the number of attribute values, to improve the existing information-gain formula of the ID3 algorithm. This enhances the importance of critical attributes with fewer values and makes the algorithm better reflect the actual decision-making situation. According to the properties of convex functions, the calculation formula of information entropy is simplified to improve the efficiency of constructing a decision tree. A concrete example describes the application of the improved algorithm, and the result shows that it is more efficient than the original algorithm.
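A hedged sketch of the modified gain computation: classic information gain boosted by a user-supplied importance weight and penalized by the number of distinct attribute values. The paper's exact formula may differ.

```python
import numpy as np

def entropy(labels):
    """Shannon entropy of a discrete label array."""
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

def weighted_information_gain(feature, labels, importance=1.0):
    """ID3 information gain of `feature` (1-D array of attribute values)
    with respect to `labels`, scaled by an attribute-importance weight
    and divided by the number of attribute values so that critical
    attributes with few values are not crowded out."""
    feature, labels = np.asarray(feature), np.asarray(labels)
    values, counts = np.unique(feature, return_counts=True)
    cond_entropy = sum((c / len(feature)) * entropy(labels[feature == v])
                       for v, c in zip(values, counts))
    gain = entropy(labels) - cond_entropy
    return importance * gain / len(values)
```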

11 citations


Journal Article
TL;DR: The studies show that the algorithm not only obtains optimal decision rules, but also greatly reduces the search space required by the information system and achieves a better attribute reduction effect.
Abstract: This paper expounds the basic concepts of rough set theory and information entropy. To find an effective approach to attribute reduction, an attribute reduction algorithm based on rough sets and information entropy is put forward. In a decision table, the amount of mutual information contributed by an attribute reflects its significance, from which the relative reduct is obtained. The studies show that the algorithm not only obtains optimal decision rules, but also greatly reduces the search space required by the information system and achieves a better attribute reduction effect.
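One way to read the mutual-information-based significance measure is sketched below; a greedy reduction would repeatedly add the most significant attribute until the conditional entropy stops decreasing. The exact formulation in the paper may differ.

```python
import numpy as np

def entropy(labels):
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

def conditional_entropy(decision, attrs):
    """H(decision | attrs) for discrete data; attrs is (n_samples, n_attrs)."""
    decision, attrs = np.asarray(decision), np.asarray(attrs)
    _, groups = np.unique(attrs, axis=0, return_inverse=True)
    return sum((groups == g).mean() * entropy(decision[groups == g])
               for g in np.unique(groups))

def attribute_significance(decision, reduct_attrs, candidate):
    """Increase in mutual information (drop in conditional entropy) when
    `candidate` is added to the current partial reduct."""
    extended = np.column_stack([reduct_attrs, candidate])
    return (conditional_entropy(decision, reduct_attrs)
            - conditional_entropy(decision, extended))
```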

11 citations


Journal Article
TL;DR: A KNN classification algorithm based on the k-nearest-neighbor graph for small sample sets is presented to improve classification accuracy, and the experimental results show the algorithm can enhance classification accuracy, reduce the influence of the value of k, and achieve a satisfactory result.
Abstract: A KNN classification algorithm based on the k-nearest-neighbor graph for small sample sets is presented to improve classification accuracy. It partitions the k-nearest-neighbor graph into clusters with high similarity, labels the unlabeled data of each cluster with the label of the labeled data in the same cluster, and deletes the noise data, thereby expanding the sample set. The algorithm then uses the expanded sample set to label the remaining unlabeled data. The presented algorithm is demonstrated on standard datasets, and the experimental results show that it can enhance classification accuracy, reduce the influence of the value of k, and achieve a satisfactory result.

11 citations


Journal Article
TL;DR: This paper optimizes the mechanism of memory pre-copy migration, uses a Markov prediction model to improve the algorithm that estimates the working set of dirty memory pages, and designs a new algorithm that calculates this working set by forecasting the probability that pages become dirty.
Abstract: Aiming at the long migration time and repeated retransmission of memory pages in the pre-copy process, this paper optimizes the memory pre-copy migration mechanism and uses a Markov prediction model to improve the algorithm that estimates the working set of dirty memory pages, designing a new algorithm that calculates this working set by forecasting the probability that pages become dirty. The algorithm calculates the probability that a page will be modified in the next iteration round from the history of dirty-page accesses, and only memory pages with a lower probability are transmitted. Experimental results show that the new algorithm shortens the total migration time and downtime, and effectively supports dynamic migration of Virtual Machines (VMs).
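A heavily hedged sketch of the prediction step: the paper's Markov model, history length and cut-off are not reproduced; this shows only a per-page first-order transition estimate and an illustrative selection rule.

```python
def markov_dirty_probability(history, smoothing=1.0):
    """Estimate P(page dirty in next round | its current state) from a
    per-round dirty/clean history (list of 0/1 flags), as a first-order
    Markov transition frequency with Laplace smoothing."""
    if len(history) < 2:
        return 0.5                      # no transition evidence yet
    current = history[-1]
    transitions = [(history[i], history[i + 1])
                   for i in range(len(history) - 1) if history[i] == current]
    to_dirty = sum(1 for _, nxt in transitions if nxt == 1)
    return (to_dirty + smoothing) / (len(transitions) + 2 * smoothing)

def pages_to_send_now(page_histories, cutoff=0.5):
    """Send only pages unlikely to be re-dirtied; defer volatile pages so
    they are not retransmitted round after round."""
    return [i for i, h in enumerate(page_histories)
            if markov_dirty_probability(h) < cutoff]
```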

11 citations


Journal Article
TL;DR: Aiming at heterogeneous characteristics of Wireless Sensor Network,an energy-efficient distributed clustering algorithm called EEDC is proposed, which chooses sensor nodes with high residual energy as tentative cluster heads to participate in the final cluster head competition.
Abstract: Aiming at heterogeneous characteristics of Wireless Sensor Networks (WSN), an energy-efficient distributed clustering algorithm called EEDC is proposed. It chooses sensor nodes with high residual energy as tentative cluster heads to participate in the final cluster head competition. EEDC elects tentative cluster heads to be final cluster heads by a novel probability based on the intra-cluster communication cost. The cluster heads generated by EEDC are nodes with high residual energy and low intra-cluster communication cost. Theoretical analysis and simulation results show that the protocol can obtain good cluster head distribution and prolong the network lifetime significantly.
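As a rough illustration only (the probability form, field names and threshold below are assumptions, not the EEDC formula), the tentative-head election could look like this:

```python
import random

def elect_tentative_heads(nodes, energy_threshold, base_prob=0.1):
    """Nodes with residual energy above the threshold become tentative
    cluster heads; each then competes with a probability that grows with
    residual energy and shrinks with intra-cluster communication cost.
    Each node is a dict with 'id', 'energy' and 'intra_cost' keys."""
    heads = []
    for node in nodes:
        if node['energy'] < energy_threshold:
            continue                     # low-energy nodes never compete
        p = base_prob * node['energy'] / (1.0 + node['intra_cost'])
        if random.random() < min(p, 1.0):
            heads.append(node['id'])
    return heads
```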

11 citations


Journal Article
TL;DR: Matlab simulation results show that the resource allocation strategy based on the CDA mechanism can ensure the credibility of the resources, improve operation efficiency and maximize the Quality of Service (QoS) when dynamically providing resources.
Abstract: This paper introduces a resource allocation model based on the Continuous Double Auction (CDA) mechanism and the pricing strategy of suppliers and demanders in a cloud computing environment. After analyzing the failure law of resource nodes, it proposes a dynamic resource allocation strategy based on the CDA mechanism and the credibility of the nodes. Matlab simulation results show that the strategy can ensure the credibility of the resources, improve operation efficiency and maximize the Quality of Service (QoS) when dynamically providing resources.

Journal Article
TL;DR: This paper proposes a method of emotional classification of house-designing images based on color features, performed by extracting global and local color features combined with a classifier based on a Radial Basis Function Neural Network.
Abstract: This paper proposes a method for emotional classification of house-designing images based on color features. A relationship model is built between color features and emotional semantics based on the perceptual understanding of color. Using house-designing images as data, classification is performed by extracting global and local color features combined with a classifier based on a Radial Basis Function Neural Network (RBFNN). Experimental results show the effectiveness of the classification.

Journal Article
TL;DR: Feature vectors that represent the users of a network forum, each associated with seven feature values, are presented and used to identify the subclass that best matches opinion leaders from a data set extracted from real posts of a forum.
Abstract: This paper presents feature vectors that represent the users of a network forum, each associated with seven feature values. A user clustering algorithm is designed on the basis of the EM algorithm and is used to identify the subclass that best matches opinion leaders from a data set extracted from real posts of a forum. An example comparing this algorithm with other algorithms verifies its correctness.
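A minimal sketch of the clustering step, using scikit-learn's EM-fitted Gaussian mixture as a stand-in for the paper's EM-based algorithm; the seven features, cluster count and the rule for picking the opinion-leader subclass are not reproduced here.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

def cluster_forum_users(features, n_clusters=4, seed=0):
    """Cluster forum users described by seven feature values with an
    EM-fitted Gaussian mixture and return the per-user cluster labels."""
    features = np.asarray(features, dtype=float)   # shape (n_users, 7)
    gmm = GaussianMixture(n_components=n_clusters, random_state=seed)
    labels = gmm.fit_predict(features)
    return labels, gmm
```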

Journal Article
TL;DR: Experimental results show that the virtual machine migration scheduling strategy based on load characteristics can realize self-managed virtual machine migration and significantly enhance resource utilization efficiency.
Abstract: In order to enhance resource utilization and the service availability of virtual machine migration, a virtual machine migration scheduling strategy based on load characteristics is proposed. Considering the trigger types of nodes and the load characteristics of virtual machines, a multi-threshold mode is used to trigger migration, the virtual machine to be migrated is selected, and the destination node is determined. Experimental results show that the strategy can realize self-managed virtual machine migration, significantly enhance resource utilization efficiency, and has good adaptability.
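The multi-threshold trigger can be pictured with a toy sketch; threshold values and action names are purely illustrative, not the paper's.

```python
def migration_trigger(load, warn=0.70, high=0.85, critical=0.95):
    """Map a node's load level to a migration action: instead of a single
    on/off threshold, several thresholds trigger progressively stronger
    responses."""
    if load >= critical:
        return 'migrate_now'             # node overloaded: migrate immediately
    if load >= high:
        return 'select_vm_and_target'    # pick a VM and a destination node
    if load >= warn:
        return 'monitor'                 # watch the node more closely
    return 'no_action'
```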

Journal Article
TL;DR: The results show that, compared with the SASI protocol and the Gossamer protocol, the proposed protocol is resistant to the DoS attack and algebraic attacks and requires less storage capacity on tags, making it cheaper and more secure.
Abstract: This paper presents a Denial of Service (DoS) attack against the Gossamer protocol and proposes a novel ultra-lightweight Radio Frequency Identification (RFID) reader-tag mutual authentication protocol. Analysis of the security and efficiency of the protocol shows that, compared with the SASI protocol and the Gossamer protocol, the proposed protocol is resistant to the DoS attack and algebraic attacks and requires less storage capacity on tags, making it cheaper and more secure.

Journal Article
TL;DR: Two improvements are proposed: combining local search with global search based on a comparative population, and expanding the search space from one sphere to three spheres according to the characteristics of the three chains.
Abstract: In order to solve the problems of low optimization efficiency and poor local search in the quantum genetic algorithm whose coding is based on Bloch coordinates, this paper proposes two improvements: it combines local search with global search based on a comparative population, and expands the search space from one sphere to three spheres according to the characteristics of the three chains. Applied to function extreme-value optimization with multiple variables, simulation results show that the improved algorithm requires fewer generations, has higher efficiency and maintains diverse populations, which proves that the improvement is effective.

Journal Article
Li Jun1
TL;DR: The proposed data recover algorithm is tested in the case of the bridge structural health monitoring system, and the simulation results indicate the algorithm is correct and efficient in theory and practice with the performance index for mean square error and largest error absolute value.
Abstract: For the problem of data loss in the structural health monitoring system of a bridge, the Granger causality test is introduced to calculate the causal relation between two sensors, and the sensor signal with the stronger relation is selected as the input vector of an extreme learning machine to recover the lost sensor data. The proposed data recovery algorithm is tested on the bridge structural health monitoring system, and the simulation results, using mean square error and largest absolute error as performance indexes, indicate that the algorithm is correct and efficient in theory and practice compared with the Back Propagation (BP) network and Least Squares Support Vector Machine (LS_SVM).
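The extreme learning machine used for the mapping is simple enough to sketch; this is a generic ELM, not the authors' configuration, and the Granger-causality sensor selection step is not shown.

```python
import numpy as np

class SimpleELM:
    """Random hidden layer + least-squares output weights: train on the
    correlated sensor's signal (X) to reproduce the lost sensor's signal (y)."""
    def __init__(self, n_hidden=50, seed=0):
        self.n_hidden = n_hidden
        self.rng = np.random.default_rng(seed)

    def fit(self, X, y):
        self.W = self.rng.normal(size=(X.shape[1], self.n_hidden))
        self.b = self.rng.normal(size=self.n_hidden)
        H = np.tanh(X @ self.W + self.b)               # hidden-layer outputs
        self.beta, *_ = np.linalg.lstsq(H, y, rcond=None)
        return self

    def predict(self, X):
        return np.tanh(X @ self.W + self.b) @ self.beta
```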

Journal Article
Sun Yu1
TL;DR: The concept and classification of ontology are introduced, and ontology construction principles, construction methods, construction tools and ontology description languages are discussed and compared respectively.
Abstract: Ontology originally refers to the study of existence and the essence of existence. It is a conceptual model that describes concepts and the relationships between these concepts in a certain field. As a means of knowledge representation and sharing, ontology is widely studied and extensively applied in computer science and technology. This paper introduces the concept and classification of ontology, discusses ontology construction principles, construction methods, construction tools and ontology description languages, and compares the construction methods and description languages respectively.

Journal Article
TL;DR: A niching multi-objective Particle Swarm Optimization (PSO) algorithm that applies a ring neighborhood topology, which does not require any niching parameters and can resolve the problem of traditional parameter setting.
Abstract: This paper describes a niching multi-objective Particle Swarm Optimization (PSO) algorithm. The algorithm applies a ring neighborhood topology, which does not require any niching parameters and thus avoids the problem of setting such parameters by hand. Non-dominated sorting and a dynamic weight method are used to select the best particles. To enhance global exploration, a mutation operation is performed when the crowding distance decreases to the required precision. The proposed algorithm is tested on five well-known benchmark functions, ZDT1~ZDT4 and ZDT6. Simulation results show that this algorithm performs better than classical algorithms in convergence and diversity.

Journal Article
TL;DR: Test results of typical application parameters show that the practical application of the scheme in digital community is feasible.
Abstract: This paper proposes a scheme for constructing a digital community based on the Internet of Things (IoT) to meet practical application needs. The scheme includes improvements to the IoT technical framework for this application, a discussion of the networking mechanisms, implementation of technical details and the design of network nodes, and presents a multi-frequency, multi-structure network architecture. Test results of typical application parameters show that practical application of the scheme in a digital community is feasible.

Journal Article
TL;DR: Experimental results show that the algorithm can effectively prevent the basic AFSA from falling into local extrema and converges quickly with good adjustment accuracy, and its effectiveness is also demonstrated by parameter estimation.
Abstract: It is generally difficult to obtain a satisfactory solution with a single structure and mechanism. By adding heuristic information from Evolutionary Strategy (ES) and Particle Swarm Optimization (PSO) to the artificial fish swarm algorithm, a novel Artificial Fish Swarm Algorithm (AFSA) is proposed, and its convergence is proved. Experimental results show that the algorithm can effectively prevent the basic AFSA from falling into local extrema and converges quickly with good adjustment accuracy; its effectiveness is also demonstrated by parameter estimation.

Journal Article
TL;DR: An improved SIFT feature matching algorithm based on the image Radon transform is presented that has higher matching accuracy and needs less matching time, making it quite suitable for systems with high real-time demands such as virtual space roaming and target identification.
Abstract: Aiming at the problems of large calculation scale and high complexity in the Scale Invariant Feature Transform (SIFT) feature matching algorithm, this paper presents an improved SIFT feature matching algorithm based on the image Radon transform. It draws d straight lines in different directions in the SIFT feature point region and adopts the Radon transform integral values along these d lines as the SIFT feature vector descriptor, which reduces the dimensionality of the SIFT feature vector and improves the efficiency of feature matching. Experimental results prove that the improved algorithm has higher matching accuracy and needs less matching time, making it quite suitable for systems with high real-time demands such as virtual space roaming and target identification.

Journal Article
TL;DR: Experimental results show that the MA-based technology can improve the quality of the generated rules and thereby improve the performance of the IDS, and the rules can be used to detect or classify network intrusions in a real-time environment.
Abstract: Because current Intrusion Detection Systems (IDS) have a high false negative rate, this paper presents an intrusion detection technology based on the Monkey Algorithm (MA). It uses the MA to derive a set of classification rules from network data (the KDD99 data set), and the support-confidence framework is used as the fitness function to judge the quality of each rule. The generated rules are used to detect or classify network intrusions in a real-time environment. Experimental results show that the MA-based technology can improve the quality of the generated rules and thereby improve the performance of the IDS.

Journal Article
TL;DR: The Differential Evolution (DE) algorithm is used to decide the ACA's parameters, and a new adaptive ACA named DEAS is proposed, which effectively overcomes the influence of the ACA's control parameters and decreases the number of useless experiments.
Abstract: Aiming at phenomena such as parameter dependence, premature convergence and stagnation of the Ant Colony Algorithm (ACA), and exploiting the fact that the ACA is easily combined with other algorithms, the Differential Evolution (DE) algorithm is used to decide the ACA's parameters, and a new adaptive ACA named DEAS is proposed. The algorithm regards the parameters of the ACA as elements of the DE solution vector and adaptively finds the optimal combination of parameters together with the optimal solution to the problem. The new algorithm effectively overcomes the influence of the ACA's control parameters and decreases the number of useless experiments; it is adaptive, good at global search and prevents the degradation of populations. Comparison with the basic ACA indicates that DEAS improves performance significantly. With appropriate adaptation, the algorithm can also be used to solve other combinatorial optimization problems.

Journal Article
TL;DR: This paper presents a Web application development scheme based on Struts and Hibernate that achieves loose coupling between layers and easy maintenance, and reduces the difficulty of developing the business model.
Abstract: The model layer of Struts has some problems, such as complex JDBC database connections, high coupling between layers and difficult code maintenance. Aiming at these problems, this paper presents a Web application development scheme based on Struts and Hibernate. It achieves loose coupling between layers and easy maintenance, and reduces the difficulty of developing the business model. Through part of the development code and running windows of a student management system, it explains the integration process and proves the feasibility of the integrated solution.

Journal Article
TL;DR: An in-depth study and analysis is carried out on how to accelerate clustering in clustering system and the parallelism algorithm based on data parallelism and symmetric data-partition is put forward.
Abstract: Considering the insufficient clustering speed caused by selecting the initial centroids in the Bisecting K-Means (BKM) clustering algorithm, the idea of selecting the two patterns with the maximum distance as the initial cluster centroids is implemented. An in-depth study and analysis is carried out on how to accelerate clustering, and, according to the characteristics of BKM, a parallel algorithm based on data parallelism and symmetric data partitioning is put forward. Experimental results show that the improved algorithm achieves ideal speedup and efficiency.
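The seeding rule itself is easy to sketch (an O(n²) version; the parallel data-partition scheme is not shown):

```python
import numpy as np

def farthest_pair_seeds(points):
    """Return the two patterns with the maximum pairwise distance, to be
    used as the initial centroids of a bisecting split."""
    pts = np.asarray(points, dtype=float)
    dists = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
    i, j = np.unravel_index(np.argmax(dists), dists.shape)
    return pts[i], pts[j]
```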

Journal Article
TL;DR: According to the management problems of complex operator's Broadband Access Network (BAN), an improved general topology discovery algorithm is proposed and a method for verifying the correctness of the generated topological data is presented.
Abstract: To address the management problems of a complex operator's Broadband Access Network (BAN), this paper proposes an improved general topology discovery algorithm. The corresponding configuration log files are obtained in Telnet mode by means of multithreaded visits, then interpreted, and the parameters are stored in a database. The basic principle is that the physical Media Access Control (MAC) address of a lower-layer Network Element (NE) appears in the address forwarding tables of the upper-layer NE. The algorithm starts from an access switch as a leaf node and ends at the Broadband Access Server (BAS) switch as the root node, which yields a unique physical topological path. The double-uplink problem of logical topology discovery is avoided because the algorithm executes only between the convergence layer and the BAS layer. A method for verifying the correctness of the generated topological data is also presented.
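The core mapping rule can be illustrated with a small sketch; the data structures below are assumptions for illustration, and Telnet collection, log parsing and database storage are omitted.

```python
def infer_uplinks(forwarding_tables, ne_macs):
    """If a lower-layer NE's physical MAC address appears in the address
    forwarding table of an upper-layer NE's port, attach the lower NE
    below that port.

    forwarding_tables: {upper_ne_id: {port: set_of_learned_macs}}
    ne_macs:           {lower_ne_id: physical_mac}
    """
    uplinks = {}
    for lower_ne, mac in ne_macs.items():
        for upper_ne, ports in forwarding_tables.items():
            for port, learned_macs in ports.items():
                if mac in learned_macs:
                    uplinks[lower_ne] = (upper_ne, port)
    return uplinks
```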

Journal Article
TL;DR: Simulation results show that the data scheduling algorithm based on network coding can be simulated effectively with the proposed platform and that the platform possesses generality and scalability.
Abstract: Because existing simulators of network-coding-based Peer-to-Peer Video-on-Demand (P2P-VoD) systems give little consideration to generality and scalability, a general simulation platform based on NS2 is proposed. The network coding mechanism is extended on the basis of NS2, and a Socket model and interface model are designed to support P2P applications. To complete the simulation platform, the P2P-VoD application models are designed and developed. Simulation results show that data scheduling algorithms based on network coding can be simulated effectively with the proposed platform, and that the platform possesses generality and scalability.

Journal Article
TL;DR: Experimental results indicate that the improved SIFT algorithm, which introduces color and global information into the SIFT descriptor, can reduce the mismatch probability of building images and greatly improve matching results.
Abstract: When building images are processed with the Scale Invariant Feature Transform (SIFT) descriptor, a large number of falsely matched points appear. Aiming at this problem, color and global information are introduced to improve the performance of the SIFT descriptor. The l1l2l3 color model, which is robust against lighting changes, is introduced and log-polar coordinates are built. For each key point, a circular neighborhood is built to accumulate the values of l1, l2 and l3, from which a color-invariant descriptor is constructed; a global descriptor is constructed with the same method. The Euclidean distances of the SIFT color-invariant descriptor and the global descriptor are used as the similarity measure. Experimental results indicate that the improved SIFT algorithm can reduce the mismatch probability of building images and greatly improve matching results.
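The l1l2l3 color-invariant channels mentioned above follow a standard definition and can be computed as below; accumulating them over the log-polar neighborhood of each keypoint, as the abstract describes, is left out of this sketch.

```python
import numpy as np

def l1l2l3(rgb):
    """Compute the l1l2l3 color-invariant channels of an H x W x 3 RGB
    image; the channels are largely insensitive to illumination changes."""
    rgb = rgb.astype(float)
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    rg, rb, gb = (r - g) ** 2, (r - b) ** 2, (g - b) ** 2
    denom = rg + rb + gb + 1e-12          # avoid division by zero on gray pixels
    return np.stack([rg / denom, rb / denom, gb / denom], axis=-1)
```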

Journal Article
LU Xiao-feng1
TL;DR: Experimental results indicate that the improved algorithm has a better detection effect and is better suited to outdoor monitoring applications.
Abstract: Through analysis of current typical moving-target detection algorithms, this paper proposes an improved moving-target detection algorithm based on background modeling and frame difference. It alleviates the ghosting caused by Gaussian mixture background modeling as well as the shadow and cavity phenomena, and improves the background updating method according to the idea of frame difference. The improved algorithm can also detect targets effectively under a varying background. Experimental results indicate that the improved algorithm has a better detection effect and is better suited to outdoor monitoring applications.