
Showing papers in "Computer Engineering and Science in 2008"


Journal Article
TL;DR: A large number of sensors are usually deployed around some discrete targets in wireless sensor networks, and a distributed energy-efficient data aggregation protocol (EETO) is proposed, which reduces energy consumption and prolongs the coverage lifetime of the network.
Abstract: A large number of sensors are usually deployed around some discrete targets in wireless sensor networks. For such target coverage networks, this paper proposes a distributed energy-efficient data aggregation protocol (EETO). EETO groups sensors that cover common targets into one cluster, where all the cluster members are the K-hop coverage neighbors of the cluster head. Therefore, the related data can be aggregated promptly and completely at the cluster head, so that data transmissions are greatly reduced. Detailed simulation results show that EETO reduces energy consumption and prolongs the coverage lifetime of the network.

10 citations


Journal Article
TL;DR: A modified application of the ant colony algorithm in solving the assignment problem is designed, which improves the accuracy and efficiency of the algorithm effectively.
Abstract: The assignment problem is a very important one that frequently appears in mass production and in people's daily life. The paper constructs the model of the assignment problem and analyzes the existing ant colony algorithm applied to it. It designs a modified application of the ant colony algorithm for solving the assignment problem, which effectively improves the accuracy and efficiency of the algorithm. It also gives a brief analysis of the feasibility and advantages of using the ant colony algorithm for the assignment problem according to the experimental results.

9 citations
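The abstract does not give the modification's details, but the basic ant colony scheme for the assignment problem that it builds on can be sketched as follows; function names, parameter values, and the best-only pheromone update are illustrative assumptions, not the paper's settings:

```python
import random

def aco_assignment(cost, n_ants=20, n_iters=100, alpha=1.0, beta=2.0, rho=0.1, seed=0):
    """Approximate a min-cost assignment with a basic ant colony scheme.
    cost[i][j] is the cost of giving task i to agent j (square matrix)."""
    rng = random.Random(seed)
    n = len(cost)
    tau = [[1.0] * n for _ in range(n)]                       # pheromone on (task, agent) pairs
    eta = [[1.0 / (c + 1e-9) for c in row] for row in cost]   # heuristic: prefer cheap pairs
    best_assign, best_cost = None, float("inf")
    for _ in range(n_iters):
        for _ in range(n_ants):
            free = list(range(n))          # agents not yet assigned
            assign = []
            for i in range(n):             # each ant builds a full assignment task by task
                w = [tau[i][j] ** alpha * eta[i][j] ** beta for j in free]
                j = rng.choices(free, weights=w)[0]
                free.remove(j)
                assign.append(j)
            c = sum(cost[i][assign[i]] for i in range(n))
            if c < best_cost:
                best_assign, best_cost = assign, c
        # evaporate, then reinforce the best-so-far assignment
        for i in range(n):
            for j in range(n):
                tau[i][j] *= (1 - rho)
        for i, j in enumerate(best_assign):
            tau[i][j] += 1.0 / best_cost
    return best_assign, best_cost
```

On small instances this converges quickly; the paper's modification presumably alters the construction or update rules to improve accuracy and efficiency further.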


Journal Article
TL;DR: The information platform of logistics is a network interaction platform and a comprehensive business system service platform serving logistics that promotes cooperation between enterprises, increases working efficiency, reduces working costs, and provides an integrated logistics information service for the users.
Abstract: The information platform of logistics is a network interaction platform and a comprehensive business system service platform serving logistics. By providing a basic platform, building a cooperation mechanism, and integrating logistics-related business information, it realizes information exchange between the business management department and enterprises, and between enterprises themselves, through computer business systems. Meanwhile, it promotes cooperation between enterprises, increases working efficiency, reduces working costs, supports the information management of the business management departments, and provides an integrated logistics information service for the users.

8 citations


Journal Article
TL;DR: This algorithm exploits the internal structure characteristics of complex social networks and uses existing "possible" short paths to estimate the real shortest paths between nodes; it can greatly reduce the computational complexity while maintaining high approximation accuracy.
Abstract: With the rapid growth of Internet users, social network mining and analysis based on the Internet has become a hot research topic. The social networks mined from the Internet usually have large scales, which require an efficient algorithm to calculate their statistics. Betweenness centrality, an important network statistic, has been used in many graph clustering/classification algorithms, and how to decrease its computational complexity has become a pressing problem. Recent algorithms for calculating network statistics usually use approximation methods to estimate the graph distance between pairs of nodes, but these typical approximation algorithms do not take complex-network properties into account; furthermore, the distance estimation formulas cannot be directly used to estimate betweenness centrality. In this paper, an efficient approximation algorithm to calculate betweenness is proposed. This algorithm exploits the internal structure characteristics of complex social networks and uses existing "possible" short paths to estimate the real shortest paths between nodes. Experimental results demonstrate that our algorithm can greatly reduce the computational complexity while maintaining high approximation accuracy. Some useful conclusions are obtained through the analysis of the experimental results, which lays a solid foundation for further research.

8 citations
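The abstract does not detail how the "possible" short paths are found. A generic baseline that such approximations are usually compared against is pivot sampling: run Brandes-style single-source accumulation from a random subset of nodes and rescale. A sketch under that assumption (not the paper's method):

```python
from collections import deque
import random

def approx_betweenness(adj, n_pivots, seed=0):
    """Estimate betweenness centrality on an unweighted graph by sampling
    pivot sources and running Brandes' accumulation from each, then rescaling.
    adj: dict mapping node -> list of neighbors."""
    rng = random.Random(seed)
    nodes = list(adj)
    bc = {v: 0.0 for v in nodes}
    for s in rng.sample(nodes, n_pivots):
        # BFS computes shortest-path counts sigma and predecessor lists
        stack, pred = [], {v: [] for v in nodes}
        sigma, dist = {v: 0 for v in nodes}, {v: -1 for v in nodes}
        sigma[s], dist[s] = 1, 0
        q = deque([s])
        while q:
            v = q.popleft()
            stack.append(v)
            for w in adj[v]:
                if dist[w] < 0:
                    dist[w] = dist[v] + 1
                    q.append(w)
                if dist[w] == dist[v] + 1:
                    sigma[w] += sigma[v]
                    pred[w].append(v)
        # back-propagate pair dependencies in reverse BFS order
        delta = {v: 0.0 for v in nodes}
        while stack:
            w = stack.pop()
            for v in pred[w]:
                delta[v] += sigma[v] / sigma[w] * (1 + delta[w])
            if w != s:
                bc[w] += delta[w]
    scale = len(nodes) / n_pivots
    return {v: bc[v] * scale for v in nodes}
```

With n_pivots equal to the node count this reduces to exact Brandes betweenness; smaller samples trade accuracy for speed, which is the trade-off the paper's structure-aware estimator aims to improve.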


Journal Article
TL;DR: The comparison experiment proves that the algorithm improves the accuracy of identifying moving people in a complicated environment with good stability.
Abstract: The MeanShift algorithm applied to tracking moving objects mainly uses a single histogram to describe the color characteristics of the object. This method obviously lacks spatial distribution information. To address this defect, Emilio Maggio et al. have put forward an improved algorithm that divides the object into regions, but its discriminative power and stability are not good enough in a complex environment. So this paper proposes a new method to improve it. On the one hand, reducing the number of human body regions cuts down the processing time without losing the spatial information. On the other hand, each sub-block is weighted by certain coefficients so as to improve the discriminative power. The comparison experiment proves that the algorithm improves the accuracy of identifying moving people in a complicated environment with good stability.

7 citations
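The sub-block weighting idea can be illustrated with a minimal sketch: the target region is split into horizontal bands (e.g., head/torso/legs), one histogram is kept per band, and candidate similarity is a coefficient-weighted sum of per-band Bhattacharyya scores. Band count, bin count, and weights here are illustrative assumptions, not the paper's settings:

```python
import math

def block_histograms(region, n_blocks, bins=8, max_val=256):
    """Split a grayscale region (list of pixel rows) into n_blocks horizontal
    bands and return one normalized histogram per band, so coarse spatial
    layout survives the histogram description."""
    h = len(region)
    hists = []
    for b in range(n_blocks):
        rows = region[b * h // n_blocks:(b + 1) * h // n_blocks]
        hist = [0.0] * bins
        for row in rows:
            for px in row:
                hist[px * bins // max_val] += 1
        total = sum(hist) or 1.0
        hists.append([c / total for c in hist])
    return hists

def weighted_similarity(hists_a, hists_b, weights):
    """Coefficient-weighted sum of per-block Bhattacharyya coefficients;
    1.0 means identical distributions when the weights sum to 1."""
    sim = 0.0
    for ha, hb, w in zip(hists_a, hists_b, weights):
        sim += w * sum(math.sqrt(p * q) for p, q in zip(ha, hb))
    return sim
```

In a tracker, this similarity replaces the single-histogram Bhattacharyya score inside the MeanShift candidate evaluation.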


Journal Article
TL;DR: The traditional exact methods and popular meta-heuristic methods for the TSP are discussed, along with the advantages and disadvantages of each method.
Abstract: The traveling salesman problem (TSP) is a typical combinatorial optimization problem with practical application value. However, there is no generally effective solution to it today. So, in this paper, the traditional exact methods and the popular meta-heuristic methods are surveyed, and the advantages and disadvantages of each method are discussed. The future research directions of the TSP are also given.

6 citations


Journal Article
YI Dong-yun1
TL;DR: A novel class of adaptive PSO is proposed based on the Cultural Algorithm, where the fuzzy rules represent the experiences of the particles, and are shared in the population to form the culture.
Abstract: Particle Swarm Optimization (PSO) is a population-based evolutionary algorithm in which the inertia weight plays a key role. In this paper, a novel class of adaptive PSO is proposed based on the Cultural Algorithm (CA). Fuzzy rules represent the experiences of the particles and are shared in the population to form the culture. While the population is evolving, the culture is evolved by the Genetic Algorithm (GA). The fuzzy systems, constructed from the fuzzy rules in the belief space, approximate the inertia-weight controller best fitted to the particular problem. The simulation results illustrate that PSO using CA with fuzzy knowledge evolution is a promising optimization algorithm.

6 citations
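The paper's inertia-weight controller is a fuzzy system evolved through the Cultural Algorithm; as a simplified stand-in, the sketch below shows only where the inertia weight w enters the canonical PSO velocity update, using a plain linear decay in place of the evolved controller. Function name, test function, and all settings are illustrative:

```python
import random

def pso_sphere(dim=2, n_particles=20, n_iters=200,
               w_start=0.9, w_end=0.4, c1=2.0, c2=2.0, seed=0):
    """Minimize the sphere function sum(x_i^2) with canonical PSO.
    The inertia weight w decays linearly from w_start to w_end; in the
    paper this schedule is replaced by a culture-evolved fuzzy controller."""
    rng = random.Random(seed)
    f = lambda x: sum(v * v for v in x)
    pos = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]               # personal bests
    gbest = min(pbest, key=f)[:]              # global best
    for t in range(n_iters):
        w = w_start + (w_end - w_start) * t / (n_iters - 1)   # linear decay
        for i in range(n_particles):
            for d in range(dim):
                vel[i][d] = (w * vel[i][d]
                             + c1 * rng.random() * (pbest[i][d] - pos[i][d])
                             + c2 * rng.random() * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            if f(pos[i]) < f(pbest[i]):
                pbest[i] = pos[i][:]
                if f(pbest[i]) < f(gbest):
                    gbest = pbest[i][:]
    return gbest, f(gbest)
```

A large early w favors exploration and a small late w favors exploitation; the paper's contribution is learning that trade-off per problem instead of fixing the schedule.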


Journal Article
TL;DR: A flexible and analytical process model is constructed based on the Petri net techniques to model the business process, and the average time performance is analyzed by using some knowledge of the stochastic Petri nets and the probability theory.
Abstract: Whole workflow management is based on business process modeling. Choosing an efficient modeling technique for the formalization of a complicated and changing business process is very important to the construction of flexible workflow management. The paper takes advantage of Petri net techniques to model the business process, and a flexible and analyzable process model is constructed. Firstly, some related workflow modeling techniques based on Petri nets are introduced, and the mapping of the Petri net to the execution of the workflow process model is described. Meanwhile, a specific case based on Petri net techniques is presented. Finally, the average time performance is analyzed by using the stochastic Petri net and probability theory.

5 citations


Journal Article
TL;DR: The security objective and implementation constraints of wireless sensor networks are introduced and the main defense means are summarized.
Abstract: With the application of wireless sensor networks in military and other data-sensitive fields, their security has become a hot topic. In this paper, the security objectives and implementation constraints of wireless sensor networks are introduced. Then the attacks which wireless sensor networks may suffer and the main defense means are summarized. Various security technologies are analyzed. Finally, future security research on wireless sensor networks is explored.

5 citations


Journal Article
TL;DR: The paper begins with the several insufficiencies of traditional expert systems, expands on the fundamental theory and the framework of expert systems based on neural networks, and gives the implementation of a fault-diagnosis system for artesian wells by selecting a three-layer BP neural network model.
Abstract: To address insufficiencies such as the weak inference capability and low intelligence level of traditional expert systems, this paper solves the problems of knowledge representation and knowledge acquisition in traditional expert systems with neural networks. The paper begins with several insufficiencies of traditional expert systems, expands on the fundamental theory and the framework of expert systems based on neural networks, and finally gives the implementation of a fault-diagnosis system for artesian wells using a three-layer BP neural network model.

4 citations


Journal Article
TL;DR: TTVOD, a VOD model based on hybrid P2P, mainly discusses the buffer control strategy which improves data delivery efficiency and redundancy, and verifies the validity of the model through experiments.
Abstract: Because of the limitation of video sources and deficient VCR support, Video-On-Demand (VOD) based on P2P networks is not widely applied. This paper presents TTVOD, a VOD model based on hybrid P2P, mainly discusses the buffer control strategy, which improves data delivery efficiency and reduces redundancy, and verifies the validity of the model through experiments.

Journal Article
TL;DR: The authors' library information retrieval system takes advantage of ontology, which expands the requirements of users to a semantic word set, and provides a document analyzer to filter the Web pages returned by the search agent according to a certain algorithm.
Abstract: The rapid growth and diversity of Web information bring many difficulties to efficient information retrieval. Current information retrieval tools just offer keyword-based searching, but ignore the semantic content of the keyword itself. The authors' library information retrieval system takes advantage of ontology, which expands the users' requirements into a semantic word set, and provides a document analyzer to filter the Web pages returned by the search agent according to a certain algorithm. Consequently it presents the most relevant documents to the users.

Journal Article
LI Zhi-yong1
TL;DR: A new algorithm based on ant colony algorithms and genetic algorithms called Multi-Objective Ant-Genetic Algorithm, which is used to solve the multi-objective optimization problem constrained by some conditions, is presented, which can approach the Pareto front more quickly and accurately than the previous algorithm.
Abstract: A new algorithm based on ant colony algorithms and genetic algorithms, called the Multi-Objective Ant-Genetic Algorithm and used to solve constrained multi-objective optimization problems, is presented in this paper. Firstly, the solution space is divided into subspaces, and all the subspaces are labeled with pheromone; the pheromone then guides the genetic search and updates itself. Meanwhile, the strategy of updating the Pareto-optimal solutions and the scheme for converging and terminating the search are used to improve the efficiency and reduce the complexity of the algorithm. Finally, an example shows that the algorithm can approach the Pareto front more quickly and accurately than the previous algorithm.
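The "updating the Pareto optimal decisions" step mentioned in the abstract follows the standard dominance rule for maintaining a non-dominated archive, which can be sketched as follows (minimization assumed; function names are illustrative):

```python
def dominates(a, b):
    """a dominates b if a is no worse in every objective and strictly
    better in at least one (minimization convention)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def update_pareto_archive(archive, candidate):
    """Insert a candidate objective vector into the archive: reject it if
    some archived point already dominates it, otherwise add it and drop
    every archived point it dominates."""
    if any(dominates(a, candidate) for a in archive):
        return archive
    return [a for a in archive if not dominates(candidate, a)] + [candidate]
```

Each new solution produced by the ant-guided genetic search would pass through such an update, so the archive converges toward the Pareto front.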

Journal Article
TL;DR: The importance of a gateway in greenhouse intelligent measuring and controlling systems is analyzed, the design principle of a Gateway is proposed, and the choice of devices is made.
Abstract: Wireless sensor networks are widely used in the modernization of agriculture. This paper analyzes the importance of a gateway in greenhouse intelligent measuring and controlling systems, proposes the design principles of a gateway, and selects the devices. Based on the low-power PXA270 embedded processor, the hardware platform of the gateway is implemented with Ethernet, USB host, and CF card interfaces. Finally, we analyze the booting process of the boot loader in detail, and implement the porting of Blob on the gateway we designed.

Journal Article
TL;DR: With software fault severity considered, a software fault-proneness prediction model is proposed in this paper using a Support Vector Machine and the Chidamber-Kemerer object-oriented metrics; it obtains better results than the Naïve Bayes, Random Forest, and NNge prediction models when high-severity, low-severity, and ungraded-severity faults are distinguished.
Abstract: With software fault severity considered, a software fault-proneness prediction model is proposed in this paper using a Support Vector Machine and the Chidamber-Kemerer (CK) object-oriented metrics. The experimental results show that the presented model obtains better results than the Naïve Bayes, Random Forest, and NNge prediction models when high-severity, low-severity, and ungraded-severity faults are distinguished.

Journal Article
TL;DR: A body-centered cubic structure is proposed for the deterministic deployment and organization of sensor nodes in a 3D space and a sensor node organization strategy for random 3D sensor networks is provided based on the virtual Voronoi cell.
Abstract: A large number of sensor networks embedded in the real physical world will be three-dimensional(3D).However,most current sensor network research assumes that sensors are deployed on a two-dimensional plane.This paper focuses on the problem of deployment and organization of sensor nodes in a 3D space.A body-centered cubic structure is proposed for the deterministic deployment of 3D sensor networks.And a sensor node organization strategy for random 3D sensor networks is provided based on the virtual Voronoi cell.
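A body-centered cubic (BCC) lattice places a node at every corner of a cubic grid plus one at the center of each cell. A minimal generator for such a deterministic deployment follows; the spacing `a` and grid extents are illustrative parameters, not values from the paper:

```python
def bcc_positions(nx, ny, nz, a=1.0):
    """Node positions for a body-centered cubic deployment over an
    nx * ny * nz grid of cells with lattice spacing a: all cube corners
    plus one node at the center of each cell."""
    pts = []
    for i in range(nx + 1):
        for j in range(ny + 1):
            for k in range(nz + 1):
                pts.append((i * a, j * a, k * a))            # cube corners
    for i in range(nx):
        for j in range(ny):
            for k in range(nz):
                pts.append(((i + 0.5) * a, (j + 0.5) * a, (k + 0.5) * a))  # body centers
    return pts
```

The center nodes halve the largest gap between neighbors compared with a plain cubic grid, which is why BCC-style placements are attractive for 3D coverage.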

Journal Article
TL;DR: Experiments on real-world text data sets demonstrate that BVM has accuracies comparable to SVM, but is much faster than SVM.
Abstract: In recent years, SVM (Support Vector Machine) for text classification has been regarded as one of the important advances in the text classification field. Many experiments show that SVM has higher classification accuracy than other machine learning algorithms in text classification, but it has a slower rate of convergence on large-scale data, which is a big flaw in practice. BVM (Ball Vector Machine) is a faster machine learning algorithm than SVM. This paper applies BVM to text categorization. Experiments on real-world text data sets demonstrate that BVM has accuracies comparable to SVM, but is much faster.

Journal Article
TL;DR: A prototype system of the expert knowledge map is realized, in which the main information and features of experts, as well as their social networks, can be accessed by web mining.
Abstract: The tacit knowledge of experts is very important for society, and the expert knowledge map is a useful means to utilize it. The key techniques for building expert knowledge maps are studied, and a scheme is built in which the main information and features of experts, as well as their social networks, can be accessed by web mining. A prototype system of the expert knowledge map is realized.

Journal Article
TL;DR: The possibility that human eyebrow works as an independent biometric for personal identification is analyzed, and the idea that PCA may be applied to eyebrow recognition is proposed.
Abstract: This paper analyzes the possibility that the human eyebrow works as an independent biometric for personal identification, and proposes the idea that PCA may be applied to eyebrow recognition. Using a small-scale database of eyebrow images taken from 32 persons, the problem of how PCA-based eyebrow recognition accuracy relates to the size-normalization methods and the information threshold has been studied in two cases: one constructing feature vectors from pure eyebrow images, the other from their Fourier transforms. The experimental results show that the highest accuracies are 60.00% in the first case and 90.63% in the second.

Journal Article
Chen Huo-wang1
TL;DR: This paper presents an analysis of the current major component models and a taxonomy of component models according to their component interfaces and component compositions, as well as a discussion of future research topics.
Abstract: Component-based software reuse is considered an important and feasible approach to solving the software crisis, and the component model, which depicts the intrinsic features and composition of components, is one of the essential ingredients for realizing reuse. In this paper, we present an analysis of the current major component models and a taxonomy of component models according to their component interfaces and component compositions. This paper also presents a summary of the current component models, and a discussion of future research topics.

Journal Article
TL;DR: This paper studies the localization algorithms of wireless sensor networks, introduces the current research status of localization and a localization algorithm using the concept of ring-overlapping, points out its disadvantage, and proposes two improved algorithms with higher performance.
Abstract: This paper studies the localization algorithms of wireless sensor networks, and introduces the current research status of localization and a localization algorithm using the concept of ring-overlapping. The disadvantage of this algorithm is also pointed out; two possible improvements are given and compared with each other. Through theoretical analysis and simulation, the two proposed algorithms are shown to have higher performance.

Journal Article
TL;DR: Coroutine is considered the most suitable pattern for distributed system programming, and Libresync, a Coroutine-based supporting library for distributed systems, is presented, which brings clear control flow, high flexibility, and expressiveness.
Abstract: Programming patterns for distributed systems mainly include the multi-threaded and event-driven patterns, and the event-driven pattern is the major one. This paper, however, discusses their disadvantages, and the Coroutine pattern's advantages over them. We think Coroutine is the most suitable pattern for distributed system programming. We present Libresync, a Coroutine-based supporting library for distributed systems, which brings clear control flow, high flexibility, and expressiveness. The performance of Libresync is also good enough for most purposes.

Journal Article
Xiao Wen-sheng1
TL;DR: A new method is presented which can convert scattered points into the standard VTK data format and an algorithm of incremental mesh construction is introduced, along with the self-adapting flatness, so surface rendering can be carried out with the VTK pipeline.
Abstract: Scattered point data can be easily generated in many fields such as reverse engineering for mechanical products and geographic information systems. In order to extend the applicability of data processing and surface rendering on the VTK-based visualization platform, a new method is presented which can convert scattered points into the standard VTK data format. As the core of the conversion process, an algorithm of incremental mesh construction is introduced, along with self-adapting flatness. Thus surface rendering can be carried out with the VTK pipeline. The algorithm implements the triangulation in three dimensions directly and adjusts the approximation error parameter dynamically. The results of the examples show that the surfaces of the models can be reconstructed in an effective and reliable way. The method is significant for both triangulation analysis and the VTK visualization process.

Journal Article
TL;DR: Experimental results show that the developed algorithm not only detects and removes ghosts fast and effectively, but also overcomes the shortcomings of other algorithms.
Abstract: This paper studies the usual problem of ghosts in moving-object detection based on background subtraction. After analyzing the pixel distribution of the border area of every foreground blob (PDBA), a method of ghost detection is proposed. Firstly, histogram matching (HM) and the mean variable ratio (MVR) are used to weigh the difference in the PDBA between the current and previous images. Secondly, histogram matching is also used to weigh the difference in the PDBA between every foreground blob and the background connected with that blob. Finally, a threshold segmentation method is used to identify ghost blobs with extremely high HM and low MVR. Experimental results show that the developed algorithm not only detects and removes ghosts quickly and effectively, but also overcomes the shortcomings of other algorithms.

Journal Article
TL;DR: This paper mainly uses several specific TSP instances to compare several important parameters of the algorithm, obtains a group of effective parameter values, and lays an effective foundation for solving other problems similar to the TSP.
Abstract: The simulated annealing algorithm is an effective approach to solving combinatorial optimization problems, but how the parameters are set greatly influences the results and effectiveness. Thus, this paper uses several specific TSP instances to compare several important parameters of the algorithm, obtains a group of effective parameter values, and lays an effective foundation for solving other problems similar to the TSP.
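The parameters such a study tunes (initial temperature, cooling rate, moves per temperature level, stopping temperature) all appear explicitly in a standard simulated-annealing loop for the TSP. A sketch with illustrative defaults, using 2-opt moves and the Metropolis acceptance rule:

```python
import math
import random

def sa_tsp(dist, t0=100.0, t_end=1e-3, alpha=0.95, moves_per_t=100, seed=0):
    """Simulated annealing for the TSP on a symmetric distance matrix.
    t0 (initial temperature), alpha (geometric cooling rate), moves_per_t,
    and t_end are exactly the kind of parameters compared experimentally."""
    rng = random.Random(seed)
    n = len(dist)
    tour = list(range(n))
    rng.shuffle(tour)
    def length(t):
        return sum(dist[t[i]][t[(i + 1) % n]] for i in range(n))
    cur = length(tour)
    best, best_len = tour[:], cur
    t = t0
    while t > t_end:
        for _ in range(moves_per_t):
            i, j = sorted(rng.sample(range(n), 2))
            cand = tour[:i] + tour[i:j + 1][::-1] + tour[j + 1:]   # 2-opt reversal
            delta = length(cand) - cur
            # Metropolis rule: always accept improvements, sometimes accept
            # uphill moves with probability exp(-delta / t)
            if delta < 0 or rng.random() < math.exp(-delta / t):
                tour, cur = cand, cur + delta
                if cur < best_len:
                    best, best_len = tour[:], cur
        t *= alpha   # geometric cooling schedule
    return best, best_len
```

Raising t0 or alpha lengthens the search and improves solution quality at higher cost, which is the trade-off such parameter studies quantify.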

Journal Article
TL;DR: An attribute reduction method based on knowledge dependency with the rough set theory is proposed, which can quickly and effectively find the attributes that really affect decisions.
Abstract: This paper studies methods of processing massive data with data mining techniques, and proposes an attribute reduction method based on knowledge dependency within the rough set theory, which can quickly and effectively find the attributes that really affect decisions. In addition, the paper puts forth an algorithm for generating decision trees based on knowledge dependency, which features high prediction precision and easy computation.
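The knowledge-dependency measure underlying such reduction is the standard rough-set dependency degree gamma(C, D): the fraction of objects whose equivalence class under the condition attributes C lies wholly inside one decision class of D. A minimal sketch on a toy decision table (function names are illustrative):

```python
def partition(rows, attrs):
    """Group row indices into equivalence classes by their values on the
    given attribute columns."""
    blocks = {}
    for idx, row in enumerate(rows):
        blocks.setdefault(tuple(row[a] for a in attrs), set()).add(idx)
    return list(blocks.values())

def dependency(rows, cond_attrs, dec_attrs):
    """gamma(C, D): size of the positive region (C-classes wholly contained
    in some D-class) over the universe size; 1.0 means the decision depends
    fully on the condition attributes."""
    dec_blocks = partition(rows, dec_attrs)
    pos = 0
    for block in partition(rows, cond_attrs):
        if any(block <= d for d in dec_blocks):
            pos += len(block)
    return pos / len(rows)
```

A reduct is a minimal attribute subset whose dependency degree equals that of the full condition set; attributes whose removal does not lower gamma are the ones that do not really affect decisions.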

Journal Article
TL;DR: The technology of attribute reduction based on the intuitionistic fuzzy rough set theory is described to address the problem of information loss in the process of discretization, and its advantages and disadvantages are compared.
Abstract: The technology of attribute reduction based on the intuitionistic fuzzy rough set theory is described to address the problem of information loss in the process of discretization. A model of rough sets under intuitionistic fuzzy equivalence relations is systematically investigated, and the definitions of the positive region, the dependence degree, and the non-dependence degree are given. An attribute reduction algorithm based on intuitionistic fuzzy rough sets is analyzed in detail, and its advantages and disadvantages are compared. Experiments show the feasibility of applying the algorithm.

Journal Article
LI Wan-long1
TL;DR: This paper takes advantage of some theories related to the ontology knowledge model and discusses the application and the method of ontology in knowledge-based systems, and proposes an ontology-based knowledge model.
Abstract: As a modeling tool for describing the concept model of information systems at the semantic and knowledge level, ontology has been widely used in many areas of computer science in recent years. Given the current state of research, ontology-based knowledge construction is a complex project, and the key is to resolve the problem of how to build a knowledge model. This paper draws on theories related to the ontology knowledge model. Based on an analysis of ontology, we discuss the application and methods of ontology in knowledge-based systems. Finally we propose an ontology-based knowledge model, and explain the method of ontology-based knowledge base construction.

Journal Article
TL;DR: An algorithm of texture classification by SVM is proposed, which uses GLCM to extract features and can classify the texture features more exactly.
Abstract: The support vector machine (SVM) has excellent performance in classification, and the Gray Level Co-occurrence Matrix (GLCM) is a promising method for texture analysis. So an algorithm of texture classification by SVM is proposed, which uses the GLCM to extract features. Compared with the method that uses images' gray information directly for SVM classification, the method proposed in this paper can classify textures more exactly.
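The GLCM feature-extraction step that feeds the SVM can be sketched minimally as follows; the offset, number of gray levels, and the three Haralick-style features chosen here are illustrative, since the abstract does not specify which statistics the paper uses:

```python
def glcm(img, dx, dy, levels):
    """Normalized gray-level co-occurrence matrix for pixel offset (dx, dy).
    img: 2D list of integer gray levels in [0, levels)."""
    m = [[0.0] * levels for _ in range(levels)]
    h, w = len(img), len(img[0])
    total = 0
    for y in range(h):
        for x in range(w):
            ny, nx = y + dy, x + dx
            if 0 <= ny < h and 0 <= nx < w:
                m[img[y][x]][img[ny][nx]] += 1   # count co-occurring level pair
                total += 1
    return [[v / total for v in row] for row in m]

def glcm_features(m):
    """Classic Haralick-style statistics of a normalized GLCM, usable as
    an SVM feature vector: contrast, energy, homogeneity."""
    levels = len(m)
    contrast = sum((i - j) ** 2 * m[i][j]
                   for i in range(levels) for j in range(levels))
    energy = sum(v * v for row in m for v in row)
    homogeneity = sum(m[i][j] / (1 + abs(i - j))
                      for i in range(levels) for j in range(levels))
    return [contrast, energy, homogeneity]
```

In practice several offsets (e.g., four directions) are computed and their features concatenated, giving a far more compact and discriminative input than raw gray values.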

Journal Article
TL;DR: The paper summarizes the movement law of yarn carriers in the widely used four-step braiding technique, creates mathematical models between the braiding technique parameters and the geometric structures, and proposes a fixed-grid method of step-by-step simulation according to the characteristics of the four-step braiding technique.
Abstract: The paper summarizes the movement law of yarn carriers in the four-step braiding technique, which is widely used, creates mathematical models between the braiding technique parameters and the geometric structures, and proposes a fixed-grid method of step-by-step simulation according to the characteristics of the four-step braiding technique. The paper first resolves the design of the quadrate four-step braiding method, and analyses the similarities and differences between this method and the circular four-step braiding method. Finally a 3D geometric simulation of the pre-modeling entity is realized.