
Showing papers in "Computer Technology and Development in 2006"


Journal Article•
TL;DR: In this paper, some key questions about wireless sensor networks are introduced and analyzed, and the future of wireless sensor networks is presented in light of current research progress.
Abstract: A wireless sensor network, composed of sensors, microprocessors and wireless communication interfaces, is an attractive research field that is gaining more and more attention. Its wide application prospects are driving rapid development in areas such as health care, environment monitoring and the military. In this paper, some key questions about wireless sensor networks are introduced and analyzed. The final part presents the future of wireless sensor networks in light of current research progress.

27 citations


Journal Article•
TL;DR: Several main collaborative filtering algorithms are described, their strengths relative to other methods are introduced, and several open research problems and directions are pointed out.
Abstract: In e-commerce recommender systems, collaborative filtering is currently the most popular and successful technique. It assumes that similar users behave similarly when shopping. This article first introduces the strengths of collaborative filtering compared with other methods, then describes several main collaborative filtering algorithms, and finally points out several open research problems and directions.

12 citations
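The "similar users behave similarly" assumption above can be sketched as user-based collaborative filtering: score user similarity over co-rated items, then predict a rating as a similarity-weighted average. This is a minimal illustration with made-up ratings, not the paper's exact algorithm.

```python
# A minimal sketch of user-based collaborative filtering; the rating
# dictionaries and user/item names are illustrative.
from math import sqrt

def cosine_sim(a, b):
    """Cosine similarity over the items two users have both rated."""
    common = set(a) & set(b)
    if not common:
        return 0.0
    num = sum(a[i] * b[i] for i in common)
    den = sqrt(sum(v * v for v in a.values())) * sqrt(sum(v * v for v in b.values()))
    return num / den

def predict(ratings, user, item):
    """Predict a rating as the similarity-weighted average of other users' ratings."""
    num = den = 0.0
    for other, r in ratings.items():
        if other == user or item not in r:
            continue
        s = cosine_sim(ratings[user], r)
        num += s * r[item]
        den += abs(s)
    return num / den if den else 0.0

ratings = {
    "u1": {"a": 5, "b": 3},
    "u2": {"a": 5, "b": 3, "c": 4},
    "u3": {"a": 1, "b": 5, "c": 1},
}
print(round(predict(ratings, "u1", "c"), 2))
```

Since u1's tastes match u2's far more than u3's, the predicted rating for item "c" lands much closer to u2's rating of 4 than to u3's rating of 1.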


Journal Article•
Zhang Xiu-ru1•
TL;DR: Back-propagation is an extensively applied multi-layer feedforward neural network model, and methods such as the genetic algorithm and simulated annealing are introduced to optimize the BP algorithm.
Abstract: The back-propagation (BP) network is an extensively applied multi-layer feedforward artificial neural network. The basic principle of the BP algorithm is analyzed first. Then defects such as slow convergence and trapping in local minima are pointed out, and their root causes are presented. Finally, in view of these limitations, methods such as the genetic algorithm and the simulated annealing algorithm are introduced to optimize the BP algorithm. Experimental results show that these methods efficiently improve the convergence of the BP algorithm and avoid local minima.

11 citations
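The escape from local minima that the abstract credits to simulated annealing can be sketched on a toy one-dimensional error surface; the function, temperatures and step sizes below are illustrative stand-ins for a BP network's loss landscape, not the paper's setup.

```python
# A minimal sketch of simulated annealing escaping a local minimum, of the
# kind used to find a good starting weight before gradient-based BP refinement.
import math
import random

def loss(w):
    # Toy surface with a shallow local minimum near w = +1
    # and a deeper global minimum near w = -1.
    return (w * w - 1.0) ** 2 + 0.3 * w

def anneal(t0=2.0, cooling=0.999, steps=5000, seed=1):
    rng = random.Random(seed)
    w = 1.0                    # start in the *local* basin on purpose
    best, t = w, t0
    for _ in range(steps):
        cand = w + rng.gauss(0.0, 0.5)
        delta = loss(cand) - loss(w)
        # accept downhill moves always, uphill moves with Boltzmann probability
        if delta < 0 or rng.random() < math.exp(-delta / t):
            w = cand
        if loss(w) < loss(best):
            best = w
        t *= cooling           # geometric cooling schedule
    return best

w = anneal()
print(w, loss(w))
```

Plain gradient descent started at w = 1 would stay in the shallow basin; the annealer's occasional uphill acceptances carry it over the barrier to the deeper one, which is the behavior the hybrid GA/SA-plus-BP schemes rely on.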


Journal Article•
Lu You1•
TL;DR: The elements of statistical learning theory behind support vector machines for classification, together with the corresponding algorithms, are introduced; the main issues of support vector machines are discussed, and their application prospects are surveyed.
Abstract: Support vector machines are a novel kind of machine learning method that has become a hotspot of machine learning research because of its excellent performance. In this paper, the elements of statistical learning theory underlying support vector machines for classification, together with the corresponding algorithms, are introduced. The main issues of support vector machines are discussed, and their application prospects are surveyed.

11 citations
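The classification idea behind the abstract, maximizing the margin while penalizing violations, can be sketched with a linear SVM trained by stochastic subgradient descent on the hinge loss (a Pegasos-style solver; the data and all parameters are made up, and a full kernel SMO solver is beyond an abstract-sized example).

```python
# A hedged sketch of linear SVM training: minimise hinge loss plus an L2
# margin penalty with stochastic subgradient steps.
import random

def train_linear_svm(data, lam=0.01, epochs=200, seed=0):
    rng = random.Random(seed)
    w = [0.0, 0.0]
    b = 0.0
    t = 0
    for _ in range(epochs):
        rng.shuffle(data)
        for x, y in data:
            t += 1
            eta = 1.0 / (lam * t)          # decaying step size
            margin = y * (w[0] * x[0] + w[1] * x[1] + b)
            # shrink weights (regulariser subgradient) ...
            w = [wi * (1 - eta * lam) for wi in w]
            # ... and step toward margin violators (hinge subgradient)
            if margin < 1:
                w = [wi + eta * y * xi for wi, xi in zip(w, x)]
                b += eta * y
    return w, b

data = [((2.0, 2.0), 1), ((3.0, 1.5), 1), ((-2.0, -1.0), -1), ((-1.5, -2.5), -1)]
w, b = train_linear_svm(list(data))
preds = [1 if w[0] * x[0] + w[1] * x[1] + b > 0 else -1 for x, _ in data]
print(preds)
```

On this tiny separable set the learned hyperplane classifies every training point correctly; the statistical-learning-theory argument the abstract cites is that the margin this objective maximizes bounds generalization error.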


Journal Article•
TL;DR: To discover more valuable rules, weighted association rule algorithms are presented that use frequency and profit to express item importance, improving the classical Apriori algorithm.
Abstract: Association rule mining can find interesting associations among a large set of data items and has been applied widely in many fields. However, traditional association rules seldom consider the importance of individual data items, treating every item as equally important, so the mining results are often poor. To discover more valuable rules, weighted association rule algorithms are presented that use frequency and profit to express item importance, and the classical Apriori algorithm is improved accordingly. Finally, an example verifies that the improved algorithm is reasonable and finds much more valuable information.

9 citations
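The profit-weighting idea can be sketched as a weighted support measure: instead of counting transactions, an itemset's support is the share of total profit contributed by the transactions that contain it. The data and weighting rule below are illustrative, not the paper's exact definition.

```python
# A sketch of profit-weighted support for association rule mining.
def weighted_support(itemset, transactions):
    """Profit share of the transactions containing the itemset."""
    total = sum(profit for _, profit in transactions)
    hit = sum(profit for items, profit in transactions if itemset <= items)
    return hit / total

# (items bought, profit of the transaction)
transactions = [
    ({"bread", "milk"}, 5.0),
    ({"bread", "butter"}, 30.0),
    ({"milk", "butter"}, 20.0),
    ({"bread", "milk", "butter"}, 40.0),
]
print(weighted_support({"bread", "milk"}, transactions))  # → 45/95 ≈ 0.474
```

Plain support for {bread, milk} would be 2/4 = 0.5, but because the two transactions containing it are low-profit ones, the weighted measure ranks it lower, which is exactly the reordering of "valuable" rules the abstract aims for.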


Journal Article•
TL;DR: An image encryption algorithm based on chaotic binary sequences is discussed, and a gray-distance measure is improved to evaluate the encryption effect; experiments show the algorithm is effective, with low computational complexity and high security.
Abstract: With the advent of the Internet and multimedia technology, multimedia communication has become more and more important, making the protection of image and sound information a pressing research topic. This paper discusses an image encryption algorithm based on chaotic binary sequences and improves a gray-distance measure to evaluate the effect of binary image encryption. Experiments examine the chaotic sequence's sensitivity to its initial condition, the scrambling distance of pixels, and the encryption and decryption speed. The results show that the encryption effect is good, with low computational complexity and high security.

8 citations
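The chaotic-sequence idea can be sketched with a logistic-map keystream XORed against the pixel bytes; the map parameter, the byte-extraction rule and the toy "image" are illustrative, not the paper's exact scheme.

```python
# A minimal sketch of chaos-based image encryption with a logistic map.
def keystream(x0, n, r=3.99):
    """Iterate the logistic map and quantise its orbit into key bytes."""
    x, out = x0, []
    for _ in range(n):
        x = r * x * (1.0 - x)           # logistic map iteration
        out.append(int(x * 256) % 256)  # illustrative byte extraction
    return out

def xor_bytes(data, key):
    return bytes(d ^ k for d, k in zip(data, key))

image = bytes(range(32))                # stand-in for raw pixel data
key = keystream(0.3456, len(image))
cipher = xor_bytes(image, key)
plain = xor_bytes(cipher, key)          # XOR is its own inverse
print(plain == image)
# Initial-condition sensitivity: a 1e-9 change in x0 soon yields a
# completely different keystream, so the wrong key cannot decrypt.
print(xor_bytes(cipher, keystream(0.3456 + 1e-9, len(image))) == image)
```

The second check is the sensitivity experiment the abstract mentions: the map's positive Lyapunov exponent amplifies a billionth-sized perturbation to full scale within a few dozen iterations.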


Journal Article•
Liang Xun1•
TL;DR: A survey of data mining from the viewpoints of modeling, algorithms, applications and software systems, outlining the concept and characteristics of data mining and discussing typical data sets.
Abstract: Data mining is an interdisciplinary area formulated at the end of the 20th century. Up to now, it has been widely and successfully used in banking, stock markets, retail, medicine, telecommunications, electronic engineering, the aviation industry and the travel industry, where huge amounts of data are available and await in-depth analysis. This paper provides a survey of data mining from the viewpoints of modeling, algorithms, applications and software systems. First, it outlines the concept and characteristics of data mining and discusses typical data sets. Second, the steps and procedure are summarized. Third, the tasks and models are explored in application scenarios. Fourth, popular data mining algorithms are briefly analyzed together with practical considerations. Fifth, applications of data mining are illustrated. Sixth, data mining software tools, their features and their vendors are listed and commented on. Finally, the prospects of data mining and the issues it still has to solve are addressed.

8 citations


Journal Article•
TL;DR: The expression and automatic implicit learning algorithm of a user profile based on user search histories are given, and evaluation criteria for the system are stated.
Abstract: Puts forward a new user profile model based on user search histories, to solve the problem that current search engines cannot take users' personal interests into account for personalized search and that user profiles are hard to update. The expression of the user profile and an automatic implicit learning algorithm based on search histories are given. The construction and updating of the user profile are discussed thoroughly. Finally, evaluation criteria for the system are stated.

7 citations


Journal Article•
TL;DR: It is shown that this algorithm offers an effective way to solve the calendar problem, coupled with lock-chamber arranging and scheduling, in the plan-arranging part of the co-scheduling of the Three Gorges Dam and Gezhouba Dam system.
Abstract: The co-scheduling of the Three Gorges Dam and the Gezhouba Dam is a system used to improve the efficiency of navigation. The plan-arranging part of the co-scheduling system is a calendar problem coupled with the arranging and scheduling of lock chambers. The arranging and scheduling of lock chambers is described with a mathematical model of the two-dimensional packing problem, a typical NP-complete problem. An improved dimensionality-reduction quick-arranging algorithm, based on the idea of step-by-step dimensionality reduction, solves this two-dimensional packing problem. It is shown that the algorithm offers an effective way to solve the calendar problem in the plan-arranging part of the co-scheduling system, improving the area utilization ratio effectively and meeting the goals of the practical engineering project.

6 citations


Journal Article•
TL;DR: The results show that the SVM performance by using mixtures of kernels is much better than that by using traditional kernels.
Abstract: Support vector machines (SVM) can be used for function regression. It is important to choose an optimal kernel in order to enhance the characteristics of the SVM. Since every traditional kernel has its own advantages and disadvantages, this paper chooses mixtures of kernels that have the desirable characteristics for SVM learning and generalization, adopts them for function regression, and compares them with SVMs using traditional kernels. The results show that SVM performance with mixtures of kernels is much better than with traditional kernels.

5 citations
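The kernel mixture can be sketched as a convex combination of a local RBF kernel and a global polynomial kernel; any combination with non-negative weights is itself a valid kernel. The parameter values below are illustrative, not the paper's tuned choices.

```python
# A sketch of a mixed SVM kernel: lam * RBF + (1 - lam) * polynomial.
import math

def rbf(x, y, gamma=0.5):
    """Gaussian RBF kernel: strong local interpolation."""
    return math.exp(-gamma * sum((a - b) ** 2 for a, b in zip(x, y)))

def poly(x, y, degree=2, c=1.0):
    """Polynomial kernel: better global extrapolation."""
    return (sum(a * b for a, b in zip(x, y)) + c) ** degree

def mixed(x, y, lam=0.7):
    return lam * rbf(x, y) + (1 - lam) * poly(x, y)

x, y = (1.0, 2.0), (2.0, 0.5)
print(mixed(x, y), mixed(y, x))   # symmetric, as any kernel must be
```

A real experiment would drop `mixed` into an SVR solver and tune `lam` alongside the kernel parameters; the closure of kernels under non-negative combination is what makes the mixture admissible.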


Journal Article•
TL;DR: The method compensates each area according to its luminance proportion, bringing local luminance close to that of the whole image, and then segments the image using ordinary methods.
Abstract: For object segmentation against an uneven background, ordinary segmentation methods cannot achieve good results. To solve this problem, a new way to segment such images is offered. The method compensates each area according to its luminance proportion, bringing the luminance of local areas close to that of the whole image, and then segments the image using ordinary methods. Experiments prove that this approach yields better results.

Journal Article•
TL;DR: An inductive learning approach based on modified rough set is proposed, where the continuous attributes in the decision table are fuzzified with the proper fuzzy membership functions, and the fuzzy similar matrix of the attributes is constructed with the fuzzy degree of nearness.
Abstract: In this paper, an inductive learning approach based on a modified rough set is proposed. Firstly, the continuous attributes in the decision table are fuzzified with proper fuzzy membership functions, the fuzzy similarity matrix of the attributes is constructed with the fuzzy degree of nearness, and the k-w method is applied to evaluate the relative importance of every continuous attribute. The continuous decision table is discretized into a compatible table based on the fuzzy similarity relation. Secondly, an improved definition of attribute significance based on the weighted sum is proposed, and a prototype system based on the approach is developed. Finally, an engineering example proves the effectiveness and feasibility of the proposed method.

Journal Article•
Liu Qian1•
TL;DR: A new adaptive filter based on the wavelet transform is described; tests prove that it denoises effectively and gives better results.
Abstract: In the process of recording heart sounds, it is inevitable that many kinds of noise are mixed into the main signal. The noise introduces disturbing factors and influences the results. Heart sound is a highly non-stationary signal: common filtering methods remove the noise, but part of the signal is removed along with it. A new adaptive filter based on the wavelet transform is described. Tests prove that this method denoises effectively and gives better results.

Journal Article•
TL;DR: Tests show that the prediction accuracy is obviously higher than that of traditional NN classification methods such as the BP algorithm, giving a satisfying result.
Abstract: According to the characteristics of stock prediction, this paper selects the data that most influence the stock development trend of listed companies. In order to avoid the disadvantages of traditional NN classification methods (e.g. the BP algorithm), this paper uses the support vector machine (SVM) to predict the stock development trend of listed companies. Tests show that the prediction accuracy is obviously higher than that of traditional NN classification methods such as the BP algorithm, giving a satisfying result.

Journal Article•
TL;DR: Analyzes the clustering methods and representative clustering algorithms, puts forward the typical requirements of clustering, and compares the common clustering algorithms, so that a clustering method suited to a particular problem can easily be found.
Abstract: Data mining has been one of the most popular research topics in the information industry in recent years, and clustering analysis is a core technique of data mining. Clustering has been studied very deeply, and many different clustering methods suited to data mining have appeared; however, each method only suits particular problems and users. In order to use these methods better, this paper analyzes the clustering methods and representative clustering algorithms, puts forward the typical requirements of clustering, and compares the common clustering algorithms, so that a clustering method suited to a particular problem can easily be found.
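One of the representative partitioning algorithms such a comparison covers is k-means, which can be sketched in a few lines; the data, the value of k and the naive initialization are illustrative.

```python
# A sketch of k-means: alternate nearest-centre assignment and centre update.
def kmeans(points, k, iters=20):
    centers = list(points[:k])   # naive initialisation; k-means++ is more robust
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for p in points:
            # assign the point to its nearest centre (squared Euclidean)
            i = min(range(k),
                    key=lambda j: sum((a - b) ** 2 for a, b in zip(p, centers[j])))
            groups[i].append(p)
        # move each centre to the mean of its group
        centers = [tuple(sum(xs) / len(g) for xs in zip(*g)) if g else centers[i]
                   for i, g in enumerate(groups)]
    return centers

points = [(0.0, 0.1), (0.2, 0.0), (0.1, 0.2),
          (5.0, 5.1), (5.2, 4.9), (4.9, 5.0)]
print(sorted(kmeans(points, 2)))
```

The survey's point is visible even here: k-means needs k supplied up front and prefers compact spherical clusters, which is precisely the kind of per-method limitation that motivates comparing algorithms before choosing one.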

Journal Article•
TL;DR: An RBP neural network learning algorithm based on particle swarm optimizers (PSO), using grouped training with a composed optimizer, is proposed, and comparison shows that it is fast.
Abstract: An RBP neural network learning algorithm based on particle swarm optimizers (PSO), using grouped training with a composed optimizer, is proposed in this paper. The optimizer searches the multi-dimensional complex space for the best neural network weights. Finally, comparison with the least-squares method shows that the proposed algorithm is fast.
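The weight search the abstract describes can be sketched with a standard particle swarm optimizer; here it minimizes a simple quadratic "error surface" in place of a network's loss, and the swarm size, inertia and pull coefficients are illustrative defaults, not the paper's settings.

```python
# A sketch of particle swarm optimisation over a toy error surface.
import random

def pso(f, dim=3, particles=20, iters=200, seed=2):
    rng = random.Random(seed)
    pos = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(particles)]
    vel = [[0.0] * dim for _ in range(particles)]
    pbest = [p[:] for p in pos]              # each particle's best position
    gbest = min(pbest, key=f)[:]             # swarm-wide best position
    for _ in range(iters):
        for i in range(particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                # inertia + cognitive pull to pbest + social pull to gbest
                vel[i][d] = (0.7 * vel[i][d]
                             + 1.5 * r1 * (pbest[i][d] - pos[i][d])
                             + 1.5 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            if f(pos[i]) < f(pbest[i]):
                pbest[i] = pos[i][:]
                if f(pbest[i]) < f(gbest):
                    gbest = pbest[i][:]
    return gbest

sphere = lambda w: sum(x * x for x in w)     # stand-in for a network's error
best = pso(sphere)
print(sphere(best))
```

For network training, `dim` would be the number of weights and `f` the training error; the gradient-free, multi-point search is what gives PSO its robustness on the "multi-dimensional complex space" the abstract mentions.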

Journal Article•
TL;DR: KQML is not well suited to agents' task-level complex interaction and cannot build a contract net, so the set of KQML performatives is analyzed in the hope of supporting as many contract-net strategies as possible.
Abstract: KQML has been accepted as a de facto ACL standard and is widely used at present. This paper points out that KQML is not well suited to agents' task-level complex interaction and cannot build a contract net. To address this, the set of KQML performatives is analyzed in the hope of supporting as many contract-net strategies as possible. The definitions, semantics and contract-net applications of eleven extended performatives are provided.

Journal Article•
TL;DR: An approach is presented that formalizes temporal expressions, augments spatial terms with ontological information, and uses these data in detection.
Abstract: Topic detection and tracking is an event-based information organization task in which online news streams are monitored in order to spot new, unreported events and link documents with previously detected events. An approach is presented that formalizes temporal expressions, augments spatial terms with ontological information, and uses these data in detection. In addition, instead of using a single term vector as the document representation, the terms are split into four semantic classes (character, time, space and content), which are processed and weighted separately. The approach is validated by experiment.

Journal Article•
Tu Chao1•
TL;DR: Protection of historical cultural resources based on GIS uses GIS to query, search and display details of historical cultural resources and their establishment.
Abstract: Protection of historical cultural resources based on GIS means using GIS to query, search and display the details of historical cultural resources and their establishment; to supervise and plan historical cultural resources using the analytic functions of GIS; and to offer decision-making support for protection based on forecast models. The method can be applied in the protection of cultural resources, urban development planning, tourism resource development, landscape planning, etc.

Journal Article•
Gao Ling1•
TL;DR: Some measurement models and issues are discussed through the analysis of delay, packet loss, filtering out of "packet noise" and removal of clock skew.
Abstract: Discusses the common methods, measurement parameters and key technologies in network measurement. Several common key technologies, such as delay and packet-loss measurement, filtering out of "packet noise" and removal of clock skew, are introduced, and some measurement models and issues are discussed through their analysis. Because of the fast development of the network and the continual arrival of new applications, networks are becoming more and more complex. To deal with this complexity, new measurement models must be put forward; they are an important basis for the moderate, reliable and effective operation of networks.

Journal Article•
TL;DR: The basic concepts of sequential pattern mining are introduced, the main algorithms are described, and their performance is analyzed.
Abstract: An active research area in data mining is the discovery of sequential patterns, which finds all frequent subsequences in a sequence database. Recent studies can be divided into two major classes of sequential pattern mining methods: candidate generation-and-test approaches and pattern-growth methods. This paper first introduces the basic concepts of sequential pattern mining, then describes the main algorithms, and finally analyzes their performance.
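The candidate generation-and-test class can be sketched with a toy GSP-style pass: count frequent single items, generate ordered 2-item candidates from them, and test each against the minimum support. The database and support threshold are illustrative.

```python
# A toy candidate-generation-and-test pass over a sequence database.
from itertools import product

def contains(seq, pat):
    """True if pat occurs in seq as an order-preserving subsequence."""
    it = iter(seq)
    return all(item in it for item in pat)   # 'in' consumes the iterator

def frequent_2_patterns(db, min_support):
    items = {x for seq in db for x in seq}
    # frequent 1-sequences first: the generate step only combines these
    f1 = {i for i in items if sum(contains(s, (i,)) for s in db) >= min_support}
    cands = [(a, b) for a, b in product(f1, repeat=2)]
    # test step: keep candidates meeting the support threshold
    return {c for c in cands if sum(contains(s, c) for s in db) >= min_support}

db = [("a", "b", "c"), ("a", "c", "b"), ("a", "b"), ("b", "a")]
print(sorted(frequent_2_patterns(db, 3)))   # → [('a', 'b')]
```

Note the ordering matters: ("a", "b") is frequent while ("b", "a") is not, which is what distinguishes sequential patterns from plain itemsets; pattern-growth methods such as PrefixSpan reach the same answer without materializing the candidate set.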

Journal Article•
TL;DR: In order to improve the security, compatibility and practicability of application systems, a new model, T-RBAC (task-role based access control), combining the advantages of the RBAC and TBAC models, is discussed.
Abstract: Research on RBAC (role-based access control) and TBAC (task-based access control) has been greatly emphasized in recent years. This paper compares the characteristics and applicability of some recent models. To remedy the deficiencies of the existing models, and in order to improve the security, compatibility and practicability of application systems, a new model, T-RBAC (task-role based access control), combining the advantages of the RBAC and TBAC models, is discussed. The configuration and characteristics of the model are described, and its support for least privilege, separation of duties, data abstraction and role hierarchies is explained. An application of the model in a computer-supported cooperative system and the main goals of future research are presented.

Journal Article•
TL;DR: Based on the mathematical model and composition principle of L-systems, a computer simulation algorithm for plants with self-similar structure in three-dimensional space is put forward, and the fidelity of natural scene simulation is improved.
Abstract: Based on the mathematical model and composition principle of L-systems, a computer simulation algorithm for plants with self-similar structure in three-dimensional space is put forward, extending L-system plant simulation from two-dimensional to three-dimensional space. The product of the direction cosines along the three space coordinate axes X, Y and Z with three given rotation matrices about the respective axes is defined as the plant's rotation parameter, so that the drawn plant acquires an obvious three-dimensional appearance. On this basis the algorithm is implemented with VC++ 6.0 combined with the powerful drawing functions of the OpenGL library. By increasing the number of plants drawn and changing the observer's viewpoint, the simulation of a forest is realized. Finally, a random number generator is introduced to vary the row and column spacing randomly and to invoke different character clusters of the production rules, so that the forest contains trees of different kinds, colors and sizes. The fidelity of natural scene simulation is thereby improved.
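The string-rewriting core of an L-system plant is easy to sketch; the branching production below is a common textbook example, not necessarily the paper's exact grammar, and the bracket symbols are the push/pop turtle-state markers a 2-D or 3-D renderer would interpret.

```python
# A sketch of L-system string rewriting for a branching plant.
def rewrite(axiom, rules, iterations):
    s = axiom
    for _ in range(iterations):
        # replace every symbol that has a production rule, keep the rest
        s = "".join(rules.get(ch, ch) for ch in s)
    return s

rules = {"F": "F[+F]F[-F]F"}   # classic branching production
s = rewrite("F", rules, 2)
print(len(s), s[:20])
```

Each pass grows the string multiplicatively (here 1, 11, then 61 symbols), which is where the self-similar structure comes from; the paper's 3-D extension changes how a renderer interprets `+` and `-` (rotations about the X, Y and Z axes) rather than the rewriting itself.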

Journal Article•
TL;DR: The main idea is to apply data mining methods to learn rules that capture normal and intrusive activities from pre-processed audit data containing network connection information; these rules can then be used to detect intrusion behavior.
Abstract: Along with the rapid development of the Internet, many new network attacks emerge unceasingly. Traditional intrusion detection systems (IDS) based on expert systems depend on hand-crafted rules and experience, and can no longer satisfy current application requirements in the face of new forms of attack and system upgrades. It is therefore necessary to find a method that can extract intrusion patterns from massive network data automatically. The main idea is to apply data mining methods to learn rules that capture normal and intrusive activities from pre-processed audit data containing network connection information; these rules can then be used to detect intrusion behavior. In this paper, data mining technology is applied to intrusion detection and several data mining algorithms are discussed. A model of a data-mining-based intrusion detection system is then proposed. Experiments prove that, compared with traditional systems, this model has clear advantages in adaptability and extensibility.

Journal Article•
TL;DR: This paper proposes an image segmentation method that solves the over-segmentation problem by combining wavelets and watersheds with labeled images and the inverse wavelet transform.
Abstract: Watershed-based image segmentation has the drawback of over-segmentation. Combining wavelets and watersheds, this paper proposes an image segmentation method that solves this problem. The wavelet transform is first used to create multi-resolution images. Then the marker-controlled watershed segmentation algorithm is applied to segment the lowest-resolution image and obtain the initial watershed segmentation. Finally, the labeled images and the inverse wavelet transform are used together to obtain the full-resolution segmentation. Experimental results show that this method is a solution to the over-segmentation problem.

Journal Article•
TL;DR: Content-based image retrieval is achieved using a constructed similarity matrix; retrieval efficiency can be raised by adjusting sub-block weights, and better results are obtained thanks to the dynamic splitting, spatial information and weighting.
Abstract: In this paper, the image color distribution is obtained based on dynamic sub-block splitting. The image is split into sub-blocks according to the size of the object in the query image. A non-uniform quantization of the HSV color space, in accordance with human visual perception, is adopted. A combined color feature vector consists of the dominant colors and their percentages in each sub-block, and a similarity measure is defined for it. Content-based image retrieval is achieved using a constructed similarity matrix, and retrieval efficiency can be raised by adjusting the sub-block weights. Compared with the traditional global histogram method, better retrieval results are obtained thanks to the dynamic splitting, spatial information and weighting.
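The per-block, weighted matching can be sketched as follows: quantize each pixel's hue coarsely, build one histogram per sub-block, and combine per-block histogram intersections with block weights. The quantization level, block layout and weights are illustrative, not the paper's non-uniform HSV scheme.

```python
# A sketch of weighted sub-block colour matching in (coarse) HSV.
import colorsys

def hue_hist(pixels, bins=8):
    """Normalised histogram of coarsely quantised hues (illustrative)."""
    h = [0] * bins
    for r, g, b in pixels:
        hue, _, _ = colorsys.rgb_to_hsv(r / 255, g / 255, b / 255)
        h[min(int(hue * bins), bins - 1)] += 1
    n = len(pixels)
    return [c / n for c in h]

def intersection(h1, h2):
    """Histogram intersection: 1.0 for identical distributions."""
    return sum(min(a, b) for a, b in zip(h1, h2))

def block_similarity(blocks1, blocks2, weights):
    return sum(w * intersection(hue_hist(b1), hue_hist(b2))
               for w, b1, b2 in zip(weights, blocks1, blocks2))

red, blue = (255, 0, 0), (0, 0, 255)
img = [[red, red], [blue, blue]]     # two "sub-blocks" of pixels
weights = [0.6, 0.4]                 # e.g. weight central blocks higher
print(block_similarity(img, img, weights))                       # → 1.0
print(block_similarity(img, [[blue, blue], [red, red]], weights))  # → 0.0
```

The second comparison is the point of sub-block splitting: a global histogram would call the swapped image identical, while the block-wise measure sees that the colors sit in the wrong places.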

Journal Article•
Xian Xue-feng1•
TL;DR: Based on a study of the principle and efficiency of the Apriori algorithm, its defects are pointed out and a new algorithm is presented in which the efficiency of association rule mining is improved by a new method of producing the candidate set.
Abstract: Association rules are an important issue in data mining. Based on a study of the principle and efficiency of the Apriori algorithm, its defects are pointed out and a new algorithm is presented in which the efficiency of association rule mining is improved by employing a new method of producing the candidate set.

Journal Article•
TL;DR: This paper analyzes the access-control requirements of P2P environments and proposes a trust-domain-based access control framework which, unlike traditional frameworks, defines a P2P user's trust domain through reputation management and defines the access-control strategy per trust domain.
Abstract: Peer-to-peer (P2P) has become popular as a new technology for resource sharing and coordination. However, P2P environments make the task of controlling access for network security more difficult, and traditional access control methods cannot handle it. This paper analyzes the access-control requirements in such environments and proposes a trust-domain-based access control framework for P2P. Unlike traditional access-control frameworks, it defines a P2P user's trust domain through reputation management and defines the access-control strategy per trust domain. The proposed scheme is realistic and feasible for P2P applications.

Journal Article•
TL;DR: The requirements of credit card companies for data mining and neural network technology applied to personal credit evaluation are described, and a vicinage-extended clustering algorithm is given that is better suited to personal credit evaluation than other methods.
Abstract: For the purpose of processing personal credit evaluation timely and correctly and increasing the decision rate, this paper describes the requirements of credit card companies for data mining and neural network technology applied to personal credit evaluation. Several personal credit evaluation models, e.g. statistical models and classification-clustering models, are contrasted and analyzed, and their advantages and disadvantages are demonstrated. A decision-tree and neural-network personal credit evaluation model is constructed. Finally, a vicinage-extended clustering algorithm is given; the algorithm does not need the number of clusters in advance and can perform unsupervised learning, making it better suited to personal credit evaluation than other methods.
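The property highlighted above, clustering without fixing the number of clusters in advance, can be illustrated with a simple threshold-based "leader" pass; the paper's vicinage-extended algorithm is more elaborate, and the radius and points below are made up.

```python
# A sketch of threshold clustering: clusters emerge without a preset count.
def leader_cluster(points, radius):
    leaders = []   # one representative point per discovered cluster
    labels = []
    for p in points:
        for i, q in enumerate(leaders):
            if sum((a - b) ** 2 for a, b in zip(p, q)) ** 0.5 <= radius:
                labels.append(i)       # joins an existing cluster
                break
        else:
            leaders.append(p)          # p starts a new cluster
            labels.append(len(leaders) - 1)
    return leaders, labels

points = [(0.0, 0.0), (0.1, 0.1), (5.0, 5.0), (5.1, 4.9), (9.0, 0.0)]
leaders, labels = leader_cluster(points, radius=1.0)
print(len(leaders), labels)   # → 3 [0, 0, 1, 1, 2]
```

Here the data itself determines that there are three groups, which is the unsupervised behavior the abstract wants for applicant segmentation: unusual applicants simply open new clusters instead of being forced into a preset partition.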

Journal Article•
Yang Xue1•
TL;DR: The proposed software automation test framework, called TAF, is highly independent of software systems, products and data, supports data-driven automation methodology, and has been integrated into a real-life project with good performance.
Abstract: This paper describes the strategic targets of a software automation test framework and analyzes the critical factors for successfully integrating such a framework into a project. After a comparative study of five existing products, a new software automation test framework called TAF is proposed. This framework is highly independent of software systems, products and data, and supports data-driven automation methodology. It has been integrated into a real-life project and achieves good performance.