
Showing papers in "Journal of Chinese Computer Systems in 2009"


Journal Article
TL;DR: This paper proposes an approach for standardization of facial image quality and develops facial-symmetry-based methods for its assessment by measuring facial asymmetries caused by non-frontal lighting and improper facial pose.
Abstract: Performance of biometric systems is dependent on quality of acquired biometric samples. Poor sample quality is a main reason for matching errors in biometric systems and may be the main weakness of some implementations. This paper proposes an approach for standardization of facial image quality, and develops facial symmetry based methods for the assessment of it by measuring facial asymmetries caused by non-frontal lighting and improper facial pose. Experimental results are provided to illustrate the concepts, definitions and effectiveness.
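The abstract does not give the exact measure, but a minimal sketch of one plausible symmetry-based quality score, comparing the left half of a roughly centered face against the mirrored right half, could look like the following (the function name, normalization, and thresholding are illustrative assumptions, not the paper's actual method):

```python
import numpy as np

def symmetry_quality(face: np.ndarray) -> float:
    """Illustrative facial-symmetry quality score in [0, 1].

    `face` is a grayscale face image (H x W) assumed to be roughly frontal
    and centered. Non-frontal lighting or pose rotation increases the
    left/right intensity difference and therefore lowers the score.
    """
    h, w = face.shape
    half = w // 2
    left = face[:, :half].astype(float)
    right = np.fliplr(face[:, w - half:]).astype(float)  # mirror the right half
    # Mean absolute intensity difference between the halves, scaled so the
    # score stays in [0, 1] for 8-bit images.
    asymmetry = np.mean(np.abs(left - right)) / 255.0
    return 1.0 - asymmetry

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    face = rng.integers(0, 256, size=(64, 64))
    print(round(symmetry_quality(face), 3))
```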

28 citations


Journal Article
TL;DR: An overview of the field of collaborative filtering recommender systems is presented, and the main techniques applied in the current generation of collaborative filtering algorithms are described.
Abstract: Recommender systems in E-commerce analyze the preferences of users and present recommendations, offering a personalized purchase service. This paper presents an overview of the field of collaborative filtering recommender systems and describes the main techniques applied in the current generation of collaborative filtering algorithms. This paper also describes various limitations of current recommendation methods and discusses possible ways to improve recommendation capabilities and make recommender systems applicable to an even larger range of applications.
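As a concrete illustration of the collaborative filtering idea surveyed here, a minimal user-based scheme with cosine similarity might look like the sketch below (the rating data and function names are hypothetical; real systems add rating normalization, sparsity handling, and scalability measures):

```python
import math

def cosine(a, b):
    """Cosine similarity between two users' rating dictionaries."""
    common = set(a) & set(b)
    if not common:
        return 0.0
    num = sum(a[i] * b[i] for i in common)
    den = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return num / den if den else 0.0

def predict(ratings, user, item, k=2):
    """Predict `user`'s rating of `item` from the k most similar users who rated it."""
    neighbors = [(cosine(ratings[user], ratings[u]), ratings[u][item])
                 for u in ratings if u != user and item in ratings[u]]
    neighbors.sort(reverse=True)
    top = neighbors[:k]
    sim_sum = sum(s for s, _ in top)
    return sum(s * r for s, r in top) / sim_sum if sim_sum else 0.0

ratings = {                       # hypothetical user-item rating matrix
    "alice": {"book": 5, "film": 3},
    "bob":   {"book": 4, "film": 3, "game": 2},
    "carol": {"book": 5, "game": 1},
}
print(predict(ratings, "alice", "game"))
```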

22 citations


Journal Article
TL;DR: The presented algorithm, named SDFAR, is proved to be complete; a complexity analysis and an experimental analysis of the algorithm are also given.
Abstract: Knowledge reduction is one of the most important problems in rough set theory. In this paper, a simple discernibility function is defined from the discernibility matrix, and two kinds of operations and related concepts such as minimal coverage are defined, with which the problem of finding a reduction of a decision system is turned into the problem of finding a minimal coverage of the simple discernibility function. The significance of attributes in a decision table is defined for minimal reduction based on the simple discernibility function and is used as heuristic information to design a novel knowledge reduction algorithm together with the theory of minimal coverage. The presented algorithm, named SDFAR, is proved to be complete. This paper also gives the algorithm's complexity analysis and experimental analysis. The proposed algorithm is relatively effective for finding a minimal reduct.

12 citations


Journal Article
TL;DR: A simplified artificial fish swarm algorithm (SAFSA) is proposed to address defects of AFSA such as low optimization precision and long running time, and it runs faster than the original algorithm.
Abstract: This paper simplifies AFSA, presents the evolution equation of AFSA, and then proposes a simplified artificial fish swarm algorithm (SAFSA) aimed at some defects of AFSA, such as low optimization precision and long running time. Based on the preying behavior, the center position, and the optimal position of the fish swarm, SAFSA adjusts the next position within the same iteration so as to ensure that the algorithm moves toward the global optimum; at the same time, because the artificial fish swims stochastically while preying, the algorithm can escape local extrema, which enhances its global searching ability. The simulation results show that the optimization effect of the simplified algorithm is obvious and its running speed is higher than before.
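The abstract describes the update only qualitatively; a rough sketch of what such a combined update could look like, with made-up weights and a toy objective (not the paper's actual SAFSA equations), is given below:

```python
import random

def sphere(x):
    """Toy objective: minimize the sum of squares."""
    return sum(v * v for v in x)

def safsa_step(fish, best, visual=0.5, step=0.3):
    """One illustrative position update combining prey, swarm center and best fish."""
    dim = len(fish[0])
    center = [sum(f[d] for f in fish) / len(fish) for d in range(dim)]
    new_fish = []
    for f in fish:
        # Random "prey" trial point within the visual range; keep the better of the two.
        prey = [v + random.uniform(-visual, visual) for v in f]
        if sphere(prey) > sphere(f):
            prey = f
        # Move toward a blend of the prey point, the swarm center, and the global best.
        new = [v + step * ((p - v) + (c - v) + (b - v)) / 3
               for v, p, c, b in zip(f, prey, center, best)]
        new_fish.append(new)
    best = min(new_fish + [best], key=sphere)
    return new_fish, best

random.seed(1)
fish = [[random.uniform(-5, 5) for _ in range(2)] for _ in range(10)]
best = min(fish, key=sphere)
for _ in range(50):
    fish, best = safsa_step(fish, best)
print(round(sphere(best), 4))
```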

12 citations


Journal Article
TL;DR: Simulations show that the proposed algorithm can prolong the network lifetime significantly compared with the well-known protocol LEACH, and that the two-step cluster-head selection mechanism balances the distribution of the cluster-heads.
Abstract: This paper analyses the problems of LEACH when the network covers a large area: with one-hop routing, the clusters far from the BS die earlier because of the large energy dissipation of a single long-range transmission, while with multi-hop routing, the clusters closer to the BS die earlier because they relay more data packets. A new routing protocol named LEACH-L is proposed. LEACH-L restricts a minimum hop distance for multi-hop routing, which avoids increasing the energy dissipation of a multi-hop communication. The experimental results show that LEACH-L can balance the energy dissipation of sensors in different areas and prolong the lifetime of the sensor network as the network scope grows larger.

12 citations


Journal Article
Zhao Wen-yun
TL;DR: An automatic refactoring method for detected cloned code based on abstract syntax trees and static analysis that can assist developers in automatically refactoring cloned code both accurately and effectively.
Abstract: Code clones in a single software system or across several similar systems make software maintenance very difficult. This paper offers an automatic refactoring method for detected cloned code based on abstract syntax trees and static analysis. First, the method builds abstract syntax trees for the cloned code fragments separately. Then the differences between statements are used to establish the relationships between flow control statements in the abstract syntax trees. Based on these steps, the method analyses the differences between flow control statements and between simple statement blocks, and finally combines the cloned code by extracting the variation points in the source code. We have developed a prototype tool which supports refactoring of cloned Java code, and experiments on automatic refactoring were carried out on JDK 1.5 and a business system. The initial results show that this method can assist developers in automatically refactoring cloned code both accurately and effectively.

11 citations


Journal Article
TL;DR: A new RSSI-verify based localization algorithm for wireless sensor networks which takes the known distances between the fixed nodes and their corresponding RSSI into account and amends the weights of the fixed nodes, thus improving the localization accuracy of the mobile nodes.
Abstract: Since the positions of events or nodes are the most important part of the monitoring information from sensor nodes, how to obtain accurate localization has become a focus of attention. Based on the Weighted Centroid method, this paper presents a new RSSI-verify based localization algorithm for wireless sensor networks which takes the known distances between the fixed nodes and their corresponding RSSI into account and amends the weights of the fixed nodes, thus improving the localization accuracy of the mobile nodes. The experimental results show that the proposed method outperforms the Weighted Centroid method under the same environment; in particular, it decreases the average localization error by about 25%.
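The abstract builds on the weighted-centroid idea; a minimal sketch of that baseline, with the weights derived from RSSI-estimated distances, might look like the code below (the beacon positions and path-loss constants are made up for illustration, and the paper's RSSI-verification step that corrects these weights is not shown):

```python
def rssi_to_distance(rssi, p0=-40.0, n=2.5):
    """Convert an RSSI value (dBm) to a distance estimate via a log-distance model."""
    return 10 ** ((p0 - rssi) / (10 * n))

def weighted_centroid(beacons):
    """Estimate a node position from (x, y, rssi) beacon readings.

    Closer beacons (smaller estimated distance) receive larger weights.
    """
    weights = [(x, y, 1.0 / rssi_to_distance(rssi)) for x, y, rssi in beacons]
    total = sum(w for _, _, w in weights)
    x_est = sum(x * w for x, _, w in weights) / total
    y_est = sum(y * w for _, y, w in weights) / total
    return x_est, y_est

# Hypothetical readings from three fixed nodes at known positions.
beacons = [(0.0, 0.0, -55.0), (10.0, 0.0, -65.0), (0.0, 10.0, -60.0)]
print(weighted_centroid(beacons))
```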

9 citations


Journal Article
TL;DR: A novel localization algorithm for wireless sensor networks establishes an interval mapping between the received signal power and the transmission range on the basis of the log-normal shadowing model and the "3σ" principle of the normal distribution.
Abstract: A novel localization algorithm for wireless sensor networks is presented. It establishes an interval mapping between the received signal power and the transmission range on the basis of the log-normal shadowing model and the "3σ" principle of the normal distribution. With this interval mapping technique, an unknown node obtains energy intervals from the received signal strengths (RSS) and then determines which distance intervals of the transmission range it lies in with respect to the beacons. By transforming this interval information into distance constraints, the algorithm draws arcs and thus forms several annulus-segment areas. Their intersection, named "the target annulus-segment region", is the minimum area within which the unknown node is confined, and its center is used to estimate the location of the unknown node. Compared with other RF-based localization algorithms such as Centroid, Bounding-Box, APIT and MLE, our algorithm performs better when the side length of the node distribution area is close to or shorter than the transmission range of the sensor nodes.
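As a worked illustration of the interval mapping described here, the log-normal shadowing model PL(d) = PL(d0) + 10·n·log10(d/d0) + Xσ combined with the 3σ rule gives a distance interval for each received power. The path-loss exponent, reference power, and σ below are placeholder values, not the paper's:

```python
def distance_interval(rss, p_tx=0.0, pl_d0=40.0, d0=1.0, n=3.0, sigma=4.0):
    """Map a received signal strength (dBm) to a [d_min, d_max] distance interval.

    Path loss model: PL(d) = PL(d0) + 10*n*log10(d/d0) + X_sigma, with the
    shadowing term X_sigma confined to +/- 3*sigma (the "3-sigma" rule).
    """
    path_loss = p_tx - rss                      # measured path loss in dB
    lo = path_loss - 3 * sigma                  # smallest possible deterministic loss
    hi = path_loss + 3 * sigma                  # largest possible deterministic loss
    d_min = d0 * 10 ** ((lo - pl_d0) / (10 * n))
    d_max = d0 * 10 ** ((hi - pl_d0) / (10 * n))
    return d_min, d_max

d_min, d_max = distance_interval(rss=-70.0)
print(f"sender is between {d_min:.1f} m and {d_max:.1f} m away")
```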

6 citations


Journal Article
TL;DR: The limitation of Xu Zhang Yan's algorithm is analyzed and an improved quick attribute reduction algorithm is proposed, which optimizes the computation of the equivalence partition and the positive region and adds the most important condition attribute to the reduction set.
Abstract: Attribute reduction is an important operation on decision table information systems. Currently the most efficient algorithm is Xu Zhang Yan's RedueBaseSig algorithm, whose time complexity is max{O(|C||U|), O(|C|²|U|)}. However, in some cases the algorithm does not obtain a reduction. In this paper, we analyze the limitation of Xu Zhang Yan's algorithm and propose an improved quick attribute reduction algorithm. The algorithm optimizes the computation of the equivalence partition and the positive region, uses the core attributes as the initial reduction set, and adds the most important condition attribute to the reduction set. The time complexity of the improved algorithm is O(|C|²|U|) in the worst case. The experimental results show that the algorithm is correct and efficient.
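A minimal sketch of the positive-region heuristic this family of algorithms relies on, using a toy decision table and plain dictionary partitioning, is shown below; the specific optimizations of the improved algorithm are not reproduced:

```python
def partition(table, attrs):
    """Group object indices into equivalence classes by the values of `attrs`."""
    classes = {}
    for i, row in enumerate(table):
        key = tuple(row[a] for a in attrs)
        classes.setdefault(key, []).append(i)
    return list(classes.values())

def positive_region(table, attrs, decision):
    """Objects whose equivalence class under `attrs` is consistent on the decision."""
    pos = set()
    for block in partition(table, attrs):
        if len({table[i][decision] for i in block}) == 1:
            pos.update(block)
    return pos

def greedy_reduct(table, cond_attrs, decision):
    """Repeatedly add the attribute that grows the positive region most, until it is full."""
    full = positive_region(table, cond_attrs, decision)
    reduct = []
    while positive_region(table, reduct, decision) != full:
        best = max((a for a in cond_attrs if a not in reduct),
                   key=lambda a: len(positive_region(table, reduct + [a], decision)))
        reduct.append(best)
    return reduct

# Toy decision table: columns 0-2 are condition attributes, column 3 is the decision.
table = [
    (1, 0, 1, "yes"),
    (1, 1, 1, "yes"),
    (0, 0, 1, "no"),
    (0, 1, 0, "no"),
]
print(greedy_reduct(table, [0, 1, 2], decision=3))
```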

6 citations


Journal Article
TL;DR: The effect of the critical flight resources, including aircraft, cockpit crews and cabin crews, on flight delay propagation is analyzed, and suggestions for preventing and dealing with flight delays are offered.
Abstract: A flight delay might cause downstream flight delays because one airplane and one crew fly several linked flights in a day. In this paper, we analyze the effect of the critical flight resources, including aircraft, cockpit crews and cabin crews, on flight delay propagation. First, a DAG (directed acyclic graph) of all the downstream flights of the initially delayed flight (the root vertex) is created based on the flight plan and the crew plan. Then an algorithm for calculating the property values of each vertex is presented, and the flight delay propagation DAG is constructed by rendering the vertices of the initial DAG, so that the delay propagation information of the downstream flights is obtained. Together with indices of delayed flights and delay time, these provide an effective method to quantitatively analyze flight delays. Finally, suggestions for preventing and dealing with flight delays are offered based on simulation analysis.
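The propagation step can be pictured with a small sketch: the initial delay is pushed through the resource DAG in topological order, each downstream flight inheriting the largest upstream delay minus any schedule slack. The flight data, slack model, and numbers here are invented for illustration:

```python
from collections import defaultdict, deque

def propagate_delays(edges, slack, initial_delay, root):
    """Propagate a delay through a flight-dependency DAG.

    edges[u] lists the downstream flights that reuse flight u's aircraft or crew;
    slack[(u, v)] is the buffer (minutes) between u's arrival and v's departure.
    """
    indeg = defaultdict(int)
    for u in edges:
        for v in edges[u]:
            indeg[v] += 1
    delay = defaultdict(int)
    delay[root] = initial_delay
    queue = deque([root])
    while queue:
        u = queue.popleft()
        for v in edges.get(u, []):
            # The delay passed on is whatever remains after absorbing the slack.
            delay[v] = max(delay[v], delay[u] - slack[(u, v)], 0)
            indeg[v] -= 1
            if indeg[v] == 0:
                queue.append(v)
    return dict(delay)

edges = {"F1": ["F2", "F3"], "F2": ["F4"], "F3": ["F4"]}
slack = {("F1", "F2"): 20, ("F1", "F3"): 45, ("F2", "F4"): 10, ("F3", "F4"): 10}
print(propagate_delays(edges, slack, initial_delay=60, root="F1"))
```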

5 citations


Journal Article
TL;DR: Results showed that the proposed algorithm with a proper sub-population size can effectively decrease the number of users' comparisons and the algorithm's convergence time, and therefore reduce user fatigue.
Abstract: The Interactive Genetic Algorithm based on paired comparison (PC-IGA) can reduce users' mental burden by letting users evaluate individuals by comparing two of them and selecting the better one instead of using traditional grading methods. But too many comparisons in PC-IGA aggravate users' physical fatigue. To solve this problem, a new user evaluation method named Tournament Selection is proposed, and the key technologies and implementation steps of the Interactive Genetic Algorithm based on Tournament Selection (TS-IGA) are given. Then TS-IGA is applied in a dress color optimization system to study how the population size and sub-population size influence the performance of the algorithm. Finally, comparisons between the experimental results of TS-IGA and PC-IGA show that the proposed algorithm with a proper sub-population size can effectively decrease the number of users' comparisons and the algorithm's convergence time, and therefore reduce user fatigue.
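A bare-bones sketch of the tournament-style evaluation idea is given below; here the "user" is simulated by an automatic comparison function so the snippet can run, whereas in TS-IGA the better individual would be chosen by a human:

```python
import random

def tournament_best(sub_population, prefer):
    """Pick a winner from a sub-population by repeated paired comparisons.

    `prefer(a, b)` stands in for the user's paired comparison and returns the
    preferred individual; the number of comparisons is len(sub_population) - 1.
    """
    winner = sub_population[0]
    comparisons = 0
    for candidate in sub_population[1:]:
        winner = prefer(winner, candidate)
        comparisons += 1
    return winner, comparisons

random.seed(0)
population = [random.random() for _ in range(16)]       # stand-in individuals
sub_population = random.sample(population, 4)           # only a subset is shown to the user
best, n = tournament_best(sub_population, prefer=max)   # `max` simulates the user's choice
print(best, "chosen after", n, "comparisons instead of", len(population) - 1)
```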

Journal Article
TL;DR: Fuzzy set theory is introduced to evaluate the autonomy levels of unmanned systems, based on an analysis of the feedback values produced by the system after executing a series of behaviors.
Abstract: In order to evaluate the autonomy levels of unmanned systems dealing with a mission, a model for evaluating the autonomy levels is constructed. First, a set of performance indices for evaluating the autonomy levels is set up, including the variation of the environment, the stability of the system states, and the degree of interaction between the system and operators, based on an analysis of the feedback values produced by the system after executing a series of behaviors. Then, fuzzy set theory is introduced to evaluate the autonomy levels of unmanned systems. Finally, the method is verified by an example.

Journal Article
TL;DR: The extensive simulation shows that the proposed algorithm is self-adaptive and can achieve high localization accuracy.
Abstract: Applying classical graph drawing algorithms to node localization in wireless sensor networks is a novel idea. This paper proposes a novel node localization algorithm for wireless sensor networks. It includes two phases. During the first phase, a localization algorithm similar to the Kamada-Kawai graph drawing algorithm is used to achieve a layout close to the original network layout. During the second phase, a mass-spring graph drawing algorithm is used to refine the layout of the first phase. Extensive simulation shows that the proposed algorithm is self-adaptive and can achieve high localization accuracy.
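The second, mass-spring phase can be sketched as follows: each node is nudged along the difference between its measured and currently estimated distances to its neighbors. The coordinates and distances below are synthetic, and the paper's first, Kamada-Kawai-like phase is not shown:

```python
import math
import random

def mass_spring_refine(pos, measured, steps=200, rate=0.1):
    """Refine node positions so estimated distances match measured distances.

    pos: {node: [x, y]} initial layout; measured: {(u, v): distance}.
    Each edge acts as a spring whose rest length is the measured distance.
    """
    for _ in range(steps):
        for (u, v), d_meas in measured.items():
            dx = pos[v][0] - pos[u][0]
            dy = pos[v][1] - pos[u][1]
            d_est = math.hypot(dx, dy) or 1e-9
            # If the current estimate is longer than the measurement the nodes
            # pull together; if shorter, they push apart.
            f = rate * (d_est - d_meas) / d_est
            pos[u][0] += f * dx; pos[u][1] += f * dy
            pos[v][0] -= f * dx; pos[v][1] -= f * dy
    return pos

random.seed(2)
pos = {n: [random.random(), random.random()] for n in "ABC"}
measured = {("A", "B"): 1.0, ("B", "C"): 1.0, ("A", "C"): 1.4}
pos = mass_spring_refine(pos, measured)
print({n: [round(c, 2) for c in p] for n, p in pos.items()})
```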

Journal Article
TL;DR: A new S-curve acceleration and deceleration algorithm is proposed that combines the cubic polynomial model with the moving average filtering technique; theoretical analysis shows that the method ensures the continuity of velocity, acceleration, and jerk.
Abstract: Aiming at the vibration of CNC machine tools caused by unsmooth velocity and jerk during the starting/stopping stage, a new S-curve acceleration and deceleration algorithm is proposed that combines the cubic polynomial model with the moving average filtering technique. Theoretical analysis of the velocity, acceleration, and jerk models shows that the method ensures the continuity of velocity, acceleration, and jerk. Practical application in a CNC system shows the effectiveness of the method.
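To illustrate the smoothing idea (not the paper's exact cubic polynomial model), the sketch below builds a simple trapezoidal velocity profile and passes it through a moving-average filter, which is what makes the acceleration, and hence the jerk, change gradually instead of stepping:

```python
def trapezoid_velocity(v_max=100.0, accel_steps=20, cruise_steps=40):
    """A velocity profile with piecewise-constant acceleration (discontinuous jerk)."""
    up = [v_max * i / accel_steps for i in range(accel_steps)]
    cruise = [v_max] * cruise_steps
    down = [v_max * (accel_steps - i) / accel_steps for i in range(accel_steps + 1)]
    return up + cruise + down

def moving_average(samples, window=9):
    """Smooth a sampled profile with a simple moving-average filter."""
    half = window // 2
    padded = [samples[0]] * half + samples + [samples[-1]] * half
    return [sum(padded[i:i + window]) / window for i in range(len(samples))]

velocity = moving_average(trapezoid_velocity())
accel = [b - a for a, b in zip(velocity, velocity[1:])]   # first difference
jerk = [b - a for a, b in zip(accel, accel[1:])]          # second difference
print("max |jerk| after smoothing:", round(max(abs(j) for j in jerk), 3))
```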

Journal Article
TL;DR: The experimental results prove that the culture algorithm can effectively improve the efficiency of selecting diverse individual neural networks to construct the ensemble.
Abstract: To improve the generalization ability, a selective constructing approach to neural network ensembles is proposed, in which the culture algorithm is used to select part of the trained individual networks to be ensembled. This method puts multilayer belief spaces into the framework of the culture algorithm, which can fully utilize the outstanding characteristics of the individuals, maintain the diversity of the neural networks, and decrease the effect of collinearity and sample noise. The experimental results prove that the culture algorithm can effectively improve the efficiency of selecting diverse individual neural networks to construct the ensemble.

Journal Article
TL;DR: Simulation experiments show that DCLD can reach high detection accuracy and defends against the attacks near the sources, consequently mitigating the impacts of both the attacks and the defense mechanism on legitimate traffic.
Abstract: Low-rate Denial-of-Service, very different from traditional flooding DoS attacks, is a new kind of attack. A distributed collaborative detection method, DCLD (Distributed Collaborative LDoS Detection), which is deployed in the intermediate network to defend against this kind of attack and its distributed forms, is presented. Attack traffic features are extracted using multi-scale wavelet analysis. Then the feature evidences are combined to make an integrated judgement based on D-S evidence theory. A distributed collaborative algorithm is also proposed, through which detection nodes exchange their information to realize collaborative detection. Simulation experiments show that DCLD can reach high detection accuracy and defends against the attacks near the sources, consequently mitigating the impacts of both the attacks and the defense mechanism on legitimate traffic.

Journal Article
TL;DR: The differential power analysis attack on the SMS4 algorithm is discussed, and an attack method on every byte of the round keys is presented that can obtain the round keys of the last four rounds of SMS4, from which the 128-bit encryption key can be recovered.
Abstract: SMS4 is a block cipher used in WLAN products. In this paper, the differential power analysis attack on the SMS4 algorithm is discussed. Based on an analysis of the algorithm structure and the principles of differential power analysis, an attack method on every byte of the round keys is presented. Through this attack, the round keys of the last four rounds of SMS4 can be obtained, and then the 128-bit encryption key can be recovered. The results of simulation experiments indicate that this attack method is effective and practical on the SMS4 round operation. The SMS4 algorithm is vulnerable to differential power analysis attacks, and cryptographic devices should be protected against this kind of attack.
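The general structure of such a DPA attack (guess one key byte, partition the power traces by a predicted intermediate bit, and keep the guess with the largest difference of means) can be sketched as below. The S-box, leakage model, and traces here are entirely synthetic placeholders, not SMS4's actual round function:

```python
import random

random.seed(0)
SBOX = list(range(256))
random.shuffle(SBOX)                      # placeholder S-box, NOT the SMS4 S-box

def leak(plaintext_byte, key_byte):
    """Toy leakage: the power sample correlates with one bit of the S-box output."""
    bit = SBOX[plaintext_byte ^ key_byte] & 1
    return bit + random.gauss(0, 0.5)     # signal plus measurement noise

SECRET = 0xA7
plaintexts = [random.randrange(256) for _ in range(2000)]
traces = [leak(p, SECRET) for p in plaintexts]

def dpa_guess(plaintexts, traces):
    """Return the key-byte guess with the largest difference of means."""
    best_guess, best_diff = None, -1.0
    for guess in range(256):
        ones = [t for p, t in zip(plaintexts, traces) if SBOX[p ^ guess] & 1]
        zeros = [t for p, t in zip(plaintexts, traces) if not SBOX[p ^ guess] & 1]
        diff = abs(sum(ones) / len(ones) - sum(zeros) / len(zeros))
        if diff > best_diff:
            best_guess, best_diff = guess, diff
    return best_guess

print(hex(dpa_guess(plaintexts, traces)), "vs secret", hex(SECRET))
```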

Journal Article
TL;DR: It is demonstrated by the performance evaluation that the policy cache technology improves the performance of migration policy execution by an order of magnitude, and a new implementation of file reparsing is presented to resolve the problem of users directly accessing the file system and to consolidate the data in different hierarchies.
Abstract: Information lifecycle management is a sustainable storage policy that balances storage cost and management according to the characteristics of the data. This paper presents the design and implementation of the Tsinghua ILM system based on hierarchical storage in SAN and NAS environments. A new implementation of file reparsing is presented to resolve the problem of users directly accessing the file system and to consolidate the data in different hierarchies; the performance evaluation shows that it brings negligible overhead to the system. The optimization of the management of application-level metadata and the policy cache technology is presented at the end of the paper. The performance evaluation demonstrates that the policy cache technology improves the performance of migration policy execution by an order of magnitude. Besides, a Policy Metadata Container is designed to improve the poor performance of the initialization process of the policy cache; according to the performance evaluation, the time spent on initialization decreases by a factor of 16 to 20.


Journal Article
TL;DR: Compared with the existing corresponding algorithm, the proposed fast recursive algorithm for maximum entropic correlation threshold selection based on the gray level (or average gray level)-gradient two-dimensional histogram achieves better segmentation quality, with uniform regions, accurate borders, and robust noise resistance.
Abstract: In view of the obvious shortcomings of the commonly used regional division of the gray level-average gray level two-dimensional histogram, an improved maximum entropic correlation threshold selection method based on the gray level (or average gray level)-gradient two-dimensional histogram is proposed. The formulas for the corresponding fast recursive algorithms are deduced. The experimental results are presented, analyzed, and compared. The results show that, compared with the existing corresponding algorithm, the proposed fast recursive algorithm for maximum entropic correlation threshold selection based on the gray level (or average gray level)-gradient two-dimensional histogram achieves better segmentation quality, with more uniform regions, more accurate borders, and more robust noise resistance. The running time of the proposed algorithm is reduced by about 20%.

Journal Article
TL;DR: Experimental results show that the proposed watermark extraction scheme is robust and secure against a wide range of image processing operations such as noise addition, filtering, and lossy compression.
Abstract: This paper presents a new robust watermark extraction scheme for color images using scale-invariant feature transform (SIFT) based image correction. A binary watermark image is permuted with sequence numbers generated by a secret key in a spatiotemporal chaos system. The binary watermark image is then encoded by Gray code and adaptively embedded into the low-frequency components of the discrete cosine transform domain of the blue channel of the original color image. In the watermark extraction scheme, the scale-invariant features of the images are extracted, and the match points between the watermarked image and the reference image are found. The watermarked image is then corrected by an affine transform computed from these match points. Finally, the watermark is extracted from the corrected image. Experimental results show that the proposed scheme is robust and secure against a wide range of image processing operations such as noise addition, filtering, and lossy compression.

Journal Article
TL;DR: A multi-objective optimization algorithm based on a user preference region is proposed, which finds only a preferred and smaller set of Pareto-optimal solutions instead of the entire Pareto frontier, so that the number of solutions is reduced and the convergence rate is improved.
Abstract: Aiming at the problems of existing multi-objective optimization methodologies in practical applications, this paper proposes a multi-objective optimization algorithm based on a user preference region, which finds only a preferred and smaller set of Pareto-optimal solutions instead of the entire Pareto frontier, so that the number of solutions is reduced and the convergence rate is improved. The algorithm adopts the elitist non-dominated sorting strategy, takes the distance between an individual and the user preference region as a factor affecting the individual's fitness, and applies the crowding strategy to maintain the diversity of the solutions. Simulation results show that the proposed algorithm is effective.
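One way to picture the preference-region factor described here is the small sketch below: each individual's objective vector is penalized by its distance to a user-specified box in objective space. The objectives, box, and weighting are invented for illustration, and the full algorithm's non-dominated sorting and crowding steps are omitted:

```python
import math

def objectives(x):
    """A toy bi-objective problem: minimize both f1 and f2."""
    return (x ** 2, (x - 2) ** 2)

def distance_to_region(f, low, high):
    """Euclidean distance from objective vector f to an axis-aligned preference box."""
    return math.sqrt(sum(max(l - v, 0, v - h) ** 2 for v, l, h in zip(f, low, high)))

def preference_fitness(x, low, high, penalty=10.0):
    """Scalarized fitness: raw objective sum plus a penalty for leaving the preferred region."""
    f = objectives(x)
    return sum(f) + penalty * distance_to_region(f, low, high)

# The user prefers solutions whose objectives fall inside [0, 1] x [0, 2].
low, high = (0.0, 0.0), (1.0, 2.0)
candidates = [i / 10 for i in range(0, 21)]
best = min(candidates, key=lambda x: preference_fitness(x, low, high))
print("preferred trade-off at x =", best, "objectives =", objectives(best))
```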

Journal Article
TL;DR: A brief introduction to available kernel structures for operating systems and to the Servant/Exe-Flow Model (SEFM), a new abstraction of the operating system, is given, and the design techniques of the kernel (kernel servant) for a typical component-based operating system, Minicore, are presented.
Abstract: Adopting a component-based model has become a new tendency in operating system design. For a component-based operating system, the key construction techniques express themselves in the design and implementation of the kernel. This paper first gives a brief introduction to available kernel structures for operating systems and to the Servant/Exe-Flow Model (SEFM), a new abstraction of the operating system. Then the design techniques of the kernel (kernel servant) for a typical component-based operating system, Minicore, are presented in detail. Finally, a set of test data is given to show the effectiveness of the presented techniques.

Journal Article
TL;DR: A brief survey of the approaches for building relationships between databases and ontologies, classified into two categories: extracting an ontology from the data source and building a mapping from the database to an existing ontology.
Abstract: Along with the development of the Semantic Web, ontology is playing a more and more important role in research areas such as data integration and semantic interoperability. This paper provides a brief survey of the approaches for building relationships between databases and ontologies. The mapping approaches are classified into two categories: extracting an ontology from the data source and building a mapping from the database to an existing ontology. For each category, this paper describes the characteristics, surveys related work, and compares typical tools or systems. Finally, the challenges and future research directions are summarized as mapping maintenance, tool visualization, instance- or domain-knowledge-assisted mapping generation, and evaluation of automatic mappings.

Journal Article
TL;DR: To reduce the transmissions among sensor nodes and prolong the lifecycle of a wireless sensor network, an immune-based data fusion mechanism is put forward, in which an aggregation strategy is used to minimize the energy consumption of the network.
Abstract: To reduce the transmissions among sensor nodes and prolong the lifecycle of the wireless sensor network, an immune-based data fusion mechanism is put forward. First, a hierarchical distributed algorithm is proposed, in which an aggregation strategy is used to minimize the energy consumption of the network while the reliability of the network is also taken into account. Second, the self-learning and self-adaptation characteristics of artificial immune systems are used to derive an immune fusion algorithm. In the algorithm, immune anti-redundancy, immune selection, and immune memory are used to ensure high reliability and low redundancy. The experiments show that the mechanism can reduce the energy consumption effectively and has good generality.

Journal Article
Niu Dejiao
TL;DR: The results prove that the efficient negative selection algorithm can reduce the number of detector-antigen comparisons and increase the efficiency of the negative selection algorithm.
Abstract: Artificial immune algorithms have been applied in many research fields such as intrusion detection systems, information retrieval systems and data mining systems. The negative selection algorithm is the typical artificial immune algorithm, but it is inefficient because antigens are checked repeatedly using the same sub-strings, the detector-finding method is slow, and comparing detectors against antigens bit by bit is expensive. By analyzing antigens, detectors and their matching rules, this paper presents a conversion algorithm that converts self elements to self-numbers, detectors to detector-numbers, and antigens to antigen-numbers. It then gives a red-black-tree-based negative selection algorithm to index the self-numbers and detector-numbers, which avoids checking antigens and obtaining sub-strings repeatedly. A prototype is implemented with both the efficient negative selection algorithm and the original negative selection algorithm, and their performance is tested and compared. The results prove that the efficient negative selection algorithm can reduce the number of detector-antigen comparisons and increase the efficiency of negative selection.
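The core of the numeric negative selection idea can be sketched as follows. Python has no built-in red-black tree, so a sorted list with binary search stands in for the balanced-tree index, the string-to-number conversion is simplified to a direct byte interpretation, and matching is reduced to exact equality:

```python
from bisect import bisect_left, insort

def to_number(s: str) -> int:
    """Encode a fixed-length string as an integer (simplified stand-in for the paper's conversion)."""
    return int.from_bytes(s.encode(), "big")

def build_detectors(self_set, candidates):
    """Negative selection: keep only candidate detectors that match no self element."""
    self_numbers = sorted(to_number(s) for s in self_set)
    detectors = []
    for c in candidates:
        n = to_number(c)
        i = bisect_left(self_numbers, n)           # O(log n) lookup, as a balanced tree would give
        if i >= len(self_numbers) or self_numbers[i] != n:
            insort(detectors, n)                   # keep detectors indexed for fast matching later
    return detectors

def is_anomalous(antigen, detectors):
    """An antigen is flagged if its number is covered by some detector."""
    n = to_number(antigen)
    i = bisect_left(detectors, n)
    return i < len(detectors) and detectors[i] == n

self_set = ["GET /", "POST "]
candidates = ["GET /", "EXEC ", "DROP "]
detectors = build_detectors(self_set, candidates)
print(is_anomalous("EXEC ", detectors), is_anomalous("GET /", detectors))
```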

Journal Article
TL;DR: It is proved that the reasoning upon DLRDM can be reduced to consistency checking upon the ABox of DLRDM, and the results indicate that DLRDM is correct and effective for consistency checking of the data mining metamodel.
Abstract: The standardization of data mining metadata has become a focal problem in the development of data mining techniques in recent years. Constructing integrated and public data mining metadata is the common aim of data mining vendors. During the creation of data mining metadata based on the common warehouse metamodel (CWM), the evolution of data mining techniques and the different experiences and views of organizations in describing data inevitably cause inconsistencies. However, the current data mining metamodel lacks precise semantics because it is described with natural language and graphs. In this research, a formal logic DLRDM that belongs to the family of description logics is proposed, and its syntax and semantics are given. It is proved that the reasoning upon DLRDM can be reduced to consistency checking upon the ABox of DLRDM. The formalization of the metamodel and metadata of data mining is analyzed in detail. The reasoning engine RacerPro is applied to check the consistency of the data mining metadata; the results indicate that DLRDM is correct and effective for consistency checking of the data mining metamodel.

Journal Article
TL;DR: A two-level out-of-band virtualization data management model which can fully utilize the I/O capacity of individual storage devices as well as maximize the utilization of the underlying storage network is presented.
Abstract: This paper presents a network storage system using an out-of-band virtualization approach, called BW-VSDS. It has the following characteristics: (1) it adopts a two-level out-of-band virtualization data management model which can fully utilize the I/O capacity of individual storage devices as well as maximize the utilization of the underlying storage network; (2) it uses distributed data storage management protocols which coordinate multiple independent storage devices to realize advanced data storage semantics; (3) it supports multiple network data transport standards for different applications. Presently, BW-VSDS is applied in fields such as video monitoring, information processing and enterprise computing.

Journal Article
TL;DR: A method of measuring class authority based on the relationships among classes, in which a class's authority is defined iteratively from the class diagram, is put forward, and it is proved by virtue of linear algebra that this measure has minimum and maximum values.
Abstract: Software measurement is one of the important research domains of software engineering; it affects the cost of software development and maintenance. Till now, researchers have proposed many cohesion and coupling measures, but very little work has addressed measuring classes' authority. This paper puts forward a method of measuring class authority based on the relationships among classes, in which a class's authority is defined iteratively from the class diagram. We then prove, by virtue of linear algebra, that this measure has minimum and maximum values. Finally, a small but realistic example is illustrated.
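The iterative definition over the class diagram suggests a fixed-point computation much like the sketch below, where a class's authority accumulates from the classes that depend on it. The dependency matrix, damping, and normalization are illustrative guesses, not the paper's actual definition:

```python
def class_authority(depends_on, iterations=50, damping=0.85):
    """Iteratively compute an authority score for each class.

    depends_on[c] lists the classes that class c uses; a class used by
    many (authoritative) classes ends up with a high score.
    """
    classes = sorted(set(depends_on) | {d for ds in depends_on.values() for d in ds})
    score = {c: 1.0 / len(classes) for c in classes}
    for _ in range(iterations):
        new = {c: (1 - damping) / len(classes) for c in classes}
        for c, deps in depends_on.items():
            if deps:
                share = damping * score[c] / len(deps)
                for d in deps:                       # c passes authority to the classes it uses
                    new[d] += share
            else:
                for d in classes:                    # classes with no dependencies spread evenly
                    new[d] += damping * score[c] / len(classes)
        score = new
    return score

# Hypothetical class diagram: A and B both use Util; B also uses A.
depends_on = {"A": ["Util"], "B": ["A", "Util"], "Util": []}
for cls, s in sorted(class_authority(depends_on).items(), key=lambda kv: -kv[1]):
    print(f"{cls}: {s:.3f}")
```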

Journal Article
TL;DR: The present paper proposes a novel vectorization method based on local data regrouping which can vectorize some programs that cannot be vectorized by other compilers such as ICC, and improves the performance of some SPEC CPU2000 floating-point programs by up to 241.6%.
Abstract: At present, vectorization of non-multimedia programs with multimedia extensions has become an important way to improve program performance. However, compared to multimedia programs, non-multimedia programs contain a large number of non-adjacent and non-aligned data references, which seriously impede vectorization and decrease its performance benefit. This paper proposes a novel vectorization method based on local data regrouping. The method changes non-adjacent data references into adjacent data references by regrouping data locally so that the regrouped loop can be vectorized, and performs alignment analysis and alignment optimization to improve vectorization performance. On the SPEC CPU2000 floating-point test set, the proposed method can vectorize some programs which cannot be vectorized by other compilers such as ICC, and improves the performance of some SPEC CPU2000 floating-point programs by up to 241.6%.