
Showing papers in "Chinese Journal of Computers in 2006"


Journal Article
TL;DR: A new and relatively reasonable formula for measuring attribute importance is designed to reduce the search space as quickly as possible, and a recursive method for calculating the formula is provided.
Abstract: Computing U/C is one of the most important and time-consuming computations in attribute reduction based on the positive region. At present, the best algorithm for computing U/C is based on quick sorting, and its time complexity is O(|C||U|log|U|). In this paper, a new algorithm for computing U/C based on radix sorting is provided, and its complexity is cut down to O(|C||U|). On the other hand, it is not fully reasonable to regard approximation quality as the heuristic information in attribute reduction algorithms based on the positive region. So a new and relatively reasonable formula for measuring attribute importance is designed to reduce the search space as quickly as possible, and a recursive method for calculating the formula is provided. The complexity of calculating the formula is reduced to O(|C−P||U′−U′_P|). The formula is then used as heuristic information to design an efficient attribute reduction algorithm, whose worst-case time complexity is cut down to max(O(|C||U|), O(|C|²|U/C|)). An example is used to illustrate the efficiency of the new algorithm. Finally, experimental results show that the new algorithm is not only efficient but also scalable.
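
The abstract gives only the complexity; as a reading aid, here is a minimal Python sketch of computing the partition U/C with one bucketing pass per attribute, in the spirit of radix sort (hash buckets stand in for radix bins; the data layout and names are illustrative, not the paper's):

```python
from collections import defaultdict

def partition_U_C(U, C):
    """Compute the partition U/C of objects U by condition attributes C.

    U: list of objects, each a dict mapping attribute -> value.
    C: list of condition attribute names.
    One bucketing pass over all objects per attribute, O(|C||U|) overall.
    """
    blocks = [list(range(len(U)))]          # start with one block holding all objects
    for a in C:                             # refine the partition attribute by attribute
        refined = []
        for block in blocks:
            buckets = defaultdict(list)     # objects with equal value of a fall together
            for i in block:
                buckets[U[i][a]].append(i)
            refined.extend(buckets.values())
        blocks = refined
    return blocks

# Example: two attributes, four objects -> three equivalence classes
U = [{'a': 0, 'b': 1}, {'a': 0, 'b': 1}, {'a': 0, 'b': 2}, {'a': 1, 'b': 1}]
print(partition_U_C(U, ['a', 'b']))         # [[0, 1], [2], [3]]
```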

133 citations


Journal Article
TL;DR: Experiments on QoS-aware Web service selection show that the genetic algorithm with this matrix can obtain a better composite service plan than the genetic algorithm with the one-dimensional coding scheme, and that the mutation policy improves the genetic algorithm's fitness.
Abstract: A novel genetic algorithm is presented for Quality of Service (QoS)-aware Web service selection. The genetic algorithm includes a special relation-matrix coding scheme for chromosomes that can express all composite paths simultaneously, which the one-dimensional coding scheme cannot. With a simple method, the matrix can also effectively represent composite service re-planning and cyclic paths, as well as many composition scenarios that the one-dimensional scheme cannot capture. Elements along the main diagonal of the matrix represent the tasks in all composite paths, and the other elements represent the direct relationship between each pair of tasks. Running only once, the proposed genetic algorithm can construct a composite service plan meeting the QoS requirements from a great number of service compositions with different QoS values. Meanwhile, the algorithm adopts a mutation policy to improve the fitness. Experiments on QoS-aware Web service selection show that the genetic algorithm with this matrix obtains a better composite service plan than the genetic algorithm with the one-dimensional coding scheme, and that the mutation policy improves the genetic algorithm's fitness.
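
The exact matrix semantics, fitness and genetic operators are paper-specific and not given in the abstract; the toy sketch below only illustrates the encoding idea, a diagonal that selects one candidate service per task plus off-diagonal dependency flags, with everything else (data, fitness, mutation) invented for illustration:

```python
import numpy as np

# Hypothetical setup: n tasks, each with several candidate services scored by QoS.
rng = np.random.default_rng(0)
n_tasks = 4
candidates = [rng.random(3) for _ in range(n_tasks)]   # QoS of 3 candidates per task

def random_chromosome():
    """Relation-matrix chromosome: the diagonal encodes the selected candidate
    per task; off-diagonal entries flag a direct dependency between two tasks."""
    m = np.zeros((n_tasks, n_tasks), dtype=int)
    for i in range(n_tasks):
        m[i, i] = rng.integers(len(candidates[i]))     # selected service for task i
        if i + 1 < n_tasks:
            m[i, i + 1] = 1                            # simple sequential path
    return m

def fitness(m):
    """Aggregate QoS of the selected services along the composite path."""
    return sum(candidates[i][m[i, i]] for i in range(n_tasks))

def mutate(m):
    """Reselect the service of one random task (diagonal mutation)."""
    i = rng.integers(n_tasks)
    m[i, i] = rng.integers(len(candidates[i]))
    return m

# Selection and crossover omitted; keep the best of many mutated chromosomes.
best = max((mutate(random_chromosome()) for _ in range(200)), key=fitness)
print(fitness(best))
```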

69 citations


Journal Article
TL;DR: A theoretical model and method to design and analyze RFID protocols within the provable-security framework are discussed, with a focus on cryptographic protocols.
Abstract: Recently, RFID systems have been widely considered as a main technology for realizing the ubiquitous computing environment, but the features of RFID systems and the constraints of RFID devices may bring about various privacy problems. The biggest challenge for RFID technology is to provide benefits without threatening the privacy of consumers. This paper reviews the existing RFID system security mechanisms, with a focus on cryptographic protocols. Weaknesses or flaws in these protocols are examined. Then a theoretical model and method to design and analyze RFID protocols within the provable-security framework are discussed.

43 citations


Journal Article
TL;DR: This paper comprehensively surveys the skin color detection techniques by dividing them into statistics-based and physics-based approaches, including color space selection, static and dynamic skin color modeling, skin reflection models, and visual and infrared skin spectrum.
Abstract: Skin color detection has many applications in tasks like detecting and tracking human faces and gestures, filtering web image content, retrieving people from databases and the Internet, and diagnosing diseases. This paper comprehensively surveys skin color detection techniques by dividing them into statistics-based and physics-based approaches. Important aspects of skin color detection are discussed, including color space selection, static and dynamic skin color modeling, skin reflection models, and the visual and infrared skin spectrum. Based on the discussion, color space selection is related to feature extraction and classification methods. The selection of a color space is actually the selection of a feature base for classification; and features are not independent of the classifier, so how well they cooperate affects the overall performance of the classification system. Therefore, it is of little significance to discuss the optimal color space for skin color detection without considering the methods of skin color modeling and the principle and implementation of the classifier. A successful dynamic skin color model should work well under varying illumination, which is also the precondition for a skin color detection system to work properly outdoors. Physics-based skin color detection should study skin reflection models or the skin spectrum; the combination of reflected and emitted data will essentially improve the performance of a skin color detection system. The major challenge for skin color detection techniques is how to deal properly with different illumination conditions and complex backgrounds. Further research directions include crossing related disciplines, fusing related algorithms and combining related features. Integrating physics-based techniques into statistics-based skin detection is a trend in high-end applications.

38 citations


Journal Article
TL;DR: The strength of SMS4 against the differential fault attack is examined, and the authors suggest that the encryption device should be protected to prevent the adversary from inducing faults.
Abstract: SMS4 is the block cipher used in WAPI, and it is also the first commercial block cipher disclosed by the government. Since it was disclosed only a short time ago, no paper on its security has been published so far. In this paper the strength of SMS4 against the differential fault attack is examined. The authors use a byte-oriented fault model and take advantage of differential analysis as well. Theoretically, the 128-bit master key of SMS4 can be obtained using 32 faulty ciphertexts. In practice, because the byte position where the fault occurs is not uniformly distributed, the number of faulty ciphertexts needed is a little larger than the theoretical value; the attack experiments validate this fact. The results show that on average only 47 faulty ciphertexts are needed to recover the 128-bit key of SMS4, so SMS4 is vulnerable to the differential fault attack. To avoid this kind of attack, the authors suggest that the encryption device should be protected to prevent the adversary from inducing faults.

37 citations


Journal Article
Gao Yan
TL;DR: A QoS-oriented hierarchical structure for Composite Web Service selection is proposed, including QoS for assuring the quality of basic Web services, QoS for evaluating the relation degree between services, and QoS for measuring the whole quality of Composite Web Services.
Abstract: The provision of guaranteed QoS is critical to the success of Web Services in business domains. Thus, how to make Composite Web Services meet users' QoS requirements has become a hot issue in the research field. For this reason, this paper proposes a QoS-oriented hierarchical structure for Composite Web Service selection, including QoS for assuring the quality of basic Web services, QoS for evaluating the relation degree between services, and QoS for measuring the whole quality of Composite Web Services. Based on this, a QoS-driven selection algorithm for Composite Web Services is also proposed. The experimental results demonstrate its efficiency.

36 citations


Journal Article
TL;DR: An incremental algorithm for updating the core based on the discernibility matrix is introduced; when updating the discernibility matrix it only inserts a new row and column, or deletes one row and updates the corresponding column.
Abstract: Rough set theory is a new mathematical tool to deal with imprecise, incomplete and inconsistent data. Attribute reduction is one of the important topics researched in rough set theory, and the core of a decision table is the starting point of many existing attribute reduction algorithms. Many algorithms have been proposed for computing the core; however, very little work has been done on updating the core. Therefore, this paper introduces an incremental algorithm for updating the core based on the discernibility matrix: on insertion it only adds a new row and column, and on deletion it removes one row and updates the corresponding column, so the efficiency of updating the core is remarkably improved. Theoretical analysis and experimental results show that the algorithms of this paper are efficient and effective.
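
A minimal sketch of the insertion case, assuming the classical discernibility-matrix definitions (an entry collects the attributes that distinguish two objects with different decisions; any singleton entry puts its attribute into the core); the decision-table layout is illustrative:

```python
def discern(x, y, C, d):
    """Attributes of C that distinguish x and y; empty if decisions agree."""
    if x[d] == y[d]:
        return set()
    return {a for a in C if x[a] != y[a]}

def insert_object(table, M, core, x, C, d):
    """Incrementally update discernibility matrix M and core when inserting x.
    Only the new row (one entry per existing object) is computed, instead of
    rebuilding the whole matrix."""
    row = []
    for y in table:
        e = discern(x, y, C, d)
        row.append(e)
        if len(e) == 1:                 # a singleton entry puts its attribute in the core
            core |= e
    M.append(row)
    table.append(x)
    return M, core

# Toy decision table: condition attributes a, b; decision d
C, d = ['a', 'b'], 'd'
table, M, core = [], [], set()
for obj in [{'a': 0, 'b': 0, 'd': 0}, {'a': 0, 'b': 1, 'd': 1}, {'a': 1, 'b': 1, 'd': 0}]:
    M, core = insert_object(table, M, core, obj, C, d)
print(core)    # {'a', 'b'}: each appears alone in some discernibility entry
```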

35 citations


Journal Article
TL;DR: This paper proposes an identity-based authentication model for multi-domain environments built on identity-based public key cryptography, so as to overcome some problems posed by traditional authentication models based on PKI.
Abstract: In this paper, the authors consider the special requirements of an authentication model in multi-domain environments. They analyze the problems of existing authentication frameworks and propose an identity-based authentication model for multi-domain environments. The model is based on identity-based public key cryptography, so as to overcome some problems posed by traditional authentication models based on PKI. Moreover, cross-domain entity authentication and subject anonymity are supported in the model. In particular, the security of entity authentication and anonymity is analyzed using the modular approach under the CK model. It is shown that the proposed model is secure and achieves the security requirements.

33 citations


Journal Article
TL;DR: A SOA reference model that can be used to design service-oriented architectures is presented, together with its service bus, a meta-model for service contracts, and a maturity model for evaluating service-oriented architectures.
Abstract: Recently, SOA has played a more and more important role in software research and software development. On the basis of research findings on SOA, this paper presents a SOA reference model that can be used to design service-oriented architectures. The paper discusses some SOA concepts in depth, and elaborates on the structure of the reference model, its service bus, and its meta-model for service contracts. Moreover, the paper presents a maturity model for evaluating service-oriented architectures. The reference model lays the foundation for building service-oriented architectures.

32 citations


Journal Article
TL;DR: The paper first introduces the mathematical model of the regression least squares support vector machine (LSSVM) and analyzes its properties, then designs incremental and online learning algorithms based on LSSVM using the block-matrix inversion formula and properties of the kernel matrix.
Abstract: The support vector machine is a learning technique based on the structural risk minimization principle, and it is also a class of regression methods with good generalization ability. The paper first introduces the mathematical model of the regression least squares support vector machine (LSSVM) and analyzes its properties, then designs incremental and online learning algorithms based on LSSVM using the block-matrix inversion formula and properties of the kernel matrix. The proposed learning algorithms fully utilize the historical training results, reducing storage space and computation time. Simulation results indicate the feasibility of the two learning algorithms.
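
The two named ingredients, the LSSVM linear system and the block-matrix (Schur complement) inverse update, are standard; the numpy sketch below combines them under those textbook formulas (kernel, data and the overall loop are illustrative, not the paper's exact algorithm):

```python
import numpy as np

def rbf(x, z, sigma=1.0):
    return np.exp(-np.sum((x - z) ** 2) / (2 * sigma ** 2))

def block_inverse_update(A_inv, v, c):
    """Inverse of [[A, v], [v^T, c]] from A^{-1} via the Schur complement,
    O(n^2) per added sample instead of O(n^3) for a fresh inversion."""
    u = A_inv @ v
    s = c - v @ u                      # Schur complement (scalar)
    top_left = A_inv + np.outer(u, u) / s
    return np.block([[top_left, -u[:, None] / s],
                     [-u[None, :] / s, np.array([[1.0 / s]])]])

# Incremental LSSVM regression: maintain H^{-1} with H = K + I/gamma, and
# re-solve the bordered system  1^T alpha = 0,  b*1 + H alpha = y  after
# each new sample; the predictor is f(x) = sum_i alpha_i K(x, x_i) + b.
gamma, X, y = 10.0, [], []
H_inv = None
for x_new, y_new in [(np.array([0.0]), 0.1), (np.array([1.0]), 0.9), (np.array([2.0]), 2.1)]:
    v = np.array([rbf(x_new, xi) for xi in X])
    c = rbf(x_new, x_new) + 1.0 / gamma
    H_inv = np.array([[1.0 / c]]) if H_inv is None else block_inverse_update(H_inv, v, c)
    X.append(x_new); y.append(y_new)
    ones = np.ones(len(X))
    b = (ones @ H_inv @ np.array(y)) / (ones @ H_inv @ ones)
    alpha = H_inv @ (np.array(y) - b * ones)
print(b, alpha)
```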

32 citations


Journal Article
TL;DR: Theoretical analysis and simulation show that DyTrust has advantages over existing trust metrics in modeling dynamic trust relationships and aggregating feedback information, and is highly effective in countering the strategically altering behavior and dishonest feedback of malicious peers.
Abstract: An important challenge in peer trust evaluation in P2P systems is how to cope efficiently with the strategically altering behaviors and dishonest feedback of malicious peers. However, the trust models employed by existing systems do not provide adequate support for coping with quick changes in peers' behavior or for aggregating feedback information, so the authors present a time-frame-based dynamic trust model, DyTrust. After incorporating the time dimension using time frames, which capture the time-sensitivity of experience and recommendations, the authors introduce four trust parameters for computing the trustworthiness of peers, namely short-term trust, long-term trust, accumulated trust misuse and feedback credibility. Together, these parameters are adjusted over time using a feedback control mechanism to reflect the dynamics of the trust environment; thus, the trust evaluation adapts better to the dynamics of trust. Theoretical analysis and simulation show that DyTrust has advantages over existing trust metrics in modeling dynamic trust relationships and aggregating feedback information, and is highly effective in countering the strategically altering behavior and dishonest feedback of malicious peers.
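
The abstract names the four parameters but not their formulas; the toy sketch below merely illustrates how such quantities could interact, with invented parameter names and update rules:

```python
# A minimal sketch of time-frame-based trust (all constants and dynamics are
# illustrative, not the paper's): per time frame, aggregate credibility-weighted
# feedback, then keep a fast (short-term) and a slow (long-term) average.
def update_trust(state, frame_feedback, alpha_short=0.6, alpha_long=0.1):
    """state: dict with 'short', 'long', 'penalty'.
    frame_feedback: list of (rating in [0,1], credibility in [0,1]) for one frame."""
    total_cred = sum(c for _, c in frame_feedback) or 1.0
    frame_score = sum(r * c for r, c in frame_feedback) / total_cred
    state['short'] = (1 - alpha_short) * state['short'] + alpha_short * frame_score
    state['long'] = (1 - alpha_long) * state['long'] + alpha_long * frame_score
    # Feedback control: a peer whose recent behavior drops below its history
    # accumulates a misuse penalty, so trust falls fast and recovers slowly.
    if state['short'] < state['long']:
        state['penalty'] += state['long'] - state['short']
    else:
        state['penalty'] *= 0.9
    return max(0.0, min(state['short'], state['long']) - 0.5 * state['penalty'])

state = {'short': 0.5, 'long': 0.5, 'penalty': 0.0}
for frame in [[(0.9, 1.0)], [(0.9, 0.8)], [(0.1, 1.0)]]:   # strategic drop in frame 3
    print(round(update_trust(state, frame), 3))
```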

Journal Article
TL;DR: The problem of service composition is presented, including the definition of the composite Web service system and service cooperation; the rules of automatic service composition are proposed, and their soundness and completeness are proven.
Abstract: Automatic composition of Web services is one of the most important issues in research on service-oriented computing (SOC). According to the relationship between messages and activities, Web services are defined as a message-oriented, activity-based Petri net model (Moap). The model is characterized in terms of a message domain and a service process: the former is used for service cooperation and communication with the client; the latter describes the Petri-net-based service process. Moap supports the reuse of composite services. Compared with automata-based models, Moap can describe parallel composition, and its meta-message mechanism benefits automatic composition. Based on Moap, the problem of service composition is presented, including the definition of the composite Web service system and service cooperation. Then, the rules of automatic service composition are proposed, and their soundness and completeness are proven. Finally, an example also demonstrates the usability of Moap.

Journal Article
TL;DR: In this article, a new method for measuring logic truth scale, with a background in the medium mathematics system, is proposed, and the relation between truth values of the predicate and intervals of generally applicable numerical values is described.
Abstract: To handle the fuzzy phenomena that exist widely in engineering and scientific research, a new method for measuring logic truth scale, with a background in the medium mathematics system, is proposed. After establishing the ε standard pointer of the predicate, the relation between truth values of the predicate and intervals of generally applicable numerical values is described. Adopting the concept of distance, and using the lengths of the numerical intervals corresponding to different predicate truth values as a norm, the distance-ratio function and its special transformations are defined, and from this the individual truth grade function in one dimension is obtained. In addition, the ramification predicate in n dimensions, the inverse mapping of the power set and the vector of standard pointers are introduced to describe the n-dimensional application forms, such as the sum function of truth grade and the polar-value branch function. The given demonstrations show that with the concept of super-truth advanced in this paper, the interval of the truth scale can in theory be extended from [0,1] to (-∞,+∞), and that the truth grade function possesses a quantitative form that computers can process, together with objectivity and universal adaptability. Consequently, the method of measuring medium truth scale should find effective application in fields dealing with vague phenomena.

Journal Article
TL;DR: In this paper, a CP-net model for Web service composition is proposed to describe the logical relations of components graphically, and the dynamic behaviors of services can be simulated and analyzed by executing the CP-net model.
Abstract: A CP-net model for Web service composition is proposed. For each service, a CP-net model is constructed to describe the logical relations of its components graphically. Furthermore, the dynamic behaviors of services can be simulated and analyzed by executing the CP-net model. Operators to construct new complex services from known ones, used as building blocks, are defined formally. Some algebraic and dynamic properties of this model are studied and proved. Algorithms to construct and execute a composite service are also given.

Journal Article
Li Tao
TL;DR: The experimental results show that the new immune-based model for computer network monitoring, called AINM, has the capabilities of real-time processing, self-learning, self-adaptation and diversity.
Abstract: In a traditional computer immune system (CIS), the detector training efficiency is very low, and there is no dynamic evolutionary mechanism for the self/nonself definition, resulting in low self-adaptability and thus failing to satisfy the requirements of network monitoring in a real network environment. To solve this problem, a new immune-based model for computer network monitoring, called AINM, is proposed. The concepts and formal definitions of self, nonself, antigen, detector and digital evidence are introduced. Furthermore, the dynamic evolution models and recursive equations for self, antigen, dynamic computer forensics, immunological tolerance and the detector lifecycle are presented. A simulation of this model has been carried out. The experimental results show that the new model has the capabilities of real-time processing, self-learning, self-adaptation and diversity.

Journal Article
Li Ning, Sun De, Zou Tong, Qin Yuan, Wei Yu 
TL;DR: Theoretical guiding formulas and conditional expressions for choosing the PSO parameters are proposed, which are used to guide and balance the algorithm's abilities of exploration and exploitation, and are helpful for choosing and adjusting PSO parameters in practical applications.
Abstract: Since particle swarm optimization is a discrete dynamic process, the authors make a thorough study of the stability of a particle's trajectory in the swarm using difference equations and the Z transform, discuss the influences of pBest, gBest and randomness on the particle's trajectory, and analyze the relationship between trajectory stability and algorithm convergence. Theoretical guiding formulas and conditional expressions for choosing the PSO parameters are also proposed in this paper, which are used to guide and balance the algorithm's abilities of exploration and exploitation, and are helpful for choosing and adjusting PSO parameters in practical applications.
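
The derivation itself is not in the abstract; the standard deterministic trajectory analysis that this line of work builds on (random coefficients frozen at a constant value) runs as follows:

```latex
% Deterministic model of one particle in one dimension, with the random
% coefficients frozen at a constant \varphi = c_1 r_1 + c_2 r_2 and attractor p:
%   v_{t+1} = w\,v_t + \varphi\,(p - x_t), \qquad x_{t+1} = x_t + v_{t+1}.
% Eliminating v gives a second-order linear difference equation for x_t:
\[
  x_{t+1} - (1 + w - \varphi)\,x_t + w\,x_{t-1} = \varphi\,p ,
\]
% whose characteristic equation and stability (convergence) condition are
\[
  \lambda^2 - (1 + w - \varphi)\,\lambda + w = 0,
  \qquad
  |\lambda_{1,2}| < 1 \;\Longleftrightarrow\; |w| < 1,\;\; 0 < \varphi < 2(1 + w).
\]
% Inside this region the trajectory settles on the weighted attractor
% p = (c_1 r_1\,p_{best} + c_2 r_2\,g_{best}) / (c_1 r_1 + c_2 r_2),
% which is the usual starting point for parameter-selection rules.
```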

Journal Article
TL;DR: The authors introduce a trust model that is based on the uncertainty reasoning theory (D-S theory) and propose a novel scheduling algorithm that is called Trustworthy and Dynamic Level Scheduling (TDLS).
Abstract: The uncertainty of Grid users, resources and services may have a negative effect on the execution of Grid tasks, which makes it difficult to design a scheduling algorithm that minimizes the execution time and cheat probability of Grid tasks. Referring to the social trust relationship, the authors introduce a trust model based on the uncertainty reasoning theory (D-S theory). In addition, by combining the trust model with the Dynamic Level Scheduling (DLS) algorithm, the authors propose a novel scheduling algorithm called Trustworthy and Dynamic Level Scheduling (TDLS). The algorithm takes the Grid nodes' trust degree into account when calculating the scheduling level of task-node pairs. Simulations show that the algorithm can efficiently satisfy the QoS requirement on trust, at the cost of slightly more time.
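
How the D-S evidence is fused is not spelled out in the abstract; the core operation of D-S theory is Dempster's rule of combination, sketched here over a two-element frame with hypothetical mass assignments:

```python
from itertools import product

def dempster_combine(m1, m2):
    """Dempster's rule over a frame of discernment: masses on focal sets
    (frozensets); conflicting mass is renormalized away."""
    combined, conflict = {}, 0.0
    for (A, a), (B, b) in product(m1.items(), m2.items()):
        inter = A & B
        if inter:
            combined[inter] = combined.get(inter, 0.0) + a * b
        else:
            conflict += a * b
    k = 1.0 - conflict
    return {s: v / k for s, v in combined.items()}

T, NT = frozenset({'T'}), frozenset({'~T'})
theta = T | NT                                    # total ignorance
m_direct = {T: 0.6, NT: 0.1, theta: 0.3}          # evidence from direct interaction
m_recommend = {T: 0.5, NT: 0.2, theta: 0.3}       # evidence from recommendations
print(dempster_combine(m_direct, m_recommend))    # fused belief in trustworthiness
```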

Journal Article
TL;DR: A method of Web service composition according to clients' QoS requirements is proposed to solve the problem of the composition process, which satisfies the Markov property, and a prototype called E-WsFrame is implemented.
Abstract: A method of Web service composition according to clients' QoS requirements is proposed to solve the problem of the composition process, which satisfies the Markov property. First, a Web service description model is proposed to specify the QoS descriptions, and QoS description over the whole life cycle of composition is implemented. Then a selection algorithm is given based on multiple-objective decision-making theory and the k-armed bandit problem. Compared with existing methods, the algorithm provided in this paper can select and compose services according to clients' preferences for QoS attributes under the condition of incomplete QoS information. Finally, a prototype called E-WsFrame is implemented. Experimental results show that E-WsFrame can satisfy both the functional and the QoS requirements of composed services when selecting and composing Web services at runtime.

Journal Article
TL;DR: A high-performance method employing a two-step strategy to classify texts, in which a portion of the texts currently considered unreliable in categorization is identified, forming a fuzzy area between categories.
Abstract: Filtering topic-sensitive information is one of the important applications of text categorization, and effectively filtering topic-sensitive information out of Chinese text collections is a technical challenge. This paper presents a high-performance method employing a two-step strategy to classify texts. In the first step, the authors regard words with the parts of speech verb, noun, adjective and adverb as candidate features, perform feature selection on them using an improved mutual information formula, and classify the input texts with a naive Bayes classifier. A portion of the texts currently considered unreliable in categorization is identified, forming a fuzzy area between categories. In the second step, the authors regard bigrams of words with the parts of speech verb and noun as candidate features, and use the same feature selection and classifier to deal with the texts in the fuzzy area. Experiments on a test set consisting of 12600 Chinese texts show that this method achieves high performance: the precision, recall and F_1 are 97.19%, 93.94% and 95.54%, respectively.
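
The improved mutual information formula is paper-specific; the sketch below shows the baseline MI scoring it refines, on toy data (the naive Bayes classification and the fuzzy-area thresholding of the two-step strategy are omitted):

```python
import math

def mutual_information(docs, labels, word):
    """Standard MI between a word's presence and the class labels; the paper
    refines this baseline formula."""
    n = len(docs)
    score = 0.0
    for c in set(labels):
        n_c = labels.count(c)
        n_w = sum(1 for d in docs if word in d)
        n_wc = sum(1 for d, l in zip(docs, labels) if word in d and l == c)
        if n_wc:
            score += (n_wc / n) * math.log((n_wc * n) / (n_w * n_c + 1e-12))
    return score

docs = [{'good', 'film'}, {'bad', 'film'}, {'good', 'plot'}, {'bad', 'plot'}]
labels = ['pos', 'neg', 'pos', 'neg']
features = sorted({w for d in docs for w in d},
                  key=lambda w: mutual_information(docs, labels, w), reverse=True)[:2]
print(features)   # 'good' and 'bad' carry all the class information here
```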

Journal Article
TL;DR: The authors present a new identity-based signcryption scheme using bilinear pairings and prove, in the random oracle model, that it is secure assuming the bilinear Diffie-Hellman problem is hard.
Abstract: Signcryption is a cryptographic primitive that combines the functions of digital signature and public key encryption in a single logical step, at lower computational cost and communication overhead than the traditional signature-then-encryption approach. In this paper, the authors present a new identity-based signcryption scheme using bilinear pairings and prove its security in the random oracle model. The proposed scheme is proved secure assuming the bilinear Diffie-Hellman problem is hard. Compared with the most efficient scheme to date, the Chen-Malone-Lee scheme, the proposed scheme saves one pairing operation, requiring only two pairing operations.

Journal Article
TL;DR: Experiments on the optimization of ten typical test functions prove that the FLAGA is fast, stable and easy to implement.
Abstract: A genetic algorithm with fast local adjustment (FLAGA) is presented. In mutation, high-fitness individuals in the population undergo inducing mutation while the others undergo stochastic dynamic-range mutation; in crossover, the algorithm is divided into a searching phase and an adjusting phase, which adopt stochastic linear-combination crossover and partially deterministic inducing crossover, respectively. A highly accurate numerical solution can be found in a short time when the FLAGA is applied in a neighborhood of the globally optimal solution. Experiments on the optimization of ten typical test functions prove that the FLAGA is fast, stable and easy to implement. In global search, the FLAGA's convergence rate and solution quality clearly exceed the GA's when its control parameters are adjusted properly.

Journal Article
TL;DR: A method to calculate the similarity between question and sentence based on Latent Semantic Analysis (LSA) is proposed, which achieves a much better effect and solves the problems of synonymy and polysemy.
Abstract: When extracting answers in a Chinese question-answering system, synonymy causes several correct answers to be lost, and polysemy causes wrong answers to be extracted. In order to solve these problems, this paper proposes a method to calculate the similarity between question and sentence based on Latent Semantic Analysis (LSA). This method represents the question and sentence with the vector space model, statistically analyzes an abundant corpus of question-answer sentence pairs with the help of latent semantic analysis theory, and constructs a latent word-sentence semantic space, which removes the correlation between words. Similarity calculation between question and sentence is then carried out in this semantic space, so the problems of synonymy and polysemy are solved effectively. Finally, combining the question type and the similarity between question and sentence, an experiment on extracting sentences as answers for Chinese factoid questions is conducted. The MRR value with LSA is 0.47, clearly better than with the VSM. The results show that this method is very effective.
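
As a reading aid, here is a minimal LSA sketch on toy data (not the paper's pipeline): a truncated SVD of the term-sentence matrix, with the question folded into the latent space and compared to sentences by cosine similarity:

```python
import numpy as np

# Build a term-sentence matrix, take a truncated SVD, and compare question and
# candidate sentences in the latent space, where words that co-occur across
# sentences (synonym-like behavior) end up close together.
sentences = [['capital', 'china', 'beijing'],
             ['beijing', 'host', 'olympics'],
             ['paris', 'capital', 'france']]
vocab = sorted({w for s in sentences for w in s})
A = np.array([[s.count(w) for s in sentences] for w in vocab], dtype=float)

U, S, Vt = np.linalg.svd(A, full_matrices=False)
k = 2                                                         # latent dimensionality
fold_in = lambda bow: np.diag(1 / S[:k]) @ U[:, :k].T @ bow   # project into latent space

def similarity(q_words, sent_idx):
    q = np.array([q_words.count(w) for w in vocab], dtype=float)
    qv, sv = fold_in(q), Vt[:k, sent_idx]
    return qv @ sv / (np.linalg.norm(qv) * np.linalg.norm(sv) + 1e-12)

question = ['capital', 'china']
print([round(similarity(question, i), 3) for i in range(len(sentences))])
```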

Journal Article
TL;DR: A novel trust-QoS enhanced heuristic based on the trust relationship is put forward, and simulation results demonstrate that trust-driven heuristics perform better than time-driven heuristics and the Min-Min and Sufferage heuristic algorithms.
Abstract: At present, existing scheduling algorithms in service grids largely ignore the impact of the trust mechanism, so it is rather doubtful to adopt these algorithms in a realistic environment. Based on the grid trust model and trust utilization functions, a computational service scheduling problem based on trust-QoS enhancement is proposed, and a novel trust-QoS enhanced heuristic based on the trust relationship is put forward. The algorithms are evaluated with large-scale simulation. Simulation results demonstrate that trust-driven heuristics perform better than time-driven heuristics, and that the algorithm based on the trust relationship achieves better integrated performance in terms of performance QoS, trust QoS, etc. than the trust-driven Min-Min and Sufferage heuristic algorithms.

Journal Article
TL;DR: The image registration technique based on the Fourier-Mellin transform can be used to register images that are misaligned due to rotation, scaling and translation, and finds applications in many different fields thanks to its high accuracy, robustness and low computational cost.
Abstract: The image registration technique based on the Fourier-Mellin transform can be used to register images that are misaligned due to rotation, scaling and translation, and finds applications in many different fields thanks to its high accuracy, robustness and low computational cost. In this paper, the technique is extended to two new application fields. The first is panoramic mosaics. Unlike conventional methods, the technique is capable of building a coarse full view of a large scene without requiring special hardware to control the camera motion, knowing the camera's focal length, or detecting image features and their correspondences. The other extended application is curve matching. In most traditional curve matching methods, the correspondence of curve features, such as corners and extrema of curvature, must first be established before the matching parameters are computed. Here a new approach is proposed, in which the curves to be matched are first converted into binary images, and the matching of these binary images is then carried out by the Fourier-Mellin-based registration technique. Numerous experiments show that for most images captured by a hand-held camera, if the projective distortions are not too severe, the registration results are satisfactory.
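
The primitive underlying Fourier-Mellin registration is phase correlation; below is a minimal numpy sketch for pure translation (in the full method the same step is applied to log-polar resampled spectrum magnitudes, where rotation and scale also become shifts):

```python
import numpy as np

def phase_correlation(f, g):
    """Recover the integer translation between images f and g via the
    normalized cross-power spectrum: only the phase is kept, so the inverse
    FFT has a sharp peak at the shift."""
    F, G = np.fft.fft2(f), np.fft.fft2(g)
    R = F * np.conj(G)
    R /= np.abs(R) + 1e-12               # keep only phase information
    corr = np.fft.ifft2(R).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    # Map wrapped indices to signed shifts
    if dy > f.shape[0] // 2: dy -= f.shape[0]
    if dx > f.shape[1] // 2: dx -= f.shape[1]
    return dy, dx

rng = np.random.default_rng(1)
img = rng.random((64, 64))
shifted = np.roll(np.roll(img, 5, axis=0), -3, axis=1)    # shift by (5, -3)
print(phase_correlation(shifted, img))                    # -> (5, -3)
```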

Journal Article
TL;DR: Experiments on some UCI databases show that the results of CF-WFCM are better than those of FCM, and the index CFuzziness(w) not only can be used to learn feature weights, but is also a valid entropy function for evaluating feature evaluation indexes.
Abstract: This paper proposes the CF-WFCM algorithm, comprising a feature weight learning algorithm and a clustering algorithm. According to the similarity of the data, the feature weight learning algorithm assigns each feature a weight by minimizing the feature evaluation index CFuzziness(w) through the gradient descent technique. Applying the learned feature weights in the Fuzzy C-Means (FCM) clustering algorithm yields the clustering part of CF-WFCM. CF-WFCM emphasizes the effect of important features and lessens the effect of redundant features in the clustering procedure, so that clustering performance is improved. Experiments on some UCI databases show that the results of CF-WFCM are better than those of FCM. In addition, the index CFuzziness(w) not only can be used to learn feature weights, but is also a valid entropy function for evaluating feature evaluation indexes. If a good validity index can be chosen to learn the feature weights before clustering, a large amount of computation can be avoided, as shown in an example. Finally, the authors discuss the CF-WFCM algorithm.

Journal Article
TL;DR: Based on ECDSA, a signcryption scheme called SC-ECDSA is designed, which is equivalent to an AtE(OTP$, MAC) encryption scheme or to ECDSA when one of the parties is absent.
Abstract: Signcryption is a new cryptographic primitive that simultaneously fulfills the functions of both signature and encryption. The definition of generalized signcryption is first proposed in this paper. Generalized signcryption has the special feature of providing confidentiality or authenticity separately under specific inputs, so it is more useful than common schemes. Based on ECDSA, a signcryption scheme called SC-ECDSA is designed. It is equivalent to an AtE(OTP$, MAC) encryption scheme or to ECDSA when one of the parties is absent. A third party can verify the signcryption text publicly with the method of ECDSA. Its security properties are proven in the random oracle model: confidentiality (CUF-CPA), unforgeability (UF-CMA) and non-repudiation. For typical security parameters in high-level security applications, SC-ECDSA achieves a 78% reduction in computational cost compared with other schemes.

Journal Article
TL;DR: The authors propose a corner-occupying, largest-hole-degree-first placement policy based on Euclidean distance and present an effective heuristic algorithm, by which solutions to the rectangle packing problem can be obtained quickly.
Abstract: Solving NP-hard problems is the bottleneck task for computer science and technology nowadays. In recent years, investigations have shown that for NP-hard problems there may not exist an algorithm that is complete, rigorous and not too slow, so solution methods are usually heuristic. The rectangle packing problem is NP-hard: given a set of rectangles with fixed widths and heights and a larger rectangle, the problem is to find a good layout that packs these rectangles entirely inside the larger rectangle without overlapping. In this paper, based on the quasi-human strategy, the authors propose a corner-occupying, largest-hole-degree-first placement policy based on Euclidean distance. An effective heuristic algorithm is presented, and solutions to the rectangle packing problem can be obtained quickly by applying it. Experimental results on the MCNC and GSRC benchmark circuits demonstrate that the algorithm is quite effective in solving the problem.
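
The paper's hole-degree formula is not reproduced in the abstract; the sketch below uses a simple touching-perimeter score as a stand-in to illustrate corner-occupying placement (data and scoring are illustrative):

```python
# Candidate positions are corners formed by placed rectangles and the container;
# each feasible placement is scored by how much of its perimeter touches
# existing edges, a crude proxy for the paper's hole degree.
def overlaps(r, placed):
    x, y, w, h = r
    return any(x < px + pw and px < x + w and y < py + ph and py < y + h
               for px, py, pw, ph in placed)

def touch_len(r, placed, W, H):
    x, y, w, h = r
    t = (h if x == 0 else 0) + (h if x + w == W else 0) \
      + (w if y == 0 else 0) + (w if y + h == H else 0)
    for px, py, pw, ph in placed:
        if x + w == px or px + pw == x:          # vertical edge contact
            t += max(0, min(y + h, py + ph) - max(y, py))
        if y + h == py or py + ph == y:          # horizontal edge contact
            t += max(0, min(x + w, px + pw) - max(x, px))
    return t

def pack(rects, W, H):
    placed = []
    for w, h in rects:                            # fixed order for simplicity
        corners = {(0, 0)} | {(px + pw, py) for px, py, pw, ph in placed} \
                           | {(px, py + ph) for px, py, pw, ph in placed}
        feasible = [(x, y, w, h) for x, y in corners
                    if x + w <= W and y + h <= H and not overlaps((x, y, w, h), placed)]
        if feasible:                              # occupy the corner with max contact
            placed.append(max(feasible, key=lambda r: touch_len(r, placed, W, H)))
    return placed

print(pack([(3, 2), (2, 2), (1, 2)], 5, 4))
```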

Journal Article
TL;DR: RiMOM treats the mapping problem as a decision problem, formalizes mapping discovery as risk minimization, and can outperform existing methods in terms of precision and recall.
Abstract: Interoperability over distributed ontologies is one of the six challenges for the Semantic Web, and ontology mapping is the key to achieving this interoperability. In this paper, based on Bayesian decision theory, the authors propose an approach called RiMOM (Risk Minimization based Ontology Mapping) to automatically discover mappings between ontologies. RiMOM treats the mapping problem as a decision problem and formalizes mapping discovery as risk minimization. Based on multiple strategies, RiMOM deals not only with 1:1 mappings but also with n:1 mappings. Experiments on several public data sets show that RiMOM can outperform existing methods in terms of precision and recall.

Journal Article
TL;DR: Following Tamura's texture model, this paper puts forward an image texture semantic description framework based on linguistic variables that is of strong significance for reducing the "semantic gap" between low-level visual features and high-level semantics.
Abstract: Following Tamura's texture model, this paper puts forward an image texture semantic description framework based on linguistic variables. The authors construct the mapping from low-level visual features to high-level semantic features through a genetic programming algorithm, and propose a fuzzy retrieval algorithm based on the extracted semantic features. The experimental results show that the approach not only has excellent retrieval precision but also accords well with human visual perception. The approach is of strong significance for reducing the "semantic gap" between low-level visual features and high-level semantics.

Journal Article
TL;DR: Several algorithms are developed to check component-based system designs against scenario-based specifications for existential consistency and for mandatory consistency, including forward, backward and bidirectional consistency.
Abstract: Component-based system design is becoming more and more popular in software engineering, and formally checking important behavioral properties in the design phase is an effective way to improve system reliability. In this paper, the authors consider the problem of checking component-based system designs against scenario-based specifications. Specifically, the authors use interface automata networks to model component-based system designs, consisting of a set of interface automata synchronized by shared actions, while the scenario-based specifications are given as UML sequence diagrams. By investigating the reachability graph of the state space of the interface automata networks, the authors develop several algorithms to check existential consistency and mandatory consistency, including forward, backward and bidirectional consistency.
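
As an illustration of the reachability-graph idea rather than the paper's algorithms, the sketch below breadth-first searches the synchronized product of two hypothetical interface automata and checks existential consistency, i.e. whether some run fires the scenario's actions in order:

```python
from collections import deque

# Each automaton is (initial state, transitions {state: [(action, next_state), ...]});
# components synchronize on shared actions and move independently otherwise.
def check_scenario(automata, shared, scenario):
    init = tuple(a[0] for a in automata)
    frontier, seen = deque([(init, 0)]), {(init, 0)}
    while frontier:
        states, matched = frontier.popleft()
        if matched == len(scenario):
            return True                       # every scenario action was fired in order
        for act in {t[0] for i, s in enumerate(states) for t in automata[i][1].get(s, [])}:
            movers = [i for i, s in enumerate(states)
                      for t in automata[i][1].get(s, []) if t[0] == act]
            if act in shared and len(movers) < 2:
                continue                      # a shared action needs both parties ready
            nxt = list(states)
            for i in set(movers):
                nxt[i] = next(ns for a2, ns in automata[i][1][states[i]] if a2 == act)
            m = matched + 1 if matched < len(scenario) and scenario[matched] == act else matched
            if (tuple(nxt), m) not in seen:
                seen.add((tuple(nxt), m))
                frontier.append((tuple(nxt), m))
    return False

client = ('c0', {'c0': [('req', 'c1')], 'c1': [('resp', 'c0')]})
server = ('s0', {'s0': [('req', 's1')], 's1': [('resp', 's0')]})
print(check_scenario([client, server], {'req', 'resp'}, ['req', 'resp']))   # True
```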