
Showing papers in "Journal of Computer Applications in 2012"


Journal Article
TL;DR: Based on the conceptual model of situational awareness, three main problems in network security situational awareness are discussed: extraction of the elements of the network security situation, comprehension of the network security situation, and projection of the future situation.
Abstract: Research on network security Situation Awareness (SA) is important for improving the abilities of network detection, emergency response, and prediction of network security trends. In this paper, based on the conceptual model of situational awareness, three main problems in network security situational awareness were discussed: extraction of the elements of the network security situation, comprehension of the network security situation, and projection of the future situation. The core issues to be resolved and the major algorithms, along with the advantages and disadvantages of each, were examined. Finally, the open issues and challenges for network security situation awareness, concerning both theory and implementation in the near future, were presented.

23 citations


Journal Article
TL;DR: By analyzing their courses of development, history and interrelationship, and comparing their applications in daily life, the authors argue that vehicular CPS can be regarded as an application of IoT in vehicular networks.
Abstract: This paper discussed the connotation and extension of two hot concepts, namely the Internet of Things (IoT) and vehicular CPS (Cyber-Physical System). By analyzing their courses of development, history and interrelationship, and comparing their applications in daily life, the authors argue that vehicular CPS can be regarded as an application of IoT in vehicular networks. Furthermore, the authors look into the bright future of services and applications resulting from vehicular CPS. With this in mind, the authors discussed the key technologies for implementing vehicular CPS and introduced an initiative. The authors also discussed in detail the academic research at home and abroad in the areas of the Internet of Things and vehicular CPS.

14 citations


Journal Article
TL;DR: The experimental results show that the fall detection system raises an alarm 100% of the time when the human body falls fore-and-aft or laterally, or rises rapidly after falling, and can achieve the detection level of normal human falls.
Abstract: In order to satisfy the requirements of elderly care, and to lessen the physical and psychological harm caused by falling, the authors presented a fall detection system using a tri-axis accelerometer. The approach is based on attitude estimation from the tri-axis accelerometer for judging whether a fall has occurred. In addition, considering the impact of noise and the high accuracy required of a fall detection system, the Kalman filtering algorithm was used to improve the system's reliability. The experimental results show that the system raises an alarm 100% of the time when the human body falls fore-and-aft or laterally, or rises rapidly after falling, and can achieve the detection level of normal human falls.

13 citations
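The core judgment in such a system, a near-free-fall dip in acceleration magnitude followed by an impact spike, can be sketched as a simple threshold detector. This is an illustrative toy, not the paper's design: the threshold values (`free_fall_g`, `impact_g`) are assumed, and the Kalman-filtered attitude estimation the authors use is omitted.

```python
import math

def accel_magnitude(ax, ay, az):
    """Resultant acceleration from a tri-axis accelerometer, in g."""
    return math.sqrt(ax * ax + ay * ay + az * az)

def detect_fall(samples, free_fall_g=0.4, impact_g=2.5):
    """Flag a fall when a near-free-fall dip is followed by an impact spike.

    `samples` is a sequence of (ax, ay, az) tuples in g; the two
    thresholds are illustrative values, not the paper's calibration.
    """
    falling = False
    for ax, ay, az in samples:
        m = accel_magnitude(ax, ay, az)
        if m < free_fall_g:             # body in (near) free fall
            falling = True
        elif falling and m > impact_g:  # impact following the drop
            return True
    return False

# A synthetic trace: rest (1 g) -> free fall -> impact -> rest
trace = [(0, 0, 1.0), (0, 0, 0.2), (0.5, 0.5, 2.8), (0, 0, 1.0)]
print(detect_fall(trace))  # -> True
```

A real deployment would first smooth the raw samples (e.g. with the Kalman filter the abstract mentions) before thresholding.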


Journal Article
TL;DR: The experimental results show that the method for calculating the relative similarity between the concepts of ontology is effective.
Abstract: This paper represented the information of each vertex in an ontology graph as a vector. According to the structure of the ontology graph, the vertices were divided into k parts. Vertices were chosen from each part, along with a ranking loss function. A k-partite ranking learning algorithm was used to obtain the optimal ranking function, mapping each vertex of the ontology structure graph to a real number; the relative similarities of concepts were then calculated by comparing the differences between these real numbers. The experimental results show that the method for calculating the relative similarity between ontology concepts is effective.

12 citations


Journal Article
TL;DR: A new protocol was introduced in which the shared secret message was calculated by the private key and temporary secret information of users of the protocol, and its security was also proved in standard model.
Abstract: Wang et al. (WANG SHENG-BAO, CAO ZHEN-FU, DONG XIAO-LEI. Provably secure identity-based authenticated key agreement protocols in the standard model. Chinese Journal of Computers, 2007, 30(10): 1842-1854) proposed an ID-based Authenticated Key Agreement (IDAKA) protocol which was proved secure under the standard model but lacked the attribute of Private Key Generator (PKG) forward security. In order to remedy the flaw, a new protocol was introduced in which the shared secret message was calculated from the private keys and temporary secret information of the protocol's users, and its security was also proved in the standard model. Compared with known protocols, the new protocol is more efficient. Additionally, a method of jointly generating the private key by the PKG and the user was proposed. The private key of the user was calculated from the main secret key of the system and secret information provided by the user. This effectively solves the problem of PKG forward security in ID-based authenticated key agreement protocols.

11 citations


Journal Article
TL;DR: This paper presented a Hadoop-based storage architecture for massive numbers of MP3 files that makes full use of the MP3 files' rich metadata, and the experimental results show that the approach achieves better performance.
Abstract: With MP3 as the de facto standard for digital music, the number of files is quite large and user access requirements are growing rapidly. How to effectively store and manage vast numbers of MP3 files to provide a good user experience has become a matter of great concern. The emergence of Hadoop provides new ideas. However, because Hadoop itself is not suitable for handling massive numbers of small files, this paper presented a Hadoop-based storage architecture for massive numbers of MP3 files, making full use of the MP3 files' rich metadata. A classification algorithm in the pre-processing module merged small files into sequence files, and the introduction of an efficient indexing mechanism served as a good solution to the small-file problem. The experimental results show that the approach achieves better performance.

9 citations
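The small-file strategy the abstract describes, merging many small files into one large file and keeping an index for random access, can be illustrated with a minimal in-memory sketch. The real system packs HDFS sequence files; the file names and contents below are invented for illustration.

```python
import io

def merge_small_files(files):
    """Pack many small files into one blob plus an offset index.

    `files` maps name -> bytes; a simplified stand-in for merging
    small files into a Hadoop SequenceFile with an index.
    """
    blob = io.BytesIO()
    index = {}                      # name -> (offset, length)
    for name, data in files.items():
        index[name] = (blob.tell(), len(data))
        blob.write(data)
    return blob.getvalue(), index

def read_file(blob, index, name):
    """Random access into the packed blob via the index."""
    offset, length = index[name]
    return blob[offset:offset + length]

songs = {"a.mp3": b"ID3aaa", "b.mp3": b"ID3bbbb"}
blob, idx = merge_small_files(songs)
print(read_file(blob, idx, "b.mp3"))  # -> b'ID3bbbb'
```

The point of the index is that reading one small file no longer touches a per-file NameNode entry, which is the HDFS bottleneck the paper targets.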


Journal Article
TL;DR: Role-mapping thresholds and domain thresholds were proposed to address the differing impact of the same role in different domains and the differing levels of inter-domain trust, organizing cross-domain access control at a finer granularity and thus improving the security of inter-domain interoperability.
Abstract: Role-based access control policy mainly uses role mapping to achieve inter-domain interoperability. Role mapping does not consider the extent of the impact of the same role in different domains, or the different levels of mutual trust between one domain and another. Role-mapping thresholds and domain thresholds were therefore proposed to address the differing impact of the same role in different domains and the differing levels of inter-domain trust. Cross-domain access was thereby organized at a finer granularity, further improving the security of inter-domain interoperability.

9 citations


Journal Article
TL;DR: The proposed algorithm can compensate for the lack of information from AIS equipment and accurately predict the location of a vessel, so that the accuracy and reliability of an intelligent supporting command system can be ensured in a controlled waterway.
Abstract: Due to the lack of information from Automatic Identification System (AIS) equipment, the location of a vessel cannot be accurately judged by an intelligent supporting command system based on AIS, making it difficult to issue accurate traffic signals. Meanwhile, due to the narrow and winding features of a controlled waterway, it is difficult for the traditional Kalman filter to accurately predict the track of a moving vessel. In this situation, real-time estimation of the system noise in the Kalman filter algorithm was proposed to increase the accuracy of track prediction for moving vessels. Simulation analysis was carried out on the tracking performance of the traditional Kalman filter and the improved Kalman filter. The results indicate that the proposed algorithm can compensate for the lack of information from AIS equipment and accurately predict the location of a vessel, so that the accuracy and reliability of the intelligent supporting command system can be ensured in a controlled waterway.

9 citations


Journal Article
TL;DR: The experimental results prove that the overall efficiency of coarse-grained evaluation is higher during deep-layer searches, and that dynamic evaluation by the evaluation function is more reasonable.
Abstract: Concerning the speed bottleneck of gobang machine play when relying on the configuration of the stones to evaluate game states, this paper proposed a multi-layer evaluation-function method combining the identification granularity of the stones' configuration with the search depth. The experimental results prove that the overall efficiency of coarse-grained evaluation was higher during deep-layer searches; multi-layer judging of the stones' configuration could clearly accelerate the evaluation; if move-generating functions were introduced into the rapid evaluation, certain branches of the game tree could be pruned in advance more efficiently; and gobang game-tree searches also benefited from floating the values of non-critical stone configurations to balance the search depth. Therefore, dynamic evaluation by the evaluation function is more reasonable.

8 citations


Journal Article
Xiong Lei1
TL;DR: The experimental results show that the new algorithm based on matrix factorization can improve the recommendation accuracy effectively, and solve the problems of data sparsity and new user.
Abstract: Concerning the difficulties of data sparsity and the new-user problem in many collaborative recommendation algorithms, a new collaborative recommendation algorithm based on matrix factorization and user nearest neighbors was proposed. To guarantee prediction accuracy for new users, a user nearest-neighbor model based on user data and profile information was used. Meanwhile, large data sets and matrix sparsity significantly increase time and space complexity, so matrix factorization was introduced to alleviate these data problems and improve prediction accuracy. The experimental results show that the new algorithm can improve recommendation accuracy effectively and solve the problems of data sparsity and new users.

8 citations
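A minimal matrix factorization trained by stochastic gradient descent, the generic technique the abstract builds on, can be sketched as follows. The hyperparameters and toy ratings are illustrative assumptions, and the paper's user nearest-neighbor component is omitted.

```python
import random

def train_mf(ratings, n_users, n_items, k=2, lr=0.02, reg=0.01, epochs=500):
    """Factor the rating matrix into user and item latent vectors via SGD.

    `ratings` is a list of (user, item, rating) triples; this is a
    generic MF sketch, not the paper's exact hybrid model.
    """
    random.seed(0)
    P = [[random.uniform(0, 0.1) for _ in range(k)] for _ in range(n_users)]
    Q = [[random.uniform(0, 0.1) for _ in range(k)] for _ in range(n_items)]
    for _ in range(epochs):
        for u, i, r in ratings:
            pred = sum(P[u][f] * Q[i][f] for f in range(k))
            err = r - pred
            for f in range(k):
                pu, qi = P[u][f], Q[i][f]
                P[u][f] += lr * (err * qi - reg * pu)   # gradient step on user factor
                Q[i][f] += lr * (err * pu - reg * qi)   # gradient step on item factor
    return P, Q

def predict(P, Q, u, i):
    return sum(pf * qf for pf, qf in zip(P[u], Q[i]))

data = [(0, 0, 5.0), (0, 1, 1.0), (1, 0, 4.0), (1, 1, 1.0)]
P, Q = train_mf(data, n_users=2, n_items=2)
print(round(predict(P, Q, 0, 0), 1))  # close to the observed rating 5.0
```

Because unknown cells are simply skipped during training, the method sidesteps the sparsity problem the abstract highlights.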


Journal Article
TL;DR: The experimental results on the MovieLens data set demonstrate that the number of high-rating items hit by WNBI increases noticeably compared with NBI, especially when the recommendation list is shorter than 20 items.
Abstract: In the Network-Based Inference (NBI) algorithm, the weight of the edge between user and item is ignored; therefore, items with high ratings are not given priority in recommendation. In order to solve this problem, a Weighted Network-Based Inference (WNBI) algorithm was proposed. The proposed algorithm weights the edge between user and item with the item's rating and allocates resources according to the ratio of each edge's weight to the total weight of the node's edges, so that high-rating items can be recommended with priority. The experimental results on the MovieLens data set demonstrate that the number of high-rating items hit by WNBI increases noticeably in contrast with NBI; especially when the length of the recommendation list is shorter than 20, the numbers of hit items and hit high-rating items both increase.
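The two-step resource spreading with rating-weighted edges can be sketched directly. This is a simplified reading of WNBI, and the toy ratings are invented for illustration.

```python
def wnbi_scores(ratings, target_user):
    """Weighted network-based inference (simplified sketch).

    Resource spreads item -> user -> item, split in proportion to
    edge (rating) weight instead of equally as in plain NBI.
    `ratings` maps user -> {item: rating}.
    """
    # Step 0: each item collected by the target user holds one unit.
    resource = {item: 1.0 for item in ratings[target_user]}
    # Step 1: items pass resource to users, proportional to edge weight.
    user_res = {}
    for item, res in resource.items():
        edges = [(u, r[item]) for u, r in ratings.items() if item in r]
        total = sum(w for _, w in edges)
        for u, w in edges:
            user_res[u] = user_res.get(u, 0.0) + res * w / total
    # Step 2: users pass resource back to their items the same way.
    scores = {}
    for u, res in user_res.items():
        total = sum(ratings[u].values())
        for item, w in ratings[u].items():
            scores[item] = scores.get(item, 0.0) + res * w / total
    return scores

ratings = {
    "u1": {"A": 5, "B": 1},
    "u2": {"A": 4, "C": 5},
}
scores = wnbi_scores(ratings, "u1")
print(sorted(scores, key=scores.get, reverse=True))  # -> ['A', 'B', 'C']
```

Note that the spreading conserves resource (the scores sum to the initial resource), so only the proportional split by rating distinguishes this from plain NBI.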

Journal Article
Huang Ke-kun1
TL;DR: An improved Set Partitioning In Hierarchical Trees algorithm was proposed, based on scanning first the coefficients surrounded by more significant coefficients; it improves PSNR and the subjective visual experience compared with SPIHT.
Abstract: In order to obtain better compression of image edges, an improved Set Partitioning In Hierarchical Trees (SPIHT) algorithm was proposed, based on scanning first the coefficients surrounded by more significant coefficients. Coefficients or sets were sorted according to the number of surrounding significant coefficients before being coded, and previously significant coefficients were refined as soon as the sets with any surrounding significant coefficients had been scanned. The scanning order was determined adaptively and needed no extra storage. The method can code more significant coefficients at a given compression ratio. The experimental results show that the method improves PSNR and the subjective visual experience compared with SPIHT.

Journal Article
TL;DR: The experimental results show that this improved collaborative filtering algorithm can efficiently solve the problem of similarity measurement inaccuracy caused by the extreme sparsity of user rating data, and provide better recommendation results than traditional collaborative filtering algorithms.
Abstract: The user rating data in traditional collaborative filtering recommendation algorithms are extremely sparse, which results in poor similarity measurement and poor recommendation quality. In view of this problem, this paper presented an improved collaborative filtering algorithm based on item attributes and cloud-model filling. The algorithm introduced a new similarity measurement method using data filling based on the cloud model together with the similarity of item attributes. The new method computed the rating similarity by applying traditional similarity measures to the filled matrix, computed the attribute similarity from the item attributes, and then obtained the final similarity using a weighting factor. The experimental results show that this method can efficiently solve the problem of inaccurate similarity measurement caused by the extreme sparsity of user rating data, and provide better recommendations than traditional collaborative filtering algorithms.

Journal Article
TL;DR: An improved shuffled frog leaping algorithm is proposed that more than doubles accuracy, convergence speed and success rate, and demonstrates better optimization capability, especially on high-dimensional complex optimization problems.
Abstract: To enhance the performance of the Shuffled Frog Leaping Algorithm (SFLA) in solving optimization problems, this paper proposed an improved shuffled frog leaping algorithm. By adding a mutation operator to the original algorithm, the improved algorithm regulated the scale of the mutation operator via a fuzzy controller, adjusting it dynamically over the search range of the solution space according to the phase and candidate-solution distribution of the evolutionary process. Simulation results on four typical benchmark functions show that, compared with the basic shuffled frog leaping algorithm and a known improved variant, the proposed algorithm more than doubles accuracy, convergence speed and success rate, and demonstrates better optimization capability, especially on high-dimensional complex optimization problems.

Journal Article
TL;DR: The results of simulation in CloudSim environment prove that using the proposed PM-LB algorithm can obtain better load-balancing performance and higher resource utilization.
Abstract: Regarding virtual machine deployment issues in cloud computing, the Performance Matching-Load Balancing (PM-LB) algorithm for virtual machine deployment was proposed. The performance standardization of the virtual infrastructure was described with a performance vector. The matching vector was obtained by calculating the relative vector distance between the virtual machine and the servers; a comprehensive analysis of the matching vector and the load-balancing vector then yielded the deployment result. The results of simulation in the CloudSim environment prove that the proposed algorithm can obtain better load-balancing performance and higher resource utilization.

Journal Article
TL;DR: An improvement to the OpenCV watershed was proposed, replacing the absolute difference between adjacent pixels with the signed difference; the experiment shows that the improved watershed algorithm has better segmentation performance than the original algorithm at the same processing speed.
Abstract: Watershed is an image segmentation algorithm based on morphology, which can determine the boundaries of connected regions efficiently and effectively. Applying the marker watershed algorithm to cell image segmentation solves the problem of segmenting adhering cells. When implementing this algorithm, a flaw in the watershed algorithm in OpenCV was found. The cause of this bug is an incorrect description of the difference between two adjacent pixels. An improvement to the OpenCV watershed was proposed, replacing the absolute difference between adjacent pixels with the (signed) difference between them. The experiment shows that the improved OpenCV watershed algorithm has better segmentation performance than the original algorithm at the same processing speed.

Journal Article
MU You-jing1
TL;DR: In this paper, the authors studied the relation between several possibility degree formulas and proposed a possibility degree matrices-based method that aimed to objectively determine the weights of criteria in multiple attribute decision-making with intervals.
Abstract: The authors studied the relations between several possibility-degree formulas, and proposed a possibility-degree-matrix-based method that aims to objectively determine the weights of criteria in Multiple Attribute Decision Making (MADM) with intervals. Each pair of interval values belonging to the same attribute in a decision matrix was compared to construct the corresponding possibility-degree matrices, whose priority vectors were then used to convert the interval-valued decision matrix into a matrix of precise numbers. In this way, the uncertainty of determining criteria weights in MADM with intervals could be converted into a certainty that is easier to handle; with the attribute weights obtained, the possibility-degree method for ranking interval numbers was again used to derive the priorities of the alternatives. Two numerical examples were given to illustrate the proposed method and examine its feasibility and validity. Finally, a necessary discussion was made of the conversion from uncertainty to certainty in MADM with intervals, and of some potential problems arising from it.
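One widely used possibility-degree formula for comparing two interval numbers is P(a >= b) = min{max{(aU - bL) / (len(a) + len(b)), 0}, 1}. The paper compares several such formulas; the sketch below implements this representative one, not necessarily the paper's preferred variant.

```python
def possibility_degree(a, b):
    """P(a >= b) for intervals a = (aL, aU) and b = (bL, bU)."""
    aL, aU = a
    bL, bU = b
    la, lb = aU - aL, bU - bL
    if la + lb == 0:                  # both degenerate (exact numbers)
        return 1.0 if aL >= bL else 0.0
    return min(max((aU - bL) / (la + lb), 0.0), 1.0)

a, b = (2.0, 6.0), (3.0, 5.0)
print(possibility_degree(a, b), possibility_degree(b, a))  # the two sum to 1
```

The complementarity P(a >= b) + P(b >= a) = 1 is what makes the pairwise comparisons assemble into a consistent possibility-degree matrix.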

Journal Article
Meng Shui-jin1
TL;DR: The experimental results show that the improved Canny algorithm has not only good noise resistance but also very good precision.
Abstract: The traditional Canny operator uses a global threshold method, but when the gray levels of the input image's background and foreground vary widely, the global threshold loses some weak edges. Concerning this issue, an improved adaptive Canny operator was put forward. Firstly, the image was divided into blocks according to the gradient variance. Secondly, the threshold of each sub-block was obtained by the Otsu method. Then, the threshold matrix was obtained by interpolation. Finally, an improved edge-connection algorithm was proposed to extract edges using the threshold matrix. The experimental results show that the improved Canny algorithm has not only good noise resistance but also very good precision.
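The block-wise Otsu step can be sketched as follows. The gradient-variance partitioning and the interpolation of the threshold matrix are omitted, and the tiny test image is invented for illustration.

```python
def otsu_threshold(pixels):
    """Otsu's method: the threshold maximizing between-class variance."""
    hist = [0] * 256
    for p in pixels:
        hist[p] += 1
    total = len(pixels)
    sum_all = sum(i * h for i, h in enumerate(hist))
    best_t, best_var = 0, -1.0
    w0 = sum0 = 0
    for t in range(256):
        w0 += hist[t]                 # pixels at or below t
        if w0 == 0:
            continue
        w1 = total - w0               # pixels above t
        if w1 == 0:
            break
        sum0 += t * hist[t]
        mu0, mu1 = sum0 / w0, (sum_all - sum0) / w1
        var_between = w0 * w1 * (mu0 - mu1) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, t
    return best_t

def blockwise_thresholds(image, block):
    """One Otsu threshold per block of a row-major grayscale image
    (the interpolation into a full threshold matrix is omitted)."""
    h, w = len(image), len(image[0])
    return [[otsu_threshold([image[y][x]
                             for y in range(by, min(by + block, h))
                             for x in range(bx, min(bx + block, w))])
             for bx in range(0, w, block)]
            for by in range(0, h, block)]

img = [[10, 200, 50, 220],
       [12, 198, 52, 222]]
print(blockwise_thresholds(img, 2))  # -> [[12, 52]]
```

Interpolating between these per-block thresholds yields the pixel-wise threshold matrix that the improved Canny operator then applies.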

Journal Article
TL;DR: A multi-feature adaptive fusion scheme based on color, shape and texture was proposed, and the improved algorithm has higher tracking accuracy than the traditional algorithm in scenes with illumination variation or similar backgrounds.
Abstract: The Camshift algorithm based on a color kernel can effectively track objects against a simple background, but it is easily disturbed by illumination variation or similarly colored objects in the background. To improve the algorithm's robustness to illumination variation, a multi-feature adaptive fusion scheme based on color, shape and texture was proposed. Further improvements were made by modifying the feature histogram and setting a reasonable search region to handle similar backgrounds. The experimental results show that the improved algorithm has higher tracking accuracy than the traditional algorithm in scenes with illumination variation or similar backgrounds.

Journal Article
TL;DR: Compared with Particle Swarm Optimization (PSO), the method given in this paper shows up to two times faster convergence rate, and for some subjects, the new method shows ten times higher precision.
Abstract: It is difficult to estimate the parameters of software reliability models, since most of them are non-linear. The most widely used methods for estimating the parameters of software reliability models were summarized, and a new approach based on the ant colony algorithm was proposed. Experiments with three typical models, the G-O model, the Weibull model and the M-O model, show that the algorithm has good applicability, and the results demonstrate that the proposed method solves the non-convergence problem of traditional methods. Compared with Particle Swarm Optimization (PSO), the proposed method shows up to two times faster convergence, and for some subjects ten times higher precision.

Journal Article
TL;DR: Classification experiments demonstrate that, with the same number of features, classification accuracy of the proposed algorithm is obviously higher than the traditional approaches.
Abstract: Traditional feature selection algorithms are limited to single-label data. Concerning this problem, a multi-label ReliefF algorithm was proposed for multi-label feature selection. For multi-label data, based on label co-occurrence, the algorithm first assumed equal label contribution values. Combined with three new methods for calculating label contributions, the updating formula for feature weights was improved, and finally a discriminative feature subset was selected from the original features. Classification experiments demonstrate that, with the same number of features, the classification accuracy of the proposed algorithm is clearly higher than that of traditional approaches.
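The weight-update idea that ReliefF generalizes can be shown with a basic single-label Relief sketch; the multi-label extension via label co-occurrence is the paper's contribution and is not reproduced here, and the toy data are invented.

```python
def relief_weights(X, y):
    """Basic Relief weight update (single-label, two-class sketch):
    features that separate the near-miss from the near-hit gain weight."""
    n, d = len(X), len(X[0])
    w = [0.0] * d

    def dist(a, b):
        return sum(abs(a[f] - b[f]) for f in range(d))

    for i in range(n):
        x, label = X[i], y[i]
        hits = [j for j in range(n) if j != i and y[j] == label]
        misses = [j for j in range(n) if y[j] != label]
        if not hits or not misses:
            continue
        near_hit = min(hits, key=lambda j: dist(x, X[j]))
        near_miss = min(misses, key=lambda j: dist(x, X[j]))
        for f in range(d):
            # reward separation from the other class, punish spread within the class
            w[f] += abs(x[f] - X[near_miss][f]) - abs(x[f] - X[near_hit][f])
    return w

# Feature 0 tracks the class; feature 1 is noise.
X = [[0.0, 0.3], [0.1, 0.9], [1.0, 0.2], [0.9, 0.8]]
y = [0, 0, 1, 1]
w = relief_weights(X, y)
print(w[0] > w[1])  # -> True
```

ReliefF replaces the single nearest hit/miss with k nearest neighbors per class; the multi-label variant in the abstract further weights each miss by a label-contribution term.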

Journal Article
TL;DR: In order to deal with decision problems where the information is intuitionistic fuzzy and the attribute weights are unknown, a decision-making method based on intuitionistic fuzzy entropy and a score function was proposed.
Abstract: In order to deal with decision problems where the information is intuitionistic fuzzy and the attribute weights are unknown, a decision-making method based on intuitionistic fuzzy entropy and a score function was proposed. Firstly, a new concept of intuitionistic fuzzy entropy was presented to measure the intuitionism and fuzziness of intuitionistic fuzzy sets, and its properties were discussed. Secondly, to reduce the effect of uncertain information on decisions, a programming model incorporating intuitionistic fuzzy entropy was constructed to determine the attribute weights. Meanwhile, in view of membership, non-membership and hesitancy degree, correlation coefficients between the objects of the universe and the ideal object were constructed, and according to the decision makers' attitude, the optimal decision was obtained by defining the score function. Finally, the article proposed a multiple attribute decision-making method for intuitionistic fuzzy information, and the feasibility and effectiveness of the method were verified through a case study of candidate evaluation.
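The standard score function for intuitionistic fuzzy values, which an attitude-dependent score like the paper's builds on, is s = mu - nu, with hesitancy pi = 1 - mu - nu. A minimal sketch with invented candidate values:

```python
def score(mu, nu):
    """Classic score function of an intuitionistic fuzzy value
    (membership mu, non-membership nu). The paper defines its own
    attitude-dependent score; this is the standard baseline."""
    return mu - nu

def hesitancy(mu, nu):
    """Hesitancy degree pi = 1 - mu - nu."""
    return 1.0 - mu - nu

candidates = {"x1": (0.7, 0.2), "x2": (0.5, 0.4), "x3": (0.8, 0.1)}
best = max(candidates, key=lambda c: score(*candidates[c]))
print(best)  # -> x3
```

An attitude-dependent score would additionally split the hesitancy pi between mu and nu according to the decision maker's optimism, which is the refinement the abstract describes.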

Journal Article
TL;DR: A new method based on the watershed algorithm is proposed to raise the segmentation accuracy of crop disease leaf images, and the experimental results indicate that disease spots can be separated precisely from the crop leaf images.
Abstract: A new method based on the watershed algorithm was proposed to raise the segmentation accuracy of crop disease leaf images. First, distance transformation and watershed segmentation were applied to the binary crop disease leaf images to obtain the background markers, the preliminary foreground markers were generated by extracting the regional minima from the reconstructed gradient images, and some false foreground markers were then eliminated by further filtering. Next, both background and foreground markers were imposed on the gradient image by the minima-imposition algorithm. Finally, the watershed transformation was carried out on the modified gradient image. Many cucumber disease leaf images were segmented effectively using this method. The experimental results indicate that disease spots can be separated precisely from the crop leaf images; moreover, the segmentation results are not influenced by leaf texture, and accuracy exceeds 90 percent, so the method has practical value.

Journal Article
Liu Bo1
TL;DR: The simulation experimental results suggest that the proposed algorithm could solve the MOP-MCSC problem efficiently and effectively with a better performance than conventional particle swarm optimization.
Abstract: To cope with the Multi-objective Programming on Manufacturing Cloud Service Composition (MOP-MCSC) problem in a cloud manufacturing (CMfg) system, a mathematical model and a solution algorithm were proposed and studied. Firstly, inspired by resource service composition technology in manufacturing grids, a QoS-aware MOP-MCSC model in the CMfg system was explored and described. Secondly, by analyzing the characteristics of manufacturing cloud services according to manufacturing domain knowledge, an eight-dimensional QoS evaluation criterion with corresponding quantitative calculation formulas was defined, and the QoS expression of a manufacturing cloud service was formulated. Lastly, the MOP-MCSC model was built, and an Adaptive Mutation Particle Swarm Optimization (AMPSO) algorithm was designed to solve it. The simulation results suggest that the proposed algorithm can solve the MOP-MCSC problem efficiently and effectively, with better performance than conventional particle swarm optimization.

Journal Article
TL;DR: Pruning and compression strategies were developed through theoretical analysis and verification to decrease the search space and the scale of the FP-tree, and the new algorithm was compared with the NHTFPG and FpMAX algorithms in terms of accuracy and efficiency.
Abstract: In existing algorithms, the conditional pattern bases of all frequent 1-itemsets in the FP-tree must be saved in order to reduce repeated traversals of paths in the FP-tree. Concerning this problem, the new algorithm improves the FP-tree data structure so that only the conditional pattern bases constituted by the items on the path from each leaf node's parent to the root are saved, reducing the storage space for conditional pattern bases. After studying the search space and the data representation of algorithms for mining maximal frequent itemsets, pruning and compression strategies were developed through theoretical analysis and verification, decreasing the search space and the scale of the FP-tree. Finally, the new algorithm was compared with the NHTFPG and FpMAX algorithms in terms of accuracy and efficiency. The experimental results show that the new algorithm reduces the storage space required for conditional pattern bases by more than 50% compared with NHTFPG, and improves efficiency by 2 to 3 times over FpMAX.


Journal Article
TL;DR: A cloud computing access control model based on the Role-Based Access Control (RBAC) model is proposed that can not only ensure the security and reliability of the data stored in the cloud, but also guarantee a degree of elasticity and flexibility.
Abstract: Because of the virtualization and elasticity of cloud computing, access control in a cloud environment differs from traditional access control under fixed conditions: the properties of subjects and objects, and the roles of subjects, change dynamically. In consideration of these characteristics of cloud computing access control, a cloud computing access control model based on the Role-Based Access Control (RBAC) model was proposed. This model can not only ensure the security and reliability of the data stored in the cloud, but also guarantee a degree of elasticity and flexibility. Finally, the model's realization process was given, and the model was implemented in a health-care system based on a cloud computing environment.

Journal Article
TL;DR: Simulation results prove that the proposed scheduling algorithm not only shortens the makespan, but also decreases the power consumption effectively.
Abstract: Under the cloud computing environment, decreasing power consumption while shortening the makespan has become a significant problem in resource scheduling. Thus, this paper took makespan and power consumption as the optimization objectives, established a power-aware resource scheduling model, and improved the Non-dominated Sorting Genetic Algorithm II (NSGA-II) by adopting a special initialization and a learning algorithm to solve the power-aware scheduling problem. The simulation results prove that the proposed scheduling algorithm not only shortens the makespan but also decreases the power consumption effectively.

Journal Article
TL;DR: An improved Timing-sync Protocol for Sensor Networks (TPSN) algorithm based on hierarchy is proposed that enhances synchronization accuracy and reduces the network's energy consumption.
Abstract: Time synchronization is one of the key technologies for wireless sensor networks, and it plays an irreplaceable role in the operation and development of the entire network. This paper proposed an improved Timing-sync Protocol for Sensor Networks (TPSN) algorithm based on hierarchy. The algorithm uses level broadcast in the level-establishment phase and a combined active and passive two-way synchronization algorithm in the time-synchronization phase, obtaining relatively small packet and system-maintenance overhead. The improved algorithm also performs time-frequency offset correction to ensure node accuracy. The improved TPSN algorithm not only enhances synchronization accuracy but also reduces the network's energy consumption.

Journal Article
TL;DR: The experimental results show that the proposed face recognition method can increase the sample capacity, overcome the effects of illumination and pose, and raise the recognition rate; in comprehensive performance it is better than the contrast methods.
Abstract: In order to improve the practicability of face recognition technology, a new face recognition method was proposed that exploits facial mirror symmetry and Kernel Principal Component Analysis (KPCA). Firstly, the original images were decomposed by wavelet transform to obtain the low-frequency components. Then, odd-symmetry and even-symmetry samples were obtained by mirror transformation. Odd and even eigenvectors were extracted separately through KPCA and fused into composite features by an odd-even weighting factor. A nearest-neighbor classifier was used to classify the images. The proposed method was tested on the ORL face image database. The experimental results show that the method can increase the sample capacity, overcome the effects of illumination and pose, and raise the recognition rate; moreover, its comprehensive performance is better than that of the contrast methods.