
Showing papers in "Journal of Software Engineering and Applications in 2012"


Journal ArticleDOI
TL;DR: A comprehensive survey on real-time issues in virtualization for embedded systems, covering popular virtualization systems including KVM, Xen, L4 and others is presented.
Abstract: Virtualization has gained great acceptance in the server and cloud computing arena. In recent years, it has also been widely applied to real-time embedded systems with stringent timing constraints. We present a comprehensive survey on real-time issues in virtualization for embedded systems, covering popular virtualization systems including KVM, Xen, L4 and others.

81 citations


Journal ArticleDOI
TL;DR: The flex sensor is a device that accomplishes this task with a great degree of accuracy; the pick-and-place operation of a robotic arm can be efficiently controlled using microcontroller programming.
Abstract: Sensors play an important role in robotics. Sensors are used to determine the current state of the system. Robotic applications demand sensors with high degrees of repeatability, precision, and reliability. The flex sensor is such a device, accomplishing the above task with a great degree of accuracy. The pick-and-place operation of a robotic arm can be efficiently controlled using microcontroller programming. This work is designed as an educational concept, as robotic control has been an exciting and highly challenging research area in recent years.

55 citations


Journal ArticleDOI
TL;DR: The most important issues in this case study are that UX designers cannot collaborate closely with developers because they are working on multiple projects, and that UX designers cannot work up front because they are too busy with too many projects at the same time.
Abstract: We used existing studies on the integration of user experience design and agile methods as a basis to develop a framework for integrating UX and Agile. We performed a field study in an ongoing project of a medium-sized company in order to check whether the proposed framework fits the real world, and how some aspects of the integration of UX and Agile work in a real project. This led us to some conclusions situating contributions from practice to theory and back again. The framework is briefly described in this paper and consists of a set of practices, artifacts, and techniques derived from the literature. By combining theory and practice we were able to confirm some thoughts and identify some gaps—both in the company process and in our proposed framework—and turn our attention to new issues that need to be addressed. We believe that the most important issues in our case study are that UX designers cannot collaborate closely with developers because they are working on multiple projects, and that UX designers cannot work up front because they are too busy with too many projects at the same time.

53 citations


Journal ArticleDOI
TL;DR: The results indicate that: 1) complexity, size, cohesion and (to some extent) coupling were found to be significant predictors of the unit testing effort of classes and 2) multivariate regression models based on object-oriented design metrics are able to accurately predict the unit testing effort of classes.
Abstract: In this paper, we investigate empirically the relationship between object-oriented design metrics and the testability of classes. We address testability from the point of view of unit testing effort. We collected data from three open source Java software systems for which JUnit test cases exist. To capture the testing effort of classes, we used metrics to quantify the corresponding JUnit test cases. Classes were classified, according to the required unit testing effort, into two categories: high and low. In order to evaluate the relationship between object-oriented design metrics and the unit testing effort of classes, we used logistic regression methods. We used univariate logistic regression analysis to evaluate the individual effect of each metric on the unit testing effort of classes. Multivariate logistic regression analysis was used to explore the combined effect of the metrics. The performance of the prediction models was evaluated using Receiver Operating Characteristic analysis. The results indicate that: 1) complexity, size, cohesion and (to some extent) coupling were found to be significant predictors of the unit testing effort of classes and 2) multivariate regression models based on object-oriented design metrics are able to accurately predict the unit testing effort of classes.
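
As a rough illustration of the workflow this abstract describes—univariate and multivariate logistic regression evaluated with ROC analysis—here is a minimal Python sketch. The metric names (WMC, LOC, LCOM, CBO) and the synthetic data are placeholders, not the paper's dataset.

```python
# Sketch: univariate and multivariate logistic regression on OO design
# metrics, evaluated with ROC analysis. Data and metric columns are
# synthetic stand-ins for the paper's three Java systems.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n = 200
X = rng.normal(size=(n, 4))  # columns: WMC, LOC, LCOM, CBO (illustrative)
# binary target: high (1) vs low (0) unit testing effort
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=n) > 0).astype(int)

# Univariate analysis: individual effect of each metric.
for i, name in enumerate(["WMC", "LOC", "LCOM", "CBO"]):
    m = LogisticRegression().fit(X[:, [i]], y)
    auc = roc_auc_score(y, m.predict_proba(X[:, [i]])[:, 1])
    print(f"{name}: AUC = {auc:.2f}")

# Multivariate analysis: combined effect of all metrics.
m = LogisticRegression().fit(X, y)
print("combined model AUC =", round(roc_auc_score(y, m.predict_proba(X)[:, 1]), 2))
```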

43 citations


Journal ArticleDOI
TL;DR: A method is proposed to prioritize new test cases by calculating a risk exposure value for each requirement, analyzing risk items based on that calculation to evaluate the relevant test cases, and thereby determining test case priority from the evaluated values.
Abstract: Test case prioritization techniques have focused on regression testing, which is conducted on an already executed test suite. In fact, test case prioritization for new testing is also required. In this paper, we propose a method to prioritize new test cases by calculating a risk exposure value for each requirement and analyzing risk items based on that calculation to evaluate the relevant test cases, thereby determining test case priority from the evaluated values. Moreover, we demonstrate the effectiveness of our technique through empirical studies in terms of both APFD and fault severity.
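
A minimal sketch of the idea, assuming the common definition of risk exposure as probability times impact; the requirements, probabilities, and coverage map below are invented for illustration.

```python
# Sketch of risk-based prioritization for new test cases: risk exposure
# RE = P(fault) x impact per requirement, and each test case is scored
# by the exposure of the requirements it covers. All values are invented.
requirements = {
    "R1": {"probability": 0.8, "impact": 9},
    "R2": {"probability": 0.3, "impact": 7},
    "R3": {"probability": 0.6, "impact": 4},
}
coverage = {"TC1": ["R1"], "TC2": ["R2", "R3"], "TC3": ["R1", "R3"]}

exposure = {r: v["probability"] * v["impact"] for r, v in requirements.items()}
score = {tc: sum(exposure[r] for r in reqs) for tc, reqs in coverage.items()}

for tc in sorted(score, key=score.get, reverse=True):
    print(tc, round(score[tc], 2))  # execution order: highest risk first
```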

36 citations


Journal ArticleDOI
TL;DR: This paper proposes a performance measurement framework for Cloud Computing systems, which integrates software quality concepts from ISO 25010.
Abstract: Cloud Computing is an emerging technology for processing and storing very large amounts of data. Sometimes anomalies and defects affect part of the cloud infrastructure, resulting in a performance degradation of the cloud. This paper proposes a performance measurement framework for Cloud Computing systems, which integrates software quality concepts from ISO 25010.

31 citations


Journal ArticleDOI
TL;DR: The findings show that some issues and challenges that practitioners consider important are understudied, such as software-related issues and challenges pertaining to learning fast-evolving technologies.
Abstract: As cloud computing continues to gain momentum in the IT industry, more issues and challenges are being reported by academics and practitioners. In this paper, we aim to attain an understanding of the types of issues and challenges that have emerged over the past five years and identify gaps between the focus of the literature and what practitioners deem important. A systematic literature review as well as interviews with experts have been conducted to answer our research questions. Our findings suggest that researchers have mainly been focusing on issues related to security and privacy, infrastructure, and data management. Interoperability across different service providers has also been an active area of research. Despite the significant overlap between the topics discussed in the literature and the issues raised by practitioners, our findings show that some issues and challenges that practitioners consider important are understudied, such as software-related issues and challenges pertaining to learning fast-evolving technologies.

29 citations


Journal ArticleDOI
TL;DR: The experimental results show that the proposed algorithm could achieve an excellent compression ratio without losing data when compared to the standard compression algorithms.
Abstract: The development of multimedia and digital imaging has led to the high quantity of data required to represent modern imagery. This requires large disk space for storage and a long time for transmission over computer networks, both of which are relatively expensive. These factors prove the need for image compression. Image compression addresses the problem of reducing the amount of space required to represent a digital image, yielding a compact representation and thereby reducing image storage and transmission time requirements. The key idea is to remove the redundancy present within an image to reduce its size without affecting its essential information. We are concerned with lossless image compression in this paper. Our proposed approach is a mix of a number of already existing techniques and works as follows: first, we apply the well-known Lempel-Ziv-Welch (LZW) algorithm to the image in hand. The output of the first step is forwarded to the second step, where the Bose-Chaudhuri-Hocquenghem (BCH) error-correction and detection algorithm is used. To improve the compression ratio, the proposed approach applies the BCH algorithm repeatedly until "inflation" is detected. The experimental results show that the proposed algorithm achieves an excellent compression ratio without losing data when compared to standard compression algorithms.
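
The pipeline shape described above—LZW first, then a second stage re-applied until it stops paying off—can be sketched as follows. Since no stock library exposes the paper's BCH-based stage, zlib stands in for it here; everything else is a textbook LZW coder.

```python
# Sketch of the pipeline shape: LZW first, then a second stage applied
# repeatedly until "inflation" (the output stops shrinking). zlib is a
# stand-in for the paper's BCH-based stage, which has no stock routine.
import zlib

def lzw_compress(data: bytes) -> list[int]:
    """Textbook LZW: emit dictionary codes for byte sequences."""
    table = {bytes([i]): i for i in range(256)}
    w, out = b"", []
    for b in data:
        wc = w + bytes([b])
        if wc in table:
            w = wc
        else:
            out.append(table[w])
            table[wc] = len(table)
            w = bytes([b])
    if w:
        out.append(table[w])
    return out

def recompress_until_inflation(data: bytes) -> bytes:
    """Re-apply the second stage while it keeps shrinking the data."""
    while True:
        smaller = zlib.compress(data)
        if len(smaller) >= len(data):  # inflation detected: stop
            return data
        data = smaller

codes = lzw_compress(b"abababababababab" * 8)
payload = b"".join(c.to_bytes(2, "big") for c in codes)  # naive 16-bit packing
print(len(payload), "->", len(recompress_until_inflation(payload)))
```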

29 citations


Journal ArticleDOI
TL;DR: The application of the support vector machine with a mixture of kernels (SVM-MK) to design a text classification system that uses a 1-norm based objective function and adopts convex combinations of single-feature basic kernels.
Abstract: Recent studies have revealed that emerging modern machine learning techniques, such as SVM, are advantageous over statistical models for text classification. In this study, we discuss the application of the support vector machine with a mixture of kernels (SVM-MK) to design a text classification system. Differing from the standard SVM, the SVM-MK uses a 1-norm based objective function and adopts convex combinations of single-feature basic kernels. Only a linear programming problem needs to be solved, which greatly reduces the computational cost. More importantly, it is a transparent model, and the optimal feature subset can be obtained automatically. A real Chinese corpus from Fudan University is used to demonstrate the good performance of the SVM-MK.
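
A small sketch of the mixture-of-kernels construction: the combined kernel is a convex combination of single-feature basic kernels. The paper trains the model by solving a 1-norm linear program; a precomputed-kernel SVC stands in for that solver below, and the data and weights are invented.

```python
# Sketch of SVM-MK's kernel: K(x, z) = sum_d mu_d * k_d(x_d, z_d), with
# the weights mu on the simplex. An ordinary precomputed-kernel SVC is a
# stand-in for the paper's 1-norm LP training; data and mu are invented.
import numpy as np
from sklearn.svm import SVC

def mixture_kernel(X, Z, mu, gamma=1.0):
    """Convex combination of single-feature RBF kernels."""
    K = np.zeros((len(X), len(Z)))
    for d, w in enumerate(mu):
        diff = X[:, d:d+1] - Z[:, d:d+1].T  # pairwise single-feature gaps
        K += w * np.exp(-gamma * diff ** 2)
    return K

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))
y = (X[:, 0] - X[:, 3] > 0).astype(int)
mu = np.array([0.4, 0.1, 0.1, 0.3, 0.1])  # convex weights, sum to 1

clf = SVC(kernel="precomputed").fit(mixture_kernel(X, X, mu), y)
print("train accuracy:", clf.score(mixture_kernel(X, X, mu), y))
```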

27 citations


Journal ArticleDOI
TL;DR: NOP provides a new manner to conceive, structure, and execute software, which allows better performance, causal-knowledge organization, and entity decoupling than standard solutions based upon current paradigms.
Abstract: This paper presents a new programming paradigm named the Notification Oriented Paradigm (NOP) and analyses performance aspects of NOP programs by means of an experiment. NOP provides a new manner to conceive, structure, and execute software, which allows better performance, causal-knowledge organization, and entity decoupling than standard solutions based upon current paradigms. These paradigms are essentially the Imperative Paradigm (IP) and the Declarative Paradigm (DP). In short, DP solutions are considered easier to use than IP solutions thanks to the concept of high-level programming. However, they are considered slower to execute and less flexible to program than IP solutions. Moreover, both paradigms present similar drawbacks, such as causal-evaluation redundancies and strongly coupled entities, which decrease software performance and the feasibility of distributing processing. These problems exist due to an orientation toward a monolithic inference mechanism based upon sequential evaluation by means of searches over passive computational entities. NOP proposes another manner to structure software and make its inferences, based upon small, smart, and decoupled collaborative entities whose interactions happen by means of precise notifications. This paper discusses NOP as a paradigm and presents a comparison of NOP against IP. Performance is evaluated by means of IP and NOP programs for the same application, demonstrating NOP's advantage.
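
A toy sketch of the notification flow NOP describes: a changed fact notifies only the premises that reference it, and a rule is re-evaluated only when one of its premises flips, so there is no monolithic scan over passive entities. The entity names are illustrative, not NOP's actual API.

```python
# Toy sketch of notification-oriented inference: facts notify premises,
# premises notify rules, and evaluation happens only on state changes.
class Premise:
    def __init__(self, predicate):
        self.predicate, self.state, self.rules = predicate, False, []

    def notify(self, value):
        new = self.predicate(value)
        if new != self.state:          # only changes propagate
            self.state = new
            for rule in self.rules:
                rule.evaluate()

class Rule:
    def __init__(self, premises, action):
        self.premises, self.action = premises, action
        for p in premises:
            p.rules.append(self)       # rule subscribes to its premises

    def evaluate(self):
        if all(p.state for p in self.premises):
            self.action()

class Fact:
    def __init__(self):
        self.premises = []
    def set(self, value):
        for p in self.premises:        # notify interested premises only
            p.notify(value)

temp = Fact()
hot = Premise(lambda v: v > 30)
temp.premises.append(hot)
Rule([hot], lambda: print("cooling on"))
temp.set(35)    # -> "cooling on"
```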

25 citations


Journal ArticleDOI
TL;DR: The Bayesian Network is demonstrated on a dataset taken from the literature, whose quality is described by 9 attributes, and it is concluded that the Naïve-Bayes-based Bayesian network performs better than the other two techniques, with results comparable to the classification done in the literature.
Abstract: In this paper, we employed Naïve Bayes, Markov blanket and Tabu search to rank web services. The Bayesian Network is demonstrated on a dataset taken from the literature. The dataset consists of 364 web services whose quality is described by 9 attributes. Here, the attributes are treated as criteria to classify web services. From the experiments, we conclude that the Naïve-Bayes-based Bayesian network performs better than the other two techniques, with results comparable to the classification done in the literature.

Journal ArticleDOI
TL;DR: A new method for soft error detection using software redundancy (SEDSR), able to detect transient faults, is presented; its success in satisfying the existing tradeoff between fault coverage, performance and memory overheads is shown.
Abstract: This paper presents a new method for soft error detection using software redundancy (SEDSR) that is able to detect transient faults. Soft errors damage the control flow and data of programs, and designers usually use hardware-based solutions to handle them. Software-based techniques for soft error detection impose less cost and delay on systems and do not change their configuration; therefore, these methods are appropriate alternatives to hardware-based techniques. SEDSR has two separate parts for data error and control flow error detection. A fault injection method is used to compare SEDSR with previous methods in this field based on the new parameter of "Evaluation Factor", which takes into account fault coverage, memory and performance overheads. These parameters are important in real-time safety-critical applications. Experimental results on SPEC2000 and some traditional benchmarks show that SEDSR is much better than previous methods: SEDSR's evaluation factor is about 50% better than that of other methods. These results show its success in satisfying the existing tradeoff between fault coverage, performance and memory overheads.
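
The data-error-detection half of such software redundancy can be illustrated by duplicating a critical variable and comparing the copies before use. This shows the general technique only, not SEDSR's actual instrumentation, and in plain Python the copies can of course never diverge on their own.

```python
# Illustration of software-redundancy data-error detection: a critical
# variable is duplicated, updated in parallel, and compared before use.
# A mismatch would signal a transient fault (simulated storage assumed).
class SoftErrorDetected(Exception):
    pass

def checked_sum(values):
    total, shadow = 0, 0
    for v in values:
        total += v           # primary copy
        shadow += v          # redundant copy, ideally in distinct storage
        if total != shadow:  # comparison point before the value is consumed
            raise SoftErrorDetected("mismatch between primary and shadow")
    return total

print(checked_sum(range(10)))   # 45 when no fault occurs
```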

Journal ArticleDOI
TL;DR: Different categories of design patterns are introduced as a vehicle for capturing and reusing good analyses, designs and implementations applied to the TOGAF framework, while detailing a motivating exemplar of how design patterns can be composed to create generic types of TOGAF architectures.
Abstract: The design pattern approach suggests that developers must be able to reuse proven solutions emerging from the best design practices to solve common design problems, while composing patterns to create reusable designs that can be mapped to different types of enterprise frameworks and architectures such as The Open Group Architecture Framework (TOGAF). Without this, business analysts, designers and developers do not properly apply design solutions or take full advantage of the power of patterns as reuse blocks, resulting in poor performance, poor scalability, and poor usability. Furthermore, these professionals may "reinvent the wheel" when attempting to implement the same design for different types of TOGAF architectures. In this paper, we introduce different categories of design patterns as a vehicle for capturing and reusing good analyses, designs and implementations applied to the TOGAF framework, while detailing a motivating exemplar of how design patterns can be composed to create generic types of TOGAF architectures. Then, we discuss why patterns are suitable for developing and documenting various architectures, including enterprise architectures such as TOGAF.

Journal ArticleDOI
TL;DR: The purpose of this work is to analyze how wear progression on the wheelsets affects the dynamic behavior of railway vehicles and their interaction with the track.
Abstract: The search for fast, reliable and cost-effective means of transport that present better energy efficiency and less impact on the environment has resulted in renewed interest and rapid development in railway technology. To improve their efficiency and competitiveness, modern trains are required to travel faster, with high levels of safety and comfort and with reduced Life Cycle Costs (LCC). These increasing demands for vehicle requirements imposed by railway operators and infrastructure companies include maintaining the top operational speeds of trainsets during their life cycle, having low LCC and being track friendly. This is a key issue in vehicle design and in train operation, since it has a significant impact on the safety and comfort of railway systems and on the maintenance costs of vehicles and infrastructures. The purpose of this work is to analyze how wear progression on the wheelsets affects the dynamic behavior of railway vehicles and their interaction with the track. For this purpose a vehicle, assembled with new and worn wheels, is studied in realistic operation scenarios. The influence of wheel profile wear on the vehicle dynamic response is assessed based on several indicators used by the railway industry. The running stability of the railway vehicles is also emphasized in this study.

Journal ArticleDOI
TL;DR: This paper reintroduces Scrum from a qualitative perspective, applying ethnography and in-depth interviews to two different types of project teams to articulate the success factors for running the Scrum framework.
Abstract: Scrum—agile programming—is getting more attention in software engineering practice. Many software projects begin small and remain uncertain about the requirements until the project is completed; this makes Scrum more appropriate than other development methodologies. This paper reintroduces Scrum from a qualitative perspective by applying ethnography and in-depth interviews to two different types of project teams to articulate the success factors for running the Scrum framework. It clearly demonstrates how qualitative research can help disclose, in depth, the essence of the facts observed during Scrum adoption. It also articulates how these success factors mutually affect one another from a System Dynamics perspective, and gives further recommendations to Scrum teams and those who intend to apply the Scrum development methodology.

Journal ArticleDOI
TL;DR: The FSP language is applied to formally specify the workflow for fault-tolerant composition of web services, showing how the execution order of the services is determined such that upon a service failure a recovery process with the lowest cost is started.
Abstract: In previous research in the field of supporting reliability and fault tolerance in web service composition, only low-level programming constructs such as exception handling (for example, in WS-BPEL) were considered. However, we believe that reliability and fault tolerance for composite services must be handled at a higher level of abstraction, i.e. at the workflow level. Therefore, a language- and technology-independent method for fault-tolerant composition of web services is needed. To this end, a fault-tolerant workflow is built in which the execution order of the services is determined such that upon a service failure a recovery process with the lowest cost is started. The cost of a service failure includes the cost of the failed service and the total cost of rolling back the previously executed services which are dependent on the failed service. In this article the FSP language is applied to formally specify the workflow.
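
The cost model in the abstract can be sketched directly: if a service fails, the recovery cost is its own cost plus the rollback costs of the already-executed services that depend on it, and the execution order is chosen to minimize the worst case. The costs and dependencies below are invented.

```python
# Sketch of the recovery-cost model: on failure of s, roll back every
# already-executed service that depends on s; pick the execution order
# with the lowest worst-case recovery cost. All values are invented.
from itertools import permutations

cost = {"A": 5, "B": 2, "C": 4}          # cost of the service itself
rollback = {"A": 3, "B": 1, "C": 2}      # cost of undoing a completed service
depends_on = {"B": {"A"}, "C": {"A"}}    # B and C consume A's results

def worst_recovery_cost(order):
    worst = 0
    for i, s in enumerate(order):
        executed = order[:i]
        rb = sum(rollback[p] for p in executed if s in depends_on.get(p, set()))
        worst = max(worst, cost[s] + rb)
    return worst

best = min(permutations(cost), key=worst_recovery_cost)
print(best, worst_recovery_cost(best))   # running A early avoids costly rollbacks
```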

Journal ArticleDOI
TL;DR: The proposed algorithm CLUBAS (Classification of Software Bugs Using Bug Attribute Similarity) is a hybrid algorithm, designed using text clustering, frequent term calculation and taxonomic term mapping techniques; it is an example of classification using a clustering technique.
Abstract: In this paper, a software bug classification algorithm, CLUBAS (Classification of Software Bugs Using Bug Attribute Similarity), is presented. CLUBAS is a hybrid algorithm, designed using text clustering, frequent term calculation and taxonomic term mapping techniques. The algorithm CLUBAS is an example of classification using a clustering technique. The proposed algorithm works in three major steps: in the first step, text clusters are created using the textual attributes of software bugs; in the second step, a cluster label is generated for each cluster using label induction; and in the third step, the cluster labels are mapped against the bug taxonomic terms to identify the appropriate category of each bug cluster. The cluster labels are generated using frequent and meaningful terms present in the bug attributes of the bugs belonging to each cluster. The designed algorithm is evaluated using the performance parameters F-measure and accuracy, which are compared with standard classification techniques such as Naïve Bayes, Naïve Bayes Multinomial, J48, Support Vector Machine and Weka's classification-using-clustering algorithms. A GUI (Graphical User Interface) based tool is also developed in Java for the implementation of the CLUBAS algorithm.
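
A rough sketch of CLUBAS's three steps on toy data: cluster bug texts, induce a label from each cluster's dominant terms, and map the label to a taxonomy category. The bug summaries and taxonomy are invented, and TF-IDF plus K-means stands in for whatever text clustering the tool actually uses.

```python
# Sketch of the three CLUBAS steps: text clustering, label induction from
# frequent terms, and taxonomy mapping. Data and taxonomy are invented.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.feature_extraction.text import TfidfVectorizer

bugs = ["null pointer crash on save", "crash when saving large file",
        "login page layout broken", "css layout misaligned on login"]
taxonomy = {"crash": "Stability", "layout": "User Interface"}

vec = TfidfVectorizer(stop_words="english")
X = vec.fit_transform(bugs)
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)

terms = np.array(vec.get_feature_names_out())
for c in range(2):
    centroid = np.asarray(X[labels == c].mean(axis=0)).ravel()
    label = terms[centroid.argmax()]                 # dominant cluster term
    category = taxonomy.get(label, "Unclassified")   # taxonomy mapping step
    print(f"cluster {c}: label={label!r} -> category={category}")
```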

Journal ArticleDOI
TL;DR: The MSEC is able to correctly recognize 6 PA types with an accuracy of 93.50%, which is 7% higher than the non-ensemble support vector machine method, and is effective in reducing the subject-to-subject variability in activity recognition.
Abstract: This paper presents a multi-sensor ensemble classifier (MSEC) for physical activity (PA) pattern recognition of human subjects. The MSEC, developed for a wearable multi-sensor integrated measurement system (IMS), combines multiple classifiers based on different sensor feature sets to improve the accuracy of PA type identification. Experimental evaluation on 56 subjects has shown that the MSEC is more effective in assessing activities of varying intensities than traditional homogeneous classifiers. It is able to correctly recognize 6 PA types with an accuracy of 93.50%, which is 7% higher than the non-ensemble support vector machine method. Furthermore, the MSEC is effective in reducing the subject-to-subject variability in activity recognition.
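
The ensemble idea can be sketched as one classifier per sensor's feature set combined by majority vote. The sensor names, feature layout, and data below are synthetic placeholders, and scikit-learn's VotingClassifier stands in for the MSEC's actual combination scheme.

```python
# Sketch of a multi-sensor ensemble: one classifier per sensor feature
# set, combined by majority vote. Sensors, features, data are synthetic.
import numpy as np
from sklearn.ensemble import VotingClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import FunctionTransformer
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 9))                      # 3 sensors x 3 features
y = (X[:, 0] + X[:, 3] + X[:, 6] > 0).astype(int)  # activity type (binary here)

def sensor_clf(cols):
    pick = FunctionTransformer(lambda X, c=cols: X[:, c])  # select one sensor
    return make_pipeline(pick, SVC())

ensemble = VotingClassifier(
    [("accelerometer", sensor_clf(slice(0, 3))),
     ("gyroscope",     sensor_clf(slice(3, 6))),
     ("heart_rate",    sensor_clf(slice(6, 9)))],
    voting="hard")                                  # majority vote across sensors
print("train accuracy:", ensemble.fit(X, y).score(X, y))
```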

Journal ArticleDOI
TL;DR: Experimental results, based on a set of 12 gray-level images, demonstrate that the proposed scheme gives mean compression ratios higher than those of conventional arithmetic encoders.
Abstract: This paper presents a new method of lossless image compression. An image is characterized by homogeneous parts. The high-weight bit planes, which are characterized by long runs of 0s and 1s, are encoded with RLE, whereas the other bit planes are encoded by arithmetic coding (AC), with either a static or an adaptive model. By combining AC (adaptive or static) with RLE, a high degree of adaptation and compression efficiency is achieved. The proposed method is compared to both the static and the adaptive model. Experimental results, based on a set of 12 gray-level images, demonstrate that the proposed scheme gives mean compression ratios higher than those of conventional arithmetic encoders.
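
The bit-plane split is easy to sketch: decompose an 8-bit image into binary planes and run-length encode the high-weight ones. The arithmetic-coding stage for the remaining planes is not reproduced here, and the gradient image is a toy stand-in.

```python
# Sketch of the bit-plane split: high-weight planes, which carry long
# runs of 0s and 1s, get RLE; the remaining planes would go to an
# arithmetic coder in the paper's scheme (not reproduced here).
import numpy as np

def bit_planes(img):                      # 8-bit grayscale -> 8 binary planes
    return [(img >> k) & 1 for k in range(7, -1, -1)]

def rle(bits):
    """(value, run-length) pairs over a flattened binary plane."""
    runs, prev, count = [], bits[0], 0
    for b in bits:
        if b == prev:
            count += 1
        else:
            runs.append((int(prev), count))
            prev, count = b, 1
    runs.append((int(prev), count))
    return runs

img = (np.arange(64, dtype=np.uint8).reshape(8, 8)) * 4   # toy gradient image
for k, plane in enumerate(bit_planes(img)[:2]):           # two highest planes
    print(f"plane {7 - k}: {len(rle(plane.ravel()))} runs")
```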

Journal ArticleDOI
TL;DR: A comparative modeling study carried out using Radial Basis Function Neural Network (RBFN) and Response Surface Methodology (RSM) to predict and optimize the performance of a biofilter system treating toluene (a model VOC).
Abstract: Biofiltration is emerging as a promising cost-effective technique for removing Volatile Organic Compounds (VOCs) from industrial waste gases. In the present investigation a comparative modeling study has been carried out using a Radial Basis Function Neural Network (RBFN) and Response Surface Methodology (RSM) to predict and optimize the performance of a biofilter system treating toluene (a model VOC). Experimental biofilter system performance data, collected over a period of time by daily measurement of inlet VOC concentration, retention time, pH, temperature and packing moisture content, were used to develop the mathematical models. These independent variables acted as the inputs to the mathematical models developed using RSM and RBFN, while the VOC removal efficiency was the biofilter system performance parameter to be predicted. The data set was divided into two parts: 60% of the data was used for the training phase and the remaining 40% for the testing phase. The average % errors for RSM and RBFN were 7.76% and 3.03%, and the R2 values obtained were 0.8826 and 0.9755, respectively. The results indicated the superiority of RBFN in prediction capability due to its ability to approximate a higher degree of nonlinearity between the input and output variables. Optimization of the biofilter parameters was also done using RSM. RSM, being structured in nature, enabled the study of the interaction effects of the independent variables on biofilter performance.

Journal ArticleDOI
TL;DR: The results show that the proposed fuzzy-logic-based scheme achieves a more satisfactory and consistent load balance than other randomized approaches in grid computing services.
Abstract: Load balancing is essential for efficient utilization of resources and enhancing the responsiveness of a computational grid, especially one that hosts frequently used services, e.g. food, health and nutrition services. Various techniques have been developed and applied; each has its own limitations due to the dynamic nature of the grid. Efficient load balancing can be achieved by an effective measure of node/cluster utilization. In this paper, as part of NSTIP project # 10-INF1381-04 and in order to assess the FAQIH framework's ability to support load balance in a computational grid that hosts food, health and nutrition inquiry services, we detail the design and implementation of a proposed fuzzy-logic-based scheme for dynamic load balancing in grid computing services. The proposed scheme works by using a fuzzy inference system which uses several metrics to capture the variability of loads and specifies the state of each node in a cluster. Then, based on the overall node states, the state of the corresponding cluster is defined in order to assign newly arrived inquiries such that load balancing among different clusters and nodes is accomplished. Many experiments were conducted to investigate the effectiveness of the proposed fuzzy-logic-based scheme; the results show that it achieves a more satisfactory and consistent load balance than other randomized approaches in grid computing services.
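
A minimal sketch of the fuzzy piece: triangular membership functions grade a node's load, and the strongest membership becomes the node's state. A real deployment would use several metrics and a full rule base; the single metric and breakpoints here are invented.

```python
# Sketch of fuzzy node-state inference: triangular membership functions
# over a single load metric; the strongest membership wins. Breakpoints
# are invented, and a real system would fuse several metrics and rules.
def tri(x, a, b, c):
    """Triangular membership function peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def node_state(cpu_load):                 # cpu_load in [0, 1]
    memberships = {
        "lightly_loaded": tri(cpu_load, -0.01, 0.0, 0.5),
        "moderate":       tri(cpu_load, 0.2, 0.5, 0.8),
        "overloaded":     tri(cpu_load, 0.5, 1.0, 1.01),
    }
    return max(memberships, key=memberships.get)

for load in (0.1, 0.45, 0.9):
    print(load, "->", node_state(load))
```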

Journal ArticleDOI
TL;DR: It is shown in this study that spherical SOMs allow us to find similarities in data otherwise undetectable with plane SOMs; the implementation is also evaluated using parallel sphere processing in several GPU environments.
Abstract: In this study, we visualize Pareto-optimal solutions derived from multiple-objective optimization using spherical self-organizing maps (SOMs) that lay out SOM data in three dimensions. There has been a wide range of studies involving plane SOMs, where Pareto-optimal solutions are mapped to a plane. However, plane SOMs have the issue that similar data differing in a few specific variables are often placed at far ends of the map, compromising the intuitiveness of the visualization. We show in this study that spherical SOMs allow us to find similarities in data otherwise undetectable with plane SOMs. We also implement and evaluate the performance using parallel sphere processing in several GPU environments.

Journal ArticleDOI
TL;DR: A proposed system for semantically translating Arabic text to Arabic SignWriting in the jurisprudence of prayer domain is presented, designed to translate Arabic text by applying Arabic Sign Language (ArSL) grammatical rules as well as semantically looking up the words in domain ontology.
Abstract: Arabic Sign Language (ArSL) is the native language of the Arab deaf community. ArSL allows deaf people to communicate among themselves and with the non-deaf people around them to express their needs, thoughts and feelings. Unlike spoken languages, Sign Language (SL) depends on hand and facial expressions rather than sounds to express thoughts. In recent years, interest in automatically translating sign language for different languages has increased. However, only a small set of these works is specialized in ArSL. Basically, these works translate word by word without considering the semantics of the translated sentence or the translation rules from Arabic text to Arabic sign language. In this paper we present a proposed system for semantically translating Arabic text to Arabic SignWriting in the jurisprudence-of-prayer domain. The system is designed to translate Arabic text by applying Arabic Sign Language (ArSL) grammatical rules as well as semantically looking up the words in a domain ontology. The results of qualitatively evaluating the system, based on a SignWriting expert's judgment, confirmed the correctness of the translation results.

Journal ArticleDOI
TL;DR: The results of the study indicate that the proposed methods for analyzing facial expressions were capable of undertaking face recognition using a minimum set of features, improving efficiency and computation.
Abstract: The purpose of this study is to enhance algorithms towards the development of an efficient three-dimensional face recognition system in the presence of expressions. The overall aim is to analyse patterns of expressions based on techniques relating to feature distances, compared against benchmarks. To investigate how the use of distances can help the recognition process, a feature set of diagonal distance patterns was determined and extracted to distinguish face models. The significant finding is that, to solve the problem arising from data with facial expressions, the feature sets of the expression-invariant and expression-variant regions were determined and described by geodesic and Euclidean distances. Using regression models, the correlations between expression and neutral feature sets were identified. The results of the study indicate that our proposed methods for analyzing facial expressions were capable of undertaking face recognition using a minimum set of features, improving efficiency and computation.

Journal ArticleDOI
Lingyan Jiang, Jian Yao1, Baopu Li, Fei Fang, Qi Zhang, Max Q.-H. Meng 
TL;DR: An efficient approach for silhouette and contour detection is used to represent the contour curves of a human body shape with Freeman's 8-connected chain codes, and 101 feature points with clear geometric properties are extracted automatically.
Abstract: Human body feature extraction based on 2D images provides an efficient method for many applications, e.g. non-contact body size measurement, constructing 3D human models and recognizing human actions. In this paper a systematic approach is proposed to detect feature points of the human body automatically from its front and side images. Firstly, an efficient approach for silhouette and contour detection is used to represent the contour curves of a human body shape with Freeman's 8-connected chain codes. The contour curves are considered as a number of segments connected together. Then, a series of feature points on the human body are extracted based on specified rules by measuring the differences between the directions of the segments. In total, 101 feature points with clear geometric properties (that rather accurately reflect the bumps or turnings of the contours) are extracted automatically, including 27 points corresponding to the definitions of the landmarks for garment measurements. Finally, the proposed approach was tested on ten human subjects, and all 101 feature points with the specified geometrical characteristics were correctly extracted, indicating effective and robust performance.
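
Freeman's 8-connected chain code, the contour representation named above, encodes each step between consecutive boundary pixels as a direction from 0 to 7. The small square contour below is a toy stand-in for a body silhouette.

```python
# Sketch of Freeman 8-connected chain coding: each step between
# consecutive boundary pixels becomes a direction code 0-7.
# Direction codes: 0=E, 1=NE, 2=N, 3=NW, 4=W, 5=SW, 6=S, 7=SE
STEPS = {(1, 0): 0, (1, 1): 1, (0, 1): 2, (-1, 1): 3,
         (-1, 0): 4, (-1, -1): 5, (0, -1): 6, (1, -1): 7}

def chain_code(contour):
    codes = []
    for (x0, y0), (x1, y1) in zip(contour, contour[1:]):
        codes.append(STEPS[(x1 - x0, y1 - y0)])
    return codes

# toy closed contour: a small square traced counter-clockwise
square = [(0, 0), (1, 0), (2, 0), (2, 1), (2, 2), (1, 2), (0, 2), (0, 1), (0, 0)]
print(chain_code(square))   # [0, 0, 2, 2, 4, 4, 6, 6]
# Feature points would then be found where consecutive codes change sharply.
```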

Journal ArticleDOI
TL;DR: The issues associated with the design, development, and use of web-based FES from a standpoint of the benefits and challenges of developing and using them are examined.
Abstract: The combination of Internet technology and fuzzy expert systems has opened new avenues for sharing and distributing knowledge. However, there has been a general lack of investigation in the area of web-based fuzzy expert systems (FES). In this paper the issues associated with the design, development, and use of web-based FES are examined from the standpoint of the benefits and challenges of developing and using them. The original theory and concepts of conventional FES are reviewed, and a knowledge engineering framework for developing them is revisited. Students in an educational institution need an educational advisor to solve their problems, and because educational regulations change, the advisor must keep this information up to date. Student requests are linguistic, and a crisp expert system cannot solve such problems completely. In our approach we build a web-based fuzzy expert system for a student education advisor (FES-SEA), which resides in the university portal. The system is implemented with ASP.NET and SQL Server 2008.

Journal ArticleDOI
TL;DR: In this framework, a Privacy Service is used in combination with privacy policies to create privacy contracts that outline what can and cannot be done with a consumer's personally identifiable information (PII).
Abstract: Service-Oriented Architecture (SOA) is a computer systems design concept which aims to achieve reusability and integration in a distributed environment through the use of autonomous, loosely coupled, interoperable abstractions known as services. In order to interoperate, communication between services is very important due to their autonomous nature. This communication provides services with their functional strengths, but also creates the opportunity for the loss of privacy. In this paper, a Privacy Protection Framework for Service-Oriented Architecture (PPFSOA) is described. In this framework, a Privacy Service (PS) is used in combination with privacy policies to create privacy contracts that outline what can and cannot be done with a consumer's personally identifiable information (PII). The privacy policy consists of one-to-many privacy rules, with each rule created from a set of six privacy elements: collector, what, purpose, retention, recipient and trust. The PS acts as an intermediary between the service consumer and service provider, to establish an unbiased contract before the two parties begin sending PII. It is shown how many Privacy Services work together to form the privacy protection framework. An examination of current approaches to protecting privacy in an SOA environment is also presented. Finally, the operations the PS must perform in order to fulfill its tasks are outlined.

Journal ArticleDOI
TL;DR: Results of the RCDA cluster ensemble algorithm are encouraging and indicate a better regionalization of rainfall in different parts of India.
Abstract: The magnitude and frequency of precipitation is of great significance in the field of hydrologic and hydraulic design and has wide applications in varied areas. However, the availability of precipitation data is limited to a few areas, where rain gauges are successfully and efficiently installed. The magnitude and frequency of precipitation in ungauged sites can be assessed by grouping areas with similar characteristics. This procedure of grouping areas with similar behaviour is termed regionalization. In this paper, the RCDA cluster ensemble algorithm is employed to identify homogeneous regions of rainfall in India. Cluster ensemble methods are commonly used to enhance the quality of clustering by combining multiple clustering schemes to produce a more robust scheme delivering similar homogeneous regions. The goal is to identify, analyse and describe hydrologically similar regions using the RCDA cluster ensemble algorithm, which is based on discriminant analysis. The algorithm takes H base clustering schemes, each with K clusters, obtained by any clustering method, as input and constructs a discriminant function for each of them. Subsequently, all the data tuples are predicted using the H discriminant functions for cluster membership. Tuples with consistent predictions are assigned to clusters, while tuples with inconsistent predictions are analyzed further and either assigned to clusters or declared as noise. The RCDA algorithm has been compared with Best-of-K-means and the Clue cluster ensemble of the R software using traditional clustering quality measures. Further, a domain-knowledge-based comparison has also been performed. All the results are encouraging and indicate a better regionalization of rainfall in different parts of India.
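
The consensus step the abstract describes can be sketched as follows: fit a discriminant function to each of the H base clusterings, predict every tuple with all of them, and keep only the tuples whose (label-aligned) predictions agree. Synthetic two-cluster data stands in for the rainfall records, and K-means plus LDA are assumed stand-ins for the base clusterers and discriminant functions.

```python
# Sketch of the RCDA consensus step: one discriminant function per base
# clustering, then keep tuples whose predictions agree across all H.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(5, 1, (50, 2))])

predictions = []
for seed in range(3):                        # H = 3 base clustering schemes
    base = KMeans(n_clusters=2, n_init=10, random_state=seed).fit_predict(X)
    pred = LinearDiscriminantAnalysis().fit(X, base).predict(X)
    if predictions and (pred == predictions[0]).mean() < 0.5:
        pred = 1 - pred                      # align arbitrary cluster ids
    predictions.append(pred)

votes = np.vstack(predictions)
consistent = (votes == votes[0]).all(axis=0)  # all H predictions agree
print(f"{consistent.sum()} of {len(X)} tuples assigned; rest flagged as noise")
```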

Journal ArticleDOI
TL;DR: The Service Consumer is expected to provide the QoS requirements as part of the Service discovery query; the framework discovers and filters Web Services from the cloud and ranks them according to Service Consumer preferences to facilitate Service on time.
Abstract: Enhancements in technology always follow consumer requirements. Consumers require the best service, with the least possible mismatch and on time. Numerous applications available today are based on Web Services and Cloud Computing. Recently, many Web Services with similar functional characteristics have become available. Choosing the "right" Service from a group of similar Web Services is a complicated task for the Service Consumer. In that case, the Service Consumer can discover the required Web Service using non-functional attributes of the Web Services, such as QoS. The proposed layered architecture and Web Service-Cloud, i.e. WS-Cloud, computing framework synthesizes non-functional attributes including reliability, availability, response time, latency, etc. The Service Consumer is expected to provide the QoS requirements as part of the Service discovery query. The framework discovers and filters the Web Services from the cloud and ranks them according to Service Consumer preferences to facilitate Service on time.
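
One common way to realize such QoS-based ranking—assumed here, since the abstract does not give the formula—is to normalize each non-functional attribute so that higher is better and score services with the consumer's preference weights. The services and numbers are invented.

```python
# Sketch of QoS-based ranking: min-max normalize each attribute, invert
# cost-type attributes, and apply consumer preference weights. Invented data.
import numpy as np

services = ["S1", "S2", "S3"]
#               reliability  availability  response_time(ms)
qos = np.array([[0.97,       0.999,        120],
                [0.99,       0.990,         80],
                [0.95,       0.995,         60]], dtype=float)
higher_is_better = [True, True, False]
weights = np.array([0.5, 0.2, 0.3])          # consumer preferences, sum to 1

norm = (qos - qos.min(axis=0)) / (qos.max(axis=0) - qos.min(axis=0))
for j, hib in enumerate(higher_is_better):
    if not hib:
        norm[:, j] = 1 - norm[:, j]          # invert cost-type attributes

scores = norm @ weights
for name, s in sorted(zip(services, scores), key=lambda t: -t[1]):
    print(name, round(s, 3))                 # highest score = best match
```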

Journal ArticleDOI
TL;DR: A team member ranking technique is presented for software bug repositories; it can also be used for classifying and rating software bugs using the ratings of members participating in the software bug repository.
Abstract: In this paper a team member ranking technique is presented for software bug repositories. Member ranking is performed using a number of attributes available in software bug repositories, and a ranked list of the developers participating in the development of a software project is generated. The ranking is derived from the contributions made by individual developers in terms of bugs fixed, the severity and priority of those bugs, newly reported problems, and comments made by the developers. The top-ranked developers are the best contributors to the software project. The proposed algorithm can also be used for classifying and rating software bugs using the ratings of members participating in the software bug repository.
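
A minimal sketch of such a contribution score: each developer is rated from severity-weighted fixes, reported bugs, and comments, then sorted. The weights and repository records below are invented, not the paper's.

```python
# Sketch of a contribution-based developer ranking: severity-weighted
# fixes plus reported bugs plus comments. Weights and records invented.
developers = {
    "alice": {"fixed_severity": [5, 4, 4],    "reported": 7,  "comments": 30},
    "bob":   {"fixed_severity": [2, 3],       "reported": 12, "comments": 55},
    "carol": {"fixed_severity": [5, 5, 3, 2], "reported": 3,  "comments": 10},
}
W_FIX, W_REPORT, W_COMMENT = 3.0, 1.0, 0.2   # illustrative weights

def score(d):
    return (W_FIX * sum(d["fixed_severity"])  # severity-weighted fixes
            + W_REPORT * d["reported"]
            + W_COMMENT * d["comments"])

for name, d in sorted(developers.items(), key=lambda kv: -score(kv[1])):
    print(name, score(d))                     # top entries = best contributors
```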