
Showing papers in "Journal of Software Engineering and Applications in 2015"


Journal ArticleDOI
TL;DR: The goal of this paper is to review these approaches regarding their suitability for the domain of production automation in order to identify current trends and research gaps.
Abstract: As production automation systems have become more and more complex, the task of quality assurance is increasingly challenging. Model-based testing is a research field addressing this challenge, and many approaches have been suggested for different applications. The goal of this paper is to review these approaches regarding their suitability for the domain of production automation in order to identify current trends and research gaps. The different approaches are classified and clustered according to their main focus, which is either testing and test case generation from some form of automata model, test case generation from models used within the development process of production automation systems, test case generation from fault models, or test case selection and regression testing.

45 citations


Journal ArticleDOI
TL;DR: In this paper, the authors proposed a metaheuristic optimization method for optimizing the parameters of three COCOMO-based models, including the basic COCO model and other two models proposed in the literature.
Abstract: Software development effort estimation is considered a fundamental task for the software development life cycle as well as for managing project cost, time and quality. Therefore, accurate estimation is a substantial factor in project success and risk reduction. In recent years, software effort estimation has received a considerable amount of attention from researchers and has become a challenge for the software industry. In the last two decades, many researchers and practitioners have proposed statistical and machine learning-based models for software effort estimation. In this work, the Firefly Algorithm is proposed as a metaheuristic optimization method for optimizing the parameters of three COCOMO-based models. These models include the basic COCOMO model and two other models proposed in the literature as extensions of the basic COCOMO model. The developed estimation models are evaluated using different evaluation metrics. Experimental results show high accuracy and significant error minimization of the Firefly Algorithm over other metaheuristic optimization algorithms, including Genetic Algorithms and Particle Swarm Optimization.
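
The abstract does not reproduce the COCOMO equations or the authors' dataset, so the following is only a minimal Python sketch of the idea: a firefly-style search calibrates the parameters (a, b) of the basic COCOMO form E = a * KLOC^b by minimizing MMRE. The toy project data, parameter ranges and move-rule constants are illustrative assumptions, not the paper's settings.

import numpy as np

# Toy project data: (size in KLOC, actual effort in person-months); illustrative values only.
projects = np.array([[10, 24], [46, 96], [23, 79], [15, 36], [62, 210]], float)

def mmre(params):
    a, b = params
    est = a * projects[:, 0] ** b                         # basic COCOMO: effort = a * KLOC^b
    return np.mean(np.abs(projects[:, 1] - est) / projects[:, 1])

def firefly(n=20, iters=200, alpha=0.2, beta0=1.0, gamma=1.0, seed=0):
    rng = np.random.default_rng(seed)
    pop = rng.uniform([0.5, 0.8], [5.0, 1.5], size=(n, 2))   # candidate (a, b) pairs
    cost = np.array([mmre(p) for p in pop])
    for _ in range(iters):
        for i in range(n):
            for j in range(n):
                if cost[j] < cost[i]:                      # move firefly i toward the brighter j
                    r2 = np.sum((pop[i] - pop[j]) ** 2)
                    beta = beta0 * np.exp(-gamma * r2)
                    pop[i] += beta * (pop[j] - pop[i]) + alpha * (rng.random(2) - 0.5)
                    pop[i] = np.clip(pop[i], [0.1, 0.5], [10.0, 2.0])
                    cost[i] = mmre(pop[i])
    best = np.argmin(cost)
    return pop[best], cost[best]

params, err = firefly()
print("calibrated (a, b):", params, "MMRE:", err)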

43 citations


Journal ArticleDOI
TL;DR: This paper provides a broad view and discussion of the current state of this subject with a particular focus on data modeling and data analytics, describing and clarifying the main differences between the three main approaches in what concerns these aspects, namely: operational databases, decision support databases and Big Data technologies.
Abstract: In recent years we have witnessed tremendous growth in the volume and availability of data. This results primarily from the emergence of a multitude of sources (e.g. computers, mobile devices, sensors or social networks) that continuously produce structured, semi-structured or unstructured data. Database Management Systems and Data Warehouses are no longer the only technologies used to store and analyze datasets, namely because the volume and complex structure of today's data degrade their performance and scalability. Big Data is one of the recent challenges, since it implies new requirements in terms of data storage, processing and visualization. Nevertheless, properly analyzing Big Data can bring great advantages, because it allows patterns and correlations to be discovered in datasets. Users can use this processed information to gain deeper insights and business advantages. Thus, data modeling and data analytics have evolved so that huge amounts of data can be processed without compromising performance and availability, instead "relaxing" the usual ACID properties. This paper provides a broad view and discussion of the current state of this subject, with a particular focus on data modeling and data analytics, describing and clarifying the main differences between the three main approaches in what concerns these aspects, namely: operational databases, decision support databases and Big Data technologies.

34 citations


Journal ArticleDOI
TL;DR: In this paper, the authors examined the constructs that affect students' intention to use the computer-based assessment (CBA) and concluded that a system is more likely to be used by students if it is playful.
Abstract: Computer Based Assessment (CBA) is becoming a very popular method for evaluating students' performance at the university level. This research aims to examine the constructs that affect students' intention to use CBA. The proposed model is based on previous technology models such as the Technology Acceptance Model (TAM), the Theory of Planned Behavior (TPB), and the Unified Theory of Acceptance and Usage of Technology (UTAUT). The proposed CBA model is based on nine variables: Goal Expectancy, Social Influence, Facilitating Conditions, Computer Self Efficacy, Content, Perceived Usefulness, Perceived Ease of Use, Perceived Playfulness, and Behavioral Intention. Data were collected using a survey questionnaire from 546 participants who had used the computer-based exam system at the University of Jordan. Results indicate that Perceived Playfulness has a direct effect on CBA use, while Perceived Ease of Use, Perceived Usefulness, Computer Self Efficacy, Social Influence, Facilitating Conditions, Content and Goal Expectancy have only indirect effects. The study concludes that a system is more likely to be used by students if it is playful, and CBA is more likely to be playful when it is easy to use and useful. Finally, the studied acceptance model for computer-based assessment explains only approximately 10% of the variance in behavioral intention to use CBA.

31 citations


Journal ArticleDOI
TL;DR: The Jordanian government has invested heavily in E-government initiatives for the last 10 years to transform from traditional service delivery to more effective and efficient delivery of high-quality, customer-centric and performance-driven services to E-government stakeholders, as mentioned in this paper.
Abstract: Life is developing every day in all its aspects. One of the major developing areas is information technology (IT) and communication technology, which makes life easier, faster and more connected. ICT is evolving fast in Jordan, enabling the government to deliver multiple services with different characteristics through E-government. The Jordanian government has invested heavily in E-government initiatives for the last 10 years to transform from traditional service delivery to more effective and efficient delivery of high-quality, customer-centric and performance-driven services to E-government stakeholders. However, Jordan's global and regional ranks in E-government readiness are still low compared with countries worldwide, although relatively good among the Arab countries. This research provides a trend analysis to identify the trends (positive or negative) in the UN E-government indicators for Jordan and gives an overview of E-government in Jordan. The researchers analyze the development of E-government in Jordan by introducing a general framework for E-government and by discussing its past, present status and future plans, with the aim of providing better service to its recipients and improving the overall progress of Jordan's achievements compared with regional and global countries.

30 citations


Journal ArticleDOI
TL;DR: This paper presents a blind digital video watermarking technique based on a combination of the Discrete Wavelet Transform (DWT) and the real Schur decomposition, which shows high efficiency because Schur decomposition requires fewer computations than other transforms.
Abstract: Digital watermarking is one of the most powerful tools used in ownership and copyright protection in digital media. This paper presents a blind digital video watermarking technique based on a combination of the Discrete Wavelet Transform (DWT) and the real Schur decomposition. The scheme starts by applying a two-level DWT to the video scene, followed by Schur decomposition, in which the binary watermark bits are embedded in the resulting block upper triangular matrix. The proposed technique shows high efficiency due to the use of Schur decomposition, which requires fewer computations compared to other transforms. The imperceptibility of the scheme is also very high due to the use of the DWT; therefore, no visual distortion is noticed in the watermarked video after embedding. Furthermore, the technique proves to be robust against a set of standard attacks such as Gaussian noise, salt-and-pepper noise and rotation, and against video attacks such as frame dropping, cropping and averaging. Both capacity and blindness are also considered and achieved in this technique.
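
As a rough illustration of the embedding step described above, the sketch below applies a two-level DWT to one grayscale frame and hides one bit per 4 x 4 block of the approximation sub-band by quantizing the leading diagonal entry of the Schur factor. It assumes pywt and scipy are available; the block size, quantization step and choice of coefficient are illustrative assumptions, not the paper's exact scheme.

import numpy as np
import pywt
from scipy.linalg import schur

def embed_frame(frame, bits, q=24.0):
    """Embed watermark bits into one grayscale frame (pixel values 0..255)."""
    coeffs = pywt.wavedec2(frame.astype(float), 'haar', level=2)
    ll2 = coeffs[0]                                   # low-frequency sub-band after 2-level DWT
    h, w = ll2.shape
    k = 0
    for r in range(0, h - 3, 4):
        for c in range(0, w - 3, 4):
            if k >= len(bits):
                break
            block = ll2[r:r+4, c:c+4]
            t, u = schur(block, output='real')        # block = u @ t @ u.T, t upper (quasi-)triangular
            # Quantize the leading diagonal entry so that it carries one watermark bit.
            t[0, 0] = q * (np.floor(t[0, 0] / q) + (0.75 if bits[k] else 0.25))
            ll2[r:r+4, c:c+4] = u @ t @ u.T
            k += 1
    coeffs[0] = ll2
    return pywt.waverec2(coeffs, 'haar')

frame = np.random.randint(0, 256, (256, 256))         # stand-in for one video frame
watermarked = embed_frame(frame, bits=[1, 0, 1, 1, 0, 0, 1, 0])

Extraction (not shown) would repeat the DWT and Schur steps and read each bit back from the quantized coefficient, which is what keeps the scheme blind.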

26 citations


Journal ArticleDOI
TL;DR: The claim of efficacy and superiority of the proposed framework is established through the results of a comparative study involving the proposed framework and some well-known models for software defect prediction.
Abstract: Despite the fact that a number of approaches have been proposed for effective and accurate prediction of software defects, most of these have not found widespread applicability. Our objective in this communication is to provide a framework which is expected to be more effective and acceptable for predicting defects in multiple phases across the software development life cycle. The proposed framework is based on the use of neural networks for predicting defects in the software development life cycle. Further, in order to facilitate the easy use of the framework by project managers, a graphical user interface has been developed that allows input data (including effort and defects) to be fed easily for predicting defects. The proposed framework provides a probabilistic defect prediction approach where, instead of a definite number, a defect range (minimum, maximum, and mean) is predicted. The claim of efficacy and superiority of the proposed framework is established through the results of a comparative study involving the proposed framework and some well-known models for software defect prediction.
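
The abstract gives no architectural details, so the following is only a hedged sketch of the central idea of predicting a defect range rather than a single number: a small multi-output neural network regressor maps phase-wise inputs to (minimum, mean, maximum) defect counts. The input features, target values and network size are invented for illustration.

import numpy as np
from sklearn.neural_network import MLPRegressor

# Illustrative phase-wise training data (not from the paper):
# inputs = [effort in person-hours, defects found in the previous phase]
X = np.array([[120, 5], [200, 9], [90, 3], [310, 14], [150, 6], [260, 11]], float)
# targets = [minimum, mean, maximum] defects expected in the current phase
y = np.array([[2, 4, 7], [5, 8, 12], [1, 2, 4], [9, 13, 18], [3, 5, 8], [7, 10, 15]], float)

model = MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=1)
model.fit(X, y)                                       # one network, three outputs (min, mean, max)

lo, mean, hi = model.predict([[180, 7]])[0]
print(f"predicted defect range: {lo:.1f} .. {hi:.1f} (mean {mean:.1f})")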

24 citations


Journal ArticleDOI
TL;DR: The paper addresses a literature review of the technologies used in the transmission of measuring and logging data during well drilling and presents a discussion about efficiency in data density transmission and reliability, especially when it comes to software and automated tools.
Abstract: The paper presents a literature review of the technologies used to transmit measurement and logging data during well drilling. It presents a discussion of efficiency in data transmission density and reliability, especially when it comes to software and automated tools. Initially, the paper analyzes the principles of the telemetry systems, considering mud pulse telemetry, acoustic telemetry, electromagnetic telemetry and wired drill pipe telemetry. These are detailed, highlighting information about functionality, data transmission and their linkage to supporting software. Focus is also given to the main advantages and disadvantages of each technology, considering the influence of lithology, drilling fluid and formation fluids on the reliability and capacity of data transmission.

24 citations


Journal ArticleDOI
TL;DR: This paper proposes a CBIR method by extracting both color and texture feature vectors using the Discrete Wavelet Transform (DWT) and the Self Organizing Map (SOM) artificial neural networks to demonstrate promising retrieval results on the Wang Database.
Abstract: Content-Based Image Retrieval (CBIR) from a large database is becoming a necessity for many applications such as medical imaging, Geographic Information Systems (GIS), space search and many others. However, the process of retrieving relevant images is usually preceded by extracting discriminating features that can best describe the database images. Therefore, the retrieval process mainly depends on comparing the captured features, which depict the most important characteristics of the images, instead of comparing the whole images. In this paper, we propose a CBIR method that extracts both color and texture feature vectors using the Discrete Wavelet Transform (DWT) and the Self-Organizing Map (SOM) artificial neural network. At query time, texture vectors are compared using a similarity measure, the Euclidean distance, and the most similar image is retrieved. In addition, other relevant images are also retrieved using the neighborhood of the most similar image from the data set clustered by the SOM. The proposed method demonstrated promising retrieval results on the Wang database compared to existing methods in the literature.
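
A hedged sketch of the retrieval pipeline, using pywt for DWT-based color/texture statistics and plain Euclidean ranking; the SOM-based neighborhood expansion is omitted here, and the particular feature statistics are illustrative choices rather than the authors'.

import numpy as np
import pywt

def dwt_features(img):
    """Concatenate simple color and texture statistics from a 2-level DWT of each channel."""
    feats = []
    for ch in range(img.shape[2]):
        coeffs = pywt.wavedec2(img[:, :, ch].astype(float), 'db1', level=2)
        feats += [coeffs[0].mean(), coeffs[0].std()]      # coarse color/intensity statistics
        for (lh, hl, hh) in coeffs[1:]:
            feats += [lh.std(), hl.std(), hh.std()]       # detail-band energies as texture cues
    return np.array(feats)

def retrieve(query_img, database_imgs, top_k=5):
    q = dwt_features(query_img)
    feats = np.array([dwt_features(im) for im in database_imgs])
    dist = np.linalg.norm(feats - q, axis=1)              # Euclidean similarity measure
    return np.argsort(dist)[:top_k]                       # indices of the most similar images

db = [np.random.randint(0, 256, (128, 128, 3)) for _ in range(20)]
print(retrieve(db[0], db))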

19 citations


Journal ArticleDOI
TL;DR: An artificial neural network model is introduced as a type of supervised learning, meaning that the network is provided with example input parameters of learning and the desired optimized and correct output for that input.
Abstract: Improving learning outcomes has always been an important motivating factor in educational inquiry. In a blended learning environment, where e-learning and traditional face-to-face class tutoring are combined, there are opportunities to explore the role of technology in improving students' grades. A student's performance is affected by many factors such as engagement, self-regulation, peer interaction, the tutor's experience and the tutor's time involvement with students. Furthermore, e-course design factors such as providing personalized learning are an urgent requirement for an improved learning process. In this paper, an artificial neural network model is introduced as a type of supervised learning, meaning that the network is provided with example input parameters of learning and the desired, optimized and correct output for that input. We also describe, by utilizing e-learning interactions and social analytics, how to use an artificial neural network to produce a converging mathematical model. Students' performance can then be efficiently predicted, so that the danger of failing an enrolled e-course is reduced.

16 citations


Journal ArticleDOI
TL;DR: A metrics-based model, "Reusability Quantification of Object Oriented Design", has been proposed by establishing the relationship between design properties and reusability and justifying the correlation with the help of statistical measures, and the empirical significance of the study shows a high correlation supporting model acceptance.
Abstract: The quality of a class diagram is critical because it has a significant influence on the overall quality of the finally delivered product. Testability analysis, when done early in the software creation process, is a criterion of critical importance to software quality. Reusability is an important quality factor for testability. Its early measurement in object oriented software, especially at the design phase, allows a design to be reapplied to a new problem without much extra effort. This research paper proposes a research framework for the quantification process and gives an extensive review of reusability in object oriented software. A metrics-based model, "Reusability Quantification of Object Oriented Design", has been proposed by establishing the relationship between design properties and reusability and justifying the correlation with the help of statistical measures. The "Reusability Quantification Model" is also empirically validated, and the contextual significance of the study shows a high correlation supporting model acceptance. This research paper helps software developers and designers to use the reusability quantification model to assess and quantify software reusability for a quality product.

Journal ArticleDOI
TL;DR: Results show that Naive Bayes takes the least time to train but achieves the lowest accuracy compared with the MLP and Decision Tree algorithms.
Abstract: Two important performance indicators for data mining algorithms are accuracy of classification/prediction and time taken for training. These indicators are useful for selecting the best algorithms for classification/prediction tasks in data mining. Empirical studies on these performance indicators in data mining are few. Therefore, this study was designed to determine how data mining classification algorithms perform as input data sizes increase. Three data mining classification algorithms—Decision Tree, Multi-Layer Perceptron (MLP) Neural Network and Naive Bayes—were subjected to varying simulated data sizes. The time taken by the algorithms for training and the accuracies of their classifications were analyzed for the different data sizes. Results show that Naive Bayes takes the least time to train but achieves the lowest accuracy compared with the MLP and Decision Tree algorithms.
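
A small scikit-learn sketch of this kind of experiment on simulated data of increasing size; the data generator, feature counts and network size are illustrative choices, not the study's setup.

import time
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.neural_network import MLPClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import accuracy_score

models = {
    "Naive Bayes": GaussianNB(),
    "Decision Tree": DecisionTreeClassifier(random_state=0),
    "MLP": MLPClassifier(hidden_layer_sizes=(32,), max_iter=300, random_state=0),
}

for n in (1_000, 10_000, 50_000):                        # increasing simulated data sizes
    X, y = make_classification(n_samples=n, n_features=20, n_informative=10, random_state=0)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
    for name, model in models.items():
        t0 = time.perf_counter()
        model.fit(X_tr, y_tr)                            # measure training time only
        elapsed = time.perf_counter() - t0
        acc = accuracy_score(y_te, model.predict(X_te))
        print(f"n={n:>6}  {name:<13} train={elapsed:6.2f}s  accuracy={acc:.3f}")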

Journal ArticleDOI
TL;DR: A novel gesture spotting and recognition technique is proposed to handle hand gestures within continuous hand motion, based on Conditional Random Fields in conjunction with a Support Vector Machine, which enforces a robust, view-invariant recognition task.
Abstract: In this paper, a novel gesture spotting and recognition technique is proposed to handle hand gestures within continuous hand motion, based on Conditional Random Fields in conjunction with a Support Vector Machine. Firstly, the YCbCr color space and a 3D depth map are used to detect and segment the hand; the depth map is used to neutralize complex backgrounds. Secondly, 3D spatio-temporal features of the hand volume, based on dynamic affine invariants such as elliptic Fourier descriptors and Zernike moments, are extracted, in addition to three orientation motion features. Finally, the hand gesture is spotted and recognized using the discriminative Conditional Random Fields model, and a Support Vector Machine verifies the hand shape at the start and end points of a meaningful gesture, which enforces a robust, view-invariant recognition task. Experiments demonstrate that the proposed method can successfully spot and recognize hand gestures from continuous hand motion data with a 92.50% recognition rate.

Journal ArticleDOI
TL;DR: The core objective of this system is to produce a method that can identify detailed human gestures and use them either to deliver one's thoughts and feelings or for device control, standing as an effective replacement for speech.
Abstract: In general, sign language is a technique used for communication by deaf people. It is a three-dimensional language that relies on visual gestures and moving hand signs that represent letters and words. Gesture recognition has always been a challenging subject, relevant to the individual at both the academic and practical levels. The core objective of this system is to produce a method that can identify detailed human gestures and use them either to deliver one's thoughts and feelings or for device control. This system will stand as an effective replacement for speech, enhancing the individual's ability to express themselves and interact in society. In this paper, we discuss the different steps used to input, recognize and analyze the hand gestures, transforming them into both written words and audible speech. Each step is an independent algorithm with its own variables and conditions.

Journal ArticleDOI
TL;DR: Results show that applying Genetic Algorithm achieves better performance in terms of average number of test data generations, execution time, and percentage of branch coverage to test JSC applications.
Abstract: The main objective of software testing is to have the highest likelihood of finding the most faults with a minimum amount of time and effort. Genetic Algorithm (GA) has been successfully used by researchers in software testing to automatically generate test data. In this paper, a GA is applied using branch coverage criterion to generate the least possible set of test data to test JSC applications. Results show that applying GA achieves better performance in terms of average number of test data generations, execution time, and percentage of branch coverage.
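
A minimal sketch of GA-based test data generation for branch coverage, using a toy Python function as the program under test (the paper targets JSC applications) and a standard branch-distance fitness; the population size, operators and value ranges are illustrative.

import random

def program_under_test(x, y):
    """Toy function standing in for the application under test; three branches."""
    if x > 100:
        if y == x - 50:
            return "deep branch"
        return "outer branch"
    return "else branch"

def branch_distance(x, y):
    """Distance to the deepest branch (x > 100 and y == x - 50); 0 means it is covered."""
    d1 = max(0, 101 - x)                   # how far x is from satisfying x > 100
    d2 = abs(y - (x - 50))                 # how far y is from satisfying y == x - 50
    return d1 + d2

def ga(pop_size=40, gens=200, mut=0.3, seed=1):
    rng = random.Random(seed)
    pop = [(rng.randint(-500, 500), rng.randint(-500, 500)) for _ in range(pop_size)]
    for g in range(gens):
        pop.sort(key=lambda ind: branch_distance(*ind))
        if branch_distance(*pop[0]) == 0:
            return pop[0], g               # target branch covered
        parents = pop[:pop_size // 2]
        children = pop[:2]                 # elitism: keep the two best individuals
        while len(children) < pop_size:
            a, b = rng.sample(parents, 2)
            child = (a[0], b[1])           # single-point crossover on the two inputs
            if rng.random() < mut:         # small integer mutation
                child = (child[0] + rng.randint(-10, 10), child[1] + rng.randint(-10, 10))
            children.append(child)
        pop = children
    pop.sort(key=lambda ind: branch_distance(*ind))
    return pop[0], gens

best, generations = ga()
print("best test data:", best, "->", program_under_test(*best), "after", generations, "generations")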

Journal ArticleDOI
TL;DR: This research work evaluates the performance of the Re-UCP model and compares the results with the UCP and e-UCP methods of software effort estimation, using MRE, MMRE and MdMRE to check the error rate and PRED(20) and PRED(10) to assess prediction accuracy.
Abstract: This research work evaluates the performance of the Re-UCP model and compares the results with the UCP and e-UCP methods of software effort estimation. In this research work, an attempt has been made to assess the accuracy of the results by using MRE (Magnitude of Relative Error), MMRE (Mean Magnitude of Relative Error) and MdMRE (Median Magnitude of Relative Error) to check the error rate, and the PRED(20) and PRED(10) methods to determine the prediction accuracy of the Re-UCP software effort estimation method. The observations made from the results are based on the comparison of the Re-UCP, e-UCP and UCP models of software effort estimation.
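
These evaluation metrics have standard definitions, sketched below; the effort values are invented purely for illustration.

import numpy as np

def mre(actual, estimated):
    actual, estimated = np.asarray(actual, float), np.asarray(estimated, float)
    return np.abs(actual - estimated) / actual            # Magnitude of Relative Error per project

def evaluation(actual, estimated):
    e = mre(actual, estimated)
    return {
        "MMRE": e.mean(),                                 # mean MRE
        "MdMRE": np.median(e),                            # median MRE
        "PRED(20)": np.mean(e <= 0.20),                   # share of projects with MRE <= 20%
        "PRED(10)": np.mean(e <= 0.10),                   # share of projects with MRE <= 10%
    }

# Illustrative actual vs. estimated effort values (person-hours), not the paper's dataset.
print(evaluation(actual=[620, 410, 980, 300], estimated=[590, 450, 1010, 260]))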

Journal ArticleDOI
TL;DR: This paper presents an efficient pattern matching algorithm (FSW), which improves the process of searching for a pattern in a text with the help of four sliding windows; its best-case and average-case time complexities are also derived.
Abstract: This paper presents an efficient pattern matching algorithm (FSW). FSW improves the process of searching for a pattern in a text. It scans the text with the help of four sliding windows. The windows are equal in length to the pattern, allowing multiple alignments in the searching process. The text is divided into two parts; each part is scanned from both sides simultaneously using two sliding windows. The four windows slide in parallel over both parts of the text. Comparisons between the text and the pattern are made from both sides of the pattern in parallel. The conducted experiments show that FSW achieves the best overall results in the number of attempts and the number of character comparisons compared to the pattern matching algorithms Two Sliding Windows (TSW), Enhanced Two Sliding Windows (ETSW) and Berry-Ravindran (BR). The best-case and average-case time complexities are also calculated and analyzed.
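
A simplified sketch of the four-sliding-windows idea: the text is split into two overlapping halves, each half is scanned from both ends, and the pattern is compared from both of its sides. The real FSW also uses Berry-Ravindran-style shift tables to skip ahead; here every window advances by only one position, so just the alignment scheme is illustrated.

def fsw_search(text, pat):
    n, m = len(text), len(pat)
    hits = set()

    def match_at(i):                       # compare the pattern from both ends toward the middle
        lo, hi = 0, m - 1
        while lo <= hi:
            if text[i + lo] != pat[lo] or text[i + hi] != pat[hi]:
                return False
            lo += 1
            hi -= 1
        return True

    mid = n // 2
    # Two windows per half: one moving left-to-right, one moving right-to-left.
    halves = [(0, mid + m - 1), (mid - m + 1, n - 1)]
    for start, end in halves:
        left, right = max(0, start), min(end, n - 1) - m + 1
        while left <= right:
            if match_at(left):
                hits.add(left)
            if match_at(right):
                hits.add(right)
            left += 1
            right -= 1
    return sorted(hits)

print(fsw_search("abracadabra abracadabra", "abra"))      # [0, 7, 12, 19]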

Journal ArticleDOI
TL;DR: A distributed algorithm for big data classification, and its application for Magnetic Resonance Images (MRI) segmentation, using the well-known c-means method is presented.
Abstract: The aim of this paper is to present a distributed algorithm for big data classification and its application to Magnetic Resonance Image (MRI) segmentation. We choose the well-known c-means classification method. The proposed method is introduced in order to perform a cognitive program which is intended to be implemented on a parallel and distributed machine based on mobile agents. The main idea of the proposed algorithm is to have the Mobile Classification Agents (Team Workers) execute the c-means classification procedure on different nodes, each on its own data at the same time, and provide the results to their Mobile Host Agent (Team Leader), which computes the global results and orchestrates the classification until the convergence condition is achieved, after which the output segmented images are provided by the Mobile Classification Agents. The data in our case are a big MRI image of size (m × n), which is split into (m × n) elementary images, one per Mobile Classification Agent, to perform the classification procedure. The experimental results show that the use of the distributed architecture significantly improves big data segmentation efficiency.
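
A hedged sketch of the distributed scheme, with Python worker processes standing in for the Mobile Classification Agents and the main process for the Mobile Host Agent; hard c-means (k-means-style) updates are used, and the data are random stand-ins for MRI intensities.

import numpy as np
from multiprocessing import Pool

def local_step(args):
    """Worker ('classification agent'): assign its pixels to the nearest centre
    and return partial sums for the global centre update."""
    chunk, centres = args
    d = np.linalg.norm(chunk[:, None, :] - centres[None, :, :], axis=2)
    labels = d.argmin(axis=1)
    sums = np.zeros_like(centres)
    counts = np.zeros(len(centres))
    for c in range(len(centres)):
        members = chunk[labels == c]
        if len(members):
            sums[c] = members.sum(axis=0)
            counts[c] = len(members)
    return sums, counts

def distributed_cmeans(pixels, c=3, workers=4, iters=20, seed=0):
    rng = np.random.default_rng(seed)
    centres = pixels[rng.choice(len(pixels), c, replace=False)]
    chunks = np.array_split(pixels, workers)              # one chunk per 'agent'
    with Pool(workers) as pool:
        for _ in range(iters):                            # 'host agent' coordination loop
            results = pool.map(local_step, [(ch, centres) for ch in chunks])
            sums = sum(r[0] for r in results)
            counts = sum(r[1] for r in results)
            centres = sums / np.maximum(counts, 1)[:, None]
    return centres

if __name__ == "__main__":
    mri = np.random.rand(256 * 256, 1)                    # stand-in for MRI pixel intensities
    print(distributed_cmeans(mri, c=3))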

Journal ArticleDOI
TL;DR: A comparative study among several of the most commonly used ECM tools is presented, which defines a characterization schema instantiated in a particular case, the Regional Government of Andalusia.
Abstract: Managing documentation in a suitable way has become a critical issue for any organization. Organizations depend on the information they store and are required to have appropriate mechanisms to support the functional needs of information storage, management and retrieval. Currently, there are several tools on the market, both free software and proprietary, normally named Enterprise Content Management (ECM) tools, which offer relevant solutions in this context. This paper presents a comparative study among several of the most commonly used ECM tools. It starts with a systematic review of the literature to analyze possible solutions and then defines a characterization schema instantiated in a particular case, the Regional Government of Andalusia.

Journal ArticleDOI
TL;DR: This paper will present the complexity factors that are related to project time, cost and quality management and then they will apply them to a number of selected projects, in order to compare the acquired results.
Abstract: Project management is a well understood management method, widely adopted today, in order to give predictable results to complex problems. However, quite often projects fail to satisfy their initial objectives. This is why studying the factors that affect the complexity of projects is quite important. In this paper, we will present the complexity factors that are related to project time, cost and quality management and then we will apply them to a number of selected projects, in order to compare the acquired results. The projects have been chosen in a way that results can be easily compared.

Journal ArticleDOI
TL;DR: This paper provides a detailed, formal investigation of variability in the family of schema architectures, which are central components in the architecture of federated database systems, and combines the semi-formal object-oriented modeling language UML with the formal object- oriented specification language Object-Z.
Abstract: Data integration requires managing heterogeneous schema information. A federated database system integrates heterogeneous, autonomous database systems on the schema level, whereby both local applications and global applications accessing multiple component database systems are supported. Such a federated database system is a complex system of systems which requires a well-designed organization at the system and software architecture level. A specific challenge that federated database systems face is the organization of schemas into a schema architecture. This paper provides a detailed, formal investigation of variability in the family of schema architectures, which are central components in the architecture of federated database systems. It is shown how the variability of specific architectures can be compared to the reference architecture and to each other. To achieve this, we combine the semi-formal object-oriented modeling language UML with the formal object-oriented specification language Object-Z. Appropriate use of inheritance in the formal specification, as enabled by Object-Z, greatly supports specifying and analyzing the variability among the studied schema architectures. The investigation also serves to illustrate the employed specification techniques for analyzing and comparing software architecture specifications.

Journal ArticleDOI
TL;DR: The tests and analysis of the proposed triangular chaotic map show great sensitivity to initial conditions, unpredictability, uniformly distributed and random-like output, and an infinite range of intensive chaotic population with large positive Lyapunov exponent values.
Abstract: Chaos theory attempts to explain the behaviour of a system that is sensitive to initial conditions, complex, and unpredictable. Chaotic systems are sensitive to any change in the initial condition(s) and are unpredictable in the long term. Chaos theory is applied today in many different fields of study. In this research, we propose a new one-dimensional Triangular Chaotic Map (TCM) with a fully intensive chaotic population. The TCM chaotic map is a one-way function that prevents finding a relationship between successive output values and increases the randomness of the output. The tests and analysis of the proposed triangular chaotic map show great sensitivity to initial conditions, unpredictability, uniformly distributed and random-like output, and an infinite range of intensive chaotic population with large positive Lyapunov exponent values. Moreover, TCM's characteristics are very promising for possible utilization in many different fields of study.
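
The abstract does not give the TCM equation itself, so the sketch below only illustrates how the positive Lyapunov exponent mentioned above is typically estimated for a one-dimensional map, using the logistic map as a stand-in for TCM.

import numpy as np

def lyapunov(map_fn, d_map_fn, x0, n=100_000, burn=1_000):
    """Estimate the Lyapunov exponent of a 1-D map as the average of log|f'(x)| along an orbit."""
    x = x0
    for _ in range(burn):                  # discard the transient
        x = map_fn(x)
    total = 0.0
    for _ in range(n):
        total += np.log(abs(d_map_fn(x)))
        x = map_fn(x)
    return total / n

# Logistic map x -> r*x*(1 - x), used here only as a stand-in for the paper's TCM.
r = 4.0
f = lambda x: r * x * (1 - x)
df = lambda x: r * (1 - 2 * x)

print("Lyapunov exponent:", lyapunov(f, df, x0=0.1))      # about ln(2) ~ 0.693 > 0, i.e. chaotic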

Journal ArticleDOI
TL;DR: This paper describes how test design can be done using the Combinatorial Testing approach for Internet of Things operating systems, taking Contiki as a case study and discussing what the approach could be for the RIOT and TinyOS operating systems.
Abstract: In this paper we describe how test design can be done using the Combinatorial Testing approach for Internet of Things operating systems. The Contiki operating system is taken as a case study, but we also discuss what the approach could be for the RIOT and TinyOS operating systems. We discuss how combinatorial coverage measurements can be gathered in addition to the traditional code coverage metric. The test design generated using Advanced Combinatorial Testing for Software is analyzed for the Contiki operating system. We elaborate the code coverage gathering technique for the Contiki simulator, which happens to be written in Java, and explain the usage of the Combinatorial Coverage Measurement tool. Although we have explained the test design methodology for Internet of Things operating systems, the approach can be followed for other open source software.
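
A small sketch of measuring 2-way (pairwise) combinatorial coverage of a test suite; the parameter names and values are hypothetical stand-ins for Contiki build options, and a real study would generate the suite with the Advanced Combinatorial Testing for Software tool and measure it with the Combinatorial Coverage Measurement tool mentioned above.

from itertools import combinations, product

# Hypothetical configuration parameters (illustrative only).
parameters = {
    "radio": ["cc2420", "rf230"],
    "mac": ["csma", "nullmac"],
    "routing": ["rpl", "static"],
    "ipv6": ["on", "off"],
}

# A small hand-picked test suite: each test assigns one value per parameter.
tests = [
    {"radio": "cc2420", "mac": "csma", "routing": "rpl", "ipv6": "on"},
    {"radio": "rf230", "mac": "nullmac", "routing": "static", "ipv6": "off"},
    {"radio": "cc2420", "mac": "nullmac", "routing": "rpl", "ipv6": "off"},
]

def pairwise_coverage(params, suite):
    """Fraction of all 2-way value combinations exercised by the suite."""
    names = list(params)
    total, covered = 0, 0
    for p1, p2 in combinations(names, 2):
        for v1, v2 in product(params[p1], params[p2]):
            total += 1
            if any(t[p1] == v1 and t[p2] == v2 for t in suite):
                covered += 1
    return covered / total

print(f"2-way coverage: {pairwise_coverage(parameters, tests):.0%}")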

Journal ArticleDOI
TL;DR: A new class of similarity measure based on the Lp metric between fuzzy sets is proposed which gives better results when compared to the existing distance measures in the area with Linear Discriminant Analysis (LDA).
Abstract: Face recognition has been an active research area in image processing for quite a long time. Face recognition systems have been evaluated with various types of algorithms for feature extraction, classification and matching. The similarity or distance measure is also an important factor in assessing the quality of a face recognition system, and various distance measures in the literature are widely used in this area. In this work, a new class of similarity measure based on the Lp metric between fuzzy sets is proposed, which gives better results when compared to the existing distance measures used in the area with Linear Discriminant Analysis (LDA). The results point in a positive direction: even with existing feature extraction methods, results can be improved if the similarity measure used in the matching stage is effective.
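
The exact fuzzy-set construction is not given in the abstract; the sketch below shows the general Lp (Minkowski) distance between feature vectors rescaled to [0, 1] so they can be treated as fuzzy membership vectors, with random vectors standing in for LDA features.

import numpy as np

def to_membership(v):
    """Rescale a feature vector to [0, 1] so it can be treated as a fuzzy set."""
    v = np.asarray(v, float)
    return (v - v.min()) / (v.max() - v.min() + 1e-12)

def lp_distance(a, b, p=2):
    """Lp (Minkowski) distance between two fuzzy membership vectors."""
    return np.sum(np.abs(to_membership(a) - to_membership(b)) ** p) ** (1.0 / p)

def match(probe, gallery, p=2):
    """Return the index of the gallery face whose features are closest to the probe."""
    dists = [lp_distance(probe, g, p) for g in gallery]
    return int(np.argmin(dists)), min(dists)

gallery = [np.random.rand(16) for _ in range(5)]          # stand-ins for LDA feature vectors
print(match(gallery[2] + 0.01 * np.random.rand(16), gallery))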

Journal ArticleDOI
TL;DR: In this paper, the authors identified testability factors namely effectiveness and reusability and established the correlation among testability, effectiveness, and reuability and justified the correlation with the help of statistical measures.
Abstract: The quality of a class diagram is critical because it has a significant influence on the overall quality of the finally delivered product. Testability has been recognized as a key factor in software quality. Estimating testability at the design stage is of crucial significance for software designers, allowing them to make the design more testable. In view of this fact, this paper identifies the testability factors effectiveness and reusability, establishes the correlation among testability, effectiveness and reusability, and justifies the correlation with the help of statistical measures. Moreover, the study develops a metric-based testability estimation model, which has been validated using an experimental test. Subsequently, the research integrates empirical validation of the developed model for high-level acceptance. Finally, a hypothesis test is performed according to the two standards to test the significance of the correlation.

Journal ArticleDOI
TL;DR: In this paper, the authors evaluate the efficacy of a tool developed in the Monte Carlo simulation and referred to as control chart, which is used in order to detect changes in productivity resulting from the occurrence of a given event during the welding of land pipelines with self shielded flux cored wire (FCAW).
Abstract: This article evaluates the efficacy of a tool, referred to as a control chart, developed using Monte Carlo simulation. The tool is used to detect changes in productivity resulting from the occurrence of a given event during the welding of land pipelines with self-shielded flux-cored arc welding (FCAW). The elaboration of this control chart is based on data from the Cumulative Probability Density Function (CDF) curve generated in the Monte Carlo simulation using version 6 of Palisade Corporation's @Risk software for Excel, in a sample with productivity data from 29 welded joints gathered through direct observation that considers productive and unproductive times. In order to evaluate the control chart's efficacy, welding productivity with an FCAW process on low alloy steels was assessed over 29 days, totalling 842 welded joints registered in "Relatorios Diarios de Obras" (Construction Works Daily Reports). The results show that the model developed for elaborating the control chart is effective in monitoring the productivity of the observed welding procedure.
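
A hedged sketch of deriving control limits from a simulated productivity CDF: numpy sampling stands in for the @Risk simulation, the triangular distribution and its parameters are assumed rather than fitted from the 29 observed joints, and the 5th/95th percentile limits and daily counts are purely illustrative.

import numpy as np

rng = np.random.default_rng(42)

# Assumed productivity model (joints welded per day); parameters are illustrative.
simulated = rng.triangular(left=18, mode=29, right=38, size=100_000)

# Control limits read off the simulated CDF (here the 5th and 95th percentiles).
lcl, ucl = np.percentile(simulated, [5, 95])

observed = [27, 31, 24, 16, 33, 29, 40, 28]               # hypothetical daily joint counts
for day, joints in enumerate(observed, start=1):
    flag = "" if lcl <= joints <= ucl else "  <-- outside control limits: investigate the day's events"
    print(f"day {day:2d}: {joints:3d} joints{flag}")
print(f"LCL = {lcl:.1f}, UCL = {ucl:.1f}")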

Journal ArticleDOI
TL;DR: The results of this study confirm the original TAM findings and reveal that students' faculty and the number of previously taken E-learning courses influence perceived ease of use and perceived usefulness, while the level of the academic year and GPA have no significant influence on perceived ease of use.
Abstract: Information and Communication Technology (ICT) makes our lives easier, faster and more connected, and its development is pushing countries toward ICT applications. Higher education is one of the sectors trying to adopt such applications through E-learning, using Moodle as a learning management system. This paper examines the impact of Moodle on students by studying students' acceptance of the system using the TAM model. The study was carried out by some teaching members of the King Abdullah II School for Information Technology at the University of Jordan during the spring semester of the academic year 2013/2014. The results of this study confirm the original TAM findings and reveal that students' faculty and the number of previously taken E-learning courses influence perceived ease of use and perceived usefulness, while the level of the academic year and GPA have no significant influence on perceived ease of use; they do, however, affect perceived usefulness. Finally, students' computer skills and their difficulty in reading from the screen affect perceived ease of use, but were found to have no influence on perceived usefulness.

Journal ArticleDOI
TL;DR: An evaluation model for assessing software architecture, the Architecture Design Testability Evaluation Model (ADTEM), is presented; results show that ADTEM is efficient and gives a considerable improvement to the software testability process.
Abstract: Architectural design is a crucial issue in software engineering. It makes testing more effective, as it contributes to carrying out testing at an early stage of software development. To improve software testability, the software architect should consider different testability metrics while building the software architecture. The main objective of this research is to conduct an early assessment of the software architecture in order to improve it and make the testing process more effective. In this paper, an evaluation model for assessing software architecture, the Architecture Design Testability Evaluation Model (ADTEM), is presented. ADTEM is based on two different testability metrics: cohesion and coupling. ADTEM consists of two phases: a software architecture evaluation phase and a component evaluation phase. In each phase, a fuzzy inference system is used to perform the evaluation based on the cohesion and coupling testing metrics. The model is validated using a case study, an Elders Monitoring System. The experimental results show that ADTEM is efficient and gives a considerable improvement to the software testability process.

Journal ArticleDOI
TL;DR: This paper describes the need for knowledge management in software engineering and proposes a knowledge-management-based model to support the software development process, which can save the overall cost and time of the requirements engineering process as well as of software development.
Abstract: Requirements engineering is the most important phase of any software development effort in determining the success or failure of the software. Knowledge modeling and management are tools that help software organizations learn. Traditional requirements engineering practices are based on stakeholder interaction, which causes iterative changes in requirements as well as difficulties in communication and in understanding the problem domain. To resolve such issues, we use knowledge-based techniques to support RE practices as well as the software development process. Our technique is based on two perspectives: theoretical and practical implementation. In this paper, we describe the need for knowledge management in software engineering and then propose a model based on knowledge management to support the software development process. To verify our results, we used a controlled experiment approach: we implemented our model and verified the results both with and without the proposed knowledge-based RE process. The proposed model can save the overall cost and time of the requirements engineering process as well as of software development.

Journal ArticleDOI
TL;DR: A multi-agent based transport system is modeled by timed automata model extended with clock variables that can be systematic and precise in assessing correctness by rigorously specifying the functional requirements.
Abstract: A multi-agent based transport system is modeled by timed automata model extended with clock variables. The correctness properties of safety and liveness of this model are verified by timed automata based UPPAAL. Agents have a degree of control on their own actions, have their own threads of control, and under some circumstances they are also able to take decisions. Therefore they are autonomous. The multi-agent system is modeled as a network of timed automata based agents supported by clock variables. The representation of agent requirements based on mathematics is helpful in precise and unambiguous specifications, thereby ensuring correctness. This formal representation of requirements provides a way for logical reasoning about the artifacts produced. We can be systematic and precise in assessing correctness by rigorously specifying the functional requirements.