
Showing papers in "Journal of Computer Science in 2005"


Journal ArticleDOI
TL;DR: The proposed cohesion metrics examine the fundamental quality of cohesion as it relates to ontologies, a characteristic that must be determined in order to make effective use of domain-specific ontologies.
Abstract: Recently, domain specific ontology development has been driven by research on the Semantic Web. Ontologies have been suggested for use in many application areas targeted by the Semantic Web, such as dynamic web service composition and general web service matching. Fundamental characteristics of these ontologies must be determined in order to effectively make use of them: for example, Sirin, Hendler and Parsia have suggested that determining fundamental characteristics of ontologies is important for dynamic web service composition. Our research examines cohesion metrics for ontologies. The cohesion metrics examine the fundamental quality of cohesion as it relates to ontologies.

170 citations


Journal ArticleDOI
TL;DR: The basal virus reproduction rate gives some theoretical hints about how to avoid infections in a computer network.
Abstract: To investigate the use of classical epidemiological models for studying computer virus propagation, we describe analogies between computer virus and population disease propagation using SIR (Susceptible-Infected-Removed) epidemiological models. By modifying these models with the introduction of anti-viral individuals, we analyze the stability of the disease-free equilibrium points. Consequently, the basal virus reproduction rate gives some theoretical hints about how to avoid infections in a computer network. Numerical simulations show the dynamics of the process for several parameter values, giving the number of infected machines as a function of time.
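A minimal sketch of the kind of SIR simulation the abstract describes is given below in Python. The forward-Euler integration and the parameter values are illustrative assumptions, and the paper's modification that adds anti-viral individuals is not reproduced; the point is only to show where the basal reproduction rate beta/gamma enters the picture.

```python
# Illustrative sketch only: classical SIR dynamics, not the paper's modified model.

def simulate_sir(s0, i0, r0, beta, gamma, dt=0.01, steps=10_000):
    """Forward-Euler integration of the classical SIR equations:
       dS/dt = -beta*S*I,  dI/dt = beta*S*I - gamma*I,  dR/dt = gamma*I."""
    s, i, r = s0, i0, r0
    history = []
    for _ in range(steps):
        ds = -beta * s * i
        di = beta * s * i - gamma * i
        dr = gamma * i
        s, i, r = s + ds * dt, i + di * dt, r + dr * dt
        history.append((s, i, r))
    return history

# Basal (basic) reproduction rate: an outbreak dies out when R0 = beta/gamma < 1,
# which is the kind of theoretical hint the abstract refers to.
beta, gamma = 0.5, 0.33          # assumed infection and removal rates
print("R0 =", beta / gamma)
trajectory = simulate_sir(s0=0.99, i0=0.01, r0=0.0, beta=beta, gamma=gamma)
print("infected fraction at end of simulation:", trajectory[-1][1])
```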

92 citations


Journal ArticleDOI
TL;DR: Results demonstrate that the neural network models trained using Bayesian Regularization provide the best results and are suitable for estimating the Source Lines of Code.
Abstract: It is well known that at the beginning of any project, the software industry needs to know how much it will cost to develop and how much time will be required. This paper examines the potential of using a neural network model for estimating the lines of code once the functional requirements are known. Using the International Software Benchmarking Standards Group (ISBSG) Repository Data (release 9) for the experiment, this paper examines the performance of a back-propagation feed-forward neural network in estimating the Source Lines of Code. Multiple training algorithms are used in the experiments. Results demonstrate that the neural network models trained using Bayesian Regularization provide the best results and are suitable for this purpose.
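The abstract does not give the network configuration, so the following is only a rough sketch of the estimation setup: a small feed-forward regressor mapping hypothetical functional-size features to SLOC. scikit-learn's MLPRegressor is used as a stand-in and does not implement Bayesian Regularization; plain L2 weight decay (`alpha`) is the closest analogue here, and the feature names and synthetic data are assumptions rather than actual ISBSG fields.

```python
# Illustrative sketch: feed-forward regression of SLOC from size-related features.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
# Hypothetical training data: [function points, team size] -> SLOC
X = rng.uniform([50, 2], [2000, 40], size=(200, 2))
y = 55.0 * X[:, 0] + 120.0 * X[:, 1] + rng.normal(0, 500, size=200)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = MLPRegressor(hidden_layer_sizes=(8,), alpha=1e-2, max_iter=5000,
                     random_state=0).fit(X_tr, y_tr)
print("R^2 on held-out projects:", model.score(X_te, y_te))
```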

76 citations


Journal ArticleDOI
TL;DR: This study proposes a four-parameter integrated measure of software maintainability using a fuzzy model and includes empirical data on the maintenance time of projects, which has been used to validate the proposed model.

Abstract: Software maintenance is a task that every development group has to face when the software is delivered to the customer's site, installed and operational. The time spent and effort required to keep software operational consume about 40-70% of the cost of the entire life cycle. This study proposes a four-parameter integrated measure of software maintainability using a fuzzy model. The study also includes empirical data on the maintenance time of projects, which has been used to validate the proposed model.

76 citations


Journal ArticleDOI
TL;DR: In this article, the authors propose a software life cycle, named the Y model, which provides guidance for the major phases to be followed under its umbrella, including domain engineering, frameworking, assembling, archiving and design of software components.
Abstract: With the need to produce ever larger and more complex software systems, the use of reusable components has become increasingly imperative. Of the many existing and proposed techniques for software development, it seems clear that component-based software development will be at the forefront of new approaches to the production of software systems and holds the promise of substantially enhancing the software production and maintenance process. Attempts to rationalize component-based development have to recognize that the construction of a software system is a complex multifaceted activity that involves domain engineering, frameworking, assembling, archiving and design of software components. These activities, among others, are encompassed by a software life cycle, named the Y model, put forward in this study. The Y model provides guidance for the major phases to be followed under its umbrella.

48 citations


Journal ArticleDOI
TL;DR: The paper combines OO software metric values into a single overall value (called the Testability Index) that can be used to calculate the testability of a class, and validates the Testability Index as a good integrated measure for arriving at the testability of a class.
Abstract: For large software systems, the testing phase seems to have a profound effect on the overall acceptability and quality of the final product. The success of this activity can be judged by measuring the testability of the software. A good measure of testability can help better manage testing effort and time. Different object-oriented metrics are used in the measurement of object-oriented testability, but none of them alone is sufficient to give an overall reflection of software testability. Thus, an integrated measure considering the effect of all these metrics is required to properly define testability. The paper combines OO software metric values into a single overall value (called the Testability Index) that can be used to calculate the testability of a class. The approach uses fuzzy techniques and concepts (fuzzification of crisp metric values, inference and aggregation, and defuzzification of the fuzzy output). We include empirical data on the testing time of 25 different Java classes, which proves that individual metric values are not sufficient to arrive at the testability of a class and validates the Testability Index as a good integrated measure for arriving at the testability of a class.
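A minimal sketch of the fuzzification, inference and defuzzification steps named in the abstract, in Python. The choice of metrics (WMC, LCOM), the membership breakpoints and the single rule are illustrative assumptions, not the paper's actual fuzzy model.

```python
# Illustrative sketch of a fuzzy combination of OO metrics into one index.

def tri(x, a, b, c):
    """Triangular membership function with feet at a, c and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def testability_index(wmc, lcom):
    # Fuzzification of two crisp OO metric values (assumed ranges).
    wmc_high = tri(wmc, 10, 25, 40)
    lcom_high = tri(lcom, 0.4, 0.7, 1.0)
    # One illustrative Mamdani-style rule: IF WMC is high AND LCOM is high
    # THEN testability is low. Rule strength via min (fuzzy AND).
    low_testability = min(wmc_high, lcom_high)
    # Crude defuzzification: map the "low testability" activation onto a 0-100
    # index (full centroid defuzzification over output sets is omitted here).
    return 100.0 * (1.0 - low_testability)

print(testability_index(wmc=30, lcom=0.8))   # complex, non-cohesive class
print(testability_index(wmc=8,  lcom=0.2))   # simple, cohesive class
```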

28 citations


Journal ArticleDOI
TL;DR: Theoretical analysis and experimental evaluations demonstrate that the techniques developed can improve both the recommendation recall and recommendation precision.
Abstract: With the huge amount and large variety of information available in a digital library, it is becoming harder and harder for users to identify and get hold of the documents they are interested in. To alleviate this difficulty, personalized recommendation techniques have been developed. Current recommendation techniques rely on similarity between documents. In our work, recommendations are made based on three factors: similarity between documents, information amount and information novelty. With the introduction of a degree of interest, users' interests can be better characterized. Theoretical analysis and experimental evaluations demonstrate that our techniques can improve both recommendation recall and recommendation precision.
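The exact formulation is not given in the abstract, so the sketch below only illustrates how the three factors could be combined into one ranking score; the weights, the vocabulary-size proxy for information amount and the novelty definition are assumptions.

```python
# Illustrative sketch: score = weighted mix of similarity, amount and novelty.
import math

def cosine(a, b):
    dot = sum(a[t] * b.get(t, 0.0) for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def recommend(candidates, profile, seen, w_sim=0.5, w_amt=0.2, w_nov=0.3):
    """candidates/seen: {doc_id: term-weight dict}; profile: term-weight dict."""
    scored = []
    for doc_id, vec in candidates.items():
        similarity = cosine(vec, profile)                 # match to user interest
        info_amount = min(1.0, len(vec) / 100.0)          # proxy: vocabulary size
        novelty = 1.0 - max((cosine(vec, s) for s in seen.values()), default=0.0)
        scored.append((w_sim * similarity + w_amt * info_amount + w_nov * novelty,
                       doc_id))
    return sorted(scored, reverse=True)
```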

26 citations


Journal ArticleDOI
TL;DR: A firm authentication method is proposed that extracts features from the holder's original name and the passport number and digests them into a form that can be hidden in the passport's photo.
Abstract: One of the serious problems is how to authenticate a passport document for its holder. The major factor in this authenticity is the correspondence of the passport's photo with its holder. Most passport documents contain the holder's signature in addition, of course, to the full name. We propose a firm authentication method that extracts features from the holder's original name and the passport number and, by applying some techniques, digests them into a form that can be hidden in the passport's photo. The modern method of issuing a passport is to use a computer to fix the passport's photo (imaging). Using this method, we can hide an invisible watermark containing the digested name and passport number inside the passport's photo. During the hiding process, several techniques can be applied to disguise any color difference that appears. After applying this technique, it is very simple to use a computer at a checkpoint to scan the passport and verify that the photo has not been replaced, by comparing the invisible watermark with the digest of the holder's name and the passport number.
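A minimal sketch of the basic idea, assuming SHA-256 as the digest and least-significant-bit embedding as the hiding technique; the paper's actual feature extraction and color-disguising methods are not specified here.

```python
# Illustrative sketch: digest name + passport number, hide the bits in pixel LSBs.
import hashlib

def digest_bits(name, passport_no):
    digest = hashlib.sha256(f"{name}|{passport_no}".encode("utf-8")).digest()
    return [(byte >> k) & 1 for byte in digest for k in range(8)]

def embed(pixels, bits):
    """pixels: flat list of 0-255 intensity values (longer than bits)."""
    out = list(pixels)
    for i, bit in enumerate(bits):
        out[i] = (out[i] & ~1) | bit          # overwrite the least significant bit
    return out

def verify(pixels, name, passport_no):
    bits = digest_bits(name, passport_no)
    return [p & 1 for p in pixels[:len(bits)]] == bits

photo = [128] * 1024                          # stand-in for real photo pixels
stamped = embed(photo, digest_bits("JANE DOE", "P1234567"))
print(verify(stamped, "JANE DOE", "P1234567"))   # True
print(verify(stamped, "JOHN DOE", "P1234567"))   # False -> photo/name mismatch
```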

23 citations


Journal ArticleDOI
TL;DR: A new model for software quality assurance is presented which addresses the problems of these approaches and includes quality factors that represent a common set of criteria while allowing tailoring to a local environment.
Abstract: Many of the early quality models have followed a hierarchical approach in which a set of factors that affect quality is defined, with little scope for expansion. More recent models have been developed that follow a 'define your own' approach with locally tailored factors. The aim of this study is to present a new model for software quality assurance which addresses the problems of these approaches and includes quality factors that represent a common set of criteria while allowing tailoring to a local environment. In addition, the proposed model allows the quality factors to be determined and analysed in an integrated, adaptable fashion.

21 citations


Journal ArticleDOI
TL;DR: This study applies the situation awareness concept to the current RFID system architecture and extends it to adapt to the environment, so that it can recognize, analyze and use situation information from sensors under the ubiquitous computing environment.
Abstract: Under the ubiquitous environment, many sensors providing situation data will be everywhere. The current RFID system should be extended to recognize and use situation information from these sensors. This study proposes RFID system architectures that are suitable for the ubiquitous environment. It applies the situation awareness concept to the current RFID system architecture and extends it to be able to adapt to the environment. The key components include an inference engine, a use policy and a definition language. The proposed architecture is named the SA-RFID system architecture; it provides the ability to recognize, analyze and use situation information from sensors under the ubiquitous computing environment. As a result, the usability of the current RFID system is improved and its range of applications increases.

20 citations


Journal ArticleDOI
TL;DR: This research reports on work in progress to develop a framework for Semantic Web mining and exploration, and suggests a practical method for applying the Semantic Web to e-Learning along with its design framework.
Abstract: The Semantic Web has opened new horizons for internet applications in general and for e-Learning in particular. The e-Learning community is aiming at much more effective services than what is currently provided by any of the available computer-aided tutoring or learning management systems. The e-Learning process is no longer restricted to providing courses online. Nowadays, knowledge is distributed throughout the Web on millions of pages, PDF files, multimedia and other resources. The learner is not necessarily someone who is registered in a course or needs e-Learning to support a particular course. Students and researchers need vast amounts of material and spend a considerable amount of time trying to learn about a particular subject or find relevant information. This research reports on work in progress to develop a framework for Semantic Web mining and exploration; a practical method for applying the Semantic Web to e-Learning, along with its design framework, is suggested.

Journal ArticleDOI
TL;DR: There is overlap between some of the complexity and cohesion metrics, and this study points to a more basic relationship between complexity and cohesion: that a lack of cohesion may be associated with high complexity.
Abstract: Many metrics have been proposed to measure the complexity or cohesion of object-oriented software. However, the complexity or cohesion of a piece of software is more difficult to capture than these metrics imply. In fact, studies have shown that existing metrics consistently fail to capture complexity or cohesion well. This study explores the reasons behind these results: cohesion is difficult to capture from syntactic elements of code, complexity is too multi-faceted to be captured by one metric and the qualities of complexity and cohesion are not independent. These factors have resulted in metrics that are purported to measure complexity or cohesion but are inadequate or misclassified. This study shows that there is overlap between some of the complexity and cohesion metrics and points to a more basic relationship between complexity and cohesion: that a lack of cohesion may be associated with high complexity.

Journal ArticleDOI
Barry O, Kevin Curran, Derek Woods
TL;DR: The aim of the study was to establish which mobile phone text input method best suits the requirements of a select group of target users.
Abstract: Human Computer Interaction is a primary factor in the success or failure of any device, but if an objective view is taken of the current mobile phone market you would be forgiven for thinking usability was secondary to aesthetics. Many phone manufacturers modify the design of phones to be different from the competition and to target fashion trends, usually at the expense of usability and performance. There is a lack of awareness among many buyers of the usability of the device they are purchasing, and the disposability of modern technology is an effect rather than a cause of this. Designing new text entry methods for mobile devices can be expensive and labour-intensive. The assessment and comparison of a new text entry method with current methods is a necessary part of the design process. The best way to do this is through an empirical evaluation. The aim of the study was to establish which mobile phone text input method best suits the requirements of a select group of target users. This study used a diverse range of users to compare devices that are in everyday use by most of the adult population. The proliferation of these devices is as yet unmatched by the study of their application and the consideration of their user-friendliness.

Journal ArticleDOI
TL;DR: This study presents a real-time face detection system which uses an image-based neural network to detect faces, and which naturally leads to a solution to the problem of heavily constrained testing environments.
Abstract: As continual research is being conducted in the area of computer vision, one of the most practical applications under vigorous development is the construction of a robust real-time face detection system. Successfully constructing a real-time face detection system not only implies a system capable of analyzing video streams, but also naturally leads to a solution to the problem of heavily constrained testing environments. Analyzing a video sequence is the current challenge, since faces are constantly in dynamic motion, presenting many different possible rotational and illumination conditions. While solutions to the task of face detection have been presented, the detection performance of many systems is heavily dependent upon a strictly constrained environment. The problem of detecting faces under gross variations remains largely unaddressed. This study presents a real-time face detection system which uses an image-based neural network to detect faces.
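As a rough illustration of what an image-based detector does at run time, the sketch below scans a frame with a sliding window; `looks_like_face` is a hypothetical placeholder for the paper's trained neural network, and the window size and stride are assumptions.

```python
# Illustrative sketch: sliding-window scan of one video frame.
import numpy as np

def looks_like_face(window):
    """Placeholder classifier: the real system would run a neural network here."""
    return window.mean() > 0.5                # assumption for illustration only

def detect_faces(frame, win=20, stride=10):
    """Scan a grayscale frame (2-D array in [0, 1]) with a fixed-size window."""
    hits = []
    h, w = frame.shape
    for y in range(0, h - win + 1, stride):
        for x in range(0, w - win + 1, stride):
            if looks_like_face(frame[y:y + win, x:x + win]):
                hits.append((x, y, win, win))
    return hits

frame = np.random.rand(120, 160)              # stand-in for one video frame
print(len(detect_faces(frame)), "candidate windows flagged")
```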

Journal ArticleDOI
TL;DR: It is shown how the teaching of an algorithm can be greatly enhanced and the tutor's effort decreased with a visualization tool that provides an interactive and animated view of the subject being taught to the students.
Abstract: This study presents a new learning tool developed for the visualization of an algorithm for the assignment problem. We show how the teaching of an algorithm can be greatly enhanced and the tutor's effort decreased with a visualization tool that provides an interactive and animated view of the subject being taught to the students. This tool makes use of Java technology; therefore it is platform independent and can be used efficiently in distance education. The "Achatz, Kleinschmidt and Paparrizos" algorithm features significant tree modifications and, furthermore, this is the first time that it has been visualized. The tool has a friendly user interface, thus enabling the student user to become familiar with it quickly. It can be used efficiently in courses like Graph Theory or Network Optimization. Benefits and drawbacks are thoroughly described in order to support the significance of this tool in computer-aided education.

Journal ArticleDOI
TL;DR: An approach for adaptive pedagogical hypermedia document generation is proposed and implemented in a prototype called KnowledgeClass, based on a specialized artificial neural network model that allows automatic generation of individualised courses according to the learner's goal and previous knowledge.
Abstract: Traditional sequencing technologies developed in the field of intelligent tutoring systems have not found an immediate place in large-scale Web-based education. This study investigates the use of computational intelligence for adaptive lesson generation in a distance learning environment over the Web. An approach for adaptive pedagogical hypermedia document generation is proposed and implemented in a prototype called KnowledgeClass. This approach is based on a specialized artificial neural network model. The system allows automatic generation of individualised courses according to the learner's goal and previous knowledge and can dynamically adapt the course according to the learner's success in acquiring knowledge. Several experiments showed the effectiveness of the proposed method.

Journal ArticleDOI
TL;DR: The proposed method analyses the essentiality of a given variable order based on the complexity of sub-functions derived from variable substitution to minimize the time complexity of Binary Decision Diagrams.
Abstract: A large variety of problems in digital system design, combinational optimization and verification can be formulated in terms of operations performed on Boolean functions. The time complexity of a Binary Decision Diagram (BDD) representing a Boolean function is directly related to the path length of that BDD. In this paper we present a method to generate a BDD with minimum path length. The Average Path Length (APL) and Longest Path Length (LPL) of the BDD are evaluated and discussed. The proposed method analyses the essentiality of a given variable order based on the complexity of sub-functions derived from variable substitution. The variable that produces minimal cumulative complexity for the sub-functions is given priority over other variables. The experimental results and comparisons using benchmark circuits show that the proposed method is an encouraging approach towards minimizing the evaluation time of Boolean functions, consequently minimizing the time complexity of BDDs.
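A minimal sketch of the path-length quantities involved, computed on a toy BDD; APL is taken here as the plain mean over root-to-terminal paths, whereas the paper may use a probability-weighted definition, so treat the numbers as illustrative only.

```python
# Illustrative sketch: nodes are (variable, low_child, high_child); leaves are 0/1.

def path_lengths(node, depth=0):
    if node in (0, 1):                 # terminal reached
        return [depth]
    _, low, high = node
    return path_lengths(low, depth + 1) + path_lengths(high, depth + 1)

# Toy BDD for f = x1 AND (x2 OR x3) under the variable order x1 < x2 < x3.
x3 = ("x3", 0, 1)
x2 = ("x2", x3, 1)
bdd = ("x1", 0, x2)

lengths = path_lengths(bdd)
print("LPL =", max(lengths), " APL =", sum(lengths) / len(lengths))
```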

Journal ArticleDOI
TL;DR: The Q-Learning training method, a reinforcement learning technique sometimes called training by penalty and reward, is presented and illustrated by an application to the navigation of a mobile agent in a closed enclosure from a starting point to an arbitrary arrival point.
Abstract: In this article, we present the Q-Learning training method, a reinforcement learning technique sometimes called training by penalty and reward. We illustrate it with an application to the navigation of a mobile agent in a closed enclosure from a starting point to an arbitrary arrival point. The objective is to find an optimal path without leaving the enclosure.
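A minimal sketch of tabular Q-learning on a small grid standing in for the enclosure; the grid size, reward values and hyper-parameters are illustrative assumptions, not the article's actual setup.

```python
# Illustrative sketch: tabular Q-learning from a start cell to a goal cell.
import random

SIZE, GOAL = 5, (4, 4)
ACTIONS = [(-1, 0), (1, 0), (0, -1), (0, 1)]          # up, down, left, right
Q = {((r, c), a): 0.0 for r in range(SIZE) for c in range(SIZE)
     for a in range(len(ACTIONS))}
alpha, gamma, epsilon = 0.5, 0.9, 0.1

def step(state, action):
    dr, dc = ACTIONS[action]
    r, c = state[0] + dr, state[1] + dc
    if not (0 <= r < SIZE and 0 <= c < SIZE):          # penalty-reward idea:
        return state, -1.0, False                      # leaving the enclosure is punished
    if (r, c) == GOAL:
        return (r, c), +10.0, True                     # reaching the goal is rewarded
    return (r, c), -0.1, False

for _ in range(2000):                                  # training episodes
    s, done = (0, 0), False
    while not done:
        a = (random.randrange(4) if random.random() < epsilon
             else max(range(4), key=lambda a: Q[(s, a)]))
        s2, reward, done = step(s, a)
        best_next = max(Q[(s2, a2)] for a2 in range(4))
        Q[(s, a)] += alpha * (reward + gamma * best_next - Q[(s, a)])
        s = s2

print("learned value of start state:", max(Q[((0, 0), a)] for a in range(4)))
```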

Journal ArticleDOI
TL;DR: This work further refines the framework and the network challenges, and defines a global network cost function that takes into account both physical and network layer parameters to derive general guidelines for ad-hoc networks based on UWB technology.
Abstract: Short-range wireless systems have recently gained a lot of attention for providing seamless, multimedia communications around a user-centric concept in so-called Wireless Personal Area Networks (WPAN). Ultra-Wide Band (UWB) technology presents itself as a good candidate for the Physical Layer (PHY) of WPAN, both for high and low data rate applications. Many ongoing developments of UWB systems concentrate mainly on the physical layer, without significant consideration being given to medium access techniques, especially for the targeted ad-hoc networking scenarios. Here, we further refine the framework and the network challenges, and define a global network cost function that takes into account both physical and network layer parameters to derive general guidelines for ad-hoc networks based on UWB technology.

Journal ArticleDOI
TL;DR: This study develops two new static partitioning schemes that can split each dimension of the space within linear space complexity and support an effective mechanism for handling skewed data in heavily sparse spaces.
Abstract: This study introduces a class of region preserving space transformation (RPST) schemes for accessing high-dimensional data. The access methods in this class differ with respect to their space-partitioning strategies. The study develops two new static partitioning schemes that can split each dimension of the space within linear space complexity. They also support an effective mechanism for handling skewed data in heavily sparse spaces. The techniques are experimentally compared to the Pyramid Technique, which is another example of static partitioning designed for high-dimensional data. On real high-dimensional data, the proposed RPST schemes outperform the Pyramid Technique by a significant margin.

Journal ArticleDOI
TL;DR: This study presents a framework for quality metric development called the Metric Development Framework (qMDF), which is prescriptive in nature and has been implemented to develop a good quality design metric as a validation of the proposed framework.
Abstract: Several object-oriented metrics have been developed and used in conjunction with quality models to predict the overall quality of software. However, it may not be enough to propose metrics; the fundamental question may be that of their validity, utility and reliability. It is significant to be sure that these metrics are really useful, and for that their construct validity must be assured. Thereby, good quality metrics must be developed using a foolproof and sound framework or model. A critical review of the literature on attempts in this regard reveals that there is no standard framework or model available for such an important activity. This study presents a framework for quality metric development called the Metric Development Framework (qMDF), which is prescriptive in nature. qMDF is a general framework, but it has been established specially with ideas from object-oriented metrics. qMDF has been implemented to develop a good quality design metric as a validation of the proposed framework. Finally, it is argued that adoption of qMDF by metric developers would yield good quality metrics, while ensuring their construct validity, utility and reliability and reducing development effort.

Journal ArticleDOI
TL;DR: The results indicate that increasing WLAN traffic increases the overall downward handoff latency more than increasing GPRS traffic.
Abstract: This study presents the performance analysis of a new tight coupling based WLAN/GPRS interworking architecture. The effects of network traffic on downward handoff latency are investigated. The results indicate that increasing WLAN traffic increases the overall downward handoff latency more than increasing GPRS traffic. On the other hand, increasing GPRS traffic results in higher packet buffering requirements at the Serving GPRS Support Node (SGSN).

Journal ArticleDOI
TL;DR: Comparative analysis with an existing neural approach shows the superiority of the proposed scheme over existing methods in terms of computational complexity and performance.
Abstract: The present study proposes a novel technique for copyright protection by means of digital watermarking of images. The watermark is embedded and detected using a Functional Link Artificial Neural Network (FLANN) and the Discrete Cosine Transform (DCT). Exhaustive simulation results show that the proposed scheme improves on existing methods in all cases, i.e. when the watermarked image is subjected to compression, cropping, sharpening, blurring and noise. Comparative analysis with an existing neural approach shows the superiority of the proposed scheme in terms of computational complexity and performance.
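The FLANN-based embedding and detection are not reproduced here; as a rough illustration of DCT-domain watermarking, the sketch below hides one bit by nudging a mid-band DCT coefficient of an 8x8 block, with the coefficient position and strength chosen arbitrarily.

```python
# Illustrative sketch: one-bit watermark in the DCT domain of an 8x8 block.
import numpy as np
from scipy.fftpack import dct, idct

def dct2(block):  return dct(dct(block, axis=0, norm="ortho"), axis=1, norm="ortho")
def idct2(block): return idct(idct(block, axis=0, norm="ortho"), axis=1, norm="ortho")

def embed_bit(block8x8, bit, strength=12.0):
    coeffs = dct2(block8x8.astype(float))
    coeffs[3, 4] = abs(coeffs[3, 4]) if bit else -abs(coeffs[3, 4])
    coeffs[3, 4] += strength if bit else -strength     # push the sign well clear of noise
    return idct2(coeffs)

def extract_bit(block8x8):
    return int(dct2(block8x8.astype(float))[3, 4] > 0)

block = np.random.randint(0, 256, (8, 8))
print(extract_bit(embed_bit(block, 1)), extract_bit(embed_bit(block, 0)))  # 1 0
```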

Journal ArticleDOI
TL;DR: Comprehensive mathematics is used to design and describe each part of the presented system, which allows methodical development and changes of the system parameters for future advances.
Abstract: An electronic nose system for multi-application purposes is designed and tested. Our hardware design allows various types of sensors to be used for different applications. The system is capable of being interfaced to both analog and digital instruments, with special filtering devices that isolate the system from surrounding signals. The design of the system is distinguished by the use of two logic-controlled micro fans that stabilize the system atmosphere surrounding the smelling sensor and also serve the important function of removing any adsorbed odors from the surface of the sensor. Comprehensive mathematics is used to design and describe each part of the presented system, which allows methodical development and changes of the system parameters for future advances. Testing of the hardware was carried out under computer control using various TGS sensors such as the TGS822, TGS824 and TGS30. Database-based software with a neural network facility was designed to interface with the built hardware and to process the electronic nose signals before classification.

Journal ArticleDOI
TL;DR: This study proposes a formal representation of abstraction and generality, along with a two-dimensional space for the representation of their application, based on examples of software development.
Abstract: Although the evolving field of software engineering introduces many methods and modelling techniques, we conjecture that the concepts of abstraction and generality are among the fundamentals of each such methodology. This study proposes a formal representation of these two concepts, along with a two-dimensional space for the representation of their application. Based on examples, we further elaborate and discuss the notion of abstraction and generalisation transformations in various domains of software development.

Journal ArticleDOI
TL;DR: An asynchronous NoC router for use in 2-D mesh-connected networks, based on the Speed Independent State Transition Graph model, is presented, and its performance is compared with a synchronous router of the same functionality.
Abstract: The Quality of Service Network on Chip (QNoC) is a highly performant solution that provides low-latency transfers and a power-efficient System on Chip (SoC) interconnect. This study presents an asynchronous NoC router for use in 2-D mesh-connected networks. It comprises multiple interconnected input and output ports and dynamic arbitration mechanisms that resolve any output port conflicts based on message priorities. The proposed router protocol and its asynchronous modeling are based on the Speed Independent (SI) State Transition Graph (STG) model. The generated STGs are transformed into VHDL data flow descriptions, and the low-level implementation is based on a parameterized library. This library integrates the popular asynchronous SI modules such as the C-element, Q-element, fair arbiter, etc. The device is implemented in 0.35 µm CMOS technology and its performance is compared with a synchronous router of the same functionality. The asynchronous router enables a higher data rate and a comparable silicon area.

Journal ArticleDOI
TL;DR: It is shown how the same techniques of meta-modeling and meta-levels can be applied in component-based software architecture, and that there is a need for mechanisms of reflexivity within the domain of software architecture meta-modeling.
Abstract: The techniques of meta-modeling and meta-levels have become mature concepts and have been largely used to solve real problems in programming languages, distributed environments, knowledge representation and databases. In this article it is shown how the same techniques can be applied in component-based software architecture. It is also shown that there is a need for mechanisms of reflexivity within the domain of software architecture meta-modeling. The outcome of this is a meta-meta-architecture with a minimal core whose purpose is to define meta-components, meta-connectors and meta-architectures. We call this meta-meta-architecture MADL (Meta Architecture Description Language).

Journal ArticleDOI
TL;DR: This study provides an updated, focused survey of major aspects and problems related to the task of modeling user behavior and points out some recent advances in areas related to automatic Web navigation and implicit capturing of user interest in a particular page.
Abstract: Techniques which assist in recognizing the various access patterns and interests of Web users are normally referred to as applications of Web usage mining. Such applications provide useful insights into crucial points such as user interest in a particular page, server load balancing, Web site reorganization, clustering of similar browsing patterns, classification of user profiles, content caching, and data distribution and replication. In this study, we provide an updated, focused survey of major aspects and problems related to the task of modeling user behavior. We also point out some recent advances in areas related to automatic Web navigation and implicit capturing of user interest in a particular page. Main future directions are also outlined in the conclusion.

Journal ArticleDOI
TL;DR: This research proposes a hybrid heuristic approach for further improving the quality of solutions obtained from Tabu Search in combination with variable neighborhood search, a recent metaheuristic that is based on the principle of systematic change of neighborhood during the local search.
Abstract: The problem of assigning cells to switches in wireless networks consists of minimizing the total operating cost, that is, the cost of linking cells to switches and the cost of handover from one cell to another, by taking into account factors such as network topology, switch capacity and traffic load in the entire network. This problem is well known in the literature to be NP-hard, such that exact enumerative approaches are not suitable for solving real-size instances. Thus, heuristics are recommended and have been used for finding good solutions in reasonable execution times. Tabu Search (TS) is one of the best heuristics used to solve this problem. This research proposes a hybrid heuristic approach for further improving the quality of solutions obtained from TS. This approach applies TS in combination with variable neighborhood search, a recent metaheuristic based on the principle of systematically changing the neighborhood during local search. A key element in the success of this approach is the use of several neighborhood structures that complement each other well and that remain within the feasible region of the search space.
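A minimal sketch of two ingredients such a hybrid relies on: a single-cell reassignment neighborhood and a VNS-style shake that perturbs k assignments before local search. The cost data below are random placeholders, and capacity constraints and the tabu-list machinery of the full method are omitted.

```python
# Illustrative sketch: cell-to-switch assignment with local search + VNS shaking.
import random

N_CELLS, N_SWITCHES = 12, 3
link = [[random.uniform(1, 10) for _ in range(N_SWITCHES)] for _ in range(N_CELLS)]
hand = [[random.uniform(0, 2) for _ in range(N_CELLS)] for _ in range(N_CELLS)]

def cost(assign):
    c = sum(link[i][assign[i]] for i in range(N_CELLS))
    c += sum(hand[i][j] for i in range(N_CELLS) for j in range(N_CELLS)
             if assign[i] != assign[j])          # handover cost across switches
    return c

def local_search(assign):
    improved = True
    while improved:
        improved = False
        for i in range(N_CELLS):                 # best single-cell reassignment
            best = min(range(N_SWITCHES),
                       key=lambda s: cost(assign[:i] + [s] + assign[i + 1:]))
            if best != assign[i]:
                assign[i], improved = best, True
    return assign

def shake(assign, k):
    assign = assign[:]
    for i in random.sample(range(N_CELLS), k):   # perturb k cells at random
        assign[i] = random.randrange(N_SWITCHES)
    return assign

best = local_search([random.randrange(N_SWITCHES) for _ in range(N_CELLS)])
k = 1
while k <= 3:                                    # systematic change of neighborhood
    candidate = local_search(shake(best, k))
    if cost(candidate) < cost(best):
        best, k = candidate, 1                   # improvement: restart at k = 1
    else:
        k += 1
print("best cost found:", round(cost(best), 2))
```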

Journal ArticleDOI
TL;DR: OntoCell regroups and unifies the main concepts and relations related to the cell's structure and behavior and will be validated in the context of the development of a multi-agent system simulating the behavior of a cellular population.
Abstract: This article presents OntoCell, a cellular ontology that we constructed and edited under Protege2000. It regroups and unifies the main concepts and relations related to the cell's structure and behavior. OntoCell has been validated by experts in biology (by the UMRS INSERM 514). Moreover, it will be validated in the context of the development of a multi-agent system simulating the behavior of a cellular population.