
Showing papers presented at the "International Conference on Software and Data Technologies" in 2008


Proceedings Article
01 Jan 2008
TL;DR: Presents a way to model quality-of-service tradeoffs based on utility theory, which research indicates end-users with diverse backgrounds are able to leverage for guiding adaptive behaviors towards activity-specific quality goals.
Abstract: This paper presents a framework for engineering resource-adaptive software systems targeted at small mobile devices. The proposed framework empowers users to control tradeoffs among a rich set of service-specific aspects of quality of service. After motivating the problem, the paper proposes a model for capturing user preferences with respect to quality of service, and illustrates prototype user interfaces to elicit such models. The paper then describes the extensions and integration work made to accommodate the proposed framework on top of an existing software infrastructure for ubiquitous computing. The research question addressed here is the feasibility of coordinating resource allocation and adaptation policies in a way that end-users can understand and control in real time. The evaluation covered both the systems and the usability perspectives, the latter by means of a user study. The contributions of this work are: first, a set of design guidelines for resource-adaptive systems, including APIs for integrating new applications; second, a concrete infrastructure that implements the guidelines; and third, a way to model quality of service tradeoffs based on utility theory, which our research indicates end-users with diverse backgrounds are able to leverage for guiding the adaptive behaviors towards activity-specific quality goals.

28 citations
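To make the utility-theoretic tradeoff model concrete, below is a minimal, hypothetical Java sketch: per-dimension utility curves are combined through user-chosen weights into an overall utility score that an adaptation policy could maximize. The dimension names, curve shapes, and weighted-sum combination are illustrative assumptions, not the paper's actual model or API.

```java
// Hypothetical sketch of a utility-theoretic QoS tradeoff model.
// Dimension names, weights, and the weighted-sum combination are
// illustrative assumptions, not the paper's actual API.
public class QosUtility {
    // Per-dimension utility functions map a quality level in [0,1]
    // to a user satisfaction score in [0,1].
    public static double frameRateUtility(double level) {
        return Math.min(1.0, level / 0.8);  // saturates near full quality
    }

    public static double resolutionUtility(double level) {
        return Math.sqrt(level);            // diminishing returns
    }

    // Overall utility: a weighted sum expressing the user's tradeoff
    // preferences for the current activity.
    public static double overallUtility(double frameRate, double resolution,
                                        double wFrameRate, double wResolution) {
        return wFrameRate * frameRateUtility(frameRate)
             + wResolution * resolutionUtility(resolution);
    }

    public static void main(String[] args) {
        // A user who favors smooth motion over sharpness (weights sum to 1).
        System.out.println(overallUtility(0.9, 0.4, 0.7, 0.3));
    }
}
```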


Proceedings Article
01 Jan 2008
TL;DR: Presents a new Visual Notation for GUI Modelling and testing (VAN4GUIM), which aims to hide, as far as possible, the formalism details inherent in the models used in model-based testing (MBT) approaches, and to promote the use of MBT in industrial environments by providing a visual modelling front-end that is more attractive to testers than textual notations.
Abstract: This paper presents a new Visual Notation for GUI Modelling and testing (VAN4GUIM), which aims to hide, as far as possible, the formalism details inherent in the models used in model-based testing (MBT) approaches, and to promote the use of MBT in industrial environments by providing a visual modelling front-end that is more attractive to testers than textual notations. The visual notation is developed as five different UML profiles and based on three notations/concepts: the Canonical Abstract Prototyping notation; the ConcurTaskTrees (CTT) notation; and the Window Manager concept. A set of translation rules was defined in order to automatically perform conversion from VAN4GUIM to Spec#. GUI models are developed with the VAN4GUIM notation and then translated automatically to Spec#, which can then be completed manually with additional behaviour not included in the visual model. As soon as a Spec# model is completed, it can be used as input to Spec Explorer (a model-based testing tool), which generates test cases and executes those tests automatically. (Work partially supported by FCT (Portugal) under contract PTDC/EIA/66767/2006.)

16 citations


Proceedings Article
01 Jan 2008
TL;DR: This paper describes the pattern-based techniques used in LABAS for service identification, for transformation from business models to service architectures and for architecture modifications.
Abstract: Service architectures are an increasingly adopted architectural approach to the Enterprise Application Integration (EAI) problem arising from business process automation requirements. In previous work, we developed a methodological framework for designing service architectures for EAI. The framework is structured in a layered architecture called LABAS, and is distinguished by its use of architectural abstractions in different layers. This paper describes the pattern-based techniques used in LABAS for service identification, for transformation from business models to service architectures, and for architecture modifications.

13 citations


Proceedings Article
05 Jul 2008
TL;DR: The part of recommender system which is based on automatically identifying opinions using natural language processing knowledge is described, which allows the automatic collection, evaluation and rating of critics and opinions of the movies.
Abstract: This paper describes the part of a recommender system designed for recognizing movie reviews. The system allows the automatic collection, evaluation and rating of reviews and opinions of movies. First, the system searches the Internet and retrieves texts presumed to be movie reviews. Subsequently, the system carries out an evaluation and rating of the reviews. Finally, the system automatically associates a numerical mark with each review. The goal of the system is to score the reviews and associate the scores with the users who wrote them. All of these data are the input to the cognitive engine. Data from our base allow making the correspondences which cognitive algorithms require to improve advanced recommending functionalities for e-business and e-purchase websites. Our system uses three different methods for classifying opinions from reviews. In this paper we describe the part of the system that automatically identifies opinions using natural language processing knowledge.

13 citations


Proceedings Article
01 Jan 2008
TL;DR: Presents a modelling approach for dynamic processes in diagnosis that is compatible with the Stochastic Approach framework for discovering temporal knowledge from the timed observations contained in a database.
Abstract: This paper presents a modelling approach for dynamic processes in diagnosis that is compatible with the Stochastic Approach framework for discovering temporal knowledge from the timed observations contained in a database. The motivation is to define a multi-model formalism that is able to represent the knowledge of both of these sources. The aim is to model the process at the same level of abstraction that an expert uses to diagnose the process. The underlying idea is that at this level of abstraction the model is simple enough to allow an efficient diagnosis. The proposed formalism represents the knowledge in four models: a structural model defining the components and the connection relations of the process; a behavioural model defining the states and the state transitions of the process; a functional model containing the logical relations between the values of the process's variables; and a perception model in which those variables are defined. The models are linked by the process's variables. This facilitates the analysis of the consistency of the four models and is the basis of the corresponding knowledge modelling methodology. The formalism and the methodology are illustrated with the model of a hydraulic dam at Cublize (France).

12 citations


Proceedings Article
01 Jan 2008
TL;DR: This work presents a programming technique, based on dynamic proxies, that allows an object's type to be augmented at runtime while preserving strong static type safety, and enables role-based implementations that lead to more reuse and better separation of concerns.
Abstract: Purely class-based implementations of object-oriented software are often inappropriate for reuse. In contrast, the notion of objects playing roles in a collaboration has proven to be a valuable reuse abstraction. However, existing solutions for enabling role-based programming tend to require vast extensions of the underlying programming language, and thus are difficult to use in everyday work. We present a programming technique, based on dynamic proxies, that allows an object's type to be augmented at runtime while preserving strong static type safety. It enables role-based implementations that lead to more reuse and better separation of concerns.

11 citations
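As a rough illustration of the mechanism behind this technique, the sketch below uses Java's standard java.lang.reflect.Proxy to let a core object acquire an extra role interface at runtime. The interfaces, the dispatch rule, and the attachRole helper are hypothetical; the paper's actual contribution includes static-type-safety machinery that a plain proxy like this does not provide.

```java
// Minimal sketch of role augmentation via a dynamic proxy: a core object
// gains an additional Role interface at runtime. All names here are
// hypothetical illustrations, not the paper's API.
import java.lang.reflect.InvocationHandler;
import java.lang.reflect.Proxy;

interface Person { String getName(); }
interface Employee { double getSalary(); }   // the role to attach

public class RoleDemo {
    @SuppressWarnings("unchecked")
    static <T> T attachRole(Object core, Object roleState, Class<?>... ifaces) {
        InvocationHandler h = (proxy, method, args) -> {
            // Dispatch to the role object if it implements the method's
            // declaring interface, otherwise to the core object.
            Object target = method.getDeclaringClass()
                    .isInstance(roleState) ? roleState : core;
            return method.invoke(target, args);
        };
        return (T) Proxy.newProxyInstance(
                RoleDemo.class.getClassLoader(), ifaces, h);
    }

    public static void main(String[] args) {
        Person alice = () -> "Alice";
        Employee role = () -> 42_000.0;
        // The proxy's type is Person AND Employee at the same time.
        Person augmented = attachRole(alice, role,
                Person.class, Employee.class);
        System.out.println(augmented.getName());
        System.out.println(((Employee) augmented).getSalary());
    }
}
```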


Proceedings Article
01 Jan 2008
TL;DR: Presents one of the first steps of a development methodology for secure mobile Grid computing-based systems: analyzing the security requirements of mobile Grid systems in order to obtain the set of security requirements that the methodology must cover and implement.
Abstract: Interest in incorporating mobile devices into Grid systems has arisen with two main purposes: the first is to enrich the users of these devices, while the other is to enrich the Grid infrastructure itself. The security of these systems, due to their distributed and open nature, is considered a topic of great interest. A formal approach to security in the software life cycle is essential to protect corporate resources. However, little attention has been paid to this aspect of software development. Due to its criticality, security should be integrated as a formal approach into the software life cycle. We are developing a development methodology for secure mobile Grid computing-based systems that helps to design and build secure Grid systems with support for mobile devices, directed by use cases and security use cases and focused on a service-oriented security architecture. In this paper, we present one of the first steps of our methodology, consisting of analyzing the security requirements of mobile Grid systems. This analysis will allow us to obtain a set of security requirements that our methodology must cover and implement.

10 citations


Book ChapterDOI
22 Jul 2008
TL;DR: This work presents a programming technique, based on dynamic proxies, that allows an object's type to be augmented at runtime while preserving strong static type safety, and enables role-based implementations that lead to more reuse and better separation of concerns.
Abstract: Purely class-based implementations of object-oriented software are often inappropriate for reuse. In contrast, the notion of objects playing roles in a collaboration has proven to be a valuable reuse abstraction. However, existing solutions for enabling role-based programming tend to require vast extensions of the underlying programming language, and thus are difficult to use in everyday work. We present a programming technique, based on dynamic proxies, that allows an object's type to be augmented at runtime while preserving strong static type safety. It enables role-based implementations that lead to more reuse and better separation of concerns.

10 citations


Proceedings Article
01 Jan 2008
TL;DR: A technique for extracting coarse-grained parallelism in loops, based on splitting a set of dependence relations into two sets, is presented, and a way of properly inserting and executing send and receive functions is demonstrated.
Abstract: A technique for extracting coarse-grained parallelism in loops is presented. It is based on splitting a set of dependence relations into two sets. The first one is used for generating code scanning slices, while the second one permits us to insert send and receive functions to synchronize the execution of the slices. Code for send and receive functions based on both OpenMP and POSIX lock functions is presented. A way of properly inserting and executing send and receive functions is demonstrated. The use of agglomeration and free scheduling to improve program performance is discussed. Results of experiments are presented.

9 citations
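The paper's own send/receive code is written in C over OpenMP and POSIX lock functions; as a hedged analogue of the idea, the Java sketch below synchronizes two loop slices with one latch per cross-slice dependence: send() marks a value as produced, and receive() blocks until it is. The class and method names are illustrative, not the paper's.

```java
// Hedged Java analogue of send/receive synchronization between loop
// slices (the paper itself gives C code with OpenMP/POSIX locks).
// One latch per cross-slice dependence.
import java.util.concurrent.CountDownLatch;

public class SliceSync {
    private final CountDownLatch[] ready;

    public SliceSync(int dependences) {
        ready = new CountDownLatch[dependences];
        for (int i = 0; i < dependences; i++) ready[i] = new CountDownLatch(1);
    }

    // Called by the producing slice after computing the value
    // that another slice depends on.
    public void send(int dep) { ready[dep].countDown(); }

    // Called by the consuming slice before using that value.
    public void receive(int dep) throws InterruptedException {
        ready[dep].await();
    }

    public static void main(String[] args) throws InterruptedException {
        SliceSync sync = new SliceSync(1);
        int[] shared = new int[1];

        Thread producer = new Thread(() -> {
            shared[0] = 42;      // iteration belonging to slice 0
            sync.send(0);        // signal the dependent slice
        });
        Thread consumer = new Thread(() -> {
            try {
                sync.receive(0); // wait for the dependence to be satisfied
                System.out.println(shared[0]);
            } catch (InterruptedException e) { Thread.currentThread().interrupt(); }
        });
        producer.start(); consumer.start();
        producer.join(); consumer.join();
    }
}
```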


Book ChapterDOI
22 Jul 2008
TL;DR: The goal of this work is to show an environment that allows assessments to be carried out in accordance with various process assessment models, over several process reference models, in the context of the COMPETISOFT project.
Abstract: A process assessment method provides elements for producing information which gives an overall view of process capability. This information helps to determine the current state of software processes and their strengths and weaknesses. Because an assessment can be performed according to one assessment process or another, and the processes of the organization can likewise follow one particular process model or another, we have developed an environment composed of two components: one generates the database schema for storing the process reference model and assessment information, and the other assesses the process with reference to this information, generating results in several formats to make it possible to interpret the data. The goal of this work is to show an environment that allows us to carry out assessments in accordance with various process assessment models, over several process reference models. This assessment environment has been developed and used in the context of the COMPETISOFT project in order to support its process assessment activities.

9 citations


Proceedings Article
05 Jun 2008
TL;DR: This work proposes a method, composed of two phases and based on a process domain ontology and patterns, to build unified, fitted and multi-viewpoint process meta-models.
Abstract: Many different process meta-models offer different viewpoints of the same information system engineering process: activity oriented, product oriented, decision oriented, context oriented and strategy oriented. However, the complementarity between their concepts is not explicit and there is no consensus about the concepts themselves. This leads to process meta-models that are inadequate for organization needs, so the instantiated models do not correspond to the specific demands and constraints of the organizations or projects. Nevertheless, method engineers should be able to build process meta-models according to specific organization needs. We propose a method to build unified, fitted and multi-viewpoint process meta-models. The method is composed of two phases and is based on a process domain ontology and patterns.

Proceedings Article
01 Jan 2008
TL;DR: This paper proposes an implicit training method that can address the drawbacks of explicitly programmed sales-assistant agents: it formalizes the virtual environment using Electronic Institutions and makes the agent use these formalizations for observing a human principal and learning believable behaviors from the human.
Abstract: Using 3D Virtual Worlds for commercial activities on the Web and the development of human-like sales assistants operating in such environments are ongoing trends in E-Commerce. The majority of the existing approaches oriented towards the development of such assistants are agent-based and are focused on explicit programming of the agents' decision-making apparatus. While effective in some very specific situations, these approaches often restrict the agents' capabilities to adapt to changes in the environment and learn new behaviors. In this paper we propose an implicit training method that can address the aforementioned drawbacks. In this method we formalize the virtual environment using Electronic Institutions and make the agent use these formalizations for observing a human principal and learning believable behaviors from the human. The training of the agent can be conducted implicitly using specific data structures called recursive-arc graphs.

Proceedings Article
01 Jan 2008
TL;DR: The objective is to lay the groundwork, albeit still rather informally, for a program of assessing the usability of an interactive system using formal methods; further research can then extend this into an algebra of interactive systems.
Abstract: The paper is concerned with automatic usability assessment based on heuristic principles. The objective is to lay the groundwork, albeit still rather informally, for a program of assessing the usability of an interactive system using formal methods. Further research can then extend this into an algebra of interactive systems.

Proceedings Article
01 Jan 2008
TL;DR: This work has developed a framework that allows ANSI-conformant C code to be generated from UML models; it is built on top of Eclipse technology, so that it can be integrated easily with available UML modeling tools.
Abstract: Model-driven engineering has recently gained broad acceptance in the field of embedded and real-time software systems. While larger embedded and real-time systems, developed e.g. in the aerospace, telecommunication, or automotive industries, are quite well supported by model-driven engineering approaches based on the UML, small embedded and real-time systems, as can for example be found in the industrial automation industry, are still somewhat neglected. A major reason for this is that the code generation facilities offered by most of the UML modeling tools on the market do indeed support C/C++ code generation in all its particulars, but neglect the generation of plain ANSI-C code. However, this is what is needed for small embedded and real-time systems, which have special characteristics in terms of hard time and space constraints. We therefore developed a framework which allows ANSI-conformant C code to be generated from UML models. It is built on top of Eclipse technology, so that it can be integrated easily with available UML modeling tools. Because flexibility and customizability are important requirements, the generation process consists of a model-to-model transformation between the UML source model and an intermediate ANSI-C model, followed by a final model-to-text generation from the intermediate ANSI-C model into C code files. This approach has several advantages compared to a direct code generation strategy.

Book ChapterDOI
22 Jul 2008
TL;DR: This paper presents a framework for engineering resource-adaptive software targeted at small mobile devices that extends and integrates existing work on software infrastructures for ubiquitous computing, and on resource- Adaptive applications.
Abstract: This paper presents a framework for engineering resource-adaptive software targeted at small mobile devices. Rather than building a solution from scratch, we extend and integrate existing work on software infrastructures for ubiquitous computing, and on resource-adaptive applications.

Proceedings Article
01 Jan 2008
TL;DR: This paper proposes a programming paradigm based on processes that exchange messages, suggesting it may be possible to provide the flexibility and expressiveness of programming with processes while bounding the complexity caused by hardware changes.
Abstract: Software systems bridge the gap between the information processing needs of the world and computer hardware. As system requirements grow in complexity and hardware evolves, the gap does not necessarily widen, but it undoubtedly changes. Although today's applications require concurrency and today's hardware provides concurrency, programming languages remain predominantly sequential. Concurrent programming is considered too difficult and too risky to be practiced by "ordinary programmers". Software development is moving towards a paradigm shift, following which concurrency will play a fundamental role in programming. In this paper, we introduce an approach that we believe will reduce the difficulties of developing and maintaining certain kinds of concurrent software. Building on earlier work but applying modern insights, we propose a programming paradigm based on processes that exchange messages. Innovative features include scale-free program structure, extensible modular components with multiple interfaces, protocols that specify the form of messages, and separation of semantics and deployment. We suggest that it may be possible to provide the flexibility and expressiveness of programming with processes while bounding the complexity caused by hardware changes.
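As an informal illustration (not the paper's own notation or runtime), the Java sketch below shows the process-and-message style the abstract advocates: each process is a thread that keeps its state local and communicates only through a typed message queue. The message and class names are hypothetical.

```java
// Illustrative sketch of programming with processes that exchange
// messages: no shared mutable state, only typed queues between threads.
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

public class ProcessDemo {
    // A message type; the paper's "protocols" would constrain such forms.
    record Request(int payload) {}

    public static void main(String[] args) throws InterruptedException {
        BlockingQueue<Request> inbox = new ArrayBlockingQueue<>(16);

        // Server process: reacts to messages, holds its state locally.
        Thread server = new Thread(() -> {
            try {
                while (true) {
                    Request r = inbox.take();
                    if (r.payload() < 0) return;   // poison pill: stop
                    System.out.println("handled " + r.payload());
                }
            } catch (InterruptedException e) { Thread.currentThread().interrupt(); }
        });
        server.start();

        // Client process: communicates only by sending messages.
        inbox.put(new Request(1));
        inbox.put(new Request(2));
        inbox.put(new Request(-1));
        server.join();
    }
}
```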

Proceedings Article
01 Jan 2008
TL;DR: This paper introduces possible replication architectures for multi-tier architectures, and identifies the parameters influencing the performance, and presents a simulation prototype that is suitable to integrate and compare several replication solutions.
Abstract: Multi-tier architectures have become the main building block in service-oriented architecture solutions with stringent requirements on performance and reliability. Replicating the reusable software components of the business logic and the application-dependent state of business data is a promising means to provide fast local access and high availability. However, while replication of databases is a well-explored area and the implications of replica maintenance are well understood, this is not the case for data replication in application servers, where entire business objects are replicated, Web Service interfaces are provided, main-memory access is much more prevalent, and a database server serves as a backend tier. In this paper, we introduce possible replication architectures for multi-tier systems and identify the parameters influencing their performance. We present a simulation prototype that is suitable for integrating and comparing several replication solutions. We describe in detail the one solution that seems most promising in a wide-area setting.

Proceedings Article
01 Jan 2008
TL;DR: The development of a semantic mediator prototype is described to provide a common access point to coastal data, maps and information from distributed CWAs, using the Open Geospatial Consortium’s Catalogue Services for the Web.
Abstract: In recent years significant momentum has occurred in the development of Internet resources for decision makers and scientists interested in the coast. Chief among these has been the development of coastal web atlases (CWAs). While multiple benefits are derived from these tailor-made atlases (e.g., speedy access to multiple sources of coastal data and information), the potential exists to derive added value from the integration of disparate CWAs, to optimize decision making at a variety of levels and across themes. This paper describes the development of a semantic mediator prototype to provide a common access point to coastal data, maps and information from distributed CWAs. The prototype showcases how ontologies and ontology mappings can be used to integrate different heterogeneous and autonomous atlases, using the Open Geospatial Consortium’s Catalogue Services for the Web.

Proceedings Article
01 Jan 2008
TL;DR: A process control is provided for continuously monitoring and adjusting an apparatus for manufacturing discrete workpieces and the system has the capability of calculating a statistical distribution and printing out this data as well as performing a capability study.
Abstract: A process control is provided for continuously monitoring and adjusting an apparatus for manufacturing discrete workpieces. The system measures a current workpiece by taking fifteen readings and deleting the highest five and lowest five readings and averaging the center five readings. This value is then used for sorting the workpiece in accordance with preset tolerance limits. The measured average value is also compared with an expected value which the system has computed and stored. An adjustment value is then calculated based on a comparison of the measured dimension of the current workpiece and a measured dimension of the last-produced workpiece and the expected value for the current workpiece. The apparatus is then adjusted in accordance with the computed adjustment value. The system also has the ability to calculate the statistical distribution of a plurality of previously produced workpieces and to adjust the apparatus in accordance with a comparison of the dimension of the current workpiece with the three sigma limits of the distribution. Furthermore, the system has the capability of calculating a statistical distribution and printing out this data as well as performing a capability study.
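The measurement rule described above is a trimmed mean: sort the fifteen readings, discard the five highest and five lowest, and average the center five. A minimal sketch under that reading, with hypothetical names and example data:

```java
// Sketch of the measurement rule described above: take fifteen readings,
// drop the five highest and five lowest, and average the middle five.
import java.util.Arrays;

public class TrimmedMean {
    public static double middleFiveAverage(double[] readings) {
        if (readings.length != 15)
            throw new IllegalArgumentException("expected 15 readings");
        double[] sorted = readings.clone();
        Arrays.sort(sorted);                 // trims by sorting
        double sum = 0;
        for (int i = 5; i < 10; i++) sum += sorted[i]; // center five
        return sum / 5.0;
    }

    public static void main(String[] args) {
        double[] readings = {9.8, 10.1, 10.0, 9.9, 10.2, 10.0, 9.7, 10.3,
                             10.0, 9.9, 10.1, 10.0, 9.8, 10.2, 10.1};
        System.out.println(middleFiveAverage(readings));
    }
}
```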

Proceedings Article
01 Jan 2008
TL;DR: This paper presents Bitstream Segment Graphs as a complete model of data format instances and presents an example, PNG, where a complete model is required, as a step towards the description of data formats as a whole.
Abstract: Manual development of format-compliant software components is complex, time-consuming and thus error-prone and expensive, as data formats are defined in semi-formal, textual specifications for human engineers. Existing approaches to a formal description of data formats remain at high-level descriptions and fail to describe phenomena such as compression or fragmentation that are especially common in multimedia file formats. As a stepping stone towards the description of data formats as a whole, this paper presents Bitstream Segment Graphs as a complete model of data format instances and presents an example, PNG, where a complete model of data format instances is required.

Book ChapterDOI
22 Jul 2008
TL;DR: This work presents an approach to the user-centered usability evaluation of an e-learning scenario for Biology developed on an Augmented Reality educational platform and compares the usability evaluation results.
Abstract: The development of Augmented Reality (AR) systems is creating new challenges and opportunities for the designers of e-learning systems. The mix of real and virtual requires appropriate interaction techniques that have to be evaluated with users in order to avoid usability problems. Formative usability evaluation aims at finding usability problems as early as possible in the development life cycle and is suitable for supporting the development of such novel interactive systems. This work presents an approach to the user-centered usability evaluation of an e-learning scenario for Biology developed on an Augmented Reality educational platform. The evaluation was carried out during and after a summer school held within the ARiSE research project. The basic idea was to perform usability evaluation twice. In this respect, we conducted user testing with a small number of students during the summer school in order to get fast feedback from users having good knowledge of Biology. Then we repeated the user testing under different conditions and with a relatively larger number of representative users. In this paper we describe both experiments and compare the usability evaluation results.

Proceedings Article
01 Jan 2008
TL;DR: DSAW is an aspect-oriented system that supports both dynamic and static weaving homogeneously over the .NET platform; an aspect can be used to adapt an application both statically and dynamically, without needing to modify its source code.
Abstract: Aspect-Oriented Software Development is an effective realization of the Separation of Concerns principle. A key issue in this paradigm is the moment when components and aspects are weaved together, composing the final application. Static weaving tools perform application composition prior to its execution. This approach limits the dynamic aspectation of running applications. In response to this limitation, dynamic weaving aspect tools perform application composition at runtime. The main benefit of dynamic weaving is runtime adaptability; its main drawback is runtime performance. Existing research has identified the suitability of hybrid approaches, obtaining the benefits of both methods in the same platform. Applying static weaving where possible and dynamic weaving when needed provides a balance between runtime performance and dynamic adaptability. This paper presents DSAW, an aspect-oriented system that supports both dynamic and static weaving homogeneously over the .NET platform. An aspect can be used to adapt an application both statically and dynamically, without needing to modify its source code. Moreover, DSAW is language- and platform-neutral, and the source code of neither components nor aspects is required.

Proceedings Article
01 Jan 2008
TL;DR: Outlines a normative approach to the design of distributed applications that must consistently integrate a number of environments (e.g. form-based interfaces, 3D Virtual Worlds, the physical world), allowing a consistent causal connection to be maintained amongst all of them.
Abstract: The paper outlines a normative approach to the design of distributed applications that must consistently integrate a number of environments (i.e. form-based interfaces, 3D Virtual Worlds, the physical world). The application of the described ideas is illustrated with the example of a fish market: an application that can simultaneously be accessed by people from the physical world, people using form-based interfaces, and people embodied in a 3D Virtual World as avatars. The Normative Virtual Environments approach in this case allows a consistent causal connection to be maintained amongst all these environments.

Proceedings Article
01 Jan 2008
TL;DR: 12 V pulses are transmitted at low impedance either as frame pulses at the beginning or at the end of a signal pulse frame of the multiplex signal or supplementarily as clock pulses between successive signal samples in the multipleX signal.
Abstract: When a transmitter sends information signals time multiplexed in a single channel over a coaxial cable to a receiver, the expense necessary for synchronizing the demultiplexing operations to the multiplexing operations at the transmitter and for supplying the current necessary to energize the receiver is reduced by impressing or interposing pulses on or into the multiplex signal at the transmitter for synchronization and current supply at the receiver. 12 V pulses are transmitted at low impedance either as frame pulses at the beginning or at the end of a signal pulse frame of the multiplex signal or supplementarily as clock pulses between successive signal samples in the multiplex signal.

Proceedings Article
01 Jan 2008
TL;DR: This proposal presents a project to devise a novel model that aims at increasing the adaptability of the resources exposed through it by dynamically managing their non-functional requirements.
Abstract: One of the primary benefits of Service Oriented Architecture (SOA) [1] is the ability to compose applications, processes or more complex services from other services. As the complexity and sophistication of these structures increase, so does the need for adaptability of each component. In recent years, a lot of effort has been put into improving the flexibility of these systems so that totally loose services can be integrated dynamically without imposing any architectural restrictions. This proposal presents a project to devise a novel model that aims at increasing the adaptability of the resources exposed through it by dynamically managing their non-functional requirements. To manage these non-functional properties, we aggregate the required services into what we define as a profile.

Proceedings Article
01 Jan 2008
TL;DR: QADPZ, an open source platform for heterogeneous desktop grid computing that enables users from a local network (organization-wide) or the Internet (volunteer computing) to share their resources, is presented briefly, together with a set of refined capabilities that improve its master-worker model.
Abstract: In this paper we first briefly present QADPZ, an open source platform for heterogeneous desktop grid computing which enables users from a local network (organization-wide) or the Internet (volunteer computing) to share their resources. Users of the system can submit compute-intensive applications, which are then automatically scheduled for execution. The scheduling is based on the hardware and software requirements of the application. Users can later monitor and control the execution of the applications. Each application consists of one or more tasks. Applications can be independent, when the composing tasks do not require any interaction, or parallel, when the tasks communicate with each other during the computation. QADPZ uses a master-worker model that is improved with some refined capabilities: push of work units, pipelining, sending more work units at a time, an adaptive number of workers, an adaptive timeout interval for work units, and the use of multithreading, presented further in this paper. These improvements are meant to increase the performance and efficiency of such applications.

Proceedings Article
01 Jan 2008
TL;DR: This communication exposes the limitations of the main pedagogical resource description standards for describing raw resources, through the lens of the pedagogical indexation of texts for language teaching, and proposes a model intended to overcome these limitations.
Abstract: In this communication we expose the limitations of the main pedagogical resource description standards for describing raw resources, through the lens of the pedagogical indexation of texts for language teaching. To do so we draw on the testimony of language teachers regarding their practices. We then propose a model intended to overcome these limitations. This model is articulated around the notion of a text facet, which we introduce here.

Proceedings Article
01 Jan 2008
TL;DR: This paper addresses the incompatibility problem and describes a solution based upon the already-known design pattern of message translation and the ASP.NET 2.0 Web service platform.
Abstract: One of the challenges that Web service providers face is service evolution management. In general, the challenge is to ensure the substitutability of service versions, i.e. the correct functioning of all ongoing client applications relying on the old version of a service after that version has been substituted with a new one. Unfortunately, no currently available design approach can guarantee a perfectly extensible architecture that preserves full backward compatibility during its evolution. Hence, incompatibilities are very likely to occur if an old version of a service is replaced with a new one. This paper addresses the incompatibility problem and describes a solution to it, based upon the already-known design pattern of message translation and the ASP.NET 2.0 Web service platform. Using the platform's API, the standard ASP.NET pipeline has been augmented with an additional step of applying XSL transformations to the XML payload of the messages. The solution is then verified against the Electronic Commerce Service from the Amazon.com web services suite. Thus, the contribution of this work is a new .NET implementation of the translator pattern.
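The paper's implementation plugs XSL transformation into the ASP.NET 2.0 pipeline; the sketch below shows only the core translator step, applying a stylesheet to an XML payload, transposed to Java's standard javax.xml.transform API. The stylesheet and message are toy examples, not the paper's code.

```java
// Language-neutral sketch of the message translator step: rewrite an
// old-version XML message into the new version's schema via XSLT.
// (The paper does this inside the ASP.NET 2.0 pipeline in .NET.)
import javax.xml.transform.Transformer;
import javax.xml.transform.TransformerFactory;
import javax.xml.transform.stream.StreamResult;
import javax.xml.transform.stream.StreamSource;
import java.io.StringReader;
import java.io.StringWriter;

public class MessageTranslator {
    static String translate(String xmlPayload, String xslt) throws Exception {
        Transformer t = TransformerFactory.newInstance()
                .newTransformer(new StreamSource(new StringReader(xslt)));
        StringWriter out = new StringWriter();
        t.transform(new StreamSource(new StringReader(xmlPayload)),
                    new StreamResult(out));
        return out.toString();
    }

    public static void main(String[] args) throws Exception {
        // Toy stylesheet: rename <Item> to <Product>, copy everything else.
        String xslt = """
            <xsl:stylesheet version="1.0"
                xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
              <xsl:template match="Item">
                <Product><xsl:apply-templates/></Product>
              </xsl:template>
              <xsl:template match="@*|node()">
                <xsl:copy><xsl:apply-templates select="@*|node()"/></xsl:copy>
              </xsl:template>
            </xsl:stylesheet>""";
        String oldMessage = "<Item><Name>Book</Name></Item>";
        System.out.println(translate(oldMessage, xslt));
    }
}
```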

Book ChapterDOI
22 Jul 2008
TL;DR: The rapid increase in the amount of available information from various online sources poses new challenges for programs that endeavor to process these sources automatically and identify the most relevant material for a given application.
Abstract: The rapid increase in the amount of available information from various online sources poses new challenges for programs that endeavor to process these sources automatically and identify the most relevant material for a given application.

Proceedings Article
01 Jan 2008
TL;DR: This paper demonstrates the effectiveness of the genetic programming paradigm in two major software engineering tasks, effort estimation and defect prediction, examining data domains from both the commercial and the scientific sectors for each task.
Abstract: Research in software engineering data analysis has only recently incorporated computational intelligence methodologies. Among these approaches, genetic programming holds a remarkable position, facilitating symbolic regression tasks. In this paper, we demonstrate the effectiveness of the genetic programming paradigm in two major software engineering tasks: effort estimation and defect prediction. We examine data domains from both the commercial and the scientific sectors for each task. The proposed model proves superior to past works in the literature.