Showing papers in "CLEI Electronic Journal" in 2014
••
TL;DR: This research presents a probabilistic assessment of the response of the public to the use of smart phones and their applications in the contexts of social media and traditional media.
Abstract: Affiliation: Pacini Naumovich, Elina Rocío. Universidad Nacional de Cuyo, Instituto de Tecnologías de la Información y las Comunicaciones; Argentina.
32 citations
••
TL;DR: This work presents a broad and detailed set of transformations that allows the automatable generation of an ontology implemented in OWL 2 from the SBVR specifications of a business domain.
Abstract: The wide applicability of mapping business rule expressions to ontology statements has recently been recognized. Some of the most important applications are: (1) using ontology reasoners to prove the consistency of business domain information, (2) generating an ontology intended to be used in the analysis stage of a software development process, and (3) encapsulating the declarative specification of business knowledge into information software systems by means of an implemented ontology. The Semantics of Business Vocabulary and Business Rules (SBVR) supports that approach by providing business people with a linguistic way to semantically describe business concepts and specify business rules independently of any information system design. Although previous works have presented some proposals, an exhaustive and automatable approach is still lacking. This work presents a broad and detailed set of transformations that allows the automatable generation of an ontology implemented in OWL 2 from the SBVR specifications of a business domain. The transformations are rooted in the structural specification of both standards and are illustrated through a case study. A real-case validation was also performed, assessing the feasibility of the mappings through a quality assessment of the developed ontology.
14 citations
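As a rough illustration of the kind of SBVR-to-OWL 2 transformation the abstract describes (not the authors' actual rule set), the sketch below maps a few hypothetical SBVR elements to OWL 2 axioms in functional-style syntax; the vocabulary entries, rule names and coverage are all assumptions.

```python
# Illustrative sketch only: maps a toy SBVR-style vocabulary to OWL 2
# functional-syntax axioms. The rule names and coverage are hypothetical,
# not the transformation set defined in the paper.

sbvr_vocabulary = {
    "noun_concepts": ["Customer", "Order"],               # SBVR general concepts
    "verb_concepts": [("Customer", "places", "Order")],   # binary fact types
    "necessity_rules": [                                   # e.g. "each customer places at most 1 order"
        ("Customer", "places", "Order", "max", 1),
    ],
}

def to_owl2(voc):
    axioms = []
    for c in voc["noun_concepts"]:
        axioms.append(f"Declaration(Class(:{c}))")         # noun concept -> OWL class
    for dom, verb, rng in voc["verb_concepts"]:
        prop = f"{verb}{rng}"
        axioms.append(f"Declaration(ObjectProperty(:{prop}))")
        axioms.append(f"ObjectPropertyDomain(:{prop} :{dom})")
        axioms.append(f"ObjectPropertyRange(:{prop} :{rng})")
    for dom, verb, rng, kind, n in voc["necessity_rules"]:
        prop = f"{verb}{rng}"
        restriction = (f"ObjectMaxCardinality({n} :{prop} :{rng})" if kind == "max"
                       else f"ObjectMinCardinality({n} :{prop} :{rng})")
        axioms.append(f"SubClassOf(:{dom} {restriction})")  # structural rule -> cardinality restriction
    return "\n".join(axioms)

print(to_owl2(sbvr_vocabulary))
```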
••
TL;DR: In this work, the Model Checking (MC) software verification technique and the Timed Automata (TA) formal language are integrated into a formal verification approach to check BPs modeled with BPMN.
Abstract: The most important result of the effort to standardize a notation for the graphical representation of Business Processes (BPs) is the Business Process Model and Notation (BPMN). Although BPs modeled with BPMN can support business designers, BPMN models are not appropriate for the analysis phase: they have no formal semantics with which to conduct qualitative analysis (validation and verification). This work presents how the Model Checking (MC) software verification technique and the Timed Automata (TA) formal language are integrated into a formal verification approach to check BPs modeled with BPMN. It also introduces a set of guidelines to transform BPMN models into TA. Our approach allows business analysts and designers to evaluate BPs (i.e., to perform qualitative analysis) based on the formal specification of the BP task model with TA. The approach aims to evaluate the behavior of the BP task model with respect to business performance indicators (for instance, service time, waiting time or queue size) derived from business needs, as shown in an enterprise project related to Customer Relationship Management.
11 citations
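A minimal sketch of the BPMN-to-TA idea (with assumed task names and durations, not the paper's actual guidelines): a linear BPMN task sequence becomes a set of locations, each with a clock invariant bounding its service time, connected by transitions that reset the clock.

```python
# Illustrative sketch only: mapping a linear BPMN task sequence to a
# timed-automaton-like description (locations, a clock invariant per task,
# transitions with clock resets). Task names and durations are hypothetical;
# the paper's guidelines cover many more BPMN elements than this.

bpmn_tasks = [("ReceiveRequest", 2), ("CheckCredit", 5), ("NotifyCustomer", 1)]  # (task, max service time)

def bpmn_to_ta(tasks):
    locations = ["start"] + [name for name, _ in tasks] + ["end"]
    invariants = {name: f"x <= {max_t}" for name, max_t in tasks}   # bound the time spent in each task
    transitions = [{"from": src, "to": dst, "reset": "x := 0"}       # reset the clock when moving on
                   for src, dst in zip(locations, locations[1:])]
    return {"locations": locations, "invariants": invariants, "transitions": transitions}

ta = bpmn_to_ta(bpmn_tasks)
print(ta["invariants"])
for t in ta["transitions"]:
    print(t)
```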
••
TL;DR: A Unified Process for Domain Analysis (UPDA) is proposed, based on Aspect and Goal orientations to deal with NFRs, which are specified using quality standards to enhance communication.
Abstract: The Requirements Engineering (RE) discipline is where the software system needs, or requirements, are captured; these are then "translated" into software components. At present, functional requirements are treated, but non-functional requirements (NFRs) are neglected, causing problems at later stages of development. In an industrial software production context, product quality must be considered, and the Domain Analysis discipline within RE proposes different approaches to treat NFRs when building a Reference Architecture (RA) from which all products of a domain family can be generated. Consequently, the same process can be adapted to different contexts and abstraction levels. This paper proposes a Unified Process for Domain Analysis (UPDA), based on Aspect and Goal orientations to deal with NFRs, which are specified using quality standards to enhance communication. UPDA integrates techniques that have so far been used separately: the process of Chung et al. as extended by Losavio et al., based on the NFR Framework with treatment of crosscutting concerns, and the ISO/IEC 25010 quality standard to specify NFRs. Three sub-processes constitute UPDA: construction of the quality model, identification of crosscutting concerns, and RA design. The main artifact obtained is the RA, which can be reused as an asset in the context of software product lines.
9 citations
••
TL;DR: The goal was to identify the most active research groups within this workshop, the most debated topics and the trends in the Requirements Engineering area, as well as to analyze the relevance of WER.
Abstract: This work reviews 258 papers published in WER throughout 15 editions. The review's goal was to identify the most active research groups within this workshop, the most debated topics and the trends in the Requirements Engineering area. Furthermore, the review aimed to analyze the relevance of WER, which was done by identifying where WER papers have been cited. The results showed that Brazil, Argentina and Spain hold the most active groups, and that requirements modeling is one of the most discussed topics in this event. Moreover, the results pointed out international conferences as the type of publication that most often references WER papers, and Requirements Engineering as one of the journals that most often references them.
6 citations
••
TL;DR: A business process model that can be used in medium and large organizations as a framework for modelling and analysing their IT management processes and provides a better definition, organization and comprehension of the essential and support IT management activities.
Abstract: The successful application of Information Technologies (IT) in an organization depends on the business processes used for managing such technologies. It is widely recognized that the use of the Enterprise Architecture (EA) practice for organizing these technologies into a framework is a key factor for achieving better IT-business alignment. This article presents a business process model for IT Management that can be used in medium and large organizations as a framework for modelling and analysing their IT management processes. The main difference between the described model and others found in the literature is that our model places the EA concept at the centre of the organization of IT Management activities. It provides a better definition, organization and comprehension of the essential and support IT management activities. The described model is being used in several organizations as a reference framework to improve their current IT Management processes.
6 citations
••
TL;DR: Quark, a method to assist software architects in architectural decision-making, is presented, together with the conceptualization of the relationship between NFRs and ADs defined in Arteon, an ontology to represent and manage architectural knowledge.
Abstract: Non-Functional Requirements (NFRs) and constraints are among the principal drivers of architectural decision-making. NFRs are improved or damaged by architectural decisions (ADs), while constraints directly include or exclude parts of the architecture (e.g., logical components or technologies). We may determine the impact of an AD, or which parts of the architecture are affected by a constraint, but in the end it is hard to know whether all the ADs made respect the NFRs and the imposed constraints. In the usual approach, architects use their own experience to produce software architectures that comply with the NFRs and imposed constraints, but in the end, especially for crucial decisions, the architect has to deal with complex trade-offs between NFRs and juggle possible incompatibilities raised by the imposed constraints. In this paper we present Quark, a method to assist software architects in architectural decision-making, and the conceptualization of the relationship between NFRs and ADs defined in Arteon, an ontology to represent and manage architectural knowledge. Finally, we provide an overview of the implementation of Quark and Arteon in the ArchiTech tool.
5 citations
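The relationship the abstract describes between NFRs, constraints and ADs can be sketched with a small data structure; the entity names and the aggregation heuristic below are illustrative assumptions, not the Arteon ontology or the Quark method itself.

```python
# Illustrative sketch: recording which NFRs each architectural decision (AD)
# improves or damages and checking imposed constraints. All names are
# hypothetical; this is not the Arteon ontology nor the Quark procedure.

decisions = {
    "use_cache_layer":   {"improves": {"performance"}, "damages": {"consistency"}},
    "encrypt_all_links": {"improves": {"security"},    "damages": {"performance"}},
}
constraints = {"excluded_components": {"third_party_cache"}}
chosen_components = {"in_house_cache", "tls_gateway"}

def net_effect(selected_ads):
    """Aggregate improved/damaged NFRs over the selected decisions."""
    improved, damaged = set(), set()
    for ad in selected_ads:
        improved |= decisions[ad]["improves"]
        damaged |= decisions[ad]["damages"]
    return improved, damaged - improved      # NFRs damaged and not compensated elsewhere

def violates_constraints(components):
    return components & constraints["excluded_components"]

improved, at_risk = net_effect(["use_cache_layer", "encrypt_all_links"])
print("improved:", improved, "| at risk:", at_risk,
      "| constraint violations:", violates_constraints(chosen_components))
```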
••
TL;DR: This work addresses the problem of recognizing the ASL hand alphabet relying only on depth information acquired from an RGB-D sensor, with a novel Iterative Closest Point (ICP) based recognition methodology that comprehensively analyzes the inputs and outputs of the alignment as efficiency and accuracy determinants.
Abstract: This work addresses the problem of recognizing the American Sign Language (ASL) hand alphabet relying only on depth information acquired from an RGB-D sensor. To accomplish this goal, a novel Iterative Closest Point (ICP) based recognition methodology is proposed, which comprehensively analyzes the inputs and outputs of the alignment as efficiency and accuracy determinants. Next, a novel classification technique, denoted Approximated KB-fit, is proposed to efficiently handle the space complexity of the database template matching. The overall recognition accuracy reached 99.04% in a cross-validation workbench with 520 distinct input depth images. The achieved frame rate was 7.41 FPS on a 2.4 GHz single-processor machine.
5 citations
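For reference, the generic ICP alignment step the methodology builds on (standard nearest-neighbour correspondences plus an SVD-based rigid transform) can be written as below; this is textbook point-to-point ICP, not the paper's input/output analysis or the Approximated KB-fit classifier.

```python
import numpy as np
from scipy.spatial import cKDTree

# Textbook point-to-point ICP between two 3-D point clouds. This shows only
# the generic alignment step, not the paper's specific recognition pipeline.
def icp(source, target, iterations=20):
    src = source.copy()
    tree = cKDTree(target)
    for _ in range(iterations):
        _, idx = tree.query(src)                 # nearest-neighbour correspondences
        tgt = target[idx]
        mu_s, mu_t = src.mean(axis=0), tgt.mean(axis=0)
        H = (src - mu_s).T @ (tgt - mu_t)        # cross-covariance matrix
        U, _, Vt = np.linalg.svd(H)
        R = Vt.T @ U.T                           # optimal rotation (Kabsch)
        if np.linalg.det(R) < 0:                 # guard against a reflection
            Vt[-1] *= -1
            R = Vt.T @ U.T
        t = mu_t - R @ mu_s
        src = src @ R.T + t                      # apply the rigid transform
    dist, _ = tree.query(src)
    return src, dist.mean()

rng = np.random.default_rng(0)
target = rng.random((200, 3))
theta = 0.1
rot_z = np.array([[np.cos(theta), -np.sin(theta), 0],
                  [np.sin(theta),  np.cos(theta), 0],
                  [0, 0, 1]])
source = target @ rot_z.T + 0.05                 # slightly rotated and shifted copy
aligned, err = icp(source, target)
print("mean residual after alignment:", err)
```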
••
Abstract: Jose Aguilar, Universidad de Los Andes, Venezuela, aguilar@ula.ve; Jaelson Castro, Universidade Federal de Pernambuco, Brasil, jbc@cin.ufpe.br; Sergio España, Universidad Politécnica de Valencia, España, sergio.espana@dsic.upv.es; Alexandra La Cruz, Universidad Simón Bolívar, Venezuela, alexandra.lacruz@gmail.com; María Villapol, Universidad Central de Venezuela, Venezuela, maria.villapol@ciens.ucv.ve; Paula Zabala, Universidad de Buenos Aires, Argentina, pzabala@dc.uba.ar
4 citations
••
TL;DR: The use of a framework proposed by Crismond and Adams to create pedagogical activities for the CS1 course at Universidad ORT Uruguay is described, and an extension of that framework with competence-oriented activities is proposed.
Abstract: In introductory Computer Science courses, especially Computer Science 1 (CS1), dropout rates are generally high and results are often disappointing. In order to motivate and engage students to achieve better results in CS1, our teaching strategy is based on designing several activities using a competence-oriented approach. This paper describes the use of a framework proposed by Crismond and Adams to create pedagogical activities for the CS1 course at Universidad ORT Uruguay. We propose to extend that framework with competence-oriented activities and present a detailed description of each activity. Our thesis is that including these kinds of activities helps to obtain better results. Experimentation was done in 2012 and 2013. Teachers of the experimental group reported a high level of motivation among the students. The results show that the inclusion of those activities seems to be helpful for students and that the proposed pedagogical design appears to produce better final results.
4 citations
••
TL;DR: The findings show the need to establish strategies that create, from the early years of the curriculum, the necessary conditions to promote a positive attitude towards curricular integration processes and thus overcome, at least partially, the identified limitations.
Abstract: The Universidad Nacional in Costa Rica is making significant efforts to improve the professional performance of its graduates in order to respond to society and industry needs. In particular, the School of Informatics has been exploring opportunities to implement, through a pedagogy oriented to problems and projects, a curricular integration between diverse areas of knowledge in the curriculum. From this perspective, and with the aim of integrating the areas of programming, databases and systems engineering, an initial exploratory study was carried out with faculty members and a group of students from the fourth level of the curriculum. The findings show several limitations, such as mismatches between course contents, lack of faculty commitment to collaborative work and student resistance. They also show the need to establish strategies that create, from the early years of the curriculum, the necessary conditions to promote a positive attitude towards curricular integration processes and thus overcome, at least partially, the identified limitations.
••
TL;DR: The main results of a systematic mapping study that investigated software measurement architectures are presented, along with an approach to help organizations define such architectures.
Abstract: During the execution of software projects, it is necessary to collect, store and analyze data to support project and organizational decisions. Software measurement is a fundamental practice for project management and process improvement, and it is present in the main models and standards that address software process improvement, such as ISO/IEC 12207, CMMI and MR MPS.BR. In order to perform software measurement effectively, an infrastructure is needed to support data collection, storage and analysis. This infrastructure can be defined by means of an architecture, which describes the components necessary to support software measurement. In this paper we present the main results obtained from a systematic mapping study that investigated software measurement architectures, as well as an approach we propose to help organizations define such architectures.
••
TL;DR: A model for competence development and assessment using formative projects, by means of an articulated set of pedagogical strategies applied over time to solve problems of the context, is presented.
Abstract: The competence-based formation model has oriented educational policies in different countries during the last decades. The socioformative approach builds on this model and works as a reference in Latin America to orient competence formation and assessment. The socioformative approach uses the methodology of formative projects: an articulated set of pedagogical strategies that are applied over time to solve problems of the context. This paper presents a model for competence development and assessment using formative projects.
••
TL;DR: The design of a thread-level speculation library is presented and different speculative models are studied: speculation on decision structures, speculation on loops, and speculation on critical sections.
Abstract: Developments in parallel architectures are an important branch of computer science. The success of such architectures derives from their inherent ability to improve program performance. However, this ability depends on the parallelism-extraction strategies, which are always limited by the logic of each sequential program. Speculation is the only known alternative to overcome these constraints and increase parallelism. In this paper, we study explicit speculative parallelism using a thread-level speculation library. We present the design of this library and study different speculative models: speculation on decision structures, speculation on loops, and speculation on critical sections. Our study evaluates different cases taken from SPEC CPU 2000, achieving speedups of about 1.8x on multicore architectures (four cores) with coarse-grained speculation.
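As a very rough illustration of one of the speculative models mentioned (speculation on decision structures), the sketch below evaluates both branches of a conditional in parallel and commits only the one selected by a slow condition. It is a Python simplification under the assumption of side-effect-free branches, not the library's actual design or rollback machinery.

```python
from concurrent.futures import ThreadPoolExecutor
import time

# Rough illustration of branch speculation: start both branch computations
# before the (slow) condition is known, then commit only the branch the
# condition selects and discard the other. Branches must be side-effect-free;
# the rollback/versioning machinery of a real TLS library is omitted.

def slow_condition(x):
    time.sleep(0.2)                 # stands in for an expensive predicate
    return x % 2 == 0

def then_branch(x):
    time.sleep(0.2)
    return x * 2

def else_branch(x):
    time.sleep(0.2)
    return x + 1

def speculative_if(x):
    with ThreadPoolExecutor(max_workers=3) as pool:
        cond = pool.submit(slow_condition, x)
        then_f = pool.submit(then_branch, x)    # speculative execution
        else_f = pool.submit(else_branch, x)    # speculative execution
        return then_f.result() if cond.result() else else_f.result()

start = time.perf_counter()
print(speculative_if(10), f"({time.perf_counter() - start:.2f}s, ~0.2s instead of ~0.4s)")
```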
••
TL;DR: The pairwise algorithm MC64-NW/SW, originally developed for the Tile64 processor, is ported to the x86 architecture (Intel Xeon Quad Core and Intel i7 Quad Core processors) with excellent results and should represent significant performance improvements for bioinformatics.
Abstract: The performance of the many-core Tile64 versus the multi-core Xeon x86 architecture on bioinformatics workloads has been compared. For the benchmarking we used the pairwise algorithm MC64-NW/SW, which we previously developed to align nucleic acid (DNA and RNA) and peptide (protein) sequences, and which is an enhanced and parallel implementation of the Needleman-Wunsch and Smith-Waterman algorithms. We have ported MC64-NW/SW, originally developed for the Tile64 processor, to the x86 architecture (Intel Xeon Quad Core and Intel i7 Quad Core processors) with excellent results. Hence, the evolution of x86-based architectures towards coprocessors like the Xeon Phi should bring significant performance improvements for bioinformatics.
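For reference, the sequential dynamic-programming recurrence that MC64-NW/SW parallelises (plain Needleman-Wunsch global alignment scoring, without the paper's many-core optimisations) looks like this; the scoring values are illustrative defaults, not the parameters used in the paper.

```python
# Plain Needleman-Wunsch global alignment score: the sequential recurrence
# behind MC64-NW/SW. Scoring values are illustrative defaults.
def needleman_wunsch(a, b, match=1, mismatch=-1, gap=-2):
    n, m = len(a), len(b)
    score = [[0] * (m + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        score[i][0] = i * gap               # leading gaps in b
    for j in range(1, m + 1):
        score[0][j] = j * gap               # leading gaps in a
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            diag = score[i - 1][j - 1] + (match if a[i - 1] == b[j - 1] else mismatch)
            score[i][j] = max(diag,
                              score[i - 1][j] + gap,   # gap in b
                              score[i][j - 1] + gap)   # gap in a
    return score[n][m]

print(needleman_wunsch("GATTACA", "GCATGCU"))
```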
••
TL;DR: A successful technology-transfer experience with software for visually impaired children in Ecuador, in which a cooperation agreement was signed among the Vice Presidency of Ecuador, FENCE (organization of blind people) and Universidad Indoamerica, and Computer Science students installed the software.
Abstract: This article describes a successful technology-transfer experience with software for visually impaired children in Ecuador. The starting point was the personal experience of the author with his daughter with special needs. This led to the development of AINIDIU (Intelligent agent for visually impaired children), a computer-based technology to facilitate the interaction between visually impaired children and computers. The background, technological structure and evaluation of AINIDIU are described in this article. In addition, the technology-transfer process is described, in which a cooperation agreement was signed among the Vice Presidency of Ecuador, FENCE (organization of blind people) and Universidad Indoamerica. In this context, Computer Science students installed the software on approximately 1,000 laptops donated by the "Mision Manuela Espejo" program supported by the Vice Presidency of Ecuador. FENCE and Education students were responsible for training school teachers in the use of the technologies and of the Virtual Learning Platform for blind people. As a result of the process, visually impaired children in 22 provinces of Ecuador benefited. Motivated by this rewarding experience, the author continues with new projects in the Doctoral Program in Computer Science of Universidad de Costa Rica.
••
TL;DR: Comparison of the results from this survey with those obtained from a previous one, conducted under the same project, shows that Costa Rican local governments are increasingly using FOSS, and cost advantages provided by FOSS might be the main reason for this trend.
Abstract: This paper presents and discusses the results from an electronic survey on the use of free and open source software (FOSS) by local governments in Costa Rica. Barriers preventing the use of this type of software are also identified. Comparison of the results from this survey with those obtained from a previous one, conducted under the same project, shows that Costa Rican local governments are increasingly using FOSS. Cost advantages provided by FOSS might be the main reason for this trend. However, other factors need to be considered for justifying the use of FOSS. These factors need to be articulated into organizational strategies, an issue in which Costa Rican local governments are falling behind.
••
TL;DR: This work presents an architecture for safely managing multimedia transmission channels that encrypts or encodes images and videos in an efficient and dynamic way, and proposes the application of on-demand parallel code written in OpenCL.
Abstract: The amount of multimedia information transmitted through the web is very high and increasing. Generally, this kind of data is not properly protected, since users do not appreciate how much information images and videos may contain. In this work, we present an architecture for safely managing multimedia transmission channels. The idea is to encrypt or encode images and videos in an efficient and dynamic way; at the same time, these media can be enhanced by applying real-time image processing. The main novelty of the proposal is the application of on-demand parallel code written in OpenCL. The algorithms and data structures are known by the parties only at communication time, which we expect increases robustness against possible attacks. We provide a complete description of the proposal and several performance tests with different known algorithms.
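The "algorithm agreed only at communication time" idea can be sketched independently of OpenCL. In the stand-in below, sender and receiver pick a keyed transform when the channel is opened; a NumPy XOR keystream replaces the GPU kernels the paper dispatches on demand, and all names are assumptions.

```python
import numpy as np

# Sketch of the "negotiated at communication time" idea: the parties agree on
# a transform id and key only when the channel is opened. A XOR keystream in
# NumPy stands in for the on-demand OpenCL kernels described in the paper.

TRANSFORMS = {
    "xor_stream": lambda data, key: data ^ np.frombuffer(
        np.random.default_rng(key).bytes(data.size), dtype=np.uint8).reshape(data.shape),
}

def open_channel(transform_id, key):
    encode = lambda frame: TRANSFORMS[transform_id](frame, key)
    decode = encode                      # XOR is its own inverse
    return encode, decode

frame = np.random.randint(0, 256, size=(4, 4), dtype=np.uint8)   # toy "image"
encode, decode = open_channel("xor_stream", key=1234)
assert np.array_equal(decode(encode(frame)), frame)
print("round-trip OK")
```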
••
TL;DR: A decentralized form of organization of the teaching staff is proposed, and it is shown how this helps make better use of resources, lower drop-out rates and increase approval rates.
Abstract: Teaching how to develop medium-size information systems in courses with high enrollment is a complex task. In Latin America this is an important issue because, due to high enrollment and scarce resources, Schools of Engineering have, in general, very low graduation rates and high drop-out rates. This paper presents the experiences of a team of teachers over a period of eight years. Two strategies were proposed to cope with high drop-out and low graduation rates, and they were implemented and analyzed in two stages. At the initial stage, key roles were defined for each member of the teaching staff, focusing on programming assignments and the development and usage of e-learning support systems, such as a course web page and a newsgroup list. At the second stage the main goal was to consolidate the results of the previous stage, but also to evolve the on-line contents in order to improve communication with students and among them. In this second stage it was also important to address the problem of high drop-out rates, trying to find out their causes and reduce their effect. This work proposes a decentralized form of organization of the teaching staff and shows how this helps make better use of the resources. It also shows that practical work on a project and the transformation into a semi-distance learning course had the effect of lowering drop-out rates and increasing approval rates. Finally, this experience and its results may inspire teachers and educational institutions to improve courses with similar characteristics.
••
TL;DR: A study made with university students to determine the types of video games that they play, the platforms they use to play, and what motivates them to play or stop playing games distinguishes among genders.
Abstract: This paper presents the results of a study made with university students to determine the types of video games that they play, the platforms they use to play, and what motivates them to play or stop playing games. The study results distinguish among genders in order to be able to design appropriate teaching strategies which appeal to both genders. This work was done as part of a larger study whose aim is to both attract and retain students to the computer-engineering program.
••
TL;DR: A computational tool called CLEMAS is developed and used to apply the proposed learning model for coordination schemes in Multi-Agent Systems based on Cultural Algorithms to a case study in industrial automation, related to an agent-based fault management system.
Abstract: This paper presents a learning model for coordination schemes in Multi-Agent Systems (MAS) based on Cultural Algorithms (CA). In this model, the individuals (one of the CA components) are the different conversations that may occur in a multi-agent system, and the coordination scheme is learned at the level of how the communication protocols are performed within a conversation. A conversation can have sub-conversations, and sub-conversations and/or conversations are identified with a particular type of conversation associated with certain interaction patterns. The interaction patterns use the coordination mechanisms existing in the literature. In order to simulate the proposed learning model, we developed a computational tool called CLEMAS, which has been used to apply the model to a case study in industrial automation, related to an agent-based fault management system.
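A cultural algorithm's two spaces (population and belief space) can be sketched generically as below; the toy one-dimensional fitness and the interval-based belief update are illustrative assumptions and are not the conversation/interaction-pattern encoding used by CLEMAS.

```python
import random

# Generic cultural algorithm skeleton: a population evolves while an explicit
# belief space accumulates knowledge from the best individuals and biases new
# candidates. Toy one-dimensional fitness; not the CLEMAS encoding.

def fitness(x):
    return -(x - 3.0) ** 2               # maximum at x = 3

population = [random.uniform(-10, 10) for _ in range(20)]
belief = {"lo": -10.0, "hi": 10.0}        # normative knowledge: promising interval

for generation in range(30):
    population.sort(key=fitness, reverse=True)
    elite = population[:5]
    belief["lo"] = min(elite)             # acceptance: elites update the belief space
    belief["hi"] = max(elite)
    # influence: new individuals are sampled inside the believed-promising interval
    population = elite + [random.uniform(belief["lo"], belief["hi"]) for _ in range(15)]

print("best found:", max(population, key=fitness))
```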
••
TL;DR: The blocked version of this algorithm employed in the experimental evaluation mostly comprises matrix-matrix products, so that the results from the evaluation carry beyond simple matrix inversion and are representative of a wide variety of dense linear algebra operations/codes.
Abstract: We analyze the performance-power-energy balance of a conventional Intel Xeon multicore processor and two low-power architectures (an Intel Atom processor and a system with a quad-core ARM Cortex A9 plus an NVIDIA Quadro 1000M) using a high performance implementation of Gauss-Jordan elimination (GJE) for matrix inversion. The blocked version of this algorithm employed in the experimental evaluation mostly comprises matrix-matrix products, so that the results from the evaluation carry beyond simple matrix inversion and are representative of a wide variety of dense linear algebra operations/codes.
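For reference, an unblocked Gauss-Jordan inversion can be written as below; the blocked, matrix-multiply-rich variant actually benchmarked in the paper is more involved and is not reproduced here.

```python
import numpy as np

# Unblocked Gauss-Jordan elimination for matrix inversion, with partial
# pivoting. The paper benchmarks a blocked variant whose inner work is
# matrix-matrix products; this plain version only shows the basic sweep.
def gauss_jordan_inverse(a):
    n = a.shape[0]
    aug = np.hstack([a.astype(float), np.eye(n)])      # augmented system [A | I]
    for k in range(n):
        p = k + np.argmax(np.abs(aug[k:, k]))           # partial pivoting
        aug[[k, p]] = aug[[p, k]]
        aug[k] /= aug[k, k]                             # normalise the pivot row
        for i in range(n):
            if i != k:
                aug[i] -= aug[i, k] * aug[k]            # eliminate column k
    return aug[:, n:]                                   # right half is A^{-1}

a = np.random.rand(5, 5) + 5 * np.eye(5)                # well-conditioned test matrix
print(np.allclose(gauss_jordan_inverse(a) @ a, np.eye(5)))
```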
••
TL;DR: It is proposed that evolutionary game theory can be a useful tool for policy-makers in order to improve cooperation and discourage defection in sanitation boards.
Abstract: In a group of individuals that come together to produce a good or provide a service, the cooperators (who pay to produce the good) are often exploited by those who receive the benefit without paying the cost. Models have been developed over time that use incentives (reward or punishment) and the option of abandoning the initiative to promote and stabilize cooperation. In this paper we analyze several models based on evolutionary game theory and public goods games. We compare them and organize them in a taxonomic table according to their main characteristics, in order to select the most suitable one for a specific problem. The analyzed models are compared using a public good problem in community projects for water supply. We have reasonable assurance that phenomena that appear in the models also occur in these community projects. Therefore, we propose that evolutionary game theory can be a useful tool for policy-makers to improve cooperation and discourage defection in sanitation boards.
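The basic public goods game underlying the models compared in the paper can be sketched as follows (standard formulation with illustrative parameter values; the reward/punishment and abandonment extensions the paper surveys are not modelled).

```python
# Standard public goods game payoffs: each cooperator contributes c, the pot
# is multiplied by r and shared equally among the whole group. Defectors
# always earn more than cooperators in the same group, which is the dilemma.
def payoffs(n_cooperators, n_defectors, c=1.0, r=3.0):
    group = n_cooperators + n_defectors
    share = r * c * n_cooperators / group          # everyone receives the same share
    return share - c, share                        # (cooperator payoff, defector payoff)

for nc in (9, 5, 1):
    pc, pd = payoffs(nc, 10 - nc)
    print(f"{nc} cooperators: cooperator gets {pc:.2f}, defector gets {pd:.2f}")
```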
••
TL;DR: This paper presents an empirical study of parameter tuning to evaluate the effectiveness of different configurations and the impact of their use on forest fire prediction.
Abstract: Forest fires are a major risk factor with strong eco-environmental and socioeconomic impact, which is why their study and modeling are very important. However, the models frequently have a certain level of uncertainty in some input parameters, which must be approximated or estimated because of the difficulty of accurately measuring the conditions of the phenomenon in real time. This has resulted in the development of several methods for uncertainty reduction, whose trade-off between accuracy and complexity can vary significantly. ESS (Evolutionary-Statistical System) is a method that aims to reduce this uncertainty by combining Statistical Analysis, High Performance Computing (HPC) and Parallel Evolutionary Algorithms (PEAs). The PEAs use several parameters that require adjustment and that determine the quality of the results. Calibrating these parameters is crucial for reaching good performance and improving the system output. This paper presents an empirical study of parameter tuning to evaluate the effectiveness of different configurations and the impact of their use on forest fire prediction.
••
TL;DR: This article proposes to apply for the first time a state-of-the-art parallel asynchronous cooperative coevolutionary variant of the nondominated sorting genetic algorithm II, named CCNSGA-II, to the injection network problem in vehicular ad hoc networks (VANETs).
Abstract: Cooperative coevolutionary evolutionary algorithms differ from the standard evolutionary algorithm architecture in that the population is split into subpopulations, each of them optimising only a sub-vector of the global solution vector. All subpopulations cooperate by broadcasting their local partial solutions so that each subpopulation can evaluate complete solutions. Cooperative coevolution has recently been used in evolutionary multi-objective optimisation, but few works have exploited its parallelisation capabilities or tackled real-world problems. This article proposes to apply, for the first time, a state-of-the-art parallel asynchronous cooperative coevolutionary variant of the nondominated sorting genetic algorithm II (NSGA-II), named CCNSGA-II, to the injection network problem in vehicular ad hoc networks (VANETs). This multi-objective optimisation problem consists of finding the minimal set of nodes with backend connectivity, referred to as injection points, that constitute a fully connected overlay and optimise the small-world properties of the resulting network. Recently, the well-known NSGA-II algorithm was used to tackle this problem on realistic instances in the city centre of Luxembourg. In this work we analyse the performance of CCNSGA-II when using different numbers of subpopulations and compare it to the original NSGA-II in terms of both the quality of the obtained Pareto front approximations and the execution time speedup.
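The cooperative-coevolutionary decomposition described in the first sentences of the abstract can be sketched generically: each subpopulation optimises one sub-vector and evaluates its candidates by combining them with the other subpopulations' current best partial solutions. The sketch uses a single-objective toy fitness and synchronous updates; it is not NSGA-II, the VANET injection problem, or the asynchronous parallel scheme.

```python
import random

# Generic cooperative coevolution: the decision vector is split into sub-vectors,
# one subpopulation per sub-vector; a candidate is evaluated by joining it with
# the best known partial solutions of the other subpopulations. Toy sphere
# fitness, minimised; not NSGA-II nor the paper's parallel asynchronous scheme.

DIM, GROUPS, SIZE = 6, 3, 10
bounds = (-5.0, 5.0)

def fitness(vector):                       # sphere function, minimum at the origin
    return sum(x * x for x in vector)

subpops = [[[random.uniform(*bounds) for _ in range(DIM // GROUPS)]
            for _ in range(SIZE)] for _ in range(GROUPS)]
best_parts = [pop[0][:] for pop in subpops]          # current best sub-vectors

def evaluate(g, candidate):
    full = [x for i, part in enumerate(best_parts)
            for x in (candidate if i == g else part)]
    return fitness(full)

for generation in range(50):
    for g, pop in enumerate(subpops):
        pop.sort(key=lambda cand: evaluate(g, cand))
        best_parts[g] = pop[0][:]                    # broadcast the best partial solution
        parent = pop[0]
        pop[SIZE // 2:] = [[x + random.gauss(0, 0.3) for x in parent]
                           for _ in range(SIZE - SIZE // 2)]   # mutate around the best

print("best full solution fitness:", fitness([x for part in best_parts for x in part]))
```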
••
TL;DR: An assessment of the Computer Science curriculum based on various data-collection instruments used to determine professors', students' and graduates' perception of the program, as well as the deficiencies and potential of graduates according to the companies and organizations that hire them.
Abstract: The Central University of Venezuela, as part of its efforts to adapt its academic offer to national and international needs, is conducting a project to review, evaluate and modify the Computer Science curriculum, in order to form the professionals required by the country. In this paper we present the results of the first stage of the project: an assessment of the Computer Science curriculum based on various data-collection instruments used to determine professors', students' and graduates' perception of the program, as well as the deficiencies and potential of our graduates according to the companies and organizations that hire them. The results show that there is a gap between our professors' perception of the program and the opinion of the employers. We also present information about student performance during the last decade, which is an input to the next stage of the program redesign.
••
TL;DR: Part of the scenario of Computer Science education and of internship activities in Teaching Degree courses is presented, showing their organizational structure and the directions of activities that provide experience with teaching.
Abstract: In Teaching Degree courses in Brazil, the internship is intended to complement the student's education, treating this field of work as an object of investigation and critical reflection on the environment that surrounds it. However, in Teaching Degrees in Computer Science, which train educators for this area, some difficulties prevent the internship from being carried out fully, owing to the absence of educational policies that establish the role of these teachers. Despite incentives for actions involving the teaching of Computer Science in basic education in Brazil, Computer Science is not part of any school curriculum. Seeking to encourage this dialogue, this paper presents part of the scenario of Computer Science education and of internship activities in Teaching Degree courses, showing their organizational structure and the directions of activities that provide experience with teaching. The paper also presents the challenges and the lessons learned while conducting internship courses in a specific program.