Author
Javier Pereira
Other affiliations: University of Talca, Diego Portales University, University of Vigo …
Bio: Javier Pereira is an academic researcher from University of A Coruña. The author has contributed to research in topics: Information system & Context (language use). The author has an h-index of 18 and has co-authored 126 publications receiving 1,181 citations. Previous affiliations of Javier Pereira include University of Talca & Diego Portales University.
Papers published on a yearly basis
Papers
TL;DR: The work presents a network for indoor and outdoor air quality monitoring whose nodes include tin dioxide sensor arrays connected to an acquisition and control system with WiFi communication capabilities; advanced processing based on multiple-input single-output neural networks is implemented at the network sensing nodes.
Abstract: This paper presents a network for indoor and outdoor air quality monitoring. Each node is installed in a different room and includes tin dioxide sensor arrays connected to an acquisition and control system. The nodes are hardwired or wirelessly connected to a central monitoring unit. To increase the gas concentration measurement accuracy and to prevent false alarms, two gas sensor influence quantities, i.e., temperature and humidity, are also measured. Advanced processing based on multiple-input single-output neural networks is implemented at the network sensing nodes to obtain temperature- and humidity-compensated gas concentration values. Anomalous operation of the network sensing nodes and power consumption are also discussed.
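A minimal sketch of the kind of compensation step described above: a small multiple-input single-output network maps a raw tin dioxide reading plus the two influence quantities (temperature and humidity) to a compensated concentration. The 3-4-1 architecture, tanh activation, and random weights are illustrative assumptions, not the network reported in the paper; in practice the weights would be fitted to calibration data at each sensing node.

```python
# Sketch of a multiple-input single-output (MISO) compensation network.
# Architecture and weights are illustrative assumptions, not the paper's network.
import numpy as np

rng = np.random.default_rng(0)

# Inputs per sensing node: raw tin dioxide sensor reading, temperature, humidity.
x = np.array([0.62, 23.5, 41.0])      # [sensor output (V), temperature (degC), RH (%)]

# Illustrative weights; in practice these would be fitted to calibration data.
W1 = rng.normal(size=(4, 3))          # hidden layer: 4 neurons, 3 inputs
b1 = rng.normal(size=4)
W2 = rng.normal(size=(1, 4))          # output layer: 1 compensated concentration
b2 = rng.normal(size=1)

def miso_forward(x):
    """Forward pass: raw reading + influence quantities -> compensated concentration."""
    h = np.tanh(W1 @ x + b1)          # hidden layer with tanh activation
    return float((W2 @ h + b2)[0])    # single compensated gas concentration value

print(f"Compensated concentration (arbitrary units): {miso_forward(x):.3f}")
```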
193 citations
TL;DR: The main advantages of the proposed conductivity sensor include a wide measurement range, an intrinsic capability to minimize errors caused by fouling and polarization effects, and automatic compensation of the temperature dependence of the conductivity measurements.
Abstract: In this paper, a new four-electrode sensor for water conductivity measurements is presented. In addition to the sensor itself, all signal conditioning is implemented together with signal processing of the sensor outputs to determine the water conductivity. The sensor is designed for conductivity measurements in the range from 50 mS/m up to 5 S/m through the correct placement of the four electrodes inside the tube where the water flows. The implemented prototype is capable of supplying the sensor with the necessary current at the measurement frequency, acquiring the sine signals across the voltage electrodes of the sensor and across a sampling impedance to determine the current. A temperature sensor is also included in the system to measure the water temperature and, thus, compensate the water-conductivity temperature dependence. The main advantages of the proposed conductivity sensor include a wide measurement range, an intrinsic capability to minimize errors caused by fouling and polarization effects, and automatic compensation of the temperature dependence of the conductivity measurements.
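A hedged sketch of the measurement chain outlined above: conductance from the drive current and the voltage sensed across the inner electrodes, scaled by a cell constant, then compensated to 25 °C. The cell constant and the 2 %/°C linear temperature coefficient are generic assumptions for illustration, not values taken from the paper.

```python
# Conductance from the four-electrode voltage/current readings, scaled by a cell
# constant, then referenced to 25 degC. Cell constant and temperature coefficient
# are assumed example values, not data from the paper.
def conductivity_four_electrode(v_sense, i_drive, cell_constant_per_m=10.0,
                                temp_c=18.0, alpha_per_degc=0.02, ref_temp_c=25.0):
    """Return water conductivity in S/m, compensated to the reference temperature."""
    conductance = i_drive / v_sense                   # S, from inner-electrode voltage
    sigma_raw = conductance * cell_constant_per_m     # S/m, geometry via cell constant
    # Linear compensation of the water-conductivity temperature dependence.
    sigma_25 = sigma_raw / (1.0 + alpha_per_degc * (temp_c - ref_temp_c))
    return sigma_25

# Example: 1 mA drive current, 5 mV across the voltage electrodes, water at 18 degC.
print(f"{conductivity_four_electrode(5e-3, 1e-3):.3f} S/m")
```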
83 citations
TL;DR: In this paper, a brief overview of some existing solutions is presented and two systems for axial linear displacement measurement based on light intensity detection are introduced. The systems are redundant and were designed to identify and automatically correct errors arising from inadvertent angular variations between the sensor and the light beam positions.
Abstract: The present work is a contribution to the field of linear displacement measurements by optical means. For that purpose, a brief overview of some existing solutions is presented and two systems for axial linear displacement measurement based on light intensity detection are introduced. The systems have redundancy and were designed with the purpose of achieving identification and automatic correction of errors arising from inadvertent angular variations between the sensor and the light beam positions.
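An illustrative sketch of how redundancy can make an intensity-based displacement estimate insensitive to angular variations, under simplifying assumptions that are not the paper's actual optical model: two photodetectors spaced a known distance apart along the beam axis, with received intensity following an inverse-square law scaled by a common unknown tilt factor. Taking the ratio of the two redundant readings cancels the tilt and source-power terms.

```python
# Two redundant detectors spaced d0 apart along the beam axis; the intensity ratio
# cancels the common tilt/power factor, leaving only the axial displacement.
# The inverse-square model and the numbers are assumptions for illustration.
import math

def axial_displacement(i_near, i_far, d0_mm=10.0):
    """Estimate distance z (mm) from source to the near detector from two intensities."""
    ratio = i_near / i_far                  # tilt and source-power factors cancel here
    return d0_mm / (math.sqrt(ratio) - 1.0)

# Example: true z = 50 mm, detectors 10 mm apart, arbitrary common tilt factor 0.8.
z_true, tilt = 50.0, 0.8
i_near = tilt / z_true**2
i_far = tilt / (z_true + 10.0)**2
print(f"Estimated displacement: {axial_displacement(i_near, i_far):.1f} mm")
```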
68 citations
TL;DR: The results of a survey in which Chilean software practitioners' perceptions of project success are compared with previous research on US practitioners indicate that there is a relationship between teamwork and success; the data also suggest peer control inside the US teams, indicating a less stressful environment.
Abstract: Due to the increasing globalization of software development, we are interested in discovering whether there are significant cultural differences in practitioners' definition of a successful software project. This study presents the results of a survey in which Chilean software practitioners' perceptions of project success are compared with previous research with US practitioners. Responses from both groups of practitioners indicate that there is a relationship between teamwork and success; our results also indicate that there are similar perceptions related to the importance of job satisfaction and project success. However, Chilean responses suggest that if a practitioner is allowed too much freedom within the work environment, job stress results; this in turn is reflected in increasing demands for both job satisfaction and good environmental conditions. This may indicate the potential for the attribution of failure to conditions outside the team, thus preventing a search for problematic team issues and technical problems. In contrast, the data suggest peer control inside the US teams, indicating a less stressful environment.
56 citations
TL;DR: This article proposes a genetic-programming-based methodology for extracting rules from ANNs regardless of their architecture, aiming to achieve the generalization capacity characteristic of ANNs by means of symbolic rules that are understandable to human beings.
Abstract: Various techniques for the extraction of ANN rules have been used, but most of them have focused on certain types of networks and their training. There are very few methods that deal with ANN rule extraction as systems that are independent of their architecture, training, and internal distribution of weights, connections, and activation functions. This article proposes a methodology for the extraction of ANN rules, regardless of their architecture, and based on genetic programming. The strategy is based on the previous algorithm and aims at achieving the generalization capacity that is characteristic of ANNs by means of symbolic rules that are understandable to human beings.
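A rough sketch of the black-box idea behind architecture-independent rule extraction: the trained ANN is queried only through its predictions, and an interpretable model is fitted to reproduce its input/output behaviour. Here a decision tree stands in for the paper's genetic-programming search over symbolic rules, and the MLP and dataset are invented for illustration, not the authors' setup.

```python
# Black-box rule extraction sketch: query the trained network, then fit an
# interpretable surrogate to its predictions. A decision tree stands in for the
# genetic-programming search used in the paper.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.neural_network import MLPClassifier
from sklearn.tree import DecisionTreeClassifier, export_text

X, y = make_classification(n_samples=500, n_features=4, random_state=0)

# 1) Train the "opaque" network whose behaviour we want to explain.
ann = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0).fit(X, y)

# 2) Query the network as a black box: its outputs become the targets for the rules,
#    independent of its architecture, weights, connections, or activation functions.
ann_labels = ann.predict(X)

# 3) Fit an interpretable surrogate to the network's input/output behaviour.
surrogate = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, ann_labels)

print(export_text(surrogate, feature_names=[f"x{i}" for i in range(4)]))
print("fidelity to ANN:", (surrogate.predict(X) == ann_labels).mean())
```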
42 citations
Cited by
31 Oct 2001
TL;DR: The American Society for Testing and Materials (ASTM), as mentioned in this paper, is an independent organization devoted to the development of standards for testing and materials.
Abstract: The American Society for Testing and Materials (ASTM) is an independent organization devoted to the development of standards.
3,792 citations
01 Jan 2004
TL;DR: Comprehensive and up-to-date, this book includes essential topics that either reflect practical significance or are of theoretical importance and describes numerous important application areas such as image based rendering and digital libraries.
Abstract: From the Publisher:
The accessible presentation of this book gives both a general view of the entire computer vision enterprise and also offers sufficient detail to be able to build useful applications. Users learn techniques that have proven to be useful through first-hand experience and a wide range of mathematical methods. A CD-ROM included with every copy of the text contains source code for programming practice, color images, and illustrative movies. Comprehensive and up-to-date, this book includes essential topics that either reflect practical significance or are of theoretical importance. Topics are discussed in substantial and increasing depth. Application surveys describe numerous important application areas such as image-based rendering and digital libraries. Many important algorithms are broken down and illustrated in pseudocode. Appropriate for use by engineers as a comprehensive reference to the computer vision enterprise.
3,627 citations
Book
01 Jan 1975
TL;DR: The major change in the second edition of this book is the addition of a new chapter on probabilistic retrieval, which I think is one of the most interesting and active areas of research in information retrieval.
Abstract: The major change in the second edition of this book is the addition of a new chapter on probabilistic retrieval. This chapter has been included because I think this is one of the most interesting and active areas of research in information retrieval. There are still many problems to be solved so I hope that this particular chapter will be of some help to those who want to advance the state of knowledge in this area. All the other chapters have been updated by including some of the more recent work on the topics covered. In preparing this new edition I have benefited from discussions with Bruce Croft. The material of this book is aimed at advanced undergraduate information (or computer) science students, postgraduate library science students, and research workers in the field of IR. Some of the chapters, particularly Chapter 6, make simple use of a little advanced mathematics. However, the necessary mathematical tools can be easily mastered from numerous mathematical texts that now exist and, in any case, references have been given where the mathematics occur. I had to face the problem of balancing clarity of exposition with density of references. I was tempted to give large numbers of references but was afraid they would have destroyed the continuity of the text. I have tried to steer a middle course and not compete with the Annual Review of Information Science and Technology. Normally one is encouraged to cite only works that have been published in some readily accessible form, such as a book or periodical. Unfortunately, much of the interesting work in IR is contained in technical reports and Ph.D. theses. For example, most of the work done on the SMART system at Cornell is available only in reports. Luckily many of these are now available through the National Technical Information Service (U.S.) and University Microfilms (U.K.). I have not avoided using these sources although if the same material is accessible more readily in some other form I have given it preference. I should like to acknowledge my considerable debt to many people and institutions that have helped me. Let me say first that they are responsible for many of the ideas in this book but that only I wish to be held responsible. My greatest debt is to Karen Sparck Jones who taught me to research information retrieval as an experimental science. Nick Jardine and Robin …
822 citations
TL;DR: It is shown how network techniques can help in the identification of single-target, edgetic, multi-target and allo-network drug target candidates; an optimized protocol of network-aided drug development is suggested, and a list of systems-level hallmarks of drug quality is provided.
Abstract: Despite considerable progress in genome- and proteome-based high-throughput screening methods and in rational drug design, the increase in approved drugs in the past decade did not match the increase of drug development costs. Network description and analysis not only give a systems-level understanding of drug action and disease complexity, but can also help to improve the efficiency of drug design. We give a comprehensive assessment of the analytical tools of network topology and dynamics. The state-of-the-art use of chemical similarity, protein structure, protein-protein interaction, signaling, genetic interaction and metabolic networks in the discovery of drug targets is summarized. We propose that network targeting follows two basic strategies. The "central hit strategy" selectively targets central nodes/edges of the flexible networks of infectious agents or cancer cells to kill them. The "network influence strategy" works against other diseases, where an efficient reconfiguration of rigid networks needs to be achieved by targeting the neighbors of central nodes/edges. It is shown how network techniques can help in the identification of single-target, edgetic, multi-target and allo-network drug target candidates. We review the recent boom in network methods helping hit identification, lead selection optimizing drug efficacy, as well as minimizing side-effects and drug toxicity. Successful network-based drug development strategies are shown through the examples of infections, cancer, metabolic diseases, neurodegenerative diseases and aging. Summarizing >1200 references we suggest an optimized protocol of network-aided drug development, and provide a list of systems-level hallmarks of drug quality. Finally, we highlight network-related drug development trends helping to achieve these hallmarks by a cohesive, global approach.
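A toy illustration of the "central hit strategy" discussed above: rank the nodes of a small interaction network by centrality to nominate candidate targets whose removal would most disrupt the network. The graph is invented for illustration and is not data from the review; networkx is assumed to be available.

```python
# Rank nodes of a hypothetical interaction network by betweenness centrality,
# a simple stand-in for selecting central nodes under the "central hit strategy".
import networkx as nx

# Hypothetical interaction network of an infectious agent (edges = interactions).
edges = [("A", "B"), ("A", "C"), ("B", "C"), ("C", "D"),
         ("D", "E"), ("D", "F"), ("E", "F"), ("C", "G")]
g = nx.Graph(edges)

# Central hit strategy: prefer central nodes, whose removal most disrupts the network.
centrality = nx.betweenness_centrality(g)
for node, score in sorted(centrality.items(), key=lambda kv: -kv[1])[:3]:
    print(f"candidate target {node}: betweenness = {score:.2f}")
```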
806 citations