
Showing papers presented at "Virtual Environments, Human-Computer Interfaces and Measurement Systems" in 2005


Proceedings ArticleDOI
18 Jul 2005
TL;DR: This methodology, based on balancing processing and transmission tasks, can be applied in all those situations where subjective image-based measurements are required; it seeks the compression ratio that gives a significant reduction of transmission time.
Abstract: The paper presents a methodology to reduce the energy consumption of a visual sensor node in wireless sensor networks. This methodology, based on balancing processing and transmission tasks, can be applied in all those situations where subjective image-based measurements are required. Considering that, in a sensor node, the communication task consumes the most energy, the proposed method tries to find the best compression ratio able to give a significant reduction of transmission time while returning adequate and objective image quality and compression time. Some results are presented applying the proposed methodology to a particular case study: the use of wireless visual sensors for the remote metering of water meters.
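
The core trade-off the method balances — spending processor energy on compression to save radio energy on transmission — can be illustrated with a short sketch. The energy model, the constants, and the quality curve below are illustrative assumptions, not values or formulas from the paper.

```python
# Sketch: choose the compression ratio that minimizes total node energy,
# subject to an image-quality floor. All constants are illustrative.

def total_energy(image_bits, ratio, e_proc_per_bit=5e-9, e_tx_per_bit=50e-9):
    """Energy (J) = processing cost of compressing + radio cost of sending."""
    processing = e_proc_per_bit * image_bits * ratio   # work grows with ratio (assumption)
    transmission = e_tx_per_bit * image_bits / ratio   # sent bits shrink with ratio
    return processing + transmission

def quality(ratio):
    """Stand-in for an objective quality metric (e.g., PSNR); falls with ratio."""
    return 45.0 - 8.0 * (ratio ** 0.5)                 # illustrative model

def best_ratio(image_bits, min_quality=30.0, candidates=range(2, 65)):
    feasible = [r for r in candidates if quality(r) >= min_quality]
    return min(feasible, key=lambda r: total_energy(image_bits, r))

if __name__ == "__main__":
    bits = 640 * 480 * 8                               # one greyscale VGA frame
    r = best_ratio(bits)
    print(f"ratio={r}, energy={total_energy(bits, r):.4f} J, "
          f"quality={quality(r):.1f} dB")
```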

141 citations


Proceedings ArticleDOI
18 Jul 2005
TL;DR: A new concept of low-cost, high-resolution, lightweight, compact and highly portable tactile display based on shape memory alloy (SMA) technology, which allows the development of 60 g, compact tactile devices easily carried in the user's hand.
Abstract: This paper presents a new concept of low-cost, high-resolution, lightweight, compact and highly portable tactile display. The prototype consists of an array of 8 × 8 independently movable up/down pins based on shape memory alloy (SMA) technology. Each actuator is capable of developing a 320 mN pull force at 1.5 Hz bandwidth using simple forced-air convection. The proposed concept allows the development of 60 g tactile devices of compact dimensions, easily carried in the user's hand. The SMA active element, the tactile actuator and the tactile display are presented and discussed.
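
As a rough illustration of driving such an 8 × 8 array of up/down pins, the sketch below renders a binary tactile frame pin by pin; the set_pin interface is hypothetical, not the authors' driver API.

```python
# Sketch: rendering a frame on a hypothetical 8x8 up/down pin tactile display.
GRID = 8

def render(frame, set_pin):
    """frame: 8x8 matrix of 0/1; set_pin(row, col, up) actuates one SMA pin
    (each pin's actuation bandwidth is limited, ~1.5 Hz per the paper)."""
    for r in range(GRID):
        for c in range(GRID):
            set_pin(r, c, bool(frame[r][c]))

if __name__ == "__main__":
    # A diagonal test pattern; print instead of actuating real hardware.
    frame = [[1 if r == c else 0 for c in range(GRID)] for r in range(GRID)]
    render(frame, lambda r, c, up: print(f"pin({r},{c}) -> {'up' if up else 'down'}"))
```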

75 citations


Proceedings ArticleDOI
18 Jul 2005
TL;DR: The authors propose a Web sensor built using the Web service approach for data transmission between server and client, which offers great potential in terms of easy access to measured data, integration of large and complex Web sensor networks, realization of flexible custom applications, and service reusability.
Abstract: The solutions commonly adopted to build smart Web sensors are based either on data communication that produces static HTML Web pages or on data-socket connections using Java applets. The main disadvantages of these techniques are that individual data items sent by the smart Web sensor cannot be accessed directly and that the manufacturer's own system must be used, because only the manufacturer knows how server and client exchange information. This is, of course, very restrictive for the development of complex sensor networks or flexible applications based on Web smart sensors. In this paper the authors propose a Web sensor built using the Web service approach for data transmission between server and client. In this way, smart Web sensors are no longer mere suppliers of Web pages to browsers; they become function servers in a monitoring system. Moreover, this solution offers great potential in terms of easy access to measured data, integration of large and complex Web sensor networks, realization of flexible custom applications, and service reusability.
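
A minimal sketch of the "function server" idea: instead of serving pages, the sensor exposes an endpoint that returns a single structured measurement any client can parse. Plain HTTP with an XML payload stands in here for the paper's Web-service (SOAP-style) machinery; the endpoint name and payload format are assumptions.

```python
# Sketch: a smart Web sensor as a function server rather than a page server.
import random
from http.server import BaseHTTPRequestHandler, HTTPServer

def read_sensor():
    return 21.0 + random.random()       # stand-in for a real measurement

class SensorService(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/getMeasurement":          # hypothetical endpoint
            body = (f"<measurement><value>{read_sensor():.3f}</value>"
                    f"<unit>degC</unit></measurement>").encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/xml")
            self.end_headers()
            self.wfile.write(body)      # client gets structured data, not a page
        else:
            self.send_error(404)

if __name__ == "__main__":
    HTTPServer(("", 8080), SensorService).serve_forever()
```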

29 citations


Proceedings ArticleDOI
18 Jul 2005
TL;DR: An architecture for the implementation of a distributed laboratory for the metrological confirmation of power quality instruments is described; the generic node of the distributed laboratory (the field lab) is illustrated, and the communication interfaces of field instruments are compared.
Abstract: With the liberalization of the market, electric energy is increasingly considered a commercial product; this means that its quality must be assessed and continuously monitored with certified devices. In this paper, an architecture for the implementation of a distributed laboratory for the metrological confirmation of power quality instruments is described. The paper is structured in three main parts: the first describes the proposed architecture and its advantages; the second illustrates in detail the generic node of the distributed laboratory, the field lab; and the last compares the communication interfaces of field instruments. The paper is completed with experimental results on the metrological confirmation of harmonics and flicker field instruments.
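
As a sketch of what metrological confirmation of a harmonics instrument involves, the snippet below compares an instrument's harmonic readings against an FFT-based reference analysis within a tolerance. The signal, tolerance, and pass criterion are illustrative assumptions, not the paper's procedure.

```python
# Sketch: confirming a field instrument's harmonic readings against a
# reference analysis. All numbers are illustrative.
import numpy as np

FS, F0, CYCLES = 12800, 50.0, 10          # sample rate, fundamental, window length

def harmonic_magnitudes(signal, max_order=5):
    """Reference harmonic analysis: FFT over an integer number of cycles."""
    spectrum = np.fft.rfft(signal) / len(signal) * 2
    bins_per_harmonic = int(round(F0 * len(signal) / FS))
    return {h: abs(spectrum[h * bins_per_harmonic]) for h in range(1, max_order + 1)}

def confirm(instrument_readings, reference, tol=0.05):
    """Pass if every harmonic agrees with the reference within tol (relative)."""
    return all(abs(instrument_readings[h] - v) <= tol * max(v, 1e-12)
               for h, v in reference.items())

if __name__ == "__main__":
    t = np.arange(int(FS * CYCLES / F0)) / FS
    v = 230 * np.sin(2 * np.pi * F0 * t) + 10 * np.sin(2 * np.pi * 3 * F0 * t)
    ref = harmonic_magnitudes(v)
    field = {h: val * 1.01 for h, val in ref.items()}   # simulated instrument (+1%)
    print("confirmed" if confirm(field, ref) else "out of tolerance")
```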

13 citations


Proceedings ArticleDOI
18 Jul 2005
TL;DR: Face orientation was determined by measuring the 3D positions of the pupils and nostrils using the stereo camera method; head orientation detection was possible for a large horizontal angle of ±45° and a vertical angle of ±15°.
Abstract: Face orientation was determined by measuring the 3D positions of the pupils and nostrils using the stereo camera method. The two cameras were set at relatively low positions in order to capture the nostril image of a user. Two sets of several LEDs were arranged around the apertures of the two cameras, respectively. The two sets of light sources were lit alternately, synchronously with the odd/even field signal. This illumination condition alternately produced so-called bright and dark pupils in both camera images every field. Differencing the bright and dark pupil images in each camera made it easy to detect the pupils by image processing. A large search window was applied at the probable position of the nostrils, estimated from the two detected pupils. When the face was illuminated by the light source attached to the other camera, the nostrils were darker than the other parts of the window, so they could be detected easily. However, when the horizontal angle of the face is large, one nostril may go undetected; the position of the undetected nostril was then estimated using the ratio of the distance between the two nostrils to the distance between the two pupils, obtained from an initial calibration procedure. The 3D positions of the two pupils and two nostrils were calculated by stereo matching. The composition vector of the normal of the plane through the two pupils and the left nostril and the normal of the plane through the two pupils and the right nostril was defined as the face orientation. The experimental results showed that, for all three subjects, head orientation detection was possible for a large horizontal angle of ±45° and a vertical angle of ±15°.
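
The orientation computation described — summing the normals of the two pupil–nostril planes — is straightforward to reproduce; the sketch below implements it with numpy, using illustrative point coordinates.

```python
# Sketch of the described orientation computation: the face direction is the
# normalized sum of the normals of plane (pupils, left nostril) and plane
# (pupils, right nostril). Coordinates are illustrative.
import numpy as np

def plane_normal(a, b, c):
    n = np.cross(b - a, c - a)
    return n / np.linalg.norm(n)

def face_orientation(pupil_l, pupil_r, nostril_l, nostril_r):
    n1 = plane_normal(pupil_l, pupil_r, nostril_l)
    n2 = plane_normal(pupil_l, pupil_r, nostril_r)
    v = n1 + n2                      # composition vector of the two plane normals
    return v / np.linalg.norm(v)

if __name__ == "__main__":
    # Illustrative 3D positions (mm) as stereo matching might return them.
    pl, pr = np.array([-30.0, 0, 0]), np.array([30.0, 0, 0])
    nl, nr = np.array([-8.0, -45, 15]), np.array([8.0, -45, 15])
    print(face_orientation(pl, pr, nl, nr))
```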

13 citations


Proceedings ArticleDOI
18 Jul 2005
TL;DR: The authors propose improving spatial perception and immersion in CAVEs through sound field simulation and correct matching of audio and visual cues, and introduce a spatial sound immersion grading scale to allow system assessment and comparison of capabilities in delivering spatial immersion.
Abstract: A correct and broad coupling of sound to visual applications is still missing in most immersive VR environments, while future and advanced applications demand a more realistic and integrated audiovisual solution to permit complete immersive experiences. A vast field of investigation remains before a correct and complete immersive system can reproduce realistic constructions of worlds. Sound field simulation, although complex and expensive to implement in the past, is now a potential candidate to improve spatial perception and correctness in CAVEs and other VR systems, but there are serious challenges and multiple techniques for doing the job. In this paper, we introduce our investigations in these fields and our proposals to improve spatial perception and immersion experience in CAVEs through sound field simulation and correct matching of audio and visual cues. Additionally, a spatial sound immersion grading scale is proposed to allow for system assessment and comparison of capabilities in delivering spatial immersion.

13 citations


Proceedings ArticleDOI
18 Jul 2005
TL;DR: This paper provides a methodological procedure for establishing optimal calibration intervals according to the reference standards; a virtual instrument has been developed to perform the proposed procedure.
Abstract: This paper provides a methodological procedure for establishing optimal calibration intervals according to the reference standards. To this aim, a virtual instrument has been developed to perform the proposed procedure. Generally speaking, the goal of a calibration process is to reduce out-of-tolerance occurrences to a level acceptable with respect to the desired quality target. The inevitable calibration uncertainty contributions can affect the measurement capability, and an inappropriate estimation of these aspects could lead to incorrect calibration intervals. The proposed method is therefore based on a statistical procedure for evaluating the impact of calibration process uncertainty on the decision risks. The adopted statistical approach is suitable for estimating the maintenance time needed to meet desired reliability goals. By evaluating the probabilities of erroneous decisions regarding the in-tolerance state within fixed limits, the statistical and metrological features of the calibration process take part in computing the most reliable interval according to an acceptable decision reliability target. The proposed methodology is fully compliant with the standards of the sector and the ISO 9000 guidelines for quality assurance.
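
One common way to turn a reliability target into a calibration interval assumes an exponential in-tolerance reliability model; the sketch below uses that simplification, which is consistent in spirit with, but not necessarily identical to, the paper's statistical procedure.

```python
# Sketch: reliability-target method for calibration intervals, assuming an
# exponential in-tolerance reliability model R(t) = exp(-lambda * t).
import math

def failure_rate(intervals_observed, out_of_tolerance_found):
    """Estimate lambda from calibration history: failures per unit time."""
    total_time = sum(intervals_observed)
    return out_of_tolerance_found / total_time

def optimal_interval(lam, reliability_target=0.95):
    """Longest interval t with R(t) = exp(-lam * t) >= target."""
    return -math.log(reliability_target) / lam

if __name__ == "__main__":
    history = [12, 12, 12, 12, 12]          # months between past calibrations
    lam = failure_rate(history, out_of_tolerance_found=2)
    print(f"recommended interval: {optimal_interval(lam):.1f} months")
```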

10 citations


Proceedings ArticleDOI
18 Jul 2005
TL;DR: This proposal allows versatile and automatic shape reconstruction for cultural heritage retrieval, restoration and virtual training, coping with the unavoidable ranging-error problem inherent in cultural heritage scanning.
Abstract: This paper proposes a practical, topologically robust and ranging-error-resistant shape modeling procedure that approximates a real 3D object, such as a heritage artifact, with matrix-format meshing for 3D shape processing. The processing is used for modifying the modeled shape, especially for restoring broken parts of artifacts or virtually manipulating the 3D shape. A geometric model with the desired meshing is directly reconstructed based on a solid modeling approach. The radial distance of each scanning point from the axis of the cylindrical coordinates is measured by laser triangulation; the angular and vertical positions of the laser beam are the two other coordinate values of the scan. A face array listing (topology), which defines the vertex (sampling point) connectivity and the shape of the mesh, is assigned to meet the desired meshing. Stable meshing, and hence an accurate approximation, free from the shape ambiguity unavoidable in the widely used ICP (iterative closest point) modeling, is then accomplished. This proposal allows versatile and automatic shape reconstruction for cultural heritage retrieval, restoration and virtual training, coping with the unavoidable ranging-error problem inherent in cultural heritage scanning.
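
The key idea — that a regular scanning grid in cylindrical coordinates fixes the mesh topology in advance — can be sketched as follows; grid resolutions and the synthetic "scan" are illustrative.

```python
# Sketch: matrix-format meshing from a cylindrical-coordinate laser scan.
# Radial distances r[i][j] are sampled at angular step i and height step j;
# the face array is fixed by the grid, so connectivity never becomes ambiguous.
import math

N_THETA, N_Z, DZ = 36, 10, 5.0

def vertices(r):
    """Cylindrical (theta, z, r) samples -> Cartesian vertex list."""
    verts = []
    for i in range(N_THETA):
        theta = 2 * math.pi * i / N_THETA
        for j in range(N_Z):
            verts.append((r[i][j] * math.cos(theta),
                          r[i][j] * math.sin(theta),
                          j * DZ))
    return verts

def face_array():
    """Quad connectivity implied by the scan grid (wraps around in theta)."""
    faces = []
    for i in range(N_THETA):
        for j in range(N_Z - 1):
            a = i * N_Z + j
            b = ((i + 1) % N_THETA) * N_Z + j
            faces.append((a, b, b + 1, a + 1))
    return faces

if __name__ == "__main__":
    scan = [[50.0 + 3.0 * math.sin(j) for j in range(N_Z)] for _ in range(N_THETA)]
    print(len(vertices(scan)), "vertices,", len(face_array()), "quads")
```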

8 citations


Proceedings ArticleDOI
18 Jul 2005
TL;DR: A framework that identifies behavioral patterns through physical parameters such as direction, force, pressure and velocity has been built; the objective of this research is to exploit people's habits in handling devices to identify individuals.
Abstract: Biometrics has recently been introduced to identify people by their behavioral and physiological features. It offers a wide application scope for detecting fraud attempts in organizations, corporations, educational institutions, electronic resources and even crime scenes. The field of biometrics can be divided into two main classes: features that humans are born with, such as fingerprints or facial features, and behavioral characteristics, like a handwritten signature or voice (J. Ortega-Garcia et al., 2004). The work presented in this paper pursues the latter class, specifically how a person reacts when using everyday devices or tools. The hypothesis that motivated this work is that we can exploit people's habits in handling devices to identify individuals. Among the many examples of the potential use of this class of biometrics are the particular force applied to the keys of a keyboard, or the time interval between keypresses when dialing a telephone number. Another example is the path described by the fingers when navigating through a maze-solving operation. Extracting these features using a haptic-based application and defining the resulting individual pattern is the objective of this research. A framework that identifies behavioral patterns through physical parameters such as direction, force, pressure and velocity has been built. The setup for the experimental work consisted of a multisensory tool, using the Reachin system (Reachin Technologies, User's Programmers Guide and API).
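
A minimal sketch of such a framework: reduce a trace of haptic samples to a few statistical features and identify the user as the nearest enrolled template. The feature set and the nearest-mean classifier are illustrative assumptions, not the authors' actual pattern-definition method.

```python
# Sketch: identifying a user from haptic interaction traces via simple
# statistical features (mean/std of force and velocity).
import math

def features(samples):
    """samples: list of (force, velocity) pairs -> feature vector."""
    fs = [s[0] for s in samples]
    vs = [s[1] for s in samples]
    mean = lambda xs: sum(xs) / len(xs)
    std = lambda xs: math.sqrt(sum((x - mean(xs)) ** 2 for x in xs) / len(xs))
    return (mean(fs), std(fs), mean(vs), std(vs))

def identify(trace, enrolled):
    """Return the enrolled user whose template is nearest in feature space."""
    f = features(trace)
    return min(enrolled, key=lambda user: math.dist(f, enrolled[user]))

if __name__ == "__main__":
    enrolled = {"alice": (2.1, 0.3, 0.8, 0.1), "bob": (3.5, 0.7, 1.6, 0.4)}
    trace = [(2.0, 0.75), (2.3, 0.85), (1.9, 0.8)]   # (force N, velocity m/s)
    print(identify(trace, enrolled))
```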

7 citations


Proceedings ArticleDOI
18 Jul 2005
TL;DR: This paper discusses a multiagent formalism intended for the minimization of entropy and maximization of efficiency within an environment monitoring application.
Abstract: This paper discusses a multiagent formalism intended for the minimization of entropy and maximization of efficiency within an environment monitoring application.

6 citations


Proceedings ArticleDOI
18 Jul 2005
TL;DR: It is shown that the presence of accent in speech can increase the error rate, and a method is proposed to lower the recognition error rate by first determining the accent of the utterance and then applying the appropriate ASR engine from a bank of engines trained for different accents.
Abstract: This paper examines the impact of accent present in speech on the performance of speaker-independent automatic speech recognition (ASR) systems. We show that the presence of accent in speech can increase the error rate. We validate a fundamental assumption: a speaker-independent ASR engine trained on a variety of accents performs worse than an engine trained for a particular accent, when tested on that same accent. Based on the results, we propose a method to lower the recognition error rate, and measure the improvement, by first determining the accent of the utterance and then applying the appropriate ASR engine from a bank of engines trained for different accents. We show that applying this method results in an average decrease of 24% in the overall error rate. The results are encouraging for future complementary work. The research was carried out on an HMM-based speech recognizer, and the TIMIT database was used to train and test the ASR engines.
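
The proposed two-stage pipeline is easy to outline: classify the accent, then route the utterance to the matching engine from the bank. The sketch below uses stand-in stubs for the classifier and engines; the real system used HMM-based recognizers trained on TIMIT.

```python
# Sketch of the proposed pipeline: accent identification, then routing to an
# accent-specific engine. Classifier and engines are stand-in stubs.
def classify_accent(audio):
    return "southern"                       # stand-in accent classifier

class AccentEngine:
    def __init__(self, accent):
        self.accent = accent
    def recognize(self, audio):
        return f"<transcript from {self.accent}-trained engine>"

ENGINE_BANK = {a: AccentEngine(a) for a in ("southern", "northern", "western")}

def recognize(audio):
    accent = classify_accent(audio)         # stage 1: determine the accent
    engine = ENGINE_BANK[accent]            # stage 2: pick the matching engine
    return engine.recognize(audio)

if __name__ == "__main__":
    print(recognize(b"...pcm samples..."))
```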

Proceedings ArticleDOI
18 Jul 2005
TL;DR: It is proved that these automata are the API-based platforms of the platform-based design paradigm, a theoretical result which highlights the effectiveness of application program interfaces (APIs) in modelling these interactions.
Abstract: This paper presents a novel theoretical formalization that models interactions between a virtual environment and its components by means of multilayer automata. Secondly, we prove that these automata are the platforms, based on application program interfaces (APIs), of the platform-based design paradigm, thus obtaining a theoretical result which highlights the effectiveness of APIs in modelling these interactions. Finally, a case study of an embedded system for the electronic measurement of gas concentration is proposed and described.

Proceedings ArticleDOI
18 Jul 2005
TL;DR: An approach to developing a mathematical model of color harmony is described; it will be applied in the color harmonizer, an automated tool for coloring computer interfaces and Web sites, which will incorporate a color harmony engine able to host a variety of color harmony theories.
Abstract: We describe an approach to developing a mathematical model of color harmony. This will be applied in the color harmonizer, an automated tool for coloring computer interfaces and Web sites. The tool will incorporate a color harmony engine that can host a variety of color harmony theories and, in the first instance, will use the rules proposed by Munsell, adapted for use in computer displays. We describe abstract and concrete color schemes, the Chromotome (a tool developed to facilitate the selection of colors) and techniques for grouping interface elements.
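
A sketch of the pluggable-engine idea: a harmony rule maps a base color to a scheme, and rules can be swapped. The simple hue-rotation rules below are illustrative placeholders, not Munsell's actual rules.

```python
# Sketch: a harmony engine as a table of interchangeable rules.
import colorsys

def rotate_hue(rgb, degrees):
    h, s, v = colorsys.rgb_to_hsv(*rgb)          # rgb components in [0, 1]
    return colorsys.hsv_to_rgb((h + degrees / 360.0) % 1.0, s, v)

HARMONY_RULES = {
    "analogous": lambda rgb: [rgb, rotate_hue(rgb, 30), rotate_hue(rgb, -30)],
    "complementary": lambda rgb: [rgb, rotate_hue(rgb, 180)],
}

def harmonize(rgb, rule):
    return HARMONY_RULES[rule](rgb)

if __name__ == "__main__":
    base = (0.2, 0.4, 0.8)
    for name in HARMONY_RULES:
        scheme = harmonize(base, name)
        print(name, [tuple(round(c, 2) for c in col) for col in scheme])
```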

Proceedings ArticleDOI
18 Jul 2005
TL;DR: This paper presents the implementation and the preliminary experimental validation of a distributed robotic system for monitoring magnetic fields starting from measurements performed by mobile robots equipped with magnetic sensors.
Abstract: In this paper we present the implementation and the preliminary experimental validation of a distributed robotic system for monitoring magnetic fields. In particular, we concentrate on localizing magnetic field sources starting from measurements performed by mobile robots equipped with magnetic sensors.
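
One plausible strategy consistent with this setup — not necessarily the authors' algorithm — is gradient climbing on the measured field magnitude; the sketch below simulates a single robot doing so against a synthetic source.

```python
# Sketch: a robot localizing a magnetic source by climbing the gradient of the
# measured field magnitude. Field model and step rule are assumptions.
import math

SOURCE = (4.0, -2.0)                      # unknown to the robot

def field_magnitude(x, y):
    """Synthetic stand-in for a magnetometer reading (decays as 1/r^3)."""
    r = math.hypot(x - SOURCE[0], y - SOURCE[1]) + 1e-6
    return 1.0 / r ** 3

def seek_source(x, y, step=0.1, iters=200, eps=1e-3):
    for _ in range(iters):
        # Finite-difference gradient from readings around the robot's position.
        gx = (field_magnitude(x + eps, y) - field_magnitude(x - eps, y)) / (2 * eps)
        gy = (field_magnitude(x, y + eps) - field_magnitude(x, y - eps)) / (2 * eps)
        norm = math.hypot(gx, gy)
        if norm == 0:
            break
        x, y = x + step * gx / norm, y + step * gy / norm
    return x, y

if __name__ == "__main__":
    est = seek_source(0.0, 0.0)
    print("estimated source near:", tuple(round(c, 2) for c in est))
```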

Proceedings ArticleDOI
18 Jul 2005
TL;DR: This paper presents the stochastic analysis platform for peer-to-peer networks (SAPP2P), which is able to measure the QoS of any P2P network by simulating its dynamics (up to 10^7 peers, with computation times of 10^3-10^4 s).
Abstract: The measurement of quality of service (QoS) for peer-to-peer (P2P) networks is principally motivated by the increasing need to model and regulate the nondeterministic behaviour of this kind of network, which has recently achieved great success and popularity, representing a large share of total Internet data communications. The stochastic nature and complexity of P2P networks make the QoS difficult to measure. This paper presents the stochastic analysis platform for peer-to-peer networks (SAPP2P), which is able to measure the QoS of any P2P network (described by means of a suitable language) by simulating its dynamics (up to 10^7 peers, with computation times of 10^3-10^4 s). This measurement technique becomes a useful aid in the design and optimization of high-performance P2P networks.
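
At toy scale, the kind of figure such a platform produces can be sketched as a stochastic churn simulation that measures one QoS metric; the model below (random join/leave, replicated keys) is an illustrative assumption, not SAPP2P's actual network-description language or dynamics.

```python
# Sketch: a miniature stochastic P2P churn simulation measuring lookup success.
import random

random.seed(42)

def simulate(peers=10_000, steps=1_000, churn=0.02, replicas=3):
    """Each step: a fraction of peers toggles online/offline (churn); a lookup
    succeeds if any of the `replicas` peers holding the key is online."""
    online = [True] * peers
    successes = 0
    for _ in range(steps):
        for _ in range(int(peers * churn)):          # churn phase
            online[random.randrange(peers)] ^= True  # toggle join/leave
        holders = random.sample(range(peers), replicas)
        successes += any(online[h] for h in holders)
    return successes / steps

if __name__ == "__main__":
    print(f"lookup success rate: {simulate():.3f}")
```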

Proceedings ArticleDOI
18 Jul 2005
TL;DR: A novel method based on virtual instruments for a real time correction of errors due to metrological characteristics of the transducers and measurement chain: sensitivity, phase response, microphone directional characteristics and interchannel delay is described.
Abstract: Sound intensity can be evaluated by means of the pressures measured by two microphones; with four or six microphones one can measure the three-dimensional sound intensity vector. Measurements carried out in this way are subject to errors due to various causes. The aim of this paper is to describe a novel method (based on virtual instruments) for the real-time correction of such errors. The correction algorithm compensates errors due to the metrological characteristics of the transducers and measurement chain: sensitivity, phase response, microphone directional characteristics and interchannel delay. The effects of environmental conditions, namely pressure, humidity and temperature, can also be compensated.
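
For context, the classical two-microphone (p-p) intensity estimate that such probes rely on is sketched below: particle velocity is obtained by integrating the finite-difference pressure gradient, and intensity is the time average of pressure times velocity. The signals and probe geometry are synthetic.

```python
# Sketch: two-microphone (p-p) sound intensity estimate.
import numpy as np

RHO, FS, DR = 1.21, 48000, 0.012     # air density, sample rate, mic spacing (m)

def intensity(p1, p2):
    p_mid = 0.5 * (p1 + p2)                       # pressure at probe midpoint
    grad = (p2 - p1) / DR                         # finite-difference gradient
    u = -np.cumsum(grad) / (RHO * FS)             # Euler equation, integrated
    return float(np.mean(p_mid * u))              # active intensity (W/m^2)

if __name__ == "__main__":
    t = np.arange(FS) / FS
    f, c = 250.0, 343.0                           # tone, speed of sound
    delay = DR / c                                # plane wave along probe axis
    p1 = np.sin(2 * np.pi * f * t)
    p2 = np.sin(2 * np.pi * f * (t - delay))
    print(f"I = {intensity(p1, p2):.4e} W/m^2")   # ~ p_rms^2/(rho*c) expected
```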

Proceedings ArticleDOI
18 Jul 2005
TL;DR: Inertia turned out not to influence performance and immersion in either delayed or non-delayed scenarios, and neither training in delayed scenarios nor personal attributes showed any significant effect on operator performance.
Abstract: Telepresent control of robots over the Internet or across large distances is in many applications either necessary or beneficial. However, exposing important control data to unreliable network conditions largely restricts the ability to carry out many long-distance operations. This paper addresses a number of research questions regarding: (1) the impact of network delay on operator performance and immersion; (2) the implementation of simulated inertia to possibly improve performance and immersion; (3) the effect of personal attributes, such as sensorimotor coordination, on operator performance; and (4) pre-training in delayed telepresence scenarios. To accommodate these research goals, a virtual telepresence scenario was developed, utilizing an international network between Munich in Germany and Wollongong in Australia. The results of this research confirm the negative impact of delay. Inertia turned out not to influence performance and immersion in either delayed or non-delayed scenarios. Additionally, neither training in delayed scenarios nor personal attributes showed any significant effect on operator performance.

Proceedings ArticleDOI
18 Jul 2005
TL;DR: The implementation of a microcontroller-based smart sensor for the extraction of newly conceived power quality indexes, together with a new measurement algorithm considered more rapid in detecting a sag occurrence.
Abstract: In this paper the authors describe the implementation of a microcontroller-based smart sensor for the extraction of newly conceived power quality indexes. The work starts from the improvement of three indexes presented in a previous work (De Capua et al., 2004) for exhaustively detecting voltage sags. After an examination of the loads that could be more susceptible to the duration or to the depth of a sag, an ANOVA analysis was conducted in order to evaluate the indexes' sensitivity to these characteristics of the sag. A new measurement algorithm was then implemented, considered more rapid in detecting a sag occurrence. Finally, a dsPIC-based smart sensor was realized to monitor the voltage RMS value and extract the index values. These values are transmitted via serial communication to software located on an external peripheral for the subsequent data processing stage.
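
The core of any sag detector such a sensor must run — a sliding one-cycle RMS compared against a threshold (commonly 0.9 p.u.) — can be sketched as follows; the rates, threshold, and synthetic waveform are illustrative, and the paper's indexes are richer than this.

```python
# Sketch: sliding one-cycle RMS with a sag threshold.
import math

FS, F0, VNOM = 6400, 50, 230.0
N = FS // F0                                   # samples per cycle

def sliding_rms(samples):
    """One-cycle RMS updated every sample (as a microcontroller loop might)."""
    for k in range(N, len(samples) + 1):
        window = samples[k - N:k]
        yield k, math.sqrt(sum(s * s for s in window) / N)

def detect_sags(samples, threshold=0.9 * VNOM):
    return [k for k, rms in sliding_rms(samples) if rms < threshold]

if __name__ == "__main__":
    wave = [VNOM * math.sqrt(2) * math.sin(2 * math.pi * F0 * k / FS)
            * (0.6 if 3 * N <= k < 6 * N else 1.0)      # 60% sag for 3 cycles
            for k in range(10 * N)]
    hits = detect_sags(wave)
    print(f"sag detected from sample {hits[0]} to {hits[-1]}" if hits else "no sag")
```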

Proceedings ArticleDOI
18 Jul 2005
TL;DR: The proposed architecture is a layer-based approach that overcomes the lack of a uniform controlling mechanism by making full use of the functionalities of haptic devices according to each user's personal needs.
Abstract: Research work performed in the area of haptics has produced some remarkable results; there is a good variety of haptic devices with the potential to offer users a rich experience in a virtual reality setting. However, users still face difficulties operating in such environments. Without doubt, the need for users to 'adjust to' operating haptic devices is an inconvenience. Nevertheless, the lack of a uniform controlling mechanism able to fully use the abilities of haptic devices according to each user's personal attributes and needs is the main hurdle that must be overcome. Bringing about such an architecture is an inspiring venture that presents some challenges but is, in turn, immensely rewarding. The architecture we propose is a layer-based approach to overcome the lack of a uniform controlling mechanism; it makes full use of the functionalities of haptic devices according to each user's personal needs. This framework is integrated with a behavioral data container, which holds the personalized information of the users. The latter is our context-aware, or intelligent, component, which dictates what device to use and how, according to the user data that has already been collected and analyzed. This intelligent component takes the decisions and directs the user to the appropriate environment. With the relevant software and hardware, it is possible to offer the user an optimal experience of a haptic-induced virtual world.

Proceedings ArticleDOI
18 Jul 2005
TL;DR: A practical mathematical algorithm for undersampling is presented, based on the sampling theorem in its two known forms, referring to the bandwidth and to the maximum/highest frequency of the spectrum.
Abstract: The first part of the present paper presents the sampling theorem in its two known forms, referring to the bandwidth and to the maximum/highest frequency of the spectrum. In the second part of the paper, a practical mathematical algorithm for undersampling is presented. Characteristics and experimental results are analyzed at the end.
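
The bandpass form of the sampling theorem referred to here admits a compact statement: for a signal confined to [f_lo, f_hi] with bandwidth B, any rate fs with 2*f_hi/n <= fs <= 2*f_lo/(n-1), for integer n from 1 up to floor(f_hi/B), avoids aliasing. The sketch below enumerates these zones for an illustrative band; it is the textbook formulation, not necessarily the paper's exact algorithm.

```python
# Sketch: valid undersampling (bandpass sampling) rate zones.
def valid_undersampling_rates(f_lo, f_hi):
    bandwidth = f_hi - f_lo
    rates = []
    for n in range(1, int(f_hi // bandwidth) + 1):
        lo = 2 * f_hi / n                                  # lower bound of zone n
        hi = 2 * f_lo / (n - 1) if n > 1 else float("inf") # upper bound (n=1: Nyquist)
        if lo <= hi:
            rates.append((n, lo, hi))
    return rates

if __name__ == "__main__":
    for n, lo, hi in valid_undersampling_rates(20e6, 22e6):
        top = "inf" if hi == float("inf") else f"{hi / 1e6:.2f}"
        print(f"zone n={n}: fs in [{lo / 1e6:.2f}, {top}] MHz")
```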

Proceedings ArticleDOI
M. Aziz, R.D. Macredie
18 Jul 2005
TL;DR: On the basis of users' familiarity with the system, a conceptual model is proposed that identifies several critical perceptual measures at various stages of interaction and provides significant insights into users' varying perceptions of ease of use depending on their familiarity with systems.
Abstract: Users report interaction difficulties with information systems because these systems are designed on the basis of developers' perceptions. Addressing this issue is more challenging in the context of Web-based information systems, where users come from varying backgrounds. This paper therefore aims to analyse the interaction with Web-based information systems on the basis of individual users' perceptions of ease of use. The variability in users' backgrounds is quantified in terms of their familiarity with the systems. On the basis of users' familiarity with the system, a conceptual model is proposed. The model identifies several critical perceptual measures at various stages of interaction. The model is tested and validated in a three-phase empirical study. The findings provide significant insights into users' varying perceptions of ease of use depending on their familiarity with systems. These results have implications for interaction design and usability evaluation processes, among others.

Proceedings ArticleDOI
18 Jul 2005
TL;DR: A method is proposed for evaluating, for each identified edge, an interval in which the actual edge falls with a specified confidence level; it is validated with reference to synthesized and real images.
Abstract: The paper deals with the expression of the uncertainty of edge localization in image analysis applications. A method for evaluating, for each identified edge, an interval in which the actual edge falls with a specified confidence level is proposed and validated with reference to synthesized and real images.
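
A Monte Carlo way to attach such an interval to an edge location is sketched below: re-detect the edge under resampled noise and take central quantiles. This illustrates the goal; the authors' method may be analytical rather than simulation-based, and the edge profile and noise level are synthetic.

```python
# Sketch: Monte Carlo confidence interval for an edge position in 1-D.
import numpy as np

rng = np.random.default_rng(1)

def detect_edge(profile):
    """Edge = position of the steepest gradient."""
    return int(np.argmax(np.abs(np.diff(profile))))

def edge_interval(profile, noise_sigma, trials=2000, confidence=0.95):
    """Re-detect the edge under resampled noise; take central quantiles."""
    locs = [detect_edge(profile + rng.normal(0, noise_sigma, profile.size))
            for _ in range(trials)]
    lo, hi = np.quantile(locs, [(1 - confidence) / 2, (1 + confidence) / 2])
    return lo, hi

if __name__ == "__main__":
    x = np.arange(200)
    profile = 50 + 100 / (1 + np.exp(-(x - 90) / 3.0))   # smooth step at 90
    print("95% interval for edge position:", edge_interval(profile, noise_sigma=5.0))
```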

Proceedings ArticleDOI
18 Jul 2005
TL;DR: A system that allows wireless transmission of peripheral arterial pulse waveforms using a GPRS network is proposed; the photoplethysmogram reveals circulatory depression and arrhythmia.
Abstract: The shape of the arterial pulsations depends upon the thickness of the blood vessels and the contractility of the heart, as well as the state of the vascular smooth muscle in the vessel wall. Photoplethysmography (PPG) is by definition an optoelectronic method for measuring and recording changes in the volume of a body part. The shape and stability of the PPG waveform can be used as an indication of possible motion artifacts or low-perfusion conditions. Furthermore, the photoplethysmogram reveals circulatory depression and arrhythmia. In the paper, we propose a system that allows wireless transmission of peripheral arterial pulse waveforms using a GPRS network. Selected examples of results, obtained during investigations made on a number of real PPG signals, are presented.

Proceedings ArticleDOI
18 Jul 2005
TL;DR: The paper deals with the use of support vector machines in software-based instrument fault accommodation schemes; a performance comparison between SVMs and artificial neural networks is reported.
Abstract: The paper deals with the use of support vector machines (SVMs) in software-based instrument fault accommodation schemes. A performance comparison between SVMs and artificial neural networks (ANNs) is also reported. As an example, a real case study on an automotive system is presented. The regression capabilities of ANNs and SVMs are employed to accommodate faults that could occur in the main sensors involved in engine operation. The obtained results prove the good behaviour of both tools, with similar performance achieved in terms of accuracy.
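
A minimal sketch of SVM-based fault accommodation: train a regressor on healthy data, then reconstruct a faulty sensor's value from the remaining sensors. Synthetic data stands in for the paper's automotive signals, and the hyperparameters are illustrative.

```python
# Sketch: when a sensor is flagged as faulty, substitute an SVM-regression
# estimate computed from the remaining healthy sensors.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

rng = np.random.default_rng(0)

# Healthy training data: two correlated "engine" sensors plus the sensor
# we may later need to accommodate (all synthetic).
n = 500
s1 = rng.uniform(800, 4000, n)                         # e.g., engine speed
s2 = 0.02 * s1 + rng.normal(0, 2.0, n)                 # a correlated sensor
target = 0.5 * s1 + 30.0 * s2 + rng.normal(0, 5.0, n)  # sensor to reconstruct

model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=1000.0, epsilon=1.0))
model.fit(np.column_stack([s1, s2]), target)

# At run time the target sensor is flagged faulty: use the estimate instead.
healthy_readings = np.array([[2500.0, 49.0]])
print(f"accommodated value: {model.predict(healthy_readings)[0]:.1f}")
```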

Proceedings ArticleDOI
18 Jul 2005
TL;DR: Experimental results show that the GFRB scheme is a potential solution to data synchronization in large distributed virtual environments, using a reduced number of messages and multicast groups by taking into consideration the percentage of overlap of a region on a cell.
Abstract: In large-scale distributed virtual environments (LDVEs), extensive and/or multidimensional virtual 3D environments can be shared by thousands of participating users in applications ranging from multiplayer games to virtual cities, virtual shopping malls, open-space military training, etc. In such systems, distribution seems inherent. The world geometric data of an extensive region or terrain may need to be divided into areas that are distributed across the network. By distributing the data, host memory use and processing power are reduced, since less data needs to be received, processed and stored. However, these gains can easily be surpassed by the amount of traffic generated by the synchronization messages transmitted, as well as by the processing time due to the control infrastructure and the bulky transmission of parts of the distributed VE. Good mechanisms for data distribution management (DDM) are therefore an important requirement for LDVEs. This paper describes the grid filtered region-based (GFRB) DDM scheme, which combines the positive aspects of both region-based and grid-based DDM methods to reduce network traffic and latency in LDVEs using a minimum number of multicast groups. The GFRB scheme is unique in providing a finer-grained mechanism that exactly matches publishers to subscribers with intersecting regions. It does so with a reduced number of messages and multicast groups, by taking into consideration the percentage of overlap of a region on a cell. Experimental results show that the GFRB scheme is a potential solution to data synchronization in large distributed virtual environments.
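
The overlap filtering GFRB adds on top of a plain grid can be sketched directly from the description: a region joins a cell's multicast group only if it covers enough of that cell. The grid size and threshold below are illustrative assumptions.

```python
# Sketch: region-to-cell matching filtered by percentage of cell overlap.
CELL = 100.0          # grid cell edge length
MIN_OVERLAP = 0.25    # fraction of cell area required to join the cell's group

def overlap_fraction(region, cx, cy):
    """region: (x1, y1, x2, y2). Returns overlap area / cell area."""
    x1, y1, x2, y2 = region
    ox = max(0.0, min(x2, (cx + 1) * CELL) - max(x1, cx * CELL))
    oy = max(0.0, min(y2, (cy + 1) * CELL) - max(y1, cy * CELL))
    return (ox * oy) / (CELL * CELL)

def matched_cells(region):
    """Cells whose multicast group this region should join."""
    x1, y1, x2, y2 = region
    cells = []
    for cx in range(int(x1 // CELL), int(x2 // CELL) + 1):
        for cy in range(int(y1 // CELL), int(y2 // CELL) + 1):
            if overlap_fraction(region, cx, cy) >= MIN_OVERLAP:
                cells.append((cx, cy))
    return cells

if __name__ == "__main__":
    update_region = (80.0, 40.0, 230.0, 160.0)   # a publisher's update region
    print("join multicast groups of cells:", matched_cells(update_region))
```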

Proceedings ArticleDOI
18 Jul 2005
TL;DR: This paper addresses the concept of immersive telemeasurement, where resources such as instruments and programmable devices are distributed in a network of different laboratories and could cooperate to set up augmented experiments.
Abstract: This paper addresses the concept of immersive telemeasurement, where resources such as instruments and programmable devices are distributed in a network of different laboratories and can cooperate to set up augmented experiments. Immersivity for the user is provided by a three-dimensional (3D) representation of the remote aggregated laboratory and is augmented by the possibility of interacting with real devices through their 3D representation. The use of these techniques, in addition to stereoscopic projection, allows the creation of an environment with immersive characteristics, increasing the level of presence experienced by users.

Proceedings ArticleDOI
18 Jul 2005
TL;DR: The article gives a review of modern tools supporting the development and adoption of a new model of book in the field of instrumentation & measurement, and includes a short description of the structure of the electronic book.
Abstract: The article gives a review of modern tools supporting the development and adoption of a new model of book in the field of instrumentation & measurement. It includes a short description of the structure of the electronic book. The last part presents some details concerning Web-based distributed measurement systems, remote access to laboratories, and virtual laboratories.