
Showing papers on "Eye tracking published in 1990"


Proceedings ArticleDOI
01 Mar 1990
TL;DR: Some of the human factors and technical considerations that arise in trying to use eye movements as an input medium are discussed, and the first eye movement-based interaction techniques devised and implemented in the authors' laboratory are described.
Abstract: In seeking hitherto-unused methods by which users and computers can communicate, we investigate the usefulness of eye movements as a fast and convenient auxiliary user-to-computer communication mode. The barrier to exploiting this medium has not been eye-tracking technology but the study of interaction techniques that incorporate eye movements into the user-computer dialogue in a natural and unobtrusive way. This paper discusses some of the human factors and technical considerations that arise in trying to use eye movements as an input medium, describes our approach and the first eye movement-based interaction techniques that we have devised and implemented in our laboratory, and reports our experiences and observations on them.

644 citations


Book
01 Jan 1990
TL;DR: The role of visual and cognitive processes in the control of eye movement and the role of eye movements in the detection of contrast and spatial detail is examined.
Abstract: Preface. 1. The role of visual and cognitive processes in the control of eye movement (E. Kowler). 2. Predictive control of eye movement (M. Pavel). 3. The role of eye movement in the detection of contrast and spatial detail (R.M. Steinman and J.Z. Levinson). 4. Binocular eye movements and the perception of depth (H. Collewijn and C.J. Erkelens). 5. Eye movement and visual localization of objects in space (A.A. Skavenski). 6. The role of eye movements in the perception of motion and shape (H. Wallach). 7. Comparison of perception in the moving and stationary eye (G. Sperling). 8. Eye movements in visual search: cognitive, perceptual and motor control aspects (P. Viviani). 9. Eye movements and reading (J.K. O'Regan). 10. Eye-movement models for arithmetic and reading performance (P. Suppes). Subject index. Imprint: Elsevier Amsterdam

546 citations



Journal ArticleDOI
TL;DR: It is shown that observers can perceive their direction of self-motion during stationary fixations and pursuit eye movements and with displays that simulate the optical effects of eye movements, indicating that the visual system can perform the decomposition with both continuous and discontinuous fields on the basis of flow-field information alone but requires a three-dimensional environmental structure to do so.
Abstract: Translation of an observer through a static environment generates a pattern of optical flow that specifies the direction of self-motion, but the retinal flow pattern is confounded by pursuit eye movements. How does the visual system decompose the translational and rotational components of flow to determine heading? It is shown that observers can perceive their direction of self-motion during stationary fixations and pursuit eye movements and with displays that simulate the optical effects of eye movements. Results indicate that the visual system can perform the decomposition with both continuous and discontinuous fields on the basis of flow-field information alone but requires a three-dimensional environmental structure to do so. The findings are inconsistent with general computational models and theories based on the maximum of divergence, oculomotor signals, or multiple fixations but are consistent with the theory of reliance on differential motion produced by environmental variation in depth.

341 citations


Proceedings ArticleDOI
01 Mar 1990
TL;DR: An information display system is described which uses eye-tracking to monitor user looking about its graphics screen and analyzes the user's patterns of eye movements and fixations in real-time to make inferences about what item or collection of items shown holds most relative interest for the user.
Abstract: An information display system is described which uses eye-tracking to monitor user looking about its graphics screen. The system analyzes the user's patterns of eye movements and fixations in real-time to make inferences about what item or collection of items shown holds most relative interest for the user. Material thus identified is zoomed-in for a closer look, and described in more detail via synthesized speech.

239 citations
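The gaze-responsive display in the entry above infers interest from patterns of fixations on screen items. A minimal sketch of that idea as a dwell-time scoring loop; the item layout, decay factor, and scoring rule are illustrative assumptions, not the system's actual algorithm:

```python
# Hedged sketch: dwell-time interest scoring over screen items.
# Older interest decays with each new fixation; dwell on an item adds to it.

def accumulate_interest(fixations, items, decay=0.9):
    """fixations: list of (x, y, duration_ms).
    items: dict mapping item name -> bounding box (x0, y0, x1, y1).
    Returns the name of the item with the highest accumulated interest."""
    scores = {name: 0.0 for name in items}
    for x, y, dur in fixations:
        for name, (x0, y0, x1, y1) in items.items():
            scores[name] *= decay                  # older interest fades
            if x0 <= x <= x1 and y0 <= y <= y1:
                scores[name] += dur                # dwell adds interest
    return max(scores, key=scores.get)
```

A real-time system would run this incrementally per fixation and trigger the zoom or speech output once the winning score crosses a threshold.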




Journal ArticleDOI
TL;DR: The relation between parafoveal letter and space information in eye movement guidance during reading was investigated and the results are consistent with a model of eye movement control in which 2 independent processes are operating in tandem to determine when and where to move the eyes during reading.
Abstract: The relation between parafoveal letter and space information in eye movement guidance during reading was investigated in 2 experiments. Contingent upon the reader's fixation, the type of parafoveal information available to the right of fixation was varied by (a) space information only, (b) space information with letter information added at some delay, or (c) letter and space information simultaneously. In addition, the onset of the relevant parafoveal information was delayed between 0 and 250 ms into the fixation. The time course of processing the 2 types of information (letters or spaces) differed, as did the nature of their impact on the eye movement record. Although both letter and space information influenced saccade length and initial landing positions within words, only letter information had an effect on fixation duration. In addition, fixation duration was affected only by information entering within the first 50 ms of the fixation, whereas saccade length was affected by information arriving at any time during the fixation. The results are consistent with a model of eye movement control in which 2 independent processes are operating in tandem to determine when and where to move the eyes during reading.

187 citations


Proceedings ArticleDOI
01 Feb 1990
TL;DR: In this article, a ray tracer for volume data is proposed, in which the number of rays cast per unit area on the image plane and the numbers of samples drawn per unit length along each ray are functions of local retinal acuity.
Abstract: We direct our gaze at an object by rotating our eyes or head until the object's projection falls on the fovea, a small region of enhanced spatial acuity near the center of the retina. In this paper, we explore methods for incorporating gaze direction into rendering algorithms. This approach permits generation of images exhibiting continuously varying resolution, and allows these images to be displayed on conventional television monitors. Specifically, we describe a ray tracer for volume data in which the number of rays cast per unit area on the image plane and the number of samples drawn per unit length along each ray are functions of local retinal acuity. We also describe an implementation using 2D and 3D mip maps, an eye tracker, and the Pixel-Planes 5 massively parallel raster display system. Pending completion of Pixel-Planes 5 in the spring of 1990, we have written a simulator on a Stellar graphics supercomputer. Preliminary results indicate that while users are aware of the variable-resolution structure of the image, the high-resolution sweet spot follows their gaze well and promises to be useful in practice.

182 citations
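A gaze-contingent renderer of the kind described above needs a sampling density that falls off with retinal eccentricity. A minimal sketch, assuming a simple hyperbolic acuity falloff; the falloff law and constants are illustrative, not the paper's:

```python
import math

# Hedged sketch: eccentricity-dependent sample density for a
# gaze-contingent ray caster. Density peaks at the gaze point and
# falls off as 1 / (1 + eccentricity / e2), a common simple model.

def rays_per_unit_area(px, py, gx, gy, peak=16.0, e2=2.0):
    """Sample density at pixel (px, py) given gaze point (gx, gy).
    peak: density at the fovea; e2: eccentricity at which density halves."""
    ecc = math.hypot(px - gx, py - gy)   # distance from gaze point
    return peak / (1.0 + ecc / e2)
```

The same function could drive the number of samples per unit length along each ray, so both image-plane and depth sampling track the viewer's acuity.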


Journal ArticleDOI
01 Sep 1990
TL;DR: An adaptive method for visually tracking a known moving object with a single mobile camera to predict the location of features of the object on the image plane based on past observations and past control inputs and to determine an optimal control input that will move the camera so that the image features align with their desired positions.
Abstract: An adaptive method for visually tracking a known moving object with a single mobile camera is described. The method differs from previous methods of motion estimation in that both the camera and the object are moving. The objective is to predict the location of features of the object on the image plane based on past observations and past control inputs and then to determine an optimal control input that will move the camera so that the image features align with their desired positions. A resolved motion rate control structure is used to control the relative position and orientation between the camera and the object. A geometric model of the camera is used to determine the linear differential transformation from image features to camera position and orientation. To adjust for modeling errors and system nonlinearities, a self-tuning adaptive controller is used to update the transformation and compute the optimal control. Computer simulations were conducted to verify the performance of the adaptive feature prediction and control.

171 citations
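The predict-then-control loop described above begins by extrapolating each image feature's next location from past observations. A minimal sketch of just that prediction step, using constant-velocity extrapolation as an illustrative stand-in for the paper's self-tuning predictor:

```python
# Hedged sketch: constant-velocity prediction of an image feature's next
# position from its two most recent observations. The full method also
# adapts the image-to-camera transformation online; this fragment shows
# only the prediction step, under a constant-velocity assumption.

def predict_feature(history):
    """history: list of (u, v) image positions, most recent last.
    Returns the linearly extrapolated next position."""
    (u0, v0), (u1, v1) = history[-2], history[-1]
    return (2 * u1 - u0, 2 * v1 - v0)   # p_next = p_now + (p_now - p_prev)
```

The controller would then compute a camera motion that drives the predicted feature position toward its desired image location.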



Journal ArticleDOI
Hitoshi Honda
TL;DR: Egocentric localization was not affected by the mode of eye movements, indicating that the egocentric localization system functions without interference from the inaccurate information from the pursuit-eye-movement system.
Abstract: The accuracy of perceptual judgment of the distance of a moving target tracked at various velocities by pursuit eye movements was examined in relation to the amount of two types of eye movement (smooth pursuit eye movement and compensatory saccade) involved in eye tracking. The perceptually judged distance became shorter as the amount of pursuit-eye-movement component in eye tracking increased. A detailed analysis of the eye-movement data and the size of perceptual underestimation indicated that the underestimation was mainly caused by inaccurate extraretinal information derived from the pursuit-eye-movement system, which underestimated the distance at a constant ratio, irrespective of the velocity of tracking. Egocentric localization was not affected by the mode of eye movements, indicating that the egocentric localization system functions without interference from the inaccurate information from the pursuit-eye-movement system.

Journal ArticleDOI
TL;DR: A new approach to the aperture problem is presented, using an adaptive neural network model that accommodates its structure to long-term statistics of visual motion, but also simultaneously uses its acquired structure to assimilate, disambiguate, and represent visual motion events in real-time.

Journal ArticleDOI
TL;DR: It is concluded that the effects of initial tracking conditions on eye velocity at latencies of less than 70 ms cannot be caused by visual feedback through the smooth-pursuit system, and there must be another mechanism for short-latency control over the VOR.
Abstract: 1. Monkeys normally use a combination of smooth head and eye movements to keep the eyes pointed at a slowly moving object. The visual inputs from target motion evoke smooth pursuit eye movements, whereas the vestibular inputs from head motion evoke a vestibuloocular reflex (VOR). Our study asks how the eye movements of pursuit and the VOR interact. Is there a linear addition of independent commands for pursuit and the VOR? Or does the interaction of visual and vestibular stimuli cause momentary, "parametric" modulation of transmission through VOR pathways? 2. We probed for the state of the VOR and pursuit by presenting transient perturbations of target and/or head motion under different steady-state tracking conditions. Tracking conditions included fixation at straight-ahead gaze, in which both the head and the target were stationary; "times-zero (X0) tracking," in which the target and head moved in the same direction at the same speed; and "times-two (X2) tracking," in which the target and head moved in opposite directions at the same speed. 3. Comparison of the eye velocities evoked by changes in the direction of X0 versus X2 tracking revealed two components of the tracking response. The earliest component, which we attribute to the VOR, had a latency of 14 ms and a trajectory that did not depend on initial tracking conditions. The later component had a latency of 70 ms or less and a trajectory that did depend on tracking conditions. 4. To probe the latency of pursuit eye movements, we imposed perturbations of target velocity during X0 and X2 tracking. The resulting changes in eye velocity had latencies of at least 100 ms. We conclude that the effects of initial tracking conditions on eye velocity at latencies of less than 70 ms cannot be caused by visual feedback through the smooth-pursuit system. Instead, there must be another mechanism for short-latency control over the VOR; we call this component of the response "short-latency tracking." 5. Perturbations of head velocity or head and target velocity during X0 and X2 tracking showed that short-latency tracking depended only on the tracking conditions at the time the perturbation was imposed. The VOR appeared to be suppressed when the initial conditions were X0 tracking. 6. The magnitude of short-latency tracking depended on the speed of initial head and target movement. During X0 tracking at 15 deg/s, short-latency tracking was modest. When the initial speed of head and target motion was 60 deg/s, the amplitude of short-latency tracking was quite large and its latency became as short as 36 ms. (ABSTRACT TRUNCATED AT 400 WORDS)


Journal ArticleDOI
TL;DR: A new systems approach for evaluating field performance of drivers is presented in a practical experimental demonstration and results suggest that the eye and head movements are highly dependent upon the type of turn configuration, type of targets, type and frequency of distractors, and traffic control configurations.


Journal ArticleDOI
TL;DR: Eye tracking performance was evaluated both qualitatively and quantitatively using percent root-mean-square (%RMS) error and pursuit gain scores and has implications for understanding of schizophrenic information processing during visual tracking.

Journal ArticleDOI
TL;DR: Great variability in tracking performance was found among nine normal subjects, and tracking quality improved in each when asked to perform a simple analysis of the tracking target: saccade counts and amplitudes decreased, as did a root-mean-square error measure of overall tracking performance.

Journal ArticleDOI
TL;DR: It was confirmed that eye refraction measurement during vergence is possible using this eye-tracking infra-red optometer, an instrument developed to measure dynamic refractive power during slow horizontal eye movement.

Proceedings ArticleDOI
01 Nov 1990
TL;DR: In this paper, a Kalman filter is used to separate smooth pursuit from saccadic eye movements by estimating the beginning and end of each saccade; each detected event is then recognized as a saccade using a few simple rules.
Abstract: A simple but efficient algorithm has been worked out for online analysis of the visual tracking signal. The program separates smooth pursuit from saccadic eye movements after estimating the beginning and the end of the saccades. The procedure comprises four steps: after the eye position signal has been differentiated, a Kalman filter is applied, whose innovation is tested by means of Hinkley's stopping rule. The event detected in this way is then recognized as a saccade using a few simple rules.
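The pipeline this entry describes (differentiate position, filter velocity, test the innovation, classify) can be sketched as follows. Note the paper tests the innovation with Hinkley's cumulative-sum stopping rule; a plain innovation threshold is substituted here for brevity, and all noise parameters are illustrative assumptions:

```python
# Hedged sketch of the saccade-detection pipeline: differentiate eye
# position, track velocity with a scalar Kalman filter, and flag samples
# whose innovation (measurement minus prediction) exceeds a threshold.
# A simple threshold stands in for Hinkley's stopping rule used in the paper.

def detect_saccades(positions, dt=0.005, q=1.0, r=4.0, thresh=30.0):
    """positions: eye position samples (deg); dt: sample period (s).
    q, r: process and measurement noise variances; thresh: deg/s.
    Returns one boolean per velocity sample: True where a saccade is flagged."""
    velocity = [(b - a) / dt for a, b in zip(positions, positions[1:])]
    x, p = 0.0, 1.0                # filter state: velocity estimate, variance
    flags = []
    for v in velocity:
        p += q                     # predict step (constant-velocity model)
        innov = v - x              # innovation: measured minus predicted
        k = p / (p + r)            # Kalman gain
        x += k * innov             # update estimate
        p *= (1 - k)               # update variance
        flags.append(abs(innov) > thresh)
    return flags
```

During steady pursuit the innovation stays small; the abrupt velocity jump at a saccade produces a large innovation, marking its onset and offset.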


Proceedings ArticleDOI
01 Oct 1990
TL;DR: Methods for evaluation of the eye tracker are described and experimental results presented that reveal its present performance characteristics.
Abstract: The Fiber Optic Helmet Mounted Display (FOHMD) projects high and low resolution computer generated imagery via fiber optic bundles through collimated helmet mounted optics to each eye. Combined head and eye position information is then used to position a high resolution area of interest within the head tracked low resolution background. Methods for evaluation of the eye tracker are described and experimental results presented that reveal its present performance characteristics.

ReportDOI
01 Mar 1990
TL;DR: The system architecture, initial implementation, advantages and limitations, and future application of an eye position monitor for evaluating the field-of-view requirements for simulator visual systems are described.
Abstract: : This paper describes the use of an eye position monitor as a research tool for evaluating the field-of-view (FOV) requirements for simulator visual systems. Traditional evaluation methods rely on the use of pilot opinion and/or objective pilot performance measures. Neither provides a direct index of the pilot's visual behavior under alternative FOV conditions. Without a direct measure, interpretation of data is often problematic. The use of an eye position monitor provides a useful adjunct to these traditional methods. The present paper describes the system architecture, initial implementation, advantages and limitations, and future application. Keywords: Attention allocation; Flight simulators; Visual behavior; Data collection methodology; Eye tracking; Head tracking; Field of view; Performance measures.

Journal ArticleDOI
TL;DR: Results indicate that an eye gaze technique can be used effectively in a clinical setting and strategies and techniques are proposed for adapting a variety of speech audiometric procedures.
Abstract: When assessing the hearing of physically challenged nonspeaking individuals, examiners generally restrict the assessment to pure tone testing which limits the diagnostic value of the evaluation. This study recommends adapting the standard speech audiometry procedures by utilizing an eye gaze response mode. Results indicate that an eye gaze technique can be used effectively in a clinical setting. Strategies and techniques are proposed for adapting a variety of speech audiometric procedures.


Proceedings ArticleDOI
01 Nov 1990
TL;DR: Two types of double-target-step methods, developed so as to prevent re-adaptation, are used: an SD experiment to increase saccadic gain and an OD experiment to decrease it.
Abstract: The adaptive process of visually triggered saccades has been investigated. Two types of double-target-step methods, developed so as to prevent re-adaptation, are used: an SD experiment to increase saccadic gain and an OD experiment to decrease it. Gain adaptation of the saccade differs with the direction of eye movement, and the adaptation has different time courses for the SD and OD states. Furthermore, a change of gain for the adapted eye influences the gain of the non-adapted eye. As a result, the adaptive process of visually triggered saccades is parametric.