Author

Oral Sekender

Bio: Oral Sekender is an academic researcher. The author has contributed to research on the topic of styluses, has an h-index of 1, and has co-authored 1 publication that has received 1,229 citations.
Topics: Stylus

Papers
Patent
18 Dec 1995
TL;DR: In this patent, the authors propose the use of a data surface (e.g., paper) formatted with a position-related coding (figure 1a) that indicates X-Y coordinates and is capable of reflecting a particular frequency of light.
Abstract: The present invention is a form of digitizer and absolute-position-determination device for indicating the instantaneous position and movement of a stylus (8) on a surface. It proposes the use of a data surface (e.g., paper) formatted with a position-related coding (figure 1a) indicating X-Y coordinates and capable of reflecting a frequency of light. The stylus (8), which comprises a writing element (9), has a light source (17) of a frequency for illuminating the position-related coding (figure 1a). This frequency of light is absorbed by the data surface but reflected by the position-related coding (figure 1a) onto a charge-coupled device (CCD) (13) located within the stylus (8). The coordinate information from the CCD (13) is sent to a computer (16) for processing, and the desired information is finally output to the user.
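The decoding step implied by the abstract, in which coordinate information captured by the CCD is turned into an X-Y position, can be sketched roughly as follows. The interleaved-bit encoding used here is purely illustrative, an assumption for the sake of the example, not the coding scheme described in the patent:

```python
# Illustrative sketch (not the patent's actual encoding): decode an X-Y
# grid position from a small binary patch imaged by the stylus CCD.
# Assumed scheme: 8 interleaved bits, even positions encode X, odd encode Y.

def decode_position(bits):
    """Recover (x, y) grid coordinates from interleaved code bits."""
    x = y = 0
    for i, b in enumerate(bits):
        if i % 2 == 0:          # even-indexed bits contribute to X
            x = (x << 1) | b
        else:                   # odd-indexed bits contribute to Y
            y = (y << 1) | b
    return x, y

# A patch reflecting the bit pattern 1,0,1,1,0,1,1,0:
print(decode_position([1, 0, 1, 1, 0, 1, 1, 0]))  # -> (13, 6)
```

In a real device of this kind, the code cells would also carry error correction and orientation marks so the patch can be read at any pen angle; the sketch omits both.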

1,229 citations


Cited by
Patent
Wayne Carl Westerman
21 Dec 2007
TL;DR: In this patent, techniques are described for identifying and discriminating between different types of contacts (such as fingertips, thumbs, palms, and cheeks) to a multi-touch touch-screen device, for example by using a patch eccentricity parameter.
Abstract: Techniques for identifying and discriminating between different types of contacts to a multi-touch touch-screen device are described. Illustrative contact types include fingertips, thumbs, palms, and cheeks. By way of example, thumb contacts may be distinguished from fingertip contacts using a patch eccentricity parameter. In addition, by non-linearly deemphasizing pixels in a touch-surface image, a reliable means of distinguishing large objects (e.g., palms) from smaller objects (e.g., fingertips, thumbs, and a stylus) is described.
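The eccentricity-based discrimination described above can be illustrated with a small sketch. Computing patch eccentricity from intensity-weighted second central moments is standard image analysis; the `classify` helper and its threshold value are assumptions for illustration, not values from the patent:

```python
import math

def patch_eccentricity(pixels):
    """Eccentricity of a touch patch from its intensity-weighted second
    central moments. `pixels` is a list of (x, y, intensity) tuples."""
    total = sum(w for _, _, w in pixels)
    cx = sum(x * w for x, _, w in pixels) / total
    cy = sum(y * w for _, y, w in pixels) / total
    mxx = sum(w * (x - cx) ** 2 for x, _, w in pixels) / total
    myy = sum(w * (y - cy) ** 2 for _, y, w in pixels) / total
    mxy = sum(w * (x - cx) * (y - cy) for x, y, w in pixels) / total
    # Eigenvalues of the 2x2 covariance matrix give major/minor axis lengths.
    tr, det = mxx + myy, mxx * myy - mxy ** 2
    disc = math.sqrt(max(tr * tr / 4 - det, 0.0))
    major, minor = tr / 2 + disc, tr / 2 - disc
    return math.sqrt(major / minor) if minor > 0 else math.inf

def classify(pixels, threshold=1.6):
    """Hypothetical rule: elongated patches read as thumbs (threshold assumed)."""
    return "thumb" if patch_eccentricity(pixels) > threshold else "fingertip"
```

A roughly circular patch yields eccentricity near 1 and classifies as a fingertip, while an elongated patch exceeds the threshold and classifies as a thumb.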

436 citations

Patent
06 Oct 2010
TL;DR: In this patent, a device for capturing rendered text is described. It consists of one or more visual sensors that receive visual information as part of capturing text, and a visual-information disposition subsystem for disposing of the visual information received by the sensors.
Abstract: A device for capturing rendered text is described. The device incorporates one or more visual sensors that receive visual information as a part of capturing rendered text. The visual sensors are collectively capable of capturing both text that is permanently printed on a page, and text that is displayed transitorily on a dynamic device. The device further incorporates a visual information disposition subsystem for disposing of visual information received by the visual sensors. The device further incorporates a package that bears the visual sensors and the visual information disposition subsystem, and is suitable to be held in a human hand.

420 citations

Patent
02 Apr 2004
TL;DR: In this patent, the authors propose a sensing device that senses coded data disposed on a surface and generates interaction data, based on the sensed coded data, indicative of the sensing device's interaction with the surface. The device comprises an image sensor for capturing image information, at least one analog-to-digital converter for converting the captured image information into image data, an image processor for generating processed image data, and a host processor for generating the interaction data.
Abstract: A sensing device for: sensing coded data disposed on a surface; and generating interaction data based on the sensed coded data, the interaction data being indicative of interaction of the sensing device with the surface; the sensing device comprising: (a) an image sensor for capturing image information; (b) at least one analog to digital converter for converting the captured image information into image data; (c) an image processor for processing the image data to generate processed image data; (d) a host processor for generating the interaction data based at least partially on the processed image data.
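The four-stage pipeline enumerated in the claim, (a) image sensor, (b) analog-to-digital converter, (c) image processor, (d) host processor, can be sketched as a toy data flow. All values, thresholds, and method names below are illustrative assumptions, not the patent's implementation:

```python
# Illustrative sketch of the claim's four-stage pipeline (assumed details).

class SensingDevice:
    def capture(self):
        # (a) image sensor: raw analog samples of the coded surface
        return [0.12, 0.87, 0.05, 0.93]

    def digitize(self, analog, levels=256):
        # (b) analog-to-digital converter: quantize samples to 8-bit values
        return [min(int(v * levels), levels - 1) for v in analog]

    def process(self, image_data, threshold=128):
        # (c) image processor: binarize to recover the coded data
        return [1 if v >= threshold else 0 for v in image_data]

    def interact(self, processed):
        # (d) host processor: turn decoded bits into interaction data
        return {"code_bits": processed, "touching": any(processed)}

dev = SensingDevice()
result = dev.interact(dev.process(dev.digitize(dev.capture())))
```

The point of the staged structure is that each component has a single, testable responsibility; the host processor sees only processed image data, never raw analog samples.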

356 citations

Patent
01 Aug 2005
TL;DR: In this patent, a controller uses wavefront modulation to match the curvature of the wavefronts of light reflected from the display device to the user's eyes with the curvature of the wavefronts that would be transmitted through the display if the virtual imagery were situated at a predetermined position relative to the surface, such that the user sees the virtual imagery at that position regardless of changes in the position of the user's eyes with respect to the see-through display.
Abstract: An augmented reality device for inserting virtual imagery into a user's view of their physical environment, the device comprising: a display device through which the user can view the physical environment; an optical sensing device for sensing at least one surface in the physical environment; and, a controller for projecting the virtual imagery via the display device; wherein during use, the controller uses wave front modulation to match the curvature of the wave fronts of light reflected from the display device to the user's eyes with the curvature of the wave fronts of light that would be transmitted through the device display if the virtual imagery were situated at a predetermined position relative to the surface, such that the user sees the virtual imagery at the predetermined position regardless of changes in position of the user's eyes with respect to the see-through display.

284 citations

Patent
25 May 2011
TL;DR: In this patent, the authors present a system with a remote control (e.g., a wand) equipped with a relative motion sensor that outputs data indicative of changes in the wand's position. The data enable determination of the absolute pose of the wand, which includes the absolute position of a reference point chosen on the wand and the wand's absolute orientation.
Abstract: A system that has a remote control, e.g., a wand, equipped with a relative motion sensor that outputs data indicative of a change in position of the wand. The system also has one or more light sources and a photodetector that detects their light and outputs data indicative of the detected light. The system uses one or more controllers to determine the absolute position of the wand based on the data output by the relative motion sensor and by the photodetector. The data enable determination of the absolute pose of the wand, which includes the absolute position of a reference point chosen on the wand and the absolute orientation of the wand. To properly express the absolute parameters of position and/or orientation of the wand, a reference location is chosen with respect to which the calculations are performed. The system is coupled to a display that shows an image defined by first and second orthogonal axes, such as two axes belonging to world coordinates (X₀, Y₀, Z₀). The one or more controllers are configured to generate signals that are a function of the absolute position of the wand in or along a third axis for rendering the display. To simplify the mapping of the real three-dimensional environment in which the wand is operated to the cyberspace of the application that the system is running, the third axis is preferably the third Cartesian coordinate axis of world coordinates (X₀, Y₀, Z₀).
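The combination described above, integrating relative-motion increments while correcting with absolute optical fixes, can be illustrated for a single axis with a simple complementary filter. This is an assumed sketch of the general technique, not the patent's method; the function name and gain are hypothetical:

```python
# Illustrative 1-D sketch (assumed, not from the patent): fuse the wand's
# relative-motion deltas with occasional absolute photodetector fixes.

def fuse_pose(start, motion_deltas, optical_fixes, gain=0.2):
    """Track position along one axis: integrate relative deltas, and nudge
    the estimate toward each absolute optical fix when one is available."""
    pos = start
    track = []
    for delta, fix in zip(motion_deltas, optical_fixes):
        pos += delta                   # dead-reckon from the motion sensor
        if fix is not None:            # absolute fix from the light sources
            pos += gain * (fix - pos)  # pull drifting estimate toward fix
        track.append(pos)
    return track
```

Dead reckoning alone accumulates drift; the occasional absolute fix bounds that drift, which is why the system needs both the relative motion sensor and the photodetector.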

281 citations