scispace - formally typeset
Topic

Augmented reality

About: Augmented reality is a research topic. Over the lifetime, 36,039 publications have been published within this topic, receiving 479,617 citations. The topic is also known as: AR.


Papers
Journal ArticleDOI
20 Nov 2020
TL;DR: Compared with other 3D displays, the holographic display has unique advantages in providing natural depth cues and correcting eye aberrations and holds great promise to be the enabling technology for next-generation VR/AR devices.
Abstract: Wearable near-eye displays for virtual and augmented reality (VR/AR) have seen enormous growth in recent years. While researchers are exploiting a plethora of techniques to create life-like three-dimensional (3D) objects, there is a lack of awareness of the role of human perception in guiding hardware development. An ultimate VR/AR headset must integrate the display, sensors, and processors in a compact enclosure that people can comfortably wear for a long time, while allowing a superior immersive experience and user-friendly human–computer interaction. Compared with other 3D displays, the holographic display has unique advantages in providing natural depth cues and correcting eye aberrations. Therefore, it holds great promise as the enabling technology for next-generation VR/AR devices. In this review, we survey recent progress in holographic near-eye displays from a human-centric perspective.

175 citations

Patent
31 Mar 2003
TL;DR: In this article, an augmented reality navigation aid is used to overlay relevant computer-generated images, which are anchored to real-world locations of hazards, onto one or more users' field of view.
Abstract: Method and apparatus are presented for prioritizing and assessing navigation data using an augmented reality navigation aid. Navigators are often placed in treacherous, unfamiliar, or low-visibility situations. An augmented reality navigation aid is used to overlay relevant computer-generated images, which are anchored to real-world locations of hazards, onto one or more users' field of view. Areas of safe passage for transportation platforms such as ships, land vehicles, and aircraft can be displayed via computer-generated imagery or inferred from various attributes of the computer-generated display. The invention is applicable to waterway navigation, land navigation, and aircraft navigation (for aircraft approaching runways or terrain in low-visibility situations). A waterway embodiment of the invention is called WARN™, or Waterway Augmented Reality Navigation™.

A method is presented for visualization of hazards which pose a serious threat to those in the immediate vicinity. Such hazards include, but are not limited to, fire, smoke, radiation, and invisible gases. The method utilizes augmented reality, which is defined as the mixing of real-world imagery with computer-generated graphical elements. Computer-generated three-dimensional representations of hazards can be used in the training and operations of emergency first responders (EFRs) and others. The representations can be used to show the locations and actions of a variety of dangers, real or computer-generated, perceived or not perceived, in training or operational settings. The representations, which may be graphic, iconic, or textual, are overlaid onto a view of the user's real world, thus providing a reality augmented with computer-generated hazards. A user can then implement procedures (training and operational) appropriate to the hazard at hand.

A method is also presented which uses augmented reality for visualization of text and other messages sent to an EFR by an incident commander. The messages are transmitted by the incident commander via a computer at the scene to an EFR/trainee in an operational or training scenario. Messages to an EFR/trainee, including (but not limited to) iconic representations of hazards, victims, structural data, environmental conditions, and exit directions/locations, are superimposed directly onto the EFR/trainee's view of the real emergency/fire and structural surroundings. The primary intended applications are improved safety for the EFR and improved EFR–incident commander communications, both on-scene and in training scenarios.
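The core step the patent describes, anchoring a computer-generated image to a fixed real-world location in the user's field of view, amounts to projecting a world-space point through the current head/camera pose. A minimal sketch, with a standard pinhole model and illustrative names and numbers that are not from the patent:

```python
import numpy as np

def project_world_point(p_world, R, t, fx, fy, cx, cy):
    """Project a 3D world point into pixel coordinates.

    R, t: rotation and translation of the world-to-camera transform.
    fx, fy, cx, cy: pinhole camera intrinsics.
    Returns (u, v) pixel coordinates, or None if the point is behind the viewer.
    """
    p_cam = R @ np.asarray(p_world, dtype=float) + t
    if p_cam[2] <= 0:          # behind the viewer: nothing to draw
        return None
    u = fx * p_cam[0] / p_cam[2] + cx
    v = fy * p_cam[1] / p_cam[2] + cy
    return (u, v)

# Example: a hazard 10 m straight ahead of an identity camera pose
# lands at the principal point of a 1280x720 view.
uv = project_world_point([0.0, 0.0, 10.0], np.eye(3), np.zeros(3),
                         fx=800, fy=800, cx=640, cy=360)
print(uv)  # (640.0, 360.0)
```

Re-running this projection every frame with the tracked head pose is what keeps the overlay "anchored" to the hazard rather than to the screen.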

175 citations

Journal ArticleDOI
01 Jul 2012
TL;DR: This paper expands tactile interfaces based on electrovibration beyond touch surfaces and bring them into the real world with a broad range of application scenarios where the technology can be used to enhance AR interaction with dynamic and unobtrusive tactile feedback.
Abstract: REVEL is an augmented reality (AR) tactile technology that changes the tactile feeling of real objects by augmenting them with virtual tactile textures using a device worn by the user. Unlike previous attempts to enhance AR environments with haptics, we neither physically actuate objects nor use any force- or tactile-feedback devices, nor do we require users to wear tactile gloves or other apparatus on their hands. Instead, we employ the principle of reverse electrovibration, in which we inject a weak electrical signal anywhere on the user's body, creating an oscillating electrical field around the user's fingers. When sliding his or her fingers over the surface of an object, the user perceives highly distinctive tactile textures augmenting the physical object. By tracking the objects and the location of the touch, we associate dynamic tactile sensations with the interaction context. REVEL builds upon our previous work on designing electrovibration-based tactile feedback for touch surfaces [Bau et al. 2010]. In this paper we expand tactile interfaces based on electrovibration beyond touch surfaces and bring them into the real world. We demonstrate a broad range of application scenarios in which our technology can be used to enhance AR interaction with dynamic and unobtrusive tactile feedback.
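Electrovibration texture rendering of the kind the abstract describes is commonly implemented by modulating the amplitude of an oscillating drive signal as a function of the tracked finger position. A hedged sketch of that idea; the function name, grating texture, and all parameter values are illustrative assumptions, not REVEL's actual signal design:

```python
import math

def texture_voltage(x_mm, t, spatial_period_mm=2.0, carrier_hz=200.0, v_max=100.0):
    """Drive voltage for a virtual sinusoidal grating at finger position x_mm, time t."""
    # Spatial envelope: amplitude depends on where the finger is on the object,
    # so sliding across the surface produces a position-dependent texture.
    envelope = 0.5 * (1.0 + math.sin(2 * math.pi * x_mm / spatial_period_mm))
    # Temporal carrier: the oscillating field the finger actually feels.
    carrier = math.sin(2 * math.pi * carrier_hz * t)
    return v_max * envelope * carrier
```

In a running system this would be sampled at audio rate while a tracker streams the finger position, with the envelope swapped per object to give each surface its own virtual texture.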

174 citations

Journal ArticleDOI
TL;DR: The augmented reality microscope (ARM) overlays AI-based information onto the current view of the sample in real time, enabling seamless integration of AI into routine workflows and will remove barriers towards the use of AI designed to improve the accuracy and efficiency of cancer diagnosis.
Abstract: The microscopic assessment of tissue samples is instrumental for the diagnosis and staging of cancer, and thus guides therapy. However, these assessments demonstrate considerable variability, and many regions of the world lack access to trained pathologists. Though artificial intelligence (AI) promises to improve the access and quality of healthcare, the costs of image digitization in pathology and difficulties in deploying AI solutions remain barriers to real-world use. Here we propose a cost-effective solution: the augmented reality microscope (ARM). The ARM overlays AI-based information onto the current view of the sample in real time, enabling seamless integration of AI into routine workflows. We demonstrate the utility of the ARM in the detection of metastatic breast cancer and the identification of prostate cancer, with latency compatible with real-time use. We anticipate that the ARM will remove barriers to the use of AI designed to improve the accuracy and efficiency of cancer diagnosis.
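The overlay step at the heart of an ARM-style system can be sketched as thresholding a per-pixel model output and alpha-blending a highlight into the live field of view. A minimal sketch, assuming an RGB view array and a probability map of matching size; the threshold, color, and blend weight are illustrative, not the paper's values:

```python
import numpy as np

def overlay_heatmap(view_rgb, prob_map, threshold=0.5, alpha=0.4):
    """Blend a green highlight into regions where prob_map exceeds threshold."""
    out = view_rgb.astype(float).copy()
    mask = prob_map > threshold                     # pixels the model flags
    highlight = np.array([0.0, 255.0, 0.0])         # green marks flagged tissue
    out[mask] = (1 - alpha) * out[mask] + alpha * highlight
    return out.astype(np.uint8)

# Example: a 2x2 gray field with one flagged pixel.
view = np.full((2, 2, 3), 128, dtype=np.uint8)
probs = np.array([[0.9, 0.1], [0.2, 0.3]])
blended = overlay_heatmap(view, probs)
print(blended[0, 0])   # flagged pixel shifts toward green
print(blended[1, 1])   # unflagged pixel unchanged: [128 128 128]
```

The real-time constraint the paper emphasizes then reduces to running inference and this blend within the frame budget of the eyepiece display.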

174 citations

Proceedings ArticleDOI
09 Aug 2008
TL;DR: A direct comparison of egocentric depth judgments in VR and AR within a single experimental framework found the usual underestimation of depth in VR, though smaller in magnitude than expected, and no underestimation in AR.
Abstract: As the use of virtual and augmented reality applications becomes more common, the need to fully understand how observers perceive spatial relationships grows more critical. One of the key requirements in engineering a practical virtual or augmented reality system is accurately conveying depth and layout. This requirement has frequently been assessed by measuring judgments of egocentric depth. These assessments have shown that observers in virtual reality (VR) perceive virtual space as compressed relative to the real world, resulting in systematic underestimations of egocentric depth. Previous work has indicated that similar effects may be present in augmented reality (AR) as well. This paper reports an experiment that directly measured egocentric depth perception in both VR and AR conditions; it is believed to be the first experiment to directly compare these conditions in the same experimental framework. In addition to VR and AR, two control conditions were studied: viewing real-world objects, and viewing real-world objects through a head-mounted display. Finally, the presence and absence of motion parallax was crossed with all conditions. Like many previous studies, this one found that depth perception was underestimated in VR, although the magnitude of the effect was surprisingly low. The most interesting finding was that no underestimation was observed in AR.
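The "compression" such studies report is typically quantified as the ratio of judged to actual egocentric distance, with values below 1.0 indicating underestimation. A small sketch of that computation; the judgment numbers below are made up for illustration, not the paper's data:

```python
def mean_judged_ratio(judged, actual):
    """Mean of judged/actual distance; values below 1.0 indicate compression."""
    return sum(j / a for j, a in zip(judged, actual)) / len(actual)

# Hypothetical VR-condition judgments against targets at 3, 4, and 5 m.
ratio = mean_judged_ratio([2.4, 3.2, 4.0], [3.0, 4.0, 5.0])
print(ratio)  # ~0.8, i.e. roughly 20% underestimation
```

Comparing this ratio across the VR, AR, and real-world control conditions is what lets the experiment attribute the compression to the display medium rather than to the judgment task itself.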

173 citations


Network Information
Related Topics (5)
User interface
85.4K papers, 1.7M citations
86% related
Feature (computer vision)
128.2K papers, 1.7M citations
82% related
Object detection
46.1K papers, 1.3M citations
82% related
Segmentation
63.2K papers, 1.2M citations
82% related
Image segmentation
79.6K papers, 1.8M citations
81% related
Performance
Metrics
No. of papers in the topic in previous years
Year	Papers
2024	2
2023	1,885
2022	4,115
2021	2,941
2020	4,123
2019	4,549