Author

Eduardo Cuervo

Other affiliations: Duke University
Bio: Eduardo Cuervo is an academic researcher from Microsoft. The author has contributed to research in topics: Virtual reality & Mobile device. The author has an h-index of 14 and has co-authored 23 publications receiving 3012 citations. Previous affiliations of Eduardo Cuervo include Duke University.

Papers
Proceedings ArticleDOI
15 Jun 2010
TL;DR: MAUI supports fine-grained code offload to maximize energy savings with minimal burden on the programmer, and decides at run-time which methods should be remotely executed, driven by an optimization engine that achieves the best energy savings possible under the mobile device's current connectivity constraints.
Abstract: This paper presents MAUI, a system that enables fine-grained energy-aware offload of mobile code to the infrastructure. Previous approaches to these problems either relied heavily on programmer support to partition an application, or they were coarse-grained requiring full process (or full VM) migration. MAUI uses the benefits of a managed code environment to offer the best of both worlds: it supports fine-grained code offload to maximize energy savings with minimal burden on the programmer. MAUI decides at run-time which methods should be remotely executed, driven by an optimization engine that achieves the best energy savings possible under the mobile device's current connectivity constraints. In our evaluation, we show that MAUI enables: 1) a resource-intensive face recognition application that consumes an order of magnitude less energy, 2) a latency-sensitive arcade game application that doubles its refresh rate, and 3) a voice-based language translation application that bypasses the limitations of the smartphone environment by executing unsupported components remotely.
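The offload decision described above can be sketched as a simple energy trade-off: execute a method remotely only when shipping its state and waiting for the result costs less energy than running it locally. The function name, cost model, and constants below are illustrative assumptions for this sketch, not MAUI's actual optimizer.

```python
# Hypothetical sketch of a per-method offload decision in the spirit of MAUI;
# the cost model and all parameter values are illustrative assumptions.

def should_offload(local_energy_j, state_bytes, bandwidth_bps, rtt_s, radio_power_w):
    """Offload only if shipping the method's state and waiting for the result
    is expected to cost less energy than executing it locally."""
    transfer_s = (state_bytes * 8) / bandwidth_bps + rtt_s   # time on the radio
    transfer_energy_j = radio_power_w * transfer_s           # energy spent communicating
    return transfer_energy_j < local_energy_j

# Example: a face-recognition call costing ~3 J locally but needing only ~100 KB
# of state is worth offloading over a decent Wi-Fi link.
print(should_offload(local_energy_j=3.0, state_bytes=100_000,
                     bandwidth_bps=10_000_000, rtt_s=0.025, radio_power_w=1.0))
```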

2,530 citations

Proceedings ArticleDOI
18 May 2015
TL;DR: Outatime is presented, a speculative execution system for mobile cloud gaming that is able to mask up to 120ms of network latency; user studies find that players strongly prefer Outatime to traditional thin-client gaming where the network RTT is fully visible, and that Outatime successfully mimics playing across a low-latency network.
Abstract: Gaming on phones, tablets and laptops is very popular. Cloud gaming - where remote servers perform game execution and rendering on behalf of thin clients that simply send input and display output frames - promises any device the ability to play any game any time. Unfortunately, the reality is that wide-area network latencies are often prohibitive; cellular, Wi-Fi and even wired residential end host round trip times (RTTs) can exceed 100ms, a threshold above which many gamers tend to deem responsiveness unacceptable. In this paper, we present Outatime, a speculative execution system for mobile cloud gaming that is able to mask up to 120ms of network latency. Outatime renders speculative frames of future possible outcomes, delivering them to the client one entire RTT ahead of time, and recovers quickly from mis-speculations when they occur. Clients perceive little latency. To achieve this, Outatime combines: 1) future state prediction; 2) state approximation with image-based rendering and event time-shifting; 3) fast state checkpoint and rollback; and 4) state compression for bandwidth savings. To evaluate the Outatime speculation system, we use two high quality, commercially-released games: a twitch-based first person shooter, Doom 3, and an action role-playing game, Fable 3. Through user studies and performance benchmarks, we find that players strongly prefer Outatime to traditional thin-client gaming where the network RTT is fully visible, and that Outatime successfully mimics playing across a low-latency network.
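A minimal sketch of the speculate / verify / rollback pattern the abstract describes, using a toy game model (an integer position moved by +/-1 inputs). All names and the trivial "repeat the last input" predictor are stand-ins, not Outatime's prediction, image-based rendering, or checkpointing machinery.

```python
# Toy speculate / verify / rollback loop; the game model and function names
# are illustrative assumptions, not Outatime's implementation.

def simulate(state, user_input):
    return state + user_input               # stand-in for one game tick

def render(state):
    return f"frame@{state}"                 # stand-in for a rendered frame

def predict_input(last_input):
    return last_input                       # naive guess: repeat the last input

def speculate(state, last_input, rtt_frames):
    """Render frames for predicted inputs one RTT ahead of the client."""
    checkpoint = state                      # fast checkpoint before speculating
    frames = []
    for _ in range(rtt_frames):
        guess = predict_input(last_input)
        state = simulate(state, guess)
        frames.append((guess, render(state)))
    return checkpoint, frames

def reconcile(checkpoint, frames, actual_input):
    """Ship the speculative frame on a hit; roll back and re-render on a miss."""
    guess, frame = frames[0]
    if guess == actual_input:
        return frame                        # correct speculation: latency hidden
    return render(simulate(checkpoint, actual_input))  # mis-speculation: rollback

checkpoint, frames = speculate(state=0, last_input=+1, rtt_frames=4)
print(reconcile(checkpoint, frames, actual_input=+1))   # speculation hit
print(reconcile(checkpoint, frames, actual_input=-1))   # rollback path
```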

165 citations

Proceedings ArticleDOI
20 Jun 2016
TL;DR: FLASHBACK is presented, an unorthodox design point for HMD VR that eschews all real-time scene rendering and aggressively precomputes and caches all possible images that a VR user might encounter, and delivers better framerates and responsiveness than a tethered HMD configuration on graphically complex scenes.
Abstract: Virtual reality head-mounted displays (VR HMDs) are attracting users with the promise of full sensory immersion in virtual environments. Creating the illusion of immersion for a near-eye display results in very heavy rendering workloads: low latency, high framerate, and high visual quality are all needed. Tethered VR setups in which the HMD is bound to a powerful gaming desktop limit mobility and exploration, and are difficult to deploy widely. Products such as Google Cardboard and Samsung Gear VR purport to offer any user a mobile VR experience, but their GPUs are too power-constrained to produce an acceptable framerate and latency, even for scenes of modest visual quality. We present FLASHBACK, an unorthodox design point for HMD VR that eschews all real-time scene rendering. Instead, FLASHBACK aggressively precomputes and caches all possible images that a VR user might encounter. FLASHBACK memoizes costly rendering effort in an offline step to build a cache full of panoramic images. During runtime, FLASHBACK constructs and maintains a hierarchical storage cache index to quickly look up images that the user should be seeing. On a cache miss, FLASHBACK uses fast approximations of the correct image while concurrently fetching more closely-matching entries from its cache for future requests. Moreover, FLASHBACK not only works for static scenes, but also for dynamic scenes with moving and animated objects. We evaluate a prototype implementation of FLASHBACK and report up to an 8x improvement in framerate, a 97x reduction in energy consumption per frame, and a 15x latency reduction compared to a locally-rendered mobile VR setup. In some cases, FLASHBACK even delivers better framerates and responsiveness than a tethered HMD configuration on graphically complex scenes.
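A rough sketch of the lookup path described above: index pre-rendered panoramas by quantized pose, serve the exact entry on a hit, and fall back to the nearest cached panorama on a miss while the correct entry is fetched. The flat dictionary and grid quantization are simplifying assumptions; FLASHBACK's real index is a hierarchical storage cache.

```python
# Simplified pose-keyed panorama cache; grid size, key scheme, and the flat
# dictionary are assumptions made for brevity, not FLASHBACK's data structures.

import math

GRID = 0.25  # assumed spacing (metres) between cached viewpoints

def key(pose):
    x, y, z = pose
    return (round(x / GRID), round(y / GRID), round(z / GRID))

class PanoramaCache:
    def __init__(self):
        self.images = {}                       # pose key -> pre-rendered panorama

    def insert(self, pose, panorama):
        self.images[key(pose)] = panorama      # offline memoization step

    def lookup(self, pose):
        k = key(pose)
        if k in self.images:
            return self.images[k], True        # hit: display (after reprojection)
        # Miss: approximate with the nearest cached panorama; a real system would
        # also schedule a fetch of the exact entry for future requests.
        nearest = min(self.images, key=lambda c: math.dist(c, k))
        return self.images[nearest], False

cache = PanoramaCache()
cache.insert((0.0, 1.6, 0.0), "pano_A")
cache.insert((1.0, 1.6, 0.0), "pano_B")
print(cache.lookup((0.1, 1.6, 0.0)))           # hit in pano_A's cell
print(cache.lookup((0.6, 1.6, 0.0)))           # miss: nearest-neighbor approximation
```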

135 citations

Proceedings ArticleDOI
02 Jun 2014
TL;DR: Kahawai is a system that provides high-quality gaming on mobile devices, such as tablets and smartphones, by offloading a portion of the GPU computation to server-side infrastructure, using collaborative rendering to combine the output of a mobile GPU and a server-side GPU into the displayed output.
Abstract: This paper presents Kahawai, a system that provides high-quality gaming on mobile devices, such as tablets and smartphones, by offloading a portion of the GPU computation to server-side infrastructure. In contrast with previous thin-client approaches that require a server-side GPU to render the entire content, Kahawai uses collaborative rendering to combine the output of a mobile GPU and a server-side GPU into the displayed output. Compared to a thin client, collaborative rendering requires significantly less network bandwidth between the mobile device and the server to achieve the same visual quality and, unlike a thin client, collaborative rendering supports disconnected operation, allowing a user to play offline - albeit with reduced visual quality. Kahawai implements two separate techniques for collaborative rendering: (1) a mobile device can render each frame with reduced detail while a server sends a stream of per-frame differences to transform each frame into a high-detail version, or (2) a mobile device can render a subset of the frames while a server provides the missing frames. Both techniques are compatible with the hardware-accelerated H.264 video decoders found on most modern mobile devices. We implemented a Kahawai prototype and integrated it with the idTech 4 open-source game engine, an advanced engine used by many commercial games. In our evaluation, we show that Kahawai can deliver gameplay at an acceptable frame rate and achieve high visual quality using as little as one-sixth of the bandwidth of the conventional thin-client approach. Furthermore, a 50-person user study with our prototype shows that Kahawai can deliver the same gaming experience as a thin client under excellent network conditions.
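The first collaborative-rendering technique (per-frame differences) can be illustrated with a toy delta codec: the server computes the pixel-wise difference between its high-detail frame and the same low-detail frame the client renders, and the client adds that delta back. Plain NumPy arrays stand in for the H.264-compressed streams the real system uses; the shapes and values below are arbitrary.

```python
# Toy delta-encoding sketch of collaborative rendering; array contents are
# placeholders, and real Kahawai ships H.264-encoded deltas, not raw pixels.

import numpy as np

def server_delta(high_detail, low_detail):
    """Per-frame difference the server streams instead of full high-detail frames."""
    return high_detail.astype(np.int16) - low_detail.astype(np.int16)

def client_reconstruct(local_low_detail, delta):
    """Client patches its locally rendered low-detail frame into a high-detail one."""
    out = local_low_detail.astype(np.int16) + delta
    return np.clip(out, 0, 255).astype(np.uint8)

# Both sides render the same low-detail frame deterministically, so the delta
# applies cleanly on the client.
low = np.full((4, 4, 3), 100, dtype=np.uint8)    # client's reduced-detail frame
high = np.full((4, 4, 3), 130, dtype=np.uint8)   # server's high-detail frame
delta = server_delta(high, low)
assert np.array_equal(client_reconstruct(low, delta), high)
```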

70 citations

Proceedings ArticleDOI
12 Feb 2018
TL;DR: This study indicates that while display technology will be capable of Life-Like VR, rendering computation is likely to be the key bottleneck and current wireless and compression technology may not be sufficient to accommodate the bandwidth and latency requirements.
Abstract: As Virtual Reality (VR) Head Mounted Displays (HMD) push the boundaries of technology, in this paper, we try and answer the question, "What would it take to make the visual experience of a VR-HMD Life-Like, i.e., indistinguishable from physical reality?" Based on the limits of human perception, we first try and establish the specifications for a Life-Like HMD. We then examine crucial technological trends and speculate on the feasibility of Life-Like VR headsets in the near future. Our study indicates that while display technology will be capable of Life-Like VR, rendering computation is likely to be the key bottleneck. Life-Like VR solutions will likely involve frames rendered on a separate machine and then transmitted to the HMD. Can we transmit Life-Like VR frames wirelessly to the HMD and make the HMD cable-free? We find that current wireless and compression technology may not be sufficient to accommodate the bandwidth and latency requirements. We outline research directions towards achieving Life-Like VR.
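To make the bandwidth concern concrete, here is a back-of-envelope estimate of the raw (uncompressed) pixel rate a near-Life-Like HMD might demand. The resolution, refresh rate, and color depth are placeholder values chosen only to show the arithmetic; they are not the specifications derived in the paper.

```python
# Rough uncompressed-bandwidth estimate for a high-end HMD; all inputs below
# are assumed illustrative values, not the paper's derived Life-Like specs.

def raw_bandwidth_gbps(width_px, height_px, fps, bits_per_pixel, eyes=2):
    return width_px * height_px * fps * bits_per_pixel * eyes / 1e9

# e.g. an 8K-per-eye panel at 120 Hz with 24-bit color:
print(f"{raw_bandwidth_gbps(7680, 4320, 120, 24):.0f} Gbps uncompressed")
# ~191 Gbps before compression -- far beyond today's wireless links, which is
# why the paper questions whether current wireless plus compression suffice.
```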

63 citations


Cited by
Journal ArticleDOI
Weisong Shi, Jie Cao, Quan Zhang, Youhuizi Li, Lanyu Xu
TL;DR: The definition of edge computing is introduced, followed by several case studies, ranging from cloud offloading to smart home and city, as well as collaborative edge to materialize the concept of edge computing.
Abstract: The proliferation of Internet of Things (IoT) and the success of rich cloud services have pushed the horizon of a new computing paradigm, edge computing, which calls for processing the data at the edge of the network. Edge computing has the potential to address the concerns of response time requirement, battery life constraint, bandwidth cost saving, as well as data safety and privacy. In this paper, we introduce the definition of edge computing, followed by several case studies, ranging from cloud offloading to smart home and city, as well as collaborative edge to materialize the concept of edge computing. Finally, we present several challenges and opportunities in the field of edge computing, and hope this paper will gain attention from the community and inspire more research in this direction.

5,198 citations

Journal ArticleDOI
TL;DR: A comprehensive survey of the state-of-the-art MEC research with a focus on joint radio-and-computational resource management is provided in this paper, where a set of issues, challenges, and future research directions for MEC are discussed.
Abstract: Driven by the visions of Internet of Things and 5G communications, recent years have seen a paradigm shift in mobile computing, from the centralized mobile cloud computing toward mobile edge computing (MEC). The main feature of MEC is to push mobile computing, network control and storage to the network edges (e.g., base stations and access points) so as to enable computation-intensive and latency-critical applications at the resource-limited mobile devices. MEC promises dramatic reduction in latency and mobile energy consumption, tackling the key challenges for materializing 5G vision. The promised gains of MEC have motivated extensive efforts in both academia and industry on developing the technology. A main thrust of MEC research is to seamlessly merge the two disciplines of wireless communications and mobile computing, resulting in a wide-range of new designs ranging from techniques for computation offloading to network architectures. This paper provides a comprehensive survey of the state-of-the-art MEC research with a focus on joint radio-and-computational resource management. We also discuss a set of issues, challenges, and future research directions for MEC research, including MEC system deployment, cache-enabled MEC, mobility management for MEC, green MEC, as well as privacy-aware MEC. Advancements in these directions will facilitate the transformation of MEC from theory to practice. Finally, we introduce recent standardization efforts on MEC as well as some typical MEC application scenarios.

2,992 citations

Journal ArticleDOI
TL;DR: This article surveys existing mobile phone sensing algorithms, applications, and systems, and discusses the emerging sensing paradigms, and formulates an architectural framework for discussing a number of the open issues and challenges emerging in the new area ofMobile phone sensing research.
Abstract: Mobile phones or smartphones are rapidly becoming the central computer and communication device in people's lives. Application delivery channels such as the Apple AppStore are transforming mobile phones into App Phones, capable of downloading a myriad of applications in an instant. Importantly, today's smartphones are programmable and come with a growing set of cheap powerful embedded sensors, such as an accelerometer, digital compass, gyroscope, GPS, microphone, and camera, which are enabling the emergence of personal, group, and community-scale sensing applications. We believe that sensor-equipped mobile phones will revolutionize many sectors of our economy, including business, healthcare, social networks, environmental monitoring, and transportation. In this article we survey existing mobile phone sensing algorithms, applications, and systems. We discuss the emerging sensing paradigms, and formulate an architectural framework for discussing a number of the open issues and challenges emerging in the new area of mobile phone sensing research.

2,316 citations

Posted Content
TL;DR: A comprehensive survey of the state-of-the-art MEC research with a focus on joint radio-and-computational resource management and recent standardization efforts on MEC are introduced.
Abstract: Driven by the visions of Internet of Things and 5G communications, recent years have seen a paradigm shift in mobile computing, from the centralized Mobile Cloud Computing towards Mobile Edge Computing (MEC). The main feature of MEC is to push mobile computing, network control and storage to the network edges (e.g., base stations and access points) so as to enable computation-intensive and latency-critical applications at the resource-limited mobile devices. MEC promises dramatic reduction in latency and mobile energy consumption, tackling the key challenges for materializing 5G vision. The promised gains of MEC have motivated extensive efforts in both academia and industry on developing the technology. A main thrust of MEC research is to seamlessly merge the two disciplines of wireless communications and mobile computing, resulting in a wide-range of new designs ranging from techniques for computation offloading to network architectures. This paper provides a comprehensive survey of the state-of-the-art MEC research with a focus on joint radio-and-computational resource management. We also present a research outlook consisting of a set of promising directions for MEC research, including MEC system deployment, cache-enabled MEC, mobility management for MEC, green MEC, as well as privacy-aware MEC. Advancements in these directions will facilitate the transformation of MEC from theory to practice. Finally, we introduce recent standardization efforts on MEC as well as some typical MEC application scenarios.

2,289 citations

Journal ArticleDOI
TL;DR: A survey of MCC is given, which helps general readers have an overview of MCC, including its definition, architecture, and applications; the issues, existing solutions, and approaches are also presented.
Abstract: Together with an explosive growth of the mobile applications and emerging of cloud computing concept, mobile cloud computing (MCC) has been introduced to be a potential technology for mobile services. MCC integrates the cloud computing into the mobile environment and overcomes obstacles related to the performance (e.g., battery life, storage, and bandwidth), environment (e.g., heterogeneity, scalability, and availability), and security (e.g., reliability and privacy) discussed in mobile computing. This paper gives a survey of MCC, which helps general readers have an overview of the MCC including the definition, architecture, and applications. The issues, existing solutions, and approaches are presented. In addition, the future research directions of MCC are discussed. Copyright © 2011 John Wiley & Sons, Ltd.

2,259 citations