scispace - formally typeset
Topic

Latency (engineering)

About: Latency (engineering) is a research topic. Over the lifetime, 7278 publications have been published within this topic receiving 115409 citations. The topic is also known as: lag.


Papers
Journal ArticleDOI
TL;DR: In this paper, the authors propose to use coding to seamlessly distribute coded payload and redundancy data across multiple available communication interfaces, and formulate an optimization problem to find the payload allocation weights that maximize the reliability at specific target latency values.
Abstract: An important ingredient of future 5G systems will be ultra-reliable low-latency communication (URLLC). A way to offer URLLC without intervention in the baseband/PHY layer design is to use interface diversity and integrate multiple communication interfaces, each interface based on a different technology. In this paper, we propose to use coding to seamlessly distribute coded payload and redundancy data across multiple available communication interfaces. We formulate an optimization problem to find the payload allocation weights that maximize the reliability at specific target latency values. In order to estimate the performance in terms of latency and reliability of such an integrated communication system, we propose an analysis framework that combines traditional reliability models with technology-specific latency probability distributions. Our model can account for failure correlation among interfaces/technologies. By considering different scenarios, we find that the optimized strategies can in some cases significantly outperform strategies based on $k$-out-of-$n$ erasure codes, where the latter do not account for the characteristics of the different interfaces. The model has been validated through simulation and is supported by experimental results.

135 citations
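The reliability evaluation described in this abstract can be illustrated with a small sketch. Assuming hypothetical on-time delivery probabilities for three interfaces and an MDS-style erasure code, a weighted allocation succeeds whenever the interfaces that deliver in time carry coded data summing to at least one full payload; the interface names and probabilities below are illustrative, not from the paper.

```python
from itertools import combinations

# Hypothetical per-interface probabilities of delivering a fragment
# within the target latency (illustrative values, not from the paper).
p_on_time = {"lte": 0.95, "wifi": 0.90, "sat": 0.99}

def reliability(weights, p_on_time):
    """Probability that the interfaces delivering on time together
    carry enough coded data (weight sum >= 1.0) to decode the payload."""
    names = list(weights)
    total = 0.0
    for r in range(len(names) + 1):
        for up in combinations(names, r):
            if sum(weights[i] for i in up) < 1.0 - 1e-9:
                continue  # not enough coded data arrived in time
            prob = 1.0
            for i in names:
                prob *= p_on_time[i] if i in up else 1.0 - p_on_time[i]
            total += prob
    return total

# 1-out-of-3 replication: any single interface suffices.
replication = {"lte": 1.0, "wifi": 1.0, "sat": 1.0}
# A weighted allocation where any two on-time interfaces can decode.
weighted = {"lte": 0.5, "wifi": 0.5, "sat": 0.5}

print(reliability(replication, p_on_time))
print(reliability(weighted, p_on_time))
```

Searching over such weight vectors for the allocation that maximizes this probability at a given latency deadline is, in spirit, the optimization problem the paper formulates.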

Proceedings ArticleDOI
01 Jun 2017
TL;DR: This work proposes Cachier, a system that uses the caching model along with novel optimizations to minimize latency by adaptively balancing load between the edge and the cloud, leveraging spatiotemporal locality of requests, offline analysis of applications, and online estimates of network conditions.
Abstract: Recognition and perception based mobile applications, such as image recognition, are on the rise. These applications recognize the user's surroundings and augment them with information and/or media. These applications are latency-sensitive and have a soft-realtime nature: late results are potentially meaningless. On the one hand, given the compute-intensive nature of the tasks performed by such applications, execution is typically offloaded to the cloud. On the other hand, offloading such applications to the cloud incurs network latency, which can increase the user-perceived latency. Consequently, edge computing has been proposed to let devices offload intensive tasks to edge servers instead of the cloud, to reduce latency. In this paper, we propose a different model for using edge servers. We propose to use the edge as a specialized cache for recognition applications and formulate the expected latency for such a cache. We show that using an edge server like a typical web cache, for recognition applications, can lead to higher latencies. We propose Cachier, a system that uses the caching model along with novel optimizations to minimize latency by adaptively balancing load between the edge and the cloud, leveraging spatiotemporal locality of requests, offline analysis of applications, and online estimates of network conditions. We evaluate Cachier for image-recognition applications and show that our techniques yield a 3x speedup in responsiveness and perform accurately over a range of operating conditions. To the best of our knowledge, this is the first work that models edge servers as caches for compute-intensive recognition applications, and Cachier is the first system that uses this model to minimize latency for these applications.

135 citations
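The expected-latency argument in this abstract can be sketched with a simple model (the formula and numbers below are illustrative assumptions, not Cachier's actual cost model): every request pays the edge lookup, and misses additionally pay the round trip to the cloud, so a cache with poor hit rate makes things worse than going straight to the cloud.

```python
# Hypothetical parameters: edge-cache lookup time, cloud round-trip
# time, and a hit rate driven by spatiotemporal locality of requests.
def expected_latency(hit_rate, edge_lookup_ms, cloud_rtt_ms):
    """Expected response time when the edge acts as a recognition
    cache: every request pays the edge lookup, and misses also pay
    the trip to the cloud."""
    return edge_lookup_ms + (1.0 - hit_rate) * cloud_rtt_ms

# With low locality the cache only adds overhead versus cloud-only...
print(expected_latency(0.1, 40.0, 120.0))  # higher than the 120 ms cloud RTT
# ...while high locality makes the edge pay off.
print(expected_latency(0.8, 40.0, 120.0))
```

This is why using the edge "like a typical web cache" can raise latency: the decision of what to keep at the edge has to account for the hit rate it induces, which is what Cachier's adaptive balancing addresses.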

Journal ArticleDOI
TL;DR: The P300 component was elicited by an auditory oddball paradigm in 55 normal adults spanning a wide age range; an abnormal delay in P300 latency was found to be less sensitive and specific to dementia.

133 citations

Proceedings ArticleDOI
14 Oct 2001
TL;DR: This work considers a directed network in which every edge possesses a latency function specifying the time needed to traverse the edge given its congestion, and proves that for networks with n nodes and continuous, nondecreasing latency functions, there is no approximation algorithm for this problem with approximation ratio less than n/2.
Abstract: We consider a directed network in which every edge possesses a latency function specifying the time needed to traverse the edge given its congestion. Selfish, noncooperative agents constitute the network traffic and wish to travel from a source s to a sink t as quickly as possible. Since the route chosen by one network user affects the congestion (and hence the latency) experienced by others, we model the problem as a noncooperative game. Assuming each agent controls only a negligible portion of the overall traffic, Nash equilibria in this noncooperative game correspond to s-t flows in which all flow paths have equal latency. We give optimal inapproximability results and approximation algorithms for several network design problems of this type. For example, we prove that for networks with n nodes and continuous, nondecreasing latency functions, there is no approximation algorithm for this problem with approximation ratio less than n/2 (unless P = NP). We also prove this hardness result to be best possible by exhibiting an n/2-approximation algorithm. For networks in which the latency of each edge is a linear function of the congestion, we prove that there is no (4/3 - ε)-approximation algorithm for the problem (for any ε > 0, unless P = NP); the existence of a 4/3-approximation algorithm follows easily from existing work, proving this hardness result sharp.

132 citations
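The equal-latency Nash flows described in this abstract, and the 4/3 gap for linear latency functions, are both visible in Pigou's classic two-link example (a standard textbook illustration, not code from the paper): one unit of traffic chooses between a constant-latency edge and a congestible edge.

```python
# Pigou's example: one unit of traffic from s to t over two parallel
# edges with latency functions l1(x) = 1 (constant) and l2(x) = x.
def total_latency(x2):
    """Total latency when a fraction x2 of the traffic takes the
    congestible edge l2(x) = x and the rest takes the constant edge."""
    x1 = 1.0 - x2
    return x1 * 1.0 + x2 * x2

# Nash equilibrium: selfish agents pile onto l2 until its latency
# matches the alternative, i.e. x2 = 1, giving total latency 1.
nash = total_latency(1.0)

# Social optimum: minimize (1 - x2) + x2^2, attained at x2 = 1/2,
# giving total latency 3/4.
opt = total_latency(0.5)

print(nash / opt)  # the 4/3 ratio matching the linear-latency bound
```

At the Nash flow every used path has latency 1, matching the equal-latency characterization in the abstract, while the optimum deliberately leaves some traffic on the slower constant edge.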


Network Information
Related Topics (5)
The Internet
213.2K papers, 3.8M citations
75% related
Node (networking)
158.3K papers, 1.7M citations
75% related
Wireless
133.4K papers, 1.9M citations
74% related
Server
79.5K papers, 1.4M citations
74% related
Network packet
159.7K papers, 2.2M citations
74% related
Performance
Metrics
No. of papers in the topic in previous years
Year	Papers
2022	2
2021	485
2020	529
2019	533
2018	500
2017	405