Author

Livia Elena Chatzieleftheriou

Other affiliations: Athens State University
Bio: Livia Elena Chatzieleftheriou is an academic researcher from Athens University of Economics and Business. The author has contributed to research in topics: Cache & Computer science. The author has an h-index of 3 and has co-authored 8 publications receiving 139 citations. Previous affiliations of Livia Elena Chatzieleftheriou include Athens State University.

Papers
Proceedings ArticleDOI
01 May 2017
TL;DR: This paper approaches recommender systems as network traffic engineering tools that can actively shape content demand towards optimizing user- and network-centric performance objectives, and formulates the resulting joint optimization problem of deciding on the cached content and the recommendations to each user.
Abstract: Caching decisions by default seek to maximize some notion of social welfare: the content to be cached is determined so that the maximum possible aggregate demand over all users served by the cache is satisfied. Recommendation systems, on the contrary, are oriented towards individual user preferences: the recommended content should be most appealing to the user so as to elicit further content consumption. In our paper we explore how these, seemingly conflicting, objectives can be jointly addressed. To this end, we depart radically from current practice with recommender systems, and we approach them as network traffic engineering tools that can actively shape content demand towards optimizing user- and network-centric performance objectives. We formulate the resulting joint theoretical optimization problem of deciding on the cached content and the recommendations to each user so that the cache hit ratio is maximized subject to a maximum tolerable distortion that the recommendations may undergo. We characterize its complexity, and we propose a practical algorithm for its solution. The algorithm is essentially a form of lightweight control over the user recommendations so that the recommended content is both appealing to the end user and more friendly to the caching system and the network resources.
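The abstract describes an algorithmic idea (jointly choosing the cache contents and lightly reshaped recommendation lists) rather than giving code. Below is a minimal, hedged sketch of that idea in Python; the function name, the preference-matrix representation, and the per-user swap budget are illustrative assumptions, not the authors' actual algorithm.

```python
import numpy as np

def joint_cache_and_recommend(pref, cache_size, rec_size, max_distortion):
    """Toy heuristic in the spirit of the paper: cache broadly popular content and
    nudge each user's recommendation list toward cached items, subject to a
    per-user distortion budget (fraction of recommendation slots we may alter)."""
    n_users, n_items = pref.shape
    # Baseline caching: keep the items with the largest aggregate preference.
    cache = set(np.argsort(-pref.sum(axis=0))[:cache_size])

    recommendations, swaps_allowed = [], int(max_distortion * rec_size)
    for u in range(n_users):
        top = list(np.argsort(-pref[u])[:rec_size])            # user's natural top-N
        cached_alts = [i for i in np.argsort(-pref[u]) if i in cache and i not in top]
        swaps = 0
        # Replace the least-preferred non-cached recommendations with the user's
        # most-preferred cached items, up to the distortion budget.
        for pos in range(rec_size - 1, -1, -1):
            if swaps >= swaps_allowed or not cached_alts:
                break
            if top[pos] not in cache:
                top[pos] = cached_alts.pop(0)
                swaps += 1
        recommendations.append(top)
    return cache, recommendations

# Usage with synthetic, Zipf-like preferences (5 users, 20 items).
rng = np.random.default_rng(0)
prefs = rng.dirichlet(np.ones(20) * 0.3, size=5)
cache, recs = joint_cache_and_recommend(prefs, cache_size=5, rec_size=4, max_distortion=0.5)
```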

87 citations

Journal ArticleDOI
TL;DR: A simpler heuristic algorithm is introduced that essentially serves as a form of lightweight control over recommendations so that they are both appealing to end-users and friendly to network resources.
Abstract: Caching decisions typically seek to cache content that satisfies the maximum possible demand aggregated over all users. Recommendation systems, on the contrary, focus on individual users and recommend to them appealing content in order to elicit further content consumption. In our paper, we explore how these, seemingly conflicting, objectives can be jointly addressed. First, we formulate an optimization problem for the joint caching and recommendation decisions, aiming to maximize the cache hit ratio under minimal controllable distortion of the inherent user content preferences by the issued recommendations. Then, we prove that the problem is NP-complete and that its objective function lacks those monotonicity and submodularity properties that would guarantee its approximability. Hence, we proceed to introduce a simpler heuristic algorithm that essentially serves as a form of lightweight control over recommendations so that they are both appealing to end-users and friendly to network resources. Finally, we draw on both analysis and simulations with real and synthetic datasets to evaluate the performance of the algorithm. We point out its fundamental properties, provide bounds for the achieved cache hit ratio, and study its sensitivity to its own as well as system-level parameters.
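Since this journal version emphasizes the trade-off between cache hit ratio and preference distortion, here is a hedged sketch of how one could evaluate both quantities for a given caching-plus-recommendation decision; the demand model (users request items from their recommendation list in proportion to their inherent preferences) and the distortion measure are illustrative assumptions, not the paper's definitions.

```python
import numpy as np

def hit_ratio_and_distortion(pref, cache, recs):
    """Evaluate a joint caching/recommendation solution.
    pref:  (n_users, n_items) inherent preference matrix, rows summing to 1.
    cache: set of cached item ids.
    recs:  recs[u] = list of item ids recommended to user u.
    Returns (cache hit ratio, mean preference distortion), where a user's
    distortion is the preference mass of their natural top-N list that the
    issued recommendations dropped."""
    hits, users, distortions = 0.0, 0, []
    for u, rec in enumerate(recs):
        w = pref[u, rec]
        if w.sum() == 0:
            continue
        demand = w / w.sum()                       # assumed per-user request distribution
        hits += sum(d for item, d in zip(rec, demand) if item in cache)
        users += 1
        natural = np.argsort(-pref[u])[:len(rec)]
        distortions.append(float(pref[u, natural].sum() - pref[u, rec].sum()))
    return hits / users, float(np.mean(distortions))
```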

73 citations

Journal ArticleDOI
25 Jan 2019
TL;DR: This paper establishes a framework for the joint user association, content caching and recommendations problem, and proposes a heuristic that tackles the joint problem when the objective is to maximize the total hit ratio over all caches.
Abstract: In this paper, we investigate the performance gains that are achievable when jointly controlling (i) to which Small-cell Base Stations (SBSs) mobile users are associated, (ii) which content items are stored at SBS co-located caches, and (iii) which content items are recommended to the mobile users who are associated to different SBSs. We first establish a framework for the joint user association, content caching and recommendations problem, by specifying a set of necessary conditions for all three component functions of the system. Then, we provide a concrete formulation of the joint problem when the objective is to maximize the total hit ratio over all caches. We analyze the problems that emerge as special cases of the joint problem, when one of the three functions is carried out independently, and use them to characterize its complexity. Finally, we propose a heuristic that tackles the joint problem. Proof-of-concept simulations demonstrate that even this simple heuristic outperforms an optimal algorithm that takes only caching and recommendation decisions into account, and provide evidence of the achievable performance gains when decisions over all three functions are jointly optimized.
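For a concrete feel of the objective the abstract formulates (total hit ratio over all caches as a function of the three joint decisions), here is a small illustrative evaluation routine; the data layout and variable names are assumptions made for the example, not the paper's notation.

```python
import numpy as np

def total_hit_ratio(demand, assoc, caches):
    """Average hit ratio over all users for a given joint decision.
    demand: (n_users, n_items) per-user request probabilities (rows sum to 1);
            recommendation effects are assumed already folded into these rows.
    assoc:  assoc[u] = index of the SBS that user u is associated to.
    caches: caches[b] = set of item ids stored at the cache of SBS b."""
    hit = 0.0
    for u in range(demand.shape[0]):
        hit += demand[u, list(caches[assoc[u]])].sum()
    return hit / demand.shape[0]
```

A joint heuristic of the kind the paper proposes would then search over `assoc`, `caches`, and the recommendation-shaped `demand` with a function of this kind as its objective.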

21 citations

Proceedings ArticleDOI
07 Jun 2020
TL;DR: A fast-converging, computationally simple heuristic algorithm that iterates between assigning users to small cells and content to SBS caches in order to maximize the overall cache hit ratio; the algorithm could become a valuable tool for small cell network operators seeking to optimize the use of radio network resources.
Abstract: Caching at the edge of the radio network is increasingly viewed as a promising countermeasure to the staggering demand for mobile video content. The persistent orientation of newer generations of mobile communication systems towards lower latency and faster radio access speeds only strengthens the arguments in its favor. When content caching is coordinated with other radio resource management functions, in particular, the benefits for the end users and the network operator are significant. In this paper, we investigate these benefits in cache-enabled small cell networks that jointly control (i) the Small-Cell Base Stations (SBSs) that serve as network access points for the mobile users; and (ii) the content that is stored at the SBS co-located caches. Our main contribution is a fast-converging computationally simple heuristic algorithm that iterates between assigning users to small cells and content to SBS caches, to maximize the overall cache hit ratio. The algorithm solutions compete with the optimal assignments at small problem instances and outperform alternative solutions for larger instances, especially when the content demand exhibits spatial locality. Combining good performance with non-prohibitive complexity, the algorithm could become a valuable tool for small cell network operators seeking to optimize the use of radio network resources.
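The alternating structure described above (fix the caches and re-associate users, then fix the association and refill the caches) can be sketched roughly as follows; the initialization, tie-breaking rules, and all names are assumptions made for illustration, not the authors' exact algorithm.

```python
import numpy as np

def alternating_association_caching(demand, coverage, cache_size, n_iters=20):
    """Iterate between (i) associating each user to the covering SBS whose cache best
    matches the user's demand and (ii) refilling each SBS cache with the items most
    demanded by its currently associated users.
    demand:   (n_users, n_items) per-user request probabilities.
    coverage: coverage[u] = list of SBS indices that can serve user u.
    cache_size: number of items each SBS cache can hold."""
    n_users, n_items = demand.shape
    n_sbs = 1 + max(b for cov in coverage for b in cov)
    # Start from identical popularity-based caches and a trivial association.
    popular = np.argsort(-demand.sum(axis=0))[:cache_size]
    caches = [set(popular) for _ in range(n_sbs)]
    assoc = [cov[0] for cov in coverage]

    for _ in range(n_iters):
        # (i) Association step: pick the covering SBS with the largest cached demand.
        new_assoc = [max(coverage[u], key=lambda b: demand[u, list(caches[b])].sum())
                     for u in range(n_users)]
        # (ii) Caching step: each SBS caches what its associated users demand most.
        new_caches = []
        for b in range(n_sbs):
            users = [u for u in range(n_users) if new_assoc[u] == b]
            local = demand[users].sum(axis=0) if users else demand.sum(axis=0)
            new_caches.append(set(np.argsort(-local)[:cache_size]))
        if new_assoc == assoc and new_caches == caches:
            break                                   # converged
        assoc, caches = new_assoc, new_caches
    return assoc, caches
```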

6 citations

Proceedings ArticleDOI
01 Aug 2018
TL;DR: This work defines the context classification precision as a function of the quantity of information that users provide, and demonstrates through numerical experiments that appropriate management of the limited resources at the wireless edge can maximize the classification precision of data analytics mechanisms needed for augmented reality applications.
Abstract: From entertainment to education, augmented reality (AR) is about to impact positively our everyday lives. Enhanced capabilities of mobile devices, such as smartphones or wearables, as well as ubiquitous network connectivity give AR the opportunity to prosper. Despite these improvements, AR requires computationally heavy tasks, such as context recognition and classification through image or video processing, which are hard to fulfill on mobile devices. To this end, solutions for computation offloading to cloud servers have been proposed. We consider a scenario where context identification is performed through elicitation of user-generated information, such as images or small video files. It is the quantity of this information that ultimately determines the context classification precision, which we model as a Binomial random variable. We introduce the problem of maximizing a lower bound of the precision of context classification through prudent resource allocation, namely computation offloading, and bandwidth and computational capacity allocation at the wireless network edge. We define the context classification precision as a function of the quantity of information that users provide, and we demonstrate through numerical experiments that appropriate management of the limited resources at the wireless edge can maximize the classification precision of data analytics mechanisms needed for augmented reality applications.
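To make the precision model concrete: with n user-provided samples, each classified correctly with probability p, the empirical precision is Binomial(n, p)/n, and a one-sided Hoeffding bound yields a simple lower confidence bound that grows with n. The sketch below uses that textbook bound together with a greedy max-min resource allocation as an illustrative stand-in; the paper's actual bound and allocation formulation are not reproduced here, and all names are assumptions.

```python
import math

def precision_lower_bound(p, n, delta=0.05):
    """Hoeffding lower bound, holding with probability >= 1 - delta, on the empirical
    classification precision when n i.i.d. samples are each correct with probability p."""
    if n == 0:
        return 0.0
    return max(0.0, p - math.sqrt(math.log(1.0 / delta) / (2.0 * n)))

def allocate_edge_resources(p, budget, samples_per_unit=1):
    """Greedy max-min sketch: spend one unit of edge bandwidth/compute at a time
    on the user whose current precision lower bound is lowest."""
    n = [0] * len(p)
    for _ in range(budget):
        worst = min(range(len(p)), key=lambda i: precision_lower_bound(p[i], n[i]))
        n[worst] += samples_per_unit
    return n, [precision_lower_bound(p[i], n[i]) for i in range(len(p))]

# Example: three users whose uploaded samples are classified correctly with
# probability 0.9, 0.7 and 0.8 respectively; 30 resource units to split.
alloc, bounds = allocate_edge_resources([0.9, 0.7, 0.8], budget=30)
```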

3 citations


Cited by
Journal ArticleDOI
TL;DR: This tutorial sets out to convince the reader that content caching is an exciting research topic for future communication systems and networks: caching has been studied for more than 40 years and has recently received increased attention from industry and academia.
Abstract: This paper has the following ambitious goal: to convince the reader that content caching is an exciting research topic for the future communication systems and networks. Caching has been studied for more than 40 years, and has recently received increased attention from industry and academia. Novel caching techniques promise to push the network performance to unprecedented limits, but also pose significant technical challenges. This tutorial provides a brief overview of existing caching solutions, discusses seminal papers that open new directions in caching, and presents the contributions of this special issue. We analyze the challenges that caching needs to address today, also considering an industry perspective, and identify bottleneck issues that must be resolved to unleash the full potential of this promising technique.

245 citations

Journal ArticleDOI
TL;DR: This paper explores cache deployment in a large-scale WiFi system, containing 8,000 APs and serving more than 40,000 active users, so as to maximize the long-term caching gain, and proposes a cache deployment strategy, named LeaD, that achieves near-optimal caching performance and significantly outperforms benchmark strategies.
Abstract: Widespread and large-scale WiFi systems have been deployed in many corporate locations, while backhaul capacity becomes the bottleneck in providing high-rate data services to a tremendous number of WiFi users. Mobile edge caching is a promising solution to relieve backhaul pressure and deliver quality services by proactively pushing contents to access points (APs). However, how to deploy caches in a large-scale WiFi system is not well studied and is quite challenging, since numerous APs can have heterogeneous traffic characteristics and future traffic conditions are unknown ahead of time. In this paper, given the cache storage budget, we explore the cache deployment in a large-scale WiFi system, which contains 8,000 APs and serves more than 40,000 active users, to maximize the long-term caching gain. Specifically, we first collect two-month user association records and conduct intensive spatio-temporal analytics on WiFi traffic consumption, gaining two major observations. First, per-AP traffic consumption varies in a rather wide range and APs are distributed evenly within that range, indicating that cache sizes should be allocated heterogeneously in accordance with the underlying traffic demands. Second, compared to a single AP, the traffic consumption of a group of APs (clustered by physical locations) is more stable, which means that short-term traffic statistics can be used to infer future long-term traffic conditions. We then propose our cache deployment strategy, named LeaD (i.e., Large-scale WiFi Edge cAche Deployment), in which we first cluster large-scale APs into well-sized edge nodes, then conduct stationarity testing on edge-level traffic consumption and sample sufficient traffic statistics in order to precisely characterize long-term traffic conditions, and finally devise the TEG (Traffic-wEighted Greedy) algorithm to solve the long-term caching gain maximization problem. Extensive trace-driven experiments are carried out, and the results demonstrate that LeaD achieves near-optimal caching performance and significantly outperforms other benchmark strategies.
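The final step of LeaD, the traffic-weighted greedy allocation of the cache budget across edge nodes, can be sketched roughly as follows; this mirrors the greedy flavor described in the abstract but is an illustrative reconstruction, not the authors' TEG implementation, and all names are assumptions.

```python
import numpy as np

def traffic_weighted_greedy(node_demand, total_slots):
    """Allocate `total_slots` cache slots across edge nodes, one at a time, always to
    the node where the next slot captures the most local traffic (i.e., the node's
    next-most-popular still-uncached content).
    node_demand: list of 1-D arrays; node_demand[e][i] = observed demand for
                 content i at edge node e over the measurement window."""
    sorted_demand = [np.sort(d)[::-1] for d in node_demand]   # per-node demand, descending
    slots = [0] * len(node_demand)
    for _ in range(total_slots):
        gains = [sd[slots[e]] if slots[e] < len(sd) else 0.0
                 for e, sd in enumerate(sorted_demand)]
        slots[int(np.argmax(gains))] += 1
    return slots   # cache size (in slots) deployed at each edge node
```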

93 citations

Journal ArticleDOI
TL;DR: In this article, the authors introduce the concept of soft cache hits (SCHs): an SCH occurs when a user's requested content is not in the local cache, but the user can be satisfied by a related content that is.
Abstract: Pushing popular content to small cells with local storage (“helper” nodes) has been proposed to cope with the ever-growing data demand. Nevertheless, the collective storage of a few nearby helper nodes may not suffice to achieve a high hit rate in practice. In this paper, we introduce the concept of “soft cache hits” (SCHs). An SCH occurs if a user’s requested content is not in the local cache, but the user can be (partially) satisfied by a related content that is. In case of a cache miss, an application proxy (e.g., YouTube) running close to the helper node (e.g., at a multi-access edge computing server) can recommend the most related files that are locally cached. This system could be activated during periods of predicted congestion, or for selected users (e.g., low-cost plans), to improve cache hit ratio with limited (and tunable) user quality of experience performance impact. Beyond introducing a model for soft cache hits, our next contribution is to show that the optimal caching policy should be revisited when SCHs are allowed. In fact, we show that optimal caching with SCH is NP-hard even for a single cache . To this end, we formulate the optimal femto-caching problem with SCH in a sufficiently generic setup and propose efficient algorithms with provable performance. Finally, we use a large range of real datasets to corroborate our proposal.
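A hedged sketch of the soft-cache-hit idea: the utility of a cache is the demand-weighted best relatedness between each requested content and anything cached, and a greedy placement performs well because this facility-location-style utility is monotone and submodular (giving the classical 1 - 1/e guarantee). The relatedness matrix and the utility form below are assumptions for illustration, not the paper's exact model.

```python
import numpy as np

def soft_hit_utility(cache, demand, related):
    """Expected soft-hit utility: a request for item i is served with utility equal to
    the best relatedness between i and any cached item (1.0 when i itself is cached).
    demand:  length-n array of request probabilities.
    related: (n, n) relatedness scores in [0, 1], with related[i, i] == 1."""
    if not cache:
        return 0.0
    return float(np.sum(demand * related[:, list(cache)].max(axis=1)))

def greedy_soft_caching(demand, related, cache_size):
    """Greedy placement for the soft-cache-hit objective under a cardinality constraint."""
    cache = set()
    for _ in range(cache_size):
        best = max((i for i in range(len(demand)) if i not in cache),
                   key=lambda i: soft_hit_utility(cache | {i}, demand, related))
        cache.add(best)
    return cache
```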

78 citations

Posted Content
TL;DR: This survey article provides a comprehensive introduction to edge intelligence and its application areas and presents a systematic classification of the state of the solutions by examining research results and observations for each of the four components.
Abstract: Edge intelligence refers to a set of connected systems and devices that perform data collection, caching, processing, and analysis close to where the data is captured, based on artificial intelligence. The aim of edge intelligence is to enhance the quality and speed of data processing and to protect the privacy and security of the data. Although it emerged only recently, around 2011, this field of research has shown explosive growth over the past five years. In this paper, we present a thorough and comprehensive survey of the literature surrounding edge intelligence. We first identify four fundamental components of edge intelligence, namely edge caching, edge training, edge inference, and edge offloading, based on theoretical and practical results pertaining to proposed and deployed systems. We then aim for a systematic classification of the state of the solutions by examining research results and observations for each of the four components and present a taxonomy that includes practical problems, adopted techniques, and application goals. For each category, we elaborate, compare and analyse the literature from the perspectives of adopted techniques, objectives, performance, advantages and drawbacks, etc. This survey article provides a comprehensive introduction to edge intelligence and its application areas. In addition, we summarise the development of this emerging research field and the current state-of-the-art, and discuss important open issues and possible theoretical and technical solutions.

65 citations