Proceedings ArticleDOI

Energy consumption of photo sharing in online social networks

26 May 2014, pp. 604-611
TL;DR: This paper provides a comprehensive framework and a set of measurements for understanding the energy consumption of cloud applications such as photo sharing in social networks, and indicates that achieving an energy-efficient cloud service requires energy-efficiency improvements in the transport network and end-user devices along with the related data centers.
Abstract: Online social networks (OSNs), with their huge number of active users, consume a significant amount of energy both in data centers and in the transport network. Existing studies focus mainly on the energy consumption in the data centers and do not take into account the energy consumed while transporting data between end-users and data centers. To quantify this neglected energy, this paper provides a comprehensive framework and a set of measurements for understanding the energy consumption of cloud applications such as photo sharing in social networks. A new energy model is developed to estimate the energy consumption of cloud applications and is applied to photo sharing on Facebook as an example. Our results indicate that the energy consumed in the network and end-user devices for photo sharing is approximately equal to 60% of the energy consumption of all Facebook data centers. Therefore, achieving an energy-efficient cloud service requires energy-efficiency improvements in the transport network and end-user devices along with the related data centers.
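The abstract argues that the transport network and end-user devices add an amount of energy comparable to the data centers themselves (about 60% of the Facebook data-center figure). A minimal sketch of that end-to-end bookkeeping is given below; the function name and the per-segment joule values are illustrative assumptions, not numbers from the paper.

```python
# Hypothetical end-to-end energy accounting for one photo-sharing transaction,
# following the decomposition described in the abstract:
#   E_total = E_datacenter + E_transport + E_enduser
# All figures below are illustrative placeholders, not values from the paper.

def total_energy_per_photo(e_datacenter_j: float,
                           e_transport_j: float,
                           e_enduser_j: float) -> float:
    """Sum the energy (joules) consumed in each segment for one photo."""
    return e_datacenter_j + e_transport_j + e_enduser_j


if __name__ == "__main__":
    e_dc, e_net, e_dev = 2.0, 0.8, 0.4          # assumed per-photo joules
    total = total_energy_per_photo(e_dc, e_net, e_dev)
    outside_dc_share = (e_net + e_dev) / e_dc   # network + device relative to DC
    print(f"total = {total:.2f} J, (network + device) / DC = {outside_dc_share:.0%}")
```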
Citations
Journal ArticleDOI
TL;DR: An in-depth study of the existing literature on data center power modeling, covering more than 200 models, organized in a hierarchical structure with two main branches focusing on hardware-centric and software-centric power models.
Abstract: Data centers are critical, energy-hungry infrastructures that run large-scale Internet-based services. Energy consumption models are pivotal in designing and optimizing energy-efficient operations to curb excessive energy consumption in data centers. In this paper, we survey the state-of-the-art techniques used for energy consumption modeling and prediction for data centers and their components. We conduct an in-depth study of the existing literature on data center power modeling, covering more than 200 models. We organize these models in a hierarchical structure with two main branches focusing on hardware-centric and software-centric power models. Under hardware-centric approaches we start from the digital circuit level and move on to describe higher-level energy consumption models at the hardware component level, server level, data center level, and finally the systems-of-systems level. Under the software-centric approaches we investigate power models developed for operating systems, virtual machines, and software applications. This systematic approach allows us to identify multiple issues prevalent in power modeling at different levels of data center systems, including: i) few modeling efforts target the power consumption of the entire data center; ii) many state-of-the-art power models are based on only a few CPU or server metrics; and iii) the effectiveness and accuracy of these power models remain open questions. Based on these observations, we conclude the survey by describing key challenges for future research on constructing effective and accurate data center power models.

741 citations


Cites background or methods from "Energy consumption of photo sharing..."

  • ...[244]: network device; mathematical-integration-based power model....


  • ...A slightly extended version of the energy-consumption model for network elements can be obtained by integrating the power consumed by the device [244]....


Journal ArticleDOI
TL;DR: This survey starts by providing an overview and the fundamentals of fog computing architecture, and then provides an extensive overview of state-of-the-art network applications and the major research aspects involved in designing these networks.
Abstract: Fog computing is an emerging paradigm that extends computation, communication, and storage facilities toward the edge of a network. Compared to traditional cloud computing, fog computing can support delay-sensitive service requests from end-users (EUs) with reduced energy consumption and low traffic congestion. Fog networks can essentially be viewed as offloading computation and storage from the core. Fog nodes decide either to process a service request using their available resources or to send it to the cloud server. Thus, fog computing helps achieve efficient resource utilization and higher performance with respect to delay, bandwidth, and energy consumption. This survey starts by providing an overview and the fundamentals of fog computing architecture. Furthermore, service and resource allocation approaches are summarized to address several critical issues such as latency, bandwidth, and energy consumption in fog computing. Afterward, compared to other surveys, this paper provides an extensive overview of state-of-the-art network applications and the major research aspects involved in designing these networks. In addition, this paper highlights ongoing research efforts, open challenges, and research trends in fog computing.

475 citations


Cites methods from "Energy consumption of photo sharing..."

  • ...Since the energy consumption of a cloud service depends on the proportional allocation of the equipment's power over all the flows through the equipment, the power consumption of a typical piece of network equipment with capacity $C$ (in bits/second) is modeled in the linear form [105], [106] $P(C) = P_{\text{idle}} + \frac{C\,(P_{\text{max}} - P_{\text{idle}})}{C_{\text{max}}}$ (13).... (see the sketch below)

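The quoted Eq. (13) is the linear power model for shared network equipment: idle power plus a load-proportional increment up to $P_{\text{max}}$ at capacity $C_{\text{max}}$. The sketch below evaluates it, along with one common way to attribute a share of the device power to a single flow; the parameter values and the per-flow allocation rule are illustrative assumptions, not prescriptions from [105] or [106].

```python
# Linear power model for shared network equipment, per Eq. (13):
#   P(C) = P_idle + C * (P_max - P_idle) / C_max
# Parameter values below are illustrative assumptions.

def device_power(load_bps: float, p_idle_w: float, p_max_w: float,
                 c_max_bps: float) -> float:
    """Power draw (watts) of a device carrying load_bps of traffic."""
    return p_idle_w + load_bps * (p_max_w - p_idle_w) / c_max_bps


def per_flow_power(flow_bps: float, p_idle_w: float, p_max_w: float,
                   c_max_bps: float, total_load_bps: float) -> float:
    """Share of the device power attributed to one flow, in proportion to
    that flow's fraction of the total traffic through the device."""
    share = flow_bps / total_load_bps
    return share * device_power(total_load_bps, p_idle_w, p_max_w, c_max_bps)


if __name__ == "__main__":
    # Example: a 10 Gbit/s device at 50% load carrying a 5 Mbit/s flow.
    print(per_flow_power(5e6, p_idle_w=50.0, p_max_w=100.0,
                         c_max_bps=10e9, total_load_bps=5e9))
```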

Journal ArticleDOI
TL;DR: It is shown that nano servers in Fog computing can complement centralized DCs to serve certain applications, mostly IoT applications for which the source of data is in end-user premises, and can lead to energy savings if the applications (or parts of them) are off-loadable from centralized DCs and run on nDCs.
Abstract: Tiny computers located in end-user premises are becoming popular as local servers for Internet of Things (IoT) and Fog computing services. These highly distributed servers, which can host and distribute content and applications in a peer-to-peer (P2P) fashion, are known as nano data centers (nDCs). Despite the growing popularity of nano servers, their energy consumption is not well investigated. To study the energy consumption of nDCs, we propose and use flow-based and time-based energy consumption models for shared and unshared network equipment, respectively. To apply and validate these models, a set of measurements and experiments is performed to compare the energy consumption of a service provided by nDCs and by centralized data centers (DCs). A number of findings emerge from our study, including the factors in the system design that allow nDCs to consume less energy than their centralized counterpart. These include the type of access network attached to nano servers and the nano server's time utilization (the ratio of idle time to active time). Additionally, the type of applications running on nDCs and factors such as the number of downloads, the number of updates, and the amount of preloaded copies of data influence the energy cost. Our results reveal that the number of hops between a user and content has little impact on the total energy consumption compared to the above-mentioned factors. We show that nano servers in Fog computing can complement centralized DCs to serve certain applications, mostly IoT applications for which the source of data is in end-user premises, and can lead to energy savings if the applications (or parts of them) are off-loadable from centralized DCs and run on nDCs.
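The abstract contrasts a flow-based model for shared network equipment with a time-based model for unshared equipment such as a nano server. The sketch below illustrates that distinction in its simplest form; the specific functions and the numeric values are assumptions for illustration, not the paper's exact formulation.

```python
# Illustrative contrast between the two model types mentioned in the abstract:
#  - flow-based: shared equipment energy is apportioned by the bits of a flow
#  - time-based: unshared equipment energy is charged for its whole active time
# All parameter values are assumptions for illustration.

def flow_based_energy(bits: float, energy_per_bit_j: float) -> float:
    """Energy attributed to a flow crossing shared equipment (joules)."""
    return bits * energy_per_bit_j


def time_based_energy(active_seconds: float, power_w: float) -> float:
    """Energy of an unshared device (e.g. a nano server) while active (joules)."""
    return active_seconds * power_w


if __name__ == "__main__":
    # A 2 MB photo through shared network gear vs. 10 s on a 5 W nano server.
    print(flow_based_energy(bits=2e6 * 8, energy_per_bit_j=1e-7))
    print(time_based_energy(active_seconds=10.0, power_w=5.0))
```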

358 citations


Cites background or methods from "Energy consumption of photo sharing..."

  • ...We also measured the upload traffic and found it was similar to the download traffic, although there are some cloud applications for which upload and download traffic are not the same, such as Google Drive and Facebook [10], [11]....


  • ...The content passes through an access network which might be an Ethernet, WiFi, PON, 3G or 4G connection, or a combination of these to reach the end-user terminal [8]–[11]....


Proceedings ArticleDOI
01 Nov 2016
TL;DR: The results indicate that the type of IoT application, the availability of local renewable energy, and weather forecasting all influence how a system makes dynamic decisions in terms of saving energy.
Abstract: The Internet of Things (IoT) is hailed as the next big phase in the evolution of the Internet. IoT devices are rapidly proliferating, owing to widespread adoption by industries in various sectors. Unlike security and privacy concerns, the energy consumption of the IoT and its applications has received little attention from the research community. This paper explores different options for deploying energy-efficient IoT applications. Specifically, we evaluate the use of a combination of Fog computing and microgrids for reducing the energy consumption of IoT applications. First, we study the energy consumption of different types of IoT applications (such as IoT applications with differing traffic generation or computation) running from both Fog and Cloud computing. Next, we consider the role of local renewable energy provided by microgrids, and local weather forecasting, along with Fog computing. To evaluate our proposal, energy consumption modeling, practical experiments, and measurements were performed. The results indicate that the type of IoT application, the availability of local renewable energy, and weather forecasting all influence how a system makes dynamic decisions in terms of saving energy.
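The results say the placement decision depends on the IoT application, the locally available renewable energy, and the weather forecast. The sketch below encodes one plausible form of such a dynamic rule; the field names, the forecast factor, and the threshold logic are hypothetical, not the decision procedure evaluated in the paper.

```python
# Hypothetical dynamic placement rule in the spirit of the abstract:
# run on Fog when the local renewable supply (forecast-adjusted) can cover
# the expected Fog-side demand; otherwise fall back to the Cloud.
# All thresholds and field names are assumptions for illustration.

from dataclasses import dataclass


@dataclass
class Conditions:
    local_renewable_w: float   # current microgrid renewable output (watts)
    forecast_factor: float     # 0..1 multiplier derived from the weather forecast
    fog_demand_w: float        # expected power draw of the app on Fog nodes


def place_workload(c: Conditions) -> str:
    """Return 'fog' or 'cloud' based on expected local renewable supply."""
    expected_supply = c.local_renewable_w * c.forecast_factor
    return "fog" if expected_supply >= c.fog_demand_w else "cloud"


if __name__ == "__main__":
    print(place_workload(Conditions(local_renewable_w=120.0,
                                    forecast_factor=0.7,
                                    fog_demand_w=60.0)))  # -> "fog"
```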

63 citations

Journal ArticleDOI
TL;DR: The methodology of component-level power modeling presented in this paper helps realize fine-grained power control and provides CSPs with useful guidance on optimizing energy management of cloud data centers.

33 citations

References
Proceedings ArticleDOI
24 Oct 2010
TL;DR: PowerBooter is an automated power model construction technique that uses built-in battery voltage sensors and knowledge of battery discharge behavior to monitor power consumption while explicitly controlling the power management and activity states of individual components.
Abstract: This paper describes PowerBooter, an automated power model construction technique that uses built-in battery voltage sensors and knowledge of battery discharge behavior to monitor power consumption while explicitly controlling the power management and activity states of individual components. It requires no external measurement equipment. We also describe PowerTutor, a component power management and activity state introspection based tool that uses the model generated by PowerBooter for online power estimation. PowerBooter is intended to make it quick and easy for application developers and end users to generate power models for new smartphone variants, which each have different power consumption properties and therefore require different power models. PowerTutor is intended to ease the design and selection of power efficient software for embedded systems. Combined, PowerBooter and PowerTutor have the goal of opening power modeling and analysis for more smartphone variants and their users.
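PowerBooter fits a device-specific model from battery voltage and discharge behavior, and PowerTutor then uses that model for online estimation. A common shape for such models is a linear combination of per-component activity states, sketched below; the component names and coefficients are assumptions for illustration, not PowerTutor's actual model.

```python
# Generic state-based linear power model of the kind PowerTutor applies:
#   P_total = sum_i coefficient_i * activity_i
# Coefficients (watts at full activity) are illustrative assumptions.

COEFFS_W = {"cpu": 1.2, "screen": 0.9, "wifi": 0.7, "cellular": 1.5}


def phone_power(activity: dict[str, float]) -> float:
    """Estimate power (watts) from per-component activity levels in [0, 1]."""
    return sum(COEFFS_W[name] * level for name, level in activity.items())


if __name__ == "__main__":
    # Uploading a photo over Wi-Fi with the screen on and moderate CPU use.
    print(phone_power({"cpu": 0.4, "screen": 1.0, "wifi": 0.8, "cellular": 0.0}))
```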

1,225 citations


"Energy consumption of photo sharing..." refers background in this paper

  • ...Currently, more than half of the users access Facebook via mobile devices [21]; the incremental energy for uploading a photo using a smartphone is obtained with a mobile phone application named PowerTutor [22], [23]....


Journal ArticleDOI
TL;DR: An overview of the components and capabilities of the Akamai platform is given, and some insight into its architecture, design principles, operation, and management is offered.
Abstract: Comprising more than 61,000 servers located across nearly 1,000 networks in 70 countries worldwide, the Akamai platform delivers hundreds of billions of Internet interactions daily, helping thousands of enterprises boost the performance and reliability of their Internet applications. In this paper, we give an overview of the components and capabilities of this large-scale distributed computing platform, and offer some insight into its architecture, design principles, operation, and management.

769 citations


"Energy consumption of photo sharing..." refers background in this paper

  • ...When friends request the photo, DC1 sends the photo to Akamai intermediate nodes [15]; after a few hops it reaches an Akamai server at the edge of the network, very close to the users....


01 Jan 2011
TL;DR: It is shown that energy consumption in transport and switching can be a significant percentage of total energy consumption in cloud computing; the analysis considers both public and private clouds and includes the energy consumption of the transmission and switching networks.
Abstract: Network-based cloud computing is rapidly expanding as an alternative to conventional office-based computing. As cloud computing becomes more widespread, the energy consumption of the network and computing resources that underpin the cloud will grow. This is happening at a time when there is increasing attention being paid to the need to manage energy consumption across the entire information and communications technology (ICT) sector. While data center energy use has received much attention recently, there has been less attention paid to the energy consumption of the transmission and switching networks that are key to connecting users to the cloud. In this paper, we present an analysis of energy consumption in cloud computing. The analysis considers both public and private clouds, and includes energy consumption in switching and transmission as well as data processing and data storage. We show that energy consumption in transport and switching can be a significant percentage of total energy consumption in cloud computing. Cloud computing can enable more energy-efficient use of computing power, especially when the computing tasks are of low intensity or infrequent. However, under some circumstances cloud computing can consume more energy than conventional computing where each user performs all computing on their own personal computer (PC).

748 citations


"Energy consumption of photo sharing..." refers background or methods in this paper

  • ...The power consumption of each network device typically follows a linear trend [2], [7], shown schematically in Figure 3....


  • ...The average incremental energy-per-bit ($E'_b$) for $n \gg 1$ network elements (base stations, edge and core devices, servers, etc.) is given by $E'_b = \frac{P_{\text{total}} - \langle P_{\text{idle}} \rangle}{C_{\text{total}}} \approx \frac{\left(\frac{1}{\rho} - 1\right)\langle P_{\text{idle}} \rangle + \langle P_{\text{max}} \rangle}{\langle C_{\text{max}} \rangle}$ (3), where $C_{\text{total}}$ is the capacity of the network elements....


  • ...Also, we define the mean incremental energy per bit as $\langle E_b \rangle = \frac{\langle P_{\text{max}} \rangle - \langle P_{\text{idle}} \rangle}{\langle C_{\text{max}} \rangle}$. With these definitions, the total power consumption of the network with $n \gg 1$ network elements is $P_{\text{total}} = n\left(\langle P_{\text{idle}} \rangle + \rho E_b \langle C_{\text{max}} \rangle\right)$, where $\rho$ is the utilization threshold of the network elements for adding new equipment....


  • ...Therefore the incremental energy associated with a user device for photo sharing is $E_{\text{inc-ter}} = \int_{t_1}^{t_2} \left(P(t) - P_{\text{idle}}\right) dt = \frac{P_{\text{max}} - P_{\text{idle}}}{C_{\text{max}}} N_{\text{bit}} = E_{\text{b-ter}} N_{\text{bit}}$ (2). Here $P(t)$ is the power consumption of the device from time $t_1$ to time $t_2$, the start and end times of the upload or download transaction, $E_{\text{b-ter}}$ is the incremental energy-per-bit for the customer terminal equipment, and $C_{\text{max}}$ is the maximum throughput capacity of the equipment.... (see the sketch after this list)


  • ...$N_{\text{bit}}$ is the number of transmitted and received bits when interacting with a cloud service [2], [7]....

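The equations quoted above give the incremental energy of a user terminal (Eq. 2) and the average incremental energy-per-bit of $n \gg 1$ shared network elements (Eq. 3). The sketch below evaluates both directly from those definitions; the parameter values in the example run are assumed figures, not measurements from the paper.

```python
# Incremental energy of a user terminal (Eq. 2) and average incremental
# energy-per-bit over n >> 1 shared network elements (Eq. 3), as quoted above.
# Parameter values in the example are assumptions, not the paper's measurements.

def terminal_incremental_energy(n_bits: float, p_max_w: float,
                                p_idle_w: float, c_max_bps: float) -> float:
    """E_inc-ter = (P_max - P_idle) / C_max * N_bit   (joules)."""
    return (p_max_w - p_idle_w) / c_max_bps * n_bits


def avg_incremental_energy_per_bit(p_idle_w: float, p_max_w: float,
                                   c_max_bps: float, rho: float) -> float:
    """E'_b ~ ((1/rho - 1) * <P_idle> + <P_max>) / <C_max>   (joules/bit)."""
    return ((1.0 / rho - 1.0) * p_idle_w + p_max_w) / c_max_bps


if __name__ == "__main__":
    # A 2 MB photo upload from a laptop on a 100 Mbit/s link (assumed figures).
    print(terminal_incremental_energy(n_bits=2e6 * 8, p_max_w=35.0,
                                      p_idle_w=25.0, c_max_bps=100e6))
    # Shared network elements at a 50% utilization threshold (assumed figures).
    print(avg_incremental_energy_per_bit(p_idle_w=500.0, p_max_w=1000.0,
                                         c_max_bps=10e9, rho=0.5))
```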

Journal ArticleDOI
01 Jan 2011
TL;DR: In this paper, the authors present an analysis of energy consumption in cloud computing, considering both public and private clouds, and include energy consumption of switching and transmission as well as data processing and data storage.
Abstract: Network-based cloud computing is rapidly expanding as an alternative to conventional office-based computing. As cloud computing becomes more widespread, the energy consumption of the network and computing resources that underpin the cloud will grow. This is happening at a time when there is increasing attention being paid to the need to manage energy consumption across the entire information and communications technology (ICT) sector. While data center energy use has received much attention recently, there has been less attention paid to the energy consumption of the transmission and switching networks that are key to connecting users to the cloud. In this paper, we present an analysis of energy consumption in cloud computing. The analysis considers both public and private clouds, and includes energy consumption in switching and transmission as well as data processing and data storage. We show that energy consumption in transport and switching can be a significant percentage of total energy consumption in cloud computing. Cloud computing can enable more energy-efficient use of computing power, especially when the computing tasks are of low intensity or infrequent. However, under some circumstances cloud computing can consume more energy than conventional computing where each user performs all computing on their own personal computer (PC).

704 citations

Proceedings ArticleDOI
16 Jun 2009
TL;DR: The GreenCloud architecture is presented, which aims to reduce data center power consumption while guaranteeing performance from the users' perspective, and enables comprehensive online monitoring, live virtual machine migration, and VM placement optimization.
Abstract: Nowadays, the power consumption of data centers has a huge impact on the environment. Researchers are seeking effective solutions that let data centers reduce power consumption while keeping the desired quality of service or service-level objectives. Virtual machine (VM) technology has been widely applied in data center environments due to its seminal features, including reliability, flexibility, and ease of management. We present the GreenCloud architecture, which aims to reduce data center power consumption while guaranteeing performance from the users' perspective. The GreenCloud architecture enables comprehensive online monitoring, live virtual machine migration, and VM placement optimization. To verify the efficiency and effectiveness of the proposed architecture, we take an online real-time game, Tremulous, as a VM application. Evaluation results show that we can save up to 27% of the energy when applying the GreenCloud architecture.

436 citations


"Energy consumption of photo sharing..." refers background in this paper

  • ...Energy consumption of the transport network and end-user devices has been ignored in most studies of energy consumption in cloud-based applications and services [8], [9]....
