Author

# Alberto Leon-Garcia

Bio: Alberto Leon-Garcia is an academic researcher at the University of Toronto. His research topics include cloud computing and quality of service. He has an h-index of 37 and has co-authored 363 publications receiving 11,422 citations.

##### Papers

TL;DR: This paper presents an autonomous and distributed demand-side energy management system among users that takes advantage of a two-way digital communication infrastructure which is envisioned in the future smart grid.

Abstract: Most of the existing demand-side management programs focus primarily on the interactions between a utility company and its customers/users. In this paper, we present an autonomous and distributed demand-side energy management system among users that takes advantage of the two-way digital communication infrastructure envisioned in the future smart grid. We use game theory and formulate an energy consumption scheduling game, where the players are the users and their strategies are the daily schedules of their household appliances and loads. It is assumed that the utility company can adopt adequate pricing tariffs that differentiate the energy usage in time and level. We show that for a common scenario, with a single utility company serving multiple customers, the global optimal performance in terms of minimizing the energy costs is achieved at the Nash equilibrium of the formulated energy consumption scheduling game. The proposed distributed demand-side energy management strategy requires each user to simply apply its best response strategy to the current total load and tariffs in the power distribution system. The users can maintain privacy and do not need to reveal the details of their energy consumption schedules to other users. We also show that users have incentives to participate in the energy consumption scheduling game and to subscribe to such services. Simulation results confirm that the proposed approach can reduce the peak-to-average ratio of the total energy demand, the total energy costs, and each user's individual daily electricity charges.
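The best-response dynamics described above can be illustrated with a toy model. Everything below (the quadratic hourly cost, the water-filling best response, and the demand figures) is an illustrative sketch under assumed parameters, not the paper's actual formulation:

```python
import numpy as np

def water_fill(others, energy, iters=60):
    """Best response to a quadratic hourly cost: spread `energy`
    so that the total load (own + others) is as flat as possible."""
    lo, hi = others.min(), others.max() + energy
    for _ in range(iters):
        t = (lo + hi) / 2
        if np.maximum(0.0, t - others).sum() > energy:
            hi = t
        else:
            lo = t
    return np.maximum(0.0, t - others)

def best_response_dynamics(demands, hours=24, rounds=50, seed=0):
    """Each user repeatedly best-responds to the others' total load."""
    rng = np.random.default_rng(seed)
    n = len(demands)
    # start from random feasible schedules (rows sum to each user's demand)
    sched = rng.random((n, hours))
    sched *= np.asarray(demands)[:, None] / sched.sum(axis=1, keepdims=True)
    for _ in range(rounds):
        for i in range(n):
            others = sched.sum(axis=0) - sched[i]
            sched[i] = water_fill(others, demands[i])
    return sched

sched = best_response_dynamics([10.0, 20.0, 15.0])
total = sched.sum(axis=0)
par = total.max() / total.mean()   # peak-to-average ratio of the total load
```

With a convex cost, each user's best response shifts its load into the least-loaded hours, and iterating drives the total load toward a flat profile, i.e. a peak-to-average ratio near 1.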

2,715 citations

TL;DR: Simulation results show that the combination of the proposed energy consumption scheduling design and the price predictor filter leads to significant reduction not only in users' payments but also in the resulting peak-to-average ratio in load demand for various load scenarios.

Abstract: Real-time electricity pricing models can potentially lead to economic and environmental advantages compared to the current common flat rates. In particular, they can provide end users with the opportunity to reduce their electricity expenditures by responding to pricing that varies with different times of the day. However, recent studies have revealed that the lack of knowledge among users about how to respond to time-varying prices, as well as the lack of effective building automation systems, are two major barriers to fully utilizing the potential benefits of real-time pricing tariffs. We tackle these problems by proposing an optimal and automatic residential energy consumption scheduling framework which attempts to achieve a desired trade-off between minimizing the electricity payment and minimizing the waiting time for the operation of each appliance in the household, in the presence of a real-time pricing tariff combined with inclining block rates. Our design requires minimum effort from the users and is based on simple linear programming computations. Moreover, we argue that any residential load control strategy in real-time electricity pricing environments requires price prediction capabilities. This is particularly true if the utility companies provide price information only one or two hours ahead of time. By applying a simple and efficient weighted average price prediction filter to the actual hourly-based price values used by the Illinois Power Company from January 2007 to December 2009, we obtain the optimal choices of the coefficients for each day of the week to be used by the price predictor filter. Simulation results show that the combination of the proposed energy consumption scheduling design and the price predictor filter leads to significant reductions not only in users' payments but also in the resulting peak-to-average ratio in load demand for various load scenarios. Therefore, the deployment of the proposed optimal energy consumption scheduling schemes is beneficial for both end users and utility companies.
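The weighted-average price predictor described above can be sketched in a few lines. The coefficient values and the toy price history below are illustrative placeholders, not the coefficients actually fitted to the Illinois Power Company data:

```python
import numpy as np

def predict_prices(history, weights):
    """Predict today's 24 hourly prices as a weighted average of the
    same hours on the K most recent days.  `history` is (days, 24),
    newest day last; `weights` has length K and should sum to 1."""
    w = np.asarray(weights, dtype=float)
    recent = history[-len(w):][::-1]   # most recent day first
    return w @ recent                  # shape (24,)

# illustrative: three days of flat prices, heaviest weight on yesterday
history = np.array([[3.0] * 24, [4.0] * 24, [5.0] * 24])
weights = [0.6, 0.3, 0.1]             # yesterday, 2 days ago, 3 days ago
p_hat = predict_prices(history, weights)
```

In the paper's setting, a separate coefficient vector would be fitted for each day of the week and then reused by the scheduler whenever the utility has not yet announced prices.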

1,782 citations

01 Jan 2000

TL;DR: This book is designed for introductory one-semester or one-year courses in communications networks in upper-level undergraduate programs and assumes a general knowledge of computer systems and programming, and elementary calculus.

Abstract: This book is designed for introductory one-semester or one-year courses in communications networks in upper-level undergraduate programs. The second half of the book can be used in more advanced courses. As prerequisites, the book assumes a general knowledge of computer systems and programming, and elementary calculus. The second edition builds on the success of the first edition by updating coverage of technological changes in networks and responding to comprehensive market feedback.
Table of contents
1 Communication Networks and Services
2 Layered Architectures
3 Digital Transmission Fundamentals
4 Circuit-Switching Networks
5 Peer-to-Peer Protocols and Data Link Layer
6 Medium Access Control Protocols and Local Area Networks
7 Packet-Switching Networks
8 TCP/IP
9 ATM Networks
10 Advanced Network Architectures
11 Security Protocols
12 Multimedia Information and Networking
Appendix A Delay and Loss Performance
Appendix B Network Management

824 citations

TL;DR: A simple alternative method to estimate the shape parameter for the generalized Gaussian PDF is proposed that significantly reduces the number of computations by eliminating the need for any statistical goodness-of-fit test.

Abstract: A subband decomposition scheme for video signals, in which the original or difference frames are each decomposed into 16 equal-size frequency subbands, is considered. Westerink et al. (1991) have shown that the distribution of the sample values in each subband can be modeled with a "generalized Gaussian" probability density function (PDF), where three parameters (mean, variance, and shape) are required to uniquely determine the PDF. To estimate the shape parameter, a series of statistical goodness-of-fit tests, such as the Kolmogorov-Smirnov or chi-squared tests, has previously been used. A simple alternative method to estimate the shape parameter of the generalized Gaussian PDF is proposed that significantly reduces the number of computations by eliminating the need for any statistical goodness-of-fit test.
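One moment-based route of this kind (a sketch in the spirit of the paper, not necessarily its exact estimator) matches the ratio (E|X|)^2 / E[X^2], which for a zero-mean generalized Gaussian with shape alpha equals Gamma(2/alpha)^2 / (Gamma(1/alpha) Gamma(3/alpha)); inverting that monotone function numerically replaces any goodness-of-fit test:

```python
import math
import random

def ggd_ratio(alpha):
    """r(alpha) = (E|X|)^2 / E[X^2] for a zero-mean generalized Gaussian;
    monotonically increasing in alpha (r(2) = 2/pi for the Gaussian case)."""
    return math.gamma(2 / alpha) ** 2 / (math.gamma(1 / alpha) * math.gamma(3 / alpha))

def estimate_shape(samples, lo=0.1, hi=10.0, iters=80):
    """Moment-matching shape estimate: invert r(alpha) by bisection."""
    m = sum(abs(x) for x in samples) / len(samples)       # sample E|X|
    s2 = sum(x * x for x in samples) / len(samples)       # sample E[X^2]
    target = m * m / s2
    for _ in range(iters):
        mid = (lo + hi) / 2
        if ggd_ratio(mid) < target:   # need a larger alpha
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

random.seed(1)
gauss = [random.gauss(0.0, 1.0) for _ in range(50000)]
alpha_hat = estimate_shape(gauss)   # should land near 2 for Gaussian data
```

The whole estimate costs two sample moments plus a short bisection, which is the kind of saving over repeated Kolmogorov-Smirnov or chi-squared fitting that the abstract refers to.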

565 citations

01 Jan 2008

TL;DR: This textbook on probability models for electrical and computer engineering develops an axiomatic approach to probability theory and applies it to random variables, random processes, Markov chains, and queueing systems.

Abstract: Table of contents:
1. Probability Models in Electrical and Computer Engineering. Mathematical models as tools in analysis and design. Deterministic models. Probability models. Statistical regularity. Properties of relative frequency. The axiomatic approach to a theory of probability. Building a probability model. A detailed example: a packet voice transmission system. Other examples. Communication over unreliable channels. Processing of random signals. Resource sharing systems. Reliability of systems. Overview of book. Summary. Problems.
2. Basic Concepts of Probability Theory. Specifying random experiments. The sample space. Events. Set operations. The axioms of probability. Discrete sample spaces. Continuous sample spaces. Computing probabilities using counting methods. Sampling with replacement and with ordering. Sampling without replacement and with ordering. Permutations of n distinct objects. Sampling without replacement and without ordering. Sampling with replacement and without ordering. Conditional probability. Bayes' Rule. Independence of events. Sequential experiments. Sequences of independent experiments. The binomial probability law. The multinomial probability law. The geometric probability law. Sequences of dependent experiments. A computer method for synthesizing randomness: random number generators. Summary. Problems.
3. Random Variables. The notion of a random variable. The cumulative distribution function. The three types of random variables. The probability density function. Conditional cdf's and pdf's. Some important random variables. Discrete random variables. Continuous random variables. Functions of a random variable. The expected value of random variables. The expected value of X. The expected value of Y = g(X). Variance of X. The Markov and Chebyshev inequalities. Testing the fit of a distribution to data. Transform methods. The characteristic function. The probability generating function. The Laplace transform of the pdf. Basic reliability calculations. The failure rate function. Reliability of systems. Computer methods for generating random variables. The transformation method. The rejection method. Generation of functions of a random variable. Generating mixtures of random variables. Entropy. The entropy of a random variable. Entropy as a measure of information. The method of maximum entropy. Summary. Problems.
4. Multiple Random Variables. Vector random variables. Events and probabilities. Independence. Pairs of random variables. Pairs of discrete random variables. The joint cdf of X and Y. The joint pdf of two jointly continuous random variables. Random variables that differ in type. Independence of two random variables. Conditional probability and conditional expectation. Conditional probability. Conditional expectation. Multiple random variables. Joint distributions. Independence. Functions of several random variables. One function of several random variables. Transformation of random vectors. pdf of linear transformations. pdf of general transformations. Expected value of functions of random variables. The correlation and covariance of two random variables. Joint characteristic function. Jointly Gaussian random variables. n jointly Gaussian random variables. Linear transformation of Gaussian random variables. Joint characteristic function of Gaussian random variables. Mean square estimation. Linear prediction. Generating correlated vector random variables. Generating vectors of random variables with specified covariances. Generating vectors of jointly Gaussian random variables. Summary. Problems.
5. Sums of Random Variables and Long-Term Averages. Sums of random variables. Mean and variance of sums of random variables. pdf of sums of independent random variables. Sum of a random number of random variables. The sample mean and the laws of large numbers. The central limit theorem. Gaussian approximation for binomial probabilities. Proof of the central limit theorem. Confidence intervals. Case 1: Xj's Gaussian, unknown mean and known variance. Case 2: Xj's Gaussian, mean and variance unknown. Case 3: Xj's non-Gaussian, mean and variance unknown. Convergence of sequences of random variables. Long-term arrival rates and associated averages. Long-term time averages. A computer method for evaluating the distribution of a random variable using the discrete Fourier transform. Discrete random variables. Continuous random variables. Summary. Problems. Appendix: subroutine FFT(A,M,N).
6. Random Processes. Definition of a random process. Specifying a random process. Joint distributions of time samples. The mean, autocorrelation, and autocovariance functions. Gaussian random processes. Multiple random processes. Examples of discrete-time random processes. iid random processes. Sum processes: the binomial counting and random walk processes. Examples of continuous-time random processes. Poisson process. Random telegraph signal and other processes derived from the Poisson process. Wiener process and Brownian motion. Stationary random processes. Wide-sense stationary random processes. Wide-sense stationary Gaussian random processes. Cyclostationary random processes. Continuity, derivatives, and integrals of random processes. Mean square continuity. Mean square derivatives. Mean square integrals. Response of a linear system to random input. Time averages of random processes and ergodic theorems. Fourier series and Karhunen-Loeve expansion. Karhunen-Loeve expansion. Summary. Problems.
7. Analysis and Processing of Random Signals. Power spectral density. Continuous-time random processes. Discrete-time random processes. Power spectral density as a time average. Response of linear systems to random signals. Continuous-time systems. Discrete-time systems. Amplitude modulation by random signals. Optimum linear systems. The orthogonality condition. Prediction. Estimation using the entire realization of the observed process. Estimation using causal filters. The Kalman filter. Estimating the power spectral density. Variance of periodogram estimate. Smoothing of periodogram estimate. Summary. Problems.
8. Markov Chains. Markov processes. Discrete-time Markov chains. The n-step transition probabilities. The state probabilities. Steady state probabilities. Continuous-time Markov chains. State occupancy times. Transition rates and time-dependent state probabilities. Steady state probabilities and global balance equations. Classes of states, recurrence properties, and limiting probabilities. Classes of states. Recurrence properties. Limiting probabilities. Limiting probabilities for continuous-time Markov chains. Time-reversed Markov chains. Time-reversible Markov chains. Time-reversible continuous-time Markov chains. Summary. Problems.
9. Introduction to Queueing Theory. The elements of a queueing system. Little's formula. The M/M/1 queue. Distribution of number in the system. Delay distribution in the M/M/1 system and arriving customer's distribution. The M/M/1 system with finite capacity. Multi-server systems: M/M/c, M/M/c/c, and M/M/infinity. Distribution of number in the M/M/c system. Waiting time distribution for M/M/c. The M/M/c/c queueing system. The M/M/infinity queueing system. Finite-source queueing systems. Arriving customer's distribution. M/G/1 queueing systems. The residual service time. Mean delay in M/G/1 systems. Mean delay in M/G/1 systems with priority service discipline. M/G/1 analysis using embedded Markov chains. The embedded Markov chains. The number of customers in an M/G/1 system. Delay and waiting time distribution in an M/G/1 system. Burke's theorem: departures from M/M/c systems. Proof of Burke's theorem using time reversibility. Networks of queues: Jackson's theorem. Open networks of queues. Proof of Jackson's theorem. Closed networks of queues. Mean value analysis. Proof of the arrival theorem. Summary. Problems.
Appendix A. Mathematical Tables. Appendix B. Tables of Fourier Transforms. Appendix C. Computer Programs for Generating Random Variables. Answers to Selected Problems. Index.
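The M/M/1 results at the center of the queueing chapter are easy to check numerically; a minimal sketch of the standard closed forms (utilization, mean number in system, and mean delay via Little's formula):

```python
def mm1_stats(lam, mu):
    """Steady-state M/M/1 quantities (requires lam < mu for stability)."""
    assert lam < mu, "unstable unless the arrival rate is below the service rate"
    rho = lam / mu                 # utilization
    mean_n = rho / (1 - rho)       # mean number in the system
    mean_t = mean_n / lam          # mean delay, via Little's formula L = lambda * W
    return rho, mean_n, mean_t

rho, n, t = mm1_stats(lam=3.0, mu=4.0)
# rho = 0.75, mean number = 3, mean delay = 1/(mu - lam) = 1.0
```

Note that the two routes to the mean delay agree: mean_n / lam from Little's formula equals the direct M/M/1 result 1 / (mu - lam).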

556 citations

##### Cited by


TL;DR: Some of the major results in random graphs and some of the more challenging open problems are reviewed, including those related to the WWW.

Abstract: We will review some of the major results in random graphs and some of the more challenging open problems. We will cover algorithmic and structural questions. We will touch on newer models, including those related to the WWW.

7,116 citations

TL;DR: Despite its simplicity, BRISQUE is shown to be statistically better than the full-reference peak signal-to-noise ratio and the structural similarity index, and highly competitive with all present-day distortion-generic NR IQA algorithms.

Abstract: We propose a natural scene statistic-based distortion-generic blind/no-reference (NR) image quality assessment (IQA) model that operates in the spatial domain. The new model, dubbed blind/referenceless image spatial quality evaluator (BRISQUE) does not compute distortion-specific features, such as ringing, blur, or blocking, but instead uses scene statistics of locally normalized luminance coefficients to quantify possible losses of “naturalness” in the image due to the presence of distortions, thereby leading to a holistic measure of quality. The underlying features used derive from the empirical distribution of locally normalized luminances and products of locally normalized luminances under a spatial natural scene statistic model. No transformation to another coordinate frame (DCT, wavelet, etc.) is required, distinguishing it from prior NR IQA approaches. Despite its simplicity, we are able to show that BRISQUE is statistically better than the full-reference peak signal-to-noise ratio and the structural similarity index, and is highly competitive with respect to all present-day distortion-generic NR IQA algorithms. BRISQUE has very low computational complexity, making it well suited for real time applications. BRISQUE features may be used for distortion-identification as well. To illustrate a new practical application of BRISQUE, we describe how a nonblind image denoising algorithm can be augmented with BRISQUE in order to perform blind image denoising. Results show that BRISQUE augmentation leads to performance improvements over state-of-the-art methods. A software release of BRISQUE is available online: http://live.ece.utexas.edu/research/quality/BRISQUE_release.zip for public use and evaluation.
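The locally normalized luminance coefficients that BRISQUE builds on (often called MSCN coefficients, (I - mu) / (sigma + C) with a local mean and standard deviation) can be sketched as follows. This is a simplified illustration: the published method uses a Gaussian-weighted local window, whereas the plain box filter below is an assumption made for brevity:

```python
import numpy as np

def local_mean(img, k=3):
    """k x k box average via padded shifts (Gaussian weighting in the
    real BRISQUE pipeline; a box filter here for brevity)."""
    p = k // 2
    padded = np.pad(img, p, mode="reflect")
    acc = np.zeros_like(img, dtype=float)
    for dy in range(k):
        for dx in range(k):
            acc += padded[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return acc / (k * k)

def mscn(img, c=1.0):
    """Mean-subtracted contrast-normalized coefficients, (I - mu) / (sigma + c)."""
    img = img.astype(float)
    mu = local_mean(img)
    sigma = np.sqrt(np.maximum(local_mean(img * img) - mu * mu, 0.0))
    return (img - mu) / (sigma + c)

rng = np.random.default_rng(0)
coeffs = mscn(rng.uniform(0, 255, size=(64, 64)))
```

For natural images these coefficients are approximately zero-mean and unit-variance-like, and BRISQUE's features come from fitting generalized Gaussian models to them and to products of neighboring coefficients.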

3,780 citations

TL;DR: Presents a blind IQA model that makes use only of measurable deviations from statistical regularities observed in natural images, without training on human-rated distorted images and, indeed, without any exposure to distorted images.

Abstract: An important aim of research on the blind image quality assessment (IQA) problem is to devise perceptual models that can predict the quality of distorted images with as little prior knowledge of the images or their distortions as possible. Current state-of-the-art “general purpose” no reference (NR) IQA algorithms require knowledge about anticipated distortions in the form of training examples and corresponding human opinion scores. However, we have recently derived a blind IQA model that only makes use of measurable deviations from statistical regularities observed in natural images, without training on human-rated distorted images and, indeed, without any exposure to distorted images. Thus, it is “completely blind.” The new IQA model, which we call the Natural Image Quality Evaluator (NIQE), is based on the construction of a “quality aware” collection of statistical features based on a simple and successful space domain natural scene statistic (NSS) model. These features are derived from a corpus of natural, undistorted images. Experimental results show that the new index delivers performance comparable to top performing NR IQA models that require training on large databases of human opinions of distorted images. A software release is available at http://live.ece.utexas.edu/research/quality/niqe_release.zip.
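At prediction time, NIQE compares two multivariate Gaussian (MVG) fits, one from the corpus of pristine images and one from the test image's NSS features, using a Mahalanobis-like distance with the pooled covariance. A minimal sketch of that distance (feature extraction omitted; the toy inputs are illustrative):

```python
import numpy as np

def niqe_distance(mu1, cov1, mu2, cov2):
    """sqrt((mu1 - mu2)^T ((cov1 + cov2) / 2)^-1 (mu1 - mu2)):
    distance between two multivariate Gaussian fits."""
    d = np.asarray(mu1) - np.asarray(mu2)
    pooled = (np.asarray(cov1) + np.asarray(cov2)) / 2.0
    return float(np.sqrt(d @ np.linalg.pinv(pooled) @ d))

# identical fits give zero distance; a shifted mean gives a positive one
same = niqe_distance([0, 0], np.eye(2), [0, 0], np.eye(2))
shifted = niqe_distance([0, 0], np.eye(2), [1, 0], np.eye(2))
```

A larger distance from the pristine-corpus fit means a larger measured deviation from natural-image statistics, i.e. worse predicted quality, which is what makes the index usable with no training on distorted images.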

3,722 citations

TL;DR: The kinds of things that social scientists have tried to explain using social network analysis are reviewed and a nutshell description of the basic assumptions, goals, and explanatory mechanisms prevalent in the field is provided.

Abstract: Over the past decade, there has been an explosion of interest in network research across the physical and social sciences. For social scientists, the theory of networks has been a gold mine, yielding explanations for social phenomena in a wide variety of disciplines from psychology to economics. Here, we review the kinds of things that social scientists have tried to explain using social network analysis and provide a nutshell description of the basic assumptions, goals, and explanatory mechanisms prevalent in the field. We hope to contribute to a dialogue among researchers from across the physical and social sciences who share a common interest in understanding the antecedents and consequences of network phenomena.

3,423 citations