
Showing papers by "Robert J. Thomas published in 2010"


Proceedings ArticleDOI
04 Nov 2010
TL;DR: The problem of constructing malicious data attacks on smart grid state estimation is considered together with countermeasures that detect the presence of such attacks; for the adversary, an efficient polynomial-time algorithm finds the minimum-size unobservable attacks.
Abstract: The problem of constructing malicious data attacks on smart grid state estimation is considered together with countermeasures that detect the presence of such attacks. For the adversary, using a graph-theoretic approach, an efficient algorithm with polynomial-time complexity is obtained to find the minimum-size unobservable malicious data attacks. When an unobservable attack does not exist due to restrictions on meter access, attacks are constructed to minimize the residue energy of the attack while guaranteeing a certain increase in mean square error. For the control center, a computationally efficient algorithm is derived to detect and localize attacks using the generalized likelihood ratio test regularized by an L_1 norm penalty on the strength of the attack.
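The L1-regularized likelihood-ratio idea described above can be imitated on a toy linear (DC) measurement model. The alternating-minimization solver, penalty weight, attack locations, and problem sizes below are illustrative assumptions, not the paper's algorithm:

```python
import numpy as np

rng = np.random.default_rng(0)

def soft_threshold(v, t):
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def glr_l1(z, H, lam, iters=50):
    """GLR-style statistic with an L1 penalty on the attack vector a:
    minimize ||z - H x - a||^2 + lam * ||a||_1 by alternating steps."""
    Hp = np.linalg.pinv(H)
    a = np.zeros_like(z)
    for _ in range(iters):
        x = Hp @ (z - a)                          # x-step: least squares
        a = soft_threshold(z - H @ x, lam / 2.0)  # a-step: soft threshold
    x0 = Hp @ z                                   # null model: no attack
    return np.sum((z - H @ x0) ** 2) - np.sum((z - H @ x - a) ** 2), a

# toy linear (DC) measurement model z = H x + a + noise
n_meas, n_state = 30, 8
H = rng.normal(size=(n_meas, n_state))
x_true = rng.normal(size=n_state)
noise = 0.1 * rng.normal(size=n_meas)
a_true = np.zeros(n_meas)
a_true[[3, 11, 20]] = 2.0                         # sparse malicious injection

T_clean, _ = glr_l1(H @ x_true + noise, H, lam=0.5)
T_attack, a_hat = glr_l1(H @ x_true + a_true + noise, H, lam=0.5)
print(T_attack > T_clean)                         # statistic flags the attack
```

The a-step can only shrink the residual, so the statistic is nonnegative and grows when a sparse injection leaves energy the state estimate cannot absorb.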

353 citations


Journal ArticleDOI
TL;DR: An algorithm is proposed that generates random-topology power grids featuring the same topological and electrical characteristics found in the real data.
Abstract: In order to design an efficient communication scheme and examine the efficiency of any networked control architecture in smart grid applications, we need to characterize statistically its information source, namely the power grid itself. Investigating the statistical properties of power grids has the immediate benefit of providing a natural simulation platform, producing a large number of power grid test cases with realistic topologies, with scalable network size, and with realistic electrical parameter settings. The second benefit is that one can start analyzing the performance of decentralized control algorithms over information networks whose topology matches that of the underlying power network and use network scientific approaches to determine analytically if these architectures would scale well. With these motivations, in this paper we study both the topological and electrical characteristics of power grid networks based on a number of synthetic and real-world power systems. The most interesting discoveries include: the power grid is sparsely connected with obvious small-world properties; its nodal degree distribution can be well fitted by a mixture distribution coming from the sum of a truncated geometric random variable and an irregular discrete random variable; the power grid has a very distinctive graph spectral density, and its algebraic connectivity scales as a power function of the network size; the line impedance has a heavy-tailed distribution, which can be captured quite accurately by a clipped double Pareto-lognormal distribution. Based on the discoveries mentioned above, we propose an algorithm that generates random-topology power grids featuring the same topological and electrical characteristics found in the real data.
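The degree-distribution finding can be turned into a small sampling routine. The geometric parameter, truncation point, and irregular pmf below are placeholders, not the values fitted from the real data:

```python
import numpy as np

rng = np.random.default_rng(1)

def sample_degrees(n, p=0.4, k_max=10, discrete_pmf=(0.5, 0.3, 0.15, 0.05)):
    """Sample node degrees as the sum of a truncated geometric variable and
    an irregular discrete variable (all parameters here are illustrative)."""
    k = np.arange(1, k_max + 1)
    geo = (1 - p) ** (k - 1) * p
    geo /= geo.sum()                            # truncate and renormalize
    g = rng.choice(k, size=n, p=geo)
    pmf = np.asarray(discrete_pmf)
    d = rng.choice(np.arange(len(pmf)), size=n, p=pmf)
    return g + d

degrees = sample_degrees(5000)
print(round(degrees.mean(), 2))                 # sparse, grid-like mean degree
```

With these placeholder parameters the mean degree comes out near 3, in the sparse range typical of transmission grids.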

271 citations


Proceedings ArticleDOI
01 Dec 2010
TL;DR: New measures of centrality for power grid structure are defined based on its functionality, showing that relative-importance analysis based on graph-theoretic centrality can be generalized to a power grid network with its electrical parameters taken into account.
Abstract: Centrality measures are used in network science to rank the relative importance of nodes and edges of a graph. Here we define new measures of centrality for power grid structure that are based on its functionality. We show that the relative importance analysis based on centrality in graph theory can be generalized to a power grid network with its electrical parameters taken into account. In the paper we experiment with the proposed electrical centrality measures on the NYISO-2935 system and the IEEE 300-bus system. We analyze the centrality distribution in order to identify the nodes and branches that are of essential importance in terms of system vulnerability. We also present and discuss a number of interesting discoveries regarding the importance rank of power grid nodes and branches.
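One way to make "centrality with electrical parameters" concrete is to replace hop distance with effective resistance computed from the weighted Laplacian (the B-matrix). This closeness-style measure is an illustration of the general idea, not necessarily the exact definition used in the paper:

```python
import numpy as np

def electrical_closeness(B, y):
    """Closeness-style centrality using effective resistance as the
    electrical distance. B: (lines x buses) incidence matrix,
    y: line admittances."""
    n = B.shape[1]
    L = B.T @ np.diag(y) @ B               # weighted Laplacian (B-matrix)
    Lp = np.linalg.pinv(L)
    d = np.diag(Lp)
    R = d[:, None] + d[None, :] - 2 * Lp   # effective-resistance matrix
    return (n - 1) / R.sum(axis=1)         # higher value = more central

# 4-bus example: a ring with one strong chord between buses 0 and 2
lines = [(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)]
y = np.array([1.0, 1.0, 1.0, 1.0, 5.0])   # chord 0-2 has high admittance
B = np.zeros((len(lines), 4))
for k, (i, j) in enumerate(lines):
    B[k, i], B[k, j] = 1.0, -1.0
c = electrical_closeness(B, y)
print(c.argmax())                          # a bus on the strong chord
```

Purely topological closeness would treat all four ring buses symmetrically; the admittance weighting is what pulls buses 0 and 2 ahead.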

171 citations


Proceedings ArticleDOI
17 Mar 2010
TL;DR: An easily computable heuristic is developed to find bad adversarial attacks in all cases, and a new L∞ norm detector is introduced that outperforms more standard L2 norm based detectors by taking advantage of the inherent sparsity of the false data injection.
Abstract: Malicious attacks against power system state estimation are considered. It has recently been observed that if an adversary is able to manipulate the measurements taken at several meters in a power system, it can sometimes change the state estimate at the control center in a way that will never be detected by classical bad data detectors. However, in cases where the adversary is not able to perform this attack, it was not clear what attacks might look like. An easily computable heuristic is developed to find bad adversarial attacks in all cases. This heuristic recovers the undetectable attacks, and it also finds the most damaging attack in all cases. In addition, a Bayesian formulation of the bad data problem is introduced, which captures the prior information that a control center has about the likely state of the power system. This formulation softens the impact of undetectable attacks. Finally, a new L∞ norm detector is introduced, and it is demonstrated to outperform more standard L2 norm based detectors by taking advantage of the inherent sparsity of the false data injection.
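A toy comparison of the two detector families can be set up by calibrating both to the same false-alarm rate via Monte Carlo. The model sizes, noise level, and attack magnitude below are illustrative, not the paper's experimental setup:

```python
import numpy as np

rng = np.random.default_rng(2)

def residual(z, H):
    """Measurement residual after least-squares state estimation."""
    x_hat = np.linalg.lstsq(H, z, rcond=None)[0]
    return z - H @ x_hat

# toy setup: 40 meters, 10 states, one corrupted meter
H = rng.normal(size=(40, 10))
x = rng.normal(size=10)
sigma = 0.1
attacked = H @ x + sigma * rng.normal(size=40)
attacked[7] += 1.0                         # sparse false data injection

# calibrate both detectors to a 5% false-alarm rate by Monte Carlo
s_inf, s_2 = [], []
for _ in range(2000):
    r = residual(H @ x + sigma * rng.normal(size=40), H)
    s_inf.append(np.abs(r).max())          # L-infinity statistic
    s_2.append(np.linalg.norm(r))          # L2 statistic
t_inf, t_2 = np.quantile(s_inf, 0.95), np.quantile(s_2, 0.95)

r_att = residual(attacked, H)
print(np.abs(r_att).max() > t_inf, np.linalg.norm(r_att) > t_2)
```

A single corrupted meter concentrates its residual energy in one coordinate, which is exactly the regime where the max-based statistic has the larger margin over its threshold.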

151 citations


Proceedings Article
03 Dec 2010
TL;DR: The problem of detecting and characterizing impacts of malicious attacks against smart grid state estimation is considered and a Bayesian framework is presented for the characterization of fundamental tradeoffs at the control center and for the adversary.
Abstract: The problem of detecting and characterizing the impacts of malicious attacks against smart grid state estimation is considered. Unlike classical bad data detection for state estimation, the detection of malicious data injected by an adversary must take into account carefully designed attacks capable of evading conventional bad data detection. A Bayesian framework is presented for the characterization of fundamental tradeoffs at the control center and for the adversary. For the control center, a detector based on the generalized likelihood ratio test (GLRT) is introduced and compared with conventional bad data detection schemes. For the adversary, the tradeoff between increasing the mean square error (MSE) of the state estimation and the probability of being detected by the control center is characterized. A heuristic is presented for the design of the worst attack.

87 citations


Proceedings ArticleDOI
23 May 2010
TL;DR: This paper numerically study the topology robustness of power grids under random and selective node breakdowns, and analytically estimate the critical node-removal thresholds to disintegrate a system, based on the available US power grid data.
Abstract: In this paper we numerically study the topology robustness of power grids under random and selective node breakdowns, and analytically estimate the critical node-removal thresholds needed to disintegrate a system, based on the available US power grid data. We also present an analysis of the node degree distribution in power grids because it closely relates to topology robustness. It is found that the node degree in a power grid can be well fitted by a mixture distribution coming from the sum of a truncated geometric random variable and an irregular discrete random variable. With these findings we obtain better estimates of the threshold under selective node breakdowns, which predict the numerical thresholds more accurately.
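The random-versus-selective breakdown experiment can be sketched on a synthetic sparse graph, a crude stand-in for the US grid data used in the paper:

```python
import numpy as np
from collections import deque

rng = np.random.default_rng(3)

def giant_component(adj, alive):
    """Size of the largest connected component among surviving nodes (BFS)."""
    seen, best = set(), 0
    for s in range(len(adj)):
        if alive[s] and s not in seen:
            queue, size = deque([s]), 0
            seen.add(s)
            while queue:
                u = queue.popleft()
                size += 1
                for v in adj[u]:
                    if alive[v] and v not in seen:
                        seen.add(v)
                        queue.append(v)
            best = max(best, size)
    return best

# sparse random graph (mean degree ~3) as a stand-in for a grid topology
n, m = 300, 450
edges = set()
while len(edges) < m:
    i, j = rng.integers(0, n, size=2)
    if i != j:
        edges.add((min(i, j), max(i, j)))
adj = [[] for _ in range(n)]
for i, j in edges:
    adj[i].append(j)
    adj[j].append(i)

def breakdown(frac, selective):
    """Remove a fraction of nodes, randomly or by descending degree."""
    alive = np.ones(n, dtype=bool)
    deg = np.array([len(nbrs) for nbrs in adj])
    order = np.argsort(-deg) if selective else rng.permutation(n)
    alive[order[: int(frac * n)]] = False
    return giant_component(adj, alive)

gc_selective = breakdown(0.25, True)
gc_random = breakdown(0.25, False)
print(gc_selective, gc_random)   # selective breakdowns fragment the network more
```

Sweeping the removal fraction and recording where the giant component collapses gives a numerical estimate of the critical thresholds discussed above.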

58 citations


Proceedings ArticleDOI
04 Nov 2010
TL;DR: The main finding is that typical grid admittance matrices have singular values and vectors with only a small number of strong components, which can be exploited to construct an efficient decentralized system-wide monitoring and control architecture.
Abstract: In this paper we apply Singular Value Decomposition (SVD) analysis to examine the coupling structure of an electrical power grid in order to highlight opportunities for reducing network traffic, by identifying the salient data that need to be communicated between parts of the infrastructure to apply a control action. Our main finding is that typical grid admittance matrices have singular values and vectors with only a small number of strong components. This SVD sparsity can be exploited to construct an efficient decentralized system-wide monitoring and control architecture. We also discuss potential applications of the proposed architecture and its robustness under contingency, and we test the SVD analysis on the NYISO-2935 system and the IEEE-300 system.
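The qualitative finding (singular vectors with only a few strong components) can be illustrated on a synthetic admittance-like matrix with heavy-tailed line admittances; the graph and the weight distribution are assumptions, not the NYISO or IEEE data:

```python
import numpy as np

rng = np.random.default_rng(4)

def localization(v, k):
    """Fraction of a vector's energy in its k largest-magnitude entries."""
    e = np.sort(v ** 2)[::-1]
    return e[:k].sum() / e.sum()

# toy admittance-like matrix: weighted Laplacian of a sparse random graph
# with heavy-tailed (lognormal) line admittances
n = 60
A = np.zeros((n, n))
for _ in range(90):
    i, j = rng.integers(0, n, size=2)
    if i != j:
        w = rng.lognormal(0.0, 1.5)
        A[i, j] -= w
        A[j, i] -= w
        A[i, i] += w
        A[j, j] += w

U, s, Vt = np.linalg.svd(A)
top5 = localization(U[:, 0], 5)   # leading singular vector
print(round(top5, 2))
```

A delocalized unit vector over 60 buses would carry only about 8% of its energy in its top 5 entries; with heavy-tailed admittances the leading singular vector concentrates far more than that, which is the kind of sparsity a decentralized architecture can exploit.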

28 citations


Journal ArticleDOI
01 May 2010
TL;DR: In this paper, an approach using nonlinear time series analysis is proposed and tested on actual offer behavior observed in the electricity markets in the United States.
Abstract: In electricity markets where supply and demand drive the price for the purchase and sale of electricity, generating firms change capacity for various reasons, including load level, policy, and varying market conditions. These fluctuating production patterns can reduce market efficiency. In an inefficient market, where the price for electricity exceeds marginal cost, the locational marginal price (LMP) is often used to measure market efficiency. Stochastically driven changes in the market are captured by this approach; however, these random changes (frequently observed in efficient markets as well) do not affect market efficiency in the long run. Conversely, a slow, consistent change is not captured by the snapshot approach and affects efficiency significantly. Therefore, it is necessary to construct an algorithm that captures only the consistent changes that truly affect market efficiency. Fractal analysis can characterize price behavior in electricity markets because prices exhibit self-similarity. Once a system undergoes a change, the fractal dimension of the system reflects the change. In this paper, an approach using nonlinear time series analysis is proposed and tested on actual offer behavior observed in electricity markets in the United States.
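One standard fractal estimator of the kind such an approach could rest on is the aggregated-variance estimate of the Hurst exponent. The synthetic white-noise series below (for which H should come out near 0.5) merely stands in for real price increments:

```python
import numpy as np

rng = np.random.default_rng(5)

def hurst_aggvar(x, scales=(2, 4, 8, 16, 32)):
    """Hurst-exponent estimate via the aggregated-variance method:
    for a self-similar series, the variance of block means scales
    as m^(2H - 2) in the block size m."""
    logm, logv = [], []
    for m in scales:
        k = len(x) // m
        blocks = x[: k * m].reshape(k, m).mean(axis=1)
        logm.append(np.log(m))
        logv.append(np.log(blocks.var()))
    slope = np.polyfit(logm, logv, 1)[0]
    return 1.0 + slope / 2.0

# synthetic stand-in for price increments: uncorrelated noise, H ~ 0.5
increments = rng.normal(size=20000)
H = hurst_aggvar(increments)
print(round(H, 2))
```

A persistent drift in offer behavior would push the estimate above 0.5, which is the sort of slow, consistent change the snapshot approach misses.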

7 citations


Proceedings Article
03 Dec 2010
TL;DR: In this paper, the authors investigated the impact of variability from a stochastic generation resource on the optimal hour-to-hour dispatch of generating units and the corresponding operating costs and wholesale prices.
Abstract: The distribution of stochastic generation from renewables across different geographical locations can, in certain cases, help to mitigate the inherent variability in output. This variability of generation from renewables may 1) increase the operating costs of the conventional generators used to follow the net load not supplied by stochastic capacity and 2) increase the amount of reserve conventional generating capacity needed to maintain Operating Reliability. In this scenario, customers have lower wholesale prices, due to reductions in the total annual generation from fossil fuels, while generators face higher operating costs for conventional generators caused by additional ramping that partly offset the customer benefits. However, the lower wholesale prices ($/MWh) imply lower annual earnings for conventional generators that lead to higher amounts of missing money ($/MW) needed to maintain the financial adequacy of installed generating units. The objective of this paper is to determine how variability from a stochastic generation resource affects the optimal hour-to-hour dispatch of generating units and the corresponding operating costs and wholesale prices. The results show that the inclusion of ramping costs for conventional generation affects the amount of energy dispatched from the stochastic generator and the total cost composition observed in the system. The Cornell SuperOPF is used to illustrate how the operating costs and wholesale prices can be determined for a reliable network (the amount of conventional generating capacity needed to maintain Operating Reliability is determined endogenously). The results in this paper use a typical daily pattern of load and capture the cost of ramping by including additions to the operating costs of the generating units associated with the hour-to-hour changes in their optimal dispatch.
The calculations for determining endogenous up and down reserves are included, and the wind generation cost is assumed to be zero. Additionally, the maximum and minimum available capacities for all hours in the day are constrained to the optimal capacities for the hours with the highest and the lowest loads. Different scenarios are evaluated for a given hourly realization of wind speeds using specified amounts of installed wind capacity with and without ramping costs. The analysis also evaluates the effects of eliminating network constraints, as well as the elimination of wind variability by accounting for the effects of spatial aggregation of different wind locations.
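A miniature two-generator, two-hour dispatch makes the ramping-cost effect concrete. All prices, capacities, and loads below are invented, and the brute-force search merely stands in for the SuperOPF optimization:

```python
import numpy as np

# two conventional generators following a two-hour net load (load minus
# wind output), with generator-specific ramping costs; numbers illustrative
fuel = np.array([20.0, 35.0])       # $/MWh: gen 1 cheap, gen 2 expensive
ramp = np.array([50.0, 5.0])        # $/MW of hour-to-hour output change
cap = np.array([80.0, 100.0])       # MW capacity limits
net_load = np.array([60.0, 140.0])  # MW to be served in hours 1 and 2

def dispatch_cost(g1, ramp_c=ramp):
    """Total cost of a gen-1 schedule; gen 2 balances the system."""
    g1 = np.asarray(g1, dtype=float)
    g2 = net_load - g1
    if (g1 < 0).any() or (g2 < 0).any() or (g1 > cap[0]).any() or (g2 > cap[1]).any():
        return np.inf
    fuel_cost = fuel[0] * g1.sum() + fuel[1] * g2.sum()
    ramp_cost = ramp_c[0] * abs(g1[1] - g1[0]) + ramp_c[1] * abs(g2[1] - g2[0])
    return fuel_cost + ramp_cost

# brute-force search over gen-1 schedules on a 1 MW grid
grid = np.arange(0.0, cap[0] + 1.0)
schedules = [(a, b) for a in grid for b in grid]
best_with = min(schedules, key=dispatch_cost)
best_without = min(schedules, key=lambda g: dispatch_cost(g, np.zeros(2)))
print(best_with, best_without)
```

Without ramping costs the cheap unit is pushed as high as possible each hour; once its ramping is expensive, the optimum holds it flat and lets the flexible unit follow the load swing, changing both the dispatched energy mix and the cost composition.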

6 citations


Journal ArticleDOI
TL;DR: In this paper, a method for estimating the M matrix by using publically available data is suggested, indicating that any market participant who does the computation will know when conditions permit them to lower or raise prices through decreased/increased bids and/or offers.
Abstract: The abuse of market power is a potentially serious problem for market designers. Few indices, if any, exist to measure the potential for market power in real time. In this regard, Murillo-Sanchez et al. derived an expression for a dispatch-to-price sensitivity matrix, M. The expression requires information about network topology and parameters, as well as the rules used to operate the market. While computing the matrix is conceptually easy for those with all the market and system information, such as an independent system operator, the method is probably impractical for market participants due to the inaccessibility of much of the information. In this paper, a method for estimating the M matrix using publicly available data is suggested, indicating that any market participant who does the computation will know when conditions permit them to lower or raise prices through decreased or increased bids and/or offers.

5 citations



Journal ArticleDOI
TL;DR: In this paper, the authors attempt to remedy the deficiencies of both the economists’ approach and the security-constrained optimization approach through a collaboration of economists and engineers to examine the theoretical properties of a networked power system that provides optimal resource allocation.
Abstract: The theory used by economists to support restructuring of the electric power industry has ignored several important technological constraints and public goods that affect the way in which power is delivered. Similarly, engineers, by using security-constrained optimization to incorporate the demand for reliability, have failed to properly define the economic problem. In this two-part paper we attempt to remedy the deficiencies of both the economists’ approach and the security-constrained optimization approach through a collaboration of economists and engineers to examine the theoretical properties of a networked power system that provides optimal resource allocation.

Journal ArticleDOI
TL;DR: In this article, the authors show mathematically that real and reactive power are private goods, in that power consumed by one customer cannot be used by another and customers can be excluded from receiving any power.
Abstract: Electric power is traditionally comprised of valued services, including real and reactive power, voltage, frequency and reliability in its most general sense. In this second part of our two-part paper we show mathematically that of these, only real and reactive power are purely private goods, in that power consumed by one customer cannot be used by another and customers can be excluded from receiving any power. The other ancillary services, including voltage, frequency and reliability are shown to be public goods. The first order conditions presented clearly illustrate that the public goods occurring in electric power systems comprise a significant problem for market design.