
Showing papers by "Robert J. Thomas" published in 2012


Proceedings ArticleDOI
04 Jan 2012
TL;DR: A stochastic Markov model, whose transition probabilities are derived from a stochastic model of flow redistribution, is proposed to capture the progression of cascading failures and their time span.
Abstract: The electric power grid is a complex critical infrastructure network. Its inter-connectivity enables long-distance transmission of power for more efficient system operation. The same inter-connectivity, however, also allows the propagation of disturbances. In fact, blackouts due to cascading failures occur because of the intrinsic electrical properties of this propagation and the physical mechanisms that are triggered by it. In this paper, we propose a stochastic Markov model, whose transition probabilities are derived from a stochastic model of the flow redistribution, that can potentially capture the progression of cascading failures and its time span. We suggest a metric that should be monitored to expose the risk of failure and the time margin that is left to perform corrective action. Finally, we experiment with the proposed stochastic model on the IEEE 300-bus system and provide numerical analysis.
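A minimal sketch of how such a Markov chain can be iterated to track blackout risk over time. The small transition matrix below is hypothetical; the paper derives its transition probabilities from a stochastic model of flow redistribution after each trip, which is not reproduced here.

```python
import numpy as np

# States count failed lines; the failure-state distribution evolves as a
# discrete-time Markov chain p_{k+1} = p_k P. The transition matrix below is
# hypothetical -- the paper derives its probabilities from a stochastic model
# of flow redistribution after each trip.
P = np.array([
    [0.90, 0.08, 0.02, 0.00],   # 0 lines failed
    [0.00, 0.70, 0.25, 0.05],   # 1 line failed
    [0.00, 0.00, 0.60, 0.40],   # 2 lines failed
    [0.00, 0.00, 0.00, 1.00],   # blackout (absorbing state)
])

p = np.array([1.0, 0.0, 0.0, 0.0])          # start with no failures
for k in range(1, 11):
    p = p @ P
    print(f"step {k}: P(blackout) = {p[-1]:.3f}")
```

Monitoring how quickly the absorbing-state probability grows gives exactly the kind of time-margin metric the abstract describes.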

102 citations


Posted Content
TL;DR: It is shown that the power system state space is partitioned into price regions that are convex polytopes, and the worst-case impacts of bad data on real-time LMP are analyzed under different bad data models.
Abstract: The problem of characterizing the impacts of data quality on real-time locational marginal price (LMP) is considered. Because the real-time LMP is computed from the estimated network topology and system state, bad data that cause errors in topology processing and state estimation affect the real-time LMP. It is shown that the power system state space is partitioned into price regions that are convex polytopes. Under different bad data models, the worst-case impacts of bad data on real-time LMP are analyzed. Numerical simulations are used to illustrate worst-case performance for the IEEE-14 and IEEE-118 networks.
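A toy illustration of the price-region idea, assuming a hypothetical two-bus system that is not from the paper: the state space splits into regions with constant LMPs, so bad data that pushes the estimated state across a region boundary changes the price discontinuously.

```python
# Two-bus illustration of price regions: the system state (here, the load at
# bus 2) is partitioned into regions, and within each region the LMPs are
# constant. Estimated-state errors that cross a region boundary change the
# real-time LMP discontinuously. All numbers are hypothetical.
C_CHEAP, C_EXPENSIVE = 20.0, 50.0   # $/MWh offers at bus 1 and bus 2
LINE_LIMIT = 100.0                  # MW limit on the single line

def lmp(load_bus2):
    if load_bus2 <= LINE_LIMIT:     # line uncongested: cheap unit is marginal
        return (C_CHEAP, C_CHEAP)
    return (C_CHEAP, C_EXPENSIVE)   # congested: bus-2 unit sets the bus-2 price

for load in (80.0, 99.0, 101.0):
    print(f"load {load} MW -> LMPs {lmp(load)} $/MWh")
```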

89 citations


Journal ArticleDOI
TL;DR: Digital Direct Load Scheduling is a direct load control mechanism in which individual requests for energy are unbundled and digitized so that they can be automatically scheduled in a cellular architecture.
Abstract: At present, the power grid has tight control over its dispatchable generation capacity but only very coarse control over demand. Energy consumers are shielded from making price-aware decisions, which degrades the efficiency of the market. This state of affairs tends to favor fossil fuel generation over renewable sources. Because of the technological difficulties of storing electric energy, the quest for mechanisms that would make the demand for electricity controllable on a day-to-day basis is gaining prominence. The goal of this paper is to provide one such mechanism, which we call Digital Direct Load Scheduling (DDLS). DDLS is a direct load control mechanism in which we unbundle individual requests for energy and digitize them so that they can be automatically scheduled in a cellular architecture. Specifically, rather than storing energy or interrupting the job of appliances, we choose to hold requests for energy in queues and optimize the service time of individual appliances belonging to a broad class which we refer to as "deferrable loads". The function of each neighborhood scheduler is to optimize the time at which these appliances start to function. This process is intended to shape the aggregate load profile of the neighborhood so as to optimize an objective function which incorporates the spot price of energy, and also allows distributed energy resources to supply part of the generation dynamically.
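A toy sketch of scheduling deferrable loads against a spot price, in the spirit of DDLS. The greedy per-request rule, the prices, and the (duration, deadline) request format are hypothetical simplifications; the paper's scheduler optimizes the aggregate neighborhood profile with queueing rather than each request in isolation.

```python
import numpy as np

# Each deferrable load submits a request (duration in hours, deadline hour);
# the scheduler picks a start time that minimizes its energy cost against the
# hourly spot price, subject to finishing by the deadline.
prices = np.array([30, 28, 25, 22, 20, 24, 35, 45])  # $/MWh over 8 hours
requests = [(2, 6), (3, 8), (1, 5)]                  # (duration, deadline)

for duration, deadline in requests:
    starts = range(0, deadline - duration + 1)       # feasible start hours
    cost = lambda s: prices[s:s + duration].sum()
    best = min(starts, key=cost)
    print(f"duration={duration}h, deadline=h{deadline}: "
          f"start at h{best}, cost {cost(best)}")
```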

76 citations


Proceedings ArticleDOI
04 Jan 2012
TL;DR: Impacts of malicious data attacks on the real-time electricity market are studied, and a geometric framework is introduced based on which upper and lower bounds on the optimal data attack are obtained and evaluated in simulations.
Abstract: Impacts of malicious data attacks on the real-time electricity market are studied. It is assumed that an adversary has access to a limited number of meters and has the ability to construct a data attack based on what it observes. Different observation models are considered. A geometric framework is introduced based on which upper and lower bounds on the optimal data attack are obtained and evaluated in simulations.
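A minimal sketch of the classic unobservable ("stealth") attack construction for the linear measurement model, which underlies this line of work. The measurement matrix and numbers are hypothetical, and the sketch ignores the paper's limited-meter-access constraint.

```python
import numpy as np

# For the DC measurement model z = H x + e, any attack of the form a = H c
# shifts the state estimate by c without changing the measurement residual,
# so a residual-based bad data detector cannot see it. H is hypothetical.
rng = np.random.default_rng(0)
H = rng.normal(size=(8, 3))          # 8 meters, 3 state variables
x = rng.normal(size=3)               # true state
z = H @ x                            # noiseless measurements

c = np.array([0.5, -0.2, 0.1])       # attacker's desired state shift
a = H @ c                            # unobservable attack vector

def wls_estimate(z):
    return np.linalg.lstsq(H, z, rcond=None)[0]

def residual(z):
    return np.linalg.norm(z - H @ wls_estimate(z))

print("residual without attack:", residual(z))
print("residual with attack:   ", residual(z + a))                # unchanged
print("estimate shift:         ", wls_estimate(z + a) - wls_estimate(z))  # ~ c
```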

66 citations


Journal ArticleDOI
TL;DR: In this article, the authors show that the economic benefits of reducing the peak system load using storage or controllable demand will be higher with high penetrations of wind generation, and that the benefits are very sensitive to how much of the inherent variability of wind power is mitigated, and how the missing money is determined.
Abstract: Earlier research has shown that adding wind generation to a network can lower the total annual operating cost by displacing conventional generation. At the same time, the variability of wind generation and the need for higher levels of reserve generating capacity to maintain reliability standards impose additional costs on the system that should not be ignored. The important implication for regulators is that the capacity payments (the "missing money") for each MW of peak system load are now much higher. Hence, the economic benefits of reducing the peak system load using storage or controllable demand will be higher with high penetrations of wind generation. These potential benefits are illustrated in a case study using a test network and a security-constrained Optimal Power Flow (OPF) with endogenous reserves (SuperOPF). The results show that the benefits are very sensitive to 1) how much of the inherent variability of wind generation is mitigated, and 2) how the missing money is determined (e.g., comparing regulation with deregulation).

61 citations


Proceedings ArticleDOI
22 Jul 2012
TL;DR: In this article, the effects of nonlinearity in power systems on the effectiveness of malicious data attacks on state estimation and the real-time market are examined, and it is demonstrated that attack algorithms designed for the DC model may not be effective when applied to nonlinear systems with nonlinear state estimators.
Abstract: There has been a growing literature on malicious data attacks (or data injection attacks) on power systems. Most existing work focuses on the DC (linear) model with linear state estimators. This paper examines the effects of nonlinearity in power systems on the effectiveness of malicious data attacks on state estimation and the real-time market. It is demonstrated that attack algorithms designed for the DC model may not be effective when they are applied to nonlinear systems with nonlinear state estimators. Discussion and experimental results on nonlinearity are provided.
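A minimal numerical sketch of why a DC-designed attack can lose its stealthiness under a nonlinear model, using a toy three-bus network with sin-based line flows; the topology, susceptances, and state values are hypothetical and not taken from the paper.

```python
import numpy as np
from scipy.optimize import least_squares

# The attack a = H c is residual-free under the linear (DC) model, but with a
# nonlinear measurement function h the same attack generally leaves a
# detectable residual under a nonlinear estimator.
lines = [(0, 1), (1, 2), (0, 2)]
b = np.array([5.0, 4.0, 3.0])            # line susceptances

def h(theta):
    # nonlinear (AC-like) line flows
    return np.array([b[k] * np.sin(theta[i] - theta[j])
                     for k, (i, j) in enumerate(lines)])

def H_lin():
    # linearized (DC) model: flow ~ b * (theta_i - theta_j)
    H = np.zeros((len(lines), 3))
    for k, (i, j) in enumerate(lines):
        H[k, i], H[k, j] = b[k], -b[k]
    return H

theta = np.array([0.0, -0.05, 0.08])     # true angles (bus 0 is reference)
z = h(theta)

c = np.array([0.0, 0.02, -0.03])         # attacker's intended angle shift
a = H_lin() @ c                          # attack designed on the DC model

def ac_residual(z):
    # nonlinear least-squares estimate with bus 0 fixed as the reference
    f = lambda t: h(np.concatenate(([0.0], t))) - z
    return np.linalg.norm(least_squares(f, x0=np.zeros(2)).fun)

print("AC residual without attack:", ac_residual(z))       # ~ 0
print("AC residual with DC attack:", ac_residual(z + a))   # typically > 0
```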

50 citations


Posted Content
TL;DR: In this paper, the authors show how applying a network lens reveals a significant number of key players (including marginalized talent, hidden talent and underutilized talent) that traditional performance management systems miss.
Abstract: Leaders and human resources professionals are searching for ways to generate more value from their employees. Recent studies show that companies perform at a higher level when they have integrated talent management programs that are aligned with business strategy and operations. Organizations can get more from their investments in talent management, the authors argue, by focusing on collaboration. Job design and performance management are typically based on individual accountability despite the fact that most work today is collaborative. Talent management practices tend to focus on individual competencies and experiences, while overlooking the importance of employee networks. By examining individual performance data together with the results of organizational network analysis, the authors say, senior managers can look at talent along two important dimensions. In addition to looking at individual employee performance for the purpose of succession or work force planning, they can take a network view to assess the same employees in terms of their broader collaborative contributions to the organization. The authors show how applying a network lens reveals a significant number of key players (including marginalized talent, hidden talent and underutilized talent) that traditional performance management systems miss. They identify best practices for nurturing networks through talent management initiatives, illustrating them with examples from organizations including IDEO, Nokia, Dow Chemical, Best Buy, Gallo and the US Army.

14 citations


Book ChapterDOI
01 Jan 2012
TL;DR: It is shown that relative importance analysis based on centrality in graph theory can be performed on a power grid network with its electrical parameters taken into account, and the proposed electrical centrality measures are tested on the NYISO-2935 system and the IEEE 300-bus system.
Abstract: This chapter investigates measures of centrality that are applicable to power grids. Centrality measures are used in network science to rank the relative importance of nodes and edges of a graph. Here we define new measures of centrality for power grids that are based on the grid's functionality. More specifically, the coupling of the grid network can be expressed as the algebraic equation YU = I, where U and I represent the vectors of complex bus voltage and injected current phasors, and Y is the network admittance matrix, which is defined not only by the connecting topology but also by the network's electrical parameters and can be viewed as a complex-weighted Laplacian. We show that relative importance analysis based on centrality in graph theory can be performed on a power grid network with its electrical parameters taken into account. In this chapter, the proposed electrical centrality measures are tested on the NYISO-2935 system and the IEEE 300-bus system. The centrality distribution is analyzed in order to identify nodes or branches that are of essential importance in terms of system vulnerability. A number of interesting discoveries are also presented and discussed regarding the importance rank of power grid nodes and branches.
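A minimal sketch of one way to build an electrical centrality from Y, assuming a hypothetical 4-bus network: treat the effective (Thevenin) impedance between bus pairs, read off the pseudoinverse of the complex-weighted Laplacian, as an electrical distance and compute a closeness-style score. The chapter's exact measures may differ.

```python
import numpy as np

# Build Y as a complex-weighted Laplacian from series line admittances
# (shunt elements ignored for simplicity), then use Thevenin impedance
# |Z_ii + Z_jj - 2 Z_ij| as the electrical distance between buses i and j.
def admittance_laplacian(edges, y, n):
    Y = np.zeros((n, n), dtype=complex)
    for (i, j), yk in zip(edges, y):
        Y[i, i] += yk
        Y[j, j] += yk
        Y[i, j] -= yk
        Y[j, i] -= yk
    return Y

n = 4
edges = [(0, 1), (1, 2), (2, 3), (0, 3), (1, 3)]
y = [1 / (0.01 + 0.10j)] * len(edges)        # identical series admittances
Z = np.linalg.pinv(admittance_laplacian(edges, y, n))

def electrical_closeness(i):
    # inverse of the mean |Thevenin impedance| from bus i to all other buses
    d = [abs(Z[i, i] + Z[j, j] - 2 * Z[i, j]) for j in range(n) if j != i]
    return (n - 1) / sum(d)

for i in range(n):
    print(f"bus {i}: electrical closeness = {electrical_closeness(i):.2f}")
```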

9 citations


Proceedings Article
01 Dec 2012
TL;DR: This paper proposes a novel power grid vulnerability measure, the minimum safety time after one line trip, defined on a stochastic cascading failure model, and compares its performance with several other vulnerability measures through a set of statistical analyses.
Abstract: Cascading failure in power grids has long been recognized as a serious security threat to the national economy and society, one that happens infrequently but can cause severe consequences. The causes of cascading phenomena can be extremely complicated due to the many different and interacting mechanisms such as transmission overloads, protection equipment failures, transient instability, voltage collapse, etc. In the literature, a number of vulnerability measures to cascading failures have been proposed to identify the most critical components in the grid and evaluate the damage caused by the removal of such components from the grid. In this paper we propose a novel power grid vulnerability measure called the minimum safety time after one line trip, defined based on the stochastic cascading failure model [1]. We compare its performance with several other vulnerability measures through a set of statistical analyses.
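A Monte Carlo sketch of how a "minimum safety time after one line trip" style measure could be estimated: for each candidate line trip, estimate the expected time until the cascade reaches a critical stage, then take the minimum over trips. The function `time_to_critical` is a hypothetical stand-in for the stochastic cascading failure model of [1], which is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(1)

def time_to_critical(tripped_line):
    # placeholder dynamics: lines differ only in a hypothetical mean (minutes);
    # in the paper this would come from simulating the stochastic model [1]
    return rng.exponential(scale=20.0 + 10.0 * tripped_line)

def safety_time(line, trials=2000):
    return np.mean([time_to_critical(line) for _ in range(trials)])

times = {line: safety_time(line) for line in range(4)}
worst = min(times, key=times.get)
print(f"minimum safety time: {times[worst]:.1f} min, after tripping line {worst}")
```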

6 citations


Journal ArticleDOI
TL;DR: In this paper, the authors present a new methodology for allocating losses at each bus: real power flow is decomposed into preserved power flow and lost flow, which are established by applying Kirchhoff's law at each bus.
Abstract: Electricity is a necessity for modern society. Similar to other commodities, electricity is generated and delivered to consumers through a transmission network. In the delivery process, it obeys Kirchhoff's laws and undergoes losses. To determine fair pricing in a deregulated electricity market, an exact analysis of losses is required. Several methodologies have been proposed to evaluate the nodal value of the system loss. However, losses over individual lines have different values; consequently, the effect of losses on price is difficult to evaluate. In this study, the authors present a new methodology for allocating losses at each bus. Real power flow is decomposed into preserved power flow and lost flow, which are established by applying Kirchhoff's law at each bus. With this procedure, it is possible to attribute how much power injection is used for preserved flow and for loss and, consequently, to evaluate the component of losses in prices.
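A toy calculation on a radial 3-bus feeder illustrating the idea of splitting real power flow into a "preserved" component (what arrives downstream) and a "lost" component (I^2 R), applied bus by bus via Kirchhoff's law. This is an illustrative sketch with hypothetical numbers, not the paper's exact allocation procedure.

```python
# Radial feeder: bus 0 -- line r[0] -- bus 1 -- line r[1] -- bus 2
V = 1.0                      # p.u. voltage, assumed flat for simplicity
loads = [0.0, 0.4, 0.3]      # p.u. real power demand at buses 0..2
r = [0.02, 0.03]             # p.u. line resistances

# work upstream from the leaf: sending power = receiving power + I^2 R loss
p_recv_2 = loads[2]
loss_12 = r[1] * (p_recv_2 / V) ** 2
p_send_12 = p_recv_2 + loss_12

p_recv_1 = p_send_12 + loads[1]
loss_01 = r[0] * (p_recv_1 / V) ** 2
p_send_01 = p_recv_1 + loss_01

print(f"line 1-2: preserved {p_recv_2:.4f}, lost {loss_12:.4f}")
print(f"line 0-1: preserved {p_recv_1:.4f}, lost {loss_01:.4f}")
print(f"total injection required at bus 0: {p_send_01:.4f}")
```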

3 citations


Proceedings ArticleDOI
16 Jan 2012
TL;DR: A model to digitize and aggregate smart load requests that makes it possible to properly account for inconvenience costs while optimally shifting demand in time to track the energy dispatch as closely as possible.
Abstract: Demand Side Management (DSM) and Demand-Response (DR) programs are aimed at exploiting the intrinsic elasticity of electricity demand to make it responsive to the near-term cost of supplying generation. Loads that automatically respond to incentives are referred to as "transactive". While it is clear that DSM and DR will be indispensable to loosen the control over generation and decrease reserve requirements, there is much debate on what the right architecture for DSM and DR programs is. It is possible that one solution will not fit all possible loads. In this talk, we discuss a model to digitize and aggregate smart load requests that makes it possible to properly account for inconvenience costs while optimally shifting demand in time to track the energy dispatch as closely as possible. We envision that, in addition to pricing, rewarding tolerance to service delays can be considered an additional powerful mechanism to operate the market efficiently, especially when tapping into micro-grid green generation. The solution we propose provides a natural model for Vehicle-to-Grid (V2G) service that can easily be extended to a large class of transactive loads.

Proceedings ArticleDOI
19 Jan 2012
TL;DR: The objective of this paper is to analyze how the variability of wind affects optimal dispatches and reserves in a daily optimization cycle, capturing the cost of ramping by adding to the operating costs of the generating units charges associated with the hour-to-hour changes in their optimal dispatch.
Abstract: The objective of this paper is to analyze how the variability of wind affects optimal dispatches and reserves in a daily optimization cycle. The Cornell SuperOPF is used to illustrate how the system costs can be determined for a reliable network (the amount of conventional generating capacity needed to maintain system adequacy is determined endogenously). Eight cases are studied to illustrate the effects of geographical distribution, ramping costs, load response to customers' payments in the wholesale market, and the amount of potential wind generation that is dispatched. The results in this paper use a typical daily pattern of load and capture the cost of ramping by including additions to the operating costs of the generating units associated with the hour-to-hour changes in their optimal dispatch. The proposed regulatory changes for electricity markets are 1) to establish a new market for ramping services, 2) to aggregate the loads of customers on a distribution network so that they can be represented as a single wholesale customer on the bulk-power transmission network, and 3) to make use of controllable load and the geographical distribution of wind to mitigate the variability of wind generation as an alternative to upgrading the capacity of the transmission network.
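A minimal sketch of the kind of ramping-cost addition described above: an extra operating-cost term proportional to the hour-to-hour change in each unit's dispatch. The dispatch profile and per-MW ramping charges are hypothetical, and the SuperOPF's actual cost model may differ.

```python
import numpy as np

def ramping_cost(dispatch, c_ramp):
    """dispatch: (units, hours) MW; c_ramp: $/MW of hourly change per unit."""
    return float(np.sum(c_ramp[:, None] * np.abs(np.diff(dispatch, axis=1))))

dispatch = np.array([[100, 120, 150, 140],    # unit 1 MW by hour
                     [ 50,  50,  80,  60]])   # unit 2 MW by hour
c_ramp = np.array([2.0, 5.0])                 # $/MW ramped, per unit

print(f"ramping cost addition: ${ramping_cost(dispatch, c_ramp):.0f}")
```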

Proceedings ArticleDOI
04 Jan 2012
TL;DR: This paper examines the changes in planning and operating methods and tools necessary to successfully integrate a large share of VG into the power system, and identifies the associated research requirements.
Abstract: Conventional power systems have been planned with dispatchable generation using deterministic planning and operating methods and tools since the inception of the industry, with little exception. It now appears that the future power system will need to successfully integrate a high share of non-dispatchable renewable energy whose production exhibits much greater variability and uncertainty than that of conventional energy sources. The need for long-distance, high-capacity transmission is often greater for integrating the best renewable resources than it is for more conventional resources. Current market structures were not designed with the needs of variable generation (VG) in mind, nor were the existing wind forecasting tools. Computational capability is currently limited in its ability to handle the requirements of the new tools and methods required. This paper examines the changes in planning and operating methods and tools necessary to successfully integrate a large share of VG into the power system, and identifies the associated research requirements.
