Author

J Lopatto

Bio: J Lopatto is an academic researcher. The author has contributed to research in topics: Blackout. The author has an h-index of 1, having co-authored 1 publication receiving 650 citations.
Topics: Blackout

Papers
01 Apr 2004
TL;DR: The U.S.-Canada Power System Outage Task Force examines the electricity system before and during the massive power outage on August 14, 2003, which affected approximately 50 million people in the Midwest and Northeast United States and Ontario, Canada.
Abstract: This final report by the U.S.-Canada Power System Outage Task Force examines the electricity system before and during the massive power outage on August 14, 2003, which affected approximately 50 million people in the Midwest and Northeast United States and Ontario, Canada. The report identifies the causes of the outage and why they were not contained. It also gives recommendations to prevent or minimize future blackouts, some of which include implementing reliability standards and increasing the physical and cyber security of the network. The four grouped causes of the blackout are identified as: (1) inadequate system understanding, (2) inadequate situational awareness, (3) inadequate tree trimming, and (4) inadequate reliability coordinator diagnostic support. This final report covers work done by three working groups, which focused on the electric system, security, and nuclear facilities. The chapters of the report deal with the following issues: the North American electric power system and its reliability organizations; causes of the blackout and violations of North American Electric Reliability Council (NERC) standards; preconditions for the blackout, with reference to the northeastern power grid before the blackout; how the blackout began in Ohio; the cascade stage of the blackout; the August 14 blackout compared with previous major North American outages; the performance of nuclear power plants affected by the blackout; and the physical and cyber security aspects of the blackout. The report indicates that the loss of FirstEnergy's overloaded Sammis-Star line triggered the cascade. FirstEnergy's 345-kV lines into northern Ohio from eastern Ohio had begun tripping out because they were in contact with overgrown trees; the resulting loss of capacity created major and unsustainable burdens on lines in adjacent areas. The cascade spread rapidly as lines and generating units automatically took themselves out of service to avoid physical damage.
The blackout had many contributing factors in common with earlier outages, including inadequate tree trimming, failure to identify emergency conditions, inadequate operator training, and inadequate regional-scale visibility over the power system.

650 citations


Cited by
01 Jan 2011
TL;DR: In this paper, a polynomial dimensional decomposition (PDD) method for global sensitivity analysis of stochastic systems subject to independent random input following arbitrary probability distributions is presented.
Abstract: This paper presents a polynomial dimensional decomposition (PDD) method for global sensitivity analysis of stochastic systems subject to independent random input following arbitrary probability distributions. The method involves Fourier-polynomial expansions of lower-variate component functions of a stochastic response by measure-consistent orthonormal polynomial bases, analytical formulae for calculating the global sensitivity indices in terms of the expansion coefficients, and dimension-reduction integration for estimating the expansion coefficients. Due to identical dimensional structures of PDD and analysis-of-variance decomposition, the proposed method facilitates simple and direct calculation of the global sensitivity indices. Numerical results of the global sensitivity indices computed for smooth systems reveal significantly higher convergence rates of the PDD approximation than those from existing methods, including polynomial chaos expansion, random balance design, state-dependent parameter, improved Sobol’s method, and sampling-based methods. However, for non-smooth functions, the convergence properties of the PDD solution deteriorate to a great extent, warranting further improvements. The computational complexity of the PDD method is polynomial, as opposed to exponential, thereby alleviating the curse of dimensionality to some extent. Mathematical modeling of complex systems often requires sensitivity analysis to determine how an output variable of interest is influenced by individual or subsets of input variables. A traditional local sensitivity analysis entails gradients or derivatives, often invoked in design optimization, describing changes in the model response due to the local variation of input. Depending on the model output, obtaining gradients or derivatives, if they exist, can be simple or difficult. 
In contrast, a global sensitivity analysis (GSA), increasingly becoming mainstream, characterizes how the global variation of input, due to its uncertainty, impacts the overall uncertain behavior of the model. In other words, GSA constitutes the study of how the output uncertainty from a mathematical model is divvied up, qualitatively or quantitatively, to distinct sources of input variation in the model [1].
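As an illustration (not taken from the paper itself), the first-order global sensitivity indices discussed above, S_i = Var(E[Y | X_i]) / Var(Y), can be estimated by a pick-freeze Monte Carlo scheme rather than the paper's PDD expansion. The sketch below assumes independent inputs uniform on [0, 1] and uses a hypothetical additive test model whose exact indices are known:

```python
import numpy as np

def first_order_sobol(f, d, n=100_000, seed=0):
    """Estimate first-order Sobol indices S_i = Var(E[Y | X_i]) / Var(Y)
    by a pick-freeze (Saltelli-style) Monte Carlo estimator, assuming
    independent inputs uniform on [0, 1]^d."""
    rng = np.random.default_rng(seed)
    A = rng.random((n, d))           # first independent sample matrix
    B = rng.random((n, d))           # second independent sample matrix
    yA, yB = f(A), f(B)
    var_y = np.concatenate([yA, yB]).var()
    S = np.empty(d)
    for i in range(d):
        ABi = A.copy()
        ABi[:, i] = B[:, i]          # resample only input i ("pick-freeze")
        # S_i ~ mean(yB * (f(AB_i) - yA)) / Var(Y)
        S[i] = np.mean(yB * (f(ABi) - yA)) / var_y
    return S

# Additive test model Y = X1 + 2*X2: exact indices are S = (0.2, 0.8).
S = first_order_sobol(lambda X: X[:, 0] + 2.0 * X[:, 1], d=2)
```

Because the estimator only needs model evaluations, it applies to non-smooth responses as well, though (as the abstract notes for PDD) convergence behavior differs sharply between smooth and non-smooth systems.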

1,296 citations

Journal ArticleDOI
TL;DR: To better understand critical infrastructure systems (CISs) in support of planning, maintenance, and emergency decision making, modeling and simulation of interdependencies across CISs has recently become a key field of study. This paper reviews the studies in the field and broadly groups the existing modeling and simulation approaches into six types.

891 citations

Journal ArticleDOI
TL;DR: An architectural framework for resilience and survivability in communication networks is provided, along with a survey of the disciplines that resilience encompasses and a review of significant past failures of the network infrastructure.

698 citations

Journal ArticleDOI
TL;DR: The objective of this paper is to provide a review of distributed control and management strategies for the next generation power system in the context of microgrids and identifies challenges and opportunities ahead.
Abstract: The objective of this paper is to provide a review of distributed control and management strategies for the next generation power system in the context of microgrids. This paper also identifies future research directions. The next generation power system, also referred to as the smart grid, is distinct from the existing power system due to its extensive use of integrated communication, advanced components such as power electronics, sensing, and measurement, and advanced control technologies. At the same time, the need to manage an increasing number of small distributed and renewable energy resources can exceed the capabilities of a single centralized computational resource. Therefore, the recent literature has seen a significant research effort on dividing the control task among different units, which gives rise to the development of several distributed techniques. This paper discusses the features and characteristics of these techniques, and identifies challenges and opportunities ahead. The paper also discusses the relationship between distributed control and hierarchical control.
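A minimal sketch (an illustrative example, not a method from the paper) of how a control task can be divided among units is discrete-time average consensus, a building block of many distributed microgrid control schemes: each unit updates its value using only its neighbors' values, yet all units converge to the network-wide average without a central coordinator.

```python
import numpy as np

def average_consensus(x0, neighbors, eps=0.2, steps=300):
    """Discrete-time average-consensus iteration: each unit i repeatedly
    updates x_i <- x_i + eps * sum_{j in N(i)} (x_j - x_i), using only
    locally available neighbor values. On a connected undirected graph,
    with eps below 1/(max degree), all units converge to the average of
    the initial values -- no central coordinator is involved."""
    x = np.asarray(x0, dtype=float)
    for _ in range(steps):
        delta = np.array([sum(x[j] - x[i] for j in neighbors[i])
                          for i in range(len(x))])
        x = x + eps * delta
    return x

# Four units on a ring, each starting from a hypothetical local measurement;
# the initial average is 3.0, and every unit converges to it.
x_final = average_consensus([1.0, 2.0, 3.0, 6.0],
                            neighbors={0: [1, 3], 1: [0, 2],
                                       2: [1, 3], 3: [2, 0]})
```

The step size bound eps < 1/(max degree) is what keeps the iteration stable here; hierarchical schemes discussed in the paper layer higher-level coordination on top of such local update rules.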

594 citations

Book
01 Jan 2007
TL;DR: The workload model that is the basis of traditional analysis of the single queue becomes a foundation for workload relaxations used in the treatment of complex networks and Lyapunov functions and dynamic programming equations lead to the celebrated MaxWeight policy.
Abstract: Power grids, flexible manufacturing, cellular communications: interconnectedness has consequences. This remarkable book gives the tools and philosophy you need to build network models detailed enough to capture essential dynamics but simple enough to expose the structure of effective control solutions and to clarify analysis. Core chapters assume only exposure to stochastic processes and linear algebra at the undergraduate level; later chapters are for advanced graduate students and researchers/practitioners. This gradual development bridges classical theory with the state-of-the-art. The workload model that is the basis of traditional analysis of the single queue becomes a foundation for workload relaxations used in the treatment of complex networks. Lyapunov functions and dynamic programming equations lead to the celebrated MaxWeight policy along with many generalizations. Other topics include methods for synthesizing hedging and safety stocks, stability theory for networks, and techniques for accelerated simulation. Examples and figures throughout make ideas concrete. Solutions to end-of-chapter exercises available on a companion website.
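The MaxWeight policy named in the abstract has a very compact core, sketched below for the simple case of a single server shared by several queues (an illustrative reduction, not the book's general network setting): at each decision epoch, serve the queue with the largest product of backlog and service rate.

```python
def maxweight(queues, rates):
    """MaxWeight scheduling for one server shared by several queues:
    serve the queue maximizing q_i * mu_i, the product of its current
    backlog and its service rate. Quadratic-Lyapunov drift arguments
    show this choice stabilizes the system whenever any policy can."""
    return max(range(len(queues)), key=lambda i: queues[i] * rates[i])

# Backlogs (3, 5, 2) with service rates (1.0, 0.5, 2.0) give weights
# (3.0, 2.5, 4.0), so the policy serves queue index 2.
choice = maxweight([3, 5, 2], [1.0, 0.5, 2.0])
```

Note that the policy needs no knowledge of arrival rates, which is part of why it generalizes so widely in the book's treatment of complex networks.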

555 citations