
Showing papers by "Chen-Ching Liu published in 2007"


Journal ArticleDOI
G. L. Bayatian, S. Chatrchyan, G. Hmayakyan, Albert M. Sirunyan  +2060 moreInstitutions (143)
TL;DR: This report presents the strategy of the CMS experiment to explore the physics programme of pp collisions at 14 TeV at the Large Hadron Collider (LHC), demonstrating its discovery reach for the Higgs boson, supersymmetry, and other physics beyond the Standard Model.
Abstract: CMS is a general purpose experiment, designed to study the physics of pp collisions at 14 TeV at the Large Hadron Collider (LHC). It currently involves more than 2000 physicists from more than 150 institutes and 37 countries. The LHC will provide extraordinary opportunities for particle physics, based on its unprecedented collision energy and luminosity, when it begins operation in 2007. The principal aim of this report is to present the strategy of CMS to explore the rich physics programme offered by the LHC. This volume demonstrates the physics capability of the CMS experiment. The prime goals of CMS are to explore physics at the TeV scale and to study the mechanism of electroweak symmetry breaking, through the discovery of the Higgs particle or otherwise. To carry out this task, CMS must be prepared to search for new particles, such as the Higgs boson or supersymmetric partners of the Standard Model particles, from the start-up of the LHC, since new physics at the TeV scale may manifest itself with modest data samples of the order of a few fb−1 or less. The analysis tools that have been developed are applied, with the full methodology of an analysis on CMS data, to study in great detail specific benchmark processes against which the performance of CMS can be gauged. These processes cover several Higgs boson decay channels, the production and decay of new particles such as Z' and supersymmetric particles, Bs production, and processes in heavy ion collisions. The simulation of these benchmark processes includes subtle effects such as possible detector miscalibration and misalignment. Beyond these benchmark processes, the physics reach of CMS is studied for a large number of signatures arising in the Standard Model, and in theories beyond it, for integrated luminosities ranging from 1 fb−1 to 30 fb−1.
The Standard Model processes include QCD, B-physics, diffraction, detailed studies of top quark properties, and electroweak physics topics such as the properties of the W and Z0 bosons. The production and decay of the Higgs particle is studied for many observable decays, and the precision with which the Higgs boson properties can be derived is determined. About ten different supersymmetry benchmark points are analysed using full simulation. The CMS discovery reach is evaluated in the SUSY parameter space, covering a large variety of decay signatures. Furthermore, the discovery reach for a plethora of alternative models for new physics is explored, notably extra dimensions, new high-mass vector boson states, little Higgs models, technicolour, and others. Methods to discriminate between models have also been investigated. This report is organized as follows. Chapter 1, the Introduction, describes the context of this document. Chapters 2-6 describe examples of full analyses with photons, electrons, muons, jets, missing ET, B-mesons, and τ leptons, and for quarkonia in heavy ion collisions. Chapters 7-15 describe the physics reach for Standard Model processes, Higgs discovery, and searches for new physics beyond the Standard Model.

973 citations


Proceedings ArticleDOI
24 Jun 2007
TL;DR: A methodology is proposed to evaluate cybersecurity vulnerability using attack trees built on power system control networks; the framework can be extended to security investment analysis.
Abstract: By penetrating the SCADA system, an intruder may remotely operate a power system using supervisory control privileges. Hence, cybersecurity has been recognized as a major threat due to the potential intrusion to the online system. This paper proposes a methodology to evaluate the cybersecurity vulnerability using attack trees. The attack tree formulation based on power system control networks is used to evaluate the system, scenario, and leaf vulnerabilities. The measure of vulnerabilities in the power system control framework is determined based on existing cybersecurity conditions before the vulnerability indices are evaluated. After the indices are evaluated, an upper bound is imposed on each scenario vulnerability in order to determine the pivotal attack leaves that require countermeasure improvements. The proposed framework can be extended to security investment analysis.
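The abstract's attack-tree evaluation can be illustrated with a small sketch. The combination rules below (AND nodes multiply child vulnerabilities, OR nodes take the complement of the product of complements) are one common convention, not necessarily the paper's exact indices, and the tree structure, leaf values, and upper bound are all hypothetical:

```python
# Hedged sketch of attack-tree vulnerability evaluation (assumed convention):
#   AND node: all children must succeed -> product of child values
#   OR  node: any child suffices       -> complement of product of complements

def evaluate(node):
    """Recursively compute the vulnerability of an attack-tree node."""
    if "leaf" in node:
        return node["leaf"]
    child_vals = [evaluate(c) for c in node["children"]]
    if node["type"] == "AND":
        v = 1.0
        for x in child_vals:
            v *= x
        return v
    # OR node
    v = 1.0
    for x in child_vals:
        v *= 1.0 - x
    return 1.0 - v

# Toy SCADA intrusion tree (all leaf vulnerabilities made up for illustration):
tree = {
    "type": "OR",                        # either scenario gains supervisory control
    "children": [
        {"type": "AND", "children": [    # scenario 1: dial-up access + weak password
            {"leaf": 0.4}, {"leaf": 0.5}]},
        {"type": "AND", "children": [    # scenario 2: VPN hole + firewall gap
            {"leaf": 0.2}, {"leaf": 0.3}]},
    ],
}

system_v = evaluate(tree)
scenario_vs = [evaluate(c) for c in tree["children"]]

# Impose an upper bound on each scenario vulnerability to flag the pivotal
# attack paths that need countermeasure improvements.
UPPER_BOUND = 0.1
pivotal = [i for i, v in enumerate(scenario_vs) if v > UPPER_BOUND]
```

Here scenario 1 (vulnerability 0.2) exceeds the bound and is flagged pivotal, while scenario 2 (0.06) is not.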

200 citations


Journal ArticleDOI
TL;DR: In this article, fuzzy inference system (FIS), least-squares estimation (LSE), and a combination of the two are proposed for electricity price forecasting in locational marginal pricing spot markets.
Abstract: Accurate electricity price forecasting is critical to market participants in wholesale electricity markets. Market participants rely on price forecasts to decide their bidding strategies, allocate assets, negotiate bilateral contracts, hedge risks, and plan facility investments. Market operators can also use electricity price forecasts to predict market power indexes for the purpose of monitoring participants' behaviors. Various forecasting techniques are applied to different time horizons for electricity price forecasting in locational marginal pricing (LMP) spot markets. Available correlated data also have to be selected to improve the short-term forecasting performance. In this paper, fuzzy inference system (FIS), least-squares estimation (LSE), and the combination of FIS and LSE are proposed. Based on extensive testing with various techniques, LSE provides the most accurate results, and FIS, which is also highly accurate, provides transparency and interpretability.
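The LSE component of such a forecaster amounts to an ordinary least-squares fit of price against selected correlated inputs. A minimal sketch follows, assuming a model with two hypothetical regressors (system load and the previous-hour LMP) and made-up training data; the paper's actual input selection is more elaborate:

```python
import numpy as np

# Hypothetical training data: columns are [system load (GW), previous-hour LMP
# ($/MWh)]; the target is the next-hour LMP. All values are illustrative only.
X_raw = np.array([[30.0, 42.0],
                  [32.0, 45.0],
                  [35.0, 50.0],
                  [33.0, 47.0],
                  [31.0, 44.0]])
y = np.array([43.5, 47.0, 52.75, 49.25, 45.75])

# Append a column of ones so the linear model has an intercept term.
X = np.hstack([X_raw, np.ones((len(X_raw), 1))])

# Ordinary least squares: choose w to minimize ||X w - y||^2.
w, *_ = np.linalg.lstsq(X, y, rcond=None)

def forecast(load, prev_price):
    """Predict the next-hour LMP from the fitted linear model."""
    return w[0] * load + w[1] * prev_price + w[2]
```

The combined FIS+LSE approach would replace the single global fit with least-squares estimates of the consequent parameters in each fuzzy rule.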

198 citations


Journal ArticleDOI
TL;DR: In this article, the authors investigated the effects of the 2003 U.S. blackout on the security values of the electric utilities and manufacturing firms in the electric power equipment industry, using an event study method.
Abstract: On August 14, 2003, the U.S. experienced the largest blackout in its history, which left over 50 million people without electricity in eight U.S. states and part of Canada. This paper investigates the effects of the blackout on the security values of U.S. electric utilities and of manufacturing firms in the electric power equipment industry, using an event study method. The results of this empirical study show that the electric utilities were negatively affected, while the electrical equipment manufacturing firms were significantly and positively affected.
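The event study method behind this result can be sketched in a few lines: fit a market model on a pre-event estimation window, then measure abnormal returns in the event window. The return series below are hypothetical and far shorter than a real estimation window (~250 trading days):

```python
import numpy as np

# Hypothetical daily returns (fractions) for the pre-event estimation window.
market_est = np.array([0.001, -0.002, 0.003, 0.000, 0.002, -0.001])
stock_est  = np.array([0.002, -0.003, 0.005, 0.001, 0.003, -0.001])

# Market model: R_stock = alpha + beta * R_market + error, fit by OLS.
A = np.vstack([np.ones_like(market_est), market_est]).T
(alpha, beta), *_ = np.linalg.lstsq(A, stock_est, rcond=None)

# Event-window returns (also hypothetical): the blackout day and the day after.
market_evt = np.array([-0.010, 0.004])
stock_evt  = np.array([-0.035, 0.001])

# Abnormal return AR_t = actual return minus the market model's prediction;
# CAR is the cumulative abnormal return over the event window.
abnormal = stock_evt - (alpha + beta * market_evt)
car = abnormal.sum()
```

A negative CAR, tested for statistical significance across the sample of utilities, is what "negatively affected" means in this setting.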

45 citations


Proceedings ArticleDOI
01 Oct 2007
TL;DR: This paper is an overview of the cybersecurity issues for electric power control and automation systems, the control architectures, and the possible methodologies for vulnerability assessment of existing systems.
Abstract: Disruption of electric power operations can be catastrophic for national security and the economy. Due to the complexity of widely dispersed assets and the interdependencies among computer, communication, and power systems, meeting security and quality compliance requirements for operations is a challenging issue. In recent years, NERC's cybersecurity standard, NERC CIP 1200, was initiated to require utilities to comply with cybersecurity requirements in control systems. This standard identifies several cyber-related vulnerabilities that exist in control systems and recommends several remedial actions (e.g., best practices). This paper is an overview of the cybersecurity issues for electric power control and automation systems, the control architectures, and possible methodologies for vulnerability assessment of existing systems.

36 citations


Proceedings ArticleDOI
01 Nov 2007
TL;DR: In this article, the day-ahead electricity market is modeled as a multi-agent system with interacting agents including supplier agents, load serving entities, and a market operator, and it is shown that, with Q-learning, electricity suppliers are making more profits compared to the scenario without learning due to strategic gaming.
Abstract: The day-ahead electricity market is modeled as a multi-agent system with interacting agents including supplier agents, load serving entities, and a market operator. Simulation of the market clearing results under the scenario in which agents have learning capabilities is compared with the scenario where agents report true marginal costs. It is shown that, with Q-Learning, electricity suppliers are making more profits compared to the scenario without learning due to strategic gaming. As a result, the LMP at each bus is substantially higher.

31 citations


Journal ArticleDOI
TL;DR: In this paper, a system-theoretic method is proposed for identifying the fault location based on the limited data available; the method is being implemented for a field test in Monterey, California.
Abstract: The objective of the North Eastern Pacific Time-Series Undersea Networked Experiment (NEPTUNE) program is to construct an underwater cabled observatory on the floor of the Pacific Ocean, encompassing the Juan de Fuca Tectonic Plate. The power system associated with the proposed observatory is unlike conventional terrestrial power systems in many ways due to the unique operating conditions of underwater cabled observatories. In the event of a backbone cable fault, the location of the fault must be identified accurately so that a repair ship can be sent to repair the cable. Due to the proposed networked, mesh structure, traditional techniques for cable fault identification cannot achieve the desired level of accuracy. In this paper, a system-theoretic method is proposed for identification of the fault location based on the limited data available. The method has been tested with extensive simulations and is being implemented for the field test in Monterey, California. In this study, a lab test is performed for the fault location function.
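The general principle of model-based fault location on a meshed DC network can be sketched as follows: hypothesize a fault at each candidate location, solve the network equations for each hypothesis, and pick the location whose predicted shore-station measurements best match the observed ones. The toy 4-node mesh, resistances, and fault model below are entirely hypothetical and much simpler than the paper's system-theoretic formulation:

```python
import numpy as np

# Toy DC mesh: the shore station at node 0 holds 10 kV; nodes 1-3 are branching
# units joined by backbone cable segments with the resistances below (ohms).
# All values are made up for illustration.
R = {(0, 1): 5.0, (1, 2): 4.0, (1, 3): 6.0, (2, 3): 3.0, (0, 2): 7.0}
V_SRC = 10_000.0

def shore_current(fault_node, fault_res=1.0):
    """Nodal analysis: shore-station current with a low-resistance fault to
    seawater ground at `fault_node`."""
    n = 4
    G = np.zeros((n, n))
    for (i, j), r in R.items():
        g = 1.0 / r
        G[i, i] += g; G[j, j] += g
        G[i, j] -= g; G[j, i] -= g
    G[fault_node, fault_node] += 1.0 / fault_res  # fault shunt to ground (0 V)
    # Node 0 voltage is known; solve the reduced system for nodes 1-3.
    idx = [1, 2, 3]
    rhs = -G[np.ix_(idx, [0])].flatten() * V_SRC
    v = np.linalg.solve(G[np.ix_(idx, idx)], rhs)
    volts = np.concatenate([[V_SRC], v])
    # Total current injected by the shore station into the network.
    return sum((V_SRC - volts[j]) / R[(i, j)] for (i, j) in R if i == 0)

# "Measured" current comes from a simulated fault at node 3; the estimator
# scans candidate locations and keeps the best match.
measured = shore_current(3)
best = min([1, 2, 3], key=lambda k: abs(shore_current(k) - measured))
```

With only shore-end measurements available, this mismatch-minimization idea is what lets a mesh network be searched where classical two-ended cable methods do not apply.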

28 citations


Journal ArticleDOI
TL;DR: In this article, perpetual options theory is introduced to analyze merchant transmission investment through an approved rate, allowing an investor to determine the optimal time to start a transmission project and capture the maximum revenue returns, or to let the option expire when the economic incentive is insufficient.
Abstract: This paper introduces the concept of perpetual options theory to analyze merchant transmission investment through an approved rate. Since electricity usage and the associated revenue are stochastic in nature, applying perpetual options theory allows an investor to determine the most opportune time to start a transmission project and obtain the maximum revenue returns, or to let the option expire when the economic incentive is not sufficient. In today's environment, this decision approach is more appropriate since it provides transmission investors a better projection of their return on investment and an exit strategy for an investment decision.
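The core calculation in the standard perpetual-option (real options) setting, which the paper's analysis builds on, is the investment threshold: when project value follows a geometric Brownian motion, invest the first time value reaches a multiple of the sunk cost. The parameter values below are hypothetical, and this is the textbook formulation rather than the paper's exact model:

```python
import math

# Hypothetical parameters for a merchant transmission project.
r     = 0.05   # risk-free discount rate
delta = 0.04   # revenue yield (shortfall of the value drift below r)
sigma = 0.20   # volatility of the project's revenue value
I     = 100.0  # sunk investment cost (e.g., $100M)

# beta is the positive root of the fundamental quadratic
#   0.5*sigma^2*b*(b-1) + (r - delta)*b - r = 0,
# i.e., a*b^2 + (r - delta - a)*b - r = 0 with a = 0.5*sigma^2.
a = 0.5 * sigma**2
b = r - delta - a
beta = (-b + math.sqrt(b * b + 4 * a * r)) / (2 * a)

# Invest the first time project value V reaches this threshold; below it,
# keeping the (never-expiring) option alive is worth more than investing now.
V_star = beta / (beta - 1.0) * I
```

Because beta > 1, the threshold exceeds the simple break-even V = I: the option to wait has value, which is exactly the projection-of-return and exit-strategy logic the abstract describes.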

24 citations


01 Jan 2007
TL;DR: In this article, an anticipatory reinforcement learning technique is used to model the learning behaviors of electricity suppliers in a Day-Ahead electricity market, where the market is modeled as a multi-agent system with interacting agents including supplier agents, Load Serving Entities and a Market Operator.
Abstract: An important objective of electricity suppliers is to maximize their profits over a planning horizon and comply with the market rules. This objective requires suppliers to learn from their bidding experience and behave in an anticipatory way. With volatile Locational Marginal Prices (LMPs), ever-changing transmission grid conditions, and incomplete information about other market participants, decision making for a supplier is a complex task. A learning algorithm that does not require an analytical model of the complicated market but allows suppliers to learn from experience and act in an anticipatory way is a suitable approach to this problem. Q-Learning, an anticipatory reinforcement learning technique, has all these desired properties. Therefore, it is used in this research to model the learning behaviors of electricity suppliers in a Day-Ahead electricity market. The Day-Ahead electricity market is modeled as a multi-agent system with interacting agents including supplier agents, Load Serving Entities and a Market Operator. Simulation of the market clearing results under the scenario in which agents have learning capabilities is compared with the scenario where agents report true marginal costs. It is shown that, with Q- Learning, electricity suppliers are making more profits compared to the scenario without learning due to strategic gaming. As a result, the LMP at each bus is substantially higher.
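The Q-learning behavior described here can be illustrated with a deliberately stripped-down sketch: one stateless supplier learning a bid markup by trial and error. The market model below (single supplier, pay-as-bid profit, fixed dispatch cap, made-up numbers) is a stand-in for the paper's multi-agent LMP market, not a reproduction of it:

```python
import random

random.seed(0)

# Stylized day-ahead market (all numbers hypothetical): the supplier's true
# marginal cost is $20/MWh; bids above the $28/MWh cap are not dispatched.
COST, CAP, QTY = 20.0, 28.0, 100.0
ACTIONS = [20.0, 25.0, 30.0]  # candidate bids ($/MWh): truthful, markup, excessive

def profit(bid):
    """Profit for one cleared hour; zero if the bid is rejected."""
    return (bid - COST) * QTY if bid <= CAP else 0.0

# Tabular Q-learning over a single stateless decision, epsilon-greedy.
Q = [0.0] * len(ACTIONS)
alpha, epsilon = 0.1, 0.2
for episode in range(2000):
    if random.random() < epsilon:
        a = random.randrange(len(ACTIONS))          # explore
    else:
        a = max(range(len(ACTIONS)), key=Q.__getitem__)  # exploit
    Q[a] += alpha * (profit(ACTIONS[a]) - Q[a])     # Q-value update

learned_bid = ACTIONS[max(range(len(ACTIONS)), key=Q.__getitem__)]
```

The learner settles on the $25 bid: bidding the true marginal cost earns nothing, while overbidding past the cap loses dispatch. This is the strategic-gaming effect the abstract reports, which in the full multi-agent setting drives LMPs above the truthful-bidding benchmark.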

4 citations


Proceedings ArticleDOI
24 Jun 2007
TL;DR: In this paper, an optimization-based load management algorithm is presented for an underwater cabled observatory system, providing an analytical tool that calculates the optimal operating condition serving the maximum number of loads while satisfying all constraints.
Abstract: The objective of the North-East Pacific Time-Series Undersea Networked Experiment (NEPTUNE) program is to construct an underwater cabled observatory on the floor of the Pacific Ocean. While the goal is to provide as much power as possible to users of the system, the system conditions need to be monitored on line in order to determine whether or not any corrective action is needed. Due to the unique nature of the underwater observatory system and load characteristics, an analytical tool is needed to calculate the optimal operating condition that serves the maximum number of loads while satisfying all the constraints. In this paper, an optimization-based load management algorithm is presented. The method takes into account all system constraints as well as priorities of the loads.
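A priority-aware load management decision of this kind can be sketched with a simple greedy stand-in for the paper's optimization. The load names, priorities, power draws, and budget below are all hypothetical, and a real implementation would solve a constrained optimization rather than this one-pass heuristic:

```python
# Hypothetical load set for an underwater node: (name, priority, power in W).
# A lower priority number means more critical.
LOADS = [
    ("seismometer",      1, 150.0),
    ("hydrophone",       1, 100.0),
    ("camera",           2, 400.0),
    ("AUV dock",         3, 900.0),
    ("spare instrument", 3, 250.0),
]
BUDGET = 1000.0  # power available at the node (W)

def manage_loads(loads, budget):
    """Greedy stand-in for the optimization: serve loads in priority order
    (cheapest first within a tier) while the power budget allows."""
    served, remaining = [], budget
    for name, prio, p in sorted(loads, key=lambda l: (l[1], l[2])):
        if p <= remaining:
            served.append(name)
            remaining -= p
    return served, remaining

served, spare = manage_loads(LOADS, BUDGET)
```

Here the two priority-1 instruments are served first, the 900 W AUV dock is shed, and 100 W of headroom remains, the kind of corrective action the on-line monitoring described in the abstract would trigger.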

3 citations