
Showing papers in "Philosophical Transactions of the Royal Society A in 2007"


Journal ArticleDOI
TL;DR: Technical challenges that must be addressed if SHM is to gain wider acceptance are discussed, and a historical overview and a summary of the statistical pattern recognition (SPR) paradigm are provided.
Abstract: This introduction begins with a brief history of SHM technology development. Recent research has begun to recognise that a productive approach to the Structural Health Monitoring (SHM) problem is to regard it as one of statistical pattern recognition (SPR); a paradigm addressing the problem in such a way is described in detail herein as it forms the basis for the organisation of this book. In the process of providing the historical overview and summarising the SPR paradigm, the subsequent chapters in this book are cited in an effort to show how they fit into this overview of SHM. In the conclusions are stated a number of technical challenges that the authors believe must be addressed if SHM is to gain wider acceptance.

2,152 citations


Journal ArticleDOI
TL;DR: The motivation for using multi-model ensembles, the methodologies published so far and their results for regional temperature projections are outlined, and the challenges in interpreting multi-model results are discussed.
Abstract: Recent coordinated efforts, in which numerous climate models have been run for a common set of experiments, have produced large datasets of projections of future climate for various scenarios. Thos...

1,582 citations


Journal ArticleDOI
TL;DR: The motivations for and recent history of SHM applications to various forms of civil infrastructure are described, and the present state of the art and future developments in instrumentation, data acquisition, communication systems and data mining and presentation procedures for diagnosis of infrastructural 'health' are discussed.
Abstract: Structural health monitoring (SHM) is a term increasingly used in the last decade to describe a range of systems implemented on full-scale civil infrastructures and whose purposes are to assist and inform operators about continued 'fitness for purpose' of structures under gradual or sudden changes to their state, to learn about either or both of the load and response mechanisms. Arguably, various forms of SHM have been employed in civil infrastructure for at least half a century, but it is only in the last decade or two that computer-based systems are being designed for the purpose of assisting owners/operators of ageing infrastructure with timely information for their continued safe and economic operation. This paper describes the motivations for and recent history of SHM applications to various forms of civil infrastructure and provides case studies on specific types of structure. It ends with a discussion of the present state-of-the-art and future developments in terms of instrumentation, data acquisition, communication systems and data mining and presentation procedures for diagnosis of infrastructural 'health'.

823 citations


Journal ArticleDOI
TL;DR: Data normalization is the procedure of separating signal changes caused by operational and environmental variations of the system from structural changes of interest, such as structural deterioration or degradation.
Abstract: Stated in its most basic form, the objective of structural health monitoring is to ascertain if damage is present or not based on measured dynamic or static characteristics of a system to be monitored. In reality, structures are subject to changing environmental and operational conditions that affect measured signals, and these ambient variations of the system can often mask subtle changes in the system's vibration signal caused by damage. Data normalization is a procedure to normalize datasets, so that signal changes caused by operational and environmental variations of the system can be separated from structural changes of interest, such as structural deterioration or degradation. This paper first reviews the effects of environmental and operational variations on real structures as reported in the literature. Then, this paper presents research progress that has been made in the area of data normalization.
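A minimal sketch of one normalization strategy from the literature this paper reviews: regress a damage-sensitive feature (here a natural frequency) on a measured environmental variable and monitor the residuals. All data, coefficients and the 3-sigma threshold below are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical training data from the undamaged structure
temp = rng.uniform(-5.0, 30.0, 500)                        # ambient temperature, deg C
freq = 2.50 - 0.002 * temp + rng.normal(0.0, 0.003, 500)   # first natural frequency, Hz

# Fit the healthy-state temperature dependence and set a control limit
coeffs = np.polyfit(temp, freq, deg=1)
residuals = freq - np.polyval(coeffs, temp)
threshold = 3.0 * residuals.std()

# A new measurement is flagged only if it deviates beyond what
# temperature alone can explain
t_new, f_new = 12.0, 2.46
if abs(f_new - np.polyval(coeffs, t_new)) > threshold:
    print("possible structural change")
else:
    print("within normal environmental variation")
```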

685 citations


Journal ArticleDOI
TL;DR: Hot-wire data acquired in a high Reynolds number facility are used to illustrate the need for adequate scale separation when considering the coherent structure in wall-bounded turbulence, and it is found that a large-scale motion in the log region becomes increasingly comparable in energy to the near-wall cycle as the Reynolds number increases.
Abstract: Hot-wire data acquired in a high Reynolds number facility are used to illustrate the need for adequate scale separation when considering the coherent structure in wall-bounded turbulence. It is found that a large-scale motion in the log region becomes increasingly comparable in energy to the near-wall cycle as the Reynolds number increases. Through decomposition of fluctuating velocity signals, it is shown that this large-scale motion has a distinct modulating influence on the small-scale energy (akin to amplitude modulation). Reassessment of DNS data, in light of these results, shows similar trends, with the rate and intensity of production due to the near-wall cycle subject to a modulating influence from the largest-scale motions.
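The modulation analysis lends itself to a compact illustration (a sketch in the spirit of the decomposition described, not the authors' exact procedure; the synthetic signal and the 10 Hz scale-separation cut-off are assumptions):

```python
import numpy as np
from scipy.signal import hilbert

rng = np.random.default_rng(1)
n, dt = 2**14, 1e-3
t = np.arange(n) * dt

large = np.sin(2 * np.pi * 2.0 * t)                    # surrogate large-scale motion
small = (1.0 + 0.5 * large) * rng.normal(0.0, 1.0, n)  # small scales, amplitude-modulated
u = large + small                                      # composite "velocity" signal

# Spectral cut-off separating large and small scales
U = np.fft.rfft(u)
f = np.fft.rfftfreq(n, dt)
u_large = np.fft.irfft(np.where(f < 10.0, U, 0.0), n)
u_small = u - u_large

# Envelope of the small scales, low-pass filtered to large-scale frequencies
env = np.abs(hilbert(u_small))
env_large = np.fft.irfft(np.where(f < 10.0, np.fft.rfft(env), 0.0), n)

# Positive correlation indicates large-scale amplitude modulation
print(f"modulation correlation: {np.corrcoef(u_large, env_large)[0, 1]:.2f}")
```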

650 citations


Journal ArticleDOI
TL;DR: While a multistep process seems to be required, isoprene oxidation products are more likely to participate in growth and sulphuric acid is more likely to participate in nucleation; iodine oxides are likely to participate in both nucleation and growth.
Abstract: The current knowledge of primary and secondary marine aerosol formation is reviewed. For primary marine aerosol, recent source functions have demonstrated a significant flux of submicrometre particles down to radii of 20 nm. Moreover, the source functions derived from different techniques up to 10 μm have come within a factor of two of each other. For secondary marine aerosol formation, recent advances have identified iodine oxides and isoprene oxidation products, in addition to sulphuric acid, as contributing to formation and growth, although the exact roles remain to be determined. While a multistep process seems to be required, isoprene oxidation products are more likely to participate in growth and sulphuric acid is more likely to participate in nucleation. Iodine oxides are likely to participate in both nucleation and growth.

548 citations


Journal ArticleDOI
TL;DR: A reassessment of the role of complex climate models as predictive tools on decadal and longer time scales is argued for, along with a reconsideration of strategies for model development and experimental design.
Abstract: Over the last 20 years, climate models have been developed to an impressive level of complexity. They are core tools in the study of the interactions of many climatic processes and justifiably provide an additional strand in the argument that anthropogenic climate change is a critical global problem. Over a similar period, there has been growing interest in the interpretation and probabilistic analysis of the output of computer models; particularly, models of natural systems. The results of these areas of research are being sought and utilized in the development of policy, in other academic disciplines, and more generally in societal decision making. Here, our focus is solely on complex climate models as predictive tools on decadal and longer time scales. We argue for a reassessment of the role of such models when used for this purpose and a reconsideration of strategies for model development and experimental design. Building on more generic work, we categorize sources of uncertainty as they relate to this specific problem and discuss experimental strategies available for their quantification. Complex climate models, as predictive tools for many variables and scales, cannot be meaningfully calibrated because they are simulating a never before experienced state of the system; the problem is one of extrapolation. It is therefore inappropriate to apply any of the currently available generic techniques which utilize observations to calibrate or weight models to produce forecast probabilities for the real world. To do so is misleading to the users of climate science in wider society. In this context, we discuss where we derive confidence in climate forecasts and present some concepts to aid discussion and communicate the state-of-the-art. Effective communication of the underlying assumptions and sources of forecast uncertainty is critical in the interaction between climate science, the impacts communities and society in general.

444 citations


Journal ArticleDOI
TL;DR: Only intense simultaneous efforts to slow CO2 emissions and reduce non-CO2 forcings can keep climate within or near the range of the past million years.
Abstract: Palaeoclimate data show that the Earth's climate is remarkably sensitive to global forcings. Positive feedbacks predominate. This allows the entire planet to be whipsawed between climate states. One feedback, the 'albedo flip' property of ice/water, provides a powerful trigger mechanism. A climate forcing that 'flips' the albedo of a sufficient portion of an ice sheet can spark a cataclysm. Inertia of ice sheet and ocean provides only moderate delay to ice sheet disintegration and a burst of added global warming. Recent greenhouse gas (GHG) emissions place the Earth perilously close to dramatic climate change that could run out of our control, with great dangers for humans and other creatures. Carbon dioxide (CO2) is the largest human-made climate forcing, but other trace constituents are also important. Only intense simultaneous efforts to slow CO2 emissions and reduce non-CO2 forcings can keep climate within or near the range of the past million years. The most important of the non-CO2 forcings is methane (CH4), as it causes the second largest human-made GHG climate forcing and is the principal cause of increased tropospheric ozone (O3), which is the third largest GHG forcing. Nitrous oxide (N2O) should also be a focus of climate mitigation efforts. Black carbon ('black soot') has a high global warming potential (approx. 2000, 500 and 200 for 20, 100 and 500 years, respectively) and deserves greater attention. Some forcings are especially effective at high latitudes, so concerted efforts to reduce their emissions could preserve Arctic ice, while also having major benefits for human health, agricultural productivity and the global environment.
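To make the quoted global warming potentials concrete: in standard usage (not spelled out in the abstract), CO2-equivalent emissions scale linearly with the GWP at the chosen horizon H,

$$ E_{\mathrm{CO_2\text{-}eq}}(H) = E_{\mathrm{BC}} \times \mathrm{GWP}_{\mathrm{BC}}(H), $$

so 1 Tg of black carbon counts as roughly 2000, 500 or 200 Tg CO2-eq at the 20-, 100- and 500-year horizons, respectively.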

437 citations


Journal ArticleDOI
TL;DR: In this paper, it is shown that the very-large-scale and main turbulent motions act to decelerate the flow in the region above the maximum of the Reynolds shear stress.
Abstract: Large-scale motions (LSMs; having wavelengths up to 2–3 pipe radii) and very-LSMs (having wavelengths more than 3 pipe radii) have been shown to carry more than half of the kinetic energy and Reynolds shear stress in a fully developed pipe flow. Studies using essentially the same methods of measurement and analysis have been extended to channel and zero-pressure-gradient boundary-layer flows to determine whether large structures appear in these canonical wall flows and how their properties compare with those of the pipe flow. The very large scales, especially those of the boundary layer, are shorter than the corresponding scales in the pipe flow, but otherwise share a common behaviour, suggesting that they arise from similar mechanism(s) aside from the modifying influences of the outer geometries. Spectra of the net force due to the Reynolds shear stress in the channel and boundary layer flows are similar to those in the pipe flow. They show that the very-large-scale and main turbulent motions act to decelerate the flow in the region above the maximum of the Reynolds shear stress.
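The wavelength-based decomposition that defines LSMs and VLSMs can be sketched as follows (illustrative only: the velocity record is a random placeholder, and the convection velocity, pipe radius and Taylor's-hypothesis mapping lambda = U_c/f are assumptions):

```python
import numpy as np

rng = np.random.default_rng(2)
n, dt = 2**15, 1e-4
U_c, R_pipe = 10.0, 0.05        # convection velocity (m/s) and pipe radius (m), assumed

u = rng.normal(size=n)          # placeholder for a measured streamwise velocity series
spec = np.abs(np.fft.rfft(u))**2
f = np.fft.rfftfreq(n, dt)

# Map frequency to streamwise wavelength via Taylor's hypothesis
wavelength = np.divide(U_c, f, out=np.full_like(f, np.inf), where=f > 0)

# Fraction of streamwise energy at wavelengths longer than 3 pipe radii (VLSMs)
frac = spec[wavelength > 3 * R_pipe].sum() / spec.sum()
print(f"energy fraction at wavelengths > 3R: {frac:.2f}")
```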

414 citations


Journal ArticleDOI
TL;DR: In this article, the sources of aberrations, their effects and their correction with adaptive optics, particularly in confocal and two-photon microscopes, are discussed. Applications of adaptive optics in the related areas of optical data storage, optical tweezers and micro/nanofabrication are also reviewed.
Abstract: The imaging properties of optical microscopes are often compromised by aberrations that reduce image resolution and contrast. Adaptive optics technology has been employed in various systems to correct these aberrations and restore performance. This has required various departures from the traditional adaptive optics schemes that are used in astronomy. This review discusses the sources of aberrations, their effects and their correction with adaptive optics, particularly in confocal and two-photon microscopes. Different methods of wavefront sensing, indirect aberration measurement and aberration correction devices are discussed. Applications of adaptive optics in the related areas of optical data storage, optical tweezers and micro/nanofabrication are also reviewed.

404 citations


Journal ArticleDOI
TL;DR: This paper concludes the theme issue on structural health monitoring (SHM) by discussing the concept of damage prognosis (DP), which attempts to forecast system performance by assessing the current damage state, estimating the future loading environments for that system, and predicting through simulation and past experience the remaining useful life of the system.
Abstract: This paper concludes the theme issue on structural health monitoring (SHM) by discussing the concept of damage prognosis (DP). DP attempts to forecast system performance by assessing the current damage state of the system (i.e. SHM), estimating the future loading environments for that system, and predicting through simulation and past experience the remaining useful life of the system. The successful development of a DP capability will require the further development and integration of many technology areas including both measurement/processing/telemetry hardware and a variety of deterministic and probabilistic predictive modelling capabilities, as well as the ability to quantify the uncertainty in these predictions. The multidisciplinary and challenging nature of the DP problem, its current embryonic state of development, and its tremendous potential for life-safety and economic benefits qualify DP as a ‘grand challenge’ problem for engineers in the twenty-first century.

Journal ArticleDOI
TL;DR: To improve the ability of low-cost wireless sensing units to detect the onset of structural damage, the wireless sensing unit paradigm is extended to include the capability to command actuators and active sensors.
Abstract: Wireless monitoring has emerged in recent years as a promising technology that could greatly impact the field of structural monitoring and infrastructure asset management. This paper is a summary of research efforts that have resulted in the design of numerous wireless sensing unit prototypes explicitly intended for implementation in civil structures. Wireless sensing units integrate wireless communications and mobile computing with sensors to deliver a relatively inexpensive sensor platform. A key design feature of wireless sensing units is the collocation of computational power and sensors; the tight integration of computing with a wireless sensing unit provides sensors with the opportunity to self-interrogate measurement data. In particular, there is strong interest in using wireless sensing units to build structural health monitoring systems that interrogate structural data for signs of damage. After the hardware and the software designs of wireless sensing units are completed, the Alamosa Canyon Bridge in New Mexico is utilized to validate their accuracy and reliability. To improve the ability of low-cost wireless sensing units to detect the onset of structural damage, the wireless sensing unit paradigm is extended to include the capability to command actuators and active sensors.

Journal ArticleDOI
TL;DR: A number of problems that exist with the use of inverse methods in damage detection and location, including modelling error, environmental effects, damage localization and regularization are discussed.
Abstract: This paper gives an overview of the use of inverse methods in damage detection and location, using measured vibration data. Inverse problems require the use of a model and the identification of uncertain parameters of this model. Damage is often local in nature and although the effect of the loss of stiffness may require only a small number of parameters, the lack of knowledge of the location means that a large number of candidate parameters must be included. This paper discusses a number of problems that exist with this approach to health monitoring, including modelling error, environmental effects, damage localization and regularization.
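A toy version of the ill-posedness discussed here, with all quantities hypothetical: few measured modes, many candidate stiffness parameters, and a Tikhonov penalty to stabilize the estimate.

```python
import numpy as np

rng = np.random.default_rng(3)
n_modes, n_elems = 6, 40                   # few measurements, many candidate parameters
S = rng.normal(size=(n_modes, n_elems))    # sensitivity matrix (would come from a model)

true_damage = np.zeros(n_elems)
true_damage[17] = -0.3                     # 30% stiffness loss in one element
d_freq = S @ true_damage + rng.normal(0.0, 0.01, n_modes)   # noisy frequency shifts

# Plain least squares is underdetermined; Tikhonov regularization stabilizes it
lam = 1.0                                  # regularization weight (a tuning choice)
theta = np.linalg.solve(S.T @ S + lam * np.eye(n_elems), S.T @ d_freq)
print("most suspect element:", int(np.argmax(np.abs(theta))))
```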

Journal ArticleDOI
Masato Hirose, Kenichi Ogawa
TL;DR: The continuous transition from walking in a straight line to making a turn has been achieved with the latest humanoid robot ASIMO, Honda's most advanced robot so far in both its mechanism and its control system.
Abstract: Honda has been doing research on robotics since 1986 with a focus upon bipedal walking technology. The research started with straight and static walking of the first prototype two-legged robot. Now, the continuous transition from walking in a straight line to making a turn has been achieved with the latest humanoid robot ASIMO. ASIMO is the most advanced robot of Honda so far in the mechanism and the control system. ASIMO's configuration allows it to operate freely in the human living space. It could be of practical help to humans with its five-fingered arms as well as its walking function. The target of further development of ASIMO is to develop a robot to improve life in human society. Much development work will be continued both mechanically and electronically, staying true to Honda's 'challenging spirit'.

Journal ArticleDOI
TL;DR: The object of this paper is to illustrate the utility of the data-driven approach to damage identification by means of a number of case studies.
Abstract: In broad terms, there are two approaches to damage identification. Model-driven methods establish a high-fidelity physical model of the structure, usually by finite element analysis, and then establish a comparison metric between the model and the measured data from the real structure. If the model is for a system or structure in normal (i.e. undamaged) condition, any departures indicate that the structure has deviated from normal condition and damage is inferred. Data-driven approaches also establish a model, but this is usually a statistical representation of the system, e.g. a probability density function of the normal condition. Departures from normality are then signalled by measured data appearing in regions of very low density. The algorithms that have been developed over the years for data-driven approaches are mainly drawn from the discipline of pattern recognition, or more broadly, machine learning. The object of this paper is to illustrate the utility of the data-driven approach to damage identification by means of a number of case studies.
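A minimal sketch of the data-driven idea (the Gaussian normal-condition model and 99th-percentile threshold are illustrative choices, not the paper's case studies):

```python
import numpy as np

rng = np.random.default_rng(4)
X_normal = rng.normal(size=(1000, 4))      # feature vectors from the undamaged state

mu = X_normal.mean(axis=0)
cov_inv = np.linalg.inv(np.cov(X_normal, rowvar=False))

def novelty_index(x):
    d = x - mu
    return float(d @ cov_inv @ d)          # Mahalanobis squared distance

# Threshold set from the training data itself
threshold = np.percentile([novelty_index(x) for x in X_normal], 99)

x_new = np.array([3.0, 0.1, -2.5, 1.8])    # a hypothetical new measurement
print("damage inferred" if novelty_index(x_new) > threshold else "normal condition")
```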

Journal ArticleDOI
TL;DR: This work estimates point-source ebullition for 16 lakes in Alaska and Siberia that represent several common northern lake types (glacial, alluvial floodplain, peatland and thermokarst) and finds that northern lakes are a globally significant source of atmospheric CH4.
Abstract: Large uncertainties in the budget of atmospheric methane (CH4) limit the accuracy of climate change projections. Here we describe and quantify an important source of CH4, point-source ebullition (bubbling) from northern lakes, that has not been incorporated in previous regional or global methane budgets. Employing a method recently introduced to measure ebullition more accurately by taking into account its spatial patchiness in lakes, we estimate point-source ebullition for 16 lakes in Alaska and Siberia that represent several common northern lake types: glacial, alluvial floodplain, peatland and thermokarst (thaw) lakes. Extrapolation of measured fluxes from these 16 sites to all lakes north of 45° N using circumpolar databases of lake and permafrost distributions suggests that northern lakes are a globally significant source of atmospheric CH4, emitting approximately 24.2 ± 10.5 Tg CH4 yr-1. Thermokarst lakes have particularly high emissions because they release CH4 produced from organic matter previously sequestered in permafrost. A carbon mass balance calculation of CH4 release from thermokarst lakes on the Siberian yedoma ice complex suggests that these lakes alone would emit as much as approximately 49,000 Tg CH4 if this ice complex were to thaw completely. Using a space-for-time substitution based on the current lake distributions in permafrost-dominated and permafrost-free terrains, we estimate that lake emissions would be reduced by approximately 12% in a more probable transitional permafrost scenario and by approximately 53% in a 'permafrost-free' Northern Hemisphere. Long-term decline in CH4 ebullition from lakes due to lake area loss and permafrost thaw would occur only after the large release of CH4 associated with thermokarst lake development in the zone of continuous permafrost.

Journal ArticleDOI
TL;DR: A methodology is described for probabilistic predictions of future climate based on a set of ensemble simulations of equilibrium and time-dependent changes carried out by perturbing poorly constrained parameters controlling key physical and biogeochemical processes in the HadCM3 coupled ocean–atmosphere global climate model.
Abstract: A methodology is described for probabilistic predictions of future climate. This is based on a set of ensemble simulations of equilibrium and time-dependent changes, carried out by perturbing poorly constrained parameters controlling key physical and biogeochemical processes in the HadCM3 coupled ocean-atmosphere global climate model.
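Generically, a perturbed-physics ensemble samples the poorly constrained parameters and reads probabilities off the spread of outcomes; the sketch below uses an invented toy_model stand-in, not HadCM3.

```python
import numpy as np

rng = np.random.default_rng(5)

def toy_model(entrainment, albedo_feedback):
    # stand-in for an expensive simulation returning warming in 2100 (deg C)
    return 2.0 + 1.5 * entrainment - 0.8 * albedo_feedback

# Sample each uncertain parameter over a plausible (expert-judged) range
params = rng.uniform([0.0, 0.0], [1.0, 1.0], size=(1000, 2))
warming = np.array([toy_model(e, a) for e, a in params])

lo, hi = np.percentile(warming, [5, 95])
print(f"5-95% range of projected warming: {lo:.2f} to {hi:.2f} deg C")
```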

Journal ArticleDOI
TL;DR: It is concluded that the overall trend in SSTs, and tropical cyclone and hurricane numbers is substantially influenced by greenhouse warming.
Abstract: We find that long-period variations in tropical cyclone and hurricane frequency over the past century in the North Atlantic Ocean have occurred as three relatively stable regimes separated by sharp transitions. Each regime has seen 50% more cyclones and hurricanes than the previous regime and is associated with a distinct range of sea surface temperatures (SSTs) in the eastern Atlantic Ocean. Overall, there appears to have been a substantial 100-year trend leading to related increases of over 0.7°C in SST and over 100% in tropical cyclone and hurricane numbers. It is concluded that the overall trend in SSTs, and tropical cyclone and hurricane numbers is substantially influenced by greenhouse warming. Superimposed on the evolving tropical cyclone and hurricane climatology is a completely independent oscillation manifested in the proportions of tropical cyclones that become major and minor hurricanes. This characteristic has no distinguishable net trend and appears to be associated with concomitant variations in the proportion of equatorial and higher latitude hurricane developments, perhaps arising from internal oscillations of the climate system. The period of enhanced major hurricane activity during 1945–1964 is consistent with a peak period in major hurricane proportions.

Journal ArticleDOI
TL;DR: The mechanics of hard biological materials, and more specifically of nacre and bone, are discussed; these natural composites are made up of relatively weak components arranged in intricate ways to achieve specific combinations of stiffness, strength and toughness.
Abstract: Billions of years of evolution have produced extremely efficient natural materials, which are increasingly becoming a source of inspiration for engineers. Biomimetics—the science of imitating nature—is a growing multidisciplinary field which is now leading to the fabrication of novel materials with remarkable mechanical properties. This article discusses the mechanics of hard biological materials, and more specifically of nacre and bone. These high-performance natural composites are made up of relatively weak components (brittle minerals and soft proteins) arranged in intricate ways to achieve specific combinations of stiffness, strength and toughness (resistance to cracking). Determining which features control the performance of these materials is the first step in biomimetics. These ‘key features’ can then be implemented into artificial bio-inspired synthetic materials, using innovative techniques such as layer-by-layer assembly or ice-templated crystallization. The most promising approaches, however, are self-assembly and biomineralization because they will enable tight control of structures at the nanoscale. In this ‘bottom-up’ fabrication, also inspired from nature, molecular structures and crystals are assembled with little or no external intervention. The resulting materials will offer new combinations of low weight, stiffness and toughness, with added functionalities such as self-healing. Only tight collaborations between engineers, chemists, materials scientists and biologists will make these ‘next-generation’ materials a reality.

Journal ArticleDOI
TL;DR: It is found that many of the previously proposed empirical relations accurately describe the local Cf behaviour when modified and underpinned by the same experimental data.
Abstract: Flat plate turbulent boundary layers under zero pressure gradient at high Reynolds numbers are studied to reveal appropriate scale relations and asymptotic behaviour. Careful examination of the skin-friction coefficient results confirms the necessity for direct and independent measurement of wall shear stress. We find that many of the previously proposed empirical relations accurately describe the local Cf behaviour when modified and underpinned by the same experimental data. The variation of the integral parameter, H, shows consistent agreement between the experimental data and the relation from classical theory. In accordance with the classical theory, the ratio of Δ and δ asymptotes to a constant. Then, the usefulness of the ratio of appropriately defined mean and turbulent time-scales to define and diagnose equilibrium flow is established. Next, the description of mean velocity profiles is revisited, and the validity of the logarithmic law is re-established using both the mean velocity profile and its diagnostic function. The wake parameter, Π, is shown to reach an asymptotic value at the highest available experimental Reynolds numbers if correct values of logarithmic-law constants and an appropriate skin-friction estimate are used. The paper closes with a discussion of the Reynolds number trends of the outer velocity defect which are important to establish a consistent similarity theory and appropriate scaling.
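For reference, the logarithmic law and the diagnostic function mentioned above take the standard forms (notation standard in this literature; the specific constants are the subject of the paper and are not quoted here):

$$ U^+ = \frac{1}{\kappa} \ln y^+ + B, \qquad \Xi \equiv y^+ \frac{\mathrm{d}U^+}{\mathrm{d}y^+}, $$

with Ξ equal to the constant 1/κ wherever the logarithmic law holds.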

Journal ArticleDOI
TL;DR: Recently discovered cells based on mesoscopic inorganic or organic semiconductors, commonly referred to as 'bulk' junctions due to their three-dimensional structure, are very attractive alternatives which offer the prospect of very low cost fabrication.
Abstract: The Sun provides approximately 100,000 terawatts to the Earth, which is about 10,000 times the present rate of the world's energy consumption. Photovoltaic cells are being increasingly used to tap into this huge resource and will play a key role in future sustainable energy systems. So far, solid-state junction devices, usually made of silicon, crystalline or amorphous, and profiting from the experience and material availability resulting from the semiconductor industry, have dominated photovoltaic solar energy converters. These systems have by now attained a mature state serving a rapidly growing market, expected to rise to 300 GW by 2030. However, the cost of photovoltaic electricity production is still too high to be competitive with nuclear or fossil energy. Thin film photovoltaic cells made of CuInSe2 or CdTe are being increasingly employed along with amorphous silicon. The recently discovered cells based on mesoscopic inorganic or organic semiconductors, commonly referred to as 'bulk' junctions due to their three-dimensional structure, are very attractive alternatives which offer the prospect of very low cost fabrication. The prototype of this family of devices is the dye-sensitized solar cell (DSC), which accomplishes the optical absorption and the charge separation processes by the association of a sensitizer as light-absorbing material with a wide band gap semiconductor of mesoporous or nanocrystalline morphology. Research is booming also in the area of third generation photovoltaic cells where multi-junction devices and a recent breakthrough concerning multiple carrier generation in quantum dot absorbers offer promising perspectives.

Journal ArticleDOI
TL;DR: Experiments with technically implemented leg models, which explore some of the basic principles of locomotion and respective implications for construction and control, are promising.
Abstract: Research on the biomechanics of animal and human locomotion provides insight into basic principles of locomotion and respective implications for construction and control. Nearly elastic operation of the leg is necessary to reproduce the basic dynamics in walking and running. Elastic leg operation can be modelled with a spring-mass model. This model can be used as a template with respect to both gaits in the construction and control of legged machines. With respect to the segmented leg, the humanoid arrangement saves energy and ensures structural stability. With the quasi-elastic operation the leg inherits the property of self-stability, i.e. the ability to stabilize a system in the presence of disturbances without sensing the disturbance or its direct effects. Self-stability can be conserved in the presence of musculature with its crucial damping property. To ensure secure foothold visco-elastic suspended muscles serve as shock absorbers. Experiments with technically implemented leg models, which explore some of these principles, are promising.
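The spring-mass template takes only a few lines to simulate; the sketch below integrates a vertical hopper with semi-implicit Euler (mass, stiffness and initial condition are illustrative values, not the paper's):

```python
m, k, l0, g = 80.0, 20000.0, 1.0, 9.81   # mass (kg), leg stiffness (N/m), rest length (m)
y, v, dt = 1.05, 0.0, 1e-4               # drop from slightly above the rest length

apexes = []
v_prev = v
for _ in range(200000):                  # 20 s of hopping
    spring = k * (l0 - y) if y < l0 else 0.0   # leg force acts only during stance
    v += (spring / m - g) * dt           # semi-implicit Euler: velocity first
    y += v * dt
    if v_prev > 0.0 >= v:                # apex: upward velocity crosses zero
        apexes.append(y)
    v_prev = v

print("apex heights (m):", [f"{h:.3f}" for h in apexes[:5]])
```

With an ideal (undamped) leg spring the apex height is conserved from hop to hop, which is the nearly elastic leg operation the abstract refers to.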

Journal ArticleDOI
TL;DR: An overview of the principles and techniques of time-series methods for fault detection, identification and estimation in vibrating structures is presented, and certain new methods are introduced.
Abstract: An overview of the principles and techniques of time-series methods for fault detection, identification and estimation in vibrating structures is presented, and certain new methods are introduced. The methods are classified, and their features and operation are discussed. Their practicality and effectiveness are demonstrated through brief presentations of three case studies pertaining to fault detection, identification and estimation in an aircraft panel, a scale aircraft skeleton structure and a simple nonlinear simulated structure.
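A minimal residual-based example of the class of methods surveyed (the AR(2) system and the 'fault', a small coefficient change, are invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(6)

def simulate(n, a1, a2):
    # second-order autoregressive response driven by noise
    x = np.zeros(n)
    for t in range(2, n):
        x[t] = a1 * x[t - 1] + a2 * x[t - 2] + rng.normal(0.0, 0.1)
    return x

def fit_ar2(x):
    # least-squares AR(2) coefficient estimate
    X = np.column_stack([x[1:-1], x[:-2]])
    return np.linalg.lstsq(X, x[2:], rcond=None)[0]

def residual_var(x, a):
    return np.var(x[2:] - (a[0] * x[1:-1] + a[1] * x[:-2]))

healthy = simulate(5000, 1.5, -0.7)
a_hat = fit_ar2(healthy)
baseline = residual_var(healthy, a_hat)

test = simulate(5000, 1.4, -0.65)        # slightly altered dynamics: the "fault"
print(f"residual variance ratio: {residual_var(test, a_hat) / baseline:.2f} (~1 if healthy)")
```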

Journal ArticleDOI
Mat Collins
TL;DR: This paper introduces some of the concepts and issues in new approaches to dealing with the sources of uncertainty that arise in predicting future climate.
Abstract: Predictions of future climate are of central importance in determining actions to adapt to the impacts of climate change and in formulating targets to reduce emissions of greenhouse gases. In the absence of analogues of the future, physically based numerical climate models must be used to make predictions. New approaches are under development to deal with a number of sources of uncertainty that arise in the prediction process. This paper introduces some of the concepts and issues in these new approaches, which are discussed in more detail in the papers contained in this issue.

Journal ArticleDOI
TL;DR: This work focuses on the usage of decadal to centennial time-scale climate change simulations as inputs to decision making, but acknowledges that robust adaptation to the variability of present day climate encourages the development of less vulnerable systems as well as building critical experience in how to respond to climatic uncertainty.
Abstract: There is a scientific consensus regarding the reality of anthropogenic climate change. This has led to substantial efforts to reduce atmospheric greenhouse gas emissions and thereby mitigate the impacts of climate change on a global scale. Despite these efforts, we are committed to substantial further changes over at least the next few decades. Societies will therefore have to adapt to changes in climate. Both adaptation and mitigation require action on scales ranging from local to global, but adaptation could directly benefit from climate predictions on regional scales while mitigation could be driven solely by awareness of the global problem; regional projections being principally of motivational value. We discuss how recent developments of large ensembles of climate model simulations can be interpreted to provide information on these scales and to inform societal decisions. Adaptation is most relevant as an influence on decisions which exist irrespective of climate change, but which have consequences on decadal time-scales. Even in such situations, climate change is often only a minor influence; perhaps helping to restrict the choice of 'no regrets' strategies. Nevertheless, if climate models are to provide inputs to societal decisions, it is important to interpret them appropriately. We take climate ensembles exploring model uncertainty as potentially providing a lower bound on the maximum range of uncertainty and thus a non-discountable climate change envelope. An analysis pathway is presented, describing how this information may provide an input to decisions, sometimes via a number of other analysis procedures and thus a cascade of uncertainty. An initial screening is seen as a valuable component of this process, potentially avoiding unnecessary effort while guiding decision makers through issues of confidence and robustness in climate modelling information. Our focus is the usage of decadal to centennial time-scale climate change simulations as inputs to decision making, but we acknowledge that robust adaptation to the variability of present day climate encourages the development of less vulnerable systems as well as building critical experience in how to respond to climatic uncertainty.

Journal ArticleDOI
TL;DR: An experimental study is presented to demonstrate how this technique can be used to detect structural damage in real time, and a modified frequency-domain autoregressive model with exogenous inputs (ARX) is described.
Abstract: This paper presents an overview and recent advances in impedance-based structural health monitoring. The basic principle behind this technique is to apply high-frequency structural excitations (typically greater than 30 kHz) through surface-bonded piezoelectric transducers, and measure the impedance of structures by monitoring the current and voltage applied to the piezoelectric transducers. Changes in impedance indicate changes in the structure, which in turn can indicate that damage has occurred. An experimental study is presented to demonstrate how this technique can be used to detect structural damage in real time. Signal processing methods that address damage classifications and data compression issues associated with the use of the impedance methods are also summarized. Finally, a modified frequency-domain autoregressive model with exogenous inputs (ARX) is described. The frequency-domain ARX model, constructed by measured impedance data, is used to diagnose structural damage with levels of statistical confidence.
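One common damage metric in this literature, the root-mean-square deviation (RMSD) between baseline and current impedance signatures, is easy to sketch (the resonance-like signature and the damage-induced shift below are synthetic):

```python
import numpy as np

rng = np.random.default_rng(7)
f = np.linspace(30e3, 40e3, 400)         # excitation band above 30 kHz

def impedance(shift_hz):
    # toy resonant signature; 'shift_hz' mimics a damage-induced resonance shift
    peak = 100.0 / (1.0 + ((f - (34e3 + shift_hz)) / 500.0) ** 2)
    return peak + rng.normal(0.0, 0.5, f.size)

Z_base = impedance(0.0)                  # pristine baseline
Z_new = impedance(150.0)                 # resonance shifted by 150 Hz

rmsd = np.sqrt(np.sum((Z_new - Z_base) ** 2) / np.sum(Z_base ** 2))
print(f"RMSD damage metric: {rmsd:.3f}")
```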

Journal ArticleDOI
TL;DR: In this paper, the authors examined the suitability of the time-dependent global temperature change potential (GTP) metric in the context of climate policy and found that the weighting of emissions using the GTP is significantly dependent on the scenarios of future emissions and the sensitivity of the climate system.
Abstract: Multi-gas climate agreements require a metric by which emissions of gases with different lifetimes and radiative properties can be placed on a common scale. The Kyoto Protocol to the United Nations Framework Convention on Climate Change uses the global warming potential (GWP) as such a metric. The GWP has attracted particular criticism as being inappropriate in the context of climate policy which seeks to restrict warming below a given target, because it gives equal weight to emissions irrespective of the target and the proximity to the target. The use of an alternative metric, the time-dependent global temperature change potential (GTP), is examined for its suitability and the prospects for it including very short-lived species. It retains the transparency and relative ease of use, which are attractive features of the GWP, but explicitly includes a dependence on the target of climate policy. The weighting of emissions using the GTP is found to be significantly dependent on the scenarios of future emissions and the sensitivity of the climate system. This may indicate that the use of any GTP-based weighting in future policymaking would necessitate regular revisions, as the global-mean temperature moves towards a specified target.
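The two metrics compared in the paper have standard definitions in the metrics literature: for a pulse emission of species x relative to the same mass of CO2,

$$ \mathrm{GWP}_x(H) = \frac{\int_0^H \mathrm{RF}_x(t)\,\mathrm{d}t}{\int_0^H \mathrm{RF}_{\mathrm{CO_2}}(t)\,\mathrm{d}t}, \qquad \mathrm{GTP}_x(H) = \frac{\Delta T_x(H)}{\Delta T_{\mathrm{CO_2}}(H)}, $$

which makes explicit why the GTP, unlike the GWP, depends on the climate response (and hence the climate sensitivity) at the target time H.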

Journal ArticleDOI
TL;DR: In this paper, the authors present the wide-ranging applications of nanodiamond particles to date and discuss future research directions in this field, predicting a huge increase in research with these materials in the very near future.
Abstract: Although nanocrystalline diamond powders have been produced in industrial quantities, mainly by detonation synthesis, for many decades their use in applications other than traditional polishing and grinding have been limited, until recently. This paper presents the wide-ranging applications of nanodiamond particles to date and discusses future research directions in this field. Owing to the recent commercial availability of these powders and the present interest in nanotechnology, one can predict a huge increase in research with these materials in the very near future. However, to fully exploit these materials, fundamental as well as applied research is required to understand the transition between bulk and surface properties as the size of particles decreases.

Journal ArticleDOI
TL;DR: It is shown that a probabilistic approach provides more informative results that enable the potential risk of impacts to be quantified, but that details of the risks are dependent on the approach used in the analysis.
Abstract: Climate change impacts and adaptation assessments have traditionally adopted a scenario-based approach, which precludes an assessment of the relative risks of particular adaptation options. Probabilistic impact assessments, especially if based on a thorough analysis of the uncertainty in an impact forecast system, enable adoption of a risk-based assessment framework. However, probabilistic impacts information is conditional and will change over time. We explore the implications of a probabilistic end-to-end risk-based framework for climate impacts assessment, using the example of water resources in the Thames River, UK. We show that a probabilistic approach provides more informative results that enable the potential risk of impacts to be quantified, but that details of the risks are dependent on the approach used in the analysis.

Journal ArticleDOI
TL;DR: This paper examines future economic damages from tropical cyclones in 2050 under a range of assumptions about societal change, climate change and the relationship of climate change to damage. In all cases, efforts to reduce vulnerability to losses (climate adaptation) have far greater potential to reduce tropical cyclone damage than efforts to modulate the behaviour of storms through greenhouse gas emissions reduction policies (climate mitigation).
Abstract: This paper examines future economic damages from tropical cyclones under a range of assumptions about societal change, climate change and the relationship of climate change to damage in 2050. It finds in all cases that efforts to reduce vulnerability to losses, often called climate adaptation, have far greater potential effectiveness to reduce damage related to tropical cyclones than efforts to modulate the behaviour of storms through greenhouse gas emissions reduction policies, typically called climate mitigation and achieved through energy policies. The paper urges caution in using economic losses of tropical cyclones as justification for action on energy policies when far more potentially effective options are available.