
Showing papers by "Lancaster University" published in 2012


Journal ArticleDOI
Georges Aad1, T. Abajyan2, Brad Abbott3, Jalal Abdallah4 +2964 more · Institutions (200)
TL;DR: In this article, a search for the Standard Model Higgs boson in proton-proton collisions with the ATLAS detector at the LHC is presented; the observed excess has a significance of 5.9 standard deviations, corresponding to a background fluctuation probability of 1.7×10−9.

9,282 citations


Journal ArticleDOI
11 Oct 2012-Nature
TL;DR: This work reviews recent progress in graphene research and in the development of production methods, and critically analyses the feasibility of various graphene applications.
Abstract: Recent years have witnessed many breakthroughs in research on graphene (the first two-dimensional atomic crystal) as well as a significant advance in the mass production of this material. This one-atom-thick fabric of carbon uniquely combines extreme mechanical strength, exceptionally high electronic and thermal conductivities, impermeability to gases, as well as many other supreme properties, all of which make it highly attractive for numerous applications. Here we review recent progress in graphene research and in the development of production methods, and critically analyse the feasibility of various graphene applications.

7,987 citations


Journal ArticleDOI
TL;DR: These guidelines are presented for the selection and interpretation of methods for use by investigators who aim to examine macroautophagy and related processes, as well as for reviewers who need to provide realistic and reasonable critiques of papers that are focused on these processes.
Abstract: In 2008 we published the first set of guidelines for standardizing research in autophagy. Since then, research on this topic has continued to accelerate, and many new scientists have entered the field. Our knowledge base and relevant new technologies have also been expanding. Accordingly, it is important to update these guidelines for monitoring autophagy in different organisms. Various reviews have described the range of assays that have been used for this purpose. Nevertheless, there continues to be confusion regarding acceptable methods to measure autophagy, especially in multicellular eukaryotes. A key point that needs to be emphasized is that there is a difference between measurements that monitor the numbers or volume of autophagic elements (e.g., autophagosomes or autolysosomes) at any stage of the autophagic process vs. those that measure flux through the autophagy pathway (i.e., the complete process); thus, a block in macroautophagy that results in autophagosome accumulation needs to be differentiated from stimuli that result in increased autophagic activity, defined as increased autophagy induction coupled with increased delivery to, and degradation within, lysosomes (in most higher eukaryotes and some protists such as Dictyostelium) or the vacuole (in plants and fungi). In other words, it is especially important that investigators new to the field understand that the appearance of more autophagosomes does not necessarily equate with more autophagy. In fact, in many cases, autophagosomes accumulate because of a block in trafficking to lysosomes without a concomitant change in autophagosome biogenesis, whereas an increase in autolysosomes may reflect a reduction in degradative activity. Here, we present a set of guidelines for the selection and interpretation of methods for use by investigators who aim to examine macroautophagy and related processes, as well as for reviewers who need to provide realistic and reasonable critiques of papers that are focused on these processes. These guidelines are not meant to be a formulaic set of rules, because the appropriate assays depend in part on the question being asked and the system being used. In addition, we emphasize that no individual assay is guaranteed to be the most appropriate one in every situation, and we strongly recommend the use of multiple assays to monitor autophagy. In these guidelines, we consider these various methods of assessing autophagy and what information can, or cannot, be obtained from them. Finally, by discussing the merits and limits of particular autophagy assays, we hope to encourage technical innovation in the field.
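
The flux-versus-accumulation point can be made with a toy calculation. The sketch below uses hypothetical numbers, loosely patterned on marker-turnover assays of the kind the guidelines discuss (in which a marker such as LC3-II is compared with and without a lysosomal inhibitor); it shows how a condition with far more autophagosomes can nevertheless have far less flux.

```python
# Hypothetical illustration: why "more autophagosomes" does not equal more autophagy.
# Flux is estimated as the extra marker that accumulates when lysosomal
# degradation is blocked (marker level with inhibitor minus without).

def flux(marker_with_inhibitor: float, marker_without: float) -> float:
    return marker_with_inhibitor - marker_without

# Condition A: active autophagy; autophagosomes are made AND degraded.
flux_a = flux(marker_with_inhibitor=3.0, marker_without=1.0)   # flux = 2.0

# Condition B: trafficking to lysosomes is blocked; autophagosomes pile up.
flux_b = flux(marker_with_inhibitor=4.1, marker_without=4.0)   # flux = 0.1

# B shows far more autophagosomes at steady state (4.0 vs 1.0),
# yet almost no flux through the pathway.
print(f"flux A = {flux_a:.1f}, flux B = {flux_b:.1f}")
```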

4,316 citations


Journal ArticleDOI
TL;DR: A straightforward guide to understanding, selecting, calculating, and interpreting effect sizes for many types of data and to methods for calculating effect size confidence intervals and power analysis is provided.
Abstract: The Publication Manual of the American Psychological Association (American Psychological Association, 2001, American Psychological Association, 2010) calls for the reporting of effect sizes and their confidence intervals. Estimates of effect size are useful for determining the practical or theoretical importance of an effect, the relative contributions of factors, and the power of an analysis. We surveyed articles published in 2009 and 2010 in the Journal of Experimental Psychology: General, noting the statistical analyses reported and the associated reporting of effect size estimates. Effect sizes were reported for fewer than half of the analyses; no article reported a confidence interval for an effect size. The most often reported analysis was analysis of variance, and almost half of these reports were not accompanied by effect sizes. Partial η2 was the most commonly reported effect size estimate for analysis of variance. For t tests, 2/3 of the articles did not report an associated effect size estimate; Cohen's d was the most often reported. We provide a straightforward guide to understanding, selecting, calculating, and interpreting effect sizes for many types of data and to methods for calculating effect size confidence intervals and power analysis.
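
As a companion to the guide's advice for t tests, here is a hedged Python sketch (illustrative code with simulated data, not taken from the article) of the most commonly recommended pairing: Cohen's d with a percentile-bootstrap confidence interval.

```python
import numpy as np

def cohens_d(x, y):
    """Cohen's d for two independent groups, using the pooled SD."""
    nx, ny = len(x), len(y)
    pooled_var = ((nx - 1) * np.var(x, ddof=1) + (ny - 1) * np.var(y, ddof=1)) / (nx + ny - 2)
    return (np.mean(x) - np.mean(y)) / np.sqrt(pooled_var)

def bootstrap_ci(x, y, n_boot=10_000, alpha=0.05, seed=0):
    """Percentile-bootstrap confidence interval for Cohen's d."""
    rng = np.random.default_rng(seed)
    ds = [cohens_d(rng.choice(x, len(x)), rng.choice(y, len(y))) for _ in range(n_boot)]
    return np.percentile(ds, [100 * alpha / 2, 100 * (1 - alpha / 2)])

# Hypothetical data: two independent groups of 40 observations each.
rng = np.random.default_rng(1)
treatment = rng.normal(0.5, 1.0, 40)
control = rng.normal(0.0, 1.0, 40)
d = cohens_d(treatment, control)
lo, hi = bootstrap_ci(treatment, control)
print(f"d = {d:.2f}, 95% CI [{lo:.2f}, {hi:.2f}]")
```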

3,117 citations


Book
17 May 2012
TL;DR: The book develops a theory of social practice in which practices are composed of materials, competences and meanings, and uses it to trace how links between these elements are made and broken, how practices recruit and lose carriers, how practices connect into bundles and complexes, and what this implies for policy, particularly on climate change and behaviour change.
Abstract: The Dynamics of Social Practice Introducing Theories of Practice Materials and Resources Sequence and Structure Making and Breaking Links Material, Competence and Meaning Car-Driving: Elements and Linkages Making Links Breaking Links Elements Between Practices Standardization and Diversity Individual and Collective Careers The Life of Elements Modes of Circulation Transportation and Access: Material Abstraction, Reversal and Migration: Competence Association and Classification: Meaning Packing and Unpacking Emergence, Disappearance and Persistence Recruitment, Defection and Reproduction First Encounters: Networks and Communities Capture and Commitment: Careers and Carriers Collapse and Transformation: The Dynamics of Defection Daily Paths, Life Paths and Dominant Projects Connections Between Practices Bundles and Complexes Collaboration and Competition Selection and Integration Coordinating Daily Life Circuits of Reproduction Monitoring Practices-as-Performances Monitoring Practices-as-Entities Cross-Referencing Practices-as-Performances Cross-Referencing Practices-as-Entities Aggregation Elements of Coordination Intersecting Circuits Representing the Dynamics of Social Practice Representing Elements and Practices Characterizing Circulation Competition, Transformation and Convergence Reproducing Elements, Practices and Relations between Them Time and Practice Space and Practice Dominant Projects and Power Promoting Transitions in Practice Climate Change and Behaviour Change Basis of Action Processes of Change Positioning Policy Transferable Lessons Practice Theory and Climate Change Policy Configuring Elements of Practice Configuring Relations between Practices Configuring Careers: Carriers and Practices Configuring Connections Practice Oriented Policy Making

2,250 citations


Journal ArticleDOI
TL;DR: This work considers the problem of detecting multiple changepoints in large data sets, framed as minimizing a cost function over possible numbers and locations of changepoints, and introduces a new method for finding the minimum of such cost functions, and hence the optimal number and location of changepoints, with a computational cost that is linear in the number of observations.
Abstract: In this article, we consider the problem of detecting multiple changepoints in large datasets. Our focus is on applications where the number of changepoints will increase as we collect more data: for example, in genetics as we analyze larger regions of the genome, or in finance as we observe time series over longer periods. We consider the common approach of detecting changepoints through minimizing a cost function over possible numbers and locations of changepoints. This includes several established procedures for detecting changepoints, such as penalized likelihood and minimum description length. We introduce a new method for finding the minimum of such cost functions, and hence the optimal number and location of changepoints, with a computational cost that, under mild conditions, is linear in the number of observations. This compares favorably with existing methods for the same problem whose computational cost can be quadratic or even cubic. In simulation studies, we show that our new method can...
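
To make the cost-minimization idea concrete, here is a hedged Python sketch (our own illustrative code, not the authors') of the exact dynamic-programming recursion F(t) = min_s [F(s) + C(y_{s+1:t}) + beta] that penalized-cost methods build on. It runs in O(n^2); the paper's contribution is a pruning rule that reduces this to linear cost under mild conditions, which the sketch omits for clarity.

```python
import numpy as np

def gaussian_mean_cost(cumsum, cumsum2, s, t):
    """Cost of segment y[s:t] under a constant mean:
    the residual sum of squares around the segment mean."""
    n = t - s
    seg_sum = cumsum[t] - cumsum[s]
    seg_sum2 = cumsum2[t] - cumsum2[s]
    return seg_sum2 - seg_sum**2 / n

def optimal_partitioning(y, beta):
    """Exact minimization of total segment cost + beta per changepoint, O(n^2)."""
    n = len(y)
    cumsum = np.concatenate([[0.0], np.cumsum(y)])
    cumsum2 = np.concatenate([[0.0], np.cumsum(y**2)])
    F = np.full(n + 1, np.inf)
    F[0] = -beta                      # so the first segment is not penalized
    last = np.zeros(n + 1, dtype=int)
    for t in range(1, n + 1):
        for s in range(t):
            cand = F[s] + gaussian_mean_cost(cumsum, cumsum2, s, t) + beta
            if cand < F[t]:
                F[t], last[t] = cand, s
    # Backtrack the changepoint locations.
    cps, t = [], n
    while t > 0:
        t = last[t]
        if t > 0:
            cps.append(t)
    return sorted(cps)

rng = np.random.default_rng(0)
y = np.concatenate([rng.normal(0, 1, 100), rng.normal(3, 1, 100)])
print(optimal_partitioning(y, beta=2 * np.log(len(y))))  # BIC-style penalty; ~[100]
```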

1,647 citations


Journal ArticleDOI
20 Apr 2012-Science
TL;DR: Given the scale of use of neonicotinoid insecticides, it is suggested that they may be having a considerable negative impact on wild bumble bee populations across the developed world.
Abstract: Growing evidence for declines in bee populations has caused great concern because of the valuable ecosystem services they provide. Neonicotinoid insecticides have been implicated in these declines because they occur at trace levels in the nectar and pollen of crop plants. We exposed colonies of the bumble bee Bombus terrestris in the laboratory to field-realistic levels of the neonicotinoid imidacloprid, then allowed them to develop naturally under field conditions. Treated colonies had a significantly reduced growth rate and suffered an 85% reduction in production of new queens compared with control colonies. Given the scale of use of neonicotinoids, we suggest that they may be having a considerable negative impact on wild bumble bee populations across the developed world.

1,066 citations


Journal ArticleDOI
TL;DR: In this paper, the electronic structure of silicene and the stability of its weakly buckled honeycomb lattice in an external electric field oriented perpendicular to the monolayer of Si atoms were analyzed.
Abstract: We report calculations of the electronic structure of silicene and the stability of its weakly buckled honeycomb lattice in an external electric field oriented perpendicular to the monolayer of Si atoms. The electric field produces a tunable band gap in the Dirac-type electronic spectrum, the gap being suppressed by a factor of about eight by the high polarizability of the system. At low electric fields, the interplay between this tunable band gap, which is specific to electrons on a honeycomb lattice, and the Kane-Mele spin-orbit coupling induces a transition from a topological to a band insulator, whereas at much higher electric fields silicene becomes a semimetal.
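
In the standard low-energy picture of silicene (a hedged summary in common textbook notation, not necessarily the paper's conventions), the field-induced sublattice potential competes with the Kane-Mele spin-orbit term:

```latex
% Gapped Dirac dispersion near the K (\eta = +1) and K' (\eta = -1) valleys,
% spin \sigma = \pm 1, buckling height 2\ell, perpendicular field E_z:
E_{\pm}(k) = \pm\sqrt{(\hbar v_F k)^2 + \Delta_{\eta\sigma}^2},
\qquad
\Delta_{\eta\sigma} = \ell e E_z - \eta\sigma\,\lambda_{\mathrm{SO}}.
% The gap closes where \ell e E_z = \lambda_{\mathrm{SO}}, separating the
% topological-insulator phase (small E_z) from the band-insulator phase (larger E_z).
```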

969 citations


Journal ArticleDOI
TL;DR: In this paper, the authors used the structure-from-motion (SfM) and multi-view-stereo (MVS) algorithms to estimate erosion rates along a ~50-m-long coastal cliff.
Abstract: Topographic measurements for detailed studies of processes such as erosion or mass movement are usually acquired by expensive laser scanners or rigorous photogrammetry. Here, we test and use an alternative technique based on freely available computer vision software which allows general geoscientists to easily create accurate 3D models from field photographs taken with a consumer-grade camera. The approach integrates structure-from-motion (SfM) and multi-view-stereo (MVS) algorithms and, in contrast to traditional photogrammetry techniques, it requires little expertise and few control measurements, and processing is automated. To assess the precision of the results, we compare SfM-MVS models spanning spatial scales of centimeters (a hand sample) to kilometers (the summit craters of Piton de la Fournaise volcano) with data acquired from laser scanning and formal close-range photogrammetry. The relative precision ratio achieved by SfM-MVS (measurement precision : observation distance) is limited by the straightforward camera calibration model used in the software, but generally exceeds 1:1000 (i.e. centimeter-level precision over measurement distances of tens of meters). We apply SfM-MVS at an intermediate scale, to determine erosion rates along a ~50-m-long coastal cliff. Seven surveys carried out over a year indicate an average retreat rate of 0.70 ± 0.05 m a⁻¹. Sequential erosion maps (at ~0.05 m grid resolution) highlight the spatio-temporal variability in the retreat, with semivariogram analysis indicating a correlation between volume loss and length scale. Compared with a laser scanner survey of the same site, SfM-MVS produced comparable data and reduced data collection time by ~80%.
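
As a hedged illustration of the final analysis step (hypothetical arrays and numbers, not the authors' code or data), differencing two gridded cliff-face models from surveys a year apart yields a retreat map, a mean annual rate, and an eroded volume:

```python
import numpy as np

# Hypothetical 0.05 m grids of cliff-face position (m) from two SfM-MVS surveys
# one year apart; in real use NaN would mark cells missing from either survey.
rng = np.random.default_rng(0)
dem_t0 = rng.normal(10.0, 0.02, size=(200, 1000))
dem_t1 = dem_t0 - np.abs(rng.normal(0.7, 0.3, size=dem_t0.shape))

retreat = dem_t0 - dem_t1                      # positive values = material lost
mean_rate = np.nanmean(retreat)                # m per survey interval (here 1 yr)
cell_area = 0.05 * 0.05                        # grid cell area in m^2
volume_loss = np.nansum(retreat) * cell_area   # total eroded volume in m^3

print(f"mean retreat: {mean_rate:.2f} m/yr; volume loss: {volume_loss:.0f} m^3")
```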

859 citations


Journal ArticleDOI
TL;DR: This work presents the most general covariant ghost-free gravitational action in a Minkowski vacuum, which, apart from the much studied f(R) models, includes a large class of nonlocal actions with improved UV behavior that nevertheless recover Einstein's general relativity in the IR.
Abstract: We present the most general covariant ghost-free gravitational action in a Minkowski vacuum. Apart from the much studied f(R) models, this includes a large class of non-local actions with improved UV behavior, which nevertheless recover Einstein's general relativity in the IR.
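
Schematically, such actions take the form below (our notation, quoting the generic quadratic-in-curvature ansatz rather than the paper's exact expression); the F_i are analytic functions of the d'Alembertian, and ghost-freedom around Minkowski constrains them so that the graviton propagator acquires no extra poles.

```latex
S = \int d^4x \, \sqrt{-g} \left[ \frac{M_P^2}{2} R
  + R\,F_1(\Box)\,R
  + R_{\mu\nu}\,F_2(\Box)\,R^{\mu\nu}
  + R_{\mu\nu\lambda\sigma}\,F_3(\Box)\,R^{\mu\nu\lambda\sigma} \right],
\qquad \Box \equiv g^{\mu\nu}\nabla_\mu \nabla_\nu .
```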

720 citations



Journal ArticleDOI
Georges Aad1, Brad Abbott2, Jalal Abdallah3, S. Abdel Khalek +3081 more · Institutions (197)
TL;DR: A combined search for the Standard Model Higgs boson with the ATLAS experiment at the LHC using datasets corresponding to integrated luminosities from 1.04 fb(-1) to 4.9 fb(-1) of pp collisions is described in this paper.

Journal ArticleDOI
TL;DR: In this paper, imagery taken from a radio-controlled mini quad-rotor UAV was used to produce a high-resolution ortho-mosaic of the entire Super-Sauze landslide (France) and digital terrain models (DTMs) of several regions.

Journal ArticleDOI
TL;DR: This work shows how to construct appropriate summary statistics for ABC in a semi‐automatic manner, and shows that optimal summary statistics are the posterior means of the parameters.
Abstract: Many modern statistical applications involve inference for complex stochastic models, where it is easy to simulate from the models, but impossible to calculate likelihoods. Approximate Bayesian computation (ABC) is a method of inference for such models. It replaces calculation of the likelihood by a step which involves simulating artificial data for different parameter values, and comparing summary statistics of the simulated data with summary statistics of the observed data. Here we show how to construct appropriate summary statistics for ABC in a semi-automatic manner. We aim for summary statistics which will enable inference about certain parameters of interest to be as accurate as possible. Theoretical results show that optimal summary statistics are the posterior means of the parameters. Although these cannot be calculated analytically, we use an extra stage of simulation to estimate how the posterior means vary as a function of the data; and we then use these estimates of our summary statistics within ABC. Empirical results show that our approach is a robust method for choosing summary statistics that can result in substantially more accurate ABC analyses than the ad hoc choices of summary statistics that have been proposed in the literature. We also demonstrate advantages over two alternative methods of simulation-based inference.
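
A hedged Python sketch of the semi-automatic construction on a toy model (the model, features and tolerance are our illustrative choices, not the authors' implementation): pilot simulations are used to regress the parameter on data features, the fitted regression (an estimate of the posterior mean) becomes the summary statistic, and plain rejection ABC follows.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate(theta, n=50):
    """Toy model: n iid draws from N(theta, 1)."""
    return rng.normal(theta, 1.0, size=n)

def features(y):
    """Data features fed to the regression (their choice is part of the method)."""
    return np.array([np.mean(y), np.median(y), np.std(y)])

prior = lambda size: rng.uniform(-5, 5, size)

# Stage 1: pilot simulations, then regress theta on data features.
thetas_pilot = prior(2000)
X = np.array([features(simulate(t)) for t in thetas_pilot])
X1 = np.column_stack([np.ones(len(X)), X])
coef, *_ = np.linalg.lstsq(X1, thetas_pilot, rcond=None)
summary = lambda y: np.concatenate([[1.0], features(y)]) @ coef  # approx E[theta | y]

# Stage 2: rejection ABC using the fitted summary statistic.
y_obs = rng.normal(1.5, 1.0, size=50)        # "observed" data, true theta = 1.5
s_obs = summary(y_obs)
thetas = prior(20_000)
dist = np.array([abs(summary(simulate(t)) - s_obs) for t in thetas])
accepted = thetas[dist <= np.quantile(dist, 0.01)]  # keep the closest 1%
print(f"posterior mean approx {accepted.mean():.2f}")
```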

Journal ArticleDOI
TL;DR: This paper reports a longitudinal study investigating the predictors of reading comprehension and word reading accuracy between the ages of 7 to 8 (UK Year 3) and 10 to 11 years (Year 6).
Abstract: We report a longitudinal study investigating the predictors of reading comprehension and word reading accuracy between the ages of 7 to 8 (UK Year 3) and 10 to 11 years (Year 6). We found that different skills predicted the development of each. Reading comprehension skill measured in Year 3 was a strong predictor of comprehension in Year 6; vocabulary and verbal IQ also made significant unique contributions to the prediction of comprehension ability across time. Three comprehension components (inference, comprehension monitoring, and knowledge and use of story structure) emerged as distinct predictors of reading comprehension in Year 6, even after the autoregressive effect of comprehension was controlled. For word reading accuracy, early measures of word reading accuracy and phonemic awareness predicted later performance.

Journal ArticleDOI
Georges Aad, B. Abbott1, Jalal Abdallah2, A. A. Abdelalim3 +3013 more · Institutions (174)
TL;DR: In this article, detailed measurements of the electron performance of the ATLAS detector at the LHC were reported, using decays of the Z, W and J/psi particles.
Abstract: Detailed measurements of the electron performance of the ATLAS detector at the LHC are reported, using decays of the Z, W and J/psi particles. Data collected in 2010 at root s = 7 TeV are used, corresponding to an integrated luminosity of almost 40 pb(-1). The inter-alignment of the inner detector and the electromagnetic calorimeter, the determination of the electron energy scale and resolution, and the performance in terms of response uniformity and linearity are discussed. The electron identification, reconstruction and trigger efficiencies, as well as the charge misidentification probability, are also presented.

Journal ArticleDOI
TL;DR: It is found that variation in soil microbial communities was explained by abiotic factors like climate, pH and soil properties, and more bacterial-dominated microbial communities were associated with exploitative plant traits versus fungal-dominated communities with resource-conservative traits, showing that plant functional traits and soil microbial Communities are closely related at the landscape scale.
Abstract: The controls on aboveground community composition and diversity have been extensively studied, but our understanding of the drivers of belowground microbial communities is relatively lacking, despite their importance for ecosystem functioning. In this study, we fitted statistical models to explain landscape-scale variation in soil microbial community composition using data from 180 sites covering a broad range of grassland types, soil and climatic conditions in England. We found that variation in soil microbial communities was explained by abiotic factors like climate, pH and soil properties. Biotic factors, namely community-weighted means (CWM) of plant functional traits, also explained variation in soil microbial communities. In particular, more bacterial-dominated microbial communities were associated with exploitative plant traits versus fungal-dominated communities with resource-conservative traits, showing that plant functional traits and soil microbial communities are closely related at the landscape scale.

Journal ArticleDOI
TL;DR: In this paper, the concept of intersectionality is reviewed and further developed for more effective use. Six dilemmas in the debates on the concept are disentangled, addressed and resolved: the distinction between structural and political intersectionality; the tension between 'categories' and 'inequalities'; the significance of class; the balance between fluidity and stability; the varyingly competitive, cooperative, hierarchical and hegemonic relations between inequalities and between projects; and the conundrum of 'visibility' in the tension between the 'mutual shaping' and the 'mutual constitution' of inequalities.
Abstract: The concept of intersectionality is reviewed and further developed for more effective use. Six dilemmas in the debates on the concept are disentangled, addressed and resolved: the distinction between structural and political intersectionality; the tension between ‘categories’ and ‘inequalities’; the significance of class; the balance between fluidity and stability; the varyingly competitive, cooperative, hierarchical and hegemonic relations between inequalities and between projects; and the conundrum of ‘visibility’ in the tension between the ‘mutual shaping’ and the ‘mutual constitution’ of inequalities. The analysis draws on critical realism and on complexity theory in order to find answers to the dilemmas in intersectionality theory.

Journal ArticleDOI
TL;DR: In this paper, the authors survey the literature on non-convex mixed-integer nonlinear programs, discussing applications, algorithms, and software, and special attention is paid to the case in which the objective and constraint functions are quadratic.
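
For orientation, a toy instance of the problem class (invented for illustration, not taken from the survey): the objective and one constraint are quadratic, the bilinear term x1 x2 makes the problem nonconvex even when integrality is relaxed, and y is integer.

```latex
\begin{aligned}
\min_{x \in \mathbb{R}^2,\ y \in \mathbb{Z}} \quad & x_1 x_2 + 3y \\
\text{s.t.} \quad & x_1^2 + x_2^2 \le 4 + y, \\
& x_1 - x_2 + y \ge 1, \\
& 0 \le x_1, x_2 \le 2, \qquad y \in \{0, 1, 2\}.
\end{aligned}
```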

Journal ArticleDOI
TL;DR: The paper provides a perspective on the challenge faced by science and technology in agriculture, which must be met both in terms of increased crop productivity and in terms of increased resource use efficiency and the protection of environmental quality.
Abstract: In recent years, agricultural growth in China has accelerated remarkably, but most of this growth has been driven by increased yield per unit area rather than by expansion of the cultivated area. Looking towards 2030, to meet the demand for grain and to feed a growing population on the available arable land, it is suggested that annual crop production should be increased to around 580 Mt and that yield should increase by at least 2% annually. Crop production will become more difficult with climate change, resource scarcity (e.g. land, water, energy, and nutrients) and environmental degradation (e.g. declining soil quality, increased greenhouse gas emissions, and surface water eutrophication). To pursue the fastest and most practical route to improved yield, the near-term strategy is application and extension of existing agricultural technologies. This would lead to substantial improvement in crop and soil management practices, which are currently suboptimal. Two pivotal components are required if we are to follow new trajectories. First, the disciplines of soil management and agronomy need to be given increased emphasis in research and teaching, as part of a grand food security challenge. Second, continued genetic improvement in crop varieties will be vital. However, our view is that the biggest gains from improved technology will come most immediately from combinations of improved crops and improved agronomic practices. The objectives of this paper are to summarize the historical trend of crop production in China and to examine the main constraints to the further increase of crop productivity. The paper provides a perspective on the challenge faced by science and technology in agriculture, which must be met both in terms of increased crop productivity and in terms of increased resource use efficiency and the protection of environmental quality.

Journal ArticleDOI
TL;DR: Judicious use of vegetation can create an efficient urban pollutant filter, yielding rapid and sustained improvements in street-level air quality in dense urban areas.
Abstract: Street-level concentrations of nitrogen dioxide (NO2) and particulate matter (PM) exceed public health standards in many cities, causing increased mortality and morbidity. Concentrations can be reduced by controlling emissions, increasing dispersion, or increasing deposition rates, but little attention has been paid to the latter as a pollution control method. Both NO2 and PM are deposited onto surfaces at rates that vary according to the nature of the surface; deposition rates to vegetation are much higher than those to hard, built surfaces. Previously, city-scale studies have suggested that deposition to vegetation can make a very modest improvement (<5%) to urban air quality. However, few studies take full account of the interplay between urban form and vegetation, specifically the enhanced residence time of air in street canyons. This study shows that increasing deposition by the planting of vegetation in street canyons can reduce street-level concentrations in those canyons by as much as 40% for NO2 and 60% for PM. Substantial street-level air quality improvements can be gained through action at the scale of a single street canyon or across city-sized areas of canyons. Moreover, vegetation will continue to offer benefits in the reduction of pollution even if the traffic source is removed from city centers. Thus, judicious use of vegetation can create an efficient urban pollutant filter, yielding rapid and sustained improvements in street-level air quality in dense urban areas.

Journal ArticleDOI
TL;DR: In this article, a study showed that fungal-based food webs of grassland were more resistant to bouts of drought than those of intensively managed wheat, and retained more carbon and nitrogen in the soil.
Abstract: A study shows that soil food webs directly help mitigate the effects of drought on soil nutrients. The fungal-based food webs of grassland were more resistant to bouts of drought than the bacterial-based food webs of intensively managed wheat, and retained more carbon and nitrogen in the soil.

Book
09 Feb 2012
TL;DR: In this book, the author presents an overview of environmental justice, its framing and the making of claims, through cases spanning waste siting and the politics of dumping, air quality and inequality, flood vulnerability, urban greenspace, and climate justice.
Abstract: 1. Understanding Environmental Justice 2. Globalising and Framing Environmental Justice 3. Making Claims: Justice, Evidence and Process 4. Locating Waste: Siting and the Politics of Dumping 5. Breathing Unequally: Air Quality and Inequality 6. Flood Vulnerability: Uneven Risk and the Injustice of Disaster 7. Urban Greenspace: Distributing an Environmental Good 8. Climate Justice: Scaling the Politics of the Future 9. Analysing Environmental Justice: Some Conclusions

Journal ArticleDOI
Georges Aad1, Brad Abbott2, J. Abdallah3, S. Abdel Khalek4 +3073 more · Institutions (193)
TL;DR: In this paper, a Fourier analysis of the charged particle pair distribution in relative azimuthal angle (Delta phi = phi(a)-phi(b)) is performed to extract the coefficients v(n,n) = <cos(n Delta phi)>.
Abstract: Differential measurements of charged particle azimuthal anisotropy are presented for lead-lead collisions at root s(NN) = 2.76 TeV with the ATLAS detector at the LHC, based on an integrated luminosity of approximately 8 mu b(-1). This anisotropy is characterized via a Fourier expansion of the distribution of charged particles in azimuthal angle relative to the reaction plane, with the coefficients v(n) denoting the magnitude of the anisotropy. Significant v(2)-v(6) values are obtained as a function of transverse momentum (0.5 < p(T) < 20 GeV), pseudorapidity (|eta| < 2.5), and centrality, using an event plane method. The v(n) values for n >= 3 are found to vary weakly with both eta and centrality, and their p(T) dependencies are found to follow an approximate scaling relation, v(n)(1/n)(p(T)) proportional to v(2)(1/2)(p(T)), except in the top 5% most central collisions. A Fourier analysis of the charged particle pair distribution in relative azimuthal angle (Delta phi = phi(a)-phi(b)) is performed to extract the coefficients v(n,n) = <cos(n Delta phi)>. For pairs of charged particles with a large pseudorapidity gap (|Delta eta| = |eta(a) - eta(b)| > 2) and one particle with p(T) < 3 GeV, the v(2,2)-v(6,6) values are found to factorize as v(n,n)(p(T)(a), p(T)(b)) approximate to v(n)(p(T)(a))v(n)(p(T)(b)) in central and midcentral events. Such factorization suggests that these values of v(2,2)-v(6,6) are primarily attributable to the response of the created matter to the fluctuations in the geometry of the initial state. A detailed study shows that the v(1,1)(p(T)(a), p(T)(b)) data are consistent with the combined contributions from a rapidity-even v(1) and global momentum conservation. A two-component fit is used to extract the v(1) contribution. The extracted v(1) is observed to cross zero at p(T) approximate to 1.0 GeV, reaches a maximum at 4-5 GeV with a value comparable to that for v(3), and decreases at higher p(T).
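
In formulas (standard flow-analysis notation, reconstructed from the abstract's description rather than copied from the paper):

```latex
\frac{dN}{d\phi} \propto 1 + 2\sum_{n \ge 1} v_n \cos\!\big(n(\phi - \Phi_n)\big),
\qquad
v_{n,n} = \langle \cos(n\,\Delta\phi) \rangle,
\qquad
v_{n,n}(p_T^a, p_T^b) \approx v_n(p_T^a)\, v_n(p_T^b),
```

where Phi_n denotes the event-plane angle of the n-th harmonic and the last (factorization) relation holds for large-gap pairs in central and midcentral events, as described above.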

Journal ArticleDOI
Georges Aad1,2, Brad Abbott2,3 +5592 more · Institutions (189)
TL;DR: The ATLAS trigger system as discussed by the authors selects events by rapidly identifying signatures of muon, electron, photon, tau lepton, jet, and B meson candidates, as well as using global event signatures, such as missing transverse energy.
Abstract: Proton-proton collisions at root s = 7 TeV and heavy ion collisions at root s(NN) = 2.76 TeV were produced by the LHC and recorded using the ATLAS experiment's trigger system in 2010. The LHC is designed with a maximum bunch crossing rate of 40 MHz and the ATLAS trigger system is designed to record approximately 200 of these per second. The trigger system selects events by rapidly identifying signatures of muon, electron, photon, tau lepton, jet, and B meson candidates, as well as using global event signatures, such as missing transverse energy. An overview of the ATLAS trigger system, the evolution of the system during 2010 and the performance of the trigger system components and selections based on the 2010 collision data are shown. A brief outline of plans for the trigger system in 2011 is presented.

Journal ArticleDOI
TL;DR: In this paper, the authors consider how fuel poverty may be aligned to various alternative concepts of social and environmental justice, and argue that other understandings of injustice are also implicated and play important roles in producing and sustaining inequalities in access to affordable warmth.

Journal ArticleDOI
TL;DR: It is recommended that the cost-effective transition of hydrologic DA from research to operations be supported by developing community-based, generic modeling and DA tools or frameworks, and by fostering collaborative efforts among hydrologic modellers, DA developers, and operational forecasters.
Abstract: Data assimilation (DA) holds considerable potential for improving hydrologic predictions as demonstrated in numerous research studies. However, advances in hydrologic DA research have not been adequately or promptly implemented in operational forecast systems to improve the skill of forecasts for better-informed real-world decision making. This is due in part to a lack of mechanisms to properly quantify the uncertainty in observations and forecast models in real-time forecasting situations and to conduct the merging of data and models in a way that is adequately efficient and transparent to operational forecasters. The need for effective DA of useful hydrologic data into the forecast process has become increasingly recognized in recent years. This motivated a hydrologic DA workshop in Delft, the Netherlands in November 2010, which focused on advancing DA in operational hydrologic forecasting and water resources management. As an outcome of the workshop, this paper reviews, in relevant detail, the current status of DA applications in both hydrologic research and operational practices, and discusses the existing or potential hurdles and challenges in transitioning hydrologic DA research into cost-effective operational forecasting tools, as well as the potential pathways and newly emerging opportunities for overcoming these challenges. Several related aspects are discussed, including (1) theoretical or mathematical aspects in DA algorithms, (2) the estimation of different types of uncertainty, (3) new observations and their objective use in hydrologic DA, (4) the use of DA for real-time control of water resources systems, and (5) the development of community-based, generic DA tools for hydrologic applications. It is recommended that the cost-effective transition of hydrologic DA from research to operations be supported by developing community-based, generic modeling and DA tools or frameworks, and by fostering collaborative efforts among hydrologic modellers, DA developers, and operational forecasters.
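
To make the "merging of data and models" concrete, here is a hedged sketch of one widely used DA building block, a stochastic ensemble Kalman filter analysis step (a textbook formulation in our own notation, not the workshop's or any operational system's code).

```python
import numpy as np

def enkf_update(ensemble, y_obs, H, obs_var, rng):
    """Stochastic EnKF analysis step.
    ensemble: (n_members, n_state) forecast ensemble
    y_obs:    (n_obs,) observation vector
    H:        (n_obs, n_state) linear observation operator
    obs_var:  observation error variance (assumed uncorrelated)"""
    n_members = ensemble.shape[0]
    X = ensemble - ensemble.mean(axis=0)            # state anomalies
    Y = X @ H.T                                     # observation-space anomalies
    P_yy = Y.T @ Y / (n_members - 1) + obs_var * np.eye(len(y_obs))
    P_xy = X.T @ Y / (n_members - 1)
    K = P_xy @ np.linalg.inv(P_yy)                  # Kalman gain
    # Perturb observations so the analysis spread stays statistically consistent.
    y_pert = y_obs + rng.normal(0, np.sqrt(obs_var), (n_members, len(y_obs)))
    innov = y_pert - ensemble @ H.T
    return ensemble + innov @ K.T

# Toy use: a 2-state model (say, soil moisture and streamflow), observing state 0.
rng = np.random.default_rng(0)
forecast = rng.normal([0.3, 5.0], [0.1, 1.0], size=(100, 2))
H = np.array([[1.0, 0.0]])
analysis = enkf_update(forecast, y_obs=np.array([0.42]), H=H, obs_var=0.01**2, rng=rng)
print(forecast.mean(axis=0), "->", analysis.mean(axis=0))
```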

Journal ArticleDOI
TL;DR: It is noted that model scrutiny and use of expert opinion in modelling will benefit from formal, systematic and transparent procedures that include as wide a range of stakeholders as possible, and the role of science in maintaining and enhancing the rigour and formality of the information that informs decision making is emphasised.
Abstract: The inevitable though frequently informal use of expert opinion in modelling, the increasing number of models that incorporate formally expert opinion from a diverse range of experience and stakeholders, arguments for participatory modelling and analytic-deliberative-adaptive approaches to managing complex environmental problems, and an expanding but uneven literature prompt this critical review and analysis. Aims are to propose common definitions, identify and categorise existing concepts and practice, and provide a frame of reference and guidance for future environmental modelling. The extensive literature review and classification conducted demonstrate that a broad and inclusive definition of experts and expert opinion is both required and part of current practice. Thus an expert can be anyone with relevant and extensive or in-depth experience in relation to a topic of interest. The literature review also exposes informal model assumptions and modeller subjectivity, examines in detail the formal uses of expert opinion and expert systems, and critically analyses the main concepts of, and issues arising in, expert elicitation and the modelling of associated uncertainty. It is noted that model scrutiny and use of expert opinion in modelling will benefit from formal, systematic and transparent procedures that include as wide a range of stakeholders as possible. Enhanced awareness and utilisation of expert opinion is required for modelling that meets the informational needs of deliberative fora. These conclusions in no way diminish the importance of conventional science and scientific opinion but recognise the need for a paradigmatic shift from traditional ideals of unbiased and impartial experts towards unbiased processes of expert contestation and a plurality of expertise and eventually models. Priority must be given to the quality of the enquiry for those responsible for environmental management and policy formulation, and this review emphasises the role for science to maintain and enhance the rigour and formality of the information that informs decision making.

Journal ArticleDOI
TL;DR: In this paper, a measurement model for firm performance, based on subjective indicators, is proposed and tested, which corroborates the idea that stakeholders have different demands that need to be managed independently.
Abstract: Firm performance is a relevant construct in strategic management research and frequently used as a dependent variable. Despite this relevance, there is hardly a consensus about its definition, dimensionality and measurement, which limits advances in research and understanding of the concept. This article proposes and tests a measurement model for firm performance, based on subjective indicators. The model is grounded in stakeholder theory and a review of empirical articles. Confirmatory Factor Analyses, using data from 116 Brazilian senior managers, were used to test its fit and psychometric properties. The final model had six first-order dimensions: profitability, growth, customer satisfaction, employee satisfaction, social performance, and environmental performance. A second-order financial performance construct, influencing growth and profitability, correlated with the first-order intercorrelated, non-financial dimensions. Results suggest dimensions cannot be used interchangeably, since they represent different aspects of firm performance, and corroborate the idea that stakeholders have different demands that need to be managed independently. Researchers and practitioners may use the model to fully treat performance in empirical studies and to understand the impact of strategies on multiple performance facets.

Journal ArticleDOI
TL;DR: The comparison of density functional theory type model calculations and molecular dynamics simulations with the experimental results quantitatively revealed structural and mechanistic details of the evolution of the different types of (single) molecular junctions upon stretching.
Abstract: Employing a scanning tunneling microscopy based break junction technique and mechanically controlled break junction experiments, we investigated tolane (diphenylacetylene)-type single molecular junctions having four different anchoring groups (SH, pyridyl (PY), NH2, and CN) at a solid/liquid interface. The combination of current–distance and current–voltage measurements and their quantitative statistical analysis revealed the following sequence for junction formation probability and stability: PY > SH > NH2 > CN. For all single molecular junctions investigated, we observed the evolution through multiple junction configurations, with a particularly well-defined binding geometry for PY. The comparison of density functional theory type model calculations and molecular dynamics simulations with the experimental results quantitatively revealed structural and mechanistic details of the evolution of the different types of (single) molecular junctions upon stretching.