
Showing papers by "University of California" published in 2015


Journal ArticleDOI
TL;DR: In this paper, the effects of backreaction on holographic correlators were studied in the context of 1+1 dimensional dilaton gravity models, which describe flows to AdS2 from higher dimensional AdS spaces.
Abstract: We develop models of 1+1 dimensional dilaton gravity describing flows to AdS2 from higher dimensional AdS and other spaces. We use these to study the effects of backreaction on holographic correlators. We show that this scales as a relevant effect at low energies, for compact transverse spaces. We also discuss effects of matter loops, as in the CGHS model.

686 citations


Journal ArticleDOI
TL;DR: The purpose of this review is to summarize the recent developments in lung cancer biology and its therapeutic strategies, and discuss the latest treatment advances including therapies currently under clinical investigation.

593 citations


Journal ArticleDOI
TL;DR: Promising results, based on robust analysis of a larger meta-dataset, suggest that appropriate investment in agroecological research to improve organic management systems could greatly reduce or eliminate the yield gap for some crops or regions.
Abstract: Agriculture today places great strains on biodiversity, soils, water and the atmosphere, and these strains will be exacerbated if current trends in population growth, meat and energy consumption, and food waste continue. Thus, farming systems that are both highly productive and minimize environmental harms are critically needed. How organic agriculture may contribute to world food production has been subject to vigorous debate over the past decade. Here, we revisit this topic comparing organic and conventional yields with a new meta-dataset three times larger than previously used (115 studies containing more than 1000 observations) and a new hierarchical analytical framework that can better account for the heterogeneity and structure in the data. We find organic yields are only 19.2% (±3.7%) lower than conventional yields, a smaller yield gap than previous estimates. More importantly, we find entirely different effects of crop types and management practices on the yield gap compared with previous studies. For example, we found no significant differences in yields for leguminous versus non-leguminous crops, perennials versus annuals or developed versus developing countries. Instead, we found the novel result that two agricultural diversification practices, multi-cropping and crop rotations, substantially reduce the yield gap (to 9 ± 4% and 8 ± 5%, respectively) when the methods were applied in only organic systems. These promising results, based on robust analysis of a larger meta-dataset, suggest that appropriate investment in agroecological research to improve organic management systems could greatly reduce or eliminate the yield gap for some crops or regions.
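For readers who want to see the arithmetic behind a yield-gap estimate, here is a minimal sketch using the log response ratio, a common effect size in yield meta-analyses. It is not the paper's hierarchical analytical framework, and the paired yields below are hypothetical.

```python
# Minimal sketch (not the paper's hierarchical framework): estimate an average
# organic-to-conventional yield gap from paired observations via the log
# response ratio, a common effect size in yield meta-analyses.
import math

# Hypothetical paired yields (organic, conventional) in t/ha.
pairs = [(4.1, 5.3), (2.8, 3.1), (6.0, 7.9), (3.5, 4.0), (5.2, 6.1)]

log_ratios = [math.log(org / conv) for org, conv in pairs]
mean_lr = sum(log_ratios) / len(log_ratios)
sd_lr = (sum((x - mean_lr) ** 2 for x in log_ratios) / (len(log_ratios) - 1)) ** 0.5
se_lr = sd_lr / math.sqrt(len(log_ratios))

gap = 1.0 - math.exp(mean_lr)                     # fractional yield gap
ci_low = 1.0 - math.exp(mean_lr + 1.96 * se_lr)   # smaller-gap bound
ci_high = 1.0 - math.exp(mean_lr - 1.96 * se_lr)  # larger-gap bound
print(f"Organic yield gap: {gap:.1%} (95% CI {ci_low:.1%} to {ci_high:.1%})")
```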

579 citations


Journal ArticleDOI
TL;DR: Modern methods based on nucleic acid and protein analysis are described, which represent unprecedented tools to render agriculture more sustainable and safe, avoiding expensive use of pesticides in crop protection.
Abstract: Plant diseases are responsible for major economic losses in the agricultural industry worldwide. Monitoring plant health and detecting pathogens early are essential to reduce disease spread and facilitate effective management practices. DNA-based and serological methods now provide essential tools for accurate plant disease diagnosis, in addition to the traditional visual scouting for symptoms. Although DNA-based and serological methods have revolutionized plant disease detection, they are not very reliable at the asymptomatic stage, especially for pathogens with systemic diffusion, and they need at least 1–2 days for sample harvest, processing, and analysis. Here, we describe modern methods based on nucleic acid and protein analysis. Then, we review innovative approaches currently under development. Our main findings are the following: (1) novel sensors based on the analysis of host responses, e.g., differential mobility spectrometers and lateral flow devices, deliver instantaneous results and can effectively detect early infections directly in the field; (2) biosensors based on phage display and biophotonics can also detect infections instantaneously, although they can be integrated with other systems; and (3) remote sensing techniques coupled with spectroscopy-based methods allow high spatialization of results and may be very useful for rapid preliminary identification of primary infections. We explain how these tools will help plant disease management and complement serological and DNA-based methods. While serological and PCR-based methods are the most available and effective for confirming disease diagnosis, volatile and biophotonic sensors provide instantaneous results and may be used to identify infections at asymptomatic stages. Remote sensing technologies will be extremely helpful for spatializing diagnostic results over large areas. These innovative techniques represent unprecedented tools for rendering agriculture more sustainable and safe, avoiding expensive use of pesticides in crop protection.

553 citations


PatentDOI
TL;DR: It is demonstrated that factors known to increase brightness in bulk experiments lose importance at higher excitation powers and that, paradoxically, the brightest probes under single-molecule excitation are barely luminescent at the ensemble level.
Abstract: Various embodiments of the invention describe the synthesis of upconverting nanoparticles (UCNPs), lanthanide-doped hexagonal β-phase sodium yttrium fluoride NaYF4:Er3+/Yb3+ nanocrystals less than 10 nanometers in diameter that are over an order of magnitude brighter under single-particle imaging conditions than existing compositions, allowing visualization of single UCNPs as small (d=4.8 nm) as fluorescent proteins. Advanced single-particle characterization and theoretical modeling are used to find that surface effects become critical at diameters under 20 nm, and that the fluences used in single-molecule imaging change the dominant determinants of nanocrystal brightness. These results demonstrate that factors known to increase brightness in bulk experiments lose importance at higher excitation powers, and that, paradoxically, the brightest probes under single-molecule excitation are barely luminescent at the ensemble level.

458 citations


Book ChapterDOI
26 Apr 2015
TL;DR: A new method to homomorphically compute simple bit operations and refresh (bootstrap) the resulting output is presented; it runs on a personal computer in about half a second, and the performance of a prototype implementation is reported.
Abstract: The main bottleneck affecting the efficiency of all known fully homomorphic encryption (FHE) schemes is Gentry’s bootstrapping procedure, which is required to refresh noisy ciphertexts and keep computing on encrypted data. Bootstrapping in the latest implementation of FHE, the HElib library of Halevi and Shoup (Crypto 2014), requires about six minutes. We present a new method to homomorphically compute simple bit operations, and refresh (bootstrap) the resulting output, which runs on a personal computer in just about half a second. We present a detailed technical analysis of the scheme (based on the worst-case hardness of standard lattice problems) and report on the performance of our prototype implementation.
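As a rough intuition for why bootstrapping matters, the toy below models a ciphertext as a scaled message plus noise and shows decryption failing once the noise accumulated from homomorphic operations grows too large. It is insecure, purely illustrative, and not the scheme in the paper; all parameters are made up.

```python
# Toy illustration only (insecure, not the paper's scheme): ciphertext = bit*DELTA + noise.
# Homomorphic addition adds the noise terms; once the noise exceeds DELTA/2, decryption
# fails, which is what bootstrapping (refreshing the ciphertext) is meant to prevent.
import random

DELTA = 1000     # scaling factor separating message from noise (illustrative)
NOISE = 40       # per-ciphertext noise bound (illustrative)

def encrypt(bit):
    return bit * DELTA + random.randint(1, NOISE)

def decrypt(ct):
    return round(ct / DELTA) % 2

def hom_xor(c1, c2):          # homomorphic XOR of the underlying bits
    return c1 + c2

ct = encrypt(1)
for i in range(2, 600):
    ct = hom_xor(ct, encrypt(0))          # noise grows with every gate
    if decrypt(ct) != 1:
        print(f"decryption failed after {i} operations -> ciphertext needs a refresh")
        break
```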

458 citations


Journal ArticleDOI
TL;DR: The pineapple lineage has transitioned from C3 photosynthesis to CAM, with CAM-related genes exhibiting a diel expression pattern in photosynthetic tissues, providing the first cis-regulatory link between CAM and circadian clock regulation.
Abstract: Pineapple (Ananas comosus (L.) Merr.) is the most economically valuable crop possessing crassulacean acid metabolism (CAM), a photosynthetic carbon assimilation pathway with high water-use efficiency, and the second most important tropical fruit. We sequenced the genomes of pineapple varieties F153 and MD2 and a wild pineapple relative, Ananas bracteatus accession CB5. The pineapple genome has one fewer ancient whole-genome duplication event than sequenced grass genomes and a conserved karyotype with seven chromosomes from before the ρ duplication event. The pineapple lineage has transitioned from C3 photosynthesis to CAM, with CAM-related genes exhibiting a diel expression pattern in photosynthetic tissues. CAM pathway genes were enriched with cis-regulatory elements associated with the regulation of circadian clock genes, providing the first cis-regulatory link between CAM and circadian clock regulation. Pineapple CAM photosynthesis evolved by the reconfiguration of pathways in C3 plants, through the regulatory neofunctionalization of preexisting genes and not through the acquisition of neofunctionalized genes via whole-genome or tandem gene duplication.

424 citations


Journal ArticleDOI
TL;DR: High-quality microdata from Mexico is used to characterize empirically the relationship between temperature, income, and air conditioning, and how climate and income drive air conditioning adoption decisions, and to forecast future energy consumption.
Abstract: As household incomes rise around the world and global temperatures go up, the use of air conditioning is poised to increase dramatically. Air conditioning growth is expected to be particularly strong in middle-income countries, but direct empirical evidence is scarce. In this paper we use high-quality microdata from Mexico to describe the relationship between temperature, income, and air conditioning. We describe both how electricity consumption increases with temperature given current levels of air conditioning, and how climate and income drive air conditioning adoption decisions. We then combine these estimates with predicted end-of-century temperature changes to forecast future energy consumption. Under conservative assumptions about household income, our model predicts near-universal saturation of air conditioning in all warm areas within just a few decades. Temperature increases contribute to this surge in adoption, but income growth by itself explains most of the increase. What this will mean for electricity consumption and carbon dioxide emissions depends on the pace of technological change. Continued advances in energy efficiency or the development of new cooling technologies could reduce the energy consumption impacts. Similarly, growth in low-carbon electricity generation could mitigate the increases in carbon dioxide emissions. However, the paper illustrates the enormous potential impacts in this sector, highlighting the importance of future research on adaptation and underscoring the urgent need for global action on climate change.
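The forecasting logic (estimate how adoption responds to income and climate, then plug in projected values) can be illustrated with a small sketch. The logistic form and every coefficient below are hypothetical, chosen only to show how income growth alone can drive most of the predicted increase.

```python
# Hypothetical illustration of the forecasting logic: AC adoption modeled as a
# logistic function of household income and cooling degree days (CDD).
# All coefficients are invented for this sketch, not estimates from the paper.
import math

def ac_adoption(income_k, cdd, b0=-8.0, b_inc=0.35, b_cdd=0.002):
    """Probability a household owns AC, given income (thousand USD/yr) and annual CDD."""
    z = b0 + b_inc * income_k + b_cdd * cdd
    return 1.0 / (1.0 + math.exp(-z))

today = ac_adoption(income_k=10, cdd=1500)
income_growth = ac_adoption(income_k=25, cdd=1500)        # richer, same climate
income_and_warming = ac_adoption(income_k=25, cdd=1900)   # richer and hotter
print(f"today {today:.0%} -> income growth {income_growth:.0%} -> plus warming {income_and_warming:.0%}")
```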

396 citations


Journal ArticleDOI
TL;DR: It is shown that human-derived stressors can act to erode resilience of desirable macroalgal beds while strengthening resilience of urchin barrens, thus exacerbating the risk, spatial extent and irreversibility of an unwanted regime shift for marine ecosystems.
Abstract: A pronounced, widespread and persistent regime shift among marine ecosystems is observable on temperate rocky reefs as a result of sea urchin overgrazing. Here, we empirically define regime-shift dynamics for this grazing system which transitions between productive macroalgal beds and impoverished urchin barrens. Catastrophic in nature, urchin overgrazing in a well-studied Australian system demonstrates a discontinuous regime shift, which is of particular management concern as recovery of desirable macroalgal beds requires reducing grazers to well below the initial threshold of overgrazing. Generality of this regime-shift dynamic is explored across 13 rocky reef systems (spanning 11 different regions from both hemispheres) by compiling available survey data (totalling 10,901 quadrats surveyed in situ) plus experimental regime-shift responses (observed during a total of 57 in situ manipulations). The emergent and globally coherent pattern shows urchin grazing to cause a discontinuous ‘catastrophic’ regime shift, with hysteresis effect of approximately one order of magnitude in urchin biomass between critical thresholds of overgrazing and recovery. Different life-history traits appear to create asymmetry in the pace of overgrazing versus recovery. Once shifted, strong feedback mechanisms provide resilience for each alternative state thus defining the catastrophic nature of this regime shift. Importantly, human-derived stressors can act to erode resilience of desirable macroalgal beds while strengthening resilience of urchin barrens, thus exacerbating the risk, spatial extent and irreversibility of an unwanted regime shift for marine ecosystems.
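To make the "discontinuous shift with hysteresis" concrete, the sketch below iterates the classic grazed-vegetation model (logistic growth minus a saturating grazing term) and shows that the equilibrium reached depends on whether the system starts as a macroalgal bed or a barren. The model and parameter values are generic illustrations, not fitted to the reef data in this study.

```python
# Illustrative only: the classic grazing model
#   dV/dt = r*V*(1 - V/K) - g*V^2/(V^2 + h^2)
# has alternative stable states over a range of grazing pressure g, so the
# equilibrium macroalgal biomass V* depends on the starting state (hysteresis).
def dVdt(V, g, r=1.0, K=10.0, h=1.0):
    return r * V * (1 - V / K) - g * V**2 / (V**2 + h**2)

def equilibrium(V0, g, dt=0.01, steps=20000):
    V = V0
    for _ in range(steps):
        V = max(V + dt * dVdt(V, g), 1e-6)
    return V

grazing_levels = [round(0.5 + 0.25 * i, 2) for i in range(10)]
from_bed = [equilibrium(10.0, g) for g in grazing_levels]     # start as algal bed
from_barren = [equilibrium(0.01, g) for g in grazing_levels]  # start as barren
for g, hi, lo in zip(grazing_levels, from_bed, from_barren):
    print(f"g={g:4.2f}  from bed V*={hi:5.2f}   from barren V*={lo:5.2f}")
```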

380 citations


Journal ArticleDOI
TL;DR: In this paper, a systematic review of the literature from 1981 to 2014 provides a framework and analytical account of how coping during public service delivery has been studied since 1980, highlighting the importance of the type of profession (such as being a teacher or a police officer), the amount of workload, and the degree of discretion for understanding how frontline workers cope with stress.
Abstract: Frontline workers, such as teachers and social workers, often experience stress when delivering public services to clients, for instance because of high workloads. They adapt by coping, using such practices as breaking or bending rules for clients, or rationing services. Although coping is recognized as an important response to the problems of frontline work, the public administration field lacks a comprehensive view of coping. The first contribution of this article is therefore theoretical: conceptualizing coping during public service delivery and developing a coherent classification of coping. This is done via a systematic review of the literature from 1981 to 2014. The second contribution is empirical: via a systematic review of the literature from 1981–2014 this article provides a framework and analytical account of how coping during public service delivery has been studied since 1980. It highlights the importance of the type of profession (such as being a teacher or a police officer), the amount of workload, and the degree of discretion for understanding how frontline workers cope with stress. It also reveals that frontline workers often draw on the coping family “moving towards clients” revealing a strong tendency to provide meaningful public service to clients, even under stressful conditions. We conclude with an agenda for future studies, examining new theoretical, methodological and empirical opportunities to advance understanding of coping during public service delivery.

367 citations


Journal ArticleDOI
TL;DR: The Solar Wind Ion Analyzer (SWIA) as discussed by the authors was used on the MAVEN mission to measure the solar wind ion flows around Mars, both in the upstream solar wind and in the magneto-sheath and tail regions inside the bow shock.
Abstract: The Solar Wind Ion Analyzer (SWIA) on the MAVEN mission will measure the solar wind ion flows around Mars, both in the upstream solar wind and in the magneto-sheath and tail regions inside the bow shock. The solar wind flux provides one of the key energy inputs that can drive atmospheric escape from the Martian system, as well as in part controlling the structure of the magnetosphere through which non-thermal ion escape must take place. SWIA measurements contribute to the top level MAVEN goals of characterizing the upper atmosphere and the processes that operate there, and parameterizing the escape of atmospheric gases to extrapolate the total loss to space throughout Mars’ history. To accomplish these goals, SWIA utilizes a toroidal energy analyzer with electrostatic deflectors to provide a broad 360∘×90∘ field of view on a 3-axis spacecraft, with a mechanical attenuator to enable a very high dynamic range. SWIA provides high cadence measurements of ion velocity distributions with high energy resolution (14.5 %) and angular resolution (3.75∘×4.5∘ in the sunward direction, 22.5∘×22.5∘ elsewhere), and a broad energy range of 5 eV to 25 keV. Onboard computation of bulk moments and energy spectra enable measurements of the basic properties of the solar wind at 0.25 Hz.
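The "onboard computation of bulk moments" mentioned above is, at its core, numerical moment-taking over the measured velocity distribution. Below is a textbook-style sketch using a hypothetical drifting Maxwellian on a uniform Cartesian grid; it is not the SWIA flight algorithm or its actual energy/angle binning.

```python
# Sketch of plasma bulk moments from a gridded velocity distribution f(v):
#   density n = sum f * dv^3,   bulk velocity u = (1/n) * sum v * f * dv^3.
# Textbook moment-taking on a hypothetical drifting Maxwellian, not the SWIA
# onboard algorithm (which works in the instrument's energy/angle bins).
import numpy as np

dv = 2.0e4                                   # velocity-bin width [m/s]
v = np.arange(-1.0e6, 1.0e6, dv)             # 1D grid per axis
vx, vy, vz = np.meshgrid(v, v, v, indexing="ij")

n0, u0, vth = 5.0e6, 4.0e5, 4.0e4            # density [m^-3], drift and thermal speed [m/s]
f = n0 * (np.pi * vth**2) ** -1.5 * np.exp(
    -((vx - u0) ** 2 + vy**2 + vz**2) / vth**2)

n = f.sum() * dv**3                          # zeroth moment: number density
ux = (vx * f).sum() * dv**3 / n              # first moment: bulk velocity (x component)
print(f"n = {n:.2e} m^-3, ux = {ux / 1e3:.0f} km/s")
```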

Proceedings ArticleDOI
01 Jan 2015
TL;DR: Firmalice is presented, a binary analysis framework to support the analysis of firmware running on embedded devices that utilizes a novel model of authentication bypass flaws, based on the attacker’s ability to determine the required inputs to perform privileged operations.
Abstract: Embedded devices have become ubiquitous, and they are used in a range of privacy-sensitive and security-critical applications. Most of these devices run proprietary software, and little documentation is available about the software’s inner workings. In some cases, the cost of the hardware and protection mechanisms might make access to the devices themselves infeasible. Analyzing the software that is present in such environments is challenging, but necessary, if the risks associated with software bugs and vulnerabilities must be avoided. As a matter of fact, recent studies revealed the presence of backdoors in a number of embedded devices available on the market. In this paper, we present Firmalice, a binary analysis framework to support the analysis of firmware running on embedded devices. Firmalice builds on top of a symbolic execution engine, and techniques, such as program slicing, to increase its scalability. Furthermore, Firmalice utilizes a novel model of authentication bypass flaws, based on the attacker’s ability to determine the required inputs to perform privileged operations. We evaluated Firmalice on the firmware of three commercially-available devices, and were able to detect authentication bypass backdoors in two of them. Additionally, Firmalice was able to determine that the backdoor in the third firmware sample was not exploitable by an attacker without knowledge of a set of unprivileged credentials.
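The authentication-bypass model (an attacker can determine the inputs required to reach a privileged operation) can be sketched conceptually with an SMT solver. The toy below uses the z3 Python bindings on a hand-written path condition; Firmalice itself recovers such conditions from real firmware binaries via symbolic execution, so everything here is illustrative.

```python
# Conceptual toy, not Firmalice: if the constraint guarding a privileged
# operation pins the user input to a single concrete value independent of any
# secret, an attacker can recover that value - a backdoor-style auth bypass.
from z3 import BitVec, Solver, sat

user_input = BitVec("user_input", 32)

# Path condition extracted by hand from a hypothetical firmware check:
#   if ((user_input ^ 0xCAFEBABE) == 0x42) do_privileged();
path_condition = (user_input ^ 0xCAFEBABE) == 0x42

s = Solver()
s.add(path_condition)
if s.check() == sat:
    required = s.model()[user_input]
    s.add(user_input != required)            # is this the *only* accepted input?
    unique = s.check() != sat
    verdict = "fully determined -> likely backdoor" if unique else "not unique"
    print(f"privileged path reachable with input {required} ({verdict})")
```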

Journal ArticleDOI
TL;DR: Omalizumab 300 mg administered subcutaneously every 4 weeks reduced weekly ISS and other symptom scores versus placebo in CIU/CSU patients who remained symptomatic despite treatment with approved doses of H1 antihistamines.

Book ChapterDOI
01 Jan 2015
TL;DR: In this article, the authors discuss the role of marine plastic debris as a novel medium for environmental partitioning of chemical contaminants in the ocean and the toxic effects that may result from plastic debris in marine animals.
Abstract: For decades we have learned about the physical hazards associated with plastic debris in the marine environment, but recently we are beginning to realize the chemical hazards. Assessing hazards associated with plastic in aquatic habitats is not simple, and requires knowledge regarding organisms that may be exposed, the exposure concentrations, the types of polymers comprising the debris, the length of time the debris was present in the aquatic environment (affecting the size, shape and fouling) and the locations and transport of the debris during that time period. Marine plastic debris is associated with a ‘cocktail of chemicals’, including chemicals added or produced during manufacturing and those present in the marine environment that accumulate onto the debris from surrounding seawater. This raises concerns regarding: (i) the complex mixture of chemical substances associated with marine plastic debris, (ii) the environmental fate of these chemicals to and from plastics in our oceans and (iii) how this mixture affects wildlife, as hundreds of species ingest this material in nature. The focus of this chapter is on the mixture of chemicals associated with marine plastic debris. Specifically, this chapter discusses the diversity of chemical ingredients, byproducts of manufacturing and sorbed chemical contaminants from the marine environment among plastic types, the role of marine plastic debris as a novel medium for environmental partitioning of chemical contaminants in the ocean and the toxic effects that may result from plastic debris in marine animals.

Journal ArticleDOI
TL;DR: The status of known vertebrate genome projects is summarized, standards for pronouncing a genome as sequenced or completed are recommended, and a present and future vision of the landscape of Genome 10K is provided.
Abstract: The Genome 10K Project was established in 2009 by a consortium of biologists and genome scientists determined to facilitate the sequencing and analysis of the complete genomes of 10,000 vertebrate species. Since then the number of selected and initiated species has risen from ∼26 to 277 sequenced or ongoing with funding, an approximately tenfold increase in five years. Here we summarize the advances and commitments that have occurred by mid-2014 and outline the achievements and present challenges of reaching the 10,000-species goal. We summarize the status of known vertebrate genome projects, recommend standards for pronouncing a genome as sequenced or completed, and provide our present and future vision of the landscape of Genome 10K. The endeavor is ambitious, bold, expensive, and uncertain, but together the Genome 10K Consortium of Scientists and the worldwide genomics community are moving toward their goal of delivering to the coming generation the gift of genome empowerment for many vertebrate species.

Book ChapterDOI
01 Jan 2015
TL;DR: Cytochrome P450s (P450s) are subject to inhibition/inactivation by various chemically diverse agents, which may interact directly or indirectly with either the P450 prosthetic heme or the protein moiety or with both moieties.
Abstract: Cytochrome P450s (P450s) are subject to inhibition/inactivation by various chemically diverse agents, which may interact directly or indirectly with either the P450 prosthetic heme or the protein moiety or with both moieties. This inhibition is clinically relevant, as it can block physiological pathways as well as result in therapeutically significant beneficial or adverse effects and clinically relevant drug–drug interactions. Herein, we discuss the various chemical agents and their chemical mechanisms that lead to such reversible, quasi-irreversible, or irreversible inhibition of these enzymes. We note that such inhibition targeted at P450-dependent physiological and/or pathological processes has been ingeniously and effectively harnessed not only in the design and development of clinically relevant therapeutic agents but also of fungicides, insecticides, and herbicides.

Journal ArticleDOI
TL;DR: In this paper, the current status of theoretical and experimental constraints on the real Higgs singlet extension of the standard model is discussed, and the impact of perturbative unitarity, electroweak precision data (with a special focus on higher-order contributions to the W boson mass), perturbativity of the couplings, and vacuum stability is investigated.
Abstract: We discuss the current status of theoretical and experimental constraints on the real Higgs singlet extension of the standard model. For the second neutral (non-standard) Higgs boson we consider the full mass range from $$1~\mathrm{GeV}$$ to $$1~\mathrm{TeV}$$ accessible at past and current collider experiments. We separately discuss three scenarios, namely, the case where the second Higgs boson is lighter than, approximately equal to, or heavier than the discovered Higgs state at around $$125~\mathrm{GeV}$$. We investigate the impact of constraints from perturbative unitarity, electroweak precision data with a special focus on higher-order contributions to the $$W$$ boson mass, perturbativity of the couplings as well as vacuum stability. The latter two are tested up to a scale of $$\sim 4 \times 10^{10}\,\mathrm{GeV}$$ using renormalization group equations. Direct collider constraints from Higgs signal rate measurements at the LHC and $$95\,\%$$ confidence level exclusion limits from Higgs searches at LEP, Tevatron, and LHC are included via the public codes HiggsSignals and HiggsBounds, respectively. We identify the strongest constraints in the different regions of parameter space. We comment on the collider phenomenology of the remaining viable parameter space and the prospects for a future discovery or exclusion at the LHC.
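For orientation, a common parameterization of this model (conventions vary, and this is not necessarily the paper's exact notation) mixes the doublet and singlet CP-even states by an angle $$\alpha$$:

$$ h_1 = \cos\alpha\, h - \sin\alpha\, s, \qquad h_2 = \sin\alpha\, h + \cos\alpha\, s, $$

so all couplings of $$h_1$$ ($$h_2$$) to standard model particles are rescaled by $$\cos\alpha$$ ($$\sin\alpha$$). If no new decay channels open, LHC signal rates then scale as $$\mu_{h_1} \simeq \cos^2\alpha$$ and $$\mu_{h_2} \simeq \sin^2\alpha \times \mathrm{BR}(h_2 \to \mathrm{SM})$$, which is why the Higgs rate measurements and direct search limits mentioned above translate directly into bounds on the mixing angle.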

Journal ArticleDOI
TL;DR: In this paper, the authors revisited the twin Higgs scenario as a dark solution to the little hierarchy problem, identifying the structure of a minimal model and its viable parameter space, and analyzed its collider implications.
Abstract: We revisit the Twin Higgs scenario as a “dark” solution to the little hierarchy problem, identify the structure of a minimal model and its viable parameter space, and analyze its collider implications. In this model, dark naturalness generally leads to Hidden Valley phenomenology. The twin particles, including the top partner, are all Standard-Model-neutral, but naturalness favors the existence of twin strong interactions — an asymptotically-free force that confines not far above the Standard Model QCD scale — and a Higgs portal interaction. We show that, taken together, these typically give rise to exotic decays of the Higgs to twin hadrons. Across a substantial portion of the parameter space, certain twin hadrons have visible and often displaced decays, providing a potentially striking LHC signature. We briefly discuss appropriate experimental search strategies.
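As a schematic reminder of why the mechanism is "dark" (standard twin-Higgs lore, not a formula taken from this paper): the quadratically divergent top contribution to the Higgs mass is cancelled by the Standard-Model-neutral twin top,

$$ \delta m_h^2 \;\sim\; \frac{3\Lambda^2}{8\pi^2}\left( \hat{y}_t^{\,2} - y_t^2 \right) \;\to\; 0 \quad \text{when the } \mathbb{Z}_2 \text{ exchange symmetry enforces } \hat{y}_t = y_t, $$

so naturalness is preserved by a colorless partner, and the observable consequences arise instead through the Higgs portal and the twin confining sector described above.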

Journal ArticleDOI
TL;DR: An overview of both probability-based and convenience panels is provided, discussing potential benefits and cautions for each method, and summarizing the approaches used to weight panel respondents in order to better represent the underlying population.
Abstract: The use of Internet panels to collect survey data is increasing because it is cost-effective, enables access to large and diverse samples quickly, takes less time than traditional methods to obtain data for analysis, and the standardization of the data collection process makes studies easy to replicate. A variety of probability-based panels have been created, including Telepanel/CentERpanel, Knowledge Networks (now GFK KnowledgePanel), the American Life Panel, the Longitudinal Internet Studies for the Social Sciences panel, and the Understanding America Study panel. Despite the advantage of having a known denominator (sampling frame), the probability-based Internet panels often have low recruitment participation rates, and some have argued that there is little practical difference between opting out of a probability sample and opting into a nonprobability (convenience) Internet panel. This article provides an overview of both probability-based and convenience panels, discussing potential benefits and cautions for each method, and summarizing the approaches used to weight panel respondents in order to better represent the underlying population. Challenges of using Internet panel data are discussed, including false answers, careless responses, giving the same answer repeatedly, getting multiple surveys from the same respondent, and panelists being members of multiple panels. More is to be learned about Internet panels generally and about Web-based data collection, as well as how to evaluate data collected using mobile devices and social-media platforms.
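One of the weighting approaches summarized in such overviews is raking (iterative proportional fitting) of respondent weights to known population margins. The sketch below is a minimal illustration with hypothetical respondents and margins, not a description of any particular panel's procedure.

```python
# Minimal raking (iterative proportional fitting) sketch: adjust panel weights so
# weighted margins match known population margins. All data here are hypothetical.
import numpy as np

age = np.array([0, 0, 1, 1, 1, 2, 2, 0, 1, 2])   # respondent age-group codes
sex = np.array([0, 1, 0, 1, 0, 1, 0, 1, 1, 0])   # respondent sex codes
w = np.ones(len(age))                            # start from equal weights

pop_age = np.array([0.30, 0.45, 0.25])           # population shares by age group
pop_sex = np.array([0.49, 0.51])                 # population shares by sex

for _ in range(50):                              # alternate margin adjustments until stable
    for groups, targets in ((age, pop_age), (sex, pop_sex)):
        for g, target in enumerate(targets):
            in_group = groups == g
            w[in_group] *= target * w.sum() / w[in_group].sum()

w /= w.mean()                                    # normalize to mean weight 1
print(np.round(w, 2))
```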

Book ChapterDOI
01 Jan 2015
TL;DR: In this article, the authors provide definitions and reuse typologies and describe common reuse patterns and their driving factors, and call for standardized data collection and reporting efforts across the formal and informal reuse sectors to provide reliable and updated information on the wastewater and sludge cycles, essential to develop proper diagnosis and effective policies for the safe and productive use of these resources.
Abstract: Cities produce large amounts and very diverse types of waste, including wastewater. The quality of these wastes depends on their source, the way in which they are collected and the treatment they receive. The final fate of these wastes is also very diverse. To better understand these systems, this chapter provides definitions and reuse typologies and describes common reuse patterns and their driving factors. The chapter also shows that, while the prospects for resource recovery from wastewater and sludge are promising, the potential is still largely untapped, except in the informal sector. The resources embedded in the approximately 330 km3/year of municipal wastewater that are globally generated would be theoretically enough to irrigate and fertilize millions of hectares of crops and to produce biogas to supply energy for millions of households. However, only a tiny proportion of these wastes is currently treated, and the portion which is safely reused is significantly smaller than the existing direct and especially indirect use of untreated wastewater, which poses significant potential health risks. The chapter ends with a call for standardized data collection and reporting efforts across the formal and informal reuse sectors to provide more reliable and updated information on the wastewater and sludge cycles, essential to develop proper diagnosis and effective policies for the safe and productive use of these resources.

PatentDOI
TL;DR: Lithium sulfide spheres with size control have been synthesized for the first time, and a CVD method for converting them into stable carbon-coated Li2S core-shell (Li2S@C) particles has been successfully employed.
Abstract: The disclosure provides methods for producing uniformly sized lithium sulfide materials which are coated with one or more durable and conductive carbon shells that impede the polysulfide shuttle. The disclosure further provides for the carbon coated lithium sulfide materials made therefrom, and the use of these materials in lithium sulfide batteries.

Book ChapterDOI
16 Aug 2015
TL;DR: The arrival of indistinguishability obfuscation has transformed the cryptographic landscape by enabling several security goals that were previously beyond our reach, as mentioned in this paper, and one of the pressing goals currently is to construct indistinguishability obfuscation from well-studied standard cryptographic assumptions.
Abstract: The arrival of indistinguishability obfuscation (\(i\mathrm{O}\)) has transformed the cryptographic landscape by enabling several security goals that were previously beyond our reach. Consequently, one of the pressing goals currently is to construct \(i\mathrm{O}\) from well-studied standard cryptographic assumptions.

Journal ArticleDOI
TL;DR: The MAVEN SupraThermal And Thermal Ion Composition (STATIC) instrument is designed to measure the ion composition and distribution function of the cold Martian ionosphere, the heated suprathermal tail of this plasma in the upper ionosphere and the pickup ions accelerated by solar wind electric fields.
Abstract: The MAVEN SupraThermal And Thermal Ion Composition (STATIC) instrument is designed to measure the ion composition and distribution function of the cold Martian ionosphere, the heated suprathermal tail of this plasma in the upper ionosphere, and the pickup ions accelerated by solar wind electric fields. STATIC operates over an energy range of 0.1 eV up to 30 keV, with a base time resolution of 4 seconds. The instrument consists of a toroidal “top hat” electrostatic analyzer with a $360^{\circ} \times 90^{\circ}$ field-of-view, combined with a time-of-flight (TOF) velocity analyzer with $22.5^{\circ}$ resolution in the detection plane. The TOF combines a $-15~\mbox{kV}$ acceleration voltage with ultra-thin carbon foils to resolve $\mathrm{H}^{+}$, $\mathrm{He}^{++}$, $\mathrm{He}^{+}$, $\mathrm{O}^{+}$, $\mathrm{O}_{2}^{+}$, and $\mathrm{CO}_{2}^{+}$ ions. Secondary electrons from carbon foils are detected by microchannel plate detectors and binned into a variety of data products with varying energy, mass, angle, and time resolution. To prevent detector saturation when measuring cold ram ions at periapsis ($\sim10^{11}~\mbox{eV/cm}^{2}\,\mbox{s}\,\mbox{sr}\,\mbox{eV}$), while maintaining adequate sensitivity to resolve tenuous pickup ions at apoapsis ($\sim10^{3}~\mbox{eV/cm}^{2}\,\mbox{s}\,\mbox{sr}\,\mbox{eV}$), the sensor includes both mechanical and electrostatic attenuators that increase the dynamic range by a factor of $10^{3}$. This paper describes the instrument hardware, including several innovative improvements over previous TOF sensors, the ground calibrations of the sensor, the data products generated by the experiment, and some early measurements during cruise phase to Mars.
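The mass discrimination in such a TOF section follows from elementary kinematics. The relation below is a textbook sketch, neglecting energy loss in the carbon foils and leaving the flight-path length $L$ generic rather than quoting the instrument's internal geometry: an ion of mass $m$ and charge $q$ selected at energy-per-charge $E/q$ and post-accelerated through $U_{\mathrm{acc}} \approx 15~\mbox{kV}$ crosses the flight path in

$$ t \;=\; \frac{L}{v} \;=\; L\sqrt{\frac{m}{2\,q\,\left(E/q + U_{\mathrm{acc}}\right)}}\,, $$

so at fixed energy-per-charge the flight time scales as $\sqrt{m/q}$; for example, $\mathrm{O}^{+}$ arrives four times later than $\mathrm{H}^{+}$, which is how the listed ion species are separated.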

Journal ArticleDOI
TL;DR: Recent successes in the development of C. reinhardtii as a biomanufacturing host for recombinant proteins, including antibodies and immunotoxins, hormones, industrial enzymes, an orally-active colostral protein for gastrointestinal health, and subunit vaccines are reviewed.
Abstract: Recombinant proteins are widely used for industrial, nutritional, and medical applications. Green microalgae have attracted considerable attention recently as a biomanufacturing platform for the production of recombinant proteins for a number of reasons. These photosynthetic eukaryotic microorganisms are safe, scalable, easy to genetically modify through transformation, mutagenesis, or breeding, and inexpensive to grow. Many microalgae species are genetically transformable, but the green alga Chlamydomonas reinhardtii is the most widely used host for recombinant protein expression. An extensive suite of molecular genetic tools has been developed for C. reinhardtii over the last 25 years, including a fully sequenced genome, well-established methods for transformation, mutagenesis and breeding, and transformation vectors for high levels of recombinant protein accumulation and secretion. Here, we review recent successes in the development of C. reinhardtii as a biomanufacturing host for recombinant proteins, including antibodies and immunotoxins, hormones, industrial enzymes, an orally-active colostral protein for gastrointestinal health, and subunit vaccines. In addition, we review the biomanufacturing potential of other green algae from the genera Dunaliella and Chlorella.

Journal ArticleDOI
TL;DR: In this paper, the authors provide a framework for landscape restoration, offering seven principles for managing large-scale habitat connectivity and disturbance flow issues, and discuss their implication for management, and illustrate their application with examples.
Abstract: More than a century of forest and fire management of Inland Pacific landscapes has transformed their successional and disturbance dynamics. Regional connectivity of many terrestrial and aquatic habitats is fragmented, flows of some ecological and physical processes have been altered in space and time, and the frequency, size and intensity of many disturbances that configure these habitats have been altered. Current efforts to address these impacts yield a small footprint in comparison to wildfires and insect outbreaks. Moreover, many current projects emphasize thinning and fuels reduction within individual forest stands, while overlooking large-scale habitat connectivity and disturbance flow issues. We provide a framework for landscape restoration, offering seven principles. We discuss their implication for management, and illustrate their application with examples. Historical forests were spatially heterogeneous at multiple scales. Heterogeneity was the result of variability and interactions among native ecological patterns and processes, including successional and disturbance processes regulated by climatic and topographic drivers. Native flora and fauna were adapted to these conditions, which conferred a measure of resilience to variability in climate and recurrent contagious disturbances. To restore key characteristics of this resilience to current landscapes, planning and management are needed at ecoregion, local landscape, successional patch, and tree neighborhood scales. Restoration that works effectively across ownerships and allocations will require active thinking about landscapes as socio-ecological systems that provide services to people within the finite capacities of ecosystems. We focus attention on landscape-level prescriptions as foundational to restoration planning and execution.

Journal ArticleDOI
TL;DR: In this article, a subtraction method utilizing the N -jettiness observable, T ≥ 0, was presented to perform QCD calculations for arbitrary processes at next-to-next-toleading order (NNLO).
Abstract: We present a subtraction method utilizing the N-jettiness observable, T_N, to perform QCD calculations for arbitrary processes at next-to-next-to-leading order (NNLO). Our method employs soft-collinear effective theory (SCET) to determine the IR singular contributions of N-jet cross sections for T_N → 0, and uses these to construct suitable T_N-subtractions. The construction is systematic and economic, due to being based on a physical observable. The resulting NNLO calculation is fully differential and in a form directly suitable for combining with resummation and parton showers. We explain in detail the application to processes with an arbitrary number of massless partons at lepton and hadron colliders together with the required external inputs in the form of QCD amplitudes and lower-order calculations. We provide explicit expressions for the T_N-subtractions at NLO and NNLO. The required ingredients are fully known at NLO, and at NNLO for processes with two external QCD partons. The remaining NNLO ingredient for three or more external partons can be obtained numerically with existing NNLO techniques. As an example, we employ our results to obtain the NNLO rapidity spectrum for Drell-Yan and gluon-fusion Higgs production. We discuss aspects of numerical accuracy and convergence and the practical implementation. We also discuss and comment on possible extensions, such as more-differential subtractions, necessary steps for going to N3LO, and the treatment of massive quarks.
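Schematically (a reader's sketch in simplified notation, not the paper's exact formulas), such a T_N-based calculation splits the cross section at a small cut:

$$ \sigma^{\mathrm{NNLO}} \;=\; \int_0^{T^{\mathrm{cut}}} \mathrm{d}T_N \, \frac{\mathrm{d}\sigma}{\mathrm{d}T_N} \;+\; \int_{T^{\mathrm{cut}}} \mathrm{d}T_N \, \frac{\mathrm{d}\sigma}{\mathrm{d}T_N}\,, $$

where the unresolved (below-cut) piece is obtained from the SCET factorization of the singular $T_N \to 0$ behaviour, schematically $\mathrm{d}\sigma \sim H \cdot B \otimes B \otimes S \otimes \prod_i J_i$, while the above-cut piece has at most NLO-type singularities and can be computed with an NLO calculation of the same process with one additional jet; the neglected power corrections vanish as $T^{\mathrm{cut}} \to 0$.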

Journal ArticleDOI
TL;DR: Infant gains in social-communicative and developmental skills were observed following intervention in most of the reviewed studies, while comparisons with treatment-as-usual control groups elucidate the need for further research.
Abstract: Early detection methods for autism spectrum disorder (ASD) in infancy are rapidly advancing, yet the development of interventions for infants under two years with or at-risk for ASD remains limited. In order to guide research and practice, this paper systematically reviewed studies investigating interventions for infants under 24 months with or at-risk for ASD. Nine studies were identified and evaluated for: (a) participants, (b) intervention approach, (c) experimental design, and (d) outcomes. Studies that collected parent measures reported positive findings for parent acceptability, satisfaction, and improvement in parent implementation of treatment. Infant gains in social-communicative and developmental skills were observed following intervention in most of the reviewed studies, while comparisons with treatment-as-usual control groups elucidate the need for further research. These studies highlight the feasibility of very early intervention and provide preliminary evidence that intervention for at-risk infants may be beneficial for infants and parents.

Journal ArticleDOI
TL;DR: Although drought sensitivity generally decreased with increasing MAP as predicted, there was evidence that the identity and traits of the dominant species, as well as plant functional diversity, influenced sensitivity.
Abstract: Terrestrial ecosystems often vary dramatically in their responses to drought, but the reasons for this are unclear. With climate change forecasts for more frequent and extensive drought in the future, a more complete understanding of the mechanisms that determine differential ecosystem sensitivity to drought is needed. In 2012, the Central US experienced the fourth largest drought in a century, with a regional-scale 40 % reduction in growing season precipitation affecting ecosystems ranging from desert grassland to mesic tallgrass prairie. This provided an opportunity to assess ecosystem sensitivity to a drought of common magnitude in six native grasslands. We tested the prediction that drought sensitivity is inversely related to mean annual precipitation (MAP) by quantifying reductions in aboveground net primary production (ANPP). Long-term ANPP data available for each site (mean length = 16 years) were used as a baseline for calculating reductions in ANPP, and drought sensitivity was estimated as the reduction in ANPP
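A minimal numerical illustration of the sensitivity metric, assuming (as the abstract indicates) that the drought-year reduction in ANPP is measured against each site's long-term baseline; the site names and values below are hypothetical.

```python
# Hypothetical illustration: 2012 reduction in aboveground net primary
# production (ANPP) relative to each site's long-term baseline.
sites = {                        # site: (long-term mean ANPP, 2012 ANPP) in g/m^2
    "desert grassland":  (150, 110),
    "shortgrass steppe": (300, 200),
    "tallgrass prairie": (450, 380),
}
for name, (baseline, drought_year) in sites.items():
    reduction = (baseline - drought_year) / baseline
    print(f"{name:18s} ANPP reduction: {reduction:.0%}")
```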

Journal ArticleDOI
TL;DR: In the forebrain, MOR and DOR are mainly detected in separate neurons, suggesting system-level interactions in high-order processing, and potential MOR/DOR intracellular interactions within the nociceptive pathway offer novel therapeutic perspectives.
Abstract: Opioid receptors are G protein-coupled receptors (GPCRs) that modulate brain function at all levels of neural integration, including autonomic, sensory, emotional and cognitive processing. Mu (MOR) and delta (DOR) opioid receptors functionally interact in vivo, but whether interactions occur at circuitry, cellular or molecular levels remains unsolved. To challenge the hypothesis of MOR/DOR heteromerization in the brain, we generated redMOR/greenDOR double knock-in mice and report dual receptor mapping throughout the nervous system. Data are organized as an interactive database offering an opioid receptor atlas with concomitant MOR/DOR visualization at subcellular resolution, accessible online. We also provide co-immunoprecipitation-based evidence for receptor heteromerization in these mice. In the forebrain, MOR and DOR are mainly detected in separate neurons, suggesting system-level interactions in high-order processing. In contrast, neuronal co-localization is detected in subcortical networks essential for survival involved in eating and sexual behaviors or perception and response to aversive stimuli. In addition, potential MOR/DOR intracellular interactions within the nociceptive pathway offer novel therapeutic perspectives.

Journal ArticleDOI
TL;DR: The findings from this review suggest that CRF is linked to immune/inflammatory, metabolic, neuroendocrine, and genetic biomarkers, and gaps in knowledge are identified.
Abstract: Understanding the etiology of cancer-related fatigue (CRF) is critical to identify targets to develop therapies to reduce CRF burden. The goal of this systematic review was to expand on the initial work by the National Cancer Institute CRF Working Group to understand the state of the science related to the biology of CRF and, specifically, to evaluate studies that examined the relationships between biomarkers and CRF and to develop an etiologic model of CRF to guide researchers on pathways to explore or therapeutic targets to investigate. This review was completed by the Multinational Association of Supportive Care in Cancer Fatigue Study Group–Biomarker Working Group. The initial search used three terms (biomarkers, fatigue, cancer), which yielded 11,129 articles. After removing duplicates, 9145 articles remained. Titles were assessed for the keywords “cancer” and “fatigue”, resulting in 3811 articles. Articles published before 2010 and those with samples <50 were excluded, leaving 75 articles for full-text review. Of the 75 articles, 28 were further excluded for not investigating the associations of biomarkers and CRF. Of the 47 articles reviewed, 25 were cross-sectional and 22 were longitudinal studies. More than half (about 70 %) were published recently (2010–2013). Almost half (45 %) enrolled breast cancer participants. The majority of studies assessed fatigue using self-report questionnaires, and only two studies used clinical parameters to measure fatigue. The findings from this review suggest that CRF is linked to immune/inflammatory, metabolic, neuroendocrine, and genetic biomarkers. We also identified gaps in knowledge and made recommendations for future research.