
Showing papers by "University of California, Santa Barbara" published in 2015


Journal ArticleDOI
13 Feb 2015-Science
TL;DR: This work combines available data on solid waste with a model that uses population density and economic status to estimate the amount of land-based plastic waste entering the ocean: of the 275 million metric tons of plastic waste generated in 192 coastal countries in 2010, an estimated 4.8 to 12.7 million metric tons entered the ocean.
Abstract: Plastic debris in the marine environment is widely documented, but the quantity of plastic entering the ocean from waste generated on land is unknown. By linking worldwide data on solid waste, population density, and economic status, we estimated the mass of land-based plastic waste entering the ocean. We calculate that 275 million metric tons (MT) of plastic waste was generated in 192 coastal countries in 2010, with 4.8 to 12.7 million MT entering the ocean. Population size and the quality of waste management systems largely determine which countries contribute the greatest mass of uncaptured waste available to become plastic marine debris. Without waste management infrastructure improvements, the cumulative quantity of plastic waste available to enter the ocean from land is predicted to increase by an order of magnitude by 2025.
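
As a quick back-of-the-envelope check of the figures quoted above (only the 275 and 4.8 to 12.7 million MT values come from the abstract; the rest is illustrative arithmetic):

    # Figures quoted in the abstract (million metric tons, 2010)
    waste_generated = 275.0                          # plastic waste generated in 192 coastal countries
    ocean_input_low, ocean_input_high = 4.8, 12.7    # estimated mass entering the ocean

    # Implied fraction of generated plastic waste that reached the ocean
    frac_low = ocean_input_low / waste_generated     # ~0.017
    frac_high = ocean_input_high / waste_generated   # ~0.046
    print(f"{100*frac_low:.1f}% to {100*frac_high:.1f}% of generated plastic waste entered the ocean")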

6,689 citations


Journal ArticleDOI
TL;DR: The Climate Hazards group Infrared Precipitation with Stations (CHIRPS) dataset, which uses a novel blending procedure incorporating the spatial correlation structure of CCD estimates to assign interpolation weights, is presented, and it is shown (using the Variable Infiltration Capacity model) that CHIRPS can support effective hydrologic forecasts and trend analyses in southeastern Ethiopia.
Abstract: The Climate Hazards group Infrared Precipitation with Stations (CHIRPS) dataset builds on previous approaches to ‘smart’ interpolation techniques and high resolution, long period of record precipitation estimates based on infrared Cold Cloud Duration (CCD) observations. The algorithm i) is built around a 0.05° climatology that incorporates satellite information to represent sparsely gauged locations, ii) incorporates daily, pentadal, and monthly 1981-present 0.05° CCD-based precipitation estimates, iii) blends station data to produce a preliminary information product with a latency of about 2 days and a final product with an average latency of about 3 weeks, and iv) uses a novel blending procedure incorporating the spatial correlation structure of CCD-estimates to assign interpolation weights. We present the CHIRPS algorithm, global and regional validation results, and show how CHIRPS can be used to quantify the hydrologic impacts of decreasing precipitation and rising air temperatures in the Greater Horn of Africa. Using the Variable Infiltration Capacity model, we show that CHIRPS can support effective hydrologic forecasts and trend analyses in southeastern Ethiopia.
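
The abstract describes blending station observations with CCD-based satellite estimates using weights derived from a spatial correlation structure. The exact CHIRPS weighting scheme is not reproduced here; the snippet below is only an illustrative sketch in which each station's value is weighted by an assumed correlation-versus-distance function before being combined with the satellite background, with the exponential form and the 250 km decay scale being assumptions.

    import numpy as np

    def blend_pixel(ccd_value, station_values, station_dists_km, decay_km=250.0):
        """Illustrative blend of a CCD-based precipitation estimate with nearby
        station values. The correlation-with-distance model and its decay scale
        are assumptions for this sketch, not the CHIRPS algorithm itself."""
        w = np.exp(-np.asarray(station_dists_km) / decay_km)   # assumed correlation weights
        if w.sum() == 0:
            return ccd_value
        station_term = np.sum(w * np.asarray(station_values)) / w.sum()
        alpha = w.max()                      # trust stations more when one is nearby
        return alpha * station_term + (1 - alpha) * ccd_value

    # Example: a pixel with a 12 mm CCD estimate and stations 30 km and 400 km away
    print(blend_pixel(12.0, [20.0, 5.0], [30.0, 400.0]))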

2,895 citations


Journal ArticleDOI
07 May 2015-Nature
TL;DR: The experimental implementation of transistor-free metal-oxide memristor crossbars, with device variability sufficiently low to allow operation of integrated neural networks, is demonstrated in a simple network: a single-layer perceptron (an algorithm for linear classification).
Abstract: Despite much progress in semiconductor integrated circuit technology, the extreme complexity of the human cerebral cortex, with its approximately 10^14 synapses, makes the hardware implementation of neuromorphic networks with a comparable number of devices exceptionally challenging. To provide comparable complexity while operating much faster and with manageable power dissipation, networks based on circuits combining complementary metal-oxide-semiconductors (CMOSs) and adjustable two-terminal resistive devices (memristors) have been developed. In such circuits, the usual CMOS stack is augmented with one or several crossbar layers, with memristors at each crosspoint. There have recently been notable improvements in the fabrication of such memristive crossbars and their integration with CMOS circuits, including first demonstrations of their vertical integration. Separately, discrete memristors have been used as artificial synapses in neuromorphic networks. Very recently, such experiments have been extended to crossbar arrays of phase-change memristive devices. The adjustment of such devices, however, requires an additional transistor at each crosspoint, and hence these devices are much harder to scale than metal-oxide memristors, whose nonlinear current-voltage curves enable transistor-free operation. Here we report the experimental implementation of transistor-free metal-oxide memristor crossbars, with device variability sufficiently low to allow operation of integrated neural networks, in a simple network: a single-layer perceptron (an algorithm for linear classification). The network can be taught in situ using a coarse-grain variety of the delta rule algorithm to perform the perfect classification of 3 × 3-pixel black/white images into three classes (representing letters). This demonstration is an important step towards much larger and more complex memristive neuromorphic networks.
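
The network described here is a single-layer perceptron trained in situ with a coarse-grained variant of the delta rule. The algorithm itself can be sketched in software in a few lines; the ±1 image encoding, the sign-only fixed-amplitude weight update (loosely mimicking identical conductance-change pulses), and the specific 3 × 3 patterns below are illustrative assumptions, not the paper's hardware or data.

    import numpy as np

    # Three illustrative 3x3 black/white patterns (stand-ins for letter images), pixels in {-1,+1}
    X = np.array([
        [ 1, 1, 1,  1,-1, 1,  1, 1, 1],   # 'O'-like
        [-1, 1,-1, -1, 1,-1, -1, 1,-1],   # 'I'-like
        [ 1,-1,-1,  1,-1,-1,  1, 1, 1],   # 'L'-like
    ], dtype=float)
    T = -np.ones((3, 3))
    np.fill_diagonal(T, 1.0)                       # one-hot targets in {-1,+1}: class i -> output i

    rng = np.random.default_rng(0)
    W = rng.normal(0.0, 0.1, size=(3, 10))         # 3 outputs, 9 pixel inputs + 1 bias
    Xb = np.hstack([X, np.ones((3, 1))])           # append a constant bias input

    step = 0.02                                     # fixed update amplitude
    for _ in range(200):
        for x, t in zip(Xb, T):
            y = np.tanh(W @ x)                     # neuron activations
            delta = t - y                           # delta-rule error
            # Coarse-grained update: only the sign of delta_i * x_j is used, so every
            # weight change has the same magnitude regardless of the error size.
            W += step * np.sign(np.outer(delta, x))

    print(np.argmax(np.tanh(Xb @ W.T), axis=1))    # expect [0 1 2]: all three patterns classified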

2,222 citations


Journal ArticleDOI
TL;DR: Modules for Experiments in Stellar Astrophysics (MESA) as discussed by the authors can now simultaneously evolve an interacting pair of differentially rotating stars undergoing transfer and loss of mass and angular momentum, greatly enhancing the prior ability to model binary evolution.
Abstract: We substantially update the capabilities of the open-source software instrument Modules for Experiments in Stellar Astrophysics (MESA). MESA can now simultaneously evolve an interacting pair of differentially rotating stars undergoing transfer and loss of mass and angular momentum, greatly enhancing the prior ability to model binary evolution. New MESA capabilities in fully coupled calculation of nuclear networks with hundreds of isotopes now allow MESA to accurately simulate advanced burning stages needed to construct supernova progenitor models. Implicit hydrodynamics with shocks can now be treated with MESA, enabling modeling of the entire massive star lifecycle, from pre-main sequence evolution to the onset of core collapse and nucleosynthesis from the resulting explosion. Coupling of the GYRE non-adiabatic pulsation instrument with MESA allows for new explorations of the instability strips for massive stars while also accelerating the astrophysical use of asteroseismology data. We improve treatment of mass accretion, giving more accurate and robust near-surface profiles. A new MESA capability to calculate weak reaction rates "on-the-fly" from input nuclear data allows better simulation of accretion induced collapse of massive white dwarfs and the fate of some massive stars. We discuss the ongoing challenge of chemical diffusion in the strongly coupled plasma regime, and exhibit improvements in MESA that now allow for the simulation of radiative levitation of heavy elements in hot stars. We close by noting that the MESA software infrastructure provides bit-for-bit consistency for all results across all the supported platforms, a profound enabling capability for accelerating MESA's development.

2,166 citations


Journal ArticleDOI
24 Nov 2015-ACS Nano
TL;DR: Insight is provided into the theoretical modeling and understanding of the van der Waals forces that hold together the 2D layers in bulk solids, as well as their excitonic properties and growth morphologies.
Abstract: The isolation of graphene in 2004 from graphite was a defining moment for the “birth” of a field: two-dimensional (2D) materials. In recent years, there has been a rapidly increasing number of papers focusing on non-graphene layered materials, including transition-metal dichalcogenides (TMDs), because of the new properties and applications that emerge upon 2D confinement. Here, we review significant recent advances and important new developments in 2D materials “beyond graphene”. We provide insight into the theoretical modeling and understanding of the van der Waals (vdW) forces that hold together the 2D layers in bulk solids, as well as their excitonic properties and growth morphologies. Additionally, we highlight recent breakthroughs in TMD synthesis and characterization and discuss the newest families of 2D materials, including monoelement 2D materials (i.e., silicene, phosphorene, etc.) and transition metal carbide- and carbon nitride-based MXenes. We then discuss the doping and functionalization of 2…

2,036 citations


Journal ArticleDOI
Georges Aad, Brad Abbott, Jalal Abdallah, Ovsat Abdinov, +5,117 more (314 institutions)
TL;DR: A measurement of the Higgs boson mass is presented based on the combined data samples of the ATLAS and CMS experiments at the CERN LHC in the H→γγ and H→ZZ→4ℓ decay channels.
Abstract: A measurement of the Higgs boson mass is presented based on the combined data samples of the ATLAS and CMS experiments at the CERN LHC in the H→γγ and H→ZZ→4l decay channels. The results are obtained from a simultaneous fit to the reconstructed invariant mass peaks in the two channels and for the two experiments. The measured masses from the individual channels and the two experiments are found to be consistent among themselves. The combined measured mass of the Higgs boson is mH=125.09±0.21 (stat)±0.11 (syst) GeV.
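
For orientation, the total uncertainty implied by the quoted statistical and systematic components, if they are treated as independent and added in quadrature (an assumption made here only for illustration), is about 0.24 GeV:

    # Combined Higgs boson mass from the abstract: 125.09 +/- 0.21 (stat) +/- 0.11 (syst) GeV
    stat, syst = 0.21, 0.11
    total = (stat**2 + syst**2) ** 0.5
    print(f"m_H = 125.09 +/- {total:.2f} GeV (stat and syst added in quadrature)")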

1,567 citations


Journal ArticleDOI
TL;DR: A comprehensive treatment of the physics of such interfaces at the contact region is presented and recent progress towards realizing optimal contacts for two-dimensional materials is discussed.
Abstract: The performance of electronic and optoelectronic devices based on two-dimensional layered crystals, including graphene, semiconductors of the transition metal dichalcogenide family such as molybdenum disulphide (MoS2) and tungsten diselenide (WSe2), as well as other emerging two-dimensional semiconductors such as atomically thin black phosphorus, is significantly affected by the electrical contacts that connect these materials with external circuitry. Here, we present a comprehensive treatment of the physics of such interfaces at the contact region and discuss recent progress towards realizing optimal contacts for two-dimensional materials. We also discuss the requirements that must be fulfilled to realize efficient spin injection in transition metal dichalcogenides.

1,293 citations


Journal ArticleDOI
Peter A. R. Ade, Nabila Aghanim, Zeeshan Ahmed, Randol W. Aikin, +354 more (75 institutions)
TL;DR: Strong evidence for dust and no statistically significant evidence for tensor modes are found; various model variations and extensions are probed, including adding a synchrotron component in combination with lower-frequency data, and these are found to make little difference to the r constraint.
Abstract: We report the results of a joint analysis of data from BICEP2/Keck Array and Planck. BICEP2 and Keck Array have observed the same approximately 400 deg² patch of sky centered on RA 0h, Dec. −57.5°. The combined maps reach a depth of 57 nK deg in Stokes Q and U in a band centered at 150 GHz. Planck has observed the full sky in polarization at seven frequencies from 30 to 353 GHz, but much less deeply in any given region (1.2 μK deg in Q and U at 143 GHz). We detect 150×353 cross-correlation in B-modes at high significance. We fit the single- and cross-frequency power spectra at frequencies above 150 GHz to a lensed-ΛCDM model that includes dust and a possible contribution from inflationary gravitational waves (as parameterized by the tensor-to-scalar ratio r). We probe various model variations and extensions, including adding a synchrotron component in combination with lower frequency data, and find that these make little difference to the r constraint. Finally we present an alternative analysis which is similar to a map-based cleaning of the dust contribution, and show that this gives similar constraints. The final result is expressed as a likelihood curve for r, and yields an upper limit r_0.05 < 0.12 at 95% confidence. Marginalizing over dust and r, lensing B-modes are detected at 7.0σ significance.

1,255 citations


Journal ArticleDOI
TL;DR: The most comprehensive and most highly resolved economic input–output framework of the world economy, together with a detailed database of global material flows, is used to calculate the full material requirements of all countries over a period of two decades, demonstrating that countries’ use of nondomestic resources is, on average, about threefold larger than the physical quantity of traded goods.
Abstract: Metrics on resource productivity currently used by governments suggest that some developed countries have increased the use of natural resources at a slower rate than economic growth (relative decoupling) or have even managed to use fewer resources over time (absolute decoupling). Using the material footprint (MF), a consumption-based indicator of resource use, we find the contrary: Achievements in decoupling in advanced economies are smaller than reported or even nonexistent. We present a time series analysis of the MF of 186 countries and identify material flows associated with global production and consumption networks in unprecedented specificity. By calculating raw material equivalents of international trade, we demonstrate that countries’ use of nondomestic resources is, on average, about threefold larger than the physical quantity of traded goods. As wealth grows, countries tend to reduce their domestic portion of materials extraction through international trade, whereas the overall mass of material consumption generally increases. With every 10% increase in gross domestic product, the average national MF increases by 6%. Our findings call into question the sole use of current resource productivity indicators in policy making and suggest the necessity of an additional focus on consumption-based accounting for natural resource use.
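
The abstract's closing statistic (a 10% rise in GDP accompanying a roughly 6% rise in material footprint) corresponds to an elasticity of about 0.6. The power-law form below is an assumption used only to illustrate that scaling, not the study's estimation method:

    # Illustrative: treat the material footprint (MF) as scaling with GDP to a power of ~0.6
    elasticity = 0.6
    gdp_growth = 0.10                              # a 10% increase in gross domestic product
    mf_growth = (1 + gdp_growth) ** elasticity - 1
    print(f"MF increase: {100*mf_growth:.1f}%")    # ~5.9%, i.e. the ~6% quoted in the abstract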

1,182 citations


Journal ArticleDOI
TL;DR: Examination of the information-processing demands of the mind-wandering state suggests that it involves perceptual decoupling to escape the constraints of the moment, its content arises from episodic and affective processes, and its regulation relies on executive control.
Abstract: Conscious experience is fluid; it rarely remains on one topic for an extended period without deviation. Its dynamic nature is illustrated by the experience of mind wandering, in which attention switches from a current task to unrelated thoughts and feelings. Studies exploring the phenomenology of mind wandering highlight the importance of its content and relation to meta-cognition in determining its functional outcomes. Examination of the information-processing demands of the mind-wandering state suggests that it involves perceptual decoupling to escape the constraints of the moment, its content arises from episodic and affective processes, and its regulation relies on executive control. Mind wandering also involves a complex balance of costs and benefits: Its association with various kinds of error underlines its cost, whereas its relationship to creativity and future planning suggest its potential value. Although essential to the stream of consciousness, various strategies may minimize the downsides of mind wandering while maintaining its productive aspects.

Journal ArticleDOI
TL;DR: This work calculates and maps recent change over 5 years in cumulative impacts to marine ecosystems globally from fishing, climate change, and ocean- and land-based stressors, and affirms the importance of addressing climate change to maintain and improve the condition of marine ecosystems.
Abstract: Human pressures on the ocean are thought to be increasing globally, yet we know little about their patterns of cumulative change, which pressures are most responsible for change, and which places are experiencing the greatest increases. Managers and policymakers require such information to make strategic decisions and monitor progress towards management objectives. Here we calculate and map recent change over 5 years in cumulative impacts to marine ecosystems globally from fishing, climate change, and ocean- and land-based stressors. Nearly 66% of the ocean and 77% of national jurisdictions show increased human impact, driven mostly by climate change pressures. Five percent of the ocean is heavily impacted with increasing pressures, requiring management attention. Ten percent has very low impact with decreasing pressures. Our results provide large-scale guidance about where to prioritize management efforts and affirm the importance of addressing climate change to maintain and improve the condition of marine ecosystems.

Journal ArticleDOI
05 Mar 2015-Nature
TL;DR: The protection of classical states from environmental bit-flip errors is reported and the suppression of these errors with increasing system size is demonstrated, motivating further research into the many challenges associated with building a large-scale superconducting quantum computer.
Abstract: Quantum computing becomes viable when a quantum state can be protected from environment-induced error. If quantum bits (qubits) are sufficiently reliable, errors are sparse and quantum error correction (QEC) is capable of identifying and correcting them. Adding more qubits improves the preservation of states by guaranteeing that increasingly larger clusters of errors will not cause logical failure-a key requirement for large-scale systems. Using QEC to extend the qubit lifetime remains one of the outstanding experimental challenges in quantum computing. Here we report the protection of classical states from environmental bit-flip errors and demonstrate the suppression of these errors with increasing system size. We use a linear array of nine qubits, which is a natural step towards the two-dimensional surface code QEC scheme, and track errors as they occur by repeatedly performing projective quantum non-demolition parity measurements. Relative to a single physical qubit, we reduce the failure rate in retrieving an input state by a factor of 2.7 when using five of our nine qubits and by a factor of 8.5 when using all nine qubits after eight cycles. Additionally, we tomographically verify preservation of the non-classical Greenberger-Horne-Zeilinger state. The successful suppression of environment-induced errors will motivate further research into the many challenges associated with building a large-scale superconducting quantum computer.
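
The experiment uses a linear chain of qubits as a bit-flip repetition code, tracking errors with repeated parity measurements. The toy simulation below is not a model of the actual device; it only illustrates, under the assumptions of independent bit-flip errors and ideal majority-vote decoding, why the logical failure rate falls as more qubits are added.

    import numpy as np

    def logical_error_rate(n_qubits, p_flip, n_trials=100_000, rng=None):
        """Probability that majority-vote decoding of an n-qubit repetition code
        fails, given an independent bit-flip probability p_flip per qubit."""
        rng = rng or np.random.default_rng(0)
        flips = rng.random((n_trials, n_qubits)) < p_flip
        return np.mean(flips.sum(axis=1) > n_qubits // 2)

    for n in (1, 5, 9):
        print(n, logical_error_rate(n, p_flip=0.1))
    # Analytic binomial values for comparison: ~0.10 for 1 qubit, ~0.0086 for 5, ~0.0009 for 9,
    # i.e. the same qualitative suppression with system size reported in the abstract.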

Journal ArticleDOI
TL;DR: This paper used a rigorous statistical framework to standardize a global dataset of plastic marine debris measured using surface-trawling plankton nets and coupled this with three different ocean circulation models to spatially interpolate the observations.
Abstract: Microplastic debris floating at the ocean surface can harm marine life. Understanding the severity of this harm requires knowledge of plastic abundance and distributions. Dozens of expeditions measuring microplastics have been carried out since the 1970s, but they have primarily focused on the North Atlantic and North Pacific accumulation zones, with much sparser coverage elsewhere. Here, we use the largest dataset of microplastic measurements assembled to date to assess the confidence we can have in global estimates of microplastic abundance and mass. We use a rigorous statistical framework to standardize a global dataset of plastic marine debris measured using surface-trawling plankton nets and couple this with three different ocean circulation models to spatially interpolate the observations. Our estimates show that the accumulated number of microplastic particles in 2014 ranges from 15 to 51 trillion particles, weighing between 93 and 236 thousand metric tons, which is only approximately 1% of global plastic waste estimated to enter the ocean in the year 2010. These estimates are larger than previous global estimates, but vary widely because of the scarcity of data in most of the world ocean, differences in model formulations, and fundamental knowledge gaps in the sources, transformations and fates of microplastics in the ocean.

Journal ArticleDOI
22 Oct 2015-Nature
TL;DR: Biodiversity mainly stabilizes ecosystem productivity, and productivity-dependent ecosystem services, by increasing resistance to climate events; biodiversity loss is therefore likely to decrease ecosystem stability, and restoration of biodiversity to increase it, mainly by changing the resistance of ecosystem productivity to climate events.
Abstract: It remains unclear whether biodiversity buffers ecosystems against climate extremes, which are becoming increasingly frequent worldwide1. Early results suggested that the ecosystem productivity of diverse grassland plant communities was more resistant, changing less during drought, and more resilient, recovering more quickly after drought, than that of depauperate communities2. However, subsequent experimental tests produced mixed results3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13. Here we use data from 46 experiments that manipulated grassland plant diversity to test whether biodiversity provides resistance during and resilience after climate events. We show that biodiversity increased ecosystem resistance for a broad range of climate events, including wet or dry, moderate or extreme, and brief or prolonged events. Across all studies and climate events, the productivity of low-diversity communities with one or two species changed by approximately 50% during climate events, whereas that of high-diversity communities with 16–32 species was more resistant, changing by only approximately 25%. By a year after each climate event, ecosystem productivity had often fully recovered, or overshot, normal levels of productivity in both high- and low-diversity communities, leading to no detectable dependence of ecosystem resilience on biodiversity. Our results suggest that biodiversity mainly stabilizes ecosystem productivity, and productivity-dependent ecosystem services, by increasing resistance to climate events. Anthropogenic environmental changes that drive biodiversity loss thus seem likely to decrease ecosystem stability14, and restoration of biodiversity to increase it, mainly by changing the resistance of ecosystem productivity to climate events.
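
The resistance contrast quoted above can be restated with simple arithmetic. Only the two percentage changes come from the abstract; the "fraction of normal productivity retained" measure below is an illustrative simplification, not the study's own resistance index:

    # Approximate proportional change in productivity during a climate event (from the abstract)
    change_low_diversity = 0.50    # communities with 1-2 species
    change_high_diversity = 0.25   # communities with 16-32 species

    retained_low = 1 - change_low_diversity      # ~50% of normal productivity retained
    retained_high = 1 - change_high_diversity    # ~75% of normal productivity retained
    print(retained_high / retained_low)          # high-diversity communities retain ~1.5x as much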

Journal ArticleDOI
16 Jan 2015-Science
TL;DR: Today’s low rates of marine extinction may be the prelude to a major extinction pulse, similar to that observed on land during the industrial revolution, as the footprint of human ocean use widens.
Abstract: BACKGROUND: Comparing patterns of terrestrial and marine defaunation helps to place human impacts on marine fauna in context and to navigate toward recovery. Defaunation began in earnest tens of thousands of years later in the oceans than it did on land. Although defaunation has been less severe in the oceans than on land, our effects on marine animals are increasing in pace and impact. Humans have caused few complete extinctions in the sea, but we are responsible for many ecological, commercial, and local extinctions. Despite our late start, humans have already powerfully changed virtually all major marine ecosystems. ADVANCES: Humans have profoundly decreased the abundance of both large (e.g., whales) and small (e.g., anchovies) marine fauna. Such declines can generate waves of ecological change that travel both up and down marine food webs and can alter ocean ecosystem functioning. Human harvesters have also been a major force of evolutionary change in the oceans and have reshaped the genetic structure of marine animal populations. Climate change threatens to accelerate marine defaunation over the next century. The high mobility of many marine animals offers some increased, though limited, capacity for marine species to respond to climate stress, but it also exposes many species to increased risk from other stressors. Because humans are intensely reliant on ocean ecosystems for food and other ecosystem services, we are deeply affected by all of these forecasted changes. Three lessons emerge when comparing the marine and terrestrial defaunation ex-

Journal ArticleDOI
09 Oct 2015-Science
TL;DR: It is demonstrated that infrared spectroscopy can be a fast and convenient characterization method with which to directly distinguish and quantify Pt single atoms from nanoparticles, and it is directly observed that only Pt nanoparticles show activity for carbon monoxide (CO) oxidation and water-gas shift at low temperatures, whereas Pt single atoms behave as spectators.
Abstract: Identification and characterization of catalytic active sites are the prerequisites for an atomic-level understanding of the catalytic mechanism and rational design of high-performance heterogeneous catalysts. Indirect evidence in recent reports suggests that platinum (Pt) single atoms are exceptionally active catalytic sites. We demonstrate that infrared spectroscopy can be a fast and convenient characterization method with which to directly distinguish and quantify Pt single atoms from nanoparticles. In addition, we directly observe that only Pt nanoparticles show activity for carbon monoxide (CO) oxidation and water-gas shift at low temperatures, whereas Pt single atoms behave as spectators. The lack of catalytic activity of Pt single atoms can be partly attributed to the strong binding of CO molecules.

Journal ArticleDOI
TL;DR: In the first worldwide synthesis of in situ and satellite-derived lake data, this paper found that lake summer surface water temperatures rose rapidly (global mean = 0.34°C decade⁻¹) between 1985 and 2009.
Abstract: In this first worldwide synthesis of in situ and satellite-derived lake data, we find that lake summer surface water temperatures rose rapidly (global mean = 0.34°C decade⁻¹) between 1985 and 2009. Our analyses show that surface water warming rates are dependent on combinations of climate and local characteristics, rather than just lake location, leading to the counterintuitive result that regional consistency in lake warming is the exception, rather than the rule. The most rapidly warming lakes are widely geographically distributed, and their warming is associated with interactions among different climatic factors—from seasonally ice-covered lakes in areas where temperature and solar radiation are increasing while cloud cover is diminishing (0.72°C decade⁻¹) to ice-free lakes experiencing increases in air temperature and solar radiation (0.53°C decade⁻¹). The pervasive and rapid warming observed here signals the urgent need to incorporate climate impacts into vulnerability assessments and adaptation efforts for lakes.
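
The headline rate (0.34°C per decade) is a linear trend in summer surface temperature. How such a per-decade rate is obtained from an annual series can be sketched as below; the synthetic data and the ordinary least-squares fit are assumptions for illustration only, not the study's actual multi-lake methodology.

    import numpy as np

    years = np.arange(1985, 2010)                     # 1985-2009, as in the study period
    rng = np.random.default_rng(1)
    temps = 20.0 + 0.034 * (years - years[0]) + rng.normal(0, 0.3, years.size)  # synthetic summer means

    slope_per_year = np.polyfit(years, temps, 1)[0]   # least-squares trend in deg C per year
    print(f"{10 * slope_per_year:.2f} deg C per decade")   # close to the 0.34 used to generate the series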

Journal ArticleDOI
TL;DR: In this paper, a model of thriving through relationships is presented to provide a theoretical foundation for identifying the specific interpersonal processes that underlie the effects of close relationships on thriving, highlighting two life contexts through which people may potentially thrive (coping successfully with life's adversities and actively pursuing life opportunities for growth and development).
Abstract: Close and caring relationships are undeniably linked to health and well-being at all stages in the life span. Yet the specific pathways through which close relationships promote optimal well-being are not well understood. In this article, we present a model of thriving through relationships to provide a theoretical foundation for identifying the specific interpersonal processes that underlie the effects of close relationships on thriving. This model highlights two life contexts through which people may potentially thrive (coping successfully with life's adversities and actively pursuing life opportunities for growth and development), it proposes two relational support functions that are fundamental to the experience of thriving in each life context, and it identifies mediators through which relational support is likely to have long-term effects on thriving. This perspective highlights the need for researchers to take a new look at social support by conceptualizing it as an interpersonal process with a focus on thriving.

Journal ArticleDOI
01 Oct 2015-Nature
TL;DR: This paper demonstrates band-to-band tunnel field-effect transistors (tunnel-FETs), based on a two-dimensional semiconductor, that exhibit steep turn-on; the device is the only planar-architecture tunnel-FET to achieve subthermionic subthreshold swing over four decades of drain current, and the only tunnel-FET (in any architecture) to achieve this at a low power-supply voltage of 0.1 volts.
Abstract: A new type of device, the band-to-band tunnel transistor, which has atomically thin molybdenum disulfide as the active channel, operates in a fundamentally different way from a conventional silicon (MOSFET) transistor; it has turn-on characteristics and low-power operation that are better than those of state-of-the-art MOSFETs or any tunnelling transistor reported so far. Traditional transistor technology is fast approaching its fundamental limits, and two-dimensional semiconducting materials such as molybdenum disulfide (MoS2) are seen as possible replacements for silicon in a next generation of high-density, lower-power chip electronics. A particularly promising prospect is their potential in band-to-band tunnel transistors, which operate in a fundamentally different way from conventional silicon (MOSFET) transistors. So far, few such devices with overall characteristics better than silicon transistors have been demonstrated. Now Kaustav Banerjee et al. have built a tunnel transistor by making a vertical structure with atomically thin MoS2 as the active channel and germanium as the source electrode. It has turn-on characteristics and low-power operation that are better than those of existing silicon transistors, and the results will be of interest in a range of electronic applications including low-power integrated circuits, as well as ultra-sensitive bio sensors or gas sensors. The fast growth of information technology has been sustained by continuous scaling down of the silicon-based metal–oxide field-effect transistor. However, such technology faces two major challenges to further scaling. First, the device electrostatics (the ability of the transistor’s gate electrode to control its channel potential) are degraded when the channel length is decreased, using conventional bulk materials such as silicon as the channel. Recently, two-dimensional semiconducting materials1,2,3,4,5,6,7 have emerged as promising candidates to replace silicon, as they can maintain excellent device electrostatics even at much reduced channel lengths. The second, more severe, challenge is that the supply voltage can no longer be scaled down by the same factor as the transistor dimensions because of the fundamental thermionic limitation of the steepness of turn-on characteristics, or subthreshold swing8,9. To enable scaling to continue without a power penalty, a different transistor mechanism is required to obtain subthermionic subthreshold swing, such as band-to-band tunnelling10,11,12,13,14,15,16. Here we demonstrate band-to-band tunnel field-effect transistors (tunnel-FETs), based on a two-dimensional semiconductor, that exhibit steep turn-on; subthreshold swing is a minimum of 3.9 millivolts per decade and an average of 31.1 millivolts per decade for four decades of drain current at room temperature. By using highly doped germanium as the source and atomically thin molybdenum disulfide as the channel, a vertical heterostructure is built with excellent electrostatics, a strain-free heterointerface, a low tunnelling barrier, and a large tunnelling area. Our atomically thin and layered semiconducting-channel tunnel-FET (ATLAS-TFET) is the only planar architecture tunnel-FET to achieve subthermionic subthreshold swing over four decades of drain current, as recommended in ref. 17, and is also the only tunnel-FET (in any architecture) to achieve this at a low power-supply voltage of 0.1 volts. 
Our device is at present the thinnest-channel subthermionic transistor, and has the potential to open up new avenues for ultra-dense and low-power integrated circuits, as well as for ultra-sensitive biosensors and gas sensors18,19,20,21.
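
Subthreshold swing (SS) is the gate-voltage increase needed for a tenfold increase in drain current, so the 31.1 millivolts per decade average quoted above corresponds to roughly 124 mV of gate swing covering four decades of current. The sketch below restates that definition; the endpoint voltages and currents are assumed values chosen to match the abstract's figures, not measured data.

    import numpy as np

    def avg_subthreshold_swing(v_gate, i_drain):
        """Average SS in mV/decade between two bias points: gate-voltage span
        divided by the number of decades of drain-current change."""
        decades = np.log10(i_drain[-1] / i_drain[0])
        return 1e3 * (v_gate[-1] - v_gate[0]) / decades

    vg = np.array([0.000, 0.1244])      # volts (assumed endpoints, ~124 mV of swing)
    idr = np.array([1e-12, 1e-8])       # amperes (assumed endpoints, 4 decades of current)
    print(avg_subthreshold_swing(vg, idr))   # ~31.1 mV/decade, as reported for the ATLAS-TFET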

Journal ArticleDOI
TL;DR: Author(s): Varki, Ajit; Cummings, Richard D; Aebi, Markus; Packer, Nicole H; Seeberger, Peter H; Esko, Jeffrey D; Stanley, Pamela; Hart, Gerald; Darvill, Alan; Kinoshita, Taroh; Prestegard, James J; Schnaar, Ronald L; Freeze, Hudson H; Marth, Jamey D; Bertozzi, Carolyn R.
Abstract: Author(s): Varki, Ajit; Cummings, Richard D; Aebi, Markus; Packer, Nicole H; Seeberger, Peter H; Esko, Jeffrey D; Stanley, Pamela; Hart, Gerald; Darvill, Alan; Kinoshita, Taroh; Prestegard, James J; Schnaar, Ronald L; Freeze, Hudson H; Marth, Jamey D; Bertozzi, Carolyn R; Etzler, Marilynn E; Frank, Martin; Vliegenthart, Johannes Fg; Lutteke, Thomas; Perez, Serge; Bolton, Evan; Rudd, Pauline; Paulson, James; Kanehisa, Minoru; Toukach, Philip; Aoki-Kinoshita, Kiyoko F; Dell, Anne; Narimatsu, Hisashi; York, William; Taniguchi, Naoyuki; Kornfeld, Stuart

Journal ArticleDOI
TL;DR: Tools from control and network theories are used to offer a mechanistic explanation, drawn from the network organization of white matter microstructure, for how the brain moves between cognitive states, and the results suggest that densely connected areas facilitate the movement of the brain to many easily reachable states.
Abstract: Cognitive function is driven by dynamic interactions between large-scale neural circuits or networks, enabling behaviour. However, fundamental principles constraining these dynamic network processes have remained elusive. Here we use tools from control and network theories to offer a mechanistic explanation for how the brain moves between cognitive states drawn from the network organization of white matter microstructure. Our results suggest that densely connected areas, particularly in the default mode system, facilitate the movement of the brain to many easily reachable states. Weakly connected areas, particularly in cognitive control systems, facilitate the movement of the brain to difficult-to-reach states. Areas located on the boundary between network communities, particularly in attentional control systems, facilitate the integration or segregation of diverse cognitive systems. Our results suggest that structural network differences between cognitive circuits dictate their distinct roles in controlling trajectories of brain network function.
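
The "tools from control theory" referenced here concern how strongly an input at one region can steer a linear network model of brain dynamics, x(t+1) = A x(t) + B u(t), toward different states; one standard summary is the controllability Gramian. The sketch below computes a Gramian-trace score for a toy network. The example matrix, the normalization, and the use of the trace as an "average controllability" score follow common practice but are assumptions here, not the paper's exact pipeline.

    import numpy as np

    def average_controllability(A, node, horizon=50):
        """Trace of the finite-horizon controllability Gramian when input is
        injected only at `node` of the linear system x(t+1) = A x(t) + B u(t)."""
        n = A.shape[0]
        A = A / (1.0 + np.max(np.abs(np.linalg.eigvals(A))))   # rescale for stability
        B = np.zeros((n, 1)); B[node, 0] = 1.0
        G = np.zeros((n, n))
        Ak = np.eye(n)
        for _ in range(horizon):
            G += Ak @ B @ B.T @ Ak.T
            Ak = Ak @ A
        return np.trace(G)

    # Toy 4-node weighted network (symmetric adjacency matrix, assumed values)
    A = np.array([[0, 1, 1, 0],
                  [1, 0, 1, 0],
                  [1, 1, 0, 1],
                  [0, 0, 1, 0]], dtype=float)
    print([round(average_controllability(A, i), 3) for i in range(4)])
    # The densely connected node 2 scores highest here, consistent with the idea that
    # hub-like areas facilitate moving the system to many easily reachable states.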

Journal ArticleDOI
Vardan Khachatryan, Albert M. Sirunyan, Armen Tumasyan, Wolfgang Adam, +2,134 more (142 institutions)
TL;DR: The couplings of the Higgs boson are probed for deviations in magnitude from the standard model predictions in multiple ways, including searches for invisible and undetected decays, and no significant deviations are found.
Abstract: Properties of the Higgs boson with mass near 125 GeV are measured in proton-proton collisions with the CMS experiment at the LHC. Comprehensive sets of production and decay measurements are combined. The decay channels include gamma gamma, ZZ, WW, tau tau, bb, and mu mu pairs. The data samples were collected in 2011 and 2012 and correspond to integrated luminosities of up to 5.1 inverse femtobarns at 7 TeV and up to 19.7 inverse femtobarns at 8 TeV. From the high-resolution gamma gamma and ZZ channels, the mass of the Higgs boson is determined to be 125.02 +0.26 -0.27 (stat) +0.14 -0.15 (syst) GeV. For this mass value, the event yields obtained in the different analyses tagging specific decay channels and production mechanisms are consistent with those expected for the standard model Higgs boson. The combined best-fit signal relative to the standard model expectation is 1.00 +/- 0.09 (stat) +0.08 -0.07 (theo) +/- 0.07 (syst) at the measured mass. The couplings of the Higgs boson are probed for deviations in magnitude from the standard model predictions in multiple ways, including searches for invisible and undetected decays. No significant deviations are found.

Journal ArticleDOI
TL;DR: In this paper, the authors provide a basic physical description of the exciton diffusion in organic semiconductors and present experimental methods that are used to measure the key parameters of this process.
Abstract: The purpose of this review is to provide a basic physical description of exciton diffusion in organic semiconductors. Furthermore, experimental methods that are used to measure the key parameters of this process as well as strategies to manipulate the exciton diffusion length are summarized. Special attention is devoted to the temperature dependence of exciton diffusion and its relationship to Förster energy transfer rates. An extensive table of more than a hundred measurements of the exciton diffusion length in various organic semiconductors is presented. Finally, an outlook on remaining challenges for future research is provided.
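
A useful anchor for the "key parameters" mentioned here: the exciton diffusion length is commonly related to the diffusion coefficient D and the exciton lifetime τ. The one-dimensional convention L_D = sqrt(D·τ) is used below purely as an illustration (other dimensionality prefactors appear in the literature), and the numerical inputs are assumed order-of-magnitude values, not data from the review.

    # Illustrative one-dimensional estimate of the exciton diffusion length, L_D = sqrt(D * tau)
    D = 1e-3      # diffusion coefficient in cm^2/s (assumed, typical order for organic semiconductors)
    tau = 1e-9    # exciton lifetime in s (assumed)
    L_D = (D * tau) ** 0.5               # in cm
    print(f"L_D ~ {L_D * 1e7:.0f} nm")   # ~10 nm, the order of magnitude commonly quoted for organics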

OtherDOI
27 Apr 2015
TL;DR: Communication Accommodation Theory (CAT) as discussed by the authors is a general theoretical framework of both interpersonal and intergroup communication, and it seeks to explain and predict why, when, and how people adjust their communicative behavior during social interaction, and what social consequences result from those adjustments.
Abstract: Communication Accommodation Theory (CAT) is a general theoretical framework of both interpersonal and intergroup communication. It seeks to explain and predict why, when, and how people adjust their communicative behavior during social interaction, and what social consequences result from those adjustments. In this entry, a brief historical overview of CAT's development is first provided, and some of its basic concepts are introduced. Second, the different adjustment strategies that speakers may enact are explained, and objective and subjective measures of accommodation are distinguished. Third, the motivations underlying communicative adjustment are examined, and the ways in which they can be shaped by the sociohistorical context in which an interaction is embedded are discussed. Fourth, the social consequences of communicative adjustment (and nonadjustment) are explored, and some of the many factors that mediate and moderate people's evaluations of others’ behavior are discussed. Finally, previous CAT principles are refined and elaborated, and directions for future research are suggested. Keywords: accommodation; convergence; divergence; intergroup communication; interpersonal communication; language; overaccommodation; social identity; underaccommodation

Journal ArticleDOI
TL;DR: These complex relationships between phenotypic differences and the dynamics of competing species argue against the simple use of single functional traits to infer community assembly processes but lay the groundwork for a theoretically justified trait-based community ecology.
Abstract: Understanding the processes maintaining species diversity is a central problem in ecology, with implications for the conservation and management of ecosystems. Although biologists often assume that trait differences between competitors promote diversity, empirical evidence connecting functional traits to the niche differences that stabilize species coexistence is rare. Obtaining such evidence is critical because traits also underlie the average fitness differences driving competitive exclusion, and this complicates efforts to infer community dynamics from phenotypic patterns. We coupled field-parameterized mathematical models of competition between 102 pairs of annual plants with detailed sampling of leaf, seed, root, and whole-plant functional traits to relate phenotypic differences to stabilizing niche and average fitness differences. Single functional traits were often well correlated with average fitness differences between species, indicating that competitive dominance was associated with late phenology, deep rooting, and several other traits. In contrast, single functional traits were poorly correlated with the stabilizing niche differences that promote coexistence. Niche differences could only be described by combinations of traits, corresponding to differentiation between species in multiple ecological dimensions. In addition, several traits were associated with both fitness differences and stabilizing niche differences. These complex relationships between phenotypic differences and the dynamics of competing species argue against the simple use of single functional traits to infer community assembly processes but lay the groundwork for a theoretically justified trait-based community ecology.

Journal ArticleDOI
TL;DR: In this article, the authors discuss the theoretical prediction, experimental realization, and potential use of Majorana zero modes in future information processing devices through braiding-based topological quantum computation.
Abstract: We provide a current perspective on the rapidly developing field of Majorana zero modes in solid state systems. We emphasize the theoretical prediction, experimental realization, and potential use of Majorana zero modes in future information processing devices through braiding-based topological quantum computation. Well-separated Majorana zero modes should manifest non-Abelian braiding statistics suitable for unitary gate operations for topological quantum computation. Recent experimental work, following earlier theoretical predictions, has shown specific signatures consistent with the existence of Majorana modes localized at the ends of semiconductor nanowires in the presence of superconducting proximity effect. We discuss the experimental findings and their theoretical analyses, and provide a perspective on the extent to which the observations indicate the existence of anyonic Majorana zero modes in solid state systems. We also discuss fractional quantum Hall systems (the 5/2 state) in this context. We describe proposed schemes for carrying out braiding with Majorana zero modes as well as the necessary steps for implementing topological quantum computation.

Journal ArticleDOI
TL;DR: In this article, it was shown that holographic entanglement entropy can be calculated at arbitrary orders in the bulk Planck constant using the concept of a "quantum extremal surface", i.e., a surface which extremizes the generalized entropy.
Abstract: We propose that holographic entanglement entropy can be calculated at arbitrary orders in the bulk Planck constant using the concept of a “quantum extremal surface”: a surface which extremizes the generalized entropy, i.e. the sum of area and bulk entanglement entropy. At leading order in bulk quantum corrections, our proposal agrees with the formula of Faulkner, Lewkowycz, and Maldacena, which was derived only at this order; beyond leading order corrections, the two conjectures diverge. Quantum extremal surfaces lie outside the causal domain of influence of the boundary region as well as its complement, and in some spacetimes there are barriers preventing them from entering certain regions. We comment on the implications for bulk reconstruction.
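
In equation form, the prescription summarized above can be written as follows (a restatement of the abstract in standard notation; symbols such as G_N, ħ, and the bulk region Σ_X bounded by X and the boundary region A follow common usage and are not defined in the abstract itself):

    S_{\mathrm{gen}}(X) \;=\; \frac{\mathrm{Area}(X)}{4 G_N \hbar} \;+\; S_{\mathrm{bulk}}(\Sigma_X),
    \qquad
    S(A) \;=\; \min_X \, \underset{X}{\mathrm{ext}} \; S_{\mathrm{gen}}(X),

where the extremization is over surfaces X homologous to the boundary region A, and the surface of minimal generalized entropy is chosen when several extrema exist.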

Journal ArticleDOI
TL;DR: These guidelines are a working document that reflects the state of the field at the time of publication, and any decision by practitioners to apply them must be made in light of local resources and individual patient circumstances.

Journal ArticleDOI
TL;DR: Using data sets from the western USA and associated studies, a framework is presented for determining the relative contribution of drought stress, insect attack, and their interactions, which is critical for modeling mortality in future climates.
Abstract: Climate change is expected to drive increased tree mortality through drought, heat stress, and insect attacks, with manifold impacts on forest ecosystems. Yet, climate-induced tree mortality and biotic disturbance agents are largely absent from process-based ecosystem models. Using data sets from the western USA and associated studies, we present a framework for determining the relative contribution of drought stress, insect attack, and their interactions, which is critical for modeling mortality in future climates. We outline a simple approach that identifies the mechanisms associated with two guilds of insects - bark beetles and defoliators - which are responsible for substantial tree mortality. We then discuss cross-biome patterns of insect-driven tree mortality and draw upon available evidence contrasting the prevalence of insect outbreaks in temperate and tropical regions. We conclude with an overview of tools and promising avenues to address major challenges. Ultimately, a multitrophic approach that captures tree physiology, insect populations, and tree-insect interactions will better inform projections of forest ecosystem responses to climate change.