
Showing papers by "University of California" published in 2013


Journal ArticleDOI
TL;DR: In this article, the authors argue that the following three statements cannot all be true: (i) Hawking radiation is in a pure state, (ii) the information carried by the radiation is emitted from the region near the horizon, with low energy effective field theory valid beyond some microscopic distance from the horizon, and (iii) the infalling observer encounters nothing unusual at the horizon.
Abstract: We argue that the following three statements cannot all be true: (i) Hawking radiation is in a pure state, (ii) the information carried by the radiation is emitted from the region near the horizon, with low energy effective field theory valid beyond some microscopic distance from the horizon, and (iii) the infalling observer encounters nothing unusual at the horizon. Perhaps the most conservative resolution is that the infalling observer burns up at the horizon. Alternatives would seem to require novel dynamics that nevertheless cause notable violations of semiclassical physics at macroscopic distances from the horizon.

1,476 citations


Journal ArticleDOI
TL;DR: In this paper, the authors argue that escalating disaster losses, coupled with the increasing frequency of billion-dollar disaster events such as the recent Hurricane Sandy, highlight some of the challenges to hazards and disaster policy.
Abstract: Escalating disaster losses coupled with the increasing frequency of billion-dollar disaster events, such as the recent Hurricane Sandy, highlight some of the challenges to hazards and disaster policy...

708 citations


Patent
15 Mar 2013
TL;DR: In this paper, a DNA-targeting RNA is provided that comprises a targeting sequence and, together with a modifying polypeptide, provides for site-specific modification of a target DNA and/or a polypeptide associated with the target DNA.
Abstract: The present disclosure provides a DNA-targeting RNA that comprises a targeting sequence and, together with a modifying polypeptide, provides for site-specific modification of a target DNA and/or a polypeptide associated with the target DNA. The present disclosure further provides site-specific modifying polypeptides. The present disclosure further provides methods of site-specific modification of a target DNA and/or a polypeptide associated with the target DNA. The present disclosure provides methods of modulating transcription of a target nucleic acid in a target cell, generally involving contacting the target nucleic acid with an enzymatically inactive Cas9 polypeptide and a DNA-targeting RNA. Kits and compositions for carrying out the methods are also provided. The present disclosure provides genetically modified cells that produce Cas9; and Cas9 transgenic non-human multicellular organisms.

702 citations


Journal ArticleDOI
TL;DR: In this paper, the authors show that embedding the interior Hilbert space of an old black hole into the Hilbert space of the early radiation is inconsistent, as is embedding the semi-classical interior of an AdS black hole into any dual CFT Hilbert space.
Abstract: We address claimed alternatives to the black hole firewall. We show that embedding the interior Hilbert space of an old black hole into the Hilbert space of the early radiation is inconsistent, as is embedding the semi-classical interior of an AdS black hole into any dual CFT Hilbert space. We develop the use of large AdS black holes as a system to sharpen the firewall argument. We also reiterate arguments that unitary non-local theories can avoid firewalls only if the non-localities are suitably dramatic.

563 citations


Book ChapterDOI
01 Jan 2013
TL;DR: In this paper, the authors discuss structural equation models (SEMs) and their role in causal analysis, and dispel a variety of misunderstandings and myths about the nature of SEMs whose repetition has led some to believe they are true.
Abstract: Causality was at the center of the early history of structural equation models (SEMs) which continue to serve as the most popular approach to causal analysis in the social sciences. Through decades of development, critics and defenses of the capability of SEMs to support causal inference have accumulated. A variety of misunderstandings and myths about the nature of SEMs and their role in causal analysis have emerged, and their repetition has led some to believe they are true. Our chapter is organized by presenting eight myths about causality and SEMs in the hope that this will lead to a more accurate understanding. More specifically, the eight myths are the following: (1) SEMs aim to establish causal relations from associations alone, (2) SEMs and regression are essentially equivalent, (3) no causation without manipulation, (4) SEMs are not equipped to handle nonlinear causal relationships, (5) a potential outcome framework is more principled than SEMs, (6) SEMs are not applicable to experiments with randomized treatments, (7) mediation analysis in SEMs is inherently noncausal, and (8) SEMs do not test any major part of the theory against the data. We present the facts that dispel these myths, describe what SEMs can and cannot do, and briefly present our critique of current practice using SEMs. We conclude that the current capabilities of SEMs to formalize and implement causal inference tasks are indispensable; their potential to do more is even greater.

495 citations


Journal ArticleDOI
TL;DR: The Radiation Belt Storm Probes (RBSP)-Energetic Particle, Composition, and Thermal Plasma (ECT) suite contains an innovative complement of particle instruments to ensure the highest quality measurements ever made in the inner magnetosphere and radiation belts as mentioned in this paper.
Abstract: The Radiation Belt Storm Probes (RBSP)-Energetic Particle, Composition, and Thermal Plasma (ECT) suite contains an innovative complement of particle instruments to ensure the highest quality measurements ever made in the inner magnetosphere and radiation belts. The coordinated RBSP-ECT particle measurements, analyzed in combination with fields and waves observations and state-of-the-art theory and modeling, are necessary for understanding the acceleration, global distribution, and variability of radiation belt electrons and ions, key science objectives of NASA’s Living With a Star program and the Van Allen Probes mission. The RBSP-ECT suite consists of three highly coordinated instruments: the Magnetic Electron Ion Spectrometer (MagEIS), the Helium Oxygen Proton Electron (HOPE) sensor, and the Relativistic Electron Proton Telescope (REPT). Collectively they cover, continuously, the full electron and ion spectra from one eV to tens of MeV with sufficient energy resolution, pitch angle coverage and resolution, and with composition measurements in the critical energy range up to 50 keV and also from a few to 50 MeV/nucleon. All three instruments are based on measurement techniques proven in the radiation belts. The instruments use those proven techniques along with innovative new designs, optimized for operation in the most extreme conditions in order to provide unambiguous separation of ions and electrons and clean energy responses even in the presence of extreme penetrating background environments. The design, fabrication and operation of ECT spaceflight instrumentation in the harsh radiation belt environment ensure that particle measurements have the fidelity needed for closure in answering key mission science questions. ECT instrument details are provided in companion papers in this same issue.

492 citations


Journal ArticleDOI
TL;DR: In this paper, the Van Allen Probes were used to measure three dimensional quasi-static and low frequency electric fields and waves associated with the acceleration of energetic charged particles in the inner magnetosphere of the Earth.
Abstract: The Electric Fields and Waves (EFW) Instruments on the two Radiation Belt Storm Probe (RBSP) spacecraft (recently renamed the Van Allen Probes) are designed to measure three dimensional quasi-static and low frequency electric fields and waves associated with the major mechanisms responsible for the acceleration of energetic charged particles in the inner magnetosphere of the Earth. For this measurement, the instrument uses two pairs of spherical double probe sensors at the ends of orthogonal centripetally deployed booms in the spin plane with tip-to-tip separations of 100 meters. The third component of the electric field is measured by two spherical sensors separated by ∼15 m, deployed at the ends of two stacer booms oppositely directed along the spin axis of the spacecraft. The instrument provides a continuous stream of measurements over the entire orbit of the low frequency electric field vector at 32 samples/s in a survey mode. This survey mode also includes measurements of spacecraft potential to provide information on thermal electron plasma variations and structure. Survey mode spectral information allows the continuous evaluation of the peak value and spectral power in electric, magnetic and density fluctuations from several Hz to 6.5 kHz. On-board cross-spectral data allows the calculation of field-aligned wave Poynting flux along the magnetic field. For higher frequency waveform information, two different programmable burst memories are used with nominal sampling rates of 512 samples/s and 16 k samples/s. The EFW burst modes provide targeted measurements over brief time intervals of 3-d electric fields, 3-d wave magnetic fields (from the EMFISIS magnetic search coil sensors), and spacecraft potential. In the burst modes all six sensor-spacecraft potential measurements are telemetered enabling interferometric timing of small-scale plasma structures. 
In the first burst mode, the instrument stores all or a substantial fraction of the high frequency measurements in a 32 gigabyte burst memory. The sub-intervals to be downloaded are uplinked by ground command after inspection of instrument survey data and other information available on the ground. The second burst mode involves autonomous storing and playback of data controlled by flight software algorithms, which assess the “highest quality” events on the basis of instrument measurements and information from other instruments available on orbit. The EFW instrument provides 3-d wave electric field signals with a frequency response up to 400 kHz to the EMFISIS instrument for analysis and telemetry (Kletzing et al. Space Sci. Rev. 2013).

479 citations


Book ChapterDOI
26 May 2013
TL;DR: Message-Locked Encryption (MLE), as discussed by the authors, is a new cryptographic primitive in which the key under which encryption and decryption are performed is itself derived from the message.
Abstract: We formalize a new cryptographic primitive that we call Message-Locked Encryption (MLE), where the key under which encryption and decryption are performed is itself derived from the message. MLE provides a way to achieve secure deduplication (space-efficient secure outsourced storage), a goal currently targeted by numerous cloud-storage providers. We provide definitions both for privacy and for a form of integrity that we call tag consistency. Based on this foundation, we make both practical and theoretical contributions. On the practical side, we provide ROM security analyses of a natural family of MLE schemes that includes deployed schemes. On the theoretical side, the challenge is standard model solutions, and we make connections with deterministic encryption, hash functions secure on correlated inputs and the sample-then-extract paradigm to deliver schemes under different assumptions and for different classes of message sources. Our work shows that MLE is a primitive of both practical and theoretical interest.
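The deduplication property comes from making the key a deterministic function of the message: equal plaintexts then encrypt to equal ciphertexts and tags, so a server can store one copy. The sketch below illustrates that core idea only (it resembles the deployed convergent-encryption schemes the paper's ROM analysis covers); the SHA-256 counter-mode keystream and the `encrypt`/`decrypt` helper names are illustrative stand-ins, not one of the paper's analyzed constructions.

```python
import hashlib

def derive_key(message: bytes) -> bytes:
    # MLE: the key is derived from the message itself, so identical
    # plaintexts yield identical ciphertexts (enabling deduplication).
    return hashlib.sha256(message).digest()

def _keystream(key: bytes, length: int) -> bytes:
    # Illustrative counter-mode keystream built from SHA-256
    # (a stand-in, not a vetted cipher).
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def encrypt(message: bytes) -> tuple[bytes, bytes, bytes]:
    key = derive_key(message)
    ct = bytes(m ^ k for m, k in zip(message, _keystream(key, len(message))))
    tag = hashlib.sha256(ct).digest()  # tag used for dedup / tag consistency
    return key, ct, tag

def decrypt(key: bytes, ct: bytes) -> bytes:
    # XOR keystream cipher is its own inverse.
    return bytes(c ^ k for c, k in zip(ct, _keystream(key, len(ct))))
```

Because the whole pipeline is deterministic in the message, such schemes can only offer privacy for unpredictable message sources, which is why the paper's privacy definitions are stated relative to classes of message sources.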

461 citations


Journal ArticleDOI
TL;DR: It is shown that presentation of antibodies on the surface of nonspherical particles enhances antibody specificity as well as avidity toward their targets, opening unique opportunities for particulate forms of antibodies in therapeutics and diagnostics.
Abstract: Monoclonal antibodies are used in numerous therapeutic and diagnostic applications; however, their efficacy is contingent on specificity and avidity. Here, we show that presentation of antibodies on the surface of nonspherical particles enhances antibody specificity as well as avidity toward their targets. Using spherical, rod-, and disk-shaped polystyrene nano- and microparticles and trastuzumab as the targeting antibody, we studied specific and nonspecific uptake in three breast cancer cell lines: BT-474, SK-BR-3, and MDA-MB-231. Rods exhibited higher specific uptake and lower nonspecific uptake in all cells compared with spheres. This surprising interplay between particle shape and antibodies originates from the unique role of shape in determining binding and unbinding of particles to cell surface. In addition to exhibiting higher binding and internalization, trastuzumab-coated rods also exhibited greater inhibition of BT-474 breast cancer cell growth in vitro to a level that could not be attained by soluble forms of the antibody. The effect of trastuzumab-coated rods on cells was enhanced further by replacing polystyrene particles with pure chemotherapeutic drug nanoparticles of comparable dimensions made from camptothecin. Trastuzumab-coated camptothecin nanoparticles inhibited cell growth at a dose 1,000-fold lower than that required for comparable inhibition of growth using soluble trastuzumab and 10-fold lower than that using BSA-coated camptothecin. These results open unique opportunities for particulate forms of antibodies in therapeutics and diagnostics.

451 citations


Journal ArticleDOI
TL;DR: Flexibility programs have become widespread in the United States, but their use has not; as mentioned in this paper, 79% of companies say they allow some of their employees, and 37% officially allow all or most employees, to periodically change starting or quitting times (Galinsky, Bond, & Sakai, 2008).
Abstract: Flexibility programs have become widespread in the United States, but their use has not. According to a recent study, 79% of companies say they allow some of their employees, and 37% officially allow all or most of their employees, to periodically change starting or quitting times (Galinsky, Bond, & Sakai, 2008). Although researchers often regard the official availability of flexibility and other work–life policies as an indicator of an organization’s responsiveness to employees’ work–life concerns (Davis & Kalleberg, 2006), having policies on the books does not always mean that workers feel comfortable using these policies (Blair-Loy, Wharton, & Goodstein, 2011). Studies that have assessed usage rates generally find that they are low. This has proved a remarkably resilient problem. The basic forms of workplace flexibility have been around for decades: flextime, part-time schedules, compressed workweeks, job shares (Friedman, n.d.). Yet usage of these programs remains low.

447 citations


Journal ArticleDOI
01 May 2013 - Gut
TL;DR: Progress in this area will be facilitated by optimising strain, dose and product formulations, including protective commensal species; matching these formulations with selectively responsive subpopulations; and identifying ways to manipulate diet to modify bacterial profiles and metabolism.
Abstract: Probiotics are derived from traditional fermented foods, from beneficial commensals or from the environment. They act through diverse mechanisms affecting the composition or function of the commensal microbiota and by altering host epithelial and immunological responses. Certain probiotic interventions have shown promise in selected clinical conditions where aberrant microbiota have been reported, such as atopic dermatitis, necrotising enterocolitis, pouchitis and possibly irritable bowel syndrome. However, no studies have been conducted that can causally link clinical improvements to probiotic-induced microbiota changes. Whether a disease-prone microbiota pattern can be remodelled to a more robust, resilient and disease-free state by probiotic administration remains a key unanswered question. Progress in this area will be facilitated by: optimising strain, dose and product formulations, including protective commensal species; matching these formulations with selectively responsive subpopulations; and identifying ways to manipulate diet to modify bacterial profiles and metabolism.

Posted ContentDOI
TL;DR: In this article, the authors review the existing evidence regarding the effects of technological and non-technological innovations on the productivity of firms and the existence of possible complementarities between these different forms of innovation.
Abstract: This paper reviews the existing evidence regarding the effects of technological and non-technological innovations on the productivity of firms and the existence of possible complementarities between these different forms of innovation.

Journal ArticleDOI
TL;DR: The authors found that negative attitudes were associated with endorsement of a binary conception of gender; higher levels of psychological authoritarianism, political conservatism, and anti-egalitarianism, and (for women) religiosity; and lack of personal contact with sexual minorities.
Abstract: Using data from a national probability sample of heterosexual U.S. adults (N = 2,281), the present study describes the distribution and correlates of men’s and women’s attitudes toward transgender people. Feeling thermometer ratings of transgender people were strongly correlated with attitudes toward gay men, lesbians, and bisexuals, but were significantly less favorable. Attitudes toward transgender people were more negative among heterosexual men than women. Negative attitudes were associated with endorsement of a binary conception of gender; higher levels of psychological authoritarianism, political conservatism, and anti-egalitarianism, and (for women) religiosity; and lack of personal contact with sexual minorities. In regression analysis, sexual prejudice accounted for much of the variance in transgender attitudes, but respondent gender, educational level, authoritarianism, anti-egalitarianism, and (for women) religiosity remained significant predictors with sexual prejudice statistically controlled. Implications and directions for future research on attitudes toward transgender people are discussed.

01 Jan 2013
TL;DR: In this article, the authors present the first systematic study of the costs of cybercrime in the UK and the world as a whole, focusing on the direct costs, indirect costs and defence costs.
Abstract: This chapter documents what we believe to be the first systematic study of the costs of cybercrime. The initial workshop paper was prepared in response to a request from the UK Ministry of Defence following scepticism that previous studies had hyped the problem. For each of the main categories of cybercrime we set out what is and is not known of the direct costs, indirect costs and defence costs – both to the UK and to the world as a whole. We distinguish carefully between traditional crimes that are now “cyber” because they are conducted online (such as tax and welfare fraud); transitional crimes whose modus operandi has changed substantially as a result of the move online (such as credit card fraud); new crimes that owe their existence to the Internet; and what we might call platform crimes such as the provision of botnets which facilitate other crimes rather than being used to extract money from victims directly. As far as direct costs are concerned, we find that traditional offences such as tax and welfare fraud cost the typical citizen in the low hundreds of pounds/euros/dollars a year; transitional frauds cost a few pounds/euros/dollars; while the new computer crimes cost in the tens of pence/cents. However, the indirect costs and defence costs are much higher for transitional and new crimes. For the former they may be roughly comparable to what the criminals earn, while for the latter they may be an order of magnitude more. As a striking example, the botnet behind a third of the spam sent in 2010 earned its owners around $2.7 million, while worldwide expenditures on spam prevention probably exceeded a billion dollars. We are extremely inefficient at fighting cybercrime; or to put it another way, cyber-crooks are like terrorists or metal thieves in that their activities impose disproportionate costs on society. 
Some of the reasons for this are well-known: cybercrimes are global and have strong externalities, while traditional crimes such as burglary and car theft are local, and the associated equilibria have emerged after many years of optimisation. As for the more direct question of what should be done, our figures suggest that we should spend less in anticipation of cybercrime (on antivirus, firewalls, etc.) and more in response – that is, on the prosaic business of hunting down cyber-criminals and throwing them in jail.

Journal ArticleDOI
TL;DR: There are 18 mammalian cytochrome P450 (CYP) families, which encode 57 genes in the human genome, but the CYP2, CYP3 and CYP4 families contain far more genes than the other 15 families; these three families are also the ones that are dramatically larger in rodent genomes.
Abstract: There are 18 mammalian cytochrome P450 (CYP) families, which encode 57 genes in the human genome. CYP2, CYP3 and CYP4 families contain far more genes than the other 15 families; these three families are also the ones that are dramatically larger in rodent genomes. Most (if not all) genes in the CYP1, CYP2, CYP3 and CYP4 families encode enzymes involved in eicosanoid metabolism and are inducible by various environmental stimuli (i.e. diet, chemical inducers, drugs, pheromones, etc.), whereas the other 14 gene families often have only a single member, and are rarely if ever inducible or redundant. Although the CYP2 and CYP3 families can be regarded as largely redundant and promiscuous, mutations or other defects in one or more genes of the remaining 16 gene families are primarily the ones responsible for P450-specific diseases-confirming these genes are not superfluous or promiscuous but rather are more directly involved in critical life functions. P450-mediated diseases comprise those caused by: aberrant steroidogenesis; defects in fatty acid, cholesterol and bile acid pathways; vitamin D dysregulation and retinoid (as well as putative eicosanoid) dysregulation during fertilization, implantation, embryogenesis, foetogenesis and neonatal development.


Journal ArticleDOI
TL;DR: Patients treated with percutaneous repair of the mitral valve more commonly required surgery to treat residual MR; however, after the first year of follow-up, there were few surgeries required after either percutaneous or surgical treatment and no difference in the prevalence of moderate-severe and severe MR or mortality at 4 years.

Journal ArticleDOI
TL;DR: In this paper, the authors proposed two real-time indices, BSISO1 and BSISO2, based on multivariate empirical orthogonal function (MV-EOF) analysis of daily anomalies of outgoing longwave radiation (OLR) and zonal wind at 850 hPa (U850) in the region 10°S–40°N, 40°–160°E, for the extended boreal summer (May–October) season over the 30-year period 1981–2010.
Abstract: The boreal summer intraseasonal oscillation (BSISO) of the Asian summer monsoon (ASM) is one of the most prominent sources of short-term climate variability in the global monsoon system. Compared with the related Madden-Julian Oscillation (MJO) it is more complex in nature, with prominent northward propagation and variability extending much further from the equator. In order to facilitate detection, monitoring and prediction of the BSISO we suggest two real-time indices: BSISO1 and BSISO2, based on multivariate empirical orthogonal function (MV-EOF) analysis of daily anomalies of outgoing longwave radiation (OLR) and zonal wind at 850 hPa (U850) in the region 10°S–40°N, 40°–160°E, for the extended boreal summer (May–October) season over the 30-year period 1981–2010. BSISO1 is defined by the first two principal components (PCs) of the MV-EOF analysis, which together represent the canonical northward propagating variability that often occurs in conjunction with the eastward MJO with quasi-oscillating periods of 30–60 days. BSISO2 is defined by the third and fourth PCs, which together mainly capture the northward/northwestward propagating variability with periods of 10–30 days during primarily the pre-monsoon and monsoon-onset season. The BSISO1 circulation cells are more Rossby wave like with a northwest to southeast slope, whereas the circulation associated with BSISO2 is more elongated and front-like with a southwest to northeast slope. BSISO2 is shown to modulate the timing of the onset of Indian and South China Sea monsoons. Together, the two BSISO indices are capable of describing a large fraction of the total intraseasonal variability in the ASM region, and better represent the northward and northwestward propagation than the real-time multivariate MJO (RMM) index of Wheeler and Hendon.
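The MV-EOF machinery behind the indices can be sketched with an SVD of a combined, per-field-normalized anomaly matrix. The arrays below are synthetic random stand-ins for the OLR and U850 anomaly fields (the real indices use 1981–2010 May–October data over 10°S–40°N, 40°–160°E); only the mechanics of the decomposition are illustrated.

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical (time, gridpoint) anomaly matrices standing in for the
# daily OLR and U850 fields.
ntime, ngrid = 500, 200
olr = rng.standard_normal((ntime, ngrid))
u850 = rng.standard_normal((ntime, ngrid))

# Multivariate EOF: normalize each field by its own standard deviation,
# then concatenate along the spatial dimension before decomposing.
X = np.hstack([olr / olr.std(), u850 / u850.std()])
X = X - X.mean(axis=0)

# SVD: rows of Vt are the EOF spatial patterns; U * s gives the
# principal-component time series.
U, s, Vt = np.linalg.svd(X, full_matrices=False)
pcs = U * s                          # (time, mode)
variance_frac = s**2 / np.sum(s**2)  # fraction of variance per mode

# BSISO1 would be built from PC1/PC2, BSISO2 from PC3/PC4.
bsiso1 = pcs[:, 0:2]
bsiso2 = pcs[:, 2:4]
```

In the paper's usage, each PC pair is further normalized by its standard deviation so that the pair defines an amplitude and a phase for monitoring and prediction, analogous to the RMM index for the MJO.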

Book ChapterDOI
01 Jan 2013
TL;DR: In this chapter, the authors study inverse heat transfer analyses, in which one or more input parameters causing a certain radiative intensity field (such as radiative properties or a temperature distribution) must be deduced from measurements of radiative intensity or radiative flux, in contrast to "direct" problems where the geometry, temperatures, and radiative properties are known.
Abstract: Up to this point we have concerned ourselves with radiative heat transfer problems, where the necessary geometry, temperatures, and radiative properties are known, enabling us to calculate the radiative intensity and radiative heat fluxes in such enclosures. Such cases are sometimes called “direct” heat transfer problems. However, there are many important engineering applications where knowledge of one or more input parameters is desired that cause a certain radiative intensity field. For example, it may be desired to control the temperatures of heating elements in a furnace, in order to achieve a specified temperature distribution or radiative heat load on an object being heated. Or the aim may be to deduce difficult to measure parameters (such as radiative properties, temperature fields inside a furnace, etc.) based on measurements of radiative intensity or radiative flux. Such calculations are known as inverse heat transfer analyses.
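A toy version of such an inverse problem, under heavy simplifying assumptions: suppose measured radiative fluxes q depend linearly on unknown surface emissive powers e through a known kernel matrix A (a hypothetical stand-in for the exchange relations of a real enclosure, which are nonlinear in temperature). Inverse problems of this kind are typically ill-conditioned, so a regularized least-squares solve is the standard first tool.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 20

# Hypothetical smoothing kernel relating surface emissive powers e to
# measured fluxes q: q = A @ e (+ measurement noise).
A = np.exp(-np.abs(np.subtract.outer(np.arange(n), np.arange(n))) / 3.0)
e_true = np.sin(np.linspace(0.0, np.pi, n))      # "unknown" to be recovered
q = A @ e_true + 1e-3 * rng.standard_normal(n)   # noisy measurements

# Tikhonov regularization trades a small bias for stability:
#   minimize ||A e - q||^2 + lam * ||e||^2
lam = 1e-3
e_hat = np.linalg.solve(A.T @ A + lam * np.eye(n), A.T @ q)
```

Choosing the regularization weight lam (for instance by an L-curve or a discrepancy principle) is itself part of inverse analysis: too small and the noise is amplified, too large and the recovered field is over-smoothed.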

Journal ArticleDOI
TL;DR: An analysis of findings and models suggests that executive control of information processing comprises four prominent aspects: selecting, maintaining, updating, and rerouting.
Abstract: Functional neuroimaging and neuropsychological methods have broadened our understanding of the human prefrontal cortex. Converging evidence suggests that this brain region contributes to executive control of information processing. Both cognitive and neural-based models have attempted to delineate the manner in which the prefrontal cortex mediates executive control. An analysis of these findings and models suggests four prominent aspects of executive control—selecting, maintaining, updating, and rerouting information processing. These four aspects are couched in terms of dynamic filtering theory, which proposes that the prefrontal cortex acts as a selective gating or filtering mechanism that controls information processing.

Journal ArticleDOI
TL;DR: The authors reviewed how US deportations ballooned between 1997 and 2012, and highlighted how these deportations disproportionately targeted Latino working class men, and described this recent mass deportation as a gendered racial removal program.
Abstract: This article reviews how US deportations ballooned between 1997 and 2012, and underscores how these deportations disproportionately targeted Latino working class men. Building on Mae Ngai's (2004) concept of racial removal, we describe this recent mass deportation as a gendered racial removal program. Drawing from secondary sources, surveys conducted in Mexico, statistics published by the U.S. Department of Homeland Security, and interviews with deportees conducted by the first author in Guatemala, the Dominican Republic, Brazil and Jamaica, we argue that: (1) deportations have taken on a new course in the aftermath of 9/11 and in the wake of the global economic crisis - involving a shift towards interior enforcement; (2) deportation has become a gendered and racial removal project of the state; and (3) deportations will have lasting consequences with gendered and raced effects here in the United States. We begin by examining the mechanisms of the new deportation regime, showing how it functions, and then examine the legislation and administrative decisions that make it possible. Next, we show the concentration of deportations by nation and gender. Finally, we discuss the causes of this gendered racial removal program, which include the male joblessness crisis since the Great Recession, the War on Terror, and the continued criminalization of Black and Latino men by police authorities. Latino Studies (2013) 11, 271-292. doi:10.1057/lst.2013.14

Journal ArticleDOI
TL;DR: In this article, the authors survey recent work on structural breaks in time series models, showing how CUSUM-based procedures can be modified for data exhibiting serial dependence, and covering breaks in the unconditional and conditional mean as well as in the variance and covariance/correlation structure.
Abstract: This paper gives an account of some of the recent work on structural breaks in time series models. In particular, we show how procedures based on the popular cumulative sum, CUSUM, statistics can be modified to work also for data exhibiting serial dependence. Both structural breaks in the unconditional and conditional mean as well as in the variance and covariance/correlation structure are covered. CUSUM procedures are nonparametric by design. If the data allows for parametric modeling, we demonstrate how likelihood approaches may be utilized to recover structural breaks. The estimation of multiple structural breaks is discussed. Furthermore, we cover how one can disentangle structural breaks (in the mean and/or the variance) on one hand and long memory or unit roots on the other. Several new lines of research are briefly mentioned.
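The CUSUM approach the abstract describes can be illustrated with a minimal sketch: compute the partial-sum process of deviations from the sample mean and compare its normalized supremum to the Brownian-bridge critical value. This sketch assumes i.i.d. data; the serially dependent case discussed in the paper would replace the sample standard deviation with a long-run (HAC-type) variance estimator.

```python
import numpy as np

def cusum_statistic(x):
    """Classical CUSUM test statistic for a break in the mean.

    Returns max_k |S_k| / (sigma_hat * sqrt(n)), where
    S_k = sum_{t<=k} (x_t - xbar) is the partial-sum process.
    Under H0 (no break, i.i.d. data) the statistic converges to the
    supremum of a Brownian bridge; the 5% critical value is ~1.358.
    """
    x = np.asarray(x, dtype=float)
    n = x.size
    s = np.cumsum(x - x.mean())   # partial sums S_1, ..., S_n
    sigma = x.std(ddof=1)         # i.i.d. variance estimate; serially
                                  # dependent data need a HAC estimator
    return np.abs(s).max() / (sigma * np.sqrt(n))

rng = np.random.default_rng(0)
no_break = rng.normal(0.0, 1.0, 500)
with_break = np.concatenate([rng.normal(0.0, 1.0, 250),
                             rng.normal(1.0, 1.0, 250)])
print(cusum_statistic(no_break))    # typically below the 1.358 critical value
print(cusum_statistic(with_break))  # a mean shift inflates the statistic
```

The quadratic growth of |S_k| around the true break point is what makes the CUSUM statistic sensitive to a shift in the mean.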

Journal Article
C. Adams1, David H. Adams2, T. Akiri3, T. Alion4  +478 moreInstitutions (66)
TL;DR: The Long-Baseline Neutrino Experiment (LBNE) as mentioned in this paper is an extensively developed plan for a world-class experiment dedicated to addressing questions about the early evolution of our Universe, its current state and its eventual fate.
Abstract: The preponderance of matter over antimatter in the early Universe, the dynamics of the supernova bursts that produced the heavy elements necessary for life and whether protons eventually decay --- these mysteries at the forefront of particle physics and astrophysics are key to understanding the early evolution of our Universe, its current state and its eventual fate. The Long-Baseline Neutrino Experiment (LBNE) represents an extensively developed plan for a world-class experiment dedicated to addressing these questions. LBNE is conceived around three central components: (1) a new, high-intensity neutrino source generated from a megawatt-class proton accelerator at Fermi National Accelerator Laboratory, (2) a near neutrino detector just downstream of the source, and (3) a massive liquid argon time-projection chamber deployed as a far detector deep underground at the Sanford Underground Research Facility. This facility, located at the site of the former Homestake Mine in Lead, South Dakota, is approximately 1,300 km from the neutrino source at Fermilab -- a distance (baseline) that delivers optimal sensitivity to neutrino charge-parity symmetry violation and mass ordering effects. This ambitious yet cost-effective design incorporates scalability and flexibility and can accommodate a variety of upgrades and contributions. With its exceptional combination of experimental configuration, technical capabilities, and potential for transformative discoveries, LBNE promises to be a vital facility for the field of particle physics worldwide, providing physicists from around the globe with opportunities to collaborate in a twenty to thirty year program of exciting science. In this document we provide a comprehensive overview of LBNE's scientific objectives, its place in the landscape of neutrino physics worldwide, the technologies it will incorporate and the capabilities it will possess.

Journal ArticleDOI
TL;DR: In this article, the authors compiled a bibliography of 325 publications from 1970 through 2010 that included at least one mathematical model of mosquito-borne pathogen transmission, and used a 79-part questionnaire to classify each of the associated models according to its biological assumptions.
Abstract: Mathematical models of mosquito-borne pathogen transmission originated in the early twentieth century to provide insights into how to most effectively combat malaria. The foundations of the Ross–Macdonald theory were established by 1970. Since then, there has been a growing interest in reducing the public health burden of mosquito-borne pathogens and an expanding use of models to guide their control. To assess how theory has changed to confront evolving public health challenges, we compiled a bibliography of 325 publications from 1970 through 2010 that included at least one mathematical model of mosquito-borne pathogen transmission and then used a 79-part questionnaire to classify each of 388 associated models according to its biological assumptions. As a composite measure to interpret the multidimensional results of our survey, we assigned a numerical value to each model that measured its similarity to 15 core assumptions of the Ross–Macdonald model. Although the analysis illustrated a growing acknowledgement of geographical, ecological and epidemiological complexities in modelling transmission, most models during the past 40 years closely resemble the Ross–Macdonald model. Modern theory would benefit from an expansion around the concepts of heterogeneous mosquito biting, poorly mixed mosquito-host encounters, spatial heterogeneity and temporal variation in the transmission process.
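The core of the Ross–Macdonald theory referenced above is its basic reproduction number. A minimal sketch of that quantity, with illustrative (assumed, not paper-specific) malaria-like parameter values:

```python
import math

def ross_macdonald_r0(m, a, b, c, g, n, r):
    """Basic reproduction number of the classical Ross-Macdonald model.

    m: mosquitoes per host           a: bites per mosquito per day
    b: mosquito-to-host infectivity  c: host-to-mosquito infectivity
    g: mosquito death rate (1/day)   n: extrinsic incubation period (days)
    r: host recovery rate (1/day)

    R0 = m * a^2 * b * c * exp(-g * n) / (r * g)
    """
    return m * a**2 * b * c * math.exp(-g * n) / (r * g)

# Illustrative parameter values, for demonstration only
r0 = ross_macdonald_r0(m=10, a=0.3, b=0.5, c=0.5, g=0.1, n=10, r=0.01)
print(r0)  # R0 > 1 means the pathogen can invade the host population
```

The a² term (each transmission cycle requires two bites) and the exp(-g·n) mosquito-survival term are the features that make vector control so effective in this framework.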

Journal ArticleDOI
TL;DR: In this article, the China Antique variety of the sacred lotus was sequenced with Illumina and 454 technologies, at respective depths of 101× and 5.2×, and the final assembly has a contig N50 of 38.8 kbp and a scaffold N50 of 3.4 Mbp, covering 86.5% of the estimated 929 Mbp total genome size.
Abstract: Background: Sacred lotus is a basal eudicot with agricultural, medicinal, cultural and religious importance. It was domesticated in Asia about 7,000 years ago, and cultivated for its rhizomes and seeds as a food crop. It is particularly noted for its 1,300-year seed longevity and exceptional water repellency, known as the lotus effect. The latter property is due to the nanoscopic closely packed protuberances of its self-cleaning leaf surface, which have been adapted for the manufacture of a self-cleaning industrial paint, Lotusan. Results: The genome of the China Antique variety of the sacred lotus was sequenced with Illumina and 454 technologies, at respective depths of 101× and 5.2×. The final assembly has a contig N50 of 38.8 kbp and a scaffold N50 of 3.4 Mbp, and covers 86.5% of the estimated 929 Mbp total genome size. The genome notably lacks the paleo-triplication observed in other eudicots, but reveals a lineage-specific duplication. The genome has evidence of slow evolution, with a 30% slower nucleotide mutation rate than observed in grape. Comparisons of the available sequenced genomes suggest a minimum gene set for vascular plants of 4,223 genes. Strikingly, the sacred lotus has 16 COG2132 multi-copper oxidase family proteins with root-specific expression; these are involved in root meristem phosphate starvation, reflecting adaptation to limited nutrient availability in an aquatic environment. Conclusions: The slow nucleotide substitution rate makes the sacred lotus a better resource than the current standard, grape, for reconstructing the pan-eudicot genome, and should therefore accelerate comparative analysis between eudicots and monocots.
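The N50 contiguity metric quoted in the abstract has a simple definition: the length L such that contigs of length ≥ L together cover at least half the assembly. A short sketch (toy inputs, not the lotus data):

```python
def n50(lengths):
    """N50 of a list of contig/scaffold lengths: the length L such that
    sequences of length >= L cover at least half of the total assembly."""
    total = sum(lengths)
    acc = 0
    for length in sorted(lengths, reverse=True):
        acc += length
        if acc * 2 >= total:
            return length

print(n50([10, 20, 30, 40]))       # -> 30: the 40+30 contigs cover half of 100
print(n50([38_800] * 100))         # uniform toy case: N50 equals the contig length
```

Because N50 is a weighted median rather than a mean, a handful of long scaffolds (like the 3.4 Mbp scaffolds here) dominates it, which is why contig and scaffold N50 are reported separately.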

PatentDOI
TL;DR: In this article, a method for preparing a malleable and/or self-healing polymeric or composite material is provided, which includes providing a polymeric material comprising at least one alkene-containing polymer, combining the polymer with a homogeneous or heterogeneous transition metal olefin metathesis catalyst, and performing an olefin metathesis reaction on the polymer so as to form reversible carbon-carbon double bonds in the polymer.
Abstract: A method of preparing a malleable and/or self-healing polymeric or composite material is provided. The method includes providing a polymeric or composite material comprising at least one alkene-containing polymer, combining the polymer with at least one homogeneous or heterogeneous transition metal olefin metathesis catalyst to form a polymeric or composite material, and performing an olefin metathesis reaction on the polymer so as to form reversible carbon-carbon double bonds in the polymer. Also provided is a method of healing a fractured surface of a polymeric material. The method includes bringing a fractured surface of a first polymeric material into contact with a second polymeric material, and performing an olefin metathesis reaction in the presence of a transition metal olefin metathesis catalyst such that the first polymeric material forms reversible carbon-carbon double bonds with the second polymeric material. Compositions comprising malleable and/or self-healing polymeric or composite material are also provided.

Journal ArticleDOI
TL;DR: In this article, the authors present the "Drag-Based Model" (DBM) of heliospheric propagation of interplanetary coronal mass ejections (ICMEs) based on the hypothesis that the driving Lorentz force, which launches a CME, ceases in the upper corona and that beyond a certain distance the dynamics becomes governed by the interaction of the ICME and the ambient solar wind.
Abstract: We present the “Drag-Based Model” (DBM) of heliospheric propagation of interplanetary coronal mass ejections (ICMEs). The DBM is based on the hypothesis that the driving Lorentz force, which launches a CME, ceases in the upper corona and that beyond a certain distance the dynamics becomes governed solely by the interaction of the ICME and the ambient solar wind. In particular, we consider the option where the drag acceleration has a quadratic dependence on the ICME relative speed, which is expected in a collisionless environment, where the drag is caused primarily by emission of magnetohydrodynamic (MHD) waves. In this paper we present the simplest version of DBM, where the equation of motion can be solved analytically, providing explicit solutions for the Sun–Earth ICME transit time and impact speed. This offers easy handling and straightforward application to real-time space-weather forecasting. Beside presenting the model itself, we perform an analysis of DBM performances, applying a statistical and case-study approach, which provides insight into the advantages and drawbacks of DBM. Finally, we present a public, DBM-based, online forecast tool.
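The quadratic drag law described above admits a closed-form solution, which is what makes the DBM attractive for real-time forecasting. The sketch below implements that closed form and scans it for the 1 AU crossing; the launch distance, speeds, and drag parameter are illustrative assumptions, not values from the paper.

```python
import math

AU_KM = 1.496e8  # 1 astronomical unit in km

def dbm_state(t, r0, v0, w, gamma):
    """Closed-form kinematics for dv/dt = -gamma * (v - w) * |v - w|.

    t in s, r0 in km, v0 and w (solar-wind speed) in km/s,
    gamma (drag parameter) in 1/km.
    """
    u0 = v0 - w
    sgn = 1.0 if u0 >= 0 else -1.0
    v = w + u0 / (1.0 + sgn * gamma * u0 * t)
    r = r0 + w * t + (sgn / gamma) * math.log(1.0 + sgn * gamma * u0 * t)
    return r, v

def transit_to_1au(r0, v0, w, gamma, dt=600.0):
    """Scan the analytic r(t) for the 1 AU crossing.

    Returns (transit time in days, arrival speed in km/s). The paper
    derives explicit transit-time expressions; this is a simple sketch.
    """
    t = 0.0
    while True:
        r, v = dbm_state(t, r0, v0, w, gamma)
        if r >= AU_KM:
            return t / 86400.0, v
        t += dt

# Illustrative fast CME launched at 20 solar radii (assumed inputs)
days, v_arrival = transit_to_1au(r0=20 * 6.96e5, v0=1000.0, w=400.0,
                                 gamma=0.2e-7)
print(days, v_arrival)
```

Note how the ICME speed asymptotically approaches the ambient solar-wind speed w, a fast ejection decelerating and a slow one being dragged up, consistent with the drag-dominated regime the model assumes.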

Journal ArticleDOI
TL;DR: Among those with an ASD, lower conversation ability, lower functional skills, and living with a parent were predictors of less social participation.
Abstract: Investigating social participation of young adults with an autism spectrum disorder (ASD) is important given the increasing number of youth aging into young adulthood. Social participation is an indicator of life quality and overall functioning. Using data from the National Longitudinal Transition Study 2, we examined rates of participation in social activities among young adults who received special education services for autism (ASD group), compared to young adults who received special education for intellectual disability, emotional/behavioral disability, or a learning disability. Young adults with an ASD were significantly more likely to never see friends, never get called by friends, never be invited to activities, and be socially isolated. Among those with an ASD, lower conversation ability, lower functional skills, and living with a parent were predictors of less social participation.

Journal Article
TL;DR: In this paper, the Electric Fields and Waves (EFW) instruments on the Van Allen Probes are described; they measure three-dimensional quasi-static and low-frequency electric fields and waves associated with the acceleration of energetic charged particles in the inner magnetosphere of the Earth.
Abstract: The Electric Fields and Waves (EFW) Instruments on the two Radiation Belt Storm Probe (RBSP) spacecraft (recently renamed the Van Allen Probes) are designed to measure three dimensional quasi-static and low frequency electric fields and waves associated with the major mechanisms responsible for the acceleration of energetic charged particles in the inner magnetosphere of the Earth. For this measurement, the instrument uses two pairs of spherical double probe sensors at the ends of orthogonal centripetally deployed booms in the spin plane with tip-to-tip separations of 100 meters. The third component of the electric field is measured by two spherical sensors separated by ∼15 m, deployed at the ends of two stacer booms oppositely directed along the spin axis of the spacecraft. The instrument provides a continuous stream of measurements over the entire orbit of the low frequency electric field vector at 32 samples/s in a survey mode. This survey mode also includes measurements of spacecraft potential to provide information on thermal electron plasma variations and structure. Survey mode spectral information allows the continuous evaluation of the peak value and spectral power in electric, magnetic and density fluctuations from several Hz to 6.5 kHz. On-board cross-spectral data allows the calculation of field-aligned wave Poynting flux along the magnetic field. For higher frequency waveform information, two different programmable burst memories are used with nominal sampling rates of 512 samples/s and 16 k samples/s. The EFW burst modes provide targeted measurements over brief time intervals of 3-d electric fields, 3-d wave magnetic fields (from the EMFISIS magnetic search coil sensors), and spacecraft potential. In the burst modes all six sensor-spacecraft potential measurements are telemetered enabling interferometric timing of small-scale plasma structures. 
In the first burst mode, the instrument stores all or a substantial fraction of the high frequency measurements in a 32 gigabyte burst memory. The sub-intervals to be downloaded are uplinked by ground command after inspection of instrument survey data and other information available on the ground. The second burst mode involves autonomous storing and playback of data controlled by flight software algorithms, which assess the “highest quality” events on the basis of instrument measurements and information from other instruments available on orbit. The EFW instrument provides 3-d wave electric field signals with a frequency response up to 400 kHz to the EMFISIS instrument for analysis and telemetry (Kletzing et al. Space Sci. Rev. 2013).

Journal ArticleDOI
TL;DR: Key neuropathological features of FTD/ALS with GGGGCC repeat expansions can be recapitulated in iPSC-derived human neurons and also suggest that compromised autophagy function may represent a novel underlying pathogenic mechanism.
Abstract: The recently identified GGGGCC repeat expansion in the noncoding region of C9ORF72 is the most common pathogenic mutation in patients with frontotemporal dementia (FTD) or amyotrophic lateral sclerosis (ALS). We generated a human neuronal model and investigated the pathological phenotypes of human neurons containing GGGGCC repeat expansions. Skin biopsies were obtained from two subjects who had >1,000 GGGGCC repeats in C9ORF72 and their respective fibroblasts were used to generate multiple induced pluripotent stem cell (iPSC) lines. After extensive characterization, two iPSC lines from each subject were selected, differentiated into postmitotic neurons, and compared with control neurons to identify disease-relevant phenotypes. Expanded GGGGCC repeats exhibit instability during reprogramming and neuronal differentiation of iPSCs. RNA foci containing GGGGCC repeats were present in some iPSCs, iPSC-derived human neurons and primary fibroblasts. The percentage of cells with foci and the number of foci per cell appeared to be determined not simply by repeat length but also by other factors. These RNA foci do not seem to sequester several major RNA-binding proteins. Moreover, repeat-associated non-ATG (RAN) translation products were detected in human neurons with GGGGCC repeat expansions and these neurons showed significantly elevated p62 levels and increased sensitivity to cellular stress induced by autophagy inhibitors. Our findings demonstrate that key neuropathological features of FTD/ALS with GGGGCC repeat expansions can be recapitulated in iPSC-derived human neurons and also suggest that compromised autophagy function may represent a novel underlying pathogenic mechanism.