
Journal ArticleDOI
TL;DR: In this paper, the authors provide an overview of the fundamental origins and important applications of the main spin-orbit interaction phenomena in modern optics that play a crucial role at subwavelength scales, including spin-Hall effects in inhomogeneous media and at optical interfaces, spin-dependent effects in non-paraxial (focused or scattered) fields, and spin-controlled shaping of light using anisotropic structured interfaces (metasurfaces).
Abstract: This Review article provides an overview of the fundamental origins and important applications of the main spin–orbit interaction phenomena in modern optics that play a crucial role at subwavelength scales. Light carries both spin and orbital angular momentum. These dynamical properties are determined by the polarization and spatial degrees of freedom of light. Nano-optics, photonics and plasmonics tend to explore subwavelength scales and additional degrees of freedom of structured — that is, spatially inhomogeneous — optical fields. In such fields, spin and orbital properties become strongly coupled with each other. In this Review we cover the fundamental origins and important applications of the main spin–orbit interaction phenomena in optics. These include: spin-Hall effects in inhomogeneous media and at optical interfaces, spin-dependent effects in nonparaxial (focused or scattered) fields, spin-controlled shaping of light using anisotropic structured interfaces (metasurfaces) and robust spin-directional coupling via evanescent near fields. We show that spin–orbit interactions are inherent in all basic optical processes, and that they play a crucial role in modern optics.
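
The abstract's statement that light carries both spin (polarization) and orbital (spatial) angular momentum can be anchored with the standard paraxial-beam relations; the notation below is the usual textbook one, not taken from the Review itself.

```latex
% Per-photon angular momenta of a circularly polarized paraxial vortex beam
% (standard result; sigma is the polarization helicity, ell the vortex charge)
S_z = \sigma \hbar, \quad \sigma = \pm 1, \qquad
L_z = \ell \hbar, \quad \ell \in \mathbb{Z}, \qquad
J_z = S_z + L_z = (\sigma + \ell)\,\hbar .
```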

1,642 citations


Proceedings ArticleDOI
07 Jun 2015
TL;DR: This paper proposes an end-to-end hierarchical RNN for skeleton based action recognition, and demonstrates that the model achieves the state-of-the-art performance with high computational efficiency.
Abstract: Human actions can be represented by the trajectories of skeleton joints. Traditional methods generally model the spatial structure and temporal dynamics of human skeleton with hand-crafted features and recognize human actions by well-designed classifiers. In this paper, considering that recurrent neural network (RNN) can model the long-term contextual information of temporal sequences well, we propose an end-to-end hierarchical RNN for skeleton based action recognition. Instead of taking the whole skeleton as the input, we divide the human skeleton into five parts according to human physical structure, and then separately feed them to five subnets. As the number of layers increases, the representations extracted by the subnets are hierarchically fused to be the inputs of higher layers. The final representations of the skeleton sequences are fed into a single-layer perceptron, and the temporally accumulated output of the perceptron is the final decision. We compare with five other deep RNN architectures derived from our model to verify the effectiveness of the proposed network, and also compare with several other methods on three publicly available datasets. Experimental results demonstrate that our model achieves the state-of-the-art performance with high computational efficiency.
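
A minimal sketch of the architecture the abstract describes (five per-part RNN subnets whose representations are fused at higher layers), with illustrative layer sizes and GRU cells rather than the paper's exact configuration:

```python
import torch
import torch.nn as nn

class HierarchicalRNN(nn.Module):
    def __init__(self, joints_per_part=5, coords=3, hidden=64, num_classes=60):
        super().__init__()
        in_dim = joints_per_part * coords
        # one subnet per body part (trunk, two arms, two legs)
        self.part_rnns = nn.ModuleList([nn.GRU(in_dim, hidden, batch_first=True)
                                        for _ in range(5)])
        # fusion layer over the concatenated part representations
        self.fusion_rnn = nn.GRU(5 * hidden, hidden, batch_first=True)
        self.classifier = nn.Linear(hidden, num_classes)

    def forward(self, parts):
        # parts: list of 5 tensors, each (batch, time, joints_per_part * coords)
        part_feats = [rnn(p)[0] for rnn, p in zip(self.part_rnns, parts)]
        fused, _ = self.fusion_rnn(torch.cat(part_feats, dim=-1))
        # accumulate per-frame outputs over time for the final decision
        return self.classifier(fused.mean(dim=1))

model = HierarchicalRNN()
parts = [torch.randn(2, 30, 15) for _ in range(5)]  # 2 clips, 30 frames each
logits = model(parts)                               # (2, 60)
```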

1,642 citations


Journal ArticleDOI
TL;DR: In this article, a revised definition of the circular economy is proposed, where planning, resourcing, procurement, production and reprocessing are designed and managed, as both process and output, to maximize ecosystem functioning and human well-being.
Abstract: There have long been calls from industry for guidance in implementing strategies for sustainable development. The Circular Economy represents the most recent attempt to conceptualize the integration of economic activity and environmental wellbeing in a sustainable way. This set of ideas has been adopted by China as the basis of its economic development (included in both the 11th and the 12th ‘Five Year Plan’), escalating the concept in the minds of western policymakers and NGOs. This paper traces the conceptualisations and origins of the Circular Economy, examines its meanings, explores its antecedents in economics and ecology, and discusses how the Circular Economy has been operationalized in business and policy. The paper finds that while the Circular Economy places emphasis on the redesign of processes and cycling of materials, which may contribute to more sustainable business models, it also encapsulates tensions and limitations. These include an absence of the social dimension inherent in sustainable development that limits its ethical dimensions, and some unintended consequences. This leads us to propose a revised definition of the Circular Economy as “an economic model wherein planning, resourcing, procurement, production and reprocessing are designed and managed, as both process and output, to maximize ecosystem functioning and human well-being”.

1,641 citations


Journal ArticleDOI
TL;DR: Pathway analysis implicates immunity, lipid metabolism, tau binding proteins, and amyloid precursor protein (APP) metabolism, showing that genetic variants affecting APP and Aβ processing are associated not only with early-onset autosomal dominant Alzheimer’s disease but also with LOAD.
Abstract: Risk for late-onset Alzheimer’s disease (LOAD), the most prevalent dementia, is partially driven by genetics. To identify LOAD risk loci, we performed a large genome-wide association meta-analysis of clinically diagnosed LOAD (94,437 individuals). We confirm 20 previous LOAD risk loci and identify five new genome-wide loci (IQCK, ACE, ADAM10, ADAMTS1, and WWOX), two of which (ADAM10, ACE) were identified in a recent genome-wide association (GWAS)-by-familial-proxy study of Alzheimer’s or dementia. Fine-mapping of the human leukocyte antigen (HLA) region confirms the neurological and immune-mediated disease haplotype HLA-DR15 as a risk factor for LOAD. Pathway analysis implicates immunity, lipid metabolism, tau binding proteins, and amyloid precursor protein (APP) metabolism, showing that genetic variants affecting APP and Aβ processing are associated not only with early-onset autosomal dominant Alzheimer’s disease but also with LOAD. Analyses of risk genes and pathways show enrichment for rare variants (P = 1.32 × 10⁻⁷), indicating that additional rare variants remain to be identified. We also identify important genetic correlations between LOAD and traits such as family history of dementia and education.

1,641 citations


Journal Article
TL;DR: This article argues that narrative is a solution to a problem of general human concern, namely, the problem of how to translate knowing into telling and of fashioning human experience into a form assimilable to structures of meaning that are generally human rather than culture-specific.
Abstract: To raise the question of the nature of narrative is to invite reflection on the very nature of culture and, possibly, even on the nature of humanity itself. So natural is the impulse to narrate, so inevitable is the form of narrative for any report of the way things really happened, that narrativity could appear problematical only in a culture in which it was absent-absent or, as in some domains of contemporary Western intellectual and artistic culture, programmatically refused. As a panglobal fact of culture, narrative and narration are less problems than simply data. As the late (and already profoundly missed) Roland Barthes remarked, narrative "is simply there like life itself ... international, transhistorical, transcultural." Far from being a problem, then, narrative might well be considered a solution to a problem of general human concern, namely, the problem of how to translate knowing into telling, the problem of fashioning human experience into a form assimilable to structures of meaning that are generally human rather than culture-specific. We may not be able fully to comprehend specific thought patterns of another culture, but we have relatively less difficulty understanding a story coming from another culture, however exotic that culture may appear.

1,640 citations


Proceedings ArticleDOI
07 Dec 2015
TL;DR: PoseNet as mentioned in this paper uses a CNN to regress the 6-DOF camera pose from a single RGB image in an end-to-end manner with no need of additional engineering or graph optimisation.
Abstract: We present a robust and real-time monocular six degree of freedom relocalization system. Our system trains a convolutional neural network to regress the 6-DOF camera pose from a single RGB image in an end-to-end manner with no need of additional engineering or graph optimisation. The algorithm can operate indoors and outdoors in real time, taking 5ms per frame to compute. It obtains approximately 2m and 3 degrees accuracy for large scale outdoor scenes and 0.5m and 5 degrees accuracy indoors. This is achieved using an efficient 23 layer deep convnet, demonstrating that convnets can be used to solve complicated out of image plane regression problems. This was made possible by leveraging transfer learning from large scale classification data. We show that the PoseNet localizes from high level features and is robust to difficult lighting, motion blur and different camera intrinsics where point based SIFT registration fails. Furthermore we show how the pose feature that is produced generalizes to other scenes allowing us to regress pose with only a few dozen training examples.
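
A sketch of the regression setup the abstract describes: a CNN backbone with two heads regressing a 3-D position and a unit quaternion from one RGB image. The backbone (ResNet-18 here for brevity; PoseNet used a GoogLeNet-style network) and the weight beta are illustrative assumptions.

```python
import torch.nn as nn
import torch.nn.functional as F
import torchvision.models as models

class PoseRegressor(nn.Module):
    def __init__(self):
        super().__init__()
        backbone = models.resnet18(weights=None)
        backbone.fc = nn.Identity()          # keep the 512-d global feature
        self.backbone = backbone
        self.fc_xyz = nn.Linear(512, 3)      # camera position
        self.fc_quat = nn.Linear(512, 4)     # camera orientation (quaternion)

    def forward(self, img):
        feat = self.backbone(img)
        return self.fc_xyz(feat), F.normalize(self.fc_quat(feat), dim=-1)

def pose_loss(pred_xyz, pred_q, gt_xyz, gt_q, beta=250.0):
    # beta trades position error (metres) against orientation error
    return F.mse_loss(pred_xyz, gt_xyz) + beta * F.mse_loss(pred_q, gt_q)
```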

1,638 citations


Journal ArticleDOI
11 May 2020-JAMA
TL;DR: The authors found that African American individuals and, to a lesser extent, Latino individuals bear a disproportionate burden of COVID-19-related outcomes.
Abstract: The novel SARS-CoV-2 (severe acute respiratory syndrome coronavirus 2) has led to a global pandemic manifested as coronavirus disease 2019 (COVID-19), with its most severe presentation being acute respiratory distress syndrome leading to severe complications and death. Select underlying medical comorbidities, older age, diabetes, obesity, and male sex have been identified as biological vulnerabilities for more severe COVID-19 outcomes.1 Geographic locations that reported data by race/ethnicity indicate that African American individuals and, to a lesser extent, Latino individuals bear a disproportionate burden of COVID-19–related outcomes. The pandemic has shone a spotlight on health disparities and created an opportunity to address the causes underlying these inequities.2 The most pervasive disparities are observed among African American and Latino individuals, and where data exist, American Indian, Alaska Native, and Pacific Islander populations. Preliminary prevalence

1,637 citations


Journal ArticleDOI
TL;DR: In this paper, a Monte Carlo sampler (The Joker) is used to perform a search for companions to 96,231 red-giant stars observed in the APOGEE survey (DR14) with $\geq 3$ spectroscopic epochs.
Abstract: Multi-epoch radial velocity measurements of stars can be used to identify stellar, sub-stellar, and planetary-mass companions. Even a small number of observation epochs can be informative about companions, though there can be multiple qualitatively different orbital solutions that fit the data. We have custom-built a Monte Carlo sampler (The Joker) that delivers reliable (and often highly multi-modal) posterior samplings for companion orbital parameters given sparse radial-velocity data. Here we use The Joker to perform a search for companions to 96,231 red-giant stars observed in the APOGEE survey (DR14) with $\geq 3$ spectroscopic epochs. We select stars with probable companions by making a cut on our posterior belief about the amplitude of the stellar radial-velocity variation induced by the orbit. We provide (1) a catalog of 320 companions for which the stellar companion properties can be confidently determined, (2) a catalog of 4,898 stars that likely have companions, but would require more observations to uniquely determine the orbital properties, and (3) posterior samplings for the full orbital parameters for all stars in the parent sample. We show the characteristics of systems with confidently determined companion properties and highlight interesting systems with candidate compact object companions.
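
A deliberately simplified sketch of the prior-rejection sampling idea behind a sampler like The Joker, restricted to circular orbits and made-up data; the real sampler handles full Keplerian orbits and marginalizes the linear parameters analytically.

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.array([0.0, 41.3, 187.9])   # sparse epochs (days), made up
rv = np.array([1.2, -3.4, 2.1])    # observed RVs (km/s), made up
sigma = 0.3                        # per-point uncertainty (km/s)

n = 200_000
P = np.exp(rng.uniform(np.log(2.0), np.log(3000.0), n))  # period prior
K = np.abs(rng.normal(0.0, 5.0, n))                      # semi-amplitude prior
phi = rng.uniform(0.0, 2.0 * np.pi, n)                   # phase prior

model = K[:, None] * np.sin(2 * np.pi * t[None, :] / P[:, None] + phi[:, None])
loglike = -0.5 * np.sum(((rv - model) / sigma) ** 2, axis=1)

# rejection step: accept each prior draw in proportion to its likelihood
keep = rng.uniform(size=n) < np.exp(loglike - loglike.max())
posterior_P = P[keep]  # often highly multimodal for sparse data
```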

1,637 citations


Proceedings Article
22 Jul 2016
TL;DR: In this paper, an actor-critic, model-free algorithm based on the deterministic policy gradient is proposed that operates over continuous action spaces and finds policies whose performance is competitive with those found by a planning algorithm with full access to the dynamics of the domain.
Abstract: We adapt the ideas underlying the success of Deep Q-Learning to the continuous action domain. We present an actor-critic, model-free algorithm based on the deterministic policy gradient that can operate over continuous action spaces. Using the same learning algorithm, network architecture and hyper-parameters, our algorithm robustly solves more than 20 simulated physics tasks, including classic problems such as cartpole swing-up, dexterous manipulation, legged locomotion and car driving. Our algorithm is able to find policies whose performance is competitive with those found by a planning algorithm with full access to the dynamics of the domain and its derivatives. We further demonstrate that for many of the tasks the algorithm can learn policies end-to-end: directly from raw pixel inputs.
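
A sketch of the core actor-critic updates described above (the deterministic policy gradient). Replay buffer, target networks, and exploration noise, which the full algorithm (DDPG) also needs, are omitted, and the network sizes are arbitrary.

```python
import torch
import torch.nn as nn

obs_dim, act_dim = 8, 2
actor = nn.Sequential(nn.Linear(obs_dim, 64), nn.ReLU(),
                      nn.Linear(64, act_dim), nn.Tanh())
critic = nn.Sequential(nn.Linear(obs_dim + act_dim, 64), nn.ReLU(),
                       nn.Linear(64, 1))
actor_opt = torch.optim.Adam(actor.parameters(), lr=1e-4)
critic_opt = torch.optim.Adam(critic.parameters(), lr=1e-3)

def update(s, a, r, s2, gamma=0.99):
    # s, a, s2: batched tensors; r: rewards of shape (batch, 1)
    with torch.no_grad():  # one-step TD target using the actor at s2
        target = r + gamma * critic(torch.cat([s2, actor(s2)], dim=-1))
    q = critic(torch.cat([s, a], dim=-1))
    critic_loss = ((q - target) ** 2).mean()
    critic_opt.zero_grad(); critic_loss.backward(); critic_opt.step()
    # actor ascends the critic's value of its own actions
    actor_loss = -critic(torch.cat([s, actor(s)], dim=-1)).mean()
    actor_opt.zero_grad(); actor_loss.backward(); actor_opt.step()
```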

1,636 citations


Posted Content
TL;DR: In this paper, the authors identify a number of ways in which the form of economic organization of a society appears to influence the process of human development by shaping tastes, the framing of choice situations, psychological dispositions, values, and other determinants of individual behavior.
Abstract: Drawing on experimental economics, anthropology, social psychology, sociology, history, the theory of cultural evolution as well as more conventional economic sources, I review models and evidence concerning the impact of economic institutions on preferences, broadly construed. I identify a number of ways in which the form of economic organization of a society appears to influence the process of human development by shaping tastes, the framing of choice situations, psychological dispositions, values, and other determinants of individual behavior. I conclude by commenting on some implications for economic theory and policy analysis.

1,635 citations


Book ChapterDOI
12 Oct 2015
TL;DR: This paper proposes the triplet network model, which aims to learn useful representations by distance comparisons, and demonstrates using various datasets that this model learns a better representation than that of its immediate competitor, the Siamese network.
Abstract: Deep learning has proven itself as a successful set of models for learning useful semantic representations of data. These, however, are mostly implicitly learned as part of a classification task. In this paper we propose the triplet network model, which aims to learn useful representations by distance comparisons. A similar model was defined by Wang et al. (2014), tailor made for learning a ranking for image information retrieval. Here we demonstrate using various datasets that our model learns a better representation than that of its immediate competitor, the Siamese network. We also discuss future possible usage as a framework for unsupervised learning.
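
A minimal sketch of representation learning by distance comparison: a shared embedding should place an anchor nearer a same-class positive than a different-class negative. The hinge form below is the common variant; the paper itself applies a softmax over the two distances.

```python
import torch
import torch.nn.functional as F

def triplet_loss(emb_anchor, emb_pos, emb_neg, margin=1.0):
    d_pos = F.pairwise_distance(emb_anchor, emb_pos)
    d_neg = F.pairwise_distance(emb_anchor, emb_neg)
    return F.relu(d_pos - d_neg + margin).mean()

emb = torch.nn.Linear(128, 32)  # stand-in for a shared deep embedding network
a, p, n = (emb(torch.randn(16, 128)) for _ in range(3))
loss = triplet_loss(a, p, n)
```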

Journal ArticleDOI
TL;DR: In this article, the authors survey the state-of-the-art in NFV and identify promising research directions in this area, and also overview key NFV projects, standardization efforts, early implementations, use cases, and commercial products.
Abstract: Network function virtualization (NFV) has drawn significant attention from both industry and academia as an important shift in telecommunication service provisioning. By decoupling network functions (NFs) from the physical devices on which they run, NFV has the potential to lead to significant reductions in operating expenses (OPEX) and capital expenses (CAPEX) and facilitate the deployment of new services with increased agility and faster time-to-value. The NFV paradigm is still in its infancy and there is a large spectrum of opportunities for the research community to develop new architectures, systems and applications, and to evaluate alternatives and trade-offs in developing technologies for its successful deployment. In this paper, after discussing NFV and its relationship with complementary fields of software defined networking (SDN) and cloud computing, we survey the state-of-the-art in NFV, and identify promising research directions in this area. We also overview key NFV projects, standardization efforts, early implementations, use cases, and commercial products.

Journal ArticleDOI
TL;DR: This article presents an alternative model that separates the within-person process from stable between-person differences through the inclusion of random intercepts, and discusses how this model is related to existing structural equation models that include cross-lagged relationships.
Abstract: The cross-lagged panel model (CLPM) is believed by many to overcome the problems associated with the use of cross-lagged correlations as a way to study causal influences in longitudinal panel data. The current article, however, shows that if stability of constructs is to some extent of a trait-like, time-invariant nature, the autoregressive relationships of the CLPM fail to adequately account for this. As a result, the lagged parameters that are obtained with the CLPM do not represent the actual within-person relationships over time, and this may lead to erroneous conclusions regarding the presence, predominance, and sign of causal influences. In this article we present an alternative model that separates the within-person process from stable between-person differences through the inclusion of random intercepts, and we discuss how this model is related to existing structural equation models that include cross-lagged relationships. We derive the analytical relationship between the cross-lagged parameters from the CLPM and the alternative model, and use simulations to demonstrate the spurious results that may arise when using the CLPM to analyze data that include stable, trait-like individual differences. We also present a modeling strategy to avoid this pitfall and illustrate this using an empirical data set. The implications for both existing and future cross-lagged panel research are discussed.
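
The random-intercept idea can be written compactly (notation mine, following the general form of such models): each observed score splits into a grand mean, a stable between-person component, and a within-person deviation, and the lagged effects act only on the deviations.

```latex
% x_{it}, y_{it}: person i's observed scores at wave t
x_{it} = \mu_t + \kappa_i + p_{it}, \qquad y_{it} = \pi_t + \omega_i + q_{it}
% \kappa_i, \omega_i: random intercepts (stable between-person differences);
% the within-person deviations carry the cross-lagged dynamics:
p_{it} = \alpha\, p_{i,t-1} + \beta\, q_{i,t-1} + u_{it}
q_{it} = \delta\, q_{i,t-1} + \gamma\, p_{i,t-1} + v_{it}
```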

Book ChapterDOI
08 Sep 2018
TL;DR: In this paper, a Part-based Convolutional Baseline (PCB) is proposed to learn discriminative part-informed features for person retrieval. Two contributions are made: (i) a network named Part-based Convolutional Baseline (PCB), which outputs a convolutional descriptor consisting of several part-level features, and (ii) a refined part pooling (RPP) method that re-assigns outliers to the parts they are closest to.
Abstract: Employing part-level features offers fine-grained information for pedestrian image description. A prerequisite of part discovery is that each part should be well located. Instead of using external resources like pose estimator, we consider content consistency within each part for precise part location. Specifically, we target at learning discriminative part-informed features for person retrieval and make two contributions. (i) A network named Part-based Convolutional Baseline (PCB). Given an image input, it outputs a convolutional descriptor consisting of several part-level features. With a uniform partition strategy, PCB achieves competitive results with the state-of-the-art methods, proving itself as a strong convolutional baseline for person retrieval. (ii) A refined part pooling (RPP) method. Uniform partition inevitably incurs outliers in each part, which are in fact more similar to other parts. RPP re-assigns these outliers to the parts they are closest to, resulting in refined parts with enhanced within-part consistency. Experiment confirms that RPP allows PCB to gain another round of performance boost. For instance, on the Market-1501 dataset, we achieve (77.4+4.2)% mAP and (92.3+1.5)% rank-1 accuracy, surpassing the state of the art by a large margin. Code is available at: https://github.com/syfafterzy/PCB_RPP
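
A sketch of the uniform partition step the abstract describes: slice the backbone's final feature map into p horizontal stripes and average-pool each into one part-level feature. The p = 6 and the tensor sizes follow the common setup but are assumptions here.

```python
import torch

def part_features(feature_map, p=6):
    # feature_map: (batch, channels, height, width) from a conv backbone
    b, c, h, w = feature_map.shape
    stripes = feature_map.reshape(b, c, p, h // p, w)  # requires h % p == 0
    return stripes.mean(dim=(3, 4))                    # (batch, channels, p)

feats = part_features(torch.randn(2, 2048, 24, 8))     # -> (2, 2048, 6)
```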

Journal ArticleDOI
TL;DR: The field is now in an exciting transitional period in which ctDNA analysis is beginning to be applied clinically, although there is still much to learn about the biology of cell-free DNA.
Abstract: Improvements in genomic and molecular methods are expanding the range of potential applications for circulating tumour DNA (ctDNA), both in a research setting and as a 'liquid biopsy' for cancer management. Proof-of-principle studies have demonstrated the translational potential of ctDNA for prognostication, molecular profiling and monitoring. The field is now in an exciting transitional period in which ctDNA analysis is beginning to be applied clinically, although there is still much to learn about the biology of cell-free DNA. This is an opportune time to appraise potential approaches to ctDNA analysis, and to consider their applications in personalized oncology and in cancer research.

Posted Content
Wei Wen, Chunpeng Wu, Yandan Wang, Yi Chen, Hai Li
TL;DR: The results show that for CIFAR-10, regularization on layer depth can reduce a 20-layer Deep Residual Network to 18 layers while improving the accuracy from 91.25% to 92.60%, which is still slightly higher than that of the original ResNet with 32 layers.
Abstract: High demand for computation resources severely hinders deployment of large-scale Deep Neural Networks (DNN) in resource constrained devices. In this work, we propose a Structured Sparsity Learning (SSL) method to regularize the structures (i.e., filters, channels, filter shapes, and layer depth) of DNNs. SSL can: (1) learn a compact structure from a bigger DNN to reduce computation cost; (2) obtain a hardware-friendly structured sparsity of DNN to efficiently accelerate DNN evaluation. Experimental results show that SSL achieves on average 5.1x and 3.1x speedups of convolutional layer computation of AlexNet against CPU and GPU, respectively, with off-the-shelf libraries. These speedups are about twice the speedups of non-structured sparsity; (3) regularize the DNN structure to improve classification accuracy. The results show that for CIFAR-10, regularization on layer depth can reduce a 20-layer Deep Residual Network (ResNet) to 18 layers while improving the accuracy from 91.25% to 92.60%, which is still slightly higher than that of the original ResNet with 32 layers. For AlexNet, structure regularization by SSL also reduces the error by around 1%. Open source code is in this https URL
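
A sketch of the group-Lasso penalty that underlies structured sparsity of this kind, with one group per filter and one per input channel; the lambda weights are hyperparameters, and this is an illustrative reading of SSL rather than the authors' exact code.

```python
import torch

def group_lasso_filters(conv_weight):
    # conv_weight: (out_channels, in_channels, kH, kW); one group per filter
    return conv_weight.flatten(1).norm(dim=1).sum()

def group_lasso_channels(conv_weight):
    # one group per input channel
    return conv_weight.transpose(0, 1).flatten(1).norm(dim=1).sum()

def ssl_penalty(model, lam_f=1e-4, lam_c=1e-4):
    total = 0.0
    for m in model.modules():
        if isinstance(m, torch.nn.Conv2d):
            total = total + lam_f * group_lasso_filters(m.weight) \
                          + lam_c * group_lasso_channels(m.weight)
    return total  # add to the task loss; zeroed groups can be pruned away
```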

Proceedings ArticleDOI
07 Jun 2015
TL;DR: In this article, a general framework was proposed to invert representations such as HOG and Bag of Visual Words (BOW) to reconstruct the image itself, which can be applied to CNNs too.
Abstract: Image representations, from SIFT and Bag of Visual Words to Convolutional Neural Networks (CNNs), are a crucial component of almost any image understanding system. Nevertheless, our understanding of them remains limited. In this paper we conduct a direct analysis of the visual information contained in representations by asking the following question: given an encoding of an image, to which extent is it possible to reconstruct the image itself? To answer this question we contribute a general framework to invert representations. We show that this method can invert representations such as HOG more accurately than recent alternatives while being applicable to CNNs too. We then use this technique to study the inverse of recent state-of-the-art CNN image representations for the first time. Among our findings, we show that several layers in CNNs retain photographically accurate information about the image, with different degrees of geometric and photometric invariance.
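
A sketch of the inversion framework: optimize an image, starting from noise, so that its encoding matches a target code, plus a naturalness regularizer (total variation here; the paper combines TV with a norm penalty).

```python
import torch

def invert(encoder, target_code, shape=(1, 3, 224, 224), steps=200, lam=1e-4):
    x = torch.randn(shape, requires_grad=True)
    opt = torch.optim.Adam([x], lr=0.05)
    for _ in range(steps):
        opt.zero_grad()
        # match the representation of the candidate image to the target code
        rec = ((encoder(x) - target_code) ** 2).sum()
        # total-variation regularizer favouring piecewise-smooth images
        tv = (x[..., 1:, :] - x[..., :-1, :]).abs().sum() \
           + (x[..., :, 1:] - x[..., :, :-1]).abs().sum()
        (rec + lam * tv).backward()
        opt.step()
    return x.detach()
```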

Book
09 Aug 2021
TL;DR: Section 1 - General Aspects of Medicinal Chemistry; Section 2 - Lead Compound Discovery Strategies; Section 3 - Primary Exploration of Structure-Activity Relationships; Section 4 - Substituents and Functions: Qualitative and Quantitative Aspects of Structure-Activity Relationships.
Abstract: General aspects of medicinal chemistry; drug targets and lead compound discovery strategies; primary exploration of structure-activity relationships; substituents and functions - qualitative and quantitative aspects of structure-activity relationships; spatial organization, receptor mapping and molecular modelling; chemical modifications influencing the pharmacokinetic properties; pharmaceutical and chemical formulation problems; development of new drugs - legal and economic aspects.

Journal ArticleDOI
TL;DR: In this paper, the authors review important mechanisms that contribute towards elevation-dependent warming: snow albedo and surface-based feedbacks; water vapour changes and latent heat release; surface water vapour and radiative flux changes; surface heat loss and temperature change; and aerosols.
Abstract: There is growing evidence that the rate of warming is amplified with elevation, such that high-mountain environments experience more rapid changes in temperature than environments at lower elevations. Elevation-dependent warming (EDW) can accelerate the rate of change in mountain ecosystems, cryospheric systems, hydrological regimes and biodiversity. Here we review important mechanisms that contribute towards EDW: snow albedo and surface-based feedbacks; water vapour changes and latent heat release; surface water vapour and radiative flux changes; surface heat loss and temperature change; and aerosols. All lead to enhanced warming with elevation (or at a critical elevation), and it is believed that combinations of these mechanisms may account for contrasting regional patterns of EDW. We discuss future needs to increase knowledge of mountain temperature trends and their controlling mechanisms through improved observations, satellite-based remote sensing and model simulations.

DOI
07 Dec 2015
TL;DR: The FEniCS Project is a collaborative project for the development of innovative concepts and tools for automated scientific computing, with a particular focus on the solution of differential equations by finite element methods.
Abstract: The FEniCS Project is a collaborative project for the development of innovative concepts and tools for automated scientific computing, with a particular focus on the solution of differential equations by finite element methods. The FEniCS Project's software consists of a collection of interoperable software components, including DOLFIN, FFC, FIAT, Instant, UFC, UFL, and mshr. This note describes the new features and changes introduced in the release of FEniCS version 1.5.
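
For flavour, the canonical Poisson demo in the legacy DOLFIN Python interface of roughly this era (details vary across FEniCS versions):

```python
from dolfin import *

mesh = UnitSquareMesh(32, 32)
V = FunctionSpace(mesh, "Lagrange", 1)

u = TrialFunction(V)
v = TestFunction(V)
f = Constant(1.0)

a = inner(grad(u), grad(v)) * dx   # bilinear form of -Laplace(u) = f
L = f * v * dx
bc = DirichletBC(V, Constant(0.0), "on_boundary")

u = Function(V)
solve(a == L, u, bc)               # assemble and solve the linear system
```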

Journal ArticleDOI
01 Jun 2015-Pain
TL;DR: The IASP Task Force, which comprises pain experts from across the globe, has developed a new and pragmatic classification of chronic pain for the upcoming 11th revision of the International Classification of Diseases, aided by a novel ICD-11 coding principle termed “multiple parenting.”
Abstract: Chronic pain has been recognized as pain that persists past normal healing time [5] and hence lacks the acute warning function of physiological nociception [35]. Usually pain is regarded as chronic when it lasts or recurs for more than 3 to 6 months [29]. Chronic pain is a frequent condition, affecting an estimated 20% of people worldwide [6,13,14,18] and accounting for 15% to 20% of physician visits [25,28]. Chronic pain should receive greater attention as a global health priority because adequate pain treatment is a human right, and it is the duty of any health care system to provide it [4,13].

The current version of the International Classification of Diseases (ICD) of the World Health Organization (WHO) includes some diagnostic codes for chronic pain conditions, but these diagnoses do not reflect the actual epidemiology of chronic pain, nor are they categorized in a systematic manner. The ICD is the preeminent tool for coding diagnoses and documenting investigations or therapeutic measures within the health care systems of many countries. In addition, ICD codes are commonly used to report target diseases and comorbidities of participants in clinical research. Consequently, the current lack of adequate coding in the ICD makes the acquisition of accurate epidemiological data related to chronic pain difficult, prevents adequate billing for health care expenses related to pain treatment, and hinders the development and implementation of new therapies [10,11,16,23,27,31,37].

Responding to these shortcomings, the International Association for the Study of Pain (IASP) contacted the WHO and established a Task Force for the Classification of Chronic Pain. The IASP Task Force, which comprises pain experts from across the globe [19], has developed a new and pragmatic classification of chronic pain for the upcoming 11th revision of the ICD. The goal is to create a classification system that is applicable in primary care and in clinical settings for specialized pain management. A major challenge in this process was finding a rational principle of classification that suits the different types of chronic pain and fits into the general ICD-11 framework. Pain categories are variably defined based on the perceived location (headache), etiology (cancer pain), or the primarily affected anatomical system (neuropathic pain). Some diagnoses of pain defy these classification principles (fibromyalgia). This problem is not unique to the classification of pain, but exists throughout the ICD. The IASP Task Force decided to give first priority to pain etiology, followed by underlying pathophysiological mechanisms, and finally the body site. Developing this multilayered classification was greatly facilitated by a novel principle of assigning diagnostic codes in ICD-11, termed “multiple parenting.” Multiple parenting allows the same diagnosis to be subsumed under more than one category (for a glossary of ICD terms, refer to Table 1). Each diagnosis retains one category as primary parent, but is cross-referenced to other categories that function as secondary parents.

[Table 1: Glossary of ICD-11 terms.]

The new ICD category for “Chronic Pain” comprises the most common clinically relevant disorders. These disorders were divided into 7 groups (Fig. 1): (1) chronic primary pain, (2) chronic cancer pain, (3) chronic posttraumatic and postsurgical pain, (4) chronic neuropathic pain, (5) chronic headache and orofacial pain, (6) chronic visceral pain, and (7) chronic musculoskeletal pain. Experts assigned to each group are responsible for the definition of diagnostic criteria and the selection of the diagnoses to be included under these subcategories of chronic pain. Thanks to Bedirhan Ustun and Robert Jakob of the WHO, these pain diagnoses are now integrated in the beta version of ICD-11 (http://id.who.int/icd/entity/1581976053). The Task Force is generating content models for single entities to describe their clinical characteristics. After peer review overseen by the WHO Steering Committee [39], the classification of chronic pain will be voted into action by the World Health Assembly in 2017.

[Figure 1: Organizational chart of Task Force, IASP, and WHO interactions. The Task Force was created by the IASP council and its scope defined in direct consultation of the chairs (R.D.T. and W.R.) with WHO representatives in 2012.]

2. Classification of chronic pain

Chronic pain was defined as persistent or recurrent pain lasting longer than 3 months. This definition according to pain duration has the advantage that it is clear and operationalized. Optional specifiers for each diagnosis record evidence of psychosocial factors and the severity of the pain. Pain severity can be graded based on pain intensity, pain-related distress, and functional impairment.

2.1. Chronic primary pain

Chronic primary pain is pain in one or more anatomic regions that persists or recurs for longer than 3 months and is associated with significant emotional distress or significant functional disability (interference with activities of daily life and participation in social roles) and that cannot be better explained by another chronic pain condition. This is a new phenomenological definition, created because the etiology is unknown for many forms of chronic pain. Common conditions such as back pain that is identified as neither musculoskeletal nor neuropathic pain, chronic widespread pain, fibromyalgia, and irritable bowel syndrome will be found in this section, and biological findings contributing to the pain problem may or may not be present. The term “primary pain” was chosen in close liaison with the ICD-11 revision committee, who felt this was the most widely acceptable term, in particular from a nonspecialist perspective.

Journal ArticleDOI
TL;DR: This 2017 Consensus Statement provides a state-of-the-art review of the field of catheter and surgical ablation of atrial fibrillation (AF) and reports the findings of a writing group convened by five international societies.

Journal ArticleDOI
TL;DR: This paper reports a strategy to entrap polysulfides in the cathode that relies on a chemical process, whereby a host (manganese dioxide nanosheets serve as the prototype) reacts with initially formed lithium polysulfides to form surface-bound intermediates; the resulting cycling performance is among the best reported to date.
Abstract: The lithium-sulfur battery is receiving intense interest because its theoretical energy density exceeds that of lithium-ion batteries at much lower cost, but practical applications are still hindered by capacity decay caused by the polysulfide shuttle. Here we report a strategy to entrap polysulfides in the cathode that relies on a chemical process, whereby a host (manganese dioxide nanosheets serve as the prototype) reacts with initially formed lithium polysulfides to form surface-bound intermediates. These function as a redox shuttle to catenate and bind 'higher' polysulfides, and convert them on reduction to insoluble lithium sulfide via disproportionation. The sulfur/manganese dioxide nanosheet composite with 75 wt% sulfur exhibits a reversible capacity of 1,300 mA h g⁻¹ at moderate rates and a fade rate over 2,000 cycles of 0.036% per cycle, among the best reported to date. We furthermore show that this mechanism extends to graphene oxide and suggest it can be employed more widely.

Journal ArticleDOI
TL;DR: An emergent logic of accumulation in the networked sphere, ‘surveillance capitalism,’ is described; its implications for ‘information civilization’ are considered, and a distributed and largely uncontested new expression of power is christened ‘Big Other.’
Abstract: This article describes an emergent logic of accumulation in the networked sphere, ‘surveillance capitalism,’ and considers its implications for ‘information civilization.’ The institutionalizing practices and operational assumptions of Google Inc. are the primary lens for this analysis as they are rendered in two recent articles authored by Google Chief Economist Hal Varian. Varian asserts four uses that follow from computer-mediated transactions: ‘data extraction and analysis,’ ‘new contractual forms due to better monitoring,’ ‘personalization and customization,’ and ‘continuous experiments.’ An examination of the nature and consequences of these uses sheds light on the implicit logic of surveillance capitalism and the global architecture of computer mediation upon which it depends. This architecture produces a distributed and largely uncontested new expression of power that I christen: ‘Big Other.’ It is constituted by unexpected and often illegible mechanisms of extraction, commodification, and control that effectively exile persons from their own behavior while producing new markets of behavioral prediction and modification. Surveillance capitalism challenges democratic norms and departs in key ways from the centuries-long evolution of market capitalism.

Journal ArticleDOI
TL;DR: The ReCiPe2016 method, as discussed by the authors, provides a state-of-the-art approach to convert life cycle inventories to a limited number of life cycle impact scores on midpoint and endpoint level.
Abstract: Life cycle impact assessment (LCIA) translates emissions and resource extractions into a limited number of environmental impact scores by means of so-called characterisation factors. There are two mainstream ways to derive characterisation factors, i.e. at midpoint level and at endpoint level. To further progress LCIA method development, we updated the ReCiPe2008 method to its version of 2016. This paper provides an overview of the key elements of the ReCiPe2016 method. We implemented human health, ecosystem quality and resource scarcity as three areas of protection. Endpoint characterisation factors, directly related to the areas of protection, were derived from midpoint characterisation factors with a constant mid-to-endpoint factor per impact category. We included 17 midpoint impact categories. The update of ReCiPe provides characterisation factors that are representative for the global scale instead of the European scale, while maintaining the possibility for a number of impact categories to implement characterisation factors at a country and continental scale. We also expanded the number of environmental interventions and added impacts of water use on human health, impacts of water use and climate change on freshwater ecosystems and impacts of water use and tropospheric ozone formation on terrestrial ecosystems as novel damage pathways. Although significant effort has been put into the update of ReCiPe, there is still major improvement potential in the way impact pathways are modelled. Further improvements relate to a regionalisation of more impact categories, moving from local to global species extinction and adding more impact pathways. Life cycle impact assessment is a fast evolving field of research. ReCiPe2016 provides a state-of-the-art method to convert life cycle inventories to a limited number of life cycle impact scores on midpoint and endpoint level.
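
The constant mid-to-endpoint conversion the abstract mentions amounts to a per-category multiplication; a toy sketch with placeholder numbers (not ReCiPe2016 values):

```python
# midpoint scores from an inventory, e.g. kg CO2-eq and m3 water
midpoint_scores = {"climate change": 120.0, "water use": 3.5}
# assumed constant mid-to-endpoint factors, e.g. DALY per midpoint unit
mid_to_endpoint = {"climate change": 9.3e-7, "water use": 2.2e-6}

# endpoint damage to human health: sum of midpoint score times its factor
human_health_damage = sum(midpoint_scores[c] * mid_to_endpoint[c]
                          for c in midpoint_scores)  # in DALYs
```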

Journal ArticleDOI
05 Apr 2018-Cell
TL;DR: This study reports a PanCancer and PanSoftware analysis spanning 9,423 tumor exomes (comprising all 33 of The Cancer Genome Atlas projects) and using 26 computational tools to catalog driver genes and mutations, identifying 299 driver genes with implications regarding their anatomical sites and cancer/cell types.

Book ChapterDOI
TL;DR: In this article, the authors used the Cobb-Douglas technology to determine what will happen to consumption if only the rents from exhaustible resources are invested in reproducible capital goods.
Abstract: The present generation's ethical dilemma over shortchanging future generations by overconsuming exhaustible resources could be relieved by a program of converting the capital from these resources into machines and living off the current flow of machines and labor. If the stock of machines is assumed not to depreciate, then the stock of productive capital and resources is not depleted. Cobb-Douglas technology is used to determine what will happen to consumption if only the rents from exhaustible resources are invested in reproducible capital goods. An important feature of Cobb-Douglas technology is that input in the form of minerals from an exhaustible resource is needed to get a positive output of the single produced commodity. Results of the model indicate that a savings investment rule will not provide for the maintenance of per capita consumption constant over time. Further studies have explored the effects of depreciation and intergenerational equity.
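
A compact sketch of the setup (my notation): with Cobb-Douglas output in capital, resource flow, and labor, investing exactly the resource rents in reproducible capital pins down capital accumulation, and the sustainability of consumption then turns on the exponents.

```latex
% Cobb-Douglas output in capital K, exhaustible-resource flow R, labor L
Q = K^{\alpha} R^{\beta} L^{1-\alpha-\beta}, \qquad \alpha, \beta > 0
% investing exactly the resource rents in reproducible capital:
\dot{K} = R\,\frac{\partial Q}{\partial R} = \beta\, Q
% whether constant per-capita consumption can be maintained then depends on
% the relative sizes of \alpha and \beta (and on depreciation assumptions)
```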

Journal ArticleDOI
19 Feb 2015-Nature
TL;DR: A fine-mapping algorithm is developed to identify candidate causal variants for 21 autoimmune diseases from genotyping data, and it is found that most non-coding risk variants, including those that alter gene expression, affect non-canonical sequence determinants not well-explained by current gene regulatory models.
Abstract: Genome-wide association studies have identified loci underlying human diseases, but the causal nucleotide changes and mechanisms remain largely unknown. Here we developed a fine-mapping algorithm to identify candidate causal variants for 21 autoimmune diseases from genotyping data. We integrated these predictions with transcription and cis-regulatory element annotations, derived by mapping RNA and chromatin in primary immune cells, including resting and stimulated CD4(+) T-cell subsets, regulatory T cells, CD8(+) T cells, B cells, and monocytes. We find that ∼90% of causal variants are non-coding, with ∼60% mapping to immune-cell enhancers, many of which gain histone acetylation and transcribe enhancer-associated RNA upon immune stimulation. Causal variants tend to occur near binding sites for master regulators of immune differentiation and stimulus-dependent gene activation, but only 10-20% directly alter recognizable transcription factor binding motifs. Rather, most non-coding risk variants, including those that alter gene expression, affect non-canonical sequence determinants not well-explained by current gene regulatory models.
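
A sketch of the generic single-causal-variant fine-mapping step such studies build on: convert per-SNP effect estimates in a locus into posterior probabilities of being causal and take a cumulative credible set. The approximate Bayes factor follows Wakefield (2009); the prior variance W is an assumption, and the paper's own algorithm differs in detail.

```python
import numpy as np

def posterior_probs(beta, se, W=0.04):
    # beta, se: arrays of per-SNP effect estimates and standard errors
    z = beta / se
    r = W / (W + se**2)
    log_abf = 0.5 * (np.log(1 - r) + r * z**2)  # log approx. Bayes factor
    w = np.exp(log_abf - log_abf.max())
    return w / w.sum()                          # one causal variant assumed

def credible_set(probs, level=0.95):
    # smallest set of SNPs whose posterior mass reaches the target level
    order = np.argsort(probs)[::-1]
    cum = np.cumsum(probs[order])
    return order[: int(np.searchsorted(cum, level)) + 1]
```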

Journal ArticleDOI
TL;DR: Improved understanding of the host immune system and tumor microenvironment will better elucidate which patients derive benefit from these promising agents, although robust responses in some patients with low PD-L1 expression complicate its use as an exclusionary predictive biomarker.
Abstract: The resurgence of cancer immunotherapy stems from an improved understanding of the tumor microenvironment. The PD-1/PD-L1 axis is of particular interest, in light of promising data demonstrating a restoration of host immunity against tumors, with the prospect of durable remissions. Indeed, remarkable clinical responses have been seen in several different malignancies including, but not limited to, melanoma, lung, kidney, and bladder cancers. Even so, determining which patients derive benefit from PD-1/PD-L1-directed immunotherapy remains an important clinical question, particularly in light of the autoimmune toxicity of these agents. The use of PD-L1 (B7-H1) immunohistochemistry (IHC) as a predictive biomarker is confounded by multiple unresolved issues: variable detection antibodies, differing IHC cutoffs, tissue preparation, processing variability, primary versus metastatic biopsies, oncogenic versus induced PD-L1 expression, and staining of tumor versus immune cells. Emerging data suggest that patients whose tumors overexpress PD-L1 by IHC have improved clinical outcomes with anti-PD-1-directed therapy, but the presence of robust responses in some patients with low levels of expression of these markers complicates the issue of PD-L1 as an exclusionary predictive biomarker. An improved understanding of the host immune system and tumor microenvironment will better elucidate which patients derive benefit from these promising agents.

Journal ArticleDOI
TL;DR: The mechanisms of microbial immune subversion that tip the balance from homeostasis to disease in oral or extra-oral sites are discussed.
Abstract: Periodontitis is a dysbiotic inflammatory disease with an adverse impact on systemic health. Recent studies have provided insights into the emergence and persistence of dysbiotic oral microbial communities that can mediate inflammatory pathology at local as well as distant sites. This Review discusses the mechanisms of microbial immune subversion that tip the balance from homeostasis to disease in oral or extra-oral sites.