Journal ArticleDOI
TL;DR: Postoperative pain and quality of life were investigated in a randomised trial of patients with early-stage non-small-cell lung cancer undergoing VATS versus open surgery; the results suggest that VATS should be the preferred surgical approach for lobectomy in stage I non-small-cell lung cancer.
Abstract:

Background: Video-assisted thoracoscopic surgery (VATS) is used increasingly as an alternative to thoracotomy for lobectomy in the treatment of early-stage non-small-cell lung cancer, but remains controversial and worldwide adoption rates are low. Non-randomised studies have suggested that VATS reduces postoperative morbidity, but there is little high-quality evidence to show its superiority over open surgery. We aimed to investigate postoperative pain and quality of life in a randomised trial of patients with early-stage non-small-cell lung cancer undergoing VATS versus open surgery.

Methods: We did a randomised controlled patient and observer blinded trial at a public university-based cardiothoracic surgery department in Denmark. We enrolled patients who were scheduled for lobectomy for stage I non-small-cell lung cancer. By use of a web-based randomisation system, we assigned patients (1:1) to lobectomy via four-port VATS or anterolateral thoracotomy. After surgery, we applied identical surgical dressings to ensure masking of patients and staff. Postoperative pain was measured with a numeric rating scale (NRS) six times per day during hospital stay and once at 2, 4, 8, 12, 26, and 52 weeks, and self-reported quality of life was assessed with the EuroQol 5 Dimensions (EQ5D) and the European Organisation for Research and Treatment of Cancer (EORTC) 30 item Quality of Life Questionnaire (QLQ-C30) during hospital stay and 2, 4, 8, 12, 26, and 52 weeks after discharge. The primary outcomes were the proportion of patients with clinically relevant moderate-to-severe pain (NRS ≥3) and mean quality of life scores. These outcomes were assessed longitudinally by logistic regression across all timepoints. Data for the primary analysis were analysed by modified intention to treat (ie, all randomised patients with pathologically confirmed non-small-cell lung cancer). This trial is registered with ClinicalTrials.gov, number NCT01278888.

Findings: Between Oct 1, 2008, and Aug 20, 2014, we screened 772 patients, of whom 361 were eligible for inclusion and 206 were enrolled. We randomly assigned 103 patients to VATS and 103 to anterolateral thoracotomy. 102 patients in the VATS group and 99 in the thoracotomy group were included in the final analysis. The proportion of patients with clinically relevant pain (NRS ≥3) was significantly lower during the first 24 h after VATS than after anterolateral thoracotomy (VATS 38%, 95% CI 0·28–0·48 vs thoracotomy 63%, 95% CI 0·52–0·72, p=0·0012). During 52 weeks of follow-up, episodes of moderate-to-severe pain were significantly less frequent after VATS than after anterolateral thoracotomy (p<0·0001). Postoperative complications did not differ significantly between groups: bleeding (… vs nine patients in the thoracotomy group), re-operation for bleeding (two vs none), twisted middle lobe (one vs three) or prolonged air leakage over 7 days (five vs six), arrhythmia (one vs one), or neurological events (one vs two). Nine (4%) patients died during the follow-up period (three in the VATS group and six in the thoracotomy group).

Interpretation: VATS is associated with less postoperative pain and better quality of life than is anterolateral thoracotomy for the first year after surgery, suggesting that VATS should be the preferred surgical approach for lobectomy in stage I non-small-cell lung cancer.
Funding Simon Fougner Hartmanns Familiefond, Guldsmed AL & D Rasmussens Mindefond, Karen S Jensens legat, The University of Southern Denmark, The Research Council at Odense University Hospital, and Department of Cardiothoracic Surgery, Odense University Hospital.

697 citations


Journal ArticleDOI
TL;DR: Magnetic analyses and electron microscopy reveal the abundant presence in the human brain of magnetite nanoparticles consistent with high-temperature formation, suggesting an external, not internal, source.
Abstract: Biologically formed nanoparticles of the strongly magnetic mineral, magnetite, were first detected in the human brain over 20 y ago [Kirschvink JL, Kobayashi-Kirschvink A, Woodford BJ (1992) Proc Natl Acad Sci USA 89(16):7683-7687]. Magnetite can have potentially large impacts on the brain due to its unique combination of redox activity, surface charge, and strongly magnetic behavior. We used magnetic analyses and electron microscopy to identify the abundant presence in the brain of magnetite nanoparticles that are consistent with high-temperature formation, suggesting, therefore, an external, not internal, source. Comprising a separate nanoparticle population from the euhedral particles ascribed to endogenous sources, these brain magnetites are often found with other transition metal nanoparticles, and they display rounded crystal morphologies and fused surface textures, reflecting crystallization upon cooling from an initially heated, iron-bearing source material. Such high-temperature magnetite nanospheres are ubiquitous and abundant in airborne particulate matter pollution. They arise as combustion-derived, iron-rich particles, often associated with other transition metal particles, which condense and/or oxidize upon airborne release. Those magnetite pollutant particles which are <∼200 nm in diameter can enter the brain directly via the olfactory bulb. Their presence proves that externally sourced iron-bearing nanoparticles, rather than their soluble compounds, can be transported directly into the brain, where they may pose a hazard to human health.

697 citations


Journal ArticleDOI
TL;DR: This paper constructs an Underwater Image Enhancement Benchmark (UIEB) of 950 real-world underwater images, 890 of which have corresponding reference images, and proposes an underwater image enhancement network (Water-Net) trained on this benchmark as a baseline, demonstrating the usefulness of the proposed UIEB for training convolutional neural networks (CNNs).
Abstract: Underwater image enhancement has been attracting much attention due to its significance in marine engineering and aquatic robotics. Numerous underwater image enhancement algorithms have been proposed in the last few years. However, these algorithms are mainly evaluated using either synthetic datasets or a few selected real-world images. It is thus unclear how these algorithms would perform on images acquired in the wild and how we could gauge the progress in the field. To bridge this gap, we present the first comprehensive perceptual study and analysis of underwater image enhancement using large-scale real-world images. In this paper, we construct an Underwater Image Enhancement Benchmark (UIEB) including 950 real-world underwater images, 890 of which have corresponding reference images. We treat the remaining 60 underwater images, for which satisfactory reference images could not be obtained, as challenging data. Using this dataset, we conduct a comprehensive study of state-of-the-art underwater image enhancement algorithms, both qualitatively and quantitatively. In addition, we propose an underwater image enhancement network (called Water-Net) trained on this benchmark as a baseline, which demonstrates the generalization ability of the proposed UIEB for training convolutional neural networks (CNNs). The benchmark evaluations and the proposed Water-Net demonstrate the performance and limitations of state-of-the-art algorithms, shedding light on future research in underwater image enhancement. The dataset and code are available at https://li-chongyi.github.io/proj_benchmark.html.

697 citations


Journal ArticleDOI
TL;DR: The role of torsion in gravity has been extensively investigated along the main direction of bringing gravity closer to its gauge formulation and incorporating spin in a geometric description; various torsional constructions, from teleparallel to Einstein-Cartan and metric-affine gauge theories, are reviewed.
Abstract: Over the past decades, the role of torsion in gravity has been extensively investigated along the main direction of bringing gravity closer to its gauge formulation and incorporating spin in a geometric description. Here we review various torsional constructions, from teleparallel, to Einstein-Cartan, and metric-affine gauge theories, resulting in extending torsional gravity in the paradigm of f(T) gravity, where f(T) is an arbitrary function of the torsion scalar. Based on this theory, we further review the corresponding cosmological and astrophysical applications. In particular, we study cosmological solutions arising from f(T) gravity, both at the background and perturbation levels, in different eras along the cosmic expansion. The f(T) gravity construction can provide a theoretical interpretation of the late-time universe acceleration, and it can easily accommodate the regular thermal expanding history, including the radiation and cold dark matter dominated phases. Furthermore, if one traces back to very early times, a sufficiently long period of inflation can be achieved and hence can be investigated by cosmic microwave background observations, or alternatively, the Big Bang singularity can be avoided due to the appearance of non-singular bounces. Various observational constraints, especially the bounds coming from the large-scale structure data in the case of f(T) cosmology, as well as the behavior of gravitational waves, are described in detail. Moreover, the spherically symmetric and black hole solutions of the theory are reviewed. Additionally, we discuss various extensions of the f(T) paradigm. Finally, we consider the relation with other modified gravitational theories, such as those based on curvature, like f(R) gravity, aiming to illuminate which formulation might be more suitable for quantization ventures and cosmological applications.
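For orientation, one common convention in the teleparallel literature writes the f(T) action as below; conventions vary (some authors write T + f(T) instead of f(T)), so this is an illustrative form rather than a definitive statement of the review's notation:

```latex
S=\frac{1}{16\pi G}\int d^{4}x\, e\, f(T)+S_{\mathrm{m}},\qquad
e=\det\!\big(e^{A}{}_{\mu}\big)=\sqrt{-g},
```

where T is the torsion scalar built from the tetrad e^A_μ; the teleparallel equivalent of general relativity, and hence standard cosmology, is recovered for f(T) = T.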

697 citations


Journal ArticleDOI
TL;DR: This review details the human pathology of select neurodegenerative disorders, focusing on their main protein aggregates, and suggests that abnormal protein conformers may spread from cell to cell along anatomically connected pathways.
Abstract: Neurodegenerative disorders are characterized by progressive loss of selectively vulnerable populations of neurons, which contrasts with select static neuronal loss because of metabolic or toxic disorders. Neurodegenerative diseases can be classified according to primary clinical features (e.g., dementia, parkinsonism, or motor neuron disease), anatomic distribution of neurodegeneration (e.g., frontotemporal degenerations, extrapyramidal disorders, or spinocerebellar degenerations), or principal molecular abnormality. The most common neurodegenerative disorders are amyloidoses, tauopathies, α-synucleinopathies, and TDP-43 proteinopathies. The protein abnormalities in these disorders have abnormal conformational properties. Growing experimental evidence suggests that abnormal protein conformers may spread from cell to cell along anatomically connected pathways, which may in part explain the specific anatomical patterns observed at autopsy. In this review, we detail the human pathology of select neurodegenerative disorders, focusing on their main protein aggregates.

697 citations


Journal ArticleDOI
TL;DR: In this article, a ternary hybrid was constructed by in situ growth of cobalt selenide (Co0.85Se) nanosheets vertically oriented on electrochemically exfoliated graphene foil, with subsequent deposition of NiFe layered-double-hydroxide by a hydrothermal treatment.
Abstract: Developing cost-effective electrocatalysts for both the oxygen evolution reaction (OER) and the hydrogen evolution reaction (HER) in basic media is critical to renewable energy conversion technologies. Here, we report a ternary hybrid that is constructed by in situ growth of cobalt selenide (Co0.85Se) nanosheets vertically oriented on electrochemically exfoliated graphene foil, with subsequent deposition of NiFe layered-double-hydroxide by a hydrothermal treatment. The resulting 3D hierarchical hybrid, possessing a high surface area of 156 m2 g−1 and a strong coupling effect, exhibits excellent catalytic activity for the OER, requiring potentials of only 1.50 and 1.51 V to attain current densities of 150 and 250 mA cm−2, respectively. These values are much lower than those reported for other non-noble-metal materials and Ir/C catalysts. The hybrid also efficiently catalyzes the HER in base, delivering a current density of 10 mA cm−2 at an overpotential of 0.26 V. Most importantly, we achieve a current density of 20 mA cm−2 at 1.71 V by using the 3D hybrid as both cathode and anode for overall water splitting, which is comparable to the integrated performance of Pt/C and Ir/C catalysts.
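For context on the quoted figures (assuming, as is standard in this literature, that potentials are reported versus the reversible hydrogen electrode), the OER overpotential follows from the 1.23 V equilibrium potential of water oxidation:

```latex
\eta_{\mathrm{OER}} = E - 1.23\ \mathrm{V}
\;\Longrightarrow\;
\eta(1.50\ \mathrm{V}) = 0.27\ \mathrm{V},\qquad
\eta(1.51\ \mathrm{V}) = 0.28\ \mathrm{V}.
```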

697 citations


Posted Content
TL;DR: StarGAN v2, a single image-to-image translation framework that addresses both diversity of generated images and scalability over multiple domains, is proposed and shows significantly improved results over the baselines.
Abstract: A good image-to-image translation model should learn a mapping between different visual domains while satisfying the following properties: 1) diversity of generated images and 2) scalability over multiple domains. Existing methods address either of the issues, having limited diversity or multiple models for all domains. We propose StarGAN v2, a single framework that tackles both and shows significantly improved results over the baselines. Experiments on CelebA-HQ and a new animal faces dataset (AFHQ) validate our superiority in terms of visual quality, diversity, and scalability. To better assess image-to-image translation models, we release AFHQ, high-quality animal faces with large inter- and intra-domain differences. The code, pretrained models, and dataset can be found at this https URL.

697 citations


Journal ArticleDOI
15 Apr 2016-Science
TL;DR: Nuclear envelope opening in migrating leukocytes could have potentially important consequences for normal and pathological immune responses, and survival of cells migrating through confining environments depended on efficient nuclear envelope and DNA repair machineries.
Abstract: In eukaryotic cells, the nuclear envelope separates the genomic DNA from the cytoplasmic space and regulates protein trafficking between the two compartments. This barrier is only transiently dissolved during mitosis. Here, we found that it also opened at high frequency in migrating mammalian cells during interphase, which allowed nuclear proteins to leak out and cytoplasmic proteins to leak in. This transient opening was caused by nuclear deformation and was rapidly repaired in an ESCRT (endosomal sorting complexes required for transport)-dependent manner. DNA double-strand breaks coincided with nuclear envelope opening events. As a consequence, survival of cells migrating through confining environments depended on efficient nuclear envelope and DNA repair machineries. Nuclear envelope opening in migrating leukocytes could have potentially important consequences for normal and pathological immune responses.

697 citations


PatentDOI
TL;DR: This work introduces network-based stratification (NBS), a method to integrate somatic tumor genomes with gene networks that allows for stratification of cancer into informative subtypes by clustering together patients with mutations in similar network regions.
Abstract: The embodiments provide a method for stratification of cancer into one or more informative subtypes in a subject in need thereof. The embodiments further provide assigning a subject in need thereof into one or more informative subtypes, including assigning a subject to an informative subtype that is an ovarian cancer subtype.
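The NBS approach (described in the associated publication, Hofree et al., Nature Methods, 2013) smooths each patient's binary mutation profile over a gene interaction network before clustering patients. A minimal sketch of that network-propagation step; the function names, parameter values, and toy adjacency matrix below are illustrative assumptions, not taken from the patent:

```python
import numpy as np

def network_propagate(F0, A, alpha=0.7, tol=1e-6, max_iter=1000):
    """Smooth patient-by-gene mutation profiles F0 over a gene network.

    Iterates F <- alpha * F @ W + (1 - alpha) * F0, where W is the
    degree-normalized adjacency matrix, until convergence.
    """
    W = A / A.sum(axis=1, keepdims=True)  # row-normalize the adjacency matrix
    F = F0.astype(float).copy()
    for _ in range(max_iter):
        F_next = alpha * F @ W + (1 - alpha) * F0
        done = np.abs(F_next - F).max() < tol
        F = F_next
        if done:
            break
    return F

# Toy example: 2 patients x 4 genes, with a small chain-shaped gene network.
F0 = np.array([[1, 0, 0, 0],
               [0, 0, 1, 0]], dtype=float)
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
print(network_propagate(F0, A))  # mutations spread to network neighbours
```

After smoothing, patients whose mutations hit nearby network regions end up with similar profiles, which is what allows them to be clustered into the same subtype.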

697 citations


Journal ArticleDOI
TL;DR: This paper provides a comprehensive overview of the emerging field of event-based vision, with a focus on the applications and the algorithms developed to unlock the outstanding properties of event cameras.
Abstract: Event cameras are bio-inspired sensors that differ from conventional frame cameras: Instead of capturing images at a fixed rate, they asynchronously measure per-pixel brightness changes, and output a stream of events that encode the time, location and sign of the brightness changes. Event cameras offer attractive properties compared to traditional cameras: high temporal resolution (on the order of μs), very high dynamic range (140 dB vs 60 dB), low power consumption, and high pixel bandwidth (on the order of kHz) resulting in reduced motion blur. Hence, event cameras have a large potential for robotics and computer vision in challenging scenarios for traditional cameras, such as low-latency, high speed, and high dynamic range. However, novel methods are required to process the unconventional output of these sensors in order to unlock their potential. This paper provides a comprehensive overview of the emerging field of event-based vision, with a focus on the applications and the algorithms developed to unlock the outstanding properties of event cameras. We present event cameras from their working principle, the actual sensors that are available and the tasks that they have been used for, from low-level vision (feature detection and tracking, optic flow, etc.) to high-level vision (reconstruction, segmentation, recognition). We also discuss the techniques developed to process events, including learning-based techniques, as well as specialized processors for these novel sensors, such as spiking neural networks. Additionally, we highlight the challenges that remain to be tackled and the opportunities that lie ahead in the search for a more efficient, bio-inspired way for machines to perceive and interact with the world.
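To make the output format concrete, here is a minimal sketch (illustrative, not from the survey) of the event tuple the paper describes, plus accumulation of events into a frame, a common preprocessing step for downstream algorithms:

```python
import numpy as np
from collections import namedtuple

# An event encodes the time, pixel location, and sign of a brightness change.
Event = namedtuple("Event", ["t", "x", "y", "polarity"])  # polarity in {-1, +1}

def events_to_frame(events, width, height):
    """Accumulate signed events into a 2D frame."""
    frame = np.zeros((height, width), dtype=np.int32)
    for ev in events:
        frame[ev.y, ev.x] += ev.polarity
    return frame

# Toy stream: three events at microsecond timestamps.
stream = [Event(1e-6, 3, 2, +1), Event(5e-6, 3, 2, +1), Event(9e-6, 7, 4, -1)]
print(events_to_frame(stream, width=10, height=8))
```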

697 citations


Proceedings ArticleDOI
07 Dec 2015
TL;DR: In this paper, the authors argue that domain expertise represented by the conventional sparse coding model is still valuable, and it can be combined with the key ingredients of deep learning to achieve further improved results.
Abstract: Deep learning techniques have been successfully applied in many areas of computer vision, including low-level image restoration problems. For image super-resolution, several models based on deep neural networks have been recently proposed and attained superior performance that overshadows all previous handcrafted models. The question then arises whether large-capacity and data-driven models have become the dominant solution to the ill-posed super-resolution problem. In this paper, we argue that domain expertise represented by the conventional sparse coding model is still valuable, and it can be combined with the key ingredients of deep learning to achieve further improved results. We show that a sparse coding model particularly designed for super-resolution can be incarnated as a neural network, and trained in a cascaded structure from end to end. The interpretation of the network based on sparse coding leads to much more efficient and effective training, as well as a reduced model size. Our model is evaluated on a wide range of images, and shows clear advantage over existing state-of-the-art methods in terms of both restoration accuracy and human subjective quality.
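The "sparse coding as a neural network" idea the authors build on is commonly realized by unrolling ISTA iterations into network layers (LISTA, in the style of Gregor and LeCun). A minimal numpy sketch of such an unrolled network; the dictionary, sizes, and fixed (rather than learned) layer weights are toy assumptions, not the paper's trained model:

```python
import numpy as np

def soft_threshold(z, theta):
    """Elementwise shrinkage, the nonlinearity of an unrolled ISTA layer."""
    return np.sign(z) * np.maximum(np.abs(z) - theta, 0.0)

def lista_forward(x, D, n_layers=3, lam=0.1):
    """Unrolled ISTA: each 'layer' is one sparse-coding iteration.

    x: input signal (e.g. a vectorized low-resolution patch); D: dictionary.
    In the paper's spirit W, S, and the thresholds would be trained end to
    end; here they are fixed to their classical ISTA values.
    """
    L = np.linalg.norm(D, 2) ** 2          # Lipschitz constant of D^T D
    W = D.T / L                            # input transform
    S = np.eye(D.shape[1]) - D.T @ D / L   # recurrent transform
    z = soft_threshold(W @ x, lam / L)
    for _ in range(n_layers - 1):
        z = soft_threshold(W @ x + S @ z, lam / L)
    return z                               # sparse code; D @ z reconstructs x

rng = np.random.default_rng(0)
D = rng.standard_normal((16, 32))          # toy overcomplete dictionary
x = rng.standard_normal(16)
z = lista_forward(x, D)
print("nonzeros in code:", np.count_nonzero(z))
```

Because every operation above is differentiable, the whole cascade can be trained with backpropagation, which is what gives the paper's network its sparse-coding interpretation.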

Journal ArticleDOI
TL;DR: In this article, the authors propose that individuals overweight inflation experienced during their lifetimes and modify existing adaptive learning models to allow for age-dependent updating of expectations in response to inflation surprises.
Abstract: How do individuals form expectations about future inflation? We propose that individuals overweight inflation experienced during their lifetimes. This approach modifies existing adaptive learning models to allow for age-dependent updating of expectations in response to inflation surprises. Young individuals update their expectations more strongly than older individuals since recent experiences account for a greater share of their accumulated lifetime history. We find support for these predictions using 57 years of microdata on inflation expectations from the Reuters/Michigan Survey of Consumers. Differences in experiences strongly predict differences in expectations, including the substantial disagreement between young and old individuals in periods of highly volatile inflation, such as the 1970s. It also explains household borrowing and lending behavior, including the choice of mortgages. JEL Codes: E03, G02, D03, E31, E37, D84, D83, D14.
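The mechanism can be sketched as constant-gain adaptive learning in which the gain shrinks with the length of an individual's experienced history, so a surprise moves a young agent's expectation more than an old agent's. The specific gain function below is an illustrative assumption, not the paper's estimated specification:

```python
def update_expectation(prior, realized_inflation, age_in_quarters, theta=3.0):
    """Age-dependent adaptive learning: agents with short histories place
    more weight on an inflation surprise than agents with long histories."""
    gain = theta / max(age_in_quarters, theta)  # decreasing in experience
    return prior + gain * (realized_inflation - prior)

# A 25-year-old and a 65-year-old react to the same 2-point inflation surprise.
young = update_expectation(prior=2.0, realized_inflation=4.0, age_in_quarters=100)
old   = update_expectation(prior=2.0, realized_inflation=4.0, age_in_quarters=260)
print(young, old)  # the younger agent's expectation moves more
```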

Posted Content
TL;DR: This paper proposed an adversarial evaluation scheme for the Stanford Question Answering Dataset (SQuAD) to test whether systems can answer questions about paragraphs that contain adversarially inserted sentences, which are automatically generated to distract computer systems without changing the correct answer or misleading humans.
Abstract: Standard accuracy metrics indicate that reading comprehension systems are making rapid progress, but the extent to which these systems truly understand language remains unclear. To reward systems with real language understanding abilities, we propose an adversarial evaluation scheme for the Stanford Question Answering Dataset (SQuAD). Our method tests whether systems can answer questions about paragraphs that contain adversarially inserted sentences, which are automatically generated to distract computer systems without changing the correct answer or misleading humans. In this adversarial setting, the accuracy of sixteen published models drops from an average of 75% F1 score to 36%; when the adversary is allowed to add ungrammatical sequences of words, average accuracy on four models decreases further to 7%. We hope our insights will motivate the development of new models that understand language more precisely.

Journal ArticleDOI
TL;DR: Evidence suggests that the number of patients with heart failure may be on the rise in low-income countries struggling under the double burden of communicable diseases and conditions associated with a Western-type lifestyle, indicating that the end of the epidemic has not yet been reached.
Abstract: The heart failure syndrome has first been described as an emerging epidemic about 25 years ago. Today, because of a growing and ageing population, the total number of heart failure patients still continues to rise. However, the case mix of heart failure seems to be evolving. Incidence has stabilized and may even be decreasing in some populations, but alarming opposite trends have been observed in the relatively young, possibly related to an increase in obesity. In addition, a clear transition towards heart failure with a preserved ejection fraction has occurred. Although this transition is partially artificial, due to improved recognition of heart failure as a disorder affecting the entire left ventricular ejection fraction spectrum, links can be made with the growing burden of obesity-related diseases and with the ageing of the population. Similarly, evidence suggests that the number of patients with heart failure may be on the rise in low-income countries struggling under the double burden of communicable diseases and conditions associated with a Western-type lifestyle. These findings, together with the observation that the mortality rate of heart failure is declining less rapidly than previously, indicate we have not reached the end of the epidemic yet. In this review, the evolving epidemiology of heart failure is put into perspective, to discern major trends and project future directions.


Journal ArticleDOI
TL;DR: This work presents an experimental and theoretical study of an active metamaterial, composed of coupled gyroscopes on a lattice, that breaks time-reversal symmetry, together with a mathematical model that explains how the edge-mode chirality can be switched via controlled distortions of the underlying lattice.
Abstract: Topological mechanical metamaterials are artificial structures whose unusual properties are protected very much like their electronic and optical counterparts. Here, we present an experimental and theoretical study of an active metamaterial, composed of coupled gyroscopes on a lattice, that breaks time-reversal symmetry. The vibrational spectrum displays a sonic gap populated by topologically protected edge modes that propagate in only one direction and are unaffected by disorder. We present a mathematical model that explains how the edge mode chirality can be switched via controlled distortions of the underlying lattice. This effect allows the direction of the edge current to be determined on demand. We demonstrate this functionality in experiment and envision applications of these edge modes to the design of one-way acoustic waveguides.

Journal ArticleDOI
TL;DR: This paper considers the problem of constructing interatomic potentials that approximate a given quantum-mechanical interaction model, and proposes a new class of systematically improvable potentials, which are analyzed and tested on an existing quantum-mechanical database.
Abstract: Density functional theory offers a very accurate way of computing materials properties from first principles. However, it is too expensive for modelling large-scale molecular systems whose properties are, in contrast, computed using interatomic potentials. The present paper considers, from a mathematical point of view, the problem of constructing interatomic potentials that approximate a given quantum-mechanical interaction model. In particular, a new class of systematically improvable potentials is proposed, analyzed, and tested on an existing quantum-mechanical database.

Journal ArticleDOI
06 Mar 2018-JAMA
TL;DR: Treatment with opioids was not superior to treatment with nonopioid medications for improving pain-related function over 12 months and results do not support initiation of opioid therapy for moderate to severe chronic back pain or hip or knee osteoarthritis pain.
Abstract: Importance Limited evidence is available regarding long-term outcomes of opioids compared with nonopioid medications for chronic pain. Objective To compare opioid vs nonopioid medications over 12 months on pain-related function, pain intensity, and adverse effects. Design, Setting, and Participants Pragmatic, 12-month, randomized trial with masked outcome assessment. Patients were recruited from Veterans Affairs primary care clinics from June 2013 through December 2015; follow-up was completed December 2016. Eligible patients had moderate to severe chronic back pain or hip or knee osteoarthritis pain despite analgesic use. Of 265 patients enrolled, 25 withdrew prior to randomization and 240 were randomized. Interventions Both interventions (opioid and nonopioid medication therapy) followed a treat-to-target strategy aiming for improved pain and function. Each intervention had its own prescribing strategy that included multiple medication options in 3 steps. In the opioid group, the first step was immediate-release morphine, oxycodone, or hydrocodone/acetaminophen. For the nonopioid group, the first step was acetaminophen (paracetamol) or a nonsteroidal anti-inflammatory drug. Medications were changed, added, or adjusted within the assigned treatment group according to individual patient response. Main Outcomes and Measures The primary outcome was pain-related function (Brief Pain Inventory [BPI] interference scale) over 12 months and the main secondary outcome was pain intensity (BPI severity scale). For both BPI scales (range, 0-10; higher scores = worse function or pain intensity), a 1-point improvement was clinically important. The primary adverse outcome was medication-related symptoms (patient-reported checklist; range, 0-19). Results Among 240 randomized patients (mean age, 58.3 years; women, 32 [13.0%]), 234 (97.5%) completed the trial. Groups did not significantly differ on pain-related function over 12 months (overall P = .58); mean 12-month BPI interference was 3.4 for the opioid group and 3.3 for the nonopioid group (difference, 0.1 [95% CI, −0.5 to 0.7]). Pain intensity was significantly better in the nonopioid group over 12 months (overall P = .03); mean 12-month BPI severity was 4.0 for the opioid group and 3.5 for the nonopioid group (difference, 0.5 [95% CI, 0.0 to 1.0]). Adverse medication-related symptoms were significantly more common in the opioid group over 12 months (overall P = .03); mean medication-related symptoms at 12 months were 1.8 in the opioid group and 0.9 in the nonopioid group (difference, 0.9 [95% CI, 0.3 to 1.5]). Conclusions and Relevance Treatment with opioids was not superior to treatment with nonopioid medications for improving pain-related function over 12 months. Results do not support initiation of opioid therapy for moderate to severe chronic back pain or hip or knee osteoarthritis pain. Trial Registration clinicaltrials.gov Identifier: NCT01583985

Journal ArticleDOI
01 Aug 2017-BMJ Open
TL;DR: This review identified a variety of factors of association between telehealth and patient satisfaction that could help implementers to match interventions as solutions to specific problems.
Abstract: Background The use of telehealth steadily increases as it has become a viable modality to patient care. Early adopters attempt to use telehealth to deliver high-quality care. Patient satisfaction is a key indicator of how well the telemedicine modality met patient expectations. Objective The objective of this systematic review and narrative analysis is to explore the association of telehealth and patient satisfaction in regards to effectiveness and efficiency. Methods Boolean expressions between keywords created a complex search string. Variations of this string were used in Cumulative Index of Nursing and Allied Health Literature and MEDLINE. Results 2193 articles were filtered and assessed for suitability (n=44). Factors relating to effectiveness and efficiency were identified using consensus. The factors listed most often were improved outcomes (20%), preferred modality (10%), ease of use (9%), low cost (8%), improved communication (8%) and decreased travel time (7%), which in total accounted for 61% of occurrences. Conclusion This review identified a variety of factors of association between telehealth and patient satisfaction. Knowledge of these factors could help implementers to match interventions as solutions to specific problems.

Journal ArticleDOI
TL;DR: Comparing the results from conjoint and vignette experiments on which attributes of hypothetical immigrants generate support for naturalization with the outcomes of closely corresponding referendums in Switzerland, it is found that the effects estimated from the surveys match the effects of the same attributes in the behavioral benchmark remarkably well.
Abstract: Survey experiments, like vignette and conjoint analyses, are widely used in the social sciences to elicit stated preferences and study how humans make multidimensional choices. However, there is a paucity of research on the external validity of these methods that examines whether the determinants that explain hypothetical choices made by survey respondents match the determinants that explain what subjects actually do when making similar choices in real-world situations. This study compares results from conjoint and vignette analyses on which immigrant attributes generate support for naturalization with closely corresponding behavioral data from a natural experiment in Switzerland, where some municipalities used referendums to decide on the citizenship applications of foreign residents. Using a representative sample from the same population and the official descriptions of applicant characteristics that voters received before each referendum as a behavioral benchmark, we find that the effects of the applicant attributes estimated from the survey experiments perform remarkably well in recovering the effects of the same attributes in the behavioral benchmark. We also find important differences in the relative performances of the different designs. Overall, the paired conjoint design, where respondents evaluate two immigrants side by side, comes closest to the behavioral benchmark; on average, its estimates are within 2 percentage points of the effects in the behavioral benchmark.

Proceedings Article
02 Feb 2018
TL;DR: Feature-wise Linear Modulation (FiLM) is a general-purpose conditioning method for neural networks that modulates intermediate features via a simple feature-wise affine transformation, and proves highly effective for visual reasoning tasks such as answering image-related questions.
Abstract: We introduce a general-purpose conditioning method for neural networks called FiLM: Feature-wise Linear Modulation. FiLM layers influence neural network computation via a simple, feature-wise affine transformation based on conditioning information. We show that FiLM layers are highly effective for visual reasoning — answering image-related questions which require a multi-step, high-level process — a task which has proven difficult for standard deep learning methods that do not explicitly model reasoning. Specifically, we show on visual reasoning tasks that FiLM layers 1) halve state-of-the-art error for the CLEVR benchmark, 2) modulate features in a coherent manner, 3) are robust to ablations and architectural modifications, and 4) generalize well to challenging, new data from few examples or even zero-shot.
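The FiLM operation itself is a per-feature affine transformation whose scale and shift are predicted from the conditioning input. A minimal numpy sketch; the tiny single-linear-map conditioning network here is an illustrative stand-in, not the paper's architecture:

```python
import numpy as np

def film(features, gamma, beta):
    """FiLM: feature-wise affine modulation.

    features: (batch, channels, height, width) activations.
    gamma, beta: (batch, channels) scale and shift predicted from conditioning.
    """
    return gamma[:, :, None, None] * features + beta[:, :, None, None]

# Toy conditioning network: one linear map from the conditioning vector
# (e.g. a question embedding) to per-channel gamma and beta.
rng = np.random.default_rng(0)
batch, channels, cond_dim = 2, 8, 16
W = rng.standard_normal((cond_dim, 2 * channels)) * 0.1
cond = rng.standard_normal((batch, cond_dim))
gamma_beta = cond @ W
gamma, beta = gamma_beta[:, :channels] + 1.0, gamma_beta[:, channels:]

x = rng.standard_normal((batch, channels, 4, 4))
print(film(x, gamma, beta).shape)  # (2, 8, 4, 4)
```

Because the modulation is only feature-wise (two parameters per channel), it adds very little cost wherever it is inserted in the network.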

Journal ArticleDOI
TL;DR: Metastatic castration-resistant prostate cancer remains fatal despite recent advances in medical technology; prostate-specific membrane antigen (PSMA) is highly expressed in this cancer.
Abstract: Background Metastatic castration-resistant prostate cancer remains fatal despite recent advances. Prostate-specific membrane antigen (PSMA) is highly expressed in metastatic castration-resistant prostate cancer.

Journal ArticleDOI
25 Feb 2020-BMJ
TL;DR: Ten years after the landmark review on health inequalities in England, coauthor Michael Marmot says the situation has become worse.
Abstract: Ten years after the landmark review on health inequalities in England, coauthor Michael Marmot says the situation has become worse

Journal ArticleDOI
TL;DR: Endoplasmic reticulum stress is a common feature in the pathology of numerous diseases, playing a role in neurodegeneration, stroke, cancer, metabolic diseases and inflammation.
Abstract: The endoplasmic reticulum is an organelle with multiple functions. The synthesis of transmembrane proteins and proteins that are to be secreted occurs in this organelle. Many conditions that impose stress on cells, including hypoxia, starvation, infections and changes in secretory needs, challenge the folding capacity of the cell and promote endoplasmic reticulum stress. The cellular response involves the activation of sensors that transduce signaling cascades with the aim of restoring homeostasis. This is known as the unfolded protein response, which also intersects with the integrated stress response that reduces protein synthesis through inactivation of the initiation factor eIF2α. Central to the unfolded protein response are the sensors PERK, IRE1 and ATF6, as well as other signaling nodes such as c-Jun N-terminal kinase 1 (JNK) and the downstream transcription factors XBP1, ATF4 and CHOP. These proteins aim to restore homeostasis, but they can also induce cell death, which has been shown to occur by necroptosis and, more commonly, through the regulation of Bcl-2 family proteins (Bim, Noxa and Puma) that leads to mitochondrial apoptosis. In addition, endoplasmic reticulum stress and proteotoxic stress have been shown to induce TRAIL receptors and activation of caspase-8. Endoplasmic reticulum stress is a common feature in the pathology of numerous diseases because it plays a role in neurodegeneration, stroke, cancer, metabolic diseases and inflammation. Understanding how cells react to endoplasmic reticulum stress can accelerate discovery of drugs against these diseases.

Proceedings ArticleDOI
TL;DR: Two feature squeezing methods are explored: reducing the color bit depth of each pixel and spatial smoothing, which are inexpensive and complementary to other defenses, and can be combined in a joint detection framework to achieve high detection rates against state-of-the-art attacks.
Abstract: Although deep neural networks (DNNs) have achieved great success in many tasks, they can often be fooled by adversarial examples that are generated by adding small but purposeful distortions to natural examples. Previous studies to defend against adversarial examples mostly focused on refining the DNN models, but have either shown limited success or required expensive computation. We propose a new strategy, feature squeezing, that can be used to harden DNN models by detecting adversarial examples. Feature squeezing reduces the search space available to an adversary by coalescing samples that correspond to many different feature vectors in the original space into a single sample. By comparing a DNN model's prediction on the original input with that on squeezed inputs, feature squeezing detects adversarial examples with high accuracy and few false positives. This paper explores two feature squeezing methods: reducing the color bit depth of each pixel and spatial smoothing. These simple strategies are inexpensive and complementary to other defenses, and can be combined in a joint detection framework to achieve high detection rates against state-of-the-art attacks.
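A minimal sketch of the two squeezers and the prediction-comparison detector described above; the toy model, the filter size, and the detection threshold are illustrative assumptions (the paper selects its threshold from data within a joint detection framework):

```python
import numpy as np
from scipy.ndimage import median_filter

def squeeze_bit_depth(x, bits):
    """Reduce color bit depth: quantize pixels in [0, 1] to 2**bits levels."""
    levels = 2 ** bits - 1
    return np.round(x * levels) / levels

def squeeze_spatial(x, size=2):
    """Spatial smoothing via a median filter."""
    return median_filter(x, size=size)

def detect_adversarial(model, x, threshold):
    """Flag x as adversarial if the model's output moves too much between
    the original and the squeezed inputs (L1 distance of score vectors)."""
    p = model(x)
    scores = [np.abs(p - model(squeeze_bit_depth(x, 3))).sum(),
              np.abs(p - model(squeeze_spatial(x))).sum()]
    return max(scores) > threshold

# Toy 'model': softmax over the mean intensities of the two image halves.
def model(x):
    logits = np.array([x[: x.shape[0] // 2].mean(), x[x.shape[0] // 2:].mean()])
    e = np.exp(logits - logits.max())
    return e / e.sum()

x = np.random.default_rng(0).random((8, 8))
print(detect_adversarial(model, x, threshold=0.1))
```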

Journal ArticleDOI
TL;DR: A new method of Evaluation based on Distance from Average Solution (EDAS) is introduced for multi-criteria inventory classification (MCIC) problems, and the results show that the proposed method is stable under different criteria weights and well consistent with the other methods.
Abstract: An effective way of managing and controlling a large number of inventory items or stock keeping units (SKUs) is inventory classification. Traditional ABC analysis, which is based on only a single criterion, is commonly used for classification of SKUs. However, in practice inventory classification should be considered a multi-criteria problem. In this study, a new method of Evaluation based on Distance from Average Solution (EDAS) is introduced for multi-criteria inventory classification (MCIC) problems. In the proposed method, we use positive and negative distances from the average solution for appraising alternatives (SKUs). To demonstrate the performance of the proposed method in MCIC problems, we use a common example with 47 SKUs. Comparing the results of the proposed method with some existing methods shows its good performance in ABC classification. The proposed method can also be used for multi-criteria decision-making (MCDM) problems. A comparative analysis is also made to show the validity and stability of the proposed method in MCDM problems. We compare the proposed method with the VIKOR, TOPSIS, SAW and COPRAS methods using an example. Seven sets of criteria weights and Spearman's correlation coefficient are used for this analysis. The results show that the proposed method is stable under different weights and well consistent with the other methods.
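A compact sketch of the EDAS appraisal computation as the abstract describes it (positive and negative distances from the average solution, weighted, normalized, and averaged); the toy SKU data are illustrative, not the paper's 47-SKU example:

```python
import numpy as np

def edas(X, weights, benefit):
    """EDAS: rank alternatives by distances from the average solution.

    X: (n_alternatives, n_criteria) decision matrix.
    weights: criterion weights summing to 1.
    benefit: boolean array, True for benefit criteria, False for cost criteria.
    """
    av = X.mean(axis=0)                          # average solution
    # Positive/negative distances from average, flipped for cost criteria.
    pda = np.where(benefit, np.maximum(X - av, 0) / av,
                            np.maximum(av - X, 0) / av)
    nda = np.where(benefit, np.maximum(av - X, 0) / av,
                            np.maximum(X - av, 0) / av)
    sp = (pda * weights).sum(axis=1)             # weighted sums of distances
    sn = (nda * weights).sum(axis=1)
    nsp = sp / sp.max()                          # normalized scores
    nsn = 1 - sn / sn.max()
    return (nsp + nsn) / 2                       # appraisal score in [0, 1]

# Toy example: 3 SKUs scored on annual dollar usage (benefit criterion)
# and lead time (cost criterion), with equal weights.
X = np.array([[5000.0, 2.0], [12000.0, 6.0], [800.0, 1.0]])
scores = edas(X, weights=np.array([0.5, 0.5]), benefit=np.array([True, False]))
print(np.argsort(-scores))  # ranking of SKUs, best first
```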

Journal ArticleDOI
TL;DR: In this article, the authors review recent progress in impurity systems such as colour centres in diamond and silicon carbide, rare-earth ions in solids and donors in silicon and project a possible path to chip-scale quantum technologies through sustained advances in nanofabrication, quantum control and materials engineering.
Abstract: Spins of impurities in solids provide a unique architecture to realize quantum technologies. A quantum register of electron and nearby nuclear spins in the lattice encompasses high-fidelity state manipulation and readout, long-lived quantum memory, and long-distance transmission of quantum states by optical transitions that coherently connect spins and photons. These features, combined with solid-state device engineering, establish impurity spins as promising resources for quantum networks, information processing and sensing. Focusing on optical methods for the access and connectivity of single spins, we review recent progress in impurity systems such as colour centres in diamond and silicon carbide, rare-earth ions in solids and donors in silicon. We project a possible path to chip-scale quantum technologies through sustained advances in nanofabrication, quantum control and materials engineering.

Journal ArticleDOI
TL;DR: It is demonstrated that FePS3 exhibits an Ising-type antiferromagnetic ordering down to the monolayer limit, in good agreement with the Onsager solution for two-dimensional order-disorder transition.
Abstract: Magnetism in two-dimensional materials is not only of fundamental scientific interest but also a promising candidate for numerous applications. However, studies so far, especially the experimental ones, have been mostly limited to the magnetism arising from defects, vacancies, edges, or chemical dopants which are all extrinsic effects. Here, we report on the observation of intrinsic antiferromagnetic ordering in the two-dimensional limit. By monitoring the Raman peaks that arise from zone folding due to antiferromagnetic ordering at the transition temperature, we demonstrate that FePS3 exhibits an Ising-type antiferromagnetic ordering down to the monolayer limit, in good agreement with the Onsager solution for two-dimensional order–disorder transition. The transition temperature remains almost independent of the thickness from bulk to the monolayer limit with TN ∼ 118 K, indicating that the weak interlayer interaction has little effect on the antiferromagnetic ordering.
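For reference, the Onsager solution the authors compare against gives the exact critical temperature of the square-lattice Ising model with nearest-neighbour coupling J (quoted here as context; the lattice geometry and exchange constants of FePS3 differ in detail):

```latex
\sinh\!\left(\frac{2J}{k_{B}T_{c}}\right)=1
\;\Longrightarrow\;
k_{B}T_{c}=\frac{2J}{\ln\!\left(1+\sqrt{2}\right)}\approx 2.269\,J .
```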

Proceedings Article
01 Jun 2017
TL;DR: In this paper, Gradient Episodic Memory (GEM) is proposed for continual learning, where the model observes, once and one by one, examples concerning a sequence of tasks.
Abstract: One major obstacle towards AI is the poor ability of models to solve new problems quicker, and without forgetting previously acquired knowledge. To better understand this issue, we study the problem of continual learning, where the model observes, once and one by one, examples concerning a sequence of tasks. First, we propose a set of metrics to evaluate models learning over a continuum of data. These metrics characterize models not only by their test accuracy, but also in terms of their ability to transfer knowledge across tasks. Second, we propose a model for continual learning, called Gradient Episodic Memory (GEM) that alleviates forgetting, while allowing beneficial transfer of knowledge to previous tasks. Our experiments on variants of the MNIST and CIFAR-100 datasets demonstrate the strong performance of GEM when compared to the state-of-the-art.
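GEM's core mechanism constrains each gradient step so that losses on episodic memories of past tasks do not increase; with a single past task this reduces to projecting the proposed gradient whenever it conflicts with the memory gradient (the general multi-task case solves a small quadratic program instead). A minimal numpy sketch of the single-constraint case, with toy gradient vectors:

```python
import numpy as np

def gem_project(g, g_mem):
    """Project the current task gradient g so it no longer increases the
    loss on the episodic memory (single-constraint special case of GEM).

    If g already agrees with the memory gradient (non-negative dot product)
    it is returned unchanged; otherwise it is projected onto the constraint
    boundary where the modified gradient is orthogonal to g_mem.
    """
    dot = g @ g_mem
    if dot >= 0:
        return g  # no interference with the past task
    return g - (dot / (g_mem @ g_mem)) * g_mem

# Toy gradients: the current update conflicts with the memory direction.
g = np.array([1.0, -2.0])
g_mem = np.array([1.0, 1.0])
g_tilde = gem_project(g, g_mem)
print(g_tilde, g_tilde @ g_mem)  # projected gradient; zero interference
```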

Journal ArticleDOI
25 Aug 2016-Nature
TL;DR: This paper reported genome-wide ancient DNA from 44 ancient Near Easterners ranging in time between ~12,000 and 1,400 bc, from Natufian hunter-gatherers to Bronze Age farmers, showing that the earliest populations of the Near East derived around half their ancestry from a 'Basal Eurasian' lineage that had little if any Neanderthal admixture and that separated from other non-African lineages before their separation from each other.
Abstract: We report genome-wide ancient DNA from 44 ancient Near Easterners ranging in time between ~12,000 and 1,400 bc, from Natufian hunter–gatherers to Bronze Age farmers. We show that the earliest populations of the Near East derived around half their ancestry from a ‘Basal Eurasian’ lineage that had little if any Neanderthal admixture and that separated from other non-African lineages before their separation from each other. The first farmers of the southern Levant (Israel and Jordan) and Zagros Mountains (Iran) were strongly genetically differentiated, and each descended from local hunter–gatherers. By the time of the Bronze Age, these two populations and Anatolian-related farmers had mixed with each other and with the hunter–gatherers of Europe to greatly reduce genetic differentiation. The impact of the Near Eastern farmers extended beyond the Near East: farmers related to those of Anatolia spread westward into Europe; farmers related to those of the Levant spread southward into East Africa; farmers related to those of Iran spread northward into the Eurasian steppe; and people related to both the early farmers of Iran and to the pastoralists of the Eurasian steppe spread eastward into South Asia.