
Journal ArticleDOI
TL;DR: It is demonstrated that vaginal microbes can be partially restored at birth in C-section–delivered babies.
Abstract: Exposure of newborns to the maternal vaginal microbiota is interrupted with cesarean birthing. Babies delivered by cesarean section (C-section) acquire a microbiota that differs from that of vaginally delivered infants, and C-section delivery has been associated with increased risk for immune and metabolic disorders. Here we conducted a pilot study in which infants delivered by C-section were exposed to maternal vaginal fluids at birth. Similarly to vaginally delivered babies, the gut, oral and skin bacterial communities of these newborns during the first 30 d of life were enriched in vaginal bacteria (which were underrepresented in unexposed C-section-delivered infants), and the microbiome similarity to those of vaginally delivered infants was greater in oral and skin samples than in anal samples. Although the long-term health consequences of restoring the microbiota of C-section-delivered infants remain unclear, our results demonstrate that vaginal microbes can be partially restored at birth in C-section-delivered babies.

710 citations


Posted Content
TL;DR: This paper introduces the first domain adaptive semantic segmentation method: an unsupervised adversarial approach to pixel prediction problems that outperforms baselines across different settings on multiple large-scale datasets.
Abstract: Fully convolutional models for dense prediction have proven successful for a wide range of visual tasks. Such models perform well in a supervised setting, but performance can be surprisingly poor under domain shifts that appear mild to a human observer. For example, training on one city and testing on another in a different geographic region and/or weather condition may result in significantly degraded performance due to pixel-level distribution shift. In this paper, we introduce the first domain adaptive semantic segmentation method, proposing an unsupervised adversarial approach to pixel prediction problems. Our method consists of both global and category specific adaptation techniques. Global domain alignment is performed using a novel semantic segmentation network with fully convolutional domain adversarial learning. This initially adapted space then enables category specific adaptation through a generalization of constrained weak learning, with explicit transfer of the spatial layout from the source to the target domains. Our approach outperforms baselines across different settings on multiple large-scale datasets, including adapting across various real city environments, different synthetic sub-domains, from simulated to real environments, and on a novel large-scale dash-cam dataset.
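The global adversarial alignment described in this abstract is commonly implemented with a gradient-reversal trick: the domain classifier is trained normally, while the gradient flowing back into the feature extractor is sign-flipped so the features become domain-confusing. A minimal sketch of that trick (class and variable names are illustrative, not the paper's code):

```python
import numpy as np

# Toy gradient-reversal layer (GRL): identity on the forward pass,
# sign-flipped (and scaled) gradient on the backward pass, so the feature
# extractor is updated to *increase* the domain-classification loss.
class GradientReversal:
    def __init__(self, lam=1.0):
        self.lam = lam          # trade-off weight for the adversarial term
    def forward(self, x):
        return x                # identity: features pass through unchanged
    def backward(self, grad_out):
        return -self.lam * grad_out  # flip the gradient sent to the features

# Minimal demonstration with a linear "feature extractor" output and a
# logistic domain classifier with weights w (all values synthetic).
x = np.array([[1.0, 2.0], [0.5, -1.0]])   # two feature vectors
w = np.array([0.3, -0.2])                 # domain-classifier weights
grl = GradientReversal(lam=0.5)

feats = grl.forward(x)
logits = feats @ w
# Upstream gradient dL/dfeatures of the logistic domain loss
# (labels: first sample "source" = 1, second "target" = 0):
grad_feats = np.outer(1.0 / (1.0 + np.exp(-logits)) - np.array([1.0, 0.0]), w)
grad_to_extractor = grl.backward(grad_feats)  # reversed gradient
```

In a full pipeline this reversed gradient would be combined with the supervised segmentation gradient when updating the fully convolutional feature extractor.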

710 citations


Journal ArticleDOI
TL;DR: ESGE recommends that the goals of endoscopic mucosal resection (EMR) are to achieve a completely snare-resected lesion in the safest minimum number of pieces, with adequate margins and without need for adjunctive ablative techniques.
Abstract: 1 ESGE recommends cold snare polypectomy (CSP) as the preferred technique for removal of diminutive polyps (size ≤ 5 mm). This technique has high rates of complete resection, adequate tissue sampling for histology, and low complication rates. (High quality evidence, strong recommendation.) 2 ESGE suggests CSP for sessile polyps 6 – 9 mm in size because of its superior safety profile, although evidence comparing efficacy with hot snare polypectomy (HSP) is lacking. (Moderate quality evidence, weak recommendation.) 3 ESGE suggests HSP (with or without submucosal injection) for removal of sessile polyps 10 – 19 mm in size. In most cases deep thermal injury is a potential risk and thus submucosal injection prior to HSP should be considered. (Low quality evidence, strong recommendation.) 4 ESGE recommends HSP for pedunculated polyps. To prevent bleeding in pedunculated colorectal polyps with head ≥ 20 mm or a stalk ≥ 10 mm in diameter, ESGE recommends pretreatment of the stalk with injection of dilute adrenaline and/or mechanical hemostasis. (Moderate quality evidence, strong recommendation.) 5 ESGE recommends that the goals of endoscopic mucosal resection (EMR) are to achieve a completely snare-resected lesion in the safest minimum number of pieces, with adequate margins and without need for adjunctive ablative techniques. (Low quality evidence; strong recommendation.) 6 ESGE recommends careful lesion assessment prior to EMR to identify features suggestive of poor outcome. Features associated with incomplete resection or recurrence include lesion size > 40 mm, ileocecal valve location, prior failed attempts at resection, and size, morphology, site, and access (SMSA) level 4. (Moderate quality evidence; strong recommendation.) 7 For intraprocedural bleeding, ESGE recommends endoscopic coagulation (snare-tip soft coagulation or coagulating forceps) or mechanical therapy, with or without the combined use of dilute adrenaline injection. 
(Low quality evidence, strong recommendation.) An algorithm of polypectomy recommendations according to shape and size of polyps is given ( Fig. 1 ).

710 citations


Journal ArticleDOI
TL;DR: In this article, a weighted model called HRDetect was developed to accurately detect BRCA1/BRCA2-deficient samples with 98.7% sensitivity (area under the curve (AUC) = 0.98).
Abstract: Approximately 1-5% of breast cancers are attributed to inherited mutations in BRCA1 or BRCA2 and are selectively sensitive to poly(ADP-ribose) polymerase (PARP) inhibitors. In other cancer types, germline and/or somatic mutations in BRCA1 and/or BRCA2 (BRCA1/BRCA2) also confer selective sensitivity to PARP inhibitors. Thus, assays to detect BRCA1/BRCA2-deficient tumors have been sought. Recently, somatic substitution, insertion/deletion and rearrangement patterns, or 'mutational signatures', were associated with BRCA1/BRCA2 dysfunction. Herein we used a lasso logistic regression model to identify six distinguishing mutational signatures predictive of BRCA1/BRCA2 deficiency. A weighted model called HRDetect was developed to accurately detect BRCA1/BRCA2-deficient samples. HRDetect identifies BRCA1/BRCA2-deficient tumors with 98.7% sensitivity (area under the curve (AUC) = 0.98). Application of this model in a cohort of 560 individuals with breast cancer, of whom 22 were known to carry a germline BRCA1 or BRCA2 mutation, allowed us to identify an additional 22 tumors with somatic loss of BRCA1 or BRCA2 and 47 tumors with functional BRCA1/BRCA2 deficiency where no mutation was detected. We validated HRDetect on independent cohorts of breast, ovarian and pancreatic cancers and demonstrated its efficacy in alternative sequencing strategies. Integrating all of the classes of mutational signatures thus reveals a larger proportion of individuals with breast cancer harboring BRCA1/BRCA2 deficiency (up to 22%) than hitherto appreciated (∼1-5%) who could have selective therapeutic sensitivity to PARP inhibition.
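The modelling core described here, a lasso (L1-penalised) logistic regression over mutational-signature exposures in which the penalty selects a small set of distinguishing signatures, can be sketched with synthetic data. All features and labels below are made-up stand-ins, not the study's data:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 200
# Six hypothetical signature exposures per tumour (substitution, indel and
# rearrangement signatures in the real model; synthetic here).
X = rng.gamma(shape=2.0, scale=1.0, size=(n, 6))
# Synthetic "BRCA1/BRCA2-deficient" label driven by two of the signatures.
y = (0.8 * X[:, 0] + 1.2 * X[:, 3] + rng.normal(0, 0.3, n) > 4.0).astype(int)

# L1 penalty performs the signature selection, as in the lasso step
# described in the abstract.
model = LogisticRegression(penalty="l1", solver="liblinear", C=0.5)
model.fit(X, y)
selected = np.flatnonzero(model.coef_[0])  # signatures kept by the lasso
```

The sparse coefficient vector plays the role of the "distinguishing mutational signatures"; a weighted score over the selected signatures is then the analogue of the HRDetect probability.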

710 citations


Journal ArticleDOI
27 Jul 2018-Science
TL;DR: Live-cell single-molecule imaging revealed that TF LCDs interact to form local high-concentration hubs at both synthetic DNA arrays and endogenous genomic loci, suggesting that under physiological conditions, rapid, reversible, and selective multivalent LCD-LCD interactions occur between TFs and the RNA Pol II machinery to activate transcription.
Abstract: Many eukaryotic transcription factors (TFs) contain intrinsically disordered low-complexity sequence domains (LCDs), but how these LCDs drive transactivation remains unclear. We used live-cell single-molecule imaging to reveal that TF LCDs form local high-concentration interaction hubs at synthetic and endogenous genomic loci. TF LCD hubs stabilize DNA binding, recruit RNA polymerase II (RNA Pol II), and activate transcription. LCD-LCD interactions within hubs are highly dynamic, display selectivity with binding partners, and are differentially sensitive to disruption by hexanediols. Under physiological conditions, rapid and reversible LCD-LCD interactions occur between TFs and the RNA Pol II machinery without detectable phase separation. Our findings reveal fundamental mechanisms underpinning transcriptional control and suggest a framework for developing single-molecule imaging screens for drugs targeting gene regulatory interactions implicated in disease.

710 citations


Journal ArticleDOI
TL;DR: This article surveys the state of the art regarding computational methods to process social media messages, highlights both their contributions and shortcomings, and methodically examines a series of key subproblems ranging from the detection of events to the creation of actionable and useful summaries.
Abstract: Social media platforms provide active communication channels during mass convergence and emergency events such as disasters caused by natural hazards. As a result, first responders, decision makers, and the public can use this information to gain insight into the situation as it unfolds. In particular, many social media messages communicated during emergencies convey timely, actionable information. Processing social media messages to obtain such information, however, involves solving multiple challenges, including parsing brief and informal messages, handling information overload, and prioritizing different types of information found in messages. These challenges can be mapped to classical information processing operations such as filtering, classifying, ranking, aggregating, extracting, and summarizing. We survey the state of the art regarding computational methods to process social media messages, highlighting both their contributions and shortcomings, and methodically examine a series of key subproblems ranging from the detection of events to the creation of actionable and useful summaries. Research thus far has, to a large extent, produced methods to extract situational awareness information from social media. We conclude with research challenges that go beyond situational awareness and begin to look at supporting decision making and coordinating emergency-response actions.

710 citations


Journal ArticleDOI
TL;DR: Applying the GRADE methodology, the ERS/ESICM/ESCMID/ALAT panel selected seven PICO (population–intervention–comparison–outcome) questions that generated a series of recommendations for HAP/VAP diagnosis, treatment and prevention.
Abstract: The most recent European guidelines and task force reports on hospital-acquired pneumonia (HAP) and ventilator-associated pneumonia (VAP) were published almost 10 years ago. Since then, further randomised clinical trials of HAP and VAP have been conducted and new information has become available. Studies of epidemiology, diagnosis, empiric treatment, response to treatment, new antibiotics or new forms of antibiotic administration and disease prevention have changed old paradigms. In addition, important differences between approaches in Europe and the USA have become apparent. The European Respiratory Society launched a project to develop new international guidelines for HAP and VAP. Other European societies, including the European Society of Intensive Care Medicine and the European Society of Clinical Microbiology and Infectious Diseases, were invited to participate and appointed their representatives. The Latin American Thoracic Association was also invited. A total of 15 experts and two methodologists made up the panel. Three experts from the USA were also invited (Michael S. Niederman, Marin Kollef and Richard Wunderink). Applying the GRADE (Grading of Recommendations, Assessment, Development and Evaluation) methodology, the panel selected seven PICO (population–intervention–comparison–outcome) questions that generated a series of recommendations for HAP/VAP diagnosis, treatment and prevention.

710 citations


Proceedings ArticleDOI
15 Jun 2019
TL;DR: ATOM as discussed by the authors proposes a novel tracking architecture consisting of dedicated target estimation and classification components, which is trained to predict the overlap between the target object and an estimated bounding box.
Abstract: While recent years have witnessed astonishing improvements in visual tracking robustness, the advancements in tracking accuracy have been limited. As the focus has been directed towards the development of powerful classifiers, the problem of accurate target state estimation has been largely overlooked. In fact, most trackers resort to a simple multi-scale search in order to estimate the target bounding box. We argue that this approach is fundamentally limited since target estimation is a complex task, requiring high-level knowledge about the object. We address this problem by proposing a novel tracking architecture, consisting of dedicated target estimation and classification components. High level knowledge is incorporated into the target estimation through extensive offline learning. Our target estimation component is trained to predict the overlap between the target object and an estimated bounding box. By carefully integrating target-specific information, our approach achieves previously unseen bounding box accuracy. We further introduce a classification component that is trained online to guarantee high discriminative power in the presence of distractors. Our final tracking framework sets a new state-of-the-art on five challenging benchmarks. On the new large-scale TrackingNet dataset, our tracker ATOM achieves a relative gain of 15% over the previous best approach, while running at over 30 FPS. Code and models are available at https://github.com/visionml/pytracking.
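Since ATOM's estimation component is trained to predict the overlap between the target object and an estimated bounding box, the regression target is a plain intersection-over-union (IoU) score. A minimal sketch of that computation (boxes as (x1, y1, x2, y2); my own illustration, not the authors' code):

```python
def iou(a, b):
    """Intersection-over-union of two axis-aligned boxes (x1, y1, x2, y2)."""
    # Intersection rectangle
    x1, y1 = max(a[0], b[0]), max(a[1], b[1])
    x2, y2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, x2 - x1) * max(0.0, y2 - y1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0
```

During offline training, a network head would be regressed toward this value for many perturbed candidate boxes; at test time the predicted overlap is maximised with respect to the box coordinates.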

710 citations


Journal ArticleDOI
01 Feb 2015
TL;DR: This article surveys techniques for event detection from Twitter streams aimed at finding real-world occurrences that unfold over space and time, and highlights the need for public benchmarks to evaluate the performance of different detection approaches and various features.
Abstract: Twitter is among the fastest-growing microblogging and online social networking services. Messages posted on Twitter, known as tweets, report everything from daily life stories to the latest local and global news and events. Monitoring and analyzing this rich and continuous user-generated content can yield unprecedentedly valuable information, enabling users and organizations to acquire actionable knowledge. This article provides a survey of techniques for event detection from Twitter streams. These techniques aim at finding real-world occurrences that unfold over space and time. In contrast to conventional media, event detection from Twitter streams poses new challenges. Twitter streams contain large amounts of meaningless messages and polluted content, which negatively affect the detection performance. In addition, traditional text mining techniques are not suitable, because of the short length of tweets, the large number of spelling and grammatical errors, and the frequent use of informal and mixed language. Event detection techniques presented in the literature address these issues by adapting techniques from various fields to the uniqueness of Twitter. This article classifies these techniques according to the event type, detection task, and detection method and discusses commonly used features. Finally, it highlights the need for public benchmarks to evaluate the performance of different detection approaches and various features.
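Many of the feature-based detection methods classified in this survey reduce to flagging terms whose frequency in the current time window "bursts" well above their historical mean. A minimal z-score burstiness sketch (counts and threshold are invented for illustration):

```python
import statistics

def bursty_terms(history, current, z_threshold=3.0):
    """history: {term: [counts in past windows]}, current: {term: count}.
    Returns terms whose current count is a z_threshold-sigma outlier."""
    flagged = []
    for term, counts in history.items():
        mean = statistics.mean(counts)
        stdev = statistics.pstdev(counts) or 1.0  # avoid division by zero
        z = (current.get(term, 0) - mean) / stdev
        if z >= z_threshold:
            flagged.append(term)
    return flagged

# Synthetic per-window term counts: "earthquake" spikes, "coffee" is steady.
history = {"earthquake": [2, 3, 2, 4, 3], "coffee": [50, 48, 52, 51, 49]}
current = {"earthquake": 40, "coffee": 53}
```

Real systems layer clustering, spatial signals, and tweet-specific text normalization on top of such a detector, but the thresholding idea is the common core.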

710 citations


Journal ArticleDOI
24 May 2019-Science
TL;DR: By a process of complete delignification and densification of wood, a structural material with a mechanical strength of 404.3 megapascals is developed, more than eight times that of natural wood; its cellulose nanofibers backscatter solar radiation and emit strongly in mid-infrared wavelengths, resulting in continuous subambient cooling during both day and night.
Abstract: Reducing human reliance on energy-inefficient cooling methods such as air conditioning would have a large impact on the global energy landscape. By a process of complete delignification and densification of wood, we developed a structural material with a mechanical strength of 404.3 megapascals, more than eight times that of natural wood. The cellulose nanofibers in our engineered material backscatter solar radiation and emit strongly in mid-infrared wavelengths, resulting in continuous subambient cooling during both day and night. We model the potential impact of our cooling wood and find energy savings between 20 and 60%, which is most pronounced in hot and dry climates.

710 citations


Journal ArticleDOI
TL;DR: Facing Covid-19 in Italy Physicians in northern Italy have learned some painful lessons about rationing care during an epidemic, and as health care systems work out ethical allocation principles, it is clear that more needs to be done to protect against future outbreaks of disease.
Abstract: Facing Covid-19 in Italy Physicians in northern Italy have learned some painful lessons about rationing care during an epidemic. As health care systems work out ethical allocation principles, it is clear that more needs to be done to protect against future outbreaks of disease.

Journal ArticleDOI
TL;DR: It is demonstrated that cells release distinct exosome subpopulations with unique compositions that elicit differential effects on recipient cells; further dissection of this heterogeneity will advance the understanding of exosomal biology in health and disease and accelerate the development of exosome-based diagnostics and therapeutics.
Abstract: Cells release nano-sized membrane vesicles that are involved in intercellular communication by transferring biological information between cells. It is generally accepted that cells release at least three types of extracellular vesicles (EVs): apoptotic bodies, microvesicles and exosomes. While a wide range of putative biological functions have been attributed to exosomes, they are assumed to represent a homogenous population of EVs. We hypothesized the existence of subpopulations of exosomes with defined molecular compositions and biological properties. Density gradient centrifugation of isolated exosomes revealed the presence of two distinct subpopulations, differing in biophysical properties and their proteomic and RNA repertoires. Interestingly, the subpopulations mediated differential effects on the gene expression programmes in recipient cells. In conclusion, we demonstrate that cells release distinct exosome subpopulations with unique compositions that elicit differential effects on recipient cells. Further dissection of exosome heterogeneity will advance our understanding of exosomal biology in health and disease and accelerate the development of exosome-based diagnostics and therapeutics.

Journal ArticleDOI
TL;DR: This Perspective proposes a precision medicine strategy for chronic airway diseases in general, and asthma and COPD in particular, and discusses the concept of “treatable traits” as a way towards precision medicine of chronic airway diseases.
Abstract: Asthma and chronic obstructive pulmonary disease (COPD) are two prevalent chronic airway diseases that have a high personal and social impact. They likely represent a continuum of different diseases that may share biological mechanisms (i.e. endotypes), and present similar clinical, functional, imaging and/or biological features that can be observed (i.e. phenotypes) which require individualised treatment. Precision medicine is defined as "treatments targeted to the needs of individual patients on the basis of genetic, biomarker, phenotypic, or psychosocial characteristics that distinguish a given patient from other patients with similar clinical presentations". In this Perspective, we propose a precision medicine strategy for chronic airway diseases in general, and asthma and COPD in particular.

Journal ArticleDOI
TL;DR: The name of MSCs should be changed to Medicinal Signaling Cells to more accurately reflect the fact that these cells home in on sites of injury or disease and secrete bioactive factors that are immunomodulatory and trophic (regenerative), meaning that these cells make therapeutic drugs in situ that are medicinal.
Abstract: Mesenchymal stem cells (MSCs) were officially named more than 25 years ago to represent a class of cells from human and mammalian bone marrow and periosteum that could be isolated and expanded in culture while maintaining their in vitro capacity to be induced to form a variety of mesodermal phenotypes and tissues. The in vitro capacity to form bone, cartilage, fat, etc., became an assay for identifying this class of multipotent cells and around which several companies were formed in the 1990s to medically exploit the regenerative capabilities of MSCs. Today, there are hundreds of clinics and hundreds of clinical trials using human MSCs with very few, if any, focusing on the in vitro multipotential capacities of these cells. Unfortunately, the fact that MSCs are called "stem cells" is being used to infer that patients will receive direct medical benefit, because they imagine that these cells will differentiate into regenerating tissue-producing cells. Such a stem cell treatment will presumably cure the patient of their medically relevant difficulties ranging from osteoarthritic (bone-on-bone) knees to various neurological maladies including dementia. I now urge that we change the name of MSCs to Medicinal Signaling Cells to more accurately reflect the fact that these cells home in on sites of injury or disease and secrete bioactive factors that are immunomodulatory and trophic (regenerative) meaning that these cells make therapeutic drugs in situ that are medicinal. It is, indeed, the patient's own site-specific and tissue-specific resident stem cells that construct the new tissue as stimulated by the bioactive factors secreted by the exogenously supplied MSCs. Stem Cells Translational Medicine 2017;6:1445-1451.

Journal ArticleDOI
TL;DR: Pembrolizumab improved progression-free survival and overall survival versus ipilimumab in patients with advanced melanoma and is now a standard of care in the first-line setting, however, the optimal duration of anti-PD-1 administration is unknown.
Abstract: Background: Pembrolizumab improved progression-free survival and overall survival versus ipilimumab in patients with advanced melanoma and is now a standard of care in the first-line setting. However, the optimal duration of anti-PD-1 administration is unknown. We present results from 5 years of follow-up of patients in KEYNOTE-006. Methods: KEYNOTE-006 was an open-label, multicentre, randomised, controlled, phase 3 study done at 87 academic institutions, hospitals, and cancer centres in 16 countries. Patients aged at least 18 years with Eastern Cooperative Oncology Group performance status of 0 or 1, ipilimumab-naive histologically confirmed advanced melanoma with known BRAFV600 status and up to one previous systemic therapy were randomly assigned (1:1:1) to intravenous pembrolizumab 10 mg/kg every 2 weeks or every 3 weeks or four doses of intravenous ipilimumab 3 mg/kg every 3 weeks. Treatments were assigned using a centralised, computer-generated allocation schedule with blocked randomisation within strata. Exploratory combination of data from the two pembrolizumab dosing regimen groups was not protocol-specified. Pembrolizumab treatment continued for up to 24 months. Eligible patients who discontinued pembrolizumab with stable disease or better after receiving at least 24 months of pembrolizumab or discontinued with complete response after at least 6 months of pembrolizumab and then progressed could receive an additional 17 cycles of pembrolizumab. Co-primary endpoints were overall survival and progression-free survival. Efficacy was analysed in all randomly assigned patients, and safety was analysed in all randomly assigned patients who received at least one dose of study treatment. Exploratory assessment of efficacy and safety at 5 years' follow-up was not specified in the protocol. Data cutoff for this analysis was Dec 3, 2018. Recruitment is closed; the study is ongoing. This study is registered with ClinicalTrials.gov, number NCT01866319.
Findings: Between Sept 18, 2013, and March 3, 2014, 834 patients were enrolled and randomly assigned to receive pembrolizumab (every 2 weeks, n=279; every 3 weeks, n=277), or ipilimumab (n=278). After a median follow-up of 57·7 months (IQR 56·7–59·2) in surviving patients, median overall survival was 32·7 months (95% CI 24·5–41·6) in the combined pembrolizumab groups and 15·9 months (13·3–22·0) in the ipilimumab group (hazard ratio [HR] 0·73, 95% CI 0·61–0·88, p=0·00049). Median progression-free survival was 8·4 months (95% CI 6·6–11·3) in the combined pembrolizumab groups versus 3·4 months (2·9–4·2) in the ipilimumab group (HR 0·57, 95% CI 0·48–0·67). Interpretation: Pembrolizumab continued to show superiority over ipilimumab after almost 5 years of follow-up. These results provide further support for use of pembrolizumab in patients with advanced melanoma. Funding: Merck Sharp & Dohme.

Posted Content
TL;DR: The adoption of a reconfigurable intelligent surface (RIS) for downlink multi-user communication from a multi-antenna base station is investigated and the results show that the proposed RIS-based resource allocation methods are able to provide up to 300% higher energy efficiency in comparison with the use of regular multi-antenna amplify-and-forward relaying.
Abstract: The adoption of a Reconfigurable Intelligent Surface (RIS) for downlink multi-user communication from a multi-antenna base station is investigated in this paper. We develop energy-efficient designs for both the transmit power allocation and the phase shifts of the surface reflecting elements, subject to individual link budget guarantees for the mobile users. This leads to non-convex design optimization problems for which to tackle we propose two computationally affordable approaches, capitalizing on alternating maximization, gradient descent search, and sequential fractional programming. Specifically, one algorithm employs gradient descent for obtaining the RIS phase coefficients, and fractional programming for optimal transmit power allocation. Instead, the second algorithm employs sequential fractional programming for the optimization of the RIS phase shifts. In addition, a realistic power consumption model for RIS-based systems is presented, and the performance of the proposed methods is analyzed in a realistic outdoor environment. In particular, our results show that the proposed RIS-based resource allocation methods are able to provide up to 300% higher energy efficiency, in comparison with the use of regular multi-antenna amplify-and-forward relaying.
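For a single user, the phase-shift subproblem has a simple closed-form instance that conveys the idea behind the optimization: each reflecting element's phase is chosen so that its reflected path adds coherently with the direct path, maximising the effective channel gain. The sketch below uses synthetic channels and illustrates only this alignment principle, not the paper's multi-user gradient-descent / fractional-programming algorithms:

```python
import numpy as np

rng = np.random.default_rng(1)
N = 32                                  # number of RIS reflecting elements
h_d = rng.normal() + 1j * rng.normal()  # direct BS -> user channel (toy)
g = rng.normal(size=N) + 1j * rng.normal(size=N)    # BS -> RIS channels
h_r = rng.normal(size=N) + 1j * rng.normal(size=N)  # RIS -> user channels

def gain(theta):
    """Effective channel gain |h_d + sum_i g_i h_r_i e^{j theta_i}|^2."""
    return abs(h_d + np.sum(g * h_r * np.exp(1j * theta))) ** 2

theta0 = np.zeros(N)                            # unoptimised surface
# Coherent alignment: rotate each reflected path onto the direct path.
theta_star = np.angle(h_d) - np.angle(g * h_r)
```

With these phases every term e^{j theta_i} g_i h_r_i has the same argument as h_d, so the gain reaches its maximum (|h_d| + sum_i |g_i h_r_i|)^2; in the multi-user setting no such closed form exists, which is why the paper resorts to alternating maximization.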

Journal ArticleDOI
TL;DR: This review addresses the importance of homocysteine in the nervous system, specifically how this amino acid may trigger the release of catecholamines, and highlights some of the controversies associated with hyperhomocysteinemia-induced cardiovascular problems.
Abstract: It is well known that neuronal damage following a stroke has been attributed to the over-stimulation of excitatory amino acids such as glutamate and aspartate through activation of NMDA receptors. The brain is exposed to most of the constituents of plasma, including homocysteine, as a result of the disruption of the blood–brain barrier after stroke, head trauma and stress. The question therefore arises as to whether or not homocysteine is able to selectively stimulate the release of excitatory amino acids in stroke. This review article addresses the importance of homocysteine in the nervous system, specifically how this amino acid may trigger the release of catecholamines. Our data thus strengthen the view that such a mechanism underlies the association of hyperhomocysteinemia with increased brain lesions in stroke. As the hypothalamus also controls cardiac function via the sympathetic system, the contractility of the heart will be compromised. Homocysteine is also known to mediate cardiovascular problems through its adverse effects on the cardiovascular endothelium and smooth muscle cells, with resultant alterations in subclinical arterial structure and function. The present review thus summarizes both the central and peripheral effects of homocysteine and highlights some of the controversies associated with hyperhomocysteinemia-induced cardiovascular problems.


Journal ArticleDOI
TL;DR: This study found that adjuvant chemotherapy with S-1 can be a new standard of care for resected pancreatic cancer in Japanese patients; overall and relapse-free survival were estimated using the Kaplan-Meier method, and non-inferiority of S-1 to gemcitabine was assessed using the Cox proportional hazards model.
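The survival analyses mentioned in the summary rest on the Kaplan-Meier estimator, which multiplies the conditional survival fractions at each event time. A minimal pure-Python sketch on synthetic (time, event) data, where event=1 marks relapse/death and event=0 marks censoring (not the study's data or software):

```python
def kaplan_meier(times, events):
    """Return [(t, S(t))] at each event time, via the product-limit formula."""
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    surv, curve = 1.0, []
    i = 0
    while i < len(data):
        t = data[i][0]
        deaths = sum(1 for tt, e in data if tt == t and e == 1)
        ties = sum(1 for tt, _ in data if tt == t)
        if deaths:
            surv *= 1.0 - deaths / n_at_risk  # conditional survival at t
            curve.append((t, surv))
        n_at_risk -= ties                     # remove events and censorings
        i += ties
    return curve

# Synthetic follow-up times in months: events at 6, 10, 15, 25;
# censored observations at 7 and 19.
times = [6, 7, 10, 15, 19, 25]
events = [1, 0, 1, 1, 0, 1]
```

The Cox proportional hazards model used for the non-inferiority comparison then relates the hazard underlying such curves to treatment arm via a hazard ratio, which this sketch does not attempt to reproduce.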

Journal ArticleDOI
TL;DR: In this paper, the authors present a survey of Majorana zero modes and their applications in nuclear, particle, and solid-state physics, as well as a discussion of the possibilities of finding a fermion that is its own antiparticle.
Abstract: Ettore Majorana (1906--1938) disappeared while traveling by ship from Palermo to Naples in 1938. His fate has never been fully resolved and several articles have been written that explore the mystery itself. His demise intrigues us still today because of his seminal work, published the previous year, that established symmetric solutions to the Dirac equation that describe a fermionic particle that is its own antiparticle. This work has long had a significant impact in neutrino physics, where this fundamental question regarding the particle remains unanswered. But the formalism he developed has found many uses as there are now a number of candidate spin-1/2 neutral particles that may be truly neutral with no quantum number to distinguish them from their antiparticles. If such particles exist, they will influence many areas of nuclear and particle physics. Most notably the process of neutrinoless double beta decay can exist only if neutrinos are massive Majorana particles. Hence, many efforts to search for this process are underway. Majorana's influence does not stop with particle physics, however, even though that was his original consideration. The equations he derived also arise in solid-state physics where they describe electronic states in materials with superconducting order. Of special interest here is the class of solutions of the Majorana equation in one and two spatial dimensions at exactly zero energy. These Majorana zero modes are endowed with some remarkable physical properties that may lead to advances in quantum computing and, in fact, there is evidence that they have been experimentally observed. This Colloquium first summarizes the basics of Majorana's theory and its implications. It then provides an overview of the rich experimental programs trying to find a fermion that is its own antiparticle in nuclear, particle, and solid-state physics.
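The Majorana zero modes discussed in this Colloquium already appear in the simplest toy model, the Kitaev chain: at its "sweet spot" (chemical potential mu = 0, hopping t equal to pairing Delta), diagonalising the Bogoliubov-de Gennes (BdG) Hamiltonian of an open chain yields two exactly zero-energy modes localised at its ends. An illustrative numerical sketch, not a model of any specific experiment:

```python
import numpy as np

def kitaev_bdg(n_sites, mu, t, delta):
    """BdG matrix of an open Kitaev chain in the (c, c^dagger) Nambu basis."""
    h = np.zeros((n_sites, n_sites))
    d = np.zeros((n_sites, n_sites))
    np.fill_diagonal(h, -mu)
    for i in range(n_sites - 1):
        h[i, i + 1] = h[i + 1, i] = -t            # nearest-neighbour hopping
        d[i, i + 1], d[i + 1, i] = delta, -delta  # p-wave pairing (antisymmetric)
    # Hermitian BdG matrix; the 1/2 avoids double counting in the Nambu basis.
    return 0.5 * np.block([[h, d], [-d, -h]])

# Sweet spot mu = 0, t = delta = 1: two exact zero modes, gapped bulk.
energies = np.sort(np.abs(np.linalg.eigvalsh(kitaev_bdg(20, 0.0, 1.0, 1.0))))
```

Moving mu away from zero (while |mu| < 2t) splits the two end modes only by an amount exponentially small in the chain length, which is the property that makes such modes attractive for quantum computing.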

Proceedings ArticleDOI
14 Jun 2020
TL;DR: This work proposes a differentiable rendering formulation for implicit shape and texture representations, showing that depth gradients can be derived analytically using the concept of implicit differentiation, and finds that this method can be used for multi-view 3D reconstruction, directly resulting in watertight meshes.
Abstract: Learning-based 3D reconstruction methods have shown impressive results. However, most methods require 3D supervision which is often hard to obtain for real-world datasets. Recently, several works have proposed differentiable rendering techniques to train reconstruction models from RGB images. Unfortunately, these approaches are currently restricted to voxel- and mesh-based representations, suffering from discretization or low resolution. In this work, we propose a differentiable rendering formulation for implicit shape and texture representations. Implicit representations have recently gained popularity as they represent shape and texture continuously. Our key insight is that depth gradients can be derived analytically using the concept of implicit differentiation. This allows us to learn implicit shape and texture representations directly from RGB images. We experimentally show that our single-view reconstructions rival those learned with full 3D supervision. Moreover, we find that our method can be used for multi-view 3D reconstruction, directly resulting in watertight meshes.
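The key insight stated in this abstract, that depth gradients follow analytically from implicit differentiation, can be checked numerically: for a surface f(p, theta) = 0 intersected by a ray p = o + d*r, differentiating f(o + d*r, theta) = 0 in theta gives dd/dtheta = -(df/dtheta) / (grad_p f . r). The toy check below uses a sphere signed-distance function with the radius R as the shape parameter (my own setup, not the authors' code):

```python
import numpy as np

def sphere_sdf(p, R):
    """Signed distance to a sphere of radius R centred at the origin."""
    return np.linalg.norm(p) - R

def ray_march(o, r, R, steps=100):
    """Sphere tracing: step along the ray by the SDF until the surface."""
    d = 0.0
    for _ in range(steps):
        d += sphere_sdf(o + d * r, R)
    return d

o, r, R = np.array([0.0, 0.0, -3.0]), np.array([0.0, 0.0, 1.0]), 1.0
d = ray_march(o, r, R)             # depth along the ray
p = o + d * r                      # surface point
grad_p = p / np.linalg.norm(p)     # spatial gradient of the sphere SDF
df_dR = -1.0                       # df/dtheta for f(p, R) = |p| - R
dd_dR = -df_dR / grad_p.dot(r)     # implicit-differentiation formula
```

For this geometry the depth is d(R) = 3 - R, so the analytic gradient dd/dR = -1 can be verified against the formula without any finite differencing, which is exactly what makes the approach attractive for learning shape parameters from RGB losses.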

Journal ArticleDOI
TL;DR: This study evaluates the Aerodyne high-resolution time-of-flight aerosol mass spectrometer (HR-ToF-AMS), widely used to measure OA elemental composition, and develops an improved elemental analysis method that corrects systematic low biases in O : C and H : C ratios.
Abstract: Elemental compositions of organic aerosol (OA) particles provide useful constraints on OA sources, chemical evolution, and effects. The Aerodyne high-resolution time-of-flight aerosol mass spectrometer (HR-ToF-AMS) is widely used to measure OA elemental composition. This study evaluates AMS measurements of atomic oxygen-to-carbon (O : C), hydrogen-to-carbon (H : C), and organic mass-to-organic carbon (OM : OC) ratios, and of the carbon oxidation state (OS_C), for a vastly expanded laboratory data set of multifunctional oxidized OA standards. For the expanded standard data set, the method introduced by Aiken et al. (2008), which uses experimentally measured ion intensities at all ions to determine elemental ratios (referred to here as "Aiken-Explicit"), reproduces known O : C and H : C ratio values within 20% (average absolute value of relative errors) and 12%, respectively. The more commonly used method, which uses empirically estimated H2O+ and CO+ ion intensities to avoid gas-phase air interferences at these ions (referred to here as "Aiken-Ambient"), reproduces O : C and H : C of multifunctional oxidized species within 28% and 14% of known values, respectively. The values from the latter method are systematically biased low, however, with larger biases observed for alcohols and simple diacids. A detailed examination of the H2O+, CO+, and CO2+ fragments in the high-resolution mass spectra of the standard compounds indicates that the Aiken-Ambient method underestimates the CO+ and especially the H2O+ produced from many oxidized species. Combined AMS–vacuum ultraviolet (VUV) ionization measurements indicate that these ions are produced by dehydration and decarboxylation on the AMS vaporizer (usually operated at 600 °C). Thermal decomposition is observed to be efficient at vaporizer temperatures down to 200 °C. These results are used together to develop an "Improved-Ambient" elemental analysis method for AMS spectra measured in air.
The Improved-Ambient method uses specific ion fragments as markers to correct for molecular functionality-dependent systematic biases and reproduces known O : C (H : C) ratios of individual oxidized standards within 28% (13%) of the known molecular values. The error in Improved-Ambient O : C (H : C) values is smaller for theoretical standard mixtures of the oxidized organic standards, which are more representative of the complex mix of species present in ambient OA. For ambient OA, the Improved-Ambient method produces O : C (H : C) values that are 27% (11%) larger than previously published Aiken-Ambient values; a corresponding increase of 9% is observed for OM : OC values. These results imply that ambient OA has a higher relative oxygen content than previously estimated. The OS_C values calculated for ambient OA by the two methods agree well, however (average relative difference of 0.06 OS_C units). This indicates that OS_C is a more robust metric of oxidation than O : C, likely since OS_C is not affected by hydration or dehydration, either in the atmosphere or during analysis.
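The carbon oxidation state used above is commonly approximated from the elemental ratios as OS_C ≈ 2·(O:C) − (H:C). A minimal sketch of why the two methods' OS_C values agree even though the ratios shift (the ratio values below are hypothetical, not data from this study):

```python
def oxidation_state_of_carbon(o_to_c, h_to_c):
    """Approximate mean carbon oxidation state from elemental ratios:
    OS_C ~ 2*(O:C) - (H:C)."""
    return 2.0 * o_to_c - h_to_c

# Hypothetical Aiken-Ambient values for an ambient OA sample.
o_c_aiken, h_c_aiken = 0.40, 1.50

# Apply the reported average Improved-Ambient shifts: O:C +27%, H:C +11%.
o_c_impr = o_c_aiken * 1.27
h_c_impr = h_c_aiken * 1.11

os_c_aiken = oxidation_state_of_carbon(o_c_aiken, h_c_aiken)
os_c_impr = oxidation_state_of_carbon(o_c_impr, h_c_impr)
print(round(os_c_aiken, 3), round(os_c_impr, 3))
```

Because the O:C and H:C corrections push OS_C in opposite directions, the two OS_C estimates differ by only a few hundredths of a unit here, consistent with the small average difference reported above.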

Journal ArticleDOI
TL;DR: Statistics draws population inferences from a sample, and machine learning finds generalizable predictive patterns that can be applied to solve puzzles in medicine and science.
Abstract: Statistics draws population inferences from a sample, and machine learning finds generalizable predictive patterns.

Proceedings ArticleDOI
15 Jun 2019
TL;DR: Noise2Void is introduced, a training scheme that allows us to train directly on the body of data to be denoised and can therefore be applied when other methods cannot, and compares favorably to training-free denoising methods.
Abstract: The field of image denoising is currently dominated by discriminative deep learning methods that are trained on pairs of noisy input and clean target images. Recently it has been shown that such methods can also be trained without clean targets. Instead, independent pairs of noisy images can be used, in an approach known as Noise2Noise (N2N). Here, we introduce Noise2Void (N2V), a training scheme that takes this idea one step further. It does not require noisy image pairs, nor clean target images. Consequently, N2V allows us to train directly on the body of data to be denoised and can therefore be applied when other methods cannot. Especially interesting is the application to biomedical image data, where the acquisition of training targets, clean or noisy, is frequently not possible. We compare the performance of N2V to approaches that have either clean target images and/or noisy image pairs available. Intuitively, N2V cannot be expected to outperform methods that have more information available during training. Still, we observe that the denoising performance of Noise2Void drops in moderation and compares favorably to training-free denoising methods.
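The core of the N2V scheme is a blind-spot trick: a few pixels are masked in the input and the network is trained to predict their original values from the surrounding receptive field only, so it can never learn the identity mapping. A minimal pure-Python sketch of the masking step (function and parameter names are ours, not from the paper):

```python
import random

def n2v_mask(image, num_pixels, rng=random.Random(0)):
    """Blind-spot masking: replace randomly chosen pixels with a random
    neighbor's value, and record the positions and original values that
    the network is trained to predict at those positions."""
    h, w = len(image), len(image[0])
    masked = [row[:] for row in image]
    targets = []
    for _ in range(num_pixels):
        y, x = rng.randrange(h), rng.randrange(w)
        # Pick a neighbor inside a 5x5 window, excluding the pixel itself.
        while True:
            dy, dx = rng.randint(-2, 2), rng.randint(-2, 2)
            ny, nx = y + dy, x + dx
            if (dy, dx) != (0, 0) and 0 <= ny < h and 0 <= nx < w:
                break
        targets.append((y, x, image[y][x]))  # original value = training target
        masked[y][x] = image[ny][nx]         # blind out the input pixel
    return masked, targets

# Toy 8x8 "image" with distinct pixel values.
img = [[float(r * 8 + c) for c in range(8)] for r in range(8)]
masked, targets = n2v_mask(img, num_pixels=5)
```

The training loss is then evaluated only at the recorded positions, which is what lets N2V train on single noisy images with no clean or paired targets.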

Journal ArticleDOI
TL;DR: In this article, the authors investigate the stability and electronic properties of the honeycomb structure of the arsenic system (arsenene) based on first-principles calculations and find that both buckled and puckered arsenenes possess indirect gaps.
Abstract: Recently, phosphorene, a monolayer honeycomb structure of black phosphorus, was experimentally manufactured and has attracted rapidly growing interest. Motivated by phosphorene, here we investigate the stability and electronic properties of the honeycomb structure of the arsenic system based on first-principles calculations. Two types of honeycomb structures, buckled and puckered, are found to be stable. We call them arsenenes, as in the case of phosphorene. We find that both buckled and puckered arsenenes possess indirect gaps. We show that the band gap of puckered and buckled arsenenes can be tuned by applying strain. The gap closing occurs at 6% strain for puckered arsenene, where the bond angles between the nearest neighbors become nearly equal. An indirect-to-direct gap transition occurs by applying strain. Specifically, 1% strain is enough to transform puckered arsenene into a direct-gap semiconductor. We note that a bulk form of arsenic called gray arsenic exists which can be used as a precursor for buckled arsenene. Our results will pave the way for applications to light-emitting diodes and solar cells.

Journal ArticleDOI
TL;DR: Cardiorespiratory training and, to a lesser extent, mixed training reduce disability during or after usual stroke care; this could be mediated by improved mobility and balance.
Abstract: Stroke patients have impaired physical fitness and this may exacerbate their disability. It is not known whether improving physical fitness after stroke reduces disability. Objectives The primary aims were to establish whether physical fitness training reduces death, dependence and disability after stroke. The secondary aims included an investigation of the effects of fitness training on secondary outcome measures (including, physical fitness, mobility, physical function, health and quality of life, mood and the incidence of adverse events). Randomised controlled trials were included when an intervention represented a clear attempt to improve either muscle strength and/or cardiorespiratory fitness, and whose control groups comprised either usual care or a non-exercise intervention. A total of 12 trials were included in the review. No trials reported death and dependence data. Two small trials reporting disability showed no evidence of benefit. The remaining available secondary outcome data suggest that cardiorespiratory training improves walking ability (mobility). Observed benefits appear to be associated with specific or 'task-related' training.

Journal ArticleDOI
TL;DR: A massive analysis of the impact of lockdown measures introduced in response to the spread of novel coronavirus disease 2019 (COVID-19) on socioeconomic conditions of Italian citizens is presented and evidence of a segregation effect is found, since mobility contraction is stronger in municipalities in which inequality is higher and for those where individuals have lower income per capita.
Abstract: In response to the coronavirus disease 2019 (COVID-19) pandemic, several national governments have applied lockdown restrictions to reduce the infection rate. Here we perform a massive analysis on near-real-time Italian mobility data provided by Facebook to investigate how lockdown strategies affect economic conditions of individuals and local governments. We model the change in mobility as an exogenous shock similar to a natural disaster. We identify two ways through which mobility restrictions affect Italian citizens. First, we find that the impact of lockdown is stronger in municipalities with higher fiscal capacity. Second, we find evidence of a segregation effect, since mobility contraction is stronger in municipalities in which inequality is higher and for those where individuals have lower income per capita. Our results highlight both the social costs of lockdown and a challenge of unprecedented intensity: On the one hand, the crisis is inducing a sharp reduction of fiscal revenues for both national and local governments; on the other hand, a significant fiscal effort is needed to sustain the most fragile individuals and to mitigate the increase in poverty and inequality induced by the lockdown.

Journal ArticleDOI
19 Nov 2018
TL;DR: Despite Python's rapid growth and strong data-science ecosystem, it remains far behind the R programming language for general statistics, which is why many scientists still rely heavily on R for their statistical analyses.
Abstract: Python is currently the fastest growing programming language in the world, thanks to its ease of use, fast learning curve and its numerous high-quality packages for data science and machine learning. Surprisingly, however, Python is far behind the R programming language when it comes to general statistics, and for this reason many scientists still rely heavily on R to perform their statistical analyses.

Journal ArticleDOI
TL;DR: The notion of self-branding has drawn myriad academic responses over the last decade and has been criticised by some academic researchers.
Abstract: The notion of self-branding has drawn myriad academic responses over the last decade. First popularised in a provocative piece published in Fast Company, self-branding has been criticised by some o...

Proceedings ArticleDOI
18 Jun 2018
TL;DR: This work proposes a Densely Connected Pyramid Dehazing Network (DCPDN) which can jointly learn the transmission map, atmospheric light and dehazing all together.
Abstract: We propose a new end-to-end single image dehazing method, called Densely Connected Pyramid Dehazing Network (DCPDN), which can jointly learn the transmission map, atmospheric light and dehazing all together. The end-to-end learning is achieved by directly embedding the atmospheric scattering model into the network, thereby ensuring that the proposed method strictly follows the physics-driven scattering model for dehazing. Inspired by the dense network that can maximize the information flow along features from different levels, we propose a new edge-preserving densely connected encoder-decoder structure with multi-level pyramid pooling module for estimating the transmission map. This network is optimized using a newly introduced edge-preserving loss function. To further incorporate the mutual structural information between the estimated transmission map and the dehazed result, we propose a joint-discriminator based on generative adversarial network framework to decide whether the corresponding dehazed image and the estimated transmission map are real or fake. An ablation study is conducted to demonstrate the effectiveness of each module evaluated at both estimated transmission map and dehazed result. Extensive experiments demonstrate that the proposed method achieves significant improvements over the state-of-the-art methods. Code and dataset are made available at: https://github.com/hezhangsprinter/DCPDN
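The atmospheric scattering model the network embeds is the standard I(x) = J(x)·t(x) + A·(1 − t(x)), where J is the clean scene radiance, t the transmission map and A the atmospheric light; given estimates of t and A, the dehazed image follows in closed form. A minimal pure-Python sketch of that relation and its inversion (toy values, not the paper's learned estimates):

```python
def haze(J, t, A):
    """Atmospheric scattering model, per pixel: I = J*t + A*(1 - t)."""
    return [[j * tr + A * (1.0 - tr) for j, tr in zip(rJ, rt)]
            for rJ, rt in zip(J, t)]

def dehaze(I, t, A, t_min=0.1):
    """Invert the model: J = (I - A) / max(t, t_min) + A.
    The lower bound t_min avoids division blow-up where transmission ~ 0."""
    return [[(i - A) / max(tr, t_min) + A for i, tr in zip(rI, rt)]
            for rI, rt in zip(I, t)]

J = [[0.2, 0.8], [0.5, 0.1]]   # clean radiance (toy 2x2 image)
t = [[0.9, 0.6], [0.3, 0.7]]   # transmission map
A = 1.0                         # atmospheric light
I = haze(J, t, A)               # synthesize haze
J_rec = dehaze(I, t, A)         # closed-form recovery
```

DCPDN estimates t and A with learned sub-networks and backpropagates through exactly this relation, which is what ties the learned components to the physics.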