
Journal ArticleDOI
TL;DR: This review discusses the main gut microorganisms, particularly bacteria, and microbial pathways associated with the metabolism of dietary carbohydrates, proteins, plant polyphenols, bile acids, and vitamins, and the methodologies, existing and novel, that can be employed to explore gut microbial pathways of metabolism.
Abstract: The diverse microbial community that inhabits the human gut has an extensive metabolic repertoire that is distinct from, but complements, the activity of mammalian enzymes in the liver and gut mucosa, and includes functions essential for host digestion. As such, the gut microbiota is a key factor in shaping the biochemical profile of the diet and, therefore, its impact on host health and disease. The important role that the gut microbiota appears to play in human metabolism and health has stimulated research into the identification of specific microorganisms involved in different processes, and the elucidation of metabolic pathways, particularly those associated with metabolism of dietary components and some host-generated substances. In the first part of the review, we discuss the main gut microorganisms, particularly bacteria, and microbial pathways associated with the metabolism of dietary carbohydrates (to short chain fatty acids and gases), proteins, plant polyphenols, bile acids, and vitamins. The second part of the review focuses on the methodologies, existing and novel, that can be employed to explore gut microbial pathways of metabolism. These include mathematical models, omics techniques, isolated microbes, and enzyme assays.

1,294 citations


Journal ArticleDOI
TL;DR: The MARS is a simple, objective, and reliable tool for classifying and assessing the quality of mobile health apps and can also be used to provide a checklist for the design and development of new high quality health apps.
Abstract: Background: The use of mobile apps for health and well-being promotion has grown exponentially in recent years. Yet, there is currently no app-quality assessment tool beyond “star”-ratings. Objective: The objective of this study was to develop a reliable, multidimensional measure for trialling, classifying, and rating the quality of mobile health apps. Methods: A literature search was conducted to identify articles containing explicit Web or app quality rating criteria published between January 2000 and January 2013. Existing criteria for the assessment of app quality were categorized by an expert panel to develop the new Mobile App Rating Scale (MARS) subscales, items, descriptors, and anchors. Sixty well-being apps were randomly selected using an iTunes search for MARS rating; ten were used to pilot the rating procedure, and the remaining 50 provided data on interrater reliability. Results: A total of 372 explicit criteria for assessing Web or app quality were extracted from 25 published papers, conference proceedings, and Internet resources. Five broad categories of criteria were identified, comprising four objective quality scales (engagement, functionality, aesthetics, and information quality) and one subjective quality scale; these were refined into the 23-item MARS. The MARS demonstrated excellent internal consistency (alpha = .90) and interrater reliability (intraclass correlation coefficient, ICC = .79). Conclusions: The MARS is a simple, objective, and reliable tool for classifying and assessing the quality of mobile health apps. It can also be used to provide a checklist for the design and development of new high-quality health apps.
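As a side note on the reliability statistic reported above, interrater agreement is expressed as an intraclass correlation coefficient. A minimal sketch of one common formulation, ICC(2,1) (two-way random effects, absolute agreement, single rater), is below; the rating matrix is invented for illustration and is not the MARS data:

```python
import numpy as np

def icc_2_1(ratings):
    """ICC(2,1): two-way random effects, absolute agreement, single rater.

    ratings: (n_subjects, k_raters) array of scores.
    """
    ratings = np.asarray(ratings, dtype=float)
    n, k = ratings.shape
    grand = ratings.mean()
    row_means = ratings.mean(axis=1)   # per-subject means
    col_means = ratings.mean(axis=0)   # per-rater means

    # Classic two-way ANOVA sum-of-squares decomposition.
    ss_total = ((ratings - grand) ** 2).sum()
    ss_rows = k * ((row_means - grand) ** 2).sum()   # between subjects
    ss_cols = n * ((col_means - grand) ** 2).sum()   # between raters
    ss_err = ss_total - ss_rows - ss_cols            # residual

    msr = ss_rows / (n - 1)
    msc = ss_cols / (k - 1)
    mse = ss_err / ((n - 1) * (k - 1))
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

# Hypothetical ratings: 5 apps scored by 2 raters on a 1-5 scale.
scores = [[4, 4], [3, 3], [5, 4], [2, 2], [4, 5]]
print(round(icc_2_1(scores), 2))   # -> 0.84 for this toy data
```

Values near 1 indicate that raters largely agree in absolute terms, which is what the MARS validation reports.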

1,293 citations


Journal ArticleDOI
Jennifer E. Huffman1, Eva Albrecht, Alexander Teumer2, Massimo Mangino3, Karen Kapur, Toby Johnson4, Z. Kutalik, Nicola Pirastu5, Giorgio Pistis6, Lorna M. Lopez1, Toomas Haller7, Perttu Salo8, Anuj Goel9, Man Li10, Toshiko Tanaka8, Abbas Dehghan11, Daniela Ruggiero, Giovanni Malerba12, Albert V. Smith13, Ilja M. Nolte, Laura Portas, Amanda Phipps-Green14, Lora Boteva1, Pau Navarro1, Åsa Johansson15, Andrew A. Hicks16, Ozren Polasek17, Tõnu Esko18, John F. Peden9, Sarah E. Harris1, Federico Murgia, Sarah H. Wild1, Albert Tenesa1, Adrienne Tin10, Evelin Mihailov7, Anne Grotevendt2, Gauti Kjartan Gislason, Josef Coresh10, Pio D'Adamo5, Sheila Ulivi, Peter Vollenweider19, Gérard Waeber19, Susan Campbell1, Ivana Kolcic17, Krista Fisher7, Margus Viigimaa, Jeffrey Metter8, Corrado Masciullo6, Elisabetta Trabetti12, Cristina Bombieri12, Rossella Sorice, Angela Doering, Eva Reischl, Konstantin Strauch20, Albert Hofman11, André G. Uitterlinden11, Melanie Waldenberger, H-Erich Wichmann20, Gail Davies1, Alan J. Gow1, Nicola Dalbeth21, Lisa K. Stamp14, Johannes H. Smit22, Mirna Kirin1, Ramaiah Nagaraja8, Matthias Nauck2, Claudia Schurmann2, Kathrin Budde2, Susan M. Farrington1, Evropi Theodoratou1, Antti Jula8, Veikko Salomaa8, Cinzia Sala6, Christian Hengstenberg23, Michel Burnier19, R Maegi7, Norman Klopp20, Stefan Kloiber24, Sabine Schipf25, Samuli Ripatti26, Stefano Cabras27, Nicole Soranzo28, Georg Homuth2, Teresa Nutile, Patricia B. Munroe4, Nicholas D. Hastie1, Harry Campbell1, Igor Rudan1, Claudia P. Cabrera29, Chris Haley1, Oscar H. Franco11, Tony R. Merriman14, Vilmundur Gudnason13, Mario Pirastu, Brenda W.J.H. Penninx11, Brenda W.J.H. Penninx30, Harold Snieder, Andres Metspalu7, Marina Ciullo, Peter P. Pramstaller16, Cornelia M. van Duijn11, Luigi Ferrucci8, Giovanni Gambaro31, Ian J. Deary1, Malcolm G. Dunlop1, James F. Wilson1, Paolo Gasparini5, Ulf Gyllensten15, Tim D. Spector3, Alan F. Wright1, Caroline Hayward1, Hugh Watkins9, Markus Perola8, Murielle Bochud32, W. H. Linda Kao10, Mark J. Caulfield4, Daniela Toniolo6, Henry Voelzke25, Christian Gieger, Anna Koettgen33, Veronique Vitart1
26 Mar 2015-PLOS ONE
TL;DR: Genome-wide analyses of interactions between body mass index (BMI) and common genetic variants affecting serum urate levels, together with regression-type analyses in a non-BMI-stratified overall sample, suggested a role for N-glycan biosynthesis as a prominent urate-associated pathway in the lean stratum.
Abstract: We tested for interactions between body mass index (BMI) and common genetic variants affecting serum urate levels, genome-wide, in up to 42,569 participants. Both stratified genome-wide association (GWAS) analyses, in lean, overweight, and obese individuals, and regression-type analyses in a non-BMI-stratified overall sample were performed. The former did not uncover any novel locus with a major main effect, but supported modulation of effects for some known and potentially new urate loci. The latter highlighted a SNP at RBFOX3 reaching genome-wide significance (effect size 0.014, 95% CI 0.008-0.02, P_inter = 2.6 × 10^-8). Two top loci in interaction term analyses, RBFOX3 and ERO1LB-EDARADD, also displayed suggestive differences in main effect size between the lean and obese strata. All top-ranking loci for urate effect differences between BMI categories were novel, and most had small-magnitude but opposite-direction effects between strata. They include the locus RBMS1-TANK (men, P_diff lean-overweight = 4.7 × 10^-8), a region that has been associated with several obesity-related traits, and TSPYL5 (men, P_diff lean-overweight = 9.1 × 10^-8), which regulates adipocyte-produced estradiol. The top-ranking known urate locus was ABCG2, the strongest known gout risk locus, with an effect halved in obese compared with lean men (P_diff lean-obese = 2 × 10^-4). Finally, pathway analysis suggested a role for N-glycan biosynthesis as a prominent urate-associated pathway in the lean stratum. These results illustrate a potentially powerful way to monitor changes occurring in an obesogenic environment.
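The regression-type interaction analysis described above can be sketched as an ordinary least-squares fit with a SNP × BMI product term. The sketch below uses simulated genotype, BMI, and urate values under assumed effect sizes; it is not the study's actual model, which included covariates and meta-analysis across cohorts:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5000

# Simulated data (illustrative only, not from the study): genotypes coded
# as 0/1/2 minor-allele counts, standardized BMI, and a urate phenotype
# with a built-in SNP-by-BMI interaction of size 0.04.
snp = rng.binomial(2, 0.3, size=n).astype(float)
bmi = rng.normal(0.0, 1.0, size=n)
urate = 0.10 * snp + 0.05 * bmi + 0.04 * snp * bmi + rng.normal(0.0, 0.5, size=n)

# Design matrix: intercept, SNP main effect, BMI main effect, interaction.
X = np.column_stack([np.ones(n), snp, bmi, snp * bmi])
beta, *_ = np.linalg.lstsq(X, urate, rcond=None)

# beta[3] is the interaction estimate; it should recover roughly 0.04.
print(np.round(beta, 3))
```

In a GWAS setting this fit is repeated per SNP and the interaction coefficient's p-value is what gets reported (e.g., P_inter above).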

1,293 citations


Journal ArticleDOI
TL;DR: These findings guide which normalization and differential abundance techniques to use based on the data characteristics of a given study.
Abstract: Data from 16S ribosomal RNA (rRNA) amplicon sequencing present challenges to ecological and statistical interpretation. In particular, library sizes often vary over several orders of magnitude, and the data contain many zeros. Although we are typically interested in comparing the relative abundance of taxa in the ecosystems of two or more groups, we can only measure taxon relative abundance in specimens obtained from those ecosystems. Because the comparison of taxon relative abundance in the specimen is not equivalent to the comparison of taxon relative abundance in the ecosystem, this presents a special challenge. Second, because the relative abundances of taxa in the specimen (as well as in the ecosystem) sum to 1, these are compositional data; because compositional data are constrained to the simplex rather than being unconstrained in Euclidean space, many standard methods of analysis are not applicable. Here, we evaluate how these challenges impact the performance of existing normalization methods and differential abundance analyses. Effects on normalization: most normalization methods enable successful clustering of samples according to biological origin when the groups differ substantially in their overall microbial composition. Rarefying more clearly clusters samples according to biological origin than other normalization techniques do for ordination metrics based on presence or absence. Alternate normalization measures are potentially vulnerable to artifacts due to library size. Effects on differential abundance testing: we build on previous work to evaluate seven proposed statistical methods using rarefied as well as raw data. Our simulation studies suggest that the false discovery rates of many differential abundance-testing methods are not increased by rarefying itself, although rarefying does result in a loss of sensitivity due to elimination of a portion of the available data.
For groups with large (~10×) differences in average library size, rarefying lowers the false discovery rate. DESeq2, without addition of a constant, increased sensitivity on smaller datasets (<20 samples per group) and, critically, was the only method tested with good control of the false discovery rate. These findings guide which normalization and differential abundance techniques to use based on the data characteristics of a given study.
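Rarefying, the normalization discussed above, amounts to subsampling each sample's reads without replacement down to a common library size. A minimal NumPy sketch (the toy count table and depth are invented for illustration):

```python
import numpy as np

def rarefy(counts, depth, seed=0):
    """Subsample each sample (row) of a taxon count table, without
    replacement, to a fixed library size; rows with fewer total reads
    than `depth` are dropped, as is conventional when rarefying.
    """
    rng = np.random.default_rng(seed)
    counts = np.asarray(counts)
    kept = counts[counts.sum(axis=1) >= depth]
    # multivariate_hypergeometric draws reads without replacement.
    return np.array([rng.multivariate_hypergeometric(row, depth)
                     for row in kept])

# Toy count table: 3 samples x 4 taxa with very different library sizes.
table = np.array([[500, 300, 150, 50],   # 1000 reads
                  [ 50,  30,  15,  5],   #  100 reads
                  [ 10,   5,   3,  2]])  #   20 reads -> dropped at depth 100
rarefied = rarefy(table, depth=100)
print(rarefied.sum(axis=1))   # every surviving sample now has 100 reads
```

The sensitivity loss the abstract mentions is visible here: rarefying discards both reads (sample 1 keeps 100 of 1000) and whole samples (sample 3).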

1,292 citations


Journal ArticleDOI
08 May 2019
TL;DR: The most common mistakes are to describe effect sizes in ways that are uninformative (e.g., using arbitrary standards) or misleading (e.g., squa...
Abstract: Effect sizes are underappreciated and often misinterpreted—the most common mistakes being to describe them in ways that are uninformative (e.g., using arbitrary standards) or misleading (e.g., squa...
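For a concrete sense of what an informative effect-size report looks like, here is a small, hypothetical example computing a pooled-SD standardized mean difference (Cohen's d); the groups and values are invented:

```python
import numpy as np

def cohens_d(a, b):
    """Standardized mean difference using the pooled standard deviation."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    na, nb = len(a), len(b)
    pooled_var = ((na - 1) * a.var(ddof=1) +
                  (nb - 1) * b.var(ddof=1)) / (na + nb - 2)
    return (a.mean() - b.mean()) / np.sqrt(pooled_var)

# Invented scores for two groups measured on the same raw scale.
group_a = [5.1, 4.8, 5.5, 5.0, 4.9, 5.3]
group_b = [4.6, 4.4, 4.9, 4.5, 4.7, 4.3]
d = cohens_d(group_a, group_b)

# Reporting d together with the raw-unit difference (~0.53 points here)
# is more informative than an arbitrary "small/medium/large" label.
print(round(d, 2))
```

The point of the paper stands either way: the number is only meaningful when interpreted against the scale and context of the measurement, not against fixed benchmarks.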

1,292 citations


Journal ArticleDOI
TL;DR: Among women with early-stage breast cancer who were at high clinical risk and low genomic risk for recurrence, the receipt of no chemotherapy on the basis of the 70-gene signature led to a 5-year rate of survival without distant metastasis that was 1.5 percentage points lower than the rate with chemotherapy.
Abstract: BackgroundThe 70-gene signature test (MammaPrint) has been shown to improve prediction of clinical outcome in women with early-stage breast cancer. We sought to provide prospective evidence of the clinical utility of the addition of the 70-gene signature to standard clinical–pathological criteria in selecting patients for adjuvant chemotherapy. MethodsIn this randomized, phase 3 study, we enrolled 6693 women with early-stage breast cancer and determined their genomic risk (using the 70-gene signature) and their clinical risk (using a modified version of Adjuvant! Online). Women at low clinical and genomic risk did not receive chemotherapy, whereas those at high clinical and genomic risk did receive such therapy. In patients with discordant risk results, either the genomic risk or the clinical risk was used to determine the use of chemotherapy. The primary goal was to assess whether, among patients with high-risk clinical features and a low-risk gene-expression profile who did not receive chemotherapy, the...

1,291 citations


Journal ArticleDOI
Richard M. Ransohoff1
19 Aug 2016-Science
TL;DR: Observations indicate that therapies targeting glial cells might provide benefit for those afflicted by neurodegenerative disorders, because the environment is affected during disease in a cascade of processes collectively termed neuroinflammation.
Abstract: Neurodegenerative diseases such as Alzheimer’s disease, Parkinson’s disease, amyotrophic lateral sclerosis, and frontotemporal lobar dementia are among the most pressing problems of developed societies with aging populations. Neurons carry out essential functions such as signal transmission and network integration in the central nervous system and are the main targets of neurodegenerative disease. In this Review, I address how the neuron’s environment also contributes to neurodegeneration. Maintaining an optimal milieu for neuronal function rests with supportive cells termed glia and the blood-brain barrier. Accumulating evidence suggests that neurodegeneration occurs in part because the environment is affected during disease in a cascade of processes collectively termed neuroinflammation. These observations indicate that therapies targeting glial cells might provide benefit for those afflicted by neurodegenerative disorders.

1,291 citations


Posted Content
TL;DR: In this paper, a scalable production system for federated learning in the domain of mobile devices, based on TensorFlow, is presented. The authors describe the resulting high-level design, sketch some of the challenges and their solutions, and touch upon the open problems and future directions.
Abstract: Federated Learning is a distributed machine learning approach which enables model training on a large corpus of decentralized data. We have built a scalable production system for Federated Learning in the domain of mobile devices, based on TensorFlow. In this paper, we describe the resulting high-level design, sketch some of the challenges and their solutions, and touch upon the open problems and future directions.
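The core server-client loop of federated learning can be sketched in a few lines. This toy uses a one-parameter linear model and NumPy rather than TensorFlow, with equal-weight averaging across simulated clients; it illustrates the idea, not the production system's design:

```python
import numpy as np

rng = np.random.default_rng(42)

def local_update(w, x, y, lr=0.1, steps=10):
    """Client-side training: a few gradient steps on private data."""
    for _ in range(steps):
        grad = 2.0 * np.mean((w * x - y) * x)   # d/dw of mean squared error
        w = w - lr * grad
    return w

# Each simulated client holds private data drawn from y = 3x + noise.
clients = []
for _ in range(5):
    x = rng.normal(size=50)
    clients.append((x, 3.0 * x + 0.1 * rng.normal(size=50)))

w_global = 0.0
for _ in range(20):
    # Clients train locally starting from the current global model...
    local_ws = [local_update(w_global, x, y) for x, y in clients]
    # ...and the server averages the results (equal weights, since every
    # toy client holds the same amount of data).
    w_global = float(np.mean(local_ws))

print(round(w_global, 1))   # converges near the true slope 3.0
```

The raw data never leaves a client; only model parameters travel, which is the property that makes the approach attractive for mobile devices.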

1,291 citations


Journal ArticleDOI
27 Jan 2015-JAMA
TL;DR: Among men undergoing biopsy for suspected prostate cancer, targeted MR/ultrasound fusion biopsy, compared with standard extended-sextant ultrasound-guided biopsy, was associated with increased detection of high-risk prostate cancer and decreased detection of low-risk prostate cancer.
Abstract: Importance Targeted magnetic resonance (MR)/ultrasound fusion prostate biopsy has been shown to detect prostate cancer. The implications of targeted biopsy alone vs standard extended-sextant biopsy or the 2 modalities combined are not well understood. Objective To assess targeted vs standard biopsy and the 2 approaches combined for the diagnosis of intermediate- to high-risk prostate cancer. Design, setting, and participants Prospective cohort study of 1003 men undergoing both targeted and standard biopsy concurrently from 2007 through 2014 at the National Cancer Institute in the United States. Patients were referred for elevated level of prostate-specific antigen (PSA) or abnormal digital rectal examination results, often with prior negative biopsy results. Risk categorization was compared among targeted and standard biopsy and, when available, whole-gland pathology after prostatectomy as the "gold standard." Interventions Patients underwent multiparametric prostate magnetic resonance imaging to identify regions of prostate cancer suspicion followed by targeted MR/ultrasound fusion biopsy and concurrent standard biopsy. Main outcomes and measures The primary objective was to compare targeted and standard biopsy approaches for detection of high-risk prostate cancer (Gleason score ≥ 4 + 3); secondary end points focused on detection of low-risk prostate cancer (Gleason score 3 + 3 or low-volume 3 + 4) and the biopsy ability to predict whole-gland pathology at prostatectomy. Results Targeted MR/ultrasound fusion biopsy diagnosed 461 prostate cancer cases, and standard biopsy diagnosed 469 cases. There was exact agreement between targeted and standard biopsy in 690 men (69%) undergoing biopsy. 
Targeted biopsy diagnosed 30% more high-risk cancers vs standard biopsy (173 vs 122 cases, P ...). Conclusions and relevance: Among men undergoing biopsy for suspected prostate cancer, targeted MR/ultrasound fusion biopsy, compared with standard extended-sextant ultrasound-guided biopsy, was associated with increased detection of high-risk prostate cancer and decreased detection of low-risk prostate cancer. Future studies will be needed to assess the ultimate clinical implications of targeted biopsy. Trial registration: clinicaltrials.gov identifier NCT00102544.

1,291 citations


Journal ArticleDOI
TL;DR: A deep learning algorithm, similar in spirit to Galerkin methods but using a deep neural network instead of linear combinations of basis functions, is proposed and implemented for American options in up to 100 dimensions.

1,290 citations


Journal ArticleDOI
Yu Zheng1
TL;DR: A systematic survey on the major research into trajectory data mining, providing a panorama of the field as well as the scope of its research topics, and introduces the methods that transform trajectories into other data formats, such as graphs, matrices, and tensors.
Abstract: The advances in location-acquisition and mobile computing techniques have generated massive spatial trajectory data, which represent the mobility of a diversity of moving objects, such as people, vehicles, and animals. Many techniques have been proposed for processing, managing, and mining trajectory data in the past decade, fostering a broad range of applications. In this article, we conduct a systematic survey on the major research into trajectory data mining, providing a panorama of the field as well as the scope of its research topics. Following a road map from the derivation of trajectory data, to trajectory data preprocessing, to trajectory data management, and to a variety of mining tasks (such as trajectory pattern mining, outlier detection, and trajectory classification), the survey explores the connections, correlations, and differences among these existing techniques. This survey also introduces the methods that transform trajectories into other data formats, such as graphs, matrices, and tensors, to which more data mining and machine learning techniques can be applied. Finally, some public trajectory datasets are presented. This survey can help shape the field of trajectory data mining, providing a quick understanding of this field to the community.

Journal ArticleDOI
TL;DR: Idarucizumab completely reversed the anticoagulant effect of dabigatran, normalizing the test results in 88 to 98% of the patients; the effect was evident within minutes.
Abstract: BACKGROUND: Specific reversal agents for non–vitamin K antagonist oral anticoagulants are lacking. Idarucizumab, an antibody fragment, was developed to reverse the anticoagulant effects of dabigatran. METHODS: We undertook this prospective cohort study to determine the safety of 5 g of intravenous idarucizumab and its capacity to reverse the anticoagulant effects of dabigatran in patients who had serious bleeding (group A) or required an urgent procedure (group B). The primary end point was the maximum percentage reversal of the anticoagulant effect of dabigatran within 4 hours after the administration of idarucizumab, on the basis of the determination at a central laboratory of the dilute thrombin time or ecarin clotting time. A key secondary end point was the restoration of hemostasis. RESULTS: This interim analysis included 90 patients who received idarucizumab (51 patients in group A and 39 in group B). Among 68 patients with an elevated dilute thrombin time and 81 with an elevated ecarin clotting time at baseline, the median maximum percentage reversal was 100% (95% confidence interval, 100 to 100). Idarucizumab normalized the test results in 88 to 98% of the patients, an effect that was evident within minutes. Concentrations of unbound dabigatran remained below 20 ng per milliliter at 24 hours in 79% of the patients. Among 35 patients in group A who could be assessed, hemostasis, as determined by local investigators, was restored at a median of 11.4 hours. Among 36 patients in group B who underwent a procedure, normal intraoperative hemostasis was reported in 33, and mildly or moderately abnormal hemostasis was reported in 2 patients and 1 patient, respectively. One thrombotic event occurred within 72 hours after idarucizumab administration in a patient in whom anticoagulants had not been reinitiated. CONCLUSIONS: Idarucizumab completely reversed the anticoagulant effect of dabigatran within minutes. (Funded by Boehringer Ingelheim; RE-VERSE AD ClinicalTrials.gov number, NCT02104947.)

Journal ArticleDOI
12 Jan 2017-Nature
TL;DR: The authors' analyses identified three molecular subclasses of oesophageal squamous cell carcinomas, but none showed evidence for an aetiological role of human papillomavirus and these data provide a framework to facilitate more rational categorization of these tumours and a foundation for new therapies.
Abstract: Oesophageal cancers are prominent worldwide; however, there are few targeted therapies and survival rates for these cancers remain dismal. Here we performed a comprehensive molecular analysis of 164 carcinomas of the oesophagus derived from Western and Eastern populations. Beyond known histopathological and epidemiologic distinctions, molecular features differentiated oesophageal squamous cell carcinomas from oesophageal adenocarcinomas. Oesophageal squamous cell carcinomas resembled squamous carcinomas of other organs more than they did oesophageal adenocarcinomas. Our analyses identified three molecular subclasses of oesophageal squamous cell carcinomas, but none showed evidence for an aetiological role of human papillomavirus. Squamous cell carcinomas showed frequent genomic amplifications of CCND1 and SOX2 and/or TP63, whereas ERBB2, VEGFA and GATA4 and GATA6 were more commonly amplified in adenocarcinomas. Oesophageal adenocarcinomas strongly resembled the chromosomally unstable variant of gastric adenocarcinoma, suggesting that these cancers could be considered a single disease entity. However, some molecular features, including DNA hypermethylation, occurred disproportionally in oesophageal adenocarcinomas. These data provide a framework to facilitate more rational categorization of these tumours and a foundation for new therapies.

Proceedings ArticleDOI
14 Jun 2018
TL;DR: In this article, a large-scale audio-visual speaker recognition dataset, VoxCeleb2, is presented, which contains over a million utterances from over 6,000 speakers.
Abstract: The objective of this paper is speaker recognition under noisy and unconstrained conditions. We make two key contributions. First, we introduce a very large-scale audio-visual speaker recognition dataset collected from open-source media. Using a fully automated pipeline, we curate VoxCeleb2 which contains over a million utterances from over 6,000 speakers. This is several times larger than any publicly available speaker recognition dataset. Second, we develop and compare Convolutional Neural Network (CNN) models and training strategies that can effectively recognise identities from voice under various conditions. The models trained on the VoxCeleb2 dataset surpass the performance of previous works on a benchmark dataset by a significant margin.

Proceedings ArticleDOI
01 Oct 2017
TL;DR: A very deep persistent memory network (MemNet) is proposed that introduces a memory block, consisting of a recursive unit and a gate unit, to explicitly mine persistent memory through an adaptive learning process.
Abstract: Recently, very deep convolutional neural networks (CNNs) have been attracting considerable attention in image restoration. However, as the depth grows, the long-term dependency problem is rarely realized for these very deep models, which results in the prior states/layers having little influence on the subsequent ones. Motivated by the fact that human thoughts have persistency, we propose a very deep persistent memory network (MemNet) that introduces a memory block, consisting of a recursive unit and a gate unit, to explicitly mine persistent memory through an adaptive learning process. The recursive unit learns multi-level representations of the current state under different receptive fields. The representations and the outputs from the previous memory blocks are concatenated and sent to the gate unit, which adaptively controls how much of the previous states should be reserved, and decides how much of the current state should be stored. We apply MemNet to three image restoration tasks, i.e., image denoising, super-resolution and JPEG deblocking. Comprehensive experiments demonstrate the necessity of the MemNet and its unanimous superiority on all three tasks over the state of the art. Code is available at https://github.com/tyshiwo/MemNet.
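The gate unit's adaptive mixing can be pictured as a 1×1 convolution over the concatenated feature maps of the recursive unit's outputs and the previous memory blocks' outputs. The NumPy sketch below uses random weights and invented tensor shapes purely for illustration; it is not the trained MemNet:

```python
import numpy as np

def gate_unit(states, weights):
    """Fuse a list of (channels, H, W) feature maps with a 1x1 convolution:
    concatenate along channels, then linearly mix the channels - which is
    how a gate of this kind decides how much of each state to carry forward.
    """
    stacked = np.concatenate(states, axis=0)          # (C_total, H, W)
    c_out, c_in = weights.shape
    assert c_in == stacked.shape[0]
    # A 1x1 convolution is a matrix multiply over the channel dimension.
    return np.einsum('oc,chw->ohw', weights, stacked)

rng = np.random.default_rng(0)
prev_blocks = [rng.normal(size=(8, 16, 16)) for _ in range(2)]     # long-term memory
recursive_out = [rng.normal(size=(8, 16, 16)) for _ in range(3)]   # short-term states
w = rng.normal(size=(8, 8 * 5)) / np.sqrt(8 * 5)   # random 1x1-conv weights
out = gate_unit(prev_blocks + recursive_out, w)
print(out.shape)   # (8, 16, 16): spatial size preserved, channels fused
```

In the real network these weights are learned, so the mixing ratios between old and new states adapt during training.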

Journal ArticleDOI
TL;DR: This review discusses the nature of such phases and their properties based on paradigmatic models and general arguments, and introduces theoretical technology such as gauge theory and partons, which are conveniently used in the study of quantum spin liquids.
Abstract: Quantum spin liquids may be considered 'quantum disordered' ground states of spin systems, in which zero-point fluctuations are so strong that they prevent conventional magnetic long-range order. More interestingly, quantum spin liquids are prototypical examples of ground states with massive many-body entanglement, which is of a degree sufficient to render these states distinct phases of matter. Their highly entangled nature imbues quantum spin liquids with unique physical aspects, such as non-local excitations, topological properties, and more. In this review, we discuss the nature of such phases and their properties based on paradigmatic models and general arguments, and introduce theoretical technology such as gauge theory and partons, which are conveniently used in the study of quantum spin liquids. An overview is given of the different types of quantum spin liquids and the models and theories used to describe them. We also provide a guide to the current status of experiments in relation to the study of quantum spin liquids, and to the diverse probes used therein.

Journal ArticleDOI
TL;DR: This work proposes a simple solution to use a single Neural Machine Translation (NMT) model to translate between multiple languages using a shared wordpiece vocabulary, and introduces an artificial token at the beginning of the input sentence to specify the required target language.
Abstract: We propose a simple solution to use a single Neural Machine Translation (NMT) model to translate between multiple languages. Our solution requires no changes to the model architecture from a standard NMT system but instead introduces an artificial token at the beginning of the input sentence to specify the required target language. Using a shared wordpiece vocabulary, our approach enables Multilingual NMT using a single model. On the WMT’14 benchmarks, a single multilingual model achieves comparable performance for English→French and surpasses state-of-the-art results for English→German. Similarly, a single multilingual model surpasses state-of-the-art results for French→English and German→English on WMT’14 and WMT’15 benchmarks, respectively. On production corpora, multilingual models of up to twelve language pairs allow for better translation of many individual pairs. Our models can also learn to perform implicit bridging between language pairs never seen explicitly during training, showing that transfer learning and zero-shot translation is possible for neural translation. Finally, we show analyses that hint at a universal interlingua representation in our models and show some interesting examples when mixing languages.
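The paper's central mechanism is easy to sketch: the only change to the NMT pipeline is prepending an artificial target-language token to the source text. A minimal illustration, where whitespace splitting stands in for the shared wordpiece tokenizer:

```python
def add_target_token(source_sentence, target_lang):
    """Prepend an artificial '<2xx>' target-language token; the model
    architecture itself is left unchanged."""
    return f"<2{target_lang}> {source_sentence}"

# The same English sentence can be routed to different target languages
# purely by the leading token.
tokens = add_target_token("Hello, how are you?", "es").split()
print(tokens)   # the model sees '<2es>' first and decodes into Spanish
```

Because the token is just another vocabulary item, zero-shot pairs work the same way: emit `<2xx>` for a pair never seen in training and let the shared representation do the bridging.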

Journal ArticleDOI
TL;DR: In this review, the major materials and technology advances within the last five years for each of the common 3D Printing technologies (Three Dimensional Printing, Fused Deposition Modeling, Selective Laser Sintering, Stereolithography, and 3D Plotting/Direct-Write/Bioprinting) are described.
Abstract: 3D Printing promises to produce complex biomedical devices according to computer design using patient-specific anatomical data. Since its initial use as pre-surgical visualization models and tooling molds, 3D Printing has slowly evolved to create one-of-a-kind devices, implants, scaffolds for tissue engineering, diagnostic platforms, and drug delivery systems. Fueled by the recent explosion in public interest and access to affordable printers, there is renewed interest to combine stem cells with custom 3D scaffolds for personalized regenerative medicine. Before 3D Printing can be used routinely for the regeneration of complex tissues (e.g. bone, cartilage, muscles, vessels, nerves in the craniomaxillofacial complex), and complex organs with intricate 3D microarchitecture (e.g. liver, lymphoid organs), several technological limitations must be addressed. In this review, the major materials and technology advances within the last five years for each of the common 3D Printing technologies (Three Dimensional Printing, Fused Deposition Modeling, Selective Laser Sintering, Stereolithography, and 3D Plotting/Direct-Write/Bioprinting) are described. Examples are highlighted to illustrate progress of each technology in tissue engineering, and key limitations are identified to motivate future research and advance this fascinating field of advanced manufacturing.

Journal ArticleDOI
04 Apr 2020-Cureus
TL;DR: The author will highlight the potential impact of the terrible COVID-19 outbreak on the education and mental health of students and academic staff.
Abstract: The novel coronavirus disease 2019 (COVID-19), originated in Wuhan city of China, has spread rapidly around the world, sending billions of people into lockdown. The World Health Organization (WHO) declared the coronavirus epidemic a pandemic. In light of rising concern about the current COVID-19 pandemic, a growing number of universities across the world have either postponed or canceled all campus events such as workshops, conferences, sports, and other activities. Universities are taking intensive measures to prevent and protect all students and staff members from the highly infectious disease. Faculty members are already in the process of transitioning to online teaching platforms. In this review, the author will highlight the potential impact of the terrible COVID-19 outbreak on the education and mental health of students and academic staff.

Journal ArticleDOI
TL;DR: The application of decontam to two recently published datasets corroborated and extended their conclusions that little evidence existed for an indigenous placenta microbiome and that some low-frequency taxa seemingly associated with preterm birth were contaminants.
Abstract: The accuracy of microbial community surveys based on marker-gene and metagenomic sequencing (MGS) suffers from the presence of contaminants—DNA sequences not truly present in the sample. Contaminants come from various sources, including reagents. Appropriate laboratory practices can reduce contamination, but do not eliminate it. Here we introduce decontam ( https://github.com/benjjneb/decontam ), an open-source R package that implements a statistical classification procedure that identifies contaminants in MGS data based on two widely reproduced patterns: contaminants appear at higher frequencies in low-concentration samples and are often found in negative controls. Decontam classified amplicon sequence variants (ASVs) in a human oral dataset consistently with prior microscopic observations of the microbial taxa inhabiting that environment and previous reports of contaminant taxa. In metagenomics and marker-gene measurements of a dilution series, decontam substantially reduced technical variation arising from different sequencing protocols. The application of decontam to two recently published datasets corroborated and extended their conclusions that little evidence existed for an indigenous placenta microbiome and that some low-frequency taxa seemingly associated with preterm birth were contaminants. Decontam improves the quality of metagenomic and marker-gene sequencing by identifying and removing contaminant DNA sequences. Decontam integrates easily with existing MGS workflows and allows researchers to generate more accurate profiles of microbial communities at little to no additional cost.
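The frequency pattern decontam exploits, that a contaminant's relative abundance scales inversely with input DNA concentration, can be sketched with a simple log-log slope. This least-squares slope is a simplification of decontam's actual model fit, and the concentrations and frequencies below are hypothetical:

```python
import numpy as np

def log_log_slope(freq, dna_conc):
    """Least-squares slope of log10(frequency) on log10(DNA concentration).
    Under the inverse-proportionality pattern, a contaminant's slope is
    near -1, while a genuine community member's slope is near 0.
    """
    x = np.log10(np.asarray(dna_conc, dtype=float))
    y = np.log10(np.asarray(freq, dtype=float))
    return np.polyfit(x, y, 1)[0]

# Hypothetical samples spanning a 20x range of input DNA (ng/uL).
conc = np.array([0.5, 1.0, 2.0, 5.0, 10.0])
contam = 0.02 / conc          # frequency dilutes as real DNA increases
resident = np.full(5, 0.05)   # frequency tracks the community, not conc

s_contam = log_log_slope(contam, conc)
s_resident = log_log_slope(resident, conc)
print(f"contaminant slope ~ {s_contam:.1f}, resident slope ~ {abs(s_resident):.1f}")
```

The package combines this frequency evidence with prevalence in negative controls, its second widely reproduced pattern, before classifying a sequence as a contaminant.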

Proceedings Article
06 Jul 2015
TL;DR: This work introduces a new, efficient, principled and backpropagation-compatible algorithm for learning a probability distribution on the weights of a neural network, called Bayes by Backprop, and shows how the learnt uncertainty in the weights can be used to improve generalisation in non-linear regression problems.
Abstract: We introduce a new, efficient, principled and backpropagation-compatible algorithm for learning a probability distribution on the weights of a neural network, called Bayes by Backprop. It regularises the weights by minimising a compression cost, known as the variational free energy or the expected lower bound on the marginal likelihood. We show that this principled kind of regularisation yields comparable performance to dropout on MNIST classification. We then demonstrate how the learnt uncertainty in the weights can be used to improve generalisation in non-linear regression problems, and how this weight uncertainty can be used to drive the exploration-exploitation trade-off in reinforcement learning.
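The two ingredients the abstract names, a reparameterized sample of the weights and the compression cost, can be sketched in a few lines of numpy. This is a minimal illustration of the standard mean-field Gaussian formulation, not the paper's full training loop; the softplus parameterization and unit Gaussian prior are common choices assumed here:

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_weights(mu, rho):
    """Reparameterization trick: w = mu + softplus(rho) * eps, eps ~ N(0, I).
    softplus keeps the posterior standard deviation positive, so gradients
    can flow through mu and rho via ordinary backpropagation."""
    sigma = np.log1p(np.exp(rho))
    eps = rng.standard_normal(np.shape(mu))
    return mu + sigma * eps, sigma

def kl_gaussian(mu, sigma, prior_sigma=1.0):
    """KL( N(mu, sigma^2) || N(0, prior_sigma^2) ), summed over weights.
    This is the 'compression cost' term of the variational free energy;
    adding the expected negative log-likelihood gives the full objective."""
    return np.sum(np.log(prior_sigma / sigma)
                  + (sigma ** 2 + mu ** 2) / (2 * prior_sigma ** 2) - 0.5)
```

Minimizing the KL term alone pulls the posterior toward the prior; the data term pulls it toward weights that fit, and the learnt sigma values are the per-weight uncertainties the abstract refers to.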

Journal ArticleDOI
08 Oct 2015-PeerJ
TL;DR: Using anvi’o, this work re-analyzed publicly available datasets and explored temporal genomic changes within naturally occurring microbial populations through de novo characterization of single nucleotide variations, and linked cultivar and single-cell genomes with metagenomic and metatranscriptomic data.
Abstract: Advances in high-throughput sequencing and ‘omics technologies are revolutionizing studies of naturally occurring microbial communities. Comprehensive investigations of microbial lifestyles require the ability to interactively organize and visualize genetic information and to incorporate subtle differences that enable greater resolution of complex data. Here we introduce anvi’o, an advanced analysis and visualization platform that offers automated and human-guided characterization of microbial genomes in metagenomic assemblies, with interactive interfaces that can link ‘omics data from multiple sources into a single, intuitive display. Its extensible visualization approach distills multiple dimensions of information about each contig, offering a dynamic and unified work environment for data exploration, manipulation, and reporting. Using anvi’o, we re-analyzed publicly available datasets and explored temporal genomic changes within naturally occurring microbial populations through de novo characterization of single nucleotide variations, and linked cultivar and single-cell genomes with metagenomic and metatranscriptomic data. Anvi’o is an open-source platform that empowers researchers without extensive bioinformatics skills to perform and communicate in-depth analyses on large ‘omics datasets.
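The "de novo characterization of single nucleotide variations" mentioned above can be illustrated with a toy sketch: flag alignment columns where a minor allele is well supported. This is a hypothetical minimal notion of an SNV with made-up thresholds, not anvi'o's actual algorithm:

```python
from collections import Counter

def snv_positions(column_bases, min_minor_freq=0.1, min_cov=10):
    """Flag alignment columns whose second-most-common base exceeds a
    minor-allele-frequency threshold. `column_bases` maps each reference
    position to the list of bases observed in reads covering it."""
    variants = {}
    for pos, bases in column_bases.items():
        if len(bases) < min_cov:          # skip low-coverage positions
            continue
        counts = Counter(bases).most_common()
        if len(counts) > 1 and counts[1][1] / len(bases) >= min_minor_freq:
            variants[pos] = counts
    return variants
```

Tracking such positions across time-series metagenomes is what makes the temporal population-level analyses described in the abstract possible.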

Posted Content
TL;DR: This paper presents a simple yet effective approach that for the first time enables arbitrary style transfer in real-time, comparable to the fastest existing approach, without the restriction to a pre-defined set of styles.
Abstract: Gatys et al. recently introduced a neural algorithm that renders a content image in the style of another image, achieving so-called style transfer. However, their framework requires a slow iterative optimization process, which limits its practical application. Fast approximations with feed-forward neural networks have been proposed to speed up neural style transfer. Unfortunately, the speed improvement comes at a cost: the network is usually tied to a fixed set of styles and cannot adapt to arbitrary new styles. In this paper, we present a simple yet effective approach that for the first time enables arbitrary style transfer in real-time. At the heart of our method is a novel adaptive instance normalization (AdaIN) layer that aligns the mean and variance of the content features with those of the style features. Our method achieves speed comparable to the fastest existing approach, without the restriction to a pre-defined set of styles. In addition, our approach allows flexible user controls such as content-style trade-off, style interpolation, color & spatial controls, all using a single feed-forward neural network.
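The AdaIN layer described above has a closed form: normalize the content features per channel, then rescale and shift them to the style features' statistics. A minimal numpy sketch for a single (channels, height, width) feature map:

```python
import numpy as np

def adain(content, style, eps=1e-5):
    """Adaptive instance normalization: align the per-channel mean and
    standard deviation of the content features with those of the style
    features. Inputs are (channels, height, width) feature maps."""
    c_mu = content.mean(axis=(1, 2), keepdims=True)
    c_sigma = content.std(axis=(1, 2), keepdims=True)
    s_mu = style.mean(axis=(1, 2), keepdims=True)
    s_sigma = style.std(axis=(1, 2), keepdims=True)
    return s_sigma * (content - c_mu) / (c_sigma + eps) + s_mu
```

Because the layer has no learned parameters tied to any particular style, any style image's statistics can be substituted at inference time, which is what removes the fixed-style restriction.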

Proceedings ArticleDOI
17 Dec 2018
TL;DR: In this paper, a deep mutual learning (DML) strategy is proposed to transfer knowledge from a teacher to a student network, where an ensemble of students learn collaboratively and teach each other throughout the training process.
Abstract: Model distillation is an effective and widely used technique to transfer knowledge from a teacher to a student network. The typical application is to transfer from a powerful large network or ensemble to a small network, in order to meet the low-memory or fast execution requirements. In this paper, we present a deep mutual learning (DML) strategy. Different from the one-way transfer between a static pre-defined teacher and a student in model distillation, with DML, an ensemble of students learn collaboratively and teach each other throughout the training process. Our experiments show that a variety of network architectures benefit from mutual learning and achieve compelling results on both category and instance recognition tasks. Surprisingly, it is revealed that no prior powerful teacher network is necessary: mutual learning of a collection of simple student networks works, and moreover outperforms distillation from a more powerful yet static teacher.
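For two students, the mutual-learning objective pairs each network's supervised cross-entropy with a KL term pulling it toward its peer's predictions. A numpy sketch under that two-student reading (the function and variable names are illustrative, not from the paper's code):

```python
import numpy as np

def softmax(z):
    e = np.exp(z - z.max(axis=-1, keepdims=True))  # stabilized
    return e / e.sum(axis=-1, keepdims=True)

def kl(p, q, eps=1e-12):
    """KL(p || q), averaged over the batch."""
    return np.mean(np.sum(p * (np.log(p + eps) - np.log(q + eps)), axis=-1))

def mutual_learning_losses(logits1, logits2, labels):
    """Per-student losses for two-network DML: each student minimizes its
    own cross-entropy on the labels plus the KL divergence from its peer's
    predicted distribution to its own."""
    p1, p2 = softmax(logits1), softmax(logits2)
    n = len(labels)
    ce1 = -np.mean(np.log(p1[np.arange(n), labels] + 1e-12))
    ce2 = -np.mean(np.log(p2[np.arange(n), labels] + 1e-12))
    return ce1 + kl(p2, p1), ce2 + kl(p1, p2)
```

With more than two students, each network's mimicry term averages the KL against every other peer; both students are updated each step, so neither plays the static teacher of classic distillation.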

Posted Content
TL;DR: In this article, it was shown that consumer's surplus can be used to estimate the unobservable compensating and equivalent variations, the correct theoretical measures of the welfare impact of changes in prices and income on an individual.
Abstract: The purpose of this paper is to settle the controversy surrounding consumer's surplus and, by so doing, to validate its use as a tool of welfare economics. I will show that observed consumer's surplus can be rigorously utilized to estimate the unobservable compensating and equivalent variations, the correct theoretical measures of the welfare impact of changes in prices and income on an individual. I derive precise upper and lower bounds on the percentage errors of approximating the compensating and equivalent variations with consumer's surplus. These bounds can be explicitly calculated from observable demand data, and it is clear that in most applications the error of approximation will be very small. In fact, the error will often be overshadowed by the errors involved in estimating the demand curve. The results in no way depend upon arguments about the constancy of the marginal utility of income. Consequently, this paper supplies specific empirical criteria which can replace the apologetic caveats frequently employed by those who presently apply consumer's surplus. Moreover, the results imply that consumer's surplus is usually a very good approximation to the appropriate welfare measures. To preview, below I establish the validity of these rules of thumb: For a
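The flavor of these rules of thumb can be stated in a hedged form. Writing $A$ for the observed consumer's surplus, $m^{0}$ for base income, and $\eta$ for the income elasticity of demand, the standard restatement of Willig's approximation (the symbols and constant here follow the usual textbook presentation, not a verbatim quotation of the paper) is

$$
\frac{CV - A}{|A|} \;\approx\; \frac{\eta\, A}{2\, m^{0}},
$$

with the analogous expression of opposite sign for the equivalent variation $EV$. Whenever $\eta |A| / (2 m^{0})$ is small, say a few percent, consumer's surplus approximates both welfare measures to within that same few percent, and every quantity on the right-hand side is observable from demand data.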

Proceedings ArticleDOI
27 Jun 2016
TL;DR: It is shown that a simple tracker combining complementary cues in a ridge regression framework can operate faster than 80 FPS and outperform not only all entries in the popular VOT14 competition, but also recent and far more sophisticated trackers according to multiple benchmarks.
Abstract: Correlation Filter-based trackers have recently achieved excellent performance, showing great robustness to challenging situations exhibiting motion blur and illumination changes. However, since the model that they learn depends strongly on the spatial layout of the tracked object, they are notoriously sensitive to deformation. Models based on colour statistics have complementary traits: they cope well with variation in shape, but suffer when illumination is not consistent throughout a sequence. Moreover, colour distributions alone can be insufficiently discriminative. In this paper, we show that a simple tracker combining complementary cues in a ridge regression framework can operate faster than 80 FPS and outperform not only all entries in the popular VOT14 competition, but also recent and far more sophisticated trackers according to multiple benchmarks.
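The ridge regression framework the abstract refers to has a closed-form solution, which is what makes such trackers fast enough to exceed 80 FPS. A generic numpy sketch of that solve (this is the textbook formulation, not the tracker's Fourier-domain implementation, which exploits circulant structure for further speed):

```python
import numpy as np

def ridge_fit(X, y, lam=1.0):
    """Closed-form ridge regression: w = (X^T X + lam * I)^{-1} X^T y.
    X is an (n_samples, n_features) design matrix; lam regularizes the
    learned filter so it generalizes beyond the exact training patch."""
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)
```

In the tracker, one such regression scores templates (the correlation-filter cue) and a complementary model scores per-pixel color statistics; their responses are merged into a single localization map.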

Journal ArticleDOI
TL;DR: A review of the current state of the art on PPCPs in the freshwater aquatic environment is presented in this article, in which the environmental risk posed by these contaminants is evaluated in light of persistence, bioaccumulation, and toxicity criteria.


Journal ArticleDOI
TL;DR: In this article, the effect of Afatinib on overall survival of patients with EGFR mutation-positive lung adenocarcinoma through an analysis of data from two open-label, randomised, phase 3 trials was evaluated.
Abstract: Summary Background We aimed to assess the effect of afatinib on overall survival of patients with EGFR mutation-positive lung adenocarcinoma through an analysis of data from two open-label, randomised, phase 3 trials. Methods Previously untreated patients with EGFR mutation-positive stage IIIB or IV lung adenocarcinoma were enrolled in LUX-Lung 3 (n=345) and LUX-Lung 6 (n=364). These patients were randomly assigned in a 2:1 ratio to receive afatinib or chemotherapy (pemetrexed-cisplatin [LUX-Lung 3] or gemcitabine-cisplatin [LUX-Lung 6]), stratified by EGFR mutation (exon 19 deletion [del19], Leu858Arg, or other) and ethnic origin (LUX-Lung 3 only). We planned analyses of mature overall survival data in the intention-to-treat population after 209 (LUX-Lung 3) and 237 (LUX-Lung 6) deaths. These ongoing studies are registered with ClinicalTrials.gov, numbers NCT00949650 and NCT01121393. Findings Median follow-up in LUX-Lung 3 was 41 months (IQR 35–44); 213 (62%) of 345 patients had died. Median follow-up in LUX-Lung 6 was 33 months (IQR 31–37); 246 (68%) of 364 patients had died. In LUX-Lung 3, median overall survival was 28·2 months (95% CI 24·6–33·6) in the afatinib group and 28·2 months (20·7–33·2) in the pemetrexed-cisplatin group (HR 0·88, 95% CI 0·66–1·17, p=0·39). In LUX-Lung 6, median overall survival was 23·1 months (95% CI 20·4–27·3) in the afatinib group and 23·5 months (18·0–25·6) in the gemcitabine-cisplatin group (HR 0·93, 95% CI 0·72–1·22, p=0·61). 
However, in preplanned analyses, overall survival was significantly longer for patients with del19-positive tumours in the afatinib group than in the chemotherapy group in both trials: in LUX-Lung 3, median overall survival was 33·3 months (95% CI 26·8–41·5) in the afatinib group versus 21·1 months (16·3–30·7) in the chemotherapy group (HR 0·54, 95% CI 0·36–0·79, p=0·0015); in LUX-Lung 6, it was 31·4 months (95% CI 24·2–35·3) versus 18·4 months (14·6–25·6), respectively (HR 0·64, 95% CI 0·44–0·94, p=0·023). By contrast, there were no significant differences by treatment group for patients with EGFR Leu858Arg-positive tumours in either trial: in LUX-Lung 3, median overall survival was 27·6 months (19·8–41·7) in the afatinib group versus 40·3 months (24·3–not estimable) in the chemotherapy group (HR 1·30, 95% CI 0·80–2·11, p=0·29); in LUX-Lung 6, it was 19·6 months (95% CI 17·0–22·1) versus 24·3 months (19·0–27·0), respectively (HR 1·22, 95% CI 0·81–1·83, p=0·34). In both trials, the most common afatinib-related grade 3–4 adverse events were rash or acne (37 [16%] of 229 patients in LUX-Lung 3 and 35 [15%] of 239 patients in LUX-Lung 6), diarrhoea (33 [14%] and 13 [5%]), paronychia (26 [11%] in LUX-Lung 3 only), and stomatitis or mucositis (13 [5%] in LUX-Lung 6 only). In LUX-Lung 3, neutropenia (20 [18%] of 111 patients), fatigue (14 [13%]) and leucopenia (nine [8%]) were the most common chemotherapy-related grade 3–4 adverse events, while in LUX-Lung 6, the most common chemotherapy-related grade 3–4 adverse events were neutropenia (30 [27%] of 113 patients), vomiting (22 [19%]), and leucopenia (17 [15%]). Interpretation Although afatinib did not improve overall survival in the whole population of either trial, overall survival was improved with the drug for patients with del19 EGFR mutations. 
The absence of an effect in patients with Leu858Arg EGFR mutations suggests that EGFR del19-positive disease might be distinct from Leu858Arg-positive disease and that these subgroups should be analysed separately in future trials. Funding Boehringer Ingelheim.

Journal ArticleDOI
03 Dec 2015-Cell
TL;DR: It is demonstrated that context-dependent fitness genes accurately recapitulate pathway-specific genetic vulnerabilities induced by known oncogenes and reveal cell-type-specific dependencies for specific receptor tyrosine kinases, even in oncogenic KRAS backgrounds.