
Journal ArticleDOI
Adam M. Session1, Adam M. Session2, Yoshinobu Uno3, Taejoon Kwon4, Taejoon Kwon5, Jarrod Chapman1, Atsushi Toyoda6, Shuji Takahashi7, Akimasa Fukui8, Akira Hikosaka7, Atsushi Suzuki7, Mariko Kondo9, Simon J. van Heeringen10, Ian K. Quigley11, Sven Heinz11, Hajime Ogino12, Haruki Ochi13, Uffe Hellsten1, Jessica B. Lyons2, Oleg Simakov14, Nicholas H. Putnam, Jonathan C. Stites, Yoko Kuroki, Toshiaki Tanaka15, Tatsuo Michiue9, Minoru Watanabe16, Ozren Bogdanovic17, Ryan Lister17, Georgios Georgiou10, Sarita S. Paranjpe10, Ila van Kruijsbergen10, Shengquiang Shu1, Joseph W. Carlson1, Tsutomu Kinoshita18, Yuko Ohta19, Shuuji Mawaribuchi20, Jerry Jenkins1, Jane Grimwood1, Jeremy Schmutz1, Therese Mitros2, Sahar V. Mozaffari21, Yutaka Suzuki9, Yoshikazu Haramoto22, Takamasa S. Yamamoto23, Chiyo Takagi23, Rebecca Heald2, Kelly E. Miller2, Christian D. Haudenschild24, Jacob O. Kitzman25, Takuya Nakayama26, Yumi Izutsu27, Jacques Robert28, Joshua D. Fortriede29, Kevin A. Burns, Vaneet Lotay30, Kamran Karimi30, Yuuri Yasuoka14, Darwin S. Dichmann2, Martin F. Flajnik19, Douglas W. Houston31, Jay Shendure25, Louis DuPasquier32, Peter D. Vize30, Aaron M. Zorn29, Michihiko Ito20, Edward M. Marcotte5, John B. Wallingford5, Yuzuru Ito22, Makoto Asashima22, Naoto Ueno23, Naoto Ueno33, Yoichi Matsuda3, Gert Jan C. Veenstra10, Asao Fujiyama6, Asao Fujiyama34, Asao Fujiyama33, Richard M. Harland2, Masanori Taira9, Daniel S. Rokhsar2, Daniel S. Rokhsar1, Daniel S. Rokhsar14 
20 Oct 2016-Nature
TL;DR: The Xenopus laevis genome is sequenced; it is estimated that the two diploid progenitor species diverged around 34 million years ago and combined to form an allotetraploid around 17–18 Ma, and more than 56% of all genes were retained in two homoeologous copies.
Abstract: To explore the origins and consequences of tetraploidy in the African clawed frog, we sequenced the Xenopus laevis genome and compared it to the related diploid X. tropicalis genome. We characterize the allotetraploid origin of X. laevis by partitioning its genome into two homoeologous subgenomes, marked by distinct families of 'fossil' transposable elements. On the basis of the activity of these elements and the age of hundreds of unitary pseudogenes, we estimate that the two diploid progenitor species diverged around 34 million years ago (Ma) and combined to form an allotetraploid around 17-18 Ma. More than 56% of all genes were retained in two homoeologous copies. Protein function, gene expression, and the amount of conserved flanking sequence all correlate with retention rates. The subgenomes have evolved asymmetrically, with one chromosome set more often preserving the ancestral state and the other experiencing more gene loss, deletion, rearrangement, and reduced gene expression.

761 citations



Journal ArticleDOI
TL;DR: Profex is a platform-independent open-source graphical user interface for the Rietveld refinement program BGMN.
Abstract: Profex is a graphical user interface for the Rietveld refinement program BGMN. Its interface focuses on preserving BGMN's powerful and flexible scripting features by giving direct access to BGMN input files. Very efficient workflows for single or batch refinements are achieved by managing refinement control files and structure files, by providing dialogues and shortcuts for many operations, by performing operations in the background, and by providing import filters for CIF and XML crystal structure files. Refinement results can be easily exported for further processing. State-of-the-art graphical export of diffraction patterns to pixel and vector graphics formats allows the creation of publication-quality graphs with minimum effort. Profex reads and converts a variety of proprietary raw data formats and is thus largely instrument independent. Profex and BGMN are available under an open-source license for Windows, Linux and OS X operating systems.

761 citations


Proceedings ArticleDOI
07 Jun 2015
TL;DR: This paper gives an introduction to industrial IoT systems, the related security and privacy challenges, and an outlook on possible solutions towards a holistic security framework for Industrial IoT systems.
Abstract: Today, embedded, mobile, and cyber-physical systems are ubiquitous and used in many applications, from industrial control systems and modern vehicles to critical infrastructure. Current trends and initiatives, such as "Industrie 4.0" and the Internet of Things (IoT), promise innovative business models and novel user experiences through strong connectivity and effective use of the next generation of embedded devices. These systems generate, process, and exchange vast amounts of security-critical and privacy-sensitive data, which makes them attractive targets of attacks. Cyberattacks on IoT systems are very critical since they may cause physical damage and even threaten human lives. The complexity of these systems and the potential impact of cyberattacks bring about new threats. This paper gives an introduction to Industrial IoT systems, the related security and privacy challenges, and an outlook on possible solutions towards a holistic security framework for Industrial IoT systems.

761 citations


Journal ArticleDOI
21 Sep 2017-Cell
TL;DR: Polycomb (PcG) and Trithorax (TrxG) group proteins are evolutionarily conserved chromatin-modifying factors originally identified as part of an epigenetic cellular memory system that maintains repressed or active gene expression states.

761 citations


Journal ArticleDOI
TL;DR: In this article, the authors examined whether cumulative anticholinergic use is associated with a higher risk for incident dementia and observed a 10-year cumulative dose-response relationship for both dementia and Alzheimer disease.
Abstract: Importance Many medications have anticholinergic effects. In general, anticholinergic-induced cognitive impairment is considered reversible on discontinuation of anticholinergic therapy. However, a few studies suggest that anticholinergics may be associated with an increased risk for dementia. Objective To examine whether cumulative anticholinergic use is associated with a higher risk for incident dementia. Design, Setting, and Participants Prospective population-based cohort study using data from the Adult Changes in Thought study in Group Health, an integrated health care delivery system in Seattle, Washington. We included 3434 participants 65 years or older with no dementia at study entry. Initial recruitment occurred from 1994 through 1996 and from 2000 through 2003. Beginning in 2004, continuous replacement for deaths occurred. All participants were followed up every 2 years. Data through September 30, 2012, were included in these analyses. Exposures Computerized pharmacy dispensing data were used to ascertain cumulative anticholinergic exposure, which was defined as the total standardized daily doses (TSDDs) dispensed in the past 10 years. The most recent 12 months of use was excluded to avoid use related to prodromal symptoms. Cumulative exposure was updated as participants were followed up over time. Main Outcomes and Measures Incident dementia and Alzheimer disease using standard diagnostic criteria. Statistical analysis used Cox proportional hazards regression models adjusted for demographic characteristics, health behaviors, and health status, including comorbidities. Results The most common anticholinergic classes used were tricyclic antidepressants, first-generation antihistamines, and bladder antimuscarinics. During a mean follow-up of 7.3 years, 797 participants (23.2%) developed dementia (637 of these [79.9%] developed Alzheimer disease). 
A 10-year cumulative dose-response relationship was observed for dementia and Alzheimer disease (test for trend). Conclusions and Relevance: Higher cumulative anticholinergic use is associated with an increased risk for dementia. Efforts to increase awareness among health care professionals and older adults about this potential medication-related risk are important to minimize anticholinergic use over time.
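The exposure metric in this study, total standardized daily doses (TSDDs), sums over dispensings the quantity dispensed times strength, divided by the drug's minimum effective daily dose. The sketch below illustrates only that arithmetic; the drug name and dose standard are hypothetical, not the study's actual values.

```python
def total_standardized_daily_doses(dispensings, min_daily_dose):
    """Cumulative TSDD sketch.

    dispensings: iterable of (drug, quantity_dispensed, strength_per_unit)
    min_daily_dose: hypothetical map from drug name to its minimum
    effective daily dose (the study's actual dose standards are not
    reproduced here).
    """
    return sum(quantity * strength / min_daily_dose[drug]
               for drug, quantity, strength in dispensings)
```

For example, 30 tablets of 10 mg of a drug whose minimum effective daily dose is 10 mg would contribute 30 TSDDs.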

761 citations


Journal ArticleDOI
TL;DR: This review gives both sides of the story: the current best theory of quantum security, and an extensive survey of what makes quantum cryptosystems safe in practice.
Abstract: Some years ago quantum hacking became popular: devices implementing supposedly unbreakable quantum cryptography were shown to have imperfections that could be exploited by attackers. Security has since been thoroughly enhanced, as a consequence of both theoretical and experimental advances. This review gives both sides of the story, with the current best theory of quantum security and an extensive survey of what makes quantum cryptosystems safe in practice.

761 citations


Journal ArticleDOI
01 Jul 2015-Thyroid
TL;DR: These inaugural guidelines provide recommendations for the evaluation and management of thyroid nodules in children and adolescents, including the role and interpretation of ultrasound, fine-needle aspiration cytology, and the management of benign nodules.
Abstract: Background: Previous guidelines for the management of thyroid nodules and cancers were geared toward adults. Compared with thyroid neoplasms in adults, however, those in the pediatric population ex...

760 citations


Journal ArticleDOI
TL;DR: This review outlines some of the advantages and challenges that may accompany a transition from macroscopic to microfluidic cell culture, and focuses on decisive factors that distinguish macroscopic from microfluidic cell culture to encourage a reconsideration of how macroscopic cell culture principles might apply to microfluidic cell culture.

760 citations


Journal ArticleDOI
03 Nov 2016-Cell
TL;DR: Validation of two predicted host–microbial interactions reveals that TNFα and IFNγ production are associated with specific microbial metabolic pathways: palmitoleic acid metabolism and tryptophan degradation to tryptophol.

760 citations


Journal ArticleDOI
TL;DR: Deep learning has achieved remarkable success in diverse applications; however, its use in solving partial differential equations (PDEs) has emerged only recently. This article presents an overview of deep learning for PDEs.
Abstract: Deep learning has achieved remarkable success in diverse applications; however, its use in solving partial differential equations (PDEs) has emerged only recently. Here, we present an overview of p...

Journal ArticleDOI
08 Mar 2017-Nature
TL;DR: This work observes long-lived temporal correlations, experimentally identifies the phase boundary and finds that the temporal order is protected by strong interactions, which opens the door to exploring dynamical phases of matter and controlling interacting, disordered many-body systems.
Abstract: Understanding quantum dynamics away from equilibrium is an outstanding challenge in the modern physical sciences. Out-of-equilibrium systems can display a rich variety of phenomena, including self-organized synchronization and dynamical phase transitions. More recently, advances in the controlled manipulation of isolated many-body systems have enabled detailed studies of non-equilibrium phases in strongly interacting quantum matter; for example, the interplay between periodic driving, disorder and strong interactions has been predicted to result in exotic 'time-crystalline' phases, in which a system exhibits temporal correlations at integer multiples of the fundamental driving period, breaking the discrete time-translational symmetry of the underlying drive. Here we report the experimental observation of such discrete time-crystalline order in a driven, disordered ensemble of about one million dipolar spin impurities in diamond at room temperature. We observe long-lived temporal correlations, experimentally identify the phase boundary and find that the temporal order is protected by strong interactions. This order is remarkably stable to perturbations, even in the presence of slow thermalization. Our work opens the door to exploring dynamical phases of matter and controlling interacting, disordered many-body systems.

Proceedings ArticleDOI
06 Apr 2017
TL;DR: In this article, a quadruplet deep network using a margin-based online hard negative mining is proposed based on the quadruplet loss for person ReID, which can lead to the model output with a larger interclass variation and a smaller intra-class variation compared to the triplet loss.
Abstract: Person re-identification (ReID) is an important task in wide-area video surveillance which focuses on identifying people across different cameras. Recently, deep learning networks with a triplet loss have become a common framework for person ReID. However, the triplet loss mainly attends to obtaining correct orders on the training set. It still suffers from a weaker generalization capability from the training set to the testing set, resulting in inferior performance. In this paper, we design a quadruplet loss, which can lead to model outputs with a larger inter-class variation and a smaller intra-class variation compared to the triplet loss. As a result, our model has better generalization ability and can achieve higher performance on the testing set. In particular, a quadruplet deep network using margin-based online hard negative mining is proposed based on the quadruplet loss for person ReID. In extensive experiments, the proposed network outperforms most of the state-of-the-art algorithms on representative datasets, which clearly demonstrates the effectiveness of our proposed method.
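The quadruplet loss described above adds a second term to the usual triplet term: the anchor–positive distance must also be smaller, by a second margin, than the distance between samples from two *different* negative identities, which constrains intra-class variation relative to inter-class variation. A minimal NumPy sketch (the margin values and the squared-Euclidean distance here are illustrative choices, not necessarily the paper's):

```python
import numpy as np

def quadruplet_loss(anchor, positive, negative1, negative2,
                    margin1=1.0, margin2=0.5):
    """Quadruplet loss sketch on single embedding vectors.

    Term 1 is the standard triplet term: d(a, p) should be smaller than
    d(a, n1) by margin1. Term 2 additionally requires d(a, p) to be
    smaller than d(n1, n2), the distance between two different negative
    identities, by margin2.
    """
    d = lambda x, y: float(np.sum((x - y) ** 2))  # squared Euclidean
    term1 = max(0.0, d(anchor, positive) - d(anchor, negative1) + margin1)
    term2 = max(0.0, d(anchor, positive) - d(negative1, negative2) + margin2)
    return term1 + term2
```

When the anchor and positive are close and the negatives are far apart, both hinge terms vanish and the loss is zero.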

Journal ArticleDOI
TL;DR: This article discusses research findings concerning salient contextual issues that might influence or alter the impact of telecommuting, including the nature of the work performed while telecommuting, interpersonal processes such as knowledge sharing and innovation, and additional considerations that include motives for telecommuting such as family responsibilities.
Abstract: Telecommuting has become an increasingly popular work mode that has generated significant interest from scholars and practitioners alike. With recent advances in technology that enable mobile connections at ever-affordable rates, working away from the office as a telecommuter has become increasingly available to many workers around the world. Since the term telecommuting was first coined in the 1970s, scholars and practitioners have debated the merits of working away from the office, as it represents a fundamental shift in how organizations have historically done business. Complicating efforts to truly understand the implications of telecommuting have been the widely varying definitions and conceptualizations of telecommuting and the diverse fields in which research has taken place.

Our objective in this article is to review existing research on telecommuting in an effort to better understand what we as a scientific community know about telecommuting and its implications. In so doing, we aim to bring to the surface some of the intricacies associated with telecommuting research so that we may shed insights into the debate regarding telecommuting's benefits and drawbacks. We attempt to sift through the divergent and at times conflicting literature to develop an overall sense of the status of our scientific findings, in an effort to identify not only what we know and what we think we know about telecommuting, but also what we must yet learn to fully understand this increasingly important work mode.

After a brief review of the history of telecommuting and its prevalence, we begin by discussing the definitional challenges inherent within existing literature and offer a comprehensive definition of telecommuting rooted in existing research. Our review starts by highlighting the need to interpret existing findings with an understanding of how the extent of telecommuting practiced by participants in a study is likely to alter conclusions that may be drawn.
We then review telecommuting's implications for employees' work-family issues, attitudes, and work outcomes, including job satisfaction, organizational commitment and identification, stress, performance, wages, withdrawal behaviors, and firm-level metrics. Our article continues by discussing research findings concerning salient contextual issues that might influence or alter the impact of telecommuting, including the nature of the work performed while telecommuting, interpersonal processes such as knowledge sharing and innovation, and additional considerations that include motives for telecommuting such as family responsibilities. We also cover organizational culture and support that may shape the telecommuting experience, after which we discuss the community and societal effects of telecommuting, including its effects on traffic and emissions, business continuity, and work opportunities, as well as the potential impact on societal ties. Selected examples of telecommuting legislation and policies are also provided in an effort to inform readers regarding the status of the national debate and its legislative implications. Our synthesis concludes by offering recommendations for telecommuting research and practice that aim to improve the quality of data on telecommuting as well as identify areas of research in need of development.

Journal ArticleDOI
TL;DR: Triple therapy with fluticasone furoate, umeclidinium, and vilanterol resulted in a lower rate of moderate or severe COPD exacerbations than fluticasone furoate–vilanterol or umeclidinium–vilanterol in this population.
Abstract: Background The benefits of triple therapy for chronic obstructive pulmonary disease (COPD) with an inhaled glucocorticoid, a long-acting muscarinic antagonist (LAMA), and a long-acting β2-agonist (LABA), as compared with dual therapy (either inhaled glucocorticoid–LABA or LAMA–LABA), are uncertain. Methods In this randomized trial involving 10,355 patients with COPD, we compared 52 weeks of a once-daily combination of fluticasone furoate (an inhaled glucocorticoid) at a dose of 100 μg, umeclidinium (a LAMA) at a dose of 62.5 μg, and vilanterol (a LABA) at a dose of 25 μg (triple therapy) with fluticasone furoate–vilanterol (at doses of 100 μg and 25 μg, respectively) and umeclidinium–vilanterol (at doses of 62.5 μg and 25 μg, respectively). Each regimen was administered in a single Ellipta inhaler. The primary outcome was the annual rate of moderate or severe COPD exacerbations during treatment. Results The rate of moderate or severe exacerbations in the triple-therapy group was 0.91 per year, as...

Journal ArticleDOI
02 Jan 2017-PeerJ
TL;DR: Bracken (Bayesian Reestimation of Abundance after Classification with KrakEN) uses the taxonomic assignments made by Kraken, a very fast read-level classifier, along with information about the genomes themselves to estimate abundance at the species level, the genus level, or above.
Abstract: We describe a new, highly accurate statistical method that computes the abundance of species in DNA sequences from a metagenomics sample. Bracken (Bayesian Reestimation of Abundance after Classification with KrakEN) uses the taxonomy labels assigned by Kraken, a highly accurate metagenomics classification algorithm, to estimate the number of reads originating from each species present in a sample. Kraken classifies reads to the best matching location in the taxonomic tree, but does not estimate abundances of species. We use the Kraken database itself to derive probabilities that describe how much sequence from each genome is shared with other genomes in the database, and combine this information with the assignments for a particular sample to estimate abundance at the species level, the genus level, or above. Combined with the Kraken classifier, Bracken produces accurate species- and genus-level abundance estimates even when a sample contains multiple near-identical species.
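The redistribution step can be illustrated with a toy sketch: reads that Kraken leaves at a genus node are pushed down to species in proportion to each species' uniquely assigned reads, weighted by its (database-derived) probability of producing genus-level reads. This is a deliberate simplification of Bracken's actual Bayesian reestimation, with hypothetical inputs, not its real implementation or database format:

```python
def redistribute_genus_reads(species_reads, genus_reads, shared_fraction):
    """Toy Bracken-style redistribution of genus-level reads.

    species_reads: reads Kraken assigned uniquely to each species
    genus_reads: reads Kraken left at the parent genus node
    shared_fraction: per-species probability that a read from that genome
    maps only to the genus level (in Bracken, derived from the Kraken
    database itself)
    """
    # Expected genus-level contribution of each species, taking its
    # uniquely assigned reads as a proxy for abundance.
    weights = {sp: species_reads[sp] * shared_fraction[sp]
               for sp in species_reads}
    total = sum(weights.values())
    estimates = {}
    for sp in species_reads:
        share = genus_reads * weights[sp] / total if total > 0 else 0.0
        estimates[sp] = species_reads[sp] + share
    return estimates
```

Note that the redistribution conserves the total read count: the genus reads are split among the species, not created or dropped.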

Journal ArticleDOI
TL;DR: Among patients with heart failure with reduced ejection fraction who were hospitalized for acute decompensated heart failure, the initiation of sacubitril–valsartan therapy led to a greater reduction in the N‐terminal pro–B‐type natriuretic peptide concentration than enalapril therapy.
Abstract: Background Acute decompensated heart failure accounts for more than 1 million hospitalizations in the United States annually. Whether the initiation of sacubitril–valsartan therapy is safe and effective among patients who are hospitalized for acute decompensated heart failure is unknown. Methods We enrolled patients with heart failure with reduced ejection fraction who were hospitalized for acute decompensated heart failure at 129 sites in the United States. After hemodynamic stabilization, patients were randomly assigned to receive sacubitril–valsartan (target dose, 97 mg of sacubitril with 103 mg of valsartan twice daily) or enalapril (target dose, 10 mg twice daily). The primary efficacy outcome was the time-averaged proportional change in the N-terminal pro–B-type natriuretic peptide (NT-proBNP) concentration from baseline through weeks 4 and 8. Key safety outcomes were the rates of worsening renal function, hyperkalemia, symptomatic hypotension, and angioedema. Results Of the 881 patients wh...

Posted Content
TL;DR: This work proposes a method built upon product quantization to store the word embeddings, which produces a text classifier, derived from the fastText approach, which at test time requires only a fraction of the memory compared to the original one, without noticeably sacrificing the quality in terms of classification accuracy.
Abstract: We consider the problem of producing compact architectures for text classification, such that the full model fits in a limited amount of memory. After considering different solutions inspired by the hashing literature, we propose a method built upon product quantization to store word embeddings. While the original technique leads to a loss in accuracy, we adapt this method to circumvent quantization artefacts. Our experiments carried out on several benchmarks show that our approach typically requires two orders of magnitude less memory than fastText while being only slightly inferior with respect to accuracy. As a result, it outperforms the state of the art by a good margin in terms of the compromise between memory usage and accuracy.
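Product quantization, the core of the compression described above, splits each embedding into m subvectors and replaces each subvector with the index of its nearest centroid in a small per-subspace codebook, so an n×d float matrix shrinks to n×m one-byte codes. A minimal NumPy sketch with given (rather than k-means-learned) codebooks; names and shapes are illustrative:

```python
import numpy as np

def pq_encode(vecs, codebooks):
    """vecs: (n, d); codebooks: (m, k, d//m), one k-entry codebook
    per subvector. Returns (n, m) uint8 centroid indices."""
    m = codebooks.shape[0]
    sub = vecs.reshape(vecs.shape[0], m, -1)                       # (n, m, d/m)
    # squared distance from every subvector to every centroid
    dists = ((sub[:, :, None, :] - codebooks[None]) ** 2).sum(-1)  # (n, m, k)
    return dists.argmin(-1).astype(np.uint8)

def pq_decode(codes, codebooks):
    """Approximate reconstruction: concatenate the selected centroids."""
    m = codebooks.shape[0]
    parts = [codebooks[j][codes[:, j]] for j in range(m)]
    return np.concatenate(parts, axis=1)
```

With k ≤ 256 centroids per subspace, each code fits in one byte, so storage drops from 4·d bytes per vector to m bytes (plus the shared codebooks), which is where the orders-of-magnitude memory savings come from.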

Proceedings Article
01 Jan 2018
TL;DR: Conditional domain adversarial networks (CDANs) as discussed by the authors are designed with two novel conditioning strategies: multilinear conditioning that captures the cross-covariance between feature representations and classifier predictions to improve the discriminability, and entropy conditioning that controls the uncertainty of classifier prediction to guarantee the transferability.
Abstract: Adversarial learning has been embedded into deep networks to learn disentangled and transferable representations for domain adaptation. Existing adversarial domain adaptation methods may struggle to align different domains of multimodal distributions that are native in classification problems. In this paper, we present conditional adversarial domain adaptation, a principled framework that conditions the adversarial adaptation models on discriminative information conveyed in the classifier predictions. Conditional domain adversarial networks (CDANs) are designed with two novel conditioning strategies: multilinear conditioning that captures the cross-covariance between feature representations and classifier predictions to improve the discriminability, and entropy conditioning that controls the uncertainty of classifier predictions to guarantee the transferability. Experiments testify that the proposed approach exceeds the state-of-the-art results on five benchmark datasets.
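The two conditioning strategies above can be sketched concretely: multilinear conditioning feeds the domain discriminator the flattened outer product of the feature vector and the classifier's softmax predictions, while entropy conditioning reweights examples by prediction certainty. A small NumPy sketch of both maps (dimensions and the weighting form are as described in the abstract's spirit, but simplified):

```python
import numpy as np

def multilinear_conditioning(features, predictions):
    """Flattened outer product f ⊗ g of features (n, df) and softmax
    predictions (n, dc), giving a (n, df*dc) discriminator input that
    captures the cross-covariance between the two."""
    return np.einsum('nd,nc->ndc', features, predictions).reshape(len(features), -1)

def entropy_weight(predictions, eps=1e-12):
    """Entropy conditioning sketch: w = 1 + exp(-H(g)) upweights
    confident (low-entropy, easy-to-transfer) examples."""
    H = -(predictions * np.log(predictions + eps)).sum(axis=1)
    return 1.0 + np.exp(-H)
```

A confident (near one-hot) prediction gets weight near 2, while a maximally uncertain one is downweighted toward 1 + 1/C for C classes.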

Journal ArticleDOI
TL;DR: In this article, a rebuttal to Kenneth Arrow's famous argument that health care is special and free market economic principles do not apply is presented, based on concepts of Austrian economics. The conclusions are that free market insurance (as opposed to subsidy) handles uncertainty of demand, branding handles uncertainty of outcome, and the free market for specialized information handles information asymmetry.
Abstract: Part 2 of this 3 part series continues a rebuttal to Kenneth Arrow’s famous argument that health care is special and free market economic principles do not apply. The rebuttal is based on concepts of Austrian Economics. Part 1 of the series framed the debate and discussed general concepts. Part 2 discusses specific examples of how health care is special and does not behave according to market principles. Uncertainty of demand and uncertainty of outcome are discussed in detail. Information asymmetry is a special form of uncertainty that Kenneth Arrow claimed was somewhat unique to health care. Free market solutions to these problems are discussed in general with specific examples provided. The conclusions are that free market insurance (as opposed to subsidy) handles uncertainty of demand, branding handles uncertainty of outcome, and the free market for specialized information handles information asymmetry.

Journal ArticleDOI
TL;DR: This tutorial surveys neural network models from the perspective of natural language processing research, in an attempt to bring natural-language researchers up to speed with the neural techniques.
Abstract: Over the past few years, neural networks have re-emerged as powerful machine-learning models, yielding state-of-the-art results in fields such as image recognition and speech processing. More recently, neural network models started to be applied also to textual natural language signals, again with very promising results. This tutorial surveys neural network models from the perspective of natural language processing research, in an attempt to bring natural-language researchers up to speed with the neural techniques. The tutorial covers input encoding for natural language tasks, feed-forward networks, convolutional networks, recurrent networks and recursive networks, as well as the computation graph abstraction for automatic gradient computation.

Journal ArticleDOI
19 Aug 2016-Science
TL;DR: Common principles revealed by maternal immune activation models are described, highlighting recent findings that strengthen their relevance for schizophrenia and autism and are starting to reveal the molecular mechanisms underlying the effects of MIA on offspring.
Abstract: Epidemiological evidence implicates maternal infection as a risk factor for autism spectrum disorder and schizophrenia. Animal models corroborate this link and demonstrate that maternal immune activation (MIA) alone is sufficient to impart lifelong neuropathology and altered behaviors in offspring. This Review describes common principles revealed by these models, highlighting recent findings that strengthen their relevance for schizophrenia and autism and are starting to reveal the molecular mechanisms underlying the effects of MIA on offspring. The role of MIA as a primer for a much wider range of psychiatric and neurologic disorders is also discussed. Finally, the need for more research in this nascent field and the implications for identifying and developing new treatments for individuals at heightened risk for neuroimmune disorders are considered.

Journal ArticleDOI
TL;DR: It is reported that cobalt-phosphorous-derived films (Co-P) can act as bifunctional catalysts for overall water splitting with 100% Faradaic efficiency, rivalling the integrated performance of Pt and IrO2.
Abstract: One of the challenges to realize large-scale water splitting is the lack of active and low-cost electrocatalysts for its two half reactions: H2 and O2 evolution reactions (HER and OER). Herein, we report that cobalt-phosphorous-derived films (Co-P) can act as bifunctional catalysts for overall water splitting. The as-prepared Co-P films exhibited remarkable catalytic performance for both HER and OER in alkaline media, with a current density of 10 mA cm−2 at overpotentials of −94 mV for HER and 345 mV for OER and Tafel slopes of 42 and 47 mV/dec, respectively. They can be employed as catalysts on both anode and cathode for overall water splitting with 100 % Faradaic efficiency, rivalling the integrated performance of Pt and IrO2. The major composition of the as-prepared and post-HER films are metallic cobalt and cobalt phosphide, which partially evolved to cobalt oxide during OER.
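The Tafel slopes quoted above (42 and 47 mV/dec) come from fitting the Tafel relation η = a + b·log10(j) to overpotential-versus-current-density data. A minimal sketch of that fit on synthetic data (not the paper's measurements):

```python
import numpy as np

def tafel_slope(current_density_mA_cm2, overpotential_mV):
    """Least-squares fit of η = a + b·log10(j); returns the Tafel
    slope b in mV per decade of current density."""
    log_j = np.log10(current_density_mA_cm2)
    b, a = np.polyfit(log_j, overpotential_mV, 1)
    return b
```

A smaller slope means the overpotential grows more slowly as the current density increases, i.e. faster electrode kinetics.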

Journal ArticleDOI
TL;DR: In this review, the authors summarize the current state of understanding of the OER mechanism on PEM-compatible heterogeneous electrocatalysts, before comparing and contrasting it with the OER mechanism on homogeneous catalysts.
Abstract: The low efficiency of the electrocatalytic oxidation of water to O2 (oxygen evolution reaction, OER) is considered one of the major roadblocks for the storage of electricity from renewable sources in the form of molecular fuels like H2 or hydrocarbons. Especially in acidic environments, compatible with the powerful proton exchange membrane (PEM), an earth-abundant OER catalyst that combines high activity and high stability is still unknown. Current PEM-compatible OER catalysts still rely mostly on Ir and/or Ru as active components, which are both very scarce elements of the platinum group. Hence, the Ir and/or Ru amount in OER catalysts has to be strictly minimized. Unfortunately, the OER mechanism, which is the most powerful tool for OER catalyst optimization, still remains unclear. In this review, we first summarize the current state of our understanding of the OER mechanism on PEM-compatible heterogeneous electrocatalysts, before we compare and contrast that to the OER mechanism on homogeneous catalysts. Thereafter, an overview of monometallic OER catalysts is provided to obtain insights into structure-function relations, followed by a review of current material optimization concepts and support materials. Moreover, missing links required to complete the mechanistic picture as well as the most promising material optimization concepts are pointed out.

Journal ArticleDOI
TL;DR: Evidence indicates that motor competence is positively associated with perceived competence and multiple aspects of health, but questions related to the increased strength of associations across time and antecedent/consequent mechanisms remain.
Abstract: In 2008, Stodden and colleagues took a unique developmental approach toward addressing the potential role of motor competence in promoting positive or negative trajectories of physical activity, health-related fitness, and weight status. The conceptual model proposed synergistic relationships among physical activity, motor competence, perceived motor competence, health-related physical fitness, and obesity with associations hypothesized to strengthen over time. At the time the model was proposed, limited evidence was available to support or refute the model hypotheses. Over the past 6 years, the number of investigations exploring these relationships has increased significantly. Thus, it is an appropriate time to examine published data that directly or indirectly relate to specific pathways noted in the conceptual model. Evidence indicates that motor competence is positively associated with perceived competence and multiple aspects of health (i.e., physical activity, cardiorespiratory fitness, muscular strength, muscular endurance, and a healthy weight status). However, questions related to the increased strength of associations across time and antecedent/consequent mechanisms remain. An individual’s physical and psychological development is a complex and multifaceted process that synergistically evolves across time. Understanding the most salient factors that influence health and well-being and how relationships among these factors change across time is a critical need for future research in this area. This knowledge could aid in addressing the declining levels of physical activity and fitness along with the increasing rates of obesity across childhood and adolescence.

Posted Content
TL;DR: This study demonstrated the useful application of deep learning models to classify COVID-19 in X-ray images based on the proposed COVIDX-Net framework and indicated that clinical studies are the next milestone of this research work.
Abstract: Background and Purpose: Coronaviruses (CoV) are perilous viruses that may cause Severe Acute Respiratory Syndrome (SARS-CoV) and Middle East Respiratory Syndrome (MERS-CoV). The novel 2019 coronavirus disease (COVID-19) was first identified as a novel pneumonia in the city of Wuhan, China, at the end of 2019. It has since become a worldwide coronavirus outbreak, with the numbers of infected people and deaths increasing rapidly every day according to the updated reports of the World Health Organization (WHO). Therefore, the aim of this article is to introduce a new deep learning framework, namely COVIDX-Net, to assist radiologists in automatically diagnosing COVID-19 in X-ray images. Materials and Methods: Due to the lack of public COVID-19 datasets, the study is validated on 50 chest X-ray images with 25 confirmed positive COVID-19 cases. COVIDX-Net includes seven different deep convolutional neural network architectures, such as the modified Visual Geometry Group Network (VGG19) and the second version of Google MobileNet. Each deep neural network model analyzes the normalized intensities of the X-ray image to classify the patient status as either a negative or positive COVID-19 case. Results: Experiments and evaluation of COVIDX-Net were carried out using an 80-20% split of the X-ray images for the model training and testing phases, respectively. The VGG19 and Dense Convolutional Network (DenseNet) models showed good and similar performance in automated COVID-19 classification, with f1-scores of 0.89 and 0.91 for normal and COVID-19 cases, respectively. Conclusions: This study demonstrated the useful application of deep learning models to classify COVID-19 in X-ray images based on the proposed COVIDX-Net framework. Clinical studies are the next milestone of this research work.
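The per-class f1-scores quoted above combine precision and recall into a single number. As a minimal sketch of how such a score is computed (this is not taken from the COVIDX-Net code; the function and variable names are illustrative):

```python
def f1_per_class(y_true, y_pred, positive):
    """Harmonic mean of precision and recall for one class label."""
    tp = sum(t == positive and p == positive for t, p in zip(y_true, y_pred))
    fp = sum(t != positive and p == positive for t, p in zip(y_true, y_pred))
    fn = sum(t == positive and p != positive for t, p in zip(y_true, y_pred))
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    if precision + recall == 0.0:
        return 0.0
    return 2 * precision * recall / (precision + recall)
```

With labels such as "normal" and "covid", calling this once per class yields the pair of f1-scores the abstract reports for VGG19 and DenseNet.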

Journal ArticleDOI
TL;DR: An in-depth analysis of twelve proposal methods along with four baselines regarding proposal repeatability, ground truth annotation recall on PASCAL, ImageNet, and MS COCO, and their impact on DPM, R-CNN, and Fast R- CNN detection performance shows that for object detection improving proposal localisation accuracy is as important as improving recall.
Abstract: Current top performing object detectors employ detection proposals to guide the search for objects, thereby avoiding exhaustive sliding window search across images. Despite the popularity and widespread use of detection proposals, it is unclear which trade-offs are made when using them during object detection. We provide an in-depth analysis of twelve proposal methods along with four baselines regarding proposal repeatability, ground truth annotation recall on PASCAL, ImageNet, and MS COCO, and their impact on DPM, R-CNN, and Fast R-CNN detection performance. Our analysis shows that, for object detection, improving proposal localisation accuracy is as important as improving recall. We introduce a novel metric, the average recall (AR), which rewards both high recall and good localisation and correlates surprisingly well with detection performance. Our findings show common strengths and weaknesses of existing methods, and provide insights and metrics for selecting and tuning proposal methods.
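The average recall (AR) metric described above averages proposal recall over a range of intersection-over-union (IoU) thresholds, so a proposal set is rewarded for covering ground truth tightly, not just loosely. A minimal sketch of such a computation, assuming axis-aligned boxes given as (x1, y1, x2, y2) and a 0.5-1.0 threshold range (names are illustrative, not the authors' code):

```python
def iou(a, b):
    """Intersection over union of two (x1, y1, x2, y2) boxes."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    union = ((a[2] - a[0]) * (a[3] - a[1])
             + (b[2] - b[0]) * (b[3] - b[1]) - inter)
    return inter / union if union > 0 else 0.0

def average_recall(gt_boxes, proposals):
    """Recall averaged over IoU thresholds 0.5, 0.55, ..., 1.0."""
    thresholds = [i / 20 for i in range(10, 21)]
    # For each ground-truth box, the best overlap any proposal achieves.
    best = [max((iou(g, p) for p in proposals), default=0.0)
            for g in gt_boxes]
    recalls = [sum(b >= t for b in best) / len(gt_boxes)
               for t in thresholds]
    return sum(recalls) / len(recalls)
```

A perfectly localised proposal scores 1.0, while a proposal with only 0.5 IoU contributes to recall at just the loosest threshold, which is how AR penalises sloppy localisation.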

Proceedings Article
06 Dec 2017
TL;DR: Quantized SGD (QSGD) as discussed by the authors is a family of compression schemes for gradient updates which provides convergence guarantees for convex and nonconvex objectives, under asynchrony, and can be extended to stochastic variance-reduced techniques.
Abstract: Parallel implementations of stochastic gradient descent (SGD) have received significant research attention, thanks to its excellent scalability properties. A fundamental barrier when parallelizing SGD is the high bandwidth cost of communicating gradient updates between nodes; consequently, several lossy compression heuristics have been proposed, by which nodes only communicate quantized gradients. Although effective in practice, these heuristics do not always guarantee convergence, and it is not clear whether they can be improved. In this paper, we propose Quantized SGD (QSGD), a family of compression schemes for gradient updates which provides convergence guarantees. QSGD allows the user to smoothly trade off communication bandwidth and convergence time: nodes can adjust the number of bits sent per iteration, at the cost of possibly higher variance. We show that this trade-off is inherent, in the sense that improving it past some threshold would violate information-theoretic lower bounds. QSGD guarantees convergence for convex and non-convex objectives, under asynchrony, and can be extended to stochastic variance-reduced techniques. When applied to training deep neural networks for image classification and automated speech recognition, QSGD leads to significant reductions in end-to-end training time. For example, on 16 GPUs, we can train the ResNet152 network to full accuracy on ImageNet 1.8x faster than the full-precision variant.
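The core idea behind QSGD-style compression is stochastic rounding: each gradient coordinate, scaled by the vector norm, is rounded to one of s discrete levels with probabilities chosen so that the quantized vector is an unbiased estimate of the original. A simplified sketch of such a scheme under that assumption (the paper's full encoding additionally compresses the integer levels, e.g. with variable-length integer codes, which is omitted here):

```python
import math
import random

def qsgd_quantize(v, s, rng=random):
    """Stochastically quantize v to s levels per coordinate.

    Returns (norm, signs, levels); the expectation of the decoded
    vector equals v (unbiased stochastic rounding).
    """
    norm = math.sqrt(sum(x * x for x in v))
    if norm == 0.0:
        return 0.0, [1] * len(v), [0] * len(v)
    signs, levels = [], []
    for x in v:
        r = abs(x) / norm * s          # position in [0, s]
        level = math.floor(r)
        # Round up with probability equal to the fractional part.
        if rng.random() < r - level:
            level += 1
        signs.append(1 if x >= 0 else -1)
        levels.append(level)
    return norm, signs, levels

def qsgd_decode(norm, signs, levels, s):
    """Reconstruct the (lossy) gradient from its compressed form."""
    return [norm * sg * lv / s for sg, lv in zip(signs, levels)]
```

Only the scalar norm, one sign bit, and a small integer per coordinate need to be communicated; choosing a larger s sends more bits per iteration but lowers the quantization variance, which is the bandwidth/convergence trade-off the abstract describes.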

Journal ArticleDOI
17 Jun 2016-Science
TL;DR: Various mechanisms controlling ribosome scanning and initiation codon selection by 5′ upstream open reading frames, translation initiation factors, and primary and secondary structures of the 5′UTR, including particular sequence motifs are described.
Abstract: The eukaryotic 5′ untranslated region (UTR) is critical for ribosome recruitment to the messenger RNA (mRNA) and start codon choice and plays a major role in the control of translation efficiency and shaping the cellular proteome. The ribosomal initiation complex is assembled on the mRNA via a cap-dependent or cap-independent mechanism. We describe various mechanisms controlling ribosome scanning and initiation codon selection by 5′ upstream open reading frames, translation initiation factors, and primary and secondary structures of the 5′UTR, including particular sequence motifs. We also discuss translational control via phosphorylation of eukaryotic initiation factor 2, which is implicated in learning and memory, neurodegenerative diseases, and cancer.