Institution
Wuhan University
Education • Wuhan, China
About: Wuhan University is an education organization based in Wuhan, China. It is known for its research contributions in the topics of Computer science and Population. The organization has 92,849 authors who have published 92,882 publications receiving 1,691,049 citations. The organization is also known as WHU and Wuhan College.
Topics: Computer science, Population, Catalysis, Feature extraction, Apoptosis
Papers published on a yearly basis
Papers
Gregory A. Roth, Degu Abate, Kalkidan Hassen Abate, +1,025 more • 333 institutions
TL;DR: Non-communicable diseases comprised the greatest fraction of deaths, contributing to 73·4% (95% uncertainty interval [UI] 72·5–74·1) of total deaths in 2017, while communicable, maternal, neonatal, and nutritional causes accounted for 18·6% (17·9–19·6), and injuries 8·0% (7·7–8·2).
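The three broad cause groups above are mutually exclusive and together cover all deaths, so their point estimates should sum to 100%. A quick check on the figures quoted in the summary (point estimates only, ignoring the uncertainty intervals):

```python
# Point estimates (% of total 2017 deaths) quoted in the summary above
ncd = 73.4       # non-communicable diseases
cmnn = 18.6      # communicable, maternal, neonatal, and nutritional causes
injuries = 8.0   # injuries

total = round(ncd + cmnn + injuries, 1)
print(total)  # 100.0
```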
5,211 citations
TL;DR: In this paper, the authors present a set of guidelines for the selection and interpretation of methods for use by investigators who aim to examine macroautophagy and related processes, as well as for reviewers who need to provide realistic and reasonable critiques of papers that are focused on these processes.
Abstract: In 2008 we published the first set of guidelines for standardizing research in autophagy. Since then, research on this topic has continued to accelerate, and many new scientists have entered the field. Our knowledge base and relevant new technologies have also been expanding. Accordingly, it is important to update these guidelines for monitoring autophagy in different organisms. Various reviews have described the range of assays that have been used for this purpose. Nevertheless, there continues to be confusion regarding acceptable methods to measure autophagy, especially in multicellular eukaryotes.
For example, a key point that needs to be emphasized is that there is a difference between measurements that monitor the numbers or volume of autophagic elements (e.g., autophagosomes or autolysosomes) at any stage of the autophagic process versus those that measure flux through the autophagy pathway (i.e., the complete process including the amount and rate of cargo sequestered and degraded). In particular, a block in macroautophagy that results in autophagosome accumulation must be differentiated from stimuli that increase autophagic activity, defined as increased autophagy induction coupled with increased delivery to, and degradation within, lysosomes (in most higher eukaryotes and some protists such as Dictyostelium) or the vacuole (in plants and fungi). In other words, it is especially important that investigators new to the field understand that the appearance of more autophagosomes does not necessarily equate with more autophagy. In fact, in many cases, autophagosomes accumulate because of a block in trafficking to lysosomes without a concomitant change in autophagosome biogenesis, whereas an increase in autolysosomes may reflect a reduction in degradative activity. It is worth emphasizing here that lysosomal digestion is a stage of autophagy and evaluating its competence is a crucial part of the evaluation of autophagic flux, or complete autophagy.
Here, we present a set of guidelines for the selection and interpretation of methods for use by investigators who aim to examine macroautophagy and related processes, as well as for reviewers who need to provide realistic and reasonable critiques of papers that are focused on these processes. These guidelines are not meant to be a formulaic set of rules, because the appropriate assays depend in part on the question being asked and the system being used. In addition, we emphasize that no individual assay is guaranteed to be the most appropriate one in every situation, and we strongly recommend the use of multiple assays to monitor autophagy. Along these lines, because of the potential for pleiotropic effects due to blocking autophagy through genetic manipulation, it is imperative to target by gene knockout or RNA interference more than one autophagy-related protein. In addition, some individual Atg proteins, or groups of proteins, are involved in other cellular pathways implying that not all Atg proteins can be used as a specific marker for an autophagic process. In these guidelines, we consider these various methods of assessing autophagy and what information can, or cannot, be obtained from them. Finally, by discussing the merits and limits of particular assays, we hope to encourage technical innovation in the field.
5,187 citations
02 Mar 2020
TL;DR: Among Chinese health care workers exposed to COVID-19, women, nurses, those in Wuhan, and frontline health care workers have a high risk of developing unfavorable mental health outcomes and may need psychological support or interventions.
Abstract: Importance: Health care workers exposed to coronavirus disease 2019 (COVID-19) could be psychologically stressed.

Objective: To assess the magnitude of mental health outcomes and associated factors among health care workers treating patients exposed to COVID-19 in China.

Design, Setting, and Participants: This cross-sectional, survey-based, region-stratified study collected demographic data and mental health measurements from 1257 health care workers in 34 hospitals in China from January 29, 2020, to February 3, 2020. Health care workers in hospitals equipped with fever clinics or wards for patients with COVID-19 were eligible.

Main Outcomes and Measures: The degree of symptoms of depression, anxiety, insomnia, and distress was assessed by the Chinese versions of the 9-item Patient Health Questionnaire, the 7-item Generalized Anxiety Disorder scale, the 7-item Insomnia Severity Index, and the 22-item Impact of Event Scale–Revised, respectively. Multivariable logistic regression analysis was performed to identify factors associated with mental health outcomes.

Results: A total of 1257 of 1830 contacted individuals completed the survey, a participation rate of 68.7%. A total of 813 (64.7%) were aged 26 to 40 years, and 964 (76.7%) were women. Of all participants, 764 (60.8%) were nurses and 493 (39.2%) were physicians; 760 (60.5%) worked in hospitals in Wuhan, and 522 (41.5%) were frontline health care workers. A considerable proportion of participants reported symptoms of depression (634 [50.4%]), anxiety (560 [44.6%]), insomnia (427 [34.0%]), and distress (899 [71.5%]). Nurses, women, frontline health care workers, and those working in Wuhan reported more severe degrees of all mental health symptom measurements than other health care workers (eg, median [interquartile range {IQR}] Patient Health Questionnaire scores among physicians vs nurses: 4.0 [1.0-7.0] vs 5.0 [2.0-8.0]; P = .007; median [IQR] Generalized Anxiety Disorder scale scores among men vs women: 2.0 [0-6.0] vs 4.0 [1.0-7.0]).

Conclusions and Relevance: In this survey of health care workers in hospitals equipped with fever clinics or wards for patients with COVID-19 in Wuhan and other regions in China, participants reported experiencing psychological burden, especially nurses, women, those in Wuhan, and frontline health care workers directly engaged in the diagnosis, treatment, and care of patients with COVID-19.
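The study's factor analysis relies on multivariable logistic regression, which models the log-odds of a binary outcome as a linear function of the covariates. As a minimal illustration of the technique only (the data below are entirely synthetic and the gradient-descent fit is a sketch, not the study's analysis), a one-covariate version can be fit like this:

```python
import math

def fit_logistic(xs, ys, lr=0.5, epochs=5000):
    """Fit P(y=1) = sigmoid(b0 + b1*x) by gradient descent on the average log-loss."""
    b0, b1 = 0.0, 0.0
    n = len(xs)
    for _ in range(epochs):
        g0 = g1 = 0.0
        for x, y in zip(xs, ys):
            p = 1.0 / (1.0 + math.exp(-(b0 + b1 * x)))  # predicted probability
            g0 += (p - y) / n
            g1 += (p - y) * x / n
        b0 -= lr * g0
        b1 -= lr * g1
    return b0, b1

# Synthetic data: binary exposure x (e.g., "frontline" yes/no), binary outcome y
xs = [0, 0, 0, 0, 1, 1, 1, 1]
ys = [0, 0, 0, 1, 0, 1, 1, 1]
b0, b1 = fit_logistic(xs, ys)
print(math.exp(b1))  # estimated odds ratio for the exposure; > 1 means higher risk
```

In practice, a statistics package (e.g., R's `glm` or Python's statsmodels) would be used, and it would also report confidence intervals and P values for each coefficient.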
5,157 citations
TL;DR: The Global Burden of Diseases, Injuries, and Risk Factors Study 2015 (GBD 2015) estimated the incidence, prevalence, and years lived with disability for diseases and injuries at the global, regional, and national levels over the period 1990 to 2015.
5,050 citations
01 Oct 2020
TL;DR: Transformers is an open-source library that consists of carefully engineered state-of-the-art Transformer architectures under a unified API and a curated collection of pretrained models made by and available for the community.
Abstract: Recent progress in natural language processing has been driven by advances in both model architecture and model pretraining. Transformer architectures have facilitated building higher-capacity models, and pretraining has made it possible to effectively utilize this capacity for a wide variety of tasks. Transformers is an open-source library with the goal of opening up these advances to the wider machine learning community. The library consists of carefully engineered state-of-the-art Transformer architectures under a unified API. Backing this library is a curated collection of pretrained models made by and available for the community. Transformers is designed to be extensible by researchers, simple for practitioners, and fast and robust in industrial deployments. The library is available at https://github.com/huggingface/transformers.
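The "unified API over many architectures" design the abstract describes can be sketched in plain Python. This is an illustrative stand-in, not the library's actual code: the entry point mirrors the real `AutoModel.from_pretrained` call from Transformers, but the registry and the two toy model classes here are invented for the sketch.

```python
# Toy model classes standing in for real architectures (hypothetical, for illustration)
class BertLike:
    def encode(self, text):
        return [len(w) for w in text.split()]  # stand-in for real tokenization/encoding

class GptLike:
    def encode(self, text):
        return [ord(c) % 97 for c in text]

# Registry mapping checkpoint names to architecture classes
_REGISTRY = {"bert-base": BertLike, "gpt2": GptLike}

class AutoModel:
    @staticmethod
    def from_pretrained(name):
        """Single entry point: dispatch on the checkpoint name, as the real library does."""
        return _REGISTRY[name]()

model = AutoModel.from_pretrained("bert-base")
print(model.encode("unified API"))  # [7, 3]
```

The payoff of this pattern is that user code never branches on architecture: swapping `"bert-base"` for `"gpt2"` changes the model loaded, not the calling code.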
4,798 citations
Authors
Showing all 93441 results
Name | H-index | Papers | Citations |
---|---|---|---|
Jing Wang | 184 | 4046 | 202769 |
Jiaguo Yu | 178 | 730 | 113300 |
Lei Jiang | 170 | 2244 | 135205 |
Gang Chen | 167 | 3372 | 149819 |
Omar M. Yaghi | 165 | 459 | 163918 |
Xiang Zhang | 154 | 1733 | 117576 |
Yi Yang | 143 | 2456 | 92268 |
Thomas P. Russell | 141 | 1012 | 80055 |
Jun Chen | 136 | 1856 | 77368 |
Lei Zhang | 135 | 2240 | 99365 |
Chuan He | 130 | 584 | 66438 |
Han Zhang | 130 | 970 | 58863 |
Lei Zhang | 130 | 2312 | 86950 |
Zhen Li | 127 | 1712 | 71351 |
Chao Zhang | 127 | 3119 | 84711 |