Institution

Wuhan University

Education · Wuhan, China
About: Wuhan University is an education organization based in Wuhan, China. It is known for research contributions in the topics of Computer science and Population. The organization has 92,849 authors who have published 92,882 publications receiving 1,691,049 citations. The organization is also known as WHU and Wuhan College.


Papers
Journal ArticleDOI
TL;DR: This article introduces essential knowledge about COVID-19 and nosocomial infection in dental settings, provides recommended management protocols for dental practitioners and students in (potentially) affected areas, and argues that strict and effective infection control protocols are urgently needed.
Abstract: The epidemic of coronavirus disease 2019 (COVID-19), originating in Wuhan, China, has become a major public health challenge for not only China but also countries around the world. The World Health Organization announced that the outbreaks of the novel coronavirus have constituted a public health emergency of international concern. As of February 26, 2020, COVID-19 has been recognized in 34 countries, with a total of 80,239 laboratory-confirmed cases and 2,700 deaths. Infection control measures are necessary to prevent the virus from further spreading and to help control the epidemic situation. Due to the characteristics of dental settings, the risk of cross infection can be high between patients and dental practitioners. For dental practices and hospitals in areas that are (potentially) affected with COVID-19, strict and effective infection control protocols are urgently needed. This article, based on our experience and relevant guidelines and research, introduces essential knowledge about COVID-19 and nosocomial infection in dental settings and provides recommended management protocols for dental practitioners and students in (potentially) affected areas.

1,377 citations

Journal ArticleDOI
TL;DR: The authors used Landsat TM and ETM+ images of the Pearl River Delta (PRD) from 1990 to 2000 to retrieve brightness temperatures and land-use/cover types.

1,307 citations

Journal ArticleDOI
TL;DR: A systematic review of the methodological assessment tools for pre-clinical and clinical studies, systematic reviews and meta-analyses, and clinical practice guidelines.
Abstract: Objective To systematically review the methodological assessment tools for pre-clinical and clinical studies, systematic review and meta-analysis, and clinical practice guideline. Methods We searched PubMed, the Cochrane Handbook for Systematic Reviews of Interventions, Joanna Briggs Institute (JBI) Reviewers Manual, Centre for Reviews and Dissemination, Critical Appraisal Skills Programme (CASP), Scottish Intercollegiate Guidelines Network (SIGN), and the National Institute for Clinical Excellence (NICE) up to May 20th, 2014. Two authors selected studies and extracted data; quantitative analysis was performed to summarize the characteristics of included tools. Results We included a total of 21 assessment tools for analysis. A number of tools were developed by academic organizations, and some were developed by only a small group of researchers. The JBI developed the highest number of methodological assessment tools, with CASP coming second. Tools for assessing the methodological quality of randomized controlled studies were most abundant. The Cochrane Collaboration's tool for assessing risk of bias is the best available tool for assessing RCTs. For cohort and case-control studies, we recommend the use of the Newcastle-Ottawa Scale. The Methodological Index for Non-Randomized Studies (MINORS) is an excellent tool for assessing non-randomized interventional studies, and the Agency for Healthcare Research and Quality (ARHQ) methodology checklist is applicable for cross-sectional studies. 
For diagnostic accuracy test studies, the Quality Assessment of Diagnostic Accuracy Studies-2 (QUADAS-2) tool is recommended; the SYstematic Review Centre for Laboratory animal Experimentation (SYRCLE) risk of bias tool is available for assessing animal studies; Assessment of Multiple Systematic Reviews (AMSTAR) is a measurement tool for systematic reviews/meta-analyses; an 18-item tool has been developed for appraising case series studies, and the Appraisal of Guidelines, Research and Evaluation (AGREE)-II instrument is widely used to evaluate clinical practice guidelines. Conclusions We have successfully identified a variety of methodological assessment tools for different types of study design. However, further efforts in the development of critical appraisal tools are warranted, since there is currently a lack of such tools for other fields, e.g. genetic studies, and some existing tools (for nested case-control studies and case reports, for example) need updating to be in line with current research practice and rigor. In addition, it is very important that all critical appraisal tools are applied objectively and that performance bias is effectively avoided.

1,241 citations

Journal ArticleDOI
TL;DR: In this article, the first 30 m resolution global land-cover maps were produced using Landsat Thematic Mapper (TM) and Enhanced Thematic Mapper Plus (ETM+) data. Four freely available classifiers were employed: the conventional maximum likelihood classifier (MLC), the J4.8 decision tree classifier, the Random Forest (RF) classifier, and the support vector machine (SVM) classifier.
Abstract: We have produced the first 30 m resolution global land-cover maps using Landsat Thematic Mapper TM and Enhanced Thematic Mapper Plus ETM+ data. We have classified over 6600 scenes of Landsat TM data after 2006, and over 2300 scenes of Landsat TM and ETM+ data before 2006, all selected from the green season. These images cover most of the world's land surface except Antarctica and Greenland. Most of these images came from the United States Geological Survey in level L1T orthorectified. Four classifiers that were freely available were employed, including the conventional maximum likelihood classifier MLC, J4.8 decision tree classifier, Random Forest RF classifier and support vector machine SVM classifier. A total of 91,433 training samples were collected by traversing each scene and finding the most representative and homogeneous samples. A total of 38,664 test samples were collected at preset, fixed locations based on a globally systematic unaligned sampling strategy. Two software tools, Global Analyst and Global Mapper developed by extending the functionality of Google Earth, were used in developing the training and test sample databases by referencing the Moderate Resolution Imaging Spectroradiometer enhanced vegetation index MODIS EVI time series for 2010 and high resolution images from Google Earth. A unique land-cover classification system was developed that can be crosswalked to the existing United Nations Food and Agriculture Organization FAO land-cover classification system as well as the International Geosphere-Biosphere Programme IGBP system. Using the four classification algorithms, we obtained the initial set of global land-cover maps. The SVM produced the highest overall classification accuracy OCA of 64.9% assessed with our test samples, with RF 59.8%, J4.8 57.9%, and MLC 53.9% ranked from the second to the fourth. 
We also estimated the OCAs using a subset of our test samples 8629 each of which represented a homogeneous area greater than 500 m × 500 m. Using this subset, we found the OCA for the SVM to be 71.5%. As a consistent source for estimating the coverage of global land-cover types in the world, estimation from the test samples shows that only 6.90% of the world is planted for agricultural production. The total area of cropland is 11.51% if unplanted croplands are included. The forests, grasslands, and shrublands cover 28.35%, 13.37%, and 11.49% of the world, respectively. The impervious surface covers only 0.66% of the world. Inland waterbodies, barren lands, and snow and ice cover 3.56%, 16.51%, and 12.81% of the world, respectively.
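As an illustration of the per-pixel classification step described above, the following is a minimal sketch of a Gaussian maximum likelihood classifier (MLC), one of the four algorithms the abstract names. This is not the authors' pipeline or data: the two "land-cover classes", the six-band feature values, and all class statistics below are synthetic, invented purely for the example.

```python
import numpy as np

def fit_mlc(X, y):
    """Estimate a per-class mean vector and covariance matrix from training pixels."""
    params = {}
    for c in np.unique(y):
        Xc = X[y == c]
        mu = Xc.mean(axis=0)
        cov = np.cov(Xc, rowvar=False)
        # Store the inverse covariance and log-determinant for the likelihood.
        params[c] = (mu, np.linalg.inv(cov), np.linalg.slogdet(cov)[1])
    return params

def predict_mlc(X, params):
    """Assign each pixel to the class with the highest Gaussian log-likelihood."""
    classes = sorted(params)
    scores = []
    for c in classes:
        mu, inv_cov, logdet = params[c]
        d = X - mu
        # log N(x | mu, cov), up to a constant shared by all classes
        scores.append(-0.5 * (logdet + np.einsum("ij,jk,ik->i", d, inv_cov, d)))
    return np.array(classes)[np.argmax(scores, axis=0)]

rng = np.random.default_rng(0)
# Two well-separated synthetic "classes" in a 6-band feature space.
X = np.vstack([rng.normal(0.2, 0.05, (200, 6)),
               rng.normal(0.6, 0.05, (200, 6))])
y = np.array([0] * 200 + [1] * 200)

params = fit_mlc(X, y)
pred = predict_mlc(X, params)
print("training accuracy:", (pred == y).mean())
```

In practice the training samples would be labeled Landsat pixels rather than synthetic draws, and accuracy would be assessed on independent test samples, as the paper does with its 38,664-sample test set.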

1,212 citations


Authors


Name | H-index | Papers | Citations
Jing Wang | 184 | 4,046 | 202,769
Jiaguo Yu | 178 | 730 | 113,300
Lei Jiang | 170 | 2,244 | 135,205
Gang Chen | 167 | 3,372 | 149,819
Omar M. Yaghi | 165 | 459 | 163,918
Xiang Zhang | 154 | 1,733 | 117,576
Yi Yang | 143 | 2,456 | 92,268
Thomas P. Russell | 141 | 1,012 | 80,055
Jun Chen | 136 | 1,856 | 77,368
Lei Zhang | 135 | 2,240 | 99,365
Chuan He | 130 | 584 | 66,438
Han Zhang | 130 | 970 | 58,863
Lei Zhang | 130 | 2,312 | 86,950
Zhen Li | 127 | 1,712 | 71,351
Chao Zhang | 127 | 3,119 | 84,711
Network Information
Related Institutions (5)
Peking University
181K papers, 4.1M citations

93% related

Zhejiang University
183.2K papers, 3.4M citations

93% related

Shanghai Jiao Tong University
184.6K papers, 3.4M citations

93% related

Fudan University
117.9K papers, 2.6M citations

92% related

Chinese Academy of Sciences
634.8K papers, 14.8M citations

91% related

Performance
Metrics
No. of papers from the Institution in previous years
Year | Papers
2023 | 286
2022 | 1,141
2021 | 9,719
2020 | 9,672
2019 | 7,977
2018 | 6,629