Institution

University of Adelaide

Education · Adelaide, South Australia, Australia
About: University of Adelaide is an education organization based in Adelaide, South Australia, Australia. It is known for its research contributions in the topics of Population and Poison control. The organization has 27,251 authors who have published 79,167 publications receiving 2,671,128 citations. The organization is also known as The University of Adelaide and Adelaide University.


Papers
Journal Article
TL;DR: Systematic reviews of prevalence and incidence data should follow the same structured steps as systematic reviews of effectiveness; however, many of these steps need to be tailored for this type of evidence, particularly around the stages of critical appraisal and synthesis.
Abstract: Aim: There is currently no guidance for authors aiming to undertake systematic reviews of observational epidemiological studies, such as those reporting prevalence and incidence information. These reviews are particularly useful to measure global disease burden and changes in disease

1,253 citations

Posted Content
TL;DR: This paper introduces a generalized version of IoU (GIoU) as a loss into state-of-the-art object detection frameworks, and shows a consistent improvement in their performance using both the standard, IoU-based, and new, GIoU-based, performance measures on popular object detection benchmarks.
Abstract: Intersection over Union (IoU) is the most popular evaluation metric used in object detection benchmarks. However, there is a gap between optimizing the commonly used distance losses for regressing the parameters of a bounding box and maximizing this metric value. The optimal objective for a metric is the metric itself. In the case of axis-aligned 2D bounding boxes, it can be shown that $IoU$ can be directly used as a regression loss. However, $IoU$ has a plateau, making it infeasible to optimize in the case of non-overlapping bounding boxes. In this paper, we address the weaknesses of $IoU$ by introducing a generalized version as both a new loss and a new metric. By incorporating this generalized $IoU$ ($GIoU$) as a loss into state-of-the-art object detection frameworks, we show a consistent improvement in their performance using both the standard, $IoU$-based, and new, $GIoU$-based, performance measures on popular object detection benchmarks such as PASCAL VOC and MS COCO.

1,251 citations
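As a rough illustration of the GIoU idea described in the abstract above, the sketch below computes IoU and GIoU for two axis-aligned boxes in (x1, y1, x2, y2) form and returns 1 - GIoU as the loss. The function name and box format are illustrative assumptions, not the paper's reference implementation:

def giou_loss(box_a, box_b):
    # Boxes are (x1, y1, x2, y2) with x1 < x2 and y1 < y2 (assumed format).
    ax1, ay1, ax2, ay2 = box_a
    bx1, by1, bx2, by2 = box_b

    # Intersection area
    iw = max(0.0, min(ax2, bx2) - max(ax1, bx1))
    ih = max(0.0, min(ay2, by2) - max(ay1, by1))
    inter = iw * ih

    # Union area
    area_a = (ax2 - ax1) * (ay2 - ay1)
    area_b = (bx2 - bx1) * (by2 - by1)
    union = area_a + area_b - inter
    iou = inter / union

    # Smallest axis-aligned box enclosing both boxes
    cw = max(ax2, bx2) - min(ax1, bx1)
    ch = max(ay2, by2) - min(ay1, by1)
    area_c = cw * ch

    # GIoU subtracts the fraction of the enclosing box not covered by the
    # union, so the loss keeps a useful signal even when IoU is zero.
    giou = iou - (area_c - union) / area_c
    return 1.0 - giou

# Non-overlapping boxes: IoU is 0 in both cases, but the GIoU loss still
# grows as the boxes move further apart.
print(giou_loss((0, 0, 1, 1), (2, 0, 3, 1)))  # ~1.33
print(giou_loss((0, 0, 1, 1), (5, 0, 6, 1)))  # ~1.67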

Journal Article
01 Oct 2020
TL;DR: The updated JBI guidance for scoping reviews includes additional guidance on several methodological issues, such as when a scoping review is (or is not) appropriate and how to extract, analyze, and present results, and clarifies the implications for practice and research.
Abstract:
OBJECTIVE: The objective of this paper is to describe the updated methodological guidance for conducting a JBI scoping review, with a focus on new updates to the approach and development of the Preferred Reporting Items for Systematic Reviews and Meta-Analyses extension for Scoping Reviews (the PRISMA-ScR).
INTRODUCTION: Scoping reviews are an increasingly common approach to informing decision-making and research based on the identification and examination of the literature on a given topic or issue. Scoping reviews draw on evidence from any research methodology and may also include evidence from non-research sources, such as policy. In this manner, scoping reviews provide a comprehensive overview to address broader review questions than traditionally more specific systematic reviews of effectiveness or qualitative evidence. The increasing popularity of scoping reviews has been accompanied by the development of a reporting guideline: the PRISMA-ScR. In 2014, the JBI Scoping Review Methodology Group developed guidance for scoping reviews that received minor updates in 2017 and was most recently updated in 2020. The updates reflect ongoing and substantial developments in approaches to scoping review conduct and reporting. As such, the JBI Scoping Review Methodology Group recognized the need to revise the guidance to align with the current state of knowledge and reporting standards in evidence synthesis.
METHODS: Between 2015 and 2020, the JBI Scoping Review Methodology Group expanded its membership; extensively reviewed the literature; engaged via annual face-to-face meetings, regular teleconferences, and email correspondence; sought advice from methodological experts; facilitated workshops; and presented at scientific conferences. This process led to updated guidance for scoping reviews published in the JBI Manual for Evidence Synthesis. The updated chapter was endorsed by JBI's International Scientific Committee in 2020.
RESULTS: The updated JBI guidance for scoping reviews includes additional guidance on several methodological issues, such as when a scoping review is (or is not) appropriate and how to extract, analyze, and present results, and provides clarification of the implications for practice and research. Furthermore, it is aligned with the PRISMA-ScR to ensure consistent reporting.
CONCLUSIONS: The latest JBI guidance for scoping reviews provides up-to-date guidance that authors can use when conducting a scoping review. Furthermore, it aligns with the PRISMA-ScR, which can be used to report the conduct of a scoping review. The JBI Scoping Review Methodology Group has identified a series of ongoing and future methodological projects to further refine the methodology.

1,250 citations


Authors


Name | H-index | Papers | Citations
Martin White | 196 | 2038 | 232387
Nicholas G. Martin | 192 | 1770 | 161952
David W. Johnson | 160 | 2714 | 140778
Nicholas J. Talley | 158 | 1571 | 90197
Mark E. Cooper | 158 | 1463 | 124887
Xiang Zhang | 154 | 1733 | 117576
John E. Morley | 154 | 1377 | 97021
Howard I. Scher | 151 | 944 | 101737
Christopher M. Dobson | 150 | 1008 | 105475
A. Artamonov | 150 | 1858 | 119791
Timothy P. Hughes | 145 | 831 | 91357
Christopher Hill | 144 | 1562 | 128098
Shi-Zhang Qiao | 142 | 523 | 80888
Paul Jackson | 141 | 1372 | 93464
H. A. Neal | 141 | 1903 | 115480
Network Information
Related Institutions (5)
University of Melbourne: 174.8K papers, 6.3M citations (97% related)
University of British Columbia: 209.6K papers, 9.2M citations (92% related)
McGill University: 162.5K papers, 6.9M citations (92% related)
University of Edinburgh: 151.6K papers, 6.6M citations (92% related)
Imperial College London: 209.1K papers, 9.3M citations (91% related)

Performance Metrics
No. of papers from the Institution in previous years
Year | Papers
2023 | 127
2022 | 597
2021 | 5,500
2020 | 5,342
2019 | 4,803
2018 | 4,443