scispace - formally typeset
Topic: Scopus

About: Scopus is a research topic. Over its lifetime, 3,748 publications have been published within this topic, receiving 42,416 citations. The topic is also known as SciVerse Scopus and scopus.com.


Papers
Journal Article · DOI
TL;DR: The content coverage and practical utility of PubMed, Scopus, Web of Science, and Google Scholar are compared; PubMed remains an optimal tool in biomedical electronic research.
Abstract: The evolution of the electronic age has led to the development of numerous medical databases on the World Wide Web, offering search facilities on a particular subject and the ability to perform citation analysis. We compared the content coverage and practical utility of PubMed, Scopus, Web of Science, and Google Scholar. The official Web pages of the databases were used to extract information on the range of journals covered, search facilities and restrictions, and update frequency. We used the example of a keyword search to evaluate the usefulness of these databases in biomedical information retrieval and a specific published article to evaluate their utility in performing citation analysis. All databases were practical in use and offered numerous search facilities. PubMed and Google Scholar are accessed for free. The keyword search with PubMed offers optimal update frequency and includes online early articles; other databases can rate articles by number of citations, as an index of importance. For citation analysis, Scopus offers about 20% more coverage than Web of Science, whereas Google Scholar offers results of inconsistent accuracy. PubMed remains an optimal tool in biomedical electronic research. Scopus covers a wider journal range, helpful both in keyword searching and citation analysis, but it is currently limited to recent articles (published after 1995) compared with Web of Science. Google Scholar, as for the Web in general, can help in the retrieval of even the most obscure information, but its use is marred by inadequate and less frequently updated citation information.

2,696 citations

Journal Article · DOI
TL;DR: In this article, the authors compared the coverage of active scholarly journals in the Web of Science (WoS, 13,605 journals) and Scopus (20,346 journals) with Ulrich's extensive periodical directory (63,013 journals) to assess whether some fields, publishing countries, and languages are over- or underrepresented.
Abstract: Bibliometric methods are used in multiple fields for a variety of purposes, notably for research evaluation. Most bibliometric analyses have in common their data sources: Thomson Reuters' Web of Science (WoS) and Elsevier's Scopus. The objective of this research is to describe the journal coverage of those two databases and to assess whether some fields, publishing countries, and languages are over- or underrepresented. To do this we compared the coverage of active scholarly journals in WoS (13,605 journals) and Scopus (20,346 journals) with Ulrich's extensive periodical directory (63,013 journals). Results indicate that the use of either WoS or Scopus for research evaluation may introduce biases that favor the Natural Sciences and Engineering as well as Biomedical Research to the detriment of the Social Sciences and the Arts and Humanities. Similarly, English-language journals are overrepresented to the detriment of other languages. While both databases share these biases, their coverage differs substantially. As a consequence, the results of bibliometric analyses may vary depending on the database used. These results imply that, in the context of comparative research evaluation, WoS and Scopus should be used with caution, especially when comparing different fields, institutions, countries, or languages. The bibliometric community should continue its efforts to develop methods and indicators that include scientific output that is not covered in WoS or Scopus, such as field-specific and national citation indexes.

1,686 citations
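As a rough sense of scale, the journal counts reported in the abstract above can be turned into coverage shares relative to Ulrich's directory. This is a sketch of my own arithmetic on the abstract's numbers, taking Ulrich's count as the reference universe (an assumption; the databases also index titles absent from Ulrich's, so these are approximations):

```python
# Journal counts as reported in the abstract
ulrichs = 63_013   # Ulrich's periodical directory (assumed reference universe)
wos = 13_605       # active scholarly journals covered by Web of Science
scopus = 20_346    # active scholarly journals covered by Scopus

wos_share = wos / ulrichs        # ~0.216: roughly a fifth of Ulrich's journals
scopus_share = scopus / ulrichs  # ~0.323: roughly a third
print(f"WoS: {wos_share:.1%}, Scopus: {scopus_share:.1%}")
```

The gap between these two shares is one concrete way to read the abstract's claim that, while both databases share the same biases, their coverage differs substantially.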

Journal Article · DOI
06 Sep 2006 · JAMA
TL;DR: Practical recommendations on mentoring in medicine that are evidence-based will require studies using more rigorous methods, addressing contextual issues, and using cross-disciplinary approaches.
Abstract: Context: Mentoring, as a partnership in personal and professional growth and development, is central to academic medicine, but it is challenged by increased clinical, administrative, research, and other educational demands on medical faculty. Therefore, evidence for the value of mentoring needs to be evaluated.
Objective: To systematically review the evidence about the prevalence of mentorship and its relationship to career development.
Data Sources: MEDLINE, Current Contents, Cochrane Database of Systematic Reviews, Database of Abstracts of Reviews of Effects, Cochrane Central Register of Controlled Trials, PsycINFO, and Scopus databases from the earliest available date to May 2006.
Study Selection and Data Extraction: We identified all studies evaluating the effect of mentoring on career choices and academic advancement among medical students and physicians. Minimum inclusion criteria were a description of the study population and availability of extractable data. No restrictions were placed on study methods or language.
Data Synthesis: The literature search identified 3640 citations. Review of abstracts led to retrieval of 142 full-text articles for assessment; 42 articles describing 39 studies were selected for review. Of these, 34 (87%) were cross-sectional self-report surveys with small sample size and response rates ranging from 5% to 99%. One case-control study nested in a survey used a comparison group that had not received mentoring, and 1 cohort study had a small sample size and a large loss to follow-up. Less than 50% of medical students and in some fields less than 20% of faculty members had a mentor. Women perceived that they had more difficulty finding mentors than their colleagues who are men. Mentorship was reported to have an important influence on personal development, career guidance, career choice, and research productivity, including publication and grant success.
Conclusions: Mentoring is perceived as an important part of academic medicine, but the evidence to support this perception is not strong. Practical recommendations on mentoring in medicine that are evidence-based will require studies using more rigorous methods, addressing contextual issues, and using cross-disciplinary approaches.

1,318 citations

Journal Article · DOI
TL;DR: Tweets can predict highly cited articles within the first 3 days of article publication, and the proposed twimpact factor may be a useful and timely metric to measure uptake of research findings and to filter research findings resonating with the public in real time.
Abstract: Background: Citations in peer-reviewed articles and the impact factor are generally accepted measures of scientific impact. Web 2.0 tools such as Twitter, blogs, or social bookmarking tools provide the possibility to construct innovative article-level or journal-level metrics to gauge impact and influence. However, the relationship of these new metrics to traditional metrics such as citations is not known.
Objective: (1) To explore the feasibility of measuring the social impact of and public attention to scholarly articles by analyzing buzz in social media, (2) to explore the dynamics, content, and timing of tweets relative to the publication of a scholarly article, and (3) to explore whether these metrics are sensitive and specific enough to predict highly cited articles.
Methods: Between July 2008 and November 2011, all tweets containing links to articles in the Journal of Medical Internet Research (JMIR) were mined. For a subset of 1573 tweets about 55 articles published between issues 3/2009 and 2/2010, different metrics of social media impact were calculated and compared against subsequent citation data from Scopus and Google Scholar 17 to 29 months later. A heuristic to predict the top-cited articles in each issue through tweet metrics was validated.
Results: A total of 4208 tweets cited 286 distinct JMIR articles. The distribution of tweets over the first 30 days after article publication followed a power law (Zipf, Bradford, or Pareto distribution), with most tweets sent on the day an article was published (1458/3318, 43.94% of all tweets in a 60-day period) or on the following day (528/3318, 15.9%), followed by a rapid decay. The Pearson correlations between tweetations and citations were moderate and statistically significant, with correlation coefficients ranging from .42 to .72 for the log-transformed Google Scholar citations, but were less clear for Scopus citations and rank correlations. A linear multivariate model with time and tweets as significant predictors (P < .001) could explain 27% of the variation of citations. Highly tweeted articles were 11 times more likely to be highly cited than less-tweeted articles (9/12 or 75% of highly tweeted articles were highly cited, while only 3/43 or 7% of less-tweeted articles were highly cited; rate ratio 0.75/0.07 = 10.75, 95% confidence interval 3.4–33.6). Top-cited articles can be predicted from top-tweeted articles with 93% specificity and 75% sensitivity.
Conclusions: Tweets can predict highly cited articles within the first 3 days of article publication. Social media activity either increases citations or reflects the underlying qualities of the article that also predict citations, but the true use of these metrics is to measure the distinct concept of social impact. Social impact measures based on tweets are proposed to complement traditional citation metrics. The proposed twimpact factor may be a useful and timely metric to measure uptake of research findings and to filter research findings resonating with the public in real time. [J Med Internet Res 2011;13(4):e123]

954 citations
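The rate ratio and the sensitivity/specificity figures in the abstract above follow from a simple 2×2 table, which can be reproduced directly. The counts are taken from the abstract; treating "highly tweeted" as a binary test that predicts "highly cited" is the framing I assume here:

```python
# Counts reported in the abstract
highly_tweeted_cited, highly_tweeted_total = 9, 12  # 9/12 highly tweeted articles were highly cited
less_tweeted_cited, less_tweeted_total = 3, 43      # 3/43 less-tweeted articles were highly cited

# Rate ratio: 0.75 / ~0.07, reported as "11 times more likely"
rate_ratio = (highly_tweeted_cited / highly_tweeted_total) / (less_tweeted_cited / less_tweeted_total)

# 2x2 table, with "highly tweeted" as the test and "highly cited" as the condition
tp = highly_tweeted_cited                      # highly tweeted and highly cited
fp = highly_tweeted_total - tp                 # highly tweeted, not highly cited
fn = less_tweeted_cited                        # highly cited, not highly tweeted
tn = less_tweeted_total - less_tweeted_cited   # neither

sensitivity = tp / (tp + fn)  # 9/12 = 75%, matching the reported sensitivity
specificity = tn / (tn + fp)  # 40/43 ~ 93%, matching the reported specificity
print(f"rate ratio {rate_ratio:.2f}, sensitivity {sensitivity:.0%}, specificity {specificity:.0%}")
```

Running the arithmetic recovers all three headline numbers (10.75, 75%, 93%), which is a useful sanity check given that the decimal points in this abstract are easily lost in web scrapes.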

Journal Article · DOI
TL;DR: A longitudinal comparison of eight data points between 2013 and 2015 shows a consistent and reasonably stable quarterly growth for both publications and citations across the three databases, suggesting that all three databases provide sufficient stability of coverage to be used for more detailed cross-disciplinary comparisons.
Abstract: This article aims to provide a systematic and comprehensive comparison of the coverage of the three major bibliometric databases: Google Scholar, Scopus and the Web of Science. Based on a sample of 146 senior academics in five broad disciplinary areas, we therefore provide both a longitudinal and a cross-disciplinary comparison of the three databases. Our longitudinal comparison of eight data points between 2013 and 2015 shows a consistent and reasonably stable quarterly growth for both publications and citations across the three databases. This suggests that all three databases provide sufficient stability of coverage to be used for more detailed cross-disciplinary comparisons. Our cross-disciplinary comparison of the three databases includes four key research metrics (publications, citations, h-index, and hI,annual, an annualised individual h-index) and five major disciplines (Humanities, Social Sciences, Engineering, Sciences and Life Sciences). We show that both the data source and the specific metrics used change the conclusions that can be drawn from cross-disciplinary comparisons.

930 citations
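The h-index compared across databases in the entry above has a simple definition that is independent of the data source: the largest h such that the author has h papers with at least h citations each. A minimal sketch (the function name and sample citation counts are illustrative, not from the paper):

```python
def h_index(citations):
    """Largest h such that at least h papers have >= h citations each."""
    h = 0
    for rank, cites in enumerate(sorted(citations, reverse=True), start=1):
        if cites >= rank:
            h = rank   # this paper still clears the rank threshold
        else:
            break      # counts are sorted descending, so no later rank can qualify
    return h

print(h_index([10, 8, 5, 4, 3]))  # -> 4: four papers have at least 4 citations each
```

The database dependence the abstract describes enters only through the citation counts fed in, which is why the same author can have different h-indexes in Google Scholar, Scopus, and the Web of Science.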


Network Information
Related Topics (5)
Corporate governance
118.5K papers, 2.7M citations
77% related
Job satisfaction
58K papers, 1.8M citations
76% related
Qualitative research
39.9K papers, 2.3M citations
76% related
Empirical research
51.3K papers, 1.9M citations
75% related
The Internet
213.2K papers, 3.8M citations
75% related
Performance Metrics

No. of papers in the topic in previous years:

Year    Papers
2024    1
2023    4,563
2022    8,272
2021    566
2020    371
2019    253