Author

David Campbell

Bio: David Campbell is an academic researcher. The author has contributed to research in topics: Scopus & Bibliometrics. The author has an h-index of 2 and has co-authored 3 publications receiving 525 citations.

Papers
Journal ArticleDOI
TL;DR: In this paper, the authors compare results obtained from the Web of Science and Scopus, and show that the correlations between the measures obtained with both databases for the number of papers and the number of citations received by countries, as well as for their ranks, are extremely high (R2 > .99).
Abstract: For more than 40 years, the Institute for Scientific Information (ISI, now part of Thomson Reuters) produced the only available bibliographic databases from which bibliometricians could compile large-scale bibliometric indicators. ISI's citation indexes, now regrouped under the Web of Science (WoS), were the major sources of bibliometric data until 2004, when Scopus was launched by the publisher Reed Elsevier. For those who perform bibliometric analyses and comparisons of countries or institutions, the existence of these two major databases raises the important question of the comparability and stability of statistics obtained from different data sources. This paper uses macro-level bibliometric indicators to compare results obtained from the WoS and Scopus. It shows that the correlations between the measures obtained with both databases for the number of papers and the number of citations received by countries, as well as for their ranks, are extremely high (R2 > .99). There is also a very high correlation when countries' papers are broken down by field. The paper thus provides evidence that indicators of scientific production and citations at the country level are stable and largely independent of the database.
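The kind of macro-level comparison the abstract describes can be sketched in a few lines. The per-country counts below are invented for illustration (the paper's actual data are not reproduced here); the point is only to show what "R2 > .99 for counts and stable ranks" means operationally:

```python
# Hypothetical per-country paper counts; the numbers are invented
# purely to illustrate the comparison, not taken from the paper.
wos = [340000, 90000, 82000, 75000, 60000]      # papers per country in WoS
scopus = [362000, 99000, 88000, 81000, 63000]   # papers per country in Scopus

def pearson_r(x, y):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

r2 = pearson_r(wos, scopus) ** 2
print(f"R^2 = {r2:.4f}")  # close to 1 when the databases agree

# Rank stability: do both databases order the countries the same way?
ranks_wos = sorted(range(len(wos)), key=lambda i: -wos[i])
ranks_scopus = sorted(range(len(scopus)), key=lambda i: -scopus[i])
print("same ranking:", ranks_wos == ranks_scopus)
```

A high R2 on counts together with identical (or near-identical) rank orderings is what the paper reports at the country level, for both papers and citations.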

341 citations

Journal IssueDOI
TL;DR: Using macrolevel bibliometric indicators to compare results obtained from the WoS and Scopus provides evidence that indicators of scientific production and citations at the country level are stable and largely independent of the database.
Abstract: For more than 40 years, the Institute for Scientific Information (ISI, now part of Thomson Reuters) produced the only available bibliographic databases from which bibliometricians could compile large-scale bibliometric indicators. ISI's citation indexes, now regrouped under the Web of Science (WoS), were the major sources of bibliometric data until 2004, when Scopus was launched by the publisher Reed Elsevier. For those who perform bibliometric analyses and comparisons of countries or institutions, the existence of these two major databases raises the important question of the comparability and stability of statistics obtained from different data sources. This paper uses macrolevel bibliometric indicators to compare results obtained from the WoS and Scopus. It shows that the correlations between the measures obtained with both databases for the number of papers and the number of citations received by countries, as well as for their ranks, are extremely high (R2 > .99). There is also a very high correlation when countries' papers are broken down by field. The paper thus provides evidence that indicators of scientific production and citations at the country level are stable and largely independent of the database. © 2009 Wiley Periodicals, Inc.

303 citations


Cited by
Journal ArticleDOI
TL;DR: In this article, the authors compared the coverage of active scholarly journals in the Web of Science (WoS; 13,605 journals) and Scopus (20,346 journals) with Ulrich's extensive periodical directory (63,013 journals) to assess whether some fields, publishing countries, and languages are over- or underrepresented.
Abstract: Bibliometric methods are used in multiple fields for a variety of purposes, notably for research evaluation. Most bibliometric analyses have in common their data sources: Thomson Reuters' Web of Science (WoS) and Elsevier's Scopus. The objective of this research is to describe the journal coverage of those two databases and to assess whether some fields, publishing countries, and languages are over- or underrepresented. To do this we compared the coverage of active scholarly journals in WoS (13,605 journals) and Scopus (20,346 journals) with Ulrich's extensive periodical directory (63,013 journals). Results indicate that the use of either WoS or Scopus for research evaluation may introduce biases that favor Natural Sciences and Engineering as well as Biomedical Research to the detriment of Social Sciences and Arts and Humanities. Similarly, English-language journals are overrepresented to the detriment of other languages. While both databases share these biases, their coverage differs substantially. As a consequence, the results of bibliometric analyses may vary depending on the database used. These results imply that in the context of comparative research evaluation, WoS and Scopus should be used with caution, especially when comparing different fields, institutions, countries or languages. The bibliometric community should continue its efforts to develop methods and indicators that include scientific output that is not covered in WoS or Scopus, such as field-specific and national citation indexes.
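The journal counts in the abstract imply coverage shares that are easy to work out; a minimal sketch, using only the figures stated above:

```python
# Coverage shares of Ulrich's directory, using the journal counts
# reported in the abstract (active scholarly journals).
ulrichs = 63013
wos_journals = 13605
scopus_journals = 20346

for name, n in [("WoS", wos_journals), ("Scopus", scopus_journals)]:
    print(f"{name}: {n / ulrichs:.1%} of Ulrich's active scholarly journals")
```

Roughly a fifth of Ulrich's journals appear in WoS and roughly a third in Scopus, which is why the choice of database can shift the results of a comparative evaluation.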

1,686 citations

Journal ArticleDOI
Ludo Waltman1
TL;DR: In this paper, an in-depth review of the literature on citation impact indicators is provided, focusing on the selection of publications and citations to be included in the calculation of citation impact indicators.

774 citations

Posted Content
Ludo Waltman1
TL;DR: An in-depth review of the literature on citation impact indicators with recommendations for future research on normalization for field differences and counting methods for dealing with co-authored publications.
Abstract: Citation impact indicators nowadays play an important role in research evaluation, and consequently these indicators have received a lot of attention in the bibliometric and scientometric literature. This paper provides an in-depth review of the literature on citation impact indicators. First, an overview is given of the literature on bibliographic databases that can be used to calculate citation impact indicators (Web of Science, Scopus, and Google Scholar). Next, selected topics in the literature on citation impact indicators are reviewed in detail. The first topic is the selection of publications and citations to be included in the calculation of citation impact indicators. The second topic is the normalization of citation impact indicators, in particular normalization for field differences. Counting methods for dealing with co-authored publications are the third topic, and citation impact indicators for journals are the last topic. The paper concludes by offering some recommendations for future research.
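Two of the topics the review covers, field normalization and fractional counting for co-authored publications, can be illustrated with a minimal sketch. This is an illustrative simplification with invented numbers, not the review's own formulas:

```python
# Illustrative sketch of two ideas from the citation-indicator literature:
# field normalization and fractional counting. Data are invented.

# Each publication: (citations, average citations in its field, n_authors)
pubs = [(12, 6.0, 3), (4, 8.0, 2), (20, 10.0, 4)]

# Field normalization: divide each paper's citations by its field average,
# so papers from low- and high-citation fields become comparable.
normalized = [c / favg for c, favg, _ in pubs]
mncs = sum(normalized) / len(normalized)  # mean normalized citation score

# Fractional counting: a paper with n authors contributes 1/n to each
# author's publication output, instead of a full count.
fractional_output = sum(1 / n for _, _, n in pubs)

print(f"mean normalized citation score: {mncs:.2f}")
print(f"fractional publication count:   {fractional_output:.2f}")
```

A normalized score above 1 means the set of papers is cited above its field averages; fractional counting keeps heavily co-authored output from being counted once per author.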

469 citations

Journal ArticleDOI
TL;DR: In this paper, the authors present an all-inclusive description of the two main bibliographic DBs by gathering in one place the findings presented in the most recent literature and the information provided by the owners of the DBs.
Abstract: Nowadays, the importance of bibliographic databases (DBs) has increased enormously, as they are the main providers of publication metadata and bibliometric indicators universally used both for research assessment practices and for performing daily tasks. Because the reliability of these tasks depends first on the data source, all users of the DBs should be able to choose the most suitable one. Web of Science (WoS) and Scopus are the two main bibliographic DBs. The comprehensive evaluation of the DBs' coverage is practically impossible without extensive bibliometric analyses or literature reviews, but most DB users do not have bibliometric competence and/or are not willing to invest additional time in such evaluations. Apart from that, the convenience of the DB's interface, its performance, the impact indicators it provides, and its additional tools may also influence the users' choice. The main goal of this work is to provide all potential users with an all-inclusive description of the two main bibliographic DBs by gathering in one place the findings presented in the most recent literature and the information provided by the owners of the DBs. This overview should aid all stakeholders employing publication and citation data in selecting the most suitable DB.

267 citations

Journal ArticleDOI
TL;DR: A randomized controlled trial of open access publishing, involving 36 participating journals in the sciences, social sciences, and humanities, reports on the effects of free access on article downloads and citations.
Abstract: Does free access to journal articles result in greater diffusion of scientific knowledge? Using a randomized controlled trial of open access publishing, involving 36 participating journals in the sciences, social sciences, and humanities, we report on the effects of free access on article downloads and citations. Articles placed in the open access condition (n=712) received significantly more downloads and reached a broader audience within the first year, yet were cited no more frequently, nor earlier, than subscription-access control articles (n=2533) within 3 yr. These results may be explained by social stratification, a process that concentrates scientific authors at a small number of elite research universities with excellent access to the scientific literature. The real beneficiaries of open access publishing may not be the research community but communities of practice that consume, but rarely contribute to, the corpus of literature.—Davis, P. M. Open access, readership, citations: a randomized cont...

257 citations