Journal ArticleDOI

Bibliometrics: The Leiden Manifesto for research metrics

23 Apr 2015-Nature (Nature Publishing Group)-Vol. 520, Iss: 7548, pp 429-431
TL;DR: Diana Hicks, Paul Wouters and colleagues urge the use of ten principles to guide research evaluation.
Abstract: Use these ten principles to guide research evaluation, urge Diana Hicks, Paul Wouters and colleagues.


Citations

Journal ArticleDOI
TL;DR: A keyword analysis identifies the most popular subjects covered by bibliometric analysis, and multidisciplinary articles are shown to have the highest impact.
Abstract: Bibliometric methods, or "analyses", are now firmly established as scientific specialties and are an integral part of research evaluation methodology, especially within the scientific and applied fields. The methods are used increasingly when studying various aspects of science and also in the way institutions and universities are ranked worldwide. A sufficient number of studies have been completed, and with the resulting literature, it is now possible to analyse the bibliometric method by using its own methodology. The bibliometric literature in this study, which was extracted from Web of Science, is divided into two parts using a method comparable to that of Jonkers et al. (Characteristics of bibliometrics articles in library and information sciences (LIS) and other journals, pp. 449–551, 2012): the publications either lie within the Information and Library Science (ILS) category or within the non-ILS category, which includes more applied, "subject"-based studies. The impact in the different groupings is judged by means of citation analysis using normalized data, and an almost linear increase can be observed from 1994 onwards in the non-ILS category. The implication for the dissemination and use of the bibliometric methods in the different contexts is discussed. A keyword analysis identifies the most popular subjects covered by bibliometric analysis, and multidisciplinary articles are shown to have the highest impact. A noticeable shift is observed in those countries which contribute to the pool of bibliometric analysis, as well as a self-perpetuating effect in giving and taking references.

1,098 citations


Cites background from "Bibliometrics: The Leiden Manifesto..."

  • ...Later on, the same types of problems have been raised in the Leiden Manifesto by Hicks et al. (2015) that research evaluation is now led by data rather than sound judgement and good practice....

    [...]
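The "citation analysis using normalized data" mentioned in the abstract above typically means dividing each publication's citation count by the average count for publications of the same field and year. The following is a minimal sketch of that normalization, not the study's actual pipeline; the records and grouping keys are invented for illustration:

    from collections import defaultdict

    # hypothetical records: (field, year, citations) -- invented for illustration
    papers = [
        ("ILS", 2010, 12), ("ILS", 2010, 4), ("ILS", 2010, 8),
        ("non-ILS", 2010, 30), ("non-ILS", 2010, 10),
    ]

    # average citations per (field, year) group -- the normalization baseline
    totals = defaultdict(int)
    counts = defaultdict(int)
    for field, year, cites in papers:
        totals[(field, year)] += cites
        counts[(field, year)] += 1
    baseline = {key: totals[key] / counts[key] for key in totals}

    # normalized score: 1.0 means the group average, 2.0 means twice the average
    for field, year, cites in papers:
        print(field, year, round(cites / baseline[(field, year)], 2))

Normalizing this way makes counts comparable across groupings with very different citation habits, which is what allows the ILS and non-ILS categories to be contrasted on one scale.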

Journal ArticleDOI
Chaomei Chen
TL;DR: A systematic review of the literature concerning major aspects of science mapping is presented to demonstrate the use of a science mapping approach to perform the review so that researchers may apply the procedure to the review of a scientific domain of their own interest.
Abstract:
Purpose: We present a systematic review of the literature concerning major aspects of science mapping to serve two primary purposes: first, to demonstrate the use of a science mapping approach to perform the review so that researchers may apply the procedure to the review of a scientific domain of their own interest, and second, to identify major areas of research activities concerning science mapping, intellectual milestones in the development of key specialties, evolutionary stages of major specialties involved, and the dynamics of transitions from one specialty to another.
Design/methodology/approach: We first introduce a theoretical framework of the evolution of a scientific specialty. Then we demonstrate a generic search strategy that can be used to construct a representative dataset of bibliographic records of a domain of research. Next, progressively synthesized co-citation networks are constructed and visualized to aid visual analytic studies of the domain's structural and dynamic patterns and trends. Finally, trajectories of citations made by particular types of authors and articles are presented to illustrate the predictive potential of the analytic approach.
Findings: The evolution of the science mapping research involves the development of a number of interrelated specialties. Four major specialties are discussed in detail in terms of four evolutionary stages: conceptualization, tool construction, application, and codification. Underlying connections between major specialties are also explored. The predictive analysis demonstrates citation trajectories of potentially transformative contributions.
Research limitations: The systematic review is primarily guided by citation patterns in the dataset retrieved from the literature. The scope of the data is limited by the source of the retrieval, i.e. the Web of Science, and the composite query used. An iterative query refinement is possible if one would like to improve the data quality, although the current approach serves our purpose adequately. More in-depth analyses of each specialty would be more revealing if additional methods were incorporated, such as citation context analysis and studies of other aspects of scholarly publications.
Practical implications: The underlying analytic process of science mapping serves many practical needs, notably bibliometric mapping, knowledge domain visualization, and visualization of scientific literature. In order to master such a complex process of science mapping, researchers often need to develop a diverse set of skills and knowledge that may span multiple disciplines. The approach demonstrated in this article provides a generic method for conducting a systematic review.
Originality/value: Incorporating the evolutionary stages of a specialty into the visual analytic study of a research domain is innovative. It provides a systematic methodology for researchers to achieve a good understanding of how scientific fields evolve, to recognize potentially insightful patterns from visually encoded signs, and to synthesize various information so as to capture the state of the art of the domain.

818 citations
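The co-citation networks at the core of this approach have a simple construction rule: two references are co-cited whenever the same paper cites both, and the pair count becomes an edge weight. Below is a minimal sketch of that counting step, with an invented three-paper corpus; real pipelines such as the one reviewed above work from Web of Science records:

    from collections import Counter
    from itertools import combinations

    # invented corpus: each citing paper mapped to the references it cites
    corpus = {
        "paper_A": ["garfield_1955", "hirsch_2005", "hicks_2015"],
        "paper_B": ["garfield_1955", "hirsch_2005"],
        "paper_C": ["hirsch_2005", "hicks_2015"],
    }

    # two references are co-cited when the same paper cites both;
    # the pair count becomes the weight of the edge between them
    cocitations = Counter()
    for refs in corpus.values():
        for pair in combinations(sorted(set(refs)), 2):
            cocitations[pair] += 1

    for (ref_a, ref_b), weight in cocitations.most_common():
        print(f"{ref_a} -- {ref_b}: co-cited by {weight} paper(s)")

The resulting weighted pairs form the network whose clusters and temporal changes the visual analytics then explore.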

Journal ArticleDOI
Ludo Waltman
TL;DR: In this paper, an in-depth review of the literature on citation impact indicators is provided, focusing on the selection of publications and citations to be included in the calculation of citation impact indicators.

774 citations

Journal ArticleDOI
TL;DR: Citations are increasingly used as performance indicators in research policy and within the research system; it is argued that they reflect aspects related to scientific impact and relevance, although with important limitations.
Abstract: Citations are increasingly used as performance indicators in research policy and within the research system. Usually, citations are assumed to reflect the impact of the research or its quality. What is the justification for these assumptions, and how do citations relate to research quality? These and similar issues have been addressed through several decades of scientometric research. This article provides an overview of some of the main issues at stake, including theories of citation and the interpretation and validity of citations as performance measures. Research quality is a multidimensional concept, where plausibility/soundness, originality, scientific value, and societal value commonly are perceived as key characteristics. The article investigates how citations may relate to these various research quality dimensions. It is argued that citations reflect aspects related to scientific impact and relevance, although with important limitations. By contrast, there is no evidence that citations reflect other key dimensions of research quality. Hence, an increased use of citation indicators in research evaluation and funding may imply less attention to these other research quality dimensions, such as solidity/plausibility, originality, and societal value.

517 citations

References
Journal ArticleDOI
TL;DR: The index h, defined as the number of papers with citation number ≥h, is proposed as a useful index to characterize the scientific output of a researcher.
Abstract: I propose the index h, defined as the number of papers with citation number ≥h, as a useful index to characterize the scientific output of a researcher.

8,996 citations
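Hirsch's definition translates directly into a few lines of code. A minimal sketch (the citation counts in the example are invented):

    def h_index(citation_counts):
        """Largest h such that h papers each have at least h citations."""
        ranked = sorted(citation_counts, reverse=True)
        h = 0
        for rank, cites in enumerate(ranked, start=1):
            if cites >= rank:
                h = rank
            else:
                break
        return h

    # invented example: five papers cited 10, 8, 5, 4 and 3 times -> h = 4
    print(h_index([10, 8, 5, 4, 3]))  # 4

Sorting descending makes the condition monotone, so the first rank that fails fixes h.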

Journal ArticleDOI
04 Jan 2006-JAMA
TL;DR: The journal impact factor was created to help select additional source journals and is based on the number of citations in the current year to items published in the previous 2 years, which allows for the inclusion of many small but influential journals.
Abstract: I first mentioned the idea of an impact factor in Science in 1955. With support from the National Institutes of Health, the experimental Genetics Citation Index was published, and that led to the 1961 publication of the Science Citation Index. Irving H. Sher and I created the journal impact factor to help select additional source journals. To do this we simply re-sorted the author citation index into the journal citation index. From this simple exercise, we learned that initially a core group of large and highly cited journals needed to be covered in the new Science Citation Index (SCI). Consider that, in 2004, the Journal of Biological Chemistry published 6500 articles, whereas articles from the Proceedings of the National Academy of Sciences were cited more than 300 000 times that year. Smaller journals might not be selected if we rely solely on publication count, so we created the journal impact factor (JIF). The TABLE provides a selective list of journals ranked by impact factor for 2004. The Table also includes the total number of articles published in 2004, the total number of articles published in 2002 plus 2003 (the JIF denominator), the citations to everything published in 2002 plus 2003 (the JIF numerator), and the total citations in 2004 for all articles ever published in a given journal. Sorting by impact factor allows for the inclusion of many small (in terms of total number of articles published) but influential journals. Obviously, sorting by total citations or other provided data would result in a different ranking. The term "impact factor" has gradually evolved to describe both journal and author impact. Journal impact factors generally involve relatively large populations of articles and citations. Individual authors generally produce smaller numbers of articles, although some have published a phenomenal number. For example, transplant surgeon Tom Starzl has coauthored more than 2000 articles, while Carl Djerassi, inventor of the modern oral contraceptive, has published more than 1300. Even before the Journal Citation Reports (JCR) appeared, we sampled the 1969 SCI to create the first published ranking by impact factor. Today, the JCR includes every journal citation in more than 5000 journals—about 15 million citations from 1 million source items per year. The precision of impact factors is questionable, but reporting to 3 decimal places reduces the number of journals with the identical impact rank. However, it matters very little whether, for example, the impact of JAMA is quoted as 24.8 rather than 24.831. A journal's impact factor is based on 2 elements: the numerator, which is the number of citations in the current year to items published in the previous 2 years, and the denominator, which is the number of substantive articles and reviews published in the same 2 years. The impact factor could just as easily be based on the previous year's articles alone, which would give even greater weight to rapidly changing fields. An impact factor could also take into account longer periods of citations and sources, but then the measure would be less current.

2,345 citations
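The two elements Garfield describes map onto a one-line calculation. A minimal sketch (the journal and its numbers are hypothetical, not taken from the article's table):

    def journal_impact_factor(citations_to_prev_two_years, items_prev_two_years):
        """JIF for year Y: citations received in Y to items published in
        Y-1 and Y-2, divided by the substantive articles and reviews
        published in Y-1 and Y-2; conventionally reported to 3 decimals."""
        return round(citations_to_prev_two_years / items_prev_two_years, 3)

    # hypothetical journal: 5,000 citations in 2004 to its 2002-2003 items,
    # which comprised 320 substantive articles and reviews
    print(journal_impact_factor(5000, 320))  # 15.625

Because the denominator counts only substantive items, a small journal whose few articles are heavily cited can outrank a much larger one, which is exactly the selection effect Garfield wanted.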

Journal ArticleDOI
15 Feb 1997-BMJ
TL;DR: Alternative methods for evaluating research are being sought, such as citation rates and journal impact factors, which seem to be quantitative and objective indicators directly related to published science.
Abstract: Evaluating scientific quality is a notoriously difficult problem which has no standard solution. Ideally, published scientific results should be scrutinised by true experts in the field and given scores for quality and quantity according to established rules. In practice, however, what is called peer review is usually performed by committees with general competence rather than with the specialist's insight that is needed to assess primary research data. Committees tend, therefore, to resort to secondary criteria like crude publication counts, journal prestige, the reputation of authors and institutions, and estimated importance and relevance of the research field [1], making peer review as much a lottery as a rational process [2, 3]. Against this background, it is hardly surprising that alternative methods for evaluating research are being sought, such as citation rates and journal impact factors, which seem to be quantitative and objective indicators directly related to published science. The citation data are obtained from a database produced by the Institute for Scientific Information (ISI) in Philadelphia, which continuously records scientific citations as represented by the reference lists of articles from a large number of the world's scientific journals. The references are rearranged in the database to show how many times each publication has been cited within a certain period, and by whom, and the results are published as the Science Citation Index (SCI). On the basis of the Science Citation Index and authors' publication lists, the annual citation rate of papers by a scientific author or research group can thus be calculated. Similarly, the citation rate of a scientific journal—known as the journal impact factor—can be calculated as the mean citation rate of all the articles contained in the journal [4]. Journal impact factors, which are published annually in SCI Journal Citation Reports, are widely regarded as …

2,238 citations

01 Jan 1995
TL;DR: A bibliometric analysis of the research papers published in Scientometrics from 1979 to 1991 reveals the focal points, centres of activity, and development trends of scientometric research.
Abstract: This paper presents a bibliometric analysis of the content, sections, authors and their countries, and editorial board members and their countries of the research papers published in the international scientometrics journal Scientometrics from 1979 to 1991. It reveals the focal points, centres of activity, and development trends of scientometric research, and illustrates the role of discipline leaders in developing scientometrics as an emerging field.

1,636 citations

Journal ArticleDOI
TL;DR: The h-indices of a list of highly cited Israeli researchers are compared based on citation counts retrieved from the Web of Science, Scopus, and Google Scholar respectively; in several cases the Google Scholar results differ considerably.
Abstract: This paper compares the h-indices of a list of highly cited Israeli researchers based on citation counts retrieved from the Web of Science, Scopus, and Google Scholar respectively. In several cases, the results obtained through Google Scholar are considerably different from the results based on the Web of Science and Scopus. Data cleansing is discussed extensively.

672 citations
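The comparison the paper performs amounts to computing the same h-index over per-source citation counts. A minimal sketch; the counts below are invented for one hypothetical researcher, whereas the study used real data for highly cited Israeli researchers:

    # invented per-source citation counts for one researcher
    counts_by_source = {
        "Web of Science": [31, 20, 18, 11, 9, 6, 5, 5, 3],
        "Scopus": [34, 22, 19, 13, 10, 8, 7, 5],
        "Google Scholar": [55, 30, 26, 20, 15, 12, 10, 9, 8, 4],
    }

    for source, cites in counts_by_source.items():
        ranked = sorted(cites, reverse=True)
        # h = number of ranks whose paper has at least that many citations
        h = sum(1 for rank, c in enumerate(ranked, start=1) if c >= rank)
        print(f"{source}: h = {h}")  # 6, 7 and 8 respectively here

Because Google Scholar indexes more document types and versions, its counts (and hence the resulting h) tend to run higher, which is why the paper stresses data cleansing.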

Trending Questions (1)
What is bibliometric analysis?

The provided paper does not explicitly define or discuss bibliometric analysis.