Author

Robert J. W. Tijssen

Other affiliations: Stellenbosch University
Bio: Robert J. W. Tijssen is an academic researcher from Leiden University. The author has contributed to research on the topics of citation and citation analysis. The author has an h-index of 34 and has co-authored 87 publications receiving 5,108 citations. Previous affiliations of Robert J. W. Tijssen include Stellenbosch University.
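The h-index cited in the bio can be read as: the largest number h such that h of the author's papers have each received at least h citations. A minimal Python sketch, using invented citation counts rather than Tijssen's actual record:

def h_index(citations):
    # Largest h such that at least h papers have at least h citations each.
    counts = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(counts, start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

print(h_index([10, 8, 5, 4, 3, 2, 1]))  # -> 4: four papers have at least 4 citations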


Papers
Journal Article
TL;DR: In this article, the authors analyzed the changing effect of physical distance and territorial borders on the intensity of research collaboration across European regions and found that the bias to collaborate with physically proximate partners did not decrease, while the bias towards collaboration within territorial borders did decrease over time.

463 citations

Journal Article
TL;DR: The authors conclude that the value of impact indicators of research activities at the level of an institution or a country strongly depends upon whether one includes or excludes research publications in SCI-covered journals written in languages other than English.
Abstract: Empirical evidence presented in this paper shows that the utmost care must be taken in interpreting bibliometric data in a comparative evaluation of national research systems. From the results of recent studies, the authors conclude that the value of impact indicators of research activities at the level of an institution or a country strongly depends upon whether one includes or excludes research publications in SCI-covered journals written in languages other than English. Additional material was gathered to show the distribution of SCI papers among publication languages. Finally, the authors make suggestions for further research on how to deal with this type of problem in future national research performance studies.

433 citations
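To make the sensitivity described above concrete, here is a minimal Python sketch of how an average-impact indicator shifts depending on whether non-English SCI papers are included. The publication records are invented for illustration, not taken from the paper.

# Invented publication records: language of publication and citation count.
papers = [
    {"lang": "EN", "cites": 12},
    {"lang": "EN", "cites": 8},
    {"lang": "EN", "cites": 5},
    {"lang": "DE", "cites": 1},
    {"lang": "FR", "cites": 0},
]

def mean_impact(records):
    # Mean citations per paper, a crude stand-in for an impact indicator.
    return sum(p["cites"] for p in records) / len(records)

print(round(mean_impact(papers), 2))                                    # 5.2  with all languages included
print(round(mean_impact([p for p in papers if p["lang"] == "EN"]), 2))  # 8.33 English-only view looks stronger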

Journal Article
TL;DR: The Leiden Ranking 2011/2012, as discussed by the authors, is a ranking of universities based on bibliometric indicators of publication output, citation impact, and scientific collaboration; it covers 500 major universities from 41 different countries.
Abstract: The Leiden Ranking 2011/2012 is a ranking of universities based on bibliometric indicators of publication output, citation impact, and scientific collaboration. The ranking includes 500 major universities from 41 different countries. This paper provides an extensive discussion of the Leiden Ranking 2011/2012. The ranking is compared with other global university rankings, in particular the Academic Ranking of World Universities (commonly known as the Shanghai Ranking) and the Times Higher Education World University Rankings. The comparison focuses on the methodological choices underlying the different rankings. Also, a detailed description is offered of the data collection methodology of the Leiden Ranking 2011/2012 and of the indicators used in the ranking. Various innovations in the Leiden Ranking 2011/2012 are presented. These innovations include (1) an indicator based on counting a university's highly cited publications, (2) indicators based on fractional rather than full counting of collaborative publications, (3) the possibility of excluding non-English language publications, and (4) the use of stability intervals. Finally, some comments are made on the interpretation of the ranking and a number of limitations of the ranking are pointed out. © 2012 Wiley Periodicals, Inc.

376 citations
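The contrast between full and fractional counting mentioned as innovation (2) in the abstract above can be illustrated with a short Python sketch. The attribution rule here (each paper split equally over its n contributing universities) is a simplification; the actual Leiden Ranking methodology works at the level of author addresses and applies further normalizations. University names and paper counts are invented.

from collections import defaultdict

# Each inner list holds the universities contributing to one publication (invented).
papers = [
    ["Leiden Univ"],
    ["Leiden Univ", "Stellenbosch Univ"],
    ["Leiden Univ", "Stellenbosch Univ", "Univ of Oslo"],
]

full = defaultdict(float)
fractional = defaultdict(float)
for affiliations in papers:
    for uni in affiliations:
        full[uni] += 1.0                             # full counting: one credit per paper
        fractional[uni] += 1.0 / len(affiliations)   # fractional counting: credit shared equally

print(dict(full))        # {'Leiden Univ': 3.0, 'Stellenbosch Univ': 2.0, 'Univ of Oslo': 1.0}
print(dict(fractional))  # {'Leiden Univ': 1.83, 'Stellenbosch Univ': 0.83, 'Univ of Oslo': 0.33} (rounded)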

Posted Content
TL;DR: The Leiden Ranking is compared with other global university rankings, in particular the Academic Ranking of World Universities (commonly known as the Shanghai Ranking) and the Times Higher Education World University Rankings, and the comparison focuses on the methodological choices underlying the different rankings.
Abstract: The Leiden Ranking 2011/2012 is a ranking of universities based on bibliometric indicators of publication output, citation impact, and scientific collaboration. The ranking includes 500 major universities from 41 different countries. This paper provides an extensive discussion of the Leiden Ranking 2011/2012. The ranking is compared with other global university rankings, in particular the Academic Ranking of World Universities (commonly known as the Shanghai Ranking) and the Times Higher Education World University Rankings. Also, a detailed description is offered of the data collection methodology of the Leiden Ranking 2011/2012 and of the indicators used in the ranking. Various innovations in the Leiden Ranking 2011/2012 are presented. These innovations include (1) an indicator based on counting a university's highly cited publications, (2) indicators based on fractional rather than full counting of collaborative publications, (3) the possibility of excluding non-English language publications, and (4) the use of stability intervals. Finally, some comments are made on the interpretation of the ranking, and a number of limitations of the ranking are pointed out.

338 citations

Journal Article
TL;DR: The paper concludes that, in an analysis of collaborative links, it is essential to use both absolute and relative measures, which normalize differences in country size.
Abstract: A growing science policy interest in international scientific collaboration has brought about a multitude of studies which attempt to measure the extent of international scientific collaboration between countries and to explore intercountry collaborative networks. This paper attempts to clarify the methodology that is being used or can be used for this purpose and discusses the adequacy of the methods. The paper concludes that, in an analysis of collaborative links, it is essential to use both absolute and relative measures. The latter normalize differences in country size. Each yields a different type of information. Absolute measures yield an answer to questions such as which countries are central in the international network of science, whether collaborative links reveal a centre-periphery relationship, and which countries are the most important collaborative partners of another country. Relative measures provide answers to questions of the intensity of collaborative links.

326 citations
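As an illustration of the absolute-versus-relative distinction drawn in this paper, the Python sketch below contrasts raw co-publication counts with a size-normalized collaboration strength. The normalization used (dividing by the geometric mean of the two countries' publication outputs, a Salton-style cosine measure) is only one of the options discussed in this literature, and all figures are hypothetical.

import math

# Hypothetical co-publication counts between country pairs and total output per country.
co_pubs = {("US", "NL"): 1200, ("US", "BE"): 900, ("NL", "BE"): 300}
total_pubs = {"US": 250_000, "NL": 15_000, "BE": 9_000}

def relative_strength(a, b):
    # Co-publication count normalized by the geometric mean of the two countries' output.
    count = co_pubs.get((a, b)) or co_pubs.get((b, a), 0)
    return count / math.sqrt(total_pubs[a] * total_pubs[b])

for pair, count in co_pubs.items():
    print(pair, count, round(relative_strength(*pair), 4))
# Absolute view: the two US links dominate; relative view: the NL-BE link is the most intense.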


Cited by
Posted Content
TL;DR: The Oxford Handbook of Innovation as mentioned in this paper provides a comprehensive and holistic understanding of the phenomenon of innovation, with a focus on firms and networks, and the consequences of innovation with respect to economic growth, international competitiveness, and employment.
Abstract: This handbook looks to provide academics and students with a comprehensive and holistic understanding of the phenomenon of innovation. Innovation spans a number of fields within the social sciences and humanities: Management, Economics, Geography, Sociology, Politics, Psychology, and History. Consequently, the rapidly increasing body of literature on innovation is characterized by a multitude of perspectives based on, or cutting across, existing disciplines and specializations. Scholars of innovation can come from such diverse starting points that much of this literature can be missed, and constructive dialogues missed with it. The editors of The Oxford Handbook of Innovation have carefully selected and designed twenty-one contributions from leading academic experts within their particular field, each focusing on a specific aspect of innovation. These have been organized into four main sections, the first of which looks at the creation of innovations, with particular focus on firms and networks. Section Two provides an account of the wider systematic setting influencing innovation and the role of institutions and organizations in this context. Section Three explores some of the diversity in the working of innovation over time and across different sectors of the economy, and Section Four focuses on the consequences of innovation with respect to economic growth, international competitiveness, and employment. An introductory overview, concluding remarks, and a guide to further reading for each chapter make this handbook a key introduction and vital reference work for researchers, academics, and advanced students of innovation. Contributors to this volume: Jan Fagerberg, University of Oslo; William Lazonick, INSEAD; Walter W. Powell, Stanford University; Keith Pavitt, SPRU; Alice Lam, Brunel University; Keith Smith, INTECH; Charles Edquist, Linkoping; David Mowery, University of California, Berkeley; Mary O'Sullivan, INSEAD; Ove Granstrand, Chalmers; Bjorn Asheim, University of Lund; Rajneesh Narula, Copenhagen Business School; Antonello Zanfei, Urbino; Kristine Bruland, University of Oslo; Franco Malerba, University of Bocconi; Nick Von Tunzelmann, SPRU; Ian Miles, University of Manchester; Bronwyn Hall, University of California, Berkeley; Bart Verspagen, ECIS; Francisco Louca, ISEG; Manuel M. Godinho, ISEG; Richard R. Nelson; Mario Pianta, Urbino; Bengt-Ake Lundvall, Aalborg

3,040 citations

Journal Article
TL;DR: In this paper, the authors distinguish between collaboration at different levels and show that inter-institutional and international collaboration need not involve inter-individual collaboration; they argue for a more symmetrical approach that weighs the costs of collaboration against its undoubted benefits when considering policies towards research collaboration.

2,594 citations

Journal Article
Chaomei Chen
TL;DR: This article describes the latest development of a generic approach to detecting and visualizing emerging trends and transient patterns in scientific literature, and makes substantial theoretical and methodological contributions to progressive knowledge domain visualization.
Abstract: This article describes the latest development of a generic approach to detecting and visualizing emerging trends and transient patterns in scientific literature. The work makes substantial theoretical and methodological contributions to progressive knowledge domain visualization. A specialty is conceptualized and visualized as a time-variant duality between two fundamental concepts in information science: research fronts and intellectual bases. A research front is defined as an emergent and transient grouping of concepts and underlying research issues. The intellectual base of a research front is its citation and co-citation footprint in scientific literature—an evolving network of scientific publications cited by research-front concepts. Kleinberg's (2002) burst-detection algorithm is adapted to identify emergent research-front concepts. Freeman's (1979) betweenness centrality metric is used to highlight potential pivotal points of paradigm shift over time. Two complementary visualization views are designed and implemented: cluster views and time-zone views. The contributions of the approach are that (a) the nature of an intellectual base is algorithmically and temporally identified by emergent research-front terms, (b) the value of a co-citation cluster is explicitly interpreted in terms of research-front concepts, and (c) visually prominent and algorithmically detected pivotal points substantially reduce the complexity of a visualized network. The modeling and visualization process is implemented in CiteSpace II, a Java application, and applied to the analysis of two research fields: mass extinction (1981–2004) and terrorism (1990–2003). Prominent trends and pivotal points in visualized networks were verified in collaboration with domain experts, who are the authors of pivotal-point articles. Practical implications of the work are discussed. A number of challenges and opportunities for future studies are identified. © 2006 Wiley Periodicals, Inc.

2,521 citations
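A small Python sketch of the role Freeman's betweenness centrality plays in the approach described above: references that bridge otherwise separate co-citation clusters receive high scores and are flagged as candidate pivotal points. The toy network and the use of the networkx library are illustrative choices, not part of CiteSpace II itself.

import networkx as nx

G = nx.Graph()
# Two dense co-citation clusters of references...
G.add_edges_from([("A1", "A2"), ("A2", "A3"), ("A1", "A3")])
G.add_edges_from([("B1", "B2"), ("B2", "B3"), ("B1", "B3")])
# ...bridged by a single reference linking them.
G.add_edges_from([("A3", "Pivot"), ("Pivot", "B1")])

centrality = nx.betweenness_centrality(G)
pivot, score = max(centrality.items(), key=lambda kv: kv[1])
print(pivot, round(score, 3))  # the bridging node scores highest and would be flagged as pivotal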

Posted Content
TL;DR: The process of innovation must be viewed as a series of changes in a complete system not only of hardware, but also of market environment, production facilities and knowledge, and the social contexts of the innovation organization as discussed by the authors.
Abstract: Models that depict innovation as a smooth, well-behaved linear process badly misspecify the nature and direction of the causal factors at work. Innovation is complex, uncertain, somewhat disorderly, and subject to changes of many sorts. Innovation is also difficult to measure and demands close coordination of adequate technical knowledge and excellent market judgment in order to satisfy economic, technological, and other types of constraints—all simultaneously. The process of innovation must be viewed as a series of changes in a complete system not only of hardware, but also of market environment, production facilities and knowledge, and the social contexts of the innovation organization.

2,154 citations

Journal Article
TL;DR: In this article, the authors compared the coverage of active scholarly journals in the Web of Science (WoS; 13,605 journals) and Scopus (20,346 journals) with Ulrich's extensive periodical directory (63,013 journals) to assess whether particular fields, publishing countries and languages are over- or underrepresented.
Abstract: Bibliometric methods are used in multiple fields for a variety of purposes, namely for research evaluation. Most bibliometric analyses have in common their data sources: Thomson Reuters' Web of Science (WoS) and Elsevier's Scopus. The objective of this research is to describe the journal coverage of those two databases and to assess whether some field, publishing country and language are over or underrepresented. To do this we compared the coverage of active scholarly journals in WoS (13,605 journals) and Scopus (20,346 journals) with Ulrich's extensive periodical directory (63,013 journals). Results indicate that the use of either WoS or Scopus for research evaluation may introduce biases that favor Natural Sciences and Engineering as well as Biomedical Research to the detriment of Social Sciences and Arts and Humanities. Similarly, English-language journals are overrepresented to the detriment of other languages. While both databases share these biases, their coverage differs substantially. As a consequence, the results of bibliometric analyses may vary depending on the database used. These results imply that in the context of comparative research evaluation, WoS and Scopus should be used with caution, especially when comparing different fields, institutions, countries or languages. The bibliometric community should continue its efforts to develop methods and indicators that include scientific output that is not covered in WoS or Scopus, such as field-specific and national citation indexes.

1,686 citations
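The kind of coverage comparison described above can be sketched in a few lines of Python: for each broad field, compute the share of Ulrich's journals that a database indexes and compare the shares across fields. All journal counts below are invented placeholders, not the figures reported in the paper.

# Invented journal counts per broad field: Ulrich's as the baseline, WoS as the database under study.
ulrichs = {"Natural Sciences": 20_000, "Biomedical": 15_000,
           "Social Sciences": 18_000, "Arts & Humanities": 10_000}
wos = {"Natural Sciences": 6_000, "Biomedical": 4_500,
       "Social Sciences": 2_200, "Arts & Humanities": 900}

for field, total in ulrichs.items():
    print(f"{field}: {wos[field] / total:.1%} of Ulrich's journals covered")
# Unequal coverage shares across fields are the kind of bias the paper documents.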