scispace - formally typeset
Author

Daniel Torres-Salinas

Bio: Daniel Torres-Salinas is an academic researcher from the University of Granada. The author has contributed to research in topics: Altmetrics & Bibliometrics. The author has an h-index of 27 and has co-authored 160 publications receiving 2,637 citations. Previous affiliations of Daniel Torres-Salinas include the Chartered Institute of Management Accountants and the University of Navarra.


Papers
Journal ArticleDOI
TL;DR: Paper accepted for publication in the Journal of the American Society for Information Science and Technology.
Abstract: Paper accepted for publication in the Journal of the American Society for Information Science and Technology. http://www.asis.org/jasist.html

209 citations

Journal ArticleDOI
TL;DR: It is concluded that Altmetric.com is a transparent, rich and accurate tool for altmetric data. Nevertheless, there are still potential limitations to its exhaustiveness, as well as to its selection of social media sources, that need further research.
Abstract: This paper analyzes Altmetric.com, one of the most important altmetric data providers currently in use. We analyzed a set of publications with a DOI indexed in the Web of Science during the period 2011-2013 and collected their data with the Altmetric API. 19% of the original set of papers was retrieved from Altmetric.com with some altmetric data. We identified 16 different social media sources from which Altmetric.com retrieves data; however, five of them cover 95.5% of the total set. Twitter (87.1%) and Mendeley (64.8%) have the highest coverage. We conclude that Altmetric.com is a transparent, rich and accurate tool for altmetric data. Nevertheless, there are still potential limitations to its exhaustiveness, as well as to its selection of social media sources, that need further research.

138 citations
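The per-source coverage figures in the abstract above (e.g. Twitter 87.1%, Mendeley 64.8%) are simple proportions over the set of papers for which any altmetric data was found. A minimal sketch of that calculation, using made-up records (the source names and counts below are illustrative, not the study's data):

```python
from collections import Counter

def source_coverage(papers):
    """papers: list of sets, each holding the social media sources
    that mention one paper. Returns {source: fraction of papers covered}."""
    counts = Counter()
    for sources in papers:
        counts.update(sources)
    n = len(papers)
    return {src: c / n for src, c in counts.items()}

# Toy sample of four papers and the sources reported for each.
papers = [
    {"twitter", "mendeley"},
    {"twitter", "news"},
    {"twitter"},
    {"mendeley", "blogs"},
]
cov = source_coverage(papers)
print(cov["twitter"])  # 0.75 in this toy sample (3 of 4 papers)
```

The same tallying over the study's ~19% of WoS papers with altmetric data would yield its reported coverage percentages.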

Journal ArticleDOI
TL;DR: The results show that the most cited papers are also the ones with the highest impact according to altmetrics; the paper points out the main shortcomings these metrics present and the role they may play when measuring research impact through 2.0 platforms.
Abstract: In this paper we review the so-called altmetrics, or alternative metrics. This concept arises from the development of new indicators, based on Web 2.0, for the evaluation of research and academic activity. The basic assumption is that variables such as mentions in blogs, the number of tweets, or the number of researchers bookmarking a research paper may be legitimate indicators for measuring the use and impact of scientific publications. These indicators are currently the focus of the bibliometric community and are being discussed and debated. We describe the main platforms and indicators, and as a sample we analyze the Spanish research output in Communication Studies, comparing traditional indicators such as citations with these new indicators. The results show that the most cited papers are also the ones with the highest impact according to altmetrics. We conclude by pointing out the main shortcomings these metrics present and the role they may play when measuring research impact through 2.0 platforms.

101 citations

Journal ArticleDOI
TL;DR: This paper illustrates how library catalog analysis (LCA) can be fruitfully used to assess book production and research performance at the level of an individual researcher, a research department, an entire country, and a book publisher.

94 citations

Posted Content
TL;DR: An experiment is presented in which the Google Scholar Citations profiles of a research group are manipulated through the creation of false documents that cite their papers, thereby modifying their h-index and that of the journals in which they have published.
Abstract: The launch of Google Scholar Citations and Google Scholar Metrics may provoke a revolution in the research evaluation field, as it places within every researcher's reach tools that allow bibliometric measuring. In order to alert the research community to how easily one can manipulate the data and bibliometric indicators offered by Google's products, we present an experiment in which we manipulate the Google Scholar Citations profiles of a research group through the creation of false documents that cite their documents, thereby modifying the h-index of the authors and of the journals in which they have published. For this purpose we created six documents authored by a fake author and uploaded them to a researcher's personal website under the University of Granada's domain. The experiment resulted in an increase of 774 citations across 129 papers (six citations per paper), increasing the authors' and journals' h-indexes. We analyse the malicious effect this type of practice can have on Google Scholar Citations and Google Scholar Metrics. Finally, we conclude with several deliberations on the effects these malpractices may have and the lack of control mechanisms these tools offer.

87 citations
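The experiment's effect follows directly from the definition of the h-index: the largest h such that at least h of an author's papers have at least h citations each. A minimal sketch of that calculation with toy citation counts (not the study's data), showing how adding six citations to every paper can raise h:

```python
def h_index(citations):
    """Largest h such that at least h papers have >= h citations each."""
    counts = sorted(citations, reverse=True)
    h = 0
    for rank, count in enumerate(counts, start=1):
        if count >= rank:
            h = rank  # this paper's rank is still covered by its citations
        else:
            break
    return h

# Toy profile: a uniform boost of six citations per paper, as in the
# experiment, pushes mid-ranked papers above the threshold.
before = [10, 8, 5, 4, 2, 1]
after = [c + 6 for c in before]
print(h_index(before), h_index(after))  # 4 6
```

This is why fake documents citing every paper in a profile inflate the h-index so effectively: the boost lands exactly where the h threshold is decided.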


Cited by
01 Jan 1995
TL;DR: A bibliometric analysis of the journal Scientometrics (1979-1991) reveals the focal points, centres of activity, and development trends of scientometric research.
Abstract: This paper presents a bibliometric analysis of the research articles published in the international scientometrics journal Scientometrics from 1979 to 1991, covering their content, the journal's sections, the authors and their countries, and the editorial board members and their countries. It reveals the focal points, centres of activity, and development trends of scientometric research, and illustrates the role of discipline leaders in the development of scientometrics as an emerging field.

1,636 citations

Journal ArticleDOI
TL;DR: A longitudinal comparison of eight data points between 2013 and 2015 shows a consistent and reasonably stable quarterly growth for both publications and citations across the three databases, suggesting that all three databases provide sufficient stability of coverage to be used for more detailed cross-disciplinary comparisons.
Abstract: This article aims to provide a systematic and comprehensive comparison of the coverage of the three major bibliometric databases: Google Scholar, Scopus and the Web of Science. Based on a sample of 146 senior academics in five broad disciplinary areas, we therefore provide both a longitudinal and a cross-disciplinary comparison of the three databases. Our longitudinal comparison of eight data points between 2013 and 2015 shows a consistent and reasonably stable quarterly growth for both publications and citations across the three databases. This suggests that all three databases provide sufficient stability of coverage to be used for more detailed cross-disciplinary comparisons. Our cross-disciplinary comparison of the three databases includes four key research metrics (publications, citations, h-index, and hI,annual, an annualised individual h-index) and five major disciplines (Humanities, Social Sciences, Engineering, Sciences and Life Sciences). We show that both the data source and the specific metrics used change the conclusions that can be drawn from cross-disciplinary comparisons.

930 citations

Journal ArticleDOI
17 Sep 2015-PLOS ONE
TL;DR: It is concluded that whilst Google Scholar can find much grey literature and specific, known studies, it should not be used alone for systematic review searches, rather, it forms a powerful addition to other traditional search methods.
Abstract: Google Scholar (GS), a commonly used web-based academic search engine, catalogues between 2 and 100 million records of both academic and grey literature (articles not formally published by commercial academic publishers). Google Scholar collates results from across the internet and is free to use. As a result it has received considerable attention as a method for searching for literature, particularly in searches for grey literature, as required by systematic reviews. The reliance on GS as a standalone resource has been greatly debated, however, and its efficacy in grey literature searching has not yet been investigated. Using systematic review case studies from environmental science, we investigated the utility of GS in systematic reviews and in searches for grey literature. Our findings show that GS results contain moderate amounts of grey literature, with the majority found on average at page 80. We also found that, when searched for specifically, the majority of literature identified using Web of Science was also found using GS. However, our findings showed moderate/poor overlap in results when similar search strings were used in Web of Science and GS (10–67%), and that GS missed some important literature in five of six case studies. Furthermore, a general GS search failed to find any grey literature from a case study that involved manual searching of organisations’ websites. If used in systematic reviews for grey literature, we recommend that searches of article titles focus on the first 200 to 300 results. We conclude that whilst Google Scholar can find much grey literature and specific, known studies, it should not be used alone for systematic review searches. Rather, it forms a powerful addition to other traditional search methods. In addition, we advocate the use of tools to transparently document and catalogue GS search results to maintain high levels of transparency and the ability to be updated, critical to systematic reviews.

901 citations
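The overlap figures the study reports (10-67% between Web of Science and Google Scholar for similar search strings) can be expressed as simple set arithmetic: the fraction of one database's results also found in the other. A sketch with hypothetical record identifiers (the exact overlap definition used in the study may differ):

```python
def overlap(a, b):
    """Fraction of records in set a that also appear in set b."""
    if not a:
        return 0.0
    return len(a & b) / len(a)

# Hypothetical result sets from two databases for the same search string.
wos = {"rec1", "rec2", "rec3", "rec4"}
gs = {"rec2", "rec3", "rec5", "rec6", "rec7"}
print(overlap(wos, gs))  # 0.5: two of the four WoS records also appear in GS
```

Note this measure is asymmetric (overlap(wos, gs) generally differs from overlap(gs, wos)), which matters when one database returns far more records than the other, as Google Scholar typically does.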

Journal ArticleDOI
Ludo Waltman1
TL;DR: In this paper, an in-depth review of the literature on citation impact indicators is provided, focusing on the selection of publications and citations to be included in the calculation of citation impact indicators.

774 citations

Journal ArticleDOI
TL;DR: Martin-Martin's work on this article was funded by a four-year doctoral fellowship (FPU2013/05863) granted by the Ministerio de Educación, Cultura y Deporte (Spain).

763 citations