Journal ArticleDOI

Multidisciplinary bibliographic databases.

TL;DR: The originator of the idea, Eugene Garfield, formulated several critical points in bibliometrics that have shaped citation indexes: for example, that libraries with limited funding should be selective about the journals they acquire and that a bibliography should selectively cover 'high-quality' sources.
Abstract: The past five decades have witnessed the so-called data deluge and publication explosion across all branches of science (1). Numerous academic journals have been launched that use a systematic approach to the submission, peer review, and publishing of information. To facilitate the wide use of published sources, libraries across the world have expanded cataloguing and advanced literature search techniques. The first major step towards indexing academic journals and helping libraries acquire the most influential sources was made by the Institute for Scientific Information (ISI) in Philadelphia, USA, in 1960. The idea behind indexing and distributing information on published articles was to facilitate scientific communication between authors and readers (2). In other words, indexing was proposed as a tool for finding relevant sources of interest to consumers. The originator of the idea, Eugene Garfield, also the founder of the ISI, formulated several critical points in bibliometrics that have shaped citation indexes, for example: libraries with limited funding should be selective about the journals they acquire; the most-read and highly cited journals constitute 'quality' sources; highly cited articles influence science; citations from highly cited journals are weighted more heavily than those from low-cited ones; and a bibliography should selectively cover 'high-quality' sources.
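
One of Garfield's points above is quantitative: a citation counts for more when it comes from a highly cited journal. A minimal sketch of that weighting idea in Python, with journal names and weights invented purely for illustration (nothing here comes from the article itself):

    # Illustrative only: each citation contributes the citing journal's weight
    # (e.g., a normalized citation rate) instead of counting as 1.
    journal_weight = {"Journal A": 3.0, "Journal B": 1.0, "Journal C": 0.2}

    # Journals from which a hypothetical article received its citations.
    citing_journals = ["Journal A", "Journal A", "Journal C", "Journal B"]

    raw_count = len(citing_journals)                                  # 4
    weighted_count = sum(journal_weight[j] for j in citing_journals)  # 7.2
    print(raw_count, weighted_count)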


Citations

Journal ArticleDOI
TL;DR: This article overviews unethical publishing practices in connection with the pressure to publish more, and several measures are proposed to tackle the issue of predatory publishing.
Abstract: This article overviews unethical publishing practices in connection with the pressure to publish more. Both open-access and subscription publishing models can be abused by ‘predatory’ authors, editors, and publishing outlets. Relevant examples of ‘prolific’ scholars are viewed through the prism of the violation of ethical authorship in established journals and indiscriminately boosting publication records elsewhere. The instances of ethical transgressions by brokering editorial agencies and agents, operating predominantly in non-Anglophone countries, are presented to raise awareness of predatory activities. The scheme of predatory publishing activities is presented, and several measures are proposed to tackle the issue of predatory publishing. The awareness campaigns by professional societies, consultations with information facilitators, implementation of the criteria of best target journals, and crediting of scholars with use of integrative citation metrics, such as the h-index, are believed to make a difference.
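
The measures above include crediting scholars through integrative citation metrics such as the h-index. As a rough sketch (the function and the sample citation counts are illustrative, not taken from the article), the h-index is the largest h such that at least h of an author's papers have at least h citations each:

    def h_index(citations):
        """Largest h such that at least h papers have at least h citations each."""
        ranked = sorted(citations, reverse=True)  # most-cited papers first
        h = 0
        for rank, cites in enumerate(ranked, start=1):
            if cites >= rank:
                h = rank
            else:
                break
        return h

    # Hypothetical author with seven papers: four of them have >= 4 citations.
    print(h_index([25, 8, 5, 4, 3, 3, 0]))  # -> 4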

61 citations


Cites background from "Multidisciplinary bibliographic dat..."

  • ...The choice of a bibliographic database for recording the h-index depends on the indexing status of journals in a given discipline, peculiarities of research environments and regional priorities, with Scopus viewed as the most comprehensive platform for authors from Europe and non-Anglophone countries (36)....


Journal ArticleDOI
TL;DR: The study suggests that the intensified self-correction in biomedicine is due to the attention of readers and authors, who spot errors in their hub of evidence-based information.
Abstract: Aim: To analyze mistakes and misconduct in multidisciplinary and specialized biomedical journals.

59 citations

Journal ArticleDOI
TL;DR: It is found that WoS, INSPEC and Scopus provided better quality indexing and better bibliographic records in terms of accuracy, control and granularity of information, when compared to GS and DBLP.
Abstract: We compared general and specialized databases, by searching bibliographic information regarding journal articles in the computer science field, and by evaluating their bibliographic coverage and the quality of the bibliographic records retrieved. We selected a sample of computer science articles from an Italian university repository (AIR) to carry out our comparison. The databases selected were INSPEC, Scopus, Web of Science (WoS), and DBLP. We found that DBLP and Scopus indexed the highest number of unique articles (4.14% and 4.05%, respectively), that each of the four databases indexed a set of unique articles, that 12.95% of the articles sampled were not indexed in any of the databases selected, that Scopus was better than WoS for identifying computer science publications, and that DBLP had a greater number of unique articles indexed (19.03%) when compared to INSPEC (11.28%). We also measured the quality of a set of bibliographic records, by comparing five databases: Scopus, WoS, INSPEC, DBLP and Google Scholar (GS). We found that WoS, INSPEC and Scopus provided better quality indexing and better bibliographic records in terms of accuracy, control and granularity of information, when compared to GS and DBLP. WoS and Scopus also provided more sophisticated tools for measuring trends of scholarly publications.
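
The percentages reported above (unique articles per database and articles indexed nowhere) follow from simple set arithmetic over the sampled article identifiers. A hedged sketch of that calculation, using made-up identifiers and databases rather than the study's actual data:

    # For each database, 'unique' articles are those it indexes that no other
    # database indexes; the remainder of the sample may be indexed nowhere.
    from typing import Dict, Set

    def coverage_stats(sample: Set[str], indexed: Dict[str, Set[str]]):
        unique_share = {}
        for name, ids in indexed.items():
            others = set().union(*(v for k, v in indexed.items() if k != name))
            unique_share[name] = len((ids - others) & sample) / len(sample) * 100
        not_indexed = sample - set().union(*indexed.values())
        return unique_share, len(not_indexed) / len(sample) * 100

    sample = {"a1", "a2", "a3", "a4", "a5"}
    indexed = {
        "Scopus": {"a1", "a2", "a3"},
        "WoS": {"a2", "a3"},
        "DBLP": {"a4"},
    }
    unique, missing_pct = coverage_stats(sample, indexed)
    print(unique)       # Scopus and DBLP each hold one unique article -> 20.0%
    print(missing_pct)  # a5 is indexed in none of them -> 20.0%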

58 citations


Cites background from "Multidisciplinary bibliographic dat..."

  • ...Keywords: Web of Science, Scopus, DBLP, INSPEC, Google Scholar...


  • ...Other studies also found that GS required extra analyses of the retrieved citing sources, to single out the irrelevant and non-scholarly materials (Gasparyan et al. 2013)....


  • ...The research question was: is there a need to use multiple databases for searching computer science articles?...


  • ...Over the past few years GS has significantly expanded its indexing of full texts of scholarly literature through agreements with publishers (like Elsevier), online libraries and repositories (Gasparyan et al. 2013)....


References
Journal ArticleDOI
TL;DR: The content coverage and practical utility of PubMed, Scopus, Web of Science, and Google Scholar are compared and PubMed remains an optimal tool in biomedical electronic research.
Abstract: The evolution of the electronic age has led to the development of numerous medical databases on the World Wide Web, offering search facilities on a particular subject and the ability to perform citation analysis. We compared the content coverage and practical utility of PubMed, Scopus, Web of Science, and Google Scholar. The official Web pages of the databases were used to extract information on the range of journals covered, search facilities and restrictions, and update frequency. We used the example of a keyword search to evaluate the usefulness of these databases in biomedical information retrieval and a specific published article to evaluate their utility in performing citation analysis. All databases were practical in use and offered numerous search facilities. PubMed and Google Scholar are accessed for free. The keyword search with PubMed offers optimal update frequency and includes online early articles; other databases can rate articles by number of citations, as an index of importance. For citation analysis, Scopus offers about 20% more coverage than Web of Science, whereas Google Scholar offers results of inconsistent accuracy. PubMed remains an optimal tool in biomedical electronic research. Scopus covers a wider journal range, of help both in keyword searching and citation analysis, but it is currently limited to recent articles (published after 1995) compared with Web of Science. Google Scholar, as for the Web in general, can help in the retrieval of even the most obscure information but its use is marred by inadequate, less often updated, citation information.
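
The roughly 20% coverage advantage of Scopus over Web of Science mentioned above is simply a relative difference in the number of citing records each database returns for the same target article. A small illustrative calculation with invented counts (not the study's data):

    # Hypothetical citing-record counts for one target article per database.
    citing_counts = {"Web of Science": 100, "Scopus": 120, "Google Scholar": 135}

    baseline = citing_counts["Web of Science"]
    for db, count in sorted(citing_counts.items(), key=lambda kv: -kv[1]):
        delta = (count - baseline) / baseline * 100
        print(f"{db:15s} {count:4d} citing records ({delta:+.0f}% vs WoS)")
    # Scopus at 120 vs WoS at 100 is the kind of '20% more coverage' figure.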

2,696 citations


"Multidisciplinary bibliographic dat..." refers background in this paper

  • ...Scopus retrieves 20% more citations than WoS (11)....


Journal ArticleDOI
15 Jul 1955-Science
TL;DR: ‘The uncritical citation of disputed data by a writer, whether it be deliberate or not, is a serious matter.’

1,822 citations

01 Jan 1955
TL;DR: The uncritical citation of disputed data by a writer, whether it be deliberate or not, is a serious matter as discussed by the authors, and many naive students may be swayed by unfounded assertions presented by a writer who is unaware of the criticisms.
Abstract: “The uncritical citation of disputed data by a writer, whether it be deliberate or not, is a serious matter. Of course, knowingly propagandizing unsubstantiated claims is particularly abhorrent, but just as many naive students may be swayed by unfounded assertions presented by a writer who is unaware of the criticisms. Buried in scholarly journals, critical notes are increasingly likely to be overlooked with the passage of time, while the studies to which they pertain, having been reported more widely, are apt to be rediscovered.” (I)

1,040 citations
