


D-Lib Magazine
September 2005

Volume 11 Number 9

ISSN 1082-9873

An Examination of Citation Counts in a New Scholarly Communication Environment


Kathleen Bauer
ELI Integrated Interfaces Librarian
Yale University Library

Nisa Bakkalbasi
General Science Librarian
Kline Science Library
Yale University




Citation analysis is an important tool used to trace scholarly research, measure impact, and justify tenure and funding decisions. Web of Science, which indexes peer-reviewed journal literature, has long been the major research database for citation tracking. Scholarly communication is changing rapidly, however, with the growth of preprint/postprint servers, technical reports distributed via the internet, and open access e-journals, and traditional citation tracking using Web of Science may miss much of this new activity. Two new tools are now available for counting citations: Scopus and Google Scholar. This paper presents a case study comparing the citation counts provided by Web of Science, Scopus, and Google Scholar for articles from the Journal of the American Society for Information Science and Technology (JASIST) published in 1985 and in 2000, using a paired t-test to determine statistical significance. Web of Science provided the largest citation counts for the 1985 articles, although this could not be tested statistically. For JASIST articles published in 2000, Google Scholar provided significantly higher citation counts than either Web of Science or Scopus, while there was no significant difference between Web of Science and Scopus. The implications for measuring impact in a changing scholarly communication environment are examined.

1. Introduction

Scholarly communication has changed rapidly over the last decade. Authors have new avenues for researching, publishing and disseminating their work, including preprint/postprint servers, open access journals, and electronic journals (Borgman & Furner, 2002). These new methods of scholarly communication have met with varying degrees of acceptance in different disciplines, ranging from broad acceptance in physics to resistance in medicine (Eysenbach, 2000; Garson, 2004; Kling & McKim, 1999). New means of sharing research results include arXiv, a preprint server for electronic publications in physics, computer science, mathematics and biology, and RePEc, a digital repository for economics papers. In a rapidly changing environment, anyone who wishes to find highly influential articles would be wise to consider carefully how changes in the way scholars publish and communicate findings affect that search. One method traditionally used to track and measure impact is citation analysis, and in this paper we consider whether methods of citation analysis need to change to account for new methods of scholarly communication.

Citation analysis allows a researcher to follow the development and impact of an article through time, by looking backward at the references the author cites and forward to the authors who subsequently cite the article. Citation analysis was made popular by the work of Garfield (Presley & Caraway, 1999), who created three indices to record citations to articles: Science Citation Index, Social Sciences Citation Index and Arts & Humanities Citation Index. These three print resources were combined into a single database, Web of Science, which constitutes a powerful interdisciplinary research tool. Web of Science builds on citation analysis to determine journal impact factors, calculated as the average number of citations per article to items the journal published in the preceding two years (Garfield, 1972). This is a controversial, but highly influential, measure (Seglen, 1997; Walter, Bloch, Hunt, & Fisher, 2003). Web of Science also builds on the work of Bradford (1953), who found that for a particular scientific discipline a fairly small number of highly productive journals provides two thirds of all cited references. Web of Science is designed to cover only the most important journals, thus capturing the greatest percentage of citing activity in an efficient manner. Web of Science is also limited to Roman-alphabet journals that provide bibliographic elements in English (Garfield, 1972), which has contributed to a tilt toward English-language publications in its coverage (Seglen, 1997).
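The two-year impact factor described above reduces to a simple ratio. The counts below are hypothetical, purely to illustrate the arithmetic:

```python
# Two-year journal impact factor for year Y (per Garfield, 1972):
# citations received in Y to items published in Y-1 and Y-2, divided by
# the number of citable articles published in Y-1 and Y-2.
# All counts here are invented for illustration.
citations_2004_to_2002_03 = 340   # citations made in 2004 to 2002-2003 articles
articles_2002_03 = 200            # citable articles published in 2002-2003

impact_factor = citations_2004_to_2002_03 / articles_2002_03
print(impact_factor)  # 1.7
```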

Until recently, Web of Science was unique in providing interdisciplinary coverage and citation tracking. Late in 2004 Elsevier introduced Scopus, offering coverage of articles from journals in science and social science, along with citation tracking. Scopus indexes more journals than Web of Science and has greater coverage of open access and international journals. Reviewers generally credit Scopus with a superior interface, but it lacks Web of Science's depth of back-year coverage of scientific journals and has no humanities coverage (Deis & Goodman, 2005; Jacso, 2004b; LaGuardia, 2005). Scopus offers a search of internet resources through its Scirus component, but the web results are kept separate and are not included in citation tracking.

In November 2004 Google, producer of the most popular internet search engine (Fallows, 2005), introduced Google Scholar in beta version, a freely available service that uses Google's crawler to index the content of scholarly material and uses citation counts to raise or lower individual articles in the rankings of a result set. Google Scholar offers citation counts and citation tracking for articles and other material. Google will not be explicit about the material indexed in Google Scholar, a major drawback to its use as a scholarly tool. Google Scholar does say that all "major" publishers except Elsevier and ACS have cooperated in making their material available to Google Scholar (Jacso, 2004a). Nevertheless, Google Scholar contains citations to articles from Elsevier and ACS when other journals cite those articles. It is the non-journal material indexed that makes Google Scholar stand out in comparison to Web of Science and Scopus. Again, Google is not explicit about what is covered, but searches have revealed preprints/postprints from arXiv and RePEc, conference proceedings, technical reports, books, and dissertations in addition to electronic journal articles from traditional publishers. This is potentially a powerful new tool for citation analysis, particularly in subject areas that have experienced rapid changes in communication and publishing.

While some researchers have argued that citation counts and journal impact factors are overused and overvalued, in practice academics rely on citation counts to show that their work has influence, to aid in tenure decisions, and to win grant funding (Debackere & Glanzel, 2004; Vincent & Ross, 2000). This preliminary study refrains from assessing the appropriateness of the importance placed on citations. Rather, it contends that new citation tracking resources, motivated by and developed using new technology, should be explored further to determine whether they provide more complete citation information for some scholarly work. To that end, this study compares the citation counts from the three resources mentioned above for research articles taken from one journal, the Journal of the American Society for Information Science and Technology (JASIST), for the years 1985 (before materials were available online) and 2000 (after publishing had expanded to include electronic full text and other web-based documents). JASIST was chosen for this study because it mainly publishes new research, it is focused on one subject area (information science), it has been published since 1950, and it is available with membership in a scholarly association, the American Society for Information Science and Technology.

2. Methodology

JASIST was chosen as the sample base for the dataset because it is a highly regarded journal known to receive a high number of citations. To construct the dataset, we obtained the complete list of articles published in 2000 and 1985, corresponding to two citation periods of different lengths. The raw data consisted of 105 articles for 2000 and 41 for 1985. Because research articles are likely to receive many more citations, we excluded editorial material, letters, corrections, biographical items, and book reviews from the study. Then, during the week of April 18-22, 2005, we extracted the citation counts for each research article from three sources: Web of Science, Scopus, and Google Scholar. The absence of an entire issue from Scopus for 2000 resulted in the elimination of 6 records, a loss of approximately 6%. Missing data in Scopus and Google Scholar for 1985 resulted in the loss of 14 records (34%). To determine whether the citation counts from the three tools differ significantly, we used a paired t-test; the high number of missing records for 1985 prevented a meaningful statistical analysis of that year. The statistical software package SPSS was used for all statistical computations.
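The paired t-test used here can be sketched in a few lines. The citation counts below are invented for illustration, not the study's data, and the statistic is computed by hand rather than in SPSS:

```python
import math
import statistics

# Hypothetical citation counts for the same five articles in two tools
# (illustrative numbers only, not the study's data).
wos = [12, 3, 7, 0, 25]   # Web of Science counts
gs = [15, 6, 7, 2, 31]    # Google Scholar counts for the same articles

diffs = [g - w for g, w in zip(gs, wos)]   # paired differences
d_bar = statistics.mean(diffs)             # mean difference
s_d = statistics.stdev(diffs)              # sample std. dev. of differences
n = len(diffs)
t = d_bar / (s_d / math.sqrt(n))           # paired t statistic, df = n - 1
```

The same result can be obtained directly with `scipy.stats.ttest_rel(gs, wos)`; the manual version simply makes the formula visible.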

3. Findings

Table 1 displays the descriptive statistics of the citation count from each of the three resources as well as the mean difference of citation counts for each pair of the resources in year 2000. For example, the number of times an article is cited in Web of Science (WoS) ranges from a minimum of 0 to a maximum of 52 with an average of 7.6 and a standard deviation of 8.3. In addition, on average, Google Scholar (GS) detects 4.5 more citations than WoS. Other columns in the table display similar numbers for Scopus (Sco), GS and the paired differences.

Table 1: Descriptive statistics for the citation counts for year 2000 (n=105)
  WoS Sco GS WoS-Sco WoS-GS Sco-GS
Mean 7.6 7.6 12.1 0.3 -4.5 -3.9
St. Dev. 8.3 9.0 12.7 1.9 7.8 6.7
Min. 0 0 0 -8 -39 -36
Max. 52 60 64 4 8 7


Figure 1 shows the frequency distribution of citation count differences. The shape of the distribution suggests that the frequency distribution of the difference for each pair is approximately normal, satisfying a major assumption in the use of the paired t-test.

Figure 1: Stem & Leaf plots of the citation count differences (WoS-Sco, WoS-GS, Sco-GS) for year 2000
[Three side-by-side stem-and-leaf plots; stem width 1, each leaf 1 case. The column layout did not survive text extraction; the differences cluster around zero for WoS-Sco and on negative stems for WoS-GS and Sco-GS.]


Table 2 displays the descriptive statistics for the citation counts for year 1985. As expected, we observe that the average number of citations for WoS is the highest, due primarily to the maturity of WoS as a citation resource.

Table 2: Descriptive statistics for the year 1985 (n=27)
  WoS Sco GS WoS-Sco WoS-GS Sco-GS
Mean 14.1 3.7 6.8 11.9 8.7 -2.9
St. Dev. 16.4 4.6 9.9 10.9 9.6 6.9
Min. 0 0 0 -8 -2 -27
Max. 58 16 39 42 43 4


For year 1985, the Stem & Leaf plots of the paired differences were either scattered or highly skewed, violating a major assumption of the paired t-test; therefore, no test of statistical significance was made.

To test the hypothesis of no difference among the citation counts extracted by the three resources for year 2000, a paired t-test was performed for each pair at a significance level of α = 0.05. Table 3 displays the results.

H0  :  µD = 0

HA  :  µD ≠ 0, where µD is the mean of the paired differences between two resources

Table 3: Paired t-test for the year 2000
  Correlation t df t0.05,df 95% CI
WoS-Sco 0.98 -1.372 98 ± 1.661 [-0.667, 0.122]
Sco-GS 0.85 -5.824 98 ± 1.661 [-5.268, -2.590]
WoS-GS 0.80 -5.980 105 ± 1.645 [-6.049, -3.036]
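The t values in Table 3 can be recovered from the Table 1 summary statistics. The sketch below uses the rounded WoS-GS figures (mean -4.5, standard deviation 7.8, n = 105), so it differs slightly from the -5.980 reported, and it takes the conventional two-sided 95% critical value for df = 104 as approximately 1.98 (an assumption; the table itself lists one-tailed values):

```python
import math

# Rounded summary statistics for the WoS - GS paired difference (Table 1).
d_bar, s_d, n = -4.5, 7.8, 105

se = s_d / math.sqrt(n)     # standard error of the mean difference
t = d_bar / se              # paired t statistic, about -5.91
ci = (d_bar - 1.98 * se, d_bar + 1.98 * se)   # approximate two-sided 95% CI
```

Because the interval excludes zero (roughly -6.0 to -3.0), the WoS-GS difference is significant at the 0.05 level, matching the conclusion drawn from Table 3.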


With a significance level of 0.05, we fail to reject the null hypothesis for the WoS-Sco pair and conclude that the mean citation counts from WoS and Sco for year 2000 are not significantly different. On the other hand, we reject the null hypothesis for both the WoS-GS and the Sco-GS pairs and conclude that their mean citation counts differ significantly for year 2000.

For this one journal, JASIST, older material is covered most completely by Web of Science. For articles published in 2000, Web of Science and Scopus do not differ significantly in mean citation counts, while Google Scholar produces a significantly larger mean number of citations than either Web of Science or Scopus.

Although this study focuses on the number of citations to an article, we cannot ignore the complications that a citation count may hide. Consider one record: "Individual differences and the conundrums of user-centered design: Two experiments" by Allen, published in JASIST in 2000. Google Scholar, Web of Science and Scopus each indicate exactly eleven citations for this article, but a closer examination reveals that these are not the same eleven citations: Web of Science has one unique citation, Scopus has one, and Google Scholar has four. The superset of citations revealed for this article by the three resources is in fact sixteen, not eleven. At the core of the set, all three resources return traditional journal articles. Google Scholar and Scopus, but not Web of Science, find a record from the open-access journal BMC Medical Informatics and Decision Making. Google Scholar also finds a preprint on a university website, "Modèles et facteurs humains en IHM, application à la réalité virtuelle" [models and human factors in HCI, applied to virtual reality], and "Impact of large displays on virtual reality task performance" from the Proceedings of the 3rd International Conference on Computer Graphics, Virtual Reality, Visualisation and Interaction in Africa. This set reflects what is seen in other cases, where the unique citations found by Google Scholar often comprise a mix of materials, including technical reports, dissertations, conference proceedings, and preprints/postprints, in addition to electronic articles from more traditional publishers.
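A quick way to see how three tools can each report eleven citations yet jointly cover sixteen is to take the set union of the citation lists. The identifiers below are invented, and the overlap arrangement is just one possibility consistent with the counts reported above (eleven citations per tool; one, one, and four unique records; a union of sixteen):

```python
# Invented citation identifiers; only the counts mirror the Allen example.
core = {f"shared{i}" for i in range(7)}     # records found by all three tools
both_ws = {"pair1", "pair2", "pair3"}       # found by WoS and Scopus only

wos = core | both_ws | {"wos_only"}                           # 11 citations
sco = core | both_ws | {"sco_only"}                           # 11 citations
gs = core | {"gs_only1", "gs_only2", "gs_only3", "gs_only4"}  # 11 citations

superset = wos | sco | gs                   # union across all three tools
print(len(wos), len(sco), len(gs), len(superset))  # 11 11 11 16
```

The point of the exercise: identical counts say nothing about identical coverage, so the union across tools is what actually measures an article's reach.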

Beyond the major interdisciplinary resources examined here, other databases and digital libraries are providing citation tracking, so scholars in some disciplines may need to consider sources beyond Web of Science, Google Scholar and Scopus. These additional resources are more narrowly defined databases, including CINAHL (nursing), PsycINFO (psychology and psychiatry), and Academic Search Premier (general interest). Examinations of the usefulness of these more narrowly focused resources have begun: Abt (2004) has reported that the NASA Astrophysics Data System produces 15% more citations than Web of Science, mainly from conference proceedings.

4. Conclusions and Further Study

Older material from JASIST (the proxy is the 1985 data set) appears to be best covered by Web of Science, although this could not be confirmed statistically due to the small size of our data set. Newer material (the proxy is the 2000 data set) receives higher citation counts in Google Scholar than in either Web of Science or Scopus, while there is no statistical difference between the citation counts reported by Web of Science and Scopus. Additional ad hoc searches in Web of Science, Scopus and Google Scholar support this finding for articles in other publications and other disciplines, but a more rigorous study is required before it can be stated definitively. Based on our preliminary examination and the discovery of higher citation counts, we recommend that researchers consult Google Scholar in addition to Web of Science or Scopus, especially for a relatively recent article, author or subject area. A search of Google Scholar will likely reveal both traditional journal articles, some of which will also be covered in Web of Science and Scopus, and additional unique material, though the scholarly value of some of that unique material remains an open question. Consulting Google Scholar may prove most useful for disciplines such as physics, where nontraditional forms of publishing are widely accepted. However, all researchers should note that until Google Scholar gives a full account of what material it indexes and how often that index is updated, it cannot be considered a true scholarly resource in the sense that Web of Science and Scopus are. An understanding of the material being covered is central to the validity of any search of scholarly material.

We propose to address the limitation of missing records for older material in a future study by widening our scope to articles published in several journals, in different disciplines, and across many years. The larger data set and more comprehensive approach will allow us to present more generalizable results and will indicate whether the difference in the number of citing articles is more pronounced in some disciplines than in others. Furthermore, we will be able to provide statistically grounded insight to users of citation count services on the effect of new modes of scholarly communication, such as preprint/postprint servers, dissertations, conference proceedings and open access journals. Perhaps no single resource can now be considered sufficient for researching the citation count of an article or an author. Web of Science offers the most comprehensive coverage back in time, but for some subject areas specialized databases may offer the best citation coverage, and for yet other areas Google Scholar may be an indispensable tool. Given the changes in scholarly communication induced by technology and the corresponding proliferation of resources that offer citation tracking, it is imperative that a rigorous study be undertaken to determine which sources perform best for particular subjects or time periods.


References

Abt, H. A. (2004). A comparison of the citation counts in the Science Citation Index and the NASA Astrophysics Data System. In A. Heck (Ed.), Organizations and Strategies in Astronomy (Vol. 6). Dordrecht: Kluwer Academic Publishers.

Borgman, C. L., & Furner, J. (2002). Scholarly communication and bibliometrics. Annual Review of Information Science and Technology, 36, 3-72.

Bradford, S. C. (1953). Documentation (2nd ed.). London: C. Lockwood.

Debackere, K., & Glanzel, W. (2004). Using a bibliometric approach to support research policy making: The case of the Flemish BOF-key. Scientometrics, 59(2), 253-276. <doi:10.1023/B:SCIE.0000018532.70146.02>.

Deis, L. F., & Goodman, D. (2005). Web of Science (2004 version) and Scopus. The Charleston Advisor, 6(3), 5-21.

Eysenbach, G. (2000). The impact of preprint servers and electronic publishing on biomedical research. Current Opinion in Immunology, 12(5), 499-503. <doi:10.1016/S0952-7915(00)00127-8>.

Fallows, D. (2005). Search engine users. Washington, D.C.: Pew Internet & American Life Project.

Garfield, E. (1972). Citation Analysis as a Tool in Journal Evaluation. Science, 178(4060), 471-479.

Garson, L. R. (2004). Communicating original research in chemistry and related sciences. Accounts of Chemical Research, 37(3), 141-148. <doi:10.1021/ar0300017>.

Jacso, P. (2004a, December). Google Scholar. Retrieved March 15, 2005.

Jacso, P. (2004b, September). Scopus. Retrieved March 15, 2005.

Kling, R., & McKim, G. (1999). Scholarly communication and the continuum of electronic publishing. Journal of the American Society for Information Science, 50(10), 890-906.

LaGuardia, C. (2005). Scopus vs. Web of Science. Library Journal, 130(1), 40, 42.

Presley, R. L., & Caraway, B. L. (1999). An Interview with Eugene Garfield. Serials Review, 25(3), 67-80.

Seglen, P. O. (1997). Why the impact factor of journals should not be used for evaluating research. BMJ, 314(7079), 498-502.

Vincent, A., & Ross, D. (2000). On Evaluation Of Faculty Research Impact Of Citation Analysis. Journal of Applied Business Research, 16(2), 1-15.

Walter, G., Bloch, S., Hunt, G., & Fisher, K. (2003). Counting on citations: a flawed way to measure quality. Medical Journal of Australia, 178(6), 280-281.

(The spelling of Peter Jacso's name was corrected in all references 9/16/05)

Copyright © 2005 Kathleen Bauer and Nisa Bakkalbasi
