This page gives an overview of the key sources of data on scholarly citations and other similar indicators of the use of academic materials.
When selecting which source(s) to utilise, bear in mind that results may vary because of differences in the extent and nature of each source's coverage.
Web of Science
In the 1960s, an American scientist called Eugene Garfield had the idea of assembling a team of workers to transcribe the reference lists in scientific journal articles, and then group together all later references to the same earlier article.
This enabled him to publish an annual printed reference work. In it, you could look up any article in which you were interested, and find a list of all the later articles which had cited it.
His original idea was that this would primarily be a literature-searching tool, on the basis that a later article citing an earlier one was likely to have at least some subject connection with it.
However, people soon realised that seeing how many later articles had cited an earlier article was one means of assessing the academic influence of the earlier article.
Garfield started by publishing a work under the title Science Citation Index, and later extended the work by creating a Social Sciences Citation Index and an Arts & Humanities Citation Index.
In the 1990s all three works became a single electronic resource, under the slightly misleading name Web of Science (part of a wider resource called Web of Knowledge).
Scopus
In the first years of the twenty-first century, the publisher Elsevier decided to do much the same thing and create a very similar electronic resource of its own, under the name Scopus.
Scopus was founded in 2004, and the period of publications which it covers is continually extending further back in time, currently going back to about 1996.
Web of Science was founded in 1963, and, because of this much earlier starting date, it attaches less importance to extending its coverage back before that date.
So, although it's impossible to give an exact date, one can roughly say that Web of Science coverage extends at least thirty years further back than Scopus. Its depth of coverage is therefore greater.
Scopus, on the other hand, has broader coverage: it contains many more titles, and consequently many more titles that are not covered by Web of Science.
A lower proportion of Scopus content than Web of Science content is published in the UK or US. This broadly means that there's rather more non-English-language material in Scopus than Web of Science.
A lower proportion of Scopus content than Web of Science content is categorised as scholarly. This means that there's rather more material in Scopus which might be categorised as non-scholarly, such as trade journals.
Both databases contain details only of citations made in journal articles. In other words, references made in books (for example, the footnotes or endnotes) aren't recorded.
Google Scholar
In November 2004 Google released Google Scholar, allowing users to freely search the full text and references of millions of academic works including journal articles, technical reports, preprints, theses, books, and other documents deemed to be scholarly in nature.
The service indexes a wider range of academic disciplines than either Scopus or Web of Science, and a wider range of sources, including academic publishers, professional societies, online repositories, universities, and other academic websites. As a consequence, citation counts in Google Scholar are often much higher than those in the other citation indexing services.
A paper is ranked in part by how often and how recently it has been cited in other scholarly works. Search results for individual papers include a total citation count with a link to view a list of all other citing papers.
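Google Scholar's actual ranking formula is proprietary, but the interplay of "how often" and "how recently" a paper is cited can be sketched as a citation count with an exponential age discount. The `Paper` class, `toy_rank_score` function, and five-year half-life below are all invented for illustration; they are not Google Scholar's method.

```python
from dataclasses import dataclass


@dataclass
class Paper:
    title: str
    citations: int
    years_old: int


def toy_rank_score(paper: Paper, half_life: float = 5.0) -> float:
    # Citation count discounted by age: every `half_life` years,
    # the weight of the citations is halved, so an old paper needs
    # many more citations to outrank a recent one.
    decay = 0.5 ** (paper.years_old / half_life)
    return paper.citations * decay
```

Under this toy scheme, a two-year-old paper with 100 citations outscores a twenty-year-old paper with the same count, reflecting the recency weighting described above.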
Altmetrics
The term 'altmetrics' describes emerging technologies and practices enabling the measurement of web-based activity related to research outputs. These new metrics offer alternative ways to measure research impact beyond the current established methods such as peer review and citation counts.
The field is still in the very early stages of development but there are already a growing number of altmetrics services; three of the highest profile services are Impact Story, Altmetric, and Plum Analytics. Each service records and aggregates some or all of the following types of activity:
- Scholarly usage data: Web-page views, PDF downloads
- Scholarly reference: Bookmarking, shares and recommendations from CiteULike, Zotero, Mendeley
- Mass media mentions: NY Times, BBC, The Washington Post
- Social media mentions: Twitter, Facebook, Delicious
- Data and code usage: Dryad, GitHub
- Component mentions: SlideShare, Figshare
Source: Groth, P. and Taylor, M. (2013) Helping scholars tell their stories using altmetrics.
Altmetrics are becoming increasingly accepted as a useful way of measuring research impact. Several major journal publishers, including PLoS, BioMed Central, and Nature Publishing Group, now display article-level altmetrics data on their websites, and a growing number of researchers include altmetrics data in their CVs, grant applications, and promotion applications.
Although altmetrics offer a wealth of information about how research is being received, care should be taken when interpreting the data.