Thing 16: Research Impact (Bibliometrics and Altmetrics)


Photo credit: AJC1, CC BY-SA 2.0

Bibliometrics is a well-established approach for studying one type of research output: the academic publication, and especially, the journal article.  Most bibliometric work is quantitative in nature.

Measures of output and indicators of impact…

Bibliometrics provide measures of academic output and indicators of academic impact.  Output describes the volume of publications and can be linked to productivity.  Impact considers how a publication influences and affects the research community.  When used knowledgeably and appropriately in combination with peer review and human judgement, bibliometrics can contribute to the overall assessment of research quality.

Use multiple metrics…

If you want to use bibliometrics, always draw on more than one metric.  For a start, you’ll need different metrics for measuring output and impact.  But, even if you are just looking at impact, be aware that there are at least four types of impact and that each is measured differently:

1)  Impact at point of publication

Point of publication impact is about how a research output is accepted and embraced initially by the research community; for example, acceptance of a publication into a high-impact journal suggests that the community believes the research to be of high value.  Point of publication impact metrics tend to focus on the journal in which an article is published rather than on the article itself.

2)  Impact post publication

Impact post publication considers the influence that a research output has after it has been shared.  In bibliometric terms, post publication impact equates to citation impact. (With altmetrics, post publication impact is broader—see below.)  Be aware that the citation cycle normally takes at least two years; therefore, if you are using a shorter window than this to evaluate a publication, you may have to make do with looking at point of publication impact instead.

3)  Impact from enabling knowledge transfer

Every academic publication you produce is actually a link in a chain of publications: almost certainly, you will have cited other publications, providing the backward links in the chain, and with luck, other publications will cite yours, creating the forward links in the chain.  By being an important or key link in the publication chain—for example, by publishing an article that connects two disparate research areas—you enable the transfer of knowledge.  Acting as a ‘knowledge bridge’ in this way is a vital but often overlooked form of academic impact.

4) Impact through collaboration

Impact through collaboration can be seen in the networks you inadvertently create when you publish research outputs with others.  These networks may describe links between authors or institutions.  Your collaboration networks show how you connect with others in the research community and allow you to draw inferences about your collaboration impact.
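If you want a feel for your own collaboration network, a little scripting goes a long way.  The sketch below is a minimal illustration rather than a formal analysis: it builds a co-authorship graph with the networkx Python library from an entirely made-up publication list and then prints two standard centrality measures that hint at who connects otherwise separate groups.

```python
# A minimal sketch of a co-authorship network; the publication list is invented.
import itertools

import networkx as nx

publications = [
    {"title": "Paper A", "authors": ["Khan", "Jones", "Li"]},
    {"title": "Paper B", "authors": ["Khan", "Osei"]},
    {"title": "Paper C", "authors": ["Jones", "Li"]},
]

graph = nx.Graph()
for pub in publications:
    # Every pair of co-authors on a paper becomes (or strengthens) an edge.
    for a, b in itertools.combinations(pub["authors"], 2):
        if graph.has_edge(a, b):
            graph[a][b]["weight"] += 1
        else:
            graph.add_edge(a, b, weight=1)

# Degree centrality counts direct collaborators; betweenness centrality
# highlights authors who sit on the paths between otherwise separate clusters.
print(nx.degree_centrality(graph))
print(nx.betweenness_centrality(graph))
```

Betweenness centrality, in particular, echoes the ‘knowledge bridge’ idea above: an author who sits on many of the shortest paths between others is often the one linking otherwise disconnected parts of the network.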

If you want to compare, normalise…

Aside from using different metrics to study output and the four different types of impact, there is another important caveat you must consider when using bibliometrics: always compare like with like.  In practice, we often seek to compare: which of these publications has the higher citation impact?  Is this journal better than that one?  Is my research group having greater impact than so-and-so’s group?

To make comparisons, you need normalised bibliometric indicators.  This means that you can’t make direct comparisons using some of the more familiar—but unnormalised—bibliometrics such as number of citations, h-indices, or Journal Impact Factors.

Citation practices vary from field to field, older papers have had more time to attract citations than newer papers, and some document types, for example reviews, are cited a lot more than other types of academic work.  For comparison purposes, it doesn’t matter that you have 50 citations and a colleague has only 30—you don’t necessarily have the higher citation impact.  The only way to compare fairly is to take into account differences in subject area, publication year, and document type.  Normalised bibliometric indicators do this; absolute counts such as the h-index and averages such as the Journal Impact Factor do not.
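To see concretely why absolute counts can mislead, here is a toy sketch in Python.  The h-index function follows the standard definition; the ‘expected citations’ figures are invented stand-ins for the field, year, and document-type baselines that real normalised indicators (such as field-weighted citation impact) are built on.

```python
# Toy illustration only: all citation counts and baselines are invented.

def h_index(citations):
    """Largest h such that h of the papers each have at least h citations."""
    counts = sorted(citations, reverse=True)
    return sum(1 for rank, c in enumerate(counts, start=1) if c >= rank)

def normalised_impact(actual_citations, expected_citations):
    """Actual citations divided by the average expected for comparable papers
    (same subject area, publication year, and document type)."""
    return actual_citations / expected_citations

print(h_index([50, 30, 12, 4, 1]))  # 4: an absolute count, blind to field or age

# 50 citations in a heavily cited field can signal less relative impact
# than 30 citations in a sparsely cited one.
print(normalised_impact(50, expected_citations=60))  # about 0.83: below the baseline
print(normalised_impact(30, expected_citations=20))  # 1.5: well above the baseline
```

In other words, the researcher with 50 citations sits below the invented baseline for their field while the researcher with 30 sits well above theirs, which is exactly the comparison an unnormalised count cannot make.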

Bibliometrics has its detractors: see this Nature piece from 2015 by Reinhard Werner and this 2016 piece written from the viewpoint of some publishing staff.

Bibliometric tools…

Three different publication and citation ‘data universes’ can be used to produce bibliometrics: Web of Science, Scopus (both require institutional logins), and Google Scholar.  Each of these data worlds is made up of a different set of publications.  This means that the bibliometric results derived from one world will not be the same as results derived from the others.

Formal bibliometric studies always use either Web of Science or Scopus because these databases incorporate the professionally managed metadata that underlie the normalised bibliometric indicators needed for making fair comparisons.  And, as we saw earlier, the need to compare arises often in bibliometric work.

For more informal assessments, academics themselves often turn to Google Scholar.  It just so happens that Google Scholar normally gives the highest absolute citation counts of the three databases.  Because formal bibliometric studies do not use Google Scholar, be aware that your citation counts may appear lower in those studies than the figures you are used to seeing.  Don’t panic, though!  This apparent ‘drop’ in citations affects every researcher, not just you, and in any event is irrelevant when properly normalised indicators are used.

At Surrey, both Web of Science and Scopus can be accessed through the library website.  Additionally, we subscribe to a specialist bibliometric tool, SciVal, which derives its results from Scopus data.  Hands-on training in this tool is offered year round, and an introductory SciVal guide for researchers is available on the library website as a PDF.

For fun, see how the Olympics has shaped research through the eyes of bibliometrics.

Tasks

Choose two or three publications (yours or somebody else’s) that have received citations.  Look up these publications in Web of Science, Scopus, and Google Scholar.

  • Do the publications appear in all three data worlds?
  • How many citations does each publication receive according to Web of Science, Scopus, and Google Scholar? Are the counts different in each database?
  • Most likely, Google Scholar will give the highest citation count—did you find this to be the case with your example publications?

Download the SciVal for Researchers guide from the library website.  To access SciVal, follow the registration and login instructions provided in the guide.  Again, following the guide, use the default ‘University of Surrey’ example to explore some of the bibliometrics offered in SciVal.

Now, let’s take a look at a new kid on the block.

Alternative resources and alternative data (Altmetrics)

Altmetrics, or alternative metrics, study resources and data that owe their existence primarily to the online environment.  Alternative resources include material such as online research blogs, datasets, or software.  Such outputs are an increasingly important aspect of sharing research.  Online-specific data have only relatively recently become available and include data such as numbers of tweets, clicks, or downloads.  These data offer alternatives to citation data for the quantitative assessment of research.

A new but fast-moving field

Altmetrics, although still in its infancy, is attracting a lot of interest as a potentially useful way to measure research outputs.  Because altmetrics cover such a wide range of types of resources and data, they can be used to examine not just the academic impact of research but also its societal impact.

At present, perhaps the two biggest stumbling blocks for altmetrics are the lack of standardisation across online resources and data—and hence, in practice, little ability to normalise metrics—and the ease with which altmetric results can be gamed.  Nonetheless, even if altmetrics aren’t yet suitable for formal studies comparable to those we carry out using bibliometrics, altmetrics certainly have their uses for the individual academic; REF impact case studies provide a good example.

Altmetric tools

There are many tools online for the creation and capture of altmetrics.  ImpactStory and altmetric.com both aggregate altmetric information to allow you to study the academic and societal impact of research.  Research networks such as ResearchGate and Academia.edu are aimed primarily at helping researchers create academic impact through sharing their research with others in the research community.  And, of course, don’t forget social media channels that are not academic-centric, such as Twitter, Facebook, and LinkedIn.  Used in the right way, these channels can be highly effective ways to share your research, both with academic colleagues and wider society.
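If you want to pull altmetric figures into your own notes or scripts, Altmetric also offers a free public API keyed on an article’s DOI.  The sketch below is a hedged example only: the DOI is a placeholder, and the fields printed are ones commonly seen in responses, so check the documentation on altmetric.com for the authoritative details.

```python
# A minimal sketch against Altmetric's public endpoint; the DOI is a placeholder
# and the response fields may differ from those assumed here.
import requests

doi = "10.1234/example-doi"  # replace with the DOI of an article you care about
response = requests.get(f"https://api.altmetric.com/v1/doi/{doi}", timeout=10)

if response.status_code == 404:
    # Altmetric returns 404 when it has recorded no online attention for a DOI.
    print("No altmetric attention recorded for this DOI.")
else:
    response.raise_for_status()
    data = response.json()
    for key in ("score", "cited_by_tweeters_count", "cited_by_wikipedia_count"):
        print(key, data.get(key, "not reported"))
```

A 404 here is not a failure of the script: plenty of solid papers simply have not yet attracted the kind of online attention that altmetrics track.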


Tasks

  • Are any of these alternative resources especially important in your research field? If not, do you think they might be in future?
  • Visit altmetric.com to learn more about the company’s distinctive altmetric donut. A key feature of the donut is the ‘click for more details’ option, which allows you to retrieve specifics such as who said what about the article and where it is capturing interest.
  • Did you know that Scopus now includes the altmetric donut for many of the recent articles indexed in its database? This means that you can retrieve both citation and altmetric information for an article in one place.  Log into Scopus and have a look at the record for this publication.   Explore the information you see.  If you were an author on this paper, how might you use the altmetric information that the article has been tweeted often by members of the public and that it is cited in Wikipedia?
  • If you wish, open Chrome, Firefox, or Safari and follow the instructions to install the altmetric bookmarklet app.  Test the app by visiting the Leiden Manifesto for Research Metrics, a recent comment article published in Nature, and clicking ‘altmetric it!’.  If your installation has worked, you should see the altmetric donut appear in the top right-hand corner of the screen.  At the time of writing this blog, the Leiden Manifesto article has no citations in Scopus and does not even appear in Web of Science, seeming to suggest it has had little post publication impact.  How does the picture presented by the article’s altmetrics differ?

Week 8 blog post

This week you have had a lot to look at in the world of publishing and impact. What did you think? Are you published? Are those publications accruing impact? How do you feel about the new altmetrics: are they a fad or here to stay? How do you feel about impact in research as a general concept?


4 thoughts on “Thing 16: Research Impact (Bibliometrics and Altmetrics)”

  1. I cannot see the altmetric donut button on the “Research impact: Altmetrics make their mark” page, but I clicked on the ‘view all metrics’ button and I guess this has taken me to the correct page?
