Publish or perish: Improving your H-factor made easy through PleaseCiteMe.com

“What’s your h-factor?” is a question that is increasingly asked at gatherings of scientists or during interviews for academic positions. Scientific careers depend on h-factors these days. What am I talking about?

The h-index is an index that attempts to measure both the productivity and impact of the published work of a scientist or scholar. The index is based on the set of the scientist’s most cited papers and the number of citations that they have received in other publications. The index can also be applied to the productivity and impact of a group of scientists, such as a department or university or country.

A scientist has index h if h of his/her Np papers have at least h citations each, and the other (Np − h) papers have no more than h citations each.

In other words, a scholar with an index of h has published h papers each of which has been cited in other papers at least h times. Thus, the h-index reflects both the number of publications and the number of citations per publication. (source: Wikipedia)
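For readers who prefer the definition in code, here is a minimal sketch (in Python, with made-up citation counts purely for illustration) of how an h-index can be computed from a list of per-paper citation counts:

```python
def h_index(citations):
    """Return the h-index for a list of per-paper citation counts."""
    ranked = sorted(citations, reverse=True)  # most-cited paper first
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        # h is the largest rank at which the paper at that rank
        # still has at least 'rank' citations.
        if cites >= rank:
            h = rank
        else:
            break
    return h

# Made-up example: six papers with these citation counts give h = 4.
print(h_index([25, 19, 16, 16, 3, 1]))  # -> 4
```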

An important determinant of the height of one’s h-factor is what counts as a ‘cite-able publication’. In the Web of Science h-factor, only scientific articles published in journals with ISI recognition (determined by Thomson Reuters) are considered cite-able (so not articles in other journals, chapters in books, etc.). The Scopus h-factor includes a larger pool of journals, while in Google Citations and in ‘Publish or Perish’ (www.harzing.com/pop.htm) the h-factor is likely to be higher, as these also consider articles in a wide range of journals, book chapters and reports cite-able. In my university, Wageningen University, it’s not your x-factor that matters but your h-factor. Our librarians have become information specialists with expertise in bibliometrics and scientometrics. Such expertise is pivotal in helping our academics, science groups and, indeed, our university (76th on the Times Higher Education Index…) climb the rankings. Biblio-what?

Bibliometrics and scientometrics are two closely related approaches to measuring scientific publications and science in general, respectively. In practice, much of the work that falls under this heading involves various types of citation analysis, which looks at how scholars cite one another in publications (source: Eric Meijers via www.microsites.oii.ox.ac.uk/tidsr/kb/48/what-bibliometrics-and-scientometrics).

Below is a screenshot of my personal bibliometrics (click on the image to enlarge).

As you can see, my overall h-factor is 16. Impressive? Hardly. But how can I raise it? Let me move to the crucial information Google Citations provides (if you want to check your own bibliometric data you need to create a profile on Google Citations!). You will note below that “Learning in a changing world and changing in a learning world: reflexively fumbling towards sustainability” is the crucial paper at this moment (click on the image to enlarge).

If I want to increase my h-factor to 17, then I need to get two of my papers cited 17 times or more. I could try to promote the paper that currently occupies place 16 (“reflexively fumbling”). I would then also need to find another paper that is still somewhat attractive to cite – perhaps the 2006 paper with Justin Dillon on “The danger of blurring methods…”.
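To make that arithmetic concrete, here is a small sketch with entirely hypothetical citation counts (the real numbers sit in the screenshot above): to reach a target h, each of the top ‘target’ papers needs at least that many citations, and the per-paper deficits show where the extra citations would have to go.

```python
def deficits_for_target(citations, target_h):
    """Extra citations needed, per paper, for each of the top target_h papers
    to reach target_h citations (i.e. for the h-index to hit target_h)."""
    ranked = sorted(citations, reverse=True)
    if len(ranked) < target_h:
        return None  # not enough papers published yet
    return [max(0, target_h - c) for c in ranked[:target_h]]

# Hypothetical counts: fifteen well-cited papers plus two papers near the
# cut-off (say, "reflexively fumbling" and the 2006 Dillon paper).
counts = [100] * 15 + [16, 12]
print(deficits_for_target(counts, 17))  # -> fifteen zeros, then [1, 5]
```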

So how can I do that? There are many ways of course – I can suggest the paper to authors of papers I review for journals… or I can ask my colleagues to cite those papers… or I can make free downloads of those papers available via my blog… but there might be a better way – one that could be the beginning of the end of this metrics-based system.

Introducing: PleaseCiteMe.com

Why not develop PleaseCiteMe.com – a web-based system where people can trade citations? Scholars can post papers of their own that they need to have cited in order to increase their h-factor. Of course there is a price to pay: the scholar will have to cite the work of the scholar at the other end who agreed to cite theirs. If this is not possible, a monetary value can be attached to a citation. Possibly citations in a Web of Science journal might cost more than citations in a Scopus journal or in a book chapter recognized by Google Citations. Of course we need a few clever web designers, economists and Mark Zuckerberg types to create all this, but then by 2014 we could probably take this to Wall Street, as by then there will be huge demand in the world of science for this service.

Publish or perish…. or, perhaps, publish and perish…

So what are we to do? In the end it is not about science for impact factors or h-factors but science for impact in society. Some of us are part of a system run increasingly on bibliometric information. Playing the game may be inevitable until the game ends, when people begin to realize that people don’t read but only cite… or when people only write without having the time to read… or when strategic thinking replaces intelligent thinking, curiosity and passion for contributing something meaningful to people and planet.

Fortunately there are still academic environments driven by societal relevance, planetary responsibility and curiosity. And, indeed, there are calls for bridging science and society and for other forms of reviewing quality in research (see for instance Funtowicz and Ravetz’s idea of ‘extended peer review’). More on this in another post!

7 thoughts on “Publish or perish: Improving your H-factor made easy through PleaseCiteMe.com”

  1. Thanks for sharing these thoughts, Arjen (and I wouldn’t agree with the “hardly impressive” statement – at least for the “new kid on the blog”, as Bill Scott put it…). Reading this, I just wonder why hbay doesn’t sound as implausible as it should?!

    But you never even mentioned the rules of the game that are already played far too often, and sadly also in “our” area of research: no. 1: cite yourself as often as possible, regardless of whether that adds anything to the text; no. 2: when you review a text, make sure the author cites you extensively if he wants to pass the review.

    I am keen to see new and other forms of quality control emerge, but simply can’t imagine a change coming any time soon. We experimented with an open, non-blind review to encourage discussion in a journal for sustainability communication years ago, but I have to say it didn’t work as well as our enthusiasm suggested it would (and I mostly blame us as editors for that).

    So I guess that leaves me to trust in the “academic environments driven by societal relevance, planetary responsibility and curiosity” you are writing about – fortunately an environment I am lucky enough to work in…

    Cheers, Matthias

    • Thanks Matthias – when you read the end of the post you will note that I have already identified some of the strategic responses… and there are a bunch more…
