Franck Vazquez, CEO of MDPI, Responds to “Publish AND perish” Post

The blog post on the commodification of scientific publishing, posted earlier in December, has received a lot of interest from many concerned academics. Some people responded to the blog (see under ‘replies’ at the end of the post), some contacted me by email, and several offered alternatives to the current system, which seems to invite strategic behavior and a focus on ‘production’ (number of papers published) and efficiency (short turn-around) rather than on quality supported by a critical review system. One of the responses came from Franck Vazquez, CEO of MDPI – the publisher of the journal Sustainability that is featured in the post. I am posting his elaborate response after the key messages from the blog. I do appreciate that he took the time to do this.

Note: the blog post in question can be found here: Update – Publish AND perish: how the commodification of scientific publishing is undermining both science and the public good

—————————————————————–

Key messages

“Everybody is writing, nobody is reading, everybody is writing for nobody.”

  • Academics spend hundreds of hours a year getting their work published in peer-reviewed journals, providing free labor to commercial publishing companies.
  • The pressure to ‘produce’ and grow is huge, both in academia and in the publishing industry; this undermines quality and the university’s ability to serve the public good and, indeed, public trust in science.
  • The open-access journal Sustainability publishes over 4,000 contributions in its current Volume 10, for most of which contributors have to pay 1,400 US dollars* to have their work published. Its publisher, MDPI, runs close to 200 journals along similar lines.
  • Sustainability has 561 associate editors, mostly from public universities, all working for the journal for free.
  • According to a recent article in New Scientist, academic publishing has the highest profit margin of any industry.
  • A transition in science is needed to restore quality, trust and a culture of co-learning, peer-to-peer feedback and dialogue, and to unlock the power of science in creating a more sustainable world.

* Sustainability just announced that the fee for having an article published in 2019 has been raised to 1700 US dollars…


Here you will find the response of MDPI’s CEO Franck Vazquez – without comment. Should you have a comment of your own, feel free to reply.

[MDPI’s response, posted as three images: MDPIResponse_1, MDPI_Response_2, MDPI_Response_3]

 

Update – Publish AND perish: how the commodification of scientific publishing is undermining both science and the public good

(since this post first appeared 10 days ago it has been updated a few times, which is why I am re-posting it)

Key messages

“Everybody is writing, nobody is reading, everybody is writing for nobody.”

  • Academics spend hundreds of hours a year getting their work published in peer-reviewed journals, providing free labor to commercial publishing companies.
  • The pressure to ‘produce’ and grow is huge, both in academia and in the publishing industry; this undermines quality and the university’s ability to serve the public good and, indeed, public trust in science.
  • The open-access journal Sustainability publishes over 4,000 contributions in its current Volume 10, for most of which contributors have to pay 1,400 US dollars* to have their work published. Its publisher, MDPI, runs close to 200 journals along similar lines.
  • Sustainability has 561 associate editors, mostly from public universities, all working for the journal for free.
  • According to a recent article in New Scientist, academic publishing has the highest profit margin of any industry.
  • A transition in science is needed to restore quality, trust and a culture of co-learning, peer-to-peer feedback and dialogue, and to unlock the power of science in creating a more sustainable world.

* Sustainability just announced that the fee for having an article published in 2019 has been raised to 1700 US dollars…


[Cartoon: publish or perish – the academic publishing ecology]

Let me apologise first, for this post has turned into a bit of a rant, but I had to get it off my chest. Here we go:

The open-access journal Sustainability (Impact Factor 2.025) just published Volume 10, Issue 11, which contains 508 papers. For these papers the authors – with some, often negotiated, exceptions – not only provided their labor for free (labor usually sponsored by the public money that covers their salaries) but will also have paid the publisher, MDPI, 1,400 Swiss francs (about 1,400 US dollars) per paper. I looked into this after being invited by the journal to edit a Special Issue a few weeks ago. Below I share what I found out.

+++++++++++

Dear Prof Wals,

We invite you to join us as Guest Editor for the open access journal Sustainability (ISSN 2071-1050), to establish a Special Issue. Our suggested topic is ‘Higher Education and Education for Sustainable Development’. You have been invited based on your strong publication record in this area, and we hope to work with you to establish a collection of papers that will be of interest to scholars in the field.

++++++++++++

I have published in Sustainability (Impact Factor: 2.025) before and am currently also involved in co-editing a Special Issue for the same publisher, MDPI, but for another one of their journals called Water (Impact Factor: 2.069), so my initial response was positive. The invitation seemed serious and the journal seems reputable. It was not one of those almost daily invitations from a bogus journal that usually starts with: “Greetings!! We read your paper on social learning and believe you could make an excellent contribution to our forthcoming issue in Preventative Cardiological Medicine” (usually a journal on a topic I know nothing about) and ends with something like: “I hope you have good days ahead”. No, this one was serious and caught my interest.

I responded by saying that I found the proposed topic a bit outdated – there is a lot available and being done in the area of Higher Education for Sustainable Development (in fact there is an entire journal on the subject that has been around for more than 20 years) – but that I would like to focus instead on the role of higher education in sustainability transitions. The assistant editor responded immediately that that would be fine and sent me the template to fill out. I drafted a text for a Call for Papers with input from two colleagues and asked her whether the text was fine. Instead of a reply I received a link to the Special Issue Announcement (which will be removed shortly by MDPI at our request).

“Wow, that went really fast,” I thought. Then, just days later, I received an invitation from another colleague working in more or less the same field:

“We write to invite submissions of papers to a Special Issue of the Sustainability Journal focusing on “Innovation, Higher Education and Sustainable Futures” which we are editing. We think that the work you are doing in this area would make an excellent contribution to this journal.”

I was very surprised: basically, our SI would be competing with that of my colleagues, which is on more or less the same topic! Why did the editors not check for overlap or connect us? I then decided to have a look at the journal’s Special Issue website and was shocked to find that Sustainability currently has about 200 (!) Special Issues planned for 2019 – have a look here….

Let’s think about this. Sustainability publishes 12 issues per volume and integrates these ‘special issues’ into them. Naively, based on the old days when publishers would actually print journals, I figured each issue would normally have about 10 articles. But then I started wondering: how can they cram all these Special Issue articles into the 12 issues of a volume? This became clear yesterday, when I received an advertisement from MDPI announcing the ‘release’ of Sustainability’s Volume 10, Issue 11, titled: Historic Rural Landscapes: Sustainable Planning Strategies and Action Criteria. The Italian Experience in the Global and European Context.

The table of contents was embedded in the email and I started scrolling down to read some of the titles. Then something odd seemed to be happening: there was no end to the list of papers; I kept on scrolling and scrolling… How many papers are in one issue, I wondered… well, 508! Feel free to check this here.

So I then checked Issue 10: 468 articles… Issue 9: 401 articles… and noted that with every new issue the number of published papers tends to go up. On average the journal has published just over 380 articles per issue this year, which will result in about 4,560 articles. For some of the editorial papers, and for some other papers, authors will get their open-access fee waived. Let us assume that about 10% of all papers have the 1,400-dollar fee waived, leaving roughly 4,100 paying papers. The total revenue for 2018 for this one MDPI journal would then be 1,400 × 4,100 = 5,740,000 US dollars.
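For those who want to check the arithmetic, here is the same back-of-envelope estimate as a small Python sketch; all the numbers are the rough assumptions from the paragraph above, not actual MDPI figures.

```python
# Back-of-envelope estimate of Sustainability's 2018 APC revenue.
# Every number below is a rough assumption from this post, not an MDPI figure.

articles_per_issue = 380   # rough average per issue observed in 2018
issues_per_volume = 12     # one volume per year, twelve issues
apc_usd = 1400             # article processing charge per paper, in US dollars
waiver_rate = 0.10         # assume ~10% of papers get the fee waived

total_articles = articles_per_issue * issues_per_volume        # ~4,560
paying_articles = round(total_articles * (1 - waiver_rate))    # ~4,100
estimated_revenue = paying_articles * apc_usd

print(f"Estimated 2018 APC revenue: ~{estimated_revenue:,} USD")
# ~5,745,600 USD; rounding to 4,100 paying papers gives the
# 5,740,000 USD mentioned above.
```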

Now, figure this: MDPI publishes more than two hundred journals, ranging from the Journal of Acoustics to the Journal of World Electric Vehicles, all using more or less the same business model. Here is a list of MDPI’s journals. And let us not forget the other big publishers – Taylor & Francis/Routledge, Elsevier, Springer, etc. – who use the same or a similar model.

Now, to be fair, I must say that scrolling down the ToC of Vol. 10 (11) I saw many intriguing titles and some very inspiring, high-quality authors: there is some good work out there, and indeed it is open access – that is what the 1,400 US dollars pays for, after all… But all the journal needs to do is invite lots of Special Issue editors (when telling this story to colleagues at an international conference, it seemed that everybody there had recently been asked to do an SI…), have a good manuscript management system with a big reviewer database, have a good website where papers can be easily downloaded, and have mechanisms to make sure the impact factor of the journal goes up (that’s another blog post…). They no longer need to print anything, nor do they need to do any graphic design work, as nowadays the people submitting have to do that themselves in accordance with the journal’s instructions.

The job of the assistant editor is really that of an acquisitions editor: soliciting Special Issues and making academics responsible for gathering content, reviewing content, editing content and citing content, all for free! I would not be surprised if journals and editors receive bonuses based on growth in revenue. The whole industry is driven by targets, growth and expansion. This puts a lot of pressure on everybody involved, which undermines scientific quality. See below an example of this: “An Aberdeen University researcher resigned from a prestigious international journal after claiming she was put under pressure to do “mediocre” work.” Aberdeen researcher washes her hands of overbearing publisher (excerpt below)

[Screenshot: excerpt from the news article about the Aberdeen researcher]

To return to the journal Sustainability… since the first version of this post appeared there has been a lot of activity on Twitter, with lots of comments, including the one below.

[Screenshot of one of the Twitter comments on this post]

Sadly, our ‘business’ of academia has been contaminated by the same modus operandi: increasing the production of papers and the number of citations, and growing one’s ‘h-factor’ (see an older post about this here), is driving much of what we do today. Quantity over quality. Who has time to review, to read with intent and concentration, to organise a seminar or a debate? All activities for which no brownie points can be earned, but which are essential for scientific quality.

Academics trying to stay on top of their game, or trying to climb the tenure-track ladder, are frantically trying to get their work published – all working for free for the private sector, paid for, often, by public money, and then having to pay the journal to make the publicly funded research accessible ‘for free’ to the public. This leads to absurd performances: I know of colleagues, some with whom I have co-authored papers, who average one scientific peer-reviewed article per week. Per week!

As suggested already, all this also has implications for the quality of the work: because people are rewarded only for their production (published papers) and not for their contributions to assuring quality (e.g. reviewing and critical reading), the quality of the review process declines rapidly, as both the people working in the publishing industry and those in the academic industry need to hit their targets and show growth to remain competitive and climb the rankings.

There is a huge, unsettling paradox in contemporary academia: everybody is writing while nobody seems to be really reading, which means that everybody is writing for nobody. This also makes me wonder: what does it mean to be cited? In the meantime, all the time we spend behind a screen, making letters flow from our brains through our hands onto the page, is sponsored mostly by public money, which we then move to the publishing industry, where top management and shareholders are all anticipating the next quarterly earnings report, good salaries and bonuses, and good returns on investments.

Here is a trivia question for you: what is the most profitable business in the world? You might think oil, or maybe banking. You would be wrong. The answer is academic publishing. Its profit margins are vast, reportedly in the region of 40 per cent. (Source: New Scientist)


Needless to say, this is a system that will eventually run itself into the ground. Science for impact factors in journals will need to transition towards science for impact in society. This will require that the world of higher education and academia becomes more autonomous and independent from the globalising neo-liberal forces that undermine academic quality and integrity. Fortunately, there are counter-movements in science seeking to disrupt this tragically resilient system, such as the science-in-transition movement, the global alliance for community-engaged research and the living knowledge network (send me more examples if you know of any and I will add them here). Furthermore, mainstream universities are beginning to recognise the problem and are starting to emphasise the importance of healthy working environments, societal impact, citizen science and knowledge co-creation. More on this in another blog post.

p.s. you may also find Beall’s list of predatory journals and publishers an interesting resource to help you check whether a journal or publisher you are considering is legitimate (also read the cautionary note stating that this is a rather dynamic and fluid world in which a list like this needs constant updating).

Publish or perish: Improving your H-factor made easy through PleaseCiteMe.com

The above post has been re-blogged now that the San Francisco Declaration has been created and endorsed by Science.

The San Francisco Declaration seeks to challenge some of the strategic games being played in the world of academia to increase productivity, get tenure and climb the rankings. These games could easily lead to both a decline in scientific quality and an erosion of trust in science altogether. Below is an excerpt from the editorial that can be found in Science (see also: http://www.aaas.org/news/releases/2013/0516_impact-factors.shtml#fb). The San Francisco Declaration can be found here: The San Francisco Declaration.

Science Endorses New Limits on Journal Impact Factors

A measure developed to assess the quality of scientific journals has distorted how research is evaluated, and should not be used to judge an individual’s work, Science Editor-in-Chief Bruce Alberts writes in the 17 May issue of the journal.

The editorial coincides with the release of the San Francisco Declaration of Research Assessment (DORA), which grew out of a gathering of scientists at the December 2012 meeting of the American Society for Cell Biology (ASCB). More than 150 scientists and 75 scientific organizations including Science’s publisher AAAS have endorsed DORA, which recommends specific changes to the way scientific journal rankings are used in hiring scientists, funding research and publishing papers.

One of the most popular ranking measures, called Journal Impact Factor or JIF, ranks research journals based on the average number of times its papers are cited by other papers. (The higher the JIF score, the more often its research papers are cited by others.) JIF was devised to rank journals, but is now often used to evaluate an individual’s research, by looking at whether she or he has published in high-score journals.

This misuse of the JIF score encourages far too much “me-too science,” Alberts writes. “Any evaluation system in which the mere number of a researcher’s publications increases his or her score creates a strong disincentive to pursue risky and potentially groundbreaking work, because it takes years to create a new approach in a new experimental context, during which no publications should be expected.”

Alberts notes that an unhealthy obsession with journal ranking scores may also make journals reluctant to publish papers in fields that are less cited, such as the social sciences, compared to papers from highly-cited fields such as biomedicine.

The DORA guidelines offer 18 specific recommendations for discontinuing the use of JIF in scientists’ hiring, tenure, and promotion, along with ways to assess research on its own merits apart from its place of publication.

Transformative learning

Publish or perish: Improving your H-factor made easy through PleaseCiteMe.com

“What’s your h-factor?” is a question that is increasingly asked at gatherings of scientists or during interviews for academic positions. Scientific careers depend on h-factors these days. What am I talking about?

The h-index is an index that attempts to measure both the productivity and impact of the published work of a scientist or scholar. The index is based on the set of the scientist’s most cited papers and the number of citations that they have received in other publications. The index can also be applied to the productivity and impact of a group of scientists, such as a department or university or country.

A scientist has index h if h of his/her Np papers have at least h citations each, and the other (Np − h) papers have no more than h citations…


The Black Market for Facebook “Likes,” and What It Means for Citations and Alt-Metrics


This is a follow-up to my earlier ‘publish or perish’ post. It is inspired by Phil Davis, who posted an interesting article in the Scholarly Kitchen today which only amplifies some of the concerns expressed in my initial post. Here are the opening lines of his article, which can be found in its complete form at: http://scholarlykitchen.sspnet.org/2012/05/18/the-black-market-for-facebook-likes/

“There is an online market for so many intangible goods these days that it should come as no surprise that there is a market for Facebook “Likes” — the little thumbs-up rating that accompanies so many products and services we see on a daily basis.

For $75, a marketing company will sell you 1,000 Facebook “Likes,” according to NPR’s Planet Money. Only the marketing company does not supply the “likes” but works as a broker between real individuals who are willing to sell their online preferences to your product for very small sums of money — ten cents a “like” — and those who wish to artificially inflate their prestige.

Ten cents may not seem like a lot of money, but there is a huge workforce of individuals willing to be employed to undertake low-skilled, repetitive online work for pennies a task, as evidenced by mature markets like Amazon’s Mechanical Turk. Global outsourcing has never been easier.”

And later on Phil writes: “The artificial trust market is not new and is found in environments where online trust is important, such as purchasing an antique love seat from a complete stranger on eBay, finding a reputable bed and breakfast in rural Ireland, selecting a new e-book from Amazon, or choosing an app from the Apple Store. When in doubt, our tendency is to turn to the wisdom of the crowds because we believe that these ratings are accurate evaluations generated by honest individuals and based on real experiences.

Trust — or at least consensus — works the same way in scientific publication through the accumulation of citations, only the barriers to participate in this market are much, much higher. To cast your votes, you need to publish a paper that is indexed by Thomson Reuters’ Web of Science (or alternatively, Elsevier’s Scopus). Like Facebook, Thomson Reuters does not take kindly with citation manipulation and will delist a journal when it exhibits forms of citation manipulation such as systemic self-citation or, more recently, through the formation of citation cartels.”

He then refers to my earlier post, in which I suggest a website like PleaseCiteMe.com where ‘academics’ can ‘purchase’ citations to increase their h-factor… see my original post below.

“What’s your h-factor?” is a question that is increasingly asked at gatherings of scientists or during interviews for academic positions. Scientific careers depend on h-factors these days. What am I talking about?

The h-index is an index that attempts to measure both the productivity and impact of the published work of a scientist or scholar. The index is based on the set of the scientist’s most cited papers and the number of citations that they have received in other publications. The index can also be applied to the productivity and impact of a group of scientists, such as a department or university or country.

A scientist has index h if h of his/her Np papers have at least h citations each, and the other (Np − h) papers have no more than h citations each.

In other words, a scholar with an index of h has published h papers each of which has been cited in other papers at least h times. Thus, the h-index reflects both the number of publications and the number of citations per publication. (source: wikipedia)
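To make that definition concrete, here is a minimal sketch in Python – with entirely made-up citation counts – of how an h-index can be computed from a list of per-paper citation counts:

```python
def h_index(citations):
    """Return the h-index for a list of per-paper citation counts."""
    ranked = sorted(citations, reverse=True)   # most-cited paper first
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        if cites >= rank:
            h = rank        # at least `rank` papers have >= `rank` citations each
        else:
            break
    return h

# Hypothetical example: eight papers with these citation counts
print(h_index([25, 19, 17, 16, 14, 9, 3, 1]))  # -> 6
```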

An important determinant of one’s h-factor is what counts as a ‘cite-able publication’. In the Web of Science h-factor, only scientific articles published in journals that have ISI recognition (determined by Thomson Reuters) are considered citeable (so not articles in other journals, chapters in books, etc.). In the Scopus h-factor a larger pool of journals is included, while in Google Citations and in ‘Publish or Perish’ (www.harzing.com/pop.htm) the h-factor is likely to be higher, as these also consider articles in a wide range of journals, book chapters and reports to be cite-able. In my university, Wageningen University, it’s not your x-factor that matters but your h-factor. Our librarians have become information specialists with expertise in bibliometrics and scientometrics. Such expertise is pivotal in helping our academics, science groups and, indeed, our university (76th on the Times Higher Education Index…) climb the rankings. Biblio-what?

Bibliometrics and scientometrics are two closely related approaches to measuring scientific publications and science in general, respectively. In practice, much of the work that falls under this header involves various types of citation analysis, which looks at how scholars cite one another in publications. (source: Eric Meijers via microsites.oii.ox.ac.uk/tidsr/kb/48/what-bibliometrics-and-scientometrics)

Below you will see a screenshot of my personal bibliometrics (click on the image to enlarge).

As you can see, my overall h-factor is 16. Impressive? Hardly. But how can I raise it? Let me move to the crucial information Google Citations provides (if you want to check your own bibliometric data you need to make a profile on Google Citations!). You will note below that “Learning in a changing world and changing in a learning world: reflexively fumbling towards sustainability” is the crucial paper at this moment (click on the image to enlarge).

If I want to increase my h-factor to 17, then I need to get two of my papers cited 17 times or more. I could try to promote the paper that currently occupies place 16 (“reflexively fumbling”). I would then also need to find another paper that is still somewhat attractive to cite – perhaps the 2006 paper with Justin Dillon on “The danger of blurring methods…”.
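As an aside, this threshold logic is easy to write down. Below is a minimal Python sketch, using an entirely hypothetical citation profile (not my actual Google Citations data), of how one could work out how many extra citations are needed to move from h to h + 1:

```python
def citations_needed_for_next_h(citations):
    """Return (current h, target h, extra citations needed) for a list of
    per-paper citation counts (a rough sketch)."""
    ranked = sorted(citations, reverse=True)
    h = sum(1 for rank, c in enumerate(ranked, start=1) if c >= rank)
    target = h + 1
    # To reach h + 1, the top `target` papers must each have >= `target` citations.
    shortfall = sum(max(0, target - c) for c in ranked[:target])
    return h, target, shortfall

# Hypothetical profile: the papers ranked 16 and 17 sit just below 17 citations
counts = [40, 38, 35, 33, 30, 28, 27, 25, 24, 23,
          22, 21, 20, 19, 18, 16, 15, 9, 4]
h, target, extra = citations_needed_for_next_h(counts)
print(h, target, extra)  # -> 16 17 3 (one paper needs 1 more citation, another needs 2)
```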

So how can I do that? There are many ways, of course – I can suggest the paper to authors of papers I review for journals… or I can ask my colleagues to cite those papers… or I can make free downloads of those papers available via my blog… but there might be a better way – one that could be the beginning of the end of this metrics-based system.

Introducing: PleaseCiteMe.com

Why not develop PleaseCiteMe.com – a web-based system where people can trade citations? Scholars can post papers of their own that they need to have cited in order to increase their h-factor. Of course there is a price to pay: the scholar will have to cite the work of the scholar at the other end who agreed to cite theirs. If this is not possible, a monetary value can be attached to a citation. Possibly citations in a Web of Science journal would cost more than citations in a Scopus journal or in a book chapter recognized by Google Citations. Of course we would need a few clever web designers, economists and Mark Zuckerberg types to create all this, but by 2014 we could probably take it to Wall Street, as by then there will be a huge demand in the world of science for this service.

Publish or perish…. or, perhaps, publish and perish…

So what are we to do? In the end it is not about science for impact factors or h-factors but science for impact in society. Some of us are part of a system run increasingly by bibliometric information. Playing the game may be inevitable until the game ends – when people begin to realize that people don’t read but only cite… or when people only write without having the time to read… or when strategic thinking replaces intelligent thinking, curiosity and the passion for contributing something meaningful to people and planet.

Fortunately there are still academic environments driven by societal relevance, planetary responsibility and curiosity. And, indeed, there are calls for bridging science and society and for other forms of reviewing quality in research (see for instance Funtowicz and Ravetz’s idea of ‘extended peer review’). More on this in another post!
