
The cost of scientific publishing: update and call for action

- May 16, 2014 in Call to Action

Opening knowledge is great. Sharing knowledge is vital. In the past, publishers were the sole mediators for the dissemination of knowledge in printed form. In our digital age, sharing has become easier thanks to the internet. Yet, although all other areas of society have embraced the internet as THE sharing medium, scientific publishing has lagged far behind. Nowhere are the advantages of sharing knowledge so obvious as in science: but instead of facilitating sharing, scientific publishers desperately try to protect their grip on access to knowledge. Open access publication systems are a real threat to their lucrative business. That is why finding arguments against open access has become of vital importance to them. Similarly, countering these arguments is of vital importance for all of us who need open access.

Where are we now…

Tim Gowers’ recently published blog post on his quest for information on subscription prices for Elsevier journals, using direct approaches (calling and writing to publishers and libraries) and indirect approaches (demanding information under Freedom of Information legislation), has caused a major stir. I recommend reading Michelle Brook’s very good overview in her recent blog. Some really astonishing facts have already come to light. It has, for instance, become apparent that different institutions pay very different prices for almost the ‘same’ deals. Also, universities that did not want to give out the information often argued that commercial interests had to be protected (full details for all cases are in Tim Gowers’ blog post).

The ‘Big Deals’ that Elsevier and other scientific publishers have made with libraries form their insurance for long-term profits. Making these deals subject to confidentiality actively prevents the decrease in publishing prices that would otherwise occur through market mechanisms. And better still from the publishers’ viewpoint, so-called hybrid journals that allow open-access publishing at a price only add to the profits. Although publishers say that subscription prices will be lowered as the proportion of paid-for open access articles increases, there is as yet no sign of this. APC charges for open access articles in hybrid journals are on average $900 higher than those of fully open access journals (as Dave Robberts reported on Google+).

A discussion thread on the above topics was started on the open access list of the OKFN. The following is an attempt to summarize that discussion, and at the same time to ask everyone to participate in this project by adding data on subscription prices in their country to the growing dataset.

Charging for educational re-use…

One of the new items that came to light was the fact that Elsevier and other publishers receive extra fees that allow articles to be re-used for education. One would think that this situation would be covered by fair use policies, but this is not the case. Esther Hoorn of Groningen University, NL provided the following link to an overview of worldwide rules on fair use and limitations for educational use: Study on Copyright Limitations and Exceptions for Educational Activities in North America, Europe, Caucasus, Central Asia and Israel (by Raquel Xalabarder, Professor of Law, Universitat Oberta de Catalunya (UOC), Barcelona, Spain). In the Netherlands, for instance, a fair remuneration has to be paid for the use of publisher-copyrighted material for educational purposes. And, not very surprisingly, this is done in the form of a nationwide big deal with the publishers that has to be renegotiated every few years. According to Jeroen Bosman of Utrecht University (NL), educational use has been granted by some publishers or is sometimes included in big deal contracts for subscriptions. The situation seems to be similar in other countries. According to Stuart Lawson, in the UK it is the Copyright Licensing Agency (CLA) that higher education institutions pay for a licence to copy/re-use work for educational use. In 2013 the CLA had an income of £7,872,449. Not all of that came from educational institutions, but a lot of it did. Another few million pounds that is only spent because the original work wasn’t openly licensed… According to Fabiana Kubke of the University of Auckland, New Zealand does not have fair use either – it has fair dealing. A substantial amount goes into licensing for education at the University of Auckland, and those licences do not come out of the library’s budget (which is why they don’t show up in the library’s expenditures). This makes it all the more difficult to find out about the costs of this type of licensing.
One would need to find out, for each institution, which office is in charge of negotiating and paying copyright clearing costs. From the examples that were discussed, it seems that ‘fair use’ for education almost always has to be paid for. Only in the US and Canada does fair use (or fair dealing) usually allow reproduction of up to 10% of copyrighted material without payment. This situation of having to pay for re-use of publisher-copyrighted material will probably not change very soon. That, at least, is the conclusion to be drawn from the recent decision by the EU to block further discussion of the case made by WIPO (World Intellectual Property Organisation) for harmonising international legislation on text/data-mining and other copyright issues.

Fully implemented open access would be an order of magnitude cheaper…

Apart from the cost of toll access publishing (what customers have to pay), two other items were heavily discussed on the open access mailing list: the cost of open access publishing, and open access to data, which is necessary for text and data mining. Concerning the cost of publishing, it would be very useful to have data on the total expenditure on publishing per country. For the total cost, data are available. According to Bjoern Brembs: “Data from the consulting firm Outsell in Burlingame, California, suggest that the science-publishing industry generated $9.4 billion in revenue in 2011 and published around 1.8 million English-language articles — an average revenue per article of roughly $5,000.” For the cost of open access publishing, Outsell estimates that the average per-article charge for open-access publishers in 2011 was $660.

Doing some quick calculations on Elsevier’s income from access to its digital content, Ross Mounce initially concluded that the cost per article was on average $2,800 over the subscription lifetime of an article (70 years). This assumed a mere $0.5 billion income per year for Elsevier from digital access to its content. The more realistic figure is probably $1 billion, so the cost per article for a representative toll access business would be $5,600, in line with what Outsell estimated. The cost of publishing with one representative open access publisher (PLOS) is at the moment around $1,350 per article. This means that if we could switch to full open access for all articles immediately, we would save about 76% on publishing costs!

[Figure: Profit of scholarly publishing vs. other industries, via @ceptional]

Regarding the profit margins of toll access publishers, Stuart Lawson provided a link to a spreadsheet in which he compared the revenues, profits, and profit margins of academic publishers over the last two years. There is nothing wrong with making a profit as such.
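The back-of-envelope arithmetic above is easy to check in a few lines. A minimal sketch follows; the input figures are the round estimates quoted in the mailing-list discussion (Outsell’s 2011 numbers, the $5,600 toll-access estimate, the PLOS charge), not audited data:

```python
# Sanity check of the per-article cost figures quoted above.
# All inputs are round estimates from the discussion, not audited data.

industry_revenue = 9.4e9   # Outsell: science-publishing revenue, 2011 (USD)
articles = 1.8e6           # English-language articles published in 2011

revenue_per_article = industry_revenue / articles
print(f"Industry-wide revenue per article: ${revenue_per_article:,.0f}")
# → Industry-wide revenue per article: $5,222  (the "roughly $5,000" above)

toll_cost = 5_600          # estimated toll-access cost per article (USD)
oa_cost = 1_350            # representative open-access charge (PLOS, USD)

saving = 1 - oa_cost / toll_cost
print(f"Saving from a full switch to open access: {saving:.0%}")
# → Saving from a full switch to open access: 76%
```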
The scientific publishers’ profit margins are an issue because of their scale, which would be impossible in a truly open market, where open access publishing would have a competitive advantage. However, the toll access publishers’ business of secret deals and long-term contracts works very effectively against a transition to open access business models. The more data we can aggregate on publishing costs, the more chance there is that open access can profit from its competitive advantage. Data for several individual countries, notably Brazil, Germany, the UK, the Netherlands, the USA, France and many more, are already coming in.

What about open data…

Open data is another issue. As Peter Murray-Rust wrote in the discussion thread of the open access mailing list: “the real danger [of] publishers moving into data where they will do far worse damage in controlling us” and “Yes – keep fighting on prices. But prices are not the primary issue – it’s control.” Scientific publishers make huge profits, and money can buy a lot of influence. There is a growing awareness that data are the ‘gold’ of the 21st century. It is of vital importance for science and society that scientific data not be controlled by a few big publishers who want to make as much profit as possible. Open data would be the best way to prevent this from happening. The signs in the US are positive in this respect, with Obama’s directive for open government data. The outlook in Europe is less good, in view of the previously mentioned decision by the EU to block further discussion on harmonizing international legislation on text and data mining.

What we can do….

Together we can uncover the real cost of scientific publishing. Some of you may know where to find figures for your country, or you may be able to ask for information using Freedom of Information legislation. You can add these data directly to a spreadsheet on Google Docs or report them on the WIKI. The aggregated data will allow us to lobby more effectively for the promotion of open knowledge. We will keep you apprised of developments through the open access mailing list and the blogs on this website. You can subscribe to the open access mailing list and/or other lists of the OKFN at LISTINFO.

Recap of the Berlin 11 conference: the call for a change in scientific culture becomes stronger

- January 22, 2014 in #oa, berlin11, Events, metrics, Open Access, Open Science, quality assessment

The Berlin11 meeting, which took place in November 2013, was a very energetic, motivating and inspiring event, and it can only be hoped that the newly introduced satellite meeting for young scientists, in particular, will take place again next year. The presentations of the conference will soon be online. I would like to use this blog to highlight some of the presentations and events of Berlin11 in relation to the developments of 2013. First of all, the Berlin11 conference was the first ever to host a satellite conference aimed at young scientists. This reflects the increasing support for open access among the younger generation of scientists. One of the highlights here was the presentation by Jack Andraka, a 16-year-old who told his story about the role of open access in his development of an innovative test for pancreatic cancer. Another highlight of the satellite conference was the launch of the Open Access Button, which collects instances where people cannot access information because of toll barriers AND even suggests ways to find the information elsewhere, e.g. in open access repositories (link here). At the main conference, Mike Taylor, a scientist and open access advocate from the UK, gave a compelling talk on why the open access movement needs idealists and why the (also economic) benefits of open access far outweigh the costs (slideshare via this link). Bernard Rentier, Rector of the University of Liège, told of his solution for getting scientists to submit their articles to the university repository: only articles deposited there count for tenure and promotion. Robert Darnton, professor at Harvard University, gave a fascinating presentation on the building of the Digital Public Library of America, which gives access to over 5.5 million items from libraries, archives and museums.

Slide of Robert Darnton’s presentation at Berlin11.


Marin Dacos of OpenEdition, France, introduced OpenEdition, the major European portal for digital publications, including books, in the humanities and social sciences. Glyn Moody, science journalist from the UK, gave a captivating talk called ‘Half a Revolution’ on the history of the internet, open source software and open access publications. He made a very strong plea for open access without embargo, calling it Zero Embargo Now: ZEN. The slides of his talk are available here. David Willetts, Minister from the UK, explained the UK government’s policy for open access and, as a side-line, announced the imminent launch of a Science WIKI platform. Ulrich Pöschl of the University of Mainz, Germany, elaborated on the need for other systems of peer review and public discussion of published articles. The standard peer-review system has become so flawed that we urgently need to find ways to replace this measure of scientific quality with new methods of quality assessment. He gave a talk at Berlin11 very similar to the one he had given at the ALPSP seminar on the future of peer review in London one week earlier. He puts his ideas on interactive open access publishing into practice in the journal Atmospheric Chemistry and Physics. According to Pöschl, a new multi-stage peer-review system in combination with open access will lead to improved scientific quality assurance. The process of interactive open access publishing has two stages: 1) rapid publication of a discussion paper, followed by public peer review and interactive discussion; 2) review completion and publication of the final paper. The use of a discussion paper guarantees rapid publication, open access enables public peer review and discussion with a large number of participants, and the number of individual reviews possible in this system is a built-in quality assurance for the final paper. Interestingly, quality assessment under this scheme can also incorporate altmetrics and other measures of impact assessment.

Slide of Ulrich Pöschl’s presentation at Berlin11.

Two main topics during the conference were the call for immediate unrestricted access, Zero Embargo Now (Glyn Moody), and the call for a new scientific quality assessment system (many speakers). The current quality assessment system focuses on where you publish (rewarding publication in high impact journals) and on the number of publications (‘publish or perish’). A general feeling on what such a new assessment should look like is summarized in a key message from the presentation that Cameron Neylon gave: “it should not be that important where and how much you publish but rather what and what quality you publish”. I would add that in one sense it DOES matter where you publish, namely insofar as you publish your work open access, allowing unrestricted re-use. One of the main problems here is how to create the right incentives for scientists to publish open access. We have seen one possible solution in the idea put forward by Bernard Rentier (see above). In addition, a completely refurbished assessment system could provide the right incentives if it takes into account the access, re-use, (social) impact and quality of publications. The current scientific culture, with its emphasis on quantity and status, is slowly but surely undermining the quality of science, because quantity is fundamentally different from quality. In their need to publish as much as possible as quickly as possible, scientists often duplicate results and/or publish results prematurely. Extreme cases of fabricated results are also seen, and it is especially when these cases come to light that the reputation of science suffers. The conventional peer-review system used for quality assessment has proven to be insufficiently robust to prevent this, and new forms of peer review and other methods are being developed to replace it (see for instance F1000). The prioritization of publishing over doing research has had a very negative impact on education and on the quality of research.
It was for this reason that Science in Transition was founded in October 2013 in Amsterdam. The initiators felt that “science does not work as it should” and that something ought to be done about it. The central message, again, is that the pressure to publish is detrimental to education, and that the quality of science and its reputation are compromised by the current system of judging scientific output by the number of publications. The full text of the position paper can be read here. In a TV interview on Dec 30, 2013, Robbert Dijkgraaf, director of the Institute for Advanced Study in Princeton, put it this way:
“we have to judge science more by its value than by the number [of scientific output]”.
That there is growing awareness of the need for change in the area of scientific quality assessment is also illustrated by these quotations from the New Year’s speech of the rector of the University of Amsterdam, prof. van den Boom: “More publications is not always better” and “we have to think hard about alternatives for the evaluation of research” (the full speech can be seen here). In the two months that have passed since Berlin11, open access and the quality of science have received considerable public media attention. The German magazine “Die Zeit”, for example, published a set of articles in its first issue of 2014 on ‘how to save science’. A new case of (self-)plagiarism has led to heated discussions in Dutch newspapers on the uncertain quality of science and the possibilities of fraud. The president of the association of Dutch universities, Karl Dittrich, has announced that, starting in January 2015, more emphasis will be put on quality, and less on quantity, of scientific publications in the evaluation of scientists and universities. The Dutch state secretary for research and education, Sander Dekker, had previously stated that scientific results should be published in open access journals as of 2016, by law if necessary should scientists not comply of their own free will. Also on the issue of open access, the US Congress in January 2014 approved the so-called 2014 omnibus appropriations legislation. This policy requires free public access to every paper based on research even partially funded by a federal agency and submitted to a peer-reviewed journal, no later than 12 months after publication. Although this move towards open access is great news, the drawback is that the bill only deals with free access; the issue of re-use and copyright is left open. Also, the 12-month embargo period is not in line with open access in the sense of the Berlin Declaration.
So it is a step forward, but there is still a long way to go, as Mike Taylor puts it in his latest blog post. Let me conclude this blog with a small prediction: 2014 will be the year in which scientific output is judged less and less by how much and where one has published, and more and more by what, and in what (accessible and re-usable) form, one publishes one’s results.