Open Access: Looking Back, Looking Forwards


A couple of weeks ago, I spoke at a conference celebrating the tenth anniversary of the Berlin Declaration on open access. More formally, the "Berlin Declaration on Open Access to Knowledge in the Sciences and Humanities" is one of three seminal formulations of the open access idea; the other two are the Bethesda Statement (2003) and the original Budapest Open Access Initiative (2002).

I entitled my talk "Half a Revolution"; the slides are embedded below, and can also be freely downloaded.

I began by sketching the pre-history of open access - that is, the key developments before the three defining declarations mentioned above. For example, I noted that the preprint repository arXiv.org was set up in 1991 by Paul Ginsparg, who through his brother knew of Richard Stallman's early work on GNU. Thus one of the ideas behind arXiv.org, with its free access to high energy physics preprints, was precisely Stallman's idea of sharing knowledge freely.

Interestingly, arXiv.org began on 19 August 1991; on 23 August 1991, the World Wide Web was released publicly; and then on 25 August 1991, Linus Torvalds announced the start of Linux. What a week that was...

I also pointed out that the late Michael Hart, the creator of Project Gutenberg, was ahead of them all: he first realised the power of sharing digital artefacts back in 1972 - a decade earlier than GNU, and three decades before open access really started.

But the main part of my talk looked at a key section in the Berlin Declaration:

The author(s) and right holder(s) of such contributions grant(s) to all users a free, irrevocable, worldwide, right of access to, and a license to copy, use, distribute, transmit and display the work publicly and to make and distribute derivative works, in any digital medium for any responsible purpose, subject to proper attribution of authorship

Of course, all those are familiar enough from open source, but as I pointed out, open access still has some way to go in meeting them. That's because the concept of "open access" has been diluted to include things like CC-NC and CC-ND: the first forbids commercial use, while the second forbids changes. Both fail to meet the open access definition quoted above, and I urged people to move on to compatible licences: CC-BY, CC-BY-SA and CC0. These are precisely the licences that meet the Open Knowledge Foundation's Open Definition.

I also emphasised that the trigger for articles to be released as open access is public funding: it's generally accepted that if the public helps to pay for research with its taxes, then it has a right to access the fruits of that research - the papers - free of charge. But I noted that this implies immediate access to open access papers - not access after an embargo, as is frequently the case. I therefore suggested that people should adopt what I called the ZEN approach: Zero Embargo Now.

The normal objection to such proposals is that publishers would lose money if that were implemented. But that really doesn't stand up to scrutiny. If the public has a right to access, it has a right to immediate access. While it may be true that such immediate access could reduce the profits of academic publishers, it's worth remembering two things.

First, the profit margins of such publishers are fabulous - typically 30%, and often more. So even if those came down somewhat, they could still be making money hand over fist. Indeed, it could be argued that such high profit levels prove that the academic publishing system is unfair and dysfunctional. Second, and more generally, propping up old business models is not a justification for denying the public access to the papers they have funded.

I quoted Clay Shirky's great encapsulation of the situation:

Publishing is not evolving. Publishing is going away. Because the word “publishing” means a cadre of professionals who are taking on the incredible difficulty and complexity and expense of making something public. That’s not a job anymore. That’s a button. There’s a button that says “publish,” and when you press it, it’s done.

In ye olden times of 1997, it was difficult and expensive to make things public, and it was easy and cheap to keep things private. Privacy was the default setting. We had a class of people called publishers because it took special professional skill to make words and images visible to the public. Now it doesn’t take professional skills. It doesn’t take any skills. It takes a Wordpress install.

The question isn’t what happens to publishing — the entire category has been evacuated. The question is, what are the parent professions needed around writing? Publishing isn’t one of them. Editing, we need, desperately. Fact-checking, we need. For some kinds of long-form texts, we need designers. Will we have a movie-studio kind of setup, where you have one class of cinematographers over here and another class of art directors over there, and you hire them and put them together for different projects, or is all of that stuff going to be bundled under one roof? We don’t know yet. But the publishing apparatus is gone.

Picking up on those thoughts, I suggested that there were still plenty of opportunities here, but for new kinds of companies - the Red Hats and Googles of the open access publishing world.

I also noted that open access is inherently digital, which means that it must also concern itself with the digital data associated with research - not least because that data is becoming more and more important. Again, we need to be careful about licensing - particularly since in Europe we have to take account of the benighted "sui generis database right." The best way to do that, I suggested, was to turn to the Open Definition again, which requires one of the following licences for data: the Public Domain Dedication and Licence (PDDL); the Attribution Licence (ODC-BY); or the Open Database Licence (ODC-ODbL).

The Berlin Declaration is commendably thorough in its discussion of types of material:

open access contributions include original scientific research results, raw data and metadata, source materials, digital representations of pictorial and graphical materials and scholarly multimedia material.

However, there's one glaring omission there: software. Academic work today is almost inconceivable without software. Of course, not all of it will be written specifically for the task in hand, but in scientific work, for example, it often is. If that software is not available for others to use, the results it produces cannot be reproduced independently - hardly the scientific method. Similarly, if the software is closed source, there is no way to check its logic and workings. The optimal solution is to release all software produced as the result of public funding as open source: that allows others to check it, and to build on it.

Finally, I pointed out that there was a big problem with patents, where they were applicable. Since the US Bayh-Dole Act was passed in 1980, the dogma has been that encouraging universities to apply for patents on tax-funded research is a wonderful way to reward creativity and drive innovation. But as a recent article in Nature points out, Bayh-Dole has been an abysmal failure:

Joy Goswami, assistant director of the technology-transfer office at the University of Delaware in Newark, estimates that only about 5% of patents are licensed at most universities. The rest are a drain on office resources, he adds, because of required maintenance and legal fees.

That is, fully 95% of patents taken out on publicly-funded research are never licensed: they are just locked away. Moreover, the cost of running technology-transfer offices can be considerable, so the academic institution is often worse off financially too. The best solution is to require that all inventions arising out of publicly-funded work must be placed into the public domain, for anyone to use - including companies.

People often say that this is economically inefficient, and that patenting would be better for the economy as a whole. But we have an extremely important example that shows that making results freely available, without patent encumbrances, can be massively efficient:

The financial benefit of the project to decode the human genome continues to grow, according to a controversial report released today by the Battelle Memorial Institute. A decade after the project ended, the benefit now hovers near US$1 trillion.

The Human Genome Project, an international effort led by the United States that ran from 1988 to 2003, has delivered $178 to the US economy for every public dollar spent on the original sequencing, the report says. That is 26% greater than the $141 return-per-dollar that Battelle, a research contractor based in Columbus, Ohio, had calculated in 2011, in its first attempt to estimate the scientific effort’s financial reach.

"The economic impacts generated by the sequencing of the human genome are large, widespread and continue to grow," says Martin Grueber, the primary author of the report and a research leader in Battelle's technology partnership practice.

The important point here is that all genomic sequences generated by the Human Genome Project were placed in the public domain as soon as they were available, as agreed in the Bermuda Principles of 1996 - perhaps the first, and one of the most important, statements about open data.

Thus my shopping list of suggestions for the future contours of open access can be summarised as follows:

ZEN: zero embargo now

text, graphical material, data and databases: Open Definition compliant

software: open source

inventions: public domain (no patents)

I concluded by quoting the Berlin Declaration once more:

our mission of disseminating knowledge is only half complete if the information is not made widely and readily available to society.

For all the many wonderful achievements of open access in the last ten years, I pointed out that we have not reached the point where information is widely and readily available to society - not least to the six billion who need it most. Hence I sought to justify my title - "Half a Revolution" - and urged the open access community to be bold and to complete the revolution by pushing for real and full open access in the next few years.

Follow me @glynmoody on Twitter or identi.ca, and +glynmoody on Google+
