March 12-15, 2019, Freie Universität Berlin, Germany

Creativity is one of the key concepts of present-day Western societies, yet also one of the most slippery. Today, the call for creativity extends far beyond typically “creative” fields and industries and has become a universal social norm. Creative processes, however, are fundamentally surrounded by uncertainty: it is difficult to know ex ante which ideas will prove creative, and, because of its destructive force, creativity is also highly contested. This inherent uncertainty thus spills over into other social spheres as well.
The DFG-funded Research Unit “Organized Creativity” is studying creative processes in music and pharmaceuticals – as representative of creativity in the arts and in the sciences. The goal of the unit is to understand in greater depth the practices of inducing and coping with uncertainty employed by the various actors involved in creative processes.

Target Group
The Spring School provides space for exchange between advanced doctoral students, early postdocs and several senior scholars who do research on creativity, whether in the context of innovation research or in the fields of business and management studies, economic geography, psychology or sociology. Combining lectures by renowned scholars (Prof. Dr. Dr. Karin Knorr Cetina, Prof. David Stark, Ph.D., Prof. Dr. Gernot Grabher, Prof. Dr. Elke Schüßler, Prof. Dr. Jörg Sydow) with the presentation, discussion and development of individual papers, this call invites advanced doctoral students and early postdocs from all disciplines concerned with creativity and uncertainty to join our discussion in Berlin. The working language will be English.

Digitalization reduces technological and financial barriers to scientific publishing. Science can thus become faster, more inclusive and more plural. At the same time, the growing acceptance of specific forms of Open Access has also led to the rise of author-pays business models based on Article Processing Charges (APCs). The increasing publication pressure in the scientific system, combined with APCs, creates incentives for “predatory” journals that conduct peer review only nominally or superficially in order to maximize their profits from such APCs. The manuscripts they publish are at best inadequate and at worst deliberately tendentious and misleading.

How to stop predatory publishers? (Credit: SarahRichterArt, CC0)

Recently, an investigative report by the German newspaper Sueddeutsche Zeitung and the public broadcasters WDR and NDR revealed that even researchers from reputable academic institutions publish in, or serve as representatives of, publishers of dubious quality. In their attempt to expose “Fake Science” (the journalists used the English term in their German reporting), they easily managed to place a nonsensical article in an allegedly peer-reviewed journal charging APCs. They also showed how such unscientific practices not only harm the reputation of legitimate open access journals but also serve as a potential source – and supposed scientific proof – for fake news more generally.

This blogpost discusses how reputable (Open Access) journals can defend their credibility against partly or entirely dubious Open Access journals. In our opinion, the most sustainable response, though feasible only in the medium to long term, would be to abandon author-pays business models altogether and switch to publication infrastructures financed by universities and research institutions (for an example of such an approach, check out the Open Library of Humanities). In the short term, however, certain open peer review practices might also help address the problem of predatory open access journals.

The increasing number of collective open access deals, either on the national level (e.g., the Dutch open access deals) or between publishers and research institutions (e.g., the agreement of the Max Planck Society with Springer), has some very practical consequences for scientific publishing processes. On a macro level, these deals make research-strong countries and institutions even stronger: their papers become better accessible worldwide, with corresponding consequences for reception and citation counts.

But there are also consequences on the micro level. In co-authored papers, for example, the question of who acts as “corresponding author” suddenly becomes of utmost importance: only if the corresponding author is based at an institution covered by such an open access deal will an article be immediately accessible to anyone worldwide. And it is the corresponding author who has to sign copyright forms on behalf of all the authors to “seal the deal”.

Publishers pushing for Non-Commercial Clause

What I have learnt only very recently is that publishers try to retain as many rights as possible even in cases where researchers are eligible for open access publication. SAGE Publications, for instance, tries to convince – if not force – authors to opt for a Creative Commons license with the restrictive non-commercial (NC) clause (full form as a PDF):

This is troublesome for a whole bunch of reasons.

On Tuesday, September 26, I was invited to speak at the Digital Europe Working Group of the Progressive Alliance of Socialists and Democrats (S&D) in the European Parliament on the issue of copyright reform. Current debates revolve mostly around two new articles proposed by the European Commission. Article 11 proposes to introduce a new neighbouring right for press publishers, following the (so far mostly failed) examples of Germany and Spain. Article 13, in turn, wants platform owners to implement upload filters as a means of copyright enforcement, thereby undermining the liability exceptions of the EU E-Commerce Directive.

The key point I was trying to make was that targeting large platforms such as Google and Facebook with ever stricter copyright regulation won’t hurt them, but rather their competitors and everyone else. Even now we can observe that, due to its market power, Google was able to more or less write its own copyright rules for its video platform YouTube. This had actually been necessary, given the misalignment between European copyright regulation and everyday online practices (e.g., creating and sharing user-generated content) based upon new digital technologies.

As a consequence, YouTube effectively functions as a transnational rights clearing center and has brought registration requirements back into copyright law in action. However, rights clearing only works on YouTube’s proprietary platform and under remuneration rules negotiated between Google and rights holders, which further strengthens YouTube’s already dominant position in the marketplace. Introducing an upload filter requirement would only strengthen Google’s market position even more, making it still more difficult for rights holders to negotiate fair remuneration.

As a way forward, I proposed instead introducing harmonized and remunerated exceptions for remix and bagatelle uses. These would be practical not just for Google but for anyone else as well, and would help to re-align copyright law in the books with copyright law in action.

Please find a video (slides and audio only) and my slides below:

When Creative Commons published version 4.0 of its set of alternative copyright licenses in 2013, this represented a sea change. While previously a generic set of licenses had been legally adapted to different jurisdictions (“ported”), version 4.0 of the licenses was developed as a single globally applicable license standard. To a certain degree, a decade of laborious license porting helped to build an international network of legal professionals and a respective body of legal knowledge, which then enabled Creative Commons to abandon its porting strategy.


More than three years after this “globalization” of its licenses, Creative Commons now strives to globalize its organizational structure as well. Until March 24, 2017, the various communities of Creative Commons activists, lawyers and contributors may comment on a detailed proposal for an entirely new governance structure. The proposal is accompanied by several regional and one global “Faces of the Commons” report, as well as additional background information on the process that led to the proposal – a particularly open form of strategy-making.

Since the beginning of 2017, over 60 large German universities and other research institutions have lacked access to journals published by the scientific publishing giant Elsevier. Paradoxically, this escalation of the conflict between research institutions and Elsevier is actually a good thing. To a certain degree, the battle puts to the test a great thought experiment provided by James Heathers last year. In his post, he applied the Garbage Strike Test to the contemporary scientific publishing system:

What happens [when garbagemen just stop doing their job]? Almost immediately, massive stinking middens of rancid trash build up. Streets became partially inaccessible. Rats run rampant. Cities marinate in their own furious stink. Rocks are thrown at strike-breakers and scabs. Mayors call meetings.

In the end, garbagemen win such struggles because they are (a) truly necessary, (b) on the right side of public opinion, and (c) something whose absence horrifies people utterly. If you apply this scenario to large academic publishers, assuming that they “suddenly refused anyone any access to any of their copyrighted materials at 9am tomorrow morning”, the outcome would differ substantially.

I like recursivity in acronyms such as GNU, which stands for “GNU’s Not Unix”, and also in cartoons. A recent example of a recursive cartoon, featured below, addresses a great number of issues regularly debated around the alternative copyright licensing standard Creative Commons. The author of the work is Patrick Hochstenbach, a comic artist, programmer, and digital architect at the Ghent University libraries. You can also find his work on Instagram and Twitter.

Patrick Hochstenbach, CC-BY-SA


The World Bank’s previously public data on microfinance and financial inclusion has recently been locked away behind a paywall. It’s hard to figure out why. The move, however, raises larger questions about the Bank’s strategies for microfinance and for knowledge more broadly.

(This is a background piece to an article published on the IDS blog.)

Since the 1990s, the World Bank has sought to present itself not only as a lender, but also as a global “Knowledge Bank” that collects and provides knowledge as a global public good. It has garnered some praise, and perhaps more criticism, for ostensibly seeking to monopolise knowledge about development. In 2012, the Independent Evaluation Group concluded that the objective of creating a global Knowledge Bank had not been achieved, criticising a lack of uptake of knowledge within the Bank and “intellectual silos”.

So how about intellectual vaults, with knowledge securely locked away? Turning public monopolies into private (or pseudo-private) monopolies; now that doesn’t sound like something the World Bank would be in favour of, does it? Yet it is precisely what happened with the World Bank’s microfinance data platform earlier this month.

The MIX (“Microfinance Information Exchange”) was created by the World Bank’s in-house-but-arm’s-length microfinance governing body, CGAP, to improve the transparency of the microfinance industry. Since 2002, the MIX (whose connections to the World Bank are not made very clear, though its headquarters are across the street) has collected data about the global microfinance sector, packaged primarily to cater to investment decision-makers.

The MIX’s “.org” suffix denotes its claim to serve the greater good. The data were made available online, and anyone with an interest in microfinance could access them: “a big win for open data in international development”.

Get the “public” data – for upwards of $486

Those days, it seems, are over. All the data that were previously available for downloading and (usually after some cleaning) analysing in a spreadsheet are now behind a paywall. What used to be a “global public good” is now priced at no less than $486 a year – clearly too much for most students or researchers, let alone those from developing countries.



At the end of March, I was an invited speaker at a workshop on “Balancing Intellectual Property Claims and the Freedom of Art and Communication” at Bielefeld University’s Center for Interdisciplinary Research (ZiF). My talk was mainly based upon thoughts sketched out in two posts from the series on “Algorithm Regulation” on this blog:

Please find a video of my talk and the respective slideshare embedded below:

The dreadful state of copyright law in the digital age can be nicely illustrated by a thought experiment.* Think back to 1980: it is hard to imagine how one could have committed a copyright violation with a book, an LP or a reel of film. Lending the book to a friend, duplicating parts – or even the whole book – on a photocopier, or staging a reading were all possible without clarifying rights. While copyright was already a complex matter at that time, until the internet it played little role in most people’s everyday lives.

Cover of the reader on "The Digital Society" by the Friedrich-Ebert-Foundation

Today everything is different. Anyone who uses a smartphone to video everyday experiences and share them with friends in a personal blog will hardly be able to avoid violating copyright. A couple of seconds of music or a poster in the background suffice, since “making publicly available” on the internet constitutes a copyright-relevant act. Many of the most creative digital artforms, such as remix and mashup, are almost impossible to disseminate by legal means, still less to commercialise. The use of even the briefest music or video sequence must be legally cleared, and in most cases this is much too complicated and expensive. Libraries, museums and archives battle with similar problems, which prevent them from digitising their holdings.

Introducing a Right to Remix?

Apart from shorter copyright terms, two other approaches to solving this problem would be sensible. Firstly, a European harmonisation and expansion of the catalogue of copyright limitations and exceptions: the introduction of a de minimis or remix exemption modelled on the fair use clause in US copyright, combined with the forms of flat-fee remuneration established in Europe, would enable new forms of recreational and remix creativity. Even for the commercial publication of remixes and mashups, all that would then be required is to notify the relevant copyright collecting society (as is already the case for cover versions), in place of the complicated and expensive process of clarifying rights. Secondly, the establishment of a European register of works would simplify the clarification of rights and restrict ongoing copyright protection (after an initial period) to cases where works are in fact still in commercial circulation.


The Book

Governance across borders: transnational fields and transversal themes. Leonhard Dobusch, Philip Mader and Sigrid Quack (eds.), 2013, epubli publishers.

Copyright Information

Creative Commons License
All texts on governance across borders are licensed under a Creative Commons Attribution-Share Alike 3.0 Germany License.