When Creative Commons published version 4.0 of its set of alternative copyright licenses in 2013, this represented a sea change. While previously a generic set of licenses had been legally adapted to different jurisdictions (“ported”), version 4.0 was developed as a single, globally applicable license standard. To a certain degree, a decade of laborious license porting helped to build an international network of legal professionals and a corresponding body of legal knowledge, which then enabled Creative Commons to abandon its porting strategy.
More than three years after this “globalization” of Creative Commons’ licenses, the NGO is now striving to globalize its organizational structure as well. Until March 24, 2017, the various communities of Creative Commons activists, lawyers and contributors may comment on a detailed proposal for an entirely new governance structure. The proposal is accompanied by several regional and one global “Faces of the Commons” reports, as well as additional background information on the process that led to the proposal – a particularly open form of strategy-making.
Since the beginning of 2017, over 60 large German universities and other research institutions have lacked access to journals published by the scientific publishing giant Elsevier. Paradoxically, this escalation of the conflict between research institutions and Elsevier is actually a good thing. To a certain degree, the battle puts to the test a great thought experiment offered by James Heathers last year. In his post he applied the Garbage Strike Test to the contemporary scientific publishing system:
What happens [when garbagemen just stop doing their job]? Almost immediately, massive stinking middens of rancid trash build up. Streets become partially inaccessible. Rats run rampant. Cities marinate in their own furious stink. Rocks are thrown at strike-breakers and scabs. Mayors call meetings.
In the end, garbagemen win such struggles because they are (a) truly necessary, (b) on the right side of public opinion, and (c) something whose absence horrifies people utterly. If you apply this scenario to large academic publishers, assuming that they “suddenly refused anyone any access to any of their copyrighted materials at 9am tomorrow morning”, the outcome would differ substantially.
I like recursivity in acronyms such as GNU, which stands for “GNU’s Not Unix”, and also in cartoons. A recent example of a recursive cartoon is featured below; it addresses a great number of issues regularly debated around the alternative copyright licensing standard Creative Commons. The author of the work is Patrick Hochstenbach, a comic artist, programmer, and digital architect at the University of Ghent libraries. You can also find his work on Instagram and Twitter.
The World Bank’s previously public data on microfinance and financial inclusion have recently been locked away behind a paywall. It is hard to figure out why. The move, however, raises larger questions about the Bank’s strategies for microfinance and for knowledge more broadly.
(This is a background piece to an article published on the IDS blog.)
Since the 1990s, the World Bank has sought to present itself not only as a lender, but also as a global “Knowledge Bank” that collects and provides knowledge as a global public good. It has garnered some praise, and perhaps more criticism, for ostensibly seeking to monopolise knowledge about development. In 2012, the Independent Evaluation Group concluded that the objective of creating a global Knowledge Bank had not been achieved, criticising a lack of uptake of knowledge within the Bank and “intellectual silos”.
So how about intellectual vaults, with knowledge securely locked away? Turning public monopolies into private (or pseudo-private) monopolies; now that doesn’t sound like something the World Bank would be in favour of, does it? It’s precisely what happened with the World Bank’s microfinance data platform earlier this month.
The MIX (also known as the “Microfinance Information Exchange”, or “Mixmarket.org”) was created by the World Bank’s in-house-but-arm’s-length microfinance governing body, CGAP, to improve the transparency of the microfinance industry. Since 2002, the MIX (whose connections to the World Bank are not made very clear, although its headquarters sit just across the street) has collected data about the global microfinance sector, packaged primarily to cater to investment decision-makers.
The MIX’s “.org” suffix denotes its claim to serve the greater good. The data were made available online, and anyone with an interest in microfinance could access them: “a big win for open data in international development”.
Get the “public” data – for upwards of $486
Those days, it seems, are over. All the data that were previously available for downloading and (usually after some cleaning) analysing in a spreadsheet are now behind a paywall. What used to be a “global public good” is now priced at no less than $486 a year – clearly too much for most students or researchers, let alone those from developing countries.
(Image: screenshot from themix.org)
At the end of March I was an invited speaker at a workshop on “Balancing Intellectual Property Claims and the Freedom of Art and Communication” at Bielefeld University’s Center for Interdisciplinary Research (ZiF). My talk was mainly based upon thoughts sketched out in two posts from the series on “Algorithm Regulation” on this blog:
- Algorithm Regulation #9: YouTube and the Comeback of Copyright Registration
- Algorithm Regulation #10: YouTube as a Transnational Rights Clearing Center
The dreadful state of copyright law in the digital age can be nicely illustrated by a thought experiment.* If one thinks back to 1980, it is hard to imagine how one could have committed a copyright violation with a book, an LP or a reel of film. Lending the book to a friend, duplicating parts – or even the whole book – on a photocopier, or staging a reading were all possible without clarifying rights. While copyright was already a complex matter at that time, until the internet it played little role in most people’s everyday lives.
Today everything is different. Anyone who uses a smartphone to video everyday experiences and share them with friends in a personal blog will hardly be able to avoid violating copyright. A couple of seconds of music or a poster in the background suffices, since “making publicly available” on the internet is itself a copyright-relevant act. Many of the most creative digital artforms, such as remix and mashup, are almost impossible to disseminate by legal means, still less to commercialise. The use of even the briefest music or video sequence must be legally clarified, and in most cases this is much too complicated and expensive. Libraries, museums and archives battle with similar problems, which prevent them from digitising their holdings.
Introducing a Right to Remix?
Apart from shorter copyright periods, there would be two other sensible approaches to solving this problem. Firstly, a European harmonisation and expansion of the catalogue of copyright limitations and exceptions would be sensible. The introduction of a de minimis or remix exemption modelled on the fair use clause in US copyright, combined with the forms of flat-fee reimbursement established in Europe, would enable new forms of recreational and remix creativity. Even for commercial publication of remixes and mashups all that would be required is to notify the relevant copyright collecting society (as is already the case for cover versions), in place of the complicated and expensive process of clarifying rights. Secondly, the establishment of a European register of works would simplify clarification of rights and restrict ongoing copyright protection (after an initial period) to cases where works are in fact still in commercial circulation.
Is it dangerous to take a public domain picture from Wikipedia and use it on your blog or print it on a T-shirt? Last week the COMMUNIA blog wrote about a copyright case in Germany in which several users of public domain pictures received letters from the lawyers of Mannheim’s Reiss-Engelhorn museum. The letters demanded payment for the use of photos of public domain artworks that had been uploaded to Wikipedia. The museum justifies this legal action by pointing to the costs of digitizing its artworks and to the resulting acquisition of a form of ancillary copyright protection for simple photographs (“Lichtbildschutz”, § 72 of the German copyright law). On Wikimedia Commons, the repository that hosts media for Wikipedia, there is already a separate category for “Images subject to Reiss Engelhorn lawsuit”.
Among the recipients of the letters were not only Wikimedia Germany and the Wikimedia Foundation, but also the online radio station detektor.fm and the non-profit website “Musical&Co“, which features music-related articles written by children for children.
The recent infighting between the world’s largest academic publishing company, Elsevier, and the (soon to be former) editors of one of its journals over attempts to make the journal open access – that is, freely available online – demonstrates the potential power of editorial boards in shaping the digital future of academic publishing.
The academic publishing system runs on reputation. Researchers gain reputation by publishing in reputable journals, which are more widely read and cited than other journals. The better a journal’s reputation, the more prestigious it is to review for it and to serve as a member of its editorial board. Of course, the related reputation dynamic is self-stabilizing and highly path-dependent: prestigious journals get more submissions, have higher rejection rates, and attract more prestigious authors and reviewers, all of which contributes to being cited more often, which in turn is the key reputation metric in most disciplines (see a paper by Jakob Kapeller and myself on this issue for the field of economics).
The path dependence of journal reputation in contemporary academic publishing is one of the reasons – if not the main reason – why new open access journals face a steep uphill battle against incumbent journals. The few open access journals that have managed to acquire substantial prestige, such as some of the Public Library of Science (PLoS) journals, did so mostly because of the very high prestige of their founding editors, including Nobel laureates. It is also the reason why simply calling for researchers to switch to open access outlets won’t work. Since careers and funding depend on the proven ability to publish in established “top journals”, researchers in general and early-career researchers in particular have strong incentives to avoid newly founded open access outlets.
While the recent Google Books ruling by the US Second Circuit has once again proven how the US copyright system is – thanks to its fair use provision – more flexible and adaptable to digital challenges than its European counterpart, in other fields the legal situation is very similar on both sides of the Atlantic. One such field is digital sampling in music, which is the topic of the book “Creative License: The Law and Culture of Digital Sampling” by McLeod and DiCola (2011, Duke University Press).
Sampling is a comparatively recent practice in which parts of sound recordings are reused to create new works. According to McLeod and DiCola, “a good appropriated sample has […] a good quality of its own, and it has a strong reference that evokes cultural resonance as well” (p. 99, emphasis added). The latter of the two, cultural resonance, not only adds a meta-layer of cultural reference to a song but is also the main reason for the legal calamities associated with sampling. As with remix practices more generally, a core characteristic of sampling is that the old remains visible within the new and is not hidden behind a (more or less transparent) veil of originality.
However, this visibility of creative raw materials – that is, of samples of previous works – is regarded by the courts as a form of creative “short-cut”; they require samplers to clear each and every sample they use, however small the portion of sound may be. McLeod and DiCola:
“Bridgeport Music v. Dimension Films held that no de minimis exception applied to sound recordings. […] [T]he bottom line was, as the ruling stated, ‘Get a license or do not sample.’” (pp. 139, 141)
In Germany, the decision “Metall auf Metall” by Germany’s highest court had identical consequences. The detrimental effects of such a restrictive application of current copyright to the artistic practice of sampling are the reason why sampling-based creativity suffers from permission culture.
Remixing has long since become a part of our daily lives. Today, when amateurs and artists work with images, texts and music, they draw inspiration from existing works and build freely upon them. In many cases, however, copyright law gets in their way.
During what turned out to be the not-so-hot summer of 2014, a wave of ice water crashed through the internet. Throughout the world, people filmed themselves pouring buckets of cold water over their heads, shared the results on social networks and then nominated their friends to perform this strange ritual, which was quickly dubbed the Ice Bucket Challenge. It was a digital chain letter of sorts that spread like wildfire through the internet. The whole thing was actually a call for donations to the ALS Association, named after ALS, a rare nerve disease.
But this is only a partial description of the phenomenon. In contrast to a chain letter, each ice water performance also had an individual touch; each was a continuation of the general motif. In this sense, the Ice Bucket Challenge is also prototypical of digital remix culture.
An example of this remix character is a version of the Ice Bucket Challenge circulating on the internet that is based on Edvard Munch’s famous painting “The Scream”. The picture is a remix and lives off the interplay between the old and the new. Without a clearly recognizable reference to Munch’s series of paintings “The Scream”, it would be as inexplicable as it would be without prior knowledge of the Ice Bucket Challenge phenomenon. This is the essence of a remix: the old, original work remains identifiable in the new one.
The example of the Ice Bucket Challenge is revealing. It illustrates how the internet and digital technologies have contributed to the rise of a broadly disseminated – not to mention democratized – remix culture. As a mass phenomenon, this new remix culture is characterized by numerous contradictions.