On Thursday morning, the House Judiciary Committee held its latest in a long series of hearings concerning potential copyright reform — sometimes referred to as “the Next Great Copyright Act” after the Copyright Office kicked off the process with a talk on that topic (I’d quibble with the word “great” in there given how things are going so far). The latest hearing focused on Section 512 of the DMCA, better known as the “notice and takedown” provisions, or, more broadly, as the “safe harbor” provisions, which (mostly) protect service providers from being held liable for infringement done by their users. You’ve heard all of the arguments concerning this on both sides before — and we had a post describing 5 myths likely to come up during the hearings (which did not disappoint).
Professor Rebecca Tushnet did her usual amazing job of taking insanely detailed notes of both the speechifying section and the Q&A section. There’s a lot to cover, so we’re going to break it down into a few different posts. This one is going to focus on the catchy phrase that came up repeatedly throughout the hearings: the idea that rather than the “notice and takedown” provision we have today, there should be a “notice and staydown.” Beyond coming up repeatedly during the hearing itself, the concept was also outlined by two of the more maximalist (and clueless) defenders of extreme copyright law, Reps. Judy Chu and Tom Marino, in an opinion piece pushing for such a “notice and staydown” concept.
The idea is, more or less, that if a site receives a takedown notice concerning a particular copy of a work, it should then automatically delete all copies of that work and, more importantly, block that work from ever being uploaded again. This may sound good if you’re not very knowledgeable about (a) technology and (b) copyright law. But if you understand either, or both, you quickly realize this is a really, really stupid solution that won’t work and will have all sorts of dangerous unintended consequences that harm both creativity and the wider internet itself.
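To see the technical problem concretely, here is a purely hypothetical sketch (not how ContentID or any real system works) of the most naive possible “staydown” filter: a blocklist keyed on exact file hashes. Everything here — the function names, the blocklist, the sample bytes — is invented for illustration.

```python
import hashlib

# Hypothetical "staydown" blocklist: hashes of works named in takedown notices.
blocklist = set()

def take_down(file_bytes: bytes) -> None:
    """Record a work's hash after a takedown notice."""
    blocklist.add(hashlib.sha256(file_bytes).hexdigest())

def upload_allowed(file_bytes: bytes) -> bool:
    """Reject any upload whose bytes exactly match a taken-down work."""
    return hashlib.sha256(file_bytes).hexdigest() not in blocklist

original = b"...stand-in for a video file..."
take_down(original)

print(upload_allowed(original))            # False: the exact copy is blocked
print(upload_allowed(original + b"\x00"))  # True: one extra byte evades the filter
```

The failure cuts both ways: an infringer evades the filter with any trivial re-encode or edit, while a licensed or fair-use upload of the identical bytes is blocked, because a hash carries no information about the context of the use. More sophisticated perceptual fingerprinting narrows the first problem but does nothing about the second.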
First, as was pointed out in the 5 myths piece, content itself is not illegal; it is particular actions taken with a piece of content that can be infringing. So a notice and staydown regime guarantees that perfectly legitimate uses, including both licensed uses and fair uses, get blocked as well. That’s because determining whether something is infringing requires viewing it in its full context. No matter how much some copyright maximalists want to believe that copyright is a strict liability law, it is not. The very same content may be infringing in some cases and not infringing in others, and a system that never checks the context of each use would inevitably block perfectly legitimate expression. That’s a big problem.
Second, and perhaps even bigger, is the fact that such a law would more or less lock in a few big players, like YouTube, and effectively kill any chance for a startup or entrepreneur to innovate and offer a better solution. Throughout the hearing, people referred to Google’s ContentID system, which takes fingerprints of audio and video works and matches new uploads against them, as an example of a proactive system “done right.” Except that system cost Google somewhere around $50 or $60 million to build. No startup can replicate that. And even then, ask plenty of regular YouTube users and they’ll tell you ContentID is really, really bad. It kills off fair use works all the time and creates tremendous problems for legitimate and licensed users of content, who suddenly find their content pulled and strikes placed on their accounts. It more or less proves that, even with all the money in the world, no one has yet built a fingerprinting system that is particularly accurate.
If such a rule were put in place, however, it would basically guarantee that only the few big players who could afford both the technology and the legal liability/insurance for the inevitable lawsuits would be able to continue hosting user-generated content. That’s more or less ceding much of the internet to Google and Facebook. Considering how often copyright maximalists like to attack big companies like Google for not “sharing the wealth” or “doing their part,” it’s absolutely ridiculous that their biggest suggestion is one that would effectively hand the big internet players even more power and control.