Once again, automated takedown requests cause collateral damage

May 28, 2013 12:40 GMT

It's no secret that media companies don't much like Kim Dotcom or anything he does, including Mega, the new cloud storage service he unveiled a few months ago.

However, these companies have so far refrained from taking direct shots at Mega. With the new service, Dotcom goes above and beyond what the law requires to keep infringing material off the site.

That's not to say they wouldn't like Mega to go away. And they may have found the perfect tool for it: takedown requests sent to Google, which have become the preferred method of censoring millions of sites with very little effort.

In the span of a few weeks, Mega was the target of two takedown requests from two separate Hollywood studios.

That in itself isn't a surprise, as some infringing material will end up on the site no matter how hard Mega tries to keep it out.

But the takedown requests didn't target a specific file, or even a specific user; they targeted the entire Mega.co.nz domain.

Rather than point out the URLs that contained the infringing files, the studios went one better and tried to get the entire site removed from Google's search results.

It's hard to believe this was done on purpose, though even that wouldn't be too surprising. More likely, the problem lies with the automated tools these companies use to detect infringing material and request its removal.

These tools are error-prone and constantly target innocent sites and URLs. But the companies that make them and the companies that use them have little incentive to improve them.

When a request does target a site by mistake, Google will reject it, as it did with the two aimed at Mega, but that's where it ends. The companies making the claims can go on asking for the removal of millions of pages every day.