For the last couple of years, journalist Dune Lawrence has been subjected to constant harassment. She wrote several articles about an investment firm, and its owner launched an online defamation campaign against her and other targets. He posted images of her to his website and called her a racist, a fraud, incompetent and dumb, among many other insults.
Remarkably, these images quickly rose to the top of Google’s image search results for her name, and they remain there at the time of writing, making it easy for anyone to find the defamatory comments about her. The results still rank highly even though the site’s creator has already lost one defamation case, has been charged with fraud, and is defending yet another defamation claim.
One of the great advantages – and disadvantages – of the internet is that it lets anyone easily publish anything they want. If someone publishes defamatory comments about you, or personal information or explicit photos of you, what can you do? Suing the host of the potentially illegal content is difficult and expensive, and it may even be impossible if the host is in another country or is protected by strong free speech laws.
Right to be forgotten
If you live in the European Union, you can invoke something called the “right to be forgotten” (RTBF). This allows European citizens to request that search engines such as Google remove entries from their search results that lead to content which is irrelevant, excessive or unnecessarily defamatory.
Brought into effect in 2014, the measure was recently extended to apply to any international version of a search engine accessed from within the EU. The problem is that the harmful content isn’t removed – just the search engine link to it.
The RTBF principle has attracted criticism from those who see it as an unnecessary infringement of freedom of speech. It also poses a risk to mainstream search engines such as Google, which currently dominates the search market: users could start to abandon the site if they felt they were not getting a full set of results. Responding to all of the removal requests is also undoubtedly expensive.
Nevertheless, it can be useful for citizens to be able to try to remove content from search results. Victims of harmful content can apply to search engines and explain why links should be removed. At the time of writing, Google has examined just under 1.5 million links since 2014, and 43% of requests have led to a link being removed. By any standard, that is a considerable amount of information that is no longer listed. If your RTBF petition is denied, you can appeal to your national data protection authority and, ultimately, go to court to try to have a link removed.
Legal removal request
But you don’t actually need an RTBF request – which deals with content that may be perfectly legal but can be deemed irrelevant, outdated or excessive – to have a link removed from Google. Instead, you can use a legal removal request, versions of which have existed for over a decade as a way to remove links to copyright-infringing content from any Google-owned site, including YouTube and Blogger.
Legal removal, by contrast, targets links to potentially illegal content, including defamatory statements, malware, sexually explicit content uploaded without consent, and other similar infringing material. After you submit your request, Google’s staff will analyse the content to decide whether it is potentially illegal and violates the company’s terms of service. The link may also be removed while the investigation is under way.
However, this comes at a cost. After a successful legal removal request, Google displays a notice in the search results stating that content or a link has been removed, together with a link to the request in a database called Lumen (formerly known as Chilling Effects). For the purposes of transparency, and to shine a light on removal requests, the database records the URL of the removed link, although this is often redacted in cases of defamation or explicit images. Victims of illegal activity have to decide whether having such a notice at the bottom of a search is worth it.
One thing is clear: most of the above solutions are imperfect, as they do not cover all search engines. It is also important to stress that legal removal requests and the right to be forgotten do not remove the actual content.
Internet activist John Gilmore is often quoted as saying: “The Net interprets censorship as damage and routes around it.” A full removal of illegal content may very well be impossible.
Andres Guadamuz, Senior Lecturer in Intellectual Property Law, University of Sussex
This article first appeared on The Conversation.