What Claims Do We Have Over Our Google Search Profiles?

This is a guest post by Hannah Carnegy-Arbuthnott (University of York).

We’ve all done things we regret. It used to be possible to comfort ourselves with the thought that our misadventures would soon be forgotten. In the digital age, however, not only is more of our personal information captured and recorded, but search engines can also serve up long-forgotten information at the click of a button.

Many of us worry about the results that search engines like Google serve up when someone searches our name. But should we have a right to have unpalatable results erased? Data protection provisions embedded in the GDPR give us just such a right, at least when the search results in question are deemed sufficiently outdated or irrelevant.

This right, colloquially known as the ‘right to be forgotten’, has been subject to fierce criticism. Critics argue that it goes too far in protecting the privacy interests of individuals against the public interest in freedom of information, and that it incoherently treats search engine results differently from the underlying sources to which they point.

I argue in my paper, ‘Privacy, Publicity, and the Right to Be Forgotten’, that these critics are correct that the right to be forgotten can’t be justified by appeal to privacy alone. However, we can defend it on the basis that we have claims against our public profiles being distorted in various ways. Recognizing these claims against distortion expands our toolkit for defending a range of data protection provisions, and widens the scope of the debate about how much control individuals should have over how their personal information is presented in public.

How Search Engines Create and Distort Our Public Profiles

To see why we have reason to target results served up by search engines like Google specifically, we need to understand how they operate. Google does not merely provide a neutral index of the internet, analogous to a library catalogue. Instead, it ranks search results by relevance for any given search term. Moreover, it tailors these results to each individual user, depending on what it knows about why you’re using a particular search term. When it comes to searches of a person’s name, the search engine essentially curates a public profile of that person, one that is tailored to the purposes of each searcher.

As Meg Leta Jones points out, we use search engines to make assessments about people across a range of domains, whether in recruitment, admissions, or dating. When Google serves up results, the message it implicitly conveys to the searcher is: here are the most relevant pieces of information by which to judge your prospective employee, applicant, or date.

What claims do we have against search engines doing this? When search engines serve up information that is outdated or irrelevant relative to the purpose of the search, that involves a distortion of our public profile. The distortion arises not from making the information public, nor from presenting false information. After all, search results link to information that is already in the public domain. Rather, the distortion arises from the way in which the information is presented.

Such distortion can happen when information from someone’s past is presented in a way that suggests it is appropriate to hold them accountable for it, when it is no longer appropriate to do so. For example, a search engine distorts your public profile when it serves up, to a prospective employer, information about a debt you defaulted on decades ago. I argue that we have claims against that kind of distortion.

Shaping Norms of Accountability

In trying to ascertain what counts as distortion, one question that arises is how to assess what it is appropriate to hold people accountable for. One way to do this is by reference to existing norms of accountability. But of course, those norms shift over time, and the very existence of search engines has already caused a seismic shift in those norms.

What’s needed, then, is careful thought about what our norms of accountability should look like, and how to foster a culture in which we are not kept stiflingly shackled to our past mistakes. We should also pay attention to the power dynamics at play. We have reason to worry when a handful of corporations play an outsize role in shaping our norms.

This approach provides a path to defending and implementing measures that give individuals some control over their personal information, even when, from the perspective of privacy, the cat is already out of the bag.
