Google and the right to be forgotten — Catherine Tucker

MIT Sloan Assoc. Prof. Catherine Tucker

From Nikkei Business

The European Court of Justice’s ruling that Google must honor individuals’ requests to be removed from search results—the right to be forgotten, as it has come to be known—is a misguided attempt to address one of the more unfortunate aspects of the digital age.

Although digital technology has brought many wondrous advances, it has also spawned problems. Among the most serious is what I call digital persistence: the tendency of information in digital format to last for a very long time, regardless of its accuracy.

In the analog era, if a telephone directory listed a number incorrectly, the result would be missed calls and wrong numbers until a new directory was published a year later. But in the digital world, wrong information gets repeated again and again, often showing up long after the original mistake was made.

While digital technology can perpetuate the mistakes others make about us, it also has the same effect on the mistakes we make ourselves. For example, young people by nature do silly things. Sometimes they take digital photos of themselves doing these silly things. The pictures can resurface years or decades later, when the actions no longer seem so amusing.

An approach that addresses these problems by targeting Google is flawed in several respects. First, while Google may be a handy scapegoat, especially in Europe, the American search giant is far from the only source of digital data that threatens the right to be forgotten. Information also persists in government records, online newspapers, and social media, as well as in other search engines. Reining in Google while leaving these other major information sources unimpeded will have little effect on the overall problem.

Second, the European Court of Justice’s actions ignore the nature of search engines. They work so well because they are automated. The combination of sophisticated algorithms, high-speed networks, and the Internet’s vast stores of data is what produces Google’s instantaneous and usually on-target results. Introduce humans into this formula via requests to be forgotten, and Google’s performance will slow to a crawl.

A third problem with the ECJ’s approach is that the process of approving requests to be forgotten can have precisely the opposite effect of the one the policy’s architects intended. When someone asks to be removed from search results—say, a politician concerned about rumors of an illicit affair—the request itself sparks interest. In the case of the politician combating damaging rumors, reports of a request to be forgotten prompt new speculation and more rumors, even if the politician isn’t mentioned by name.

Digital persistence, unfortunately, is a problem that will be with us for some time. There are no quick or easy answers. Aiming at one very big target may be a popular move, but it will not bring us any closer to a resolution.

Catherine Tucker is the Mark Hyman Jr. Career Development Professor and Associate Professor of Management Science at the MIT Sloan School of Management.