Right to be forgotten
The right to be forgotten (RTBF) is the right to have private information about a person removed from Internet searches and other directories in some circumstances. The issue has arisen from the desire of individuals to "determine the development of their life in an autonomous way, without being perpetually or periodically stigmatized as a consequence of a specific action performed in the past". The right entitles a person to have data about them deleted so that it can no longer be discovered by third parties, particularly through search engines.
Those who favor a right to be forgotten cite its necessity due to issues such as revenge porn sites and references to past petty crimes appearing in search engine listings for a person's name. The main concern is for the potentially undue influence that such results may exert upon a person's online reputation indefinitely if not removed.
Those who oppose the right worry about its effect on the right to freedom of expression and whether creating a right to be forgotten would result in a decreased quality of the Internet, censorship, and the rewriting of history.
The right to be forgotten is distinct from the right to privacy. The right to privacy concerns information that is not publicly known, whereas the right to be forgotten involves revoking public access to information that was publicly known at a certain time.
Recognition by jurisdiction
= Argentina =
Argentina has seen lawsuits by celebrities against Google and Yahoo! in which the plaintiffs demanded the removal of certain search results and of links to photographs. One case, brought by the artist Virginia da Cunha, involved photographs that had originally been taken and uploaded with her permission; she alleged, however, that the search results improperly associated her photographs with pornography. Da Cunha's case achieved initial success, resulting in Argentine search engines not showing images of her, but the decision is on appeal.
Virginia Simari, the judge who ruled in favor of Da Cunha, stated that people have the right to control their image and to prevent others from "capturing, reproducing, broadcasting, or publishing one's image without permission". Simari also cited a treatise by Julio César Rivera, a Buenos Aires lawyer, author, and law professor, to the effect that "the right to control one's personal data includes the right to prevent others from using one's image." Since the 1990s, Argentina has also been part of the habeas data movement, having "adopted a constitutional provision that is part freedom-of-government-information law and part data privacy law." Argentina's version is known as amparo, and Article 43 provides:
"Any person shall file this action to obtain information on the data about himself and their purpose, registered in public records or databases, or in private ones intended to supply information; and in case of false data or discrimination, this action may be filed to request the suppression, rectification, confidentiality or updating of said data."
Argentina's efforts to protect its people's right to be forgotten have been called "the most complete" because individuals are able to correct, delete, or update information about themselves. Overall, such information is then bound to remain confidential.
= China =
In 2016, a Chinese court in Beijing rejected an argument for the right to be forgotten when a judge ruled in favor of Baidu in a lawsuit over removing search results. It was the first such case to be heard in a Chinese court. In the suit, Ren Jiayu sued the Chinese search engine Baidu over search results that negatively associated him with a previous employer, Wuxi Taoshi Biotechnology. Ren argued that by posting the search results, Baidu had infringed upon his right of name and right of reputation, both protected under Chinese law. Because of these protections, Ren believed he had a right to be forgotten through the removal of these search results. The court ruled against Ren, holding that his name is a collection of common characters and that the search results were therefore derived from relevant words. The court described search results as neutral findings based on an algorithm and stated that retaining such information was necessary for the public.
= European Union =
Europe's data protection laws do not implement a "right to be forgotten", but a more limited "right to [data] erasure". Variations on the concept of a right to be forgotten have existed in Europe for many years, including:
In the United Kingdom there is the idea, addressed for example by the Rehabilitation of Offenders Act 1974, that after a certain period of time many criminal convictions become "spent", meaning that information about them should not be considered when a person obtains insurance or seeks employment.
In France, le droit à l'oubli (the right to be forgotten) was enacted in French law in 2010.
Opinions on the right to be forgotten differ greatly between the United States and EU countries. In the United States, accessibility, the right of free speech under the First Amendment, and the "right to know" are typically favored over removing, or making it harder to access, truthfully published information about individuals and corporations. Although the term "right to be forgotten" is relatively new, the European Court of Justice legally solidified the "right to be forgotten" as a human right when it ruled against Google in the Costeja case on May 13, 2014.
This raises questions about the limits of applying the right within a single jurisdiction, including the inability to require the removal of information held by companies outside that jurisdiction. There is no global framework to allow individuals control over their online image. However, Professor Viktor Mayer-Schönberger has argued that Google cannot escape compliance with the French law implementing the 2014 decision of the European Court of Justice, pointing out that the U.S. and other nations have long maintained that their local laws have "extra-territorial effects".
In 1995, the European Union adopted the European Data Protection Directive (Directive 95/46/EC) to regulate the processing of personal data; it is now considered a component of human rights law. The newer European General Data Protection Regulation provides protection and exemptions for companies classified as "media" companies, such as newspapers and other journalistic outlets. Google, however, purposely opted out of being classified as a "media" company and is therefore not covered by that exemption. Judges in the European Union ruled that because Google, as an international corporation, collects and processes data, it should be classified as a "data controller" within the meaning of the EU data protection directive. Such "data controllers" are required under EU law to remove data that is "inadequate, irrelevant, or no longer relevant", making the directive of global importance.
Article 12 of Directive 95/46/EC gave individuals a legal basis for Internet protection. In 2012 the European Commission disclosed a draft European Data Protection Regulation to supersede the directive, which included specific protection for the right to be forgotten in Article 17. The right to be forgotten was replaced by a more limited right of erasure in Article 17 of the version of the GDPR that was adopted by the European Parliament in March 2014 and became EU law in April 2016.
To exercise the right to be forgotten and request removal from a search engine, one must complete a form through the search engine's website. Google's removal request process requires the applicant to identify their country of residence, provide personal information and a list of the URLs to be removed with a short description of each, and, in some cases, attach legal identification. The applicant receives an email from Google confirming the request, but the request must be assessed before it is approved for removal. If the request is approved, searches using the individual's name will no longer show the content in the results; the content itself remains online and is not erased. After a request is filed, Google's removals team reviews it, weighing "the individual's right to privacy against the public's right to know" and deciding whether the information is "inadequate, irrelevant or no longer relevant, or excessive in relation to the purposes for which they were processed". Google formed an Advisory Council of professors, lawyers, and government officials from around Europe to provide guidelines for these decisions, but the review process remains opaque to the general public. Guidelines from EU regulators were not released until November 2014, yet Google began acting much sooner, which (according to one author) allowed them "to shape interpretation to [their] own ends". In May 2015, eighty academics called for more transparency from Google in an open letter.
The form asks people to select one of the 28 countries that then made up the European Union, as well as Iceland, Liechtenstein, Norway, and Switzerland. "The form allows an individual or someone representing an individual to put in a request" for the removal of any URLs believed to violate the individual's privacy. Regardless of who submits the form, photo identification of the person on whose behalf it is submitted must be included, to serve as proof that that person does in fact approve of the request.
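As a rough illustration of the kind of information described above, the following sketch models a delisting request as a simple data structure. The field names and the preliminary checks are paraphrased from the description above and are assumptions for illustration only; the real process is a web form reviewed by Google staff, not a public API.

```python
from dataclasses import dataclass, field

@dataclass
class DelistingRequest:
    """Hypothetical model of the form fields described above; not an actual Google API."""
    country_of_residence: str          # one of the EU/EEA countries plus Switzerland
    full_name: str                     # name of the data subject
    urls: list[str]                    # URLs requested for removal
    justifications: dict[str, str] = field(default_factory=dict)  # short description per URL
    photo_id_attached: bool = False    # proof that the data subject approves the request

def preliminary_check(request: DelistingRequest) -> list[str]:
    """Flag obviously incomplete requests before a human reviewer weighs
    the individual's right to privacy against the public's right to know."""
    problems = []
    if not request.photo_id_attached:
        problems.append("missing photo identification")
    for url in request.urls:
        if url not in request.justifications:
            problems.append(f"no description given for {url}")
    return problems
```

The substantive decision, of course, is the human one that follows: whether the linked information is "inadequate, irrelevant or no longer relevant, or excessive" relative to the public interest.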
If Google refuses a request to delink material, Europeans can appeal to their local data protection agency. As of May 2015, the British Data Protection Agency had treated 184 such complaints, and overturned Google's decision in about a quarter of those. If Google fails to comply with a Data Protection Agency decision, it can face legal action.
In July 2014, in the early stages of Google's effort to comply with the court ruling, legal experts questioned whether Google's widely publicized delistings of a number of news articles violated the UK and EU Data Protection Directive, since in implementing the Directive, Google is required to weigh the damage to the person making the request against any public interest in the information being available. Google indeed acknowledged that some of its search result removals, affecting articles that were of public interest, were incorrect, and reinstated the links a week later. Commentators like Charles Arthur, technology editor of The Guardian, and Andrew Orlowski of The Register noted that Google is not required to comply with removal requests at all, as it can refer requests to the information commissioner in the relevant country for a decision weighing the respective merits of public interest and individual rights.
Google notifies websites that have URLs delinked, and various news organizations, such as BBC, have published lists of delinked articles. Complainants have been named in news commentary regarding those delinkings. In August 2015 the British Data Protection Agency issued an enforcement action requiring Google to delink some of these more recent articles from searches for a complainant's name, after Google refused to do so. Google complied with the request. Some academics have criticized news organizations and Google for their behavior.
In July 2015, Google accidentally revealed data on delinkings that "shows 95% of Google privacy requests are from citizens out to protect personal and private information – not criminals, politicians and public figures."
The leak had serious reputational consequences for Google, as the public expressed outrage and fear over the newly disclosed information. Though only 5% of requests were made by criminals, politicians, and public figures, the content removed in those cases sparked the most concern. In one instance, a British doctor requested the removal of 50 links concerning past botched medical procedures, and Google agreed to remove three search results containing his personal information. Critics argued that removing such information can be used for manipulation and could lead innocent people to make uninformed decisions. Google responded that when removing content it considers both the rights of the individual and the public interest.
The European Union has advocated that delinkings requested by EU citizens be implemented by Google not just on European versions of Google (such as google.co.uk and google.fr), but on google.com and other international domains. Regulators want delinkings implemented so that the law cannot be circumvented. Google has refused the French Data Protection Agency's demand to apply the right internationally. Due in part to its refusal to comply with the privacy regulator's recommendation, Google has become the subject of a four-year-long antitrust investigation by the European Commission. In September 2015, the French Data Protection Agency dismissed Google's appeal.
The French Data Protection Agency appealed to the EU courts to seek action against Google for failing to delink on its global servers. In September 2019 the Court of Justice of the EU issued its decision, finding that Google is not required to delink on sites outside the EU, concluding that "Currently, there is no obligation under EU law, for a search engine operator who grants a request for de-referencing made by a data subject ... to carry out such a de-referencing on all the versions of its search engine."
As of September 2015, the most delinked site is www.facebook.com. Three of Google's own sites, groups.google.com, plus.google.com and www.youtube.com are among the ten most delinked sites. In addition to Google, Yahoo and Bing have also made forms available for making delinking requests.
In September 2019, the European Court of Justice ruled that the Right to be Forgotten did not apply outside of its member states. The ruling meant that Google did not have to delete the names of individuals from all of its international versions.
In December 2022, the judges in Luxembourg further extended the right to be forgotten in case C-460/20 TU, RE v Google LLC. The case concerned two managers of a group of investment companies, who argued that three unflattering news articles should be "de-referenced" from Google's search results for their names. They claimed that the information presented in the articles was factually wrong, which raised the question of whether search engine operators need to check the accuracy of the information. The applicants also asked that photographs showing them in preview images (thumbnails) returned by a search be removed. In its judgment the European Court of Justice largely agreed with the applicants: search engine operators such as Google are required to de-reference the information in question if the person seeking de-referencing submits "relevant and sufficient" evidence capable of substantiating the request and thereby demonstrating the inaccuracy of the information found (para. 72). Thumbnails require an independent assessment, but essentially the same reasoning applies.
Europe's data jurisdiction also extends beyond its borders to countries that do not have "adequate" protections. For instance, Europe's transfers of data to such countries are restricted, leading companies like Google and Amazon to establish European data centers to quarantine European data.
Caselaw in Spain
In May 2014, the European Court of Justice ruled against Google in Costeja, a case brought by a Spanish man, Mario Costeja González, who requested the removal of a link to a digitized 1998 article in the newspaper La Vanguardia about an auction of his foreclosed home, held over a debt that he had subsequently paid. He initially attempted to have the article removed by complaining to the Spanish data protection agency, which rejected the claim on the grounds that the article was lawful and accurate, but accepted a complaint against Google and asked Google to remove the results. Google sued in the Spanish Audiencia Nacional (National High Court), which referred a series of questions to the European Court of Justice. The court ruled in Costeja that search engines are responsible for the content they point to and that Google was therefore required to comply with EU data privacy laws. On its first day of compliance alone (May 30, 2014), Google received 12,000 requests to have personal details removed from its search engine.
Caselaw in Germany
On October 27, 2009, lawyers for Wolfgang Werlé, who together with Manfred Lauber was convicted of murdering Walter Sedlmayr, sent the Wikimedia Foundation a cease and desist letter requesting that Werlé's name be removed from the English-language Wikipedia article on Walter Sedlmayr, citing a 1973 Federal Constitutional Court decision that allows the suppression of a criminal's name in news accounts once he is released from custody. Previously, Alexander H. Stopp, attorney for Werlé and Lauber, had won a default judgment in a German court, on behalf of Lauber, against the Wikimedia Foundation. According to the Electronic Frontier Foundation, Werlé's lawyers also challenged an Internet service provider in Austria that published the names of the convicted killers.
Wikimedia is based in the United States, where the First Amendment protects freedom of speech and freedom of the press. In Germany, the law seeks to protect the name and likenesses of private persons from unwanted publicity. On January 18, 2008, a court in Hamburg supported the personality rights of Werlé, which by German law includes removing his name from archive coverage of the case.
On November 12, 2009, The New York Times reported that Wolfgang Werlé had a case pending against the Wikimedia Foundation in a German court. The editors of the German-language Wikipedia article about Sedlmayr removed the names of the murderers, which have since been restored to the article. The Guardian observed that the lawsuit resulted in the Streisand effect, an upsurge in publicity for the case caused by the legal action.
On December 15, 2009, the German Federal Court of Justice (Bundesgerichtshof) in Karlsruhe ruled that German websites do not have to check their archives in order to provide permanent protection of personality rights for convicted criminals. The case occurred after the names of the brothers were found on the website of Deutschlandradio, in an archive article dating from July 2000. The presiding judge Gregor Galke stated "This is not a blank check", and stated that the right to rehabilitation of offenders had been taken into consideration.
On November 28, 2019, the German constitutional court in Karlsruhe ruled that German murderer Paul Termann has the right to be forgotten.
General Data Protection Regulation
Article 17 of the 2012 draft European Data Protection Regulation detailed the "right to be forgotten and to erasure". Under Article 17, individuals to whom the data pertains are granted the right to "obtain from the controller the erasure of personal data relating to them and the abstention from further dissemination of such data, especially in relation to personal data which are made available by the data subject while he or she was a child or where the data is no longer necessary for the purpose it was collected for, the subject withdraws consent, the storage period has expired, the data subject objects to the processing of personal data or the processing of data does not comply with other regulation".
The EU defines "data controllers" as "people or bodies that collect and manage personal data". The EU General Data Protection Regulation requires data controllers who have been informed that an individual has requested the deletion of any links to or copies of information must "take all reasonable steps, including technical measures, in relation to data for the publication of which the controller is responsible, to inform third parties which are processing such data, that a data subject requests them to erase any links to, or copy or replication of that personal data. Where the controller has authorized a third party publication of personal data, the controller shall be considered responsible for that publication". In the situation that a data controller does not take all reasonable steps then they will be fined heavily.
The European Parliament was once "expected to adopt the proposals in first reading in the April 2013 Plenary session". The right to be forgotten was replaced by a more limited right to erasure in the version of the GDPR adopted by the European Parliament in March 2014. Article 17 provides that the data subject has the right to request erasure of personal data relating to them on any one of a number of grounds, including non-compliance with Article 6(1) (lawfulness), which covers the case (f) where the legitimate interests of the controller are overridden by the interests or fundamental rights and freedoms of the data subject which require protection of personal data (see also Costeja).
The European Union is an influential group of states, and this tendency towards the right to be forgotten in the EU is an indicator of its recognition globally as a right. In support of this, in 2012 the Obama Administration released a "Privacy Bill of Rights" to protect consumers online; while it does not have the strength of the EU law, it is a step towards recognition of the right to be forgotten.
= India =
In April 2016, the Delhi High Court began to examine the issue after a Delhi banker requested to have his personal details removed from search results following a marital dispute. The banker argued that, because the dispute had been settled, his request was valid. The High Court asked for a reply from Google and other search engine companies by September 19, after which the court would continue to investigate the issue.
In January 2017, the Karnataka High Court upheld the right to be forgotten, in a case involving a woman who originally went to court in order to get a marriage certificate annulled, claiming to have never been married to the man on the certificate. After the two parties came to an agreement, the woman's father wanted her name to be removed from search engines regarding criminal cases in the high court. The Karnataka High Court approved the father's request, stating that she had a right to be forgotten. According to the court, its ruling would align with western countries' decisions, which typically approve of the right to be forgotten when dealing with cases "involving women in general and highly sensitive cases involving rape or affecting the modesty and reputation of the person concerned." The woman in this specific case was worried that the search results would affect her standing with her husband, as well as her reputation in society.
As of February 2017, the Delhi High Court is hearing a case involving a man requesting that information regarding his mother and wife be removed from a search engine. The man believes that having his name linked to the search results is hindering his employment options. The Delhi High Court is still working on the case, along with the question of whether a right to be forgotten should be a legal standard in India. Currently, there is no such legal standard, but if one were implemented, citizens would no longer need to file a case to request that information be removed from search engines. The case could have significant consequences for the right to be forgotten and search engines in India.
= South Korea =
In May 2016, South Korea's Communications Commission (KCC) announced that citizens would be able to request that search engines and website administrators restrict public access to their own postings. The KCC released "Guidelines on the Right to Request Access Restrictions on Personal Internet Postings", which took effect in June 2016 and do not apply to third-party content. To the extent that the right to be forgotten concerns a data subject's right to limit the searchability of third-party postings about them, the Guidelines therefore do not constitute a right to be forgotten. As to the right to withdraw one's own postings, critics have noted that people were already able to delete their own postings before the Guidelines, as long as they retained their login credentials, and that people who misplaced their credentials were permitted to retrieve or receive new ones. The only services significantly affected by the Guidelines are wiki-type services, where people's contributions make sense only in response to or in conjunction with one another's contributions and postings therefore become a permanent part of the mass-created content; the KCC specified that the Guidelines apply to these services only when a posting identifies its author.
Under the KCC guidelines, data subjects can request the removal of content by supplying the URL links and any evidence of the personal information involved. The commission added several amendments to the guidelines, including describing them as a "minimum" and "preliminary" precaution for privacy rights in areas where existing law is vague. The guidelines also encompass foreign Internet companies that provide translation services for South Korean consumers. To have information "forgotten", a person must go through a three-step process, as sketched below: identify the posting at issue and its URL, prove ownership of the post, and state the grounds for the request. There are restrictions at each step: when the URL is submitted, the web operator has the right to preserve the posting at issue, and if the post is relevant to the public interest, web operators will process the request on the basis of that relevance.
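The following minimal sketch summarizes the three-step process and the public-interest restriction described above. The field and function names are hypothetical; the actual KCC procedure is an administrative process, not an API.

```python
from dataclasses import dataclass

@dataclass
class RestrictionRequest:
    url: str                  # step 1: the posting at issue, identified by URL
    ownership_proof: str      # step 2: evidence that the requester authored the post
    grounds: str              # step 3: grounds for the request
    relevant_to_public_interest: bool = False

def handle_request(req: RestrictionRequest) -> str:
    """Rough decision flow implied by the guidelines; a simplified illustration only."""
    if not (req.url and req.ownership_proof and req.grounds):
        return "rejected: all three elements must be supplied"
    if req.relevant_to_public_interest:
        # the operator weighs the request against the post's public relevance
        return "deferred: processed on the terms of public-interest relevance"
    return "accepted: public access to the posting is restricted"
```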
= Switzerland =
The right to be forgotten was added to the constitution of the Canton of Geneva in a new Article 21A on the right to digital integrity, adopted by a vote on June 18, 2023.
= United States =
In law
Consideration of the right to be forgotten occurred in US case law, specifically in Melvin v. Reid, and in Sidis v. FR Publishing Corp.
In Melvin v. Reid (1931), an ex-prostitute was charged with murder and then acquitted; she subsequently tried to assume a quiet and anonymous place in society. However, the 1925 movie The Red Kimono revealed her history, and she successfully sued the producer. The court reasoned that "any person living a life of rectitude has that right to happiness which includes a freedom from unnecessary attacks on his character, social standing or reputation."
In Sidis v. FR Publishing Corp. (1940), the plaintiff, William James Sidis, was a former child prodigy who wished to spend his adult life quietly, without recognition; however, this was disrupted by an article in The New Yorker. The court held here that there were limits to the right to control one's life and facts about oneself, and held that there is social value in published facts, and that a person cannot ignore their celebrity status merely because they want to.
There is opposition to further recognition of the right to be forgotten in the United States, as commentators argue that it would contravene the rights to freedom of speech and freedom of expression, or would constitute censorship, thus potentially breaching the constitutionally protected right to freedom of expression under the United States Constitution. These criticisms are consistent with the proposal that the only information removable at a user's request should be content that the user themselves uploaded.
In a June 2014 opinion piece in Forbes, columnist Joseph Steinberg noted that "many privacy protections that Americans believe that they enjoy – even some guaranteed by law – have, in fact, been eroded or even obliterated by technological advances". In explaining the need for legislation guaranteeing the "right to be forgotten", Steinberg noted that existing laws require adverse information to be removed from credit reports after a period of time, and that laws allowing the sealing or expunging of criminal records are effectively undermined by the ability of prospective lenders or employers to find the removed information in a matter of seconds by doing a web search.
On March 11, 2015, Intelligence Squared US, an organization that stages Oxford-style debates, held an event on the question, "Should the U.S. adopt the 'Right to be Forgotten' online?" The side against the motion won with 56% of the voting audience.
While opinions among experts are divided in the U.S., one survey indicated that 9 in 10 Americans want some form of the right to be forgotten. The consumer rights organization Consumer Watchdog has filed a complaint with the Federal Trade Commission for Americans to obtain the right as well.
In March 2017, New York state senator Tony Avella and assemblyman David Weprin introduced a bill proposing that individuals be allowed to require search engines and online speakers to remove information that is "inaccurate", "irrelevant", "inadequate", or "excessive", that is "no longer material to current public debate or discourse" and is causing demonstrable harm to the subject.
In June 2018, California enacted the California Consumer Privacy Act, providing consumers with the right to delete their personal information from covered businesses. In October 2023, the state enacted the California Delete Act, requiring the California Privacy Protection Agency to establish a one-stop shop deletion mechanism for consumers to direct data brokers to delete their personal information.
By private entities
In January 2021, the Boston Globe announced a program to allow subjects of relatively inconsequential stories to have the stories contextualized, removed from Google searches or anonymized.
Connection to international relations
The regulatory differences between countries in the protection of personal data have a real impact on international relations. The right to be forgotten, specifically, is a matter of EU-US relations when applied to cross-border data flows, as it raises questions about territorial sovereignty. The structure of the Westphalian international system assumes that the reach of a country's jurisdiction is limited to its geographic territory. However, online interactions are independent of geographic location and are present across multiple locations, undermining the traditional concept of territorial sovereignty. The EU and the United States are therefore forced to confront their regulatory differences and negotiate a set of regulations that apply to all foreign companies processing and handling the data of European citizens and residents.
The regulatory differences on the right to be forgotten along with numerous other data protection rights have affected discussions and negotiations on trans-Atlantic data privacy regulations. A case in point is the EU and the United States' endeavors to develop the International Safe Harbor Privacy Principles agreement, a data transfer pact that enables the transfer of data between the EU and US companies in a manner consistent with the EU's data protection schemes. Article 25 of the Data Protection Directive articulates that cross-border transfer of data can take place only if the "third country in question ensures an adequate level of protection," meaning that the country meets the EU's minimum standards of data protection. The standards include, among many provisions, a component that protects the right to "opt out" of further processing or transmission of personal data, with the assumption that data may not be further processed in ways inconsistent with the intent for which they were collected.
Given the inconsistencies between the EU and the United States on numerous digital privacy regulations, including the right to be forgotten, Article 25 poses a threat to trans-Atlantic data flows. Therefore, the EU and the United States began negotiations to mediate the differences through the Safe Harbor agreement, which as a result of debate and discussion between the two parties, requires companies to provide individuals with the choice or opportunity to "opt out", and provides other protections.
As a result of the mass surveillance of European citizens' data by the US government, the Safe Harbor agreement was invalidated by the Court of Justice of the European Union in its Schrems decision. The Safe Harbor agreement has since been replaced by the Privacy Shield principles.
Response and criticism
= Response by reputation management firms =
Businesses that manage their clients' online reputations have responded to the European Court ruling by exercising the right to be forgotten as a means to remove unfavorable information. One technique used by reputation consulting companies is to submit multiple requests for each link, each written from a different angle, in an attempt to get links removed. Google, for example, does not limit the number of requests that can be submitted for the removal of a given link.
= Criticism =
Major criticisms stem from the idea that the right to be forgotten would restrict the right to freedom of speech. Many nations, and the United States in particular (with the First Amendment to the United States Constitution), have very strong domestic freedom of speech laws, which would be challenging to reconcile with the right to be forgotten. Some academics see only a limited form of the right to be forgotten as reconcilable with US constitutional law: the right of an individual to delete data that he or she has personally submitted. In this limited form, individuals could not have material removed that has been uploaded by others, as demanding the removal of information could constitute censorship and a reduction in the freedom of expression in many countries. Sandra Coliver of the Open Society Justice Initiative argues that not all rights must be compatible, and that this conflict between the two rights is not detrimental to the survival of either.
The draft General Data Protection Regulation was written broadly, and this has caused concern. It has attracted criticism that its enactment would require data-controlling companies to go to great lengths to identify third parties holding the information and have it removed. The proposed regulation has also been criticized for its potential censoring effect: companies such as Facebook or Google, wishing to avoid fines, would be likely to delete information wholesale rather than risk penalties, which could produce a "serious chilling effect". There are also concerns about the requirement to take down information that others have posted about an individual; the definition of personal data in Article 4(2) includes "any information relating to" the individual. Critics have claimed this would require companies to take down any information relating to an individual, regardless of its source, which would amount to censorship and result in big data companies eradicating large amounts of data to comply. Such removal can impair the accuracy and ability of businesses and individuals to perform business intelligence, particularly due diligence to comply with anti-bribery, anti-corruption, and know-your-customer laws. The right to be forgotten was invoked to remove from Google searches 120 reports about company directors published by Dato Capital, a Spanish company that compiles such reports about private company directors, consisting entirely of information the directors are required by law to disclose. Fortune magazine examined the 64 reports relating to UK directorships, finding that in 27 (42%) the director was the only person named, in the remainder only the director and co-directors were named, and 23 (36%) involved directorships started since 2012.
Other criticism revolves around the principle of accountability.
There were concerns that the proposed General Data Protection Regulation would result in Google and other Internet search engines producing not neutral search results but biased and patchy ones, compromising the integrity of Internet-based information. To balance this criticism, the proposed regulation included an exception "for the processing of personal data carried out solely for journalistic purposes or the purpose of artistic or literary expression in order to reconcile the right to the protection of personal data with the rules governing freedom of expression." Article 80 upheld freedom of speech, and while it did not lessen obligations on data providers and social media sites, the wide meaning of "journalistic purposes" allows more autonomy and reduces the amount of information that must be removed. When Google agreed to implement the ruling, European Commission Vice-President Viviane Reding said, "The Court also made clear that journalistic work must not be touched; it is to be protected." However, Google was criticized for taking down, under the Costeja precedent, a BBC News weblog post about Stan O'Neal by economics editor Robert Peston (Peston eventually reported that his weblog post had remained findable on Google after all). Despite these criticisms and Google's action, the company's CEO, Larry Page, worried that the ruling would be "used by other governments that aren't as forward and progressive as Europe to do bad things", though he has since distanced himself from that statement. For example, the pianist Dejan Lazic cited the right to be forgotten in trying to remove a negative review of his performance from The Washington Post, claiming that the critique was "defamatory, mean-spirited, opinionated, offensive and simply irrelevant for the arts". The St. Lawrence parish of the Roman Catholic church in Kutno, Poland, asked Google to remove the Polish Wikipedia page about it, which mentioned no allegations as of that date.
Index on Censorship claimed that the Costeja ruling "allows individuals to complain to search engines about information they do not like with no legal oversight. This is akin to marching into a library and forcing it to pulp books. Although the ruling is intended for private individuals it opens the door to anyone who wants to whitewash their personal history… The Court's decision is a retrograde move that misunderstands the role and responsibility of search engines and the wider Internet. It should send chills down the spine of everyone in the European Union who believes in the crucial importance of free expression and freedom of information."
In 2014, the Gerry Hutch page on the English Wikipedia was among the first Wikipedia pages to be removed from several search engines' query results in the European Union. The Daily Telegraph reported on 6 August 2014 that Wikipedia co-founder Jimmy Wales "described the EU's Right to be Forgotten as deeply immoral, as the organisation that operates the online encyclopedia warned the ruling will result in an Internet riddled with memory holes". Other commentators have disagreed with Wales, pointing to problems such as Google including links to revenge porn sites in its search results, and have accused Google of orchestrating a publicity campaign to escape the burdensome obligation to comply with the law. Julia Powles, a law and technology researcher at the University of Cambridge, rebutted Wales's and the Wikimedia Foundation's concerns in an editorial published by The Guardian, opining that "There is a public sphere of memory and truth, and there is a private one...Without the freedom to be private, we have precious little freedom at all."
In response to the criticism, the EU has released a factsheet to address what it considers myths about the right to be forgotten. In addition to this, for further clarification of the law, the factsheet provides information about the important court case C-131/12 and frequently asked questions regarding Google, the purpose of the law, and how it works.
Other criticism of the right to be forgotten concerns policies for data removal regarding minors. The U.S. has laws that protect the privacy of minors. California's minor "eraser" law allows California residents younger than 18 to request the removal of information that they posted on an online service. The law "applies to websites, social media sites, mobile apps and other online services" and follows "Europe's recognition of the 'right to be forgotten'". It took effect on January 1, 2015 and remains in force. Operators of online services "directed toward minors" must update their privacy policies to include the option for a minor to have data they posted on the service removed upon request.
In the UK, the 2017 Conservative manifesto included a pledge to allow social media users to remove outdated information that they posted when they were younger than 18. As one commentator put it: "A Tory victory on the 8th of June will lead to those youthful indiscretions on Facebook and Twitter being open to erasure. But there are also plans to fine social media firms for not moving at the speed of political opportunism over extreme content." The United Kingdom has not fully adopted the ruling of the European Court of Justice regarding the right to be forgotten and argued against it becoming EU law. However, following the election, laws could be passed in the UK to allow minors to remove embarrassing posts or photos on social media that could otherwise affect job applications or public image later in life.
Theresa May, then Prime Minister of the UK, advocated extending privacy rights for minors by giving them a right to delete information. The rationale for this extension is that social media sites store years of data that affect minors' lives long after the information is posted. May gave her stance on privacy when she said, "The Internet has brought a wealth of opportunity but also significant new risks which have evolved faster than society's response to them." The Conservative Party, which May led from 2016 to 2019, has pushed for policies that aggressively remove illegal material from the Internet and fine firms that do not act to remove such material.
In 2015, Commission nationale de l'informatique et des libertés (CNIL) asked Google to remove data from all versions available in any part of the world. Google and other entities argued that European data regulators should not be allowed to decide what Internet users around the world find when they use a search engine.
= Research =
Security researchers from CISPA, Saarland University, and the University of Auckland are working to develop software to support the automation of the right to be forgotten in a scalable, provable, and privacy-preserving manner. The team's software, Oblivion, would automate the process of verifying whether someone's personal information can be found in a Google search result, to help Google staff manage the high volume of take-down requests.
Researchers have noted that the current capabilities of this technology have some limitations. The software can only determine whether a piece of information is available, not whether it should be removed.
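As a rough illustration of the kind of check such a tool performs (this is not Oblivion's actual code; the function names and matching logic are assumptions), the core step is determining whether items of a person's personal information appear in the text of a page returned for a search query:

```python
import re

def find_exposed_items(page_text: str, personal_items: list[str]) -> list[str]:
    """Return the personal-data items (name, address, phone number, ...) that
    literally appear in the fetched page text. A real system would need fuzzier
    matching and verification that the data actually refers to the requester."""
    normalized = page_text.lower()
    return [item for item in personal_items
            if re.search(re.escape(item.lower()), normalized)]

# Example: decide whether a take-down request is even applicable to a given page.
page = "John Example, born 1980, was fined for ..."
exposed = find_exposed_items(page, ["John Example", "1980", "+49 170 0000000"])
if exposed:
    print("Personal information found on page:", exposed)  # candidates for delisting
```

This also illustrates the limitation noted above: such software can only determine whether information is present, not whether it should be removed.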
Data deletion protocols concerning the death of a user are another consideration.
See also
Article 29 Working Party
Fundamental rights
Information privacy
International human rights law
Internet privacy
Martin v. Hearst Corporation
Right to disconnect
Search engine privacy
Thomas Goolnik
Tiziana Cantone
References
Further reading
= Articles =
= Books =
Ausloos, Jef (2020). The Right to Erasure in EU Data Protection Law. Oxford University Press. ISBN 9780198847977.
Eichhorn, Kate (2019). The End of Forgetting: Growing Up with Social Media. Harvard University Press. ISBN 978-0674976696.
Jones, Meg Leta (2016). Ctrl + Z: The Right to Be Forgotten. NYU Press. ISBN 978-1479881703.
= Cases =
Melvin v. Reid, 112 Cal.App. 285, 297 P. 91 (1931)
Sidis v. F-R Publishing Corporation, 311 U.S. 711, 61 S. Ct. 393, 85 L. Ed. 462 (1940)
Google Spain, S.L., Google Inc. y Agencia Española de Protección de Datos (AEPD), Mario Costeja González ECLI:EU:C:2014:317
= Legislation =
Directive 95/46/EC on the protection of individuals with regard to the processing of personal data and on the free movement of such data. EU Directive 1995.
European Commission. Proposal for a Regulation of the European Parliament and of the Council on the Protection of Individuals with Regard to the Processing of Personal Data and On the Free Movement of Such Data (General Data Protection Regulation). 2012/0011 (COD). Article 3. "Territorial Scope."
= In art =
Esther Hovers, 2021
External links
Google's 'Search removal request under European Data Protection law' form
Yahoo's Right to be Forgotten Request page
Bing's European Privacy Request form
List of European Data Protection Authorities (for appeals)
Guidelines on the implementation of the Court of Justice of the European Union judgment on Google Spain and inc v. Agencia Española de Protección de Datos (AEPD) and Mario Costeja González
Factsheet on the "Right to be Forgotten" ruling (C-131/12)
Google's Transparency Report (statistics and examples)
Report of the Advisory Council to Google on the Right to be Forgotten
Google's responses to the Questionnaire addressed to Search Engines by the Article 29 Working Party
List of BBC web pages which have been removed from Google's search results
Telegraph stories affected by EU 'right to be forgotten'
The Right to Be Forgotten: Forced Amnesia in a Technological Age
The Right to be Forgotten Archived 2019-12-03 at the Wayback Machine
Understand Your Privacy Rights