The Swedish government internet censor fines Google for not taking down links and for warning the targeted website about the censorship

17th March 2020

See article from sputniknews.com
The Swedish data protection censor, Datainspektionen, has fined Google 75 million Swedish kronor (7 million euro) for failing to comply with its censorship instructions. According to the internet censor, which is affiliated with Sweden's Ministry of Justice, Google violated the terms of the right-to-be-forgotten rule, an EU-mandated regulation introduced in 2014 that allows individuals to request the removal of potentially harmful private information from internet searches and directories.

Datainspektionen says an internal audit has shown that Google failed to properly remove two search results that were ordered to be delisted back in 2017, either interpreting too narrowly what content needed to be removed, or failing to remove a link to content without undue delay. The watchdog has also slapped Google with a cease-and-desist order over its practice of notifying website owners of a delisting request, claiming that this practice defeats the purpose of link removal in the first place.

Google has promised to appeal the fine, with a spokesperson for the company saying that it disagrees with the decision on principle.
The EU's highest court finds that the 'right to be forgotten' does not apply outside of the EU

25th September 2019

See article from bbc.com
See A view from America: Europe's 'right to be forgotten' shows legislators what not to do from washingtonpost.com
The EU's top court has ruled that Google does not have to apply the right to be forgotten globally. It means the firm only needs to remove links from its search results in the EU, and not elsewhere.

The ruling stems from a dispute between Google and the French privacy censor CNIL. In 2015 CNIL ordered Google to globally remove search result listings to pages containing banned information about a person. The following year, Google introduced a geoblocking feature that prevents European users from being able to see delisted links, but it resisted censoring search results for people in other parts of the world, and Google challenged a 100,000 euro fine that CNIL had tried to impose.

The right to be forgotten, officially known as the right to erasure, gives EU citizens the power to request that data about them be deleted. Members of the public can make a request to any organisation verbally or in writing and the recipient has one month to respond. They then have a range of considerations to weigh up in deciding whether they are compelled to comply or not.

Google had argued that internet censorship rules should not be extended to external jurisdictions lest other countries do the same, e.g. China would very much like to demand that the whole world forgets the Tiananmen Square massacre.

The court also issued a related second ruling, which said that links do not automatically have to be removed just because they contain information about a person's sex life or a criminal conviction. Instead, it ruled that such listings could be kept where strictly necessary for people's freedom of information rights to be preserved. However, it indicated that a high threshold should be applied and that such results should fall down search result listings over time.

Notably, the ECJ ruling said that delistings must be accompanied by measures which effectively prevent or, at the very least, seriously discourage an internet user from being able to access the results from one of Google's non-EU sites. It will be for the national court to ascertain whether the measures put in place by Google Inc meet those requirements.
European Court of Justice moves towards limiting censorship via the 'right to be forgotten' to the EU

13th January 2019

See article from clicklancashire.com
The French internet censor CNIL some time ago insisted that censorship required under the 'right to be forgotten' should be applied worldwide rather than limited to the EU. Google appealed against the court order, leading to the case being sent to the European Court of Justice.

Now opinions from the court's advocate general suggest that the court will determine that the right to be forgotten does not apply worldwide. The opinions are not final, but the court often follows them when it hands down its ruling, which is expected later.

CNIL wanted Google to remove links from Google.com rather than just from European versions of the site, such as Google.de and Google.fr. However, Advocate General Maciej Szpunar warned that going further would be risky because the right to be forgotten always has to be balanced against other rights, including the legitimate public interest in accessing the information sought. Szpunar said that if worldwide de-referencing were allowed, European Union authorities would not be able to determine a right to receive information or balance it against other fundamental rights to data protection and to privacy. And of course, if France were allowed to censor information from the entire worldwide internet, then why not China, Russia, Iran and Saudi Arabia?
European Court of Justice hears case with France calling for its information bans under the 'right to be forgotten' to be implemented throughout the world

10th September 2018

See article from article19.org
ARTICLE 19 is leading a coalition of international human rights organisations which will tell the European Court of Justice (CJEU) that the de-listing of websites under the right to be forgotten should be limited in order to protect global freedom of expression. The hearing will take place on September 11, with a judgment expected in early 2019.

The CJEU hearing in Google vs CNIL is taking place after France's highest administrative court asked for clarification in relation to the 2014 ruling in Google Spain. That judgment allows European citizens to ask search engines like Google to remove links to inadequate, irrelevant or ... excessive content -- commonly known as the right to be forgotten (RTBF). While the content itself remains online, it cannot be found through online searches of the individual's name.

The CJEU has been asked to clarify whether a court or data regulator should require a search engine to de-list websites only in the country where it has jurisdiction or across the entire world. France's data regulator, the Commission Nationale de l'Informatique et des Libertes (CNIL), has argued that if it upholds a complaint by a French citizen, search engines such as Google should be compelled to remove links not only from google.fr but from all Google domains.

ARTICLE 19 and the coalition of intervening organisations have warned that forcing search engines to de-list information on a global basis would be disproportionate. Executive Director of ARTICLE 19, Thomas Hughes, said:

This case could see the right to be forgotten threatening global free speech. European data regulators should not be allowed to decide what Internet users around the world find when they use a search engine. The CJEU must limit the scope of the right to be forgotten in order to protect the right of Internet users around the world to access information online.

ARTICLE 19 argues that rights to privacy and rights to freedom of expression must be balanced when deciding whether websites should be de-listed. Hughes added:

If European regulators can tell Google to remove all references to a website, then it will be only a matter of time before countries like China, Russia and Saudi Arabia start to do the same. The CJEU should protect freedom of expression, not set a global precedent for censorship.
France ludicrously claims the right to censor the world's internet and fines Google for not blocking Americans from viewing content censored in the EU

26th March 2016

See article from engadget.com
Europe's right to be forgotten is a nasty and arbitrary censorship power used to hide internet content such as past criminal history. Many think it tramples on the public's right to know, as quite a few examples have borne out. It seems that France and the EU think that such content should be censored worldwide, and Google has been fined 100,000 euro for allowing non-EU internet viewers to see information censored in the EU.

Since EU laws don't apply elsewhere, Google at first just deleted right-to-be-forgotten results from its French domain. However, France pointed out that it would be easy to find the information on a different site and ordered the company to scrub results everywhere. In an attempted compromise, Google started omitting results worldwide as long as it determined, by geolocation, that the search was conducted from within France. But now EU internet censors have rejected that idea (as it would be easy to get around with a VPN) and have fined Google, effectively for allowing Americans to see content censored in the EU. Google commented:

We disagree with the [regulator's] assertion that it has the authority to control the content that people can access outside France.

In its ruling, France's CNIL censor says that geolocalizing search results does not give people effective, full protection of their right to be delisted ... accordingly, the CNIL restricted committee pronounced a 100,000 euro fine against Google.

Google plans to appeal the ruling.
Google starts censoring google.com for countries affected by EU censorship demands

5th March 2016

See article from theregister.co.uk
See article from searchengineland.com
If you use Google in Europe, your search results will be censored under the EU's disgraceful 'right to be forgotten'. Until now, if you used Google.com rather than, say, Google.de, you could still find results that have been arbitrarily removed based on how loud people shout.

The censorship has been implemented as follows. Assume that someone in Germany files a Right To Be Forgotten request to have some listing censored for their name. If granted, the censorship will work like this for searches on that person's name (see the illustrative sketch after the list):
- Listing censored for those in Germany, using ANY version of Google.
- Listing censored for those in the EU, using a European version of Google.
- Listing NOT censored for those outside Germany but within the EU, using non-European versions of Google.
- Listing NOT censored for those outside the EU, using ANY version of Google.
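The following minimal Python sketch restates those four rules as code. It is purely illustrative: the function name, the abbreviated country and domain sets, and the geolocation inputs are assumptions for the example, not Google's actual implementation, which is not public.

```python
# Hypothetical illustration of the regional delisting rules listed above.
# The country and domain sets are abbreviated examples only.

EU_COUNTRIES = {"DE", "FR", "ES", "IT", "GB"}           # abbreviated for illustration
EU_DOMAINS = {"google.de", "google.fr", "google.es",
              "google.it", "google.co.uk"}              # European versions of Google

def listing_hidden(request_country: str, searcher_country: str, domain: str) -> bool:
    """Return True if a granted RTBF listing is hidden from this search.

    request_country  -- country where the delisting request was granted (e.g. "DE")
    searcher_country -- country the searcher appears to be in (by geolocation)
    domain           -- the Google front end being used (e.g. "google.com")
    """
    # Rule 1: hidden for searchers in the requester's own country on ANY version.
    if searcher_country == request_country:
        return True
    # Rule 2: hidden for other EU searchers, but only on European versions of Google.
    if searcher_country in EU_COUNTRIES and domain in EU_DOMAINS:
        return True
    # Rules 3 and 4: visible to other EU searchers on non-European versions,
    # and visible everywhere to searchers outside the EU.
    return False

# Examples matching the four bullet points above:
assert listing_hidden("DE", "DE", "google.com") is True    # in Germany, any version
assert listing_hidden("DE", "FR", "google.fr") is True      # in the EU, EU version
assert listing_hidden("DE", "FR", "google.com") is False    # in the EU, non-EU version
assert listing_hidden("DE", "US", "google.de") is False     # outside the EU, any version
```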
Google's Peter Fleischer explained the reasons for the censorship: We're changing our approach as a result of specific discussions that we've had with EU data protection regulators in recent months.
We believe that this additional layer of delisting enables us to provide the enhanced protections that European regulators ask us for, while also upholding the rights of people in other countries to access lawfully published
information.
19th December 2015

Here's a conspiracy theory for you: Are anti-European MEPs behind this shamefully bad censorship legislation so as to encourage us to vote for Brexit?

See article from techdirt.com
ICO demands that Google censors information from google.com when accessed from the UK

25th November 2015

See article from publicaffairs.linx.net
The "right to be forgotten" applies to any search engine accessible in the UK, the Information Commissioner's Office has claimed. In a blog post earlier this month, ICO demanded: In August we
issued our first enforcement notice in this area , ordering Google to remove nine search
results brought up by entering an individual's name. Google has so far responded constructively, and the links are no longer visible on the European versions of their search engine. However we consider that they should go a step further, and make the
links no longer visible to anyone directly accessing any Google search services from within the UK (this would include someone sat a desk in Newcastle, but using google.com). This is a proper and proportionate reflection of what the EU Court of Justice
ruling means in practice, and so we've clarified the original enforcement notice , with the original text remaining the same but with a new section
added spelling out exactly what we expect of Google.
More disgraceful censorship legislation on the way from the EU

21st November 2015

See article from eff.org
Europe is very close to the finishing line of an extraordinary project: the adoption of the new General Data Protection Regulation (GDPR), a single, comprehensive replacement for the 28 different laws that implement Europe's existing 1995 Data Protection Directive. More than any other instrument, the original Directive has created a high global standard for personal data protection, and led many other countries to follow Europe's approach. Over the years, Europe has grown ever more committed to the idea of data protection as a core value. The Union's Charter of Fundamental Rights, legally binding on all the EU states since 2009, lists the "right to the protection of personal data" as a right separate from, and equal to, the right to privacy. The GDPR is intended to update and maintain that high standard of protection, while modernising and streamlining its enforcement.

The battle over the details of the GDPR has so far mostly been a debate between advocates pushing to better defend data protection and companies and other interests that find consumer privacy laws a hindrance to their business models. Most of the compromises between these two groups have already been struck. The result is a ticking time-bomb that will be bad for online speech, and bad for the future reputation of the GDPR and data protection in general.

The current draft of the GDPR doubles down on Google Spain, and raises new problems. (The draft currently under negotiation is not publicly available, but July 2015 versions of the provisions that we refer to can be found in this comparative table of proposals and counter-proposals by the European institutions. Article numbers referenced here, which will likely change in the final text, are to the proposal from the Council of the EU unless otherwise stated.)

First, it requires an Internet intermediary (which is not limited to a search engine, though the exact scope of the obligation remains vague) to respond to a request by a person for the removal of their personal information by immediately restricting the content, without notice to the user who uploaded that content (Articles 4(3a), 17, 17a, and 19a). Compare this with DMCA takedown notices, which include a notification requirement, or even the current Right to Be Forgotten process, which gives search engines some time to consider the legitimacy of the
request. In the new GDPR regime, the default is to block. Then, after reviewing the (also vague) criteria that balance the privacy claim with other legitimate interests and public interest considerations such as freedom of
expression (Articles 6.1(f), 17a(3) and 17.3(a)), and possibly consulting with the user who uploaded the content if doubt remains, the intermediary either permanently erases the content (which, for search engines, means removing their link to it), or
reinstates it (Articles 17.1 and 17a(3)). If it does erase the information, it is not required to notify the uploading user of having done so, but is required to notify any downstream publishers or recipients of the same content (Articles 13 and 17.2),
and must apparently also disclose any information that it has about the uploading user to the person who requested its removal (Articles 14a(g) and 15(1)(g)). Think about that for a moment. You place a comment on a website which
mentions a few (truthful) facts about another person. Under the GDPR, that person can now demand the instant removal of your comment from the host of the website, while that host determines whether it might be okay to still publish it. If the host's
decision goes against you (and you won't always be notified, so good luck spotting the pre-emptive deletion in time to plead your case to Google or Facebook or your ISP), your comment will be erased. If that comment was syndicated, by RSS or some other
mechanism, your deleting host is now obliged to let anyone else know that they should also remove the content. Finally, according to the existing language, while the host is dissuaded from telling you about any of this procedure,
it is compelled to hand over personal information about you to the original complainant. So this part of the EU's data protection law would actually release personal information!

What are the incentives for the intermediary to stand by the author and keep the material online? If the host fails to remove content that a data protection authority later determines it should have removed, it may become liable to astronomical penalties of €100 million or up to 5% of its global
turnover, whichever is higher (European Parliament proposal for Article 79). That means there is enormous pressure on the intermediary to take information down if there is even a remote possibility that the information has indeed
become "irrelevant", and that countervailing public interest considerations do not apply. It is not too late yet: proposed amendments to the GDPR are still being considered. We have written a
joint letter with ARTICLE 19 to European policymakers, drawing their
attention to the problem and explaining what needs to be done. We contend that the problems identified can be overcome by relatively simple amendments to the GDPR, which will help to secure European users' freedom of expression, without detracting from
the strong protection that the regime affords to their personal data. Without fixing the problem, the current draft risks sullying the entire GDPR project. Just like the DMCA takedown process, these GDPR removals won't just be
used for the limited purpose they were intended for. Instead, they will be abused to censor authors and invade the privacy of speakers. A GDPR without fixes will damage the reputation of data protection law as effectively as the DMCA permanently tarnished
the intent and purpose of copyright law.
The EU's right to be forgotten has diminished free speech according to former UN free speech rapporteur

27th June 2015

See article from indexoncensorship.org
Freedom of expression is more in danger today than in 2008 because of the right to be forgotten, the United Nations' former free expression rapporteur Frank La Rue told an internet conference. At the event La Rue told Index on Censorship:
The emphasis on the 'right to be forgotten' in a way is a reduction of freedom of expression, which I think is a mistake. People get excited because they can correct the record on many things but the trend is towards
limiting people's access to information which I think is a bad trend in general.
La Rue, who was the UN's rapporteur between 2008 and 2014, addressed lawyers, academics and researchers at the Institute of Advanced Legal Studies in
London, in particular covering the May 2014 right to be forgotten ruling from the Court of Justice of the European Union, and its impact on free speech. On the ruling, La Rue said: I would want to know the past.
It is very relevant information. Everyone should be on the record and we have to question who is making these decisions anyway? The state is accountable to the people of a nation so should be accountable here. Not private
companies and especially not those with commercial interests.
BBC publishes a long list of websites censored from Google search under the EU's 'right to be forgotten'

27th June 2015

See the list of censored news stories from bbc.co.uk
The BBC explains its commendable policy in a blog post: Since a European Court of Justice ruling last year, individuals have the right to request that search engines remove certain web pages from their search results. Those pages
usually contain personal information about individuals. Following the ruling, Google removed a large number of links from its search results, including some to BBC web pages, and continues to delist pages from BBC Online.
The BBC has decided to make clear to licence fee payers which pages have been removed from Google's search results by publishing this list of links. Each month, we'll republish this list with new removals added at the top.
We are doing this primarily as a contribution to public policy. We think it is important that those with an interest in the right to be forgotten can ascertain which articles have been affected by the ruling. We hope it will
contribute to the debate about this issue. We also think the integrity of the BBC's online archive is important and, although the pages concerned remain published on BBC Online, removal from Google searches makes parts of that archive harder to find.
The pages affected by delinking may disappear from Google searches, but they do still exist on BBC Online. David Jordan, the BBC's Director of Editorial Policy and Standards, has written a blog post which explains how we view that
archive as a matter of historic public record and, thus, something we alter only in exceptional circumstances. The BBC's rules on deleting content from BBC Online are strict; in general, unless content is specifically made available only for a
limited time, the assumption is that what we publish on BBC Online will become part of a permanently accessible archive. To do anything else risks reducing transparency and damaging trust. One caveat: when looking through this
list it is worth noting that we are not told who has requested the delisting, and we should not leap to conclusions as to who is responsible. The request may not have come from the obvious subject of a story.

See the list of censored news stories from bbc.co.uk
Internet censor wants Google to implement its censorship demands worldwide, not just in France

15th June 2015

See article from bbc.co.uk
Google has 15 days to comply with a demand from France's internet censor to extend the right to be forgotten to all its search engines. Google has responded to European censorship under the right to be forgotten by removing the required information only from the version of the search engine specific to the censoring country; in particular, it leaves the links live in the global google.com version. French censor CNIL said Google could face sanctions if it did not comply within the time
limit. In response, Google said in a statement: We've been working hard to strike the right balance in implementing the European Court's ruling, co-operating closely with data protection authorities.
The ruling focused on services directed to European users, and that's the approach we are taking in complying with it.
18th December 2014

That is a big step, one that even China, the master of internet censorship, has never taken.

See article from huffingtonpost.com
14th December 2014

The company is under relentless attack by European authorities who won't stop until they do real damage. By Mike Elgan.

See article from computerworld.com
EU internet censors publish their rules about the 'right to be forgotten'

4th December 2014

See article from searchengineland.com
See EU 'right to be forgotten' censorship rules [pdf] from ec.europa.eu
The EU has issued formal censorship rules surrounding the so-called Right to Be Forgotten (RTBF). The formal considerations that the EU data censors want considered in evaluating any RTBF request are:
- Does the search result relate to a natural person -- i.e. an individual? And does the search result come up against a search on the data subject's name?
- Does the data subject play a role in public life?
- Is the data subject a public figure?
- Is the data subject a minor?
- Is the data accurate?
- Is the data relevant and not excessive?
- Is the information sensitive within the meaning of Article 8 of the Directive 95/46/EC?
- Is the data up to date? Is the data being made available for longer than is necessary for the purpose of the processing?
- Is the data processing causing prejudice to the data subject?
- Does the data have a disproportionately negative privacy impact on the data subject?
- Does the search result link to information that puts the data subject at risk?
- In what context was the information published?
- Was the original content published in the context of journalistic purposes?
- Does the publisher of the data have a legal power, or a legal obligation, to make the personal data publicly available?
- Does the data relate to a criminal offence?
In most cases, it appears that more than one criterion will need to be taken into account in order to reach a decision to censor. In other words, no single criterion is, in itself, determinative. The document asserts that successful RTBF
requests should be applied globally and not just to specific country domain search results, as Google has been doing: [D]e-listing decisions must be implemented in a way that guarantees the effective and complete
protection of these rights and that EU law cannot be easily circumvented. In that sense, limiting de-listing to EU domains on the grounds that users tend to access search engines via their national domains cannot be considered a sufficient means to
satisfactorily guarantee the rights of data subjects according to the judgment. In practice, this means that in any case de-listing should also be effective on all relevant domains, including .com
But any such global de-listing
sets up a conflict of laws between nations that recognize RTBF and those that do not. Google had been notifying publishers that their links were being removed, causing some to republish those links for re-indexing. This has frustrated some European
censors who see this practice as undermining the RTBF. Accordingly, the EU says that publishers should not be notified of the removal of links: Search engine managers should not as a general practice inform the
webmasters of the pages affected by de-listing of the fact that some webpages cannot be acceded from the search engine in response to specific queries. Such a communication has no legal basis under EU data protection law.
The EU also
doesn't want Google to publish notices to users that links have been removed for similar reasons: It appears that some search engines have developed the practice of systematically informing the users of search engines
of the fact that some results to their queries have been de-listed in response to requests of an individual. If such information would only be visible in search results where hyperlinks were actually de-listed, this would strongly undermine the purpose
of the ruling. Such a practice can only be acceptable if the information is offered in such a way that users cannot in any case come to the conclusion that a specific individual has asked for the de-listing of results concerning him or her.
The guidelines state that, beyond external search engines (e.g. Google), they may be extended to undefined intermediaries. However, they immediately go on to apparently contradict that notion:

The right to de-listing should not apply to search engines with a restricted field of action, particularly in the case of search tools of websites of newspapers.

Finally, the guidelines suggest that only EU citizens may be eligible in practice to make RTBF requests.
EU internet censors want to prevent Europeans from accessing censored links via google.com

27th November 2014

See article from bbc.co.uk
Google is under fresh pressure to expand censorship under the right to be forgotten to its international .com search engine. A panel of EU censors claimed the move was necessary to prevent the law from being circumvented. Google
currently de-lists results that appear in the European versions of its search engines, but not the international one. At present, visitors are diverted to localised editions of the US company's search tool - such as Google.co.uk and Google.fr - when they
initially try to visit the Google.com site. However, a link is provided at the bottom right-hand corner of the screen offering an option to switch to the international .com version. This link does not appear if the user attempted to go to a regional version in the first place. Even so, it means it is possible for people in Europe to easily opt out of the censored lists.
Man goes to court to force Google to track down links to delete

25th November 2014

24th November 2014. See article from bbc.co.uk
The case of a UK businessman who wants Google to stop malicious web postings about him appearing in its search results is set to begin. Daniel Hegglin says he has been wrongly called a murderer, a paedophile and a Ku Klux Klan sympathiser during a
malicious online campaign against him. He wants Google to block the anonymous posts from its search engine results. Google asked him to provide a list of web links to be removed, but High Court judges will rule if it should do more. He claims
there are more than 3,600 websites containing abusive and untrue material about him, and says listing all the posts for Google to remove would be expensive, time consuming, and ineffective. He says that although Google is not the originator of the
abusive campaign, its search engines have allowed the abuse to become more widespread. He is seeking a legal order to force Google to take steps to prevent the abusive posts being processed in searches in England and Wales.
Update: Settled

25th November 2014. See article from wiggin.co.uk

The case was settled on the first day of trial. Daniel Hegglin's barrister said in a statement:

The settlement includes significant efforts on Google's part to remove the abusive material from Google-hosted websites and from its search results. Mr Hegglin will now concentrate his energies on bringing the persons responsible for this campaign of harassment to justice.

And a statement for Google:

Google provides search services to millions of people and cannot be responsible for policing internet content. It will, however, continue to apply its procedures that have been developed to assist with the removal of content which breaches local applicable laws.
23rd September 2014

Oxford Mail republishes crime stories censored by Google under the EU's disgraceful 'right to be forgotten'

See article from oxfordmail.co.uk
23rd September 2014

Myth-busting: European Commission misrepresents right to be forgotten objections

See article from indexoncensorship.org
EU internet censors want to extend arbitrary censorship under the right to be forgotten

19th September 2014

See article from searchengineland.com
According to Reuters, European internet censors say they've agreed on a uniform set of EU-wide rules and criteria that will be used to evaluate appeals under the disgraceful Right to Be Forgotten (RTBF) law announced earlier this year by the Luxembourg-based European Union Court of 'Justice'.

Google has received in excess of 120,000 censorship requests since May. Many have been granted but many have not. Google is hardly in a position to research the merits of each case, so the decisions are essentially arbitrary. Those whose censorship requests are turned down will be able to appeal the decision, and that's where these censorship criteria will be applied.

The specifics of the rules won't be finalized until November. However, Reuters suggests they will primarily take into account factors such as the public role of the person, whether the information relates to a crime and how old it is. There's still considerable ambiguity in some of these areas.

Google has adopted a practice of notifying publishers when RTBF links are removed. Apparently EU censors don't like this practice (probably because it puts political pressure on them amid cries of censorship or objections from the publishers).

Google currently only removes the subject links and material from the individual country Google site where the request was made (e.g. Google.fr, Google.de) but not from Google.com. Johannes Caspar, Germany's internet censor, reportedly believes that these RTBF removals should be expunged globally. He spewed:

The effect of removing search results should be global. This is in the spirit of the court ruling and the only meaningful way to act in a global environment like the Internet.

Hopefully this won't occur, as the US is a bit more keen on freedom than the PC extremists of the EU.
Google organises protests across Europe against the ludicrous and inept 'right to be forgotten'

8th September 2014

See article from stockwisedaily.com
Google is to fight back against the European Union's inane right to be forgotten ruling. Under the European Union Court of Justice ruling, Google must remove personal information from search results upon request without being in a position to ascertain that the request is justified.

In order to oppose the ruling, Google is planning public hearings in seven different European cities, starting in Madrid on September 9. Google is looking for a robust debate over the ruling and its implementation criteria, according to a top lawyer at the company, David Drummond. Google is not the only party to criticize the ruling: Wikipedia founder Jimmy Wales has called the ruling deeply immoral and has said it will lead to an internet riddled with memory holes.

Drummond and Eric Schmidt, Google's chairman, will highlight the implications of the ruling. Furthermore, the company will outline ideas for handling requests related to criminal convictions.
EU internet censors get heavy with Google for informing websites that they have been censored

26th July 2014

From itproportal.com
The EU's Article 29 Censorship Working Party has criticised Google for telling publishers about removed right to be forgotten links, and it wants links removed worldwide, not just on European variants of Google. Representatives of Google, Yahoo
and Bing were called back to address issues about the way that Google was handling right to be forgotten censorship requests. It turned into a sort of public dressing down of Google for not censoring links 'properly'. Google was criticised for the fact that it was only removing links from the EU sites, while links could still be found on the US and other Google search pages. The EU censors feel that any EU citizen who doesn't like a particular post has the right to have all links to that story censored worldwide. Google was also called out because it was informing the sources of the stories that it was pulling the links (causing websites to republish new articles, which added more new links, and so on). Irish data protection
censor Billy Hawkes expressed concerns regarding Google warning sites about their links being removed. The more they do so, it means the media organisation republishes the information and so much for the right to be
forgotten. There is an issue there.
Jimmy Wales criticises 'right to be forgotten' censorship

See article from theguardian.com

Internet search engines such as Google should not be left in charge of censoring history, Wikipedia founder Jimmy Wales has said, after the US firm revealed it had approved half of more than 90,000 right to be forgotten requests. Wales said it was dangerous to have companies decide what should and should not be allowed to appear on the internet.
Society of Editors calls on David Cameron to end censorship based on the right to be forgotten

24th July 2014

See article [pdf] from indexoncensorship.org

The Society of Editors, which has the backing of senior figures at the BBC, Sky News and ITN as well as major newspaper groups, has joined with Index on Censorship and the Media Lawyers Association to call on David Cameron and key EU data protection chiefs to resist censorship in the guise of the right to be forgotten. The Society of Editors has written to David Cameron:

Dear Prime Minister,
The issues about the so-called right to be forgotten raised by the recent European Court judgement involving Google, with its implications for other search engines and accessibility to other journalistic information give us serious cause for
concern. We appreciate that no general right to be forgotten exists, as Ministers and the Information Commissioner have confirmed. The Court ruling is only about restricting access to links generated by search engines in
response to name searches. But there is a vital principle at stake which we trust that the Information Commissioner - responsible for adjudicating both data protection and freedom of information in the UK - and the government will defend with vigour.
The judgement makes clear that Europeans now have the right to demand that certain online material is obscured in search results and its dissemination via search engines is stopped. For media organisations and journalists, this is
akin to being asked - on the basis of the subjective opinions of individuals, rather than any specific Court order - to remove items from an index in newspaper archives. This is something we would only do after careful consideration based on a sound
legal and factual basis and hope never to be asked to do more. We feel sure that neither the Information Commissioner nor the government would wish to see this happen but we seek assurances that any such moves will be firmly
resisted and will not be applied in any new data protection legislation coming out of Europe in the future. We are concerned that the European Court's judgment goes against Article 10 of the European Convention on Human Rights and
certainly the intentions of the UK Parliament when it introduced the Human Rights Act. With regard to data protection legislation, journalistic work has always received special consideration. We are glad to see that the Court's
ruling continues this, and does not require news publishers to remove articles when asked to do so by individuals. This principle must be strongly defended or even enhanced. But the Court's ruling is deeply problematic for journalism in general, as it
has the effect of limiting the accessibility and dissemination of journalistic work via search engines, where the media company wishes this to be done. This reduces the visibility of the vital work done by journalists to ensure accountability throughout
society, which in itself is contrary to the spirit behind Article 10. For this reason, we believe that there should be greater transparency about the actions of search engines to comply with the European Court's ruling.
Specifically, we believe there should be no restrictions on the ability of Google or other operators to inform the originator of material when links to that material are removed. Any restrictions would prevent publishers having the opportunity to make
their case on freedom of expression grounds thus making the process one-sided. The Society of Editors has more than 400 members in national, regional and local newspapers, magazines, broadcasting and digital media, journalism
education and media law. It campaigns for media freedom, self regulation, the public's right to know and the maintenance of standards in journalism. This letter has the full support of the Society's board of directors which includes senior editors from
Sky News and the BBC and key regional newspapers in England, Scotland, Wales and Northern Ireland. It also has the support of editors of major UK newspapers, including The Times, The Sunday Times, The Sun, The Guardian, The Independent, the Financial
Times, the Daily Express, the Daily Mirror, the Sunday Mirror, The Daily Telegraph, and Associated Newspapers as well as ITN. We would be grateful for your comments about this and your assurances that these principles will be
defended.
10th July 2014

When government minister Simon Hughes tries to spin that it's not

See article from dailymail.co.uk
10th July 2014

For centuries, the wicked have dreamt of wiping their crimes from history. Now - thanks to an idiotic ruling by European judges and Google's connivance - they're doing it in their thousands

See article from dailymail.co.uk
6th July 2014

The Register asks if recent examples of right to be forgotten censorship by Google were a ploy

See article from theregister.co.uk
3rd July 2014

The BBC reveals how its news archives have been censored by Google in the name of the right to be forgotten

See article from bbc.co.uk
3rd July 2014

The Guardian reveals how its news archives have been censored by Google in the name of the right to be forgotten

See article from theguardian.com
Google begins removing search links under the EU's right to be forgotten

27th June 2014

See article from theguardian.com
Google has begun removing search links to content in Europe under the right to be forgotten ruling, which obliges it to exclude web pages with supposedly outdated or irrelevant information about individuals from web searches. Searches made on Google's services in Europe using people's names include a section at the bottom with the phrase Some results may have been removed under data protection law in Europe, and a link to a page explaining the ruling by the European Court of Justice (ECJ) in May 2014. However, searches made on Google.com, the US-based service, do not include the same warning, because the ECJ ruling only applies within Europe. Google would not say how many people's search histories have been censored, nor how many web pages have been affected.

Comment: Goggle.eu.censored

28th June 2014. From Alan

Not mentioned in the Guardian report is the difficulty for UK surfers of finding uncensored searches on the American site. If I'm in Italy, I can either search in Italian at google.it or, if I want to search in English and enter google.com, I get the American site. But in this country, typing the URL for google.com redirects to google.co.uk. Looks like we Brits are particularly disadvantaged by the absurd decision of twattish Euro-judges.
The supposed 'right to be forgotten' doesn't look like it will trump the right to factual legitimate information

26th June 2013

See detailed analysis of the judgement from ukhumanrightsblog.com
Google should not have to delete information from its search results when old information is pulled up that is damaging to individuals who claim to be harmed by the content. That's the early opinion of a special advisor to the European Union's
highest court, who has apparently sided with Google in a case involving a man in Spain who argued that Google searches about him provide information about an arrest years before that should be cleaned up to protect him. An expert opinion requested
by the European Court of Justice, which is based in Luxembourg, recommended that Google not be forced to expunge all links to a 15-year-old legal notice published in a Spanish newspaper documenting a failure to pay back taxes. Instead, the
European Union's highest court was advised to strike down a Spanish regulator's demand that the search engine grant citizens a broad digital 'right to be forgotten,' including the ability to delete previous arrests and other negative publicity from
Google's online search results. A final decision in the case is expected before the end of this year.
5th April 2013

Advocating a nightmare world where historical fact will be re-written to suit those who complain loudest and have the most money to demand their version of the truth in court

See article from guardian.co.uk
Google opposes the right to be forgotten in the European Court of Justice

28th February 2013

See article from searchengineland.com
In a test case that could have significant implications for Google throughout Europe, the company faced off against the Spanish data protection authority in the European Court of Justice. From the Spanish government's point of view, its data protection authority is pushing for the recently articulated right (of individuals) to be forgotten by having content or data about them removed from the search index upon request. From Google's perspective, if the court agrees with Spain, the outcome would be tantamount to granting individuals the right to censor Google.

The Spanish citizen, Mario Costeja, filed a complaint with the Spanish Data Protection Agency (AEPD) against Google and the newspaper La Vanguardia after discovering that a Google search for his name produced results referring to the auction of real estate property seized from him for non-payment of social security contributions. The AEPD rejected Costeja's complaint against the newspaper on the grounds that the publication of the information was legal and was protected by the right to information but, with extraordinary inconsistency, upheld his complaint against Google, ordering the search engine to eliminate about 100 links from all future searches for Costeja's name. Google refused to accept the ruling and filed an appeal, which has now reached court.
Spain asks European Court to comment on the legality of demands of Google to de-list personal information

7th March 2012

See article from reuters.com
Spain's highest court wants the European Court of Justice (ECJ) to decide if requests by Spanish citizens to have data deleted from Google's search engine are lawful. The Spanish court said it had asked the ECJ to clarify whether Google should
remove data from its search engine's index and news aggregator. Madrid's data protection authority has received over 100 requests from Spanish citizens to have their data removed from Google's search results. An example case is a plastic surgeon
who wants to get rid of archived references to a botched operation. The Spanish judges also asked the ECJ whether the complainants must take their grievances to California, where Google is based, or whether they can be addressed by Google Spain.
Google has maintained that it cannot lawfully remove any content for which it is merely the host and not the producer, a principle enshrined in EU law on eCommerce since 2000. Google told the Spanish prosecutor it needed more legal justification
for removing references to events in an individual's history.
Lawyer warns that the 'right to be forgotten' will surely lead to internet censorship

16th February 2012

See article from newstrackindia.com
A leading British lawyer has condemned new European regulations that force websites to delete data on users' request, saying such rules could transform search engines like Google into a censor-in-chief for the European Union, rather than a neutral platform. According to the current European proposal from Justice Commissioner Viviane Reding, various websites will be forced to delete information shortly after consumers request it be removed.

Prof Jeffrey Rosen, writing in the Stanford Law Review, argued that the fear of fines will have a chilling effect, and that the rule will be hard to enforce across the Internet when information is widely disseminated:

Although Reding depicted the new right as a modest expansion of existing data privacy rights, in fact it represents the biggest threat to free speech on the Internet in the coming decade. Unless the right is defined more precisely when it is promulgated over the next year or so, it could precipitate a dramatic clash between European and American conceptions of the proper balance between privacy and free speech, leading to a far less open Internet.

Prof Rosen warns that if the regulations are implemented as currently proposed, it's hard to imagine that the internet that results will be as free and open as it is now.
30th January 2012

EU proposes a bag of worms that will only be untangled by incredibly expensive lawyers

See article from arstechnica.com