Facebook has outlined its approach to 'fake news' in a blog post:
A few weeks ago we previewed some of the things we're working on to address the issue of fake news and hoaxes. We're committed to doing our part and today we'd like to share some updates we're testing and starting to roll out.
We believe in giving people a voice and that we cannot become arbiters of truth ourselves, so we're approaching this problem carefully. We've focused our efforts on the worst of the worst, on the clear hoaxes spread by spammers for their own gain, and on
engaging both our community and third party organizations.
The work falls into the following four areas. These are just some of the first steps we're taking to improve the experience for people on Facebook. We'll learn from these tests, and iterate and extend them over time.
We're testing several ways to make it easier to report a hoax if you see one on Facebook, which you can do by clicking the upper right hand corner of a post. We've relied heavily on our community for help on this issue, and this can help us detect more fake news.
We believe providing more context can help people decide for themselves what to trust and what to share. We've started a program to work with third-party fact checking organizations that are signatories of Poynter's International Fact Checking Code of
Principles. We'll use the reports from our community, along with other signals, to send stories to these organizations. If the fact checking organizations identify a story as fake, it will get flagged as disputed and there will be a link to the
corresponding article explaining why. Stories that have been disputed may also appear lower in News Feed.
It will still be possible to share these stories, but you will see a warning that the story has been disputed as you share. Once a story is flagged, it can't be made into an ad and promoted, either.
We're always looking to improve News Feed by listening to what the community is telling us. We've found that if reading an article makes people significantly less likely to share it, that may be a sign that a story has misled people in some way. We're
going to test incorporating this signal into ranking, specifically for articles that are outliers, where people who read the article are significantly less likely to share it.
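The mechanics of that ranking signal are not public, but the idea of flagging read-but-not-shared outliers can be illustrated with a toy calculation. Every name and threshold below is invented for illustration; this is in no way Facebook's actual code.

```python
# Hypothetical illustration of the "read but rarely shared" outlier
# signal. Thresholds and data shapes are invented for this sketch.

def share_rate(reads, shares):
    """Fraction of readers who went on to share the article."""
    return shares / reads if reads else 0.0

def misleading_outliers(articles, factor=0.25):
    """Flag articles whose share rate falls far below the median.

    `articles` is a list of dicts with 'id', 'reads' and 'shares'.
    An article is an outlier if its share rate is below `factor`
    times the median share rate across all articles.
    """
    rates = sorted(share_rate(a['reads'], a['shares']) for a in articles)
    median = rates[len(rates) // 2]
    return [a['id'] for a in articles
            if share_rate(a['reads'], a['shares']) < factor * median]

articles = [
    {'id': 'a', 'reads': 1000, 'shares': 120},
    {'id': 'b', 'reads': 900,  'shares': 100},
    {'id': 'c', 'reads': 1200, 'shares': 5},   # read widely, rarely shared
]
print(misleading_outliers(articles))  # → ['c']
```

A production system would of course use far more signals and a statistical baseline per topic, but the sketch captures the stated intuition: articles that many people read yet conspicuously decline to share stand out from the crowd.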
We've found that a lot of fake news is financially motivated. Spammers make money by masquerading as well-known news organizations, and posting hoaxes that get people to visit their sites, which are often mostly ads. So we're doing several things to
reduce the financial incentives. On the buying side we've eliminated the ability to spoof domains, which will reduce the prevalence of sites that pretend to be real publications. On the publisher side, we are analyzing publisher sites to detect where
policy enforcement actions might be necessary.
It's important to us that the stories you see on Facebook are authentic and meaningful. We're excited about this progress, but we know there's more to be done. We're going to keep working on this problem for as long as it takes to get it right.
Offsite Article: Fake news detection on the cheap
The Guardian investigates how Facebook's trumpeted 'fake news' detection relies on unpaid volunteers.
El Inca is a 2016 Venezuela romance by Ignacio Castillo Cottin.
Starring Alexander Leterni, Scarlett Jaimes and Miguel Ferrari.
A tragic love story based on the life of the great Latin American boxer Edwin "El Inca" Valero. The only fight he lost was the one against himself.
On an April morning in 2010, Venezuelan boxing legend Edwin El Inca Valero, an undefeated two-time world champion, murdered his wife. Two days later, he took his own life in his prison cell at 28 years of age.
Now, his violent and troubling story is making headlines again, in a politically charged scandal over a banned film about his life. The movie, El Inca, was a box office sensation when it premiered on Nov 25, rising to become the third-most
lucrative film of the year in Venezuela in less than three weeks.
But its run came to an abrupt halt on Dec 13, when a judge ordered it removed from theaters and impounded all copies. The court case was brought by Valero's family, which accused director Ignacio Castillo Cottin of slander. But the director alleges
politics had more to do with the ruling.
The film was banned by temporary injunction before the defamation trial even got under way. The judge then postponed the first hearing, scheduled for last Monday, because neither he nor the prosecution had seen the movie.
Those connected with the film were not impressed that it was banned before the judge had even seen it.
A Virginia lawmaker is asking the state legislature to declare pornography a public health hazard, a move he hopes will pave the way for limits of some sort.
Delegate Robert G. Marshall has proposed a resolution claiming that pornography leads to many social problems and that the General Assembly, which convenes its annual session on Jan. 11, needs to do something about it. Just what lawmakers should do is left open.
The measure does not call for any sort of ban, only a broad recognition of the need for education, prevention, research, and policy change at the community and societal level in order to address the pornography epidemic that is harming the people of
the Commonwealth and the nation.
The legislation frames pornography not just as a moral scourge that leads to infidelity, the hypersexualization of teenagers and deviant sexual arousal, but as a weapon against women. The measure blames pornography for low self-esteem
and body image disorders and devotes a lot of attention to the objectification of women and girls.
The resolution piqued the interest of state Senator Barbara A. Favola, a women's rights campaigner. She said:
We will talk about it in the women's health caucus, I'm sure of that. He's right; pornography does have a negative impact on public health, and it does lead to lots of other issues. I'm going to look at it.
Russia has accused Charlie Hebdo of mocking the Black Sea plane crash after publishing 'inhuman' cartoons about the disaster.
In one reference to the crash, the French magazine depicted a jet hurtling downwards along with words translated as: Bad news... Putin wasn't on board.
The magazine also published a cartoon showing a choir member from the ensemble making a wailing sound: aaaaaa. One caption reads: The repertoire of the army choir is expanding. A third cartoon shows bodies sinking in the sea with the caption: The Red Army conquers a new public.
The Russian Defence Ministry's spokesman Major General Igor Konashenkov complained:
It is degrading for any human being to even pay attention to such a poorly-created abomination. If such, dare I say, "artistry" is the real manifestation of "Western values", then those who hold and support them are doomed - at least
to loneliness in the future.
National Archives files show minister for justice Alan Dukes clashed with attorney general John Rogers over access to the film censor's historical files.
In 1986 Kevin Rockett, then academic and chairman of the Irish Film Institute, wrote to attorney general John Rogers to say he had been refused access to the film censor's files, even for films of the 1920s, by then minister for justice Alan Dukes.
Rogers wrote to Dukes saying that he did not see the legal basis on which access to the files, especially for films made 30 years or more previously, could be resisted or refused.
A month later, Dukes responded that over the years, censors and ministers for justice had always considered themselves precluded, on the basis of breach of confidence, from disclosing information on films.
Further letters ensued and eventually the files were opened following a long struggle. Rockett told The Irish Times that after a long and frustrating campaign he eventually convinced the Official Film Censor in 1998 to transfer the more than 100 volumes of film censorship material to the National Archives.
Rockett wrote Irish Film Censorship: A Cultural Journey from Silent Cinema to Internet Pornography in 2004, with the help of those files.
Amazon has refused to hand over recordings from an Echo smart speaker to US police investigating a murder in Arkansas. Police issued a warrant to Amazon to turn over recordings and other information associated with the device.
Amazon twice declined to provide the police with the information they requested from the device, although it did provide account information and purchase history.
Although the Echo is known for having always-on microphones to enable its voice-controlled features, the vast majority of the recordings it makes are not saved for longer than the few seconds it takes to determine if a pre-set wake word (usually Alexa) has been said. Only if that wake word has been heard does the device's full complement of microphones come on and begin transmitting audio to Amazon.
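The gating behaviour described above can be sketched in a few lines. All names are invented and the wake-word "detection" is faked with pre-labelled transcript guesses; this is not Amazon's code, which runs keyword spotting on raw audio on-device.

```python
# Minimal sketch of wake-word gating: audio sits in a short rolling
# buffer and is discarded unless the wake word is detected, after
# which chunks stream out. Names and structure are invented.

from collections import deque

BUFFER_SECONDS = 2          # only a few seconds are ever held locally
FRAMES_PER_SECOND = 10

def process_audio(frames, wake_word="alexa"):
    """Yield only the audio chunks that would leave the device.

    `frames` is an iterable of (transcript_guess, audio_chunk) pairs,
    standing in for a local keyword-spotting model.
    """
    buffer = deque(maxlen=BUFFER_SECONDS * FRAMES_PER_SECOND)
    listening = False
    for guess, chunk in frames:
        if listening:
            yield chunk                  # transmit to the cloud
        elif guess == wake_word:
            listening = True             # wake word heard: start streaming
            yield from buffer            # include the brief buffered context
            buffer.clear()
        else:
            buffer.append(chunk)         # held briefly, then overwritten

frames = [("um", b"x1"), ("alexa", b"x2"), ("play", b"x3")]
print(list(process_audio(frames)))  # → [b'x1', b'x3']
```

The design point this illustrates is the one in dispute: under this model, nothing said before the wake word should persist beyond the tiny buffer, which is why the police request for stored recordings is notable.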
However the police pursuit of the data suggests there is more of interest up for grabs than Amazon is admitting.
Amazon's reluctance to part with user information fits a familiar pattern. Tech companies often see law enforcement requests for data as invasive and damaging to an industry. It is clearly an issue for sales of a home microphone system if it is easy for
the authorities to grab recordings.
Other devices have also been good data sources for police investigations. Wristwatch-style Fitbit activity trackers have cropped up in a few cases eg for checking alibis against sleep patterns or activity.
A smart water meter has also been used in a murder case as evidence of a blood clean-up operation.
The Bangladesh government has started an initiative
to block several hundred pornography websites and already sent a list of more than 500 sites, mostly locally hosted, to ISPs.
The Bangladesh Telecommunication Regulatory Commission (BTRC) sent the list to all the mobile phone operators, international gateway operators, international internet gateway operators, interconnection exchange operators, internet service providers and
other telecom service providers to block the domains from their respective networks.
After receiving the list, the operators have started to comply with the directive. However, a few of the websites could not be blocked immediately due to technical challenges, said BTRC officials.
The government wants to raise awareness of the issue and to put as many hurdles as possible in the way of browsing those sites. Tarana Halim, state minister for post and telecommunications division, said:
Initially we have decided to block around 500 websites that contain pornography, obscene pictures and video contents. In the first phase we will go for blocking the locally hosted sites.
The Daily Star has obtained a copy of an email that contained a list of 510 websites branded as pornographic by an 'offensive online content control committee'.
Noor TV is a digital satellite television channel broadcasting religious and other programming in Urdu from an Islamic perspective to
audiences in the UK and internationally.
On 17 November 2015, the Licensee broadcast the second instalment of a series of four programmes which had been recorded at the Urs Nehrian festival in Pakistan that had taken place in June 2015. The programme consisted of 15 religious scholars and
preachers addressing an assembled congregation with short sermons, homilies and poetic verses.
One of the speakers, Allama Mufti Muhammad Saeed Sialvi Sahib (“Allama Sialvi”), recounted a parable in which he stated that the Prophet Muhammed had given a general command to kill all Jewish people. He stated that upon hearing this command one Muslim
follower had immediately killed a Jewish trader with whom he had long standing business relations. Allama Sialvi held this to be an example of the devotion and obedience of a disciple to the Prophet Muhammed and on several occasions appeared to condone
the killing of a Jewish trader.
We noted that Allama Sialvi held the titles "Mufti" and "Allama", denoting that he was a figure of religious authority within the Muslim community, and therefore someone whose views would carry some weight.
We considered that Allama Sialvi's clear statement that religious obedience within the Islamic faith could be demonstrated through murder of Jewish people had the potential to be interpreted as spreading anti-Semitism, i.e. his comments could amount to a form of hate speech. In this context we were mindful of the Council of Europe's definition of 'hate speech', as follows: all forms of expression which spread, incite, promote or justify racial hatred, xenophobia, anti-Semitism or other forms of hatred based on intolerance, including: intolerance expressed by aggressive nationalism and ethnocentrism, discrimination and hostility against minorities, migrants and people of immigrant origin
We considered that Allama Sialvi's speech, particularly due to his standing and authority within the Muslim community, involved clear potential to cause significant offence as it held up in unequivocal terms the killing of a Jewish person as an example
of devotion and obedience within the context of the Islamic faith. We also considered that the content had the potential to cause harm by portraying the murder of Jewish people in highly positive terms and promoting a highly negative anti-Semitic
attitude towards Jewish people.
Ofcom's Decision is that an appropriate and proportionate sanction would be a financial penalty of £75,000. In addition, Ofcom considers that the Licensee should broadcast a statement of Ofcom's findings in this case, on a date and in a form to be
determined by Ofcom.
Given that the holocaust is historical fact with massive amounts of historical evidence, it hardly seems likely that authoritative websites will feel the need to debate the existence of the event. The debate only exists on contrarian websites. You wouldn't really expect wiki to lead with the phrase: yes the holocaust really did exist.
So searching for the phrase: did the Holocaust happen? is hardly likely to strike many close matches on authoritative websites. And yes, it will find many matches on the contrarian websites; after all, they are the only websites asking that question.
A Guardian commentator, Carole Cadwalladr, asked that question and was somehow 'outraged' that Google didn't return links to an entirely different question that was more in line with what Cadwalladr wanted to see.
It would be a bad day indeed if Google dictated only morally upright answers. Searches for porn would return links to anti-porn activists and a search for local pubs would return links to religious preachers. People would soon seek other solutions to
their searching. Even holocaust campaigners would get caught out, eg if they were seeking out websites to challenge.
Surely nobody would gain from Google refusing to comply with search requests as written.
Google has now responded to the Cadwalladr article saying that it is thinking deeply about ways to improve search. A spokesman said:
This is a really challenging problem, and something we're thinking deeply about in terms of how we can do a better job
Search is a reflection of the content that exists on the web.
The fact that hate sites may appear in search results in no way means that Google endorses these views.
Editor of news site Search Engine Land, Danny Sullivan, said Google was keen to come up with a solution that was broadly applicable across all searches, rather than just those that have been noticed by users:
It's very easy to take a search here and there and demand Google change something, and then the next day you find a different search and say, 'why didn't you fix that?'
Last month, a California judge tentatively ruled that he would dismiss charges lodged by California's attorney general against
Backpage.com's chief executive and two of its former owners. After an interim scare, the judge has now issued a final judgement confirming the previous ruling and the charges have been dismissed.
The CEO, Carl Ferrer was charged with pimping a minor, pimping, and conspiracy to commit pimping in connection to online advertisements posted on the online ads portal. California's attorney general Kamala Harris claimed that advertisements
amounted to solicitation of prostitution.
However Judge Michael Bowman agreed with the defendants, including former owners Michael Lacey and James Larkin, that they were protected, among other things, by the Communications Decency Act, and hence they were not liable for third-party ads posted by
others. The ruling said:
By enacting the CDA, Congress struck a balance in favor of free speech by providing for both a foreclosure from prosecution and an affirmative defense at trial for those who are deemed an internet service provider.
California attorney general Kamala Harris is pursuing new charges against the Backpage.com website.
The fresh charges, which attorney general Kamala Harris claims are based on new evidence, come after an earlier case against the website was thrown out of court.
The website advertises escort services and seems to have wound up Harris, who claimed that the site operated a hotbed of illicit and exploitative activity.
Harris said she had charged Backpage executives Carl Ferrer, Michael Lacey and James Larkin with 13 counts of pimping and conspiracy to commit pimping. They also are charged with 26 counts of money laundering. In the latest case, filed in Sacramento
County superior court, Harris claims Backpage illegally funnelled money through multiple companies and created various websites to get around banks that refused to process transactions. (This does not seem a particularly surprising, or necessarily bad, thing to do.)
She also alleged that the company used photos of women from Backpage on other sites without their permission in order to increase revenue and knowingly profited from the proceeds of prostitution. And from what Harris said in a statement it seems that
hers is a morality campaign against sex work. Harris said:
By creating an online brothel -- a hotbed of illicit and exploitative activity -- Carl Ferrer, Michael Lacey, and James Larkin preyed on vulnerable victims, including children, and profited from their exploitation.
No matter how much governments spout bollox about mass snooping being used only to detect the likes of terrorism, the authorities end up sharing the data with Tom, Dick and Harry for the most trivial of reasons.
Embrace is a 2016 Australia / Canada / Dominican Republic / Germany / USA / UK documentary by Taryn Brumfitt.
Starring Renee Airya, Jade Beall and Taryn Brumfitt.
When Body Image Activist Taryn Brumfitt posted an unconventional before-and-after photo in 2013 it was seen by more than 100 million people worldwide and sparked an international media frenzy. EMBRACE follows Taryn's crusade as she explores the global
issue of body loathing, inspiring us to change the way we feel about ourselves and think about our bodies.
There is no mention of cuts and the running time remains the same. The nudity and surgical detail could have been pixellated out. But it seems more likely that feminists have dreamt up a new rule of political correctness that nudity does not count in
the context of a feminist film.
Perhaps the BBFC advice should read: strong language, positive body image, negative surgical body image augmentation.
Is the government misleading the Lords about blocking Twitter?
Last week we reported that the UK government expects the BBFC to ask social media providers, such as Twitter, to block the use of their service by accounts that are associated with porn sites that fail to verify the age of their users.
The Bill is even worse than we illustrated. The definition of a "pornographic website" in Clause 15 (2) is purely a site that operates on a "commercial basis". This could catch any site--including Twitter, Reddit, Tumblr--where
pornography can be found. The practical limit would therefore purely be down to the discretion of the regulator, the BBFC, as to the kind of commercial sites they wanted to force to use Age Verification. However, the BBFC does not seem to want to require
Twitter or Reddit to apply age verification--at least, not yet.
However, we also got one part wrong last week. In relation to Twitter, Reddit and other websites where porn sites might promote their content, the Bill contains a power to notify these "ancillary services" but has no specific power to enforce the notifications.
In other words, they expect Twitter, Google, Facebook, Tumblr and other companies to voluntarily block accounts within the UK, without a specific legal basis for their action.
This would create a toxic situation for these companies. If they fail to "act" on the "notifications", these services will leave themselves open to the accusation that they are failing to protect children, or actively
"supplying" pornography to minors.
On the other hand, if they act on these notices, they will rightly be accused by ourselves and those that are censored of acting in an unaccountable, arbitrary manner. They will not have been legally obliged to act by a court; similar content will remain
unblocked; and there will be no clear remedy for someone who wished to contest a "notification". Liability for the blocks would remain with the company, rather than the BBFC.
The government has not been clear with the Lords that this highly unclear situation is the likely result of notifications to Twitter--rather than account blocks, as they have suggested.
There are very good reasons not to block accounts after a mere notification. For instance in this case, although sites can contest a classification at the BBFC, and an internal appeals process will exist, there is no external appeal available, other than embarking on an expensive judicial review. It is not clear that a classification as pornography should automatically lead to action by ancillary services, not least because even after compliance the same content remains available elsewhere. To be clear, the bill does not aim to remove pornography from Twitter, Reddit or search engines.
Why then, has the government drafted a bill with this power to notify "ancillary services", but no method to enforce? The reason appears to be that payment providers in particular have a long standing agreement amongst themselves that they will
halt payments when they are notified that someone is taking payments for unlawful activity. Similarly, large online ad networks have a similar process of accepting notifications.
There is therefore no need to create enforcement mechanisms for these two kinds of "ancillary providers". (There are pitfalls with their approach--it can lead to censorship and unwarranted damage to businesses--but let us leave that debate
aside for now.)
It seems clear that, when the bill was written, there was no expectation that "ancillary providers" would include Twitter, Yahoo, or Google, so no enforcement power was created.
The government, in their haste, has agreed with the BBFC that they should be able to notify Twitter, Google, Yahoo and other platforms. They have agreed that BBFC need not take on a role of enforcement through court orders.
The key point is that the Lords are being misled by the government as things stand. Neither the BBFC or government have explored with Parliamentarians what the consequences of expanding the notion of "ancillary providers" is.
The Lords need to be told that this change means that:
the notices are unenforceable against Internet platforms;
they will lead to public disputes with the companies;
they make BBFC's decisions relating to ancillary providers highly unaccountable, as legal responsibility for account blocks rests with the platforms.
It appears that the BBFC do not wish to be cast in the role of "national censor". They believe that their role is one of classification, rather than enforcement. However, the fact that they also wish to directly block websites via ISPs rather flies in the face of their self-perception, as censorship is most clearly what they will be engaging in. Their self-perception is also not a reason to pass the legal buck onto Internet platforms who have no role in deciding whether a site fails to meet the law's requirements.
This mess is the result of rushing to legislate without understanding the problems involved. The obvious thing to do is to limit the impact of the "ancillary services" approach by narrowing the definition to exclude all but payment providers
and ad networks. The alternative--to create enforcement powers against a range of organisations--would need to establish full accountability for the duties imposed on ancillary providers in a court, something that the BBFC seems to wish to avoid.
Or of course, the government could try to roll back its mistaken approach entirely, and give up on censorship as a punishment: that would be the right thing to do. Please sign our petition if you agree.
China has banned its internet users from sharing on social media videos about current events that are not from official sources, media reported.
China's State Administration of Press, Publication, Radio, Film and Television, in a notice, said Chinese social media platforms WeChat and Weibo were not allowed to disseminate user-generated audio or video programmes about current events.
The news landed quietly among China's internet users, with only a handful discussing the new rules on Weibo, many seemingly resigned to ever increasing censorship.
Back in May, we wrote about a draft report by Australia's Productivity Commission on how Australia's copyright and patent laws could be reformed to foster domestic production and innovation. That report is back in the news this week, after it was
released in its final form , and a consultation seeking public feedback was opened.
The most important proposed change would introduce a fair use right into Australia's copyright law. Currently Australia's copyright flexibilities are narrowly pre-defined; for example, it is lawful for Australians to backup their computer software and to digitize their video tapes (remember those?), though there is still no similar exception allowing them to back up their iTunes downloads or to rip copies of their DVDs. This approach has made Australia's copyright law a complicated and anachronistic patchwork.
By swapping these kinds of narrow exceptions out for a broad and flexible fair use right, Australians would be permitted to make any use of a copyright work that is fair, taking into account the purpose of the use, the nature of the work, the amount
copied and the effect on the potential market value of the work. This would make the day to day operation of copyright law much simpler and more adaptable to changes in society and technology. It would also stimulate the development of innovative new
products and services that rely on fair use.
A second important change proposed in the Productivity Commission report is to guarantee Australians' right to circumvent geoblocks that prevent them from accessing videos, music, books and software from overseas online stores or streaming services.
These geoblocks mean that Australians pay more money for the same products, or are forced to wait for longer for their local release. The result is that some users resort to piracy. Clarifying that circumventing these geoblocks is lawful is therefore
likely to create a win for users and copyright holders alike.
Although the Productivity Commission makes many more recommendations, we'll stop at one more--that universities, schools, and libraries should receive the benefit of the same safe harbor that protects ISPs from copyright liability for infringements by
their users. This reform is a sensible one, which would bring Australia into line with U.S. and European law, and with our Manila Principles on Intermediary Liability . In practice, it means for example that if a student uploads a copyright-infringing
file to their school's website, the school won't be held responsible until they are notified of the infringement and refuse to remove the file.
It's predictable that copyright monopolists are up in arms about the proposed changes, dragging out the usual doomsday scenarios about job losses , tear-jerking celebrity pitches , and even some frankly bizarre similes . And the worst thing is that these
tactics, which have previously been successful in obstructing reform, may be so again . Yet the facts are difficult to argue with; Australians pay more to access copyright works lawfully, suffer tight constraints on what they can do with works to which
they do have access, and risk legal liability for acts that harm nobody. The time for reform is long overdue.
Further comments on the Productivity Commission final report are due by Valentine's Day 2017 . If the government resists the monopolists' uproar and legislates to implement the Commission's recommendations, that will be the love letter that Australian
users and creators have been waiting for.
Nominet, the Registry responsible for running the .UK domain name space, has recently published a report on the number of domain names it has suspended
further to requests from law enforcement agencies. The figures show that during the 12 month period from 1 November 2015 to 31 October 2016, over 8,000 domain names were suspended. This is more than twice the number of domain name suspensions during the
preceding 12 month period in 2014/2015.
A revised registration policy, which came into effect in May 2014, made it clear that the use of a domain name under .UK for criminal purposes is not permitted and that such domain names may be suspended. Police or law enforcement agencies (LEAs) are
able to notify Nominet of any .UK domain names being used for criminal activity.
The suspension of 8,049 domain names from 1 November 2015 to 31 October 2016 was the result of notifications from eight different LEAs, ranging from the Counter Terrorism Internet Referral Unit to the UK Trading Standards body. The majority of the
requests came from the UK Police Intellectual Property Crime Unit which submitted 7,617 suspension requests.
In addition to this, the revised registration policy also prohibited the registration of domain names that appear to relate to a serious sexual offence. Such domain names are termed offensive names under the policy. Thus Nominet, in its sole discretion, will not allow a domain name to remain registered if it appears to indicate, comprise or promote a serious sexual offence and where there is no legitimate use of the domain name which could be reasonably contemplated. As a
result of this, all new domain name registrations are run through an automated process and those that are identified as potentially problematic are highlighted. These domain names are then verified manually to ensure that they are in breach of Nominet's
offensive names policy.
It is interesting to note that while the automated process to identify offensive domain names highlighted 2,407 cases, this resulted in only one suspension.
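That gap between automated flags and actual suspensions is what any two-stage screen looks like in practice. The sketch below is entirely hypothetical (the keyword pattern and domain names are invented, and Nominet's real process is not public), but it shows why a crude lexical pass over-flags so heavily:

```python
# Hypothetical two-stage screen: an automated pass flags candidate
# names, then manual review confirms genuine breaches. The pattern
# and domains are invented; note the classic substring false positives.

import re

FLAG_PATTERN = re.compile(r"rap(e|ist)", re.IGNORECASE)

def automated_flag(domains):
    """Stage 1: crude pattern match; expected to over-flag heavily."""
    return [d for d in domains if FLAG_PATTERN.search(d)]

def manual_review(flagged, confirmed_offensive):
    """Stage 2: human verification; only confirmed names are suspended."""
    return [d for d in flagged if d in confirmed_offensive]

domains = ["grapevine.uk", "therapist-finder.uk", "example-rape.uk"]
flagged = automated_flag(domains)        # all three match the pattern
suspended = manual_review(flagged, {"example-rape.uk"})
print(flagged, suspended)
```

Innocent names like "grapevine" and "therapist" trip the substring match, which mirrors the reported ratio: 2,407 automated flags producing just one confirmed suspension.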
President Obama recently signed the Consumer Review Fairness Act of 2016 ( H.R. 5111
), which passed both houses of Congress unanimously. The bill addresses a dangerous trend: businesses inserting clauses into their form contracts that attempt to limit their customers' ability to criticize products and services online. We're pleased to
see Congress taking a big step to protect free speech online and rein in abusive form contracts.
The CRFA tackles two different ways that businesses attempt to squash their customers' reviews. The first is rather straightforward: simply inserting clauses into their form contracts saying that customers can't post negative reviews online, or imposing
a fine for them. For instance, the Union Street Guest House used such a contract and attempted to fine guests over their bad reviews.
The second tactic is a bit more roundabout: businesses put a clause in their contracts saying that they own the copyright to customers' reviews. Then, when they see a review that they don't like, they file a takedown notice under the Digital Millennium
Copyright Act (DMCA). One notorious example of that trick is a form contract for doctors offered by a company called Medical Justice . The U.S. Department of Health and Human Services ordered doctors to quit using such contracts in 2013 , but similar
practices live on across different industries. The CRFA voids both types of contract clauses and makes it illegal for businesses to offer them.
When we've written previously about the CRFA, we've noted a potential gap in the way the law was worded
. Companies may try to argue that they are allowed to craft contract clauses assigning themselves the copyright to customers' reviews so long as the reviews are not "lawful." Companies may then attempt to remove web content written by customers
using the special censorship tools available to copyright owners under the DMCA, claiming that that content is not lawful (for example, because it allegedly defames the company). If courts--or service providers who receive takedown notices--accept
that reasoning, then vendors could bypass the traditional protections for allegedly illegal speech, having content removed immediately under the DMCA rather than going through a court as it normally would for non-copyright speech claims. We are
disappointed that Congress failed to clearly foreclose this abuse of form contracts and the DMCA takedown process.
Ultimately, though, anti-review contracts were already on very shaky legal ground before the CRFA passed, as were form contracts that included surprising transfers of copyright ownership . Courts have reliably sided with the customer's freedom to write
negative reviews. We will be watching closely to see if any unscrupulous companies attempt to take advantage of the ambiguous wording in the law. If that happens, the courts should shut it down.
Despite this oversight, we're glad to see Congress standing up to the use of abusive form contracts to stifle freedom of expression. It's telling that the bill passed both chambers unanimously: in a session that's been marked by gridlock, this has been
one area where lawmakers in both parties agree. We hope to see lawmakers build on this progress and protect customers in the next session of Congress via bills like the
SPEAK FREE Act
and the Justice for Telecommunications Consumers Act.
Signal, an encrypted messaging app for mobile devices, has had its service blocked in Egypt and the UAE.
Now Signal have responded by making a new release available in those territories that should make the censors think twice before reaching for the block option.
The new Signal release uses a technique known as domain fronting. Many popular services and CDNs, such as Google, Amazon Cloudfront, Amazon S3, Azure, CloudFlare, Fastly, and Akamai can be used to access Signal in ways that look indistinguishable from
other uncensored traffic. The idea is that to block the target traffic, the censor would also have to block those entire services. With enough large scale services acting as domain fronts, disabling Signal starts to look like disabling the internet. When
users in the two countries send a Signal message, it will look like a normal HTTPS request to www.google.com. To block Signal messages, these countries would also have to block all of google.com.
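The mechanics can be sketched in a few lines. This is a minimal illustration of the idea, not Signal's actual implementation, and the domain names are hypothetical: the client opens a TLS connection (whose SNI field is all a censor can see) to the fronting domain, while the real destination travels in the Host header, encrypted inside the tunnel where only the CDN edge can read it.

```python
def build_fronted_request(front_domain: str, hidden_host: str, path: str = "/"):
    """Return (tls_connect_host, request_text) for a domain-fronted fetch.

    The TLS handshake -- and its visible SNI field -- goes to front_domain,
    so a network observer sees only that name. The real backend is named in
    the Host header, which is encrypted inside the TLS tunnel.
    """
    request = (
        f"GET {path} HTTP/1.1\r\n"
        f"Host: {hidden_host}\r\n"   # real destination, hidden inside TLS
        "Connection: close\r\n"
        "\r\n"
    )
    return front_domain, request

# A censor watching the wire sees only a handshake with www.google.com;
# the CDN edge reads the Host header and forwards to the hidden service.
connect_host, req = build_fronted_request("www.google.com", "signal.example.org")
```

Blocking this traffic by SNI or IP address would mean blocking www.google.com itself, which is exactly the cost the technique imposes on the censor.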
Signal , the messaging app that prides itself on circumventing government censorship, has a few new places where its flagship feature works. Last week it was Egypt, and now users in Cuba and Oman can send messages without fear of them being intercepted
and altered by lawmakers.
The European Court of Justice has passed judgement on several linked cases in Europe requiring that ISPs retain extensive records of all phone and internet communications. This includes a challenge by Labour's Tom Watson. The court wrote in a press release:
The Member States may not impose a general obligation to retain data on providers of electronic communications services.
EU law precludes a general and indiscriminate retention of traffic data and location data, but it is open to Member States to make provision, as a preventive measure, for targeted retention of that data solely for the purpose of fighting serious crime,
provided that such retention is, with respect to the categories of data to be retained, the means of communication affected, the persons concerned and the chosen duration of retention, limited to what is strictly necessary. Access of the national
authorities to the retained data must be subject to conditions, including prior review by an independent authority and the data being retained within the EU.
In today's judgment, the Court's answer is that EU law precludes national legislation that prescribes general and indiscriminate retention of data.
The Court confirms first that the national measures at issue fall within the scope of the directive. The protection of the confidentiality of electronic communications and related traffic data guaranteed by the directive, applies to the measures taken by
all persons other than users, whether by private persons or bodies, or by State bodies.
Next, the Court finds that while that directive enables Member States to restrict the scope of the obligation to ensure the confidentiality of communications and related traffic data, it cannot justify the exception to that obligation, and in particular
to the prohibition on storage of data laid down by that directive, becoming the rule.
Further, the Court states that, in accordance with its settled case-law, the protection of the fundamental right to respect for private life requires that derogations from the protection of personal data should apply only in so far as is strictly
necessary. The Court applies that case-law to the rules governing the retention of data and those governing access to the retained data.
The Court states that, with respect to retention, the retained data, taken as a whole, is liable to allow very precise conclusions to be drawn concerning the private lives of the persons whose data has been retained.
The interference by national legislation that provides for the retention of traffic data and location data with that right must therefore be considered to be particularly serious. The fact that the data is retained without the users of electronic
communications services being informed of the fact is likely to cause the persons concerned to feel that their private lives are the subject of constant surveillance. Consequently, only the objective of fighting serious crime is capable of justifying such interference.
The Court states that legislation prescribing a general and indiscriminate retention of data does not require there to be any relationship between the data which must be retained and a threat to public security and is not restricted to, inter alia,
providing for retention of data pertaining to a particular time period and/or geographical area and/or a group of persons likely to be involved in a serious crime. Such national legislation therefore exceeds the limits of what is strictly necessary and
cannot be considered to be justified within a democratic society, as required by the directive, read in the light of the Charter.
The Court makes clear however that the directive does not preclude national legislation from imposing a targeted retention of data for the purpose of fighting serious crime, provided that such retention of data is, with respect to the categories of data
to be retained, the means of communication affected, the persons concerned and the retention period adopted, limited to what is strictly necessary. The Court states that any national legislation to that effect must be clear and precise and must provide
for sufficient guarantees of the protection of data against risks of misuse. The legislation must indicate in what circumstances and under which conditions a data retention measure may, as a preventive measure, be adopted, thereby ensuring that the scope
of that measure is, in practice, actually limited to what is strictly necessary. In particular, such legislation must be based on objective evidence which makes it possible to identify the persons whose data is likely to reveal a link with serious
criminal offences, to contribute to fighting serious crime or to preventing a serious risk to public security.
As regards the access of the competent national authorities to the retained data, the Court confirms that the national legislation concerned cannot be limited to requiring that access should be for one of the objectives referred to in the directive, even
if that objective is to fight serious crime, but must also lay down the substantive and procedural conditions governing the access of the competent national authorities to the retained data. That legislation must be based on objective criteria in order
to define the circumstances and conditions under which the competent national authorities are to be granted access to the data. Access can, as a general rule, be granted, in relation to the objective of fighting crime, only to the data of individuals
suspected of planning, committing or having committed a serious crime or of being implicated in one way or another in such a crime. However, in particular situations, where for example vital national security, defence or public security interests are
threatened by terrorist activities, access to the data of other persons might also be granted where there is objective evidence from which it can be inferred that that data might, in a specific case, make an effective contribution to combating such activities.
Further, the Court considers that it is essential that access to retained data should, except in cases of urgency, be subject to prior review carried out by either a court or an independent body. In addition, the competent national authorities to whom
access to retained data has been granted must notify the persons concerned of that fact.
Given the quantity of retained data, the sensitivity of that data and the risk of unlawful access to it, the national legislation must make provision for that data to be retained within the EU and for the irreversible destruction of the data at the end
of the retention period.
The view of the authorities
David Anderson, the Independent Reviewer of Terrorism Legislation, gives a lucid response outlining the government's case for
mass surveillance. However the official justification is easily summarised: it clearly assists in the detection of serious crime. He simply does not mention that the government, having justified grabbing the data on grounds of serious crime detection,
will share it willy-nilly with all sorts of government departments for their own convenience, way beyond the reasons set out in the official justification.
And when the authorities talk about their fight against 'serious' crime, recent governments have been updating legislation to redefine practically all crimes as 'serious' crimes. Eg possessing a single spliff may in practice be a trivial crime, but the
law on possession has a high maximum sentence that qualifies it as a 'serious' crime. It does not become trivial until it goes to court and a trivial punishment has been handed down. So using mass snooping data could easily be justified to track down
trivial drug users.
The judgment relates to a case brought by Deputy Leader of the Labour Party, Tom Watson MP, over intrusive data retention powers. The ruling says that:
- Blanket data retention is not permissible
- Access to data must be authorised by an independent body
- Only data belonging to people who are suspected of serious crimes can be accessed
- Individuals need to be notified if their data is accessed.
At present, none of these conditions are met by UK law.
Open Rights Group intervened in the case together with Privacy International, arguing that the Data Retention and Investigatory Powers Act (DRIPA), rushed through parliament in 2014, was incompatible with EU law. While the Judgment
will no longer affect DRIPA, which expires at the end of 2016, it has major implications for the Investigatory Powers Act.
Executive Director Jim Killock said:
The CJEU has sent a clear message to the UK Government: blanket surveillance of our communications is intrusive and unacceptable in a democracy.
The Government knew this judgment was coming but Theresa May was determined to push through her snoopers' charter regardless. The Government must act quickly to re-write the IPA or be prepared to go to court again.
Data retention powers in the Investigatory Powers Act will come into effect on 30 Dec 2016. These mean that ISPs and mobile phone providers can be obliged to keep data about our communications, including a record of the websites we
visit and the apps we use. This data can be accessed by the police but also a wide range of organisations like the Food Standards Agency, the Health and Safety Executive and the Department of Health.
The French National Assembly and the Senate have launched the initial steps to create the crime of online obstruction
of abortion. The final bill is scheduled to be passed into law in February 2017.
The new crime would affect websites that discuss the possible psychological effects of abortion and those that promote alternatives to terminating pregnancy. Although the new crime would not be a ban, it has raised concern among those troubled over the
possible censorship of information on pro-life sites.
The law imposes a maximum of two-years' imprisonment for putting up 'false' information on abortion online, plus a fine of 30,000 Euros.
Maybe interesting times ahead if religious prohibitions on abortion are contested as 'false' information, eg claims that God will punish those that opt for abortion.
Embrace is a 2016 Australia / Canada / Dominican Republic / Germany / USA / UK feminist documentary by Taryn Brumfitt.
Starring Renee Airya, Jade Beall and Taryn Brumfitt.
When Body Image Activist Taryn Brumfitt posted an unconventional before-and-after photo in 2013 it was seen by more than 100 million people worldwide and sparked an international media frenzy. EMBRACE follows Taryn's crusade as she explores the global
issue of body loathing, inspiring us to change the way we feel about ourselves and think about our bodies.
Never cut by censors but the film made the news in Australia after the director successfully appealed against a MA 15+ rating and won an M rating instead.
In Australia, the original MA15+ (15A) rating was downrated to M (PG-15) for nudity on appeal. The Review board explained:
A three-member panel of the Classification Review Board has unanimously determined that the film Embrace is classified M (Mature) with the consumer advice Nudity.
The National Classification Code and Classification Guidelines allows for nudity to occur at the M level if it is justified by context. In the Classification Review Board's opinion Embrace warrants an M classification because the scenes of nudity and of
women's breasts and genitals in the film are justified by the context of the documentary approach to women's body image and their impact is no higher than moderate.
Now the BBFC have passed the film 15 uncut for cinema for strong language, nudity, brief surgical detail.
France is considering appointing an official internet ombudsman to investigate complaints about online material in order to
prevent excessive censorship and preserve free speech. A bill establishing a content qualification assessment procedure has been tabled in the French senate.
Dan Shefets, a Danish lawyer explained one of the issues targeted by the bill:
ISPs face both penal and civil liability as soon as they are made aware of allegedly illicit content. One consequence of such liability is that smaller companies take down such content for fear of later sanctions.
The aim is to provide a simple procedure that will support firms operating online who are uncertain of their legal liabilities and to prevent over-zealous removal or censorship of material merely because it is the subject of a complaint. It could be
copied by other European jurisdictions.
The idea is that a rapid response from the internet ombudsman would either order the material to be taken down or allow it to remain. As long as ISPs complied with the rulings, they would not face any fine or punishment.
Ariana International, 20 July 2016, 12:00
Ariana International is a general entertainment channel originating from Afghanistan, and broadcast by satellite in the UK.
Ofcom noted a news item relating to Muhammad Riyad, a 17-year old who was said to be an Afghan. He had injured five people when he attacked a train, armed with a knife and axe, in Wuerzburg, Germany in July 2016.
A video was then broadcast which showed Muhammad Riyad talking straight to camera and at times brandishing a knife. The video lasted approximately two minutes and 15 seconds, and Muhammad Riyad said the following:
...Inshallah Mujahids from Islamic State will reach you everywhere. Inshallah you will be slaughtered in your homes. Inshallah they will enter your homes, enter your land, and on the streets. Inshallah you will not be safe in your homes, your villages,
your towns and inshallah, and in every street in every airport inshallah. The Islamic State has enough strength to get you everywhere, even in your parliament [vigorously waving knife at camera]. I am living here amongst you and inshallah I have made a
plan to deal with you here in your homes inshallah. I tell you, that I will slaughter you in your homes. I promise you that I will make you forget about France...
The news show made no further comments after the speech and moved on to the next item.
Ofcom considered the following rules:
Rule 2.3: “In applying generally accepted standards broadcasters must ensure that material which may cause offence is justified by the context...”.
Rule 3.1: “Material likely to encourage or to incite the commission of crime or to lead to disorder must not be included in television or radio services”.
Rule 3.2: “Material which contains hate speech must not be included in television and radio programmes except where it is justified by the context”.
Ofcom Decision: Breach of rules 2.3, 3.1, 3.2
Ofcom considered the audience would have interpreted Muhammad Riyad's various comments as promoting and justifying hatred and violence towards the persons who did not conform to his definition of Islam. In Ofcom's view, this was a clear example of hate
speech, as defined by the Code.
Given the very strong nature of the material in this case, we considered that, under the Code, there would need to be extremely clear and strong context provided to justify the broadcast of the video featuring Muhammad Riyad. Our Decision was that
there was clearly insufficient context to justify the inclusion of hate speech in this broadcast, and Rule 3.2 was therefore breached.
Breaches of Section Three of the Code, in particular, are very serious because they involve the potential for serious harm. Ofcom considered all of the breaches in this case to be very serious.
Due to the highly challenging and potentially harmful nature of the content broadcast, we are putting the Licensee on notice that we will consider these very serious breaches for the imposition of a statutory sanction.
California-based artist Mark Thaler, who created the decorations, appears to have now removed them from sale. He had initially told the newspaper he would consider removing them out of respect for his fellow humans.
Thailand's Land Transport Department has warned of a 2,000 baht fine for motor vehicles installed with ghost stickers on the back windows or rear bumpers.
The warning came after the reflective creepy stickers of ghosts were posted on social media and were widely shared.
Although the ghost stickers are not yet widely seen in the country, unlike in China, the action by the department was seen as timely, serving as a preventive measure in case they are imported and used by motorists.
According to the director-general of the department, Sanit Phromwong, the ghost stickers could confuse motorists and disturb concentration while driving.
Labour's industrial spokesperson has called for the algorithms used by technology firms to be made transparent and subject to regulation.
Shadow minister Chi Onwurah wants to see greater scrutiny of the algorithms that now control everything from the tailored news served to Facebook users to the order in which websites are presented in Google search. She said:
Algorithms aren't above the law. The outcomes of algorithms are regulated -- the companies which use them have to meet employment law and competition law. The question is, how do we make that regulation effective when we can't see the algorithm?
She added in a letter to the Guardian:
Google and others argue their results are a mirror to society, not their responsibility. Google, Facebook and Uber need to take responsibility for the unintended consequences of the algorithms and machine learning that drive their profits. They can bring
huge benefits and great apps, but we need a tech-savvy government to minimise the downside by opening up algorithms to regulation as well as legislating for greater consumer ownership of data and control of the advertising revenue it generates.
Labour's industrial paper, due to be published after the Christmas break, will call for suggestions on how tech firms could be more closely supervised by government.
A bill filed this month by state Representative Bill Chumley would require sellers to install a digital censorship hijack on computers and other devices
that access the internet to prevent the viewing of what the lawmaker considers obscene content.
The proposal also would prohibit access to any online resource that supports sex work and would require manufacturers or sellers to block any websites that supposedly facilitate trafficking.
Both sellers and buyers could get around the limitation, for a ransom fee. The bill would fine manufacturers that sell a device without the blocking system, but they could opt out by paying $20 per device sold. Buyers could also verify their age and pay
$20 to remove the censorship software.
Money collected would go toward the Attorney General's Office's pet project of a human trafficking task force.
Chumley's bill has been referred to the House Judiciary Committee. Legislators return to Columbia for a new session next month.
Turkey's President Erdogan has stepped up his repression of dissent by blocking the Tor network in the country.
Watchdog group Turkey Blocks has confirmed that Turkey is blocking the Tor anonymity network's direct access mode for most users. You can still use a bridge mode for now, but there are hints that internet providers might be hurting performance
even then. Bridges are unlisted relays and may require a bit of searching out.
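For readers wondering what using a bridge actually involves: it comes down to a few lines in Tor's torrc configuration file. The sketch below uses a placeholder address and fingerprint, since real bridge lines are distributed privately (for example via the Tor Project's bridge distribution service) precisely so censors cannot enumerate and block them.

```text
# torrc -- route through an unlisted bridge relay instead of public entry nodes
UseBridges 1
ClientTransportPlugin obfs4 exec /usr/bin/obfs4proxy
# Placeholder bridge line: obtain real ones from the Tor Project
Bridge obfs4 192.0.2.10:443 <fingerprint> cert=<cert> iat-mode=0
```

The obfs4 pluggable transport additionally disguises the Tor handshake itself, which matters when providers throttle or interfere with recognisable Tor traffic.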
The restrictions come alongside a recent government ban on virtual private network services.
It wasn't so long ago that the Daily Mail was universally derided in anti-censorship circles. Now it has become a champion of the people, willing to give voice to large sections of the population whose opinions are being silenced by the politically
correct 'elite'. Anyway the petition reads:
Facebook has announced that a new tool is being rolled out so that readers have a fast and convenient way to flag fake news stories, and once again there seems to be no distinction between fake news for entertainment purposes (our site Southend News
Network) and fake news for deliberate and dangerous deception purposes (those nasty pages who got Donald Trump elected, apparently).
Well then, seeing as that will probably result in the eventual demise of Southend News Network, we thought that we might as well go out in a real blaze of glory.
Therefore, we are proud to present our campaign to get the Daily Mail reclassified as a fake news website!
Our argument is a simple one. We feel that the Daily Mail reports stories in a manner that often swerves into the realms of fake news. Reporting Jeremy Corbyn doing a jig, making the campaigner Gina Miller appear darker than she really is in photographs,
and just a general history of reporting real world events in a way that incites hatred against anyone who is an enemy of the Daily Mail are all unacceptable in our eyes at a time when actual fake news websites are getting blamed for many of society's ills.
If enough people sign this petition, there is just the slightest glimmer of hope that their Facebook classification could be altered to Humour or something along those lines. We know that it is a tall order in reality as they have the most-visited
news site in the world, but to be brutally honest if we are going down then we might as well try and create one hell of a storm at the same time.
Amazon Prime Video has just launched in India, and has started on the wrong foot by censoring 30 minutes from an episode of The Grand Tour.
The fourth episode of The Grand Tour is listed as only 30 minutes in India, as opposed to the normal one hour. That's so that all references to a car made of meat could be excised from the show. It seems that India is a little sensitive about meat.
It is reported that Amazon have made frequent recourse to blurring to deal with more straightforward censorship issues such as nudity and other sexual content. Amazon are also quick to reach for the annoying bleep button when strong language is in the air.
The Amazon self censorship is a little confusing as this week, the Ministry of Information and Broadcasting clarified that it has no plans to censor online streaming services.
On the occasion of Human Rights Day, the National Coalition Against Censorship (NCAC) announced the launch of CENSORPEDIA, a crowdsourced wiki cataloging over 1,200 individual censorship incidents throughout history.
Censorpedia is a tool enabling researchers, journalists, academics, students and anyone interested in free speech to explore the current landscape of censorship in the world and delve into censorship's history.
The database's crowdsourced model allows anyone with relevant source materials to add a case: past, present, ongoing or resolved.
Visitors can browse or search these cases by the region in which they occurred, the grounds for censorship and the medium of expression.
The 1,200 incidents currently cataloged on Censorpedia are the product of over 20 years of contributions from NCAC staff, volunteers, artists and students to Antoni Muntadas' landmark 1994 art project, The File Room.
A new US law has been signed into effect that bars businesses from punishing customers for giving bad reviews.
The Consumer Review Fairness Act ( HR 5111 ) voids any contract that involves prohibitions or penalties related to poor online reviews.
The aim of the bill, written by Reps. Leonard Lance (R-NJ) and Joseph Kennedy (D-MA), is to stop companies from imposing penalties on consumers who leave negative comments on sites such as Yelp. Lance explained:
Consumers in the 21st century economy should be able to post, comment and tweet their honest and accurate feedback without fear of retribution. Too many companies are burying non-disparagement clauses in fine print and going after consumers when they
post negative feedback online.
With the new law in effect, any contract that attempts to tie in a clause calling for a fine or penalty for a review would be rendered void and legally unenforceable. The law would also prevent a business from asserting intellectual property claims on
the content of a review, provided no trade secrets or personally identifiable information are involved. However the bill does make exceptions in the case of reviews deemed to be libelous or slanderous, and also removes any protections for
reviews and posts that are found to be false or misleading.
Online retailers in America will soon be required by law to disclose to state governments what purchases their customers have made.
The law seems to have been made up in US courts during a long-running legal case based around the jurisdiction of sales tax. An appeals court decision now requires out-of-state retailers to report to the Colorado state government the details of all
purchases, including what that purchase was and who bought it.
The US Supreme Court has refused to hear the case so the appeal court decision stands.
Colorado is not the only state pushing the requirement. Vermont will also make the same requirement three months after Colorado starts imposing the law. And other states including Alabama, South Dakota, Tennessee and Wyoming have approved similar rules.
The exec director of the American Catalog Mailers Association (ACMA), Hamilton Davison, is extremely concerned. He said:
Consumers, particularly those who buy from catalogs and e-commerce merchants, put considerable trust in the businesses from which they make the most personal of purchases, he noted. This decision undermines this trust by requiring remote sellers to
report to state tax collectors on the buying habits of their customers, including health care products, apparel or other sensitive items.
Thailand's rubber-stamp parliament has unanimously passed a new cyber-crime law that strengthens the junta's ability
to police the web and repress criticism.
The junta has banned protests, muzzled the press, blocked scores of websites and used already stringent cyber and defamation laws to prosecute critics over everything from Facebook comments to investigative reports on rights abuses.
The new law is even more vaguely-worded than its predecessor, broadening the scope of the government's surveillance and censorship powers. It allots up to five years in prison for entering false information into a computer system that jeopardises
national security, public safety, national economic stability or public infrastructure, or causes panic.
One of the most controversial additions is the creation of a five-person committee that can seek court approval to remove online content considered a breach of public morals. The definition (of this term) is not written in any law; it is just up
to the committee.
Another new clause empowers authorities to request user and traffic data from internet service providers without a court warrant.
Welsh police ludicrously wasted time and money following up a ridiculous complaint about a jokey mug in a
Cardiff shop window.
Staff at Ginger Whites shop in Rhiwbina said that the mug, displayed on a high shelf in the window, has a picture of Father Christmas and holly with the words Oh bollocks it's Christmas written across it. Following the visit they posted a picture of
the mug on Facebook with the comment:
We had the Police in the shop today, Ginger Whites, due to a complaint being made about our Bollocks Christmas mugs.
Absolutely gobsmacked as to why anyone would waste police time over this. Seriously, put your time to better use!
No action was taken by the police and they didn't tell us to take it down. They just let us know a complaint had been made.
The Lords had their first debate on the Digital Economy Bill which includes laws to require age verification as well
as extension of outdated police and BBFC censorship rules to the internet.
Lords inevitably queued up to support the age verification requirements. However a couple of the lords made cautionary remarks about the privacy issues of websites being able to build up a dangerous database of personal ID information of porn users.
A couple of lords also spoke out against the BBFC/police/government censorship prohibitions being included in the bill. It was noted that these rules are outdated, disproportionate and perhaps require further debate in another bill.
As an example of these points, the Earl of Erroll (cross bencher) said:
My Lords, I welcome the Bill because it has some very useful stuff in it -- but, like everything else, it might benefit from some tweaking. Many other speakers mentioned the tweaks that need to be made, and if that happens I think that we may end up with
quite a good Bill.
I will concentrate on age verification because I have been working on this issue with a group for about a year and three-quarters. We spotted that its profile was going to be raised because so many people were worried about it. We were the first group to
bring together the people who run adult content websites -- porn websites -- with those who want to protect children. The interesting thing to come out quite quickly from the meetings was that, believe it or not, the people who run porn sites are not
interested in corrupting children because they want to make money. What they want are adult, middle-aged people, with credit cards from whom they can extract money, preferably on a subscription basis or whatever. The stuff that children are getting
access to is what are called teaser adverts. They are designed to draw people in to the harder stuff inside, you might say. The providers would be delighted to offer age verification right up front so long as all the others have to comply as well --
otherwise they will get all the traffic. Children use up bandwidth. It costs the providers money and wastes their time, so they are very happy to go along with it. They will even help police it, for the simple reason that it will block the opposition. It
is one of the few times I approve of the larger companies getting a competitive advantage in helping to police the smaller sites that try not to comply.
One of the things that became apparent early on was that we will not be able to do anything about foreign sites. They will not answer mail or do anything, so blocking is probably the only thing that will work. We are delighted that the Government has
gone for that at this stage. Things need to get blocked fast or sites will get around it. So it is a case of block first, appeal later, and we will need a simple appeals system. I am sure that the BBFC will do a fine job, but we need something just in case.
Another thing that came back from the ISPs is that they want more clarity about what should be blocked, how it will be done and what they will have to do. There also needs to be indemnity. When the ISPs block something for intellectual property and
copyright reasons, they are indemnified. They would need to have it for this as well, or there will be a great deal of reluctance, which will cause problems.
The next thing that came up was censorship. The whole point of this is we want to enforce online what is already illegal offline. We are not trying to increase censorship or censor new material. If it is illegal offline, it should be illegal online and we
should be able to do something about it. This is about children viewing adult material and pornography online. I am afraid this is where I slightly disagree with the noble Baroness, Lady Kidron. We should decide what should be blocked elsewhere; we
should not use the Bill to block other content that adults probably should not be watching either. It is a separate issue. The Bill is about protecting children. The challenge is that the Obscene Publications Act has some definitions and there is ATVOD
stuff as well. They are supposed to be involved with time. CPS guidelines are out of step with current case law as a result of one of the quite recent cases -- so there is a bit of a mess that needs clearing up. This is not the Bill to do it. We probably
need to address it quite soon and keep the pressure on; that is the next step. But this Bill is about keeping children away from such material.
The noble Baroness, Lady Benjamin, made a very good point about social platforms. They are commercial. There are loopholes that will get exploited. It is probably unrealistic to block the whole of Twitter -- it would make us look like idiots. On the
other hand, there are other things we can do. This brings me to the point that other noble Lords made about ancillary service providers. If we start to make the payment service providers comply and help, they will make it less easy for those sites to
make money. They will not be able to do certain things. I do not know what enforcement is possible. All these sites have to sign up to terms and conditions. Big retail websites such as Amazon sell films that would certainly come under this category. They
should put an age check in front of the webpage. It is not difficult to do; they could easily comply.
We will probably need an enforcer as well. The BBFC is happy to be a regulator, and I think it is also happy to inform ISPs which sites should be blocked, but other enforcement stuff might need to be done. There is provision for it in the Bill. The
Government may need to start looking for an enforcer.
Another point that has come up is about anonymity and privacy, which is paramount. Imagine the fallout if some hacker found a list of senior politicians who had had to go through an age-verification process on one of these websites, which would mean they
had accessed them. They could bring down the Government or the Opposition overnight. Noble Lords could all go to the MindGeek website and look at the statistics, where there is a breakdown of which age groups and genders are accessing these websites. I
have not dared to do so because it will show I have been to that website, which I am sure would show up somewhere on one of these investigatory powers web searches and could be dangerous.
One of the things the Digital Policy Alliance, which I chair, has done is sponsor a publicly available specification, which the BSI is behind as well. There is a lot of privacy-enforcing stuff in that. It is not totally obvious; it is not finished yet, and it
is being highlighted a bit more. One thing we came up with is that websites should not store the identity of the people whom they age-check. In fact, in most cases, they will bounce straight off the website and be sent to someone called an attribute
provider, who will check the age. They will probably know who the person is, but they will send back to the website only an encrypted token which says, 'We've checked this person that you sent to us. Store this token. This person is over 18' -- or under 18, or whatever age they have asked to be confirmed. On their side, they will just keep a record of the token but will not say to which website they have issued it -- they will not store that, either. The link is the token, so if a regulator or social service had to track it down, they could physically take the token from the porn site to where it came from, the attribute provider, and say, 'Can you check this person's really over 18, because we think someone breached the security? What went wrong with your procedures?' They can then reverse it and find out who the person was -- but they could still perhaps not be told by the regulator which site it was. So there should be a security cut-out in there. A lot of work went into this because
we all knew the danger.
This is where I agree entirely with the Open Rights Group, which thinks that such a measure should be mandated. Although the publicly available specification, which is almost like a British standard, says that privacy should be mandated under general
data protection regulation out of Europe, which we all subscribe to, I am not sure that that is enough. It is a guideline at the end of the day and it depends on how much emphasis the BBFC decides to put on it. I am not sure that we should not just put
something in the Bill to mandate that a website cannot keep a person's identity. If the person after they have proved that they are 18 then decides to subscribe to the website freely and to give it credit card details and stuff like that, that is a
different problem -- I am not worried about that. That is something else. That should be kept extremely securely and I personally would not give my ID to such a site -- but at the age verification end, it must be private.
There are some other funny things behind the scenes that I have been briefed on, such as the EU VAT reporting requirements under the VAT Mini One Stop Shop, which requires sites to keep some information which might make a person identifiable. That could
apply if someone was using one of the attribute providers that uses a credit card to provide that check or if the website itself was doing that. There may be some things that people will have to be careful of. There are some perfectly good age-checking
providers out there who can do it without you having to give your details. So it is a good idea; I think that it will help. Let us then worry about the point that the noble Baroness, Lady Kidron, made so well about what goes where.
The universal service obligation should be territorial; it has to cover the country and not just everyone's homes. With the internet of things coming along -- which I am also involved in because I am chair of the Hypercat Alliance, which is about
resource discovery over the internet of things -- one of the big problems is that we are going to need it everywhere: to do traffic monitoring, people flows and all the useful things we need. We cannot have little not-spots, or the Government will not be
able to get the information on which to run all sorts of helpful control systems. The noble Lord, Lord Gordon of Strathblane, referred to mast sharing. The problem with it is that they then do not put masts in the not-spots; they just keep the money and
work off just one mast -- you still get the not-spots. If someone shares a mast, they should be forced to put up a mast somewhere else, which they then share as well.
On broadband take-up, people say, 'Oh, well, people aren't asking for it.' It is chicken and egg: until it is there, you do not know what it is good for. Once it is there and suddenly it is all useful, the applications will flow. We have to look to
the future; we have to have some vision. Let us get chicken or the egg out there and the chicken will follow -- I cannot remember which way round it is.
I agree entirely with the noble Lord, Lord Mitchell, that the problem with Openreach is that it will always be controlled by its holding company, which takes the investment, redirects it and decides where the money goes. That is the challenge with having a holding company.
I do not want to waste much time, because I know that it is getting late-ish. On jobs, a huge number of jobs were created in earlier days in installing and maintaining internet of things sensors all over the place -- that will change. On the gigabit stuff,
it will save travel, energy and all sorts of things -- we might even do remote-control hip operations, so you send the device and the surgeon then does it remotely, once we get super-duper superfast broadband.
I want to say one thing about IP. The Open Rights Group raised having thresholds of seriousness. It is quite important that we do not start prosecuting people on charges with 10-year sentences for trivial things. But it is also sad how interesting
documentaries can disappear terribly quickly. The catch-up services cover only a month or so and if you are interested, it is quite nice being able to find these things out there on the internet a year or two later. There should somehow be a publicly
available archive for all the people who produce interesting documentaries. I do not know whether they should make a small charge for it, but it should be out there.
The Open Rights Group also highlighted the bulk sharing of data. Some of the stuff will be very useful -- the briefing on free school meals is interesting -- but if you are the only person who really knows what might be leaked, it is very dangerous. If
someone were to beat you up, an ordinary register could leak your address across without realising that at that point you are about to go into witness protection. There can be lots of problems with bulk data sharing, so be careful; that is why the
insurance database was killed off a few years ago. Apart from that, I thank your Lordships for listening and say that, in general, this is a good effort.
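The attribute-provider token scheme the Earl describes can be sketched in Python. This is a minimal illustration, not any real implementation: the class and method names are hypothetical, and an HMAC-signed token stands in for the 'encrypted token' he mentions. The point of the design is that the porn site holds only an opaque token, while the link back to a person's identity stays with the attribute provider, reachable only via an audit.

```python
import hmac
import hashlib
import secrets

class AttributeProvider:
    """Hypothetical attribute provider: knows who people are, but hands
    websites only an opaque signed token asserting 'over 18'."""

    def __init__(self, ages):
        self.ages = ages                    # person_id -> age, the provider's own records
        self.key = secrets.token_bytes(32)  # signing key, never leaves the provider
        self._issued = {}                   # nonce -> person_id (the only identity link)

    def check(self, person_id):
        """Called when the visitor bounces off the website to the provider.
        Returns a token that carries the age assertion but no identity."""
        nonce = secrets.token_hex(16)       # unguessable, meaningless on its own
        over_18 = self.ages[person_id] >= 18
        sig = hmac.new(self.key, f"{nonce}:{over_18}".encode(),
                       hashlib.sha256).hexdigest()
        self._issued[nonce] = person_id     # provider keeps the link, not the site
        return f"{nonce}:{over_18}:{sig}"

    def verify(self, token):
        """Confirm a token is genuine and asserts over-18."""
        nonce, over_18, sig = token.split(":")
        good = hmac.new(self.key, f"{nonce}:{over_18}".encode(),
                        hashlib.sha256).hexdigest()
        return hmac.compare_digest(sig, good) and over_18 == "True"

    def audit(self, token):
        """A regulator who has physically taken a token from a site can
        trace it back to the person it was issued for -- the 'security
        cut-out' the speech describes."""
        return self._issued.get(token.split(":")[0])
```

A site storing such a token can prove an age check happened without ever holding the visitor's identity, which is exactly the property the speech argues should be mandated in the Bill.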
The Malaysian Communications and Multimedia Commission (MCMC) blocked 5,044 websites for various offences under the Communications and Multimedia Act 1998 between 2015 and October this year.
Deputy Communications and Multimedia Minister Datuk Jailani Johari said, out of the total, 4,277 are pornographic websites while another 767 displayed elements of gambling, prostitution, fraud, piracy, counterfeit products, unregistered medicine and
others. He added:
MCMC blocks all the websites based on the application of the enforcement agencies such as the police, Health Ministry, Domestic Trade, Cooperatives and Consumerism Ministry and other relevant agencies.
Until last October, MCMC also blocked 72 websites related to the spread of Daesh or Islamic State ideology.
MCMC had also investigated 181 cases of social media and Internet abuse involving the spread of false information and content through WhatsApp, Facebook, Twitter and other platforms under the same Act. Jailani said, out of the total, six cases were
brought to court, including five cases that were prosecuted while 10 cases were compounded.
Google Play and Apple have banned an app designed to help women obscure nudity when posting images online.
Model Melina DiMarco has designed an app called Nood to provide simple stylised lady parts to post over your own, to get round censorship rules on nudity. She claims other options for censoring photos, for example black bars and crosses,
encourage the sexualisation of the female form. DiMarco said:
It's a photo editing app where you can upload a photo and put illustrated nipples over your nipples and an illustrated vagina over your vagina.
As a woman and as a model I don't quite understand why my male counterpart can post freely on social media but I cannot. Nipples are nipples, there's no difference. Female nipples, male nipples, they're all the same.
However the App Store and Google Play Store claim her app promotes explicit content. Apple claimed the app includes 'content that many users would find objectionable and offensive', while Google says 'we don't allow apps that contain or promote sexually explicit content'.
Murray Perkins of the BBFC explains how all the world's major porn websites will have to be totally banned in Britain (even if they set up age verification systems) under the censorship rules contained in the Digital Economy Bill
The BBFC currently cuts about 15% of all R18 porn films on their way to totally ordinary mainstream porn shops. These are not niche or speciality
films, they are totally middle of the road porn, which represents the sort of content on all the world's major porn sites. Most of the cuts are ludicrous but Murray Perkins, a senior examiner at the BBFC, points out that they are all considered either to be harmful, or else are still prohibited by the police or the government for reasons that have long since passed their sell-by date.
So about a sixth of all the world's adult films are therefore considered prohibited by the British authorities, and so any website containing such films will have to be banned, as there is no practical way to cut out the bits that wind up censors, police
or government. And this mainstream but prohibited content appears on just about all the world's major porn sites, free or paid.
The main prohibitions that will cause a website to be blocked (even before considering whether they will set up strict age verification) are such mainstream content as female ejaculation, urine play, gagging during blow jobs, rough sex, incest story
lines (which is a major genre of porn at the moment), use of the word 'teen' and verbal references to under 18's.
Murray Perkins has picked up the job of explaining this catch-all ban. He explains it well, but he tries to throw readers off track by citing examples of prohibitions being justifiable because they apply to violent porn, whilst not mentioning that
they apply equally well to trivia such as female squirting.
Perkins writes in the Huffington Post:
Recent media reports highlighting what content will be defined as prohibited material under the terms of the Digital Economy Bill could have given an inaccurate impression of the serious nature of the harmful material that the BBFC generally refuses to
classify. The BBFC works only to the BBFC Classification Guidelines and UK law, with guidance from the Crown Prosecution Service (CPS) and enforcement bodies, and not to any other lists.
The Digital Economy Bill aims to reduce the risk of children and young people accessing, or stumbling across, pornographic content online. It proposes that the BBFC check whether
(i) robust age verification is in place on websites containing pornographic content and
(ii) whether the website or app contains pornographic content that is prohibited.
An amendment to the Digital Economy Bill, passed in the House of Commons, would also permit the BBFC to ask Internet Service Providers (ISPs) to block pornographic websites that refuse to offer effective age verification or contain prohibited material
such as sexually violent pornography.
In making any assessment of content, the BBFC will apply the standards used to classify pornography that is distributed offline. Under the Video Recordings Act 1984 the BBFC is obliged to consider harm when classifying any content including 18 and R18
rated sex works. Examples of material that the BBFC refuses to classify include pornographic works that: depict and encourage rape, including gang rape; depict non-consensual violent abuse against women; promote an interest in incestuous behaviour; and
promote an interest in sex with children. [Perkins misleadingly neglects to include squirting, gagging, and urine play in his examples here]. The Digital Economy Bill defines this type of unclassifiable material as prohibited material.
Under its letters of designation the BBFC may not classify anything that may breach criminal law, including the Obscene Publications Act (OPA) as currently interpreted by the Crown Prosecution Service (CPS). The CPS provides guidance on acts which are
most commonly prosecuted under the OPA. The BBFC is required to follow this guidance when classifying content offline and will be required to do the same under the Digital Economy Bill. In 2015, 12% of all cuts made to pornographic works classified by
the BBFC were compulsory cuts under the OPA. The majority of these cuts were to scenes involving urolagnia which is in breach of CPS guidance and could be subject to prosecution.
A radio ad for the film Lights Out, broadcast on 16 August at 7:25 pm on Capital Radio East Midlands, featured an introductory voice-over stating, 'From the visionary filmmaker behind The Conjuring'. A child's voice then said, 'Every time I turn the lights off, there's this woman.' A woman said 'I've been seeing her too', followed by a scream. The voice-over continued, 'Critics are calling it one of the year's best horrors.' The woman then said, 'Everyone is afraid of the dark, and that's what she feeds on.' There was the sound of more screaming. The voice-over said, 'Chilling.' The woman said, 'We need to leave.' The voice-over continued, 'It will leave you sleeping with the lights on.' The child said, 'She won't let that happen.' The woman, sounding very distressed, shouted, 'Stay in the light!', followed by sinister noises. The voice-over said, 'Lights Out. In cinemas Friday. Certificate 15.'
A complainant challenged whether the ad had been scheduled appropriately, because it had frightened her child.
ASA Assessment: Complaint not upheld
The ASA noted that, in line with the plot of the film, the audio clips in the ad suggested that there was something threatening associated with being in the dark or having the lights off. We acknowledged that a fear of the dark was common among young
children. We agreed with Radiocentre that the ominous tone of the ad meant that it should have been scheduled away from times when children aged under 16 were likely to be listening in order to minimise the possibility of children hearing it.
Prior to scheduling the ad, This is Global had consulted RAJAR figures for the time that the ad was to be aired and those figures had shown that the under-16 segment typically comprised a low proportion of the audience at that time. The ad had been heard
during school holidays, when children's listening patterns might be expected to differ slightly compared to term time. However, we noted that RAJAR figures for the specific day and time that the ad was broadcast showed that only 7% of the listening
audience was under 16, which we considered minimal. We concluded that the scheduling advice given by Radiocentre was appropriate and that it had been applied responsibly by the broadcasters, and that the ad therefore did not breach the Code.
We investigated the ad under BCAP Code rules 5.1 (Children), 32.1 and 32.3 (Scheduling), but did not find it in breach.
A policeman arrested on suspicion of possessing extreme pornography has been cleared after an internal investigation. Four other
officers placed on desk jobs as part of a criminal and internal misconduct investigation into the use of a social media account will also face no action, Devon and Cornwall police have said.
The Crown Prosecution Service did not consider the image constituted extreme porn nor did they find any criminality, the force claimed. A police spokesman said:
The investigation centred on one image which was produced as an inappropriate joke on a social media account. The behaviour was misguided and as such management action was given in the form of words of advice to those involved. This brings
the matter to a conclusion. All five officers have now returned to front line duties.
Leading German MPs have called for online 'fake news' campaigns to be made a crime. Patrick Sensburg, a senior MP in Angela Merkel's
Christian Democratic Union (CDU) party, said:
Targeted disinformation to destabilise a state should be a criminal offence. We need to consider whether there should be some sort of 'test site' that reveals and identifies propaganda pages.
The call was backed by his party colleague Ansgar Heveling, the chairman of the German parliament's influential internal affairs committee, saying:
We last saw disinformation campaigns during the Cold War, now they have clearly been revived with new media opportunities. The law already offers options, such as a slander or defamation. But I think a criminal sentence is more appropriate when it is a
German intelligence has warned that Russia is seeking to influence next year's German elections via propaganda distributed via the internet, particularly social media. Russia has been accused of deliberately using 'socialbots', automated software masquerading as real people, to promote 'fake news' stories on social media.
Mrs Merkel's current coalition partners and main rival in next year's elections, the Social Democratic Party (SPD), have also called for a cross-party alliance against 'fake news' stories. Sigmar Gabriel, the SPD leader called for
Democratic solidarity against manipulative socialbots and an alliance against 'fake news'.
Thorsten Schäfer-Gümbel of the SPD added:
If there is any doubt about the authenticity of any information, we should refrain from attacking our political opponents with it.
Dutch political leader Geert Wilders has been found guilty of hate speech and inciting racial discrimination for leading a
chant calling for fewer, fewer Moroccans in the Netherlands.
Presiding Judge Hendrik Steenhuis said the court would not impose a sentence because the conviction was punishment enough for a democratically elected lawmaker. Prosecutors had asked judges to fine him 5,000 euros ($5,300).
Wilders, head of the PVV Freedom Party, was not present to hear the judgement but his lawyer Geert-Jan Knoops immediately issued a statement to say that he would appeal.
The judge claimed that Wilders had breached the boundaries of even a politician's freedom of speech. Wilders said, in a statement:
I still cannot believe it, but I have been convicted because I asked a question about Moroccans. The Netherlands has become a sick country. The judge who convicted me [has] restricted the freedom of speech for millions of Dutch. I will never be silent. I
am not a racist and neither are my voters.
The US moralist campaign group, Parents Television Council have kindly been hyping a new TV comedy series to be shown on ABC.
Written by Nick Thune and frequent collaborator Kevin Parker Flynn, Holy Sh*t follows the staff of a struggling church and their edgy new pastor (Nick Thune) as they fight to survive in the modern world.
The morality campaigners have issued a press release whingeing about the title. The group writes:
The Parents Television Council have urged ABC to reconsider using profanity in the title of a TV show in development, Holy Sh*t. PTC President Tim Winter spouted:
ABC's primetime programs have become a home for explicit, vulgar, and sexualized language, and this new show's title is yet more evidence that Disney-owned ABC is going in the wrong direction. Beyond whether the vulgar title would run afoul of the
broadcast indecency law, it's absurd that ABC would even consider exposing children to this explicit program title. While parents may steer their family viewing away from the program itself, the title would appear on program guides, marketing materials,
and in network promos when families are watching other ABC programming.
Surely, ABC should have learned from experience that using profanity in a show title won't ensure a program's financial success. ABC and CBS have each attempted to use profanity in a program title and both attempts resulted in advertiser embargoes, and
in only one season of each of those shows.
With ABC Entertainment President Channing Dungey saying just last week that 'in recent history we haven't paid enough attention to some of the true realities of what life is like for everyday Americans in our dramas,' we have to wonder why ABC would
immediately alienate those families with this show title.
ABC should immediately change the title in order to protect children from this explicit language.
Facebook, Microsoft, Twitter and YouTube are coming together to help curb the spread of terrorist content online. There is no place for content that
promotes terrorism on our hosted consumer services. When alerted, we take swift action against this kind of content in accordance with our respective policies.
We have committed to the creation of a shared industry database of hashes -- unique digital fingerprints -- for violent terrorist imagery or terrorist recruitment videos or images that we have removed from our services. By sharing this
information with each other, we may use the shared hashes to help identify potential terrorist content on our respective hosted consumer platforms. We hope this collaboration will lead to greater efficiency as we continue to enforce our policies to help
curb the pressing global issue of terrorist content online.
Our companies will begin sharing hashes of the most extreme and egregious terrorist images and videos we have removed from our services -- content most likely to violate all of our respective companies' content policies. Participating companies can add
hashes of terrorist images or videos that are identified on one of our platforms to the database. Other participating companies can then use those hashes to identify such content on their services, review against their respective policies and
definitions, and remove matching content as appropriate.
As we continue to collaborate and share best practices, each company will independently determine what image and video hashes to contribute to the shared database. No personally identifiable information will be shared, and matching content will not be
automatically removed. Each company will continue to apply its own policies and definitions of terrorist content when deciding whether to remove content when a match to a shared hash is found. And each company will continue to apply its practice of
transparency and review for any government requests, as well as retain its own appeal process for removal decisions and grievances. As part of this collaboration, we will all focus on how to involve additional companies in the future.
Throughout this collaboration, we are committed to protecting our users' privacy and their ability to express themselves freely and safely on our platforms. We also seek to engage with the wider community of interested stakeholders in a transparent,
thoughtful and responsible way as we further our shared objective to prevent the spread of terrorist content online while respecting human rights.
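The hash-sharing scheme the statement describes can be sketched as follows, as a simplified illustration only. The companies do not name a fingerprinting algorithm; SHA-256 stands in here, though a real system would need a perceptual hash (such as PhotoDNA) that survives re-encoding and cropping. All function names are hypothetical.

```python
import hashlib

# The shared industry database: fingerprints only, never the media itself
# and never any personally identifiable information.
shared_hashes = set()

def fingerprint(media_bytes: bytes) -> str:
    # SHA-256 as a stand-in; it only matches byte-identical files.
    return hashlib.sha256(media_bytes).hexdigest()

def contribute(media_bytes: bytes) -> None:
    """A company adds the fingerprint of content it has already removed
    under its own policies."""
    shared_hashes.add(fingerprint(media_bytes))

def flag_for_review(media_bytes: bytes) -> bool:
    """A match does not auto-remove anything: per the statement, each
    company reviews flagged content against its own definitions."""
    return fingerprint(media_bytes) in shared_hashes
```

The design choice the statement emphasises is visible here: only hashes cross company boundaries, and a database hit triggers a policy review rather than automatic removal.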
A PC campaign group is calling for the totally disproportionate punishment of the sack for Strictly Come Dancing judge Craig
Revel Horwood after he said he liked the sex and violence in the hit TV series Game of Thrones.
Revel Horwood was on the More4 comedy panel show 8 Out of 10 Cats on Tuesday night when discussing the hit American fantasy drama series with other panelists on the show.
Host Jimmy asked Horwood if he watched the programme and he replied:
No, I persevered for the first series until the dragon came on and that's when I switched off.
I liked all the sex scenes and the rape and I liked the cleavers through the skulls and I liked all of that, but I got very bored in the end.
Irish actress Aisling Bea, who was on his team, looked horrified. She said:
When they weren't raping anyone? Am I the only one who heard that? What world are we living in? Oh Trump's world, fine keep going.
Marilyn Hawes, founder of Enough Abuse UK, said she was:
Absolutely disgusted. Rape is the most devastating and vile crime and I would have to question him as a person and his merit as a judge on the Strictly Come Dancing panel, which is a family show.
His comment would have enraged many women and men who have been raped. How could he say he liked that scene? I'm absolutely disgusted.
You cannot have people on a family show with that mindset. I think the BBC should get rid of him. It is so distressing for people who have been raped to hear that.
All-round politically correct good egg finds himself on the receiving end of attacks from social 'justice' warriors for disrespecting people with 'less formal education' who don't understand his long words and well-crafted prose
Ofcom has set out how it will take on regulation of the BBC from next April.
This will see the biggest reform of the governance and regulation of the BBC since it was founded.
The Government has decided that a new BBC unitary board will govern and run the BBC, and ultimately be responsible for editorial and management decisions.
Ofcom will become the new external regulator of the BBC. Our job will be to hold the BBC to account.
We have published a statement
explaining how we will prepare to undertake our new regulatory duties from the planned effective date, 3 April 2017.
Ofcom's approach to regulating the BBC
As the new external regulator, Ofcom will bring its experience of regulating the wider broadcast and communications sector at a time of increasing convergence.
Ofcom already has roles across many of the BBC's services, from content standards to competition. These new responsibilities broaden the scope of Ofcom's existing work.
Regulation of the BBC will sit within Ofcom's existing teams and will focus on three core areas as laid out in the Charter: content standards; protecting fair and effective competition; and reviewing the BBC's performance.
In order to carry out our new duties effectively and efficiently, and to provide clarity to audiences and the wider sector, we will:
Proceed from our principal duty -- as with all our work, our principal objective is to further the interests of citizens and consumers;
Recognise that the BBC is the cornerstone of public service broadcasting in the UK -- the BBC has a special status, but we won't give it special treatment;
Recognise that responsibility for governance lies with the new BBC Board -- it is for the BBC Board, rather than Ofcom, to determine how to deliver the mission and purposes defined in the Charter. The Board must set the BBC's editorial guidelines.
We will hold the BBC to account;
Make good use of our depth of knowledge and experience -- we have experience of regulating the broadcasting sector, as well as existing roles in relation to the BBC in the key areas of content standards, competition and performance;
Consult widely -- ensure the views of citizens, consumers and stakeholders feed into our work; and
Be clear about our expectations and requirements of the BBC -- provide clarity on how we will address issues if things go wrong, to provide certainty to the BBC, its audiences and the wider sector.
In the coming months, Ofcom will develop an 'Operating Framework' for the BBC. This will ultimately contain all of the elements of our regulation across the BBC's content standards, competition and performance.
The Operating Framework will set out the regulatory tools that Ofcom will use to hold the BBC to account. There will be separate consultations on the finer details of our role over the coming months, which fall into the following broad categories:
1. Content standards
Viewers and listeners should be able to trust what they see and hear. They should know that steps have been taken to avoid unjustified offence, and that protection from harmful content is in place. Ofcom will set content standards for the BBC so that its
viewers and listeners are adequately protected.
The previous Charter and Agreement gave Ofcom shared regulatory oversight of some of the BBC's content standards with the BBC Trust, which will close when Ofcom takes on its new role. The new arrangement hands Ofcom regulatory responsibility for content
standards on BBC broadcasting and on-demand programme services including, for the first time, the accuracy and impartiality of BBC news and current affairs programmes. Ofcom will be updating the rules in its Broadcasting Code to fulfil these new responsibilities.
Ofcom will also create procedures for handling complaints about BBC content standards, and for conducting our investigations and sanctions.
Additionally, we will publish procedures explaining how audiences will be able to obtain an independent opinion from Ofcom on whether the BBC has observed relevant editorial guidelines for online material in its UK Public Services.
2. Protecting fair and effective competition
Fair and effective competition is good for viewers and listeners. It can increase choice and stimulate investment and innovation -- ensuring the provision of a wide range of high-quality and varied programmes, and different ways to access them.
Ofcom will assess the effect of the BBC's activities on audiences and the UK media sector, and set rules as to how the BBC should behave.
We will also impose requirements on the BBC to prevent the relationship between its public-service activities and commercial subsidiaries from distorting the market, or creating an unfair competitive advantage for the BBC's subsidiaries.
3. Performance -- holding the BBC to account
Ofcom is currently developing a set of tools to regulate the BBC's performance. These will include an Operating Licence for the BBC's UK public services and may include any further performance measures we consider appropriate, in addition to those set by the BBC. We will consult on this over the course of next year.
As explained in the Charter, we will have a particular focus on assessing the distinctiveness of the BBC's output. We will also hold the BBC to account in relation to its obligations to serve audiences in all four of the UK's nations and for diversity.
As part of the approach to performance, we expect to carry out both ad hoc and periodic reviews of the BBC's services.
Selling media players with pirate add-ons violates EU law, according to a recommendation from Advocate General Campos
He issued the advice in a landmark case over the legality of pre-loaded XBMC/Kodi devices, which are widely sold across Europe. Whether users of these players are also liable depends on whether they know that the content is infringing. While Kodi itself is a
neutral platform, there are plenty of add-ons available that turn it into a piracy haven.
In Europe, the European Court of Justice is currently handling a landmark case that should provide more clarity on the legality of set-top boxes that are sold with links to infringing content.
The issue was raised in a case between Dutch anti-piracy group BREIN and the Filmspeler.nl store, which sells piracy-configured media players. While these devices don't host any infringing content, they ship with add-ons that make it very
easy to watch infringing content.
The Dutch District Court referred the case to the EU Court of Justice, and Advocate General (AG) Campos Sánchez-Bordona has issued his recommendation to the Court. The AG concluded that selling a media player in the knowledge that it links to
infringing material constitutes a communication to the public, which makes it copyright infringement.
Whether the users of these devices are also acting unlawfully is a different question. According to the AG it would be logical to conclude that, when offering devices with pirate add-ons is illegal, using them would be too:
In my opinion, if the key factor, in the case of a person who inserts a hyperlink without pursuing a profit, is knowledge that the protected work is available on the internet unlawfully, it would be difficult not to extend that criterion to a
person who merely makes use of that hyperlink, also without pursuing a profit.
The Advocate General's advice is often crucial, but not binding. It is expected that the EU Court of Justice will issue its final verdict in this case early next year.
Two dozen human rights and civil liberty groups have thrown their weight behind Google's challenge of a Canadian court decision
it warns could stifle freedom of expression around the world and lead to a 'diminished internet of the lowest common denominator'.
In an appeal heard on Tuesday in the Supreme Court of Canada, Google Inc took aim at a 2015 court decision that sought to censor search results beyond Canada's borders.
In 2012, Canadian company Equustek won a judgment to have a company banned from selling a counterfeit version of Equustek's product online. Google voluntarily removed more than 300 infringing URLs. But as more sites popped up, Equustek went back to court
-- this time seeking a worldwide ban. A court of appeal in British Columbia sided with Equustek in 2015, ordering Google to remove all of its search results linked to the company. It is this ruling that Google is now appealing.
The human rights groups are focusing on the question at the heart of the precedent-setting case: if one country can control what you see on the internet, what is to prevent other countries from doing the same? Gregg Leslie of the Reporters Committee
for Freedom of the Press said:
It's a worrisome trend, where we see individual countries trying to regulate the internet worldwide. And of course the consequences of that would mean that even countries like Russia and China could do the same thing and that will really affect the
content available on the internet.
The European Commission has called on tech companies such as Twitter, Facebook, and other major names to implement more
aggressive measures to censor online hate speech. The alternative is to face new EU legislation that would force the tech companies to censor more quickly.
The Financial Times reports that a study commissioned by the EU justice commissioner, Vera Jourova, found that YouTube, Google, Microsoft, Twitter, and Facebook have struggled to comply with the voluntary hate speech code of conduct announced
earlier this year, amid national security concerns and heightened racial tensions, mostly resulting from unpopular EU refugee policies.
In Germany, the government-led effort has been particularly aggressive. Germany is one of the European nations where the ongoing refugee crisis has reinvigorated the far-right and sparked a backlash against government policy. Reuters reports that Heiko
Maas, the German Justice Minister, recently said that Facebook should be made liable for any hate speech published on its social media platform and should be treated as a media company.
According to The Verge, Google, Twitter, Facebook and Microsoft agreed in a code of conduct announced in May to review and respond within 24 hours to the majority of hate speech complaints. However, only 40% of the recorded incidents have been
reviewed within 24 hours, according to the commission's report. That figure rose to 80% after 48 hours.
According to PCMag, two advocacy groups have criticized those efforts in France. In May, the two rights groups announced plans to sue Google, Twitter, and Facebook for failing to remove homophobic, racist and other hateful
posts from their platforms. News articles have so far failed to point out that some of these groups may be making false claims about material being censorable. Perhaps the media companies were right not to remove all of the posts reported.
EU justice ministers will meet to discuss the report's findings on 8th December.
Offsite Comment: Social Networks Must Stand Against Censorship
The pressure for social networks to censor the content that appears on them just won't cease, and the networks are bending. Censorship, however, is not what users want. Nor is it technically possible, even if the platforms won't admit it.
Lawmakers in France have voted to ban misleading anti-abortion web sites. After a heated debate, the French National Assembly passed a bill to
outlaw websites spreading misinformation about abortion. Pro-abortion activists have accused Pro-Life campaigners of pretending to give neutral information while putting pressure on women not to have abortions.
The new law, which still has to pass the Senate, extends an existing law against physical intimidation over abortion to digital media, and would extend the scope of a 1993 law, which criminalizes false information related to abortion, to digital media.
Providing false information on abortion online would be punishable by up to two years in prison and a 30,000 euro fine, a stipulation that pro-life advocates were quick to ridicule.
In the current fad for blaming all society's ills on 'false' news and information, this adds an interesting possibility of religious commandments being tested in court as false information. The underlying religious view is that abortion is bad simply
because their god said so. And surely opponents will understandably see this as false information.
Bruno Retailleau, who heads the Republicans party group in the Senate, says the bill is totally against freedom of expression. He claimed the bill went against the spirit of the 1975 law legalizing abortion, which called for women to be
informed of alternatives.
A book parodying vintage children's texts has been withdrawn following a fierce attack by social justice whingers who claim that Bad Little Children's Books is racist for its depictions of Native Americans and Muslims
Illustrator Arthur C. Gackley has now withdrawn the title saying:
The book is clearly not being read by some in the way I had intended--as satire--and, more disturbingly, is being misread as the very act of hate and bigotry that the work was meant to expose, not promote.
Images in the book, marketed as a collection of 120 edgy, politically incorrect parodies, include a girl wearing a burka giving a ticking present to a little boy for a book called Happy Burkaday, Timmy by Ben Laden.
Examples of the miserable and censorial whinges are:
Nick Hanover tweeted: We need to stop letting entities like @ABRAMSbooks claim satire whenever they want to publish hateful trash.
Book Riot blogger Kelly Jensen whinged:
In a culture which is hateful and violent against anyone outside of the Christian norm, particularly Muslims, who thought this was even an okay image to present in a book, humorous or not? This is the sort of harmful imagery and stereotyping that
literally kills lives -- and it's not the lives of little white boys who are dying. It's the lives of those, like the girl in the burka, who are impacted by disgusting "humor" like this. We don't live in a world where humor like this is
acceptable. This kind of "humor" is never acceptable. It's deadly.
Book publisher Abrams disagreed with the author's withdrawal of the book saying:
In the last few days some commentators on social media and those who follow them have taken elements of the book out of context, failing to recognise it as an artistic work of social satire and comic parody. They argue that it lends credence to the
hateful views that the author's work is clearly meant to mock, demean, expose, and subvert.
Bad Little Children's Books is a work of parody and satire and, as such, it is intentionally, openly, and provocatively offensive... We stand fully behind freedom of speech and artistic expression, and fully support the First Amendment. We have been
disheartened by calls to censor the book and to stifle the author's right to express his artistic vision by people we would expect to promote those basic fundamental rights and freedoms.
However, faced with the misperceived message of the book, we are respecting the author's request.
Russian MPs have sent a letter of complaint to the country's internet censors and state 'consumer protection' agencies asserting that FIFA 17 may be in violation of Russia's 2013 gay propaganda law that claims the presence of positive homosexual
material in media will do harm to children's health and development.
In this case, it was games publisher EA giving out a free rainbow football kit that led to Communist MPs sending the letter. EA gave away the digital item in support of the Rainbow Laces campaign meant to combat homophobia, biphobia and transphobia in sport.
According to The Guardian, MP Valery Rashkin says the family-friendly-rated game needs to be investigated by the Federal Service for Supervision in the Sphere of Telecom, Information Technologies and Mass Communications to ensure it is in compliance
with the 2013 law.
The European Commission says Internet hosts should pre-censor everything we upload to the Internet for copyright violations. The UK agrees.
Tell the UK's Intellectual Property Office (IPO) we don't want rights holders to monitor and filter the Internet!
The European Commission has published plans to force Internet companies to filter everything we upload in case it infringes copyright laws. The UK's Intellectual Property Office wants our views on the European Commission's plans. The UK Government
is minded to support the plans if they can get them to work.
This could block Downfall parodies, campaign videos, TV clips, memes, profile pics -- anything that appears to reuse copyright content, even if it is legal to do so.
We need to stop this censorious, privacy-invading, anti-innovation proposal. Users of social media, photo, music and video sharing sites would all be hit hard.
Any company that lets you upload content to the Internet would check everything you upload against a database of copyright works - a massive violation of privacy in order to create this censorship regime.
If you want to insist on your right to publish, you'd have to supply your name and address and agree that you can be prosecuted by the rightsholder. That will put most people off taking the risk, even if they are within their rights to do so. And if
rightsholders think that websites aren't monitoring their users' uploads closely enough, they can take those websites to court too.
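To see why campaigners call the proposal censorious, it helps to sketch what "checking everything you upload against a database of copyright works" actually entails. The illustration below is a minimal, hypothetical sketch using exact SHA-256 fingerprints; real filtering systems use fuzzier perceptual fingerprints, and the database contents here are invented:

```python
import hashlib

# Hypothetical rightsholder database: fingerprints of registered works.
RIGHTSHOLDER_DB = {
    hashlib.sha256(b"registered film clip").hexdigest(),
    hashlib.sha256(b"registered song").hexdigest(),
}

def upload_allowed(content: bytes) -> bool:
    """Reject any upload whose fingerprint matches a registered work.

    Note the key problem: a bare match check cannot recognise parody,
    quotation or other lawful reuse -- it blocks matches regardless of
    whether the use is legal.
    """
    fingerprint = hashlib.sha256(content).hexdigest()
    return fingerprint not in RIGHTSHOLDER_DB

print(upload_allowed(b"home video"))            # True: unmatched content passes
print(upload_allowed(b"registered film clip"))  # False: any match is blocked
```

The sketch also shows where the privacy objection comes from: every upload, including entirely private material, must be fingerprinted and compared before it can be published.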
David Currie has been appointed Chairman of the Advertising Standards Authority and will succeed the current Chairman, Chris Smith
The appointment was announced by the Advertising Standards Board of Finance, the bodies that fund the advertising self-regulation system, following consultation with the Department for Culture, Media and Sport (DCMS), Ofcom and the Advertising Association.
Currie has good experience of media censorship as he was the founding Chairman of Ofcom.
Currie will take up his position from 1 October 2017.
It was Conservative MP and former minister John Whittingdale who introduced the bill. But now, the BBC is reporting that
he's worried it might not actually work. He told Parliament:
One of the main ways in which young people are now exposed to pornography is through social media such as Twitter, and I do not really see that the bill will do anything to stop that happening.
This gets neatly at a key problem with the porn filter: The internet is not neatly divided into pornography and non-pornography. As I wrote last week, it's technically simple to block dedicated fetish websites. But plenty of sites mix porn with
non-pornographic content, or include both conventional and non-conventional material -- raising serious questions as to how the filter could ever work in practice.
A campaign group, the Evangelical Alliance, has claimed that a new Christmas-themed board game is 'offensive, shocking and blasphemous'.
Santa vs Jesus, made by London company Komo Games, is played by two teams, one for each of the festive figures, who battle through challenges in an attempt to win the most 'believers'.
It was funded via crowdfunding site Kickstarter, which said it was 'the most complained about game in history'. But fans have called it 'good fun'.
Danny Webster, spokesperson for the Evangelical Alliance, whinged about the game, saying he believes:
It trivialises Christian belief and equates them both as fictional characters. With over 4 out of 10 people in the UK mistakenly thinking that Jesus was not a real historical person, this game won't help correct that.
At its heart Christmas is about celebrating the birth of Jesus and the gift of life he brings. Santa comes from the story of St Nicholas who as a Christian bishop was generous to the poor and was very happy to have Christ as his king.
When it comes to Santa vs Jesus, we're firmly on Team Jesus too.
One of the creators of the game, Julian Miller, says:
Sales are exceeding all expectations and we've had to rush through another order with our manufacturer to keep up with the demand.
The enthusiasm of our family and friends and the rise in popularity of games such as Cards Against Humanity and Exploding Kittens made us realise there was a gap in the market for a funny tongue-in-cheek game pitting Santa against Jesus. For years people
have wondered 'who rules Christmas? Santa or Jesus?'
Students at Plymouth University have followed students at City, University of London by deciding to ban three newspapers from their campus.
It means that the campus shop run by Plymouth's students will not sell copies of the Sun, Daily Mail, Daily Express and their Sunday equivalents from the new year.
According to a report in the Plymouth Herald, the decision was taken by the executive council of the University of Plymouth Students' Union (UPSU). The motion was passed, says a UPSU Facebook statement, by a large majority. However students
themselves were not consulted. The Students' Union said:
Whilst we believe that freedom of expression and speech are inalienable human rights, as defined in the Universal Declaration of Human Rights, a number of British tabloids are known to express hateful views. [...BUT...]
They aim at belittling and demonising certain groups in society, such as immigrants, refugees, asylum seekers, disabled people, the LGBTIQA+ community, Muslims, Black and Asian communities...
It is our duty to protect and empower and represent marginalised and discriminated-against groups... UPSU opposes hatred, discrimination and demonisation of any individual...
Because of these very values that we hold and we are proud of, we believe that it is unethical for us to profit out of the sale of hateful, non-factual and anti-scientific media platforms.
Warning: Fake News Alert: No sane and rational person
would ever utter such bilge. This must be fake news...surely...
An Indian politician has blamed soap operas for the breakdown of society, saying that women become so caught up in them that they neglect to make their husbands a cup of tea.
Goa's Art and Culture Minister Dayanand Mandrekar made the ludicrous claims while speaking at an awards function. He spewed:
Women are so interested in watching these serials, that once they start watching them in the evenings, they do not even pay attention to their husbands who come home after a long day at work. She is not even in the frame of mind to ask him whether he
would like to have a cup of tea or not.
He also said that because of soap operas people do not even care about religious festivals in the village, claiming many residents only attend if there is a break in programming.
What is 'fake news' anyway? Is it news that hides truths that are unpalatable to the politically correct? Is it reports of weapons of mass destruction in Iraq? Is it politicians outlining improvements in the economy?
Sausage Party is a 2016 USA animation comedy adventure by Greg Tiernan and Conrad Vernon.
Starring Kristen Wiig, Seth Rogen and Paul Rudd.
An animated fable about the delusion of religion. It is set in an American supermarket, its characters are horny and often blasphemous foodstuffs who at one point engage in a mass sex party.
Catholic and right-wing groups have whinged about a Seth Rogen cartoon featuring lots of strong language and a foodstuffs orgy scene.
France's film certification board (Centre National de la Cinématographie, Commission de Classification, CNC) has now come under renewed censorship pressure from conservative organisations angry at what they perceive as an overlenient rating given to
a Hollywood cartoon.
In the US, the film was rated R for strong crude sexual content, pervasive language, and drug use. The film was cut in the US to avoid an NC-17 rating, with the censors asking for the deletion of the hairy scrotum of a pita bread. This cut version
has been distributed worldwide. In the UK, it was rated 15 for 'very strong language, strong sex references'. In France it has been given a 12 certificate.
Jean-Frédéric Poisson, president of France's Christian Democratic party whinged:
An orgy scene for 12-year-olds! Everything remains to be done to combat early exposure to pornography.
La Manif Pour Tous, which has campaigned against same-sex marriage, fired off an angry tweet:
Hello CNC, explain how you can authorise the screening of a giant orgy for the whole family?
The Association of Catholic Families warned parents:
[The movie gives] the appearance of being intended for young people and children. Its content is not only coarse, but also clearly pornographic, under cover of being 'politically incorrect'.
The French ratings board has traditionally been more lenient than its UK and US equivalents, but is not entirely out on a limb in Europe as Sweden awarded an even lower rating, 11A.
Through an Assembly question to the Justice Minister, South Down UUP MLA Harold McKee has established that ultra-secretive
super-injunctions are still being ordered in Northern Ireland.
A standard injunction is a gagging order imposed by a judge, which bans anybody in the court's jurisdiction from reporting a story, or naming the parties involved. A super-injunction goes further and seeks to ban any mention of the fact that such an
injunction has been imposed in the first place.
Super-injunctions are so extraordinary and unwieldy that even the normally ban-happy London courts effectively banished them five years ago after getting a bad rap in a few celebrity cases.
However, it appears that Northern Ireland's courts are continuing to use the discredited orders. After a lull since 2009, a fresh super-injunction was granted in 2015 and another one just this year. That is all we are allowed to know.
Harold McKee linked the continuation of super-injunctions in Northern Ireland to the Executive's refusal to adopt the liberalising reforms to the defamation laws introduced across the rest of the UK.
A man in France has been sentenced to two years in prison for repeatedly visiting pro-ISIS websites, even though there is no indication that he planned to
stage a terrorist attack. The man was convicted under a new law that has drawn scorn from civil liberties groups. In addition to the two-year prison sentence, he will have to pay a €30,000 fine.
Police discovered the man's browsing history after conducting a raid on his house. During the investigation, they found pro-ISIS images and execution videos on his phone, personal computer, and a USB stick. An ISIS flag was on the wallpaper of his
computer desktop, and his computer's password was "13novembrehaha," a reference to the night gunmen killed 130 people in attacks across Paris. The man had been regularly consulting jihadist websites for two years, police said.
This week's conviction is the latest handed down under a controversial law that criminalizes the "habitual" consultation of websites that promote terrorism. The law makes exceptions for those who visit the sites "in good faith" -- for
research, to inform the public, or for judicial purposes.
The Council of the EU could undermine encryption as soon as December. It has been asking delegates from all EU countries to detail their national legislative position on encryption.
We've been down this road before. We know that encryption is critical to our right to privacy and to our own digital security. We need to come together once again and demand that our representatives protect these rights -- not undermine them in secret.
Act now to tell the Council of the EU to defend strong encryption!
Dear Slovak Presidency and Delegates to the Council of the EU:
According to the Presidency of the Council of the European Union, the Justice and Home Affairs Ministers will meet in December to discuss the issue of encryption. At that discussion, we urge you to protect our security, our economy, and our governments
by supporting the development and use of secure communications tools and technologies and rejecting calls for policies that would prevent or undermine the use of strong encryption.
Encryption tools, technologies, and services are essential to protect against harm and to shield our digital infrastructure and personal communications from unauthorized access. The ability to freely develop and use encryption provides the cornerstone
for today's EU economy. Economic growth in the digital age is powered by the ability to trust and authenticate our interactions and communication and conduct business securely both within and across borders.
As the United Nations Special Rapporteur for freedom of expression has noted, encryption and anonymity, and the security concepts behind them, provide the privacy and security necessary for the exercise of the right to freedom of opinion and expression
in the digital age.
Recently, hundreds of organizations, companies, and individuals from more than 50 countries came together to make a global declaration in support of strong encryption. We stand with people from all over the world asking you not to break the encryption we rely on.
Among the many unpleasant things in the Investigatory Powers Act that was officially
signed into law this week, one that has not gained as much attention is the apparent ability for the UK government to undermine encryption and demand surveillance backdoors.
As the bill was passing through Parliament, several organizations noted their alarm at section 217 which obliged ISPs, telcos and other communications providers to let the government know in advance of any new products and services being deployed and
allow the government to demand technical changes to software and systems.
Communications Service Providers (CSP) subject to a technical capability notice must notify the Government of new products and services in advance of their launch, in order to allow consideration of whether it is necessary and proportionate to require the
CSP to provide a technical capability on the new service.
As per the final wording of the law, comms providers on the receiving end of a technical capability notice will be obliged to do various things on demand for government snoops -- such as disclosing details of any system upgrades and removing electronic protection
on encrypted communications.
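Why "removing electronic protection" is so contentious can be shown with a toy example. In end-to-end encryption, only the two endpoints hold the key; the provider carries ciphertext it cannot read, so there is no protection it can "remove" without first redesigning the system to escrow keys. The sketch below is purely illustrative, using a toy HMAC-based stream cipher rather than any real messaging protocol, and all names are hypothetical:

```python
import hashlib
import hmac
import secrets

def keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    """Toy keystream: HMAC-SHA256 in counter mode (illustration only,
    not a vetted cipher)."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hmac.new(key, nonce + counter.to_bytes(8, "big"),
                        hashlib.sha256).digest()
        counter += 1
    return out[:length]

def encrypt(key: bytes, nonce: bytes, plaintext: bytes) -> bytes:
    # XOR the plaintext with the keystream derived from the shared key.
    return bytes(a ^ b for a, b in
                 zip(plaintext, keystream(key, nonce, len(plaintext))))

decrypt = encrypt  # XOR stream cipher: the same operation both ways

# The two endpoints share a key; the provider only ever relays ciphertext.
key = secrets.token_bytes(32)
nonce = secrets.token_bytes(16)
ciphertext = encrypt(key, nonce, b"meet at noon")

# Only a party holding the endpoints' key can recover the plaintext.
# A provider holding just the ciphertext has nothing it can disclose.
assert decrypt(key, nonce, ciphertext) == b"meet at noon"
```

The design point: a notice demanding the provider "remove electronic protection" can only be satisfied if the provider holds (or escrows) the keys, which is exactly what end-to-end designs exist to avoid.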