We met to discuss BBFC's voluntary age verification privacy scheme, but BBFC did not attend. Open Rights Group met a number of age verification providers to discuss the privacy standards that they will be meeting when the scheme
launches, slated for April. Up to 20 million UK adults are expected to sign up to these products.
We invited all the AV providers we know about, and most importantly, the BBFC, at the start of February. BBFC are about to launch a voluntary privacy standard which some of the providers will sign up to. Unfortunately, BBFC have not committed to any public consultation about the scheme, relying instead on a commercial provider to draft its contents with providers, without wider feedback from privacy experts and advocates for users.
We held the roundtable close to the BBFC's offices so that it would be convenient for them to send someone to discuss this with us. We have been asking for meetings with BBFC about the privacy issues in the new code since October 2018, but had not received any reply or acknowledgement of our requests until this morning, when BBFC said they would be unable to attend today's roundtable. This is very disappointing.
BBFC's failure to consult the public about this standard, or even to meet us to discuss our concerns, is alarming. We can understand that BBFC is cautious and does not wish to overstep its relationship with its new masters at DCMS. BBFC may be
worried about ORG's attitude towards the scheme, and we certainly are critical. However, it is irresponsible for a regulator to refuse to talk to its potential critics.
We are very clear about our objectives. We are acting to do our best to ensure that the risks to adult users of age verification technologies are minimised. We do not pose a threat to the scheme as a whole: listening to us can only result in making the
pornographic age verification scheme more likely to succeed, and for instance, to avoid catastrophic failures.
Privacy concerns appear to have been recognised by BBFC and DCMS as a result of consultation responses from ORG supporters and others, which resulted in the voluntary privacy standard. These concerns have also been highlighted by Parliament,
whose regulatory committee expressed surprise that the Digital Economy Act 2017 had contained no provision to deal with the privacy implications of pornographic age verification.
Today's meeting was held to discuss:
What the scheme is likely to cover; and what it ideally should cover;
Whether there is any prospect of making the scheme compulsory;
What should be done about non-compliant services;
What the governance of the scheme should be in the long term, for instance whether it might be suitable to become an ICO-backed code, or to complement such a code.
As we communicated to BBFC in December 2018, we have considerable worries about the lack of consultation over the standard they are writing, which appears to be truncated in order to meet the artificial deadline of April this year. This is what
we explained to BBFC in our email:
Security requires as many perspectives to be considered as possible.
The best security standards, e.g. PCI DSS, are developed in the open and iterated upon.
The standards will be best if those with most to lose are involved in the design.
For PCI DSS, the banks and their customers have more to lose than the processors
For Age Verification, site users have more to lose than the processors, however only the processors seem likely to be involved in setting the standard
We look forward to BBFC agreeing to meet us to discuss the outcome of the roundtable we held about their scheme, and to discuss our concerns about the new voluntary privacy standard. Meanwhile, we will produce a note from the meeting, which we
believe was useful. It covered the concerns above, and issues around timing, as well as strategies for getting government to adjust their view of the absence of compulsory standards, which many of the providers want. In this, BBFC are a critical
actor. ORG also intends as a result of the meeting to start to produce a note explaining what an effective privacy scheme would cover, in terms of scope, risks to mitigate, governance and enforcement for participants.
Chinese government censors are reading Australian publishers' books and, in some cases, refusing to allow them to be printed in China if they fail to comply with a long list of restrictions.
Publishing industry figures have confirmed that the censors from the State Administration of Press, Publication, Radio, Film and Television of the People's Republic of China are vetting books sent by Australian publishers to Chinese printing
presses, even though they are written by Australian authors and intended for Australian readers.
Any mention of a list of political dissidents, protests or political figures in China is entirely prohibited, according to a list circulated to publishers and obtained by The Age and Sydney Morning Herald.
The list of prohibitions mentions key political incidents, including the 1989 Tiananmen Square protests, the pro-democracy protests in 2011 and the 2014 umbrella revolution in Hong Kong. The Tibetan independence movement, Uighur nationalism and
Falun Gong are also taboo subjects.
Mention of all major Chinese political figures, including Mao Zedong and the current president, Mr Xi, and all current members of the Politburo Standing Committee is ruled out, as is a long list of 118 dissidents who are not allowed to be mentioned.
Most major religions are also on the sensitive list, as well as a long list of Chinese, or formerly Chinese, locations, most relating to current or former border disputes. The printer's guidance says these things can be published only after vetting by the authorities.
Pornography was ruled out entirely, but artistic nudity or sexual acts could be censored in 10 working days.
Printing books, particularly those with colour illustrations, is significantly cheaper in China, so some publishers have little choice but to put them through the government censorship process.
Sandy Grant, of publisher Hardie Grant, said he had scrapped a proposed children's atlas last year because the censors ruled out a map showing the wrong borders (probably to do with Chinese claims about Taiwan or Tibet). European alternatives were considered economically unviable.
A printing industry source who works with Chinese presses confirmed that the rules had, in theory, been in place for a long time, but that "all of a sudden they've decided to up the ante. They're checking every book; they're very, very strict at the moment. I don't know how they're reading every book, but they definitely are," the printer said. The change had happened in the past few months.
In 2017, Chelsea Russell, a Liverpool teenager with Asperger's syndrome, paid tribute on her Instagram profile to a 13-year-old friend who died when he was hit by a car. She quoted the lyrics of the rap song "I'm Trippin'" by Snap Dogg, alongside the phrase "RIP Frankie Murphy". Many other teenagers used the lyrics to pay tribute to Murphy.
A year later, Russell's profile came to the attention of the police, who decided to arrest her and have her charged. The lyrics she quoted -- "Kill a snitch nigga, rob a rich nigga" -- were found in court to be grossly offensive and Russell was convicted of a hate crime. For nothing more than quoting rap lyrics, she was placed on an eight-week, 8am-to-8pm curfew, fitted with an ankle tag, and fined £585.
Last week, the conviction was overturned on appeal. Russell's defence lawyer slammed the initial verdict as ridiculous, akin to the actions of a totalitarian state.
Offsite Comment: Chelsea Russell and the depravity of PC
Tommy Robinson has been permanently banned from Facebook and its sister website Instagram. In a blogpost, Facebook said:
When ideas and opinions cross the line and amount to hate speech that may create an environment of intimidation and exclusion for certain groups in society -- in some cases with potentially dangerous offline implications -- we take action. Tommy
Robinson's Facebook page has repeatedly broken these standards, posting material that uses dehumanizing language and calls for violence targeted at Muslims. He has also behaved in ways that violate our policies around organized hate.
Robinson is already banned from Twitter and the decision to cut him off from Instagram and Facebook will leave him reliant on YouTube as the only major online platform to provide him with a presence.
The ban comes a month after Facebook issued a final written warning against Robinson, warning him that he would be removed from its platform permanently if he continued to break the company's hate speech policies.
Mainstream outlets have struggled to deal with Robinson. When he was interviewed by Sky News last year, Robinson responded by uploading an unedited video of the discussion showing that Sky News did in fact mislead viewers by mixing and matching questions and answers to make Robinson look bad. The video became an online success and was shared far more widely online than the original interview.
Robinson adopted a similar tactic with the BBC's Panorama, which is investigating the far-right activist. Two weeks ago, Robinson agreed to be interviewed by the programme, only to turn the tables on reporter John Sweeney by revealing he had sent
an associate undercover to film the BBC reporter.
Several other accounts were removed from Facebook on Tuesday, including one belonging to former Breitbart London editor Raheem Kassam.
We received complaints following the third party release of secretly recorded material related to a BBC Panorama investigation.
BBC Panorama is investigating Tommy Robinson, whose real name is Stephen Yaxley-Lennon. The BBC strongly rejects any suggestion that our journalism is faked or biased. Any programme we broadcast will adhere to the BBC's strict editorial
guidelines. BBC Panorama's investigation will continue.
John Sweeney made some offensive and inappropriate remarks whilst being secretly recorded, for which he apologises. The BBC has a strict expenses policy and the drinks bill in this video was paid for in full by John.
Offsite Comment: Why Tommy Robinson should not be banned
In the evening of February 13, negotiators from the European Parliament and the Council concluded the trilogue negotiations with a final text for the new EU Copyright Directive.
For two years we've debated different drafts and versions of the controversial Articles 11 and 13. Now, there is no more ambiguity: This law will fundamentally change the internet as we know it -- if it is adopted in the upcoming final vote. But we can still prevent that!
Commercial sites and apps where users can post material must make "best efforts" to preemptively buy licences for anything that users may possibly upload -- that is: all copyrighted content in the world. An impossible feat.
In addition, all but very few sites (those both tiny and very new) will need to do everything in their power to prevent anything from ever going online that may be an unauthorised copy of a work that a rightsholder has registered with the platform. They will have no choice but to deploy upload filters, which are by their nature both expensive and error-prone.
Should a court ever find their licensing or filtering efforts not fierce enough, sites are directly liable for infringements as if they had committed them themselves. This massive threat will lead platforms to over-comply with these
rules to stay on the safe side, further worsening the impact on our freedom of speech.
Reproducing more than "single words or very short extracts" of news stories will require a licence. That will likely cover many of the snippets commonly shown alongside links today in order to give you an idea of what they lead
to. We will have to wait and see how courts interpret what "very short" means in practice -- until then, hyperlinking (with snippets) will be mired in legal uncertainty.
No exceptions are made even for services run by individuals, small companies or non-profits, which probably includes any monetised blogs or websites.
The project to allow Europeans to conduct Text and Data Mining, crucial for modern research and the development of artificial intelligence, has been obstructed with too many caveats and requirements. Rightsholders can opt out of having their works datamined by anyone except research organisations.
Authors' rights: The Parliament's proposal that authors should have a right to proportionate remuneration has been severely watered down: Total buy-out contracts will continue to be the norm.
Minor improvements for access to cultural heritage: Libraries will be able to publish out-of-commerce works online and museums will no longer be able to claim copyright on photographs of centuries-old paintings.
How we got here
The history of this law is a shameful one. It was proposed by former digital Commissioner Oettinger.
From the very beginning, the purpose of Articles 11 and 13 was never to solve clearly-defined issues in copyright law with well-assessed measures, but to serve powerful special interests, with hardly any concern for the collateral damage caused.
In the conservative EPP group of rapporteur Axel Voss, the driving force behind this law, dissenters were marginalised. The work of the group's initially-appointed representative was thrown out after the conclusions she reached were too sensible. Mr Voss then voted so blindly in favour of any and all restrictive measures that he was caught by surprise by some of the nonsense he had gotten approved. His party, the German CDU/CSU, nonchalantly violated the coalition agreement they had signed (which rejected upload filters), paying no mind to their own minister for digital issues.
It took efforts equally herculean and sisyphean
across party lines to prevent the text from turning out even worse than it now is.
In the end, a closed-door horse trade between France and Germany was enough to outweigh the objections... so far.
What's important to note, though: It's not "the EU" in general that is to blame -- but those who put special interests above fundamental rights who currently hold considerable power. You can change that at the polls! The anti-EU
far right is trying to seize this opportunity to promote their narrow-minded nationalist agenda -- when in fact without the persistent support of the far-right ENF Group (dominated by the Rassemblement/Front National) the law
could have been stopped in the crucial Legal Affairs Committee and in general would not be as extreme as it is today.
We can still stop this law
Our best chance to stop the EU copyright law: The upcoming Parliament vote.
The Parliament and Council negotiators who agreed on the final text now return to their institutions seeking approval of the result. If it passes both votes unchanged, it becomes EU law, which member states are forced to implement into national law.
In both bodies, there is resistance.
The Parliament's process starts with the approval by the Legal Affairs Committee -- which is likely to be given on Monday, February 18.
Next, at a date to be announced, the EU member state governments will vote in the Council. The law can be stopped here either by 13 member state governments or by any number of governments who together represent 35% of the EU population. Last time, 8 countries representing 27% of the population were opposed. Either a large country like Germany or several small ones would need to change their minds: this is the less likely way to stop it.
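The blocking-minority arithmetic described above can be sketched in a few lines of Python. The population figures and the `blocks_proposal` helper below are illustrative assumptions (rough figures in millions, not official Eurostat data); the rule itself is as stated in the text: 13 governments, or any set of governments representing 35% of the EU population.

```python
# Sketch of the Council blocking-minority check described above.
# Populations are rough illustrative figures (millions), not official data.
populations = {
    "Germany": 83, "France": 67, "Italy": 60, "Spain": 47,
    "Poland": 38, "Netherlands": 17, "Belgium": 11, "Sweden": 10,
    # remaining member states omitted for brevity
}

EU_POPULATION = 512  # approximate EU-28 total, millions

def blocks_proposal(opposing):
    """A proposal is blocked if 13 or more governments oppose it, or if
    the opposing governments together represent at least 35% of the EU
    population."""
    share = sum(populations.get(c, 0) for c in opposing) / EU_POPULATION
    return len(opposing) >= 13 or share >= 0.35

# Germany + France + Italy alone exceed the 35% population threshold:
print(blocks_proposal(["Germany", "France", "Italy"]))  # True
```

This makes the article's point concrete: a handful of large countries can block on population share alone, while small countries would need a broad 13-government coalition.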
Our best bet: The final vote in the plenary of the European Parliament, when all 751 MEPs, directly elected to represent the people, have a vote. This will take place either between March 25 and 28, on April 4, or between April 15 and 18. We've already demonstrated last July that a majority against a bad copyright proposal is achievable.
The plenary can vote to kill the bill -- or to make changes , like removing Articles 11 and 13. In the latter case, it's up to the Council to decide whether to accept these changes (the Directive then becomes law without these articles) or
to shelve the project until after the EU elections in May, which will reshuffle all the cards.
This is where you come in
The final Parliament vote will happen mere weeks before the EU elections. Most MEPs -- and certainly all parties -- are going to be seeking reelection. Articles 11 and 13 will be defeated if enough voters make these issues relevant to the elections.
It is up to you to make clear to your representatives: Their vote on whether to break the internet with Articles 11 and 13 will make or break your vote in the EU elections. Be insistent -- but please always stay polite.
Facebook has restored several RT-linked pages a week after it blocked them without prior notice. The pages were only freed-up after their administrators posted data about their management and funding.
The Facebook pages of InTheNow, Soapbox, Back Then and Waste-Ed -- all operated by the Germany-based company Maffick Media -- were made accessible again as of Monday evening.
Facebook said in a statement at the time of the ban that it wants the pages' administrators to reveal their ties to Russia to their audience in the name of greater transparency. Facebook's measure was taken following a CNN report, which
ludicrously accused the pages of concealing their ties to the Kremlin, even though their administrators had never actually made a secret of their relations to Ruptly and RT. In fact, RT is very blatantly a propaganda channel supporting Russia.
Maffick CEO Anissa Naouai revealed that the social media giant agreed to unblock the pages, but only after their administration "updated our 'About' section, in a manner NO other page has been required to do". The accounts now indeed feature
information related to their funding and management, visible under the pages' logos.
About 100 journalists have been threatened with a charge of contempt of court -- and could face possible jail terms -- over reporting of the Cardinal George Pell trial.
Victoria's director of public prosecutions, Kerri Judd QC, has written to as many as 100 individual publishers, editors, broadcasters, reporters and subeditors at the media giants News Corp Australia, Nine Entertainment, the ABC, Crikey and
several smaller publications, accusing them of breaching a nationwide suppression order imposed during the case.
Those who do not have a strong enough explanation could be prosecuted. The gag order was issued by the chief judge of Melbourne's county court on 25 June 2018 in the matter of DPP v George Pell. The prosecution had applied for the suppression order to prevent the risk of prejudice in a second trial for Pell on separate charges.
The Herald Sun published the most dramatic piece: a black front page with the word CENSORED in large white letters. The world is reading a very important story that is relevant to Victorians, the page one editorial said. The Herald Sun is
prevented from publishing details of this very significant news. But trust us, it's a story you deserve to read.
Judd's letters targeted even oblique references because the gag order banned any information about the case, including that there was a suppression order.
While most of the sites blocked in Bangladesh's latest round of censorship are foreign, a few local websites and social media platforms have also been targeted. One of these websites, somewhereinblog.net, is the largest Bengali-language community blog platform in the world.
The post and telecommunications minister blamed the site for spreading atheism in Bangladesh.
A group of 33 Bangladeshi university teachers, journalists, bloggers, and activists have demanded that the government lift the ban on the blog platform immediately.
Article 13 is the on-again, off-again controversial proposal to make virtually every online community, service, and platform legally liable for any infringing material posted by their users, even very briefly, even if there was no conceivable way for the online service provider to know that a copyright infringement had taken place.
This will require unimaginable sums of money to even attempt, and the attempt will fail. The outcome of Article 13 will be a radical contraction of alternatives to the U.S. Big Tech platforms and the giant media conglomerates. That means
that media companies will be able to pay creators less for their work, because creators will have no alternative to the multinational entertainment giants.
Throwing Creators Under the Bus
The media companies lured creators' groups into supporting Article 13 by arguing that media companies and the creators they distribute have the same interests. But in the endgame of Article 13, the media companies
threw their creator colleagues under the bus , calling for the deletion of clauses that protect artists' rights to fair compensation from media companies, prompting
entirely justifiable howls of outrage from those betrayed artists' rights groups.
But the reality is that Article 13 was always going to be bad for creators. At best, all Article 13 could hope for was to move a few euros from Big Tech's balance-sheet to Big Content's balance-sheet (and that would likely be a temporary
situation). Because Article 13 would reduce the options for creators by crushing independent media and tech companies, any windfalls that media companies made would go to their executives and shareholders, not to the artists who would have no
alternative but to suck it up and take what they're offered.
After all: when was the last time a media company celebrated a particularly profitable year by increasing their royalty rates?
It Was Always Going to Be Filters
The initial versions of Article 13 required companies to build copyright filters, modeled after YouTube's "Content ID" system: YouTube invites a select group of trusted rightsholders to upload samples of works they claim as their
copyright, and then blocks (or diverts revenue from) any user's video that seems to match these copyright claims.
There are many problems with this system. On the one hand, giant media companies complain that such filters are far too easy for dedicated infringers to defeat; and on the other hand, Content ID ensnares all kinds of legitimate forms of expression, including
birdsong, and music uploaded by the actual artist for distribution on YouTube. Sometimes, this is because a rightsholder has falsely claimed copyrights that don't belong to them; sometimes, it's because Content ID generated a "false positive"
(that is, made a mistake); and sometimes it's because software just can't tell the difference between an infringing use of a copyrighted work and a use that falls under "fair dealing," like criticism, commentary, parody, etc. No one has
trained an algorithm to recognise parody, and no one is likely to do so any time soon (it would be great if we could train humans to reliably recognise parody!).
Copyright filters are a terrible idea. Google has spent a reported $100 million (and counting) to build a very limited copyright filter that only looks at videos and only blocks submissions from a select group of pre-vetted rightsholders. Article
13 covers all possible copyrighted works: text, audio, video, still photographs, software, translations. And some versions of Article 13 have required platforms to block infringing publications of every copyrighted work, even those
that no one has told them about: somehow, your community message-board for dog-fanciers is going to have to block its users from plagiarising 50-year-old newspaper articles, posts from other message-boards, photos downloaded from social
media, etc. Even the milder "compromise" versions of Article 13 required online services to block publication of anything they'd been informed about, with dire penalties for failing to honour a claim, and no penalties for bogus claims.
But even as filters block things that aren't copyright infringement, they still allow dedicated infringers to operate with few hindrances. That's because filters use relatively simple, static techniques to inspect user uploads, and
infringers can probe the filters' blind-spots for free, trying different techniques until they hit on ways to get around them. For example, some image filters can be bypassed by
flipping the picture from left to right, or rendering it in black-and-white instead of color. Filters are "black boxes" that can be repeatedly tested by dedicated infringers to see what gets through.
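A toy sketch can show why the flip trick defeats the simplest filters. The code below is a hypothetical illustration, not any real filter's algorithm: it fingerprints a tiny "image" by hashing its exact pixel bytes, so mirroring the picture changes the fingerprint completely and an exact-match check no longer recognises it.

```python
import hashlib

# Toy 3x3 greyscale "image" as rows of pixel values (an illustration,
# not a real image format or any actual filter's fingerprinting scheme).
image = [
    [10, 20, 30],
    [40, 50, 60],
    [70, 80, 90],
]

def fingerprint(img):
    """Exact-match fingerprint: a hash of the raw pixel bytes."""
    data = bytes(px for row in img for px in row)
    return hashlib.sha256(data).hexdigest()

def flip_horizontal(img):
    """Mirror each row left-to-right."""
    return [list(reversed(row)) for row in img]

original = fingerprint(image)
flipped = fingerprint(flip_horizontal(image))

# The mirrored copy is visually the "same" picture to a human,
# but its exact fingerprint no longer matches:
print(original == flipped)  # False
```

Real systems use perceptual hashes that tolerate some transformations, but the cat-and-mouse dynamic the text describes remains: every gain in robustness against infringers also widens the scope for false positives against legitimate uploads.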
For non-infringers -- the dolphins caught in copyright's tuna-nets -- there is no underground of tipsters who will share defeat-techniques to help get your content unstuck. If you're an AIDS researcher whose videos
have been falsely claimed by AIDS deniers in order to censor them, or police brutality activists whose bodycam videos have been blocked by police departments looking to evade criticism, you are already operating at the limit of your
abilities, just pursuing your own cause. You can try to become a filter-busting expert in addition to your research, activism, or communications, but there are only so many hours in a day, and the overlap between people with something to say and
people who can figure out how to evade overzealous (or corrupted) copyright filters just isn't very large.
All of this put filters into such bad odor that mention of them was purged from Article 13, but despite the obfuscation, it was clear that Article 13's purpose was to mandate filters: there's just no way to imagine that every tweet, Facebook update, message-board comment, social media photo, and other piece of user-generated content could
be evaluated for copyright compliance without an automated system. And once you make online forums liable for their users' infringement, they have to find some way to evaluate everything their users post.
Just Because Artists Support Media Companies, It Doesn't Mean Media Companies Support Artists
Spending hundreds of millions of euros to build filters that don't stop infringers but do improperly censor legitimate materials (whether due to malice, incompetence, or sloppiness) will not put any money in artists' pockets.
Which is not to say that these won't tilt the balance towards media companies (at least for a while). Because filters will always fail at least some of the time, and because Article 13 doesn't exempt companies from liability when this happens,
Big Tech will have to come to some kind of accommodation with the biggest media companies -- Get Out Of Jail cards, along with back-channels that media companies can use to get their own material unstuck when it is mistakenly blocked by a filter.
(It's amazing how often one part of a large media conglomerate will take down its own content, uploaded by another part of the same sprawling giant.)
But it's pretty naive to imagine that transferring money from Big Tech to Big Content will enrich artists. Indeed, since there's no way that smaller European tech companies can afford to comply with Article 13, artists will have no alternative
but to sign up with the major media companies, even if they don't like the deal they're offered.
Smaller companies play an important role today in the EU tech ecosystem. There are national alternatives to Instagram, Google, and Facebook that outperform U.S. Big Tech in their countries of origin. These will not survive contact with Article
13. Article 13's tiny exemptions for smaller tech companies were always mere ornaments, and the latest version renders them useless.
Smaller media companies -- often run by independent artists to market their own creations, or those of a few friends -- will likewise find themselves without a seat at the table with Big Tech, whose focus will be entirely on keeping the media
giants from using Article 13's provisions to put them out of business altogether.
Meanwhile, "filters for everything" will be a bonanza for fraudsters and crooks who prey on artists. Article 13 will force these systems to err on the side of over-blocking potential copyright violations, and that's
a godsend for blackmailers, who can use bogus copyright claims to shut down artists' feeds, and demand money to rescind the claims. In theory, artists victimised in this way can try to get the platforms to recognise the scam, but without
the shelter of a big media company with its back-channels into the big tech companies, these artists will have to get in line behind millions of other people who have been unjustly filtered to plead their case.
If You Think Big Tech Is Bad Now...
In the short term, Article 13 tilts the field toward media companies, but that advantage will quickly evaporate.
Without the need to buy or crush upstart competitors in Europe, the American tech giants will only grow bigger and harder to tame. Even the aggressive antitrust work of the European Commission will do little to encourage competition if competing
against Big Tech requires hundreds of millions for copyright compliance as part of doing business -- costs that Big Tech never had to bear while it was growing, and that would have crushed the tech companies before they could grow.
Ten years after Article 13 passes, Big Tech will be bigger than ever and more crucial to the operation of media companies. The Big Tech companies will not treat this power as a public trust to be equitably managed for all: they will treat it as a
commercial advantage to be exploited in every imaginable way. When the day comes that FIFA or Universal or Sky needs Google or Facebook or Apple much more than the tech companies need the media companies, the tech companies will squeeze,
and squeeze, and squeeze.
This will, of course, harm the media companies' bottom line. But you know who else it will hurt? Artists.
Because media giants, like other companies who have a buyer's market for their raw materials -- that is, art and other creative works -- do not share their windfalls with their suppliers, but they absolutely expect their suppliers to share their losses.
When media companies starve, they take artists with them. When artists have no other option, the media companies squeeze them even harder.
What Is To Be Done?
Neither media giants nor tech giants have artists' interests at heart.
Both kinds of company are full of people who care about artists, but institutionally, they act for their shareholders, and every cent they give to an artist is a cent they can't return to those investors.
One important check on this dynamic is competition. Antitrust regulators have many tools at their disposal, and those tools have been largely idle for more than a generation. Companies have been allowed to grow by merger, or by acquiring nascent
competitors, leaving artists with fewer media companies and fewer tech companies, which means more chokepoints where they are shaken down for their share of the money from their work.
Another important mechanism could be genuine copyright reform, such as re-organizing the existing regulatory framework for copyright, or encouraging new revenue-sharing schemes such as voluntary blanket licenses, which could allow artists to opt
into a pool of copyrights in exchange for royalties.
Any such scheme must be designed to fight historic forms of corruption, such as collecting societies that unfairly share out license payments, or media companies that claim those payments for themselves. That's the sort of future-proof reform that the Copyright
Directive could have explored, before it got hijacked by vested interests.
In the absence of these policies, we may end up enriching the media companies, but not the artists whose works they sell. In an unfair marketplace, simply handing more copyrights to artists is like giving your bullied kid extra lunch-money: the
bullies will just take the extra money, too, and your kid will still go hungry.
Artists Should Be On the Side of Free Expression
It's easy to focus on media and art when thinking about Article 13, but that's not where its primary effect will be felt.
The platforms that Article 13 targets aren't primarily entertainment systems: they are used for everything, from romance to family life, employment to entertainment, health to leisure, politics and civics, and more besides.
Copyright filters will impact all of these activities, because they will all face the same problems of false-positives, censorship, fraud and more.
The arts have always championed free expression for all, not just for artists. Big Tech and Big Media already exert enormous control over our public and civic lives. Dialing that control up is bad for all of us, not just those of us in the arts.
Artists and audiences share an interest in promoting the fortunes of artists: people don't buy books or music or movies because they want to support media companies, they do it to support creators. As always, the right side for artists to be on
is the side of the public: the side of free expression, without corporate gatekeepers of any kind.
The Parents Television Council has reported that the FCC is required to review the TV content ratings system and report on the effectiveness of the system within 90 days, as per the Appropriations Bill of 2019. Specifically, the Conference
Committee Report says:
Oversight Monitoring and Rating System.-In lieu of Senate report language on oversight monitoring and rating system, the FCC is directed to report to the Committees on Appropriations of the House and Senate within 90 days of enactment of this
Act on the extent to which the rating system matches the video content that is being shown and the ability of the TV Parental Guidelines Oversight Monitoring Board to address public concerns.
PTC President Tim Winter said:
Finally, after more than 20 years, Congress is addressing the needs of families and the welfare of children by formally calling for the first-ever regulatory review of the TV Content Ratings System and its ostensible oversight. We are elated
that this important legislative wording was adopted as part of the appropriations bill that funds the federal government for this fiscal year.
A cartoon of Serena Williams published in an Australian newspaper last year did not breach media standards, a press censor says.
The cartoon depicted Williams jumping above a broken racquet next to a baby's dummy in the US Open final which Williams lost to Naomi Osaka in September. During the match she aggressively accused the umpire of sexism and being a thief.
Critics claimed that the caricature used racist and sexist stereotypes of African-American people.
The Australian Press Council noted that some had found the image offensive, but accepted the publisher's defence. It added that there was sufficient public interest in the newspaper commenting on behaviour and sportsmanship.
The cartoon went viral in September. The National Association of Black Journalists in the US denounced it as repugnant on many levels. Public complaints centred around the portrayal of Williams with large lips, a broad flat nose... and [being]
positioned in an ape-like pose.
Perennial Hindu whinger Rajan Zed is urging Salem (Virginia) based Olde Salem Brewing Company to apologize and withdraw its Hanuman (Spanish Milk Stout) beer, calling it highly inappropriate. Zed claimed that inappropriate usage of Hindu deities or concepts or symbols for commercial or other agendas was not okay as it hurt the devotees.
Zed, who is president of Universal Society of Hinduism, indicated that Lord Hanuman was highly revered in Hinduism and was meant to be worshipped in temples or home shrines and not to be used in selling beer for mercantile intent. Moreover,
linking Lord Hanuman with an alcoholic beverage was very disrespectful.
Brewery owner Sean Turk, in a Company statement emailed today to Rajan Zed, wrote:
When naming our Spanish milk stout Hanuman we were unaware of the Hindu deity referenced by Rajan Zed. This name was purely a musical reference and had no other intent. We are reviewing options to address the situation. We apologize if this inadvertent association has offended anyone in any way.
Social media companies face criminal sanctions for failing to protect children from online harms, according to drafts of the Government's White Paper circulating in Whitehall.
Civil servants are proposing a new corporate offence as an option in the White Paper plans for a tough new censor with the power to force social media firms to take down illegal content and to police legal but harmful material.
They see criminal sanctions as desirable and as an important part of a regulatory regime, said one source who added that there's a recognition particularly on the Home Office side that this needs to be a regulator with teeth. The main issue they
need to satisfy ministers on is extra-territoriality, that is can you apply this to non-UK companies like Facebook and YouTube? The belief is that you can.
The White Paper, which is due to be published mid-March followed by a summer consultation, is not expected to lay out as definitive a plan as previously thought. A decision on whether to create a brand new censor or use Ofcom is expected to be left
open. A Whitehall source said:
Criminal sanctions are going to be put into the White Paper as an option. We are not necessarily saying we are going to do it but these are things that are open to us. They will be allied to a system of fines amounting to 4% of global turnover
or Euros 20m, whichever is higher.
Government minister Jeremy Wright told the Telegraph this week he was especially focused on ensuring that technology companies enforce minimum age standards. He also indicated the Government would fulfill a manifesto commitment to a levy on social media firms, which could fund the new censor.
Unplanned is a 2019 USA drama by Chuck Konzelman and Cary Solomon.
Starring Ashley Bratcher, Brooks Ryan and Robia Scott.
As one of the youngest Planned Parenthood clinic directors in the nation, Abby Johnson was involved in upwards of 22,000 abortions and counseled countless women on their reproductive choices. Her passion surrounding a woman's right to choose led
her to become a spokesperson for Planned Parenthood, fighting to enact legislation for the cause she so deeply believed in. Until the day she saw something that changed everything.
One of Hollywood's biggest faith-based film studios has found itself in an unlikely battle with the movie industry's ratings board.
Pure Flix, the Christian-aimed studio, was recently informed that next month's anti-abortion drama Unplanned would receive an R rating for some disturbing/bloody images, the first in the studio's history. That could make it
a tough sell for the company's traditional family-friendly audience.
The MPAA R rating means that theatres will not admit anyone under 17 unless they're accompanied by a parent or guardian. The film received the rating due to a series of graphic abortion scenes.
According to the Hollywood Reporter, the MPAA contends the R rating for Unplanned wasn't political, and was instead assigned because of some disturbing/bloody images. Pure Flix, which doesn't plan on appealing the decision, was clearly frustrated
by the rating, as they were expecting a more on-brand PG-13. Pure Flix executive Ken Rather told the Reporter:
A 15-year old girl can get an abortion without her parent's permission but she can't see this movie without adult supervision?
Internet Watch Foundation's (IWF) CEO, Susie Hargreaves OBE, puts forward a voice of reason by urging politicians and policy makers to take a balanced approach to internet regulation which avoids a heavy cost to the victims of child sexual abuse.
IWF has set out its views on internet regulation ahead of the publication of the Government's Online Harms White Paper. It suggests that traditional approaches to regulation cannot apply to the internet and that human rights should play a big
role in any regulatory approach.
The IWF, as part of the UK Safer Internet Centre, supports the Government's ambition to make the UK the safest place in the world to go online, and the best place to start a digital business.
IWF has a world-leading reputation in identifying and removing child sexual abuse images and videos from the internet. It takes a co-regulatory approach to combating child sexual abuse images and videos by working in partnership with the internet
industry, law enforcement and governments around the world. It offers a suite of tools and services to the online industry to keep their networks safer. In the past 22 years, the internet watchdog has assessed -- with human eyes -- more than 1 million reports.
Ms Hargreaves said:
Tackling criminal child sexual abuse material requires a global multi-stakeholder effort. We'll use our 22 years' experience in this area to help the government and policy makers to shape a regulatory framework which is sustainable and puts
victims at its heart. In order to do this, any regulation in this area should be developed with industry and other key stakeholders rather than imposed on them.
We recommend an outcomes-based approach where the outcomes are clearly defined and the government should provide clarity over the results it seeks in dealing with any harm. There also needs to be a process to monitor this and for any results to
be transparently communicated.
But, warns Ms Hargreaves, any solutions should be tested with users including understanding impacts on victims: "The UK already leads the world at tackling online child sexual abuse images and videos but there is definitely more that can be
done, particularly in relation to tackling grooming and livestreaming, and of course, regulating harmful content is important.
My worries, however, are about rushing into knee-jerk regulation which creates perverse incentives or unintended consequences for victims and could undo all the successful work accomplished to date. Ultimately, we must avoid a heavy cost to victims of online sexual abuse."
Index on Censorship welcomes a report by the House of Commons Digital, Culture, Media and Sport select committee into disinformation and fake news that calls for greater transparency on social media companies' decision making processes, on
who posts political advertising and on use of personal data. However, we remain concerned about attempts by government to establish systems that would regulate harmful content online given there remains no agreed definition of harm in this
context beyond those which are already illegal.
Despite a number of reports, including the government's Internet Safety Strategy green paper, that have examined the issue over the past year, none have yet been able to come up with a definition of harmful content that goes beyond definitions of
speech and expression that are already illegal. DCMS recognises this in its report when it quotes the Secretary of State Jeremy Wright discussing the difficulties surrounding the definition. Despite acknowledging this, the report's authors
nevertheless expect technical experts to be able to set out what constitutes harmful content that will be overseen by an independent regulator.
International experience shows that in practice it is extremely difficult to define harmful content in such a way that would target only bad speech. Last year, for example, activists in Vietnam wrote an open letter to Facebook complaining that
Facebook's system of automatically pulling content if enough people complained could silence human rights activists and citizen journalists in Vietnam, while Facebook has shut down the livestreams of people in the United States using the
platform as a tool to document their experiences of police violence.
Index on Censorship chief executive Jodie Ginsberg said:
It is vital that any new system created for regulating social media protects freedom of expression, rather than introducing new restrictions on speech by the back door. We already have laws to deal with harassment, incitement to violence, and
incitement to hatred. Even well-intentioned laws meant to tackle hateful views online often end up hurting the minority groups they are meant to protect, stifle public debate, and limit the public's ability to hold the powerful to account.
The select committee report provides the example of Germany as a country that has legislated against harmful content on tech platforms. However, it fails to mention that the German Network Enforcement Act legislated only on content that was already considered illegal, or the widespread criticism of the law from, among others, the UN rapporteur on freedom of expression and groups such as Human Rights Watch. It also cites the fact that one in six of Facebook's moderators now works in Germany as practical evidence that legislation can work. Ginsberg said:
The existence of more moderators is not evidence that the laws work. Evidence would be if more harmful content had been removed and if lawful speech flourished. Given that there is no effective mechanism for challenging decisions made by
operators, it is impossible to tell how much lawful content is being removed in Germany. But the fact that Russia, Singapore and the Philippines have all cited the German law as a positive example of ways to restrict content online should give pause.
Index has reported on various examples of the German law being applied incorrectly, including the removal of a tweet of journalist Martin Eimermacher criticising the double standards of tabloid newspaper Bild Zeitung and the blocking of the
Twitter account of German satirical magazine Titanic. The Association of German Journalists (DJV) has said the Twitter move amounted to censorship, adding it had warned of this danger when the German law was drawn up.
Index is also concerned about the continued calls for tools to distinguish between quality journalism and unreliable sources, most recently in the Cairncross Review. While we recognise that the ability to do this as individuals and through
education is key to democracy, we are worried that a reliance on a labelling system could create false positives, and mean that smaller or newer journalism outfits would find themselves rejected by the system.
Jack Whitehall left a few 2019 Brit Awards viewers 'outraged' at his one-liners on girl-group Little Mix.
The comedian sparked disapproval for his quip about fathers grabbing scatter cushions following the band's performance dressed in hot pink leather-look outfits and matching thigh boots.
On the back of the performance, Whitehall said:
Raunchy! Dads up and down the country awkwardly fumbling for a scatter cushion right now.
Ofcom said it received 38 complaints about the music awards ceremony, 25 of which were made regarding the host's cushion remark. A further eight complaints were made about the general nature of Little Mix's dance routine.
An Ofcom spokesperson told the Standard: We will assess these complaints before deciding whether or not to investigate. This is Ofcom speak for the complaints being already consigned to the waste paper bin.
South Korea will expand its site blocking measures with SNI eavesdropping, so HTTPS sites can be blocked as well. The new measure, which will also affect pirate sites, has generated widespread opposition. While it's more effective than standard
DNS blocking, it's certainly not impossible to circumvent.
When it comes to pirate site blocking, South Korea is one of the most proactive countries in the Asia-Pacific region. Pirate website blocking orders are sanctioned by the Korean Communications Standards Commission (KCSC), which also oversees
other blocking efforts, including those targeted at porn or illegal gambling sites.
While the ISP blockades work well for regular HTTP sites, they are fairly easy to bypass on HTTPS connections, something most sites offer today. For this reason, the Korean authorities are now stepping up their blocking game. This week the
Government announced that it will start eavesdropping on SNI fields, which identify the hostname of the target server. This allows ISPs to see which HTTPS sites users are trying to access, so these can be blocked if they're on the Korean blocklist.
The new measures will apply to 895 foreign websites that are linked to porn, gambling or copyright infringement.
The new blocking policy is meeting quite a bit of
resistance locally. A petition that was launched earlier this week has been signed by over 180,000 people already and this number is growing rapidly.
The petition warns that this type of censorship is limiting freedom of expression. At the same time, however, it notes that people will find ways to bypass the blockades.
SNI eavesdropping and blocking is useless when people use a VPN. In addition, more modern browsers and companies such as Cloudflare increasingly support encrypted SNI (ESNI). This prevents ISPs from snooping on SNI handshakes.
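The mechanics are simple enough to demonstrate. In a standard TLS handshake, the hostname travels in the clear inside the server_name extension of the ClientHello, so an ISP only needs to read one unencrypted field to match a connection against a blocklist. The sketch below, which assumes nothing beyond Python's standard ssl module, generates a real ClientHello in memory (for the illustrative hostname example.com) and then recovers the hostname from the raw bytes exactly as a snooping middlebox would; with ESNI, this field is encrypted and the parse would fail.

```python
import ssl

def extract_sni(client_hello):
    """Pull the plaintext SNI hostname out of a raw TLS ClientHello record."""
    data = client_hello
    i = 5 + 4                       # skip TLS record header (5) + handshake header (4)
    i += 2 + 32                     # client version + 32-byte random
    i += 1 + data[i]                # session id
    i += 2 + int.from_bytes(data[i:i + 2], "big")  # cipher suites
    i += 1 + data[i]                # compression methods
    end = i + 2 + int.from_bytes(data[i:i + 2], "big")
    i += 2
    while i < end:                  # walk the extensions list
        ext_type = int.from_bytes(data[i:i + 2], "big")
        ext_len = int.from_bytes(data[i + 2:i + 4], "big")
        if ext_type == 0:           # extension 0 = server_name (RFC 6066)
            name_len = int.from_bytes(data[i + 7:i + 9], "big")
            return data[i + 9:i + 9 + name_len].decode("ascii")
        i += 4 + ext_len
    return None                     # no SNI found (e.g. encrypted SNI)

# Produce a genuine ClientHello without touching the network, via memory BIOs.
ctx = ssl.create_default_context()
incoming, outgoing = ssl.MemoryBIO(), ssl.MemoryBIO()
tls = ctx.wrap_bio(incoming, outgoing, server_hostname="example.com")
try:
    tls.do_handshake()
except ssl.SSLWantReadError:
    pass                            # expected: there is no server to answer
hello = outgoing.read()             # the raw record an ISP would see on the wire
sni = extract_sni(hello)
print(sni)
```

This is also why the petition's point about circumvention holds: a VPN wraps the whole handshake in another encrypted tunnel, so the ClientHello, SNI field included, never crosses the ISP's wire in readable form.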
Berlin, I Love You is a 2019 German romance by Dianna Agron, Peter Chelsom...
Starring Keira Knightley, Helen Mirren and Luke Wilson.
Latest installment of the Cities of Love series (Paris, je t'aime / New York, I Love You / Rio, Eu Te Amo), this collective feature-film is made of ten stories of romance set in the German capital.
A contribution by the Chinese artist, film-maker and activist Ai Weiwei to a film called Berlin, I Love You was cut by the producers over concerns it could create difficulties for them with the Chinese authorities.
The film is part of a series known as Cities of Love created by Emmanuel Benbihy. The Berlin movie features 11 directors and stars Keira Knightley and Helen Mirren. Ai directed his contribution, which focussed on his relationship
with his son while in detention in China in 2015. It was included in the marketing teaser but did not make it into the finished film.
It was infuriating to find our involvement had been erased, Ai said in a statement on Deutsche Welle television. The reason we were given for the episode's removal was that my political status had made it difficult for the production team to
secure further funding.
Ai said another reason was that the organisers of the Berlin Film Festival told the producers of Berlin, I Love You that the artist's contribution would make it impossible to screen the film at this year's edition of the festival, which ended on 17 February.
Ai said the fact that the next film in the Cities of Love series centres on Shanghai also played a role in the producers' decision to scrap his contribution to Berlin, I Love You. He added:
The situation has got worse. China has become much more powerful and globally plays a major role in politics and economics. At the same time, China is starting to promote its soft power. The effect is clearly being felt in the entertainment industry.
Buckling under pressure from enraged Christians, DC Comics has announced that it's pulled the plug on a planned series called Second Coming , in which Jesus returns as a superhero.
About 233,000 people signed a petition saying:
Can you imagine the media and political uproar if DC Comics was altering and poking fun at the story of Muhammad -- or Buddha?
This blasphemous content should not be tolerated. Jesus Christ is the Son of God. His story should not be ridiculed for the sake of selling comic books.
The plot summary for the first issue, previously slated for March, said:
Witness the return of Jesus Christ, as He is sent on a most holy mission by God to learn what it takes to be the true messiah of mankind by becoming roommates with the world's favorite savior: the all-powerful superhero Sun-Man, the Last Son of Krispex! But when Christ returns to Earth, he's shocked to discover what has become of his gospel -- and now, he aims to set the record straight.
The writers will now offer the series to other publishers.
Google has acknowledged that one of its home alarm products contained a secret microphone. Product specifications for the Nest Guard, an all-in-one alarm, keypad and motion sensor available since 2017, had made no mention of the listening device: the word microphone was only added to the product's specification this month.
Earlier this month, the firm said a software update would make Nest Guard voice-controlled. On Twitter, concerned Nest owners were told the microphone has not been used up to this point.
In response to criticism, Google claimed:
The on-device microphone was never intended to be a secret and should have been listed in the tech specs. That was an error on our part. The microphone has never been on and is only activated when users specifically enable the option.
This is the kind of thing that makes me paranoid of smart home devices, commented Nick Heer, who writes the Pixel Envy blog.
If I owned one of these things and found out that the world's biggest advertising company hid a microphone in my home for a year, I'd be livid.
Thankfully, Europeans aren't taking this lying down. With the final vote expected to come during the March 25-28 session, mere weeks before European elections, European activists are pouring the pressure onto their Members of the European
Parliament (MEPs), letting them know that their vote on this dreadful mess will be on everyone's mind during the election campaigns.
The epicenter of the uprising is Germany, which is only fitting, given that German MEP Axel Voss is almost singlehandedly responsible for poisoning the Directive with rules that will lead to mass surveillance and mass censorship, not to mention
undermining much of Europe's tech sector.
The German Consumer Association was swift to condemn the Directive, stating: "The reform of copyright law in this form does not benefit anyone, let alone consumers. It now falls to MEPs to act. Since the outcome of the trilogue falls short of the EU Parliament's positions at key points, they should refuse to give their consent."
A viral video of Axel Voss being confronted by activists has been picked up by politicians campaigning against Voss's Christian Democratic Party in the upcoming elections, and has spread to Germany's top TV personalities, like Jan Böhmermann.
YouTube is continuing to take down drill music videos at the request of London police. The Metropolitan Police has continually argued that the underground rap genre is partly responsible for violence, linking a spate of knife attacks to violent lyrics.
As of last month, the police had requested the removal of 129 videos, of which the music sharing platform deleted 102. This purge has escalated since May last year at which point the Press Association reported that police had requested 50 to 60
videos be removed over the course of two years and Youtube, in response, deleted 30. Some of the videos that were removed later resurfaced on Pornhub.
Mike West heads a London police unit that has compiled a database of around 1,900 drill videos which, he told the Press Association, generate purely a violent retaliatory response.
Last month police closed a landmark case against Skengdo and AM, two of the biggest names in the UK drill scene. The duo pled guilty to breaching a gang injunction by performing their song Attempted 1.0 during a sold out concert at Koko, London.
They received a suspended nine-month jail sentence, making it the first time in British history that an artist has been sentenced to prison for performing a song.
India's Central Board of Film Certification (CBFC) has banned a total of 793 films in 16 years.
The information was revealed in response to a request filed by Lucknow-based activist Dr Nutan Thakur. It said that between January 1, 2000 and March 31, 2016, the censor board banned 793 films from public exhibition. These comprise 586 Indian films and 207 foreign films.
The Digital, Culture, Media and Sport Committee has published its final report on Disinformation and 'fake news'. The report calls for:
Compulsory Code of Ethics for tech companies overseen by independent regulator
Regulator given powers to launch legal action against companies breaching code
Government to reform current electoral communications laws and rules on overseas involvement in UK elections
Social media companies obliged to take down known sources of harmful content, including proven sources of disinformation
Further finds that:
Electoral law 'not fit for purpose'
Facebook intentionally and knowingly violated both data privacy and anti-competition laws
Damian Collins MP, Chair of the DCMS Committee said:
"Our inquiry over the last year has identified three big threats to our society. The challenge for the year ahead is to start to fix them; we cannot delay any longer.
"Democracy is at risk from the malicious and relentless targeting of citizens with disinformation and personalised 'dark adverts' from unidentifiable sources, delivered through the major social media platforms we use everyday. Much of this
is directed from agencies working in foreign countries, including Russia.
"The big tech companies are failing in the duty of care they owe to their users to act against harmful content, and to respect their data privacy rights.
"Companies like Facebook exercise massive market power which enables them to make money by bullying the smaller technology companies and developers who rely on this platform to reach their customers.
"These are issues that the major tech companies are well aware of, yet continually fail to address. The guiding principle of the 'move fast and break things' culture often seems to be that it is better to apologise than ask permission.
"We need a radical shift in the balance of power between the platforms and the people. The age of inadequate self regulation must come to an end. The rights of the citizen need to be established in statute, by requiring the tech companies
to adhere to a code of conduct written into law by Parliament, and overseen by an independent regulator.
"We also have to accept that our electoral regulations are hopelessly out of date for the internet age. We need reform so that the same principles of transparency of political communications apply online, just as they do in the real world.
More needs to be done to require major donors to clearly establish the source of their funds.
"Much of the evidence we have scrutinised during our inquiry has focused on the business practices of Facebook; before, during and after the Cambridge Analytica data breach scandal.
"We believe that in its evidence to the Committee Facebook has often deliberately sought to frustrate our work, by giving incomplete, disingenuous and at times misleading answers to our questions.
"Even if Mark Zuckerberg doesn't believe he is accountable to the UK Parliament, he is to the billions of Facebook users across the world. Evidence uncovered by my Committee shows he still has questions to answer yet he's continued to duck
them, refusing to respond to our invitations directly or sending representatives who don't have the right information. Mark Zuckerberg continually fails to show the levels of leadership and personal responsibility that should be expected from
someone who sits at the top of one of the world's biggest companies.
"We also repeat our call to the Government to make a statement about how many investigations are currently being carried out into Russian interference in UK politics. We want to find out what was the impact of disinformation and voter
manipulation on past elections including the UK Referendum in 2016 and are calling on the Government to launch an independent investigation."
This Final Report on Disinformation and 'Fake News' repeats a number of recommendations from the interim report published last summer. The Committee calls for the Government to reconsider a number of recommendations to which it did not respond
and to include concrete proposals for action in its forthcoming White Paper on online harms.
Independent regulation of social media companies.
The Report repeats a recommendation from the Interim Report for clear legal liabilities to be established for tech companies to act against harmful or illegal content on their sites, and the report calls for a compulsory Code of Ethics defining
what constitutes harmful content. An independent regulator should be responsible for monitoring tech companies, backed by statutory powers to launch legal action against companies in breach of the code.
Companies failing obligations on harmful or illegal content would face hefty fines. MPs conclude: "Social media companies cannot hide behind the claim of being merely a 'platform' and maintain that they have no responsibility themselves in
regulating the content of their sites."
The Report's recommendation chimes with recent statements by Ministers indicating the Government is prepared to regulate social media companies following the death of teenager Molly Russell. The Committee hopes to see firm recommendations for
legislation in the White Paper to create a regulatory system for online content that is as effective as that for offline content.
It repeats its recommendation for new independent regulation to be funded by a levy on tech companies operating in the UK.
Data use and data targeting
The Report highlights Facebook documents obtained by the Committee and published in December 2018 relating to a Californian court case brought by app developer Six4Three. Through scrutiny of internal Facebook emails between 2011 and 2015, the
Report finds evidence to indicate that the company was willing to: override its users' privacy settings in order to transfer data to some app developers; to charge high prices in advertising to some developers, for the exchange of data, and
starve some developers--such as Six4Three--of that data, contributing to them losing their business. MPs conclude: "It is evident that Facebook intentionally and knowingly violated both data privacy and anti-competition laws."
It recommends that the ICO carries out a detailed investigation into the practices of the Facebook platform, its use of users' and users' friends' data, and the use of 'reciprocity' of the sharing of data. The CMA (Competition and Markets
Authority) should conduct a comprehensive audit of the advertising market on social media and investigate whether Facebook has been involved in anti-competitive practices.
MPs note that Facebook, in particular, is unwilling to be accountable to regulators around the world: "By choosing not to appear before the Committee and by choosing not to respond personally to any of our invitations, Mark Zuckerberg has
shown contempt towards both our Committee and the 'International Grand Committee' involving members from nine legislators from around the world."
Researchers at the Oxford Internet Institute, University of Oxford, have found no relationship between aggressive behaviour in teenagers and the amount of time spent playing violent video games. The study used nationally representative data
from British teens and their parents alongside official E.U. and US ratings of game violence. The findings were published in Royal Society Open Science.
The idea that violent video games drive real-world aggression is a popular one, but it hasn't tested very well over time, says lead researcher Professor Andrew Przybylski, Director of Research at the Oxford Internet Institute. Despite interest in
the topic by parents and policy-makers, the research has not demonstrated that there is cause for concern.
The study is one of the most definitive to date, using a combination of subjective and objective data to measure teen aggression and violence in games. Unlike previous research on the topic, which relied heavily on self-reported data from
teenagers, the study used information from parents and carers to judge the level of aggressive behaviour in their children. Additionally, the content of the video games was classified using the official Pan European Game Information (EU) and
Entertainment Software Rating Board (US) rating systems, rather than relying only on players' perceptions of the amount of violence in the game.
Our findings suggest that researcher biases might have influenced previous studies on this topic, and have distorted our understanding of the effects of video games, says co-author Dr Netta Weinstein from Cardiff University. An important step taken in this study was preregistration, where the researchers publicly registered their hypothesis, methods and analysis technique prior to beginning the research.
Part of the problem in technology research is that there are many ways to analyse the same data, which will produce different results. A cherry-picked result can add undue weight to the moral panic surrounding video games. The registered study approach is a safeguard against this, says Przybylski.
While no correlation was found between playing video games and aggressive behaviour in teenagers, the researchers emphasize that this does not mean that some mechanics and situations in gaming do not provoke angry feelings or reactions in
players. Anecdotally, you do see things such as trash-talking, competitiveness and trolling in gaming communities that could qualify as antisocial behaviour, says Przybylski. This would be an interesting avenue for further research.
Researchers should use the registered study approach to investigate other media effects phenomena. There are a lot of ideas out there, like 'social media drives depression' and 'technology addiction lowers quality of life', that simply have no supporting evidence. These topics and others that drive technological anxieties should be studied more rigorously -- society needs solid evidence in order to make appropriate policy decisions.
The data was drawn from a nationally representative sample of British 14- and 15-year-olds and an equal number of their carers (2,008 subjects in total). Teenagers completed questions on their personality and gaming behaviour over the past month, while carers completed questions on their child's recent aggressive behaviours using the widely used Strengths and Difficulties Questionnaire. The violent content in the games played was coded based on their ratings in the official Pan European Game Information (PEGI; EU) and Entertainment Software Rating Board (ESRB; US) rating systems, as well as players' subjective ratings. The findings were derived from a study following the Registered Reports protocol; the study's sampling plan and statistical approach were evaluated before the data were collected. Multiple linear regression modelling tested whether the relations between regular violent video game play (coded by researchers) and adolescents' aggressive and helping behaviours (judged by parents) were positive, negative, linear, or parabolic.
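The linear-versus-parabolic test described above can be sketched as follows. This is an illustrative example using synthetic data with a null effect built in, not the study's actual dataset or analysis code:

```python
import numpy as np

# Illustrative sketch: regress a parent-rated aggression score on
# violent-game play, including a quadratic term to test for a
# parabolic relationship. All data here is synthetic.
rng = np.random.default_rng(0)
n = 1000
play = rng.uniform(0, 10, n)                  # hours of violent-game play (invented)
aggression = 2.0 + rng.normal(0, 1, n)        # null effect, as the study found

# Design matrix: intercept, linear term, quadratic term
X = np.column_stack([np.ones(n), play, play ** 2])
coef, *_ = np.linalg.lstsq(X, aggression, rcond=None)

# With a true null effect, the linear and quadratic coefficients
# should both be close to zero.
print(coef)
```

Preregistration means this exact model and test would be fixed before any data were collected, removing the freedom to try alternative specifications until one "works".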
Lebanon's General Directorate of General Security has censored a caricature of Iran's Supreme Leader Ali Khamenei that was published in the French weekly Courrier International.
The censor covered the caricature with a sticker before allowing the publication to enter Lebanon.
The move has sparked debate on social media, including criticism and questions as to whether the directorate is affiliated with the Shiite group Hezbollah, a close ally of Tehran. There were also questions as to whether such censorship would
apply to other leaders who are caricatured by the French newspaper.
According to a recent analysis, people in Hyderabad have taken an avid interest in viewing porn even though it has been banned. With the Union government banning 827 porn sites across the country, an increase of 75% has been seen in porn viewing.
Hyderabad is among the many cities which have seen an increase in porn viewership. A medical study claimed that the increasing number of divorces can be attributed to the psychological effects of porn addiction.
In a survey published by DocOnline and conducted by city doctors, it was inferred that the obsession with pornography is affecting the sexual health of viewers. Dr Syed Abrar Kareem, a physician, stated that porn gives rise to impractical sexual expectations which, when not met, result in psycho-somatic disorders. Of the 5,000 people chosen for the survey, 3,500 men and 1,500 women confessed to watching porn regularly.
A rise of 31% has been recorded in divorces and break-ups. Allegedly, the doctors have also seen an increase in impotency cases being brought to them due to the extreme involvement in virtual sex.
The Council of Europe is a wider organisation of European countries than the EU and is known best for being the grouping behind the European Court of Human Rights.
The council's Committee of Ministers has issued a statement criticising the algorithmic nature of social media. It calls on member countries to address its concerns. The Committee writes:
- draws attention to the growing threat to the right of human beings to form opinions and take decisions independently of automated systems, which emanates from advanced digital technologies. Attention should be paid particularly to their capacity to use personal and non-personal data to sort and micro-target people, to identify individual vulnerabilities and exploit accurate predictive knowledge, and to reconfigure social environments in order to meet specific goals and vested interests;
- encourages member States to assume their responsibility to address this threat by
a) ensuring that adequate priority attention is paid at senior level to this inter-disciplinary concern that often falls in between established mandates of relevant authorities;
b) considering the need for additional protective frameworks related to data that go beyond current notions of personal data protection and privacy and address the significant impacts of the targeted use of data on societies and on the exercise
of human rights more broadly;
c) initiating, within appropriate institutional frameworks, open-ended, informed and inclusive public debates with a view to providing guidance on where to draw the line between forms of permissible persuasion and unacceptable manipulation. The
latter may take the form of influence that is subliminal, exploits existing vulnerabilities or cognitive biases, and/or encroaches on the independence and authenticity of individual decision-making;
d) taking appropriate and proportionate measures to ensure that effective legal guarantees are in place against such forms of illegitimate interference; and
e) empowering users by promoting critical digital literacy skills and robustly enhancing public awareness of how much data is generated and processed by personal devices, networks, and platforms through algorithmic processes that are trained for data exploitation. Specifically, public awareness should be enhanced of the fact that algorithmic tools are widely used for commercial purposes and, increasingly, for political reasons, as well as for ambitions of anti- or undemocratic power gain, warfare, or to inflict direct harm;
Of course, if one strips away the jargon, the fundamental algorithm is simply to give people more of what they seem to have enjoyed reading. And of course the establishment's preferred algorithm is to give people what the state would like them to read.
It will be an offence to view terrorist material online just once -- and could incur a prison sentence of up to 15 years -- under a new UK law.
The Counter-Terrorism and Border Security Bill has just been granted Royal Assent, updating a previous Act and bringing new powers to law enforcement to tackle terrorism.
But a controversial inclusion was to update the offence of obtaining information likely to be useful to a person committing or preparing an act of terrorism so that it now covers viewing or streaming content online.
Originally, the proposal had been to make it an offence for someone to view material three or more times -- but the three strikes idea has been dropped from the final Act.
The government said that the existing laws didn't capture the nuance in changing methods for distribution and consumption of terrorist content -- and so added a new clause into the 2019 Act making it an offence to view (or otherwise access) any
terrorist material online. This means that, technically, anyone who clicked on a link to such material could be caught by the law.
A musician found guilty of broadcasting grossly offensive anti-Semitic songs has had her conviction upheld.
Alison Chabloz has written many politically incorrect, humorous and insulting songs, often targeted at Jews but also more generally against the PC establishment. The songs have been published on many internet platforms, including YouTube.
In May she was convicted of three charges relating to the songs and was given a suspended jail sentence by magistrates which she appealed against.
A judge at Southwark Crown Court has upheld her conviction ruling the content was particularly repellent. In the songs Chabloz suggested the Holocaust was a bunch of lies and referred to Auschwitz as a theme park.
Chabloz was convicted of two counts of sending an offensive, indecent or menacing message through a public communications network and a third charge relating to a song on YouTube.
She was sentenced to 20 weeks' imprisonment, suspended for two years and banned from social media for 12 months.
During the appeal Adrian Davies, defending, told judge Christopher Hehir: It would be a very, very strong thing to say that a criminal penalty should be imposed on someone for singing in polemical terms about matters on which she feels so strongly.
The case started as a private prosecution by the Campaign Against Anti-Semitism before the Crown Prosecution Service took over. The group's chairman, Gideon Falter, said: This is the first conviction in the UK over Holocaust denial on social media.
Social media giants will face tough new laws to prevent the spread of knife crime, the Home Secretary threatened -- as he spoke of fears for his own children's safety.
Sajid Javid said it was time for a legal crackdown on social media images promoting gang culture, in the same way that child sex abuse images and terrorist propaganda have already been outlawed.
In a warning to online firms, he said:
My message to these companies is we are going to legislate and how far we go depends on what you decide to do now. At the moment we don't have the legislation for these types of [knife crime-related] content.
I have it for terrorist content and child sexual abuse images.
Google is among several firms which have been criticised for hosting content glamorising gang culture. Rappers using its YouTube video platform post so-called drill music videos to boast about the number of people they have stabbed or shot, using
street terms. The platform has taken down dozens of videos by drill artists, after warnings from the Metropolitan Police that they were raising the risk of violence.
The ASA's rule-writing arm, CAP, has published new standards to restrict gambling ads from being seen by under-18s.
This follows a review of the evidence on advertising's impact on under-18s and rulings by the Advertising Standards Authority. The last review was carried out in 2014.
The evidence suggests that exposure to gambling ads that comply with the UK's Advertising Codes is, of itself, unlikely to harm under-18s. Targeted restrictions are still required, however, to address the potential risks associated with
irresponsible advertising. While the advertising rules don't need to change, we have introduced new standards to strengthen how they apply in practice.
The new standards:
prohibit online ads for gambling products being targeted at groups of individuals who are likely to be under 18 based on data about their online interests and browsing behaviour;
extensively list unacceptable types of content, including certain types of animated characters, licensed characters from movies or TV and sportspeople and celebrities that are likely to be of particular appeal to children, and references to
youth culture;
prohibit the use in gambling ads of sportspersons, celebrities or other characters who are or appear to be under 25; and
add to existing guidance on the responsible targeting of ads, covering all media (including social networks and other online platforms).
In particular, the standards provide examples of scenarios to help advertisers understand what they need to do to target ads away from under-18s. For example:
Social media -- gambling operators must use all the tools available to them on a social network platform to prevent targeting their ads at under-18s. This includes both ad targeting facilities provided directly by the platform, based on their users' interests and browsing behaviour, and tools that restrict under-18s' access to marketers' own social media content.
Parts of websites for under-18s -- gambling operators should take particular care to avoid placing their ads on parts of websites of particular appeal to under-18s. For example, a football club's website might have a strongly adult
audience in general, but it would be inappropriate to place gambling ads in pages dedicated to younger supporters.
Social and online gaming -- gambling-like games or games that feature elements of simulated gambling activity are often popular with children and young people. Such games should not be used to promote real-money gambling products. Where
social and online games feature marketing communications for gambling games, they should not be directed at under-18s.
Influencers -- gambling operators should take particular care when identifying influencers to promote their products or brands. They should take into account the influencer's likely appeal and obtain audience data (for instance, the
age-breakdown of a follower or subscriber-base) to ensure that under-18s are not likely to comprise more than 25% of the audience.
Affiliates -- responsibility lies with gambling operators to ensure that affiliates or other third parties acting on their behalf publish or disseminate ads that comply with the advertising rules.
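The influencer guidance above turns on a simple threshold: under-18s should not be likely to make up more than 25% of the audience. A minimal sketch of such a check, using an invented follower age breakdown (this is not an official CAP or ASA tool):

```python
# Hypothetical check of an influencer's audience against the 25% rule
# in the new CAP standards. The follower figures below are invented
# purely for illustration.
def under_18_share(age_breakdown: dict[str, int]) -> float:
    """Return the fraction of the reported audience that is under 18."""
    total = sum(age_breakdown.values())
    return age_breakdown.get("under_18", 0) / total

followers = {"under_18": 30_000, "18_to_24": 50_000, "25_plus": 20_000}
share = under_18_share(followers)
print(f"{share:.0%} under 18 -> {'unsuitable' if share > 0.25 else 'acceptable'}")
# → 30% under 18 -> unsuitable
```

In practice the operator would source the age breakdown from platform audience data, and the judgment is about likelihood rather than a single exact figure.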
Press freedom in Europe is more fragile now than at any time since the end of the Cold War. That is the alarming conclusion of a report launched today by the 12 partner organisations of the Council of Europe platform to promote the protection
of journalism and safety of journalists.
The report, Democracy at Risk, analyses media freedom violations raised to the Platform in 2018. It provides a stark picture of the worsening environment for the media across Europe, in which journalists increasingly face obstruction, hostility
and violence as they investigate and report on behalf of the public.
The 12 Platform partners -- international journalists' and media organisations as well as freedom of expression advocacy groups -- reported 140 serious media freedom violations (alerts) in 32 Council of Europe member states in 2018. This review
of the alerts reveals a picture sharply at odds with the guarantees enshrined in the European Convention on Human Rights. Impunity for crimes against journalists is becoming a new normal. Legal protections for critical, investigative reporting
have been weakened offline and online. The space for the press to hold government authorities and the powerful to account has shrunk.
Last year the murders of Ján Kuciak and his fiancée Martina Kušnírová in Slovakia and Jamal Khashoggi in the Saudi Arabian consulate in Istanbul, Turkey, were among 35 alerts for attacks on journalists' physical safety and integrity. Alerts about
serious threats to journalists' lives doubled and were accompanied by a strong surge in verbal abuse and public stigmatisation of the media and individual journalists in Council of Europe member states.
Urgent actions backed by a determined show of political will by Council of Europe member states are now required to improve the dire conditions for media freedom and to provide reliable protections for journalists in law and practice, the report concludes.
The purpose of the Platform, based on a 2015 agreement between the Council of Europe and the partner organisations, is to prompt an early dialogue with member states and hasten remedies for violations and shortcomings in the protections for free
and independent journalism.
The Platform partners call on states that impose the harshest restrictions on journalists' activities and freedom of expression to:
Restore rule of law safeguards and drop charges against journalists and release them as a step towards restoring a safe and enabling environment for independent media;
Remove extremism and other laws criminalising media and end arbitrary exercise of powers by the executive and regulators; and
Take the necessary steps to put in place a structure of media regulation and ownership which safeguards media plurality and freedom.
In addition, the Platform partners call on all states to reply fully and in good faith to all alerts and to take effective measures in law and practice to remedy the threats to the safety of individual journalists and media.
The Council of Europe Platform was launched in April 2015 to provide information which may serve as a basis for dialogue with member states about possible protective or remedial action. While many member states responded to alerts in 2018, five
states declined to respond to any alerts, including those reporting very serious media freedom violations.
The 12 Platform partners are: the European Federation of Journalists (EFJ), the International Federation of Journalists (IFJ), the Association of European Journalists (AEJ), Article 19, Reporters without Borders (RSF), the Committee to Protect
Journalists (CPJ), Index on Censorship, the International Press Institute (IPI), the International News Safety Institute (INSI), Rory Peck Trust, the European Broadcasting Union (EBU) and PEN International.
Blackface refers to a long-standing rule of political correctness banning white people from pretending to be black people. But now it seems that the pretence underpinning the rule is no longer required, and that any image of a black face is now considered politically incorrect.
A fashion line of shoes associated with Katy Perry has been withdrawn after being accused of using blackface.
The sandals and loafers, designed with a face featuring prominent red lips, are no longer on sale at retailers including Walmart. A spokesperson for the Katy Perry Collection told TMZ: In order to be respectful and sensitive the team is in the process of pulling the shoes.
Perry has released a statement describing the shoes as part of a collection envisioned as a nod to modern art and surrealism. She said:
I was saddened when it was brought to my attention that it was being compared to painful images reminiscent of blackface. Our intention was never to inflict any pain. She said they had been immediately removed from the website for her fashion line.
Christian Concern writes a long article criticising the relaxation of UK obscenity law and concludes:
We need your help to monitor the mainstreaming of sado-masochism and extreme pornography in British society from now on. Christians have a unique calling to shed the light of the Gospel on this problem, and to provide a witness to redemption in
a society that has completely lost its way regarding sexual ethics.
Lords of Chaos is a UK / Sweden thriller by Jonas Åkerlund.
Starring Rory Culkin, Emory Cohen and Sky Ferreira.
A teenager's quest to launch Norwegian Black Metal in Oslo in the 1980s. Members of the Norwegian black metal band perform a series of increasingly shocking publicity stunts leading to a very violent outcome.
It is based on the real-life band Mayhem, and includes scenes of murder - including the brutal killing of a homosexual man - and the burning of churches by satanists.
The latest most controversial film ever has been passed 18 uncut by the BBFC for strong bloody violence, gore, suicide.
According to the Telegraph the BBFC are understood to have been so concerned about the film that it was reviewed at the highest levels and suicide prevention experts were consulted before it was approved for an 18 certificate.
The Telegraph suggests the US film censors at the MPAA were similarly concerned before rating it R for strong brutal violence, disturbing behavior, grisly images, strong sexuality, nudity, and pervasive language.
The BBFC said the film did not glamorise self-harm and that there was no reason to think the film would have a damaging effect on adults who chose to view it - although some might find it distressing.
Church groups have, however, called for it to be banned. Speaking to The Telegraph, Simon Calvert, deputy director of The Christian Institute, said he was surprised the film had not been banned given the recent discussion about self-harm. He said:
In the current climate of concern over self-harm and suicide, you would have thought there might have been more consideration of the risk that vulnerable people might imitate what they see. The distributors ought to be asking themselves if it is worth this risk.
The film is being distributed in the United Kingdom by Arrow Films and will be released in cinemas on 29th March.
The Cairncross Review into the future of the UK news industry has delivered its final report, with recommendations on how to safeguard the future sustainability of the UK press.
Online platforms should have a 'news quality obligation' to improve trust in news they host, overseen by a regulator
Government should explore direct funding for local news and new tax reliefs to support public interest journalism
A new Institute for Public Interest News should focus on the future of local and regional press and oversee a new innovation fund
The independent review, undertaken by Frances Cairncross, was tasked by the Prime Minister in 2018 with investigating the sustainability of the production and distribution of high-quality journalism. It comes as significant changes to technology
and consumer behaviour are posing problems for high-quality journalism, both in the UK and globally.
Cairncross was advised by a panel from the local and national press, digital and physical publishers and advertising. Her recommendations include measures to tackle the uneven balance of power between news publishers and the online platforms that
distribute their content, and to address the growing risks to the future provision of public-interest news.
It also concludes that intervention may be needed to improve people's ability to assess the quality of online news, and to measure their engagement with public interest news. The key recommendations are:
New codes of conduct to rebalance the relationship between publishers and online platforms;
The Competition and Markets Authority to investigate the online advertising market to ensure fair competition;
Online platforms' efforts to improve their users' news experience should be placed under regulatory supervision;
Ofcom should explore the market impact of BBC News, and whether it inappropriately steps into areas better served by commercial news providers;
The BBC should do more to help local publishers and think further about how its news provision can act as a complement to commercial news;
A new independent Institute should be created to ensure the future provision of public interest news;
A new Innovation Fund should be launched, aiming to improve the supply of public interest news;
New forms of tax reliefs to encourage payments for online news content and support local and investigative journalism;
Expanding financial support for local news by extending the BBC's Local Democracy Reporting Service;
Developing a media literacy strategy alongside Ofcom, industry and stakeholders.
The Government will now consider all of the recommendations in more detail. To inform this, the Culture Secretary will write immediately to the Competition and Markets Authority, Ofcom and the Chair of the Charity Commission to open discussions
about how best to take forward the recommendations which fall within their remits. The Government will respond fully to the report later this year.
DCMS Secretary of State Jeremy Wright said:
A healthy democracy needs high quality journalism to thrive and this report sets out the challenges to putting our news media on a stronger and more sustainable footing, in the face of changing technology and rising disinformation. There are
some things we can take action on immediately while others will need further careful consideration with stakeholders on the best way forward.
A Mediatique report, Overview of recent market dynamics in the UK press (April 2018), commissioned by DCMS as part of the Cairncross Review, found:
Print advertising revenues have dropped by more than two-thirds in the ten years to 2017;
Print circulation of national papers fell from 11.5 million daily copies in 2008 to 5.8 million in 2018 and for local papers from 63.4 million weekly in 2007 to 31.4 million weekly in 2017;
Sales of both national and local printed papers fell by roughly half between 2007 and 2017, and are still declining;
The number of full-time frontline journalists in the UK has dropped from an estimated 23,000 in 2007, to just 17,000 today, and the numbers are still declining.
A report, Online Advertising in the UK by Plum Consulting, commissioned by DCMS as part of the Cairncross Review (and available as an annex to the Review), found:
UK internet advertising expenditure increased from £3.5 billion in 2008 to £11.5 billion in 2017, a compound annual growth rate of 14%.
Publishers rely on display advertising for their revenue online - which in the last decade has transformed into a complex, automated system known as programmatic advertising.
An estimated average of £0.62 of every £1 spent on programmatic advertising goes to the publisher - though this can range from £0.43 to £0.72.
Collectively, Facebook and Google were estimated to have accounted for over half (54%) of all UK online advertising revenues in 2017.
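The 14% growth figure quoted in the findings above can be sanity-checked with the standard compound annual growth rate formula; a minimal sketch:

```python
# Sanity check of the quoted compound annual growth rate (CAGR):
# UK internet ad spend rose from £3.5bn (2008) to £11.5bn (2017),
# i.e. over 9 years.
start, end, years = 3.5, 11.5, 2017 - 2008
cagr = (end / start) ** (1 / years) - 1
print(f"CAGR: {cagr:.1%}")
# → CAGR: 14.1%
```

This matches the 14% compound annual growth rate stated in the Plum Consulting report.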
The major online platforms collect multiple first-party datasets from large numbers of logged-in users. Generally, they do not share data with third parties, including publishers.
Dame Frances Cairncross is a former economic journalist, author and academic administrator. She is currently Chair of the Court of Heriot-Watt University and a Trustee at the Natural History Museum. Dame Frances was Rector of Exeter College,
Oxford University; a senior editor on The Economist; and principal economic columnist for the Guardian. In 2014 she was made a Dame of the British Empire for services to education. She is the author of a number of books, including "The Death
of Distance: How the Communications Revolution is Changing our Lives" and "Costing the Earth: The Challenge for Governments, the Opportunities for Business". Dame Frances is married to financial journalist Hamish McRae.
The BBC comments on some of the ideas not included in the report's recommendations
The report falls short of requiring Facebook, Google and other tech giants to pay for the news they distribute via their platforms. Cairncross told the BBC's media editor Amol Rajan that "draconian and risky" measures could result in firms such as Google withdrawing their news services altogether:
There are a number of ways we have suggested technology companies could behave differently and could be made to behave differently. But they are mostly ways that don't immediately involve legislation.
Frances Cairncross earned widespread respect as a journalist for her hard-headed and pragmatic approach to economics. That pragmatism is the very reason the government commissioned her to look at the future of high-quality news - and also the
reason many in local and regional media will be disappointed by her recommendations.
What is most notable about her review is what it doesn't do.
It doesn't suggest all social media should be regulated in the UK
It doesn't suggest social media companies pay for the privilege of using news content
It doesn't suggest social media companies be treated as publishers, with legal liability for all that appears on their platform
This is because the practicalities of doing these things are difficult, and experience shows that the likes of Google will simply pull out of markets that don't suit them.
Ultimately, as this report acknowledges, when it comes to news, convenience is king. The speed, versatility and zero cost of so much news now means that, even if it is of poor quality, a generation of consumers has fallen out of the habit of
paying for news. But quality costs. If quality news has a future, consumers will have to pay. That's the main lesson of this report.
2018 was a pivotal year for data protection. First the Cambridge Analytica scandal put a spotlight on Facebook's questionable privacy practices. Then the new Data Protection Act and the General Data Protection Regulation (GDPR) forced
businesses to better handle personal data.
As these events continue to develop, 2019 is shaping up to be a similarly consequential year for free speech online as new forms of digital censorship assert themselves in the UK and EU.
Of chief concern in the UK are several initiatives within the Government's grand plan to "make Britain the safest place in the world to be online", known as the Digital Charter. Its founding document proclaims "the same rights that
people have offline must be protected online." That sounds a lot like Open Rights Group's mission! What's not to like?
Well, just as surveillance programmes created in the name of national security proved detrimental to privacy rights, new Internet regulations targeting "harmful content" risk curtailing free expression.
The Digital Charter's remit is staggeringly broad. It addresses just about every conceivable evil on the Internet from bullying and hate speech to copyright infringement, child pornography and terrorist propaganda. With so many initiatives
developing simultaneously it can be easy to get lost.
To gain clarity, Open Rights Group published a report surveying the current state of digital censorship in the UK. The report is broken up into two main sections - formal censorship practices like copyright and pornography blocking, and informal
censorship practices including ISP filtering and counter terrorism activity. The report shows how authorities, while often engaging in important work, can be prone to mistakes and unaccountable takedowns that lack independent means of redress.
Over the coming weeks we'll post a series of excerpts from the report covering the following:
Formal censorship practices
Copyright blocking injunctions
BBFC pornography blocking
BBFC requests to "Ancillary Service Providers"
Informal censorship practices
Nominet domain suspensions
The Counter Terrorism Internet Referral Unit (CTIRU)
The Internet Watch Foundation (IWF)
ISP content filtering
The big picture
Take a step back from the many measures encompassed within the Digital Charter and a clear pattern emerges. When it comes to web blocking, the same rules do not apply online as offline. Many powers and practices the government employs to remove
online content would be deemed unacceptable and arbitrary if they were applied to offline publications.
Part II of our report is in the works and will focus on threats to free speech within yet another branch of the Digital Charter known as the Internet Safety Strategy.
Indonesian entertainers have rallied against a draft law that seeks to ban blasphemous and pornographic music content, with critics saying it will be used to clamp down on an already very limited freedom of expression.
More than 100 protestors - many carrying placards or playing guitars and drums - took to the streets of Bogor, near Jakarta, on Sunday to demonstrate against the proposed law.
Under the proposed law, musicians would be prevented from bringing negative influences from foreign cultures and/or degrading human dignity into Indonesia. As well as cracking down on blasphemous and pornographic content, it imposes onerous
new requirements on musicians, such as carrying out competency tests to gain certification.
An online petition calling for the vaguely worded bill to be scrapped has been signed by more than 250,000 people.
The Golden Glove (Der goldene Handschuh) is a 2019 Germany/France crime horror thriller by Fatih Akin.
Starring Marc Hosemann, Jonas Dassler and Adam Bousdoukos.
A serial killer strikes fear in the hearts of residents of Hamburg during the early 1970s.
One of Germany's most acclaimed directors, Fatih Akin, hit back at criticism of his new film about a real-life serial killer, The Golden Glove. Critics claimed that it exploits the female victims.
Akin insisted the ultra-violent new picture aimed to grant dignity to both the killer and the slain women. He commented:
We are living in a time in which the discussion about sexual violence is everywhere and that is justified. But when you make a film about sexual violence, you have to show it.
Akin said he had no desire to glorify violence against women with the film's scenes graphically depicting sexual torture, murder and dismemberment which many viewers said left them feeling queasy. He said for all the heightened sensitivity around
sexual misconduct in the entertainment industry, it should not be used to stifle artistic freedom.
A complaint about 3 Pugs Gin produced by Silverback Distillers has been upheld by the drinks censors of the Portman Group.
The complainant, a member of the public, believed that the product was aimed at an under-18 audience. The complaint was upheld under Code rule 3.2(h), which states that a drink, its packaging and any promotional material should not in any
direct or indirect way have a particular appeal to under-18s.
The Panel considered the overall packaging of the product. They concluded that the use of the descriptor pugalicious, description of the bubblegum flavour on the labelling and the fact that the product was a pink coloured gin were not in
themselves problematic. However, the Panel felt that when these factors were considered alongside the depiction of the dogs as cartoon pugs in a hot air balloon overlooking a Willy Wonka-like sweet land across a pink liquid, then it was likely to
have a particular appeal to under-18s.
A Portman Group spokesperson commented: This decision once again highlights that producers should steer clear of references and imagery related to childhood and childhood memories. They should think carefully about what is conveyed by the overall
impression of the product and speak to our advisory service if in any doubt.
The Russian State Duma is considering multiple bills of law that would further stifle free speech in Russia's already heavily restricted internet environment.
One targets expressions of willful disregard towards the state. Another targets disinformation. All of them echo increasingly global concerns among governments about the political implications of disinformation -- and unbridled criticism -- on
the internet. And all have been heavily criticized by Russian civil society groups, experts, users and even the government's own ministers. Yet these bills promoting a possible further crackdown on free speech still trudge on through the Duma.
The first bill, a sovereign internet initiative, which is yet to reach the floor of the lower chamber of Russia's bicameral parliament, seeks to establish state-regulated internet exchange points that would allow for increased monitoring and
control over internet traffic moving into and out of the country.
Under this law, individuals, officials or organizations accused of spreading fake news disguised as genuine public announcements which are found to promote public disorder or other serious disturbances could be fined up to a million rubles
(slightly above USD $15,000), unless they remove the violating content within a day. The bill also provides measures through which Roskomnadzor, Russia's media watchdog, will order ISPs to block websites hosting the offending content.
The bill passed its first reading in late January with flying colours, receiving 336 votes in its favor and only 44 against, thanks to the 2016 landslide which guaranteed the ruling United Russia party an absolute voting majority.
The anti-fake news bill will be reviewed again by the Duma in February, conditioned on the revision of some of its most contentious points. The bill pushed through by Putin's party was met with a rare response of significant opposition, even
among the normally acquiescent branches of Russia's highly centralized and executive-biased power structure. The attorney general's office, among others, criticized the bill's vague definitions as potentially damaging to citizens' civil rights.
The second bill, which came up for review alongside the fake news-busting proposal, is seen as being even more controversial. It seeks to punish vulgar expressions of wilful disregard towards the state, its symbols and organs of its power with
fines of up to 5,000 rubles (around USD $76) and detention for up to 15 days. The bill also passed its first reading on the same day, despite vocal criticism from both government members (Deputy Communications Minister Alexey Volin said that
calmly accepting criticism was an obligation for state officials, adding that they weren't made of sugar) and opposition parties.
The Gambling Commission (UKGC) has released a new set of rules, ensuring that operators implement a new wave of identity checks to make gambling safer and fairer.
Following an open consultation, and to further guard against the risk of children gambling, new rules mean operators must verify customer identity and age before they can either deposit funds into an account or gamble with the licensee, with
either their own money or a free bet or bonus.
Furthermore, the UKGC has clamped down on free-to-play games, stressing that customers must now be age verified to access such versions of gambling games on licensees' websites, emphasising that there is no legitimate reason why they should be
available to children.
Changes are also designed to aid with the detection of criminal activity, whilst operators are reminded that they cannot demand that ID be submitted as a condition of cashing out, if they could have asked for that information earlier.
Finally, better identification of self-excluded players was stressed: effective verification by operators will mean that a customer will not be verified, and therefore unable to gamble, until they provide correct details. These details
will then be checked against both the operator's own self-exclusion database and the verified data held by Gamstop.
Set to come into force on Tuesday 7 May, the new rules also follow a number of complaints to contact centre staff regarding licensees not allowing a customer to withdraw funds until they submit certain forms of ID.
The new rules require remote licensees to:
Verify, as a minimum, the name, address and date of birth of a customer before allowing them to gamble
Ask for any additional verification information promptly
Inform customers, before they can deposit funds, of the types of identity documents or other information that might be required, the circumstances in which the information might be required, and how it should be supplied to the licensee
Take reasonable steps to ensure that information on their customers' identities remains accurate.
Another example of heavy-handed policing has been highlighted by the Daily Mail.
Three police officers were sent to arrest Kate Scottow over critical comments about a trans person on Mumsnet. She was locked up for 7 hours and questioned about her comment referring to a trans person as a man. The angry comments seem to be the
result of an online tiff with a transgender activist who didn't like Scottow's view that men cannot become women.
Police investigations are continuing and the police are still retaining Scottow's laptop and phone seized when she was arrested two months ago. She has also been served with a court order that bans her from referring to her accuser as a man.
The case is the latest where police have been accused of being heavy-handed in dealing with people who go online to debate gender issues. Sitcom writer Graham Linehan was given a verbal harassment warning by West Yorkshire Police after
transgender activist Stephanie Hayden reported him for referring to her by her previous names and pronouns on Twitter.
High Court papers obtained by The Mail on Sunday detail how Scottow is accused of a campaign of targeted harassment against the same activist, Stephanie Hayden, allegedly motivated by her status as a transgender woman.
Harsh words can be said in anger, but this police action surely confirms that holding a politically incorrect opinion is now treated as illegal and can be enforced as if it were a serious crime.
The heavy-handed police action has only managed to transform an online tiff into a serious demonstration that the police have corrupted the law into some sort of blasphemy-like prohibition on politically incorrect debate. Millions of newspaper
readers will now be even more worried that their own words may one day offend someone and get them arrested.
This police action just makes society more divisive and angry.
There is every reason to believe that the government and opposition are moving to a consensus on introducing a duty of care for social media companies to reduce harm and risk to their users. This may be backed by an Internet regulator, who
might decide what kind of mitigating actions are appropriate to address the risks to users on different platforms.
This idea originated from a series of papers by Will Perrin and Lorna Woods and has been taken up most recently in a Science and Technology Committee report and by NGOs including children's charity 5Rights.
A duty of care has some obvious merits: it could be based on objective, evidenced risks and ensure that mitigations are proportionate to those risks. It could take some of the politicisation out of the current debate.
However, it also has obvious problems. For a start, it focuses on risk rather than process. It moves attention away from the fact that interventions are regulating social media users just as much as platforms. It does not by itself tell us that
free expression impacts will be considered, tracked or mitigated.
Furthermore, the lack of focus that a duty of care model gives to process means that platform decisions that have nothing to do with risky content are not necessarily based on better decisions, independent appeals and so on. Rather, as has
happened with German regulation, processes can remain unaffected when they are outside a duty of care.
In practice, a lot of content which is disturbing or offensive is already banned on online platforms. Much of this would not be in scope under a duty of care but it is precisely these kinds of material which users often complain about, when it is
either not removed when they want it gone, or is removed incorrectly. Any model of social media regulation needs to improve these issues, but a duty of care is unlikely to touch these problems.
There are very many questions about the kinds of risk, whether to individuals in general, vulnerable groups, or society at large; and the evidence required to trigger action. The truth is that a duty of care, if cast sensibly and narrowly, will not
satisfy many of the people who are demanding action; equally, if the threshold to act is low, then it will quickly be seen to be a mechanism for wide-scale Internet censorship.
It is also a simple fact that many decisions that platforms make about legal content which is not risky are not the business of government to regulate. This includes decisions about what legal content is promoted and why. For this reason, we
believe that a better approach might be to require independent self-regulation of major platforms across all of their content decisions. This requirement could be a legislative one, but the regulator would need to be independent of government and industry alike.
Independent self-regulation has not been truly tried. Instead, voluntary agreements have filled its place. We should be cautious about moving straight to government regulation of social media and social media users. The government refuses to
regulate the press in this way because it doesn't wish to be seen to be controlling print media. It is pretty curious that neither the media nor the government are spelling out the risks of state regulation of the speech of millions of British citizens.
That we are in this place is of course largely the fault of the social media platforms themselves, who have failed to understand the need and value of transparent and accountable systems to ensure they are acting properly. That, however, just
demonstrates the problem: politically weak platforms who have created monopoly positions based on data silos are now being sliced and diced at the policy table for their wider errors. It's imperative that as these government proposals progress we
keep focus on the simple fact that it is end users whose speech will ultimately be regulated.
The number of people using the internet in Uganda has dropped by 26% since July 2018, when the country's social media tax was put into force. Prior to the tax's implementation, 47.4% of people in Uganda were using the internet. Three months after
the tax was put in place, that number had fallen to 35%.
ISPs charge an additional 200 Ugandan shillings (UGX) in social media tax on top of the ISP access fees and standard sales tax. This is nominally 5.4 US cents but is a significant portion of typical Ugandan incomes.
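To put the burden in perspective, here is a rough back-of-the-envelope calculation. It assumes (as widely reported, though not stated in the figures above) that the 200 UGX levy is charged per day of use, and uses the 5.4-cent conversion quoted here:

```python
# Rough illustration of the annual cost of Uganda's social media tax.
# Assumptions: the 200 UGX levy is charged per day of use, and
# 200 UGX ~= 5.4 US cents as quoted in the article.
ugx_per_day = 200
usd_per_day = 0.054

annual_ugx = ugx_per_day * 365
annual_usd = usd_per_day * 365

print(annual_ugx)            # 73000 UGX per year
print(round(annual_usd, 2))  # 19.71 USD per year
```

Nearly $20 a year is a substantial sum in a country where many people live on a few dollars a day, which helps explain the drop-off in use.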
President Yoweri Museveni and several government officials said this was intended to curb online rumor-mongering and to generate more tax revenue.
The tax was the subject of large-scale public protests in July and August 2018. During one protest against the tax, key opposition leader, activist and musician Bobi Wine noted that the tax was enforced to oppress the young generation.
The government expected to collect about UGX 24 billion in revenue from the tax every quarter. But in the first quarter after the tax's implementation, they collected UGX 20 billion. In the second quarter, ending December 2018, they had collected
only UGX 16 billion.
While some people have gone offline altogether, others are simply finding different and more affordable ways to connect. People are creating shared access points where one device pays the tax and tethers the rest as a WiFi hotspot, or relying on
workplace and public area WiFi networks to access the services.
Other Ugandans are using Virtual Private Network (VPN) applications to bypass the tax. In a statement to The Daily Monitor, the Uganda Revenue Authority's Ian Rumanyika argued that people could not use VPNs forever, but that doesn't seem to be the case.
In addition to leaving Ugandans with less access to communication and diminished abilities to express themselves online, the tax has also affected economic and commercial sectors, where mobile money and online marketing are essential components of daily business.
Arizona has joined several other states in the nonsense claim that pornography is to be considered a public health crisis.
Arizona state Representative Michelle Udall introduced a resolution declaring pornography is a crisis leading to a broad spectrum of individual and public health impacts. The resolution claims pornography perpetuates a sexually toxic environment
that damages all areas of our society.
The resolution passed a committee vote along party lines and now moves to the Arizona House, where Republicans hold a slim majority.
Utah was the first state in the nation to declare pornography a public health crisis in 2016, and similar measures have been passed in 11 other states since.
Apart from the name Donald, and securing a place in hell, both put American corporate interests above European livelihoods.
The Council of the EU approves copyright law that will suffocate European businesses and livelihoods
While Italy, Poland, the Netherlands, Sweden, Finland and Luxembourg maintained their opposition to the text and were newly joined by Malta and Slovakia, Germany's support of the "compromise" secretly negotiated with France over the last weeks has broken the previous deadlock.
This new Council position is actually more extreme than previous versions, requiring all platforms older than 3 years to automatically censor all their users' uploads, and putting unreasonable burdens even on the newest companies.
The German Conservative-Social Democrat government is now in blatant violation of its own coalition agreement, which rejects upload filters against copyright infringement as disproportionate. This breach of coalition promises will not go
down well with many young voters just ahead of the European elections in May. Meanwhile, prominent members of both German government parties have joined the protests against upload filters.
The deal in Council paves the way for a final round of negotiations with the Parliament over the course of next week, before the entire European Parliament and the Council vote on the final agreement. It is now up to you to contact your MEPs,
call their offices in their constituencies and visit as many of their election campaign events as you can! Ask them to reject a copyright deal that will violate your rights to share legal creations like parodies and reviews online, and
includes measures like the link tax that will limit your access to the news and drive small online newspapers out of business.
Right before the European elections, your voices cannot be ignored! Join the over 4.6 million signatories to
the largest European petition ever and tell your representatives: If you break the Internet and accept Article 13, we won't reelect you!
A new bill introduced late last month in the New York State legislature marks the latest attempt to impose a user tax on porn, or for that matter any sexually oriented media. The proposed bill would slap an extra $2 onto every porn download.
The charge would also apply to offline sexually oriented media, adding the two-buck fee to each magazine or DVD classified as sexually oriented. In fact, the language of New York Assembly Bill AO3417 is so broad that it apparently would apply not
only to porn, but even to R-rated movies and TV programs airing on pay cable networks such as HBO or Showtime.
That's because the law as written by Assistant Assembly Speaker Felix W. Ortiz defines sexually oriented as any media that features nude pictures or nude performances. And nude does not even mean completely nude under the bill's wording: breasts
or buttocks are enough.
The language of the bill is also unclear on whether the $2 surcharge would apply to free porn downloads, such as on Pornhub and similar tube sites.
An attempt to block pornography and other obscene material on all personal devices in
South Dakota, then charge users a $20 access fee, was voted down Friday by state lawmakers.
House Bill 1154, written by out-of-state authors, raised serious concerns with lobbyists representing South Dakota retailers and telecommunication companies, who opposed the measure in a meeting of the House Judiciary Committee Friday morning.
Google has agreed to censor search results in Russia as dictated by the country's internet censor. This will allow Google to continue operations in Russia.
Google is one of a few search engines that does not adhere to an official list of banned websites that should not be included in search results. However, Google already deletes 70% of links from its search results to websites that internet censor
Roskomnadzor has banned.
In December of 2018, Roskomnadzor charged Google a fine of 500,000 rubles ($7,590) for refusing to subscribe to the banned list. The company did not challenge the agency's decision and chose to pay the fine. The Russian law that made the fine
possible does not allow Roskomnadzor to block sites that do not comply with its censorship demands, but that did not stop Roskomnadzor from threatening to block Google within Russian borders regardless.
Lawmakers from Pennsylvania have introduced a bill proposing an additional 10% sin tax on M (mature) and AO (adults only) rated video games.
The money would go into a fund called the Digital Protection for School Safety Account that aims to enhance security measures at schools in the wake of the school shootings in Parkland, Florida and Newtown, Connecticut.
State representative Chris Quinn, a Republican, first introduced the bill in 2018 but is trying again in the current session. Explaining the bill last year, Quinn said violent video games might be an element in the rise of school shootings in
America. One factor that may be contributing to the rise in, and intensity of, school violence is the material kids see, and act out, in video games, he said.
The Entertainment Software Association, which lobbies on behalf of the video game industry, argues that the bill is a violation of the US Constitution.
Governments around the world are grappling with the threat of terrorism, but their efforts aimed at curbing the dissemination of terrorist content online all too often result in censorship. Over the past five years, we've seen a number of
governments--from the US Congress to that of France and now the European Commission (EC)--seek to implement measures that place an undue burden on technology companies to remove terrorist speech or face financial liability.
This is why EFF has joined forces with dozens of organizations to call on members of the European Parliament to oppose the EC's proposed regulation, which would require companies to take down terrorist content within one hour. We've added our
voice to two letters -- one from Witness and another organized by the Center for Democracy and Technology -- asking that MEPs consider the serious consequences that the passing of this regulation could have on human rights defenders and on freedom of expression.
We share the concerns of dozens of allies that requiring proactive measures, such as the terrorism hash database (already voluntarily in use by a number of companies), will restrict expression and have a disproportionate impact on
marginalized groups. We know from years of experience that filters just don't work.
Furthermore, the proposed requirement that companies must respond to reports of terrorist speech within an hour is, to put it bluntly, absurd. As the letter organized by Witness states, this regulation essentially forces companies to bypass due
process and make rapid and unaccountable decisions on expression through automated means and furthermore doesn't reflect the realities of how violent groups recruit and share information online.
We echo these and other calls from defenders of human rights and civil liberties for MEPs to reject proactive filtering obligations and to refrain from enacting laws that will have unintended consequences for freedom of expression.
Toy Story 4 is a 2019 USA family animation comedy by Josh Cooley.
Starring Keanu Reeves, Patricia Arquette and Tom Hanks.
When a new toy called "Forky" joins Woody and the gang, a road trip alongside old and new friends reveals how big the world can be for a toy.
Animal rights campaigners PETA have launched an ad campaign this week, demanding that animation studio Pixar edit out a sheep-herding crook from the new Toy Story film, ludicrously claiming that the object promotes animal cruelty.
Activists at People for the Ethical Treatment of Animals (PETA) saw the crook as a betrayal of Pixar's attempt to give the character a tough modern update, claiming that the sheep-herding instrument she carries is still problematic.
Their problem is apparently not that they think the crook itself is a cruel instrument, but the fact that it promotes exploiting gentle sheep for their wool.
It is surely doubtful that most kids will even know what the old-time shepherd's tool is in the first place.
Wrangling in Whitehall has held up plans to set up a social media censor dubbed Ofweb, The Mail on Sunday reveals.
The Government was due to publish a White Paper this winter on censorship of tech giants but the Mail has learnt it is still far from ready. Culture Secretary Jeremy Wright said it would be published within a month, but a Cabinet source said
that timeline was wholly unrealistic. Other senior Government sources went further and said the policy document is unlikely to surface before the Spring.
Key details on how a new censor would work have yet to be decided while funding from the Treasury has not yet been secured. Another problem is that some Ministers believe the proposed clampdown is too draconian and are preparing to try to block
or water down the plan.
There are also concerns that technically difficult requirements would benefit the largest US companies as smaller European companies and start ups would not be able to afford the technology and development required.
The Mail on Sunday understands Jeremy Wright has postponed a visit to Facebook HQ in California to discuss the measures, as key details are still up in the air.
Update: The Conservatives don't have a monopoly on internet censorship...Labour agrees
Labour has called for a new entity capable of taking on the likes of Facebook and Google. Tom Watson, the shadow digital secretary, will on Wednesday say a regulator should also have responsibility for competition policy and be able to refer
cases to the Competition and Markets Authority.
According to Watson, any duty of care would only be effective with penalties that seriously affect companies' bottom lines. He has referred to regulators' ability to fine companies up to 4% of global turnover, or €20m, whichever is higher,
for worst-case breaches of the EU-wide General Data Protection Regulation.
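The "whichever is higher" formula Watson cites can be sketched as a one-line calculation (a hedged illustration; the function name is made up here, and the figures are simply the GDPR's stated worst-case tier):

```python
def max_gdpr_fine_eur(global_turnover_eur: float) -> float:
    # Worst-case GDPR tier: 4% of global annual turnover,
    # or EUR 20 million, whichever is higher.
    return max(0.04 * global_turnover_eur, 20_000_000)

# For a large platform, the percentage dominates:
print(max_gdpr_fine_eur(1_000_000_000))  # 40000000.0
# For a smaller firm, the EUR 20m floor applies:
print(max_gdpr_fine_eur(100_000_000))    # 20000000.0
```

The floor means even a modestly sized company faces the same minimum exposure as a giant, which is why Watson holds it up as a model for social media penalties.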
Two men who breached an injunction banning them from making drill music have been given suspended jail sentences of nine months each.
The ruling comes as Scotland Yard continues its controversial crackdown on the rap genre, a strategy which has attracted significant criticism from drill fans.
The Metropolitan Police has repeatedly blamed the music genre for rising knife crime in London and has launched a wide-ranging crackdown on drill music videos. Detective Inspector Luke Williams of Lambeth and Southwark Gangs Unit said:
I am pleased with the sentences passed in these cases which reflect that the police and courts are unwilling to accept behaviour leading to serious violence.
Contrary to some reports, Article 13 was not shelved solely because EU governments listened to the unprecedented
public opposition and understood that upload filters are costly,
error-prone and threaten fundamental rights.
Without doubt, the consistent public opposition contributed to 11 member state governments voting against the mandate, instead of just 6 last year, but ultimately the reform hinges on agreement between France and Germany, who due to their size
can make or break blocking minorities. The deadlock is the direct result of their disagreement, which was not about whether to have upload filters at all; they just couldn't agree on exactly who should be forced to install those faulty filters.
The deadlock hinged on a disagreement between France and Germany
France's position: Article 13 is great and must apply to all platforms, regardless of size. They must demonstrate that they have done all they possibly could to prevent uploads of copyrighted material. In the case of small businesses, that may
or may not mean using upload filters -- ultimately, a court would have to make that call. (This was previously the
majority position among EU governments, supported by France, before Italy's newly elected government retracted their support for Article 13 altogether.)
Germany's position: Article 13 is great, but it should not apply to everyone. Companies with a turnover below €20 million per year should be excluded outright, so as not to harm European internet startups and SMEs. (This was closer to the
European Parliament's current position, which calls for the exclusion of companies with a turnover below €10 million and fewer than 50 employees.)
What brought France and Germany together: making Article 13 even worse
In the Franco-German deal, which leaked today, Article 13 does apply to all for-profit platforms. Upload filters must be installed by everyone except those services which fit all three of the following extremely narrow criteria:
Available to the public for less than 3 years
Annual turnover below €10 million
Fewer than 5 million unique monthly users
Countless apps and sites that do not meet all these criteria would need to install upload filters, burdening their users and operators, even when copyright infringement is not at all currently a problem for them. Some examples:
Discussion boards on for-profit sites, such as the Ars Technica or Heise.de forums (older than 3 years)
Patreon, a platform with the sole purpose of helping authors get paid (fails to meet any of the three criteria)
Niche social networks like GetReeled, a platform for anglers (well below 5 million users, but older than 3 years)
Small European competitors to larger US brands like wykop, a Polish news sharing platform similar to reddit (well below €10 million turnover, but may be above 5 million users depending on the calculation method)
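The exemption is a strict conjunction: fail any one criterion and the filter obligation applies. A sketch of the rule (field names are hypothetical; the leaked text is legal prose, not code):

```python
from dataclasses import dataclass

@dataclass
class Platform:
    years_available: float       # time available to the public, in years
    annual_turnover_eur: float
    monthly_users: int

def exempt_from_upload_filters(p: Platform) -> bool:
    # Under the leaked Franco-German deal, a service escapes the
    # upload-filter obligation only if it meets ALL THREE criteria.
    return (p.years_available < 3
            and p.annual_turnover_eur < 10_000_000
            and p.monthly_users < 5_000_000)

# A tiny forum older than 3 years fails on age alone:
print(exempt_from_upload_filters(Platform(10, 50_000, 2_000)))    # False
# Only a new, small, low-traffic service qualifies:
print(exempt_from_upload_filters(Platform(1, 500_000, 100_000)))  # True
```

This is why the examples above fall back into scope despite being small: each fails at least one of the three tests.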
On top of that, even the smallest and newest platforms, which do meet all three criteria, must still demonstrate they have undertaken "best efforts" to obtain licenses from rightholders such as record labels, book publishers and
stock photo databases for anything their users might possibly upload -- an impossible task. In practice, all sites and apps where users may upload material will likely be forced to accept any license a rightholder offers them, no matter how bad
the terms, and no matter whether they actually want their copyrighted material to be available on the platform or not, to avoid the massive legal risk of coming into conflict with Article 13.
In summary: France's and Germany's compromise on Article 13 still calls for nearly everything we post or share online to require prior permission by "censorship machines", algorithms that are fundamentally unable to distinguish between copyright
infringement and legal works such as parody and critique. It would change the web from a place where we can all freely express ourselves into one where big corporate rightholders are the gatekeepers of what can and can't be published. It would
allow these rightholders to bully any for-profit site or app that includes an upload function. European innovation on the web would be discouraged by the new costs and legal risks for startups -- even if they only apply when platforms become
successful, or turn 3 years old. Foreign sites and apps would be incentivised to just geoblock all EU users to be on the safe side.
Now everything hinges on the European Parliament
With this road block out of the way, the trilogue negotiations to finish the new EU copyright law are back on. With no time to lose, there will be massive pressure to reach an overall agreement within the next few days and pass the law in March
or April. The most likely next steps will be a rubber-stamping of the new Council position cooked up by Germany and France on Friday, 8 February, and a final trilogue on Monday, 11 February.
MEPs, most of whom are fighting for re-election, will get one final say. Last September, a narrow majority for Article 13 could only be found in the Parliament after a small business exception was included that was much stronger than the foul
deal France and Germany are now proposing -- but I don't have high hopes that Parliament negotiator Axel Voss will insist on this point. Whether MEPs will reject this harmful version of Article 13 (like they initially did last July) or bow to the
pressure will depend on whether all of us make clear to them: If you break the internet and enact Article 13, we won't re-elect you.
Facebook has been caught out censoring a poster for a comedy show because Facebook's simplistic algorithms couldn't distinguish a jokey use of the word 'Brexit' from a political advert.
The social media site has taken drastic action to clamp down on political advertising in a bid to tackle a backlash over secret Russian interference. But it was accused of over-reacting after a comedian was told he couldn't promote his show Brexit Through The Gift Shop.
Comedian Matt Forde was told his stand-up show's title breached new rules on ads about politics or issues of national importance. Facebook told him: "There's no way around this other than not using the word Brexit."
The comedian told The Sun that it was incredible that Facebook allowed tech firms to harvest the data of millions without telling them but stopped him from advertising a comedy show. Forde added:
I'm flattered that they think I'm a greater threat to their users than the collapse of global democracy. Obviously what I forgot to do was offer Facebook the personal data of my friends and family.
Uganda's government has been rattled by the popularity of pop star Bobi Wine who has become an opposition politician after amassing a large following amongst the country's disillusioned youths.
The government has now proposed a new censorship law vetting new songs, film and stage show scripts. In addition artists will have to seek state permission to perform abroad.
Musicians and other artists will also have to register with the government and obtain a practicing license which can be revoked for a range of violations.
Peace Mutuuzo, junior minister for gender, labor and social development, told Reuters in an interview that the new regulations to govern the music and entertainment industry were already drafted and expected to be passed by cabinet by March.
The Grand Tour presenter Jeremy Clarkson has pushed back at claims of homophobia from gay singer Will Young by joking about enjoying lesbian porn.
LGBT+ campaigner and musician Will Young had hit out at Clarkson after a recent episode of the Amazon motoring show included a running gag alluding to a Jeep Wrangler being gay. The January 27 episode of the Amazon show also saw Clarkson ask whether
LGBT stands for lesbian, bacon, transgender.
I'm afraid 3 heterosexual men SO uncomfortable with their sexuality that they reference in some lame way a Wrangler Jeep being a Gay mans car
.... and then Hammond and May's quips to Clarkson wearing chaps, a pink shirt, he should get some moisturiser. It's f**king pathetic and actually homophobic.
Clarkson responded to Young also on Twitter:
...I will apologise to Will for causing him some upset and reassure him that I know I'm not homophobic as I very much enjoy watching lesbians on the internet.
An up-and-coming young author has cancelled the publication of her highly anticipated debut novel after receiving a barrage of criticism from the PC lynch mob over her depiction of race and slavery.
Amélie Wen Zhao's novel, Blood Heir , is a fantastical retelling of the Anastasia story involving "a princess hiding a dark secret and the conman she must trust to clear her name for her father's murder". It was scheduled to be
published in June.
After criticism on grounds of political correctness, Zhao said in a statement that negative feedback from the young adult community had led to her asking her publisher, Delacorte Press, not to release the book at this time. She said:
It was never my intention to bring harm to any reader of this valued community, particularly those for whom I seek to write and empower ... I don't wish to clarify, defend or have anyone defend me. This is not that; this is an apology.
Zhao had previously said on her website that she had set out to create "a diverse cast, many of which are beloved and dear to a third-culture kid like myself".
Before the PC mob picked up on the book, early reviews had been positive.
Offsite Comment: The return of book-burning
The Twittermob's fury with un-woke novels has sinister echoes of the past.
Nervous BBC chiefs once forced Russell Howard to rewrite a joke -- in case it offended ISIS.
Speaking on his Sky One show The Russell Howard Hour , the stand-up said: A while back I worked for the BBC and I did a piece about the Paris attacks when I said Isis weren't Muslims, they were terrorists -- and the crowd cheered.
And then, at the end of the show, the BBC lost their mind, [saying] "You need to re-record it! You need to say Isis aren't *devout* Muslims."
I was like, "Are you worried we are going to offend Isis? Are they going to write in?"
When the routine was broadcast on his former BBC show, Russell Howard's Good News , the words devout Muslims were used, in keeping with the executives' wishes.
Update: Even woke comics aren't safe
PC is bad for comedy of all political persuasions.
The Russian TV censor has found certain violations in activities of the BBC World News broadcaster in Russia. The probe into the broadcaster's actions was launched in response to the British TV censor Ofcom's ruling against the Russian
propaganda channel RT for biased reporting about the Salisbury poisoning.
Roskomnadzor, the Russian TV censor, said BBC World News has been found in breach of Russian legislation in Russia following an unscheduled inspection. It did not elaborate on the nature of the revealed violations but said that it is assessing
their severity. Roskomnadzor will later provide further information about the measures taken.
On a separate occasion, January 10, Roskomnadzor said it found some BBC online reports in breach of Russian anti-extremism laws as they contained some direct quotes of Al-Baghdadi, the head of Islamic State, something that is banned under a