The suffocating mountain of red tape titled the Online Safety Act kills its first British business
24th December 2024
See article from lfgss.com
The owner of the popular cycling forum LFGSS has decided to close his business due to the enormous risks and expenses inherent in running a British website under the misleadingly named Online Safety Act. He explains:

Reading Ofcom's tome of censorship rules and we're done... we fall firmly into scope, and I have no way to dodge it. The act is too broad, and it doesn't matter that there has never been an instance of any of the proclaimed harms that this act protects adults, children and vulnerable people from... the very broad language, and the fact that I'm based in the UK, mean we're covered.

The act simply does not care that this site and platform are run by an individual, and that I do so philanthropically without any profit motive (typically losing money), nor that the site exists to reduce social loneliness, reduce suicide rates, and help build meaningful communities that enrich life. The act only cares that it is "linked to the UK" (by me being involved as a UK native and resident, and by you being a UK-based user), and that users can talk to other users... that's it, that's the scope.

I can't afford the likely tens of thousands it would cost to go through all the legal hoops here over a prolonged period of time; the site itself barely gets a few hundred in donations each month and costs a little more than that to run... this is not a venture that can afford compliance costs... and even if it could, what remains is a disproportionately high personal liability for me, and one that could easily be weaponised by disgruntled people who are banned for their egregious behaviour... I do not see an alternative to shuttering it.

The conclusion I have to make is that we're done... Microcosm, LFGSS, the many other communities running on this platform... the risk to me personally is too high, and so I will need to shutter them all. On Sunday 16th March 2025 (the last day prior to the Act taking effect) I will delete the virtual servers hosting LFGSS and other communities, ending, effective immediately, the approximately 300 small communities that I run, and the few large communities such as LFGSS.
Ofcom publishes another mountain of expensive and suffocating censorship red tape
16th December 2024
See press release from ofcom.org.uk
Ofcom writes: Today we are publishing our first major policy Statement for the Online Safety regime. This decision on the Illegal Harms Codes and guidance marks a major milestone, with online providers now being legally required to protect their users from illegal harm.

Ofcom published proposals about the steps providers should take to address illegal harms on their services shortly after passage of the Online Safety Act in October 2023. Since then, we have been consulting carefully and widely, listening to industry, charities and campaigners, parents and children, as well as expert bodies and law enforcement agencies. With today's publication, online providers must take action to start to comply with these new rules. The result will be a safer life online for people in the UK, especially children.

Providers now have a duty to assess the risk of illegal harms on their services, with a deadline of 16 March 2025. Subject to the Codes completing the Parliamentary process, from 17 March 2025, providers will need to take the safety measures set out in the Codes or use other effective measures to protect users from illegal content and activity. We are ready to take enforcement action if providers do not act promptly to address the risks on their services.

Analysis to follow, but there are over 1000 pages to get through first!
Ofcom proposes definitions for which websites will be subjected to the most onerous censorship rules defined in the Online Safety Act
31st March 2024
See press release from ofcom.org.uk
See consultation from ofcom.org.uk
Ofcom writes: Ofcom is seeking evidence to inform our codes of practice and guidance on the additional duties that will apply to some of the most widely used online sites and apps, designated as categorised services under the Online Safety Act.

Under the new laws, all in-scope tech firms must put in place appropriate safety measures to protect users from online harms. In addition, some online services will have to comply with extra requirements if they fall into one of three categories, known as Category 1, 2A or 2B. These extra duties include giving users more tools to control what content they see, ensuring protections for news publisher and journalistic content, preventing fraudulent advertising and producing transparency reports. Different duties apply, depending on which category a service falls into.

The Act requires us to produce codes of practice and guidance outlining the steps that companies can take to comply with these additional duties. We are inviting evidence from industry, expert groups and other organisations to help inform and shape our approach. A formal consultation on the draft codes and guidance will follow in 2025, taking account of responses to today's call for evidence.

Advice to Government on categorisation thresholds

Alongside this, we have also today published our advice to Government on the thresholds which would determine whether or not a service falls into Category 1, 2A or 2B. We advise that:

Category 1 (most onerous): should apply to services which meet either of the following conditions:
Condition 1: uses a content recommender system; and has more than 34 million UK users on the user-to-user part of its service, representing around 50% of the UK population.
Condition 2: allows users to forward or reshare user-generated content; and uses a content recommender system; and has more than 7 million UK users on the user-to-user part of its service, representing circa 10% of the UK population.

Category 2A: should apply to services which meet both of the following criteria: is a search service, but not a vertical search service; and has more than 7 million UK users on the search engine part of its service, representing circa 10% of the UK population.

Category 2B: should apply to services which meet both of the following criteria: allows users to send direct messages; and has more than 3 million UK users on the user-to-user part of the service, representing circa 5% of the UK population.

Taking our advice into consideration, the Secretary of State must set the threshold conditions in secondary legislation. Once passed, we will then gather information, as needed, from regulated services and produce a published register of categorised services.
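The proposed thresholds amount to a simple decision rule. As a minimal sketch only, here is that rule in Python; the `categorise` helper and field names such as `uk_u2u_users` are illustrative assumptions, not drawn from any Ofcom or statutory source:

```python
def categorise(service: dict) -> list[str]:
    """Return the categories a service would fall into under Ofcom's
    March 2024 advice to Government (illustrative sketch only)."""
    categories = []
    u2u = service.get("uk_u2u_users", 0)        # UK users, user-to-user part
    search = service.get("uk_search_users", 0)  # UK users, search engine part

    # Category 1: Condition 1 OR Condition 2
    cond1 = service.get("recommender_system") and u2u > 34_000_000
    cond2 = (service.get("allows_resharing")
             and service.get("recommender_system")
             and u2u > 7_000_000)
    if cond1 or cond2:
        categories.append("Category 1")

    # Category 2A: a general (non-vertical) search service with >7m UK users
    if (service.get("is_search_service")
            and not service.get("is_vertical_search")
            and search > 7_000_000):
        categories.append("Category 2A")

    # Category 2B: direct messaging with >3m UK users
    if service.get("allows_direct_messages") and u2u > 3_000_000:
        categories.append("Category 2B")

    return categories
```

Note that the categories are not mutually exclusive: a large platform with a recommender system and direct messaging could fall into Category 1 and Category 2B at once, which is why the sketch returns a list rather than a single label.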
Parts of the Online Censorship Act have come into force
31st January 2024
See press release from gov.uk
Abusers, trolls, and predators online now face a fleet of tough new jailable offences from Wednesday 31 January, as offences for cyberflashing, sending death threats, and epilepsy-trolling are written into the statute book after the Online Safety Act gained Royal Assent. These new criminal offences will protect people from a wide range of abuse and harm online, including threatening messages, the non-consensual sharing of intimate images known as revenge porn, and sending fake news that aims to cause non-trivial physical or psychological harm.

Dubbed Zach's law, a new offence will also mean online trolls who send or show flashing images electronically with the intention of causing harm to people with epilepsy will be held accountable for their actions and face prison.

Following the campaigning of Love Island star Georgia Harrison, bitter ex-partners and other abusers who share, or threaten to share, intimate images on or offline without the consent of those depicted will face jail time under new offences from today. Those found guilty of the base offence of sharing an intimate image could be in prison for up to 6 months, or 2 years if it is proven the perpetrator also intended to cause distress, alarm or humiliation, or shared the image to obtain sexual gratification.

Cyberflashing on dating apps, AirDrop and other platforms will also result in perpetrators facing up to two years behind bars where it is done to gain sexual gratification, or to cause alarm, distress or humiliation.

Sending death threats or threatening serious harm online will also carry a jail sentence of up to five years under a new threatening communications offence that will completely outlaw appalling threats made online that would be illegal if said in person.

A new false communications offence will bring internet trolls to justice by outlawing the intentional sending of false information that could cause non-trivial psychological or physical harm to users online. This new offence will bolster the government's strong commitment to clamping down on dangerous disinformation and election interference online.

In the wake of sickening content, often targeted at children, that encourages users to self-harm, a new offence will mean individuals who post content encouraging or assisting serious self-harm could face up to 5 years behind bars.

While much of the Online Safety Act's protections are intended to hold tech companies and social media platforms to account for the content hosted on their sites, these new offences will apply directly to the individuals sending threatening or menacing messages, bringing justice directly to them. Some of the offences that commence today will be further bolstered when the wide-ranging Criminal Justice Bill completes its passage through Parliament.
4th December 2023
A summary of the current position of the UK's (anti-)pornographic internet censorship provisions
See article from decoded.legal
11th November 2023
With 1500 pages outlining a mountain of suffocating red tape in the name of internet regulation, Ofcom delivers a message to small British internet companies
See article from webdevlaw.uk
The Online Unsafety Bill gets Royal Assent and so becomes law
29th October 2023
See article from ofcom.org.uk
The Online Safety Bill received Royal Assent on 26th October 2023, heralding a new era of internet censorship. The new UK internet censor Ofcom was quick off the mark to outline its timetable for implementing the new censorship regime, setting out its plans for putting the online safety laws into practice and what it expects from tech firms now that the Online Safety Act has passed.

Ofcom writes: The Act makes companies that operate a wide range of online services legally responsible for keeping people, especially children, safe online. These companies have new duties to protect UK users by assessing risks of harm, and taking steps to address them. All in-scope services with a significant number of UK users, or targeting the UK market, are covered by the new rules, regardless of where they are based.

While the onus is on companies to decide what safety measures they need given the risks they face, we expect implementation of the Act to ensure people in the UK are safer online by delivering four outcomes: stronger safety governance in online firms; online services designed and operated with safety in mind; choice for users so they can have meaningful control over their online experiences; and transparency regarding the safety measures services use, and the action Ofcom is taking to improve them, in order to build trust.

We are moving quickly to implement the new rules. Ofcom will give guidance and set out codes of practice on how in-scope companies can comply with their duties, in three phases, as set out in the Act.

Phase one: illegal harms duties. We will publish draft codes and guidance on these duties on 9 November 2023, including: analysis of the causes and impacts of online harm, to support services in carrying out their risk assessments; draft guidance on a recommended process for assessing risk; draft codes of practice, setting out what services can do to mitigate the risk of harm; and draft guidelines on Ofcom's approach to enforcement. We will consult on these documents, and plan to publish a statement on our final decisions in Autumn 2024. The codes of practice will then be submitted to the Secretary of State for Science, Innovation and Technology, and subject to their approval, laid before Parliament.

Phase two: child safety, pornography and the protection of women and girls. Child protection duties will be set out in two parts. First, online pornography services and other interested stakeholders will be able to read and respond to our draft guidance on age assurance from December 2023. This will be relevant to all services in scope of Part 5 of the Online Safety Act. Secondly, regulated services and other interested stakeholders will be able to read and respond to draft codes of practice relating to protection of children, in Spring 2024. We expect to publish draft guidance on protecting women and girls by Spring 2025, when we will have finalised our codes of practice on protection of children.

Phase three: transparency, user empowerment, and other duties on categorised services. A small proportion of regulated services will be designated Category 1, 2A or 2B services if they meet certain thresholds set out in secondary legislation to be made by Government. Our final stage of implementation focuses on additional requirements that fall only on these categorised services. Those requirements include duties to: produce transparency reports; provide user empowerment tools; operate in line with terms of service; protect certain types of journalistic content; and prevent fraudulent advertising.

We now plan to issue a call for evidence regarding our approach to these duties in early 2024 and a consultation on draft transparency guidance in mid 2024. Ofcom must produce a register of categorised services. We will advise Government on the thresholds for these categories in early 2024, and Government will then make secondary legislation on categorisation, which we currently expect to happen by summer 2024. Assuming this is achieved, we will: publish the register of categorised services by the end of 2024; publish draft proposals regarding the additional duties on these services in early 2025; and issue transparency notices in mid 2025.