The U.K. government put out an explainer on what the Online Safety Act entails.
The Online Safety Act 2023 (the Act) is a new set of laws that protects children and adults online. It puts a range of new duties on social media companies and search services, making them more responsible for their users’ safety on their platforms. The Act will give providers new duties to implement systems and processes to reduce risks their services are used for illegal activity, and to take down illegal content when it does appear.

The strongest protections in the Act have been designed for children. Platforms will be required to prevent children from accessing harmful and age-inappropriate content and provide parents and children with clear and accessible ways to report problems online when they do arise.

The Act will also protect adult users, ensuring that major platforms will need to be more transparent about which kinds of potentially harmful content they allow, and give people more control over the types of content they want to see.

Ofcom is the independent regulator of Online Safety. It will set out steps providers can take to fulfil their safety duties in codes of practice. It has a broad range of powers to assess and enforce providers’ compliance with the framework.

Providers’ safety duties are proportionate to factors including the risk of harm to individuals, and the size and capacity of each provider. This makes sure that while safety measures will need to be put in place across the board, we aren’t requiring small services with limited functionality to take the same actions as the largest corporations. Ofcom is required to take users’ rights into account when setting out steps to take. And providers have simultaneous duties to pay particular regard to users’ rights when fulfilling their safety duties.
According to an article on Eurogamer, the law definitely affects online games and gaming platforms. Among those interviewed was George Osborn, editor of the Video Games Industry Memo, who was head of campaign and communications at Ukie while the bill was being negotiated.
From today [25 July 2025], any game with user-to-user communication (such as voice or text chat) that's available in the UK will need to follow the law. Broadly, Osborn explains, studios will need moderation tools to remove harmful content, better reporting processes, and measures in place to protect children if a game is accessible to them.

"That's the bare-bones version of the law," Osborn warns. "The reality is that a lawyer will likely tell you that you've got to do a hell of a lot more to meet the provisions of the 300-page act and the many volumes of guidance put out by Ofcom. And given that the regulator can dish out fines of up to £18m or 10 percent of your global turnover, and in some rare cases bang up a senior exec, it is something that you really need good advice on quickly."

He adds: "The only crumb of comfort is that Ofcom is still rolling out the final bits of the act and has said it'll take its time to fully enforce. But with games having historically been a target for moral panics and policymaker madness, you don't want to bet too closely on not being picked up."
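To make Osborn's "bare-bones version" a bit more concrete, here is what the skeleton of a player-report and takedown flow for in-game text chat might look like. This is a minimal sketch only, not a statement of what the Act or Ofcom's codes actually require; the class names, the report categories, and the auto-removal policy are all hypothetical.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from enum import Enum

# Hypothetical report categories; the Act and Ofcom's codes define the
# real ones (illegal content, content harmful to children, and so on).
class ReportReason(Enum):
    ILLEGAL_CONTENT = "illegal_content"
    HARMFUL_TO_CHILDREN = "harmful_to_children"
    HARASSMENT = "harassment"
    OTHER = "other"

@dataclass
class ChatMessage:
    message_id: str
    sender_id: str
    text: str
    removed: bool = False

@dataclass
class Report:
    message_id: str
    reporter_id: str
    reason: ReportReason
    created_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

class ModerationQueue:
    """Toy in-memory store; a real service would persist reports and
    route them to human moderators and/or automated classifiers."""

    def __init__(self) -> None:
        self.messages: dict[str, ChatMessage] = {}
        self.reports: list[Report] = []

    def report_message(self, message_id: str, reporter_id: str,
                       reason: ReportReason) -> Report:
        # The Act expects clear, accessible reporting routes for users.
        report = Report(message_id, reporter_id, reason)
        self.reports.append(report)
        # One possible (hypothetical) policy: hide reported illegal
        # content immediately, pending human review.
        if reason is ReportReason.ILLEGAL_CONTENT:
            self.take_down(message_id)
        return report

    def take_down(self, message_id: str) -> None:
        msg = self.messages.get(message_id)
        if msg is not None:
            msg.removed = True  # hide from other players; keep for audit
```

Even at this toy level you can see where the compliance cost comes from: a real system needs persistence, audit trails, human moderators, appeal routes, and age-aware policies layered on top of a skeleton like this.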
As part of my research for this piece, Copilot put together a grid summarizing the OSA based on the above-mentioned Eurogamer article and an article about Microsoft's upcoming age verification requirements for Xbox users.
*A quick summary of video games and the OSA*
Whatever its original intentions, the OSA apparently places small indie developers at a disadvantage versus triple-A studios.
Osborn states the Act is "far too big for indies who have chat functions to comply with easily".

"Back when it was being passed," he says, "the bill was constantly referred to as a 'Christmas tree' because people kept whacking baubles onto it while it took years and years and years to pass.

"And the result of that is it empowered Big Tech and social media companies, rather than putting a muzzle on them. Yes, there are loads of ways they can be collared now. But because they have big legal teams, great external counsel and policy pros and public affairs agencies lobbying for them, the biggest businesses have been able to pay up for the right advice to adapt their services to meet the rules in advance."

He adds: "But for small businesses including indie game developers who have the misfortune to have things like text and voice chat in their games, following all the rules is really hard." Even with the likes of Modulate and k-ID to assist, it remains hard for small businesses to comply in the short and long term, says Osborn.
The law is not fully in effect. Below is the timeline for passage and enforcement of the Act:
- October 2023: Online Safety Act becomes law (Royal Assent). Ofcom begins developing codes; companies start preparing compliance plans.
- January 2024: New individual offenses (cyberflashing, etc.) come into effect.
- Late 2024: Initial codes of practice (the Illegal Content code) laid before Parliament and guidance on illegal content risk assessments published. The government also laid secondary legislation setting the category thresholds, based on Ofcom's advice (defining Categories 1, 2A, and 2B by user numbers).
- March 2025: Illegal content duties fully in force. By 17 March, all in-scope services must have completed illegal content risk assessments and begun implementing measures to mitigate those risks. Ofcom gains enforcement powers over the illegal content duties.
- January 2025 – July 2025: Focus on children’s safety duties. Ofcom issued guidance on age assurance for pornography and on conducting “children’s access assessments” (to determine whether a service is likely to be accessed by children). By April 2025, services had to assess whether their platform is likely used by under-18s. Child protection codes of practice were laid in Parliament in April, and by 24 July 2025, any service deemed likely to be accessed by kids must have completed a detailed child risk assessment. Summer 2025 marks the point at which the child safety regime is in effect, with requirements like age verification for adult content kicking in (a rough sketch of such an age gate follows the timeline).
- Late 2025 – 2026: Ofcom will publish the register of which services fall into Category 1, 2A, 2B (expected in late 2025) and develop further codes for those additional Category 1 duties. By early 2026, we expect new rules on transparency reports and adult user empowerment tools to become operational for the largest platforms. Enforcement activity will ramp up accordingly as all phases of the Act come into play.
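As for the age-assurance requirement mentioned in the January–July 2025 phase above, here is a rough, hypothetical sketch of what an age gate can look like in code. The assurance levels, content ratings, and access policy are invented for illustration; the Act and Ofcom's guidance on "highly effective age assurance" define the real bar, and third-party providers typically supply the verification itself.

```python
from enum import Enum

class AgeAssurance(Enum):
    """Hypothetical levels of confidence in a user's age. The Act does not
    mandate a specific technology; Ofcom's guidance discusses options such
    as photo-ID matching, facial age estimation, and payment-card checks."""
    NONE = 0             # self-declared date of birth only
    ESTIMATED_ADULT = 1  # e.g. facial age estimation above a threshold
    VERIFIED_ADULT = 2   # e.g. ID or payment-card verification

def can_access(content_rating: str, assurance: AgeAssurance) -> bool:
    # Toy policy: age-restricted content requires at least an
    # estimated-adult signal; the strictest material requires verification.
    if content_rating == "all_ages":
        return True
    if content_rating == "age_restricted":
        return assurance is not AgeAssurance.NONE
    if content_rating == "adult_only":
        return assurance is AgeAssurance.VERIFIED_ADULT
    return False

# Example: a self-declared user cannot see age-restricted chat channels.
assert can_access("age_restricted", AgeAssurance.NONE) is False
assert can_access("adult_only", AgeAssurance.VERIFIED_ADULT) is True
```

The point is less the dozen lines of gating logic and more everything around them: integrating a verification provider, handling players who refuse to verify, and documenting the approach for a children's access assessment.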
So last Friday was just when the aspects of the law affecting children went into effect. "Just" in this case is a bit of an understatement, as the measures are rather comprehensive. Once every phase of the Act kicks in, fully enforcing the U.K.'s attempt to regulate the Internet will probably be beyond what any regulator can realistically manage, in which case Ofcom will get to pick and choose where to concentrate its monitoring efforts. Given history, expect a spotlight to shine on the gaming industry.
