Summary
Last week, the UK's telecoms regulator, Ofcom, released draft guidelines under the Online Safety Act (OSA), detailing how tech firms should handle illegal content.
The proposed rules focus on social media platforms, search engines, online and mobile games, and pornography sites, outlining measures to address issues like child sexual abuse material (CSAM), terrorism content, and fraud. Ofcom emphasizes a shift from reactive to proactive approaches, urging platforms to prevent the spread of illegal content rather than merely responding to it.
The guidelines aim to enforce a duty of care on tech firms, requiring them to take responsibility for user safety. Ofcom estimates that around 100,000 services may fall under the rules, with the largest platforms facing the strictest requirements. The guidelines cover various aspects, including not allowing strangers to send direct messages to children, using hash matching to detect and remove CSAM, maintaining content and search moderation teams, and providing ways for users to report harmful content.
“Making the internet safer does not end with this Bill becoming an Act. The scale of child sexual abuse, and the harms children are exposed to online, have escalated in the years this legislation has been going through Parliament. Companies in scope of the regulations now have a huge opportunity to be part of a real step forward in terms of child safety.” -Susie Hargreaves, Chief Executive Officer, Internet Watch Foundation (Source: The Standard)
The rules also address other illegal harms such as content encouraging suicide, harassment, and the supply of drugs and firearms. The regulator can levy fines of up to £18 million or 10% of worldwide turnover for breaches, and offending sites may even be blocked in the UK.
“We stand ready to work with Ofcom, and with companies looking to do the right thing to comply with the new laws. It's vital companies are proactive in assessing and understanding the potential risks on their platforms, and taking steps to make sure safety is designed in.” -Susie Hargreaves, Chief Executive Officer, Internet Watch Foundation (Source: The Standard)
The guidelines do not explicitly focus on artificial intelligence but emphasize a "technology-neutral" approach. The draft guidance is part of a multiphase publication process, and the final regulations are expected next fall.
Ofcom plans to consult on more contentious issues in the future, such as content that is legal but harmful to children and the impact of the rules on end-to-end encryption in messaging apps.
What are your thoughts? Join the conversation in our Yes We Trust community, a free discussion group for data privacy professionals and enthusiasts, on LinkedIn: