Tech firms must start protecting UK users from illegal content
The Hindu
Tech companies must start putting in place measures to protect users from child sexual abuse images and other illegal content in Britain from Monday, as enforcement of the country's online safety regime ramps up.
Media regulator Ofcom said Meta's Facebook, ByteDance's TikTok, Alphabet's YouTube and other companies must now implement measures such as better moderation, easier reporting and built-in safety tests to tackle criminal activity and make their platforms safer by design.
"Platforms must now act quickly to come into compliance with their legal duties, and our codes are designed to help them do that," Ofcom's enforcement director Suzanne Cater said.
The Online Safety Act, which became law in 2023, sets tougher standards for platforms, with an emphasis on child protection and the removal of illegal content.
In December, Ofcom published its first codes of practice for the new law and set companies a deadline of March 16 to assess the risks illegal content posed to users on their platforms.
The regulator can issue fines of up to 18 million pounds ($23.31 million) or 10% of a company's annual global turnover if it fails to comply with the law.
Ofcom said file-sharing and file-storage services were particularly vulnerable to being used for sharing child sexual abuse material.
