UK children exposed to violent content online see it as ‘inevitable’, report finds
Children in Britain encounter violent online content, including material promoting self-harm, from a young age and regard it as an unavoidable part of using the internet, according to new research. The findings underline the difficulties that governments and technology companies worldwide face in protecting minors online.

Britain has passed legislation requiring social media platforms to block children’s access to harmful content through age verification. Ofcom, the regulator, has the power to fine non-compliant tech companies, although enforcement cannot begin until its codes for implementing the law are finalised. Some messaging services, including WhatsApp, have resisted parts of the law they say could compromise end-to-end encryption.

The study, involving 247 children aged 8 to 17, found that they encountered violent content mainly through social media, video-sharing and messaging platforms. The material ranged from violent gaming footage to verbal discrimination and videos of street fights. Many children felt powerless over the content recommended to them and had little understanding of the algorithms behind those recommendations.

Ofcom stressed the urgent need for tech firms to prepare for their child protection responsibilities under the new online safety laws.