Online Safety Bill to become law in crackdown on harmful social media content
The Online Safety Bill has passed its last parliamentary hurdle in the House of Lords, meaning it will finally become law after years of delay.
The flagship piece of legislation will force social media firms to remove illegal content and protect users, especially children, from material which is legal but harmful.
The idea was conceived in a white paper in 2019, but it has been a long and rocky road to turn it into law, with delays and controversies over issues such as freedom of speech and privacy.
Perhaps most controversially, one of the proposals would force platforms like WhatsApp and Signal to undermine messaging encryption so private chats could be checked for criminal content.
Technology Secretary Michelle Donelan said: “The Online Safety Bill is a game-changing piece of legislation. Today, this government is taking an enormous step forward in our mission to make the UK the safest place in the world to be online.”
The bill will require social media companies to remove illegal content quickly or prevent it from appearing in the first place, including content promoting self-harm.
Other illegal content the bill targets includes the sale of drugs and weapons, the incitement or planning of terrorism, sexual exploitation, hate speech, scams, and revenge porn.
Communications regulator Ofcom will be largely responsible for enforcing the bill, with social media bosses facing fines of billions of pounds or even jail if they fail to comply.
The bill has also created new criminal offences, including cyber-flashing and the sharing of “deepfake” pornography.
The legislation has received widespread support from charities like the NSPCC, safety group the Internet Watch Foundation (IWF), bereaved parents who say harmful online content contributed to their child’s death, and sexual abuse survivors.
However, there have been concerns within the Tory Party that it is simply too far-reaching, potentially to the point of threatening free speech online.
Meanwhile, tech companies criticised proposed rules for regulating legal but harmful content, arguing they would make platforms unfairly liable for material posted by users.
Ms Donelan removed this measure from the bill in an amendment last year: instead of being required to remove legal but harmful content, platforms would have to provide adults with tools to hide material they do not wish to see.
This includes content that does not meet the criminal threshold but could be harmful, such as the glorification of eating disorders, misogyny and some other forms of abuse.
However, after a backlash from parents, she stressed that the bill still tasks companies with protecting children not just from illegal content but from any material which can “cause serious trauma”, such as cyber-bullying, by enforcing age limits and age-checking measures.
NSPCC chief executive Sir Peter Wanless said: “We are absolutely delighted to see the Online Safety Bill being passed through Parliament. It is a momentous day for children and will finally result in the ground-breaking protections they should expect online.”