The Online Safety Bill is deemed ‘not fit for purpose’ by the majority of UK IT professionals, according to a survey by BCS, the Chartered Institute for IT.
Rather than technical measures to police online content, BCS members want to see investment in tackling the root causes of online harms, the body’s chief executive Rob Deri told Tech Monitor.
Only 14% of 1,300 IT professionals surveyed consider the Bill, which is intended to reduce the risk of harm to UK citizens online, to be ‘fit for purpose’. Over half (51%) believe it will not make the internet safe, and 46% believe it is ‘not workable’.
Nearly three-quarters (74%) of the respondents say the Bill will not stop the spread of disinformation and fake news. And 57% do not agree that it is reasonable for Ofcom to ask platform operators to ‘develop or source’ technology to detect child sexual abuse material (CSAM).
The Online Safety Bill, originally proposed in 2019 and revised in 2021, would impose a ‘duty of care’ on online service providers towards their users. This includes a requirement to remove both illegal and ‘legal but harmful’ content.
The latest revisions to the Online Safety Bill, proposed last month, would require messaging providers that use end-to-end encryption to deploy systems that automatically detect CSAM so it can be reported to authorities.
The Bill also includes a provision for jail sentences for social media users who post content that causes “psychological harm amounting to at least serious distress”.
Online Safety Bill: reform needed
IT professionals’ opposition to the Bill stems from its focus on policing content, not addressing the underlying causes of online harm, says Deri. “There is a concern that too much of the Bill is around taking down harmful stuff rather than reforming the underlying systems to minimise the amount of harmful stuff online in the first place.”
Deri adds that the Bill does not appear to have taken into consideration the wishes of the young people whom it ostensibly protects, or of their parents. “Who’s been talking to students, pupils, parents, and actually finding out what they want?” he asks.
“Too much of the commentary that we’re seeing comes from what you might call vested interests rather than from the students themselves. Young people are going to be affected by this. They’re the ones dealing with this. Why aren’t we hearing more from them about what they want to see done?”
The Bill’s prohibition of ‘legal but harmful’ content is more likely to foment social conflict than reduce harms, Deri adds. “It feels as if that’s building in a mechanism for causing conflict rather than a mechanism for ensuring there is less harmful content in the first place.”
“Our members want to see much more thought put into support and education for students and parents and schools so that they can understand how to navigate what’s going on in social media.”
Last week, campaign group the Electronic Frontier Foundation called for the Bill to be “scrapped”. “The current Bill is a threat to free expression, and it undermines the encryption that we all rely on for security and privacy online,” it said.
The Bill’s requirement that encrypted messaging platforms scan for CSAM would likely force them to adopt client-side scanning, in which content is checked on the user’s device before it is encrypted. As Tech Monitor reported this week, many security researchers strongly oppose the technique, warning that it opens the door to government surveillance and censorship.
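In broad terms, client-side scanning compares outgoing content against a list of known-illegal material on the device itself, before the end-to-end encryption step. A minimal sketch of the exact-hash-matching variant is below; the function names and blocklist entries are hypothetical, and real proposals (such as Apple’s shelved NeuralHash system) use perceptual hashing, which matches visually similar images rather than identical bytes:

```python
import hashlib

# Hypothetical blocklist of digests of known-illegal content.
# In deployed proposals this list is supplied by the platform or an
# external authority -- which is precisely the critics' concern: whoever
# controls the list controls what the device flags.
BLOCKED_HASHES = {
    "5e884898da28047151d0e56f8dc62927",  # placeholder entry, not a real digest
}

def scan_before_encryption(content: bytes) -> bool:
    """Return True if the content matches the blocklist and would be reported."""
    digest = hashlib.md5(content).hexdigest()
    return digest in BLOCKED_HASHES

# The messaging client would call this on every outgoing attachment
# *before* encrypting it, so matches are visible to the provider even
# though the transmitted message remains end-to-end encrypted.
flagged = scan_before_encryption(b"holiday photo bytes")
```

Because the check happens before encryption, the scheme is compatible with end-to-end encryption in a narrow technical sense, but researchers argue it breaks the guarantee users actually care about: that nothing inspects their messages except the intended recipient.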
Read more: Is client-side scanning the future of content moderation?