The committee is one of several that have been tasked with reviewing and offering their opinion on the internet-regulating bill proposed by the government, before it can proceed to a vote.
In its report, the DCMS committee said its members saw problems with the bill’s poorly defined categories of harm it is supposed to tackle, which they believe would hamper the future law’s effectiveness in establishing a “comprehensive safety regime.”
The draft bill was first released in the spring of 2021 and has since been defended by cabinet members as designed to change internet culture in a systemic manner by introducing a duty of care that online platforms must respect.
In addition to dealing with harms that are illegal, like terrorism, child abuse and hate speech, the future law also goes into the gray area of removing content labeled as bullying or promoting behaviors such as eating disorders.
The bill is already considered controversial and has been criticized for allowing large amounts of various content to be removed as online platforms look to protect themselves from nebulously defined liability.
The DCMS report calls on the government to come up with definitions of harmful content and safety duties that align with international human rights rules, and protect freedom of expression by incorporating “minimum standards” that platforms should adhere to while assessing what is harmful content and moderating it, including by using automation and algorithms.
At the same time, the committee wants the government to add even more stringent rules around child exploitation – the bill, considered overbroad by rights campaigners, is often promoted by its sponsors using the “think of the children” tactic – as well as around abuse that targets women and girls.
But, according to the committee, the overarching concern is how the regulator, Ofcom, would even enforce the law – given that on this front, too, the proposal lacks proper clarity.