Tech companies in the UK face new rules, sanctions and oversight as the UK government has declared an end to the “era of self-regulation”. The government will impose a new legal “duty of care” on companies to take steps to tackle illegal and harmful activity on their services, according to plans announced in a white paper.
Under the proposals, companies will have to take “reasonable and proportionate action” to tackle “online harms” — ranging from terrorist content and child sexual exploitation to problems that “may not be illegal but are nonetheless highly damaging” such as disinformation, extremist content and cyberbullying.
Senior managers could be held personally liable if the rules are breached, and their companies could be fined or banned in the UK.
Margot James, digital minister, called the government’s plans the most ambitious adopted by any G7 country. “I think this is groundbreaking in its scale and scope,” she said.
The new rules will apply to a broad range of online companies including file-hosting sites, public discussion forums, messaging services and search engines, rather than just social media platforms.
An independent regulator, funded by the industry, will oversee and enforce the rules. There will be a 12-week consultation period on the proposals, after which the government will set out its final plans. One key decision still to be made is whether the regulator should be a new body or an existing one such as Ofcom.
Ms James said the government was “genuinely undecided” on that question, adding that a hybrid entity, combining Ofcom with the Information Commissioner’s Office, was another option.
On terrorism and child sexual exploitation, the Home Office will have the power to direct the regulator on codes of practice that set out what companies should do to fulfil their new “duty of care”.
Terrorist content shared on social media will have to be taken down “in a short pre-determined timeframe”, the government added.
“The era of self-regulation for online companies is over,” said Jeremy Wright, the UK’s digital secretary. “Voluntary actions from industry to tackle online harms have not been applied consistently or gone far enough.”
The white paper offers some suggestions that could be included in the codes of practice. It suggests the spread of fake news could be tackled by forcing social networks to employ fact-checkers and promote legitimate news sources.
But the regulator will ultimately be free to define the codes itself.
The white paper also says social media companies should produce annual reports revealing how much harmful content has been found on their platforms.
The Information Commissioner, Elizabeth Denham, said:
“I think the white paper proposals reflect people’s growing mistrust of social media and online services. People want to use these services, they appreciate the value of them, but they’re increasingly questioning how much control they have of what they see, and how their information is used. That relationship needs repairing, and regulation can help that. If we get this right, we can protect people online while embracing the opportunities of digital innovation.
“While this important debate unfolds, we will continue to take action. We have powers, provided under data protection law, to act decisively where people’s information is being misused online, and we have specific powers to ensure firms are accountable to the people whose data they use.
“We’ve already taken action against online services, we acted when people’s data was misused in relation to political campaigning, and we will be consulting shortly on a statutory code to protect children online. We see the current focus on online harms as complementary to our work, and look forward to participating in discussions regarding the White Paper.”
Rebecca Stimson, Facebook’s head of UK policy, said in a statement:
“New regulations are needed so that we have a standardised approach across platforms and private companies aren’t making so many important decisions alone.
“New rules for the internet should protect society from harm while also supporting innovation, the digital economy and freedom of speech.”
Twitter’s head of UK public policy, Katy Minshall, said in a statement:
“We look forward to engaging in the next steps of the process, and working to strike an appropriate balance between keeping users safe and preserving the open, free nature of the internet.”
Online Harms White Paper
Read the full white paper: https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/793360/Online_Harms_White_Paper.pdf