UK government Digital Secretary Nicky Morgan and Home Secretary Priti Patel have recommended that Ofcom be appointed as the official online harms regulator. The decision forms the government's initial official response to the Online Harms White Paper consultation, and more details concerning the scope of the regulations, Ofcom's powers, and freedom of speech protections will be forthcoming.
The idea behind the white paper is to change social media regulation to better protect children and vulnerable people, and to give consumers greater confidence in using technology. Social media firms are keen to remain self-regulating, but for some companies this means little or no regulation at all, or simply that money, fame, or power allows whatever rules do exist to be bent.
UK government ministers reckon it is now time for them to step in, "enforcing a statutory duty of care to protect users from harmful and illegal terrorist and child abuse content". This move is said to follow through on the government's pledge to make the UK "the safest place in the world to be online".
As the stick to beat unruly social media firms with, Ofcom will get new powers to carry out its extended responsibilities. Nicky Morgan said that Ofcom will provide a "proportionate and strong regulatory regime" to nurture a thriving digital economy which is trusted and protects all users. Morgan dismissed concerns that this outside regulation could stifle the vibrant and open internet. Priti Patel said that a strong regulator would "ensure social media firms fulfil their vital responsibility to vulnerable users," and would help stop criminals from using social media for their benefit. In the government's own news blog about the Ofcom appointment, children's charity Barnardo's was very supportive of the plans, citing the growing risks to minors online.
In a statement about the news emailed to HEXUS, Emma Woollcott, Head of Reputation Protection at Mishcon de Reya, was generally supportive. "Regulating the behaviour of global corporations will always be challenging – but essential if we are to ensure that platforms take greater responsibility in exercising the enormous power they wield. The possibility of meaningful sanctions when platforms fail to properly protect users should drive greater investment in transparent complaints processes and shorter response times," wrote Woollcott.
Another email we received, this one from the Internet Services Providers' Association (ISPA), was a little more cautious. "In order to effectively address online harms, it is important for interventions to be targeted at the specific part of the internet ecosystem, so we welcome the proposed approach of focusing measures on platforms that facilitate user generated content," wrote Andrew Glover, Chair of the ISPA. Glover went on to raise potential issues around ISP-level blocking and technical developments such as DNS-over-HTTPS.
Going forward, Ofcom may be tasked specifically with deciding which platforms fall under the scope of regulation, and then with making sure social media companies remove illegal content quickly and take steps to stop it appearing in the first place, with "particularly robust action on terrorist content and online child sexual abuse". In contrast to previous, ill-fated government internet regulation initiatives, the regulator will not do anything to "stop adults from accessing or posting legal content that some may find offensive".
Expect a follow-up to the above recommendations by spring, so in a few weeks we should find out further details of the potential enforcement powers Ofcom may have.