Social media sites like Facebook, Instagram and Twitter could be fined and blocked by Ofcom under new ‘landmark’ internet laws aimed at curtailing harmful and abusive content online.
The Online Safety Bill, which the government published a draft of today, aims to protect children online and prevent online racism and abuse.
The bill puts the onus on social media companies to protect their users and to restrict harmful content. If they fail to do so, they could be liable for large fines from broadcasting regulator Ofcom.
The government has long treated social media companies with a light touch, opting for minimal regulation – but a series of high-profile abuse and bullying incidents, such as the racist abuse of Premier League footballers and the death of teenager Molly Russell, has forced it to take a stricter approach.
A recent social media boycott saw a wide range of professional sports bodies, athletes and organisations temporarily stop using the platforms in protest at alleged inaction by social media firms against online abuse.
In a first, the bill could also make senior managers at social media companies criminally liable if they fail to stick to the rules, as well as including provisions to tackle online scams and protect freedom of expression.
However, the government said criminal liability would only be introduced if tech companies failed to live up to their new responsibilities, with a review of the rules to take place two years after they are first enacted.
Ofcom will be given the power to fine companies up to £18 million or 10% of their annual global turnover, whichever is higher, as well as the power to block public access to offending sites.
This could cost companies like Facebook or Twitter billions of pounds for failure to stick to the new laws.
‘Today the UK shows global leadership with our ground-breaking laws to usher in a new age of accountability for tech and bring fairness and accountability to the online world,’ Digital Secretary Oliver Dowden said.
Writing in the Daily Telegraph, he added: ‘What does all of that mean in the real world? It means a 13-year-old will no longer be able to access pornographic images on Twitter. YouTube will be banned from recommending videos promoting terrorist ideologies.
‘Criminal anti-semitic posts will need to be removed without delay, while platforms will have to stop the intolerable level of abuse that many women face in almost every single online setting.
‘And, of course, this legislation will make sure the internet is not a safe space for horrors such as child sexual abuse or terrorism.’
Large social media firms will now have a ‘duty of care’ to look after their users, which means they will be expected to take action against harmful content – such as material relating to suicide, self-harm and misinformation – as well as illegal content.
The laws will also seek to limit the proliferation of online scams, asking online firms to remove fraudulent user-generated content, like financial fraud schemes, romance scams and fake investment opportunities.
However, the bill includes provisions to protect ‘democratic content’ on the platforms, seeking to preserve free speech where possible – this includes forbidding platforms from discriminating against political viewpoints and allowing content that would otherwise be banned if it is ‘democratically important’.
‘This new legislation will force tech companies to report online child abuse on their platforms, giving our law enforcement agencies the evidence they need to bring these offenders to justice,’ Home Secretary Priti Patel said.
‘Ruthless criminals who defraud millions of people and sick individuals who exploit the most vulnerable in our society cannot be allowed to operate unimpeded, and we are unapologetic in going after them.
‘It’s time for tech companies to be held to account and to protect the British people from harm. If they fail to do so, they will face penalties.’
But child protection organisations like the NSPCC have warned the draft bill fails to offer the protection that children need on social media.
‘Government has the opportunity to deliver a transformative Online Safety Bill if they choose to make it work for children and families, not just what’s palatable to tech firms,’ said NSPCC chief executive Sir Peter Wanless.
‘The ambition to achieve safety by design is the right one. But this landmark piece of legislation risks falling short if Oliver Dowden does not tackle the complexities of online abuse and fails to learn the lessons from other regulated sectors.
‘Successful regulation requires the powers and tools necessary to achieve the rhetoric.’
‘Unless Government stands firm on their promise to put child safety front and centre of the Bill, children will continue to be exposed to harm and sexual abuse in their everyday lives which could have been avoided.’
Meanwhile, Labour has called the proposals ‘watered down and incomplete’ and said the new rules did ‘very little’ to ensure children are safe online.
Shadow culture secretary Jo Stevens said: ‘There is little to incentivise companies to prevent their platforms from being used for harmful practices.
‘The Bill, which will have taken the Government more than five years from its first promise to act to be published, is a wasted opportunity to put into place future proofed legislation to provide an effective and all-encompassing regulatory framework to keep people safe online.’