According to the Office for National Statistics, 96% of people aged 16 to 24 use social media, compared with 66% of those over 25.

The Law Commission has released a report concluding that the law governing online communications is incoherent and fails to protect victims of online abuse.

The study, commissioned by the Department for Digital, Culture, Media and Sport (DCMS), calls for the reform and consolidation of the existing criminal legislation dealing with abusive and offensive communications.

According to the report, prosecutors and police often have to deal with overlapping offences and ambiguous terminology that do not provide sufficient legal clarity.

The review looked at the application of:

  • Malicious Communications Act 1988
  • Communications Act 2003
  • Public Order Act 1986
  • Protection from Harassment Act 1997
  • Data Protection Act

The report recommends that the law relating to online abuse be overhauled and brought up to date.

Children’s Commissioner for England writes open letter to social media companies, following the suicide of 14-year-old Molly Russell

Social media has been blamed in part for the death of 14-year-old Molly Russell, whose father, Ian, described her Instagram account as containing distressing material about depression and suicide.

The Children’s Commissioner for England, Anne Longfield, described the content as horrific and accused social media companies of losing control of the content on their platforms. She also called for an independent digital ombudsman to better protect children from disturbing material, stating: “I would appeal to you to accept there are problems and to commit to tackling them – or admit publicly that you are unable to.”

If social media companies cannot manage content, she wrote, “children should not be accessing your services at all, and parents should be aware that the idea of any authority overseeing algorithms and content is a mirage.”

There have been calls for a new standalone internet regulator to be established, with the power to fine companies that fail in their duty to protect children and young people.

Reflecting on the case of Molly Russell’s suicide, Culture Secretary Jeremy Wright commented that “the era of self-regulation on the internet is coming to an end.”

Social media companies accused of grooming and radicalising through their algorithms

Last year, the chair of the Home Affairs Committee, Yvette Cooper, warned representatives from Twitter, YouTube and Facebook that police were extremely worried about the role of technology in aiding extremism and online grooming.

In a hearing in the Commons, Ms Cooper stated: “the algorithms groom and radicalise, because once you view one slightly dodgy thing, you are then linked to a stream of other similar things. Whether racist extremism or Islamic extremism, technology is doing that job.”

Ms Cooper, a Labour MP, said she had been recommended videos from a white supremacist channel on YouTube, and had been shown tweets from the Britain First leaders Paul Golding and Jayda Fransen before they were banned from Twitter. She is calling on companies to ensure their algorithms do not automatically pull people deeper into extremist or disturbing content.

Ms Cooper prepared evidence by sending Google links to three YouTube videos posted by the neo-Nazi David Duke and the group National Action. During the committee hearing, MPs questioned why they could find hate speech material on social media sites within seconds, and how neo-Nazis could earn advertising revenue through videos posted on YouTube.

The inquiry was launched following the murder of the Labour MP Jo Cox. The report of the Hate Crime and its Violent Consequences inquiry suggested that social media companies should be treated like traditional publishers, and considered whether failure to remove illegal content should itself be a crime, concluding that ‘what is illegal offline should be illegal and enforced online.’

While the principles of open public debate and free speech within a democracy must be maintained, the report argues it is essential to counter the promotion of violence against particular groups, terrorism and extremism.

Laws to make social media safe: the Internet Safety Strategy

New online safety laws are to be introduced in the UK. The Internet Safety Strategy aims to ensure that companies meet their responsibilities to their users, including through technical solutions to prevent online harm. According to the Internet Safety Strategy Green Paper:

  • Six in ten of those surveyed said they had witnessed inappropriate or harmful content online
  • Four in ten said they had experienced online abuse
  • Four in ten said concerns raised with social media companies were not taken seriously

The strategy covers various aspects of online safety:

  • The introduction of a social media code of practice, transparency reporting and a social media levy

  • Technological solutions to online harm
  • Developing children’s digital literacy
  • Support for parents and carers
  • Adults’ experience of online abuse, and
  • Young people’s use of online dating websites/applications

The government will consider whether a code of practice should be written into legislation. Digital Secretary Matt Hancock said: “We will work closely with industry to provide clarity on the roles and responsibilities of companies that operate online in the UK to keep users safe. We will also work with regulators, platforms and advertising companies to ensure that principles that govern advertising in traditional media – such as preventing companies targeting unsuitable advertisements at children – need to also apply and be enforced online.”

The Scoping Report: next steps

The DCMS will analyse the findings, decide upon next steps, and produce recommendations as to how the criminal law can be improved to tackle abuse online as well as offline.

For further information, please contact Ison Harrison.
