Social media firms agree to quickly take down prejudicial posts


A report by the Attorney General’s Office said comments on social media were not enough of a risk to require new laws. Photograph: Alamy

Social media companies have agreed “takedown” arrangements with the Attorney General’s Office to ensure the swift removal of prejudicial comments about active trials.

The likes of Facebook, Google and Twitter do not, however, pose such a severe threat that new laws protecting fair trials are needed, according to the solicitor general, Robert Buckland QC.

A report by the AG’s office commissioned after the 2016 Angela Wrightson murder trial concluded it was possible to control the risks of online comments prejudicing juries’ deliberations.

In July 2015, Mr Justice Globe discharged a jury and ordered a retrial in the case of two teenage girls accused of murdering Wrightson after threats against them and attacks on court procedures were posted on Facebook.

Local and national newspapers reported the trial accurately, but when their stories were shared on social media platforms, prejudicial remarks were posted beneath them. A retrial was heard at Leeds crown court the following year and the teenagers, identified only as F and D, were convicted.

The AG’s office then launched a call for evidence about the impact of social media on criminal trials. Its report, produced in response, said: “The evidence received suggests that while there are new challenges with the use of social media, these challenges are not unmanageable.

“Indeed, the relatively low volume of responses suggests the scale of the problem is more limited in scope than the original R v F&D case might have suggested.”

Buckland said: “We have had talks with Twitter, Facebook and Google and have set up a regime where we can … get rapid takedown [of material that is in contempt]. This will cover live trials. This seems to be the best solution to the problems.”

The agreement on removing prejudicial or unlawful social media posts, the report said, “will ensure that social media platforms are alerted to potentially unlawful or contemptuous posts and can review them quickly, thereby mitigating the risk to the administration of justice. This, however, may not remain the case if the issue is not addressed.”

Buckland added: “Every defendant in this country is entitled to a fair trial where a verdict is delivered based on the evidence heard in court.”

Whether the government has sufficient powers to enforce strict observance by social media companies of contempt of court rules is being considered by a separate government white paper on online harms. It is due to be published later this year by the Department for Digital, Culture, Media and Sport.

Buckland recommended that news organisations close comment sections under reports about live trials, to make it more difficult for readers to post potentially unlawful remarks. But he added: “I’m also acutely aware of people’s rights to express themselves in a democracy.

“Social media users must think before they post. The rules are the same as those for traditional media, and being found in contempt of court could result in a fine or up to two years in prison.

“I am pleased … that our respondents reported that this risk [to the criminal justice system] is relatively minor, and that they are already confident that they can mitigate the risk where it does arise. We need to guard against any future proliferation of the threat, however.”

The easy availability of vast quantities of information online, often relating to defendants in criminal trials, is not yet a reason to abandon the ban on jurors searching for material while hearing evidence, according to the solicitor general.

“Jurors are there to make decisions on the evidence before them,” Buckland said. “I don’t think it’s unreasonable to try to protect [strict contempt rules in trials] even at a time when information is more readily available. But we should keep a ready eye on that.”

One concern has been that when questions are typed into Google and other search engines, an autoprompt function sometimes reveals names supposed to be the subject of anonymity orders, such as victims of a sexual offence when the offender’s name is used as a search term. Google, the report said, has made improvements to its autocomplete tool to help prevent inappropriate results.