
Written evidence submitted by Assistant Commissioner Neil Basu QPM, National Lead for Counter-Terrorism Policing (CEX0001)

[Note: This evidence has been redacted by the Committee. [***] represents redacted text.]

Dear Yvette,

1. During my oral evidence at the Home Affairs Select Committee session on Counter-Extremism and Counter-Terrorism (HC 784), you challenged the lack of action by Counter Terrorism Policing (CTP) against the social media provider BitChute and its platforming of National Action. I will open this response with a transcript of the relevant questions and a brief explanation of the role of the Counter Terrorism Internet Referral Unit (CTIRU).

Questions

Chair: Why has there been no action against the promotion of National Action on BitChute?

Chair: It would be really helpful if you could look further into the issue about the promotion of National Action and write to us on that.

Summary

2. This response is directly relevant to the wider issues of hateful extremism, so eloquently described by Sara Khan in the same HASC session. Sara rightly identified a limitation of the current system: no legal intervention is available against the publication of extremist content that falls below the threshold of inciting violence or breaching Terrorism Act statutes. As such, police power to intervene directly has clearly defined limits. Also relevant are the terms and conditions of the platform and the resources it can bring to bear to review and police its own content. In summary, National Action material on platforms such as BitChute would remain for a number of reasons:

1. The material does not breach the legal threshold, e.g. material with enough ambiguity that National Action is merely implied rather than explicit.

2. The material does breach the legal threshold, but there has been a delay due to the technical challenges of identifying and removing all the illegal content (expanded on below).

3. The material breaches the platform's terms and conditions, but the platform is not adequately resourced to find and remove the content efficiently.

3. To answer the question more directly, action has been taken to remove National Action and other material that breaches the law (2020 figures are set out below). However, it is difficult to be sure that all such material has been found and removed, and this is largely a problem of the platforms’ technical capability to remove content at scale, given the quantity of material, the rapidity of upload and the ways material can be hidden.

Context - CTIRU

4. The core business of the CTIRU involves identifying and assessing online material, believed to be hosted in the UK, which breaches Terrorism Act (TACT) legislation. However, by its nature the work often sits close to a subjective threshold, and the CTIRU can identify material that falls into what is often described as a ‘grey area’. In general, the CTIRU will only refer material to companies for removal where it breaches TACT legislation or other legislation, such as the public order offences referenced below.

5. The CTIRU, generally via public referrals, assesses content which on occasion falls short of terrorism but breaches other legislation – for example, sections of the 1986 Public Order Act (including racially or religiously aggravated offences as defined by S1 of the 1998 Crime & Disorder Act). These sections are, primarily:

• Section 4: content likely to cause fear of, or to provoke, immediate violence;

• Section 4A: content intended to cause harassment, alarm or distress; or

• Section 5: content likely to cause harassment, alarm or distress (threatening or abusive words or behaviour only).

6. If content is assessed by the CTIRU to breach such legislation, and the sender is verifiably UK based, it is referred to the National Hate Crime Hub for that team to assess and report to the appropriate UK police service. If the content breaches UK public order legislation but the sender is not UK based, the content will be assessed against the host platform’s own terms and conditions. If the CTIRU assesses that it breaches the company’s T&Cs, it will be referred for removal.

BitChute and the CTIRU

7. BitChute was launched in 2017 and is owned and operated by Richard Jones and Ray Vahey. The company is registered in the UK and at Companies House.

8. BitChute is lauded as an alternative to platforms such as YouTube: it advocates free speech and people’s right to express their opinions without undue censorship. This stance has been embraced by a number of groups, including Daesh and Right Wing Terrorist groups such as National Action.

9. Whilst BitChute advocates free speech and attracts those who do not wish to be moderated in the way more mainstream platforms moderate, it does have a full set of Terms & Conditions, which in turn contain Community Guidelines that all those accessing the site must adhere to. Failure to adhere to these can result in posts or accounts being removed.

10. The Community Guidelines contain the following sections:

• Respect and Decency

• Personal Responsibility

• Compliance with the Law

• Content Sensitivity

• Prohibited Content

• Platform Misuse

11. Within the ‘Prohibited Content’ section, content from all organisations on the UK proscribed organisations list and the UN’s ISIL and Al-Qaida sanctions list is prohibited on the site.

12. Whilst BitChute has deployed moderation tools, these have had limited success. This, coupled with the fact that those who publish content on the site may not title or tag their content with information that can assist in identifying prohibited content, means that fast-time moderation and removal is frustrated.

13. Whilst the platform faces challenges in identifying and removing content, the CTIRU has found BitChute cooperative when TACT-breaching content has been referred to it directly [***]. The CTIRU continues to maintain regular dialogue with BitChute to understand its business and to help the company fully understand the needs of the police, especially those of Counter-Terrorism Policing.

[***]

14. Being UK based and subject to UK law means that BitChute is more likely to respond to requests to remove content under TACT or other legislation; this is borne out by the number of removals made as a result of CTIRU requests. By virtue of being based in the UK, BitChute will also be subject to the regulation to be introduced by the forthcoming Online Harms legislation.

15. Whilst this regulation may help to ensure that terrorist and extremist material is removed from the platform in a timely fashion, depending on how any future regulator engages with the platform, BitChute could be persuaded to move its operating base out of the UK to avoid such regulation, which may present a risk in itself.

Legislation

16. UK legislation does not currently provide the ability to compel a UK company to remove material unless there is a clearly defined crime. It therefore remains the choice of companies how far they moderate themselves where content is extreme but not illegal. I do support the need for a debate on the threshold between illegal material and lawful but extremist material capable of causing harm. There is an argument that the threshold is too high, and we are cooperating with the Counter Extremism Commissioner by supplying case studies that we think illustrate this point. CTP continues to work closely with the Home Office and DCMS towards the Online Harms legislation, to ensure that any future regulator reflects the operational need and public position.

Neil Basu

Assistant Commissioner
National Lead for Counter-Terrorism Policing

January 2021