Algorithmic Risk Assessment Policing Models: Lessons from the Durham HART Model and ‘Experimental’ Proportionality

Marion Oswald, Jamie Grace, Sheena Urwin, Geoffrey C. Barnes

Research output: Contribution to journal › Article › peer-review


Abstract

As is common across the public sector, the UK police service is under pressure to do more with less, to target resources more efficiently and to identify threats proactively, for example under risk-assessment schemes such as ‘Clare’s Law’ and ‘Sarah’s Law’. Algorithmic tools promise to improve a police force’s decision-making and prediction abilities by making better use of data (including intelligence) from both inside and outside the force. This article uses Durham Constabulary’s Harm Assessment Risk Tool (HART), one of the first algorithmic models to be deployed by a UK police force in an operational capacity, as a case study. The article comments upon the potential benefits of such tools, explains the concept and method of HART and considers the results of the first validation of the model’s use and accuracy. It then critiques the use of algorithmic tools within policing from a societal and legal perspective, focusing in particular upon substantive common law grounds for judicial review. It considers a concept of ‘experimental’ proportionality that would permit the use of unproven algorithms in the public sector in a controlled and time-limited way, and, as part of a combination of approaches to combat algorithmic opacity, proposes ‘ALGO-CARE’, a guidance framework covering some of the key legal and practical concerns that should be considered in relation to the use of algorithmic risk-assessment tools by the police. The article concludes that if the use of algorithmic tools in a policing context is to result in a ‘better’ outcome, that is to say, a more efficient use of police resources in a landscape of more consistent, evidence-based decision-making, then an ‘experimental’ proportionality approach should be developed so that new solutions from ‘big data’ can be found for criminal justice problems traditionally arising from clouded, non-augmented decision-making. Finally, the article notes that there is a sub-set of decisions whose impact upon society and upon the welfare of individuals is too great for them to be influenced by an emerging technology, to such an extent, in fact, that they should be removed from the influence of algorithmic decision-making altogether.
Original language: English
Pages (from-to): 223-250
Number of pages: 28
Journal: Information & Communications Technology Law
Volume: 27
Issue number: 2
Publication status: Published - 3 Apr 2018

Keywords

  • Big Data
  • Algorithms
  • Machine Learning
  • Artificial Intelligence
  • Data Analytics
  • Policing
  • Crime
  • Black Box
  • Ethics
  • Transparency
  • Proportionality
  • Governance
  • Rule of Law
  • Law
  • Criminal justice
  • Risk assessment
  • Predictions
