Algorithmic Risk Assessment Policing Models: Lessons from the Durham HART Model and ‘Experimental’ Proportionality

Marion Oswald, Jamie Grace, Sheena Urwin, Geoffrey C. Barnes

Research output: Contribution to journal › Article › peer-review

Abstract

As is common across the public sector, the UK police service is under pressure to do more with less, to target resources more efficiently and to take steps to identify threats proactively, for example under risk-assessment schemes such as ‘Clare’s Law’ and ‘Sarah’s Law’. Algorithmic tools promise to improve a police force’s decision-making and prediction abilities by making better use of data (including intelligence) from both inside and outside the force. This article uses Durham Constabulary’s Harm Assessment Risk Tool (HART) as a case study. HART is one of the first algorithmic models to be deployed by a UK police force in an operational capacity. Our article comments upon the potential benefits of such tools, explains the concept and method of HART, and considers the results of the first validation of the model’s use and accuracy. The article then critiques the use of algorithmic tools within policing from a societal and legal perspective, focusing in particular upon substantive common law grounds for judicial review. It considers a concept of ‘experimental’ proportionality to permit the use of unproven algorithms in the public sector in a controlled and time-limited way and, as part of a combination of approaches to combat algorithmic opacity, proposes ‘ALGO-CARE’, a guidance framework setting out some of the key legal and practical concerns that should be considered in relation to the use of algorithmic risk assessment tools by the police. The article concludes that, if the use of algorithmic tools in a policing context is to result in a ‘better’ outcome, that is to say, a more efficient use of police resources in a landscape of more consistent, evidence-based decision-making, an ‘experimental’ proportionality approach should be developed to ensure that new solutions from ‘big data’ can be found for criminal justice problems traditionally arising from clouded, non-augmented decision-making.
Finally, this article notes that there is a sub-set of decisions whose impact upon society and upon the welfare of individuals is too great for them to be influenced by an emerging technology; to such an extent, in fact, that they should be removed from the influence of algorithmic decision-making altogether.
Language: English
Pages: 223-250
Number of pages: 28
Journal: Information & Communications Technology Law
Volume: 27
Issue number: 2
DOI: 10.1080/13600834.2018.1458455
Publication status: Published - 3 Apr 2018

Keywords

  • Big Data
  • Algorithms
  • Machine Learning
  • Artificial Intelligence
  • Data Analytics
  • Policing
  • Crime
  • Black Box
  • Ethics
  • Transparency
  • Proportionality
  • Governance
  • Rule of Law
  • Law
  • Criminal justice
  • Risk assessment
  • Predictions

Cite this

Oswald, Marion; Grace, Jamie; Urwin, Sheena; Barnes, Geoffrey C. Algorithmic Risk Assessment Policing Models: Lessons from the Durham HART Model and ‘Experimental’ Proportionality. Information & Communications Technology Law. 2018; Vol. 27, No. 2, pp. 223-250.
@article{b5686ed901cc4519b163ab1f3840b7a5,
title = "Algorithmic Risk Assessment Policing Models: Lessons from the Durham HART Model and ‘Experimental’ Proportionality",
author = "Oswald, Marion and Grace, Jamie and Urwin, Sheena and Barnes, {Geoffrey C.}",
year = "2018",
month = "4",
day = "3",
doi = "10.1080/13600834.2018.1458455",
language = "English",
volume = "27",
pages = "223--250",
number = "2",
journal = "Information \& Communications Technology Law",

}

TY - JOUR

T1 - Algorithmic Risk Assessment Policing Models: Lessons from the Durham HART Model and ‘Experimental’ Proportionality

AU - Oswald, Marion

AU - Grace, Jamie

AU - Urwin, Sheena

AU - Barnes, Geoffrey C.

PY - 2018/4/3

Y1 - 2018/4/3

DO - 10.1080/13600834.2018.1458455

M3 - Article

JF - Information & Communications Technology Law

VL - 27

SP - 223

EP - 250

IS - 2

ER -