What you see is not what you get: Mafia Woman and Artificial Intelligence: a diagnostic enquiry of a lost femininity in the criminal justice system

  • Supuni Perera

    Student thesis: Doctoral Thesis

    Abstract

    Artificial Intelligence (AI) is a wide-ranging branch of computer science that enables machines to solve problems at scale. The integration of AI into the legal field, e.g. through predictive algorithms, raises questions about the fairness of these justice tools. AI-based systems are trained on available data, which has been shown to contain unacceptable levels of gender bias. The gender data gap affecting law enforcement and the criminal justice system matters because women are silenced when data is collected. This invisibility stems from a deep-rooted patriarchy in society, especially within criminal organisations. In this thesis, the role of the woman in the Italian mafia is examined as a case study to identify and discuss the consequences of her invisibility for AI integration.
    A multidisciplinary approach is proposed to study how a male-dominated structure has overshadowed women's presence, and to work towards less biased predictive systems. A mixed methodology was adopted, comprising Phase 1, a qualitative collection of female mafia profiles, and Phase 2, which quantified the gender bias through open- and closed-ended questions put to AI experts. The study then sought to reflect on and explore the project's outcomes with stakeholders and to confirm further research avenues.
    As a result of Phase 1, 30 profiles of mafia women were identified. Phase 2 recognised a pattern of bias, raising awareness among those leading legal-tech change of problems that could affect the informatisation of mafia trials. The reflective chapter confirmed the validity of the multidisciplinary approach used as the benchmark for tackling gender bias during data collection and algorithmic integration.
    The thesis suggests that a novel joint effort of socio-legal and technological expertise may be the preferred operational environment for addressing inequalities generated by automated legal tools. This research can serve as a springboard for analysing possible corrections to wider justice systems, towards a fairer functioning of AI in predictive justice.
    Date of Award: 27 Jul 2022
    Original language: English
    Awarding Institution
    • University of Winchester
    Supervisors: Emma Nottingham (Supervisor) & Tim Hall (Supervisor)

    Keywords

    • Predictive justice
    • Gender
    • Unconscious bias
    • Artificial intelligence
    • Judiciary
    • Mafia
