The Dutch Crime Anticipation System (CAS) has been embraced as a step towards forward-looking crime prevention, but it has also been criticized for its potential discriminatory effects in both design and application. This project examines whether the design and use of CAS carry risks of discrimination and what interventions could mitigate those risks.
Predictive policing tools have proliferated over the last two decades in response to growing calls for more effective and objective policing. Such tools generate automated predictions about who will commit a crime or when and where crimes will occur. They have been welcomed as a means of forward-looking crime prevention, but critics warn that both their design and their application can have discriminatory effects.
This project is carried out by the Department of Anthropology and Development Studies (CAOS) and the Interdisciplinary Hub for Security, Privacy and Data Governance (iHub), both at Radboud University.
In the PhD project, we therefore focus on questions such as: How is CAS designed to work, and how does it work as a (cultural) system in practice? Do the design and application of CAS bring risks of discrimination? Does current regulation sufficiently protect people against such risks? What legal and technical interventions are possible to mitigate the risk of discrimination? Can simulations predict the long-term effects of such predictive systems? These questions call for a combination of anthropology, law and computer science, disciplines that are reflected in the team of two CAOS and two iHub supervisors.