The Socio-Economic Impacts of Predictive Policing on Minority Communities and Potential Solutions
DOI: https://doi.org/10.58445/rars.2024

Keywords: recidivism, AI, algorithms, predictive policing, racial bias

Abstract
Black defendants are 77% more likely than white defendants to be labeled high-risk for violent recidivism, even after accounting for prior offenses, age, gender, and other factors. This disparity underscores the urgency of addressing racial bias in the predictive policing algorithms used in the criminal justice system. Although predictive policing technologies were designed to make law enforcement more efficient, they have quietly embedded systemic racial biases that harm minority communities. This paper examines how predictive policing models, by relying on historically biased data, contribute to cycles of over-policing and socio-economic inequality. The analysis documents the consequences of wrongful arrests, economic hardship, and psychological trauma, and calls for urgent reform. It then outlines a way forward toward fairer and more ethical use of predictive technologies, arguing that improving data quality, integrating restorative justice practices, and establishing robust oversight policies can help ensure policing technologies serve all communities equitably.
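The feedback loop the abstract describes (skewed historical records attract more patrols, which generate more recorded incidents, which further skew the data) can be illustrated with a small simulation. The sketch below is a hypothetical toy model loosely in the spirit of the urn models discussed by Ensign et al. (2018) and Lum and Isaac (2016); it is not the analysis performed in this paper, and the district names, crime rates, starting counts, and hot-spot weighting are all illustrative assumptions.

```python
# Toy simulation of the over-policing feedback loop described above.
# Every name and parameter here is an illustrative assumption.
import random

random.seed(0)

true_rate = {"district_a": 0.10, "district_b": 0.10}  # identical underlying crime rates
recorded = {"district_a": 60, "district_b": 40}       # historically skewed arrest records
PATROLS_PER_DAY = 50
DAYS = 365

for _ in range(DAYS):
    # "Predictive" step: concentrate patrols on the predicted hot spot by
    # weighting each district by the square of its recorded count
    # (a crude stand-in for hot-spot targeting).
    weights = {d: recorded[d] ** 2 for d in recorded}
    total_w = sum(weights.values())
    for district, rate in true_rate.items():
        patrols = round(PATROLS_PER_DAY * weights[district] / total_w)
        # Crime enters the data set only where officers are present to observe it.
        observed = sum(random.random() < rate for _ in range(patrols))
        recorded[district] += observed

share_a = recorded["district_a"] / sum(recorded.values())
print(f"district_a share of recorded crime after one year: {share_a:.1%}")
# Although both districts have the same true crime rate, the district that
# started with more records attracts ever more patrols, so the recorded gap widens.
```

Running this toy model shows the initial 60/40 gap in recorded crime growing over time even though the underlying rates never differ, which is the sense in which biased historical data can reproduce and amplify over-policing.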
References
Acemoglu, D., Johnson, S., & Robinson, J. A. (2001). The Colonial Origins of Comparative Development: An Empirical Investigation. The American Economic Review, 91(5), 1369–1401.
Adensamer, A., & Klausner, L. D. (2021). “Part Man, Part Machine, All Cop”: Automation in Policing. Frontiers in Artificial Intelligence, 4. https://doi.org/10.3389/frai.2021.655486
Angwin, J., Larson, J., Mattu, S., & Kirchner, L. (2016, May 23). Machine Bias. Retrieved from ProPublica website: https://www.propublica.org/article/machine-bias-risk-assessments-in-criminal-sentencing
Azavea. (2014). NYC_0002305. Retrieved from https://www.brennancenter.org/sites/default/files/NYC_0002305_HunchlabPromoOverview%20-%20Copy.pdf
Bagaric, M., Svilar, J., Bull, M., Hunter, D., & Stobbs, N. (2022). The Solution to the Pervasive Bias and Discrimination in the Criminal Justice System: Transparent and Fair Artificial Intelligence. American Criminal Law Review, 59, 95. Retrieved from https://heinonline.org/HOL/LandingPage?handle=hein.journals/amcrimlr59&div=7&id=&page=
Bellamy, R. K. E., Mojsilovic, A., Nagar, S., Ramamurthy, K. N., Richards, J., Saha, D., … Mehta, S. (2019). AI Fairness 360: An extensible toolkit for detecting and mitigating algorithmic bias. IBM Journal of Research and Development, 63(4/5), 4:1–4:15. https://doi.org/10.1147/jrd.2019.2942287
Berk, R. A. (2020). Artificial Intelligence, Predictive Policing, and Risk Assessment for Law Enforcement. Annual Review of Criminology, 4(1). https://doi.org/10.1146/annurev-criminol-051520-012342
Bor, J., Venkataramani, A. S., Williams, D. R., & Tsai, A. C. (2018). Police killings and their spillover effects on the mental health of black Americans: a population-based, quasi-experimental study. The Lancet, 392(10144), 302–310. https://doi.org/10.1016/s0140-6736(18)31130-9
Burn, D., Crawford, A., & Gray, E. (2018). Enhancing the Use of Restorative Justice within Policing. Retrieved from https://sites.manchester.ac.uk/n8-policing-research-partnership/wp-content/uploads/sites/315/2021/10/Restorative-justice-findings-final-Jan-2018.pdf
Busuioc, M. (2022, August 12). AI algorithmic oversight: new frontiers in regulation. Retrieved from www.elgaronline.com website: https://www.elgaronline.com/edcollchap-oa/book/9781839108990/book-part-9781839108990-43.xml
Castets-Renard, C. (2021, July 20). Human Rights and Algorithmic Impact Assessment for Predictive Policing. Retrieved from papers.ssrn.com website: https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3890283
College of Policing. (2022). Restorative justice: Evidence briefing. Retrieved from https://assets.college.police.uk/s3fs-public/2022-01/Restorative-justice-evidence-briefing.pdf
Ensign, D., Friedler, S. A., Neville, S., Scheidegger, C., & Venkatasubramanian, S. (2018, January 21). Runaway Feedback Loops in Predictive Policing. Retrieved from proceedings.mlr.press website: https://proceedings.mlr.press/v81/ensign18a.html
Federal Bureau of Investigation. (2018). Crime in the U.S. Retrieved from FBI website: https://ucr.fbi.gov/crime-in-the-u.s
Tréguer, F. (2021). Doing Action-Research on Algorithmic Urban Policing: IA-Powered Surveillance, Elusive Democratic Oversight. Hal.science. https://hal.science/hal-03540934
Gramlich, J. (2024, April 24). What the data says about crime in the U.S. Retrieved from Pew Research Center website: https://www.pewresearch.org/short-reads/2024/04/24/what-the-data-says-about-crime-in-the-us/
Hao, K. (2019, February 4). This is how AI bias really happens—and why it’s so hard to fix. Retrieved from MIT Technology Review website: https://www.technologyreview.com/2019/02/04/137602/this-is-how-ai-bias-really-happensand-why-its-so-hard-to-fix/
Jackson, E., & Mendoza, C. (2020). Setting the Record Straight: What the COMPAS Core Risk and Need Assessment Is and Is Not. Harvard Data Science Review, 2(1). https://doi.org/10.1162/99608f92.1b3dadaa
Jowaheer, Y. (2018). Effect of Implicit and Explicit Prejudice on Perceptions of Drug Users of Different Races. Retrieved from https://core.ac.uk/download/pdf/235418241.pdf#page=4.09
Lanni, A. (2022). Community-Based and Restorative-Justice Interventions to Reduce Over-Policing. American Journal of Law and Equality, 2, 69–84. https://doi.org/10.1162/ajle_a_00040
Lapowsky, I. (2018, May 22). How the LAPD Uses Data to Predict Crime. Retrieved from Wired website: https://www.wired.com/story/los-angeles-police-department-predictive-policing/
Larson, J., Mattu, S., Kirchner, L., & Angwin, J. (2016, May 23). How We Analyzed the COMPAS Recidivism Algorithm. Retrieved from ProPublica website: https://www.propublica.org/article/how-we-analyzed-the-compas-recidivism-algorithm
Looney, A., & Turner, N. (2018, March 14). Work and opportunity before and after incarceration. Retrieved from Brookings website: https://www.brookings.edu/articles/work-and-opportunity-before-and-after-incarceration/
Lum, K., & Isaac, W. (2016). To predict and serve? Significance, 13(5), 14–19. https://doi.org/10.1111/j.1740-9713.2016.00960.x
Management Concepts. (2024, July 26). The Ethics of Data Collection and Analytics. Retrieved from Management Concepts website: https://managementconcepts.com/resource/the-ethics-of-data-collection-and-analytics/
Meijer, A., & Wessels, M. (2019). Predictive policing: Review of benefits and drawbacks. International Journal of Public Administration, 42(12), 1031–1039. https://doi.org/10.1080/01900692.2019.1575664
Minocher, X., & Randall, C. (2020). Predictable Policing: New Technology, Old Bias, and Future Resistance in Big Data Surveillance. Convergence: The International Journal of Research into New Media Technologies, 26(5-6). https://doi.org/10.1177/1354856520933838
National Center on Restorative Justice. (2024, January 23). Restorative Approaches to Policing Institutes. Retrieved November 1, 2024, from National Center on Restorative Justice website: https://ncorj.org/institutes/restorative-approaches-to-policing-institutes/
Nazer, L., Zatarah, R., Waldrip, S., Janny, X. C. K., Moukheiber, M., Khanna, A. K., … Mathur, P. (2023). Bias in artificial intelligence algorithms and recommendations for mitigation. PLOS Digital Health, 2(6), e0000278. https://doi.org/10.1371/journal.pdig.0000278
Northpointe. (2015). Practitioner’s Guide to COMPAS Core. Retrieved from https://s3.documentcloud.org/documents/2840784/Practitioner-s-Guide-to-COMPAS-Core.pdf
Ntoutsi, E., Fafalios, P., Gadiraju, U., Iosifidis, V., Nejdl, W., Vidal, M., … Broelemann, K. (2020). Bias in Data‐driven Artificial Intelligence systems—An Introductory Survey. WIREs Data Mining and Knowledge Discovery, 10(3). https://doi.org/10.1002/widm.1356
O’Donnell, R. (2019). Challenging Racist Predictive Policing Algorithms Under the Equal Protection Clause. New York University Law Review, 94(3). Retrieved from https://www.nyulawreview.org/wp-content/uploads/2019/06/NYULawReview-94-3-ODonnell.pdf
Quillian, L., Pager, D., Hexel, O., & Midtbøen, A. H. (2017). Meta-analysis of field experiments shows no change in racial discrimination in hiring over time. Proceedings of the National Academy of Sciences, 114(41), 10870–10875. https://doi.org/10.1073/pnas.1706255114
Rose, E., Card, D., Mccrary, J., Kline, P., Li, N., Nichols, A., … Yagan, D. (2018). The Effects of Job Loss on Crime: Evidence From Administrative Data. Retrieved from https://ekrose.github.io/files/jobloss_crime_ekr_vf.pdf
Sankin, A., & Mattu, S. (2023, October 2). Predictive Policing Software Terrible at Predicting Crimes. Retrieved from The Markup website: https://themarkup.org/prediction-bias/2023/10/02/predictive-policing-software-terrible-at-predicting-crimes
Schwartz, G. L., & Jahn, J. L. (2020). Mapping fatal police violence across U.S. metropolitan areas: Overall rates and racial/ethnic inequities, 2013-2017. PLOS ONE, 15(6). https://doi.org/10.1371/journal.pone.0229686
Shapiro, A. (2021). Accountability and indeterminacy in predictive policing. Routledge EBooks, 185–213. https://doi.org/10.4324/9780429265365-10
Sholademi, D., & Raji, I. (2024). Predictive Policing: The Role of AI in Crime Prevention. International Journal of Computer Applications Technology and Research. https://doi.org/10.7753/ijcatr1310.1006
The Importance of the Community in Restorative Justice. (2023, March 23). Retrieved from Restorative Justice 101 website: https://restorativejustice101.com/the-importance-of-community-in-restorative-justice/
United States Department of Commerce. (2022). Commerce Data Ethics Framework 2022. Retrieved from https://www.commerce.gov/sites/default/files/2023-02/DOC-Data-Ethics-Framework.pdf
UNSW Sydney. (2024, May 29). Data Ethics: Examples, Principles and Uses | UNSW Online. Retrieved from studyonline.unsw.edu.au website: https://studyonline.unsw.edu.au/blog/data-ethics-overview
Wang, L. (2022, December 22). New data: Police use of force rising for Black, female, and older people; racial bias persists. Retrieved from Prison Policy Initiative website: https://www.prisonpolicy.org/blog/2022/12/22/policing_survey/
Weltz, J., & Hardin, J. (2019). Over-Policing and Fairness in Machine Learning. Retrieved from https://pages.pomona.edu/~jsh04747/Student%20Theses/justin_weltz_2019.pdf
Zaroff, A. (2022). AI-based Automated Decision Making: An investigative study on how it impacts the rule of law, and the case for regulatory safeguards. Retrieved from https://lup.lub.lu.se/luur/download?func=downloadFile&recordOId=9104598&fileOId=9104605
License
Copyright (c) 2024 Hritvik Singhvi
This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.