
ABSTRACT. This research explores algorithmic bias in predictive policing within smart governance systems. Predictive policing uses algorithms to analyze data and forecast likely crime locations or suspects, but these systems often produce inequitable outcomes, particularly for minority communities. The study aims to understand how bias arises, how it affects fairness, and what remedies are available. A survey of 52 respondents showed that many believe predictive policing lacks transparency and unfairly targets certain groups. Most respondents think that bias in data and algorithms harms minority communities, and they report low trust in the fairness of AI-driven decisions in policing. Most also want more human supervision and frequent reviews to make such systems just. The study found that unfair predictions stem from biased data, much of it outdated and shaped by past discriminatory practices, which places certain groups under disproportionately close watch. The absence of clear explanations of how an algorithm works further erodes trust. The research proposes several remedies: bias can be reduced by using more representative and relevant data; trust can be strengthened by making algorithms more interpretable; human involvement in decisions provides better context; and community input together with regular audits helps keep systems fair. The results show that bias is a central obstacle to equitable policing, and that fair mechanisms can improve public safety without discriminating against the communities they serve. This study demonstrates that predictive policing requires ethical principles and safeguards to ensure that smart governance systems are fair and credible. Once bias is corrected, predictive policing has the potential to benefit everyone and rebuild public trust.
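The auditing remedy mentioned in the abstract can be illustrated with a minimal sketch. Note that this example is not drawn from the paper itself: the chosen metric (the disparate-impact ratio, with the common "80% rule" threshold) and the synthetic data are assumptions made purely for illustration of what a regular fairness audit might compute.

```python
# Illustrative fairness audit: disparate-impact ratio between two groups.
# All data below is synthetic; the paper does not prescribe a specific
# audit metric, and the 0.8 threshold is a common convention, not the authors'.

def selection_rate(flags):
    """Fraction of individuals flagged by the predictive system."""
    return sum(flags) / len(flags)

def disparate_impact(flags_group_a, flags_group_b):
    """Ratio of group A's selection rate to group B's.

    Values well below 1.0 (conventionally below 0.8) suggest that
    group B is flagged disproportionately often relative to group A.
    """
    return selection_rate(flags_group_a) / selection_rate(flags_group_b)

# Synthetic example: 1 = flagged for increased patrols, 0 = not flagged.
minority_flags = [1, 1, 1, 0, 1, 1, 0, 1, 1, 1]   # 80% flagged
majority_flags = [0, 1, 0, 0, 1, 0, 0, 1, 0, 0]   # 30% flagged

ratio = disparate_impact(majority_flags, minority_flags)
print(f"Disparate-impact ratio: {ratio:.2f}")  # 0.30 / 0.80 = 0.38
```

A periodic audit of this kind, run on real deployment data and reviewed by humans, is one concrete way to operationalize the "regular auditing" the study recommends.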

Keywords: algorithmic bias; predictive policing; smart governance; equity; fairness; transparency; human oversight

How to cite: Abdullah, F., Drugău Constantin, A.-L., and AlAkoum, A. (2023). “Algorithmic Bias in Smart Governance Systems: Evaluating Equity in Predictive Policing,” Smart Governance 2(3): 37–51. doi: 10.22381/sg2320233.

Received 15 April 2023 • Received in revised form 24 September 2023
Accepted 26 September 2023 • Available online 29 September 2023

1Universiti Teknologi MARA, Shah Alam, Selangor, Malaysia.
2Bucharest University of Economic Studies, Bucharest, Romania (corresponding author).

