The British government is under the spotlight for its controversial "murder prediction" program, a project that aims to predict violent crime using personal data. It sounds like something lifted from a science fiction novel, but it is already real.
The bad news? Critics are raising alarms about privacy violations, algorithmic bias, and the ethics of pre-crime risk assessments.
What is this 'Murder Prediction' Tool?
The UK Ministry of Justice (MoJ) is reportedly developing a computer program intended to identify people likely to commit violent crimes, including murder. Originally called the "homicide prediction project," it has been renamed "sharing data to improve risk assessment," but it remains as contentious as ever.
According to the authorities, the scheme aims to improve public safety by identifying potential threats before they materialize.
However, critics worry that this technology could be misused, resulting in biased profiling and invasions of privacy. The predictive models are reportedly trained on personal details of people who have come into contact with the justice system, raising fears of privacy violations and discrimination.
How Does the 'Murder Prediction' Algorithm Work?
At the core of the program are machine learning algorithms and data analytics designed to analyze personal data on victims as well as individuals with past criminal histories.
The algorithms sort through data such as names, birthdates, gender, ethnicity, and previous criminal history. There are also reports that the system may incorporate sensitive health information, including mental health issues, drug and alcohol addiction, and history of suicide or attempted suicide, in a bid to make more informed predictions.
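The reporting does not describe the model's internals, so as a rough illustration only, here is a minimal sketch of how a simple actuarial risk score over features like these might be computed. Every feature name and weight below is invented for the example; nothing here reflects the MoJ's actual system.

```python
import math

# Hypothetical features and hand-picked weights (pure illustration).
FEATURE_WEIGHTS = {
    "prior_violent_convictions": 0.8,
    "age_at_first_offence": -0.05,  # earlier first offence raises the score
    "recent_probation_breaches": 0.6,
}
BIAS = -2.0  # baseline log-odds before any features are counted


def risk_score(person: dict) -> float:
    """Return a logistic risk score in (0, 1) from weighted features."""
    z = BIAS + sum(
        weight * person.get(feature, 0.0)
        for feature, weight in FEATURE_WEIGHTS.items()
    )
    return 1.0 / (1.0 + math.exp(-z))


example = {
    "prior_violent_convictions": 2,
    "age_at_first_offence": 17,
    "recent_probation_breaches": 1,
}
print(round(risk_score(example), 3))  # -> 0.343
```

Even this toy version shows why critics are uneasy: the score is only as fair as the features and weights feeding it, and any bias baked into the underlying records flows straight through to the output.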
According to The Guardian, the program aims to identify individuals who may be at heightened risk of committing violent offenses in the future. However, the use of data on individuals who have not been convicted of any crime is highly contentious.
Statewatch, a civil liberties organization, uncovered evidence indicating that the MoJ may be using information on individuals without criminal convictions, including data from victims of domestic abuse and people with mental illnesses.
The Controversy: Data Privacy and Possible Bias
Critics claim that the scheme would exacerbate the inequalities that already exist in the criminal justice system.
Sofia Lyall, a Statewatch researcher, warns that relying on algorithms to predict crime will amplify biases against minority ethnic and poor communities.
She also argues that these predictive models fail to capture the complexities of people's lives and instead draw on flawed data that may lead to discrimination.
The program's dependence on existing police records, which can themselves be biased, is another area of concern. Opponents argue that using such highly sensitive data without clear consent could violate privacy rights and single out vulnerable groups.
UK Government Insists It's Only at the Research Stage
The Ministry of Justice has defended the project, insisting that it is still at the research stage and that it uses data only from convicted offenders to conduct the study.
They point out that the project aims to enhance risk assessment for individuals on probation, not to forecast specific crimes such as murder.
The MoJ also says that people will be able to opt out of data collection where applicable.
Even with these assurances, the project remains extremely contentious. As the UK government continues to experiment with predictive technologies, the public and privacy groups will be watching closely to see whether this program delivers meaningful gains or yet more encroachments on personal liberty.