'Murder prediction' algorithms echo some of Stalin's most horrific policies — governments are treading a very dangerous line in pursuing them
Describing the horrors of communism under Stalin and others, Nobel laureate Aleksandr Solzhenitsyn wrote in his magnum opus, "The Gulag Archipelago," that "the line dividing good and evil cuts through the heart of every human being." Indeed, under the communist regime, citizens were removed from society before they could cause harm to it. This removal, which often entailed a trip to a labor camp from which many did not return, deprived the accused of due process. In many cases, the mere suspicion, or even a hint, that an act against the regime might occur was enough to earn a one-way ticket, with little to no recourse. The underlying premise was that officials knew when someone might commit a transgression. In other words, law enforcement knew where that line lay in people's hearts.
The U.K. government has decided to chase this chimera by investing in a program that seeks to preemptively identify who might commit murder. Specifically, the project uses government and police data to profile people and "predict" who has a high likelihood of committing murder. The program is currently in its research stage; similar programs are already being used to inform probation decisions.
Such a program, which reduces individuals to data points, carries enormous risks that might outweigh any gains. First, the output of such programs is not error-free, meaning it might wrongly implicate people. Second, we will never know whether a prediction was incorrect, because there is no way of observing something that doesn't happen: whether a murder was prevented, or would never have taken place at all, remains unanswerable. Third, the program can be misused by opportunistic actors to justify targeting people, especially minorities — the ability to do so is baked into a bureaucracy.
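The first risk is worse than it sounds, because of simple base-rate arithmetic. Murder is extremely rare (on the order of 1 in 100,000 people per year in England and Wales), so even a predictor that was right 99% of the time would wrongly flag thousands of people for every genuine future offender. Here is a minimal sketch of that arithmetic; the accuracy figures are illustrative assumptions, not known properties of the U.K. program.

```python
# Illustrative base-rate arithmetic: why a "murder prediction" tool
# mostly flags people who will never kill. All numbers are assumptions.

population = 1_000_000
base_rate = 1 / 100_000          # assumed annual rate of homicide offenders
sensitivity = 0.99               # assumed: flags 99% of true future offenders
specificity = 0.99               # assumed: clears 99% of non-offenders

true_offenders = population * base_rate                             # 10 people
flagged_true = true_offenders * sensitivity                         # ~9.9
flagged_false = (population - true_offenders) * (1 - specificity)   # ~10,000

precision = flagged_true / (flagged_true + flagged_false)
print(f"People wrongly flagged: {flagged_false:,.0f}")
print(f"Chance a flagged person is a real future offender: {precision:.2%}")
# Even at 99% accuracy, roughly 1 flag in 1,000 points at a real offender.
```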
Consider: the basis of a bureaucratic state rests on its ability to reduce human beings to numbers. In doing so, it offers the advantages of efficiency and fairness — no one is supposed to get preferential treatment. Regardless of a person's status or income, the DMV (DVLA in the U.K.) would treat the application for a driver's license or its renewal the same way. But mistakes happen, and navigating the labyrinth of bureaucratic procedures to rectify them is no easy task.
In the age of algorithms and artificial intelligence (AI), this problem of accountability and recourse in case of errors has become far more pressing.
Mathematician Cathy O'Neil has documented cases of schoolteachers being wrongfully fired over poor scores calculated by an AI algorithm. The algorithm, in turn, was fueled by what could be easily measured (e.g., test scores) rather than by the effectiveness of teaching (whether a poorly performing student improved significantly, or how much teachers helped students in non-quantifiable ways). The algorithm also glossed over whether grade inflation had occurred in previous years. When the teachers questioned the authorities about the performance reviews that led to their dismissal, the explanation they received amounted to "the math told us to do so" — even after the authorities admitted that the underlying math was not 100% accurate.
As such, the use of algorithms creates what journalist Dan Davies calls an "accountability sink" — it strips accountability by ensuring that no one person or entity can be held responsible, and it prevents the person affected by a decision from being able to fix mistakes.
This creates a twofold problem: an algorithm's estimates can be flawed, and because no one is held accountable, no one corrects them. No algorithm can be expected to be accurate all the time, but in principle it can be recalibrated with new data. That is an idealistic view that does not hold true even in science: scientists can resist updating a theory or schema, especially when they are heavily invested in it. Unsurprisingly, bureaucracies are no more willing to update their beliefs.
To use an algorithm in an attempt to predict who is at risk of committing murder is perplexing and unethical. Not only could it be inaccurate, but there is no way to know whether the system was right. In other words, if a potential future murderer is preemptively arrested, "Minority Report"-style, how can we know whether the person might have decided on their own not to commit murder? The U.K. government has yet to clarify how it intends to use the program, other than stating that the research is being carried out for the purposes of "preventing and detecting unlawful acts."
We're already seeing similar systems in use in the United States. In Louisiana, an algorithm called TIGER (short for "Targeted Interventions to Greater Enhance Re-entry") predicts whether an inmate might commit a crime if released, and that prediction then serves as a basis for parole decisions. Recently, a 70-year-old, nearly blind inmate was denied parole because TIGER predicted he had a high risk of re-offending.
In another case, which eventually went to the Wisconsin Supreme Court (State v. Loomis), an algorithm was used to guide sentencing. Challenges to the sentence, including a request for access to the algorithm to determine how it reached its recommendation, were denied on the grounds that the technology was proprietary. In essence, the system's technological opaqueness was compounded by legal secrecy, potentially undermining due process.
Equally, if not more troubling, the dataset underlying the program in the U.K. — initially dubbed the Homicide Prediction Project — consists of hundreds of thousands of people who never granted permission for their data to be used to train the system. Worse, the dataset — compiled from data held by the Ministry of Justice, Greater Manchester Police, and the Police National Computer — contains personal data, including, but not limited to, information on addiction, mental health, disabilities, previous instances of self-harm, and whether individuals had been victims of a crime. Indicators such as gender and race are also included.
These variables naturally increase the likelihood of bias against ethnic minorities and other marginalized groups, so the algorithm's predictions may simply reflect the policing choices of the past: predictive AI algorithms rely on statistical induction, projecting past (troubling) patterns in the data into the future, as the toy simulation below illustrates.
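Consider two areas with identical true offense rates, one of which was historically patrolled more heavily (all figures here are invented, not drawn from the U.K. program). Because arrest records reflect where police looked, not just where offenses occurred, a model trained on those records keeps sending police back to the same place.

```python
# Toy feedback loop: a predictor trained on biased arrest records
# re-projects past policing choices as "predictions". All figures invented.

true_offense_rate = {"area_A": 0.01, "area_B": 0.01}  # identical by construction
patrol_share = {"area_A": 0.8, "area_B": 0.2}         # historical policing bias

for year in range(3):
    # Recorded arrests depend on where police patrol, not just on offending.
    arrests = {a: true_offense_rate[a] * patrol_share[a] for a in patrol_share}
    total = sum(arrests.values())
    # "Predictive" step: allocate next year's patrols in proportion to past arrests.
    patrol_share = {a: arrests[a] / total for a in arrests}
    print(year, {a: round(s, 2) for a, s in patrol_share.items()})
# Prints 0.8/0.2 every year: despite equal true offense rates, the model
# perpetually "predicts" more crime where more policing happened before.
```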
In addition, the data overrepresent Black offenders from affluent areas, as well as people of all ethnicities from deprived neighborhoods. Past studies show that AI algorithms that make predictions about behavior work less well for Black offenders than they do for other groups. Such findings do little to allay genuine fears that racial minorities and other vulnerable groups will be unfairly targeted.
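"Works less well" has a concrete meaning: at the same risk threshold, a model's false alarms can fall far more heavily on one group than on another. The sketch below uses invented scores and labels, not data from any real system, to show how auditors measure that disparity.

```python
# Invented scores and labels illustrating unequal false-positive rates,
# the pattern that audits of U.S. risk-scoring tools have reported.
# Label 1 = later re-offended, 0 = did not.

groups = {
    "group_X": {"scores": [0.9, 0.8, 0.7, 0.6, 0.3, 0.2],
                "labels": [1, 1, 0, 0, 0, 0]},
    "group_Y": {"scores": [0.9, 0.8, 0.4, 0.3, 0.2, 0.1],
                "labels": [1, 1, 0, 0, 0, 0]},
}
THRESHOLD = 0.5  # anyone scoring at or above this is flagged "high risk"

for name, g in groups.items():
    flagged = [s >= THRESHOLD for s in g["scores"]]
    false_pos = sum(f and lab == 0 for f, lab in zip(flagged, g["labels"]))
    negatives = g["labels"].count(0)
    print(f"{name}: false-positive rate = {false_pos}/{negatives}"
          f" = {false_pos / negatives:.0%}")
# group_X: 2 of 4 non-re-offenders wrongly flagged (50%).
# group_Y: 0 of 4 (0%). Same threshold, same labels, very different harm.
```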
In his book, Solzhenitsyn informed the Western world of the horrors of a bureaucratic state grinding down its citizens in service of an ideal, with little regard for the lived experience of human beings. The state was almost always wrong (especially on moral grounds), but, of course, there was no mea culpa. Those who were wronged were simply collateral damage to be forgotten.
Now, half a century later, it is rather strange that a democracy like the U.K. is revisiting a horrific and failed project from an authoritarian Communist country as a way of "protecting the public." The public does need to be protected — not only from criminals but also from a "technopoly" that vastly overestimates the role of technology in building and maintaining a healthy society.