First force to try predictive policing quietly cancelled contract


Move to victims-based model sees change of direction for Kent Police.

In the eye of the beholder: Kent weighs up options after pulling the plug on predictive policing scheme

Date - 27th November 2018
By - Nick Hudson - Police Oracle
A pioneering force’s switch of emphasis in its policing model has brought the curtain down on a five-year project to predict and prevent crime.

Kent Police quietly binned PredPol eight months ago – after becoming the first force in England and Wales to introduce a system that used historical data and an algorithm to help officers identify public spaces that would benefit from patrols.

The force is now evaluating its options for predictive policing – looking at the potential of going it alone, or with a partner – as it saves an estimated £100,000 a year on the cancelled contract with the California-based company.

The surprising development comes as campaigners call for urgent government action to stop, limit or safeguard against the rapid spread of police surveillance technologies they claim are infringing on individuals’ fundamental rights.

Independent civil liberties campaign group Big Brother Watch told the Commons’ Home Affairs Select Committee’s Policing for the Future inquiry in the summer that “police forces naturally seeking to acquire technology they believe will improve public safety or save them time and money” were “outpacing the law”.

But the decision by the Kent force to scrap the scheme – despite the innovation having produced a “good record of predicting where crimes are likely to take place” – was entirely due to a change in direction, not brought on by any outside pressure.

Kent Superintendent John Phillips told Police Oracle: “We became the first force in England and Wales to introduce predictive policing in December 2013 and using historical data and an algorithm it allowed officers to identify public spaces that would benefit from patrols.

“While it did not predict crime, it was used as a preventative tool and supported the force’s focus at the time on neighbourhood policing.

“The launch of a new policing model that places victims and witnesses at its centre has led Kent Police to evaluate alternative options which will support a focus on both traditional and emerging crime types.

“Therefore Kent Police has not renewed its contract with the current provider of predictive policing.”

The force – which rolled the technology out across the region in December 2013 after a successful four-month trial in Medway resulted in a six per cent drop in street violence – admitted it stopped using the system on March 31 this year, following an extensive review of its cost and overall effectiveness.

Several forces in the UK – Greater Manchester, West Midlands, West Yorkshire and the Metropolitan Police – have been following Kent’s lead in trialling predictive policing tools.

The Met says it does not use person-specific data to make predictions while Durham Constabulary has worked with researchers to develop a programme that predicts the likelihood of an arrested individual re-offending, based on factors including previous criminal history, age and postcode.

PredPol chief executive Brian MacDonald says his company is talking to other UK forces and hopeful that Kent Police – which had not been using the most recent version of its system, which tracks officers using GPS to make sure they follow the paths suggested by the algorithm – is just “on pause” at this juncture.

The doubters on predictive policing – such as Big Brother Watch – are seeking:

- an immediate end to UK police use of automated facial recognition in public spaces;
- a policy of automatic deletion of the custody images of unconvicted individuals from police databases, and the removal of all historic images of unconvicted individuals;
- safeguards on police use of AI, algorithms and other automated systems to protect fundamental rights, prevent the use of AI to make decisions which engage fundamental rights, and restrict the use of predictive policing systems which have the potential to reinforce discriminatory policing; and
- a ban on the “indiscriminate tracking and monitoring of UK citizens” via the national ANPR network.

The organisation said in a statement to Parliament: “There is an increasing trend of police forces acquiring, developing, and operationally deploying new, intrusive, and untested technologies that are likely to be incompatible with people’s fundamental rights.”

View on Police Oracle
