UK police will start using AI to assign threat levels to suspects, ultimately helping decide whether they should be kept in custody


Police officials in Durham, U.K., are slated to roll out an artificial intelligence system designed to help authorities determine whether or not a suspect should be kept in police custody. The system, known as the Harm Assessment Risk Tool (HART), classifies suspects as low-, medium-, or high-risk offenders. Police have already tested the system, which was trained on five years of offender profiles and uses a suspect’s criminal history to make its classification.

In order to facilitate machine learning, experts fed the system data taken from Durham police records between 2008 and 2012, and the researchers then tested the system on 2013 cases. According to the results, HART was 98 percent accurate in identifying low-risk suspects and 88 percent accurate in identifying high-risk offenders. According to Sheena Urwin, Head of Criminal Justice at Durham Constabulary, suspects with no offending history had lower odds of being classified as high risk. Urwin also projected that the system would be launched within the next two to three months.
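
Durham Constabulary has not published HART’s implementation, but coverage has described it as a random forest model built from historical custody records. The sketch below is purely illustrative under that assumption: the features, data, and scikit-learn toolkit are invented, and only the train-then-validate split by year and the per-class accuracy figures mirror what was reported.

```python
# Illustrative sketch only -- not Durham's actual system. It assumes a
# random-forest classifier, invented numeric features, and scikit-learn.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import confusion_matrix

rng = np.random.default_rng(0)

# Stand-in for 2008-2012 custody records: invented features
# (e.g. age, number of prior offences) and a labelled outcome
# (0 = low, 1 = medium, 2 = high risk).
X_train = rng.normal(size=(5000, 6))
y_train = rng.integers(0, 3, size=5000)

# Stand-in for the 2013 validation cohort.
X_test = rng.normal(size=(1000, 6))
y_test = rng.integers(0, 3, size=1000)

model = RandomForestClassifier(n_estimators=200, random_state=0)
model.fit(X_train, y_train)

# Per-class accuracy, the kind of figure behind the reported 98 percent
# (low risk) and 88 percent (high risk) results: the fraction of each
# true class the model labels correctly. With random placeholder data
# the scores will hover around chance.
cm = confusion_matrix(y_test, model.predict(X_test))
per_class_acc = cm.diagonal() / cm.sum(axis=1)
for label, acc in zip(["low", "medium", "high"], per_class_acc):
    print(f"{label}-risk suspects correctly identified: {acc:.1%}")
```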

However, outside experts expressed concerns over the possibility of racial bias in the system. “The lack of transparency around this system is disturbing. If the police want to maintain public trust as technology develops, they need to be up front about exactly how it works. With people’s rights and safety at stake, Durham Police must open up about exactly what data this AI is using,” political scientist Professor Cary Coglianese was quoted as saying by BBC.com.

AI may bolster unwanted cultural stereotypes

Reports on Durham Police’s plans to roll out the AI system spurred concerns about human prejudices, such as racism and sexism, creeping into the technology. A recent study published in Science demonstrated that AI systems were picking up unwanted cultural stereotypes. As part of the study, the researchers examined millions of words found online and assessed how semantically close different terms were to one another. This, according to the experts, is how automatic translators use machine learning to determine a language’s meaning.
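
To make “semantic closeness” concrete, here is a minimal sketch: each word is represented as a vector (an embedding), and closeness is measured as the cosine of the angle between vectors. The tiny hand-made vectors below are placeholders; the study worked with large pretrained embeddings such as GloVe, which have hundreds of dimensions learned from web text.

```python
# Minimal illustration of "semantic closeness" between word embeddings.
# The 4-dimensional vectors below are invented placeholders.
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine of the angle between two vectors: near 1 = similar, near 0 = unrelated."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

embeddings = {
    "engineer": np.array([0.9, 0.1, 0.3, 0.0]),
    "career":   np.array([0.8, 0.2, 0.4, 0.1]),
    "family":   np.array([0.1, 0.9, 0.2, 0.3]),
}

print(cosine_similarity(embeddings["engineer"], embeddings["career"]))  # relatively high
print(cosine_similarity(embeddings["engineer"], embeddings["family"]))  # relatively low
```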

The study revealed that male names were more closely associated with career-oriented terms, while female names were associated with family-oriented terms. The research team also found that male names were closely associated with maths and the sciences, while female names were strongly associated with artistic words. In addition, words for everyday subjects such as musical instruments and flowers were grouped with pleasant terms, while words for weapons and insects were grouped with unpleasant terms. Furthermore, word embeddings for European-American names were associated with pleasant terms, while word embeddings for African-American names were associated with unpleasant terms.
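
The associations above are typically quantified with a word-embedding association test. The rough sketch below shows the core idea under that assumption: a target word’s bias score is its mean similarity to one attribute set (for example, career words) minus its mean similarity to another (family words). All vectors and word lists here are invented placeholders, not the study’s data.

```python
# Rough sketch of a differential association score. Vectors are random
# placeholders; in the real test they come from pretrained embeddings.
import numpy as np

def cosine(a, b):
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def association(target, attr_a, attr_b):
    """Mean similarity of `target` to set A minus its mean similarity to set B."""
    return (np.mean([cosine(target, v) for v in attr_a])
            - np.mean([cosine(target, v) for v in attr_b]))

rng = np.random.default_rng(1)
career_words = [rng.normal(size=50) for _ in range(5)]  # stand-ins for "career", "salary", ...
family_words = [rng.normal(size=50) for _ in range(5)]  # stand-ins for "home", "parents", ...
name_vector = rng.normal(size=50)                       # stand-in for a given first name

# Positive score: the name sits closer to career terms; negative: closer to family terms.
print(association(name_vector, career_words, family_words))
```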

“Our work has implications for AI and machine learning because of the concern that these technologies may perpetuate cultural stereotypes. Our findings suggest that if we build an intelligent system that learns enough about the properties of language to be able to understand and produce it, in the process it will also acquire historical cultural associations, some of which can be objectionable. Already, popular online translation systems incorporate some of the biases we study. Further concerns may arise as AI is given agency in our society,” the researchers wrote.

“Our work suggests that behaviour can be driven by cultural history embedded in a term’s historic use. Such histories can evidently vary between languages. Before providing an explicit or institutional explanation for why individuals make prejudiced decisions, one must show that it was not a simple outcome of unthinking reproduction of statistical regularities absorbed with language…we must check whether simply learning language is sufficient to explain (some of) the observed transmission of prejudice,” the researchers added.

Sources include:

BBC.com

Independent.co.uk 1

Independent.co.uk 2

ScienceMag.org



