Crime prediction software used to prosecute people for misdeeds they have yet to commit


Future crime

(NaturalNews) Are people responsible for actions they have yet to commit? This moral dilemma doesn't just make for fun armchair philosophy. A team of researchers has demonstrated that computers are better than human judges at predicting who will commit a violent act.

In a paper published last month, the researchers detailed how they built a system that started from records of people previously arrested for domestic violence and predicted which of them were most likely to commit the same crime again.

Authorities can use the crime prediction software to detect patterns. These patterns can help officials gauge an offender's intent and the probability that they will offend again. The technology could prevent injuries and even save lives. On the other hand, critics note that such technology is corrosive to the foundations of justice and moral responsibility.

In response, proponents insist that police departments already use computers to predict when and where crimes are most likely to occur. More than half of state parole boards use predictions based on data analysis to determine whether a convict should be released from prison or remain incarcerated. In addition, the U.S. Department of Homeland Security already uses FAST (Future Attribute Screening Technology) to pinpoint potential terrorists by analyzing an individual's body language and tendencies. The new system simply builds on these technologies.

Human judgment vs. computer judgment

Although the technology is better at predicting the behavior of criminals than human judges are, it is not 100 percent airtight. What makes the recent study unique is that it highlights how effective the system is at gauging criminal behavior in comparison to experts.

"The algorithms are not perfect. They have flaws, but there are increasing data to show that they have fewer flaws than existing ways we make these decisions," said Richard Berk, a criminology and statistics professor at Penn's school of Arts and Sciences, who helped design them system. "You can criticize them -- and you should because we can always make them better -- but, as we say, you can't let the perfect be the enemy of the good."

Berk emphasized that he only used publicly available data on individuals who had been previously arrested. The system isn't monitoring citizens; however, it is being used to decide if a criminal ought to be released or detained.

Berk has been involved in crime prediction software for more than a decade. By 2008, he had built a computer system that was better than professionals at determining which parolees were most likely to commit another crime. In particular, Berk used a machine learning system that combed through large amounts of case data until it unearthed patterns that could be used to make predictions, which were then tested against known outcomes.
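
Berk's published recidivism work is generally associated with ensemble methods such as random forests. As a minimal sketch of that general approach only, and not of his actual system, the example below fits a random-forest classifier to synthetic historical cases with known outcomes and reads off a risk score for new cases; every variable and value in it is invented.

```python
# A minimal sketch of the general approach, not Berk's actual system:
# fit an ensemble model on past cases with known outcomes, then read
# off a probability of re-offense for new cases. All data is synthetic.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
history = rng.normal(size=(1000, 5))   # invented attributes of past parolees
# Synthetic outcome: 1 if the person later re-offended, 0 otherwise.
outcome = (history[:, 0] + rng.normal(size=1000) > 0).astype(int)

model = RandomForestClassifier(n_estimators=500, random_state=0)
model.fit(history, outcome)

new_cases = rng.normal(size=(3, 5))    # invented attributes of new cases
print(model.predict_proba(new_cases)[:, 1])  # estimated re-offense risk
```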

Feeding the machine

In a study published last month in the Journal of Empirical Legal Studies, Berk and Penn psychologist Susan Sorenson reviewed nearly 100,000 legal cases that took place between 2009 and 2013. They used a machine learning system that collected data on age, sex, zip code, age of first arrest and a list of previous charges such as drunk driving, animal abuse and gun-related crimes. The system wasn't fed information on race; however, it still discerned racial patterns based upon zip codes.
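
The paper's code is not public, but the kind of feature table described above is easy to picture. The sketch below builds one from a few invented records, expands the list of prior charges into indicator columns, and one-hot encodes the categorical fields; every record, column name and charge label here is hypothetical.

```python
# Hypothetical reconstruction of the kind of feature table described
# above; every record, column name and charge label is invented.
import pandas as pd

cases = pd.DataFrame({
    "age":              [34, 22, 45],
    "sex":              ["M", "F", "M"],
    "zip_code":         ["19104", "19139", "19104"],
    "age_first_arrest": [18, 21, 30],
    "prior_charges":    [["dui"], ["animal_abuse", "gun"], []],
})

# One indicator column per charge type found in the prior-charge lists.
charges = (cases["prior_charges"].explode()
           .str.get_dummies().groupby(level=0).max())
# One-hot encode the categorical fields. Race is deliberately absent,
# but zip code can act as a proxy for it, as the study itself observed.
encoded = pd.get_dummies(cases.drop(columns="prior_charges"),
                         columns=["sex", "zip_code"])
X = encoded.join(charges).fillna(0).astype(int)
print(X)
```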

The duo reported that approximately two-thirds of the data was used to train the system; the remaining third was used to test it. They fed the computer the same information a judge would have access to, such as whether an individual had been arrested for domestic violence before.
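
The reported protocol is a standard holdout evaluation: train on about two-thirds of the cases, test on the rest. A sketch under that assumption, with entirely synthetic data:

```python
# Sketch of the reported two-thirds train / one-third test protocol.
# The features and outcomes here are synthetic stand-ins.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
X = rng.normal(size=(1500, 6))                          # invented features
y = (X[:, 0] + rng.normal(size=1500) > 1).astype(int)   # 1 = re-offended

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=1/3, random_state=0)

model = RandomForestClassifier(n_estimators=300, random_state=0)
model.fit(X_train, y_train)
print("held-out accuracy:", model.score(X_test, y_test))
```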

One way to ensure that offenders do not commit repeat offenses is to jail anyone charged with domestic violence. However, there is a hefty price tag attached to jailing everyone. Nearly half of individuals arrested for domestic violence are released, according to Berk. The hurdle doesn't lie in releasing half the detainees, but in determining which half ought to be let go.

The researchers found that approximately 20 percent of the detainees released by judges committed the same crime later, whereas only about 10 percent of those the computer would have released did.
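
In other words, the metric being compared is the repeat-offense rate among the people each decision-maker chose to release. A toy illustration of that comparison follows; the arrays are invented, and only the study's reported figures of roughly 20 versus 10 percent are from the source.

```python
# Toy illustration of the comparison: among those each decision-maker
# released, what fraction later re-offended? All values are invented.
import numpy as np

reoffended = np.array([0, 1, 0, 0, 1, 0, 1, 0, 0, 0])          # outcomes
judge_released = np.array([1, 1, 1, 0, 1, 1, 0, 0, 1, 1], dtype=bool)
model_released = np.array([1, 0, 1, 1, 0, 1, 0, 1, 1, 1], dtype=bool)

def repeat_rate(released):
    # Fraction of released individuals who later offended again.
    return reoffended[released].mean()

print("judge:", repeat_rate(judge_released))  # study reported ~20 percent
print("model:", repeat_rate(model_released))  # study reported ~10 percent
```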

Berk and Sorenson are now helping Philadelphia law enforcement use the machine learning system to assign a risk level for domestic violence in households. A similar system is already in use for Philadelphia's parole population: it rates city parolees as a low, medium or high threat, enabling police to concentrate primarily on high-risk cases.
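
How the low, medium and high tiers are derived from a model's output is not spelled out in the coverage; one plausible sketch, with invented cutoffs, is to threshold the predicted probability of re-offense:

```python
# Sketch of mapping a predicted re-offense probability onto the low /
# medium / high tiers mentioned above. The cutoffs are invented.
import numpy as np

scores = np.array([0.05, 0.42, 0.77, 0.12, 0.91])  # hypothetical risk scores
cutoffs = [0.33, 0.66]                             # assumed tier boundaries
tiers = np.array(["low", "medium", "high"])[np.digitize(scores, cutoffs)]
print(list(zip(scores.tolist(), tiers.tolist())))
```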

Ensuring moral responsibility

Nevertheless, it's hard to shake off the uneasiness attached to punishing someone for a crime they have yet to commit. This technology isn't limited to criminals, but has implications for society in general. Such software could be used by companies to decide whether they should fire an employee, or by doctors to decide whether they should deny a patient surgery.

At a more fundamental level, punishing people based on their proclivities corrodes the belief that an individual must actually have committed an act in order to be held morally responsible. A person's thoughts may reflect their character, but we don't prosecute people based upon thoughts alone. In addition, holding people accountable for future crimes undermines the idea that people are innocent until proven guilty. Had an individual known they would commit a deplorable act in the future, they could take active steps in the present to avoid those undesirable consequences.

Regarding people as moral agents instead of numbers is the only way to ensure that the state prosecutes people based upon past actions instead of future ones. Feeding a machine physical data about a person isn't sufficient to determine how that person will act, because not all human choices are fueled by physical causes. The ideas in other people's minds, for instance, can cause us to change our own and make choices that alter the course of the future. Thus, such technology, though useful and potentially necessary in several respects, places too much trust in the hands of machines, and not enough in the hands of people.

