Black Listed News

Courts Are Using AI to Sentence Criminals. That Must Stop Now

Published: April 18, 2017


Source: Wired

 

There is a stretch of highway through the Ozark Mountains where being data-driven is a hazard.

Heading from Springfield, Missouri, to Clarksville, Arkansas, navigation apps recommend the Arkansas 43. While this can be the fastest route, the GPS's algorithm does not concern itself with factors important to truckers carrying a heavy load, such as the 43's 1,300-foot elevation drop over four miles with two sharp turns. The road once hosted few 18-wheelers, but the last two and a half years have seen a noticeable increase in truck traffic—and wrecks. Locals who have watched accidents increase think it is only a matter of time before someone is seriously hurt, or worse.

Truckers familiar with the region know that Highway 7 is a safer route. However, the algorithm creating the route recommendation does not. Lacking broader insight, the GPS only considers factors programmed to be important. Ultimately, the algorithm paints an incomplete or distorted picture that can cause unsuspecting drivers to lose control of their vehicles.
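To make the point concrete, here is a minimal sketch (not any real navigation system's code) of a route scorer whose recommendation flips depending on which factors it has been programmed to weigh. All road attributes, routes, and weights below are invented for illustration:

```python
# Hypothetical route scorer: the same two routes, ranked under two
# different cost models. All numbers are made up for illustration.

def route_cost(segments, weights):
    """Sum a weighted cost over road segments.

    Each segment is a dict of attributes; `weights` names the only
    attributes the algorithm has been programmed to care about —
    anything absent from `weights` is invisible to it.
    """
    return sum(w * seg.get(attr, 0.0)
               for seg in segments
               for attr, w in weights.items())

# Invented segment data standing in for the two routes in the article.
arkansas_43 = [{"minutes": 55, "steep_grade_miles": 4, "sharp_turns": 2}]
highway_7   = [{"minutes": 63, "steep_grade_miles": 1, "sharp_turns": 0}]

# A GPS tuned only for speed sees just travel time...
time_only = {"minutes": 1.0}
# ...while a trucker's mental model also penalizes grades and turns.
trucker = {"minutes": 1.0, "steep_grade_miles": 6.0, "sharp_turns": 10.0}

routes = [("Arkansas 43", arkansas_43), ("Highway 7", highway_7)]
fastest = min(routes, key=lambda r: route_cost(r[1], time_only))
safest = min(routes, key=lambda r: route_cost(r[1], trucker))

print(fastest[0])  # the time-only model recommends the Arkansas 43
print(safest[0])   # the fuller model recommends Highway 7
```

Nothing in the time-only model is "wrong"; it simply cannot see the factors it was never given, which is exactly the incomplete picture described above.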

Algorithms pervade our lives today, from music recommendations to credit scores and, now, bail and sentencing decisions. But there is little oversight and transparency regarding how they work. Nowhere is this lack of oversight more stark than in the criminal justice system. Without proper safeguards, these tools risk eroding the rule of law and diminishing individual rights.

Currently, courts and corrections departments around the US use algorithms to determine a defendant's "risk", which ranges from the probability that an individual will commit another crime to the likelihood a defendant will appear for his or her court date. These algorithmic outputs inform decisions about bail, sentencing, and parole. Each tool aspires to improve on the accuracy of human decision-making, allowing for a better allocation of finite resources.

Typically, government agencies do not write their own algorithms; they buy them from private businesses. This often means the algorithm is proprietary or “black boxed”, meaning only the owners, and to a limited degree the purchaser, can see how the software makes decisions. Currently, there is no federal law that sets standards or requires the inspection of these tools, the way the FDA does with new drugs.












BlackListed News 2006-2017