# Judges let algorithms help them make decisions, except when they don’t

*Photo collage of the Lady Justice statue blinded by an algorithm. Image: Cath Virginia / The Verge, Getty Images*

When Northwestern University graduate student Sino Esthappan began researching how algorithms decide who stays in jail, he expected “a story about humans versus technology.” On one side would be human judges, whom Esthappan interviewed extensively. On the other would be risk assessment algorithms, which are used in hundreds of US counties to gauge the risk of granting bail to accused criminals. What he found was more complicated — and suggests these tools could obscure bigger problems with the bail system itself.

Algorithmic risk assessments are intended to calculate the risk of a criminal defendant not returning to court — or, worse, harming others — if they’re released. By comparing criminal defendants’ backgrounds to a vast database…
