Comments
  1. MIT Technology Review · Karen Hao · 12/4/20 · 12 min
    18 reads · 3 comments · 9.8
    • SEnkey · 3 years ago

      There is an interesting irony here. Anyone who has been in a room with lawyers may leave feeling like a simple question (was the law broken? is this legal? what liability is involved? is this contract a good one?) was way overcomplicated to the point of being inscrutable to a normal person. Dave Chappelle has a great stand-up bit on this very issue. The algorithms have done the same thing to the lawyers: they're something a normal person, and even a normal lawyer, can't begin to understand.

      I'm all for making state-used algorithms transparent (and maybe some others too; looking at you, Google). I'm also for making fewer laws and making the existing ones simpler.

    • Florian · 3 years ago

      Public agencies often buy automated decision-making tools directly from private vendors. The result is that when systems go awry, the individuals affected—and their lawyers—are left in the dark.

      This is insane but also not surprising, sadly.

    • Ruchita_Ganurkar · 3 years ago

      Credit-scoring algorithms are not the only ones that affect people’s economic well-being and access to basic services. Algorithms now decide which children enter foster care, which patients receive medical care, which families get access to stable housing. Those of us with means can pass our lives unaware of any of this. But for low-income individuals, the rapid growth and adoption of automated decision-making systems has created a hidden web of interlocking traps.