1. We are a community of readers. Join us!

    Readup is a social reading platform. No ads. No distractions. No liking or upvotes. We help you pay attention to what matters: reading.

    MIT Technology Review | Karen Hao | 12/4/20 | 12 min
    18 reads · 3 comments
    • SEnkey
      Top reader this week · Scout
      1 month ago

      There is an interesting irony here. Anyone who has been in a room with lawyers may leave feeling like a simple question (was the law broken? is this legal? what liability is involved? is this contract a good one?) was overcomplicated to the point of being inscrutable to a normal person. Dave Chappelle has a great stand-up bit on this very issue. Now the algorithms have done that to the lawyers: it's something a normal person, and a normal lawyer, can't begin to understand.

      I'm all for making state-used algorithms transparent (and maybe some others too; looking at you, Google). I'm also for making fewer laws and simplifying the existing ones.

    • Florian
      Top reader this week · Reading streak
      1 month ago

      "Public agencies often buy automated decision-making tools directly from private vendors. The result is that when systems go awry, the individuals affected, and their lawyers, are left in the dark."

      This is insane but also not surprising, sadly.

    • Ruchita_Ganurkar
      1 month ago

      Credit-scoring algorithms are not the only ones that affect people’s economic well-being and access to basic services. Algorithms now decide which children enter foster care, which patients receive medical care, which families get access to stable housing. Those of us with means can pass our lives unaware of any of this. But for low-income individuals, the rapid growth and adoption of automated decision-making systems has created a hidden web of interlocking traps.