Nevada Asked A.I. Which Students Need Help. The Answer Caused an Outcry.

Renee Fairless, a charter school principal, said that at her school, which serves many children living in poverty, a new A.I. algorithm led to a sharp drop in the number of students identified as needing more resources.

The New York Times

The new system cut the number of students deemed “at risk” in the state by 200,000, leading to tough moral and ethical questions over which children deserve extra assistance.

The cynicism it takes to build the thing, sell the thing, run the thing, apply the thing, all to tell kids they’re out of luck.

I guess if you sprinkle a little computer magic on your budget cut, you’re absolved from actually making decisions and bear no responsibility for them?

Cat Hicks (@grimalkina@mastodon.social), on Mastodon:

Set aside the question of whether the model(s) at hand predict a certain percentage of the variance in the constrained outcome. You don’t even need to know that. The question is: we have an enormous real-world intervention that’s being performed right now on children, the change in funding. Reclassifying these schools has an immediate, scaled effect. What evidence does a tech team have to choose this change? What accountability? What expertise? What right to this data?