this is from a couple years ago. everyone here is thinking this is like minority report, but it’s not about individual behavior; this is just about using predictive models to self-justify racist policing patterns:
Fears of “The Thought Police” are probably running through your head right now, but Chattopadhyay is keen to stress that the focus isn’t on individuals: “We do not focus on predicting individual behavior, and do not suggest that anyone be charged with a crime that they didn’t commit, or be incarcerated for that. Our model learns from, and then predicts, event patterns in the urban space, for example, predicting that within a two block radius around the intersection of 67th Street and SW Avenue, there is a high risk of a homicide a week from now. It does not indicate who is going to be the victim or the perpetrator,” she says.
Which of course has a certain degree of success baked in: if you focus policing in a particular place you will find crimes there, because a) crimes happen everywhere and b) cops can just juke the stats / make shit up / make arrests without cause.
exactly. it’s amazing to me that these nerds can talk themselves into creating an ouroboros like this because they don’t actually bother to understand how any of this shit works, but i guess whatever justifies their salary…
It’s the result of other scientists pretending sociology isn’t a science. Sociology shows that shit like this is worthless, so instead of just working with sociologists, they ignore them.
“using advanced AI, we have determined there will be more crime in the high crime area”
it’s even worse than that! they’re treating crimes like they’re forces of nature or fucking dice rolls to begin with, completely ignoring the role police play in defining and creating crime and the construction of criminality!
i mean garbage in, garbage out, and the whole edifice is built upon a giant pile of racist garbage and these assholes will happily congratulate themselves about how good at math they are
AI bringing back miasma theory: past crimes are creating bad odors in the area that are just turning previously pure citizens into criminals. I hope the government gives the police more military equipment to purge these evil vapors
exactly but we call it “broken window theory” to jazz it up a bit
With 90% accuracy it will successfully identify 90 out of 100 criminals and falsely accuse 100 out of 1000 innocent people.
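to spell out the base-rate math (a rough sketch, assuming “90% accuracy” means both a 90% true-positive and a 90% true-negative rate, using the 100 criminals / 1000 innocents above):

```rust
fn main() {
    let criminals = 100.0;
    let innocents = 1000.0;
    let accuracy = 0.9; // assume 90% true-positive AND 90% true-negative rate

    let true_positives = criminals * accuracy;          // 90 real hits
    let false_positives = innocents * (1.0 - accuracy); // 100 innocent people flagged

    // of everyone flagged, barely half actually did anything
    let precision = true_positives / (true_positives + false_positives);
    println!("flagged: {}", true_positives + false_positives); // 190
    println!("precision: {precision:.2}");                     // 0.47
}
```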
It’s better for ten innocent people to be jailed than for one full-time wage to be paid
10 innocent people may be jailed, but it’s a risk I’m willing to take
I can see anywhere that prisons are a private thing being massively in favour of this.
Wow how innovative
pub fn predict_crime(suspect: Person) -> bool { if suspect.race() == Race::Black { return true; } else { return false; } }
ew…
pub fn predict_crime(suspect: Person) -> bool { return suspect.race() == Race::Black }
Good change but also why is `race` a getter method while `Race::Black` is a constant enum? Is `race` an impure function dependent on global state? Is it derived from some other internal immutable state?
`race()` is a getter method as it is dependent on which Eastern and Southern Europeans are considered white at the time
you don’t need the return statement either
i don’t even know what language this is :D i just thought it’d be a nice bit to silently pass over the racism aspect and nitpick the code
It’s Rust.
If you omit the semicolon on the last line, that expression becomes the return value, so `suspect.race() == Race::Black` will return true/false from the containing function.
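so the fully idiomatic version of the bit (same made-up `Person` and `Race` types as above) would presumably be:

```rust
pub fn predict_crime(suspect: Person) -> bool {
    suspect.race() == Race::Black // no semicolon: this expression is the return value
}
```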
Nerds with a rudimentary understanding of undergrad stats do this all the time with extra steps by just building a simplistic model based on (racist) “crime data”. Sometimes literally just a basic Bayesian model.
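like, a minimal sketch of that ouroboros (all names and numbers made up, not anyone’s actual model):

```rust
// the "model" is just last year's arrest counts, patrols follow the
// predictions, and arrests follow the patrols, so the prediction
// confirms itself
fn predicted_risk(past_arrests: &[u32]) -> Vec<f64> {
    let total: u32 = past_arrests.iter().sum();
    past_arrests
        .iter()
        .map(|&a| a as f64 / total as f64) // "risk" = share of historical arrests
        .collect()
}

fn next_years_arrests(risk: &[f64], patrol_budget: f64) -> Vec<u32> {
    // arrests scale with patrol presence, not with underlying behavior
    risk.iter().map(|r| (r * patrol_budget) as u32).collect()
}

fn main() {
    // two neighborhoods with identical underlying behavior;
    // one just starts out over-policed
    let mut arrests: Vec<u32> = vec![900, 100];
    for year in 0..5 {
        let risk = predicted_risk(&arrests);
        println!("year {year}: risk = {risk:?}");
        arrests = next_years_arrests(&risk, 1000.0);
    }
    // the 90/10 split never corrects itself: the model "learns" its own output
}
```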
And they get hired by Palantir to do versions of that for $300k/year.
U Chicago continuing its proud reactionary legacy https://en.wikipedia.org/wiki/Chicago_Boys
Insider trading is probably very predictable, with enough data.
The Torment Nexus is only when you do 1984, not Minority Report.
Predicts police behaviour, not crime. And who can’t do that?
Can’t wait until, in “freedomland”, I get arrested not because I committed any crime, but because I look like someone who might.
“Red always sus” but in real life.
That’s been happening for a really really long time already. It’s called racism, now they’re teaching it to computers.
Didn’t it come out early on that AI was racist?
It didn’t really “come out”. It was always known that garbage in leads to garbage out, and that models reflect their training data by design, so no serious researcher was surprised that they also reflect its biases.
Does it poop out cute little billiard balls too?
We do not focus on predicting individual behavior, and do not suggest that anyone be charged with a crime that they didn’t commit, or be incarcerated for that. Our model learns from, and then predicts, event patterns in the urban space, for example, predicting that within a two block radius around the intersection of 67th Street and SW Avenue, there is a high risk of a homicide a week from now. It does not indicate who is going to be the victim or the perpetrator.
…
We found that when stressed, the law enforcement response is seemingly different in high socio-economic-status (SES) areas compared to their more disadvantaged neighboring communities. It is suggested in the paper that when crime rates spike, the higher SES neighborhoods tend to get more attention at the cost of resources drawn away from poorer neighborhoods.