- cross-posted to:
- news@lemmy.world
I’m not super happy with the lack of supporting data (just colorful graphs and a lot of emotional anecdotes), but there is a clear disparity that I think warrants greater investigation. I love the credit union movement and was shocked to read this article. I would love to hear your opinions, lemmings!
With respect to data, there does seem to be a damning amount of it in the CFPB dataset they analyzed for the article. The fact that approvals were this disproportionate even when accounting for “income, debt-to-income ratio, property value, downpayment percentage, and neighborhood characteristics” is alarming. Specifically with respect to income, approval for lowest-quartile whites exceeded that of highest-quartile blacks. Yes, credit score was not available in the dataset, but we know it doesn’t fully explain the gap because of how frequently it was cited as a reason for denial, and reliance on credit score doesn’t really do much to dig NFCU out of this hole IMO.
I’m tempted to agree with the author’s assessment that the use of automated tools by the underwriters is a likely contributor. Use a tool trained on historically racist data and practices, and that’s what you’ll get more of.
We have a saying at work regarding data: “garbage in, garbage out”. If they’re using clearly racist historical data, they deserve this shitstorm. They really should have known better.
That’s a rather old and common concept: GIGO.
You’re not wrong, work’s just the first place I heard it
How would a bank or credit union even know what racial background a loan applicant comes from? I have a mortgage, and I’ve had auto loans and personal loans in the past. Not once did I ever see a bank employee face-to-face, even for my mortgage.
I suppose the sound of a person’s voice or their name could give some clues on certain occasions.
In the US, lots of forms have race or ethnicity input fields. It always baffles me as a European. Like, how is that relevant, except for discrimination?
I think the logic is to track data like this… I’ve never understood it either though…
We’re tracking the data to prove we’re actually doing what we said we’d do.
Very much this. In schools in decent states, the students self-identify. That is then used to look for over-representation in suspensions or expulsions.
How else can you mathematically prove bias or discrimination?
Individually, the ethnicity is known or presumed by staff, but without it being known systemically, it can’t be addressed systemically.
It’s easier to ask for it upfront and ban companies from using it than to try to reconstruct the data to analyze after the fact. The scale of discrimination was so severe in housing that the government forces this information to be collected and reported for all applications, because it’s easier to detect the discrimination that way. There are also penalties for not submitting the race or ethnicity on enough applications.
Creditors have access to your entire life in the background. So even if your loan application doesn’t have race/ethnicity on it, your credit file sure as hell does.
Equifax knows a whole lot about you, more than just your race.
They run your information against a number of services when you apply for a loan. That data is 100% available to them. Source: I work with this data daily.
Like the author says, this is probably due to how the automated systems were trained. They weren’t made to be racist, but they’re more likely to reject based on certain traits, and that ends up making them discriminate against black applicants. I’m thinking: neighborhood, street, schools attended, stuff like that.
They tried to make their decisions too clever. The analysis of personal background beyond basic measures is a problem. That additional analysis is where biases breed.
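The proxy effect described in the comments above can be sketched with a toy simulation. Everything here is invented for illustration (the two-neighborhood setup, the group labels, and all the rates are hypothetical, not from the article or the CFPB data): race is never a model input, but because residential segregation makes neighborhood a stand-in for it, a model fit to historically biased approvals still produces disparate outcomes.

```python
import random

random.seed(0)

def make_applicants(n=10_000):
    """Simulate applicants with a protected group and a correlated neighborhood."""
    rows = []
    for _ in range(n):
        group = random.choice(["A", "B"])  # protected attribute; NOT a model feature
        # Segregation: each group lives in one neighborhood ~80% of the time.
        if group == "A":
            neighborhood = 1 if random.random() < 0.8 else 0
        else:
            neighborhood = 0 if random.random() < 0.8 else 1
        # Historically biased decisions: neighborhood 0 was approved far less often.
        approved = random.random() < (0.8 if neighborhood == 1 else 0.4)
        rows.append((group, neighborhood, approved))
    return rows

history = make_applicants()

# "Train" the simplest possible model: historical approval rate per neighborhood.
def approval_rate(rows, hood):
    outcomes = [approved for g, h, approved in rows if h == hood]
    return sum(outcomes) / len(outcomes)

model = {hood: approval_rate(history, hood) for hood in (0, 1)}

def decide(neighborhood):
    """Approve if the learned historical rate for that neighborhood exceeds 0.5."""
    return model[neighborhood] > 0.5

# Measure outcomes by group on fresh applicants; race was never a feature,
# yet the groups end up with very different approval rates.
fresh = make_applicants()
for group in ("A", "B"):
    decisions = [decide(h) for g, h, _ in fresh if g == group]
    print(group, round(sum(decisions) / len(decisions), 2))
```

Dropping the race column doesn’t help here, because the neighborhood column already encodes most of the same information; that’s why “we don’t collect race” is not, on its own, a defense against a biased model.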
Systemic racism strikes again!