I’m not super happy with the lack of supporting data (just colorful graphs and a lot of emotional anecdotes), but there is a clear disparity that I think warrants greater investigation. I love the credit union movement and was shocked to read this article. I would love to hear your opinions, lemmings!

  • wyrmroot@programming.dev · 50 points · 11 months ago

    With respect to data, there does seem to be a damning amount of it in the CFPB dataset they analyzed for the article. The fact that approvals were this disproportionate even when accounting for “income, debt-to-income ratio, property value, downpayment percentage, and neighborhood characteristics” is alarming. Specifically with respect to income, approval for lowest-quartile whites exceeded that of highest-quartile blacks. Yes, credit score was not available in the dataset, but we know it doesn’t fully explain the gap because of how often it was cited as a reason for denial, and reliance on credit doesn’t really do much to dig NFCU out of this hole IMO.
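    Purely to illustrate what “accounting for” those covariates means, here is a minimal sketch of that kind of check, using synthetic data and made-up column names rather than the actual CFPB/HMDA fields:

    ```python
    # Hedged sketch: regress approval on race while controlling for income, DTI,
    # property value, downpayment percentage, and a neighborhood measure.
    # All data below is simulated purely for illustration.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(0)
    n = 5000
    df = pd.DataFrame({
        "race_black": rng.integers(0, 2, n),            # 1 = Black applicant
        "income": rng.normal(80, 25, n),                # $k per year
        "dti": rng.uniform(0.1, 0.5, n),                # debt-to-income ratio
        "property_value": rng.normal(350, 100, n),      # $k
        "downpayment_pct": rng.uniform(0.03, 0.30, n),
        "minority_share": rng.uniform(0, 1, n),         # neighborhood characteristic
    })
    # Simulate an underwriting process with a racial penalty baked in (illustration only).
    logit = 0.02 * df.income - 4 * df.dti + 2 * df.downpayment_pct - 1.0 * df.race_black - 0.5
    df["approved"] = (rng.uniform(size=n) < 1 / (1 + np.exp(-logit))).astype(int)

    # If the race coefficient stays large after conditioning on the financial
    # covariates, the raw disparity is not explained by them.
    model = smf.logit(
        "approved ~ race_black + income + dti + property_value + downpayment_pct + minority_share",
        data=df,
    ).fit(disp=0)
    print(model.params["race_black"], np.exp(model.params["race_black"]))  # log-odds and odds ratio
    ```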

    I’m tempted to agree with the author’s assessment that the use of automated tools by the underwriters is a likely contributor. Use a tool trained on historically racist data and practices, and that’s what you’ll get more of.

    • NOT_RICK@lemmy.world · 10 points · 11 months ago

      We have a saying at work regarding data: “garbage in, garbage out.” If they’re using clearly racist historical data, they deserve this shitstorm. They really should have known better.
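      A toy sketch of exactly that failure mode (entirely synthetic, not anyone’s actual underwriting system): fit a model on historical decisions that carry a racial penalty, and it happily reproduces the penalty for otherwise identical applicants.

      ```python
      # "Garbage in, garbage out": train on biased historical labels, get biased predictions.
      import numpy as np
      from sklearn.linear_model import LogisticRegression

      rng = np.random.default_rng(1)
      n = 20000
      race_black = rng.integers(0, 2, n)
      income = rng.normal(80, 25, n)
      dti = rng.uniform(0.1, 0.5, n)

      # Historical approvals with bias baked into the labels (simulated).
      hist_logit = 0.03 * income - 5 * dti - 1.2 * race_black
      approved = (rng.uniform(size=n) < 1 / (1 + np.exp(-hist_logit))).astype(int)

      model = LogisticRegression().fit(np.column_stack([race_black, income, dti]), approved)

      # Same finances, different race: the learned model repeats the historical gap.
      applicants = [[0, 80, 0.3], [1, 80, 0.3]]
      print(model.predict_proba(applicants)[:, 1])  # approval probability, white vs. black
      ```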

  • corroded@lemmy.world · 10 points (1 down) · 11 months ago

    How would a bank or credit union even know what racial background a loan applicant comes from? I have a mortgage, and I’ve had auto loans and personal loans in the past. Not once did I ever see a bank employee face-to-face, even for my mortgage.

    I suppose the sound of a person’s voice or their name could give some clues on certain occasions.

    • lurch (he/him)@sh.itjust.works · 15 points · 11 months ago

      In the US lots of forms have race or ethnicity input fields. It always baffles me as a European. Like, how is that relevant, except for discrimination?

      • zigmus64@lemmy.world · 17 points · 11 months ago

        I think the logic is to track data like this… I’ve never understood it either though…

        • MechanicalJester@lemm.ee · 3 points · 11 months ago

          I’m very much for this. In schools in decent states, the students self-identify. That is then used to look for overrepresentation in suspensions or expulsions.

          How else can you mathematically prove bias or discrimination?

          Individually, the ethnicity is known or presumed by staff, but without it being recorded systemically, it can’t be addressed systemically.
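          For example (hypothetical counts, but this is the basic shape of the check that self-identified data enables), a chi-square test on suspensions by group:

          ```python
          # Hypothetical numbers; a small p-value suggests suspension
          # rates genuinely differ between the two groups.
          from scipy.stats import chi2_contingency

          # rows: group A, group B; columns: suspended, not suspended
          table = [[90, 910],
                   [40, 960]]
          chi2, p_value, dof, expected = chi2_contingency(table)
          print(f"chi2={chi2:.1f}, p={p_value:.4f}")
          ```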

        • ryathal@sh.itjust.works · 1 point · 11 months ago

          It’s easier to ask for it upfront and ban companies from using it than to try to reconstruct the data for analysis after the fact. The scale of discrimination in housing was so severe that the government forces this information to be collected and reported for all applications, because it’s easier to detect the discrimination that way. There are also penalties for not submitting the race or ethnicity on enough applications.
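          As a rough sketch of why that reporting matters (synthetic data and placeholder column names, not the real HMDA schema), the first-pass disparity check becomes a simple cross-tab:

          ```python
          import numpy as np
          import pandas as pd

          rng = np.random.default_rng(2)
          n = 10000
          apps = pd.DataFrame({
              "race": rng.choice(["White", "Black"], n),
              "income": rng.normal(80, 30, n),
          })
          # Simulated approvals with a built-in gap, for illustration only.
          apps["approved"] = (rng.uniform(size=n)
                              < np.where(apps.race == "White", 0.75, 0.50)).astype(int)
          apps["income_quartile"] = pd.qcut(apps.income, 4,
                                            labels=["Q1", "Q2", "Q3", "Q4"])

          # Approval rate by race within each income quartile.
          print(apps.pivot_table(index="income_quartile", columns="race",
                                 values="approved", aggfunc="mean", observed=True))
          ```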

    • bassomitron@lemmy.world · 14 points · 11 months ago

      Creditors have access to your entire life in the background. So even if your loan application doesn’t have race/ethnicity on it, your credit file sure as hell does.

    • ShortBoweledClown@lemmy.one · 10 points · 11 months ago

      They run your information against a number of services when you apply for a loan. That data is 100% available to them. Source: I work with this data daily.

    • cybervseas@lemmy.world · 7 points · 11 months ago

      Like the author says, this is probably due to how the automated systems were trained. They weren’t made to be racist, but they’re more likely to reject based on certain traits, and that ends up making them discriminate against black applicants. I’m thinking: neighborhood, street, schools attended, stuff like that.
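      A minimal sketch of that proxy effect (all synthetic, and obviously not the actual underwriting model): race never appears as a feature, but a neighborhood variable correlated with race carries it in anyway.

      ```python
      import numpy as np
      from sklearn.linear_model import LogisticRegression

      rng = np.random.default_rng(3)
      n = 20000
      race_black = rng.integers(0, 2, n)
      # Segregation makes geography a proxy: the neighborhood score correlates with race.
      neighborhood = 0.8 * race_black + rng.normal(0, 0.3, n)
      income = rng.normal(80, 25, n)

      # Historical labels penalize the neighborhood, and therefore indirectly race (simulated).
      hist_logit = 0.03 * income - 2.0 * neighborhood
      approved = (rng.uniform(size=n) < 1 / (1 + np.exp(-hist_logit))).astype(int)

      # Train WITHOUT race in the feature set.
      features = np.column_stack([income, neighborhood])
      model = LogisticRegression().fit(features, approved)

      # Predictions still differ by race, because the proxy carried it in.
      pred = model.predict_proba(features)[:, 1]
      print("mean predicted approval, white applicants:", pred[race_black == 0].mean())
      print("mean predicted approval, black applicants:", pred[race_black == 1].mean())
      ```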

  • MechanicalJester@lemm.ee · 3 points · 11 months ago

    They tried to make their decisions too clever. Analyzing personal background beyond the basic financial measures is the problem; that additional analysis is where biases breed.