Kenn Dahl says he has always been a careful driver. The owner of a software company near Seattle, he drives a leased Chevrolet Bolt. He’s never been responsible for an accident.

So Mr. Dahl, 65, was surprised in 2022 when the cost of his car insurance jumped by 21 percent. Quotes from other insurance companies were also high. One insurance agent told him his LexisNexis report was a factor.

LexisNexis is a New York-based global data broker with a “Risk Solutions” division that caters to the auto insurance industry and has traditionally kept tabs on car accidents and tickets. Upon Mr. Dahl’s request, LexisNexis sent him a 258-page “consumer disclosure report,” which it must provide per the Fair Credit Reporting Act.

What it contained stunned him: more than 130 pages detailing each time he or his wife had driven the Bolt over the previous six months. It included the dates of 640 trips, their start and end times, the distance driven and an accounting of any speeding, hard braking or sharp accelerations. The only thing it didn’t have was where they had driven the car.

On a Thursday morning in June, for example, the car had been driven 7.33 miles in 18 minutes; there had been two rapid accelerations and two incidents of hard braking.
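
To make the structure of those trip records concrete, here is a minimal sketch of what one such entry might look like as a data structure, using the June trip above as the example. The Python representation, the field names and the specific date and clock times are assumptions for illustration only; they are not taken from the LexisNexis report itself.

```python
# Hypothetical representation of a single trip entry in a telematics
# disclosure report; the schema and field names are illustrative only.
from dataclasses import dataclass
from datetime import datetime

@dataclass
class TripRecord:
    start: datetime            # placeholder date/time, not from the report
    end: datetime
    distance_miles: float
    rapid_accelerations: int
    hard_braking_events: int

    def duration_minutes(self) -> float:
        return (self.end - self.start).total_seconds() / 60

    def average_speed_mph(self) -> float:
        return self.distance_miles / (self.duration_minutes() / 60)

# The June trip described above: 7.33 miles in 18 minutes, with two rapid
# accelerations and two hard-braking incidents.
trip = TripRecord(
    start=datetime(2022, 6, 16, 8, 0),
    end=datetime(2022, 6, 16, 8, 18),
    distance_miles=7.33,
    rapid_accelerations=2,
    hard_braking_events=2,
)
print(f"{trip.average_speed_mph():.1f} mph average")  # ~24.4 mph
```

Multiply an entry like that by 640 trips and the 130-plus pages of driving detail in the report follow directly.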

    • kbal@fedia.io

      It serves as a convenient representative example of the ways in which such systems can go wrong.

      • kbal@fedia.io

        I mean, this is the world of software and computer systems. The map is always outdated, the model is always fictional, and the metric is always measuring the wrong thing. Even aside from the obvious privacy problems, this kind of big-data approach has limits that are too easily ignored by insurance companies eager to average across thousands of mistakes in the hope of getting something profitable. As is becoming increasingly obvious to the general public as computer algorithms designed in secret rule more of our lives, quite often the best that can be managed is a system that works adequately well for the purposes of its designers even while it makes decisions that are utterly stupid at the level of the individual people subjected to it.

    • AHemlocksLie@lemmy.zip

      That assumes the outdated map software somehow manages to produce an accurate report. More likely, if it produces one at all, it’ll be “going X over in a Y MPH zone” even though Y is wrong, or just “speeding by X MPH for Y seconds/minutes”. Either way, nobody is likely to verify and correct the data, so you could be punished for perfectly safe and legal driving (the sketch at the end of this thread shows how a stale map limit can produce exactly that kind of false flag).

      • Aceticon@lemmy.world

        You can’t be punished for it because that “evidence” was not correctly collected.

        Also, in your specific example and depending on the country, reporting you for that would be a false accusation, which means they’re the ones who could get into trouble if you went after them (basically, any costs you incurred because of it would be on them).

        (IANAL, so take this with a pinch of salt)

        It’s probably too much trouble for them to actually report it to the police (if they did it automatically they’d run the risk I mentioned, and they’re not going to spend the money reviewing it manually): there is risk and cost involved with nothing in it for them.

        That said, they could still pass it on to entities other than the police (such as insurers), and good luck proving it and showing the damage it caused you. In the EU you could request all the data they hold on you, which might be enough to catch them, but outside it, it really depends.

        • AHemlocksLie@lemmy.zip

          Maybe not legally punished, but this very article we’re discussing is about how insurance companies are, in fact, punishing you financially for it. As for the false accusation, sure, but how likely is anyone to even figure it out? You’re not being dragged into court, and people don’t even know this is happening yet. It’s only illegal if you get caught. I don’t expect them to report it to anyone. I just expect data collectors to sell data and other businesses to buy it for the express purposes of financially screwing you. You may stay out of court, but that extra 21% charge is gonna cost you a couple hundred per year at least.

          • Aceticon@lemmy.world

            Yeah, hence the last paragraph of my comment.

            I can see how it can indirectly be used in ways that harm somebody; I just wanted to point out that it’s unlikely to be reporting drivers to the police, if only because there’s no money in it for them and some risk.

            Mind you, if the police set up some kind of agreement with them where they’re paid for it and are immune to liability for misreporting, I can see rental companies doing it.

            I’m very happy that I live in Europe, not the US.

        • BakerBagel@midwest.social

          They aren’t reporting you to the police. They are selling that data to insurance companies, who then use that information to jack up your premiums. So guess what: you are now being financially punished for safe driving, while someone in a 20-year-old shitbox who miraculously avoids accidents and speeding tickets pays a lower premium.

          The only solution is to forbid companies from collecting this data in the first place. It’s never going to be used to make something cheaper for you; it’s only ever going to be used to sell you something or to charge you more.

          • Aceticon@lemmy.world

            I think we’re basically seeing the same picture and are in agreement on how things should be (which is why I pointed out that I’m happy to be in the EU, where that stuff IS forbidden unless people explicitly opt in).

            • BakerBagel@midwest.social

              No opt-ins either, because companies will do whatever they can to force you into opting in. It’s the same way fast-food companies are harvesting data from people by jacking up prices and making “discounts” available only on their apps. Corporations have all the leverage against consumers.

    • BakerBagel@midwest.social

      Which then reports back to LexisNexis that you were speeding through an area; that gets passed on to insurance companies, who in turn flag you as a dangerous driver and raise your premiums.

    • brbposting@sh.itjust.works

      The anecdote doesn’t necessarily prove anything, but it is conceivable that that stretch of road is mismarked in multiple systems.

    • keyez@lemmy.world

      Some cars don’t just rely on maps; they also attempt to scan speed-limit signs on the side of the road, but the system doesn’t always see the sign or update accurately.
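
To make the failure mode discussed in this thread concrete, here is a minimal, purely illustrative sketch of how a telematics system might decide what the speed limit is and flag “speeding”. The function names, confidence threshold and tolerance are assumptions, not any manufacturer’s or LexisNexis’s actual logic; the point is only that a missed sign plus a stale map limit is enough to flag a perfectly legal driver.

```python
# Illustrative only: how a telematics stack might pick a speed limit and
# flag "speeding". All names, thresholds and values here are assumptions.
from typing import Optional

def effective_limit_mph(camera_limit: Optional[int], camera_confidence: float,
                        map_limit: int, min_confidence: float = 0.8) -> int:
    """Trust the camera's sign reading only when it is confident;
    otherwise fall back to the map, which may be out of date."""
    if camera_limit is not None and camera_confidence >= min_confidence:
        return camera_limit
    return map_limit

def flag_speeding(speed_mph: float, limit_mph: int, tolerance_mph: float = 5.0) -> bool:
    """Record a speeding event when measured speed exceeds the assumed limit."""
    return speed_mph > limit_mph + tolerance_mph

# A road re-signed from 35 to 45 mph, with the sign obscured so the camera
# reads nothing, and a map that still says 35: a driver doing a legal 44 mph
# gets a "speeding" event anyway.
limit = effective_limit_mph(camera_limit=None, camera_confidence=0.0, map_limit=35)
print(limit)                     # 35, even though the posted limit is now 45
print(flag_speeding(44, limit))  # True -> a false "speeding" event in the report
```

Nothing downstream re-checks the limit, so a flag like that simply propagates into the broker’s report and from there to insurers, which is the concern raised above.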