For facial recognition experts and privacy advocates, the East Bay detective’s request, while dystopian, was also entirely predictable. It emphasizes the ways that, without oversight, law enforcement is able to mix and match technologies in unintended ways, using untested algorithms to single out suspects based on unknowable criteria.

  • Snot Flickerman@lemmy.blahaj.zone · 11 months ago

    Cops only like technology when they can abuse it to avoid having to do real investigative police work.

    They don’t care to understand the technology in any deep manner, and as we’ve seen with body cams, when they retain full control over the technology, it’s basically a farce to believe it could be used to control their behavior.

    I mean, on top of that, a lot of “forensic science” isn’t science at all and is arguably a joke.

    Cops like using the veneer of science and technology to act like they’re doing “serious jobs” but in reality they’re just a bunch of thugs trying to dominate and control.

    In other words, this is just the beginning, don’t expect them to stop doing stuff like this, and further, expect them to start producing “research” that “justifies” these “investigation” methods and see them added to the pile of bullshit that is “fOrEnSiC sCiEnCE.”

    • ToxicWaste@lemm.ee · 11 months ago

      TBH: Tech companies are not much different from how you described cops.

      They don’t usually bother to learn the tech they are using properly and take every shortcut possible. You can see this in the current spate of AI startups. Sure, LLMs work pretty well. But most other applications of AI are more like: “LOL, no idea how to solve the problem. I hooked it up to this black box, which I don’t understand, and trained it to give me the results I want.”

    • agent_flounder@lemmy.world · 11 months ago

      ProPublica did an article on that.

      https://www.propublica.org/article/understanding-junk-science-forensics-criminal-justice

      E.g.

      The reliability of bloodstain-pattern analysis has never been definitively proven or quantified, but largely due to the testimony of criminalist Herbert MacDonell, it was steadily admitted in court after court around the country in the 1970s and ’80s. MacDonell spent his career teaching weeklong “institutes” in bloodstain-pattern analysis at police departments around the country, training hundreds of officers who, in turn, trained hundreds more.

      In 2009, a watershed report commissioned by the National Academy of Sciences cast doubt on the discipline, finding that “the uncertainties associated with bloodstain-pattern analysis are enormous,” and that experts’ opinions were generally “more subjective than scientific.” More than a decade later, few peer-reviewed studies exist, and research that might determine the accuracy of analysts’ findings is close to nonexistent.

    • Snot Flickerman@lemmy.blahaj.zone · 11 months ago

      Beat me to the punch; I was saying just as much, considering the history of forensic science in general. It won’t be long before they’re producing bogus “research” to justify it as a new investigative method.

  • Catsrules@lemmy.ml · 11 months ago

    Didn’t facial recognition get some poor guy arrested and raped in prison, even though he was completely innocent of everything?

  • iAvicenna@lemmy.world · 11 months ago

    Wow, nice to know that from DNA you can predict whether or not a person has a beard, or their style of hair.

    • hansl@lemmy.world · 11 months ago

      Even from a perfect witness (and witnesses are very imperfect) you wouldn’t be able to predict whether they have a beard or not. That’s why you always see multiple variations of the person when they actually distribute renditions.

    • Wage_slave@lemmy.ml · 11 months ago

      I was wondering what I’d look like with a sick tat on my face. And behold, the winning combination of DNA and AI knew it before I ever got it.

  • shiveyarbles@beehaw.org · 11 months ago

    This stuff scares me. With all the sneaky-ass companies hoarding DNA, it becomes too easy to frame someone. This kind of shit doesn’t help either.

  • FluffyPotato@lemm.ee · 11 months ago

    Getting a psychic to give them a suspect through the shadow realm or something would probably be more accurate.

  • dan1101@lemm.ee · 11 months ago

    Boy, that is just a garbage sandwich: garbage in, garbage out, with twice as much garbage.

  • The Pantser@lemmy.world · 11 months ago

    I can see the abuse, but what if this actually worked in a best-case scenario? Say DNA is found from a rape, that DNA is used to create an image of the person, they find that person, and then they do DNA tests to confirm the match. The image is not used as evidence but used to find the person. Honestly it seems like a good use, if it’s limited to that.

    • i_love_FFT@lemmy.ml · 11 months ago

      I don’t know a lot about DNA, but I know about facial recognition.

      Facial recognition is highly inaccurate. It would be easy for people from the same country to “match” under facial recognition despite being totally unrelated.

      If “face generation from DNA” is only roughly accurate (e.g. nose size or skin tone), then anybody from the same ethnic origin could be a match. Basically, the more you look like the “average person”, the more likely you are to fit the generated face.

      Doesn’t it sound a lot like technology-enabled profiling?
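
      To put toy numbers on that “average person” problem, here’s a rough back-of-the-envelope sketch. The figures (database size, false-match rate, the chance the real person even resembles the generated face) are made-up assumptions for illustration, not anything from the article:

      ```python
      # Toy base-rate sketch: how many innocent "matches" a face search can
      # produce when the probe image is only a rough, DNA-generated guess.
      # All numbers below are illustrative assumptions, not real system figures.

      database_size = 1_000_000      # assumed number of faces searched against
      false_match_rate = 1 / 1_000   # assumed chance an unrelated person "matches"
      true_match_rate = 0.10         # assumed chance the real person even matches
                                     # a face generated from their DNA

      expected_false_matches = (database_size - 1) * false_match_rate

      print(f"Innocent people expected in the results: {expected_false_matches:.0f}")
      print(f"Chance the actual person shows up at all: {true_match_rate:.0%}")

      # With these assumptions, roughly 1,000 innocent people are flagged per
      # search, and the real person probably isn't even among them -- which is
      # why a "match" against a generated face looks a lot more like profiling
      # than like evidence.
      ```

      The exact numbers don’t matter; the point is that with a weak, generic probe image, the hit list is dominated by people who just happen to look like the “average” face.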

      • fidodo@lemmy.world · 11 months ago

        I think this is a bad idea, especially the way it’s being developed, but let me play devil’s advocate for a second. What if it were only used to narrow a search radius, the same way cell pings are used to narrow a search radius? Cell pings are already used to direct resources. Being near a crime obviously doesn’t mean you committed the crime, but it does narrow down where to look, and once you start looking you can find real evidence more efficiently. You could pair this with other techniques to narrow down the search, and then find real, hard corroborating evidence. Also, since they need DNA in the first place, they’d need a DNA match from the suspect, which would prevent random people from getting charged.

        Now, to stop playing devil’s advocate: there are just so many ways this can be abused, and the police are the worst organization to lead it. They are not technology experts, they’re not even legal experts, and they’ve been shown over and over again to be easily biased. So even if they need corroborating evidence, that doesn’t mean they won’t be biased by the face match and then “find” evidence, or even plant it. Plus, even just being accused can be hugely disruptive, and traumatizing when they target a non-match. Imagine you’re innocently going about your day and you suddenly get snatched up for questioning and forced to give a DNA sample.

        If anything like this were to be used in any way, you would need so many safeguards, and it’s obvious the police don’t care about setting any of those up. You’d need a double-blind approach to evidence gathering, extreme oversight, a very restrictive legal framework, and of course close guarding and anonymization of any personal data, and probably more things I’m not thinking about. The police are being irresponsible by treating this like a routine tool rather than something incredibly sensitive and invasive that needs serious safeguards to avoid becoming a massive privacy violation and a dangerous biasing vector that could easily catch up innocent people.

    • RichSPK@lemux.minnix.dev · 11 months ago

      I can’t follow your reasoning. What if they picked a person at random and it was actually the perpetrator, so it actually worked in a best case scenario?

    • Snot Flickerman@lemmy.blahaj.zone · 11 months ago

      Dude, facial recognition catches the wrong people all the time. It is not as infallible as they make it out to be, and this just adds an entire extra level of mistakes they can make.

      Facial recognition tech is bogus and, because of its technical limitations, unintentionally(?) racist (i.e. the cameras are not designed to capture good photo/video of dark skin, leading to high false-positive rates for dark-skinned people). On top of that, the cameras often have too low a resolution for quality matching.

      Further, facial reconstruction based on DNA isn’t exactly super accurate on its own.

      Please don’t fall for this bullshit.

    • admiralteal@kbin.social · 11 months ago

      The idea that DNA is extremely predictive of phenotype is already kind of… ehh.

      There may be some very broad features you can mostly predict, but something as specific as recognizing a person? No way in hell. There are far too many environmental factors in appearance.

    • fidodo@lemmy.world · 11 months ago

      Don’t assume it wouldn’t be abused, since the police have a shit track record. If anything like this were to be used, then strict laws restricting how it can be used need to come first, since the police are dumb and can’t be trusted to invent new applications of technology. They’re the last group that should be leading this.

    • OfficerBribe@lemm.ee · 11 months ago

      I’m in the same boat. Something similar was already done by forensic sketch artists, who drew how a person might look after x years if there was an earlier photo. It’s not like those sketches were irrefutable proof, just a method to potentially find who you are looking for and then investigate further.

      That said, this feels like grasping at straws. I cannot fathom how, from only a DNA sample, you could get a portrait accurate enough for face recognition to then match.

    • Landsharkgun@midwest.social · 11 months ago

      No. It just does not work that way. The article specifically mentions that there’s no proof whatsoever that the company can actually generate a face from DNA. It’s like looking at a textbook on automotive design and predicting exactly what a specific car built 20 years from now will look like. General features? Sure - four wheels, a windshield, etc. Anything more specific? Nope, not at all. And this is before we get into environmental factors - think of scratches or aftermarket spoilers on a car. Humans are similarly influenced by their environment, even down to the level of what we eat or the pollutants in the air we breathe.

      What the cops did is as close to bullshit fantasy as makes no difference. Asking a fortune teller to draw you a picture would be only slightly less accurate. This is so insanely problematic those cops ought to be up on charges.