• deegeese@sopuli.xyz · 302 points · 2 months ago

    Guy whose cars run into stopped fire trucks thinks he’s an expert on computer vision.

    • SomeAmateur@sh.itjust.works · 33 points · 2 months ago (edited)

      Speaking of fire trucks, has anyone here ever read the emergency response procedures for Teslas in severe accidents? When I was a volunteer we gave them a look over.

      If I remember right, depending on the model they recommend up to 8,000 gallons (~30k liters) to keep an overheating battery’s temperature stable in case of fire or exposure to high heat. I’ll link the resource page here.

      Our engine holds 700 gallons (~2.6k liters) and the typical tanker in our area holds 2,000 gallons (~7.5k liters).

      That’s a house-fire-level response for a single electric vehicle. Just getting that much water moved to a scene would be challenging. We have tankers, but how many city departments can move that much water? You don’t see hydrants on highways. And foam isn’t effective the way it is for normal car fires. The future will be interesting for firefighters.
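
A quick sanity check of the water figures above (the gallon numbers are the commenter's; the liter conversions are plain arithmetic):

```python
import math

# Figures quoted in the comment above (not official Tesla guidance):
GALLON_L = 3.785  # liters per US gallon

tesla_need = 8000 * GALLON_L  # ~30,000 L recommended for a battery fire
engine = 700 * GALLON_L       # ~2,650 L carried on the engine
tanker = 2000 * GALLON_L      # ~7,570 L per tanker

# Tanker loads still needed once the engine runs dry
trips = math.ceil((tesla_need - engine) / tanker)
print(f"{tesla_need:,.0f} L needed, {trips} tanker loads after the engine is empty")
```

So even with the engine and a full tanker on scene, you'd still be shuttling several more tanker loads of water.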

    • Voroxpete@sh.itjust.works · 22 points · 2 months ago

      It is especially important to understand that Tesla’s struggles with navigation are entirely a result of Elon refusing to equip them with LiDAR. This isn’t some “The tech is really new and really complicated and we’re still figuring it out” problem. There is a very good solution to most collision avoidance scenarios, but Elon refuses to let them use it because he’s an idiot.

  • Tar_Alcaran@sh.itjust.works · 173 up, 1 down · 2 months ago

    For those doing the maths at home:

    An F-35 that obligingly flies top-towards-you (not exactly something it would do, but hey, maybe it’s turning) is all of about 10 m across.

    An AIM-120C can very comfortably hit a target at 100 km.

    At that range, the F-35 subtends about 21 arcseconds, or 0.006 degrees. That’s roughly the size of this period seen from 3 meters away.

    [ . ]

    Good luck spotting that in a sky of roughly the same colour, full of other objects.
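
A back-of-envelope check of the angular-size math above (assuming a ~10 m presented size at 100 km):

```python
import math

size_m = 10.0        # assumed presented size of the aircraft
range_m = 100_000.0  # AIM-120C-class standoff range

angle_rad = size_m / range_m                   # small-angle approximation
angle_arcsec = math.degrees(angle_rad) * 3600  # ~21 arcseconds

# The period analogy: a ~0.3 mm dot viewed from 3 m subtends about the same angle
dot_arcsec = math.degrees(0.0003 / 3.0) * 3600

print(f"aircraft: {angle_arcsec:.1f} arcsec, period at 3 m: {dot_arcsec:.1f} arcsec")
```

Both come out near 21 arcseconds, which is why the period comparison works.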

    • jimbolauski@lemm.ee · 81 up, 3 down · 2 months ago

      You can place cameras anywhere; they don’t need to be right next to what is being targeted. At nearer ranges the AI can identify targets at much higher rates than at the maximum standoff range of an AIM-120C.

    • Miles O'Brien@startrek.website · 48 up, 1 down · 2 months ago

      Pffffffff

      I can see that bright white dot against the dark mode background on my maximum brightness screen with ease! Therefore your argument is invalid!

    • Ledivin@lemmy.world · 23 up, 2 down · 2 months ago

      Yeah but what about the AI? Have you thought about the AI that would be running it, which never misses, and would totally be a useful existing thing? 😉

    • chonglibloodsport@lemmy.world · 20 points · 2 months ago (edited)

      Just for reference: JWST has an optical resolution of about 0.07 arcseconds. That takes a mirror roughly 21 feet (6.5 m) in diameter, though, not something you’d put inside a missile guidance package.
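
For scale, that figure is close to JWST's diffraction limit, which you can estimate from the Rayleigh criterion (the numbers here are assumed round values: 6.5 m aperture, ~2 µm near-infrared wavelength):

```python
import math

def rayleigh_arcsec(wavelength_m: float, aperture_m: float) -> float:
    """Diffraction-limited resolution: theta ~ 1.22 * lambda / D (radians)."""
    theta_rad = 1.22 * wavelength_m / aperture_m
    return math.degrees(theta_rad) * 3600

jwst = rayleigh_arcsec(2e-6, 6.5)  # ~0.08 arcsec, near the quoted 0.07
print(f"JWST diffraction limit near 2 um: {jwst:.3f} arcsec")
```

The same formula shows why a missile-sized aperture can't compete: shrink D and the smallest resolvable angle grows proportionally.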

          • reinei@lemmy.world · 2 points · 2 months ago

            Well but I am!

            Although, we would still need to get it back here… Okay, so first we send two more rockets after it! One to return it on and one with a human engineer on board to pack it back up.

            I mean, we can hardly have it return while unpacked. That would damage the delicate heat baffles! And we need those to shield it from the rocket engine at the back of our missile so it doesn’t start targeting itself because it no longer knows where it is/isn’t…

      • RiceMunk@sopuli.xyz · 8 points · 2 months ago

        Holy shit. I just realised that the reason they’re building the ELT is so they can mount it on a missile and shoot down an F-35 at some point.

    • KillingTimeItself@lemmy.dbzer0.com · 7 points · 2 months ago

      And then there’s dealing with the F-35 itself: even if you managed to lock on and target it, it has electronic-warfare capabilities you have to contend with.

    • Comment105@lemm.ee · 4 up, 1 down · 2 months ago

      Yeah, sure. But that doesn’t matter if you point the AI at it with a really good zoom lens. And then you have a ton of them, pointed in all directions, like the compound eye of a fly. F-35 spotted.

  • DaddleDew@lemmy.world · 149 up, 2 down · 2 months ago

    If a fighter jet is within visual range of a camera, it’s already too late. And that’s if there aren’t any clouds.

    • Slab_Bulkhead@lemmy.world · 66 points · 2 months ago

      You’re not thinking like a Musk. Not if the government pays the subscription and contract for his early-warning camera drone balloon swarm thing, or something something they could run on ketamine or something.

    • finitebanjo@lemmy.world · 1 point · 2 months ago

      That’s part of the reason air defences mostly rely on radar and other parts of the electromagnetic spectrum, measured from at least 3 locations and triangulated to build a precise map of objects in the sky. But just like cameras, that doesn’t work when the objects in question are too high or hidden behind obstacles. From there you can send countermeasures to intercept coordinates and then arm them to search for nearby objects via infrared.
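
The triangulation idea is simple to sketch: two stations each measure a bearing to the same object, and intersecting the bearing lines recovers its position. (A toy 2D version for illustration; real air defence fuses range and angle from several radars.)

```python
import math

def intersect(p1, brg1_deg, p2, brg2_deg):
    """Intersect two bearing lines (compass bearings, 0 deg = north/+y)."""
    d1 = (math.sin(math.radians(brg1_deg)), math.cos(math.radians(brg1_deg)))
    d2 = (math.sin(math.radians(brg2_deg)), math.cos(math.radians(brg2_deg)))
    # Solve p1 + t*d1 == p2 + s*d2 for t via a 2x2 cross-product
    denom = d1[0] * d2[1] - d1[1] * d2[0]
    t = ((p2[0] - p1[0]) * d2[1] - (p2[1] - p1[1]) * d2[0]) / denom
    return (p1[0] + t * d1[0], p1[1] + t * d1[1])

# Object actually at (30, 40); stations at (0, 0) and (60, 0) measure bearings to it
pos = intersect((0, 0), math.degrees(math.atan2(30, 40)),
                (60, 0), math.degrees(math.atan2(-30, 40)))
print(pos)  # ≈ (30.0, 40.0)
```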

      The visible part of the electromagnetic spectrum is pretty much useless in modern weapons. I remember even a tank operator’s display looking totally janky, because they don’t use normal cameras either. Perhaps they wanted data to train machines to do it instead of human operators? Idk, it didn’t make sense to me.

  • riodoro1@lemmy.world · 95 points · 2 months ago

    His fucking obsession with computer vision. He’s so convinced he’s right he forgot that clouds exist… and his cars plow straight into obstacles.

    • Sylvartas@lemmy.world · 32 points · 2 months ago

      Yeah, the “lidar is useless” guy whose cars are consistently crashing into things when visibility is bad is telling us that he can do the same thing with missile targeting systems… Sounds like a great idea

      • riodoro1@lemmy.world · 28 points · 2 months ago

        And a plane at altitude is too small for wide-field cameras, which means scanning the sky with narrow-FOV detectors.
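
Rough numbers on why narrow fields of view are forced (assumed values: a 4K-wide sensor, a ~15 m aircraft, 100 km range):

```python
import math

def pixels_on_target(target_m, range_m, fov_deg, sensor_px=3840):
    """Approximate pixels a target spans, via the small-angle approximation."""
    target_deg = math.degrees(target_m / range_m)
    return target_deg * (sensor_px / fov_deg)

wide = pixels_on_target(15, 100_000, 60)   # 60 deg wide-angle lens: sub-pixel
narrow = pixels_on_target(15, 100_000, 1)  # 1 deg telephoto: tens of pixels
print(f"wide: {wide:.2f} px, narrow: {narrow:.1f} px")
```

A wide lens puts the whole jet inside a fraction of one pixel, so you end up with narrow-FOV detectors that have to scan the sky, exactly the problem described above.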

        • sugar_in_your_tea@sh.itjust.works · 12 points · 2 months ago

          And F-35s are really fast. By the time you recognize it and can target it, it’ll fly behind a cloud or something. So not only do you need a really fast rocket with vision-based AI integrated, it also needs to detect the plane at great distances and maneuver well enough to reacquire it as it exits clouds and whatnot. That’s a lot more complicated than slapping radar on something with heat tracking at close range.