It actually isn’t half bad, depending on the model. It won’t be able to help you with side streets, but you can ask for the best route from Texas to Alabama or similar. The results may surprise you.

    • rufus · 6 months ago (edited)

      Yeah, doesn’t really work. I mean, it has a rough idea that it needs to go east, and I’m genuinely surprised that it knows which interstates are in an area and even a few street names in the cities. But I told it to get me from Houston to Montgomery, as in your example. In Houston it just lists random street names that aren’t even connected and sit in different parts of the city. Then it drives north on the I-45 and somehow ends up in the south on the I-610 E and finally the I-10 E. But then it makes up some shit, somehow drives to New Orleans, then a bit back, and zig-zags its way back onto the I-10. Then come some more instructions I didn’t fact-check, and it does get that it needs to go through Mobile and then north on the I-65.

      I’ve tested ChatGPT on Germany, and it also gets which Autobahn connects to the next. It still does occasional zig-zags, and in between it likes to do an entire 50 km (30 mile) loop that ends up two cities back from where it came… then it drives east again and on the second try takes a different exit.

      However: I’m really surprised by the level of spatial awareness. I wouldn’t have expected it to come up with mostly correct cardinal directions, interstates that are actually connected and run through the mentioned cities, and the cities in between.

      I don’t think I need to try “phi”. Small models have very limited knowledge stored inside them; they’re simply too small to remember lots of things.

      So, you were right. Consider me impressed. But I don’t think there is a real-world application for this unless your car has a teleporter built in to deal with the inconsistencies.
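      For reference, a minimal sketch of how a test like this could be run against a local OpenAI-compatible endpoint (the URL, model name, and prompt wording below are assumptions, not the exact setup used):

      ```python
      # Minimal sketch: ask a model for a route, then check the highways by hand.
      # The endpoint URL and model name are placeholders for whatever local server
      # (llama.cpp, Ollama, etc.) exposes an OpenAI-compatible chat API.
      import requests

      prompt = ("Give me turn-by-turn driving directions from Houston, TX to "
                "Montgomery, AL. List the interstates in order.")

      resp = requests.post(
          "http://localhost:8080/v1/chat/completions",  # placeholder endpoint
          json={
              "model": "local-model",  # placeholder model name
              "messages": [{"role": "user", "content": prompt}],
              "temperature": 0,  # keep output stable so runs are comparable
          },
          timeout=120,
      )
      print(resp.json()["choices"][0]["message"]["content"])
      ```

      The whole test is just comparing the listed interstates against a real map (I-10 east through Mobile, then I-65 north to Montgomery).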

      • Possibly linux@lemmy.zip (OP) · 6 months ago

        Yeah, I was heading to Kansas City but ended up in Wyoming. Moral of the story: do not ask an LLM for directions. (Joking)