This might also be an automatic response to prevent discussion, although I'm not sure, since it's MS' AI.

  • plz1@lemmy.world · 4 months ago

    Copilot bails on conversations like your example so often that I'm only using it for help with programming/code snippets at this point. The moment you question its accuracy, bam, chat's over.

    I asked if there was a Copilot extension for VS Code, and it said yup, talked about how to install it, and even how to configure it. That was completely fabricated, and as soon as I asked for more detail to prove it was real, chat's over.

    • DetectiveSanity@lemmy.world · 26 days ago

      That would force them to reveal its sources (unconsented scraping) and hence make them liable for potential lawsuits. As such, they need to avoid revealing sources.