WormGPT Is a ChatGPT Alternative With ‘No Ethical Boundaries or Limitations’

    • KairuByte@lemmy.world · 1 year ago

      Not joking, actually. The problem with jailbreak prompts is that they can get your account banned. I’ve already had one account banned, and eventually you can no longer use your phone number to create a new one.

      • Zaphod · 1 year ago

        Oh damn, I didn’t know that. Guess I’d better be careful then.