• pavnilschanda@lemmy.world (OP, Mod)
    7 months ago

    Considering that the users are Chinese, it's perhaps easier to jailbreak ChatGPT in a non-English language. There are English-speaking users still jailbreaking it today, but they either use a milder version of the once-popular jailbreak or secretly share the updated prompts through DMs.

    ETA: My bad, the user was conversing in English, but the latter explanation still applies.