Considering that the users are Chinese, it's perhaps easier to jailbreak ChatGPT in a non-English language. There are English-speaking users still using jailbreaks today, but they either rely on a milder version of the once-popular prompt or they're secretly sharing the updated prompts through DMs.
ETA: My bad, the user was conversing in English, but the latter explanation still applies
For me it always just responds:
»I’m sorry, but I can’t comply with that request.«
right after I send a jailbreak. Even with 4o.