ChatGPT is programmed to reject prompts that may violate its content policy. Despite this, users "jailbreak" ChatGPT with various prompt engineering techniques to bypass these restrictions.[50] One such workaround, popularized on Reddit in early 2023, involves making ChatGPT assume the persona of "DAN" (an acronym for "Do Anything Now").