The jailbreak can be used to prompt a chatbot into prohibited behaviors, including generating content related to explosives, bioweapons, and drugs. Microsoft has uncovered a jailbreak that allows someone ...