What does the term jailbreaking refer to in the context of generative AI?
Detailed Solution:
Correct Answer: Bypassing safety or content restrictions of an AI system
In generative AI, jailbreaking means trying to trick or manipulate an AI model into ignoring its built-in rules or ethical safeguards, allowing it to generate restricted or harmful content. This is similar to bypassing a phone’s security limitations but applied to AI systems.
