Microsoft Details ‘Skeleton Key’ AI Jailbreak Technique

June 28, 2024 at 09:33AM Microsoft recently revealed an artificial intelligence jailbreak technique, called Skeleton Key, able to trick gen-AI models into providing restricted information. The technique was tested on various AI models, potentially bypassing safety measures. Microsoft reported its findings to developers and implemented mitigations in its AI products, including Copilot AI assistants. From … Read more