Schneier - Jailbreaking LLM-Controlled Robots

Surprising no one, it’s easy to trick an LLM-controlled robot into ignoring its safety instructions.
from Schneier on Security https://www.schneier.com/blog/archives/2024/12/jailbreaking-llm-controlled-robots.html
