Schneier - Jailbreaking LLM-Controlled Robots
Surprising no one, it’s easy to trick an LLM-controlled robot into ignoring its safety instructions.
from Schneier on Security https://www.schneier.com/blog/archives/2024/12/jailbreaking-llm-controlled-robots.html