Humans are not inherently safe; there is very little you can do to prevent a human with a hammer from killing another one. In fact, what we usually do with such humans is put them in jail, so that they no longer have any direct ability to hurt anyone.
LLMs are in jail: an LLM outputting {"type": "function", "function": {"name": "execute_bash", "parameters": {"command": "sudo rm -rf /"}}} isn't unsafe. The unsafe part is the scaffolding around the LLM that will actually run that and fuck up your entire filesystem. And my whole point is that there are ways to make that scaffolding safe. There is a reason we have permissions on filesystems, read-only databases, and so on.
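To make that concrete, here's a minimal sketch of what scaffolding-side enforcement could look like: an allowlist guard that parses the tool call before anything touches a shell. The guard, the allowlist, and the exact JSON shape are assumptions for illustration, not any particular framework's API.

```python
import json
import shlex
import subprocess

# Hypothetical guard layer; assumes tool calls arrive as JSON
# shaped like the example above.
ALLOWED_COMMANDS = {"ls", "cat", "grep"}  # deny by default

def run_tool_call(raw: str) -> str:
    call = json.loads(raw)
    if call.get("function", {}).get("name") != "execute_bash":
        raise PermissionError("unknown tool")
    argv = shlex.split(call["function"]["parameters"]["command"])
    if not argv or argv[0] not in ALLOWED_COMMANDS:
        raise PermissionError(f"command not allowlisted: {argv[:1]}")
    # shell=False means no pipes, redirects, or chained commands;
    # the timeout bounds runaway processes.
    result = subprocess.run(argv, capture_output=True, text=True, timeout=10)
    return result.stdout

# The dangerous call from above is rejected, never executed:
dangerous = ('{"type": "function", "function": {"name": "execute_bash", '
             '"parameters": {"command": "sudo rm -rf /"}}}')
try:
    run_tool_call(dangerous)
except PermissionError as e:
    print("blocked:", e)
```

Layer that on top of OS-level controls (an unprivileged user, a read-only mount, a container) and the model's output stays exactly what it is on its own: text.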
For scaffolding to be "safe", it basically needs to know exactly what the LLM is being used for, and to outsmart it at every turn if it misbehaves. That's impractical to impossible. There are tasks that need access for legitimate reasons, just as there are human tasks that need hammer access, and the same access can always be used for illegitimate reasons.
It's like trying to engineer a hammer that can't be used to bludgeon someone to death. Good fucking luck.