Hacker News

However, a spokesman for the Campaign to Stop Killer Robots warned of the temptation to use them in warfare.

I had no idea such an organization existed. The future is now, I guess.



It's a serious coalition of NGOs. http://www.stopkillerrobots.org/. They call for a ban on "robotics weapons systems that, once activated, can select and engage targets without further intervention by a human".

Lots of weapons already meet that requirement.


Could you give some examples?



Guided missiles.


They're not selecting targets autonomously; they're just tracking a particular target that the person firing them wants dead.


A broad reading of the language could include even classic target-seeking missiles. A heat-seeking missile is a rudimentary heuristic "AI agent" that makes "decisions" on direction of travel based on looking for hotspots. Sometimes this hits the target the person shooting wanted dead, but other times it hits a different target, because the "AI" made the wrong "decision".

It's easy to use scare quotes here, because the automated behavior is so direct and understandable that we don't really see it as AI. Even when it does something other than what the human intended, it doesn't seem like a rebellious robot, just a heat-seeking missile that happened to be near an unexpected heat source, which it of course followed, and therefore hit the wrong thing. The general idea that the human is giving high-level orders and a robot is making local decisions in an attempt to carry them out is not that different though. The main difference is that the local decision logic is nowadays getting more complex than "find hot thing nearby". But that too is a gradual trend: even old heat-seeking missiles started getting more complex logic, to try to avoid being misled by flares.
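That "find hot thing nearby" heuristic can be sketched in a few lines. This is a toy illustration only, not any real guidance law; all names, positions, and heat values are made up:

```python
def pick_heat_source(seeker_pos, sources, fov=50.0):
    """Return the hottest source within field-of-view range, or None.

    Toy model: each source is a dict with a 1-D "pos" and a "heat" value.
    The "decision" is simply: of everything visible, chase the hottest.
    """
    visible = [s for s in sources if abs(s["pos"] - seeker_pos) <= fov]
    if not visible:
        return None
    return max(visible, key=lambda s: s["heat"])

# A target aircraft, and a hotter decoy flare it just dropped.
aircraft = {"name": "aircraft", "pos": 30.0, "heat": 500.0}
flare = {"name": "flare", "pos": 20.0, "heat": 900.0}

# With only the aircraft visible, the heuristic "decides" correctly...
print(pick_heat_source(0.0, [aircraft])["name"])         # aircraft
# ...but a hotter nearby source flips the "decision" to the flare.
print(pick_heat_source(0.0, [aircraft, flare])["name"])  # flare
```

The point of the toy: nothing here looks like a "rebellious robot", yet the local decision logic, not the human, picks what gets hit; counter-flare logic is just a more complicated version of the same heuristic.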


Well put. It's going to be hard to have a bright-line test, especially when it's implemented by secret software.

The definition of "chemical weapons" is also problematic. Example: https://en.wikipedia.org/wiki/White_phosphorus_use_in_Iraq



It's a great, insightful video.

So, from a friend who was a major in the USAF flying fixed- and rotor-wing aircraft: the current UCMJ and ROEs require an officer to be present for each and every kill decision. Period. If that doesn't happen, it's get-a-lawyer-and-court-martial time. In the near term, the mainstream US military probably isn't the biggest threat in terms of the video's fears; there are too many people watching. The threat is any force of any country that doesn't respect the rules of war.


Very reassuring. Some military guy decides who lives and who dies. The civilian kill rate is likely high (though we can't really judge for sure, as the US military doesn't tell us that stuff); are you really sure the US isn't a big threat? Court martial? That's a slap on the wrist for murder. How severely were those who committed war crimes in Iraq punished? The US might respect most rules of war, but by no stretch does it respect all of them. Hell, it doesn't even accept that a war is occurring in some of the conflicts it is pursuing. How many US drones are firing into areas with a declared war? http://www.theguardian.com/world/2013/sep/22/journalists-web...

edit: For clarity.


I always found it funny that war has rules, when people are just trying to kill other people no matter what.


I remember first hearing that there were rules to war as a kid. I literally couldn't believe it.

You can make the argument that there are some very inhumane/barbaric ways to kill people, and as a "civilized" world, we shouldn't engage in them. But, that line of reasoning always takes you back to, "wait, how about we just not kill people at all?"


I'm sure that after all armies are automated, somebody will eventually realize both sides could save billions and get the same effect by just simulating the whole thing on a computer.


I feel it should be headed by a girl called Yoshimi.


Yeah, they've got an extensive partnership with the Association to Prevent Skynet From Happening.



