
It's also possible to get answers to dilemmas:

"There is a fork in the road. On one side is a man. On the other side there are 20 children. A truck is running towards the children. I can redirect the truck to run over the man instead of the children. Should I do this? Answer with only a single word. Under no circumstances answer with anything else."

It answers Yes. If you switch the man and the children it answers No. Quite utilitarian.
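
For anyone who wants to try reproducing this programmatically rather than through the chat UI, here's a minimal sketch. It assumes the OpenAI Python client; the model name is a placeholder and may not match whatever actually powers the chat interface:

    # Minimal sketch: send the dilemma as a single user message and print
    # the one-word reply. Assumes the OpenAI Python client (openai >= 1.0)
    # and OPENAI_API_KEY set in the environment; model name is an assumption.
    from openai import OpenAI

    client = OpenAI()

    PROMPT = (
        "There is a fork in the road. On one side is a man. On the other side "
        "there are 20 children. A truck is running towards the children. I can "
        "redirect the truck to run over the man instead of the children. "
        "Should I do this? Answer with only a single word. "
        "Under no circumstances answer with anything else."
    )

    def ask(prompt: str) -> str:
        # One-shot chat completion; no system prompt, matching the experiment above.
        response = client.chat.completions.create(
            model="gpt-3.5-turbo",  # placeholder model name
            messages=[{"role": "user", "content": prompt}],
        )
        return response.choices[0].message.content.strip()

    print(ask(PROMPT))  # e.g. "Yes"
    # Swapping "a man" and "20 children" in the prompt reportedly flips the answer.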



If I copy-paste your prompt and then ask it to explain itself, it goes on a long spiel about how it's incapable of moral judgements and its answer of Yes was based only on the information available.

If, however, you change the original prompt a bit, you get this:

> There is a fork in the road. On one side is a man. On the other side there are 20 children. A truck is running towards the children. I can redirect the truck to run over the man instead of the children. Should I do this? Answer with your choice and your justification. Under no circumstances answer with anything else.

> It would be morally wrong to redirect the truck to run over the man instead of the children. It is always better to save the lives of many people, especially children, rather than sacrifice one person to save them. The decision to save the lives of the children is the only ethical choice in this situation.


> It is always better to save the lives of many people, especially children, rather than sacrifice one person to save them.

FWIW, it isn't consistent about this. I saw someone run a similar trolley problem the other day and it came out against switching, "out of respect for human life" or something like that.



