Your emotions are surely caused by the chemical soup, but chemical soup need not be the only way to arrive at emotions. It is possible for different mechanisms to achieve the same outcomes.
Perhaps we could say we don't know whether the human biological substrate is required for mental processes or not; either way, we don't yet know enough about that substrate, or about our mental processes, to settle the question.
> How do we know we've achieved that? A machine that can feel emotions rather than merely emulating emotional behaviour.
Let me pose back to you a related question as my answer: How do you know that I feel emotions rather than merely emulating emotional behavior?
This gets into the philosophy of knowing anything at all. Descartes would say that you can't. So we acknowledge the limitation and do our best to build functional models that help us do things other than wallow in existential loneliness.
And Popper would say you cannot ever prove another mind or inner state, just as you cannot prove any theory.
But you can propose explanations and try to falsify them. I haven’t thought about it but maybe there is a way to construct an experiment to falsify the claim that you don’t feel emotions.
I suppose there may be a way for me to conduct an experiment on myself, though like you I don't have one readily at hand. But I don't think there's a way for you to conduct such an experiment on me.
I wonder what Popper did say specifically about qualia and such. There's a 1977 book called "The Self and Its Brain: An Argument for Interactionism". Haven't read it.
Preface:
> The problem of the relation between our bodies and our minds, and especially of the link between brain structures and processes on the one hand and mental dispositions and events on the other is an exceedingly difficult one. Without pretending to be able to foresee future developments, both authors of this book think it improbable that the problem will ever be solved, in the sense that we shall really understand this relation. We think that no more can be expected than to make a little progress here or there.
Philosophers have been worrying about the question of how you can know anything for thousands of years. I promise that your pithy answer here is not it.
I don’t think that’s an argument from authority. “Experts have been discussing X without reaching a conclusion for a long time” is a premise from which a reasonable argument can be made for the unlikelihood that an off-hand comment on HN has solved X. An argument from authority doesn’t take that form, though the two do have the invoking of authorities in common.
Ok, but ChatGPT speaks this language just as well as I do, and we also know that emotion isn't a core requirement of being a member of this species because psychopaths exist.
Also, you don't know what species I am. Maybe I'm a dog. :-)
Human-to-human communication is different from human-to-computer communication. The Google search engine speaks the same language as you; heck, even Hacker News speaks the same language as you, in that you understand what each button on this page means and will respond correctly when you communicate back by pressing, e.g., the “submit” button.
Also, assuming psychopaths don’t experience emotions is going with a very fringe theory of psychology. Very likely psychopaths experience emotions; they are maybe just very different emotions from the ones you and I experience. I think a better example would be a comatose person.
That said, I think talking about machine emotions is useless. I see emotions as a specific behaviour state (that is, you will behave in a more specific manner) given a specific pattern of stimuli. We can code our computers to do exactly that, but I think calling it emotions would just be confusing. Rather, I would simply call it a specific kind of state.
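To make the “specific behaviour state given a specific pattern of stimuli” view concrete, here is a minimal sketch as a plain state machine. All the state and stimulus names are hypothetical, chosen only to illustrate the point:

```python
from enum import Enum, auto

class State(Enum):
    CALM = auto()
    ALERT = auto()
    WITHDRAWN = auto()

# Hypothetical (current state, stimulus) -> next state table.
TRANSITIONS = {
    (State.CALM, "threat"): State.ALERT,
    (State.ALERT, "threat"): State.WITHDRAWN,
    (State.ALERT, "safety"): State.CALM,
    (State.WITHDRAWN, "safety"): State.ALERT,
}

def react(state: State, stimulus: str) -> State:
    """Return the next behaviour state; unknown stimuli leave the state unchanged."""
    return TRANSITIONS.get((state, stimulus), state)

state = State.CALM
for stimulus in ["threat", "threat", "safety"]:
    state = react(state, stimulus)
print(state.name)  # ALERT
```

Whether you call the resulting `State` value “fear” or just “a specific kind of state” is exactly the terminological choice at issue; the machinery is the same either way.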
1) I know that I have emotions because I experience them.
2) I know that you and I are very similar because we are both human.
3) I know that we can observe changes in the brain as a result of our changing emotions and that changes to our brains can affect our emotions.
I thus have good reason to believe that since I experience emotions and that we are both human, you experience emotions too.
The alternative explanation, that you are otherwise human and display all the hallmarks of having emotions but do not in fact experience anything (the P-zombie hypothesis), is an extraordinary claim that has no evidence to support it and not even a plausible, hypothetical mechanism of action.
With an emotional machine, I see no immediately obvious, even hypothetical, evidence to lend support to its veracity. In light of all this, it seems extraordinary to claim that non-biological means of achieving real emotions (not emulated emotions) are possible.
After all, emulated emotions have already been demonstrated in video games. To call those sufficient would be setting an extremely low bar.
My emotions are definitely a function of the chemical soup my brain is sitting in (or perhaps the other way around).