I believe the idea is that society is prejudiced/biased, so training an AI on data from that society would perpetuate that bias. So there needs to be some manual correction.
> I believe the idea is that society is prejudiced/biased, so training an AI on data from that society would perpetuate that bias. So there needs to be some manual correction.
These activists are free to write their own code that conforms to their ideology.
Then Microsoft should state this very clearly, so that customers who don't share this ideology know they are not wanted and can move away from Microsoft products as far as is practical for them.