I don't think it's silly. Whatever Copilot says is, by extension, said by Microsoft too. So it makes sense for Microsoft not to make themselves liable for whatever people make their product spit out. Especially after incidents like this:
"Microsoft's AI Twitter bot goes dark after racist, sexist tweets"
It could, but Word hasn't wanted you to write offensively for quite some time now. I remember being a teenager just poking around Word 97, and it would promptly tell me "you shouldn't write like that" or something similar.
"Microsoft's AI Twitter bot goes dark after racist, sexist tweets"
https://www.reuters.com/article/us-microsoft-twitter-bot-idU...