Come on man, you don't actually believe this. If you did, you'd be a psychopath, and you clearly do care about people's lives when it comes to things like climate change. Just because you think AI doom is less likely doesn't mean you should pretend that in that one case you suddenly hold a nihilistic view of human life -- rhetoric matters.
There is no such thing as "universally a positive" unless you assume one. Not just in the sense that there is no one true universal moral value function, but in the sense that "universal moral value function" is essentially gibberish -- as is "bad from a universal perspective". Humanity being wiped out would not be bad from a universal perspective, because nothing is bad from a universal perspective. When we talk about good and bad, we always implicitly mean "from one or more human perspectives".