Based on this recent-ish YT video [1], Waymo is also crap, just as Cruise is: any car that stops at a green light, in the left lane, is a danger to all the other drivers it shares the road with.
There's a lot of legitimate criticism in that short video, but the presenter is also extremely dishonest and biased. E.g. they end the video saying they started with "a lot of optimism" when, at the beginning of the ride, they literally said they expected the car to be unsafe. They also seem to mock the AI for obeying traffic laws like speed limits and stop signs, which is a bad start for a video like this.
Someone in the comments also mentions that the destination was on a dead-end street, which may be why the drop-off is a 5-minute walk away. The 5-minute walk was also apparently indicated when she entered the destination. This, along with the pick-up location being on the other side of the road, feels like it might be the result of the AI being overly cautious. I'd find that acceptable for an "autonomous car" but not for one advertised as a replacement for taxis (which, as the presenter mentions, are also used as accessibility aids, where a 5-minute walk uphill can be unacceptable).
There's no footage of the car stopping at the green light, just her saying it's come to a full stop and then an external shot where it's already stopped. That's not enough information to call the stop "unsafe", even if it was impeding traffic. It also stopped with its hazards flashing, assuming Waymo doesn't flash the hazards during the entire ride (which I hope they don't). It's not clear why it stopped there, but based on the little footage we have the stop doesn't seem very abrupt, so any traffic behind it would have had plenty of time to notice and react to the decelerating car in front of them.
The stop seems unnecessary, and because it's AI and Waymo didn't provide additional information, it's impossible to say what caused it. The message she saw also indicates it was caused by an error, presumably a navigation issue. Although it's impossible to tell from the editing, the issue seems to have been resolved within a few seconds. A human driver would probably have decided to just follow the flow of traffic for that amount of time, or change lanes to come to a full stop at the side of the road, but in an urban environment (not at highway speeds) the behavior is not completely unreasonable. What's more concerning to me is that the AI ran into a problem while in traffic that required the car to come to a complete halt and presumably wait for external (human?) intervention, green light or none.
I'd prefer a car that slowly comes to a full stop at a green light because of a software issue over one that keeps dragging a pedestrian it ran over for several seconds to avoid impeding traffic, but the bigger issue with Waymo here is that the car runs into a software issue at all while in traffic.
Have you ever driven in a congested city? Someone stopping out of the blue while the light is green is definitely a danger to everyone around the car he/she/it is driving. Granted, that might not be the case somewhere in the middle of the US where a car passes every 5-10 minutes or so.
[1] https://www.youtube.com/watch?v=-Rxvl3INKSg