Don't miss the point here. Tesla doesn't require that you be a licensed pilot before you buy the car. It doesn't matter what pilots know about autopilots; what matters is what the general public believes when they hear "autopilot". I wouldn't be surprised if 50% of the world believes that an autopilot allows both pilots to leave the cockpit for five or ten minutes. So the word "autopilot" misleads them, even if a pilot would know the limitations.
No, but you are required to be a licensed driver, and the car warns you if you take your hands off the wheel. I don't know how you can claim that the meaning of "autopilot" isn't clear to drivers.
Did getting your driver's license involve training in how to handle autopilot, its common errors, and how to correct them? If not, I'm not sure how having a driver's license is relevant.
The problem is that if the accident was anything like this video[1], then the driver barely had any time to actively correct the problem, a problem that a well-worn, well-understood system combined with driver training would have prepared the driver for.
Also, note that the beep is the driver taking over from autopilot.
Based on that video, all that is required is that you stay in the lane. If you are paying attention, you have plenty of time to do that. If you are not paying attention and/or your hands are not on the wheel, I could see how you would struggle to react in time.
But blaming this on the use of the term "Autopilot" is idiotic, and discussion centered on it is silly when there are much more important questions to answer about the design of Autopilot, other driver-assistance systems, and the future of self-driving cars.