I'm sorry, but seriously? How could you not care who has your health data?
I think the more plausible reading is "I've been protected my whole life by health data privacy laws, so I have no idea what the other side looks like".
Quite frankly, this is even worse, as it can and will override doctors' orders and feed into people's delusions as an "expert".
I’d rather have all my health data be used in a way that can actually help me, even with a risk of a breach or misuse, than have it sit in a folder somewhere doing nothing.
In general, health insurance companies (at least in the US) are largely prevented from using health data to set premiums. In fact, many US states prevent insurers from charging smokers higher premiums.
It doesn't have to get to your employer directly; it just has to get to the enormous industry of grey-market data brokers, who will supply the information to a third party, who will supply it to another third party that performs recruitment analytics, which your employer (or their contracted recruitment firm) then uses. Employers already use demographic data to bias their decisions all the time. If your objection is "there's no way conversations with ChatGPT would escape the interface in the first place," are you... familiar with Web 2.0?
I’ve had mixed experiences with doctors. Oftentimes they’re glancing at my chart for two minutes before an appointment, and that’s the extent of their concern for me.
I’ve also lived in places where I don’t have a choice in doctor.
What is it with you people and privacy? Sure, it's a minor problem, but to be _this_ affected by it? Your hospitals already have your data. Google probably has whatever health data you've ever searched for.
What's the worst that can happen with OpenAI having your health data, versus the best case? You're no different from AI doomers who claim AI will take over the world: nonsensical predictions that give undue weight to the worst possible outcomes.
Your health data could be used in the future, when technology is more advanced, to infer things about you that we don't even know about, and target you or your family for it.
Health data could also be used now to spot trends and problems that an assembly-line health system doesn't optimize for.
I think in the US, you get out of the system what you put into it: specific queries and concerns, with as much background as you can muster for your doctor. You have to own the initiative to get your reactive medical provider to help.
Using your own AI subscription to analyze your own data seems like immense ROI versus a distant theoretical risk.
It feels like everyone is ignoring the main part of the other side’s argument. Sure, shared health data can be used against you in the future, but it can also be used to help you right now. Anyone who has dealt with any sort of pain will try any available method to get rid of it. And that’s fair when those methods are useful, even with a 50% success rate.
I'm in the same boat as them, I honestly wouldn't care that much if all my health data got leaked. Not saying I'm "correct" about this (I've read the rest of the thread), just saying they're not alone.
It's always been interesting to me how religiously people manage to care about health data privacy while not caring at all that the NSA can scan all their messages, track their location, etc. The latter is vastly more important to me. (Yes, these are different groups of people, but on a societal/policy level it still feels like we prioritize health privacy oddly more than other sorts of privacy.)