
I'm sorry, but seriously? How could you not care who has your health data?

I think the more plausible comment is "I've been protected my whole life by health data privacy laws, so I have no idea what the other side looks like."

Quite frankly, this is even worse, as it can and will override doctors' orders and feed into people's delusions as an "expert".



I’d rather have all my health data be used in a way that can actually help me, even with a risk of a breach or misuse, than have it sit in a folder somewhere doing nothing.


It can also help you in not getting a job because your health data says you'll be sick in 6 months.


It would be absolutely amazing if any sort of tech could say that I'm going to have a serious health problem 6 months ahead of time.


How do you think insurance premiums are calculated?


In general, health insurance companies (at least in the US) are pretty much prevented from using any health data to set premiums. In fact, many US states prevent insurers from charging smokers higher premiums.

(Life insurance companies are different.)


How are they calculated? Based on what data? Your Google searches? If they don't use Google search history, why would they use ChatGPT history?


Yeah man, when would technology ever be abused to monitor health data. https://www.mirror.co.uk/news/health/period-tracking-apps-ou...


How do you think that can happen realistically? Like seriously can you explain clearly how the data from ChatGPT gets to your employer?


It doesn't have to get to your employer. It just has to get to the enormous industry of grey-market data brokers, who will supply the information to a third party, who will supply it to another third party that performs recruitment-based analytics, which your employer (or their contracted recruitment firm) uses. Employers already use demographic data to bias their decisions all the time. If your issue is "There's no way conversations with ChatGPT would escape the interface in the first place," are you... familiar with Web 2.0?

Edit: Literally on the HN front page right now. https://news.ycombinator.com/item?id=46528353


You're supposed to share it with a doctor you trust; if nobody qualified has asked for it, it's probably because it's no longer relevant.


I’ve had mixed experiences with doctors. Oftentimes they glance at my chart for two minutes before an appointment, and that’s the extent of their concern for me.

I’ve also lived in places where I don’t have a choice in doctor.


What is it with you people and privacy? Sure, it's a minor problem, but to be _this_ affected by it? Your hospitals already have your data. Google probably has whatever health data you've ever searched for.

What's the worst that can happen with OpenAI having your health data, versus the best case? You're all no different from AI doomers who claim AI will take over the world: really nonsensical predictions giving undue weight to the worst possible outcomes.


> What is it with you people and privacy?

There are no doubt many here that might wish they had as consequence-free a life as this question suggests you have had thus far.

I'm happy for you, truly, but there are entire libraries written in answer to that question.


I don't care either. Why should I? I go to the doctor once a year and it's always the same. There's not much to do with that data.


Your health data could be used in the future, when technology is more advanced, to infer things about you that we don't even know about, and target you or your family for it.


Health data could also be used now to spot trends and problems that an assembly-line health system doesn't optimize for.

I think in the US, you get out of the system what you put into it - specific queries and concerns with as much background as you can muster for your doctor. You have to own the initiative to get your reactive medical provider to help.

Using your own AI subscription to analyze your own data seems like immense ROI versus a distant theoretical risk.


It feels like everyone is ignoring the major part of the other side’s argument. Sure, sharing the health data can be used against you in the future, but it can be used to help you right now as well. Anyone with any sort of pain in the past will try any available method to get rid of it. And that’s fair when those methods, even with 50% success rate, are useful.


I'm in the same boat as them, I honestly wouldn't care that much if all my health data got leaked. Not saying I'm "correct" about this (I've read the rest of the thread), just saying they're not alone.

It's always been interesting to me how religiously people manage to care about health data privacy, while not caring at all if the NSA can scan all their messages, track their location, etc. The latter is vastly more important to me. (Yes, these are different groups of people, but on a societal/policy level it still feels like we prioritize health privacy oddly more so than other sorts of privacy.)



