Unfortunately there's no money in privacy, and a lot of money in either outright selling data or cutting costs to the bare minimum required to avoid legal liability.<p>Wife and I are expecting our third child, and despite my not doing much googling or research into it (we already know a lot from the first two) the algorithms across the board found out somehow. Even my instagram "Explore" tab that I accidentally select every now and then started getting weirdly filled with pictures of pregnant women.<p>It is what it is at this point. Also I finally got my last settlement check from Equifax, which paid for Chipotle. Yay!
Interestingly, in healthcare there is a correlation between companies that license/sell healthcare data to others (usually they try to do this in a revocable way with very stringent legal terms, but sometimes they just sell it outright if there is enough money involved) and their privacy stance... and it's not what you would think. Often it's these companies that push for more stringent privacy laws and practices. For example, they may claim that they cannot share anonymized data with academic researchers because of xyz virtuous privacy rules, when they are actually the ones making money off of selling patient data. It's an interesting phenomenon I have observed while working in the industry, and it seems to refute your claim that "there's no money in privacy". Another way to think about it: they want to induce a lower overall supply of the commodity they are selling, and they do this by championing privacy rules.
As new moms tend to change their consumer purchasing habits, they are coveted by advertisers. <a href="http://www.nytimes.com/2012/02/19/magazine/shopping-habits.html" rel="nofollow">http://www.nytimes.com/2012/02/19/magazine/shopping-habits.h...</a> Certain cohorts and keywords are very valuable, so even searching a medical condition once, or clicking on a hiring ad for an in-demand job, can shift ads in that direction for a long time.
It seems more important than ever to have self-hosted apps or browser extensions that will intermittently search for these valuable keywords. AdNauseam is much better than bare uBlock Origin for the same reason.
Yeah, I'm less shocked that it got picked up than by how quickly it spread to literally every platform we use, even those that wouldn't have much, if any, hint that it was happening.<p>There's clearly quite an active market for this information.
Could be as simple as buying a bunch of scent-free soap / lotion and some specific vitamin supplements. Target was famously able to detect pregnancy reliably back in 2012 from its own shopping data alone.
> the algorithms across the board found out somehow.<p>It's worth keeping in mind that this is basically untrue.<p>In most of these algorithms, there's no "is_expecting: True" field. There are just some strange vectors of mysterious numbers, which can be more or less similar to other vectors of mysterious numbers.<p>The algorithms have figured out that certain ad vectors are more likely to be clicked if your user vector exhibits some pattern, and that some actions (keywords, purchases, slowing down your scroll speed when you see a particular image) should make your vector go in that direction.
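A toy illustration of that point (tiny made-up vectors; real systems use hundreds of opaque dimensions): nothing in the data says "pregnant," the user vector has just drifted toward the region where baby-ad clickers live, and the ranker picks whichever ad vector is most similar.

```python
import math

def cosine(a, b):
    # Cosine similarity between two embedding vectors.
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

# Hypothetical learned embeddings. No field is labeled "is_expecting";
# clicks, purchases, and scroll pauses have nudged the user vector.
user    = [0.1, 0.2, 0.0, 0.9]
baby_ad = [0.0, 0.3, 0.1, 0.8]
car_ad  = [0.9, 0.1, 0.8, 0.0]

# The ranker simply serves the ad whose vector best matches the user's.
print(cosine(user, baby_ad) > cosine(user, car_ad))  # True
```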
<i>> Unfortunately there's no money in privacy</i><p>But there should be, and there should be punishments for data breaches, or at least compensation for those affected. Then there would be an incentive for corporations to take their users' privacy more seriously.<p>Your personal data is basically the currency of the digital world. This is why data about you is collected left, right, and center. It's valuable.<p>When I trust a bank to safely lock away my grandmother's jewelry, I have to pay for it, but in return, if it just so happens that the bank gets broken into and all my possessions get stolen, at least I'll get (some) compensation.<p>When I give my valuable data to a company, <i>I have already paid them</i> (with the data itself), but I have no recourse whatsoever if they get compromised.
Also on the front page of HN right now is a job posting for Optery (YC W22). Seems like they are growing really fast.
Also possible they have your location if you went to the hospital. Maybe from any Meta "partners" or third party brokers.
FYI, there's a .gov-maintained portal where healthcare companies in the U.S. are legally obliged to publish data breaches. It's an interesting dataset!<p><a href="https://ocrportal.hhs.gov/ocr/breach/breach_report.jsf" rel="nofollow">https://ocrportal.hhs.gov/ocr/breach/breach_report.jsf</a>
This is a suboptimal characterization of this site.<p>I think it would be less wrong to say this is where covered entities that discover reportable breaches of PHI (whether their own or that of a BA) that trigger the immediate reporting obligation report them.<p>This is a narrower scope of coverage and shallower depth of epistemic obligation than you implied.
One of my favorite HIPAA stories is about a doctor who used his patient list to send out campaign material when he was running for local office. Over two decades of schooling, and he still didn't understand how stupid that was.
Almost every company issues a 'we care about your privacy' statement, but there is often very little 'money where your mouth is' behind it.<p>This is why I am almost always very reluctant to give out any information that is not absolutely necessary to provide the service I need. If they don't know it, they can't leak it.<p>Every company wants you to fill out their standard form, which tries to get you to volunteer way more info than they really need.
If it's a paper form, I leave it blank. If it's a digital form and required, I put in the business's own phone number, address, etc.
<i>Several maps created to assist the agency with decisions — like where to open new offices and allocate certain resources — were made public through incorrect privacy settings between 2021 and 2025 ... the mapping website was unable to identify who viewed the maps ... implemented a secure map policy that prohibits uploading customer data to public mapping websites.</i><p>So a state employee/contractor (doesn't say) uploaded unaggregated customer records to a mapping website hosted on the public internet?
I've built healthcare SaaS APIs that required custom integrations with EHR partners, and have consulted on similar apps for others.<p>Beyond common OWASP vulnerabilities, the bigger concern is that EHR and provider service apps lack the robust security practices needed to defend against attacks. They aren't doing active pen testing, red-teaming, or supply chain auditing -- all of the recurring and costly practices necessary to ensure asset security.<p>There are many regulations, HIPAA being the most notable, but their requirements and the audit process are incredibly primitive. They still assume a 1990s threat model. And despite HIPAA audits being expensive, the findings are trivial and the audits are not recurring, so vulnerabilities introduced after one audit can sit unnoticed until the next.
And everyone was fired, the top management has stepped down, and the fines were so massive that nobody ever took a chance with sloppy security ever again. Oh, it's actually the opposite of all that.
I've been saying this forever. Computer security is, and always will be, nothing more than theater with some minimal effort to cover the bases, like hiring an infosec team and then ignoring it. No one in charge cares about security, because the number of people in charge punished for these breaches is still ZERO.
The last time this happened, did the AG prosecute the person who discovered the vulnerable data?
Ah, I think I recall the story you're referring to: reporter Josh Renaud of the St. Louis Post-Dispatch discovered that a public web site was exposing Social Security numbers of teachers in Missouri. He notified the site's administrators, and later published a story about the leak after it was fixed.<p>The governor of Missouri at the time, Mike Parson, called him a hacker and advocated prosecuting him. Fortunately the prosecutor's office declined to file charges though.
I'm sure they "take security very seriously".
I will admit that a level of fatigue has reached me as well. I am not even sure what would be an appropriate remedy at this point. My information has been all over the place given multiple breaches over the past few years (and, I might add, my kid's info too, as we visited a hospital for her once).<p>Anyway, short of collapsing the current data broker system, I am not sure what the answer is. The Experian debacle showed us these companies are too politically entrenched to be touched by regular means.<p>At this point, I am going through life assuming most of my data is up for grabs. That is not a healthy way to live, though.
>I am not even sure what would be an appropriate remedy at this point.<p>It will have to be political, and the fines/damages have to be business-impacting enough for companies to pause and ask: A) is it worth collecting this data and storing it forever? And B) if I don't treat infosec as an important business function, could it cost me my business?<p>It's also clear that certification systems do not work, and any law/policy around this should not offer any upside for acquiring them.<p>EDIT: I also realize that in the United States, this won't happen.
I agree but I think the problem will be if the consequences are that dire then entire classes of business will cease to exist OR the cost of doing things properly will be passed on to the consumer.<p>I struggle to see how data brokers, social media, etc are a net benefit to society so would be happy to see those sorts of businesses cease to exist, but I suspect I'm in the minority.
The entire targeted advertising industry is basically a progressive tax.<p>The "social contract" is that many services are fully or partially financed by advertising. Rich people produce more ad revenue (because they spend more), but they get the same quality of service, effectively subsidizing access for the poorer part of the population, who couldn't afford it otherwise.<p>If we break this social contract down, companies will still try to extract as much revenue as possible, but the only way to do that will be through feature gating, price discrimination, and generally making your life a misery unless you make a lot of money.
The State of Illinois is going to lose its "business" already for other reasons. Do you think there is a reasonable privacy regime that prevents health systems from knowing where their patients live or using that information to site clinics?
Why is my data freely and instantly available within a centralized "health system" to begin with? Why can't we implement a digital equivalent of clunky paper records? Everything E2EE. Local storage requiring in person human intervention to access. When a new provider wants my records from an old one there should be a cryptographic dance involving all three parties. Signed request, signed patient authorization, and then reencryption for the receiving party using the request key.<p>What the health system should impose is a standard for interoperability. Not an internal network that presents a juicy target.
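A minimal sketch of the three-party "cryptographic dance" described above. Everything here is invented for illustration: HMACs over demo secrets stand in for real public-key signatures (in practice the old provider would verify against the other parties' public keys, not hold their secrets), and the re-encryption step is a placeholder.

```python
import hashlib
import hmac
import os

def sign(key, msg):
    # Stand-in for a real digital signature scheme (e.g. Ed25519).
    return hmac.new(key, msg, hashlib.sha256).digest()

# Invented demo keys for the patient and the requesting provider.
patient_key = os.urandom(32)
new_provider_key = os.urandom(32)

request = b"release patient=123 records=immunizations to=new-provider"
request_sig = sign(new_provider_key, request)   # signed request
patient_auth = sign(patient_key, request)       # signed patient consent

def release(record, request, req_sig, auth):
    # The old provider refuses unless BOTH signatures check out.
    if not hmac.compare_digest(req_sig, sign(new_provider_key, request)):
        raise PermissionError("bad requester signature")
    if not hmac.compare_digest(auth, sign(patient_key, request)):
        raise PermissionError("missing patient authorization")
    return record  # placeholder for re-encrypting under the request key

print(release(b"immunization history", request, request_sig, patient_auth))
```

The design point is that no single compromised party can trigger a release: a stolen provider key without a matching patient authorization gets you a `PermissionError`, not a record.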
This has nothing to do with the "data broker system." Reading between the lines it was more of a "shadow IT" issue where employees were using some presumably third-party GIS service for a legitimate business purpose but without a proper authentication & authorization setup.
Assuming your tea-leaf reading is correct, that particular third party would not even exist in its current form without the 'data broker ecosystem'. It is, genuinely, the original sin.
A website where you can upload POIs to a shareable map seems like one of those things that's so obvious and so useful it would exist under almost any economic arrangement of the advertising industry.<p>I get that data brokers and big tech are a much sexier topic, but this breach - like so many of the most pressing threats to our privacy - is a mundane shortage of competence and giving-a-shit in the IT activities of boring old organizations.
Heh. The shareable map is operated by someone, and the information that other people crowdsourced for them for free is even more valuable. If you want a more relatable example, I'd point to a defunct effort (karma or something... I can't find the specifics now) where people were invited to crowdsource all sorts of info on other people. It only got shut down because it was too on the nose. Items like the shareable map you mention, on the other hand, are more easily defensible...<p><< I get that data brokers and big tech are a much sexier topic, but this breach - like so many of the most pressing threats to our privacy - is a mundane shortage of competence and giving-a-shit in the IT activities of boring old organizations.<p>I posit that both could be true at the same time.
Did you actually suffer any negative consequences of these breaches?<p>I see so many comments about how punishments for data breaches should be increased, but not a single story about quantifiable harm that any of those commenters has suffered from them.
It is difficult to read this. On the one hand, for a good chunk of the population that is true, and yet one knows it is absolutely not true for the individuals who will be affected.<p>Since I have multiple breaches under my belt, I could offer you an anecdote, but I won't. Do you know why? Because it is not up to me to quantify the harm that was done, the same way I don't have to explain to a reasonable person why doxxing is not something people should have to suffer through.<p>I have a personal theory as to why this state of affairs persists: the quantifiable harm is small per affected individual but high across the population, and thus underreported. Sadly, the entities that could confirm this are not exactly incentivized to admit they are causing harm in the first place.
If you want to get more stressed about it and consider the impending dystopian future, I invite you to think about the “harvest now, decrypt later” reality that quantum computing may enable.<p>At some point, everything that we have ever assumed to be confidential and secure could be exposed and up for grabs.
Change name to a very common one. Much better privacy.
I’m from a culture in which families use a very small number of highly conserved names and non-standard name positions. I’ve noticed this is sufficient to confuse the low-rent data brokers that do statistical linkage. My parents, grandparents, siblings, and children have all, at various points, shared addresses and landlines and have overlapping names. The brokers are very unclear on how many people are involved, what sexes, what generations, what states.
I grew up around some people with the last name "Null". I often wonder how they're doing for data privacy today.
Would you like 2 years of credit monitoring? Or perhaps you can get $5 from this class action settlement.
Restrict data collection? It would kill all startups and firmly entrench a terrible provider monopoly who can comply.<p>Have the government own data collection? Yeah, I don't even know where to start with all the problems this would cause.<p>Ignore it and let companies keep abusing customers? Nope.<p>Stop letting class-action lawsuits slap the company's wrist and then give $0.16 payouts to everyone?<p>What exactly do we do without killing innovation, building moats around incumbents, giving all the power to politicians who will just do what the lobbyists ask (statistically), or accepting things as they are?
Why do startups need to collect data like this?
We apply crippling fines on companies and executives that let these breaches happen.<p>Yes, some breaches (actual hack attacks) are unavoidable, so you don't slap a fine on <i>every</i> breach. But the vast majority of "breaches" are pure negligence.
> Restrict data collection? It would kill all startups and firmly entrench a terrible provider monopoly who can comply.<p>That's a terrible argument for allowing our data to be sprayed everywhere. How about regulations with teeth that prohibit "dragons" from hoarding data about us? I do not care what the impact is on the "economy". That ship sailed with the current government in the US.<p>Or, both more and less likely, cut us in on the revenue. That would at least compensate for some of the time we have to waste doing a bunch of work every time some company "loses" our data.<p>I'm tired of subsidizing the wealth and capital class. Pay us for holding our data, or make our data toxic.<p>Obviously my health provider and my bank need my data. But no one else does. And if my bank or health provider needs to share my data with a third party, it should be anonymized and tokenized.<p>None of this is hard; we simply lack the will (and most consumers, like voters, are pretty ignorant).
The solution is to anonymize all data at the source, i.e. use a unique randomized ID as the key instead of someone's name/SSN. Then the medical provider would store the UID->name mapping in a separate, easily secured (and ideally air-gapped) system, for the few times it was necessary to use.
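A toy sketch of that split (all names invented): clinical records keyed by a random UID, with the UID-to-identity mapping held in a separate store that would, in practice, live on its own locked-down or air-gapped system.

```python
import secrets

identity_vault = {}  # separate, tightly controlled system: UID -> identity
clinical_db = {}     # the system integrations and most staff actually touch

def admit(name, ssn):
    # Random token, NOT derived from the SSN, so the key itself links nothing.
    uid = secrets.token_hex(16)
    identity_vault[uid] = {"name": name, "ssn": ssn}
    clinical_db[uid] = {"visits": []}
    return uid

uid = admit("Jane Doe", "000-00-0000")
clinical_db[uid]["visits"].append("2025-01-02: prenatal checkup")

# A breach of clinical_db alone leaks histories but no direct identifiers;
# re-identification requires also compromising the vault.
print("ssn" in clinical_db[uid])  # False
```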
<i>...use a unique randomized ID as the key...</i><p>33 bits are all that is required to individually identify any person on Earth.<p>If you'd like to extend that to the 100 billion or so people estimated to have ever lived, that grows to about 37 bits, still a trivially small amount.<p><i>Every bit[1] of leaked data cuts that set in half</i>, and simply anonymising IDs does virtually nothing <i>of itself</i> to obscure identity. Such critical medical and billing data as date of birth and postal code <i>are themselves</i> sufficient to narrow things down remarkably, let alone a specific set of diagnoses, procedures, providers, and medications. Much as browser fingerprints are often unique or nearly so <i>without any universal identifier</i>, so are medical histories.<p>I'm personally aware of diagnostic and procedure codes being used to identify "anonymised" patients across multiple datasets dating to the early 1990s, and of research into de-anonymisation in Australia as of the mid-to-late 1990s. Australia publishes anonymisation and privacy guidelines, e.g.:<p>"Data De‑identification in Australia: Essential Compliance Guide"<p><<a href="https://sprintlaw.com.au/articles/data-de-identification-in-australia-essential-compliance-guide/" rel="nofollow">https://sprintlaw.com.au/articles/data-de-identification-in-...</a>><p>"De-identification and the Privacy Act" (2018)<p><<a href="https://www.oaic.gov.au/privacy/privacy-guidance-for-organisations-and-government-agencies/handling-personal-information/de-identification-and-the-privacy-act" rel="nofollow">https://www.oaic.gov.au/privacy/privacy-guidance-for-organis...</a>><p>It is not <i>merely</i> necessary to substitute an alternative primary key, but <i>also</i> to fuzz data, including birthdates, addresses, diagnostic and procedure codes, treatment dates, etc., all of which both reduces the clinical value of the data <i>and</i> is difficult to do sufficiently.<p>________________________________<p>Notes:<p>1. In the "binary digit" sense, not in the colloquial "small increment" sense.
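The back-of-envelope arithmetic is just ceil(log2(n)), and the same logic shows how quickly a couple of "harmless" fields close the gap (the population and cardinality figures below are assumed round numbers):

```python
import math

def bits_to_identify(n):
    # Bits needed to give everyone in a population of n a unique index.
    return math.ceil(math.log2(n))

print(bits_to_identify(8.2e9))       # 33 bits covers everyone alive today

# Each leaked attribute carries its own identifying bits.
# Assumed rough cardinalities: ~36,500 plausible birthdates (100 years),
# ~42,000 US ZIP codes in use.
print(round(math.log2(36_500)))      # ~15 bits from a full date of birth
print(round(math.log2(42_000)))      # ~15 bits from a 5-digit ZIP code
# Together that's ~30 of the 33 bits: birthdate + ZIP alone nearly
# single out an individual, before any medical codes are considered.
```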
What a silly idea. That would completely prevent federally mandated interoperability APIs from working. While privacy breaches are obviously a problem, most consumers don't want care quality and coordination harmed just for the sake of a minor security improvement.<p><a href="https://www.cms.gov/priorities/burden-reduction/overview/interoperability" rel="nofollow">https://www.cms.gov/priorities/burden-reduction/overview/int...</a>
Honestly, I'd take the 16 cents. Usually it's a discount voucher on a product you'd never buy.<p>Or, if it's a freebie, it's hidden behind a plain-text link three levels deep on their website.
One more reason to overhaul the system: if a health care provider has a security incident, they should be sued for the value of the data - and if that bankrupts them, then other providers will (hopefully) learn from that mistake. Sort of like OSHA.
Sounds like some patients are in for some lucrative free credit and identity monitoring /s
Until we can guarantee privacy and security maybe it’s best we shut down Illinois health care system.
I just heard a chorus of AI agents rejoicing that there's more private data now made public available to train on.