Perhaps it's a cynical way to look at it, but in the days of the war on general purpose computing and locked-down devices, I have to consider the news in terms of how it could be used against users and device owners. I don't know enough to provide useful analysis, so I won't try, but will instead pose questions to the much smarter people who might have some interesting thoughts to share.<p>There are two non-exclusive paths I'm thinking about at the moment:<p>1. DRM: Might this enable a next level of DRM?<p>2. Hardware attestation: Might this enable a deeper level of hardware attestation?
Just to level set here: I think it's important to realize this is really focused on allowing things like search to operate on encrypted data. This technique allows you to perform an operation on the data without decrypting it. Think of a row in a database with email, first name, last name, and mailing address. You want to search by email to retrieve the other fields, but don't want that data unencrypted since it is PII.<p>In general, this solution would be expensive and targeted at data lakes, or areas where you want to run computation without necessarily exposing the data.<p>With regard to DRM, one key thing to remember is that DRM has to be cheap and widely deployable. Part of the reason DVDs were easily broken is that the chosen algorithm was computationally inexpensive, so it could be installed on as many clients as possible.
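As a toy illustration of the "compute on ciphertexts" idea (this sketch uses Paillier's additively homomorphic scheme, which is far weaker than full FHE, with deliberately tiny and insecure primes):

```python
import math, random

# Toy Paillier cryptosystem: additively homomorphic only, NOT full FHE,
# and with laughably small primes. Illustration of the concept, nothing more.
p, q = 61, 53
n = p * q                      # public modulus
n2 = n * n
lam = math.lcm(p - 1, q - 1)   # private key
mu = pow(lam, -1, n)           # valid because we pick generator g = n + 1

def encrypt(m: int) -> int:
    r = random.randrange(1, n)
    while math.gcd(r, n) != 1:
        r = random.randrange(1, n)
    # g^m * r^n mod n^2, where g = n + 1 gives g^m = 1 + m*n (mod n^2)
    return (1 + m * n) * pow(r, n, n2) % n2

def decrypt(c: int) -> int:
    L = (pow(c, lam, n2) - 1) // n
    return L * mu % n

c1, c2 = encrypt(17), encrypt(25)
# Multiplying ciphertexts adds the hidden plaintexts -- the server never
# sees 17, 25, or 42 in the clear:
assert decrypt(c1 * c2 % n2) == 42
```

A real FHE scheme supports arbitrary circuits (both addition and multiplication on ciphertexts), which is what makes encrypted search and encrypted analytics possible, and also what makes it so expensive.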
This is an exceptionally good point. For example, I suspect two major reasons DRM has been more successful on game consoles than video players are the much smaller ecosystems and much larger BOMs, not necessarily in that order.
> how it could be used against the users and device owners<p>Same here.<p>Can't wait to KYC myself in order to use a CPU.
KYC = Kill Your Conscience<p>It's truly amazing how modern people just blithely sacrifice their privacy and integrity for no good reason, just to let big tech corporations more efficiently siphon money out of the market. And then they fight you passionately when you call out those companies for being unnecessarily invasive and intrusive.<p>The four horsemen of the infocalypse are such profoundly reliable boogeymen that we really need a huge psychological study across all modern cultures to see why they're so effective at dismantling rational thought in the general public, and how we can inoculate society against it without damaging other important social behaviors.
They probably meant "know your customer", you know, where you have to submit to an anal probe to think about getting a bank account and withdrawing more than $8 of cash at a time will trigger a suspicious activity report for money laundering/tax evasion while the Epstein class are getting away with the most heinous crimes possible.
I'm still waiting for the first password manager to incorporate biometrics and security questions, as predicted decades ago by Douglas Adams:<p><i>There were so many different ways in which you were required to provide absolute proof of your identity these days that life could easily become extremely tiresome just from that factor alone, never mind the deeper existential problems of trying to function as a coherent consciousness in an epistemologically ambiguous physical universe. Just look at cash point machines, for instance. Queues of people standing around waiting to have their fingerprints read, their retinas scanned, bits of skin scraped from the nape of the neck and undergoing instant (or nearly instant; a good six or seven seconds in tedious reality) genetic analysis, then having to answer trick questions about members of their family they didn't even remember they had, and about their recorded preferences for tablecloth colours. And that was just to get a bit of spare cash for the weekend. If you were trying to raise a loan for a jetcar, sign a missile treaty or pay an entire restaurant bill things could get really trying.</i><p><i>Hence the Ident-i-Eeze. This encoded every single piece of information about you, your body and your life into one all-purpose machine-readable card that you could then carry around in your wallet, and therefore represented technology's greatest triumph to date over both itself and plain common sense.</i>
KYC is generally a force for good because it prevents fraud. While it is not reasonable for Discord to collect your identity, it is a fair requirement for a bank account, because money laundering is a serious problem worth preventing.<p>The reason the 'Epstein class' are able to get away with crimes is that in recent US elections voters elected politicians who are intentionally not investigating those crimes, and who even pardoned some criminals convicted of them.
<a href="https://www.smbc-comics.com/comic/2014-05-27" rel="nofollow">https://www.smbc-comics.com/comic/2014-05-27</a>
> how it could be used against the users<p>We are no longer their clients; we are just another product to sell. So they do not design chips for us, but for the benefit of other corporations.<p>3. Unskippable ads with data gathering at the CPU level.
I distinctly remember, in one of my more senior classes at university, designing logic gates, chaining together ANDs, NANDs, ORs, NORs, and XORs, then working our way up to numerical processors, ALUs, and eventually latches, RAM, and CPUs. The capstone was creating an assembly language to control it all.<p>I remember thinking how fun it was! I could see unfolded before me endless ways to configure, reconfigure, optimize, etc.<p>I know there are a few open source chip efforts, but I wonder whether now is the time to pull the community together and organize more intentionally around that. Maybe open source chipsets won't be as fast as their corporate counterparts, but I think we are definitely at an inflection point in society where we need this to maintain freedom.<p>If anyone is working in that area, I am very interested. I am very green, but I still have the old textbooks I could dust off (just don't have the ole college-provided Mentor Graphics -- or I guess Siemens now -- design tool anymore).
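That bottom-up construction can be sketched in a few lines, for example building a ripple-carry adder out of nothing but a NAND primitive (a hypothetical Python sketch in the "NAND to Tetris" spirit, not any particular course's toolchain):

```python
# Everything derived from one primitive gate.
def nand(a, b): return 1 - (a & b)

def not_(a):    return nand(a, a)
def and_(a, b): return not_(nand(a, b))
def or_(a, b):  return nand(not_(a), not_(b))

def xor(a, b):
    # XOR built from four NANDs
    t = nand(a, b)
    return nand(nand(a, t), nand(b, t))

def full_adder(a, b, cin):
    """One bit of addition; returns (sum, carry_out)."""
    s1 = xor(a, b)
    return xor(s1, cin), or_(and_(a, b), and_(s1, cin))

def add4(x, y):
    """Ripple-carry addition of two 4-bit numbers."""
    carry, out = 0, 0
    for i in range(4):
        s, carry = full_adder((x >> i) & 1, (y >> i) & 1, carry)
        out |= s << i
    return out | (carry << 4)

assert add4(0b1011, 0b0110) == 0b10001  # 11 + 6 = 17
```

From here an ALU is just a mux over several such circuits, which is exactly the kind of layering that makes the open-hardware idea feel tractable.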
I was just thinking about this a few days ago, but not just for the CPU (for which we have RISC-V and OpenPOWER): for an entire system, including the GPU, audio, disk controllers, networking, etc. I think a great target would be mid-2000s graphics and networking; I could go back to a 2006 Mac Pro without too much hardship. Having a fully-open equivalent to mid-2000s hardware would be a boon for open computing.
Sounds like you might want to go play with RISC-V, either in hardware or emulation.
There's no point. The big chip makers control all the billion dollar fabs. Governments and corporations can easily dictate terms. We'll lose this battle unless we develop a way to cheaply fabricate chips in a garage.<p>The future is bleak.
I don't think it's applicable to DRM because you eventually need the decrypted content: DRM is typically used for books, music, video, etc., you can't enjoy an encrypted video.<p>I think eGovernment is the main use case: not super high traffic (we're not voting every day), but very high privacy expectations.
Yes it must be decrypted eventually, but I've read about systems (I think HDMI does this) where the keys are stored in the end device (like the TV or monitor) that the user can't access. Given that we already have that, I think I agree that this news doesn't change anything, but I wonder if there are clever uses I haven't thought of
See: <a href="https://news.ycombinator.com/item?id=47323743">https://news.ycombinator.com/item?id=47323743</a><p>It's not related to DRM or trusted computing.
1. The private key is required to see <i>anything</i> computed under FHE, so DRM is pretty unlikely.<p>2. No, anyone can run the FHE computations anywhere on any hardware if they have the evaluation key (which would also have to be present in any FHE hardware).
I'm also thinking of what happens when quantum computing becomes available.<p>But when homomorphic encryption becomes efficient, perhaps governments can force companies to apply it (though they would lose their opportunity for backdooring, but E2EE is a thing too so I wouldn't worry too much).
If we reach the point where society feels that privacy requires encryption during computation, a product like this (or anything else in the supply chain) is not going to save them.
My thought is half cynical. As LLM crawlers seek to mop up absolutely everything, companies themselves are starting to worry more about keeping their own data secret. Maybe that is a reason for shifts like this, as encrypted and other privacy-preserving products become more in demand across the board.
No, because of the fundamental limitation of DRM. Content must be delivered as plaintext.
This is quite the opposite: it's better than what we have now.<p>It raises the hurdle for those looking to surveil.<p>If a tree falls in the forest and no one is around to hear it, does it make a sound?<p>This is primarily for cloud compute, I'd imagine, AI specifically, as it's generally not feasible/possible to run state of the art models locally.
Think GDPR and data sovereignty concerns, many demand privacy and can't use services without it.
Regarding DRM, you could use stream ciphers and other well-understood cryptography schemes with an FHE chip like this to create an effectively tamper-proof and interception-proof OS, with the FHE chip supplementing normal processors. You'd basically be setting up E2EE between the streaming server and the display, audio output, or other stream target, and there'd be no way to intercept or inspect unencrypted data without breaking the device. Add modern tamper detection and you get a very secure setup with modern performance, the FHE chip basically just handling keys and encapsulation operations, which have fairly low compute and bandwidth needs. That gives you DRM and attestation both, as well as fairly dystopian manufacturer and corporate controls over devices users should own.
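The E2EE-to-the-display idea reduces to something very simple: the server and the output device share a key the OS never holds, and everything in between routes only ciphertext. A toy sketch (this uses SHA-256 in counter mode as a stand-in keystream; a real system would use ChaCha20 or AES-GCM, and the key names here are made up for illustration):

```python
import hashlib

def keystream_xor(key: bytes, nonce: bytes, data: bytes) -> bytes:
    """Toy stream cipher: SHA-256 over (key, nonce, counter) as keystream.
    Illustrates the idea only -- use ChaCha20/AES-GCM in real systems."""
    out = bytearray()
    for offset in range(0, len(data), 32):
        ks = hashlib.sha256(
            key + nonce + offset.to_bytes(8, "big")
        ).digest()
        chunk = data[offset:offset + 32]
        out.extend(b ^ k for b, k in zip(chunk, ks))
    return bytes(out)

# Hypothetical session key held only by the server and the display panel,
# never by the OS, drivers, or anything the user can inspect.
session_key, nonce = b"\x01" * 32, b"\x02" * 12
frame = b"one frame of video data"

wire = keystream_xor(session_key, nonce, frame)          # what the OS routes
assert wire != frame                                      # opaque in transit
assert keystream_xor(session_key, nonce, wire) == frame   # display decrypts
```

The dystopian part is exactly that key placement: once the endpoint is silicon the owner cannot open, the same construction that protects a user's data against the cloud protects a vendor's content against the user.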
Regarding DRM, I don't see how it'll survive "camera in front of the screen" + "AI video upscaling" once the second part is good enough. You can't DRM the gap between the screen and your eyes. Until they put DRM in Neuralink.
> Can't DRM between the screen and your eyes.<p>No, but media can be watermarked in imperceptible ways, and then if all players are required to check and act on such watermarks, the gap becomes narrow enough to probably be effective.<p>See Cinavia.
Sure, but we already have good enough players, open source even, that don't support this technology, and recent codecs have, if anything, become more open, so this only seems problematic for playback on non-general purpose computing devices like smart TVs, set top boxes, and <i>maybe</i> smartphones, tablets, and battery-powered PCs if the tech is incorporated into hardware decoders for all acceptable codecs.