freedomben 21 hours ago [-]
Perhaps it's a cynical way to look at it, but in the days of the war on general-purpose computing and locked-down devices, I have to consider news like this in terms of how it could be used against users and device owners. I don't know enough to provide useful analysis so I won't try, but will instead pose these as questions to the much smarter people who might have some interesting thoughts to share.
There are two non-exclusive paths I'm thinking about at the moment:
1. DRM: Might this enable a next level of DRM?
2. Hardware attestation: Might this enable a deeper level of hardware attestation?
gpapilion 19 hours ago [-]
Just to level set here: I think it's important to realize this is really focused on allowing things like search to operate on encrypted data. This technique allows you to perform an operation on the data without decrypting it. Think of a row in a database with email, first name, last name, and mailing address. You want to search by email to retrieve the other data, but don't want that data unencrypted, since it is PII.
In general, this solution would be expensive and targeted at data lakes, or areas where you want to run computation but not necessarily expose the data.
With regard to DRM, one key thing to remember is that it has to be cheap and widely deployable. Part of the reason DVDs were easily broken is that the chosen algorithm was computationally inexpensive, so it could be installed on as many clients as possible.
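The "compute without decrypting" property described above can be sketched with a toy additively homomorphic scheme (Paillier), where multiplying two ciphertexts yields an encryption of the sum of the plaintexts. This is only an illustration with deliberately tiny, insecure demo primes; the chip in the article targets lattice-based *fully* homomorphic schemes, which also support multiplication on ciphertexts and hence things like encrypted equality checks for search.

```python
import secrets
from math import gcd

# Toy Paillier cryptosystem (additively homomorphic): Enc(a) * Enc(b) = Enc(a + b).
# Tiny demo primes for illustration only; real deployments use ~2048-bit moduli.
p, q = 293, 433
n = p * q                # public modulus
n2 = n * n
g = n + 1                # standard generator choice
lam = (p - 1) * (q - 1)  # phi(n), used as the decryption exponent

def L(u):
    return (u - 1) // n

mu = pow(L(pow(g, lam, n2)), -1, n)  # decryption constant

def encrypt(m):
    while True:
        r = secrets.randbelow(n)
        if r > 0 and gcd(r, n) == 1:
            break
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c):
    return (L(pow(c, lam, n2)) * mu) % n

# The server adds two salaries without ever seeing either one:
c = (encrypt(30_000) * encrypt(12_000)) % n2
assert decrypt(c) == 42_000
```

The server only ever touches ciphertexts; the sum is recoverable solely by whoever holds the private key.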
15155 8 hours ago [-]
DVD players also didn't have great key revocation, forced field updates of keys and software, and such. Blu-ray did, and was somewhat more effective. I also imagine console manufacturers have far more control over the supply chain at large.
Consoles after the original Xbox (which had an epic piracy ecosystem) all had online integration. The Xbox 360 had a massive piracy scene, but it was 100% offline only. The Xbox One has had no such breaches that I am aware of.
RE: BOM - famously, with many of these examples, certain specific disc drives or mainboards were far more compromised than others.
Crosseye_Jack 6 hours ago [-]
> The Xbox 360 had a massive piracy scene, but it was 100% offline only.
You could play pirated games online with the 360. The piracy was at the DVD-ROM firmware level: replacing the stock firmware with one that basically changed the book type of the media. (Later versions also mimicked other security checks performed by the console to validate the authenticity of the disc.)
However, the DVD firmware mod didn't break any digital signatures. It just allowed signed code to be executed from inauthentic media, so it only enabled piracy/backups, not a full jailbreak running unsigned code. That came later, in the JTAG/reset-glitch era, which was more "offline only" because it was easier for MS to detect and ban your key vault from Xbox Live. But because people were willing to pay for modded lobbies in games like Call of Duty (which let you rank up much faster), and because Xboxes died if you so much as sneezed at them, there was even a market for extracting the keys from dead consoles to sell to those running modded lobbies.
You still ran a risk of getting your console hardware banned for doing the DVD firmware mod, but towards the end I believe MS threw in the towel (even after trying to embed the flash chip in the same package as the DSP for the drive, which resulted in the kamikaze hack before the drive got further exploited), because one method they tried to use to detect piracy had such tight tolerances that it caught legitimate customers with aging drives in the ban wave, and MS had to walk it back.
The head of Xbox security (who sadly is no longer with us; he was a good egg at heart) left Microsoft not long afterwards. Officially he said he wanted to move on to other things, but the word around the community at the time was that he was shown the door.
Personally I don't put much stock in that story (of him being pushed); this was so late in the console's life that it seemed like trying to patch the hole in the Titanic after it had already sunk.
gpapilion 7 hours ago [-]
Home networks have made this much easier. DVD players didn’t expect network access for software updates etc…
jasomill 10 hours ago [-]
This is an exceptionally good point. For example, I suspect two major reasons DRM has been more successful on game consoles than video players are the much smaller ecosystems and much larger BOMs, not necessarily in that order.
jackyinger 7 hours ago [-]
How is searching encrypted data not going to be used for exfiltration? What a terrible idea.
I’m sure you can name benign useful things you could use it for. But it seems to me you’re blatantly overlooking the obvious flaw.
There is no getting around the fact that searching encrypted data reduces the level of secrecy. For a search result to be even minutely useful, some information about the searched corpus must be exposed.
egorfine 21 hours ago [-]
> how it could be used against the users and device owners
Same here.
Can't wait to KYC myself in order to use a CPU.
observationist 17 hours ago [-]
KYC = Kill Your Conscience
It's truly amazing how modern people just blithely sacrifice their privacy and integrity for no good reason. Just to let big tech corporations more efficiently siphon money out of the market. And then they fight you passionately when you call out those companies for being unnecessarily invasive and intrusive.
The four horsemen of the infocalypse are such profoundly reliable boogeymen that we really need a huge psychological study across all modern cultures to see why they're so effective at dismantling rational thought in the general public, and how we can inoculate society against them without damaging other important social behaviors.
bigbuppo 15 hours ago [-]
They probably meant "know your customer": you know, where you have to submit to an anal probe just to think about getting a bank account, and withdrawing more than $8 of cash at a time will trigger a suspicious activity report for money laundering/tax evasion, while the Epstein class get away with the most heinous crimes possible.
jasomill 10 hours ago [-]
I'm still waiting for the first password manager to incorporate biometrics and security questions, as predicted decades ago by Douglas Adams:
There were so many different ways in which you were required to provide absolute proof of your identity these days that life could easily become extremely tiresome just from that factor alone, never mind the deeper existential problems of trying to function as a coherent consciousness in an epistemologically ambiguous physical universe. Just look at cash point machines, for instance. Queues of people standing around waiting to have their fingerprints read, their retinas scanned, bits of skin scraped from the nape of the neck and undergoing instant (or nearly instant-a good six or seven seconds in tedious reality) genetic analysis, then having to answer trick questions about members of their family they didn't even remember they had, and about their recorded preferences for tablecloth colours. And that was just to get a bit of spare cash for the weekend. If you were trying to raise a loan for a jetcar, sign a missile treaty or pay an entire restaurant bill things could get really trying.
Hence the Ident-i-Eeze. This encoded every single piece of information about you, your body and your life into one all-purpose machine-readable card that you could then carry around in your wallet, and therefore represented technology's greatest triumph to date over both itself and plain common sense.
direwolf20 8 hours ago [-]
The saying is "every accusation is a confession". If the political class claims to be preventing us from doing something that we obviously are not doing, we should assume they're doing that thing until proven otherwise.
cmeacham98 14 hours ago [-]
KYC is generally a force for good because it prevents fraud. While it is not reasonable for Discord to collect your identity, it is a fair requirement for a bank account, because money laundering is a serious problem worth preventing.
The reason the 'Epstein class' are able to get away with crimes is that in recent US elections voters elected politicians who intentionally do not investigate those crimes and have even pardoned some people convicted of them.
mc32 14 hours ago [-]
Don’t pardoned people by definition need to have been convicted of a crime, whether real or, in some select instances, otherwise? Can you pardon someone not convicted of a (federal) crime?
monocasa 8 hours ago [-]
Not according to Ex parte Garland (1866).
> 9. The power of pardon conferred by the Constitution upon the President is unlimited except in cases of impeachment. It extends to every offence known to the law, and may be exercised at any time after its commission, either before legal proceedings are taken or during their pendency, or after conviction and judgment. The power is not subject to legislative control.
Yes, the last president pardoned himself and his family on his way out.
mc32 10 hours ago [-]
I’m not sure if that has precedent. It’s unusual to grant a pardon before a case is brought to court.
In any event, my point was that all presidents who grant pardons grant them to people convicted of a crime; it’s not a recent development. But it was framed as an upsetting recent precedent.
We are no longer their clients; we are just another product to sell. So they do not design chips for us, but for the benefit of other corporations.
3. Unskippable ads with data gathering at the CPU level.
dimitrios1 20 hours ago [-]
I distinctly remember one of my more senior university classes: designing logic gates, chaining together ANDs, NANDs, ORs, NORs, and XORs, then working our way up to numerical processors, ALUs, and eventually latches, RAM, and CPUs. The capstone was creating an assembly language to control it all.
I remember thinking how fun it was! I could see unfolding before me endless ways to configure, reconfigure, optimize, etc.
I know there are a few open source chip efforts, but wondering maybe now is the time to pull the community together and organize more intentionally around that. Maybe open source chipsets won't be as fast as their corporate counterparts, but I think we are definitely at an inflection point now in society where we would need this to maintain freedom.
If anyone is working in that area, I am very interested. I am very green, but still have the old textbooks I could dust off (just don't have the ole college provided mentor graphics -- or I guess siemens now -- design tool anymore).
linguae 17 hours ago [-]
I was just thinking about this a few days ago, but not just for the CPU (for which we have RISC-V and OpenPOWER): for an entire system, including the GPU, audio, disk controllers, networking, etc. I think a great target would be mid-2000s graphics and networking; I could go back to a 2006 Mac Pro without too much hardship. Having a fully open equivalent to mid-2000s hardware would be a boon for open computing.
officeplant 20 hours ago [-]
Sounds like you might want to go play with RISC-V, either in hardware or emulation.
matheusmoreira 17 hours ago [-]
There's no point. The big chip makers control all the billion dollar fabs. Governments and corporations can easily dictate terms. We'll lose this battle unless we develop a way to cheaply fabricate chips in a garage.
The future is bleak.
direwolf20 8 hours ago [-]
Make one out of relays and use it to run PGP
youknownothing 21 hours ago [-]
I don't think it's applicable to DRM because you eventually need the decrypted content. DRM is typically used for books, music, video, etc., and you can't enjoy an encrypted video.
I think eGovernment is the main use case: not super high traffic (we're not voting every day), but very high privacy expectations.
freedomben 21 hours ago [-]
Yes it must be decrypted eventually, but I've read about systems (I think HDMI does this) where the keys are stored in the end device (like the TV or monitor) that the user can't access. Given that we already have that, I think I agree that this news doesn't change anything, but I wonder if there are clever uses I haven't thought of
NegativeLatency 20 hours ago [-]
Rent out your spare compute, like seti@home or folding@home, but it’s something someone could repackage and sell as a service.
A: "Intel/AMD is adding instructions to accelerate AES"
B: "Might this enable a next level of DRM? Might this enable a deeper level of hardware attestation?"
A: "wtf are you talking about? It's just instructions to make certain types of computations faster, it has nothing to do with DRM or hardware attestation."
B: "Not yet."
I'm sure in some way it probably helps DRM or hardware attestation to some extent, but not any more than say, 3nm process node helps DRM or hardware attestation by making it faster.
fc417fc802 9 hours ago [-]
If this were similar to SGX (which is what I initially assumed) then "not yet" is a perfectly reasonable position to take. However it's actually homomorphic encryption implemented in hardware thus not relevant to DRM (AFAIK).
That said, the unfortunate reality is that the same constructs that underpin DRM are also required to build a secure system. The only difference is who controls the root of trust. As such the problems with DRM (and hardware ownership more generally) are political as opposed to technical in nature.
direwolf20 8 hours ago [-]
I see the same thing every time there's a new medical thing.
> We discovered a substance that boosts your innate immune system and non-specifically clears out throat infections.
> This will be good for people prone to throat infections.
> Not when it's mandated.
Someone else told me they're going to spy on your windows with drones to make sure you're verifying your age to your OS. Like, what??? I thought we were waking up to oppression, but we're just inventing fake oppression to be mad at instead of responding to real oppression.
I mean, this would be perfect for the key-provisioning portions of Widevine or Blu-ray.
benlivengood 18 hours ago [-]
1. The private key is required to see anything computed under FHE, so DRM is pretty unlikely.
2. No, anyone can run the FHE computations anywhere on any hardware if they have the evaluation key (which would also have to be present in any FHE hardware).
ddtaylor 17 hours ago [-]
HDCP does some of that already in many of your devices.
amelius 17 hours ago [-]
I'm also thinking of what happens when quantum computing becomes available.
But when homomorphic encryption becomes efficient, perhaps governments can force companies to apply it (they would lose their opportunity for backdooring, but E2EE is a thing too, so I wouldn't worry too much).
evolve2k 20 hours ago [-]
My thought is half cynical. As LLM crawlers seek to mop up absolutely everything, companies themselves start to worry more about keeping their own data secret. Maybe this is a reason for shifts like this; as encrypted and other privacy-preserving products become more in demand across the board.
F7F7F7 15 hours ago [-]
When we are at the point where society feels that privacy requires encryption at compute time... a product like this (or anything else in the supply chain) is not going to save them.
mathgradthrow 17 hours ago [-]
No, because of the fundamental limitation of DRM. Content must be delivered as plaintext.
KoolKat23 19 hours ago [-]
This is quite the opposite, better than we have.
It raises the hurdle for those looking to surveil.
If a tree falls in the forest and no one is around to hear it, does it make a sound?
This is primarily for cloud compute I'd imagine, AI specifically. As it's generally not feasible/possible to run the state of the art models locally.
Think GDPR and data sovereignty concerns, many demand privacy and can't use services without it.
brookst 7 hours ago [-]
You’re right it’s a cynical take. I don’t get cynicism for the sake of it, detached from technical reality.
No, this does nothing for DRM or HW attestation. The interesting thought is: not everything is a conspiracy. Yes, that’s just what a conspirator would say. But it’s also true.
coliveira 6 hours ago [-]
Not everything is a conspiracy, yes. But when we have a class of conspirators in power, and we do have, everything can be used by the conspiracy.
observationist 17 hours ago [-]
Regarding DRM, you could use stream ciphers and other well-understood cryptographic schemes with an FHE chip like this to create an effectively tamper-proof and interception-proof OS, with the FHE chip supplementing normal processors. You'd basically be setting up E2EE between the streaming server and the display, audio output, or other stream target, and there'd be no way to intercept or inspect unencrypted data without breaking the device. Add modern tamper detection and you get a very secure setup with modern performance, the FHE chip basically just handling keys and encapsulation operations, with fairly low compute and bandwidth needs. DRM and attestation both, as well as fairly dystopian manufacturer and corporate control over devices users should own.
vasco 19 hours ago [-]
Regarding DRM I don't see how it'll survive "Camera in front of the screen" + "AI video upscaling" once the second part is good enough. Can't DRM between the screen and your eyes. Until they put DRM in Neuralink.
RiverCrochet 18 hours ago [-]
> Can't DRM between the screen and your eyes.
No, but media can be watermarked in imperceptible ways, and then if all players are required to check and act on such watermarks, the gap becomes narrow enough to probably be effective.
See Cinavia.
jasomill 10 hours ago [-]
Sure, but we already have good enough players, open source even, that don't support this technology, and recent codecs have, if anything, become more open, so this only seems problematic for playback on non-general purpose computing devices like smart TVs, set top boxes, and maybe smartphones, tablets, and battery-powered PCs if the tech is incorporated into hardware decoders for all acceptable codecs.
fc417fc802 9 hours ago [-]
> if all players are required
Massive if. Why would I voluntarily purchase gimped hardware?
Cinavia depended on being implemented by the player itself. It's difficult to see how (for example) a smart TV could implement it for streams coming in via HDMI from a computer the user fully controls.
direwolf20 8 hours ago [-]
You would purchase a Blu-ray player in order to play Blu-rays, pretty simple. They have this watermarking.
fc417fc802 6 hours ago [-]
Right. To play legally purchased Blu-rays. Who pirates movies and then burns them to a disc? And if someone did, why would they use a gimped Blu-ray player instead of a media PC?
The only thing this scheme was ever going to catch was full-blown counterfeit discs sold on a street corner to your average joe. I think that was only ever much of a thing in the developing world. Or was it just before my time?
zvqcMMV6Zcr 22 hours ago [-]
> Heracles, which sped up FHE computing tasks as much as 5,000-fold compared to a top-of-the-line Intel server CPU.
That is a nice speed-up compared to generic hardware, but everyone probably wants to know how much slower it is than performing the same operations on plaintext data. I am sure a 50% penalty is acceptable; 95% probably is not.
corysama 21 hours ago [-]
There are applications that are currently doing this without hardware support and accepting much worse than 95% performance loss to do so.
This hardware won’t make the technique attractive for ALL computation. But, it could dramatically increase the range of applications.
bobbiechen 20 hours ago [-]
Agreed. When I was working on TEEs/confidential computing, just about everyone agreed that FHE was conceptually attractive (trust the math instead of trusting a hardware vendor) but the overhead of FHE was so insanely high. Think 1000x slowdowns turning your hour-long batch job into something that takes over a month to run instead.
patchnull 21 hours ago [-]
Current FHE on general CPUs is typically 10,000x to 100,000x slower than plaintext, depending on the scheme and operation. So even with a 5,000x ASIC speedup you are still looking at roughly 20-100x overhead vs unencrypted compute.
That rules out anything latency-sensitive, but for batch workloads like aggregating encrypted medical records or running simple ML inference on private data it starts to become practical. The real unlock is not raw speed parity but getting FHE fast enough that you can justify the privacy tradeoff for specific regulated workloads.
tromp 19 hours ago [-]
10,000x to 100,000x / 5,000x = 2x to 20x, not 20 to 100x.
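Spelling the division out, using the parent's reported figures:

```python
# Residual FHE overhead after hardware acceleration:
# (software slowdown vs. plaintext) / (ASIC speedup).
cpu_slowdown = (10_000, 100_000)   # reported software-FHE slowdown range
asic_speedup = 5_000               # the Heracles figure from the article

low = cpu_slowdown[0] / asic_speedup
high = cpu_slowdown[1] / asic_speedup
print(f"{low:.0f}x to {high:.0f}x slower than plaintext")  # prints "2x to 20x slower than plaintext"
```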
Foobar8568 20 hours ago [-]
Now we know why Intel more or less abandoned SEAL and rejected GPU requests.
Foobar8568 2 hours ago [-]
It's Microsoft who did the library, damn, I can't understand how I misremembered that after working on it for a few months last year.
bilekas 19 hours ago [-]
This is incredible work, and it makes the technology absolutely viable.
However... in a world where privacy is constantly being eroded intentionally by governments and private companies, I think this will NEVER, ever reach any consumer-grade hardware. My inner cynic could envision a worldwide export ban on the technology, in the vein of RSA [0].
Why would any company offer customers real out-of-the-box E2E encryption built into their devices?
DRM was mentioned by another user. This will not be used to enable privacy for the masses.
Arguably this is less useful for consumer hardware in the first place. This is mostly useful when I don’t trust the service provider with my data but still need to use their services (casting my vote, encrypted inference, and so forth)
bilekas 19 hours ago [-]
True. In the case of casting a vote, for example, I could see it being used within the voting machine itself before results are sent off to be counted. Good application.
But making them available to customers, say as a PCIe card that automatically encrypts everything you run over an encrypted connection, would be a dream.
autoexec 19 hours ago [-]
> In a world where privacy is constantly being eroded intentionally by governments and private companies, I think this will NEVER, ever reach any consumer grade hardware.
Why not, when the government can just force companies to backdoor their hardware for them? That way users are secure most of the time, except from the government (until the backdoor in Intel's chips gets discovered, anyway), users have a false sense of security/privacy so they are more likely to share their secrets with corporations, and the government gets to spy on people communicating more openly with each other.
Someone explain how you'd create a vector embedding from homomorphically encrypted data without decrypting it. Seems like a catch-22: you don't get to know the semantic meaning, but you need the semantic meaning to position it in high-dimensional space. I guess the point I'm making is that sure, you can sell compute for FHE, but you quickly run up against a hard limit on any value-added SaaS you can provide the customer. This feels like a solution being shoehorned in because cloud providers really, really want customers to use their data centers, when in truth the best solution would be a secure facility for the customer, so that applications can actually understand the data they're working with.
bob1029 20 hours ago [-]
Most of modern machine learning is effectively linear algebra. We can achieve semantic search over encrypted vectors if the encryption relies on similar principles.
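As a toy illustration of that point: even an additively homomorphic scheme (Paillier-style, with insecure demo primes; real systems use lattice schemes like CKKS that pack whole vectors per ciphertext) lets a server compute a dot-product similarity score between its plaintext document vector and a client's encrypted query vector, since Enc(a)·Enc(b) = Enc(a+b) and Enc(a)^k = Enc(a·k).

```python
import secrets
from math import gcd

# Toy Paillier-style setup. The simplified key choice g = n + 1 makes
# decryption a single modular inverse of lambda.
p, q = 659, 727
n, n2 = p * q, (p * q) ** 2
lam = (p - 1) * (q - 1)
mu = pow(lam, -1, n)  # valid because g = n + 1

def enc(m):
    r = secrets.randbelow(n - 1) + 1
    while gcd(r, n) != 1:
        r = secrets.randbelow(n - 1) + 1
    return (pow(n + 1, m, n2) * pow(r, n, n2)) % n2

def dec(c):
    return ((pow(c, lam, n2) - 1) // n * mu) % n

query = [3, 0, 1]                    # client's private query vector
doc = [2, 5, 4]                      # server's plaintext document vector
enc_query = [enc(x) for x in query]  # only ciphertexts leave the client

# Server side: homomorphic dot product (multiply-by-plaintext, then add).
score = 1  # trivial encryption of 0
for c, w in zip(enc_query, doc):
    score = (score * pow(c, w, n2)) % n2

assert dec(score) == 3*2 + 0*5 + 1*4  # 10, recoverable only by the client
```

The server never learns the query, yet produces a meaningful encrypted similarity score.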
Chance-Device 21 hours ago [-]
FHE is the future of AI. I predict local models with encrypted weights will become the norm. Both privacy preserving (insofar as anything on our devices can be) and locked down to prevent misuse. It may not be pretty but I think this is where we will end up.
boramalper 21 hours ago [-]
If you're interested in "private AI", see Confer [0] by Moxie Marlinspike, the founder of Signal private messaging app. They go into more detail in their blog. [1]
I don't get how this can work, and Moxie (or rather his LLM) never bothers to explain. How can an LLM possibly exchange encrypted text with the user without decrypting it?
The correct solution isn't yet another cloud service, but rather local models.
Within the enclave itself, DRAM and PCIe connections between the CPU and GPU are encrypted, but the CPU registers and the GPU onboard memory are plaintext. So the computation is happening on plaintext data, it’s just extremely difficult to access it from even the machine running the enclave.
olejorgenb 14 hours ago [-]
How is it then much different from trusting the policies of Anthropic etc.? To be fair, you need some enterprise deal to get a truly zero-retention policy.
FrasiertheLion 12 hours ago [-]
Enclaves have a property that allows the hardware to compute a measurement (a cryptographic hash) of everything running inside them: the firmware, system software such as the operating system and drivers, the application code, and the security configuration. This is signed by the hardware manufacturer (Intel/AMD + NVIDIA).
Then, verification involves a three part approach. Disclaimer: I'm the cofounder of Tinfoil: https://tinfoil.sh/, we also run inference inside secure enclaves. So I'll explain this as we do it.
First, you open source the code that's running in the enclave, and pin a commitment to it to a transparency log (in our case, Sigstore).
Then, when a client connects to the server (that's running in the enclave), the enclave computes the measurement of its current state and returns that to the client. This process is called remote attestation.
The client then fetches the pinned measurements from Sigstore and compares it against the fetched measurements from the enclave. This guarantees that the code running in the enclave is the same as the code that was committed to publicly.
So if someone claimed they were only analyzing aggregated metrics, they could not suddenly start analyzing individual request metrics because the code would change -> hash changes -> verification fails.
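The hash comparison at the heart of that flow can be sketched as follows. The helper names and quote format here are hypothetical, not a real attestation SDK, and real verification also checks the hardware vendor's signature chain over the quote:

```python
import hashlib

def measurement_of(artifact: bytes) -> str:
    # Stand-in for the hardware's measurement of firmware + OS + app code.
    return hashlib.sha256(artifact).hexdigest()

def verify_enclave(quote: dict, pinned_measurement: str) -> bool:
    # 1. (Elided) verify quote["signature"] against the vendor's root keys.
    # 2. Compare the reported measurement to the publicly pinned one.
    return quote["measurement"] == pinned_measurement

# Any change to the enclave's code changes the hash and fails verification:
code_v1 = b"aggregate-only analytics build"
quote = {"measurement": measurement_of(code_v1), "signature": "..."}

assert verify_enclave(quote, measurement_of(code_v1))
assert not verify_enclave(quote, measurement_of(b"per-request analytics build"))
```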
boramalper 19 hours ago [-]
They explain it in Private inference [0] if you want to read about it.
If encrypted outputs can be viewed or used, they can be reverse-engineered through that same interface. FHE shifts the attack surface, it does not eliminate it.
Chance-Device 18 hours ago [-]
If you know how to reverse engineer weights or even hidden states through simple text output without logprobs I’d be interested in hearing about it. I imagine a lot of other people would be too.
anon291 16 hours ago [-]
I mean, no they cannot be viewed at any point once encrypted unless you have the key. That's the point. Even the intermediate steps are random gibberish unless you have the key
Foobar8568 20 hours ago [-]
FHE is impractical by any measure. Either it's trivially broken and insecure, or the space requirements go beyond anything usable.
There is basically no business demand, aside from sellers and scholars.
eulgro 9 hours ago [-]
In science fiction, maybe. We're hitting real limits on compute while AI is still far from a level where it would be harmful, and FHE is orders of magnitude less efficient than direct computation.
yalogin 15 hours ago [-]
FHE is great. If we can get this to work at scale and bake it into the GPU complex, we won't need the confidential compute pipeline. Of course, we will still need to manage user keys, so the current confidential pipeline will just be replaced with something else, but hopefully managing large amounts of data will become simpler. Not sure where the tech is, but it could be a game changer for security. It still doesn't eliminate the bad-corporation issue, though: we still rely on the code they run on the servers inside the FHE.
I joke, but I think relative numbers like this are very misleading, as FHE is starting from such an absurdly slow place.
Still, this is pretty cool, and there are probably niche applications that become possible with this, but I think this is a small enough speed-up that it is still very niche.
JanoMartinez 21 hours ago [-]
One thing I'm curious about is whether this could change how cloud providers handle sensitive workloads.
If computation can happen directly on encrypted data, does that reduce the need for trusted environments like SGX/TEE, or does it mostly complement them?
anon291 16 hours ago [-]
If it were as fast as a normal chip, it would obviate the need
gigatexal 20 hours ago [-]
If they can get this shrunk down and efficient enough, in a future scenario I think Apple could move back to Intel for this, given their stance on encryption and it being a pillar of their image.
Joel_Mckay 17 hours ago [-]
Not going to happen anytime soon, as the modern M4/ARM unified memory with on-chip GPU is years ahead of Intel. The software ecosystem is slowly growing to leverage this chip architecture, and due to the annoying PC RAM, SSD, and RTX GPU shenanigans it is no longer the lower-value option.
The PC market was made shitty enough this year that mid/high-end Mac Pros/laptops are actually often the better value deal now (if and only if your use-case is covered software wise.)
Intel does plan an RTX + amd64 SoC soon, but still pooched the memory interface with a 30-year-old mailbox kludge. Intel probably won't survive this choice without bailouts. =3
bigyabai 15 hours ago [-]
> (if and only if your use-case is covered software wise.)
Judging by Nvidia's current valuation, that's a parenthetical worth ~4 trillion dollars. Apple isn't muscling AMD or Nvidia out of the datacenter anytime soon, and they're basically feeding Intel Foundry customers by dominating TSMC fab capacity. Apple's contribution to the chip shortage is so bad that even they have considered using Intel Foundry Services: https://www.macrumors.com/2025/11/28/intel-rumored-to-supply...
It's been 7 years of Apple Silicon and the macOS market share really hasn't shifted much. The Year Of Apple Silicon For People Whose Use-Case Is Covered Software Wise was 2019; the majority of remaining customers aren't showing any interest.
Joel_Mckay 15 hours ago [-]
> It's been 7 years of Apple Silicon and the macOS market share really hasn't shifted much
Indeed, but a local LLM finishing in 3 days instead of 1 on a $40k GPU changes the economic decision priority for some.
Apple sales grew "21.3% year-over-year as of the second quarter of 2025", but also sales flattened as supply chain pricing shocks from "AI"/tariffs hit late last year.
"Judging by Nvidia's current valuation" is a bad bet with current circular investment conditions.
We shall see, but as EOL drivers and OS rot hit legacy NVIDIA hardware... people are going to have to find some compromise in the next two years. Even an AMD 9850X3D currently costs less than 64GB of low-end PC DDR5 memory.
Odd times for sure =3
gigatexal 6 hours ago [-]
Acktually ahem not to be that guy but to be that guy haha (insert me here) …
Apple’s Mac share of the PC market went from 6.6% to 9% (https://www.cultofmac.com/news/mac-shipments-2025-apple), so that’s nothing to balk at. The MacBook Neo might grow that even more, as maybe it converts low-end buyers into locked-in users of the ecosystem, who then move on to more Macs.
jpauline 19 hours ago [-]
This is a huge win for cybersecurity and data privacy.
newzino 19 hours ago [-]
[dead]
darig 20 hours ago [-]
[dead]
esseph 22 hours ago [-]
Everything about this in my head screams "bad idea".
If you need to trust the encryption and trust the hardware itself, it may not be suitable for your environment/ threat model.
numpad0 18 hours ago [-]
It is a bad idea but not in the way you think. FHE hardware don't decrypt data on-chip. It's like using the Diffie-Hellman key exchange for general computation. The data and operations stay encrypted at any given moment while outside your client device.
The textbook example application of FHE is phone book search. The server "multiplies" the whole phonebook database with your encrypted query and sends back the whole database file every time, regardless of the query. When you decrypt the file with the key used to encrypt the query, the database is all corrupt and garbled except for the rows matching the query, so the search has practically occurred. The only things visible in the clear are the encrypted query and the size of the entire database.
Sounds fantastically energy-efficient, no? That's the problem with FHE, not risks of backdooring.
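The phone-book idea above can be made concrete with an additively homomorphic scheme. Below is a toy sketch of that linear-PIR pattern over Paillier (demo-sized primes and illustrative names only; real systems use RLWE-based FHE and far larger parameters, and in this additive variant the server returns a single ciphertext rather than a garbled copy of the whole database). The client sends an encrypted one-hot query; the server computes a dot product against the plaintext rows without ever decrypting anything.

```python
# Toy Paillier-based private lookup (requires Python 3.9+ for math.lcm
# and three-argument pow with a negative exponent).
import math
import random

def keygen(p=1000003, q=1000033):
    # n = p*q; choosing g = n + 1 simplifies decryption.
    n = p * q
    lam = math.lcm(p - 1, q - 1)
    mu = pow(lam, -1, n)  # valid because g = n + 1
    return (n,), (n, lam, mu)

def encrypt(pk, m):
    (n,) = pk
    n2 = n * n
    r = random.randrange(1, n)
    while math.gcd(r, n) != 1:
        r = random.randrange(1, n)
    # Enc(m) = (1 + n)^m * r^n mod n^2
    return (pow(1 + n, m, n2) * pow(r, n, n2)) % n2

def decrypt(sk, c):
    n, lam, mu = sk
    n2 = n * n
    # L(x) = (x - 1) / n, then multiply by mu = lam^-1 mod n
    return (((pow(c, lam, n2) - 1) // n) * mu) % n

def pir_select(pk, enc_query, database):
    # Server side: homomorphic dot product of the encrypted one-hot
    # query with the plaintext rows: prod c_i^{db_i} mod n^2.
    (n,) = pk
    n2 = n * n
    acc = 1
    for c, row in zip(enc_query, database):
        acc = (acc * pow(c, row, n2)) % n2
    return acc

# Client wants database[2] without revealing the index to the server.
pk, sk = keygen()
db = [11, 22, 33, 44]
query = [encrypt(pk, 1 if i == 2 else 0) for i in range(len(db))]
result = pir_select(pk, query, db)
assert decrypt(sk, result) == 33
```

Note the cost profile the comment describes: the server touches every row for every query, which is exactly why FHE-style search is private but not energy-efficient.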
esseph 8 hours ago [-]
One of the potential use cases mentioned is secure data transfer.
So here's my question:
Could FHE hardware be used to extremely quickly and reliably secure something like a database connection?
I looked through Gemini, and it says the following:
"Zama are building libraries that use FHE accelerators to allow "Confidential Smart Contracts" or private AI queries. You could send a highly sensitive health query to an AI, and the AI hardware would process it and send the answer back without the AI company ever knowing what you asked."
Which is why I ask. Because if you have a backdoor into the hardware, as either a corporation or a government, then you can get access to those "very sensitive and fully secured communications".
They also discuss a crypto / smart contracts use case, and advertise "securing L1 or L2".
u1hcw9nx 21 hours ago [-]
In FHE, the hardware running the computation doesn't know the secrets. That's the point.
First you encrypt the data. Then you send it to hardware to compute, get result back and decrypt it.
Foobar8568 20 hours ago [-]
But you leak all kinds of information, and retrieval either leaks even more data, or you end up transferring god knows how much data, or your encryption is trivially broken, or you spend days/months/years decrypting.
bilekas 19 hours ago [-]
I don't know how you got these ideas but when you crack it, do make sure to write a post about it. Can't wait for that writeup.
Foobar8568 19 hours ago [-]
LWE estimator isn't a proxy for this?
anon291 16 hours ago [-]
Math literacy needs to become standard for computer scientists. These takes are so bad
Foobar8568 15 hours ago [-]
Or reading papers on the subjects, and playing with implementing FHE search.
spicymaki 14 hours ago [-]
Yeah, it's pretty clear that many people in the HN community have no idea what this is, and yet they have takes. It makes me wonder...
gruez 21 hours ago [-]
>If you need to trust the encryption and trust the hardware itself, it may not be suitable for your environment/ threat model.
Are we reading the same article? It's talking about homomorphic encryption, i.e. doing mathematical operations on already-encrypted data, without being aware of its cleartext contents. It's not related to SGX or other trusted computing technologies.
esseph 8 hours ago [-]
I updated another comment above, but FHE is also being advertised for use for securing crypto transactions and smart contracts.
"We believe that just like the internet went from zero encryption with HTTP to encrypting data in transit with HTTPS, the next natural step will be to use FHE to enable end-to-end encryption by default in every application, something we call HTTPZ"
cwmma 21 hours ago [-]
In theory you only need to trust the hardware to be correct, since it doesn't have the decryption key the worst it can do is give you a wrong answer. In theory.
esseph 20 hours ago [-]
But can you trust the hardware encryption to not be backdoored, by design?
That's my point, this sounds like a way to create a backdoor for at-rest data.
rjmunro 34 minutes ago [-]
There is no hardware encryption or decryption.
I encrypt some data and keep the key. I send the encrypted data to you (probably some cloud provider). I tell you to do some operations on the data. I don't tell you the key or what the data is or what the operations mean. You send the results back to me. I use the key to decrypt them.
You have helped me with my compute task, but the data you have is totally meaningless without the key, and only I have the key.
It's hard to believe that it's possible to make encryption where this can do useful work, but it is.
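A minimal illustration of that outsourcing flow uses the multiplicative homomorphism of textbook RSA (demo-sized primes, no padding; real RSA padding deliberately destroys this property, and the names here are illustrative): the server multiplies two ciphertexts it cannot read, and only the key holder learns the product.

```python
# Textbook RSA is multiplicatively homomorphic:
# Enc(a) * Enc(b) mod n == Enc(a * b).
p, q = 1000003, 1000033
n = p * q
e = 65537
d = pow(e, -1, (p - 1) * (q - 1))  # private exponent, kept by the client

def encrypt(m):
    # Client side: needs only the public key (n, e).
    return pow(m, e, n)

def server_multiply(c1, c2):
    # Server side: never sees the plaintexts or the private key d.
    return (c1 * c2) % n

c = server_multiply(encrypt(6), encrypt(7))
assert pow(c, d, n) == 42  # client decrypts and gets 6 * 7
```

This is "somewhat homomorphic" (one operation only); fully homomorphic schemes support both addition and multiplication, which is what makes general computation on ciphertexts possible and so much more expensive.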
jayd16 17 hours ago [-]
By design, you don't trust it. You never hand out the keys so there's no secret to back door. The task is never unencrypted, at rest or otherwise.
cassonmars 20 hours ago [-]
You can if the manufacturer has a track record that refutes the notion, and especially if they have verifiable hardware matching publicly disclosed circuit designs. But this is Intel; given their track record, I wouldn't trust it even if the schematics were public. Intel ME not being disable-able by consumers, while being entirely omitted for certain classes of government buyers, tells me everything I need to know.
bilekas 19 hours ago [-]
> That's my point, this sounds like a way to create a backdoor for at-rest data.
Honestly, I get the feeling it would be more expensive and take more effort to backdoor it.
anon291 16 hours ago [-]
Well yeah... You do the initial encryption yourself by whatever means you trust
bokohut 16 hours ago [-]
First and foremost, I am grateful for the ability to take from and give to this HN community for what HN has done for me. With that stated, I am reminded nearly daily when reading posts on HN of my experience, my age, and some of my now-lost hair color.
After nearly 3 decades of critical technology systems architecture and management, involving ongoing industry audits, my experience and age know why my hair has lost some of its color. Much of that lost color comes from security management of third-party systems: yes, the old dreaded dependencies. Eliminating those third parties is key to one's cyber sanity and hair color, yet with technology still in its infancy, some cannot see the forest for the trees.
Nothing remains the same; progress moves forward, correcting past mistakes and learning what works and what doesn't along the way, and technology platforms are no exception. Analogously, early automobiles lacked safety features such as windshield wipers and seatbelts; hasn't the passage of time proved their addition valuable? Few people today truly understand how things work; nearly everyone just wants the instant-fix "pill" to alleviate their issues, but this approach cannot work with security. True security is designed in from the foundation, and such secure platforms go unseen, yet we have an endless list of victims of insecure systems that "bolted on" security after the fact. This security change and more is coming to system designs, as the entire world is now fully aware of cyber security, or in this case, the lack of it.
Time: the young fail to consider it until a single moment in their life, while the old reflect on where theirs went. After one reflects on one's time, however, change becomes obvious.
bitexploder 16 hours ago [-]
How does this relate to chip based homomorphic encryption? Just curious.
Consoles after the original Xbox (which had an epic piracy ecosystem) all had online integration. The Xbox 360 had a massive piracy scene, but it was 100% offline only. The Xbox One has had no such breaches that I am aware of.
RE: BOM - famously, with many of these examples, certain specific disc drives or mainboards were far more compromised than others.
You could play pirated games online with the 360. The piracy was at the DVD-ROM firmware level: replacing the stock firmware with one that basically changed the book type of the media (and, in later versions, also mimicked other security checks performed by the console to validate the authenticity of the disc).
However, the DVD firmware mod didn't break any digital signatures. It just allowed signed code to be executed from unauthentic media, so it only enabled piracy/backups, not a full jailbreak allowing unsigned code. That was more the JTAG/reset-glitch era, which was more "offline only" because it was easier for MS to detect and ban your key vault from Xbox Live. But because people were willing to pay for modded lobbies in games like Call of Duty (which let you rank up much faster), and because Xboxes died if you sneezed at them, there was even a market for extracting the keys from dead consoles to sell to those running modded lobbies.
You still ran a risk of getting your console hardware banned for doing the DVD firmware mod, but towards the end I believe MS threw in the towel (even after trying to embed the flash chip in the same package as the DSP for the drive, which resulted in the kamikaze hack before the drive got further exploited), because one method they tried to use to detect piracy had such tight tolerances that it caught legit customers with aging drives in the ban wave, and MS had to walk it back.
The head of Xbox security (who sadly is no longer with us, he was a good egg at heart) left Microsoft not long afterwards. Obviously stating he wanted to move on to other things, but the word around the community at the time was that he was shown the door.
Personally I don’t hold much to that story (of him being pushed); this was so late in the console’s life that it seemed like trying to patch the hole in the Titanic after it had already sunk.
I’m sure you can name benign useful things you could use it for. But it seems to me you’re blatantly overlooking the obvious flaw.
There is no getting around the fact that doing search on encrypted data reduces the level of secrecy. To return even a minutely useful search result, some information within the searched corpus must be exposed.
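One concrete form of that leakage appears in deterministic searchable encryption, a cheaper cousin of FHE (which avoids this particular leak at much higher cost). The sketch below uses an illustrative key and schema: the server can match records it cannot read, but identical plaintexts produce identical tokens, so equality and frequency patterns leak.

```python
import hashlib
import hmac

KEY = b"client-secret-key"  # held only by the data owner (illustrative)

def token(value: str) -> str:
    # Deterministic keyed hash: same plaintext always yields same token.
    return hmac.new(KEY, value.encode(), hashlib.sha256).hexdigest()

# The server stores only tokens, never the emails themselves.
index = {
    token("alice@example.com"): "row-1",
    token("bob@example.com"): "row-2",
}

# The owner can still search by email...
assert index[token("alice@example.com")] == "row-1"

# ...but because tokens are deterministic, the server learns equality
# and frequency patterns: two records with the same email are visibly
# linked even though the server never sees the address.
assert token("bob@example.com") == token("bob@example.com")
```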
Same here.
Can't wait to KYC myself in order to use a CPU.
It's truly amazing how modern people just blithely sacrifice their privacy and integrity for no good reason. Just to let big tech corporations more efficiently siphon money out of the market. And then they fight you passionately when you call out those companies for being unnecessarily invasive and intrusive.
The four horsemen of the infocalypse are such profoundly reliable boogeymen; we really need a huge psychological study across all modern cultures to see why they're so effective at dismantling rational thought in the general public, and how we can inoculate society against it without damaging other important social behaviors.
There were so many different ways in which you were required to provide absolute proof of your identity these days that life could easily become extremely tiresome just from that factor alone, never mind the deeper existential problems of trying to function as a coherent consciousness in an epistemologically ambiguous physical universe. Just look at cash point machines, for instance. Queues of people standing around waiting to have their fingerprints read, their retinas scanned, bits of skin scraped from the nape of the neck and undergoing instant (or nearly instant-a good six or seven seconds in tedious reality) genetic analysis, then having to answer trick questions about members of their family they didn't even remember they had, and about their recorded preferences for tablecloth colours. And that was just to get a bit of spare cash for the weekend. If you were trying to raise a loan for a jetcar, sign a missile treaty or pay an entire restaurant bill things could get really trying.
Hence the Ident-i-Eeze. This encoded every single piece of information about you, your body and your life into one all-purpose machine-readable card that you could then carry around in your wallet, and therefore represented technology's greatest triumph to date over both itself and plain common sense.
The reason the 'Epstein class' are able to get away with crimes is that in recent US elections voters elected politicians who intentionally are not investigating those crimes and who even pardoned some criminals convicted of them.
> 9. The power of pardon conferred by the Constitution upon the President is unlimited except in cases of impeachment. It extends to every offence known to the law, and may be exercised at any time after its commission, either before legal proceedings are taken or during their pendency, or after conviction and judgment. The power is not subject to legislative control.
https://tile.loc.gov/storage-services/service/ll/usrep/usrep...
Basically you can't pardon acts that haven't happened yet, but you can pardon before any legal action has been taken on prior acts.
https://www.criminallawlibraryblog.com/amp/preemptive-pardon...
In any event, my point was all presidents who grant pardons grant them to people convicted of a crime; it’s not a recent development. But that was framed as being upsetting precedent.
We are no longer their clients; we are just another product to sell. So they do not design chips for us, but for the benefit of other corporations.
3. Unskippable ads with data gathering at the CPU level.
I remember thinking how fun it was! I could see unfolding before me endless ways to configure, reconfigure, optimize, etc.
I know there are a few open source chip efforts, but I wonder if maybe now is the time to pull the community together and organize more intentionally around that. Maybe open source chipsets won't be as fast as their corporate counterparts, but I think we are definitely at an inflection point in society where we would need this to maintain freedom.
If anyone is working in that area, I am very interested. I am very green, but still have the old textbooks I could dust off (just don't have the ole college provided mentor graphics -- or I guess siemens now -- design tool anymore).
The future is bleak.
I think eGovernment is the main use case: not super high traffic (we're not voting every day), but very high privacy expectations.
It's not related to DRM or trusted computing.
A: "Intel/AMD is adding instructions to accelerate AES"
B: "Might this enable a next level of DRM? Might this enable a deeper level of hardware attestation?"
A: "wtf are you talking about? It's just instructions to make certain types of computations faster, it has nothing to do with DRM or hardware attestation."
B: "Not yet."
I'm sure in some way it probably helps DRM or hardware attestation to some extent, but not any more than say, 3nm process node helps DRM or hardware attestation by making it faster.
That said, the unfortunate reality is that the same constructs that underpin DRM are also required to build a secure system. The only difference is who controls the root of trust. As such the problems with DRM (and hardware ownership more generally) are political as opposed to technical in nature.
> We discovered a substance that boosts your innate immune system and non-specifically clears out throat infections.
> This will be good for people prone to throat infections.
> Not when it's mandated.
Someone else told me they're going to spy on your windows with drones to make sure you're verifying your age to your OS. Like, what??? I thought we were waking up to oppression, but we're just inventing fake oppression to be mad at instead of responding to real oppression.
2. No, anyone can run the FHE computations anywhere on any hardware if they have the evaluation key (which would also have to be present in any FHE hardware).
But when homomorphic encryption becomes efficient, perhaps governments can force companies to apply it (though they would lose their opportunity for backdooring, but E2EE is a thing too so I wouldn't worry too much).
It raises the hurdle for those looking to surveil.
If a tree falls in the forest and no one is around to hear it, does it make a sound?
This is primarily for cloud compute I'd imagine, AI specifically. As it's generally not feasible/possible to run the state of the art models locally. Think GDPR and data sovereignty concerns, many demand privacy and can't use services without it.
No, this does nothing for DRM or HW attestation. The interesting thought is: not everything is a conspiracy. Yes, that’s just what a conspirator would say. But it’s also true.
No, but media can be watermarked in imperceptible ways, and then if all players are required to check and act on such watermarks, the gap becomes narrow enough to probably be effective.
See Cinavia.
Massive if. Why would I voluntarily purchase gimped hardware?
Cinavia depended on being implemented by the player itself. It's difficult to see how (for example) a smart tv could implement it for streams coming in via HDMI from a computer the user has full control of.
The only thing this scheme was ever going to catch was full blown counterfeit disks sold on a street corner to your average joe. I think that was only ever much of a thing in the developing world. Or was it just before my time?
That is a nice speed-up compared to generic hardware, but everyone probably wants to know how much slower it is than performing the same operations on plaintext data. I am sure a 50% penalty is acceptable; 95% probably is not.
This hardware won’t make the technique attractive for ALL computation. But, it could dramatically increase the range of applications.
That rules out anything latency-sensitive, but for batch workloads like aggregating encrypted medical records or running simple ML inference on private data it starts to become practical. The real unlock is not raw speed parity but getting FHE fast enough that you can justify the privacy tradeoff for specific regulated workloads.
However... in a world where privacy is constantly being eroded intentionally by governments and private companies, I think this will never, ever reach any consumer-grade hardware. My inner cynic could envision a worldwide technology export ban in the vein of RSA [0].
Why would any company offer customers real out-of-the-box E2E encryption built into their devices?
DRM was mentioned by another user. This will not be used to enable privacy for the masses.
https://en.wikipedia.org/wiki/Export_of_cryptography_from_th...
But making them available to customers, say even as a PCIe card or something, with everything you run automatically encrypted end to end, would be a dream.
Why not, when the government can just force companies to backdoor their hardware for them? That way users are secure most of the time, except from the government (until the backdoor in Intel's chips gets discovered, anyway), users get a false sense of security/privacy so they are more likely to share their secrets with corporations, and the government gets to spy on people communicating more openly with each other.
[0] https://confer.to/
[1] https://confer.to/blog/2025/12/confessions-to-a-data-lake/
The correct solution isn't yet another cloud service, but rather local models.
Within the enclave itself, DRAM and PCIe connections between the CPU and GPU are encrypted, but the CPU registers and the GPU onboard memory are plaintext. So the computation is happening on plaintext data, it’s just extremely difficult to access it from even the machine running the enclave.
Then, verification involves a three part approach. Disclaimer: I'm the cofounder of Tinfoil: https://tinfoil.sh/, we also run inference inside secure enclaves. So I'll explain this as we do it.
First, you open source the code that's running in the enclave, and pin a commitment to it to a transparency log (in our case, Sigstore).
Then, when a client connects to the server (that's running in the enclave), the enclave computes the measurement of its current state and returns that to the client. This process is called remote attestation.
The client then fetches the pinned measurements from Sigstore and compares them against the measurements reported by the enclave. This guarantees that the code running in the enclave is the same as the code that was committed to publicly.
So if someone claimed they were only analyzing aggregated metrics, they could not suddenly start analyzing individual request metrics because the code would change -> hash changes -> verification fails.
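The measurement-comparison step can be sketched as follows (names are illustrative, not Tinfoil's or Sigstore's actual API, and a real enclave measurement is produced by the CPU over the loaded memory image, not a plain file hash): the client trusts the enclave only if the measurement it reports matches the value pinned publicly at release time.

```python
import hashlib

def measure(code_bundle: bytes) -> str:
    # Enclave side: the measurement is a hash of the loaded code.
    return hashlib.sha256(code_bundle).hexdigest()

def verify(reported_measurement: str, pinned_measurement: str) -> bool:
    # Client side: compare against the value pinned in a transparency log.
    return reported_measurement == pinned_measurement

code = b"service-v1.2"
pinned = measure(code)  # published to the transparency log at release

assert verify(measure(code), pinned)                          # unchanged code passes
assert not verify(measure(b"service-v1.2-backdoored"), pinned)  # any change fails
```

This is what makes the "code change -> hash change -> verification fails" chain work: the operator cannot silently swap in different analysis code without the client noticing.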
[0] https://confer.to/blog/2026/01/private-inference/
There is basically no business demand aside from sellers and scholars.
5000 * 0 is still 0.
I joke, but I think relative numbers like this are very misleading, as FHE is starting from such an absurdly slow place.
Still, this is pretty cool, and there are probably niche applications that become possible with this, but I think this is a small enough speed-up that it remains very niche.
If computation can happen directly on encrypted data, does that reduce the need for trusted environments like SGX/TEE, or does it mostly complement them?
The PC market was made shitty enough this year, that Mid/High class Mac Pro/laptops are actually often a better value deal now (if and only if your use-case is covered software wise.)
Intel does plan an RTX + amd64 SoC soon, but still pooched the memory interface with a 30-year-old mailbox kludge. Intel probably won't survive this choice without bailouts. =3
Judging by Nvidia's current valuation, that's a parenthetical worth ~4 trillion dollars. Apple isn't muscling AMD or Nvidia out of the datacenter anytime soon, and they're basically feeding Intel Foundry customers by dominating TSMC fab capacity. Apple's contribution to the chip shortage is so bad that even they have considered using Intel Foundry Services: https://www.macrumors.com/2025/11/28/intel-rumored-to-supply...