previous discussion: <a href="https://news.ycombinator.com/item?id=46600839">https://news.ycombinator.com/item?id=46600839</a>
It’s exciting to hear that Moxie and colleagues are working on something like this. They definitely have the skills to pull it off.<p>Few in this world have done as much for privacy as the people who built Signal. Yes, it’s not perfect, but building security systems with good UX is hard. There are all sorts of tradeoffs and sacrifices one needs to make.<p>For those interested in the underlying technology, they’re basically combining reproducible builds, remote attestation, and transparency logs. It’s the same thing Apple Private Cloud Compute and a few others are doing. I call it system transparency, or runtime transparency. Here’s a lightning talk I did last year: <a href="https://youtu.be/Lo0gxBWwwQE" rel="nofollow">https://youtu.be/Lo0gxBWwwQE</a>
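To make that combination concrete, here is a minimal Python sketch of the client-side check that "system transparency" implies. The helper names, the measure() stand-in, and the example values are all hypothetical; a real system verifies a hardware-signed attestation quote and a Merkle inclusion proof rather than a simple set lookup.

    import hashlib

    def measure(image_bytes: bytes) -> str:
        # Stand-in for the platform measurement (e.g., a hash of the boot/runtime image).
        return hashlib.sha256(image_bytes).hexdigest()

    def verify_runtime(attested: str, reproduced_image: bytes, log: set) -> bool:
        # Reproducible build: rebuilding the published source yields the same bits,
        # hence the same measurement.
        expected = measure(reproduced_image)
        # Remote attestation: the server's hardware-signed quote claims it runs
        # exactly `attested` (signature verification omitted in this sketch).
        # Transparency log: the measurement must be publicly logged, so a targeted
        # secret build would be detectable after the fact.
        return attested == expected and attested in log

    published_log = {measure(b"release-v1.2.3")}
    print(verify_runtime(measure(b"release-v1.2.3"), b"release-v1.2.3", published_log))  # True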
I don't know, I'd say Signal <i>is</i> perfect, as it maximizes "privacy times spread". A solution that's more private wouldn't be as widespread, and thus wouldn't benefit as many people.<p>Signal's achievement is that it's very private while being extremely usable (it just works). Under that lens, I don't think it could be improved much.
>Signal's achievement is that it's very private while being extremely usable (it just works).<p>Exactly. Plus it basically pioneered multi-device E2EE. E.g., Telegram claimed defaulting to E2EE would kill multi-client support:<p>"Unlike WhatsApp, we can allow our users to access their Telegram message history from several devices at once thanks to our built-in instant cloud sync"<p><a href="https://web.archive.org/web/20200226124508/https://tgraph.io/Why-Isnt-Telegram-End-to-End-Encrypted-by-Default-08-14" rel="nofollow">https://web.archive.org/web/20200226124508/https://tgraph.io...</a><p>Signal just did it, and in a fantastic way, given that there's no cross-device key verification hassle or anything. And Telegram never caught up.
I do wonder what models it uses under the hood.<p>ChatGPT already knows more about me than Google did before LLMs, but would I switch to inferior models to preserve privacy? Hard tradeoff.
What he did with messaging... So he will centralize all of it, with known-broken SGX metadata protections, weak supply-chain integrity, and a mandate that everyone supply their phone number and agree to Apple or Google terms of service to use it?
The issue being there's not really a credible better option. Matrix is the next best, because they do avoid the tie-in to phone numbers and such, but their cryptographic design is not so great (or rather, makes more tradeoffs for usability and decentralisation), and it's a lot buggier and harder to use.
Do you know a better alternative that I can get my elderly parents and non-technical friends to use?
I haven’t come across one and from my amateur POV it seems much better than WhatsApp or Telegram.
Not sure why you're getting downvoted. This is exactly what he did to instant messaging; extremely damaging to everyone, and without solid arguments for such a design.
Or, he took a barely niche messaging app plugin (OTR), improved it to provide forward secrecy even without a round trip, and deployed the current state-of-the-art end-to-end encryption to over 3,000,000,000 users, as Signal isn't the only tool to use double-ratchet E2EE.<p>>broken SGX metadata protections<p>Citation needed. Also, SGX is just there to try to verify what the server is doing, including that the server isn't collecting metadata. The real talking is done by the responses to warrants <a href="https://signal.org/bigbrother/" rel="nofollow">https://signal.org/bigbrother/</a> where they've been able to hand over only two timestamps: when the user created their account and when they were last seen. If that's not good enough for you, you're better off using Tor p2p messengers that don't have servers collecting your metadata at all, such as Cwtch or Quiet.<p>>weak supply chain integrity<p>You can download the app as an .apk from their website if you don't trust Google Play Store.<p>>a mandate everyone supply their phone numbers<p>That's how you combat spam. It sucks, but there are very few options outside the corner of Zooko's triangle where your username looks like "4sci35xrhp2d45gbm3qpta7ogfedonuw2mucmc36jxemucd7fmgzj3ad".<p>>and agree to Apple or Google terms of service to use it?<p>Yeah, that's what happens when you create a phone app for the masses.
Exactly. These arguments are so weak that they read more like a smear campaign than an actual technical discussion.<p>"You have to agree to Apple's terms to use it"? What's Signal meant to do, jailbreak your phone before installing itself on it?
>>broken SGX metadata protections<p>>Citation needed.<p><a href="https://sgx.fail" rel="nofollow">https://sgx.fail</a><p><a href="https://en.wikipedia.org/wiki/Software_Guard_Extensions#List_of_SGX_vulnerabilities" rel="nofollow">https://en.wikipedia.org/wiki/Software_Guard_Extensions#List...</a>
> You can download the app as an .apk from their website if you don't trust Google Play Store.<p>I wish Apple & Google provided a way to verify that an app was actually compiled from some specific git SHA. Right now applications can claim they're open source, and claim that you can read the source code yourself. But there's no way to check that the authors haven't added any extra nasties into the code before building and submitting the APK / iOS application bundle.<p>It would be pretty easy to do. Just have a build process at Apple / Google which you can point at a git repo, and let them build the application. Or, even easier, just have a way to see the application's signature in the app store. Then open-source app developers could compile their APK / iOS app using GitHub Actions, and third parties could check that the SHA matches the app binaries in the store.
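The third-party check described above could be as simple as the sketch below (the file names are hypothetical; in practice, store re-signing and repackaging currently break naive byte-for-byte comparison, which is exactly why platform support would be needed).

    import hashlib
    import sys

    def sha256_of(path: str) -> str:
        # Hash a binary in chunks so large APKs/IPAs don't need to fit in memory.
        h = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(1 << 20), b""):
                h.update(chunk)
        return h.hexdigest()

    if __name__ == "__main__":
        # Usage: python verify_build.py store_copy.apk reproduced_build.apk
        store_apk, rebuilt_apk = sys.argv[1], sys.argv[2]
        if sha256_of(store_apk) == sha256_of(rebuilt_apk):
            print("Store binary matches the build reproduced from the claimed git SHA.")
        else:
            print("Mismatch: the store binary was not built from this source tree.")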
>over 3,000,000,000 users<p>Is that a typo or are you really implying half the human population use Signal?<p>Edit: I misread, you are counting almost every messaging app user.
Just WhatsApp. Moxie's ideas are used in plenty of other messengers. The context was "what Moxie did for the field of instant messaging".
Yeah, WhatsApp uses the same protocol.
>> and agree to Apple or Google terms of service to use it?<p>> Yeah that's what happens when you create a phone app for the masses.<p>No, that's what happens when you actively forbid alternative clients and servers, prevent (secure) alternative methods of delivery for your app and force people to rely on the American megacorps known for helping governmental spying on users, <a href="https://news.ycombinator.com/item?id=38555810">https://news.ycombinator.com/item?id=38555810</a>
I’m missing something: won’t the input to the LLM necessarily be plaintext? And the output too? Then, as long as the LLM keeps logs, users’ real inputs will be available somewhere on their servers.
According to the article:<p><i>>Data and conversations originating from users and the resulting responses from the LLMs are encrypted in a trusted execution environment (TEE) that prevents even server administrators from peeking at or tampering with them.</i><p>I think what they meant to say is that data is <i>decrypted</i> only in a trusted execution environment, and otherwise is stored/transmitted in an encrypted format.
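As a rough illustration of that reading (using PyNaCl's sealed boxes, not the product's actual stack, with the key handling entirely hypothetical): the client encrypts to a public key whose private half exists only inside the attested TEE, so anything operators see in transit or at rest is ciphertext.

    from nacl.public import PrivateKey, SealedBox  # pip install pynacl

    # In the real system this key pair would be generated inside the enclave and its
    # public key delivered alongside an attestation proving where the private key lives.
    enclave_key = PrivateKey.generate()

    # Client side: seal the prompt to the enclave's public key before it leaves the device.
    ciphertext = SealedBox(enclave_key.public_key).encrypt(b"my private prompt")

    # Server administrators only ever handle `ciphertext`; decryption happens inside the TEE.
    plaintext = SealedBox(enclave_key).decrypt(ciphertext)
    assert plaintext == b"my private prompt"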
<a href="https://news.ycombinator.com/item?id=46619643">https://news.ycombinator.com/item?id=46619643</a>
The website is: <a href="https://confer.to/" rel="nofollow">https://confer.to/</a><p>"Confer - Truly private AI. Your space to think."<p>"Your Data Remains Yours, Never trained on. Never sold. Never shared. Nobody can access it but you."<p>"Continue With Google"<p>Make of that what you will.
My issue is that it claims to be end-to-end encrypted, which is really weird. Sure, TLS between you and your bank's server is end-to-end encrypted. But that puts your trust in the service provider.<p>Usually, in a context where a cypherpunk deploys E2EE, it means only the intended parties have access to plaintexts. And when you're having a chat with a server, it's like cloud backups: the data must be encrypted by the time it leaves your device and decrypted only once it has reached your device again. For remote computing, that would require the LLM to handle ciphertexts only, i.e., fully homomorphic encryption (FHE). If it's that, then sure, shut up and take my money, but AFAIK the science of FHE isn't nearly there yet.<p>So the only alternative I can see here is SGX, where the client verifies what the server is doing with the data. That probably works against surveillance capitalism, hostile takeover, etc., but it is also a US NOBUS backdoor. Intel is a PRISM partner after all, and who knows whether national security requests allow compelling SGX keys. The USG did go after Lavabit's RSA keys, after all.<p>So I'd really want to see this either explained, or conveyed in the product's threat model documentation, and see that threat model offered on the front page of the project. Security is about knowing the limits of the privacy design so that the user can make an informed decision.
Looks like using Google for login. You can also "Continue with Email." Logging in with Google is pretty standard.
It is not privacy-oriented if you are sharing login and profile information with Google and Confer.<p>It won't be long until Google and Gemini can read this information, and Google knows you are using Confer.<p>I wouldn't trust it regardless, even if email login is available.<p>The fact that Confer allows Google login shows that Confer doesn't care about users' privacy.
Most people don't care about Google knowing whether they're using a particular app. If they do, they have the option not to use it. The main concern is that the chats themselves are E2E encrypted, which we have every reason to believe.<p>This is a perfect example of purism vs. pragmatism. Moxie is a pragmatist who builds things that the average person can actually use. If it means that millions of people who would otherwise have used ChatGPT will migrate because of the reduced friction and get better privacy as a result, that's a win even if at the margin they're still leaking one insignificant piece of metadata to Google.
You don’t have to use Google login though?
People building solutions like this that aim for broad adoption have to make certain compromises and this seems OK to me (just talking about offering a social login option, haven’t checked the whole project in detail)
Backdoor it?
Add a defunct cryptotoken?
Do what he did for messaging? Make a thing almost nobody uses?
If this is how little you think of an app with ~50 million monthly active users, I take it making apps with a billion MAU is something you routinely do during your toilet breaks, or...?
3 billion WhatsApp users use a protocol built on his labor, every day.
What <i>did</i> he do for messaging? Signal is hardly more private than goddamn WhatsApp. In fact, given that WhatsApp hasn't been heavily shilled as the "totally private messenger for journalists and whistleblowers :^)" by the establishment media, I distrust it less.<p>edit @ -4 points: please go ahead and explain why Signal needs your phone number and rejects third-party clients.
Yeah, it seems kind of funny how Signal is marketed as a somewhat paranoid solution, yet most people run it on an iPhone out of the App Store with no way to verify the source. All it takes is one villain infiltrating one of a few offices and Signal falls apart.<p>The same goes for WhatsApp, but the marketing is different there.
He implemented E2EE in WhatsApp as well.
> Signal is hardly more private than goddamn Whatsapp<p>Kind of, because WhatsApp adopted Signal's E2EE... and not even that long ago!
Even if you discount Signal, he more or less designed the protocol that WhatsApp is using: <a href="https://techcrunch.com/2014/11/18/end-to-end-for-everyone/" rel="nofollow">https://techcrunch.com/2014/11/18/end-to-end-for-everyone/</a><p>Also, while we would expect heavy promotion of a trapped app by some agency, heavy promotion is just as plausible for a protocol/app that <i>actually</i> is secure.<p>You can of course never be sure, but the fact that it's heavily promoted and used by whistleblowers, large corporations, and multiple national officials at the same time is probably the best trustworthiness signal we can ever get for something like this.<p>(If all of these can somewhat trust it, any backdoor would have to be a ridiculously deep conspiracy not to have leaked to at least some national security agency, which would then have forbidden its use.)
> Signal is hardly more private than goddamn Whatsapp.<p>To be fair, that is largely because WhatsApp partnered with Open Whisper to bring the Signal protocol into Whatsapp. So effectively, you're saying "Signal-the-app is hardly more private than another app that shares Signal-the-protocol".<p>In practical terms, the only way for Signal to be significantly more private than WhatsApp is if WhatsApp were deliberately breaking privacy through some alternative channel (e.g. exfiltrating messages through a separate connection to Meta).