WhatsApp Encryption, a Lawsuit, and a Lot of Noise
23 points by sjamaan
I'm a bit conflicted about this. On one hand, baseless allegations against an actually secure product (no matter what you think of Meta) should not receive this kind of attention.
On the other hand, it's great if this could start a bigger conversation around trust and closed source software. Probably too much to hope for, considering it's too technical a topic for most people, anyway.
In a sense there is a tension: this should get court attention, subpoenas, and so on (the defendant is a megacorp that has already been found unscrupulous with user data; they still deserve due process, but the bar for that process being unduly onerous for them is very high). But of course everything is broken, so it gets media attention before the subpoenas play out and before any actual information exists.
Ah, very funny! This is essentially a much more cleanly articulated version of a post I wrote for Lobste.rs a little while ago. Discussion from that: https://lobste.rs/s/kiryys/whatsapp_is_untrustable
This post also makes the same (minor) oversight I did in my post (which was more major there): reproducible + verifiable builds on iOS don't... really exist? For anything? Signal does not provide them for Signal-iOS and (unfortunately) does not provide public discussion as to why, but this can be sussed out by looking at Telegram, which does... kinda. https://core.telegram.org/reproducible-builds
Telegram's "reproducible builds" verification for iOS is a) not something a technically inclined user can casually do (the way they can on Android). Like, it's actually really bad.
"As things stand now, you'll need a jailbroken device, at least 1,5 hours and approximately 90GB of free space to properly set up a virtual machine for the verification process."
This complexity is attributed to Apple's use of FairPlay DRM as an anti-app-piracy measure. But more importantly, my understanding from the output of the ipadiff.py script on the Telegram docs page is that, despite all this trouble, b) there still exist plugin files that remain encrypted, which makes Telegram's iOS app not technically fully verifiable.
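For a sense of what that verification amounts to, here's a minimal sketch of the kind of per-file comparison a tool like ipadiff.py performs. This is not Telegram's actual script; the directory layout and the "known encrypted" plugin paths are assumptions for illustration.

```python
# Toy sketch of an IPA-contents diff: hash every file in the locally built
# app bundle and the store-downloaded one, then report mismatches.
# NOT Telegram's ipadiff.py; paths and KNOWN_ENCRYPTED are made up.
import hashlib
from pathlib import Path

KNOWN_ENCRYPTED = {"PlugIns"}  # hypothetical: subtrees FairPlay leaves encrypted

def hash_tree(root: Path) -> dict[str, str]:
    return {
        str(p.relative_to(root)): hashlib.sha256(p.read_bytes()).hexdigest()
        for p in sorted(root.rglob("*")) if p.is_file()
    }

def diff(local: Path, store: Path) -> None:
    a, b = hash_tree(local), hash_tree(store)
    for name in sorted(a.keys() | b.keys()):
        if a.get(name) == b.get(name):
            continue
        status = "UNVERIFIABLE (encrypted)" if any(
            name.startswith(prefix) for prefix in KNOWN_ENCRYPTED
        ) else "MISMATCH"
        print(f"{status}: {name}")

diff(Path("local_build/Telegram.app"), Path("store_copy/Telegram.app"))
```

The encrypted-plugins caveat above is exactly the problem: those files land in the "unverifiable" bucket no matter how much effort you spend.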
So as commenters on my post pointed out, this means that if you want to fully remove the need to trust Signal Inc., this is only possible with Signal-Android or Signal-Desktop, and only if you only talk to other Signal-Android and Signal-Desktop users. (And of course you still have to trust Android, and trust your phone's not hacked, and trust your phone's SoC... but you don't need to trust Signal Inc.)
Rather than try to verify someone else's build, for iOS it is probably easier to take the source and make your own build.
But the key here is verifying that the official client is built without any modifications.
I don't recall who, but someone here also noted that it's hard for a messaging system like WhatsApp or Signal to protect its users against themselves: whether open or closed source, we're all one update away from a backdoor.
Being open source helps, but we also need reproducible builds that can be audited by third parties, and the user must check such third-party attestations before applying the update. We're still trusting third parties, but this would get rid of the single point of failure.
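As a sketch of what "check third-party approvals before updating" could look like: require that some quorum of independent auditors have published a matching hash for the new build before installing it. The attestation format and the 2-of-3 threshold here are invented for illustration; real attestations would of course be signed.

```python
# Toy update gate: only accept a new build if at least QUORUM independent
# auditors attest to the same reproducible-build hash. Fetching and
# signature-checking the attestations is stubbed out; the point is removing
# the single point of failure (the vendor) from the update decision.
import hashlib

QUORUM = 2  # hypothetical: 2-of-3 auditors must agree

def build_hash(artifact: bytes) -> str:
    return hashlib.sha256(artifact).hexdigest()

def should_install(artifact: bytes, attestations: dict[str, str]) -> bool:
    """attestations maps auditor name -> hash that auditor reproduced."""
    h = build_hash(artifact)
    agreeing = [name for name, attested in attestations.items() if attested == h]
    return len(agreeing) >= QUORUM

update = b"...apk bytes..."
auditors = {
    "auditor-a.example": build_hash(update),
    "auditor-b.example": build_hash(update),
    "auditor-c.example": "deadbeef",  # hasn't reproduced this build yet
}
print(should_install(update, auditors))  # True: 2 of 3 agree
```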
Yes, this is true! This is one of many unfortunate consequences of the tragedy of the "App Store" model on iOS / Android... my kingdom for a proper package manager.
It's such an unfortunate design, actually. Other downsides just off the top of my head: no user control or (reliable) version pinning against malicious / buggy / generally unpleasant updates, no way to roll back to or even access previous versions of apps, and no public infrastructure for supply-chain audits the way e.g. Nixpkgs or the AUR let anybody do them... guh. Awful, awful!
I don't even buy the oft-repeated refrain that App Store models are safer with respect to security bugs. I think the model has burned enough goodwill with ordinary users that the benefit of automatic updates is offset by those users now being broadly update-avoidant. It's a shame, too, because automatic updates aren't inherently linked to the user-hostility of the whole App Store model... you can absolutely have automatic security updates alongside standard package manager baselines.
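To make that last point concrete, here's a toy version of the policy a package manager could enforce: auto-apply patch-level (security) releases while pinning the major/minor version until the user opts in. Purely illustrative; this isn't quoting any real package manager's API.

```python
# Toy update policy: automatic security (patch) updates, explicit consent
# for anything bigger. This is the kind of baseline ordinary package
# managers can support and App Store models mostly don't expose.
def auto_update_ok(installed: str, candidate: str) -> bool:
    i_major, i_minor, _ = map(int, installed.split("."))
    c_major, c_minor, _ = map(int, candidate.split("."))
    # Same major.minor: treat as a patch/security release, apply silently.
    return (i_major, i_minor) == (c_major, c_minor)

print(auto_update_ok("2.4.1", "2.4.3"))  # True: patch release, auto-apply
print(auto_update_ok("2.4.1", "2.5.0"))  # False: needs user opt-in
```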
What if the method of creating keys used for encryption was somehow compromised in a way that makes it straightforward for Meta to generate any device's keys?
While I don't believe allegations until there's proof, I do often wonder about feasibility, and this would explain how neither keys nor plaintext copies of anything would ever need to leave a device.
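For feasibility's sake, here's a toy of what such a compromise could look like: key material derived deterministically from a vendor-held master secret plus a per-device identifier, so the vendor can regenerate any device's keys offline while the output still looks random to everyone else. Entirely hypothetical; nothing suggests WhatsApp does this.

```python
# Toy kleptographic keygen: to any outside observer the 32 bytes look
# uniformly random, but anyone holding MASTER_SECRET can re-derive them
# from the device ID alone. Hypothetical; illustrates feasibility only.
import hmac, hashlib, os

MASTER_SECRET = os.urandom(32)  # in the attack scenario, held by the vendor

def backdoored_keygen(device_id: bytes) -> bytes:
    # Deterministic: same device_id -> same "private key" bytes.
    return hmac.new(MASTER_SECRET, device_id, hashlib.sha256).digest()

def vendor_recovers(device_id: bytes) -> bytes:
    return hmac.new(MASTER_SECRET, device_id, hashlib.sha256).digest()

dev = b"device-1234"
assert backdoored_keygen(dev) == vendor_recovers(dev)
```

Note that, exactly as described above, neither keys nor plaintext ever leave the device; the leak is that the keys were never unpredictable to the vendor in the first place.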
Let me preface this by saying I don't think there's any strong evidence that WhatsApp E2E is broken or backdoored in any way. But: would it be detectable if WhatsApp added an invisible extra participant to every conversation? That way, they wouldn't have to mess with anyone else's keys, they'd just have a copy of every message encrypted to their own keys. I'm not saying this has happened, but it is a way double ratchet chat systems have been attacked in the past (I think I recall something about a Synapse vulnerability that has long since been fixed?). I just don't know how observable it would be.
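A toy sketch of the ghost-participant idea: in a fanout scheme the sender wraps the message key once per recipient, and the recipient list typically comes from the server. If the client doesn't surface or cross-check that list, an extra wrap is invisible. Key wrapping is faked here with HMAC-derived XOR pads; this is not the actual group mechanics of Signal or WhatsApp.

```python
# Toy fanout encryption with a server-injected "ghost" recipient.
# The sender dutifully wraps the message key for every key the server
# lists; unless the UI shows (and the user verifies) that list, the
# extra recipient is unobservable on the wire. Illustrative only.
import hmac, hashlib, os

def wrap(message_key: bytes, recipient_key: bytes) -> bytes:
    pad = hmac.new(recipient_key, b"wrap", hashlib.sha256).digest()
    return bytes(a ^ b for a, b in zip(message_key, pad))

alice, bob = os.urandom(32), os.urandom(32)
ghost = os.urandom(32)  # key held by the eavesdropper

roster_from_server = {"alice": alice, "bob": bob, "ghost": ghost}  # silently augmented

message_key = os.urandom(32)
envelope = {name: wrap(message_key, k) for name, k in roster_from_server.items()}

# The ghost unwraps like any legitimate recipient (the XOR pad is symmetric):
assert wrap(envelope["ghost"], ghost) == message_key
```

This is part of why safety-number / key-verification features matter: they move the effective recipient list out of the server's sole control.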
Yes, that's one option. Another is to have the app send copies of messages after decryption to wherever. And these are just the two easiest ways.
They may not be happening now (and probably aren't), but either could start at any time or happen selectively, and no one would know.