this post was submitted on 04 May 2026
381 points (99.2% liked)

Privacy

[–] quick_snail@feddit.nl 12 points 3 days ago (3 children)

They want to be able to link accounts to your identity, obviously

[–] boonhet@sopuli.xyz 5 points 2 days ago (3 children)

You know, I've been thinking...

Signal's end-to-end encrypted, yes... But we do the key exchange through Signal's servers, don't we? How do we know they don't store copies of the keys? Does the client have a mechanism in place to make sure a man in the middle doesn't do anything funny? I haven't actually delved very deep into the code, but it sounds like I should.

And... Sure, their server code may be open source too, but nobody guarantees that that's the code actually running on their servers.

[–] dreamy@quokk.au 7 points 2 days ago

Here is an overview of all audits done on Signal:
https://community.signalusers.org/t/overview-of-third-party-security-audits/13243
No recent server audits, really. They are pretty public about information requests made by governments and their responses, though, and from what I can see the only pieces of information they have shared with the government to this point are the time of account creation and the date of the last connection to Signal's servers:
[screenshot of one such government response]
You can check more of their responses from here:
https://signal.org/bigbrother/

[–] kuberoot@discuss.tchncs.de 3 points 2 days ago

How do we know they don't store copies of the keys?

I don't know how Signal is built, but you can establish a secure communication channel through a channel that's being listened in on, meaning the server never needs to see the keys. Look up Diffie-Hellman for an example: an algorithm that lets two parties establish a shared secret without transmitting enough information for an eavesdropper to reconstruct it.
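To make the Diffie-Hellman idea concrete, here's a toy sketch in Python. The parameters are deliberately simplified and NOT secure (Signal actually uses X3DH over Curve25519, not classic modular DH); the point is only that the relaying server sees `A` and `B` but can't derive the shared secret:

```python
# Toy Diffie-Hellman over a prime field. Illustration only; these
# parameters are far too weak for real use.
import secrets

p = 2**127 - 1   # a Mersenne prime; fine for a demo
g = 3

a = secrets.randbelow(p - 2) + 2   # Alice's private exponent (never sent)
b = secrets.randbelow(p - 2) + 2   # Bob's private exponent (never sent)

A = pow(g, a, p)   # Alice's public value, relayed through the server
B = pow(g, b, p)   # Bob's public value, relayed through the server

# Each side raises the other's public value to its own private exponent.
secret_alice = pow(B, a, p)
secret_bob = pow(A, b, p)
assert secret_alice == secret_bob   # identical secret, never transmitted
```

A passive server that logged `A` and `B` would have to solve the discrete logarithm problem to recover the secret, which is what makes eavesdropping alone insufficient.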

So if the client uses a secure key exchange algorithm (or straight-up asymmetric encryption), the server can't just grab your keys. You just need a secure way to verify that your keys actually match, because what the server could do is a man-in-the-middle attack: establish one secure channel with you and another with the person you're messaging, then decrypt and re-encrypt messages going both ways, letting it read and modify everything.
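That out-of-band verification step is what Signal's "safety numbers" are for: both users compare a fingerprint derived from the keys, over a channel the server doesn't control. A minimal sketch of the idea (the derivation below is my own illustration, not Signal's actual scheme):

```python
# Sketch of out-of-band key verification. The hashing/grouping here is
# a made-up illustration; Signal's real safety numbers are derived
# differently.
import hashlib

def fingerprint(pubkey_bytes: bytes) -> str:
    """Turn a public key into short, human-comparable groups of digits."""
    digest = hashlib.sha256(pubkey_bytes).hexdigest()
    return " ".join(digest[i:i + 5] for i in range(0, 30, 5))

# What Alice's client received as "Bob's key" vs. Bob's actual key.
alice_view_of_bob = b"bob-public-key"
bob_actual_key = b"bob-public-key"

# If a MITM had swapped in its own key, the fingerprints the two users
# read to each other in person (or over a call) would not match.
assert fingerprint(alice_view_of_bob) == fingerprint(bob_actual_key)
```

This is why the verification has to happen outside the server's channel: a server that can rewrite messages could also rewrite any in-band confirmation.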

[–] quick_snail@feddit.nl 1 points 2 days ago* (last edited 2 days ago)

They ship their app with blobs, so we cannot verify what their app is doing.

[–] pulsewidth@lemmy.world 3 points 3 days ago (1 children)

My thinking also. Corps/governments can link identities to emails too, of course, but that's much harder thanks to email aliasing and the ease of creating new accounts.

Mobile phone numbers are far more personal - people generally have only one or two at most, and keep them for very long periods.

If it's the true reason, it would make Signal not much better than Meta's WhatsApp, which gleans value from its users through metadata: noting who contacts whom, when, and how often, to extrapolate social circles and relationships. But Meta goes further, I think, also tracking location etc., and obviously has much more personal data via the phone numbers linked to many FB/Insta accounts. Signal could potentially be doing some of that to a degree with IP geolocation... Not great.

TL;DR: it's the one thing stopping me from trusting Signal entirely as a benevolent actor - they want and 'need' your personal phone number. I still use it, as it's the best available mainstream option, but I mention this concern when recommending it to those seeking privacy.

[–] quick_snail@feddit.nl 3 points 2 days ago

For me it's the blobs. We can't trust anything they do so long as the code they ship isn't 100% open source.