Signal

Drew DeVault: I don't trust Signal:

I expect a tool which claims to be secure to actually be secure. I don't view "but that makes it harder for the average person" as an acceptable excuse. If Edward Snowden and Bruce Schneier are going to spout the virtues of the app, I expect it to actually be secure when it matters - when vulnerable people using it to encrypt sensitive communications are targeted by smart and powerful adversaries.

Making promises about security without explaining the tradeoffs you made in order to appeal to the average user is unethical. Tradeoffs are necessary - but self-serving tradeoffs are not, and it's your responsibility to clearly explain the drawbacks and advantages of the tradeoffs you make. If you make broad and inaccurate statements about your communications product being "secure", then when the political prisoners who believed you are being tortured and hanged, it's on you. The stakes are serious. Let me explain why I don't think Signal takes them seriously. [...]

Truly secure systems do not require you to trust the service provider. This is the point of end-to-end encryption. But we have to trust that Moxie is running the server software he says he is. We have to trust that he isn't writing down a list of people we've talked to, when, and how often. We have to trust not only that Moxie is trustworthy, but given that Open Whisper Systems is based in San Francisco we have to trust that he hasn't received a national security letter, too (by the way, Signal doesn't have a warrant canary). Moxie can tell us he doesn't store these things, but he could. Truly secure systems don't require trust. [...]

And here comes the truly despicable bit:

Moxie forbids you from distributing branded builds of the Signal app, and if you rebrand he forbids you from using the official Open Whisper servers. Because his servers don't federate, that means that users of Signal forks cannot talk to Signal users. This is a truly genius move. No fork of Signal to date has ever gained any traction, and never will, because you can't talk to any Signal users with them. In fact, there are no third-party applications which can interact with Signal users in any way. Moxie can write as many blog posts which appeal to wispy ideals and "moving ecosystems" as he wants, but those are all really convenient excuses for an argument which allows him to design systems which serve his own interests.

No doubt these are non-trivial problems to solve. But I have personally been involved in open source projects which have collectively solved similarly difficult problems a thousand times over with a combined budget on the order of tens of thousands of dollars.

What were you going to do with that 50 million dollars again?

It is clear from its design and behavior that Signal's priority is to be a social network first and an encryption tool second. Growth at any cost.

Last year I gave Signal a try and it immediately spammed all of my contacts with my non-public phone number. So I was already aware that Signal is sketchy as fuck.

But abusing Trademark law to circumvent the checks and balances that open source development normally provides is just appalling. They get to pretend that it is open source, get the bullet item on the pitch sheet, get the good press associated with that, while still maintaining absolute control. It's no less a vertically-integrated, untrustworthy data silo than any product from Facebook or Google.



32 Responses:

  1. HS says:

    >Moxie can tell us he doesn't store these things, but he could. Truly secure systems don't require trust.

    Couldn’t you verify that messages are encrypted end to end without having to trust him?

    • Ham Monger says:

      How do you know the official servers don't do key escrow?

      I don't use Signal and I'm not saying it does key escrow, but I could trivially set up my servers and clients that behave differently when my official client for my protocol connects to my official servers, versus when they connect to open or unofficial servers/clients.
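
      (To make that concrete: a toy sketch of the kind of thing I mean -- entirely hypothetical, not a claim about Signal's actual code, with a made-up hostname -- where a shipped binary quietly behaves differently depending on which server it's pointed at.)

          # Hypothetical client logic; nothing here is Signal's real code.
          OFFICIAL_SERVER = "chat.official.example.org"   # made-up hostname

          def build_session(server_host, identity_key):
              session = {"server": server_host, "escrowed_key": None}
              if server_host == OFFICIAL_SERVER:
                  # The published source never has to contain this branch;
                  # the point is only that a shipped binary *could*.
                  session["escrowed_key"] = identity_key
              return session

          print(build_session("unofficial.example.net", b"K")["escrowed_key"])  # None
          print(build_session(OFFICIAL_SERVER, b"K")["escrowed_key"])           # b'K'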

      Because the official client is distributed in binary form by the app stores, there's also no guarantee that the open source code was used to create it. In fact, if Signal is one of the class of applications where new features show up in closed builds before making it into a "community" edition, they're almost certainly different.

      Also also, a third party could have injected key escrow into the official builds distributed by the app stores, and only the Signal company would have a chance of knowing, and that requires them to check each build release against the known behavior they expect. (Targeted distribution of Trojan Horse or haxed apps to specific people is outside the scope of this comment, despite being germane to Signal's operations, since it gets into vendor lock-in, closed smartphone ecosystems and monocultures, etc., etc.)

      • Nick Lamb says:

        This hypothetical third party has somehow tampered with the official app store builds. First of all, how does that even work?

        The Android package system (and I'm assuming Apple's isn't significantly more stupid than this) requires packages signed by the vendor. The app store doesn't even end up knowing how to make new packages that could be seamlessly presented to a device as genuine, even if it wanted to‡. So, is this hypothetical adversary tampering with Signal's official build servers before the packages are signed, or what?

        Either way: Most of the Signal app is deliberately a reproducible build. So, the fact that an official client you got from the legitimate store doesn't match the reproducible one is an obvious sign to any third party that something is wrong; even if we only bother to check later, it's a smoking gun.

        ‡ If you're Jamie Zawinski you might include your private keys in a source tarball by mistake, but let's assume for a moment that we're holding Open Whisper Systems to a higher standard.
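
        (The check being described boils down to something like this -- a minimal sketch with made-up file names, glossing over the fact that in practice you strip the store's signing metadata before comparing.)

            # Compare your own reproducible build against the store's APK.
            # File names are placeholders; real checks strip signatures first.
            import hashlib

            def sha256_of(path):
                h = hashlib.sha256()
                with open(path, "rb") as f:
                    for chunk in iter(lambda: f.read(1 << 20), b""):
                        h.update(chunk)
                return h.hexdigest()

            store = sha256_of("Signal-from-the-store.apk")
            local = sha256_of("Signal-built-from-source.apk")
            print("builds match" if store == local else "smoking gun: builds differ")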

        Now, key escrow was a big deal back in the 1990s, but the technology for this stuff has moved on. So, for Signal you're going to need to track all the ephemeral state inside at least one participant's systems to snoop messages.

        This means you don't just have a secret "Here's the keys, psst, I'm pretending this is um... a firmware version check call" feature hidden in the initial key generation, because that key doesn't get the job done over an extended conversation. You need to steal new ephemeral state with every message, or at least the vast majority of messages. If messages go back and forth between sender and recipient without you either having a copy of the underlying ephemeral state or messing with the plaintext of the messages somehow, you are now locked out. This is Signal's "double ratchet" design: both participants are "constantly" (with each message) feeding each other new Diffie-Hellman state and changing their symmetric keys.
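
        (A toy sketch of that ratcheting idea -- nothing like the production construction, which uses X25519 and a far more careful KDF, and in which the two sides alternate rather than both generating fresh keys every message -- but it shows why stale state stops being useful.)

            import hashlib, secrets

            P, G = 0xFFFFFFFFFFFFFFC5, 5     # toy DH group, far too small for real use

            def dh_keypair():
                priv = secrets.randbelow(P - 2) + 2
                return priv, pow(G, priv, P)

            def kdf(chain_key, dh_out):
                material = hashlib.sha256(chain_key + dh_out.to_bytes(16, "big")).digest()
                return material[:16], material[16:]      # (next chain key, message key)

            chain = hashlib.sha256(b"initial shared secret").digest()[:16]

            for i in range(3):
                # Each message carries a fresh DH public key...
                a_priv, a_pub = dh_keypair()
                b_priv, b_pub = dh_keypair()
                dh_out = pow(b_pub, a_priv, P)
                assert dh_out == pow(a_pub, b_priv, P)   # both ends derive the same value
                # ...which both ends mix into the chain to get the next message key.
                chain, msg_key = kdf(chain, dh_out)
                print(f"message {i}: key {msg_key.hex()}")
            # Stealing one message key tells you nothing about the next one unless
            # you keep stealing the fresh DH state too -- which is the point above.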

        But if you've got a mechanism that lets you steal all this new data, why not just steal the plaintext itself? What's this complicated "escrow" feature doing for you?

      • Nolan says:

        You can look at the source and see that the key never leaves the phone, so key escrow is not possible.

        Signal (at least for Android) builds reproducibly so you can verify that the bits you get from the app stores match the source code you looked at.

        Signal development happens on the same github repo you see, so releases are done source code first, app store binaries 2nd.

    • jwz says:

      Even if the end-to-end crypto is trustworthy, metadata is still a thing that people care about. In fact, it puts people in prison with some regularity.

      • Perry Metzger says:

        The system is deliberately built to make it extremely difficult for them to capture metadata. Moxie & Co. have even published papers on how they do that.

  2. jancsika says:

    Do you know if anyone did a write-up since your last complaint that explains how the source code of Signal can achieve what you described in your anecdotes? (I.e., sending non-public phone numbers to users who didn't already have them.)

    • HS says:

      Probably not, because from what I can tell, what he wrote was categorically false.

      JWZ: You're doing this thing

      Moxie: It's not even possible

      JWZ: But these users on twitter seem to think you are, which is just as bad.

      • jwz says:

        "HEY EVERYBODY YOUR CASUAL ACQUAINTANCE SO-AND-SO IS ON SIGNAL NOW AND HERE'S THEIR NUMBER" is, without question or dispute, a thing that happens. Without warning or option. This is despicable.

        Many people believe, based on circumstantial evidence, that Signal leaks the phone number.

        The Signal developers claim, "No, we didn't leak the number, we just data-mined the contacts and SMS logs of each and every other user, surfaced a number that those users didn't even know they already had, and then attached a name to it and popped up a notification. We did nothing wrong! We're helping, watch us help!"

        That attitude is like, "Well if you didn't want everyone to know who you are, you shouldn't have walked past all of those face-tracking video cameras."

        • jancsika says:

          So then two options:

          "Let all my contacts know I use Signal and connect with me"

          "Let me choose which contacts can see I'm on Signal"

          Choosing the second option would let you choose from a list of contacts the Signal app mined on your phone. After that it only sends the hashes of those contacts to the server (or whatever it currently does for contact discovery).

          Would such a design have met your requirements back when you tried out Signal?

          • rozzin says:

            What am I missing here?

            I don't get why you even need to make choices about the _mechanism_ or _scope_ of advertising your availability for something that's supposed to be a drop-in replacement for existing lines of SMS-based communication; I don't understand why the advertisement is even necessary in the first place, when the premise is "the client software already has access to the user's list of contacts". Why do people even need to know _up-front_ who they can/cannot send Super-SMS messages to vs. who they still need to send plain SMS to?

            Why isn't it sufficient for Signal to just present _all_ of your SMS contacts as people you can message? If you try sending them a Super-SMS message and it doesn't work, then offer to fall back to sending the message over plain SMS and/or sending a "you should upgrade to Super-SMS" invitation to them over plain SMS (or whatever)—so you discover someone by sending them a message, and you can't discover them without them knowing that you've done so.
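
            (Roughly this shape, in made-up pseudo-program form -- none of these names are real APIs, it's just the flow I mean.)

                REGISTERED = {"+15551234567"}        # stand-in for the server's user list

                class NotRegistered(Exception):
                    pass

                def send_super_sms(number, text):
                    if number not in REGISTERED:
                        raise NotRegistered(number)
                    print(f"[encrypted] -> {number}: {text}")

                def send_plain_sms(number, text):
                    print(f"[plain sms] -> {number}: {text}")

                def send_message(number, text):
                    try:
                        send_super_sms(number, text)     # discovery happens by trying,
                    except NotRegistered:                # not by uploading an address book
                        send_plain_sms(number, text)
                        send_plain_sms(number, "P.S. I've switched to Super-SMS; "
                                               "install it to reply securely.")

                send_message("+15551234567", "hi")       # delivered encrypted
                send_message("+15559999999", "hi")       # falls back, with an invitation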

            And if you want to go a step further, on the other end, have the program catch _incoming_ SMS and send back invitations like "I've switched to this new Super-SMS system, you should too--please use it instead of this plain old SMS".

            For something that sounds like it's supposed to be a drop-in replacement for SMS, why does it not seem to actually be doing anything like that? Is it because for some reason none of this is actually possible, or what?

            • jancsika says:

              Is your suggestion to do contact discovery and key exchange over SMS?

              • jwz says:

                Let me try and simplify what I think he was saying.

                The Signal protocol has a feature where you can upload a (hash of a) phone number and they will tell you "yes, this person has a Signal account -- and here's their name!"

                The Signal client (effectively) periodically uploads and probes every phone number you've ever been To or Cc on, and if there's a new match, pops up a happy notification.

                His proposal, which I agree with, is: how about FUCKING DON'T.

                You want to send a message to someone with Signal? Go ahead and try. Maybe you get back an error saying "hey that number is not a Signal user." So then you use SMS. Maybe Signal can make falling back to SMS easy. Or not. Whatever.

                Of course the "how about FUCKING DON'T" approach is anathema to "but we must grow the network! Invite invite invite!" And so it is not VC Catnip.

                • Nick Lamb says:

                  The Signal protocol has a feature where you can upload a (hash of a) phone number and they will tell you "yes, this person has a Signal account -- and here's their name!"

                  This doesn't exist. For this to work Signal would need to know the names of Signal users, which it doesn't want to learn because (remember you ranted about it earlier) obviously if they have that information it will get subpoenaed at the very least.

                  "But," I can already hear you thinking, "my friend's name appeared in my Signal app". Yup. Almost as you described in your "FUCKING DON'T" stuff these days the two Signal apps will automatically do all this heavy lifting when their users communicate. Specifically, if you have Alice in your Contacts but not Bob, when you communicate with Alice she gets your profile data, but communicating with Bob doesn't result in him getting your profile. The app is doing a bunch of work anyway, may as well skip the "New phone, who dis?" step for people you clearly know. [ If you deny Signal access to Contacts then it doesn't know who your Contacts are and won't provide your profile to anyone. ]

                  You want to send a message to someone with Signal? Go ahead and try. Maybe you get back an error saying "hey that number is not a Signal user." So then you use SMS. Maybe Signal can make falling back to SMS easy. Or not. Whatever.

                  Ignorance is not the same thing as privacy. Making the UX worse so that you can pretend that lots of people who actually have your phone number don't have it is not a privacy win; it just allows you to keep up your preferred ignorance. In particular, this doesn't hamper an adversary who wonders whether Jamie has Signal: under FUCKING DON'T the adversary still gets their answer to that question, but your friends don't know (without trying) whether they can call you securely, because unlike the adversary they're relying on FUCKING DON'T, which hides this from them until after they decide.

                  • jwz says:

                    This doesn't exist.

                    I assure you that I got messages from people whom I barely know, who did not even realize that they had my phone number -- and whose number I have never had -- and who further assure me that they did not have a "Contacts" entry with my name and number in it -- who got a "Jamie is on Signal!" pop-up 30 minutes after I created my account.

                    So again, I don't know who to believe, you or my lying eyes.

                    just allows you to keep up your preferred ignorance

                    So you favor the starry-eyed 90s David Brin argument that all of those facial recognition video cameras every 30 feet in every city should be publishing their logs for everyone to see, because if it can be mined in some way then there was no expectation of privacy.

                    Yes, someone could mine their 5-year-old archive of text messages which included some group-texts of unrecognized numbers, and pipe all of those half-decade-old unknown numbers into some sketchy-ass search engine to see who that actually was. Oh wait, they don't have to, Signal is that sketchy-assed search engine and it does it for free twice an hour.

                    "You thought so much about whether you could that you didn't think about whether you should."

                  • NT says:

                    In my case, within an hour of installing Signal I got a message from the only phone number I've ever blocked.
                    I understand why they are doing this but fuck them for pretending it's not a problem.

                • jancsika says:

                  It sounds to me like there are three separate thingies:

                  1. Signal server has a "Don't tell me your card" hashing trick that it uses to do contact discovery without storing a social graph. (Let's assume they have solved the problem of the phone number keyspace being brute-forceable -- see the sketch after this list.)

                  2. Signal clients upload a hash "deck" generated from the contacts the client's user authorized it to read.

                  3. Based on your reports, the Signal client probably mines more data on the user's device for inferences about potential Signal users -- SMS logs, etc. -- and adds that to the deck it uploads to the server. This causes a big, uncontrollable, and dangerous surprise for even technically-minded users.
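
                  (On the parenthetical in point 1, referenced above: the reason a bare hash doesn't solve it is that the phone-number keyspace is small enough to enumerate. A toy demonstration over a deliberately tiny range:)

                      import hashlib

                      def h(number):
                          return hashlib.sha256(number.encode()).hexdigest()

                      leaked = h("+15550007342")    # pretend this hash came off the wire

                      # An adversary just hashes every candidate number until one matches.
                      for n in range(10000):
                          candidate = f"+1555000{n:04d}"
                          if h(candidate) == leaked:
                              print("recovered:", candidate)
                              break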

                  So...

                  If #3 were considered a bug and fixed, and the option I suggested above were added, would that address the despicable behavior you highlighted in all caps?

                  • jwz says:

                    Look, the post about Signal's despicable social-graph-mining and privacy-leaking practices was last year. Must we re-hash all of that again? ("noun: re-hash ˈrēˌhaSH/ 1. a reuse of old comments without significant change or improvement.")

                    The point of this post was to say, "Hey, Signal says it's open source but it uses Trademark law to quash any meaningful outside development, deployment or forking, and that is fucked up". It's not really open source at all unless your definition of that is limited to "we published the source code".

                  • Nate says:

                    I assure you that I got messages from people whom I barely know, who did not even realize that they had my phone number -- and whose number I have never had -- and who further assure me that they did not have a "Contacts" entry with my name and number in it -- who got a "Jamie is on Signal!" pop-up 30 minutes after I created my account.

                    I don't want to keep rehashing the original post either, but I am certain that the other person had your Contact on their phone ("Jamie" + your phone number) if it showed them a "Jamie is on Signal" notification. Perhaps they had simply forgotten it was there.

                    Only hashed phone numbers are uploaded to the Signal server, not names of either party. The notification is generated locally, by joining the phone number hash with the contact card to get a name to display.
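
                    (Roughly this shape, simplified and with made-up data -- the point being that the name never travels; it's joined back in from the local address book.)

                        import hashlib

                        def h(number):
                            return hashlib.sha256(number.encode()).hexdigest()

                        my_address_book = {"+15557654321": "Jamie"}   # lives only on this phone

                        uploaded = {h(n): n for n in my_address_book} # what the server sees
                        registered = {h("+15557654321")}              # server: "this hash is registered"

                        for digest, number in uploaded.items():
                            if digest in registered:
                                # The hash-to-name join happens locally, not on the server.
                                print(f"{my_address_book[number]} is on Signal!")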

                    https://support.signal.org/hc/en-us/articles/360007061452-Does-Signal-send-my-number-to-my-contacts-

                    You can look at the source to confirm this.

                • rozzin says:

                  jwz wrote:

                  His proposal, which I agree with, is: how about FUCKING DON'T.

                  You want to send a message to someone with Signal? Go ahead and try. Maybe you get back an error saying "hey that number is not a Signal user." So then you use SMS. Maybe Signal can make falling back to SMS easy. Or not. Whatever.

                  Yes, that's exactly what I'm saying.

                  The answer to the whole "how do I tell whether it's going to work" question really seems like it's just "you don't need to--you'll know whether it does work once you've tried it". Like "duck typing" in programming, I guess....

                  But I'm actually not sure which part of what I wrote jancsika was even responding to: my more cynical side suspects that the reader skipped over all of the important parts and read too much into the bit at the end, where I suggested they could also give people some sort of preconfigured SMS autoresponder to further the "grow the network" goal. (SMS autoresponders already exist, and have been a thing for a while now -- I didn't think the idea of "include an SMS autoresponder preconfigured to send a message like `I am done with SMS, please download this program and use it to communicate with me instead' whenever someone sends you an old-style SMS" would be so novel to anyone that it would detract from the point I laid out in the previous two paragraphs.) Maybe it's because I still call them "programs" instead of "apps" -- is that confusing?

                  • Nate says:

                    The problem with SMS fallback is that it gives an easy way to do a downgrade attack (possibly silently, depending on how it's implemented). Given the choice between not getting a message through or sending it insecurely, users are likely to take the easy route. The small effort required to cut/paste or retype it into an SMS app at least makes it clear they're intentionally doing so.
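
                    (Concretely, the worry is the difference between these two branches -- toy code, not a real client. An attacker who can block the secure channel wins automatically under silent fallback, and has to hope the user says "yes" under the explicit version.)

                        def send(text, secure_channel_up, ask_user):
                            if secure_channel_up:
                                return "sent encrypted"
                            # An attacker who can jam the secure channel forces us here.
                            if ask_user("Secure delivery failed. Send as plain SMS?"):
                                return "sent as plaintext (user explicitly chose to)"
                            return "not sent"

                        print(send("hi", False, ask_user=lambda q: False))  # refused: no silent downgrade
                        print(send("hi", False, ask_user=lambda q: True))   # downgraded, but knowingly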

  3. Nick Lamb says:

    The author's responses when asked what he thinks people should use instead are I think helpful in deciding whether this is an unhinged rant or a meaningful assessment of Signal.

    First he says people should use Matrix. Matrix doesn't have working end-to-end encryption by default, so you know, make sure you go into a preferences panel and remember to switch it on. If it gets mysteriously "switched off" and you're subsequently tortured, don't blame DeVault, presumably, for some reason. Your Matrix client may not have such a preference - too bad, you can't have end-to-end encryption, it's just a niche feature anyway so why make it central to er... wait, what were we talking about again?

    He subsequently recommended Telegram. Again, no end-to-end by default, and whereas DeVault imagines that somehow Moxie might have hidden a backdoor, Telegram's custom hand-rolled crypto design purposefully includes one, and considers it a feature. So when you're being tortured you can feel assured that this is all by design... It's OK though, because unlike Open Whisper Systems, which may be vulnerable to pressure from a National Security Letter, the people behind Telegram are Russians, and we know Russia doesn't er... well at least its leader isn't... oh.

    Thomas Ptacek doesn't agree with me about very much at all, but on this point we're entirely in sync. Moxie's plan with Signal gets the protocol used. Not "Well, it was more important to focus on federating with GNOME Bonkleweedle 0.0.2a which only supports plaintext, but we do want encryption some day" but controlling all the moving parts to actually get real encryption to real users today. Encryption you don't use hasn't delivered. Our host has even been down this road personally, spending months working on S/MIME. People don't use it, so S/MIME made no real difference, even if the technology in it was fairly sound (for the time).

    • jwz says:

      The author's responses when asked what he thinks people should use instead are I think helpful in deciding whether this is an unhinged rant or a meaningful assessment of Signal.

      Absolutely false! If I say "X is bad" there is no requirement that I also say "Y is better". The attitude that you must do that is endemic to the open source world and it's just toxic and insane. It's their second-worst attitude, right behind "well why don't you just fix it yourself then".

      Moxie's plan with Signal gets the protocol used.

      "Growth at any cost." Self-promotion comes first. Get the users, invite invite invite, work that network effect, all other concerns secondary, crew expendable.

      • HS says:

        Absolutely false! If I say "X is bad" there is no requirement that I also say "Y is better". The attitude that you must do that is endemic to the open source world and it's just toxic and insane. It's their second-worst attitude, right behind "well why don't you just fix it yourself then".

        True - but here, the author has indeed made alternate recommendations, and they seem to be objectively worse than Signal. The point isn't that Signal is great, but that you're probably still better off using it than anything else that's available.

      • Nick Lamb says:

        Remind me, those people who decided "Hillary is bad" and so stayed home instead, what was the effect of that again? I mean, those people made sure Hillary didn't win, so they got what they wanted, right?

        The article you linked could say "Everything is terrible, never communicate with other people except in person, and then only in an underground bunker on neutral ground after scanning for bugs". That would be largely useless advice in practice, but it's not obviously wrong the way the actual text is.

        The article instead just recommends you use stuff that's less secure, on the basis that it suits the author's preferences for how to write software. "Trust" turns out to be a question of whether they're amenable to the author's fervour for cloning other people's stuff (DeVault's clones of Minecraft and GitHub, for example). DeVault couldn't give a crap about security, and that's the point of my criticism. When they were challenged about this they doubled down - and proposed yet more insecure alternatives, only because, hey, those happen to agree with the author's philosophy about how software should be developed.

        Rather than "Growth at any cost" what you've got is "Encryption is the priority". Skype growing a Signal protocol implementation doesn't achieve any of the "Growth" you imagine Open Whisper Systems is going for, but it did mean more of Microsoft's users got working encryption. WhatsApp, Faceboom Messenger, Google Allo. None of them achieved "Growth at any cost" for Signal, but by adopting Signal protocol with help from Moxie's team their users got encryption.

        • jwz says:

          Remind me, those people who decided "Hillary is bad"

          YOW are we GODWINNING yet??

          The article you linked could say "Everything is terrible, never communicate with other people

          Wow, if I was complaining about HDMI, you'd accuse me of kowtowing to the Big VHS lobby.

          No, it doesn't say that at all.

          It says several things quite clearly: 1) Signal's policies require use of Google Play, which is unsafe; 2) Signal requires that you trust the developers' servers, which is a poor design for security software; 3) The developers abuse Trademark law to completely undermine how open source development is supposed to work; 4) The developers know all this and have shown themselves uninterested in fixing it.

          Now, you might choose to argue about any of these things, whether they are true or whether you care. Ok. But then at the end of this 1,900-word essay, there was a literally 15-word "PS" with some recommendations of alternatives. And that "PS" is the only thing you want to talk about.

          The article instead just recommends you use stuff that's less secure, on the basis that it suits the author's preferences for how to write software.

          How to write security software that is less likely to be co-opted by extremely motivated attackers. That's an important fucking distinction.

          • Nick Lamb says:

            When somebody skewers you, the traditional response is "Touché" rather than referencing Zippy and Nazis.

            You're correct that the blog post doesn't say the thing I claimed it doesn't say. And? Maybe your reading comprehension is off, maybe I didn't do a great job of making my point.

            As to your listed 4 things, sigh, I took it as read that these are all wrong:

            1. Signal's policy correctly assumes most Android users have Google Play. It doesn't require this, but if you insist on not having Google Play you will need to manually install the non-Google Play build they provide instead of getting it from the Google Play store, and obviously you don't magically get the benefits of Google Play in this non-Google Play app‡. The author believes instead that Signal binaries should be compiled for and offered in every independent alternative to Google Play of the author's choice. How much did GNU use to claim they'd charge for pre-compiled binaries?

            2. The apparent alternatives are trust everybody (the author's preference, which is clearly far worse if you actually want any security) or trust nobody (the same security as when you switch off a PC and drop it into a deep ocean trench... useless, but secure).

            3. This is how lots of popular Free Software works: downloading their code from GitHub neither entitles me to issue my own working publicly trusted certificates like Let's Encrypt, nor to call my non-working service "Let's Encrypt".

            4. Because it's not true, the developers have mostly lost patience with "correcting" people who never cared whether it was true anyway.

            ‡ Unlike DeVault I think you actually care about security, so here's a great example. Don't Stand Out is clearly violated by operating your own special notification servers unique to your private communications channel. Google Play and Apple's equivalent allow your "New Signal message" notification to blend in with the millions of people whose Amazon parcel was just delivered, or who got a new email, or whose high score was just beaten by a friend, or who just got a "Like" on their Youtube video. But the non-Google Play version of Signal for Android can't use that, and so must improvise its own alternative so it's guaranteed to Stand Out.

            Assuming work like DPRIVE and draft-rescorla-tls-esni goes to plan, we're probably 3-5 years away from the official Signal being entirely indistinguishable to an adversary (other than the phone vendor) from typical usage of any popular smartphone. Did our target get a Signal message? No idea, her phone did some opaque network stuff, it talked to a cloud service. Could have been anything.

            But private federated servers ensure you'll Stand Out forever. It's essentially painting a target on your back. Anybody who actually cared about security wouldn't want that. But like I said, DeVault doesn't actually care about security.

    • Licaon_Kter says:

      Regarding the recommendations...

      Matrix has server (hardware) requirements high enough to make federation a challenge, so most people just use the main Synapse instance anyway. And as for those $5 million from status.im, it's not clear what they'll do: actually standardize the protocol? Code Dendrite (the successor to Synapse)? Oh right, replace Slack with Riot, because business, I guess.

      Tox development is slow (at least on the mobile side), and does it even work when one party is offline?

      Not sure why XMPP+OMEMO was missed - was it because no one has thrown a couple of million dollars at it, or what? Even without that money, Conversations on Android is the standard (free on F-Droid, with forks like Pix-Art); ChatSecure on iOS (with Monal coming soon) has a lot of work ahead but is usable (on up-to-date servers); and classic Gajim on desktop (with Dino in the works) is still going fine. Server side, boot the RPi1 you've had since 2012, put an instance of ejabberd/prosody/metronome on it, and off you go: your own e2ee chat system that federates, is open source, and works.

    • ennui says:

      the answer is, of course, that you shouldn't use the internet to communicate things you don't want cops to read or to talk to people you don't want cops to know about.

      I don't understand how the same community that believes DRM will never work for encrypting media products also believes that you can have a secure encrypted messaging platform: analog "piracy" is always going to work, in this case by leaning on you or whoever you message to unlock your phone so "they" can read your messages.

      but on a basic level, the problem isn't even technical. encryption is ultimately a security model for whatever you are doing. but as a security model it's really best for the whole spy situation, where you have an "agent" communicating to a secure silo which is protected from the people you need security from. this is a terrible model for most people whose communication is neither siloed nor protected.

      the whole thing is typical of the tech mindset where thinking about the actual political, social, human problem never actually happens. privacy only becomes a technical problem once you decide who you need privacy from. but sure, let's convince everyone that you really can have secure conversations on the internet despite this never actually having ever happened.

      you can see the same broken argument with "two-factor" authentication. by convincing people that their messages are secure, you lock them into a security arms race they have no way to keep up with. secure passwords --> two-factor to sms --> two-factor to dongle --> what's next?

      the only way to win is convince people not to play, but where would that leave your start-up?

  4. fattire says:

    Aside from the number one issue I have with Signal (federation, and the associated real-world vulnerabilities that go with it), there are other potential security concerns that are reported but remain unaddressed.

    Here's one I submitted a few months ago relating to the overly broad storage permission request in the Android client. It was closed almost immediately with a request to submit for further discussion on their forum. I posted it there, where it received widespread discussion and analysis of--

    just kidding. Where it was completely ignored.

    ¯\_(ツ)_/¯

  5. Jeremiah Blatz says:

    And yet, in the real world, people who work with actual dissidents who are actually likely to be hanged or tortured advise these dissidents to use WhatsApp. Why? Because it's damn secure, and _everybody uses WhatsApp_. Cover traffic is important. Without it, a repressive regime will just torture and then hang everyone who uses the super-secure-but-impossible-to-use protocol.

    • Nick Lamb says:

      The thing that makes WhatsApp "damn secure" is literally the Signal Protocol. Open Whisper Systems (who our host is claiming are in this to try to build a social network and get them VC dollars rolling in) actually just wants to encrypt people's communications, so they worked with WhatsApp to give WhatsApp users this functionality a few years back.

      For example if you look in your WhatsApp conversations with a friend you'll find a "Security code" you've probably never checked. This is the exact same type of code you see in Signal, because it serves the same purpose. Unlike you, actual dissidents should make sure to verify this code with conversation partners because if it doesn't exactly match then an adversary has MITM'd your conversation and the secret police may be eavesdropping on everything said or alter any messages of their choosing.
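
      (In sketch form, that's all the code is -- the real safety-number construction is more elaborate than a bare hash, but the property is the same: both phones derive it from the two identity keys, so a substituted key shows up as a mismatch.)

          import hashlib

          def safety_code(key_a, key_b):
              digest = hashlib.sha256(b"".join(sorted([key_a, key_b]))).hexdigest()
              return " ".join(digest[i:i + 5] for i in range(0, 30, 5))

          alice, bob, mallory = b"A" * 32, b"B" * 32, b"M" * 32

          print(safety_code(alice, bob))       # what Alice's phone shows
          print(safety_code(bob, alice))       # what Bob's phone shows: identical
          print(safety_code(alice, mallory))   # what Alice sees if MITM'd: different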
