Filippo Valsorda wrote an excellent essay on why he's giving up on PGP. I have long believed PGP to be more trouble than it is worth. It's hard to use correctly, and easy to get wrong. More generally, e-mail is inherently difficult to secure because of all the different things we ask of it and use it for.
Valsorda has a different complaint, that its long-term secrets are an unnecessary source of risk:
But the real issues, I realized, are more subtle. I never felt confident in the security of my long-term keys. The more time passed, the more I would feel uneasy about any specific key. Yubikeys would get exposed to hotel rooms. Offline keys would sit in a far away drawer or safe. Vulnerabilities would be announced. USB devices would get plugged in.
A long-term key is as secure as the minimum common denominator of your security practices over its lifetime. It's the weak link.
Worse, long-term key patterns, like collecting signatures and printing fingerprints on business cards, discourage practices that would otherwise be obvious hygiene: rotating keys often, having different keys for different devices, compartmentalization. They also encourage expanding the attack surface by prompting backups of the key.
Both he and I favor encrypted messaging, either Signal or OTR.
The business card I gave you in Berlin did have a PGP key fingerprint but that is only one key of many. It is useful to have one well known stable key. Printing it there did not stop me from creating other keys for specific purposes. The real problem with any encryption is that almost nobody outside of the infosec community bothers to use it until after they are being pursued. People may have heard of PGP and so there is a better chance they would use it.
It's funny how security puritans periodically pop up to attack PGP. The idea of using Signal instead of PGP does not hold water. Email's email, IM's IM. Their usage is totally different and they are not interchangeable.
What is needed are solutions that make using PGP easier and less error-prone. The disastrously and irreparably UX-oblivious security techies totally failed at this task (except maybe Protonmail, but they are full of other flaws, like being a walled garden and not supporting IMAP), and from time to time they wake up and say: let's ditch email altogether. It's like bad doctors who want to kill the patient to get rid of him.
As to Signal, Moxie Marlinspike refuses to federate (and made sure LibreSignal died because of this). Because of this, Signal will remain a small niche, even if IM were interchangeable with email, which it is not.
Email is, and will likely remain for the foreseeable future, the only robustly and globally federated messaging system with which people can still control their privacy, using open source PGP. More usable PGP-based solutions are needed.
Well, I wish the purists good luck trying to ditch email, and even more luck finding other people who will ditch email.
I don't see why a Yubikey would be "exposed" in a hotel room; unless an attacker decapsulates it or knows your PIN, he can't access the private key stored in it. PGP (or GPG) isn't hard to use at all. It's very basic: a public key you publish and share, which people use to encrypt content sent to you, and a private key you keep to yourself and use to decrypt messages sent to you. As to lifetime keys, it just depends on how secure they are. I have one in a safely stored YubiKey 4 which I use to generate and/or revoke subkeys used in my other YubiKeys or smartcards. If you can revoke it, a lifetime key isn't a problem.
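The master-key-plus-subkeys setup described above can be sketched as a toy trust structure. This is emphatically not OpenPGP (real GPG uses public-key signatures, and all names below are invented); it only illustrates why a revocable master key makes per-device subkeys safe to lose:

```python
import hashlib
import hmac

# Toy model: an offline master key certifies per-device subkeys and can
# later revoke them. HMAC stands in for a real public-key signature.
MASTER_KEY = b"offline-master-secret"  # lives on the safely stored YubiKey

def certify(subkey_id: str) -> str:
    """Master key 'signs' a subkey so others can trust it."""
    return hmac.new(MASTER_KEY, subkey_id.encode(), hashlib.sha256).hexdigest()

revoked: set[str] = set()

def is_trusted(subkey_id: str, cert: str) -> bool:
    """A subkey is usable only if its cert verifies and it isn't revoked."""
    return hmac.compare_digest(certify(subkey_id), cert) and subkey_id not in revoked

cert = certify("laptop-2017")
assert is_trusted("laptop-2017", cert)

revoked.add("laptop-2017")  # device lost: revoke it, the master key survives
assert not is_trusted("laptop-2017", cert)
```

The point of the structure: losing a laptop costs one subkey, not the long-term identity, as long as the master key stays offline.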
I also don't share the idea that PGP is so bad just because its user friendliness is, frankly, quite poor. There are tons of new solutions and projects working on changing that.
When you look at [openpgp.org] and check their roadmaps, you will see their devs are trying to bring out more user-friendly PGP software during the next year. Automation like what Signal or OTR already provide is planned as well.
Infosec people have been using PGP for the last 20 years; its security holds and it's still being developed. You can't say the same about a lot of other security software. When end-user PGP software becomes friendlier and easier for ordinary people to use, it will spread a lot faster than Signal. Think about how many people are still using email.
Some points are quite valid, but what about the alternative solutions suggested? Signal is not fully open source, Wire's code is not yet published, and WhatsApp is Facebook and clearly won't ever be open source.
For now we can't really trust any other solution... Or can we?
Having a long-term key within a web of trust to assert one's identity (with the usual "identity" footnote) is important. It is completely decentralized, works with _any_ file format or protocol, and has stood the test of time. I have yet to see a viable alternative that does anything more than address a small subset of the problem.
Yeah, "Too Cool for PGP" has his own kewl-ness issues, it seems. First, my impression was that yes you are indeed anonymous when using TOR - you just don't have privacy, which are two different things (VPNs for privacy, TOR / I2P for anonymity).
Second he is dead wrong in flogging the old claim that 80% of TOR traffic to hidden services is CP: [www.wired.com]
Seriously: "One does not Kewl become by parroting the lies of the government"
Also, have to comment on this paragraph: /// TOR does a good job anonymizing the network route. In my opinion, TOR really has only two big problems. First, they make no effort to curb illegal activities. ... An unmoderated environment attracts criminals who attract law enforcement and governments. ///
That is by design. And it has worked perfectly in making people think that TOR is secure. For "people", substitute "child pornography consumers" etc..
/// The second problem is that the TOR Project has been repeatedly promoting their service as a secure solution. ///
Plan, part of, you connect the dots.
/// But TOR is only one small part of a secure solution. This misleading promotional campaign is causing regular users to put themselves at risk by assuming that TOR is safe.
The entire concept of TOR is based on trust. However, the trust model is broken. ///
No, the concept of TOR is based on providing US (and Israeli, I guess) spooks with a secure network by inviting as many irrelevants as possible, making data correlation hard.
After all, a network made for and used by spies would be a pretty bad network if only spies used it.
I think we want to try to achieve two separate things. First, we want ordinary people who are going about their business without focusing too much on security to be reasonably safe. Our communications systems ought to be resistant to inexpensive automated large scale mass surveillance. If your grandmother sends a message to your aunt, it ought to be private.
Second, we want people who have a need for very high security to have the ability to feel confident that they've kept their communications private. They should have tools that they can trust, and we need a culture of key management that works.
The web of trust wasn't designed for the first goal, and doesn't do anything for it. And it's so hard to use that we can't be certain that non-technical people who need to use GPG -- journalists and sources, for example -- won't make mistakes.
I like the Ars Technica article because it suggests ad hoc ways to bootstrap the second goal out of a system like Signal that achieves the first. It seems like a step in the right direction. Maybe we can build better first-goal tools that will be more open, and maybe we can build new second-goal tools that make it easier to bootstrap from the first goal to the second.
I used to lurk on the old cypherpunks list. I think a lot of people believed that if crypto tools were given away for free, privacy would be achieved. But we never got much adoption with GPG, and email has never been very secure in the real world.
We've let the best be the enemy of the good with crypto, and as a result we live in a world with ubiquitous mass surveillance. We need to take the privacy of those grandmother messages seriously.
I'm no longer a privacy hawk. Terrorism scares me, and I don't think of the NSA as being made up of monsters. But I've learned that sometimes electoral results scare the heck out of me as well. More often than not, institutions that run infrastructure will defer to power. We need to be private by default.
Before ye abandon all hope, take a look at SecureMyEmail [www.securemyemail.com]. It may not satisfy the granularity requirements or potential exploit fantasies of every PGP user :), but we just launched, having spent 3 years and a great deal of money trying to take the best of PGP and "modernize" it to increase its ease of use and, hopefully, make encryption popular.
"Their usage is totally different and they are not interchangeable."
I agree. I don't use PGP. I have never used PGP. For that matter I don't use Signal or OTR either. None of them fit my threat model.
Part of the problem is that there is a tendency to abstract the content of the communication away from the method of communication, which is a mistake. If nothing else, encrypted e-mail can be a useful decoy while the real substance of your communications is transmitted by some other means.
As for Tor, I have given up on discussing Tor. There is so much disinformation spread about Tor that I no longer have the energy to combat it. Used properly as part of layered security, Tor is the single best anonymity software that exists, and used properly it is highly effective in most use cases, even against nation-state actors. The problem is that people don't use it properly and then blame Tor for their own failures.
I would rather not give up on e-mail so quickly, because when it comes to privacy it has one big advantage: decentralized, open infrastructure. To use Signal, you need a smartphone with either Google's or Apple's push services running (at least since they decided to lock out LibreSignal). You get that only (unless you're really deep into smartphone hacking) when you accept running an infrastructure on your phone that creates a huge load of data. Essentially, it is a privacy trade-off: well-encrypted messages vs. privacy of your location, contact list, installed apps, etc.
For many, both kinds of protection are necessary so Signal is no solution.
OTR runs on open infrastructure but needs to solve the problem of offline communication. Both OTR and Signal are also not good solutions for people who need to be available to contacts they do not know. Publishing a PGP key and broadcasting fingerprints is a good way for total strangers to get in touch securely. For Signal, that would also mean publishing your phone number, which brings all kinds of other troubles.
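Incidentally, the fingerprint you broadcast is nothing secret: it is just a hash of the public key. For OpenPGP v4 keys, RFC 4880 defines it as the SHA-1 of the public-key packet prefixed with the octet 0x99 and a two-octet length. A minimal sketch, using a dummy packet body rather than a real key:

```python
import hashlib
import struct

def v4_fingerprint(public_key_packet: bytes) -> str:
    """OpenPGP v4 fingerprint (RFC 4880, section 12.2): SHA-1 over
    0x99, a two-octet big-endian length, then the packet body."""
    data = b"\x99" + struct.pack(">H", len(public_key_packet)) + public_key_packet
    return hashlib.sha1(data).hexdigest().upper()

# Dummy body just to show the shape; a real body holds the version byte,
# creation time, algorithm id, and the key material (MPIs).
fp = v4_fingerprint(b"\x04" + b"\x00" * 50)
print(" ".join(fp[i:i + 4] for i in range(0, 40, 4)))  # grouped as printed on cards
```

Because anyone can recompute this from the public key, printing it on a business card leaks nothing; it only gives strangers a way to verify the key they fetched.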
I think we've got to get rid of the idea that there is one crypto solution for all. There are so many different use cases of digital communication, and each should have a good encrypted alternative. No use in playing them off against each other.
One of the problems faced by PGP and S/MIME and every other email encryption scheme nobody adopted is that they're trying to tack security onto SMTP, which never had any meaningful security in its initial design. That puts those projects at a huge disadvantage.
It's a shame the Darkmail project appears to be faltering - that seemed like the only project realistically likely to deal with the inherent problems of SMTP based email.
There is a trivially evident point which nevertheless has to be recalled and stressed, viz. the disconnected or asynchronous nature of all forms of email communication. Email is no substitute for instant messaging or paging, nor vice versa. Sure, in this day and age, unlike a few years ago, (most) mails make it within seconds, or a few minutes at most, from sender to the recipient's mailbox (often on a POP or IMAP server, or alternatively some webmail). But the act of retrieving/reading the email stays fundamentally asynchronous and independent.
@Myself and I: "As for Tor, I have given up on discussing Tor. ... And used properly it is highly effective in most use cases even from nation-state actors. The problem is that people don't use it properly and then blame Tor for their own failures."
This is pretty much the sentiments Shava Nerad had, when I pointed out to her how strange it was that so many child pornography consumers were being caught when using TOR.
One would *think* they were an interesting use case to gauge the effectiveness of TOR as an anonymizing tool, but no, "there is a difference between a political dissident knowing that his entire family will be killed and tortured if he is found out, and a pedophile looking for abuse imagery".
Now I am sorry for derailing, but it is this kind of elitist attitude - "man man" - that in my view is the cause of such a divide between actual users and the tinkerers.
As they say: "Don't make me think or you *will* regret it"
about user experience, user demand and overengineering
From a user perspective, crypto-messengers like WhatsApp, Signal... are very simple. The first time I receive a message from a contact, I get a notification that an encrypted communication has been established and I'm happy. Whenever some third party tries to interfere, I will receive a warning "key has changed, might be a different person now".
Guess what? That's ok! Users are (quite) happy. They simply trust that a communication they started can't be tampered with. They do not need a CA infrastructure and they do not need a web of trust they have to manage. Just an application telling them "either your dad has a new device or somebody is interfering".
Thinking of email: without encryption we get less, since anybody could impersonate my dad at any time. And with SMIME or PGP we get so much more than is required for "I'm still communicating with the same party". Compared to WhatsApp and Signal, we go to extremes with email. Not unreasonably, in certain contexts, but compared to the user experience on WhatsApp and Signal, SMIME and PGP are a nice example of overengineering.
Consider SMIME: Alice receives a signed message from a new contact claiming to be Bob, containing all the information Alice needs to reply encrypted to the contact claiming to be Bob. If the email-clients stopped requiring CA-certification but simply implemented "store the public key with the contact and alert the user if the contact starts using a new public key", the user experience would be very close to WhatsApp. (ignoring details like forward secrecy,...)
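The "store the public key, alert the user if it changes" behaviour described here is trust-on-first-use (TOFU), and its core really is tiny. A minimal sketch in Python; the function and store names are invented for illustration, not any real client's API:

```python
import hashlib

# Minimal TOFU key store: remember the key first seen for each contact
# and warn only when it later changes.
pinned: dict[str, str] = {}  # email -> fingerprint of the pinned public key

def check_sender(email: str, public_key: bytes) -> str:
    fp = hashlib.sha256(public_key).hexdigest()
    if email not in pinned:
        pinned[email] = fp
        return "new contact: key pinned"
    if pinned[email] == fp:
        return "ok: same key as before"
    return "WARNING: key changed - new device, or somebody is interfering"

assert check_sender("bob@example.org", b"key-1").startswith("new contact")
assert check_sender("bob@example.org", b"key-1").startswith("ok")
assert check_sender("bob@example.org", b"key-2").startswith("WARNING")
```

Note that on a mismatch the old pin is kept; deciding whether to accept the new key is exactly the "your dad has a new device or somebody is interfering" judgment the user has to make.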
It could be that simple if people were not using multiple email clients. But we could change the problem from "web of trust" or "certification" to "secure key distribution between devices the user controls", which is still hard, but possibly simpler....
Consider every user creating her own CA for each email address the person uses. There might be a CA for [email protected], another one for [email protected] The CA only signs the very email address it has been created for. So there could be several SMIME keys for [email protected], one for her outlook, one for her mobile phone, one for thunderbird on linux...
Imagine a manual on email encryption like "there is this mobile app on your iphone you have to use once whenever you start using your email address on a new device, to exchange something called a key, but as an end user you don't have to care for the details anyway."
It would certainly offer a much lower security level than SMIME or PGP have today when properly implemented. But given their lack of adoption, even a lower security level than "perfect" would be an improvement...
Not sure I understand your comment re. the need to create a new CA for every new account. If you choose to have your own CA for making and signing certs, surely one CA is enough for creating an unlimited number of certificates, isn't it?
By the way, while creating a personal CA is fine, it's not needed for the email app, AFAICT, all email clients (that I know) will accept self-signed S-MIME certificates for signing and for encryption, justly warning the recipient (ONCE) that he should OK the sender's cert - hopefully after checking fingerprints with the sender.
The blog post (original and wrapper/commentary) was an interesting and slightly disconcerting one for me. Having faith and/or trust in a third party is exactly what I do not want. I've always preached that security is the inverse of convenience, and vice versa: C = 1/S, and S = 1/C.
So where do these wonderfully convenient applications leave me in the secure knowledge that my data and transmissions are safe, secure, and intact? Yes, there is a learning curve in securing your communications; my faith is still in the publicly viewable source of GPG.
I tried PGP once. What I found was a wild west of half-assed programs, things that only worked in console, things that only worked on one platform.
Trying to use PGP as a non-expert right now is a non-starter: things like GPG4Win's Kleopatra crash or fail constantly, and the more mature and stable implementations simply don't get ported outside Linux or never get a GUI, meaning they don't exist to 99% of potential users (sorry, but this is a true statement of demographics and not something that's going to change). There are clients on Android that deviate from the standard, so messages they encrypt can't be decrypted by other clients... the issues go on and on.
The problems you raise are nothing that couldn't be solved by sufficiently advanced software - software nobody has ever bothered to build. Instead of giving up, how about trying to solve the problem? I might be able to help, in a few years, but right now it's up to experts like you to make this tech available to everyone who needs it with the features that it needs to have.
1. argument: people don't need Bob to testify that the email they read was actually sent by Alice. For many applications it is sufficient to know that they are still communicating with the same Alice as last week, rendering web-of-trust or CA infrastructures obsolete.
2. that might be accomplished with existing technology like SMIME, it just needs changing the verification.
3. WhatsApp/Signal have the advantage of "one device only", email is expected to be used on multiple devices, so a new problem of "moving my private key around" exists.
4. "moving my private key around" might be solved with a better user experience (again, using existing technology) than PGP's web-of-trust.
5. result is easier to use than PGP, but offers less in asserting the identity of other parties. Which still might be good enough for many applications.
Valsorda raises interesting points in regard to e-mail and PGP. However, I wouldn't use Signal because I don't feel comfortable handing out my phone number for no apparent reason. I prefer Threema, which can be used anonymously. Also, let's not forget that Signal is backed by the US government, which doesn't exactly inspire confidence.
Moxie Marlinspike, one of the developers of Signal, himself wrote an article decrying GPG. The gist of it was:
These are deep structural problems. GPG isn’t the thing that’s going to take us to ubiquitous end to end encryption, and if it were, it’d be kind of a shame to finally get there with 1990’s cryptography.
Another excellent article was written by Professor Matthew Green, again the gist of it was:
But as they say: a PGP critic is just a PGP user who’s actually used the software for a while. At this point so much potential in this area and so many opportunities to do better. It’s time for us to adopt those ideas and stop looking backwards.
A Formal Security Analysis of the Signal Messaging Protocol
We conduct the first security analysis of Signal’s Key Agreement and Double Ratchet as a multi-stage key exchange protocol. We extract from the implementation a formal description of the abstract protocol, and define a security model which can capture the “ratcheting” key update structure.
We then prove the security of Signal’s core in our model, demonstrating several standard security properties. We have found no major flaws in the design, and hope that our presentation and results can serve as a starting point for other analyses of this widely adopted protocol.
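The "ratcheting" key update the paper analyses can be illustrated with a toy symmetric chain: each message key is derived from a chain key, and the chain key is immediately replaced, so compromising the current state does not reveal earlier message keys. This sketch omits the Diffie-Hellman half of Signal's actual Double Ratchet and uses plain SHA-256 as a stand-in KDF:

```python
import hashlib

def kdf(key: bytes, label: bytes) -> bytes:
    """Stand-in KDF: derive a new key from (key, label)."""
    return hashlib.sha256(key + label).digest()

chain_key = b"initial shared secret from the key agreement"
message_keys = []
for _ in range(3):
    message_keys.append(kdf(chain_key, b"msg"))  # key for this one message
    chain_key = kdf(chain_key, b"chain")         # ratchet forward; old key is gone

# All message keys differ, and because the hash is one-way, none of them
# can be recomputed from the final chain key alone.
assert len(set(message_keys)) == 3
```

This one-way chain is what gives forward secrecy for past messages; the DH ratchet that the real protocol adds also heals the session after a compromise, which a pure hash chain cannot do.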
Re PGP "... that its long-term secrets are an unnecessary source of risk:"
Personally I do not have the means to hold a long-term secret like a PGP key.
Innumerable pick-pocketings, robberies, beatings, rubber-hose treatments, covert druggings, (either outright illegal or by the воры в законе, the "thieves in law", who force me to take drugs,) black-bag jobs, evil maid attacks, identity theft, and plain old remote compromise of my computer.
Forget it. I live in the crumbling empire of the once United States, and I can't even keep the Kremlin from reading my private parts, much less my PGP keys, or any other cryptographic keys for that matter. According to Title 18 U.S. Code, §922, ¶¶(d)(4), (g)(4), I am forever and irrevocably adjudicated "a mental defective" in this God-cursed land of thieves in law.
Frankly, there is no need for secure communication using smartphones. Simple as that. That whole mobile universe is crap from top to bottom, starting at the EM end, through the custom ARM-based hardware, up to the OS. To add anything secure to that is akin to installing a high-quality door lock on a cardboard house with plenty of man-sized holes in it.
As for PGP, its "flaw" is that it's all but unusable for the normal clicky-clicky user. The effort to repair that flaw, however, is modest. More importantly, the complaints about PGP should be directed at the email client producers and at the standards committees, who did what they always do: they wasted immense amounts of time, resources, and money to come up with a bad non-solution compromise.
And that's not the only mess they have created in what is one of modern mankind's most important means of communication. Just have a look at the SSL zoo across diverse ports and protocols in email.
One reason for that mess not being cleaned up is certainly the mobile idiocy. It's "cooler" to create the zillionth app thingy instead of creating a reasonably usable and secure mail server and a couple of up-to-date good clients.
Moxie Marlinspike may be a smart man, but obviously not smart enough to understand that no lock whatsoever, no matter how good, will make a cardboard house with man-sized holes secure.
I recall reading a story of a very well educated man, a professor no less, who specialised in computer security. He was convinced that he had sent his email PGP encrypted but because of a defective add-in (it had worked perfectly previously) his email was sent in cleartext. That's just one flaw.
Long-term key security is another. The "normal clicky clicky users" are the ones we're supposed to be protecting. Mobile apps are probably the best thing for the layman; they're not perfect, but they protect most people. If your adversary is a nation state, then go forth using PGP, smartcards, and air-gapped systems. Just don't forget the energy gap, electromagnetic shielding, soundproofing, and sweeping the room for bugs every time. And did I mention that you have to ask your contact to do the same?
Moxie is certainly smart, and he does understand the inherently insecure nature of mobile platforms (your 'cardboard house' metaphor), but neither he nor anybody else can work miracles. You've got to do what you can to protect what we have. He can't go on a crusade fixing Android, iOS, etc.; he's got to work with the two dominant smartphone platforms and provide a solution that makes it harder to access Signal users' communications.
Most people need reasonably good security (or Pretty Good security). PGP doesn't provide any more protection than Signal for the majority of users.
And as Bruce often says... if the NSA want in, they're in. Whether that be PGP or Signal. I know which is easiest (and I understand both technologies!)
@Ron: I have two major gripes with your arguments - they are not very significant, but they are still relevant.
Point one: IMHO Signal does not protect you against metadata sniffing and communication correlation. Case in point: if you wish to speak to a journalist, you don't call that journalist from your personal phone, Signal or no Signal.
That's why there are things like SecureDrop and OPSEC best practices, right?
Second point: you (and Bruce) can argue all day long that the NSA are like omnipotent gods - which kind of jibes with you both coming from a pretty much Jebuss-land nation - but the saying remains: "You don't have to outrun the bear, only the slower guys".
Security isn't about painting a target-circle on your forehead, it's much more complicated than mere technical tools. It's a way of thinking that aims at reducing your risk.
Oh, and also: arguing about what the NSA can or cannot do runs into the problem of proving a negative.
All client-server communications are protected by TLS. Once the server removes this layer of encryption, each message contains the phone number of either the sender or the receiver in plaintext. This metadata could in theory allow the creation of "a detailed overview on when and with whom users communicated".
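To see why those plaintext phone numbers matter, note that such an overview is just an aggregation over delivery records; no message content is needed. A toy sketch with invented log entries:

```python
from collections import Counter

# Each record is (sender, receiver) as seen by the server after TLS is
# removed. The numbers here are made up for illustration.
log = [
    ("+15550001", "+15550002"),
    ("+15550001", "+15550002"),
    ("+15550001", "+15550003"),
]

# (sender, receiver) -> message count: the "who talked to whom, how often"
# overview, built without reading a single message body.
contact_graph = Counter(log)
top_pair, count = contact_graph.most_common(1)[0]
```

Add timestamps and the "when" part of the overview falls out the same way, which is exactly why metadata protection has to be considered separately from end-to-end encryption.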
The group messaging mechanism is designed so that the servers do not have access to the membership list, group title, or group icon. Instead, the creation, updating, joining, and leaving of groups is done by the clients, which deliver pairwise messages to the participants in the same way that one-to-one messages are delivered.
"You don't have to outrun the bear, only the slower guys"
Exactly. And Signal provides exactly this protection. It's easy to use, provides self-destructing messages, is end-to-end encrypted, costs nothing and isn't ad-sponsored.
Plenty of people use Signal. It's virtually impossible to mess up sending an encrypted message over Signal. With PGP it's very easy to do something which fatally undermines the encryption.
"It's a way of thinking that aims at reducing your risk."
You seem to be agreeing with me.
"Oh and also what the NSA can or cannot do becomes begging the question of proving a negative."
Obviously this is correct but the Snowden disclosures give us a pretty good idea of their capability back then. Who knows where they're at now?
Well, we kind of agree, only that you don't seem to incorporate metadata into your threat-modelling.
If I pick up my phone and make a call to a journalist in regards to my supar-secret-clandestine terrorist child trafficking organisation (mainly staffed by GNAA), then that's pretty much duly noted.
And this is where I believe there is a flaw - and I would recommend the book Astro Noise which has a nice first chapter on that - namely that you don't really need the contents of a conversation if you know enough about the /context/.
Second (also referring to Astro Noise), where the NSA has moved seems to be to painting itself in a corner - information overload. In fact, it has been argued from several views, that more is less, and gaining too much information sort of bogs a system down.
Which may be why they are looking towards AI (e.g. cognitive computing like IBM Watson).
If you're worried about metadata you've got to remember that PGP won't protect you either. The email address can be linked to your identity fairly easily (if you've provided genuine details). PGP will light you up like a Christmas tree because it's easy to identify in bulk datasets.
There's nothing to stop you using a disposable phone number in conjunction with Signal. Buy a burner phone, insert the SIM, receive the confirmation SMS, enter it into the app on your smartphone, and that's it. Assuming your contact does the same, you've got secure communication. Then you're more secure than with PGP, plus you can make voice calls too.
Also I think Signal does employ Noise Pipes. WhatsApp does and I believe their encryption is precisely modelled from Signal. They got Open Whisper Systems to help implement the Signal ratchet!
"All communication between WhatsApp clients and WhatsApp servers is layered within a separate encrypted channel. On Windows Phone, iPhone, and Android, those end-to-end encryption capable clients use Noise Pipes with Curve25519, AES-GCM, and SHA256 from the Noise Protocol Framework for long running interactive connections."
What Moxie Marlinspike does (certainly well-meaning) is *security theater*. He is co-responsible for millions of people *(wrongly) thinking* they are secure - no, they are not.
If one needs to communicate securely (privately, confidentially, ...), then one needs to use adequate means, simple as that.
I agree that it's utterly sad to see that there are basically no such means that are easy to use. I agree that we should work on such means. But Joe and Jane clicky clicky will *need* to make some effort, too. The effort, for instance, to think for a second whether some communication is sensitive or not; the effort to understand that paying $39 to Symerksi or installing some magic app like Signal is *not* the way to secure communications.
But again, it's certainly a pity that there is no easy to use (click, click) email client with PGP built in (in an easily understandable and reasonable way). Something like a "Click here to encrypt all attachments" button, for instance.
(Nota bene, I'm not at all a PGP fan. It just happens to be a useful and useable (albeit not for Joe and Jane click click) compromise)
Those "millions of people" are much more secure compared to communicating over the conventional GSM network, where things like SS7 affect them.
If you're looking to communicate over a smartphone which is most secure: 3G/4G or Signal? It's obvious. It exponentially increases the cost factor of intercepting those communications.
Most people realise that a sophisticated adversary will be able to defeat encryption via side channels although they'd express that sentiment a little differently.
"But again, it's certainly a pity that there is no easy to use (click, click) email client with PGP built in (in an easily understandable and reasonable way). Something like a 'Click here to encrypt all attachments' button, for instance."
It is a pity that it's not more readily implemented; I wish it was. Different implementations of PGP often aren't compatible thus compounding the problem.
However PGP would remain insecure in my opinion because the underlying web of trust checks (key signing parties) are never going to be done by the majority of people.
Why would anyone use or recommend Signal, which is a walled garden at best, when they could use an open and federated protocol like XMPP, which has multiple open source servers and clients for mobile and desktop, as well as a forward-secret multi-device encryption protocol?
Nope, you are mistaken. 3G/4G and Signal are completely different things on different layers. Signal runs on top of 3G/4G; hence my cardboard house and high-quality door lock image.
No matter how forward-secret and whatnot Signal's encryption is, you are *not* secure if all of it is running on utterly insecure devices with lots of insecure building blocks.
One can continue to use cool smartphones which, for instance, share one's current location, and enjoy lots of nonsense crap. No problem; after all, most of what normal everyday citizens communicate is not confidential or sensitive. One can not, however, somehow magically make those devices secure.
Moreover, the security hype seems to deactivate people's brains. They seem to be in a "security!" frenzy. Security, however, starts with a proper understanding and assessment. Would I like, for example, my tax data or nudes of my wife to be looked at by god knows whom? Certainly not. *But*: will the NSA be interested in my tax data? Hardly. Security starts with understanding and properly assessing "what is there to protect?", "against whom and what to protect?", and "to protect at what cost?".
The last one being a barrier the vast majority of smartphone users won't cross. Clicking once on "download" and then "install" is what they are accepting as cost. To think about the above questions or, let alone, to move to their desktop PC (as a slightly more secure option) is a cost they are not willing to bear. Similar story for passwords; most users are simply not willing (or capable) to have multiple passwords for multiple sites.
Some of that can - and should - be compensated for by better software, for instance by reliable and easy-to-use PGP mail clients. Some of it, however, cannot be compensated for, as we know even from strict military settings where people use the same USB stick for both their porn PC and their official work system.
Really, the only YubiKey concerns I have are changing the default PIN and (a mistake I have made) making a backup of your private key at generation time. I didn't do that, so the only copy I have of my PGP private key is on my YubiKey.
I use PGP/GPG multiple times per day. Every email that I send is signed with my private key, verifiable with my public key. My feeling is that the digital signature is just as strong as my wet signature on a document.
In the (so far rare) case that I need to encrypt a message stream, I create a separate key pair for that endpoint, confirmed in person if possible, and if not, via my 'signature' key pair.
I've been a proponent of PGP since its inception. I donated to Zimmerman's defense fund when he was being threatened under federal export laws. I wish that everyone I know could use it to secure their communication. However, I understand that most people just don't get it, can't use it, and never will. It needs to be built into email clients, IM, and other channels that are mainstream, not niche products.
More importantly, the complaints about PGP should be directed at the email client producers and at the standards committees, who did what they always do; they wasted immense amounts of time, resources, and money to come up with a bad non-solution compromise.
You may be aware of an old saying about never attributing to malice that which can be attributed to other human failings such as stupidity.
However, as another saying has it, it's the exception that proves the rule. I've had the misfortune in the distant past to be involved with standards committees and to see behaviour that results in changes that make things either insecure or, worse, unsecurable.
If you are of an enquiring mind as I was, you dig a little and find out that the proposers, seconders, and even some of the objectors have ties back to the various signals intelligence agencies in the old Five Eyes group.
To them it's the "game of finessing" to put "useful holes" in standards and specifications, usually claiming "Health and Safety" as the basis for their changes or objections.
I understand why a lot of people thought prior to Ed Snowden that "it was the stuff of conspiracy theories", and many still do. But the NSA got a little careless and became way too obvious with some NIST and other standards committees. And although initially getting their way, the NSA representatives caused such ructions that others realised what they were up to, and in the end NIST had to publicly revoke a standard, which was probably a first.
Whilst the NSA et al do not have representatives on every comms standards committee --for various reasons, such as fast-moving new technology-- they often rely on the greed of other "industry players" to achieve the same results.
As you may have heard, there have been a couple of resignations over at the FCC, so the "Dho'nald" will have yet another "warm body" to find. One of those leaving, although originally a Republican appointee, turned out to be a bit of a wolf in sheep's clothing and pushed back very hard against "vested interests" to actually give us some semblance of "Net Neutrality", which also gave the SigInt organisations issues. It will be worth watching what the replacement appointees do, as it may show more "finessing" to those with a jaundiced eye.
Moxie is certainly smart and he does understand the inherent insecure nature of mobile platforms...
There are all types of "smart" and sometimes they make things worse. As noted by the saying "The road to hell is paved with good intentions".
The history of Darpa TCP/IP -v- ISO OSI shows that "quick and dirty" often beats "cautious and thorough" in the marketplace. The same goes for many of our most-used applications from the likes of MS, Adobe, etc. Apart from market dominance, the other thing they have in common is that they are full of errors and omissions and are a living hell to secure.
No matter how bad, once something becomes established, dealing with its defects becomes a task of Hercules. People do not like change; they rail against it and put the boot in to stop it, even when it's clearly for their own good.
Thus you get the "stop gap problem": a hack or temporary fix becomes the de facto standard, and you get all sorts of problems --like fallback attacks-- because you cannot force a "clean break" to something better. The likes of SSL/TLS should make that clear.
If Moxie's system becomes a de facto standard, then we are stuck with its failings. Others will have to build compatibility with those failings into their products, and like the spectre at the feast it will haunt many generations to come.
We know this and we should learn from it. Thus the question arises as to what faults Moxie's system has, and are there systemic design faults in particular.
We know that it "sits upon" many lower layers, but you have to ask how... That is, if you go to Venice or Amsterdam you see houses and boats on canals. Boats are designed to float on water by displacement; the buildings, however, are propped up by friction on the piles. If the water levels change, the boats will carry on floating; the houses, however, will unfortunately suffer, hence you have to have drainage and pumps and all sorts of kludgy add-ons, all of which are quite fragile. Worse, however, is the fact that the wooden piles rot faster than the bricks and mortar on top of them, thus you have to either demolish the house or make eye-wateringly expensive renovations. It's a lesson that has sort of been learnt by some architects, in that they now build new buildings on rafts, not piles.
So the first key question should be is Moxie's solution built like a boat or on troublesome piles?
The latter is currently "the industry way", and the fact that Moxie has effectively gone the closed-source way and apparently built in future-limiting dependencies on the underlying "carrier technology" suggests it's in effect propped up on piles.
As has oft been pointed out, it is unwise to build a castle on shifting sands.
I'm really struggling to understand what particular vulnerability you say affects encrypted messaging apps. I appreciate the 'shifting sands' argument you make but it seems that Signal overcomes most of the underlying vulnerabilities with standard voice calls.
With Signal, as an example, you can verify your contact's identity by using a QR code or numeric string.
(I'm deliberately excluding the potential of a malicious app/update because I'm more interested in the other aspects of its security.)
How do you say then that Signal can be circumvented?
I could find lots of articles; do you have the link?
What would you recommend? Do you think Signal offers: the same, less or more security than regular voice calls?
(Most people would consider face to face unacceptable for all but the most confidential information.)
I think the main concerns raised here are not specifically against Signal. However, the specific points raised may be the Google dependency and the closed-source server software.
There are broader, general points to be cautious of, if I understood correctly (@ab and @Clive):
- The “Secure Signal” hype conceals other issues that make privacy vulnerable, Jane and John Doe will feel safe but are not.
- We should be very cautious with monopolies and monopoly solutions. If Signal is good, then it's bad at the same time if it is the only one. Only diversity can provide resilient solutions (see big business versus life on earth).
- When something is built on a weak foundation it can crumble easily. To stand on shifting ground it needs adaptation (see buildings versus nature). But we have no concept of how to adapt (Signal, …) securely.
- To enable diversity and adaption all contribution must be open source, in all details and documents.
I think we are all aware of the problem of having the adversary in the construction / system / home. 2/3 of our security is still based on obscurity. I don’t know if there is a solution to this.
"2/3 of our security is still based on obscurity. I don’t know if there is a solution to this."
Use the remaining 1/3 :)
On a more serious note, a good method is to construct a DFD (Data Flow Diagram) from human to human and then look at all states and state transitions of the data (including keymat), then construct a threat model table. You may find that the transport and protocol are "secure". The rest is probably not. You may also construct an attack tree.
During the design phase, I'm a proponent of neither approach on its own, though. This is basic threat modeling (threat modeling 101, and lab) :)
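The DFD-plus-threat-table idea described above takes only a few lines to sketch. Here is a hypothetical minimal model, not a real tool; the flows and the threat wording are illustrative assumptions, not an analysis of any particular messenger:

```python
# Minimal sketch of the DFD -> threat-table idea.
# Each data flow: (source, sink, data, protected_in_transit).
# The flows and threat labels are illustrative assumptions only.
flows = [
    ("Alice",       "Alice's app", "plaintext",  False),  # typed on the device
    ("Alice's app", "server",      "ciphertext", True),   # the "secure" hop
    ("server",      "Bob's app",   "ciphertext", True),
    ("Bob's app",   "Bob",         "plaintext",  False),  # rendered on screen
]

def threat_table(flows):
    """For every flow, note the obvious interception threat."""
    rows = []
    for src, dst, data, protected in flows:
        threat = ("eavesdropping mitigated by crypto" if protected
                  else "endpoint compromise reads " + data)
        rows.append((f"{src} -> {dst}", data, threat))
    return rows

for hop, data, threat in threat_table(flows):
    print(f"{hop:24} {data:10} {threat}")
```

Even this toy table makes the point above visible: the transport hops come out "mitigated by crypto", while both endpoint hops remain exposed.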
I consider myself to be a normal user, maybe slightly above average. I'm not a computer or security expert, but I do occasionally dabble in my own home Linux servers and video game programming.
I really don't understand the argument that PGP is SO HARD to use, especially from security experts. They never say WHY it's hard to use.
Frankly, stop being lazy and read some documentation, man pages, even the terminal output of gpg --help. I really don't see what the big deal is. It's not any harder than learning any other piece of software.
I'm not exactly sure if it's 100% secure or open source, but Mailvelope is ridiculously easy. After a few test emails my brother and I can send PGP emails to each other regularly. Personally, though, I'd rather use the terminal.
I think the more security you want, the more you're going to work for it. It's unrealistic to think otherwise. If you want an easy to use app, somebody has to put in the work to program it. People are so spoiled by modern, instantly-gratifying technology that they can't follow a few steps to use PGP so they start whining about it! I love my iPhone and technology, too, but I also recognize that I'm going to have to put in a little effort to have security. I think that effort is worth it, too. If you don't, then maybe your privacy isn't worth it?
If the masses are too computer illiterate to consciously communicate securely, you can't force them to learn it or use apps that provide it. You would have to completely redesign computers, the Internet, and email to make them secure automatically. But, that's not going to happen. I think the people that want to be secure, will be.
Sorry for the rant. Maybe I don't know what I'm talking about, but that's my take on it.
"If you want to make an independent implementation, go wild. Also Signal works just fine on Android with an open source replacement for Play services, such as gmscore."
re risk of closed & mass adoption
I agree the risks are there. The "good" news is someone said Signal had a whopping 99,000 or so downloads on Android. The most usable one, with the most press in the media following the Snowden leaks of all things, was under 100,000 by end of 2016. So much for us getting stuck with it after massive adoption. ;) Lends more credit to my theory that anyone wanting massive adoption of good security should bake it invisibly into a product or service that would get massive adoption in the market for non-security reasons. Another WhatsApp or Dropbox. Telegram and SpiderOak took that strategy, but who knows about their security. Even if the implementation were perfect, they didn't cover enough bases for us to trust them. They did corroborate my theory in their success vs Signal or the FOSS file encryption of the month.
It really is. I got memory problems these days where I have to read the docs to get back up to speed on tech I haven't used in years or rarely use. I re-installed GPG a while back to talk to someone here. The man page was a holy shit amount of stuff I really didn't need to know. I found a cheat sheet that gave me the necessary information in concise form. Checked it against the man pages for accuracy then started using it.
At this point, it's still a bitch to use. My method for simplicity & some obfuscation is to send all content as either sealed text or archive files. Sealed is short here for encrypted + signed. The flow is (a) make the file, (b) run a command to seal it, and (c) send that to the third party. The command itself was unnecessarily long, having to type the person's full name and stuff. Maybe there's an alias option in that huge man file. I ended up writing a script that gave me a clean interface with aliases where I just typed a number for the recipient. Even better would've been GUI integration or something where I right-click the file to encrypt, get the prior functionality in a dialog box, and it creates a new filename based on the old one with an .asc extension added. So simple to do but still not default for GPG.
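A wrapper of the kind described above is only a few lines. Here is a hedged sketch that just builds the gpg command line from a numbered recipient table; the aliases and key IDs are made up, and only standard, documented gpg flags (--sign, --encrypt, --armor, --recipient) are used:

```python
import subprocess

# Hypothetical recipient table: number -> (alias, key id). All made up.
RECIPIENTS = {
    1: ("alice", "0xDEADBEEF00000001"),
    2: ("bob",   "0xDEADBEEF00000002"),
}

def seal_cmd(number, path):
    """Build the gpg command to 'seal' (sign + encrypt) a file for the
    numbered recipient; --armor makes gpg write path + '.asc'."""
    alias, keyid = RECIPIENTS[number]
    return ["gpg", "--armor", "--sign", "--encrypt",
            "--recipient", keyid, path]

def seal(number, path):
    """Actually run gpg (requires gpg installed and the key imported)."""
    subprocess.run(seal_cmd(number, path), check=True)

print(seal_cmd(1, "notes.txt"))
```

The same idea works as a shell alias or file-manager context menu entry; the point is that the user types a number, not a key ID.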
I also lost the keys twice when the export didn't work for whatever reason. Started using the cheat Dirk Praet gave me of backing up the hidden folder for GPG's database. A better GUI would probably convert those commands into "Backup keys" or "Restore keys", covering everything in the database, public or private. Also, that failure was significant: many people would quit using the product instead of going through the trouble of redoing the key exchange or whatever.
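The "backup the hidden folder" cheat amounts to archiving GnuPG's home directory (~/.gnupg by default, which holds the keyrings and trust database). A "Backup keys" button would be little more than this sketch plus a file picker:

```python
import shutil

def backup_keys(gnupg_home, dest):
    """Archive the whole GnuPG home directory (keyrings, trust db)
    into dest.tar.gz and return the archive path."""
    return shutil.make_archive(dest, "gztar", root_dir=gnupg_home)

def restore_keys(archive, gnupg_home):
    """Unpack a previous backup into the GnuPG home directory."""
    shutil.unpack_archive(archive, gnupg_home)
```

This covers everything, public and private, so the archive itself needs to be protected like a private key.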
So, it is unnecessarily hard to use. Googling it got me a cheat sheet. That worked fine if just sealing files sent over email, etc. I could see it getting too hard, or just too much trouble, for the average person if they try to do more. Especially key servers and such. I recommend the creation of better interfaces, at least for sending messages the way I describe, where it would be easy to do that. Maybe a transport client or script on top of that which makes the message transmission part automatic.
@jaguardown "...I really don't understand the argument that PGP is SO HARD to use..."
In point of fact, PGP/GPG is not necessarily difficult to use. I imagine that using my methodology on a fart fone would be a nightmare but I can't state that with authority because I neither tried it nor own a fart fone. I both compose and encrypt/decrypt on a gapped desktop.
Also, the claim that a PGP/GPG user must be a propeller-head is specious. Of the dozen or so people I communicate with through PGP/GPG, four are non-techies: human rights worker; psychologist; librarian; yoga instructor(!). All four taught themselves using the dox ... plus lots of Q&A with me via (unencrypted) email and occasionally a phone call.
Preparing a supper from scratch requires more skills, entails more processes and consumes much more time than using PGP, but people do it anyway because the easy way isn't necessarily the best way.
 Gapped, not because I'm afraid my internet-facing system will get hacked but because I don't want to think about keyring + mail archives when it coughs up blood - which it does every year or so - or I install a new OS. PGP/GPG stuff and their mail lives over there; everything else resides here (where I am now).
"The GnuPG man page is over sixteen thousand words long; for comparison, the novel Fahrenheit 451 is only 40k words."
Some implementations work slightly different, others have idiosyncratic glitches which are difficult to replicate and it's so easy to forget to add a switch which can seriously undermine the encryption.
As Nick P says, the specification is open and you can build your own client. It's also, as I said earlier, been independently reviewed and passed with flying colours in an independent formal analysis.
From the Snowden disclosures we heard that the NSA consider it a level 5 app (i.e. catastrophic to SIGINT).
"Things become "catastrophic" for the NSA at level five - when, for example, a subject uses a combination of Tor, another anonymization service, the instant messaging system CSpace and a system for Internet telephony (voice over IP) called ZRTP. This type of combination results in a "near-total loss/lack of insight to target communications, presence," the NSA document states."
"ZRTP, which is used to securely encrypt conversations and text chats on mobile phones, is used in free and open source programs like RedPhone and Signal. "It's satisfying to know that the NSA considers encrypted communication from our apps to be truly opaque," says RedPhone developer Moxie Marlinspike."
The scripts are easy enough to write but it's integration into an email client that's difficult. You can download plugins for Thunderbird but they're riddled with bugs which take time to get fixed.
Windows clients are especially buggy and crash regularly necessitating multiple attempts to decrypt your message. If you're on Mac then you've got to wait whilst they develop a version which is compatible with the latest OS/mail application.
Developing a fully working, cross-platform GPG solution isn't a priority because the usage numbers are so low. The few businesses that rely on PGP use the paid Symantec version.
If you're sending a really sensitive email it's too late to realise something has gone wrong afterwards.
Or you can use an online service like ProtonMail but then you've got to trust them to manage your private key.
I've sent PGP messages to people and they've been unable to open them because their version isn't compatible. Or you run into a problem with people whose clients don't support 4096 bit keys. Etc. Etc. Etc.
There aren't any "legal problems" as far as I know because of the non-restrictive nature of the GPG licence. PGP is a trademark belonging to Symantec but that isn't problematic when it comes to developing plug-ins.
Yes, PGP isn't exactly user-friendly. I'm also not getting younger (and anyway dislike keeping details of peripheral stuff in my head). Maybe my solution, which has been serving me well for some years now, can be of use for you, too -> 2 shell scripts (pgpenc and pgpdec). Both of them are friendly and need only 1 or 2 parameters, and when called bare they print some help.
Thanks for the cheat sheet. Not so much for the sheet itself; there is a man page after all, but for the following idea: I will create a third script "pgphelp" that will serve as a live cheat sheet for less common cases.
Btw., am I the only one who uses *only* mksh and who avoids zsh, bash and accomplices if at all possible?
It seems to me that Clive Robinson's image of the houses floating, on sand, or on solid ground, etc., should have been clarifying enough.
Signal is an application that a) runs on an utterly insecure device and b) transmits over a rather questionable medium (with b) being less of a concern).
You seem to think (and stated multiple times now) that Signal somehow magically makes communications more secure no matter all the problems beneath.
*That is a wrong conclusion*
The first question is against whom and what you want to protect. Against a curious neighbour with a small hobby radio station? If yes, then what you do is a mixture of total overkill and futility. Or do you want to defend against, say, the FBI? If yes, then you act like someone who uses a very secure 50-digit/character password but notes it on a Post-it on his monitor.
You must see the *full* picture, both re. the information you want to transmit and re. the hw/sw stack. For the latter: Signal is running on a device that *you* (an end user) cannot possibly consider secure, and is using libraries, system calls, a kernel, *many* firmware blobs etc. whose security is very doubtful and almost certainly unknown to and uncontrollable by you. We need not even discuss that; there is proof of the underlying hw/sw not being secure and being outside of your control anyway. Re. the information: it most probably does not exist only during the transmission, i.e. it exists before and after (at a site you probably don't control). Maybe it exists on your drive, maybe in your head, maybe on paper, no matter. Moreover, transmitting information usually also means creating multiple copies of it in multiple places, etc. Even the assumption that the information transmitted by Signal exists only within Signal's memory area is highly doubtful. What, for instance, makes you sure that the get-users-password routine which uses the system beneath doesn't keep a copy? Not even for evil purposes; maybe just because some programmer worked sloppily...
In short: your view is far too focused on only part of the whole picture and is based on hardly tenable premises.
Btw: Could you even make sense of it if I gave you the extracted machine code of some of the (unknown to you) chips in your mobile device using Signal? If not, we can stop that discussion right here.
But hey, have fun using it and feeling secure. And in case you ever need an Eiffel Tower or a Brooklyn Bridge, contact me; nobody has better prices than me!
"2/3 of our security is still based on obscurity. I don’t know if there is a solution to this."
Yes. Make 3/3 of security obscurity and, even more importantly, let us finally understand that what we do *is* about obscurity.
It's damn about time to understand our job. We are professional obscurers.
It seems to me (maybe I'm too generous and optimistic) that about 134% of the people in IT sec have fallen victim to a lack of differentiation, to a mangling disease, namely abhorring obscurity.
We must learn to understand and to discern the following: a) IT sec is largely about obscurity; b) the mechanisms to create optimized obscurity, however, should *not* be obscure.
Look at RSA: We basically multiply two very large primes (plus some mumbo jumbo) so as to create obscurity. Let's be honest and realistic. That's exactly what we do and what we work on. We work on professionally created, optimized obscurity. What's RSA all about? It's about presenting an opponent with such a horrendous amount of obscurity that he can't possibly see the information behind it (say, a session password for symmetric encryption).
The way to create that obscurity, however, must not be obscure; it must be a solid and verifiable mechanism.
Another example is (P)RNGs. There we want certain criteria to be met, like uniform distribution, etc. In other words: we desire optimized obscurity. A lousy distribution, for example, is akin to 007 hiding behind a curtain with the tips of his shoes still visible.
In fact, we even have a measure for the quality of the obscurity we produce: if an opponent's chance to see through our obscurity is so ridiculously low that it's considered null, then we are satisfied and consider our work "damn obscure enough (tm)".
Example: As of today we think that no opponent is capable of finding the factors of a 2048-bit (about 617 decimal digits) number having only 2 (prime) factors within reasonable time.
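Both claims above -- that the mechanism is fully public while only the factors stay obscure, and the digit count of a 2048-bit modulus -- can be checked with the standard textbook toy example (deliberately tiny, hopelessly insecure parameters):

```python
# Textbook toy RSA: the *mechanism* is fully public;
# only the prime factors p, q are the obscured part.
p, q = 61, 53
n = p * q                  # 3233, the public modulus
phi = (p - 1) * (q - 1)    # 3120
e = 17                     # public exponent
d = pow(e, -1, phi)        # 2753, private exponent (Python 3.8+ modular inverse)

m = 65                     # message
c = pow(m, e, n)           # encrypt: 65^17 mod 3233 = 2790
assert pow(c, d, n) == m   # decrypt recovers the message

# A real 2048-bit modulus has about 617 decimal digits:
print(len(str(2**2048 - 1)))   # 617
```

With n = 3233 anyone can recover p and q by trial division in microseconds; the entire security argument is that at 617 digits nobody can.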
Even better example: We sometimes call (crypto) hash functions "spreaders" because one of the properties of an x-bit hash function is to take an input of arbitrary length, in the case of a password typically about 20 - 30 bits of entropy, and to produce a representation of x bits (typ. 128 - 512 bits). That, ladies and gentlemen, is professional and optimized obscurification. In fact, a proper crypto hash function even guarantees that one will not possibly be able to see the underlying information.
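The "spreader" property is easy to see with any real hash function. Here SHA-256 turns a short password into a 256-bit value, and a one-character change produces an unrelated digest (the passwords are just illustrative strings):

```python
import hashlib

# Two nearly identical short inputs...
d1 = hashlib.sha256(b"hunter2").hexdigest()
d2 = hashlib.sha256(b"hunter3").hexdigest()

# ...each spread to 256 bits of output (64 hex chars * 4 bits):
print(len(d1) * 4)   # 256
# ...and the digests share no visible relationship:
print(d1 == d2)      # False
```

Nothing in the digest reveals the input; the mechanism itself, meanwhile, is fully published and standardized, which is exactly the distinction being argued for.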
**The way to do that, the mechanism** must, however, *not* be obscure.
It's about time that we as professional obscurers learn and keep in mind that difference.
Anyways, identifying assembly is easy: you just run some common opcode comparisons and double-check to make sure it makes sense. Making sense of it is the learning curve, but it's not too bad; you just watch for coherent transformations along the execution path. Subtype and exact model might be harder to figure out if you don't have physical access to the chip's (hopefully not falsified) screen-printed identifiers. The real problems come into the arena when you start to look at what Intel (and AMD?) have done in encrypting their microcode updates etc.
That's not something I even want to think about.
These days we have most of the tools readily available for whatever adventures you'd like to partake in.
I'm not familiar with machine code although I will say this; I use Signal because it makes intercepting my communications less than trivial. That's the whole point in a 'secure' messaging app. Increase the cost, time and effort and you increase security. Think of a padlock which is extremely difficult to pick and very difficult to cut. It can be done but requires expertise and/or specialist training. It's not impenetrable but it keeps the majority out.
We could argue all day about the shifting sand and how compromising the baseband can result in total compromise but that's non-trivial. Equally unlikely is injecting the App Store with a fake build for a particular user (deterministic compiler territory). I'm sure we're all aware of other issues.
Using your logic we might as well abandon SSL/TLS because of the inherent insecurity of CAs, but we don't: because certificates make it more difficult to intercept transmitted communications. Yes, there could be a kernel vulnerability but what's the alternative? Do nothing?
I'm not suggesting Signal is a replacement for a one-time pad or other such methods but we all know that for most that is infeasible and unlikely to be used.
People who are likely to fall victim to TAO shouldn't rely upon Signal alone. For most users it's an easy-to-use and reasonable compromise.
If Signal was so bad then Bruce and other eminent experts wouldn't be using it.
Um, the 1/3 I meant was “caution”, as in re-checking before crossing the street even if the light says green, or not plugging in any USB thumb drive we found in the hotel garage in NK (esp. when Mr. Clapper has just left).
I’m a noob, so what you say makes me lose that small piece of hope I’ve gathered so far.
I did not say "don't use Signal!". What I said was that you should be aware that it's security theater to a large degree (not by its own fault, but that doesn't change the result).
Please do not take this personally, it isn't, but in a way you demonstrate a major problem in our field, namely complexity. You, for instance, say things like "SSL/TLS ... because certificates make it more difficult to intercept transmitted communications".
That's plain wrong. Certificates - in theory and lucky cases - provide one service, namely they give you the opportunity to verify an identity (usually that of the server at the other end or that of a communication partner).
As I said, this isn't against you; actually it's more against ourselves: We must create a) problem awareness and b) **simple and simple to use** solutions. It's not your fault to have misunderstood quite a lot. It's *our* failure.
As for "eminent experts", Bruce Schneier is our host, he has given us some pretty impressive and useful algorithms/programs/libraries, and I'm a polite man (well, I try hard), so I won't comment much on that argument of yours. Suffice it to say that "this is secure" is not the only and possibly not even the decisive criterion to use some technology.
Be that as it may, my point was *not* to advise against using Signal. It was against using Signal *cluelessly* and against believing that Signal somehow magically turns a cardboard box into a fortress. It does not.
"what you say makes me lose that small piece of hope I’ve gathered so far." - why would that be?
I think we actually have quite some excellent and highly professional obscurity. Even RSA still seems to provide quite useful and high-quality obscurity.
I merely took the liberty to beg to differentiate. One day some wise (or not) man told the world "obscurity is not security", and as so often happens he quickly found a large crowd of believers who formed the "obscurity is not security!" sect and just love to preach their eternal and holy truth wherever people don't run away from them.
The basis of every good solution is to understand the problem, the context of the problem, the relevant factors, and one's own craft and tools.
And it just so happens that our field is the art of very solid and high-quality obscurity. Not at all surprising, btw, as a very major part of security always was and still is based on obscurity. Looking closer we might divide obscurity into perception and action security, the former meaning obscuring and hence interdicting perception, and the latter meaning obscuring and interdicting interaction with the obscured.
When we put a file into a drawer we obscure it, making it "vanish" out of the perception universe of third parties. If we additionally lock the drawer, we obscure an interaction domain for the third party. Similarly, when we strongly desire quasi-random/random-looking outputs from encryption algorithms, we desire obscurity in that we strongly limit both the perception and the interaction domains of third parties.
We have vastly superior means at hand, but at the very core we aren't different from master advisors or locksmiths 1,000 years ago.
There is nothing to be ashamed about or to hide (attention: pun *g). Obscurity is not the dirty, despicable thing the above-mentioned sect worked hard to make us believe.
What *is* shameful and stupid, however, is to not professionally and transparently create and apply the obscurification mechanisms. We don't want and shouldn't accept "magic obscurification" like certain patented and closed algorithms. We want and should accept only obscurification devices that have been professionally researched, designed, verified, cross verified, and that have survived being attacked in the best known ways by our best experts.
So, there is absolutely no reason to be disillusioned or hopeless. We have excellent grandmasters of obscurity; in fact our host here has created some well-known and excellent obscurification devices/algorithms. Showing their wisdom as well as their professionalism, they have fully published their algorithms and invited - and survived - many attacks.
No, our website's SSL certificate is definitely not invalid. Was that a joke, maybe? :)
If not, it's quite valid and signed by SwissSign AG out of Switzerland. If you really had some issue verifying the cert, please contact us with maybe your browser details or a screenshot or something. We'd VERY much appreciate it. Or, post here if you wish. Thanks. Bill
Have you ever heard the phrase "crabs in a bucket" ?
AHhh "the crab bucket" one of the more evocative descriptions of the human condition.
I chatted about it one evening a number of years ago with a humorous author, and whilst I was drinking scrumpy he was drinking some French "furniture polish" spirit, so neither of us was at our best ;-) He appeared to be of the opinion it was the ultimate fate of humanity in an unequal --by definition-- "capitalist democracy". Whilst I understood the logic of his argument, I was back then a little naively hopeful that "we the people" were a little better than that. But the intervening years have shown that I'm way, way too optimistic in my "doom and gloom" outlook on life ;-)
For those who don't get the metaphor of the crab bucket, think instead of "corporate climbers": the real reason they stab more successful/capable people in the back is so they can use the knife hilt as a stepping stone or ladder rung in their little games of "executive snakes and ladders". At least crabs have the decency to "eat the fallen alive" rather than just gut them and throw them down.
This has got me thinking: public keys advertise the fact that you've got a secret key. Is there a scenario in countries that embed rubber-hosing in law, like the UK with RIPA, where the government would require everyone who declares a public key to submit their private keys, or go to jail for 2 years, as applies to individuals at the moment?
Yes, there could be a kernel vulnerability but what's the alternative? Do nothing?
No you mitigate it before it happens.
There are a number of things you have to think about when it comes to "end point security", and as long-term readers will know, I keep going on about it and have done for a couple of decades or so, one way or another.
What you are aiming to do varies from person to person but boils down to "sending information privately between two or more parties". So you have the information, the communicating parties, the method of communication and an assumed adversary, whom you would preferably like to keep in "Total" ignorance that the communicating parties even know each other.
Well, the use of smartphone telephone numbers means that a third party, which we know logs information, knows which devices can communicate with each other and when communication was last attempted, which is not a good start.
Especially as legal means against the logs have already been tried, which almost certainly means illegal ones are on the way. In all probability "last attempted" then becomes "all attempted", as the attacker repeatedly does the equivalent of a Unix "tail" on the log(s).
From that the first steps of traffic analysis start, which is often far more useful to an attacker than the actual content of occasional messages is.
But why limit yourself when, as an attacker, you can have it all? You need to consider "end run attacks" that get at the plaintext, and how they would be done. The problem you have is that the security endpoint is not in a device you own and, importantly, have control over. The network operator / service provider has overall control of your smartphone via the SIM, and they can and do push all sorts of updates etc. over the air to your phone as and when they feel like it.
About a decade ago we saw the first "shim" attacks against computers used for online banking, hiding changes to the plaintext of transactions, authorisations and balance information so that stolen money had time to be moved not just electronically but physically out of a bank etc.
So the real controller of your phone can install a "shim" on it, and not just see the plaintext but actually change or replace it for whoever you are exchanging plaintext with.
That does not sound like anything is safe, secure or private to me, or, I now hope, to you.
To mitigate a shim-based end run attack, you need to not have plaintext on your smartphone at all. That is, you extend the security endpoint beyond the smartphone in a way that makes the information not just secure but any tampering evident. Which is what I've been telling people for decades, but "they still don't get it".
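The "phone as an untrusted relay" idea can be sketched with stdlib tools. This is an illustrative sketch, not anyone's actual scheme: encryption is assumed to happen on the offline devices, the MAC key is assumed to be shared out of band, and the HMAC tag is what makes a shim's tampering detectable.

```python
import hashlib
import hmac
import os

# Assumption: this key was exchanged out of band between the two
# OFFLINE devices; the phone never sees it.
MAC_KEY = os.urandom(32)

def seal(ciphertext: bytes) -> bytes:
    """Run on the sending OFFLINE device: append an HMAC-SHA256 tag so
    any modification by a shim on the phone is detectable."""
    tag = hmac.new(MAC_KEY, ciphertext, hashlib.sha256).digest()
    return ciphertext + tag

def open_sealed(blob: bytes) -> bytes:
    """Run on the receiving OFFLINE device: verify before trusting."""
    ciphertext, tag = blob[:-32], blob[-32:]
    expected = hmac.new(MAC_KEY, ciphertext, hashlib.sha256).digest()
    if not hmac.compare_digest(tag, expected):
        raise ValueError("blob was tampered with in transit")
    return ciphertext

# The phone only ever relays the opaque, sealed blob.
blob = seal(b"opaque ciphertext from the offline device")
```

Note that this only gives tamper evidence, not confidentiality; the point is that the smartphone handles nothing but an opaque blob it cannot usefully alter.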
And there are much earlier examples than that, so Moxie has no excuse for not knowing about smartphone end run attacks, none whatsoever. Which raises other, darker questions which only he can answer, and should...
But there is a more fundamental problem, which goes under various names but is expressed most euphemistically as "The Human Condition". If you have a hunt around the Internet you will find "Why Johnny Can't Encrypt", which is the forerunner of this particular thread's subject.
Put simply, humans by and large are too trusting of their fellow man, and when that is mixed with their inherent need for ease of use... They not only "shoot themselves in the foot" with a Howitzer, they simultaneously jump with that "just enough rope to hang themselves" tied in a neat Double Windsor below the chin. Clowns are professional idiots; by and large the rest of us didn't want to see, let alone read, the safety manual, as we were too busy watching hamsters dance etc.
It's actually got to the point where I am only half joking when I say, "ALL cyber security should be banned for a decade or three, and no cyber criminals should be sought out by law enforcement for the same period, thus giving the Darwin process a fighting chance of having its way with people...".
Mind you, being a sufferer of an unelected fool of a Prime Minister in the UK, with the Snoopers Charter and similar legislation coming to a Five Eyes country near you real soon, I will see it start to happen for real before Xmas, as will others in the US and wherever else they feel they have to poke their mass surveillance member, oh and then of course monetize it for pleasure and profit...
@Boosmith "... public keys advertise the fact you've got a secret key."
Yeah, they do. And that's not the only consideration.
Simply transmitting such a file will almost certainly ring bells on LEAs' deep proctology... er... packet... inspection. Also remember that a cut-and-pasted PGP/GPG ASCII file is easily spotted: the clear-text header and trailer, with gobbledegook sandwiched between them, scream "encryption".
There is something to be said for using codes instead of cyphers. An innocuous comment, for example "I hope you get rain soon", can mean something quite different, for example "al-Baghdadi is in town now".
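The "screams encryption" point is easy to demonstrate: OpenPGP ASCII armor (RFC 4880) is so distinctive that a one-line regex flags it. A minimal sketch of what any filter could do (real DPI gear presumably does this and far more):

```python
import re

# RFC 4880 ASCII armor wraps material in unmistakable header/trailer
# lines, so a naive pattern match is enough to flag PGP in a text stream.
ARMOR_RE = re.compile(
    r"-----BEGIN PGP (MESSAGE|PUBLIC KEY BLOCK|SIGNATURE)-----"
    r".*?"
    r"-----END PGP \1-----",   # trailer type must match the header
    re.DOTALL,
)

def contains_pgp_armor(text: str) -> bool:
    """Return True if the text contains an armored PGP block."""
    return ARMOR_RE.search(text) is not None

sample = """-----BEGIN PGP MESSAGE-----

hQEMA5vF...gobbledegook...
-----END PGP MESSAGE-----"""
```

By contrast, the codebook sentence above ("I hope you get rain soon") sails straight through any such filter, which is exactly the point about codes versus ciphers.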
Am taking the DNI at his word that encrypted traffic, no matter its origin or destination, is slurped/collected/kept/retained/stored/archived/name-your-verb.
In the days before cell phones and "free long distance", calling long distance while away from home usually involved a land-line pay phone plus a pile of coins that one had to procure and then drop ...one... by... one... in a slot. It was inconvenient and could be expensive. Enter yankee ingenuity. Let's say you want to inform family in New York that you arrived safely in Hawaii. You make a "collect" call (the person who was called pays), telling the operator that you want to talk to [pre-arranged Name]. Operator puts the call through, says there's a collect call for [pre-arranged Name] from [your name]. Person answering the call recognizes your name, recognizes [pre-arranged Name] as code for "arrived safely", does a Snoopy dance, informs operator that [pre-arranged Name] is not there at the moment, and hangs up. No charges.
"Some of it, however, can not be compensated as we know even from strict military settings where people use the same usb stick both for their porn PC and their official work system."
Good point. I would go further: no organization that is truly strict about security would provide computers that have unmitigated USB or other consumer-grade I/O.
Perhaps the military has tightened up its security in the mean time, but in 2008 I observed a U.S. Navy flag officer(!) take his "thumb drive" from around his neck and hand it to a nineteen-year-old male uni student, who plugged it into his laptop to copy MP3 files. "Do you use that drive at work?" Yep. (cringe)
"Mind you, being a sufferer of an unelected fool of a Prime Minister in the UK, with the Snoopers Charter and similar legislation coming to a Five Eyes country near you real soon, I will see it start to happen for real before Xmas, as will others in the US and wherever else they feel they have to poke their mass surveillance member, oh and then of course monetize it for pleasure and profit..."
Most likely it has already gone global and hot. China has its new IT "Security" law, which inherently means more surveillance powers. The EU, which once prided itself on so-called protection of Human Rights and all that jazz, is crumbling under the "we need more surveillance power" rush.
Switzerland, which also prided itself on (and which ProtonMail and many cypherpunk movements have praised for) its good reputation as a "Safe Harbour", suddenly took a turn and decided to introduce more powerful surveillance and Government hacking laws.
We don't need to wait for Xmas ... it's already here ... and here to stay.
Trying to settle things in a diplomatic and political way (talking to dumb law makers and so on) is just not working. Too much is at stake ($$$$$$$$ from the Mil-Industrial-Govt-IC complex) for these very powerful low lives, who are sadly also the elites of society, not to go on a war-mongering spree for the people's hard-earned tax dollars.
Am I the first to point out that much of the complaints against PGP center around the "web of trust"? Hasn't that been pretty much dead since introduction?
I'd have to assume you get the public key from the same place/location as the email address (although this has huge MitM issues). Funny that public-key crypto *still* has key distribution issues, but here we are.
Tor? Can't it be broken trivially if all the servers used are either owned by the FBI/NSA or otherwise pwned by NSL action? They don't seem concerned about broadcasting its deficiencies by making drug and kiddie-porn busts around it. (Although "breaking" it by reading a few percent of all data certainly gives them the information to break it further, and possibly to drive traffic to areas they can read 100%. And thanks to parallel construction, it is impossible to know where the original data came from.)
Doesn't KeePass suffer from the same issue? The database is only as safe as the master password in use when a given copy of the file was made. You can change the master password later, which re-encrypts the current file, but any previously captured copy stays crackable under the old password unless you also change every password stored in the DB.
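The reason the vault reduces to one secret is the key-derivation step: the file key is derived from the master password, and a KDF only slows guessing rather than preventing it. A minimal sketch using PBKDF2 (KeePass itself uses AES-KDF or Argon2; PBKDF2 merely stands in here for illustration):

```python
import hashlib
import os

def derive_vault_key(master_password: str, salt: bytes,
                     rounds: int = 200_000) -> bytes:
    """Derive a 32-byte file-encryption key from the master password.
    The iteration count makes each guess expensive, but an attacker
    holding an old copy of the database file can still grind through
    weak passwords offline."""
    return hashlib.pbkdf2_hmac("sha256", master_password.encode(),
                               salt, rounds)

salt = os.urandom(16)
key = derive_vault_key("correct horse battery staple", salt)
# Changing the master password re-encrypts the *current* file, but the
# key derived from the old password still opens any captured old copy.
old_key = derive_vault_key("hunter2", salt)
```

So the honest fix after a suspected compromise is the painful one the commenter describes: rotate the stored passwords themselves, not just the master password.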
Achieving good data security is very difficult and complex. You can dumb it down for the masses to consume, but at the expense of integrity. Cheap, Fast & Good: these are the 3 pillars of commerce. Most companies can achieve 2 of these, but very few achieve all 3. If they do, consumers beat a path to their door and stay as long as possible. This is why the iPhone was so revolutionary, even though Apple also had the NeXT computer as a black mark. Prior to the iPhone we had 15 years of over-engineered mobile phones and PDAs, but most required training and none were very intuitive or practical. Shifting back to encryption, the flavor is not as important as the delivery. I guarantee you it is possible to make intuitive implementations of PGP or PKI certificates for every client out there; it's just that nobody has effectively applied the 3-pillars approach yet.