Encrypt Your Email and Hard Drive:
|When:||2 December 2015, 2:30-3:30pm|
|Who:||Jonathan Poritz [a CSU-P math faculty who also has a sordid past career doing crypto and IT security]|
|For:||Students, faculty, staff, and community members interested in protecting their digital lives|
|What:||The Math and Physics Club of CSU-Pueblo brings you this very practical (and, ok, somewhat mathematical, because that's who we are) introduction to using encryption to protect your data both when it is standing still on your computer and when it is in flight over the Internet, such as in an email.|
|BONUS:||First seventeen attendees get a free DVD of security software!|
|Video:||Part 1 and Part 2 [apologies for the poor production values -- I advise moving through this web page while listening to the audio track of these videos]|
|ATTENTION: If you are one of the ones who took an installation DVD from the end of this presentation when it happened at CSU-P, here is a word of advice: Please contact me before trying to install the software on that DVD! The software is great, powerful, and fun, and you can easily try it out without installing it. But before you install, if you think you will want to, there are a couple of things I should tell you so that you don't lose any data you already have on whatever machine you are using.|
One response would be that many of the top jobs (by desirability, salary, low stress, etc.) involve math — see this list at the Wall Street Journal (from 2014, the last year for which we have complete data).
- Student A [a math major with secondary ed emphasis]:
- What's your major? I don't see you at any meetings....
- Student B [a plain math major]:
- Student A:
- Just "math", not "math ed"?
- Student B:
- Student A:
- What are you going to do with that?
Many works of cryptology speak of two star-crossed lovers, Alice and Bob, who attempt to keep the guttering candle of their love alight, though distance separates them and their communications are being monitored by the evil Eve.
[Extra credit if you can name the two famous mathematicians who acted as models for these pictures of Alice and Bob.] [If you give up, hover your mouse over the image.]
It's important to realize that in many — maybe most — situations, it is entirely appropriate to assume that Eve can see all the communication between Alice and Bob while it is in transit. All of the channels you are used to suffer from this:
Answer: Digression on the history of the Internet, coming out of the Cold War, as a non-hierarchical [in contrast to the telephone system] method of command and control. Note, in particular, that it was never foreseen that there would be adversaries on the Internet, so no privacy or security was built into the fundamental IP protocol.
Therefore, when I am on a web page at my bank, Wells Fargo (headquarters in San Francisco, CA), every time I click, the information going to Wells Fargo is sent to their servers by a process analogous to:
- I chop up the information into chunks (called packets -- the Internet is a "packet-switched network") and write each chunk on a postcard, with Wells Fargo's address.
- I go to the bus station downtown and put my postcards on a seat in a bus headed towards Denver.
- In Denver, I trust someone to pick up the cards and to put them on a bus headed to San Francisco.
- If the bus to San Francisco is delayed, I trust someone in Denver to move some of my postcards to a seat in a bus to LA, some to a bus to Las Vegas, etc. In each of those respective cities, I trust someone to move the postcards to buses headed closer to San Francisco.
- In San Francisco, I trust the postcards to be moved to a city bus which goes past the Wells Fargo offices.

Along the way, I am simply trusting that the various people in the various bus stations will act honestly, will know how to get my postcards (packets) closer to their destination, and will not choose to read them.
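The postcard analogy can even be played out in miniature at the command line (the file and message here are made up purely for illustration):

```shell
# Chop a message into fixed-size "packets", then reassemble them at the
# destination -- a miniature of what happens to data crossing the Internet.
printf 'Dear Wells Fargo, please move $100 to savings.' > message.txt

# Each 8-byte piece is one "postcard": packet_00, packet_01, ...
split -b 8 -d message.txt packet_

# The receiver collects the postcards and puts them back in order.
cat packet_* > reassembled.txt
cmp message.txt reassembled.txt && echo 'message intact'
```

Note that anyone handling the packet_ files along the way can read them, which is exactly the problem encryption solves.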
Somewhat more mathematically, this diagram (from my free textbook Yet Another Introductory Number Theory Textbook, as are several similarly formatted diagrams in this presentation) gives some basic terminology:
In the design of the encryption and decryption algorithms, we follow something cryptologists call Kerckhoffs's Principle [named after Auguste Kerckhoffs, a professor of languages at the École des Hautes Études Commerciales in Paris in the late 19th century who wrote influential papers on cryptology]. According to this Principle, one always publishes the details of one's cryptographic algorithms.
It may seem ridiculous to publish the algorithm used to protect your data, but we do this because humans have a nearly infinite capacity for self-deception. As a consequence, we are always thinking we have invented the best cryptographic algorithm, a perpetual motion machine, the way to square the circle and trisect the angle ... when another set of eyes, looking over our work independently, would immediately see the flaws. This is nothing other than the famous idea of peer review in the scientific method, which is the foundation of the modern world.
[The alternative to putting your proposed cryptographic algorithms out in the world for peer review is called by cryptologists — with enormous disdain — security by obscurity. Experience has shown that it is no security at all.]
If we are to publish our encryption and decryption algorithms, the security must lie in some other secret. This is an additional piece of information called the key, which is input into those algorithms, as follows:
The above is called symmetric (or private- or secret-key) cryptography. We shall see an alternative in a few minutes.
Demonstration of using GnuPG, for encryption:

gpg --output <file.gpg> --cipher-algo AES256 [--armor] --symmetric <file>

and decryption:

gpg <file.gpg>

Look at the resulting file with:

hexdump -C <file.gpg>
You may have noticed that in that demonstration I didn't do a lot of pointing and clicking. Instead, I typed commands, using what is called the command-line interface [CLI]. This is the major way that everyone I have ever met who does serious things with a computer interacts with the computer. If you want to use a computer to play games, by all means use a mouse or game controller. If you want to type a paper in an English class, you wouldn't point and click at an alphabet on the screen. If you wanted to process data for a chemistry lab report, you would enter the numbers into a spreadsheet (by typing them), create clever macros (by typing them), etc.
Pointing and clicking rather than typing commands is a lot like trying to communicate specific information and instructions to someone else by playing charades rather than simply speaking. Charades is a fun game, but I wouldn't act out a non-verbal version of the Fundamental Theorem of Calculus in a class of mine, I would say the words, and write them on the board. ...So why are we doing so much charades to communicate with our computers?
OK, there was one other thing that must have been obvious in my GnuPG demonstration: I wasn't using Windoze. A bit like with the CLI, I don't know any serious computer scientist who uses Windoze. Trying to do security with Windoze would be like having a meeting of Alcoholics Anonymous in a bar: the game is already over simply because of the environment.
There is also the fact that the programs which constitute Windoze are Microsoft's greatest asset, its crown jewels of intellectual property. That may be a good business move for them [although that is much less clear than it seems], but it means that they have never done science: their programs have never been put up for peer review, so as a scientist it would be absurd for me to place blind trust in them.
So my secret plot in this talk is now revealed: even more than telling you about some nice techniques and tools to protect your data — a valuable goal which I am also pursuing — I want you to ask the following very good question:
You've probably never thought about it, but there are alternatives. Some people are using those alternatives: see Usage share of operating systems. And some of these alternatives fit within the scientific method, as we've been discussing, while others do not. Which do you think it makes sense to use? [Hint: if you like computers, antibiotics, the polio vaccine, cell phones, etc., you like science.]
Some more reasons to think hard about the above question:
Use full-disk encryption.
In GNU/Linux, this is an installation option. Under the hood, it uses AES with a key built out of the user's passphrase.
For Windoze, there used to be a tool called TrueCrypt, but it took itself out of the business in 2014 (in a very suspicious way). Alternatives exist, such as VeraCrypt and CipherShed.
If Alice and Bob want to be able to communicate securely without ever having met to exchange the symmetric key, they can instead use asymmetric (or public-key) cryptography:
Here's a particular [very mathy!] way to do this, called the RSA cryptosystem (named after Ron Rivest, Adi Shamir, and Leonard Adleman, who published this idea in 1977):
RSA is often not the best (most efficient or most secure for a given key size) asymmetric cryptosystem, but it is definitely the most widely-used. This is probably due to the fact that it was the first one discovered, and also to the (comparative) ease of understanding the math. Other systems involve arithmetic on elliptic curves, which is a fairly chewy area of mathematics.
All asymmetric crypto relies upon a mathematical function which is easy to compute in one direction but difficult to invert. For RSA, this is essentially multiplication forward [easy], but factoring backwards [hard]. For other asymmetric algorithms, there are other of these one-way functions.
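The easy-forward/hard-backward asymmetry can be seen in action with toy numbers (the primes, exponents, and "message" below are far too small for real security and are chosen purely for illustration):

```shell
# Toy RSA: tiny primes so the arithmetic fits in shell integers.
p=61; q=53
n=$((p * q))                  # public modulus n = 3233 (easy: just multiply)
phi=$(( (p - 1) * (q - 1) ))  # phi(n) = 3120; computing this needs the
                              # factors of n, which are hard to recover from n alone
e=17                          # public exponent, coprime to phi
d=2753                        # private exponent: (e * d) mod phi = 1

# modular exponentiation by repeated squaring: modpow base exp mod
modpow() {
  local base=$1 exp=$2 mod=$3 result=1
  base=$(( base % mod ))
  while [ "$exp" -gt 0 ]; do
    if [ $(( exp % 2 )) -eq 1 ]; then
      result=$(( (result * base) % mod ))
    fi
    exp=$(( exp / 2 ))
    base=$(( (base * base) % mod ))
  done
  echo "$result"
}

m=65                          # the "message" is just a number smaller than n
c=$(modpow "$m" "$e" "$n")    # encrypt with the PUBLIC key: c = m^e mod n
echo "ciphertext: $c"         # prints 2790
echo "decrypted:  $(modpow "$c" "$d" "$n")"   # m^(e*d) = m mod n, so 65
```

Anyone who could factor n back into p and q could compute d themselves; the security rests entirely on factoring being slow.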
The main issue is Public-Key Infrastructure, PKI, because of the following
Therefore, we need to be sure that the public keys we use really do belong to the people we think they do. We do this either by getting the key from someone in person — but that kind of ruins the whole idea of asymmetric crypto! — or by getting it in some way that assures us of its provenance.
One kind of proof of ownership would be a digital signature on a public key, signed by someone whom we trust. Digital signatures work like this:
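In essence, signing "encrypts" a digest of the message with the private key, and anyone can check it with the public key. A sketch, reusing toy RSA numbers (n = 3233, e = 17, d = 2753, and a made-up "hash" value, all illustrative only):

```shell
# Toy digital signature: sign with the PRIVATE exponent d, verify with the
# PUBLIC exponent e. Tiny numbers stand in for a real key pair.
n=3233; e=17; d=2753

# modular exponentiation by repeated squaring: modpow base exp mod
modpow() {
  local base=$1 exp=$2 mod=$3 result=1
  base=$(( base % mod ))
  while [ "$exp" -gt 0 ]; do
    if [ $(( exp % 2 )) -eq 1 ]; then
      result=$(( (result * base) % mod ))
    fi
    exp=$(( exp / 2 ))
    base=$(( (base * base) % mod ))
  done
  echo "$result"
}

h=1234                            # stand-in for a hash of the message, mod n
sig=$(modpow "$h" "$d" "$n")      # only the key's owner can produce this
check=$(modpow "$sig" "$e" "$n")  # anyone can verify with the public key
[ "$check" -eq "$h" ] && echo 'signature verifies'
```

Changing even one bit of the message changes its hash, so a valid signature ties the message to the holder of the private key.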
Another, less formal, approach is for individuals to sign each other's keys, when they know each other personally, until gradually there is a large web of trust. The fun way to do this is to throw a key-signing party where people who know each other bring laptops and sign each other's keys. We could have one here, on campus, and then we would all start to be able to use asymmetric crypto with each other....
Let's install and use a FLOSS Firefox and Chrome extension which does public-key crypto for common webmail clients: Mailvelope.
The third-party doctrine suggests we should keep only the encrypted versions on the webmail provider's servers. Mailvelope does this. It also keeps track of your keys ... protected by a password and the security of your machine. [So there is not much point in using this under Windoze, because its security is so spectacularly weak. But you are all going to run a FLOSS OS in the future, aren't you?]
Here is the public key from a key pair I set up for this demonstration [and only for this demonstration — please do not use it for real, secure communication with me; I have not followed good security practices in creating or storing this key!]. Its contents are:
-----BEGIN PGP PUBLIC KEY BLOCK-----
Version: Mailvelope v1.2.3
-----END PGP PUBLIC KEY BLOCK-----
....Extended Mailvelope demo.
By the way, if you do not use one of the webmail clients that Mailvelope supports, there are other things you can do. One would be to use GnuPG on the command line and then always send your messages as attachments. Another, if you use the FLOSS mail program Thunderbird (which is produced by Mozilla, the same people who make the Firefox browser), would be the Enigmail Thunderbird extension.
Just the year before RSA was published, Whitfield Diffie and Martin Hellman started the whole idea of public-key crypto with their algorithm, now known as Diffie-Hellman Key Exchange.
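The exchange can be sketched with toy numbers (tiny, insecure values chosen only for illustration):

```shell
# Toy Diffie-Hellman key exchange. Alice and Bob agree publicly on a prime
# p and a generator g; each keeps only a private exponent secret.
p=23; g=5

# modular exponentiation by repeated squaring: modpow base exp mod
modpow() {
  local base=$1 exp=$2 mod=$3 result=1
  base=$(( base % mod ))
  while [ "$exp" -gt 0 ]; do
    if [ $(( exp % 2 )) -eq 1 ]; then
      result=$(( (result * base) % mod ))
    fi
    exp=$(( exp / 2 ))
    base=$(( (base * base) % mod ))
  done
  echo "$result"
}

a=6                           # Alice's secret
A=$(modpow "$g" "$a" "$p")    # Alice sends A = g^a mod p = 8 over the open channel
b=15                          # Bob's secret
B=$(modpow "$g" "$b" "$p")    # Bob sends B = g^b mod p = 19

# Each side combines the other's public value with its own secret; both
# arrive at g^(a*b) mod p without that number ever crossing the wire.
echo "Alice computes: $(modpow "$B" "$a" "$p")"   # 2
echo "Bob computes:   $(modpow "$A" "$b" "$p")"   # 2
```

Eve sees p, g, A, and B, but recovering a or b from them is the discrete logarithm problem, which (like factoring for RSA) is believed to be hard.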