File:The World on a Wire Show S01E02- The Case of the Poisoned Certificates (IA ep02 20200124).mp3

From Wikimedia Commons, the free media repository

The_World_on_a_Wire_Show_S01E02-_The_Case_of_the_Poisoned_Certificates_(IA_ep02_20200124).mp3 (MP3 audio file, length 30 min 0 s, 96 kbps overall, file size: 20.6 MB)

Summary

Description The Case of the Poisoned Certificates

What does privacy mean on the internet? What needs protecting, and from  whom? In June 2019 these questions threatened to upend an encryption  scheme relied upon by human rights activists and corporations alike. Episode transcript follows. A warning: this episode starts with an expression of violent anger. Robert J. Hansen didn't mince words. "I have never in my adult life wished violence on any human being," he said. "I have witnessed too much of it and its barbaric effects, stood by the graves of too many people cut down too young. I do not hate you and I do not wish harm to befall you. But if you get hit by a bus while crossing the street, I'll tell the driver everyone deserves a mulligan once in a while. You fool. You absolute, unmitigated, unadulterated, complete and utter fool." The World on a Wire Show. Opening theme music Season One, Episode Two.

The Case of the Poisoned Certificates

Hansen was angry because, in June 2019, someone had finally attacked part of the de facto standard system for encrypting email communication, established over fifteen years before. OpenPGP uses big, secret numbers called secret keys, which are randomly generated for each user of the system. A difficult-to-reverse series of mathematical operations transforms that secret key into a public key that can be shared publicly on the internet without exposing the secret key from which it was derived. Using someone else's public key (and, optionally, your own secret key to sign the message), you can transform any file or string of text so that no one can read it without access to the recipient's secret key. It's kind of a clunky old system if you use it without the help of some kind of tool that obscures the gnarly details of how it all works, but it's still used by some corporations to allow their customers to verify the software they sell. It's also one of the few standards for private communication trusted by human rights activists and others who have some reason to distrust public communications networks and the governments that monitor them. When Edward Snowden leaked details of the United States' domestic spying program to members of the press, <a href="https://www.dailydot.com/layer8/edward-snowden-gpg-for-journalists-video-nsa-glenn-greenwald/">he relied on PGP encryption</a> to keep their email private.
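
To make that asymmetry concrete, here is a minimal sketch in Python using the third-party cryptography library. It is not OpenPGP and not any tool mentioned in this episode; it simply illustrates the principle described above, that a public key derived from a secret key can lock up data which only the secret key can unlock.

    # Illustrative sketch only: raw RSA via the "cryptography" package, not OpenPGP.
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric import rsa, padding

    # Generate a secret key; the public key is derived from it and is safe to share.
    secret_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
    public_key = secret_key.public_key()

    oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                        algorithm=hashes.SHA256(), label=None)

    # Anyone with the public key can encrypt...
    ciphertext = public_key.encrypt(b"meet me at the usual place", oaep)
    # ...but only the holder of the secret key can decrypt.
    assert secret_key.decrypt(ciphertext, oaep) == b"meet me at the usual place"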

When encrypting a message using OpenPGP, it is very important to know for sure that the public key you're using actually belongs to the intended recipient. If someone else posing as the intended recipient of the message tricked you into using their public key, they would have access to the contents of your message. Therefore the OpenPGP standard provides a mechanism for verifying a friend's public key: each public key is associated with a short but unique string of text called a "fingerprint." You can check the fingerprint of your own copy of the recipient's public key against the fingerprint on the recipient's machine by meeting the recipient in person, where the chance that someone is convincingly impersonating the recipient is low; by calling them over the phone and asking them to recite the fingerprint for you in their own voice; or through various other means which may be less secure. You can then sign the recipient's public key, using your own secret key to create a record that you have verified the identity of the key-holder in some way. Signatures of different levels indicate different levels of thoroughness and certitude in your verification methods.

Of course, in a global communications environment, it is not reasonable to expect that you can visit everyone you contact in person, or that you will recognize the sound of their voice in a phone call before you contact them. Therefore, you may need to rely on a signature produced by a mutual friend whose public key you have already verified, or by a friend of that friend, and so on. Relying on signatures produced by other people in this way creates what OpenPGP proponents call a web of trust, where the certainty with which you can trust the identity of the person who uses a particular key is as strong as the weakest link in the chain of signatures between you and that person.

The social effort required to build webs of trust out of signatures on public keys made the OpenPGP encryption method more than a system; it became something of a culture, complete with its own rituals. Public key fingerprints appeared in people's email signatures, and even as actual tattoos on their mortal flesh prisons. Some enthusiasts would use their secret keys to sign the contents of every email they sent, so that even if the contents of the email were public, anyone could verify that they actually wrote it themselves by checking it with their public key. People who wanted to expand their webs of trust could hold key-signing parties where they would verify each other's cryptographic fingerprints, generate signatures, and ultimately upload these signatures to internet servers.

Around 2003, research at the Massachusetts Institute of Technology culminated in the publication of server software to efficiently store and share OpenPGP public keys and signatures. The Synchronizing Key Server, or SKS, replaced the earlier Public Key Server (PKS), and offered features that made it more-or-less the uncontested standard key server for the next fifteen years. One of the features that made SKS such an enduring solution was the absolute robustness of its data storage. Once you uploaded a key or a signature to an SKS server, the data would be there forever, and anyone who wanted to remove it from the server—even an administrator with full access to the database—would have a really hard time doing that.

If a key you had uploaded became compromised or outdated, your only way to revoke it, so that other people would know not to trust it any longer, was to upload what was essentially a special kind of signature marking the key as no longer valid. But the key would continue to exist on the key server, just with this extra information that you had since revoked it. And chances are your key wouldn't be on just one key server. SKS systems are easily networked together, allowing them to constantly synchronize the keys and signatures in their databases so they all have the same data. Even if a few of these servers were abruptly taken offline, the rest would continue to mirror all the keys and signatures that had been synchronized between them as if nothing had happened.

The extreme robustness and redundancy of SKS data storage was attractive to early-2000s users of OpenPGP because it offered a certain level of resilience in the face of hostile government action. That might sound like an odd fear for a bunch of cryptography nerds to have, but at times there have been legal challenges to the import and export of cryptography software across national borders. Until 1996, for example, cryptography software export from the United States was under the jurisdiction of the Department of State, classified on its Munitions List. As of recording in 2020, France still has extensive regulations on the import and export of cryptographic software. And if software can't be legally distributed across national borders, it's effectively barred from being published on the Internet. SKS is resistant to this kind of threat because if a single server, or even several servers, are knocked out by government action, servers in other parts of the world will remain operational. The absolute persistence of data on SKS servers means server administrators can't comply with legal demands to remove the keys of dissidents from the system.

In May 2018, GitHub user yakamok published <a href="https://github.com/yakamok/keyserver-fs/">proof-of-concept</a> software tools that would allow users to upload arbitrary textual information in the descriptive data on keys to be stored on SKS systems. A key stored in SKS can have multiple email addresses, each with an associated name or description, and SKS doesn't verify these email addresses. By abusing this feature of SKS, anyone who uses the system can upload any information they like—even someone else's home address and credit card information—to the SKS system, where it will be automatically synchronized to servers around the world and become impossible to take down. yakamok says, "I wrote this because I think this characteristic of key-servers is actually dangerous, for example someone could upload leaked data and it would be spread around the world and accessible by anyone and unstoppable, how would this situation be dealt with?" and goes on to point out how this feature of SKS could make it difficult for people who own and operate SKS servers to comply with the General Data Protection Regulation.

The General Data Protection Regulation, or GDPR, enacted by the European Union in 2016, placed largely unprecedented restrictions on the storage and transmission of the personally identifying information (including, but not limited to, names and email addresses) of EU citizens. Crucially, the GDPR requires informed consent for every use of personally identifying information, and if an EU citizen requests the removal of their personally identifying information from some storage system, it compels the maintainer of that system to destroy the relevant records within a reasonable response time. I have personally heard from a lot of people who have a very positive outlook on the GDPR, as they feel that it has curbed some of the worst privacy-violating behaviors of internet advertisers and social media giants, and the GDPR is frequently cited in requests to be excluded from unwanted mass emails or to have personally identifying information expunged from a database, even when the requester has no personal connections to any EU member state. In 2018, California even enacted the California Consumer Privacy Act, a piece of legislation which echoes many of the provisions of the GDPR. If someone using yakamok's proof-of-concept tools uploaded my name, email address, home address, and bank account information to the SKS system, not only would this information spread to servers around the world, but even if I were a citizen of the European Union and cited the GDPR in strongly-worded legal requests to keyserver maintainers, those maintainers would not be able to simply remove my information from the system, and if I had effective lawyers, the resulting legal pressure could corner them into shutting the servers down.

But it wasn't a legal threat that made Robert J. Hansen so angry in June 2019. What happened was this: someone took the public keys of two prominent OpenPGP proponents, Robert J. Hansen and Daniel Kahn Gillmor, and signed them a lot, like over one hundred thousand times each, using a bunch of phony keys and email addresses. These junk signatures took up more space than usual, but when the attacker uploaded them to the SKS system, the servers didn't have much trouble handling them, because it's not the keyserver's job to parse apart these signatures and figure out how they relate to your web of trust and how you use the public key; that's a job for the PGP software on your own computer.

The problem became apparent when people who had either of these two public keys on their "keyrings" connected to the keyservers to copy the latest signatures from the internet and make sure their copies of the certificates that represent the verified status of the public keys were up to date. When they carried out this best-practice routine using the most popular OpenPGP software, GNU Privacy Guard (<a href="https://gnupg.org/">GnuPG</a>, or GPG for short), it incorporated these enormous, "poisoned" certificates into its own data and tried to parse them apart any time it was used. This was so incredibly slow that it basically made GPG unusable.
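
To get a rough sense of the scale involved, here is a toy back-of-the-envelope sketch in Python. It is an illustrative assumption, not GnuPG's actual code: if the work done while handling a certificate grows roughly with the square of the number of attached signatures (the kind of inefficiency discussed later in this episode), then jumping from a few hundred signatures to well over a hundred thousand multiplies that work by a factor in the hundreds of thousands.

    # Toy model only: assumes roughly quadratic cost in the number of signatures
    # attached to a certificate. This is an illustrative assumption, not a
    # description of GnuPG's real implementation.
    def toy_cost(signature_count: int) -> int:
        return signature_count * signature_count

    typical = toy_cost(300)       # a well-connected key with a few hundred signatures
    poisoned = toy_cost(150_000)  # roughly the order of magnitude seen in June 2019
    print(poisoned // typical)    # => 250000 times the work under this toy model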

The OpenPGP community went into high alert. On the Twenty-Eighth of June 2019, Daniel Kahn Gillmor posted a notice on his personal blog warning that the SKS system had been affected by what he called "certificate flooding"; on the following day <a href="https://gist.github.com/rjhansen/67ab921ffb4084c865b3618d6955275f">Robert J. Hansen published a detailed memo</a> about the situation on GitHub; that's the source of the remarks I shared at the beginning of this episode. On the day of Hansen's post, the U.S.-government-funded Mitre Corporation cataloged the poisoned-certificate incident on its list of Common Vulnerabilities and Exposures with identification number <a href="https://nvd.nist.gov/vuln/detail/CVE-2019-13050">CVE-2019-13050</a>. <a href="https://access.redhat.com/articles/4264021">A tech brief for this CVE ID published by Red Hat</a> notes that:

GnuPG is often used to verify downloaded packages for Linux distributions. If someone were to poison a vendor's public certificate and upload it to the key server network, the next time a system administrator refreshed their keyring from the key server network the vendor's now-poisoned certificate would be downloaded. At that point upgrades become impossible because the authenticity of downloaded packages cannot be verified.  

Popular Linux distributions quickly provided updates that addressed the problem by limiting the default interactions between GPG and key servers, so that GPG wouldn't download any signatures unless specifically requested to do so. Some started using Web Key Directories that don't allow anyone to upload data without special permission. But while these quick fixes may have limited the potential damage of future attacks, the underlying vulnerabilities remain. In <a href="https://twitter.com/lambdafu/status/1147162583969009664">a Twitter thread from the Fifth of July 2019</a>, cryptography expert Marcus Brinkmann laid out how inefficiencies in the way GPG processed keys and signatures made it particularly vulnerable to certificate-poisoning attacks, but some of these inefficiencies have since been addressed, and most people familiar with the matter place at least as much blame on the key server software SKS.
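
As a concrete illustration of those quick fixes, the sketch below shows the kind of GnuPG configuration that circulated after the incident. It assumes GnuPG 2.2.17 or later (the release that changed the keyserver-import defaults in response to the attack); option names and defaults have shifted between versions, so treat it as a historical sketch rather than authoritative current advice.

    # ~/.gnupg/gpg.conf -- a post-incident hardening sketch (assumes GnuPG 2.2.17+).

    # Point GnuPG at the Hagrid-based keyserver discussed later in this episode,
    # instead of the old SKS pool. (Newer GnuPG versions may read this setting
    # from dirmngr.conf instead.)
    keyserver hkps://keys.openpgp.org

    # When importing from a keyserver, keep only a key's own self-signatures and
    # clean out unusable material, so third-party "flood" signatures are dropped.
    keyserver-options self-sigs-only,import-clean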

In his memo, Robert J. Hansen cites two reasons that SKS might not be fixed any time soon. One reason he gives is that SKS is written in the programming language OCaml, which is a lot less widely known than popular programming languages like C++ and Java, and will be unintuitive to a lot of working programmers because, as a functional language, its way of doing things is fundamentally different from the way most programmers write code in imperative languages on an everyday basis. But OCaml does have an active community around it, and as soon as Hansen's memo went up on GitHub a number of OCaml programmers expressed interest in addressing the problem. But the other reason Hansen gave for the unlikelihood of a fix holds more water: the certificate-poisoning attack doesn't target an unexpected bug in the software. Instead, it takes advantage of one of the software's premier features, a design choice that underlies the whole system: anyone can publish any key or signature to the SKS system, and that key or signature can never be removed, lest repressive governments seek to undermine the ability of OpenPGP users to communicate privately. What makes the system strong against that kind of interference makes it vulnerable to spam—such a volume of spam that it can disrupt your computer's ability to encrypt and decrypt anything whatsoever. That strength and that vulnerability have been baked into the system for over fifteen years.

Before Daniel Kahn Gillmor, who works for the American Civil Liberties Union, had his public key certificate "poisoned", he had published several working drafts of a formal Request For Comments, a document that could eventually become accepted as part of the industry standards for OpenPGP cryptography. The document, entitled <a href="https://datatracker.ietf.org/doc/draft-dkg-openpgp-abuse-resistant-keystore/">"Abuse-Resistant OpenPGP Keystores"</a>, details various proposed modifications to OpenPGP standards that would allow a PGP key server to make itself less vulnerable to spam, denial-of-service attacks, and unwanted publication or retention of personally identifying information. In fact, Gillmor had been suggesting fixes to these vulnerabilities for years, never quite building enough momentum to get his ideas adopted into OpenPGP standards. In his own notice about the spamming incident, Gillmor says:

This is a mess, and it's a mess a long time coming. The parts of the OpenPGP ecosystem that rely on the naive assumptions of the SKS keyserver can no longer be relied on, because people are deliberately abusing those keyservers. We need significantly more defensive programming, and a better set of protocols for thinking about how and when to retrieve OpenPGP certificates.  

"Naïve assumptions" are a major theme here. The people who designed the SKS system made assumptions about who posed a danger to user privacy and what attacks on that privacy might look like. They made assumptions about what kinds of user information might need to be protected in the first place. These were assumptions that probably made some sense in the context of the early-2000s internet, and they are assumptions that are very dangerous today.

The SKS system's redundancy and persistence of data, which made it resilient to individual servers being shut down at the expense of making it more vulnerable to spam and to what is now called "doxing"—the non-consensual release of someone's personally identifying information—were implemented under the assumption that it was governments engaged in wiretapping that posed the real threat to privacy. But as the internet has increasingly become part of the lives of ordinary people, the crooked power dynamics that govern our offline relationships have moved online with us. The internet was once a community of weirdos who had desktop computers and wanted to connect to geographically distant people; now it's just another mode of communication woven into the infrastructure of our societies. Some people want to do each other harm, offline or online, and some people—especially people who are marginalized in the offline world in some way—are frequent targets of that kind of abuse. Marginalized people need and deserve systems that don't make naïve assumptions at the cost of their privacy.

Gillmor's ideas for addressing keyserver vulnerabilities largely involve making the SKS behaviors that are now expected from keyservers optional, so new keyserver software can choose not to implement the features that make the system vulnerable to abuse. <a href="https://gitlab.com/hagrid-keyserver/hagrid">Hagrid</a> is a relatively new, working keyserver intended to run at only one location on the internet—at <a href="https://keys.openpgp.org/">keys.openpgp.org</a>. It incorporates many of Gillmor's proposed ideas. It does not automatically "federate" its data to other keyservers. It does not store any key metadata apart from an email address and a fingerprint. It does not handle "web of trust" signatures at all, instead relying on users to verify keys through their own means. It requires users who upload their keys to verify their email addresses in order to make their keys discoverable. And, much unlike SKS, it allows users to unpublish a key through email verification. As the poisoned certificate incident unfolded, Hansen and others suggested using <a href="https://keys.openpgp.org/">keys.openpgp.org</a> as one possible mitigation for the exposed vulnerabilities.

I think this mess with OpenPGP makes for a great case study and cautionary tale because it endangers the very thing—user privacy—that it was created to protect. But I wouldn't want to give you the impression that OpenPGP's problems pose the greatest danger to people's privacy online. Social media corporations have deliberately profited off the surreptitious use of users' personal information for years now, and they will get their due in future episodes of this podcast. One thing that makes OpenPGP's issues less catastrophic is that very few people, relative to the scale of the internet, actually use it to protect their personal communication. OpenPGP has succeeded in serving as an industry standard, but it has failed by another, more important metric: it hasn't proliferated to non-technical users.

The strength of encryption is often said to depend on the proliferation of its use. One person encrypting all their communication in a forum where nobody else is doing it tends to arouse some suspicion, whereas when everyone is encrypting even their most mundane messages, the act of encryption itself becomes routine for those communicating and unremarkable, if annoying, to anyone in the business of espionage. This strength-in-numbers approach is important not only because it decreases the risk of someone being singled out for increased scrutiny for their use of encryption, but also because, theoretically, the success of a brute-force attack against any person's secret key is only a matter of CPU time. Surveillance authorities have access to the biggest, most efficient hardware available and could use it to throw hundreds of years of CPU time at a problem, and the jury is still out on how the advance of quantum computing could affect the strength of cryptography—but bringing this kind of power to bear on one person's encrypted communications is still costly and time-consuming, so there would have to be a pretty convincing argument that it would be worth the time and money. And if you're encrypting your email and everyone else is doing it too, maybe that strong argument isn't there.

OpenPGP has been out there for over fifteen years and email encryption still hasn't become this kind of pervasive norm among non-technical computer users; indeed, it's still associated culturally with paranoid hackers, political dissidents, and people who "have something to hide." One factor that has prevented wide-scale adoption of OpenPGP for encrypted communication between everyday users is that it isn't simply integrated into the most popular tools for online communication. The web interface for Hotmail never implemented an OpenPGP button, and the web interface for Google's Gmail doesn't have anything like that either. In fact, until recently, there were no standards that would even allow such an intuitive interface to OpenPGP email encryption; the processes of copying and exchanging keys had to be performed manually by the user. The only people who use OpenPGP are the people who care about encryption enough to use specialized tools that provide it, like the <a href="https://www.mailvelope.com/">Mailvelope</a> browser plugin or the <a href="https://www.enigmail.net/index.php/en/">Enigmail</a> plugin for the Thunderbird email client, or <a href="https://k9mail.github.io/">K-9 Mail</a> on Android, or even the GnuPG command line tool. Still, I think it's fair to say that the developers of tools like Mailvelope are addressing another barrier to adoption by simplifying what have previously been almost infamously unfriendly interfaces full of technical jargon and a potentially bewildering array of options.

When U.S. surveillance whistleblower Edward Snowden contacted Guardian reporter Glenn Greenwald in 2013 at great personal risk, he sent the journalist a video tutorial on setting up PGP tools for his Windows work computer. Greenwald later explained in his book No Place to Hide that it took him months of time and the help of colleagues to comprehend and follow the instructions in the video. <a href="https://www.dailydot.com/layer8/edward-snowden-gpg-for-journalists-video-nsa-glenn-greenwald/">The Daily Dot reported</a> that the video, which was posted anonymously to Vimeo, resurfaced upon the publication of Greenwald's book.

It's just detailed enough to impart the basic understanding needed to use the tools it describes, appropriately hand-holding but not condescending. But the distracting effects of the vocal anonymization that protects the video narrator's identity make it difficult to follow, and moreover the complexity of the tools involved is self-evidently beyond the easy reach of an average non-technical user like Greenwald. Here's a brief clip from a section of the video that demonstrates the web interface of a keyserver:

Let's take a quick look at publishing and getting keys from people we've never had contact with. <a href="https://pgp.mit.edu/">pgp.mit.edu</a> is a public key server; anyone can put their keys up there for free. You can search by emails, strings a little bit like what you're searching for. You want to use exact matches if you know what the email is. So let's look at... This is what results look like. This is the key we just generated and published. That is our key ID, which are the last eight characters of the fingerprint.  
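
The clip demonstrates the old pgp.mit.edu web interface; the Hagrid-based keys.openpgp.org service described earlier exposes a similar lookup as a plain HTTPS API. The short Python sketch below is an illustration built around the by-fingerprint URL pattern that appears in this page's own links, using the episode author's published fingerprint as an example; it is an assumed usage sketch, not an official client.

    # A sketch of fetching a public key from keys.openpgp.org by fingerprint,
    # using only the Python standard library. The URL pattern mirrors the
    # vks/v1/by-fingerprint links used elsewhere on this page.
    from urllib.request import urlopen

    fingerprint = "BABAD8C5A85348159329F78E2B53214633970D95"  # the episode author's key
    url = f"https://keys.openpgp.org/vks/v1/by-fingerprint/{fingerprint}"

    with urlopen(url) as response:
        armored_key = response.read().decode()

    # The response body is an ASCII-armored OpenPGP public key block.
    print(armored_key.splitlines()[0])  # "-----BEGIN PGP PUBLIC KEY BLOCK-----"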

It's not hard to see why the fastest-growing forms of encrypted personal communication today are those that take essentially no effort for the end user to set up and maintain, where the management of keys and encryption of messages is transparently and automatically performed by the messaging software. <a href="https://signal.org/">Signal</a> and <a href="https://www.whatsapp.com/">WhatsApp</a> are increasingly popular examples, despite the efforts of some national governments to curb their use. OpenPGP may be an industry standard, but without a significant non-technical user base, its impact on the proliferation of encryption—and therefore on the tactical effectiveness of encryption as a means of achieving privacy—is limited.

Since 2016, a project called <a href="https://autocrypt.org/">Autocrypt</a> has aimed to make OpenPGP encryption of email as effortless as using Signal or WhatsApp. It's a new set of guidelines for compliant email clients to set up OpenPGP and exchange public keys automatically, without the user having to do or know anything. It has the potential to dramatically expand OpenPGP adoption—but for now, most people who use email on a daily basis aren't benefiting from Autocrypt, because, again, they're accessing their email through something like the Gmail in-browser client or Apple's iOS Mail app that doesn't support Autocrypt—at least, not yet. Still, small as it may be, the OpenPGP community isn't going anywhere, and there's no sign that OpenPGP is about to be supplanted as an industry standard. The world of technology may be fast-moving, but old standards die hard.

[Closing theme music] This episode of The World on a Wire Show was written and narrated by me, Dominique Cyprès. The opening theme music was <a href="http://beta.ccmixter.org/files/zep_hurme/59681">"Come Inside"</a> by Zep Hurme featuring Snowflake. The closing theme is <a href="http://beta.ccmixter.org/files/AlexBeroza/31670">"Start Again"</a> by Alex Beroza. Special thanks to Daniel Kahn Gillmor, Vincent Breitmoser, and Wiktor Kwapisiewicz for their input on the script of this episode. Get the latest updates on The World on a Wire Show at <a href="https://www.patreon.com/lunasspecto">patreon.com/lunasspecto</a>; that's Lima Uniform November Alpha Sierra Sierra Papa Echo Charlie Tango Oscar. You can also reach me by an encrypted or plaintext email to <a href="mailto:lunasspecto@gmail.com">lunasspecto@gmail.com</a>; my <a href="https://keys.openpgp.org/vks/v1/by-fingerprint/BABAD8C5A85348159329F78E2B53214633970D95">PGP public key</a> is available from <a href="https://keys.openpgp.org/">keys.openpgp.org</a>.

This work is licensed under a <a href="https://creativecommons.org/licenses/by/4.0/">Creative Commons Attribution 4.0 International License</a>.

Date
Source
Internet Archive identifier: ep02_20200124
https://archive.org/download/ep02_20200124/ep02.mp3
Author Dominique Cyprès
Title
InfoField
The World on a Wire Show S01E02: The Case of the Poisoned Certificates
Language
InfoField
eng
Publicdate
InfoField
2020-01-24 16:16:45
Subject
InfoField
podcast; computer science; The World on a Wire Show; computer history
Collection
InfoField
podcasts

Licensing

w:en:Creative Commons
attribution
This file is licensed under the Creative Commons Attribution 4.0 International license.
You are free:
  • to share – to copy, distribute and transmit the work
  • to remix – to adapt the work
Under the following conditions:
  • attribution – You must give appropriate credit, provide a link to the license, and indicate if changes were made. You may do so in any reasonable manner, but not in any way that suggests the licensor endorses you or your use.

File history

Date/Time: 07:51, 19 August 2020 (current)
Dimensions: 30 min 0 s (20.6 MB)
User: (talk | contribs)
Comment: Internet Archive podcast ep02_20200124 (IA audio)


Transcode status

Format: Ogg Vorbis
Bitrate: 82 kbps
Status: Completed 17:57, 17 February 2022
Encode time: 45 s