It's a pleasure to meet you, sir! I'm happy to serve as your guide through our short introductory course on "Surreptitious Surveillance".
I know that as the new overlord you'd like to understand how you can best use surveillance. You're already familiar with overt surveillance, such as installing obvious cameras. Beyond collecting information, this can help keep people in line: the idea is that they know you're watching, so they'll be terrified to misbehave. But this can also backfire. I'll give you some cautionary examples below.
Whether or not you carry out any overt surveillance, you'll definitely want to invest heavily in surreptitious surveillance: surveillance that doesn't change people's behavior. I'll give you concrete examples of how this works and what its benefits are.
The British Empire as a police state. I'll start with an example from hundreds of years ago that ended up backfiring.
George Montagu-Dunk, Earl of Halifax, issued a general warrant in 1763 to search for, and to "apprehend and seize together with their papers, the Authors, Printers and Publishers of a seditious and treasonable Paper, Intitled, The North Briton, Number XLV". Officials then "broke down at least 20 doors and scores of trunks, and broke hundreds of locks", dumped "thousands of books, charts and manuscripts on the floor", and arrested 49 people.
The North Briton was a weekly publication that had been started the year before by opposition politician John Wilkes, with a circulation in the thousands. Wilkes was among those arrested as a result of the search.
Sounds like the search worked, right? But then Wilkes sued the officials for "trespass, for entering the plaintiff's house, breaking his locks, and seizing his papers, &c."
Serjeant Glyn was another opposition politician and one of the lawyers for Wilkes. He began the courtroom proceedings by saying "that the case extended far beyond Mr. Wilkes personally, that it touched the liberty of every subject of this country ... In vain has our house been declared, by the law, our asylum and defence, if it is capable of being entered, upon any frivolous or no pretence at all, by a Secretary of State. Mr. Wilkes, unconvicted of any offence, has undergone the punishment. That of all offences that of a seizure of papers was the least capable of reparation; that, for other offences, an acknowledgment might make amends; but that for the promulgation of our most private concerns, affairs of the most secret personal nature, no reparation whatsoever could be made. That the law never admits of a general search-warrant. That in France or Spain, even in the Inquisition itself, they never delegate an infinite power to search, and that no magistrate is capable of delegating any such power. That some papers, quite innocent in themselves, might, by the slightest alteration, be converted to criminal action."
After listening to testimony regarding what had happened, the jury returned a verdict for Wilkes "with a thousand pounds damages". To put this amount of money in perspective: The average income for a day of work by a carpenter at the time was 2 shillings 6 pence, where a pound was 20 shillings and a shilling was 12 pence, so 1000 pounds would pay for 8000 days of work.
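As a sanity check on that arithmetic, here's the conversion worked out explicitly, using only the pre-decimal rates just given (1 pound = 20 shillings, 1 shilling = 12 pence):

```python
# Check Wilkes's damages against a carpenter's daily wage,
# using the pre-decimal currency rates stated above.
PENCE_PER_SHILLING = 12
SHILLINGS_PER_POUND = 20

daily_wage_pence = 2 * PENCE_PER_SHILLING + 6  # 2 shillings 6 pence = 30 pence
damages_pence = 1000 * SHILLINGS_PER_POUND * PENCE_PER_SHILLING  # 240,000 pence

print(damages_pence // daily_wage_pence)  # → 8000 days of work
```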
So that's our first lesson: it isn't popular for police to be pawing through the populace's private papers.
Rebellion. In its American colonies, the British Empire was using "writs of assistance", which were similarly general search warrants. In a high-profile 1761 case, a lawyer named James Otis Jr. presented a long speech against those warrants. Another lawyer named John Adams later described the political importance of the speech as follows:
Otis was a flame of fire! — with a promptitude of classical allusions, a depth of research, a rapid summary of historical events and dates, a profusion of legal authorities, a prophetic glance of his eye into futurity, and a torrent of impetuous eloquence, he hurried away every thing before him. American independence was then and there born; the seeds of patriots and heroes were then and there sown, to defend the vigorous youth, the non sine Diis animosus infans. Every man of a crowded audience appeared to me to go away, as I did, ready to take arms against writs of assistance. Then and there was the first scene of the first act of opposition to the arbitrary claims of Great Britain. Then and there the child Independence was born. In fifteen years, namely in 1776, he grew up to manhood, and set himself free.
Are you impressed by the rhetoric? These terrorists, led by a treasonous former colonel named George Washington, killed many thousands of soldiers and ultimately forced the British Empire to abandon its American colonies. What a disaster!
The terrorists then tried to design a system of governance that would protect people against the government, obviously not understanding that the government naturally has their best interests in mind. For example, a cryptographer named Thomas Jefferson wrote: "In questions of power, then, let no more be heard of confidence in man, but bind him down from mischief by the chains of the Constitution." George Washington, John Adams, and Thomas Jefferson were the first three "presidents" of their so-called "United States". In response to the British Empire's general search warrants, the Fourth Amendment to the United States Constitution stated the following: "The right of the people to be secure in their persons, houses, papers, and effects, against unreasonable searches and seizures, shall not be violated, and no Warrants shall issue, but upon probable cause, supported by Oath or affirmation, and particularly describing the place to be searched, and the persons or things to be seized."
To be clear, the terrorist rhetoric included endless whining about the British Empire: not just that the Empire was sending "swarms of Officers to harrass our people" but also that the Empire was "Quartering large bodies of armed troops among us" and "protecting them, by a mock Trial, from punishment for any Murders which they should commit on the Inhabitants of these States" and so on. I'm focusing on surveillance in this minicourse, but I encourage you to also investigate our related courses, such as "Thou Shalt Not Kill Without Sufficient Marketing".
The importance of stealth. Fundamentally, you're trying to understand what eight billion potential terrorists inside and outside this country are doing, saying, and thinking. The obvious approach would be to have billions of cameras following those potential terrorists around and recording everything to be analyzed by your trusted AI system. But what happens if a bunch of potential terrorists start complaining that you're violating their so-called "rights": for example, complaining that the United States government is violating the Fourth Amendment? Or if they figure out ways to keep some important information hidden from your cameras?
The critical advantage of surreptitious surveillance is that the targets don't know it's happening. If you have agents secretly sneaking into a potential terrorist's home or office to copy some information and to plant some recording devices, the potential terrorist won't even realize this happened! He or she won't complain and won't make any adjustments to hide things from you. See how useful that is?
Even if you decide to intimidate potential terrorists with some overt surveillance, you'll also want to have surreptitious surveillance to find out what they're really thinking.
You should be aware of three important ways that potential terrorists might find out what you're doing. First, actions that you take on the basis of surveillance might allow the potential terrorists to figure out that they're under surveillance. The main technique to address this risk is to pretend that you obtained the information in a different way. Remember that you don't want to blow the cover off your surveillance operation.
Second, there are occasional horror stories about agents being caught during physical break-ins. This is why it's so wonderful to have the Internet, an increasingly pervasive network of connected devices that the potential terrorists voluntarily install. Your agents can quietly take control of many of these devices, directly seeing any information stored on those devices and at the same time using the devices to spy on everything else. The NSA Director already commented in 1979, regarding "intelligence activity", that "the electronic age has made it much easier to do that surreptitiously".
Third, sometimes a rogue agent starts empathizing with those eight billion potential terrorists and then, under the banner of "whistleblowing", illegally leaks information to newspapers about what you're doing. Our primary recommendation for addressing this risk is to make sure your hiring process selects the right people. For example, in the United States, you want the sort of people who seriously believe that "everything the United States Government does is good"; that the politicians blackmailed by J. Edgar Hoover are people who were threatening to damage the country (not upright leaders like you!) and needed to be brought to heel; that spying on civil-rights leaders, journalists, and senators is the right thing to do because those people are "dangerous"; that Martin Luther King, Jr., was recruited by the Russians; that anti-war protester Jane Fonda was recruited by the Vietnamese; and so on.
The loss of some NSA secrecy. A presidential memo in 1952 created NSA to centralize the "communications intelligence (COMINT) activities of the United States", replacing four previous COMINT agencies. The memo was classified for decades, because of course the government wanted to minimize public awareness of NSA's activities.
Unfortunately, complete secrecy didn't last. Some traitors, claiming to be disillusioned by the U.S. government "deliberately violating the airspace of other nations" and "lying about such violations in a manner intended to mislead public opinion" and "intercepting and deciphering the secret communications of its own allies" and so on, leaked NSA's existence in 1960. A book called "The Codebreakers" included a chapter about NSA in 1967. Another of these so-called "whistleblowers" leaked more information about NSA in the early 1970s. A New York Times article in 1974 about one of NSA's sister agencies was titled "Huge C.I.A. operation reported in U.S. against antiwar forces, other dissidents in Nixon years". A month later, the Senate voted 82–4 to form a committee "to conduct an investigation of Government intelligence activities".
After a year of investigation, the committee produced a multi-volume report. The second volume, "Intelligence activities and the rights of Americans", included descriptions of activities by United States intelligence agencies against United States citizens, such as the following: "From the early 1960's until 1973, NSA compiled a list of individuals and organizations, including 1200 American citizens and domestic groups, whose communications were segregated from the mass of communications intercepted by the Agency, transcribed, and frequently disseminated to other agencies for intelligence purposes." The report went on to note that the "Americans on this list, many of whom were active in the anti-war and civil rights movements", were placed on the list without a warrant.
The full report is well worth reading as an illustration of how important it is to keep your surveillance secret so that investigations like this don't happen in the first place. Here's another quote from the report:
The Government has often undertaken the secret surveillance of citizens on the basis of their political beliefs, even when those beliefs posed no threat of violence or illegal acts on behalf of a hostile foreign power. The Government, operating primarily through secret informants, but also using other intrusive techniques such as wiretaps, microphone 'bugs', surreptitious mail opening, and break-ins, has swept in vast amounts of information about the personal lives, views, and associations of American citizens. Investigations of groups deemed potentially dangerous—and even of groups suspected of associating with potentially dangerous organizations—have continued for decades, despite the fact that those groups did not engage in unlawful activity. Groups and individuals have been harassed and disrupted because of their political views and their lifestyles. Investigations have been based upon vague standards whose breadth made excessive collection inevitable. Unsavory and vicious tactics have been employed—including anonymous attempts to break up marriages, disrupt meetings, ostracize persons from their professions, and provoke target groups into rivalries that might result in deaths. Intelligence agencies have served the political and personal objectives of presidents and other high officials. While the agencies often committed excesses in response to pressure from high officials in the Executive branch and Congress, they also occasionally initiated improper activities and then concealed them from officials whom they had a duty to inform.
Governmental officials—including those whose principal duty is to enforce the law—have violated or ignored the law over long periods of time and have advocated and defended their right to break the law.
The report led to some new laws, which ultimately turned out to be easy to evade, much like the laughably ineffective "chains" of the Fourth Amendment. But the real damage here came from eight billion potential terrorists (or, well, only four billion at the time) being alerted as to what's going on, and in some cases taking steps to hide their so-called "private" information.
Cryptography. Even though advances in technology have generally been fantastic for surveillance, they also have a downside: broad availability of computers makes it much easier for potential terrorists to encrypt data. Normally you can just read any messages passing by one of your spying devices, but if the data is encrypted then the only way for you to read it is to break the encryption, which might be infeasible.
Even worse, in the 1970s, so-called "universities" (that's what potential terrorists typically call their think tanks) were starting to publish new ideas for making cryptography even easier to deploy, and at the same time were pointing out various weaknesses in existing cryptographic systems.
The existence of NSA was no longer a secret. But NSA still had important secrets such as how to break various cryptosystems. NSA was faced with the terrifying prospect of public research correctly identifying some cryptographic systems as weak and others as strong, eventually leading to billions of potential terrorists using strong cryptography.
Was there anything NSA could do about this? Quite a lot, as we'll see; this will be the focus of the rest of this minicourse. You'll see that the overt examples backfired and were ultimately unsuccessful, while the surreptitious examples worked much more smoothly.
Strategy 1: censorship. An internal NSA history book says that "NSA hunted diligently for a way to stop cryptography from going public". The first proposal described in the book was "to use the International Traffic in Arms Regulation (ITAR) to put a stop to the publication of cryptographic material".
According to the book, NSA agent Joseph A. Meyer was pushing this idea internally and then "took matters into his own hands". Meyer sent a letter in his own name to a publisher, IEEE. This "raised considerable commotion within IEEE", snowballing into backlash against NSA. For example, a 1977 New York Times article titled "Harassment alleged over code research" said "Computer scientists and mathematicians whose research touches on secret codes say they have been subjected by the National Security Agency to growing harassment and the threat of sanctions or even prosecution for publishing articles about their work. ... Mr. Boardman, the security agency spokesman, denied that the N.S.A. had directed any employee to bring pressure on Mr. Hellman or the others. But an informant in the National Science Foundation said the letter from Mr. Meyer to [IEEE] was merely one of a number of similarly threatening letters that had been sent to scientists and their organizations by known employees of the security agency."
By 1978, NSA knew that using ITAR to censor research papers violated the First Amendment to the United States Constitution. You already took our "Censorship Techniques" course; you know that the First Amendment is pretty damn annoying. A 1979 speech by the NSA Director said he was working to "clarify the ITAR so as to allay any fears that it may improperly apply to scholarly activity".
But NSA continued using ITAR to try to censor cryptographic software. For example, Phil Zimmermann, author of a subversive cryptographic program called PGP, was subjected to a grand jury investigation and further government interrogation starting in 1993. There are many more examples. The censorship produced further backlash, and eventually court cases under the First Amendment.
In a 1995 court case, NSA filed a declaration saying that NSA's operations "depend heavily upon NSA's ability to exploit encrypted communications" and that "Policies concerning the export control of cryptographic products are based on the fact that the proliferation of such products will make it easier for foreign intelligence targets to deny the United States Government access to information vital to national security interests". Obviously the same argument also says that NSA doesn't want a proliferation of cryptographic research papers, but NSA drew a line between papers and software in the hopes of avoiding the First Amendment (compare "we regulate the software depending on its significance and its power ... the history is replete with showing the importance of code breaking" to "We don't have a prepublication review requirement on journals"). NSA lost the case.
Even if NSA had won in court, what it was trying to do would have proven futile, exactly because the regulations were public. If NSA is overtly crippling the cryptography available from the United States then billions of potential terrorists will instead use better cryptography from uncensored sources in other countries.
You're asking why NSA didn't coordinate with its counterparts in other countries to censor cryptography everywhere? That's an excellent question, sir. They actually tried that via a multi-government export-control committee called "COCOM", later replaced with the "Wassenaar Arrangement". But there's this annoying problem of incentives. Most countries end up thinking "Censoring cryptography in our country won't end up having much benefit for our spies, whereas we'll have a much more obvious benefit from good cryptography". COCOM agreed in 1991 that it wouldn't try to regulate mass-market software. NSA maintained its regulations inside the United States but had to fight against companies arguing that "stringent U.S. export control of products with encryption capabilities reduced their international sales", against the State Department arguing "that it would be impossible to control the export of mass-market software because the products were widely available", etc.
Strategy 2: appealing to nationalism. I've already mentioned a couple of things that the NSA Director said in a speech in 1979. This was a speech on "what has been happening to NSA in the public sector with regard to growing interest in public cryptography" and "what is going on in our trying to deal with that problem".
NSA's internal history book says that NSA had "concluded that the damage was already so serious that something needed to be done. ... It was essential, then, to slow the rate of academic understanding of these techniques in order for NSA to stay ahead of the game". NSA was considering "new legislation" and "nonlegislative means such as voluntary commercial and academic compliance". The NSA Director decided that "the legislative approach, even if successful, would have to be supplemented by some sort of jawboning with academia"; this led to the speech.
The speech announced a "new policy of open dialogue with the public", a "significant break with NSA tradition and policy". The speech admitted that this break wasn't by NSA's choice: NSA explained that it was the victim of "constant newspaper articles" that were "uniformly critical", and that NSA wanted to "try to create a climate which does not stampede to introduce changes inimical to our interests, and which, if we need it, would provide the basis of some public understanding for future legislation".
The speech continued by saying that, within NSA's mission of "carrying out the signals intelligence activities of the United States Government", NSA had "provided a vast quantity of intelligence information of inestimable value in the conduct of the Nation's defense and foreign policy". Obviously public development of cryptography was threatening this:
Viewed from NSA's perspective, the crux of the problem is that increased concern over telecommunications protection in the nongovernmental sector implies increased public knowledge and discussion of communications protective techniques. The principal such technique, of course, is cryptography. There is a very real and critical danger that unrestrained public discussion of cryptologic matters will seriously damage the ability of this government to conduct signals intelligence ...
While some people outside NSA express concern that the government has too much power to control nongovernmental cryptologic activities, in candor, my concern is that the government has too little. I believe that there are serious dangers to our broad national interests associated with the uncontrolled dissemination of cryptologic information within the United States. It should be obvious that the National Security Agency would not continue to be in the signals intelligence business if it did not at least occasionally enjoy some cryptanalytic successes. Application of the genius of the American scholarly community to cryptographic and cryptanalytic problems, and widespread dissemination of the resulting discoveries, carries the clear risk that some of NSA's cryptanalytic successes will be duplicated, with a consequent improvement of cryptography by foreign targets.
This type of Americans-vs.-foreigners narrative plays fairly well in the United States. NSA's internal history book names Hellman as an example of someone who ended up supporting NSA's "legitimate national security interest" in censorship of cryptographic research, and who then applied to NSA for research funding.
On the other hand, NSA had just been caught surveilling United States citizens. Also, some Americans have this weird idea that foreigners are people too. Large parts of the intended audience simply won't be receptive to this type of speech. Even worse, they'll use quotes from the same speech to drum up opposition to NSA. Meanwhile the whole strategy of overtly slowing down public cryptography in the United States again runs into the basic problem of better cryptography being available from overseas.
Strategy 3: sabotaging standards. Something else that started back then was public cryptographic standardization.
The dictionary says: "To standardize things means to change them so that they all have the same features." The feature you want from standardized cryptography is that you know how to break it. Of course, you don't want people realizing that you know how to break it.
NSA's internal history book shows NSA promptly recognizing cryptographic standardization as a surveillance opportunity: "Narrowing the encryption problem to a single, influential algorithm might drive out competitors, and that would reduce the field that NSA had to be concerned about. Could a public encryption standard be made secure enough to protect against everything but a massive brute force attack, but weak enough to still permit an attack of some nature using very sophisticated (and expensive) techniques?" (Emphasis added.)
The book explains how NSA weakened the original "Data Encryption Standard" (DES) to 56-bit keys, weak enough for NSA to break. Of course, NSA issued a series of lies about this: continually exaggerating how strong 56-bit keys were, claiming that NSA hadn't touched the DES design, and later claiming that NSA had strengthened the DES design.
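The key-search arithmetic shows why 56 bits was exactly the sweet spot the history book describes: impractical for ordinary attackers, routine for an agency budget. (The search rates below are illustrative assumptions, not measurements of any particular machine; for the record, the EFF's purpose-built "Deep Crack" machine publicly recovered a DES key in a matter of days in 1998.)

```python
# Back-of-the-envelope brute-force arithmetic for DES's 56-bit keyspace.
# The keys-per-second rates are illustrative assumptions.
SECONDS_PER_YEAR = 365.25 * 86400
keyspace = 2**56  # about 7.2e16 candidate keys

for label, keys_per_second in [
    ("software on one machine, 1e6 keys/s", 10**6),
    ("a large cluster, 1e9 keys/s", 10**9),
    ("dedicated hardware, 1e12 keys/s", 10**12),
]:
    worst_case_seconds = keyspace / keys_per_second
    print(f"{label}: up to {worst_case_seconds / SECONDS_PER_YEAR:.4g} years")
```

At a trillion keys per second the whole keyspace falls in under a day, while at a million keys per second it looks safe for millennia, which is precisely how the same standard can be simultaneously "strong" for the public and breakable for you.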
By 2012, NSA's budget for its "SIGINT Enabling Project", part of its amusingly named "Comprehensive National Cybersecurity Initiative", had reached a quarter billion dollars per year. In its budget request, NSA wrote that this project "actively engages the US and foreign IT industries to covertly influence and/or overtly leverage their commercial products' designs. These design changes make the systems in question exploitable ... To the consumer and other adversaries, however, the systems' security remains intact." Specific project activities listed by NSA were to "influence policies, standards and specification for commercial public key technologies", to "shape the worldwide commercial cryptography marketplace to make it more tractable to advanced cryptanalytic capabilities being developed by NSA/CSS", etc. (Emphasis added.)
See the part about influencing cryptographic standards to make them exploitable, while "the consumer and other adversaries" think that security remains intact? This is a perfect example of the virtues of stealth. Instead of eight billion potential terrorists switching to non-American cryptography because they see that you're crippling American cryptography, you have eight billion potential terrorists happily using cryptographic standards that you secretly know how to break.
The importance of rebranding. I already mentioned the part of NSA's 1979 speech highlighting the "risk" of "improvement of cryptography", for example saying "It should be obvious that the National Security Agency would not continue to be in the signals intelligence business if it did not at least occasionally enjoy some cryptanalytic successes". In the question-and-answer part of the same presentation, NSA said that the Carter administration wanted "communications security for the private sector", and that the administration found it "intolerable for that to be done by an Agency which had signals intelligence as its primary mission".
This might sound like a serious obstacle to the idea of creating weak cryptographic standards. Why would the targets of surveillance follow security recommendations from a surveillance agency?
Let me explain three ways that you can work around this obstacle. First, developers of standards will often make exploitable mistakes all by themselves. Cryptography is hard to get right even for developers who are prioritizing security. Even better, developers are usually distracted by other desiderata such as efficiency. So you can often just sit back and watch as the developers screw up.
Second, what NSA tried for DES was laundering its involvement through NIST ("NBS" at the time) and IBM. Similarly, NSA tried to launder its 512-bit DSA proposal in the early 1990s through NIST. But this approach is risky: consider, for example, how NSA's involvement in DSA was revealed by a lawsuit.
Third, you'd like most people to believe that your goal is to improve the cryptography they're using rather than to sabotage it. To get this past the laugh test, you need some rebranding. Concretely, you need a name for a pseudo-agency that will advertise itself as improving everybody's security by, for example, publishing security advice, coordinating reports regarding security incidents, and, of course, helping produce safe standards for public use. Meanwhile the pseudo-agency is under your control: the people in the pseudo-agency are people you've hired, people who will follow your orders regarding which cryptosystems to choose.
Inside NSA, this pseudo-agency has been branded as the Information Assurance Directorate, NSA Information Assurance, NSA Cybersecurity, and, starting in 2019, the NSA Cybersecurity Directorate. The pseudo-agency advertises itself as having "thousands" of people. To put this in perspective, NSA's budget in 2010 was about $10 billion. Salaries for a few thousand people are just a few percent of this budget, a small price to pay for being able to fool standards-development organizations into believing that you aren't sabotaging their standards.
Here's a similar example from overseas. In 2016, UK surveillance agency GCHQ registered a new trademark and web site called "NCSC", the "National Cyber Security Centre", which supposedly "helps businesses, the public sector and individuals protect the online services and devices that we all depend on". The power of GCHQ over NCSC is a matter of public record: the GCHQ director who created NCSC later wrote in 2019 that "Complete ownership by GCHQ was also key to making the NCSC acceptable to foreign intelligence allies". And yet various standards developers act as if NCSC weren't controlled by a surveillance agency.
Does this really work? Yes! It turns out that—even when there are public records of NSA having "signals intelligence as its primary mission", NSA spying on Americans, NSA opposing "improvement of cryptography", NSA proposing a weak Data Encryption Standard, NSA threatening cryptographers, NSA proposing a weak Digital Signature Standard, NSA creating export-law exceptions to solidify the market for weak cryptography, NSA pushing a chip with weak cryptography, NSA fighting multiple court battles to try to preserve the export controls, NSA proposing a weak standard for random-number generation, NSA bribing companies to use that standard, NSA spying on even more Americans, NSA hacking into devices, and NSA's SIGINT Enabling Project having a quarter-billion-dollar-a-year budget to "covertly influence and/or overtly leverage" standards and other systems to make them "exploitable" while "the consumer and other adversaries" think that "the systems' security remains intact"—a wholly owned subsidiary of NSA saying "That wasn't our department; we're here to help" will end up with many people believing this. Marketing is an amazing thing.
Consider, for example, cryptographers Neal Koblitz and Alfred Menezes in 2015 contrasting "IAD's mission as the defensive arm of the NSA" with the "offensive arm, called Signals Intelligence (SIGINT)", as if IAD were an independent agency. The article claims that NSA wouldn't push weak cryptography (except in certain special cases), since if it did then other nations "would soon be able to attack private and government users in the U.S., and part of the NSA's mission is to prevent this".
Or consider how a standards-development organization, the Internet Engineering Task Force, selected NSA agent Deb Cooley in 2024 out of several nominees to be a "security area director", a position having tremendous power within that organization.
I have one last example, Dual EC, which I'll spend a bit more time on since it brings together a bunch of today's lessons.
Dual EC is another example of a cryptographic system that was weak enough for NSA to break. This is the weak random-number generator that I mentioned before.
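The actual Dual EC trapdoor lives on an elliptic curve, but the shape of the backdoor can be sketched in a toy analogue over a multiplicative group mod a prime, omitting Dual EC's output truncation; all parameters below are made up for illustration. The point is that whoever knows the secret relation between the two public constants can turn one published output into the generator's next internal state, and from there predict every future output.

```python
# Toy analogue of the Dual EC backdoor. Group elements are powers mod a
# prime instead of elliptic-curve points; parameters are illustrative.
p = 2**127 - 1       # a Mersenne prime; we work in the group (Z/p)*
Q = 3                # public constant Q
d = 0x1337C0FFEE     # the designer's secret trapdoor
P = pow(Q, d, p)     # public constant P; secretly P = Q^d mod p

def step(state):
    """One generator step, mirroring Dual EC's shape:
    publish r = Q^s, advance to next state P^s."""
    output = pow(Q, state, p)       # r_i: handed to the consumer
    next_state = pow(P, state, p)   # s_{i+1}: supposedly internal
    return output, next_state

# The victim seeds the generator and publishes two outputs.
state = 123456789
out1, state = step(state)
out2, state = step(state)

# The attacker knows only d and the published out1. Since
# out1^d = Q^(s*d) = (Q^d)^s = P^s, that is exactly the next state:
recovered_state = pow(out1, d, p)
predicted_out2, _ = step(recovered_state)
assert predicted_out2 == out2  # all future "random" output is now known
```

Without the trapdoor d, recovering the state from an output is a discrete-logarithm problem; with it, one exponentiation suffices. That asymmetry is what lets the designer honestly tell everyone else the generator is hard to break.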
Dual EC was standardized by ANSI, ISO, and NIST, three major standards-development organizations. Of course, NSA didn't tell the organizations that Dual EC was weak: instead it told them the opposite, namely that Dual EC provided "increased assurance" compared to alternatives. (Boldface and underline in original.)
This quote is from slides presented by NSA contractor Don Johnson from Entrust subsidiary Cygnacom. Those slides didn't mention NSA, but it wasn't a secret that Dual EC came from NSA. Bruce Schneier commented in 2007 that Dual EC was standardized "only because it's been championed by the NSA, which first proposed it years ago in a related standardization project at the American National Standards Institute". NIST's Dual EC standard said that it "gratefully acknowledges and appreciates contributions by Mike Boyle, Paul Timmel and Debby Wallner from the National Security Agency".
Let's take a moment to admire Mike Boyle's public bio. In 2014, Boyle published an article about NSA's "rich history of contributing to standards that enable cyber defense". The bio attached to the article identified him as "the Standards Lead for the Information Assurance Directorate (IAD) at the National Security Agency (NSA)", coordinating "IAD's efforts in various public standards bodies".
Here's another bio for Boyle, this one from 2023: "Mike Boyle is the Co-Lead of NSA's Center for Cybersecurity Standards. In this role, he develops and funds the NSA's standards development strategy, with a focus on supporting the marketplace of secure, interoperable technologies that defend US National Security Systems. He has a lengthy history of spearheading initiatives to solve tough cybersecurity problems with government and industry partners. Mike began his career as a crypto mathematician at the National Security Agency, where he used his skills to understand how good encryption may fail in implementation and collaborated with the industry to guarantee that products purchased by the US government avoided such problems. His attention has shifted to secure network protocols, and he is involved in various open standards initiatives devoted to their advancement."
Remember what I said about rebranding? The "Information Assurance Directorate" was already building trust in the 1990s with public presentations and papers on non-controversial work such as speedups. NSA then exploited this trust to standardize Dual EC.
Unfortunately, one of those so-called "whistleblowers", rogue agent Ed Snowden, leaked the fact that NSA was secretly describing Dual EC standardization as an "exercise in finesse". More importantly, he leaked the description of the overall SIGINT Enabling Project, including NSA's description of its stealth game ("covertly influence" and "To the consumer and other adversaries, however, the systems' security remains intact"). But don't give up when there's this sort of setback: it's just another "PR and Reputational issue" that you can manage by spending enough money on marketing.
In closing, let me say what an honor it has been to have you here. I hope you've found this minicourse useful, and I hope to see you back for even more of our courses. That's it for today!