

The first field is “Proc-Type,” and identifies the type of processing performed on the message. There
are three possible types of messages. The “ENCRYPTED” specifier says that the message is encrypted
and signed. The “MIC-ONLY” and “MIC-CLEAR” specifiers indicate that the message is signed but
not encrypted. MIC-CLEAR messages are not encoded and can be read using non-PEM software.
MIC-ONLY messages need PEM software to transform them to a human-readable form. A PEM
message is always signed; it is optionally encrypted.
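
The three specifiers can be summarized in a small lookup table. The following sketch is my own summary of the properties just described, not code from any PEM implementation:

```python
# Properties of the three PEM Proc-Type specifiers, as described above.
# This table is an illustrative summary, not part of any PEM library.
PROC_TYPES = {
    "ENCRYPTED": {"signed": True, "encrypted": True,  "readable_without_pem": False},
    "MIC-ONLY":  {"signed": True, "encrypted": False, "readable_without_pem": False},
    "MIC-CLEAR": {"signed": True, "encrypted": False, "readable_without_pem": True},
}

def describe(proc_type):
    """Return a one-line description of a Proc-Type specifier."""
    p = PROC_TYPES[proc_type]
    parts = ["signed"]
    if p["encrypted"]:
        parts.append("encrypted")
    if p["readable_without_pem"]:
        parts.append("readable with non-PEM software")
    return ", ".join(parts)
```

Note that every row has `signed` set: a PEM message is always signed, as the text says.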

Proc-Type: 4,ENCRYPTED
Content-Domain: RFC822
DEK-Info: DES-CBC,F8143EDE5960C597
Originator-ID-Symmetric: schneier@counterpane.com,,
Recipient-ID-Symmetric: schneier@chinet.com,ptf-kmc,3
Recipient-ID-Symmetric: pem-dev@tis.com,ptf-kmc,4

Figure 24.4 Example of an encapsulated message (symmetric case).

Proc-Type: 4,ENCRYPTED
Content-Domain: RFC822
DEK-Info: DES-CBC,BFF968AA74691AC1
Key-Info: RSA,

Page 482 of 666
Applied Cryptography: Second Edition - Bruce Schneier

Key-Info: RSA,

Figure 24.5 Example of an encapsulated encrypted message (asymmetric case).

The next field, “Content-Domain,” specifies the type of mail message. It has nothing to do with
security. The “DEK-Info” field gives information on the Data Exchange Key (DEK), the encryption
algorithm used to encrypt the text, and any parameters associated with the encryption algorithm.
Only DES in CBC mode is currently specified, or “DES-CBC.” The second subfield specifies the IV.
Other algorithms may be specified by PEM in the future; their use will be noted in DEK-Info and in
other fields that identify algorithms.
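
The header fields are plain text and easy to pull apart. Here is a hedged sketch (the function name is mine, not from any PEM library) that splits the “DEK-Info” field from Figure 24.4 into its algorithm and IV subfields:

```python
# Sketch: parsing the "DEK-Info" header field described above into its
# algorithm and IV subfields. The field layout follows the examples in the
# text; the function name is my own.
def parse_dek_info(line):
    """Split a 'DEK-Info:' header line into (algorithm, iv_bytes)."""
    name, _, value = line.partition(":")
    if name.strip() != "DEK-Info":
        raise ValueError("not a DEK-Info field")
    algorithm, _, iv_hex = value.strip().partition(",")
    return algorithm, bytes.fromhex(iv_hex)

alg, iv = parse_dek_info("DEK-Info: DES-CBC,F8143EDE5960C597")
# alg is the encryption algorithm; iv is the 8-byte DES-CBC initialization vector
```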

For messages with symmetric key management (see Figure 24.4), the next field is “Originator-ID-
Symmetric” with three subfields. The first subfield identifies the sender by a unique electronic mail
address. The second subfield is optional and identifies the authority that issued the interchange key.
The third is an optional Version/Expiration subfield.

Continuing with the symmetric key-management case, each recipient has two fields: “Recipient-ID-
Symmetric” and “Key-Info.” The “Recipient-ID-Symmetric” field has three subfields; these identify
the receiver in the same way that “Originator-ID-Symmetric” identified the sender.

The “Key-Info” field specifies the key-management parameters. This field has four subfields. The first
subfield gives the algorithm used to encrypt the DEK. Since the key management in this message is
symmetric, the sender and receiver have to share a common key. This is called the Interchange Key
(IK), which is used to encrypt the DEK. The DEK can be encrypted using either DES in ECB mode
(denoted by “DES-ECB”) or triple-DES (denoted by “DES-EDE”). The second subfield specifies the
MIC algorithm: either MD2 (denoted by “RSA-MD2”) or MD5 (denoted by “RSA-MD5”). The third
subfield, the DEK, and the fourth subfield, the MIC, are both encrypted with the IK.

Figures 24.5 and 24.6 show messages with public-key key management (called “asymmetric” in PEM
nomenclature). The headers are different. In ENCRYPTED messages, after the “DEK-Info” field
comes the “Originator-Certificate” field. The certificate follows the X.509 standard (see Section 24.9).
The next field is “Key-Info” with two subfields. The first subfield specifies the public-key algorithm
used to encrypt the DEK; currently only RSA is supported. The next subfield is the DEK, encrypted
in the originator's public key. This is an optional field, intended to permit the originator to decrypt
his own message in the event that it is returned by the mail system. The next field, “Issuer-Certificate,”
is the certificate of whoever signed the Originator-Certificate.

Continuing with the asymmetric key-management case, the next field is “MIC-Info.” The first
subfield gives the algorithm under which the MIC was computed. The second subfield shows the
algorithm under which the MIC was signed. The third subfield consists of the MIC, signed by the
sender's private key.

Proc-Type: 4,MIC-ONLY
Content-Domain: RFC822

Figure 24.6 Example of an encapsulated MIC-ONLY message (asymmetric case).

Still continuing with asymmetric key management, the next fields deal with the recipients. There are
two fields for each recipient: “Recipient-ID-Asymmetric” and “Key-Info.” The “Recipient-ID-
Asymmetric” field has two subfields. The first identifies the authority that issued the receiver's public
key; the second is an optional Version/Expiration subfield. The “Key-Info” field specifies the key
management parameters: The first subfield identifies the algorithm used to encrypt the message and
the second subfield is the DEK encrypted with the receiver™s public key.

Security of PEM

RSA keys in PEM can range from 508 bits to 1024 bits. This should be long enough for anyone's
security needs. A more likely attack would be against the key-management protocols. Mallory could
steal your private key (don't write it down anywhere) or attempt to fool you into accepting a bogus
public key. The key certification provisions of PEM make this unlikely if everyone follows proper
procedures, but people have been known to be sloppy.

A more insidious attack would be for Mallory to modify the PEM implementation running on your
system. This modified implementation could surreptitiously send Mallory all of your mail, encrypted
with his public key. It could even send him a copy of your private key. If the modified implementation
works well, you will never know what is happening.

There's no real way to prevent this kind of attack. You could use a one-way hash function and
fingerprint the PEM code. Then, each time you run it, you could check the fingerprint for
modification. But Mallory could modify the fingerprint code at the same time he modifies the PEM
code. You could fingerprint the fingerprint code, but Mallory could modify that as well. If Mallory
can get access to your machine, he can subvert the security of PEM.

The moral is that you can never really trust a piece of software if you cannot trust the hardware it is
running on. For most people, this kind of paranoia is unwarranted. For some, it is very real.


TIS/PEM

Trusted Information Systems, partially supported by the U.S. government Advanced Research
Projects Agency, has designed and implemented a reference implementation of PEM (TIS/PEM).
Developed for UNIX-based platforms, it has also been ported to VMS, DOS, and Windows.

Although the PEM specifications indicate a single certification hierarchy for use by the Internet,
TIS/PEM supports the existence of multiple certification hierarchies. Sites may specify a set of
certificates that are to be considered valid, including all certificates issued by them. A site need not
join the Internet hierarchy in order to use TIS/PEM.

TIS/PEM is currently available to all U.S. and Canadian organizations and citizens upon request. It
will be distributed in source code form. Interested parties should contact: Privacy-Enhanced Mail,
Trusted Information Systems, Inc., 3060 Washington Road (Rte. 97), Glenwood, MD 21738; (301)
854-6889; fax: (301) 854-5363; Internet: pem-info@tis.com.


RIPEM

RIPEM is a program, written by Mark Riordan, that implements the PEM protocols. Although
technically not public domain, the program is publicly available and can be used royalty-free for
personal, noncommercial applications. A license for its use is included with the documentation.

The code cannot be exported. Of course, U.S. government laws don't apply outside the United States,
and some people have ignored the export rules. RIPEM code is available on bulletin boards
worldwide. Something called RIPEM/SIG, which only does digital signatures, is exportable.

At this writing, RIPEM is not a complete implementation of the PEM protocols; it does not implement
certificates for authenticating keys.

Before writing RIPEM, Riordan wrote a similar program called RPEM. This was intended to be a
public-domain electronic-mail encryption program. To try to avoid patent issues, Riordan used
Rabin's algorithm (see Section 19.5). Public Key Partners claimed that their patents were broad
enough to cover all of public-key cryptography and threatened to sue; Riordan stopped distributing
the program.

RPEM isn't really used anymore. It is not compatible with RIPEM. Since RIPEM can be used with
the full blessing of Public Key Partners, there is no reason to use RPEM instead.

24.11 Message Security Protocol (MSP)

The Message Security Protocol (MSP) is the military equivalent of PEM. It was developed by the NSA
in the late 1980s under the Secure Data Network System (SDNS) program. It is an X.400-compatible
application-level protocol for securing electronic mail. MSP will be used for signing and encrypting
messages in the Department of Defense's planned Defense Message System (DMS) network.


The Preliminary Message Security Protocol (PMSP), to be used for “unclassified but sensitive”
messages, is a version of MSP adapted for use with both X.400 and TCP/IP. This protocol is also
called Mosaic.

Like PEM, MSP and PMSP software applications are flexible and designed to accommodate a variety
of algorithms for security functions including signing, hashing, and encryption. PMSP will work with
the Capstone chip (see Section 24.17).

24.12 Pretty Good Privacy (PGP)

Pretty Good Privacy (PGP) is a freeware electronic-mail security program, originally designed by
Philip Zimmermann [1652]. It uses IDEA for data encryption, RSA (with keys up to 2047 bits) for key
management and digital signatures, and MD5 as a one-way hash function.

PGP's random public keys use a probabilistic primality tester, and get their initial seeds from
measuring the user's keyboard latency while typing. PGP generates random IDEA keys using the
method delineated in ANSI X9.17, Appendix C (see Section 8.1) [55], with IDEA as the symmetric
algorithm instead of DES. PGP also encrypts the user's private key using a hashed pass phrase
instead of a password.

PGP-encrypted messages have layered security. The only thing a cryptanalyst can learn about an
encrypted message is who the recipient is, assuming he knows the recipient's key ID. Only after the
recipient decrypts the message does he learn who signed the message, if it is signed. Contrast this
approach with PEM, which leaves quite a bit of information about the sender, recipient, and message
in the unencrypted header.

The most interesting aspect of PGP is its distributed approach to key management (see Section 8.12).
There are no key certification authorities; PGP instead supports a “web of trust.” Every user
generates and distributes his own public key. Users sign each other's public keys, creating an
interconnected community of PGP users.

For example, Alice might physically give her public key to Bob. Bob knows Alice, so he signs her
public key. He then gives the signed key back to her and keeps a copy for himself. When Alice wants
to communicate with Carol, Alice sends Carol a copy of the key Bob signed. Carol, who already has
Bob's public key (she got it at some other time) and trusts Bob to certify other people's keys, verifies
his signature on Alice's key and accepts it as valid. Bob has introduced Alice to Carol.

PGP does not specify a policy for establishing trust; users are free to decide who they trust and who
they do not. PGP provides mechanisms for associating trust with public keys and for using trust. Each
user keeps a collection of signed public keys in a file called a public-key ring. Each key in the ring has
a key legitimacy field that indicates the degree to which the particular user trusts the validity of the
key. The higher the trust level, the more the user believes the key is legitimate. A signature trust field
measures how far the user trusts the signer to certify the public keys of other users. And finally, an
owner trust field indicates the degree to which the particular user trusts the key™s owner to sign other
public keys; this field is set manually by the user. PGP continuously updates these fields as users
supply new information.

Figure 24.7 shows how this model might look for a particular user, Alice. Alice's key is at the top, and
the owner trust value is ultimate trust. Alice has signed Bob's, Carol's, Dave's, Ellen's, and Frank's
keys. She trusts Bob and Carol to sign other people's public keys, and she partially trusts Dave and
Ellen to sign other people's public keys. And she trusts Gail to sign other people's public keys, even
though she has not signed Gail's key herself.


Two partially trusted signatures may be sufficient to certify a key. Alice believes that Kurt's key is
legitimate because both Dave and Ellen have signed it. This is not automatic in PGP; Alice can set her
own paranoia level.
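
The two-partial-signatures rule can be sketched as follows. This is a simplified model of PGP's actual computation; real PGP makes both thresholds configurable (the "paranoia level" just mentioned), and all names here are my own:

```python
# Simplified sketch of the key-legitimacy rule described above: a key is
# considered valid if it carries one fully trusted signature, or enough
# partially trusted ones. Real PGP makes both thresholds configurable;
# the function and constant names are my own, not PGP's.
FULL, PARTIAL, NONE = "full", "partial", "none"

def key_is_valid(signer_trusts, fulls_needed=1, partials_needed=2):
    """signer_trusts: trust levels Alice assigns to each signer of the key."""
    fulls = sum(1 for t in signer_trusts if t == FULL)
    partials = sum(1 for t in signer_trusts if t == PARTIAL)
    return fulls >= fulls_needed or partials >= partials_needed

# Kurt's key, signed by the partially trusted Dave and Ellen, is accepted:
assert key_is_valid([PARTIAL, PARTIAL])
# A single partially trusted signature is not enough:
assert not key_is_valid([PARTIAL])
```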

Just because Alice believes a key to be valid, she does not have to trust it to sign other people's keys.
She does not trust Frank to sign other people's public keys, even though she signed his key herself.
And she does not trust Ivan's signature on Martin's key, or Kurt's signature on Nancy's key.

Owen's key doesn't fit into the web anywhere; perhaps Alice got it from a key server. PGP does not
assume that the key is valid; Alice must either declare the key valid or decide to trust one of the key's
signers.

Of course, nothing prevents Alice from using keys she does not trust. PGP's job is to alert Alice that
the key is not trusted, not to prevent communications.

The weakest link of this whole system is key revocation: It is impossible to guarantee that no one will
use a compromised key. If Alice's private key is stolen she can send out something called a key
revocation certificate, but since key distribution is ad hoc and largely word of mouth there is no
guarantee that it will reach everyone who has her public key on his key ring. And since Alice has to
sign the key revocation certificate with her private key, if she loses the key altogether she cannot
revoke it.

Figure 24.7 PGP trust model.

The current version of PGP is 2.6.2. A new version of PGP, PGP 3.0, is scheduled for release by the
end of 1995. Changes in 3.0 include options for triple-DES, SHA, and other public-key algorithms, a
split of the encryption and signature public-key/private-key key pairs, enhanced procedures for key
revocation, improved key-ring management functions, an API for integrating PGP in other programs,
and a completely rewritten code base.

PGP is available for MS-DOS, UNIX, Macintosh, Amiga, and Atari. It is free for personal,
noncommercial use, and is available from many ftp sites on the Internet. To ftp PGP from MIT, telnet
to net-dist.mit.edu, log in as getpgp, answer the questions, then ftp to net-dist.mit.edu and change to
the directory named in the telnet session. It is also available from ftp.ox.ac.uk, ftp.dsi.unimi.it,
ftp.funet.fi, ftp.demon.co.uk, Compuserve, AOL, and elsewhere. For U.S. commercial users, PGP can
be bought (complete with licenses) for about $100 from a company called ViaCrypt, 9033 N 24th
Ave., Phoenix, AZ, 85021; (602) 944-0773; viacrypt@acm.org. Several shareware front-ends are
available to help integrate PGP into MS-DOS, Microsoft Windows, Macintosh, and UNIX.

There are several books about PGP [601, 1394, 1495]. The source code has even been published in
book form [1653] in an attempt to frustrate the U.S. Department of State, which continues to maintain
that source code is exportable on paper but not electronically. Assuming you trust IDEA, PGP is the
closest you're likely to get to military-grade encryption.

24.13 Smart Cards

A smart card is a plastic card, the size and shape of a credit card, with an embedded computer chip.
It's an old idea (the first patents were filed 20 years ago), but practical limitations made them
feasible only five or so years ago. Since then they have taken off, mostly in Europe. Many countries
use smart cards for pay telephones. There are also smart credit cards, smart cash cards, smart
everything cards. The U.S. credit-card companies are looking at the technology, and within a few
years even backwards Americans will have smart cards in their wallets.

A smart card contains a small computer (usually an 8-bit microprocessor), RAM (about a quarter
kilobyte), ROM (about 6 or 8 kilobytes), and either EPROM or EEPROM (a few kilobytes). Future-
generation smart cards will undoubtedly have more capacity, but some physical limitations on smart
cards make expansion difficult. The card has its own operating system, programs, and data. (What it
doesn't have is power; that comes when the card is plugged into a reader.) And it is secure. In a
world where you might not trust someone else's computer or telephone or whatever, you can still trust
a card that you keep with you in your wallet.

Smart cards can have different cryptographic protocols and algorithms programmed into them. They
might be configured as an electronic purse, and be able to spend and receive digital cash. They may
be able to perform zero-knowledge authentication protocols; they may have their own encryption
keys. They might be able to sign documents, or unlock applications on a computer.

Some smart cards are assumed to be tamperproof; this often protects the institution that issues the
cards. A bank wouldn't want you to be able to hack their smart card to give yourself more money.

There is a lot of interest in smart cards, and a lot of information about them is available. A good
survey article on the cryptography in smart cards is [672]. CARTES is a conference held in Paris
every October; and CardTech is held in Washington, D.C. every April. The proceedings of two other
smart-card conferences are [342, 382]. There are hundreds of smart-card patents, mostly owned by
European companies. An interesting paper on possible future applications (integrity checking, audit
trails, copy protection, digital cash, secure postage meters) is [1628].

24.14 Public-Key Cryptography Standards (PKCS)

The Public-Key Cryptography Standards (PKCS) are RSA Data Security, Inc.'s attempt to provide
an industry-standard interface for public-key cryptography. Traditionally, this sort of thing would be
handled by ANSI, but, considering the current situation in cryptography politics, RSADSI figured
that they had better do it on their own. Working with a variety of companies, they developed a series
of standards. Some are compatible with other standards and some are not.

These are not standards in the traditional sense of the word; no standards body convened and voted
on PKCS. According to its own materials, RSADSI will “retain sole decision-making authority on
what each standard is” and will “publish revised standards when appropriate” [803].

Even so, there is a lot of good stuff here. If you're not sure what kind of syntax and data structures to
use when programming public-key cryptography, these standards are probably as good as anything
else you can come up with. And, since they're not really standards, you can tailor them to suit your
needs.

Following is a short description of each PKCS (PKCS #2 and PKCS #4 have been incorporated into
PKCS #1).

PKCS #1 [1345] describes a method for RSA encryption and decryption, primarily for constructing
the digital signatures and digital envelopes described in PKCS #7. For digital signatures, the message
is hashed and then the hash is encrypted with the private key of the signer. Both message and hash
are represented together as detailed in PKCS #7. For digital envelopes (encrypted messages), the
message is first encrypted with a symmetric algorithm, and then the message key is encrypted with the
public key of the recipient. The encrypted message and encrypted key are represented together
according to the syntax of PKCS #7. Both of these methods are compatible with PEM standards.


PKCS #1 also describes a syntax, identical to the syntax in X.509 and PEM, for RSA public and
private keys, and three signature algorithms (MD2 and RSA, MD4 and RSA, and MD5 and RSA) for
signing certificates and the like.

PKCS #3 [1346] describes a method for implementing Diffie-Hellman key exchange.

PKCS #5 [1347] describes a method for encrypting messages with a secret key derived from a
password. It uses either MD2 or MD5 to derive the key from the password, and encrypts with DES in
CBC mode. The method is intended primarily to encrypt private keys when transferring them from
one computer system to another, but can be used to encrypt messages.
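
The derivation just described can be sketched with the PBKDF1-style construction from PKCS #5: hash the password and salt, rehash the result a number of times, and slice key material off the final digest. The parameter values below are illustrative choices of mine, not mandated by the standard:

```python
import hashlib

# Sketch of the PKCS #5 password-based key derivation idea described above
# (the PBKDF1 construction): hash the password and salt, iterate the hash,
# then take the leading bytes as key material. Parameters are illustrative.
def pbkdf1_md5(password, salt, iterations, dklen):
    if dklen > 16:
        raise ValueError("MD5-based PBKDF1 can derive at most 16 bytes")
    t = hashlib.md5(password + salt).digest()
    for _ in range(iterations - 1):
        t = hashlib.md5(t).digest()
    return t[:dklen]

# Derive an 8-byte DES key and an 8-byte CBC IV from a password:
material = pbkdf1_md5(b"correct horse", b"\x01\x02\x03\x04\x05\x06\x07\x08", 1000, 16)
des_key, iv = material[:8], material[8:]
```

The iteration count slows down dictionary attacks against the password; the salt keeps identical passwords from deriving identical keys.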

PKCS #6 [1348] describes a standard syntax for public key certificates. The syntax is a superset of an
X.509 certificate, so that X.509 certificates can be extracted if necessary. Over and above the X.509
set, additional attributes extend the certification process beyond just the public key. These include
other information, such as an electronic mail address.

PKCS #7 [1349] is a general syntax for data that may be encrypted or signed, such as digital
envelopes or digital signatures. The syntax is recursive, so that envelopes can be nested, or someone
can sign some previously encrypted data. The syntax also allows other attributes, such as timestamps,
to be authenticated along with the message content. PKCS #7 is compatible with PEM so that signed
and encrypted messages can be converted to PEM messages without any cryptographic operations,
and vice versa. PKCS #7 can support a variety of architectures”PEM is one”for certificate-based
key management.

PKCS #8 [1350] describes a syntax for private-key information (including a private key and a set of
attributes) and a syntax for encrypted private keys. PKCS #5 can be used to encrypt the private key.

PKCS #9 [1351] defines selected attribute types for PKCS #6 extended certificates, PKCS #7 digitally
signed messages, and PKCS #8 private-key information.

PKCS #10 [1352] describes a standard syntax for certification requests. A certification request
comprises a distinguished name, a public key, and (optionally) a set of attributes, collectively signed
by the person requesting certification. Certification requests are sent to a certification authority, who
transforms the request into either an X.509 public-key certificate or a PKCS #6 extended certificate.

PKCS #11 [1353], the Cryptographic Token API Standard, specifies a programming interface called
“Cryptoki” for portable cryptographic devices of all kinds. Cryptoki presents a common logical
model, enabling applications to perform cryptographic operations on portable devices without
knowing details of the underlying technology. The standard also defines application profiles: sets of
algorithms that a device may support.

PKCS #12 [1354] describes syntax for storing in software a user's public keys, protected private keys,
certificates, and other related cryptographic information. The goal is to standardize on a single key
file for use among a variety of applications.

These standards are comprehensive, but not exhaustive. Many things are outside their scope: the
problem of naming, noncryptographic issues surrounding certification, key lengths, and conditions on
various parameters. What the PKCS provide are a format for transferring data based on public-key
cryptography and an infrastructure to support that transfer.

24.15 Universal Electronic Payment System (UEPS)


The UEPS is a smart-card banking application initially developed for rural South Africa, but later
adopted by all of that country's major banking groups. About 2 million cards were issued in that
country by early 1995. It has also been adopted in Namibia and is being deployed by at least one
bank in Russia.

The system provides a secure debit card suitable for regions where poor telephone service makes
on-line verification impossible. Both customers and merchants have cards; customers can use their cards
to transfer money to merchants. Merchants can then take their cards to a telephone and deposit the
money in their bank account; customers can take their cards to a telephone and have money moved
onto their card. There is no intention to provide anonymity, only to prevent fraud.

Here is the communications protocol between customer Alice and merchant Bob. (Actually, Alice and
Bob just plug their cards into a machine and wait for it to complete the transaction.) When Alice first
gets her card, she is given a key pair, K1 and K2; the bank calculates them from her name and some
secret function. Only the merchant cards have the secrets necessary to work out these customer keys.

(1) Alice sends Bob her name, A, his name, B, and a random number, RA, encrypted using
DES: first with K2 and then with K1. She also sends her name in the clear.
A, EK1(EK2(A, B, RA))
(2) Bob calculates K1 and K2 from Alice's name. He decrypts the message, confirms that A
and B are correct, then encrypts Alice's unencrypted message with K2.
EK2(A, B, RA)

Bob does not send this message to Alice; 56 bits of the ciphertext become K3. Bob then sends
Alice his name, her name, and another random number, RB, encrypted using DES: first with K3
and then with K1.
EK1(EK3(B, A, RB))
(3) Alice computes K3 in the same manner Bob did. She decrypts Bob's message, confirms
that B and A are correct, then encrypts Bob's unencrypted message with K3.
EK3(B, A, RB)

Alice does not send this message to Bob; 56 bits of the ciphertext become K4. Alice then sends
Bob her name, his name, and the digital check, C. This check contains the names of the sender
and recipient, a date, a check number, an amount, and two MACs, all encrypted using DES:
first with K4 and then with K1. One of the MACs can be verified by Alice's bank, and the other
can only be verified by the clearing center. Alice debits her account by the correct amount.
EK1(EK4(A, B, C))
(4) Bob computes K4 in the same manner Alice did. Assuming all the names match and
the check is correctly formed, he accepts it for payment.

A really clever thing about this protocol is that the encryption key for each message depends on the
previous message. Each message doubles as an authenticator for all previous messages. This means
that someone can't replay an old message; the receiver could never decrypt it. I am impressed with
this idea and expect that it will see wider use once it becomes widely known.
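
The chaining idea can be sketched as follows. Since this is only an illustration of the structure, a keyed hash stands in for DES and all names are mine; the real protocol takes 56 bits of a DES ciphertext as the next key:

```python
import hmac, hashlib

# Sketch of the UEPS chaining idea described above: the key for each message
# is derived from the ciphertext of the previous message, so every message
# implicitly authenticates everything before it. The real protocol encrypts
# under DES and takes 56 bits of ciphertext as the next key; here a keyed
# hash stands in for the block cipher, so this shows the structure only.
def derive_next_key(key, message):
    """Stand-in for 'encrypt message under key, take 56 bits as next key'."""
    return hmac.new(key, message, hashlib.sha256).digest()[:7]  # 56 bits

k1 = b"shared!"                       # stand-in for the interchange key K1
k3 = derive_next_key(k1, b"A,B,RA")   # both sides compute K3 from message 1
k4 = derive_next_key(k3, b"B,A,RB")   # both sides compute K4 from message 2

# A replayed or altered earlier message yields a different key chain, so
# the final check message will not decrypt:
assert derive_next_key(k1, b"A,B,RX") != k3
```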

Another clever thing about this protocol is that it enforces correct implementation. If the application
developer doesn't implement this protocol correctly, it just won't work.

Both cards store records of every transaction. When the cards eventually go online to communicate
with the bank (the merchant to deposit his money and the customer to get more money), the bank
uploads these records for auditing purposes.

Tamperproof hardware prevents either participant from messing with the data; Alice cannot change
the value of her card. Extensive audit trails provide data to identify and prosecute fraudulent
transactions. There are universal secrets in the cards (MAC keys in the customer cards, functions to
convert customer names to K1 and K2 in the merchant cards), but these are assumed to be difficult to
reverse-engineer.
This scheme is not meant to be perfect, only more secure than either paper checks or traditional debit
cards. The threat of fraud is not from rival militaries, but from opportunistic customers and
merchants. UEPS protects against that kind of abuse.

The message exchange is an excellent example of a robust protocol: Every message names both
parties, includes unique information to ensure freshness, and depends explicitly on all the messages
that came before it.

24.16 Clipper

The Clipper chip (also known as the MYK-78T) is an NSA-designed, tamper-resistant VLSI chip
designed for encrypting voice conversations; it is one of the two chips that implements the U.S.
government™s Escrowed Encryption Standard (EES) [1153]. VLSI Technologies, Inc. manufactures
the chip, and Mykotronx, Inc. programs it. Initially, the Clipper chip will be available in the AT&T
Model 3600 Telephone Security Device (see Section 24.18). The chip implements the Skipjack
encryption algorithm (see Section 13.12), an NSA-designed classified secret-key encryption algorithm,
in OFB only.

The most controversial aspect of the Clipper chip, and the entire EES, is the key-escrow protocol (see
Section 4.14). Each chip has a special key, not needed for messages. This key is used to encrypt a copy
of each user™s message key. As part of the synchronization process, the sending Clipper chip generates
and sends a Law Enforcement Access Field (LEAF) to the receiving Clipper chip. The LEAF contains
a copy of the current session key, encrypted with a special key (called the unit key). This allows a
government eavesdropper to recover the session key, and then recover the plaintext of the
conversation.

According to the director of NIST [812]:

A “key-escrow” system is envisioned that would ensure that the “Clipper Chip” is used to
protect the privacy of law-abiding Americans. Each device containing the chip will have
two unique “keys,” numbers that will be needed by authorized government agencies to
decode messages encoded by the device. When the device is manufactured, the two keys
would be deposited separately in two “key-escrow” databases established by the attorney
general. Access to these keys would be limited to government officials with legal
authorization to conduct a wiretap.

The government also wants to encourage the sale of telephones with these devices abroad; no one
knows what might happen to those key-escrow databases.


Politics aside, the internal structure of the LEAF is worth discussing [812, 1154, 1594, 459, 107, 462].
The LEAF is a 128-bit string containing enough information to allow law enforcement to recover the
session key, KS, assuming the two escrow agencies in charge of those key-escrow databases cooperate.
The LEAF contains a 32-bit unit identifier, U, unique to the Clipper chip. It also contains the current
80-bit session key encrypted with the chip™s unique unit key, KU, and a 16-bit checksum, C, called an
escrow identifier. This checksum is a function of the session key, the IV, and possibly other
information. These three fields are encrypted with a fixed family key, KF, shared by all interoperable
Clipper chips. The family key, the encryption modes used, the details of the checksum, and the exact
structure of the LEAF are all secret. It probably looks something like this:

EKF(U, EKU(KS), C)

KU is programmed into Clipper chips at the factory. This key is then split (see Section 3.6) and stored
in two different key-escrow databases, guarded by two different escrow agencies.
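
The XOR splitting works like this sketch (function names mine): either share alone looks random, and only the XOR of both recovers KU.

```python
import secrets

# Sketch of the XOR secret splitting described above (Section 3.6): the unit
# key KU is split into two shares, one per escrow agency. Each share alone
# is an unbiased random string; XORing both shares recovers KU.
def split_key(ku):
    share1 = secrets.token_bytes(len(ku))
    share2 = bytes(a ^ b for a, b in zip(ku, share1))
    return share1, share2

def recombine(share1, share2):
    return bytes(a ^ b for a, b in zip(share1, share2))

ku = secrets.token_bytes(10)      # an 80-bit unit key
s1, s2 = split_key(ku)
assert recombine(s1, s2) == ku    # both escrow agencies together recover KU
```

This is exactly the recombination step Eve performs after presenting her court orders to the two agencies.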

For Eve to recover KS from the LEAF, she first has to decrypt the LEAF with KF and recover U. Then
she has to take a court order to each escrow agency, who each return half of KU for the given U. Eve
XORs the two halves together to recover KU, then she uses KU to recover KS, and KS to eavesdrop on
the conversation.

The checksum is designed to prevent someone from circumventing this scheme; the receiving Clipper
chip won't decrypt if the checksum doesn't check. However, there are only 2^16 possible checksum
values, and a bogus LEAF with the right checksum but the wrong key can be found in about 42
minutes [187]. This isn™t much help for Clipper voice conversations. Because the key exchange
protocol is not part of the Clipper chip, the 42-minute brute-force attack must occur after key
exchange; it cannot be done before making the telephone call. This attack may work for facsimile
transmission or with the Fortezza card (see Section 24.17).
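
The arithmetic behind the attack is simple: a 16-bit checksum has only 65,536 possible values, so random bogus LEAFs collide with any target checksum after about 2^16 trials on average. A toy version, with a truncated SHA-256 standing in for the secret checksum function:

```python
import hashlib
import itertools

def toy_checksum(data: bytes) -> int:
    # Stand-in for the secret 16-bit LEAF checksum function.
    return int.from_bytes(hashlib.sha256(data).digest()[:2], "big")

target = toy_checksum(b"a genuine LEAF")

# Try bogus LEAFs until one happens to carry the right 16-bit checksum.
for i in itertools.count():
    bogus = i.to_bytes(8, "big")
    if toy_checksum(bogus) == target:
        break

# On average this succeeds after about 2**16 = 65,536 trials.
assert toy_checksum(bogus) == target
assert bogus != b"a genuine LEAF"
```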

Supposedly, the Clipper chip will resist reverse-engineering by “a very sophisticated, well-funded
adversary” [1154], but rumors are that Sandia National Laboratories successfully reverse-engineered
one. Even if those rumors aren't true, I suspect that the largest chip manufacturers in the world can
reverse-engineer Clipper; it's just a matter of time before someone with the right combination of
resources and ethics comes along.

Enormous privacy issues are associated with this scheme. Numerous civil liberty advocacy groups are
actively campaigning against any key-escrow mechanism that gives the government the right to
eavesdrop on citizens. But the sneaky thing is that this idea never went through Congress; NIST
published the Escrowed Encryption Standard as a FIPS [1153], bypassing that irritating legislative
process. Right now it looks like the EES is dying a slow and quiet death, but standards have a way of
creeping up on you.

Anyway, Table 24.2 lists the different agencies participating in this program. Anyone want to do a
threat analysis on having both escrow agents in the executive branch? Or on having escrow agents
who really don™t know anything about the wiretap requests, and can do no more than blindly approve
them? Or on having the government impose a secret algorithm as a commercial standard?

In any case, implementing Clipper raises enough problems to question its value in court. Remember,
Clipper only works in OFB mode. Despite what you may have been told to the contrary, this does not
provide integrity or authentication. Imagine that Alice is on trial, and a Clipper-encrypted telephone
call is part of the evidence. Alice claims that she never made the call; the voice is not hers. The
phone's compression algorithm is so bad that it is hard to recognize Alice's voice, but the prosecution
argues that since only Alice's escrowed key will decipher the call, it must have been made from her
phone.

Alice argues that the call was forged like so [984, 1339]: Given the ciphertext and the plaintext, it is
possible to XOR them to get the keystream. This keystream can then be XORed with an entirely
different plaintext to form a forged ciphertext, which can then be converted to forged plaintext when
fed into the Clipper decryptor. True or not, this argument could easily put enough doubt in a jury™s
mind to disregard the telephone call as evidence.
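
Alice's forgery argument is just the malleability of any XOR stream cipher, which OFB mode is. A minimal demonstration, with a fixed byte string standing in for the real OFB keystream:

```python
def xor(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

# Eve has one known plaintext/ciphertext pair from an OFB-encrypted call.
plaintext  = b"meet me at noon "
keystream  = bytes(range(16))            # stand-in for the real OFB keystream
ciphertext = xor(plaintext, keystream)

# XOR the pair to recover the keystream...
recovered = xor(ciphertext, plaintext)
assert recovered == keystream

# ...then XOR it with any plaintext of the same length to forge a ciphertext
# that decrypts, under the genuine keystream, to the forged plaintext.
forged = xor(b"send me the gold", recovered)
assert xor(forged, keystream) == b"send me the gold"
```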

Another attack, called the Squeeze attack, allows Alice to frame Bob. Here's how [575]: Alice calls
Bob using Clipper. She saves a copy of his LEAF as well as the session key. Then, she calls Carol (who
she knows is being wiretapped). During the key setup, Alice forces the session key to be identical to
the one she used with Bob; this requires hacking the phone, but it is not hard. Then, instead of
sending her LEAF she sends Bob's. It's a valid LEAF, so Carol's phone will not notice. Now she can
say whatever she wants to Carol; when the police decrypt the LEAF, they will find that it is Bob's.
Even if Bob wasn't framed by Alice, the mere fact that he can claim this in court undermines the
purpose of the scheme.

The law enforcement authorities of the United States should not be in the business of collecting
information in criminal investigations that is useless in court. Even if key escrow were a good idea,
Clipper is a bad way of implementing it.

24.17 Capstone

Capstone (also known as the MYK-80) is the other NSA-developed VLSI cryptographic chip that
implements the U.S. government's Escrowed Encryption Standard [1153]. Capstone includes the
following functions [1155, 462]:

Table 24.2
EES Participating Agencies
Justice—System Sponsor and Family Key Agent
NIST—Program Manager and Escrow Agent
FBI—Decrypt User and Family Key Agent
Treasury—Escrow Agent
NSA—Program Developer

— The Skipjack algorithm in any of the four basic modes: ECB, CBC, CFB, and OFB.
— A public-key Key Exchange Algorithm (KEA), probably Diffie-Hellman.
— The Digital Signature Algorithm (DSA).
— The Secure Hash Algorithm (SHA).
— A general-purpose exponentiation algorithm.
— A general-purpose random-number generator that uses a pure noise source.

Capstone provides the cryptographic functionality needed for secure electronic commerce and other
computer-based applications. The first application of Capstone is in a PCMCIA card called Fortezza.
(It was originally called Tessera until a company called Tessera, Inc. complained.)

NSA had considered lengthening Capstone™s LEAF checksum in production versions for use in
Fortezza cards, in order to foil the brute-force attack against the LEAF previously discussed. Instead,
they added a feature that reset the card after 10 incorrect LEAFs. This only increases the time
required to find a fake but valid LEAF by 10 percent, to 46 minutes. I am not impressed.

24.18 AT&T Model 3600 Telephone Security Device (TSD)

The AT&T Telephone Security Device (TSD) is the Clipper phone. Actually, there are four models of
the TSD. One contains the Clipper chip, another contains an exportable proprietary AT&T
encryption algorithm, the third contains a proprietary algorithm for domestic use plus the exportable
algorithm, and the fourth contains the Clipper, domestic, and exportable algorithms.

TSDs use a different session key for each telephone call. A pair of TSDs generate a session key using
Diffie-Hellman key exchange, independent of the Clipper chip. Since Diffie-Hellman incorporates no
authentication, the TSD has two methods to thwart a man-in-the-middle attack.
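
A toy version of the Diffie-Hellman exchange the TSDs perform, using a small illustrative prime (a real implementation would use a much larger modulus):

```python
import secrets

# Public parameters; this 32-bit prime is for illustration only.
p = 4294967291
g = 5

a = secrets.randbelow(p - 2) + 1   # Alice's secret exponent
b = secrets.randbelow(p - 2) + 1   # Bob's secret exponent

A = pow(g, a, p)                   # exchanged in the clear
B = pow(g, b, p)

# Each side raises the other's public value to its own secret exponent.
alice_key = pow(B, a, p)
bob_key   = pow(A, b, p)
assert alice_key == bob_key        # the shared session key
```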

The first is a screen. The TSD hashes the session key and displays that hash on a small screen as four
hex digits. The conversants should confirm that their screens show the same digits. The voice quality
is good enough that they can recognize each other's voice.
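
The screen check amounts to hashing the shared key and truncating to four hex digits. The TSD's actual hash function is not specified here, so SHA-1 and the function name below are assumptions:

```python
import hashlib

def confirmation_digits(session_key: bytes) -> str:
    """Hash the session key and keep four hex digits for the display.
    (SHA-1 is an assumption; the TSD's real hash is unspecified.)"""
    return hashlib.sha1(session_key).hexdigest()[:4].upper()

# Both parties compute this from their own copy of the session key;
# matching displays indicate no man-in-the-middle swapped the keys.
digits = confirmation_digits(b"\x01" * 10)
assert len(digits) == 4
```

Note the limits of a 16-bit check: as with the LEAF checksum, an attacker who can grind through about 2^16 candidates can find a colliding value, which is why the attack described next is plausible.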

Eve still has a possible attack. Imagine her in the middle of Alice and Bob's conversation. She uses
one TSD on the line with Alice and a modified TSD on the line with Bob; in the middle she bridges the
two phone calls. Alice tries to go secure. She generates a key as normal, except that Eve is acting as
Bob. Eve recovers the key, and using the modified TSD, forces the key she generates with Bob to have
the same hash value. This attack may not sound very likely, but the TSD uses a variant of the
interlock protocol to prevent it.

The TSD generates random numbers using a noise source and a chaotic amplifier with digital
feedback. This generates a bit stream, which is fed through a post-whitening filter using the digital
signal processor.

Despite all of this, the TSD manual does not mention security at all. In fact, it says [70]:

AT&T makes no warranty that the TSD will prevent cryptanalytic attack on any
encrypted transmission by any government agency, its agents, or any third party.
Furthermore, AT&T makes no warranty that the TSD will prevent any attack on any
communication by methods which bypass encryption.

Chapter 25
25.1 National Security Agency (NSA)

The NSA is the National Security Agency (once called “No Such Agency” or “Never Say Anything,”
but they've been more open recently), the official security body of the U.S. government. President
Harry Truman created the agency in 1952 under the Department of Defense, and for many years its
very existence was kept secret. The NSA is concerned with signals intelligence; its mandate is to listen
in on and decode all foreign communications of interest to the security of the United States.

The following paragraphs are excerpted from NSA™s original charter, signed by President Truman in
1952, and classified for many years thereafter [1535]:


The COMINT mission of the National Security Agency (NSA) shall be to provide an
effective, unified organization and control of the communications intelligence activities of
the United States conducted against foreign governments, to provide for integrated
operational policies and procedures pertaining thereto. As used in this directive, the terms
“communications intelligence” or “COMINT” shall be construed to mean all procedures
and methods used in the interception of communications other than foreign press and
propaganda broadcasts and the obtaining of information from such communications by
other than intended recipients, but shall exclude censorship and the production and
dissemination of finished intelligence.

The special nature of COMINT activities requires that they be treated in all respects as
being outside the framework of other or general intelligence activities. Orders, directives,
policies, or recommendations of any authority of the Executive Branch relating to the
collection, production, security, handling, dissemination, or utilization of intelligence,
and/or classified material, shall not be applicable to COMINT activities, unless specifically
so stated and issued by competent department or agency authority represented on the
Board. Other National Security Council Intelligence Directives to the Director of Central
Intelligence and related implementing directives issued by the Director of Central
Intelligence shall be construed as non-applicable to COMINT activities, unless the
National Security Council has made its directive specifically applicable to COMINT.

NSA conducts research in cryptology, both in designing secure algorithms to protect U.S.
communications and in designing cryptanalytic techniques to listen in on non-U.S. communications.
The NSA is known to be the largest employer of mathematicians in the world; it is also the largest
purchaser of computer hardware in the world. The NSA probably possesses cryptographic expertise
many years ahead of the public state of the art (in algorithms, but probably not in protocols) and can
undoubtedly break many of the systems used in practice. But, for reasons of national security, almost
all information about the NSA—even its budget—is classified. (Its budget is rumored to be $13 billion
per year—including military funding of NSA projects and personnel—and it is rumored to employ
16,000 people.)

The NSA uses its power to restrict the public availability of cryptography, so as to prevent national
enemies from employing encryption methods too strong for the NSA to break. James Massey discusses
this struggle between academic and military research in cryptography [1007]:

If one regards cryptology as the prerogative of government, one accepts that most
cryptologic research will be conducted behind closed doors. Without doubt, the number of
workers engaged today in such secret research in cryptology far exceeds that of those
engaged in open research in cryptology. For only about 10 years has there in fact been
widespread open research in cryptology. There have been, and will continue to be,
conflicts between these two research communities. Open research is a common quest for
knowledge that depends for its vitality on the open exchange of ideas via conference
presentations and publications in scholarly journals. But can a government agency,
charged with responsibilities of breaking the ciphers of other nations, countenance the
publication of a cipher that it cannot break? Can a researcher in good conscience publish
such a cipher that might undermine the effectiveness of his own government™s code-
breakers? One might argue that publication of a provably secure cipher would force all
governments to behave like Stimson's “gentlemen,” but one must be aware that open
research in cryptography is fraught with political and ethical considerations of a severity
more than in most scientific fields. The wonder is not that some conflicts have occurred
between government agencies and open researchers in cryptology, but rather that these
conflicts (at least those of which we are aware) have been so few and so mild.


James Bamford wrote a fascinating book about the NSA: The Puzzle Palace [79], recently updated by
Bamford and Wayne Madsen [80].

The Commercial COMSEC Endorsement Program (CCEP)

The Commercial COMSEC Endorsement Program (CCEP), codenamed Overtake, is a 1984 NSA
initiative to facilitate the development of computer and communications products with embedded
cryptography [85,1165]. The military had always paid for this kind of thing for themselves, and it was
very expensive. The NSA figured that if companies could sell equipment to both the military and to
corporate users, even overseas, costs would go down and everyone would benefit. They would no
longer endorse equipment as complying with Federal Standard 1027, and then CCEP would provide
government-endorsed cryptographic equipment [419].

NSA developed a series of cryptographic modules for different purposes. Different algorithms would
be used in the modules for different applications, and manufacturers would be able to pull one
module out and plug in another depending on the customer. There were modules for military use
(Type I), modules for “unclassified but sensitive” government use (Type II), modules for corporate
use (Type III), and modules for export (Type IV). Table 25.1 summarizes the different modules,
applications, and names.

This program is still around, but never became popular outside the government. All the modules were
tamperproof, all the algorithms were classified, and you had to get your keys from NSA. Corporations
never really bought into the idea of using classified algorithms dictated by the government. You'd
think the NSA would have learned from this lesson and not even bothered with Clipper, Skipjack,
and escrowed encryption chips.

25.2 National Computer Security Center (NCSC)

The National Computer Security Center, a branch of the NSA, is responsible for the government's
trusted computer program. Currently, the center evaluates commercial security products (both
hardware and software), sponsors and publishes research, develops technical guidelines, and
generally provides advice, support, and training.

The NCSC publishes the infamous “Orange Book” [465]. Its actual title is the Department of Defense
Trusted Computer System Evaluation Criteria, but that's a mouthful to say and the book has an orange
cover. The Orange Book attempts to define security requirements, gives computer manufacturers an
objective way to measure the security of their systems, and guides them as to what to build into their
secure products. It focuses on computer security and doesn™t really say a lot about cryptography.

The Orange Book defines four broad divisions of security protection. It also defines classes of
protection within some of those divisions. They are summarized in Table 25.2.

Table 25.1
CCEP Modules
Application Type I Type II
Voice/low-speed data Winster Edgeshot
Computer Tepache Bulletproof
High-speed data Foresee Brushstroke
Next Generation Countersign I Countersign II

Sometimes manufacturers say things like “we have C2 security.” This is what they're talking about.
For more information on this, read [1365]. The computer security model used in these criteria is
called the Bell-LaPadula model [100,101,102,103].

The NCSC has published a whole series of books on computer security, sometimes called the Rainbow
Books (all the covers have different colors). For example, Trusted Network Interpretation of the Trusted
Computer System Evaluation Criteria [1146], sometimes called the “Red Book,” interprets the Orange
Book for networks and network equipment. The Trusted Database Management System Interpretation
of the Trusted Computer System Evaluation Criteria [1147]—I can't even begin to describe the color of
that cover—does the same for databases. There are now over 30 of these books, some with hideously
colored covers.

For a complete set of the Rainbow Books, write Director, National Security Agency, INFOSEC
Awareness, Attention: C81, 9800 Savage Road, Fort George G. Meade, MD 20755-6000; (410) 766-
8729. Don't tell them I sent you.

25.3 National Institute of Standards and Technology (NIST)

The NIST is the National Institute of Standards and Technology, a division of the U.S. Department of
Commerce. Formerly the NBS (National Bureau of Standards), it changed its name in 1988. Through
its Computer Systems Laboratory (CSL), NIST promotes open standards and interoperability that it
hopes will spur the economic development of computer-based industries. To this end, NIST issues
standards and guidelines that it hopes will be adopted by all computer systems in the United States.
Official standards are published as FIPS (Federal Information Processing Standards) publications.

If you want copies of any FIPS (or any other NIST publication), contact National Technical
Information Service (NTIS), U.S. Department of Commerce, 5285 Port Royal Road, Springfield, VA
22161; (703) 487-4650; or visit gopher://csrc.ncsl.nist.gov.

When Congress passed the Computer Security Act of 1987, NIST was mandated to define standards
for ensuring the security of sensitive but unclassified information in government computer systems.
(Classified information and Warner Amendment data are under the jurisdiction of the NSA.) The Act
authorizes NIST to work with other government agencies and private industry in evaluating proposed
technology standards.

Table 25.2
Orange Book Classifications

D: Minimal Security
C: Discretionary Protection
C1: Discretionary Security Protection
C2: Controlled Access Protection
B: Mandatory Protection
B1: Labeled Security Protection
B2: Structured Protection
B3: Security Domains
A: Verified Protection
A1: Verified Design


NIST issues standards for cryptographic functions. U.S. government agencies are required to use
them for sensitive but unclassified information. Often the private sector adopts these standards as
well. NIST issued DES, DSS, SHS, and EES.

All these algorithms were developed with some help from the NSA, ranging from analyzing DES to
designing DSS, SHS, and the Skipjack algorithm in EES. Some people have criticized NIST for
allowing the NSA to have too much control over these standards, since the NSA's interests may not
coincide with those of NIST. It is unclear how much actual influence NSA has on the design and
development of the algorithms. Given NIST's limited staff, budget, and resources, NSA's involvement
is probably considerable. NSA has significant resources to contribute, including a computer facility.

The official “Memorandum of Understanding” (MOU) between the two agencies reads:


Recognizing that:

A. Under Section 2 of the Computer Security Act of 1987 (Public Law 100-
235), (the Act), the National Institute of Standards and Technology (NIST) has the
responsibility within the Federal Government for:
1. Developing technical, management, physical, and administrative standards
and guidelines for the cost-effective security and privacy of sensitive information in
Federal computer systems as defined in the Act; and,
2. Drawing on the computer system technical security guidelines of the
National Security Agency (NSA) in this regard where appropriate.
B. Under Section 3 of the Act, the NIST is to coordinate closely with other
agencies and offices, including the NSA, to assure:
1. Maximum use of all existing and planned programs, materials, studies, and
reports relating to computer systems security and privacy, in order to avoid
unnecessary and costly duplication of effort; and,
2. To the maximum extent feasible, that standards developed by the NIST
under the Act are consistent and compatible with standards and procedures
developed for the protection of classified information in Federal computer systems.
C. Under the Act, the Secretary of Commerce has the responsibility, which he
has delegated to the Director of NIST, for appointing the members of the Computer
System Security and Privacy Advisory Board, at least one of whom shall be from the
National Security Agency.

Therefore, in furtherance of the purposes of this MOU, the Director of the NIST and the
Director of the NSA hereby agree as follows:

I. The NIST will:
1. Appoint to the Computer Security and Privacy Advisory Board at least one
representative nominated by the Director of the NSA.
2. Draw upon computer system technical security guidelines developed by the
NSA to the extent that the NIST determines that such guidelines are consistent with
the requirements for protecting sensitive information in Federal computer systems.


3. Recognize the NSA-certified rating of evaluated trusted systems under the
Trusted Computer Security Evaluation Criteria Program without requiring
additional evaluation.
4. Develop telecommunications security standards for protecting sensitive
unclassified computer data, drawing upon the expertise and products of the
National Security Agency, to the greatest extent possible, in meeting these
responsibilities in a timely and cost-effective manner.
5. Avoid duplication where possible in entering into mutually agreeable
arrangements with the NSA for the NSA support.
6. Request the NSA's assistance on all matters related to cryptographic
algorithms and cryptographic techniques including but not limited to research,
development, evaluation, or endorsement.
II. The NSA will:
1. Provide the NIST with technical guidelines in trusted technology,
telecommunications security, and personal identification that may be used in cost-
effective systems for protecting sensitive computer data.
2. Conduct or initiate research and development programs in trusted
technology, telecommunications security, cryptographic techniques and personal
identification methods.
3. Be responsive to the NIST's requests for assistance in respect to all matters
related to cryptographic algorithms and cryptographic techniques including but not
limited to research, development, evaluation, or endorsement.
4. Establish the standards and endorse products for application to secure
systems covered in 10 USC Section 2315 (the Warner Amendment).
5. Upon request by Federal agencies, their contractors and other government-
sponsored entities, conduct assessments of the hostile intelligence threat to federal
information systems, and provide technical assistance and recommend endorsed
products for application to secure systems against that threat.
III. The NIST and the NSA shall:
1. Jointly review agency plans for the security and privacy of computer
systems submitted to NIST and NSA pursuant to section 6(b) of the Act.
2. Exchange technical standards and guidelines as necessary to achieve the
purposes of the Act.
3. Work together to achieve the purposes of this memorandum with the
greatest efficiency possible, avoiding unnecessary duplication of effort.
4. Maintain an on-going open dialogue to ensure that each organization
remains abreast of emerging technologies and issues affecting automated
information system security in computer-based systems.
5. Establish a Technical Working Group to review and analyze issues of
mutual interest pertinent to protection of systems that process sensitive or other
unclassified information. The Group shall be composed of six federal employees,
three each selected by NIST and NSA and to be augmented as necessary by
representatives of other agencies. Issues may be referred to the group by either the
NSA Deputy Director for Information Security or the NIST Deputy Director or may
be generated and addressed by the group upon approval by the NSA DDI or NIST
Deputy Director. Within days of the referral of an issue to the Group by either the
NSA Deputy Director for Information Security or the NIST Deputy Director, the
Group will respond with a progress report and plan for further analysis, if any.
6. Exchange work plans on an annual basis on all research and development
projects pertinent to protection of systems that process sensitive or other
unclassified information, including trusted technology, for protecting the integrity
and availability of data, telecommunications security and personal identification
methods. Project updates will be exchanged quarterly, and project reviews will be
provided by either party upon request of the other party.
7. Ensure the Technical Working Group reviews prior to public disclosure all
matters regarding technical systems security techniques to be developed for use in
protecting sensitive information in federal computer systems to ensure they are
consistent with the national security of the United States. If NIST and NSA are
unable to resolve such an issue within 60 days, either agency may elect to raise the
issue to the Secretary of Defense and the Secretary of Commerce. It is recognized
that such an issue may be referred to the President through the NSC for resolution.
No action shall be taken on such an issue until it is resolved.
8. Specify additional operational agreements in annexes to this MOU as they
are agreed to by NSA and NIST.
IV. Either party may elect to terminate this MOU upon six months' written
notice. This MOU is effective upon approval of both signatories.


Acting Director, National Institute of Standards and Technology, 24 March 1989

Vice Admiral, U.S. Navy; Director, National Security Agency, 23 March 1989

25.4 RSA Data Security, Inc.

RSA Data Security, Inc. (RSADSI) was founded in 1982 to develop, license, and market the RSA
patent. It has some commercial products, including a standalone e-mail security package, and various
cryptographic libraries (available in either source or object form). RSADSI also markets the RC2 and
RC4 symmetric algorithms (see Section 11.8). RSA Laboratories, a research lab associated with
RSADSI, performs basic cryptographic research and provides consulting services.

Anyone interested in either their patents or products should contact Director of Sales, RSA Data
Security, Inc., 100 Marine Parkway, Redwood City, CA 94065; (415) 595-8782; fax: (415) 595-1873.

25.5 Public Key Partners

The five patents in Table 25.3 are held by Public Key Partners (PKP) of Sunnyvale, California, a
partnership between RSADSI and Caro-Kahn, Inc.—the parent company of Cylink. (RSADSI gets 65
percent of the profits and Caro-Kahn gets 35 percent.) PKP claims that these patents, and 4,218,582
in particular, apply to all uses of public-key cryptography.

In [574], PKP wrote:

These patents [4,200,770, 4,218,582, 4,405,829, and 4,424,414] cover all known methods of
practicing the art of Public Key, including the variations collectively known as ElGamal.

Due to the broad acceptance of RSA digital signatures throughout the international
community, Public Key Partners strongly endorses its incorporation in a digital signature
standard. We assure all interested parties that Public Key Partners will comply with all of
the policies of ANSI and the IEEE concerning the availability of licenses to practice this
art. Specifically, in support of any RSA signature standard which may be adopted, Public
Key Partners hereby gives its assurance that licenses to practice RSA signatures will be
available under reasonable terms and conditions on a nondiscriminatory basis.

Whether this is true depends on who you talk to. PKP's licenses have mostly been secret, so there is no
way to check if the licenses are standard. Although they claim to have never denied a license to
anyone, at least two companies claim to have been denied a license. PKP guards its patents closely,
threatening anyone who tries to use public-key cryptography without a license. In part, this is a
reaction to U.S. patent law. If you hold a patent and fail to prosecute an infringement, you can lose
your patent. There has been much talk about whether the patents are legal, but so far it has all been
talk. All legal challenges to PKP's patents have been settled before judgment.

Table 25.3
Public Key Partners' Patents
Patent # Date Inventors Patent Covers
4,200,770 4/29/80 Hellman, Diffie, Merkle Diffie-Hellman Key Exchange
4,218,582 8/19/80 Hellman, Merkle Merkle-Hellman Knapsacks
4,405,829 9/20/83 Rivest, Shamir, Adleman RSA
4,424,414 3/3/84 Hellman, Pohlig Pohlig-Hellman
4,995,082 2/19/91 Schnorr Schnorr Signatures

I am not going to dispense legal advice in this book. Maybe the RSA patent will not hold up in court.
Maybe the patents do not apply to the entirety of public-key cryptography. (Honestly, I can't see how
they cover ElGamal or elliptic curve cryptosystems.) Perhaps someone will eventually win a suit
against PKP or RSADSI. But keep in mind that corporations with large legal departments like IBM,
Microsoft, Lotus, Apple, Novell, Digital, National Semiconductor, AT&T, and Sun have all licensed
RSA for use in their products rather than fight them in court. And Boeing, Shell Oil, DuPont,
Raytheon, and Citicorp have all licensed RSA for their own internal use.

In one case, PKP brought suit against TRW Corporation for using the ElGamal algorithm without a
license. TRW claimed they did not need a license. PKP and TRW reached a settlement in June 1992.
The details of the settlement are unknown, but they included an agreement by TRW to license the
patents. This does not bode well. TRW can afford good lawyers; I can only assume that if they
thought they could win the suit without spending an unreasonable amount of money, they would have.

Meanwhile, PKP is having its own internal problems. In June 1994 Caro-Kahn sued RSADSI alleging,
among other things, that the RSA patent is invalid and unenforceable [401]. Both partners are trying
to have the partnership dissolved. Are the patents valid or not? Will users have to get a license from
Caro-Kahn to use the RSA algorithm? Who will own the Schnorr patent? The matter will probably
be sorted out by the time this book sees publication.

Patents are good for only 17 years, and cannot be renewed. On April 29, 1997, Diffie-Hellman key
exchange (and the ElGamal algorithm) will enter the public domain. On September 20, 2000, RSA
will enter the public domain. Mark your calendars.
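
Both dates follow from simple date arithmetic on the grant dates in Table 25.3:

```python
from datetime import date

def expiry(granted: date) -> date:
    # Patents of this era ran 17 years from the date of grant.
    return granted.replace(year=granted.year + 17)

assert expiry(date(1980, 4, 29)) == date(1997, 4, 29)   # Diffie-Hellman
assert expiry(date(1983, 9, 20)) == date(2000, 9, 20)   # RSA
```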

25.6 International Association for Cryptologic Research (IACR)

The International Association for Cryptologic Research is the worldwide cryptographic research
organization. Its stated purpose is to advance the theory and practice of cryptology and related fields.
Membership is open to any person. The association sponsors two annual conferences, Crypto (held in
Santa Barbara in August) and Eurocrypt (held in Europe in May), and publishes quarterly The
Journal of Cryptology and the IACR Newsletter.

The address of the IACR Business Office changes whenever the president does. The current address
is: IACR Business Office, Aarhus Science Park, Gustav Wieds Vej 10, DK-8000 Aarhus C, Denmark.

25.7 RACE Integrity Primitives Evaluation (RIPE)

The Research and Development in Advanced Communication Technologies in Europe (RACE)
program was launched by the European Community to support pre-competitive and pre-normative
work in communications standards and technologies to support Integrated Broadband
Communication (IBC). As part of that effort, RACE established the RACE Integrity Primitives
Evaluation (RIPE) to put together a portfolio of techniques to meet the anticipated security
requirements of IBC.

Six leading European cryptography research groups made up the RIPE consortium: Center for
Mathematics and Computer Science, Amsterdam; Siemens AG; Philips Crypto BV; Royal PTT
Nederland NV, PTT Research; Katholieke Universiteit Leuven; and Aarhus Universitet. After calls
for algorithms in 1989 and 1991 [1564], 32 submissions from around the world, and a 350 man-month
evaluation project, the consortium published RIPE Integrity Primitives [1305,1332]. The report
included an introduction, some basic integrity concepts, and these primitives: MDC-4 (see Section
18.11), RIPE-MD (see Section 18.8), RIPE-MAC (see Section 18.14), IBC-HASH, SKID (see Section
3.2), RSA, COMSET (see Section 16.1), and RSA key generation.

25.8 Conditional Access for Europe (CAFE)

Conditional Access for Europe (CAFE) is a project in the European Community's ESPRIT program
[204,205]. Work began in December 1992 and is scheduled to be finished by the end of 1995. The
consortium consists of groups for social and market studies (Cardware, Institut für
Sozialforschung), software and hardware manufacturers (DigiCash, Gemplus, Ingenico, Siemens),
and cryptographers (CWI Amsterdam, PTT Research Netherlands, SPET, Sintef Delab Trondheim,
and the Universities of Århus, Hildesheim, and Leuven).

The goal is to develop systems for conditional access, particularly digital payment systems. Payment
systems must give legal certainty to everybody at all times and require as little trust as possible; this
certainty should not depend on the tamper-resistance of any device.

The basic device for CAFE is an electronic wallet: a small computer that looks something like a
pocket calculator. It has a battery, keyboard, screen, and an infrared channel for communicating with
other wallets. Every user owns and uses his own wallet, which administers his rights and guarantees
his security.

A device with a keyboard and screen has an advantage over a smart card: it can operate independently
of a terminal. A user can directly enter his password and the amount of the payment. The user does
not have to give his wallet up to complete a transaction, unlike the current situation with credit cards.

Additional features are:

— Offline transactions. The purpose of the system is to replace small cash transactions;
an online system would be too cumbersome.
— Loss tolerance. If a user loses his wallet, or if it breaks or is stolen, he can recover his
money.
— Support for different currencies.
— An open architecture and open system. A user should be able to pay for arbitrary
services, such as shopping, telephone, and public transport, offered by a range of service providers. The
system should be interoperable between any number of electronic money issuers, and between
different wallet types and manufacturers.
— Low cost.

At this writing there is a software version of the system, and the consortium is hard at work on a
hardware prototype.

25.9 ISO/IEC 9979

In the mid-80s, the ISO tried to standardize DES, which by then was already a FIPS and an ANSI
standard. After some political wrangling, the ISO decided not to standardize cryptographic
algorithms, but instead to register algorithms. Only encryption algorithms can be registered; hash
functions and signature schemes cannot. Any national body can submit an algorithm for registration.

Currently only three algorithms have been submitted (see Table 25.4). A submission includes
information about applications, parameters, implementations, modes, and test vectors. A detailed
description is optional; it is possible to submit secret algorithms for registration.

The fact that an algorithm is registered does not imply anything about its quality, nor is registration
an approval of the algorithm by the ISO/IEC. Registration merely indicates that a single national
body wants to register the algorithm, based on whatever criteria that body uses.

I am not impressed with this idea. Registration obstructs the standardization process. Rather than
agreeing on a few algorithms, the ISO is allowing any algorithm to be registered. With so little control
over what is registered, stating that an algorithm is “ISO/IEC 9979 Registered” sounds a whole lot
better than it is. In any case, the registry is maintained by the National Computer Centre Ltd., Oxford
Road, Manchester, M1 7ED, United Kingdom.

Table 25.4
ISO/IEC 9979 Registered Algorithms

Name       Registration Number
B-CRYPT    0001
IDEA       0002
LUC        0003

25.10 Professional, Civil Liberties, and Industry Groups

Electronic Privacy Information Center (EPIC)

EPIC was established in 1994 to focus public attention on emerging privacy issues relating to the
National Information Infrastructure, such as the Clipper chip, the Digital Telephony proposal,
national identity numbers and systems, medical records privacy, and the sale of consumer data. EPIC
conducts litigation, sponsors conferences, produces reports, publishes the EPIC Alert, and leads
campaigns on privacy issues. Anyone interested in joining should contact Electronic Privacy
Information Center, 666 Pennsylvania Avenue SE, Suite 301, Washington, D.C. 20003; (202) 544-9240;
fax: (202) 547-5482; Internet: info@epic.org.

Electronic Frontier Foundation (EFF)

The EFF is dedicated to protecting civil rights in cyberspace. With respect to cryptographic policy in
the United States, they believe that information and access to cryptography are fundamental rights,
and therefore should be free of government restriction. They organized the Digital Privacy and
Security Working Group, a coalition of 50 organizations. The group opposed the Digital Telephony
bill and the Clipper initiative. The EFF is also helping in a lawsuit against cryptography export
controls [143]. Anyone interested in joining the EFF should contact Electronic Frontier Foundation,
1001 G Street NW, Suite 950E, Washington, D.C. 20001; (202) 347-5400; fax: (202) 393-5509;
Internet: eff@eff.org.

Association for Computing Machinery (ACM)

The ACM is an international computer industry organization. In 1994 the U.S. ACM Public Policy
Committee produced an excellent report on U.S. cryptography policy [935]. This should be required
reading for anyone interested in the politics of cryptography. It is available via anonymous ftp from
info.acm.org in /reports/acm_crypto/acm_crypto_study.ps.

Institute of Electrical and Electronics Engineers (IEEE)

The IEEE is another professional organization. The U.S. office investigates and makes
recommendations on privacy-related issues including encryption policy, identity numbers, and
privacy protections on the Internet.

Software Publishers Association (SPA)

The SPA is a trade association of over 1000 personal computer software companies. They have
lobbied for relaxation of export controls on cryptography, and maintain a list of commercially
available foreign cryptography products.

25.11 Sci.crypt

Sci.crypt is the Usenet newsgroup for cryptology. It is read by an estimated 100,000 people worldwide.
Most of the posts are nonsense, bickering, or both; some are political, and most of the rest are
requests for information or basic questions. Occasionally nuggets of new and useful information are
posted to this newsgroup. If you follow sci.crypt regularly, you will learn how to use something called
a kill file.

Another Usenet newsgroup is sci.crypt.research, a moderated newsgroup devoted to discussions about
cryptology research. There are fewer posts and they are more interesting.

25.12 Cypherpunks

The Cypherpunks are an informal group of people interested in teaching and learning about
cryptography. They also experiment with cryptography and try to put it into use. In their opinion, all
the cryptographic research in the world doesn't do society any good unless it gets used.

In “A Cypherpunk's Manifesto,” Eric Hughes writes [744]:

We the Cypherpunks are dedicated to building anonymous systems. We are defending our
privacy with cryptography, with anonymous mail forwarding systems, with digital
signatures, and with electronic money.

Cypherpunks write code. We know that someone has to write software to defend privacy,
and since we can't get privacy unless we all do, we're going to write it. We publish our

