code so that our fellow Cypherpunks may practice and play with it. Our code is free for
all to use, worldwide. We don't care much if you don't approve of the software we write.
We know that software can't be destroyed and that widely dispersed systems can't be shut
down.

People interested in joining the cypherpunks mailing list on the Internet should send mail to
majordomo@toad.com. The mailing list is archived at ftp.csua.berkeley.edu in /pub/cypherpunks.

25.13 Patents

Software patents are an issue much larger than the scope of this book. Whether they're good or bad,
they exist. Algorithms, cryptographic algorithms included, can be patented in the United States. IBM
owned the DES patents [514]. IDEA is patented. Almost every public-key algorithm is patented. NIST
even has a patent for the DSA. Some cryptography patents have been blocked by intervention from
the NSA, under the authority of the Invention Secrecy Act of 1940 and the National Security Act of
1947. This means that instead of a patent, the inventor gets a secrecy order and is prohibited from
discussing his invention with anybody.

The NSA has special dispensation when it comes to patents. They can apply for a patent and then
block its issuance. It's a secrecy order again, but here the NSA is both the inventor and the issuer of
the order. When, at some later date, the secrecy order is removed, the Patent Office issues the patent,
good for the standard 17 years. This neatly protects the invention while keeping it secret. If someone
else invents the same thing, the NSA has already filed for the patent. If no one else invents it, then it
remains secret.

Not only does this fly directly in the face of the patent process, which is supposed to disclose as well as
protect inventions, but it also allows the NSA to keep a patent for more than 17 years. The 17-year
clock starts ticking when the patent is issued, not when the application is filed. How this will change,
now that the United States has ratified the GATT treaty, is unclear.

25.14 U.S. Export Rules

According to the U.S. government, cryptography can be a munition. This means it is covered under
the same rules as a TOW missile or an M1 Abrams tank. If you sell cryptography overseas without
the proper export license, then you are an international arms smuggler. Unless you think time in a
federal penitentiary would look good on your résumé, pay attention to the rules.

With the advent of the Cold War in 1949, all of the NATO countries (except Iceland), and later
Australia, Japan, and Spain, formed CoCom, the Coordinating Committee for Multilateral Export
Controls. This is an unofficial nontreaty organization, chartered to coordinate national restrictions on
the export of sensitive military technologies to the Soviet Union, other Warsaw Pact countries, and the
People's Republic of China. Examples of controlled technologies are computers, milling machinery,
and cryptography. The goal here was to slow technology transfer into those countries, and thereby
keep their militaries inferior.

Since the end of the Cold War, the CoCom countries realized that many of their controls were
obsolete. They are supposedly in the process of defining something called the “New Forum,” another
multinational organization designed to stop the flow of military technologies to countries the members
don't particularly like.

In any case, U.S. export policy on strategic goods is defined by the Export Administration Act, the
Arms Export Control Act, the Atomic Energy Act, and the Nuclear Non-Proliferation Act. The
controls established by all this legislation are implemented through a number of statutes, none of
them coordinated with the others. Over a dozen agencies, including the military services, administer
controls, and their regulatory programs often overlap and contradict one another.

Controlled technologies appear on several lists. Cryptography has traditionally been classified as a
munition and appears on the U.S. Munitions List (USML), the International Munitions List (IML),
the Commerce Control List (CCL), and the International Industrial List (IIL). The Department of
State is responsible for the USML; it is published as part of the International Traffic in Arms
Regulations (ITAR) [466,467].

Two U.S. government agencies control the export of cryptography. One is the Bureau of Export
Administration (BXA) in the Department of Commerce, authorized by the Export Administration
Regulations (EAR). The other is the Office of Defense Trade Controls (DTC) in the State Department,
authorized by the ITAR. As a rule of thumb, the Commerce Department's BXA has far less stringent
requirements, but the State Department's DTC (which takes technical and national security advice
from the NSA, and always seems to follow that advice) sees all cryptography exports first and can
refuse to transfer jurisdiction to the BXA.

The ITAR regulates this stuff. (Before 1990 the Office of Defense Trade Controls was called the
Office of Munitions Control; presumably the name change is a public relations effort designed to help
us forget that we're dealing with guns and bombs.) Historically, the DTC has been reluctant to grant
export licenses for encryption products stronger than a certain level, though it has never been public
about exactly what that level is.

The following sections are excerpted from the ITAR [466,467]:

§ 120.10 Technical data.


Technical data means, for purposes of this subchapter:
(1) Information, other than software as defined in 120.10(d), which is
required for the design, development, production, processing, manufacture,
assembly, operation, repair, maintenance or modification of defense articles. This
includes, for example, information in the form of blueprints, drawings,
photographs, plans, instructions and documentation;
(2) Classified information relating to defense articles and defense services;
(3) Information covered by an invention secrecy order;
(4) Software as defined in Sec. 121.8(f) directly related to defense articles;
(5) This definition does not include information concerning general scientific,
mathematical or engineering principles commonly taught in schools, colleges and
universities in the public domain as defined in § 120.11. It also does not include
basic marketing information on function or purpose or general system descriptions
of defense articles.


§ 120.11 Public domain.


Public domain means information which is published and which is generally accessible or
available to the public:
(1) Through sales at newsstands and bookstores;
(2) Through subscriptions which are available without restriction to any
individual who desires to obtain or purchase the published information;
(3) Through second class mailing privileges granted by the U.S. Government;
(4) At libraries open to the public or from which the public can obtain
documents;
(5) Through patents available at any patent office;
(6) Through unlimited distribution at a conference, meeting, seminar, trade
show or exhibition, generally accessible to the public, in the United States;
(7) Through public release (i.e., unlimited distribution) in any form (e.g., not
necessarily in published form) after approval by the cognizant U.S. government
department or agency (see also § 125.4(b)(13)).
(8) Through fundamental research in science and engineering at accredited
institutions of higher learning in the U.S., where the resulting information is
ordinarily published and shared broadly in the scientific community. Fundamental
research is defined to mean basic and applied research in science and engineering
where the resulting information is ordinarily published and shared broadly within
the scientific community, as distinguished from research the results of which are
restricted for proprietary reasons or specific U.S. Government access and
dissemination controls. University research will not be considered fundamental
research if:
(i) The University or its researchers accept other restrictions on publication of
scientific and technical information resulting from the project or activity, or
(ii) The research is funded by the U.S. Government and specific access and
dissemination controls protecting information resulting from the research are
applicable.

§ 120.17 Export.


Export means:
(1) Sending or taking defense articles out of the United States in any manner,
except by mere travel outside of the United States by a person whose personal
knowledge includes technical data; or
(2) Transferring registration, control or ownership to a foreign person of any
aircraft, vessel, or satellite covered by the U.S. Munitions List, whether in the United
States or abroad; or
(3) Disclosing (including oral or visual disclosure) or transferring in the
United States any defense articles to an embassy, any agency or subdivision of a
foreign government (e.g., diplomatic missions); or
(4) Disclosing (including oral or visual disclosure) or transferring technical
data to a foreign person, whether in the United States or abroad; or
(5) Performing a defense service on behalf of, or for the benefit of, a foreign
person, whether in the United States or abroad.
(6) A launch vehicle or payload shall not, by the launching of such vehicle, be
considered export for the purposes of this subchapter. However, for certain limited
purposes (see § 126.1 of this subchapter), the controls of this subchapter apply to

sales and other transfers of defense articles or defense services.


Part 121 - The United States Munitions List

§ 121.1 General. The United States Munitions List


Category XIII - Auxiliary Military Equipment
(1) Cryptographic (including key management) systems, equipment,
assemblies, modules, integrated circuits, components or software with the capability
of maintaining secrecy or confidentiality of information or information systems,
except cryptographic equipment and software as follows:
(i) Restricted to decryption functions specifically designed to allow the
execution of copy protected software, provided the decryption functions are not
user-accessible.
(ii) Specifically designed, developed or modified for use in machines for
banking or money transactions, and restricted to use only in such transactions.
Machines for banking or money transactions include automatic teller machines, self-
service statement printers, point of sale terminals or equipment for the encryption of
interbanking transactions.
(iii) Employing only analog techniques to provide the cryptographic
processing that ensures information security in the following applications....
(iv) Personalized smart cards using cryptography restricted for use only in
equipment or systems exempted from the controls of the USML.
(v) Limited to access control, such as automatic teller machines, self-service
statement printers or point of sale terminals, which protects passwords or personal
identification numbers (PIN) or similar data to prevent unauthorized access to
facilities but does not allow for encryption of files or text, except as directly related
to the password or PIN protection.
(vi) Limited to data authentication which calculates a Message Authentication
Code (MAC) or similar result to ensure no alteration of text has taken place, or
authenticate users, but does not allow for encryption of data, text or other media
other than that needed for the authentication.
(vii) Restricted for fixed data compression or coding techniques.
(viii) Limited to receiving for radio broadcast, pay television or similar
restricted audience television of the consumer type, without digital encryption and
where digital decryption is limited to video, audio or management functions.
(ix) Software designed or modified to protect against malicious computer
damage (e.g., viruses).
(2) Cryptographic (including key management) systems, equipment,
assemblies, modules, integrated circuits, components or software which have the
capability of generating spreading or hopping codes for spread spectrum systems or
equipment.
(3) Cryptographic systems, equipment, assemblies, modules, integrated
circuits, components or software.

§ 125.2 Exports of unclassified technical data.

(a) General. A license (DSP-5) is required for the export of unclassified
technical data unless the export is exempt from the licensing requirements of this
subchapter. In the case of a plant visit, details of the proposed discussions must be
transmitted to the Office of Defense Trade Controls for an appraisal of the technical
data. Seven copies of the technical data or the details of the discussions must be
provided.
(b) Patents. A license issued by the Office of Defense Trade Controls is
required for the export of technical data whenever the data exceeds that which is
used to support a domestic filing of a patent application or to support a foreign
filing of a patent application whenever no domestic application has been filed.
Requests for the filing of patent applications in a foreign country, and requests for
the filing of amendments, modifications or supplements to such patents, should
follow the regulations of the U.S. Patent and Trademark Office in accordance with
37 CFR part 5. The export of technical data to support the filing and processing of
patent applications in foreign countries is subject to regulations issued by the U.S.
Patent and Trademark Office pursuant to 35 U.S.C. 184.
(c) Disclosures. Unless otherwise expressly exempted in this subchapter, a
license is required for the oral, visual or documentary disclosure of technical data
by U.S. persons to foreign persons. A license is required regardless of the manner in
which the technical data is transmitted (e.g., in person, by telephone,
correspondence, electronic means, etc.). A license is required for such disclosures by
U.S. persons in connection with visits to foreign diplomatic missions and consular
offices.

And so on. There's a lot more information in this document. If you're going to try to export
cryptography, I suggest you get a copy of the entire thing and a lawyer who speaks the language.

In reality, the NSA has control over the export of cryptographic products. If you want a Commodity
Jurisdiction (CJ), you must submit your product to the NSA for approval and submit the CJ
application to the State Department. After State Department approval, the matter moves under the
jurisdiction of the Commerce Department, which has never cared much about the export of
cryptography. However, the State Department will never grant a CJ without NSA approval.

In 1977 an NSA employee named Joseph A. Meyer wrote a letter (unauthorized, according to the
official story of the incident) to the IEEE, warning them that the scheduled presentation of the
original RSA paper would violate the ITAR. From The Puzzle Palace:

He had a point. The ITAR did cover any “unclassified information that can be used, or
adapted for use, in the design, production, manufacture, repair, overhaul, processing,
engineering, development, operation, maintenance, or reconstruction” of the listed
materials, as well as “any technology which advances the state-of-the-art or establishes a
new art in an area of significant military applicability in the United States.” And export
did include transferring the information both by writing and by either oral or visual
means, including briefings and symposia in which foreign nationals are present.

But followed literally, the vague, overly broad regulations would seem to require that
anyone planning to write or speak out publicly on a topic touching the Munitions List
must first get approval from the State Department, a chilling prospect clearly at odds
with the First Amendment and one as yet untested by the Supreme Court.


In the end NSA disavowed Meyer's actions and the RSA paper was presented as planned. No actions
were taken against any of the inventors, although their work arguably enhanced foreign
cryptography capabilities more than anything released since.

The following statement by NSA discusses the export of cryptography [363]:

Cryptographic technology is deemed vital to national security interests. This includes
economic, military, and foreign policy interests.

We do not agree with the implications from the House Judiciary Committee hearing of 7
May 1992 and recent news articles that allege that U.S. export laws prevent U.S. firms'
manufacture and use of top encryption equipment. We are unaware of any case where a
U.S. firm has been prevented from manufacturing and using encryption equipment within
this country or for use by the U.S. firm or its subsidiaries in locations outside the U.S.
because of U.S. export restrictions. In fact, NSA has always supported the use of
encryption by U.S. businesses operating domestically and overseas to protect sensitive
information.

For export to foreign countries, NSA as a component of the Department of Defense (along
with the Department of State and the Department of Commerce) reviews export licenses
for information security technologies controlled by the Export Administration
Regulations or the International Traffic in Arms Regulations. Similar export control
systems are in effect in all the Coordinating Committee for Multilateral Export Controls
(CoCom) countries as well as many non-CoCom countries as these technologies are
universally considered as sensitive. Such technologies are not banned from export and are
reviewed on a case-by-case basis. As part of the export review process, licenses may be
required for these systems and are reviewed to determine the effect such export could
have on national security interests, including economic, military, and political security
interests. Export licenses are approved or denied based upon the type of equipment
involved, the proposed end use and the end user.

Our analysis indicates that the U.S. leads the world in the manufacture and export of
information security technologies. Of those cryptologic products referred to NSA by the
Department of State for export licenses, we consistently approve over 90%. Export
licenses for information security products under the jurisdiction of the Department of
Commerce are processed and approved without referral to NSA or DoD. This includes
products using such techniques as the DSS and RSA which provide authentication and
access control to computers or networks. In fact, in the past NSA has played a major role
in successfully advocating the relaxation of export controls on RSA and related
technologies for authentication purposes. Such techniques are extremely valuable against
the hacker problem and unauthorized use of resources.

It is the stated policy of the NSA not to restrict the export of authentication products, only encryption
products. If you want to export an authentication-only product, approval may merely be a matter of
showing that your product cannot easily be used for encryption. Furthermore, the bureaucratic
procedures are much simpler for authentication products than for encryption products. An
authentication product needs State Department approval only once for a CJ; an encryption product
may require approval for every product revision or even every sale.

Without a CJ, you must request export approval every time you wish to export the product. The State
Department does not approve the export of products with strong encryption, even those using DES.
Isolated exceptions include export to U.S. subsidiaries for the purposes of communicating to the U.S.,
exports for some banking applications, and export to appropriate U.S. military users. The Software
Publishers Association (SPA) has been negotiating with the government to ease export license
restrictions. A 1992 agreement between them and the State Department eased the export license rules
for two algorithms, RC2 and RC4, as long as the key size is 40 bits or less. Refer to Section 7.1 for
more information.
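
For scale, the 40-bit limit allows 2^40, or about 1.1 × 10^12, possible keys; Section 7.1 puts a
keyspace of that size in perspective against a determined attacker.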

In 1993, Rep. Maria Cantwell (D-WA) introduced a bill at the behest of the software industry to relax
export controls on encryption software. Sen. Patty Murray (D-WA) introduced a companion bill in
the Senate. The Cantwell Bill was appended to the general export control legislation going through
Congress, but was deleted by the House Intelligence Committee after a massive lobbying effort by the
NSA. Whatever the NSA did, it was impressive; the committee voted unanimously to delete the
wording. I can't remember the last time a bunch of legislators voted unanimously to do anything.

In 1995 Dan Bernstein, with the help of the EFF, sued the U.S. government, seeking to bar the
government from restricting publication of cryptographic documents and software [143]. The suit
claimed that the export control laws are unconstitutional, an “impermissible prior restraint on speech,
in violation of the First Amendment.” Specifically, the lawsuit charges that the current export control
process:

- Allows bureaucrats to restrict publication without ever going to court.
- Provides too few procedural safeguards for First Amendment rights.
- Requires publishers to register with the government, creating in effect a “licensed
press.”
- Disallows general publication by requiring recipients to be individually identified.
- Is sufficiently vague that ordinary people cannot know what conduct is allowed and
what conduct is prohibited.
- Is overbroad because it prohibits conduct that is clearly protected (such as speaking to
foreigners within the United States).
- Is applied too broadly, by prohibiting export of software that contains no
cryptography, on the theory that cryptography could be added to it later.
- Egregiously violates the First Amendment by prohibiting private speech on
cryptography because the government wishes its own opinions on cryptography to guide the
public instead.
- Exceeds the authority granted by Congress in the export control laws in many ways, as
well as exceeding the authority granted by the Constitution.


Everyone anticipates that the case will take several years to settle, and no one has any idea how it will
come out.

Meanwhile, the Computer Security and Privacy Advisory Board, an official advisory board to NIST,
voted in March 1992 to recommend a national policy review of cryptographic issues, including export
policy. They said that export policy is decided solely by agencies concerned with national security,
without input from agencies concerned with encouraging commerce. Those agencies concerned with
national security are doing everything possible to make sure this doesn't change, but eventually it has
to.

25.15 Foreign Import and Export of Cryptography

Other countries have their own import and export rules [311]. This summary is incomplete and
probably out of date. Countries could have rules and ignore them, or could have no rules but restrict
import, export, and use anyway.

- Australia requires an import certificate for cryptography only upon request from the
exporting country.
- Canada has no import controls, and export controls are similar to those of the United
States. The exportation of items from Canada may be subject to restriction if they are included
on the Export Control List pursuant to the Export and Import Permits Act. Canada follows the
CoCom regulations in the regulation of cryptographic technology. Encryption devices are
outlined in category five, part two of Canada's export regulations. These provisions are similar
to U.S. category five in the Export Administration Regulations.
- China has a licensing scheme for importing commodities; exporters must file an
application with the Ministry of Foreign Trade. Based on China's List of Prohibited and
Restricted Imports and Exports enacted in 1987, China restricts the import and export of voice-
encoding devices.
- France has no special rules for the import of cryptography, but it has rules regarding
the sale and use of cryptography within the country. All products must be certified: either they
must meet a published specification, or the company's proprietary specification must be
provided to the government. The government may also ask for two units for its own use.
Companies must have a license to sell cryptography within France; the license specifies the
target market. Users must have a license to buy and use cryptography; the license includes a
statement to the effect that users must be prepared to give up their keys to the government up to
four months after use. This restriction may be waived in some cases: for banks, large
companies, and so on. And there is no use license requirement for cryptography exportable
from the U.S.
- Germany follows the CoCom guidelines, requiring a license to export cryptography. It
specifically maintains control of public-domain and mass-market cryptography software.
- Israel has import restrictions, but no one seems to know what they are.
- Belgium, Italy, Japan, the Netherlands, and the United Kingdom follow the CoCom
guidelines on cryptography, requiring a license for export.
- Brazil, India, Mexico, Russia, Saudi Arabia, Spain, South Africa, Sweden, and
Switzerland have no import or export controls on cryptography.

25.16 Legal Issues

Are digital signatures real signatures? Will they stand up in court? Some preliminary legal research
has resulted in the opinion that digital signatures would meet the requirements of legally binding
signatures for most purposes, including commercial use as defined in the Uniform Commercial Code
(UCC). A GAO (General Accounting Office) decision, made at the request of NIST, opines that digital
signatures will meet the legal standards of handwritten signatures [362].

The Utah Digital Signature Act went into effect on May 1, 1995, providing a legal framework for the
use of digital signatures in the judicial system. California has a bill pending, while Oregon and
Washington are still writing theirs. Texas and Florida are right behind. By this book's publication,
more states will have followed suit.

The American Bar Association (EDI and Information Technology Division of the Science and
Technology Section) produced a model act for states to use for their own legislation. The act attempts
to incorporate digital signatures into the existing legal infrastructure for signatures: the Uniform
Commercial Code, the United States Federal Reserve regulations, common law of contracts and
signatures, the United Nations Convention on Contracts for the International Sale of Goods, and the
United Nations Convention on International Bills of Exchange and International Promissory
Notes. Included in the act are responsibilities and obligations of certification authorities, issues
of liability, and limits and policies.

In the United States, laws about signatures, contracts, and commercial transactions are state laws, so
this model act is designed for states. The eventual goal is a federal act, but if this all begins at the state
level there is less chance of the NSA mucking up the works.

Even so, the validity of digital signatures has not been challenged in court; their legal status is still
undefined. In order for digital signatures to carry the same authority as handwritten signatures, they
must first be used to sign a legally binding document, and then be challenged in court by one party.
The court would then consider the security of the signature scheme and issue a ruling. Over time, as
this happened repeatedly, a body of precedent rulings would emerge regarding which digital
signature methods and what key sizes are required for a digital signature to be legally binding. This is
likely to take years.

Until then, if two people wish to use digital signatures for contracts (or purchase requests, or work
orders, or whatever), it is recommended that they sign a paper contract in which they agree in the
future to be bound by any documents digitally signed by them [1099]. This document would specify
algorithm, key size, and any other parameters; it should also delineate how disputes would be
resolved.




Afterword by Matt Blaze
One of the most dangerous aspects of cryptology (and, by extension, of this book) is that you can
almost measure it. Knowledge of key lengths, factoring methods, and cryptanalytic techniques makes
it possible to estimate (in the absence of a real theory of cipher design) the “work factor” required to
break a particular cipher. It's all too tempting to misuse these estimates as if they were overall
security metrics for the systems in which they are used. The real world offers the attacker a richer
menu of options than mere cryptanalysis. Often more worrisome are protocol attacks, Trojan horses,
viruses, electromagnetic monitoring, physical compromise, blackmail and intimidation of key holders,
operating system bugs, application program bugs, hardware bugs, user errors, physical
eavesdropping, social engineering, and dumpster diving, to name just a few.
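
To see how seductive such estimates are, consider a back-of-the-envelope example (the speed figure is
purely illustrative): an exhaustive search of a 56-bit keyspace at one million trials per second takes
about 2^56/10^6 ≈ 7.2 × 10^10 seconds, or roughly 2,300 years (half that on average). The number
looks comfortingly precise, and it says nothing at all about any of the attacks listed below.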

High-quality ciphers and protocols are important tools, but by themselves make poor substitutes for
realistic, critical thinking about what is actually being protected and how various defenses might fail
(attackers, after all, rarely restrict themselves to the clean, well-defined threat models of the academic
world). Ross Anderson gives examples of cryptographically strong systems (in the banking industry)
that fail when exposed to the threats of the real world [43, 44]. Even when the attacker has access only
to ciphertext, seemingly minor breaches in other parts of the system can leak enough information to
render good cryptosystems useless. The Allies in World War II broke the German Enigma traffic
largely by carefully exploiting operator errors [1587].

An NSA-employed acquaintance, when asked whether the government can crack DES traffic,
quipped that real systems are so insecure that they never need to bother. Unfortunately, there are no
easy recipes for making a system secure, no substitute for careful design and critical, ongoing
scrutiny. Good cryptosystems have the nice property of making life much harder for the attacker than
for the legitimate user; this is not the case for almost every other aspect of computer and
communication security. Consider the following (quite incomplete) “Top Ten Threats to Security in
Real Systems” list; all are easier to exploit than to prevent.

1. The sorry state of software. Everyone knows that nobody knows how to write software.
Modern systems are complex, with hundreds of thousands of lines of code; any one of them has
the chance to compromise security. Fatal bugs may even be far-removed from the security
portion of the software.
2. Ineffective protection against denial-of-service attacks. Some cryptographic protocols
allow anonymity. It may be especially dangerous to deploy anonymous protocols if they increase
the opportunities for unidentified vandals to disrupt service; anonymous systems therefore need
to be especially resistant to denial-of-service attacks. Robust networks can more easily support
anonymity; consider that hardly anyone worries very much about the millions of anonymous
entry points to more robust networks like the telephone system or the postal service, where it™s
relatively difficult (or expensive) for an individual to cause large-scale failures.
3. No place to store secrets. Cryptosystems protect large secrets with smaller ones (keys).
Unfortunately, modern computers aren't especially good at protecting even the smallest secrets.
Multi-user networked workstations can be broken into and their memories compromised.
Standalone, single-user machines can be stolen or compromised through viruses that leak
secrets asynchronously. Remote servers, where there may be no user available to enter a
passphrase (but see threat #5), are an especially hard problem.
4. Poor random-number generation. Keys and session variables need good sources of
unpredictable bits. A running computer has a lot of entropy in it but rarely provides
applications with a convenient or reliable way to exploit it. A number of techniques have been
proposed for getting true random numbers in software (taking advantage of unpredictability in
things like I/O interarrival timing, clock and timer skew, and even air turbulence inside disk
enclosures), but all these are very sensitive to slight changes in the environments in which they
are used; a toy sketch of the timing-sampling idea follows this list.
5. Weak passphrases. Most cryptographic software addresses the key storage and key
generation problems by relying on user-generated passphrase strings, which are presumed to be
unpredictable enough to produce good key material and are also easy enough to remember that
they do not require secure storage. While dictionary attacks are a well-known problem with
short passwords, much less is known about lines of attack against user-selected passphrase-
based keys. Shannon tells us that English text has only just over 1 bit of entropy per character,
which would seem to leave most passphrases well within reach of brute-force search. Less is
known, however, about good techniques for enumerating passphrases in order to exploit this.
Until we have a better understanding of how to attack passphrases, we really have no idea how
weak or strong they are; a worked example of the entropy arithmetic follows this list.
6. Mismatched trust. Almost all currently available cryptographic software assumes that
the user is in direct control over the systems on which it runs and has a secure path to it. For
example, the interfaces to programs like PGP assume that their passphrase input always comes
from the user over a secure path like the local console. This is not always the case, of course;
consider the problem of reading your encrypted mail when logged in over a network connection.
What the system designer assumes is trusted may not match the needs or expectations of the real
users, especially when software can be controlled remotely over insecure networks.
7. Poorly understood protocol and service interactions. As systems get bigger and more
complex, benign features frequently come back to haunt us, and it's hard to know even where to
look when things fail. The Internet worm was propagated via an obscure and innocent-looking
feature in the sendmail program; how many more features in how many more programs have
unexpected consequences just waiting to be discovered?
8. Unrealistic threat and risk assessment. Security experts tend to focus on the threats
they know how to model and prevent. Unfortunately, attackers focus on what they know how to
exploit, and the two are rarely exactly the same. Too many “secure” systems are designed
without considering what the attacker is actually likely to do.
9. Interfaces that make security expensive and special. If security features are to be used,
they must be convenient and transparent enough that people actually turn them on. It's easy to
design encryption mechanisms that come only at the expense of performance or ease of use, and
even easier to design mechanisms that invite mistakes. Security should be harder to turn off
than on; unfortunately, few systems actually work this way.
10. Little broad-based demand for security. This is a well-known problem among almost
everyone who has tied his or her fortune to selling security products and services. Until there is
widespread demand for transparent security, the tools and infrastructure needed to support it
will be expensive and inaccessible to many applications. This is partly a problem of
understanding and exposing the threats and risks in real applications and partly a problem of
not designing systems that include security as a basic feature rather than as a later add-on.
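
Threat #4 above mentions mining timing jitter for randomness. The fragment below is a toy sketch of
that idea, in the style of the listings in Part V, and everything about it is illustrative: clock() is usually
far too coarse (and line buffering batches keystrokes together), and the rotate-and-XOR mixer is a
stand-in for the cryptographic hash a real design would use to distill the pool. Treat the output as raw
material for further distillation, never as key material.

#include <stdio.h>
#include <time.h>

#define POOLWORDS 16

static unsigned long pool[POOLWORDS]; /* raw entropy pool */
static int pos = 0;                    /* next pool word to stir */

/* Fold one raw sample into the pool with a 32-bit rotate-and-XOR.
 * A real design would distill the pool with a cryptographic hash
 * before using any of it. */
static void add_sample(unsigned long sample)
{
    unsigned long *p = &pool[pos];
    pos = (pos + 1) % POOLWORDS;
    *p = (((*p << 7) | (*p >> 25)) ^ sample) & 0xffffffffL;
}

int main(void)
{
    int i, c;

    printf("Type a line of text:\n");
    while ((c = getchar()) != EOF && c != '\n')
        add_sample((unsigned long) clock() ^ (unsigned long) c);
    for (i = 0; i < POOLWORDS; i++)
        printf("%08lx%c", pool[i], (i % 4 == 3) ? '\n' : ' ');
    return 0;
}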
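
And to make the passphrase arithmetic in threat #5 concrete (treating Shannon's figure as roughly 1
to 1.3 bits of entropy per character, an estimate rather than a measurement): a 20-character English
passphrase carries only about 20 to 26 bits of entropy, on the order of 2^26 ≈ 7 × 10^7 candidates;
matching even a 56-bit DES key at that rate would take a passphrase of well over 40 characters. The
open problem, as noted above, is enumerating candidate passphrases in anything like best-first order.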

A more complete list and discussion of these kinds of threats could easily fill a book of this size and
barely scratch the surface. What makes them especially difficult and dangerous is that there are no
magic techniques, beyond good engineering and ongoing scrutiny, for avoiding them. The lesson for
the aspiring cryptographer is to respect the limits of the art.

Matt Blaze
New York, NY




Part V
SOURCE CODE
1. DES
2. LOKI91
3. IDEA
4. GOST
5. Blowfish
6. 3-Way
7. RC5
8. A5
9. SEAL

DES
#include <stdio.h> /* for printf() in the test routine at the end */

#define EN0 0 /* MODE == encrypt */
#define DE1 1 /* MODE == decrypt */

typedef struct {
unsigned long ek[32];
unsigned long dk[32];
} des_ctx;

extern void deskey(unsigned char *, short);
/* hexkey[8] MODE
* Sets the internal key register according to the hexadecimal
* key contained in the 8 bytes of hexkey, according to the DES,
* for encryption or decryption according to MODE.
*/

extern void usekey(unsigned long *);
/* cookedkey[32]
* Loads the internal key register with the data in cookedkey.
*/

extern void cpkey(unsigned long *);
/* cookedkey[32]
* Copies the contents of the internal key register into the storage
* located at &cookedkey[0].
*/

extern void des(unsigned char *, unsigned char *);
/* from[8] to[8]
* Encrypts/Decrypts (according to the key currently loaded in the
* internal key register) one block of eight bytes at address 'from'
* into the block at address 'to'. They can be the same.
*/


static void scrunch(unsigned char *, unsigned long *);
static void unscrun(unsigned long *, unsigned char *);
static void desfunc(unsigned long *, unsigned long *);
static void cookey(unsigned long *);

static unsigned long KnL[32] = { 0L };
static unsigned long KnR[32] = { 0L };
static unsigned long Kn3[32] = { 0L };
static unsigned char Df_Key[24] = {
0x01,0x23,0x45,0x67,0x89,0xab,0xcd,0xef,
0xfe,0xdc,0xba,0x98,0x76,0x54,0x32,0x10,
0x89,0xab,0xcd,0xef,0x01,0x23,0x45,0x67 };

static unsigned short bytebit[8] ={
0200, 0100, 040, 020, 010, 04, 02, 01 };

static unsigned long bigbyte[24] = {
0x800000L, 0x400000L, 0x200000L, 0x100000L,
0x80000L, 0x40000L, 0x20000L, 0x10000L,
0x8000L, 0x4000L, 0x2000L, 0x1000L,
0x800L, 0x400L, 0x200L, 0x100L,
0x80L, 0x40L, 0x20L, 0x10L,
0x8L, 0x4L, 0x2L, 0x1L };

/* Use the key schedule specified in the Standard (ANSI X3.92-1981). */

static unsigned char pc1[56] = {
56, 48, 40, 32, 24, 16, 8, 0, 57, 49, 41, 33, 25, 17,
9, 1, 58, 50, 42, 34, 26, 18, 10, 2, 59, 51, 43, 35,
62, 54, 46, 38, 30, 22, 14, 6, 61, 53, 45, 37, 29, 21,
13, 5, 60, 52, 44, 36, 28, 20, 12, 4, 27, 19, 11, 3 };

static unsigned char totrot[16] = {
1,2,4,6,8,10,12,14,15,17,19,21,23,25,27,28 };

static unsigned char pc2[48] = {
13, 16, 10, 23, 0, 4, 2, 27, 14, 5, 20, 9,
22, 18, 11, 3, 25, 7, 15, 6, 26, 19, 12, 1,
40, 51, 30, 36, 46, 54, 29, 39, 50, 44, 32, 47,
43, 48, 38, 55, 33, 52, 45, 41, 49, 35, 28, 31 };
void deskey(key, edf) /* Thanks to James Gillogly & Phil Karn! */
unsigned char *key;
short edf;
{
register int i, j, l, m, n;
unsigned char pc1m[56], pcr[56];
unsigned long kn[32];

for ( j = 0; j < 56; j++ ) {
l = pc1[j];
m = l & 07;
pc1m[j] = (key[l >> 3] & bytebit[m]) ? 1 : 0;
}
for( i = 0; i < 16; i++ ) {
if( edf == DE1 ) m = (15 - i) << 1;
else m = i << 1;
n = m + 1;
kn[m] = kn[n] = 0L;
for( j = 0; j < 28; j++ ) {
l = j + totrot[i];
if( l < 28 ) pcr[j] = pc1m[l];
else pcr[j] = pc1m[l - 28];
}
for( j = 28; j < 56; j++ ) {
l = j + totrot[i];
if( l < 56 ) pcr[j] = pc1m[l];
else pcr[j] = pc1m[l - 28];
}
for( j = 0; j < 24; j++ ) {
if( pcr[pc2[j]] ) kn[m] |= bigbyte[j];
if( pcr[pc2[j+24]] ) kn[n] |= bigbyte[j];
}
}
cookey(kn);
return;
}

static void cookey(raw1)
register unsigned long *raw1;
{
register unsigned long *cook, *raw0;
unsigned long dough[32];
register int i;

cook = dough;
for( i = 0; i < 16; i++, raw1++ ) {
raw0 = raw1++;
*cook = (*raw0 & 0x00fc0000L) << 6;
*cook |= (*raw0 & 0x00000fc0L) << 10;
*cook |= (*raw1 & 0x00fc0000L) >> 10;
*cook++ |= (*raw1 & 0x00000fc0L) >> 6;
*cook = (*raw0 & 0x0003f000L) << 12;
*cook |= (*raw0 & 0x0000003fL) << 16;
*cook |= (*raw1 & 0x0003f000L) >> 4;
*cook++ |= (*raw1 & 0x0000003fL);
}
usekey(dough);
return;
}

void cpkey(into)
register unsigned long *into;
{
register unsigned long *from, *endp;
from = KnL, endp = &KnL[32];
while( from < endp ) *into++ = *from++;
return;
}

void usekey(from)
register unsigned long *from;
{
register unsigned long *to, *endp;
to = KnL, endp = &KnL[32];
while( to < endp ) *to++ = *from++;
return;
}

void des(inblock, outblock)
unsigned char *inblock, *outblock;
{
unsigned long work[2];

scrunch(inblock, work);
desfunc(work, KnL);
unscrun(work, outblock);
return;
}

static void scrunch(outof, into)
register unsigned char *outof;
register unsigned long *into;
{
*into = (*outof++ & 0xffL) << 24;
*into |= (*outof++ & 0xffL) << 16;
*into |= (*outof++ & 0xffL) << 8;
*into++ |= (*outof++ & 0xffL);
*into = (*outof++ & 0xffL) << 24;
*into |= (*outof++ & 0xffL) << 16;
*into |= (*outof++ & 0xffL) << 8;
*into |= (*outof & 0xffL);
return;
}

static void unscrun(outof, into)
register unsigned long *outof;
register unsigned char *into;
{
*into++ = (*outof >> 24) & 0xffL;
*into++ = (*outof >> 16) & 0xffL;
*into++ = (*outof >> 8) & 0xffL;
*into++ = *outof++ & 0xffL;
*into++ = (*outof >> 24) & 0xffL;
*into++ = (*outof >> 16) & 0xffL;
*into++ = (*outof >> 8) & 0xffL;
*into = *outof & 0xffL;
return;
}
static unsigned long SP1[64] = {
0x01010400L, 0x00000000L, 0x00010000L, 0x01010404L,
0x01010004L, 0x00010404L, 0x00000004L, 0x00010000L,
0x00000400L, 0x01010400L, 0x01010404L, 0x00000400L,
0x01000404L, 0x01010004L, 0x01000000L, 0x00000004L,
0x00000404L, 0x01000400L, 0x01000400L, 0x00010400L,
0x00010400L, 0x01010000L, 0x01010000L, 0x01000404L,
0x00010004L, 0x01000004L, 0x01000004L, 0x00010004L,
0x00000000L, 0x00000404L, 0x00010404L, 0x01000000L,
0x00010000L, 0x01010404L, 0x00000004L, 0x01010000L,
0x01010400L, 0x01000000L, 0x01000000L, 0x00000400L,
0x01010004L, 0x00010000L, 0x00010400L, 0x01000004L,
0x00000400L, 0x00000004L, 0x01000404L, 0x00010404L,
0x01010404L, 0x00010004L, 0x01010000L, 0x01000404L,
0x01000004L, 0x00000404L, 0x00010404L, 0x01010400L,
0x00000404L, 0x01000400L, 0x01000400L, 0x00000000L,
0x00010004L, 0x00010400L, 0x00000000L, 0x01010004L };

static unsigned long SP2[64] = {
0x80108020L, 0x80008000L, 0x00008000L, 0x00108020L,
0x00100000L, 0x00000020L, 0x80100020L, 0x80008020L,
0x80000020L, 0x80108020L, 0x80108000L, 0x80000000L,
0x80008000L, 0x00100000L, 0x00000020L, 0x80100020L,
0x00108000L, 0x00100020L, 0x80008020L, 0x00000000L,
0x80000000L, 0x00008000L, 0x00108020L, 0x80100000L,
0x00100020L, 0x80000020L, 0x00000000L, 0x00108000L,
0x00008020L, 0x80108000L, 0x80100000L, 0x00008020L,
0x00000000L, 0x00108020L, 0x80100020L, 0x00100000L,
0x80008020L, 0x80100000L, 0x80108000L, 0x00008000L,
0x80100000L, 0x80008000L, 0x00000020L, 0x80108020L,
0x00108020L, 0x00000020L, 0x00008000L, 0x80000000L,
0x00008020L, 0x80108000L, 0x00100000L, 0x80000020L,
0x00100020L, 0x80008020L, 0x80000020L, 0x00100020L,
0x00108000L, 0x00000000L, 0x80008000L, 0x00008020L,
0x80000000L, 0x80100020L, 0x80108020L, 0x00108000L };

static unsigned long SP3[64] = {
0x00000208L, 0x08020200L, 0x00000000L, 0x08020008L,
0x08000200L, 0x00000000L, 0x00020208L, 0x08000200L,
0x00020008L, 0x08000008L, 0x08000008L, 0x00020000L,
0x08020208L, 0x00020008L, 0x08020000L, 0x00000208L,
0x08000000L, 0x00000008L, 0x08020200L, 0x00000200L,
0x00020200L, 0x08020000L, 0x08020008L, 0x00020208L,
0x08000208L, 0x00020200L, 0x00020000L, 0x08000208L,
0x00000008L, 0x08020208L, 0x00000200L, 0x08000000L,
0x08020200L, 0x08000000L, 0x00020008L, 0x00000208L,
0x00020000L, 0x08020200L, 0x08000200L, 0x00000000L,
0x00000200L, 0x00020008L, 0x08020208L, 0x08000200L,
0x08000008L, 0x00000200L, 0x00000000L, 0x08020008L,
0x08000208L, 0x00020000L, 0x08000000L, 0x08020208L,
0x00000008L, 0x00020208L, 0x00020200L, 0x08000008L,
0x08020000L, 0x08000208L, 0x00000208L, 0x08020000L,
0x00020208L, 0x00000008L, 0x08020008L, 0x00020200L };

static unsigned long SP4[64] = {
0x00802001L, 0x00002081L, 0x00002081L, 0x00000080L,
0x00802080L, 0x00800081L, 0x00800001L, 0x00002001L,
0x00000000L, 0x00802000L, 0x00802000L, 0x00802081L,
0x00000081L, 0x00000000L, 0x00800080L, 0x00800001L,
0x00000001L, 0x00002000L, 0x00800000L, 0x00802001L,
0x00000080L, 0x00800000L, 0x00002001L, 0x00002080L,
0x00800081L, 0x00000001L, 0x00002080L, 0x00800080L,
0x00002000L, 0x00802080L, 0x00802081L, 0x00000081L,
0x00800080L, 0x00800001L, 0x00802000L, 0x00802081L,
0x00000081L, 0x00000000L, 0x00000000L, 0x00802000L,
0x00002080L, 0x00800080L, 0x00800081L, 0x00000001L,
0x00802001L, 0x00002081L, 0x00002081L, 0x00000080L,
0x00802081L, 0x00000081L, 0x00000001L, 0x00002000L,
0x00800001L, 0x00002001L, 0x00802080L, 0x00800081L,
0x00002001L, 0x00002080L, 0x00800000L, 0x00802001L,
0x00000080L, 0x00800000L, 0x00002000L, 0x00802080L };

static unsigned long SP5[64] = {
0x00000100L, 0x02080100L, 0x02080000L, 0x42000100L,
0x00080000L, 0x00000100L, 0x40000000L, 0x02080000L,
0x40080100L, 0x00080000L, 0x02000100L, 0x40080100L,
0x42000100L, 0x42080000L, 0x00080100L, 0x40000000L,
0x02000000L, 0x40080000L, 0x40080000L, 0x00000000L,
0x40000100L, 0x42080100L, 0x42080100L, 0x02000100L,
0x42080000L, 0x40000100L, 0x00000000L, 0x42000000L,
0x02080100L, 0x02000000L, 0x42000000L, 0x00080100L,
0x00080000L, 0x42000100L, 0x00000100L, 0x02000000L,
0x40000000L, 0x02080000L, 0x42000100L, 0x40080100L,
0x02000100L, 0x40000000L, 0x42080000L, 0x02080100L,
0x40080100L, 0x00000100L, 0x02000000L, 0x42080000L,
0x42080100L, 0x00080100L, 0x42000000L, 0x42080100L,
0x02080000L, 0x00000000L, 0x40080000L, 0x42000000L,
0x00080100L, 0x02000100L, 0x40000100L, 0x00080000L,
0x00000000L, 0x40080000L, 0x02080100L, 0x40000100L };

static unsigned long SP6[64] = {
0x20000010L, 0x20400000L, 0x00004000L, 0x20404010L,
0x20400000L, 0x00000010L, 0x20404010L, 0x00400000L,
0x20004000L, 0x00404010L, 0x00400000L, 0x20000010L,
0x00400010L, 0x20004000L, 0x20000000L, 0x00004010L,
0x00000000L, 0x00400010L, 0x20004010L, 0x00004000L,
0x00404000L, 0x20004010L, 0x00000010L, 0x20400010L,
0x20400010L, 0x00000000L, 0x00404010L, 0x20404000L,
0x00004010L, 0x00404000L, 0x20404000L, 0x20000000L,
0x20004000L, 0x00000010L, 0x20400010L, 0x00404000L,
0x20404010L, 0x00400000L, 0x00004010L, 0x20000010L,
0x00400000L, 0x20004000L, 0x20000000L, 0x00004010L,
0x20000010L, 0x20404010L, 0x00404000L, 0x20400000L,
0x00404010L, 0x20404000L, 0x00000000L, 0x20400010L,
0x00000010L, 0x00004000L, 0x20400000L, 0x00404010L,
0x00004000L, 0x00400010L, 0x20004010L, 0x00000000L,
0x20404000L, 0x20000000L, 0x00400010L, 0x20004010L };

static unsigned long SP7[64] = {
0x00200000L, 0x04200002L, 0x04000802L, 0x00000000L,
0x00000800L, 0x04000802L, 0x00200802L, 0x04200800L,
0x04200802L, 0x00200000L, 0x00000000L, 0x04000002L,
0x00000002L, 0x04000000L, 0x04200002L, 0x00000802L,
0x04000800L, 0x00200802L, 0x00200002L, 0x04000800L,
0x04000002L, 0x04200000L, 0x04200800L, 0x00200002L,
0x04200000L, 0x00000800L, 0x00000802L, 0x04200802L,
0x00200800L, 0x00000002L, 0x04000000L, 0x00200800L,
0x04000000L, 0x00200800L, 0x00200000L, 0x04000802L,
0x04000802L, 0x04200002L, 0x04200002L, 0x00000002L,
0x00200002L, 0x04000000L, 0x04000800L, 0x00200000L,
0x04200800L, 0x00000802L, 0x00200802L, 0x04200800L,
0x00000802L, 0x04000002L, 0x04200802L, 0x04200000L,
0x00200800L, 0x00000000L, 0x00000002L, 0x04200802L,
0x00000000L, 0x00200802L, 0x04200000L, 0x00000800L,
0x04000002L, 0x04000800L, 0x00000800L, 0x00200002L };

static unsigned long SP8[64] = {
0x10001040L, 0x00001000L, 0x00040000L, 0x10041040L,
0x10000000L, 0x10001040L, 0x00000040L, 0x10000000L,
0x00040040L, 0x10040000L, 0x10041040L, 0x00041000L,
0x10041000L, 0x00041040L, 0x00001000L, 0x00000040L,
0x10040000L, 0x10000040L, 0x10001000L, 0x00001040L,
0x00041000L, 0x00040040L, 0x10040040L, 0x10041000L,
0x00001040L, 0x00000000L, 0x00000000L, 0x10040040L,
0x10000040L, 0x10001000L, 0x00041040L, 0x00040000L,
0x00041040L, 0x00040000L, 0x10041000L, 0x00001000L,
0x00000040L, 0x10040040L, 0x00001000L, 0x00041040L,
0x10001000L, 0x00000040L, 0x10000040L, 0x10040000L,
0x10040040L, 0x10000000L, 0x00040000L, 0x10001040L,
0x00000000L, 0x10041040L, 0x00040040L, 0x10000040L,
0x10040000L, 0x10001000L, 0x10001040L, 0x00000000L,
0x10041040L, 0x00041000L, 0x00041000L, 0x00001040L,
0x00001040L, 0x00040040L, 0x10000000L, 0x10041000L };

static void desfunc(block, keys)
register unsigned long *block, *keys;
{
register unsigned long fval, work, right, leftt;
register int round;

leftt = block[0];
right = block[1];
work = ((leftt >> 4) ^ right) & 0x0f0f0f0fL;
right ^= work;
leftt ^= (work << 4);
work = ((leftt >> 16) ^ right) & 0x0000ffffL;
right ^= work;
leftt ^= (work << 16);
work = ((right >> 2) ^ leftt) & 0x33333333L;
leftt ^= work;
right ^= (work << 2);
work = ((right >> 8) ^ leftt) & 0x00ff00ffL;
leftt ^= work;
right ^= (work << 8);
right = ((right << 1) | ((right >> 31) & 1L)) & 0xffffffffL;
work = (leftt ^ right) & 0xaaaaaaaaL;
leftt ^= work;
right ^= work;
leftt = ((leftt << 1) | ((leftt >> 31) & 1L)) & 0xffffffffL;

for( round = 0; round < 8; round++ ) {
work = (right << 28) | (right >> 4);
work ^= *keys++;
fval = SP7[ work & 0x3fL];
fval |= SP5[(work >> 8) & 0x3fL];
fval |= SP3[(work >> 16) & 0x3fL];
fval |= SP1[(work >> 24) & 0x3fL];
work = right ^ *keys++;
fval |= SP8[ work & 0x3fL];
fval |= SP6[(work >> 8) & 0x3fL];
fval |= SP4[(work >> 16) & 0x3fL];
fval |= SP2[(work >> 24) & 0x3fL];
leftt ^= fval;
work = (leftt << 28) | (leftt >> 4);
work ^= *keys++;
fval = SP7[ work & 0x3fL];
fval |= SP5[(work >> 8) & 0x3fL];
fval |= SP3[(work >> 16) & 0x3fL];
fval |= SP1[(work >> 24) & 0x3fL];
work = leftt ^ *keys++;
fval |= SP8[ work & 0x3fL];
fval |= SP6[(work >> 8) & 0x3fL];
fval |= SP4[(work >> 16) & 0x3fL];
fval |= SP2[(work >> 24) & 0x3fL];
right ^= fval;
}

right = (right << 31) | (right >> 1);
work = (leftt ^ right) & 0xaaaaaaaaL;
leftt ^= work;
right ^= work;
leftt = (leftt << 31) | (leftt >> 1);
work = ((leftt >> 8) ^ right) & 0x00ff00ffL;
right ^= work;
leftt ^= (work << 8);
work = ((leftt >> 2) ^ right) & 0x33333333L;
right ^= work;
leftt ^= (work << 2);
work = ((right >> 16) ^ leftt) & 0x0000ffffL;
leftt ^= work;
right ^= (work << 16);
work = ((right >> 4) ^ leftt) & 0x0f0f0f0fL;
leftt ^= work;
right ^= (work << 4);
*block++ = right;
*block = leftt;
return;
}
/* Validation sets:
*
* Single-length key, single-length plaintext -
* Key : 0123 4567 89ab cdef
* Plain : 0123 4567 89ab cde7
* Cipher : c957 4425 6a5e d31d
*
**********************************************************************/

void des_key(des_ctx *dc, unsigned char *key){
deskey(key,EN0);
cpkey(dc->ek);
deskey(key,DE1);
cpkey(dc->dk);
}
/* Encrypt several blocks in ECB mode. Caller is responsible for
short blocks. */
void des_enc(des_ctx *dc, unsigned char *data, int blocks){
unsigned long work[2];
int i;
unsigned char *cp;

cp = data;
for(i=0;i<blocks;i++){
scrunch(cp,work);
desfunc(work,dc->ek);
unscrun(work,cp);
cp+=8;
}
}

void des_dec(des_ctx *dc, unsigned char *data, int blocks){
unsigned long work[2];
int i;
unsigned char *cp;

cp = data;
for(i=0;i<blocks;i++){
scrunch(cp,work);
desfunc(work,dc->dk);
unscrun(work,cp);
cp+=8;
}
}
int main(void){
des_ctx dc;
int i;
unsigned long data[10];
unsigned char *cp;
unsigned char key[8] = {0x01,0x23,0x45,0x67,0x89,0xab,0xcd,0xef};
unsigned char x[8] = {0x01,0x23,0x45,0x67,0x89,0xab,0xcd,0xe7};

cp = x;

des_key(&dc,key);
des_enc(&dc,cp,1);
printf("Enc(0..7,0..7) = ");
for(i=0;i<8;i++) printf("%02x ", ((unsigned int) cp[i])&0x00ff);
printf("\n");

des_dec(&dc,cp,1);

printf("Dec(above,0..7) = ");
for(i=0;i<8;i++) printf("%02x ",((unsigned int)cp[i])&0x00ff);
printf("\n");

cp = (unsigned char *) data;
for(i=0;i<10;i++)data[i]=i;

des_enc(&dc,cp,5); /* Enc 5 blocks. */
for(i=0;i<10;i+=2) printf("Block %01d = %08lx %08lx.\n",
i/2,data[i],data[i+1]);

des_dec(&dc,cp,1);
des_dec(&dc,cp+8,4);
for(i=0;i<10;i+=2) printf("Block %01d = %08lx %08lx.\n",
i/2,data[i],data[i+1]);
return 0;
}
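
The test routine above exercises des_enc and des_dec in ECB mode only; the comment before des_enc
leaves chaining and short blocks to the caller. As one illustration of what a caller might do, here is a
minimal CBC encryption sketch layered on the same des_ctx interface. It is not part of the original
listing: the function name, the in-place convention, and the caller-supplied 8-byte IV are assumptions,
and it still handles only whole 8-byte blocks.

/* CBC encryption sketch over the des_ctx interface above (not part of
 * the original listing). Each plaintext block is XORed with the
 * previous ciphertext block (the IV for the first block) before a
 * single-block ECB encryption; 'blocks' whole 8-byte blocks are
 * processed in place. */
void des_cbc_enc(des_ctx *dc, unsigned char *iv, unsigned char *data, int blocks)
{
    int i, j;
    unsigned char *prev = iv; /* chaining value: IV, then previous ciphertext */

    for (i = 0; i < blocks; i++) {
        for (j = 0; j < 8; j++)
            data[j] ^= prev[j];  /* XOR in the chaining value */
        des_enc(dc, data, 1);    /* one-block ECB encryption */
        prev = data;             /* this ciphertext chains forward */
        data += 8;
    }
}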

LOKI91
#include <stdio.h>

#define LOKIBLK 8 /* No of bytes in a LOKI data-block */
#define ROUNDS 16 /* No of LOKI rounds */

typedef unsigned long Long; /* type specification for aligned LOKI blocks */

extern Long lokikey[2]; /* 64-bit key used by LOKI routines */
extern char *loki_lib_ver; /* String with version no. & copyright */

#ifdef __STDC__ /* declare prototypes for library functions */
extern void enloki(char *b);
extern void deloki(char *b);
extern void setlokikey(char key[LOKIBLK]);
#else /* else just declare library functions extern */
extern void enloki(), deloki(), setlokikey();
#endif /* __STDC__ */

char P[32] = {
31, 23, 15, 7, 30, 22, 14, 6,
29, 21, 13, 5, 28, 20, 12, 4,
27, 19, 11, 3, 26, 18, 10, 2,
25, 17, 9, 1, 24, 16, 8, 0
};

typedef struct {
short gen; /* irreducible polynomial used in this field */
short exp; /* exponent used to generate this s function */

} sfn_desc;
sfn_desc sfn[] = {
{ /* 101110111 */ 375, 31}, { /* 101111011 */ 379, 31},
{ /* 110000111 */ 391, 31}, { /* 110001011 */ 395, 31},
{ /* 110001101 */ 397, 31}, { /* 110011111 */ 415, 31},
{ /* 110100011 */ 419, 31}, { /* 110101001 */ 425, 31},
{ /* 110110001 */ 433, 31}, { /* 110111101 */ 445, 31},
{ /* 111000011 */ 451, 31}, { /* 111001111 */ 463, 31},
{ /* 111010111 */ 471, 31}, { /* 111011101 */ 477, 31},
{ /* 111100111 */ 487, 31}, { /* 111110011 */ 499, 31},
{ 00, 00} };

typedef struct {
Long loki_subkeys[ROUNDS];
} loki_ctx;

static Long f(); /* declare LOKI function f */
static short s(); /* declare LOKI S-box fn s */

#define ROL12(b) b = ((b << 12) | (b >> 20));
#define ROL13(b) b = ((b << 13) | (b >> 19));

#ifdef LITTLE_ENDIAN
#define bswap(cb) { \
register char c; \
c = cb[0]; cb[0] = cb[3]; cb[3] = c; \
c = cb[1]; cb[1] = cb[2]; cb[2] = c; \
c = cb[4]; cb[4] = cb[7]; cb[7] = c; \
c = cb[5]; cb[5] = cb[6]; cb[6] = c; \
}
#endif

void
setlokikey(loki_ctx *c, char *key)
{
register i;
register Long KL, KR;
#ifdef LITTLE_ENDIAN
bswap(key); /* swap bytes round if little-endian */
#endif
KL = ((Long *)key)[0];
KR = ((Long *)key)[1];

for (i=0; i<ROUNDS; i+=4) { /* Generate the 16 subkeys */
c->loki_subkeys[i] = KL;
ROL12 (KL);
c->loki_subkeys[i+1] = KL;
ROL13 (KL);
c->loki_subkeys[i+2] = KR;
ROL12 (KR);
c->loki_subkeys[i+3] = KR;
ROL13 (KR);
}

#ifdef LITTLE_ENDIAN
bswap(key); /* swap bytes back if little-endian */
#endif
}
void
enloki (loki_ctx *c, char *b)
{
register i;
register Long L, R; /* left & right data halves */

#ifdef LITTLE_ENDIAN
bswap(b); /* swap bytes round if little-endian */
#endif
L = ((Long *)b)[0];
R = ((Long *)b)[1];

for (i=0; i<ROUNDS; i+=2) { /* Encrypt with the 16 subkeys */
L ^= f (R, c->loki_subkeys[i]);
R ^= f (L, c->loki_subkeys[i+1]);
}

((Long *)b)[0] = R; /* Y = swap(LR) */
((Long *)b)[1] = L;

#ifdef LITTLE_ENDIAN
bswap(b); /* swap bytes round if little-endian */
#endif
}

void
deloki(loki_ctx *c, char *b)
{
register i;
register Long L, R; /* left & right data halves */

#ifdef LITTLE_ENDIAN
bswap(b); /* swap bytes round if little-endian */
#endif

L = ((Long *)b)[0]; /* LR = X XOR K */
R = ((Long *)b)[1];

for (i=ROUNDS; i>0; i-=2) { /* subkeys in reverse order */
L ^= f(R, c->loki_subkeys[i-1]);
R ^= f(L, c->loki_subkeys[i-2]);
}

((Long *)b)[0] = R; /* Y = LR XOR K */
((Long *)b)[1] = L;
}

#define MASK12 0x0fff /* 12 bit mask for expansion E */

static Long
f(r, k)
register Long r; /* Data value R(i-1) */
Long k; /* Key K(i) */
{
Long a, b, c; /* 32 bit S-box output, & P output */
a = r ^ k; /* A = R(i-1) XOR K(i) */

/* want to use slow speed/small size version */
b = ((Long)s((a & MASK12)) ) | /* B = S(E(R(i-1))^K(i)) */
((Long)s(((a >> 8) & MASK12)) << 8) |
((Long)s(((a >> 16) & MASK12)) << 16) |
((Long)s((((a >> 24) | (a << 8)) & MASK12)) << 24);

perm32(&c, &b, P); /* C = P(S( E(R(i-1)) XOR K(i))) */

return(c); /* f returns the result C */
}

static short s(i)
register Long i; /* return S“box value for input i */
{
register short r, c, v, t;
short exp8(); /* exponentiation routine for GF(2^8) */

r = ((i>>8) & 0xc) | (i & 0x3); /* row value - top 2 & bottom 2 */
c = (i>>2) & 0xff; /* column value - middle 8 bits */
t = (c + ((r * 17) ^ 0xff)) & 0xff; /* base value for Sfn */
v = exp8(t, sfn[r].exp, sfn[r].gen); /* Sfn[r] = t ^ exp mod gen */
return(v);
}

#define MSB 0x80000000L /* MSB of 32-bit word */

perm32(out, in, perm)
Long *out; /* Output 32-bit block to be permuted */
Long *in; /* Input 32-bit block after permutation */
char perm[32]; /* Permutation array */
{
Long mask = MSB; /* mask used to set bit in output */
register int i, o, b; /* input bit no, output bit no, value */
register char *p = perm; /* ptr to permutation array */

*out = 0; /* clear output block */
for (o=0; o<32; o++) { /* For each output bit position o */
i =(int)*p++; /* get input bit permuted to output o */
b = (*in >> i) & 01; /* value of input bit i */
if (b) /* If the input bit i is set */
*out |= mask; /* OR in mask to output i */
mask >>= 1; /* Shift mask to next bit */
}
}

#define SIZE 256 /* 256 elements in GF(2^8) */

short mult8(a, b, gen)
short a, b; /* operands for multiply */
short gen; /* irreducible polynomial generating Galois Field */
{
short product = 0; /* result of multiplication */

while(b != 0) { /* while multiplier is non-zero */
if (b & 01)
product ^= a; /* add multiplicand when LSB of multiplier is set */
a <<= 1; /* shift multiplicand one place */
if (a >= SIZE)
a ^= gen; /* and modulo reduce by the field polynomial if needed */
b >>= 1; /* shift multiplier one place */
}
return(product); /* product in GF(2^8) */
}