"The New Cryptography"

Writing is a medium of communication and understanding, but there are times and places when one wants an entirely different function from writing: concealment and deliberate bafflement.

Cryptography, the science of secret writing, is almost as old as writing itself. The hieroglyphics of ancient Egypt were deliberately arcane: both writing and a cypher. Literacy in ancient Egypt was hedged about with daunting difficulty, so as to assure the elite powers of priest and scribe.

Ancient Assyria also used cryptography, including the unique and curious custom of "funerary cryptography." Assyrian tombs sometimes featured odd sets of cryptographic cuneiform symbols. The Assyrian passerby, puzzling out the import of the text, would mutter the syllables aloud, and find himself accidentally uttering a blessing for the dead. Funerary cryptography was a way to steal a prayer from passing strangers.

Julius Caesar lent his name to the famous "Caesar cypher," which he used to secure Roman military and political communications.

Modern cryptographic science is deeply entangled with the science of computing. In 1949, Claude Shannon, the pioneer of information theory, gave cryptography its theoretical foundation by establishing the "entropy" of a message and a formal measurement for the "amount of information" encoded in any stream of digital bits. Shannon's theories brought new power and sophistication to the codebreaker's historic efforts. After Shannon, digital machinery could pore tirelessly and repeatedly over the stream of encrypted gibberish, looking for repetitions, structures, coincidences, any slight variation from the random that could serve as a weak point for attack.
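
A rough sketch of Shannon's measure, in Python (the sample strings here are arbitrary, illustrative choices): ordinary prose scores well below the eight-bits-per-byte ceiling of truly random data, and that shortfall is exactly the kind of statistical toehold the codebreaker's machinery hunts for.

    import math
    import os
    from collections import Counter

    def shannon_entropy(data: bytes) -> float:
        # H = -sum(p * log2(p)), measured in bits per byte.
        counts = Counter(data)
        return -sum((n / len(data)) * math.log2(n / len(data))
                    for n in counts.values())

    print(shannon_entropy(b"attack at dawn attack at dawn"))  # repetitive prose: low
    print(shannon_entropy(os.urandom(100_000)))               # random bytes: close to 8.0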

Computer pioneer Alan Turing, mathematician and proponent of the famous "Turing Test" for artificial intelligence, was a British cryptographer in the 1940s. In World War II, Turing and his colleagues in espionage used electronic machinery to defeat the elaborate mechanical wheels and gearing of the German Enigma code-machine. Britain's secret triumph over Nazi communication security had a very great deal to do with the eventual military triumph of the Allies. Britain's code-breaking triumph further assured that cryptography would remain a state secret and one of the most jealously guarded of all sciences.

After World War II, cryptography became, and has remained, one of the crown jewels of the American national security establishment. In the United States, the science of cryptography became the high-tech demesne of the National Security Agency (NSA), an extremely secretive bureaucracy that President Truman founded by executive order in 1952, one of the chilliest years of the Cold War.

Very little can be said with surety about the NSA. The very existence of the organization was not publicly confirmed until 1962. The first appearance of an NSA director before Congress was in 1975. The NSA is said to be based in Fort Meade, Maryland. It is said to have a budget much larger than that of the CIA, but this is impossible to determine since the budget of the NSA has never been a matter of public record. The NSA is said to be the largest single employer of mathematicians in the world. The NSA is estimated to have about 40,000 employees. The acronym NSA is aptly said to stand for "Never Say Anything."

The NSA almost never says anything publicly. However, the NSA's primary role in the shadow-world of electronic espionage is to protect the communications of the US government, and crack those of the US government's real, imagined, or potential adversaries. Since this list of possible adversaries includes practically everyone, the NSA is determined to defeat every conceivable cryptographic technique. In pursuit of this institutional goal, the NSA labors (in utter secrecy) to crack codes and cyphers and invent its own less breakable ones.

The NSA also tries hard to retard civilian progress in the science of cryptography outside its own walls. The NSA can suppress cryptographic inventions through the little-known but often-used Invention Secrecy Act of 1952, which allows the Commissioner of Patents and Trademarks to withhold patents on certain new inventions and to order that those inventions be kept secret indefinitely, "as the national interest requires." The NSA also seeks to control dissemination of information about cryptography, and to control and shape the flow and direction of civilian scientific research in the field.

Cryptographic devices are formally defined as "munitions" by Title 22 of the United States Code, and are subject to the same import and export restrictions as arms, ammunition and other instruments of warfare. Violation of the International Traffic in Arms Regulations (ITAR) is a criminal affair investigated and administered by the Department of State. It is said that the Department of State relies heavily on NSA expert advice in determining when to investigate and/or criminally prosecute illicit cryptography cases (though this too is impossible to prove).

The "munitions" classification for cryptographic devices applies not only to physical devices such as telephone scramblers, but also to "related technical data" such as software and mathematical encryption algorithms. This specifically includes scientific "information" that can be "exported" in all manner of ways, including simply verbally discussing cryptography techniques out loud. One does not have to go overseas and set up shop to be regarded by the Department of State as a criminal international arms trafficker. The security ban specifically covers disclosing such information to any foreign national anywhere, including within the borders of the United States.

These ITAR restrictions have come into increasingly harsh conflict with the modern realities of global economics and everyday real life in the sciences and academia. Over a third of the grad students in computer science on American campuses are foreign nationals. Strictly applied, ITAR regulations would prevent communication on cryptography, inside an American campus, between faculty and students. Most scientific journals have at least a few foreign subscribers, so an exclusively "domestic" publication about cryptography is also practically impossible. Even writing the data down on a cocktail napkin could be hazardous: the world is full of photocopiers, modems and fax machines, all of them potentially linked to satellites and undersea fiber-optic cables.

In the 1970s and 1980s, the NSA used its surreptitious influence at the National Science Foundation to shape scientific research on cryptography through restricting grants to mathematicians. Scientists reacted mulishly, so in 1978 the Public Cryptography Study Group was founded as an interface between mathematical scientists in civilian life and the cryptographic security establishment. This Group established a series of "voluntary control" measures, the upshot being that papers by civilian researchers would be vetted by the NSA well before any publication.

This was one of the oddest situations in the entire scientific enterprise, but the situation was tolerated for years. Most US civilian cryptographers felt, through patriotic conviction, that it was in the best interests of the United States if the NSA remained far ahead of the curve in cryptographic science. After all, were some other national government's electronic spies to become more advanced than those of the NSA, then American government and military transmissions would be cracked and penetrated. World War II had proven that the consequences of a defeat in the cryptographic arms race could be very dire indeed for the loser.

So the "voluntary restraint" measures worked well for over a decade. Few mathematicians were so enamored of the doctrine of academic freedom that they were prepared to fight the National Security Agency over their supposed right to invent codes that could baffle the US government. In any case, the mathematical cryptography community was a small group without much real political clout, while the NSA was a vast, powerful, well-financed agency unaccountable to the American public, and reputed to possess many deeply shadowed avenues of influence in the corridors of power.

However, as the years rolled on, the electronic exchange of information became a commonplace, and users of computer data became intensely aware of their need for electronic security for their transmissions and data. One answer was physical security -- protect the wiring, keep the physical computers behind a physical lock and key. But as personal computers spread and computer networking grew ever more sophisticated, widespread and complex, this bar-the-door technique became unworkable.

The volume and importance of information transferred over the Internet was increasing by orders of magnitude. But the Internet was a notoriously leaky channel of information -- its packet-switching technology meant that packets of vital information might be dumped into the machines of unknown parties at almost any time. If the Internet itself could not be locked up and made leakproof -- and this was impossible by the nature of the system -- then the only secure solution was to encrypt the message itself, to make that message unusable and unreadable, even if it sometimes fell into improper hands.

Computers outside the Internet were also at risk. Corporate computers faced the threat of computer-intrusion hacking, from bored and reckless teenagers, or from professional snoops and unethical business rivals both inside and outside the company. Electronic espionage, especially industrial espionage, was intensifying. The French secret services were especially bold in this regard, as American computer and aircraft executives found to their dismay as their laptops went missing during Paris air and trade shows. Transatlantic commercial phone calls were routinely tapped by French government spooks seeking commercial advantage for French companies in the computer industry, aviation, and the arms trade. And the French were far from alone when it came to government-supported industrial espionage.

Protection of private civilian data from foreign government spies required that seriously powerful encryption techniques be placed into private hands. Unfortunately, an ability to baffle French spies also means an ability to baffle American spies. This was not good news for the NSA.

By 1993, encryption had become big business. There were one and a half million copies of legal encryption software publicly available, including widely-known and commonly-used personal computer products such as Norton Utilities, Lotus Notes, StuffIt, and several Microsoft products. People all over the world, in every walk of life, were using computer encryption as a matter of course. They were securing hard disks from spies or thieves, protecting certain sections of the family computer from sticky-fingered children, or rendering entire laptops and portables into a solid mess of powerfully-encrypted Sanskrit, so that no stranger could walk off with those accidental but highly personal life-histories that are stored in almost every PowerBook.

People were no longer afraid of encryption. Encryption was no longer secret, obscure, and arcane; encryption was a business tool. Computer users wanted more encryption, faster, sleeker, more advanced, and better.

The real wild-card in the mix, however, was the new cryptography. A new technique arose in the 1970s: public-key cryptography. This was an element the codemasters of World War II and the Cold War had never foreseen.

Public-key cryptography was invented by American civilian researchers Whitfield Diffie and Martin Hellman, who first published their results in 1976.

Conventional classical cryptographic systems, from the Caesar cypher to the Nazi Enigma machine defeated by Alan Turing, require a single key. The sender of the message uses that key to turn his plain text message into cyphertext gibberish. He shares the key secretly with the recipients of the message, who use that same key to turn the cyphertext back into readable plain text.

This is a simple scheme; but if the key is lost to unfriendly forces such as the ingenious Alan Turing, then all is lost. The key must therefore always remain hidden, and it must always be fiercely protected from enemy cryptanalysts. Unfortunately, the more widely that key is distributed, the more likely it is that some user in on the secret will crack or fink. As an additional burden, the key cannot be sent by the same channel as the communications are sent, since the key itself might be picked up by eavesdroppers.
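
The Caesar cypher mentioned earlier is the simplest possible illustration of this single-key scheme. A toy version in Python might look like this (the shift of three is the classical one; everything here is purely illustrative):

    def caesar(text: str, key: int) -> str:
        # Shift every letter 'key' places around the alphabet; the same
        # shared secret, applied in reverse, undoes the encryption.
        out = []
        for ch in text:
            if ch.isalpha():
                base = ord('A') if ch.isupper() else ord('a')
                out.append(chr((ord(ch) - base + key) % 26 + base))
            else:
                out.append(ch)
        return ''.join(out)

    secret = caesar("GAUL IS DIVIDED INTO THREE PARTS", 3)
    print(secret)               # JDXO LV GLYLGHG LQWR WKUHH SDUWV
    print(caesar(secret, -3))   # the recipient, holding the same key, recovers the text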

In the new public-key cryptography, however, there are two keys. The first is a key for writing secret text, the second the key for reading that text. The keys are related to one another through a complex mathematical dependency; they determine one another, but it is mathematically extremely difficult to deduce one key from the other.

The user simply gives away the first key, the "public key," to all and sundry. The public key can even be printed on a business card, or given away in mail or in a public electronic message. Now anyone in the public, any random personage who has the proper (not secret, easily available) cryptographic software, can use that public key to send the user a cyphertext message. However, that message can only be read by using the second key -- the private key, which the user always keeps safely in his own possession.

Obviously, if the private key is lost, all is lost. But only one person knows that private key. That private key is generated in the user's home computer, and is never revealed to anyone but the very person who created it.

To reply to a message, one has to use the public key of the other party. This means that a conversation between two people requires four keys. Before computers, all this key-juggling would have been rather unwieldy, but with computers, the chips and software do all the necessary drudgework and number-crunching.
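
A toy version of that number-crunching, in Python, may make the four-key arrangement concrete. The primes here are absurdly small and the padding that real systems demand is omitted; the names and figures are illustrative only:

    def make_keypair(p: int, q: int, e: int = 17):
        # Toy RSA: the private exponent d satisfies d*e = 1 (mod (p-1)*(q-1)).
        n = p * q
        d = pow(e, -1, (p - 1) * (q - 1))
        return (n, e), (n, d)          # (public key, private key)

    def encrypt(m, public):
        return pow(m, public[1], public[0])

    def decrypt(c, private):
        return pow(c, private[1], private[0])

    # A two-way conversation involves four keys: a public and a private key apiece.
    alice_public, alice_private = make_keypair(61, 53)
    bob_public, bob_private = make_keypair(89, 97)

    message = 65                                    # messages are just numbers here
    to_bob = encrypt(message, bob_public)           # Alice uses Bob's public key...
    assert decrypt(to_bob, bob_private) == message  # ...only Bob's private key reads it

    reply = 42
    to_alice = encrypt(reply, alice_public)         # Bob replies with Alice's public key
    assert decrypt(to_alice, alice_private) == reply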

The public/private dual keys have an interesting alternate application. Instead of the public key, one can use one's private key to encrypt a message. That message can then be read by anyone with the public key, i.e., pretty much everybody, so it is no longer a "secret" message at all. However, that message, even though it is no longer secret, now has a very valuable property: it is authentic. Only the individual holder of the private key could have sent that message.

This authentication power is a crucial aspect of the new cryptography, and may prove to be more socially important than secrecy. Authenticity means that electronic promises can be made, electronic proofs can be established, electronic contracts can be signed, electronic documents can be made tamperproof. Electronic impostors and fraudsters can be foiled and defeated -- and it is possible for someone you have never seen, and will never see, to prove his bona fides through entirely electronic means.
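
Using the same toy key as the sketch above, the authentication trick reduces to a few lines (again purely illustrative; real signature schemes hash the document first and use far larger keys):

    # Toy RSA key: n = 61 * 53 = 3233, public exponent e = 17, private exponent d = 2753.
    n, e, d = 3233, 17, 2753

    document = 1234                        # a document, reduced to a number
    signature = pow(document, d, n)        # "encrypted" with the PRIVATE key

    # Anyone holding the public key can check the result; without d, no forger
    # can produce a signature that survives this test.
    assert pow(signature, e, n) == document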

That means that economic relations can become electronic. Theoretically, it means that digital cash is possible -- that electronic mail, e-mail, can be joined by a strange and powerful new cousin, electronic cash, e-money.

Money that is made out of text -- encrypted text. At first consideration such money doesn't seem possible, since it is so far outside our normal experience. But look at this:

[ASCII-art picture of a US dollar bill]

This parody US banknote made of mere letters and numbers is being circulated in e-mail as an in-joke in network circles. But electronic money, once established, would be no more a joke than any other kind of money. Imagine that you could store a text in your computer and send it to a recipient; and that once gone, it would be gone from your computer forever, and registered infallibly in his. With the proper use of the new encryption and authentication, this is actually possible. Odder yet, it is possible to make the note itself an authentic, usable, fungible, transferable note of genuine economic value, without the identity of its temporary owner ever being made known to anyone. This would be electronic cash -- like normal cash, anonymous -- but unlike normal cash, lightning-fast and global in reach.

There is already a great deal of electronic funds transfer occurring in the modern world, everything from gigantic currency-exchange clearinghouses to the individual's VISA and MASTERCARD bills. However, charge-card funds are not so much "money" per se as a purchase via proof of personal identity. Merchants are willing to take VISA and MASTERCARD payments because they know that they can physically find the owner in short order and, if necessary, force him to pay up in a more conventional fashion. The VISA and MASTERCARD user is considered a good risk because his identity and credit history are known.

VISA and MASTERCARD also have the power to accumulate potentially damaging information about the commercial habits of individuals, for instance, the video stores one patronizes, the bookstores one frequents, the restaurants one dines in, or one's travel habits and one's choice of company.

Digital cash could be very different. With proper protection from the new cryptography, even the world's most powerful governments would be unable to find the owner and user of digital cash. That cash would be secured by a "bank" -- (it needn't be a conventional, legally established bank) -- through the use of an encrypted digital signature from the bank, a signature that neither the payer nor the payee could break.

The bank could register the transaction. The bank would know that the payer had spent the e-money, and the bank could prove that the money had been spent once and only once. But the bank would not know that the payee had gained the money spent by the payer. The bank could track the electronic funds themselves, but not their location or their ownership. The bank would guarantee the worth of the digital cash, but the bank would have no way to tie the transactions together.
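
One well-known way to get the property described here -- a bank signature that cannot be tied back to the payer -- is the "blind signature." A minimal RSA-flavored sketch of the idea, with toy numbers and no pretense of completeness:

    import math

    # The bank's toy RSA key (the same textbook-sized numbers as before).
    n, e, d = 3233, 17, 2753

    note = 1001                     # serial number of a digital banknote
    r = 7                           # the customer's secret blinding factor
    assert math.gcd(r, n) == 1

    blinded = (note * pow(r, e, n)) % n                  # customer blinds the note
    blind_signature = pow(blinded, d, n)                 # bank signs without seeing 'note'
    signature = (blind_signature * pow(r, -1, n)) % n    # customer strips the blinding

    # The unblinded signature checks out as an ordinary bank signature on the
    # note -- yet the bank never saw the note and cannot link it to the customer.
    assert pow(signature, e, n) == note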

The potential therefore exists for a new form of network economics made of nothing but ones and zeroes, placed beyond anyone's control by the very laws of mathematics. Whether this will actually happen is anyone's guess. It seems likely that if it did happen, it would prove extremely difficult to stop.

Public-key cryptography uses prime numbers. It is a swift and simple matter to multiply prime numbers together and obtain a result, but it is an exceedingly difficult matter to take a large number and determine the prime numbers used to produce it. The RSA algorithm, the commonest and best-tested method in public-key cryptography, uses 256-bit and 258-bit prime numbers. These two large primes ("p" and "q") are used to produce two further very large numbers ("d" and "e"), chosen so that (de - 1) is divisible by (p-1) times (q-1). The primes are easy to multiply together, and their product forms part of the public key; but that product is extremely difficult to pull apart mathematically, and without its prime factors there is no practical way to recover the private key.
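
The arithmetic relationship just described can be checked directly with toy-sized numbers (real keys use primes hundreds of digits long):

    p, q = 61, 53                    # the two secret primes (toy-sized)
    n = p * q                        # easy to multiply; hard to factor when large
    phi = (p - 1) * (q - 1)

    e = 17                           # public exponent
    d = pow(e, -1, phi)              # private exponent

    assert (d * e - 1) % phi == 0    # the divisibility property described above
    print(n, e)                      # published to the world
    print(d)                         # kept secret; recovering it means factoring n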

To date, there has been no way to mathematically prove that it is inherently difficult to crack this prime-number cypher. It might be very easy to do if one knew the proper advanced mathematical technique for it, and the clumsy brute-power techniques for prime-number factorization have been improving in recent years. However, mathematicians have been working steadily on prime number factorization problems for many centuries, with few dramatic advances. An advance that could shatter the RSA algorithm would mean an explosive breakthrough across a broad front of mathematical science. This seems intuitively unlikely, so prime-number public keys seem safe and secure for the time being -- as safe and secure as any other form of cryptography short of "the one-time pad." (The one-time pad is a truly unbreakable cypher. Unfortunately it requires a key that is every bit as long as the message, and that key can only be used once. The one-time pad is solid as Gibraltar, but it is not much practical use.)
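
A one-time pad is trivial to write down, which only underscores that its burden is key distribution, not mathematics. A minimal sketch (the pad must be truly random, exactly as long as the message, and never reused):

    import os

    message = b"ATTACK AT DAWN"
    pad = os.urandom(len(message))        # a key as long as the message, used once

    ciphertext = bytes(m ^ k for m, k in zip(message, pad))
    recovered  = bytes(c ^ k for c, k in zip(ciphertext, pad))
    assert recovered == message

    # Without the pad, every plaintext of the same length is equally consistent
    # with the ciphertext -- unbreakable, but hopeless to distribute at scale.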

Prime-number cryptography has another advantage. The difficulty of factorizing numbers becomes drastically worse as the prime numbers become larger. A 56-bit key is, perhaps, not entirely outside the realm of possibility for a nationally supported decryption agency with large banks of dedicated supercomputers and plenty of time on their hands. But a 2,048-bit key would require every computer on the planet to number-crunch for hundreds of centuries.
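
A back-of-the-envelope comparison gives a feel for the gap (the guessing rate is an arbitrary assumption, and exhaustively trying keys is not the same problem as factoring a modulus, but the scale is the point):

    rate = 10 ** 9                         # assume a billion guesses per second
    seconds_per_year = 60 * 60 * 24 * 365

    print((2 ** 56) // rate // seconds_per_year)              # about 2 years of guessing
    print(len(str((2 ** 2048) // rate // seconds_per_year)))  # the answer runs to ~600 digits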

Decrypting a public-keyed message is not so much a case of physical impossibility, as a matter of economics. Each key requires a huge computational effort to break it, and there are already thousands of such keys used by thousands of people. As a further blow against the decryptor, the users can generate new keys easily, and change them at will. This poses dire problems for the professional electronic spy.

The best-known public-key encryption technique, the RSA algorithm, was named after its inventors, Ronald L. Rivest, Adi Shamir and Leonard Adleman. The RSA technique was invented in the United States in the late 1970s (although, as if to spite the International Traffic in Arms Regulations, Shamir himself is an Israeli). The RSA algorithm is patented in the United States by the inventors, and the rights to implement it on American computers are theoretically controlled by an American company known as Public Key Partners. (Due to a patent technicality, the RSA algorithm was not successfully patented overseas.)

In 1991 an amateur encryption enthusiast named Phil Zimmermann wrote a software program called "Pretty Good Privacy" that used the RSA algorithm without permission. Zimmermann gave the program away over the Internet, via modem from his home in Colorado, because of his private conviction that the public had a legitimate need for powerful encryption programs at no cost (and, incidentally, no profit to the inventors of RSA). Since Zimmermann's action, "Pretty Good Privacy" or "PGP" has come into common use for encrypting electronic mail and data, and has won an avid international following. The original PGP program has been extensively improved by other software writers overseas, out of the reach of American patents or the influence of the NSA, and the PGP program is now widely available in almost every country on the planet -- or at least, in all those countries where floppy disks are common household objects.

Zimmermann, however, failed to register as an arms dealer when he wrote the PGP software in his home and made it publicly available. At this writing, Zimmermann is under federal investigation by the Office of Defense Trade Controls at the State Department, and is facing a possible criminal indictment as an arms smuggler. This despite the fact that Zimmermann was not selling anything, but rather giving software away for free. Nor did he voluntarily "export" anything -- rather, people reached in from overseas via Internet links and retrieved Zimmermann's program from the United States under their own power and through their own initiative.

Even more oddly, Zimmermann's program does not use the RSA algorithm exclusively, but also depends on the perfectly legal DES or Data Encryption Standard. The Data Encryption Standard, which uses a 56-bit classical key, is an official federal government cryptographic technique, created by IBM with the expert help of the NSA. It has long been surmised, though not proven, that the NSA can crack DES at will with their legendary banks of Cray supercomputers. Recently a Canadian mathematician, Michael Wiener of Bell-Northern Research, published plans for a DES decryption machine that can purportedly crack 56-bit DES in a matter of hours, through brute force methods. It seems that the US Government's official 56-bit key -- insisted upon, reportedly, by the NSA -- is now too small for serious security uses.

The NSA, and the American law enforcement community generally, are unhappy with the prospect of privately owned and powerfully secure encryption. They acknowledge the need for secure communications, but they insist on the need for police oversight, police wiretapping, and on the overwhelming importance of national security interests and governmental supremacy in the making and breaking of cyphers.

This motive recently led the Clinton Administration to propose the "Clipper Chip," a government-approved encryption device to be placed in telephones, built around a classified algorithm known as "Skipjack." Sets of keys for the Clipper Chip would be placed in escrow with two different government agencies, and when the FBI felt the need to listen in on an encrypted telephone conversation, the FBI would get a warrant from a judge and the keys would be handed over.
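
The escrow arrangement amounts to splitting each chip's key into two shares, neither of which is useful alone. A minimal sketch of one such split (a simple XOR split, purely illustrative; the real Clipper internals are classified):

    import os

    device_key = os.urandom(10)              # the chip's secret key (toy length)

    share_1 = os.urandom(len(device_key))                          # held by agency #1
    share_2 = bytes(k ^ s for k, s in zip(device_key, share_1))    # held by agency #2

    # Either share alone is statistically meaningless; only when a warrant brings
    # the two together can the device key be reconstructed for the wiretap.
    recombined = bytes(a ^ b for a, b in zip(share_1, share_2))
    assert recombined == device_key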

Enthusiasts for private encryption have pointed out a number of difficulties with the Clipper Chip proposal. First of all, it is extremely unlikely that criminals, foreign spies, or terrorists would be foolish enough to use an encryption technique designed by the NSA and approved by the FBI. Second, the main market for encryption is not domestic American use, but international use. Serious business users of serious encryption are far more alarmed by state-supported industrial espionage overseas than they are about the safety of phone calls made inside the United States. They want encryption for communications made overseas to people overseas -- but few foreign business people would buy an encryption technology knowing that the US Government held the exclusive keys.

It is therefore likely that the Clipper Chip could never be successfully exported by American manufacturers of telephone and computer equipment, and therefore it could not be used internationally, which is the primary market for encryption. Machines with a Clipper Chip installed would become commercial white elephants, with no one willing to use them but American cops, American spies, and Americans with nothing to hide.

A third objection is that the Skipjack algorithm has been classified "Secret" by the NSA and is not available for open public testing. Skeptics are very unwilling to settle for a bland assurance from the NSA that the chip and its software are unbreakable except with the official keys.

The resultant controversy was described by Business Week as "Spy Vs Computer Nerd." A subterranean power struggle has broken out over the mastery of cryptographic science, and over basic ownership of the electronic bit-stream.

Much is riding on the outcome.

Will powerful, full-fledged, state-of-the-art encryption belong to individuals, including such unsavory individuals as drug traffickers, child pornographers, black-market criminal banks, tax evaders, software pirates, and the possible future successors of the Nazis?

Or will the NSA and its allies in the cryptographic status-quo somehow succeed in stopping the march of scientific progress in cryptography, and in cramming the commercial crypto-genie back into the bottle? If so, what price will be paid by society, and what damage wreaked on our traditions of free scientific and technical inquiry?

One thing seems certain: cryptography, this most obscure and smothered of mathematical sciences, is out in the open as never before in its long history. Impassioned, radicalized cryptographic enthusiasts, often known as "cypherpunks," are suing the NSA and making it their business to spread knowledge of cryptographic techniques as widely as possible, "through whatever means necessary." Small in number, they nevertheless have daring, ingenuity, and money, and they know very well how to create a public stink. In the meantime, their more conventional suit-and-tie allies in the Software Publishers Association grumble openly that the Clipper Chip is a poorly-conceived fiasco, that cryptographic software is peddled openly all over the planet, and that "the US Government is succeeding only in crippling an American industry's exporting ability."

The NSA confronted the worst that America's adversaries had to offer during the Cold War, and the NSA prevailed. Today, however, the secret masters of cryptography find themselves confronting what are perhaps the two most powerful forces in American society: the computer revolution, and the profit motive. Deeply hidden from the American public through forty years of Cold War terror, the NSA itself is, for the first time, exposed to open question and harrowing reassessment.

Will the NSA quietly give up the struggle, and expire as secretly and silently as it lived its forty-year Cold War existence? Or will this most phantomlike of federal agencies decide to fight for its survival and its scientific pre-eminence?

And if this odd and always-secret agency does choose to fight the new cryptography, then -- how?
