[12.0] Frontiers In Cryptology

v2.3.2 / chapter 12 of 13 / 01 oct 14 / greg goebel

* The general public now has access to very strong ciphers, a fact which has led to political controversy. New, even stronger, ciphers and codebreaking schemes based on the subtle principles of quantum physics are in the works. This chapter discusses the politics of ciphers and quantum cryptology.



* The use of steganography by terrorists underlines the ambiguity that governments feel over the cheap, readily available, powerful encryption tools now in widespread use.

Civil libertarians see the use of strong ciphers as a right of citizens in a free state, and believe that access by the government to the encrypted communications of its citizens could be disastrous if an authoritarian regime came to power. Government agencies, on the other hand, fear that criminals such as terrorists, drug runners, gangsters, and pedophiles will use encrypted communications to carry on their activities without interference from the law.

Unsurprisingly, the debate has been loud and bitter. Phil Zimmermann has defended his release of PGP by citing the use of the product by democratic activists in the Baltic States during the fall of Communism, and by resistance groups fighting the military dictatorship in Myanmar.

Privacy advocates have a particular concern over the worldwide "ECHELON" network operated by the NSA and that agency's counterparts in other nations. ECHELON began with a secret international agreement between the US, Britain, and Canada in 1948, with Australia and New Zealand signing up later. It is now believed to consist of a set of eavesdropping satellites and ground stations that tap into international commercial radio and microwave telecommunications links.

Although ECHELON is secret, it is thought to rely on two types of satellites. The ORION / VORTEX satellites operate in geostationary orbit at an altitude of 36,000 kilometers (22,500 miles) to intercept telecom transmissions being sent over microwave links. There are three ORION / VORTEX satellites in orbit, providing global coverage. TRUMPET satellites operate in lower orbits, and are used to intercept cellphone conversations; usually there are about two in orbit. ECHELON has ground stations in the US, Canada, the UK, Australia, and New Zealand. The massive amounts of data are searched by supercomputers for keywords related to terrorist plots, drug cartel deals, and other national security information. If nothing interesting is found in the archive of communications intercepts, it is discarded after a month. The NSA is believed to be able to easily crack most encrypted commercial traffic, and has the capability to crack much stronger ciphers.

ECHELON has remained mostly hidden during its history, but it finally came to public notice when it was discussed in a report to the European Parliament. Concerns quickly surfaced that the US was using ECHELON to obtain a competitive economic advantage in the world marketplace. The NSA strongly denied the charge, insisting that its intelligence is not provided to the private sector. Privacy advocates point to ECHELON as clear evidence that government interception of private communications is not a hypothetical scenario but an established fact.

On the other side of the coin, government agencies have pointed out that the Japanese Aum Shinrikyo religious group, which conducted a nerve-gas attack on a Tokyo subway in 1995, also used encryption to protect their critical documents. Government authorities report that criminal organizations are unsurprisingly fond of data encryption. Laws have been passed in Singapore and Malaysia requiring citizens who have encrypted files to hand over keys on demand, and the US, Britain, and India have considered similar laws.

* The issue is complicated by the fact that the right to privacy is not as well-established in the law as popularly assumed. Those who assert the rights of individuals to complete privacy may on the other hand maintain that public officials do not have that same right, since such secrecy is not in the public interest. This line of reasoning could be gradually stretched to corporate officials; officials of religious groups; officials of non-profit organizations, such as political and advocacy groups; and back to individuals again.

The US Constitution does not assert a specific right to privacy. When privacy concerns face off with First Amendment freedom-of-speech issues in US courts, privacy usually loses. The Fourth Amendment, which outlaws unreasonable search and seizure, and the Fifth Amendment, which requires the US government to provide compensation for seizure of properties, do have implications for privacy, and the US Supreme Court judged in 1984 that Fifth Amendment protection of property extends to data. However, these protections are in the form of broad principles, not specific laws. The US government's right to wiretapping has been well established, though certain legal safeguards are required. The 1974 Federal Privacy Act restricted the US government to storing only "relevant and necessary" data on citizens, but this is obviously vague. US states have passed their own privacy laws, only to open up loopholes, such as permitting the mandatory drug testing of college athletes.

Advocates of strong ciphers use the weak legal status of privacy as a justification for the use of ciphers, saying in effect: "If we don't protect our privacy ourselves, then who will do it for us?"

Western European countries take a different approach to the issue of individual rights than the US. For example, few of these governments have any stated rights of free public expression comparable to those expressed by the US First Amendment, but in compensation many of these governments have passed privacy laws far stronger than any in the United States. With the gradual movement towards European unification, there has been a comparable movement towards establishing uniform standards of privacy for European citizens. The EU "Common Position" document for the European Parliament proclaims a "fundamental right" to the privacy of personal data. The document also proposes that EU governments not provide data to governments that do not meet such privacy guidelines, and so some European countries have banned provision of marketing or other personal data to the US government and US companies. However, this could be seen more as a trade-protection tactic than a defense of ethical principles.

* The controversy over public access to strong ciphers is closely intertwined with the concept of "key escrow". While certification authorities have been established to record and validate public keys, there is also a need for an organization to record and validate private keys. This provides insurance for an organization in case a private key gets lost, or an angry employee leaves and doesn't bother to tell anyone the key, and so on.

The controversy is over whether the third party is the government or a private organization. The US government came up with a key escrow scheme in 1994, under the designation of the "American Escrowed Encryption Standard". This involved two encryption standards, one named "Clipper" for telephone communications and the other named "Capstone" for computer communications. In this scheme, Alice could buy a phone with a Clipper chip in it, and when she obtained the phone, the government would receive the key for it. The key would be split in half, with each key provided to a different government organization. A law enforcement agency would have to go to both organizations to obtain the full key.

The US government mandated Clipper and Capstone for its own communications, and pressured companies doing business with the government to adopt them as well. Clipper and Capstone still failed to catch on. Critics suggested that the government's desire to escrow cipher keys had about the same level of logic and desirability as would a government effort to escrow keys to the doors of all the homes of citizens. Other governments proved uninterested, and without widespread adoption Clipper and Capstone were useless.

A need for key escrow still remains. The ideal is a "trusted third party (TTP)", which would not be under government control and would have safeguards against improper use of private keys.

* The controversy over strong ciphers appears to have gone quiet for the moment. The push towards government key escrow seems to be dead, and some governments have come to the point of view that, in the face of the extremely insecure nature of the Internet, encryption does much to prevent crime instead of encourage it.

In the spring of 1999, the German government announced that it would help support the development of GnuPG and recommend it to German citizens. The French government has formally scrapped attempts at key escrow. The Canadian government formally announced a "hands-off" policy towards encryption, and in the US the Clinton Administration issued a policy statement on cryptographic export stating: "Americans will remain free to use any encryption system domestically."

In the summer of 2000, the Clinton Administration followed up this move by relaxing the restrictive export controls on cryptographic software that had been the law, much to the relief of US crypto-technology companies. For the time being, the angels seem to be on the side of the privacy advocates.



* Current ciphers such as RSA or AES are extremely difficult to break by brute-force methods. To be sure, there is no such thing as an "unbreakable" cipher, since if a cipher can't be broken by analytical means, there are other links in the chain where it can be attacked. For example, Eve could write a software virus or even a modified version of PGP and slip it unnoticed onto Alice's computer. When Alice tries to encrypt a message, Eve's "stealth" software could obtain the key or the plaintext, and then sneakily send it on to Eve over the Internet later.

Such trickery does not appeal to most cryptanalysts, who would really like to be able to crack a strong cipher such as RSA by analytic means. That means factoring a very large number in some period of time hopefully much shorter than the age of the Universe. There is an approach known as "quantum computing" that might be able to do the trick, if anybody can figure out how to make it work.

Modern computers are based on simple logic operations that are easily understood, though their elaborations may become very complicated. Quantum computing, in contrast, is based on operations that even those who are working on quantum computing not only admit, but insist, cannot be understood at an intuitive level.

Quantum computing is based on quantum physics, the branch of modern physics that deals with the often bizarre laws of nature at the microscale. One of the best-known experiments in quantum physics, an experiment very relevant to quantum computing, is the "two-slit light interference paradox", or "two-slit paradox" for short. This paradox is based on the phenomenon of light interference, which has been understood for over two centuries, and for most of that time did not seem paradoxical at all.

In 1803, the English polymath Thomas Young published a paper on experiments he had conducted on light. In one such experiment, he shined a light onto a screen through two parallel slits, spaced a short distance apart. The strip of light cast through the two slits was not a broad swath of light, but alternated from dark to bright to dark to bright across the swath. This could be explained if light were assumed to be a type of wave phenomenon. The dark positions in the swath occurred where the pathlengths of light from the two slits were such that the light from one was shifted by a half-wavelength from the other, and so the light canceled out. The bright positions in the swath occurred where the pathlengths resulted in the light from the two slits being in the same phase, and so the light from the two slits added up. There had long been a controversy among physicists over whether light was a particle or wave phenomenon. Young's experiments seemed to conclusively prove that it was a wave phenomenon, and light was accepted as a wave for over a century.
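Young's cancellation-and-reinforcement argument can be put in numbers. The sketch below, with assumed values (500-nanometer light, slits 0.1 millimeter apart, a screen 1 meter away, all chosen for illustration), computes the idealized relative intensity as the square of the cosine of (pi times the path difference divided by the wavelength), ignoring the single-slit envelope:

```python
import math

def two_slit_intensity(x, slit_sep, wavelength, screen_dist):
    """Idealized two-slit relative intensity at position x on the screen.
    For small angles the path difference is about slit_sep * x / screen_dist;
    intensity is proportional to cos^2(pi * path_diff / wavelength)."""
    path_diff = slit_sep * x / screen_dist
    return math.cos(math.pi * path_diff / wavelength) ** 2

# Green light (500 nm), slits 0.1 mm apart, screen 1 m away.
wl, d, L = 500e-9, 1e-4, 1.0
for x_mm in (0.0, 1.25, 2.5, 3.75, 5.0):
    i = two_slit_intensity(x_mm * 1e-3, d, wl, L)
    print(f"x = {x_mm:5.2f} mm  relative intensity = {i:.2f}")
```

With these numbers the pattern repeats every 5 millimeters: full brightness where the path difference is a whole wavelength, darkness where it is a half-wavelength.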

However, in the 20th century, light was proven to have behaviors clearly characteristic of particles as well. Quantum physics was able to reconcile, uncomfortably, the mutually conflicting ideas of light as a particle and light as a wave through a concept known as "wave-particle" duality, which said essentially that an experiment on a subatomic particle that was intended to show wavelike properties would show wavelike properties, and an experiment that was intended to show particle properties would show particle properties.

Light is now known to travel as individual particles called "photons". The two-slit interference pattern observed by Thomas Young can be seen as the result of countless photons pouring through the two slits, building up the interference pattern one photon at a time. This is a bit puzzling, since if light were simply a particle the photons would simply pile up in front of each slit, establishing maximum brightness at those locations and fading at locations farther away from the slits. There would be no interference pattern. However, wave-particle duality does allow the wave interference pattern to occur.

What is much more puzzling is that the interference pattern will build up over time even if only one photon is emitted at a time. This sounds perfectly crazy: if only one photon goes through at a time, what could it possibly be interfering with to build up the pattern? Although a photon absolutely has a wavelength, any one photon seems to be a point particle; there's nothing particularly "wavy" about it in itself.

One of the important features of the sub-microscopic world is that a measurement cannot be taken on a particle without seriously affecting it. This is known as the "uncertainty principle". According to traditional quantum physics, by a further implication of the uncertainty principle, nothing can be said about the properties of a particle, or its "state", until a measurement is made on it. This might not seem to be much of a condition, since intuitively it seems obvious that the particle is in some state before we measure it, even though the measurement will alter that state.

Intuition is wrong. Before a measurement is taken, nothing can be said about the properties of the particle; it is "indeterminate". It is not a question of the particle being in one of the particular states available to it before the particle is measured, and not knowing what that state is; the particle exists in all the states at the same time. If no measurement is taken to see which slit the particle goes through, it could go through either slit, and goes through both of them. This principle is called "superposition of states", and the existence of the photon in these multiple states is known as "coherent superposition". Once the photon is measured, the superposition of states "collapses" into a measured event, a process known as "decoherence".

This traditional view of quantum physics, known as the "Copenhagen interpretation", has its critics and a number of alternative interpretations have been proposed. None of them sound much less outrageous and there's no good reason to discuss them here, but all the interpretations agree on the existence of the concepts of quantum indeterminacy and superposition of states. These notions are observables, not arbitrary theoretical contraptions, with the interpretations simply representing a struggle to explain the observed reality. When a student told famed American physicist Richard P. Feynman that he didn't believe in superposition of states, Feynman replied: "Well, go do the experiments until you do believe it."

* Just because superposition of states violates common-sense logic does not mean that it's useless. In 1985, following up on hints provided by Feynman a few years earlier, a British physicist named David Deutsch of the University of Oxford published a paper that suggested that superposition of states could be put to use in a "quantum computer".

A conventional computer encodes a bit, a 1 or 0, as the presence or absence of a voltage. A quantum computer, in contrast, uses a quantum-physical property of a particle to encode a 1 or 0, for example the spin axis of an electron (which is either "up" or "down" but nothing in between) or the polarization of a photon. In either case, calculations will be performed on a group of bits, say 32 bits, at one time. 32 bits can be arranged in a total of 4,294,967,296 different ways, allowing them to encode that number of different numbers. Suppose a conventional computer were to perform a calculation requiring a test of all possible values that can be represented by 32 bits. It would have to loop through every single value, one at a time.

In contrast, if a 32-bit value were represented by the spin states of 32 electrons or the polarization states of 32 photons, by the principle of superposition of states, all possible 32-bit values would be present at the same time! A quantum computer operating on these 32 particles could test all 4,294,967,296 values in a single calculation, without going in a loop.

While adding another bit doubles the number of loops for a conventional computer, even if another bit is added to the calculation in a quantum computer, the calculation would still be done at one time and would take no longer than before. The hardware would have to get bigger, but the computation time would remain the same.
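The doubling described above can be made concrete with a minimal sketch of the conventional case (the `is_target` predicate is a made-up stand-in for whatever test the calculation performs):

```python
def brute_force_count(n_bits, predicate):
    """Count the loop iterations a conventional computer needs to
    test every possible n-bit value, one at a time."""
    iterations = 0
    for value in range(2 ** n_bits):
        iterations += 1
        predicate(value)  # the test applied to each candidate value
    return iterations

is_target = lambda v: v == 42   # an arbitrary illustrative test

for n in (8, 9, 10):
    print(f"{n} bits -> {brute_force_count(n, is_target)} iterations")
```

Each added bit doubles the loop count; a quantum computer, in principle, would handle all the candidate values in superposition in one pass.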

Since the use of the spin state of an electron or polarization state of a photon to represent a bit has vastly different properties from the use of a voltage level to represent a bit, in a quantum computer the bits are referred to as "quantum bits" or "qubits".

* This was a completely mind-boggling idea. It was also a purely abstract proposal, similar to Diffie's conceptual invention of public-key cryptography. Deutsch was in a worse situation, however, since not only did he not know how to implement a quantum computer, he didn't really know what could be specifically done with such a technology even if it existed. All he had was a marvelous "what if?" scenario. For this reason, quantum computing remained little more than an exotic theoretical toy for almost a decade. Researchers figured out examples of tasks that a quantum computer could execute, but these tasks were trivial and of little practical value.

That changed in 1994, when Peter Shor of the AT&T Bell Laboratories in New Jersey came up with a practical algorithm that could be implemented with a quantum computer: fast factoring of large numbers. Two years later, his colleague Lov Grover, working from Shor's research, came up with a second practical algorithm: a fast technique for searching for the location of a specific value in an unsorted list of values. Grover announced an even more efficient scheme in the summer of 2000.

Neither of these algorithms actually got a result in a single step using a quantum computer, but they were far more efficient than comparable calculations performed on a conventional computer. For example, while the average number of searches to find a specific entry in a list of N items is no better than N/2 with a conventional sequential search, with Grover's scheme it is on the order of SQRT(N). Cryptographers quickly realized that the first two practical algorithms to be devised for a quantum computer were not merely useful to cryptanalysis but of central importance to it. A fast factoring algorithm could crack RSA. A fast value-search algorithm could crack DES and other block ciphers. Interest in quantum computing increased significantly.
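The gap between N/2 and SQRT(N) grows enormous for large lists. The sketch below compares the two estimates, using the commonly cited figure of about (pi/4) times SQRT(N) iterations for Grover's search:

```python
import math

def classical_expected_searches(n):
    """Average number of sequential probes to find one marked item among n."""
    return n / 2

def grover_iterations(n):
    """Grover's algorithm needs on the order of sqrt(n) quantum queries;
    the usual estimate is about (pi/4) * sqrt(n) iterations."""
    return (math.pi / 4) * math.sqrt(n)

for n in (10**4, 10**8, 10**12):
    print(f"N = {n:>15,}: classical ~{classical_expected_searches(n):>15,.0f}, "
          f"Grover ~{grover_iterations(n):>10,.0f}")
```

For a trillion-entry list, half a trillion classical probes shrink to under a million quantum iterations.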

* A quantum computer is very high on the wishlist of cryptanalysts and the black organizations that hire them. However, nobody's figured out how to build a quantum computer yet. Controlling individual particles or small groups of particles is very difficult, and to make matters worse the qubits have to be isolated from the environment to achieve a superposition of states. This problem gets more difficult as more qubits are used at one time.

Researchers are painfully nailing down the technological pieces needed to build a quantum computer. Much recent research has focused on using atoms or molecules to implement qubit storage. For example, in 2000 a team from IBM and Stanford University announced that they had developed a quantum computer that used a five-atom fluorine molecule to perform a two-qubit quantum calculation. The qubit value was encoded by the spin states of the atoms in the molecule, with the spin states ultimately sensed by nuclear magnetic resonance (NMR) measurements. Two of the atoms were used for qubit storage, while the other three were used to evaluate the operation.

An actual quantum computer will require thousands of qubits to be useful, including a large number of bits simply for error correction. Nobody believes that the approach of sensing the spin states of molecules with NMR is useful for much more than demonstrating that quantum computing is possible, but other schemes are being investigated, based on semiconductor micromechanical systems implementing sets of single-ion traps; single photons fed through optical elements; single-electron transistors; oscillations of current across superconducting junction elements; or manipulation of magnetic flux states in superconducting loops.

So far, all demonstrations of quantum computing techniques have been just that, demonstrations, and a long way from practical technology. Some critics have compared quantum computing to "cold fusion", suggesting it has very little basis in reality. Others have more generously suggested it's more like "hot fusion", meaning that it's possible in principle, but there's no reason to believe that it will ever be practical.



* Although quantum technology may provide means to crack any existing cipher, it may also be used to produce a cipher that even a quantum computer can't crack.

In the late 1960s Stephen Weisner, at that time a graduate student at Columbia University, came up with an unusual idea for "quantum money" that couldn't be counterfeited. Weisner's quantum money was based on the physics of light photons. One of the ways in which light is wavelike is that it can be "polarized": it has a wave motion at a right angle to its direction of travel. Relative to the axis of travel, that wave motion could be oriented vertically, horizontally, or any direction in between. A "Polaroid" lens can be used to screen for light of a given polarization. Such lenses are made of plastic with all the polymer molecules oriented in the same direction. If they are all oriented vertically, photons with vertical polarization can pass through, but photons with horizontal polarization cannot.

To explain Weisner's quantum money, let's assume that we have filtered light so that we end up with photons polarized in only four directions: vertically, horizontally, +45 degrees, and -45 degrees. We'll designate these four polarizations as "V", "H", "P" (plus), and "M" (minus).

Suppose we try to send photons with these four polarizations through a vertically-polarized Polaroid lens. A V photon will pass through and an H photon will be blocked, as expected. However, the concept of quantum indeterminacy applies to the polarization of photons. What this means in practice is that a P photon has a 50:50 chance of passing through or being blocked, as does an M photon; if a P or M photon does pass through the vertically polarized lens, it will be vertically polarized and will be indistinguishable from a V photon. Similarly, if a P or M photon is passed through a horizontally polarized lens, it will have a 50:50 chance of being blocked; if it is passed through, it will be horizontally polarized and will be indistinguishable from an H photon.
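This behavior can be sketched with a simple simulation. The pass probabilities below follow Malus's law (the square of the cosine of the angle between the photon's polarization and the filter axis), which gives exactly the 1, 0, and 50:50 outcomes described above for V, H, and the two diagonal polarizations:

```python
import random

# Probability that each polarization passes a vertically oriented filter:
# V is aligned (cos^2 0 = 1), H is crossed (cos^2 90 = 0), and the
# diagonal P and M photons are at 45 degrees (cos^2 45 = 0.5).
PASS_PROB_VERTICAL = {"V": 1.0, "H": 0.0, "P": 0.5, "M": 0.5}

def measure_vertical(polarization, rng):
    """Return True if the photon passes the vertical filter; a photon
    that passes is now vertically polarized, whatever it was before."""
    return rng.random() < PASS_PROB_VERTICAL[polarization]

rng = random.Random(1)
trials = 100_000
for pol in "VHPM":
    passed = sum(measure_vertical(pol, rng) for _ in range(trials))
    print(f"{pol}: fraction passing the vertical filter = {passed / trials:.3f}")
```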

Weisner's quantum money, as he envisioned it, included 20 "light traps", which were devices that could capture and store a photon. Each trap could be loaded with a single photon with V, H, P, or M polarization. The polarization sequence of the 20 photons would be unique for each bill. Each bill would also have a printed serial number, and a record could be kept of serial numbers and the corresponding polarization sequence loaded into each and every bill. To forge the bill, a counterfeiter would have to copy a legitimate serial number, which is trivial, and copy the polarization pattern for the bill with that serial number, which is not.

The problem arises because of the ambiguity in reading photon polarizations. If the counterfeiter uses a vertically polarized filter, it would pass V photons and block H photons. But what about the P and M photons? On the average, half the time they would be passed, with the counterfeiter interpreting them as V photons, and the other half of the time they would be blocked, with the counterfeiter interpreting them as H photons. The counterfeiter could use a Polaroid lens with an orientation of +45 degrees to discriminate between P and M photons, but then will be confounded by the V and H photons. Since the photons can only be released from the traps to be measured once, the counterfeiter cannot use a trial-and-error method to determine the polarization sequence.

The bank can reliably read the polarization sequence because the information describing the sequence has been recorded and so the proper filters can be used. To be sure, there is some ambiguity even for the bank, since M and P photons could be misinterpreted as V and H photons or the reverse, but the likelihood of properly testing all 20 photons by accident is very small. Of course, the bill has to be "reinitialized" with the recorded polarization sequence after it has been read, since the original is destroyed by reading it.
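The counterfeiter's odds can be put in rough numbers. Assuming the counterfeiter guesses the measurement basis (rectilinear or diagonal) for each photon, each forged photon passes the bank's check with probability 3/4: a certain copy when the basis guess is right, and a lucky 50:50 collapse when it is wrong. Over 20 light traps:

```python
# Per-photon probability that a forged photon survives the bank's check:
# 1/2 (right basis, exact copy) + 1/2 * 1/2 (wrong basis, lucky collapse).
per_photon_pass = 0.5 * 1.0 + 0.5 * 0.5   # = 0.75

n_traps = 20
forgery_success = per_photon_pass ** n_traps
print(f"Probability all {n_traps} photons check out: {forgery_success:.6f}")
```

So a basis-guessing counterfeiter fools the bank only about 0.3 percent of the time, and adding more light traps drives the odds down exponentially.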

* Even Weisner didn't think that quantum money was a practical idea; he just found it an interesting theoretical toy. Nobody else was interested, and all the papers he wrote on it were rejected by scientific publications.

One person who did find it interesting was his friend Charles Bennett, who kept tinkering with the idea over a period of several years. In the early 1980s, when Bennett was working for IBM, he got to talking with a computer scientist from the University of Montreal named Gilles Brassard, and the two realized that the quantum money concept had applications in cryptography. In particular, it offered the possibility of a cipher that could not be broken analytically.

Suppose Alice wants to send Bob an enciphered message in binary code. She could transmit each bit of the message using the polarization of photons: a V photon could represent a 1, while an H photon could represent a 0. Alternatively, she could use a P photon to send a 1, and an M photon to send a 0.

For purposes of this argument, we can call the V-H scheme the "rectilinear" scheme and designate it with a "+", and we can call the P-M scheme the "diagonal" scheme and designate it with an "x". Alice, being devious, decides to use the rectilinear scheme and the diagonal scheme interchangeably, at random. For example, if she wants to send the binary string "10111001110001", she could send it like this:

   1 0 1 1 1 0 0 1 1 1 0 0 0 1
   + + x + x x + + x + + x + x
   V H P V P M H V P V H M H P

What makes this devious is that if Eve tries to intercept the message, just as with the quantum-money counterfeiter, she has no way of knowing what polarization filter to use to sense the polarization of a particular bit of the message, and she will get the wrong bit value about half the time. Eve simply can't read the message.

Of course, Bob can read the message reliably if he knows the pattern of polarization schemes that Alice is using. However, this leads straight back to the classic key-distribution problem, with Alice trying to think of a secure way to get the pattern of polarization schemes to Bob. Alice and Bob could use RSA to transfer the key, but then we would have a scheme that would be no more and no less secure than RSA. Bennett and Brassard chewed on this problem at length and finally came up with an answer in 1984. The result was quantum cryptography.

* A quantum cryptography session involves a set of preliminary initialization steps:

STEP 1: Alice sends Bob a purely random sequence of a given number of bits, using a purely random sequence of the two polarization schemes.

STEP 2: Bob has no idea what bits are being sent and no idea which of the two polarization schemes are being used for each bit. On his end, he tries the two polarization filters at random, sometimes getting the correct result, sometimes not.

STEP 3: Now Alice phones Bob, on an ordinary insecure line, and tells him what sequence of polarization schemes she used, but not what the bit values were. Bob then replies and tells Alice which of the bits he read using the correct filter, meaning he got the correct values, but does not tell her what those results were.

STEP 4: The actual values of the bits that were read correctly can then be used as a completely unbreakable one-time pad cipher key. Alice's message can then be sent, using one or the other of the polarization schemes but not both at random, and will be completely secure.

STEP 5: Eve, the eavesdropper, is in the same position of using randomly-chosen filters to read Alice's sequence, but the likelihood of Eve using the same sequence of filters as Bob is vanishingly small. Eve cannot obtain the full cipher key by intercepting the transmission.

To illustrate, using an unrealistically short binary sequence:

   Alice's random bit sequence:   1 0 0 0 1 1 0 1 1 0 0 1 0 0 0 0 1 0 1 1 0
   Alice's random polarizations:  + + + + x x + + + x x + x + + x x + x + x
   Bob's random filters:          x + + x + x + + x x x + x x x + + + x + +
   Correctly read bits:           - 0 0 - - 1 0 1 - 0 0 1 0 - - - - 0 1 1 -

   Resulting key sequence:        001010010011

   Eve's random filters:          + x x + + x + x + + + x + x x x + + x + +
   Correctly read bits:           1 - - 0 - 1 0 - 1 - - - - - - 0 - 0 1 1 -

   Eve's reconstruction of key:   ??10?????011
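The sifting in steps 1 through 3 can be sketched as a small simulation; the function and variable names below are illustrative, not from any real library, and the photon transmission is idealized:

```python
import random

BASES = "+x"   # "+" = rectilinear (V/H), "x" = diagonal (P/M)

def bb84_sift(n_photons, rng):
    """Steps 1-3: Alice sends random bits in random bases, Bob measures
    in random bases, and they keep only the bits where bases matched."""
    alice_bits  = [rng.randint(0, 1) for _ in range(n_photons)]
    alice_bases = [rng.choice(BASES) for _ in range(n_photons)]
    bob_bases   = [rng.choice(BASES) for _ in range(n_photons)]
    # When the bases match, Bob reads Alice's bit correctly; when they
    # don't, his result is random and the position is discarded.
    return [bit for bit, a, b in zip(alice_bits, alice_bases, bob_bases)
            if a == b]

rng = random.Random(42)
key = bb84_sift(32, rng)
print("sifted key:", "".join(map(str, key)))
print(f"kept {len(key)} of 32 bits (about half, on average)")
```

On average half the positions survive sifting, which is why Alice must send roughly twice as many photons as the key bits she wants.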

Eve is actually in a worse position than this scenario shows. Since only single photons are being sent between Alice and Bob, if Eve intercepts them she will affect their polarization, for example converting P photons to V photons, and this will corrupt Alice and Bob's encryption. If Bob tries to decrypt the message and gets gibberish, he knows someone's been listening in.

Since the measurement of photon polarizations is tricky, it is possible that Bob will make errors in measurement. Bob can perform a little error-checking by actually telling Alice some subset of the bit values he received, with Alice confirming that they matched. If they don't, they construct another key. If they do, they simply discard the bits Bob told Alice, since those bits were disclosed over the open line, and use the remainder.

For example, they could build a key of 1,100 bits, use 100 of those bits as an error check, and then use the remaining 1,000 bits as a key. An alternative that would not throw away bits and would be as easy to do would be for Alice to simply tell Bob what the "parity" of a number of bits selected from the key would be, or in other words indicate if there were an even or odd number of "1" bits in that random selection. Less than 20 such parity comparisons would be enough to ensure that the key hadn't been tampered with by eavesdropping.
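The parity scheme can be sketched as follows. A single tampered or misread bit flips the parity of a randomly chosen subset half the time, so each comparison that passes halves the chance of an undetected error; 20 comparisons leave odds of about one in a million (the helper names here are illustrative):

```python
import random

def parity(bits, indices):
    """Parity of the selected key bits: 0 if even, 1 if odd."""
    return sum(bits[i] for i in indices) % 2

def keys_agree(alice_key, bob_key, n_checks, rng):
    """Compare parities of random subsets; a single differing bit
    falls inside a given subset (flipping its parity) half the time."""
    n = len(alice_key)
    for _ in range(n_checks):
        subset = [i for i in range(n) if rng.random() < 0.5]
        if parity(alice_key, subset) != parity(bob_key, subset):
            return False
    return True

rng = random.Random(7)
alice = [rng.randint(0, 1) for _ in range(1000)]
bob_bad = list(alice)
bob_bad[123] ^= 1   # one corrupted bit, as if Eve had intercepted it
print("identical keys agree: ", keys_agree(alice, list(alice), 20, rng))
print("tampered key agrees:  ", keys_agree(alice, bob_bad, 20, rng))
```

Unlike disclosing bit values outright, the parity comparisons reveal almost nothing useful about the key itself, so no key bits need to be thrown away.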

Quantum cryptography promised to make the absolutely secure one-time pad cipher useful for common use. Cipher users could produce an absolutely random keystream for every individual message, and communicate it between each other with little fear of interception.

* The cryptography community was fascinated by the idea of quantum cryptography, but many doubted it could be made to work in practice. However, in 1988, Bennett and one of his students, John Smolin, managed to send a quantum-encrypted message between two personal computers, appropriately named "Alice" and "Bob", and read it properly. The distance between the two computers was only 30 centimeters (a foot), but other researchers have been steadily increasing the distance between transmitter and receiver.

By late 2008, the technology had advanced to the level where a quantum cryptography system had been implemented as a demonstration in Vienna, Austria, with six sites linked over the city's fiber-optic communications network. However, so far none of the systems have been perfectly secure. Sending and receiving single photons is very difficult, so current technologies use pulses of polarized photons instead. This opens the possibility that Eve could use a half-silvered mirror, set at an angle to the line of transmission of the pulse, to divert part of the pulse and let the rest through. Eve could then test the polarization of the diverted portion of the pulse without disturbing the portion of the pulse that went through the mirror. The probability that Eve can successfully sample the pulse in this way increases with the size of the pulse, and at some size of pulse, the security of the system is lost completely. On the other hand, reducing the size of the pulse reduces the range over which it can be sent.

Progress is being made towards development of a true quantum cryptography system. A number of firms have developed "light-emitting diodes (LEDs)" that can emit single photons, providing a critical element for a quantum cryptography system. It is unclear how far a single photon might be transmitted, but the length of a system could be multiplied by adding "repeaters" at intervals. The photon itself can't be regenerated by a repeater without measuring it and corrupting the quantum cryptography scheme, but the message could be securely enciphered and deciphered between each set of repeaters, with the full transmission consisting of a chain of such sessions. Work is now underway towards development of experimental communications satellites featuring quantum encryption technology, in hopes of creating a long-distance quantum-encrypted communications system.
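The repeater idea -- a chain of hop-by-hop sessions rather than one end-to-end photon -- can be sketched classically (my own illustration; the XOR one-time pad stands in for each quantum-distributed link key, and the repeaters must themselves be trusted, since each one briefly sees the plaintext):

```python
import secrets

def xor(data, key):
    """One-time-pad encipherment: XOR message bytes with key bytes."""
    return bytes(d ^ k for d, k in zip(data, key))

def relay(message, hop_keys):
    """Pass a message down a chain of trusted repeaters.

    hop_keys[i] is the pad shared (say, by quantum key distribution)
    between nodes i and i+1.  Each repeater deciphers with its inbound key
    and re-enciphers with its outbound key, so the message is never
    exposed on any link -- only inside the repeaters themselves.
    """
    ciphertext = xor(message, hop_keys[0])        # sender -> first link
    for inbound, outbound in zip(hop_keys, hop_keys[1:]):
        plaintext = xor(ciphertext, inbound)      # decipher at the repeater
        ciphertext = xor(plaintext, outbound)     # re-encipher for next link
    return xor(ciphertext, hop_keys[-1])          # receiver deciphers

message = b"attack at dawn"
links = [secrets.token_bytes(len(message)) for _ in range(4)]  # 3 repeaters
assert relay(message, links) == message
```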

* Other approaches are being considered for quantum cryptography. An alternate scheme, based on what is known as "quantum entanglement (QE)", has been proposed by Artur Ekert of Oxford University in the UK.

QE is based on a quantum-physical quandary, the "Einstein-Podolsky-Rosen (EPR) paradox", which has been known since the 1930s. The EPR paradox imagines an event that generates two photons with, say, polarizations at right angles to each other, and sends them off in opposite directions. In quantum physics terms, the two photons are said to be "entangled". Entangled photons with a right-angle relationship can be generated, for example, by shining a laser beam through certain types of optically-active crystals.

However, according to quantum indeterminacy, until the polarization of at least one of the photons is measured, the polarization of both photons remains indeterminate. Once the polarization of one photon is measured and "resolved", then the polarization of the other "entangled" photon is also implicitly resolved, even though it is far away by that time.

Now suppose Alice generates pairs of entangled photons, storing her half of the pairs in some kind of optical "trap" and sending the other half to Bob, who also stores his in an optical trap. When Alice wants to send an enciphered message to Bob, she calls him, and the two then perform measurements on the trapped photons with filters, much as with the quantum cryptography scheme outlined previously. The beauty of the scheme is that if Eve gets her hands on any of the entangled photons they do her no good, since she can't duplicate them without resolving them and making them absolutely useless for the procedure. They can be delivered to Bob or to Eve, but not both. Alice and Bob can even use the same filter sequence -- all "+" for example, or all "x" -- and eliminate the guesswork, at least as long as Alice is certain she is talking with Bob.
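The matched-filter case can be mimicked with a purely classical toy model (a sketch of my own; it only illustrates the bookkeeping, since no classical program can reproduce the actual quantum behavior, and the function name and fixed "+" filter for Bob are assumptions). With identical filters the right-angle entanglement forces opposite results, so Bob inverts his bits to recover Alice's; mismatched filters give him nothing better than chance on those pairs:

```python
import random

def agreement_rate(n=4_000, same_filters=True, seed=5):
    """Fraction of pairs where Bob's inverted bit matches Alice's bit.

    Each entangled pair is modeled as unresolved until Alice measures it.
    If Bob uses the same filter, the right-angle relationship forces the
    opposite result; with a mismatched filter his result is random.
    """
    rng = random.Random(seed)
    matches = 0
    for _ in range(n):
        alice_filter = "+" if same_filters else rng.choice("+x")
        alice_bit = rng.randint(0, 1)           # resolved at measurement
        if alice_filter == "+":                 # Bob always measures with "+"
            bob_bit = 1 - alice_bit             # entanglement: opposite result
        else:
            bob_bit = rng.randint(0, 1)         # mismatch: pure chance
        matches += (alice_bit == 1 - bob_bit)   # Bob inverts his reading
    return matches / n
```

With same_filters=True the agreement is perfect and the whole batch becomes shared key, with no guesswork to sift out; letting Alice's filter wander cuts the useful fraction, which is why the pre-agreed filter sequence is attractive -- provided Alice is certain she is really talking to Bob.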

Some starflight enthusiasts have jumped to the conclusion that QE permits instantaneous communications. In fact, Einstein and his colleagues came up with the EPR paradox precisely to show how absurd quantum physics is, since instantaneous communications are impossible in terms of relativistic physics. The joke was on them, of course, since the EPR paradox turned out to be provably true, but the joke cut both ways. Without ordinary, slower-than-light communications between Alice and Bob about the proper filter orientations, there's no way to get a useful interpretation of the results of their measurements. Quantum physics has a quirky tendency to cheat on classical physics, but in a way that can't really be detected or used, at least directly. The Creator does have a sense of humor.

* Or, in other words, what quantum physics gives, quantum physics takes away, and this applies at a higher level as well. Quantum computing will be able to crack any existing cipher, but quantum cryptography will result in ciphers that even quantum computers won't be able to crack. A question that will remain when quantum computing and cryptography become practical: Is the cryptographic game over? Or will it just move on to a new and even more devious level?