COMPUTERS OF GENERATION OMEGA: August Stern's research on quantum brain sub specie aeternitatis

Vladik Kreinovich
Department of Computer Science, University of Texas at El Paso, El Paso, TX 79968, USA, email vladik@cs.utep.edu

COMPUTER GENERATIONS: WE NEED FASTER AND FASTER COMPUTERS. No matter how fast modern computers are, there are still problems that take too much computational time and thus cannot yet be handled by modern computers. To solve these problems, we must design faster and faster computers. So far, the speed of computers has been doubling roughly every year.

According to special relativity, all velocities are bounded by the speed of light; thus, to make computer elements faster, designers try to decrease the size of these elements. Within each hardware technology, they eventually reach a limit, i.e., the smallest element size that this technology can achieve; after that, to decrease the size further, we need to invent a new technology. Computers that use this new technology are usually said to form a new "generation".

The existing 4th generation computers are based on VLSI technology. At the current speed-up rate, this technology will soon exhaust its potential. Physicists and engineers are therefore working on new technologies for the fifth, sixth, etc., generations of computers. Vague ideas have been proposed for technologies suitable for even further generations; the further the generation, the vaguer the ideas. It is therefore desirable to get a clear view of the computers of very distant future generations. We will call these computers "generation omega", after the notation "omega" for the first infinite ordinal number proposed by Cantor, the founder of set theory (the first consistent theory of infinite objects).

COMPUTER GENERATIONS AND QUANTUM PHYSICS: GENERAL DESCRIPTION. Typically, quantum effects become more and more essential as the size of the studied objects decreases:
* for macro-size objects, quantum effects are rare (e.g., in lasers), small, and difficult to measure;
* in chemistry, which studies molecules, quantum effects are often important;
* for elementary particles, quantum effects are so overwhelming that their non-quantum description is practically impossible.

Therefore, as the size of the computer elements decreases, we need to take quantum effects into consideration to a larger and larger extent. To take these effects into consideration, we must use quantum physics.

In general, a physical theory describes how particles and fields interact in space-time. Therefore, in the ideal quantum physical theory, particles, fields, and space-time structures must all be considered from the quantum viewpoint. In practice, the effects of their quantization are different, so some of these quantum effects can often be neglected:
* the largest quantum effects are related to the objects that have been known and analyzed for the longest time, i.e., particles;
* the next quantum effects are related to newer objects: fields;
* finally, the smallest quantum effects are due to the quantization of space-time itself, whose experimental manifestations are still on the edge of modern observational abilities.

The smaller the objects, the more of these effects we need to consider. At first, we have to use traditional quantum mechanics (also called "first quantization"), in which fields (and space-time structures) are described by non-quantum formulas, but the particles' quantum behavior is taken into consideration. This quantum mechanics describes atoms, quantum chemistry, etc.
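As a concrete illustration of what computing at this first level of quantization looks like, here is a minimal Python sketch of a single quantum bit: a unit vector of two complex amplitudes whose measurement probabilities are given by the Born rule. The function names are illustrative only and do not come from any particular library.

    # Minimal illustration of a "first-quantization" computing unit: a single
    # quantum bit is a unit vector of two complex amplitudes, and measurement
    # probabilities follow the Born rule.  All names here are illustrative.
    import math

    def normalize(state):
        """Scale the amplitude pair so that |a0|^2 + |a1|^2 = 1."""
        norm = math.sqrt(sum(abs(a) ** 2 for a in state))
        return [a / norm for a in state]

    def hadamard(state):
        """Apply the Hadamard gate, turning |0> into an equal superposition."""
        a0, a1 = state
        s = 1 / math.sqrt(2)
        return [s * (a0 + a1), s * (a0 - a1)]

    def measurement_probabilities(state):
        """Born rule: probability of reading 0 or 1 from the quantum bit."""
        return [abs(a) ** 2 for a in state]

    qubit = normalize([1, 0])                  # classical-looking state |0>
    qubit = hadamard(qubit)                    # now a genuine superposition
    print(measurement_probabilities(qubit))    # -> [0.5, 0.5]

Unlike a classical bit, such a unit is not restricted to the two values 0 and 1; it is this extra freedom that the quantum-computing research mentioned below tries to exploit.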
Modern engineering research into "quantum dots" as computer units and modern theoretical research into quantum computing, with its exciting potential ability to solve such hard problems as factoring large integers, are both at this level of quantization. From the practical viewpoint, quantum dots have a huge potential for further miniaturizing computers, so, if this project is successful, the problem of further miniaturization will be taken care of for at least a few decades. However, from the fundamental viewpoint of the more distant future, we need to look further.

AUGUST STERN'S PIONEERING RESEARCH INTO THE COMPUTATIONAL USE OF QUANTUM FIELD THEORY. To describe even smaller objects, we need to use "second quantization", or quantum field theory (QFT), in which both particles and fields are quantized, while the space-time structure is assumed to be classical (non-quantum). Research into the possible use of quantum field theory in computing is currently very far from the practical engineering level, but the theoretical potential of using QFT effects in computing was uncovered by A. Stern (see, e.g., his "Quantum Brain", North Holland, Amsterdam, 1994), who showed that any QFT can be naturally reformulated as a logical calculus. Thus, just as the classical two-valued Boolean logic forms the basis of modern 0-1-based computers, the QFT-related logical calculus can lead to a computer based on "quantum bits". This idea and its potential applications are described in detail in this book. What we want to show in this talk is that the potential of this idea is even greater if we consider it in the framework of the full quantum theory.

AUGUST STERN'S IDEA SUB SPECIE AETERNITATIS. As we consider the computers of further and further generations, in which processing elements get smaller and smaller, we must take into consideration the quantization of space-time as well. In other words, we must consider the effects of full quantization.

In a general (curved) space-time, the maximum possible communication speed (i.e., the speed of light c) is determined by the metric tensor field. In non-quantum theories, this field depends smoothly on the coordinates, and therefore the corresponding maximal speed changes only slightly in space and time (and is practically constant over small areas). Quantization of space-time means, in particular, that the metric tensor field undergoes quantum fluctuations, and, as a result, the actual maximal speed at any given point is randomly larger or randomly smaller than c. Since the average deviation must be 0, this means, roughly speaking, that in half of the cases the maximal possible speed is larger than c, and in half of the cases it is smaller than c.

An object of finite size is influenced by the "average" field in the area that this object occupies. If the object is large enough, the random fluctuations "average out", and the object moves as if in a space where the maximal speed is the macro-world speed of light. However, much smaller objects can actually feel the local fluctuations. Therefore, if such a tiny object moves (and transfers information) at the maximal local speed, and this local speed, due to a fluctuation, is larger than c, then we get a micro-object that, without violating causality, is able to transfer information at a speed v larger than the macro-level speed of light c. The smaller the object, the larger this potential speed v (it can actually become arbitrarily large).
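The averaging argument above can be illustrated numerically. The following sketch is purely schematic: it assumes, for illustration only, that the local maximal speed in each tiny space-time cell is c * (1 + delta) with delta drawn from a zero-mean Gaussian; neither the distribution nor its width comes from any actual quantum-gravity model.

    # Schematic Monte Carlo illustration of the averaging argument above.
    # Assumption (for illustration only): in each tiny space-time cell the
    # local maximal speed is c * (1 + delta), with delta drawn from a
    # zero-mean Gaussian.  The numbers are arbitrary, not physical.
    import random

    C = 299_792_458.0        # macro-level speed of light, m/s
    SIGMA = 0.1              # assumed relative size of the fluctuations
    random.seed(0)

    def local_max_speed():
        """Maximal speed in one tiny cell: randomly above or below c."""
        return C * (1.0 + random.gauss(0.0, SIGMA))

    def effective_speed(num_cells):
        """An object spanning many cells feels only the averaged field."""
        return sum(local_max_speed() for _ in range(num_cells)) / num_cells

    # A large object (many cells): fluctuations average out, speed is ~c.
    print(effective_speed(1_000_000) / C)   # -> very close to 1.0

    # A tiny object (a single cell): in roughly half of the cases the local
    # maximal speed exceeds c, so information could locally travel faster than c.
    samples = [local_max_speed() for _ in range(10_000)]
    print(sum(s > C for s in samples) / len(samples))   # -> roughly 0.5

In this toy model, the smaller the object (the fewer cells it averages over), the more often it sees a local maximal speed above c, in line with the qualitative claim above.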
As a result, we have an UNEXPECTED ADDITIONAL BOOST IN COMPUTER PERFORMANCE:
* we considered smaller and smaller processing elements because the smaller these elements, the faster the computer;
* it turns out that if these elements are small enough that the full quantum theory must be taken into consideration, then not only does their size decrease, but the actual speed of information transfer can also be made larger than the macro-level speed of light; thus, computers become even faster.

Quantum processes that are faster than the macro-level speed of light (i.e., processes violating micro-causality) have appeared in physics for quite some time, but they did not become part of mainstream physics until the late 1980s, when Kip Thorne, the well-known astrophysicist, made them mainstream (see, e.g., his monograph "From black holes to time warps", Norton, N.Y., 1994). Ideas on how to use acausal processes in computations were described in O. M. Kosheleva et al., "What can physics give to constructive mathematics?" (In: "Mathematical logic and mathematical linguistics", Kalinin, 1981, pp. 117-128; in Russian), and in the 1991 Carnegie-Mellon University preprint "Time travel and computing" by H. Moravec.

IS THIS HOW OUR BRAIN WORKS? Since we humans are the result of billions of years of evolution, it is natural to expect that our body actually uses all the physically possible processes that it can benefit from. In particular, since the full quantum theory is beneficial for computing, it is natural to conjecture that our brain actually uses acausal processes. This hypothesis is in line with the known anticipatory abilities of the human unconscious.

CONCLUSION. To make computers faster, we must use smaller and smaller processing elements, and to analyze these elements, we need to take into consideration more and more subtle quantum effects. A. Stern has shown a natural way to use quantum field theory for computing. It turns out that if we take into consideration all quantum effects, then this idea leads to even faster performance.

ACKNOWLEDGMENTS. The author is greatly thankful to the organizers of the August Stern Symposium for financial support, and to Kip Thorne (Caltech) for valuable discussions.