Journal of Quantum Information Science
Vol.07 No.04(2017), Article ID:81046,5 pages
10.4236/jqis.2017.74012

Physical Limits of Computation

Tibor Guba1, László Nánai2*, Thomas F. George3

1Department of Optics and Quantum Electronics, University of Szeged, Szeged, Hungary

2Department of Experimental Physics, University of Szeged, Szeged, Hungary

3Department of Chemistry and Physics, University of Missouri-St. Louis, St. Louis, MO, USA

Copyright © 2017 by authors and Scientific Research Publishing Inc.

This work is licensed under the Creative Commons Attribution International License (CC BY 4.0).

http://creativecommons.org/licenses/by/4.0/

Received: June 8, 2017; Accepted: December 11, 2017; Published: December 14, 2017

ABSTRACT

This paper presents a theoretical treatment of the physical limits of computation. Starting from the minimum energy per bit, the power-delay product, the Shannon entropy and the Heisenberg uncertainty principle, we argue that generating one bit of information requires an energy of about kTln(2).

Keywords:

Computation, Power-Delay Product, Entropy, SNL Limit

1. Introduction

Current computation technology is based on binary “thinking” and semiconductor hardware. The former has a very well-developed and working theory and is thus expected to dominate informatics for some time. The scientific community is working on several solutions to replace the latter, which for the past several decades has consisted mainly of CMOS (complementary metal-oxide-semiconductor) architecture [1]. This research is necessary because the currently available devices are slowly approaching their limits. But for how long can we keep increasing our computers’ performance by replacing these technologies? This key question can be translated into the following: What are the minimal values of the important quantities in informatics?

2. Power-Delay Product

When the input voltage changes, a logic circuit briefly dissipates power (usually a fixed amount per transition); this is called the dynamic power. The power-delay product (PDP) is the energy dissipated per switching cycle, and it can be viewed as the sum over the individual switching events occurring during that cycle. Through the switching time, the PDP is connected to a processor’s heat generation and clock rate.

To calculate the PDP, let us consider the following thought experiment (Figure 1). The two switches are always in alternating positions. If $I$ is the current, $U_{out}$ is the output voltage and $U_0$ is the “upper” voltage, then the energy $W$ required to charge the capacitor is [2]

$W = \int_0^{\tau/2} I\,(U_0 - U_{out})\,dt$. (1)

It is easy to calculate $I$ during this event as [2]

$I = C\,\dfrac{dU_{out}}{dt}$, (2)

where $C$ is the capacitance. These equations, which describe the charge-discharge process of the circuit in Figure 1 within classical electronics, can be combined to yield

$W = C U_0^2 / 2$. (3)

This is only the first half of a switching cycle. For the second half (discharging the capacitor) it can be shown that the result is the same, so that the power-delay product in this case is [3] [4]

$\mathrm{PDP} = C U_0^2$. (4)

The time needed to complete this cycle is [1]

$\tau = C U_0 / I$. (5)
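
As a rough numerical illustration of Equations (3)-(5), the short Python sketch below evaluates the energy per half cycle, the PDP and the switching time. The component values (capacitance, supply voltage, charging current) are hypothetical, chosen only to show orders of magnitude; they are not taken from the paper or from any particular technology.

# Illustrative evaluation of Eqs. (3)-(5) for the circuit of Figure 1.
# All component values below are assumed, not from the paper.
C = 1e-15    # load capacitance in farads (1 fF, assumed)
U0 = 1.0     # "upper" (supply) voltage in volts (assumed)
I = 10e-6    # average charging current in amperes (10 uA, assumed)

W = 0.5 * C * U0**2   # energy to charge the capacitor, Eq. (3)
PDP = C * U0**2       # power-delay product, Eq. (4)
tau = C * U0 / I      # switching time, Eq. (5)

print(f"W   = {W:.2e} J")
print(f"PDP = {PDP:.2e} J")
print(f"tau = {tau:.2e} s")
print(f"average dynamic power PDP/tau = {PDP / tau:.2e} W")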

3. Scaling Limits

We are closing in on the limit of Moore’s first law (the second will probably last longer). A good example of why the exponential growth of hardware “device density” cannot continue much longer is the problem of conductances. Moore’s law nowadays looks somewhat problematic, because the rate at which the linewidth shrinks is not fully synchronized with the rate at which the device density increases.

Figure 1. Thought experiment to calculate PDP.

3.1. Electrostatic Control

To have adequate control over the charge in a channel, it is necessary to maintain $l \ll L$, where $L$ is the channel length and $l$ is the thickness of the insulator. Unfortunately, $l$ today is close to 1 nm, which means that the insulator is only several atoms thick. This places severe demands on the insulators [5].

3.2. Power Density

Removal of the heat generated by an integrated circuit has become perhaps the crucial constraint on the performance of modern electronics. The fundamental limit of the power density appears to be approximately 1000 W/cm². A power density of 790 W/cm² has already been achieved by water cooling of a uniformly heated Si substrate with embedded microchannels (note that the Sun’s surface emits around 6000 W/cm²) [6].

4. Minimal Energy Dissipated Per Bit

To calculate the minimal energy required to generate a single bit of information, we need the entropy,

$S = k \ln(\Omega)$, (6)

where $k$ is the Boltzmann constant and $\Omega$ is the degeneracy of the state. A more practical form of this quantity for binary systems is the Shannon entropy [7],

$H = \log_2 \Omega$, (7)

which means $\Omega = 2^H$. From the human viewpoint, information is the direct opposite of entropy: the larger the entropy, the more chaotic the system is. Conversely, we are only able to gain information from less chaotic (more organized) systems.

To define information quantitatively, we need further assumptions. Let us view the system we are trying to gain information about as a set of events (or messages, or sometimes states), each with an associated probability. Following Shannon’s approach [7] [8], the definition of the information $i_j$ carried by the jth message is

$i_j = -\log_2 p_j$, (8)

where $p_j$ is the probability of the jth event that the message represents. We can see from this definition that a message is more valuable the more unlikely it is. This quantity is sometimes also referred to as the uncertainty of the state. Usually the probabilities are equal, $p_j = 1/N$, where $N$ is the number of possible events.

It can be shown that the entropy introduced above can also be calculated from the information $i_j$ [8]:

$H = \sum_j p_j\, i_j$. (9)
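
As a minimal illustration of Equations (8) and (9), the Python sketch below computes the information content of each message and the Shannon entropy for an assumed set of four equally likely messages ($p_j = 1/N$ with $N = 4$); the probabilities are an example chosen purely for illustration.

import math

# Assumed example: four equally likely messages, p_j = 1/N with N = 4.
p = [0.25, 0.25, 0.25, 0.25]

# Eq. (8): information of the jth message, i_j = -log2(p_j)
i = [-math.log2(pj) for pj in p]

# Eq. (9): Shannon entropy, H = sum_j p_j * i_j
H = sum(pj * ij for pj, ij in zip(p, i))

print("information per message (bits):", i)   # 2 bits each
print("Shannon entropy H (bits):", H)         # log2(4) = 2 bits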

Computation is an information-producing process that decreases the entropy of the computer. According to the second law of thermodynamics, the entropy of the universe as a whole may only increase. Labeling the environment “e” and the computer “c”, this statement can be expressed as

$\Delta S = \Delta S_e + \Delta S_c \geq 0$, (10)

which means that traditional computation is an irreversible process. Multiplying both sides by the temperature gives the dissipated heat [6]:

$\Delta Q = T \Delta S_e \geq -T \Delta S_c$. (11)

Combining this with Equations (6) and (7), we obtain [9]

$\Delta Q \geq k T \Delta H_c \ln(2)$. (12)

This means that we need at least $kT\ln(2)$ of energy to generate one bit of information; this is the Shannon-von Neumann-Landauer (SNL) limit. It is possible to interpret this result as the maximal efficiency of the information-generating cycle: if we assume that the full energy invested is $kT$, this efficiency is $\eta = \ln(2) \approx 0.693$. Applying Heisenberg’s uncertainty principle to the SNL limit [5],

$E_{min}\,\tau \geq \hbar$, (13)

we get $\tau_{min} \approx 0.04$ ps at room temperature [3], although this bound has not been proven so far.
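
As a check of the numbers quoted above, the Python sketch below evaluates the SNL energy $kT\ln(2)$ and the corresponding time bound from Equation (13), assuming room temperature (T = 300 K).

import math

k = 1.380649e-23        # Boltzmann constant, J/K
hbar = 1.054571817e-34  # reduced Planck constant, J*s
T = 300.0               # assumed room temperature, K

E_min = k * T * math.log(2)   # SNL limit, Eq. (12): minimal energy per bit
tau_min = hbar / E_min        # Eq. (13): E_min * tau >= hbar

print(f"E_min   = {E_min:.3e} J")
print(f"tau_min = {tau_min * 1e12:.3f} ps")   # roughly 0.04 ps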

5. Fields Pursued to Bypass the Limits

● Reversible computing (noise immunity is the main problem) [3] [10] .

● New information tokens (spin of an electron, photons, etc.) [3] [11] .

● Integration: switching from 2D circuits to 3D would increase performance (if we are able to cool such 3D systems) [3].

● Architecture.

Cite this paper

Guba, T., Nánai, L. and George, T.F. (2017) Physical Limits of Computation. Journal of Quantum Information Science, 7, 155-159. https://doi.org/10.4236/jqis.2017.74012

References

1. Markov, I.L. (2014) Limits on Fundamental Limits to Computation. Nature, 512, 147-154. https://doi.org/10.1038/nature13570

2. Hevesi, I. (1998) Theory of Electricity. Hungarian National Textbook Publisher, Budapest.

3. Baldo, M. (2011) Introduction to Nanoelectronics. MIT OpenCourseWare Publication.

4. Singh, A.K. (2011) Electronic Devices and Integrated Circuits. PHI Learning Pvt. Ltd., Delhi.

5. Stathis, J.H. (2002) Reliability Limits for the Gate Insulator in CMOS Technology. IBM Journal of Research and Development, 46, 265-286. https://doi.org/10.1147/rd.462.0265

6. Heo, S., Barr, K. and Asanovic, K. (2003) Reducing Power Density through Activity Migration. Proceedings of the 2003 International Symposium on Low Power Electronics and Design, Seoul, 25-27 August 2003, 217-222. https://doi.org/10.1145/871506.871561

7. Feynman, R.P., Hey, J.G. and Allen, R.W. (1998) Feynman Lectures on Computation. Addison-Wesley Longman Publishing Co., Boston.

8. Shannon, C.E. (2001) A Mathematical Theory of Communication. ACM SIGMOBILE Mobile Computing and Communications Review, 5, 3-55. https://doi.org/10.1145/584091.584093

9. Lloyd, S. (2000) Ultimate Physical Limits to Computation. Nature, 406, 1047-1054. https://doi.org/10.1038/35023282

10. Bennett, C.H. and Landauer, R. (1985) The Fundamental Physical Limits of Computation. Scientific American, 253, 48-56. https://doi.org/10.1038/scientificamerican0785-48

11. Norton, J.D. (2005) Eaters of the Lotus: Landauer’s Principle and the Return of Maxwell’s Demon. Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics, 36, 375-411. https://doi.org/10.1016/j.shpsb.2004.12.002