COMPUTATION AS A THERMODYNAMIC PROCESS APPLIED TO BIOLOGICAL SYSTEMS

T. B. FOWLER

Department of Aeronautical Information Systems, The MITRE Corporation, 1820 Dolley Madison Boulevard, McLean, Virginia 22102 (U.S.A.)

(Received: 30 June, 1979)

SUMMARY

A physical quantity, 'information,' can be defined and must be included in the second law of thermodynamics. This quantity is different from entropy, though closely related to it. Acquisition and use of information is characteristic of most systems, especially those of biological origin. Treatment of information as a physical quantity permits many types of machines (in the most general sense) to be understood in terms of a thermodynamic function, and also permits investigation of the extent to which the laws governing information limit, or may limit, their behavior. This limitation is most important at the molecular level, such as in the process of DNA and RNA synthesis. However, it may ultimately have a significant impact on all areas of biology, including the theory of evolution, which in light of the concept of information can now be addressed in an important new way.

SOMMAIRE

A physical quantity, 'information,' can be defined which must be introduced into the second law of thermodynamics. This quantity differs from entropy, although it is closely related to it. The acquisition and use of information is characteristic of most systems, in particular of those of biological origin. Treating information as a physical quantity makes it possible to consider many types of machines (in the most general sense) from a thermodynamic standpoint; it also permits a study of the limitations that the laws governing information may eventually impose on their behavior. This limitation is most important at the molecular level, for example in the synthesis of DNA and RNA. Nevertheless, it may ultimately have a significant influence in all areas of biology, including the theory of evolution, which, in the light of the concept of information, can now be addressed in a new and noteworthy way.

INTRODUCTION

In a now classic work, first published about 20 years ago, Leon Brillouin demonstrated the essential interrelation between information and entropy, thereby linking information theory and thermodynamics (Brillouin, 1962). Ultimately, Brillouin's theoretical developments turn on the fact that information costs entropy, i.e. the acquisition of information about a physical system requires a net increase in entropy, either of the system itself or of some other source of free energy (Brillouin, 1962a). This fact, that information is not free, has profound implications for all systems dealing with information in any sense. It is the intention of this paper to clarify the notion of information as a physical quantity related to systems, and to demonstrate the critical role played by computers in the conversion of entropy into information and vice versa.

Brillouin's work

Brillouin takes as his point of departure information theory, as elaborated in 1948 by Shannon (1948) and since developed and refined by many others. He defines what he calls 'negentropy,' N, which is the negative of entropy, S, as usually defined, and then enunciates the Negentropy Principle of Information: 'Information represents a negative contribution to entropy' (Brillouin, 1964; 1962b). Stated quantitatively, the principle represents a new version of the second law of thermodynamics (Brillouin, 1962c):

Δ(S − I) ≥ 0, or Δ(N + I) ≤ 0

Brillouin then proceeds to determine the minimal cost of information in terms of negentropy (Brillouin, 1962d). For one bit of information, and where ν is the frequency of observation:

ΔS₁ ≥ k ln 2    (hν ≪ kT)

ΔS₁ ≥ hν/T    (hν ≫ kT)
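As a brief numerical illustration (added here; the 300 K temperature and the sample observation frequency are assumed values, not taken from Brillouin), these two bounds can be evaluated directly:

```python
import math

k = 1.380649e-23    # Boltzmann constant, J/K
h = 6.62607015e-34  # Planck constant, J*s
T = 300.0           # assumed room temperature, K

# Low-frequency bound: one bit costs at least k*ln(2) of entropy,
# i.e. an energy of k*T*ln(2) at temperature T.
low_freq_entropy = k * math.log(2)       # J/K per bit
low_freq_energy = k * T * math.log(2)    # J per bit

# High-frequency bound: the cost rises to h*nu/T once h*nu >> k*T.
nu_crossover = k * T / h                 # frequency at which h*nu = k*T
nu_high = 1.0e15                         # an assumed optical-range observation frequency, Hz
high_freq_entropy = h * nu_high / T      # J/K per bit

print(f"k ln 2              = {low_freq_entropy:.3e} J/K per bit")
print(f"kT ln 2 at {T:.0f} K    = {low_freq_energy:.3e} J per bit")
print(f"crossover frequency = {nu_crossover:.3e} Hz")
print(f"h nu/T at 1e15 Hz   = {high_freq_entropy:.3e} J/K per bit")
```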

Armed with these theoretical tools, Brillouin is able quite elegantly to dispose of Maxwell's Demon and various other paradoxes left over from classical physics (Brillouin, 1962e). In the case of Maxwell's Demon, for example, the basic idea is very simple: the Demon must expend more negentropy acquiring information about the position and motion of particles than he is able to create by selectively permitting them to pass through his trap door. Brillouin also discusses a large number of other typical applications of the Negentropy Principle in communications, coding, signal detection, and with respect to physical knowledge in general (Brillouin, 1962f). He then draws some rather far-reaching conclusions regarding the status of determinism in physical science (Brillouin, 1962g; 1964a), but they will not be discussed here.

REFINEMENT OF THE INFORMATION

CONCEPT

Brillouin's central intuition, viz. that there is an entropy cost associated with information, is undoubtedly correct and represents a significant advance in thermodynamics and information theory, as is amply demonstrated by the uses to which he puts it in his published works. However, there are some problems and lacunae in Brillouin's theory, stemming in part from a confusion of phenomenological and epistemological questions (Brillouin, 1962h). In order to clarify matters, it is useful to consider the various aspects of information considered phenomenologically, i.e. the various ways in which information is manifested and dealt with in the physical world:

(1) Information about some object or system can be determined. For example, information about the position and velocity of a moving object may be acquired through use of a measuring device. This information acquisition will require an expenditure of negentropy and is part of the task of Maxwell's Demon, for example.

(2) Information, e.g. as obtained in (1), may be transmitted from one location to another. The question of the laws governing information transmission in a real-world (i.e. noisy) environment is that addressed by Shannon's theorems (Khinchin, 1957). Every living organism, in the process of replicating itself, transmits information. An excellent example of the application of Shannon's theorems to problems of biological evolution may be found in Gatlin (1972).

(3) Information, once acquired, can be stored and recalled. This is done in computer memories, of course; but it is also done whenever words are written on a page, books printed, etc.

(4) Information can be transformed or processed. Again, computers do this, but it is also done by many classes of living organisms. Nearly all feedback control systems involve information processing in one form or another.

The foregoing aspects of information are briefly summarized in Table 1, with an additional column for the important laws governing each aspect. Brillouin discusses aspects (1), (2) and (3); but (4) is not addressed by him at all.

TABLE 1
INFORMATION IN THE PHYSICAL WORLD

Type | Some important laws governing | Typical instances
(1) Information determined about external world | Negentropy principle; Uncertainty principle | Biological sensory organs, radar, feedback control systems
(2) Information transmission | Physical laws of medium; Shannon's theorems | Communications systems, DNA replication
(3) Storage and recall of information | Negentropy principle; Uncertainty principle | All types of information storage and retrieval systems
(4) Information processing | Finite state machine theory; Negentropy principle | Calculating machines, many types of feedback control systems

He discusses the function of computers, but in a different context (Brillouin, 1962i). A proof that information processing involves expenditure of negentropy can be found in Appendix A. For the purposes of this essay, it will be assumed that information is a physical quantity, essential for understanding the purpose and function of many systems, and very closely related to the particular system using, acquiring, transmitting, or processing it. This will avoid some of the bizarre problems Brillouin encounters (Brillouin, 1962j) and will permit attention to be focused on the most important questions. Complex information systems, in particular, may require considering information in all 4 of the ways described above, just as complex thermodynamic systems require consideration of energy and entropy, both internally and in heat and mass transfer. One of the most important points is the question of whether information is ultimately superfluous, because it is simply entropy in another form. Brillouin sometimes speaks as if this were true (Brillouin, 1962k), and so it is important to clarify the status of information with respect to entropy.† Such clarification will also help delineate information as an independent physical quantity characteristic of certain physical systems and processes.

†It should be noted that because 2 quantities are measured in the same units, it does not follow that they are the same: take the case of work and heat, for example. Both are measured in the same unit, e.g. joules, but heat is not work, and can only with some difficulty be converted into work. Thermodynamics as a discipline evolved in response to the need for determining the limitations and conditions necessary to convert heat into work.

Consider the information a system may possess. It may be some bits describing a feature of the system's environment, or it may be a list of names and addresses, or a set of instructions about what the system is to do next. Can this information, these data bits, be identified with the negentropy of the system? The answer is that in many practical cases the method used for storing information does involve negentropy, but this is not necessary and in fact is a rather costly way of doing it. To investigate this question, recall the definition of entropy from statistical mechanics:


S = k ln Ω

where Ω is the number of accessible states of the system. For simplicity, consider a device consisting of n elements or regions, possibly embedded in a medium; for example, take electric dipoles in the absence of external fields, each capable of being put in (and remaining in) either of 2 states. These states are in addition to its ordinary thermodynamic states Ω, among which it is constantly shifting. Such a device can store n bits and has 2ⁿ states due to its information storage capacity. Of course, the total number of states the device can be in is Ω*, where Ω* is the product of the ordinary thermodynamic states Ω and the 2ⁿ states due to information storage. However, the number of accessible states, i.e. those figuring in system entropy, is just Ω, since by hypothesis the device will not jump from one of the 2ⁿ states used for information storage to another. Hence, entropy is still given by S = k ln Ω, regardless of which of the 2ⁿ information storage states the device may be in. They may therefore be changed with no change in entropy. Moreover, the system could be, say, heated or cooled, with corresponding changes in Ω and hence entropy, but not information. A similar argument can be applied mutatis mutandis to most media used or usable for information storage, e.g. the tape of a Turing Machine, though in some cases there will be an irrelevant entropy change when information is changed. Such would be the case if the Turing Machine punched holes in the squares of its tape instead of, say, marking them with symmetrical up and down arrows. The point, quite simply stated, is that the information content of a medium can change with no change in its entropy, and its entropy can change with no change in its information content. In fact, the medium may have no information stored in it, as when it is first manufactured and has a random pattern of 1's and 0's. The 2 quantities, therefore, cannot be considered the same.

It is also important, in this connection, to note another feature characteristic of all media used for information storage. Because each binary storage element of the medium has to be in 1 of its 2 states, and each of the 2ⁿ possible configurations of the medium is equally probable and equally usable for information storage, there is no possible way to determine by inspection whether information has actually been placed in the medium. Consider, for example, the case of a scientist who labors for many years and enters the results of his work into a computer memory. If he then dies and leaves no key to interpret his results, then all an observer would find in the memory is a random pattern of 1's and 0's, and he would have no way of knowing whether they were supposed to represent a pattern or were simply the state the memory came up in when the computer was turned on. The death of the scientist, in other words, totally destroyed the information content of the memory. This, obviously, does not occur in the case of negentropy. If the same scientist, for example, had partitioned a box and then evacuated half of it, even after his death the free energy thereby created would remain and be detectable by any observer.
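The point can be made concrete with a toy calculation (an illustrative addition; the element count and thermodynamic state count are arbitrary assumed values): the stored bit pattern never enters S = k ln Ω, while heating changes S without touching the stored bits.

```python
import math

k_erg = 1.38e-16      # Boltzmann constant in erg/K, the value used in the paper
n = 16                # assumed number of bistable storage elements
omega = 1.0e20        # assumed number of accessible thermodynamic states

def entropy(accessible_states, stored_pattern):
    """S = k ln(Omega): depends only on the accessible thermodynamic states,
    not on which frozen information-storage configuration is occupied."""
    return k_erg * math.log(accessible_states)

# Two different stored bit patterns give exactly the same entropy.
S_a = entropy(omega, 0b1010101010101010)
S_b = entropy(omega, 0b0000000000000000)
assert S_a == S_b

# Heating the medium (here: doubling the accessible states) changes entropy
# by k ln 2 while leaving the stored pattern untouched.
S_hot = entropy(2 * omega, 0b1010101010101010)
print(f"entropy change on heating = {S_hot - S_a:.3e} erg/K (= k ln 2)")
print(f"total configurations Omega* = Omega x 2^n = {omega * 2 ** n:.3e}")
```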

THERMODYNAMIC FUNCTION OF THE COMPUTER

At this point, a fundamental question arises. Since information as a physical quantity requires expenditure of negentropy for storage and recall, as well as processing, what sort of machine or machines are involved (see Fig. 1)? That is, if information is gleaned from a source, stored away, then later recalled for use in the form of negentropy, what sort of machine effects the transformation from negentropy (e.g., light waves from a moving object) to information (e.g., bits in a memory)? And once this information is acquired, what sort of machine manipulates the information, e.g. through computation according to an algorithm?

Fig. 1. Schematic representation of generalized information processing machine.

With respect to the first question, the situation is somewhat analogous to macroscopic thermodynamics, where there are 2 quantities, heat and work, related by a factor J = 4.186 joules/cal and interconvertible by a machine called a heat engine. Here there are 2 quantities, information and entropy, related by a factor which by analogy may be termed J*, with J* = k ln 2 (ergs/deg)/bit; and the machine that interconverts them is the computer. This, then, is a primary thermodynamic function of the large class of machines dealing with information, namely to permit the conversion of one physical quantity into another. Consider, for example, a programmer typing in a program at the console of a digital computer. He causes electrical impulses, which represent negentropy, to go to the central processing unit, where they are then stored in the computer's memory as information. The program may later be recalled and appear on the programmer's console screen as illuminated characters, negentropy once more.

The computation performed by the computer also has a thermodynamic function, which is less direct but of no less importance. A complete discussion of it is far beyond the scope of this paper,† but basically the computation performed by a process control computer (or the nervous system of an animal) takes input information from some type of sensor, manipulates it according to an algorithm, and outputs the resulting information to control an action (e.g. opening a valve or activating muscles). This permits the process to be controlled, and the animal to function in its environment, to acquire food, seek shelter, etc., all of which are more directly thermodynamic functions.

†The author is currently writing a book on this subject.
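As a quick numerical aside (an addition, not in the original), the conversion factor J* introduced above can be evaluated alongside the mechanical equivalent of heat J:

```python
import math

J_heat = 4.186            # mechanical equivalent of heat, joules per calorie
k_erg_per_K = 1.38e-16    # Boltzmann constant, erg/K

# J* converts bits of information into entropy units, by analogy with J.
J_star = k_erg_per_K * math.log(2)   # (erg/K) per bit, roughly 9.6e-17

print(f"J  = {J_heat} J/cal")
print(f"J* = {J_star:.2e} (erg/K)/bit")
```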

TYPICAL APPLICATIONS

At this juncture it will be useful to look at some typical applications of the foregoing concepts. First consider the relative efficiency of the conversion processes for heat and work on the one hand, and entropy and information on the other. When Sadi Carnot (1977) began his investigations (about 1820), there was of course no clear idea of how efficient heat engines could be made; but many were in use, and it was believed by some that their efficiency in transforming heat into work could be extended indefinitely. Carnot himself remarked:

The question has often been raised whether the motive power of heat is unlimited, whether the possible improvements in steam engines have an assignable limit, a limit which the nature of things will not allow to be passed by any means whatever; or whether, on the contrary, these improvements may be carried on indefinitely (Carnot, 1977).

Thanks to Carnot, of course, the answer to that question is now known. In point of fact, however, the actual engines in use during Carnot's time were extremely inefficient and were constantly being refined, possibly giving rise to the above notion. He himself cites an example of an engine used in Cornwall. Though it had a theoretical efficiency of

(560 − 100)/(560 + 273) = 55%

he points out that it actually delivered only about 5%, or 1/10th of the theoretical maximum (Carnot, 1977a). Matters have improved, however. Modern-day engines operating on the Otto cycle, for instance, can deliver efficiencies on the order of 50-60% (Van Wylen and Sonntag, 1965).

When efficiencies of digital computers are considered, the situation turns out to be radically different. A typical minicomputer in operation may consume 1000 W. It can operate at a speed of about 500,000 operations/sec, corresponding to 500,000 × 16 = 8,000,000 bit changes/sec. Assuming an isothermal process, the energy cost per bit may be calculated as:

(10³ J/sec) × (10⁷ erg/J) × (1 sec / 8 × 10⁶ bits) = 1250 ergs/bit


The theoretical minimum is dQ = T dS = kT ln 2 per bit, or at 293 K:

1.38 × 10⁻¹⁶ × 293 × ln 2 ergs/bit = 2.8 × 10⁻¹⁴ ergs/bit

This means that the relative efficiency of modern-day electronic hardware is on the order of:

2.8 × 10⁻¹⁴ / 1250 = 2.2 × 10⁻¹⁷

This number is fantastically small, and accounts for the fact that effects proper to information-entropy conversion have never been observed in such machines. Indeed, someone unaware of the principles discussed here could easily assume that an indefinite improvement in conversion efficiency, i.e. in computer operation, is possible! The situation is improving somewhat, however. Josephson junction devices, currently under development at IBM's Zurich Research Laboratory and elsewhere, are much more entropy-efficient, and in fact are capable of storing one bit of information with a single flux quantum (Gueret, 1975). Such devices can function with an energy expenditure of about†

10⁻¹⁶ J/bit = 10⁻⁹ ergs/bit.

The theoretical limit for devices operating in the superconducting temperature region is approx. 2 × 10⁻¹⁵ ergs/bit, so the efficiency is about:

2 × 10⁻¹⁵ / 10⁻⁹ = 2 × 10⁻⁶

which is a considerable improvement. These results concerning minimum entropy can be generalised to cover any type of device operating at any speed in such a way as to link device size, power dissipation and temperature (Fowler, 1979).

Considering the human brain phenomenologically as a computer, the following extremely crude estimate can be made of its efficiency. Assuming there are about 10¹⁰ cells in the brain, each capable of taking 1 of 2 states, and allowing that the brain consumes about 5 W of power, its entropy-information conversion efficiency (assuming each cell can change state in about 100 msec) is:

(5 J/sec) × (10⁷ erg/J) × (10⁻¹ sec / 10¹⁰ bits) = 5 × 10⁻⁴ erg/bit

Compare this with the theoretical minimum of 2.8 × 10⁻¹⁴ erg/bit computed above.

†From the article, "Josephson devices may provide high-density chip alternative," Electronic Engineering Times (6 March 1978), p. 24.
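Since the powers of ten in these estimates are easy to mislay, the following short sketch (an illustrative addition; the device figures are those quoted above, the constants are standard) simply re-runs the arithmetic of this section:

```python
import math

k = 1.38e-16          # Boltzmann constant, erg/K
ERG_PER_J = 1.0e7

def erg_per_bit(power_watts, bit_changes_per_sec):
    """Energy dissipated per bit change, in ergs."""
    return power_watts * ERG_PER_J / bit_changes_per_sec

kTln2_293 = k * 293 * math.log(2)            # theoretical minimum at 293 K

mini = erg_per_bit(1000, 500_000 * 16)       # 1 kW minicomputer, 8e6 bit changes/sec
brain = erg_per_bit(5, 1e10 / 0.1)           # 5 W, 1e10 cells, 100 msec per state change
josephson = 1e-16 * ERG_PER_J                # quoted Josephson figure, erg/bit

print(f"minicomputer  : {mini:10.3e} erg/bit  (efficiency {kTln2_293 / mini:.1e})")
print(f"kT ln 2, 293 K: {kTln2_293:10.3e} erg/bit")
print(f"brain (crude) : {brain:10.3e} erg/bit")
print(f"Josephson     : {josephson:10.3e} erg/bit  (vs. 2e-15 limit -> {2e-15 / josephson:.0e})")
```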

Shifting to another frontier, that of information-directed processes within cells, consider the first step in such a process, namely synthesising mRNA by 'reading' DNA templates. This, of course, represents a type of conversion of information to negentropy.


Following Bykhovsky,† but completing his schema to make a cyclic process (Lehninger, 1971; Hinkle and McCarty, 1978), the reaction in Fig. 2 may be taken. The net energy input for this reaction is roughly the sum of the 2.5 kcal required to break the phosphodiester bonds in the chain of polyribonucleotides and the 2 × 15 = 30 kcal needed to drive the ATP synthesis, giving 32.5 kcal, or roughly 30 kcal. This yields:

(30 × 10³ × 4.186 × 10⁷ ergs) / (6.023 × 10²³ × log₂ 4) = 1.04 × 10⁻¹² ergs/bit

which is fairly close to the theoretical limit. In fact, it is about as close as possible, since error rate is a function of negentropy expenditure per bit, and applying Brillouin's (1962l) formula for error rate, 1/r:

ΔS ≥ k ln r, or ΔE ≥ kT ln r.

The above value for ΔE yields an error rate 1/r > 10⁻¹¹. The actual error rate for DNA replication (a similar process) is 10⁻⁸-10⁻⁹ (Watson, 1970). Hence, information reading in cellular processes is very nearly optimal, in the sense that any decrease in the free energy expended would have to be accompanied by a (probably intolerable) increase in error rate.
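The arithmetic above can be checked with a short sketch (an illustrative addition; the 30 kcal/mole figure and 2 bits per nucleotide follow the text, while the 310 K operating temperature is an assumed physiological value):

```python
import math

k = 1.38e-16               # Boltzmann constant, erg/K
N_A = 6.023e23             # Avogadro's number (value used in the paper)
ERG_PER_KCAL = 4.186e10    # 1 kcal = 4.186e10 erg

# ~30 kcal of free energy per mole of nucleotides added; log2(4) = 2 bits per nucleotide.
energy_per_bit = 30 * ERG_PER_KCAL / (N_A * math.log2(4))   # erg/bit
print(f"energy per bit read: {energy_per_bit:.2e} erg/bit")  # about 1.04e-12

# Brillouin's relation Delta E >= k T ln r bounds the reliability r
# purchasable with this much free energy per bit.
T = 310.0                                  # assumed physiological temperature, K
r_max = math.exp(energy_per_bit / (k * T))
print(f"best attainable error rate 1/r: {1 / r_max:.1e}")    # on the order of 1e-11
```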

Fig. 2. Cyclic reaction schema for mRNA synthesis (cycle shown only for ATP portion).

†A. I. Bykhovsky, "The negentropy principle of information and some problems in bioenergetics," Mathematical Biosciences (1968), pp. 353-370. Unfortunately, Bykhovsky himself makes several critical mistakes, including use of the wrong formula for error rate from Brillouin, and failure to consider a cyclical process for determination of the energy input required for mRNA synthesis. He considered only the final phosphodiester bond energy and the heat of reaction involved in creating it. What must be known is the energy input the entire cyclical process requires so that it can be continually repeated. Bykhovsky's approach is roughly equivalent to trying to determine the efficiency of a Carnot cycle while ignoring the energy input required for the adiabatic compression, for example.

FUTURE WORK

Recognition of information as a physical quantity characteristic of many systems and processes, as well as of the existence of machines for inter-converting negentropy and information, permits many outstanding problems in science (and especially biology) to be addressed in an important new way. To take just one example, consider the question of Darwinian evolution. One may wish to address it in terms such as the following: To what extent may a probabilistic finite state machine,† subject to the laws of physics and information theory (in the sense discussed here, especially the negentropy principle), replicate itself‡ and, through errors in replication, become a more complicated machine? 'More complicated' will typically be understood in terms of ability to gather and process information about the environment, and to use that information for dealing with the environment as well as passing it on to future generations. But of course, unless information is understood in a phenomenological sense, as described in this paper, the foregoing question cannot even be posed; and to this day the hypothesis of Darwinian evolution remains a basically qualitative one. In point of fact, some questions peripherally related to the above have been addressed by Gatlin (1972), Hasegawa and Yano (1975), Smith (1969), and others, employing information theory techniques; but the main problem has not yet been directly attacked.

†Cf. the essay "Computability by probabilistic machines," by K. de Leeuw, E. F. Moore, C. E. Shannon and N. Shapiro, in Automata Studies, Princeton University Press, Princeton, 1956. The authors demonstrate that a Turing Machine subject to random errors can be simulated by a completely deterministic Turing Machine, provided basically that the error rate involved is a computable number. There is, unfortunately, no way to guarantee that DNA replication, considered as a stochastic process, is even stationary, much less that its error rate can be considered a computable number; but as a first approximation, the statement in the text will likely prove useful.

‡On self-replicating Turing Machines, see C. Y. Lee, "A Turing Machine which prints its own code script," Proceedings of the Symposium on Mathematical Theory of Automata, New York, April 1962, published by Polytechnic Press, Brooklyn, 1963, pp. 155-164.
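As a purely illustrative toy (not part of the paper's argument, with arbitrary assumed parameters), the following sketch shows one half of the question posed above: how rapidly the information transmitted by a probabilistic replicating machine decays when each bit is copied with a small error probability.

```python
import random

def replicate(genome, error_rate, rng):
    """Copy a bit-string genome, flipping each bit independently with
    probability error_rate (a crude stand-in for replication error)."""
    return [bit ^ (rng.random() < error_rate) for bit in genome]

def fraction_preserved(original, copy):
    """Fraction of the original bits still correct in the copy."""
    return sum(a == b for a, b in zip(original, copy)) / len(original)

rng = random.Random(0)
genome = [rng.randint(0, 1) for _ in range(10_000)]   # assumed genome length
p = 1e-3                                              # assumed per-bit error probability

current = genome
for generation in range(1, 1001):
    current = replicate(current, p, rng)
    if generation in (1, 10, 100, 1000):
        print(f"generation {generation:4d}: "
              f"{fraction_preserved(genome, current):.4f} of original bits intact")
```

Whether such errors can also build up a more complicated machine, rather than merely degrade the old one, is precisely the quantitative question left open above.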

CONCLUSIONS

Scientific progress has reached the point where information theory, in the generalised sense considered in this paper, must be taken into account, both conceptually and quantitatively. This will require rethinking of some established ways of viewing systems, and recognition of the fundamental role information as a physical quantity plays in them. In particular, the thermodynamic function of computation must be recognised and its quantitative importance to systems considered.

APPENDIX A

Entropy cost of information processing

Brillouin (1962m) has demonstrated that the entropy cost of 1 bit of information obtained about a physical system is given by:

S = k ln 2    (A-1)

for low frequencies of observation. Consider now the case of a device whose function is to 'process' 2 incoming signals and output some Boolean combination of the 2. There are 2⁴ = 16 possible Boolean combinations of 2 variables, each corresponding to a rule for evaluating the 2 input variables. It is not possible to derive all 16 combinations by repeated application of the combination rules singly; at least 1 negation operation is required.† But some operations, e.g. the Sheffer stroke (known as the NAND function by digital logic designers), do suffice by themselves for derivation of all other combinations, since they include negation. Hence, an element which is to process 2 inputs and be capable of yielding any one of the Boolean combinations must incorporate a negation-type operation.

To see that such an element must be 'active', i.e. require power input for its functioning, consider the hypothetical passive processing element in Fig. A-1, which can perform negation. Assume true is represented by a positive signal voltage, and false by a 0 or ground voltage. This represents no loss of generality, since passive level shifters are available to shift any 2 differing voltages in such a way that one is ground and the other is positive or negative with respect to ground. Let the impedance the device sees be R.

†See any elementary logic text, for example W. V. Quine, Methods of Logic, Holt, Rinehart and Winston, New York, 1959, p. 11.

Fig. A-1. Hypothetical passive processing device.

Now assume the input is grounded, so that no energy appears at the input. Then the output will consist of a steady positive voltage, or a series of positive pulses. In either case, however, with no energy input, the output power of the device will be V²/R, which will be greater than 0 so long as R is not infinite. But in order for the output to be observed, an energy expenditure of at least T dS = kT ln 2 is required by (A-1). Per second, this represents a power of kTν ln 2. Now, this energy must be supplied either by the device itself, in which case:

V²/R ≥ kTν ln 2 > 0    (A-2)

or by an outside source. But the outside source can be eliminated because ex hypothesi the device is passive. So the first alternative alone remains, and by (A-2) the assumed device would violate the First Law of Thermodynamics and is therefore impossible.
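To make concrete the claim earlier in this appendix that the Sheffer stroke suffices to generate every Boolean combination of 2 inputs, the following sketch (an addition, not part of the original proof) closes the set of input truth tables under NAND and verifies that all 16 two-input Boolean functions appear:

```python
from itertools import product

# Represent each two-input Boolean function by its truth table over
# (a, b) in {(0,0), (0,1), (1,0), (1,1)}, encoded as a 4-tuple of bits.
A = (0, 0, 1, 1)   # truth table of the input a itself
B = (0, 1, 0, 1)   # truth table of the input b itself

def nand(f, g):
    """Apply the Sheffer stroke pointwise to two truth tables."""
    return tuple(1 - (x & y) for x, y in zip(f, g))

# Close {A, B} under repeated application of NAND.
functions = {A, B}
while True:
    new = {nand(f, g) for f, g in product(functions, repeat=2)} - functions
    if not new:
        break
    functions |= new

print(f"distinct Boolean functions generated: {len(functions)}")  # expect 16
assert len(functions) == 16
```

The 16 functions obtained include the two constants and the identity functions, which is why negation (here folded into NAND itself) is indispensable.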

REFERENCES

BRILLOUIN, L., Science and Information Theory, Second Edition, Academic Press, New York, 1962. (First edition published in 1956.)
BRILLOUIN, L., Science and Information Theory, Second Edition, Academic Press, New York, Chapters 12, 16, 1962a.
BRILLOUIN, L., Scientific Uncertainty and Information, Academic Press, New York, 1964, p. 14. A similar statement is found in Ref. 5.
BRILLOUIN, L., Science and Information Theory, Second Edition, Academic Press, New York, 1962b, p. 153.
BRILLOUIN, L., Science and Information Theory, Second Edition, Academic Press, New York, 1962c, p. 154.
BRILLOUIN, L., Science and Information Theory, Second Edition, Academic Press, New York, 1962d, p. 190.
BRILLOUIN, L., Science and Information Theory, Second Edition, Academic Press, New York, Chapter 13, 1962e.
BRILLOUIN, L., Science and Information Theory, Second Edition, Academic Press, New York, Chapters 14-17, 1962f.
BRILLOUIN, L., Science and Information Theory, Second Edition, Academic Press, New York, 1962g, p. 302ff.
BRILLOUIN, L., Scientific Uncertainty and Information, Academic Press, New York, Chapter 5, 1964a.
BRILLOUIN, L., This confusion is evident, for example, in his discussion of new information on p. 265 of Science and Information Theory, and especially on pp. 17-18 and throughout Chapter 3 of Scientific Uncertainty and Information, 1962h.
BRILLOUIN, L., Science and Information Theory, Second Edition, Academic Press, New York, Chapter 19. In this chapter he is concerned with proving that computers do not create new information, 1962i.
BRILLOUIN, L., For instance, in Science and Information Theory, 1962j, pp. 265-266, the author has a problem determining whether new information comes into being if one person discovers something previously discovered by someone else. This is, of course, a pseudo-problem if information is considered phenomenologically as described in the paper, since every system acquiring information about the world is independent of others in respect of that information. The problem only arises if epistemological questions, i.e. questions about knowledge in general, such as "What does it mean to say that science tells us something about the world?", are confused with phenomenological questions such as "How much information is represented by each letter of the English alphabet?"
BRILLOUIN, L., For example, on p. xii of Science and Information Theory, he says, "...in short, information is negentropy." The Negentropy Principle of Information, cited in the text, is also phrased in such a way as to imply that information is ultimately just another form of negentropy, 1962k.
BRILLOUIN, L., Science and Information Theory, Second Edition, Academic Press, New York, 1962l, eqn. 14.31, pp. 194-195.
BRILLOUIN, L., Science and Information Theory, Second Edition, Academic Press, New York, 1962m, p. 190.
CARNOT, S., Reflections on the Motive Power of Fire, tr. by R. H. Thurston, Peter Smith, Gloucester, Mass., 1977, p. 5.
CARNOT, S., Reflections on the Motive Power of Fire, tr. by R. H. Thurston, Peter Smith, Gloucester, Mass., 1977a.
FOWLER, T. B., "Some Theoretical Considerations Regarding Maximum Speed of Active Switching Devices, with Special Reference to Josephson Junctions," 1979, to be published.
GATLIN, L., Information Theory and the Living System, Columbia University Press, New York, 1972, p. 191ff.
GUERET, P., Storage and Detection of a Single Flux Quantum in Josephson Junction Devices, IEEE Trans. Magn., MAG-11, No. 2 (March 1975), pp. 751-754; "Operation of Strings of Single Flux-Quantum Memory Cells Using Josephson Junctions," unpublished (communicated to the author by Dr. Gueret).
HASEGAWA, M. and YANO, T., The Genetic Code and the Entropy of Protein, Math. Biosci., 24 (1975), pp. 169-182.
HINKLE, P. C. and McCARTY, R. E., "How Cells Make ATP," Sci. Am., 238 (1978), No. 3, p. 104, for the energy required to make ATP from ADP.
KHINCHIN, A. I., Mathematical Foundations of Information Theory, trans. by R. A. Silverman and M. D. Friedman, Dover Publications, New York, 1957, pp. 102ff.
LEHNINGER, A. L., Bioenergetics, Second Edition, W. A. Benjamin, Menlo Park, 1971, p. 143, for the AMP-ADP reaction.
SHANNON, C. E., The mathematical theory of communication, Bell System Tech. J., 27 (1948), pp. 379-423, 623-656.
SMITH, T. F., The genetic code, information density, and evolution, Math. Biosci., 4 (1969), pp. 179-187.
WATSON, J., Molecular Biology of the Gene, Second Edition, W. A. Benjamin, Menlo Park, 1970, p. 297.
VAN WYLEN, G. and SONNTAG, R. E., Fundamentals of Classical Thermodynamics, Wiley & Sons, New York, 1965, pp. 293-297.
