At present, we do not have a thermodynamic criterion that would define the steady state in open systems in a similar way as maximum entropy defines equilibrium in closed systems. It was believed for some time that such a criterion was provided by minimum entropy production, a statement known as “Prigogine’s Theorem.” Although it is still taken for granted by some biologists (e.g., Stoward, 1962), it should be emphasized that Prigogine’s Theorem, as was well known to its author, applies only under rather restrictive conditions. In particular, it does not define the steady state of chemical reaction systems (Denbigh, 1952; von Bertalanffy, 1953a, 1960b; Foster et al., 1957). A more recent generalization of the theorem of minimum entropy production (Glansdorff and Prigogine, 1964; Prigogine, 1965), which encompasses kinetic considerations, has still to be evaluated in its consequences.
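In a standard modern formulation (a sketch added here for concreteness, not part of the original text), the restriction can be made explicit. The local entropy production is a sum of products of flows $J_k$ and their conjugate forces $X_k$,

$$\sigma = \sum_k J_k X_k \ge 0,$$

and the theorem asserts that the total production $P = \int \sigma \, dV$ decreases toward a minimum attained at the steady state,

$$\frac{dP}{dt} \le 0, \qquad \frac{dP}{dt} = 0 \ \text{at the steady state},$$

but only provided the flows depend linearly on the forces, $J_k = \sum_j L_{kj} X_j$, with constant coefficients satisfying the Onsager reciprocity $L_{kj} = L_{jk}$. Chemical reaction rates are in general nonlinear functions of their affinities, so the linearity condition fails and the theorem does not fix the steady state of reaction systems.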
Another unsolved problem of a fundamental nature originates in a basic paradox of thermodynamics. Eddington called entropy “the arrow of time.” As a matter of fact, it is the irreversibility of physical events, expressed by the entropy function, which gives time its direction. Without entropy, i.e., in a universe of completely reversible processes, there would be no difference between past and future. However, the entropy functions do not contain time explicitly. This is true both of the classical entropy function for closed systems due to Clausius and of the generalized function for open systems and irreversible thermodynamics due to Prigogine. The only attempt I know of to fill this gap is a further generalization of irreversible thermodynamics by Reik (1953), who sought to introduce time explicitly into the equations of thermodynamics.
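The paradox can be stated concretely (again a standard formulation, added here for illustration). Entropy is a function of the instantaneous state alone,

$$S = S(U, V, N_1, N_2, \ldots),$$

and in Prigogine’s generalization its change splits into an exchange term and an internal production term,

$$dS = d_e S + d_i S, \qquad d_i S \ge 0.$$

Time appears only as the parameter of differentiation in $d_i S/dt \ge 0$; it is nowhere an explicit argument of the entropy function, although the inequality singles out a direction for it.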
A third problem to be envisaged is the relation between irreversible thermodynamics and information theory. Order is the basis of organization and therefore the most fundamental problem in biology. In a way, order can be measured by negative entropy in the conventional Boltzmann sense. This was shown, e.g., by Schulz (1951) for the nonrandom arrangement of amino acids within a protein chain. Their organization, in contrast to a random arrangement, can be measured by a term called chain entropy (Kettenentropie). However, there exists a different approach to the problem, i.e., measurement in terms of yes-or-no decisions, so-called bits, within the framework of information theory. As is well known, information is defined by a term formally identical with negative entropy, thus indicating a correspondence between the two different theoretical systems of thermodynamics and of information theory. Elaboration of a dictionary, as it were, for translating the language of thermodynamics into that of information theory, and vice versa, would seem to be the next step. Obviously, generalized irreversible thermodynamics will have to be employed for this purpose because it is only in open systems that maintenance and elaboration of order do not run contrary to the basic entropy principle.
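One entry of such a dictionary can be written down at once (in modern notation; the numerical conversion is added here for illustration). Shannon’s information for a set of probabilities $p_i$ is

$$H = -\sum_i p_i \log_2 p_i \quad \text{(bits)},$$

while the Boltzmann–Gibbs entropy of the same distribution is

$$S = -k_B \sum_i p_i \ln p_i,$$

so that $S = (k_B \ln 2)\,H$: one bit of information corresponds to $k_B \ln 2 \approx 0.96 \times 10^{-23}$ J/K of negative entropy.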
The Russian biophysicist Trincher (1965) came to the conclusion that the state function, entropy, is not applicable to living systems; he contrasts the entropy principle of physics with biological “principles of adaptation and evolution,” which express an increase of information. Here we have to take into consideration that the entropy principle has a physical basis in the Boltzmann derivation, in statistical mechanics and in the transition toward more probable distributions that necessarily occurs in chance processes; at present, no physical explanation can be given for Trincher’s phenomenological principles.
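The physical basis referred to here is Boltzmann’s relation (added for concreteness),

$$S = k_B \ln W,$$

where $W$ counts the microscopic arrangements compatible with a given macrostate; chance processes carry a system toward macrostates compatible with more arrangements, hence toward higher entropy. Trincher’s principles have, as yet, no derivation of this kind behind them.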
Here we are dealing with fundamental problems which, I believe, are “swept under the carpet” in the present biological creed. Today’s synthetic theory of evolution considers evolution to be the result of chance mutations, in a well-known simile (Beadle, 1963) “typing errors” in the reduplication of the genetic code, which are directed by selection, i.e., the survival of those populations or genotypes that produce the highest number of offspring under existing external conditions. Similarly, the origin of life is explained by the chance appearance of organic compounds (amino acids, nucleic acids, enzymes, ATP, etc.) in a primeval ocean which, by way of selection, formed reproducing units, viruslike forms, protoorganisms, cells, etc.
In contrast to this it should be pointed out that selection, competition and “survival of the fittest” already presuppose the existence of self-maintaining systems; they therefore cannot be the result of selection. At present we know of no physical law which would prescribe that, in a “soup” of organic compounds, open systems, self-maintaining in a state of highest improbability, are formed. And even if such systems are accepted as being “given,” there is no law in physics stating that their evolution, on the whole, would proceed in the direction of increasing organization, i.e., improbability. Selection of genotypes with maximum offspring helps little in this respect. It is hard to understand why, owing to differential reproduction, evolution should ever have gone beyond rabbits, herring or even bacteria, which are unrivaled in their reproduction rate. Production of local conditions of higher order (and improbability) is physically possible only if “organizational forces” of some kind enter the scene; this is the case in the formation of crystals, where “organizational forces” are represented by valencies, lattice forces, etc. Such organizational forces, however, are explicitly denied when the genome is considered as an accumulation of “typing errors.”
Future research will probably have to take into consideration irreversible thermodynamics, the accumulation of information in the genetic code and “organizational laws” in the latter. Presently, the genetic code represents the vocabulary of the hereditary substance, i.e., the nucleotide triplets which “spell” the amino acids of the proteins of an organism. Obviously, there must also exist a grammar of the code; the latter cannot, to use a psychiatric expression, be a word salad, a chance series of unrelated words (nucleotide triplets and corresponding amino acids in the protein molecules). Without such a “grammar” the code could at best produce a pile of proteins, but not an organized organism. Certain findings in genetic regulation indicate the existence of such organization in the hereditary substratum; its effects will have to be studied also in the macroscopic laws of evolution (von Bertalanffy, 1949a; Rensch, 1961). I therefore believe that the generally accepted “synthetic theory of evolution” is at best a partial truth, not a complete theory. Apart from additional biological research, physical considerations, within the theory of open systems and its present borderline problems, will have to be taken into account.
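The vocabulary/grammar distinction can be made concrete in a minimal sketch (added here as an illustration, not part of the original text; the abbreviated codon table, the sample sequence and the function translate are purely illustrative conveniences). A lookup table suffices to “spell” amino acids from nucleotide triplets, but nothing in the table says how the resulting proteins are to be organized:

```python
# Minimal sketch: the genetic "vocabulary" as a codon -> amino-acid lookup.
# Only a handful of standard codons are included, purely for illustration.
CODON_TABLE = {
    "AUG": "Met",  # methionine; also the start signal
    "UUU": "Phe",  # phenylalanine
    "GGC": "Gly",  # glycine
    "GAA": "Glu",  # glutamic acid
    "UAA": None,   # stop codon: no amino acid
}

def translate(mrna: str) -> list[str]:
    """Spell out amino acids by reading the sequence triplet by triplet."""
    protein = []
    for i in range(0, len(mrna) - 2, 3):
        residue = CODON_TABLE.get(mrna[i:i + 3])
        if residue is None:  # stop codon (or a triplet outside our toy table)
            break
        protein.append(residue)
    return protein

# The vocabulary alone yields a string of residues -- at best a "pile of proteins":
print(translate("AUGUUUGGCGAAUAA"))  # ['Met', 'Phe', 'Gly', 'Glu']
# Nothing in the table expresses the "grammar": which sequences are read when,
# in what amounts, and how the products are organized into an organism.
```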
Source: von Bertalanffy, Ludwig (1969), General System Theory: Foundations, Development, Applications. New York: George Braziller, revised edition.