Apart from the above-presented perspectives on information, two views are predominant among researchers. The mathematical/statistical view is primarily used in connection with telecommunications and databases to quantify and measure channel and storage capacity for information exchange and processing. This concept has no epistemological aim to explain the nature of information. It was used by Shannon and Weaver in their pioneering work. Here, information is defined as a measure of the freedom of choice when selecting a message. Their concept of information must not be confused with meaning. Meaning can be defined as the significance of information for the system which processes it and has to be measured in units other than those normally used in information theory. Meaning does not relate to the symbols used, but rather to what the symbols represent.
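Shannon's measure of the freedom of choice can be illustrated with a short sketch. The message probabilities below are invented for illustration; the point is only that the average information per selection, in bits, is highest when all messages are equally likely:

```python
import math

def shannon_entropy(probabilities):
    """Average information per message selection, in bits:
    H = -sum(p * log2(p)) over the probabilities of the possible messages."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# Four equally likely messages: maximum freedom of choice.
print(shannon_entropy([0.25, 0.25, 0.25, 0.25]))  # 2.0 bits

# A heavily skewed source offers less freedom of choice, hence less information.
print(shannon_entropy([0.97, 0.01, 0.01, 0.01]))  # ~0.24 bits
```

Note that this quantity says nothing about what the messages mean, only how much choice their selection involves.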
The information physics view states that information is an implicit component of virtually every equation governing the laws of physics. Information is a property of the universe and it does not have to be perceived, to have meaning or to be understood, in order to exist. Several well-known researchers have mathematically stated the relation between matter, energy and information. Einstein presented the relation between matter and energy in his formula E = mc². The connection between energy and information was described by Szilard (1898-1964) and between matter and information by Bremermann (1965). Bremermann also suggested an upper bound on the rate at which symbols can be processed by matter, “Bremermann’s limit”. This limit was defined as 10^47 bits/gram/sec and is founded on the fact that the speed of light cannot be exceeded. The reasoning presupposes that one photon is considered equivalent to one bit. The information physicist Tom Stonier (1990) has published some theorems concerning the interrelationship between matter, energy and information. Some may be regarded as quite revolutionary by classical information theorists. Freely interpreted they are as follows:
- All organized structures contain information, and — as a corollary — no organized structure can exist without containing some form of information.
- A structure that is rich in information tends to lack pattern (a random set of figures cannot be condensed into a simple set of instructions, so it has a high information content).
- The addition of information to a system manifests itself by causing a system to become more organized, or reorganized.
- An organized system has the capacity to release or convey information.
- Heat is the product of energy interacting with matter. Structure is the product of information interacting with matter.
- The information content of a system is directly proportional to the space it occupies.
- Time, like entropy, is inversely related to information. The greater the interval between two events, the less the information content of the system.
- Energy may be converted into information or be used to convert information from one form to another (information transducing).
- The more highly organized a system is, the more its information is separated from the energy that bears it. (Information should not be confused with the matter-energy markers which bear it.)
- Information cannot be stored in a system in thermal equilibrium. (With time, the printer’s ink in the individual letters of a book will diffuse away until a homogeneous state is reached).
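The Bremermann limit cited earlier (10^47 bits/gram/sec) can be reproduced with back-of-the-envelope arithmetic: if one quantum of energy carries at most one bit, the maximum processing rate for a mass m is m·c²/h. A sketch, with rounded physical constants:

```python
# Bremermann's limit: maximum bit-processing rate of one gram of matter,
# assuming one quantum of energy carries at most one bit (E = h*f, E = m*c^2).
c = 2.998e8      # speed of light, m/s
h = 6.626e-34    # Planck's constant, J*s
m = 1e-3         # one gram, in kg

rate = m * c**2 / h   # bits per second for one gram
print(f"{rate:.2e}")  # 1.36e+47 -- the 10^47 bits/gram/sec quoted above
```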
Organization may thus be expressed as the manifestation of information interacting with matter and energy. When added to matter, information exhibits itself as structure or organization; consequently organization may be regarded as stored information. The information stored in a crystal is, however, practically non-existent, as a crystal is a simple, regular array of atoms with a periodic structure.
Information may not only organize matter and energy, it may also organize information itself. From this it follows that the less organized a structure or process is, the less information is needed to describe it completely. A totally disordered state needs only a few bits of information to describe it. The thermodynamic steady-state of a gas can be completely described simply by giving its temperature and volume. As a system approaches steady-state, it loses information irreversibly. On the other hand, order is created by reducing complexity. Furthermore, information is everywhere, but knowledge only exists within a goal-seeking adaptive system consisting of goal-seeking subsystems.
A modification of the information content of a system always results in a corresponding change in its pertinent entropy, and a loss of organization is always associated with an increase in entropy. Such a change may be brought about through the alteration of its organization or of its heat content. The function of additional heat may be to stabilize the organization and minimize externally induced entropy. But if all existing heat were withdrawn from a system, its temperature would be zero kelvin and its entropy would also be zero. A phenomenon such as this may be interpreted according to the third law of thermodynamics (see p. 20).
The expansion of a system in physical space as a result of the application of energy will not produce a change in information unless accompanied by a change in organization. The information content of a system normally tends to vary directly with the space occupied and inversely with the time occupied. Concerning the impact of time, physical information is time-dependent and does not withstand the forces of entropy. A system which is more resistant to erosion over time has to contain more information. The configuration of a physical system blurs with time, and consequently an observation of it becomes increasingly obsolete.
Several inventions within the area of human communication enact information patterns on various forms of energy. Radio transmitters and printing presses, for example, multiply the information content several million times.
Accepting this view of information physics yields some inevitable general consequences for the relation between information, probability and entropy. Information, order and improbability stand opposed to lack of information, disorder and probability, which together represent entropy. Information is therefore to be considered an inverse exponential function of entropy and is sometimes referred to by some researchers as negative entropy (or negentropy). As entropy increases, information decreases. As entropy approaches infinity, information approaches zero.
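One common way to make the inverse relation concrete is to define negentropy as a distribution's shortfall from maximum entropy; the two example distributions below are invented for illustration:

```python
import math

def entropy(p):
    """Shannon entropy in bits of a discrete probability distribution."""
    return -sum(x * math.log2(x) for x in p if x > 0)

def negentropy(p):
    """Order (stored information) measured as distance from maximum entropy:
    J = H_max - H, where H_max = log2(n) for n possible states."""
    return math.log2(len(p)) - entropy(p)

ordered = [1.0, 0.0, 0.0, 0.0]          # fully organized: one certain state
disordered = [0.25, 0.25, 0.25, 0.25]   # maximum entropy: no organization

print(negentropy(ordered))     # 2.0 bits of order
print(negentropy(disordered))  # 0.0 -- entropy at its maximum, information gone
```

As the distribution approaches maximum entropy, the negentropy (information) falls to zero, matching the statement above.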
These facts explain why the famous Maxwell’s demon (see p. 20) does not disobey the second law of thermodynamics when it sorts molecules through a trap-door in a closed system. By his decisions about energy levels, the demon generates negentropy, the local increase of which is necessarily matched by more entropy elsewhere.
A paradox revealed by information physics is that as the universe evolves, its information content increases, and it may end up in a state where all matter and energy have been converted into pure information. The main laws of thermodynamics state that the total entropy of the universe has to increase. Evidently two contradictory universal forces exist simultaneously: one entropic, destroying and levelling out, and one organizing and building up.
Another paradox is that meaning and information have very little to do with each other, just as no directly visible correlation exists between order and information. The more disorganized and unpredictable a system is, the more information it is possible to obtain by watching it. One can never know if a hidden order exists; it may well exist even if it cannot be revealed for the moment. Organization, information and predictability are thus quite paradoxically interrelated.
A closer look at the concept of information will reveal some strange qualities. One is that it is impossible to wear out information. You may duplicate information in as many copies as you want without deterioration of the source, and moreover usually without cost. Getting rid of information will, on the other hand, always involve some cost.
Unlike the basic entities of matter and energy, information is a function of the observer as the same message may have different meanings for different people. Bateson defined information in one of his books as “a difference that makes a difference” (Bateson 1972). This difference is dependent on a matter or energy carrier (or marker, see p. 124), like a written letter or speech with sound waves in which a pattern appears. It can be observed at the atomic level, in molecules, in cells, in organs, in groups as norms and in society as culture.
Information may be measured on the basis of the amount of surprise, unpredictability or ‘news value’ that it conveys to the receiver. Paradoxically, disorder possesses a greater surprise potential than order. A completely unstructured sequence of letters like EVSYEDTOQPF is very difficult to describe; there is no simpler description than the sequence itself. It must therefore be assigned a maximum of information. A structure imposed on the letter source will reduce the average amount of information per letter from that source. Whereas an ordered row of letters, say ABCDEFGHIJKLMN, holds less information, the combination EINSTEIN can provide a great deal of information. The latter row has been the subject of more information processing in the ordering of the letters in relation to a meaningful context. Its information history comprises the knowledge of both humanity and the whole of Western scientific culture. Information is extremely context-dependent, and very often the real content of a message is read between the words.
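The contrast between the patterned row and the random row can be demonstrated with a general-purpose compressor: a sequence generated by a simple rule can be condensed into something close to that rule, while a (pseudo-)random sequence of the same length cannot. A sketch, using sequences built from the same fourteen letters as the example above:

```python
import random
import zlib

random.seed(0)
n = 1400

# Ordered: the alphabet row repeated -- a short rule generates the whole thing.
ordered = ("ABCDEFGHIJKLMN" * 100).encode()

# Disordered: letters drawn at random -- no description much shorter
# than the sequence itself.
disordered = "".join(random.choice("ABCDEFGHIJKLMN") for _ in range(n)).encode()

print(len(zlib.compress(ordered)))     # small: the pattern condenses into a rule
print(len(zlib.compress(disordered)))  # far larger: little structure to exploit
```

The compressed size is a practical stand-in for the descriptive (information) content per letter that the passage discusses.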
Figure 4.5 Relation between information and uncertainty in a communication channel.
The reception of information will normally result in a decrease of uncertainty for the receiver. This process is shown in Figure 4.5, where the 45° slope of the line representing the channel indicates a linear relation between information conveyed and uncertainty reduced.
The amount of uncertainty reduction depends upon the amount of information already held by the receiver. Depending on this pre-existing information, the slope in the figure may take quite a different angle.
The communication system with transmitter, channel, receiver, etc. and its given set of message categories constitutes a closed system. Information expressed in categories other than those normally used therefore has a tendency to be interpreted as an error arising from distortions or mistakes. Information, however, is normally ordered simultaneously on many levels. The regular effect of communicated information is surprise followed by uncertainty reduction. Information which initially increases uncertainty evokes a higher order of surprise related to a discontinuity in the information accumulation.
Imagine the optical information system of traffic lights. Each driver knows what to do when the light shifts to green via amber, but what if the light suddenly turns blue instead of green? With a shift from one level of categories or channels to another the reduction of uncertainty ceases. When the passenger tells you that blue light is a request to pull over and stop the car in order to let the fire brigade pass (exclusive for this town), you say ‘Aha!’. A fresh set of categories accommodating both the old and new is established and the uncertainty reduction can be resumed. Figure 4.6 depicts graphically the discontinuity in uncertainty reduction and the transition to a more diverse channel, all related to the example given above.
Figure 4.6 Discontinuity in uncertainty reduction and transition to a more diverse channel.
A discontinuity is also possible in a reverse way. Imagine the car at the red light when it suddenly turns green without showing amber first. This transition to a less diverse channel poses no problem as the information received still fits into the basic content of the existing category and need not be reinterpreted into new categories. Both red and green retain their meaning (see Figure 4.7).
The shift between channels of varying complexity is an important part of the communication process. The involved parties have to ensure that the channels used are neither too simple nor too complex for the kinds of message used and they must adjust them as required.
A message may hold information which, although not present in the message per se, is comprehended by the receiver through reference to previously known facts. This is the basis for the concept of exformation, derived from external information (Norretranders 1993). It is information which exists in the head of the sender, is omitted in the composition of the message and is presumed to be deduced by the receiver. The intention of a message is primarily to induce the receiver to form an idea corresponding to that of the sender. This use of exformation is possible because human beings share experiences which can be interpreted through a common language, giving rise to the same associations. The words refer to something not intrinsic to the words themselves but conceivable in the mind of their user.
Figure 4.7 Discontinuity in uncertainty reduction and transition to a less diverse channel.
Information is something measurable, expressed by the letters or bits used in the concrete message, while exformation is all that was omitted or extracted. More information is not necessarily more exformation. It is therefore not possible to measure the amount of exformation; it depends on each particular context. In certain circumstances the omission or non-existence of a special signal may stand for a complete message. A phrase like ‘silence speaks volumes’ tells us that a general silence may convey a very comprehensive message. The same goes for an absent daughter who has promised to phone home over the weekend if she does not feel well. If she does not telephone, she has sent a message without a single signal. Thus the information content of a message clearly depends on someone’s prior expectations about the message. With this perspective we are not able to speak of how much information a person has, only how much a message has.
Empty spaces within the organized structure of a message may also be highly significant pieces of information. The most frequent symbol in written English is the space between words which conveys information until the words are removed and the page becomes blank. It is the organization and structure of the surrounding system which defines the information content of existing empty spaces. Spaces as discontinuity define boundaries of structural entities, but the absence of structure within a structure can sometimes constitute information as significant as the structure itself. Therefore, a message not sent is also informative. The value of information may here be defined as the amount of work which is performed by the sender and which the receiver need not repeat.
Source: Skyttner Lars (2006), General Systems Theory: Problems, Perspectives, Practice, Wspc, 2nd Edition.