Basic concepts of information theory

Information theory is concerned with the problem of how to measure changes in information or knowledge content. It is based on the fact that we can represent our experience by the use of symbols like the alphabet, pictures, etc. Generally, we only need information when faced with some kind of choice. If you know the road to the bus stop you do not need to consult a map for information. Information, ignorance, choice, prediction and uncertainty are all closely related.

Since the establishment of classical information and communication theory in the 1940s, however, nobody has succeeded in stating a general definition of the concept of information. The problem is that information is a relation, not a concrete thing. Without a context there is no difference between information and noise: what is noise in one relation may be considered information in another. Information is a kind of variation pattern, not differing from other kinds of variation; if it is not coded it must be considered noise. From a philosophical point of view, matter and energy exist without the need of observation, while information only exists under observation. The idea of information does not make sense unless there is an information processor.

Several conceptions of information exist, often used simultaneously and in the same context, which is a cause of confusion. Information is a highly abstract term, possible to interpret in hundreds of ways. It could be radio signals travelling through the air, light pulses down a fibre-optic cable, sequences of bases in DNA, etc. The literal meaning of the word is as a rule ‘that which determines form’. As commonly employed, its significance is mostly derived from the context in which it is used. Information is not just information in itself; it becomes a concept with a content when it is information to somebody, i.e. as a mental construct. Information is neither matter nor energy; it is rather an abstract concept of the same kind as entropy, which must be considered a conceptual relative. ‘Amount of information’ is a metaphorical term and has in fact no numerical properties.

An existential view of information is relativistic and states that information per se is something imperceptible. Digital letters, numbers, sounds and images are sequences of zeros and ones, not something possible to perceive as information. Pure information, like pure knowledge, signifies nothing at all; it is the context in which it is employed that gives it existence and value. Information becomes knowledge only when we decide to put it into use. Without this transformation, stored information is nothing more than physical or electronic signs.

Defined from a societal standpoint, information may be seen as an entity which reduces maladjustment between system and environment. In order to survive as a thermodynamic entity, all social systems are dependent upon an information flow. This explanation is derived from the parallel between entropy and information where the latter is regarded as negative entropy (negentropy). In more common terms information is a form of processed data or facts about objects, events or persons, which are meaningful for the receiver, inasmuch as an increase in knowledge reduces uncertainty.
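The parallel between information and negative entropy can be made concrete with Shannon's standard entropy formula, which the text alludes to rather than states. The sketch below is a minimal, illustrative calculation (the probability numbers and function names are my own, not from the source): uncertainty is measured in bits before and after a message arrives, and the information received is read off as the reduction in uncertainty.

```python
import math

def entropy(probabilities):
    """Shannon entropy H = -sum(p * log2(p)), measured in bits."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# Before: four equally likely outcomes -> maximum uncertainty
h_before = entropy([0.25, 0.25, 0.25, 0.25])   # 2.0 bits

# After a message rules out two of them: uncertainty drops
h_after = entropy([0.5, 0.5])                  # 1.0 bit

# Information received = reduction in uncertainty (the negentropy view)
print(f"Information gained: {h_before - h_after:.1f} bits")   # 1.0 bits
```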

Information scientists, however, with their more explicit need for clarity, use some of the terms defined below to measure information content. Selective information has to do with the minimum number of independent choices between two equally likely possibilities; it promises to narrow the range of prospects about which we are ignorant, and its measure is a relation between a signal and an ensemble. Descriptive information is seen as small entities which, when added together, build up more knowledge about something. A microscope with higher resolution accordingly gives more structural information about an object than one with lower resolution. On the other hand, when an observation gains precision through better instrumentation and finer readings (more decimal places), it has gained in metrical information content. The measures of selective, structural and metrical information content can be seen as complementary; a simple analogy of their mutual relationship would be volume, area and height as measures of size.
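Under the usual binary, equally-likely reading of the selective measure, the minimum number of independent yes/no choices needed to single out one item among N equally likely alternatives is log2(N). A small numeric sketch (the function name is illustrative, not from the source):

```python
import math

def selective_information(n_alternatives: int) -> float:
    """Minimum number of independent binary choices needed to single out
    one item among n equally likely alternatives, in bits."""
    return math.log2(n_alternatives)

print(selective_information(2))    # 1.0  -> one yes/no choice
print(selective_information(8))    # 3.0  -> three successive halvings
print(selective_information(26))   # ~4.7 -> picking one letter of the alphabet
```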

Information is always dependent on some physical base, or energy flow, where the energy component is subordinate to the structure of variation manifested by the flow. The structure of variations in the medium used must always remain unaffected by the carrier, however it is chosen. If these variations in some way match the structure of the receiving entity, a dynamic relation is possible. Information is therefore a kind of relationship between sets of structured variety and not a substance or concrete entity.

Information has also been defined according to what has been called the infological equation (Langefors 1973). It is represented by the following formula:

I = i(D, S, t)

In this formula, I stands for the information achieved by an interpretation process, i, acting on data, D, with regard to previous knowledge, S, during the time, t, which is available.
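The equation is conceptual rather than computational, but a minimal sketch can show its shape. Everything in the code below (the function, the data items, the numbers) is hypothetical and only illustrates the relational point: the same data D yields different information I depending on the receiver's prior knowledge S and the time t available for interpretation.

```python
def interpret(data, prior_knowledge, time_available):
    """Hypothetical sketch of the infological equation I = i(D, S, t):
    the information extracted depends on the data itself, on what the
    receiver already knows, and on how much time interpretation gets."""
    information = []
    for item in data:
        if item not in prior_knowledge and time_available > 0:
            information.append(item)   # only new items become information
            time_available -= 1        # interpretation consumes time
    return information

# The same data carries different information for different receivers.
data = ["train leaves 08:15", "platform 3", "ticket required"]
print(interpret(data, prior_knowledge={"ticket required"}, time_available=10))
print(interpret(data, prior_knowledge=set(), time_available=0))
```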

Whatever the definition, information is an invisible agent, like electricity in a modern town, tying together all components (personnel, machines, money, material, etc.). It is the instructions which permit a system to perform structural or logical work. Information organizes not only matter and energy but also itself. It therefore has to be filtered, condensed, stored, transmitted, received, aggregated, integrated, manipulated and presented, that is, processed. At best, the processing is non-destructive and leaves the information

  • relevant
  • complete
  • uncorrupted
  • actual

Decision making and control, regulation and measurement are all effected through information. These processes rest on the fact that information may be infinite, but from a human point of view it can only be organized in a finite number of ways. All information can be structured according to the following:

  • Category
  • Time
  • Location
  • Alphabet
  • Continuum

Of course, each choice has many variations, but the main alternatives are still basically five in number. Take as an example a book about to become part of a library: once one of these structures is used, the book is easily recognized.

Category in a library means the main topic according to the content of the book. Such topics are fiction, history, philosophy, etc. Time in a library sense is the printing year of the book, while location is its physical position on one of the shelves. Alphabet is the arrangement of the book stock in alphabetic order, both with regard to the author’s name and the title. Continuum is the current newsletter presenting all recently acquired books in the library stock. Traditionally, libraries have used all structures simultaneously, well aware that each way of organizing information will permit a different understanding. The possibility of multiple perspectives is a good approach when the aim is to extract maximum value and significance from information.
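A minimal data-structure sketch of the library example (field and variable names are illustrative, not from the source), showing how one and the same book record supports all five ways of structuring, each yielding a different ordering of the stock:

```python
from dataclasses import dataclass

@dataclass
class Book:
    title: str
    author: str
    topic: str          # Category: main topic of the content
    printing_year: int  # Time: year of printing
    shelf: str          # Location: physical position in the stacks
    acquired: str       # Continuum: date it entered the newsletter of new titles

stock = [
    Book("Candide", "Voltaire", "fiction", 1759, "F-12", "2006-03-01"),
    Book("Critique of Pure Reason", "Kant", "philosophy", 1781, "P-04", "2006-02-15"),
]

by_category = sorted(stock, key=lambda b: b.topic)                    # Category
by_time     = sorted(stock, key=lambda b: b.printing_year)            # Time
by_shelf    = sorted(stock, key=lambda b: b.shelf)                    # Location
by_alphabet = sorted(stock, key=lambda b: (b.author, b.title))        # Alphabet
newsletter  = sorted(stock, key=lambda b: b.acquired, reverse=True)   # Continuum
```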

A special problem is information unconstrained by packaging in the form of books or journals, e.g. on the Internet. There it is copied and added to in a continuous process, like the ongoing adaptation of stories in the oral tradition before literacy.

Source: Skyttner, Lars (2006), General Systems Theory: Problems, Perspectives, Practice, 2nd edition, World Scientific Publishing Co.
