Widely-known laws, principles, theorems and hypotheses

Systems knowledge of a more general nature, particularly concerning system behaviour, has been expressed in various laws, principles, theorems and hypotheses. This knowledge is considered part of the core of General Systems Theory even when it originated in another field. Some of the formulations presented here cover a broad range of system aspects and are widely applicable, although most of them concern living systems.

The different parts of General Systems Theory are presented below, beginning with the laws.

  • The second law of thermodynamics: In any closed system the amount of order can never increase, only decrease over time.
  • The complementary law: Any two different perspectives (or models) about a system will reveal truths regarding that system that are neither entirely independent nor entirely compatible (Weinberg 1975).
  • The law of requisite variety: Control can be obtained only if the variety of the controller is at least as great as the variety of the situation to be controlled (Ashby 1964).
  • The law of requisite hierarchy: The weaker and more uncertain the regulatory capability, the more hierarchy is needed in the organization of regulation and control to achieve the same result (Aulin and Ahmavaara 1979).
  • The law of requisite parsimony: Human short-term memory is incapable of recalling more than seven plus or minus two items. Three elements and the four possible interactions among them amount to such seven items (Miller 1956).
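The law of requisite variety can be made concrete with a small numeric sketch. The model below is hypothetical (the function name and the one-response-per-disturbance simplification are illustrative, not Ashby's notation): a regulator that can choose among fewer responses than there are disturbance states necessarily lets several distinct outcomes through.

```python
import math

# Toy model of Ashby's law of requisite variety. Assumed simplification:
# each regulator response can cancel exactly one kind of disturbance, so
# the outcome variety that survives regulation is at least
# ceil(V_disturbance / V_regulator).
def surviving_outcome_variety(n_disturbances: int, n_responses: int) -> int:
    return math.ceil(n_disturbances / max(n_responses, 1))

# A controller whose variety matches the situation achieves full control:
print(surviving_outcome_variety(8, 8))  # 1 -- only the desired outcome remains
# Halve the controller's variety and at best two outcomes remain possible:
print(surviving_outcome_variety(8, 4))  # 2
```

Control degrades gracefully: each missing response leaves another slice of the disturbance space unregulated.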

The following general principles are valid for all kinds of systems.

  • System holism principle: A system has holistic properties not manifested by any of its parts. The parts, in turn, have properties not manifested by the system as a whole.
  • Suboptimization principle: If each subsystem, regarded separately, is made to operate with maximum efficiency, the system as a whole will not operate with utmost efficiency.
  • Darkness principle: No system can be known completely.
  • Eighty-twenty principle: In any large, complex system, eighty per cent of the output will be produced by only twenty per cent of the system.
  • Hierarchy principle: Complex natural phenomena are organized in hierarchies wherein each level is made up of several integrated systems.
  • Redundancy of resources principle: Maintenance of stability under conditions of disturbance requires redundancy of critical resources.
  • Redundancy of potential command principle: In any complex decision network, the potential to act effectively is conferred by an adequate concatenation of information.
  • Relaxation time principle: System stability is possible only if the system’s relaxation time is shorter than the mean time between disturbances.
  • Negative feedback causality principle: Given negative feedback, a system’s equilibrium state is invariant over a wide range of initial conditions (equifinality).
  • Positive feedback causality principle: Given positive feedback in a system, radically different end states are possible from the same initial conditions (multifinality).
  • Homeostasis principle: A system survives only so long as all essential variables are maintained within their physiological limits.
  • Steady-state principle: For a system to be in a state of equilibrium, all subsystems must be in equilibrium; with all subsystems in equilibrium, the system must be in equilibrium.
  • Self-organizing systems principle: Complex systems organize themselves, and their characteristic structural and behavioural patterns are mainly a result of interaction between the subsystems.
  • Basins of stability principle: Complex systems have basins of stability separated by thresholds of instability. A system dwelling on a ridge will suddenly return to the state in a basin.
  • Viability principle: Viability is a function of the proper balance between autonomy of subsystems and their integration within the whole system, or of the balance between stability and adaptation.
  • First cybernetic control principle: Successful implicit control must be a continuous and automatic comparison of behavioural characteristics against a standard. It must be followed by continuous and automatic feedback of corrective action.
  • Second cybernetic control principle: In implicit control, control is synonymous with communication.
  • Third cybernetic control principle: In implicit control, variables are brought back into control in the act of, and by the act of, going out of control.
  • The feedback principle: The result of behaviour is always scanned and its success or failure modifies future behaviour.
  • The maximum power principle: Those systems that survive in competition between alternative choices are those that develop more power inflow and use it to meet the needs of survival.
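The negative and positive feedback causality principles (equifinality and multifinality) can be illustrated with a one-line iterated map. This is a sketch under assumed dynamics, not a model from the text: a positive gain pulls the state toward a target each step, while a negative gain pushes it away.

```python
# Assumed one-dimensional dynamics: each step, the state x is adjusted
# in proportion to its distance from a target value.
def iterate(x0: float, gain: float, target: float = 10.0, steps: int = 100) -> float:
    x = x0
    for _ in range(steps):
        x += gain * (target - x)  # gain > 0: negative (corrective) feedback
    return x

# Equifinality: under negative feedback, widely different initial states
# converge on the same equilibrium.
assert abs(iterate(-100.0, 0.5) - 10.0) < 1e-6
assert abs(iterate(+500.0, 0.5) - 10.0) < 1e-6

# Multifinality: under positive feedback (gain < 0 amplifies any deviation
# from the target), two almost identical initial states end up at
# opposite extremes.
low, high = iterate(10.0 - 1e-9, -0.5), iterate(10.0 + 1e-9, -0.5)
assert low < 10.0 < high and high - low > 1e6
```

The same machinery also illustrates the relaxation time principle: the larger the corrective gain, the shorter the relaxation time, and hence the higher the disturbance rate the system can absorb.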

Living systems also follow a number of main systemic principles, foremost in connection with preserving stability. The twelve below have been defined by Watt and Craig in their book Surprise, Ecological Stability Theory (1988).

  • The omnivory principle: The greater the number of different resources and of pathways for their flow to the main system components, the less likely the system is to become unstable. In other words: spread the risks, or ‘Don’t put all your eggs in one basket’.
  • The high-flux principle: The higher the rate of the resource flux through the system, the more resources are available per time unit to help deal with the perturbation. Whether all resources are used efficiently may matter less than whether the right ones reach the system in time for it to be responsive.
  • The variety-adaptability principle: Systemic variety enhances stability by increasing adaptability.
  • The flatness principle: The wider their base in relation to their number of hierarchic levels, the more stable organizational pyramids will be. A larger number of independent actors increases stability.
  • The system separability principle: System stability increases as the mean strength of interaction between components is decreased. Stability is enhanced by separating the elements of the system from one another.
  • The redundancy principle: Generally, arithmetic increases in redundancy yield geometric increases in reliability. In self-organizing systems, negative feedback regulates reproduction, where too little redundancy leads to the species dying out and too much to over-reproduction.
  • The buffering principle: Stability is enhanced by maintaining a reserve. An unused reserve cannot, however, help the system.
  • The environment-modification principle: To survive, systems have to choose between two main strategies. One is to adapt to the environment, the other is to change it. The beaver, for example, changes the environment for its own benefit.
  • The robustness principle: The ability of a system to passively withstand environmental change may derive from simple physical protection or it may involve a complex of mechanisms similar to those used by the butterfly to overwinter as a pupa.
  • The patchiness principle: The lack of capacity to use a variety of resources leads to instability (the external counterpart to the omnivory principle). Rule-bound systems, stipulating in advance the permissible and the impermissible, are likely to be less stable than those that develop pell-mell.
  • The over-specialization principle: Too much of a good thing may render systems unstable in the face of environmental change. It is through this principle that the conflict between the parts and the whole is played out.
  • The safe environment principle: Based upon the environment- modification principle, it states the importance of creating a permanently stable environment whereby the system is protected from change.
  • The principle of adaptation: For continued system cohesion, the mean rate of system adaptation must equal or exceed the mean rate of change of the environment (Hitchins 1992).
  • The Red Queen principle: A system must continuously develop merely to maintain its fitness relative to the systems it coevolves with (Van Valen 1973).
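The redundancy principle's claim that arithmetic increases in redundancy yield geometric increases in reliability follows directly from independent failures. A minimal sketch (the independence assumption and the failure probability are chosen for illustration):

```python
# n redundant components, each failing independently with probability p,
# all fail together with probability p**n. Adding spares one at a time
# (arithmetic) therefore shrinks the failure probability by a constant
# factor each time (geometric).
def failure_probability(p: float, n_redundant: int) -> float:
    return p ** n_redundant

probabilities = [failure_probability(0.5, n) for n in range(1, 5)]
print(probabilities)  # [0.5, 0.25, 0.125, 0.0625]
```

Each extra spare halves the failure probability here; with more reliable components (smaller p) the geometric gain per spare is correspondingly steeper.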

The following theorems together contribute further perspectives to the above material.

  • Gödel’s incompleteness theorem: All consistent axiomatic foundations of number theory include undecidable propositions.
  • Redundancy-of-information theorem: Errors in information transmission can be prevented by increasing the redundancy in the messages.
  • Recursive-system theorem: In a recursive organizational structure, each viable system contains, and is contained in, a viable system. (In a military context, for example, it says that group, platoon, company, etc. all have the same functions and structure as integrated parts of the battalion.)
  • Feedback dominance theorem: In an efficient system, feedback dominates output in spite of extensive variations of input. That is, regulation is more efficient on the input side.
  • Conant-Ashby theorem: Every good regulator of a system must be a model of the system.
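The redundancy-of-information theorem underlies all error-correcting codes. The sketch below uses the simplest possible scheme, a threefold repetition code with majority-vote decoding (chosen here for clarity; practical codes such as Hamming codes buy the same protection far more economically):

```python
# Each bit is transmitted three times; the receiver takes a majority
# vote per triple, so any single flipped bit per triple is corrected
# at the cost of tripling the message length.
def encode(bits):
    return [b for bit in bits for b in (bit, bit, bit)]

def decode(received):
    return [1 if sum(received[i:i + 3]) >= 2 else 0
            for i in range(0, len(received), 3)]

message = [1, 0, 1, 1]
sent = encode(message)           # 12 bits on the wire instead of 4
sent[1] ^= 1                     # corrupt one bit of the first triple
sent[9] ^= 1                     # ... and one bit of the last triple
assert decode(sent) == message   # both errors are repaired by redundancy
```

Two flips inside the same triple would defeat this code, which is the theorem's point: the amount of redundancy bounds the number of transmission errors that can be prevented.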

The following hypotheses have been selected from the approximate total of one hundred in Miller’s book Living Systems (1976), in which he introduced the General Living Systems theory (GLS). Their social and managerial implications are obvious.

  • A system’s processes are affected more by its suprasystem than by its supra-suprasystem or above, and by its subsystems more than by its sub-subsystems or below.
  • The amount of information transmitted between points within a system is significantly larger than the amount transmitted across its boundary.
  • The larger the percentage of all matter-energy input that a system consumes for the information processing necessary to control its various system processes, as opposed to matter-energy processing, the more likely the system is to survive.
  • Strain, errors and distortions increase in a system, as the number of channels blocked for information transmission increases.
  • In general, the farther components of a system are from one another and the longer the channels between them are, the slower is the rate of information flow among them.
  • The higher the level of a system the more correct or adaptive its decisions.
  • Under equal stress, functions developed later in the phylogenetic history of a given type of system break down before the more primitive functions do.
  • The greater the resources available to a system, the less likely there is to be conflict among its subsystems or components.
  • The vigour of the search for resolutions of conflicts increases as the available time for finding a solution decreases.
  • A component will comply with a system’s purposes and goals to the extent that those functions of the component directed towards the goal are rewarded and those directed away from it are punished.

Some of the axioms related to GST can be found on pp. 51-56. The nature of axioms has been explicitly expressed in Gödel’s theorem, presented on p. 104. His theorem is a very elegant series of proofs about the nature of mathematics. There he concludes that an axiom system in mathematics that is sufficiently complex to support arithmetic will be either inconsistent or incomplete (not both). You cannot prove a system with the system itself!

Source: Skyttner, Lars (2006), General Systems Theory: Problems, Perspectives, Practice, 2nd edn, World Scientific.
