
Ahmed’s Dev-Shop Laws of Systemantics by John Gall

Posted in Engineering, software by Ahmed Siddiqui on May 15, 2008

The Primal Scenario or Basic Datum of Experience: Systems in general work poorly or not at all. (Complicated systems seldom exceed five percent efficiency.)

The Fundamental Theorem: New systems generate new problems.

The Law of Conservation of Anergy [sic]: The total amount of anergy in the universe is constant. (“Anergy” is Gall’s coinage for ‘human energy’.)

Laws of Growth: Systems tend to grow, and as they grow, they encroach.

The Generalized Uncertainty Principle: Systems display antics. (Complicated systems produce unexpected outcomes. The total behavior of large systems cannot be predicted.)

Le Chatelier’s Principle: Complex systems tend to oppose their own proper function. (As systems grow in complexity, they tend to oppose their stated function.)

Functionary’s Falsity: People in systems do not do what the system says they are doing.

The Operational Fallacy: The system itself does not do what it says it is doing.

The Fundamental Law of Administrative Workings (F.L.A.W): Things are what they are reported to be. The real world is what it is reported to be.

Systems attract systems-people. (For every human system, there is a type of person adapted to thrive on it or in it.)

The bigger the system, the narrower and more specialized the interface with individuals.

A complex system cannot be “made” to work. It either works or it doesn’t.

A simple system, designed from scratch, sometimes works.

Some complex systems actually work.

A complex system that works is invariably found to have evolved from a simple system that works.

A complex system designed from scratch never works and cannot be patched up to make it work. You have to start over, beginning with a working simple system.

The Functional Indeterminacy Theorem (F.I.T.): In complex systems, malfunction and even total non-function may not be detectable for long periods, if ever.

The Newtonian Law of Systems Inertia: A system that performs a certain way will continue to operate in that way regardless of the need or of changed conditions.

Systems develop goals of their own the instant they come into being.

Intrasystem [sic] goals come first.

The Fundamental Failure-Mode Theorem (F.F.T.): Complex systems usually operate in failure mode.

A complex system can fail in an infinite number of ways. (If anything can go wrong, it will.) (see Murphy’s law)

The mode of failure of a complex system cannot ordinarily be predicted from its structure.

The crucial variables are discovered by accident.

The larger the system, the greater the probability of unexpected failure.

“Success” or “Function” in any system may be failure in the larger or smaller systems to which the system is connected.

The Fail-Safe Theorem: When a Fail-Safe system fails, it fails by failing to fail safe.

Complex systems tend to produce complex responses (not solutions) to problems.

Great advances are not produced by systems designed to produce great advances.

The Vector Theory of Systems: Systems run better when designed to run downhill.

Loose systems last longer and work better. (Efficient systems are dangerous to themselves and to others.)

As systems grow in size, they tend to lose basic functions.

The larger the system, the less the variety in the product.

Control of a system is exercised by the element with the greatest variety of behavioral responses.

Colossal systems foster colossal errors.

Choose your systems with care.



    (I got to this from https://ontonixqcm.wordpress.com/2016/06/26/brexit-the-most-likely-future-scenario/ )

    More on ‘systemantics’ and Gall: https://en.wikipedia.org/wiki/Systemantics

    And it is Quite Interesting: General Systemantics (retitled to Systemantics in its second edition and The Systems Bible in its third) is a systems engineering treatise by John Gall in which he offers practical principles of systems design based on experience and anecdotes.

    It is offered from the perspective of how not to design systems, based on system engineering failures. The primary precept of the treatise is that large complex systems are extremely difficult to design correctly despite best intentions, so care must be taken to design smaller, less complex systems, and to do so with incremental functionality based on close and continual touch with user needs and measures of effectiveness.

    The term systemantics is a commentary on prior work by Alfred Korzybski called General Semantics, which conjectured that all system failures could be attributed to a single root cause: a failure to communicate. Dr. Gall observes that, instead, system failure is an intrinsic feature of systems. He thereby derives the term ‘General Systemantics’ in reference to a sweeping theory of system failure, one attributed to intrinsic features governed by laws of system behavior. As a side note, he observes that the term also playfully captures the idea that systems naturally “act up” (display antics).

    This is more a universal observation than a law. The origin of this observation can be traced back to:

    • Murphy’s Law: “if anything can go wrong, it will”;

    • Alfred Korzybski’s General Semantics notion of failure’s root cause being a communication problem;

    • Humorist Stephen Potter’s One-upmanship, on ways to “game” the system for personal benefit;

    • Historian C. Northcote Parkinson’s principle called Parkinson’s Law: “Work expands so as to fill the time available for its completion”;

    • Educator Laurence J. Peter’s widely cited Peter Principle: “In a hierarchy every employee tends to rise to his level of incompetence … in time every post tends to be occupied by an employee who is incompetent to carry out its duties … Work is accomplished by those employees who have not yet reached their level of incompetence.”

    By “systems”, the author refers to those that “…involve human beings, particularly those very large systems such as national governments, nations themselves, religions, the railway system, the post office…” though the intention is that the principles are general to any system.

    Additionally, the author observes:

    Everything is a system.

    Everything is part of a larger system.

    The universe is infinitely systematized, both upward (larger systems) and downward (smaller systems).

    All systems are infinitely complex.



      Full book (John Gall, Systemantics: How Systems Work and Especially How They Fail): http://wtf.tw/ref/gall.pdf

      And from the ‘panarchy’ website: http://www.panarchy.org/gall/systemantics.html

      John Gall, Systemantics: How Systems Work and Especially How They Fail



      During the 1970s, when Systems Theory was very much in the ascendancy, this booklet came out to remind people that large systems are prone to failure, and that it is no use relying on bigger and bigger systems to solve bigger and bigger problems, as those problems are, very likely, the product of those very systems. Towards the end of the booklet, John Gall introduces the brilliant idea that the best way to avoid the concentration of power is not the diffusion of power (as power has the habit of re-concentrating itself in the long run) but the diffusion of the targets of power, i.e. of the citizens of this world. To allow this, he advocates the introduction of two new freedoms:

      • The Free Choice of Territory (Distributional Freedom).
      • The Free Choice of Government (Principle of Hegemonic Indeterminacy).

      These two freedoms show remarkable similarities to the idea of Panarchy. Considering that many other voices have put forward the ideas of aterritorialism and the free choice of government, it seems that, whatever the name used (the Principle of Hegemonic Indeterminacy, Panarchy, or Polyarchy), we are dealing with a recurring basic aspiration. It is time that those still indoctrinated by the territorial states take notice of it, because it is not going away; the alternative is the continuation of innumerable conflicts and clashes, right up to full-blown so-called “civil” wars.