www.lesliemelcher.com


Complexity Theory: The Theory of the Ultimate Void.

7-7-07 - Revised Tuesday, July 17, 2007


The so-called Complexity Theory is almost like the Visa card commercial: everywhere, but maybe not always where YOU want to be.

All “serious” books, magazines, just about anything listed under the ‘non-fiction' label, somehow need to mention or allude to some elusive ‘complex systems'. Complex Systems, Complexity Theory: today's mantra... In order to sound scientifically versed, or simply to show rational credentials, one must never omit mentioning complex systems, whether as an explanation, an aside, a consideration, or a problem that will need to be fixed at some later time. Complex systems are here to stay. The words and notions that invariably follow the mere mention of complexity include (this is a mere sample): networks, interrelations, interconnections, chaos, irreducibility, relationships, global (global anything will do quite well), randomness, probability, incompleteness, interactions, loops, negative or positive feedback, and so on. The only correlation these terms share is their meaninglessness.

It matters not that Complexity Theory has no definition (or has as many as there are books written about it); there is a specific gravitational grammar around it. Our newly discovered postmodern mindset suggests that we have finally arrived at the conclusion that things are not always as simple as they seem.

This revolutionary conceptual new frontier is now officially open to all, so that a ‘new' pseudo-scientific babble can expand endlessly.

Three main categories of ‘complexity' seem to fit the mold of a complex system:

•  The extreme difficulty or near impossibility of solving a specific problem or a set of similar problems (mostly found in the human sciences or esoteric mathematics). In this type of complexity, time is the primary factor creating complexity: complex issues in this category would take anywhere from a lifetime to infinity to solve. It is important to note that the elements that make up this type of complex system are fully described, understood, and known. It is their relationships or interconnectivity (the computational factor) that creates complexity. NP-complete problems or Omega-type numbers are typical examples of this version of complexity in mathematics. Nature and its wonders, such as why a mound of sand collapses after x amount of sand is dripped onto it, are the human-science version of the same idea (hint: stability, it seems, sits at the very edge of chaos, i.e. just before the sand pile collapses under its own weight).

•  The lack of data to explain or predict situations or outcomes (the generic version, and sociological studies) is another version of Complexity Theory. It is usually used to explain away gross scientific deficiencies. Such sciences profess to predict the future using advanced modeling techniques, and use complexity as an explanation for their constant failure to explain or predict anything. Think of the economy or weather forecasting: error is built in, an existential part of these systems. Even physics has joined the fray, complaining that our backward state of technology is to blame for the lack of adequate concrete experimental data, hence the impossibility of finding solutions to most questions bouncing around the quantum field. Most sciences fall into this category these days. It is interesting to note that, whatever the complaints may be, the technology is there but the interpretation is not. Discoveries are made, but no one seems able to make heads or tails of them. Complexity Theory: bits of data without glue.

•  World issues, the global economy, global warming, etc. (the rest of the unexplained-because-unexplainable: the impossibility of conceiving an explanation for lack of any data whatsoever, save for the effect of too many unknown unknowns) constitute the third version. This one is close to ‘chaos theory', as it usually bundles together all fields of human activity in particular and in general, predicting universal fireworks of an infinite magnitude. The principle goes like this: the author illustrates how, should one of these human activities (be it farming, stock trading, or industrial processing) go wrong, the damage of this single event would go from singular to global, from one to many, multiplying before collapsing, domino-style, the whole house of cards that mankind built. The universe as we know it would ultimately implode. In this scenario, we are apparently doomed if we do and doomed if we don't. Since complexity arises from nowhere in particular, or somewhere/nowhere under unknown conditions and conjunctions, with horrendous repercussions everywhere and to everything, civilization is bound to collapse pretty soon. Global this or global that are the usual suspects here.

The overall logic of Complexity Theory always starts out by explaining our lack of understanding, hence of solutions. It then proceeds to propose solutions to our lack of understanding… The argumentation stops when the author(s) ‘stumble' upon some core complex system that cannot be further explained in simple terms. That complex system is deemed irreducible: all the parts of the system are intermingled so badly that they promptly declare the apparent cluster the foundation of their personal complex system.

This is the premise of all Complexity Theorists. Based on this phenomenal discovery, we are subjected to a long explanation of the meaning of this irreducibility and its formidable significance. The conclusion typically exposes a loose method for the befuddled reader to play with the system so that he, too, can be part of the ‘Secret'.

In short, the Complexity Theorists explain how complexity can be simplified by some external explanation - a simple yet meta-complex solution.

Complexity Theory is basically a single name for a totally separate and distinct set of issues conveniently buried under pseudo-scientific jargon. Since it is impossible for technocrats and specialists alike NOT to have a ready-made answer to everything, the lack of an answer becomes the answer itself. As such, Complexity Theory might just as well be renamed the Theory of the Void, the Theory of Nothing, or the Theory of the I-Don't-Know, and should include all unknown things and phenomena, including but not limited to dark matter (the old Ether thing), ghosts, flying saucers, and alien abductions.

Complexity Theory: Who needs it?


E-Task Definition


Managing projects by e-task breakdowns (atomic or elementary tasks) is an application of the probabilistic law of large numbers: while some tasks will take longer than initially projected, others will take less. It follows that we can:

•  Temporarily re-allocate additional resources to late tasks (from the pool of completed parallel tasks), and

•  Re-distribute the overall probability that all tasks will finish according to schedule, as a late task and an early task cancel each other out in the overall schedule time estimate.
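As a rough illustration of this law-of-large-numbers argument (all numbers below are hypothetical, not taken from any real project), a short simulation shows how individual e-task overruns and underruns largely cancel in the total:

```python
import random

# Hypothetical setup: 100 e-tasks, each estimated at 10 working days, with
# actual durations varying uniformly between 5 and 15 days. The law of large
# numbers suggests that early and late tasks tend to cancel in the total.
random.seed(42)

def simulate_project(n_tasks, estimate=10.0, spread=5.0):
    """Total actual duration of a project made of n_tasks e-tasks."""
    return sum(random.uniform(estimate - spread, estimate + spread)
               for _ in range(n_tasks))

N_TASKS = 100
PLANNED = N_TASKS * 10.0
runs = [simulate_project(N_TASKS) for _ in range(1000)]
mean_total = sum(runs) / len(runs)
worst = max(abs(r - PLANNED) for r in runs)

print(f"planned {PLANNED:.0f} days, simulated mean {mean_total:.1f} days")
print(f"worst deviation across 1000 runs: {worst:.1f} days ({worst / PLANNED:.1%})")
```

Even though any single task here can be 50% off its estimate, the simulated totals stay within a few percent of plan, which is what justifies pulling resources from early tasks to prop up late ones.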

An elementary task (e-task) cannot be objectively (scientifically) ‘justified' or ‘proven' by ordinary deduction.

An e-task is thus a somewhat subjective decision.

An e-task is decided ahead of time and injected into the main project schedule.

An e-task should not be over 2 weeks long.

An e-task is part of a larger inductive process 1 and the very core of the project schedule and its iterative estimates (see Bayes' Update Theorem).

An e-task is axiomatic. E-tasks are axioms, or part of an axiomatic process defined as an inductive process*. An axiom is any starting assumption from which other statements are logically derived. It can be a sentence, a proposition, a statement, or a rule that forms the basis of a formal system. Unlike theorems, axioms cannot be derived by principles of deduction, nor are they demonstrable by formal proofs, simply because they are starting assumptions: there is nothing else they logically follow from (otherwise they would be called theorems). In many contexts, "axiom," "postulate," and "assumption" are used interchangeably.
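The iterative, Bayesian updating of estimates mentioned above can be sketched as follows (the hypotheses, prior, and likelihood values here are purely illustrative, not from the original method):

```python
# Treat an e-task's schedule estimate as a discrete prior over outcomes and
# update it after observing evidence (e.g. how a similar e-task finished).

def bayes_update(prior, likelihood):
    """Posterior(h) is proportional to prior(h) * P(evidence | h)."""
    unnorm = {h: prior[h] * likelihood[h] for h in prior}
    total = sum(unnorm.values())
    return {h: p / total for h, p in unnorm.items()}

# Hypothetical initial degrees of confirmation for one e-task:
prior = {"early": 0.2, "on_time": 0.6, "late": 0.2}
# Hypothetical likelihood of the observed evidence (a similar task slipped)
# under each hypothesis:
likelihood = {"early": 0.1, "on_time": 0.3, "late": 0.8}

posterior = bayes_update(prior, likelihood)
print(posterior)  # probability mass shifts toward "late"
```

Each completed e-task supplies new evidence, so the schedule estimate is revised iteratively rather than fixed once and for all.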

How to define an e-task (Borrowing from Descartes)

  1. Accept as true only what is indubitable.
  2. Divide every question into manageable parts.
  3. Begin with the simplest issues and ascend to the more complex.
  4. Review frequently enough to retain the whole argument at once.

Project methodology

•  all complex elements are deemed to be aggregates of simpler ones

NOTE: Complexity theory, or complex systems with infinite variables (think chaos theory, traffic, business inner workings, software development…), would totally disagree with the first point. Although this discussion is beyond the scope of this document, suffice it to say that we consider complexity theory an ill-formed system with no coherent definition, no scope, and no research boundaries. Moreover, it seems to rely on two concepts we cannot accept 2:

A) The irreducibility of a complex system

B) The notion of randomness, chance or chaos

The two (A and B) are always defined as unknowns, yet always explained.

Randomness may just be a state of limited understanding, and not exist per se.

See L. Wittgenstein, Tractatus Logico-Philosophicus.


•  all aggregates (complex elements) are deemed finite

•  All complex elements have a beginning (The question: why?) and an end (The answer: This!)

•  One aggregate at a time will be decomposed, step by step, into a series of simpler elements, until it makes no more common sense to break it down any further.

•  Common sense is the ability of all trained minds to “sense” the beginning and the end of a proposition. This sense is converted into a certitude; it becomes a mathematically true statement (indubitable).

•  Break down each aggregate until a set of simple “elements” can re-create the previous aggregate (this will be proof that you have defined the issue correctly): regression to the root cause.

•  Organize elements into pre-low-level specification(s)

•  Organize element(s) into possible task assignments, to self or to a third party

•  Organize element(s) into time frames

•  Look over all aggregates and make sure they are all fully distilled

•  Review once more to see if you can distill further

•  This will reveal how many man-hours you need, and the percentage of the work to be done by each person.
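The decomposition loop above can be sketched as follows (task names and durations are hypothetical): aggregates are split recursively until every leaf is an e-task of at most two weeks, at which point assignments and durations can be read off the leaves.

```python
from dataclasses import dataclass, field

MAX_ETASK_DAYS = 10  # "an e-task should not be over 2 weeks long"

@dataclass
class Task:
    name: str
    days: float = 0.0               # estimated duration (leaves only)
    subtasks: list = field(default_factory=list)

def etasks(task):
    """Yield the elementary tasks of a (possibly aggregate) task."""
    if task.subtasks:                 # aggregate: recurse into its parts
        for sub in task.subtasks:
            yield from etasks(sub)
    elif task.days > MAX_ETASK_DAYS:  # leaf still too big: not yet elementary
        raise ValueError(f"{task.name}: still an aggregate, break it down")
    else:
        yield task

# Hypothetical aggregate, already distilled into e-tasks:
project = Task("engine", subtasks=[
    Task("parser", subtasks=[Task("lexer", 5), Task("grammar", 8)]),
    Task("storage", 9),
])

leaves = list(etasks(project))
print([t.name for t in leaves])             # ['lexer', 'grammar', 'storage']
print(sum(t.days for t in leaves), "days")  # 22 days
```

A leaf that still exceeds the two-week limit fails loudly, which matches the rule that the breakdown continues until no further common-sense split is possible.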


Example: Project Phase: “High Level Specification / Code Design Phase Start”

This method will allow me to schedule:

•  Senior Systems Architect - Programmer 1

•  (option one) Depending on the resulting task processed by the e-task method, he will be allocated to two parallel tasks:

•  Supervise, document, and review (with 1 additional Systems Architect) high-level specifications, and write guidelines for low-level coding (C#/ASP.net), for 1½ months at 40% of his time

•  Program 60% of his time in VXML and WSDL (web service interface contract)

•  (option two) His time, depending on the result processed by the method, will be divided into:

•  25 percent allocated to coding and 75 percent working alongside lower level junior ASP.NET/C# programmers on Web / Interface design
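A trivial sanity check on such splits (the hours-per-month figure is an assumption): a resource's percentages across parallel tasks should account for their full time, and the percentages convert directly into man-hours.

```python
HOURS_PER_MONTH = 160  # assumed working hours per month

def task_hours(fraction_of_time, months):
    """Hours a resource spends on one parallel task."""
    return fraction_of_time * months * HOURS_PER_MONTH

# Option one for the Senior Systems Architect above (1.5 months):
option_one = {
    "high-level specs, review, guidelines": task_hours(0.40, 1.5),
    "VXML / WSDL programming":              task_hours(0.60, 1.5),
}

# The split must cover the resource's full time over the period.
assert abs(sum(option_one.values()) - 1.5 * HOURS_PER_MONTH) < 1e-6

print(option_one)
```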

Exceptions: Architectural Backbone – led by Critical Path

•  Programmer 2 – Senior Architect, C, C++ Programmer

•  80% for 1.8 months on the ‘Pre-fetch engine' task, or what we would call the main architectural process or ‘core mechanism' of the project's architecture; this is a situation to avoid at all costs. Modularity and reconfiguration are keys to success.

•  NOTE: Any low-level process that supports the software/hardware architecture and that cannot or will not be shared or broken down into multiple e-tasks (that is, if no other solution can be found, and provided the e-task breakdown method has been applied to this aggregate and has proven that the task cannot be partially de-allocated) is eventually going to be a death sentence. Down the line, as upgrades, patches, and customizations are needed or requested, they will make this indivisible task grow exponentially into the realm of impossibility.

•  The other 10 % would go to project maintenance, architecture review and tune-up and code review.



•  If we go through high-level specifications in a methodical way, as described in this document, by disassembling data aggregates into ALL of their parts, and

•  if we break all complex programming units into many simpler algorithms and functions,

•  then we can do something that will save a great amount of time: we can derive from the high-level specs most of the data needed for the low-level specifications, and

•  create task assignments and approximate task durations at the same time:

We can postulate the following: the overall cost (without contingency planning) of an e-task is thus:

et = t(tc) × r(nc), where t(tc) is the e-task's estimated time and r(nc) is the variable cost per hour (nc) of a resource, the programmer (r), nc being based on seniority level.

•  Cost of sub-project (phase 1):

P(1) = Σ (r(nc) × t(x)), where the sum runs over all tasks (x) in Project Phase 1

→ The sum of all contingencies should be accounted for, evaluated in the range of 35% to 50%, and multiplied into the existing cost analysis.
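The two cost formulas and the contingency band can be sketched numerically (the rates and durations below are hypothetical, chosen only to show the arithmetic):

```python
HOURS_PER_DAY = 8

def etask_cost(days, hourly_rate):
    """et = t(tc) x r(nc): task time multiplied by the resource's hourly cost."""
    return days * HOURS_PER_DAY * hourly_rate

# Hypothetical Phase 1 e-tasks as (days, hourly rate by seniority level):
phase1 = [
    (5, 90.0),   # senior systems architect
    (8, 60.0),   # mid-level programmer
    (9, 45.0),   # junior programmer
]

p1 = sum(etask_cost(days, rate) for days, rate in phase1)   # P(1)
low, high = p1 * 1.35, p1 * 1.50                            # 35%-50% contingency
print(f"P(1) = {p1:.0f}; with contingency: {low:.0f} to {high:.0f}")
```

The base phase cost is just the sum of its e-task costs; the contingency multiplier is applied once, on top of the summed estimate, rather than padding each task individually.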



* Brief definition of induction: Probable reasoning whose conclusion goes beyond what is formally contained in its premises.

1 - This process is a subset of a methodology using conditional logic and Markov models: quantifications and attributions of events are conditional probabilities based on degrees of confirmation, or a Criterion of Adequacy (CoA): degrees of confirmation of an unknown event B inferred from a known event A (Prb(B|A)). (See paper: ‘Probability Factors, the Future and Predictions'.)

2 - See paper on Complexity Theory: ‘Complexity Theory or the Theory of the Void'.