Complexity

Complexity, like all other human notions, blends to varying degrees a subjective (extrinsic, context-dependent) interpretation with an objective (intrinsic, Platonic) description of some system. However, because understanding a system of interest requires interacting with it, the subjective component is especially strong in this concept. (Perhaps another manifestation of Schrödinger's cat? :) But see also the ensemble interpretation of quantum mechanics: are you a Bayesian, a frequentist, or some happy mix?)

In fact, for some, such as Rosen (1977) and von Foerster (1984), complexity is NOT an intrinsic property of systems at all. Rather, they consider it a completely subjective, context- and language-dependent property. Take, for example, the complexity of the brain from the perspectives of a neuroscientist and a five-year-old.

The interaction of the observer with the observed is generally considered anathema to respectable (objective) scientists practicing the venerable scientific method. Thus, even though a rather formidable Schrödinger's cat lurks in the study of complexity, it tends to be politely ignored by most respectable scientists.

For those who ignore this rather significant issue, complexity is studied in terms of the structural and behavioral/functional components of systems. The primary statistics used include the number of components or functions, the number of interactions (connectivity), and so on. Often these numbers are scaled in some manner, for example by following an information-theoretic approach. While this approach has been somewhat useful, it leaves the subjectivists' concerns completely unaddressed. A rough sketch of such a measurement follows below.
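As a minimal illustrative sketch of this structural approach (the toy system, its component and interaction lists, and the choice of degree-distribution entropy as the information-theoretic scaling are all my own illustrative assumptions, not a standard from the literature), one might simply count parts and connections and then summarize how evenly the connectivity is spread:

```python
import math
from collections import Counter

# A toy "system" described purely structurally: components (nodes)
# and pairwise interactions (edges). Entirely hypothetical data.
components = ["A", "B", "C", "D", "E"]
interactions = [("A", "B"), ("A", "C"), ("B", "C"), ("C", "D"), ("D", "E")]

# Crude structural statistics: how many parts, how many connections.
n_components = len(components)
n_interactions = len(interactions)

# One possible information-theoretic scaling: Shannon entropy (in bits)
# of the degree distribution, i.e. how evenly connectivity is shared
# among the components.
degree = Counter()
for a, b in interactions:
    degree[a] += 1
    degree[b] += 1

total = sum(degree.values())
entropy = -sum((d / total) * math.log2(d / total) for d in degree.values())

print(f"components:   {n_components}")
print(f"interactions: {n_interactions}")
print(f"degree-distribution entropy: {entropy:.3f} bits")
```

Such numbers say something about the description of the system, but nothing about who is doing the describing, which is exactly the subjectivists' complaint.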

Perhaps the most general and promising approach I am aware of, one that seems to offer a solution potentially acceptable to both camps, is the algorithmic or computational complexity of a system (e.g., see von Foerster 1984). That is, what is the shortest program that can describe a given system (in the context of a Turing machine)? While the length of that program can change dramatically depending on how clever the programmer is (the context dependence), there may be an asymptotic levelling off of what "cleverness" can achieve, and so a convergence toward some minimal program length may be expected. And of course, by applying Occam's razor, the convergent solution may be taken as a heuristic but nearly objective measure of system complexity.
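A crude but common way to make this idea operational (again a sketch of my own, not a method from Rosen or von Foerster; the two example "descriptions" are invented) is to treat the length of a losslessly compressed description as an upper bound on the shortest program: a better compressor plays the role of the cleverer programmer, and further cleverness can only tighten the bound.

```python
import bz2
import random
import zlib

# Two hypothetical "system descriptions" of equal length: one highly
# regular, one pseudo-random. The true shortest-program length
# (Kolmogorov complexity) is uncomputable, but any lossless compressor
# yields an upper bound on it.
regular = ("AB" * 5000).encode()
random.seed(0)
noisy = bytes(random.randrange(256) for _ in range(10_000))

for name, data in [("regular", regular), ("noisy", noisy)]:
    # Take the shortest of several compressed forms -- a stand-in for
    # "programmer cleverness"; trying harder can only lower the bound.
    bound = min(len(zlib.compress(data, 9)), len(bz2.compress(data, 9)))
    print(f"{name}: {len(data)} raw bytes -> upper bound {bound} bytes")
```

The regular description compresses to a tiny fraction of its raw size while the noisy one barely compresses at all, which is the intuition behind calling the latter more complex in the algorithmic sense.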

The next question is how such measures of complexity evolve and change under the influence of time and perturbations. Sound familiar?

[Obviously, this page is not complete and likely will never be considered complete, but I will add more as time permits.]

References: 

von Foerster, H. 1984. Disorder/Order: Discovery or Invention. In: P. Livingston (ed.), Disorder and Order: Proceedings of the Stanford International Symposium. Anma Libri, Saratoga, pp. 177-189.