System Concepts

The justification of all mathematical models is that, oversimplified, unrealistic, and even false as they may be in some respect, they force analysts to confront possibilities that would not have occurred to them otherwise.
— Sylvia Nasar, A Beautiful Mind

[R]ational thought imposes a limit on a person's concept of his relation to the cosmos.
— John F. Nash, Jr., Les Prix Nobel 1994

The value of any formal model, including game theory, is that you can see some of the principles that are operating more clearly than you could without the model. But it's only some of the principles. You have to leave off a lot of things, some of which are bound to be important.
— Robert Axelrod, quoted by William Poundstone in Prisoner's Dilemma

The purpose of computation is insight, not numbers.
— Richard W. Hamming, Numerical Methods for Scientists and Engineers

Mathematical reasoning may be regarded rather schematically as the exercise of a combination of two faculties, which we may call intuition and ingenuity. [...] The activity of the intuition consists in making spontaneous judgments which are not the result of conscious trains of reasoning.
— A.M. Turing, quoted by Andrew Hodges in Alan Turing: The Enigma

Systems Theory
Modeling is the art of representing certain salient properties of a system by means of a formal construct. A system is a set of interrelated elements organized in such a way that they collectively exhibit structural order, functional purpose or some apparently meaningful behavior to an observer. The determination of what constitutes a system is entirely subjective and oftentimes relative. There exist no systems “out there” in the “real world.” The world actually consists of physical objects and living entities which engage in events brought about by natural forces (often harnessed by humans and other life forms). A system is conceived by intellection. It is strictly an idea: a conceptualization that is dependent on the perspective assumed by an observer.

Systems are said to be embedded in environments. An environment is the overall context in which a given system operates. Thus, environment is also a relative concept. Relative here means that both system and environment are subjective categorizations whose relevance or suitability for an intended purpose is determined by the observer. It follows that the boundary separating system from environment is also subjective, that is to say, defined by the observer. Boundaries can be physical or purely conceptual.

A given system and its immediate environment can often be conceived of as a higher-level system which is itself embedded in a more general environment. This illustrates the hierarchical nature of systems thinking. A system can also be decomposed into two or more lower-level subsystems by defining boundaries between different groups of tightly connected components. Typically, the criteria used for inferring connections are structure, the arrangement of the parts that make up the whole, and function, the characteristic action performed by the entity under study. Structure usually implies spatial contiguity, while function is associated with temporal relatedness, which is usually synchronous.
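One way to picture this hierarchy is as a tree of subsystems. The short Python sketch below is purely illustrative; the component names and the particular decomposition are hypothetical choices made by the observer, not anything prescribed by systems theory.

# A minimal sketch of hierarchical decomposition: a system is a named node
# whose subsystems are themselves systems. The names and grouping are
# hypothetical; any decomposition reflects the observer's chosen boundaries.

class System:
    def __init__(self, name, subsystems=None):
        self.name = name
        self.subsystems = subsystems or []

    def describe(self, depth=0):
        # Print the hierarchy, one level of indentation per tier.
        print("  " * depth + self.name)
        for sub in self.subsystems:
            sub.describe(depth + 1)

factory = System("Factory", [
    System("Assembly line", [System("Station A"), System("Station B")]),
    System("Quality control"),
])
factory.describe()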

Systems can be classified as either open or closed. An open system is an entity that exchanges materials, energy or information with its environment or other closely coupled systems. A closed system is sealed from its environment; its boundary is impermeable. Since closed systems cannot replenish resources, they eventually become disorganized and, because of increasing entropy, collapse. Open systems have the potential to prolong their existence in a stable environment. To survive in a dynamic environment, however, a system must not only be open but also be capable of adapting to change. Adaptation, the process of adjusting to changing environmental conditions, implies a capacity for self-regulation. Self-regulation, a form of control, relies on feedback: the furnishing of information regarding system output in order to maintain system behavior within operational parameters. Basically, feedback control consists of comparing system output with performance standards in order to adjust system inputs or internal operations so as to attain the desired level of performance.
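The comparison-and-adjustment loop described above can be sketched in a few lines of Python. All numbers here (the performance standard, the controller gain, the toy internal process) are hypothetical, chosen only to show the structure of feedback control.

# Minimal feedback-control sketch (hypothetical numbers): compare output to a
# performance standard and adjust the input in proportion to the shortfall.

target_output = 100.0   # performance standard
input_level = 50.0      # initial system input
gain = 0.5              # how strongly the controller reacts to the gap

def system(input_level):
    # Toy internal process: output is a fixed multiple of the input.
    return 1.6 * input_level

for step in range(10):
    output = system(input_level)
    error = target_output - output   # feedback: measured gap from the standard
    input_level += gain * error      # corrective adjustment to the input
    print(f"step {step}: output = {output:.1f}")

In this toy setting the adjustment drives the output toward the standard within a few steps; in a real system the measurement, the adjustment rule and the process itself would all be more involved.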

Feedback takes two basic forms. Positive feedback reinforces system behavior by amplifying output and leads to either exponential growth or decay, depending on the output trend. If left unchecked, positive feedback will eventually bring any system to collapse, because unlimited growth is impossible in any environment with finite resources, while incessant decay leads to system extinction. Negative feedback allows for self-correcting system behavior by regulating system inflows and/or internal processes (resource transformations) if and when outflows exceed given thresholds. When properly implemented, negative feedback checks erratic behavior and returns the system to a desired steady state. Thus negative feedback can lead to homeostasis: a state of dynamic equilibrium achieved through internally generated operational stability. Self-regulation implies negative feedback.
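A toy recurrence makes the contrast concrete. In the Python sketch below, the growth factor, the correction coefficient and the set point are all invented for illustration: the first quantity is reinforced each step (positive feedback), while the second is nudged back toward a set point (negative feedback).

# Toy illustration of the two feedback forms (hypothetical coefficients).

level_pos = 1.0    # level under positive feedback
level_neg = 1.0    # level under negative feedback
set_point = 10.0   # desired steady state for the regulated system

for step in range(20):
    level_pos *= 1.2                             # positive feedback: reinforce the trend
    level_neg += 0.3 * (set_point - level_neg)   # negative feedback: correct toward the set point

print(f"positive feedback after 20 steps: {level_pos:.1f}")   # keeps growing without bound
print(f"negative feedback after 20 steps: {level_neg:.1f}")   # settles near the set point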

Systems Modeling
Modeling consists of three steps: abstraction, idealization and formalization. Abstraction refers to the mental process of selecting key elements (and disregarding others) from the perceived whole. The chosen elements are presumed to be significant constituents of the system being modeled, that is, critical factors responsible for system behavior. Any such factor can, in principle, be conceptualized as a subsystem itself. Abstraction is required because a model is a simplified representation of a system. Complex systems are difficult to analyze and manipulate. Model simplification affords analytical and computational tractability. This implies that model results must be interpreted as approximations to actual system behavior. In general, models do not produce hard-and-fast, real-world solutions because they only portray systems abstractly — they ignore some factors which may well affect system behavior in the real world.

Idealization, the second step in the modeling process, consists of attributing ideal status to the abstracted elements. An ideal is a conception embodying perfection. Consider, for example, an idealization of a production system. Suppose that a certain assembly line produces an average of ten units per hour of operation. Clearly, one expects actual production to be somewhere near ten units per hour under normal conditions. But output can differ if actual conditions are at variance with the norm: an interruption in the supply of inputs, say, could bring production down to zero momentarily. Idealization would attribute to the process a constant production rate of exactly ten units per hour of operation, even though this assumption may not always hold in the real world. Idealization further simplifies the model, thus enhancing its tractability.
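The Python sketch below contrasts the idealized model with a simulated run subject to interruptions. The ten-units-per-hour rate comes from the example above; the interruption probability and the output variability are invented purely for illustration.

# Idealized vs. simulated production over one shift (hypothetical disruption pattern).

import random

random.seed(1)      # fixed seed so the illustration is reproducible

HOURS = 8
IDEAL_RATE = 10     # idealized constant output, units per hour

actual_total = 0.0
for hour in range(HOURS):
    if random.random() < 0.1:                      # assumed 10% chance of a supply interruption
        produced = 0.0                             # production drops to zero for that hour
    else:
        produced = random.gauss(IDEAL_RATE, 1.5)   # output fluctuates around the average
    actual_total += max(0.0, produced)

ideal_total = IDEAL_RATE * HOURS
print(f"idealized model: {ideal_total} units in {HOURS} hours")
print(f"simulated run: {actual_total:.1f} units in {HOURS} hours")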

Formalization is the final step in the modeling process. To formalize means to give a definite form to something, to concretize a conception. (In the case of mathematical models, this concretization is done with symbols.) Formalization objectifies the system elements that were abstracted and idealized, making them amenable to manipulation. Managerial decisions often lend themselves to mathematical formulation since many of the variables under consideration can be readily quantified. The usefulness of quantification lies in the fact that, when properly done, the power of mathematical analysis can be brought to bear on the problem. That is the hallmark of science. And that is what makes formal modeling valuable to managers: it provides a means to check their intuition and personal judgment against the rigorous logic of mathematics.
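To continue the production example in formal terms (an illustrative formulation, not drawn from the text above): letting r denote the production rate and T the hours of operation, cumulative output is Q(T) = r × T. With the idealized rate r = 10 units per hour, an eight-hour run yields Q(8) = 10 × 8 = 80 units, a figure that actual production will only approximate.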

Systems Analysis and Design
Systems analysis is a methodical approach to problem solving in general, although it has been primarily employed in engineering and the management of technology. Analysis proceeds by decomposing a system into its constituent parts in order to determine the functional relationships that enable the system to perform as an integrated whole. Systems design (synthesis) avails itself of the knowledge obtained by analysis to rationally devise, develop and implement (synthesize) improved operational systems. Given that models are idealized representations of real-world systems, sensitivity analysis plays a crucial role in all systems analysis and design endeavors. Sensitivity analysis is conducted by varying model parameters (quantities assigned a constant value in the model) within their range of possible values to assess their impact on modeled system behavior. This affords greater insight into actual system behavior and increases confidence in the model and its results.
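As a minimal sketch of the procedure, the Python snippet below sweeps one parameter of a toy production model across an assumed range and reports the effect on modeled output. The model, the parameter values and the downtime assumption are all hypothetical.

# Sensitivity analysis sketch: sweep one parameter of a toy model across its
# assumed range and observe the effect on the modeled output.

def modeled_output(production_rate, hours=8, downtime_fraction=0.1):
    # Toy model: effective output discounts scheduled hours by expected downtime.
    return production_rate * hours * (1 - downtime_fraction)

baseline = modeled_output(10)
for rate in (8, 9, 10, 11, 12):   # assumed plausible range for the rate parameter
    output = modeled_output(rate)
    change = 100 * (output - baseline) / baseline
    print(f"rate = {rate:2d} units/hour -> output = {output:5.1f} units ({change:+.1f}% vs. baseline)")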

Terms
Abstraction – mental act of extracting a general concept apart from its particular instances

Adaptation – the process of adjusting to changing environmental conditions

Analysis – the separation of a whole into its constituent parts in order to study them individually

Boundary – the border established between a system and its environment by an observer

Closed System – an entity isolated from its environment

Control – the process of maintaining system behavior within established bounds

Entropy – degree of disorder or randomness in a system; a measure of the uncertainty associated with a random variable; a measure of the amount of energy in a physical system unavailable to do work

Environment – the overall context in which a system operates

Feedback – collected information about system output used to maintain system behavior within operational bounds

Feedback Control – the comparing of system output to performance standards so as to regulate subsequent output

Formalization – the act of giving definite form to a concept by means of symbols

Function – action or pattern of behavior that characterizes a given entity

Homeostasis – the maintenance of internal stability by a system in response to external disturbances; the state of a system when in (or tending to maintain) equilibrium

Idealization – mental act of endowing an abstraction with a set of standard properties

Model – an idealized representation of a system in simplified form

Modeling – the process of representing a system by a formal construct

Negative Feedback – the use of output data to regulate system inputs and internal operations

Open System – an entity that exchanges materials, energy or information with its environment

Parameter – a measurable factor that influences system behavior and is modeled as a numerical constant

Positive Feedback – the use of output data to amplify the current trend of system output

Self-regulation – the capacity of a system to effectively govern its own internal environment

Sensitivity Analysis – systematic examination of model behavior based on varying key parameters

Structure – the arrangement of the components that make up a whole

Synthesis – the combination of parts into a complex whole; the opposite of analysis

System – a set of interrelated elements that exhibit order, purpose or meaningful behavior to an observer

Systems Analysis – the process of decomposing a system into its constituent parts in order to determine the functional relationships that enable the system to perform as an integrated whole

Systems Design – the process of rationally devising, developing and implementing a new system

Reference: Daniel Katz and Robert L. Kahn, “Organizations and the System Concept,” in The Social Psychology of Organizations (John Wiley and Sons, 1966, 1978, 1990). Highly recommended: this classic is arguably the best chapter-length introduction to system concepts available. Reprinted in Shafritz, Ott & Jang (eds.), Classics of Organization Theory (Wadsworth Publishing).

Online Text

Emergence: Complexity and Organization (Special Issue)
Systems Theory
What Is Cybernetics?

System
Systems Analysis
Cybernetics
Cybernetics & Systems Science
Control
Entropy & Information
Entropy & Thermodynamics
Feedback
Homeostasis
Self-organization
Complex Adaptive Systems
Online Library
Dictionary of Cybernetics & Systems

Model
System
Systems Analysis
Systems Science
Systems Engineering
Systems Theory
Systems Thinking
Systems Philosophy
Complex System
Complex Systems
Mathematical Model
Norbert Wiener
Systems Science Portal

Systems, Models & Methods
A search engine for terms & definitions

Publications
Recent Publications

University of Michigan
Center for the Study of Complex Systems

Links to Websites

Complexity & Emergence

Axelrod & Tesfatsion:
Online Guide to Agent-Based Modeling

Axelrod:
Complexity of Cooperation Website
Evolution of Cooperation Website
Annotated Bibliography on The Evolution of Cooperation

Hoffman:
Twenty Years On: The Evolution of Cooperation Revisited

Tesfatsion:
Agent-Based Computational Economics

Dolan: Artificial Life Database
Resources on AL, Cellular Automata, Genetic Algorithms, Neural Nets, Complexity, Bots ...

Pangaro's Guide to Cybernetics

Tufillaro: An Experimental Approach to Nonlinear Dynamics and Chaos
Systems Glossary

Vidal: Multiagent Systems

What Is a System?

Callahan: What Is the Game of Life?

FML: Systems Thinking