My opinion

This article was rejected by ITCon in 2008, presumably because the editor did not understand it, or because the congregation should not be disturbed with unfamiliar thoughts.

BIM - Building information mess: past, present and future

REVISED: February 27, 2009.

Tor G. Syvertsen, Professor

Department of Structural Engineering and Studio Apertura, Norwegian University of Science and Technology

SUMMARY: The history of human construction demonstrates that impressive works have been carried out without any formal information at all. The age of information and management started a few centuries ago, but gained speed with the introduction of computers in the mid-20th century. Since then, information has been produced at virtually no expense, and an apparent urge for control has risen to an unprecedented level of detail and fragmentation. Any attempt to cope with the confused situation is doomed to fail unless one challenges the roots of the problem; the prevailing complexity, fear, distrust and incompetence have to be replaced by trust and competent decisions. The BuildingSMART initiative is destined to break down due to the lack of a firm foundation. Recent advances in information and communication technologies bring promise of a prosperous future if one dares to let go of rigid information and control structures.

KEYWORDS: Information overload, Digital communication, Self-organizing, Superdistribution.

1. Introduction

1.1 A magnificent construction history

From early times, magnificent construction works have been undertaken with very little or no explicit information. The Seven Wonders of the Ancient World were all constructions made by men who were illiterate. The only one still standing is the Great Pyramid of Giza from around 2500 BC. The associated information and information work was unquestionably a tiny fraction of the total effort. A cathedral, however, has roughly one millionth the mass of a pyramid. The difference was the arch. Architecture demands arches. Does building arches demand information? Does information demand knowledge?

St. Peter's Basilica in the Vatican City was consecrated in 1626. The work had been supervised by several architects ("Capomaestros"), among them Bramante, Raphael, Sangallo, Michelangelo, Maderno and Bernini, and the construction period spanned the reigns of a dozen popes from Julius II to Paul V.

Recently, a fragment of a red chalk drawing of a section of the dome of Saint Peter's, almost certainly by the hand of Michelangelo, was discovered in the Vatican archives. The drawing shows a small, precisely drafted section of the plan of the entablature above two of the radial columns of the cupola drum (Wikipedia). Otherwise very little formal information has been found. Nevertheless, building information models did exist; mainly in the mind of the master builder and partly distributed in the minds of the craftsmen. They took pride in doing an excellent job, and made no fuss about information.

I suspect that the "modern" construction industry would be incapable of erecting such a building, and even if the construction were feasible, the authorities would definitely not grant a building permit.

When Michelangelo, aged 72, was called by Pope Paul III to take the position as Capomaestro after Sangallo the Younger, he had no formal qualifications. However, the pope knew, based on his reputation, that he was the best man for the mission. And Michelangelo proved it by designing the magnificent dome with an internal diameter of 41.47 metres rising to a height of 136.57 metres from the floor of the basilica to the top of the external cross. It is the tallest dome in the world. The dome has inspired many other great buildings (for instance St. Paul's Cathedral in London and the US Capitol in Washington DC), but none of the imitations excels the original. Unfortunately, Michelangelo died in 1564, before he could see the dome completed in 1590.

1.2 Computers

When computers arrived in the mid-20th century they were designed for scientific purposes. Within a few decades, administrative calculations and technical computations were performed regularly as well. Interactive computer graphics was introduced by Ivan Sutherland's Sketchpad in 1963 (Sutherland 1963), and the computer gradually also became a drawing machine.

The microprocessor and the Personal Computer mainly turned the computer into a typewriter, even if people stopped writing and started "word-processing" instead. The Personal Computer was in fact conceived as early as 1968 by Alan Kay, whose ideas were later developed at Xerox PARC. Powerful information tools in the hands of everyone have led to an explosion of all sorts of information ("documentation"). Since the main purpose of information is communication, communication needs and the corresponding problems have been amplified as well.

1.3 Complexity

Many observers claim that the building process has become exceedingly complex, and this is often the argument for more planning and more documentation. From my viewpoint, building has not become complex; it has been made unnecessarily complicated. The use of computers has to some extent enabled the management of complexity, but it has at the same time multiplied the problems.

To me it seems that this complication is basically due to a fragmentation of tasks and removal of responsibilities and decisions from the workplace, in time and space. Decisions are made years ahead and hundreds of kilometres away from where the actual problems arise and where the capabilities should be placed to resolve the difficulties. The need for formal information and communication increases as the responsibility is moved from the formal decision makers to some subordinates in charge when and where (not if) something goes really wrong. The construction industry is in a state of disorder, distrust and misunderstandings. Expert proficiency has been displaced by legal distrust and bureaucratic formalities.

Several attempts have been made to make the communication effective and efficient, but except for the internet and the World Wide Web, none has so far succeeded to any noticeable extent.

2. BuildingSMART

The BuildingSMART-initiative promises everything: “Proactively facilitate with key leaders the active use and promulgation of open data standards enabling civil infrastructure and building asset data and life-cycle processes to be seamlessly integrated, improving the value achieved from investments in the built environment and enhancing opportunities for growth”. The mission is almost as broad as the vision: “Realization of the full societal, environmental and economic benefits of open sharable civil infrastructure and building asset information into commercial and institutional processes worldwide”. (BuildingSMART)

In my humble understanding, the main shortcomings of BuildingSMART are:

  • No appropriate problem is addressed. If the problem is "too many data formats", the creation of yet another "format" will just increase the problem. We recently learned that Autodesk and Bentley Systems will make their file formats compatible, hence reducing the "problem" themselves (AEC Magazine 2008).

  • Lack of a precise and consistent terminology. IFC, for example, is called a "format" by most, even though IFC is an acronym for Industry Foundation Classes.

  • Lack of basic principles. As a substitute, torrents of gobbledygook are invented whenever the original approach breaks down, and a new TLA (Three-Letter Abbreviation) gives short-term relief like any snake oil.

  • Obsolete technology. BuildingSMART is based on technology developed in the 1970s or even the 1960s. Conceptually stuck in file transfer via portable media like floppy disks or dial-up modem connections from the pre-object and pre-internet era, its chances of success in the 21st century are diminishing.

  • Detailing level. The scheme of an extremely detailed information model is as absurd as the idea of a map at the scale 1:1. Such a model is useless for any purpose, except for the model makers, who can earn a good living by making tons of useless stuff.

  • Proprietary. The BuildingSMART standards are owned by ISO, which needs revenues from selling standards. Expensive as they are, you do not even get running software, only specifications of rigid data structures.

  • Rigid. In a fast-changing world where innovations flourish, the accompanying information needs to be adaptable. The cycle time of complicated "voluntary" ISO standards is a decade or more.

At best, the BuildingSMART initiative can be regarded as yet another confirmation of Kranzberg's second law of technology: "Invention is the mother of necessity". Or in other words: BuildingSMART is suffering from the SSP syndrome (Solution Seeking Problem).

I would suggest shifting the focus from "transfer of data between software programs" to "sharing of information between persons". The latter includes sharing of meaning, commitment, confidence and trust between cooperating people. This is definitely not a purely technical problem, and it has no solution that can be formalized or standardized by regulations or management.

3. Coordination

3.1 Working Together

Working together, combining knowledge and transfer of experiences requires coordination. Following Malone (Malone & Crowston 1992) we can define this broadly as “the act of working together”.

If we look more closely into the concept, we can identify the components of coordination: goals, actors, resources and activities. The goals are something to be achieved.

In economic and technical reasoning, organizational goals are usually taken for granted. Empirical studies have demonstrated that this assumption is more often than not a dubious one. The formulation and selection of goals is regularly a question of internal struggles, contesting interests and different perceptions of reality. The activities are the actions necessary to perform the different sub-tasks of the goal. The actors are the participants, within as well as outside of the organisation, whose efforts have to be combined.

Actors and activities are linked by interdependencies.

A narrower definition based on this decomposition may be: "Coordination is the act of managing interdependencies between activities". (ibid.)

Compared to the traditional view, attention has been shifted from the components to the relationships between them, see Figure 1. This represents a more fundamental change than may be apparent at first glance; the dynamics of a system are determined by the relationships between the components of the system, not by the components themselves. (Schiefloe and Syvertsen 1993).

Figure 1: Simplified model of coordination

The essential requirement for smooth coordination is a shared understanding of the goal(s) or objective(s) of the endeavour. Risks should be included as well. Otherwise one could end up like the U.S. financial institutions recently did. Joseph J. Cassano, a former A.I.G. executive, declared in August 2007: "It is hard for us, without being flippant, to even see a scenario within any kind of realm of reason that would see us losing one dollar in any of those transactions." (New York Times 2008). One year later, a multibillion-dollar bailout from the federal government was required to keep A.I.G. barely alive.

A truly profound understanding may be obtained through a hard work-out following a simple procedure as outlined in Figure 2. The goal-setting work requires deep thinking over time and thorough discussion between the people involved. As Henry Ford pointed out: "Thinking is the hardest job in the world, which is why so few people engage in it".

Figure 2: Foundation of Coordination

The procedure of Figure 2 represents an extremely simple quality assurance system. It is easy to understand and to remember: five questions starting with the letter W and five corresponding answers starting with the letter P. There is no need for a full dozen manuals of "Quality Assurance".

What is required is only hard thinking together and time to achieve a profound consensus on vocabulary, criteria, options and opinions. The key question is "Why?". This question can be asked over and over again of all the answers, for example: Why should we do this in that way? After some days or weeks of hard thinking, a profound and shared understanding will emerge.

As a digression, I can offer an anecdote about the simplest and probably best engineering quality system ever:

The Russian Czar Alexander III Alexandrovich (1845-1894) demanded for the construction of railway bridges that "The designer shall sit in front of the locomotive during the first trial run of any railway bridge". Most of the bridges are still in operation, for example the bridge on the Trans-Siberian Railroad crossing the river Ob near Novosibirsk (then Novonikolayevsk). This simple statement was superior to any fifty-volume "quality assurance", and history demonstrates that it worked well. Documentation and rigorous control were not necessary because engineering proficiency, trust and responsibility were in charge.

The smoothest and best coordination is the kind that takes place as an integral part of the working process. This kind of organic or immanent coordination is invisible, and the interdependencies do not have to be "managed". The need for external direction arises only when the organic coordination for some reason breaks down. Cases of this can, for example, be studied in games of soccer or ice hockey; the coach does not act until something does not work as intended. The role of "coordinator" is simply superfluous. A similar sort of immanent effort should also be aimed at on the subject of quality. Any fairly good professional worker knows what quality is in his trade, and achieves it by simply doing his job.

Clay Shirky (Shirky 2008) illustrates with many examples how “Personal motivation meets collaborative production”. Based on trust, the work of many could be smoothly coordinated using coordination technologies, for example the collaborative production of Wikipedia:

"The first wiki was created by Ward Cunningham, a software engineer, in 1995. (The name wiki is taken from the Hawaiian word for "quick.") Cunningham wanted a way for the software community to create a repository of shared design wisdom. He observed that most of the available tools for collaboration were concerned with complex collections of roles and requirements - only designated writers could create text, whereas only editors could publish it, but not until proofreaders had approved it, and so on. Cunningham made a different, and radical, assumption: groups of people who want to collaborate also tend to trust one another. If this was true, then a small group could work on a shared effort without needing formal management or process.

Cunningham’s wiki, the model for all subsequent wikis, is a user-editable website. Every page on a wiki has a button somewhere, usually reading “Edit this,” that lets the reader add, alter or delete the contents of the page.” (Shirky 2008, pp. 111-112).

3.2 Collective Action

"Collective action, where a group acts as a whole, is even more complex than collaborative production, but here again new tools give life to new forms of action. This in turn challenges existing institutions, by eroding the institutional monopoly on large-scale coordination" (ibid. p. 143).

“...social tools don’t create collective action - they merely remove the obstacles to it. Those obstacles have been so significant and pervasive, however, that as they are being removed, the world is becoming a different place” (ibid. p. 189).

The construction industry as well as the standardization bodies should be aware of the trend towards replacing planning with coordination. A viable example is the Linux operating system; Linus's proposal was modest but interesting - a new but small operating system, undertaken as a way of learning together. And that is what happened, not by planning but through the collective action of many people over a long period of time. (ibid.)

Physical buildings have traditionally also been created in a similar manner, for example the voluntary building of a village hall or a barn.

3.3 Autopoietic Enterprise

Autopoiesis literally means "auto (self)-creation" (from the Greek auto, αυτό, for self, and poiesis, ποίησις, for creation or production), and expresses a fundamental dialectic between structure and function. The term was originally introduced by the Chilean biologists Humberto Maturana and Francisco Varela in 1973:

"An autopoietic machine is a machine organized (defined as a unity) as a network of processes of production (transformation and destruction) of components which: (i) through their interactions and transformations continuously regenerate and realize the network of processes (relations) that produced them; and (ii) constitute it (the machine) as a concrete unity in space in which they (the components) exist by specifying the topological domain of its realization as such a network." (Wikipedia 2008).

This definition applies quite well to living information organisms like Wikipedia, but definitely not to rigid, formal, bureaucratic organizations like BuildingSMART or ISO.

Limone and Bastias (Limone and Bastias 2006) start out with categories of cognitive systems:

  • Cognitivism: ‘Processing of information: manipulation of symbols based on a set of norms and rules’.

  • Connectionism: ‘The emergence of global states in a network of simple components’.

  • Enactivism: ‘Effective action in a domain: a history of structural coupling that generates a world’.

The functioning of these three categories differs slightly:

  • Cognitivism: Through any mechanism that can sustain and manipulate discrete physical elements: symbols. The system interacts only with the form of the symbols (their physical attributes), not with their meanings.

  • Connectionism: Through local norms for individual functioning and norms for changes in connectivity between elements.

  • Enactivism: Through a network of interconnected elements capable of structural changes that experience an uninterrupted history.

Limone and Bastias provide criteria for proper functioning of the categories of cognitive systems (validation):

  • Cognitivism: When the symbols appropriately represent some aspect of the ‘real world’ and the processing of information leads to successfully solving problems that confront the system.

  • Connectionism: When one can see that the emerging properties (and the resulting structure) correspond to a specific cognitive capacity: success in solving a required task.

  • Enactivism: When it becomes part of a world of continuous and existing meaning (in ontogeny) or forms a new meaning (in phylogeny).

The autopoietic organization aims at the state of enactivism, as does any living organism, because autopoiesis is the only form of life as far as we know.

Fuchs states that: “Emergence and self-organization are two particularly important concepts of the sciences of complexity.” (Fuchs 2003). He explains briefly the major aspects of emergence:

  • Synergism: Emergence is due to the productive interaction between entities. Synergy is a very general concept that refers “to combined or ‘co-operative’ effects – literally, the effects produced by things that ‘operate together’ (parts, elements or individuals)” (Corning 1998: 136). Synergy takes place and shapes systems on all organisational levels of matter, it is a fundamental quality of matter. Synergies between interacting entities are the cause of the evolution and persistence of emergent systems.

  • Novelty: On a systemic level different from the level of the synergetically interacting entities new qualities show up. Emergent qualities are qualities that have not been previously observed and have not previously existed in a complex system (“a whole is more than the sum of its parts”).

  • Irreducibility: The new produced qualities are not reducible to or derivable from the level of the producing, interacting entities.

  • Unpredictability: The form of the emergent result and the point of emergence can’t be fully predicted.

  • Coherence/Correlation: Complex systems with emergent qualities have some coherent behaviour for a certain period of time (Goldstein 1999). This coherence spans and correlates the level of the producing entities into a unity on the level of emergence (ibid.).

  • Historicity: Emergent qualities are not pre-given, but the result of the dynamical development of complex systems.

Among the other aspects of self-organization, Fuchs mentions information: all self-organising systems are information-generating systems. Information is the processual relationship between self-organising material units that form a coherent whole with emergent properties. For a more thorough discussion of self-organization, see Fuchs (2003).

As briefly described in the previous section, the existing technologies for coordination facilitate autopoietic self-organization if one lets go of rigid control and management.

4. Principles of digital communication

4.1 Information and knowledge

Burgin (2003) provides a good starting point for exploring the phenomenon commonly called "information". The term is so generic that almost anything may be covered by it. This should be a concern for anyone in the field of information modelling.

Burgin states: "As Goguen writes (1997), 'we live in an "Age of Information," but it is an open scandal that there is no theory, not even definition, of information that is both broad and precise enough to make such an assertion meaningful.' We are overwhelmed with a myriad of information from a wide spectrum of information sources, such as the World Wide Web, emails, images, speeches, documents, etc. At the same time, our experience demonstrates that a common-sense understanding of the notion of information may be very misleading. Consequently, we have to go to information science and develop a theoretical perspective on information. The main problem is to find the right theory."

Even if a useful definition of information could be established based on symbols, signs and data, an even more blurred concept is ready to block our journey: knowledge.

I quote from (Limone and Bastias 2006): "Although there may be no doubt as to the importance of the concept of knowledge, its epistemological status generates certain disquiet, as do the various terms employed by the various authors when referring to it.

Intellectual concern about the topic and the concept of knowledge is nothing new. To the contrary, its study is rooted in the most pristine anxieties of humanity and there is a long, philosophical tradition surrounding it. In the Western culture this tradition dates back to the pre-Socratic philosophers. Epistemology, both in its original consideration as a branch of philosophy and as its more recent consideration as a scientific discipline (experimental epistemology), has had the nature of knowledge as its central object of study—What is it? How is it disseminated? How is it conserved? How is it validated? Nevertheless, it is not common to find references to this tradition or to the scientific depictions referring to the nature and generation of knowledge in studies dealing with Knowledge Management (KM).

In our opinion this consideration should constitute the starting point for any reflection on the matter. In this regard, the approach of E. Bueno, of the U. Autonoma de Madrid, seems quite appropriate when he states, ‘This development in economic thought of considering knowledge as a critical resource and as the objective of the creation of value, has been quite successful. At the same time it is introducing greater complexity and a certain degree of confusion of concepts, terms, models, proposals, and other mental developments at the end of the 20th century. And it seems that it will continue doing so in the first years of the 21st century’. ‘The profusion of terms, occasionally amounting to linguistic nonsense, flippancy as to the way the concept is used, ignorance of the classical categories of thought and the frivolous abuse of fashions and of pseudo-scientific and postmodernist movements are constructing a new ‘‘Tower of Babel’’, provoking injustice and unease in the unnecessary formulation and accelerated substitution of propositions of new models and expressions without allowing them to mature and without making even a minimal effort to contrast them to prior ones . . . At the present time, the lack of agreement, of order, of objectivity, and the social explanation in this field of knowledge is an ‘‘objective reality’’’.

In my humble opinion, information and knowledge are not merely a matter of quantity (as in "knowledge is information in context", which again is information); they are of dissimilar qualities, and I imagine that information relates to knowledge in a manner similar to how energy relates to power. Hence, if it can be stored, it is not knowledge!

Mark Burgin (Burgin 2003) has a slightly different view, as he suggests an analogy of knowledge/data to matter and information to energy, presuming that matter contains energy as knowledge/data contains information. After a brief email discussion with Mark Burgin, I may subscribe to his model, because he includes in his analogy that data and knowledge are like molecules: data are water molecules with just three atoms, while knowledge is like a DNA molecule with billions of atoms. Mark Burgin's model is depicted in Figure 3.

Figure 3: Information/Energy Diagram (Burgin 2008). Courtesy of Mark Burgin.


4.2 Roles and Relationships

Both within an enterprise and between companies there are several roles to play in separate relationships as depicted in Figure 4.

Figure 4: Roles and relations

The functional relation is the only one in which any creation of value can take place. Nevertheless, most emphasis is put on the legal and economic relations, which create no value. One has to be a lawyer, an economist, a bureaucrat or a politician to be able to understand such nonsense.

4.3 Modern Verbosity

The regulatory and contractual domains create mountains of more or less irrelevant information. This non-valuable information increases exponentially, and threatens to bury the productive work. Here are some historical examples of regulatory information:

  • The Ten Commandments constitute a regulatory system embracing all (moral) aspects of (Christian) life. These rules are quite easy to understand, and may even be learned by heart. The Ten Commandments in modern formulation comprise approximately 40 words. (Writing in stone favours short messages.)

  • The United States Constitution (excluding the 27 Amendments) amounts to 4143 words in 7 articles. The principles of the Constitution are understood without much difficulty and may be applied to slightly different situations in the various states.

  • The "modern" state-union construction, the European Union (EU), is said to have more than 30000 directives, translated into all 23 official languages. One of them, Directive 1999/31/EC on the landfill of waste, comprises 9512 words (English edition). This kind of regulatory system requires armies of lawyers, controllers and other bureaucrats for preparation and to ensure compliance. Rumour has it that the directive on the import of caramels comprises 26911 words. I have not been able to confirm this, but with bureaucrats, lawyers and translators in charge it should not come as a surprise.

  • The recent Norwegian Planning and Building Law is accompanied by a Technical Regulation. The authorities have realized in advance the intricacy of this document, and provide a Guide. The introductory chapter of the Guide alone occupies more than 3000 words. Poor performance by the building industry will be the rule rather than the exception.

4.4 Digital Information and communication

Digital information media have made it possible for the first time in history to separate the content, the structure(s) and the form(s) of information, see Figure 5.

Figure 5: Digital Information
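As a minimal sketch of this separation (the class names, fields and rendering choices are my own illustration, not any standard), the same content can live in one structure and be given several forms, chosen only at the moment of presentation. In Java:

    // Content and structure: a door with an identifier and dimensions.
    record Door(String id, double widthMm, double heightMm) {}

    // Form is kept apart from content: any number of renderings may be applied.
    interface Form { String render(Door d); }

    class TextForm implements Form {
        public String render(Door d) {
            return "Door " + d.id() + ": " + d.widthMm() + " x " + d.heightMm() + " mm";
        }
    }

    class HtmlForm implements Form {
        public String render(Door d) {
            return "<li>Door <b>" + d.id() + "</b> (" + d.widthMm() + " x " + d.heightMm() + " mm)</li>";
        }
    }

    public class Separation {
        public static void main(String[] args) {
            Door door = new Door("D-101", 900, 2100);
            for (Form f : new Form[] { new TextForm(), new HtmlForm() }) {
                System.out.println(f.render(door)); // same content, two forms
            }
        }
    }

Paper fixes one pairing of content and form forever; the digital medium allows the pairing to be deferred to the reader.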

So far, the digital medium has been used mainly to imitate paper. "Electronic journals" often do no more; ITCon is a representative example of a digital journal putting more emphasis on paper-style formatting than on ease of access. More advanced publishers may be found; good examples are Nature News and The New York Times.

We are in a similar state of confusion with digital information as the printing press was in five hundred years ago. The inventor, Johann Gutenberg, used his invention mainly to produce incunabula, printed imitations of the Bibles handwritten by scribes in the Latin language and gothic typography. Gutenberg's 42-line Bible from 1452-53 was his masterpiece: a perfect copy of what the monastery scribes had laboriously copied for centuries. Even Johannes Trithemius' exhortation to his monks from 1492, "De laude scriptorum" (In Praise of Scribes), used the printing press to great advantage in circulating his own works.

My favourite is Aldus Manutius. Gutenberg copied the old religious content, gothic form and Latin language, so nobody except clerics could benefit from inexpensive printing.

Aldus took up old masterpieces of Greek literature and invented the simple typeface today known as italics. However, he did not use his italic typeface for emphasis as we do today, but rather for its narrow and compact letterforms, which allowed the printing of pocket-sized (or rather saddle-bag-sized) books. Thereby scholars could travel and bring their books with them (the lap-top of the day), which was not possible with the table-sized incunabula of Gutenberg, which still cost a fortune and required a monastery table to be read, by people familiar with gothic typography and the Latin language.

As long as the potential of the digital technologies is not harnessed for more than imitating the old media, their vast powers bring mainly disadvantages: information overload leading to organizational obesity and stagnation. The Nobel laureate economist Herbert Simon clearly pointed out the problem of information: "What information consumes is rather obvious: it consumes the attention of its recipients. Hence a wealth of information creates a poverty of attention, and a need to allocate that attention efficiently among the over-abundance of information sources that might consume it." (Simon 1971, pp. 40-41).

My favourite thinker on new media is Ted Nelson, who coined the word "hypertext" in 1965. Some of his thinking is still valid for the future, like "Back to the Future: Hypertext the way it used to be" and "Transliterature, A Humanist Design" (Nelson 2007).

The problem of information overload was recently addressed in the editorial of the scientific journal Nature (Nature 2008).

Based on this separation of content, structure and form, a set of principles can be suggested for the digital information space:

  • Simplicity; the least amount of simple information should be used

  • Pulling; the one in need of information is responsible for getting it, preserving one’s attention

  • Rewarding; information should be rewarded based on use, not produce

The principle of information simplicity is automatically achieved with cumbersome media like stone tablets or stencil duplicators. Using chisel and stone, energy restrictions will restrain one's verbosity and enforce punctilious work. A mistake in the last sentence means you have to do it all over again.

What happened when we got IBM typewriters and Xerox photocopiers in the 1980s? From the few copies of a stencil duplicator, we became accustomed to making copies for everyone at a very low cost. The next stage was a PC for everyone. They just substituted for typewriters. Now every engineer had his own typewriter, and could bypass the secretary and send the printout directly to the laser printer. It seemed very good, but the downside was the cost: there used to be two secretaries serving twenty engineers, using typewriters costing $1000 and lasting 10 years. Now twenty engineers needed PC typewriters costing $5000 and lasting 3 years, plus laser printers.

Today we produce trillions of emails with terabytes of attachments that just make the digital world horrible!

What is the benefit of the new technology? For Whom?

No buyer of a house has ever seen any benefit from this silly technology worship!

Technology is simply absurd if it has no benefit for human beings.

The idea of “information delivery” is ridiculous, because the content, amount and form (format) of the information should be determined by the actual user at the time of use.

4.5 Basic Principles

4.5.1 Simplicity

The information should be as simple as possible, but not simpler; there is no need to overwhelm anyone with loads of information that cannot be understood.

This principle has been well understood since the days of Plato (427-347 B.C.): “Beauty of style and harmony and grace and good rhythm depend on simplicity."

More recently, the same concept has been revived in what is known among computer scientists as Ockham's razor: "entia non sunt multiplicanda praeter necessitatem", roughly translated as "entities must not be multiplied beyond necessity", attributed to the 14th-century English logician and Franciscan friar William of Ockham.

Albert Einstein (1879-1955), Nobel laureate in physics, is credited with the statement: "everything should be made as simple as possible - but not simpler".

I feel that I stand on rather firm ground when stating the principle of simplicity!


4.5.2 Pulling

The person in need of information should be able to access it at any time and in any place. This is the basic principle of the World Wide Web, where nothing comes to you unless you ask for it.

I wonder why some media complain about all the porn and cruelty on the World Wide Web; it can only mean that they have actively visited it. It has never happened to me that I got anything I did not want from the World Wide Web. I do get some unsolicited emails, but the spam filter sorts them out neatly for me, so that is no big problem at all.

The World Wide Web is basically a “pull-medium”; you get what you want (YGWYW). When you need information, access it or search for it.
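A minimal sketch of the pull principle in Java (the URL is a placeholder, not a real service): the reader decides what to fetch and when, and nothing arrives unrequested.

    import java.net.URI;
    import java.net.http.HttpClient;
    import java.net.http.HttpRequest;
    import java.net.http.HttpResponse;

    public class PullDemo {
        public static void main(String[] args) throws Exception {
            // The consumer of information initiates the exchange;
            // the server stays silent until asked.
            HttpClient client = HttpClient.newHttpClient();
            HttpRequest request = HttpRequest.newBuilder(
                    URI.create("https://example.org/projects/bridge-42/specification"))
                .GET()
                .build();
            HttpResponse<String> response =
                client.send(request, HttpResponse.BodyHandlers.ofString());
            System.out.println(response.body()); // you get what you want (YGWYW)
        }
    }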


4.5.3 Rewarding

This is still an important issue in society, and will be in the building industry as well. It is denoted "Intellectual Property Rights" (IPR), which works differently when the information medium is not tangible but consists of bit configurations. The basic principle must be changed from "produce" to "use", as I will explain later under Superdistribution. A use-based rewarding will have its cost, namely the surplus of people who during their entire working lives have produced information that appears to be in no demand at all. Before adopting the reward-by-use mechanism, an organization should be well prepared for social and psychological disasters.

As Wilson puts it: "government agencies are not driven by goals but by constraints. Because bureaucracies aren't rewarded with profits when they do something right, avoiding doing something wrong (by "following the rules") becomes far more important than achieving results." (Wilson 1990).

In the new era, people not contributing to achieving a result will simply become obsolete.


4.5.4 Systems development

Gall’s law: “A complex system that works is invariably found to have evolved from a simple system that worked.

The inverse proposition also appears to be true: A complex system designed from scratch never works and cannot be made to work. You have to start over, beginning with a working simple system.” (Gall, 1986).

Consequently, it might be better to start out from the cooperation between Autodesk and Bentley Systems on harmonising their formats, and go on from there! Otherwise, the BuildingSMART initiative has to go back to the start (1968) and try again!

Any complex and working system is invariably built on remarkably simple principles. Examples are the human brain, the internet, the World Wide Web, and an anthill. Systems that appear smart are inevitably built from simple components with effective communication (like the human brain). Smart components with poor communication will, however, often emerge as a stupid whole. (The concept of Natural Stupidity can be observed in any "expert committee".) Remember Albert Einstein's advice: "Everything should be made as simple as possible, but not simpler".

4.6 What makes an information ‘object’?

Three properties characterize objects:

  1. Identity: the property of an object that distinguishes it from other objects

  2. State: described by the data stored in the object

  3. Behaviour: delivered by the methods in the object's interface, through which the object can be used

Data structures like those found in IFC do not define objects in any software sense.

Simply put: an object has a state and delivers services to its environment. The object might be located anywhere in the information space and provides services upon request, not by intervention in its inner state.
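A minimal sketch in Java, under my own assumptions (the class and the figures are invented for illustration), of how the three properties go beyond a bare data structure:

    public class Beam {
        // State: described by the data stored in the object.
        private final double lengthM;
        private final double massPerMetreKg;

        public Beam(double lengthM, double massPerMetreKg) {
            this.lengthM = lengthM;
            this.massPerMetreKg = massPerMetreKg;
        }

        // Behaviour: a service delivered on request, not by inspecting the inner state.
        public double massKg() { return lengthM * massPerMetreKg; }

        public static void main(String[] args) {
            Beam a = new Beam(6.0, 25.0);
            Beam b = new Beam(6.0, 25.0);
            // Identity: a and b have equal state, yet remain two distinct objects.
            System.out.println(a == b);     // false: two identities
            System.out.println(a.massKg()); // 150.0: an answer, not raw data
        }
    }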

Among the key principles of object modelling are the following (a brief sketch follows the list):

  • Classification (using is-a-kind-of relationships and inheritance)

  • Encapsulation (information hiding)

  • Aggregation (using is-part-of relationships between parts and the whole)
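A brief sketch of the three principles together (the building classes and figures are my own invention, not IFC or any standard):

    import java.util.List;

    // Classification: a Wall is-a-kind-of BuildingElement (inheritance).
    abstract class BuildingElement {
        public abstract double costEur();
    }

    class Window extends BuildingElement {
        private final double costEur; // Encapsulation: the state is hidden
        Window(double costEur) { this.costEur = costEur; }
        public double costEur() { return costEur; }
    }

    class Wall extends BuildingElement {
        private final double baseCostEur;
        private final List<Window> windows; // Aggregation: windows are-part-of the wall
        Wall(double baseCostEur, List<Window> windows) {
            this.baseCostEur = baseCostEur;
            this.windows = windows;
        }
        public double costEur() { // the whole delegates to its parts
            return baseCostEur + windows.stream().mapToDouble(Window::costEur).sum();
        }
    }

    public class Principles {
        public static void main(String[] args) {
            Wall wall = new Wall(1200.0, List.of(new Window(300.0), new Window(250.0)));
            System.out.println(wall.costEur()); // 1750.0: a service of the whole
        }
    }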

There is a difference between "identity" and "identification": identity is related to existence, while identification is related to reference. A newborn baby has, for instance, an identity even before it is given any name or social security number. Any object may have any number of identifications, but only one identity (the self). In the software world, objects have identity, but in the buildingSMART world they have only identification, so they do not exist.

The inventor of the object-oriented programming language C++, Bjarne Stroustrup, has remarked: “Any Software System is a Model of a Part of the Real World”. Software may here denote almost any kind of information. Because the real world is built from objects, the object-oriented paradigm is probably the most feasible one for making information systems that model (a part of) the world.

There are several textbooks on elementary object modelling, for example (Rumbaugh et al. 1991).

One major difference should, however, be considered when it comes to object-orientation in the software world compared to the real world: in the software world an object is an instance of a single class, while in the real world an object is a member of several classes depending on the context, which may change over time. I myself am, for example, a male human, a father, a husband, a professor, a teacher, a writer, and so on. This kind of complexity cannot be handled by object-oriented multiple inheritance, because one would end up with a unique class for every object. A different approach, called subject-orientation, was suggested in (Syvertsen et al. 1991).
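To make the difficulty concrete, here is a toy sketch of the role problem (my own illustration, not the subject-orientation of the cited paper): rather than a combinatorial explosion of classes such as FatherProfessorWriter, context-dependent roles are attached to and removed from one object by composition.

    import java.util.HashSet;
    import java.util.Set;

    interface Role {} // a context-dependent classification
    class Father implements Role {}
    class Professor implements Role {}
    class Writer implements Role {}

    class Person {
        private final Set<Role> roles = new HashSet<>(); // roles may come and go over time
        void take(Role r) { roles.add(r); }
        void drop(Role r) { roles.remove(r); }
        boolean is(Class<? extends Role> kind) {
            return roles.stream().anyMatch(kind::isInstance);
        }
    }

    public class Roles {
        public static void main(String[] args) {
            Person p = new Person();
            p.take(new Father());
            p.take(new Professor());
            System.out.println(p.is(Professor.class)); // true, with no FatherProfessor class
        }
    }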

Other frequent concepts from product modelling and object modelling domains are compared in the Table below (Syvertsen 1991).

Suggestions for the buildingSMART society

For the buildingSMART society I suggest four actions:

  1. Redefine the scope and objectives, because riding a dead mule requires a very big carrot and/or a hard whip.

  2. Establish a theoretical foundation based on information science.

  3. Start working with live objects, not only specifications of dead data structures, because the object paradigm will probably prevail for another 50 years.

  4. Change the modus operandi from a closed, deterministic and bureaucratic structure to an open, open-ended and inclusive freeware-style way of working, because the latter has proven superior in collaborative knowledge work.

In the following sections I will elaborate on these issues.

Scope and Objectives

The simplest and probably best approach is to follow a recipe like the one outlined in Figure 2. Some of the answers might seem obvious, but they will not be if the questions are posed in an open and unbiased manner.

The first question is: Who is the customer? The answer could be the owner of the building, the residents, the contractor, the consultants, or the authority, but not all of them.

The second question is: What does the customer need? Again, the answer should not be everything, but rather a few specific objectives. Time, cost and quality are, for instance, contradictory, and cannot all be achieved simultaneously.

The question of How? will probably be answered with some technological solutions, e.g. distributed objects, hyperspace, brain-computer interfaces, etc. The important issue is not to confine the options to technologies that worked 50 years ago (like data modelling) or that dominate today (like graphical user interaction).

The question of Why the effort should be undertaken brings in the rationale of the participants, which in case of contradictions needs to be balanced.

When and where is mainly a rather trivial planning effort that most engineers should master.

Other and more detailed approaches that may appeal to engineers include Quality Function Deployment (QFD), see for instance (Akao 1990).


Construction Informatics

Any successful information exchange effort will have to be grounded in a profound understanding of the governing principles and laws. A convenient starting point for a field of Construction Informatics is to join ongoing endeavours; see for example Burgin (2003) and Goguen (1997).


Live Objects

The most important technical issue is, in my view, to abandon the obsolete data modelling approach and take a braver approach to future developments. One important step would be to encapsulate data in executable code. Each object could keep its own copy of the code at virtually no cost. Objects could, for instance, be programmed in Java and executed in the Java Virtual Machine, or a similar framework could be established. Objects would be accessed over the network using SOAP (Simple Object Access Protocol), CORBA (Common Object Request Broker Architecture) or some equivalent architecture dedicated to information for construction works.
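A minimal sketch of such a live object, under my own assumptions (a plain HTTP endpoint from the JDK stands in for SOAP or CORBA, and the slab figures are invented): the data stays encapsulated inside the object, and clients receive computed answers rather than files.

    import com.sun.net.httpserver.HttpServer;
    import java.io.OutputStream;
    import java.net.InetSocketAddress;

    public class LiveSlab {
        // Encapsulated state: never shipped around as a file.
        private static final double LENGTH_M = 8.0, WIDTH_M = 5.0, THICKNESS_M = 0.25;
        private static final double DENSITY_KG_M3 = 2400.0; // concrete, assumed

        public static void main(String[] args) throws Exception {
            HttpServer server = HttpServer.create(new InetSocketAddress(8080), 0);
            // The object answers service requests; it does not export its data structure.
            server.createContext("/slab/mass", exchange -> {
                double massKg = LENGTH_M * WIDTH_M * THICKNESS_M * DENSITY_KG_M3;
                byte[] body = String.valueOf(massKg).getBytes();
                exchange.sendResponseHeaders(200, body.length);
                try (OutputStream os = exchange.getResponseBody()) { os.write(body); }
            });
            server.start(); // GET http://localhost:8080/slab/mass -> 24000.0
        }
    }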


Modus Operandi

The IAI/BuildingSMART organization resembles the strictly centralized and hierarchical structure of the Roman Catholic Church. This kind of organization is perfect for maintaining central power over beliefs and ideas and for normalizing and stabilizing practices.

If one, on the other hand, wants to achieve just the opposite, one should rather look to the extreme performance of organizations like the ones behind Wikipedia, Linux and other intrepid and open initiatives. This would mean an organic rather than a mechanistic way of cooperating. The hardest part is to let go of "control" and instead have confidence in the performers.

5. Future directions

5.1 Distributed objects

According to Object Management Group: “An object system includes entities known as objects. An object is an identifiable, encapsulated entity that provides one or more services that can be requested by a client.” (OMG 1998).

The object may be an aggregate of sub-objects (getting services from its component objects, e.g. mass, area, cost, volume or delivery time), and it may be located anywhere. Hence the need for data transfer disappears; simply ask the object for services based on its current state.

There is no need to distinguish between "program" and "data" because both are contained in an active object located somewhere in the information space. There will probably be no need to know the location of objects, because they all have a unique identity. Identity (being) is not identification (name), just as a newborn baby has an identity even before it has been given a name (or social security number). Identification names may be chosen at will according to the practical needs of anyone who wants to access the object. In Microsoft Windows, one can, for example, create shortcuts that provide direct access to any file in the file system (in object terminology this is called a "handle"). One can have as many identifications as wanted for any object, because the object has its identity in object space.
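As a hedged sketch of these ideas (Java RMI serves here as one concrete stand-in for OMG-style brokering; the interface, host name and object name are invented), a client requests services through a chosen identification, while the object itself lives somewhere else in the information space:

    import java.rmi.Remote;
    import java.rmi.RemoteException;
    import java.rmi.registry.LocateRegistry;
    import java.rmi.registry.Registry;

    // The service contract: clients see requestable services, never the inner state.
    interface BuildingObject extends Remote {
        double massKg() throws RemoteException;
        double costEur() throws RemoteException;
    }

    public class Client {
        public static void main(String[] args) throws Exception {
            // Look up the object by one of its identifications (a mere handle);
            // its identity and location remain in the information space.
            Registry registry = LocateRegistry.getRegistry("models.example.org");
            BuildingObject column = (BuildingObject) registry.lookup("column-B2");
            System.out.println(column.massKg()); // a service request, not a data transfer
        }
    }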

When it comes to distributed, live information, lessons can be learned from the architecture of the internet, as David Kalman claimed in the editorial of DBMS Magazine in May 1995: "it's the Internet architecture that I find so intriguing. This architecture embodies several key ideas that portend the future of business computing on networks. These ideas include:

1. Location independence - From the user's perspective, the lines between remote and local are blurred. Assuming that you have a fast enough network, applications running on a server could be made to appear indistinguishable from those running locally.

2. Pervasiveness - The network is everywhere. Users can plug into the wall, or connect via wireless communications.

3. Standards-based - Information is stored and presented in a way that is compatible across a huge user population. Also, applications and tools become more interoperable as the network itself becomes the platform.

4. Redundancy/resiliency - If a network node goes down, others can continue to operate. Fault tolerance in business computing is provided by connecting redundant, mirrored servers in various locations.

5. Scalability - To support the addition of new applications and new information sources, you can add more low-cost servers without disrupting service.” (Kalman 1995)

Technical as well as business computing in the construction industry will benefit from taking these lessons seriously.

5.2 Superdistribution

I was introduced to superdistribution by Brad Cox (Cox 1995). He explained the fundamental difference between physical goods, made of atoms, and digital goods, made of bits.

The production chain of even a simple information tool like a pencil is extremely well coordinated, based on the needs that arise for raw materials like wood, lead, rubber, metal, and so on. The exchange mechanism is that atoms are traded for a universal commodity: money.

Digital goods are different, because bits can be duplicated and exchanged at almost no cost. The well-known "Moore's law" still holds, even if electronic components are probably approaching the limit. New technologies are, however, under development, for example quantum transistors capable of holding not only two states but many states simultaneously (Johnston 2008). I do not understand quantum physics in depth, but I can still imagine that the increasing capacity at lower prices for computing equipment and digital communication will continue.

The basic principle of superdistribution is that objects contain a mechanism that automatically rewards the producer for use, not for produce. By this scheme, the entire production chain will be streamlined, just like the production of a pencil, and it will stimulate the reuse of information as much as possible. Otherwise, the construction industry will be overwhelmed with claims regarding "intellectual property rights", and provide fertile ground for even more lawyers while the engineers and architects are starving.
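A toy sketch of the metering idea (entirely my own illustration; real superdistribution would need tamper-resistant metering and a payment infrastructure): each use of an information object credits its producer, so the reward follows use, not produce.

    import java.util.Map;
    import java.util.concurrent.ConcurrentHashMap;

    // Records one credit per use against the producer of the information.
    class RoyaltyLedger {
        private final Map<String, Long> credits = new ConcurrentHashMap<>();
        void credit(String producer) { credits.merge(producer, 1L, Long::sum); }
        long usesBy(String producer) { return credits.getOrDefault(producer, 0L); }
    }

    // An information object that meters its own use.
    class MeteredDetail {
        private final String producer;
        private final RoyaltyLedger ledger;
        private final String content;

        MeteredDetail(String producer, RoyaltyLedger ledger, String content) {
            this.producer = producer;
            this.ledger = ledger;
            this.content = content;
        }

        String use() { // each use, not each copy, triggers a reward
            ledger.credit(producer);
            return content;
        }
    }

    public class Superdistribution {
        public static void main(String[] args) {
            RoyaltyLedger ledger = new RoyaltyLedger();
            MeteredDetail joint = new MeteredDetail("engineer-42", ledger, "beam-column joint detail");
            joint.use();
            joint.use();
            System.out.println(ledger.usesBy("engineer-42")); // 2: rewarded by use
        }
    }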

Consider a situation where all information in an organization has such an attached use-based mechanism, so that the producer of information is only paid when the information is used by others. In such an administration, many managers and bureaucrats would starve to death, because the regulations and the like that they have spent most of their time cultivating would turn out to be used by nobody at all. Technically, a system working in this way is quite simple to implement, but the psychological, social and personal issues will be very hard to deal with. Any pulling information system will rock the foundations of the traditional pushing information organizations based on authority, power, and the self-esteem of superiors. When the old hierarchies break down, superior and high-performing organizations will emerge, as they already have done in "Cyberspace" (Shirky 2008).

By the way, in traditional economic statistics there is a division between products and services. A product is a competent composition of atoms (like a pencil or a PC), while a service is the use of somebody's competent time (like a haircut or an examination of your body by a physician). There is now a third category, which I prefer to call a "digital configuration". This is some kind of information for which the individual bits are not of interest (they are only 1 or 0); it is the configuration that is important. I prefer the term "digital" to "electronic". Even "electronic mail" or e-mail is today mostly transported over fibre cables by means of light, or photons, so the term should probably be "photonic mail" or p-mail.

We have had digital configurations for many centuries, like the compositions of Mozart, Bach and others. They did not sell their works as digital configurations in their time, but through performances, and the works still live on as digital configurations.

Digital configurations have three characteristics separating them from products and services: they are independent of time and space, they do not deteriorate, and they can be copied and distributed at practically no cost compared to the original production cost.

5.3 Hyperspace

IT will in itself bring no change to the construction industry. Change will appear only when the work processes have been changed. This requires thinking and cooperation in dialogue between human beings.

IT by itself is downright dangerous, because it will consolidate and reinforce the established (mal)practice of work. Compare this to the document-centred regime of bureaucratic offices or the ERP mammoths ("Enterprise Resource Planning") of many businesses.

Under the "New Public Management" terror of Europe, it is just a matter of time before petrification sets in. The only ones to benefit from this slow and painful death of meaningful activity are the bureaucrats and lawyers who dictate the situation through laws, directives, standards and other nonsense.

2009 is the 25th anniversary of the concept "cyberspace", which appeared for the first time in the novel Neuromancer by William Gibson. Gibson wrote: "Cyberspace. A consensual hallucination experienced daily by billions of legitimate operators, in every nation, by children being taught mathematical concepts ... A graphic representation of data abstracted from the banks of every computer in the human system. Unthinkable complexity." (Gibson 1984). The graphic representation became observable in 1993, when the World Wide Web could for the first time be accessed via a graphical interface - the NCSA Mosaic browser.

Gibson also mentioned cranial jacks for the human-cyberspace interface. Today we see practical use of neural interconnections through functional magnetic resonance imaging (fMRI) for brain-computer interfaces (Rice 2009).

It is time to start the exploration of hyperspace, as explained by Nelson and Smith (2007):

" To escape from the limitations of files as we now know them, transliterary structure has been designed to obey a particular rule of combination: Division into files need not affect the structure. Transliterary document structures may be broken into files many different ways and still behave the same way."

and:

" By "folding space" we mean a pseudospace, presented to a user, whose proportions can be changed in more than three dimensions.

For example, a 3D model of a house may be squashed flat into a 2D picture, or stretched out of proportion. It can also, in principle, be extended into a fourth dimension showing its physical changes and value over time, and other dimensions showing parametric variations among houses built from the same design. (Note that such visualizations are tricky but doable.)

These flexible arrangements allow the presentation of ideas and relations which could never be presented before." (Nelson and Smith 2007).

6. CONCLUSIONS

Ted Nelson addresses what is crucial to human understanding and imagination. As the cognitive scientist Marvin Minsky has taught us: "The secret of what anything means to us depends on how we've connected it to all the other things we know. That's why it's almost always wrong to seek the "real meaning" of anything. A thing with just one meaning has scarcely any meaning at all." (Minsky 1988).

In my humble opinion, the restriction of the flexibility of meaning leads to severe impairment of human imagination and definitely to the end of innovation.

At the First International Symposium on Building Systems Automation-Integration, held at the University of Wisconsin, Madison, in June 1991 (Syvertsen 1991), I concluded my presentation of "The Design Office of the Future" with this claim: "The future is perhaps here when nobody talks about Computer-Aided". My point was that as long as you give more attention to the tool than to the task, you have too much overhead. Any kind of technology that is important is simply immanent in our work, be it individual or collective. As long as the technology itself is an issue, it is not working smoothly. Now, almost 20 years later, I will suggest that the future is approaching when nobody talks about BIM, simply because there is no other way of working.

Just try to recall any kind of communication technology: telegraph, telephone, radio, television, the internet... None of these became important to society until our worrying about them ceased. That is maybe why any major change of communication technology takes generations; you have to be born with it to accept it? And the natives are always the most fluent in the language.

Nevertheless, in a hundred years, compatibility will not be a question at all, because people will still talk to each other in the most convenient way.

I will conclude with three simple statements:

  1. The immense power of modern information and communication technologies makes them extremely dangerous if misused.

  2. Careless standardization of the application of technology may lead to severe impairment of human imagination and definitely to the end of innovation.

  3. Self-organizing seems to be the only promising principle for harnessing technology.

If it remains ignorant, the construction industry will probably continue in the direction depicted in Figure 6 (from an undisclosed internet source). (The New Zealander could be substituted by a Pole in Scandinavia, by a Mexican in the U.S., and so on.)

Figure 6: Australian Corporate Management

References

AEC Magazine (2008). Autodesk and Bentley interoperability, August 23, 2008. http://www.aecmag.com/index.php?option=com_content&task=view&id=247

Akao, Yoji (1990). QFD: Quality Function Deployment - Integrating Customer Requirements into Product Design, Productivity Press, 1990.

Bosma, H. (1983). Information Quality rather than Information Quantity. In: Information Policy and Scientific Research, Amsterdam, Elsevier, pp. 99-106.

BuildingSMART: www.buildingsmart.com, September 7, 2008

Burgin, M. (2003). Information: Problems, Paradoxes, and Solutions, tripleC 1(1): 53-70, 2003, (ISSN: 1726-670X), Vienna, https://www.triple-c.at/index.php/tripleC/article/view/5

Corning, P.A. (1998). The Synergism Hypothesis. In: Journal of Social and Evolutionary Systems, 21(2), pp. 133-172.

Cox, B. (1995). Superdistribution. Objects as a Property on the Electronic Frontier, Addison Wesley Publishing Company ISBN: 0-201-50208-9, https://web.archive.org/web/20160324233052/www.virtualschool.edu/mon/TTEF.html

Dahl, Ole-Johan and Nygaard, Kristen (1965): SIMULA: A language for programming and description of discrete event systems. Introduction and user's manual, Norsk Regnesentral (The Norwegian Computing Center), Oslo, 1965.

Fuchs, C. (2003). Co-Operation and Self-Organization, tripleC 1(1): 1-52, 2003, (ISSN: 1726-670X), Vienna, https://triple-c.at/index.php/tripleC/article/view/2

Gall, J. (1986) Systemantics: The Underground Text of Systems Lore. How Systems Really Work and How They Fail (2nd edition), ISBN 0-9618251-0-3, General Systemantics Press, Ann Arbor, Michigan.

Gibson, William (1984): Neuromancer, ISBN 0-441-56956-0, Ace Books, Toronto.

Goldstein, J. (1999). Emergence as a Construct: History and Issues. In: Emergence 1, 1, pp. 49-72.

Goguen, J.A. (1997). Towards a Social, Ethical Theory of Information, In: Social Science Research, Technical Systems and Cooperative Work, Erlbaum, pp. 27-56.

Johnston, H.: A simpler way to test quantum computers, Physicsworld.com, September 25, 2008, https://web.archive.org/web/20120128144521/http://physicsworld.com/cws/article/news/36028

Kalman, D.L. (1995). Learning from the Internet, editorial, DBMS Magazine, May 1995. Also available at: https://web.archive.org/web/20071009160732/www.dbmsmag.com/fred9505.html

Limone, A., Bastias, L. (2006): Autopoiesis and Knowledge in the Organization: Conceptual Foundation for Authentic Knowledge Management, Systems Research and Behavioral Science, 23, 39-49 (2006). Published online in Wiley InterScience (https://onlinelibrary.wiley.com/doi/abs/10.1002/sres.745)

Løvstad, M., Syvertsen, T.G. (1992): Information Modeling of Offshore Structures Using an Object-oriented Methodology, Proceedings, ASME Joint European Conference on Engineering Systems Design and Analysis, Istanbul, Turkey, July 1992.

Malone, T.W., Crowston, K. (1992): Toward an interdisciplinary theory of coordination, Center for Coordination Science, Cambridge, MA, MIT Press, 1992.

Minsky, M. (1988): The Society of Mind, Simon and Schuster, New York, 1988

Nature (2008). Community Cleverness Required, Nature, Vol. 455, No. 7209, 4 September 2008.

Nelson, Theodor Holm and Smith, Robert Adamson (2007): Back to the Future: Hypertext the way it used to be, Project Xanadu, http://xanadu.com/XanaduSpace/btf.htm

New York Times (2008). Behind Insurer’s Crisis, Blind Eye to a Web of Risk by Gretchen Morgenson, September 27, 2008 (http://www.nytimes.com/2008/09/28/business/28melt.html?hp)

OMG: CORBA v2.2, February 1998, www.omg.org

Rice, Jocelyn (2009): Reading Thoughts with Brain Imaging, Technology Review, MIT, Cambridge, Massachusetts, February 18, 2009.

Rumbaugh, J., Blaha, M., Premerlani, W., Eddy, F., Lorensen, W. (1991): Object-Oriented Modeling and Design, Prentice-Hall, Englewood Cliffs, New Jersey, 1991.

Schiefloe, P., Syvertsen, T.G. (1993): Coordination: Challenge of the Nineties. Multimedia as a Coordination Technology, Telektronikk, No. 4, Norwegian Telecom Research, Kjeller, Norway.

Shirky, C. (2008): Here Comes Everybody - The Power of Organizing without Organizations. Penguin Press, New York, 2008.

Simon, H. (1971). Designing Organizations for an Information-Rich World, written at Baltimore, MD, in Martin Greenberger: Computers, Communication, and the Public Interest, The Johns Hopkins Press, ISBN 0-8018-1135-X.

Sutherland, I. (1963): Sketchpad, A Man-Machine Graphical Communication System, PhD-thesis, MIT, 1963. (available at http://www.cl.cam.ac.uk/TechReports/UCAM-CL-TR-574.pdf )

Syvertsen, T.G. (1991): The Design Office of the Future, First International Symposium Building Systems Automation-Integration, University of Wisconsin, Madison, June 2-8, 1991

Syvertsen, T.G., Lillehagen, F., Løvstad, M. (1991): A Generic Object Model for Engineering Design, TOOLS Europe, Dortmund, March 30-April 2, 1991.

Syvertsen, T.G. (1991): Object-Oriented Product Modeling with Application to The Conceptual Design of Structures, ARECDAO '91, Barcelona, April 1991.

Wilson, J.Q. (1990): Bureaucracy: What Government Agencies Do And Why They Do It, Basic Books, New York, 1990