This web page is associated with a book called Hermeneutics in Agile Systems Development.
The book can be bought at: https://play.google.com/store/books/details/Dr_Jerome_Heath_Hermeneutics_in_Agile_Systems_Deve?id=mi6VBQAAQBAJ
The book gives a more complete explanation of these issues and includes a number of related topic discussions. Together these develop an understanding of the concepts of Agile software development.
Super Archeological Layering
In modern discussions about software systems and the analysis of those systems (much of the discussion is related to security problems), there is debate about the data (all of the data in particular), how this data fits together, and how it is used. Security depends on our understanding of all the relationships between the data entities of the system. To me the discussion is always rummaging through the territory of Formation Data Context without a formal methodology for doing it.
Formation Data Context is a concept relating to how the data and data relationships are formed in the system. These relationships need to be understood so that we can manipulate the data and relationships, secure the data when programming the computer, and make decent predictions about what is happening and what the results will be.
In hermeneutics there is a process called form analysis, or analysis of form. The idea is to study what forms the textual information passed through as it was being used and developed. Was it saved as history, was it used as some kind of liturgy, religious or secular (approaching the king or the judge), or was it used in common meetings of people? Various texts could have grown through these forms. A particularly important consideration is the reason the text was in written form and not just passed down verbally. In earlier times only a few could read and write, so text was special. It is significant to know when and why a text was put into written form.
Now we need to carry this analysis over to the digital form, both for controlled security and for smooth transitions. Why is this data being digitized? Then, what forms will this data be part of in its digital use? And, also, how do we secure the data for our purposes? Knowing these provides a better understanding of what users are using this data for, so the programmer can understand these uses from the beginning rather than tacking them on later with much extra work. The important difference here is that we are looking to define details of all phases of the process, whereas the original form analysis looked to define, in particular, the “first use” of the form. The digital process uses the concepts of language action developed by Habermas. Simply put, the data forms are developed and defined based on their usefulness. This is augmented by some archeological layering per Foucault.
In the real world the data or information tends to be related to real things: temperature, clouds, rain, miles traveled, dollars spent, etc. There is a reality associated with the data. The data within the computer can be associated with such realities outside the computer, but inside the computer the relationship between the real and the data is more distant.
Data comes into a computer system through a user interface. This is the first form of the data. It arrives as some kind of stream of information from a user or a device. The original form is often quite different from the form needed in the application. The changes needed for the input may appear simple, but the process must consider all possible combinations that might appear in the stream.
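As a minimal sketch of this point (not from the book), consider a temperature reading arriving from an interface as raw text. The function name `parse_temperature` and the input format are hypothetical; the point is that even a "simple" conversion must reject every malformed combination the stream might carry:

```python
def parse_temperature(raw: str) -> int:
    """Convert one raw input token into the integer form the
    application needs, rejecting malformed combinations."""
    cleaned = raw.strip()
    if not cleaned:
        raise ValueError("empty input")
    # Allow an optional leading sign, but nothing else non-numeric.
    sign = 1
    if cleaned[0] in "+-":
        sign = -1 if cleaned[0] == "-" else 1
        cleaned = cleaned[1:]
    if not cleaned.isdigit():
        raise ValueError(f"not a whole number: {raw!r}")
    return sign * int(cleaned)
```

The checks are the bulk of the code; the conversion itself is one line. That imbalance is typical of first-form data.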
Within the computer the data has four basic characteristics: type, value, address, and function. The data has a type (which is extremely important in digital information because the type defines how the data can be processed), say here, an integer. The data has a value, say 76. The fact that this is degrees or speed or height is actually a separate data item.
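These characteristics can be illustrated in a few lines of Python (an illustration of my own, not from the book). The variable names are made up; note that the unit really is kept as a separate data item:

```python
# The value 76 and its type are carried by the stored object itself.
reading = 76
unit = "degrees"   # the fact that it measures degrees is separate data

# The type defines how the data can be processed: integer arithmetic
# is valid on the reading, but not on the unit label.
print(type(reading).__name__)   # the integer type
print(reading + 1)              # a processing step the type permits
```

If the unit were folded into the value (say, the string "76 degrees"), the type would no longer permit arithmetic, which is exactly why type and value are distinct characteristics.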
In the computer an important characteristic of data is the address, where the data is “kept”. When we first worked with computer data, the importance of the “address” of the data was less understood. But knowing the address is the only way of finding the data in the computer. Often the address is “held” in the name given to the site or place of the data by the programmer. This makes finding the data “easy.” But there are more complex ways to keep track of the address of the data that at first seem elusive but make finding and using large amounts of data easier for the computer.
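A small Python sketch of the two ways of "holding" an address described above (my illustration; the names and city keys are hypothetical). The programmer-given name finds one item; a dictionary is one of the more indirect structures that makes finding large amounts of data easier:

```python
# The name given by the programmer "holds" the address for us.
temperature = 76
print(id(temperature))   # CPython's report of where the object is kept

# A more indirect way to keep track of where data is kept:
# a dictionary maps a key to the stored data, which scales
# to large amounts of data far better than individual names.
readings = {"honolulu": 76, "hilo": 72}
print(readings["hilo"])
```

The dictionary lookup seems elusive at first (there is no visible address anywhere), yet it is precisely what lets the computer find any one of millions of items quickly.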
As we use a lot of data in a given process it is advantageous to have that data together in some kind of form. The form can be a list, a table, a page, an index, or an object. Such a form allows us to move or reference the data that is needed together. Once it is referenced as part of a collection we can deal with the individual items more quickly and smoothly in the process development. Such structures are necessary to have workable and efficient data processes. Devising such forms is part of developing efficient and effective data processes. Analysis of form from a hermeneutic approach allows us to develop a more complete understanding of how the forms are associated and thus how they can be developed and used more effectively.
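Two of the forms mentioned above, sketched in Python (my own illustration with made-up values): a list groups items that belong together, and a record groups the fields of one item. Once the data is in a collection, the process can deal with all of it in one reference:

```python
# The same readings kept in different "forms":
samples = [76, 74, 71]                    # a list: ordered, indexed by position
row = {"city": "honolulu", "temp": 76}    # a table-like record, indexed by name

# Referencing the collection lets the process handle the items together:
average = sum(samples) / len(samples)
print(round(average, 1))
```

Without the collection, each reading would need its own name and the average would have to be rewritten every time a reading was added.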
Object oriented design establishes a structure for data that is meant to be more in tune with real events and activities. The use of inheritance, encapsulation, overloading, and polymorphism creates an object that has personality. The functions of the structured design methodology could not be understood as things; they were only functions. An object can be understood as a thing. This allows a more native recognition of object capabilities and the ability to predict the outcome when reusing the object. This methodology allows a more hermeneutic approach rather than pure functional decomposition.
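A minimal Python sketch of that "personality" (my own example; the class names are invented). Encapsulation keeps the data inside the thing, inheritance makes one thing a kind of another, and polymorphism lets the same call produce the object's own behavior:

```python
class Sensor:
    """A 'thing' that encapsulates its own value and behavior."""
    def __init__(self, value):
        self.value = value            # encapsulation: the data lives inside

    def describe(self):
        return f"reading: {self.value}"

class Thermometer(Sensor):            # inheritance: a Thermometer is a Sensor
    def describe(self):               # polymorphism: same call, its own behavior
        return f"{self.value} degrees"

# The same call works on either object; the object's "personality"
# decides the outcome, which is what makes reuse predictable.
for s in (Sensor(5), Thermometer(76)):
    print(s.describe())
```

A structured-design function could only be invoked; these objects can be handed around, reused, and still behave as the kind of thing they are.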
As we develop our Formation Data Context analysis we need to tie the data together through archeological layering (Foucault). Though this may not be a legacy system, there are legacies in the process that is being analyzed. And as the system is analyzed, the relationships between data will be placed more accurately using archeological layering than using more structured methodologies. And the definition of data and data structures must fit the concepts of Habermas; that is, usefulness (pragmatics). But the usefulness is not just inside the system. Users of the system are part of the pragmatics.
The Story Board
When on a project on my own I do most of the archeological layering in my head. I do write notes and lists of important items and issues. I also mark the old code in important areas; especially where I am sure there are repercussions in any changes I would make. But for a team there needs to be some way to communicate viewpoints and ideas between members.
While watching the mysteries on television these days, I see a common phenomenon in the use of a “Murder Board” or Story Board. As the mystery progresses, people’s pictures, names, and evidence (including alibis and statements) are put on the board. Issues relating to solving the crime are also put on the board. People who are related to each other are put as close together as possible. Extra relationships that can’t fit close together are linked by strings, often red for emphasis. Issues are tied to people through this same kind of string. Evidence, often in plastic bags, and pictures of evidence are also added. This all begins to look like a scrapbook on the wall. It is referred to as the board or murder board (as the crime in the plot is usually murder). This is archeological layering on a board (or it could be on a white board).
This form is actually archeological layering that is formalized so that a team can work together on the archeological layering of the project. In systems processing, I could see Use Cases as commonly appearing on some kind of story board, along with information, drawings, pictures, and diagrams related to the system as it is, and, also, the beginnings of the new design process. In Agile, the narratives are the most important basis for coding efforts. These beginnings of the design process are the beginnings of Formation Data Context. On the story board the context is always clear and relevant.
There is a point where the design issues can be moved to a separate design story board. These are then factored into design narratives. As we pivot from analysis to design, the goal changes from recognizing the relationships between items in the collage to defining the new data and processes, and doing this in a pragmatic, or useful, way. The key to developing Agile data and processes is maximum usefulness. This is a complete change from the Structured Methodology, where function was the important issue. Usefulness is far more effective as a filter for defining the new system’s data and processes. Since, with Agile, programming starts early, code is produced before all the systems are completely designed. The design story board becomes the basis for building the backlog and sprints that decide the next coding efforts to accomplish.
Meaning and Truth
My experience with textbooks on Structured Design and Project Management is that Structured Systems Development was aimed at “truth”. The textbooks often claim that you can prove the code produced is true and accurate. They often allude to going back into the “lab” and testing the code for this truth. The concept is that there is only one true way of doing anything and we need to find exactly that way. Of course, the only way to find that “truth” is to use the Structured Methodology.
The Agile approach has a different view on choosing the best code for a process. The basic design of the various Agile Methodologies relates to developing meaning and not “truth”. The ultimate problem with “truth” is that in order to be “truth” it must be taken out of context. “Truth”, by its definition, is a law unto itself. Meaning is always, and must always be, in context. All Agile Methodologies emphasize that context, since it is the only way to control the process; since in Agile, by setting aside the Structured Methodology, you have rejected “truth” as “the” controlling factor.
Dr. Jerome Heath
Google Books: Hermeneutics in Agile System Development