This topic contains the following:
Modeling Business Processes within the enterprise, including the study of lean optimization
Communication channels, including Web 2.0 tools, email and scheduled meetings
SIFOA : Simulation of Information Flows and Organizational Architecture
I have made considerable progress over the last few years on social networks and corporate meetings. I still need to get my research article on Affiliation Networks published, but I consider that I have achieved the goals stated in the previous plan. On the other hand, there is still a lot to be done to better characterize communication channels, including those of the "Enterprise 2.0".
This first part of my agenda is strongly linked to my first blog "Architecture Organisationnelle". Although I have made some significant progress in understanding the various components of "2.0 communication", I am still far from my goal of "collaboration sociometry". Hence I plan to continue my study of "Enterprise 2.0" and how it is relevant to the challenges of 21st century business.
A key goal stated in my previous plan was:
"Build a workbench to explore and demonstrate the benefits of BP optimization techniques, such as Lean Six-Sigma. Lean Six-Sigma has become quite popular, and a lot has been written on lean management and six-sigma total quality management. Most books or texts explain how or why deploy such a method, but few try to explain why or when it actually works. It turns out that this is a non-trivial question. A longer-term objective is to write a textbook on Business Process Optimization"
I am still working on it :) I have definitely made progress in understanding why lean works, especially from my practical experience at Bouygues Telecom. However, before building a workbench (cf. SIFOA v2 later), I need to build a better "Business Process Enterprise Model". It turns out that these two topics (understanding lean and understanding 2.0) are more closely related than I had thought at first (at least, there is a very interesting intersection). This is why I found that I needed a common model as a foundation for this quest. I came up with the concept of BPCM, as explained in my post of August 9th, 2009 (a sad day):
The goal of BPCM is summarized (following March & Simon in "Organizations") as the quantitative study of the efficiency of enterprise communication as a way to run business processes (seen as strongly inter-related and complex activity patterns). Three parts may be found in such a model:
Business Process Model : a model which describes the enterprise, its business processes and which defines efficiency. This is a classical topic and one may build on top of time-proven foundations. For instance, one may describe the enterprise with units, resources and skills; activities may be used to build processes and clients are introduced to define the value produced by these processes.
Enterprise Information Flow Model : a model which describes the enterprise organization and its capacity to route information flows. Roughly speaking, this includes the management organization (since, according to March & Simon, a key management function is precisely to transmit information) and a formal model for communication channels.
Business Process Communication : a model which defines which information is necessary to run business processes. A process is defined as an ordered network of activities, which is most often complex and strongly interdependent (especially for service processes). Making the process run requires a fair amount of communication. A key thesis in March & Simon's book is that the enterprise organization should be designed to foster these communication flows. This third part of BPCM intends to characterize the "information load" that the enterprise organization must process.
I quote from their book: "The capacity of an organization to maintain a complex, highly interdependent pattern of activity is limited in part by its capacity to handle the communication required for coordination. The greater the efficiency of the communication within the organization, the greater the tolerance for interdependence. The problem has both quantitative and qualitative aspects."
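To make the three parts a little more concrete, here is a minimal sketch in Python. Every class, field and number below is an illustrative placeholder of my own, not part of a published model:

```python
from dataclasses import dataclass

@dataclass
class Activity:
    name: str
    info_needed: int      # messages required to coordinate this step (part 3)

@dataclass
class Process:
    name: str
    activities: list      # ordered network of activities, linearized here (part 1)

@dataclass
class Channel:
    name: str             # e.g. "email", "meeting", "wiki"
    throughput: float     # messages the channel can carry per day (part 2)
    efficiency: float     # fraction of messages that actually inform

def information_load(process):
    """Part 3: the communication that one run of the process demands."""
    return sum(a.info_needed for a in process.activities)

def channel_capacity(channels):
    """Part 2: the effective routing capacity of the organization."""
    return sum(c.throughput * c.efficiency for c in channels)

# The quantitative question of BPCM: does the organization's communication
# capacity cover the information load of its processes?
billing = Process("billing", [Activity("collect", 4), Activity("invoice", 2)])
org = [Channel("email", 100, 0.3), Channel("meeting", 20, 0.8)]
feasible = channel_capacity(org) >= information_load(billing)
```

The point of the sketch is simply that the three sub-models meet in one comparison: the load produced by part 3 must fit within the capacity described by part 2, for the processes of part 1.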
A longer-term goal will be to integrate all of this into a new simulation tool (SIFOA v2) which I have been postponing for a while. As a matter of fact, the first simulation tool was successful in terms of results but difficult to communicate about, since its design was very much "ad hoc". My goal is to build this new tool on top of models that have been independently published, described and debated. I could also state my long-term goal as understanding how "communication theory" may contribute to "management theory".
A major goal for this first topic is to write a new book, which should be out early 2011.
Objectives:
BPCM
Better understanding of the contribution of 2.0 tools to enterprise efficiency
Formal study of BPO (Business Process Optimization) such as lean management
SIFOA v2
GTES is a simulation technique that combines Monte-Carlo, Game Theory and Evolutionary Algorithms. As explained in my blog or my research paper, it is mostly a "model benchmark", that is, a tool to explore what can be said about a very general model.
GTES applies to problems for which a model is conjectured, but with far too many unknown parameters to be useful for a direct simulation. It also addresses the behavior of a set of actors/players, who try to maximize some form of objective function. GTES separates these unknown parameters into three families:
parameters that are not related to the actors but represent the environment. The GTES approach is to sample these parameters using a Monte-Carlo simulation
parameters that represent the objective function, i.e. the strategy of the actors. These are the control parameters for the model, those for which a "game theory strategy matrix" is desired. GTES will yield a simulation value for each setting of these parameters.
so-called tactical parameters that are assigned to each actor, but are "controlled" by the strategy parameters, in the sense that each actor may be assumed to learn the optimal value to best fit its objective function. This is where the "evolutionary algorithms" (or any form of local optimization and meta-heuristic) kick in: GTES solves each optimization sub-problem to find the "best behavior" for each actor according to its strategy parameters.
The beauty of the GTES approach is to reduce the set of parameters that need to be looked at (i.e., the second set). The first set is sampled and the third set is derived from solving a set of optimization problems. The game-theoretical part comes from solving each actor's optimization problem at the same time. A simple fixpoint computation (iterating the local moves) may find a Nash equilibrium, but there is no guarantee. However, the first application of GTES to various problems using a naive search for Nash equilibria has proven to be quite interesting.
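As an illustration, here is what the GTES skeleton looks like on a toy duopoly pricing game. The payoff function, the parameter ranges and the exhaustive search grid are all invented for the sketch; a real run would plug in a full business model and an evolutionary search instead:

```python
import math
import random

def payoff(own, other, demand):
    """Toy objective function: revenue with a logit-style market share."""
    share = math.exp(-own) / (math.exp(-own) + math.exp(-other))
    return own * demand * share

def best_response(other, demand, grid):
    """Local optimization of one actor's tactical parameter (its price)."""
    return max(grid, key=lambda p: payoff(p, other, demand))

def gtes_run(samples=20, rounds=30, seed=0):
    rng = random.Random(seed)
    grid = [1.0 + 0.5 * i for i in range(18)]   # tactical search space
    outcomes = []
    for _ in range(samples):                    # Monte-Carlo sampling of the
        demand = rng.uniform(50, 150)           # environment parameters (set 1)
        prices = [5.0, 5.0]                     # tactical parameters (set 3)
        for _ in range(rounds):                 # fixpoint iteration of local moves
            prices = [best_response(prices[1], demand, grid),
                      best_response(prices[0], demand, grid)]
        outcomes.append(tuple(prices))          # candidate Nash equilibrium
    return outcomes
```

The strategy parameters (set 2) are frozen here, since both toy actors simply maximize revenue; a full GTES run would repeat this whole loop for each cell of the strategy matrix.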
GTES has been a true success of the previous three years, in terms of results and published works. The journal article in RAIRO was a nice conclusion to this first piece of work. I have now established a formal framework which allows me to describe the set of algorithms that constitute the GTES approach in a precise and complete manner. The next step is, as was hinted before, to work on a broader characterization of equilibria:
- better ways to find Nash equilibria when they exist
- defining a lesser form of equilibrium when no Nash equilibrium may be found (for instance, by using a minmax approach)
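On a finite two-player payoff matrix, the two notions can be stated in a few lines. The games below (prisoner's dilemma, matching pennies) are standard textbook examples, not taken from my own models:

```python
# `payoffs[i][j]` is a (row player, column player) payoff pair.

def is_nash(payoffs, cell):
    """True if neither player gains by deviating unilaterally from `cell`."""
    i, j = cell
    best_row = max(payoffs[k][j][0] for k in range(len(payoffs)))
    best_col = max(payoffs[i][k][1] for k in range(len(payoffs[i])))
    return payoffs[i][j][0] >= best_row and payoffs[i][j][1] >= best_col

def minmax_row(payoffs):
    """A 'lesser' equilibrium for the row player: the strategy that
    maximizes its worst-case payoff when no pure Nash cell exists."""
    return max(range(len(payoffs)), key=lambda i: min(c[0] for c in payoffs[i]))

# Prisoner's dilemma: mutual defection (cell (1, 1)) is a pure Nash equilibrium.
pd = [[(3, 3), (0, 5)], [(5, 0), (1, 1)]]
# Matching pennies: no pure Nash cell, so only the minmax fallback applies.
mp = [[(1, -1), (-1, 1)], [(-1, 1), (1, -1)]]
```

The fixpoint iteration of GTES checks something like `is_nash` at the end of the local moves; when that check fails everywhere, a minmax-style criterion is one candidate for the "lesser" equilibrium mentioned above.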
I plan to continue developing GTES using two key problems:
- SIFOA : optimizing the organization and allocation of communication flows between channels
- CGS (Cellular Game Simulation) : competition amongst cellular phone operators (for instance, introduction of a new player)
I have now added a third topic, which is to study a crude global model that binds together energy production, the world economy and global warming. I have coined a new acronym (GWDG: Global Warming Dynamic Games) and started to collaborate with my former colleagues of the e-Lab. It is too early to tell, but I believe that I am onto something good :)
Objectives:
Improve the characterization of equilibria (beyond Nash equilibria)
Publish a research paper in English :)
Apply GTES to global warming : GWDG !
Improve the modelling of competition between operators on a mass-market
This chapter of my research agenda has evolved with my course at Polytechnique (which was not part of my previous plan). Where I had a more focused interest in "Biology of Distributed Information Systems", I now have a broader interest in understanding/explaining IT "as a system". My previous topics of interest:
Quality of Service as an emerging property (part of Autonomic Computing)
Distributed Data Architecture
Information Systems as Complex Systems
are still on my research agenda, but they are now part of a larger goal to formalize the know-how that I have acquired over the years. More precisely, a fresh set of topics has been added:
measuring IS complexity
Sustainable development of IT
Service-Oriented Architecture
I still need to learn more about system theory. This inquiry is actually useful for both my first and third topics. Complexity is a key issue; as I have said in my previous post, I now see a CIO's job mostly as mastering the evolving complexity of the information system.
I also plan to write a book eventually, but I am far from ready yet. The goal would be to write a book very different from my previous two, with more meaning and fewer sentences (i.e., a more abstract, hence elegant, book).
A related topic, which is also related to autonomic computing, is how to build autonomic network systems. They could be smart grids (a hot topic nowadays) or smart home networks. Autonomous/autonomic networks are an answer to the complexity issue, so these two topics are linked. This is also where biology may be used as a model (biomimicry), hence the topic of my blog "Biology of Distributed Information Systems".
Objectives:
Improve my course at Polytechnique :)
Simplify through abstraction the "lessons from the pit" learned while being a CIO
Build a systemic description of IS
explore the concept of autonomous distributed networks as a cure for complexity