Abstract:
Architecting distributed software applications is a complex design activity. It involves making a number of inter-dependent design decisions that relate to a range of design concerns. Each decision requires selecting among several alternatives, each of which affects various quality attributes differently. Additionally, a number of stakeholders usually participate in the decision-making process with different, often conflicting, quality goals, and the project is further bound by constraints such as cost and schedule. To facilitate the architectural design process, we propose a quantitative, quality-driven approach that attempts to find the best possible fit between conflicting stakeholders' quality goals, competing architectural concerns, and project constraints. The approach uses optimization techniques to recommend the optimal candidate architecture. The applicability of the proposed approach is assessed using a real system.
Introduction:
Designing the software architecture (SA) of a distributed application is widely recognized as an important and complex activity. Its importance lies in the fact that the architecture embodies several design decisions that will be difficult and costly to change downstream if they are subsequently discovered to be flawed. It is complex because the architect must make difficult design trade-offs to meet competing architectural requirements [1].
Further difficulties arise due to the early life-cycle nature of architecture design. As few or no concrete artifacts typically exist at this stage, it is hard, often impossible, to thoroughly reason about the consequences of many design decisions. This is especially true of design decisions that are embodied in off-the-shelf distributed component infrastructures (e.g. J2EE, .NET, CORBA) that are often utilized in the architecture. For these reasons, architects use prototypes and previous experience to justify their designs.
Additionally, a number of stakeholders are often involved in the design process, each with potentially conflicting quality goals. Furthermore, software design is often constrained by project cost and schedule, which must always be satisfied. In an attempt to help alleviate architectural design complexity, this research aims to provide techniques and tools that support disciplined reasoning during the SA design process. This work is motivated by the need to:
Help the various stakeholders of a system express their desired quality goals in a measurable form, and formalize the process of prioritizing those goals.
Help architects determine the candidate architecture that best satisfies stakeholders' quality goals and meets the stated project constraints.
This paper describes a quality-driven design approach, ArchDesigner, that promotes a disciplined engineering and reasoning framework during SA design. The novelty of our approach lies in the use of optimization techniques, particularly Integer Programming [2], to optimize an SA design comprising multiple inter-dependent design decisions.
ArchDesigner improves upon previous approaches, which evaluate and select among given coarse-grained SAs [3, 4] without giving guidance on how to arrive at these architectures. We argue that evaluating all candidate SAs is a difficult, often impossible, task, since the number of candidate SAs can be very large. Our approach therefore evaluates and selects among candidate SAs in a fine-grained fashion, thereby helping stakeholders arrive at a suitable SA solution. We assess the applicability of ArchDesigner using a case study of a deployed system that had several stakeholders with different quality goals, multiple inter-dependent design decisions, and project constraints.
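To make the role of the optimization concrete, the sketch below casts the selection problem as a 0-1 integer program in Python using the open-source PuLP library; any integer programming solver would serve equally well. This is an illustration rather than ArchDesigner's implementation: the design decisions, alternatives, value scores (whose computation is described in the next section), costs, budget, and dependency constraint are hypothetical placeholders.

```python
# Illustrative 0-1 integer program for architecture-alternative selection.
# All data below are hypothetical; PuLP's bundled default solver is used.
from pulp import LpBinary, LpMaximize, LpProblem, LpVariable, lpSum

# Value score of alternative i for design decision j (see next section).
value = {
    "communication": {"JMS": 0.52, "RPC": 0.31, "SOAP": 0.17},
    "persistence":   {"entity_beans": 0.44, "JDBC_DAO": 0.56},
}
# Estimated cost of adopting each alternative, and the project budget.
cost = {
    "communication": {"JMS": 30, "RPC": 10, "SOAP": 15},
    "persistence":   {"entity_beans": 25, "JDBC_DAO": 12},
}
budget = 50

prob = LpProblem("architecture_selection", LpMaximize)

# x[j][i] = 1 if alternative i is selected for design decision j.
x = {j: {i: LpVariable(f"x_{j}_{i}", cat=LpBinary) for i in alts}
     for j, alts in value.items()}

# Objective: maximize the total value score of the selected alternatives.
prob += lpSum(value[j][i] * x[j][i] for j in value for i in value[j])

# Exactly one alternative must be chosen for every design decision.
for j in value:
    prob += lpSum(x[j][i] for i in value[j]) == 1

# Project constraint: the cost of the selected alternatives stays in budget.
prob += lpSum(cost[j][i] * x[j][i] for j in cost for i in cost[j]) <= budget

# Hypothetical inter-decision dependency: entity beans require JMS messaging.
prob += x["persistence"]["entity_beans"] <= x["communication"]["JMS"]

prob.solve()
selected = {j: next(i for i in value[j] if x[j][i].value() > 0.5) for j in value}
print(selected)  # {'communication': 'JMS', 'persistence': 'JDBC_DAO'} for this data
```

The binary variables encode the choice of one alternative per decision; additional linear constraints of the same form can capture further project constraints and dependencies between design decisions.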
The ArchDesigner Approach:
Computation of Value Scores:
The AHP Process:
AHP comprises four main steps:
Preparation: this step articulates the different elements involved in the decision-making process for the designated design decision. It entails identifying the design alternatives, the quality attributes that will be used to evaluate these alternatives, and the stakeholders participating in the decision.
Weighting Quality Attributes: the aim of this step is to determine the relative weight for every quality attribute.
Weighting Alternatives' Quality Support: next, we determine how well each design alternative supports the relevant quality attributes relative to the other alternatives.
Computing Value Scores: we can now compute the value score $V_{ij}$ for alternative $i$ of design decision $j$ using the following formula:
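One formulation consistent with these steps is the standard AHP weighted sum. Writing $w_k$ for the relative weight of quality attribute $k$ (step 2) and $s_{ijk}$ for the relative support that alternative $i$ of decision $j$ provides for attribute $k$ (step 3):

$$V_{ij} = \sum_{k} w_k \, s_{ijk}$$

These per-alternative value scores are the quantities that the integer-programming selection sketched in the Introduction trades off across the inter-dependent design decisions. The short Python sketch below illustrates steps 2-4 under common AHP conventions: priority weights are approximated from a pairwise comparison matrix by the row geometric mean, and value scores are the weighted sums above. The attribute names, comparison judgments, and support scores are hypothetical.

```python
# Minimal sketch of AHP steps 2-4, assuming the usual 1-9 comparison scale.
# Attribute names, pairwise judgments, and support scores are hypothetical.
import math

def ahp_weights(pairwise):
    """Approximate AHP priority weights from a pairwise comparison matrix
    using the row geometric mean, normalized to sum to 1."""
    n = len(pairwise)
    geo = [math.prod(row) ** (1.0 / n) for row in pairwise]
    total = sum(geo)
    return [g / total for g in geo]

# Step 2: pairwise[a][b] states how strongly attribute a is preferred over b.
# Attributes, in order: performance, availability, modifiability.
attr_pairwise = [
    [1,     3,     5],
    [1 / 3, 1,     2],
    [1 / 5, 1 / 2, 1],
]
w = ahp_weights(attr_pairwise)  # attribute weights w_k, roughly [0.65, 0.23, 0.12]

# Step 3: relative support s_ijk of each alternative per attribute; shown
# directly here, though in full AHP these also come from pairwise comparisons.
support = {
    "JMS": [0.6, 0.3, 0.2],
    "RPC": [0.4, 0.7, 0.8],
}

# Step 4: value score V_ij = sum_k w_k * s_ijk for each alternative i.
value = {alt: sum(wk * sk for wk, sk in zip(w, s)) for alt, s in support.items()}
print({alt: round(v, 3) for alt, v in value.items()})
```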