Design typically answers the ‘how’ questions: how will the solution requirements be achieved?
- Planning how the solution will function and its appearance. Solution design typically involves identifying what specific data is required and how that data will be named, structured, validated and manipulated. Typical design tools for this purpose include data dictionaries, data structure diagrams, input-process-output (IPO) charts, flowcharts, pseudocode and object descriptions. Solution design also involves, where appropriate, showing how the various components of a solution relate to one another, for example web pages, style sheets and scripts; queries, forms and reports; modules, procedures, methods and functions. Typical design tools used to show these relationships include storyboards, site maps, entity-relationship diagrams, data flow diagrams, structure charts, hierarchy charts and context diagrams.
- Planning the solution also involves determining its appearance, including, where appropriate, the user interface. This typically involves identifying the position and size of text, images and graphics, font types, colours and text enhancements. Design tools used for this purpose include layout diagrams and annotated diagrams/mock-ups.
- Determining the evaluation criteria. What measures will be used to judge whether or not the solution meets the requirements? These criteria should arise from the solution requirements identified in the analysis stage.
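One of the design tools listed above, the data dictionary, can be sketched in code. The example below is a hypothetical entry for two fields in an imagined club-membership solution; all field names, types, sizes and validation rules are assumptions for illustration only:

```python
# A minimal data dictionary sketched as a Python structure.
# Every field name, type, size and rule here is hypothetical.
data_dictionary = {
    "member_name": {
        "type": "string",
        "size": 50,
        "format": "title case",
        "validation": "must not be empty",
    },
    "member_age": {
        "type": "integer",
        "size": 3,
        "format": "whole number",
        "validation": "between 5 and 120 inclusive",
    },
}

# Print each entry, much as a data dictionary table would be read.
for field, attrs in data_dictionary.items():
    print(f"{field}: {attrs['type']}, validation: {attrs['validation']}")
```

In a real project this information would normally live in a design document rather than in code, but recording it in one place keeps naming, structure and validation decisions consistent during development.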
Development typically answers the question: how do we transform the solution designs into a working solution through the use of digital systems?
- Electronically ‘building’ or creating the solution following the initial designs. Doing so may, however, warrant modifying those designs in order to create a working solution.
- Validating data as it is input to check that it is reasonable. Validation can be either manual or electronic. Proofreading is a manual technique in which a human scans the data for errors; electronic validation occurs when the validation process is built into the solution itself. Its effectiveness is determined through the testing activity.
- Testing whether the solution does what it was intended to do. This activity typically involves:
  - establishing what tests will be conducted
  - determining what test data will be used
  - determining expected results
  - conducting the tests
  - recording the actual results
  - correcting any identified errors.
- Writing internal and user documentation, including within the user interface, to support the functioning and use of the solution.
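The electronic validation described above can be sketched as a small input check. The field (an age) and the acceptable range of 5 to 120 are hypothetical assumptions, not rules from the source:

```python
def validate_age(raw: str) -> tuple[bool, str]:
    """Electronic validation: a type check followed by a range check.

    The 5-120 range is a hypothetical business rule for illustration.
    """
    if not raw.strip().isdigit():      # type check: must be a whole number
        return False, "Age must be a whole number"
    age = int(raw)
    if not 5 <= age <= 120:            # range check: reasonableness of the value
        return False, "Age must be between 5 and 120"
    return True, "OK"

print(validate_age("17"))   # valid input
print(validate_age("abc"))  # rejected by the type check
print(validate_age("300"))  # rejected by the range check
```

Note that validation only checks that input is reasonable, not that it is correct: an age of 17 passes the checks even if the person is actually 18.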
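The testing steps above (establishing tests, choosing test data, determining expected results, conducting the tests and recording actual results) can be sketched as a small test table. The function under test and its test data are hypothetical:

```python
def discount(price: float, member: bool) -> float:
    """Hypothetical function under test: members receive 10% off."""
    return round(price * 0.9, 2) if member else price

# Test table: (test data, expected result), decided before testing begins.
tests = [
    ((100.0, True), 90.0),    # member pays 90% of the price
    ((100.0, False), 100.0),  # non-member pays full price
    ((19.99, True), 17.99),   # result rounded to two decimal places
]

# Conduct each test, record the actual result and compare it
# with the expected result; a FAIL indicates an error to correct.
for args, expected in tests:
    actual = discount(*args)
    status = "PASS" if actual == expected else "FAIL"
    print(f"{args} -> expected {expected}, got {actual}: {status}")
```

Recording expected results before running the tests, as the table does, is what makes it possible to tell a passing test from a failing one.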
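Internal documentation of the kind mentioned above often takes the form of docstrings and comments within the code. A minimal sketch, using a hypothetical function:

```python
def average_score(scores: list[float]) -> float:
    """Return the mean of a list of scores.

    Internal documentation: this docstring records the function's
    purpose, parameter and return value for future maintainers.
    Raises ValueError on an empty list rather than dividing by zero.
    """
    if not scores:                      # guard against division by zero
        raise ValueError("scores must not be empty")
    return sum(scores) / len(scores)

print(average_score([70.0, 80.0, 90.0]))  # prints 80.0
```

User documentation, by contrast, lives outside the code, for example in help screens, tooltips within the user interface, or a user guide.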
Evaluation typically answers the question: ‘How well did the solution meet its stated requirements?’
- Determining a strategy for finding out the extent to which the solution meets the required needs of the user. Typically this occurs after the solution has been developed. An evaluation strategy usually specifies a timeline, what data will be collected and by what methods and techniques, and how that data relates to the criteria generated in the design stage.
- Reporting on the extent to which the solution meets the requirements of the user. This usually takes place after the solution has been used by the user/client and is based on the criteria generated in the design stage.