There may be hundreds of models created to guide the design of instruction. "Survey of Instructional Development Models" by Gustafson and Branch (2002) reviews 15 of them. This is my 10-page synopsis of their survey.
INTRODUCTION
SUGGESTION: Select and modify an ID model based on the specific context of the project.
TAXONOMY OF ID MODELS
Models are built with many implicit assumptions about the conditions of development and delivery.
Different taxonomy systems exist, but this one is based on three categories tied to the context of development and delivery:
1) Individual classroom, 2) product to be implemented by other users, 3) larger and more complex system
Classroom-oriented models (mostly for professional teachers, who have little time for development, limited resources, and lessons/activities delivered perhaps once a year):
Model 1: Gerlach and Ely (1980)
Specify objectives/content
Assess learners
Determine strategy, organization of groups, allocation of time, allocation of space, select resources
Evaluate performance
Analyze feedback and refine
Model 2: Heinich, Molenda, Russell and Smaldino (1999)
ASSURE (presented in the most widely adopted college text on instructional media for teachers). It matches the realities of K-12 classrooms.
A- analyze learners (identify general entry characteristics; entry competencies such as prior knowledge and technical vocabulary; and "learning style," which the authors acknowledge is a problematic construct)
S- state objectives (ABCD: audience, behavior, conditions, and degree)
S- select media and materials (may adapt these or in rare cases develop them but time limitations usually push this audience to select existing media and materials)
U- utilize media and materials (plan how to use resources)
R- require learner participation (keep learners actively involved, use feedback and practice)
E- evaluate and revise (did learners achieve the objectives? Was the instruction feasible? Revise based on discrepancies between intended and actual outcomes and on deficiencies in media, methods, or materials.)
Model 3: Newby, Stepich, Lehman and Russell (2000)
PIE model: planning, implementing, evaluating, with a focus on creating a learner-centered rather than teacher-centered classroom by redefining roles. Computers can play a major role.
Phase 1: Plan by gathering info about learner, content and setting.
Phase 2: Implement using media, methods and computer.
Phase 3: Evaluate both student performance and your own instruction
Model 4: Morrison, Ross and Kemp (2001)
Defined by six key factors that influence the whole process and nine elements, any of which can serve as the starting point of the process.
1. What level of readiness do individual students need for accomplishing the objectives?
2. What instructional strategies are most appropriate in terms of objectives and student characteristics?
3. What media or other resources are most suitable?
4. What support is needed for successful learning?
5. How is achievement of objectives determined?
6. What revisions are necessary if a tryout of the program does not match expectations?
Based on the above key factors there are nine elements to attend to:
1. Identify instructional problems and specify goals for the instructional program
2. Examine learner characteristics that will influence decisions
3. Identify subject content and analyze task components related to goals
4. Specify instructional objectives
5. Sequence content within each instructional unit for logical learning
6. Design instructional strategies so that each learner can master objectives
7. Plan the instructional message and develop instruction
8. Develop evaluation instruments to assess objectives
9. Select resources to support instruction and learning activities
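A distinctive claim of this model is that the nine elements are non-linear: any element can begin the process. As a rough sketch (my own illustration, not anything from Morrison, Ross and Kemp), that structure behaves like a ring that can be entered at any point:

```python
# Illustrative sketch only (not from Morrison, Ross and Kemp): the nine
# elements modeled as a ring that can be entered at any element, matching
# the model's claim that any element can start the process.

ELEMENTS = [
    "identify instructional problems and goals",
    "examine learner characteristics",
    "identify content and analyze tasks",
    "specify instructional objectives",
    "sequence content",
    "design instructional strategies",
    "plan message and develop instruction",
    "develop evaluation instruments",
    "select resources",
]

def one_pass(start):
    """Visit all nine elements once, beginning anywhere in the ring."""
    i = ELEMENTS.index(start)
    return ELEMENTS[i:] + ELEMENTS[:i]
```

For example, `one_pass("specify instructional objectives")` still visits every element exactly once, just starting mid-ring.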
Product-Oriented Models
Assumptions:
· The product will be several hours, or perhaps a few days, of instruction in duration.
· The amount of front-end analysis varies.
· The product is assumed to be technically sophisticated.
· Users may have no contact with the developers except during prototype tryout (except in rapid prototyping models, where user interaction is a central feature of the process).
· The instructional product is needed.
· Something needs to be produced rather than selected or modified.
· There will be considerable emphasis on tryout and revision.
· The product must be usable by learners with only managers or facilitators, not teachers, available (essentially standalone).
Five models are reviewed (4 new to this edition):
Model 1: Bergman and Moore (1990)
Model intended to guide and manage production of interactive multimedia products and contains 6 major activities:
1. Analysis (driven by an internal or external request for proposal—RFP—includes identifying audience, tasks, user environments, and content)
2. Design (sequence major segments and define treatment—a.k.a. a high-level design)
a. Detailed design (specify motivational elements, media, interaction strategies, assessment methodology)
3. Develop (prepare documentation for production—storyboards, audio scripts, shot lists, art and graphics renditions, a database for managing production)
4. Produce (use documentation and create media components)
5. Author (integrate different components into complete product—code, test, and tune)
6. Validate (compare finished product to original objectives and revise)
Each activity has inputs, outputs (deliverables), and evaluation strategies. The output of one activity is the input to the next, and each output should be evaluated.
Model 2: de Hoog, de Jong, and de Vries (1994)
The underlying base is rapid prototyping and the need to intertwine method, product, and tools in a comprehensive approach. Partial products:
1. Conceptual model
2. Operational model
3. Instructional model
4. Interface model
5. Learner model
For each partial product there is local spiral development using four components: compliance, quality, integration, and specificity.
Model 3: Bates (1995)
Front-end system design
The model is designed for developing open and distance learning. Its crucial principles include recognizing the need for extensive pre-planning and design for students at a distance who work largely independently, and the need for the interaction and flexibility that distance learning often lacks. The model has four phases:
1. Course outline development (ID target group, ID product place in curriculum, agree on content, agree on teaching approach)
2. Selection of media (access, cost, teaching functions, interaction/user-friendliness, organizational issues/existing facilities, novelty, speed)
3. Development/production of materials (copyright clearance, printing, produce audio and video or other computer based materials, tutorial arrangements)
4. Course delivery (warehouse, packing, mailing/transmission, tutoring, library services, student assessment, course evaluation)
Relies heavily on theories of instructional design, including building in student activities, providing clear timely feedback, and carefully structuring content.
Model 4: Nieveen (1997)
Starts with preliminary research on needs and ends with summative evaluation; in between are several iterative development cycles of analysis, design, and formative evaluation.
There are four levels, and each level may go through more than one cycle of analysis, design, and evaluation:
Level 1 deliverable: design specs (usually done by design team)
Level 2 deliverable: global materials created (expert appraisal would usually be the evaluation)
Level 3 deliverable: partially-detailed materials (for expert appraisal and small-scale tryout)
Level 4 deliverable: complete materials (expert appraisal, small scale try out and large group tryout)
Summative evaluation after release
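The layered structure above is essentially a nested loop. A minimal sketch of that structure (my own illustration: the deliverables and evaluation methods come from the summary above, but the code structure is an assumption, not part of Nieveen's published model):

```python
# Illustrative sketch, not Nieveen's specification: four levels, each of
# which may repeat its analysis/design/formative-evaluation cycle before
# its deliverable is accepted; summative evaluation follows release.

LEVELS = [
    ("design specs", ["design team review"]),
    ("global materials", ["expert appraisal"]),
    ("partially-detailed materials", ["expert appraisal", "small-scale tryout"]),
    ("complete materials", ["expert appraisal", "small-scale tryout",
                            "large-group tryout"]),
]

def evaluate(method):
    # Stub: a real project would gather formative data with this method.
    return True

def develop(max_cycles_per_level=3):
    log = []
    for deliverable, evaluations in LEVELS:
        for cycle in range(1, max_cycles_per_level + 1):
            log.append(f"analyze/design {deliverable} (cycle {cycle})")
            # Formative evaluation decides whether another cycle is needed;
            # the stub above accepts after a single cycle.
            if all(evaluate(e) for e in evaluations):
                break
    log.append("summative evaluation after release")
    return log
```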
Model 5: Seels and Glasgow (1998) ISD Model 2: For Practitioners
This model is similar to the others but is based on the assumption that design and development occur in a context of project management. As a result, the model has three management phases:
1. Needs analysis management (needs assessment of goals, performance analysis that results in instructional requirements, context analysis related to constraints, resources, and learner characteristics)
2. Instructional design management (may be linear or consist of iterative cycles and is complete when formative evaluation is satisfactory)
3. Implementation and evaluation management
Emphasis is placed on management and diffusion of results.
Systems-Oriented Models
Assumptions:
1. These models are used to develop a complete course or entire curriculum.
2. Substantial resources will be invested.
3. Team consists of highly trained developers.
4. A great deal of front-end analysis will be conducted.
These models begin with data collection to determine if instruction is the best solution to the problem (for example if the problem is motivation or environmental factors—e.g. lack of resources—then instruction may not be the best solution).
One difference between these and product-oriented models is the emphasis placed on analyzing organizational goals before developing, because these projects usually have a much larger scope.
Model 1: Branson (1975) Interservice Procedures for Instructional Systems Development (IPISD)
This model is detailed and specific to the military, which makes it difficult to apply in other contexts. It consists of 5 phases broken down into 20 sub-steps, which can in turn contain hundreds of more detailed steps.
Phase 1: Analyze: analyze job, select tasks (subtract those known or easy for learner), construct performance measures, analyze existing courses, and select setting
Phase 2: Design: develop objectives, tasks, describe entry behavior, and determine sequence and structure
Phase 3: Develop: Specify learning events, instruction management plan and delivery, Review/select existing materials, develop instruction, and validate instruction
Phase 4: Implement: implement instructional management plan, conduct instruction
Phase 5: Control: conduct internal evaluation, external evaluation then revise system
Strength: extensive specification of procedures; Limitations: narrow instructional focus and linear approach to ID
Model 2: Gentry (1994) Instructional Project Development and Management (IPDM)
Defines what needs to be done in an ID project and how to do it; intended for graduate students, practicing instructional developers, and teachers.
Model consists of two clusters: development components and supporting components with communication between the two.
Development Components:
1. Needs analysis (establish needs and prioritize goals for instruction)
2. Adoption (establish acceptance by decision makers, and obtain commitment of resources)
3. Design (specify objectives, strategies, techniques and media)
4. Production (construct project elements specified by the design)
5. Prototyping (assemble, pilot, test, validate and finalize)
6. Installation (establish necessary conditions for effective operation of a new instructional product)
7. Operation (maintain the instructional product after it is installed)
8. Evaluation (collect, analyze, and summarize data to enable revision decisions)
Supporting components:
1. Management (process used to control, coordinate, integrate and allocate resources to accomplish goals)
2. Information handling (process used to select, collect, generate, organize, store, retrieve, distribute and assess information required by project)
3. Budget/resources allocation (processes used to determine resource needs, formalize budgets, acquire and distribute resources)
4. Personnel (process used to determine staff needs, hire, train, assess, motivate, counsel, censure, and dismiss members)
5. Facilities (process used to organize and renovate spaces for design, implementation, and testing)
Model 3: Dorsey, Goodrum and Schwen (1997) Rapid Collaborative Prototyping (course and perhaps product based)
Users play a central role in the development process, and designers collaborate on teams with users. The model suggests starting with low-fidelity prototypes and running a series of tightly spaced iterative tests. Each iteration consists of a user test that informs conceptualization and building. This happens as you move through 5 cycles: create a vision, explore conceptual prototypes, experiment with hands-on mock-ups, pilot test working prototypes, and fully implement the evolving vision.
Principles:
Process: speed is critical, iteratively modify the prototype several times. Seek alternatives not just modifications.
Interaction: regard the user as designer, avoid use of technical language, maintain consistent communication.
Fidelity: employ low fidelity to get early feedback and high fidelity to get quality feedback; a prototype is effective if it allows the user to give pertinent feedback
Feedback: capture what the user likes and doesn't like; if the user doesn't want it fixed, don't fix it; gather micro-, mini-, and macro-level data.
MORE CONCEPTUAL THAN OPERATIONAL: Details on how to implement it are lacking!
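Since implementation details are left open, here is one possible reading of the five cycles (an assumption on my part, not the authors' specification): an iterative loop in which each cycle runs a user test whose feedback shapes the next, higher-fidelity build.

```python
# Illustrative sketch, not from Dorsey, Goodrum and Schwen: the five cycles
# as an iterative loop where users act as co-designers. Each cycle builds
# an artifact informed by the previous cycle's user feedback.

CYCLES = [
    "create a vision",
    "explore conceptual prototypes",
    "experiment with hands-on mock-ups",
    "pilot test working prototypes",
    "fully implement the evolving vision",
]

def run_cycles(user_test):
    """user_test(artifact) returns feedback used to shape the next cycle."""
    feedback = None
    history = []
    for cycle in CYCLES:
        artifact = (cycle, feedback)    # conceptualize + build on prior feedback
        feedback = user_test(artifact)  # users as designers, every cycle
        history.append(artifact)
    return history

# Usage: a trivial user-test stub that just echoes the cycle name.
history = run_cycles(lambda artifact: f"feedback on {artifact[0]}")
```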
Model 4: Diamond (1989)
This model is specific to higher education institutions and is sensitive to political and social issues in the organization.
Phase 1: Project selection and design
· Determine feasibility and desirability of launching project
Phase 2: Production, implementation and evaluation. In this phase each unit of the course or curriculum proceeds through a seven step process.
1. Determine objectives
2. Design of evaluation instruments and procedures
3. Select instructional format
4. Examine and select existing materials
5. Produce new materials/modify existing materials and field test
6. Coordinate logistics for implementation
7. Full-scale implementation, evaluation and revision
Model 5: Smith and Ragan (1999)
Phases: analysis, strategy, and evaluation, with more of a cognitive psychology base than most models.
Phase 1: Analysis (learning environment—prepare description of learning environment, learners—describe stable and changing characteristics of learner audience, learning task—recognize and write instructional goals, write test items)
Phase 2: Strategy (determine instructional strategies, deliver strategies, manage strategies, write and produce instruction)
Phase 3: Evaluation (formative evaluation and revision of instruction)
Model 6: Dick, Carey and Carey (2001)
Model is as follows:
1. Identify Instructional Goal(s) (à la Gagné)
2. Conduct Instructional Analysis
3. Analyze Learners and Contexts
4. Write Performance Objectives
5. Develop Assessment Instruments
6. Develop Instructional Strategy
7. Develop and Select Instructional Materials
8. Design and Conduct Formative Evaluation of Instruction
9. Revise Instruction
10. Design and Conduct Summative Evaluation
Conclusion
Many models have not been validated; indeed, some may never even have been used! Case studies provide only a low level of validation, and even those are lacking.
VALIDATION needs to require PRECISE description of the elements of the model followed by SYSTEMATIC DATA COLLECTION concerning their application and the IMPACT of the RESULTING INSTRUCTION.
Be alert to possible discrepant or negative data.
Study what conditions should be present to use the model.
Model validation (see Rubinstein 1975, Patterns of Problem Solving)
DEVELOPERS: acquire a WORKING knowledge of several models, making certain that all three categories (classroom, product, and system) are represented. Be able to use the right one, perhaps with modification, for the right job.