Identifying Goal Areas
Goals should be written for those areas of need that will have a specific learning outcome (e.g., reading skills development, self-care independence, etc.). In some need areas, goals are not written. For example, transportation may be a need area, but a goal is not written. Physical modifications may be necessary for participation in certain activities and environments, but a goal is not written. While these are areas of need, there are no specific learning outcomes intended for the student.
Discussion of the skill performance expectations for the student’s age or grade level.
Evaluation of the student’s current performance (consider teacher/parent/service provider input, grades, attendance, group achievement scores, informal observations, etc.)
Review of progress on previous IEP goals.
The team's discussion should include a student-centered consideration of what achieving this goal will do for the student (e.g., level of independence, living skills, etc.).
A “new” area is one that is not closely related to existing areas that are addressed through special education services and supports. For example, if math services and supports are added to an IEP that previously addressed only reading, math is a “new” area; if services and supports promoting multiplication and division success are added to an IEP that previously addressed addition and subtraction, multiplication and division reflect a normal progression within math and are not “new.”
Neither an evaluation nor the general education intervention process is required to add a goal/service to the IEP of a special education student. However, this process should be considered whenever the needs of a special education student might reasonably be addressed without special education services or supports. The data supporting the addition of services and supports in a new area must be documented in the IEP.
IEP teams need to base their decision on sound data. Data may come from a variety of sources. When considering additional needs, the data generated through the general education classroom that reflects an individual’s progress in the context of assistance provided to the student is the first source to consider. Data must clearly support the conclusion that identified needs require special education services and supports, rather than other forms of assistance.
State the IEP team’s conclusion regarding the individual’s need for special education services and supports in the new area, and summarize the information that supports this conclusion, including the impact of previous efforts.
Determining Progress Monitoring Procedures
Rate - Given a window of time, how many problems can the student complete successfully, how many words can they read, how many word sequences can they write, etc.?
Accuracy - What is the student's ratio of successful to unsuccessful responses?
Rate/Accuracy - Given a window of time or a certain number of opportunities, how much of the time (rate) or in how many opportunities (accuracy) was the student successful? The determination of success/non-success is binary. They either were or they weren't successful (e.g., being on-task or off-task).
Quality - How successful was the student in performing the skill? Instead of a binary determination of success/non-success, the degree of success in performing a skill is evaluated.
Independence - A prompt hierarchy is defined (e.g., from independent performance all the way to hand-over-hand physical prompting) and either the whole skill (e.g., greeting someone) or separate components of the skill (e.g., toothbrushing) are rated based on the level of prompting needed to elicit a successful response.
Duration - How long does the student engage in a specific desirable activity or stay in a desired environment? Similar to rate, this is a binary determination.
Latency - After a student receives a cue, how long does it take for them to respond?
The SEL PLAAFP, IEP Goal, & LRE Examples document includes more detailed guidance in this area.
Common examples include: Curriculum-Based Measurement (CBM) probes (e.g., ORFs, CWSs), rating forms (multiple choice category labels used to rate performance - e.g., Direct Behavior Ratings), rubrics (rating forms with a detailed description of what performance looks like at each rating level rather than a simple label), checklists (rating forms with a yes-no binary determination for each item), and direct observations (e.g., Momentary Time Sampling, duration, etc.).
Permanent products (e.g., rate of homework completion) are generally not recommended for IEP goal progress monitoring and only recommended as a secondary data collection tool to confirm intervention effectiveness.
A special education teacher or paraprofessional trained in administration of CBMs? A student's classroom teacher(s) when applicable? A service provider who is working with the student (e.g., an SLP)?
The interval should be reasonably calculated to be:
Infrequent enough to capture new data when we can reasonably expect a change in performance. For example, we cannot reasonably expect a student's skill proficiency to meaningfully change on a daily basis. If data is collected daily (e.g., a daily point sheet), the average or median of that data is typically graphed less frequently (e.g., weekly) in order to capture trends.
Frequent enough to allow the team to respond to inadequate progress. For example, if we collect data 1x/month, it will take far too long (at least a quarter of an annual IEP) to determine a trend of inadequate progress and make changes to our intervention.
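For teams that track point-sheet data electronically, the daily-to-weekly aggregation described above can be sketched in code. This is a minimal illustration only; the scores and the percent-of-points scale are hypothetical, not taken from this guidance:

```python
from statistics import median

# Hypothetical daily point-sheet scores (percent of points earned),
# collected Monday through Friday for three school weeks.
daily_scores = [
    [60, 55, 70, 65, 60],  # week 1
    [65, 70, 60, 75, 70],  # week 2
    [75, 70, 80, 75, 85],  # week 3
]

# Graph one value per week: the median smooths day-to-day variability
# so the graphed points reflect the weekly trend rather than daily noise.
weekly_medians = [median(week) for week in daily_scores]
print(weekly_medians)  # [60, 70, 75]
```

The same weekly median can, of course, be computed by hand or in a spreadsheet; the point is that one summary value per week, not every daily score, is what gets graphed.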
Writing Measurable Goals & Objectives
Time Frame: By when do we want the goal to be achieved?
Condition: Under what circumstances will we be measuring the student's performance?
Behavior: What behavior/skill will we be measuring, and what aspect of it (e.g., rate, quality, independence, duration, etc.)?
Criterion: What performance level will indicate that the goal has been achieved? We strongly encourage developing goals that monitor an increase in a desired skill/behavior versus simply a reduction in an undesired behavior.
By [due date] when [measurement condition], XXX will increase [skill performance] from [baseline] to [target] as measured by [description of measurement tool, frequency of data collection, and individual(s) responsible].
By January 2018 (time frame), when presented with unrehearsed 5th grade level passages (condition), Cindy will increase her rate of reading orally (behavior) from 84 correct words per minute to 121 correct words per minute with at least 95% accuracy for two consecutive data points (criterion), as measured weekly by special education staff using oral reading fluency probes.
By October 2018 (time frame), when Beverly exhibits visible signs of agitation (clenched fists, scowling, angry voice tone) (condition), she will increase her rate of using anger control strategies (behavior) from 30% of opportunities to 65% for two consecutive data points (criterion), as measured every other week using a direct behavior rating completed by teachers working directly with Beverly.
By May 2018 (time frame), when observed in all school settings (condition), Randy will increase his rate of keeping a daily to-do list, using effective time management strategies, and maintaining organized materials (behavior) from a level of 21/55 points obtained on an organizational skills inventory completed by special education staff working directly with Randy to 44/55 points for two consecutive data points (criterion), as measured every other week.
By September 2018 (time frame), when presented with 30 tasks, 10 each of synonyms, analogies, and multiple meanings (condition), George will increase his rate of correct response (behavior) from a current level of 4/30 tasks completed correctly to a level of 25/30 tasks completed correctly for two consecutive data points (criterion), as measured weekly by a Speech Language Pathologist working with George during small group instruction.
By July 2018 (time frame), when presented with a toothbrushing task (condition), Adam will increase his ability to complete a toothbrushing routine (behavior) from currently obtaining 8 out of 24 points on the toothbrushing rubric to obtaining 16 out of 24 points for two consecutive data points (criterion), as measured weekly by special education staff using the toothbrushing rubric.
Within the same IEP, teams may use benchmarks, short-term objectives, or both. However, these two measures should not be mixed under the same annual goal. Both break learning into measurable, intermediate steps that enable families, students, and educators to monitor progress during the year.
In order to meet minimal compliance standards in the state of Minnesota, IEP goals must include at least two benchmarks/short-term objectives (3525.2810 subp. 1(a-2)). It is recommended that they be intentionally plotted so that they are evenly spaced throughout the duration of the IEP. Some case managers choose to schedule their benchmarks/objectives near the end of grading periods when progress reports will be sent out; however, this is by no means mandatory.
Benchmarks are the most common method. When feasible, they are preferred. Use benchmarks when the skill or behavior leading to the annual goal remains essentially the same but some aspect of performance is expected to change (e.g., rate, duration, quality, independence, etc.). For example, for the first benchmark a student might be expected to read 43 words correctly on an Oral Reading Fluency probe, for the second benchmark they are expected to read 56, etc.
Objectives are more frequently used with students with significant needs. They are used when the skills leading to the annual goal are different. This typically occurs when the IEP team has engaged in task analysis and broken a sequence of skill performance down into component steps. For example, the first objective for a tooth-brushing routine might focus on picking up the toothbrush independently. The second objective might focus on applying toothpaste, etc.