Goals
The Bottom Line
IEP goals must:
Be measurable and meaningful for the student
Be designed to meet the student's academic and/or functional educational needs that result from their disability to enable them to be involved in, and make progress in, the general education curriculum
Be designed to meet each of the student's other educational needs that result from their disability
Include at least two measurable benchmarks or objectives
Go-To Tools
The handy tools that help you get the job done efficiently.
⚒️ Selecting a measurement tool:
Literacy:
This resource points you to a tool that matches your intervention.
For functional reading/writing, reference the goal examples doc.
Math:
This resource points you to a tool that matches your intervention.
For functional math, reference the goal examples doc.
SEL/Behavior:
This index of goal examples includes links to rating scales (e.g., DBRs), rubrics, etc.
This logistics doc helps you make a sustainable plan for data collection.
Daily Living, Motor, Employability +: This index of goal examples (in development) includes links to rating scales, rubrics, checklists, etc.
Speech/Language: This index of goal examples (in development) will include links to example measurement tools.
🎯 Determining an appropriate goal target:
The Target and Rate of Improvement (ROI) Packet provides all the data and guidance you need.
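The basic arithmetic behind a target projection can be sketched as baseline plus expected weekly growth times the number of weeks. The function below is an illustrative assumption, not a tool from the ROI Packet; the growth-rate value is invented for the example.

```python
# Hypothetical sketch of the arithmetic behind a goal target.
# The growth-rate value below is illustrative, not from the
# Target and Rate of Improvement (ROI) Packet itself.

def goal_target(baseline: float, roi_per_week: float, weeks: int) -> float:
    """Project a goal target from a baseline and a weekly rate of improvement."""
    return baseline + roi_per_week * weeks

# e.g., a student reading 84 words correct per minute, assuming growth of
# ~1.0 wcpm/week over a 36-week IEP year:
print(goal_target(84, 1.0, 36))  # → 120.0
```

In practice the ROI Packet supplies the growth rates and grade-level targets; the sketch only shows how those pieces combine.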
✍️ Writing the goal in the IEP:
Example templates that can be pasted into SpEd Forms and individualized:
📈 Graphing progress monitoring data:
Visit our eduCLIMBER training hub on the SCRED website for a step-by-step tutorial.
IEP TV
What does it look like to discuss this topic during an IEP team meeting?
Check out these IEP meeting demo video(s) that model how to facilitate the conversation.
Writing a Goal That Works
We want to make data-driven, objective decisions about the instruction and supports we're providing to our students. Reaching that point requires a good deal of planning, coordination, and follow-through (i.e., infrastructure). Watch the video below (11:30) to learn more.
Identify the student’s needs and what skills you will teach them
We need to understand the "current state" of the student's proficiency in all areas of functioning (e.g., math, literacy, social emotional learning). Example reference points include the most recent evaluation report; progress on previous IEP goals; recent assessments, grades, attendance data, and discipline data; and input from teachers/family members/service providers. The Student Profile in eduCLIMBER is a handy tool for conveniently pulling much of this data together.
We compare this with the "desired state" — i.e., the performance of typically developing peers at the student's age/grade level. Example reference points include grade-level targets (e.g., data from the Target and Rate of Improvement (ROI) Packet) or grade level standards (e.g., in academics or social emotional learning).
This comparison is how we identify the "gaps" that we need to close with specialized instruction and supports. Our objective with specialized instruction is not to broadly cover grade-level standards (which is still the job of core instruction); instead, the purpose of specialized instruction is to narrow in on the array of skills where the student has a significant gap in proficiency/performance with typically developing peers.
Decide what skill(s) you want to measure
What We Teach vs. What We Measure
In the calendar year of an IEP, we will always teach the student far more skills than we can reasonably measure via progress monitoring. We won't write IEP goals for hundreds of skills.
Instead, we consider the array of skills we're going to teach and narrow down which skill(s) will be effective "barometers" that — as they rise, fall, or stay flat — will be indicative of the student's overall progress in that area of learning. ⚠️ We are not "prioritizing" skills; that framing is inaccurate. More accurate language is found in this example PWN statement:
"The team considered adding a goal in the area of following instructions but decided against that option because John’s progress in the goal area of staying on task will be indicative of his skill growth in multiple identified social-emotional areas of need, including following instructions. John will continue to receive specialized instruction in a variety of interrelated skills in this area."
When you're discussing adding/removing goals during the IEP meeting, we recommend that you use similar language to communicate to families about the relationship between goals/measurement and instruction.
Is there a learning outcome?
Note that goals are not written in every need area. For example, transportation may be a need area, and physical modifications may be necessary for participation in certain activities and environments, but a goal is written for neither. Goals should be written for areas of need with a specific intended learning outcome (e.g., reading skills development, self-care independence, etc.); transportation and physical modifications are areas of need, but they carry no specific learning outcome for the student.
Decide what tools to use + plan logistics of collecting the data
NOTE: The eduCLIMBER --> Interventions page on the SCRED website ('Creating Interventions' heading) answers the questions below across learning areas.
What aspect of the student's performance will we measure?
Academic progress monitoring typically measures rate and/or accuracy:
Rate - Given a window of time, how many problems can the student complete successfully, words can they read successfully, word sequences can they write successfully, etc.?
Accuracy - What is the student's ratio of successful versus unsuccessful responses?
Non-academic progress monitoring typically measures one of the following:
Rate/accuracy - Given a window of time or a certain number of opportunities, how much of the time (rate) or in how many opportunities (accuracy) was the student successful? The determination of success/non-success is binary. They either were or they weren't successful (e.g., being on-task or off-task).
Quality - How successful was the student in performing the skill? Instead of a binary determination of success/non-success, the degree of success in performing a skill is evaluated.
Independence - A prompt hierarchy is defined (e.g., from independent performance all the way to hand-over-hand physical prompting) and either the whole skill (e.g., greeting someone) or separate components of the skill (e.g., toothbrushing) are rated based on the level of prompting needed to elicit a successful response.
Duration - How long does the student engage in a specific desirable activity or stay in a desired environment? As with rate, whether the student is engaged at any given moment is a binary determination.
Latency - After a student receives a cue, how long does it take for them to respond?
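For the two most common academic metrics above, the underlying calculations are simple. The sketch below is illustrative only (the function names and probe numbers are invented for the example, not taken from any measurement tool):

```python
# Illustrative sketch (not an official tool): computing the two most common
# academic progress monitoring metrics from a single probe session.

def rate_per_minute(correct: int, minutes: float) -> float:
    """Rate: successful responses per minute of probe time."""
    return correct / minutes

def accuracy(correct: int, attempted: int) -> float:
    """Accuracy: proportion of attempted responses that were successful."""
    return correct / attempted

# e.g., a hypothetical 1-minute oral reading fluency probe:
# 84 words read correctly out of 90 attempted
print(rate_per_minute(84, 1.0))    # 84.0 correct words per minute
print(round(accuracy(84, 90), 2))  # 0.93, i.e., ~93% accuracy
```

Note that rate and accuracy answer different questions: a student can be highly accurate but slow, or fast but error-prone, which is why many goals (like the oral reading example later in this page) specify both.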
What will we use to measure the student's performance?
The categories of tools that we use are:
General Outcome Measurement (GOM) probes — Made from items that are representative of the entire curriculum.
Skill-Based Mastery Measurement probes — Evaluating a student’s proficiency in a single target skill.
Rating Scales — A multiple choice scale used to rate performance. The levels of the scale are described with brief ~1-2 word labels (e.g., 0 = Never, 1 = Sometimes, 2 = Often, 3 = Almost Always).
Point Sheets — A type of multi-item rating scale that is typically linked to an incentive system.
Rubrics — A type of rating scale that includes a detailed description of performance at each rating level rather than a brief label.
Checklists — A type of rating scale that includes multiple items and a ‘YES-or-NO’ binary determination is made for each item.
Direct Observation — Momentary time sampling, duration, latency, etc.
NOTE: Permanent products (e.g., rate of homework completion) are generally not recommended for IEP goal progress monitoring and only recommended as a secondary source to confirm intervention effectiveness.
Who will collect the data?
A special education teacher or paraprofessional trained in administration of GOMs? A student's classroom teacher(s)? A service provider who is working with the student (e.g., an SLP)?
How often will we collect data?
The frequency of progress monitoring should be reasonably calculated to be:
Infrequent enough to capture new data when we can reasonably expect a change in performance. For example, we cannot reasonably expect a student's skill proficiency to meaningfully change on a daily basis. If data is collected daily (e.g., a daily point sheet), the average or median of that data is typically graphed less frequently (e.g., weekly) in order to capture trends.
Frequent enough to allow the team to respond to inadequate progress. For example, if we collect data 1x/month, it will take far too long — at least a quarter of an annual IEP — to determine a trend of inadequate progress and make changes to our intervention.
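The "collect daily, graph weekly" pattern described above can be sketched as follows. This is an illustrative example only; the point-sheet scores are made up, and eduCLIMBER performs this kind of aggregation in its own interface.

```python
# Sketch under assumptions: daily point-sheet scores collapsed to a single
# weekly median before graphing, as described above. Scores are made up.
from statistics import median

daily_points = {
    "Mon": 7, "Tue": 9, "Wed": 6, "Thu": 8, "Fri": 9,
}

weekly_value = median(daily_points.values())
print(weekly_value)  # → 8; this single point is what gets graphed for the week
```

The median is often preferred over the mean here because a single unusual day (e.g., a field trip or a crisis) doesn't distort the weekly picture.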
Follow through on collecting the data
Our plan must be sustainable: the data won't be useful and meaningful, and won't actually drive our instructional decisions, if we cannot rely on it being accessible. SpEd departments plan and coordinate to make this a reality. For example, teachers collectively agree to (in eduCLIMBER):
Enter data on a certain day of the week (so that the staff assigned to data entry can quickly pull up and enter data on a group of students)
Identify a 'Monitoring User' that is assigned to data entry (e.g., a paraprofessional with data entry duties built into their schedule)
Graph the data
Locally, we use the eduCLIMBER website to visually graph our progress monitoring data.
Visually analyze the data and make data-based decisions
The eduCLIMBER --> Interventions page on the SCRED website ('Managing Interventions' heading) offers guidance and resources re: how to:
Visually analyze data and objectively determine when instructional changes are needed.
Keep track of those instructional changes — by adding a "phase line" to your graph.
Documentation
Parts of a Goal
Time Frame: When do we want the goal to be achieved by?
Condition: Under what circumstances will we be measuring the student's performance?
Behavior: What behavior/skill will we be measuring, and what aspect of it (e.g., rate, quality, independence, duration, etc.)?
Criterion: What performance level will indicate that the goal has been achieved? We strongly encourage developing goals that monitor an increase in a desired skill/behavior versus simply a reduction in an undesired behavior.
IEP Goal "Formula"
By [due date] when [measurement condition], XXX will increase [skill performance] from [baseline] to [target] as measured by [description of measurement tool, frequency of data collection, and individual(s) responsible].
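To make the formula's slots concrete, here is a hypothetical fill-in using a plain string template (the field names are invented for illustration; actual goals are written directly in SpEd Forms, not generated by code):

```python
# Hypothetical illustration: filling in the goal "formula" above.
# Field names are invented; the values come from the Cindy example below.

GOAL_TEMPLATE = (
    "By {due_date} when {condition}, {student} will increase {skill} "
    "from {baseline} to {target} as measured by {measurement}."
)

goal = GOAL_TEMPLATE.format(
    due_date="January 2018",
    condition="presented with unrehearsed 5th grade level passages",
    student="Cindy",
    skill="her rate of reading orally",
    baseline="84 correct words per minute",
    target="121 correct words per minute",
    measurement="weekly oral reading fluency probes administered by "
                "special education staff",
)
print(goal)
```

Each slot maps to one part of the goal: time frame, condition, behavior, and criterion plus measurement details.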
IEP Goal Examples
By January 2018 (time frame), when presented with unrehearsed 5th grade level passages (condition), Cindy will increase her rate of reading orally (behavior) from 84 correct words per minute to 121 correct words per minute with at least 95% accuracy for two consecutive data points (criterion), as measured weekly by special education staff using oral reading fluency probes.
By October 2018 (time frame), when Beverly exhibits visible signs of agitation (clenched fists, scowling, angry voice tone) (condition), she will increase her rate of using anger control strategies (behavior) from 30% of opportunities to 65% for two consecutive data points (criterion), as measured every other week using a direct behavior rating completed by teachers working directly with Beverly.
By May 2018 (time frame), when observed in all school settings (condition), Randy will increase his rate of keeping a daily to-do list, using effective time management strategies, and maintaining organized materials (behavior) from a level of 21/55 points obtained on an organizational skills inventory completed by special education staff working directly with Randy to 44/55 points for two consecutive data points (criterion), as measured every other week.
By September 2018 (time frame), when presented with 30 tasks, 10 each of synonyms, analogies and multiple meanings (condition), George will increase his rate of correct response (behavior) from a current level of 4/30 tasks completed correctly to a level of 25/30 tasks completed correctly for two consecutive data points (criterion), as measured weekly by a Speech Language Pathologist working with George during small group instruction.
By July 2018 (time frame), when presented with a toothbrushing task (condition), Adam will increase his ability to complete a toothbrushing routine (behavior), from currently obtaining 8 out of 24 points on the toothbrushing rubric to obtaining 16 out of 24 points for two consecutive data points (criterion), as measured weekly by special education staff using the toothbrushing rubric.
Benchmarks vs. Short-Term Objectives
Within the same IEP, teams may use benchmarks, short-term objectives, or both. However, the two should not be mixed under the same annual goal. Both break learning into measurable, intermediate steps that enable families, students, and educators to monitor progress during the year.
In order to meet minimal compliance standards in the state of Minnesota, IEP goals must include at least two benchmarks/short-term objectives (3525.2810 subp. 1(a-2)). It is recommended that they be intentionally plotted so that they are evenly spaced throughout the duration of the IEP. Some case managers choose to schedule their benchmarks/objectives near the end of grading periods when progress reports will be sent out; however, this is by no means mandatory.
Benchmarks
Benchmarks are the most common method. When feasible, they are preferred. Use benchmarks when the skill or behavior leading to the annual goal remains essentially the same but some aspect of performance is expected to change (e.g., rate, duration, quality, independence, etc.). For example, for the first benchmark a student might be expected to read 43 words correctly on an Oral Reading Fluency probe, for the second benchmark they are expected to read 56, etc.
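Evenly spacing benchmark targets between a baseline and the annual goal is straightforward arithmetic. The sketch below is illustrative only; the baseline and goal values are invented to reproduce the 43-then-56 progression in the example above.

```python
# Illustrative arithmetic only: evenly spaced benchmark targets between a
# baseline and an annual goal target. Baseline/goal values are invented
# to match the Oral Reading Fluency example in the text.

def benchmark_targets(baseline: float, goal: float, n_benchmarks: int) -> list:
    """Return n_benchmarks intermediate targets, evenly spaced, ending at the goal."""
    step = (goal - baseline) / n_benchmarks
    return [round(baseline + step * i) for i in range(1, n_benchmarks + 1)]

# e.g., baseline of 30 words correct, annual goal of 69, three benchmarks:
print(benchmark_targets(30, 69, 3))  # → [43, 56, 69]
```

The last benchmark coincides with the annual goal; the earlier ones give the team intermediate checkpoints to report against.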
Short-Term Objectives
Objectives are more frequently used with students with significant needs. They are used when the skills leading to the annual goal are different. This typically occurs when the IEP team has engaged in task analysis and broken a sequence of skill performance down into component steps. For example, the first objective for a tooth-brushing routine might focus on picking up the toothbrush independently. The second objective might focus on applying toothpaste, etc.
Frequently Asked Questions
✋ Can we add a goal (and services) in a new area between a student's evaluations?
A “new” area is one that is not closely related to existing areas that are addressed through special education services and supports. For example, if math services and supports are added to an IEP that previously addressed only reading, math is a “new” area; if services and supports promoting multiplication and division success are added to an IEP that previously addressed addition and subtraction, multiplication and division reflect a normal progression within math and are not “new.”
Neither an evaluation nor the general education intervention process is required to add a goal/service to the IEP of a special education student. However, this process should be considered whenever the needs of a special education student might reasonably be addressed without special education services or supports. The data supporting the addition of services and supports in a new area must be documented in the IEP.
Data Standards
IEP teams need to base their decision on sound data. Data may come from a variety of sources. When considering additional needs, the data generated through the general education classroom that reflects an individual’s progress in the context of assistance provided to the student is the first source to consider. Data must clearly support the conclusion that identified needs require special education services and supports, rather than other forms of assistance.
Documentation
State the IEP team’s conclusion regarding the need for supports beyond those typically provided to all general education students and summarize the data that supports this conclusion, including progress and discrepancy information. Example: "The IEP team believes that Karina requires greater assistance in the area of math than is provided to most students. She has struggled with concepts and applications related to algebra variables, data analysis, and probability. Her math scores on the Minnesota Comprehensive Assessments are well below the proficient level and her test, quiz, and assignment scores are consistently within the lowest 10 to 15% of the class."
State the IEP team’s conclusion regarding the individual’s need for special education services and supports in the new area, and summarize the information that supports this conclusion, including the impact of previous efforts. Example: "The IEP team believes that special education services and supports, including frequent progress monitoring and adjustments to instruction, are needed. Karina currently receives her math instruction in a co-taught classroom and is frequently included in small groups of students who are experiencing difficulties and receive focused instruction by a special educator. In addition, her general education teacher affords her additional individual time (15-20 minutes, 3-4 times per week) and her parents provide nightly homework support. Weekly algebra probes show little or no gains over the past eight weeks."