Fluency Norms Chart (2017 Update)

View the results of the updated 2017 study on oral reading fluency (ORF) by Jan Hasbrouck and Gerald Tindal, with compiled ORF norms for grades 1-6. You'll also find an analysis of how the 2017 norms differ from the 2006 norms.
In 2006, Jan Hasbrouck and Gerald Tindal completed an extensive study of oral reading fluency. The results of their study were published in a technical report entitled Oral Reading Fluency: 90 Years of Measurement and in The Reading Teacher article "Oral reading fluency norms: A valuable assessment tool for reading teachers."
In 2017, Hasbrouck and Tindal published an update of the oral reading fluency (ORF) norms, compiled from three widely used and commercially available ORF assessments (DIBELS, DIBELS Next, and easyCBM) and representing a far larger number of student scores than the earlier norms.
The table below shows the mean oral reading fluency of students in grades 1 through 6, as determined by Hasbrouck and Tindal's 2017 data. You can also see an analysis of how the 2017 norms differ from the 2006 norms.
Oral reading fluency (ORF)
Of the various CBM measures available in reading, ORF is likely the most widely used. ORF involves having students read aloud from an unpracticed passage for one minute. An examiner notes any errors made (words read or pronounced incorrectly, omitted, read out of order, or words pronounced for the student by the examiner after a 3-second pause) and then calculates the number of words read correctly per minute (WCPM).
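To make the scoring arithmetic concrete, here is a minimal sketch in Python. The function name and the example counts are hypothetical; only the rule itself (words read minus errors, scaled to one minute) comes from the procedure described above.

```python
# Minimal sketch of the WCPM calculation described above.
# All names and example numbers are illustrative.

def words_correct_per_minute(total_words_read: int, errors: int, seconds: int = 60) -> float:
    """Words correct per minute (WCPM) for a single timed reading."""
    words_correct = total_words_read - errors
    return words_correct / (seconds / 60.0)

# Example: a student reads 68 words in one minute and makes 5 scorable errors.
print(words_correct_per_minute(total_words_read=68, errors=5))  # 63.0 WCPM
```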
The WCPM score is backed by more than 30 years of validation research, indicating that it is a robust indicator of overall reading development throughout the primary grades.
Interpreting ORF scores
ORF is used for two primary purposes: screening and progress monitoring. When ORF is used to screen students, the driving questions are, first, “How does this student’s performance compare to that of his or her peers?” and then, “Is this student at risk of reading failure?”
To answer these questions, the decision-makers rely on ORF norms that identify performance benchmarks at the beginning (fall), middle (winter), and end (spring) of the year. An individual student’s WCPM score can be compared to these benchmarks and determined to be either significantly above benchmark, above benchmark, at the expected benchmark, below benchmark, or significantly below benchmark.
Those students below or significantly below benchmark are at possible risk of reading difficulties. They are good candidates for further diagnostic assessments to help teachers determine their skill strengths and weaknesses and plan appropriately targeted instruction and intervention (Hasbrouck, 2010, Educators as Physicians: Using RTI Data for Effective Decision-Making. Austin, TX: Gibson Hasbrouck & Associates).
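As a rough illustration of how a single WCPM score might be compared against the norms, the sketch below uses the 2017 grade 2 winter values from the table further down. The cut points that map percentile bands onto the five benchmark labels are an assumption made for illustration; the published norms report percentiles only, not category boundaries.

```python
# Illustrative screening check against assumed benchmark cut points.
# The norm values are the 2017 grade 2 winter WCPM figures from the table below;
# the band-to-label mapping is a hypothetical scheme, not part of the norms.

GRADE2_WINTER_NORMS = {90: 131, 75: 109, 50: 84, 25: 59, 10: 35}  # percentile -> WCPM

def benchmark_category(wcpm: int, norms: dict) -> str:
    """Label a WCPM score relative to percentile benchmarks (assumed cut points)."""
    if wcpm >= norms[90]:
        return "significantly above benchmark"
    if wcpm >= norms[75]:
        return "above benchmark"
    if wcpm >= norms[25]:
        return "at the expected benchmark"
    if wcpm >= norms[10]:
        return "below benchmark"
    return "significantly below benchmark"

print(benchmark_category(52, GRADE2_WINTER_NORMS))  # "below benchmark"
```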
When using ORF for progress monitoring, the questions to be answered are: “Is this student making expected progress?” and “Is the instruction or intervention being provided improving this student’s skills?”
When ORF assessments are used to answer these questions, they must be administered frequently (weekly, bimonthly, etc.), the results placed on a graph for ease of analysis, and a goal determined. The student’s goal can be based on established performance benchmarks or on information about expected rates of progress. Over a period of weeks, the student’s graph can show significant or moderate progress, expected progress, or progress that is below or significantly below expected levels.
Based on these outcomes, teachers can decide whether to (a) make small or major changes to the student’s instruction, (b) continue with the current instructional plan, or (c) change the student’s goal (Hosp, Hosp, & Howell, 2007. The ABCs of CBM: A Practical Guide to Curriculum-based Measurement. NY: Guilford Press).
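The sketch below illustrates this progress-monitoring logic with hypothetical numbers: a baseline, a goal, and six weekly probe scores. The aim-line comparison is a simplified stand-in for the graph-based analysis described above, not a prescribed formula.

```python
# Simplified progress-monitoring check: compare actual WCPM growth per week
# against the growth rate implied by an aim line from baseline to goal.
# Baseline, goal, timeline, and probe scores are all hypothetical.

def weekly_growth(scores):
    """Average WCPM gained per week across consecutive weekly probes."""
    return (scores[-1] - scores[0]) / (len(scores) - 1)

baseline, goal_wcpm, weeks_to_goal = 60, 90, 15
expected_growth = (goal_wcpm - baseline) / weeks_to_goal   # aim-line slope: 2.0 WCPM/week

probes = [60, 61, 64, 63, 67, 70]                          # six weekly WCPM scores
actual_growth = weekly_growth(probes)                      # 2.0 WCPM/week

if actual_growth >= expected_growth:
    print("Expected progress: continue the current instructional plan.")
else:
    print("Below the aim line: consider adjusting instruction or the goal.")
```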
Using the data
You can use the information in this table to draw conclusions and make decisions about the oral reading fluency of your students.
Students who score 10 or more words below the 50th percentile, based on the average of two unpracticed readings from grade-level materials, need a fluency-building program.
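A minimal sketch of that rule, assuming a hypothetical pair of readings and using the grade 3 winter 50th-percentile value (97 WCPM) from the table below:

```python
# Decision rule sketched from the guideline above: average two unpracticed
# grade-level readings and flag the student if the average is 10 or more WCPM
# below the 50th-percentile norm. The example readings are hypothetical.

def needs_fluency_program(reading1_wcpm, reading2_wcpm, norm_50th_wcpm):
    """True if the two-reading average falls 10+ WCPM below the 50th percentile."""
    average = (reading1_wcpm + reading2_wcpm) / 2
    return norm_50th_wcpm - average >= 10

print(needs_fluency_program(82, 88, norm_50th_wcpm=97))  # True: average 85 is 12 below 97
```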
In addition, teachers can use the table to set the long-term fluency goals for their struggling readers.
2017 Oral reading fluency (ORF) data
| Grade | %ile | Fall WCPM* | Winter WCPM* | Spring WCPM* |
|---|---|---|---|---|
| 1 | 90 | | 97 | 116 |
| | 75 | | 59 | 91 |
| | 50 | | 29 | 60 |
| | 25 | | 16 | 34 |
| | 10 | | 9 | 18 |
| 2 | 90 | 111 | 131 | 148 |
| | 75 | 84 | 109 | 124 |
| | 50 | 50 | 84 | 100 |
| | 25 | 36 | 59 | 72 |
| | 10 | 23 | 35 | 43 |
| 3 | 90 | 134 | 161 | 166 |
| | 75 | 104 | 137 | 139 |
| | 50 | 83 | 97 | 112 |
| | 25 | 59 | 79 | 91 |
| | 10 | 40 | 62 | 63 |
| 4 | 90 | 153 | 168 | 184 |
| | 75 | 125 | 143 | 160 |
| | 50 | 94 | 120 | 133 |
| | 25 | 75 | 95 | 105 |
| | 10 | 60 | 71 | 83 |
| 5 | 90 | 179 | 183 | 195 |
| | 75 | 153 | 160 | 169 |
| | 50 | 121 | 133 | 146 |
| | 25 | 87 | 109 | 119 |
| | 10 | 64 | 84 | 102 |
| 6 | 90 | 185 | 195 | 204 |
| | 75 | 159 | 166 | 173 |
| | 50 | 132 | 145 | 146 |
| | 25 | 112 | 116 | 122 |
| | 10 | 89 | 91 | 91 |
* WCPM = Words Correct Per Minute
The 2017 chart is available as a PDF: 2017 Hasbrouck & Tindal Oral Reading Norms
Comparison of ORF norms for 2006 and 2017
| %ile | | Grade 1 F | W | S | Grade 2 F | W | S |
|---|---|---|---|---|---|---|---|
| 90 | 2017 | | 97 | 116 | 111 | 131 | 148 |
| 90 | 2006 | | 81 | 111 | 106 | 125 | 142 |
| | Difference | | 16 | 5 | 5 | 6 | 6 |
| 75 | 2017 | | 59 | 91 | 84 | 109 | 124 |
| 75 | 2006 | | 47 | 82 | 79 | 100 | 117 |
| | Difference | | 12 | 9 | 5 | 9 | 7 |
| 50 | 2017 | | 29 | 60 | 50 | 84 | 100 |
| 50 | 2006 | | 23 | 53 | 51 | 72 | 89 |
| | Difference | | 6 | 7 | -1 | 12 | 11 |
| 25 | 2017 | | 16 | 34 | 36 | 59 | 72 |
| 25 | 2006 | | 12 | 28 | 25 | 42 | 61 |
| | Difference | | 4 | 6 | 11 | 17 | 11 |
| 10 | 2017 | | 9 | 18 | 23 | 35 | 43 |
| 10 | 2006 | | 6 | 15 | 11 | 18 | 31 |
| | Difference | | 3 | 3 | 12 | 17 | 12 |
| %ile | | Grade 3 F | W | S | Grade 4 F | W | S |
|---|---|---|---|---|---|---|---|
| 90 | 2017 | 134 | 161 | 166 | 153 | 168 | 184 |
| 90 | 2006 | 128 | 146 | 162 | 145 | 166 | 180 |
| | Difference | 6 | 15 | 4 | 8 | 2 | 4 |
| 75 | 2017 | 104 | 137 | 139 | 125 | 143 | 160 |
| 75 | 2006 | 99 | 120 | 137 | 119 | 139 | 152 |
| | Difference | 5 | 17 | 2 | 6 | 4 | 8 |
| 50 | 2017 | 83 | 97 | 112 | 94 | 120 | 133 |
| 50 | 2006 | 71 | 92 | 107 | 94 | 112 | 123 |
| | Difference | 12 | 5 | 5 | 0 | 8 | 10 |
| 25 | 2017 | 59 | 79 | 91 | 75 | 95 | 105 |
| 25 | 2006 | 44 | 62 | 78 | 68 | 87 | 98 |
| | Difference | 15 | 17 | 13 | 7 | 8 | 7 |
| 10 | 2017 | 40 | 62 | 63 | 60 | 71 | 83 |
| 10 | 2006 | 21 | 36 | 48 | 45 | 61 | 72 |
| | Difference | 19 | 26 | 15 | 15 | 10 | 11 |
| %ile | | Grade 5 F | W | S | Grade 6 F | W | S |
|---|---|---|---|---|---|---|---|
| 90 | 2017 | 179 | 183 | 195 | 185 | 195 | 204 |
| 90 | 2006 | 166 | 182 | 194 | 177 | 195 | 204 |
| | Difference | 13 | 1 | 1 | 8 | 0 | 0 |
| 75 | 2017 | 153 | 160 | 169 | 159 | 166 | 173 |
| 75 | 2006 | 139 | 156 | 168 | 153 | 167 | 177 |
| | Difference | 14 | 4 | 1 | 6 | -1 | -4 |
| 50 | 2017 | 121 | 133 | 146 | 132 | 145 | 146 |
| 50 | 2006 | 110 | 127 | 139 | 127 | 140 | 150 |
| | Difference | 11 | 6 | 7 | 5 | 5 | -4 |
| 25 | 2017 | 87 | 109 | 119 | 112 | 116 | 122 |
| 25 | 2006 | 85 | 99 | 109 | 98 | 111 | 122 |
| | Difference | 2 | 10 | 10 | 14 | 5 | 0 |
| 10 | 2017 | 64 | 84 | 102 | 89 | 91 | 91 |
| 10 | 2006 | 61 | 74 | 83 | 68 | 82 | 93 |
| | Difference | 3 | 10 | 19 | 21 | 9 | -2 |
Average differences in ORF for each grade level
| Grade | Fall | Winter | Spring | Average* |
|---|---|---|---|---|
| 1 | | 41 | 30 | 7 |
| 2 | 32 | 61 | 47 | 9 |
| 3 | 57 | 80 | 39 | 12 |
| 4 | 28 | 30 | 36 | 6 |
| 5 | 43 | 30 | 38 | 8 |
| 6 | 54 | 18 | -10 | 4 |
* Average across all percentile range values.
Hasbrouck, J., & Tindal, G. (2017). An update to compiled ORF norms (Technical Report No. 1702). Eugene, OR: Behavioral Research and Teaching, University of Oregon.
Comments
I believe it could make a difference if the child feels comfortable or threatened by the teacher’s testing. It’s not something that may be obvious or planned in any way, but it would definitely cause a child to stammer or make small errors. Is it counted as an error if the child self corrects?
Should this be used as a tool for supervisors to evaluate teachers?
I'm wondering if the 50th percentile numbers for 6th grade are correct? Only a one-word increase from Winter of 6th to Spring? (145 to 146) Thanks for posting. So happy to see this update!
I have students in fourth grade reading at the second- and third-grade 50th percentile. K-Grade 1 subskills are missing.
Why are Hasbrouck & Tindal norms lower than easyCBM norms?
I'm wondering the same thing. Do specific texts have to be used that increase in difficulty to apply to these benchmark scores?
Do specific reading assessments need to be used or can I use any text in the student’s instructional reading level? If specific assessment text samples are needed, where can I find them for various grades? I am a parent and while my child isn’t flagged for fluency problems in school, I think he should have a much higher fluency level than he does. I would like to test him independently at home so I can track him to make sure he is improving his fluency.
How do you account for a child's score that goes down throughout the year? Example: fall ORF 205 WCPM, winter 203, and spring 181? Clearly the child didn't forget how to read?
A couple of things might be happening. For example, the student might just not like taking the test and so takes it less seriously than he should; he might not be understanding the reading material as well as the year progresses, since it gets harder; he might be slacking on that day; or something more serious might be at hand, such as an underlying neurological issue.
As was stated earlier in the post, one must take into consideration that the level of text complexity increases throughout the year. The child may not forget how to read, but as the text becomes increasingly difficult, the child's fluency must also increase in order to read the same number of correct words per minute. The fluctuation could indicate that the child is having difficulty keeping up with the increased complexity of the text and needs more practice at that level.
Why do WCPM decrease from 6th to 7th grade?
WCPM can decrease when students go from 6th to 7th grade because the reading material gets harder when they are promoted. Students face text a little harder than in the previous grade, which explains the apparent inconsistency in test scores. It's normal, of course: harder text takes more effort to decode, and that takes more time.
What if the student is reading at about the CWPM average, but at an independent level that is not grade level? For example, my fourth grader reads 105 CWPM in winter, but he is reading at a third-grade benchmark.
In my opinion it's not a major concern. Your student has 'broken the code,' and the most important thing is that literacy has taken an initial hold and that reading continues to be a part of his/her life. There are (probably) millions of ways to foster growth in reading, and the internet is a tremendous resource for such things. In my opinion, anyone who claims to fret over the situation you describe either has an invalid perspective or an agenda to push. As a parent, model reading in your own life, and encourage and reward your child's reading (but certainly do not nag or make reading unpleasant), but take a deep breath and KNOW that reading is a skill that gets better the more you use it.
I've been doing reading assessments with the QRI-4 by Leslie and Caldwell, so I'm wondering if you count ALL miscues for the CWPM score (total accuracy), or just the number of meaning change miscues (total acceptability) to get the oral reading fluency score.
This is for oral reading. Does anyone know how much faster a student can typically read silently?
This is a very useful matrix, especially with the information about improvement, rather than just being guided by percentiles. I recognize that silent reading cannot be measured similarly, given that miscues cannot be captured by the examiner, but is there a guideline for wpm read independent of comprehension results?
I am interested in using your chart for my learners. However, there is little information on how it is used. For instance, how do you account for the fact that a learner's score (the percentile) remains unchanged even as the learner's number of correct words read in a minute improves from term to term?
The percentile is a normative score that reflects the student's relative standing compared to his same-grade peers, while words correct per minute is a raw or absolute score without reference to other students. As all students in a grade grow in wcpm, the only way a student could improve his percentile score (relative standing in the group) is to outpace his peers in wcpm growth. Use wcpm to measure a student's growth over time (i.e., progress monitoring), and use the percentile score to gauge how well he's doing compared to same-grade peers (i.e., decide if he needs to be referred for intervention).
It is because the child is aging and so are his peers. As everyone else improves, the 50th percentile has to increase, too. You might ask, "Well, why is the end-of-year WCPM score higher than the fall WCPM score for the next grade?" Remember, the text difficulty increases each year.
So to compare a child's fluency skills to others', you could compare him to same-aged peers reading the same text, or older peers reading a higher-level text, or younger peers reading a lower-level text, or all of these. In the past, this was a way for me to find a student's independent, instructional, and frustration levels of reading. As a special education teacher, I used to write statements of strengths or deficits such as, "As a sixth grader, John's fluency is commensurate with a fourth-grade student at the 50th percentile reading a fourth-grade-level text." I would include the numbers for where a typical sixth grader would be at that time of year and include his percentile, too.
One of my favorite informal reading assessments is a Burns and Roe IRI.
In this case, the percentile is not an individual score per se, but rather a performance level. For example, students who are reading at the 50th percentile in the winter of first grade read 23 words correct per minute (WCPM). So, if you want your students to read at the 50th percentile, you will want to follow that line across to check that their WCPM score improves at the suggested rate throughout the year and that they meet the fall, winter, and spring WCPM benchmarks that are given.
My daughter is in second grade. Recently she read 125 words per minute on a fifth-grade passage. What percentile would you consider her to be at in terms of her fluency?