Session 2.3: Universal Screening Measures Administration Overview (e.g., DIBELS 8th Edition, easyCBM)

After reviewing their county’s benchmark screening data, participants will identify a small group of students who are performing at or below grade-level proficiency in any of the five areas of reading. Once the target group has been identified, the educator will conduct a second round of preassessment testing using an alternate universal screener and compare the results to the previously administered benchmark scores. This session focuses on assessing small groups of students with alternate screeners and comparing data across those assessments.

Indicators targeted: 5.1, 5.12

 

Materials

 

Additional Materials to Consider
  • RTI for English Language Learners: Appropriately Using Screening and Progress Monitoring Tools to Improve Instructional Outcomes
  • For participants who do not have access to an alternative universal screener through their county, the facilitator will need to print packets of Benchmark Materials (including the Scoring Booklet and the Student Materials) that correspond to the teacher’s grade level – OR – participants can provide their own physical copies of the Benchmark Materials (including the Scoring Booklet and the Student Materials) for their grade level

 

Define Session Goals

The educator will…

  • Confirm the subset of students, performing similarly and below grade level, who will become the focus group for the remainder of the project.
  • Identify key attributes and characteristics of one curriculum-based screening measure (i.e., DIBELS 8th Edition).
  • Practice administering an alternate preassessment screener with a partner.

 

Learn About Administering Additional Screening Measures

1. Have participants review data from the previous two sessions to confirm the subset of students for the focus group.

Now that educators have had time to review their benchmark screening and language proficiency data (Sessions 2.1 and 2.2), they should identify for the first time, or confirm, a subset of students (approximately 4 – 6 focus students) from their own classrooms who are considered at risk in specific reading skills. The educator will conduct one more round of preassessment testing, using an alternate universal screening measure and one diagnostic measure, to confirm which areas of reading need additional, targeted instruction for the students in the focus group.

Ask participants to take a few minutes to read Making Sense of Terms Used in Early Reading Assessment and the Assessment Terms Used in Reading Infographic Handout. On the infographic handout, have them circle or highlight where they believe they are in the assessment process covered so far in Competency 5 (i.e., on or between the Screening Assessment and Diagnostic Assessment steps).

Participants who have identified ELLs within their focus group may want to spend an additional few minutes reading Response to Intervention and English Language Learners, starting on pg. 7 of RTI for English Language Learners: Appropriately Using Screening and Progress Monitoring Tools to Improve Instructional Outcomes and ending at the top of pg. 11, noting key information.

2. Familiarize participants with Dynamic Indicators of Basic Early Literacy Skills [DIBELS] 8th Edition.

*Note: Some participants may have access to an alternate universal screener provided by their home county. These participants may choose to use their screener when it comes time to collaborate with peers.

Pass out the DIBELS 8th Edition Introduction, or have participants access the DIBELS 8th Edition Administration & Scoring Guide and Benchmark Materials, and turn to pgs. 7 – 8 to review the sections Dimensions of Reading Assessed by DIBELS 8 and Description of DIBELS 8, which states:

DIBELS 8th Edition offers six subtests designed to assess component skills involved in reading: Letter Naming Fluency (LNF), Phonemic Segmentation Fluency (PSF), Nonsense Word Fluency (NWF), Word Reading Fluency (WRF), Oral Reading Fluency (ORF), and Maze. These subtests are aligned to four of the five “Big Ideas” in reading identified by the National Reading Panel (National Institute of Child Health and Human Development, 2000), including phonological awareness, phonics (or the alphabetic principle), fluency, and comprehension (Riedel, 2007; see Table 1.1). In many ways the DIBELS subtests represent not only the constructs in the National Reading Panel Report (NICHD, 2000), but also a developmental continuum. As a result, the subtests included change across grades in a manner that parallels student development and instructional foci (Adams, 1990; Chall, 1996; Ehri, 2005; Paris & Hamilton, 2009).

Some DIBELS 8 subtests are also aligned to subskills of reading that are associated with risk for dyslexia and other word reading disabilities. The International Dyslexia Association (IDA) recommends universal screening of students in kindergarten through second grade (IDA, 2019). Consistent with IDA recommendations, DIBELS 8 offers LNF, PSF, and NWF subtests as dyslexia screening measures of rapid naming (or processing speed), phonemic awareness, and letter-sound correspondence for use in kindergarten and first grade. Also consistent with IDA recommendations, DIBELS 8 offers real and nonsense word measures (NWF, WRF, and ORF) as dyslexia screening measures.

DIBELS 8th Edition takes a curriculum-based measurement (CBM) approach to assessing reading. It is intended for assessing reading skills from the beginning of kindergarten through the end of eighth grade. DIBELS 8 subtests are designed as brief, easily administered measures of reading. Five of the subtests (LNF, PSF, NWF, WRF, and ORF) are 60-second measures designed to be administered individually in a quiet setting. The sixth subtest, Maze, is a 3-minute measure designed to be administered in group settings. Because DIBELS subtests are timed measures, efficiency in reading skills is considered as well as accuracy. The subtests offered in specific grades are aligned to curriculum and instruction typical for each grade, as well as to recommendations made by the IDA.

Allow participants to share observations, making initial connections between the reading overview and the universal screener used in their counties.

3. Model how to administer and score one of the subtests with a participant.

Ask one of the participants to assist the facilitator in modeling the administration of one of the subtests, Oral Reading Fluency (ORF), from the DIBELS 8th Edition instruments. At the same time, hand out copies of the documents listed below; participants may also choose to review them online in the DIBELS Modeling ORF Resources folder.

  • DIBELS 8th Edition Introduction (pgs. 1 – 41 of the Administration & Scoring Guide):
    • Oral Reading Fluency (ORF) Description Overview; pg. 11
    • Oral Reading Fluency Improvements; pgs. 20 – 21
    • Oral Reading Fluency Development Process; pgs. 32 – 36 (to be read at a later time on their own)
  • Modeling Oral Reading Fluency Test Administration and Scoring; pgs. 75 – 79
  • Modeling ORF Teacher Resources – Middle of the Year; Fifth Grade (pg. 3 of the document: Animal Tools)
  • Modeling ORF Student Booklet – Middle of the Year; Fifth Grade (pg. 3 of the document: Animal Tools)

Both the facilitator and the participants will spend some time familiarizing themselves with the above documents before the facilitator models how to administer the assessment as the testing examiner and the participant responds as a fifth-grade student. Allow participants 5 – 6 minutes to review the documents independently, or walk through them as a whole group, stopping to ask whether participants need any clarification along the way.

Oral Reading Fluency Simulation:
Inform the participant that s/he will be playing the part of a fifth-grade student, Phillip Dawson, who is in Ms. Teacher’s fifth-grade class at Practice Elementary School. Today, Phillip will complete one of the two additional screeners used for fifth grade (i.e., Oral Reading Fluency and Maze) as part of his mid-year assessment screenings.

Position the participant playing the role of Phillip Dawson away from the facilitator, who will be playing the role of the test examiner, so that s/he cannot see the examiner’s responses during the testing segment. Then, place the copy of the Modeling ORF Student Booklet – Middle of the Year; Fifth Grade in front of the participant. Ask the participant to keep the student booklet closed until the facilitator/examiner is ready to proceed with the testing.

As the test examiner, set a timer for 60 seconds, and then turn to pg. 3 of the Modeling ORF Teacher Resources – Middle of the Year; Fifth Grade, which should show a story titled Animal Tools. Once familiar with the scoring procedures, the examiner will ask the participant playing the role of Phillip Dawson to open his/her booklet to the story titled Animal Tools. The examiner will then read the “Examiner Script,” located at the top of the page for the same story, in the Teacher Resources packet.

Once the participant begins reading, the examiner will start the 60-second timer and record miscues (including, but not limited to, self-corrections, insertions, repetitions, mispronounced words [possibly related to graphophonic, syntactic, and/or semantic cues], word-order changes, and omissions).

At the end of the 60-second period, the examiner will ask the student to stop reading the passage. The examiner will then apply the scoring rules to the assessment.
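To make the scoring step concrete, a simplified worked example follows. The numbers here are hypothetical, and the Administration & Scoring Guide remains the authoritative source for the scoring rules; under the standard words-correct-per-minute convention, the calculation looks like this:

  Words attempted in 60 seconds: 104
  Errors recorded by the examiner: 6
  Words correct per minute (WCPM): 104 − 6 = 98
  Accuracy: 98 ÷ 104 ≈ 94%

Note that, under typical ORF scoring conventions, self-corrections made within three seconds are not counted as errors, while words supplied by the examiner after a three-second hesitation are.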

Ask the participants who were observing the administration to share their observations with the whole group. Possible observations may include:

Observations of Examiner Performance:
  • Did the examiner position his/her scoring sheet out of the student’s eyesight?
  • Did the examiner provide the correct word after the student hesitated for three seconds?
  • Was the examiner impartial with his/her facial expressions?

Observations of Student Performance:
  • How fluent was the student?
  • What types of miscues did the student make (several omitted words, final sounds dropped, no attention to punctuation, etc.)?
  • What physical behaviors did the student demonstrate? Facial expressions?

 

Collaborate

Have participants pair up. Each partner will decide which screener(s) s/he would like to practice administering, with the peer responding as a student in the administering partner’s classroom might typically respond while completing the screener. Remind each participant that his/her “student” should reflect reading behaviors similar to those of one of the students identified in the focus group.

If participants do not have access to an additional county screener to practice with in today’s session, the facilitator will either provide hard copies of the DIBELS 8th Edition benchmark assessment materials (including both the Scoring Booklet and the Student Materials for the teacher’s grade level) or allow educators to use their own grade-level copies to practice with.

Allow 20 minutes for educators to practice administering the preassessment screeners with their partners, alternating between the test examiner and student respondent roles. Then, ask for volunteers to share a few of their observations about the administration and scoring process. Participants may not complete this task in its entirety during the 20 minutes; continued practice and observation may take place after the session has ended.

 

Reflect and Next Steps

Remind participants that if they choose to use the DIBELS 8th Edition screeners with their focus group of students as part of completing their Data-Based Instruction Plan, the number of screeners they can choose from varies by grade level (see the table below). As participants revisit their classroom reading benchmark data, they may decide that the students in the focus group need additional prescreening in multiple areas, or they may determine that this particular set of students needs additional screening in only one area. It will be up to the participant to determine the level of need, as identified by the students’ reading performance. Regardless of which screener(s) the participant selects, s/he will need to provide a rationale for why each screener was selected and administered with this focus group of students.

DIBELS 8th Edition Screening Measures Available

Grade Level   Screening Measures
K             Letter Naming Fluency, Phonemic Segmentation Fluency, Nonsense Word Fluency, Word Reading Fluency
1             Letter Naming Fluency, Phonemic Segmentation Fluency, Nonsense Word Fluency, Word Reading Fluency, Oral Reading Fluency
2             Nonsense Word Fluency, Word Reading Fluency, Oral Reading Fluency, Maze
3             Nonsense Word Fluency, Word Reading Fluency, Oral Reading Fluency, Maze
4 – 8         Oral Reading Fluency, Maze

Reflection:
Take five minutes to reflect on the following questions at the top of Activity 2.3.1: Reflect and Next Steps for Additional Administration of Universal Screening Measures:

  1. Which additional screening instrument will you use with your focus group? Knowing your students and the types of results the instrument should yield, why did you select this particular screening tool? Will it provide enough evidence about specific sets of reading behaviors you hope to investigate?
  2. Are there multiple reading subskills this screener targets? If so, which subtests do you plan to incorporate with your focus group of students? What is your rationale for targeting these subskills?

Next Steps:
After today’s session, participants should begin administering the additional prescreeners to the 4 – 6 students in their focus group. As they administer each screener, participants will need to organize and display their student information and preassessment data results in a table that can be easily interpreted; see the sample layout below. A copy of this assessment data will be compiled in Part B of the Data-Based Instruction Plan. Additionally, participants will need to bring, or access, copies of this assessment data for the first meeting of the Data-Based Instruction Plan: Data Analysis Review and Plan Development Session.
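One possible layout for that table is sketched below. The student labels, dates, and scores are hypothetical, and the status labels follow the general DIBELS-style benchmark categories (At or Above Benchmark, Below Benchmark, Well Below Benchmark); participants should adapt the columns to whichever screeners they administer.

Student     Screener/Subtest                 Date Administered   Score     Benchmark Status
Student A   DIBELS 8 ORF (Grade 5, MOY)      [date]              82 WCPM   Below Benchmark
Student B   DIBELS 8 ORF (Grade 5, MOY)      [date]              61 WCPM   Well Below Benchmark
Student B   DIBELS 8 Maze (Grade 5, MOY)     [date]              8         Well Below Benchmark

A table like this makes it easy to compare each student’s performance across screeners and against the earlier benchmark data when developing the Data-Based Instruction Plan.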