It is a cumulative test that assesses a student's learning up to their current grade level.
It is LAGGING INDICATOR data...
"Lagging indicators serve a critical purpose, but lagging indicators focus on the outcomes of instruction that has already occurred—they don't tell us what is happening now. Leading indicators, on the other hand, are predictive in nature. These data points are frequent and formative, and they offer valuable information that can help [educators] to adjust and change course in the moment—if lagging indicators can be likened to an autopsy report, leading indicators are a patient's vital signs" (Data First and Douglas Reeves, http://leadered.com/pdf/ICLELeadingLaggingIndicatorsSB.pdf).
Report card data, together with the triangulation of data from OBSERVATIONS, CONVERSATIONS, and PRODUCTS, creates a more complete picture of student learning. Do all of the data points lead to similar conclusions about students' needs?
Let's take a look at Cheyne's and Burnt Elm's IIR (Item Information Report) from last year. Do you notice any trends? Areas of strength vs. areas of need? Use the sticky notes to keep track of your ideas, and we will debrief as a whole group.
What is the difference between a KNOWLEDGE AND UNDERSTANDING, APPLICATION, or THINKING question? Using the example questions, can you determine what type each one is?
What do you notice about OPEN RESPONSE vs. MULTIPLE CHOICE questions?
How are open response questions assessed?