Wednesday, February 10, 2010

5 Lessons on Using Interim Assessments to TEACH

Over at Inside School Research, smartypants Debra Viadero tells us about a study with potentially big implications for teacher practice and policy. (And I mean smartypants in a kind, "thinking really hard" kind of way, not a teasing, "think you know it all" sort of way -- as I often end up explaining to my kiddoes when I use the term affectionately and they give me a confused look.) So smartypants Viadero explains how many teachers don't use interim assessments to change instruction. Interim assessments are tests developed by testing companies, states, and districts, rather than by teachers themselves, and given during the year before the final, sometimes high-stakes test. In DC, I believe interim assessments might look like DIBELS/Text Reading Comprehension (TRC) for lower grades, and the DC-BAS for upper elementary. So the Consortium for Policy Research in Education (CPRE) studied teachers' use of these tests and found that teachers were getting good information about students from them, but not using that data to change instruction. Importantly, this non-application wasn't for any lack of teacher motivation. (After all, there are some pretty high consequences for some of US if our kids don't score better on those tests by the end of the year!) Instead, Viadero reports, "Often the disconnect came because teachers weren't given the know-how, the time, or the resources to figure out how to address students' knowledge gaps or because the assessments weren't aligned with the curriculum."

As a teacher, I've often used interim assessments less than effectively. But since we know a lot about what doesn't work, I'm going to share a story about a semester in which my colleagues and I DID use interim assessments to change our teaching. As we go, I'll try to pull out some principles that led to our success.

  1. Streamlined technology - Our story starts with a Palm Pilot. When my school began using DIBELS/TRC, each teacher received a Palm Pilot and training on how to use it. When we gave our students the first interim assessment, we recorded each student's answers directly in the Palm as the students spoke. It took a long time to get through all the students one-on-one, but there was a nice payoff: macro- and micro-level details on our students' performance, all neatly organized on the website, designed by Wireless Generation. There, I could (and did) click to see a class-level bar graph of where my students scored on various skills. I could also go through each child's answers to get a finer level of detail on any one student.

  2. Relevant training - Once we had all this data, we teachers were trained on how to USE it. DCPS and our school leaders arranged (and paid!) for a former teacher/ current DIBELS tech guru from Wireless Generation to come to our school and train us more. (Note: Each of these trainings took TIME. School leaders arranged coverage for our classes so we could be trained for two full afternoons.) It was a big investment of time, but it was worth it. The Wireless Generation trainer hit us right at our Zone of Proximal Development, asking us what we had done and what we knew, figuring out we had mastered the mechanics of giving the assessment, and moving on to how to use the data to plan instruction. She walked us through a process for analyzing the data by prioritized skills, grouping our students into small groups for instruction, and accessing supplementary curriculum resources to meet the needs identified by the assessments. She also left us with concrete tools (planning templates and web site recommendations) for replicating those processes with our students throughout the year.

  3. Planning meetings aligned with training - So after all that good training and test-giving, we had our student data, we had our teacher knowledge -- but that doesn't mean we had the time to use it. By this point in the year, we teachers had done a second round of interim assessments on our students. So, once we had our mid-year student data, our literacy coach worked with us during our planning time. In these meetings, each teacher grouped his or her students as the Wireless Generation training had taught us, using the same tools we had been trained with. After that meeting, I met with my students in different groups, focusing the small group lessons on different skills, since I had more recent data about what they knew and what else they needed. (Note to school leaders: Please use these kinds of planning meetings selectively: planning/prep time in which we teachers set our own agendas is also very, very important for successful delivery of instruction.)

  4. Easy-to-use curriculum and remediation resources - Full disclosure: this is where I think my school and I have the longest way to go. But we have made important strides which I think are worth sharing. When kids aren't getting a key skill, chances are teachers have already taught that skill as their primary curriculum prescribes, but the kids need something more. My school shared two important resources -- and the "What's Next?" section of the web site -- which provide already-written lessons on skills aligned to the DIBELS data. Although these resources are not perfect, they provided my colleagues and me an accessible starting point for changing our instruction.

  5. Frequent mini-assessments - Interim assessments often take a lot of time to give and analyze. A strength of DIBELS (and, to a lesser degree, TRC) is its progress monitoring structure. Progress monitoring is a quick way to collect some of the data from the full interim assessment on some of the students (usually, the students who are most likely to struggle). Currently, my fellow teachers and I (with a little coaching and pushing from our school and district leaders) give quick mini-assessments to struggling kids every 1-2 weeks, and can view and analyze the new data on the website. Once teachers get good at using data to change instruction and have the time and resources to do it, we want to do it more than once or twice a year. Young children change so rapidly, and we want to meet them as close to where they are NOW as we can -- not where they were two months ago.

So, in the end, what did all this training and planning and assessing look like for students? It looked like skipping lessons in my phonics curriculum that I knew 80% or more of the kids had already mastered. It looked like my assistant teacher and I meeting with the small groups of students who were not making progress 8 times a week rather than 3. It looked like changing the way I was teaching 1:1 correspondence (pointing to words one at a time while reading) because my students were better at it than I thought -- they just weren't using it in new situations. And, *I hope* it will look like my students becoming better readers, writers, and thinkers than they would have been without these changes in instruction.
