It’s Not About the Size of the Pandemic Slide—It’s About Where to Start Teaching (Part I)

During a Crisis, You Have to Change the Definition of Success

Dear Colleagues,

Introduction

   The existence and impact of an academic “pandemic slide” has been a frequent topic in the popular and educational press since the beginning of the pandemic earlier this year.

   Related reports have documented that:

  • 4.4 million households with students still lack consistent access to a computer, and 3.7 million lack internet access. While schools provided computers to more than half of their households, only a fraction of these homes received devices to access the internet (September 28, 2020; USA Facts);
  • Many educators stated that their schools covered less material or no new instructional territory during the virtual Spring and early part of the new school year (August 20, 2020; EdWeek Market Brief);
  • Students’ virtual attendance last Spring was significantly down (for example, a majority of the 5,659 educators in one survey said that fewer than half of their students were attending), and this problem has continued this Fall (September 22, 2020; New York Times); and
  • Students have not mastered academic material at the expected grade levels this Fall, with math skills more affected than literacy skills.

_ _ _ _ _

   Critically, this last bullet is the primary focus of this two-part Blog Series.

   The Questions

   What does it mean for your district or school students—whether in a virtual or on-site classroom—to be “academically behind”?
   How and where (in the curriculum) should you teach each of your students?

   These are the critical planning questions to answer right now as we approach January and the second semester of the school year.

   They are important because some schools will welcome their students on-site for the first time this year beginning in January, while other schools—that have taught their students in full-time on-site pods, or using half-time hybrid schedules—will continue their second semester journeys.

   Either way, all schools nationwide need to “take stock” of their students’ current academic status and progress now so that they can prepare for the instructional changes that may be needed this January.

   Ultimately:

The point of this two-part Blog Series is that educators need to be (a) less concerned about a pandemic slide that focuses on where students “should be” academically in their classes or courses; and (b) more concerned about determining what academic skills students have learned, mastered, and are able to apply, and how to teach from there.

   In today’s message, we will address these points by discussing (a) the “etiology” of the pandemic slide—and how it needs to be quantified, validated, and analyzed so that it can be linked to the best services, supports, and interventions; and (b) how to coordinate an assessment process to determine what academic content and skills students have mastered.

   In Part II of this Blog Series, we will address (c) how to organize different-achieving students into instructional groups; and (d) how to integrate remediation, core instruction, and acceleration for these different students.

   As is evident from the title of this piece: When in a crisis, the focus is to manage and stabilize the crisis. This often requires a change in the “definition of success.”

Educationally, these are challenging times. We need to adapt to the academic impact of the pandemic by focusing on quality instruction and student learning. It makes no sense to be burdened or dictated to by academic standards and outcomes that are no longer viable.

   We can facilitate student learning and mastery. . . by maintaining high and reasonable expectations.

_ _ _ _ _ _ _ _ _ _

The Etiology of the “Pandemic Slide”

   Words are powerful.

   The phrase “pandemic slide” has—in a few short months—become an “accepted” part of our educational lexicon. It has been discussed without specification and validation. It is already being marketed by vendors trying to sell their assessment and intervention products. And it has created anxiety among administrators and teachers.

   Right now, if we are going to functionally focus on the pandemic slide in schools nationwide, it needs to be quantified for every student, validated as the condition underlying a student’s current academic skill gap, and analyzed relative to needed services, supports, or interventions.

Quantifying and Validating the Pandemic Slide

   On a conservative level, an academic gap (or slide) that is due to the pandemic is evident when there is a change between an individual student’s learning curve before the pandemic began and his or her learning curve during this ongoing pandemic.

   That is, if a student’s speed or rate of learning and mastery is significantly lower in a specific academic area during the past five instructional months (i.e., March, April, May, September, and October) than during the one, two, or three years before, we could reasonably conclude (other factors or events aside) that the student has learned less (or “has lost” previous learning) during the pandemic.

   [Note that to be even more precise, we would also factor a student’s typical “summer slide” into the analysis—but let’s keep this practical.]

   Example 1. If a typical student was making 10 months of pre-pandemic progress in literacy for every 10 months in school (a 1.0 rate), then a “pandemic slide” might be evident if they had made 3 months of progress in the five months (i.e., March, April, May, September, and October) since the pandemic began (a 0.6 rate).

   Example 2. If a “more challenged” student was making 8 months of pre-pandemic academic progress in literacy for 10 months in school (a 0.8 rate), a pandemic slide might be evident if they had made 2 months of progress in the five months since the pandemic began (a 0.4 rate).

   Example 3. If a different student was making 10 months of pre-pandemic progress in literacy for every 10 months in school (a 1.0 rate), and continued to make this same 1.0 rate of progress during the past five pandemic months, we might conclude that there was either no impact due to the pandemic or that the student was compensating for any impacts.

   NOTE that an implicit assumption in these examples is (a) that the academic assessment tools being used and the resulting progress monitoring data are valid and are accurately measuring student learning and progress; and (b) that the outcome data from the assessment tools reflect the same learning skills and content as in the curricular material taught in the classroom.
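   To make the rate arithmetic in the examples above concrete, here is a minimal sketch in Python. The function names and the 0.8 “meaningful drop” multiplier are illustrative assumptions, not validated cut points; a district would substitute its own progress-monitoring data and decision criteria.

```python
# A minimal sketch of the learning-rate comparison in the examples above.
# All numbers are hypothetical; real analyses would rely on a district's
# validated progress-monitoring data.

def learning_rate(months_of_progress: float, months_in_school: float) -> float:
    """Months of academic progress per month of instruction."""
    return months_of_progress / months_in_school

def shows_slide(pre_rate: float, pandemic_rate: float,
                drop_threshold: float = 0.8) -> bool:
    """Flag a possible pandemic slide when the pandemic-era rate falls
    well below the student's own pre-pandemic rate. The 0.8 multiplier
    is an illustrative assumption, not a validated cut point."""
    return pandemic_rate < pre_rate * drop_threshold

# Example 1: 10 months of progress per 10 months pre-pandemic (a 1.0 rate),
# then 3 months of progress across the 5 pandemic months (a 0.6 rate).
pre = learning_rate(10, 10)      # 1.0
during = learning_rate(3, 5)     # 0.6
print(shows_slide(pre, during))  # True -> possible pandemic slide
```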

_ _ _ _ _

Analyzing Academic Gaps to Prepare for Intervention

   If academic gaps exist due to the pandemic, and if we want to effectively close these gaps, we still need to determine the reason or reasons why (i.e., the root cause) the gaps exist.

   More specifically, the pandemic itself has not directly caused any valid, existing gaps; rather, the pandemic has created one or more conditions that have caused them.

   Among the most likely reasons for an existing pandemic-related academic gap are the following:

  • Student (virtual and/or on-site) attendance, academic engagement, work assignment understanding and completion, motivation, ability to work and learn independently, ability to learn on a virtual or computer-based platform, pandemic-related emotional or mental health reactions, and/or access to a reliable computer and internet connection.
  • Curriculum that has been cut down or truncated due to time; that is not well-designed for home- or self-instruction, or virtual teaching; that is not aligned to district and/or state standards and learning objectives; that does not provide effective prerequisite skill sequencing or scaffolding.
  • Instruction (virtual and/or on-site) that has not occurred or is not developmentally or pedagogically sound; that is delivered without understanding the pandemic’s functional impact on students’ social, emotional, or behavioral status or interactions; that involves students with such varying prerequisite and existing skill levels that differentiation cannot be successful; that cannot be effectively modified to a virtual platform.

   Critically, in order to identify and provide the services, supports, instruction, or intervention that will close existing academic gaps, educators need to know exactly why the gap occurred, and then they need to link the reason to the approach(es) that will address it.

   Example 1. If the gap is because students were never taught (or effectively taught) the prerequisite skills for the instructional unit or skills now being presented, the current teacher may need to teach or re-teach the prerequisites needed as part of the new unit.

   Example 2. If the gap is because students do not have (and can learn) the (metacognitive or other) skills needed to learn in a virtual environment, then these skills need to be taught and applied to curriculum and instruction.

   Example 3. If the gap is because (previous) teachers have been asked to differentiate instruction for too many students functioning at too many different skill levels, then the staff may need to reorganize the assignment of students into specific differentiated instructional groups.

   As a rule of thumb, if students—who demonstrated average or expected rates of learning before the pandemic—are now functioning below grade-level academically due to pandemic-related conditions that influenced the quality of instruction or their opportunity to learn, they need to be taught (a) at their current functional, mastery levels—for skills that need to be learned sequentially, and (b) at their current grade-placement levels—when missed skills can be integrated into the grade-level content and curriculum.

   For example, as many mathematical skills are sequentially taught and dependent on students’ mastery of prerequisite skills, if students have missed six months of quality instruction and learning, instruction may need to “drop back” and start at the last levels of student mastery.

   In contrast, for middle or high school English students, vocabulary and comprehension skills are learned more experientially, as students are exposed to different genres of texts, stories, and novels. Here, missed vocabulary and comprehension skills can be “made up” by strategically choosing stories or novels that best address existing skill gaps.

   Said in a different, “old school” way: if students do not get to Great Expectations because of the pandemic, then they miss that experience. However, the vocabulary, comprehension, plot analysis, and compare-and-contrast skills that may have been taught during Great Expectations can be embedded into the instruction of a different novel.

   In summary, the pandemic is an event that has triggered (as above—for certain students, for different reasons) specific academic skill gaps. This event is similar to having a student who was unable to attend school for eight to twelve months due to a hospitalization after a car accident.

   In order to best teach students who have missed sequentially-dependent skills, instruction should begin at their points of current skill mastery. When embedded or contextually-dependent skills have been missed, educators—for example, in English, science, and history—will need to strategically select which areas of their curricula to emphasize and teach, and how to integrate missed skills and content into their instruction.

   Where students are “supposed to be” is largely irrelevant from a pedagogical perspective right now.

_ _ _ _ _

Determining Students’ Academic Skill Mastery

   Ultimately, as above, a district or school’s instructional plans for January hinge on valid assessments that tell them what content and skills students have learned and mastered in literacy, mathematics, writing/language arts, and science.

   The results of these assessments help identify (a) the current functional, instructional skill levels, for each student, in these core academic areas; (b) the prerequisite skills that students can build on as they progress to the next scaffolded skill, content area, or unit in their school’s scope and sequence or curricular map; and (c) where teachers should start the teaching process.

   In order to facilitate this planning for January, however, the needed assessments should be completed within the next two to three weeks, and meetings to analyze and use the results should be held in early December.

   Based on the collected data, these meetings should focus on (re)organizing students into the best instructional groups in the second semester’s core academic areas, and determining students’ pods or class assignments, schedules, and access to specific teachers and needed support services.

   NOTE that the prevailing assumption here is that the instruction for most students in January should be guided by their pandemic-specific levels of academic skill mastery—unless there were pre-pandemic learning conditions (e.g., related to student disabilities) that still need to be addressed.

_ _ _ _ _

A Blueprint for Assessing Students’ Current Academic Skills

   In order to accomplish the assessment, instructional grouping, and multi-tiered service and support goals related to students with confirmed pandemic slides, districts and schools need to:

  • Functionally and validly assess all of their students in literacy, mathematics, science, and writing/language arts;
  • Integrate and “StoryBoard” the results (see Part II next time) in each academic area to determine the best ways to group students instructionally to maximize the impact of effective teaching and differentiation;
  • Identify and plan for the students who need multi-tiered services, supports, and interventions;
  • Align the StoryBoard with staff and resources (including Intervention Specialists, paraprofessionals, computer-assisted instruction and intervention, after-school tutoring, etc.). . . with an eye toward student equity;
  • Factor the results into (a modification of) the school’s second semester schedule and logistics; and
  • Evaluate the decisions on an ongoing basis, making “mid-course” grouping, scheduling, and/or logistical changes as needed.

   Critically, nothing in this blueprint is new. It simply needs to be adapted to how districts plan to educate their respective students in January—that is, using a hybrid, all-virtual, or all on-site approach.

_ _ _ _ _

Choosing/Developing the Academic Assessments

   Relative to organizing the assessment process:

  • Districts or schools should identify the Power or Anchor Standards in literacy, mathematics, science, and writing/language arts at each grade level that are most essential for students to learn and that, typically, represent the foundational or prerequisite content or skills for the next level of learning in the curriculum, scope and sequence, or course progression.

Many states and publishers have already done this. . . so districts do not need to reinvent this wheel. . . they only need to research and tap into these existing templates.

In fact, some state departments of education (e.g., Arizona, Arkansas, and others) have revisited this process, creating COVID-19 “academic playbooks” and “unit plans.”

_ _ _ _ _

  • From the respective Power or Anchor Standards, districts or schools should identify the knowledge, content, information, and skill-specific test item specifications needed in each grade level’s academic literacy, mathematics, science, and writing/language arts assessments that will most accurately evaluate students’ current functional learning and mastery status.

Once again, many state departments of education (e.g., Florida, California, Indiana), publishers of different academic curricula (that include formative assessments), and academic assessment vendors (e.g., NWEA, Edmentum, STAR, i-Ready, Istation) have already completed this task for districts and schools.

_ _ _ _ _

  • Districts or schools should either create (e.g., from an existing test bank) Adaptive Assessment Tests in each academic area, or adapt the vendor- or publisher-provided Adaptive Assessment Tests they are currently using—ensuring that their items are consistent with the test item specifications generated in the step above.

Guided by the chosen Power or Anchor Standards, this step is essential to a valid assessment of each student’s current functional skill level in each academic area.

As an FYI: Online Adaptive Assessment Tests utilize computer-programmed algorithms that determine which test items follow already-answered items based on students’ correct or incorrect responses. This allows students to be assessed across different grade levels when they are functioning either above or below their current grade-level placement.

Adaptive Assessment Test results typically determine where students are academically functioning—regardless of their grade-level placements—as well as the specific skills that they have mastered or not mastered. For example, the test results might report that a fifth-grade student is functioning at the middle of Third Grade level in phonics skills, and at the end of Second Grade level in vocabulary and comprehension skills. The specific phonics skills mastered or not mastered would also be specified.
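For readers curious about the branching idea, here is a minimal sketch of a simple “staircase” adaptive rule, assuming item difficulty is expressed as a decimal grade level. Commercial Adaptive Assessment Tests use far more sophisticated item-response-theory models; this sketch only illustrates the up-on-correct, down-on-incorrect logic described above.

```python
# A minimal "staircase" sketch of adaptive branching (not any vendor's
# actual algorithm). Item difficulty is expressed as a decimal grade
# level; the next item moves up after a correct answer and down after
# an incorrect one.

def run_adaptive_test(responses, start_level: float, step: float = 0.5) -> float:
    """Adjust item difficulty after each response (True = correct,
    False = incorrect) and return the final estimated level."""
    level = start_level
    for correct in responses:
        level += step if correct else -step
    return level

# Hypothetical fifth grader who misses most items: the test "drops back"
# below grade placement to locate the student's functional level.
print(run_adaptive_test([False, False, True, False, True, False], 5.0))  # 4.0
```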

_ _ _ _ _

  • Finally, districts and schools should complement the Adaptive Assessment Test results with, for example, formal and informal classroom- and curriculum-based teacher assessments, independent student work assignments and samples, and in-class portfolios, projects, or tests.

_ _ _ _ _

   Throughout this assessment process, students will need to be closely supervised to ensure that (a) the assessment tool accurately evaluates their skills; and (b) they have participated with their full attention and motivation.

   Relative to the former area, we find that students often perform higher on paper-and-pencil assessments than on computer-assisted assessments. Moreover, when reading long test passages, they will more often return to an earlier page of text when using hard-copy materials than return to an earlier screen when assessed on a computer. Schools need to be careful when using computer-assisted assessments, as they may be sacrificing assessment validity for assessment convenience.

_ _ _ _ _

Pooling the Academic Assessment Results

   Recognizing that different assessments serve different purposes, the goal here is to organize students—based on the completed literacy, mathematics, science, and writing/language arts assessments—into broad clusters that represent their current mastery and functional knowledge and skill levels. Once again, this is all driven by the Power or Anchor Standards at each grade level for each academic area.

   To facilitate this goal, students can be organized into clusters where they are identified as functioning above, at, below, or well-below their current grade-level placements. For high school students, grade-level placements may need to be cross-walked with specific academic courses or course levels where specific content or skills are taught.

   To enhance the specificity of the grade-level placement clusters above, it is recommended that school staff also pool the available quantitative and qualitative academic data into a specific “Grade-Level Equivalent Skill Summary Score” for each student in each academic area.

  • The quantitative data will be a standard score, percentile, or grade-level equivalent score from the Adaptive Assessment Tests discussed above.

Here, students can be organized into clusters based on whether they scored (a) above their current grade placements—more than 1.5 standard deviations above the assessment test mean; (b) at their current grade placements—between -1.0 and +1.5 standard deviations on the assessment test; (c) below their grade-level placements—between -2.0 and -1.0 standard deviations on the assessment test; or (d) well-below their grade-level placements—below -2.0 standard deviations on the assessment test. (A minimal sketch of these cut points follows this bullet’s discussion.)

Some Adaptive Assessment Tests also provide grade-level equivalents that indicate that a student is functioning, for example, at the middle of Third Grade in phonetic decoding, the beginning of Third Grade in vocabulary, and the end of Second Grade in comprehension—resulting in a beginning of Third Grade “total reading score.”

The Adaptive Assessment Test results above should be integrated with other quantitative data collected by teachers from classroom- or curriculum-based assessments. They also can be compared with students’ past interim assessment or state benchmark proficiency results—although most of these latter results are dated and do not factor in the impact of the pandemic.
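As referenced above, here is a minimal sketch of the four standard-deviation cut points, assuming a student’s score has already been converted to a z-score (standard deviations above or below the assessment test mean).

```python
# A minimal sketch of the standard-deviation cut points listed above.
# The z-score would come from an Adaptive Assessment Test's normed results.

def cluster(z_score: float) -> str:
    """Map a student's z-score to one of the four grade-placement clusters."""
    if z_score > 1.5:
        return "above grade placement"
    if z_score >= -1.0:
        return "at grade placement"
    if z_score >= -2.0:
        return "below grade placement"
    return "well below grade placement"

for z in (1.8, 0.2, -1.4, -2.3):   # four hypothetical students
    print(z, cluster(z))
```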

_ _ _ _ _

  • The qualitative data, as noted above, include teachers’ formal and informal observations of students’ classroom, curriculum-based learning as reflected in their involvement and participation during instruction, interactions during cooperative group or project-based activities, independent assignments and work samples completed, and in-class portfolios, projects, or tests.

Additional qualitative analyses can be conducted by looking at the Adaptive Assessment Test results and determining how many Power or Anchor Standards a specific student has mastered at each grade level.

For example, in math, a specific sixth-grade student in the Fall may have mastered all of the Power Standards through Fourth Grade, four of eight Grade Five Power Standards, and one of eight Grade Six Power Standards. Qualitatively, we might estimate this student’s current functional skill level to be at the middle of Fifth Grade in math (see the sketch below).

All of this qualitative information can be supplemented by reports from teachers who taught specific students in previous school years.
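To illustrate the arithmetic behind the sixth-grade math example above, here is a minimal sketch. The estimation rule and the mapping of decimal scores to Beginning/Middle/End labels are illustrative assumptions, not a published psychometric procedure.

```python
# A minimal sketch of the qualitative grade-level estimate described above.
# The single mastered Grade Six standard is ignored in this rough estimate.

def estimate_gle(highest_mastered_grade: int, next_grade_mastered: int,
                 next_grade_total: int) -> float:
    """Start at the beginning of the first incompletely mastered grade,
    then credit the fraction of that grade's Power Standards mastered."""
    return (highest_mastered_grade + 1) + next_grade_mastered / next_grade_total

def label(gle: float) -> str:
    """Map a decimal grade-level equivalent to Beginning/Middle/End
    (an illustrative convention, not a psychometric standard)."""
    grade, part = int(gle), gle - int(gle)
    if part < 1 / 3:
        return f"Beginning of Grade {grade}"
    if part < 2 / 3:
        return f"Middle of Grade {grade}"
    return f"End of Grade {grade}"

# The sixth grader above: all standards through Grade 4, 4 of 8 in Grade 5.
gle = estimate_gle(4, 4, 8)   # 5.5
print(label(gle))             # "Middle of Grade 5"
```

The same Beginning/Middle/End labeling applies to the pooled “Grade-Level Equivalent Skill Summary Score” discussed next.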

_ _ _ _ _

   Ultimately, all of the quantitative and qualitative data (leaning most heavily on the quantitative and, especially, the Adaptive Assessment Test data) can be pooled into a “Grade-Level Equivalent Skill Summary Score.”

   While not the most sophisticated psychometric metric, this Summary Score categorizes a student’s current functional skills in a specific academic area as at the Beginning, Middle, or End of a specific Grade-Level Equivalent.

   Thus, for example, a student at the beginning of her Seventh-Grade year might be functioning at the End of Fifth Grade level in Reading, the Middle of Sixth Grade level in Math, and the Beginning of Fifth Grade level in writing/language arts.

   This Grade-Level Equivalent Skill Summary Score can be compared with the district’s scope and sequence or curricular pacing charts to determine where curriculum and instruction might begin for specific students.

_ _ _ _ _ _ _ _ _ _

Summary

   The phrase “pandemic slide” has—in a few short months—become an “accepted” part of our educational lexicon. It has been discussed without specification and validation. It is already being marketed by vendors trying to sell their assessment and intervention products. And it has created anxiety among administrators and teachers.

   If educators are going to evaluate and address the existence of a pandemic slide in their district or school(s), it needs to be quantified for every student, validated as the reason for a student’s current academic skill gap(s), and analyzed relative to needed services, supports, or interventions.

   But even more critically, now is the perfect time to evaluate the academic status and progress of all students in literacy, mathematics, science, and writing/language arts so that data-informed planning can occur for this coming January and the start of the new semester.

   This is especially important during this pandemic as some schools will welcome their students on-site for the first time this year beginning in January, while other schools—that have taught their students in full-time on-site pods, or using half-time hybrid schedules—need information to continue their second semester journeys.

   In today’s message, Part I of a two-part Blog Series, we addressed (a) the “etiology” of the pandemic slide—and how it needs to be quantified, validated, and analyzed so that students’ learning gaps can be linked to the best services, supports, and interventions; and (b) how to coordinate an assessment process that determines students’ current functional skill levels, and what academic content and skills they have mastered or not mastered.

   In Part II of this Blog Series, we will address (c) how to organize different-achieving students into instructional groups; and (d) how to integrate remediation, core instruction, and acceleration for these different students.

   Across both parts, we are emphasizing that educators need to be less concerned about a pandemic slide that focuses on where students “should be” academically in their classes or courses; and more concerned about determining what academic skills students have learned, mastered, and are able to apply—and how to teach from there.

   We are also emphasizing, especially during these challenging times, the need to adapt to the academic impact of the pandemic by focusing on quality instruction and student learning. Indeed, it makes no sense to be burdened or dictated to by academic standards, expectations, and outcomes that are not viable right now.

_ _ _ _ _

   While we are only half-way through this two-part Series, we anticipate that many will say that these ideas—while research-based and field-tested across the country—are not realistic given the available time, resources, schedules, and even expertise.

   Our respectful response is:

  • We need to know and be guided by research-to-practice blueprints first as we approach our students academically this coming January. If we don’t know the blueprints, then we don’t know how close we can come to these blueprints given the available time, resources, schedules, and expertise in our districts and/or schools.

Moreover, if we don’t know the research-to-practice blueprints, we will (in essence) be playing “instructional roulette” with our students’ futures. This puts our students (and staff) at-risk, and increases the probability that our results will be underwhelming.

_ _ _ _ _

  • If we don’t instructionally program our students for academic success, then academic frustration and related social, emotional, and behavioral problems—beyond where these students are now—have a high probability of emerging.

These problems will then (further) undermine these (and other) students’ academic engagement and progress, and this may initiate a vicious cycle.

The result is that our students will be further behind—both academically and behaviorally—than when we started.

_ _ _ _ _

  • We need to “go slow to go fast.”

That is, in the absence of valid data, schools may be assuming that they know where students are functioning right now academically, and how much learning progress or loss they have made since March. . . and then, since the beginning of this school year.

At this point in the school year, schools need to re-validate the pods or instructional groups where students are now learning. Perhaps a different instructional group or differentiated learning approach in January will make all the difference in where students are in June?

Academically, we need to accurately, and in a measured way (no pun intended), determine who is ahead, who is progressing, and who is behind. . . and how far they are behind.

If this takes a little more assessment time to get right. . . so that students are assigned to the right classes with the right curriculum and instruction levels. . . this time will be well-invested.

_ _ _ _ _

   As always, I appreciate your ongoing support in reading this Blog.  I hope that you, your colleagues, your students, and your families are safe and healthy.

   If you have comments or questions, please contact me at your convenience. 

   And please feel free to take advantage of my standing offer for a free, one-hour conference call consultation with you and your team at any time.

Best,

Howie