
Research Questions

  1. To what extent can older assessment information ameliorate the problem of missing test data for course placements?
  2. To what extent do student and school characteristics influence the consistency of using older data in course placement strategies?

The novel coronavirus disease 2019 (COVID-19) pandemic has created an unprecedented set of obstacles for schools and exacerbated existing structural inequalities in public education. In spring 2020, as schools shifted to remote instruction or closed completely, end-of-year assessment programs ground to a halt. As a result, schools began the 2020–2021 school year without student assessment data, which typically play a role in selecting students for specialized programming or placing students into courses. Although conceptual research has emerged to support school and district decisionmaking regarding assessment during the pandemic, there has been relatively little empirical research to guide schools and districts in handling the pandemic's impacts on the availability and interpretability of assessment data.

To address this gap, the authors of this report provide empirical evidence to inform schools' and districts' approaches to course placement in the absence of end-of-year assessment data. The authors compare and contrast three potential strategies that use older assessment data to estimate missing test scores: simple replacement, regression-based replacement, and multiple replacement. The authors examine the ways in which the pandemic may have influenced the consistency of decisionmaking under these strategies and the extent to which these strategies work equally well for all students, regardless of student race and ethnicity or school poverty. They also discuss these strategies' implications for schools and districts.
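The three strategies can be sketched in code. This is a minimal, hypothetical illustration, not the report's actual implementation: the simulated scores, the 30 percent missingness rate, and the number of imputation draws are all illustrative assumptions.

```python
# Illustrative sketch of the three replacement strategies (all data simulated;
# parameters are assumptions for demonstration, not values from the report).
import numpy as np

rng = np.random.default_rng(0)

# Simulated district: prior-year scores and partially missing current scores.
n = 500
prior = rng.normal(500, 50, n)
current = 0.8 * prior + 100 + rng.normal(0, 20, n)
observed = rng.random(n) > 0.3            # assume 30% of current scores are missing
cur_obs = np.where(observed, current, np.nan)

# 1) Simple replacement: reuse the older score as-is for untested students.
simple = np.where(observed, cur_obs, prior)

# 2) Regression-based replacement: predict missing scores from prior scores
#    using a line fit on the students who were tested.
b, a = np.polyfit(prior[observed], cur_obs[observed], 1)
regression = np.where(observed, cur_obs, a + b * prior)

# 3) Multiple replacement (multiple imputation): draw several plausible
#    scores per student by adding residual noise to the regression prediction.
resid_sd = np.std(cur_obs[observed] - (a + b * prior[observed]))
draws = [np.where(observed, cur_obs, a + b * prior + rng.normal(0, resid_sd, n))
         for _ in range(5)]
```

Comparing placement decisions made from `simple`, `regression`, and each imputed dataset in `draws` against decisions made from the true scores is the kind of consistency check the report's analysis performs.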

Key Findings

Consistent course placement decisions can be made using all three of the replacement strategies, although much depends on the district context

  • Regression-based replacement methods improve as the number of tested students increases; the districts with the least-consistent placement recommendations had fewer than 250 tested students.
  • Simple replacement strategies might result in high rates of misclassification for students who have significant changes in achievement or content mastery between testing periods.
  • Students whose actual scores are just above or just below the criterion for course recommendations are likely to end up with different recommendations if their replacement scores differ even slightly from their actual scores.
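The sensitivity near the criterion in the last bullet can be shown with a toy example. The cutoff value and the student scores below are hypothetical, chosen only to make the flipping behavior visible.

```python
# Hypothetical example: a small gap between a replacement score and the actual
# score flips the recommendation only for students near the placement cutoff.
import numpy as np

CUTOFF = 600                                       # assumed placement criterion
actual = np.array([550, 598, 601, 650])
replacement = actual + np.array([4, 4, -4, -4])    # small estimation errors

placed_actual = actual >= CUTOFF
placed_repl = replacement >= CUTOFF
flipped = placed_actual != placed_repl
# Only the two students within a few points of the cutoff change recommendation.
```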

Assuming average school quality can be problematic for course placement decisions that are based on regression-based methods

  • Because school quality varies considerably, such an assumption can have the effect of systematically overestimating some students' future achievement and underestimating other students' future achievement.
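A small simulation illustrates why assuming average school quality is problematic. The two-school setup and effect sizes below are invented for demonstration; the point is only that a pooled regression with no school term pushes every prediction toward the average school.

```python
# Hypothetical illustration: fitting one regression that ignores school-level
# differences systematically biases predictions by school (all data simulated).
import numpy as np

rng = np.random.default_rng(1)
n_per = 200
prior = rng.normal(500, 50, 2 * n_per)
school = np.repeat([0, 1], n_per)                # two schools of differing "quality"
effect = np.where(school == 0, -15.0, 15.0)      # assumed school contribution to growth
current = 0.8 * prior + 100 + effect + rng.normal(0, 10, 2 * n_per)

# A pooled regression with no school term assumes average quality for everyone.
b, a = np.polyfit(prior, current, 1)
pred = a + b * prior

bias_school0 = np.mean(pred[school == 0] - current[school == 0])  # positive: overestimates
bias_school1 = np.mean(pred[school == 1] - current[school == 1])  # negative: underestimates
```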

There is evidence of differential method performance based on student race and ethnicity and school poverty

  • The simple replacement and multiple replacement methods do not seem to induce bias based on student demographics or school poverty, but the regression-based methods do.
  • The regression models that do not include school prior achievement systematically underestimate scores for white and Asian students and systematically overestimate scores for black and Hispanic students.
  • Regression-based methods tend to overestimate scores for students in high-poverty contexts and underestimate scores for their more advantaged peers.

Table of Contents

  • Chapter One

    Introduction

  • Chapter Two

    Study Background and Context

  • Chapter Three

    Study Approach

  • Chapter Four

    Results

  • Chapter Five

    Implications for Decisionmaking in the Absence of Large-Scale Test Scores

  • Appendix A

    Data Sources for This Report

  • Appendix B

    Analytic Methods

  • Appendix C

    Complete Results

Research conducted by

The research reported here was sponsored by the Institute of Education Sciences, U.S. Department of Education, and conducted by RAND Education and Labor.

This report is part of the RAND Corporation Research report series. RAND reports present research findings and objective analysis that address the challenges facing the public and private sectors. All RAND reports undergo rigorous peer review to ensure high standards for research quality and objectivity.

This document and trademark(s) contained herein are protected by law. This representation of RAND intellectual property is provided for noncommercial use only. Unauthorized posting of this publication online is prohibited; linking directly to this product page is encouraged. Permission is required from RAND to reproduce, or reuse in another form, any of its research documents for commercial purposes. For information on reprint and reuse permissions, please visit www.rand.org/pubs/permissions.

The RAND Corporation is a nonprofit institution that helps improve policy and decisionmaking through research and analysis. RAND's publications do not necessarily reflect the opinions of its research clients and sponsors.