Remote psychoeducational testing is here—and it’s valid

If you’re a district leader, chances are you’ve wrestled with the decision: Should you allow remote psychoeducational evaluations for special education eligibility?

On one hand, you’re facing unfilled school psychologist positions, ballooning referrals, and intense pressure to comply with timelines. On the other hand, you’re unsure if remote assessment is legally defensible, valid, or even ethical.

These aren’t just hypothetical concerns. They’re questions I hear every week in my work as a school psychologist and educational consultant. They used to be difficult to answer with confidence. But with an expanding body of research, we now have stronger answers backed by data.

In July 2025, I completed a large-scale national study comparing in-person and remote administration of the Woodcock-Johnson V Tests of Cognitive Abilities and Achievement (WJ V). Using a matched case-control design with 300 participants and 44 licensed school psychologists from across the U.S., the study found no statistically or practically significant difference in student scores between in-person and remote formats.

In other words, when conducted with fidelity, remote WJ V testing produces equivalent results to traditional in-person assessment.

This finding isn’t just important—it’s urgent.

Carefully structured environment

School psychology has been facing a workforce crisis for over a decade. A 2014 national study predicted major shortages, and that forecast has played out in real time.

Many districts are now relying on contracting agencies and remote service providers to stay afloat. At the same time, the demand for evaluations is climbing, fueled by post-COVID learning gaps, behavioral needs and the growing visibility of neurodivergent students.

In this context, remote assessment isn’t a novelty. It’s a survival tool. But concerns remain. Leaders wonder:

  • Will a hearing officer accept remote scores in a due process case?
  • Are students disadvantaged by the digital format?
  • Can we trust the results to guide placement and services?

These are valid concerns, but this is where research, implementation and documentation matter.

The WJ V study builds on prior research from 2016, 2017, and 2020, which also found score equivalency for remote administrations of the WJ IV COG and ACH, RIAS-2, and WISC-V assessments, respectively. Like those earlier studies, ours was designed around rigorous fidelity standards:

  • Touchscreen laptops with screens 13” or larger
  • A secure platform with embedded digital materials
  • Dual cameras to capture the student’s face and workspace
  • A guided proctor in-room with the student
  • Standardized examiner and proctor training protocols

This isn’t Zoom with a stopwatch. It’s a carefully structured environment that replicates traditional testing conditions as closely as possible.

When those fidelity conditions are met, the results hold up. Our findings showed p-values above .05 and effect sizes below .03 across all subtests: no statistically significant differences between formats, and differences far too small to be practically meaningful. This means schools can confidently use WJ V scores from remote testing, provided the setup adheres to best practices.
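For readers who want a concrete sense of what a p-value above .05 paired with a near-zero effect size looks like, here is a minimal sketch in Python using simulated scores, not the study’s actual data or its actual analysis. The variable names and numbers are illustrative only; they show how a matched-pairs comparison can be summarized with a paired t-test and Cohen’s d.

```python
# Illustrative sketch only: simulated scores, not data from the WJ V study.
# Shows how a matched-pairs comparison of in-person vs. remote standard scores
# can be summarized with a paired t-test (p-value) and Cohen's d (effect size).
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# Hypothetical matched pairs: the same construct measured in each format
in_person = rng.normal(loc=100, scale=15, size=300)        # standard scores (M=100, SD=15)
remote = in_person + rng.normal(loc=0, scale=3, size=300)  # near-identical remote scores

t_stat, p_value = stats.ttest_rel(in_person, remote)       # paired t-test

diffs = remote - in_person
cohens_d = diffs.mean() / diffs.std(ddof=1)                # standardized mean difference

print(f"p = {p_value:.3f}, Cohen's d = {cohens_d:.3f}")
# The pattern the article reports is a p-value above .05 together with an
# effect size near zero: no detectable or practically meaningful difference.
```

In this kind of summary, a non-significant p-value alone only says a difference wasn’t detected; it’s the combination with a very small effect size across every subtest that supports treating the two formats as interchangeable.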

A path toward better service

Let’s be clear: Remote assessment is not a free-for-all. Not every platform, provider or protocol will yield valid results. But trained examiners can manage remote testing variables in the same way they manage in-person variables every day.

The takeaway is this: Remote psychoeducational testing can be valid, reliable, and legally defensible if done right.

And beyond its validity, remote assessment opens up a strategic opportunity. Districts can now structure staffing more effectively.

When licensed remote school psychologists handle evaluations, onsite staff are freed to focus on preventive services—such as student mental health, behavior intervention, MTSS support, and crisis intervention—that are critical but often sidelined by heavy assessment loads.

This hybrid model isn’t just a workaround; it’s a path toward better service delivery and more sustainable roles for school psychologists.

Here’s what district leaders can do:

  • Vet providers carefully. Ask about their platform, equipment, training, and adherence to published fidelity standards. Find out about their experience delivering remote assessments and the technical support they can provide.
  • Clarify device requirements. Different assessments may require different equipment. Know what each assessment calls for, and make sure a working testing station is set up before the evaluation begins.
  • Build policies that define standards. Your administrative procedures should spell out expectations for remote testing conditions, so staff and contractors are aligned.

Remote testing opens doors for students in rural districts, for schools with unfilled psychologist positions and for families seeking timely evaluations. But it only works when we take it seriously and treat it with the same care as in-person assessments.

The student environment, test security, examiner training and the equipment used must all be carefully considered and guided by the available research.

The good news is we continue to have strong research to support that path. Our challenge now is to apply it wisely.

References

Castillo, J. M., Curtis, M. J., & Tan, S. Y. (2014). Personnel needs in school psychology: A 10-year follow-up study on predicted personnel shortages. Psychology in the Schools, 51(8), 832–849. https://doi.org/10.1002/pits.21786

Farmer, R. L., McGill, R. J., Dombrowski, S. C., & Lockwood, A. B. (2021). Cognitive and academic tele-assessment: A practical guide for school psychologists. Contemporary School Psychology, 25(1), 44–58. https://doi.org/10.1007/s40688-020-00325-9

Reynolds, C. R., & Kamphaus, R. W. (2015). Reynolds Intellectual Assessment Scales, Second Edition and the Reynolds Intellectual Screening Test (2nd ed.). PAR, Inc.

Schrank, F. A., McGrew, K. S., & Mather, N. (2025). Woodcock-Johnson V Tests of Cognitive Abilities and Tests of Achievement: Examiner’s manual. Riverside Insights.

Taylor, S. (2023). Equivalence of remote, online administration and traditional, face-to-face administration of psychoeducational assessments [White paper]. Presence. https://www.presence.com

Taylor, S., & Wright, A. J. (2023, October 19). Cognitive and academic teleassessment: A practical guide. PresenceLearning.

Wright, A. J. (2018). Essentials of tele-assessment. Wiley. https://doi.org/10.1002/9781119485075

Wright, A. J. (2020). Equivalence of remote, digital administration and traditional, in-person administration of the Wechsler Intelligence Scale for Children, Fifth Edition (WISC-V). Psychological Assessment, 32(9), 809–817. https://doi.org/10.1037/pas0000939

Stephanie Taylor
Stephanie Taylor is a school psychologist, researcher, and founder of Taylored Education Solutions. She has also worked as a special education director, state-level consultant and edtech leader. She can be reached at [email protected]
