Every Student Succeeds Act Peer Review: Be More Consistent, Not Less Rigorous

Accountability

July 13, 2017

by Charles Barone and Dana Laurens

Feeling confused lately by the Every Student Succeeds Act peer review process? That’s perfectly understandable. Writers, in both the trade and mainstream press, vary greatly in what they think is going on at the US Department of Education (USDOE).

The most widely held, or at least the most loudly voiced, opinion is that USDOE is being too tough on states and overly rigid. Last Friday, a New York Times headline blared, “DeVos’s Hard Line on New Education Law Surprises States,” with the article claiming that the Secretary of Education “has signaled a surprisingly hard-line approach to carrying out an expansive new federal education law, issuing critical feedback that has rattled state school chiefs and conservative education experts alike.”

Similarly, a dispatch Tuesday from Politico Pro’s Education Whiteboard (check your inbox) announced that the House Education and Workforce Committee will hold a hearing next Tuesday, July 18, on “state and local efforts to implement the Every Student Succeeds Act” and noted that “The department has faced criticism for what some see as a narrow reading of the law by the department in the feedback of the nine plans that have been made public so far.” Rick Hess likens, somewhat satirically, USDOE’s administration of ESSA to the Politburo’s oversight of 1950s Soviet farm policy.

Perhaps those who desire more limited federal authority over state and local education shouldn’t have been so quick to support, actively or passively, the repeal of the Obama Administration’s accountability regulations earlier this year. The USDOE, in providing comments via peer reviewers on state accountability plans, is simply doing its job of enforcing the law (which is quite detailed as to how USDOE must conduct peer review, among other things) as it best interprets it, without the benefit of accompanying regulations. In fact, it’s possible that, far from being too tough, the Administration isn’t being consistently tough enough. The bigger issue here, which surprisingly few are noting, is that USDOE is not so much ridiculously rigid as exceedingly inconsistent.

The only one to nail this so far seems to be Ryan Reyna, who, over at the Fordham Foundation’s Flypaper blog, correctly opined: “the problem is that there is no internal consistency about the standard states must meet. And that leaves states—both Round One states and those yet to submit—in the lurch.” Reyna points out, for example, that while reviewers panned Delaware’s academic goals for being “unambitious,” Nevada’s goals received praise for being “very ambitious” despite being quite similar to Delaware’s (“half the expected improvement as Delaware in half the time”).

Even Reyna, however, overlooks the apparent lack of consistent attention from both states and USDOE to the underlying ESSA statute. For example, one of the most notable changes to the Elementary and Secondary Education Act under ESSA is the requirement that state accountability systems have an additional “indicator of school quality or student success.” The statute clearly specifies that the indicator must allow “for meaningful differentiation in school performance” and be “valid, reliable, comparable, and statewide (with the same indicator or indicators used for each grade span, as such term is determined by the State).” The state of Louisiana, however, which has a plan with many notable strengths, describes its indicator of school quality or success this way:

“The interests and opportunities indicator (five percent of each school’s score) will measure whether schools are providing students with access to a well-rounded education…including visual and performing arts, foreign language, technology, co-curricular activities, advanced coursework, health/PE, career pathways, etc. Per BESE’s motion, this will be measured through a “menu” approach that will allow districts to demonstrate a strong effort in a variety of ways.” [Emphasis added]

Thus, Louisiana’s “interests and opportunities” indicator, by definition, violates the statutory requirement that it be “comparable” and “statewide,” because each district will use a different measure of “interests and opportunities.” Meaningful differentiation, validity, and reliability, while secondary in this case, can’t be determined either, because we have no idea which measures districts will choose. Nothing in ESSA prevents a district from measuring what it considers important, such as “interests and opportunities,” but such district-specific measures have no place in a statewide accountability system. ESSA, a law that intentionally left many policies highly flexible, was extremely clear that accountability indicators must be “comparable” and “statewide.” These facts also seem to have eluded the reviewers of Louisiana’s ESSA plan, who concluded:

“Because there was insufficient data and information about the Interest and Opportunity measure at elementary grades, the reviewers were unable to determine if the plan met requirements. The State must provide details about the measure to fully meet the requirements for school quality/success at elementary grades. Specifically, the plan must show how the measure meaningfully differentiates school performance and the overall validity and reliability of the measure.” [Emphasis added]

In doing so, these reviewers negligently omit that the measure is neither comparable across districts nor statewide, as the statute specifically requires. The vagaries of peer review are nothing new, and folks at USDOE aren’t the only ones failing to pay attention to the law. Regardless, none of this bodes well for the transparency that parents – and policymakers – need to make informed decisions. Those officially – and unofficially – tasked with reviewing state plans need to up their game now, because these problems will only get worse – and even more opaque – as states move from planning to the more difficult job of implementation.