The Utah State Board of Education on Thursday received an overview of a draft legislative pilot that would let local education agencies (LEAs) and charter schools opt into nationally norm‑referenced assessments instead of the state's RISE end‑of‑year test.
Matt Throckmorton, an education advocate who said he is working with Representative Lisonbee on the measure, told the board the bill would allow participating schools to administer a nationally norm‑referenced test at the beginning, middle and end of the school year and then convert results so the final administration can be used for accountability "in a statistically significant and a valid manner." Throckmorton said the pilot would run five years and that participating LEAs would pay vendor costs; he said the draft currently carries no fiscal note.
The bill would cap participation: districts could opt in until participating students reached 10% of statewide test‑eligible students, and charters until participating charter students reached 35% of charter test‑eligible students. The Utah State Board of Education (USBE) would select approved assessments through an RFP and would be responsible for the conversion or concordance work needed so different tests could be compared "apples to apples," a USBE deputy told the board.
Why it matters: Utah uses the RISE criterion‑referenced assessment for federal and state accountability and to identify schools needing support. Replacing the end‑of‑year assessment for accountability purposes requires statistically defensible equating so results remain comparable across systems and over time.
Board members pressed presenters on several technical and policy details. Chair Jaime asked, "What does the USBE gather data for?" and expressed concern that alternative assessments could isolate a school's results from those of nearby schools if the conversion does not preserve comparability. The USBE deputy said the agency would post aggregated results and report annually to the Education Interim Committee; she also noted the draft includes a 2031 sunset.
On statistical methods, the presenters described two approaches discussed with experts: building concordance tables from large samples, and equating tests using anchor items that appear on both assessments. Director Brough, the board's data lead, cautioned that exact equivalence is difficult and that equating relies on the assumption that the tests measure the same constructs. He said the work would require staff time and would likely be embedded within existing contracts with USBE's Technical Advisory Committee (TAC).
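For readers unfamiliar with the first approach, equipercentile concordance maps a score on one test to the score on another test that sits at the same percentile rank. The sketch below is illustrative only: the score distributions, sample sizes, and function name are invented for demonstration, and real concordance work would use actual administration data and psychometric review.

```python
import numpy as np

# Simulated score samples for two hypothetical tests (invented for illustration).
rng = np.random.default_rng(0)
scores_a = rng.normal(200, 25, 5000)  # "test A": mean 200, sd 25
scores_b = rng.normal(500, 60, 5000)  # "test B": mean 500, sd 60

def concord(score_a, sample_a, sample_b):
    """Map a test-A score to the test-B score at the same percentile rank."""
    pct = (sample_a <= score_a).mean() * 100  # percentile rank on test A
    return np.percentile(sample_b, pct)      # test-B score at that percentile

# A score at test A's mean maps to a value near test B's center (~500),
# and higher test-A scores map to correspondingly higher test-B scores.
print(concord(200.0, scores_a, scores_b))
print(concord(250.0, scores_a, scores_b))
```

This is the simplest version of the idea; operational concordance tables typically smooth the score distributions and quantify linking error, which is part of why Director Brough cautioned that exact equivalence is difficult.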
Board members also raised practical concerns. Several asked whether the pilot would change how writing is handled, since the draft referenced reading and math but not writing; that question went unanswered and was marked for follow-up. Members also asked how parent opt‑out rules would apply: presenters noted that state law allows opt‑out for the current end‑of‑year test but treats locally administered assessments differently, and the draft had not resolved whether opt‑out would apply to pilot assessments.
Supporters said the pilot could reduce "test fatigue" and provide teachers with formative growth data that better informs instruction. Throckmorton cited past state use of norm‑referenced tools and said prior conversion work had been successful; other board members urged consultation with USBE assessment staff and TAC before the bill advances.
What happened next: The bill text is still being refined and has not been numbered. Presenters said they will supply technical amendments and further details to USBE staff and that the board would be kept informed. No motion or vote was taken during the meeting.
The board asked staff to supply follow‑up answers about writing coverage, the precise conversion methodology to be required in rule, and how state opt‑out law would apply to pilot assessments.