Draper Journal

Are RISE test scores accurate? With issues surrounding the year-end state tests, state assures parents of their accuracy as results released

Oct 31, 2019 02:12PM ● By Julie Slama

In its inaugural year, RISE standardized assessments had technical problems when students tried to submit their answers and the system froze.

By Julie Slama | [email protected]

Canyons School District released its RISE test results on Sept. 27. Murray District released theirs the following Monday. Granite District made results available the previous week.

However, because of technical problems with Utah’s new computer-adaptive RISE tests, which affected almost 18,500 third- through eighth-grade students statewide, some parents may feel uneasy about their children’s year-end test scores, acknowledged Utah State Board of Education Assistant Superintendent of Student Learning Darin Nielsen. About 1 million students take the assessment.

“Parents have been asking questions why there are problems, where they were, what they were and how accurate are the results,” he said. “We understand their concerns and we have been looking into reports of malfunction, incompleteness and inaccuracy. We were using a new testing platform this year and over a five-day period, there were periods of ‘slowness’ and ‘interruptions of service’ so many schools had to stop and quarantine their computers. There is no evidence of data lost or evidence of scores being lost.”

Knowing that, and having been part of an independent audit, many district administrators now feel confident in the test scores.

“We feel more comfortable than we initially did,” Granite School District Director of Communications Ben Horsley said. “At first, it was a nightmare when students tried to submit and it just froze. The state officials are confident and have done an admirable job piecing together to get the quality results for individual student growth and performance, school grades and accountability and for school turn-around status.”

Canyons School District Director of Research and Assessment Hal Sanderson said Canyons is encouraged by its results.

“We’ve had scores that are very encouraging,” he said. “At the district level, the majority of testing has seen increases.”

The 2018-19 results of Utah’s standards-based assessment, called RISE, which stands for Readiness, Improvement, Success, Empowerment, are measured in terms of proficiency, or clear understanding, of math, science and English.

The issue wasn’t with the test itself, but with the platform of delivery, Nielsen said.

“What was shocking was the easier part of this; we expected the software to take the data and spit it out into a report format, where the data could be downloaded to specific reports for students,” he said. “The problem came in the delivery of the vendor, Questar, to meet what we were promised. At that point, the board met to review the contract.”

On June 7, the Utah State Board of Education voted to cancel its contract with Questar Assessment, Inc., the vendor that had provided the technological platform for administering the statewide RISE assessments. This school year, the board decided to return to its previous platform provider, AIR, to deliver the RISE test.

The original contract with Questar was more than $44.7 million, Nielsen said.

The assessment uses multistage delivery, meaning that after the first set of 25 questions, a second stage of questions is given based on the student’s level, ranging from easy to difficult.

Much of the early feedback about problems with the assessment came when students submitted their work and saw it “pending” or spinning, Nielsen said.

“The proctors were not sure what to do, if tests were submitted or not. Usually, that first panel of questions would be collected and scored, then a second stage of questions would become available. But there was plenty of evidence that the assessment couldn’t move on. We reached out to the vendor and we were told there was ‘slowness’ with their server,” he said.

While Jordan School District Superintendent Anthony Godfrey reported “every district had some issues” and “Jordan was in the same boat with intermittent periods” without service, Nielsen said that Murray School District initially questioned the data because of issues with the reports.

“Their students would push submit, but the next day, the individual student reports didn’t match the same numbers that were shown on the computer screen. The database was fine, but the issue was in the reporting function,” he said.

One Murray teacher, who spoke on condition of anonymity, said her students didn’t initially receive the language arts scores they usually expect, and that’s when many of them began to question the validity of the assessment.

Although the testing issues were widespread, there were significant disruptions in some areas, such as “eighth-grade writing in one whole district — Canyons School District,” Nielsen said.

Sanderson confirmed that Canyons, like many other districts, experienced issues.

“We had a couple glitches and we’d shut down the computers to save the tests when they wouldn’t save to the RISE servers,” he said. “Mostly, we moved testing to another day. If the tests would not have saved, it would have been a disaster. The tests provide good information. It shows if kids master a concept.” 

Nielsen also said they received reports that some eighth-grade students who were scheduled to take Secondary Math I weren’t able to access their tests because of the platform errors.

While issues arose on certain days of testing, Nielsen pointed out that students could take the assessment over a period of time, and he applauded teachers who were flexible and altered their testing times.

The first reports of glitches in the testing were on April 25; they continued April 26, April 30, May 2 and May 10. As of April 25, more than 85,500 students had taken the RISE tests and of those, more than 64,800 successfully submitted their scores. Students continued taking the RISE assessment through mid-June to accommodate year-round schools’ testing.

There have been reports that tests were lost, but Nielsen said that isn’t accurate.

“We don’t have any evidence that students’ tests were completely lost,” he said. “Every teacher codes (categorizes) their tests. If a student didn’t participate, then the teacher needs to code it. The most common is parental exclusion, but there are 16 codes and that was the data that was still needing to be inputted.”

Those numbers, 4,583 students who didn’t test for various reasons, were down from previous years. For example, he said, 381 students were coded with a health excuse for not completing the test this year, compared with 1,150 students excused for a health reason in 2016.

“This year, we had 95% take the RISE test,” he said, up from 91% to 92% over the past three years.

While RISE testing isn’t mandatory, Nielsen said that come spring 2020, there will be an academic incentive to do well on the assessments, which align with the core curriculum standards.

“There is language in the statute that students who participate and show proficiency can improve their class grade,” he said, adding that “scores can’t penalize a grade.”

Such incentives would be in a school or teacher disclosure statement to be transparent, he said.

What might further confuse parents this year is that RISE replaced Utah’s SAGE (Student Assessment of Growth and Excellence) testing, raising questions about how the two year-end examination scores can be compared.

Nielsen said SAGE testing was originally written for students in grades three through 10. However, with changes in the test, including when the ACT became the standard assessment for high school sophomores, he said it was time to rename the year-end assessment.

“These are still our questions, similar ones to the SAGE test, that align to our core curriculum,” he said about the questions that are scrutinized by Utah teachers and parents.

Currently, the state board is looking for a long-term provider to give the RISE assessment.

“Every year, things happen out of our control,” Nielsen said. “We knew there were problems in Texas and after we selected Questar, in New York and Tennessee. We asked what happened and received assurances that the issues would not occur here. We took proactive action and prepared for them, but, unfortunately, we still had issues.”