The New Hampshire Department of Education finally released the results of the 2015 statewide assessment, the Smarter Balanced Assessment (SBA). Although the department received the results in July, and discussed the need to carefully craft the public message at that month's meeting with the state Board of Education, it prohibited districts from publishing the composite results until today.
As expected, the results are dismal. Across all tested grade levels, only 58% “meet or exceed the achievement level” in English and only 46% do so in math.
According to the NH DOE’s press release, the department considers the results to be baseline scores that are inappropriate to compare to prior years’ results. 2015 was the first time NH fully implemented the Smarter Balanced Assessment; the New England Common Assessment Program (NECAP) was used previously. It is true that the two tests are very different. The NECAP was a paper-and-pencil bubble test, and all students in a grade took the exact same exam. The SBA was largely administered online and is adaptive, meaning the test becomes easier or harder depending on a student’s performance. The SBA is also aligned to the state’s new College and Career Readiness Standards (aka Common Core).
Let’s break that down into its components.
By treating the 2015 results as a baseline, the state DOE is effectively saying these scores have little to no bearing on judging how well (or poorly) our schools and students performed. They will be used only as a reference point against which to measure future results. So if a student did poorly on this test, will it affect him or her directly? It absolutely could! A technical advisory by the NH DOE dated January 13, 2015 says that school districts may use the SBA results to determine grade and class placement. This is a local decision, and parents would be wise to discuss it with their elected school board members.
The SBA is a computerized exam, a platform that may be unfamiliar to many students. Those children are tested not only on the material itself but also on their skill with the computer and the test interface. Additionally, districts may have very different equipment. Does that make the test fair? Note that Manchester, Gorham, and Barrington students were given the wrong version of the test this spring.
The SBA is adaptive, so the test gets harder or easier depending on how the student is doing on the exam. So, if the tests are different, are they valid? Validity means that the exam tests what it intends to test. Education experts have raised this question and many more, concluding that the Smarter Balanced Assessment is not a valid test.
What exactly is the “achievement level”? According to the Smarter Balanced Assessment Consortium, the levels largely track grade-level expectations but can be modified by each state.
The Smarter Balanced Assessment Consortium points to an October 2014 online panel that practically crowd-sourced the cut scores. All participating states agreed to abide by the process and the resulting achievement levels. However, New Hampshire was one of the states that did not participate. So, again, what are New Hampshire’s cut scores? The Consortium allows each state to set its own cut scores, meaning its own pass/fail levels. The state Board of Education is tasked with setting NH’s cut scores, so what are they? Did it carry over the ones used previously for the NECAP tests? Did it revise them up or down for the SBA, especially considering that 2015 was the first full implementation in the Granite State?

Other states, including New York and Washington, DC, have manipulated their cut scores, either to inflate their results or to make them worse and thereby reinforce the message that the new College and Career Readiness Standards are more “rigorous” than the ones they replaced. Note that the official Consortium cut scores were designed to have students and schools fail; look at pages 5-6. Why? Why are we setting them up to fail? Ultimately these results are meaningless.
Why did it take so long for the composite scores to be released? Supposedly one of the big advantages of the SBA, particularly as a computerized test that required many districts to upgrade their technology, was that the results would be available very quickly. The NH DOE published a FAQ in 2014 as part of its introduction to the SBA:
“The use of computer adaptive technology is more precise and efficient than form (paper/pencil) testing, providing results for teachers and students in a matter of weeks. It gives quick results that teachers and administrators can use to differentiate instruction better meeting the needs of their students in “real time.” In addition to measuring student achievement at the end of the school year, the Smarter Balanced Assessment System will provide information during the year to give teachers and parents a better picture of where students are succeeding and where they need help.”
The SBA was administered in a six-week window from March to June 2015, depending on the grade level. The latest testing period finished five months ago. Why did it take so long to compile results and release them? Students are not working with the same teachers as they did last year, so how do these results provide meaningful “real time” feedback?
Ultimately parents and concerned citizens should ask the NH DOE Commissioner, Dr. Virginia Barry, their superintendents, and school boards why we are spending valuable instructional time and district resources on assessments that fail to provide meaningful results.