Written by: Donald Heller
Primary Source: The Dean’s Blog
We recently received the results of our students’ performance on the Michigan Department of Education’s Professional Readiness Examination (PRE). The department requires students to pass all three parts of the PRE test – reading, writing, and math – before they are allowed to do their student teaching. In our college, we require students to pass the PRE before they can be admitted to any of our teacher preparation programs (admission generally occurs at the beginning of their junior year); if students are going to have problems passing the test, we would rather know that before they are actually enrolled in our program and taking courses. The PRE is a criterion-referenced test of reading, writing, and math skills; it is not a measure of students’ knowledge of pedagogical skills.
The state education department restructured the PRE test this year, changing it from one that in the past sought to measure “the minimum amount of content knowledge needed to perform in the role of an entry-level educator,” to one that measures the “level of content knowledge needed to effectively perform the job of a qualified Michigan educator.” This language (which comes from the letter sent to education schools with the statewide results) may sound like a subtle shift in the standard, but it is actually a major change. Rather than using the test (which used to be called the “Basic Skills Test”) to ensure prospective teachers met minimal standards for a beginning teacher, the PRE now is intended to ensure that these prospective educators are able to perform “effectively” in the classroom. This change was part of the state agency’s efforts to strengthen the pool of teacher candidates in the state. In a press release announcing the test results, State Superintendent Mike Flanagan was quoted as saying, “We want the best and brightest teachers in Michigan classrooms. Increasing the expectations necessary to pass the certification exam gets us closer to that goal.”
Last year, 82 percent of the over 6,000 teacher candidates statewide who took the Basic Skills Test passed all three sections of it. This year, with the new test and standards in place, only 26 percent passed the entire test. These results presented a wake-up call not just to students who did not pass the test, but to schools and colleges of education across the state. The pass rate for our 142 students was above the statewide average, but we still experienced a large drop from last year’s results. (This represents only a portion of our prospective teacher candidates, as not all of this year’s applicants took the test on Oct. 5.) As I recently wrote, our teacher candidates demonstrate high levels of academic achievement, both in terms of their credentials coming into the university (high school grades and ACT/SAT scores) and their grades earned while enrolled as MSU students. Thus, we were surprised as well by these results.
After some initial panicky emails back and forth among our Student Affairs Office (which advises our students on state certification requirements, and received most of the surprised and/or angry emails from students who failed the tests and their parents), our Teacher Education leadership, and me, we managed to calm down a little bit. We delved into the test results that we received for our students, and we came to understand a bit better what had happened. Of the three subtests – reading, writing, and math – the writing test was the biggest hurdle for most of our students, a pattern that mirrored the statewide results.
Students can take the test as many times as they like in order to try to pass it, and they only have to retake the parts they failed, though there is a cost associated with each attempt. While we had done our best to inform our students that the PRE was restructured from its predecessor, and that new standards for passage would be put in place, we suspect many of the students may have been a bit complacent about the level of difficulty based on the high passage rate their earlier peers had enjoyed. Thus, they may not have put as much time and effort into preparing for the test as the new standards would have required.
We also recognize that whenever a new or restructured criterion-referenced test is put in place, establishing cut scores for passage can be as much of an art as a science. The Department of Education convened a panel of experts from both K-12 and higher education to establish these cut scores, and they used their best judgment in doing so. They were no doubt working under the charge provided to them to increase the rigor of the exam. As Mike Flanagan put it in the press release, “This is part of a long-term plan that is four years or so in the making. Michigan schools need the best and brightest educators teaching our students. We want to ensure we have effective educators in our classrooms …”
The expert panel likely knew the cut scores they chose would result in a much lower passage rate than in prior administrations of the test. But whether the specific cut scores actually will ensure that these students will be the future “best and brightest” teachers is unknown. The process may just end up selecting the students who are the best test takers, but not necessarily those who will be the best teachers. It is difficult to argue against the concept of wanting to have strong students going into the teaching profession, but relying on these tests may not be the best method for ensuring this outcome.