November 26, 2024
BANGOR DAILY NEWS (BANGOR, MAINE)

Where state test hurts students

“Maine teachers to score MEAs on computer” ran a MaineDay headline (BDN, May 28). I wonder how many readers noted that the newsworthy part was not that computers are capable of scoring exams. The laudable element is the use of technology to address the isolation experienced by Maine teachers and to provide a vehicle for setting standards for student writing.

Determining whether multiple choice or true-false answers are right or wrong can easily be left to computers. But the hardest work in scoring the MEAs comes in determining whether students can adequately develop a written argument or fully understand the scientific method. As one of the members of the Task Force charged with designing the philosophical framework and the process of developing the Learning Results, I applaud the efforts to connect Maine teachers in the discussion of quality education.

However, I am deeply concerned with the direction the MEAs seem to be taking this year.

What are tests designed to do? Any answer to that question demands addressing other sub-questions. Have students learned what they have been taught? Have teachers taught what they are expected to teach? Are schools providing the programs that support the best learning environment? Are we testing what we value? A well-designed assessment (test) will change what is taught. For example, the early MEAs expected students to write with voice and thoughtful topic development as well as mechanical skill. Classroom teachers were trained to score these nebulous qualities, and this translated directly into classroom practice.

Students learned techniques and strategies for developing these more difficult skills. Twelve years ago it was rare to find examples of “excellent writing” from 11th graders on the MEAs, but now our fourth-graders are regularly producing writing at or above that level! If we value well-written work from students, if we give them the instruction, time and feedback necessary for them to accomplish good writing, and if these are the skills and processes we actually test, then we are heading in the right direction. Teachers do serve on the test committees and will score many of the open-ended responses.

With time, their participation will refine both instruction and assessment, but this year there appears to be a wide gap between the two, and great concern for the students caught in that gap.

I am concerned that the current round of MEAs has taken valuable class time away from our students. One 11th-grade English teacher bemoaned that, because of the extreme length of the tests, her students read and discussed one fewer work of literature than last year. Another high school administrator admitted that the atmosphere had been “pretty tense” during the two weeks of testing and that there was a palpable sensation of the test “spinning out of control.”

What price are we paying for stressful, lengthy testing? It appears to be a lack of real work done in the classroom. The task force was quite clear that students should not be granted a diploma based on the number of days that they warmed chairs in science class but based on what they knew and were able to do. Yet a fourth-grade teacher calculated that, if you totaled the time spent by all fourth-, eighth- and 11th-grade students, this year’s test eliminated a month and a half of education.

The level of specificity presents another worry. Will the short answer segments of the test result in students cramming lists of minutiae the week before the exam? This is directly counter to the original purposes that the Task Force laid out for the Learning Results (LR).

How will the public interpret scores on tests that ask students detailed questions about music or dance if these areas have not been offered in their schools? In a survey at a local high school, students were asked which test topics they felt most unprepared for; many honors students responded that they didn’t even recognize some of the discipline-specific vocabulary. It seems clear that what the test scores will show is that schools have not yet begun to address certain areas of the Learning Results.

My major concern is that the level of expected knowledge will be reduced to merely memorizing a list of names, dates and wind patterns. The full intention of the LR was to challenge students to be clear and critical thinkers, not merely to force students to accrete disparate facts in short-term memory. The Learning Results were “sold” on the basis of the task force’s work and the huge amount of public input collected during their development.

One task force member wrote, “I believe that the work I was involved in was misrepresented. To promise that certain things would or would not happen (and to gain public support by assuring that these things would be carried out) and then to see the opposite occur is extremely upsetting … ”

To set the record straight, the original task force (22 people from diverse walks of life and support staff from the Department of Education) spent two years working on the philosophical foundation and on the process of how to develop the Learning Results. Here are a few key points from which we see the State of Maine straying.

We adamantly opposed a single, high-stakes exit test. We believed that the complexities of evaluating students could not be captured in one testing measure. This prompted us to strongly urge the state to work with local districts to determine an array of assessment techniques that would take into account various student strengths.

We felt that research and technology skills and processes should be infused throughout the document and not confined to separate disciplines. These are the province of all teachers and all students.

We believed that these standards could not be met in a discipline-isolated fashion.

We believed that everything in the document should be pointing toward the general but important guiding principles. Otherwise it would be a mere compilation of discrete facts to be memorized or of isolated skills to be demonstrated out of context. This was not our hope for Maine’s future. Not since Francis Bacon has any one person had a firm grasp of all subjects.

Another task force member wrote to me suggesting that instead of expecting students to regurgitate specific scientific factoids, educators must focus more on areas such as: Do kids know the scientific method? Can they set up an experiment? Do they know how to display data? Do they know how to do research and to develop a hypothesis? Do they know how to write up and communicate conclusions?

We also pulled away from stating that specific skills needed to be acquired by a specific age. That’s why the Learning Results were formatted within grade spans.

We recognized the ultimate importance of professional development time and space for every school staff, not merely those in wealthy districts.

We were quite clear that the Learning Results should not be a state curriculum; we did not believe that the best thing for our students was to build a day-to-day plan specifying what should be taught and how. We believed that we were developing a document that was shaped from a statewide perspective (based on large-scale public input) yet allowed plenty of room for local direction. We wished to honor the work of those communities that were already producing excellent students.

Sure, let technology assist Maine educators in scoring the open-ended MEAs. But let’s not relax our vigilance regarding their form and content. Take care that the process used to administer them conforms to the spirit of the Task Force and delivers a meaningful, accurate assessment of our students’ accomplishments. Community members as well as educators would benefit from revisiting the foundations on which the Learning Results were framed.

Abigail Garthwait of Orono was a member of the Task Force on Learning Results. The content of this commentary was also supported by other task force members, including Al Dickey, Lynne Miller and Barbara Wicks.

