Friday, March 5, 2004
Chronicle Herald

Testing the testers; What educational establishment still won’t tell you about schools

By Charles Cirtwill

WITH the limited fanfare it deserves, Nova Scotia’s Minister of Education, Jamie Muir, released the third annual Minister’s Report to Parents last Friday.

The good news? We didn’t get any worse. Our Grade 12 students are still achieving in the mid-50s (out of 100) on their exams in English, chemistry and physics; and our 13- and 16-year-olds still rank below the national average for writing. Our kids in grades 3, 4 and 5 have improved a mere eight points, from 42 to 50, in average scores on math tests.

And yet we are spending, according to the minister, almost $1,900 more per student this year than in 1997-98. That’s a lot of money just to prevent things from getting worse.

If we are spending that extra money, where is it going and is it going to where it is needed?

More important, in the context of a “report to parents,” how is that extra spending affecting the performance of each individual school? After all, each student is educated in a very specific school, with its individual strengths and weaknesses, not in some abstract “system.”

The minister’s report, unlike those of his counterparts in Newfoundland and New Brunswick, does not talk about individual schools. It does not talk about the 70 per cent of the high school grade that is assessed by teachers without outside controls to ensure consistency and objectivity. It does not even talk about all the exam marks.

The minister’s report only looks at the sample of those exams that is centrally marked by the department. It certainly does not look at the differences in the grades assigned by the central markers versus those at the schools themselves.

Yet without such a comparison, there is no objective measure of whether there is grade inflation or even just consistency between students and schools.

It was to deal with just this kind of black hole in educational accountability that the Atlantic Institute for Market Studies launched our first annual report card on Atlantic Canadian high schools last year.

Our second was released this week. The new report card documents yet again the disturbing lack of clear and comparable information about what is going on in each of our schools, and the disappointing indifference of many members of the educational establishment to this shocking state of affairs.

Sitting down with senior school board and department staff, I was speechless to discover, for example, that they could not tell me the average middle school grade in math or English; they simply do not track that information. All too often, the reaction I got to my requests for hard information about school performance was not merely “we don’t have it,” but a far more worrisome “and we don’t really see why we should.”

The minister is to be applauded for continuing to release annual reports to parents, but if these reports are to do any real good, they must be done at a school-by-school level. It is all very well to know how the province as a whole is faring, or even the different school boards, but what Nova Scotians need to know is how the individual schools are doing where their kids are being educated.

The AIMS report card was roundly criticized for using the department’s own data to assess schools, but that criticism missed the point. The main theme of our first report card was just how pitiful the data were that the department had on which to base its decision-making and management of the schools.

You can’t manage what you don’t measure. By drawing everyone’s attention to the appalling state of knowledge of what was going on in the schools, we hoped to increase the pressure to improve and expand that knowledge. And that strategy has worked – up to a point.

In response to our first report card, then-education minister Angus MacIsaac committed to school-level reporting this year. Unfortunately, we’re still waiting for that commitment to be honoured.

So until the public education authorities in this province start producing school-level information that allows us to identify and emulate excellent schools and help poorly performing ones, AIMS will continue to gather and publish all the meaningful information we can discover on how our schools are doing.

For our second report card, for example, AIMS has collected information from our post-secondary institutions across the region about how kids from each high school perform once they reach university or college.

We have also replaced the province’s “graduation rate,” which counts only incoming Grade 12 students, with “hold” and “retention” measures that track how many Grade 10 students go on to graduate or at least enter a third year of high school.

There is a thirst for this information among parents, students and teachers. We had over 180,000 hits on our website in the month we brought out our first report card. And we will keep bringing them out until the educational establishment does us out of a job by doing theirs.