In commentary for the release of its seventh Annual Report Card on Atlantic Canadian High Schools, the Atlantic Institute for Market Studies suggests a correlation between improved measurement and more publicly accessible data, on the one hand, and the performance of schools on the other. Since the report card was launched in 2003, using essentially the same yardsticks year to year, there has been “growing evidence that progress is being made,” the Halifax-based think tank claims, adding that it waited until solid data were in place before venturing that judgment.

Nova Scotia leads in this tentative “progress” among the four provinces, AIMS notes. This province is also cited for its own efforts to improve testing — including the return of provincial exams — and to make more information on school performance available to the public. This year’s report card for Nova Scotia includes four new measures — attendance rates, plus teacher-assigned grades in math, science and language arts.

Evidence remains tenuous for the claim that increased effort to measure and assess, both within the school system and from the outside by AIMS, is starting to pay off. But this is ultimately the heart of the rationale for the always controversial cause of testing and measuring the performance not just of individual students but of whole schools. AIMS focuses on the school level on the principle that education takes place in schools, not in districts (or regions), and the variability of results within regions seems to bear that out.

For variability, look no further than Cape Breton, which for the second year running has Nova Scotia’s top-ranked school, Cape Breton Highlands Academy in Terre Noire, Inverness County, as well as some of the lowest, with Memorial in Sydney Mines and Glace Bay High coming in 40th and 41st among the 55 ranked schools.

Ten island schools get final grades, but four others, including Sydney Academy, don’t because of missing data. Among the ten ranked schools, five improved their grade from the previous report, three fell, and two are unchanged. Notably, two schools — Holy Angels in Sydney and Breton Education Centre — jumped two steps, and Holy Angels is singled out as one of the province’s most improved schools over the past five years. The letter grades range from A-minus at the top to C-minus at the bottom, a range of six steps in this report, which uses a three-year rolling average (the last school year counted being 2006-07).

Unfortunately, the data that are easiest to report are the single-letter grades for each school, which lead inevitably to the ranking. AIMS doesn’t apologize for this but makes the point that the real substance of the annual report card is in the detail: how a particular school compares with others, or with its own history. Indeed, the letter grade of any school is virtually meaningless without asking why.

The value of the AIMS report card is disputed and may remain so. Some educators are hostile, calling the effort narrow, ideologically driven, and all but irrelevant. But if nothing else, AIMS has helped erode the encrusted taboos surrounding the assessment of school performance and public access to such data. That’s progress, whether schools are progressing or not.