by Charles Cirtwill

Some 10 years ago, New Brunswick was the regional leader on testing and accountability in Atlantic Canada. On both the anglophone and francophone side of the house, it had the best, broadest and most advanced testing regime in the region, rivaling those found in Alberta and B.C. It wasn't perfect, but it was getting better every year.

More to the point, the availability of the testing results improved the quality and utility of other data that the system generated. Graduation rates and teacher-assigned grades in particular could be assessed and weighted using an objective external benchmark to ensure improved results were actually improvements. Indeed, the use of data was beginning to percolate throughout the system with teachers, administrators, students and parents starting to learn how and when to use data, all types of data, to the greatest effect to improve every school.

Alas, then came the fiscal pressures of an economy in transition, and so New Brunswick abandoned its leadership position, cutting exams and eliminating reports. The choice had been made. The ability to steer the ship wasn’t as important as the need to simply make the ship bigger. Recall, if you will, that similar choices had been made around the design and operation of a ship called the Titanic, with predictable results.

I wish the Titanic comparison could be passed off as mere puffery. Unfortunately, as I (and others) have pointed out before, the performance of the New Brunswick education system leaves little to be positive about. In 2007 New Brunswick ranked almost dead last in reading proficiency on the new national testing program known as PCAP (the Pan-Canadian Assessment Program), ahead of only P.E.I. on the English side and Manitoba on the French. New Brunswick had similarly bad results on the old national assessments known as SAIP and continues to have comparably dismal returns on the international assessments (PISA) – dead last among Canadian provinces in science in 2006, for example, and next to last in math that same year.

The good news for the children in New Brunswick's schools was that the ability to steer was something many professionals inside the system had come to value. So they made the best of a bad situation, protecting the remaining exams and finding new and innovative ways to collect objective, or at least independent, evaluations of school performance (parent surveys being one such example). Their efforts have now reached a new pinnacle with the latest release of the annual school-by-school performance reports.

Children are not educated in systems, they are educated in schools. By collecting and reporting school level results, openly and honestly, to all comers, the province of New Brunswick has once again reclaimed its leadership role in Atlantic Canada as a champion of informed decision-making about education.

Regrettably, this does not mean New Brunswick’s schools will automatically be better than those in Nova Scotia or Ontario; improvement takes time. But better, more open, more aggressive reporting on actual results means that New Brunswick schools have a better chance of improving next year, and the year after, and the year after that.

Unless of course, once again facing extreme fiscal pressures, the province decides to protect subsidies to business and perks for politicians instead of the tools needed to ensure a quality education for every child in New Brunswick.

New Brunswick deserves credit for being back on top in keeping people informed about what is actually going on in all of its schools. But it has been here before. The question is, will it stay here this time?

Charles Cirtwill is the President & CEO of the Atlantic Institute for Market Studies, an independent, non-partisan public policy think tank based in Halifax. It produces the Annual Report Card for Atlantic Canadian High Schools.