Survey numbers don’t add up

Re: District of Sechelt satisfaction survey

Editor:

In response to Christine Wood’s summary in the May 16 Coast Reporter, I read the full report on the satisfaction survey of 300 Sechelt residents.

This survey was contracted by the District of Sechelt to Innovative Research Group in Toronto. As someone with a PhD in research and evaluation methods, I found the methodology unsound, with so many basic errors in analysis and reporting that it is difficult to have much confidence in the findings.

First, many survey questions are poorly worded, too long, overloaded with information, and/or inconsistent with the response choices. It is hard to imagine anyone listening to some, let alone all, of these questions and their response choices over the phone without falling asleep.

Second, we do not know the response rate. How many people responded to each survey question (N = ?), and how many chose each response option (N = ?)? This information should be provided for every question; reporting percentages alone is not good enough.

Third, no significance tests, p-values, or effect sizes are reported, so any group differences, e.g. across years, may be statistically insignificant or simply due to chance.

As a result, conclusions such as “More are satisfied with efforts to reduce traffic congestion than dissatisfied” actually tell us very little.

Fourth, a regression model would likely be more informative than an “imputed importance” matrix reported without Ns.

Finally, I have a few questions: How much was spent on this report? Why are our tax dollars being spent to purchase a report of this quality from a firm in Toronto?

Valerie Ruhe, Sechelt