Survey questions raised
A county opinion survey uses techniques that some polling experts consider flawed.
“The statistically valid telephone survey is conducted every two years to gauge what issues are most important to all Washoe County citizens whether they live in a city or unincorporated area of the county,” said a county statement announcing the survey, which asks residents to rate county functions.
But pollsters say calling a survey statistically valid doesn’t make it one.
The problem is that a couple of the questions are lengthy, detailed, and layered—and the survey, commissioned from InfoSearch in Reno, is being conducted by telephone.
The fourth question, for instance, lists 15 county services, and the resident is asked to rate each of the 15 on a priority scale of 1 to 5. The fifth and sixth questions are similarly structured.
One pollster who did not want to be named said, “There’s no way there can be a valid response to this question unless the respondent can have the complete list in front of him and make a judgment of all the functions, weighing one against the other. It asks a resident to remember a list of 15 things in his mind after they’ve been read to him. Could you do it? I don’t doubt that the county is being told it is valid, and certainly doing it off-phone would be more expensive, but this is just not an appropriate question for a phone survey.”
University of Nevada, Reno’s James Richardson, a former pollster, said the technique “does not allow for comparisons and [it] builds in a bias. In polling, it’s known as fatigue bias.”
He said, “The way surveyors try to get around that bias is [to] rotate the order of the questions.”
That still would not allow the resident to see the entire list, but it would reduce the advantage the first items on the list generally get. The first item on this list is “Criminal prosecution and operating the jail.”
County spokesperson Kathy Carter said, “Conceptually, some surveyors will recommend that this rotation of answers be done to ensure there is no ‘order effect’—i.e., by always giving the options in order, does it predispose the respondent to rate one option higher over the other? That’s the concept.

“In reality, InfoSearch has found in their 15 years-plus of surveying that there is little if any ‘order effect’ for a survey of our type to justify the additional expense of developing and printing out the re-ordered surveys for the calltakers to use, and then the logistics of assembling the data results from the various survey versions.

“And, as my press release stated, we were very cost conscious and, in fact, got the survey to be conducted for the same amount of money as we did two years ago, with even increasing the number of volunteer citizens to participate in the online version. So, bottom line is that the possibility of an ‘ordering effect’ upon our survey is very slight, and not rotating the question choices has absolutely no impact upon the statistical validity of our survey.”