Survey said
If you're feeling wonky on a Friday afternoon, check this puppy out. It's a phone survey done by some folks at Portland State University on how residents of our region think things are going. Since it comes from the PSU "urban planning" school, you need a grain of salt with this, but it's interesting nonetheless.
If that's just too dreadfully sociological for you, at least scroll all the way down to Appendix C, where you get great stuff like "R was very distracted at the beginning, someone nearby was playing a woodwind instrument loudly."
Comments (2)
Who is this "Azonoren" and where can we meet?
Posted by cj | February 17, 2007 2:21 PM
I spent the weekend reading most of this survey. My thoughts:
As with any survey, the way the questions are asked goes a long way toward shaping the spin that the media, the government, or anyone else puts on the results. In this case, reading the endless pages of comments from participants, even they frequently raise this same criticism of the survey.
For example, the survey asks, in degrees from "agree" to "somewhat agree" and so on, whether "My family and I are financially better off now than we were two years ago". That question produces a much different result than asking "My family and I are financially SIMILAR to what we were two years ago". With the "similar" wording, the answers cluster more in the average range than at the two extremes. Survey participants will typically answer in the range that makes them look more "average" than "negative" or "positive". We all strive to seem "normal".
Another point about this survey is where each degree of agreement or disagreement sits in the list of choices. For example, some questions range from "somewhat dissatisfied" to "somewhat satisfied", but the flat "satisfied" or "dissatisfied" options are placed at the end of the list. That response order taints the results, because participants usually pick from the middle ranges rather than the choices at the end.
The survey could also easily produce different results depending on how the questions are framed. Most of the questions are asked in a positive form: "The region's economy is doing well", with the usual agree/disagree ranges. If the same question were flipped into a negative form, such as "The region's economy could be better", with the same ranges, the results would likely be much different. More participants would probably side with "agree", that the economy could be better.
There is also an interesting dichotomy between the survey results from the questions asked and the "comments" from the participants. The comments are the best part of the survey; they tell you much more about the participants' thinking than the survey questions themselves do. The survey asks few questions about traffic, congestion, road conditions, and the like, yet there is a preponderance of comments on those issues. The same goes for density, the UGB, and related topics. The survey seems to miss the issues participants are most concerned about. Will PSU note this missing link? Was the survey designed to miss these important issues?
How will PSU, the pols, the various government agencies, and the media interpret the survey? For the media, the "interpretation" will be a 10-second sound bite: "Portland is doing a pretty good job." The rest may follow suit but elaborate a little more to lead you down their political agenda. The comment portion of the survey will probably be ignored entirely because it hasn't been reduced to percentages.
Then there is the misuse of percentages when surveys are analyzed, and the question of what to make of results from 850 participants. Note that the average age of the participants is quite high, higher than the region's average, and that more of them are female than in the population at large. These are important differences.
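To illustrate why that matters (this is my own sketch, not anything from the PSU report, and every number in it is made up): a sample that skews older and more female than the region can be corrected with post-stratification weighting, and the weighted topline can come out noticeably different from the raw percentage the press release would quote.

```python
# Hypothetical illustration of post-stratification weighting -- not from the PSU survey.
# Suppose 850 respondents, skewed older and more female than the region,
# and a question where older respondents say "satisfied" more often.

# (group) -> (respondents in sample, group's share of the region's population,
#             fraction of the group answering "satisfied")
sample = {
    ("female", "under 50"): (180, 0.34, 0.55),
    ("female", "50+"):      (320, 0.17, 0.70),
    ("male",   "under 50"): (150, 0.35, 0.50),
    ("male",   "50+"):      (200, 0.14, 0.65),
}

total_n = sum(n for n, _, _ in sample.values())  # 850 respondents

# Raw (unweighted) topline: share of the whole sample saying "satisfied".
raw = sum(n * p for n, _, p in sample.values()) / total_n

# Post-stratified topline: each group's answer weighted by its share of the
# population, not by its over- or under-represented share of the sample.
weighted = sum(pop_share * p for _, pop_share, p in sample.values())

print(f"raw topline:      {raw:.1%}")       # pulled up by the older respondents
print(f"weighted topline: {weighted:.1%}")  # adjusted to the region's demographics
```

With these made-up numbers the raw figure is about 62% satisfied and the weighted figure about 57%, a five-point gap that comes entirely from who happened to answer the phone.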
Is this a good survey? Maybe it is, if it suits the agenda you prefer.
Posted by Jerry | February 19, 2007 11:07 AM