In recent days a Harris Interactive poll about President Obama and the Tea Party plopped into the blogosphere, then into the mainstream media. It came on the heels of a gut-wrenching, partisan, divisive debate over health insurance.
A blogger teased the Net with some of the Obama findings; then Harris Interactive released the poll showing that many Americans, especially Republicans, think he is a socialist or a Muslim, believe he wants to take away people's guns, and even consider him "the anti-Christ." Another release indicated that Tea Party supporters number about 1 in 6 American adults.
The surveys come after Harris Interactive chief Humphrey Taylor read John Avlon's new book, "Wingnuts: How the Lunatic Fringe Is Hijacking America," which describes some of the "extreme views" -- Taylor's words -- about President Obama. His idea was to test how many Americans held those views.
It's appropriate for public opinion researchers to test these labels to see what proportions of Americans, including Republicans, really hold such views. But given Harris Interactive's methods, should we take its polls with a grain of salt?
Short-term changes in public opinion about people who have widespread name recognition can be quite sensitive to news events. The surveys were done in early to mid-March, when the partisan pummeling over the health care bill was on the upswing.
Minnesotans can remember March of 2004, when the Star Tribune's Minnesota Poll showed a huge lead for John Kerry over George W. Bush right after a book harshly critical of Bush's administration was making the media rounds. Partisans went nuts, claiming bias and inaccuracy. That poll tracked Bush's and Kerry's ups and downs quite closely for the remainder of the election season as Kerry's lead dwindled: The day before the election it showed Kerry with a 4-point lead in Minnesota; he won the state by 3.5 points. The poll proved to be one of the most accurate in the nation.
The Harris Interactive poll brings to mind candidates' message-testing surveys, in which pollsters test different negative messages about a candidate's opponent to see the best way to attack, or test positive messages about the candidate to see which resonates best with voters. Often those questions are written to elicit a strong, emotional response, and may not necessarily be meant to reflect the opinions of some larger population.
As a number of observers -- including the insightful Gary Langer at ABC News -- have commented recently, the wording of the Harris Interactive questions is not unemotional. "Anti-Christ" and "socialist," in our culture, are strong, value-laden words that carry a lot of negative emotional valence. Contrast them with more neutral question wording often used to measure positives and negatives: "Do you have a favorable or unfavorable opinion of Barack Obama?" or "Do you approve or disapprove of the way Barack Obama is handling his job as president?"
Moreover, the question was introduced with, "...here are some things people have said about President Obama," followed by only negative descriptors. This is certainly an unbalanced measure, offering no non-pejorative choices. Citing others' opinions first also is a known way to introduce what methodologists call "acquiescence bias" into the measures. Take a look at Langer's critique; it's worth reading.
Respondents in the poll come from a self-selected sample. Most Internet users have seen this kind of appeal: Join our research panel and earn rewards for filling out surveys. The folks at Harris Interactive say they've weighted their panel data to make it representative of the United States. And I'm sure they did their best.
But if you take a non-random sample to begin with, you still have a non-random sample when you've finished weighting.
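The weighting point can be sketched with a toy calculation (the numbers are entirely hypothetical, not from the Harris poll or any real panel): weighting an opt-in panel back to the population's demographic mix fixes the demographic skew, but if the kind of person who joins a panel differs in opinion from the kind who doesn't, that gap survives the weighting untouched.

```python
# Toy illustration of why weighting can't fix self-selection.
# All numbers are hypothetical, chosen only to make the arithmetic clear.

# Two demographic groups, each half the population.
pop_share = {"A": 0.5, "B": 0.5}

# Share holding some opinion in the full population...
true_rate = {"A": 0.30, "B": 0.50}
# ...versus among the self-selected people who opt in to panels.
panelist_rate = {"A": 0.45, "B": 0.65}

# The opt-in panel also over-represents group B.
panel_share = {"A": 0.3, "B": 0.7}

# Raw panel estimate: wrong demographic mix AND panelist opinions.
unweighted = sum(panel_share[g] * panelist_rate[g] for g in pop_share)

# Weighted estimate: demographic mix corrected to 50/50, but the
# opinions are still those of panelists, not the whole population.
weighted = sum(pop_share[g] * panelist_rate[g] for g in pop_share)

# What a true probability sample would estimate, on average.
truth = sum(pop_share[g] * true_rate[g] for g in pop_share)

print(f"unweighted: {unweighted:.2f}")  # 0.59
print(f"weighted:   {weighted:.2f}")    # 0.55 -- closer, but still off
print(f"truth:      {truth:.2f}")       # 0.40
```

Weighting nudges the estimate in the right direction, but the within-group gap between joiners and non-joiners remains in full -- which is the sense in which a non-random sample stays non-random after weighting.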
Let's be clear: There are plenty of times when it's appropriate to use online panels, especially in the world of marketing, where Harris Interactive has won awards for its research. The American Association for Public Opinion Research, the nation's leading professional association for opinion research methodologists, recently came out with a white paper concluding that while opt-in samples had their uses, researchers "should avoid non-probability online panels when one of the research objectives is to accurately estimate population values."
Cutting through the research-ese, the report concluded that these types of samples are less accurate than other, more traditional scientific sampling means. Even different kinds of opt-in panels can differ significantly in their results.
We can't throw all online panel babies out with all the Internet research bathwater. Gallup, Knowledge Networks and others use panels recruited from national probability samples rather than relying on opt-in sign-ups. And no survey research methodology is perfect: All have issues with nonresponse, coverage, measurement and other potential problems.
Methodologists can natter back and forth about the technical aspects. And survey methodology is worth nattering about, because we're trying to characterize the feelings and beliefs of a nation at an important time in history.
But what's more disturbing is the fuel that the polemic language in this type of poll -- and in some message-testing surveys that make it into the public eye -- provides to wingnuts on both sides of the political spectrum.
Researchers should be careful in their wording to get at opinions, not reactions to words that push a group's linguistic hot buttons. Bloggers and others of their ilk should be more measured in their writing and evaluate these polls more critically. And we readers? Let's get educated on how to interpret poll findings.
So go get the salt shaker, and make sure it has plenty of grains of salt. Polling for the 2010 races has just begun, and it looks as if these non-random panel polls will be active players.
Rob Daves, former director of the Minnesota Poll, is principal at Daves & Associates Research in Minneapolis. He also teaches survey methodology at the University of Minnesota's Humphrey Institute.