Editor’s note: John Fox is principal of John Fox Marketing, Cincinnati.
For nearly two years, we Americans were inundated with presidential election news, debates, commercials and polls. As we spent most of 2012 muddling through the Republican primary season, and even after the field narrowed to Barack Obama and Mitt Romney, the bombardment hardly let up. Every day – news about the candidates, news before and after the debates, endless (mostly negative) commercials and those continual polls!
For those of us lucky enough (ahem!) to live in a swing state like Ohio, we were overwhelmed by postcards and a never-ending onslaught of phone calls. Most of the calls were of the prerecorded or robo type. I personally heard from Mitt Romney himself, Clint Eastwood and even Doris Day (who surely didn’t sound age 90)! These calls were either telling us whom to vote for, more likely whom not to vote for, or “just a quick 30-second survey” which would then take at least five minutes.
After the 27th automated research survey, I actually received a call from a live human being stating that she had only three questions to ask. I boldly told her, “Don’t even ask. Here are my responses: (presidential candidate), (senatorial candidate), independent!” The caller was stunned. I had aced the three questions, in the proper order. She thanked me and moved on to her next call.
As November 6 approached, the pundits (I love that word) from both sides were constantly quoting the polls. There were the national polls but very quickly everyone dismissed them because, as we all know, the president is not elected by a national vote (just ask Al Gore). What really counted were the polls in the nine swing states, where, you know, all the money was spent on those lovely smear ads.
The Democratic pundits were singing the praises of the polls from Ohio, Florida and the rest, most of which had Obama leading. The Republican pundits (try saying that three times, fast) were either denying these polls or else quoting other survey results. Those had probably been conducted by organizations that were, well, let’s say (to use a double negative), less non-biased than the ones the various news networks were quoting.
They were also trying to denigrate the more independent polls by questioning whether the sample sizes were large enough or whether the samples were representative of the entire population. This charge assumed that the samples underrepresented the following groups: cell-phone-only users (young people); households without phones (very poor people); some who can’t hear their phone ring (older people); and, most important, the huge group of those too busy to answer a phone or a research survey – presumably rich people.
Now, I’ve been in the marketing research business for more than 25 years. I have learned that a sample size of even 300 or 400 is sufficient to yield statistically projectable results regardless of the geographic area covered in the research. I have also learned that if you are missing from your sample a few people from various specific demographic segments, this tends to wash out and the results will still be predictable. And that’s precisely what happened in the 2012 presidential election polls.
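That sample-size claim is easy to check with the standard margin-of-error formula. A quick back-of-the-envelope sketch (my own illustration, using the worst-case 50/50 split, not numbers from any particular poll):

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """95% margin of error for a simple random sample of size n.

    p=0.5 is the worst case (maximum variance); z=1.96 is the 95%
    confidence z-score. Assumes the sample is small relative to the
    population, so no finite-population correction applies -- which
    is exactly why the geographic area covered barely matters.
    """
    return z * math.sqrt(p * (1 - p) / n)

for n in (300, 400, 1000):
    print(f"n={n}: +/-{margin_of_error(n) * 100:.1f} points")
# n=300: +/-5.7 points
# n=400: +/-4.9 points
# n=1000: +/-3.1 points
```

So even 300 or 400 respondents pin the result down to within five or six points, whether the "population" is one county or the whole state.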
Obama ended up winning eight of the nine swing states and the polls were totally correct. In fact, it turned out that most of those who the Republican experts thought might have been underrepresented in the polls were precisely the demographic mix that the post-election pundits (from both parties) identified as winning the election for Obama – well, except for that wealthy segment.
Nate Silver, of the FiveThirtyEight blog, received the most credit for his prognostications on the election. He absolutely nailed it, right down to predicting the 332 electoral vote total for Obama. How did he do it? He apparently has an algorithm (having nothing to do with Al Gore’s dancing) and the polls were certainly part of it.
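To be clear, Silver hasn’t published his exact recipe, but the simplest poll-based aggregation idea can be sketched in a few lines. This is a toy illustration with made-up numbers, weighting each poll only by its sample size; the real FiveThirtyEight model also adjusts for recency, pollster house effects and state demographics:

```python
def weighted_poll_average(polls):
    """Toy poll aggregator: weight each poll by its sample size.

    `polls` is a list of (candidate_share, sample_size) tuples --
    hypothetical numbers, not any real state's data. A bigger sample
    simply counts for more; nothing fancier than that.
    """
    total_n = sum(n for _, n in polls)
    return sum(share * n for share, n in polls) / total_n

# Hypothetical swing-state polls: (candidate share, sample size)
polls = [(0.50, 600), (0.52, 400), (0.49, 1000)]
print(f"weighted average: {weighted_poll_average(polls):.3f}")
# weighted average: 0.499
```

Even this crude averaging smooths out the noise in any single survey, which is a large part of why poll aggregates beat individual polls.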
Immediately following the election, Silver published a list of the most- and least-accurate election polls, by research organization. The most accurate was a company called IBD/TIPP, which conducted its research over the phone. The second-most accurate was Google Consumer Surveys, conducted totally online – a triumph for the direction in which marketing research is going (and quashing the hypothesis that this methodology may not be representative). The least accurate, by far, was Gallup – that huge, long-standing firm known for its, uhhhhh, Gallup Poll.
As a member of the marketing research industry, I am very proud of the accuracy of these surveys that harangued all of us ad nauseam during 2012. Marketing research is the real winner of the 2012 presidential election.