October 17, 2006

CAN the Polls All Be "Screwy?" Of Course They Can

Hatched by Dafydd

Over at Power Line, John and Paul (but neither George nor Ringo) are intoning a mantra that "the polls can't all be screwy."

But in fact, they can be. I'm not saying they are; but it's entirely possible that every last poll has made a critical false assumption that will only show up when the final vote is tallied on November 7th. However, even if this is true, it might not be enough for the Republicans to retain Congress, unless they also improve in the public polling.

(By an amazing synchronicity, just as I was finishing this post, Hugh Hewitt's show came on -- and he was interviewing Scott Rasmussen on this exact question!)

Here is what Paul wrote:

The White House is pointing out that the polls used by major news organizations to show that voters strongly favor Democrats this year all employ samples in which voter id does not reflect historical norms. Specifically, in polls by USA Today/Gallup, CBS/NYTimes, ABC/WP, Newsweek, AP/Ipsos, Time, and Pew, Democrats exceeded Republicans by margins greater than those that existed in any recent election....

But what about all of those polls by Rasmussen, et al that show Democrats ahead in so many of the key races in individual jurisdictions? As John said, "the polls can't all be screwy."

The bias problem doesn't show up much in the wording of questions; it's hard to mess up a question like "do you plan to vote for Republican Rick Santorum or Democrat Bob Casey Jr.?" Especially when half the respondents hear instead, "do you plan to vote for Democrat Bob Casey Jr. or Republican Rick Santorum?" Nor is there any overt, deliberate attempt to pick an overly liberal (or conservative) pool of respondents.

But before putting stock in any poll, we must understand the provenance of polling in general. What pollsters report is not the raw percentage of how respondents (hereafter, "Rs") answered the poll questions; nor should it be. If a state's electorate is 12% black, but the poll sample ended up being 18% black, then that pool of respondents is not representative of the electorate, and the responses should be "weighted" to bring that into line.

Weighting means that the number of responses for each candidate that come from (self-described) black Rs is multiplied by 12/18, while the responses from white Rs are multiplied by the corresponding fraction, 88/82, thus bringing the total responses from each group of Rs down or up to what the pollster expects. In fact, pollsters simultaneously weight for a large number of such variables, all based upon their predicted "turnout models" for each of those subgroups of voters... and therein lies the rub. [I misstated the second fraction up there, but alert commenter PBRMan Stone caught me, thank goodness. One hates being caught, but not as much as one would hate not being caught!]
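As a concrete illustration, here is a minimal Python sketch of that reweighting. The vote counts are invented for the example, and real pollsters weight many variables simultaneously (a procedure called raking) rather than one at a time, so treat this as a toy model of the single-variable case only.

```python
def reweight(votes_by_group, sample_mix, target_mix):
    """Scale each group's responses so the sample matches the target electorate."""
    return sum(
        votes * (target_mix[group] / sample_mix[group])
        for group, votes in votes_by_group.items()
    )

# Hypothetical raw responses favoring one candidate, out of 1,000 Rs:
votes = {"black": 150, "white": 420}

sample_mix = {"black": 0.18, "white": 0.82}   # how the sample actually came out
target_mix = {"black": 0.12, "white": 0.88}   # the pollster's turnout model

# Black responses get multiplied by 12/18, white responses by 88/82:
adjusted = reweight(votes, sample_mix, target_mix)
```

Because the turnout model (`target_mix`) is the pollster's own prediction, every weighted result is only as good as that prediction -- which is exactly the point of the paragraphs above.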

In order to determine whether the poll sample includes too many or too few black, Hispanic, female, college-educated, impoverished, rich, or Catholic Rs, the pollster must first decide what the right number will be. But how do they do this?

First, of course, they look at past elections. In this case, that would mean the election of 2002, since the election of 2004 is not comparable: it's very hard to compare a purely congressional election to a presidential election, because the dynamics are completely different. But this backwards look is not sufficient, because circumstances have changed dramatically since then: for one thing, President Bush was polling at 60% or so in 2002 but only at about 40% today.

Thus, the pollster must adjust the expected turnout model to take these changes into account; and this is where the bias creeps in, probably unbeknownst to the pollster: how much less turnout should we expect from evangelical voters in 2006 vice 2002? How much more turnout of women, or blacks, or Hispanics?

Pollsters don't pick these numbers in the dark: they can start with demographic statistics from the Census Bureau, for example, telling them whether the black population of Pennsylvania has increased or decreased and by how much. But that doesn't necessarily predict whether the percent turnout of blacks in Pennsylvania will go up or down, or by how much: if a state passed a motor-voter bill that caused a big jump in registrations of 18 and 19 year olds, that doesn't necessarily imply an equivalent jump in 18 and 19 year olds actually voting.

But there is one controversial category that is the true wild card and will be the subject of the rest of this post: party identification. I'm not going to bother adding links for everything I say here; it's a research project all on its own. But here is the lowdown:

There is a huge, unresolved debate among pollsters: to what extent does party identification by an R actually reflect his party registration, and to what extent does it instead reflect which party he supports now? In other words, of all the people who now say they're "Independent," how many are actually registered Democrats or Republicans who are just saying they're Independent because they're unhappy with the direction their actual registered party has taken?

The vast majority of public pollsters resolve this problem by simply ignoring it: they cite the possibility that party ID might reflect actual voter intent as their reason not to weight by party ID at all. In fact, of the major public pollsters, only Rasmussen weights for party ID... and even they use a turnout model based upon (wait for it) polling! Thus, they ask Rs their party ID -- and use that to weight other poll samples for party ID. Yeesh!

(Hugh failed to ask Scott Rasmussen one question, the answer to which I've been dying to hear: since Rasmussen does weight for party ID, how often is he forced to adjust in favor of the Democrats, implying an oversampling of Republicans? My guess would be that he almost always adjusts in favor of Republicans, implying his samples -- thus the samples of many other pollsters who do not weight for party ID -- tend to overpoll Democrats.)

How much to weight for party ID is a weighty question for a very weighty reason: if poll samples consistently come up with significantly more Democrats and Independents than voted in the last comparable election (and consequently fewer Republicans), does that mean that a bunch of registered Republicans now consider themselves more in the Independent or Democratic camps -- hence will vote that way -- or does it mean there is an unidentified but systemic bias in the sample selection that will disappear when voters actually go to the polls?

In other words, should polls be weighted to "correct" the typical "oversampling" in favor of the Left in the pool of Rs, or does that supposed oversampling actually reflect true voter intent -- hence should not be eliminated by weighting?

And there is a related question that even further complicates the situation: assume some number of Republicans are mad at the party, so when asked their party affiliation, they say "Independent" or even "Democrat," and when asked who they will vote for, they say "Casey." What percent of them will, in the end, come back to the fold and vote for Santorum, even if they must hold their noses while doing so? After all, if you believe that a person will "switch" his party affiliation one direction, then he could jolly well switch it back in the voting booth, too.

The reality is that the percent of overpolled Democrats and Independents who are in fact "false-flag" voters -- voters who say they're one party while actually being another -- is neither 0% nor 100%; nor will all the false-flaggers actually vote for Democrats:

  1. Some of the increase pollsters see is genuine, and will result in greater turnout of registered Democrats and Independents, hence more votes for Democratic candidates;
  2. Some is false-flag, but committed: Republicans saying they're something else, but as a true sea-change in their thinking, which will carry through to the polls, resulting in more (Republican) votes for Democrats;
  3. But some is false-false-flag, meaning they're false-flagging now -- but in the end, for whatever reason, they will come to their senses and return to the fold, voting for the Republican candidate after all.

Every pollster would admit this, though you might have to get him drunk first. But nobody, and I mean nobody, actually knows what percent of the supposed "oversampling" of the Left is actually Type 3 -- thus leading to an actual, systemic bias in the polls in favor of Democratic candidates.

If (3) is but a small portion of the supposed overpolling, then the polls are likely fairly accurate -- as of this moment. Under this turnout model, the Left is not being oversampled much at all. But if the large increase in Democratic and Independent party ID is largely explained by false-false-flag voters, then the oversampling is real and could be significant.

The answer to this question changes from day to day, naturally: a committed false-flag voter can turn into a false-false-flag voter three days before the election, if he hears the right argument, either in an advert or from a neighbor.

For my own guess -- and that is all it really is -- I think that the percent of the overpolling that is false-false-flag is significant. Here is what the White House said in that press release linked by Paul above:

In short, between 1992 and 2004, only once did one party enjoy an advantage as large as 4 points over the other in party ID. But in recent polling samples used by eight different polling organizations (USA Today/Gallup, CBS/NYTimes, ABC/Washington Post, CNN/Opinion Research, Newsweek, AP/Ipsos, Pew, and Time), the Democratic advantage in the sample surveyed was never less than 5 points. All these organizations conducted surveys in early October. According to Winston, the Democrats held the following party ID advantages in these early-October surveys:

• USAToday/Gallup: 9 points
• CBS/NYT: 5 points
• ABC/WP: 8 points
• CNN: did not provide sample party ID details
• Newsweek: 11 points
• AP/Ipsos: 8 points
• Pew: 7 points
• Time: 8 points

While I'm sure there has been some honest "false-flagging" by registered Republicans who actually intend to vote Democratic, hence identify themselves as Independent to the pollster -- and even some actual party-registration switching away from the Republican Party -- I do not believe that it is such a staggering increase as we see here. 8 points? 9 points? 11 points?

However, there is no question that Republicans are running behind right now, even taking the false-false-flaggers into account. Rasmussen polls do weight for party ID; and even though they base their guess of turnout on polling, Scott Rasmussen just said (on Hugh Hewitt, remember?) that many of the races (including Sens. Mike DeWine and Rick Santorum) have Republicans so far behind that even upping turnout to the 2002 level doesn't put them ahead.

That is why I have estimated that systemic bias in public polling accounts for only 1% - 2%: that's my back-of-the-pants guess of the impact of type-3 "false-false-flag" Rs.

I also guess that the advantage Republicans enjoy on GOTV, money, and general skill at closing (including the power of incumbency) will give them an additional 3% - 4% on average, though not evenly distributed among the races. Thus, most Republicans who are only down by 4% or less in the last public polls before the election have an excellent chance of pulling it out.

So in answer to John's aphorism (and Paul's quotation of John's aphorism), yes, it's certainly possible that all the polls are, in fact, screwy. But it's impossible to know that for sure until after the election.

Also, even if screwy, there is no way to measure just how screwy they are: it might not be enough to make up for the Republican deficit.

But it might change the outcome in some close races. It's certainly worth trying to figure out how much of the "oversampling" actually reflects a real shift in the electorate, and how much is an improper oversampling that should be corrected by weighting.

Hatched by Dafydd on this day, October 17, 2006, at the time of 4:36 PM

Trackback Pings


Listed below are links to weblogs that reference CAN the Polls All Be "Screwy?" Of Course They Can:

» Can All the Polls Be Screwy? from Blog-o-Fascists
Power Line

Our friend Dafydd ab Hugh continues the conversation Paul and I have had over whether the polls that consistently show bad results for Republicans ca...

[Read More]

Tracked on October 17, 2006 7:30 PM

Comments

The following hissed in response by: BusterDBear

The only poll that matters is on Nov 7th.

The pollsters know the bias that can/does creep into their polls, and the honest pollsters try to prevent the bias from corrupting their results ---
but what about the editors who spin those results even further from the truth.

The above hissed in response by: BusterDBear [TypeKey Profile Page] at October 17, 2006 5:30 PM

The following hissed in response by: Hestrold

Polls, polls, polls, every two years I live and die by them, and then, come election day, they are WRONG, EVERY TIME. I should feel like a complete fool to get so worked up over the polls every time, but apparently I'm in good company. Everyone I know watches the polls, and now all my favorite bloggers watch them religiously.

If I wasn't watching the polls AGAIN, I couldn't imagine coming to the conclusion that the Democrats had a chance of capturing the House and Senate. After all, they have no plan, they want to cut and run, they want to raise taxes, they want to appoint liberal judges to the Supreme Court, they are not for missile defense. All they can do is throw fits.

But, wait, those polls....

The above hissed in response by: Hestrold [TypeKey Profile Page] at October 17, 2006 5:58 PM

The following hissed in response by: bill

If you ask a marketing research person, they will tell you phone surveys are NOW useless. Most people don't answer the phone or have only cell phones with caller ID; most who do answer hang up when they find out it's a survey; the rest might answer a few questions, then hang up; and what's left basically lie to sound like they are more reasoned than they are.

The demographics you get with phone surveys tend toward the low end of society, who aren't going to buy your product, or any product, anyway.

Bottom line, you won't find a marketing guy hanging his hat on a telephone poll for anything.

So how is it going to be different for political polls?

I know in our family, the answering machine gets the phone line, real calls come to our cells, everyone has their own.

The above hissed in response by: bill [TypeKey Profile Page] at October 17, 2006 6:56 PM

The following hissed in response by: Terrye

Rasmussen has changed its weighting since the last election. I saw it on their site. They have increased the numbers of Democrats by 5 points. They said that people were identifying themselves more that way.

And I thought, how does that work? How is that consistent?

The above hissed in response by: Terrye [TypeKey Profile Page] at October 17, 2006 6:57 PM

The following hissed in response by: PBRMan Stone

In the example you cite of blacks being overrepresented, 18% instead of 12%, shouldn't the weighting multiplier for non-blacks be 88/82 instead of the reciprocal of 12/18 (which is 18/12)?

The above hissed in response by: PBRMan Stone [TypeKey Profile Page] at October 17, 2006 7:03 PM

The following hissed in response by: Jim,MtnViewCA,USA

Sorry, a bit off-topic, but I want to mention that it is fascinating to watch the anti-Repub effort. The Dems and their MSM and gov't bureaucrat minions are SO desperate that the polling oddities may not be all chance. It might demoralize Repubs and it might serve to damp down investigation of successful cheating.
Certainly the last minute Oct surprises such as Rep Foley and the attacks on Rep Curt Weldon (the Able Danger guy) seem awfully convenient. All the stops are being pulled out.

The above hissed in response by: Jim,MtnViewCA,USA [TypeKey Profile Page] at October 17, 2006 8:46 PM

The following hissed in response by: Svolich

I know one reason the polls are crap. They're oversampling me.

This election season I'm getting about 3-4 poll calls per week. During the primary I was getting 1-2 per DAY. Once I had a poll beep in while I was talking to another poll; I put them on conference call.

Apparently they have such a hard time getting enough responses they have cheat sheets with the numbers of people that are willing to talk to them.

The above hissed in response by: Svolich [TypeKey Profile Page] at October 17, 2006 9:34 PM

The following hissed in response by: bpilch

Nothing would be funnier than to just have a 4 week version of the exit polls. Clearly the polls are oversampling Dems, but the question is by how much. Probably not enough to save the House, but maybe so with the GOTV machine. While on average we are expected to lose, our expected utility value is positive because if we managed to pull this one out our victory would be so huge and their defeat even huger (sic)....

The above hissed in response by: bpilch [TypeKey Profile Page] at October 17, 2006 10:38 PM

The following hissed in response by: RBMN

If pollsters know that their numbers are inaccurate, because people hate them and won’t talk to them, how do they try to pick the winner correctly at least?

By demoralizing one side so much, that they make their prophecy self-fulfilling. No?

The above hissed in response by: RBMN [TypeKey Profile Page] at October 17, 2006 10:41 PM

The following hissed in response by: Dafydd ab Hugh

Svolich:

Apparently they have such a hard time getting enough responses they have cheat sheets with the numbers of people that are willing to talk to them.

I wonder if that's literally true? If so, it would be a catastrophic scandal and would completely invalidate all polling, since the most basic, core requirement of a valid poll is that the sample not be self-selected.

You might have been the victim of the numerous political calls that masquerade as polls; I would have a hard time believing that any one person would be called repeatedly per day by pollsters such as Gallup, Field, Pew, Mason-Dixon, Quinnipiac, and the other respected public pollsters.

In all my adult life, I have never been contacted by a pollster I recognized... which is normal, with 200,000,000 adults in the country and only 1,000 to 1,200 contacted per poll.

(You should be contacted about once every 200,000 polls; and I doubt there are anywhere near 200,000 polls conducted during a person's lifetime -- more like 30,000 at most, probably many fewer. Thus, 85%+ of all people will probably never get contacted in their lifetimes.)
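Dafydd's parenthetical arithmetic can be checked directly. This sketch just plugs in his own stated assumptions (200,000,000 adults, about 1,000 Rs per poll, at most 30,000 polls over a lifetime); none of these are measured figures.

```python
adults = 200_000_000     # adult population, per the comment above
sample_size = 1_000      # Rs contacted per poll
lifetime_polls = 30_000  # his upper guess at polls conducted over a lifetime

p_one_poll = sample_size / adults                 # 1 in 200,000 per poll
p_never = (1 - p_one_poll) ** lifetime_polls      # chance of never being contacted

print(f"per-poll odds: 1 in {adults // sample_size:,}")
print(f"never contacted: {p_never:.0%}")          # about 86%, matching the "85%+" guess
```

Even at the generous upper bound of 30,000 polls, the expected number of contacts per person is only 30,000/200,000 = 0.15, which is why most people never hear from a legitimate pollster.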

Dafydd

The above hissed in response by: Dafydd ab Hugh [TypeKey Profile Page] at October 17, 2006 10:47 PM

The following hissed in response by: Dafydd ab Hugh

PBRMan Stone:

Oops, I think you're right. This always happens when I try to write at the speed of thought (I'm a fast typist, but not that fast!): I end up typing one paragraph while writing the next one in my head, and I make foolish errors.

I corrected it. Thanks!

Dafydd

The above hissed in response by: Dafydd ab Hugh [TypeKey Profile Page] at October 17, 2006 10:50 PM

The following hissed in response by: Svolich

I wonder if that's literally true? If so, it would be a catastrophic scandal and would completely invalidate all polling, since the most basic, core requirement of a valid poll is that the sample not be self-selected.

I don't know if my interpretation is true, but my description of my experience is accurate.

Many of the calls we get appear to be testing a message. They start off asking if I've heard of the following candidates, then give a pretty bland description (X is endorsed by the Sierra Club, Y is endorsed by the California Peace Officers) now how do you feel about them?

There have been a couple of "push polls" but they're pretty easy to spot (X was found having sex with a Llama! Now how would you vote?) and we hang up on them.

But many of them are just straightforward are you going to vote, who are you going to vote for.

One thing I have noticed. All the professionals talking about sampling recently talk about how everything is autodialed and totally opaque. But many times I've told callers that I can't talk to them right now, if they'll call back in half an hour I'll be glad to. Every time they've said thanks, didn't ask for my number, and they called back.

I can usually get them to talk a good deal about themselves. I'll say, sure, I'll talk with you, but I want to know a couple of things. Where are you calling from, are you paid by the survey, how many do you do in a shift, that kind of thing. I'll start asking who they're with, I know we've gotten Zogby, Rasmussen, Field and Gallup in the past.

The above hissed in response by: Svolich [TypeKey Profile Page] at October 17, 2006 11:39 PM

The following hissed in response by: Charlie

Svolich,

You may well have been the victim of an onslaught of push polls. While no reputable pollster would ever consider such a shady and unprofessional tactic, they are actually quite common outside the legitimate polling industry, especially close to the end of a campaign.

One way to spot a push poll is abruptness: people conducting push polls want to get you on the phone, say their piece (usually including something that might make you "more likely or less likely to vote for" somebody), and move on to the next person as soon as possible. The purpose of a push poll is to disseminate information (whether true or not so true), not to gather it, so they're going for maximum call volume.

Ergo, they won't waste time with useless things like demographic questions (age, religion, party, ideology, income, etc.). These are things that a legitimate pollster would consider invaluable information; legit tracking polls in the closing days of an election season do usually run short, but any poll lacking these basic questions should be considered suspect.

Careful, though...just because a poll tests your reaction to unflattering things about a candidate, doesn't mean it's a push poll. For all you know, that very candidate may have commissioned that poll to test his/her own negatives and see if pre-emptive damage control might be necessary.

As for "cheat sheets," I can guarantee you no such thing happens (again, I must include the qualifier "among reputable pollsters"). As I see you and Dafydd have both surmised, that would take the random element out of the sample, skew the poll, and render the results useless to anyone interested in accuracy.

The above hissed in response by: Charlie [TypeKey Profile Page] at October 18, 2006 12:48 AM

The following hissed in response by: Charlie

bill,

I had to chuckle when you remarked how marketing research folks dismiss the usefulness of phone polls. That's the beauty of political polls...people absolutely love to talk to political poll interviewers (at least, certainly, more so than to marketing survey interviewers). People are a lot more likely to sound off if you show interest in their political views than if you ask them about their preferences with regard to, say, cell phone service providers.

For one thing, political polls are less likely to be lumped in under the rubric of the toxic "telemarketing" label, especially if (as Svolich mentioned) a quick couple of questions about the caller's bona fides will usually reveal to the respondent's satisfaction that the caller will not be ending the survey with a sales pitch.

For another, the high profile of political polls brings with it a certain cachet; if political polls have been all over the news, potential respondents frequently like the idea of taking part in one given the chance.

Moreover, politics are simply easier to engage in a conversation about than some product that the respondent may or may not even have any use for. You all know the blogosphere well enough to know that anybody at all (no offense intended, Dafydd, of course :::grin:::) can toss out an opinion on politics. On the other hand, how many of you would be interested in a marketing survey about a nationwide auto parts chain?

Yes, I realize this all sounds a lot like amateurish psychobabble (or worse, cheap rationalization), and I'm telling you, if I didn't know it to be true I'd be rolling my eyes right along with you. All I can tell you is that, for lack of a better word, people just find politics more fun to talk about than most other survey topics.

The above hissed in response by: Charlie [TypeKey Profile Page] at October 18, 2006 1:12 AM

The following hissed in response by: Robert Schwartz

Figures don't lie. Liars figure. The polls are all run by people with political agendas.

Once upon a time I did some graduate work in statistics and polling. The one thing I learned is that the answers to pollsters' questions are determined by the structure of the poll. The issues in constructing the poll (e.g., the wording of the questions asked, what order the questions are asked in), sample construction (my wife hangs up on pollsters, my daughter has a cellphone and no telephone landline, many people routinely screen calls with their answering machines), and survey methods (in the 1950s pollsters went door to door; now they just telephone) are insuperable.

Political junkies and pollsters speak a different language than ordinary Americans. Most people do not have a coherent ideology and will answer specific questions, particularly ones that are not in the front of the public agenda in a very offhand way. Terms like liberal and conservative (and even Republican and Democrat) have neither fixed nor coherent meanings for most people.

Furthermore the psychology of the polling situation is never considered. The answers to poll questions reveal only what the sample thinks will cause the pollster to have the emotional reaction to the sample, that the sample wants to occur. I.e. if the sample wants the pollster to be pleased to talk to the sample, the sample will express opinions that the sample believes will please the surveyor. Because the majority of samples want to make the pollster happy, they will tend toward bland conventional wisdom, usually derived from the mainstream media.

Or to put it more simply, polls are mirrors that reflect the images pollsters want to see. The only possible conclusion is that poll results tell us a great deal about the pollsters' Weltanschauung, and almost nothing about the preferences and beliefs of the sample. As I always say, you tell me the results you want from a poll and we can get them.

Ignore polls, they are worthless.

The above hissed in response by: Robert Schwartz [TypeKey Profile Page] at October 18, 2006 6:56 AM

The following hissed in response by: jaybird

There was a time, somewhere between 1952 and 2000, when public opinion sampling had advanced and evolved as an art and science to the point that political polling was devastatingly accurate. Practically pin-point. For example, in 1964 the polls nailed LBJ's landslide victory with dead-eye accuracy weeks before election day. And there was none of this oversampling-undersampling stuff. The goal was to simply get a snapshot of where things stood, and they did it. But a funny thing happened somewhere along the line. Pollsters began taking polls for other reasons than to merely gauge and assess public opinion at a given moment. Instead they began to use polling and the polls offensively, as a way to massage and manipulate the public, as a way to shape public opinion. I'm not talking about using the results of polls to do these things. I'm talking about using the very polls themselves to do it. Hence, we have phenomena like push-polling, and the practice of election day exit polling skewed to suppress the vote in other geographic areas. And the instances of plain old dishonest polling shaded to make a candidate appear more viable and competitive than he really is, thereby manufacturing instant Big Mo.

And so then after an election cycle or two of that, another funny thing happened. People figured out that polls are fraudulent, or the next thing to it. And so now people play games with the pollsters. Personally, I just don't believe polls anymore.

The above hissed in response by: jaybird [TypeKey Profile Page] at October 18, 2006 7:14 AM

The following hissed in response by: Svolich

Charlie -

I haven't been tracking the demographic questions, I must start keeping a log. But the vast majority of the calls I get DO include demographic questions. I lie for that portion - my real ethnic background is mixed European/Asian/black/Hispanic/Russian/pacific islander/native American, and they never have a choice for that, so I pick one at random.

I also never tell them my real income, I say it's $60k. It's none of their business.

One more amusing point - I live in a very Hispanic city, so many of our candidates have Hispanic names. Like Hermando de la Libertad. The callers can't pronounce their names, I have to help them through.

If the length of call is an indication as to whether it's a push or not, most aren't. They run 10-15 minutes.

The above hissed in response by: Svolich [TypeKey Profile Page] at October 18, 2006 7:38 AM

The following hissed in response by: Whitehall

Two points.

I am a Republican and SELF-SELECTING OUT. A contact by a poll taker is politely refused. Why? I no longer trust polls - they have become offensive (in both uses of the term) weapons in the political battles. I'm sure I'm not alone on this and so the general predictive ability of polls MUST decline.

Second, the point about the randomness of polls is incorrect. If polled populations were in fact random across the country, then the odds are one would not be contacted. However, polling is concentrated in contested elections where ample money is spent. No one is polling (much) in California's Senate race, since Dianne Feinstein is a shoo-in. Rhode Island and Connecticut will have a much greater intensity of polling per capita.

At heart is the goal of polling. Is it to predict the future? Or is it to collect tactical information? Or is it to provide tactical advantage? Given the declining track record of polls in the first, I'd suggest that the motivations of pollsters have shifted to the second and third possibilities.

The above hissed in response by: Whitehall [TypeKey Profile Page] at October 18, 2006 9:19 AM

The following hissed in response by: TallDave

Remember Bill Kristol in 2004?

"Exit polls are never wrong. These people are very smart. They know what they're doing."

You know, so many articles have been written about "Republicans facing defeat." Wouldn't it be funny if all those tons of newsprint were based on polling bias?

The above hissed in response by: TallDave [TypeKey Profile Page] at October 18, 2006 10:59 AM

The following hissed in response by: SheilaG

I agree that the polls they publish are often suspect.

But it seems like political candidates often have very different polls. They seem to know what's going on more than the newspapers do, and we see that in the allocation of money to different races, etc.

How do they poll? And is it more scientific?

The above hissed in response by: SheilaG [TypeKey Profile Page] at October 18, 2006 11:08 AM

The following hissed in response by: betapi

Polls can't be accurate. Because I hang up on ANYONE calling me to sell something, or poll. Not only that, I'm on the Federal NoCall list. Neither do I listen to any pre-recorded candidate message.
My phone, my time, my choice not to share. Hey, it's my opinion, let 'em pay me for it. Better still, I got no time for them anyway. I hang up at the earliest possible moment, without either saying 'no thanks' or 'fuggoff'. Wastamytime. I vote - the only poll that counts.

The above hissed in response by: betapi [TypeKey Profile Page] at October 18, 2006 1:17 PM

The following hissed in response by: KarmiCommunist

A Poll Question that is rarely (if ever) asked: Will you vote against Democrats or against Republicans?

The above hissed in response by: KarmiCommunist [TypeKey Profile Page] at October 18, 2006 2:13 PM

The following hissed in response by: Dafydd ab Hugh

Robert Schwartz:

Yet on the other hand, the well-respected public polls (like Gallup) are all more or less accurate when the election is actually held; they're usually off, but only by a couple percentage points.

So they cannot be entirely "worthless."

Dafydd

The above hissed in response by: Dafydd ab Hugh [TypeKey Profile Page] at October 18, 2006 2:19 PM

The following hissed in response by: Navyvet

It's been a while since I've been down this road, but Bill's first comment is an important one (not to disparage any of his other comments).

Let's assume you're conducting a poll and place 1,000 completed calls. (A completed call being one where you reach a real, live person; answering machines and no-answers don't count.) Of those 1,000 completed calls, only a percentage of the folks on the other end of the line will agree to participate in the poll. Some hang up (often after making a rude comment).

These non-participants can make up 40% to 60% of the calls!

Pollsters merely continue placing calls until they finally acquire their target number of Rs. By the time the total reaches 1,000 completed polls, another 700 to 1,500 calls have been made to non-Rs.
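The arithmetic behind those figures is easy to sketch. Here's a back-of-envelope illustration (not any pollster's actual methodology) of how many extra calls a given refusal rate forces:

```python
def extra_calls(target_rs: int, refusal_rate: float) -> int:
    """Completed calls needed beyond the target, if `refusal_rate`
    of the people reached decline to participate."""
    total_needed = target_rs / (1 - refusal_rate)
    return round(total_needed - target_rs)

# A 40% refusal rate wastes ~667 calls per 1,000 respondents;
# at 60% it climbs to 1,500 -- roughly the range cited above.
print(extra_calls(1000, 0.40))  # 667
print(extra_calls(1000, 0.60))  # 1500
```

And, as noted, nothing in that arithmetic tells you whether the refusers look like the responders.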

No one knows anything about the people who declined to participate. Are they registered voters? Did they vote in the last election? Are they more inclined to be Democrat, Republican, or Independent?

One could make an argument that, post-Foley, many Republicans might be reluctant to answer polling questions while Democrats might be more inclined to do so.

Pollsters don't like to talk about the flaws in their methodology, but I learned long ago to treat poll results as little more than SWAGs.

The truly frightening thing is that at least some Democrats are saying that if the Republicans aren't defeated in November's balloting, then the election must be rigged, because all the polls say Democrats should win!

The above hissed in response by: Navyvet [TypeKey Profile Page] at October 18, 2006 8:11 PM

The following hissed in response by: Stinson7

Last spring I took a class called Voting and Public Opinion that was, more or less, about polls and the volatility of the American electorate. I went into the class expecting my professor to debunk polls; instead, I left understanding that in most cases polls are accurate (within their margin of error, of course). Below is a list of the final polls for the 2004 presidential race. You'll notice that 10 of the 14 had BUSH leading, 2 showed a tie, and 2 showed Kerry leading. You'll also notice that if there was any oversampling of Democratic voters, it was probably only 1 to 2 percentage points. Furthermore, only 2 polls were inaccurate outside their margin of error (Fox News and Newsweek). All of this was very discouraging news for me last spring, when I began to sense a troubled midterm for the GOP.

Bush = first percentage, Kerry = second.

Marist (1026 LV) 49% 50%
GW/Battleground (1000 LV) 50% 46%
TIPP (1041 LV) 50% 48%
CBS News (939 LV) 49% 47%
Harris (1509 LV) 49% 48%
FOX News (1200 LV) 46% 48%
Reuters/Zogby (1208 LV) 48% 47%
CNN/USA/Gallup(1573 LV) 49% 49%
NBC/WSJ (1014 LV) 48% 47%
ABC/Wash Post (2904 LV) 49% 48%
ARG (1258 LV) 48% 48%
CBS/NY Times (643 LV) 49% 46%
Pew Research (1925 LV) 51% 48%
Newsweek (882 LV) 50% 44%

Actual results: Bush 50.7%, Kerry 48.5%
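That margin-of-error claim is easy to spot-check with the standard 95% formula for a proportion. This is a rough sketch (real pollsters apply design adjustments; the figures below are just the published toplines, with three of the polls above as examples):

```python
import math

def margin_of_error(n: int, p: float = 0.5, z: float = 1.96) -> float:
    """Worst-case 95% margin of error, in percentage points."""
    return 100 * z * math.sqrt(p * (1 - p) / n)

actual = {"Bush": 50.7, "Kerry": 48.5}
# (sample size, Bush %, Kerry %) taken from the list above
polls = {"FOX News": (1200, 46, 48),
         "Newsweek": (882, 50, 44),
         "Harris":   (1509, 49, 48)}

for name, (n, bush, kerry) in polls.items():
    moe = margin_of_error(n)
    miss = max(abs(bush - actual["Bush"]), abs(kerry - actual["Kerry"]))
    flag = "OUTSIDE" if miss > moe else "within"
    print(f"{name}: MoE ±{moe:.1f} pts, worst miss {miss:.1f} pts ({flag})")
```

Fox News (miss 4.7 vs. ±2.8) and Newsweek (miss 4.5 vs. ±3.3) do indeed fall outside their margins, while Harris (miss 1.7 vs. ±2.5) stays within -- consistent with the claim above.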

Thus, the current polls are probably accurate to within 3 percentage points (here's hoping those 3 points lean right). In other words, the GOP is in for a serious shellacking on November 7th. My hope, like that of many Republicans, is that this defeat will bring the GOP back to its roots (which it has so clearly departed from).

The only good news, which the media has consistently failed to understand, is that when the American public votes to put a party in power, it is not necessarily endorsing that party; more often than not, it is simply choosing the lesser of two evils. Public opinion is shifty and can change rapidly. My class studied the seesaw elections of 1946, '48, '50, '52, and '54, as well as those of '80 and '82, and '94 and '96. In each of those cases, public opinion shifted from one party to the other in back-to-back elections. So after the election, when you start hearing the media say that the American public has given up on the GOP, and they run stories on party-jumpers to prove their point, and they run stories like Time's claiming the Republican Revolution is over, turn your TV off; it will all be utter nonsense.

The above hissed in response by: Stinson7 [TypeKey Profile Page] at October 18, 2006 11:48 PM

The following hissed in response by: Dafydd ab Hugh

Stinson7:

We're not talking about presidential polls, which are much easier to get right. This is a congressional election. How well were pollsters predicting the outcome of the 2004 congressional elections at this point in that year's campaign?

Although there were 469 separate elections across 50 states and the District of Columbia, let's just restrict our inquiry to what the pollsters mostly polled: the so-called "generic congressional" poll.

In the House election, the actual vote ended up 49.2% Republican and 46.6% Democratic, for a Republican advantage of 2.6%.

Here are all the polls listed on Polling Report that conducted polling around today's date, October 19th, in 2004:

Battleground, Oct 18-21: 42% R, 46% D, D+4%
USA Today/Gallup, Oct 22-24: 50% R, 47% D, R+3%
NBC/WSJ, Oct 16-18: 44% R, 44% D, Tie
Newsweek, Oct 21-22: 47% R, 46% D, R+1
Democracy Corps, Oct 20-21: 44% R, 49% D, D+5%
AP/Ipsos, Oct 18-21: 46% R, 47% D, D+1%
CBS/NYT, Oct 14-17: 39% R, 45% D, D+6%

Average: 44.6% R, 46.3% D, D+1.7%

That means the average of the polls was off by 4.3% (and on the wrong side)... which may be within the margin of error, but just barely; and 4 out of 7 called the winner wrong, with only 2 of the 7 calling the Republicans as winners.
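For anyone who wants to check the arithmetic, averaging the seven polls listed above reproduces those figures:

```python
# (Republican %, Democratic %) for the seven generic-ballot polls above
polls = [(42, 46), (50, 47), (44, 44), (47, 46),
         (44, 49), (46, 47), (39, 45)]

avg_r = sum(r for r, _ in polls) / len(polls)
avg_d = sum(d for _, d in polls) / len(polls)
poll_margin = avg_r - avg_d        # negative: Democrats ahead
actual_margin = 2.6                # actual 2004 House vote: R +2.6

print(f"Average: {avg_r:.1f}% R, {avg_d:.1f}% D")          # 44.6% R, 46.3% D
print(f"Error: {actual_margin - poll_margin:.1f} points")  # 4.3 points
```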

This time, the error is likely to be worse, because the pollsters have changed their methodology for predicting voter turnout. They assume that, e.g., religious people, males, marrieds, seniors, and other typically GOP-leaning subgroups will have very depressed turnout. Contrariwise, they predict that union members, women, singles, and young voters -- all Democratic-leaning cross-tabs -- will have extremely high turnout.

But there seems no basis for making this judgment -- other than as a proxy for the pollster really saying "Republicans will be depressed and Democrats enthusiastic, translating into Republicans losing, thank goodness!"
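To see how much those turnout assumptions matter, here is a deliberately simplified, entirely hypothetical illustration -- every group and number below is invented -- showing how the same raw responses yield different toplines under different turnout weights:

```python
# % Republican support within each (invented) demographic bloc
support = {"GOP-leaning blocs": 70.0, "Dem-leaning blocs": 30.0}

def topline(turnout_weights: dict) -> float:
    """Weighted Republican topline under a given turnout model."""
    return sum(w * support[g] for g, w in turnout_weights.items())

even_model = {"GOP-leaning blocs": 0.50, "Dem-leaning blocs": 0.50}
dem_wave_model = {"GOP-leaning blocs": 0.44, "Dem-leaning blocs": 0.56}

print(topline(even_model))      # 50.0 -- a dead heat
print(topline(dem_wave_model))  # 47.6 -- same answers, different "winner"
```

The respondents never changed their minds; only the pollster's guess about who shows up did.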

So if you're really expecting a 30-40 seat blowout (whether rooting for it or dreading it), you're going to be shocked on the 7th.

Dafydd

The above hissed in response by: Dafydd ab Hugh [TypeKey Profile Page] at October 19, 2006 1:16 AM

The following hissed in response by: Rovin

So if you're really expecting a 30-40 seat blowout (whether rooting for it or dreading it), you're going to be shocked on the 7th.

Dafydd

Of course, we will all hear from the mountaintops (left-wing nutcases) that the Republicans stole another election, rather than that the electorate came to its senses and actually voted in favor of a secure nation. Five years and no attack on our shores goes a long way for a party the pollsters predict should/could lose.

Does anyone remember the Voter News Service, which turned out to be an embarrassing joke that the media embraced?

The above hissed in response by: Rovin [TypeKey Profile Page] at October 19, 2006 8:22 AM

The following hissed in response by: Thanos

The other thing your polling professor probably didn't tell you is that the only accurate polls are those taken close to the election. A pollster's reputation is on the line once you get within two weeks if he's still producing flawed data, so invariably, during the waning days, the poll numbers inexplicably but reliably shift toward reality. Any polls taken from here on out will likely be as accurate as possible, but polls taken before 10/15 likely contain large flaws.
One other note: Zogby has admitted that telephone polling has gone to hell, and that this year more of their results are obtained online.

The above hissed in response by: Thanos [TypeKey Profile Page] at October 20, 2006 3:05 PM


© 2005-2009 by Dafydd ab Hugh - All Rights Reserved