Who Answers Polls? Do You?

Nate Silver recently gave a good description of something that’s been gnawing at me lately . . . do shifts in enthusiasm change the response rates of partisans?

Is part of Romney’s vote-preference bounce due to a shift in enthusiasm? And what does that mean for response bias in the pre-debate polls? Here’s Silver:

Polling firms are hoping that the 10 percent of people that they do reach are representative of the 90 percent that they don’t, but who will nevertheless vote. But there are no guarantees of this, and it is really something of a leap of faith. The willingness to respond to surveys may depend in part on the enthusiasm that voters have about the election on any given day.

Aren’t adults who are more enthusiastic about politics also more likely to vote? Sure, but the polls have other ways of handling that problem. The platonic ideal is to reach a random sample of all American adults, and then to apply a likely voter method to eliminate those who are unregistered or unlikely to vote. If you have a sample that is biased toward enthusiastic respondents to begin with, and then apply a likely voter screen, it risks double-counting the enthusiasm factor, especially in cases like presidential general elections when overall turnout is quite high.
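To make the double-counting that Silver describes concrete, here is a minimal simulation sketch in Python. All of the numbers (the enthusiasm gap, response rates, turnout rates) are invented for illustration, and this is not Silver’s model; the only assumption carried over from his argument is that enthusiasm raises both the chance of answering the poll and the chance of passing a likely-voter screen.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 1_000_000  # hypothetical adult population

# Invented numbers: a 50/50 electorate where R supporters are
# temporarily more enthusiastic than D supporters.
party = rng.choice(["D", "R"], size=N)
p_enthusiastic = np.where(party == "R", 0.7, 0.5)
enthusiastic = rng.random(N) < p_enthusiastic

# Enthusiasm raises the chance of answering the poll ...
p_respond = np.where(enthusiastic, 0.12, 0.08)
responded = rng.random(N) < p_respond

# ... and the chance of actually voting; the likely-voter screen is
# modeled crudely as keeping only respondents who will in fact vote.
p_vote = np.where(enthusiastic, 0.9, 0.6)
will_vote = rng.random(N) < p_vote

def margin(mask):
    """R-minus-D share, in percentage points, within the masked group."""
    sub = party[mask]
    return 100 * ((sub == "R").mean() - (sub == "D").mean())

print("true margin among actual voters:       %+.1f" % margin(will_vote))
print("poll margin (respondents + LV screen): %+.1f" % margin(responded & will_vote))
```

In this toy setup the poll margin comes out a few points larger than the true margin among voters, because the enthusiastic side is over-represented once in who responds and again in who passes the screen.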

Here’s the added twist, though . . . If polling is susceptible to this kind of response bias, perhaps the pre-debate polls were overestimating Obama’s advantage. On the Republican side, there is more than enthusiasm in the response equation. Anecdotally, a fair chunk of conservatives won’t answer polls because they think the polls are biased against them, and another group will answer with made-up responses for similar reasons.

When we’re talking about shifts in the vote of around 3 points, small shifts in response bias among the 90-95 percent of citizens who don’t answer surveys could make a significant difference.
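Here’s a back-of-envelope sketch of that sensitivity (the numbers are invented for illustration): with a 50/50 electorate, roughly 10 percent of people responding, and no change at all in anyone’s vote preference, a shift of just 0.3 percentage points in each side’s response rate moves the reported margin by 3 points.

```python
# Back-of-envelope sketch (all numbers invented): a 50/50 electorate,
# an overall response rate near 10 percent, and no change in vote
# preference -- only in who picks up the phone.

def poll_margin(resp_d: float, resp_r: float) -> float:
    """Reported R-minus-D margin, in points, for a 50/50 electorate
    when D and R supporters respond at the given rates."""
    d, r = 0.5 * resp_d, 0.5 * resp_r
    return 100 * (r - d) / (r + d)

print(poll_margin(0.10, 0.10))    # 0.0   -- both sides answer 10%: tied poll
print(poll_margin(0.097, 0.103))  # +3.0  -- each rate shifts by 0.3 points
print(poll_margin(0.09, 0.11))    # +10.0 -- each rate shifts by 1 point
```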

Do you answer surveys? If not, why not? And if so, why? Do you give made-up answers?