Kristen Soltis Anderson on the 2020 Polling Failure
Kristen Soltis Anderson is a pollster, speaker, commentator, and author of The Selfie Vote. During this podcast conversation, she talks about the 2020 polling failures—and why fixing them will be hard.
Click here to listen to our full conversation.
The following excerpt has been edited for length and clarity:
Last week I had Kristen Soltis Anderson on the podcast to discuss the 2020 polling failures and how to fix them. Instead, she explained why that will be very hard, if not impossible, to do.
Matt: I guess we have to start where probably all your talks are starting nowadays, which is: What happened with the polling?
Kristen: It's funny that when I first started in this industry 15 years ago, and I would tell people that I was a pollster, people would ask me, “Does that mean you're the one making the phone calls to people during dinner asking who they're voting for?” And I'd say, “No, no, we hire a call center that does that. I'm not going to be, you know, calling people up during dinner.” And I'd have to explain what polling is. Now it's almost the opposite direction. Everyone thinks they're an expert in my industry, and people actually think pollsters these days are about as popular as the Coronavirus. At best, people think our industry is flawed and we've missed the mark; what's worse, they think we're actively trying to put our thumb on the scale to get things wrong. After 2016, I was very sympathetic to a lot of that, because I, too, was very surprised by the election result in 2016, and was like, “Hey, that's our one job.” (It's not actually our one job, but it's the thing that the public thinks is our main job—understanding how an election might go.) But after the 2020 election, I think the story is a lot more complicated than it was in 2016. In 2016, you had this clear division where the national polls were pretty correct, but there were enough key swing states off by big enough margins that the entire picture of how the electoral college might go was thrown out of whack. And you had the industry take a hard look at what happened, and they came to the conclusion that two big things had happened in 2016. One was that you had late-deciding voters who, in the end, broke overwhelmingly for Trump—that if you just didn't really like both candidates, in the end you were more likely to break for Trump than Clinton—and the polls were not picking up on that late movement.
And number two, polls were not accounting for the number of non-college-educated voters, particularly non-college-educated white voters, who are especially prevalent in these upper Midwest states—they were being systematically under-polled. So the industry fixed those things. But if you talked to me over the last four years and asked, “Well, can we trust the polls?” my answer was always the same: “You should be cautiously skeptical.” That's because we're really good as an industry at fighting the last battle. We're really good at figuring out what went wrong last time and how not to do it again. But that's not the same as anticipating what might go wrong the next time. And I think there was a lot of discussion about what is thought of as the “Shy Trump Vote,” which is a term that gets thrown around and has a million different definitions depending on who you're talking to, but was sort of debunked as an actual reason for polling error in 2016… But I was not as dismissive of it this time around. Just because it didn't happen in big numbers last time didn't mean it wouldn't this time, because the country is in an even more polarized and frustrated moment. People are more worried than ever about, you know, quote-unquote being cancelled, etc. And I was just apprehensive about the idea that we could completely dismiss it this time around. And there's not enough data yet to confirm that “Shy Trump Voters” are a factor in this equation, because you have to answer: why were the polls really wrong in Florida, but really right in Georgia? Are there “Shy Trump Voters” in Florida, but not in Georgia? Or are there “Shy Trump Voters” in Wisconsin, but not next door in Minnesota? You've got enough unusual and hard-to-explain patterns of polling error this time around that there will not be one neat, simple unified theory of what went wrong, like there has been in the last few elections when polls missed the mark.
Matt: It is interesting, because, I mean, if you just look at Maine, for example: I don't think there was a single poll that showed Susan [Collins] up in that race, and she won. You know, Lindsey Graham, there was, what, $100 million or something spent against Lindsey Graham in South Carolina. Graham won by like 13 points—
Kristen: Yeah. And so if you think that “Shy Trump Voters” are the issue, why is it the case that in many of those same polls that showed Susan Collins in a neck-and-neck race, or down by a bit to Sara Gideon, those polls were not as far off the mark when it came to Trump? There were no polls saying, “Oh, Donald Trump might win [Maine],” you know. The Trump number was actually pretty on the money in some of the Senate polls, but it was Susan Collins's number or Lindsey Graham's number that was way off the mark… But there were just a lot of things that were going to be uncertain about this election in terms of what turnout was going to look like—the difference between early voters and day-of voters. The pandemic made it even harder for pollsters to do well this time around. And it's going to make it harder for us as an industry to unpack what went wrong in the days and weeks to come.
Matt: The most egregious example was, I think, an ABC/Washington Post poll that had Biden winning Wisconsin by 17 points. Everyone thought that was an outlier, to be sure. But how do respected outlets get something that wrong?
Kristen: What's going to be another factor that makes this so hard for the industry to unpack is that it would be one thing if all of ABC's polls had been off… [but] the ABC/Washington Post poll was the best media poll in Florida. They had Trump winning Florida by two, which is pretty close to the mark. So it would be one thing if you could say, “Well, these five pollsters were all garbage.” But these five pollsters were all good. Instead, you have a pollster totally blowing it in one state and then nailing it in another state. And the inverse of that is Trafalgar [Group], which has gotten a lot of attention in the days leading up to the election, where a gentleman named Robert Cahaly put out polls in places like Florida that also had, I think, Trump up by two in Florida. You know, there are many states where his polls are going to be much closer to the mark… and it'll be interesting in the places where the conventional polls really whiffed, and his polls were right. His problem is in the states where the [mainstream] polls were generally right, and his polls were off the mark. So he had, you know, Trump's going to win Arizona by a couple points, and that didn't wind up panning out. So there is not one single person who I think can emerge from this and say, “I got it right.”