Archive.fm

The Social Contract with Joe Walsh

43 Days Out: Handicapping The Race with Carl Allen

Pollster, handicapper, and analyst @realcarlallen joins me to handicap the presidential race with 43 days to go. Who's in a better position right now, and what factors might change things in these final 43 days? Fascinating stuff. Have a listen.


Hosted on Acast. See acast.com/privacy for more information.

Duration:
44m
Broadcast on:
24 Sep 2024
Audio Format:
mp3

Explaining football to the friend who's just there for the nachos: hard. Tailgating from home like a pro with snacks and drinks everyone will love: an easy win. And with Instacart helping deliver the snack time MVPs to your door, you're ready for the game in as fast as 30 minutes. So you never miss a play, or lose your seat on the couch, or have to go head to head for the last chicken wing. Shop game day faves on Instacart and enjoy $0 delivery fees on your first three grocery orders; offer valid for a limited time, other fees and terms apply. Expand the way you work and think with Claude by Anthropic. Whether brainstorming solo or working with the team, Claude is AI built for you. It's perfect for analyzing images and graphs, generating code, processing multiple languages, and solving complex problems. Plus, Claude is incredibly secure, trustworthy, and reliable, so you can focus on what matters. Curious? Visit claude.ai and see how Claude can elevate your work. Hey there, it's Farnoosh Torabi, host of the So Money podcast. Imagine having a super smart and reliable virtual friend to help guide your finances. Well, besides me. Enter Claude by Anthropic. With the power of AI, Claude can help streamline your financial tasks, from analyzing data and charts to generating code for investment models or budget trackers. Claude's advanced reasoning helps you tackle the toughest financial challenges. Take control of your financial future with the right tools. Visit claude.ai today and start saving time and money. America, we got to get back to doing what's expected of us as free citizens in this democracy. That means being tolerant, being respectful, staying informed, and being engaged. I'm former Congressman Joe Walsh. Follow me here and join the millions of Americans who are renewing their social contract with each other. The social contract, it's on us. Hey, Joe Walsh with you, former Congressman Joe Walsh. This is the Social Contract with Joe Walsh.
Thanks for listening. Thanks for watching. We've had Carl Allen on before, and I'm going to put him on once a week, as I promised you, just for a little update until the election. Carl Allen is my favorite. There he is. Good picture this time. Oh, all right. He's out of Ohio, the great state of Ohio. He's my favorite pollster. He's my favorite data scientist. He's my favorite handicapper. Carl, when somebody asks what you are, how do you describe it? What are you? That's a wonderful question. Other than being an author, most importantly, before I was ever an author, before I did handicapping, before I did any of that, I was an analyst. An analyst is someone who takes the data. A good analyst takes the data and tells people what it means. A researcher collects the data. An analyst tells people what it means. The best way to describe what I do is an analyst. Love it. Love it. Love it. Then he's my favorite analyst. He's also a contrarian. I love that. He's also Carl Allen, with a book out right now, hot off the press. The name of the book is The Polls Weren't Wrong. Hold that book up there, Carl. There it is. The Polls Weren't Wrong. It's out today. Go to Amazon, or wherever, and order that book; you will become so much wiser and so much smarter because of it. By the way, follow Carl Allen on Twitter at @realcarlallen. That's Carl with a C. Carl, I want to get you on once a week before the election to kind of look at and regularly handicap this presidential race that all of America, with 42 or 43 days left to go, is obsessed with. So welcome, my friend, this week. I got a number of different questions for you. I'm going to plug your book throughout. But let's start broad, and let me come right at you with probably the most important question, the question everybody listening to us is most curious about, my handicapping analyst friend. Yeah.
And I'm going to phrase this as carefully as I can because I know you're so sensitive to wording. What's your forecast for the outcome of the presidential race? Right now. Yep. What's your forecast for the outcome of the presidential race? All right. I'll answer that. Since this is my second time on your podcast and I like you, I'm going to give you a short answer and then a long answer. The short answer: in the presidential election, Kamala Harris is about a 66% favorite to win. 66% is pretty good. 66% in my forecast is higher than 538, much higher than Nate Silver. But the reasoning behind why my number is higher than theirs is quite simple, and, at the risk of oversimplification, it is the states that Kamala Harris must win in order to win the election. Everybody knows Pennsylvania, Wisconsin, Michigan. Everybody say it with me: Pennsylvania, Wisconsin, Michigan. That's not her only path to victory; she has other paths. But if she wins those three states, she will almost certainly win the election. And her position in those three vital swing states is much stronger than people realize. The reason that my forecast is 66% and not stronger is just because we still have time before the election for things to go wrong. It will not be easy. It will not be a runaway. It will not be a blowout. But when I say Kamala Harris is favored, I warn people: you should feel good about that, because that is her position. But at the same time, if I'm going to a craps table or a roulette table with those odds, if I have a 33% chance of losing, I might not be so comfortable with that. And as you know, Joe, and I don't like to make light of the situation, the scenario where Kamala Harris loses is far more dire than me losing money out of my pocket. And so that's the overview of where we're at right now. That's a good answer, Carl Allen. So right now, your forecast for the outcome of the race is, you give her a 66% chance of winning.
Correct. Does that mean that you give Trump... by the way, there's the book, the title of the book, it's available on Amazon: The Polls Weren't Wrong by Carl Allen. Get it. Does that mean your handicapping for Donald Trump right now is 34%? Correct. Correct. That's a very high number, and that's not a number that I'm comfortable with. As a voter, as a person who cares about what happens in our country, I don't feel good about that. And I would never tell someone that I'm not biased, because anyone who tells you, "these are just what the numbers say, I'm not biased," they're lying to you. I would never say that I'm not biased. But my biases aside, based on my best analysis, that is where we stand. And before I get into what variables go into your forecast, just a top line: we're 42 or 43 days out. When you say, right now, you're handicapping it, she's at a 66% chance of winning, relatively speaking, Carl, for the average person listening to us right now, is that, oh my God, that's a really high number? Is that average? Or is it, I'm still super nervous, that's like 50/50? When I put probabilities on things, I always tell people: in math, the difference between 50/50 and 66/34 is huge. But in an election where, I think you would agree with me, Joe, there are some very, very serious consequences of who wins and who doesn't, we need to treat it, as voters who care about things, as if we're behind, as if it's tied. Because at the end of the day, what I think, what Nate Silver thinks, what 538 thinks does not matter. We must win nationwide, not just in the swing states. There are important Senate races, there are important House races. Almost everyone has a race close to them that matters.
So what I do, as an analyst, as a forecaster, as a handicapper, is say: this is what I think, this is why I think it, and oftentimes my analysis will differ from other analysts'. But whether you trust my numbers or not, action is required. So use the numbers accordingly. A quickie, Carl. If I had asked you that same question two weeks ago, do you remember where you would have put that number? Yeah, it was almost the same. Almost the same. Almost the same. Yeah. In my forecast, very little has changed in the past two weeks. Kamala Harris' numbers have gotten a little better, but that was basically priced into my forecast. Those forecasters... Continue, Carl, continue. You know, other forecasters, primarily Nate Silver, he had her around 35%. I've done this since 2016, presidential, House, and Senate races, and my forecast has never had that big of a discrepancy with Nate Silver's. Mine was around 65, his was around 35. We've never had that big of a discrepancy. I want to get into that. I want to drill down on that. That voice, by the way, belongs to Carl Allen. Follow him on Twitter at @realcarlallen, there it is right there on the screen, with a C. He is one of my favorite, I'll call him young because he's younger than I am, one of my favorite young analysts, pollsters, handicappers. He's brilliant. Got a great book out just out right now, The Polls Weren't Wrong. There it is, by Carl Allen. There's that beautiful book. Awesome. Carl, besides polling, for everyone listening to us, that 66% number you give Harris, besides polling, what other important variables are included in your forecast? Yep. There are a couple very, very important ones. One: how undecideds decide. Who are the undecided voters? Why are they undecided? How will they fall? Undecideds, contrary to Nate Silver's forecast, contrary to other analysts, do not always split 50/50. In fact, they almost never split 50/50.
In 2016 and 2020, undecideds favored Trump. So in 2024, there are a lot of analysts, a lot of pundits, media, a lot of people saying, "Well, we should kind of price that in again this time." There is not a strong case for that. Why? I understand that 2020 is very heavy in people's minds. I understand that 2016 is very heavy in people's minds. But here is the reason it is not as big of a factor in 2024. Number one, if you look at the number of undecided voters, it is far, far lower. At this point in the election in 2016, we were around 12 to 15% undecided. In 2020, we were around 8. This year, we're already down to 5 in my polling average, and 538 and Nate Silver are very similar. So there is only so much impact that an uneven undecided split can cause when there's only 5%. The second factor, the reason that my forecast is more favorable to Harris than other forecasters', and what is different this year from previous years, is the third party impact. When Kennedy was still in the race, when he was still polling around 10 to 12%, I was actively reminding people: Kennedy will not get 10% in any swing state. It's not going to happen. So the question became, why is he polling so high? The answer was, remember, when Kennedy was still in the race, Biden was also still in the race. A lot of people who favored Democrats, who leaned left, could not bring themselves to say, I will vote for Joe Biden. Some of them said they were undecided; some of them just named whatever popular third party candidate came to mind, and this year, that was Kennedy. Now that he's out of the race, he's not even on the ballot in many states, we have a much easier calculation, which, thank God for me, means a much smaller margin for my forecast to make an error.
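The point about a smaller undecided pool limiting the possible damage can be sketched with some back-of-the-envelope arithmetic. The pool sizes and the 60/40 split below are illustrative assumptions, not Allen's actual model:

```python
# Back-of-the-envelope: net margin (in points) that an uneven split of the
# undecided pool hands to the favored candidate. All inputs are illustrative.

def margin_shift(undecided_share, favored_split):
    """Points of net margin gained when `favored_split` of an undecided
    pool of `undecided_share` points breaks for one candidate."""
    return undecided_share * (favored_split - (1 - favored_split))

# ~12 points undecided breaking 60/40 (2016-style pool):
print(round(margin_shift(12, 0.60), 2))  # -> 2.4 points of net margin
# ~5 points undecided breaking 60/40 (2024-style pool):
print(round(margin_shift(5, 0.60), 2))   # -> 1.0 point of net margin
```

The mechanism is simple: even an identical lopsided split moves the margin less than half as much when the undecided pool shrinks from 12 points to 5.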
Now, there's a distinction between a poll error and a forecast error, but Kamala Harris is polling 48, 49, 50 in the swing states that she needs to win in order to win the election. Again, that doesn't mean that's going to hold true. What that does mean is she is in a much stronger position than Joe Biden was in 2020, and in a much stronger position than Hillary Clinton was in 2016. So undecideds, the third party impact. Is there any other major variable that goes into your forecast, or are those pretty much the two biggies? Time. Time is a weird thing in politics. Remember where we were about a month ago? Remember where we were 40-whatever days ago? Time is a hell of a drug is what I tell people. If you had asked Democrats two months ago how they felt about the election, they would have said not good. You ask them now, maybe they're not overly optimistic, but they're certainly more optimistic than they were two months ago. Now that we know who the candidates are, now that we know Kennedy is not going to be a viable third party candidate to steal votes from would-be anti-Trump voters who lean Democratic, we have a much easier calculation to make. So all we're looking at now is that we just need more data, and as we approach the election, those error bars, that confidence level, will shrink in my forecast. Carl, who else do we have? Jill Stein or Cornel West, are they even a factor? It depends on the state. I'd have to look at my list here to see which states they might be a factor in, and I know Jill Stein just had a thing in Nevada. Nevada is a very interesting state for many reasons, but it doesn't look like she's going to be on the ballot there. They are a non-zero factor, but fortunately, it looks like those third party candidates whose supporters, if they had to choose, would favor Kamala Harris aren't going to be a deciding factor in any of the major swing states.
Carl Allen, author of The Polls Weren't Wrong, that great new book out there. The Polls Weren't Wrong, by Carl Allen on the left; there's the beautiful book on the right. Go buy it. It's out there right now, just came out. Amazon, go wherever and buy it. How does your forecast generally compare with kind of the big three that everybody's familiar with: 538, RealClearPolitics, and Nate Silver? Great question. RealClearPolitics, their contribution to, I would say, poll aggregation, which is basically collecting a bunch of polls and taking the average of them, their contribution is transparency. They say: these are the polls that we're taking, this is the average that we get from them. The good thing about being transparent is you can kind of spot where their biases might be. When you see that there's an arbitrary cut-off right before a really good poll for Kamala Harris, you're like, maybe this is an exception, and then it keeps happening and you're like, oh, okay, this is more of a trend. Unfortunately, in the past four to eight years, RealClearPolitics has kind of shown more bias. That doesn't mean that they don't do good work. It just means you have to take it for what it is. 538 is both a poll aggregator and a forecaster, which means they take the polls and say, based on the polls, this is what we think will happen. Their methodology, in my experience, has two flaws that math nerds could talk about. One is the tails. Basically, they rate highly, highly unlikely things as being marginally likely. One famous one was Donald Trump wins every state except Maine, or something like that; Kamala Harris wins every state except Mississippi. These weird outlier things that they give a 0.1% chance to, nerds can enjoy that, but that's not a really big factor.
The main factor, the main thing that separates my forecast from theirs and from others, is how much weight I give to the polls that are most recent. Anyone can understand that a poll released today is probably better data than a poll released two weeks ago, because it's more recent. But as we get really, really close to the election, a lot of pollsters are releasing things every other day, and 538 has this weird thing where the polls released the day before the election get a ton of weight, while the polls released a week before get very little or none. The impact of that... sorry, go ahead. No, that doesn't seem right, Carl. That doesn't make sense. No. In 2022, here's what happened, and this is the methodology problem that I pointed out in advance. I said, "Hey, what's happening to 538's poll averages? They're all over the place." This is when Silver was still in charge there. Very close to the election, these partisan pollsters were pumping stuff out. Partisan pollsters, take them for what they are, be they good, bad, or sideways. But when your poll averages are made up 50% or more of partisan polls, just because they're really close to the election, I saw that these partisan pollsters were taking advantage of the system and trying to manipulate it. I saw it happen before it happened. I said partisan pollsters will only get this much weight, regardless of how recent they are, in my polling average. And Silver released a statement about it, and I wrote about it in my book, because a very, very big issue when we're talking about poll averages is, if someone wanted to manipulate your averages, could they? As in, someone with a vested interest in the public thinking a certain candidate has a better chance than they do. And Silver said no, it's a free market; Democratic-leaning pollsters could release polls too if they wanted, and that they don't says something.
He took that to mean Democratic-leaning pollsters are taking polls but not releasing them, and therefore the polls must not be good. And then we saw what happened in 2022. Democrats allegedly overperformed their expectations, when in reality, if you had, in my opinion, properly aggregated the polls, and not allowed those partisan pollsters to flood your averages, you would have ended up with something much closer to what I had, which had 50 and 51 Democratic Senate seats as the most likely outcomes. 538 had 48 or 49 Democratic-held seats as the most likely outcome. So that's a major, major difference in how I take my poll averages versus how 538 does, and there is a strong mathematical basis for it. I won't bore you all with it, but it starts in chapter four of the book, which is called "Ideal Polls," and chapter five, "Throw It in the Average." An individual poll can only be so accurate. Yeah. Regardless of how recent they are, individual polls are subject to fluctuation, and that's why I don't say, oh, yes, this most recent poll must be the best or most accurate one. Carl Allen, take 30 to 45 seconds. Yeah. And explain to me why you are so critical of Nate Silver's forecasts. I'm critical of his forecasts because this year they have gone very far astray from the work that he had done in the past. Again, as I told you, about a week ago his forecast was 35% Kamala Harris, and my forecast was 65%. We're looking at the same numbers. Why are we drawing such different conclusions? Yeah. My understanding, since I have followed his forecast for so long, is that either something has changed or his incentives for the work that he does have changed. It is well known that he has a relationship now with Polymarket, a prediction market that operates offshore and takes action on the election.
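The idea Allen describes, capping how much weight partisan pollsters can contribute to an average no matter how recent their polls are, can be sketched like this. The cap value, the weighting mechanics, and the sample numbers are assumptions for illustration, not his published method:

```python
# Minimal sketch of a poll average that caps the total share of weight
# partisan pollsters can contribute, regardless of recency.
# The 25% cap and the sample polls are illustrative assumptions.

def capped_average(polls, partisan_cap=0.25):
    """polls: list of (value, weight, is_partisan) tuples.
    If partisan polls exceed `partisan_cap` of total weight, scale their
    weights down to exactly that share, then take the weighted mean."""
    partisan = sum(w for _, w, p in polls if p)
    neutral = sum(w for _, w, p in polls if not p)
    total = partisan + neutral
    if total and neutral and partisan / total > partisan_cap:
        # solve for the scale s.t. partisan'/(partisan'+neutral) == cap
        scale = (partisan_cap * neutral) / ((1 - partisan_cap) * partisan)
        polls = [(v, w * scale if p else w, p) for v, w, p in polls]
    num = sum(v * w for v, w, _ in polls)
    den = sum(w for _, w, _ in polls)
    return num / den

# One heavily-weighted partisan poll tries to drag the average down:
polls = [(48.0, 1.0, False), (49.0, 1.0, False), (44.0, 3.0, True)]
print(round(capped_average(polls), 3))  # -> 47.375, vs 46.2 uncapped
```

Without the cap, the single partisan poll with triple weight would pull the average to 46.2; with partisan weight limited to a quarter of the total, the average stays near the nonpartisan polls.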
To me, that conflict of interest is not necessarily a problem, but it does kind of explain why his numbers can be so volatile. If your employer, or someone that you have a business relationship with, makes money on volatility, that to me is kind of a red flag that maybe someone isn't doing as good of work as they had in the past. That's why I am currently critical of his forecast. There are other reasons in the past that are more technical, but starting really this year, that volatility, that movement, has really raised a red flag for me. Hey, Prime members, are you tired of ads interfering with your favorite podcasts? Good news! With Amazon Music, you have access to the largest catalog of ad-free top podcasts included with your Prime membership. To start listening, download the Amazon Music app for free, or go to amazon.com/adfreepodcasts. That's amazon.com/adfreepodcasts, to catch up on the latest episodes without the ads. Hey, I'm Ryan Reynolds. At Mint Mobile, we like to do the opposite of what Big Wireless does. They charge you a lot. We charge you a little. So naturally, when they announced they'd be raising their prices due to inflation, we decided to deflate our prices due to not hating you. That's right, we're cutting the price of Mint Unlimited from $30 a month to just $15 a month. Give it a try at mintmobile.com/switch. $45 up front for three months, plus taxes and fees. Promotional rate for new customers for a limited time. Unlimited plans slow above 40 gigabytes per month. Full terms at mintmobile.com. Take 30 seconds and answer this question, Carl. Is there any pollster that doesn't have bias? That's a great question. It's a question I'm always asked. Obviously, every politician has a bias. I've been in the media; everybody in the media has a bias. Does every pollster, every forecaster, have a bias? I think everybody does have a bias, but data itself, if it is properly collected, should not have a bias.
That doesn't mean it can't have an error, but it does mean that if you collect your data the proper way, there's no bias in it. Now, here's one of the major factors, and this doesn't differentiate me a ton from 538 and some others: transparency in how a pollster (a) collects their data, and (b) weights their data, as in saying, "Okay, we observed this, but we didn't get much response." You have to weight your data after you collect it, a perfectly valid mathematical process, and then obviously there's the back end of how your poll data is reported. There are some major factors in transparency that go into whether or not a pollster is biased. And if a pollster is consistently putting out numbers that lean or skew a certain way, and this is the really hard thing to understand, I only talk about it briefly in the book, there is no objective way to weight your poll data. Which is to say, you could give five very, very good qualified pollsters the same exact raw data, this is who we interviewed, these are their demographics, et cetera, and they would all give you slightly different weighted data, and that's okay. The problem is when that data consistently, historically, skews one way, and when a pollster isn't transparent with how they're weighting their data in the first place. That is where I have a major problem. In fact, Joe, I've started not even including those pollsters in my polling averages. I should have done it sooner. I said, "Look, if you're not transparent with how you do your work, you should not be included." I don't care how well your polls have predicted the election in the past, which is Nate Silver's standard. I will not include it, because I don't know where your numbers are coming from. You could just be making them up, and there's a case that some pollsters have fudged numbers in places. I've seen you, Carl Allen, talk about recency bias with Silver's forecast. A couple questions, my friend. Why is that bad?
So, recency bias has a place in polling: more recent data, because it is closer to the election, should be more reflective of the population. Which is to say, take a poll today, 40-whatever days before the election, versus a poll taken two days before the election. That poll two days before the election, because it's not impacted by all that time in between, in which people might change their minds and undecideds might decide, that data should be more accurate. The problem is when you start drawing the line between a poll that is two days old and a poll that is three days old, or a poll that is three days old and a poll that is five days old. Fundamentally, very, very little, almost nothing, has changed between that five-day-old poll and the two-day-old poll. But because of their methodology, other forecasters give substantially, substantially more weight to those polls that are closest to the election. The underlying reasoning makes sense: it's closer to the election, so it should be more reflective of the population of people who vote. The problem is, like I said before when I talked about the fluctuation of poll data, polls, as a tool, are messy. They're a little bit noisy. We need a lot of poll data to pinpoint where exactly we are, and that's why we take an average in the first place. But even that average is subject to a little bit of noise. So to narrow it down to two days before the election versus five days before, or a week before, that is very, very arbitrary. And that is giving a lot of undue weight to these two or three polls that were taken closest to the election, when those polls are still subject to the same noise that all polls are subject to. So when I talk about recency bias in poll data, it's the same thing with recency bias in our analysis. Most prevalent in our minds when we talk about elections are 2020, or maybe 2016, considering the kind of shock outcome.
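The contrast between a steep recency cutoff and a gentler decay can be illustrated with two toy weighting functions. Both are assumptions for the sketch, not any forecaster's actual formula:

```python
# Two toy recency-weighting schemes for polls, indexed by age in days.
# Both functions are illustrative assumptions, not a real methodology.

def cliff_weight(age_days):
    """Steep scheme: polls older than two days get almost no weight."""
    return 1.0 if age_days <= 2 else 0.1

def smooth_weight(age_days, half_life=7.0):
    """Gentler scheme: weight decays exponentially, halving every 7 days."""
    return 0.5 ** (age_days / half_life)

# Compare how each scheme treats polls of various ages:
for age in (1, 3, 7, 14):
    print(age, cliff_weight(age), round(smooth_weight(age), 2))
```

Under the cliff scheme, a three-day-old poll carries a tenth of the weight of a two-day-old one, even though almost nothing has changed between them; under the smooth scheme, the difference between those two polls is small, which matches the argument that the two-day/three-day line is arbitrary.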
So we look back at those elections and we say, wow, that happened then, it could happen this year. Yes, that is true. But on paper, by the numbers I should say, this election is much more similar to 2008 and 2012, with low undecideds and low third party support, than it is to 2020 and 2016. My analysis, which does get a little bit into the technicalities of things, and I also look at non-US data, says we need to compare things on similarity: not by what names are on the ballot, not by Democrat or Republican, not by what happened last election, but by the underlying variables. How many people are undecided? How many third parties are there? I would say we're approaching a point in US politics where historical trends don't really apply as much as they used to, because even just 15 or 20 years ago, not many election cycles ago, parties would pick the best possible candidate they could for the swing voters, for the moderates. But this increasing polarization that Trump has caused with Republicans is really skewing things, and it's making it very hard to say, oh, let's compare this to when Ronald Reagan was elected, let's compare this to when George Bush was. These historical trends are becoming less and less appropriate. So we really have to view things in the proper context, which is to say, understanding that those polls closest to the election aren't necessarily going to be much more accurate than a poll three days earlier. Carl Allen, the book is The Polls Weren't Wrong. It's out now, get it on Amazon, or wherever you want to go online and buy a book. There's the beautiful cover of The Polls Weren't Wrong. Follow Carl on Twitter at @realcarlallen. And Carl, two more for you this handicapping Tuesday, in no particular order. I've heard, I continue to hear about, and I read another article today about, the traditional Trump undercount in the polls. We saw it in '16, they say. We saw it in '20, they say.
Trump's vote is always undercounted in the polls, for whatever reason, because people are embarrassed or ashamed to say they're going to vote for him. And Trump always does better on election day. He always does better when all is said and done. Just on that, Carl, is there validity to this? And how do you deal with that? That's a great question. In 2016, one of the reasons Trump seemed to overperform his poll numbers in so many states was because he got a lot of support from people that pollsters were kind of like, these people don't usually vote. They say they're going to vote for Trump, but they don't usually vote. So some pollsters included them, some pollsters didn't. On average, Trump was not getting strong numbers. The other thing was the undecided vote. In 2016, and again in 2020, we know, based on data that was collected after the election, that Trump received at least 60% of undecided votes in swing states; some were higher than that. So that difference that was being made up by those undecided voters and those unlikely voters was very prevalent. The good pollsters, and we know this by looking at their methodology, in 2022, and again in 2024, started accounting for this. There's no perfect way to account for voters that you can't talk to. Trump voters are notoriously hard to get a poll from. These white, working class, rural voters, they're harder to poll. But pollsters have found creative ways to include them in their poll numbers. And we're starting to see, with all of this noise close to the election, people getting really scared every time they see a poll that says Trump is up two in North Carolina or up three in Arizona. All this means is pollsters are getting better at figuring out who the likely voters are, because they have better data now. They see who voted in 2016, they see who voted in 2020.
So to say that this trend will continue to happen is not a given. When I talk to people about this, I say, look, you're totally justified in being concerned about this, and I'm not saying it can't happen, but we know why it happened then, and the factors that caused it to happen then, we're not seeing those same factors in 2024. And I'll give you an example, and this is also from my book: the same thing happened in the UK, when Conservatives were outperforming their polls. So in the UK, all of their analysts started just arbitrarily adding three or four points to all the Conservatives' poll numbers. And guess what happened? Labour, the more liberal party, ended up winning. They said, oh, this thing that happened the past two elections, in 2011 and 2015 in the UK, didn't happen in 2017, and everybody was shocked. But all that had happened was the pollsters fixed their methodology. They said, oh, a lot of people who didn't normally vote are now voting, and we know who they're voting for; we'll account for it. So I don't want to give people a false sense of security about what the polls are saying right now and Kamala Harris' relatively favorable position in them. I will say, yes, the election will be close in all of the swing states. The six main swing states, I would be shocked if either candidate won or lost any of them by more than five; it will be that close. But at the same time, the idea that Trump was outperforming his poll numbers: he was outperforming in 2016, when his poll average was like 42. And Carl, everything you just said for three minutes there makes absolute sense to me. It really does. Thank you. Thank you. Thank you. So, and maybe this can never be measured, or has it been measured: is there a thing where people are more reticent or hesitant, when being polled, to say they support Trump? Is that a thing?
I wouldn't say that it's not a thing. There is a really interesting phenomenon in poll data called social desirability, which means that if you believe the person you're talking to wants you to respond in a certain way, then you're more likely to respond in that way. Which is to say, especially in 2016, when pollsters were contacting people and Trump was very negatively viewed, analysts were saying he had very little chance of winning, et cetera, a lot of people who in their minds favored Trump were afraid to say that. They were afraid to say, I support Donald Trump. That phenomenon has faded. It has become much more socially acceptable to say, I support Donald Trump. In many ways, it became more socially acceptable to say you support Donald Trump than to say you supported Joe Biden or Kamala Harris. So that social desirability bias, I'll leave that for sociologists and psychologists to talk about. But that is a factor that we have to consider when we're analyzing poll data, especially when we're talking about a very serious topic like presidential elections. Because if you asked someone how much money they make, and you only based your data on what people said, you would think that America is doing quite well right now, that inflation is not a problem. But the reality is there are reasons, not necessarily that people would lie, but that they would slightly overstate. It's the same thing with their support for Donald Trump in the past. Carl Allen, my last quick one for you this Tuesday. Give me any insights or predictions you've got about the state of North Carolina, and everything this guy, Mark Robinson, is going through, just from your analyst's perspective. Can any of this drag Trump down in the state of North Carolina? The answer is unquestionably yes. When down ballot candidates perform poorly, that doesn't have as strong of an effect as candidates at the top of the ballot.
The polls right now have Stein around 47-48 in the average, and Robinson around 40. Robinson will get more than 40% of the vote because, for better or for worse, Donald Trump's name is at the top of the ballot. A lot of people will show up to vote for Donald Trump, and a lot of those people will also vote for Robinson. But because Robinson has started to have such a negative effect on the perception of the party, there are some people who were on the fence. They might vote for Trump, they might vote for Harris, or they might not vote at all. That is a real factor as well. There are a lot of people who now would vote for Harris who previously wouldn't have. When I say a lot of people, I'm not talking 5%. But remember the margins that we're dealing with in swing states, especially one like North Carolina that has gone very narrowly Republican in the past two elections. When we're dealing with a state like that, 2% can change the whole state. We saw that in Georgia in 2020. That 2% is almost undetectable in data, but it can swing the whole state. So North Carolina, to me, and not just because of Robinson, by the way, but because of some other factors that favor Kamala Harris there, North Carolina might be 2024's Georgia, in the sense that nobody is shocked by it, but it is still a little bit unexpected. Thank you. And obviously, just intuitively, it makes sense. Everything Robinson's going through clearly isn't going to help Trump. It's not going to help him. Yeah, it's going to hurt him a little. Carl Allen, my handicapper, my pollster, my analyst, the book is out. It's called The Polls Weren't Wrong. There it is. There it is. Handsome book by Carl Allen. Get it on Amazon, go buy it, and do me a favor for the next four, five, six weeks, 42, 43 days: follow this guy on Twitter at realcarlallen. There it is right there, at realcarlallen. Carl, you're a champ.
Thank you, my friend. As always, we'll talk next week. Everybody else out there, be brave, baby. Thank you for listening. Remember to listen, share, and follow The Social Contract with Joe Walsh on Apple Podcasts, Spotify, and everywhere great podcasts are found. And be sure to leave a five-star review. This has been The Social Contract with Joe Walsh.