Although polling is a useful technique for understanding public opinion, it also raises a number of questions. Can a tiny sample really reflect a whole nation, and how are respondents selected?
Knowing how these surveys are conducted is crucial as polling adjusts to new technology and shifting communication patterns.
Here are some commonly asked questions:
- Who do you call, and how do you reach them? Do you call cell phones?
- Is calling people on the phone still a good way to get their opinions?
- How do you make sure the people you talk to represent the whole country?
- If so few people answer their phones, how do you know their answers aren’t biased?
- How can just 1,000 people represent the whole country?
- What’s the difference between polling likely voters and registered voters?
- How accurate are polls in general, and how accurate is the Times/Siena Poll?
- If the Electoral College decides the election, what’s the point of a national poll?
- How do you pick which questions to ask in a poll?
1. Who do you call, and how do you reach them? Do you call cell phones?
The New York Times/Siena College Poll is conducted over the phone by live interviewers at call centers in Florida, New York, South Carolina, Texas, and Virginia. We call voters on landlines and cell phones, and respondents are chosen at random from a nationwide list of registered voters. In recent Times/Siena surveys, more than 90 percent of respondents were reached by cellphone.
One of the most frequent questions we receive is how many people actually answer calls from pollsters. Reaching some people often requires multiple attempts, and only about 2 percent of the individuals our callers try to contact ultimately complete an interview.
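At a roughly 2 percent response rate, the arithmetic implies tens of thousands of dial attempts for a single poll. Here is a back-of-the-envelope sketch using illustrative figures, not Times/Siena's actual dialing counts:

```python
# Rough arithmetic implied by a ~2% response rate (illustrative only).
target_interviews = 1000   # a typical poll size
response_rate = 0.02       # share of attempted contacts who complete

attempts_needed = target_interviews / response_rate
print(f"~{attempts_needed:,.0f} call attempts for {target_interviews} interviews")
# ~50,000 call attempts for 1000 interviews
```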
Since fewer people stay on the phone for longer interviews, we aim to keep our calls brief—less than 15 minutes.
For battleground polls, we contact respondents in Arizona, Georgia, Michigan, Nevada, Pennsylvania, and Wisconsin, six states expected to be crucial in the next presidential contest. Because the Electoral College, not the popular vote, determines the outcome of presidential elections, we concentrate much of our polling on the states with the most influence over the result.
2. Is calling people on the phone still a good way to get their opinions?
Telephone surveys were once regarded as the gold standard of survey research. Today they are one of many legitimate ways to reach voters, alongside approaches like text messaging and online panels. Over time, the benefits of telephone surveys have diminished as costs have risen and dwindling response rates have likely hurt representativeness. Telephone polling may eventually become unfeasible altogether.
Nonetheless, telephone surveys remain an effective method for political polling. There is no national list of email addresses, and postal mail takes a long time; phone calls remain the only efficient way to quickly reach a random sample of voters. Other approaches, such as recruiting panelists by mail who agree to complete surveys later, have drawbacks of their own, like the possibility that only the most politically engaged voters will return for a subsequent poll.
Voter registration files provide a great way to guarantee a proper balance between Democrats and Republicans, which is one reason telephone polls such as The Times/Siena Poll have continued to perform well in recent elections. Reassuringly, a Wisconsin Times/Siena poll produced results similar to a mail survey we ran that paid participants up to $25 and achieved a 30 percent response rate.
3. How do you make sure the people you talk to represent the whole country?
The voter file, the list of registered voters we use to conduct our survey, is our best resource for ensuring a representative sample.
This is far more than just a phone book. The data set includes a wealth of information about 200 million Americans, such as their party affiliation, residence, demographics, and whether they cast ballots in the most recent elections. We use this data at every step of the poll to make sure we have the right proportion of Republicans and Democrats, young and old, rich and poor.
On the front end, we try to interview a representative sample of Americans. We call many people who do not seem likely to respond, including people who do not vote in every election. We also make sure we complete the right number of interviews by race, party, and location, so that a Times/Siena poll reaches, for example, the correct share of Hispanic Republicans in Maricopa County, Arizona, or the correct share of white Democrats from the Western United States.
After the poll is finished, we use a procedure called weighting to make sure the sample accurately represents the entire voting population, comparing our respondents against the voter file. In practice, this typically means giving more weight to respondents from groups that are less likely to take a survey, such as those who did not complete college.
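To make the idea concrete, here is a minimal sketch of weighting on a single characteristic. The group shares are hypothetical, and real polls, including Times/Siena, balance many characteristics at once:

```python
# A toy example of weighting: respondents from an under-represented group
# (here, voters without a college degree) count for more.
from collections import Counter

# Hypothetical population shares, e.g. from the voter file.
population_share = {"college": 0.40, "no_college": 0.60}

# Hypothetical sample, skewed toward college graduates.
respondents = ["college"] * 550 + ["no_college"] * 450
sample_share = {g: n / len(respondents) for g, n in Counter(respondents).items()}

# Weight = population share / sample share for each group.
weights = {g: population_share[g] / sample_share[g] for g in population_share}
print(weights)  # no_college respondents get weight > 1, college < 1
```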
4. If so few people answer their phones, how do you know their answers aren’t biased?
In 2022, we ran an experiment to try to quantify the impact of nonresponse on our telephone surveys. We paid Wisconsin voters up to $25 to answer a mail survey. About 30 percent of households accepted our offer, compared with the roughly 2 percent who typically respond by phone.
We found that the mail respondents were similar to the people we reach by phone on questions about voting. But they were more likely to be politically moderate, to have “No Trespassing” signs, and to follow politics less closely.
However, there is no way to confirm that the people who respond to surveys match nonrespondents in every respect. There is always a chance that we have overlooked a hidden variable or another dimension of nonresponse.
5. How can just 1,000 people represent the whole country?
Sampling is the fundamental premise behind survey research: you do not have to speak with everyone to get a solid picture of an entire population. You only need a sample.
You probably use sampling in your daily life, even if you are not aware of it. You only need to taste a spoonful of soup to get a sense of its flavor; you do not need to eat the whole pot.
Sampling is only effective if the subset you taste is representative of the entire set. To get a representative sample, pollsters typically use random sampling, in which each person has an equal chance of being chosen.
In theory, a truly random sample of just a few hundred Americans would be enough to gauge public opinion with a reasonable degree of precision, much as a few hundred coin flips would likely be enough to recognize that a flip is a 50-50 proposition.
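The coin-flip intuition is easy to check with a quick simulation; this snippet is purely illustrative:

```python
# Even a few hundred random draws land close to the true 50-50 split.
import random

random.seed(42)  # for reproducibility
for n in (100, 500, 1000):
    heads = sum(random.random() < 0.5 for _ in range(n))
    print(f"{n} flips: {heads / n:.1%} heads")
```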
A fascinating fact about sampling is that a survey of 10,000 respondents is not ten times more accurate than one of 1,000. A larger sample can even be less accurate if the respondents are not representative, for example, if the survey uses a less rigorous methodology.
The margin of sampling error for a 1,000-voter survey is approximately three to four percentage points. In practice, this means that if a survey shows 57 percent of voters supporting a particular position, the actual number may be closer to 54 percent or 60 percent.
If we doubled or even quadrupled the number of respondents we polled, the margin of error would decline only modestly, to around two percentage points. In other words, the survey’s overall accuracy would improve only a little.
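These figures follow from the standard margin-of-error formula for a simple random sample, sketched below. Real polls also account for design effects from weighting, which this simplified calculation ignores:

```python
# Margin of error at 95% confidence, worst case p = 0.5,
# assuming simple random sampling.
import math

def margin_of_error(n: int, p: float = 0.5, z: float = 1.96) -> float:
    """Return the margin of error in percentage points."""
    return 100 * z * math.sqrt(p * (1 - p) / n)

for n in (1000, 2000, 4000, 10000):
    print(f"n={n:>6}: +/- {margin_of_error(n):.1f} points")
# n=1000 gives about 3.1 points; quadrupling to n=4000 only halves it.
```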
However, the margin of error can rise significantly if the number of respondents falls too far. That is critical to keep in mind when examining results among smaller demographic subgroups.
We frequently survey multiple states at once. One benefit is that we can understand the election dynamics in each state while also pooling a large enough sample of respondents across states to examine demographic groups like young voters and Latino voters.
6. What’s the difference between polling likely voters and registered voters?
Polls of registered voters include everyone who is registered to vote in a state or nationwide. But not every registered voter casts a ballot, even in elections with the highest turnout.
Likely voter surveys try to determine a person’s likelihood of casting a ballot and then use that information as a guide.
People tend to overstate their likelihood of voting, even though there are numerous legitimate methods to identify likely voters. Pollsters call voting a “socially desirable behavior,” something people would like to claim to have done. Over the years, pollsters have used statistical modeling to help account for this overreporting.
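One common family of approaches, sketched below with hypothetical numbers, weights each respondent by an estimated probability of voting instead of trusting self-reports alone. This illustrates the general technique, not the Times/Siena model:

```python
# A toy likely-voter model: discount self-reported enthusiasm using
# past vote history (all values here are hypothetical).
respondents = [
    {"candidate": "A", "self_report": "definitely", "voted_last_two": 2},
    {"candidate": "B", "self_report": "definitely", "voted_last_two": 0},
    {"candidate": "B", "self_report": "probably",   "voted_last_two": 2},
]

def turnout_probability(r: dict) -> float:
    base = 0.9 if r["self_report"] == "definitely" else 0.6
    history = 0.3 + 0.35 * r["voted_last_two"]  # ranges from 0.3 to 1.0
    return base * history

# Tally candidate support weighted by each respondent's turnout probability.
support: dict[str, float] = {}
for r in respondents:
    support[r["candidate"]] = support.get(r["candidate"], 0.0) + turnout_probability(r)

total = sum(support.values())
for cand, w in sorted(support.items()):
    print(f"Candidate {cand}: {w / total:.0%} of the likely electorate")
```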
Well in advance of an election, pollsters often focus on results among registered voters. Results among registered voters are also frequently useful when you want to examine more than just the choices voters are making in a particular election.
7. How accurate are polls in general, and how accurate is the Times/Siena Poll?
Across all the contests we polled in the final weeks of the 2022 midterm elections, Times/Siena poll results were, on average, within two points of the actual outcome. That helped make The Times/Siena Poll the most accurate political pollster in the nation, according to the website FiveThirtyEight.
However, every survey has practical constraints. Polling is a blunt tool, and as the margin of error indicates, results may differ by a few points from what we report. Two percentage points can make a big difference in close elections, but a gap that size matters less for most issues.
In a typical election, the national polling error is between two and four percentage points. In 2020, surveys were off from the outcome by an average of 4.5 percentage points, and in some states the difference was even greater. In 2016, national polls differed from the final popular vote by almost two percentage points.
8. If the Electoral College decides the election, what’s the point of a national poll?
Electoral College outcomes are not always consistent with the national popular vote. But national surveys, which represent the views of the entire country and not just those in crucial states, can still offer valuable insight into the issues and attitudes shaping the election.
National polls frequently reveal how voters perceive the candidates and the issues influencing an election. They are helpful instruments for examining issues of broad national importance, such as abortion, immigration, and the economy. They also helped us discover how deep Democratic unhappiness with Joe Biden ran during the 2024 election cycle.
9. How do you pick which questions to ask in a poll?
When preparing to conduct a survey, we consider current events that may be influencing people’s opinions, as well as potential future events that may shape public opinion.
We occasionally ask people to gauge the effect of a particular event, such as a presidential debate. Additionally, we occasionally survey Americans to see how they feel about a specific topic, such as the economy.
After deciding on poll themes, we spend a great deal of time discussing and drafting the questions. For every question we ask, we make sure that each respondent, no matter their political affiliation, has a fair response option that represents their point of view.
We also want to make sure that everyone understands a question to mean the same thing. Last but not least, we must measure people’s actual opinions rather than pushing them toward a certain viewpoint or set of ideas. We take great pride in the craft of writing precise survey questions.
The Times/Siena Poll is produced by Camille Baker, Nate Cohn, William P. Davis, Ruth Igielnik, and Christine Zhang, along with the Siena College Research Institute team.