A woman holds her child as she casts her ballot in the parliamentary election.
(photo credit: REUTERS)
Exit pollsters who presented a woefully inaccurate map of Tuesday’s election outcome are searching for clues as to why their results differed so vastly from the actual votes being counted on Wednesday.
Large gaps are expected between the final public polls, published the Friday before elections, and the actual results. In Israeli politics, many voters remain undecided until the last minute, and a lot can happen in those final few days.
But the exit polls – which debut precisely when the polls close at 10 p.m., and offer the first insight into what went down on Election Day – are generally far more accurate. This year, the first exit polls presaged a rough tie between the Likud and the Zionist Union, with around 27 Knesset mandates each. With most of the votes counted, the Likud has edged closer to 30, while the Zionist Union has pulled in a meager 24.
Mina Tzemach, scientific manager of the Mitgam Institute that runs the Channel 2 poll, said that some of the differences could be attributed to timing.
The last exit poll data that pollsters use in their initial assessments are taken at around 8 p.m.-8:30 p.m., in order to prepare the data for the 10 p.m. debut. Those final two hours could have seen a rush of right-wing voters, she said. The company’s updated 1:30 a.m. poll, she continued, was closer to the actual result.
Another problem, she noted, was that in certain areas, people were less willing to participate in polling, though she and other polling companies could only speculate as to why.
“Unlike last time, many people refused to participate in the exit polls,” she said, noting that the trend was particularly pronounced among immigrants from the former Soviet Union. Some 30 percent of voters in heavily Russian-speaking areas refused to participate, she said.
Though some have speculated that certain populations are nervous about admitting whom they voted for, the exit poll methods make that an implausible scenario. Pollsters do not stand outside booths and take surveys. Rather, they set up mock booths at 60 polling stations around the country, representing some 25,000 voters, and ask citizens who have just voted to repeat the exercise, putting a ballot slip in the envelope and dropping it in a box.
Panels Politics CEO Menahem Lazar said the gaps were worrying and confusing. Statistically, polls should be able to better predict the results for the large parties, he said, but virtually all the pollsters got the two biggest ones wrong. Predictions for the small parties, meanwhile, turned out to be rather accurate.
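Lazar's statistical point can be made concrete with a back-of-the-envelope calculation. Assuming the roughly 25,000-voter sample behaves like a simple random sample (a simplification; the pollsters' actual weighting schemes are not public), pure sampling error amounts to a fraction of a Knesset seat even for the largest parties:

```python
import math

# Illustrative sketch only, not any polling company's actual model:
# 95% margin of error, in Knesset seats, under simple binomial sampling.
SAMPLE_SIZE = 25_000   # voters covered by the mock booths, per the article
KNESSET_SEATS = 120

def seat_margin_of_error(vote_share: float, n: int = SAMPLE_SIZE) -> float:
    """95% margin of error, expressed in seats, for a party polling
    at `vote_share` of the vote in a simple random sample of size n."""
    standard_error = math.sqrt(vote_share * (1 - vote_share) / n)
    return 1.96 * standard_error * KNESSET_SEATS

# A large party around 25% of the vote and a small one around 4%:
print(f"large party: +/-{seat_margin_of_error(0.25):.2f} seats")
print(f"small party: +/-{seat_margin_of_error(0.04):.2f} seats")
```

Under these assumptions, both margins come out well under one seat (the small party's is tighter in absolute seats; the large party's is tighter in relative terms). A miss of several mandates on the two biggest parties therefore cannot be explained by sampling noise alone, which is consistent with the systematic problems, such as selective refusal to participate, that the pollsters describe.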
Further, while a wave of late-arriving Likud voters could explain the gap between the exit polls and the outcome, that explanation did little to account for the drop in the Zionist Union.
“It really worries me,” said Lazar.
If they want to figure out exactly what happened, he went on, the polling companies should examine differences between the mock polls and the actual polls they were supposed to represent. Another option would be to do follow-up surveys to determine what went wrong, he suggested.
However, he added, “I’m not sure if anyone will do these kinds of polls.”
Steven Miller, an American-Israeli pollster at 202strategies, said that the pollsters should release their methodologies, though the polling companies maintain that the information is proprietary.
“There are a lot of questions around the exit polls,” he said on Wednesday. “What’s most peculiar about the exit polls that were published last night is that none of them disclosed their methodology.”
Regardless, pollsters will likely have to rethink their approaches before the next election rolls around.