
In Bihar, Did All Exit Polls Fail? Pollster Sanjay Kumar Explains

If indeed, as many claim, exit polls are ‘fake’ or ‘of no use’, then why do viewers stay glued to exit polls on TV? 


Amid much chatter about exit polls apparently having failed in the 2020 Bihar elections, let me point out a fact: even if all exit polls had indeed failed in the state, the heavens still would not fall – and this ‘failure’ is certainly not the ‘national crisis’ it is being presented as.

Even if all exit polls have indeed failed, how does that affect anyone’s job prospects, health, or the economy?

And if exit polls serve no purpose, and are ‘fake’ as many claim, the question should be put to those who spend hours in front of the TV when exit polls are telecast. Why does the media give so much space and airtime to these exit polls if they are of no value and are also ‘fake’?

As someone who has conducted numerous polls over the last three decades, I have always failed to understand this conundrum.

(Graphic: Data: Sanjay Kumar, CSDS | GFX: Shruti Mathur / The Quint)

Why Both Praise & Criticism For The Same Exit Polls? What Explains The Contradiction?

Since the results were announced, I have received congratulatory messages – for the Lokniti-CSDS post-poll estimates being reasonably close, well within the margin of error. But simultaneously, I have also been hounded as a ‘fake pollster’ who ‘never gets it right’, among other such charges.

I have accepted both with an open mind and heart, but I am confused as to how I could get contradictory responses to the same work. Either the opinion on the Lokniti-CSDS post-poll has been formed from preconceived notions – about opinion polls in general, about me, or about the institution – or those who are being aggressively critical lack the relevant knowledge and information.

Now let me give you another example: one batsman scores very well in Twenty20 matches, another plays very well in One Day cricket, and a third plays really well in five-day Test matches but not in the other formats. If one asks who the best batsman among the three is, I am sure a single yardstick cannot be applied to arrive at the conclusion.

The point I am trying to make is that whether exit polls have failed or succeeded should be judged by applying the same parameters, not by different people applying different parameters.

Do different doctors use different instruments for measuring the temperature of different patients?

How Exit Polls Should Be Judged

It is important to first figure out what the measuring tool for judging the accuracy of an exit poll/post-poll is. Going by the Bihar exit polls’ final predictions, if accuracy is judged only by the seats forecast, then, yes, many exit polls failed – they got the seat forecast wrong, although some got it right. But if we judge accuracy by the vote share estimate, a few exit polls/post-polls were almost up to the mark – quite close, or well within the margin of error – even though most of these polls got their seat share wrong in spite of getting the vote right.
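A quick word on what ‘within the margin of error’ means. Below is a minimal, purely illustrative sketch in Python of the standard approximation for a survey’s 95-percent margin of error; the sample size and vote share used are hypothetical, not the actual Lokniti-CSDS figures.

```python
import math

def margin_of_error(p, n, z=1.96):
    """Approximate 95% margin of error for an estimated proportion p
    from a simple random sample of n respondents."""
    return z * math.sqrt(p * (1 - p) / n)

# Hypothetical numbers: 37% estimated for an alliance from 3,500 respondents.
# Real surveys also adjust for design effects, so this understates the true error.
moe = margin_of_error(p=0.37, n=3500)
print(f"Estimate: 37.0% +/- {100 * moe:.1f} points")   # roughly +/- 1.6 points
```

On this yardstick, a hypothetical estimate of 37 percent would count as accurate if the actual figure landed anywhere between roughly 35.4 and 38.6 percent.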

So, we have two kinds of exit polls/post-polls before us. Even for Bihar, there are ones which got the vote share estimate right but the seat share wrong.

At the same time, we also have exit polls which got their seat forecast right, but whose vote share estimate was either off the mark or not disclosed at all.

Now I leave it to the readers to decide which poll is more correct – the ones which got the vote share estimate right or the ones which got the seat forecast right. Unfortunately, there are a few which were off the mark on both counts – vote share estimate and seat forecast. But I would also like to draw the readers’ attention to the fact that some of those who were off the mark on both have predicted many elections accurately in the recent past.


Why Did Some Exit Polls Get It Wrong In Bihar?

Can we take away all the credit from them for just one failure, and question their credibility and honesty? I think we should not jump to conclusions so quickly. Like politician-bashing, pollster-bashing has also become fashionable these days. A day after the results were announced (11 November), those whose vote share estimates were closest to the actual vote share were questioned more than the others. Being questioned is absolutely fine – readers and viewers have every right to ask questions – but do ask them with an open mind, not with preconceived notions.

But this defence of polls does not mean that there is no need for introspection; there is certainly a need for serious thinking as to why many got the trend wrong.

An exit poll/post-poll should certainly get the direction right, even if it fails to capture the extent of the victory accurately. Unfortunately, in Bihar, many exit polls failed to get the direction right with regard to the seat forecast – and even with regard to the vote share estimate.

As to why this happened in Bihar, the answer is clear: it was an extremely tight election – the final vote shares show that the NDA and the Mahagathbandhan were in a virtual tie, with 37.3 and 37.2 percent of the votes respectively.

Such a situation will always remain a nightmare for any pollster, no matter the sample size. What made the situation even worse for the pollsters was the narrow margin of victory in many constituencies, which upset the conversion of votes into seats. Twenty-three assembly seats were decided by a margin of less than 2,000 votes. Another twenty-three assembly constituencies were decided by between 2,000 and 4,000 votes. When seat forecasts are worked out using a swing model based on the vote share estimated through an exit poll/post-poll, no model can reliably account for constituencies decided by such narrow margins.
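To see why, here is a toy sketch of a uniform-swing seat projection – the constituencies and numbers are hypothetical, not drawn from any pollster’s actual model – showing how a half-point error in the estimated vote share flips the closest seats.

```python
# Hypothetical two-way contests: (votes for alliance A, votes for alliance B)
constituencies = [
    (50_500, 49_500),   # A ahead by 1,000 votes (a 1-point margin)
    (50_200, 49_800),   # A ahead by just 400 votes
    (52_000, 48_000),   # A ahead comfortably
]

def seats_for_a(results, swing_to_b=0.0):
    """Seats won by A after applying a uniform percentage-point swing towards B."""
    won = 0
    for votes_a, votes_b in results:
        total = votes_a + votes_b
        share_a = 100 * votes_a / total - swing_to_b
        share_b = 100 * votes_b / total + swing_to_b
        won += share_a > share_b
    return won

print(seats_for_a(constituencies, swing_to_b=0.0))  # 3: A sweeps on the raw estimate
print(seats_for_a(constituencies, swing_to_b=0.5))  # 1: a 0.5-point error costs A two seats
```

When dozens of real seats sit on margins this thin, an estimate that is ‘right’ on vote share can still produce a seat projection that is badly wrong.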


Why Is It That Pollsters Get The Seat-Share Wrong But The Vote-Share Right?

Questions are also being asked as to why pollsters get the seat forecast wrong when they get the vote share right. Marginal victories, under our first-past-the-post voting system, make the situation challenging. And whom should political parties ask the same question of – why do parties get different numbers of seats with the same vote share?

After all, in the Bihar election, the two alliances polled virtually the same vote share, but one alliance is forming the government while the other will sit in the Opposition.

Why is it that in Karnataka, the BJP has always won more assembly seats than the Congress, even when the BJP’s vote share has been less than the Congress’s? Why is it that the BJP won 116 Lok Sabha seats with 18.6 percent of the votes in the 2009 Lok Sabha election, while the Congress managed to win only 44 Lok Sabha seats with a slightly higher vote share (19.6 percent) in the 2014 Lok Sabha election? The BSP should be asking why it won no seats even after polling around 20 percent of the votes.
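The arithmetic behind this is simple to demonstrate. Here is a small, hypothetical first-past-the-post illustration – the numbers are invented, not Bihar’s or any party’s actual figures – in which two parties poll exactly the same statewide vote yet win very different numbers of seats.

```python
from collections import Counter

# Ten equal-sized seats; (votes for party X, votes for party Y) in each.
# Y wins many seats narrowly, X wins a few seats by large margins.
results = (
    [(48_000, 52_000)] * 7      # Y wins seven seats 52-48
    + [(55_000, 45_000)] * 2    # X wins two seats 55-45
    + [(54_000, 46_000)]        # X wins one seat 54-46
)

votes, seats = Counter(), Counter()
for x, y in results:
    votes["X"] += x
    votes["Y"] += y
    seats["X" if x > y else "Y"] += 1

print(votes)  # Counter({'X': 500000, 'Y': 500000}) - identical vote totals
print(seats)  # Counter({'Y': 7, 'X': 3}) - very different seat tallies
```

The surplus votes piled up in X’s landslide seats add nothing to its seat tally, which is why identical vote shares can translate into lopsided seat counts.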

Exit polls go awry for different reasons, and we need to look at each case on its own to learn from it – but new challenges emerge with every new election. A year ago, for example, who knew that COVID-19 would emerge as a new challenge for doctors?

(Sanjay Kumar is a Professor at the Centre for the Study of Developing Societies (CSDS). He is also a well-known psephologist and political commentator. This is an opinion piece, and the views expressed are the author’s own. The Quint neither endorses nor is responsible for them.)

