Handout: Opinion polls
1st September 2015
What are opinion polls?
An opinion poll is a scientific survey designed to measure the views of a specific group, such as a country’s electorate in the case of most political polls.
Why are they important?
Public opinion is recognised as an important force in statecraft. In countries ruled by dictatorships, every effort is made to keep the public in line by allowing only one point of view to be heard; no free play of public opinion is permitted.
In a democracy like ours, it is an accepted idea that the public, which is called upon to make important decisions at the ballot box, must be kept informed on public issues. It is also an important principle of our governmental system that public policies are decided upon by the people. Popular control over lawmaking bodies, over executives in the government, and over domestic and foreign policy is a basic idea in our political society. The people are the source of power; hence their opinions should mould the action of government.
The successful operation of our government under these principles justifies our faith in the people’s good judgment. We believe that once the public’s views on public issues are known and acted upon, our government will be improved rather than damaged. It is often said that only those who distrust the public and the soundness of its judgment need fear an expression of its views.
How is an opinion poll scientific?
The two main characteristics of scientific surveys are a) that respondents are chosen by the research company, and b) that sufficient information is collected about respondents to ensure that the published results match the profile of the group being surveyed. For example, if the population being sampled is 52% women and 30% over 55, then a typical opinion poll will ensure that its published data reflect these proportions of women and older respondents.
How does a poll choose a sample that is truly representative?
There are two main methods: “random” sampling and “quota” sampling. With random sampling, a polling company either uses a list of randomly drawn telephone numbers or email addresses (for telephone or some Internet polls), or visits randomly drawn addresses or names from a list such as an electoral register (for some face-to-face surveys). The polling company then contacts people on those telephone numbers or at those addresses and asks them to take part in the survey.
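As a rough illustration, here is a minimal Python sketch of random sampling from an electoral register. The register, names and sample size are all invented for the example.

```python
import random

# A made-up electoral register; a real one would hold names and addresses.
electoral_register = [f"Elector {i}" for i in range(1, 50001)]

# Draw 1,000 entries at random. Every name on the register has an equal
# chance of selection, which is what makes the sample "random".
sample = random.sample(electoral_register, k=1000)
print(sample[:5])
```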
“Quota” sampling involves setting quotas — for example, for age and gender — and seeking out different people in each location who, together, match those characteristics. Quota sampling is often used in face-to-face surveys. In addition, some Internet polls employ quotas to select representative samples from a database of people who have already provided such information about themselves.
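The quota-filling logic can be sketched in Python as below. The quota cells, targets and respondent pool are all invented for illustration; real quotas would be derived from census figures.

```python
import random

# Invented quota targets for a tiny illustrative sample of 10 people.
quotas = {("female", "18-34"): 2, ("female", "35+"): 3,
          ("male", "18-34"): 2, ("male", "35+"): 3}
filled = {cell: 0 for cell in quotas}
sample = []

def try_recruit(person):
    """Accept a respondent only if their gender/age quota cell is not yet full."""
    cell = (person["gender"], person["age_band"])
    if cell in quotas and filled[cell] < quotas[cell]:
        filled[cell] += 1
        sample.append(person)
        return True
    return False

# Simulate interviewers approaching passers-by until every cell is filled.
pool = [{"gender": random.choice(["female", "male"]),
         "age_band": random.choice(["18-34", "35+"])} for _ in range(200)]
for person in pool:
    if sum(filled.values()) == sum(quotas.values()):
        break
    try_recruit(person)
print(filled)
```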
Do polling companies do anything else to achieve representative samples?
Usually they do. While well-conducted random and quota samples provide a broad approximation to the public, there are all kinds of reasons why they might contain slightly too many of some groups and slightly too few of others. What normally happens is that polling companies ask respondents not only about their views but also about themselves. This information is then used to compare the sample with, for example, census statistics. The raw numbers from the poll are then adjusted slightly, up or down, to match the profile of the population being surveyed. If, for example, a poll of 1,000 or 2,000 people finds, when its survey work is complete, that it has 100 members of a particular demographic group but should have 110 of them, it will “weight” the answers of that group so that each of those 100 respondents counts as 1.1 people. This way, the published percentages should reflect the population as a whole.
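In code, the weighting step is just a ratio of the target count to the achieved count. This minimal Python sketch uses the illustrative numbers from the paragraph above, not real polling data.

```python
# Illustrative numbers from the example above: the group should make up
# 110 of 1,000 respondents, but only 100 were actually interviewed.
target_count = 110
actual_count = 100

# Each respondent in the under-represented group gets a weight > 1, so
# that the group's answers count as if 110 people had been polled.
weight = target_count / actual_count
print(weight)  # 1.1 -- each of the 100 respondents counts as 1.1 people
```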
How useful are opinion polls?
Polls may not be perfect, but they are the best, or least bad, way of measuring what the public thinks. In most countries where poll results can be compared with actual results (such as elections), well-designed polls are usually accurate to within 3%, even if they occasionally stray outside that margin of error. Moreover, much of the time, polls provide a good guide to the state of opinion even allowing for a larger margin of error. If a well-designed, representative survey finds that the public divides 70-30% on an issue, then a margin of error of even 10% cannot alter the fact that one view is expressed far more widely than the other. However, it is true that in a closely fought election, a polling lead (in a sample of 1,000-2,000) of less than 5% for one candidate or party over another cannot be regarded as a certain indicator of who is ahead at the time the survey was taken — let alone a guarantee of who will win in the days, weeks or months ahead.
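The 3% figure quoted above is consistent with the textbook margin-of-error formula for a simple random sample at the 95% confidence level, sketched below in Python; the sample size and proportion are illustrative values, not taken from any specific poll.

```python
import math

def margin_of_error(p, n, z=1.96):
    """Half-width of the 95% confidence interval for a proportion p
    estimated from a simple random sample of n respondents."""
    return z * math.sqrt(p * (1 - p) / n)

# For a poll of 1,000 with a party on 50%, the margin is roughly +/-3%,
# which matches the accuracy figure quoted above.
print(round(margin_of_error(0.5, 1000) * 100, 1))  # about 3.1
```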
Exit polls
Exit polls are extremely complicated. The main difference between a regular poll and an exit poll is that the former asks “Who will you vote for?” while the latter asks “Who did you vote for?” Exit polls are based on thousands of interviews carried out outside polling stations; anything else is not really an exit poll. An exit poll provides an estimated result: it assigns a probability to the possible outcomes in each constituency based on the data collected. Once assigned, the probabilities across all constituencies are added up to generate a forecast of the total number of seats for each party. The exit poll was spot-on in both 2005 and 2010. However, a BBC exit poll in 1992 forecast a hung parliament and, in the end, the Conservative Party secured a majority.
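The seat-forecasting step described above can be sketched in a few lines of Python: a party’s expected seat total is simply the sum of its win probabilities across constituencies. The three constituencies and their probabilities below are invented placeholders, not real exit-poll data.

```python
# Invented per-constituency win probabilities for three parties.
constituencies = [
    {"Labour": 0.7, "Conservative": 0.2, "Lib Dem": 0.1},
    {"Labour": 0.3, "Conservative": 0.6, "Lib Dem": 0.1},
    {"Labour": 0.1, "Conservative": 0.4, "Lib Dem": 0.5},
]

# Summing each party's win probabilities gives its expected seat total.
forecast = {}
for seat in constituencies:
    for party, prob in seat.items():
        forecast[party] = forecast.get(party, 0.0) + prob

print(forecast)  # {'Labour': 1.1, 'Conservative': 1.2, 'Lib Dem': 0.7}
```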
Bandwagons and boomerangs
The ‘bandwagon’ effect occurs when people are influenced to vote for a political party that is doing well in the polls in the run-up to the election – effectively jumping on the electoral bandwagon. Others argue that polls produce a ‘boomerang’ effect, where voters back the party that is trailing because it is seen as the underdog, or where likely supporters of the party in the lead do not turn out to vote, believing their preferred party will win anyway. These effects have led some countries, such as France, to ban the publication of opinion polls in the days leading up to elections, to limit their influence on voters.
Tactical voting
Opinion polls have also been criticised for increasing the likelihood of tactical voting. For example, if a Lib Dem voter sees that the Lib Dems are trailing in third place but the Labour candidate has a chance of defeating the Conservative candidate, they may vote tactically for Labour to stop the Conservative from winning.
Party polls and focus groups
Focus groups are small numbers of individuals brought together to consider a particular issue or respond to a specific policy; in the US, for example, they are used to assess the impact of party election broadcasts before they are aired. Party polls are organised by a political party to assess the impact of a campaign or policy, giving data upon which decisions can be made.
Opinion polls and the 1992 general election
Labour went from having a four-point lead over the Tories six months before the 1992 election to being eight points behind on polling day. It was a sensational turnaround, and one famously missed by almost every opinion poll at the time.
The Tories’ share went up by only two points, but that was enough to deliver them a majority government. It’s a mark of our changing times that Labour would probably be quite happy with 34 per cent at the 2015 election – so long as the Tories were a couple of points behind.
Opinion polls and the 1997 general election
Labour averaged a massive 52 per cent six months ahead of the 1997 election. This figure proceeded to fall by nine points, but still delivered a landslide. The Tories’ average remained stuck on 31 per cent, while the Lib Dems – as had become the trend – picked up a few points.
Opinion polls and the 2001 general election
- Again Blair easily outshone William Hague as leader
- Labour very competent and trusted in key areas, e.g. the economy, health and education
- Tories seen as out of touch, still talking about the EU and immigration
- Electorate not engaged, hence the lowest-ever turnout
- Only interesting thing was Prescott punching a bloke
Labour’s share of the vote fell by five points in the months leading up to polling day in 2001. At most elections this would have been cause for concern, but here the party’s lead was so big to begin with that such a drop proved of little consequence.
Notice that the Tories even managed to slip back a point as well. Only the Lib Dems improved their share. 2001 was a rare recent example of both the two biggest parties losing support in the run-up to an election.
Opinion polls and the 2005 general election
- Michael Howard could not compete with Blair
- Conservatives focused on the contentious issues such as immigration and asylum
- Even the conviction of Kamel Bourgass, a terrorist who killed a PC and was plotting to poison people, could not make immigration a decisive issue
- Conservative image was still poor
- They still supported the Iraq War, and their “vote Blair, get Brown” claim backfired, as Brown himself was quite popular
- Tax did not become a major issue. Labour could point to a strong economy
- Law and Order did not interest the public at large
- Labour had a good track record on economy, health and education combined with the fact that the Tories weren’t trusted
- Labour were able to win even with problems such as the Iraq War, the collapse of MG Rover and tuition fees
- Labour still supported by the majority of the media
- Lib Dems tried to win over disgruntled Labour voters by proposing a 50% tax rate for those earning over £100,000 p.a.
- Displayed their opposition to the Iraq War and tuition fees
- Yet Labour still displayed party competence, whereas the Lib Dems were seen as disorganised
- However, Labour’s majority was reduced, although still large enough to pass their legislation
Overall, five of the six polls tracked by the BBC gave a result accurate to within their margin of error.
Taking the average of all six, the share of the vote for Labour was 37.6% (actual share 36%), Conservatives 32% (33%) and Lib Dems 22.6% (23%).
And the BBC/ITV exit poll not only predicted the share of the vote accurately, but was also spot-on in predicting the size of Labour’s reduced majority.
Opinion polls and the 2010 general election
The Conservatives were enjoying an average of 40% in the polls six months before the 2010 election. What happened next turned a likely Tory majority government into a hung parliament.
The Conservatives shed four points, Labour gained two and the Lib Dems rose by six. Half a year out, the polls didn’t anticipate the tightening of the race. 2010 was unusual in recent history for the way all three parties experienced significant movement in the polls in the months before polling day. Things were very different five years earlier.
Opinion polls and the 2015 general election
Months of extensive opinion polling were wrong about the election result, a senior pollster admitted, as the strength of the Conservative Party’s result confounded predictions. Opinion polls had consistently shown Labour and the Tories almost neck and neck for months, with little shift throughout the campaign.
Yet by the early hours of Friday, those polls appeared to have significantly underestimated Conservative support and overestimated Labour support. The size of the discrepancy drew comparisons with the flawed polls of 1992, when surveys predicted a hung parliament but the Conservatives won an outright majority.