How do you respond when you are interviewed by a pollster? Do you tell the truth or do you make up a response? And if the latter, how do you decide what to say and why do you do this?

These questions naturally come to the fore at election time. If the polls are running against you or your party, it is tempting to attack the messenger. Doing so might give you or your supporters some solace, even if it is not well founded, and the relief is only temporary.

There is not much literature on this subject in social research journals, perhaps because the subject itself is not easy to research: asking people to be truthful about lying does present some difficulties.

Researchers have used some expedients to minimise non-truthful responses. For example, if the topic is a very sensitive one, questions may be couched in a way that asks respondents what they think others are thinking. The assumption here is that respondents will state their own view by attributing it to “the majority”, thus not having to offer it as their own.

Sometimes “privacy” expedients have been used. In Australia, Morgan offered respondents a “ballot box” in which to place their voting intention answer as a gesture towards confidentiality.

Some researchers are using self-completion questionnaires on the net on the assumption that the privacy this affords will lead to more honest answers. The problem with many online self-completion surveys is that there is no control over the sample, and so no-one can be sure exactly who has responded: one person a thousand times or a thousand people once each. It is possible to use technological devices to prevent multiple responses, and while these are used by professional researchers, they are not used universally.
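For the technically minded, and purely by way of illustration (the article does not describe any particular system, and every name below is hypothetical), one such device is to issue each invited respondent a single-use token and to reject any second submission made with the same token. A minimal sketch in Python:

    import secrets

    class SurveyGate:
        """Illustrative one-time-token gate: each invited respondent
        receives a single-use token, so a second submission from the
        same invitation is rejected."""

        def __init__(self):
            self.unused_tokens = set()   # tokens issued but not yet used
            self.responses = {}          # token -> submitted answers

        def invite(self) -> str:
            # Issue a fresh, unguessable token to one respondent.
            token = secrets.token_urlsafe(16)
            self.unused_tokens.add(token)
            return token

        def submit(self, token: str, answers: dict) -> bool:
            # Accept a response only if the token is valid and unused.
            if token not in self.unused_tokens:
                return False  # unknown token, or already used
            self.unused_tokens.remove(token)
            self.responses[token] = answers
            return True

Used this way, a second call to submit with the same token returns False, so one person cannot answer a thousand times, though of course nothing stops them passing their token to someone else.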

There was a time, particularly in the USA, when white interviewers could not interview in black areas, or non-Jewish interviewers in Jewish areas, because respondents would not trust them or even talk to them. This particular problem was resolved by the use of telephone interviews.

Researchers have thought carefully about question design in an attempt to elicit truthful answers. Thus questions are presented in an explicitly even-handed way, where each alternative answer is seen as legitimate. And sometimes the question explicitly includes a “no opinion because I do not know enough about the subject” response, to indicate that such a response is legitimate.

Some years ago we participated with Fred Emery, the distinguished ANU psychologist and researcher, in studying the dynamics of telephone use. Among other things, we found that people’s responses to telephone questions were often truthful: unable to use body language to deflect the questioner, and not feeling able to remain silent, they responded as they felt. Further, because the interviewer’s body language was invisible, respondents could not use it to guide their own responses. We concluded that the telephone was a good tool for social research. This was some years before the net became an option.

We know there are some questions that are likely to yield less reliable data than others.

Answers to questions which are highly personal or about socially frowned-on behaviour – have you cheated on your spouse in the last seven days? – may often be untruthful.

Questions about people’s wealth yield data which are notoriously unreliable. People may tell you their income, but they don’t like telling the whole truth about their total assets.

Questions about socially questionable or illegal activity – would you personally take part in violent attacks on a (specified) ethnic group? – may not yield truthful answers.

When it comes to saying how they would vote, it is common for about 10% of respondents to say they don’t know. This option is not normally offered by the interviewer, so those who give it are volunteering it spontaneously.

As campaigns draw on towards election day, the proportion of “don’t knows” usually shrinks by about half.

It is hypothesised that people say “don’t know” for a variety of reasons:

  • They are uninterested in politics and give no thought to how they will vote until the last minute;
  • They are interested in politics but are equivocal about whom they will vote for until they have all the information they think they need or until time runs out, or
  • They know how they will probably vote but say “don’t know” in order to keep their voting intention private.

It has often been assumed that this last group are innately conservative people who place a premium on privacy and regard conversations about their personal voting as somewhat ill-mannered. On that basis they are commonly presumed to vote for conservative parties, but of course there is no knowing whether this is true.

Our overall hypothesis is that, in Australia, the distortion of election polls by untruthful answers has less impact than normal sampling error. We cannot at the moment prove this hypothesis, but we would observe that polls taken in Australia immediately before an election have, in the main, a good record of accuracy when compared with the election result. This suggests that there is not much noise in the system.
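To put “normal sampling error” in rough figures (our illustration; the arithmetic is the standard formula for a proportion, not drawn from any particular poll): for a simple random sample of about 1,000 respondents and a result near 50%, the 95% margin of error is roughly plus or minus three percentage points.

    import math

    def margin_of_error(p: float, n: int, z: float = 1.96) -> float:
        """Margin of error for a proportion p estimated from a simple
        random sample of size n; z = 1.96 gives 95% confidence."""
        return z * math.sqrt(p * (1 - p) / n)

    # A typical national poll: n = 1000, reported result near 50%.
    print(round(100 * margin_of_error(0.5, 1000), 1))  # prints 3.1

On these figures, untruthful answering would need to shift the published numbers by more than about three points before it mattered more than ordinary sampling noise.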