Opinion polls and other sample surveys can produce accurate and useful information if the pollster uses good statistical techniques and also works hard at preparing a sampling frame, wording questions, and reducing nonresponse. Many surveys, however, especially those designed to influence public opinion rather than just record it, do not produce accurate or useful information. Here are some questions to ask before you pay much attention to poll results.
• Who carried out the survey? Even a political party should hire a professional sample survey firm whose reputation demands that it follow good survey practices.
• What was the population? That is, whose opinions were being sought?
• How was the sample selected? Look for mention of random sampling.
• How large was the sample? Even better, find out both the sample size and the margin of error — the range within which the results of 95% of all samples drawn as this one was would fall.
• What was the response rate? That is, what percent of the original subjects actually provided information?
• How were the subjects contacted? By telephone? Mail? Face-to-face interview?
• When was the survey conducted? Was it just after some event that might have influenced opinion?
• What were the exact questions asked?
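The margin of error mentioned above depends mainly on the sample size. A minimal sketch in Python, assuming a simple random sample and a sample proportion (the function names are illustrative, not from any polling software):

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Approximate 95% margin of error for a sample proportion.

    Assumes simple random sampling. Using p = 0.5 gives the widest
    (most conservative) margin, which is what pollsters usually report.
    """
    return z * math.sqrt(p * (1 - p) / n)

def quick_margin(n):
    """Rule-of-thumb margin of error: roughly 1 / sqrt(n)."""
    return 1 / math.sqrt(n)

# A typical national poll of about 1,000 people:
print(round(margin_of_error(1000), 3))  # about 0.031, i.e. ±3 percentage points
print(round(quick_margin(1000), 3))     # about 0.032 — the rule of thumb agrees
```

Note that quadrupling the sample size only halves the margin of error, which is why most national polls stop at roughly 1,000 to 1,500 respondents.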
Academic survey centers and government statistical offices answer these questions when they announce the results of a sample survey. National opinion polls usually don’t announce their response rate (which is often low) but do give us the other information. Editors and newscasters have the bad habit of cutting out these dull facts and reporting only the sample results. Many sample surveys by interest groups and local newspapers and TV stations don’t answer these questions because their polling methods are in fact unreliable. If a politician, an advertiser, or your local TV station announces the results of a poll without complete information, be skeptical. Even