
Mar 03 2014

A survey rant

“We need to do some PR but we need something new to talk about – I know, let’s do a survey!” …
“I have a dissertation to do – I know, I’ll do a survey!”

In 1993 I was awarded a PhD following a long period of (self-funded) research, fieldwork, analysis and thesis-writing. The fieldwork stage included a rigorous questionnaire survey of my subjects (magistrates), with follow-up face-to-face interviews, courtroom observations, local data capture, and further interviews of courtroom professionals and others. My research largely predated the world-wide web, and email use then was relatively rare, so I couldn’t distribute my questionnaires electronically or extend the sample’s reach through any kind of sharing. However, thanks to the demands of my LSE supervisor and other academic staff, and the support of court staff across my nine carefully-selected fieldwork sites, I was able to conduct a highly-targeted survey, achieve a very high response rate among my sample, and, after some SPSS mainframe number-crunching, help ensure I could draw statistically reliable conclusions to satisfy a panel of University of London examiners.

I later did a PR qualification which included a module and an examination on market research techniques. These experiences mean I have a healthy respect for opinion pollsters and market research agencies. Indeed, I have commissioned such agencies to do questionnaires and telephone interviews on behalf of organisations I have represented, and discussed rigorous and statistically sound research methodologies. What I learned about populations, sample sizes, response rates, margins of error, confidence intervals, etc, has proved invaluable.
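To put rough numbers on those terms: for a simple yes/no survey question, the margin of error and the sample size needed for a given confidence level come down to a couple of lines of arithmetic. The Python sketch below is illustrative only – it assumes a 95% confidence level and a worst-case 50/50 split of opinion, and the example figures (40 responses from a population of 30,000) are made up for the purpose, not drawn from my research.

# Illustrative sketch: margin of error and required sample size for a
# yes/no survey question. The confidence level, assumed proportion and
# example figures below are assumptions, not taken from this post.
import math

Z_95 = 1.96   # z-score for a 95% confidence interval
P = 0.5       # assumed proportion; 0.5 gives the widest (most cautious) margin

def margin_of_error(n, population=None, z=Z_95, p=P):
    """Margin of error for a proportion, with an optional finite-population correction."""
    moe = z * math.sqrt(p * (1 - p) / n)
    if population:
        # correction shrinks the margin when the sample is a large share of the population
        moe *= math.sqrt((population - n) / (population - 1))
    return moe

def required_sample_size(moe, population=None, z=Z_95, p=P):
    """Number of responses needed to achieve a target margin of error."""
    n = (z ** 2) * p * (1 - p) / (moe ** 2)
    if population:
        n = n / (1 + (n - 1) / population)
    return math.ceil(n)

print(f"{margin_of_error(40, 30_000):.1%}")    # roughly 15.5% with only 40 responses
print(required_sample_size(0.05, 30_000))      # roughly 380 responses for a 5% margin

On those assumptions, 40 responses give a margin of error of roughly plus or minus 15 percentage points, while getting it down to plus or minus 5 points needs around 380 responses – which is why a few dozen completed questionnaires rarely justify headline percentage claims.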

As I repay my research gratitude from the 1990s, I will happily participate in surveys that are professionally delivered (perhaps under the Market Research Society Code of Conduct). I welcome surveys that are well-targeted at me or my interests, and which don’t rely on a prize draw for the latest electronic gadget as an incentive to complete them (a copy of the research findings is often more welcome).

However, my respect for sound survey work isn’t always shared. Today, surveying is much cheaper and easier – arguably, too easy. I regularly hear friends complain of poorly constructed questionnaires fired out indiscriminately, often online, emailed, added to LinkedIn discussions and/or shared via Facebook or Twitter with a desperate “Pls RT!”

Bias

CIPR best practice guide for using statistics in communications (PDF)

The survey-senders can also mistake a few dozen completed surveys for a success, but it’s not only quantity that matters, it’s quality (SurveyMonkey, perhaps the most widely used service, provides useful guidance on sample sizes, and in 2011 the CIPR produced a helpful guide on using statistics in communications). For example, are the people who respond representative of the market you want to learn about, or just the people you know? Has the survey been accurately targeted at a chosen market or geographical region? If the conclusions are focused on companies, do respondents have the right levels of professional or company knowledge to provide accurate answers or opinions on behalf of their employers?

I sometimes see survey reports which fail on several (or even all) of these criteria, or which don’t acknowledge how sampling or other biases may have been introduced. For example, using social media to disseminate a survey of the extent of building information modelling (BIM) use in the UK construction industry may get a good response rate from those BIM users who use social media, but exclude the views of BIM users who don’t use social media, and – crucially – of people in companies that are not using BIM at all (arguably, these are the people it might be most useful to learn about). The survey will also need to include questions to filter out non-UK users, and perhaps also to identify multiple respondents from the same organisation.

Perhaps I sound a bit negative about surveys? I’m not. Market research and opinion surveys remain a highly valuable way to gather information to check assumptions, identify trends and support decision-making. We also have better tools to conduct surveys and to analyse results more quickly and accurately than ever before, but these need to be properly deployed with a clear understanding of both the opportunities and the potential drawbacks.
