Many years ago, when working for a health and safety consultancy, I was shown a newspaper article about the enforced closure of a care home after it had failed an inspection on safety grounds, such as its banisters being a couple of millimetres too far apart. "These people aren't interested in whether or not you've got a loving home," one of the very upset care workers was quoted as saying, "they're only interested in ticking little boxes."
More recently, the IT firm Atos has been handed the outsourced decisions on sick and disabled people's welfare, taking them out of the hands of GPs and others who know the welfare recipients, and placing them in the hands of a survey on which you have to score points to be declared unfit for work. LatentExistence describes it in more detail here, and this is one of the results of this procedure. (By the way, if I say anything too critical of this company, my entire blog may be shut down - this happened to CarerWatch, and it took a lot of fighting and correspondence to find out that the entire forum, a pillar of sanity and support for many exhausted, poor and desperate people, had been closed over a link someone had posted many months previously. I also recommend Margaret McCartney's writings on them - sadly the BMJ article I had in mind, and which I believe is linked to here, no longer seems to be available.)
In other words, a badly thought out survey can have horrific - and fatal - results. It can of course also be fairly hilarious to those who have the time and ability to pick it apart, as bloggers did to the BCA's "plethora of evidence" about chiropractic being effective back in 2009.
I'm currently earning my pennies by doing some scientific data entry, which involves a bit of database testing. I'm finding it both fun and fascinating, and I'm discovering just how much thought has to go into designing a survey and handling its results. A simple "N/A" typed into a box where an integer is required means that query after query gets generated, multiplying the poor data manager's work. When you create a survey, study or report, you have to allow for the various responses people will actually give.
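To make that "N/A"-in-an-integer-box problem concrete, here is a minimal sketch (not the actual system I work on - the function name and the N/A markers are my own assumptions) of catching the mismatch at data-entry time, rather than letting it surface later as queries back to the data-entry team:

```python
def parse_integer_field(raw, allow_na=False):
    """Return an int, None for an explicitly permitted 'N/A',
    or raise ValueError so the form can prompt for a correction."""
    cleaned = raw.strip()
    if allow_na and cleaned.upper() in {"N/A", "NA"}:
        # 'Not applicable' is a legitimate answer here, stored as NULL,
        # because the survey designer decided so up front.
        return None
    try:
        return int(cleaned)
    except ValueError:
        raise ValueError(f"Expected a whole number or N/A, got {raw!r}")

print(parse_integer_field("42"))                  # 42
print(parse_integer_field("N/A", allow_na=True))  # None
```

The design point is deciding *up front* which questions may legitimately be "not applicable", instead of forcing respondents to invent their own markers and leaving the data manager to untangle them.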
The problem a lot of people cite with surveys (in my experience, anyway) is that they "don't give a holistic picture", "ignore the real person", "don't treat anyone as an individual", "reduce important things to tick boxes" and so on. The trouble with this is that a really large survey can't treat everybody, or anybody, as an individual - that's what case studies are for. You need to state exactly what you want to find out, and in how much detail. No survey can find out everything about everybody! The real problem arises when what people say isn't representative of what's really going on, or when the results don't make any sense.
If your tick boxes make people feel like that, it doesn't mean that surveys involving tick boxes are the problem; it means that the wrong questions are being asked - or, if it's multiple choice, that the wrong range of answers is being offered.
It's a pretty good idea, I'd say, to do a trial run of a survey and find out where these errors are coming from. No amount of planning can anticipate everything that will go wrong. So do a practice run, change what you need to, and then have another go. (This is much better than, say, adding a new question halfway through what you are doing - that makes the whole thing a mess.)
Still, I would have thought that even a mobile phone company would have had the imagination to foresee the problems they created for themselves with the survey they just sent me. I won't name the company, but they've just offered me a new contract. I've accepted it, because it's a lot better than my current one, and also there's no minimum term. Let me make clear that they telephoned me while I was at work, I asked if they could call back another time, and they did - exactly when they said they would - and their estimate of how long the call would take was accurate, which was a big bonus. Anyway, the next day I received the following exchange:
MOBILE: "From [X]: You recently spoke to us on [Day X] and we'd like to ask you 6 quick questions about your experience. All your feedback is free."
MOBILE: "Question 1: Was this the first time we've spoken about a specific problem or query? Reply with Y for yes and N for no."
So far so good . . .
MOBILE: "Question 2 of 6: Is your query or problem now resolved? Reply with Y for yes and N for no."
First problem. I thought I'd better add to that . . .
ME: "Except I didn't have a problem/query. [MOBILE COMPANY] did."
MOBILE: "Error: We were unable to recognise your response. Please enter a number between 0 and 9 where 0 is 'poor' and 9 is 'excellent'."
MOBILE: "Question 3 of 6: That's good to hear. Overall how would you rate our advisor on a scale of 0 to 9 where 0 is 'poor' and 9 is 'excellent'?"
I know a few people who work for these companies, and I know that if you answer anything other than the top number, the advisor gets a bollocking and loses their commission. (Mobile companies are not always kind to their staff - I recommend a read of this experience Dean had with a different one.) So although it was annoying being rung up, and I had to listen to the standard waffle, the guy was a good listener and answered - even anticipated - all my questions. So I decided to be generous.
MOBILE: "Question 4 of 6. How would you rate the time it took before you spoke to an advisor where 0 is poor and 9 is excellent?"
I think by now it's clear that they started this survey with one very specific assumption - that I had contacted them with a problem - which needed clearing up before questions and answers could helpfully continue.
ME: "N/A. I didn't call you."
MOBILE: "Error: We were unable to recognise your response. Please enter a number between 0 and 9 where 0 is 'poor' and 9 is 'excelent'."
ME: "'Excellent' is spelled with two 'L's. Please take a literacy course."
MOBILE: "Unfortunately we are unable to recognize the response in your message. Please try again at another time."
By not allowing for a few very simple variations in circumstance, or employing a proofreader, or even copying and pasting its error messages consistently, this company has undermined its own surveying ability and wasted its own time and money as well as mine.
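The copy-and-paste point can be made structural. Here is a minimal sketch (the question texts are paraphrased from the exchange above; the structure is hypothetical, not the company's actual system) in which each error message is derived from the same definition as the question itself, so the two can never drift apart - and "N/A" is a recognised reply:

```python
QUESTIONS = [
    {"text": "Was this the first time we've spoken about this query?",
     "valid": {"Y", "N", "N/A"}},
    {"text": "Rate our advisor from 0 (poor) to 9 (excellent).",
     "valid": {str(n) for n in range(10)} | {"N/A"}},
]

def handle_reply(question, reply):
    """Accept a reply, or build the error from the question's own
    definition so error text always matches what is actually valid."""
    answer = reply.strip().upper()
    if answer in question["valid"]:
        return f"Recorded: {answer}"
    options = ", ".join(sorted(question["valid"]))
    return f"Sorry, we didn't recognise that. Valid replies: {options}"

print(handle_reply(QUESTIONS[0], "maybe"))
```

With this shape, a mismatched error message (asking for 0-9 when the question wanted Y/N) or an inconsistently spelled one simply cannot be written, because there is only one place the valid answers live.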
And this is why it's important to learn how to design a survey before you do one.