So, recently we’ve been talking about Qualitative Research and how it’s not so scientific, but that ain’t bad.
We identified three ways that you *might* make Qualitative Research more scientific and have been pulling those approaches apart. They are to:
- Use a relatively large sample size (which we destroyed here)
- Ensure that your test environment doesn’t change (which was shown to be foolish here)
- Ensure that your test approach doesn’t change (which we’ll take down now).
So, one of the first things you learn when you come to qualitative research, particularly usability testing, is to write a test script. A test script is good because you’ll be spending an hour or so with each person, and you need something to prompt you to cover everything you need to cover and to give each session a good structure.
But this is how scripts are supposed to be used – as a guide. You shouldn’t literally use them as a script! And you should feel confident and comfortable deviating from the script at the right times.
When are the right times to deviate from the script? I think there are two key times.
The first is when you already know what the answer to your question will be – in that case there is very little reason to ask it. Sometimes it is helpful to have an array of people responding in the same way to the same task or question – especially if your client is attached to a bad idea for some reason. Repetition can help bring them around. Most of the time, though, you’re just wasting valuable research time covering old ground when you could be covering new.
Very often it’s not until the first one or two research sessions that some issues become blindingly obvious. You wonder why you didn’t pick them up before, but that’s exactly why we do this kind of testing/research. If you’re not updating your prototype (as recommended in Part Two), then you should at least update your script. Don’t cover old ground for no good reason; research time is too valuable for that.
The other main reason for deviating from the script is when the person you’re interviewing says or does something really interesting. Your script tries to anticipate people’s reactions up to a point – but the purpose of doing this research is to learn things you didn’t know before, and sometimes what you thought you’d find and what you actually find are very distant from one another. This is great! It means you’re doing good research. (It’s alarmingly easy to find the answers you want to find by researching poorly.)
If you’re interviewing someone and they say something interesting and perhaps unexpected – follow it! This is potentially research gold. Don’t let sticking to a script stop you from following this gold. You may, in fact, want to alter your script for future interviews depending on what you discover here.
Of course, this means that when it comes time to write your report you won’t be able to say things like ‘80% of people said this’ or ‘only 10% of people did that’. People do like to say those kinds of things in their reports and, of course, clients tend to like to hear them. People like numbers. (Just think of how they latch on to stupid concepts like the ‘3 click rule’.) But you shouldn’t really be using numbers like this in your reporting anyway. After all – as we talked about in part one – you’re not working with statistically significant numbers; you’re probably talking about eight, maybe twelve people. Your percentages, no matter how popular, are not particularly meaningful, AND you are helping to fuel the perception that research is about numbers like this when, as we agreed earlier, qualitative research is really all about depth of insight – quantitative research is what you do if you want to pull out fancy percentages.
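To make the small-sample point concrete, here’s a quick sketch (the task-completion numbers are hypothetical, not from any real study) of just how much one participant moves a percentage when your whole sample is eight people:

```python
# Hypothetical illustration: with only 8 participants, each person
# represents a 12.5-point swing in any "percentage" you report.
def as_percentage(successes, sample_size):
    return 100 * successes / sample_size

n = 8
# Say 5 of 8 people completed a task...
print(as_percentage(5, n))  # 62.5
# ...one more or one fewer completely changes the headline number.
print(as_percentage(6, n))  # 75.0
print(as_percentage(4, n))  # 50.0
# The swing contributed by a single participant, in percentage points:
print(as_percentage(1, n))  # 12.5
```

A 62.5% finding and a 50% finding differ by exactly one person, which is why those tidy percentages say far less than they appear to.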
So, write yourself a script and use it for inspiration and reminders and for structure but don’t be constrained by it and do let the content of your interview guide the questions you ask and what you report.
Which makes me think… perhaps we need to talk some about how to ask good questions whilst interviewing… soon, I think.
(Brief apologies for the delay between parts 2 and 3 – I had to do some holidaying in Italy. Briefly back in London before flying out to UX Week tomorrow morning. Are you having a ridiculously busy August too?!)
Definitely look forward to the discussion of questions. That’s a BIG interest of mine; I’ve been lecturing on that for years!
Another approach to making qualitative analysis more scientific is to insert quantitative evaluations somewhere in the process. That is, to transform subjective expert observations into numbers as soon as possible, and to then conduct the remainder of the research/analysis on the numbers.
This is frighteningly common, and while I never formally studied research and science methodologies, it strikes me as fundamentally flawed.
As a (made-up) example, imagine those research reports that come out every now and then comparing, say, the amount of “hard news” versus “soft news” on TV. The analyst will watch one night of TV news and rate the “hardness” of each story from, say, 1 to 10. They will then average the scores by TV channel/network or by program/show, transforming the numbers in such a way that all of the top-level results of the study are expressed in pure, scientific-sounding figures: “73% of all news on Channel X is soft news” or “News programs contain only 14% of the hard news they contained 10 years ago”.
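The pattern being criticized can be sketched in a few lines (the ratings, the cutoff, and “Channel X” are all invented for illustration): one analyst’s subjective 1-to-10 scores go in, and a precise-looking percentage comes out, with the subjectivity hidden along the way.

```python
# Made-up "hardness" ratings (1-10) assigned by a single analyst
# to one night of news stories on a hypothetical Channel X.
ratings = [2, 3, 1, 8, 2, 4, 2, 9, 3, 7, 4]

# A story counts as "soft" if its rating falls below some cutoff --
# another subjective choice buried inside the method.
SOFT_CUTOFF = 5
soft = [r for r in ratings if r < SOFT_CUTOFF]

# The subjective inputs now surface as a scientific-sounding number.
soft_share = 100 * len(soft) / len(ratings)
print(f"{soft_share:.0f}% of all news on Channel X is soft news")  # 73%
```

Every step after the initial gut-feel ratings is arithmetic, which is exactly what lends the final figure its unearned air of objectivity.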
This is what I was complaining about in a post last year criticizing Forrester for “Magically Transforming the Subjective into the Objective”.
Is this a common technique in our field?
Very well said, Leisa. Talking about interviews and getting the best out of interviewees: in Part 3 of the IELTS speaking examination, the trickiest part for examiners is to ask candidates follow-up questions that bring the conversation down (or up) to a level indicative of the candidate’s best performance. Converting a script into a guide – one that can in turn serve as the script for a future guide – requires a certain open-mindedness and flexibility.