Posted By Aimée Caffrey,
Tuesday, December 10, 2019
Updated: Monday, December 9, 2019
Practical Messiness Masked by the Qualitative and Quantitative Distinction
By: Aimée Caffrey
This blog post discusses the practical messiness that can be masked by the qualitative/quantitative distinction and offers an approach for thinking about and dealing with that messiness.
Like many anthropologists, I have an abiding interest in the ways in which people construct and reproduce boundaries. During my doctoral work, my primary focus was on boundaries such as ethnicity, caste, and nationality. The professional path I have taken in more recent years has in part shifted my attention toward boundaries of another variety—the boundaries that demarcate scientific knowledge practices in industry, and toward a particular boundary with which the readers of this blog are already quite familiar—that between quantitative and qualitative. In my present role, I conduct and help support research that by most definitions would count as qualitative. At the same time, this work almost always feeds into, or follows on the heels of, research that by most definitions is quantitative. It might entail using IDIs (in-depth interviews), focus groups, or journaling exercises to better understand terminology or relevant dimensions of experience prior to writing a survey. At the other end of things, it might entail using these data collection formats in an effort to make sense of survey findings—when we have discovered the what but are uncertain of the why.
Working at this intersection instills a perhaps exaggerated awareness of, and sensitivity to, the risks of accepting the quant/qual boundary at face value. Like others of its type, this distinction is a productive shorthand for organizing and talking about a variety of practices; however, it can mask the messiness of reality. A very experienced industry researcher gestured toward this messiness on a recent L&E webinar when he remarked on the "under-powered quant" that can be at work when focus group moderators ask for a show of hands. Alternatively, consider that many tools marketed as mobile ethnography or online qual platforms contain what we otherwise think of as quantitative question types (e.g., multiple choice). To offer another example, I regularly assist fellow researchers with the development of interview and focus group discussion guides, and often this assistance centers on rephrasing "how much" (i.e., quantitative) kinds of questions to make sure we are in fact collecting qualitative data.
These examples of the messiness relate to a tension between the method deployed and the data gathered. When we think of the boundary between qualitative and quantitative as pertaining to a (reporting) distinction between numbers and words, the lines are similarly blurred—we discover the use of stories and images to help explain the findings of quantitative analysis and the use of quantitative adjectives to convey insights from qualitative analysis. This isn't terribly surprising: If there is "terror in numbers," as Darrell Huff wrote in How to Lie with Statistics, the tensions and nuances at the very human heart of qualitative data can also induce discomfort. But, just as the pictures (e.g., graphs) we draw to quell the disquietude of quant can exaggerate the story that the numbers tell, so too can the words we use to describe our qualitative findings be misleading. More important than policing the qualitative/quantitative boundary is being watchful for what the messiness around it might signal—that there is a misalignment somewhere among the objectives in mind, the method deployed, the data gathered, and ultimately, the claims that are made.
There may be perfectly good reasons to ask for a show of hands in a focus group—for example, as a quick "pulse check" or to help warm up participants at the start of the discussion. But whether we think of our work as quant or qual—and whether we are thinking of our questions, our methods, or our claims in making that determination—let's be deliberate and mindful about the implications of actively inviting that messiness into the picture.
Aimée Caffrey is a cultural anthropologist and UX researcher. Since 2017, she has worked in the Advanced Analytics Group at Bain & Company, where she collaborates with consultants, developers, designers, and fellow researchers to help clients solve some of today’s most exciting business challenges. If you wish to get in touch, please email her at Aimee.Caffrey@Bain.com.
Posted By Mark Wheeler,
Wednesday, October 2, 2019
A book published earlier this year provides a nice toolkit for qualitative researchers and consultants looking for new ways to bring additional value to our work. Super Thinking, by Gabriel Weinberg and Lauren McCann, introduces and explains a large number of mental models that can be applied as tools to help us do our research and communicate our findings and recommendations with more depth and impact.
A mental model is essentially a recurring concept that can be used to help understand, explain, and predict things. Mental models serve as shortcuts to higher-level thinking. Most have solid supporting evidence behind them but are not widely known or formally taught in school.
Because most mental models are intuitive, they can be quickly explained to others and used to recognize and describe patterns in behavior. They are highly valuable in qualitative research because we continually observe and hear things that need to be communicated to our clients – sometimes at a higher level of explanation than what we have literally heard. It is much easier for us to recognize and explain something if we have a solid label for what is going on.
There are literally hundreds of mental models in the book, drawn from a wide variety of fields, including philosophy, investing, statistics, physics and physical science, and economics. A list of several mental models is included in the accompanying table.
Applying Models in Research
In a recent marketing research project, I found a way to make use of one of Weinberg and McCann’s mental models to help communicate a key point to clients during a long day of in-person research. (Note: there is a lot of detail blinding in this example to ensure confidentiality.) The research was in support of a safer kind of post-surgical wound care that had been on the market for a few years. Some of the doctors in the research claimed that they hadn’t noticed fewer post-surgical complications since switching to the safer alternative, and some thought they may have seen even more complications. This was causing (and I am understating here) some confusion and concern in the back room. Fortunately, the situation brought to mind the mental model of moral hazard. Put simply, people take on more risk when they have information (in this case, knowledge of the new wound-care therapy) that encourages them to believe they are being protected.
Discussion with clients about moral hazard helped us to put a label on what we were hearing and helped us understand and probe differently in later interviews. Even more important, we were ultimately able to use the learnings to generate new messaging about the wound care product to address the potential problem of moral hazard for both physicians and patients.
A lot of the useful mental models in the book come from the social and behavioral sciences. The concept of availability bias describes the fact that once we make an answer (or behavior) available in someone’s mind by drawing attention to it, the answer begins to seem more correct. It is an automatic effect and is nearly impossible to resist. Of course, we usually want to avoid availability bias when we moderate (i.e., no leading questions).
I often discuss this idea of availability bias with clients when writing guides or surveys, and the reaction is overwhelmingly positive – even when it leads to re-writing someone else’s question. Availability bias comes up in other situations, for example when composing messages for promotion. In these cases, the bias can become a bit more acceptable (e.g., “Doctor, tell me about how satisfied your patients have been after you have prescribed our drug?”).
The larger point behind these examples is that introducing clients to mental models such as moral hazard and availability bias helps to communicate relatively complex points in a simple way that wouldn’t be possible without using the terms. When discussing a particular mental model such as loss aversion before research, clients and other listeners then begin to recognize it when they hear it from respondents. It is also fair to think of mental models as “value-adds” for any moderators or consultants who are able to bring in new concepts to help their client achieve their objectives. I’ve found that introducing mental models relatively early in reports can help prepare clients for critical upcoming findings and conclusions.
It is well worth your while to check out Super Thinking and discover which mental models can be most valuable to your business.
Mark A. Wheeler, PhD, is a qualitative researcher and consultant who applies his background in cognitive and behavioral science to help his clients achieve their goals. He is Principal of Wheeler Research LLC in Bryn Mawr, Pennsylvania.
Posted By Jennifer Dale,
Tuesday, September 17, 2019
Qual or Quant? Choosing the best method for your research study
Quantitative and qualitative research are both scientific methods for data collection and analysis. They can be applied alone, or in combination, to maximize insights.
The Basic Difference: Going Beyond What vs. Why
Quantitative research relies on large sample sizes to collect numerical data that can be mathematically analyzed for statistically significant findings. Surveys are structured, questions are typically closed-ended, and answer choices are fixed. However, quantitative research may also include a limited number of short-answer open-ended questions to help clarify why people responded the way they did to a closed-ended question. Eye tracking, facial coding, and even Big Data fall under the umbrella of quantitative research, with computers analyzing enormous volumes of data incredibly fast.
Quantitative studies produce numerical data, which allows for statistical analysis and ultimately precise findings. The US Census is a great example of a quantitative research study – fixed, closed-ended questions, an enormous sample size, a collective review of many respondents, and measured population segments.
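As an illustrative aside (not from the original post): the link between large samples and precise findings can be made concrete with the standard margin-of-error formula for a survey proportion. The numbers below are hypothetical – a 60% agreement result at a 95% confidence level – and this is only a back-of-the-envelope sketch using the normal approximation, not a full survey-weighting treatment.

```python
import math

def margin_of_error(p_hat, n, z=1.96):
    """95% margin of error for a survey proportion (normal approximation)."""
    return z * math.sqrt(p_hat * (1 - p_hat) / n)

# The same 60% result is far more precise with a quant-sized sample:
print(round(margin_of_error(0.60, 100), 3))   # qual-sized sample  -> 0.096
print(round(margin_of_error(0.60, 2000), 3))  # quant-sized sample -> 0.021
```

In other words, 60% from 100 people really means "somewhere between roughly 50% and 70%," while 60% from 2,000 people narrows that to about 58–62% – which is why quantitative work leans on large samples for its precision.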
In contrast, qualitative research seeks to understand the reasons behind the numbers, as well as what is not yet known. Sample sizes are smaller, questions are unstructured, and results more subjective. Unlike quantitative research, qualitative studies insert the researcher into the data collection process. The researcher probes responses and participants provide more detail. Qualitative data is collected through interviews, group discussions, diaries, personal observations, and a variety of other creative and ever-expanding means.
Qual studies work with textual and visual data, interpreted and analyzed for directional findings. Qualitative research studies include fluid and open-ended questions, a smaller sample size, an in-depth review of each respondent, and emerging themes.
I like to think of the difference visually, where a quant study collects specific data from a large number of people, and a qual study goes deeper to collect greater insights from a small number of people.
How to Choose
The answer to whether you proceed with quantitative or qualitative research lies in your research objective and available resources.
- Why you’re doing the research
- What you need to know
- Your budget, staff, + schedule
- How the findings will be used
Consider these possible scenarios the next time you’re stuck and don’t know which way to go:
Quant + qual can come together in other ways. A questionnaire with open-ended questions, while ultimately coded numerically, can offer a window into the unknown. Focus groups that also include poll questions or surveys can produce hard data when analyzed in total, even if the results are not statistically significant.
With good planning, quantitative and qualitative research come together like a dance, guiding the marketer’s success with every step.
I Say Hybrid, You Say Multimethod
Combining quantitative and qualitative research approaches is an ancient strategy, but the names continue to change with the times. I did a bit of research and found the following terms being used to describe that ideal combination of quantitative and qualitative research. What term do you use? And why? ;)
Jennifer Dale, President + CEO of InsideHeads, is a seasoned marketing professional and pioneer in online market research. Her passion for marketing, human behavior, and technology keeps InsideHeads on the short list of research providers for some of the world’s most discriminating clients. Jennifer is co-author of Qual-Online, The Essential Guide and has published a number of articles in VIEWS, Alert! and Quirk’s Marketing Research Review.
Posted By Bruce Peoples,
Tuesday, August 20, 2019
Updated: Tuesday, August 13, 2019
Moderating vs. Facilitating: What’s the Difference? Can You Do Both?
A few events in my career journey triggered the exploration of facilitating as a business opportunity and value-add for my clients. First was a breakout session at a QRCA conference where “facilitating” — vs. “moderating” — was brought to my attention. The second was when a client called me on a Sunday to replace his facilitator for a Monday session with R&D, sales, and marketing. With so little time to prepare, I trusted my instincts and experience as a moderator and let it fly… and suddenly I was a facilitator! It was a productive session — the client called me back for another session, and later to do consumer focus groups — and my curiosity was piqued to learn more.
Core Elements: The Same
I attended a three-day training session in facilitation (much like those offered at RIVA or Burke) and was pleasantly surprised to confirm that these two disciplines have much in common. The core elements of moderating and facilitating are the same: there is a gathering of people with something in common; there is a purpose behind the meeting; and there are desired outcomes. You guide the discussion in a thoughtful manner by providing structure and process.
Perhaps the biggest commonality is putting together an agenda — what we call a discussion guide — based on the client’s objectives and desired deliverables. Like a focus group discussion guide, much attention must be paid to the flow of the meeting and to the activities that will generate robust discussions.
Many of the exercises you utilize for qualitative can also be used for facilitating meetings. These might include:
- Developing lists and gathering, sorting, and ranking ideas
- Breaking out in small groups
- Mind mapping
- Perceptual mapping
Facilitated meetings are usually longer – a half day – and therefore benefit from energizing exercises interspersed throughout the session.
The Differences: Participants, Output, and Achieving Consensus
One key difference between moderating and facilitating is the participants. Focus group participants have no stake in the outcome; you will never see them again. Participants in a facilitated meeting have to work with each other. A facilitated meeting may include colleagues from different functions (R&D, marketing, sales) and at different levels of authority (managers to vice presidents). When the meeting is over, they have to work together to achieve common goals. Their strategies and tactics – their jobs – might be affected by the outcome. I ask to conduct a few brief interviews with participants from different areas prior to the meeting to get a feel for the situation, personalities, motives, and issues that might arise.
Another difference is the output: in a focus group, the outputs are insights, their implications, and the recommendations they inform. In a facilitated work group or work team meeting, the output is often an action plan that determines what, who, how much, and when. The action plan should be understood and agreed upon by all participants. In other words: achieve consensus, which means participants can live with the decisions behind the action plan and will support them.
Two issues you’ll address more often when facilitating a work team than when moderating consumers are resolving conflicts and achieving consensus. Marketing wants that new ice cream now; manufacturing can’t make it until next year. Your approach is somewhat intuitive and not much different from moderating – but it requires more attention and care. Things you’ll need to spend more time on include clarifying the issue, understanding its root causes, ensuring everyone understands the issue, brainstorming pros and cons, and ultimately utilizing techniques to rank or prioritize options.
Projects and Meetings Where Facilitators Add Value
Facilitators can add value to a lot of different projects and meetings, but common types are:
- Innovation: Brainstorming to create new product ideas.
- Process improvement: This might include flowcharting a process, identifying roadblocks, and developing solutions to clear those roadblocks.
- Strategic planning: This might include a market situation assessment, SWOT analysis, and developing the outlines of a new strategic plan.
- Data Analysis: Sharing, analyzing, and assessing a lot of data from a variety of sources.
- Planning and executing a new product launch: The output is often an action plan.
Implications for You
If you are a good moderator, should you seek out facilitation opportunities? Yes! You already have many of the skills, resources, and experiences to successfully facilitate work group meetings. To pump up your confidence before jumping in, seek out some formal facilitation training. On your first projects, get a partner to help plan and execute.
Put together a one- or two-page brochure (a PDF is fine) highlighting your capabilities – and this can include moderating. Then network to build awareness. Many of you work with research managers at large companies. Let them know about your facilitation capabilities and ask them to share those capabilities with their colleagues in other functions, such as marketing, sales, R&D, or HR.
About the Author:
Prior to becoming a qualitative consultant, Bruce Peoples worked in brand management, channel, and customer marketing for several well-known brands in different industries including Hanes and Jack Daniel’s. Bruce has been a QRCA member for about a decade now and utilizes a variety of methods to help his clients solve their marketing problems, whether they be consumer or business-to-business related. Bruce was trained in moderating at RIVA and in facilitation at Leadership Strategies.
Posted By Laurie Tema-Lyn,
Tuesday, March 19, 2019
Annual Conference Reporter on the Scene: Step Back to Move Forward: Developing Customer Journey Maps
Bring the POWER of Theater Games to Your Next Session!
Let me start off by saying I am not an actor, although I’ve had some theater training. I earn my living as a researcher, consultant and innovation catalyst, and I’ve been doing that for decades.
I like to bring PLAY into my work because the results are well worth it: ramping up the energy of a flagging team, developing empathy, encouraging candid, uncensored conversations, and triggering or evaluating new ideas.
Using theater games builds on fundamentals that all face-to-face researchers/facilitators should have in their arsenal. They include:
- The ability to build rapport and have fun;
- Creating a “safe place” so people feel comfortable expressing themselves;
- Being able to read your group through attentive listening and observation;
- Being willing to take a risk, knowing that there are no failures — risks lead to opportunities.
Here are tips and techniques to add to your repertoire:
- Start with an easy game; I call this one Word Salad. It’s a new twist on the tried-and-true technique of Mind Mapping by adding a pulse — a finger snap — as you capture each participant’s words on a flip chart pad. Breathe and repeat each word or phrase that you are given as you chart. It can be a bit hypnotic. Participants stop self-censoring, and by pausing a moment as you repeat the words, they listen, reflect, and connect. A variation is to use a Nerf ball and throw it to participants to respond. Less time for “thinking,” just gut-level responses.
- Experiment with Improvs to illuminate brand perceptions, product or service use, or to inform creative strategy or positioning. It’s good to do a bit of pre-planning to identify some people, places, things or situations that you might want to see “acted out” in your work session. Position the exercise as an experiment. Ask for volunteers and give basic improv guidelines, including the use of “Yes, And…” to accept or build on their partner’s offers. Remind participants that you are not looking for them to be funny or clever, just authentic to the character or situations. After you conduct a couple of improvs, it’s important to review what all have learned.
- Theater of Exaggeration. Try this out to spice up a concept review. You might begin in your typical fashion and then encourage participants to push the boundaries. What are the Most Outrageous Plusses or Benefits to this concept? Conversely, what are the Most Outrageous Negatives to this idea? You just might end up with some new ideas or identify problems that participants had been too polite to suggest earlier.
- Mouthfeel: Try this out to help evaluate a name and positioning. This is an improv where participants stand up and have a conversation using a new name or positioning. I recently ran a naming session with a colleague for a social services agency. We had six names in the top tier and were trying to evaluate which were the best. One of the name candidates looked great on paper, but when I asked for two volunteers to improv it (one in the role of a crisis hotline operator, the other a client calling for help) we realized it was a bear… too cumbersome to speak when used in context. We nixed that one from the list.
- Spontaneity based on solid preparation. These games work when you mentally prepare yourself as facilitator, prepare your respondent team by providing clear guidelines for what you are asking them to do, and prepare your client team in advance so that they won’t be shocked or worried if you add a theater game to your discussion guide or agenda.
This is just a small sampling of the theater games and activities you might bring to your next gig. I encourage you to try them out and make up your own, and feel free to get in contact with me.
Practical Imagination Enterprises
Posted By Kendall Nash,
Tuesday, March 5, 2019
Updated: Wednesday, February 27, 2019
Practical and thoughtful, but a walking contradiction. She made it clear that every decision she made had a purpose, and every item she bought met well-defined criteria. As she described her grocery store trips, she recalled the price associated with each and every item. In order to even make it into her cart, the items on her shopping list had to fall within an acceptable and narrow margin. And yet, her eyes lit up and you could see her lost in her memories as she described the unique metal bracelet on her wrist that she had bought on a whim for 250 euro during a trip to Barcelona. She smiled again and told me about how it was made.
Scratching Our Heads
That moment when the consumer tells you something totally incongruent with the story you’ve crafted in your mind of who they are and how they live…
Those comments that seem to contradict each other within a span of minutes…
We formulate clear pictures in our own minds of who a person is and what matters to them, only for them to turn around and tell us something that leaves us scratching our heads.
In my early years as a Qualitative Researcher, I’d find myself frustrated. Seeking patterns and convergence of themes, I was always challenged when things didn’t line up. Sure, I understood things would vary from person to person, but I was caught off guard and perplexed by the number of things that didn’t add up within the perspective of one individual.
Humans Are Messy
Of course, it didn’t take me long to realize what many before me had contemplated – that humans are, in fact, messy. We don’t follow a logical path down the road. There’s not always a reason – or at least not a consistent, or “good”, one. We don’t always make linear decisions. Sometimes we struggle with opposing internal forces that shape our mindsets and behaviors.
But then something beautiful happened.
When I looked more closely at those incongruencies within a single person, there were valuable opportunities for my client to step in and meet the consumer in the midst of the messiness. We identified opportunities for innovative products and delivery, discovered more meaningful ways to connect with those not yet using their brand, and found unique ways to give someone a great customer experience worth talking about. It was actually in those messy places we were finding our most disruptive learning – you know, the insights that make your team say “whoa, yes.” It’s exhilarating to experience those moments when you are onto something that you know will significantly and positively impact your business.
Unveiling the Mess with Qualitative Research
As a fan of both quantitative and qualitative research, I respect the ways both serve in delivering the information we need to make good decisions. Yes, plenty of people will tell you that quantitative tells you the what and qualitative tells you the why, but it’s so much more than that for me. Quantitative offers us sound decisions, confidence in direction before we set sail, and a big, delicious slice of the world. The beauty of qualitative is our ability to get in the nooks and crannies. To discover the mess and bring things into the light that just might unlock something truly magical for the brand. The rapport we build with consumers allows us a richer glimpse into what matters to them, so we can become brands that matter to them.
Embrace the Mess
Knowing that the messiness of the human heart and mind can be where the greatest potential lies for brands, we can see those moments through an entirely different lens. The next time in research you find yourself with a consumer who doesn’t seem to fit into a perfectly shaped box in your mind, celebrate! When things don’t add up exactly the way you expect them to, celebrate! You are probably onto something really good. And we go after good things.
What about you? Where have you found gold in the messiness of incongruent, inconsistent, yet beautiful human beings?
Kendall Nash is a Vice President at Burke, Inc. in Cincinnati, Ohio. She is an instructor for the Burke Institute and a past president of QRCA. Kendall’s curiosity drives her closer to consumers and their experiences. Her thrills come from uncovering what people truly want and need, and translating that so brands can win.