Posted By Tom Rich,
Tuesday, January 14, 2020
Looking Back – A Year of Change in the World of Qualitative
By: Tom Rich
At this time in the new year, it feels right to look back and see what seems to be different. It certainly has been a time of change in the world of qualitative research. I’ve spent the last couple of weeks communicating with some qualitative luminaries to get their perspectives on developing trends over the past year. Based on those conversations, I think a number of key developments are worthy of mention.
The Tool Bag is Growing
The continued expansion of the tools available for qualitative research is profoundly changing what qualitative is, and how it’s done. A dizzying array of tools and resources can be applied to qualitative. The online research platforms continue to grow in both features and sophistication, use of AI-based tools is growing, video editing software has become easier to use, and biometric tools continue to grow more affordable and user friendly. As a result, we can now provide insights to clients that are based upon more than mere conversation—we can bring multiple data sources to bear on our analysis.
The Lines are Blurring
As more tools become available, and as they become increasingly affordable and user friendly, clients are increasingly interested in mixed method studies. This doesn’t just mean qual/quant approaches; it’s no longer acceptable for qualitative practitioners to say, “I don’t do quant.” It means a greater merging of online and face-to-face approaches as well, as UX and more traditional qualitative studies are starting to look very similar.
Agility is the Word of the Moment
Clients are demanding compressed timetables. Practitioners are increasingly completing research in four or five business days. Also, clients’ priorities seem to be changing. Whereas the question used to be, “what will it take to get exactly the information we need?”, it now seems to be, “we have a week—what can we learn in that time?”. As a result, new approaches to recruiting and fieldwork are becoming more common.
Qualitative Approaches Are Becoming More Important to Understanding Big Data
Research users continue to see the value in using qualitative tools to understand all the data they have available. More and more seem to understand that there is a big difference between information and wisdom, and that actually talking to people–while it doesn’t yield quantifiable information–provides a level of detail and insight that can be acquired no other way.
If you feel like the ground is shifting beneath your feet, you’re not wrong. Whether you’re a researcher or a research user, staying current on new tools, techniques, and priorities is essential to survival and success.
I’d like to extend my thanks to these qualitative luminaries for their perspective on developing trends: David Bauer, Jim Kulevich, Abby Leafe, Joanna Patterson, Steve Schlesinger, and Manny Schrager.
Since founding his business, Thomas M. Rich & Associates, in 1996, Tom Rich has conducted thousands of focus groups, one-on-one interviews and online interactions for clients in nearly every industry. He boasts an extensive background in brand strategy, consumer behavior and shopper insights -- skills he developed while working for companies that include Backer & Spielvogel Advertising, Nabisco, Tambrands, and Unilever. This background gives Tom a unique skill set among qualitative practitioners and allows him to structure research and analysis around the tactical and strategic decisions that will be made as a result of the research. Tom holds a bachelor's degree in English from the University of Pennsylvania and a master's degree in business administration from the Amos Tuck School of Business Administration at Dartmouth College.
Posted By Aimée Caffrey,
Tuesday, December 10, 2019
Updated: Monday, December 9, 2019
Practical Messiness Masked by the Qualitative and Quantitative Distinction
By: Aimée Caffrey
This blog post discusses the practical messiness that can be masked by the qualitative/quantitative distinction and offers an approach for thinking about and dealing with that messiness.
Like many anthropologists, I have an abiding interest in the ways in which people construct and reproduce boundaries. During my doctoral work, my primary focus was on boundaries such as ethnicity, caste, and nationality. The professional path I have taken in more recent years has in part shifted my attention toward boundaries of another variety—the boundaries that demarcate scientific knowledge practices in industry, and toward a particular boundary with which the readers of this blog are already quite familiar—that between quantitative and qualitative. In my present role, I conduct and help support research that by most definitions would count as qualitative. At the same time, this work almost always feeds into, or follows on the heels of, research that by most definitions is quantitative. It might entail using IDIs, focus groups, or journaling exercises to better understand terminology or relevant dimensions of experience prior to writing a survey. At the other end of things, it might entail using these data collection formats in an effort to make sense of survey findings—when we have discovered the what but are uncertain of the why.
Working at this intersection instills a perhaps exaggerated awareness of, and sensitivity to, the risks of accepting the quant/qual boundary at face value. Like others of its type, this distinction is a productive shorthand for organizing and talking about a variety of practices; however, it can mask the messiness of reality. A very experienced industry researcher gestured toward this messiness on a recent L&E webinar when he remarked on the "under-powered quant" that can be at work when focus group moderators ask for a show of hands. Alternatively, consider that many of what are generally marketed as mobile ethnography or online qual tools often contain what we otherwise think of as quantitative question types (e.g., multiple choice). To offer another example, I regularly assist fellow researchers with the development of interview and focus group discussion guides, and often this assistance centers in part on rephrasing "how much" (i.e., quantitative) kinds of questions to help us make sure we are in fact collecting qualitative data.
These examples of the messiness relate to a tension between the method deployed and the data gathered. When we think of the boundary between qualitative and quantitative as pertaining to a (reporting) distinction between numbers and words, the lines are similarly blurred—we discover the use of stories and images to help explain the findings of quantitative analysis and the use of quantitative adjectives to convey insights from qualitative analysis. This isn't terribly surprising: If there is "terror in numbers," as Darrell Huff wrote in How to Lie with Statistics, the tensions and nuances at the very human heart of qualitative data can also induce discomfort. But, just as the pictures (e.g., graphs) we draw to quell the disquietude of quant can exaggerate the story that the numbers tell, so too can the words we use to describe our qualitative findings be misleading. What is more important than policing the qualitative/quantitative boundary? It is being watchful for what the messiness around that boundary might signal—that there is a misalignment somewhere among the objectives in mind, the method deployed, the data gathered, and ultimately, the claims that are made.
There may be justifiable and even good reasons to ask for a show of hands in a focus group—for example, as a quick "pulse check", or to help warm up participants at the start of the discussion. But whether we think of our work as quant or qual—and whether we are thinking of our questions, our methods, or our claims in making that determination—let's be deliberate and mindful about the implications of actively inviting that messiness into the picture.
Aimée Caffrey is a cultural anthropologist and UX researcher. Since 2017, she has worked in the Advanced Analytics Group at Bain & Company, where she collaborates with consultants, developers, designers, and fellow researchers to help clients solve some of today’s most exciting business challenges. If you wish to get in touch, please email her at Aimee.Caffrey@Bain.com.
Posted By Kunyi Mangalam, Mara Consulting,
Tuesday, November 26, 2019
Use a Listening Session Approach for Better Design & Innovation Research
This blog is intended to offer a high-level description of a qualitative data collection method called Listening Sessions that can yield deeper understanding of people’s reasoning, emotions, and guiding principles. This approach is particularly valuable for practitioners of design research, and for those who contribute to the innovation process.
This approach was developed by Indi Young, who invented Mental Model Diagrams (MMDs) when she was one of the founding members of Adaptive Path. Find her at IndiYoung.com. When I interned with Indi, I learned this approach as part of her methodology. Adopting this approach made me a better interviewer for the Discovery work I do in Service Design.
Research for Design – it’s Not Designing a Calorie Tracking App, it’s How to Look Good at a Reunion
Research for design and innovation is tasked with understanding how people make decisions as they progress toward achieving a purpose. The “thing” that’s being designed, like an app, is always part of a larger goal. For example, I want to keep track of my calories by using a tracker on my phone. The thing being designed is the tracking app. But my larger purpose is to look good at a reunion. The app will help me in that goal.
For innovation, it’s not just about one thing (the app), it’s also about what else the company can do to support me in trying to look good at my reunion. Needs will be surfaced that the company can decide whether they want to pursue supporting. Revealing these needs, figuring out how to support them in a competitive way, and commercializing them is fundamental to innovation.
Change Your Mindset from Interviewing to Listening
Conducting qualitative research for design and innovation requires a shift in mind-set. It requires that you “listen” rather than “interview.” You may be thinking, “I have been listening to people for my entire career — I listen for a living!” That’s how I felt, too. Then I learned that an Interview is to a Listening Session as Moderating is to Facilitating. They look similar, but the intent, the process, and the outcomes are different.
There Are Two Critical Differences between an Interview and a Listening Session
First, Listening Sessions belong to Problem Space research; Interviewing tends to belong to Solution Space research.
Problem Space research is concerned with how a person (not a user) thinks and reasons their way to achieve a goal. It is disconnected from a particular company, brand, product, or service. Problem Space research is foundational and can be used to fuel many solutions.
An easy way to think about Problem Space research is that the research focus would be just as relevant to someone your grandparent’s age as it would be to your grandchild’s. Examples include: How do you groom yourself for an important day at work? How do you decide to attend a performance? How do you prepare for a good night’s sleep? How do you make yourself look good to see people at an important get-together?
Solution Space research involves speaking to people about their relationship or an experience with a product, service, or brand — i.e., the solution. The words “users”, “members,” “customers,” and “employees” imply a relationship with a “solution.” Product development, marketing communications strategy and tactics, brand positioning, customer experience mapping, packaging, user experience, and content development are all examples of solution space research.
The table below summarizes the difference between Problem Space and Solution Space research.
The second difference is that you listen for and nudge people to reveal what is underneath a preference, opinion, explanation, or description. Compare this to a Solution Space IDI where we are often interviewing for Perceptions, Opinions, Behaviours, and Attitudes (shout out to Naomi Henderson for her acronym POBA) around a brand, a product, etc. In the Problem Space, when POBAs are articulated, we take them as our cue to nudge people further into their thinking, their feelings, or the “code” they live by to identify what is underneath.
These two distinctions lead to significant practical differences between being a Listener and being an Interviewer.
Difference 1: There is no Guide
In a Listening Session, there is no Interview Guide. The conversation begins with the study scope question, like, “Tell me about how you made sure you looked your best for your reunion…” And it continues from there. In this example, the participant may or may not have used a calorie counter app. They may have used one from a competitor. (Note, they would have been recruited such that they prepared to look their best for their reunion.)
In place of a guide, there is a disciplined ear; the Listener nudges the participant when they hear more surface descriptions. The box to the left summarizes different types of surface descriptions that need nudging to get more depth. The box to the right provides some examples of question stems that will redirect participants to reveal their thinking, reasoning, and emotions.
Difference 2: The Conversation Is Free from Externally Introduced Topics
In many guides, there are questions that we — and our clients — want answered, like reactions and opinions about things participants have not brought up. In a Listening Session, nothing is introduced or queried that hasn’t already been mentioned.
Difference 3: Outputs Are the Starting Point for Innovation and Design
In the Solution space, research outputs are usually an “answer” of sorts: which product package should be produced? Which creative delivered the message most compellingly? Which call to action content resulted in the most conversion?
In the Problem Space, Listening Sessions identify people’s needs as they progress toward a goal or purpose. These needs are the starting point in the organization’s quest to figure out solutions (services, products, experiences) that better support people. Examples include: a website that is more reflective of their needs, a calorie tracking app that corresponds more closely to their purpose of looking good for others.
In other words, while qual in the Solution Space tends to supply the “answers” to problems, qual in the Problem Space tends to supply the “questions” that spur a company to explore one or more directions.
Try a Few Question Stems and See Where it Leads
Integrate a few question stems into your conversation. When a participant says, “Mostly we go to movies on Wednesday night…”, ask, “How did you figure out that works best for you?” When someone makes a statement of fact — like describing a scene — ask, “What’s going through your mind in that scenario?” to get them back to their own thinking and feelings.
For more than 30 years, Kunyi has helped organizations deeply understand the people they wish to serve and assist them in using this understanding to make decisions and move forward with more certainty and less risk.
She is a senior consultant at Mara Consulting, working to help organizations improve service delivery through technology, privacy & security, business consulting, and human centered design.
LinkedIn: linkedin.com/in/kunyi
Posted By Mary Sorber,
Tuesday, November 12, 2019
"Selling" the Added Value of Qualitative UX Research
Quallies who want to expand their practice into UX research need to be aware of the different types of UX research and the terminology used in the UX field. Practicing UX researchers (UXRs), especially those not lucky enough to work within a company with a well-established UX research practice, should be adept at discussing how qualitative research adds value. This article is written for anyone who needs to “sell” the benefits of UX research, whether to generate new business or to convince internal colleagues.
How Qualitative UX Research Adds Value
As independent consultants, you probably already have a sales pitch in your back pocket describing the value of qualitative insights to the marketing organization. In UX research, you’ll be working most closely with product management and engineering. Talking to this different set of stakeholders means tweaking the terminology and emphasis. In working with product teams, the language that resonates is that of “bringing the outside-in perspective, mitigation of risk, and efficient use of valuable engineering resources.”
Three Categories of UX Research
UX research falls into 3 broad categories: exploratory, conceptual, and evaluative. Each contributes differently to the product and mitigates different risks. Exploratory — or generative — UX research (field visits, ethnography) addresses product-market fit and reduces the risk of building the wrong product for the wrong people. Conceptual UXR (iterative design research, usually with prototypes) reduces the risk of building the product in the wrong way and minimizes internal bias. Evaluative UXR (primarily usability) confirms user goals are met and reduces user adoption risk.
Disruptive vs. Incremental Research
It’s a mistake to think of UX research as synonymous with usability testing. UX research encompasses much more, and significant contributions come from other methods. Usability testing — as an evaluative method — can contribute only incremental improvements to the product. The engineering team is a fast-moving train on tracks that have already been laid. Usability testing may be able to paint the coach a new color, but it’s too late to route the tracks to a different destination. Disruptive innovation and product-market fit come from doing exploratory research in advance of the engineering effort.
Working as a UXR, your role will be to bring in the outside perspective, helping to develop and maintain focus on the user problem and delivering value to the user. This is sometimes a large effort in the face of pervasive and — at times — strongly held opinions about the perceived problem or the imagined value of a visionary solution. Developing a strong partnership with entrepreneurs and product leaders is key to grounding the vision and shaping it into something that will be successful in the world of the eventual users.
Being adept at explaining the different types of UX research and how they can add value to the product team will help you sell your UX services. In terms of impact, focus first on exploratory methods, then conceptual methods. Usability testing may be a good way to get in the door, but treat it as a last resort, as your impact on the project will be severely constrained.
Mary Sorber is Founder and Principal Researcher at Practical Insights, a boutique qualitative research company engineered to ask the right questions to get answers and insights that are a springboard for innovation and improved user experience. We are happy to partner with quallies looking to break into the UX field.
Posted By Regina Szyszkiewicz, MA,
Tuesday, October 29, 2019
Meditation and the Art of Moderating
Handling Tough Moments
Imagine you are seated at the head of a conference table and you open a focus group discussion asking participants how they feel about their health insurance plan. And the first person who responds says she does not like her plan because she received poor care that led to amputation of her leg. And imagine you still have 90 minutes to go and you don’t want the group’s energy or the discussion to get derailed. What do you do?
This actually happened to me about 15 years ago. I was personally shocked and momentarily caught off-guard by the comment. But I managed to connect with the participant, acknowledging what she had just shared. I proceeded to uncover what others had to say — which included more typical responses of being happy or unhappy due to cost or access.
In the end, the group was successful and the client obtained the desired insights tied to the research objectives. I felt fortunate, because I knew it could have gone a different way.
We Are Only Human
As moderators, we have to keep our cool. There are many things we need to juggle during a live focus group or interview session. We need to multi-task: asking the questions, managing participation, keeping track of research objectives, watching the time, etc.
During sessions, we never know what will come. Things might not go as planned: participants may arrive late, first-time client observers may want to add questions that are off-topic to a discussion guide that is already packed, someone may become ill in the group, etc. We need to be able to respond mindfully and wisely.
To ensure participants feel safe, secure, and comfortable, we personally also need to feel the same. When we have an off day, we need to find a way to get our energy centered and grounded.
Meditation to the Rescue
There are things we can do to become present with ourselves so that we can be present with others. Having a self-connecting routine or ritual prior to beginning an interview or focus group is a good start.
I find daily meditation and yoga practice to be an invaluable self-connecting training. I can more easily find the mental and emotional space to make wise choices in the moment. I am also better able to have compassion for others and myself.
In fact, I have found meditation and yoga to be so beneficial in my life and career that I became a certified yoga instructor 15 years ago! (I now teach a community yoga class on most Saturdays as a volunteer.)
Some self-connecting tools to explore include meditation and yoga practice.
Regina Szyszkiewicz, MA, of Ten People Talking loves qualitative research. She is a master moderator who has conducted over a thousand qualitative sessions. Regina has deep experience in both in-person and online qualitative methods. Regina received her MA from the University of Illinois in Applied Sociology / Market Research. She served on QRCA’s board from 2016-2018 and currently serves as a co-chair of QRCA’s Online Special Interest Group.
Posted By Katrina Noelle,
Tuesday, October 15, 2019
Use the CDJ Framework to Innovate Methods
Innovate your tools and methods by going on your own customer journey; become a customer on a journey through your own methods! To keep your qualitative insight approaches fresh and relevant, you should consider, evaluate, buy, enjoy, advocate, and bond with the methods you use to understand consumers.
The customer decision journey (CDJ) is a model that shows how customers complete a purchase, guiding marketers where and what they should do along the way. Borrow this approach to go on your own journey to develop and choose new tools, techniques, and methodologies.
The journey begins where the consumer’s journey begins – with the top-of-mind consideration set.
1. CONSIDER
- Start by considering your needs. Why are you choosing to iterate an existing method or start offering a new one?
- Brainstorm with your team. Where are opportunities for improvement? What do team members want to try/experiment with?
- Make a list of all the contenders. Then walk through each of them, asking yourself:
- Is it answering a need? Filling a gap?
- Is it giving your team something new or unique?
- Can you explain succinctly the value proposition and point of difference as though you were in an elevator with a prospective client?
- Is anyone else doing it? Who? How? Could you offer it differently?
2. EVALUATE
- Test your ideas. While you do so, be sure to constantly ask for feedback from your team, participants, and clients.
- Track iterations and updates. Chronicle changes made to the approach at every step because your ideas may morph, combine, or improve as you progress.
- Be open. Keep an open mind to the changes/modifications/new ideas along the way.
- Keep asking. Constantly query if the new/improved method is filling a need. Is it improving an older process or adding something new?
3. BUY OR CHOOSE
We’ve included “choose” in this traditional third step because when choosing a methodology, it’s often just that – a choice, a decision to move forward in a certain way – not a purchase.
- Note: this step is sometimes overlooked. After all this work, it’s hard to say “no” to an idea to which you’ve grown close. But keep in mind that rolling it out is an even bigger step than testing it.
- If you decide NOT to move forward, table it in a helpful way. Make note of learnings that could be used in a different format or could serve another purpose at some other time.
4. ENTER THE LOYALTY LOOP: Enjoy, Advocate and Bond
Take a moment to ENJOY your hard work; now is the time to champion your development with your broader organization and with clients. Try a pilot test with an understanding client, or ADVOCATE for the approach within your organization!
- Ground everyone. Establish the need you are trying to meet, the gap you are trying to fill, and/or your rationale for adding this approach.
- Bond. Bonding in this sense means that the team gets familiarized with the new approach and comes to see it as their own. Solicit feedback from participants about their experience. Ask clients how they are using the new approaches and what could be improved further.
- Engage. Ensure your team stays engaged, enjoys the experience, and gets the most out of the new methods as part of their continual evolution.
This post was inspired by a presentation entitled “Innovate Your Tools And Methods By Going On Your Own Customer Journey” at the CX Talks event in Chicago held on September 24, 2019.
Katrina is principal of KNow Research, a full-service insights consultancy that has spent 16+ years designing custom qualitative insights projects to unlock insights about brands and target audiences. She is also co-founder of Scoot Insights, whose trademarked Scoot™ Sprint approach helps decision-makers choose the right direction.
President, KNow Research, Co-Founder Scoot Insights
www.knowresearch.com / www.scootinsights.com
@kat_noelle / https://www.linkedin.com/in/katrinanoelle/
Posted By Mark Wheeler,
Wednesday, October 2, 2019
A book published earlier this year provides a nice toolkit for qualitative researchers and consultants looking for new ways to bring additional value to our work. Super Thinking, by authors Gabriel Weinberg and Lauren McCann, introduces and explains a large number of mental models that can be applied as tools to help us do our research and communicate our findings and recommendations with more depth and impact.
A mental model is essentially a recurring concept that can be used to help understand, explain, and predict things. Mental models serve as shortcuts to higher-level thinking. Most have solid supporting evidence behind them, but they are not widely known or formally taught in school.
Because most mental models are intuitive, they can be quickly explained to others and used to recognize and describe patterns in behavior. They are highly valuable in qualitative research because we continually observe and hear things that need to be communicated to our clients – sometimes at a higher level of explanation than what we literally heard. It is much easier to recognize and explain something when we have a solid label for what is going on.
There are literally hundreds of mental models in the book. They come from a wide variety of fields, including philosophy, investing, statistics, physics and physical science, and economics. A list of several mental models is included in the accompanying table.
Applying Models in Research
In a recent marketing research project, I found a way to use one of Weinberg and McCann’s mental models to help communicate a key point to clients during a long day of in-person research. (Note: there will be a lot of detail blinding in this example to ensure confidentiality.) The research was in support of a safer kind of post-surgical wound care that had been on the market for a few years. Some of the doctors in the research claimed that they hadn’t noticed fewer post-surgical complications since switching to the safer alternative, and some thought they may have seen even more complications. This was causing (and I am understating here) some confusion and concern in the back room. Fortunately, the situation brought to mind the mental model of moral hazard. Put simply, people take on more risk when they have information (in this case, the knowledge about the new wound-care therapy) that encourages them to believe that they are being protected.
Discussion with clients about moral hazard helped us to put a label on what we were hearing and helped us understand and probe differently in later interviews. Even more important, we were ultimately able to use the learnings to generate new messaging about the wound care product to address the potential problem of moral hazard for both physicians and patients.
A lot of the useful mental models in the book come from the social and behavioral sciences. The concept of availability bias describes the fact that once we make an answer (or behavior) available in someone’s mind by drawing attention to it, the answer begins to seem more correct. It is an automatic effect and is nearly impossible to resist. Of course, we usually want to avoid availability bias when we moderate (i.e., no leading questions).
I often discuss this idea of availability bias with clients when writing guides or surveys, and the reaction is overwhelmingly positive – even when it leads to re-writing someone else’s question. Availability bias comes up in other situations, for example when composing messages for promotion. In these cases, the bias can become a bit more acceptable (e.g., “Doctor, tell me about how satisfied your patients have been after you have prescribed our drug?”).
The larger point behind these examples is that introducing clients to mental models such as moral hazard and availability bias helps to communicate relatively complex points in a simple way that wouldn’t be possible without using the terms. When discussing a particular mental model such as loss aversion before research, clients and other listeners then begin to recognize it when they hear it from respondents. It is also fair to think of mental models as “value-adds” for any moderators or consultants who are able to bring in new concepts to help their client achieve their objectives. I’ve found that introducing mental models relatively early in reports can help prepare clients for critical upcoming findings and conclusions.
It is well worthwhile to check out Super Thinking and discover which mental models can be most valuable to your business.
Mark A. Wheeler, PhD, is a qualitative researcher and consultant who applies his background in cognitive and behavioral science to help his clients achieve their goals. He is Principal of Wheeler Research LLC in Bryn Mawr, Pennsylvania.
Posted By Jennifer Dale,
Tuesday, September 17, 2019
Qual or Quant? Choosing the best method for your research study
Quantitative and qualitative research are both scientific methods for data collection and analysis. They can be applied alone, or in combination, to maximize insights.
The Basic Difference: Going Beyond What vs. Why
Quantitative research relies on large sample sizes to collect numerical data that can be mathematically analyzed for statistically significant findings. Surveys are structured, questions are typically closed-ended, and answer choices are fixed. However, quantitative research may also include a limited number of short-answer open-ended questions to help clarify why people responded the way they did to a closed-ended question. Eye tracking, facial coding, and even Big Data fall under the umbrella of quantitative research, with computers analyzing enormous volumes of data incredibly fast.
Quantitative studies produce numerical data, which allows for statistical analysis and ultimately precise findings. The US Census is a great example of a quantitative research study – fixed, closed-ended questions, an enormous sample size, a collective review of many respondents, and measured population segments.
In contrast, qualitative research seeks to understand the reasons behind the numbers, as well as what is not yet known. Sample sizes are smaller, questions are unstructured, and results are more subjective. Unlike quantitative research, qualitative studies insert the researcher into the data collection process. The researcher probes responses and participants provide more detail. Qualitative data is collected through interviews, group discussions, diaries, personal observations, and a variety of other creative and ever-expanding means.
Qual studies work with textual and visual data, interpreted and analyzed for directional findings. Qualitative research studies include fluid and open-ended questions, a smaller sample size, an in-depth review of each respondent, and emerging themes.
I like to think of the difference visually, where a quant study collects specific data from a large number of people, and a qual study goes deeper to collect greater insights from a small number of people.
How to Choose
The answer to whether you proceed with quantitative or qualitative research lies in your research objective and available resources. Consider:
- Why you’re doing the research
- What you need to know
- Your budget, staff, + schedule
- How the findings will be used
Consider these possible scenarios the next time you’re stuck and don’t know which way to go:
Quant + qual can come together in other ways. A questionnaire with open-ended questions, while ultimately coded numerically, can offer a window into the unknown. Focus groups that also include poll questions or surveys can produce hard data when analyzed in total, even if the results are not statistically significant.
With good planning, quantitative and qualitative research come together like a dance, guiding the marketer’s success with every step.
I Say Hybrid, You Say Multimethod
Combining quantitative and qualitative research approaches is an ancient strategy, but the names continue to change with the times. I did a bit of research and found the following terms being used to describe that ideal combination of quantitative and qualitative research. What term do you use? And why? ;)
Jennifer Dale, President + CEO of InsideHeads, is a seasoned marketing professional and pioneer in online market research. Her passion for marketing, human behavior, and technology keeps InsideHeads on the short list of research providers for some of the world’s most discriminating clients. Jennifer is co-author of Qual-Online, The Essential Guide and has published a number of articles in VIEWS, Alert! and Quirk’s Marketing Research Review.
Posted By Neri de Kramer,
Tuesday, September 3, 2019
Updated: Tuesday, September 3, 2019
Building Empathy: Tips from Anthropology Class
One thing I teach my students is how to think less in terms of “us” versus “them” and more in terms of our common, shared humanity. I make it very clear that students should not expect sensationalist accounts of “weird” or “exotic” peoples, but when they do learn about behaviors or belief systems that don’t make intuitive sense, thinking about all humans as one of “us” helps them look for the underlying logics to these diverse ways of being human.
This open mindset helps students better understand others’ points of view. These others are not limited to the cultural groups we read about but also include the others in the classroom. Classroom discussions are guided by the motto “understand, before wanting to be understood.” This forces students to listen to each other with suspended judgment. It leads to more empathic insights into people and their various views. It makes them consider how their opinions might be received by classmates. It improves student collaborations.
Photo credit: Evan Krape, © 2016 University of Delaware
I assure students that these skills are vital in the real world. They are also obviously vital to applied qualitative research, where in fact we get paid to suspend our judgment in order to put ourselves in the shoes of our respondents and uncover their point of view in empathic ways. In addition to our respondents, we can also fruitfully apply this mindset to our interactions with clients, recruiters, facilities, and other teams.
Anthropologist Grant McCracken made a similar argument when he warned against business anthropologists’ habit of criticizing their clients, even “dissing them behind their backs.” He suggested practitioners return to a more fundamental 20th century culturally relativist position. This means we should practice what we preach by placing ourselves in the shoes of not only those we are paid to understand but also of those we work with.
The benefits are numerous. Understanding others helps us find the common goal all are willing to work toward together. An empathic grasp of what is really going on below the surface of a behavior or request improves our negotiating position. It enables us to solve problems instead of focusing on their extraneous expressions. It makes for better proposals. It is helpful in figuring out the power dynamics in a given group. Also, understanding why people respond the way they do might help prevent frustration, which is exhausting.
Below, I share three teaching activities I use to instill a more relativist mindset in my students. This may prove useful to our own work with respondents, as well as to the problems we encounter in our daily work lives. They are designed to foster reflexivity, or an awareness of how one’s own background, environment, and mental processes shape one’s views of the world and of others. The goal is to come to see our own norms, values, and way of being in this world as only one out of many other logical possibilities. This is the critical first step to becoming more receptive to points of view that are not our own.
Examine the roots of your deeply held values
Think polygamy is deeply wrong? Well, why? How did you develop that value? How does that value fit into your life and can you begin to imagine how it might be different for others? These are questions my students grapple with when we play the Norms Game.
We first make a list of categories that constitute a person's social identity (gender, nationality, etc.) which students fill out for themselves. Students then receive a sheet with 7 columns. On the left is a list of cultural practices on which opinions tend to diverge. Students are asked to indicate where they stand on these things by placing an X in one of five ranking columns. In the final column they indicate which aspect of their identity, or which culture, informs that opinion.
Table 1: Courtesy of Dr. Adkins-Jackson
The resulting discussion helps students see their viewpoints and those of others less as absolute dividing lines and more as products of situated lives. Students learn that as people’s circumstances change, their identity shifts, which may result in new worldviews. These are important insights for anybody working with diverse groups of people. They reveal our own ideas as relative to those of others and help us see where others are coming from. The game has also generated important insights for the clients and research teams for whom I adapted it.
Explore your subconsciously held biases
The human mind holds associations that people are often not even aware of. Psychologists Greenwald, Banaji, and Nosek study this phenomenon and developed a set of 14 Implicit Association Tests. IATs measure the strength of subconscious associations between concepts and evaluations (e.g., female with career for the gender-career IAT, or Black people with pleasing for the race IAT). Anybody with an internet connection and a compatible device can take an IAT. It takes approximately 10 minutes. My students are required to take at least one. They also consider how their implicit biases might affect their social interactions.
The associations we make affect how much empathy we are able to develop for a member of a group cognitively considered “other.” For us, this exercise can make us more thoughtful about our interview questions, our conduct during focus groups, and who we are most comfortable working with. See also Banaji and Greenwald’s book Blindspot: Hidden Biases of Good People for the psychological science behind the biases we carry around with us.
Expand your mind
In order to move beyond stereotypes, we need to familiarize ourselves with others and broaden and complicate any implicitly held associations. One way to do this is by purposefully exposing ourselves to unfamiliar perspectives, experiences, and stories. So, for their assignments, I require students to engage with media and social media that don’t match their personal interests.
This can be eye-opening. While we researchers already spend a lot of time figuring out others, the more cultural knowledge we accumulate, the more it benefits us in interactions with respondents, clients, and colleagues. We need to periodically break out of our own personal media silos and listen to the voices of those we don’t already know and agree with. It’s a behavioral shift that is easy to implement with the proliferation of podcasts, blogs, electronic newsletters, and so on. I consider it a form of anthropological reconnaissance I carry out during my commute or while browsing my phone at the doctor’s office.
Our differences are undeniable and fascinating. But a rigid “us” vs “them” stands in the way of generating empathic insights and prohibits fruitful collaboration. Reflexivity is key to building the empathy we need to succeed in all aspects of our work as qualitative research consultants.
About the Author:
Neri de Kramer, PhD, is a cultural anthropologist specializing in consumer behavior. She has done research in both Europe and the United States, for academic as well as corporate purposes. She is a freelance consultant and professor at the University of Delaware.
Posted By Bruce Peoples,
Tuesday, August 20, 2019
Updated: Tuesday, August 13, 2019
Moderating vs. Facilitating: What’s the Difference? Can You Do Both?
A few events in my career journey triggered the exploration of facilitating as a business opportunity and value-add for my clients. First was a breakout session at a QRCA conference where “facilitating” — vs. “moderating” — was brought to my attention. The second was when a client called me on a Sunday to replace his facilitator on Monday, to facilitate a session with R&D, sales, and marketing. With so little time to prepare, I trusted my instincts and experience as a moderator and let it fly… and suddenly I was a facilitator! It was a productive session — the client called me back for another session, and later to do consumer focus groups — and my curiosity was piqued to learn more.
Core Elements: The Same
I attended a three-day training session in facilitation (much like those offered at RIVA or Burke) and was pleasantly surprised to confirm that these two disciplines have much in common. The core elements of moderating and facilitating are the same: there is a gathering of people with something in common; there is a purpose behind the meeting; and there are desired outcomes. You guide the discussion in a thoughtful manner by providing structure and process.
Perhaps the biggest commonality is putting together an agenda — what we call a discussion guide — based on the client’s objectives and desired deliverables. Like a focus group discussion guide, much attention must be paid to the flow of the meeting and to the activities that will generate robust discussions.
Many of the exercises you utilize for qualitative can also be used for facilitating meetings. These might include:
- Developing lists and gathering, sorting, and ranking ideas
- Breaking out in small groups
- Mind mapping
- Perceptual mapping
Facilitated meetings are usually longer – a half day – and therefore benefit from energizing exercises interspersed throughout the session.
The Differences: Participants, Output, and Achieving Consensus
One key difference between moderating and facilitating is the participants. Focus group participants have no stake in the outcome; you will never see them again. Facilitated participants have to work with each other. A facilitated meeting may have colleagues from different functions (R&D, marketing, sales) and at different levels of authority (managers to vice presidents). When the meeting is over, they have to work together to achieve common goals. Their strategies and tactics – their jobs – might be affected by the outcome. I ask to conduct a few brief interviews of participants from different areas prior to the meeting to get a feel for the situation, personalities, motives, and issues that might arise.
Another difference is the output: in a focus group, the outputs are insights, along with their implications and the recommendations they support. In a facilitated work group or work team meeting, the output is often an action plan that determines what, who, how much, and when. The action plan should be understood and agreed upon by all participants. In other words: achieve consensus, which means participants can live with the decisions behind the action plan and support them.
Two issues you’ll address more often when facilitating a work team than when moderating consumers are resolving conflicts and achieving consensus. Marketing wants that new ice cream now; manufacturing can’t make it until next year. Your approach is somewhat intuitive and not much different than when moderating – but it requires more attention and care. Things you’ll need to spend more time on include clarifying the issue, understanding its root causes, ensuring everyone understands the issue, brainstorming pros and cons, and ultimately utilizing techniques to rank or prioritize.
Projects and Meetings Where Facilitators Add Value
Facilitators can add value to a lot of different projects and meetings, but common types are:
- Innovation: Brainstorming to create new product ideas.
- Process improvement: This might include flowcharting a process, identifying roadblocks, and developing solutions to clear those roadblocks.
- Strategic planning: This might include a market situation assessment, SWOT analysis, and developing the outlines of a new strategic plan.
- Data analysis: Sharing, analyzing, and assessing a lot of data from a variety of sources.
- Planning and executing a new product launch: The output is often an action plan.
Implications for You
If you are a good moderator, should you seek out facilitation opportunities? Yes! You already have many of the skills, resources, and experiences to successfully facilitate work group meetings. To pump up your confidence before jumping in, seek and find some formal facilitation training. On your first projects, get a partner to help plan and execute.
Put together a one- or two-page brochure (PDF is fine) highlighting your capabilities – and this can include moderating. Then network to generate awareness. Many of you work with research managers at large companies. Let them know and ask them to share your capabilities with their colleagues in other functions, such as marketing, sales, R&D, or HR.
About the Author:
Prior to becoming a qualitative consultant, Bruce Peoples worked in brand management, channel, and customer marketing for several well-known brands in different industries including Hanes and Jack Daniel’s. Bruce has been a QRCA member for about a decade now and utilizes a variety of methods to help his clients solve their marketing problems, whether they be consumer or business-to-business related. Bruce was trained in moderating at RIVA and in facilitation at Leadership Strategies.