Qual Power

Leveraging Social Media Intelligence with the Qualitative Research Community

Posted By Kayte Hamilton, Tuesday, August 25, 2020


This post is a follow-up to the QRCA Flash Webinar I presented with my industry colleague Frank Gregory of NorthStar Solutions Group, designed as an introduction to social media research: what it is and how to get started.

It probably doesn't come as a shock to anyone reading this that the coronavirus pandemic is now the most talked-about topic in the history of social media. A perfect storm for social media conversation volume has emerged: consumers across the globe are stuck at home (initially under strict government orders, now in the interest of community safety), wanting to express how they feel about the situation, how their views of everyday topics have changed because of it, or simply to connect virtually with others and laugh to take their minds off it. The obvious way to do this is from the comfort of their couch, by posting on social media.

As consumers' behavior has been forced to change, the landscape for researchers has changed as well, with some in-person methodologies impossible to execute for the near future. Therefore, researchers (myself included!) should consider pivoting to new execution strategies and adding social media intelligence as a new tool in the toolkit.

Years ago I attempted to dabble in social media listening. Pain points included having to learn new skills like query writing, on top of navigating multiple social listening platforms, which were all different and all limiting. Functionally, the resource hadn't yet matured enough for basic qualitative interpretation. So I admit, I checked out. I figured, "if a client wanted social listening, they are either doing it internally or would have asked." I couldn't have been more wrong, and Frank quickly schooled me on the renewed power of social mining.

Definition Clarification:

Social media listening is the older take on this research tool. At the time, "listening" made sense; for the most part, we were simply observing the incoming data and trying to make our own interpretations and connections. Most of the time this told you the brand's share of conversation and offered some light sentiment analysis (is the perception positive, negative, or neutral?).

Social intelligence, the more modern way to describe this sector, is much more advanced. It can capture consumer conversations across any digital entity (from actual social media to product reviews) and add demographic and psychographic layers allowing you to “segment” the digital population (lightly compared to formal screening, of course). Today’s tool landscape helps us analyze in ways past platforms dreamed of, such as audience affinity, influencer evaluation, or platform performance benchmarking. In short, it’s adding more context to the conversations.

Regardless of the type of social media analytics tool, to me the biggest appeal of jumping into social media intelligence more fully is the reminder that it's really never too late to get started. Unlike other "in-the-moment" approaches qualitative researchers might implement, we can go backward in time and analyze social media conversation in time chunks.

Rather than asking a consumer how they felt about X topic two years ago vs. one year ago vs. six months ago vs. today, social media intelligence allows you to find the millions of consumer comments discussing that topic over that time period. The posts consumers made two years ago are still there waiting to be analyzed. So, using the coronavirus pandemic as an example, kicking off a social media intelligence analysis today doesn't mean you've missed out on the last few months of social conversation trends, including how the coronavirus has changed the way consumers think about certain brands, industries, and behaviors.
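To make that concrete, here is a minimal Python sketch of a retrospective, time-chunked analysis. It is only an illustration of the idea, not any platform's method: the CSV file name, its columns, and the keyword are hypothetical stand-ins for the much richer exports a social intelligence tool would actually provide.

# Hedged sketch: counting keyword mentions in archived social posts by
# quarter, so "2 years ago vs. 1 year ago vs. today" can be compared.
# "posts.csv", its 'timestamp'/'text' columns, and the keyword are
# hypothetical; a real platform export would carry far more metadata.
import csv
from collections import Counter
from datetime import datetime

def volume_by_quarter(path, keyword):
    """Return a Counter of mentions per quarter, e.g. {'2019-Q3': 412, ...}."""
    counts = Counter()
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            if keyword.lower() not in row["text"].lower():
                continue
            posted = datetime.fromisoformat(row["timestamp"])
            quarter = f"{posted.year}-Q{(posted.month - 1) // 3 + 1}"
            counts[quarter] += 1
    return counts

if __name__ == "__main__":
    # Example: how has talk of "contactless delivery" trended over time?
    for quarter, n in sorted(volume_by_quarter("posts.csv", "contactless delivery").items()):
        print(quarter, n)

The same loop could just as easily bucket by month or week, or tally a simple positive/negative keyword list per chunk, which is about all the old "listening" view gave us anyway.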

Every single company has been affected by current events. Consumer perceptions around the globe have shifted in almost every way imaginable, often in ways directly related to the brand or company you are supporting in your research project. There are many ways to tap into these conversations and use the information to your advantage, from proposals to report writing.

DO

  • Use the data as part of a pre-search phase, getting up to speed on a topic.
  • Consider whether this is a tool you want to execute yourself or find a partner for. As with online boards, ask whether you are an expert programmer or whether you will pay extra for the setup service.
  • Ask your clients how they currently engage with social media analytics. Can you help layer your qualitative expertise onto this "big data"? Analysts approach the information much differently than a consumer insights professional does.

DON'T

  • Assume the client's internal department is sharing social media data with the insights team.
  • Mistake social intelligence for only the "major" social media channels. The data collected includes public forums, news sites, blogs, product reviews, etc., in addition to the main social media sites (Twitter, Instagram, YouTube, parts of Facebook).
  • Block yourself; just because the information isn't "screened" or "recruited" doesn't mean it can't add value to your insight generation process.

Like all new skills, integrating social intelligence into your process takes time. To me, it’s the same type of learning curve as:

  • Online boards/communities
  • Video reports
  • Automated interviews

I think people shy away from learning new skills because they are unsure of how to translate their current qualitative skill sets. Quallies are not just moderators; we bring more to the table than simply asking questions. Therefore, we should have a dynamic set of resources to help us interpret and uncover insight beyond interviewing.

Let's start a discussion. What's holding you back from integrating social intelligence into your qualitative practice?

About the author:

Kayte Hamilton specializes in research design at InsightsNow, working with a wide variety of clients from pharma to CPG. As a hybrid researcher, she's always looking for ways to mix methods. Currently she chairs the QRCA Annual Qually Award, where she advocates for innovative research solutions and shares these findings with the greater QRCA community.

Tags:  listening  online listening  QRCA Digest  qualitative research  Research Methodologies  social media  Customer Journey Maps

 

Online Chat Focus Groups: A First-Timer’s Perspective

Posted By Cheryl Halpern, Tuesday, July 14, 2020


First-time experiences are both exhilarating and intimidating. COVID-19 has presented us with the opportunity to add to our toolboxes, either because we recognize the seismic shift to online methodologies, or we simply have more time on our hands.

After attending a QRCA webinar about online chat focus groups, I volunteered to conduct a mock session with other professionals who were interested in seeing the platform in action.  

Methodology Description 

Online chat is similar to in-person focus groups in that targeted respondents are recruited to participate in a moderated discussion at a specific point in time for a set duration (typically 60 – 90 minutes), but different in that engagement is entirely text based. 

Online chats typically involve eight to 20 respondents. The moderator can use a whiteboard to display visuals, and backroom observers can communicate with each other directly and with the moderator through an administrator. The administrator also takes care of technical issues and helps prod participants, if needed.   

Objectives and Target Audience 

For this mock chat, my objectives were to let interested researchers experience the platform firsthand and to provide a fun break in these challenging times. I came up with a list of questions to help us explore “The Lighter Side of Quarantine.” 

All who had expressed an interest in the webinar chat room were invited to attend and could opt to be either a participant or an observer. Participants were given screen names based on the adjective they said best described their current emotional state and what they had eaten most recently. Anxious Turkey, Optimistic Beans and Weary Apple were among the favorites. 

Discussion Guide 

I was advised to allow five minutes for every three questions and planned the guide accordingly, with timed sections and detailed questions under each section. 
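For anyone sketching out their own guide, the arithmetic is simple enough to check up front. The snippet below applies that five-minutes-per-three-questions rule of thumb to an invented set of sections; the section names and question counts are mine, purely for illustration.

# Back-of-the-envelope pacing check using the "five minutes for every
# three questions" rule of thumb. Section names and question counts are
# hypothetical examples, not the actual guide from this session.
MINUTES_PER_QUESTION = 5 / 3

guide = {
    "Warm-up": 4,
    "Quarantine routines": 12,
    "The lighter side": 12,
    "Silver linings": 10,
    "Wrap-up": 4,
}

total_minutes = sum(guide.values()) * MINUTES_PER_QUESTION
for section, n_questions in guide.items():
    print(f"{section}: {n_questions} questions, about "
          f"{n_questions * MINUTES_PER_QUESTION:.0f} minutes")
print(f"Total: about {total_minutes:.0f} minutes (for a 60-90 minute session)")

Forty-two questions works out to roughly 70 minutes, comfortably inside the typical 60-90 minute window.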

Once loaded, the discussion guide appears in sequential blocks on the lower righthand side of the moderator’s screen. Six to eight of these blocks can be seen at one time, and all can be seen by scrolling up and down.  

Screen shots to be used on the whiteboard are labeled and appear in a different scroll on the upper righthand side of the moderator’s screen.  

Preparation 

The platform I used had a practice room that I could enter whenever I wanted. It was pre-programmed with fourteen participants submitting random responses at the typical pace, which is essentially a bell curve over about 90 seconds after a new question is introduced.
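Out of curiosity, that pacing is easy to mimic. Here is a rough simulation of it; the fourteen participants match the practice room, but the mean and spread of the bell curve are my own guesses, not the platform's actual values.

# Rough simulation of practice-room pacing: simulated participants whose
# response delays roughly follow a bell curve within about 90 seconds of
# a new question. The mean and spread are guessed for illustration.
import random

def simulated_response_delays(n_participants=14, mean=45.0, sd=20.0, window=90.0):
    """Return sorted response delays in seconds, clamped into the window."""
    delays = [min(max(random.gauss(mean, sd), 2.0), window)
              for _ in range(n_participants)]
    return sorted(delays)

if __name__ == "__main__":
    for delay in simulated_response_delays():
        print(f"response arrives {delay:5.1f}s after the question")

Even this crude version makes the moderation challenge obvious: most answers land in a tight burst around the middle of the window, right while you are trying to type probes.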

As with any group discussion, the moderator’s task is to guide the discussion, introduce materials, and probe to elicit deeper insight. With synchronous chat discussions, that translates into three distinct but coordinated tasks: 

  1. Sending questions, either from the pre-loaded discussion guide or by typing freehand. 
  2. Sending visual stimuli to the whiteboard. 
  3. Reading the scrolling discussion and immediately probing responses as needed. 

During practice, I learned that I had the flexibility to send pre-loaded questions in any order or skip them altogether if desired.  

Moderation 

I logged in about fifteen minutes before the session started and watched as fourteen participants and thirteen observers entered. 

At the appointed time, I sent instructions to the group chat one sentence at a time, pacing myself by reading the words aloud, just as the participants were reading them for the first time.

I submitted my first screen shot and question and the frantic fun began! After just a few seconds, answers started popping up, each identified by the screen names that had been assigned.  

While I am accustomed to multitasking in live focus groups, I found it rather challenging to type probes while the chat continued to scroll on the screen during the live discussion. Also, because comments were coming in quickly, any probe on a specific comment required including the screen name of the individual being addressed. While the participant screen names I devised for this exercise were fun, I quickly realized that shorter names would have been more expedient.

Another interesting aspect of the chat platform is that responses to one question may keep coming in after a new question has been presented. Each respondent is reading, processing, typing and submitting at a different pace. This has implications for both analysis of the transcript and construction of the discussion guide. The resulting output is not a threaded transcript, but a chronological record. 
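For anyone planning the analysis, here is a hedged Python sketch of one way to re-thread such a chronological export: attribute each response to the most recent moderator question posted before it. The record format and the example lines are invented; note that Weary Apple's late answer about Q1 still lands under Q2, which is exactly the spillover caveat described above.

# Hedged sketch: re-threading a chronological chat transcript by tying
# each response to the most recent moderator question. The tuple format
# (seconds, screen_name, text) is assumed, not a real platform export.
from collections import defaultdict

transcript = [
    (0,   "Moderator",     "Q1: What made you laugh this week?"),
    (22,  "Anxious Turkey", "My dog barking at my Zoom background."),
    (95,  "Moderator",     "Q2: What is your new comfort food?"),
    (101, "Weary Apple",   "Still on Q1 - blooper reels!"),
    (130, "Weary Apple",   "Banana bread, obviously."),
]

threads = defaultdict(list)
current_question = None
for seconds, name, text in transcript:
    if name == "Moderator":
        current_question = text
    elif current_question is not None:
        threads[current_question].append((name, text))

for question, answers in threads.items():
    print(question)
    for name, text in answers:
        print(f"  {name}: {text}")

A human pass is still needed to move stragglers like that back under the question they actually answer, but re-threading makes them much easier to spot.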

Consensus Assessment 

We had a Zoom meeting immediately following the chat so that anyone who was interested could participate in a debrief. Virtually all felt the pace was incredibly fast and wished they had more time to read and process each of the responses individually. Nevertheless, the observers agreed that although the content was generated quickly, it was surprisingly rich and abundant.

Tips

From my experience moderating an online chat discussion for the first time, I would offer the following tips for others who want to utilize this tool: 

  1. Engage participants from the outset. Without face-to-face interaction, it is especially important to make the respondents feel welcomed and eager to participate.  
  2. Familiarize yourself with all toggles/options available. I did not realize that I could have done more to optimize the respondents’ screens. 
  3. Use the whiteboard judiciously. Juggling the whiteboard and the discussion guide at the same time probably complicates things unnecessarily for a novice. 
  4. Review your discussion guide with an understanding that responses from one question may spill over into the next on the transcript and arrange questions accordingly. 
  5. To facilitate deep dives on key topics, plan multiple, closely related questions and allow respondents 90 seconds to read and respond to each.   
  6. Include time allocations and screen shot reminders in your programmed discussion guide so that all the cues you’ll need are in one place. 
  7. Partner with a trusted administrator, whether that is a colleague or someone from the platform’s staff. They can run interference in the “backroom” so that you can focus on the respondents. 
  8. Practice! Even a skilled moderator needs to take the time to learn the nuances of a new tool.  

About the Author: Cheryl Halpern

Cheryl has 25+ years of executive-level marketing experience and is the current President of Halpern Research. She was formerly a VP with Dallas Marketing Group and VP of Global Product Marketing with Mary Kay, Inc.

Tags:  Actionable  Focus Groups  Insights  Market Research Technology  Online Listening  Online Technology  QRCA Digest 

 

Adapting Your Listening Skills to the Online World

Posted By Ted Kendall, Tuesday, February 11, 2020
Updated: Tuesday, February 11, 2020


As a successful qually, you intuitively know the importance of listening, how to listen well, and how to show participants that you are listening.

Listening is important because it engenders trust, creates rapport, and opens participants up.

In a physical setting, the key things we do to listen, and to show we are listening, include:

  1. Asking questions in response to participants' thoughts
  2. Using verbal and non-verbal cues to show that you are listening
  3. Letting participants complete their own sentences
  4. Maintaining eye contact
  5. Acknowledging comments in specific ways like boarding or post-it notes

You will have noticed that most involve physicality—you have to be there in real life.

So, how do you listen, and just as importantly, show you are listening, in online qual?

Before we get into this, let me clarify that when I am talking about online qual in this context, I am referring to text-based online qual—primarily bulletin board style. While webcam interviews may be considered online, real life listening skills can be applied to the medium fairly easily.

Set Expectations to Counter Online Research Misperceptions


A unique challenge with online qual is that participants don’t necessarily know the difference between a survey and a qualitative discussion, so they often treat the study as if it were a survey. And they often believe that any interactions will be with a chatbot, not a real person.

It's critical to counter these widely held beliefs and set the appropriate expectations up front. Tell participants that you will be listening to what they say. And let them know it's not a survey; it's a conversation.

I can sometimes be pretty blunt about this—even going so far as to tell participants that if they just speed through the answers to my questions, they will not get the incentive. And then, when someone does that, I follow through on the promise and call them on it. Often it changes their interactions. Sometimes it doesn’t. But they definitely know you are listening. And, if the discussion is open to the whole group, others will see that you are listening as well.

Depending on the platform, you can use the messaging tools as well as the landing pages to accomplish this. And if the tools aren’t there, just use email or text, even phone, outside of the application.

I also make it a habit to reply to every participant post in the introductions, much like I do in a traditional focus group setting, or for that matter, in a conversation with a stranger. These replies can often reflect common ground: "I love spending the day in the mountains with my dog too. What kind of dog do you have?" That's not a question that will provide rich insights, but it will help open up the participant and really show that you are listening.

It’s critical to establish early in the conversation that you are a living, breathing, listening human being—not some chatbot or AI ghost in the machine. This has a huge impact on how participants approach your conversation.

Avoid AI Tools


Several online platform providers are touting AI-generated responses to participants. All I can say is that this is what we get when we let the programmers drive development. Avoid this feature. Yes, it saves you time during the discussion. But it also removes you from the conversation; you are no longer actively listening. You wouldn't let a robot take over your focus group session just to save time, would you?

Also, AI is not yet perfect. And it needs to be in this case. It’s not a life or death situation, unless you consider the life or death of the research conversation. Even if the AI gets 90% of the interactions correct, there is that 10% that will suck the air right out of your conversation with that participant. If you are using a group setting, other participants will see the mistake and the negative impact becomes exponential.

So just don’t do it. The potential losses greatly outweigh the potential time savings. Besides, actually responding manually forces you to listen and learn—which is what this is all about. Don’t let a robot take your job.

How to Digitally Use “Non-verbal” Cues and Maintain Eye Contact


In the online, text-based world, you certainly can’t maintain eye contact, nor can you provide non-verbal cues to show you are listening. So how do you employ those key principles of listening in an online, text-based world?

Probably the most obvious way is replying to participants’ posts with questions to better understand what they have said or get some clarification on their comment. Yes, I am talking about the same probing questions we lay on participants in focus groups and interviews. These probing questions work just as well online as they do in real life.

To replace those non-verbal cues, I have found it quite effective to comment or ask questions even when there is no need to do so. The idea is that by just saying something, participants recognize that you are there and you are reading what they are posting; you are listening.

Sometimes it is easy to just copy and paste the same general comment to several participants when you do this. If the participants can't see one another, this is fine and saves you time. But if the participants can see each other, then it just makes you look like a robot.

When making comments just to show your presence, it's important not to require a reply (often this is a platform option). I like to just thank people for providing quality detail or thank them for an interesting take on the topic. The important thing is to personalize it a bit, to keep it from sounding generic.

Another way to show you are listening is to use the messaging app within the platform to hold meta conversations outside the actual discussion. I make it a point to send reminders at set times as well as thank-yous at the end of the day of discussion.

These messages don’t have to be just logistical in nature. You can also use them to show you are listening. Sometimes I will include a comment about some of the discussion—an insight that came through for the whole group of participants, or sometimes personalizing it to a specific participant.

In the end, listening is important to successful qual, whether you are in the same room as the participant or interacting digitally. It’s just how you listen, and how you show that you are listening, that can take a little adjustment in the digital qual world. But it’s no less important and no less doable.

Author Bio:

Ted Kendall is the founder of TripleScoop, a boutique research agency focused on online qualitative research. Ted got to this place in his career by being in the right place at the right time to pioneer early online methods. He was a co-founder of QualTalk, which became 20/20 Research's QualBoards. He learned how to moderate online qual through trial and error and has moderated hundreds of online qual discussions and interviews since that first one in 1997. And he is usually a good listener.

LinkedIn: www.linkedin.com/in/triplescoopted

Tags:  Listening  Online Listening  QRCA Digest  Qualitative  Qualitative Research  Research Methodology 
