Qual Power
Adapting Your Listening Skills to the Online World

Posted By Ted Kendall, Tuesday, February 11, 2020
Updated: Tuesday, February 11, 2020


As a successful qually, you intuitively know the importance of listening, how to listen well, and how to show participants that you are listening.

Listening is important because it engenders trust, creates rapport, and opens participants up.

In a physical setting, the key things we do to listen, and to show we are listening, include:

  1. Asking questions in response to participants' thoughts
  2. Using verbal and non-verbal cues to show that you are listening
  3. Letting participants complete their own sentences
  4. Maintaining eye contact
  5. Acknowledging comments in specific ways, like whiteboarding or using Post-it notes

You will have noticed that most of these involve physicality: you have to be there in real life.

So, how do you listen, and just as importantly, show you are listening, in online qual?

Before we get into this, let me clarify that when I talk about online qual in this context, I am referring to text-based online qual, primarily bulletin-board style. While webcam interviews may be considered online, real-life listening skills transfer to that medium fairly easily.

Set Expectations to Counter Online Research Misperceptions


A unique challenge with online qual is that participants don’t necessarily know the difference between a survey and a qualitative discussion, so they often treat the study as if it were a survey. And they often believe that any interactions will be with a chatbot, not a real person.

It’s critical to counter these widely held beliefs and set the appropriate expectations up front. Tell participants that you will be listening to what they say. And let them know it’s not a survey; it’s a conversation.

I can sometimes be pretty blunt about this, even going so far as to tell participants that if they just speed through the answers to my questions, they will not get the incentive. And then, when someone does that, I follow through on the warning and call them on it. Often it changes their interactions. Sometimes it doesn’t. But they definitely know you are listening. And if the discussion is open to the whole group, others will see that you are listening as well.

Depending on the platform, you can use the messaging tools as well as the landing pages to accomplish this. And if the tools aren’t there, just use email or text, even phone, outside of the application.

I also make it a habit to reply to every participant’s post in the introductions, much like I do in a traditional focus group setting or, for that matter, in a conversation with a stranger. These replies can often reflect common ground: “I love spending the day in the mountains with my dog too. What kind of dog do you have?” That’s not a question that will yield rich insights, but it will help open up the participant and really shows you are listening.

It’s critical to establish early in the conversation that you are a living, breathing, listening human being—not some chatbot or AI ghost in the machine. This has a huge impact on how participants approach your conversation.

Avoid AI Tools


Several online platform providers are touting AI generated responses to participants. All I can say is that this is what we get when we let the programmers drive development. Avoid this feature. Yes, it saves you time during the discussion. But it also removes you from the conversation—you are no longer actively listening. You wouldn’t let a robot take over your focus group session just to save time, would you?

Also, AI is not yet perfect. And it needs to be in this case. It’s not a life or death situation, unless you consider the life or death of the research conversation. Even if the AI gets 90% of the interactions correct, there is that 10% that will suck the air right out of your conversation with that participant. If you are using a group setting, other participants will see the mistake and the negative impact becomes exponential.

So just don’t do it. The potential losses greatly outweigh the potential time savings. Besides, actually responding manually forces you to listen and learn—which is what this is all about. Don’t let a robot take your job.

How to Digitally Use “Non-verbal” Cues and Maintain Eye Contact


In the online, text-based world, you certainly can’t maintain eye contact, nor can you provide non-verbal cues to show you are listening. So how do you employ those key principles of listening in an online, text-based world?

Probably the most obvious way is replying to participants’ posts with questions to better understand what they have said or get some clarification on their comment. Yes, I am talking about the same probing questions we lay on participants in focus groups and interviews. These probing questions work just as well online as they do in real life.

To replace those non-verbal cues, I have found it quite effective to comment or ask questions even when there is no need to do so. The idea is that by just saying something, participants recognize that you are there and you are reading what they are posting: you are listening.

Sometimes it is easy to just copy and paste the same general comment to several participants when you do this. If the participants can’t see one another, this is fine and saves you time. But if they can see each other, it just makes you look like a robot.

When making comments just to show your presence, it’s important not to require a reply; many platforms offer this as an option. I like to thank people for providing quality detail or for an interesting take on the topic. The important thing is to personalize it a bit, to keep it from sounding generic.

Another way to show you are listening is to use the messaging app within the platform to hold meta conversations outside the actual discussion. I make it a point to send reminders at set times as well as thank-yous at the end of the day of discussion.

These messages don’t have to be just logistical in nature. You can also use them to show you are listening. Sometimes I will include a comment about some of the discussion—an insight that came through for the whole group of participants, or sometimes personalizing it to a specific participant.

In the end, listening is important to successful qual, whether you are in the same room as the participant or interacting digitally. It’s just how you listen, and how you show that you are listening, that can take a little adjustment in the digital qual world. But it’s no less important and no less doable.

Author Bio:

Ted Kendall is the founder of TripleScoop, a boutique research agency that focuses on online qualitative research. Ted got to this place in his career by being in the right place at the right time to pioneer early online methods. He was a co-founder of QualTalk, which became 20/20 Research’s QualBoards. He learned how to moderate online qual through trial and error and has moderated hundreds of online qual discussions and interviews since that first one in 1997. And he is usually a good listener.

LinkedIn: www.linkedin.com/in/triplescoopted

Tags:  Listening  Online Listening  QRCA Digest  Qualitative  Qualitative Research  Research Methodology 

 