Qual Power

How Can Voice AI Help Qualitative Researchers?

Posted By Kay Corry Aubrey, Tuesday, July 23, 2019
Updated: Tuesday, July 23, 2019


Within three years, 50% of web searches will be done by voice. Today almost one in four US households has access to a smart speaker such as Google Home or Amazon Alexa. Consumers are adopting voice technology faster than any other technology, including smartphones. Very soon voice artificial intelligence (AI) will be so embedded in our everyday lives that we may not even notice it anymore. How can qualitative researchers leverage this powerful trend?

For inspiration I spoke with four experts who are doing cool things with voice technology. As consumers shift toward using their voices rather than their fingers to interact with technology and the Internet, the unique ways these experts apply voice AI offer a preview of how it might transform our work as researchers.

 

The Rise of the Talking Survey

Greg Hedges has had great success with voice-based surveys delivered through virtual assistants such as Siri, Alexa, and Google Assistant. According to him, “It’s like launching a focus group of one. People are interacting where they are most comfortable, in their own homes, using their own words. We’ve found that people are more spontaneous and natural when they talk vs. when they type.” Greg’s company also helps organizations integrate voice branding into their digital marketing ecosystem. Part of that expertise is redesigning a client’s SEO strategy to be phrase- and question-based (vs. keyword-based) to accommodate voice searches.

 

Ask Your Digital Twin to Narrate Your Next Report

Domhnaill Hernon collaborates with artists to explore the deep connections between technology and human potential. He worked with Reeps One, a beatboxer, who fed hours of his audio recordings into Nokia’s AI system. To their astonishment, the system returned new melodies he had never performed, yet they sounded just like him. Rather than feeling threatened, the artist embraced the capability and now incorporates AI-generated tunes into his work. Soon this technology will be widely available, and you’ll be able to produce reports in your own voice that clients can listen to just like a podcast.

It’s hard to imagine how voice technology – and AI in general – will change our world. Technology is always a double-edged sword. On one hand, AI could be used to cure disease, make societies more efficient, and redistribute wealth so humans everywhere prosper. On the other, it might lead to a hardening of social classes and a surveillance state. In a recent episode of 60 Minutes, AI expert Kai-Fu Lee said that 40% of jobs will be eliminated within 15 years due to artificial intelligence. To empower ourselves we need to understand what AI is, how it works, and what its capabilities and limitations are.

 

How Voice AI Works

As with any artificial intelligence, voice technology relies on two things: access to a huge pool of data and algorithms that look for patterns within that data. For voice, those pattern-finding techniques fall under Natural Language Processing (NLP). The result can only be as good as the data that are fed into the machine. Today in North America, voice assistants (VAs) are 95% accurate if the person speaking is a white, native-born man, 80% accurate if it’s a woman, and as low as 50% accurate if the person has an accent. This is because of the socially limited group of people who contribute their data by using voice assistants: VA users tend to be early adopters, white, and highly educated.

Jen Heape notes, “Natural Language Processing (NLP) cannot deal reliably with anyone who is not a white male, and this is deeply problematic, which is why Google and Amazon are giving away so much for free so they can collect more complete samples.”
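To make that data-skew point concrete, here is a minimal sketch in Python of the kind of per-group accuracy audit that surfaces such gaps. The predict function and the record fields are hypothetical stand-ins for a real speech-recognition model and a labeled test set, not any vendor's actual tooling.

```python
from collections import defaultdict

def per_group_accuracy(records, predict):
    """Compute recognition accuracy separately for each speaker group.

    records: iterable of dicts with 'audio', 'transcript', and 'speaker_group'
             keys (a hypothetical labeled evaluation set).
    predict: a function mapping audio to a transcript (a hypothetical model).
    """
    correct = defaultdict(int)
    total = defaultdict(int)
    for rec in records:
        group = rec["speaker_group"]
        total[group] += 1
        if predict(rec["audio"]) == rec["transcript"]:
            correct[group] += 1
    # Accuracy per group shows whether the model serves some speakers worse.
    return {group: correct[group] / total[group] for group in total}
```

If the evaluation set itself is drawn mostly from early adopters, even an audit like this will understate the gap, which is the underlying point here.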

The algorithms behind NLP leverage fixed rules of language around syntax, grammar, and semantics. The algorithm can be taught “if they say this, say that,” and the machine learns the pattern. This capability allows the virtual assistant to process simple, prescriptive (but useful) commands such as “turn on the lights,” “play NPR,” or “order more lettuce,” because the technology has learned the vocabulary and structure of English sentences.
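As an illustration of that “if they say this, say that” pattern, here is a minimal sketch of a rule-based command matcher in Python. The phrasings, intent names, and slots are invented for illustration; they are not any assistant vendor's actual rules.

```python
import re

# Hypothetical command rules: each pattern maps a fixed phrasing to an intent.
INTENT_RULES = [
    (re.compile(r"\bturn (?P<state>on|off) the (?P<device>lights|tv)\b"), "smart_home"),
    (re.compile(r"\bplay (?P<station>npr|jazz|the news)\b"), "play_media"),
    (re.compile(r"\border (more )?(?P<item>lettuce|milk|coffee)\b"), "add_to_cart"),
]

def match_intent(utterance):
    """Return (intent, slots) for the first rule that matches, else (None, {})."""
    text = utterance.lower().strip()
    for pattern, intent in INTENT_RULES:
        hit = pattern.search(text)
        if hit:
            return intent, {k: v for k, v in hit.groupdict().items() if v}
    return None, {}

if __name__ == "__main__":
    for phrase in ["Turn on the lights", "Play NPR", "Order more lettuce", "How was your day?"]:
        print(phrase, "->", match_intent(phrase))
```

The brittleness is the point: anything outside the scripted patterns, like the open-ended question in the last example, falls straight through.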

 

Can a Machine Be Conversational?

However, voice technology is still very much in its infancy. The machine has no concept of culture or social inferences. As Heape noted, “If I were to say ‘The kids just got out of school’ and the listener is in the same time zone, they’d know it’s 3 or 3:30. However, the voice technology would not be able to infer this because it lacks the data.”

Freddie Feldman leads a voice design team that creates chatbots and conversational interfaces for medical environments. According to Feldman, chatbots and voice technology in general are helpful in medical settings for getting quick answers to predictable questions. “But for anything more crucial, dynamic, or that requires understanding the other person’s psychology, you’ll need to call someone in the end.” In theory, it’s possible that voice technology will take on deeper human characteristics one day. “The technology is there. It’s just a question of someone piecing it together.”

It’s hard to imagine any machine being able to understand and integrate all the rich signals we send and receive in a conversation: the look on a person’s face, the tone of their voice, their diction, their physical posture, our perception of anger and pleasure, or what they are thinking. These elements are as essential to meaning and human connection as the words themselves. As Heape said, “VAs will never replace the human. There will always be a human pulling the lever. We decide what the machine needs to learn. VAs will remove the arduous elements. But we need a human to interpret the results and analyze it. We’re still so much at the beginning of it — we have not fed the machine.”

My feeling is there will be abundant opportunities for qualitative researchers, but – first – we need to understand the beast and what it cannot do.

 

Learn More about Artificial Intelligence and Voice Technology

Thomas H. Davenport and Rajeev Ronanki, “Artificial Intelligence for the Real World: Don’t Start with Moonshots,” Harvard Business Review, January-February 2018 (free download).

Joanna Penn, “9 Ways That Artificial Intelligence (AI) Will Disrupt Authors And The Publishing Industry”, Creative Penn Podcast #437, July 2019.

Oz Woloshyn and Karah Preiss, Sleepwalkers podcast on iHeartRadio.

Voice 2019 Summit, New Jersey Institute of Technology, July 22 – 25.

 

Acknowledgements

Thank you to the experts I spoke with while researching this post:

  • Freddie Feldman, Voice Design Director at Wolters Kluwer Health
  • Jen Heape, Co-founder of Vixen Labs
  • Greg Hedges, VP of Emerging Experiences at RAIN agency
  • Domhnaill Hernon, Head of Experiments in Art and Technology at Nokia Bell Labs.

 

About the Author

Kay Corry Aubrey is a User Experience consultant and trainer who shows her customers how to make their products more easily understandable to ordinary people through usability testing and in-home studies. For the past few years she’s focused on products and services that improve older people’s lives, helping them remain independent and in their homes. Kay sees great potential in voice-enabled products geared towards older folks. Her clients include Pillo Health, Stanley Black and Decker Futures, and the Centers for Medicare and Medicaid Services. Kay is the Luminaries Editor for the QRCA VIEWS magazine and a RIVA-certified Master Moderator and Trainer.

Website: www.UsabilityResources.net

LinkedIn: https://www.linkedin.com/in/kaycorryaubrey/

Tags:  AI  data  QRCA Digest  Research Methodologies 


Annual Conference Reporter on the Scene: Using AI for Quantitative Analysis of Qualitative Data

Posted By Michelle Finzel, Thursday, February 28, 2019
Updated: Wednesday, February 27, 2019

Summary:
At the 2019 QRCA Annual Conference, Shamaa Ahmed and Cal Zemelman from Customer Value Partners gave us a snapshot of the process of using machine learning, a form of Artificial Intelligence (AI), to automate the analysis of large amounts of qualitative data. Cal walked through using AI to summarize the data and assess the emotional state of the respondent through natural language processing. He also gave all of us an opportunity to organize the AI output into tables and graphs to discover themes.

Key Takeaways:
Through experiencing this process, I discovered that I was able to rapidly classify responses into sentiment buckets and easily identify outliers for more focused review and analysis. I really like that you can create cool charts for clients (who always want graphics) and that you can continuously train the computer model to improve it. I was shocked at how easy some of these platforms are to learn and use, that most of them are inexpensive or even free, and that it only takes about 100 responses to train a model.
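For readers curious what that workflow looks like in code, here is a minimal sketch, assuming scikit-learn is installed, of training a small sentiment classifier on roughly 100 hand-labeled open-ended responses and then bucketing the rest for charting. The function name, labels, and data shapes are hypothetical choices for illustration, not the specific platform the presenters used.

```python
from collections import Counter

from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

def bucket_responses(labeled, unlabeled):
    """Train on ~100 (response_text, sentiment_label) pairs, then bucket the rest.

    Returns the per-response predictions and a count of each sentiment bucket,
    which is what you would feed into a chart for the client.
    """
    texts, labels = zip(*labeled)
    model = make_pipeline(
        TfidfVectorizer(ngram_range=(1, 2)),   # word and two-word features
        LogisticRegression(max_iter=1000),     # simple, fast sentiment classifier
    )
    model.fit(list(texts), list(labels))
    predictions = model.predict(list(unlabeled))
    return list(zip(unlabeled, predictions)), Counter(predictions)
```

The Counter gives you the sentiment-bucket bar chart, and any response whose predicted bucket looks off is a good candidate for the closer, human read the session recommended.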

Putting it into practice:
I was really excited to learn about using AI in my practice, especially since it seems like this is the direction our industry is heading! Now that I know that platforms and models are relatively inexpensive, I plan to learn how to program a model for my own research.

A-ha moment:
I always thought, like many of those in our industry do, that AI was something that would be beyond my comfort zone, but I am thrilled to have found out how accessible and easy to learn the platforms and models are and can’t wait to put them into practice. This is the beginning of something and I am intrigued to follow the process of its development!

QRCA Reporter on the Scene:

Michelle Finzel
Maryland Marketing Source, Inc.
Twitter: @MichelleFinzel

Tags:  AI  Artificial Intelligence  QRCA Annual Conference  QRCA Reporter on the Scene  qualitative research 
