AI is very effective at processing large volumes of data but still requires human guidance in its application. AI Product Designer Ioana Teleanu introduces the two main types of AI research tools, insight generators and collaborators, and shows how to apply them in UX research.
As Ioana explains, the primary goal of insight generators is to provide concise, informative summaries of user research sessions. They analyze transcripts in isolation and don't take any additional information into account, such as context, past research, or background details about the product or users. As a result, insight generators can't interpret the complete picture of user interactions and experiences.
Collaborators are more advanced—they can be trained with human-generated interpretations and context, including research goals, questions, and product background. Collaborators can recommend thematic analysis tags and generate insights based on the transcripts and contextual data. They can also analyze researchers' notes to create more nuanced themes and insights. However, collaborators still have difficulty handling visual data, face issues with citation and validation, and can introduce bias into research results.
Bias in AI can come from training data (systematic bias), data collection (statistical bias), algorithms (computational bias), or human interactions (human bias). To reduce bias, use diverse and representative data, test and audit AI systems, and provide clear guidelines for ethical use, with the aim of fair, unbiased AI decisions that benefit everyone.
The Take Away
AI research tools have the capacity to reduce cognitive load, support decision-making by processing large volumes of data, automate tasks like image formatting and text resizing, offer deeper insights into human behavior and usage patterns, create prototypes and various visual assets, and excel in pinpointing usability issues.
There are two types of AI research tools: insight generators and collaborators. Insight generators summarize user research sessions by analyzing transcripts but lack the ability to consider additional context, which limits their understanding of user interactions and experiences. Collaborators provide more context-aware insights through researcher input, but they still struggle with visual data, citation, validation, and potential biases.
To overcome these tools’ limitations, you must exercise caution, maintain human oversight, critically evaluate outputs, be aware of potential biases, and use AI as a supplementary, not sole, decision-making source.
References and Where to Learn More
Read the NNG article that Ioana mentions in the video: AI-Powered Tools for UX Research: Issues and Limitations.
Discover more about bias in the article Towards a Standard for Identifying and Managing Bias in Artificial Intelligence.
Hero image: © Interaction Design Foundation, CC BY-SA 4.0