Master questions on study design, qualitative and quantitative methods, insight synthesis, stakeholder influence, and research exercises — with frameworks and a full study walkthrough.
UX researcher interviews test your ability to design rigorous studies, extract meaningful insights from messy data, and translate findings into product decisions that wouldn't have happened without your work. Interviewers focus on methodology — choosing the right method, maintaining quality under pressure, and communicating findings that move teams to action.
Most loops include methodology discussion, a case study presentation, a research exercise (designing a study live or critiquing one), and a collaboration round.
A UX researcher designs and conducts studies to understand user behaviour, needs, and motivations. They plan research, recruit participants, collect data through interviews, usability tests, surveys, and analytics, synthesise findings, and present recommendations that shape product decisions.
Generative research explores the problem space — uncovering needs and mental models before a solution exists. Evaluative research tests a specific solution through usability testing, A/B testing, and concept validation. Strong programmes balance both.
An observation is factual: 'Five of eight participants clicked the wrong button.' An insight is interpretive: 'Users expect the primary action in the bottom-right corner because that pattern is established elsewhere.' Insights drive decisions; observations alone do not.
Can you design rigorous studies that answer the right questions — not just run comfortable methods?
Goal — What was the research question and why did it matter to the business?
Method choice — Why this method over alternatives? What trade-offs did you consider?
Participants — Who did you recruit, how many, and why that sample?
Procedure — How did you structure the study — discussion guide, tasks, stimuli?
Analysis — How did you move from raw data to findings?
Impact — What decision did the research influence and what was the outcome?
Generative research: Understanding the problem space before solutions exist. Uncovering needs, mental models, and opportunities.
Evaluative research: Testing whether a specific solution works. Effective usability studies and accurate result interpretation.
Synthesis: Moving from observations to insights that drive action. Where the real analytical work happens.
Research only matters if it changes decisions. Can you influence product direction?
Decision at stake — What was the team about to do or debating?
Research question — What did you set out to learn and why?
Key finding — What did research reveal that changed the equation?
Resistance — Did stakeholders disagree? How did you navigate?
Outcome — What decision was made and what impact did it have?
Most loops include a practical component. Prepare for live exercises and case study presentations.
Interviewers test whether you know when to use each method.
| Method | Best For | Sample | Output |
|---|---|---|---|
| User Interviews | Understanding needs, motivations, mental models | 6-12 participants | Themes, user needs, mental model maps |
| Contextual Inquiry | Observing real behaviour in natural environments | 6-10 participants | Workflow maps, environmental insights, workarounds |
| Diary Study | Longitudinal behaviour, habits, emotional context | 10-20 over 1-4 weeks | Behaviour patterns, frequency data, triggers |
| Usability Testing (Moderated) | Testing specific interactions with probing | 5-8 participants | Task success, severity-rated issues, recommendations |
| Usability Testing (Unmoderated) | Scale testing quickly | 15-50 participants | Task success metrics, click paths, time-on-task |
| Concept Testing | Validating ideas before building | 6-10 participants | Comprehension, perceived value, comparative preference |
| Card Sorting | Information architecture and categorisation | 15-30 participants | Category structures, labelling, similarity matrices |
| Survey | Measuring attitudes or behaviour at scale | 100-1000+ responses | Quantitative distributions, segmentation, NPS/CSAT |
| A/B Testing | Comparing measurable impact of variants | Statistically significant sample | Conversion rates, confidence intervals, causal evidence |
| Analytics Review | Behavioural patterns at scale | Entire user base | Funnel metrics, engagement patterns, adoption rates |
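The A/B testing row above calls for a statistically significant sample. As a refresher for quant-leaning rounds, here is a minimal sketch of the pooled two-proportion z-test commonly used to read a finished A/B test; the function name and all numbers are hypothetical, not from a real study:

```python
from statistics import NormalDist

def two_proportion_ztest(success_a, n_a, success_b, n_b):
    """Pooled two-proportion z-test; returns (z, two-sided p-value)."""
    p_a, p_b = success_a / n_a, success_b / n_b
    # Pool the two groups to estimate the shared conversion rate under H0
    p_pool = (success_a + success_b) / (n_a + n_b)
    se = (p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b)) ** 0.5
    z = (p_a - p_b) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    return z, p_value

# Hypothetical result: control converts 90/1000, variant 120/1000
z, p = two_proportion_ztest(90, 1000, 120, 1000)
# z ≈ -2.2, p ≈ 0.03 — significant at the 5% level
```

Interviewers rarely ask for the formula itself, but being able to explain pooling and why the test is two-sided signals statistical literacy.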
These roles share analytical skills but differ in methods, outputs, and position in the product process.
UX Researcher
Focus: Understanding user behaviour through qualitative and mixed-methods research
Key skills: Qualitative research, usability testing, interview technique, synthesis, storytelling, stakeholder communication
Interview focus: Study design, method selection, synthesis, stakeholder influence, research exercises
Product Designer
Focus: End-to-end product design using research as input
Key skills: Interaction design, visual design, prototyping, lighter-depth research, systems thinking
Interview focus: Design process, portfolio, exercises, collaboration, measuring impact
Quantitative UX Researcher
Focus: Quantitative methods — surveys, experiments, log analysis at scale
Key skills: Statistics, survey design, experiment design, SQL, R/Python, A/B testing analysis
Interview focus: Survey design, statistical analysis, experiment design, sampling, effect sizes
Product Analyst / Data Analyst
Focus: Analysing existing behavioural data to understand product usage at scale
Key skills: SQL, analytics platforms, data visualisation, statistical analysis, metric frameworks
Interview focus: Metric definition, funnel analysis, cohort analysis, experiment analysis
B2B SaaS project management tool with 45% enterprise trial abandonment during onboarding.
Research plan — Goal: understand why 45% of enterprise trial users abandon onboarding. Questions: at which step, what expectations are violated, what do they do instead? Method: 10 moderated interviews + analytics review. Convince the PM that analytics shows where users drop off but not why.
Recruitment — Screen: enterprise trials (10+ seats), past 30 days, completed <3 of 7 steps. Exclude never-logged-in users. Mix: 4 admins, 4 individual contributors, 2 independent signups. Over-recruit to 14.
Study execution — 45-min sessions: context (5 min), expectation mapping (10 min), walkthrough of actual onboarding with think-aloud (20 min), alternatives explored (5 min), ideal state (5 min). PM and designer observe 3-4 sessions live.
Synthesis — Affinity diagramming with designer. Three insights: (1) Admins abandon at team invitation because permissions aren't configured yet. (2) Invited users see admin-focused onboarding irrelevant to them. (3) Both groups want to see value before 20 minutes of configuration.
Recommendations — Split into admin vs team member tracks. Reorder admin flow: show value first via sample data import. Make onboarding progressive with skip-and-return. PM champions redesign after observing sessions.
Impact — Onboarding completion: 55% → 78%. Trial-to-paid conversion improves 23% — largest conversion improvement in product history. Study documented in research repository.
Methodological rigour: Choose methods deliberately based on the question, not comfort zone. Explain why and what alternatives.
Insight quality: Move beyond observations to interpretive insights that drive action. Depth of thinking is the differentiator.
Influence & impact: Has your research actually changed decisions? Concrete examples of shifting team direction.
Communication skills: Tell the research story compellingly. Case study presentation quality reveals core communication ability.
Collaboration mindset: A partner to designers and PMs, not a gatekeeper of user truth. Bring the team along.
At minimum: user interviews, usability testing (moderated + unmoderated), surveys, and analytics interpretation. Senior roles: also contextual inquiry, diary studies, card sorting, concept testing, and mixed-methods design.
Almost certainly for mid/senior. Common: live study design (30-45 min), portfolio deep-dive with methodology challenges, research critique, or synthesis exercise with raw data. Practice designing plans for products you use.
Impact includes: changing what was built, reframing the problem, informing the future roadmap, or surfacing risks the team then accepted with full information. Reflect on why findings didn't land and how you adjusted. Self-awareness matters.
15-20 min: context (2 min), research question (2 min), method and rationale (3 min), key findings with evidence (5 min), impact (3 min), reflection (2 min). Practice the story; weight findings and impact most heavily, as the timings suggest.
Research-adjacent degrees (psychology, HCI, cognitive science, anthropology) provide methodological training. Many successful UXRs transition from other fields. Portfolio and demonstrated skill matter most.
Startups: breadth, speed, lean research, scrappy recruitment, multiple hats. Big tech: depth, rigour, large-scale mixed-methods, complex org dynamics, 4-6 structured rounds, statistical sophistication.
Increasingly yes. Minimum: survey design, basic descriptive statistics, analytics dashboards, A/B testing methodology. Quant UXR roles need: experimental design, inferential statistics, SQL, R/Python.
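Part of "A/B testing methodology" is knowing how much traffic a test needs before it launches. A minimal stdlib-only sketch of the standard per-arm sample-size formula for a two-sided two-proportion test; the function name and the baseline/lift numbers are illustrative assumptions:

```python
from math import ceil
from statistics import NormalDist

def sample_size_per_arm(p_base, mde, alpha=0.05, power=0.8):
    """Per-arm n to detect an absolute lift `mde` over baseline
    conversion `p_base` with a two-sided test."""
    nd = NormalDist()
    z_alpha = nd.inv_cdf(1 - alpha / 2)   # ≈ 1.96 for alpha = 0.05
    z_beta = nd.inv_cdf(power)            # ≈ 0.84 for power = 0.8
    p_alt = p_base + mde
    # Sum of the Bernoulli variances under baseline and alternative
    variance = p_base * (1 - p_base) + p_alt * (1 - p_alt)
    return ceil((z_alpha + z_beta) ** 2 * variance / mde ** 2)

# e.g. 10% baseline conversion, detecting a 2-point absolute lift
n = sample_size_per_arm(0.10, 0.02)
# → roughly 3,800-3,900 users per arm
```

The practical takeaway for interviews: small lifts on low baseline rates demand thousands of users per arm, which is why analytics and A/B tests complement rather than replace small-sample qualitative work.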
Core: user testing platform (UserTesting, Maze), survey tool (Qualtrics, Typeform), synthesis tool (Dovetail, Miro), video conferencing. Helpful: analytics (Amplitude, Mixpanel), Optimal Workshop, Figma.
Under 2 minutes. Research career narrative: what drew you to understanding users, what shaped your methodology, what you're passionate about, why this company. Highlight one signature strength. Don't list every job.
Describing what you did without explaining why. 'I conducted 8 interviews' describes process. 'I chose interviews because we needed to understand emotional barriers, not measure frequencies' shows methodological reasoning. Always connect to impact.