How to combine qualitative and quantitative insights for better product decisions

Elena Borisova
4 min read · Dec 19, 2021


While qualitative research is widely discussed and practiced by design teams, quantitative methods often get ignored.

Since a lot of product and design work happens on something that didn’t exist before, this makes sense: no product, no users, no data. But whenever you work on an existing product, taking advantage of the data expands your organization’s understanding of users and gives you more ways to inform and validate designs.

So, how do you combine qualitative and quantitative insights?

Use both qualitative and quantitative research to build a full picture

Consider a scenario: you have a business goal of increasing an e-commerce website’s conversion rate (by improving the user experience). How do you approach it with either research method?

There are at least two moments in the product design process where research helps:

  • To discover user problems and business opportunities
  • To validate your solutions — whatever you came up with to address whatever you discovered

On top of that, both research methods are crucial for understanding users: defining behavior archetypes, jobs to be done, you name it. But that’s a topic for a separate conversation.

Insights flywheel: learnings from validation will feed into later discovery

Part one: Discover user problems

The first idea that comes to mind is to identify what is not working well with the current product.

The quantitative approach would look like this:

  • Look at data (e.g., Google Analytics) to see where people drop off in the funnel, bounce, or (don’t) interact with specific elements (see the sketch after this list).
  • Check frequent errors (e.g., which input fields do users skip in a checkout form?).
  • Look at behavior analytics tools like Hotjar for unexpected behaviors: check heatmaps and watch session recordings.
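
To make the funnel check concrete, here’s a minimal sketch in Python. The step names, counts, and export format are invented for illustration; the only assumption is that your analytics tool lets you pull per-step event counts.

```python
# A minimal sketch: per-step event counts exported from an analytics tool
# (e.g., Google Analytics). The step names and numbers are made up.
funnel = {
    "product_page": 12_400,
    "add_to_cart": 3_100,
    "checkout_start": 1_900,
    "payment": 1_250,
    "order_confirmed": 980,
}

# Walk the funnel pairwise and report how many users continue vs. drop off.
steps = list(funnel.items())
for (prev_step, prev_count), (step, count) in zip(steps, steps[1:]):
    kept = count / prev_count if prev_count else 0.0
    print(f"{prev_step} -> {step}: {kept:.1%} continue, {1 - kept:.1%} drop off")
```

The step with the steepest drop-off is usually a good place to point your qualitative follow-up (usability tests, session recordings).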

And qualitative:

  • Run a usability test to see where people are struggling with typical tasks

Another good question to ask: which user needs does the current implementation not address at all?

  • Data is less useful for answering this because it’s an open question; you have to fish for unexpected behaviors (e.g., a channel shift during weekends).
  • User research has way more to offer here. Organize generative interviews to ask users how they approach X (from the narrower “how do they buy groceries” to the broader “how do they make sure they’ll have something to eat”), or run even more comprehensive diary studies.

Part two: Validate your solutions

You found some issues and created some solutions to address them. But how do you know if they work? Do they actually improve the user experience?

Either research approach can help answer that.

  • Quantitative: A/B testing will tell you the bottom line: this works¹ for more users than before. Most of the time, though, it’s hard to say why a solution failed. A/B testing also requires development time to implement the solution (a minimal significance-check sketch follows this list).
  • Qualitative: Usability testing will help discover fundamental issues but won’t cover edge cases or performance problems. On the other hand, it doesn’t necessarily require development time. Keep in mind that while it’s easy to invalidate something, validating is difficult: the fact that research participants were happy with the product and successfully completed the given tasks doesn’t mean the bottom line improves for everybody, or even for the majority. Plus, user recruitment can be complicated and time-consuming, depending on your product.
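
As a rough sketch of what the quantitative side looks like under the hood, here is a two-proportion z-test on conversion counts, the kind of significance check many A/B testing tools run. The function name and the numbers are mine, for illustration only; it tells you whether the variant moved the metric, not why.

```python
from math import sqrt, erf

def two_proportion_z_test(conversions_a, visitors_a, conversions_b, visitors_b):
    """Two-sided z-test for the difference between two conversion rates."""
    p_a = conversions_a / visitors_a
    p_b = conversions_b / visitors_b
    # Pooled conversion rate under the null hypothesis (no difference)
    pooled = (conversions_a + conversions_b) / (visitors_a + visitors_b)
    se = sqrt(pooled * (1 - pooled) * (1 / visitors_a + 1 / visitors_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal distribution
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return p_a, p_b, z, p_value

# Illustrative numbers only: 980 of 21,000 control visitors converted,
# 1,080 of 20,800 variant visitors converted.
p_a, p_b, z, p = two_proportion_z_test(980, 21_000, 1_080, 20_800)
print(f"control {p_a:.2%}, variant {p_b:.2%}, z = {z:.2f}, p = {p:.3f}")
```

A low p-value only says the difference is unlikely to be noise; whether the metric is a good proxy for user experience is a separate question (see the footnote below).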

To sum it up

Quantitative and qualitative research

You’ll need your product to be live and have reasonable traffic for quantitative research.

Qualitative studies take longer to set up, and synthesizing observations into insights you can bring to teams also takes time. Don’t underestimate the complexity of user recruitment either: the more specific or niche the topic you cover, the more difficult it gets. If you design a product for NASA astronauts, good luck recruiting your users.

If you’re lucky enough to have access to both users and data, leverage this opportunity. Here’s the User insights checklist I use when I start on a new project.

[1] Given that conversion, or whatever metric you’re looking at, is a good proxy for user experience. A lot of the time it’s not: people can still buy from a specific e-commerce website despite a horrible experience if, for example, it’s cheaper.


Elena Borisova

Combining data and psychology for product and design decision-making | Head of Design at DeepL | elenaborisova.com