
Continuous discovery: A tactical deep dive with Teresa Torres

By Dovetail

Summary

Key takeaways

- **Structure messy discovery with three components**: Discovery has three primary components: know where you're going with an outcome that measures impact, discover customer opportunities (needs, pain points, and desires), and discover solutions that match those opportunities. [02:20], [03:35]
- **Interview weekly for customer reminders**: Product teams need a customer conversation every week to get fast answers and regular reminders that customers are not like us; for example, a non-technical admin asked how to create a link in an email. [06:20], [07:34]
- **Continuous discovery complements project research**: Product teams doing continuous discovery do not replace skilled researchers who handle longitudinal studies, pricing studies, or competitor research; both can coexist, with researchers embedded or supporting multiple teams. [08:51], [10:40]
- **Break solutions into testable assumptions**: Break solution ideas into specific underlying assumptions, like "subscribers want to watch sports on our platform," and test them quickly with one-question surveys instead of prototyping the whole idea. [38:26], [40:24]
- **Frontload interviews, target an opportunity by week one**: In week one of a new outcome, do 3-4 interviews to draft an experience map and the opportunity space, then choose a target opportunity; move fast to solutions in week two to show progress to stakeholders. [48:56], [49:50]
- **Compare multiple ideas to decide**: Test assumptions across multiple solution ideas to compare and contrast; if results are terrible or muddled across all of them, throw the ideas away and start over; a clear frontrunner indicates direction. [46:02], [46:34]

Topics Covered

  • Full Video

Full Transcript

Welcome to Inside Out. Today we are talking about product discovery. It's the beginning of the product development life cycle and perhaps one of the most interesting aspects of developing a new product: finding what the actual problem is that you want to solve and figuring out how you're going to solve it, which inevitably means you have to talk to customers. That's why it is so close to our heart here at Dovetail.

Now, talking to customers is obviously a fantastic thing to do. However, most people we know at Dovetail have told us that it is hard to talk to a lot of customers, that it takes a long time, and that it's very time-consuming, which is part of the reason why our product is all about trying to make that more efficient and effective. Our guest today, Teresa Torres, product coach and author of Continuous Discovery Habits, is also about making product discovery more efficient and effective, and that is what we're going to be talking about today. What I really like about Teresa is that she's a very hands-on person, and I think the title "thought leader" is perhaps not something that you would claim too readily. Is that what you would say, Teresa?

I think nobody should give that title to themselves; that's for the audience to decide.

Right, exactly. Well, I've always liked your hands-on approach. I know a lot about your past, coming from very executional roles, then moving into an executive role, and then deciding that where you really want to be is helping teams get better at things like product discovery. So why don't we get started? Would you be able to give us a quick overview of what a continuous discovery team does week over week?

Yeah, this is a big question, but it doesn't have to be. Discovery can be messy, right? Any research is messy.

There are lots of twists and turns. We might start at point A thinking we're going to get to point B, and we might even get to point B, but we might take 17 left turns on the way there. So one of the things that I like to do is add some structure to a messy process: how do we know we're going in the right direction, how do we keep the goal in our sights? I look at this as three primary components, and there are lots of tactics within those components. First, we have to know where we're going, so we have to have an outcome in mind. What are we trying to accomplish as a team? Hopefully that outcome is something that measures impact and not just an output: we're trying to accomplish this thing that has this impact on our customers' lives in a way that also impacts our business.

The second thing is we have to discover what our customers need. I tend to use the phrase "needs, pain points, and desires," and I collectively call them opportunities. Why is it more complicated than it needs to be? Because it's easy as product people to fall into the trap of just solving problems, but I personally use a lot of products that don't solve problems; in fact, they arguably create problems for me. That's because they satisfy desires. I'm an avid mountain biker. I love my mountain bike; you can't pry it out of my hands. It's not really solving a problem. Maybe it's helping me get more fit, but I would argue doing that at the gym is probably safer. We see this across the product world: needs, pain points, and desires, and we often overlook desires.

The third thing that we need to do in discovery is discover the solutions that address those opportunities, and this is the piece that I think a lot of teams hyperfocus on. We live in solution world, but I think the real power is in how we create a really close match between solutions and opportunities.

So then, you asked what a continuous discovery team is doing week over week. They're always keeping the outcome in mind. I like to use customer interviews as a way to uncover unmet needs, pain points, and desires, or to discover opportunities, and I like to use assumption testing to evaluate solutions and, more importantly, to evaluate the match between a solution and an opportunity. So for me, week over week, I'm probably doing a couple of interviews and a lot of assumption testing.

Right, so a regular week for a continuous discovery team is some interviews and then assumption testing, and we're going to dive into what those things actually mean. Let's start with interviewing. The whole continuous discovery process is, surprise surprise, about being continuous and, another surprise, building a habit. So why interview every week? Why can't teams just talk to customers when they have research questions, a project-based approach?

Yeah. First of all, we're talking about product teams, and by product teams I typically mean product managers, designers, and software engineers: the people building the product. We don't have time. When we have a research question, we don't really have time to ramp up and go recruit someone to talk to. We should in an ideal world, but we don't. So if we wait until we have a question to think about who we should interview, we're not going to do it. Now, there are other teams: if our company has a centralized user research team, they have more time; they're not on this continuous shipping cadence. They have more time to set up a research project, recruit participants, interview around a single topic, come up with a research report, synthesize it, and then communicate it out. That's very different. When we talk about continuous discovery, we're talking about the teams that are building the products, and sometimes those teams have researchers embedded, but typically they don't; in my experience, it's very rare. So we're talking about the teams that are building the product, and we want them to have a feedback loop as they make decisions about what to build. If I'm waiting until I hit the moment where I need feedback, I'm just not going to do it. I have engineers waiting, right? I have a meeting to go to where I'm telling them where I am on the roadmap. I don't have time to say, "Let's stop and recruit." So part of the idea with continuous interviewing is that I want a conversation on the books every week, no matter what, so that I always have an opportunity to get some fast answers to my daily questions.

But more importantly, I'm also always investing in my understanding of my customers. I'm always spending time, and I think this part is really undervalued. I could just assume a lot of things about my customers. I spend all day working on my product; I know it inside and out. It's really easy to start assuming my customers know it inside and out and that they think just like me. Whereas if I just have a simple conversation with a customer every week, I start to see: oh, you're not that familiar with generative AI; oh, you don't really use four different browsers. Interesting, you're not like me at all. So it's good to have those regular reminders. It sounds silly, because of course you can think your way through "I'm more technical than my customer and that means this," but I once had a conversation with an early employee who was not technical at all. She was an admin, and I remember she asked me, "Teresa, how do you put those clickable words in your email?" And I was like, what? She didn't know what a link was. She didn't know how to create a link. She'd seen links, but this is what we need: regular reminders that those of us who build technology are not normal. We need regular exposure to normal people so that we make better decisions about what we're building.

A hundred percent, I couldn't agree more.

I just want to unpack a few things, especially around product teams focusing on this continuous cadence. I know that we've got a lot of researchers in the audience, and I suppose they would be asking: is continuous discovery a replacement for project-based research? Can they coexist? Is this an internecine feud? Is this a hostile takeover?

Yeah, if you read LinkedIn it sounds like a hostile takeover, but don't believe what you read on social media. I don't think that continuous discovery, or teams doing their own discovery, replaces good research done by skilled researchers. I must say that again, because I'm getting criticized by a lot of user researchers for not being vocal enough about this: continuous discovery, product discovery, product teams talking to customers, does not replace skilled researchers doing good research. I think there's some overlap in what we do, but I also think we do fundamentally different things. Here's what a product team is never going to do: they're never going to do a longitudinal study. Never. A product team will never have time for that. A product team will probably never do a very rigorous, diligent pricing study; who has time for that on a product team? We're probably not going to do research on what our prospects think of our competitors' products. I could generate a hundred things like this; there are a million research questions that are not appropriate for a product team but that companies need answers to. So yes, do we still need user researchers? Absolutely.

Now, does that mean user researchers only work on project-based research? Not necessarily. I think project-based research squarely fits in the realm of user researchers, because product teams simply don't have time. I also think there are some research-like activities that squarely fit in the product team's realm that maybe user researchers don't need to be involved in. This is where I'm going to get myself into trouble, because there's lots of gray area here. There are things like really simple assumption tests: if I'm going to launch a one-question survey to test a quick assumption I have, I probably don't need a researcher to help with that activity. If I'm going to have a conversation with a customer who I know uses this feature, to learn a little bit more about how they use it, I probably don't need a researcher. Now, where it gets messy is that there's a whole lot of gray area in between. I've seen researchers embedded on product teams, doing day-to-day discovery with their product manager, designer, and engineer, and it's amazing; it means all of that discovery work gets the benefit of a skilled researcher. I've seen researchers support multiple teams, in which case they're doing project-based research and helping with the most critical discovery questions. And I've seen everything in between. So I don't think there's a right answer here. What I do think is that researchers are not going away; skilled research is still something businesses will need. But each company has to decide how many researchers they're going to hire and what role they're going to play. Each researcher has to decide: am I more interested in project-based research, am I more interested in day-to-day product discovery research, or do I want a mix of both? And then we've got to make sure the researcher is matched to the right team and the right company in the right context.

Ever the pragmatist. And I think that's what gets you in trouble;

perhaps you're just a little bit too honest there. But this is the reality that I've heard time and time again from researchers themselves: there are always far more research questions than there are researchers on hand to answer them. So as long as that's the case, we are always going to be in a position where other roles like product management are going to have to take up the slack. Like you said, they simply cannot wait for these answers, and if we're going to develop and deliver good products that solve real problems for customers, then inevitably product managers are going to have to talk to customers. I think that's fine, and I think having a structure around it like continuous discovery is probably one of the best ways for them to get effective answers.

So, back to continuous interviewing. We're talking about product teams, which obviously involve product managers, designers, engineers, and sometimes UX researchers. They're interviewing every week, but what is the goal of each interview? Can we really get value from one interview each week?

Yes.

Because we're not starting with a research question where we need to talk to a lot of people, look for patterns, and identify themes, these types of interviews are very different from what a typical researcher would do in a project-based world. Think about the goal of the interview as just trying to build empathy: increasing exposure, increasing contact. We see some companies do this by having everybody in the company do support for a day a week, or do support for a couple of weeks when they onboard. That's great, but support is only one view into your customers. First of all, it's a view into a customer who's having a problem, which is great, and we should learn from that, but with support we don't always get the bigger context: what's your goal, what are you trying to accomplish, how does our product fit into your life, show me what you do. Now, I realize a lot of these are things we can learn from project-based research, and when we have that available to us, that's great. But I also think product teams, product managers, designers, and software engineers need to be constantly exposed to who their customers are, the context in which their products are being used, and what they're trying to accomplish. It's not because we're doing research with the goal of coming up with an answer; it's like putting money in the bank. We're learning more and more context about our customers, so as we make decisions about what to build, we have all of that context to draw from.

Yes, and I think when we move on to the assumption testing part of this discussion, that plays an integral part in the weekly goal as well, right, Teresa?

Yeah, so here's the thing: I teach teams one style of interviewing. This came up at an event I was doing yesterday,

and someone asked: if we teach our product teams how to do interviews and run experiments themselves, are we replacing user researchers? Well, if you're a user researcher and you only know how to do one type of interview, maybe. But most user researchers know how to design all sorts of interview questions based on the research question; that's what researchers do. What I teach product teams is not how to do all sorts of user research interviews to answer a wide variety of research questions. I'm teaching them how to do one thing. You're starting with an outcome, and I want to learn the context in which that outcome matters. So if I'm working at Netflix and I'm trying to increase viewing engagement, I want to know what role Netflix plays in your life: when do you watch it, and where? I can literally teach a product team to ask in every single interview, "Tell me about the last time you watched Netflix." Is that always the right question for every research question? No. Is that a great question for a product team just trying to learn how a product fits into their customers' lives? Yes. Now I can play with that: if I'm on the search team, I can say, "Tell me about the last time you chose a new show." If I'm on the mobile team, I can say, "Tell me about the last time you watched Netflix on the go." This isn't rocket science, right? Methods absolutely matter, but it's about getting a little bit of exposure in a reliable way, so I can learn from my customer and build my knowledge bank of how they're not me.

Last time we chatted, you referred to yourself as an empiricist, and this kind of links back into the way you think about how product teams approach questions like this. Is that still something that you think?

Yeah, to some degree. Here's what I'll say:

I think that product teams in particular are trying to change behavior, so the primary thing we need to learn about is actual behavior. This is why I really like story-based interviewing: tell me what you actually did. If we can pair that with "show me what you actually did," even better. Now, there are other types of research where what you think matters and what you feel matters, but for a product team that just needs very specific ways to keep exposure to a customer, I think we can focus on behavior. "Tell me what you did" is going to get you 90% of what you need for those teeny-tiny questions you're trying to answer every day, and we can rely on our user researchers to fill the gaps around feelings, sentiment, and everything else, like why what you told me is different from what you did. There are also questions about your views of the future: are you going to cancel your cable account? I'm going to leave all of that to my user researcher. What I want to know is: okay, you're sitting in your living room and you turn on your TV; tell me about your experience. I think we can teach product teams to do that, and I've seen it for years now. It unlocks so much more context for teams, and they end up making much better product decisions.

So let's talk about recruitment, participant recruitment,

because obviously this is something that can take a long time, and it ends up being quite a big blocker; several products have created a whole cottage industry around just making sure you can get the right participants. So how do you find customers to talk to every week?

Again, this is going to differ from a research project. In a research project, I'm really concerned about variation and sampling and whether you're an outlier. If I'm on a product team, ideally I care about those things, but I don't really have the luxury of caring about them on day one. Now, by day 10, I do care, so let me talk through this before I lose all the researchers in the room. If you're a product manager and you have literally never talked to a customer, I want you to talk to whoever will talk to you. I've seen this over and over again: something magical happens when you talk to your first customer, and I see a lot of teams delay and delay and delay because it's not perfect, it's not the exact right customer. I don't care if you talk to somebody in your family who matches your customer, if it's the first customer you ever talked to. People think they know so much about their customer, and then they have first contact with just another human who's not them, and their mind is blown. So I want to get to that moment as quickly as possible. And if they turn out, which is not likely, but if they turn out to be a weird, random outlier, that's going to come out in conversation number two. And now I might not know who's the outlier, which means I'm going to do conversation number three. So once a team starts talking to customers on a regular basis, I want them to think about: are we talking to people in the same region, are we only talking to people who are super engaged, are we only talking to people who raise their hand and say "talk to me"? And we can start to look at how we recruit for more variation. Are we going to get to a statistical sample? Hopefully I don't have to tell a room full of user researchers that with qualitative research, statistical samples aren't relevant. Do I need variation? Yes. Do I need to make sure that I'm talking to the right types of people? Yes. What's great is we have amazing tools that allow us to do this: tools that let us target the right people, only ask people who have used certain features or have the behaviors we're looking for, and reach a wide variety of geographies. This is not a hard thing for a product team to do anymore, because our tooling is much better than it used to be.

I have quite a few questions here from the audience, so I think I'm just going to jump in and ask a couple now,

because, honestly, we've never had so many questions, Teresa, so hopefully you've got answers for all of these.

Okay, I think this one relates to what we were just talking about: how do you manage skewing assumptions from a few participants without managing a selected sample for confidence in the insight? I think it's about cherry-picking, that idea that you're only talking to a few people. I think we kind of covered it, but do you want to reiterate one more time?

Yeah. One of the things that I encourage teams to do is to automate their recruiting process, and the most common way to do this is to embed an ask in your product that says, "Do you have 20 minutes to talk to us?" There's some bias here: you're only going to get folks who are willing to opt in. But what it does is remove the internal company bias of "I'm going to just reach out to our happiest customers" or "I'm going to just reach out to the customer that my sales rep or my account rep will let me talk to." We can target everyone and let some people opt in. Yes, there's a bias, and some people won't opt in; we can supplement this later with other methods. But I think it really helps: you have a way of reaching a wide variety of customers and then talking to the folks who are most willing. And here's the deal: we're not talking about skilled interviewers here. I don't want somebody who's not that skilled at interviewing to deal with an unwilling participant; that's a train wreck. So in a lot of ways, we want to set our product teams up for success, and I want them talking to willing participants. Again, when our teams have a handful of interviews under their belt, and they're building the habit, getting good at it, and seeing where they still have shortcomings, especially if they have the support of training or a skilled user researcher helping them (I know not all user researchers want to teach; this is why I teach interviewing, so you don't have to), we can start to look at how we introduce more variation and not fall into this problem of outliers. But you know what I'm going to tell you: most of the people who opt in are pretty good people to talk to. In practice, this is not nearly as big of a problem as it seems, and it's because we're not treating what we learn as truth. We're treating it as "this is Shawn's experience; here's what I learned from Shawn." I'm not extrapolating to "this is what our customers think."

Yeah, I found the same thing,

actually: the sort of people who are willing to give you their time and talk to you for 20 minutes or half an hour about the product that you're building for them are usually quite passionate about it. And yes, of course that skews answers; it might be survivorship bias, and you're not going to get the answers from people who just don't give a crap. But to pretend that's not illuminating, and that you can't get good insight into the problems you're solving by talking to a passionate person about the product, is obviously not true either. You're going to get a perhaps unrepresentative view, but you're definitely going to get some interesting and useful information out of it, and I think that's what you're all about, really. A lot of purists have ideological ideas about research, which are correct, I have to say, because there is always a place for rigor, and when we're taking a fully scientific approach we should respect the process and rules around that. However, you're kind of a fan of the Pareto principle, someone who knows where you can get most of the value. Is that an outlook you have?

You know what, I'm really driven by the fact that I just want companies to become more customer-centric. That's really at the heart of this: I really want companies to build better things for their customers. If I thought I could say or write something that would magically convince companies to hire an army of user researchers to go out and do great, rigorous research, and everybody else in the organization would trust that research and act on it, I would say those things. I don't think there's anything I can say that's going to make that happen. I don't think there's anything anybody can say in the short term that will make that happen. Just like any profession (we were here 30 years ago with design), every profession has to mature and get recognition in an organization, and the business case needs to be made for it, and we get there painfully slowly. I understand the pain user researchers are going through as a result of this, especially recently. But here's the reality: until we get to that point, I want to help companies make customer-centric decisions, and part of that solution has to be that we already have people in organizations talking to customers all day, every day: customer success folks, sales folks, support folks, product teams. How do we help them have better conversations? That's my goal.

Yeah, and to what you just said: if we could wave a magic wand and make more companies customer-centric, and increase research headcount, or however it may be achieved, I think everybody in the audience today, and potentially everyone in any product team or research org, would agree that's really the goal. How do we get people to take more notice of insights, use them more, and actually think about the customer when they're building something? This is the goal that we all have. So hopefully, even if you disagree with Teresa, you can see that we're all heading in the same direction.

Can I just add to this? One thing that I see a lot is that people sometimes argue: why do product teams move so fast, why can't we slow down and do our two weeks of research? Think about the number of decisions we're making on a daily basis in our business, across the board, not just your product teams: your finance team, your marketing team, your sales team, your customer success teams. An overwhelming majority of decisions are being made without research. I think most user researchers would agree we're not going to apply research to 100% of those decisions. Some of them are really important decisions, and we should take the time to do good research. But I think there's a sliver where we can say: okay, look, you're going to make 100 decisions. Eighty of them you're going to make off the top of your head, because it's business and that's how business works. Maybe 10 of them we can cover with lightweight research activities, research activities by non-researchers, and maybe 10 of them require good research. What does that do? It gets us 10 more decisions that have some feedback loop in there. That's really where I see the power of this: let's just get feedback on more decisions.

Yes, you're absolutely right.

There's a quote, I think from some Harvard Business School professor, that most decisions, in fact pretty much every decision in business, are made with incomplete information. It's just true; it's the nature of the game. There's too much variability out there; there are too many known unknowns and unknown unknowns, as Rumsfeld said that time, for those of you in the audience old enough to remember. The nature of business is that we're always shooting somewhat in the dark, and if we can just shed some light, if we can illuminate our path a little bit, we should really go out of our way to try to do this. And if we can do it in a way that's fast, efficient, and effective, and doesn't interrupt the flow of business, because ultimately that's really going to be the name of the game, then we should take those paths.

I've got a question here from Sid Chatani. He's asked, and I think it's a good question, how do you go about promoting a culture of continuous

Discovery Well that's a big question um I think uh so first of all Marty kagan's book transform just came out and it's svg's view Silicon Valley Product group's view of

what they see needs to be in place for a successful transformation or adoption of these practices to be in place uh I know not everybody loves Marty Kagan there's

a whole section in this book that I think is spectacular he talks about you need a sea-level executive on board your CEO needs to be on board on some level

you need a product leader on a product executive on board who is willing to coach teams on how to work this way you probably need a design executive and an

engineering executive to be on board it's really hard right here's what I say to individual contributors don't worry

about how your organization works and I tell you this from personal experience I wasted a lot of my full-time employee experience worrying about what everybody else in my organization was doing and it

would have been way smarter to just focus on what I was doing and I think what people underestimate is they have way more agency to work this way than they realize even if you're being asked

to deliver a fixed road map with fixed dates you can still do all of the discovery around Solutions you can still talk to customers and figure out how to make those Solutions better for the right

customers you can still have an outcome mindset so I think it's really easy to get distracted by nobody else in my organization works this way they're

actively telling me I can't a lot of this is a way of working a way of thinking a way of approaching problem solving we all individually can control

how we solve problems and I think that's how change happens I think if we all I think if every product manager designer software engineer user researcher list your favorite title chose to start

working this way tomorrow our organizations would change a lot faster than we think okay so Teresa I have a lot of questions around this so uh when you train teams you suggest that teams document the outcomes or themes of their

conversations I've got uh questions about uh is there a documentation process sorry that was from uh Liz Trimba uh from Josh sorry from Tyler

Hale um we have the question uh what happens after the conversation is over is there a documentation process we've got questions here about uh tools that you

use obviously I'm going to plug Dovetail here uh yeah how do you do it when multiple teams are doing it for the same product any recommendations approaches and tools and uh repository so obviously this is a

Dovetail webinar we'd be remiss not to mention of course that Dovetail is a research repository it is exactly a tool you can use for this uh actually my colleague Colen is here um and we're

going to share a a series of templates that you can use if you want to you can jump in you can try dovetail uh free today um you got Discovery templates and you can check it out for yourself as a

as a documentation process does Dovetail have an interview snapshot template it does yeah yeah we have uh interview templates um I don't know if we have one specifically tailored to

your approach though I think maybe I'll have to talk to you afterwards to get one yeah maybe we'll have to make that happen yeah but from a non-plug perspective

uh Teresa what about the documentation question yeah so again remember we're not framing these interviews as a research project where we need to keep

everything in perpetuity so one of the things I teach teams is this concept of continuous synthesis so we synthesize at multiple levels we

synthesize at the individual level and then we synthesize two different ways across interviews so if we start at the individual interview level I want to see teams capture first of all I teach

story- based interviewing tell me about a time something happened I then teach them to draw that story we're basically drawing an experience map of what happened in that story I do this because

for teams that are not trained in synthesis drawing is a great way to help them see the key moments what actually happened first A then B then C

then D second thing we do is we start to look at where was the friction in the story where were their unmet needs where were their pain points where were their desires and they come up with a list of opportunities now this

is specific to that single interview we're synthesizing one interview in my book I introduced an interview snapshot template that just captures this visually it kind of looks like a Persona

template but here's the deal I don't want non-researchers creating personas I want them focused on I talked to Shawn this is Shawn's story because I think non-researchers can do that so

then they collect a bunch of these they've been interviewing for a quarter let's say they have 12 of them maybe they have 24 of them they're doing two interviews a week I actually recommend that after they do three to four

interviews they do their first round of synthesis across interviews and they do this on two levels they create an experience map that encompasses everything they heard across those

stories so they have individual experience Maps they can look at the moments in those experience maps and create a super experience map that covers the whole experience again I can teach a non- researcher this because

it's really superficial here's the key moments in a customer's Journey but it's emerging from real stories it's not just made up here's what we thought right so then that's one level of synthesis

across interviews what are the key moments across our customers' Journeys and then the second one is what are the common needs pain points and desires that end

up on my opportunity solution tree again could there be way more value out of these interviews could we get into sentiment analysis and um sure I'm trying to get product teams to look at

what is the most actionable thing they learned from their interviews and for a product team that's going to be what's a need I can address what's a pain point I can address and what's a desire I can

address and then they do that every three or four interviews they continue to update that common experience map and they continue to update their opportunity solution tree and because they're doing that continuously they're

never stopping and doing theme analysis or Affinity mapping they're never creating a research deck because they don't have a guiding research question they're just understanding their

customer context and it's fueling where do we want to play and they of course can combine that with research they're getting from the rest of the

organization um I'm going to jump back uh I I love the idea of experience uh mapping uh and this is something that we discussed last time is it is it related how or how closely related is it to sort

of assumption testing yeah mapping language is so messy uh okay let me talk about terms when I say experience mapping a lot of

people might think it's a customer Journey map or it might even be a solution story map here's how I distinguish between those three things a story map is an experience map of how

somebody is using a product right a customer Journey map for a lot of companies they Define it as these are all the touch points the customer has with their company whether it's through the product or the success team or the

sales team but it's the customer Journey with the whole company when I say experience map I'm trying to divorce it from the product or the company I just want to learn about Shawn regardless of

whether Shawn engages with my company whether he uses my product now he opted in through my product so he probably uses my product but I want the scope to

be about Shawn and Shawn's needs and not about the product and what you did in my product so that's why I use the experience map language to try to get

product teams to think broader than hey look at my shiny solution it's not about your solution it's we're here to learn about our customer okay
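
As a loose illustration of the continuous synthesis Teresa describes (one snapshot per interview, then cross-interview synthesis every three or four conversations), here is a minimal Python sketch; the data model and the example data are invented for illustration, not something prescribed in her book:

```python
from collections import Counter
from dataclasses import dataclass

@dataclass
class InterviewSnapshot:
    """Individual-level synthesis: one person, one story (not a persona)."""
    participant: str          # e.g. "Shawn": this is Shawn's story
    story_moments: list[str]  # ordered moments drawn on that interview's experience map
    opportunities: list[str]  # needs, pain points, and desires heard in this story

def common_opportunities(snapshots: list[InterviewSnapshot], min_count: int = 2) -> list[str]:
    """Cross-interview synthesis: opportunities that recur across stories.

    Rerun this every three or four interviews; the output feeds the
    opportunity solution tree as candidate target opportunities.
    """
    counts = Counter(opp for snap in snapshots for opp in snap.opportunities)
    return [opp for opp, n in counts.items() if n >= min_count]

shawn = InterviewSnapshot("Shawn", ["found the game", "watched", "score got spoiled"],
                          ["avoid spoiling the final score"])
mia = InterviewSnapshot("Mia", ["searched for the game", "gave up"],
                        ["hard to find the game", "avoid spoiling the final score"])
print(common_opportunities([shawn, mia]))  # ['avoid spoiling the final score']
```

the point mirrors what is said above: nothing is kept in perpetuity, the living artifacts are just the recurring moments and the recurring opportunities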

um thinking about the interviews we're still like talking about doing continuous interviews every week um and you know synthesizing them sort of en masse every you know maybe four or

five that you get um and and sort of surfacing those experiences or those kind of like the collective most important nodes on a in a in a kind of

experience map um what about prototypes should should people bring prototypes to their interviews yeah so this is where we're going to get into assumption

testing so the way that I Define interviewing in the like habit sense I'm looking at that activity with the goal of finding opportunities where do I have

unmet needs pain points and desires once we move into evaluating Solutions I want to get my teams focused on assumptions and testing very specific assumptions

and this is by Design because too many teams think about testing ideas from a project mindset we come up with an idea we come up with one idea we prototype

the whole idea which means our designer does all of the design work up front before we get any feedback on if we're on the right track and then we run really long

essentially usability studies to get feedback on do we build the right thing and the thing is a usability study doesn't really test desirability it certainly doesn't test viability or

feasibility or ethical assumptions right and so and then also I don't want to do all the design work before I know it's the right solution and I don't want to work with one solution at a time so I

encourage product teams to work with multiple ideas that solve the same need we're setting up a compare and contrast decision and then we're going to take our ideas and break them down into their

underlying assumptions so this is a common idea because it was popularized by The Lean Startup and David Bland does a ton of work in this space but I think it's still really

misunderstood like I hear from product teams every day that say I don't know how to test my assumption I go okay well what's your assumption and they say my customer will want my solution okay that's not an assumption I

mean sure it's an assumption but it's not a testable assumption the only way to test that is to build it and to see if your customers want it right so the

heart of identifying assumptions is we have to get really specific let's say that I'm going to use the example I used in my book we work at Netflix we're considering adding

live sports actually Netflix is now doing this I feel like my book might have inspired them or the millions of sports fans around the world probably inspired them right so we're

evaluating should we um integrate live sports into our platform I can generate first of all I can generate assumptions just around that opportunity I want to watch live sports

uh what sports should we include uh do they need to be live do you want to watch them on demand do you want to pause and rewind like there's a hundred million questions that

come up as we start to think through a solution so one of the things that I teach is how do you take an idea get really specific about what it means use those specifics to generate

underlying assumptions like actually our subscribers want to watch sports they want to watch sports on our platform they understand that we offer Sports

they know how to find sports on our platform we can create a good viewing experience for sports I know that lots of sports streaming companies create terrible experiences I'm a hockey

fan it goes into overtime if you tell me what time the game ended you just ruined overtime for me 90 % of sports platforms do this it drives me nuts right so

there's 100,000 little things in there these little assumptions I like to think of as building blocks where if even one is faulty the whole solution is

going to fall apart right so the value of breaking your Solutions down into underlying assumptions is I can test assumptions really quickly I don't have to do most of the design work I can just

like if I want to know are my subscribers Sports fans I can embed in my product a one question survey that just says have you watched a sporting event in the last

week now here's the challenge and I saw this come up in the chat the average product manager isn't going to ask that question they're going to ask do you watch sports and they're going to get garbage data because every human has watched a sporting event and every human is going

to say yes which is why I think we need to train product teams on how to do this well I'm not suggesting the average Joe Schmo is going to run a good one question survey but it's not very hard

to ask about specific behavior in the past and time box it people can follow these simple rules right so I can learn in like an hour or two if I'm Netflix by just embedding on my website

have you watched a sporting event in the last week and then I like to pair this I call it a one-two punch I want to ask you one question because that's going to get you to click on it and reply if it's multiple questions you're going to be like I don't have time for that but if

you say yes you watch a sporting event I'm going to ask you a second question to get more reliable data I'm going to ask you what was it and I'm going to make you fill it in now I can evaluate

is the sporting event that you watched relevant to what I'm considering integrating into my platform and that yes is a more qualified yes right I can do this in an hour or two I didn't need

my designer to prototype the whole solution and I'm starting to evaluate is there demand for my solution among my population okay um so when it comes to

like assumption testing this fits into continuous interviewing or is it kind of different like do you bring your sort of prototypes to the interviews and you kind of rapidly test them like that or is it better to use like more of a like

a quantitative approach or survey based approach do you have any sort of specific opinions on like like yeah it kind of depends on what tools you have

access to so I prefer that assumption testing gets a little more quantitative even with prototyping right we now have unmoderated testing platforms I don't

love unmoderated testing platforms for like here's the whole solution I'm going to give you a complex task you're on your own hopefully you remember to think out loud like that's not the best

scenario right but if I show you a really simple prototype and I ask you to explain something to me I can evaluate your understanding if I give you a

microtask which is great now your video is 10 seconds and I can evaluate results way faster right so one of the nice things about assumption testing smaller tests simpler tasks now we can

rely on things like unmoderated testing I can test with 30 people in one day I can go home for the night and come back to 30 results um not everybody has access to these tools so I might also

run my prototype tests in my interviews as I'm doing them because it's opportunistic but it's really hard if you're doing one interview a week I don't want you to take 10 weeks to get

enough responses to test your assumptions I'd rather you rely on uh some of the great tools we have that help us do it this fast um while we're on the topic of

prototypes and working with designers I've got a question here from Josh Bradshaw who says designers love your book yay it's great yay designers yeah uh yet it was originally

targeted at product managers how should designers view uh continuous Discovery habits how should they view your methodologies I want to push back on it was originally targeted towards product

managers there you go there is nothing in my book that targets in fact everything about my book targets it towards product managers designers and software engineers and I'll even share it's funny that people think I'm a

product manager I was a product manager I was also a designer I was also an engineer I've also done a ton of user research uh so I like to just do whatever is required to build a good

product and I kind of wish more people would embrace that and then we could stop arguing about titles and roles and who does what we could just be humans who build things so the one-woman product

team I actually did know that you were a designer and an engineer you started off as an engineer no my background is in human-centered design right but I started my career in

the 90s when there wasn't a plethora of design jobs so the way that you snuck into a design job was you did front-end engineering and did all the design work because there were no designers at most

companies that's right that's right um so uh back to assumption testing just thinking like um how do we make decisions based on what we are learning

from our assumption tests just getting a little bit more into the Weeds on that one yeah I'll tell you one of the most common questions I get and you get this even with prototype testing right all

right I talked to five customers uh how do I interpret their results is their feedback good enough right um this is hard and it's because

we fall into the Trap of testing one idea at a time so I show you a prototype maybe you love it but I don't know if you're really going to use it I I mean hopefully I designed it in a way where I'm evaluating your behavior and not

what you think you'll do um but even so right like it's really hard when we work with one idea at a time to evaluate is this good enough and this is part of the

reason why it's really important that product teams compare and contrast Solutions I mean designers know this right every designer on the planet knows to provide multiple options not just one

it's the same idea in decision-making research like we know this we know the more options we can consider the better decision we make so one of the ways that we can make decisions on our assumption

test is we're going to test assumptions across multiple ideas and we're going to evaluate the results and sometimes the results across all three ideas are terrible and we decide to throw our

ideas away and start over sometimes they're very muddled and it doesn't look like we have a clear winner in my book that means our ideas are terrible and you should throw them away and start over sometimes we get a clear front

runner and it's clear one idea is qualitatively better than the other ideas that's a good thing that tells us let's go in this direction right so it's

really hard to evaluate assumption tests and to make decisions on assumption tests if we're working with one idea at a time because we're learning relative feedback we have to compare and contrast

and we're never going to remove judgment from the process we still have to make the best decision we can with imperfect data um I'm just going to use this opportunity now we've got like a we've

got still like plenty of time left for questions and stuff but I wanted to just give a quick shout out to um our conference that is coming in a few weeks so if you do love uh if you like

conversations like this with with Fascinating People with great great insights you will love inside out uh we're currently giving away some tickets as well uh we have a competition going on our LinkedIn all you have to do is

answer the question on our LinkedIn um and register for the free inside out uh webinar also hybrid uh event so if you register for that and answer the

question you will go into the draw to win a ticket to the conference in San Francisco so it is in San Francisco though so you kind of have to be in that area if you want to go um but yeah jump

jump in um and like I said the the online version is completely free so you can stream all the great talks that are going to go on on April 11 on that day

um back to your scheduled programming um I just want to know about like so we've talked about assumption testing and we've talked about uh continuous interviewing like how do you know when

to use what like and I think just bringing it all together and showing our audience like you know where it begins and where those two things kind of really fit in would really help I think yeah I'll kind of give my ideal timeline

for a team before I get into the details I want to give a shout out to table based layouts because that came up in the chat and now I'm squarely thinking about Internet Explorer 5 so thank you

for that uh okay so maybe Internet Explorer 4 oh I don't miss those days um okay that was like 20 plus years ago and I still feel like I have a little bit of

that stuck in my brain um let's talk about like what does this look like in practice let's say it's the beginning of a quarter you're on a cross functional product team you have a new outcome now

what okay the first thing I like to do is I like if I'm on a brand new outcome for the first time I want to frontload my interviews I want to get to that three or four interviews as quickly as

possible so that I can start doing my across the interview synthesis so in week one I'm probably doing three to four interviews by the end of week one I

have my first draft of a common experience map and my first draft of the opportunity space I I'm gonna tell you one of the phrases I use a lot in all of our

courses is crummy first draft I see teams spin their wheels on this just get to a crummy first draft and the reason why I want you to do a crummy first draft you're going to revisit it every three or four weeks it's going to

continue to evolve they're both intended to be living documents and I don't want you doing generative research indefinitely we're product teams our job

is to ship Solutions so by the end of week one I want the team choosing a Target opportunity and starting to explore Solutions and that makes a lot of people uncomfortable they're like we

barely know anything yeah but last quarter you just built whatever you thought you should build with no input so now it's the new quarter by the end of week one three to four interviews a

draft of your experience map a draft of your opportunity space choose a Target opportunity in week two you're brainstorming ideas you're story mapping them you're identifying assumptions and

you're launching assumption tests and here's why it's critical that it happened this quickly by the way in week two you're also interviewing you're doing another interview the reason why it's critical

it moves this fast if you take three or four weeks to interview and map your opportunity space and you haven't chosen a Target opportunity and your stakeholders come to you and say how's the quarter going how are you doing on your outcome for a whole month you're

saying I don't know yet that is not going to fly right so we have to be pushing towards Solutions now here's the deal you're probably going to pick the wrong target opportunity in

week two you're probably going to explore pretty crummy Solutions you're not going to be very good at assumption testing that's okay you're still better than you were last quarter when you were just pulling things off the top of your head

so you build some things they're probably the wrong things but at least you built something you can show progress you have something to tell your stakeholders who are way too delivery focused and then you do it again and

you're continuing to interview so you're continuing to learn more about your customers you're continuing to learn about more opportunities so the second time you choose a Target opportunity it's going to be a little

better right so people get caught up on perfect cycles and I get it I will tell you I'm going to say this and 100 people on LinkedIn are going to be like Teresa said move fast and break things and

build whatever you want that's not what I'm saying we have to move fast because our organization expects us to that is the truth we have to move fast because

our organization expects us to when we slow down we lose the right to do any Discovery at all everybody loses the

business loses the team loses customers lose we have to move fast to earn the right to keep doing Discovery so I think people should push the pace you're interviewing every week every single

week you should be evaluating a set of solutions deciding the best option to build with every cycle you will get better and better at it your knowledge bank will grow you'll get better at

running assumption tests the bets you make in your backlog will get stronger and stronger that's it ever the product coach you know that's it folks get out get on the court and hustle

it's all about and I mean your book is called Continuous Discovery Habits for a reason it's really something about building that habit building that muscle and improving over time not

Perfection but getting better and I really love that um so I want to get through as many of these questions as possible from our audience um so I'm just going to fire through them and see

how we go um if the CEO isn't on board and another C-level exec isn't on board for advocating and being hands-on should you try and get them on board or just seek out other opportunities neither I

don't think either of those is the right answer focus on your own individual habits I will share I've never worked at a company where the CEO was like oh we

should do continuous Discovery I've also never worked at a company where I didn't do continuous Discovery just focus on your own habits uh how is the continuous Discovery approach different from having

stakeholders PMs devs etc observe interviews conducted by UXR on a regular basis so this is kind of contrasting with Jared Spool's kind of exposure uh approach uh to

yeah I like this as a stepping stone like if you have an organization where the culture is we really want everything done by user researchers I think that's fine as a stepping stone I think what

you're going to find really quickly is your product teams are going to watch those interviews and they're gonna have a million questions and if you can't help them get answers to those million questions they're going to stop trusting your research so I think like if you

want to like show what a good interview looks like if you want to get everybody involved if you want to build excitement about working this way I think that's a great first step but you're opening a

can of worms when we see an interview we have a 100,000 more questions and if they're not going to get answers to those 100,000 more questions they're gonna stop watching your interviews those last questions from

Randy simia and salil rs respectively so shout out to you guys thank you very much um so a question from Greg LEL about uh experience Maps uh is it

multiple experience maps for each research conversation at what point do you tie it all together and find patterns and then how frequently are you updating it yeah so remember we talked

about two levels of synthesis the first level is I'm just synthesizing what did I learn from my conversation with Shawn I want to see a team collect

one good detailed story in an interview that means I have one experience map now what inevitably happens is teams aren't that good at collecting stories they've got 20 minutes they spent seven minutes

and they have 13 minutes left they collect a second story in which case they would do a second experience map now across interviews I'm doing this every three or four interviews so if I'm

doing one a week it's every three or four weeks if I'm doing two a week it's every other week and I'm starting to look at what are the common moments across these stories what's my common experience

map um thank you for the answer from an anonymous attendee um are you creating hypotheses when testing assumptions or are these more

generic research questions very sort of technical question here um what's your response I used to use the term hypothesis in fact if you search product talk for hypothesis you'll find it used

quite a bit I stopped using it when I wrote the book and this is because of what I've seen so every user researcher will relate to this and I'm sorry I use the term user researcher I know UX

researcher is now the term of the day it's just old habits die hard um okay so every researcher has experienced this you present your research somebody disagrees with the findings what do they

do they nitpick the methods every time right you didn't talk to the right people you didn't talk to enough people I think you asked the wrong question because we don't believe research we don't agree with that's just

the reality every human is this way um when we use the term experiment when we use the term hypothesis we're setting a standard that we're following

the scientific method we're doing double blind randomized controlled studies we're opening the door to I don't have to believe a thing you learned because you didn't do that level of

experiment that is not the goal with assumption testing we're not proving anything we're not scientists we don't have time for real science we do care about methods absolutely we care about

methods but we're trying to collect signals are we on the right path does this solution look better than this solution and so I find that just talking about assumptions and trying to collect

some data about does this assumption look faulty or not I'm not proving it I'm not disproving it I'm not validating it I'm not invalidating it I'm just looking at is there evidence out there

that could help me inform my decision that's it real light and I think that's really important because when we talk about experiments and we talk about hypotheses we start getting well I don't

like your methods okay well you made 17 decisions today with no evidence can I talk about your methods right and so I think that's where um language really matters so I try not to use the term

hypothesis at all uh on a related note from Cap Fox there's a lot of questions about you know sort of bias um how do you like

how do you avoid bias um or is it even something that we can avoid I mean that's like a big philosophical question

uh I don't think we can avoid bias 100% like if I start at the highest philosophical level I don't think we could avoid bias 100% I think like if we

go back to like the 1990s and the early 2000s and best practices we used to say things like we need a researcher to run a usability test because the designer

will be biased in interpreting the results of their own design I think that's true I see this a lot I see product teams fall in love with their idea they do research they're literally looking to validate that they were right

they don't have a lot of intellectual honesty they're not approaching their research from a good standpoint that's not what I'm talking about here right I

think if a team builds the muscle of it's not about my idea versus your idea it's not about this is the one best idea it's about how do we get a signal that we're going in the right direction and

one of the things I think really helps this is a lot of teams hyperfocus on their first idea I make them generate 20 ideas because I want to break their

fixation on a single good idea I want dozens of good ideas when we get to that level we start to let go of like well this was my idea and that's your idea and this has to be the way to do it and

I think that's where a lot of the bias in research comes from we just overcommit so I like to break the overcommitment so uh we're about out of time

here but Lucy Seret has commented um ah I love this so many truth bombs and I can absolutely agree I think this is one of the uh best conversations I've had on

on these webinars and this is my second time talking to you and uh that's the same impression every time it is a lot of truth bombs it's a lot of people a lot of pragmatism a lot of just this is the way it is and this

is how we're going to get better uh yes so if everyone can give a digital round of applause for Teresa Torres I'm sorry we couldn't get around to all the questions that were asked um and I know

that you have so many uh you've asked so many great questions so it is unfortunate but you know I I uh you know maybe we'll try get Teresa around another time because honestly this was great conversation and we've uh everyone

has uh clearly really enjoyed it so uh thank you again um again you know there's Inside Outs coming all the regular plugs honestly really I'm just stoked to have had this great conversation and also the audience

you guys have been absolutely fantastic so many awesome questions such great engagement so yes until uh we'll wrap it up now but until next time see you later thank you thank you I love all the

reactions I feel a little beat up recently by UX researchers and it's nice to see that not everybody is in that camp so thank you I appreciate it absolute super fans over here so uh

thanks again everybody we'll see you next time ciao
