A Framework for Seamless Collaboration Between UX Researchers and Product Analysts
By UXDX
Summary
## Key takeaways

- **Peanut Butter & Chocolate: Analyst + Researcher Metaphor**: Data analysts (peanut butter) focus on quantifying problems ('how big is the problem?'), while UX researchers (chocolate) explore the 'why' and 'how' by going directly to users. [01:24], [02:23]
- **The Peril of Siloed Solutions**: Rebuilding the 'help' feature without understanding user needs led to increased development time and cost, but no improvement in engagement, highlighting a failure to address the core problem. [07:55], [08:31]
- **Real Barriers to Collaboration**: Silos form due to unclear roles, jargon, time pressure, and stakeholder biases, preventing effective integration of qualitative and quantitative insights. [09:10], [10:20]
- **Plan Together Upfront**: Collaborating from the start by defining who owns which sub-questions and bringing counterparts into sessions leads to more informed research and analytics, and builds empathy. [15:55], [19:49]
- **Prioritize Big Picture Strategy**: When facing limited time and resources, focus on answering the big, strategic, and less-defined questions first, as getting these wrong negates the impact of tactical improvements. [19:35], [27:37]
- **Frame Research as Risk Mitigation**: To overcome objections to qualitative research, position it as a way to identify potential risks and avoid costly mistakes, highlighting the opportunity cost of not understanding the 'why'. [30:35], [30:42]
Topics Covered
- Why do research and analytics teams fight?
- Collaboration fails because of these hidden barriers.
- Your users don't want a tool, they want a partner.
- What if conflicting data is a hidden door?
- Stop solving wrong problems by defining success first.
Full Transcript
[Music]
[Applause]
Hi
everybody. Hi
everyone. Thank you so much for your
time today and thank you UXDX for this
amazing conference and for giving us a
platform to talk about something we're
both very passionate about: research and analytics triangulation and all the
magic that comes from it.
He did such an amazing job introducing us; there's very little I can add to that. But my name is Archana Shah. I work at LexisNexis as a principal UX researcher. I've been a researcher for over a decade now, just with a plethora of different titles. I am Subashri. I'm a data analytics manager working at LexisNexis as well. I was initially hired to help the UX research team
with their data needs and that's where I
met Archana but I quickly realized that
it's not really about me helping them
with their data needs, but that working and collaborating together can help us create a much more holistic picture of the
problem that we are trying to solve. And
that kind of created this framework that
we are going to talk to you all about.
Before we go any further, we wanted to
quickly introduce all of you to our protagonists, who we are going to use today to tell the story of us going from strangers to best friends to, now, this framework. So peanut butter here is
the data analyst
and chocolate represents research
because chocolate and we wanted to prime
you for lunch.
So as a data analyst working with product, the questions that I'm generally trying to answer are: how big is the problem? How much of a problem is it? How many users are feeling that kind of problem? I'm generally trying to quantify the effect or the problem space, and that's why I generally use different kinds of code to create different kinds of narratives around graphs and charts
and trend analysis to be able to provide
answer for those questions.
And on the flip side, traditionally
research has been well known for going
to the source of the problem, the users.
So what are the problems? What are the
problem spaces? What are the boundaries?
Why do those pain points exist? And I
might be preaching a little bit to the
choir over here, but the usual slew of methods that apply over here are
contextual inquiries and surveys and so
on and so forth. And that's what my
toolkit tended to consist of.
So a little bit about the processes when
we were both um new to the organization.
Everything that you're going to look at
today is oversimplified for the purposes
of this talk. But this will give you a
general sense of how it broke out. So
you start with a business question. Any business question is never that simple.
There's lots of facets to it, lots of
research questions it breaks down into.
Once you have that, there's the research
planning: what method is going to be best for answering those questions most effectively and efficiently. Once you've
collected all of that data, what you're
working on then is compiling all of that
information to then put together your
recommendations. So it's advice, answers
or directions for what the business
should do. The analytics process looks
pretty similar as well. Uh so when we
get the business questions, we are
generally trying to figure out what part
of the question can be answered by the
data and analytics that we have access
to. We then generally go off to figure
out what are those data sources that we
can get some of these answers from. Have
those findings, create a narrative
around it, create some data and insights
based on the graphs and dashboards. And
then we go back to our stakeholder with
the recommendations from those data and
insights. And that brings us to the fun
story part of the talk today. Um, this
is a tale of two strangers and it's a
story from 2017. So basically a million
years ago. Uh but this was a story of
where peanut butter and chocolate were
standing on two opposing ends of a cliff
and did not know anything about each
other at all. And something that you
should know any good story starts with
some background and some context. So
here it is. The world that Subashri and
I work in is basically one that caters
to legal professionals. So for any of
you who have watched any kind of
courtroom dramas, suits or any one of
those um legal dramas, you may already
have an inkling of how complicated law
is. Now if law is going to be
complicated, legal tech that supports
that world just as complicated, if not
more.
And the problem that we were looking at was basically about solving for legal professionals. A cornerstone of what legal professionals do is legal research: looking at all of that history, all of that case law, all of the nuances and subtleties to be able to build a solid case that you can present, because you cannot afford to look like a fool in court. And that is exactly what our flagship product was catering to; it was meant to support that research.
Now that said, customer support was
overwhelmed and inundated with lots of
people calling in asking for help in
using that system. And this was a
problem because we actually did have a mechanism to self-serve, look at the capabilities and figure out how to use the system, within the system itself. So what the business came to us with
was a simple question of maybe we should
rebuild help, add in all those bells and
whistles, add in more content that
people can self-serve and use the system for. And like most projects these
days, you got one month to do it. Um and
that of course meant that Subashri and I
were scrambling and we were running off
in our own individual paths to find as
many answers over there as we
could. So um breaking down that entire
process, the focus of our talk today is
more the collaboration and the framework
piece of it. So I'm not going to bore
you with the methods or the details over
there. But what I can tell you is: from the business question of 'would rebuilding help help?', what I broke that out into as research questions is: why do users need help? And if they need help, where are they seeking it, and where do they want to seek it? And for this I ran
interviews, and the biggest takeaway that came from those interviews with customers who were calling in frequently was: there's a lot of data in your head. You're thinking about your client, you're thinking about that case issue, you're thinking about the state you're in when you're researching that issue. With so much in your head, putting all of that into the system to get everything that's relevant is extremely difficult. And so it was clear
to me that users wanted help with
figuring out how to craft an effective
search. And the second piece of this was
the survey. So the piece of where do
users need help? This was fairly
straightforward. All of the data has
been falsified to be respectful of our
confidentiality needs. But the takeaway
was the same. Yes, people wanted to go
to customer support, but if you look at
that big red box, they wanted to find
help online. So, they wanted help
online, but no one seemed to be using
it.
So when I looked at the business
question of whether we should improve the in-product help, the data analytics question that I came up with was basically to figure out: are people using the current help system, and if they are, how frequently do they use it, and when in their user journey within our product are they trying to use help? So
when I looked at the data what I found
was very low percentage of current users
are actually interacting with help. Even
among those only 20% are coming back and
using it multiple times but the sessions
where they are interacting with help are
not successful. So they are probably not
able to find what they are looking for
and they are generally using help right
after running a search. So the
conclusion from this uh analysis was
that people are not really interacting
with the in-product help. So remember in
this scenario we don't know each other.
So we went separately to our stakeholders and provided them with our insights and findings, and based on the time crunch that we were running through, the product team actually decided to improve the in-product help that we already had: add more content to it, add more flexibility and options for the users to be able to interact with the help system and help them with their searches. So we ended up
creating this new and improved in-product help, but nobody really came. So we didn't really have any increased engagement with the in-product help;
people were still calling up customer
support to get the help that they
needed. So what went wrong here? I did
provide the insights that people are not
really interacting with our in-product
help. This one over here suggested after
talking with five users that we should
build new and improved help. To be
clear, I made no such recommendation.
You can get the problem right and the
solution wrong and that happens a fair
bit. At least I shed a lot of light on
the subtlety, the nuances, and so much of the complexity that goes into that search
system that clearly needed help. What
did you do? Boil all of that into a
number on a bar chart? Well, my bar
chart at least talked about the whole
user base that we had and people who
were using or not using help. But of
course, we should all listen to your
call insights. So basically vibes. Those vibes are why we have the product being even half as successful as it is today.
You try launching a successful product
without understanding the user or the
problem space more deeply. What are we
going to do with understanding our users
deeply? We are not doing therapy
sessions here. We are just trying to
solve business problems. No, you're
right. Let's just keep solving the wrong
problems and AB test ourselves off a
cliff. We hope this isn't the case, but for those of you for whom it was, I hope it wasn't triggering. But a lot of this, if
it wasn't text, may have been subtext
for all of you. Now, in all seriousness
and jokes apart, um what we've found
over the years that we've uh been
working together with our teams, growing
our teams, and also the conversations
we've had the privilege of having across
the industry is there are some very real
barriers that exist between our
functions. Now, one of them is lack of clarity in roles. If you don't know what
the other person does, how do you even
know what to go and ask of them? And
then who owns which piece of the
question? And it's only natural to butt
heads in the age of AI when everything
is accelerated. Everyone wants those
insights yesterday. That tension becomes
a little bit stronger. The second is
lack of understanding of each other's
disciplines. No, research is not just
talking to five users and analytics is
not just bar charts. There's far more depth to what each of us is capable
of. But again, without that
understanding and all of that jargon
that can come with each of these, that
barrier gets stronger. Time constraints.
At this point, we're all working with
stakeholders who wanted those insights
yesterday. Um, and those bring a very
real pressure, forcing us to resort to a
toolkit we already know and are familiar
with. And the last of this, of course,
is stakeholder biases. Now, I don't know
if this is also relatable, but over my
career, I've worked with stakeholders
who built features off of that one
thing, that one user said that one time.
But at the same time, I've also had
stakeholders who will only budge if
there is a statistically significant
number attached to anything I present.
And both of these biases are real. Now,
they're just as real as the impact they
can have on the business. If you
continue down that path and you refuse
to change, you could be solving the
wrong problems. And when you're solving
those wrong problems, it leads to a lot
of repercussions as well. You're less
efficient and it's a lot more expensive.
At LexisNexis, an AB test can take
about 3 weeks. And if you're basically
trying to recruit from a very niche
market audience, then even one single
qualitative study that you want to get
really strong reliable results on can
take very long and time is money. So are
we all doomed here? Don't we have any way to get out of this spiral? We
actually might have a solution for you.
So let us now imagine in this parallel
universe where UX research and analytics
are best friends. We know each other's strengths. We talk to each other on a
regular basis. And in this scenario, the
process can actually look pretty different. So the moment that we get business questions, we sit together and we figure out what part of the question can be answered best by the research team versus what part can be best answered by the analytics team. Then we go off, we do our research,
we do our analysis, and with the findings that we get, we then come together and share a recommendation based on our holistic understanding of our users and the problem that we are trying to solve. And this is of
course iterative as well because the
insights that you are getting can lead
to more questions that we need to
answer. In this scenario, the questions that I needed to solve for were still the same. Are
people using help? How are they using
help? How frequently do they use help?
And my findings were exactly the same as
well: that they are not using help, and in the sessions where they use it, they are not finding what they're looking for, and
they are using help right after running
a search. And the nice thing about
having a BFF is you know exactly what
they're capable of, what they're going
to find, and how it might drive your
next steps. So it became a lot easier
for me to plan sequential research and
parallel research. Now the piece of this
that I knew I could run in parallel with
that one month still in place was what
do users want? So that same survey was
run, same results were found, people
wanted help
online. So when you put it all together,
people wanted help online but success
rate was very low even when they used
it. So maybe online help wasn't offering
things in the right manner. Customer support was clearly doing something distinctly different from whatever was available in the system. And the
second piece of this, we knew that we
wanted help with search because
Subashri's data pointed out that whenever
they used help, it was typically after
they ran the search. But when that was
available in the system with all those
bells and whistles, they still weren't
clicking into it. So maybe people are
just having a hard time finding it. So I
took the first piece of the puzzle which
was basically what is customer support
doing that's distinctly different from
our online system. So this time around
my research question was a little bit
more pointed. It wasn't just why do you
need help? It was why do you need help
with search? And therefore my findings
were also far more pointed. It was the
usual stuff of like there's too much in
my head, lots of variables that I need
to plug in and I need comprehensive
coverage. But it was also: the system isn't hard to use, I can figure out the bells and whistles. It's that I don't know what to put in as the terms, the right kinds of sources, what libraries I should be looking in. All of that is what my
biggest problem is. And that's why it's
wonderful to have someone on the phone
brainstorming with me, telling me how to
put those terms and connectors together,
craft that search, and then I know I
have far more confidence in the answers
that I got.
And when I was trying to answer the
question of are they not able to find
what we have in the product, we
conducted some AB tests and made help much more visible in the product. But even after doing that, people were still not engaging with the help that we had in the system. People were still calling up
customer support and they were still
doing that right after running a search.
So this time around because we were
talking to each other and based on all
of these additional insights that we
have, we actually ended up creating a
more comprehensive narrative of the
problem space. We realized that even if
we create much more enhanced
capabilities in the product, people are probably still not going to use them, because they didn't really need help with just writing a search. They needed that brainstorming partner, that kind of sounding board, going back and forth to figure out
answers for those very specific and
complex questions that they had. So our
recommendation for the product this time
around was different. We suggested that enhancing the capabilities was probably not going to increase engagement or reduce the customer support volume that we were seeing, but maybe conversational help could help. So we actually ended up shelving the project this time. And that actually ended
up saving a lot of time, money and
resources, as you can imagine, for the
business. So bridging perspectives together this way can really help us become more effective analysts and researchers, because we can mitigate some of the blind spots that we might have if we are working in our silos, and it can create much more
effective solutions for our users and
for our business.
Yeah, sounds simple, right? But we've
taken many a stumble along our journey
to building this framework and we'll
walk you through what we've learned
along the way as well. So, first portion
of it, find your counterpart. And for
that, if screaming Marco and waiting for
the polo doesn't work, um, typically
what you can do is reach out to product.
Um, more often than not, product
naturally aligns all of these different
pieces of the puzzle together, but it's
on you to take that next step and
continue building more and higher on top of that relationship that is established. If that doesn't work, one
of the things that our leadership has
been supportive of and what I would
recommend is reaching out to them and
setting up monthly reviews where you're
able to talk about the kinds of insights
you learn, what you bring to the table.
The first step to anything is awareness.
And from there on, you can lead to
further connections. The third piece of
this, and I know this sounds mildly
stalkerish, but look them up on the
Active Directory and ask if they're
willing to have an informal coffee
conversation with you. Talk about your
priorities, run it against theirs, see
where you might be able to supplement or
complement each other's
data. Second piece of this, and this is
actually where we got started.
Typically, after you run your research is when you have findings. So there is a natural starting point for a
conversation. Otherwise, it's hard to
know what to talk about when you meet
them for the first time. So, share your
findings and then ask for analytics. If
this is what I found, what are you seeing in the product? Are people doing what we think they are doing? Are they doing something completely different? If they are doing what you think they are doing, then you've
found a stronger story to tell your
product or your stakeholders about what
they should build, which of the problems
to focus on, and you have a quantified
priority you're able to attach to all of
the issues that you have found. Now, if
they don't agree, that's actually the
fun part because what you've
accidentally discovered is a third
hidden door. In what world can both the
qualitative and the quantitative
insights be true? And what would those
hypotheses be? And that becomes your new
research
project. And the next step, assuming that this has become second nature and you've built that habit of reaching out to each other, is to start planning your research together. And the
same things that we had demonstrated earlier apply: if you know them and you know what they're capable of, you also know what parts of the answers they'll be able to inform, and that can also inform what you do as a next step within your research. So
essentially if you start with the
business question, break it out into all
of those different facets, different
research questions, sit down and have a
conversation then about what can you
bring to the table? What should I own?
And if the same question can be answered
by both, which subparts of these
questions can I own and how will we put
that together? You can have that
conversation up front and all of that
friction goes
away. And brownie points for this one.
Um, as you're conducting that research,
if you're able to tag or bring your
counterpart in, what I've seen over the years is that the analytics counterpart's research and findings become a lot better informed because
they hear it directly from the source
and you're building that level of
empathy that lasts very very long. And
at the same time if you're talking about
a particular study that you've already
run and this was a case very recently
where my counterpart was able to look
over my shoulder and say oh are you
trying to basically measure willingness
to use for the AI I can add some
behavioral data to that and we can make
that model a little bit stronger and we
were able to go to business with here
are the levers you have to pull this is
the strongest lever that's where we need
to go and have the biggest impact.
So now, when that becomes very normal, when you are best friends and you are already working on projects together, the
next step to take is becoming a little
bit more proactive and start solving the
biggest problems for the product and
figuring out what those biggest pain
points are and who are the users who are
feeling those kinds of pain points that we
should solve for. It also helps if we
can design some of the success metrics together, because oftentimes, as this supportive function within the product, it can become difficult for us to figure out how much impact we are really creating. So having these success metrics together can really guide us on whether we are solving the right problems the right way, or whether there are things that we still need to do. So look at those
success metrics together, from a dashboard or from a periodic report, and figure out if we are moving in the right direction or not. And if we are not, that kind of insight can give you access to this prioritized backlog that you can work off of to make sure that we are solving the most important problems for our
users. And all of this sounds great in theory, of course, but then we are
always dealing with the real constraints
of time and resource. So in our
experience what has worked best is to
include collaboration as part of the
process. So it's not something that we
are ignoring or neglecting when we have
these real challenges. So whenever we are talking about the business question, the part where we are trying to figure out what the problems are and what we are trying to solve for, it helps to have a cross-functional conversation with the key stakeholders in place as well, so
that we understand the context and the
problem that we are trying to solve and
it helps us create better research and
analytics questions. And then we can go off and do our own research and
analytics and we can figure out what the
findings are but in these phases it
helps to have regularly scheduled working sessions. So you are coming
together on a regular basis and talking
through your findings or your challenges
and it can also help to include some templates and checklists as part of the process as well. In our case, we have some templates around how we can share the findings together with our stakeholders. For checklists, we have some around: what data sources should we look into? Which people do we need to invite to our next meeting? Who should we reach out to, to triangulate and synthesize the
findings that we have? Um and of course
looking at the dashboards or reports
together to make sure that we are moving
in the right direction of solving the
right user problems. So this is our full
framework in action. Um there's a lot
going on here of course but this is just
for future reference. Hope all of it
makes sense because we have tried to
build up to this full framework. So the point that we are really trying to make, if it's not obvious by now, is that working together and bridging perspectives can really help us get to a happy user but also a happy business, because analytics on its own can answer questions like: how big is the problem?
How many people might be facing that
problem but then adding research
insights to it can also help us
understand the why or how much of that
is a pain point. And on the flip side of
that same coin, while research can
easily answer the what and the why, it
can also go that extra step of informing
which one should we focus on and which
one do we prioritize, especially in an
age of limited resources.
And that's why bringing both together
and collaborating can really help us get to this holistic idea of who our
users are and what are their biggest
pain points. And while that benefits the
business, what will benefit you
personally is that you would become a
lot more effective and efficient because
you're able to squeeze a lot
more out of the same kind of studies
that you'd be
running and it would lead you to an
effective product.
Join your partner in crime.
That was wonderful, right? All the
zooming in and zooming out of the
frameworks. And I mean, of course, the
metaphor of peanut butter and jelly and
all things seamless collaboration. So,
uh, what we're going to do now is kind
of transition into getting some, uh,
online Q&A. Um, and of course, you know,
you can stalk our wonderful speakers on
LinkedIn. But in terms of like
the substantive nature of all things
seamless collaboration, there are a
couple really good questions up here
that I kind of want to lead off with if
you're open to it. So, LexisNexis, in terms of size, about what's the number of employees? About 10,000, if you're looking at just our company? Yes. Sure. So there are people
in the audience that are like 10,000
that's so cute and there are others that
are thinking that's a pretty sizable
organization. But the question here around large orgs: you're getting a
lot of data coming in a lot of opinions
coming in as well. Um what would your
recommendations be for navigating kind
of conflicting data points?
More often than not, the conflict can arise because you haven't defined what you're actually measuring. And depending on whom you ask, how big the stick is depends on who's looking at it. So more often than not, what it
takes is for all of these different
research functions to get together and define how you're going to be measuring
and what you're going to be measuring.
And from there on it becomes a little
bit easier. Not saying that all of those
will agree perfectly because a lot of it
depends on instrumentation, what the the
tools capabilities are, etc. But that is
usually a great starting point to start
avoiding those conflicts. Yeah. And just to quickly add to that: the metrics that I was talking about, so defining what good looks like, because in many cases people
have different ideas as Archana said
based on their understanding and their
expertise. But having that shared
understanding of this is the problem
that we are trying to solve and this is
what good looks like, and this is how we know that we have solved the problem, really helps.
That's great. I'm gonna ping pong this
back to Archana here.
Um, hearkening back to an earlier
presentation from today: what we heard about is kind of the dual-stage approach to delivering tactically as
well as kind of thinking big picture and
strategically.
And there's a question here around kind
of navigating and/or prioritizing
different research initiatives. So we
have the big strategy level initiatives
of trying to answer larger questions
maybe some things that are less well-defined, versus at the lower level, kind
of like product level and like click
level interactions, how do you manage those tradeoffs?
So as much as we proposed a framework over here, I want to be clear
that our roles aren't always 50/50. So
especially when you're starting to enter
into areas that are unknowns where we
have no data to begin with. That's where
I think research is maybe sometimes
working in isolation. You're going off
and running based off of market research
data, customer support data, whatever
you're able to get your hands on and
you're answering the big strategic big
picture questions with the usual
methodologies, jobs to be done is my
favorite. But that said, typically on
the tactical end of things, I think, is where she probably spends a lot more of her time than I do, in terms of
making sure that things are working
right, feeding things back into the
prioritized backlog because there is
something to look at. So in terms of
time split um I think research ends up
being a little bit more forward heavy.
Um that said in terms of prioritizing
um this is more a personal opinion than
anything else. If you have only a limited amount of time, big-picture strategy questions have to take priority. If you get those wrong, it doesn't matter what you build. Okay. Anything to add? No, she said it perfectly. She kind of nailed it.
Yeah. Um, so as far as our speakers are
concerned, they have a wealth of
professional experience that extends
beyond LexisNexis that includes Twilio,
T-Mobile, and uh, a shared experience
consulting at Infosys. I'm curious
because as a common thread as a
researcher, something that I've run
into, something that's coming in from
questions here is people discounting the
value of research because it's more
qualitative than quantitative. I'm just
kind of curious, from your perspectives, how you kind of handle those sorts of objections.
I mean, that was one of the reasons that we built this framework, because for us, we feel like collaborating and getting these insights together can make a much stronger case, not really against our stakeholders but, you know, for our users, let's put it that way. So that's why working together and combining all of
these different data points that we have
access to is the best bet for us to
create that comprehensive narrative and
be like: if you don't listen to this, I mean, we have such a strong case here for that problem. Okay, and we've also gotten to a point where we punt it to the right person. We say: that requires strong confidence. You're not asking a deep question, you're asking a broad question. We need to go
across the market and we need to have
higher confidence that this is a
solution that will work for everyone
else. We need to work together on this
one. It's not just qualitative and they
do the same thing when it's like we
don't know the why behind this we need
to go to research. So building that
function, building that partnership is
important. But in addition to that, what
in in complete honesty, what has
worked somewhat successfully with
stakeholders has also been um if we
don't do the qualitative research, here
are the number of things that could go
wrong. And I just want to highlight
these as risks. So positioning those as
risks can be convincing. Worst case scenario: if you run that research up front and you learn that you were fine, you spent a couple hundred dollars. But if you find out that you were wrong and you were solving the wrong why, it could have potentially cost you millions of dollars. Love that. The
opportunity cost. Yes. Y'all, thank you
so much. That was a great presentation.
Thank you.