Productside Webinar

Digital Product Management: Part 1

Following & Leading Your Users

Date:

05/21/2020

Time EST:

1:00 pm

Most PMs recognize that successful products demand both a clear understanding of customer problems and corresponding solutions that address those needs. That understanding informs our product roadmap priorities and our marketing efforts. That’s table stakes. Yet, paradoxically, once customers buy in and adopt our product, we’re often flying blind as to how the product is actually being used, as it’s being used.

One of the key dimensions that distinguishes Digital Product Management from more traditional PM practice is tracking, understanding, and thoughtfully intervening when appropriate as your users experience your product. This is, after all, where the rubber hits the road: where users experience value, or frustration, as they work to accomplish their tasks. Join David Nash for an engaging, informative webinar that will get you thinking about what it takes to facilitate an in-product journey that delights your customers and keeps them coming back.

Welcome & Webinar Overview

Roger Snyder | 00:00:00–00:03:30
good morning everyone and welcome to another Productside leadership webinar. my name is roger snyder. i’m the vice president of marketing at Productside. thank you for joining us this morning.

things have certainly been moving fast and here at Productside our goal is to help product managers and product marketing managers address challenges head-on. we keep listening to our community and we try to provide topics that are engaging.

today we begin a two-part series on mastering digital product management and david if you advance it to the next slide that would be great.

i’m joined this morning by david nash. he’s a principal consultant and trainer here at Productside. david is a recognized expert in product management, product marketing, entrepreneurship, and team development.

he has delivered generations of successful technology products across many different segments — hardware, software, software-as-a-service, systems — and is an expert in b2b software-as-a-service pricing, user experience, and in-app user engagement.

he has built several best-practices pm teams and high-performance teams as a product management leader in various companies at all stages of growth. he has held leadership roles as vice president and senior vice president.

david’s also a recognized community builder, coach, and bar raiser, and is the founder of product camp portland which has been going on since 2012.

so david, thank you for joining us this morning.

David Nash | 00:03:30–00:04:05
good morning everybody, greetings from beautiful portland, oregon.

Roger Snyder | 00:04:05–00:06:00
that’s right and i am here in los gatos — it says santa cruz mountains and that’s where i was supposed to be but pg&e decided to pull the power from my house, so i am in the office today.

so i’m the only one in the office — we’re practicing good social distancing.

so let’s move on to the next slide and we’ll do a couple of housekeeping items.

this is, like i said, part of the product management leadership series and you can join our linkedin group to continue to hear more about this series as well as see lots of great tips on how to lead as a product manager.

and i will put this link into our chat box in a moment.

right now i want to get on to the webinar details.

we really encourage you to ask questions during the webinar and so please use the question box to enter your questions. we’ll accumulate these questions over the course of the webinar and then have a q&a session at the end where we can try to answer as many questions as we possibly can.

the most commonly asked question is “can you watch this webinar later?” and the answer is yes. we will send you an email with a link afterwards that will allow you to access this webinar from our website anytime you want to view it.

we often get asked whether we’ll share the slides and since the slides contain a number of our course materials we don’t share the slides directly, but we do allow you to view the webinar live and recorded. so we encourage you to view that webinar whenever you like and like i said we’ll send out that link later on.

About Productside & Leadership Series

Roger Snyder | 00:06:00–00:08:10
all right let’s keep going.

so, a little bit about Productside. we love interacting with you guys and so we encourage you to ask and answer questions, as we’ve talked about earlier.

our mission at Productside is to help companies and individuals do great product management and great product marketing. we like to empower product managers to do these great things.

so let’s talk a little bit about what we offer.

this slide talks about our various services. i’m not going to tell you about them all — we’re just here to help you improve your skills or your team’s skills.

we offer a variety of services. if you want to learn more, check us out at Productside.com.

all right, so today in the product management leadership series we have several different themes for the various installments in our series and, if you animate the slide a little bit david, you’ll see that today’s installment is in the vein of professional development.

so as a product management leader we’re hoping that you gain an understanding — a deeper understanding — of what it means to be a digital product manager.

and, like i said, this is a two-part series and david will kind of go over what’s going on with the digital product management landscape in two parts.

all right, so that’s all about us. let’s talk about the audience.

so today we see that 52% of the group — over half — are product managers and product marketing managers. we have a good representation of directors and vice presidents as well — that’s great — about 16% in those groups.

and we have some product owners joining us today, that’s also awesome, good stuff.

and then there’s a lot of folks that probably work with product managers, work around product managers and want to know more about what it is a product manager is doing and how they can help, as well as what they can hope to get out of the product manager’s role, especially in digital product management.

so a good group for us today.

Roger Snyder | 00:08:10–00:08:35
all right, so david i’m going to turn it over to you at this point. you can introduce the series and dive into the materials.

Intro to Digital Product Management & Two-Part Series

David Nash | 00:08:35–00:11:30
all right, thank you roger.

again, pleasure to be here with you all this morning. that guy in the photo a few slides ago had a haircut — that was me approximately 60 days ago — so i bear some resemblance to that, sans the haircut.

so as roger said, this is the beginning of a new series. we’re really excited about it. this particular topic is a two-parter.

today we’re going to talk about some of the fundamentals of user engagement — what the term even means — and we’ll build a foundation that we’re going to keep coming back to.

the first click-in on that will be next month — about a month from today, june 18th i think it is — we’ll do a little bit of a deeper dive into improving engagement, onboarding, and gamification.

but there’s lots more to come in this topic. this topic is exploding. it’s of vital interest to our community — that’s all of you — and so it’s vitally important for us to share this with you.

i’m going to do the world’s fastest introduction of lean startup. many of you have seen these models before. i want to use them kind of as a context setter today.

there’s two things that we really try to orient ourselves to in the world of the digital product manager.

the first is this very famous model on the left, really created by eric ries some years ago in the lean startup. it’s called the build–measure–learn loop.

and, you know, it’s interesting because this is the circle of actions — if you will — the triad of actions we take continuously, and it was conceived for startups.

i’m here to tell you that it’s not just for startups at all. this model is at least as important and perhaps even more important for established companies, established product lines.

and the reason is that a startup is often looking for data from a very small user base or extrapolating a lot.

but on the other hand, many of us are working — or have worked — for large companies with established products and established customer bases, where you have a ton of data at your fingertips.

and so really what a startup craves, larger enterprises often have much more ready access to.

so that’s what we’re doing: ideas, hypotheses turn into things we wind up building. we deliver something. we’re measuring constantly to get signals and sensors from that product about use.

we build a data set which is ongoing. we mine that data over and over, we learn, and we pivot and adapt.

David Nash | 00:11:30–00:13:15
and the things we’re doing on the left are in service of learning the things on the right.

so number one is: as product managers, we want to make sure the things that we’re building are valuable and usable — that they’re going to have customers who want to throw their money at us, who are going to buy these products and use these products. so we start at “valuable.”

circling counterclockwise from the top, we also want to know from a technology point of view that it’s feasible — that is, can we build it? do we have the technology? can we make it simple, affordable, reliable — all those things.

and last but not least, sometimes even more important earlier than “can we build it?” is “should we build it?”

so the issue of viability: is this a business model that’s attractive for us to pursue and invest always-scarce resources in?

i don’t care whether you have one team of engineers because you’re a startup, or you have 150 engineers — i’ve been in both environments — however many engineers you have, they’re never enough to feed your dreams.

you have to make tough choices.

and so this notion of viability is super important.

so that’s the model — again, probably the world’s fastest overview of the lean startup model. thank you eric.

and really these are the domains that we’re testing.

but the bottom line — probably the key takeaway — is: this is all real nice theory and it’s all a hypothesis when you’re jotting it down on the cocktail napkin and doing your first customer research calls.

it’s not real until you measure it, until you validate it, until you understand all the things that you thought you knew but wound up learning very differently.

Roger Snyder | 00:13:15–00:14:00
i really like that image there on the right — the “valuable, feasible, viable.”

i think every product manager should just sketch that out, pin it to their wall, and look at it every day as they’re considering the next request from sales, the next request from marketing, the latest customer feedback, and just ask themselves those three questions.

of course they’re going to have to get the whole team’s help on all of this in some aspects, but it would really help the product manager make good decisions.

David Nash | 00:14:00–00:14:10
no question about it, thanks roger.

From Flight Plans to Reality & First Poll

David Nash | 00:14:10–00:17:30
so let’s move ahead.

most of the folks who come to our webinars are working — and again, as our data indicated, some 52 percent of you today are working pms and product marketing managers — and you probably have something in market already, perhaps several products, perhaps a portfolio of products if you’re a vp or director or leader on the phone today.

and you know this didn’t happen accidentally. you started with a plan.

and while it has been a couple of months since most of us have been on an airplane, i wanted to harken back and share with you a metaphor that i think everybody will get, and that is that every flight starts with a plan.

now, you went on kayak or google flights or whatever — perhaps your airline that you’re loyal to — and you bought a flight.

in my case i wanted to fly from portland to new york, where my family is. i planned this thing and yes, i had to cancel in the last couple of weeks because corona, thank you.

but we made a flight plan. i knew what it was going to cost, i knew roughly how long it would take, i made my trade-offs.

i had those levers to slide in terms of budget and layover and all those things. i made a choice.

and just as you have made choices in your products — you’ve written product requirements, you’ve written user stories, you’ve written a product vision and communicated that — and you believe that’s what you’re going to do.

and then you get on the airplane and life happens.

and this is actually a gross simplification of what happens on most commercial flights.

you file that flight plan, you take off from portland, and a number of things happen: you have bad weather.

how many times have you heard the captain say, “we’re going to descend or ascend 10,000 feet to get out of the bad weather or the chop”?

you have fuel issues, you have unplanned traffic — all kinds of things.

and so the flight you wind up flying is often very different.

you can pivot. by the way, there’s probably dozens of micro-adjustments — i’m just calling out some of the coarse, unplanned adjustments that happen during a flight.

and this is very real-world. and so your best-conceived hypothesis for how you’re going to get there often changes.

so it’s a powerful metaphor; i think we can all relate to it.

make plans — and, you know, just be ready for them to change.

David Nash | 00:17:30–00:18:05
so with that, we’re going to open our first poll of the day, and i think roger’s going to facilitate that for us.

Roger Snyder | 00:18:05–00:18:20
absolutely. good. all right, so i’m launching the poll.

why don’t you go ahead and read it.

David Nash | 00:18:20–00:19:40
so, if you’re not deeply familiar with user journeys i’ll define the terms a little more in about five minutes.

suffice it to say for now: this is what you think users will be doing when they’re inside your product.

and in the absence of formal tools and methodologies, here are some of the common ways that we’ve found product managers and their development partners and operations partners do these things today.

some of us mine server logs, go back through apache logs and see what pages customers visited.

some have customer survey tools they send out afterwards, saying “what are you using and how frequently?”

sometimes your support team can jump in alongside the user and understand what the user is doing — kind of seeing through the user’s eyes.

sometimes your product is fully instrumented — and that’s a good day for all of us. we’ll get back to that in a minute.

and sometimes you don’t have a clue. sometimes you don’t really know what’s going on inside your product, right.

and of course you may use a combination of these things, right, even…

Roger Snyder | 00:19:40–00:20:20
yeah, yeah, there often isn’t a one-size-fits-all.

and sometimes you are bringing a new set of sensors online — if you will — a new set of eyes and ears to look at your product while you’re still attached to the old ones.

so i’ve worked at companies where we had little bits of all of these things, just to try to cobble together a holistic picture of what’s going on.

so i will encourage folks as you’re answering this — if you do multiple ones of these — choose the one that you feel like your company is most strong at.

all right, i’m going to close the poll now and let’s see what the results were…

Flying Blind & Need for User Journeys

Roger Snyder | 00:20:20–00:21:05
…we’ve got 19 percent of folks saying they’ve mined server logs.

64 percent — which is the largest number by far — say they do customer surveys, which is extremely encouraging, that’s great.

21 percent: the support team has tools for understanding what’s going on in the user world.

and then 28 percent — this is a number that’s been going up and i really love to see it rising more — say our products are instrumented to capture this. that’s encouraging.

and a few folks had said, you know, “what’s a user journey?” so that’s good — there’s a part of the audience that is going to learn what a user journey looks like and how they go about that.

so, good, good poll results. we’re very encouraged by them.

David Nash | 00:21:05–00:23:05
thanks everybody for the response and thanks roger for the readout — that’s fascinating and honestly very consistent with what i’ve seen over the last few years doing this and working with other folks who are doing this.

so, you know, staying with our airplane model for just a moment, there’s a metaphor about flying blind.

and this is not an academic theoretical discussion — i, too, have flown blind for longer than i wanted to.

again, working in companies with very successful products — by the way with tens of thousands of customers and many large enterprise customers with thousands of users at each account — and often we didn’t know what our customers were doing.

and i want to stress that these were commercially very successful products, rapidly growing companies.

and yet the state of the art, you know, as recently as a few years ago — 2016, 2017 — was that there just weren’t many tools that were really cost-effective and pervasive.

and so you had to do things the hard way.

and so we didn’t know what customers were doing.

sometimes you wouldn’t know customers weren’t using the product or weren’t satisfied with the product until they churned.

if you’re not familiar with churn, that’s the bottom of your bucket with the holes in it where customers fall out — they don’t renew — and that’s never a good day to find out your customers weren’t using your product until they churn.

and we also didn’t know where their victories were, where their struggles were inside.

so that’s been my experience — at a company that was very well led and very typical of large b2b and enterprise companies: they just have a lot of customers and weren’t raised in a period where they had to measure all of this stuff.

Defining User Journeys & User Engagement

David Nash | 00:23:05–00:27:10
and so moving ahead, i want to share with you two concepts really for the rest of our session today.

the first is this notion of a user journey.

kind of just a working definition for it on the left is: it’s the series of steps that your users will take to accomplish a task.

generally these are planned. generally these are optimized for them.

when you’re thinking about user journey from a planning perspective you’re often thinking about: how can i take the current journey and optimize it — maybe take out some unnecessary steps, maybe not ask customers for information we already ought to know about them.

and the other really key point here: sometimes you’ll hear “customer journey” and “user journey” used interchangeably, and that’s often valid for b2c products — consumer products — where the buyer and the user are the same.

however — and i can’t stress this enough — for b2b products, for enterprise products, very often the buyers and users are very different.

the buyer is the person your sales team is calling on, your customer success team is calling on, and they are sometimes user proxies.

sometimes they’re a department manager, they’re a chief something-or-other, they’re a senior executive with the budget to buy your product.

but there may be hundreds or thousands of users who have no other relationship with your company than that of using the product.

and so that’s really where the user journey becomes vital, because you’re the only part of the company, by the way, that’s really charged with understanding users.

because the sales and marketing team are chartered very clearly with driving awareness and preference for the buyers.

so that’s the user journey.

user engagement, again, is really a long-time discipline that’s come into sharp focus over the last few years. it helps us reveal behavioral actions and inclinations — how users are responding to things in the product, what they’re doing, what they’re not doing.

there are many dimensions. we’ll get into that in just a moment.

and when we talk about users we talk about them as individuals, we talk about them as groups of people, and we talk about them not just in the moment but over time.

so: how are things trending?

and just again — a quick setup — this is multi-dimensional.

and our job is to really look at these users through a couple of different lenses.

the first is, we really want to understand before we attempt to do anything or interpret — we want to just watch. we want to learn. we want to watch that engagement, we want to see what users are doing.

we want to really understand what are the intended versus unintended outcomes.

the ones on the left — the intended ones — are the ones you want them to experience.

and then there’s always unintended outcomes as well.

so that’s the watching part.

but we’re watching in service of doing something better.

David Nash | 00:27:10–00:28:40
and we’re going to find out that it’s very helpful to guide users not just when they’re struggling but sometimes when they’re successful as well.

we want to intervene thoughtfully, selectively, respectfully — but there are times we’re going to want to intervene.

when users do things that are rewarding for them, we want to set them up so they do it again. we want to reinforce those behaviors so they form habits that are good for them and good for our product and good for our business.

and overwhelmingly we want to simplify.

i can tell you, having worked at a number of b2b companies and enterprise software companies for more than the last decade, that b2b products often grow up through generations of being responsive to large customers.

these large customers often have workflows that are slightly different, and so before too long — in service of large, demanding, high-spending enterprise customers — you’re adding a lot of specific features because no two companies have the same workflow.

and so therefore there become ten ways to do any one thing.

and that’s not really good. in the moment, delivering a swiss army knife is often not what people want — they just want a screwdriver or just want a blade.

and so simplifying for b2b products is something which we just don’t do enough of as a product management discipline, and it’s often very high ROI.

the bottom line is you want to link the user outcomes and the product outcomes to the business impact.

this is a huge topic in and of itself; we’ll explore this in a future webinar.

but really making sure that when the user wins, your company wins.

that’s easier said than done, and you don’t know what the user winning looks like until you’ve looked at it deep enough and long enough.

In-Product Analytics & Tool Landscape

David Nash | 00:28:40–00:32:40
so that’s when we talk about linking user outcomes to things that will drive revenue, drive retention, lower churn, drive referrals — all those things that our businesses care about.

this notion of in-product software and usage analytics has been around for a few years, but it’s really gone through a renaissance of sorts.

there’s been a ton of studies on this. you can probably google and within five minutes find several more of these.

but one that kind of made me chuckle was from gartner — gartner does the hype cycle pretty much for everything — and as recently as 2018 they included software usage analytics in three different hype cycles that year, including the ones for saas tools and for customer support tools, and this product category showed up on all three of them at the “peak of inflated expectations.”

this is gartner-speak for: there’s something there, there are tons of expectations, it’s on the rise, but it really hasn’t proven its worth yet.

here we are two years later, and i can tell you — having implemented a couple of these tools commercially and having worked with a lot of other folks who chose other tools — i think this software is really starting to come into its own today.

in my experience — again from my seat — i haven’t really seen it fall into the dreaded “trough of disillusionment.”

i’ve seen it really level out into this plateau of productivity.

and i think the real key with these tools is not the tools themselves — it’s understanding how a craftsman uses their tools. that’s really where you get the most value.

Roger Snyder | 00:32:40–00:33:10
absolutely. and i think for folks who may not be familiar with the gartner hype cycle, just think 3d television — right — and that will give you an idea of where some hypes end up.

and that can be the end of that product’s journey. “where’s my flying car?” right?

David Nash | 00:33:10–00:36:10
exactly.

so there’s a ton of tools, right? you always know an area is exploding when the venture money pours in, the private equity money pours in, and there’s now a ton of companies.

these are just a few; there’s probably a dozen more companies that are in this area of so-called in-product user analytics.

these are some of the ones i’m most familiar with.

by the way, i lead an event here in portland called product camp — and perhaps you have product camps in many of your own cities — and so these are the kinds of sponsors that come to these conferences, because their tools are so valuable that they want to make product managers aware of them.

so there’s two takeaways here.

first off, in the third decade of the 21st century you should not build this yourself.

you may have some really smart engineers who say “i can build this.” i’m telling you: don’t do it.

these products are not commoditized — because they provide a lot of value — but there’s a competitive field of companies doing this.

do your homework, set your criteria, do a proof of concept.

do not build this yourself because you can find solutions off the shelf from some of these leading companies here, and you can find the one that will best meet your needs.

and they’re more affordable than ever.

the other thing i’d leave you with is a lesson that i had learned going through this: instrument everything.

all of these products rely on some sort of a javascript widget that you drop into your code — sometimes into every page.

it’s a little widget that is the sensor, if you will — the thing that reports back to the pendo backend or the amplitude or heap or mixpanel backend; they’re all fairly similar.

and the idea is that this is really the easy part — dropping in the code.

your developers will figure this out, you know, in less than a week. they’ll do a bunch of testing.

but there’s no reason not to measure everything.

you will find out over time you’re collecting some garbage data — you can throw away some pages that aren’t really part of the workflow, you’ll have extraneous data that’s part of your journey.

but don’t be cheap when instrumenting this, because this javascript should go everywhere.

it’s super easy. you may have to open some ports in your firewall in your data center, you may have issues around iframes and what the sensors see and what they don’t.

but my advice to you is: put it everywhere, because you can’t analyze data that you don’t collect.

so collect everything, and you’ll figure out before too long what’s the most important and what you can ignore.
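To make the “instrument everything” step concrete, here is a minimal sketch of what that drop-in tends to look like in a web product. The script URL, the initUsageAnalytics helper, and the visitor/account fields are placeholder assumptions, not any specific vendor’s actual API — each tool (Pendo, Amplitude, Heap, Mixpanel) publishes its own snippet.

```typescript
// Hypothetical drop-in sketch: load a vendor's analytics agent, then identify
// the visitor and account so every page and feature event gets attributed.
// The URL, global name, and fields below are placeholders, not a real API.

type UsageAnalyticsConfig = {
  apiKey: string;
  visitor: { id: string; role?: string };      // the individual user
  account: { id: string; planTier?: string };  // the paying customer (b2b)
};

function initUsageAnalytics(config: UsageAnalyticsConfig): void {
  const script = document.createElement("script");
  script.async = true;
  script.src = `https://analytics.example.com/agent.js?key=${config.apiKey}`;
  script.onload = () => {
    // Most agents expose a global once loaded; send the identify call then.
    (window as any).usageAgent?.identify(config.visitor, config.account);
  };
  document.head.appendChild(script);
}

// "Instrument everything": call this on every page load and let the backend
// sort out later which events actually matter.
initUsageAnalytics({
  apiKey: "YOUR-KEY",
  visitor: { id: "user-1234", role: "approver" },
  account: { id: "acct-987", planTier: "enterprise" },
});
```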

it’s definitely a journey.

Roger Snyder | 00:36:10–00:36:40
it really is.

Dimensions of User Engagement Overview

David Nash | 00:36:40–00:40:50
so let’s actually peel back the onion a little bit and introduce the dimensions of user engagement.

the reason there’s a pendo logo there on the lower left is that i used pendo very personally and deeply over the last three years and it’s a great product.

all these companies have good products, but i wanted to just call out pendo because it was really near and dear to me.

what i’m going to share with you now is very typical of all the products whose logos you saw on the prior slide, so these are representative areas.

by the way, if there are any other pendo users out there — or whatever tool you’re using — i’d love to hear about it in the comments; if you throw some comments in, that’ll be very helpful for us.

but you’ll see a strong affinity here for pendo if you’re familiar with that tool.

so these are some of the dimensions we’ll discover:

• breadth — which is really how many active users you have in your product.
• depth — which is how much of your product they’re using.
• frequency — which is how often they’re coming in and how long they’re staying.
• paths and funnels — particularly for our b2b software, when there’s more than one way to accomplish a particular goal, what are the ways customers find most helpful, and where are they getting stuck?
• feedback and sentiment — in-product feedback on specific features: is this feature useful to you right now, how can we make it better?

and asking customers in the moment, when they’re using it, is a huge breakthrough — once you have that capability, you don’t ever want to give it up.

and sentiment — these are things like nps, “how are you feeling at this moment — are you happy?”

what better time to ask customers than when they’re in the product versus three months later when they get an email campaign.

so, bottom line for all of this — particularly if you’re new to this — is: when someone says “are people using our product?” don’t accept “usage” as a metric.

“usage” is completely ambiguous.

“usage” shows a kind of lack of sophistication — and by the way, been there, done that, that’s where we all start.

but i think very quickly we’ll see that “usage” is almost meaningless when you really go through the different lenses of the elements of engagement.

Breadth & Depth of Usage

David Nash | 00:40:50–00:45:00
let’s peel through each of these quickly.

first off, breadth.

for b2c apps — consumer-facing apps — this is really as simple as: how many active users do we have? these may be paying users, these may be free users who are on a free version of your product.

for b2b it’s a little more interesting because, again, you have one customer — that customer may have hundreds or thousands or tens of thousands of users.

so: how many active users per paying customer do we have?

really, really important when you’re talking about customer health, if you have a customer success team.

there are a couple of key metrics here.

let me throw one out where, if you don’t have a starting point, start here:

a kpi i’d recommend is so-called daily active users or monthly active users.

think about this as: how many of customer x’s licensed users — who are physically able to use your product — have used the product in the last 30 days?

that’s a great metric. that metric may not be a perfect fit for you, but it’s a great place to start.
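As a rough illustration of that starting metric, here is a minimal sketch that computes 30-day active users per account from a raw login-event export. The LoginEvent shape is an assumption about what such an export might contain; the analytics tools mentioned above compute this for you.

```typescript
// Sketch: distinct active users per account over the last 30 days ("monthly
// active users"), computed from a raw login-event export.

type LoginEvent = { accountId: string; userId: string; timestamp: Date };

function activeUsersLast30Days(
  events: LoginEvent[],
  now: Date = new Date()
): Map<string, number> {
  const cutoff = new Date(now.getTime() - 30 * 24 * 60 * 60 * 1000);
  const activeByAccount = new Map<string, Set<string>>();

  for (const e of events) {
    if (e.timestamp < cutoff) continue; // outside the 30-day window
    if (!activeByAccount.has(e.accountId)) activeByAccount.set(e.accountId, new Set());
    activeByAccount.get(e.accountId)!.add(e.userId); // count each user once
  }

  // Reduce each account's set of distinct users to a count.
  return new Map(
    [...activeByAccount].map(([acct, users]) => [acct, users.size] as [string, number])
  );
}
```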

let me show you a real-world example of where an enterprise software company has done this.

here’s an account — account x — that has about 300 users, and looking at the breadth indicator we saw that, of those 300-ish users, about a third of them are logging in once a day, more than a third of them are logging in once a month, and then a little less than a third are logging in weekly.

the very first thing you ask yourself is “why?” what’s going on here? why are these three bands of users so different?

i don’t know what the answer is and you will not know the answer at first, but it’ll force you to ask the questions and look at the data.

there could be all kinds of reasons why your users have banded into these three very clear strata.

it could be as simple as different user personas. maybe some are work initiators or processors, some are reviewers, some are approvers.

there may be a multi-part b2b workflow, for example.

so anyway, this will force you to ask the questions.

the other place where we found this hugely valuable is: if you have a customer success team for your larger accounts, you can arm your csms with this data, take them back to the customers, and show the customer how they’re using your product, how they’re getting value from it.

and you can give them insights that they’re going to want to know because, as the buyer, they want to know that they’re getting value from your product and from their user base.

we found this to be tremendously helpful to feed this back to our buyers.

Roger Snyder | 00:45:00–00:45:40
i think it’s a great point and it’s also easy.

for example, your antivirus software at home sends you a monthly report showing how effective it was this last month. it’s a different category than this one, but i was just thinking about the fact that these feedback loops can be valuable for b2b as well as b2c.

David Nash | 00:45:40–00:47:10
no question.

and building on breadth we then move on to depth.

and depth — as the graphic implies here — is all about how much of your product are they using.

are they using 20% of your features while the other 80% may not be necessary? what are the features they are using the most, and which are they using the least?

which features are most valuable for specific user segments?

for example, in the prior chart for those three different bands of users, they may not be using the same exact features.

which ones get regular use and — if you really care about retention and churn, which you all should — which are the most sticky ones that are going to keep your customers coming back time and time again to your product?

again, as a starting point for a metric: identify five-ish — probably not less than that and probably not a whole lot more than that — features that have been validated as leading retention indicators.

that can be your early warning as to customers who are being successful versus customers who are struggling.

David Nash | 00:47:10–00:48:10
here’s another real-world snapshot:

9 percent of features from customer x in product y were getting 80 percent of the clicks.

i can’t emphasize how often this happens in b2b software, for the reasons i mentioned earlier.

this is really telling.

this forces you to say: hey, what’s going on here? what should we do about it?

we can really do a couple of things here.

we can say: make sure, if those 9 percent of all features are really where all the value is, that we make them easy to use — take all the friction out of it.

maybe those features are worth charging more for.

maybe you can remove or hide some of those other features over time — and, if no one’s using them, they won’t notice when they’re gone.

very, very key analysis here, with specific actions you can take from it.
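For readers who want to reproduce that kind of concentration analysis themselves, here is a small sketch under the assumption that you can export click counts per feature; the feature names and numbers are purely illustrative.

```typescript
// Sketch: given click counts per feature, find the smallest set of features
// that accounts for ~80% of all clicks (the concentration pattern described
// above). Feature names and counts are illustrative only.

function featuresCoveringShare(
  clicksByFeature: Record<string, number>,
  share = 0.8
): string[] {
  const total = Object.values(clicksByFeature).reduce((a, b) => a + b, 0);
  const sorted = Object.entries(clicksByFeature).sort((a, b) => b[1] - a[1]); // most-used first

  const top: string[] = [];
  let covered = 0;
  for (const [feature, clicks] of sorted) {
    if (covered >= share * total) break; // already explained the target share
    top.push(feature);
    covered += clicks;
  }
  return top;
}

// Example: two of five features carry the vast majority of usage.
console.log(
  featuresCoveringShare({ search: 5200, export: 3100, bulkEdit: 900, adminPanel: 300, legacyReport: 80 })
);
```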

Frequency & Cohort Analysis

David Nash | 00:48:10–00:52:45
with breadth and depth under our belt, let’s move on to frequency.

frequency — you may think at first — is similar to breadth, but it’s a little different.

this is not just “how many daily active users do you have?”

when users do log in, how much time are they spending? what accounts are they from?

here again, a very key simple metric to get started is: let’s look at the number of logins or time spent across users for a given customer over 30 days.

one of the most important and common analyses that you’ll do — particularly if you’re running a subscription business — is something called cohort analysis.

so here we looked at a cohort of users that had about six weeks under their belt.

and over those six weeks we noticed that, by week six, only 3% of the users were coming back.

you might say, “oh my god, is this the world’s worst product?”

no — this was a very successful commercial product.

but again, what’s really fascinating is that different users were coming back for different reasons, and this is not uncommon for b2b software.

so this made us ask the questions: what’s going on here, what should we do, what’s causing that?

there could be all kinds of reasons.

maybe they were product tourists.

maybe when you onboarded that cohort of customers six weeks ago, something in the product really had too much friction — it was tough to find, people didn’t know that a feature was available, hence they couldn’t use it.

maybe your implementation team was struggling with this group of customers, maybe they didn’t do the job they would normally do in getting your customers provisioned and implemented.

maybe it was a seasonal business.

maybe we’ve had a covid-19 situation and your product’s not being used for the last 90 days.

maybe it’s a poor-fit group of customers for this specific product because of geography, vertical market, etc.

you don’t know until you look in, have some hypotheses, and try to take some actions based on what you’re learning.

so: hugely, hugely valuable, cohort analysis.
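Here is a sketch of what a weekly retention cohort computation looks like, assuming a simple usage-event export: of the users whose first activity fell in week 0, what fraction was active again in each of the following weeks? (Commercial tools build these cohort tables for you.)

```typescript
// Sketch of weekly cohort retention: of the users whose first activity fell
// in week 0 (relative to cohortStart), what fraction came back in weeks 1..N?

type UsageEvent = { userId: string; timestamp: Date };

function weeklyRetention(events: UsageEvent[], cohortStart: Date, weeks = 6): number[] {
  const weekOf = (d: Date) =>
    Math.floor((d.getTime() - cohortStart.getTime()) / (7 * 24 * 60 * 60 * 1000));

  // A user belongs to the cohort if their first observed activity is in week 0.
  const firstSeen = new Map<string, number>();
  for (const e of events) {
    const w = weekOf(e.timestamp);
    if (!firstSeen.has(e.userId) || w < firstSeen.get(e.userId)!) firstSeen.set(e.userId, w);
  }
  const cohort = new Set([...firstSeen].filter(([, w]) => w === 0).map(([id]) => id));

  const retention: number[] = [];
  for (let w = 1; w <= weeks; w++) {
    const active = new Set(
      events
        .filter((e) => weekOf(e.timestamp) === w && cohort.has(e.userId))
        .map((e) => e.userId)
    );
    retention.push(cohort.size ? active.size / cohort.size : 0); // e.g. 0.03 = 3% returned
  }
  return retention;
}
```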

David Nash | 00:52:45–00:54:05
and then another view on cohort analysis — again real-world, from a company that discovered this:

you could look at accounts, for example, that have not logged in at all over the last 90 days, and that may not have a csm on them — and these are the companies that are not going to renew your product.

again, this is not a good day when you have, let’s say, over a million dollars of arr — annual recurring revenue — that’s at risk because you didn’t know that customers weren’t using your product.

now, for a company that has hundreds of enterprise accounts and maybe a thousand mid-market accounts, you might say, “well, who cares about 14 accounts?”

well, your ceo and your cfo will care about a million dollars of arr that you didn’t know was about to walk out the door.

so: very, very key to look before you’re the victim of churn.
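A minimal sketch of that churn early-warning view, assuming you can join usage data to account records: flag accounts with no logins in the last 90 days and total the ARR at risk. The field names (arr, lastLoginAt) are assumptions about what that join might expose.

```typescript
// Sketch: accounts with zero logins in the last 90 days, and the total ARR
// those silent accounts represent. Field names are placeholder assumptions.

type Account = { id: string; name: string; arr: number; lastLoginAt: Date | null };

function arrAtRisk(accounts: Account[], now: Date = new Date()) {
  const cutoff = new Date(now.getTime() - 90 * 24 * 60 * 60 * 1000);
  const silent = accounts.filter((a) => a.lastLoginAt === null || a.lastLoginAt < cutoff);
  const totalArrAtRisk = silent.reduce((sum, a) => sum + a.arr, 0);
  return { silentAccounts: silent, totalArrAtRisk };
}
```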

Paths, Funnels, Feedback & Sentiment

David Nash | 00:54:05–00:59:15
so we’re plodding along here. let’s move through paths and funnels.

paths and funnels — as their name implies — are all about when users are in your product:

• where are they going?
• how are they getting from point a to point b?
• where are there multiple ways to get to the same finish line?

very often there are.

what causes users to abandon a task? what does success look like?

if you’re going to gamify — and we’ll get back to that in a future topic — how do you reward people moving from one plateau of usage to deepen their engagement and use more advanced features?

how long does it take to complete a typical process?

so this is really a very effective way to measure your product’s efficiency and usability.

going from point a to point b, and the places where paths converge — that’s what we capture with this notion of paths and funnels.
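To make the funnel idea concrete, here is a small sketch that counts how many users reach each step of a task, in order, from a stream of step events. The step names and event shape are placeholders; path and funnel reports in commercial tools cover this out of the box.

```typescript
// Sketch of a simple funnel: for an ordered list of steps, how many users
// reached each step in order, and where do they drop off? Step names and the
// event shape are placeholders.

type StepEvent = { userId: string; step: string; timestamp: Date };

function funnelCounts(events: StepEvent[], steps: string[]): Record<string, number> {
  // Earliest timestamp per step, per user.
  const byUser = new Map<string, Map<string, number>>();
  for (const e of events) {
    const perStep = byUser.get(e.userId) ?? new Map<string, number>();
    const t = e.timestamp.getTime();
    if (!perStep.has(e.step) || t < perStep.get(e.step)!) perStep.set(e.step, t);
    byUser.set(e.userId, perStep);
  }

  const counts: Record<string, number> = Object.fromEntries(
    steps.map((s) => [s, 0] as [string, number])
  );
  for (const [, stepTimes] of byUser) {
    let prev = -Infinity;
    for (const step of steps) {
      const t = stepTimes.get(step);
      if (t === undefined || t < prev) break; // user never reached this step in order
      counts[step] += 1;
      prev = t;
    }
  }
  return counts;
}

// Example: where do users abandon a "create and share a report" task?
// funnelCounts(events, ["openReportBuilder", "addDataSource", "runReport", "shareReport"]);
```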

and two more dimensions i’ll combine together are feedback and sentiment.

first, in-product feedback for specific features.

how many of us as product managers have done research and surveys and user groups asking about: “is feature a or feature b more powerful, more valuable? if you can only get three things on this roadmap instead of the ten things you want, which are the three and why?”

these things — which are our lifeblood in product — we can now do better than ever by doing it in-product.

same thing for sentiment.

here’s a couple of quick examples.

you just used feature x. what better time to ask a customer, “is this feature working for you? can this feature be made any better?”

they’re in the moment and they’re going to give you the best possible feedback, versus being asked by an email survey months from now.

and, on the sentiment point of view, you could argue whether nps is waxing or waning now — it’s still very popular — but the bottom line is: what better time to ask about sentiment than when you’re in-product as well?

so these are key things that you just couldn’t do before the advent of all those kinds of tools.

Roger Snyder | 00:59:15–01:00:05
these tools are amazing.

years back when i was doing path analysis for a mobile app, we had none of this available — so all we could do was guess.

with the current crop of tools, not only do you see where people are getting stuck, you can actually ask them right there, in the moment, “what’s slowing you down?”

that’s hugely powerful.

David Nash | 01:00:05–01:01:15
exactly.

and speaking of asking questions, roger’s going to open another poll for you now.

of these dimensions that we just covered — again: breadth, depth, frequency, paths and funnels, feedback and sentiment — if you’re already using one of the commercial tools that i mentioned earlier, or perhaps one i didn’t mention, or if you’re on the cusp, which of these do you think would be most valuable to you right now?

and if you had to start, which ones would you start with?

and again, check as many as applicable here.

Segmentation & Getting Started with KPIs

David Nash | 01:01:15–01:05:10
so, as those poll results come in, let me talk briefly about segmentation.

as product managers, we live and die by segmentation, right, because our customers and users are not homogeneous.

and likewise with all the data you’re looking at and some of the quick screenshots that i shared from historical data — here are some of the lenses to look through to really understand how different groups of users might have very different journeys and levels of engagement in your product.

for b2c products and for b2b products, these are just some examples:

• customer size and revenue bracket
• role and permissions in the app
• geography
• device type
• industry or vertical
• lifecycle stage (onboarding, mature, at-risk)

segmentation is critical here.

and then we get to the question: “where do i start?”

all of us have to reckon with this question when we start, because we don’t know what “good” looks like.

there’s a lot going on here, but if we look at this by the rows — or swimlanes — we can look at breadth, for example, and set some guidance of what would distinguish a “champion” customer.

maybe: more than 10 active users.

from a “healthy” customer: maybe 5 active users.

a customer that has room to grow: 2 to 5.

or an “at-risk” customer: less than 2 active users.

if we look at features being used, champion customers — role-model customers, testimonial customers — tend to use nearly all your key features.

healthy customers use maybe three of the key ones.

a customer with room to grow, who might need some attention, might use only one or two of the key features.

and of course, if they’re at risk, they’re only scratching the surface.

and, not surprisingly, frequency of use and time in product is another very key indicator.

look, the bottom line is: you’re not going to know at first, because you probably haven’t measured this.

so you’re going to start, you’ll likely be wrong — but get started.

and once you get started you’ll rapidly see, over the course of probably 30 days, kind of where the high- and low-water marks are.

and this is the way you’re going to get started and then be much more data-informed about the decisions you’re making in your product.

get those indicators set, get those thresholds set, and then plow forward.
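To show how those starting thresholds might be wired up, here is a minimal sketch of an account-health classification. The cut-offs and field names are placeholder assumptions, meant to be revised once 30 days of real data show where your high- and low-water marks actually sit.

```typescript
// Sketch: turn rough starting thresholds into an account-health label you can
// revisit once real data comes in. Thresholds and fields are placeholders.

type AccountEngagement = {
  accountId: string;
  activeUsers30d: number;   // breadth
  keyFeaturesUsed: number;  // depth (out of ~5 validated key features)
  loginsPerUser30d: number; // frequency
};

type Health = "champion" | "healthy" | "room-to-grow" | "at-risk";

function classify(a: AccountEngagement): Health {
  if (a.activeUsers30d > 10 && a.keyFeaturesUsed >= 4 && a.loginsPerUser30d >= 8) return "champion";
  if (a.activeUsers30d >= 5 && a.keyFeaturesUsed >= 3) return "healthy";
  if (a.activeUsers30d >= 2 || a.keyFeaturesUsed >= 1) return "room-to-grow";
  return "at-risk";
}
```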

Plans for Implementation & Q&A Highlights

Roger Snyder | 01:05:10–01:06:05
all right, i’m going to launch this next poll and keep it running while we take some questions.

we’ve got a lot of great questions today, david, so we won’t have time for all of them. i’ve tried to kind of bundle a few of them together.

the first questions are around the areas of: “is this kind of methodology or these tools available for native apps?”

i think there’s a quick answer for that one.

David Nash | 01:06:05–01:07:10
yes — overwhelmingly yes.

these tools are available.

many companies out there have started in the web world and some have expanded; some started mobile-first.

they’re available for native apps — mobile apps on ios, on android.

so overwhelmingly yes.

Roger Snyder | 01:07:10–01:09:05
another couple of questions here are: “okay, so how do i do this for manufactured products — non-digital products? what approaches could i use to gather these kinds of kpis for non-saas model software or physical products?”

two different areas.

David Nash | 01:09:05–01:11:00
good questions.

there is a plethora today of iot products which may not have a traditional user interface.

they may be things like your nest thermostat or your camera.

lots of privacy issues, by the way, particularly if you’re in healthcare with hipaa and all that.

but if your product has any connectivity back to a central backend — maybe for maintenance, maybe for updates, maybe for just sending instrumentation — then you can embed a sensor which shows usage there.

your customers are going to want to know you’re doing it, they’re going to want to opt in.

let them know it’s for usage-collection purposes only, and overwhelmingly customers opt in.

if your product is not fundamentally connected as an iot device or as a device with an actual user interface, it’s going to be a lot tougher.

i know for manufacturing areas in hostile environments, sometimes products have data loggers on them locally — but that requires sending out a person routinely to go check those data loggers, if they have no other way to phone home.

you can do it; it just makes it a lot tougher and more expensive and time-consuming.

otherwise, you only have the conventional, and often inaccurate, way of just asking users.

Roger Snyder | 01:11:00–01:12:20
i want to answer that a little bit as well.

there are clever ways to get some customer engagement going for non-digital products — even consumer packaged goods.

you can start clubs so people want to join and get information — but they can then participate in surveys for you.

you can put a qr code on packaging that allows people to say “hey, you’ve got feedback? scan this qr code.”

they can join a club of loyal users who want to give feedback or answer a one-time survey.

and for anybody who joins, you then have the opportunity — and you’ve got to do this in a kind, intentional way, not just spam folks — to check in with those customers periodically via email, on a regular cadence that matches the usage patterns for your products, and ask them to participate in follow-up polls.

that way you’re able to start trying to create the feedback loop and a way to get this input from your users — and it can be for a box of cereal as well as a manufactured device.

there’s a lot more ways now to create a digital touchpoint to go along with your physical product.

David Nash | 01:12:20–01:13:25
that’s great.

the only thing i’d add to that — that’s well said, roger — is that customers really just want to be heard.

i’ve found this overwhelmingly.

they expect no other reward or motivation except they want to be heard, and that is its own reward for them.

so, yeah, take advantage of that.

Roger Snyder | 01:13:25–01:14:50
a couple more questions are kind of in the vein of: “what’s the best method to gather feature-level feedback? is it in-person, is it survey, in-app, all of the above?”

i think we’d probably say: use the tools you suggested in that category.

David Nash | 01:14:50–01:15:45
yes — certainly asking in situ, or in-product, as we indicated a couple of slides ago, is a really key way because it’s top of mind then.

all those other areas you mentioned — user groups, surveys that are individual and collective — are viable and helpful.

the non-quantitative, qualitative discussion about what they’re trying to accomplish and how they feel about it is really, really key.

Roger Snyder | 01:15:45–01:17:05
one last question was: “are there any good comparison resources you recommend to choose between these tools?”

and i think the short answer is: read voraciously, ask peers, talk to professionals in your space, and learn from their experience.

David Nash | 01:17:05–01:18:05
yes — another really great question and one we won’t have sufficient time to answer now.

i would just say: read voraciously. that’s what i did back in 2016–2017 when i selected the tool that i did at that time.

in the last three or four years there’s been the second and third generation of those tools and a bunch of new ones.

if you look for “in-product user analytics,” you’re going to have a ton of scholarly articles, blogs, and case studies.

it’s much easier to do that journey now than it was three years ago.

and, as you do when you’re buying a car or other major purchase, ask friends, talk to peers in a similar business and see what their experiences were.

Key Takeaways & Closing

David Nash | 01:18:05–01:20:10
so let’s kind of take her out.

we hit a lot today in the last 50 minutes or so, so let me just recap some of the key points:

• all that great stuff from the build–measure–learn loop and the viable–valuable–feasible model is essential to our practice of building great products — but it’s all theoretical until we validate it.
• the build–measure–learn loop is not just for startups anymore. if you’re in an enterprise and you’re not using this methodology, you’re missing out on something that’s very simple and elegant and effective.
• user engagement is multi-dimensional, so don’t accept someone saying “what’s our usage?” — we can do better than that.
• there are lots of tools available. do your shopping, pick one, and you’ll really be amazed at the stuff you’ll know that you didn’t know till now.
• segmentation, segmentation, segmentation — it’s always been important, and it’s still important for interpreting your analytics.
• and again, if you are thoughtful, restrained, and careful, customers not only don’t mind in-product intervention — they’re actually grateful for it when you help them in a moment of need.

let go of the fear, but again: be restrained, be thoughtful about it, and intervene when necessary.

Roger Snyder | 01:20:10–01:21:25
very good. i’m going to put a link into the chat right now for part 2 of this series — mastering digital product management, part 2: onboarding and gamification — which we’ll deliver next month.

i encourage you all to sign up for that.

we’ll continue the conversation, and we’ll try to answer more of these questions offline as well.

thank you so much for joining us today. we hope you found this useful, and we look forward to continuing the conversation next month.

David Nash | 01:21:25–01:21:40
thanks everybody, see you soon.

Roger Snyder | 01:21:40–01:21:45
bye.

Webinar Panelists

David Nash

B2B SaaS Executive | FinTech, PropTech, GRC & Automotive | Proven leader in scaling, optimizing & integrating high-impact product organizations.

Roger Snyder

Principal Consultant at Productside, blends 25+ years of tech and product leadership to help teams build smarter, market-driven products.

Webinar Q&A

A Digital Product Management user journey is the end-to-end path customers take inside your product—from first click to task completion. It is essential because tracking real-time usage patterns helps PMs identify friction, measure value delivery, and intervene thoughtfully to improve outcomes. Understanding the user journey allows product teams to reduce churn, increase engagement, and connect product behavior to measurable business results such as retention, NPS, and expansion.

Digital Product Managers track in-product user behavior using tools like Pendo, Mixpanel, Amplitude, or Heap to instrument features, map funnels, and analyze engagement. These platforms capture actions such as clicks, time-on-task, adoption of key features, and user paths—giving PMs the data needed to learn how products are used as they’re being used. This visibility eliminates “flying blind” and supports evidence-based product decisions.

Core Digital Product Management engagement metrics include breadth (active users), depth (feature adoption), frequency (login patterns), paths & funnels (task progression), and sentiment & feedback (in-app surveys, NPS). Tracking these multidimensional metrics helps PMs identify which features drive value, where users struggle, and how engagement relates to outcomes like retention, churn reduction, and product-market fit.

PMs intervene using contextual in-product guides, prompts, and help overlays triggered at the right moment—only when they improve user success. Examples include onboarding nudges when a user stalls, tooltips explaining next steps, or feature education pop-ups when users try something new. Effective intervention is subtle, personalized, and designed to increase value realization, not add friction.

Real-time usage tracking enables Product Managers to replace assumptions with actionable data. By understanding which features correlate with retention, which user segments struggle, and how workflows actually occur, PMs can prioritize work that drives measurable business outcomes. This includes reducing churn, improving onboarding, identifying expansion opportunities, and validating product-market fit based on actual behavior—not guesswork.