
Content provided by Steve Portigal. All podcast content including episodes, graphics, and podcast descriptions is uploaded and provided directly by Steve Portigal or their podcast platform partner. If you believe someone is using your copyrighted work without your permission, you can follow the process described here: https://sv.player.fm/legal.

41. Carol Rossi returns

1:02:49
 

In this episode of Dollars to Donuts, Carol Rossi returns to update us on the last nine years. She’s now a consultant who focuses on user research leadership.

I’m happy making the contributions that I’m making, even though it’s hard to directly measure impact. I’m hearing from people that they’re finding value from the work that we’re doing together. I want to leave people with this idea that the work that we’ve done together is valuable to them, whether it’s tomorrow, or two years from now they see value in it in some way that they couldn’t have anticipated. I’m trying to be as clear as I can about focusing in the areas where I think I can make the best contribution and have the most impact. And keep reexamining, how do I feel about the work that I’m doing? And what am I getting back from people? – Carol Rossi

Show Links

Help other people find Dollars to Donuts by leaving a review on Apple Podcasts.

Transcript

Steve Portigal: Welcome to Dollars to Donuts, the podcast where I talk with the people who lead user research in their organization. I’m Steve Portigal. In this episode, I catch up with Carol Rossi, nine years after she was first on Dollars to Donuts.

There’s a bigger and better new edition of my classic book, Interviewing Users. As part of the launch of the book, I spoke with Russ Unger for his Tent Talk speaker series. Here’s a little clip.

Russ Unger: What’s your approach to ensuring that the feedback gathered from user interviews is effectively communicated and incorporated into the design process?

Steve: The first part of that I think is that you have to do something. You have to make sense of what you gather. Some of this kind of goes to the maturity of any individual practice. I think the less experienced folks are, the more they want to just take what they remember about what was said and type it up. And that verb is, that’s stenography maybe, or collation as far as you get. You put these pieces together. And then you’re just taking requests or gathering complaints. You might as well use a survey for that. I think it’s the iceberg model, right? Some of it is above the surface, but a lot of it is below the surface. Below the surface means going back to what was said and looking at it and making inferences. What wasn’t said? How was it said? What was said at the beginning and what was said at the end? And that’s just within one interview. What did person A say? What did person B say?

And there’s a whole new chapter about this. It’s the analysis and synthesis process. And some folks say that the ratio should be two to one. For every hour of the feedback that you gather, you should spend two hours analyzing and synthesizing. And I think in a less evolved practice, it’s the inverse. You might spend half an hour for every hour or even less. The caveat here is not every research question merits this. If we are looking for, I don’t know, choice preference between something and something else, we might be really clear about what that is. We come back and say, do this.

But for anything where we want to understand why or understand opportunities or understand motivation, a new space you want to go into, characterize a customer that we haven’t worked with before, it really is worthwhile to go and do this analysis and synthesis.

How do we have impact? We have to have something impactful to say. I just want to say that. Some other factors that I think can make or break it is working collaboratively with stakeholders, the folks that you want to inform, influence, take action, before you do the research. And so having an understanding of what the business challenges are or business goals, like what are we trying to do as a company? And then formulating really good research questions. What are we going to learn in order to inform that? And then choosing methods and approaches that can support that. And not doing that in a vacuum. And then this has the effect of switching your role from being reactive to proactive.

I think it’s hard to have an impact with reactive work. Those requests that come are often late. They’re often based on a shallow assumption about what kind of value research can provide. And so you are going to give a thumbs up, thumbs down in some direction. So your sort of role as a provider of these kinds of insights is diminished. If you can be proactive, which means maybe understanding a roadmap or what decisions are being made or who else is going to do what and proposing research on your own roadmap that is intentional and is ahead of time, you leave space, of course, for things that come up, fire drills and so on.

But trying to work in a proactive, collaborative way, aligning on goals and then putting the effort in to make sense of it, changes the whole conversation about what you’ve learned by the time you get to that point of sharing with somebody.

That’s part of a larger Tent Talk. You can check out the whole show and definitely buy your postal carrier and barista their very own copy of the second edition of Interviewing Users. If you want to help me out, write a very short review of Interviewing Users on Amazon.

Over the last couple of years, I’ve been partnering with Inzovu to run training workshops about storytelling. Storytelling is an essential human skill that powers how teams work together with each other and with their colleagues. I’ll put a link in the show notes with more info about what I’ve been up to with Inzovu. And if storytelling is something you’d like to build up in your organization, reach out to Inzovu or to me.

Okay, let’s go to my conversation with Carol. She’s a consultant with a focus on user research leadership. Carol, welcome back to Dollars to Donuts after nine years since we last talked. It’s great to talk to you again.

Carol Rossi: Yeah, thanks, Steve. I can’t believe it’s been nine years.

Steve: Time does fly. Let’s talk about those nine years. You know, what’s been the shift in your evolution in your professional world since then?

Carol: When we last talked on the show, I was at Edmunds and I was leading the UX research team there. And I had been there at that point, I guess, four years. I had started the team there and then ended up staying at Edmunds until 2017. And then I took a moment, because I’d been there for quite a long time, and took a moment to kind of ask myself what I wanted to do next. I call it my gap year. So I did some contract work as well as consulting, helping people think about how to set up a team.

And then in 2018, I went to NerdWallet and that involved a move. So I was in LA for the bulk of my career. In 2018, I moved to San Francisco for the job at NerdWallet. And that was an established team that I led for about four years. And I mean, we can go into detail about any of this stuff, but basically I left NerdWallet in 2022 and started a consultancy where I’m now focused on helping companies, helping leaders know how to get the most impact from research.

Steve: Can we talk about NerdWallet a little bit and then talk about your consulting work now?

Carol: Yeah, Sure.

Steve: So it was an established team. Is that right?

Carol: Yeah, it was. So there were three people on the team. There was actually an open headcount when I joined. We ended up doubling the size of that team. So we still remained a relatively small team, but we did get some additional people. We actually, I think some of the work that I’m really proud of there is that we went from having these researchers doing sort of very siloed work, or even though they were all researchers, they were hardly really working with each other even. And then developed that team to the point where we had a lot more strategic impact. We started a voice of customer program. Two of the people on the team became managers during the time that I was there. So they saw a fair amount of professional growth. And when I left, again, there was this voice of customer program established, as well as a program to train designers and PMs and content managers to do some of their own research. We had, well, on the market research side, they were doing some brand work. We were doing some kind of explorations about how that played out in product. So there were more things that are more sort of horizontal activities we were doing, and also empowering some people to collect their own insights, as well as deepening the impact of our team.

Steve: When you talk about coming in and the researchers that were there were siloed, my mind starts to go to that embedded word and what that means. But I think you’re talking about siloed in a grander scheme of things. But I don’t know, what does siloed look like then?

Carol: I think it’s a really good distinction. The difference between siloed and embedded to me is that embedded can be and is a very valuable way to participate in a product development team.

So it’s like, and we ended up with this sort of hybrid model, I would call it. Because at the time that I left, the team was reporting to ultimately me, but they were dedicated to specific focus areas. So we had one person working on the logged in experience and that involved maybe three pods. We were calling them pods, but squads, whatever, product trio, whatever language we use to talk about the combination of the PM, the designer, the content strategist, and some number of engineers. So we’d have one researcher per, let’s say, three of those pods, but they were all within a focus area.

So one was dedicated to the logged in experience. We had, for example, a couple people working on what we call the guest experience or shopping. So if you’re looking for, so I should say NerdWallet is a company that provides advice and products to consumers who might be looking for financial products. Consumers might be looking for a credit card or a mortgage or a personal loan or whatever. So you can either go and read some articles and then get linked to some potential credit cards for you based on what you’re interested in and your credit score and those kinds of things. Or you can download the app, log in, and get tailored advice based on your specific situation. So those are, at the time, were separate areas of the company in terms of the way the development was divided up.

So I think embedded to me is there’s a very healthy relationship with those pods where the researcher is either dedicated to one or maybe crosses over a couple of those areas, of those pods. But siloed to me is people are working on something so exclusively that maybe there isn’t a lot of conversation across. And I think what you lose in that kind of model is opportunity to take advantage of research that might be going on in an adjacent area or even a very different area but has relevance to what you’re doing.

And so you can have a lot more efficiency across the research function if you’re not re-doing work, you know. Or people are learning techniques from each other, you know. Or people are partnering so that there’s some broader impact across these different focus areas. So there might be — because to the consumer, to the ultimate user, the customer, they’re not seeing it, right, as these sort of separate areas. They’re seeing it as one experience. And sometimes in order to do product development, you have to divide things up.

So how do we keep the flow and the things that need to be similar across the experience and have it make sense by looking for those areas of, you know, similarity or continuity or whatever the word is that you want to use there. Some of the things that we did that worked really well were have just — so first of all, I should just be really clear. Because it was a manageable team, I mean, small enough team, we could do things like have team time every week where researchers felt like they were able to have a dedicated, you know, I think it was an hour or something, but a dedicated time where they could talk about some of the stuff they were doing, present problems to each other, learn from each other, like have time to be able to say, I’m doing this thing, I think there might be some relationship to what you did last year or what so-and-so did who’s not even here anymore and what can we talk about there.

So I think there’s — with a small enough team, you can definitely have people, you know, embedded or partially embedded within specific areas so they’re having maximum impact in those areas, but still conversation across. So I think that’s one thing that we did. Another thing we did was have kind of a loose repository. We weren’t using a really fancy tool. We just literally had, you know, a wiki where all of the research that was done was available. So people could go in and see what had been done and see if there was something relevant to them. And that could be like product managers, designers, anybody could go in and look at that and see. And then they’d usually come back and ask us questions. Hey, I saw this thing, you know, I wonder how that can be relevant to our team. So I think there are a few things that you can do.

Steve: You mentioned that you put in programs to teach other folks who are not career researchers to do research. What did that look like? How did that work?

Carol: I think the way that I’ve seen that work well is to create, when I’ve created it, a three-part workshop series. And so we start with these three workshops and then we do ongoing coaching. So it’s not just a matter of taking a, you know, a training session. And the first workshop is really setting up the research for success. And so that’s really about planning the study. So there we talk about starting with the business objective, you know, like people will often start with a research question. Like we need to know X. Okay, well, why do you need to know X? Like there’s some business reason why you need to know it. So what’s the thing you need to know? Why do you need to know it? What decisions will be made as a result of that? And then what’s the best way to get that answer? Obviously, you know, in what timeframe do you need to know it, and those things as well.

But starting with that framework gives people an appreciation for the fact that we don’t just run a study because we have a question. We kind of put context around it. Even if it’s lean, and I’m using the language of run a study, but the language that some people are using is having conversations with customers or collecting insights, whatever language people are using, it’s the same thinking. And in that first workshop, we talk a lot about reducing bias, making sure we’re not asking leading questions, or, you know, the way that we’re writing a task or a prompt that we’re going to put up on an unmoderated tool for a participant to engage with. We talk a lot about how to do that in a way that those are going to be effective. And by the end of the first workshop, everybody has a lightweight research plan. I give a template. So there’s a template that has all those elements in it. And there are a lot of tips and tools and sample questions and sample tasks. So it’s pretty plug and play, but the foundational understanding is there in terms of, you know, not introducing bias and some of those other elements.

The second workshop is literally run a study. So when I was at Edmunds and we were doing in-person research, we would recruit a bunch of participants to come in and we’d have designers, PMs, engineers running their own interviews and, you know, we’d sit and give feedback often. Now what we do is all unmoderated. These workshops are all online now, remote. So, you know, it’s an unmoderated tool. They set up their study in the tool, and then we, you know, wait for the results to come in, the videos or whatever.

And then the third workshop is, in researcher language, synthesis. And that’s the, like, how do you go from all this data that you just got to actionable insights? So we talk about the data. We talk about findings that come from there. We talk about the insights that are really most important. And then we talk about prioritizing those insights according to the business objective, back to the business objective, back to the decisions that need to be made. What are the most important of all of those insights? ’Cause you might get a lot of stuff, you know, out of even a lean study. What are the things you need to take action on? And then we talk about taking action. And again, there’s a template for summarizing the findings of their study, but there’s a table that shows, like, what was the insight? Okay, it was high priority, so we’re going to take action. We’re going to do this thing. Who is going to do this thing? It’s assigned to a team. Who’s the point person? Maybe it’s the PM on the team. Maybe it’s the designer. By what date is this thing going to be done? So everybody on the team has now agreed, beyond the person that ran the study. They go back to the team, have the conversation, everybody has agreed: here’s what we’re going to do as a result. And then that table goes into their summary. And then there’s a way to go back. If the person that’s running the study is not one of the people on that team (in this case, they probably are, because they’re the designer or the PM or whatever), you can go back and see what was actually done. When was it done? What impact was gained by that study? And you can then add the impact to your impact tracker.

Then there’s the coaching that happens after the training. And that’s really vital to help people sear in the knowledge from the training and get feedback as they go along and ask questions. So sometimes the designer will go to the researcher who led the training and ask, will you take a look at my plan ’cause I’m going off the template a bit and I wanna make sure this makes sense. Or they have a question about synthesis because they get something that they didn’t anticipate and they wanna talk through how to do it. Or they need help figuring out how to message something to somebody that wasn’t on the team but needs to get these insights. So there are things that come up, and sometimes it’s feedback on something that they’re doing. Like when I was at Edmunds and we were doing live interviews, we’d actually have a conversation after each set of interviews with the people that were running them: how did you think that went? And if we saw something that maybe they could benefit from, we would share that with them. So I think that’s a really important part of it and something I incorporate into the workshops that I do. It’s not just the training, it’s the follow-up coaching as well.

Steve: I think there’s a lot of hand-wringing off and on over the years about the risks and the consequences of, what did you call them? Non-career researchers, that’s a great term.

Carol: People who do research, I think is what people are saying now.

Steve: You know, we talk about the consequences of these kinds of programs that allow non-career researchers or people who do research. If we empower them as we’re kind of sort of the gatekeepers of the skills and the knowledge to do research, which may not even be an accurate framing anyway, ’cause people are doing research anyway.

Carol: Yeah.

Steve: here’s sometimes some hand-wringing about unintended consequences or intended consequences. I don’t know, with these programs of these different organizations, were there longer term kinds of changes that you noticed?

Carol: Yeah, I think it’s a good question. And I’ll just say, I don’t have that argument anymore with people. I have stopped trying to defend how this can work and how I’ve seen it work well, because the fact is, I’m just really realistic. First of all, I’ve seen it work well in this way that we talked about where there’s this sort of training set and then this sort of coaching activity, and there’s a conversation. It’s an ongoing conversation. And so what I’ve seen work well, one of the things that I’ve seen come out of that that’s been really beneficial is that people who have gone through this program tend to have a better sense, when we’ve been in-house, tend to have a better sense of how to work with research and have a better appreciation for the research that the researchers are doing, the career researchers are doing. And that partnership is richer. I have seen it go awry. I’ve seen people go through a few workshops, refuse the coaching, and then do things like put an app in front of consumers and say, “Do you like it?” So it’s not without risk. I totally get that.

At the same time, I’ve stopped having that discussion with the researchers that are worried about the field being diluted, or I’ve stopped using the word democratize, ’cause we’re not democratizing. We’re helping people do stuff that, frankly, they’re already doing. So why wouldn’t we help them do it better? So I think what, and I see it now in the consultancy, I’m really focused, if I say that I’m focused on helping leaders in companies that maybe don’t have a research leader, or maybe they’ve got one or two researchers, or no researchers, and they’ve got all of these other people out having conversations with customers, why wouldn’t I want to help them do that in a way that it’s gonna be more effective, where they’ll get good data? Because we all know that if you just go out and put an app in front of somebody and say, “Do you like it?” You’re not gonna get, it used to be garbage in, garbage out, right? That language still applies decades later.

So yes, there are risks. I know what the risks are. I think I named one of them anyway. People just go, “Well, why can’t I do persona research?” Or whatever, probably not the best example, but just helping them realize there are things that you need to know, and I get that you need to know those things, and you’re probably not gonna get what you’re looking for with this method. And so having those conversations, it doesn’t mean that once I leave, they’re not gonna try to do that anyway. I can’t control that. Even if I’m in the company, I can’t control that.

So I think the risk of people who do research or non-career researchers doing this just without any guidance is greater than the risk of them thinking they can do something that they really need a career researcher for. And I think it’s not, this is not unrelated to, I mean, it’s a bit of a tangent, but it’s not unrelated to the thing that we see where companies think they want research and they hire someone. I’m seeing, I was seeing more of this like before the big sort of layoffs happened starting at the end of 2022, I guess. I was seeing more first researcher roles that were a player coach, kind of lead manager, which I think is great. I think that’s what I would advise clients to do if you’re gonna get one person, make sure they’re at that level.

But I do still see companies hiring more junior people. And I know what they’re thinking. They’re thinking, we need someone to do some research. So they’ll get someone who’s very smart and very well-trained in their research chops, but they may be at a senior researcher level or maybe more junior than that. And then they’re overwhelmed. They don’t have a sense of the landscape or how to manage in that kind of an environment. They aren’t getting mentorship in their research work. And then at the company there can be this sense of, well, that didn’t really work out, so we don’t need research. Instead of research being seen as a concept or a practice, it gets kind of associated with a person. And then they go, we don’t need any researchers, we just need to do this ourselves. And I’ve seen that a bit.

And I’ve seen, I mean, I’ve also been, some of the people that come to me for coaching are people in that situation because they’re researchers that are not getting mentorship and they’ve kind of been thrown into this situation where they just don’t have the experience to be able to manage all the pieces that go with it because it’s not just about running studies. And I think I totally get the excitement about being the first researcher, you know, and when someone wants you to play that role. And I mean, it’s, you know, there’s a lot of trust that goes into that. I also know people that are sort of senior researcher level, I’m just throwing these terms out. I mean, it’s all, you know, it just depends on the person, but who would say, and you know, they’re in a career search and we’re talking about their career search and they’re like, I don’t want to be the first person ’cause I know what’s involved in that. So, you know, I think it’s like, yeah, I get why someone would take that job, even if they maybe have like a couple of years of experience ’cause it’s exciting. And I also hear people are probably qualified, you know, who have been working for six or seven years. And again, those numbers are just, who knows, you know, it just depends on the person. And they’re like, I don’t want to do that ’cause I know how hard it is.

Steve: We sort of shifted in this conversation a little bit to talking about your consultancy. What did you start and why?

Carol: Towards the end of my time at NerdWallet, I had been getting calls from coworkers asking for help to set up a research program. Like, how do I get started if I want to set up research? And, you know, I was just having these conversations and realizing that I was really excited about this topic, that the beginning point is really exciting to me, right? So when I left NerdWallet, I started looking at open roles at the time. And they were this, like I was saying, player/coach kind of role, right? And so it’s like you’re doing some of the bigger research while you’re setting up operations, while you’re setting up a roadmap, while you’re setting up, you know, all the infrastructure and everything. And I had already done that. I had done it a couple times. So I realized I wasn’t excited about doing that again.

And what I was excited about was the leadership components of that. And so the coaching or advising, and we can talk about what I think the differences are there, but, you know, the sort of training, helping people become more self-sufficient, whether it’s leaders feeling like they’re stronger at supporting a research practice, whether they have researchers or not. Again, like we were saying earlier, helping designers, PMs, you know, et cetera, feel confident that they can collect insights. If they’re going to do it anyway, we may as well help them do it well. So those are the pieces that I realized I was more interested in. And also just having conversations with people about the importance of operations and thinking about research ops from the beginning, or the middle, wherever you are, and how that can be such a force multiplier, you know, such a way to move forward more quickly by spending some time on infrastructure, tools, templates. Like having some kind of process, having some way for people to capture the insights that they’re collecting and share it, however that looks, things that are going to help you do things better and faster later.

So those were the pieces that I was really interested in. And I decided to just go out on my own. You know, I was out on my own for a while through, let’s see, like through most of the 2000s. That looked more like contract research work at that point, and I was doing it in parallel with other work that was not tech. But this time I was like, I’m going to go all in on this consulting model and see what happens. And that was towards the end of 2022.

Steve: Since you teased us with coaching versus advising, I’m going to ask you to take the bait. What do you think the difference is?

Carol: I mean, I think, and this isn’t like, you know, genius. I think this is the way that a lot of people distinguish those. So let me start with advising. Advising is more like I’m working with the head of design or, you know, the head of product, or someone on that team that’s in a leadership role, to help them see for themselves how research can have more impact, again, whether they have researchers or not. And so advising, I think, has much more of a, like, we’re in a conversation and I’m giving them ideas or tips.

Coaching is more of a, I’m working with, I don’t do the big sort of life coaching or big picture career coaching. Like, should I do this anymore necessarily? Because I’m not like trained as a coach where I would do life coaching kind of thing. It’s more like, you know, somebody is in an ops role and wants to shift to a research role and they have all the training to do that, but people aren’t seeing them as a researcher. What do they need to do with their portfolio, their resume? How do they need to talk about the work? Somebody gets laid off, you know, it’s a surprise. They’re trying to prepare for their next role. Somebody is, like I said, in a role where they’re like the only researcher and they’re not getting the mentorship. They got feedback on a specific thing and they don’t really know how to work on it. And their manager isn’t really kind of maybe helping them figure it out. Like it’s a very specific engagement around a topic that we can say, here’s the end goal and here are the steps that you can go through to get to that end goal. And what are the milestones that we can look at along the way, even if it’s just like four weeks or six weeks.

It’s a very specific set of things that we’re doing to get somebody to a particular place. Whereas with advising, there’s also a set arrangement, you know, a number of sessions or whatever, but it’s more like me tossing out advice or ideas, maybe more than I would in a coaching model.

Steve: I’m going to use a word you haven’t used, but when you talk about coaching, I think a little about facilitation. Whereas in the advising, you have a best practice or an idea or suggestion. In the coaching, you’re kind of working along the path to get this person to articulate specific goals, that kind of thing.

Carol: It’s kind of like they are going to do the work to get to a certain place, and I am helping facilitate that. And the way that I work with people in coaching, there’s actually a worksheet that we use. And the worksheet kind of starts with, again, I should distinguish that I’m not doing this sort of big picture, like, what is my life about. But I do start with, what’s your mission statement as a researcher? And what is your broader goal over the next few years? And then what are you trying to get to in the next few months, whatever that timeframe is? And it’s a worksheet where it’s like, literally, what steps are you going to take to get there? How are you going to know that you’ve achieved that step? So what milestones are we looking for? What does success look like? When are we going to say you’re done with that step and, you know, maybe addressing a different step?

And it’s not super linear like that, but it really is a template. And I found that that worked really well. I actually developed the template when I was at NerdWallet, because I found it worked really well for the team to help them think through either the broader, like, I want to get to be a manager, how do I do that kind of thing, or the very specific: they got feedback on a performance review about something and over the next few months they want to work on it. And so it’s a really simple template and approach, but that’s how I keep the coaching engagements to a particular goal that people are going for.

Steve: So coaching engagements, advising engagements, what are the other ways in which you’re working for whomever?

Carol: So I do workshops, and I have one workshop that’s really targeted to researchers, or, I mean, it could be anybody, but mostly the people who come are lead researchers or managers or senior researchers or designers. It could be PMs as well. That’s Prioritizing Research for Impact. And you know, there’s a lot of conversation about impact. It’s really the thing that we have had to make sure that we’re measuring, right? And what is impact? We can talk about that in a minute, but the workshop is about how to think through how you’re going to get to impact. It’s not just run the studies that you want. It’s not just run the studies that somebody’s telling you they want. It’s: what’s the business objective that we’re trying to achieve? What decisions are going to be made if we have this information from a particular study? What do we already know about this? And then we go through this framework based on clarity, risk and cost. So what do we already know? That’s clarity. What do we still need to know? What’s the risk of going forward without more research, or any research? And what’s the cost of doing research? What’s the cost of developing this?

And there’s a worksheet. It’s really a spreadsheet that we toss all of this information into and have the conversation about each of these possible research projects. And then at the end, you can see what’s high priority, what’s medium, what’s low priority. And then we also talk about who you involve in this prioritization process. How do you involve partners? And then when we get to the end, who’s the ultimate decision maker for research? That may not be the person it should be; sometimes people come into the workshop and realize the person who’s making the ultimate decisions isn’t the person who really should be. So that’s a conversation to have. And then after the decisions have been made, what are some best practices to convey prioritization decisions? Transparency, you know: share the work, show people how you got to that decision. Hopefully they were either involved in the conversation up front or someone on their team was who has helped them understand the process, and so nobody is super surprised at the end, ideally.

And then sharing out the results: literally share the worksheet with everybody that needs to have it so they can see what decisions were made, which projects were prioritized against what other projects. And then for each of them, if it’s low priority and you’re not going to move forward, how do you communicate that? If it’s high priority, how do you communicate that? And then we end up with a lot of things that are sort of medium, where we need to do something, but we don’t need to do a fresh study. And so maybe a researcher is going to go sift through what we already know, and that will save the team time by not doing fresh research, because we already know a lot about it. So we have high clarity, but it is high risk to move forward without doing anything else, and the cost to do this research, meaning going through this existing material, is pretty low relative to the cost of going through development and getting it wrong, which is pretty high. So we’re pulling those levers. In the workshop, we go through this for three research projects so people can actually do it. By the end of the workshop, they’ve prioritized three projects, and then they can take that back to their organization and use that tool, the worksheet.

Yeah, it’s on Maven, which is actually a really good platform for all kinds of workshops and leadership. There are workshops on AI now; there’s all kinds of stuff in there. So that’s the one that’s about prioritizing research for impact. I also have one that I literally call See Maximum Impact from Customer Conversations. And that’s creating a game plan for, you can call it your research program, you can call it your customer insights practice, however you describe the thing you’re trying to do by having customer conversations. But that one’s really tailored to leadership.

So the people that come are usually head of product, head of design, product ops, you know, UX leaders, whatever; it’s a leadership role. It could be somebody who’s starting a research team who’s a researcher and hasn’t done this before, the kind of player-coach person we were talking about. But the idea is that at the end of that workshop, we have a game plan. We do a gap analysis: what’s the current state of research (I’m just going to call it research as shorthand), and what’s the ultimate desired state? And then let’s make a very specific three-month plan to get there. And we look at infrastructure, meaning tools, processes, training, whatever’s going on there, the operational pieces. We look at staff; that could mean you have a researcher, it could mean people doing research, it could mean there’s some operations person on another team that’s helping you recruit, could be anything. And then organically in the conversation, we start to talk about the research roadmap, because people will come in and they’ll go, well, the most important thing we need to know is X. And so it’s not a workshop to lay out your whole research roadmap.

But those pieces come in: the thing we ultimately need to know is this; right now we need to know this other piece. So yeah, that’s also on Maven. I’ve run it internally within a company for, you know, a handful of leaders, and I’ve also started running it on Maven. The third workshop that I have right now is training, you know, designers, PMs, content strategists, whoever, to do their own research. And that’s the thing that we talked about earlier: three parts, planning a study, executing a study, synthesizing to get to actionable insights, and then some coaching. And that one I’ve been running within companies, and it’s in the process of also becoming available on Maven soon; I can still run it within a company.

Steve: For the one for leaders, the title has customer conversations in it, not research.

Carol: What I’m finding, and I’m not the only one, I’ve been in conversation with a lot of people that are finding this. I mean, in this conversation, we’re talking about research; I’m using that language. But you know, my target audience really is head of product, head of design. And there can be, and I think in the last year and a half it has become even more of, a challenge with the word research in that audience, in that sometimes people think it means it’s going to be big, it’s going to be expensive, it’s going to take a lot of time. And yeah, sometimes it might be big, expensive and take a lot of time, if what you need to know is foundationally something really important to your business, right? Something you don’t know that’s going to make it or break it, kind of thing, right?

But I think a lot of what people need is not necessarily that. And I don’t want those leaders to think that having conversations with customers needs to be big, expensive and take a lot of time. Of course, they’re doing research, you know. But if you look on my website right now, the word “research” does not appear until you scroll below the fold. And so I’m experimenting with ways to talk about the offerings that go beyond the word research, because, unfortunately, I used to be much more of a purist, many years ago, earlier in my career: well, people need to know that research can be lean. Yeah, people are going to figure out that research can be lean, because we’re going to do it. I don’t need to be preaching about it and I don’t need to be stuck on using that language. I think one of the things that’s held us up in the past as a field is that we’ve been too attached to language, process, ideas that aren’t necessarily current anymore. And so I’m like, call it whatever you want. We’re going to do this thing and I think it’s going to help your business, and I’m not attached to the word.

Steve: When you started talking about developing this business for yourself, you kind of hinged on what was exciting to you. And I’m wondering, now that you’re up and going, do you find it a different experience for yourself when you are doing this, say, through Maven and it’s for the public, for lack of a better term, versus working with an organization and going into that organization? Are there any differences for you in those different kinds of venues?

Carol: You know, I have this really deep background in teaching. And so for me, leading workshops is really fun and exciting, and working within the organization can also be fun and exciting. They are just different, and I enjoy both. I mean, I like bringing people together from different organizations and seeing the kinds of experiences they bring into the workshop, and they get a lot of benefit out of that conversation. This is the feedback that I get: not only was it valuable to get the material and the worksheets and whatever insights and facilitation I’m bringing, but the experience that other people bring in, if we do this publicly, is really valuable. And frankly, sometimes I have to really rein it in, because they can just start going on and trying to help each other solve things, which is great, and I love it when, at the end, people say, let’s connect on LinkedIn and keep the conversation going. I’m actually about to set up a way for people to keep the conversation going across cohorts. So that’s something that I’m going to be doing later this year as well, because there’s so much benefit that people find from checking in, you know? So yeah, it’s different, and they’re both interesting to me for very different reasons.

Steve: You had offered to give a little more definition about what impact meant. So I want to loop back to that.

Carol: I’ve been looking at and following what other research leaders are saying about this too. And I think that one thing that we seem to all agree on is that impact goes beyond what I call product impact. So, you know, pretty obvious that impact means we do some research, we come up with some insights, you know, we take the most important of those and we do something to change the existing product or we move into an area that’s new and we see some kind of impact that we can measure in terms of, you know, lift in engagement or revenue or customer satisfaction or whatever the thing is that we’re measuring, right, from a business perspective. That’s one kind of impact, but there are other types. And so I think there are three things.

One is product impact, like I just described. Two is organizational impact. And that’s stuff like what we were talking about earlier: seeing teams understand better how to work with career researchers by going through the process of learning how to do some research for themselves. I would call that organizational impact. And three is operational impact, which is stuff like efficiency. And this, again, relates back to something I said earlier, but we try to prioritize the most important and most impactful research. We look at something where we already have a lot of information. We have a lot of clarity about this problem, but maybe this team doesn’t know it.

So for example, a real example: there was a new team spun up around a very important initiative. The product manager, the designer, and the content strategist were all new, but there was a researcher that had been doing research in that area. And so this trio thought they needed to do a six-week sprint to uncover where they needed to go with this very important thing. And the researcher knew that there was a lot of information already. The researcher spent something like half a day going through all the information that they had, sat down with this trio, and shared the information with them in an hour. The researcher spent like four hours. We can calculate the cost of that time. It saved this trio the first three weeks of the sprint. We can calculate the cost of the time that they would have spent, and do the math and say, we spent X dollars and we saved X dollars here. And they were able to go straight to concept testing, because there was all this foundational work that had already been done. So that’s an example of operational efficiency. And there are some people talking about this, but I don’t know that we’ve spent as much time on those calculations as a field as I think we could.

Steve: Are there impacts that are not measurable or not easily measurable but still kind of make your list?

Carol: I’m sure there are. I think I’ve pulled my list down to three. I mean, if you look at some of the things that people have been writing about, there are these much more detailed models. I think it goes back to: what are you going to do with this impact? If we want to be able to go back to the leadership team, or we want to be able to put what our impact was on an OKR spreadsheet at the end of a quarter, we need to make it digestible by other teams and leaders. And I feel, having studied this for a while, that everything kind of rolls up to one of those three areas. I haven’t found something that doesn’t roll up to those three areas, let me put it that way. And I think that if we keep it simple like that, we’re much more likely to be able to say very clearly, we saved X dollars by not doing a bunch of extra research on this project. Related to this is that it can be hard to measure impact, period. And we know that. So if you’re a researcher who’s a shared resource across multiple teams, you work on one thing, you go off to work with another team; how are you going to know what team A did a month later, unless someone comes back and tells you? You may have to go back and ask, hey, what happened from that study? So you know how to describe the impact that you’re having.

But we need to be making the effort to try and find out. I mean, it’s hard. As a consultant, it’s hard for me to know what the ultimate impact is of these workshops and the coaching and the advising unless people tell me. And I also know quite well from my teaching experience, sometimes people learn a thing and then it’s not until a while later that it actually kicks in for them. So I think that when we’re talking about training that doesn’t have a direct relationship to work that’s happening right now, yeah, it’s hard for me to even know what impact I’m having. But I think it’s really, really important for us to continually try to learn as much as we can about that. So, obviously, none of us knows the future, and we can’t talk about the future unless we talk about how we got where we are and where we are now, right? So I actually want to back up to a bit of the difference between nine years ago and now, because I think it’s relevant to this.

So when you first invited me to have this redo conversation, one of the prompts was what’s changed, and my first thought was: everything. And then I went back and listened to the original conversation from nine years ago, and I realized, oh, more than everything has changed. And nine years is a long time, so we would expect that there would be shifts. But aside from the obvious, like the pandemic, remote work, that kind of stuff, just listening back and thinking about the way I talked about the work then, the way we all were talking about the work then, and the way we talk about the work now: we’ve been talking about impact. I haven’t used the words qualitative research or design thinking. And the last conversation was all about that, because that’s where we were at that point in the industry, and that was what was making the work successful then. And if we look even further back, there’s the history of the internet, right? I was at GeoCities in 1998. We were making it up as we went along. And I remember reading the IPO paperwork, and it said, we have no idea how we’re going to make money from this thing. And so that was normal. And then we had the boom and we had the bust.

And then through the 2000s and into maybe the late 2010s, everybody was talking about design thinking. Now it’s all about impact. So the way that we characterize the work has really shifted. I think for me also, when I think future, being at this point in my career, I start asking myself, what is my legacy? Which sounds really fancy. It’s not like I think I’m capital-L Legacy, like I’m a celebrity or something, but I think we all kind of go, I’ve been doing this for a long time. What am I going to leave this field? What am I contributing and what impact do I want to have now as I go along? And then what am I going to be leaving whenever I decide to stop this? So I kind of look at all of that and I go, where are we now? What does the future look like? Obviously AI. I mean, we don’t need to say much more about that. We need to figure out how we use AI tools to help the work that we’re doing now, and that’s changing every single day.

I mean, when people ask me, what do I need to do? I actually had a call like this yesterday; a person got laid off. What do I need to be thinking about and what do I need to do to position myself for my next role? It’s like, you need to be studying AI tools. And if you haven’t already done that, jump in, right? So that’s one really obvious thing. I think another thing that we’re seeing now that’s not going to go away, that’s going to be in the future, is this idea of people who do research, right? Non-career researchers collecting some of their own insights. We can’t stick our heads in the sand and say, make it go away. It’s not going away. It’s here. It’s been here for a while, and we need to figure out how to jump on that. We need to be mixed methods researchers.

You know, it’s funny, because when I started, I came out of human factors school and that was very quantitatively focused. And then I started working in an era when the work was very qualitatively focused. And now we’re shifting back towards generalists. So I think everybody needs to be some kind of mixed methods researcher. And I think most people are going to end up being sort of T-shaped, where you’re very strong in some areas more than others. But I don’t think we can go out anymore and say, I only do ethnographic, deep qualitative research and I don’t know anything about writing a survey. I just don’t know that that’s going to be possible moving forward.

And another area that I think is really important, for people who haven’t already been doing this, because some of us have been doing this for a while, is triangulating insights across different sources. So knowing how to dive a bit into analytics data, understanding something about behavioral science if you don’t already, making friends with the people who run customer support so you know what they’re hearing. Do you have a market research function? All of these other insights functions, I personally think, and have seen, work really well when we’re totally working together in a very collaborative way. At a minimum, knowing what they’re doing, if you’re not in an environment where the culture is that collaborative, but having some way to look at things across multiple types of insights functions.

So this is a bit of a personal aside, but it’s very relevant to this question. I went public in January with the fact that I had lung cancer late last year, and I decided to go public with it because I thought it might be valuable to people. And in terms of what that did for me personally, I could have gone through a full examination of my whole life and career: oh my God, do I want to keep doing this? And what I realized is, I’m happy making the contributions that I’m making, even though it’s hard to directly measure impact. I’m hearing from people that they’re finding value from the work that we’re doing together. And so that’s what I want to leave the world with. I want to leave people with this idea that the work that we’ve done together is valuable to them, whether it’s tomorrow, they’re taking the prioritization worksheet back to their company, or we have this coaching conversation and two years from now they see value in it in some way that they couldn’t have anticipated. So I think that’s really vague and broad. But I’m trying to be as clear as I can about focusing in the areas where I think I can make the best contribution and have the most impact. And, like you were talking about earlier, what zaps my energy, what gives me energy: I really like teaching these public workshops, as well as doing the work internally. So I’m going to keep doing the public workshops, and just keep reexamining, how do I feel about the work that I’m doing? And what am I getting back from people?

Steve: I think you’re saying that talking about or going public with your medical situation prompted people to reach out to you in a way that highlighted the importance to you of the impact of the work that you’re doing. Is that correct?

Carol: Yeah, it was one of the things that did that. I mean, also just literally in terms of the impact of that article. Many people have told me: oh, I went and got a checkup, because I realized I hadn’t been taking care of my health. Oh, I smoked many years ago, I should go check that out. Or, I hugged my child more closely, I called my parents. The human elements of it, as well as the physical health elements, were really rewarding. And I don’t know what I expected, but for some reason I didn’t necessarily expect all of that.

Steve: Well, yeah, you had no template, no prior for what the response to that was going to be.

Carol: No template. I mean, just to throw this out there as another example: I didn’t write this in the article, but I didn’t even know how to tell the clients that I was working with. And that’s when I said, I’m going on sabbatical. And then people thought I was taking a fancy vacation. And then I said, well, I’m taking a medical leave. And then they worried a lot and started Slacking me: are you okay? What’s going on? How are you? What do you even say when you need to take two months off, or whatever it was, if you don’t want to disclose? Because I wasn’t ready to disclose that. So I don’t even have a template for that, is what I’m saying.

Steve: Yeah, now you’ve had that experience, so you’ve learned from that experience.

Carol: Hopefully helped other people.

Steve: We’ve been talking in and around impact at various levels and yet this article, the examples you just gave from your writing of maybe it’s outcomes, not impact. I don’t know. I don’t want to jargonize it, but the kinds of things that happened as a result of your action that you found meaningful and that people reported back that they found meaningful. I don’t want to take a personal experience and try to force map it into something professional, but I guess I’m just seeing echoes throughout our conversation.

Someone saying I hugged my kid is very interesting. That was the action they took. That was something they shared with you, and that was something that had meaning for you as a result of it. When we started off talking about founding your consultancy and determining what you wanted to do for that, what kind of offerings you had, I was just struck by the fact that you used the filter of what excited you.

Now we’ve been talking about changes and even looking ahead, present moment to “future.” I guess just maybe try to tie those things together. Are there things about the near future, the distant future, whatever time horizon we have for future, are there things about that with the work that you’re doing that excite you?

Carol: The thing that I’m excited about for this year is to actually do more of the public workshops. And I think I mentioned I’m going to roll out the lean research workshop for designers and PMs to be public. I’ve got some other ideas that I’m working on. One of the pain points that I hear from customers is finding the right participants for research, which is a lot easier in B2C than it is in B2B, but there are some things that we can talk about. That’s going to be a workshop. And having more conversation around knowing when you do this yourself and when you hire a career researcher, and what operations you need to put in place to have your conversations with customers be effective. There are topics like that that I’m exploring for either short workshops or longer ones, because those are things that I’m hearing about. And I like that public forum. So I’m excited to be rolling those out later this year.

Steve: Carol, it’s really great to have this chance nine years later and talk about what’s changed more than everything and the work that you have done and that are continuing to do.

Carol: Yeah, thanks so much for including me.

Steve: Thank you for taking the time. It’s great to chat with you.

Carol: It’s been really fun.

Steve: That’s it for today. I really appreciate you listening. Find Dollars to Donuts where podcasts are podcasted, or visit portigal.com/podcast for all of the episodes, complete with show notes and transcripts. Our theme music is by Bruce Todd.

The post 41. Carol Rossi returns first appeared on Portigal Consulting.

There’s a bigger and better new edition of my classic book, Interviewing Users. As part of the launch of the book, I spoke with Russ Unger for his Tent Talk speaker series. Here’s a little clip.

Russ Unger: What’s your approach to ensuring that the feedback gathered from user interviews is effectively communicated and incorporated into the design process?

Steve: The first part of that, I think, is that you have to do something. You have to make sense of what you gather. Some of this kind of goes to the maturity of any individual practice. I think the less experienced folks are, the more they want to just take what they remember about what was said and type it up. And that verb is, that’s stenography maybe, or collation as far as you get. You put these pieces together, and then you’re just taking requests or gathering complaints. You might as well use a survey for that. I think it’s the iceberg model, right? Some of it is above the surface, but a lot of it is below the surface. Below the surface means going back to what was said and looking at it and making inferences. What wasn’t said? How was it said? What was said at the beginning and what was said at the end? And that’s just within one interview. What did person A say? What did person B say?

And there’s a whole new chapter about this. It’s the analysis and synthesis process. And some folks say that the ratio should be two to one. For every hour of the feedback that you gather, you should spend two hours analyzing and synthesizing. And I think in a less evolved practice, it’s the inverse. You might spend half an hour for every hour or even less. The caveat here is not every research question merits this. If we are looking for, I don’t know, choice preference between something and something else, we might be really clear about what that is. We come back and say, do this.

But for anything where we want to understand why or understand opportunities or understand motivation, a new space you want to go into, characterize a customer that we haven’t worked with before, it really is worthwhile to go and do this analysis and synthesis.

How do we have impact? We have to have something impactful to say; I just want to say that. Some other factors that I think can make or break it: working collaboratively with stakeholders, the folks that you want to inform, influence, and take action, before you do the research. So having an understanding of what the business challenges or business goals are, like what are we trying to do as a company? And then formulating really good research questions: what are we going to learn in order to inform that? And then choosing methods and approaches that can support that, and not doing that in a vacuum. Working that way has the effect of switching your role from reactive to proactive.

I think it’s hard to have an impact with reactive work. Those requests that come are often late. They’re often based on a shallow assumption about what kind of value research can provide. And so you are going to give a thumbs up, thumbs down in some direction. So your sort of role as a provider of these kinds of insights is diminished. If you can be proactive, which means maybe understanding a roadmap or what decisions are being made or who else is going to do what and proposing research on your own roadmap that is intentional and is ahead of time, you leave space, of course, for things that come up, fire drills and so on.

But trying to work in a proactive, collaborative way, aligning on goals and then putting the effort in to make sense changes the whole conversation about what you’ve learned. You get to that point of sharing with somebody.

That’s part of a larger Tent Talk. You can check out the whole show and definitely buy your postal carrier and barista their very own copy of the second edition of Interviewing Users. If you want to help me out, write a very short review of Interviewing Users on Amazon.

Over the last couple of years, I’ve been partnering with Inzovu to run training workshops about storytelling. Storytelling is an essential human skill that powers how teams work together with each other and with their colleagues. I’ll put a link in the show notes with more info about what I’ve been up to with Inzovu. And if storytelling is something you’d like to build up in your organization, reach out to Inzovu or to me.

Okay, let’s go to my conversation with Carol. She’s a consultant with a focus on user research leadership. Carol, welcome back to Dollars to Donuts, nine years after we last talked. It’s great to talk to you again.

Carol Rossi: Yeah, thanks, Steve. I can’t believe it’s been nine years.

Steve: Time does fly. Let’s talk about those nine years. You know, what’s been the shift in your evolution in your professional world since then?

Carol: When we last talked on the show, I was at Edmunds and I was leading the UX research team there. And I had been there at that point, I guess, four years. I had started the team there and then ended up staying at Edmunds until 2017. And then, because I’d been there for quite a long time, I took a moment to kind of ask myself what I wanted to do next. I call it my gap year. So I did some contract research work as well as consulting, helping people think about how to set up a team.

And then in 2018, I went to NerdWallet and that involved a move. So I was in LA for the bulk of my career. In 2018, I moved to San Francisco for the job at NerdWallet. And that was an established team that I led for about four years. And I mean, we can go into detail about any of this stuff, but basically I left NerdWallet in 2022 and started a consultancy where I’m now focused on helping companies, helping leaders know how to get the most impact from research.

Steve: Can we talk about NerdWallet a little bit and then talk about your consulting work now?

Carol: Yeah, sure.

Steve: So it was an established team. Is that right?

Carol: Yeah, it was. So there were three people on the team. There was actually an open headcount when I joined. We ended up doubling the size of that team. So we still remained a relatively small team, but we did get some additional people. I think some of the work that I’m really proud of there is that we went from having these researchers doing very siloed work, where even though they were all researchers, they were hardly working with each other, and then developed that team to the point where we had a lot more strategic impact. We started a voice of customer program. Two of the people on the team became managers during the time that I was there. So they saw a fair amount of professional growth. And when I left, again, there was this voice of customer program established, as well as a program to train designers and PMs and content managers to do some of their own research. On the market research side, they were doing some brand work, and we were doing some explorations about how that played out in product. So there were more sort of horizontal activities we were doing, and also empowering some people to collect their own insights, as well as deepening the impact of our team.

Steve: When you talk about coming in and the researchers that were there were siloed, my mind starts to go to that embedded word and what that means. But I think you’re talking about siloed in a grander scheme of things. But I don’t know, what does siloed look like then?

Carol: I think it’s a really good distinction. The difference between siloed and embedded to me is that embedded can be and is a very valuable way to participate in a product development team.

We ended up with this sort of hybrid model, I would call it. Because at the time that I left, the team was ultimately reporting to me, but they were dedicated to specific focus areas. So we had one person working on the logged in experience and that involved maybe three pods. We were calling them pods, but squads, whatever, product trio, whatever language we use to talk about the combination of the PM, the designer, the content strategist, and some number of engineers. So we’d have one researcher per, let’s say, three of those pods, but they were all within a focus area.

So one was dedicated to the logged in experience. We had, for example, a couple people working on what we call the guest experience or shopping. So I should say NerdWallet is a company that provides advice and products to consumers who might be looking for financial products. Consumers might be looking for a credit card or a mortgage or a personal loan or whatever. So you can either go and read some articles and then get linked to some potential credit cards for you based on what you’re interested in and your credit score and those kinds of things. Or you can download the app, log in, and get tailored advice based on your specific situation. So those were, at the time, separate areas of the company in terms of the way the development was divided up.

So I think embedded to me is there’s a very healthy relationship with those pods where the researcher is either dedicated to one or maybe crosses over a couple of those areas, of those pods. But siloed to me is people are working on something so exclusively that maybe there isn’t a lot of conversation across. And I think what you lose in that kind of model is opportunity to take advantage of research that might be going on in an adjacent area or even a very different area but has relevance to what you’re doing.

And so you can have a lot more efficiency across the research function if you’re not re-doing work, you know. Or people are learning techniques from each other, you know. Or people are partnering so that there’s some broader impact across these different focus areas. So there might be — because to the consumer, to the ultimate user, the customer, they’re not seeing it, right, as these sort of separate areas. They’re seeing it as one experience. And sometimes in order to do product development, you have to divide things up.

So how do we keep the flow and the things that need to be similar across the experience and have it make sense by looking for those areas of, you know, similarity or continuity or whatever the word is that you want to use there. Some of the things that we did that worked really well were have just — so first of all, I should just be really clear. Because it was a manageable team, I mean, small enough team, we could do things like have team time every week where researchers felt like they were able to have a dedicated, you know, I think it was an hour or something, but a dedicated time where they could talk about some of the stuff they were doing, present problems to each other, learn from each other, like have time to be able to say, I’m doing this thing, I think there might be some relationship to what you did last year or what so-and-so did who’s not even here anymore and what can we talk about there.

So I think there’s — with a small enough team, you can definitely have people, you know, embedded or partially embedded within specific areas so they’re having maximum impact in those areas, but still conversation across. So I think that’s one thing that we did. Another thing we did was have kind of a loose repository. We weren’t using a really fancy tool. We just literally had, you know, a wiki where all of the research that was done was available. So people could go in and see what had been done and see if there was something relevant to them. And that could be like product managers, designers, anybody could go in and look at that and see. And then they’d usually come back and ask us questions. Hey, I saw this thing, you know, I wonder how that can be relevant to our team. So I think there are a few things that you can do.

Steve: You mentioned that you put in programs to teach other folks who are not career researchers to do research. What did that look like? How did that work?

Carol: I think the way that I’ve seen that work well is a three-part workshop series that I’ve created. And so we start with these three workshops and then we do ongoing coaching. So it’s not just a matter of taking a, you know, a training session. And the first workshop is really setting up the research for success. And so that’s really about planning a study. So there we talk about starting with the business objective, you know, like people will often start with a research question. Like we need to know X. Okay, well, why do you need to know X? Like there’s some business reason why you need to know it. So what’s the thing you need to know? Why do you need to know it? What decisions will be made as a result of that? And then what’s the best way to get that answer? Obviously, you know, in what timeframe do you need to know it, and those things as well.

But starting with that framework to give people an appreciation for the fact that we don’t just run a study because we have a question. We kind of put context around it. Even if it’s lean, and I’m using the language of run a study, but the language that some people are using is having conversations with customers or collecting insights, whatever language people are using. It’s the same thinking. And in that first workshop, we talk a lot about reducing bias, making sure we’re not asking leading questions, or, you know, the way that we’re writing a task or a prompt that we’re going to put up on an unmoderated tool for a participant to engage with, whatever. We talk a lot about how to do that in a way that those are going to be effective. And by the end of the first workshop, everybody has a lightweight research plan. I give a template. So there’s a template that has all those elements in it. And there are a lot of tips and tools and sample questions and sample tasks. So it’s pretty plug and play, but the foundational understanding is there in terms of, you know, not introducing bias and some of those other elements.

The second workshop is literally run a study. So when I was at Edmunds and we were doing in-person research, we would recruit a bunch of participants to come in and we’d have designers, PMs, engineers running their own interviews and, you know, we’d sit and give feedback often. Now what we do is all unmoderated. These workshops are all online now, remote. So, you know, it’s an unmoderated tool and they set up their study in the tool, and then we, you know, wait for the results to come in, the videos or whatever.

And then the third workshop is, in researcher language, synthesis. And that’s the, like, how do you go from all this data that you just got to actionable insights? So we talk about the data. We talk about findings that come from there. We talk about insights that are really most important. And then we talk about prioritizing those insights according to the business objective, back to the business objective, back to the decisions that need to be made. What are the most important of all of those insights? Cause you might get a lot of stuff, you know, out of even a lean study. What are the things you need to take action on? And then we talk about taking action. And again, there’s a template for summarizing the findings of their study, but there’s a table that shows, like, what was the insight? Okay, it was high priority. So we’re going to take action. We’re going to do this thing. Who is going to do this thing? It’s assigned to a team. Who’s the point person? Maybe it’s the PM on the team. Maybe it’s the designer. By what date is this thing going to be done? So everybody on the team now has agreed, beyond the person that ran the study. They go back to the team, have the conversation. Everybody has agreed. Here’s what we’re going to do as a result. And then that table goes into their summary. And then there’s a way to go back. If the person that’s running this study is not one of the people on that team (in this case, they probably are, because they’re the designer or the PM or whatever), you can go back and see what was actually done. When was it done? What impact was gained by that study? And you can then add the impact to your impact tracker.

Then there’s the coaching that happens after the training. And that’s really vital to help people sear in the knowledge from the training and get feedback as they go along and ask questions. So sometimes the designer will go to the researcher who led the training and ask, will you take a look at my plan ’cause I’m going off the template a bit and I wanna make sure this makes sense. Or they have a question about synthesis because they get something that they didn’t anticipate and they wanna talk through how to do it. Or they need help figuring out how to message something to somebody that wasn’t on the team but needs to get these insights. So there are things that come up in life, and sometimes it’s feedback on something that they’re doing. Like when I was at Edmunds and we were doing live interviews, we’d actually have a conversation after each set of interviews with the people that were running them: how did you think that went? And if we saw something that maybe they could benefit from, we would share that with them. So I think that’s a really important part of it and something I incorporate into the workshops that I do: it’s not just the training, it’s the follow-up coaching as well.

Steve: I think there’s a lot of hand-wringing off and on over the years about the risks and the consequences of, what did you call them? Non-career researchers, that’s a great term.

Carol: People who do research, I think is what people are saying now.

Steve: You know, we talk about the consequences of these kinds of programs that allow non-career researchers or people who do research. If we empower them as we’re kind of sort of the gatekeepers of the skills and the knowledge to do research, which may not even be an accurate framing anyway, ’cause people are doing research anyway.

Carol: Yeah.

Steve: here’s sometimes some hand-wringing about unintended consequences or intended consequences. I don’t know, with these programs of these different organizations, were there longer term kinds of changes that you noticed?

Carol: Yeah, I think it’s a good question. And I’ll just say, I don’t have that argument anymore with people. I have stopped trying to defend how this can work and how I’ve seen it work well, because the fact is, I’m just really realistic. First of all, I’ve seen it work well in this way that we talked about where there’s this sort of training set and then this sort of coaching activity, and there’s a conversation. It’s an ongoing conversation. And so what I’ve seen work well, one of the things that I’ve seen come out of that that’s been really beneficial is that people who have gone through this program tend to have a better sense, when we’ve been in-house, tend to have a better sense of how to work with research and have a better appreciation for the research that the researchers are doing, the career researchers are doing. And that partnership is richer. I have seen it go awry. I’ve seen people go through a few workshops, refuse the coaching, and then do things like put an app in front of consumers and say, “Do you like it?” So it’s not without risk. I totally get that.

At the same time, I’ve stopped having that discussion with the researchers that are worried about the field being diluted, or I’ve stopped using the word democratize, ’cause we’re not democratizing. We’re helping people do stuff that, frankly, they’re already doing. So why wouldn’t we help them do it better? So I think what, and I see it now in the consultancy, I’m really focused, if I say that I’m focused on helping leaders in companies that maybe don’t have a research leader, or maybe they’ve got one or two researchers, or no researchers, and they’ve got all of these other people out having conversations with customers, why wouldn’t I want to help them do that in a way that it’s gonna be more effective, where they’ll get good data? Because we all know that if you just go out and put an app in front of somebody and say, “Do you like it?” You’re not gonna get, it used to be garbage in, garbage out, right? That language still applies decades later.

So yes, there are risks. I know what the risks are. I think I named one of them anyway. People just go, “Well, why can’t I do persona research?” Or whatever, probably not the best example, but just helping them realize there are things that you need to know, and I get that you need to know those things, and you’re probably not gonna get what you’re looking for with this method. And so having those conversations, it doesn’t mean that once I leave, they’re not gonna try to do that anyway. I can’t control that. Even if I’m in the company, I can’t control that.

So I think the risk of people who do research or non-career researchers doing this just without any guidance is greater than the risk of them thinking they can do something that they really need a career researcher for. And I think this is not unrelated to, I mean, it’s a bit of a tangent, but it’s not unrelated to the thing that we see where companies think they want research and they hire someone. I was seeing more of this before the big sort of layoffs happened starting at the end of 2022, I guess. I was seeing more first researcher roles that were a player coach, kind of lead manager, which I think is great. I think that’s what I would advise clients to do: if you’re gonna get one person, make sure they’re at that level.

But I do still see companies hiring more junior people. And I know what they’re thinking. They’re thinking we need someone to do some research. So they’ll get someone who’s very smart and very well-trained in their research chops, but they may be a senior researcher or maybe more junior than that. And then they’re overwhelmed. They don’t have a sense of the landscape or how to manage in that kind of an environment. They aren’t getting mentorship in their research work. And then there can be, at the company, kind of a, well, that didn’t really work out, so we don’t need research. Instead of research being seen as a concept or a practice, it kind of gets associated with a person. And then they go, we don’t need any researchers. We just need to do this ourselves. And so I’ve seen that a bit.

And I’ve seen, I mean, I’ve also been, some of the people that come to me for coaching are people in that situation because they’re researchers that are not getting mentorship and they’ve kind of been thrown into this situation where they just don’t have the experience to be able to manage all the pieces that go with it because it’s not just about running studies. And I think I totally get the excitement about being the first researcher, you know, and when someone wants you to play that role. And I mean, it’s, you know, there’s a lot of trust that goes into that. I also know people that are sort of senior researcher level, I’m just throwing these terms out. I mean, it’s all, you know, it just depends on the person, but who would say, and you know, they’re in a career search and we’re talking about their career search and they’re like, I don’t want to be the first person ’cause I know what’s involved in that. So, you know, I think it’s like, yeah, I get why someone would take that job, even if they maybe have like a couple of years of experience ’cause it’s exciting. And I also hear people are probably qualified, you know, who have been working for six or seven years. And again, those numbers are just, who knows, you know, it just depends on the person. And they’re like, I don’t want to do that ’cause I know how hard it is.

Steve: We sort of shifted in this conversation a little bit to talking about your consultancy. What did you start and why?

Carol: Towards the end of my time at NerdWallet, I had been getting calls from coworkers asking for help to set up a research program. Like, how do I get started if I want to set up research? And, you know, I was just having these conversations and realizing that I was really excited about this topic, and that the beginning point is really exciting to me, right? So when I left NerdWallet, I started looking at open roles at the time. And they were this, like I was saying, player/coach kind of role, right? And so it’s like you’re doing some of the bigger research while you’re setting up operations, while you’re setting up a roadmap, while you’re setting up, you know, all the infrastructure and everything. And I had already done that. I had done it a couple times. So I realized I wasn’t excited about doing that again.

And what I was excited about was the leadership components of that. So the coaching or advising, and we can talk about what I think the differences are there, but, you know, the sort of training, helping people become more self-sufficient: helping leaders feel like they’re stronger at supporting a research practice, whether they have researchers or not. Again, like we were saying earlier, helping designers, PMs, you know, et cetera, feel confident that they can collect insights. If they’re going to do it anyway, we may as well help them do it well. So those are the pieces that I realized I was more interested in. And also just having conversations with people about the importance of operations and thinking about research ops from the beginning or the middle, wherever you are, and how that can be such a force multiplier, you know, such a way to move forward more quickly by spending some time on infrastructure, tools, templates, having some kind of process, having some way for people to capture the insights that they’re collecting and share them, however that looks. Things that are going to help you do things better and faster later.

So those were the pieces that I was really interested in. And I decided to just go out on my own. I was out on my own for a while before, let’s see, through most of the 2000s. That looked more like contract research work at that point. And I was doing that in parallel with other work that I was doing that was not tech. But at this time I was like, I’m going to go all in on this consulting model and see what happens. And that was towards the end of 2022.

Steve: Since you teased us with coaching versus advising, I’m going to ask you to take the bait. What do you think the difference is?

Carol: I mean, I think, and this isn’t like, you know, genius. I think this is the way that a lot of people distinguish those. But let me start with advising. Advising is more like I’m working with the head of design or I’m working with somebody, you know, head of product or someone on that team that’s in a leadership role to help them see, you know, for themselves, how research can have more impact, again, whether they have researchers or not. So advising, I think, is much more of a, like we’re in a conversation and I’m giving them ideas or tips.

Coaching is more of a, I’m working with, well, I don’t do the big sort of life coaching or big picture career coaching, like, should I even do this anymore? Because I’m not trained as a coach where I would do the life coaching kind of thing. It’s more like, you know, somebody is in an ops role and wants to shift to a research role and they have all the training to do that, but people aren’t seeing them as a researcher. What do they need to do with their portfolio, their resume? How do they need to talk about the work? Somebody gets laid off, you know, it’s a surprise. They’re trying to prepare for their next role. Somebody is, like I said, in a role where they’re the only researcher and they’re not getting the mentorship. They got feedback on a specific thing and they don’t really know how to work on it. And their manager isn’t really maybe helping them figure it out. It’s a very specific engagement around a topic where we can say, here’s the end goal and here are the steps that you can go through to get to that end goal. And what are the milestones that we can look at along the way, even if it’s just like four weeks or six weeks.

It’s a very specific set of things that we’re doing to get somebody to a particular place. Whereas with advising, there’s also a sort of set arrangement, a number of sessions or whatever, but it’s more like me tossing out advice or ideas, maybe more than I would in a coaching model.

Steve: I’m going to use a word you haven’t used, but when you talk about coaching, I think a little about facilitation. Whereas in the advising, you have a best practice or an idea or suggestion. In the coaching, you’re kind of working along the path to get this person to articulate specific goals, that kind of thing.

Carol: It’s kind of like they are going to do the work to get to a certain place, and I am helping facilitate that. In the way that I work with people in coaching, there’s actually a worksheet that we use. And the worksheet kind of starts with, again, I should distinguish, I’m not doing this sort of big picture, like what is my life about, but I do start with, what’s your mission statement as a researcher? And what is your broader goal over the next few years? And then what are you trying to get to in the next few months, whatever that timeframe is? And it’s a worksheet where it’s like, literally, what steps are you going to take to get there? How are you going to know that you’ve achieved that step? So what milestones are we looking for? What does success look like? When are we going to say you’re done with that step and, you know, maybe move on to a different step?

And so it’s not super linear like that, but it really is like a, you know, a template. And I found that that worked really well. I actually developed the template when I was at NerdWallet, because I found it worked really well for the team to help them think through either the broader, like, I want to get to be a manager, how do I do that kind of thing, or the very specific, they got feedback on a performance review about something and over the next few months they want to work on it. And so it’s a really simple template and approach, but that’s how I keep the coaching engagements to like a particular goal that people are going for.

Steve: So coaching engagements, advising engagements, what are the other ways in which you’re working for whomever?

Carol: So I do workshops, and I have one workshop that’s really targeted to researchers or, I mean, it could be anybody, but mostly the people who come are like lead researchers or managers or senior researchers or designers. It could be PMs as well. That one’s Prioritizing Research for Impact. And you know, there’s a lot of conversation about impact. It’s really the thing that we have had to make sure that we’re measuring, right? And what is impact? We can talk about that in a minute, but the workshop is about how to think through how you’re going to get to impact. It’s not just run the studies that you want. It’s not just run the studies that somebody’s telling you they want. It’s like, what’s the business objective that we’re trying to achieve? What decisions are going to be made if we have this information from a particular study? What do we already know about this? And then we sort of go through this framework based on clarity, risk, and cost. So what do we already know? That’s clarity. What do we still need to know? What’s the risk of going forward without more research, or any research? And what’s the cost of doing research? What’s the cost of developing this?

And there’s a worksheet. It’s really a spreadsheet that we toss all of this information into, and we have the conversation about each of these possible research projects. And then at the end, you can see what’s high priority, what’s medium, what’s low priority. And then we also talk about who you involve in this prioritization process. How do you involve partners? And then when we get to the end, who’s the ultimate decision maker for research? That may not be the person that, I mean, sometimes people come into the workshop and they’re like, well, the person who’s making the ultimate decisions isn’t the person who should be, really. So that’s a conversation to have. And then after the decisions have been made, what are some best practices to convey prioritization decisions? Transparency, you know, share the work, show people how you got to that decision. Hopefully they were either involved in the conversation up front, or someone on their team was who has helped them understand the process. And so nobody is super surprised at the end, ideally.

And then sharing out the results, like literally share the worksheet with everybody that needs to have it so they can see what decisions were made, which projects were prioritized against what other projects. And then for each of them, you know, if it’s low priority and you’re not going to move forward, how do you communicate that? If it’s high priority, how do you communicate that? And then we end up with a lot of things that are sort of medium, like we need to do something, but we don’t need to do a fresh study. And so maybe a researcher is going to go sift through what we already know, and that will save the team time by not doing fresh research, because we already know a lot about it. So we have high clarity, but it is high risk to move forward without doing anything else, you know, and the cost to do this research, meaning go through this stuff, is pretty low relative to the cost of going through development and getting it wrong, which is pretty high. So we’re pulling those levers. In the workshop, we go through this for like three research projects so people can actually do it. By the end of the workshop, they’ve prioritized three projects, and then they can take that back to their organization and use that tool, the worksheet.

Yeah, it’s on Maven, which is actually a really good platform for all kinds of workshops and leadership. There are workshops on AI now; there’s all kinds of stuff in there. So that’s the one that’s about prioritizing research for impact. I also have one that I literally call See Maximum Impact from Customer Conversations. And that’s creating a game plan for, you can call it your research program, you can call it your, you know, customer insights practice, however you describe the thing you’re trying to do by having customer conversations. But that one’s really tailored to, like, leadership.

So the people that come are usually like head of product, head of design, product ops, you know, UX leaders, whatever; it’s a leadership role. It could be somebody who’s starting a research team who’s a researcher and hasn’t done this before, the kind of player coach person we were talking about. But the idea is that at the end of that workshop, we have a game plan. We do a gap analysis: what’s the current state of research, I’m just going to call it research as shorthand, you know, and what’s the ultimate desired state? And then let’s make a very specific three month plan to get there. And we look at infrastructure, meaning tools, processes, training, whatever’s going on there, the operational pieces. We look at staff; that could mean you have a researcher, it could mean people doing research, it could mean there’s some operations person on another team that’s helping you recruit, could be anything. And then organically in the conversation, we start to talk about the research roadmap, because people will come in and they’ll go, well, the most important thing we need to know is X. And so it’s not a workshop to lay out your whole research roadmap.

But those pieces come in: the thing we ultimately need to know is this; right now we need to know this other piece. So yeah, that’s also on Maven. I’ve run it internally within a company for, you know, a handful of leaders, and I’ve also started running it on Maven. The third workshop that I have right now is this training for, you know, designers, PMs, content strategists, whoever, to do their own research. And that’s the thing that we talked about earlier, three parts: planning a study, executing a study, synthesizing to get to actionable insights, and then some coaching. That one I’ve been running within companies, and I’m going to put it on Maven soon. I can still run it within a company, but it’s in the process of also becoming available on Maven.

Steve: The one for leaders, the title has “customer conversations,” not “research,” in it.

Carol: What I’m finding, and I’m not the only one, I’ve been in conversation with a lot of people that are finding this. I mean, in this conversation we’re talking about research, I’m using that language. But you know, my target audience really is like head of product, head of design. And there can be a challenge with the word research in that audience, and I think in the last year and a half it’s become even more of a challenge, in that sometimes people think it means it’s going to be big, it’s going to be expensive, it’s going to take a lot of time. And yeah, sometimes it might be big, expensive, and take a lot of time, if what you need to know is foundationally something really important to your business, right? Something you don’t know that’s going to, you know, make it or break it kind of thing, right?

But I think a lot of what people need is not necessarily that. And I don’t want those leaders to think that having conversations with customers needs to be big, expensive, and take a lot of time. Of course, they’re doing research, you know. But like, if you look on my website right now, the word “research” does not appear until you scroll below the fold. And so I’m experimenting with ways to talk about the offerings that go beyond the word research, because, unfortunately, I used to be much more of a purist, like many, many years ago earlier in my career: well, people need to know that research can be lean. Yeah, people are going to figure out that research can be lean because we’re going to do it, you know. I don’t need to be preaching about it and I don’t need to be stuck on using that language. I think one of the things that’s held us up in the past as a field is that we’ve been too attached to language and process ideas that aren’t necessarily current anymore. And so I’m like, call it whatever you want, you know. Like, we’re going to do this thing and I think it’s going to help your business, and I’m not attached to the word.

Steve: When you started talking about developing this business for yourself, you kind of hinged on what was exciting to you. And I’m wondering, you know, now that you’re up and going, do you find it a different experience when you are doing this, say, through Maven, for the public, for lack of a better term, versus working with an organization and going into that organization? Are there any differences for you between those different kinds of venues?

Carol: You know, I have this really deep background in teaching. And so for me, leading workshops is really fun and it’s really exciting, and working within an organization can also be fun and exciting. It’s just, they are different and I enjoy both. Yeah, I mean, I like bringing people together from different organizations and seeing the kinds of experiences they bring into the workshop, and they get a lot of benefit out of that conversation. I mean, this is the feedback that I get: not only was it valuable to get, you know, the material and the worksheets and whatever insights I’m bringing and the facilitation, but the experience that other people are bringing in, you know, if we do this publicly, is really valuable. And frankly, sometimes I have to really rein it in, because they can just start going on and trying to solve each other’s stuff, you know, help each other solve things, which is great. And I love it when, at the end, you know, people say, let’s connect on LinkedIn and keep the conversation going. I’m actually about to set up a way for people to keep the conversation going across cohorts. So that’s something that I’m going to be doing later this year as well, because there’s so much benefit that people find from checking in, you know? So yeah, it’s different, and they’re both interesting to me for very different reasons.

Steve: You had offered to give a little more definition about what impact meant. So I want to loop back to that.

Carol: I’ve been looking at and following what other research leaders are saying about this too. And I think that one thing that we seem to all agree on is that impact goes beyond what I call product impact. So, you know, pretty obvious that impact means we do some research, we come up with some insights, you know, we take the most important of those and we do something to change the existing product or we move into an area that’s new and we see some kind of impact that we can measure in terms of, you know, lift in engagement or revenue or customer satisfaction or whatever the thing is that we’re measuring, right, from a business perspective. That’s one kind of impact, but there are other types. And so I think there are three things.

One is product impact, like I just described. Two is organizational impact. And that’s stuff like what we were talking about earlier, seeing teams understand better how to work with career researchers by going through the process of learning how to do some research for themselves. I would call that organizational impact. And three is operational impact, which is stuff like efficiency. And this, again, relates back to something I said earlier, but we try to prioritize the most important and most impactful research. We look at something where we already have a lot of information. We have a lot of clarity about this problem, but maybe this team doesn’t know it.

So for example, a real example: there was a new team spun up around a very important initiative. The product manager, the designer, and the content strategist were all new, but there was a researcher that had been doing research in that area. And so the product manager, designer, and content strategist thought they needed to do a six-week sprint to uncover, you know, where they needed to go with this very important thing. And the researcher knew that there was a lot of information already. The researcher spent, you know, something like half a day going through all the information that they had, sat down with this trio, and shared the information with them in an hour. The researcher spent like four hours. We can calculate the cost of that time. It saved this trio the first three weeks of the sprint. We can calculate the cost of the time that they would have spent, and do the math and say, we spent X dollars, we saved Y dollars here. And they were able to go straight to concept testing because there was all this foundational work that had already been done. So that’s an example of operational efficiency. And there are some people talking about this, but I don’t know that we’ve spent as much time on those calculations as a field as I think we could.

Steve: Are there impacts that are not measurable or not easily measurable but still kind of make your list?

Carol: I’m sure there are. I think I’ve pulled my list down to three. I mean, if you look at some of the things that people have been writing about, there are these much more detailed models. I think it goes back to: what are you going to do with this impact? Like, if we want to be able to go back to the leadership team, or we want to be able to put on an OKR spreadsheet at the end of a quarter what our impact was, we need to make it digestible by other teams and leaders. And I feel, having studied this for a while, that everything kind of rolls up to one of those three areas. I haven’t found something that doesn’t roll up to those three areas, let me put it that way. And I think that if we keep it simple like that, we’re much more likely to be able to say, you know, we saved X dollars by not doing a bunch of extra research on this project. And that’s something that we can talk about very clearly. Related to this is that it can be hard to measure impact, period. And we know that. So if you’re a researcher who’s a shared resource across multiple teams, you work on one thing, you go off to work with another team, how are you going to know what team A did a month later, unless someone, you know, comes back and tells you? You may have to go back and ask, hey, what happened from that study? So you know how to describe the impact that you’re having.

But we need to be making the effort to try and find out. I mean, it’s hard. As a consultant, it’s hard for me to know what the ultimate impact is of these workshops and the coaching and the advising unless people tell me. And I also know quite well from my teaching experience, sometimes people learn a thing and then it’s not until, you know, a while later that it actually kicks in for them. So I think that when we’re talking about training that doesn’t have a direct relationship to work that’s happening right now, yeah, it’s hard for me to even know what impact I’m having. But I think it’s really, really important for us to continually try to learn as much as we can about that. Now, obviously, none of us knows the future, and we can’t talk about the future unless we talk about how we got where we are and where we are now, right? So I actually want to back up a bit to the difference between nine years ago and now, because I think it’s relevant to this.

So when you first invited me to have this redo conversation, one of the prompts was, what’s changed? And my first thought was, everything. And then I went back and listened to the original conversation from nine years ago, and I realized, oh, more than everything has changed. And nine years is a long time, so we would expect that there would be shifts. But aside from the obvious, like the pandemic, remote work, that kind of stuff, just listening back and thinking about the way I talked about the work then, the way we all were talking about the work then, and the way we talk about the work now: we’ve been talking about impact. We have not, I haven’t, used the words qualitative research or design thinking. And the last conversation was all about that, because that’s where we were at that point in the industry. And so that was what was making the work successful then. And if we look even further back, at the history of the internet, right? I was at GeoCities in 1998. We were making it up as we went along. And I remember reading the IPO paperwork and it said, we have no idea how we’re going to make money from this thing. And that was normal. And then we had the boom and we had the bust.

And then, you know, through the 2000s, everybody was talking about design thinking, into maybe the late 2010s. Now it’s all about impact. So the way that we characterize the work has really shifted. I think for me also, when I think about the future, being at this point in my career, I start asking myself, what is my legacy? Which sounds really fancy. It’s not like I think I’m capital-L Legacy, like I’m a celebrity or something, but I think we all kind of go, I’ve been doing this for a long time. What am I going to leave this field? What am I contributing, and what impact do I want to have now as I go along? And then what am I going to be leaving whenever I decide to stop this? So I kind of look at all of that and I go, where are we now? What does the future look like? Obviously AI. I mean, we don’t need to say much more about that. We need to figure out how we use AI tools, and that’s changing every single day. How do we use those tools to help the work that we’re doing now?

I mean, when people ask me, what do I need to do? I actually had a call like this yesterday; the person got laid off. What do I need to be thinking about, and what do I need to do to position myself for my next role? It’s like, you need to be studying AI tools. And if you haven’t already done that, jump in, right? So that’s one kind of really obvious thing. I think another thing that we’re seeing now that’s not going to go away, that’s going to be in the future, is this idea of people who do research, right? Non-career researchers collecting some of their own insights. We can’t stick our heads in the sand and say, make it go away. It’s not going away. It’s here. It’s been here for a while, and we need to figure out how to jump on that. We need to be mixed methods researchers.

You know, it’s funny, because when I started, I came out of human factors school and that was very quantitatively focused. And then when I started working, I started in an era when the work was very qualitatively focused. And now we’re shifting back towards generalists. So I think everybody needs to be some kind of mixed methods researcher. And I think most people are going to end up being sort of T-shaped, like you’re very strong in some areas more than others. But I don’t think we can go out anymore and say, I only do ethnographic, deep qualitative research and I don’t know anything about writing a survey. I just don’t know that that’s going to be possible moving forward.

And another area that I think is really important for us, for people who haven’t already been doing this, because some of us have been doing this for a while, is triangulating insights across different sources. So knowing how to dive a bit into analytics data, you know, understanding something about behavioral science if you don’t already, making friends with the people who run customer support so you know what they’re hearing. Like, do you have a market research function? You know, all of these other insights functions that I personally think, and have thought, and have seen work really well, where we’re totally working together in a very collaborative way. I think at a minimum, knowing what they’re doing, if you’re not in an environment where the culture is that collaborative, but having some way to look at things across multiple types of insights functions.

So this is a bit of a personal aside, but it’s very relevant to this question. I went public in January with the fact that I had lung cancer late last year, and I decided to go public with it because I thought it might be valuable to people. In terms of what that did for me personally, it’s like I went through a full examination of my whole life and career: oh my God, do I want to keep doing this? And what I realized is, I’m happy making the contributions that I’m making, even though it’s hard to directly measure impact. I’m hearing from people that they’re finding value from the work that we’re doing together. And so that’s what I want to leave the world with. I want to leave people with this idea that the work that we’ve done together is valuable to them, whether it’s tomorrow, they’re taking the prioritization worksheet back to their company, or we have this coaching conversation and two years from now they see value in it in some way that they couldn’t have anticipated. So I think that’s really vague and broad. But, you know, I’m trying to be as clear as I can about focusing in the areas where I think I can make the best contribution and have the most impact. And what zaps my energy, what gives me energy, like you were talking about earlier: I really like teaching these public workshops, as well as doing the work internally. So I’m going to keep doing the public workshops. Yeah, and just keep reexamining, how do I feel about the work that I’m doing? And what am I getting back from people?

Steve: I think you’re saying that talking about or going public with your medical situation prompted people to reach out to you in a way that highlighted the importance to you of the impact of the work that you’re doing. Is that correct?

Carol: Yeah, it was one of the things that did that. I mean, also just literally in terms of the impact of that article. Many people have told me, oh, I went and got a checkup because I realized I hadn’t been taking care of my health. Oh, I smoked many years ago, I should go check that out. Or, I hugged my child more closely, I called my parents. The human elements of it, as well as the physical health elements, were really rewarding. And I don’t know what I expected, but for some reason I didn’t necessarily expect all of that.

Steve: Well, yeah, you have no template, no prior, for what the response to that is going to be.

Carol: No template. I mean, just to throw this out there, as another example, and I didn’t write this in the article, but I didn’t even know how to tell the clients that I was working with. At first I said, I’m going on sabbatical, and then people thought I was taking a fancy vacation. And then I said, well, I’m taking a medical leave, and then they worried a lot and started Slacking me: are you okay? What’s going on? How are you? What do you even say when you need to take two months off, or whatever it was, if you don’t want to disclose? Because I wasn’t ready to disclose that. So I don’t even have a template for that, is what I’m saying.

Steve: Yeah, now you’ve had that experience, so you’ve learned from that experience.

Carol: And hopefully it helped other people.

Steve: We’ve been talking in and around impact at various levels and yet this article, the examples you just gave from your writing of maybe it’s outcomes, not impact. I don’t know. I don’t want to jargonize it, but the kinds of things that happened as a result of your action that you found meaningful and that people reported back that they found meaningful. I don’t want to take a personal experience and try to force map it into something professional, but I guess I’m just seeing echoes throughout our conversation.

Someone saying I hugged my kid is very interesting. That was the action they took. That was something they shared with you, and that was something that had meaning for you as a result of it. When we started off talking about founding your consultancy and determining what you wanted to do for that, what kind of offerings you had, I was just struck by the fact that you used the filter of what excited you.

Now we’ve been talking about changes and even looking ahead, from the present moment to “future.” I guess I’ll just maybe try to tie those things together. Are there things about the near future, the distant future, whatever time horizon we have for the future, things about the work that you’re doing, that excite you?

Carol: The thing that I’m excited about for this year is to actually do more of the public workshops. And so, I think I mentioned, I’m going to roll out the lean research for designers and PMs workshop to be public. I’ve got some other ideas that I’m working on. Like, you know, some of the pain points that I hear from customers are around finding the right people, finding the right participants for research, which is a lot easier in B2C than it is in B2B, but there are some things that we can talk about. That’s going to be a workshop. And having more conversation around knowing, when do you do this yourself and when do you hire a career researcher? What are the operations that you need to put in place to have your conversations with customers be effective? There are topics like that that I’m exploring for either short workshops or longer ones, because those are things that I’m hearing about. And I like that public forum. So I’m excited to be rolling those out later this year.

Steve: Carol, it’s really great to have this chance nine years later to talk about what’s changed, more than everything, and the work that you have done and are continuing to do.

Carol: Yeah, thanks so much for including me.

Steve: Thank you for taking the time. It’s great to chat with you.

Carol: It’s been really fun.

Steve: That’s it for today. I really appreciate you listening. Find Dollars to Donuts where podcasts are podcasted, or visit portigal.com/podcast for all of the episodes, complete with show notes and transcripts. Our theme music is by Bruce Todd.

The post 41. Carol Rossi returns first appeared on Portigal Consulting.