A solid UX research strategy serves as the compass that guides research ops and helps ensure repeatable success. But how do you envision, create, and execute a strategy that helps you achieve your goals in the most effective way possible?
In this episode of Awkward Silences, Devin Harold, Director of Research at Capital One, unpacks how to craft and refine a winning UXR strategy, including tips to help you make team playbooks, win stakeholder buy-in, and inform your strategy with maturity models.
In this episode, we discuss:
- The significance of a robust UX research strategy
- The definition of a good and bad strategy
- Essential components of an effective research plan
- Aligning research with stakeholder needs and expectations
- Metrics and KPIs to evaluate progress and success
Watch or listen to the episode
Click the embedded players below to listen to the audio recording or watch the video. Go to our podcast website for full episode details.
Highlights
- [00:01:17] Strategy vs delivery, prioritizing projects and allocating resources
- [00:10:13] The role of flexibility and adaptability
- [00:15:38] Gaining stakeholder buy-in and tying research to organizational objectives
- [00:21:05] Establishing KPIs and metrics to measure progress and success
- [00:33:46] Periodic reviews and updates to maintain relevance and effectiveness
Sources mentioned in the episode
- userinterviews.com/awkward
- Good Strategy/Bad Strategy by Richard Rumelt
- UXR Maturity Models: Move to a more advanced level in your org by Nikki Anderson of Dovetail
- UX Strategy Components by Nielsen Norman Group
- The Organizational Appetite for Research by Behzod Sirjani
About our guest
Devin Harold is the Director of UX Research at Capital One, where he leads a team dedicated to improving end-to-end experiences and touchpoints for one of the company’s primary business units. With over eleven years of experience in UX design and research, he has a deep understanding of research methodologies, leadership, strategy, frameworks, and interaction design. Devin’s expertise and leadership have been recognized with multiple awards, including the IDEA Award, Verizon Beyond Award, and Verizon Credo Award.
Transcript
Devin - 00:00:02: I don't believe, in your strategy based on the question alone, that it would be good to sunset one in favor of the other, because then you're not solving the needs of your co-creators. And that, again, is not solving the diagnosis. The diagnosis here is that they still feel like they need that work.
Erin - 00:00:19: This is Erin May.
JH - 00:00:21: I'm John-Henry Forster. And this is Awkward Silences.
Erin - 00:00:26: Silences.
Erin - 00:00:34: Hello everybody, and welcome back to Awkward Silences. Today, we're here with Devin Harold, the Director of Design Research at Capital One. Thanks for joining us, Devin.
Devin - 00:00:43: Yeah, thanks for having me. I'm excited to be here.
Erin - 00:00:45: Awesome. We're going to talk today about Research Strategy, which is a huge topic, and we've got a lot to dive into, and I know it's something that is probably really interesting and full of mystery to both senior and beginner researchers alike. So hopefully a little something for everyone in here. I've got JH here too.
JH - 00:01:05: Yeah, we're 100 something episodes into this podcast now, and I feel like we say this a lot, but how have we not covered this topic before?
Erin - 00:01:11: Yeah, strategy is important. It’s a good one.
JH - 00:01:13: Keeps happening. A timely one. Yeah, a good one.
Erin - 00:01:16: Awesome. Well, let's start. This is a very big question, but let's just start with the big question, then we'll drill in more. But what is Research Strategy, Devin?
Devin - 00:01:24: That's a really great question, and one which I'm sure many people have different answers to. But my answer is that it should be a driving force for the decisions that you make as a team. It can be large decisions like where are we going to invest our time or our budget this year, or the resources that we have. Or it can be small decisions like how do we specifically democratize this particular research method, or how do we consult with our partners who may also do research. And so, to me, research strategy is all about building a muscle and behavior within your team that progresses your practice of research forward towards some form of future state or end goal. And that's really in service to a lot of what you do in your day-to-day. So that's what Research Strategy is to me; it's kind of like a compass that points you north.
JH - 00:02:18: Cool. And how intertwined is it, or is it not, with the company strategy, the product strategy? Are these things all Venn Diagrammed? Are they cascading? How do you think of them fitting together?
Devin - 00:02:30: I think that any good strategy for a research team should bear in mind where the company is going, and you should have awareness of your company strategy, your product strategy, and the strategy of other partners like tech and design. However, for research specifically, I think that it is not only in service to those things, but adjacent to them as well. In my opinion, a good Research Strategy is one which focuses on building the practice of research, which is something that your partners won't really have a heavy stake in, or they're not the ones responsible for it, you are. And so I think it should fit like a puzzle piece alongside other strategy pieces within the business, other strategies that others are working on. But you shouldn't say that your strategy is someone else's strategy or only rely on others to give you the guiding light as to where your team should go. They should be inputs, but that's all they should be: inputs. Because what you do to build your research practice should be something that you own for your team, for your organization. No matter if you're a research team of one or a research team of 200, you're in charge of owning how you mature your practice over time in better service to your partners and your organization.
Erin - 00:03:49: Yeah, I love that distinction because I think so often with planning or strategic work, there is this desire to be really hierarchical. It's like we're going to start up here and we're going to nest in there. And to your point, you have to take that as an input. A recipe for failure is a strategy that isn't aware of the company strategy, but you also have research-specific goals that, to your point, aren't necessarily going to be known to the folks setting the company-wide strategy.
Devin - 00:04:18: Absolutely. I like to take kind of the Ford approach in that case, where he's famously quoted as saying that if you were to ask people what they want, they'd say a faster horse. But he said, "Well, I'm going to deliver a car." If you only ever rely on your partners to tell you what research should be, you're not providing any new inventive ways to deliver new types of research, new research programs, or to elevate your practice in ways they may not have anticipated today. So I love what you said, like inputs and alongside us, but really you should have a voice of your own in terms of building the practice of your team.
JH - 00:04:56: Is this something that's important for large, mature research teams? Small, kind of maybe one-person scrappy research teams? When should people be thinking about their Research Strategy?
Devin - 00:05:06: So people may disagree with my answer to this, but I will say that for me, I think you always need to be thinking about it. If you are just a single researcher within a wider team, or if you are, again, a very large organization, maybe you already have a strategy and that's fine. But you should always be thinking about your vision and your future state, where you want to go, and what are the things that you're doing now and in a quarter from now and in a year from now to get you towards that future state. Because otherwise, and I was thinking about this analogy earlier this morning, I think strategy can be summed up by saying that the work you're doing to deliver research for your organization is like running on a treadmill. You certainly are working and exercising your research muscles, and you can see results and good outcomes of your hard work, but you're not moving forward, you're staying still. A good Research Strategy takes you off the treadmill and moves you onto the track so that you are not only seeing meaningful outcomes from your efforts, but you are moving forward. And you actually can progress as a practice. And that always needs to be thought of. Regardless of where you sit in the organization or how large your team is, it's really important to make sure everyone's focusing on where we are going next as a team or as a research practice to better serve ourselves, our partners, and the business.
Erin - 00:06:35: And so the key difference, if you have one person or 200 people, I guess the impact of having a good strategy is larger if you have 200 people. But if you're one person, of course, you care about how you're spending your time and where you're going and are you making an impact.
Devin - 00:06:52: Absolutely. I think that the altitude of prioritization may look different, but prioritization nonetheless of where you spend your time and the things that you do day to day still should be impacted by a strategy.
JH - 00:07:06: Yeah, I like to run, but I hate treadmills. So, sticking with this analogy, how do you get started? Say people are maybe having this awakening: we need a research strategy, we need to make this more explicit and more defined. Do you just jump right into it? Do you need to start somewhere else? How do people kind of break the glass on this thing?
Devin - 00:07:26: Yeah, there's a couple of things that I would say to get started. And I'm going to channel a really wonderful book for everybody who's listening. It's by Richard Rumelt. It's called Good Strategy/Bad Strategy. And through decades of consulting with Fortune 500 companies and mom-and-pop shops, he really dissected good strategies as being composed of three different things. One is a diagnosis: a diagnosis of where you are and the problems or the issues or the challenges that you're facing as a team or an organization. By identifying those things, you can then create a future state where you want to go, a vision for moving forward. You have to know where you are before you can know where you want to go. And so by diagnosing the issues or the problems you're facing as a team, and then identifying where you want to go in terms of a vision or a future state, you then create a set of coherent actions or goals or objectives. They're called different things by different teams. But essentially, what are the things that you can start doing next week, next month, next quarter, to then progress towards meeting that future vision? And so to get started, what I would recommend all teams do is a bit of self-reflection. Maybe take one of the UX research maturity models that are out there, and as a team-building activity, identify where you feel that you are in that maturity model. There's a great article that was written by Nikki Anderson for Dovetail that talks about different UXR maturity models, and NASDAQ, as well as GetYourGuide, have their own maturity models that they've built for their organizations. So maybe there's a self-reflection: we take the needs of our partners and our business, and we know what our organizational priorities are at large. Where are we as a research team on the scale of maturity? Are people busting down the doors asking us for research left and right, and are they helping us to really prioritize the most strategic, valuable research? That's probably where we all really want to go, right? Or are we reactionary, reacting to smaller research or maybe more aesthetic or validation exercises? And if that's where you are, what's the delta there? That's where I would start, by having some self-reflection. Another good tool for this is maybe to do a self-SWOT analysis with your team: identify your Strengths, Weaknesses, Opportunities, and your Threats. And that might be a great way to get a better understanding of where you are today.
Erin - 00:10:00: Yeah, especially in this market, when things are changing so rapidly, to ask where our research practice is, sort of a meta-analysis of research as a practice, but then where is the company? How can research really fit into moving us forward right now? So that seems like a really good approach.
Devin - 00:10:18: Yeah.
JH - 00:10:19: And is understanding your own research maturity part of your strategy? It might be that we need to mature as a research organization, and so we need to do things that enable that. Or is it to understand how and where you can have impact? It seems like a really good first step, but I'm just curious how you actually then deploy it.
Devin - 00:10:38: Yeah, I love that. I think for me, it is one of the inputs of self-reflection to identify that diagnosis that we talked about. And so if you determine as a team you do want to progress research maturity, then that's exactly what that reflective exercise is for. Where are we on the maturity scale now? And let's say, two years from now, we want to get to the next level. What are the signals of the next level that tell us where we're not today? For example, do we have an insights repository where people can come and learn from research we've already done, or do we do a lot of duplicative stuff today? Maybe one of the ways that you can do a diagnosis of where you are is a listening tour with your partners. Meet with some of your closest partners and with your leadership and ask what is working, what's not working, and ask them: in an ideal state, if you waved a magic wand, what would research look like in its best shape or form? And look at the things that may be similar between what stakeholders say and what the next two or three levels of maturity look like. And maybe that gives you a roadmap to say, okay, well, as a team, we're doing a lot of duplicative work. Maybe you've done a listening tour with your peers, and they say, yeah, we just keep learning the same things. We're not learning new insights or new novel things. And maybe if you have a team who reports to you, if you're a research leader, or maybe you're part of a larger team of researchers, you all are feeling like, oh, we're just not making that much strategic headway. We're not delivering research that informs roadmaps or strategy. Those are all inputs to that diagnosis. And from there, to make it actionable, you would say, what are the things that we want to change to mature how we operate as a team? And those things are going to be the coherent actions that are part of your strategy, those goals or those objectives you stand up. An example of this would be if you've identified you're not doing enough strategic research. Then maybe one of your coherent actions is to increase the amount of strategic research that your team does within the next year, after having identified that as part of your strategy as a team. Once you've identified that's where you want to go, it allows you to start thinking about how are we going to do that? What incentives or rewards are we going to create in the organization to facilitate that behavior or change behavior? Are we going to build an intake form to help identify types of research? Are we going to say no to evaluative research? Those are then decisions you make as a team or as a leadership group to make progress towards that strategy. So that's a little bit of how to get started once you've identified where you are and the pain points that you may be facing as a team.
Erin - 00:13:27: Yeah, practical question, and I imagine the answer is probably it depends. But I'm curious to hear how you think about this, which is how long to spend on the diagnosis phase, right? Sort of similar to how long to spend on research, how much diagnosis is enough to kind of have conviction to move forward with the other steps in developing your strategy.
Devin - 00:13:49: I will say yes, it does depend, I'm sure. And actually, I say it depends not necessarily just as a cop-out. I say it because my whole philosophy of strategy is that you should not be taking a one-size-fits-all approach. One UX maturity model for one organization may not work for your organization. And a democratization model, where people always say, "Oh, democratize evaluative research, not generative research," well, that might not work for your organization. And so the reason why I say it depends is you should start with just having a self-reflective exercise, no more than an hour, thinking through what are the pain points we're facing as a team, on behalf of our partners, and on behalf of our own research team. And if you're left scratching your heads after the hour going, we've only written down three things, maybe five things, that might mean that you need to do a listening tour: just one-on-one stakeholder interviews with your partners and with your leadership to identify their ideal state for research and the things they wish they could do with insights, and maybe a more collaborative exercise with your team of researchers or with your leadership group, maybe your leaders, to identify where you think that you should go in that future state. And that listening tour might take a week. It might take two weeks. But I would try to cap it at, like, a month maximum. That's a lot of time to be spending on that phase, because at the end of the day, the rubber needs to meet the road at some point, and it's better to start moving forward with an early version of a strategy that you can iterate on over time. Because that's the other thing: strategy should not be finite. It is something that continues to evolve with macroeconomic trends, with the changes in your organization, and with the changes in your team. So start with something, which is better than nothing, and then continue to iterate.
JH - 00:15:37: Yeah, I think that's a good point, because when I've done this in the product world, you, for most things, have a strong bent towards "Let's just do the thing and get it going, and we'll fix it and iterate." And there is something about strategy that I think can kind of trap you into "Well, this is important and it's this big thing, and you want to do it well," and so then you become kind of a perfectionist and, to your point, you never actually get to a version. So I think that's a really good point. We've been talking a lot about this diagnosis stuff as kind of internal-facing. Is it also applicable to do that sort of thing towards the business and your users and other things too? Like, we've seen growth in this segment of users and we don't understand them at all, and so there's something there that we might want to focus on. Do you do it in kind of both directions?
Devin - 00:16:18: I think you can. I've seen that work well in the past. The one thing I would address is, what is the intended outcome of your strategy itself? Often, what I find is the work that you're going to deliver for the business or the organization, the actual research you're going to do, that's like the treadmill. You're going to do that anyway. You always want to make sure that your research, just like any department in any organization, is focused on the right stuff. And so for me, in my experience, building a research strategy is more about elevating the infrastructure and the practice of research in an organization adjacent to the business needs. And so if there's a new customer segment, maybe that's part of your strategy for a line of business that you're delivering for. But that doesn't necessarily tell you about your research as a practice. Unless you feel like, "Oh, we don't have the tools or capabilities to even go after that customer group," then maybe that actually starts to tie into building the resources, the budget, the tools to spread your impact across different customer groups that your business is tackling. So I could see it playing a role. I think that it will be largely dependent on your organization and what you're facing, but I think you'd be remiss not to consider it when you're thinking about building your strategy.
JH - 00:17:38: Just to play that back to make sure I'm getting it. Sounds like what you're saying is this research strategy is really about how you maximize the impact and leverage of the research group and what infrastructure or changes or things you need to do to do that, and then requests or inquiries around understanding groups and stuff are going to come up anyway. So that can be a little bit secondary. Is that a fair paraphrasing?
Devin - 00:18:00: Yes. And instead of saying secondary, I would just say like, again, adjacent to I think that they're both important in parallel, absolutely.
JH - 00:18:08: Cool.
Erin - 00:18:09: Lots of geometry. Love it.
Devin - 00:18:11: Yes.
Erin - 00:18:12: So we talked about kind of step one, the Good Strategy/Bad Strategy-inspired diagnosis. You've spent an appropriate amount of time there, not too long, and you're going to revisit, you're going to constantly re-diagnose. Then we move on to guiding policy. Can we talk a little bit about that step?
Devin - 00:18:29: Yeah. So, as I mentioned, Richard Rumelt says that once you have your diagnosis of the things that you're trying to solve for as a team, then you create a guiding policy. What I've seen in practice, after reviewing different ways that folks like Nielsen Norman Group or others may recommend strategy, is that they really elevate this idea of a vision. So I think that they can actually be interchangeable. A guiding policy is: what is the overarching thing that your team is striving to do to solve for most of those pain points you've identified, or at least the most salient pain points that you've identified? So a guiding policy is something that is in service to this future state for your team, your vision. What I would recommend is, after looking at all the things that you've learned from your peers, your leadership, your customer groups, and from your own team, ask what are the things that you want to do more of and do less of as a team. From there, identify what your future state is. Imagine two to three years out: who do you want to be as a team? What does that look like? An example is if you want to be doing more enablement within your team. Let's say that, after your listening tour and your diagnosis, you say there's a problem with people wanting to do research but not knowing how to get started, and we, as the research experts, want to be enablers of that, so that we're the center of good-quality, well-managed research in the organization and people come to us as the experts. If that's your future state and that's not where you are today, then that should be part of your guiding policy or that vision. So work with your team. Do a team-building exercise to take the inputs from your diagnosis and start to generate a future state. And the thing that I think works really well is making sure that when you're building that vision, you have the Who, the What, and the Why. So an example of a future state might be: we want to be enablers of smarter decisions for product teams with customers at the center. Maybe that's not who we are today, but that's who we want to be in the future. And that's that guiding policy, the thing that is in service to all the actions you're going to take as a team this quarter and then the next year against your roadmap to get you there. There are lots of great opportunities for you to leverage those insights that you built from the diagnosis phase and then to deliver on that future state. So I see that as the vision and where you're going.
Erin - 00:21:05: Yeah.
JH - 00:21:05: Do you have any advice, when you talk vision and guiding policy and stuff, on how to get it at the right level of specificity or altitude? I feel like sometimes when people go into the vision stuff, it just keeps getting loftier and loftier and vaguer and vaguer. But also, if it's too specific, it's not really a vision; it's going to be a little bit too narrow. Any rules of thumb or ways of doing that well?
Devin - 00:21:25: Yeah, I would say two things. First, have a timeline against it. So don't just say you want to be this future thing that everyone likes but that would never actually happen. We want to be the CEOs of the company driving every last decision. Well, is that attainable? Is that realistic? And so, yes, I see SMART goals coming up, and I absolutely agree. Make sure that it is specific enough, that it's measurable, and that it's time-bound in terms of the amount of time. Maybe give yourself three to five years. I wouldn't go over five years, because it's often really difficult for people to think that far in advance. But give yourself that leeway, maybe a block of three to five years out, and say, what do we think that we can realistically accomplish in that amount of time? And use that time box to think about your vision. Because to your point, if it's too aspirational, you're not going to get your partners or even your team to buy into it. And then, unfortunately, you're not going to be able to have folks rallying behind the coherent actions that you need to take in order to make that vision a reality.
Erin - 00:22:35: As you're describing this, I'm thinking about it too a little bit as sort of how do we want the research team to be perceived internally, right? Not in a shallow way, in a deep way. When people think about the research team, what is it they think we can do for them and can do for this company? Which is sort of like when you think about a vision or a mission of a brand, right? How do we want the market to perceive us? What is it that we're doing? What pain are we solving? To your point, right? Yes, I've diagnosed the pain. How are we solving that? And you want people to understand that in your organization because that will make you more effective.
Devin - 00:23:12: Absolutely. And that's why including your partners or others in the organization as part of that diagnosis that leads to that guiding policy, they're somewhat bought in. Because when you then proclaim what your vision is and where you're going as a team, they see themselves in that and there's automatic buy-in there. It's not like you're coming out of the blue saying we're the research team, we're going to do this thing and everyone goes, that's not really going to drive value for me. And that's why you make sure that you're building that vision, that future state off of actual true pain points that you're facing in the organization and the things that you can solve for as a research team.
JH - 00:23:49: Just building off that. Does this sort of then need to be something that is done by the leader of the research team? Is this ultimately their responsibility to kind of drive forward and work with stakeholders? Is it a whole team thing? Can individual contributors kind of kickstart it? Any thoughts on that piece?
Devin - 00:24:05: I am not one to care where the thing starts, right? I think if you have an appetite for wanting to take this on for your team, run with it. Start the conversations. Because what you don't want is your title or your rank to be an artificial inhibitor to what the team really should be focusing on, which is making sure that you have that vision state or have a way in which you know where you're going and you can track progress in how you're getting there. And so I think anybody could start it. Strategy could come and start from anywhere. I do feel though that in order for this to work, you do need leadership buy-in and you do need that accountability. And so if we think about a RACI, right, I would probably say everybody could be responsible in the research team, across the entirety of the team. And I think it should be team-wide exercises to build internal buy-in and really feel the importance of your strategy, the importance of the things that you're working on. But it ultimately should also mean that the leader of that group is accountable for making sure that we are making progress. You're setting the policies that will help support your team and support progress towards those goals.
Erin - 00:25:16: So we've talked about, we've diagnosed, we have our guiding policy, our vision, and then we get a little bit more into the brass tacks, right? Like, what are we actually going to do, how are we actually going to live this policy and solve some of these problems we've found? So how do you go about identifying that stuff?
Devin - 00:25:35: That is often what I've seen and heard. When I talk about strategy with peers or with others in the industry, it's often where they actually stop. They go, "Oh, we have these big aspirational goals," and then they don't make it real. And I think what you need to ask yourself, after you have that future state and that guiding policy, is what are the things that you need to do to get you there? I would start by just asking yourself a set of questions. What are the tools that you need to make that future state a reality? For example, if you want to do more generative research, do you have a diary study tool? Do you have a proper self-service tool that will enable you to run user interviews and really build deep empathy with your customers? That's going to be important to meet that vision. Do you have the right budget? If part of your future state is that you want to make sure that you're staffed against the entirety of every line of business in your organization, so you have a researcher per line of business for full coverage, if that's part of that strategy, you're going to need to start making conversations happen now on what the ratio looks like in researcher-to-designer-to-developer support. And so ask yourself: what are the skills that you need in order to make that future state happen? What are the tools or resources that you need? And then again, that self-reflection of where you are now. After you ask those questions and you reflect on where you are now, start to think about what are the actions you can take, beginning tomorrow, to actually make headway on this. Take the example I mentioned earlier: if you didn't have a research repository but you need one, well, are you going to start with an MVP in Google Slides or a Google spreadsheet? If that's going to be your interim solution, then someone needs to build that. So determine your timelines around actually building the capabilities or the skills that you need in order to get there. And that's what we call coherent actions. After you have that guiding policy, you have a set of coherent actions that your team is going to take and prioritize as a research practice that will solve for all those questions that I mentioned earlier. Make sure that you build the skills that you need, whether that's upskilling your team by creating a training curriculum through the year and then administering training on different research methods, if that's one of the coherent actions you've identified, or maybe it's informing your hiring practices, so you're hiring researchers with the skills that you need, that your business needs, to meet that future state. All of those decisions you can make as a team or as a research leader, but those become your coherent actions. And then, of course, with SMART goals, and I know that there's lots of love in the chat for SMART goals, make sure that you have timelines against that. And that's where Nielsen Norman Group, when they talk about research strategy, highlight that the third piece they recommend outside of a vision and goals is a plan: making sure that you then put those actions against some form of timeline. When will you have an interim state for your first version of an insight repository? When will you be able to hire a researcher who has heavy quantitative skills, because that's what you determined you need?
When do you think you're going to be able to have budget conversations to get the tools that you need for doing more generative research? Those are then set against a timeline that you can spread out across the next quarter, the next year. And to meet that vision, as I mentioned earlier, time-box it to three to five years. You could even work backwards from there and say, "Oh, in two to three years' or five years' time, we need to meet this future state. Is that realistic now, after scaffolding all the details that I need to do as a team to build these things?" So that's how you make it a little bit more real: just asking yourself those self-reflecting questions of what do you have today, and then what are you missing as a team to get you to the future state? And everything that you're missing, make sure that it ties back to those original problems that you diagnosed. That last part, and then I'll end it here and throw it back to you for any other questions and thoughts, is the one thing I don't want to miss: when you're building those coherent actions, always ask yourself, "Are these solving for those original pain points that we identified earlier in the process?" Because you might have a full list of things that you could do to meet that future state, like "Yeah, we're going to build a repository. We're going to hire more researchers. We're going to stop planning usability tests, and we're only going to put interviews and diary studies on our roadmap, and those are the things we're going to start doing." But if those actually don't solve all the problems that you identified earlier, that your guiding policies are in service to, then you may not be focusing on the right actions. So you may want to prioritize, from the list of things you could do, what are the things that you should do, and prioritize and scaffold them to build your roadmap.
JH - 00:30:46: All right, a quick awkward interruption here. It's fun to talk about user research, but you know what's really fun is doing user research, and we want to help you with that.
Erin - 00:30:54: We want to help you so much that we have created a special place. It's called userinterviews.com/awkward for you to get your first three participants free.
JH - 00:31:05: We all know we should be talking to users more, so we've gone ahead and removed as many barriers as possible. It's going to be easy. It's going to be quick. You're going to love it. So get over there and check it out.
Erin - 00:31:14: And then when you're done with that, go on over to your favorite podcasting app and leave us a review, please.
JH - 00:31:23: How much do you think about planning the plan? And what I mean by this is there's a lot of budget and headcount-type things in here that you're using as examples. If you know, for instance, that your company does that stuff in November and December every year to set up for the year ahead, is it important to try to be intentional about refreshing your research strategy ahead of that, so when you go into those conversations you're like, hey, here's why we need that extra headcount, or here's why we need this new tool? Versus if you're doing this in January and February and you kind of missed the boat, now you've got to wait a full year. I don't know if that makes sense, but is there something there, that you want to play with the timelines of your organization in a smart way?
Devin - 00:32:00: I would say absolutely, and to that end, better late than never. Do it anyway. But if you have that organizational knowledge, that is just phenomenal, because then that will impact the sequence of your coherent actions and the timeline that you stand up for things that you want to prioritize now versus prioritize later. You should do that based on the timing of your organization, like you mentioned. And then the most salient pain points: if the number one pain point from all of the feedback that you've gathered from your partners, let's say product and tech and design, is that they say it's not clear what you do, I don't know who you are or what kind of research you can supply, then to me, maybe creating a team playbook is the most important thing that you really need to do, even before you ask for budget. So knowing that, oh my gosh, next week or next month is budget conversations, that might not be the most important thing. So really make sure that you lean on the feedback loops and the diagnosis from earlier, with the most salient pain points, even ahead of those organizational requirements, because that will make sure that you're gaining traction and improving your maturity as a practice over time, because you're focusing on the most important stuff.
Erin - 00:33:20: Yeah, I was going to say that feels like a really important part of looking at some of these maturity models out there: which step is really solving a real problem for my organization right now? And it might not be linear, right? Using those as starting-off points and really asking, where do we need to go to solve the problems we're having or to address the opportunities that we aren't realizing? We have a lot of questions. Should we answer some of our audience questions here?
JH - 00:33:46: Yeah, I see one in here from Bill that feels pretty relevant to what we're talking about, so I'll throw it out here. It's asking about how much time you spend strategizing versus doing. So obviously the strategy stuff takes time and bandwidth. Just curious how you think about making space for it amongst just doing the research work.
Devin - 00:34:03: That's a great question. I think strategy should ideally be revisited quarterly, in my experience. Quarterly at least, looking and saying, hey, are we still focusing on the right things? Are we prioritizing in the right way? But a deep dive probably every six months. And I say that because, again, markets change, teams change, organizations change. And in terms of timing, it doesn't need to be some lofty three-weeks-at-a-time thing. We're talking about maybe pulling up your strategy. I would recommend, by the way, for what it's worth, when you have your guiding policy, your future state, and then those coherent actions, put that somewhere. Document it, make it known to the organization and to your team, on a single page if you can, and pull that up every quarter and say, is this still what matters? Are we prioritizing our work toward the things that will help progress us? And do that in a team activity, no more than two hours together, and say, is this the stuff that we should be focusing on? Is this where we should still be heading? And then every six months do another feedback loop and just take a week to gather feedback from the organization. Maybe do another SWOT analysis to reflect, and then update your strategy if need be. But I don't think it needs to take more time than that. I don't believe strategy versus delivery should be 50-50, because if you have the right strategy and if you've done the right homework, it shouldn't need to change all that often in the first place.
Erin - 00:35:38: Yeah, kind of measure twice, cut once. And part of what struck me when you were talking about strategy is that it's a shortcut to good decision-making, right? You do that kind of deep thinking up front, and then you don't have to work so hard to make every decision after that.
Devin - 00:35:57: Absolutely. And it's funny, because just the way you said that made me reflect on one thing: building a strategy is like doing research on your own team. Think about customer personas: you build personas to then aid in decision-making and prioritization for future product releases and backlog discussions without needing to do more research. That's a strategy. You do all the hard work upfront to identify what are the problems you're solving as a team. You build that into the DNA of what your team is doing and how you prioritize your work. Which means you don't need to keep revisiting that time and time again. You have it. So use it to make those decisions.
Erin - 00:36:39: Right. Yeah. I like this question from Jhoyce here. JH for short. She says, leadership only values reactionary research. Any advice on how to get buy-in for investing in a long-term research strategy at all?
Devin - 00:36:53: I would say, as part of your diagnosis, one of the things to look at, and I'm going to throw another article at everybody: Behzod Sirjani wrote a beautiful essay called "The Organizational Appetite for Research," and it has a list of questions that you should reflect on and ask yourself. What are the capabilities of your team and your organization? What are the things you can do and provide? And then, what are the needs of your organization? What are the things that they're asking for, and what may they need but not know to ask for? And what I hear that question describing is there's an appetite for reactionary research and there's a desire for your team to do more than that. And those things are constantly in flux. We read that all the time as UX researchers, and that's the bane of our existence in the industry. What that tells me is your strategy may need to consider how we still solve that need for stakeholders, delivering on the reactionary stuff, and communicate: we're still doing that, but we're also going to deliver value over here with the proactive stuff. I don't believe, in your strategy based on the question alone, that it would be good to sunset one in favor of the other, because then you're not solving the needs of your co-creators. And that, again, is not solving the diagnosis. The diagnosis here is that they still feel like they need that work. And so when you're developing your strategy, I would advise you to think about a way that you can show your partners and your leadership that you can still maintain or sustain good-quality work in that area and then start to build value over here. And often that means just doing it once and creating a really great case study for the value that it provides. Get buy-in for at least one project and then use that as your rallying cry to say, "See, we did this one really generative project and it led to all these great decisions." Document that. Look at the impacts, and document and measure how that affected the organization. And then maybe over time, you'll start to shift behaviors where they're going to start asking for more of the proactive and less of the reactive, but make sure you accommodate both, at least to begin with.
Erin - 00:39:07: Yeah, that's a great point. And as we were talking about earlier, a little bit of this is where we want to get over time, having that kind of time horizon in mind, and we're going to build toward that slowly and take some steps instead of doing a total 180.
Devin - 00:39:22: Erin, you nailed it. And reminding yourself that strategy is just that: it is not an overnight switch. You cannot change an organization's culture or behavior overnight. And that's why it's that future state. It's an aspiration that you're striving for, but it should be attainable. So knowing that, take every win as you can and celebrate the wins along the way, but know that it's going to take time. I love that push, Erin.
JH - 00:39:50: Totally. There's a good, kind of specific one in here from Aaron Wilson: do you use KPIs in a user research strategy? Do you have any metrics that you're tracking?
Devin - 00:40:00: Absolutely. And I may not have mentioned this clearly enough, so I apologize, but the idea of the coherent actions is that those should be really specific in terms of what are the things that you need to build, the things you need to gain, or the things you need to do to meet that future state. Well, how do you know you've done it? How do you know you're there? How do you measure your progress from where you are today to where you want to be tomorrow? That's with objectives and key results. That's with KPIs. I would absolutely say that you need to figure out, for each coherent action, how do you measure where you are today? How do you measure your progress against where you want to be tomorrow? And make sure that you have those things very top of mind; they are an integral part of those coherent actions. For example, I mentioned the enabler thing earlier. If you want to enable more decision-making and more research, maybe you're going to deliver training to the design team as part of it. I would recommend benchmarking the design team's current level of knowledge in certain research methods, or in research as a practice. And after you've delivered training, let's say on usability testing, you deliver this big training, you deliver templates and everything. Ask those questions again. Benchmark it: did your training actually move the needle? And after every training session, or after a quarter, if you look back and go, we're not moving the needle on our understanding of research process or research methods, that means you need to pivot; your approach isn't working. But you wouldn't know that if you weren't measuring it. So absolutely, agree. You need KPIs, you need some form of measures.
Erin - 00:41:41: This feels like the area that you're going to probably adjust the most frequently in your strategy, right? You're going to take another look at your diagnosis and make sure that your vision is staying relevant, but you've got to constantly update those success markers, the coherent actions, to move that forward.
Devin - 00:42:00: Absolutely.
Erin - 00:42:01: And would you recommend you kind of do that on a quarterly planning cadence with the rest of your company or just sort of as needed, one in, one out?
Devin - 00:42:11: I think as needed. If you pull up one of those measures and you're like, oh, we're not doing what we should be doing, or maybe our approach is not working, pivot, then don't wait a quarter. But I would have an intentional cadence to look at those measures month over month and quarter over quarter just to make sure that it's not too late by the time you've invested so much time or energy into some of those things.
JH - 00:42:36: This is actually kind of related to all this. So from Diego: between those quarterly or semi-annual checkpoints with the team, what do you recommend teams do to ensure they're still aligned and focused on their agreed strategy? So I guess maybe a simpler way to put it is, how do you actually live it day to day, week to week, and not just forget about it until you revisit it every quarter?
Devin - 00:42:54: Love that! I would recommend that whatever project management tool you're leveraging, whether it is Basecamp, Airtable, or Jira, tie your work to that. If there are infrastructural things that you're doing, let's say a member of my team is going to be building the training curriculum for designers, make sure it's a SMART goal: we're going to have a curriculum by the end of next week, and then we're going to start rolling it out by next quarter. Tracking that in the same manner that you're tracking your actual research projects, by creating those internal projects for yourself, that's really important. If the projects that you're delivering as a team are already tied to it, tag them as such. If there are tags that you can leverage, like in Jira, where there are different components and things, add a component that's relevant to the specific strategy that you're focusing on so that it's top of mind for the team that's delivering it. In addition, there is something that we leaders of research practices can also do, and that is provide incentives for folks to make sure that we're living our strategy or making good on the strategy. Are there things that you can promote by using performance evaluation as a way of making sure people are focused on specific elements of your strategy? Or are there things that you can build with other leaders adjacent to you? Maybe you meet with the other directors of design alongside your research practice and say, "Hey, your designers and you want to do more research. Let's talk about a performance management expectation moving into next year where at least 10% of their time is spent on learning from our customers, or at least one out of every five projects has research involved with it, and they're the ones leading the research." So there are some policy or human governance changes that you can make to facilitate that behavior a little bit more, rather than just crossing your fingers and hoping people are focusing on it.
Erin - 00:44:55: We always get lots of folks listening who are trying to break into UXR, so let me ask one of those questions. This is from someone listening on LinkedIn Live: what would you have emerging UXRs be proficient or experienced in as they think about entering the field?
Devin - 00:45:12: I would say proficient in, I wouldn't even say experienced in, because obviously if you're new, it's hard; it's like the chicken or the egg. But be proficient in knowledge of varying research methods. What are the different core methods of a user researcher, what are the pros and cons of those methods, and when might you employ those methods? That's really important when breaking in, because if you haven't walked the walk before because you're new, then that's fine. At least be ready to talk the talk, because if you know it on a conceptual level, when you get the chance and opportunity, you can put it into practice with greater confidence.
Erin - 00:45:50: Devin, someone had a related question, which was: do you have a favorite research method?
Devin - 00:45:56: I do. I actually think interviews are my favorite research method, primarily because I'm a people person. I like to talk to people, I like to learn from them. And just that one-on-one live conversation is always really energizing for me.
Erin - 00:46:10: We love user interviews, too.
JH - 00:46:14: We've talked to so many researchers who always end up revealing, like, "Well, I'm very introverted and so I love interviews, but then I need time." So it's nice to see some representation on the extrovert side. One here that I'm actually curious to get your take on: how do you address it when something cannot make it into the strategy? I'm curious, is that a feature or a bug, or how do you think about that?
Devin - 00:46:36: I think that's the entirety of the point. A strategy should help you prioritize. If everything made it in, it's probably not a good strategy. So I would say when something doesn't make it in, that's beautiful. That means that you're doing your job. Because when you look at the most salient pain points that you're solving as a team or as an organization, tied to that future state vision, that should be what is really helping you steer towards the coherent actions that are the most important things, at the sacrifice of other things. Because there's only so much time in the day, and, as the question earlier about how much time to spend on strategy versus delivery gets at, there's always a million and one things to do. There's always a thousand projects and way too much research to get done and not enough hours in the day to do it. And that is a reality. So how do you prioritize and progress in your strategy with those coherent actions? It's by dropping the things that just aren't worth your time right now. It doesn't mean they're not important. They're just not important right now in prioritizing.
JH - 00:47:40: Yeah, I think it's Farnam Street, the blog, that has an article about focus, and there's a quote in there that I love. I'm not going to recall it exactly, but it gets at the spirit of: you're only focusing when you start saying no to stuff that's unpleasant to say no to. If everything's easy to say no to, you haven't actually gotten to the point yet where you've really focused, and I think that's really helpful to remember.
Devin - 00:47:59: I love that.
Erin - 00:48:00: Yeah, we have a bunch more questions. I don't think we'll get through them all, but maybe we could do like two more, and then I know we also wanted to talk about common failure modes. We've talked about sort of the happy path of setting up your strategy and how that might play out in practice. But is there a way you see folks kind of go wrong that they can avoid by knowing what to steer clear of?
Devin - 00:48:24: Yeah, I'll share just two things. One is when there isn't enough buy-in for the strategy, and actually that's an ongoing thing. You've prioritized actions that don't solve well enough for those pain points you identified earlier, and so you're delivering solutions that are looking for problems. That's always the worst. That is one thing that I see, because people put maybe what they read in an article above all else, or maybe they put their own desires or where they want to see the team, but it's at odds with where the organization is. That's one of the pitfalls. The other thing is, and we actually had a great question that sparked this conversation, when you're not measuring anything. It's great to do all this work, but when the organization starts to mature and your practice is progressing towards that future state, how do you get credit for that? And how do you elevate that and say, hey, you see all that great research that the design team is doing? Or hey, all those questions that you answered because of the insights repository? Yeah, we did that. We were able to answer five different questions this past week that allowed us to avoid unnecessary research that would have been duplicative, and that maybe saved the organization overhead cost because we didn't spend our time or priorities chasing down answers we already knew, because our team built that repository. So tie back to those measures and be able to elevate your success by using those KPIs as a way of determining not only how far you've come but also to elevate your accomplishments. I think that's a pitfall: people aren't measuring the great work that they're doing.
Erin - 00:50:13: Vic here actually had a follow-up to the earlier question, when we were talking about the metrics: if you want to, for example, train a team on usability testing and kind of benchmark the before and after, how do you practically do that? What metrics are you using to evaluate their understanding before and after?
Devin - 00:50:34: That is an interesting question. In my experience, what I've done, and what I would recommend you try, is a skills map with teams of mine. In the past, I've done a skills map where we just have them look at different hard skills and soft skills that researchers deliver within an organization and rate their level of comfort on a scale of one to five. Are they really comfortable in teaching others about that particular method, or have they not really practiced it before and they're not really sure of the concept? Maybe use a Likert scale like that, one to five, define what each of those levels means, and have designers rate themselves on their level of comfort. That's one approach you could take. Another one is even simpler than that: saying, hey, we're going to be doing training on usability testing. What's your level of knowledge of how to run a usability test, one to ten? And then afterwards, hey, we just completed the training, what's your level of knowledge of usability testing? I think it could be as simple or as robust as you feel is necessary to capture that information. But those are just a few ideas.
Erin - 00:51:42: That's great. It doesn't have to be so hard.
Devin - 00:51:44: That's right.
JH - 00:51:45: What's nice about the skills map, too, if you have a team of a couple of people and you share it, is that people on the team know, oh, Devin's an expert in this thing or really strong at it, and I can go to them for advice, and vice versa. And we've done that within our design team, and it's really helpful for folks to kind of know who they can lean on for advice in different areas.
Devin - 00:52:01: That's right.
Erin - 00:52:02: JH, do you have a favorite last question here?
JH - 00:52:04: I don't know that I do. There's many, I'm having trouble choosing. You got one?
Erin - 00:52:08: Sure. Do you have any advice on how to engage with people in a company who like the idea of using research but struggle to engage with research? So maybe this relates a little bit to that vision. How do I work with your team? I know what you're trying to do. How do I get involved?
Devin - 00:52:25: Yeah, what I've seen be very successful in the past is to at least have a point of view on your engagement model. And so I mentioned earlier an example of a team struggling to understand who you are, what you do, or what value you provide: a team playbook outlines exactly that. Here are the members of the research team. This is typically what we do. Here's how to engage with us. That's something that I've personally built in my past roles, and I literally just put it in front of people and said, hey, this is how you get started with working with members of our team. Any questions? And then you get feedback. Is this clear? Does this help clarify what we can do for you, how we can work together on your next project? And that's one of the ways that I've seen work really well, because it's documented. They can refer back to it. They can be evangelists of it once they've lived it. I think another way is just to try to get involved in a project that you're interested in, that they're working on, and build that engagement through practicing it. Practice makes perfect, they say. "Learning by doing" is another idiom that's out there. I personally learn very well by doing things and by trying them out. And one of the best ways to really elevate the culture of research is by involving our partners along the journey of where we want them to go or who we want to be in terms of how we work together. So I would pick a project, maybe one that they're working on or you're working on, and identify a way that you can collaborate with them that lives and breathes that engagement model. So then when they see the flowchart on the screen, they're like, oh yeah, I know exactly how that works, because they've been there.
Erin - 00:54:05: Awesome.
JH - 00:54:06: Nice. I love that.
Erin - 00:54:07: Well, we're just about at time, and we answered so many questions and I think covered most of what we wanted to cover. I think this will be a wonderful resource for so many people at all sorts of levels and all sorts of companies. So thanks so much, Devin, for joining us.
Devin - 00:54:20: Of course. Thank you so much, Erin and John-Henry. Really appreciated the conversation and the time today.
JH - 00:54:24: Yeah, this is great.
Devin - 00:54:25: Thanks.