There are many ways to look at data. Two views that can be clarifying for learning businesses are the performance view and the potential view. You can use data to understand how you’re doing currently. How is your learning business performing? You can and should also use data to understand possibilities. How could you be doing? What products and services might you add or sunset or change?
In this episode of the Leading Learning Podcast, number 409, co-hosts Jeff Cobb and Celisa Steele discuss using data in decision-making.
To tune in, listen below. To make sure you catch all future episodes, be sure to subscribe via RSS, Apple Podcasts, Spotify, Stitcher Radio, iHeartRadio, PodBean, or any podcatcher service you may use (e.g., Overcast). And, if you like the podcast, be sure to give it a tweet.
Listen to the Show
Access the Transcript
Download a PDF transcript of this episode’s audio.
Read the Show Notes
Jeff Cobb: [00:00:00] There are many ways to look at data. Two views that can be clarifying for learning businesses are the performance view and the potential view.
Celisa Steele: [00:00:12] I’m Celisa Steele.
Jeff Cobb: [00:00:13] I’m Jeff Cobb, and this is the Leading Learning Podcast.
Celisa Steele: [00:00:22] Data [date-uh] or data [dat-uh]? Singular or plural, as in “the data show” or “the data shows”?
Jeff Cobb: [00:00:29] Yes, data raises a lot of questions. But it has—or they have—a lot of fans too. Some people swear by the need for data to make decisions, and, even if they’re not extremists about it, most people will at least say they buy into the importance of using data.
Celisa Steele: [00:00:47] Yes, and, Jeff, you and I personally believe that data should play an essential part in shaping the strategy and the tactics of a learning business. That’s why we’ve decided to focus on data in this episode, number 409. And we’re particularly focusing on the role of data in helping learning businesses make decisions.
An Event and a Session on Data for Decision-Making
Jeff Cobb: [00:01:09] That’s right. One of the reasons we’re thinking about data these days is because of a recent in-person event that we were part of.
Celisa Steele: [00:01:19] That’s right. ASAE (the American Society of Association Executives) held a one-day, single-track event in person at their offices in Washington, DC, on April 18th, called Learning by Association: Launching Successful Professional Development Initiatives.
Jeff Cobb: [00:01:40] This was a pretty focused event—pretty tight focus around launching professional development initiatives. There was just a single track to it. It’s a minimum viable product approach, I guess you would say—a way for them to test out whether this was something to do or not on an ongoing basis.
Celisa Steele: [00:01:59] Yes. When you say for them to test out, it might be worth unpacking that a little bit because I believe that the Professional Development Section Council was particularly instrumental in coming up with the idea for this event and pulling it off—then, of course, well supported by ASAE staff in doing that.
Jeff Cobb: [00:02:16] Right. ASAE has a lot of events for association professionals. Obviously, that’s the focus of the American Society of Association Executives. But somewhat missing from that has been a focus on the professional development area, and we know that education is a huge part of what most associations do. Most of them have it as a revenue source, which tracks back to what we focus on: reach, revenue, and impact. That revenue part—big thing for associations. Impact—big thing for associations. But not a lot of dedicated professional development programming up until now.
Celisa Steele: [00:02:54] Yes, the PD Council realized that there was this gap, and so they decided to put together this event and roll it out. Diane Elkins, who's the chair of that committee right now, at one point said she started chanting, "We're number four! We are number four!" That's because one of the key data points they had going into this is that, out of all of the different segments that ASAE represents—membership, for example—professional development turns out to be the fourth largest group. So it's a very significant segment of the people that ASAE seeks to serve, and yet there was no dedicated conference for them.
Jeff Cobb: [00:03:35] There was a little bit of data there. They had that data saying, “This is an important segment.” So that’s established clearly by data. We also know clearly that there are not dedicated offerings for that segment. ASAE has done various things in the past to fold in professional development here and there. But, in terms of really dedicated, that hadn’t happened. Of course, there’s been a lot of anecdotal input and good old common sense applied to say it would make sense for us to have something here that we could then gather more data from to determine what we then do going forward.
Celisa Steele: [00:04:08] I think the timing of this happening in 2024—a few years after the initial lockdowns caused by COVID-19—was an important factor in the decision to hold this in person. There was a lot of energy in the room that day.
Jeff Cobb: [00:04:30] Yes, and we've seen this a lot. There's still a lot of pent-up desire for community, conversation, and networking. People are still seeking opportunities—post-COVID, after Zoom consumed our lives—to engage in that sort of peer-to-peer interaction. And you could definitely feel it in the room there.
Celisa Steele: [00:04:51] You could feel it in the room, but that’s not to say that there weren’t also comments that showed the difficulty for some people in showing up in that room for a full day. Comments about, “Wow, this is starting really early and going really late.” If you happen to have responsibilities at home, perhaps around getting kids to or from school, it can be a challenge to show up at 8 am or a challenge to stay through the reception that went until 6 pm, for example.
Jeff Cobb: [00:05:19] That's right. Those were data points that came from our being together. So much data can be gathered in real time. By just trying something, you learn so much. And, of course, accessibility and inclusiveness were part of the agenda for the event as well—one of the important topics covered in this single-track, focused day we had together.
Celisa Steele: [00:05:42] Right. There were essentially six sessions, with transition times built in between each hour-long segment. Those were also billed as being for networking—for some of that peer sharing and peer discussion that can happen very naturally and informally when you're all in the room together. But, you're right, one of those sessions was around inclusivity—and thinking very broadly about inclusivity, as one should: not just in terms of people with a disability but also in terms of what I was just referencing around people with family responsibilities. You want to take into account as many of those factors as possible when you're offering an educational program so that you're not inadvertently cutting out potential learners who could value and benefit from what you're offering.
Jeff Cobb: [00:06:33] Just a quick taste of some of the other topics that were on the agenda. Started out with talking about a foundation for success. How do you, from the ground up, strategically determine what you’re going to develop in order to serve your membership? That was the “Program Success Blueprint”—the name of that session. We did a session following that that was really focused on data: “The Business-Optimized Portfolio.” We’ll come back to that. I will say something I’ve noticed—and maybe it’s that confirmation bias or recency bias or whatever—but data seemed to run throughout these sessions. In that first session around having a blueprint for success, data was very important. Obviously, the focus of our session was on data.
Jeff Cobb: [00:07:16] We then had the session on inclusivity and accessibility. That was also very data-driven because the way you're able to be inclusive and accessible is really by having an understanding of your audience, both specifically and in general. If you know the numbers around the types of people who may have these different types of challenges, it's a no-brainer that you need to be designing and delivering in a more accessible and inclusive way. So data was very important there. We also looked at buying, building, and borrowing around content creation, and, again, data is very important in making those sorts of decisions.
Celisa Steele: [00:07:54] And then two panels rounded out the day. One panel had folks sharing some of their successes but also, very importantly, some of their failures and what they had learned from those at their organizations, in their learning businesses. The final panel pulled the curtain back on the day: folks involved in planning the event unpacked a little bit the decisions that had been made—a bit of "If we had had more time, or if we had it to do again, this might be the direction we would go or the angle we would take." If you have a room of professional development folks, that kind of analysis of what actually happened can be very meaningful because it's the kind of thing you can take back and apply to your own offerings: "Oh, that point about the scheduling (or whatever it is)—we can apply that to what we're offering in our learning business."
Jeff Cobb: [00:08:54] Again, a type of data gathering. It was very meta throughout. We're about to turn to talking a little bit more about our session, which specifically focused on data. But consider the whole arc of this event: data said this was worth trying; data ran through the sessions and all the topics people were talking about; and then the organizers stepped back, looked at the event itself, and asked, "What data are we gathering from this that might shape what we do going forward?" I think that encapsulates a lot of what happens with any sort of educational programming, any sort of professional development programming. You've got that pre-data, you've got the data that's going to inform the topics, and then you've got the collection of data to plan for going forward.
Partner with Tagoras
Celisa Steele: [00:09:42] At Tagoras, we partner with professional and trade associations, continuing education units, training firms, and other learning businesses to help them to understand market realities and potential, to connect better with existing customers and find new ones, and to make smart investment decisions around product development and portfolio management. Drawing on our expertise in lifelong learning, market assessment, and strategy formulation, we can help you achieve greater reach, revenue, and impact. Learn more at tagoras.com/more.
What Does Data-Driven Decision-Making Look Like?
Celisa Steele: [00:10:22] We can share a little bit more about what we talked about in our session on data, along with some further thoughts that, to your point, came out of being there that day—what we heard in the room and what folks happened to share. Again, back to what we said at the outset of this episode, we really do believe in the importance of data, the value of data, in helping learning businesses decide what to do. So we're bringing that interest, and we're bringing a lot of experience in helping learning businesses look at their own data and determine what it might mean: what they should add to their portfolio, what they might need to take out of their portfolio, how the portfolio is performing. We come to this topic of using data for decision-making with both experience and interest.
Jeff Cobb: [00:11:12] Right. We did start off by really examining what that means—to be data-driven when you’re making decisions—because I think that gets thrown around a lot. Clients or prospective clients will come to us all the time and say, “We want a data-driven plan for moving forward.” What does that really mean? We broke that down a little bit to start with.
Celisa Steele: [00:11:33] Yes, and we started by first asking what the people in the room thought of when they think about data-driven decision-making and heard from people. I will say what we heard in the room was very much in line with what we were already thinking of in terms of data-driven decision-making, and, for us, it encompasses a lot of aspects. We want to look at both qualitative and quantitative data, for example. We want to look at both hard and soft data. We want to make sure that we have accurate snapshots of a particular point in time. Data very often is that picture of a moment. But then it’s also very valuable to have those snapshots over time, so you can begin to get a sense of progress and see how any changes you’re making are being reflected in the data because you can compare it back to a point in time before that change was in place, for example.
Jeff Cobb: [00:12:29] Yes, and all of this implies being conscious of and intentional about getting data, making sure you’re in a position to get data, sharing data, using data effectively in a meaningful way to make decisions going forward, and getting everybody bought into that process. And that was an important point of this also. We asked about “What is data-driven decision-making?” in the beginning. We also asked people about the data culture at their organizations, what it’s like for them. Are they using data? We did a one-to-five rating scale. One being you basically ignore the data—you just do whatever you’re going to do because that’s what you do—up to five being it factors into everything. I think we saw a pretty good bell curve on that in terms of whether people were using data or not. But, using it or not, there were common challenges out there that people had around it.
Celisa Steele: [00:13:28] Yes, and we should touch on some of the challenges. But I will say two things. One is that we talk about using data to help drive decision-making. We don’t want to be data-determined…
Jeff Cobb: [00:13:44] Right.
Celisa Steele: [00:13:44] …where you purely take what the data tells you and then run with it. There's a lot of emotion that goes into human decision-making, and that means both on the learning business, internal side, in terms of setting strategy and determining tactics; it also means in terms of the learners and how they're going to interact or respond. We've probably all had experiences where you put out something that's exactly what the data said was wanted, and it doesn't perform the way that you thought it would. This is where some of the emotion comes in. And that's why we like to talk about the qualitative as well as the quantitative, the soft data as well as the hard data. The idea is to pull all those types of data together and to also allow your gut to have some influence, so that you're not just using what the data says to absolutely determine the direction. You're instead treating the data as one input among others.
Jeff Cobb: [00:14:43] Yes, and you made the point earlier too about data over time. You get those snapshots in time, but you need to look at it over time as well because we find all the time that an organization will run the big needs assessment survey or the big member survey or whatever it is, and they get that big bump in data that maybe they’re doing once every five years, and suddenly that becomes the driver for everything. And it’s just that one point in time. You have to make looking at data, talking about data, having data influence decisions, something that’s happening consistently over time. And you have some points of reference that you’re able to see how they change over time and what the trends might be.
Challenges to Having a Strong Data Culture
Celisa Steele: [00:15:28] Some of the challenges that we heard around having a good data culture—around being more on the five end of that one-to-five scale of "How data-driven are you in your decision-making at your organization?"—have to do with silos, which can mean that the data you really need or want isn't something you readily have access to. Perhaps someone else in the organization—some other department or area—has access to it.
Jeff Cobb: [00:16:00] Yes. We talked about a simple one that came up, which is Google Analytics. You can get so much very useful information from Google Analytics. We gave an example of working with a CPA society in December. If you're in the public accounting field, you know that's when people are going for the CPE [continuing professional education] credit accountants have to earn to maintain their licensure. It always spikes at the end of the year because people are coming up against their 40 hours.
Jeff Cobb: [00:16:29] We looked at the Google Analytics, and we could see that, yes, it spiked. But we also looked at the devices being used, which Google Analytics will tell you quite readily, and mobile was above 50 percent of the traffic hitting their CPE pages—people trying to find CPE to participate in. And this organization had basically no mobile accessibility for the online education it was offering. So just that one data point gave you a place where you could have a huge win going forward. But so often, if you're sitting in the professional development or education department of an organization, and you're separated from the marketing folks, you don't see those numbers. You don't necessarily know that's happening. The marketing folks don't necessarily know to look for it. So you have to have those conversations and break down those silos.
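To make that kind of device-mix check concrete, here's a small, purely hypothetical Python sketch. The page paths and session counts are invented for illustration; in practice, the numbers would come from a Google Analytics export.

```python
# Hypothetical sketch: what share of traffic to CPE pages is mobile?
# Page paths and session counts are invented; real numbers would come
# from a Google Analytics export.

sessions = [
    {"page": "/cpe/catalog",  "device": "mobile",  "count": 1840},
    {"page": "/cpe/catalog",  "device": "desktop", "count": 1215},
    {"page": "/cpe/webinars", "device": "mobile",  "count": 960},
    {"page": "/cpe/webinars", "device": "desktop", "count": 890},
    {"page": "/cpe/webinars", "device": "tablet",  "count": 120},
]

total = sum(row["count"] for row in sessions)
mobile = sum(row["count"] for row in sessions if row["device"] == "mobile")
mobile_share = mobile / total

print(f"Mobile share of CPE traffic: {mobile_share:.1%}")
if mobile_share > 0.5:
    print("Over half of CPE traffic is mobile—check the mobile experience.")
```

With these invented numbers, mobile comes out just over 55 percent—exactly the kind of single data point that, as Jeff notes, can surface a huge win if the education and marketing teams are actually looking at it together.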
Celisa Steele: [00:17:16] I think terminology was another challenge that we heard—different departments or groups within the organization using different terms or collecting essentially the same data but in different ways. For example, you can imagine a free text field versus someone providing ranges. All of that can then greatly complicate your ability to use that data or to compare that data.
Jeff Cobb: [00:17:42] We asked about things like taxonomies, which are going to give you a common language and an agreement around this word applies to this—we’re tracking these things. Which some organizations had. Not a high percentage of organizations in the room had implemented taxonomies. But then we’ll often find that, even when an organization has implemented a taxonomy, the professional development department might have implemented this taxonomy, and marketing or membership or certification might have a completely different taxonomy. And so, again, you have these language problems. It’s a communication issue. It’s an accessibility issue around the data. It’s just an overall lack of standards and standardization as to how and what you’re going to collect and what form that’s going to take.
Celisa Steele: [00:18:27] And one other challenge that we heard was that, even when you have the data, and you pull it all together, and you provide it, different people looking at the exact same data can come to different conclusions about what it means. And, for us, that gets back to this idea of data-driven, but not data-determined, decision-making because it's all well and good to think that data is going to point very clearly to the one correct, true path, but, in reality, it often doesn't. It usually steers you in the right direction.
Jeff Cobb: [00:18:59] It’s directional, yes.
Celisa Steele: [00:18:59] You still then have to use other sources of strategic thinking, critical thinking to determine, “Okay, we’re headed this direction. Now what specifically do we do?” Or “How do we specifically capitalize on that opportunity that the data is pointing us towards?”
Jeff Cobb: [00:19:17] Right. And then, once you're out there and doing it, you have to be willing to course-correct. We referenced at the beginning that the PD Council at ASAE thought this was something to do—to get people in the room for this particular offering. But, even once you're in the room, you've got to read the room and figure out how you might adjust. And, of course, you're collecting data from the people who are there, who are going to tell you how to do some things the same, better, or differently going forward. You always have to be prepared for that with data. You've got to get out there with some reasonable assumptions about what the offering and the engagement with the market should be, based on what the data tells you, but then you keep collecting data, and you adjust as you move forward.
Looking at Data from a Performance Perspective and from a Potential Perspective
Celisa Steele: [00:20:09] There are many ways to look at or think about data. One approach that we're fans of is thinking about data from two perspectives. The first is performance: what can the data tell you about how you're currently performing—your products and services and what you're able to achieve with those now? The other is potential: how could you perform, or how might you perform? That view is much more about adjustments you might make to your overall strategy and to your specific products and offerings—what you might add, change, or remove.
Jeff Cobb: [00:20:53] Discussing those two big areas, two big buckets of data, was really the meat of our presentation. We won't try to go into all of it in a podcast because a lot of it was very conversational, with people contributing what they're doing. But, within that performance bucket, we like to break things down further. And, again, a lot of what we were doing in the presentation was just trying to give people some language—an approach for talking about and categorizing data—so that, when you do get in a room with other people, you have some ways to parse it out and have a meaningful conversation about it. Within that performance bucket—probably no surprise to anybody who's been following or listening to us—we tend to break it down into reach, revenue, and impact. What are the data points that are going to tell you whether you're reaching the right learners, the ones you want to be serving? What are the data points that are going to tell you whether you're making the revenue you need to make? And what are the data points that are going to tell you whether it's actually doing anything—whether you're moving the dial with your educational offerings?
The Performance Perspective: Reach, Revenue, and Impact
Celisa Steele: [00:21:53] In the reach area of performance, you're going to be looking at things like Web traffic. You mentioned Google Analytics earlier. You want to see how many people are coming to your sites and which pages they're visiting—that level of detail, that kind of data. You also want to be looking at e-mail. In our experience, most learning businesses rely very heavily on e-mail to sell their products and services, so you want to be looking at things like e-mail open rates and click rates. You also want to be gauging awareness, and this is something we often find is overlooked because sometimes people want to be very mechanical about it: "We're going to improve our e-mail content," or "We're going to make changes to the Web site." But sometimes people aren't even opening those e-mails, or they aren't even visiting the Web site. There can be an awareness problem, so you want to make sure that you're able to gain a prospect's—or even a current customer's or member's—attention and make them aware of what you're offering.
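As a quick, hypothetical illustration of those reach metrics, here's how the basic e-mail numbers roll up. All figures below are invented.

```python
# Hypothetical e-mail campaign numbers, invented for illustration.
delivered = 12_000       # e-mails delivered
opens = 2_760            # unique opens
clicks = 414             # unique clicks
registrations = 58       # registrations attributed to the campaign

open_rate = opens / delivered          # share of delivered e-mails opened
click_rate = clicks / delivered        # share of delivered e-mails clicked
click_to_reg = registrations / clicks  # share of clickers who register

print(f"Open rate: {open_rate:.1%}")
print(f"Click rate: {click_rate:.1%}")
print(f"Click-to-registration: {click_to_reg:.1%}")
```

A low open rate paired with a healthy click-to-registration rate hints at the awareness problem Celisa describes: the content may be fine, but most people aren't seeing it in the first place.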
Jeff Cobb: [00:22:55] Then there's whether they're finally converting. In each of the areas we talked about—your traffic, your e-mail, your awareness—are you converting into enrollments and registrations? That's going to be the end game with reach—or we'll call it the end game, because you're going to go beyond there to impact, and we'll get to impact. But that's a quick synopsis of the reach area. We then talked about revenue, [which] tends to be a clearer thing to folks. You're usually tracking revenue at a gross level and at a net level too. We do find that organizations are often tracking the overall revenue of the learning business, getting that gross revenue, and they're likely tracking the gross revenue of different products or product lines.
Jeff Cobb: [00:23:38] But, when it gets down to net revenue, things usually get a little foggier because, in a lot of cases, organizations aren't factoring in staff time and softer costs at either a product-category or a product-by-product level. So you don't really know your profitability or your net revenue, and you want to get to that because you really need to know what's performing best within your portfolio. Every product doesn't necessarily have to be profitable, but you want to know which ones are and which ones aren't so you know how to manage that mix. And then one thing we mentioned that we hardly ever see attention given to in the revenue area is cash flow: how is the cash flowing in?
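Here's a small, hypothetical sketch of that gross-to-net move—loading staff time in at an assumed fully burdened hourly rate. All products, rates, and figures are invented, not from the episode.

```python
# Hypothetical sketch: gross revenue minus direct costs minus staff time
# (at an assumed fully burdened hourly rate) = net revenue per product.
# All products, rates, and figures are invented.

BURDENED_HOURLY_RATE = 85  # assumed fully loaded cost of staff time, $/hour

products = {
    "Annual conference": {"gross": 250_000, "direct_costs": 140_000, "staff_hours": 900},
    "Webinar series":    {"gross": 40_000,  "direct_costs": 8_000,   "staff_hours": 220},
    "On-demand courses": {"gross": 60_000,  "direct_costs": 15_000,  "staff_hours": 400},
}

net_by_product = {}
for name, p in products.items():
    staff_cost = p["staff_hours"] * BURDENED_HOURLY_RATE
    net_by_product[name] = p["gross"] - p["direct_costs"] - staff_cost
    print(f"{name}: net ${net_by_product[name]:,} "
          f"({net_by_product[name] / p['gross']:.0%} of gross)")
```

With these invented figures, the big-gross conference nets a smaller share of its gross than the modest webinar series—the kind of mix insight that net revenue makes visible and gross revenue hides.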
Jeff Cobb: [00:24:20] It often happens that organizations will put a lot of money down upfront—for example, on a big event, putting down guarantees and securing whatever contracts they need to—when the actual revenue from it is not going to flow for quite a while. So there's a gap in that cash flow, which can end up leaving you cash poor. Even though you know that money is coming eventually, you don't have cash on hand to invest in other products, in marketing, and in the variety of things you might do to grow your learning business. So pay attention to how cash is flowing. Are there ways you can have some products with more immediate cash flow? In general, make sure that you have access to the funds you need to do what you need to do, so you don't have to wait until all the registrations are in or until the next budget is approved.
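That cash-flow gap can be sketched with a simple running total. The monthly figures below are invented: deposits and guarantees go out early in the year, and registration revenue lands later.

```python
# Hypothetical sketch: running cash position for an event whose deposits
# and guarantees go out months before registration revenue arrives.
# Monthly net cash figures (Jan-Jun) are invented.

from itertools import accumulate

monthly_net = [-30_000, -10_000, -5_000, 15_000, 45_000, 60_000]

position = list(accumulate(monthly_net))  # cumulative cash position
trough = min(position)                    # worst point in the cycle

print("Cash position by month:", position)
print(f"Worst cash position: ${trough:,}")
```

In this made-up example, the event ends up comfortably cash positive, but the organization is $45,000 in the hole in March—the cash-poor stretch Jeff describes, even though the money is coming eventually.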
Celisa Steele: [00:25:08] And then, in this performance view of data, the last area is impact. That tends to involve looking at things around attendance or completion rates or specific learner performance, meaning perhaps how they do on tests or quizzes. It can also involve looking at engagement feedback. It can look at retention. Are those learners that take one course from you coming back for second, third, fourth courses, for example? Like revenue, I think a lot of the things that tend to get tracked under impact are fairly typical, but there’s still often a lot of room for improvement to make sure that you’re collecting data that is actionable and going to help you make better products or services. We’ve had Dr. Will Thalheimer on the podcast before, talking about how to take your smile sheets to the next level so that you’re getting useful information out of those reactions from your learners and that you can use that data to then improve your products.
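As a hypothetical illustration of two of the impact metrics mentioned here—completion rate and repeat-learner (retention) rate—consider a handful of invented enrollment records.

```python
# Hypothetical sketch: completion rate and repeat-learner rate from a
# handful of invented enrollment records.

from collections import Counter

enrollments = [
    {"learner": "a", "course": "101", "completed": True},
    {"learner": "a", "course": "201", "completed": True},
    {"learner": "b", "course": "101", "completed": False},
    {"learner": "c", "course": "101", "completed": True},
    {"learner": "c", "course": "102", "completed": False},
    {"learner": "d", "course": "201", "completed": True},
]

# Share of enrollments completed (4 of 6 here).
completion_rate = sum(e["completed"] for e in enrollments) / len(enrollments)

# Share of learners who came back for a second course (2 of 4 here).
courses_per_learner = Counter(e["learner"] for e in enrollments)
repeat_rate = sum(1 for n in courses_per_learner.values() if n > 1) / len(courses_per_learner)

print(f"Completion rate: {completion_rate:.0%}")
print(f"Repeat-learner rate: {repeat_rate:.0%}")
```

Tracked over time, these two numbers start to show whether changes to your offerings actually move retention and completion, rather than relying on a single snapshot.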
Jeff Cobb: [00:26:14] This is one of those great areas for a form of tracking over time. Don't just get the reactions right at the end of a course or a session. Get those, but then also follow up—whether that's a week later, three months later, or six months later—to find out whether what was taught in that learning experience is actually being applied on the job or wherever it was intended to be applied. But you have to collect that data to know.
Celisa Steele: [00:26:40] We do know that a lot of learning businesses rely on learners to self-report how effective something is, how much they’re applying it. That often is somewhat necessitated by reality. But there’s also the opportunity to think beyond just asking learners for their reactions and their views, to think about how else could you triangulate or get different perspectives on the value of that learning? Perhaps you can have access to their managers or their bosses to get some input on the performance. You might be able to use something like the Brinkerhoff Success Case Method to go out and gather some stories of people who are having important, impactful results come from the education offerings that you’re putting out there.
Jeff Cobb: [00:27:25] Right.
Celisa Steele: [00:27:26] We have a past episode with Rob Brinkerhoff, and so we’ll make sure to link to that in the show notes for this episode.
The Potential Perspective: Three Tools
Jeff Cobb: [00:27:40] That’s a quick rundown of the performance area and reach, revenue, and impact within that. Then, as you noted, Celisa, we also covered this potential area. Performance is about what’s happening now or what has happened in the past. How are we doing? Potential is about what could happen in the future. What could we be doing? How might we improve? We’ll say a little less about this because we focused on three of our tools to help people with this, and we’ve talked about those tools a lot in different episodes on the podcast, so we’ll link to those. But those were the Value Ramp, the Product Value Profile, and the Market Insight Matrix.
Celisa Steele: [00:28:22] We recommended those three tools, but it's the broader activities they fall under that are going to help you realize your potential. You're going to want to do some kind of portfolio analysis. That's where we recommend the Value Ramp, [which] can be useful for looking at the overall story and sense that your portfolio is telling the world. You're also going to want to do some product analysis—not just at that overall portfolio level but at the individual product level, or at least the product line level: how are things performing? That's where the Product Value Profile is potentially useful. And then you need to be doing market assessment generally, to make sure that you're taking advantage of opportunities and that you're aware of any risks or threats to what you're currently offering. That's where the Market Insight Matrix can help.
Jeff Cobb: [00:29:14] This is, again, an over-time thing. The Insight Matrix gives you a framework for different types of data points to track for different reasons over time and to be able to provide some visibility and some accountability around those metrics so that you can have them as an ongoing conversation point. We emphasized there, and we’ll emphasize here, that all of these are great thinking and conversation tools. They’re ways to help you surface the opportunities, the challenges, the issues and do it in a way that makes it as visual as possible and also makes it as referenceable as possible by multiple people. You can put the Value Ramp up. All you have to do is draw it on a whiteboard, and you can start a conversation around it. It’s data-driven, and you can use the data you get from that conversation to then determine what do we need to go out and do more in terms of gathering data? Same thing with the Product Value Profile. Same thing with the Market Insight Matrix.
A Decision-Making Framework: Objectives, Alternatives, and Risks
Celisa Steele: [00:30:10] We said at the outset that we’re fans of data and that we’re fans of using data to help make decisions. We should unpack that a little bit because, again, it’s very easy to say, “Yes, let’s use data to make decisions.” Well, what does that actually look like? What does decision-making look like? Where are some of the opportunities to apply data? There’s a framework that we talked about.
Jeff Cobb: [00:30:33] Yes. Again, very simple—can help with a group trying to make a decision. Any good decision process is really going to have three elements to it. You’ve got your objective or objectives that you’re trying to achieve in this case, based on the data that you’re working from. You’ve got your alternatives, which are the different ways that you can achieve the objective or the objectives that you’re pursuing. And then you’ve got the risk that may factor into any of those alternatives and into whatever final decision you go with.
Celisa Steele: [00:31:03] Risk breaks down a little further into both severity and likelihood. Not all risks are equal. If, for example, you're considering a very expensive product development process that's going to take years or require hundreds of thousands of dollars, the consequences if something goes wrong are serious. If you're thinking of something much closer to a minimum viable product, like the ASAE event held at their offices, there's not a whole lot of overhead. The risk is very different if one of those fails versus the other. That's the severity. But then there's also the likelihood: how likely is it that either of those might fail?
Jeff Cobb: [00:31:46] Yes, and the reason a minimum viable product is so attractive in the first place is that the likelihood of failure is usually relatively high, because you're usually venturing into something new, but you're reducing the severity significantly. You feel okay putting it out there because, if it does fail, you're not bankrupting yourself in the process.
Celisa Steele: [00:32:08] Around the objectives: in so much of the work that we do and that learning businesses do, it's easy to gloss over the objectives and think, "Yes, we know what the objective is." But you really need to dig down and make sure that everybody is clear on what that objective is. I can give an example from outside a learning business. Jeff, you might be quicker than I am to think of one from a learning business. Think about somebody saying, "I want a car to get to work." If you really unpack that, you realize the objective is actually "I need to get to work." The car is just one alternative, to pick up that next piece of the framework. Maybe there's a public transit option that would also get you to work. Or maybe Uber would turn out to be less expensive than buying a new car, with all the maintenance and everything that goes with it.
Jeff Cobb: [00:33:07] I’ll just stay with that non-learning business example because it’s easy for folks to digest. Once you’ve got that objective, then, as you’re indicating, that introduces your alternatives. Everything flows from the objectives. You have to be clear and crisp about what those objectives are so that you’re then looking at the right alternatives and weighing the pros and cons of those alternatives. This is when the whole pros-and-cons thing tends to be valuable.
Jeff Cobb: [00:33:36] If you've got a clear objective, and you've got really good alternatives that you've weighed—we see this happen all the time in learning management system selection, for example—your field of alternatives tends to shrink pretty quickly, based on how well they align with truly supporting your objectives. Then you get down to a point where you've got two good alternatives, and you're going to look at the risk of those alternatives. Ultimately, if you've got two good alternatives and a rough balance in the risks, so that one isn't clearly more likely or more serious than the other, then it comes down to price. Price is its own form of risk, you could say, but you have this very clarifying process to go through, and it can be easily communicated: everybody can say, "Here's where we are in the process. Here's what we're doing at this point."
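The episode doesn't prescribe a formula, but the objectives-alternatives-risks framework can be sketched in a few lines of Python. Everything below—the alternative names, the fit scores, the dollar figures, and the `expected_risk` helper—is a hypothetical illustration, not something from the episode. It treats risk as severity (the cost of failure) times likelihood (the probability of failure):

```python
# Hypothetical sketch of weighing alternatives against an objective.
# All names and numbers are illustrative.

def expected_risk(severity, likelihood):
    """Expected cost of failure: severity (in dollars) times probability (0-1)."""
    return severity * likelihood

alternatives = {
    # fit: how well the alternative supports the objective (0-1)
    # severity: cost if it fails ($); likelihood: chance it fails (0-1)
    "full product launch":      {"fit": 0.9, "severity": 200_000, "likelihood": 0.5},
    "minimum viable product":   {"fit": 0.7, "severity": 10_000,  "likelihood": 0.7},
}

for name, a in alternatives.items():
    risk = expected_risk(a["severity"], a["likelihood"])
    print(f"{name}: fit={a['fit']}, expected risk=${risk:,.0f}")
```

On these made-up numbers, the full launch carries a much higher expected risk ($100,000 versus $7,000), which mirrors the point above about why minimum viable products are attractive: the likelihood of failure may be high, but the severity is low.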
Clear Objectives Help Identify KPIs
Celisa Steele: [00:34:26] Those objectives, again, are just so important. That's essentially the goal. When everyone is clear on and in agreement about those objectives, that goal, it's going to tell you a lot about what you need to do. It's also going to tell you a lot about what your key performance indicators might be. There are a ton of data points any of us could be tracking, and there's way more data than we can make effective use of. Having clear goals and objectives helps you determine which key data points matter, so that you have good data to help you make decisions and fine-tune your approach, but you're not drowning in data and you're not missing the signal for all the noise the data might produce.
Jeff Cobb: [00:35:27] There are many ways to look at data. Two views that can be clarifying for learning businesses are the performance view and the potential view. You can use data to understand how you’re doing currently. How is your learning business performing? You can and should also use data to understand possibilities. How could you be doing? What products and services might you add or sunset or change?
To make sure you don’t miss new episodes, we encourage you to subscribe via RSS, Apple Podcasts, Spotify, Stitcher Radio, iHeartRadio, PodBean, or any podcatcher service you may use (e.g., Overcast). Subscribing also gives us some data on the impact of the podcast.
We’d also be grateful if you would take a minute to rate us on Apple Podcasts or wherever you listen. We personally appreciate reviews and ratings, and they help us show up when people search for content on leading a learning business.
Finally, consider following us and sharing the good word about Leading Learning. You can find us on X (formerly Twitter), Facebook, and LinkedIn.