May 15, 2023

The Benefits of Predictive Learning Analytics


Most L&D professionals measure training after it's over, but what if you could predict the success of a learning program in advance? Author Ken Phillips, CEO of Phillips Associates, shares how predictive learning analytics can change the learner's experience and the outcomes for an organization.


SHOW NOTES:

Ken Phillips shared many actionable items in his interview, including the following:

  • Predictive learning analytics can tell you which learners are most likely to apply what they've learned in a training program and which ones aren't. It also helps predict which managers are most likely to support the training.
  • Predictive learning analytics can significantly reduce the amount of scrap learning, or training that's delivered but not applied back on the job.
  • Traditional measurement and evaluation is typically backward-looking and focused on training programs. Predictive learning analytics is forward-looking and examines what's likely to happen after the training concludes.

Learn more about Ken Phillips' book, Evaluate Learning With Predictive Learning Analytics, on TD.org.

Read about Ken's talk at the #ATD23 Conference in San Diego. 

Powered by Learning earned an Award of Distinction in the Podcast/Audio category from The Communicator Awards and a Silver Davey Award for Educational Podcast. The podcast was also named to Feedspot's Top 40 L&D Podcasts and Training Industry's Ultimate L&D Podcast Guide.

Learn more about d'Vinci at www.dvinci.com.


TRANSCRIPT:


Susan: Most L&D professionals measure and evaluate training after it's over, but there's another methodology that focuses on predictive learning analytics that's changing how people think about learning.

Ken: Predictive learning analytics is different in that it's forward-looking because we're trying to forecast right at the end of the training program what's likely to happen down the road when people get back on the job, so it's forward-looking, not backward-looking.

Susan: Our guest today is Ken Phillips, author of Evaluate Learning with Predictive Learning Analytics. Ken will share how you can shift your thinking to create less scrap learning and more learning that's moving the needle. Next, on Powered by Learning.

Speaker 1: Powered by Learning is brought to you by d'Vinci Interactive. d'Vinci's approach to learning is grounded in 30 years of innovation and expertise. We use proven strategies and leading technology to develop solutions that empower learners  [00:01:00] to improve quality and boost performance. Learn more at dvinci.com.

Susan: Joining us today are Beth Buchanan, d'Vinci senior instructional designer and project manager, and our guest, Ken Phillips, CEO of Phillips Associates. Phillips Associates provides consulting services and workshops focused on predictive learning analytics and the measurement and evaluation of learning. Ken, thanks so much for joining us today.

Beth: Yes, welcome, Ken.

Ken: Well, thank you both for inviting me here. I'm looking forward to our conversation.

Susan: Oh, so are we, Ken. Start off by telling us a little bit about your background and what your company does.

Ken: You can tell probably from the gray hair, but I've been in the L&D field for a long time, almost 40 years now.

Susan: Did the L&D field make your hair gray or--

Ken: Oh, that's an interesting question. I haven't thought about that, [00:02:00] but that's maybe for another podcast, so we can talk about that. For probably close to 25 years of that time, I focused primarily on performance management, and I did consulting and training in that area. I had submitted proposals to speak at the ATD International Conference before and never been accepted, because they get tons of submissions. In 2008, I said, well, okay, I'm going to do something around measurement and evaluation, because I'd been dabbling in that area, and I got accepted.

And so I did my first ATD International presentation in 2008, and that kind of launched my career into measurement and evaluation, because I found I really had a passion for it. [00:03:00] So I transitioned my whole consulting and training practice into measurement and evaluation after that. That's how I got started.

Beth: Wonderful. You're recognized for predictive analytics. I wonder if you can, in broad terms, sort of tell us what that is. What is predictive analytics and what is its purpose?

Ken: Oh yes. It's a methodology that I've been working on for probably the last six or seven years. I got started when I ran across a book called Predictive Analytics: The Power to Predict Who Will Click, Buy, Lie, or Die. I got the book, and it sat on my bookshelf for months before I picked it up and started reading it, because I thought it might be a lot of statistics and numbers and things like that.

To my surprise, [00:04:00] pleasant surprise, it was a series of case studies about how organizations were using predictive analytics in all different areas of the organization: in sales, in HR, in manufacturing, lots of different case studies. What got me particularly interested in predictive analytics was a case study about Hewlett-Packard. It was their HR department, and what they had noticed was that voluntary turnover was tracking upward and had been doing so for several years. They wanted to take a look at that and see if they could predict, before it happened, which employees were most likely to voluntarily leave the company.

They had tons of data on all their employees, and they were able to create an algorithm, consisting of a number of variables, that actually did predict which of their employees were most likely to voluntarily leave the company. [00:05:00] Then they put in place a training program for all the managers and supervisors to train them on the algorithm and the metric, and also on how to have conversations with employees who fit the model and were likely to voluntarily leave. They were able to reduce their recruiting costs by over $2 million in less than a year.
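The episode doesn't say how HP's model actually worked, and the features below are invented, but for readers who want to picture the general technique, here is a minimal sketch of that kind of turnover model, using logistic regression on synthetic data:

```python
# Hypothetical sketch of the kind of turnover model the HP story describes.
# HP's real variables and algorithm aren't disclosed in the episode; logistic
# regression over made-up features and synthetic labels is used purely to
# illustrate the technique.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Made-up features: [tenure_years, pay_percentile, months_since_promotion]
X = rng.uniform([0, 0, 0], [20, 100, 60], size=(500, 3))

# Synthetic labels: exits are more likely with lower pay and a staler promotion.
p_leave = 1 / (1 + np.exp(-(2.0 - 0.04 * X[:, 1] + 0.03 * X[:, 2])))
y = rng.random(500) < p_leave

model = LogisticRegression().fit(X, y)

# "Flight risk" for one current employee: 3 years of tenure, 30th pay
# percentile, 24 months since the last promotion.
risk = model.predict_proba([[3, 30, 24]])[0, 1]
print(f"Predicted probability of voluntary exit: {risk:.0%}")
```

Scoring every current employee this way is what lets an HR team, as in the HP story, hand managers a ranked list of people to have retention conversations with before anyone resigns.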

And so I thought, "Wow, if they can do that with voluntary turnover, couldn't we do something like that with training?" and focus on scrap learning, which is the gap between training that's delivered and training that's applied back on the job. It wastes time and wastes money. And I thought, "Wouldn't it be cool if we could predict which learners were most likely to apply what they've learned in a training program and which weren't, and which managers were most likely to support the training and which weren't, and be able to have that predictive data [00:06:00] so we could use it to forecast in advance what was likely to happen with the training program?" So that's kind of the backstory of where I got started with all this stuff.

Beth: That's great. That's an incredible achievement. I want to unpack some of those things because there's a lot there in what you just said. Let's start with this idea of scrap learning. As you said, this is the amount of learning that is in the training that does not get applied to the job, right?

Ken: Correct.

Beth: How big of a problem is that?

Ken: There's been lots of research in this area. The two benchmark studies that I often reference in my presentations represent the high end and the low end. The low end was some research done by a company called KnowledgeAdvisors. They were a software company here in Chicago, where I live, and they created a software program called Metrics That Matter, which large corporations use to automate the whole [00:07:00] measurement and evaluation of learning process. They were very successful and had lots of clients a number of years ago. They went into the back end of that system to see what they could find around scrap learning: what percent of the training being delivered by all their clients was actually being applied back on the job. What they came up with is that, on average, about 45% of all training that gets delivered ends up as scrap, so just slightly less than half.

The other extreme would be some research done by a guy by the name of Rob Brinkerhoff, who was a professor at Western Michigan University in their Human Resource Development graduate program. He did a study in 2004 and a second one in 2008: two completely different training programs, two different organizations, [00:08:00] two different groups of learners. But he found essentially the same result, and that is that around 85% of the training that gets delivered ends up as scrap.

Now, what he found, which was interesting, was that there was a significant chunk of people, actually about 65% of the learners, who would leave the training with their heart in the right spot. They would say, "Yes, I'm going to go back and apply this." And then when they got back on the job, stuff happened and things got in the way, and they ended up reverting to their old ways within 30 days or less. So when you added that number to the roughly 20% of learners who go to the training program and then go back and do absolutely nothing, it came out to about 85% of the training ending up as scrap. Those are the benchmark ends. I've done some other research [00:09:00] with clients where I've looked at scrap learning, and every time, it's fallen within that 45% to 85% range.

Beth: That's a huge problem. And it sounds like there's a methodology to sort of chip away at it in phases. As you said, first, you have to analyze what's happening, why is this the case, how are these numbers so big that this well-intentioned training given to well-intentioned learners doesn't transfer? I know, in predictive analytics, there are three phases to it. Could you walk us through phase one, the analytical part of it, just a little bit so we can sort of hear how the process works?

Ken: Sure. Phase one is all about identifying the causes of scrap learning. There's a quote that I use in my sessions and training that came from Jim Barksdale, a former [00:10:00] CEO of Netscape, and his quote is, "If you have data, let's look at the data. If all you have are opinions, let's just go with mine."

I like that quote because what happens in L&D, particularly around scrap learning, is this: the business executives are aware that when they send people to training, not everybody's going to go back and apply it. The L&D person who's designing and developing the training program knows that, regardless of how well the program is designed and delivered, there's going to be a percentage of people who aren't going to go back and apply it. But it's like the 3,000-pound elephant in the room; nobody talks about it because nobody has a solution for it. That's what predictive learning analytics addresses, because it provides a methodology for pinpointing the underlying causes of scrap learning. That's all in phase one, where we look at and identify [00:11:00] which learners are most likely to apply what they've learned back on the job and which aren't, and which managers of the learners are most likely to provide active support for the training and which aren't. We also calculate the amount of scrap learning associated with the program, and we identify what obstacles the learners experienced after the training, back on the job, that either inhibited or prevented them from applying what they'd learned. All that data gets collected in the first phase so that we can pinpoint exactly where we want to take targeted corrective actions in the second phase of the methodology.
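Ken's actual phase-one diagnostic questions and scoring aren't detailed in the episode, but the mechanics he describes, flagging at-risk learners and unsupportive managers and estimating a scrap percentage, might be sketched like this (all names, survey items, and the cutoff are hypothetical):

```python
# Hypothetical phase-one sketch: flag at-risk learners and weakly supportive
# managers from post-course survey responses, then estimate scrap learning.
# The survey items, 1-5 scale, and cutoff are illustrative assumptions only.
from dataclasses import dataclass

@dataclass
class Response:
    learner: str
    manager: str
    intent_to_apply: int      # 1-5 agreement: "I plan to use this on the job"
    manager_support: int      # 1-5 agreement: "My manager will actively support me"
    perceived_relevance: int  # 1-5 agreement: "The content fits my day-to-day work"

responses = [
    Response("Avery", "Jordan", 5, 4, 5),
    Response("Blake", "Jordan", 2, 2, 3),
    Response("Casey", "Morgan", 4, 1, 4),
]

AT_RISK_CUTOFF = 3  # illustrative threshold on the 1-5 scale

# Learners scoring low on intent or relevance are flagged as unlikely to apply.
at_risk = [r.learner for r in responses
           if min(r.intent_to_apply, r.perceived_relevance) <= AT_RISK_CUTOFF]

# Managers whose learners report weak support get targeted coaching.
weak_managers = sorted({r.manager for r in responses
                        if r.manager_support <= AT_RISK_CUTOFF})

scrap_estimate = len(at_risk) / len(responses)

print(f"At-risk learners:          {at_risk}")            # ['Blake']
print(f"Managers needing coaching: {weak_managers}")      # ['Jordan', 'Morgan']
print(f"Estimated scrap learning:  {scrap_estimate:.0%}")  # 33%
```

The point of collecting data at this level, as Ken goes on to explain, is that corrective actions in phase two can then be aimed at named individuals rather than broadcast to everyone.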

Beth: Yes, so let's talk about that a little bit. Let's get into the solutions and solution implementation a bit. It sounds like, from what you said, even in discussing the analysis phase, that follow-up is a really important factor to ensure that you are targeting these at-risk learners [00:12:00] and having the right kind of managerial support. What are some ways to do that?

Ken: It's somewhat contextual. It depends on the training, the learners, and the work environment they came from. But there certainly are a number of things that can be done to keep the scrap learning percentage down. Simple email reminders that each focus on one little aspect of the content can go out to the learners. Another thing might be a job aid at the end of the training: "Here's a job aid that summarizes everything that was covered in the training." Then there's the whole microlearning movement these days, which you'll be familiar with. That's another tool that can be used to send information to the learners on a much smaller scale than the entire training program, which can serve as reinforcement. There are lots of different ways that reinforcement can be implemented, but it does depend [00:13:00] on the learners, the training, and the work environment that the learners come from.

Beth: You mentioned in your book that predictive analytics is a collaborative effort. Can you explain a little bit what you mean?

Ken: Sure. I think it starts in phase one, when you're able to calculate the amount of scrap learning associated with that training program and you can go to a business executive and say, "Okay, here's what we've calculated, and if we don't address this, here's what it's costing in the way of wasted money and wasted time," because we can convert that scrap learning percentage into dollars and into wasted time. Right from the get-go, you've got the business executive's attention, because they know they're sending their employees to this training program, [00:14:00] and it's either wasting their time or wasting their money. That gets their attention, but that's just step one, because in that phase one we've also identified the obstacles the learners encountered when they got back on the job, the things that either prevented or inhibited them from applying what they'd learned.

In every case where I've worked with clients using this methodology, a number of the identified obstacles fall outside the realm of the learning and development person's job scope. They can't address those things themselves, so they've got to get the business executive involved in order to eliminate or minimize those obstacles and increase training transfer. It's another way that, from the very get-go, you get their attention, and then you keep them involved in the process [00:15:00] through the solution implementation phase as well.

Susan: I would think that collaboration's really important too, because everybody needs to understand the financial impact of all of this. Because of that collaboration, yes, you're going to be investing time and resources to do this predictive learning, but you're also ultimately going to have more engaged learners, more impactful learning, and better results.

Ken: Yes. Yes. I actually have a formula that I've created. It's just plug and play: you put in the numbers, and you can convert that scrap learning percentage to dollars and to wasted time, and do it credibly. So when you go in there and share it with a business executive, it doesn't look like you've painted a dire picture just to get their attention; you can provide credible data and credible evidence about what it's costing. That goes back to that Jim Barksdale quote: "If you've got data, let's look at the data. If you've got an opinion, well, I'm just going to go with mine."
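Ken's plug-and-play formula itself isn't shown in the episode. A common back-of-envelope version of the same idea multiplies per-learner cost and time by headcount and the scrap rate; the inputs below are illustrative assumptions, with the 45% scrap rate taken from the KnowledgeAdvisors benchmark mentioned earlier:

```python
# A back-of-envelope version of the scrap-to-dollars idea. Ken's actual
# formula isn't shown in the episode; these inputs and the simple
# multiplications below are illustrative assumptions only.
cost_per_learner = 1_200.00   # design, delivery, travel, materials (assumed)
hours_per_learner = 16        # seat time plus prep (assumed)
hourly_wage = 40.00           # loaded average wage of attendees (assumed)
learners = 150
scrap_rate = 0.45             # the 45% KnowledgeAdvisors benchmark from the episode

wasted_dollars = cost_per_learner * learners * scrap_rate
wasted_hours = hours_per_learner * learners * scrap_rate
wasted_wages = wasted_hours * hourly_wage

print(f"Training spend lost to scrap: ${wasted_dollars:,.0f}")    # $81,000
print(f"Learner hours lost to scrap:  {wasted_hours:,.0f} hours")  # 1,080 hours
print(f"Wage cost of those hours:     ${wasted_wages:,.0f}")       # $43,200
```

Even with modest assumed inputs, the wasted-spend figure lands in the tens of thousands of dollars for a single program, which is exactly why Ken says this conversion gets an executive's attention from the get-go.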

Beth: As we've [00:16:00] been talking about this, I've been wondering, how can I apply this? There's so many different types of training programs, focused ones, really wide-spanning ones. How do you decide what's a good training experience to use predictive analytics on? Perhaps you have an example of a case study or predictive analytics in practice that you could share with us.

Ken: Sure. Theoretically, you can use it with any kind of training, though there are some exceptions. Typically, the research shows that technical types of training programs tend to have higher levels of training transfer, or lower levels of scrap learning, because it's stuff people are learning that they're going to immediately start using on the job. It's probably being supported by their supervisor or manager, who wants them to start doing it. There are a bunch of other reasons for that besides [00:17:00] the fact that it's just technical training.

Soft skill training is a whole different story. That's where going in, taking a look at all of this, and being able to sort through it is the real advantage. The criteria I would look at would be: Is the program strategically important? In other words, does it address some kind of goal or objective, either within a department or within the organization? Is it a high-cost program? For example, leadership development programs tend to have fairly high costs because they might involve feedback, one-on-one coaching, and all kinds of other expenses. A third one would be how many people are going to attend the training program. If it's a program that's going to be rolled out to a large number of people, that means a lot of time and a lot of money are going to be invested in it.

I think the other one would be finding a business executive who is willing or eager, [00:18:00] maybe eager is too strong, willing to support the effort. Those four things, the strategic importance, the cost, the number of people, and finding someone who's willing to work with you and be a partner, are what I would look for in starting with the methodology. If you have a success case, if you're doing this internally, then you can take that success case and share it with other executives. That gives you some credibility with them and also, I think, creates some interest on their part in doing something similar.

Oh, and then you asked about a case study. There was a company I worked with a few years ago, a commercial property management company. They managed offices and malls and things like that. A significant number of their senior executives were getting long in the tooth [00:19:00] and getting ready to retire, so they wanted to implement a leadership development program for their mid-level managers and the high-potential people they'd identified. They put together this leadership development training program and put all these mid-level managers and high-potential people through it. They wanted to know whether people were applying it back on the job and whether the reinforcement kinds of things we talked about earlier were being used and had a positive effect. What they found, in the best-case scenario, was that the amount of scrap learning went down from 44%, before they implemented anything, to 32%. And this was shortly after they'd implemented all these corrective actions, so given more time, that percentage probably would've come down even further. In the worst-case scenario, [00:20:00] 64% of the training was scrap, and it dropped down to 59%. And the most likely case, which is just an average of the best case and the worst case, went from 54% down to 46%. We also analyzed whether those changes were statistically significant. In this case, what we found was an 89% probability that the reduction in scrap learning was due to the follow-up and reinforcement activities they had implemented.
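The arithmetic behind the "most likely case" figures is simple enough to check directly; this short sketch reproduces it from the best-case and worst-case numbers Ken cites:

```python
# The "most likely" case in the study is the mean of the best and worst cases.
best_before, best_after = 0.44, 0.32    # best case: 44% scrap -> 32%
worst_before, worst_after = 0.64, 0.59  # worst case: 64% scrap -> 59%

likely_before = (best_before + worst_before) / 2  # 0.54
likely_after = (best_after + worst_after) / 2     # 0.455

print(f"Most likely case: {likely_before:.0%} -> {likely_after:.1%}")
# Most likely case: 54% -> 45.5% (the episode rounds 45.5% up to 46%)
```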

Beth: That's incredible.

Susan: Yes.

Beth: It's very refreshing to have numbers on behavior change. This is such an elusive piece of evaluation: how do you know if someone's actually applying and using this new knowledge? You've found a way here to tell that story. It's really fascinating, Ken.

Ken: Yes, and I think there are two things [00:21:00] that differentiate this methodology from traditional measurement and evaluation that are important for people to understand. Traditional measurement and evaluation is typically backward-looking. When you talk about the five levels of evaluation, you're looking back and saying, "Oh, yes, that's interesting." People liked this program or they didn't, people learned something or they didn't, some people applied it and some didn't, and so on, but it's always backward-looking.

The other thing is that traditional measurement and evaluation focuses on programs: what happened with this program? Predictive learning analytics is different in that it's forward-looking, because we're trying to forecast, right at the end of the training program, what's likely to happen down the road when people get back on the job. So it's forward-looking, not backward-looking. The other differentiator is that we're able to identify specific individuals, [00:22:00] so which learners, specifically, by who they are, are likely to apply what they've learned, which are not, which managers are likely to provide active support, and which aren't. Now we can become much, much more strategic in all those follow-up activities.

Instead of just throwing stuff against the wall, we can identify all those follow-up things and say, "Okay, maybe some of this stuff we want to give to everybody, but this group of learners, we need to make sure they get the whole enchilada, all the different reinforcement activities. And these managers, because we know they're not likely to provide active support, we need to make sure we help them change their approach to supporting the training." It becomes much more strategic.

Beth: Right. And then I can see how, from there, you can tell the story of organizational change. You can go back to that executive-level person and say, "Hey, we have the data to know how to shift the culture in our organization now, because [00:23:00] we can target these people." That's what's really exciting as well. It's from the top down, and you can create a whole new culture from it, it seems.

Ken: Yes. As I said, we talked in terms of creating a success story, where you pick a program and pick an executive who's going to be willing to work with you and support you, and then you build a success case and start sharing it with the other executives. My guess is that they're all sending people to training, and they know it's not always being applied, so they're wasting money and wasting time. And that's not going to look good to the head executive, especially if they know there's another executive who's found a way to mitigate the scrap learning.

Susan: Yes, it's definitely a game changer. Ken, for people listening who want to try to get this going in their organization, [00:24:00] what advice would you give them on how to get started?

Ken: Well, there are two things. One would be the booklet that you mentioned earlier that I've written, available from ATD, the Association for Talent Development. The title is Evaluate Learning With Predictive Learning Analytics, and it's available at www.td.org. It walks through the whole process and explains everything. Of course, the other source would be me, so if you want to--

Susan: And people can see you in person.

Ken: Yes, right. They can contact me by email or phone or through LinkedIn. I'm happy to connect with them and talk about it, assist with the implementation in any way they want, or if they want to just get the book and try it on their own, that's fine as well.

Susan: I meant in person, too, at the ATD Conference, so tell us about that. You're speaking there as well.

Ken: Yes, I am. I'm doing a session called Level 3 Evaluations Made Simple, Credible, and Actionable. [00:25:00] I'm also doing an author meet-and-greet in the ATD bookstore at the International Conference, where I'll be available to answer questions, sign the book if you'd like, and just say hello.

Susan: Well, that's great, Ken. Hopefully, any of our listeners heading to the conference will check out your sessions and come meet you at the meet-and-greet as well.

Ken: Yes, I would love that. I would love that.

Susan: It was a great conversation. I really enjoyed listening to the two of you talk about this. It sounds like a game changer, like the missing piece of the puzzle in organizations. I'm really impressed. Ken, you're breaking new ground in measurement and evaluation with predictive learning analytics. We're so thankful you took time to share it with our listeners on Powered by Learning.

Ken: Well, thank you for having me. I'm biased, obviously, because it's my methodology, but it is different from traditional measurement and evaluation. I hope other people recognize that as well. [00:26:00]

Susan: Absolutely, thank you so much, Ken. Thank you for your time.

Ken: Thank you, Susan, and thank you, Beth.

Beth: Thanks, Ken.

Susan: Beth, that was a fascinating conversation with Ken. The idea of looking forward to inform your learning strategy is really interesting.

Beth: It is. I've always thought of evaluation as something to wrap up a training and gather information for the next one, but what if you could actually use evaluation preemptively? I think this could be an amazing tool for our clients: having that data from the start, so you know where you need to target in terms of the learner and the manager. The whole process could really be streamlined. In terms of evaluation, it's breaking some new ground.

Susan: I agree. We're going to look forward to hearing more from Ken and hopefully, this is something that we'll be able to use with some of our clients [00:27:00] moving forward too.

Beth: I think a lot of people will be really excited to try it.

Susan: Thanks, Beth.

Beth: Thanks, Susan.

Susan: Special thanks to our guest Ken Phillips. If you have an idea for a topic or a guest, please reach out to us at poweredbylearning@dvinci.com.


By Beth Buchanan, Senior Instructional Designer/Project Manager
