E05 $150B Is Spent on Workforce Training & Development. Is It a Waste?

Kevin Yates, Learning Measurement Services and Tony Groat, International Powered Access Federation with Pinky Gonzales

KEVIN YATES: My philosophy, my approach, and my deeply rooted belief is that we don't have to wish, hope, or believe that learning and development makes an impact. There's data to prove it. So that drives and guides everything that I do, no matter what industry I'm working in.

MANDY HENRY: Welcome to the LITES Podcast. It's Leadership in Industrial Technology, Education, and Safety, where we interview thought leaders in construction and heavy industries during this exciting time of rapid innovation. I'm Mandy Henry, Communities Manager at ITI, and in this episode our Pinky Gonzales sat down with Kevin Yates of Learning Measurement Services and Tony Groat of the International Powered Access Federation to discuss how training efficacy is measured and how training can help tell the story of a company.

Did you know that about $6 trillion is spent on annual payroll in the US? That includes wages, benefits, and bonuses paid to employees. Another $150 billion is spent by US employers on training and development, which represents about 2.5 percent of total payroll.

To evaluate training efficacy, you really need to get the variables right as well as the overall formula. Employers are well aware of the hard costs associated with employee payroll, training courses, and exams. The hidden cost of training that is sometimes overlooked is opportunity cost, or the lost production of an employee while they are in training. Depending on the employee's role and the employer's industry, the opportunity cost could be 5 to 10 times the employee's pay!

For example, in the chemical industry, revenue per employee has been over $800,000 per year, or $385 per hour!  That sort of production equates to an opportunity cost of over $6,000 for a 2-day training course.
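To make that arithmetic concrete, here is a minimal sketch of the opportunity-cost calculation, assuming a 2,080-hour work year and 8-hour training days; beyond the figures quoted above, everything here is illustrative.

```python
# Back-of-the-envelope opportunity cost of training.
# Assumes a 2,080-hour work year (40 hours/week x 52 weeks) and 8-hour training days.

WORK_HOURS_PER_YEAR = 2080

def opportunity_cost(revenue_per_employee_year: float, training_days: float,
                     hours_per_day: float = 8.0) -> float:
    """Lost production while an employee is in training, in dollars."""
    revenue_per_hour = revenue_per_employee_year / WORK_HOURS_PER_YEAR
    return revenue_per_hour * training_days * hours_per_day

# Chemical-industry example from above: $800,000 revenue per employee per year.
print(opportunity_cost(800_000, training_days=2))  # ~6153.85, i.e., over $6,000
```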

With all this considered, it’s obvious why we want to ensure training works, and produces positive returns for the employer and employee.

Let’s listen to Pinky Gonzales and Kevin Yates talk about how we can determine the efficacy of training.

GONZALES: Tell us about Learning Measurement Services, Kevin.

YATES: Learning Measurement Services, my business, is focused on helping organizations answer the question: did training work? Most important is answering that question with data and analytics as fact-based evidence of learning and development's impact.

GONZALES: On your website right up front it's very clear that measurement is one of the key focus points of the work that you do. How do you want to unpack that? How do we measure it? What do we measure? What does that mean?

YATES: Yeah, that's a great question. The good news is that it is not a complicated answer and the work is not complicated either. At the end of the day, there are organizations that are spending thousands of dollars and thousands of hours of people's time focused on training, learning, and development. Really the question is: what evidence is there that those efforts are making an impact on the goals that are within an organization and the way in which people are behaving and performing day to day?

The other question to answer is whether or not there is a return on the investment. When I talk about return on investment, I talk about dollar-to-dollar return on investment. Quite often people use return on investment to mean many different things, but when I talk about it, I'm talking about dollar-to-dollar. So, when we take a look at measurement, we're taking a look at the different data points, the different metrics, that answer that question. Because at the end of the day, what we want is fact-based evidence for the extent to which learning and development is impacting organization goals and changing people's performance.
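As a minimal sketch of the dollar-to-dollar comparison Yates describes (the function name and the example figures are illustrative, not from the episode):

```python
def training_roi(monetary_benefit: float, fully_loaded_cost: float) -> float:
    """Dollar-to-dollar return on investment, expressed as a percentage.

    monetary_benefit:  dollars gained or saved that are attributable to the training
    fully_loaded_cost: course fees plus delivery plus participants' time, in dollars
    """
    return (monetary_benefit - fully_loaded_cost) / fully_loaded_cost * 100

# Hypothetical: $180,000 of attributable benefit on a $120,000 program.
print(training_roi(180_000, 120_000))  # 50.0 -> a 50% return
```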

GONZALES: Do you have any specific examples of how measurement is being used or could be used more effectively?

YATES: Yes, absolutely. I'll share some work that I recently did with a public agency. What that agency wanted to know is whether or not the training and learning and development programs they had in place were impacting the way in which the agency's case workers were working with their clients, working with children and working with families. So, what we wanted to do is create what I like to call a chain of evidence, because a chain of evidence ultimately tells a learning and development story. Along the way, you want to take a look at the different data points that lead to impact.

Those data points include what happened after the learning took place. You also want to measure the extent to which there is incremental knowledge, knowledge acquisition, or a change in learning based on the learning experience. You want to include in that chain of evidence the impact that learning has on the individual's performance, and then take that all the way through to the ultimate organization goal, which in this case was engaging children and families in a meaningful and high-impact way.

So, the work that I did was focused on, first of all, identifying the different metrics and data points that you collect along the way that tell that story of impact. It's less focused on traditional activity metrics and activity data. Traditional metrics and data for learning and development look like: how many people were trained, how many courses were offered, and how many hours of training were delivered. Those metrics and that data are important from an operations perspective, so you want to capture them to tell the story about the operational efficiency of your training department or your learning and development department. But they don't answer questions about impact on performance and change in behavior. The work that I did was really focused on: how do we capture data that tells a story about training and learning and development's impact on behavior and performance?

We took a look at what metrics and what data we needed to collect along the way in that chain of evidence that showed, or even predicted, what we could expect as a result of the learning and development programs that they offer. We also used data and metrics to identify whether or not there was a change in learning or knowledge acquisition, and then really dove deep to say: as a result of learning and development, here are the changes in behavior or performance that we see. So, it's identifying what metrics and what data we use to capture that, and what story that tells. Then we led that all the way through to the end to say: what metrics and what data can we use to capture the extent to which those changes in behavior and performance are actually impacting the way in which people are working in the field as they work with families and children?
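One illustrative way to structure that chain of evidence in code; the stages mirror the ones Yates names, but every metric name and value here is a hypothetical placeholder:

```python
from dataclasses import dataclass, field

@dataclass
class EvidenceLink:
    """One link in the chain of evidence: a stage plus the metrics captured for it."""
    stage: str
    metrics: dict = field(default_factory=dict)

chain_of_evidence = [
    EvidenceLink("learning experience",      {"post-session assessment average": 0.88}),
    EvidenceLink("knowledge acquisition",    {"pre/post score gain": 0.21}),
    EvidenceLink("behavior and performance", {"observed case-practice rating": 4.2}),
    EvidenceLink("organizational impact",    {"family engagement index change": 0.15}),
]

# Read top to bottom, the chain tells an impact story rather than an activity story.
for link in chain_of_evidence:
    print(f"{link.stage}: {link.metrics}")
```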

GONZALES: Are you personally involved in technological integrations? Do you work with the team or how do you actually engage with a client as it relates to the tools themselves?

YATES: Yeah, I think it depends on where the client is. I may go as far as implementation, but it might also be something as high level as helping with how to use the technology once it is implemented. It really depends on where the client is. Quite often with technology implementations like xAPI, that's where we're really looking for those experts to come in and help us identify how to implement the xAPI technology and embed it within the process.

Again, for me it's also supporting the organization through that process. There's a wide range for where I might fit, from implementation all the way through to utilization and making it actionable by telling the learning and development story.

GONZALES: Do you have any specific examples, or can you walk us through one of your prior client engagements from basically beginning to end? You don't have to name names, but maybe their industry, what the approach was, and then what outcomes you actually saw in a best-case scenario?

YATES: Yeah, that's a great question. What comes to mind is the work that I did with a very large data insights organization focused on marketing research and analytics. The goal that this organization had was finding a way to create faster time to value for its entry-level marketing research analysts. So, the business goal was to reduce the time it takes for new hires to be effective and to provide value to clients. Prior to this, it was taking as long as half a year to get to a point where new hires were able to add value and do the work. So that was the organization goal.

You work backwards from there, because that's where you always start. You always start with: what is the organization's goal? Before you even talk about training and learning and development and all that, what you really want to do is connect with the organization's goals, and preferably do that with the highest level of leadership in the business.

The highest level of leadership in the business knew that it wanted to do a better job of working with the clients they serve. One of the ways they could do that was to have staff working on those accounts who were capable and able to provide value to the client, but in a faster way, particularly when it came to new employees. The process for me was to identify a metric: what is the metric that will define success? We wanted to reduce the time it took to onboard new hires from 6 months to 3 months, that is, to reduce it by 50 percent. So that was the goal: reduce the time it takes to onboard new hires by 50 percent. That was the guiding metric, the North Star, if you will.
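The North Star metric itself is simple arithmetic; a quick sketch (the function name is illustrative):

```python
def time_to_value_reduction(baseline_months: float, achieved_months: float) -> float:
    """Percent reduction in new-hire time to value."""
    return (baseline_months - achieved_months) / baseline_months * 100

# The engagement's goal: cut onboarding from 6 months to 3 months.
print(time_to_value_reduction(6, 3))  # 50.0 -> the 50 percent target
```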

All the work that I did was working backwards from that, knowing that the time it took to onboard new hires was the organization goal and that the metric for success was a 50 percent reduction. We took a look at what it takes to get new hires to a point where they're able to perform in a way that meets the organization's expectations, but in only about 3 months. Working backwards, we identified performance requirements, and then we identified metrics and data against those performance requirements. We identified how people are expected to work and operate in their roles and found performance metrics that allowed us to measure that. Then, with those performance metrics in place, we went back from there to ask: What is it that people need to know? What level of knowledge do they need to have about the work they're doing that would ultimately allow them to perform at the level we've talked about?

Then we identified metrics that measure level of knowledge, because those would show the extent to which people had reached the required level of knowledge. Then we worked backwards again to say, "With the learning experiences and the learning solutions themselves, what is it that we need to be able to see? What can we measure that would give us insight into the extent to which we could predict or expect that people would acquire the knowledge we expect?" So again, we identified data and metrics to measure the outcomes from the learning experiences and learning solutions that allowed us to do that.

GONZALES: When you talk about measuring, obviously there's the dashboard side of this, the analytics side. In your experience, what are the best ways to do the measurement itself? Is it a written exam? Is it instructor led? Is it, show me that you know? How do people actually demonstrate that knowledge in a program like this?

YATES: I would say that there is no single best way. I would say that there are many ways, and that it really depends on what the performance expectations are and how you know, rather than think, hope, or believe, that people are meeting those performance expectations. Here's what I mean by that, Pinky. I've seen instances where organizations are using surveys and asking employees themselves about their own performance. I would say that's a least-best practice, because they're asking people to report on their own performance, so there are some biases there.

If there are opportunities to collect and gather data from observers of a person's performance, that data is a bit more reliable, and you can do that via a survey where observers respond very quickly about their experience with that particular worker. That would be a good way to capture reliable and valid data, in real time, about a person's performance. The other thing you've probably often seen in retail, for example, is a receipt that says, "If you'll just go and answer a few questions about your experience today..." That captures real-time data about that person's performance as well.

GONZALES: How does your approach change, if at all? I think philosophically it would probably be the same in any industry, but do you have a different style or methodology for, say, heavy industry than you would for somebody in HR at a software company, for example?

YATES: I don't have a different style, because my philosophy and my approach and what I believe are the same no matter where I'm working or who I'm working with. My philosophy, my approach, and my deeply rooted belief is that we don't have to wish, hope, or believe that learning and development makes an impact. There's data to prove it. So that drives and guides everything that I do, no matter what industry it is.

What does change, however, is the data that we look at. The data that you look at in manufacturing might not be the same data that you'll look at in healthcare or in professional services. The belief that there's data to show learning and development's impact guides everything you do, and that doesn't change. So really the only thing that changes is the kind of data that you look at. The approach is the same.

GONZALES: Can you give us some examples of what it means to tell a story with data? Where is that story told? In what ways is the story used?

YATES: Yeah, and there are different stories to tell. It depends on the stakeholder group, because we don't want to tell all people the same story; it depends on where they sit and the vested interest they have in that story. Here's what I mean by that. If I am talking to a senior leader or the C-suite, I am not going to tell a story about how many people we've trained. I'm not going to tell a story about how many hours of training we've offered. From that C-suite, executive-leader perspective, at the end of the day the story they want to hear is how learning and development helped us reach a goal.

HENRY: To gain a heavy industry perspective on training investments and efficacy, Pinky also spoke with Tony Groat of the International Powered Access Federation.

GONZALES: One of the focuses of this particular episode is how to measure the results of your training program, which is one part. What should we train? What would we need to know in order to say we've done a good job with this person? What's that process been like? Obviously it's a committee; you're the chairman, and there's been an industry effort. But what have the steps been to get to this point? If somebody discovers something out in the yard and they have an idea, how does better training work its way up through the system to a point where it becomes part of a standard?

GROAT: Well, first, the people who set the standards are industry experts to begin with. We have many years of broad experience, and it's a consensus process where all stakeholders of the industry are afforded a seat in the meetings. So, we have manufacturers there, we have dealers who buy the equipment and rent it, we have users of the equipment, we have engineers and consultants, we have training companies, and we have unions who represent workers using the equipment. All of those parties at some level have touch points with all of the other entities within the industry, and we recognize what's happening. We obviously have statistical data on accidents that occur.

Where are we failing? What are the key issues that we need to identify, find the root cause of, and find means to mitigate? All of that information comes into the dialogue. We look at what our existing standards say, and we make assessments of what we can do to improve the results that we're looking to achieve. So, all of that takes place, and the standards at one level need to be specific in terms of what the requirements are. On the opposing side, the standards need to be flexible enough to allow a wide range of approaches to achieving those results.

So, when we talk about training, a lot of people say, "Well, why don't we say that it should be an eight-hour training? Why can't we put a set time on it?" Well, what is the equipment that we're using? What's the complexity of that equipment? What is the experience of the operator we're training? What is their knowledge of the job conditions they'll be working in? You have all these variables that need to be put into the equation, and our standards say that you need to get this result; how you get there is up to you to decide and put into place. All that considered, we still have a very generic approach to the first phase of training. One of the key issues that I think the whole industry needs to embrace is really the meaning behind training: training enables one to become qualified.

GONZALES: Excellent. I want to dig a little bit more into the technology itself. So, somebody has been trained on a piece of equipment. That piece of equipment now has, I assume it's an after-market device of some sort where they're going to be able to log in or prove their credentials and then to say, “I should be allowed to operate this machine.”

GROAT: I prefer to take two steps: what are the components of the training, and where are we with the technology for those elements? We have two basic components for aerial lift, or mobile elevated work platform, operator training. The first one we call theory, or classroom, training. Historically it was instructor-led. Now we offer it through an online e-learning portal that can be used in lieu of instructor-led delivery.

When online e-learning came into play and people started using it, they started finding all these great features that added benefit. You can schedule your training whenever you want, 24/7, wherever you have Internet access on your device. You can do it at your own pace. So, if you have five trainees, one may get done in two hours and one may get done in four hours, but both of them will get the same exact learning, and you control it. We have nine modules; a trainee can't jump from module one to module nine and then try to answer the questions at the end. There's a quality that needs to be built into this, because if people think about online training as just watching a PowerPoint presentation, that's not an effective e-learning curriculum. So, there's a lot of value. It needs to be interactive, it needs to engage the individual, and it needs to progressively move them forward in the learning process without allowing them to skip, jump, and move around.
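A minimal sketch of the gated progression Groat describes: nine modules, a check on each, and no jumping ahead. The class shape, passing score, and method names are assumptions for illustration; IPAF's actual portal is not public code.

```python
PASSING_SCORE = 0.8  # assumed threshold; the real portal's rules aren't published

class GatedELearning:
    """Nine modules that must be completed in order, each with its own quiz."""

    def __init__(self, num_modules: int = 9):
        self.completed = [False] * num_modules

    def is_unlocked(self, module: int) -> bool:
        # Module N opens only after modules 0..N-1 are done: no skipping ahead.
        return all(self.completed[:module])

    def submit_quiz(self, module: int, score: float) -> bool:
        if not self.is_unlocked(module):
            raise PermissionError(f"Module {module + 1} is locked; finish the earlier modules first.")
        self.completed[module] = score >= PASSING_SCORE
        return self.completed[module]

course = GatedELearning()
course.submit_quiz(0, score=0.9)    # module 1 passed
# course.submit_quiz(8, score=0.9)  # would raise: can't jump from module 1 to module 9
```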

So, there's a technology and a knowledge to developing effective e-learning that makes it an effective tool or an ineffective tool, because I've seen both in the marketplace. With the technology that's out there today, you can't walk down the street without bumping into someone, because we all have our faces in our devices, and it's becoming much more acceptable as a delivery method. So now you have this option, and you don't have to worry about the skill base you have to maintain through the professional development of an instructor. It's like any tool or any experience that you have: if you don't use it, you will lose it. An instructor is no different from an operator. If he trains all of their people this year and then doesn't train again for months at a time, he's not going to be an effective or qualified instructor on the next course without doing some work to redevelop his skills, just like an operator will have to bring their skills back up. Because if you're not using it, you will lose it.

You can't condense or miss anything with the e-learning, so it's much more controlled. You can measure everything. We have testing on each module, and there's a tendency for the trainee to actually be more engaged because they're in control of the whole process. From my own experience, and looking at statistical information that's out there, retention is generally higher for online e-learning than it is for instructor-led training.

GONZALES: The skilled labor shortage: do you have an opinion or perspective on it? What do you see out there in terms of qualified workers and the demand versus supply?

GROAT: The issue is, we have lots of people in the world, so we don't have a shortage of people. We have a shortage of qualified workers, skilled workers, to do specific tasks. So, is there a shortage? Sure, there is. There always has been and there always will be. The question is, what can we do today to make that change, when you think about the things that are happening today? Who isn't listening to the news every day and hearing that this place is closing? In Schenectady, New York, where General Electric had one of their main facilities, their home base, there were over 50,000 employees in its heyday; there are fewer than 10,000 there now.

GONZALES: Wow.

GROAT: And every year you hear another layoff, another layoff, another layoff, and it's not GE alone. You hear it every day: they're getting rid of people. So, now we have this need for people, and we have these people who have a need for jobs, and we don't have the bridge of training and skills to get them to the point of being employed.

GONZALES: So, we've got one big hairy question: is there one problem to rule them all? What do you think is the biggest problem facing the industry right now, if we have 10 years to work on it? What's the big issue to solve right now?

GROAT: When I look at the construction industry as an example, their strength is that they find a way to get things done. And if you look at what their weakness is, it's that they find a way to get things done. Sometimes that way of getting things done, even though it doesn't result in someone getting injured or killed, is extremely dangerous. When we do that enough, we become desensitized, and we, as the employer, as the supervisor, are actually training them how to do things incorrectly. We put all of the requirements into our training program, into our standards, and then we go out there and watch them be ignored.

For me, it really is the broad application of, and compliance with, what we said needs to be done. If you go out and look at a company that monitors and supervises and provides feedback, that's how workers go from being trained and enabled to becoming qualified. When we don't provide that guidance and feedback on an ongoing basis as experience builds, they will never be qualified. They will be an operator, because you're allowing them to operate the machine, but they will not be qualified. For me, the discrepancy is that after we go through training, we don't follow it, or we don't require it to be followed, or we actually circumvent what's being done. I'll go back to the manufacturer's operations manual: no one pulls it out and reads it. Well, I can tell you right now, every day you're non-compliant, because I can guarantee you're not complying with the requirements the manufacturer has for the inspection that must be done every day. Even if you have a little checklist over here saying you checked the machine and you check it off, you're not compliant with the standards, you're not meeting the manufacturer's requirements, and you're not bringing about safety.

GONZALES: Where do you see simulators today in this matrix that you're already involved in?

GROAT: Simulators today are an add-on tool. As I stated, when we're doing the practical evaluations, we do that in a pristine, controlled environment. Unfortunately, the environment where this equipment actually operates is far from pristine and hazard-free. So, we train them, and ideally even do an inspection: "I want a brand-new machine." So we inspect a brand-new machine: "Oh, it looks great." Then the next day they get on a piece of equipment and say, "Boy, that's not like the equipment I had in training," and all of a sudden they don't even know what's acceptable or not acceptable in terms of that inspection. "Wow, there's a big cut out of that tire. Is that OK or not OK? I don't know. We didn't go through that in training, because there were no cuts in the tires on the machines I saw." There's a tendency to miss information, and that can be augmented by other means.

My own experience has been that, prior to the last year or two, I saw them as a neat thing to do, but people don't want to spend the time that's already required for training, so adding another element to it didn't seem practical. It seemed like an added cost with no demand. In contrast, look at how training takes place today: if you do it as a group, you have an instructor or evaluator with one guy on the machine and five guys sitting, watching him do that evaluation. Well, if you could get them onto a simulator where they can practice before they go on, and you put in some real risk scenarios that we would never allow on the real training site, now you're putting them into a different learning experience, a different realization of it. They can get comfortable with the controls before they get onto the real controls.

I see that as the technology improves, the value it adds to training, and to how we're doing training, is there. Then, looking to the future, and I don't even know what the timeframe is anymore because technology is changing at such a pace, could simulators one day be used in lieu of actually getting on a piece of equipment? I can see the potential of that happening one day.

GONZALES: Do you feel that VR itself has gotten to the point where it's that close? Or do you think it's the mindset that has to change, or both?

GROAT: I think it's the technology that will change the mindset. Because today, if we went over and said, "Hey, I'm going to just add that in," I don't care how great it is; if it can't take the place of being on the equipment, the response is, "All right, I'm already doing the theory training and I'm already doing the practical, and all you're doing is adding more time and cost to my training by putting this in here." I'm not discounting that there's value there, because I do think there is. Again, as I stated, we train in this clean environment where we're not giving them the breadth of experience and knowledge that being in certain risk situations would provide, the real experience that they're going to face when they leave the training course.

So, there is value today in utilizing the tool. Now, that's me talking as an educator for the safe use of this product; it's me talking as someone who sits on standards committees trying to develop these. But the real question is, are the users going to say, "Yeah, I want that"? I think that when we get to the point where that tool can integrate that added value and can take the place of the machinery, now you really have changed the ball game altogether. I'm not just saying that from a time perspective, but from the learning that could occur, for the reasons I just spoke about, if it is as realistic as being on the equipment. If we can get to a point where we can objectively make that determination, now you have a game changer.

And I think about it from the broad application and utility of it. Think about today: OK, it gets light at this time and it's dark at that time, so that's the number of hours we have to do our training in. What's the availability of the equipment? The equipment comes in, and I'm not going to let you get on that piece of equipment because it won't pass the inspection. You have to have this piece of equipment, and it has to work, each and every time. Instead, I pull a trailer up with a laptop where you can sit and do the theory training, and I have the VR simulator next to it. You get done with one, you move to the other, and I can put you in different scenarios, maybe even predicated on the type of work that you're going to be exposed to or doing.

So, you put all those pieces in play and it's like, wow, that's pretty cool. You can see people wanting to embrace that and try it. Gaming is fun, but if it's not applicable to learning, in concert with what is required in learning, it's a challenge to see how it will be embraced broadly in all training. But as I look at it today, I just say, "Wow." It really is difficult to go over there and say, "Training enables one to become qualified," while we're limiting how much it enables them by not putting them into some of those realistic scenarios that they're going to face out in the field, scenarios we can't in good conscience put someone into while we're trying to train them.

I see tremendous opportunity. I see the development of it becoming much more robust for the applications that we're thinking about. And while, as I said, two years ago I would have said, "I don't see that this is really going to be embraced widely in the industry; it's a nice thing," today I'm saying it will probably happen before I die, as opposed to over my dead body, if that's a good expression. I think it has tremendous application.

GONZALES: Thank you for your time. Anything else you want to get in there?

GROAT: Uh, probably, but I don't remember specifically what it was.

HENRY: Thanks for tuning into LITES. It's Leadership in Industrial Technology, Education, and Safety. See more at LITES.org. If you enjoyed the show, please remember to rate, review, and subscribe. LITES is a production of Industrial Training International. Our guests today were Kevin Yates with Learning Measurement Services and Tony Groat with IPAF. Our producer is Michael Montaine.

If you are interested in becoming more involved in LITES, we are hosting an Open House event in Pittsburgh, PA to demo our VR Crane & Equipment Simulators. The event will consist of facility tours and demonstrations on our virtual reality boom truck, carry deck, rough terrain crane, mobile elevated work platform, and a HoloLens crane inspection. There will also be a Happy Hour to follow! To register for this free event, visit our website at www.lites.org.

I’m Mandy Henry, thanks for tuning in.
