What the evaluation?

Here’s the scene. You’ve attended a 3-day training session on developing a set of skills. It’s been delivered internally by the Learning and Development team. The content was developed with help from subject matter experts, piloted with a group before roll-out, and has an Exec sponsor helping give the training some weight. It was co-facilitated by an L&Der and a business leader. It picked up on key points of fact, gave opportunities for debate, tested thinking, and developed learning. 14 people were present for the training, and they all engaged with the event.

At the end of the session they’re asked these questions:

On a scale of 1-5 (1 – not at all, 2 – a little, 3 – unsure, 4 – somewhat, 5 – very), answer the following questions:

1) How useful did you find the session?

2) How likely are you to use the skills learned on the session?

3) Will you recommend the training for others to attend?

4) Did you learn a range of new skills that you can apply to the day job?

5) Were the facilitators effective in delivering the content?

They look like fair and reasonable questions, don’t they?

Except, when you look at the questions a bit closer, scrutinise what they’re asking, and look at what happens with the information, you start to unravel their usefulness as a way of evaluating a learning session.

First, questions 1-4 are asking about the intent of a person. Now, we know that people are notoriously fickle. What they say they will do, and what they will actually do are two entirely different things. “Will you vote at the General Election?”, “Yes.” After the General Election – “Did you vote?”, “No.” Well, there goes the reliability of having asked that question then.

Question 5 asks the people attending about the efficacy of the facilitators. Now this is simply fraught with issues. “Do I have the personal power to voice my personal reflection?” “Do I want to be non-conformist and voice a critical opinion, at the risk of appearing unkind and unappreciative?” “Do I want to give the facilitators a true answer where it may get them into potential trouble?” “If I didn’t like the style of one of the facilitators, but the other was ok, which of them am I responding for?” The question may as well simply ask: did you like the facilitators? The usefulness of the answer amounts to the same: not a lot.

So what am I getting at here? I’m hoping to highlight just how much folly there is in trying to use a scoring-based evaluation form at the end of a learning session. I’m being quite careful here – it’s the actual evaluation mechanism I’m being critical of. I firmly believe that we should evaluate whether the learning session has been successful. I just think there are better ways of doing so.

Using a scoring-based system is really only effective when you are trying to survey people, or ask them to complete a psychometric tool. For evaluating a learning session, it just doesn’t hold much validity or reliability. Remember, if something is valid, it measures what you expect it to measure. If it is reliable, you will get consistent answers across a range of respondents. There are those in power in organisations who will insist it’s a useful metric. It’s only a useful metric because they think learning can be measured. Yet they forget that in every academic situation where you are testing learning, you provide a way of testing that knowledge which is independently marked, the results are normalised, and you arrive at a set of results which indicate whether students have passed or failed. That process in itself takes months. And yet we’re expecting to evaluate whether a learner has truly gained the skill they need within minutes of completing a session.

The questions we should be asking at the end of a learning session need to be simple, open, and allow for comment:
– Which parts of the session were most useful to you?
– What did the facilitators do to encourage your learning?
– Which pieces of content were well delivered and provided you with good understanding of the topic at hand?
– What will you practically be able to do differently when you return to your day role?

These types of question provide much more information, which is both highly useful and highly critical of the session. They allow the people attending to voice their opinions as openly or as flatly as they wish. They allow the facilitators to actually derive meaning from the responses. Importantly, they avoid people just giving positive responses because that’s what they think they should do.

If you want to measure whether the facilitator is effective or not, I have used the BARS system of evaluation, which is highly objective and very robust. BARS = Behaviourally Anchored Rating Scale. In this you identify the core competencies required of someone in a role. For each competency you then provide a scale of behaviours which moves essentially from ineffective to highly effective. Based on the overall score, you can confidently judge if someone is an effective L&Der or not. Imagine asking people attending a learning session to complete a form like this solely on the facilitator. It ain’t gonna happen. Nor is asking a peer to sit through the whole learning session in order to observe and provide feedback each and every time.
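To make that concrete, here’s a minimal sketch of what a BARS-style form could look like if you wrote it down as data. The competencies, anchor descriptions and scoring below are invented purely for illustration; they aren’t taken from any particular framework.

```python
# Minimal sketch of a BARS-style form (competencies and anchors are invented examples).
# Each competency has behavioural anchors from 1 (ineffective) to 5 (highly effective);
# an observer picks the anchor that best matches what they actually saw.
BARS_FORM = {
    "Facilitation": {
        1: "Reads slides verbatim; no interaction with the group",
        3: "Invites questions and handles them adequately",
        5: "Draws out debate, builds on contributions, adapts pace to the room",
    },
    "Subject knowledge": {
        1: "Unable to answer basic questions on the content",
        3: "Answers most questions, occasionally defers",
        5: "Answers confidently and links content to the attendees' context",
    },
}

def overall_score(ratings: dict[str, int]) -> float:
    """Average the anchored ratings across the competencies observed."""
    return sum(ratings.values()) / len(ratings)

print(overall_score({"Facilitation": 5, "Subject knowledge": 3}))  # 4.0
```

The point of the anchoring is that each rating is tied to an observable behaviour rather than to how much somebody liked the facilitator – which is exactly what makes it too heavyweight to hand to attendees at the end of a session.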

There are middle grounds, as there are with most things in life. And you can find useful ways of evaluating learning sessions. The sooner we move away from scoring based evaluations, the better.

Over on Rob Jones’ blog, he’s providing the other side of this debate.

The fallacy of ROI in L&D?

Through the life of an L&Der, there inevitably comes the question: what’s the ROI of training? It’s an interesting question, because something which is typically about behaviours and personal development is being forced to conform to the conventional measurements a business is used to. And, importantly, it’s a valid question. A business needs to know it is gaining benefit from the overhead cost of what someone (or a team) is paid.

But I think what’s typically put forward as the ‘proof’ is a fallacy. Here’s what I mean. What’s the value you put on the relationships you build at work? While most companies will encourage a collegiate work ethic, there’s no actual evidence to suggest this makes a difference. Not really. Employee engagement surveys may suggest a workforce is happy or engaged or feels listened to. But there’s little to suggest having a collegiate atmosphere is any better than having a workforce which is mercenary. As long as the job gets done, no one dies, and the environment seems amenable to helping people do good work, isn’t that what you need?

Abdi Ltd have a robust system of working out ROI, and it’s a strict methodology you have to follow to show the true cost of training and what this means for the business. It’s certainly a useful tool and system; I just don’t think it gives a true and full picture of what needs to be considered.
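For context, the conventional sum behind training ROI (as popularised in the Phillips approach mentioned later) is simply net monetary benefits expressed as a percentage of fully loaded programme costs. The figures in the sketch below are invented to show the shape of the calculation, not drawn from Abdi’s methodology.

```python
# Conventional training ROI: net programme benefits as a percentage of fully loaded costs.
# Both monetary figures here are invented for illustration.
programme_costs = 10_000     # design, delivery, attendee time, admin, etc.
monetary_benefits = 14_000   # value attributed to the training after isolating its effect

roi_percent = (monetary_benefits - programme_costs) / programme_costs * 100
print(f"ROI = {roi_percent:.0f}%")  # ROI = 40%
```

The calculation itself is trivial; the hard, contestable part is putting credible numbers on the benefits side, which is exactly where the ‘proof’ starts to wobble.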

So why the focus on collaboration, sharing, learning and development, and all things intangible? Because if the self-help books are anything to go by, these are the things we should be focused on, right? And in particular, the books from the successful millionaires/billionaires seem to suggest it’s the soft things in life that make the difference. You know, listening, coaching, advising, etc. Same old, same old.

The question that L&D needs to be ready to answer is: what is it doing to help the business achieve its goals? That’s how it wins. That’s where the ROI comes from. Not from the number of training courses it delivers. Not from the number of people who have attended external events. Not from the number of managers who have had training. Not even from the tens of thousands (in some cases hundreds of thousands) of pounds spent on trainers, consultants and facilitators. They’re just figures which anyone can improve.

If I can’t tell you how I’m helping the company to achieve its goals then I’m not giving ROI. What that means is I have to be so explicitly a part of the business that managers know I’m a source that can help something get delivered. That may be an L&D event, it may be paying for an event, it may be facilitating a workshop. That’s where L&D makes its mark. Not through happy sheets, monthly training reports or budget reviews. Yes, they’re important. No, they don’t reflect what is actually done in any way whatsoever.

My old boss made the team fill out a record of activities which ‘measured’ what we did. This covered a range of activities we were meant to do, which effectively became a list of ‘delivery’ and ‘non-delivery’ items. And every month we’d have a look at how we’d logged our time against it. And it used to be fascinating. Most of the team would average 60%-80% delivery activity. That could sound scary for some people, and encouraging for others. Ultimately what it helped him show the HR Director was how the team were using their time. I’m not suggesting L&D needs to be monitored via timesheets, but it does offer a better indication of the actual things done than broad figures and numbers.

Do you understand ROI?

This week I attended my first L&D 2020 workshop courtesy of the Training Journal (http://www.trainingjournal.com). The workshop was focused on the need for L&D to talk and understand the language of business in order to be successful.

The first talk was given by Tony Sheehan from Ashridge Business School (http://www.ashridge.org.uk). This was a great talk about current trends in L&D and where they may well be headed. Some of the highlights for me were around attitudes to L&D and how technology is influencing how people learn. He provided a 10-point sliding scale which was interesting, and I’m hoping to see the results of that survey. The sliding scale had factors such as Theory or Practice and Information or Resources; I don’t remember the rest. It was fascinating all the same.
We then had a talk from Jack Wills, head of the British Institute for Learning and Development (BILD – http://www.thebild.org), about what businesses actually want from L&D. There were some keen insights around CEOs looking for profit and return on investment (ROI) as the key things. This really helped me to think about whether I’m getting it right in how I think about, and what I understand of, the way the business operates.
The final session was with Jane Massy, the UK’s leading practitioner in ROI. She is the CEO of abdi (http://www.abdi.eu.com), the only certified professional service in the UK providing consultancy and training on the ROI methodology developed by Jack Phillips. Now this was the pièce de résistance for me. In the short time she had, Jane provided a very insightful look into how you can ensure, as an organisation, that you are working to a disciplined method which keeps ROI at the core of what you do in L&D. Some of my learnings are below.
The first learning I had was about whether or not the L&D initiatives are linked directly to the business plan. I already know this is what an effective L&D function has to do. The learning was whether or not you could confidently draw a line of sight from the activities and initiatives you do in L&D directly to the business plan. So, I already do the normal stuff: talking with managers regularly, staying abreast of what’s happening across the business, and, through general curiosity and keeping my ears alert, developing a sense and understanding of what the business needs. I then draw up a plan, present it to a few people to validate my idea, then run ahead with it. The part I miss (and, if I’m honest, often) is taking a moment to think about how this new and exciting initiative fits in with the business plan. Learning no.1.
The next learning I had was about whether my L&D initiatives focus on what the learner is expected to do, not what they’re expected to learn. Hmm, ok, this one is a bit tricky. I will design my courses based on what I think the learner needs to learn. But am I looking at what they’re expected to do once the training is complete? I may think I’m focusing on that, and I may be facilitating conversations to that effect, but is that the result I have in mind? I’m not sure. On reflection, it’s not as explicit as it probably should be. Learning no.2.
The third learning I had was about knowing the full cost of sending someone on a course – be it internal or external. I know about the actual costs associated with people attending training, but this certainly wasn’t a focus of mine. I don’t factor in the opportunity cost, expenses, admin costs and on-costs of someone attending training as part of the full cost of them attending. And that’s a pretty big oversight. Someone attending a £2,000 five-day course may actually be costing the business somewhere in the region of £6,000-£7,000 – WOW. How did I not think of that before?
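As a rough sketch of that arithmetic (every figure apart from the £2,000 course fee is my own illustrative assumption, not something from the workshop), the jump from course fee to full cost might look something like this:

```python
# Rough, illustrative roll-up of the full cost of a £2,000 five-day course.
# Every figure other than the course fee is an assumption made for this example.
course_fee       = 2_000
salary_on_costs  = 5 * 300   # attendee's pay and on-costs for the five days away
opportunity_cost = 5 * 500   # assumed value of the work not done while they're away
travel_expenses  = 600       # travel, accommodation, subsistence
admin_and_cover  = 400       # booking admin, backfill, coordination

full_cost = course_fee + salary_on_costs + opportunity_cost + travel_expenses + admin_and_cover
print(f"Full cost: £{full_cost:,}")  # Full cost: £7,000
```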
What’s the importance of that, though? Well, essentially, there’s a bigger expectation about the ROI on that person than was initially considered. Initially I may have thought that after two months of project work the knowledge gained had been used effectively and we had recouped the cost of the training. Actually, now I have to think bigger than that. It may be a lot further down the line before that’s the result. And what happens if this person leaves? What happens to that knowledge, time and investment made? So there need to be some follow-up activities that ensure we don’t lose it – documented learnings, wikis, blogs, presentations, workshops, etc.
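Carrying on with those invented numbers, the recouping point is essentially a payback calculation: the bigger the fully loaded cost, the longer the new skills have to keep generating value before the investment is recovered.

```python
# Illustrative payback period against the full cost sketched above (all figures assumed).
full_cost = 7_000
monthly_benefit = 1_000   # assumed extra value the new skills generate per month

payback_months = full_cost / monthly_benefit
print(f"Payback: {payback_months:.0f} months")  # 7 months against the full cost, not 2 against the fee alone
```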
Aren’t I doing that anyway? Again, I’m kind of doing those things, or encouraging staff and managers to do them. But it’s not explicit, and we won’t get the ROI we would expect if people don’t take the time to do these activities. Learning no.3.
So where does this leave me? With a very positive attitude to ensuring I don’t forget the importance of ROI and what it means for the business – from looking at the training needs analysis, to the L&D initiatives, to the ROI expected. These are things I could have spoken about before; I can now talk about them more intelligently.