Ask the right question for evaluation

Following a tweet today from Stella Collins about a course she delivered, I want to address with the L&D world the contentious issue of evaluation forms and feedback from training, specifically for behavioural training. To be clear, behavioural training is all training that isn't technical or core-skill based.

I'm not talking about the ROI of training; I've talked about that before. This is about evaluating the training itself.

Why is this question important though? What information does it yield which is so sought after? Is there a Holy Grail in an evaluation form that we’re all missing?

Well let’s first be clear about what questions get asked in a traditional evaluation form:
– Was the training useful?
– Did the facilities meet your requirements?
– How effective was the trainer?
– Was the content relevant?

And you'll no doubt be reminded of others that come into this mix. Well, here's the thing: those questions are useless. They have absolutely no use whatsoever. As a trainer, there is nothing you enter onto that form which I will act on. Ever.

But why?

Because I'm making an assumption that the training I've just tried to provide for you is useful, is beneficial, and will be relevant to you. I'm only looking for answers on the evaluation form to confirm or validate my opinion of the training course. Even if you were to give me bad feedback, and be descriptive about the specific elements that need to be changed, I wouldn't change them. Because it's only one person's opinion. There aren't enough supporting comments for me to make widespread changes to the training.

So what needs to be done?

Personally, I don't bother with evaluation forms. For the very reasons I have mentioned, they yield no information that will change my behaviour at any point in the future. And that's what an evaluation form is meant to provide: meaningful information for the trainer.

Surely they must give an indication of something though?

No. While delivering training for Ford Motor Co., the training company I was with had to meet a requirement that the average score from each training session did not fall below 3.75 on a 1–5 scale. If it did, the trainer was questioned about their effectiveness, a plan needed to be drafted setting out the actions that would be taken to rectify the 'problem', and an improvement in the score was expected.

The fundamental problem here is that the trainer isn't then assessed on how effective a trainer they are, but on a variety of factors that may or may not be within the trainer's control. Of course a trainer is meant to control for things such as training environment, content, delivery style and time management – but if on one day the trainer has a chesty cough, has been moved room at the last minute, the equipment in the room isn't appropriate for the training and the trainer has no other aids to support his delivery style, he's basically fucked (pardon the profanity).

There are formal models of training effectiveness that many trainers cite to justify the need for evaluation forms. But I don't believe in them, nor do I use them.

Evaluation forms may provide a useful foundation from which to prompt questions about the effectiveness of the training. But any good L&Der will know that actually, any change required in training comes from the discussions you actively seek out with course delegates, or from fellow L&Ders who are there to provide you with feedback.

However, if you do want to seek out formal feedback from your participants, make sure you ask the right questions. To my mind, these should be something like the following:
– What new information did you learn from this training session?
– How will you apply this learning back into your daily routine?
– What will you do to ensure you don’t fall back into bad habits?
– Did the trainer address your specific need for attending this training?
– Were you presented with information that confirmed or validated the way you are currently behaving?
– Were you given the opportunity to question and probe any areas of uncertainty?
– Were instructions from the trainer clear?
– Did the trainer create an inclusive and open training atmosphere?

Crucially, none of those questions should have a scoring mechanism against them. It is from the comments people make that insight is derived. Obviously the wording may need to be changed for some to allow better comments, but you get the idea of which questions are the important ones.

Published by

Sukh Pabial

I'm an occupational psychologist by profession and am passionate about all things learning and development, creating holistic learning solutions and using positive psychology in the workforce.

5 thoughts on "Ask the right question for evaluation"

  1. Great, and thought provoking, post Sukh. I'm currently reading The Paradox of Choice by Barry Schwartz and I thought this quote quite pertinent to the points you make (it's quite a long quote, and based on outcomes of experiments by social scientists): ". . . neither our predictions about how we will feel after an experience nor our memories of how we will feel during the experience are very accurate reflections of how we actually do feel while the experience is occurring."

     Although the context of the book is about the psychology of decision making, I think this quote raises the issue of when, in a training session for example, is the best time to gather feedback.

     I think the list of questions you give at the end of your post is a good one, and these are the types of questions I have asked at the end of training sessions I have delivered. The problem is that these types of questions require time to answer, and they are only valuable if the majority of delegates provide you with decent feedback. I tend to find delegates are keen to leave at the end of a session, not spend half an hour answering questions. Coming back to the quote, I think the challenge of getting valuable feedback is to get it while delegates are 'in the moment'.

  2. I love the proposed questions at the end of this blog, but agree with Martin that they need to be answered after some thought, not in a rushed 5 minutes at the end of a session. Emailing delegates the day after perhaps may be a good way of capturing information whilst it is still fresh, yet having allowed enough time for people to reflect.

     I understand why you do not always use traditional evaluation forms. I always do (the client tends to like them), but I don't always read them, or if I do, I read them a few days after the event. Why? I know how the event went! More useful feedback is gained from people who take the time to contact me afterwards with something specific to say. Very few people do it, but I ALWAYS listen to what they have to say.

  3. @Martin, this sounds like a book I should read. I enjoyed that quote, thanks for sharing it. I wholeheartedly agree with it too. The feeling I have during an experience is never the same as what I remember it to be. So many other things affect that feeling post-experience; we can only go on our memories.

     I also agree with your point about the time factor. Where I have used forms, I've been explicit with my delegates that I want them to fill them out before they leave. In most cases this is fine and the delegates are kind enough to oblige.

  4. @Sheridan, thanks for taking the time to comment. I think part of our role as L&Ders is to get clients to understand that although evaluation forms may offer some immediate feedback, they are superficial at best.

     I am in absolute agreement with you about not reading the forms though, as the feedback I've actively sought from the delegates is far richer and more meaningful than what the forms may produce. I too always take on board what someone has to say when they take the time out to speak to me directly about the session.

  5. As it was my tweet that started this, I thought I'd better wade in (wellies and umbrella ready). I agree with much of what has been said, so am only commenting on what I believe is different.

     I like to do a post-course reflection on all training I'm involved in, and think evaluation forms are part of this because it's someone else's view as well as mine. I do read all my evaluation forms, and usually straight afterwards (or the next day), because then I've got the individuals clearly in my head so can better evaluate their responses based on my experience (which I know is only my view). I have made changes to workshops based on evaluation sheets if there seems to be a trend. One question we ask is about which session was most/least useful and why. It's only based on what seems useful at the time and doesn't reflect what people will do back at work, but there is a stronger chance they'll apply something they found useful or interesting on the day than something that they didn't find useful.

     The scored questions are never as interesting as the written responses, because I believe the written responses show an element of thought – it's very easy to go down a list and tick scores, but it needs more effort to write comments. And you can tell that some people consistently score higher and others consistently lower, but if they've given a range of scores then it shows they've thought about their answers.

     Like others, I also ask people directly if it's working for them and encourage feedback during the day. So yesterday one person was quite happy in the session to say that she hadn't found one area particularly useful to her, but appreciated that other people had clearly found it interesting. Later on she found something particularly useful for her.

     One question I like at the end of our evaluations is 'what 3 things are you going to do differently back at work?' I don't know whether they'll do them, of course, but it's more likely they'll do them if they've been specifically asked. This info can also be fed back to line managers so they can help people implement their learning.

     One of the differences may be that I'm not working in a large organisation and being measured on my evaluation sheets – I'm using them as a tool to get a brief written summary of what people learned, what they'll do tomorrow back at work and whether they think anything should be changed. I'll tie that in with my experience, their comments through the day, longer-term feedback from managers and any other useful information that comes my way.
