Following a tweet today from Stella Collins about a course she delivered, I want to raise with the L&D world the contentious issue of evaluation forms and feedback from training – specifically for behavioural training. To be clear, behavioural training is any training that isn’t technical or a core skill.
I’m not talking about the ROI of training, I’ve talked about that before. This is about evaluating the training itself.
Why is this question important though? What information does it yield which is so sought after? Is there a Holy Grail in an evaluation form that we’re all missing?
Well let’s first be clear about what questions get asked in a traditional evaluation form:
– Was the training useful?
– Were the facilities to your requirement?
– How effective was the trainer?
– Was the content relevant?
And you’ll no doubt be reminded of others that come into this mix. Well here’s the thing. Those questions are useless. They have absolutely no use whatsoever. As a trainer, there is nothing you enter on that form which I will act on. Ever.
Because I’m making an assumption that the training I’ve just tried to provide for you is useful, beneficial, and relevant to you. I’m only looking for answers on the evaluation form to confirm or validate my opinion of the training course. Even if you were to give me bad feedback, and be descriptive about the specific elements that need to be changed – I wouldn’t change them. Because it’s only one person’s opinion. There aren’t enough supporting comments for me to make widespread changes to the training.
So what needs to be done?
Personally, I don’t bother with evaluation forms. For the very reasons I’ve mentioned, they yield no information that will change my behaviour at any point in the future. And that’s what an evaluation form is meant to provide: meaningful information for the trainer.
Surely they must give an indication of something, though?
NO. While delivering training for Ford Motor Co., the training company I was with had to meet a requirement that the average score from each training session did not fall below 3.75 on a 1-5 scale. If it did, the trainer was questioned about their effectiveness, a plan needed to be drafted about the actions that would be taken to rectify the ‘problem’ and an improvement in the score expected.
The fundamental problem here is that the trainer isn’t then assessed on how effective a trainer they are, but on a variety of factors that may or may not be within the trainer’s control. Of course a trainer is meant to control things such as the training environment, content, delivery style and time management – but if on one day the trainer has a chesty cough, has been moved rooms at the last minute, the equipment in the room isn’t appropriate for the training and the trainer has no other aids to support his delivery style, he’s basically fucked (pardon the profanity).
There are formal models of training effectiveness, and many trainers will point to them to argue the need for evaluation forms. But I don’t believe in them, nor use them.
Evaluation forms may provide a useful foundation from which to prompt questions about the effectiveness of the training. But any good L&Der will know that actually, any change required in training comes from the discussions you actively seek out with course delegates, or from fellow L&Ders who are there to provide you with feedback.
However, if you do want to seek out formal feedback from your participants, make sure you ask the right questions. To my mind, these should be something like the following:
– What new information did you learn from this training session?
– How will you apply this learning back into your daily routine?
– What will you do to ensure you don’t fall back into bad habits?
– Did the trainer address your specific need for attending this training?
– Were you presented with information that confirmed or validated the way you are currently behaving?
– Were you given the opportunity to question and probe any areas of uncertainty?
– Were instructions from the trainer clear?
– Did the trainer create an inclusive and open training atmosphere?
Crucially, none of those questions should have a scoring mechanism against them. It is in the comments people make that insight is derived. Obviously the wording may need to be changed for some to allow better comments, but you get the idea of which questions are the important ones.