What the evaluation?

Here’s the scene. You’ve attended a three-day training session on developing a set of skills. It was delivered internally by the Learning and Development team. The content was developed with help from subject matter experts, piloted with a group before roll-out, and has an Exec sponsor lending the training some weight. It was co-facilitated by an L&Der and a business leader. It picked up on key points of fact, gave opportunity for debate, tested thinking, and developed learning. Fourteen people were present for the training, and they all engaged with the event.

At the end of the session they’re asked these questions:

On a scale of 1-5 (1 – not at all, 2 – a little, 3 – unsure, 4 – somewhat, 5 – very), answer the following questions:

1) How useful did you find the session?

2) How likely are you to use the skills learned on the session?

3) Will you recommend the training for others to attend?

4) Did you learn a range of new skills that you can apply to the day job?

5) Were the facilitators effective in delivering the content?

They look like fair and reasonable questions, don’t they?

Except, when you examine the questions a bit more closely, scrutinise what they’re asking, and look at what happens with the information, you start to unravel their usefulness as a way of evaluating a learning session.

First, questions 1-4 are asking about the intent of a person. Now, we know that people are notoriously fickle: what they say they will do and what they will actually do are two entirely different things. “Will you vote at the General Election?” “Yes.” After the General Election: “Did you vote?” “No.” Well, there goes the reliability of having asked that question.

Question 5 asks the people attending about the efficacy of the facilitators. This is simply fraught with issues. “Do I have the personal power to voice my personal reflection?” “Do I want to be non-conformist and voice criticism, at the risk of seeming unkind and unappreciative?” “Do I want to give the facilitators a true answer if it may get them into trouble?” “If I didn’t like the style of one of the facilitators, but the other was OK, who am I responding about?” The question may as well simply ask: did you like the facilitators? The usefulness of the answer amounts to the same thing: not a lot.

So what am I getting at here? I’m hoping to highlight just how much folly there is in trying to use a scoring-based evaluation form at the end of a learning session. I’m being quite careful here: it’s the actual evaluation mechanism I’m being critical of. I firmly believe that we should evaluate whether the learning session has been successful. I just think there are better ways of doing so.

Using a scoring-based system is really only effective when you are trying to survey people, or asking them to complete a psychometric tool. For evaluating a learning session, it just doesn’t hold much validity or reliability. Remember: if something is valid, it measures what you expect it to measure; if it is reliable, you will get consistent answers across a range of respondents. There are those in power in organisations who will insist it’s a useful metric, but it’s only useful to them because they think learning can be measured this way. They forget that in every academic situation where you are testing learning, you provide a way of testing that knowledge which is independently marked, the results are normalised, and you arrive at a set of results which tell you whether students have passed or failed. That process in itself takes months. And yet we’re expecting to evaluate whether a learner has truly gained the skill they need within minutes of completing a session.
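
To see how shaky the numbers are, it helps to look at how reliability is actually quantified when people do it properly. Here’s a minimal sketch in Python that computes Cronbach’s alpha, one standard reliability statistic, over the five Likert questions above; the response data is entirely made up for illustration.

```python
# A minimal sketch of how reliability is quantified in practice:
# Cronbach's alpha over the five 1-5 Likert items. The responses
# below are invented purely for illustration.
from statistics import variance

# rows = respondents, columns = questions 1-5 (scores 1-5)
responses = [
    [4, 5, 4, 4, 5],
    [5, 5, 5, 4, 5],
    [3, 4, 4, 3, 4],
    [4, 4, 5, 4, 4],
    [5, 5, 4, 5, 5],
]

k = len(responses[0])                  # number of items
items = list(zip(*responses))          # transpose: one tuple per question
item_vars = [variance(item) for item in items]
total_var = variance([sum(row) for row in responses])

# alpha = k/(k-1) * (1 - sum of item variances / variance of totals)
alpha = (k / (k - 1)) * (1 - sum(item_vars) / total_var)
print(f"Cronbach's alpha: {alpha:.2f}")  # 0.7+ is conventionally 'acceptable'
```

The catch, and it echoes the point above, is that a high alpha only tells you the items move together. A room full of politely positive respondents will produce a perfectly “reliable” form that still measures nothing you care about; that’s validity, and no end-of-session score sheet gets you there.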

The questions we should be asking at the end of a learning session need to be simple, open, and allow for comment:
– Which parts of the session were most useful to you?
– What did the facilitators do to encourage your learning?
– Which pieces of content were well delivered and provided you with good understanding of the topic at hand?
– What will you practically be able to do differently when you return to your day role?

These types of question provide much more information which is both highly useful and highly critical of the session. They allow the people attending to voice their opinion as openly or as flatly as they wish. They allow the facilitators to actually derive meaning from the responses. Importantly, they avoid people just giving positive responses because that’s what they think they should do.

If you want to measure whether the facilitator is effective or not, I have used the BARS system of evaluation (Behaviourally Anchored Rating Scale), which is highly objective and very robust. In this you identify the core competencies required of someone in a role. You then provide, for each competency, a scale of behaviours which moves essentially from ineffective to highly effective; there’s a sketch of what that looks like below. Based on the overall score, you can confidently judge whether someone is an effective L&Der or not. Imagine asking people attending a learning session to complete a form like this solely on the facilitator. It ain’t gonna happen. Nor is asking a peer to sit through the whole learning session in order to observe and provide feedback each and every time.
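
For the curious, here’s roughly what a BARS form looks like expressed as structured data. The competencies, anchor descriptions and scores below are entirely invented for illustration; a real form would be built from a proper job analysis of the facilitator role.

```python
# A rough sketch of a BARS form as data; competencies and anchors are
# invented for illustration, not taken from a real job analysis.
from dataclasses import dataclass

@dataclass
class Competency:
    name: str
    # anchors[score] = the observable behaviour that earns that score;
    # intermediate scores (2, 4) sit between the described anchors
    anchors: dict

facilitator_bars = [
    Competency("Managing discussion", {
        1: "Talks over attendees; debate is shut down",
        3: "Invites questions, but lets a few voices dominate",
        5: "Draws contributions from everyone and builds on them",
    }),
    Competency("Handling content", {
        1: "Reads slides verbatim; cannot field questions",
        3: "Explains the material accurately when prompted",
        5: "Adapts examples on the fly to the group's context",
    }),
]

# An observer records the behaviour actually seen, not a gut-feel number
observed = {"Managing discussion": 5, "Handling content": 3}
overall = sum(observed.values()) / len(observed)
print(f"Overall BARS score: {overall:.1f} / 5")  # prints 4.0 / 5
```

The anchors are the whole point: the rater matches observed behaviour to a description rather than reaching for a number, which is what makes the result defensible in a way that “were the facilitators effective, 1-5?” never is.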

There are middle grounds, as there are with most things in life. And you can find useful ways of evaluating learning sessions. The sooner we move away from scoring based evaluations, the better.

Over on Rob Jones’ blog, he’s providing the other side of this debate.


Published by Sukh Pabial

I'm an occupational psychologist by profession and am passionate about all things learning and development, creating holistic learning solutions and using positive psychology in the workforce.

One thought on “What the evaluation?”

  1. Hi Sukh, I disagree with your view on this… hmmm, you make some valid points, but I would ask: who is this survey for? Why is this survey being run, and who is using the results from it?

    (Disclaimer – I write/program/analyse surveys for a living)

    I do agree: asking for feedback at the end of the session isn’t any good – and I think many, many training courses are guilty of this. Anything that wasn’t positive about the session is unlikely to be honestly fed back (if at all). It would be much better to ask the attendees their views and experiences, and how useful the session proved, after they were able to put some of what they learned into use or explore it further. Trouble is, someone realised that the survey response rate sucks if you ask people what they thought about something they did a couple of weeks (or months) ago, and it’s far more important to have a 100% response rate!!! (It doesn’t have to suck, but we do also have to balance memory recall and confusing one session with another – I digress.)

    For the facilitators I agree, this survey won’t tell them much, but what about the head of L&D who wants to know if these dozen sessions and all the prep work that goes into them are worth the effort? What if you’re re-running the course and tweaking it each time – are the changes helping? The scores will help you confirm, disprove or inform.

    And this particular survey doesn’t give the facilitators actionable feedback, and I can understand that’s frustrating for them (your example questions aren’t great – on purpose, I think; all too common as well!), but I would say you’ve got a room of engaged learners – make the most of it! They’ll tell you if cramming 3 days into 2 hurt, or that they liked how you did X.

    Rob’s blog title has the answer: Mixed Measures 🙂 Don’t use scores in isolation (and figure out which questions are meaningful and work for you). All this information should help complement…
