New voices in L&D?

I haven’t heard any new voices in L&D in a long time.

A. Long. Time.

I have no idea what the new entrants to our profession are thinking.

I have no idea what the new entrants to our profession want to achieve.

I have no idea what they want to change.

What they want to understand better.

What excites them.

What worries them.

I couldn’t even tell you who these new entrants are.

My L&D bubble hasn’t been broken in a long time.

For all of February, I am opening up my blog for anyone in L&D who doesn’t get to comment regularly / ordinarily – and in particular to those new to the profession.

You can write about whatever you want. No guiding question. No theme. The blog is yours. You can write one piece, you can write 7 pieces. You have a home here, and you are welcome. You can make a video, you can draw art, you can swear, you can be polite, you can provoke, and you can just muse. You can write poetry, you can write in prose, you can say one word, or write an essay.

Come one. Come all.

The first will be published on Mon 3rd Feb, and we’ll see how regular it is thereafter.

Interested? Email me.

The Caution of Social Media Influencers

On Twitter I have 7000+ followers (at the time of writing) and I’m here to let you know that it really doesn’t matter as much as people think it matters.

From what I'm aware, the reasons people follow me tend to fall into the following:

  • I’m on a website/article list of recommended people to follow
  • Someone recommended me as someone to follow
  • I blog and talk about L&D and am therefore seen as a commentator in this space
  • People came across my profile and because I’m in the same profession/industry as them think we have things in common
  • We met at an event of some sort and by virtue of that I am being followed
  • Something else I can’t think of for now
  • (I’m not seeking to genuinely find out – these are just my own musings)

Now with a follower number that large I could easily fall into the trap of the following thinking.

  1. I have influence over what people think and I can use that in interesting and manipulative ways
  2. I can sell my reach to potential vendors who want access to those kinds of followers
  3. I am successful at Social Media and can advise others on how to grow their numbers
  4. Others want to hear from me so I must share engaging and provocative content for them to read / consume / share

And over the years I’ve tried various things to test any and all of the previous points. So here are some truths I’ve learned after repeated experiments.

Numbers of Followers do not equal Sales

I have held various open workshops and if any of the above points were patently true, I should have sold out on every one and would currently be making good money from continued endeavours like this.

I have been able to run these open workshops, but nearly every time with only a handful of people, and they have not been nearly as financially viable as I wanted them to be.

Numbers of Followers do not equal High Numbers of Likes

I’m just not that kind of content producer who can do that stuff where a tweet or blog is so well written or a podcast so fascinating that it attracts 100s or 1000s of likes / comments / reads / retweets.

With 7000+ followers there is a line of logic that says I could be achieving that high level of engagement and interaction, but my followers aren’t doing that. To my mind this is definitely a case of what I’m doing, not anything about my followers or their intentions.

FYI my blog receives probably 80 views for every published piece, and averages about 50 views a day anyway. My podcast, after 33 episodes, has about 3000 downloads. My tweets may get a handful of likes and maybe some commentary.

If the logic of 7000+ followers meant anything it should mean I receive multiples on each of those figures every single time. It’s just not happening like that.

Numbers of Followers do not equal Happy Sukh

I really don’t derive any level of happiness or joy from knowing I have a high number of followers. It makes no difference to how I live my life, how I use Twitter or Social Media in general, and I am not affected (fortunately) in any negative way by Twitter.

There are myriad other ways I seek out meaning and things that bring me joy. Some of that happens on Twitter and Social Media in general, but it generally happens off digital and that mix is right for me.

I’m no more of an Influencer than anyone else

The thing about Social Media in general is that it is very transitory, highly forgetful and very fickle. One off-hand comment one day can put people off you altogether. Days or weeks of no content and most people won’t bat an eyelid, notice, or even care.

Just to be clear I’ve never manipulated anyone on Social Media

I’m not a dick. I have morals and ethics. I want others to be well and live good lives. I spend a lot of time and effort thinking how I and others can do that more for the benefit of everyone and of individuals. I have never, and will not ever, use Social Media for manipulative purposes.

So what, Sukh?

Well, as with a lot of reflections like this, people know there are good principles to uphold, and those principles should be more important, and carry more meaning, than the fickle and temporary nature of Social Media.

  • Be kind to others; we never know what someone else is going through or why they’re saying critical or negative or abusive things
  • If you don’t want to engage with someone, are feeling threatened by them, or are being abused by them, report them, block them, and if needed file a police report
  • Joy and good news stories help provide hope and connection with others
  • Develop more empathy and the language of enquiry and empathy
  • Build others up. If they don’t want that, let them be and focus your energies elsewhere.
  • Have meaningful interactions.
  • Who you are today isn’t who you were yesterday and isn’t who you are tomorrow. We all move and adapt all the time.
  • If Social Media content is making you angry or experience things that are difficult to understand, talk to someone you trust. Explore your thinking. Understand yourself better.
  • Have good things happening in your non-digital life that you can look forward to and that help you feel good. Make it regular. Keep the routine.

Let me know what you think about the post. What else have you learned about supposed Influencers that you want to share here?

Hangers – lots of the bastards

This was at the top of my list of things to buy at the weekend.

We’re nearing the end of a lot of house renovations and one of those renovations was new wardrobes. This meant I needed hangers cos now I have cupboard space to hang the clothes that need hanging.

My Dad has been discharged home from his time in the rehab ward after his rather serious stroke. He hasn’t been home for 15 1/2 weeks. A lot of the weekend was spent making sure he has the things he needs to be mobile, ready for his carers, and starting to establish new home routines.

My daughter has been unwell so I’ve been spending time taking care of her, too. I’ve also been recovering from a bloody awful cold that kept me home for most of last week.

All very regular, isn’t it?

Friday’s election results were unwelcome for many in the UK. We have the biggest Tory majority government we’ve seen in decades. They have the mandate for the next 5 years to implement a whole array of policies. I have little faith any of the policies will genuinely look after those who need the government to help them live well. It wasn’t the result I wanted. Much like the referendum result of 2016.

Johnson is so incredibly awful as a human being. He is a misogynist, liar, racist, and has no statesmanship of any sort. His vision of Brexit will hit this country incredibly hard. He does all of these things for no other reason than he can. There is no moral fibre to the man that I would ever hold up as a guiding principle for anyone.

A lot can happen in five years. The Labour Party has a lot of work to do if they want to recover from such a devastating loss. When Blair came into power, it took the Tories 13 years to come back. Since then they haven’t enjoyed such dominance as they’re about to in Parliament. So, as a country, we really have to consider what we want our future to look like.

The SNP have a ridiculously strong hold over most of Scotland, and we could be witness to the UK being no more in the next decade.

We have hard discussions to have.

And in the coming months, we should take the time to reflect. To think. To understand. To listen.

We spend so much time shouting our opinions at others. We are bombarded on media by opinions. Where is the thinking time? Where is the necessary reflection needed? Where is the deep debate happening?

Since 2016 we have seen an onslaught of news like never before. Daily updates on every minutia of world leaders and what they have said and done.

So I’m taking time to just do day to day things, and to do them with joy and with heart. We can’t continue at this pace. None of us can. The collective mental burnout is palpable.

Yes, we should be buoyed into action. Yes, we should be angry about the state of things. Yes, we should demand better from our leaders. Yes, we should demand better for our country and for our environment. Yes, there are important topics to be discussed and to be resolved.

At the same time, we need time, we need rest, we need some semblance of joy.

L&D, Research and Evidence

Last week, I wrote about how L&D get research so very wrong at nearly every turn.

So let’s clarify some things we often put in the same bucket as research when they aren’t research at all.

  • Anecdotes are not research. Anecdotes are just that. Stories. Powerful, yes. Insightful, yes. And always subjective. From one person’s point of view. Anecdotes add power when many others report similarly (think #MeToo), and when they add emotional weight to research (think charity campaigns focusing on the plight of one person).
  • Case studies are not research. They are an example of how something worked in a very specific context. Case studies are very helpful for learning about new settings and different contexts. They are not research. They highlight a peculiarity which is interesting to study deeper and further. Case studies are strong because they have deep context.
  • White papers are a mixed bag. I could write a white paper tomorrow about my thoughts on leadership. It would be classed as a thought leadership piece. It wouldn’t have to cite sources for any of the assertions I make. At the same time, there are white papers which are robust, thorough, comprehensive and insightful because of the data they draw on.
  • A random poll isn’t research. Often, it’s one person’s way of gaining insight into a question important to them. Polls can be highly effective forms of research when addressing key research questions. The thing which makes a poll powerful and predictive is when it is driven by research methodology, and not by one person’s attempt at justifying their opinion.

And there are different ways we can be more evidence driven. Now, here comes the challenging element with research and evidence. Sometimes, we see evidence and think it must equate to research. It does not. Researchers take years to pull together data from myriad points, use this to assess whether their research is valid or not, and are heavily driven by the data. They have to look for counter-data to try and account for other interpretations and influences. They have to test against different conditions to ensure they’re not just getting their desired set of results. Research is a laborious and time intensive approach, often taking years before conclusions are drawn and can be shared. Even then, it becomes open to scrutiny, challenge and criticism. It is picked apart and debated. That’s often before it hits mainstream attention – if indeed it ever gets that far.

So, now we’ve got some baseline established, how can L&D get better at critically evaluating evidence and research?

  1. Don’t take it at face value that the research or evidence you’re being shown is as robust as I’ve described above. It is incredibly common for presenters, trainers and consultants to share models and theories that have very little evidence to support their usage, and little to no research. E.g. a leadership consultant argues that they have found 7 key behaviours that lead to better leadership. Sounds great, right? Based on what evidence? Who carried out the research? What were the results of the research? What other factors and variables could have accounted for leadership success? Is it just one context we’re dealing with (sports) or is it from multiple industries and sectors? Or a personal development consultant argues they know how to unlock human potential. A very common claim. So once again, based on what? Their personal experience? That’s valid, but limited. Have their methods worked with people facing 100s of different life problems or are they working with a certain kind of person? When counselling and psychiatric interventions are more research and evidence based, how is this practice better than those? Whose theories are they drawing on? Are those theories validated? Often, anecdotes and case studies are used as evidence of success. That’s valuable, but limited.
  2. Learn about research methodology. Research methodology is a rigorous process. Often the results of research address specific research questions and are not meant to be extrapolated to other contexts. The Mehrabian Myth is probably one of the most quoted examples, and yet the original research was never meant to encompass every example of communication or to be used so widely. Things like double blind studies, control groups, stats, design of questions, validity and reliability all have specific definitions when it comes to research, beyond our everyday understanding of those terms. Just because we use the English language commonly doesn’t mean we always understand what we’re being told.
  3. Get comfortable with being a critical thinker on research and evidence. It is ok to be that person that wants to know more. In L&D we like to think we are critical thinkers. A lot of us really aren’t. We will readily swallow any graph or stat or data presented to us as being ample evidence and quote it back to others willingly. There are many good sources of evidence and research available, and many times, L&D just don’t bother.
  4. L&D really needs to stop trying to pretend they are psychologists. If you have heard or read some fantastic piece of work or research and are enthusiastic about it, that is fantastic. Having read the stuff doesn’t make you a psychologist. It just makes you a well read person who has a better appreciation of the human condition. If you like the work, and think it will have relevance to your organisation or client needs, get that person in to do the work. They will understand the work far better than you will in your enthusiasm and excitement. Your job as a trainer or facilitator or consultant is not to be the font of all knowledge.
  5. You don’t have to be a researcher to appreciate good research or evidence. You do need to understand it, cos that makes you a good L&Der. When we are delivering our solutions and interventions it’s better to be armed with solid research and evidence over personal feelings and opinions. Personal feelings and opinions are helpful, and carry more weight when there is solid research and evidence to back them up.
  6. Claiming that anyone can do their own research, or that you can show evidence for anything, shows a deep misunderstanding of both good quality research and gathering evidence. It shows a superiority which is unfounded. It shows a disrespect for a set of practices which can significantly improve what we do in L&D.

When it comes to the design of L&D solutions, there are several things we can and should be doing.

  • Find out what research or evidence exists for the models/theories we’re trying to use. If the research/evidence looks weak, don’t use it. Find something more relevant/applicable.
  • If you’re interested in a model/theory, find out from the consultant/vendor what evidence base or research they have for it. If the model/theory has any weight, there will be research/evidence in its favour. Often, the researchers will want to talk to you themselves to really be sure you are getting great information. If it’s been designed by a consultancy or an individual, often the research and evidence is lacking. Instead they will share case studies, white papers or anecdotes. They are forms of evidence, and should be taken as light forms of evidence. Often, this is where claims about the value of research are called into question.
  • Test your model/theory with others before inclusion in a solution. Good research comes from a place of trying to disprove your approach. Give people an option of other approaches and ask them to evaluate which is more effective. If you’re only testing one approach, you are highly likely to receive positive feedback and little of value by way of criticism.
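To make the "test against alternatives" point concrete, here is a minimal sketch, in Python, of comparing feedback from two cohorts with a permutation test – one simple way to check whether an observed difference between your model and an alternative approach could just be noise. The cohort names and scores below are entirely made up for illustration; they are not from any real study.

```python
import random
from statistics import mean

def permutation_test(group_a, group_b, n_resamples=10_000, seed=42):
    """Two-sided permutation test on the difference in means.

    Returns a p-value: the share of random relabellings of the pooled
    scores whose mean difference is at least as large as the observed one.
    """
    rng = random.Random(seed)
    observed = abs(mean(group_a) - mean(group_b))
    pooled = list(group_a) + list(group_b)
    n_a = len(group_a)
    hits = 0
    for _ in range(n_resamples):
        rng.shuffle(pooled)  # randomly relabel who was in which group
        diff = abs(mean(pooled[:n_a]) - mean(pooled[n_a:]))
        if diff >= observed:
            hits += 1
    return hits / n_resamples

# Hypothetical post-session performance scores (0-10) from two cohorts:
# one taught with the new model, one with an alternative approach.
new_model = [7, 8, 6, 9, 7, 8, 7]
alternative = [6, 7, 6, 8, 7, 6, 7]

p = permutation_test(new_model, alternative)
print(f"difference in means: {mean(new_model) - mean(alternative):.2f}")
print(f"p-value: {p:.3f}")
```

If the p-value is large, the data cannot distinguish the two approaches – which is exactly the check a single-approach pilot, gathering only positive feedback, never performs.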

So when it comes to getting better at being a research-led and evidence-based profession, it takes education and time to better evaluate what we do and how we do it.

What L&D get wrong about research

We’re living in a period of time where it’s more and more acceptable to hammer experts for their subject matter knowledge in preference for someone’s personal opinion. It is genuinely a troubling attitude when someone feels so entitled to their opinion that they create a well articulated argument in favour of it and willingly disregard the views of those who have clear evidence suggesting better practice.

Amongst the other societal challenges this trend represents, it is indicative of lazy thinking and a real lack of critical thinking faculty.

In L&D we’ve been guilty of such lazy thinking and lack of critical thinking for a long time.

There are a number of factors that play a part in the lazy thinking and lack of critical thinking.


Many trainers and facilitators in L&D believe they are so good at what they do as a trainer or facilitator, that it overrules any need for being fact or evidence driven in their approach.

They will run exercises and activities that are designed to do nothing genuinely insightful and are simplistic at best. Such stuff is superficial and is often a result of some creative thinking on how to get someone to think about a topic in a different way.

The personal experience they create is often mistaken and misrepresented as effective learning.


Researchers take years to become subject matter experts. They ask multiple questions like:

  • Why does this happen?
  • What literature is helpful for this research?
  • What do the results tell us?
  • Is our data biased?
  • Have we studied the right things?
  • What else could be a factor?
  • What does the research help us understand better?

There are many more questions researchers ask.

L&D do not do research. Not even close.

We have multiple case studies and anecdotes. In nearly every case, we lack actual research.


It is a very common affair that the multitudinous models and theories used are nothing more than one person’s view of the world and lack anything close to evidence.

Actual proper research and evidence would show things like:

  • Here’s how this model/theory tangibly made a difference based on these factors
  • Here are the control groups who received nothing, the groups who received something different, and the group who used our model – and here are the results from all those groups
  • Here’s the variety of different work groups / sectors / industries the model was tested in, and these are the results
  • Here are the limitations of this model and how it should not be used
  • Here are replicated studies of other people using our model and the results they got
  • Here are peer reviews of our work and their criticisms of the model

There are so many models and theories put forward in L&D, and I would wager 90% of them lack credible research and evidence.

Case studies are not the same thing. Anecdotes are not the same thing. Both of those are examples of marking your own homework.


In so many cases, we just don’t have the time for actual research. When the likes of Saba, Cornerstone OnDemand, SumTotal or Pluralsight put their stuff out there, they’re not showing us evidence based results.

They believe they are, because they are in multiple big clients and have volumes of clients using their products. But widespread use of a product does not, by itself, constitute evidence. They do not have validated research by external sources to verify if their products deliver the successes they claim. They have anecdotes and customer success stories. They are not the same thing.

When the likes of leadership consultancies and personal development trainers claim their methods work, it is because they have no evidence to the contrary. If all you are using are your own methods and models, you’re not testing them against anything else, so all you’re going to get are positive results in your favour. That is massively skewed.

Doing the proper research would take years. Most people have mouths to feed and bills to pay. Life gets in the way. So the evidence is of a much reduced quality and we’re often asked to move forward based on faith and trust in the person, and not worry about the models / theories / methods.


There have been many times I’ve asked a potential vendor for actual research to back up their methods / approaches, and many times I got very poor responses ranging from no research, to them providing case studies and anecdotes, to them claiming research doesn’t matter.

If we can’t be confident the methods being used are tested and have validity and reliability then why are we moving forward with them?

I’ll also get people giving me alternative results from other research, which supports their original thinking, even though it is poor research in and of itself.

I’ll also get people telling me research can be anything we want it to be. This just displays such a lack of understanding about research and why it’s important.


Many in L&D don’t want to ask the question about research. They don’t care about it, they don’t see the relevance of it, and it gets in the way of them doing what they perceive as their job.

It is a problem which is inherent across everything we do. E-learning, in-person sessions, virtual training and mobile solutions all have such poor research into what good actually looks like that we are often left with sub-par solutions and asked to trust in the person more than the tools or resources they put forward.

I have no problem trusting a person. If the tools and stuff they use are flawed, then that basis for trust is automatically problematic. If they do not understand fundamentals about research, it only adds to the problem.

What solutions can L&D provide if courses aren’t the answer?

A couple of weeks ago, Myles Runham wrote this really interesting piece on whether or not L&D can work without courses.

His main provocation is this:

“In L&D… we seek problems for which the course, a programme, an event and some content are the answer. These may or may not be learning, training or performance problems… We like these problems because we are ready to create (these) solutions, not because they are the most valuable problems to solve.”

So I’d like to posit some solutions which are not event based in any shape. Many of you will recognise some of these as part of things you’ve either delivered or taken part in yourselves.

  • Define the actual performance problem not the training problem. Starting from there means better exploration of solutions for the manager/leader you’re working with.
  • Ask yourself: what practical digital resources will most effectively help the people who need them?
  • Work with the manager/leader to figure out what operational/tactical/day to day stuff is preventing the desired performance from taking place. Work to resolve that.
  • One of the ways Twitter has grown its user base is through hashtags for regular chats. The same idea creates good habits around community and a healthy space for exploration of topics of interest.
  • Find people in your organisation who can answer stuff on your behalf – be that managers, HR Business Partners, comms folks – a lot of people are willing to help.
  • If you’re dealing with compliance issues, learn more about behavioural science and less about mandatory training.
  • If you want leaders to improve, expose them to what great looks like and let them figure it out.
  • Coaching is a powerful tool to enable ownership of issues and developing personal thinking capabilities.
  • If you’re dealing with diversity issues, look at your organisational design systems and how you can influence those, not worry about diversity training.
  • If you’re looking at inclusion, look at your culture and organisational development practices and how you can influence those, not worry about thought leadership pieces.
  • If you’re looking at leadership development, give them proper organisational problems to resolve, with the right support in place to enable success.

These are just some examples of how L&D can add value and provide solutions without needing to resort to an event of any sort being the default answer. Sometimes, a course/event/training is necessary – for things like core skill development. For most organisational needs and problems, a course won’t be the solution. I have friends whose bread and butter is to deliver courses. That’s good and I’m not arguing against delivery of courses – they often serve a clear need. In many organisational issues, we should be looking at better solutions.

If not Learning Styles, then what?

Last week I wrote this blog post about why Learning Styles should be consigned to the bin. Not just my opinion on it, but because there is no research of any kind that supports the use of Learning Styles in the design or delivery of training/learning solutions.

For those who have trained for years using this type of model – and don’t forget, Learning Styles is many theories not just one thing – it can leave a sour taste. If I don’t use Learning Styles to design a course, then what do I use?

It’s a valid question.

The answer lies in our understanding of training problems / learning needs.

Often, business leaders will inform us of the problem they face. My team need communication skills training. My team don’t deliver their sales targets.

Then comes the request. Can you deliver the training course we need to fix either of those two problems?

A lot of trainers / L&Ders see this as their mandate to deliver for the business. That they can show their value by doing what was asked for. That they will have clear learning objectives.


Before you’ve even started designing or thinking about how you could deliver on the solution, consider these further lines of enquiry back to the business leader.

What change are you seeking from the training?

What are you doing to manage that need yourself?

How are you giving the team feedback on these needs?

What are the structures in place that either support what you’re asking for or hinder the outcome you’re expecting?

What have you tried already?

Who else is doing something impressive that you have seen?

How are the team measured on these needs?

Any one of those questions will start a further line of enquiry with the business leader. What I like about this performance consultancy approach is that you’re engaging the business leader and asking them to take responsibility for the results they’re seeking without defaulting to your training course / learning solution.

That whole process above will nearly always result in a different set of solutions – one of which may still be a course – but that doesn’t become the default option trainers are used to.

Then, when it comes to design, don’t start from the content. The content is only important if you understand the context the team are working in.

Spend time with the team. Sit with them to understand what challenges they face with the training need identified. Use their input to inform the content you need to design for. Ask questions on what kind of practical solutions they want a training/learning solution to provide. Then design for those things specifically.

At no point in that process does it matter what kind of learning preference the individuals may or may not have. You’re not delivering against their learning preferences – that’s not the business need. You’re delivering against their performance needs. That’s what will be measured.

There are many other options when it comes to thinking about the design of training / learning solutions, including areas like user experience, behavioural economics, the 70:20:10 model, and experiential learning. Learning Styles has no place in the work we do.

It can be hard to re-learn what we know and trust, and if we can’t demonstrate that in ourselves, then how can we hope our learners will fare any better?