WOW 167 | Evidence Based Learning

 

Data-driven learning is not just about improving performance; it’s about transforming potential into possibility. In today’s fast-paced world, learning and development are crucial for staying ahead. But with so many training options out there, it can be tough to know what really works. That’s why in this episode, award-winning L&D specialist Helen Bailey explores the power of evidence-based learning and development. She breaks down the science behind evidence-based learning, sharing real-world examples of how it can drive behavior change and boost performance. From theory to practice, Helen discusses the key principles of evidence-based learning and how they can help you unlock your team’s potential. Whether you’re a learning professional or just interested in how to stay competitive in today’s changing world, this episode has something for you. Join us as we dive into the impact of data-driven learning, and discover how it can transform your organization’s training outcomes.

Listen to the podcast here

 

From Theory To Practice: Applying Evidence-Based Learning Strategies With Helen Bailey

We are here with Helen Bailey from Strategi, a consultancy in the UK. We’re going to be talking about the topic of evidence-based learning and diving into the need for evidence in the design and delivery of learning and development programs and our development in the workplace. Before we get into that, Helen, could you introduce yourself and say a little bit about who you are, your background, and what you’re doing at the moment?

I’m Helen. I’m the Head of Learning and Development at Strategi Solutions. We’re a business consultancy. We focus very much on the creative, the marketing, and the people. We sometimes describe ourselves as like the Avengers because we have a group of experts willing to come in, support, and help businesses. My background is in L&D throughout. I’ve done it for more years than I care to remember. I had a spell for quite a long period as an interim specialist. I’ve worked in all kinds of industries doing all kinds of things.

That’s brilliant. Thank you. We’re talking about evidence-based learning. We’re going to explore a little bit about it and talk about some specifics, why it matters, and what it means to people and try and draw out some lessons for individuals in the learning and development space but also individuals and organizations leading teams. Before we try and do that, I would like to start with the basics. What is evidence-based learning? What do those words mean when they all come together?

It does what it says on the tin. It’s about making decisions based on the evidence available. When we’re talking about evidence, it’s important to be clear what that means. It can be organizational data. It can be the literature or the research in that area. It can be the knowledge of practitioners because, by a certain stage, you have a lot of experience. It can be the experience of your stakeholders. That leads me to my favorite word, which is triangulation. You take three sources of evidence, and if they’re telling you the same thing, that’s probably what’s going on. That helps you to make much better decisions.

If we think through that learning journey or a learning process and if we think about the evidence-based, do you start by trying to understand needs? How does that fit in? What’s the starting point of bringing evidence into what you do?

 


 

It starts with understanding the needs. It’s so easy for people to say it. I was talking about this. I could go into any organization, and I would get asked for two things: time management and handling difficult conversations. Those two things always come up. Time management is never about time management. It’s never about people being able to organize themselves. There is something else going on there. What you’re doing with evidence-based learning is digging deeper into what is going on. Kevin M. Yates describes himself as the L&D Detective. You are getting to the bottom of the problem. You’re looking at what is going on in that situation.

That’s interesting. Something that we find a lot and that we have started to push forward in our conversations with clients is that we find that quite often, people come to us with a solution. They say, “We’ve got this solution. We would like you to implement it.” Sometimes we say, “Maybe. Why don’t we talk about that?” Quite often we say, “Why?”

As I’ve gone on in my L&D career, that is a question I use more. My favorite example is team building. People will often say, “We want to do team building.” I’ll say, “Why do you want to do that? What do you want to get out of it?” They will say things to me like, “We want people to have a nice time.” I say, “If you want people to have a nice time, I’m not your woman for that. If you want learning outcomes, absolutely. Go bowling. Do whatever. Go out for a meal. That’s having a nice time.”

There’s a huge amount of benefit in going out for a nice meal as a team, but you don’t need to bring somebody in as a special hand-holder for your meal. Why is it that we need to talk about evidence-based learning? Why is this an important topic? What happens when we push toward evidence-based learning?

Being straight, open, and honest about it: we could be wasting our time. We could be doing the wrong thing. I’ve been reading a lot of research about how busy L&D people are. One of the reasons they’re not able to get into the data and the whole evidence-based approach is that there are too many competing priorities. Given what is going on, it seems to make sense to me that we focus our time and energy on things that matter.

At various points, I’ve done different things in my life. At one point, I spent time with builders doing construction stuff. They always used to say, “Measure twice and cut once.” That’s a fairly common phrase, but it seems to fit well here. There’s a risk of doing the wrong thing. Could you elaborate on that a little bit? What happens if you don’t understand the needs or if you make an error in deciding what’s appropriate and try and deliver something like that as a solution for a client? What are the impacts of that?

It’s about credibility, internally and with the external clients we deal with. If we deliver something that is not suitable or doesn’t meet what the client needs, there’s a lack of credibility. It becomes about a lack of trust in the organization. It demonstrates one of my pet hates, which might come up later: people who think that L&D is pink and fluffy. It’s that piece about how you go to an L&D event, let’s call it that, the person sprinkles their magic fairy dust, and you come out a completely different person. The reality is that doesn’t happen. If we want to be credible, we have to be doing something where people go, “This is relevant to me. This makes a big difference to me. I can see how I’m going to apply it back in the workplace. It was useful.”

That comes back to something that we might touch on a little bit later, which is around the measurement of outcomes and figuring out the question, “Why are we doing this? What is it that we want to achieve, change, create, or sustain as a result of this?” To the point of going out for dinner, if you want to have an outcome where everybody feels full, then let’s do that.

It’s perfect for that.

If we step back a little bit, we have been talking here about evidence-based learning itself. We started by looking at the front end of that process and understanding the needs of an organization so we can focus on the right thing. I would like to think about learning more broadly. We hear a lot of phrases about learning: a learning mindset is sometimes applied to individuals, and a learning organization comes out of various frameworks. When we speak about organizations, why is it important for organizations and the people in them to be able to learn? What happens if we can’t do this?

That is a big question.

It’s my favorite type.

To add to your definitions of learning culture and all the rest of it, Dr. Hannah Gore has an interesting one that always sticks with me. She talks about how a learning culture is where people access learning without having to ask for it. That sounds great. Who wouldn’t want to work in that? As we move into a world where we’re talking more about a learning ecosystem, whether that’s SharePoint sites, the internet, LMSs, and all the stuff that goes with it, that becomes increasingly important.

To answer your question, what happens if we don’t? There is a great quote from Deming, “It’s not necessary to change. Survival is not mandatory.” The world is constantly changing. Therefore, it’s important that organizations learn. I’ve been talking about this. Quite often, we’re so busy doing. We don’t take that time or the opportunity to reflect. It has become very apparent that the organizations that don’t do that and therefore, the functions that don’t do that, do not survive.


Evidence-Based Learning: With the world constantly changing, it’s important that organizations learn. Often, they get so busy that they don’t take the time or the opportunity to reflect. As a result, they do not survive.

 

In your work with client research, do you see a difference across potentially different sectors, different global geographies, or different functions within organizations in relation to their propensity to be more focused or more agile in their ability to learn? Do you think it depends on other factors?

It depends on a lot of factors. It depends on the leadership of the organization: whether they’re in a more proactive or reactive space and whether they’re thinking more long-term or short-term. During and following the pandemic, I don’t think we fully explored what post-pandemic would look like. In terms of the impact, a lot of businesses took a very short-term view. I went to my local high street. I’ve spent a lot of time working in retail. It’s a different place now from what it was, and that came from not being able to adapt to those particular circumstances at that moment.

We’re up here in Edinburgh, Scotland. The main shopping streets of Edinburgh are different as well. That brings to life some of the changes in the broader business ecosystem we’re in. What do you think are some of the impacts or benefits to individuals when they learn in the workplace? How does it affect them in terms of their experience of work and other rewards or benefits it might bring them?

For individuals, effective learning can be transformational. That’s the word I would put in front of it. We have a model of learning, a strategy we call Reality Learning. We talk about things that are relevant. They make a difference to the person. They transform how they think. We do a lot of work in the equality and diversity space as well. Sometimes that can be quite hard learning for people because there’s a lot to take on board; they have to challenge their thinking quite a lot, and sometimes the process of attending the event does that in itself. I always think that if I’m delivering an event, virtually or face-to-face, and I can make people think about one thing a little bit differently, then I have made a difference to them.

That’s a nice way to think about it. My personal reflection on this is around well-being: mastering new skills, growth, and development are inherently motivating for many people and support our well-being. It’s helpful for organizations as they face skills gaps and develop their capabilities, but there is that personal piece that can be helpful for individuals as well. Here’s a question for you. There are a lot of benefits to taking this evidence-based approach to our interventions: thinking at the front end about the needs piece, checking in the middle, and then, at the end, looking back over what we have done and asking, “Have we achieved our goals? What are our measures?” Why don’t we do it?

It’s the million-dollar question. There are a couple of things. Interestingly enough, 94% of organizations want to measure. That’s the benchmark. There seems to be a gap between wanting to and being able to, and that seems to be linked to the resources available to do it. The first is time, which we have referred to. The other one for me is that I always say nobody got into L&D because there was a spreadsheet and some numbers to look at.

People didn’t get into it for that. They got into it because it was about the people. It was about having complicated and very interesting discussions. I do wonder as well if nobody got into it for that. It means that there isn’t capability in organizations. Kevin M. Yates, who I’ve referred to already, talks about people in L&D who are into numbers. He refers to them as unicorns because that’s rare. Most people didn’t get into it for that.

That makes me feel pretty good. I’ve gone the other way. Maybe that should be me, to be honest. I like the conversations, the complexity, the problem-solving, and the engagement.

You can have both sides. My colleagues at Strategi will say, “The thing that I love to do most is to facilitate the conversation and have that in-depth conversation, but I’m also interested in what the numbers tell me.” If we talk about the bigger ecosystem, what are people looking at? When are they looking at it? How long do they spend looking at it? That tells you about where people are at and what’s driving their motivation to go and look at things. We do more of that. If we do something and it doesn’t work, that’s fine. We do something else, but we use that evidence to help us with that decision.

Talking about the bigger ecosystem tells you more about where people are and what’s driving their motivation to go and look at things.

As long as we’re learning on the way, and making intelligent mistakes, then that’s okay because we build everything from past failures.

I love the sound of intelligent mistakes. I like that. I’m going to steal that.

I make many of them. Some of them are intelligent. We will see. We talked a little bit about the needs assessment at the beginning and understanding that. If we get to the stage where we’re clear on our needs and we have created what it is that an organization is looking for from a learning perspective, how do we at that stage start to look forward or bake in an evidence-based process? Do we need to start doing things at the early design stage? How do we start to think through from there?

The earlier, the better. We don’t think about it enough. The classic evaluation approach is to give the form out two minutes before people are due to go home and run out the door. My first pointer is to think about what the business is looking for. Talk to stakeholders. Think about the return on expectations. What do they expect to see? What are the business measures relating to that? Capture that, and you can come back to it at the end because that gives you your benchmark. We also need to get better at the language we use with our learners. Quite often, we will talk about your key takeaways from the session. The better question for me is, “What is your key takeaway, and how are you going to use it back in the business?”

You’re drawing that applicability in.

We drive home that this is a business intervention. We need to get our line managers on board and have conversations. They are the people who see this happening. Talk to them about what they are seeing differently and whether you can build it into an appraisal or any of those things. They are able to see those measures. I was talking to somebody about a management development program. At the end of it, the classic thing is you do a presentation on what you’ve learned and all of that.

In that organization, they were doing TED Talks. I thought that was such a great idea. That’s something a little bit different. When we’re talking about your classic evaluation and how people are going to use the learning, we need to be less formal. We almost make it a training activity, so it becomes part of the session. For example, at Strategi, we use three questions. We do it on Menti at the end of the session, so it becomes part of the session. As a group, we then talk about what you’ve taken away to use back in the workplace. There is a discussion about it. It feels more tangible.

For clarity, Menti (Mentimeter) is an interactive polling and word-cloud platform. You talked about a few things I would like to touch on. You started by talking about linking learning interventions to business metrics. I thought it would be helpful if you could call out a couple of the types of metrics that people might benefit from tracking through effective L&D.

It depends on what it is. We do HR skills programs. We do things like that with disciplinaries and absences. Has that rate gone up and down? From a management development point of view, we do tend to use lots of 360. We do a 360 at the beginning. We will do a 360 at the end. That tells us about shifts in behavior and where people are coming from that point of view.

There is nothing like getting some live, immediate feedback in the moment. Quite often, when we do our longer programs, in the middle of them, I’ll sit down with people and say, “What’s going well for you? What have you been able to apply? What’s not going so well?” We have a real adult-to-adult, honest conversation. A lot of what happens in evidence-based L&D is around having those stand-up conversations and saying, “Is this having an impact? If it’s not, what do we need to do now?”


Evidence-Based Learning: In L&D, there is nothing like getting some live, immediate feedback at the moment.

 

I like that mix of quantitative and qualitative that you’ve called out there in terms of metrics. You can do your surveys at the beginning and surveys at the end. You can look at business metrics, be it your number of disciplinaries, excess hours worked, or whatever happens. You can find these and track them as well as survey-based outcomes from direct reports of managers or so on. Qualitative conversations are so powerful. They can be an opportunity in my view to cement and further embed your learning anyway in a coaching approach as well as an exploratory approach, which is nice.

Something else that’s in my mind is that people might want to look at some of these things. For example, if you invest in your managers, you might improve retention. That’s lovely. That’s a great goal. But how does one navigate causation versus correlation? Do we need a pinch of salt, saying, “We’re dropping the right pebbles into the right wells and hopefully making the right splash”? How do we hold ourselves comfortably through that?

Back in the day when I used to deliver CIPD programs, we used to talk about judgment. In your case, you’re talking about retention. We assume that you don’t want to reduce retention; you want to increase it. A claimed 40% increase in retention might come from some line managers taking what I would call a bit of a finger-in-the-wind approach, so you don’t necessarily know. This is one of the problems with return on investment, which has always felt like the holy grail of L&D. We struggle with it because so much else that happens in the organization sits apart from L&D.

In the HR example I’ve given, it could be that there has been a new policy, a new management structure, or whatever else might also have had an impact. I don’t think there is an easy answer to that question about direct causation if I’m honest. You’re rarely able to say that L&D has made a direct difference, but what you can say is that L&D has contributed.

Being able to hold onto that ambiguity or uncertainty and be directionally confident in what you’re doing is helpful. As an aside, as people know, I used to work in a finance function for a large financial service organization. We couldn’t quantify our contribution as finance. Our holy grail was, “How does finance add value?” We’re like, “I don’t know.” We spent loads of time trying to do that. It’s a very numeric discipline. It’s interesting.

One thing in my mind is quite often as we go through any of these stages, whatever it happens to be, where we set objectives, go through reality, and then look back at what we have done, we’re so much more informed at the end of a process than we are at the beginning. The goalpost shifts. The world moves. Things change. How do we keep ourselves comfortable that we are working in an evidence-based way even when looking back retrospectively, it can be possible to look at things differently? How do we make that okay for us ourselves?

We have to take care to be responsive and to think in the moment, “Where are we with this?” We have outlined a bit of a process where we talk to stakeholders at the beginning and the end. In reality, there have to be much more regular checkpoints than that because the world is moving. The organization has probably changed, particularly when you start a large-scale intervention.

We have to take care to be responsive and to think in the moment.

It is about checking in regularly with your stakeholders and making sure that you are checking the relevant sources of data, whether from the CIPD or whoever you choose to follow in the L&D world; somebody produces a new report nearly every week. I managed to read two new reports before I came to speak to you. I’m keeping an eye on that literature. There is something for me about L&D people constantly updating themselves. You’re only as current as your knowledge. If you don’t do the research and further reading, it very quickly becomes outdated.

It’s a fast-moving space in terms of the thinking that’s out there. What pitfall do you see? What are some of the traps that people can fall into when they’re thinking about doing this or starting to look at measuring and gaining evidence over their interventions?

My favorite one is the data wheel of death. I love that.

Say more about this.

That sounds like a Star Wars thing. Sometimes we think we have to have complete data to get started, and we don’t. It’s better to start and then build on the data you’ve got rather than wait for the perfect moment, because there never is a perfect moment to say, “Let’s go.” Sometimes you need to start collecting data and evidence and then realize what else you need. Sometimes you need to start the process. Sometimes we get lost in maintaining data that we don’t need. We need to be better at jettisoning stuff that isn’t serving us anymore. I’ve collected a lot of data, got halfway through, and realized, “I’m not even using that. Let’s ditch it.”

That sets a trap I see in many different industries where the production of data becomes an industry of no benefit. We’re not using it in a meaningful way. I can see that as a pitfall that exists.

My other one is that we focus on the wrong things. Take an evaluation, which is a form of evidence in an organization. I can’t remember how many forms I’ve seen that ask, “Was the training engaging? Did you have a nice lunch?” It’s not phrased quite like that, but you know what I’m getting at. My favorite phrase for this is from Michelle Parry-Slater. She says, “It’s not about the biscuits.” She’s right. It’s not about the biscuits. I don’t think it’s about the trainer either.

I was listening to a podcast where they were talking about how learners aren’t always in the right place to evaluate the learning. If it has been a tough and challenging session that makes people think, they’re not necessarily going to have had a good time, so they won’t say, “That was useful.” My last point is that sometimes we focus on intention rather than action.

Say a little bit more about that.

Sometimes we ask people, “What are you going to do with it?” I tend to do that, but we don’t follow it up. We never find out if they ever did it. All we know is that they intended to do it. That’s something, because applying the learning is important, but I would like to know if they actually did it.

That’s something that we see in lots of global surveys and all kinds of things. People say they would do this, and I’m like, “Who has actually done it?” It’s a very different type of question. You were speaking there about the timing of gathering that look-back data. How far into the future do you go? The examples you’ve talked about seem to imply that a lot of this data gathering can be done at the end of a training session, and people might be in a funny state then. How far into the future do you go when you’re looking at these metrics? How far out is it reasonable to say that you would have an impact? Does it depend on the training that you’re delivering? How does that work?

Powerful behavior change takes time, and it’s not always going to be immediate. It is a good idea to gather feedback at the end, but I also think it’s an even more powerful measure 3 or 6 months down the line because people have had time to think about it. The fact that you are asking people about that learning again refreshes it in their minds and brings it back. That helps embed it further. You get a double hit there: “I went on that learning. I want to remember that for whatever reason.”

Powerful behavior change takes time.

People start off fine because you’ve got the recency effect: “I’ve just done it.” Let’s say I’ve been on a coaching course. I go back to work and start talking about all this lovely language, “What are your goals, James?” and all that kind of thing. After a bit, you slip back to, “What do you want to get out of this?” That’s the way you are normally. Asking later gives you a bit more time to embed it. I’ve thought about it, but it also reminds people about the learning.

We have the same view of that type of questioning. I also think that everything is an intervention. You give someone a survey, and that can change people. All of these things are connected, which is interesting. This is important. Understanding why we’re doing stuff matters. Measuring it to know if we have done it well matters. It helps us understand what works and what doesn’t, make things better, improve things for clients, and improve things internally. One of the things we find as a consultancy is that a lot of the people we work with come to us with a solution and say, “We want you to do this.” We unpick that, work through what their problems are, and go from there. But also, nobody wants to pay for an evaluation. How do we navigate that?


Evidence-Based Learning: Measuring outcomes helps us understand what works and what doesn’t, make things better, improve things for clients, and improve things internally.

 

That resonates with me completely. We are business consultants in the same way. The reason we do the evaluation work is to provide further evidence for our clients: “This is what we have given you over the course of the year.” There’s a real conundrum for me about organizations that don’t want to pay for it. The actual words that I wrote down were, “There’s gold in them thar hills,” which is a reference to some old movie from a long time ago.

There is such richness available in that. If you took it down a continuous improvement route with a cost-benefit analysis, there is so much that organizations could learn from it. Much of what we share comes out of conversations with our clients because, fundamentally, it helps inform what we do at the end of the day. You might recognize this: if people don’t want to pay for it, they don’t want to pay for it.

If you put it on a piece of paper as an option, quite often they will say, “Let’s not do that bit.”

Even internally, I’ve been the one that has been interested in the stats. I can produce all the lovely graphs and everything you want. People have gone, “Were the biscuits any good?”

Maybe the lesson is we should divert the evidence to the biscuit fund, and then everyone will be happy. I’ve got a question as well on this. It’s something I mulled over a little bit, which is the role of experience. We have talked there about measuring outcomes and things like that. Sometimes my sense is that the creation of a good or stimulating experience in itself can affect an individual in a way that’s helpful, even if it’s hard to pair that up with specific aspects of content or messages being delivered. The way that something is created and facilitated as a learning experience can almost be the driver of output in itself. How do you feel about the importance of an experience supporting change?

I agree with you. That’s not because I’m here. I’ve certainly facilitated sessions. We used to call them a bigger conversation. It would be on a particular topic. Let’s say it’s equality, diversity, and those kinds of things. The whole team is sat there having a discussion that they would not normally have in the workplace and start to ask what are considered to be tricky questions, “What does a wedding look like in your culture, for example?” Some of the feedback from those sessions has been so rich because people go, “That’s the first time we have ever had that conversation. We will have that conversation more as a result of this.”

It’s a great example of a learning intervention that doesn’t have a clear measurement. What you’ve probably got is the team working more effectively together. That event is transformational for that team because suddenly they’re having tricky conversations. When they get to the tricky stuff at work, going back to handling difficult conversations, which nobody wants to have, those tricky conversations are easier because we have had this difficult conversation, and it was okay.

You’ve done a bit of benchmarking of difficulty. You’ve given people practice with facilitators who are guiding you or supporting you through some of that. That can be helpful. I’ve not looked for evidence on this, but I feel this to be the case. In that environment of a learning space where it is facilitated, individuals can feel pride, agency, hope, and an ability to improve their experiences and those around them. All those positive capital aspects, positive emotions, and positive reflections that can go with that can grow out of an experience, even if it’s sometimes a bit separated from the content. A way that you do things can bring all of those benefits.

As we would say in the bit of the North where I come from, that’s right up my street. I sometimes refer to it as a style, or whatever you want to call it. I strongly believe in the power of positive emotions in a learning environment. It’s important, whether that’s about sweets and chocolate, having little balls you throw about whose name I can’t remember at the moment, engagement, or the way you do things. I’m not incredibly formal in that sense because I don’t think people can learn if they’re tense. That’s a hard learning environment.

We were doing emotional intelligence. I played Pass the Parcel. It’s something that is a little bit different. People remember things that are a little bit quirky and different. It helps root that learning. The big thing that the people who came to that session can remember is that they played Pass the Parcel. That starts the process of remembering the other stuff. If it’s okay, I’ll pick up on something you said about having those conversations. We did a menopause session. Some of the feedback around that was that it was great to be able to talk about it in the open.

When I was doing the research for that session, I felt a little bit sad because there was a lot of talk about women who did not have the space to do that. I came away from that session, and I felt quite proud that people felt it was okay to have that conversation. That was a transformational experience for some people in that room because some of the feedback talked about, “I am going to be more open about this going forward.” It’s the power of that.

It helps create role models when there are difficult topics like menopause, aspects of diversity, or mental health. We were speaking before about neurodiversity, a sub-strand that has been around for many years but is now very much a topic. Creating spaces where people can speak about things like this sometimes requires the gentle guide rails of a facilitator to create that experiential space where it is permissive, okay, safe, or risk-free.

You can ask the awkward question. I also feel that in those spaces, sometimes the facilitator has to give something of themselves up. If you show that you are willing to be vulnerable and willing to be able to talk about some of this stuff in the menopause chat, and I shared some of my experiences, then that encourages that environment. I don’t do it because it’s all about me. I do it because if people can see that I can do it, then anybody can do it.

That’s a great point. We could go off and do a different conversation about facilitating.

We’ve got a whole other conversation now.

How do you manage yourself as a facilitator doing that? It’s costly. We can do that some other time. Some people reading may be considering this in their L&D practices, consultancies, or organizations. If they’re looking at starting to bring a little bit more evidence into their organizations and interventions, where would you suggest they start? Have you got suggestions for reading or things they could do? What’s the starting point for people who want this?

I’ve got a massive suggestion for reading. It’s a bit of self-promotion. You can go and read my blog. I would start with the work of Kevin M. Yates because that’s great. On his website, he has a whole section that has loads of templates, which are useful. I’ll go for that. I would have a look at the CIPD’s evidence-based practice section. Internally, I would go and talk to your stakeholders if you haven’t done so already. Find out what they’re interested in and focus on that because that’s a good start.

Ask better questions. We have talked about the why question, but we need to start digging down into what’s going on and not be satisfied with the first answer: “Why is that going on? What evidence do you have for that?” Let’s start talking in that language. Let’s start talking about learning outcomes and business impacts. The language we use is so important. When I was doing my CIPD many moons ago, they were talking about this then, yet here I am many years later still saying we should be talking about it. Learning is great, but business outcomes are what the business values.

The language we use is so important. Let’s start talking about learning outcomes and business impacts.

That’s why we commission learning and development: to achieve outcomes. I’m not originally from this function, but it’s easy to see how learning and development, organizational development, and even wider aspects of HR can minimize the language they use about themselves, which is interesting. We’re pretty much at the end of our time, so I’m going to wrap us up and close us out. Here’s one last question. How can people learn more about you and what you’re doing?

They can find me on LinkedIn. We do a lot of events as well. If people want to go on our website, you can come and see us in person. We do some virtual events as well. We would love to see you there.

That’s brilliant. It’s time. Thank you, Helen. That was wonderful. Thank you for your time and your contributions.

Thanks, James.

It’s Jane. Thanks for reading. If you enjoyed it, if you have a question, or if you want to say hi, you can find us on Twitter @WorldOfWork_IO. Don’t forget, you can also find out more about what we do, including our online seminars, workshops, and development programs on www.WorldOfWork.io.


About Helen Bailey

Helen is an award-winning L&D specialist and for more than 20 years has worked across a variety of sectors and with managers, leaders and employees at all levels of the organisation. She is focused on creating engaging and meaningful experiences using Accelerated Learning to truly embed the learning.

Helen likes nothing better than interrogating data to identify clear learning needs and demonstrate value to our clients. In addition, she is committed to continuous professional development and uses her learning to constantly evolve her approach to L&D. Outside of work Helen can often be found walking, watching musicals, and visiting places of historical interest.