
AI in L&D: A Double-Edged Sword

Posted by iAM Learning

We tend to think about the rise of AI in negative terms, in part thanks to Sci-Fi through the ages. There’s an imbalance in how it’s portrayed: too many Replicants and Skynet, and not enough WALL-E. 

Think about it this way, though. You’re an L&D manager in a mid-sized company. You’re scratching your head and staring at your screen while trying to put together a training module, and it’s taking forever. Suddenly, the email chime sounds: you’ve been sent a new AI tool, one that can potentially create this whole module in minutes, saving you weeks of time and energy. 

AI suddenly doesn’t feel so hostile, then, does it? Take that, Skynet! 

It’s more complex than this, though. AI certainly offers some tantalising benefits. Tools like ChatGPT and Gemini can help L&D departments generate personalised learning paths, create engaging quizzes and yes, even draft entire training modules. It seems like an L&D department’s dream come true! But the reality is that AI use comes with ethical considerations: job displacement, bias and even environmental impact.

AI isn’t a distant future concept like it was at the peak of Sci-Fi. It’s here, right now, and it’s the subject on everyone’s lips, regardless of sector. It’s rapidly changing how businesses work, and that includes how they approach learning and skill development for their employees. Tools like the ones described are becoming practical, powerful resources for L&D professionals. This blog is designed to offer a balanced look at both the good and the bad of AI in L&D, presented in a series of easy-to-digest bullet-pointed lists, so you can make an informed decision. Let’s get started. 

The Good 

It’s not in dispute that AI can make a massive positive difference in society. Take autocorrect, for example: it’s AI that’s been in use for years already, and it’s saved us from many an embarrassing email. Let’s take a look at some of the compelling advantages of AI use in L&D first. 

Scaling Content Production

Traditional content creation has always been one of those time-consuming, resource-intensive processes. This is where AI steps in: 

  • Rapid content adaptation: Training materials can be quickly updated to reflect new regulations, tech changes or strategic shifts 
  • Reduced development time: What once took weeks can now potentially be accomplished in hours, or perhaps even minutes 
  • Customisation at scale: Businesses can create highly specific training modules for different departments or role types, for example, modules for IT engineers that include cybersecurity protocols or communication strategies for HR folk 

Personalised Learning Experiences 

One-size-fits-all training is old hat. AI enables learning experiences tailored to individual needs with real precision. 

  • Adaptive learning algorithms: Content difficulty and presentation can be dynamically adjusted, based on an individual’s performance 
  • Predictive learning recommendations: AI can suggest future learning paths, based on a learner’s current skills and their organisation’s needs 
  • Detailed performance tracking: This offers comprehensive insights into individual and team learning progress 
  • Optimised engagement: Learning materials can be designed to match individuals’ learning preferences 

Economic Efficiency 

We know budgets are tight across the board, and that includes the L&D team. AI can help with this, too. 

  • Reduced labour costs: AI can offer significant reductions in time and resources required to create learning content 
  • Scalable training: Because content creation is quicker, businesses can develop and deploy training in a more cost-effective way 
  • Continuous improvement: AI systems can learn and refine themselves to perform better automatically, reducing long-term training expenses 
  • Reduced physical infrastructure needs: Digital training environments, enhanced by AI, may eliminate the need for training rooms altogether 

The Bad

You may have read the above and thought that AI sounds too good to be true. What’s the catch? Well, there are several important ones to discuss, and many more besides. Let’s start with one of the biggest – a more general point rather than one specific to L&D.

AI Use is Catastrophic for the Environment

It’s too easy to think of AI as an ethereal, cloud-based tech with no physical footprint to speak of. But the reality is that behind every automated process, every learning algorithm and every tracked performance metric sits a huge infrastructure that uses enormous amounts of energy. 

  • E-waste: Data centres are packed with tech that becomes obsolete rapidly, contributing to mountains of waste and hazardous chemicals that leach into the environment 
  • Water consumption: AI infrastructure uses a vast amount of water through cooling systems – approaching six times the annual use of Denmark – and that simply isn’t sustainable 
  • Carbon emissions: Training AI models can produce massive amounts of carbon dioxide, and it’s getting worse as the number of tools increases 

Bias and Fairness Concerns

AI systems have a fundamental limit – they can only use the data they’re trained on. If biases exist in that data, those biases will be perpetuated in the output. 

  • Algorithmic prejudice: Historical data may contain societal or organisational biases 
  • Representation challenges: Without care, AI-generated content could disadvantage certain groups 
  • Cultural sensitivity: AI content may not respect diverse cultural contexts 
  • Unconscious bias detection: The content created may need to be continually monitored to ensure any potential discriminatory patterns are corrected 

Potential Job Transformation 

The automation of content creation raises critical questions about the very future of those working in L&D. Will people still be needed in this future world? And if so, what will their jobs look like? 

  • Role evolution: L&D experts will need to transition from content creation to an AI oversight role 
  • New skill requirements: The L&D team will have to develop new skills in AI management, prompt engineering and content validation 
  • Collaborative intelligence: Will humans be replaced entirely in this new learning phase? Redundancies are a genuine threat here, but we think the future lies in human-AI partnerships, not replacement 
  • Ethical oversight: Efforts must be made to ensure AI tools will align with organisational values and learning objectives 

Data Privacy and Security Challenges

Quite simply, AI systems require a substantial data set to function effectively. That creates many potential privacy complexities. Ask yourself: “Where was this data obtained?” and “Was everyone who contributed to this data set properly informed of its purpose?” Would you give away all your data, everything you’ve ever contributed to the online community, freely? 

  • Sensitive information handling: Training often involves personal performance data, and that requires robust protection that may not be available 
  • Regulatory compliance: Navigating different regional data protection regulations may be tricky, particularly for a workforce with an international presence – what works in the UK may not be legal in other regions where training takes place 
  • Transparent data usage: It must be clearly communicated how any learner data is collected and used 
  • Cybersecurity considerations: AI systems may be tricky to protect from potential breaches 
  • Data ownership: Questions of intellectual property rights become murky when content is both contributed to and generated by AI systems: this creates uncertainty around who truly owns and controls the material 

Maintaining a Human Connection in Learning

One of the main challenges with AI use in learning is ensuring there’s a genuine human element to it. We learn from people’s experiences, and when AI ‘tells us what to do’, it doesn’t quite feel like something we can trust. After all, AI has never had those experiences. 

  • Emotional intelligence limitations: AI can’t replicate human empathy and contextual understanding, at least not in a full, genuine way 
  • Complex scenario navigation: Certain learning experiences, especially in leadership development, really do require human judgement 
  • Motivational support: The inspirational aspects of learning often require human interaction – when were you last inspired by a machine’s accomplishments? 
  • Contextual interpretation: It takes a human to understand subtle cultural and emotional nuances in most cases 
  • Quality and accuracy: There’s nothing quite like a real expert when it comes to creating content that will positively impact someone, even if that’s just deciding what the learning outcomes should be 

So, what do we think? 

We’re at a critical intersection right now, one where technology promises unprecedented transformation, but also raises significant questions about its impact on human learning and professional development. Ultimately, it’s for every company to decide for itself whether to use AI in its L&D, but frankly, at the rate it’s progressing in the workplace, there may be no escaping it in the near future. 

As far as we’re concerned, the future of Learning and Development isn’t about choosing between human expertise and AI. We think the two are best intertwined in a symbiotic relationship. There doesn’t seem to be a way to ignore AI in this industry, even if you wanted to. But what shape will this relationship take?

Continuous assessment – We must regularly evaluate AI learning tools to see whether they’re working and having the desired outcomes 

Maintain human oversight – This is critical. AI should be viewed as a tool humans can use, not as a replacement for us. 

Ethical framework development – Clear guidelines must be created for AI usage, and then adhered to. 

Invest in human skill development – Remember those skills we suggested might benefit L&D teams earlier? Well, they could be very useful. If nothing else, developing new skills adds another string to your bow. 

AI is here already, and it’s fundamentally transforming L&D. It gives us some unique opportunities for efficiency, personalisation and scalability. But using it successfully requires a thoughtful approach that balances tech potential with our human values. 

Successful businesses will be neither the ones who shun AI nor the ones who treat it as a magic solution, but the ones who use it as a powerful collaborative tool – and that includes using it in L&D. It can be used to amplify learners’ potential, rather than to replace humans as an integral part of the learning experience. 

AI isn’t Skynet. No controlling killer robots here (yet, anyway). If we integrate it incrementally, with full human oversight at every stage, we’ll be able to take full advantage of this new tech frontier. 

Want to know more?

If you want to know more about our online learning, why not check out our unique, engaging eLearning content, covering everything from leadership and people management to health & safety and compliance? To see the huge variety of our online training content, please get in touch with us, or try iAM Learning for yourself - get started today! 

