L&D Conference Reflections: AI Vs. Humankind


Where’s L&D Heading?

I’ve been speaking at L&D conferences for over a decade, and I’ve always had the same approach to learning: talking to industry leaders about what they actually do, talking to participants about what they want to solve or struggle with, attending sessions to understand trends, meeting new friends at lunch and in the hallways, and connecting (and reconnecting) with humans about work, life, and the universe. This article (the first of two) explores the common takeaways from the ATD TechKnowledge conference.

Even A TechKnowledge Conference Is About Humans

This year, the theme of the ATD’s TechKnowledge conference can be summed up by the shirt I was wearing for my session:

human·kind (be both)

We humans are complex. Some of us are even intelligent 🙂 How we treat each other, especially those who may disagree with us, plays a significant role in how Artificial Intelligence (AI) is trained. After all, AI is trained on human data; human stories, that is. Humankind is all about the hidden human stories. Talk to people. Listen to their stories. You never know until you know their whole story. Be kind. And don’t expect technology (even AI) to solve your human issues.

Why All This Long Intro?

Because of the first reflection below: find and understand a problem worth solving first, and then look for a solution (AI or not). Otherwise, you’ll wear yourself out chasing problems a tool might solve while the tool itself keeps changing in your hands. Many people struggle today just to keep up with the changes, especially in the world of AI. Donald H. Taylor’s annual “What’s Hot in L&D” surveys show the same: AI is top of mind. [1]

Conference Reflections

1. Many Humans Are Tired And Overwhelmed

The pace of change hurts. Many humans are exhausted just by reading articles, listening to podcasts, or watching TikToks. The moment you learn, reflect, apply, and share the latest, it might be obsolete. No wonder there are very few books on AI.

AI also has indirect implications for existing technology. Look at the EdTech industry: vendors know they look behind the curve if they can’t claim AI integration. Content creation is the simplest entry point. Early adopters started implementing live chat features to take advantage of conversational AI. Today, you can talk to fairly realistic 3D characters. In a couple of years, this feature will be in every decent learning platform.

The constant change forces us to make decisions: should we wait until the full-blown feature is ready, or should we start “hacking” the current version? The APIs and code you build today may become technical debt in a couple of years, once vendors ship the same capabilities natively.

  • How to mitigate the fatigue of change
    • Know that you’re not alone
      When you’re in the trenches in your own organization, it may feel like you’re years behind, and everyone on LinkedIn is an expert in AI and L&D. Talk to others. Network. Go to a conference. You’ll see that everyone is having the same issues.
    • You can’t do it all: prioritize
      You will not be the master of all trades anymore. That era is gone. Prioritize what matters. What matters for you, for your team, and for your organization?
    • Start with the problem
      Most of your challenges haven’t changed. Just because you now have a technology that could solve a problem doesn’t make that problem worth solving.
    • Follow others in three circles
      • Challenge circle
        People in the trenches, dealing with challenges similar to yours
      • Motivation circle
        People who are 6 months – 1 year ahead of where you want to be; people who motivate you to focus on the short term.
      • Inspiration circle
        People who are years ahead; those who deal with the tsunami of new information, research, and tech changes. Use them as a buffer: let them sort out what matters from what merely distracts.

2. Rolling Out AI: Back To Behavior Change?

“We rolled out Copilot, and nobody is using it.”

This recurring theme reminded me of an experience earlier in my career, when the company “rolled out” profiles that employees were supposed to fill in with information about themselves: interests, skills, and so on. The result? A miserable adoption rate. The solution? An HR mandate: people must complete their profiles.

You don’t roll out tech like a red carpet. People don’t go to events because there’s a red carpet. They walk on the red carpet because the event is there.

Make the “event” meaningful, and they will come. Same with AI: yes, you need change management. Behavior change management, that is. There’s a whole science behind that! Learning design is not enough to change minds. Start with self-determination theory, BJ Fogg’s MAP model, COM-B, or similar foundations. As for the structure of your rollout plan, here’s something practical from the game design world:

  • Stage 1: Discovery
    Find the people, the problems, and the processes where challenges are. Show them the potential value! Have them see the future, the destination, first, before you give the step-by-step instructions on how to get there.
  • Stage 2: Exploring
    You got the initial buy-in. Motivation is high; experience is low. This may be their first time using AI, so expect some handholding and help them avoid critical mistakes. Give them an early win! Any small thing that counts as progress. Provide foundational data and AI literacy.
  • Stage 3: Scaffolding
    Now they’re using the tools to solve their problems. Motivation is dropping as they burn energy, but experience and skills are growing, which helps keep them engaged. Support them when needed, but don’t hold everyone back with structured live sessions. Let them build and share solutions. Scaffold challenges with supporting tools and materials.
  • Stage 4: Mastery (and beyond)
    You have champions. Self-efficacy (“I can see myself doing this well”) propels them to take on new challenges. Connections and relationships are built around solutions. Provide ongoing support (this is teamwork; you won’t be an expert in everything). Experts can help onboard new employees based on lessons learned. Communities can sustain themselves. Best practices can be stored and shared within the AI system (no need for another SharePoint site).

3. Upskilling In AI: Where To Start?

One of the most common challenges mentioned at the conference was upskilling the workforce at scale. I’ve often found the word “upskilling” and the phrase “closing the skills gap” misleading. What do you need in order to upskill someone? Three things: the level they’re at now, the level you want them to reach, and the shortest path connecting the two.

But somehow, we often focus only on the desired state. Without knowing where individuals are, we build out the destination and a single road from nowhere. Then we force everyone to walk back to the start of that road and begin the journey there, regardless of where they already are.

The “if I don’t see it, it doesn’t exist” policy is not a good one to rely on. Employees use their personal access to AI to solve problems, then bring the results back into the workplace. Hopefully nothing top secret, but there are a lot of assumptions baked in. And what is the company’s response? IT blocks copy-pasting from outside tools. Well, there’s always email for that.

Build a policy that also provides critical awareness of risks. Then, you can think of upskilling, including how you will assess current abilities to enable the shortest path.

4. Is Prompt Engineering Worth Learning?

A year ago, prompt engineering was one of the hottest skills. Fast forward to today, and we have thousands of acronyms and “frameworks” for how to write prompts. You can even ask your favorite LLM to generate a prompt for you. Because generative AI works through natural language, prompting is not really “engineering” in the traditional sense. No code is required.

My two cents: learn the whys, not the acronyms or templates. Once you understand why you need to provide specific context, it’s easier to adjust. One thing is certain: these models keep evolving, and what you learned about them a year ago may not be needed at all today. Focus on explaining the problem rather than mimicking the structure and format someone suggested based on last year’s success.

I always give the model permission to take its time, think, and double-check that it’s giving the best and most current answer. Before I did that, it would often suggest the first, most popular approach (in coding, for example) that turned out to have been recently deprecated. Also, remember that you can ask the model to revise its answer. Over and over and over again. Humans wouldn’t tolerate that, but AI is happy to iterate.
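
To make that concrete, here is a minimal sketch of the “revise it again” loop in Python. It is illustrative only: it assumes the OpenAI Python SDK with an API key set in your environment, and the model name, prompt wording, and number of revision rounds are placeholders rather than recommendations for any particular vendor or setup.

    # Minimal sketch (illustrative assumptions): ask a model for an answer,
    # then repeatedly ask it to double-check and revise itself.
    # Assumes the OpenAI Python SDK is installed and OPENAI_API_KEY is set.
    from openai import OpenAI

    client = OpenAI()

    messages = [
        {"role": "system",
         "content": "Take your time, think step by step, and double-check that your answer is current."},
        {"role": "user",
         "content": "Here is my problem, with context: <describe the problem, not the format>"},
    ]

    for draft in range(3):  # three revision rounds here; adjust as needed
        reply = client.chat.completions.create(model="gpt-4o-mini", messages=messages)
        answer = reply.choices[0].message.content
        print(f"--- Draft {draft + 1} ---\n{answer}\n")
        # Keep the conversation going: the model will happily iterate
        # where a human colleague would have lost patience long ago.
        messages.append({"role": "assistant", "content": answer})
        messages.append({"role": "user",
                         "content": "Revise your answer. Check whether anything you suggested is deprecated or outdated."})

The point is not this specific API. The “prompt” is mostly a clear problem statement plus permission to think and revise, and that survives model changes far better than any acronym template.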

5. Practical Use Cases Of AI (For L&D)

The most common use case I’ve seen is content generation. Prompt-based, instant text and graphic creation solves an efficiency problem: being more efficient means we can create more content with less effort. The challenge lies on the other side of the measurement coin: effectiveness. Creating more content faster does not mean we’re making more impact on the job. In fact, the “free time” this efficiency creates is often consumed by yet more content creation. I think we should use AI the opposite way: to reduce content.

In our session, we shared practical implementations of AI: personalized learning, coaching, adaptive content, authentic skills assessment, building no-code applications in minutes to iterate quickly, and even gamified interactions in which learners interview AI-driven 3D characters (so long, multiple-choice answers).

Outside of learning, check out some of the AI use cases in business [2]. In the next article, I’ll continue with five more themes from the conference:

  1. Accessibility: who cares about others?
  2. Waiting for GodoTech?
  3. Collaboration tech does not collaborate, humans do
  4. Diverse thoughts, better outcome
  5. On a personal note: Living Next Door to Alice. Alice? Who the [beep] is Alice?

References:

[1] The Global Sentiment Survey 2025

[2] Artificial Intelligence (AI) Use Cases and Applications

