We spoke recently with the L&D head of a mid-sized asset management company with around 300 employees in Mumbai. She told us something that stuck. Her marketing and sales teams were already using generative AI actively: writing social media content, researching clients, preparing pitch materials. Good news, right?

So we asked her how they got trained.

She paused. “Honestly, I’m not sure. It just kind of happened.”

Then she asked us two questions that we’ve been thinking about since. The first: how do I train my employees on AI across my other teams, the ones that haven’t moved yet? The second, and this one was harder: how do I even know if a course is meaningful for my employees, or if they’re just clicking through it to get the certificate?

Both are fair questions. This post tries to answer both honestly.

The problem isn’t that employees don’t know AI exists

Most HR and L&D heads assume the training challenge is awareness. It isn’t. Your employees have been reading about AI for two years. They’ve played with it on their phones. They know it exists.

The actual problem, as that L&D head put it, is that you have pockets of confident use and large patches of nothing. And the pockets that are confident didn’t get there through any deliberate effort. They got there because a few curious people figured it out on their own, and now that knowledge sits with those individuals rather than being shared across the team.

That’s not a sustainable position. It creates inconsistency in how work gets done, it creates risk when those individuals leave or change roles, and it creates a quiet resentment among teams who feel left behind.

The goal of training isn’t to introduce AI to people who’ve never heard of it. It’s to get every team to a shared baseline so the organisation moves together rather than in separate directions at different speeds.

Start with the teams that haven’t moved yet

The instinct is to build on what’s working. If marketing and sales are already using AI well, why not focus training there and let that energy spread?

Because it won’t spread on its own. It hasn’t so far.

The L&D head we spoke to had watched her marketing team use AI for eight months. In that time, her operations team had picked up exactly nothing from them. Different floors, different managers, different weekly rhythms. The knowledge didn’t travel.

Start with the teams that are stuck. Operations, compliance, finance, procurement, client servicing: wherever AI hasn’t reached yet. These teams often have the most to gain because their work is full of repetitive, time-consuming tasks that AI handles well. A compliance analyst who spends three hours every week compiling regulatory updates can cut that significantly. A client servicing executive who drafts the same responses to the same queries twenty times a day has an obvious use case.

Before designing any training, ask your managers one thing: which of their team members are actively using AI, which are curious but haven’t done anything yet, and which are avoiding it entirely? The middle group, curious but inactive, is almost always the largest. They’re your starting point, and they’re also where training pays off fastest because you’re giving people something they were already ready for.

Don’t assume the teams already using AI are doing it well

This is the part the AMC L&D head found uncomfortable when we pushed on it.

Her marketing team was using AI. But when we asked what they were actually doing with it, the answers varied enormously. Some were using it well. Some were using it in ways that would concern a compliance officer: pasting client information into public tools, accepting AI output without verification, using it for tasks where the margin for error is low.

“It just kind of happened” is not a training strategy. It’s a liability.

Even your most confident AI users probably need some structure around what they should and shouldn’t be doing. Not to slow them down, but to make sure the habits they’ve built are the right ones. A short, structured course gives everyone, including your AI enthusiasts, a common framework and a shared language. That matters especially in a regulated industry like asset management, where the line between clever efficiency and compliance risk is not always obvious.

Keep it short enough that it actually gets done

This sounds obvious. Companies still get it wrong anyway.

We’ve seen companies build eight-hour AI training programmes. We’ve seen companies buy access to enormous online course libraries with 40 modules on AI. The completion rates are terrible, and nobody wants to talk about it.

The only AI training that gets done across an entire Indian company is training that fits inside a normal working day without making unusual demands on anyone. Thirty minutes, self-paced, with something immediately useful at the end of it.

Think about who you’re asking. A client servicing executive in Pune is handling queries from the morning and sitting in a process review after lunch. You are competing with her actual job for attention. If your training is long, she’ll click through it without engaging and mark it complete. If it’s short and clearly connected to something she does every day, she’ll actually try it.

Thirty minutes is enough to teach someone to write a useful prompt, understand why AI gets things wrong sometimes, and leave with one concrete thing to try the next morning. That’s not everything there is to know about AI. But it gets people started, and getting started is the only thing that matters in month one.

Generic training gets you started. Customizable training gets you results.

Here is the honest limitation of any foundation AI course, including ours. A course built for every employee in every company is by definition built for no employee in any specific company. The fundamentals (how to write a good prompt, how to evaluate AI output, how to use AI without creating data risk) are universal. But the moment you want an employee to actually change how they work, universal stops being enough.

The AMC L&D head’s problem is a good example of this. She doesn’t just need her compliance team to understand what AI is. She needs them to see how it applies to the specific work they do every day: summarizing SEBI circulars, preparing audit notes, reviewing distributor documentation, even AML and fraud detection. A scenario built around a generic office worker writing emails doesn’t get her there.

This is why the baseline course matters but customization is what makes training stick.

With XLPro’s Foundation AI Course, the core content (the fundamentals of AI, prompt writing, evaluating output, using AI responsibly) stays the same across all deployments. What changes are the scenarios. A manufacturing company gets examples built around production planning, quality checks, and vendor communication. A pharma company gets scenarios around medical writing, regulatory submissions, and clinical documentation. An NBFC gets examples drawn from collections, credit appraisal, and customer communication.

This isn’t a small thing. In our experience, completion rates improve significantly when employees recognize their own job in the training. More importantly, application rates improve, meaning more people actually try something different after finishing the course, which is the only outcome that matters.

If you’re an L&D head evaluating how to train employees on AI for your company, ask any vendor this question directly. Can you show me what the scenarios look like for my specific roles? If the answer is “the content is the same for everyone,” you know what you’re getting. A course that will be completed and forgotten rather than one that changes how people work.

Say something about job security before you teach anything else

In every AI training conversation we’ve had with Indian L&D heads, there’s an anxiety underneath the practical questions. Employees are wondering whether getting trained on AI means helping to make their own jobs redundant.

If you don’t address this directly, it gets in the way of everything else.

The honest answer is that AI takes over specific tasks, not whole roles. The compliance analyst who learns to use AI for regulatory summaries doesn’t lose her job. She gets two hours back every week and does more useful work with the rest of her time. The person who avoids engaging with any of it over the next few years is in a harder position, but that’s a different conversation, and it’s not the one to lead with in a training room.

Lead with what’s in it for the person sitting in front of you. If someone spends two hours every Friday compiling a weekly status report, AI can do that in fifteen minutes. That’s not a threat. That’s twenty-five Fridays a year returned to them. Start there.

How do you know if the course actually meant something?

This was the AMC L&D head’s second question, and it’s the one most training vendors don’t answer well. They’ll show you completion rates and quiz scores and call it done. That’s not evaluation. That’s administration.

Completion tells you someone sat through the course. A quiz score tells you they retained some information on the day. Neither tells you whether anything changed in how they work.

Here are three things that actually tell you whether AI training landed.

The first is the simplest. Two weeks after the course, ask each manager one question: has anyone on your team tried something different because of the training? Not “did they mention it” or “did they say they liked it.” Did they actually do something differently? If the answer is consistently no, the training didn’t connect to real work, regardless of what the completion report says.

The second is to look at the questions employees ask after training. Good AI training makes people curious, not just informed. If your L&D inbox after rollout has questions like “can I use this for X” or “what happens if AI gives me wrong information about Y” — that’s a sign the training opened something up. If you hear nothing, the training probably closed things down by making AI feel distant and theoretical.

The third, and this takes a bit more effort, is to sit with one or two people from each team a month after training and ask them to show you how they’re using AI in their actual work. Not in a test scenario. In their real job. What you see in that twenty-minute conversation will tell you more than any survey.

None of this requires a complex evaluation framework. It requires your managers to have one honest conversation per team, and someone at L&D level to be curious enough to follow up.

Track who actually finished, not just who was enrolled

One more thing on measurement, because it matters earlier in the process too.

Enrolling 300 employees in AI training sounds good. What it often means is 300 people got an email, 150 opened it, 80 started the course, and 35 finished it. Nobody wants to say that number out loud, so they talk about enrolment instead.

Use a format that gives you real completion data: an AI e-learning SCORM course on your LMS, or hosted logins where every completion is tracked against a name. Certificates matter too, not because anyone frames them on a wall, but because they give people a reason to finish rather than abandon the course at the halfway mark.

Where to actually start

The AMC L&D head we spoke to left our conversation with a clear first step. Pick one team that hasn’t moved yet. Identify the most time-consuming, repetitive task that team does every week. Ask whether a customised set of scenarios for that team’s specific work would make the training feel real to them. Get that team through it, have managers ask the follow-up question two weeks later, and see what comes back.

That’s a manageable thing to do in the next 30 days. It doesn’t require a company-wide AI strategy or a large budget. It requires choosing one team, one use case, and being honest about what changes afterwards.

The companies getting this right aren’t the ones with the most sophisticated plans. They’re the ones that stopped waiting for the perfect plan on how to train employees on AI and started with something specific.

XLPro’s Foundation AI Course comes with a generic baseline that works for any company and role-specific scenario customization for companies and industries that want training to stick. Get a free preview →
