The Webinar Is Playing, Nobody’s Learning

McKinsey’s HR Monitor drops a neat truth bomb: employees across Europe say they spent 12 days in training in 2024; HR reports 22. That’s a 45% gap, measured against HR’s own number. Add the kicker: 30% of employees say they had no training at all. In Germany it’s 44%, up from 23% the year before. HR also claims formal feedback happens more often than employees remember it. Busy schedules, full dashboards, empty impact. The takeaway is straightforward: development shows up in reports, not in lived experience.
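The arithmetic behind that headline is easy to verify; a quick sketch using the figures above (the gap is taken relative to HR’s reported number):

```python
employee_days = 12  # training days employees say they received in 2024
hr_days = 22        # training days HR reports for the same period

# Gap relative to HR's figure: (22 - 12) / 22 ~ 45%
gap = (hr_days - employee_days) / hr_days
print(f"Perception gap: {gap:.0%}")  # → Perception gap: 45%
```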
Why does that gap exist? Look at how learning is positioned. Most L&D still sits outside the flow of real work: a calendar slot, a Zoom link, a deck of slides. People “go” to training, then sprint back to the tasks they’re actually measured on. The spreadsheet tracks attendance hours; the participant tracks unread emails piling up. One side is counting presence; the other is counting value. No wonder the math refuses to reconcile.
There’s also a memory leak. If a session doesn’t demand attention, it evaporates. Ask someone a week later what they applied and you’ll often get silence or a vague line about “communication.” The organization says, “we invested.” Employees shrug, “nothing stuck.” That rift is the real story hiding behind the 12 vs. 22 days headline.
Logged In, Tuned Out
Picture the scene: the webinar window sits in the corner, camera off. The inbox hums, a Teams ping flashes, another task gets checked off. Forty-five minutes later, the platform records completion. The mind files it under noise. When learning is easy to fake, people will fake it. Usually without noticing. They’re simply obeying what screams loudest in that moment.
Skills don’t grow through passive streaming. Watching a video on feedback won’t reshape how someone runs a tough conversation. That takes practice, response, and a safe place to get it wrong first. When the whole experience can be consumed on autopilot, it will be. The quiet truth: many “participants” never actually participate. And because most systems can’t tell presence from engagement, the problem stays invisible.
Mandatory quotas don’t crack that cycle. They can shove people into virtual rooms, but they can’t make minds show up. The result: inflated attendance logs, deflated impact. Everyone can point to numbers; no one can point to changed work.
Design for Friction, Follow‑Through, and Visibility
“Mandatory” should be the door you walk through, not the room you sit in. The room needs friction, the useful kind that forces you to lean in. Interactive, coach‑supported programs are non‑negotiable; live guidance and immediate feedback turn “attendance” into actual practice. Sessions have to demand attention: live problem‑solving instead of passive case studies, role play that adds a touch of social pressure, prompts that need an answer now, not after lunch. When silence hides, disengagement thrives. Make absence visible: small groups where an empty square is obvious, shared docs where an untouched cell tells its own story, collaborative boards that capture (or fail to capture) thinking in real time.
Then comes the part most organizations skip: follow‑through. Every module should end with a concrete task back on the job: “Use this prioritization lens in tomorrow’s stand‑up and jot down what changed.” Thirty, sixty, ninety days later, someone should actually look. Coaches (or trained managers) should verify that the new behavior showed up and help course‑correct when it didn’t. If application lives only in theory, nothing moves. When managers actively play this role, learning shifts from a side project to part of team culture; without their involvement, follow-through dissolves into HR’s to-do list rather than daily leadership practice. A quick pulse to the manager (did you see this behavior, did you coach it?) turns training into an ongoing loop instead of a one‑off broadcast.
Visibility matters as much as design. No-shows, ghost participants, and unfinished assignments shouldn’t hide in a quarterly report; they should ping managers the day they happen. Ownership of learning outcomes belongs to the people who run the work, not just those who run the LMS. And when you measure success, don’t stop at counting hours delivered. Track actions taken, speed to first application, and durability of new habits (for instance, the percentage of managers who actually apply a new feedback technique in their next one-on-one). If a module isn’t moving those needles, it needs a rethink, not a ribbon.
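As a sketch of what tracking “actions taken” and “speed to first application” could look like in practice (the record shape and names here are hypothetical, not taken from any specific LMS):

```python
from datetime import date

# Hypothetical follow-through records: one per participant, noting when the
# module ended and when (if ever) a coach or manager first observed the new
# behavior on the job. None means it was never seen.
records = [
    {"name": "A", "module_end": date(2024, 3, 1), "first_applied": date(2024, 3, 4)},
    {"name": "B", "module_end": date(2024, 3, 1), "first_applied": None},
    {"name": "C", "module_end": date(2024, 3, 1), "first_applied": date(2024, 3, 15)},
]

applied = [r for r in records if r["first_applied"] is not None]

# Application rate: share of participants ever seen using the technique.
application_rate = len(applied) / len(records)

# Speed to first application: days from module end to first observed use
# (upper median, for simplicity).
days = sorted((r["first_applied"] - r["module_end"]).days for r in applied)
median_days = days[len(days) // 2]

print(f"Applied: {application_rate:.0%}, median days to first use: {median_days}")
```

The point of the sketch is the shape of the measurement, not the tooling: each metric needs a dated observation from someone who runs the work, which is exactly what a completion log never captures.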
Training volume can balloon forever without touching how people work, decide, or collaborate. Interaction is the filter that keeps learning from evaporating the moment the browser tab closes. Design for presence. Insist on practice. Make the aftermath observable. Only then do those 22 days mean something beyond a tidy line in a quarterly report.
Final Word: Bring L&D Into the Work, or Stop Pretending It’s Working
Here’s a simple check before you sign off on the next program: could someone get through it with the camera off and the inbox open? If so, chances are they will. The fix is to weave learning into the job itself, make participation natural, and keep the outcomes visible. When that happens, the numbers begin to align with the real stories people share.