Delivering and Measuring Training
TL;DR: A course that ships to no one and is never measured may as well not exist. After you build the course, you have to get it in front of the right learners, make it easy for them to finish, and check whether it actually changed anything. The first two are easy; the third is where most training programs fall apart.
You've built the course. Good. That was the fun part. Now comes the part that most organizations skip and then wonder why nothing changes.
Delivering: getting the course in front of learners
Lupo's enrollment features (see Enrolling People in a Course and Bulk Enroll With CSV) handle the mechanics. The question is who you enroll and how you set their expectations.
A few things that work:
- Enroll a small pilot group first. Before you announce a new course to the whole team, run it past five or six people. You'll find rough edges you missed. Fix them. Then do the broad rollout.
- Tell learners what's in it for them. Don't just enroll them silently. Send a short message or email: here's a new course, here's why it exists, here's what you'll get out of it, here's how long it'll take. Adults don't start things they don't understand the purpose of.
- Set a soft deadline. "Please finish by the end of the month." Not because you're going to enforce it, but because unscheduled work gets put off forever. A deadline creates a small amount of urgency that moves things along.
- Make it findable. My Courses already shows learners their enrolled courses — they don't have to hunt for it. But if you're mentioning the course in a meeting or a Slack message, link to the course page directly. Fewer clicks is always better.
Delivering: removing friction
Course completion rates drop off sharply with every piece of friction you put between the learner and the content. The most common friction points:
- Too long. A course that advertises itself as "30 minutes" and is actually 70 minutes loses half its audience. Be honest about the length up front, and aim for shorter rather than longer.
- Too many clicks to get started. If a learner opens the course and the first screen is "welcome, click here to accept the terms, click here to start, click here to continue," you've lost them. Get them into the first video fast.
- Confusing navigation. If learners can't tell where they are in the course or what's next, they get stuck. Lupo's section/activity structure is usually clear enough, but check that your section names actually describe what's in them.
- Broken content. A 404 video, a missing PDF, a link to a dead external resource. Test every activity as a learner would before you roll the course out. One broken piece destroys trust for the whole course.
These are small things individually. Together they make the difference between a course that gets finished and one that gets abandoned at 40%.
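The broken-content check is the easiest one to partially automate. Lupo isn't assumed to expose a list of your course's links, but if you collect the external URLs from your activities yourself, a small script can flag the dead ones before a learner hits them. This is a sketch: it only covers plain HTTP(S) links, and it treats any error or 4xx/5xx status as broken.

```python
import urllib.request

def check_links(urls, timeout=10):
    """Return (url, reason) pairs for links that look broken.

    A link counts as broken if the request errors out or the server
    answers with a 4xx/5xx status. Only HTTP(S) URLs are supported.
    """
    broken = []
    for url in urls:
        try:
            # HEAD keeps the check cheap; some servers reject HEAD,
            # which will show up here as a (possibly false) failure.
            req = urllib.request.Request(
                url, method="HEAD", headers={"User-Agent": "course-link-check"}
            )
            with urllib.request.urlopen(req, timeout=timeout) as resp:
                if resp.status >= 400:
                    broken.append((url, f"HTTP {resp.status}"))
        except Exception as exc:
            broken.append((url, str(exc)))
    return broken
```

Run it over your collected URLs before each rollout; anything it reports, open by hand and fix or replace. It won't catch a video that plays but is the wrong video, so it supplements the walk-through-as-a-learner pass rather than replacing it.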
Measuring: what to look at
Once the course is running, spend time in the Course Reports page. The reports tell you several things:
- Completion rate. What percentage of enrolled learners have finished the course? A healthy number varies by context, but below 60% usually means something is wrong — the course is too long, the audience isn't the right one, or the value isn't clear.
- Where people drop off. The activity feed shows you which activities learners complete and which they don't. If everyone makes it to section 2 and then falls off, section 2 is probably the problem.
- Time spent. Rough but useful. If learners are burning through a 15-minute video in three minutes, they're skimming — which means the video didn't earn their attention.
Use this data to iterate. Courses shouldn't be ship-and-forget; they should be ship, watch, adjust.
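If you can get the report data out as a CSV (one row per learner per activity), the two headline numbers above are easy to compute yourself. The column names below (`learner`, `activity`, `completed`) are hypothetical; adjust them to match whatever your actual export uses. The drop-off logic also assumes the export lists activities in course order.

```python
import csv
from collections import Counter

def completion_summary(path):
    """Compute completion rate and the top drop-off activities.

    Expects a CSV with 'learner', 'activity', 'completed' (yes/no)
    columns — hypothetical names; rename to match your real export.
    """
    per_learner = {}
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            per_learner.setdefault(row["learner"], []).append(
                (row["activity"], row["completed"].strip().lower() == "yes")
            )

    finished = 0
    stuck_at = Counter()  # activity -> learners whose first gap is here
    for acts in per_learner.values():
        not_done = [name for name, done in acts if not done]
        if not_done:
            # Assuming rows appear in course order, the first
            # unfinished activity is where this learner dropped off.
            stuck_at[not_done[0]] += 1
        else:
            finished += 1

    rate = finished / len(per_learner) if per_learner else 0.0
    return rate, stuck_at.most_common(3)
```

A rate below your healthy threshold plus one activity dominating `stuck_at` tells you exactly where to look first.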
Measuring: the thing that actually matters
Completion rate is a proxy. The real question is whether the course changed anyone's behavior.
This is the hardest part of training, and the one every organization under-invests in. The way you measure it depends on what the course was supposed to do, and it's usually outside the LMS itself:
- If the course was about a specific tool or process, are people using the tool or following the process? Check the tool's usage metrics or the process's compliance.
- If the course was about reducing a specific class of mistake, are those mistakes happening less often? Look at incident logs, support tickets, QA reports.
- If the course was onboarding, are new hires ramping up faster? Ask their managers, or look at how long it takes them to hit their first milestone.
You planned the course with a concrete desired outcome (see Planning Your Course). The measurement is whether that outcome happened. Set up a check in three to six months to actually look.
When the course didn't work
Sometimes the course doesn't move the metric. That's not a disaster — it's data. Diagnose:
- Is the completion rate low? Then the course never got a fair shot. Fix the friction and re-launch.
- Is the completion rate high but the metric didn't move? The course is finishing but not teaching. Rewrite, shorten, add application activities, or reconsider whether training is even the right tool for this problem. (Sometimes the answer is a process change or a better tool, not a course.)
- Is the metric moving a little but not enough? Good — you have something. Iterate.
The important move is to actually go check. Most organizations ship a course, celebrate the launch, and never look again. If you do nothing else differently after reading this section, start the habit of checking in on your courses three months after they ship.
Make it easy to update
Finally: your first version is not the final version. Content gets stale, processes change, new information comes in. Budget for maintenance.
Lupo's structure helps here — you can update a single activity in a course without rebuilding the whole thing. Take advantage of it. When something changes, update the relevant activity, note the change in the description, and move on.
A course that gets updated every few months stays useful for years. A course that ships once and is never touched is dead in 18 months.
Where to go next
- Your AI-Powered Launch Plan — a practical plan for actually shipping your first Lupo course.
- Course Reports — the page where completion data lives.
- Go Deeper — the full masterclass has more on measurement and iteration.