How to run an effective pilot program for a learning platform

When you are considering a new learning platform for your organisation, you need to be confident it will be advantageous before making the purchase decision. Running a pilot program allows you to test diverse aspects of the new platform on a small scale.

First things first, however – you need to consider whether a pilot program for a learning platform would be an effective step in the purchasing process. Not every purchasing decision needs one. A pilot program is an investment in itself: it takes time and effort (and sometimes even money) and may not yield useful information unless it is run correctly.

Before committing to a pilot program for a learning platform, take a moment to consider whether it is worth the investment.

  • How unfamiliar are you with learning platforms?
  • How large is the change management exercise in your organisation?
  • How many stakeholders are impacted by the new learning platform?
  • How costly is the new learning platform?
  • Can a pilot program actually return effective results?

That last question is critical. Not everything can be tested via a pilot program. Sometimes the effort required to obtain useful feedback from a pilot is equal to simply implementing the new platform itself. When the new learning platform has flexible pricing and/or no set-up or implementation costs, it may be more effective to skip the pilot program and proceed straight to implementation. If the learning platform has a large set-up cost, a complicated implementation process or would commit you to an expensive multi-year subscription, then running a pilot program might indeed be a wise idea!

If you do decide to run a pilot program for a learning platform, these six steps will start you on the path towards effective testing and useful data for your decision-making process.

Step One – Identify your goals

Ideally, your review of learning platforms will have already identified the desired solutions and outcomes for your organisation. Step back from those overall outcomes and instead define what you want to learn from the pilot program itself. Gather your stakeholders or decision makers and brainstorm the specific questions you are currently struggling with or uncertain about.

In the context of a new learning platform, this might include:

  • Can employees use the learning platform without significant assistance?
  • Do learners use or gain value from certain features?
  • Does the style of learning feel natural to employees?
  • Do employees have problems accessing the learning platform?
  • Is the process of setting up employees and learning manageable (for a pilot)?

Group similar questions together and summarise them into a short set of questions your pilot program needs to answer. These questions will guide you in deciding which features or parts of the platform need to be evaluated.

Step Two – Determine what to measure

That last sentence in the previous step is very important – a pilot doesn’t need to test every single aspect of the platform. It just needs to focus on those areas or features required to answer your questions and help you make a purchasing decision.

So, for example, if your concern is how easily employees can navigate the platform, you'll need to provide modules for employees to navigate within and between. If you need to evaluate the ease of creating online training, then testing social learning may be less critical than having a few subject matter experts try out the training creation tools. By setting goals, you will understand the key features of the platform you need to focus on. Test those and those alone.

Step Three – Recruit your program participants

A pilot program group doesn't have to be large, but if your final audience is diverse, you may need to replicate that diversity. You also need to involve enough departments and roles to generate meaningful data. Often your stakeholders will be able to help you identify the right people, particularly if you need to test across multiple segments.
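If you maintain a simple staff list, even a small script can help keep the pilot group proportional to the wider organisation. The sketch below is a minimal illustration only: the employees.csv file, its name and department columns and the group size of 20 are assumptions invented for this example, not features of any particular platform.

```python
import csv
import random
from collections import defaultdict

GROUP_SIZE = 20  # assumed pilot group size for this example

# Group employee names by department (assumes an employees.csv
# export with 'name' and 'department' columns - hypothetical names).
by_department = defaultdict(list)
with open("employees.csv", newline="") as f:
    for row in csv.DictReader(f):
        by_department[row["department"]].append(row["name"])

total = sum(len(names) for names in by_department.values())

# Sample each department in proportion to its share of the organisation,
# keeping at least one participant per department.
pilot_group = []
for dept, names in by_department.items():
    quota = max(1, round(GROUP_SIZE * len(names) / total))
    pilot_group.extend(random.sample(names, min(quota, len(names))))

print(f"Selected {len(pilot_group)} participants across {len(by_department)} departments")
```

Proportional sampling like this is only a starting point – you may still want to hand-pick certain roles, or known sceptics whose feedback will be especially telling.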

Testing training is a tricky task too. If an employee is testing the interface, they often don’t concentrate on the training content as much as they would if it was a real training exercise. Conversely, if you want employees to focus on training content, then you need to empower them to spend as much time testing and implementing the training content as they would with real training (with the same support they would get from their manager in a real training exercise). That’s often a big task and requires a substantial investment.

You can manage this by picking training topics with less ‘load’ on the employee – compliance training is often used for this. However, if your goal is to drive training which impacts strategic goals – such as sales or leadership – then compliance training isn’t a great proxy. You need to tread carefully here. Asking too much of your employees in a pilot program of a training platform may be unrealistic and only frustrate them (negatively biasing their responses).

A halfway measure may be to have employees complete most of the training and then answer a series of review questions about their impressions and intent. Could they foresee the remaining (more intensive) training activities being achievable? What did they manage to achieve with the training they completed?

When testing other aspects of a training platform, similar issues of balance can arise. If you want to test the ability to create training, asking someone to build an entire training module as a pilot may be too much – you might need to settle for creating part of a module, enough to 'get the gist'. Administration tasks can be tested similarly. If you are worried about the burden of training administration, ask an administrator to focus on automation tasks around a single module (not dozens of modules).

Ask too little, and you won’t get high-quality responses as the testing may be superficial. Ask too much, and you may negatively bias the results from a level of frustration.

Step Four – Collect your pilot program data

The data you collect from your pilot program will vary depending on your goals. Review those goals to determine the best mix of qualitative and quantitative data you need.

You can then decide how you will collect that data. It could be physically watching employees use the platform, or it might be an online survey. Good learning platforms should have built-in learner feedback features (such as the built-in feedback and quality polls in all Tribal Habits modules). Usability.gov includes some great examples of both quantitative and qualitative data collection.

Remember: if you plan on gathering quantitative data, make sure you also collect qualitative data. It is important to have a system of checks and balances; otherwise, you run the risk of numbers dominating the feedback, and numbers alone can be open to interpretation.
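As a minimal sketch of pairing the two data types, the example below assumes a hypothetical responses.csv export with a numeric ease_rating column (1 to 5) and a free-text comment column – the file and column names are invented for illustration, not taken from any specific platform.

```python
import csv
from statistics import mean

# Assumes a responses.csv export with a numeric 'ease_rating' (1-5)
# and a free-text 'comment' column (both hypothetical names).
ratings, comments = [], []
with open("responses.csv", newline="") as f:
    for row in csv.DictReader(f):
        ratings.append(int(row["ease_rating"]))
        if row["comment"].strip():
            comments.append(row["comment"].strip())

print(f"Average ease rating: {mean(ratings):.1f} / 5 ({len(ratings)} responses)")

# Print every written comment next to the score, so the number
# never stands alone when you review the results.
print(f"{len(comments)} written comments to review:")
for comment in comments:
    print(f" - {comment}")
```

Reviewing the comments alongside the average keeps the qualitative feedback from being drowned out by a single headline number.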

Step Five – Revisit your decision-making process

There is no point in running a pilot program for a learning platform if you don’t actually use the results! It’s great if you discovered that employees are using the platform as intended and found the experience beneficial. Chances are, however, that not every aspect of the new platform went perfectly – it is just a pilot program, not a full implementation after all.

Take what you discovered and see how serious the issues are. For example, if employees struggled with navigation, consider if that could be solved in a proper implementation with more useful onboarding or instructions. If employees didn’t use certain features, consider if those features are perhaps less important than you thought or if they need a better explanation.

Step Six – Turn testers into champions

Once you have worked with your pilot program group, you'll have a good idea of who found it valuable and who appreciated your desire to improve learning in your organisation. Take those people and ask them to share their experiences.

These platform champions might send emails to their department, present on the platform at a team meeting or let you use quotes in rollout communications. They may also be great candidates to create your first custom online training modules in a platform like Tribal Habits.

Smart use of a pilot program for a learning platform

Using these six steps will help you optimise a pilot program for a learning platform. If you take the time to plan, work through a process, clearly evaluate data and identify possible changes, you dramatically increase your chance of success when the full platform makes its debut. And you can make a more confident purchasing decision in the first place.
