Use Behavioral Data To Boost Activation and Retention for Your Customer Training Program [Webinar Recap]

Jami Kelmenson
September 11, 2023
Analytics & Reporting
Webinars & Events

In our recent webinar, Skilljar’s Director of Customer Marketing, Jen Raphael, hosted a conversation with Eric Mistry, Customer Education Operations Manager, and Daniel Jimenez, Senior Education Specialist, about how Heap uses behavioral data from its Customer Education program to drive business impact.

Heap is a digital insights platform that enables fast-moving digital teams to understand and improve digital experiences. Heap’s complete dataset, natively integrated data science, and qualitative analytics help companies pinpoint their most valuable insights and then act on them with confidence. So who better to learn from about combining Skilljar data with their own data-driven insights to improve learning outcomes and shape decision-making?

Optimizing our Skilljar instance is my bread and butter these days. You can have the best content in the world but if you don’t put it in front of people in the right way, make it easy to find, and surface it at the right time, it’s not going to be as effective.— Eric Mistry

Heap’s Customer Education tech stack

Heap adopted Skilljar in 2018 because they wanted to move from individual live webinars and trainings to an elearning content platform so they could scale and better meet customer needs. The main criteria they were looking for in a customer education tool were:

  • an LMS focused on customer education rather than internal enablement
  • a good native authoring tool
  • a good long-term partner

Skilljar met and has continued to meet this criteria for Heap University (HeapU). As we’ve grown, we’ve rebranded the platform, added a ton of content, and are continually leveraging new features like learning paths and live webinars.— Eric Mistry

Heap was able to install their data autocapture tool on Skilljar simply by pasting a code snippet into the global header in the Skilljar Theme section. As a result, Skilljar data quickly loads into Heap: page views, clicks, field changes, form submissions, course completions, lesson completions – anything a user does in Skilljar.

In addition to Skilljar and their own tools, Heap leverages the following Skilljar integrations to enable productive digital insights:

  • Zapier – for connecting different services together and automating large sets of tasks so they can take specific actions on data
  • Zoom – for hosting classes and webinars
  • Typeform – for implementing end-of-course surveys and quantitative and qualitative user feedback

The easy access to creating and inserting custom code into Skilljar is really well done and documented, as well as the nomenclature of the website.— Eric Mistry

Capturing and using data

Daniel took attendees through how Heap captures and uses data from Skilljar to measure and improve learning outcomes.

Here are the key elements of learning data that Heap captures for optimization purposes:

Key elements to capture for customer education data analytics from Heap

Registration & completion:

  • What are people interested in and how are they going through that content?
  • Are they finishing what they start?
  • Are they only going to certain courses or lessons?

Product:

  • What learning content is specifically leading users to take an action within the product?
  • What actions are they taking in the product?
  • Are they clicking on something? Running a report? Viewing a particular bit of content?

Paths and habits:

  • What do learners do regularly?
  • Are they finding the content that they need?
  • If we provide some kind of callout or shortcut as an experiment, are they engaging with it?
  • If we improve those learning paths users are taking, does this make it easier for them to find what they need?
  • Who are these users? Are they new or returning users?

We take the data we get from Skilljar and use it to help us make decisions and measure what we’ve done in HeapU. We want to make sure we’re backing up our actions with data, not just hunches.— Daniel Jimenez

Using Skilljar data along with Heap’s own tools and direct user feedback gathered through Typeform, Heap is able to understand:

  • What are users doing in HeapU?
  • Is it really helping the learner?
  • Is it a good UI for that learner?
  • Are they completing the actions and having good outcomes?
  • Was it a good experience?
  • Is there something we weren’t thinking about that might be better, cleaner, simpler to really optimize the experience for that learner?

We want them to learn, not just show up and finish a course. We want to understand if they take action. We keep watching the data to see if there are any kind of changes or improvements we can make.— Daniel Jimenez

Daniel noted that this type of data is useful for assessing learning outcomes, planning the product roadmap, and optimizing the learning site. In this way, they can make sure the content they provide is sticking.

Assessing learning outcomes

Here are Daniel’s suggestions for identifying ways to improve learning outcomes.

1. Make sure you are looking at the right subset of learners.

Daniel shared this example of how they look at data and outcomes for their Session Replay course.

Session Replay course outcome conversion for Heap

Heap built a course to help learners understand how to use the session replay feature, which lets customers see where their own users are falling off on their websites by watching a replay of each user’s session. They looked at how many total learners took the course and found that only one-third actually went on to watch a session replay, which is the outcome they were looking for from that course.

They then looked only at learners who have access to the session replay feature (an add-on for paid customers only) to get a more accurate picture. It turns out almost three-quarters of learners who had session replay enabled went on to take that action after the course. The key is to look at the right subset of users, which also means excluding internal employees.
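To make this concrete, here is a minimal sketch in Python (using pandas and made-up records, not Heap’s actual data model) of how restricting the analysis to the right subset changes an outcome-conversion number:

```python
# Minimal sketch: outcome conversion before and after filtering to the right subset.
# Field names and records are illustrative assumptions; every row is assumed to be
# a learner who completed the course.
import pandas as pd

learners = pd.DataFrame([
    {"user": "a@example.com", "has_session_replay": True,  "is_internal": False, "watched_replay": True},
    {"user": "b@example.com", "has_session_replay": False, "is_internal": False, "watched_replay": False},
    {"user": "c@example.com", "has_session_replay": True,  "is_internal": True,  "watched_replay": True},
    {"user": "d@example.com", "has_session_replay": True,  "is_internal": False, "watched_replay": True},
    {"user": "e@example.com", "has_session_replay": False, "is_internal": False, "watched_replay": False},
    {"user": "f@example.com", "has_session_replay": True,  "is_internal": False, "watched_replay": False},
])

# Naive rate: every learner who completed the course.
naive_rate = learners["watched_replay"].mean()

# Adjusted rate: only learners who can actually use the feature, excluding employees.
eligible = learners[learners["has_session_replay"] & ~learners["is_internal"]]
adjusted_rate = eligible["watched_replay"].mean()

print(f"naive: {naive_rate:.0%}, right subset: {adjusted_rate:.0%}")
```

On this toy data the rate jumps from 50% to 67% once ineligible and internal accounts are removed, mirroring the shift Daniel described.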

2. Ask the right questions and test different hypotheses.

Daniel explained that to measure learner outcomes, they build a dashboard that lets them continually monitor their metrics. Once the right charts are set up, it gives them a constant view into different areas within a course, along with the actions users are taking.

For example, when they compared the actions of users who took the Quick Start course with those of the entire learner population, they found that fewer than half of those users ran a query (created a chart) within the first week. Heap identified this as an area where they could improve.

Quick start course data analytics for Heap

Metrics like this call for asking the right questions to understand what is making users perform, or not perform, the desired action:

  • Is the time frame too short?
  • Would the impact show up if they tracked it over a month rather than a week?
  • Did users follow through and run another type of report as a result of taking this course?

The other action they’re looking for from the Quick Start course is pointing learners toward templates, which help them build their own dashboards faster. The team learned that some learners went straight to building charts rather than using templates, but in either case they can measure what is sticking and what is not.

From there, they develop their own hypotheses to test, such as:

  • Is this course working to get users to build templates faster?
  • Is it that once users viewed the templates, they knew how to create dashboards and didn’t require further training?
  • Are the templates not useful?
  • Are they hard to find?

This type of questioning leads to a whole other set of possibilities that they can dig into further by testing and reviewing the data.

Daniel advises determining the core items that will tell you you’re successful. For Heap, this is activation, defined as running a query, and retention, defined as running a query again within the next month. Then, look at the percentage of learners who took a course and then took that action. Be sure to define your timelines (e.g., one week versus one month for activation, and four weeks for retention) to test your core metric, and make sure you eliminate users who aren’t relevant (e.g., internal employees).
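As a rough illustration of those definitions, the sketch below (in Python, with hypothetical timestamps; the window lengths are placeholders for whatever you choose) flags whether a single learner activated and was retained:

```python
# Minimal sketch (hypothetical data, not Heap's schema): flag activation and
# retention for one learner from a first course view and their query timestamps.
from datetime import datetime, timedelta

ACTIVATION_WINDOW = timedelta(days=7)   # ran a query within a week of first viewing the course
RETENTION_WINDOW = timedelta(weeks=4)   # ran a query again in the following month

def activation_and_retention(first_course_view, query_times):
    activated = any(first_course_view <= t <= first_course_view + ACTIVATION_WINDOW
                    for t in query_times)
    retention_start = first_course_view + ACTIVATION_WINDOW
    retained = any(retention_start < t <= retention_start + RETENTION_WINDOW
                   for t in query_times)
    return activated, retained

first_view = datetime(2023, 6, 1)
queries = [datetime(2023, 6, 3), datetime(2023, 6, 20)]
print(activation_and_retention(first_view, queries))  # (True, True)
```

Running this per learner, then averaging across the cohort (after excluding irrelevant users), gives the activation and retention percentages to track on a dashboard.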

Make this part of your daily or weekly analysis, and look for ways to get learners to activate faster and to know whether, and when, they come back. Decide which actions you want learners to take, then use the data to measure if and when they take them and how those rates might be improved. Consider both quantitative and qualitative data.

We take in both quantitative and qualitative information as we review data and make improvements to assess learning outcomes. Are they actually following through and doing what we expect? Are they coming back to do it again?— Daniel Jimenez

Planning your content roadmap

Once you’ve identified which learning behaviors to assess and are reviewing the metrics regularly, how do you use this information to plan your content roadmap and improve your content?

To do this, Daniel advises the following steps:

1. Decide who your audience is.

Active users in HeapU, a Skilljar customer

The data showed Heap that most of their learners were new users who needed to get up and running with the product.

2. Decide what content your audience is looking at.

Top courses in HeapU, a Skilljar customer

Not surprisingly, new users were mostly looking at the “Getting Started” content for Heap. So they knew they were hitting the right users with the right content.

3. Look at the activation levels for those learners.

Impact of learning on activation and retention for HeapU

Heap found that activation levels, measured from the first time learners view a course to the first time they run a query, were quite high. Within the same week, 41% of learners who viewed a course page achieved the desired outcome, versus only 5% of those who did not view a course page. For another metric, retention, 43% of learners who viewed a course came back within a few weeks, compared to 30% of those who did not. So they had confirmed that the right learners were finding the right content and coming back. The next step is to…

4. Prioritize changes based on capacity.

Identify what content is most used by successful learners as well as what new updates are coming out that need to be communicated (UI changes vs. features). Gauge these needs against the capacity you have to make changes. For example, some items can be covered as documentation as opposed to creating a full-fledged course.

Since we are a smaller team, we need to make sure that we prioritize and really use this data to identify the content users are most interested in at the right level for them.— Daniel Jimenez

5. Reduce friction and “struggle zones.”

Use the data to understand where there are “struggle zones” for learners (as indicated by falloff in the user journey). The goal is to reduce this friction. Heap considered:

  • What Help Center content is being searched the most?
  • Which learning outcomes are learners completing? Where they aren’t, why not?
  • Does the UI match the training experience? (Colors, changed buttons, and rebranding all contribute.) This is another area where Heap uses Typeform to find out how they can improve their features and navigation.

Optimizing your learning site

Heap uses their own heat map tool to identify where users place their mouse on a page, as a proxy for attention. The main insight: most learners don’t scroll to or view the lower part of the screen.

Heat map for HeapU homepage

Heap’s heat map showed that most users don’t go beyond the gray boxes near the middle of the screen when viewing the HeapU homepage.

Eric tests many things on this page to try to optimize usage, including the banner size, the CTA buttons, and the four main buckets of content, since these are the areas where users have the most activity.

It’s also important to understand where learners are coming from before they arrive at HeapU, such as through a link or an email, internal Heap navigation (from Heap to HeapU), the community, help center, and so forth.

Traffic referrer for HeapU

At Heap, all of their sub-domains for training are linked to make the experience easier for learners. Just knowing where learners are coming from can help you decide what actions you need to take to improve outcomes.

Deep Dive: My Courses button

The My Courses button stands out on the HeapU homepage so users can find where they left off on a course

Heap wanted to test a hypothesis on why completion rates were lower than expected for their longer courses: they wanted to know if learners had trouble finding and accessing their existing enrolled courses.

Learners’ profiles in HeapU list all the courses a learner is registered for, but the data showed that very few learners were accessing that page. So Heap added a “My Courses” button to the quick nav buttons below the header image. (Skilljar also has a template that accomplishes this, available in our Developer Center for customers.)

They continually watch and optimize all of their CTA buttons as well. The conversion rate (the percentage of learners who click into and complete a course) is over 68% for learners who click on “My Courses,” compared to 32% for learners who don’t. For learners who click on any quick navigation link, the conversion rate is about 50%, compared to 29% for those who don’t.
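A compact sketch of that kind of segment comparison (in Python, with illustrative records rather than Heap’s real figures) might look like this:

```python
# Minimal sketch: course-completion conversion for learners who clicked
# "My Courses" versus those who did not. Records are illustrative assumptions.
def conversion_rate(learners, clicked):
    segment = [rec for rec in learners if rec["clicked_my_courses"] == clicked]
    completed = sum(1 for rec in segment if rec["completed_course"])
    return completed / len(segment) if segment else 0.0

learners = [
    {"clicked_my_courses": True,  "completed_course": True},
    {"clicked_my_courses": True,  "completed_course": True},
    {"clicked_my_courses": True,  "completed_course": False},
    {"clicked_my_courses": False, "completed_course": True},
    {"clicked_my_courses": False, "completed_course": False},
    {"clicked_my_courses": False, "completed_course": False},
]

print(f"clicked: {conversion_rate(learners, True):.0%}")   # 67%
print(f"did not: {conversion_rate(learners, False):.0%}")  # 33%
```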

The conversion rate for HeapU users who access the Quick Nav buttons on their homepage

Getting learners to click on the “My Courses” button has a strong correlation with them actually finishing the course. And we know that course completion correlates really well with product usage. We’re always looking for ways to boost that particular metric.— Eric Mistry

Change, measure, change again

At Heap, they always conduct experiments and iterations, even small ones like the color of a CTA button, to try to boost engagement. Some work and some don’t; when they don’t, the team tries something else and asks more questions to see what else could work.

For example, an earlier rebrand of the HeapU site was heavily green. Feedback gathered via Typeform and from internal SMEs prompted the team to dial that down. Today, the HeapU homepage uses a diverse color palette and quick navigation buttons, with only the “My Courses” button highlighted in green.

The HeapU homepage, powered by Skilljar

They also found that learners weren’t clicking on the header links at the very top right because the links were hard to see. So they changed the header background to dark blue and are measuring the impact, along with the impact of naming conventions for the CTA buttons. Small changes can have an impact, and they’re easy to iterate on.

For guiding learning journeys, they want to supply content to learners at the right time, such as having a course appear following completion of another course. They accomplish this through Skilljar’s learning paths feature.

How Heap uses Skilljar Learning Paths in their Customer Education program

They’re always looking at the data and making small tweaks along the way. Using Zapier via the Skilljar API, they pull data on lesson completion as well as course completion into Heap. This helps them identify “power learners” and see the journeys they take through the HeapU platform.
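As a rough sketch of what that hand-off can look like (this is not Heap’s actual Zap; the payload fields, event name, and use of Heap’s server-side track endpoint are assumptions for illustration), a small script or Zapier code step could forward each completion record like this:

```python
# Minimal sketch: forward a Skilljar course-completion record to Heap's
# server-side track API so it can be analyzed alongside autocaptured events.
# The record shape and event name are hypothetical.
import requests

HEAP_TRACK_URL = "https://heapanalytics.com/api/track"
HEAP_APP_ID = "YOUR_HEAP_APP_ID"  # placeholder

def forward_completion(completion):
    payload = {
        "app_id": HEAP_APP_ID,
        "identity": completion["learner_email"],  # ties the event to the learner
        "event": "Skilljar Course Completed",
        "properties": {
            "course_title": completion["course_title"],
            "completed_at": completion["completed_at"],
        },
    }
    resp = requests.post(HEAP_TRACK_URL, json=payload, timeout=10)
    resp.raise_for_status()

forward_completion({
    "learner_email": "learner@example.com",
    "course_title": "Heap Quick Start",
    "completed_at": "2023-08-01T12:00:00Z",
})
```

Once the events land in Heap, they can be segmented and charted just like the autocaptured Skilljar activity.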

At the end of the day, we’re just trying to boost our LMS experience because like Skilljar says, a trained customer is your best customer.— Eric Mistry

To summarize the key takeaways of how Heap uses behavioral data to optimize their learning platform:

  • Asking the right questions of the right data can lead to new insights.
  • You must act on your insights; insights without action do nothing.
  • Change is not a one-time thing.
  • Changes do not have to be massive to have impact.

We’re continually adapting and evolving based on our insights to keep our strategies effective and our learners engaged. Button text or color, or just what something’s named, can have a huge impact, and with the right data you can tell what’s causing that impact.— Eric Mistry

