Pluralsight Mobile Onboarding

An onboarding experience for Pluralsight mobile apps

Background

What is Pluralsight?

Pluralsight is a subscription-based online learning platform that offers best-in-class courses to help users skill up in tech.

My role at Pluralsight

I led the Native Apps team of 14 stakeholders, spanning 3 software development teams (iOS, Android, API) and overseeing 7 native apps (iOS, Android, macOS, Windows, Apple TV, Fire TV, Roku).

Most of our time was spent on the mobile app, and this onboarding experience was one of the most successful projects I led.

In this project, my team and I set out to understand what was causing a sharp increase in user churn and to design a solution for it.

Overview

After COVID hit and most tech employers implemented work-from-home policies in early 2020, the behavior of many users on our mobile apps started changing. Prior to COVID, our learners primarily used our mobile apps during their commute to and from work. Since many tech workers were no longer commuting, usage of our mobile apps changed drastically.

While reviewing metrics during 2021 OKR planning, we noticed a steep increase in first-time app user churn (i.e. more first-time mobile app users were using our app only once and never returning within their first 28 days).

In the span of just a few months, user churn climbed from 31.8% to 38%. Since this metric was highly correlated with our company-wide north star metric, Monthly Active Users (MAU), we set out to find a solution to this problem.

Goal

Our goal was to reduce B2B one-and-dones on mobile from 38% to 30%. (A one-and-done is a new mobile user who does not return to the mobile app within 28 days of their first visit.)
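
Since this 28-day window defines the metric everything below optimizes for, here is a minimal sketch of how a one-and-done rate like this can be computed from raw session data. This is an illustration only; the table and column names are my assumptions, not Pluralsight's actual warehouse schema.

```python
# Hypothetical sketch: compute the one-and-done rate from session data.
# Table/column names (sessions, user_id, session_date) are assumptions,
# not the real Pluralsight data model.
import pandas as pd

def one_and_done_rate(sessions: pd.DataFrame) -> float:
    """Share of new users with no return session within 28 days of their first visit."""
    first_visit = (
        sessions.groupby("user_id")["session_date"].min()
        .rename("first_date")
        .reset_index()
    )
    df = sessions.merge(first_visit, on="user_id")
    df["delta_days"] = (df["session_date"] - df["first_date"]).dt.days
    # Any session after day 0 but within the 28-day window counts as a return.
    returners = df.loc[(df["delta_days"] > 0) & (df["delta_days"] <= 28), "user_id"].nunique()
    return 1 - returners / first_visit["user_id"].nunique()
```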

At the time, our apps had no onboarding experience, and we saw an opportunity in creating one for our first-time mobile app users. Our hypothesis was that a solid onboarding experience for first-time Pluralsight users would give us a better chance of retaining them in the long run.

With that said, onboarding can be a large time investment and can go in many different directions if not intentionally scoped. Therefore, we spent the early stages of this project scoping down the problem.

It was then time to jump into research: understand user pain points, sharpen the scope further, and make sure our solution would deliver value to users while solving a critical business problem.

Research

1:1 Semi-Structured Interviews

To scope down the problem, we first wanted to understand the experience new B2B mobile learners have on Pluralsight, so we conducted 1:1 interviews with that user segment. Our research set out to answer the following questions:

  1. Do learners know what Pluralsight is?

    Since many companies now subscribe to a plethora of services and provide them to employees through an access management service (e.g. Okta), users sometimes don't even know which services their company has signed them up for, or why. Sometimes that alone is why users bounce and never return, and the purpose of this question was to rule it out as a confound.

    It's worth noting, however, that most users who would sign up to speak with us—especially since we didn't incentivize our research studies—would probably know what Pluralsight is. So we proceeded with caution.

  2. Do learners know the value proposition of Pluralsight mobile apps?

    We asked this question to understand whether users knew what our mobile app offered (its value proposition) that they wouldn't find on the web platform: offline downloads, portability, an audio-only player, and a fun quiz experience.

    We wanted to get a sense of whether users needed to be onboarded to these features. If so, we would include an introduction to them in our onboarding flow.

  3. Do learners find the app usable?

    Finally, we wanted to test the user experience of the app. Are app features confusing? Do users find it difficult to download courses? Are some of our highly-retaining mobile features not discoverable enough?

    Poor user experience on the apps could be one of the major driving factors for users leaving the app. So we also wanted to understand whether users found our app to be usable.

Competitive Landscape Review

The primary reason behind going through this exercise was to get a better understanding of learners’ mental models.

I assumed that users who use Pluralsight might also have used other online learning platforms. I wanted to make sure we designed our onboarding experience with users’ existing mental models in mind and didn’t veer too far from what they were used to experiencing with similar services.

Coursera

Udemy

Udemy for Business (blocked by subscription wall)

edX

LinkedIn Learning

Solution Space

Solution Design Philosophy

My ultimate goal was to build the most impactful onboarding experience with the least effort and resource allocation to achieve maximum ROI.

  1. Effort: Repurposing Existing Solutions

The first thing I did was dig deeper into our mobile apps to check whether any existing experiences could be repurposed for our onboarding flow. The rationale was that this would require minimal development work, since it would just mean redesigning an already existing experience.

I found two experiences in particular that could potentially address the problem we set out to solve: learning reminders and weekly learning goals.

Now it was time to see whether those experiences would actually address the churn problem.

  2. Impact: Quantitative Data Analysis

Since our ultimate goal was reducing one-and-dones, I reached out to our data analyst to pull data on how these two features performed with respect to user retention. It turned out they were among the highest-retaining experiences on our mobile apps.
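
As a rough illustration of the kind of comparison our analyst ran (the DataFrame and column names here are hypothetical, not our real data model):

```python
# Hypothetical sketch: compare 28-day retention for users who did vs. did
# not engage with a feature. Column names are illustrative assumptions.
import pandas as pd

def retention_by_feature(users: pd.DataFrame, feature_flag: str) -> pd.Series:
    """Mean 28-day retention, split by whether users engaged with a feature."""
    return users.groupby(feature_flag)["returned_within_28d"].mean()

# e.g. retention_by_feature(users, "set_weekly_goal") splits retention for
# goal-setters vs. non-goal-setters; a large gap supports repurposing the feature.
```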

With both the effort and impact boxes checked, I got the green light to move forward with designing mock-ups.

Design + Usability Testing

I worked with our designer to create hi-fi mockups so we could validate our solution through usability tests with our learners before jumping into development.

I created the discussion guide, recruited participants, and conducted the usability tests. As the tests progressed, we iterated on and refined our designs based on what we learned.

Event Tracking

Once we were satisfied with the final product, it was time to work with our data analyst to design and implement event tracking so we could measure the success of our new onboarding experience.

I identified all the clickstream events we needed to track, then created a spreadsheet listing each event name, a description of what it tracks, and the event properties that would give us more granularity and specificity.
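
To give a flavor of what that tracking plan looked like, here is a hedged sketch of a few entries expressed as code; the event names, descriptions, and properties are illustrative, not our actual taxonomy.

```python
# Hypothetical sketch of the event-tracking plan. Event names and
# properties are illustrative, not the real Pluralsight taxonomy.
from dataclasses import dataclass, field

@dataclass
class TrackedEvent:
    name: str                        # clickstream event name
    description: str                 # what the event measures
    properties: dict = field(default_factory=dict)  # extra granularity

ONBOARDING_EVENTS = [
    TrackedEvent(
        name="onboarding_started",
        description="User landed on the first onboarding screen",
        properties={"platform": "ios | android"},
    ),
    TrackedEvent(
        name="weekly_goal_set",
        description="User committed to a weekly learning goal during onboarding",
        properties={"goal_minutes": "int", "platform": "ios | android"},
    ),
    TrackedEvent(
        name="learning_reminder_enabled",
        description="User opted into learning reminders during onboarding",
        properties={"reminder_time": "HH:MM", "platform": "ios | android"},
    ),
]
```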

Final Product

At this point, we were ready to go into development (or so I thought! See my learnings below).

Below is the final product that our awesome developers built!

Onboarding experience for users who had previously set a goal on web

Onboarding experience for users who had not previously set a goal on web
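
For illustration, the branching between the two flows above boils down to a simple gate; the function and field names in this sketch are hypothetical, not our actual client code.

```python
# Hypothetical sketch of the logic that picks an onboarding variant.
# Field names on `user` are assumptions, not the real client model.
def onboarding_variant(user) -> str:
    """Choose which onboarding flow (if any) a mobile user should see."""
    if not user.is_first_mobile_session:
        return "none"                   # onboarding is shown only once
    if user.has_weekly_goal_on_web:
        return "confirm_existing_goal"  # acknowledge the goal already set on web
    return "set_new_goal"               # walk the user through setting a goal
```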

Outcomes

Outcome 1: After implementing the new onboarding experience, we saw a 16.8% reduction in first-time user churn across our mobile apps (down to 31.8% from 38%).

Outcome 2: The web platform team incorporated part of our onboarding flow into their new learning reminders experience.

Reflections + Learnings

While the experience was successful in reducing user churn, the journey to get there was quite bumpy.

I would love to shed light on two learning experiences in particular that make me a much better product manager today.

1. Stakeholder inclusion does not mean mindless adoption

The first design sprint I conducted with the team surfaced many different ideas. Instead of filtering them and choosing the right solution for the problem we were trying to solve, I "Frankenstein-ed" a solution from everyone's input in an effort to be inclusive of every idea. Needless to say, usability tests showed that the solution we had come up with did not work for our users.

After that setback, I learned an important lesson: it's essential to bring different stakeholders into solution design meetings because their perspectives are invaluable; however, it's just as important for me as a PM to prioritize those solutions and make sure they add value for our users, based on all the research I have gathered and synthesized.

2. Technical discovery is critical

The product team spent the majority of our discovery effort on user research, designing the best solution from a user experience perspective. However, we did not spend enough time with developers rigorously exploring the conditions under which the onboarding experience would be shown and the edge cases we needed to account for. That work proved far more complex than we had expected and caused friction during the implementation phase.

© 2024 — Omar El-Etr
