How to Tie In Marketing Data to Experimentation Success Metrics

Find out how to connect marketing data to experimentation success metrics from Ashit Kumar, User Growth Optimization at Spotify.
Diana Ellegaard-Daia
Head of Content Marketing, Accutics

In this interview, Ashit Kumar, User Growth Optimization at Spotify, shares his winning tactics for connecting marketing data to experimentation success metrics.

Types of experiments for optimizing the conversion funnel at Spotify

D.D.: To get us started, what type of experiments are you conducting at Spotify with the goal of optimizing the conversion funnel?

A.K.: When it comes to our team, our remit is to optimize the user conversion flows. Since most of Spotify's conversions happen on the web, our team focuses on optimizing those flows, some of those being:

Landing page optimization

One of the key areas of our optimization process is landing page optimization. A massive chunk of traffic lands on our premium landing pages across different regions and countries. Over the last few years, our strategy has been to work closely with the market teams to make sure that landing pages are optimized for their own regions and local nuances, and that each landing page has all the right information our subscribers, or potential subscribers, need.

Churn optimization

As the overall subscriber pool shrinks in some of our more mature markets, we have also started focusing more on the cancellation flow.

For example, one of the past experiments was to add the benefits of the premium plan in the cancellation flow so that users are aware that they would lose these benefits if they canceled their premium plan.

Upsell optimization

In our mature markets, we also focus on making existing subscribers aware of what they gain with higher value plans and try to convert them to those plans.

Many of Spotify's plans follow the user life cycle. You start as a student, convert to an individual plan, get the Duo plan when you have a partner, and upgrade to the Family plan when you have a family. These life stages need to be put in front of our users so that they know these plans exist when they need them.


Key metrics for user growth optimization

D.D.: What metrics do you focus on when you assess the success of an experiment?

A.K.: We have a variety of metrics to focus on depending upon the premise of an experiment. Some of those are:

  • Conversion rate (user level)
  • Activity metrics (DAU/WAU/MAU)
  • Cancellation rate
  • LTV (lifetime value)

We have also started looking into LTV, or lifetime value, averaged per variant. Our subscribers differ widely in their behavior. Converting a subscriber on a trial plan, for example, does not bring the business the same revenue as converting them to an individual plan without a trial. So we have started to take some of those nuances into account when we run experiments.

Depending on the hypothesis of the experiment, we may choose one metric over another. We also make sure to choose one or two strong metrics as our primary metrics, and one or two others as guardrail metrics, mostly because when you are pushing for more conversions, you have to ensure that your experiment does not degrade the user experience.


How to connect marketing data to experimentation success metrics

D.D.: How do you tie in marketing data to experimentation success metrics?

A.K.: Analyzing our experiment based upon marketing channels is incredibly important for our team, since we primarily focus on optimizing for new subscribers.

To bring marketing data into our analysis, we pass the marketing attributes to our experimentation datasets. Then, during analysis, we run a check on the primary marketing channels to see whether any of them underperform or overperform compared to the overall variant performance.
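
This kind of per-channel check can be sketched roughly as follows (an illustrative example with made-up data and a hypothetical one-percentage-point threshold, not Spotify's actual pipeline):

```python
import pandas as pd

# Made-up experiment results already joined with marketing attributes.
df = pd.DataFrame({
    "variant": ["control", "treatment"] * 4,
    "channel": ["paid_search", "paid_search", "paid_social", "paid_social",
                "email", "email", "organic", "organic"],
    "visitors": [1000, 1000, 800, 800, 500, 500, 1200, 1200],
    "conversions": [50, 60, 32, 41, 30, 20, 48, 50],
})

# Overall conversion rate and lift per variant.
totals = df.groupby("variant")[["conversions", "visitors"]].sum()
overall_cr = totals["conversions"] / totals["visitors"]
overall_lift = overall_cr["treatment"] - overall_cr["control"]

# Conversion rate per variant within each channel.
by_channel = (df.assign(cr=df["conversions"] / df["visitors"])
                .pivot(index="channel", columns="variant", values="cr"))
by_channel["lift"] = by_channel["treatment"] - by_channel["control"]

# Flag channels whose lift deviates from the overall lift by more
# than one percentage point.
outliers = by_channel[(by_channel["lift"] - overall_lift).abs() > 0.01]
print(outliers.index.tolist())  # ['email'] underperforms here
```

A channel that moves against the overall result like this is a signal to investigate before rolling out the winning variant everywhere.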

We make sure that the definitions of those channels are always kept up to date via a consistent custom SQL UDF (user-defined function).
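
A shared channel definition, whether implemented as a SQL UDF or a library function, might look like this sketch (the channel names and mapping rules here are hypothetical):

```python
def classify_channel(utm_source: str, utm_medium: str) -> str:
    """Map raw UTM parameters to a canonical marketing channel.

    Keeping this logic in one shared place (here a function; at scale,
    a SQL UDF) ensures every analysis uses identical channel definitions.
    """
    source = (utm_source or "").strip().lower()
    medium = (utm_medium or "").strip().lower()
    if medium in {"cpc", "ppc", "paidsearch"}:
        return "paid_search"
    if medium == "email":
        return "email"
    if source in {"facebook", "instagram", "tiktok"}:
        return "paid_social" if medium == "paid" else "organic_social"
    if medium in {"organic", ""}:
        return "organic"
    return "other"

print(classify_channel("google", "cpc"))  # paid_search
```

Because the function is the single source of truth, updating a channel definition in one place updates it for every downstream analysis.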

The importance of data quality

D.D.: We know that marketing data is needed for experiments with the goal of optimizing conversions. Why is the quality of marketing data critical?

A.K.: Data quality should be part of the overall experimentation and analysis lifecycle. The quality of marketing data is, as you may have guessed, extremely important for our tests. We set up unit tests in our pipelines to make sure that the data is automatically tested against different pre-set parameters.
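
The kind of pre-set checks such pipeline unit tests run might look like this (a hypothetical sketch; the field names and the 1% threshold are assumptions):

```python
def run_quality_checks(rows: list[dict]) -> list[str]:
    """Return the names of failed checks for a batch of tracking rows."""
    if not rows:
        return ["dataset is empty"]
    failures = []
    # Completeness: required fields must be present and non-empty,
    # tolerating up to 1% missing values per field.
    for field in ("user_id", "utm_campaign", "timestamp"):
        missing = sum(1 for r in rows if not r.get(field))
        if missing / len(rows) > 0.01:
            failures.append(f"{field}: more than 1% missing")
    # Validity: the conversion flag must be 0 or 1.
    if any(r.get("converted") not in (0, 1) for r in rows):
        failures.append("converted: values outside {0, 1}")
    return failures
```

Wiring checks like these into the pipeline means a bad batch fails loudly before it ever reaches an experiment readout.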

Our team is responsible for making sure that we have instrumentation in place for data collection and that we only collect data where we have explicit user consent. It is really important to ensure that the data we collect is reliable enough for us, and for our more than 150 stakeholders across Spotify, to make decisions on.

When you are optimizing your landing pages, a massive chunk of the traffic coming to them arrives via marketing channels, which means that for your experiments to work, they must be aligned with your overall marketing strategy.

Usually, different teams follow different naming conventions, which makes identifying those channels more difficult than it has to be.

Consistency is key when you want to analyze by marketing channel; without it, you'd have to create really complicated business logic to categorize the channels.

2 key layers of data quality

D.D.: What aspects of data quality do you consider important for your experiments?

Consistency

A.K.: Besides accuracy, consistency is imperative when you are trying to categorize marketing data under different channel groups, for instance.

It is important for us that all of our marketing teams are aligned when it comes to making sure that the UTM campaign parameters they use are consistent.
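
One simple way to enforce that kind of alignment is to validate utm_campaign values against the agreed convention; the convention used below (market_channel_name) is a made-up example:

```python
import re

# Hypothetical convention: market_channel_name, e.g. "us_social_summer24".
UTM_CAMPAIGN_PATTERN = re.compile(r"^[a-z]{2}_[a-z]+_[a-z0-9]+$")

def invalid_utm_campaigns(campaigns: list[str]) -> list[str]:
    """Return the campaign values that violate the naming convention."""
    return [c for c in campaigns if not UTM_CAMPAIGN_PATTERN.fullmatch(c)]

print(invalid_utm_campaigns(
    ["us_social_summer24", "Summer Sale!", "de_search_q3push"]
))  # ['Summer Sale!']
```

Running a check like this on incoming data surfaces non-conforming campaign names early, before they fragment the channel analysis.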

Timeliness

Apart from that, we also set up alerts via PagerDuty that notify us if any of those datasets are late, which lets us debug and fix problems early.
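
A dataset lateness check of this kind can be sketched as follows (illustrative only; in practice a True result would feed an alerting tool such as PagerDuty rather than just being returned):

```python
from datetime import datetime, timedelta, timezone

def is_dataset_late(latest_partition: datetime,
                    max_delay: timedelta = timedelta(hours=6)) -> bool:
    """True if the newest partition is older than the allowed delay.

    The six-hour default is an assumed threshold; in production, a True
    result would page the on-call analyst.
    """
    return datetime.now(timezone.utc) - latest_partition > max_delay

# A partition from two days ago is well past a six-hour threshold.
stale = datetime.now(timezone.utc) - timedelta(days=2)
print(is_dataset_late(stale))  # True
```

Scheduling such a check to run frequently is what turns silent data delays into actionable alerts.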

Communicating marketing data insights

D.D.: An essential part of successful experiments is activating the insights. You play a key role at Spotify in enabling other teams to optimize the customer journey.

How do you use and communicate marketing data insights to ensure that data helps drive the right decisions?

A.K.: As a centralized optimization team, we have to make sure that we disseminate the learnings from experiments in the most democratized way possible, so our team maintains an open backlog that anyone can submit an idea to. Those ideas are then prioritized alongside everyone else's.

Similarly, after we run experiments, all of the results, whether the experiments were successful or not, are pushed to our knowledge base and tagged with meta tags describing what the experiment related to: a specific page optimization, a region or group of markets, campaigns, the relevant stakeholder teams, and so on. Using these meta tags, anyone browsing the knowledge base can easily filter and find the insights they are looking for.

D.D.: What 3 tips would you share with fellow analysts to get more value from their marketing data and drive successful experiments?

A.K.: As someone once said, you can't optimize what you cannot measure. Similarly, you can't measure with any kind of data unless it is of high quality.

  1. Make sure your data is of the highest quality possible.
  2. Automate all repetitive processes in your analysis and experimentation.
  3. Make sure that the results of those experiments are accessible to anyone in your organization.

About Ashit Kumar:

Ashit Kumar is a User Growth Lead at Spotify. In his day-to-day, he manages data instrumentation, workflows/pipelines, and insight generation for all A/B tests run within his team. While he is very hands-on with technical and marketing tools, he often works with business stakeholders all around Spotify to help them understand the value of experimentation and behavioral data. When he is not thinking about data, he is busy exploring emerging technologies and economic theories.
