How a mature B2B SaaS client drove 1.5x user engagement in 1 month

The client wanted to understand how to get new users more engaged with their product, and Pungo delivered the reporting to drive product and marketing decisions.

  • 20% activation rate increase
  • < 4 weeks to create AARRR reporting
  • 50% more users completing onboarding

Overview

Our client is an established SaaS business in the metal fabrication space. They launched a proprietary quoting product that is 1) a standalone tool for shop estimators and 2) a white-labeled tool embedded in incumbent industry software systems.

They needed a reliable way to quickly understand product adoption and retention that didn’t burden the Engineering team with running SQL queries on top of a production database.

They engaged Pungo Insights to design and implement a data stack that could track a complete user journey and then build alerts and reporting to guide Marketing/Product leaders in making growth decisions.

This data strategy involved adopting a multi-source data aggregation tool (Segment), modeling and reporting on key user events (Amplitude), and developing user cohorts for well-timed Marketing/Product actions (HubSpot).
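As a rough sketch of how a single event flows through a stack like this (the event name, traits, and write key below are hypothetical, not the client's actual schema), a client-side Segment call might look like the following, with Segment fanning the same call out to Amplitude and HubSpot based on the workspace's destination settings:

```typescript
// Minimal client-side Segment instrumentation sketch (hypothetical event and key).
import { AnalyticsBrowser } from '@segment/analytics-next';

// One write key, many destinations: Segment forwards these calls to Amplitude,
// HubSpot, etc. according to the workspace configuration.
const analytics = AnalyticsBrowser.load({ writeKey: 'YOUR_WRITE_KEY' });

// Tie events to a known user so cohorts can be built downstream.
analytics.identify('user_123', {
  company: 'Acme Fabrication', // hypothetical trait
  plan: 'standalone',          // standalone vs. white-labeled deployment
});

// A key activation-stage event in the AARRR journey (name and properties are illustrative).
analytics.track('Quote Created', {
  quoteValueUsd: 4250,
  partsCount: 12,
  source: 'standalone',
});
```

The design point is that product code emits each event once, and downstream tools are configured rather than re-instrumented.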

Action

We broke this project into 3 phases:

1. Data Audit and Prioritization

We assessed the data quality flowing through existing tools and processes and determined the critical gaps to address.

  • Developed a data quality score to assign to each source, along with a prioritization framework
  • Designed a fully-scoped deliverable to showcase the power of a Segment → Amplitude data pipeline: an executive-level Product Metrics dashboard following the AARRR framework
  • Built a collaborative, robust tracking plan to standardize data management and capture cleaner data (a sketch of what such a plan can look like follows this list)
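For illustration only, a tracking plan can be captured as a small, typed catalog of events; the entries below are hypothetical stand-ins rather than the client's actual plan:

```typescript
// Hypothetical tracking-plan entries: one row per event, mapped to the AARRR stage it measures.
type AarrrStage = 'Acquisition' | 'Activation' | 'Retention' | 'Referral' | 'Revenue';

interface TrackingPlanEntry {
  event: string;                        // canonical event name
  stage: AarrrStage;                    // funnel stage the event measures
  requiredProperties: string[];         // properties every call must include
  sources: Array<'client' | 'server'>;  // where the event is emitted
}

const trackingPlan: TrackingPlanEntry[] = [
  { event: 'Account Created',      stage: 'Acquisition', requiredProperties: ['signupSource'],   sources: ['server'] },
  { event: 'Onboarding Completed', stage: 'Activation',  requiredProperties: ['stepsCompleted'], sources: ['client'] },
  { event: 'Quote Created',        stage: 'Activation',  requiredProperties: ['quoteValueUsd'],  sources: ['client', 'server'] },
  { event: 'Quote Exported',       stage: 'Retention',   requiredProperties: ['exportFormat'],   sources: ['client'] },
];
```

A plan like this gives engineering, marketing, and product one shared definition of each event, which is what keeps the downstream data clean.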

2. Data Modeling and Report Building

We broke up the implementation into two-week sprints with weekly demos to cross-functional leaders.

  • Deployed client-side and server-side tracking code alongside the engineering team to start collecting a stream of event data (see the server-side sketch after this list)
  • Built reports in Amplitude and PowerBI to monitor trends and answer questions across data sources
  • Ran weekly discussions with cross-functional leaders to share hypotheses and align metrics with business OKRs
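For the server-side half mentioned above, a minimal sketch using Segment's Node SDK might look like the following (the event name, identifiers, and write key are hypothetical):

```typescript
// Minimal server-side tracking sketch using Segment's Node SDK (hypothetical values).
import { Analytics } from '@segment/analytics-node';

const analytics = new Analytics({ writeKey: 'YOUR_SERVER_WRITE_KEY' });

// Server-side events capture actions the browser never sees, e.g. a quote
// accepted through the white-labeled integration.
analytics.track({
  userId: 'user_123',
  event: 'Quote Accepted',
  properties: {
    quoteValueUsd: 4250,
    channel: 'white-label',
  },
});

// Flush buffered events before the process exits so nothing is dropped.
await analytics.closeAndFlush();
```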

3. Fine-Tuning and Growth Benchmarking

The last two weeks were spent training internal teams to adopt the data tools, delivering additional enhancements, and sharing Marketing/Product benchmarks.

  • Investigated additional second- and third-order insights from the trends
  • Rolled out trainings to meet the 80% report-adoption acceptance criterion

Result

We delivered an AARRR dashboard in < 4 weeks, with a few days to spare. The VP of Growth now has visibility into drop-off across the customer life cycle and a structured way to set product improvement targets.

  • Readjusted onboarding customer messaging to drive a 20% increase in activation rate
  • Unlocked greater visibility into feature adoption to supplement customer interviews and heat map analysis; a core product feature was streamlined as a result
  • The client was satisfied and continues to work with Pungo on a project-by-project basis!

Key Learnings

Stack-rank Data Sources and Scope Out an End-to-End Deliverable

Early-stage SaaS companies don’t want to invest in a robust data lake out of the gate; they want access to reliable insights. They’re likely already running with 10+ data sources but have no way of constructing a full user journey, so they’re used to looking at data in silos.

In scoping out a first deliverable, pick no more than 3 data sources to shorten the time-to-value cycle and drive towards reporting on the most useful growth metrics. Work with your client to see if the AARRR, Growth Accounting, or any other framework would answer their most pressing questions first.

Leave Time for Adoption

Leaving 1-2 weeks to drive adoption was critical to making sure data definitions and reports had the stakeholders’ vote of confidence. Point out common traps that undermine data quality and guide stakeholders to take action.

There are always adjustments that only get revealed once users begin using the data tools. Our acceptance criterion is at least 80% adoption before we consider a deliverable a success. Monitoring and continual improvements follow with more time and usage.