Data Studio

A 12-week sprint to stabilize a shaky beta, then define a post-beta vision for the group to work toward


De-risked beta adoption by fixing the biggest drop-offs first


Increased dataset saves from ~60% to ~75%


Defined a performance improvement roadmap for post-beta


Aligned the product group around a post-beta vision


I joined Data Studio 12 weeks before beta to stabilize adoption, and then help guide the team toward a more ambitious post-beta vision. I redesigned core flows to de-risk the beta and make sure users could reliably get value from the product. Dataset saves increased from ~60% to ~75% with no changes to backend architecture. Post-launch, I led the team in defining a vision we could work toward iteratively in order to scale the product's impact.

Leadership asked me to move to the Data Studio team to support the upcoming beta launch

I had proven through my previous major projects that I could move into complicated spaces, quickly turn a product around, and then scale it up to something more robust. Leadership asked me to move to Data Studio to support the launch and set the vision for the future of the product post-beta.


Data Studio was built on inherited technical foundations and wasn't getting traction with early users. It was too difficult to understand, and as a result people couldn't reliably create and save datasets — meaning they were getting no value from an expensive product.


  • Multiple points along the core user flow were experiencing drop-off above 40%

  • We were positioning this tool for marketers, but it was originally designed for people in data ops roles

  • Long processing times made the product feel heavy and unresponsive

  • Engineering and design were fixing immediate issues without a clear North Star

Too technical

Misaligned first impressions

Slow performance

Unclear product direction

Lead Designer

Role

HubSpot

Company

12 weeks

Duration

8 engineers, TLs, PM

Team size

Web; B2B SaaS

Platform

Conversational design

Complex user flows

Technical architecture

Cross-team alignment

Focus


What I did

  • Mapped the end-to-end user flow from the home page to the dataset builder to identify and resolve drop-off points

  • Made it easier for users to find and combine their data without having to understand SQL terminology

  • Diagnosed the slow performance issue and drafted a list of user flow changes for us to pick up later, after stabilizing the core beta experience

  • Aligned design and product direction across 4 adjacent teams

What changed

  • A technical product became one with approachable language and task-based layouts that let people focus on what they wanted to achieve, not how to achieve it

  • People got insights from Data Studio in less time and with higher trust in the outputs

  • More users successfully saved datasets and powered downstream activation paths (~60% -> ~75%)

  • The product group had a well-structured vision to move toward

40% of users were dropping off before getting through the home page

To understand why, I mapped every path a user could take through the home page. Users had to navigate a maze of over 100 nodes and dozens of decision points.


I coordinated with an adjacent team to ship a set of navigation and information architecture changes, getting more users to the dataset builder with fewer distractions along the way.

The app felt slow, even for a data-heavy product

Early feedback showed that the app felt slow. I proposed a new flow that improved perceived performance by front-loading setup tasks and batching slow steps. Engineering felt this new flow required more time to define and implement than we were ready to commit to for beta, so we planned to pick the work up after stabilizing the core beta experience.

Data combination was too difficult

The app forced people to think in terms of “internal vs external”, which didn't match how they thought about their data. People weren't able to find the data they needed, and those who could had to understand SQL concepts to combine it into a dataset.


≈45% of users dropped off at the data combination step, either bouncing or repeating it multiple times in quick succession. People were guessing their way through and didn't trust the outputs. This was a major blocker because datasets powered monetized downstream activations like triggering a workflow or a CRM sync.

The app's design exposed engineering complexities to users

To resolve the steep drop-off, I reframed the problem from “make joins better” to “design a data combination surface”. First I researched SQL join mechanics, then validated constraints with engineering and translated the outgoing UX into a task-based combination flow with conversational language. No SQL terminology.


I rewrote the UI in plain language, then consolidated the controls into a compact side panel so the primary workflow stayed focused and conversational.

Simplified data combination

Datasets powered monetized activation paths like workflows and CRM changes, so getting data combination right was crucial.


I redesigned the experience around task-based layouts that spoke to users in language that was more approachable.

Test, align internally, then ship

Users in the testing cohort were able to use the new design to correctly combine data from multiple sources in just a few seconds (compared to >1 minute with the outgoing design). These were marketing users who did not understand SQL.


I shared the results of my usability studies with the wider product group to get buy-in, then mapped the new UI against the outgoing version. This proved feature parity and helped engineering build quickly.

A de-risked beta with room to grow

Throughout our beta period, I drove changes to key areas of the product that moved our metrics in the right direction as we approached public launch.



Drop-off on the homepage decreased from ~40% to ~30%, and dataset saves increased from 60% to around 75%.

Outcomes & reflections

I was asked to join a new team, stabilize Data Studio just before launch, and then grow it into a mature product. Over my first 12 weeks, I resolved multiple key drop-off points, defined a performance improvement plan for post-beta, and increased dataset saves from ~60% -> ~75% by redesigning data combination.


Post-launch, I played a key role in a product-group workshop led by group leadership. I worked with another designer on a more sweeping product redesign and rallied the rest of the group around that vision.

De-risked beta adoption by fixing the biggest drop-offs first

Defined a performance improvement roadmap for post-beta

Increased dataset saves from ~60% to ~75%

Aligned the product group around a post-beta vision

What came before?

I led a 6-month turnaround of HubSpot’s Global Search platform, unifying two legacy apps, rebuilding team operations, and establishing an AI-ready foundation. That work set me up to take on the Data Studio beta in a more technical product space.

What came next?

My focus shifted to scaling the experience while influencing design for the product as a whole, across multiple teams.
