The problem: Retail conversion was dropping and the quote flow had too much friction.

The Get a Quote (GAQ) flow had been built up over time, screen by screen, without anyone really owning the full experience. Even for customers with the most straightforward setup, the journey had quietly become heavy. More steps, more copy, more complexity, and no one had really stopped to question it.

By August 2024, conversion was at 3.2%. Our target was 5.6%. Something wasn't working.


The approach: Two radical directions, one sprint, one designer leading each.

The team decided that instead of iterating slowly, we'd compress everything into a focused sprint week: hackathon vibes, heads down, no distractions. The goal was to make a real step change, not a polish pass.

Each designer led their own variant. The first designer (Variant A) kept the multi-page structure but made it more ambitious: surfacing live pricing, spring seasonality data, and making a much clearer case for "now is the best time to join Amber." Mine (Variant B) went the opposite direction entirely. I stripped the whole thing back into a single page, betting that simplicity at its most extreme would convert better. We ran both in a 50/50 A/B test from October and let the data decide.


The outcome: Variant B edged ahead, but the real work it unlocked was downstream.

Over three weeks, my one-pager (Variant B) edged ahead of the multi-page flow (Variant A), but the session replays revealed something more interesting: users were clicking through to the sign-up form at a higher rate, then stalling inside it. Half of all drop-off happened on the form's first two pages. The simplified flow had gotten people interested, just not informed enough to commit.

That was enough reason to move on to Variant A2, a new iteration that aimed to balance the strengths of both variants, supported by a pre-sign-up page introduced to answer common questions before people hit the form. Both changes came directly from what the one-pager surfaced.

~50%
of sign-up form drop-off happened on the first two pages

learnings... and what next?

Simplicity has a ceiling

The one-pager got more people to the sign-up form. But ~50% of them dropped off on the first two pages: people arrived without enough context to feel ready to sign up. It worked as a hook, not a closer. The insight wasn't that simplicity is wrong; it's that it needs to be matched with confidence-building at the right moment.

One problem, two fixes

The one-pager findings pointed at two separate problems, the results flow and the sign-up form, and both got worked on. Variant A2 refined the flow; a pre-sign-up page tackled the confidence gap before the form. Running a radical experiment didn't just give us a result, it gave us a roadmap.

Data tells you what, replays tell you why

The A/B numbers showed Variant B was ahead. The session replays showed users spending more time on inputs than on the GAQ itself: lingering, not convinced. Without watching real people use it, we would have called it a win and moved on too early.

Ship it and find out

The instinct going into the sprint was to test something radical rather than iterate slowly. That turned out to be the right call, not because the one-pager won, but because running it taught us more in three weeks than months of smaller changes had.