Part II. Validation

It was a cold January morning in 2012, almost four months after we’d launched Lean Canvas as an online tool. I was reviewing our weekly product metrics, coffee in hand, as I always do on Monday mornings.

There it was again.

I had been tracking a disturbing trend for four weeks in a row—our activation rate was progressively dropping.

We define activation as a user completing their initial Lean Canvas. This is a critical milestone metric because it serves as a leading indicator of ongoing engagement. Users who complete their Lean Canvas during their first week usually come back and explore more of the product. Those who don’t almost never return.

Today, our activation rate was hovering at just under 35%—down from a peak of 80% right after launch. That meant that for every 100 people who signed up, 65 would likely never come back!

What was even more concerning was that we had been aware of this issue for weeks, and we hadn’t been sitting idle. During the last four weeks, I had asked my designer to implement several usability enhancements addressing steps in the activation flow where we saw the biggest drop-offs.

But none of them seemed to be making a difference, and things were still getting worse. No matter what we did, we seemed unable to pierce a certain ceiling of achievement (Figure II-1).

Figure II-1. The typical life cycle of an experiment
