Small Failures: The Path to Big Successes
One principle of Lean UX is to either get user validation or fail as early as possible. Failures teach you what's not working in your product concept or what could work better. These failures may be small things that you can easily fix—such as a button label that users don't understand or a link color that doesn't suggest the link is clickable. Or a failure might involve an entire array of features that create more confusion and frustration than the initial problems that you intended them to solve. Learning and applying such lessons early on helps you to build a better product.
While using failures to create a better product may sound great, how can you do this and keep your job in the process? In my experience, it’s best to approach the product-design process as a series of small, low-risk experiments.
Introducing Experimentation into the Design Process
Companies have only finite time and resources to design and build solutions—especially companies with small teams like ours at Bloomfire. Our team already knew that thoughtfully building in opportunities to collect user feedback earlier in the product-design process would help us to learn more quickly
- whether we were building tools that our customers and users actually need
- when it’s time to stop investing effort in an approach that’s not working and go back to the drawing board
Our Product and Engineering teams are moving away from putting out releases that deliver large features after lengthy development cycles to building new features iteratively, over successive lighter-weight releases. So we wanted to experiment with how to introduce the validated learning and build-measure-learn approach of Lean UX into our new process. Updating an in-app Analytics page offered the perfect opportunity for us to do this.
When Defining the Problem Is the Problem
Lean UX bounds the problem that a team will solve in a particular release as a minimum viable product (MVP). Here’s how we did that for Bloomfire, a social, knowledge-sharing platform that lets companies tap into the collective knowledge of their employees. Knowledge sharing can take the form of sharing files, asking questions and getting answers, solving problems, locating information, and identifying team members who are subject-matter experts.
Our Analytics page already shared a lot of high-level, general information about what was happening within a user community. Most of this information was available to all members of a community—with just a few visualizations that were available only to administrators. Because of this feature’s broad audience, it didn’t offer much in-depth data or the ability to drill down to information about what specific members were doing or to see the level of engagement around individual pieces of content.
However, within the last year, our market focus has shifted from small businesses to enterprise customers. Based on our conversations with the Customer Success and Sales teams, we knew that there were major pain points for existing users who administered their company's user community. While we knew a lot about the needs of our small-business customers, we wanted to collect more feedback about how our new enterprise customers were using Bloomfire and what essential information about their communities was either missing or difficult to find.
We kicked off the project with the understanding that any new Analytics feature would need to evolve and adapt quickly once we had launched it.
We decided that our MVP would not include any data visualizations; instead, we would focus on raw-data reports. This would allow us to modify the data that each report included—based on customer feedback—without much development time. Since the ways different customers use the product can vary so greatly, our MVP also focused on letting users export data to CSV, so they could download their reports, then customize them in whatever way they wanted outside of Bloomfire. Though we had decided to focus on creating and exporting raw-data reports, we weren't sure which reports to create first or what data they should include.