A competitive gap and a fragmented portfolio
Wrike had a reporting and analytics portfolio that had grown faster than it had been designed. By the time I joined, it spanned legacy static visualizations, tabular reports with limited filtering, and a powerful but complex advanced analytics suite. The solutions did not work well together, and executive escalations from top customers were pulling the roadmap away from typical users.
There was also a competitive problem. Wrike’s SMB and mid-market customers, the largest segment to retain and expand, had no access to visual dashboards while every major competitor offered them. For prospects evaluating work management tools, sub-par reporting was a disqualifying factor. With a new company strategy forming to expand even further down-market, a modern, simple reporting tool became critically important.
| Platform | Dashboards | Visualizations | Custom Widgets | Use Cases |
|---|---|---|---|---|
| Wrike (before) | Yes | Enterprise only | Enterprise only | Yes |
| Monday.com | Yes | Yes | Yes | Some |
| Asana | Some | Some | Yes | Some |
| Smartsheet | Yes | Enterprise only | Enterprise only | Yes |
| ClickUp | Yes | Yes | Yes | Some |
We analyzed established and rising collaborative work platforms using publicly available information and hands-on product exploration.
Disproving my own assumptions
My first move was research, and my most important finding was that my initial hypothesis was wrong. Working with my expert product analyst, we dove deep into usage patterns and product enhancement request history. I expected behavioral data to show that deep customization was primarily an enterprise need. Instead, users across all segments were heavily customizing templates, and the underlying analyses clustered around a consistent set of patterns. This was not a power-user problem. It was a universal one.
I followed the behavioral data with a structured needs study created in partnership with my designer: a broad survey across segments and in-depth interviews watching users work through actual reporting workflows. Customers were unambiguous about their frustration, saying the standard reports “are useless for me, they are just a static list of information” and that the advanced analytics tool has “a lot to learn and it’s a bit daunting unless you’re a data analyst.”
The survey data reinforced the behavioral findings: 87% of users said creating custom widgets was important, and 85% said filtering was critical. But 53% also said pre-built widgets mattered, which told me the solution needed to enable both quick setup and easy customization.
Why we built instead of bought
The research made one thing clear: we needed a fundamentally different product, not a slimmed-down version of our enterprise solution. My predecessor’s approach had been to strip features from the existing vendor-powered enterprise solution and offer the result to lower-tier customers. It was the path of least resistance.
After evaluating the options, I ruled it out for three reasons. The operational cost of maintaining the vendor integration was unsustainable, requiring 4–6 weeks of QA, SysOps, and BI engineering effort every quarter with minimal visible benefit to customers. The vendor tool carried a steep learning curve that conflicted with our growth strategy. And our competitors were investing in modern, intuitive experiences that a reskinned enterprise tool simply could not match.
Building natively meant taking on real complexity. It also meant we could design the experience from scratch around what users actually needed, and own the roadmap entirely going forward.
To be thorough, we ran a parallel workstream while I was evaluating options. One of my engineering teams focused on optimizing our Snowflake compute costs, testing whether the unit economics of the enterprise solution could work at scale across all plan tiers. Even with meaningful reductions, the numbers didn’t support a profitable business case. That confirmed the build decision: we weren’t just choosing a better user experience, we were choosing the most financially sustainable path.
Focused subject areas: designing for the decision, not the data
The central challenge was a fundamental conflict: users wanted everything: all projects, all portfolios, every productivity measure, instantly. My architects said that was impossible at scale without unacceptable latency or hard limits on input data.
Rather than negotiate between user wishes and engineering constraints, I reframed the problem. What users needed depended entirely on context. A team manager tracking daily task completion needed fine-grained current data. A program manager reviewing delivery health needed historical trend data. These were not the same analysis.
I developed a focused subject areas approach: instead of asking users to navigate a raw data schema, the tool first established context (what kind of work, at what level) and then guided them to the right metrics and visualizations. The goal was to focus the user on the decision to be made, not the data available.
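The context-first idea can be sketched in a few lines of code. This is a minimal illustration, not Wrike's actual schema: the subject-area names, metrics, and granularity labels below are all hypothetical, chosen only to show how establishing context first narrows the user's subsequent choices.

```python
# Illustrative sketch of a "focused subject areas" model. All names and
# mappings are hypothetical examples, not the product's real data schema.
from dataclasses import dataclass


@dataclass
class SubjectArea:
    name: str                # the context the user picks first
    metrics: list[str]       # only the metrics valid in this context
    granularity: str         # "current" vs. "historical" data needs


# A team manager tracking daily completion and a program manager reviewing
# delivery health need different data; each context exposes a different slice.
SUBJECT_AREAS = {
    "tasks": SubjectArea("Tasks", ["completed", "overdue", "cycle_time"], "current"),
    "portfolio": SubjectArea("Portfolio", ["budget_vs_actual", "delivery_health"], "historical"),
}


def available_metrics(area_key: str) -> list[str]:
    """Surface only the metrics that fit the chosen context,
    instead of the full raw schema."""
    return SUBJECT_AREAS[area_key].metrics


print(available_metrics("tasks"))      # ['completed', 'overdue', 'cycle_time']
print(available_metrics("portfolio"))  # ['budget_vs_actual', 'delivery_health']
```

The design point is that the user never sees metrics that are invalid for their context, which is what keeps the workflow fast for non-analysts.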
The solution was built around five guiding principles.
From concept to working prototype
Early Figma wireframe, developed collaboratively with my designer: KPI summary cards, task breakdown charts, and a live project task list in a unified layout. Annotation arrows show designed interactions, including filter behavior and inline status editing.
The subject areas concept in action: users select a data source, choose what to report on, pick metrics, and apply breakdowns, in that order. The chart type selector offers flexibility without overwhelming. This guided, context-first workflow enabled users to build visualizations in seconds.
A finished analytics board for the Portfolio subject area: budget vs. actual spend by key result, with project-level status and progress tracking below. Built entirely with the native widget editor.
Tested and loved
Seven of eight prototype testers completed the task independently on first contact with the tool. The only tester who did not had never used any visual analytics tool before, not even Excel.