Marta Fioni

Empowering Paddle’s new users with data about their subscription business.  





Paddle is a payment platform for selling software globally, handling taxes and compliance across international markets with no extra effort from the seller. Historically, many Mac app developers used Paddle to sell their software outside of the App Store, as a one-off purchase.

In 2019, Paddle raised a Series B to ‘cross the chasm’ and target the high-growth market of Web-SaaS companies.

Paddle’s product team was therefore under significant time pressure to find product-market fit with this new type of customer.

👩‍💼 Phase one: identifying true user needs


One big challenge was understanding the needs of customers who were not yet using the platform, and determining the most important features they required. We also needed to test which of our preconceived ideas about these customers were valid and helpful, and discard the rest.

To uncover these customer needs, as well as the purchase drivers and pain points that would get decision makers to switch to a different payment solution, I ran several exploratory customer interviews with CEOs and executives of Web-SaaS companies.


Customer interviews with Web-SaaS prospects.

After a few interviews, patterns and recurring themes started to emerge, and product needs clustered around specific areas:

  • In-line checkout customisation and performance
  • US tax compliance
  • Payment acceptance
  • B2B manual invoicing
  • Dunning
  • Flexible subscription plans
  • Access to subscription data

This rough list of needs was distilled and refined into a customer profile, and into a value proposition and product roadmap that could support it.


Some of these needs turned out to be fairly easy to address, whether through services, small hacks, or new features built on top of the existing Paddle product.

This was not the case, though, for access to subscription-specific data insights. During discovery, it became clear that prospective customers considered this feature table stakes, yet it was completely missing from Paddle’s current product, while being very mature in competitors’ products.

The huge discrepancy between users’ needs and Paddle’s product made it necessary to run research focused specifically on data needs. There were myriad metrics and visualisations available for subscription-based businesses, and we knew we couldn’t possibly deliver all of them. At the same time, simply asking customers which metrics they wanted to see gave us a long laundry list, with little sense of priority or importance. We knew which metrics users might want to see and how, but had no idea what truly mattered to them.

Since we couldn’t possibly build everything suggested, we needed to prioritise. I used two classic design research techniques to help us do that:

  • Stimulus materials to provoke reactions during user interviews
  • Ethnographic observation techniques to understand what users did in the absence of data


Legacy design used in user interviews and workshops. 

Discovering the ‘hacky’ solutions current Paddle users had in place to compensate for the lack of subscription data turned out to be a very useful insight, one that influenced some of my design decisions down the line. Some users relied on external applications that extracted data from Paddle’s API, some received manually generated spreadsheets from Paddle’s customer service, and others collected the data by querying their own databases.

I concluded that having in-product information was not the only way to receive and consume this data, and also that some metrics included in the designs, such as MRR (Monthly Recurring Revenue) and churn, were considered essential by all users, while others, such as ARPA (Average Revenue Per Account), were just nice-to-haves.

I also learnt how important it was for the most data-literate users to have access to well-structured raw data rather than to data visualisations.

Figma and Framer user interview.



🛴 Phase two: exploring divergent solutions, and designing my way to an MVP


Following a parallel process, I used the same legacy designs as support material for my first meetings with the team’s engineers. I ran a few workshops to understand dependencies and technical blockers, and learnt that the biggest effort would go into calculating the missing, complex numbers. Calculating essential subscription metrics like MRR required refactoring Paddle’s old data stack, plus adding new, complex calculations on top.
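To make the calculation concrete, here is a minimal sketch of how MRR can be derived by normalising each active subscription to a monthly amount. This is an illustration only: the types, field names and simplifications are my assumptions, not Paddle’s actual data model.

```typescript
// Minimal MRR sketch: normalise every active subscription to a monthly
// amount, then sum. Types and field names are illustrative, not Paddle's.
type BillingInterval = 'month' | 'year';

interface Subscription {
  status: 'active' | 'trialing' | 'cancelled';
  price: number;             // amount charged per billing interval
  interval: BillingInterval;
  quantity: number;          // e.g. number of seats
}

function monthlyRecurringRevenue(subs: Subscription[]): number {
  return subs
    .filter((s) => s.status === 'active') // trials and cancellations excluded
    .reduce((mrr, s) => {
      // Annual plans count for one twelfth of their price each month.
      const monthly = s.interval === 'year' ? s.price / 12 : s.price;
      return mrr + monthly * s.quantity;
    }, 0);
}
```

Even in this toy form, it hints at why the work was costly: a production version would also have to account for proration, discounts, refunds and multiple currencies, across the whole historical dataset.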

Showing this newly calculated data in a new UI, as originally planned, on a new FE stack was an additional, non-trivial effort.


Workshops with engineers.

After this first set of user and technical discovery sessions, I summarised my findings and then ran an ideation workshop with the team and internal stakeholders.

My aim was to invite them to suggest as many divergent, out-of-the-box solutions as possible, then refine the list, looking for alternative approaches informed by user insights and by the new information that emerged during technical discovery. The group came up with four divergent approaches:

A. An email with the most important number (MRR) sent every week
 
B. A daily-updated CSV export of raw data, for users to calculate their own metrics

C. The ‘legacy idea’: in-product subscriptions dashboard with graphs and new ‘ad-hoc’ metrics

D. An API service to easily and reliably automate the export of raw data, to be visualised with a third-party tool


Impact/Effort matrix used to compare divergent solutions.

After ranking these solutions by effort and value, I suggested starting with A: it was the lowest-effort option, and an incremental step towards the most valuable solution, the in-product dashboard, which we all knew required far more effort.

From what I had learnt in the user interviews and from the technical discussions with the team, I knew that once the BE could successfully calculate MRR, the single most important and valuable number for a subscription business, we could then surface it in different ways.

Displaying the headline MRR number in an email first was a way to reduce effort: it proved we could calculate the data in the BE, while still delivering value to the end user.

Once this step proved successful, we could move to a second phase, where the same data would be displayed in the product (solution C).


MVP email design.

After deciding to work on the MRR email as a first step towards the dashboard solution, we went from ideation to release in a matter of weeks.

The email was released as a ‘weekly pulse’ and proved a success. With a 60% open rate on the day of the first release, and a stable 30% open rate over the following three months, it was turned from a test into a product feature Paddle’s users could opt into.


🏃‍♂️ Phase three: showing it’s possible, and making it happen, one small iteration at a time


With the big effort of calculating the MRR data behind us, I designed the next iteration as a test aimed at displaying in-product exactly the same information contained in the email. I designed a new, empty tab with a ‘coming soon’ message in the main overview, to create a ‘beta-like’ feeling around the product.
My goal was both to motivate the team and to grab the attention, and the ‘forgiveness’, of Paddle’s current users.




From the empty subscription tab to the ‘complete’ product, in weekly iterations. 

Initially, I designed a ‘coming soon’ message, then a component with the MRR number but no graph, and eventually the number with the first graph component. This incremental design was the output of a series of sessions with the team, in which we ‘sliced’ the design into the smallest possible releases, so that we could ship even minimal user value at a timely, demoable, weekly cadence.

Once the first graph was released, users gave us the attention I had hoped for: Paddle’s customers tweeted about the release, and clearly expected more to come.


A tweet from a user after the MRR graph release. 

I approached the design of the next iterations as constant teamwork, iterating and adding details as customer feedback became more precise and frequent. I often paired with internal data analysts, as well as with front-end engineers, to refine my first designs.

Using this method, in a couple of months we were able to ship a user-tested, improved version of the ‘full-fledged’ legacy design we had started from, the one that had once seemed impossible to ever achieve.


Early designs and sketched iterations.  

As the product matured, I dedicated particular attention to design details such as tooltips and date pickers, rolling them out gradually on top of the ‘basic’ design components. This approach helped me and the team test those components against real data and edge cases (e.g. missing data, very large values, uneven date ranges).
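To give a sense of those edge cases, below is a minimal sketch of the kind of guards a graph component needs before rendering. The types, thresholds and fallback behaviour are hypothetical, not Paddle’s actual implementation.

```typescript
// Illustrative guards for chart data: handle missing points, very large
// values and series too sparse to draw. Not Paddle's actual code.
interface DataPoint {
  date: string;          // ISO date, e.g. '2020-03-01'
  value: number | null;  // null marks a missing data point
}

// Compact large values so axis labels and tooltips stay readable.
function formatValue(value: number): string {
  if (Math.abs(value) >= 1_000_000) return `${(value / 1_000_000).toFixed(1)}M`;
  if (Math.abs(value) >= 1_000) return `${(value / 1_000).toFixed(1)}k`;
  return value.toFixed(0);
}

// Filter out missing points; if fewer than two remain, fall back to an
// empty state instead of rendering a broken line.
function prepareSeries(points: DataPoint[]): DataPoint[] | 'empty-state' {
  const usable = points.filter((p) => p.value !== null);
  return usable.length >= 2 ? usable : 'empty-state';
}
```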


Tooltips were a key design detail for the MRR movements graph. 

I refined microinteractions for the date pickers, pairing with FE engineers.



📈 Phase four: scaling the design system, by re-designing a legacy data product


Once the goal of providing subscription data to Web-SaaS businesses was reached, I suggested extending the new data visualisation design system to a redesign (and parallel refactoring) of the Overview page: by far the most viewed area of the dashboard, used daily by a mix of legacy and new Web-SaaS Paddle users.


Paddle’s Overview before and after its redesign.

Well aware of the importance of this page, and of how influential the feedback from current power users would turn out to be, I included in the redesign a direct way to get feedback from users, through a banner and a survey. Once again we received a constant flow of positive and negative feedback, which I used to plan the next design iterations.



In-product message to gather direct user feedback.

I ran several sessions with the team to review and prioritise the different pieces of user feedback, and to incorporate as much as possible into the design of the next iterations. One aspect users were very vocal about, and that was relatively simple to iterate on, was responsiveness.



Sample of user feedback requesting responsiveness, and my first design to solve for it.

Other pieces of feedback turned out to be harder to incorporate, because they required a lot of BE effort for unclear value. For example, long-time power users lamented the lack of some metrics they were used to receiving in the previous version, but which we considered non-essential and very costly to calculate.

The new Overview, redesigned using the new data design system. 

Rolling out this new system to redesign an old part of the product came with some challenges, mainly around managing power users’ expectations while still containing the scope of the project.

Overall, though, the improvements to the product’s user experience turned out to be significant, and certainly outweighed the downsides.

From performance improvements (page load time went from up to 7 minutes to a few seconds), to data accuracy, to a more positive overall impression during sales product demos, Paddle’s new data design system effectively contributed to gaining product-market fit with Web-SaaS customers, well beyond the specific new feature it was first designed for.


Paddle’s new data design system.