Measuring the success of your software

By Yianni Stergou

22 September 2020

Software Development


Over the past 9 years, we've built plenty of software products. For the majority of that time, our mindset mirrored that of the industry: we built to our clients' requirements. This created a blind spot once development was complete. Did the solution we built actually enable our clients to reach their goals?

On paper, it sounds fair enough. You pay someone to build a house and as long as they build the house you designed, you wouldn’t expect them to check in and see if you like it...

But as an industry and a company, we felt we had to do better. Rather than building solutions blindly, we're shifting the focus to building successful products. Out of this came a new concept: product success consults.


How do we define success?

It would be absurd to try to come up with a definition of success that applies across every project. Success for project A is completely different to success for project B. So, we set success criteria on a per-project basis. This is generally done at the start of a project, then updated as we progress and as business demands change.

How can it be measured?

This is a tough question to answer, and it generally depends on the type of goal you're trying to measure. Quantitative goals can be measured with analytics tools; we'll discuss these tools in more depth shortly. Qualitative goals are harder to measure, but can usually be captured through user testing and more open-ended survey methods.

With background knowledge of our clients' applications, business goals and success criteria, as well as experience using analytics tools, we've found ourselves in a great position to capture and synthesise product data. Product owners can be incredibly busy managing the application and helping deliver internal business goals, so this is rarely something they have time to do themselves. To ensure we're making data-driven decisions, we experimented with a few different formats, and what came out of that experimentation was product success consults.

Product success consulting

These consultations take the form of a one-hour meeting scheduled every four weeks. Prior to the consultation, WM reviews the analytics collected over the previous four weeks. Afterwards, WM delivers a Product Success Report detailing everything observed and discussed, along with the actions set. In effect, it's a short meeting to run through the most recent data and set actions based on it. For the consults to be effective, we must first set up analytics tools. These are usually identified at the scoping stage and chosen based on what data needs to be captured.

Supporting tools we recommend

Smartlook

Smartlook is great at recording event-based measurements. An event might be a user pressing a button to request a consultation, or the completion of a payment: important actions that you want visibility of. Smartlook also has a session recording feature if you want a more detailed look at how users behave.
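
As a rough sketch, tracking a custom event with Smartlook's web SDK looks something like the snippet below. The event name and properties are our own illustrative choices, and we assume the standard Smartlook snippet is already installed on the page.

```typescript
// Assumes the Smartlook tracking snippet is installed, which exposes a
// global `smartlook` function.
declare const smartlook: (command: string, ...args: unknown[]) => void;

// Track a key action: a user requesting a consultation.
// The event name and properties below are hypothetical.
smartlook('track', 'consultation_requested', {
  source: 'pricing_page', // illustrative property
});
```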

Google Analytics & Firebase

These two are pretty well known. We're fans of them for more general usage analytics, for example return visitors versus new visitors, and average user session data.
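
Beyond the automatic page tracking, custom events can also be logged. A minimal sketch using the gtag.js snippet is below; the event name and parameters are illustrative, not a prescribed schema.

```typescript
// Assumes the gtag.js snippet is installed, exposing a global `gtag` function.
declare const gtag: (command: string, ...args: unknown[]) => void;

// Log a payment completion with contextual parameters (hypothetical names).
gtag('event', 'payment_completed', {
  value: 49.0,
  currency: 'AUD',
});
```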

Hotjar

Hotjar has a great heat map feature. A heat map is a static picture of a page with click data overlaid, showing which calls to action have successfully attracted your users' interest. By varying the order or design of your calls to action, you can use heat maps to evaluate each variant's success.
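
One way to support that comparison, sketched below, is to fire a Hotjar event when a particular call-to-action variant is shown, so heat maps and recordings can be filtered by variant. This assumes the Hotjar snippet is installed, and the event name is hypothetical.

```typescript
// Assumes the standard Hotjar tracking snippet is installed, which exposes
// a global `hj` function.
declare const hj: (command: string, ...args: unknown[]) => void;

// Tag this session so heat maps/recordings can be filtered by CTA variant.
// The event name is hypothetical.
hj('event', 'cta_variant_b_shown');
```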

CRM Integration

Integrating an application with an existing CRM like Pipedrive or HubSpot can align the product with business objectives. For example, if the product facilitates B2B sales but those conversions happen outside of the application itself, a CRM integration can create visibility over them.
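
As a minimal sketch of what that can look like, the snippet below pushes a deal into Pipedrive's public deals endpoint when a lead is generated in-app. The deal title, value, currency and token handling are all illustrative.

```typescript
// A sketch: create a deal in Pipedrive when a qualified lead is generated
// in-app. Field values below are illustrative.
async function createPipedriveDeal(
  apiToken: string,
  title: string,
  value: number,
): Promise<void> {
  const res = await fetch(
    `https://api.pipedrive.com/v1/deals?api_token=${apiToken}`,
    {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify({ title, value, currency: 'AUD' }),
    },
  );
  if (!res.ok) {
    throw new Error(`Pipedrive deal creation failed: ${res.status}`);
  }
}

// e.g. after a B2B enquiry is submitted in the product:
// createPipedriveDeal(PIPEDRIVE_API_TOKEN, 'Enquiry: Acme Pty Ltd', 5000);
```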

The benefits of measurement

Perhaps the biggest benefit of measuring product performance is the ability to make data-driven decisions, greatly reducing the risk of building on inaccurate assumptions.

These decisions become critical when iterating on the first version of your product and adding new features. The data can show which features are needed most, or the impact a new feature has on engagement. We like to think that every step taken is a step forward, but the data helps prove (or disprove) that.


ABOUT THE AUTHOR

Yianni Stergou
