With most savvy ecommerce brands now augmenting their testing tools with UX analytics solutions to determine which areas of their websites most need improvement, a question arises: just how much time and money is being wasted by A/B testing alone, without UX insights?
More testing isn’t always best
The vast majority of ecommerce teams still waste more time than necessary testing absolutely everything when, instead, they should prioritise their tests, running only those they are confident will win or those that will have the greatest impact on conversion or the user experience. So why are ecommerce teams still investing time, money and effort in untargeted A/B testing? If the answer is “this is what we’ve always done” or “because my manager told me to”, then they really should stop – right now.
A/B testing can be a costly investment if it is not used wisely. For example, do not A/B test a broken link – just fix it! This may seem like obvious advice, as fixing broken links and checking 404 logs regularly should be standard practice by now.
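Checking 404 logs need not be laborious. As a minimal sketch – assuming access logs in the common web server format, with a hypothetical `broken_paths` helper – a few lines can surface the most frequently hit broken URLs so they can simply be fixed rather than tested:

```python
import re
from collections import Counter

# Match the request path and status code in a Common Log Format line.
LOG_LINE = re.compile(r'"(?:GET|POST) (?P<path>\S+) HTTP/[\d.]+" (?P<status>\d{3})')

def broken_paths(log_lines):
    """Return 404'd paths, ordered by how often they were requested."""
    hits = Counter()
    for line in log_lines:
        m = LOG_LINE.search(line)
        if m and m.group("status") == "404":
            hits[m.group("path")] += 1
    return hits.most_common()

sample = [
    '1.2.3.4 - - [10/Oct/2023] "GET /sale HTTP/1.1" 200 512',
    '1.2.3.4 - - [10/Oct/2023] "GET /old-promo HTTP/1.1" 404 0',
    '5.6.7.8 - - [10/Oct/2023] "GET /old-promo HTTP/1.1" 404 0',
]
print(broken_paths(sample))  # → [('/old-promo', 2)]
```

Running a check like this on a schedule keeps broken links off the testing roadmap entirely.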
So, what is needed to truly innovate? Simply for ecommerce teams to take the next step: not just finding and fixing the broken sections and links on a site, but also identifying which areas of the site have recurring problems that may be annoying visitors. But traditional analytics can only tell ecommerce teams what customers are doing on their site, not why or how they’re doing it.
Traditional tools can also require analyst and IT teams to spend weeks delving into all of the information gathered in order to find the specific answers that the ecommerce team needs to make their decision.
But as most ecommerce, sales and marketing teams will understand, time is money. Brands should instead look for tools that display aggregated user journeys visually, enabling them to understand why customers are leaving their site as well as measure the revenue and behavioural contribution of any ‘block’ of content. There’s a huge need to understand golden or broken customer journeys, feed actionable insights into test hypotheses and recognise why tests are winning or inconclusive.
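In data terms, “aggregated user journeys” means collapsing many individual page sequences into counts of page-to-page transitions, so the most common next step (or exit) from each page becomes visible. This illustrative sketch – not any vendor’s actual implementation – shows the idea:

```python
from collections import Counter

def aggregate_journeys(journeys):
    """journeys: iterable of page-name lists, one per visit.

    Appends a synthetic 'EXIT' step so drop-off points are counted too.
    """
    transitions = Counter()
    for pages in journeys:
        for src, dst in zip(pages, pages[1:] + ["EXIT"]):
            transitions[(src, dst)] += 1
    return transitions.most_common()

visits = [
    ["home", "category", "product", "basket"],
    ["home", "category", "product"],
    ["home", "search", "product"],
]
for (src, dst), n in aggregate_journeys(visits):
    print(f"{src} -> {dst}: {n}")
```

A visual journey tool renders counts like these as flows, making it immediately clear where visitors leave.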
The limits of traditional analytics
Many ecommerce teams are heavily reliant on tools like session replay to try to understand behaviour, but similar frustrations always arise. While session replay and heatmapping tools are used by ecommerce teams to see how visitors are interacting at a page level, they are often cumbersome and difficult to extract specific, actionable data from. Session replays may reveal individual sessions in great detail, but they also require analysts to spend time building an aggregated view of customer behaviour that testing activity can be based on, whereas UX analytics tools offer that from the start.
Heatmapping is a colourful way of visually identifying which areas of a site visitors are interacting with, by tracking scrolls, clicks and how they move around, but it does not offer the comprehensive overview that UX analytics provides. Although heatmapping can give you an idea of which website areas have the highest activity, UX analytics offers further insights, such as how many times a visitor has clicked on a link, where users are hovering without clicking, and even whether they are clicking on an unclickable section.
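Clicks on unclickable sections – often called dead clicks – are a strong frustration signal, and detecting them is conceptually simple. This is a hypothetical sketch (not how any specific UX analytics product works internally): given raw click events tagged with the element clicked, flag elements that receive clicks despite having no click behaviour:

```python
from collections import Counter

def dead_click_report(events, clickable_selectors):
    """events: iterable of (session_id, css_selector) click tuples.

    Returns selectors that were clicked but are not actually clickable,
    with their click counts.
    """
    clicks = Counter(selector for _session, selector in events)
    return {sel: n for sel, n in clicks.items()
            if sel not in clickable_selectors}

events = [
    ("s1", "a.nav"), ("s1", "img.hero"), ("s2", "img.hero"),
    ("s2", "a.nav"), ("s3", "img.hero"),
]
clickable = {"a.nav"}
print(dead_click_report(events, clickable))  # → {'img.hero': 3}
```

A hero image drawing repeated clicks, as here, suggests visitors expect it to lead somewhere – a far better test hypothesis than a guess.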
UX analytics tools can help identify these annoyance areas and can be leveraged by ecommerce teams to incorporate data-driven design: finding out where users are leaving a site and why, optimising the customer journey to reduce exit rates, and revising and tailoring content to suit users.
How are UX tools helping to identify user frustration?
Rather than simply guessing which tests are likely to win, ecommerce teams that combine a UX analytics solution with their existing testing methods can identify the root cause of a problem and prioritise a test to fix it. They are then better placed to generate stronger test hypotheses and to prioritise between them.
Ecommerce teams are now able to determine what is most likely to work before testing, meaning they can run fewer, more targeted tests – both to solve known issues and to optimise conversion rate. Tests can be based on real data about user behaviour, rather than on best practice or what competitors are doing.
Clarks & UX analytics
Clarks utilised UX analytics to identify usability issues that, when fixed, were worth over £2.4m ARR last year. Craig Harris, Data & Analytics Manager, uses ContentSquare’s UX analytics platform to collect actionable insights at speed:
“With ContentSquare, we are able to validate and prioritise tests and understand how they impact consumers on our site. We can now pick the tests that are going to do the most good, rather than the ones we think are going to drive the most value.”
Augmenting traditional testing methods with UX analytics turns an initially black-and-white test into one with a little more colour. A testing tool on its own delivers a simplistic result: did the tested change improve the results and, if it ‘won’, by how much? That is an important question, but UX analytics solutions like ContentSquare add essential insight to the answer, such as what impact the change had on user journeys – an all too often forgotten aspect of testing, and not something that every testing tool can provide.
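To make the “black and white” readout concrete, here is a minimal sketch of what a basic A/B result boils down to – the uplift between two variants and a two-sided significance test on conversion proportions. The function name and numbers are illustrative; the journey-level insight the article describes sits on top of figures like these:

```python
import math

def ab_summary(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test: conversions and visitor counts per variant."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF, via the error function.
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return {"uplift": p_b - p_a, "z": z, "p_value": p_value}

# Variant B converts 2.6% of 10,000 visitors against A's 2.0%.
print(ab_summary(conv_a=200, n_a=10_000, conv_b=260, n_b=10_000))
```

The output says only that B won and by how much; it says nothing about how the change reshaped user journeys, which is exactly the gap UX analytics fills.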
Is this the end of A/B testing?
No – but UX analytics has created a new approach to optimisation, offering ecommerce teams stronger and more powerful testing and personalisation results by combining the two.
French online buying and selling platform PriceMinister revealed last year that “less is more”: instead of running 100 small tests with little to no impact from each, it wanted to run 10 tests, each with a higher ROI. Partnering with ContentSquare, PriceMinister discovered a simple yet overlooked design issue with its password field, identified using the ‘click recurrence’ metric on the form page. A small design tweak to the login form rectified the problem and had a significant impact on successful logins.
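A click-recurrence-style metric can be sketched simply: the average number of times a visitor who clicks an element clicks it again within the same session. This is an illustrative interpretation of the metric, not ContentSquare’s actual implementation – but it shows why repeated clicks on a password field stand out:

```python
from collections import defaultdict

def click_recurrence(session_clicks):
    """session_clicks: dict of session_id -> list of clicked selectors.

    Returns, per element, total clicks divided by the number of
    sessions in which it was clicked at least once.
    """
    totals = defaultdict(lambda: [0, 0])  # selector -> [clicks, sessions]
    for clicks in session_clicks.values():
        for sel in sorted(set(clicks)):
            totals[sel][0] += clicks.count(sel)
            totals[sel][1] += 1
    return {sel: clicks / sessions for sel, (clicks, sessions) in totals.items()}

sessions = {
    "s1": ["#password", "#password", "#password", "#submit"],
    "s2": ["#password", "#submit"],
}
print(click_recurrence(sessions))  # → {'#password': 2.0, '#submit': 1.0}
```

A field averaging two clicks per session when one should suffice is exactly the kind of quiet design problem the PriceMinister example describes.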
With a fast-paced round of technology innovations being released every year, retail brands need to better prioritise their finite resources, stop wasting time on untargeted traditional A/B testing, and empower their testing roadmap with new tools like UX analytics.
By Duncan Keene, UK MD, ContentSquare