{"id":8843,"date":"2021-05-18T17:46:35","date_gmt":"2021-05-18T15:46:35","guid":{"rendered":"https:\/\/userlutions.com\/blog\/about-us\/successful-a-b-tests-ux\/"},"modified":"2021-08-17T09:17:29","modified_gmt":"2021-08-17T07:17:29","slug":"successful-a-b-tests-ux","status":"publish","type":"post","link":"https:\/\/userlutions.com\/en\/blog\/usability-analysis\/successful-a-b-tests-ux\/","title":{"rendered":"The secret of successful A\/B tests: How we reduced the bounce rate of RapidUsertests by 31%."},"content":{"rendered":"
Let's face it: haven't we all made gut-feeling changes to our website or app without first analyzing the actual problem? We are guilty of this too. If you then check such adjustments in an A/B test, it often shows no change, or even a deterioration, and afterwards you are no smarter than before.
But there is another way: through structured analysis and sound hypotheses, we were able to reduce the bounce rate on the website of our tool RapidUsertests by 31% and increase the conversion rate by 47%.
How did we do that?
RapidUsertests.com is our tool for crowd usability testing. After a redesign in early 2017, we had not made any further changes to the home page, but the quantitative analytics data showed us a comparatively high bounce rate and potential for increasing the conversion rate.
The process: from conversion analysis to well-founded A/B test hypotheses

Without question, quantitative data is important for generating A/B test hypotheses. As in our example, it can show you where problems occur on your website through KPIs such as abandonment rates and dwell time. But what it doesn't show you is the underlying why. This already starts with dwell time: do users stay on a page for a long time because the content is relevant to them, or because they cannot find the crucial information? Do they open many individual pages because they find your offer so exciting, or because they are poorly guided?
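To make the "where" concrete, here is a minimal sketch of how such KPIs could be computed from raw session data. The session records, field names, and the `kpis_for` helper are hypothetical and only for illustration; analytics tools report these figures out of the box.

```python
# Minimal sketch: deriving "where" KPIs (bounce rate, dwell time) from raw session data.
# The session records and field names are invented examples, not a real analytics API.

from statistics import mean

sessions = [
    {"landing_page": "/", "pages_viewed": 1, "duration_seconds": 12},
    {"landing_page": "/", "pages_viewed": 4, "duration_seconds": 210},
    {"landing_page": "/pricing", "pages_viewed": 1, "duration_seconds": 95},
    {"landing_page": "/", "pages_viewed": 2, "duration_seconds": 48},
]

def kpis_for(page: str) -> dict:
    """Bounce rate and average dwell time for sessions that landed on `page`."""
    landed = [s for s in sessions if s["landing_page"] == page]
    bounces = [s for s in landed if s["pages_viewed"] == 1]
    return {
        "sessions": len(landed),
        "bounce_rate": len(bounces) / len(landed) if landed else 0.0,
        "avg_dwell_seconds": mean(s["duration_seconds"] for s in landed) if landed else 0.0,
    }

print(kpis_for("/"))  # e.g. sessions: 3, bounce_rate: 0.33..., avg_dwell_seconds: 90
```

The numbers tell you which landing page deserves attention, but not why visitors leave; that is the gap the qualitative methods described next are meant to close.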
You can only obtain this "why" through qualitative data. This can be a classic UX test, but an expert review by conversion specialists also provides important insights. Do you already have insights about your target group from user research, and perhaps even use personas and scenarios in your company? Wonderful: they also help you analyze what information your users need, where they need it, and which of their challenges they expect to be addressed.

Our conversion specialists therefore combined quantitative and qualitative methods to uncover conversion killers. In a usability test, we were able to understand the motivations and emotions of our customers and prospects: How do they understand our offer? What convinces them? What are their concerns?

The main findings of these analyses: the communication on the homepage could be more convincing, and the user guidance had room for improvement.
Generate hypotheses that consider the WHY

From the results of these different analyses, we were able to formulate well-founded hypotheses that follow this scheme:

If we make that change, then that metric will improve, because this problem will no longer exist.

For example, one of our hypotheses was:

If we address the needs of our customers even more clearly in the benefit communication above the fold, then the bounce rate will drop, because users will immediately understand the relevance of our offer.

Important when formulating hypotheses: the problem and its causes, as well as the planned solution and the expected result, must all be included. If one of these components is missing, your analyses were probably insufficient and the risk of an unsuccessful A/B test increases.
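One way to make sure none of these components gets lost is to write hypotheses down in a structured form. The sketch below is our own illustration of the if/then/because scheme; the `Hypothesis` class and its field names are hypothetical and not part of any particular tool.

```python
# Sketch: a structured A/B test hypothesis that forces every component to be stated.
# Field names are illustrative only; the point is that nothing may be left empty.

from dataclasses import dataclass, fields

@dataclass
class Hypothesis:
    problem: str          # what the analyses showed
    cause: str            # the suspected underlying "why"
    change: str           # the planned solution ("if we ...")
    metric: str           # the KPI expected to move ("then ... will improve")
    expected_result: str  # the reasoning ("because ...")

    def validate(self) -> None:
        missing = [f.name for f in fields(self) if not getattr(self, f.name).strip()]
        if missing:
            raise ValueError(f"Hypothesis is incomplete, missing: {missing}")

homepage_hypothesis = Hypothesis(
    problem="High bounce rate on the homepage",
    cause="Users do not immediately see the relevance of the offer",
    change="Communicate customer benefits more clearly above the fold",
    metric="Bounce rate",
    expected_result="Users grasp the relevance of the offer right away, so fewer leave",
)
homepage_hypothesis.validate()  # raises if any component was left out
```

The `validate()` call simply refuses hypotheses with empty components, which mirrors the rule above: if you cannot fill in a field, your analysis was probably not thorough enough yet.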
Based on these hypotheses, our UX concept developers and designers created a new hero section for the home page, which we tested against the original variant in a split test.
The result: 31% fewer bounces and 47% more conversions

The result of the A/B test was impressive: the bounce rate dropped by 31% compared to the control variant, at a significance of 100%, which is why we made the new variant live for all users. Since then, we have also seen a significant increase in conversion rates of over 40%.
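If you want to check such a result yourself, a common way to judge whether a bounce-rate difference between control and variant is statistically significant is a two-proportion z-test. The sketch below uses invented visitor numbers purely for illustration; it is not the tool we used, and a reported "significance of 100%" usually just means the confidence level rounds to 100%.

```python
# Sketch: two-proportion z-test for a bounce-rate difference between control and variant.
# The visitor and bounce counts are invented for illustration.

from math import erf, sqrt

def two_proportion_z_test(bounces_a: int, visitors_a: int,
                          bounces_b: int, visitors_b: int) -> tuple[float, float]:
    """Return (z statistic, two-sided p-value) for the difference in bounce rates."""
    p_a = bounces_a / visitors_a
    p_b = bounces_b / visitors_b
    pooled = (bounces_a + bounces_b) / (visitors_a + visitors_b)
    se = sqrt(pooled * (1 - pooled) * (1 / visitors_a + 1 / visitors_b))
    z = (p_a - p_b) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))  # two-sided, via normal CDF
    return z, p_value

# Control bounces 45% of 4,000 visitors; the variant bounces 31% less, i.e. about 31% of 4,000.
z, p = two_proportion_z_test(1800, 4000, 1240, 4000)
print(f"z = {z:.2f}, p = {p:.6f}")
```

With counts of this size the p-value is effectively zero, which an A/B testing tool typically displays as a confidence of (almost) 100%.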
Conclusion: sound hypotheses and successful A/B tests through UX analyses

Combining A/B testing with conversion and UX analyses means you: