
Making selecting services easy

Have you ever wondered what it takes to make an effortless delivery experience for something as big as a fridge... like... an actual fridge?!

Well... I certainly have! When I worked as part of a cross-functional team at Whiteaway Group, we focused on making the delivery experience of receiving such unwieldy objects as washing machines as smooth as getting a new pair of shoes, all the way from placing the order to having it up and running at home.


In the fall of 2020, our key result for the quarter was to increase the volume of delivery services like carry-ins and installations by 10%.

Here is the story of how we achieved it.

About this case

This project is about increasing the number of delivery services sold on Whiteaway Group's webshops by removing friction in the checkout flow and improving usability.

The process in headlines

1. Understanding the problem space

2. Creating prototypes


3. User testing

4. Validating the new design

1. Understanding the problem space

To figure out how to get more customers to buy delivery services, we started by looking at what we already knew! During the last quarter of 2019 and the first of 2020, I performed around 30 moderated end-to-end usability tests on our webshop.

Here, one of the biggest showstoppers turned out to be located in the checkout flow! Not exactly a desirable place for a webshop to have a friction point, by the way.


"Luckily" for us, the issue was tied to the page where potential customers select how they wish to get their appliances delivered, which fitted our quarterly goal just perfectly! What a "coincidence"!


A photo of my face as the usability test participants kept "re-discovering" a huge friction point in the checkout flow!


Illustration of the original design. This is a mock-up, as the original version is no longer available for screenshots.

To sum up the problem, it seemed that the participants were overwhelmed by:

  1. Too many similar-sounding delivery services.

  2. Accordion-based explanations that only showed information when a service was selected.

  3. Long explanations for each service that led to more questions than answers.

  4. The tone of the copy made the participants afraid of making errors! 

To visualize these findings, I used the Value Proposition Canvas. Here I could map most of the identified pains to the UI, helping the rest of the team understand the concrete areas of the UI that caused friction in the user journey.

We now had a defined set of pains and gains to address with a new design. From these, we defined potential gain creators and pain relievers to test whether we were right, or way off! The main ones we wanted to test were:

  1. Decreasing the number of available delivery options.

  2. Renaming a delivery option.

  3. Removing the accordion-style explanations of the delivery options.

With the problem space defined and a plan for which changes we wanted to try, we could formulate the test hypothesis:

"We believe that providing users with fewer delivery options will result in an increase in services sold." In order to test this, we needed something to test on and test with. Therefore, we created prototypes.

2. Creating prototypes

From these suggested solutions, I started a rapid wire-framing process to generate different drafts of potential solutions. Who would have thought removing stuff could generate so many new versions!

Two of these versions were selected and rendered as mid-fidelity interactive prototypes in Figma to test out in comparison to the current version of the page, which was used as a control version. The designs selected were:

(A). A design where the three main delivery options - curb-side delivery, carry-in, and installation - were stacked on top of each other, similar to the layout already used on the site, but with fewer options and no accordion-style explanations.

(B). A more visual design where the three main delivery options were presented side by side, each with an icon to help visualize the differences between the three, but still without the accordion-style explanations of the original version.


Illustration of the designs. Figures A1 and B1 are the wireframes of the two final prototypes. Below, A2 and B2 are mock-ups of how these translated into the final UI used in the user test. C2 is a mock-up of the original UI used as the control version.

I set up the two designs as interactive prototypes, together with a same-fidelity prototype of the current version (C), in a test system created in Figma. The test system also contained the different user cases and tasks.

3. User testing

To test the prototypes, five user tests were carried out. Two were in-person; then a little thing called "a global pandemic" hit, and the remaining three were carried out as remote tests. The tests used a think-aloud approach, where the participants were presented with tasks given in the form of cases and vocalized their thoughts as they solved them using the different prototypes.


Illustration of the test system flow. All participants go through the intro page, familiarisation task, and version selector in the same order. Afterwards, the order in which the participants get the three tasks is controlled and selected based on a 3x3 Latin square matrix.

The test used a within-subjects design where each participant saw all three versions. The order in which the versions were presented was randomised using a 3x3 Latin square design. Data was collected through observations of the interactions with the prototypes, as well as follow-up questions after each task and a post-test survey. Furthermore, the System Usability Scale (SUS) was used as a quantitative measure of usability.
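The counterbalancing logic behind a 3x3 Latin square is simple enough to sketch. Here is a minimal Python illustration (the version labels match this case; a cyclic construction is one common way to build the square, and the assignment helper is my own naming):

```python
# The three prototype versions from the test: A, B, and the control C.
versions = ["A", "B", "C"]

# A cyclic 3x3 Latin square: each version appears exactly once in every
# row (participant order) and every column (presentation position),
# which counterbalances learning and fatigue effects across versions.
latin_square = [[versions[(row + col) % 3] for col in range(3)] for row in range(3)]

def presentation_order(participant_id: int) -> list:
    """Assign a participant one of the three row orders, cycling through them."""
    return latin_square[participant_id % 3]

for p in range(5):  # five participants, as in the test
    print(p, presentation_order(p))
```

With five participants and three orders, each order is seen at least once, so no single version benefits systematically from always being tried first or last.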


We saw that the participants had a clear preference for the new, simplified versions compared to the control version. This was apparent both from the participants' vocal statements and from the collected System Usability Scale scores, where both prototypes had an average score of ~97 while the control version only had an average score of 76.
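For context, SUS scores like the ~97 and 76 above come from a fixed formula over ten questionnaire items answered on a 1-5 scale. A minimal sketch of the standard scoring:

```python
def sus_score(responses):
    """Compute a System Usability Scale score (0-100) from ten 1-5 answers.

    Odd-numbered items are positively worded (contribute score - 1),
    even-numbered items are negatively worded (contribute 5 - score);
    the summed contributions are scaled by 2.5 onto a 0-100 range.
    """
    if len(responses) != 10 or any(not 1 <= r <= 5 for r in responses):
        raise ValueError("SUS needs exactly ten responses on a 1-5 scale")
    total = sum((r - 1) if i % 2 == 1 else (5 - r)
                for i, r in enumerate(responses, start=1))
    return total * 2.5

# A strongly positive response pattern scores the maximum:
print(sus_score([5, 1, 5, 1, 5, 1, 5, 1, 5, 1]))  # 100.0
```

The 0-100 scale is not a percentage; scores above ~68 are usually read as above-average usability, which makes the jump from 76 to ~97 a meaningful one.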

4. Validating the new design

With the good indications from the user tests, we wanted to validate the findings. To do this, we built a functional prototype based on an updated version of design B and used it in a live A/B split test. With this, we were able to measure how changing the UI on this single page impacted the number of services sold. We ran the split test for four weeks on our Danish and Swedish webshops to collect sufficient data.

When the test was completed, we saw a substantial increase in services sold. In addition, we were able to isolate a significant rise in the overall conversion rate. AWESOME!
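A common way to check whether a conversion-rate lift in a split test like this is statistically significant is a two-proportion z-test. The counts below are made up for illustration; they are not the actual numbers from this test:

```python
from math import sqrt, erf

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Two-sided two-proportion z-test comparing two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    phi = 0.5 * (1 + erf(abs(z) / sqrt(2)))  # standard normal CDF
    return z, 2 * (1 - phi)                  # two-sided p-value

# Hypothetical counts: 4.0% conversion on control vs. 4.7% on the new design
z, p = two_proportion_z(conv_a=400, n_a=10_000, conv_b=470, n_b=10_000)
print(f"z = {z:.2f}, p = {p:.4f}")
```

Running the test for a fixed four-week window, as we did, also avoids the temptation to stop early the moment the numbers look good.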

As a result, we began the (not so) smooth process of scaling up and implementing the new design on all of Whiteaway Group's webshops, where it is now live!


The final and current version of the UI, as it looks live on the webshops.


We found that the number of options and explanations in the service selection step caused a choice overload that resulted in customers feeling unsure and afraid of making mistakes, and therefore abandoning their baskets before completing their checkout.


By simplifying the flow, it seemed that we reduced the doubt, resulting in an increase in services sold and overall conversion. 
