Senior UI Designer. London
Upon joining the Argos apps team, it was my responsibility to look after the native iOS app for iPhone and iPad. The app was already available and performing reasonably well on the App Store; however, there were many design inconsistencies which degraded the experience.
The iOS app also had a very ambitious roadmap for the year, with many new features and technologies we wanted to introduce to enhance the experience. My first responsibility was to audit the existing build and define improvements in both the experience and the visual language.
I then built out a UI component library for the iOS app while simultaneously working on the Bolt design system. Using all the atoms from Bolt made the iOS component library much more flexible and kept it consistent across all channels, as we all used one icon library.
Apple's ARKit was first introduced to the public in June 2017 at Apple's annual Worldwide Developers Conference and released in September 2017. As ambitious as Argos is, it was quick to realise the potential of augmented reality and its many uses for retail and shopping apps.
It wasn't until 2018, upon my joining the apps team, that Argos decided to introduce AR into the iOS app, as it had many benefits for e-commerce, including the reduction of returns.
Once we had confidently defined our AR concept, I created low-fidelity wireframes and layouts to be used in our testing lab. Initial testing of the feature proved successful, but we hit a few roadblocks along the way, including our customers' understanding of the term AR.
As we all worked in technology, the term AR came naturally to us, so we assumed that displaying a CTA labelled 'View in AR' would be the right approach. However, in our testing lab it soon became apparent that our customers did not know what AR meant, so we agreed to change the language to 'View in 3D'.
Another issue was with the technology itself: because the device had to calibrate and scan an area to recognise a surface, blank white walls proved difficult, so we created 'Wall mode'. Wall mode enabled the customer to create a surface by dropping a start and a finish position, which generated a surface between those points. This became especially useful when mounting televisions to walls.
After several months of design, iteration and testing, we launched Argos AR for our home & furniture category and, in collaboration with Lego, for several toy models.
I also redesigned every component within the app to be responsive and flexible, so the same components could be used on the smallest device (iPhone SE) and the largest 12.9-inch iPad Pro.
During our testing phase of augmented reality, we discovered limitations in Apple's ARKit framework and its inability to find surfaces which were blank or of a plain colour. This proved to be a challenge for us, as one of our goals in releasing augmented reality was to allow customers to place wall-mounted products, such as televisions, in their homes.
After experimenting with concepts and working closely with iOS developers, we agreed to enable our customers to manually build a virtual wall in their home, which would then allow them to visualise a wall-mounted product. We called it 'Surface Builder': this augmented reality mode enables customers to add two points, A and B, across the base of a wall, which then virtually builds the surface between those two points.
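The geometry behind this idea can be sketched simply. The following Swift snippet is purely illustrative (not the Argos production code, and all names are hypothetical): given the two points a customer taps along the base of a wall, it derives the virtual wall's width, the midpoint of its base edge, and its rotation around the vertical axis, which is what a renderer would need to place a vertical plane between points A and B.

```swift
import Foundation

// Hypothetical sketch of Surface Builder's core geometry.
// A Point3 stands in for a 3D position on the detected floor plane.
struct Point3 { var x, y, z: Double }

struct VirtualWall {
    let width: Double   // distance between the two tapped points
    let center: Point3  // midpoint of the wall's base edge
    let yaw: Double     // rotation (radians) of the base edge around the vertical axis

    init(from a: Point3, to b: Point3) {
        // The two taps sit on the floor, so only x and z contribute to width.
        let dx = b.x - a.x
        let dz = b.z - a.z
        width = (dx * dx + dz * dz).squareRoot()
        center = Point3(x: (a.x + b.x) / 2,
                        y: (a.y + b.y) / 2,
                        z: (a.z + b.z) / 2)
        yaw = atan2(dz, dx) // heading of the base edge in the horizontal plane
    }
}

// Example: taps 3 m apart in x and 4 m apart in z give a 5 m wide wall.
let wall = VirtualWall(from: Point3(x: 0, y: 0, z: 0),
                       to: Point3(x: 3, y: 0, z: 4))
print(wall.width)    // 5.0
print(wall.center.x) // 1.5
```

In a real ARKit implementation the tapped points would come from hit tests against the detected floor plane, and the resulting transform would position a vertical plane node of the computed width; the snippet above only shows the arithmetic between the two anchors.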
As augmented reality was a new and innovative technology, I wanted to ensure that the UI and interactions were clean but also focused, and didn't cause confusion for our customers. I wanted the UI to help customers fully utilise augmented reality and its benefits.
I introduced the Argos app team to Lottie, a framework built by Airbnb which enables you to create vector animations in After Effects and render them natively in Xcode. I wanted to try something new, so I animated several UI components to include in the app to help our customers.
Once we had successfully launched AR, the innovation didn't stop there; our next big project for iOS was visual search.
We discovered there were many use cases for this technology, and so, with UX support, we fleshed out ideas for how this experience could work in the most seamless way possible.
Once UX and I had defined our ideas, we worked closely with a developer to mock up a simple prototype of visual search, as this would have been impossible with the prototyping tools on the market at the time.
Once we had created this mid-fidelity prototype, we began testing, internally with colleagues but also with our customers in our own testing labs.
What we hadn't accounted for was our customers' ability to search for images which
I released visual search in early 2019 with a simple yet effortless experience that enhances the existing search functionality. This new technology adds a new depth to search and also helps customers with accessibility needs.