UX DESIGN EXAMPLES
Automotive Shopping Tool. I conducted intensive research on customer online shopping behavior, along with competitor analysis, to create a fast, intuitive, customized shopping experience. Robust filtering, smart category grouping and favorites were the key UX features.
Cross-platform Social Media Tool. The UX design provides clear workflows that include geo-mapping and keyword searching, along with an easy-to-understand presentation of analytics.
Damage Report App. The app needed to let users complete tasks faster and more accurately than current industry tools. I conducted extensive user research and testing to establish the best workflow and design.
Music Sharing App. Native app that uses the camera, microphone and audio, designed to create a 'community' experience. The mobile experience syncs with the desktop application for a truly integrated experience across devices.
Interactive E-learning Software. Highly interactive courseware that incorporates audio and animation. I interviewed kids and tested designs with them to understand their learning habits.
Insurance Quoting Tool. The design provides clear paths for various customer journeys with a clean, intuitive interface.
Registration Page. Responsive sign-up form.
Condition Report - Case Study
Cross-platform tool to quickly & accurately create a vehicle condition report.
My role: UX Research, Wireframing, Prototyping, User Testing, Stakeholder Presentation, UI Requirements
The Challenge
Automotive dealerships needed a fast and accurate way to assess and report damage on a vehicle, enter other required information and create and share a professional, standardized Vehicle Condition Report package.
User Research
I researched automotive products that car dealers used for various tasks in their workflow to understand the design and functionality they were familiar with. I conducted interviews with used car dealers and ‘shadowed’ them on the car lot to really understand their daily workflow. I needed to understand:
- What is the user’s current process?
- What do they like and dislike about their process?
- What does the app have to do to be adopted into their daily workflow?
Dealer for a Day (or a few days)
I conducted extensive user research by ‘shadowing’ various dealers on used car lots as they performed ‘walk-arounds’ on vehicles to report damage and create condition reports. I asked what their pain points were and learned their behavior and workflow. I brought this valuable research back to the Product Team and the collaboration began. I decided on an approach of rapid sketching and prototyping that we could take to dealers for feedback. Collaboration between Product, Engineering and Customer Support sparked fantastic ideas and allowed us to prioritize a phased release strategy based on what was required in the application vs. what was ‘nice to have’.
Analysis
Based on the user research I gathered, I created a persona to guide decisions and priorities.
I used a journey map to visualize and communicate the user’s end-to-end experience in creating a condition report. This helped us identify pain points and areas of improvement as well as to understand the user’s emotional state throughout the process.
I discovered used car dealers are very set in their ways and locked into a process they have used for many years. Injecting new tools into their workflow was challenging. An additional challenge was that most users were not tech savvy so the app had to be simple and not intimidating. It also needed to work seamlessly across platforms. My research identified 3 key areas that were important to the user:
Speed: The app must speed up their current process and cannot feel like a disruption. It must be intuitive and require no training.
Accuracy/Transparency: The app must deliver accurate damage values and store all the required information in a secure, time-stamped file.
Accessibility: Information must be easily recalled, emailed, printed and shared and be a fully integrated cross-platform solution so they could save a session on mobile and complete it on desktop.
The Design
I sketched the UI, flow, and functional and data elements. I designed the primary navigation to mimic the workflow I witnessed when shadowing the users. Color is used throughout to indicate links and selected items. To report damage, I used a digital version of the paper pancake diagram users were familiar with. Care was taken to design large, easy-to-tap areas (most users were men, who typically have larger fingers).
Testing and High-fidelity Design
I presented my wireframes to stakeholders and, once I had approval, built a high-fidelity mobile prototype. I collaborated with content strategists to ensure all the required condition report content was accurately represented. I tested the prototype with users on mobile devices, both in a controlled environment with no outside stimuli and on a busy used car lot. Users were asked to complete a condition report for a provided vehicle that had obvious damage. I asked users to ‘talk out loud’ while completing tasks. Tests were timed and recorded.
Improvements
- In almost all cases, users were answering ‘no’ to the disclosure questions, so I pre-selected those answers to ‘no’ to speed up the process.
- I initially used a generic pancake diagram representing ‘car’, ‘truck’ and ‘van’, but discovered users felt the report was more accurate if the diagram more closely resembled the actual model of the vehicle. More diagrams were added and selected based on the VIN of the vehicle.
- Calls to action (take photo, print) were animated to slide up from the bottom of the screen. This increased their visibility and the success rate of completing those actions when required.
- Users were trying to advance to the next screen without clicking the Save icon. We decided an auto-save feature made sense, given the distractions and busy environment the users were working in.
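The auto-save decision above can be sketched in code. Below is a minimal, hypothetical Python sketch of the write-through idea: every field change persists the draft immediately and atomically, so a user on a busy lot never loses work by skipping a Save tap. The field names and JSON storage format are illustrative only, not the production design.

```python
import json
import os
import tempfile

class DraftStore:
    """Hypothetical sketch of the auto-save behavior: persist the condition
    report draft on every field change so no explicit Save tap is needed."""

    def __init__(self, path):
        self.path = path
        self.draft = {}

    def update(self, field, value):
        self.draft[field] = value
        self._flush()  # auto-save: write through on each change

    def _flush(self):
        # Write to a temp file, then rename, so an interrupted save
        # can never leave a half-written (corrupt) draft behind.
        tmp = self.path + ".tmp"
        with open(tmp, "w") as f:
            json.dump(self.draft, f)
        os.replace(tmp, self.path)

    def resume(self):
        # Reload the last saved state, e.g. after the app was backgrounded.
        with open(self.path) as f:
            return json.load(f)

path = os.path.join(tempfile.mkdtemp(), "draft.json")
store = DraftStore(path)
store.update("odometer", "48210")
store.update("damage", ["scratch: left door"])
restored = store.resume()
```

The atomic temp-file-then-rename step matters on mobile, where the app can be interrupted at any moment: the last good draft always survives.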
Game Changer
I was confident we had improved the accuracy and organization of the user’s process for creating a condition report. When tested, reporting damage and answering disclosure questions fell within the time range we had set as a goal. However, initial setup of the vehicle details (year, make, model, trim, VIN) was slow, manual entry. The Engineering team developed a VIN scan function, and it was decided that the product could not be released without that time-saver. Once the VIN scan was added to the app, all the vehicle details could be auto-populated.
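As an illustration of why VIN scanning pairs well with validation (this is a sketch, not the product’s actual implementation): North American VINs carry a check digit defined in ISO 3779 / 49 CFR 565, so a scanned VIN can be verified on-device before auto-populating the vehicle details, catching many camera misreads cheaply.

```python
# Transliteration table from the VIN standard: letters map to digit values
# (I, O and Q are not valid VIN characters, so they are deliberately absent).
TRANSLIT = {c: v for c, v in zip("ABCDEFGH", range(1, 9))}
TRANSLIT.update(zip("JKLMN", range(1, 6)))
TRANSLIT.update({"P": 7, "R": 9})
TRANSLIT.update(zip("STUVWXYZ", range(2, 10)))
TRANSLIT.update({str(d): d for d in range(10)})

# Positional weights; position 9 (the check digit itself) has weight 0.
WEIGHTS = [8, 7, 6, 5, 4, 3, 2, 10, 0, 9, 8, 7, 6, 5, 4, 3, 2]

def vin_is_valid(vin: str) -> bool:
    vin = vin.upper()
    if len(vin) != 17 or any(c not in TRANSLIT for c in vin):
        return False
    total = sum(TRANSLIT[c] * w for c, w in zip(vin, WEIGHTS))
    check = total % 11
    expected = "X" if check == 10 else str(check)
    return vin[8] == expected  # position 9 holds the check digit

vin_is_valid("1M8GDM9AXKP042788")  # a commonly cited valid sample VIN → True
```

Once a VIN passes this test, the rest of the details (year, make, model, trim) can be decoded or looked up from it with far more confidence.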
RESEARCH
"If you don’t talk to your customers, how will you know how to talk to your customers?" - Will Evans
The research phase is key to creating an informed user experience. I work with the product team during this phase to conduct user and competitor research. The results are used to help prioritize product improvements as well as to determine market need for new products and features. I use the following techniques during the research phase:
Competitor Analysis
Collaborators: UX, Business Analysts, PM, PO
When I perform an audit/review of competing products, it involves signing up for free trials, creating user accounts and interacting with the products as a ‘user’. I always use the products on various devices and operating systems to discover any inconsistencies in the experience. The output is a competitor analysis presentation for stakeholders that includes a detailed breakdown of features, design, usability and how each product solves specific user needs.
Analytics
Collaborators: UX, Business Analysts, Data Analysts, Client Support
In the case of improving or enhancing an existing product, I use analytics to understand the users’ flow when using the product. Metrics such as most frequently visited pages, visitor demographics, bounce rate, time required to complete tasks and heat maps provide valuable insight when redesigning the architecture, improving the discoverability and learnability of a product, and eliminating unnecessary features or confusing UI.
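Task-timing metrics like these are straightforward to derive from raw event logs. The sketch below assumes a simple (session, timestamp, event) tuple format — real analytics exports differ — and computes per-session completion times plus a completion rate:

```python
from statistics import median

def task_completion_times(events):
    """Seconds from task_start to task_complete per session.
    `events` is an assumed log format: (session_id, timestamp, event_name)."""
    starts, times = {}, []
    for session, ts, name in sorted(events, key=lambda e: e[1]):
        if name == "task_start":
            starts[session] = ts
        elif name == "task_complete" and session in starts:
            times.append(ts - starts.pop(session))
    return times

# Illustrative log: two completed sessions, one abandoned mid-task.
log = [
    ("s1", 0, "task_start"), ("s1", 95, "task_complete"),
    ("s2", 10, "task_start"), ("s2", 130, "task_complete"),
    ("s3", 20, "task_start"),  # abandoned: counts toward drop-off, not time
]
times = task_completion_times(log)
rate = len(times) / 3          # completed / started = 2/3
typical = median(times)        # robust against a few very slow sessions
```

Using the median rather than the mean keeps one distracted user on a busy lot from skewing the “typical” task time.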
Information Architecture/Content Audit
Collaborators: UX, PO, Content Writers
Information Architecture labels and organizes the content on a site to show a high-level view of how each screen of the product fits together, and how each item relates to the other items within this structure. Card sorting techniques can be used to determine if content is grouped and organized in an intuitive and efficient way. Analytics can also provide insight into the effectiveness of the architecture.
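Card sort results are commonly analyzed with a co-occurrence (similarity) matrix: for each pair of cards, the fraction of participants who placed them in the same group. A minimal Python sketch, with made-up card names:

```python
from itertools import combinations

def cooccurrence(sorts):
    """Fraction of participants who grouped each pair of cards together.
    `sorts` holds one list of groups per participant."""
    counts = {}
    for groups in sorts:
        for group in groups:
            # Count every unordered pair within a group once.
            for a, b in combinations(sorted(group), 2):
                counts[(a, b)] = counts.get((a, b), 0) + 1
    n = len(sorts)
    return {pair: c / n for pair, c in counts.items()}

# Two illustrative participants sorting four hypothetical cards.
sorts = [
    [["Login", "Profile"], ["Pricing", "Plans"]],
    [["Login", "Profile", "Pricing"], ["Plans"]],
]
sim = cooccurrence(sorts)
```

Pairs with high scores (here, Login/Profile at 1.0) are strong candidates to live together in the architecture; low-scoring pairs suggest users don’t see a natural relationship.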
A content audit involves reviewing and cataloging the existing content to determine if it’s grouped and organized in an intuitive and efficient way; and if the content supports the user persona throughout all stages of the customer experience.
User Interviews/Surveys
Collaborators: UX, Product Specialists
I find interviews and surveys useful when exploring users’ general attitudes or how they think about a problem. They can mitigate the risk of designing something that isn’t really needed, and they can make stakeholders more confident that a design or feature is needed. Surveys are effective when a larger sample size is required or when a less intrusive information-gathering technique is more appropriate. Being able to write effective questions is vital to gathering useful information on ‘how’ and ‘why’ a user does things.
User Testing
Collaborators: UX
User tests can be conducted at the user’s location, in a controlled testing environment, or over video conference. I ask users to complete specific tasks and ‘talk out loud’ while I watch and document what they do and say. This provides valuable insight into the user’s patterns, flow, and emotions when using the product. It identifies the user’s pain points and motivations while interacting with the product, and I can time how long it takes to complete specific tasks. I always let users know that there are no right or wrong answers/selections and that I am not there to help or answer questions – just to observe. Ideally, if I can ‘shadow’ a user in their actual daily environment using the product, I can observe more authentic interactions with it. The output is a user test report that can be presented to stakeholders to support UX design recommendations.
ANALYSIS
“Supposing is good, but finding out is better.”– Mark Twain
During the analysis phase, I take the data and insights gathered in the research phase and organize them into personas, user journey maps and use cases to help understand the motivations of users. I use the following techniques during this phase:
User Personas
Collaborators: UX, PO, Product Specialists, Marketing
By taking the qualitative and quantitative data from analytics, surveys, interviews and user test results, I create personas that represent “typical” users. These personas are assigned names, photographs, motivations, goals and use cases that align with real people using the product.
User Journey Maps
Collaborators: UX, PO, Product Specialists, Development
A user journey is a series of steps representing a scenario as a user interacts with the product. Journey maps help identify how users currently interact with the product and how they could interact with it in a future state. By understanding the key tasks users will want to accomplish, and their emotions during each step, I can start to map out the best interface and functionality to enable those tasks. When trying to understand a user’s workflow, having the option to ‘shadow’ them in their work environment allows me to record the process they follow to complete specific tasks. I can observe the real distractions and pain points they encounter and the efficiency with which tasks need to be completed. This data, along with the other research data and persona information, helps to create the user journey map.
Use Cases
Collaborators: UX, PO, Product Specialists, Development
Each persona is aligned to a user journey and specific use case. Each use case is a sequence of simple user steps and the system response to user actions, beginning with a user’s goal and ending when that goal is fulfilled. I outline both success and failure scenarios and collaborate with developers to define any technical limitations. Use cases are helpful when prioritizing and negotiating features.
PRODUCTION
“If you think good design is expensive, you should look at the cost of bad design.” – Dr. Ralf Speth
The production phase is where the high-fidelity design is fleshed out, content and digital assets are created, and a high-fidelity version of the product is validated with stakeholders and end-users through user testing.
High-Fidelity Designs
Collaborators: UX, Designers, Content Writers, Marketing, Development, PO
With all the research and testing data collected, I now collaborate with the designers to create high-fidelity mockups and prototypes with fleshed out design and content. Designs are tested again on users to see if any issues still exist.
In cases of small feature tweaks or improvements that don’t require a digital prototype, mockups are the final sign-off for stakeholders and are submitted with requirements to the development team.
User Testing
Collaborators: UX, QA
The high-fidelity design is tested on users. Again, I give users tasks to complete and ask them to ‘talk out loud’ while I record their actions and words. Any problems that were missed are addressed.
Championing the design to stakeholders is a big part of my role during this phase. Once validated and approved, I help put requirements together for the development team so they can build the fully functioning product.
QA can also start building test case scenarios for internal testing.
Requirements and Style Guide
Collaborators: UX, PO, Development, QA
Once the final design is approved, I work with the product owner to put together a style guide and functional requirements for the development team to build the final product for deployment.