
Improving Staff Experience in Filing Sales Returns on a Web Platform
Project Overview
Background
AISLINN Bureau De Change, a Nigerian foreign exchange company, faced challenges with its in-house Sales Return System (SRS). Staff found the system difficult to navigate, resulting in frequent errors in customer transaction reporting. These inaccuracies led to fines totaling ₦3.3M from the federal government for non-compliance with Central Bank of Nigeria regulations.
Objectives
The project aimed to identify usability issues within the existing SRS and recommend targeted improvements to enhance the staff experience, increase reporting accuracy, and ensure compliance with regulatory requirements.
Outcome
- 35% faster task completion time.
- 19% reduction in input errors.
- ₦3.3M in annual fines eliminated.
- 92% increase in staff satisfaction.
Team
- My role: UX/UI Designer
- Team members:
  - 1 Website Developer
  - 1 Graphic Designer
- Stakeholders: AISLINN Bureau De Change Ltd.
Tools Used in this Project

Affinity Map
Organized interview and observation findings into key themes and usability issues.

Journey Map
Visualized staff interaction with the SRS, highlighting pain points and inefficiencies.

Figma
Created low-fidelity mockups illustrating proposed system improvements.
How Did I Solve the Problem?

Methodology

1:1 Interviews
Conducted interviews with staff to understand pain points and system challenges.

Contextual Inquiry
Observed staff using the Sales Return System in real time to uncover workflow issues.

Usability Testing
Ran task-based tests on alternative UI designs built in Figma, measuring ease of use, task completion time, and error frequency.
1:1 Interviews
What did I do?
I conducted 1:1 interviews with employees as a generative research method to gather initial insights into the key pain points users experienced when interacting with the Sales Return System (SRS). The goal was to understand the challenges users faced before proposing any design solutions. Each session focused on hearing direct user experiences, frustrations, and suggestions for improvement related to system navigation, task completion, and error management.
What were the results?
The interviews revealed several consistent issues across participants:
- 64% of employees reported difficulty finding the links needed to customize their returns, causing delays and frequent mistakes.
- 36% of employees faced connectivity issues when trying to load the returns customization page, leading to incomplete submissions.
- 27% of employees encountered a generic error message ("your return has errors, please fix errors") but were unable to identify or correct the errors due to a lack of specific system guidance.
These insights helped uncover both usability flaws and system reliability issues, forming the foundation for targeted improvements in system design and communication.
What challenges did I face?
One challenge during the interviews was ensuring that users clearly differentiated between system-related frustrations and external technical issues (e.g., internet connectivity), which required careful follow-up questions. Additionally, since some employees had adapted to workarounds, it was important to probe deeper to uncover latent needs and unspoken frustrations that might otherwise have been overlooked.
Contextual Inquiry
What did I do?
- Conducted on-site observations by shadowing AISLINN staff as they used the Sales Return System (SRS) in their real work environment.
- Documented every step of the sales-return workflow, including system navigation paths, physical desktop setups, and environmental factors (e.g., network reliability, desk layout).
- Asked clarifying questions in the moment (without steering users) to understand why they made particular choices or workarounds.
What were the results?
- Staff frequently toggled between three main modules (Dashboard → Transaction Log → Return Customization), causing repeated context switches and slowing each transaction.
- Key functions like “Customize Return” were buried under nested menus or external links, so users often had to search for them, increasing cognitive load.
- Many employees maintained manual spreadsheets to track pending returns, evidence that the system’s status indicators were not trusted or clear.
- Intermittent network timeouts forced page reloads, which compounded frustration and added time to each task.
- Error messages appeared in a generic banner (“Your return has errors”), but without inline guidance, users stalled trying to identify and correct mistakes.
What challenges did I face?
- Staying unobtrusive so staff behaved naturally rather than “performing” for the observer.
- Separating true usability issues from external factors (e.g., local network glitches, hardware performance).
- Scheduling observations across both peak and off-peak shifts to capture the full range of workflows and pain points.
Usability Testing
What did I do?
- Partnered with the company’s Web Designer to create five distinct UI prototypes in Figma.
- Ran A/B testing sessions with representative staff members, having each participant complete core tasks (e.g., locating the “Customize Return” link, submitting a return) on all five layouts.
- Captured quantitative metrics (task completion time, error rate) alongside qualitative feedback (perceived clarity, ease of use, visual appeal); the sketch after this list shows how such metrics can be aggregated per design.
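Reducing each session log to two comparable numbers per layout made it straightforward to rank the five designs. Below is a minimal aggregation sketch in TypeScript, assuming hypothetical log records with design, durationSeconds, and hadError fields; the study's actual instrumentation is not shown here.

```typescript
// Aggregate usability-test logs into per-design metrics.
// The record shape (design, durationSeconds, hadError) is an illustrative
// assumption, not the study's actual logging format.
interface TaskRecord {
  design: string;          // layout identifier, e.g., "A".."E"
  durationSeconds: number; // time taken to complete the task
  hadError: boolean;       // whether the participant made at least one mistake
}

interface DesignMetrics {
  avgDurationSeconds: number; // mean task completion time
  errorRate: number;          // fraction of tasks containing an error
}

function summarize(records: TaskRecord[]): Map<string, DesignMetrics> {
  // Group records by design.
  const groups = new Map<string, TaskRecord[]>();
  for (const r of records) {
    const bucket = groups.get(r.design) ?? [];
    bucket.push(r);
    groups.set(r.design, bucket);
  }

  // Compute mean duration and error rate per design.
  const metrics = new Map<string, DesignMetrics>();
  for (const [design, rows] of groups) {
    const totalTime = rows.reduce((sum, r) => sum + r.durationSeconds, 0);
    const errorCount = rows.filter((r) => r.hadError).length;
    metrics.set(design, {
      avgDurationSeconds: totalTime / rows.length,
      errorRate: errorCount / rows.length,
    });
  }
  return metrics;
}

// Example with two illustrative records:
console.log(summarize([
  { design: "C", durationSeconds: 95, hadError: false },
  { design: "C", durationSeconds: 110, hadError: true },
]));
```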
What were the results?
- Design C outperformed all others, delivering a 35% faster average task completion time than the original layout.
- Users gave Design C a 92% satisfaction rating, praising its:
  - Clear navigation, with a prominently placed “Customize Return” button
  - Inline field validation, which immediately flagged and explained input errors (sketched below)
  - Simplified workflow, reducing unnecessary clicks and context switches
- Design C also recorded the lowest error rate (8% of tasks with mistakes), thanks to its streamlined form and real-time feedback.
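To make the inline-validation pattern concrete, here is a minimal TypeScript sketch of the idea testers responded to: each invalid field gets a specific, actionable message instead of the generic “Your return has errors” banner. The field names, rules, and messages are hypothetical, not the production SRS schema.

```typescript
// Inline-validation sketch: validate a single return entry and attach a
// specific, actionable message to each invalid field, rather than showing
// one generic banner. Fields and rules here are illustrative assumptions.
interface ReturnEntry {
  amount: string;   // raw input from the form field
  currency: string; // e.g., "USD", "EUR"
}

type FieldErrors = Partial<Record<keyof ReturnEntry, string>>;

const SUPPORTED_CURRENCIES = ["USD", "EUR", "GBP", "NGN"];

function validateEntry(entry: ReturnEntry): FieldErrors {
  const errors: FieldErrors = {};

  const amount = Number(entry.amount);
  if (entry.amount.trim() === "" || Number.isNaN(amount)) {
    errors.amount = "Enter the amount as a number, e.g., 1500.00.";
  } else if (amount <= 0) {
    errors.amount = "Amount must be greater than zero.";
  }

  if (!SUPPORTED_CURRENCIES.includes(entry.currency.toUpperCase())) {
    errors.currency = `Unsupported currency "${entry.currency}". Choose one of: ${SUPPORTED_CURRENCIES.join(", ")}.`;
  }

  return errors; // an empty object means the entry is valid
}

// Each message is rendered next to its field as the user types:
console.log(validateEntry({ amount: "-20", currency: "CAD" }));
```

Pairing each message with its field means staff can see what is wrong, and how to fix it, as they type, rather than hunting for mistakes after a failed submission.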
What challenges did I face?
- Coordinating testing slots during busy business hours: I partnered with team leads to identify natural lulls (e.g., mid-morning breaks, end-of-day wrap-ups) and used a shared online sign-up sheet so participants could self-schedule without disrupting workflows.
- Iterating quickly in Figma under tight deadlines: I set up a shared component library and organized short, daily design syncs with the Web Designer, ensuring rapid feedback loops and alignment with their project milestones.
- Maintaining consistent testing conditions: I created a standardized test workspace (pre-configured browser profiles, local copies of assets) and ran a quick network-latency check before each session to ensure every participant experienced the same performance baseline.