Document summary
This research summary presents the results of the usability testing conducted on the sales network tools.
Objectives
As part of our ongoing efforts to improve the user experience and align our products with strategic goals, we have identified the System Usability Scale (SUS) score as a key performance indicator (KPI) for user satisfaction. Additionally, we will be tracking two other critical usability metrics: time on task and success rate. These metrics will provide a comprehensive view of how well our product supports primary user goals across web and mobile platforms.
The purpose of this usability test is to establish a usability benchmark for the product, which will serve as a foundation for future improvements.
Success metrics
SUS score
Time on task
Success rate
Test plan
Test plan | https://docs.google.com/document/d/1BAltDfoqDzOkX4O4fPh0PQemcIkFQiZgjvZArWvVlS8/edit?usp=sharing |
---|---|
Core user journeys
Persona | Core journey | Rationale |
---|---|---|
BRM & PRM (the primary point of contact within the sales network between business owners/marketers and the organization) | Onboarding After Receiving Activation Link | [Why this journey was selected + how you selected this journey] |
| Onboarding a Business Owner (a major part of why they use the platform: to onboard businesses) | [Why this journey was selected + how you selected this journey] |
| Selling and Assigning Cards to Business Owners and Customers [+ description] | [Why this journey was selected + how you selected this journey] |
| Monitoring Business Owners' Performance and Placing PND on Non-performing Business Owners [+ description] | [Why this journey was selected + how you selected this journey] |
| Taking Courses via Dashboard [+ description] | [Why this journey was selected + how you selected this journey] |
BRM (the primary point of contact within the sales network between business owners/marketers and the organization) | Ability to Cash Out Commissions [+ description] | [Why this journey was selected + how you selected this journey] |
SC & SPO (those who oversee the main contact within the sales network between business owners/marketers and the organization, specifically the managers of BRM/PRM) | Initiating the Onboarding Process of a BRM [+ description] | [Why this journey was selected + how you selected this journey] |
| Monitoring BRM/PRM Performance and Placing PND on Non-performing BRMs/PRMs [+ description] | |
| Ability to Cash Out Commissions [+ description] | |
| Selling Cards to BRMs | |
SC (those who oversee the main contact within the sales network between business owners/marketers and the organization, specifically the managers of BRM/PRM) | Assigning POS to BRMs | |
Test environment
The tests were unmoderated, remote, and recorded; participants navigated the journeys on their own. The tests were conducted on the current production platforms (mobile and web), as technical issues prevented the use of a test environment and prototypes.
Tasks
[insert text here]
Core journey | Tasks |
---|---|
View history & print receipt | A customer who recently made a transaction of N15,000.00 has come to ask you for a reprint of that receipt. Proceed to print the receipt. (Note: imagine this to be a POS terminal.) |
[journey name] | |
[journey name] | |
Participants
Participant information
Number of participants | 4 |
---|---|
Demographics | The participants, ranging in age from 25 to 40, include both male and female individuals. The group consists of a combination of new partners unfamiliar with the processes and existing users. |
Selection criteria | Partners selected for the test were both new and existing users. |
Participant profiles
Participant A | Comfort |
---|---|
Age | 25 |
Occupation | Personal Banking Relationship Manager |
Tech savviness | Moderate |
Fintech usage | Uses mobile banking apps (GTBank, Access Bank) for daily transactions. Prefers platforms that offer quick, reliable transfers and clear transaction histories. |
Primary devices | Android phone (Samsung) |
Key insights | Comfort is a new user who was recently onboarded to work as a PRM. She is often frustrated by network downtime and error messages during the onboarding process of a marketer.
Participant B | Francis |
---|---|
Age | 28 |
Occupation | Business Relationship Manager |
Tech savviness | High |
Fintech usage | Uses mobile money apps like OPay and PalmPay for personal savings, transfers, and accepting payments from customers. |
Primary devices | Android phone (Infinix Hot 10) |
Key insights | Francis interacts with business owners on a daily basis and requires an easy, smooth interface to adequately manage his downlines.
Participant C | Rabiu |
---|---|
Age | 40 |
Occupation | State Coordinator |
Tech savviness | High |
Fintech usage | Uses digital banking platforms and business fintech solutions. Regularly interacts with multiple banking platforms for business transactions. |
Primary devices | Windows laptop, iPhone 13 |
Key insights | Rabiu manages a network of BRMs and is particular about tracking their performance.
Participant D | Gabriel |
---|---|
Age | 31 |
Occupation | State Product Officer |
Tech savviness | High |
Fintech usage | Primarily uses personal banking apps like Kuda and ALAT for saving, budgeting, and personal transfers. Also utilises fintech apps for peer-to-peer transfers and mobile payments. |
Primary devices | Android phone (Samsung A32) |
Key insights | Gabriel prefers fintech apps that offer ease for managing his network of PRMs. He values proper reporting and the rewards systems, such as commissions and referral bonuses, to increase engagement. His main frustration comes from inconsistent reporting and error blockers while using the tool. |
Recruitment process
Participants were selected based on their partner user type to ensure representation of the Sales Network.
Recruitment channels included customer database and social media groups.
Screeners implemented include:
Participants regularly interact with the sales network tools.
Participants are current partners.
Usability metrics and results
Success rate definition
Success criteria:
A task is considered successful if the participant completes it as intended within the defined parameters, without assistance, and within a reasonable time frame.
Measurement:
Success rate will be calculated as the percentage of participants who successfully complete each task according to the predefined criteria.
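As a sketch of that calculation (in Python, with hypothetical participant names and task outcomes — `True` means the participant met the predefined success criteria unassisted):

```python
def success_rate(outcomes):
    """Percentage of participants who completed a task successfully.

    `outcomes` maps participant -> True/False, judged against the
    predefined criteria (completed as intended, without assistance,
    within a reasonable time frame).
    """
    if not outcomes:
        raise ValueError("no outcomes recorded")
    passed = sum(1 for ok in outcomes.values() if ok)
    return 100.0 * passed / len(outcomes)

# Example: 3 of 4 participants completed the task unassisted -> 75.0
rate = success_rate({"A": True, "B": True, "C": False, "D": True})
```

The same per-task figures can then be averaged across tasks to produce the overall success rate reported in the results table.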
SUS score
SUS is a standardised questionnaire used to measure the usability of a product. It uses a 10-question, 5-point scale to generate a score between 0 and 100, with higher scores indicating better usability.
The average SUS score is typically 68. A score of 68 or higher is generally considered acceptable or better than average. Above 70 is good, and above 80 is excellent.
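The SUS scoring rule is mechanical and easy to get wrong by hand, so a small sketch may help (Python, with hypothetical response data; the standard Brooke scoring is assumed: odd-numbered items contribute response − 1, even-numbered items contribute 5 − response, and the sum is scaled by 2.5):

```python
def sus_score(responses):
    """Compute a SUS score from one participant's 10 answers (each 1-5)."""
    if len(responses) != 10:
        raise ValueError("SUS requires exactly 10 responses")
    total = 0
    for i, r in enumerate(responses):
        if not 1 <= r <= 5:
            raise ValueError("responses must be on a 1-5 scale")
        # i is 0-based, so even indices are the odd-numbered (positive) items
        total += (r - 1) if i % 2 == 0 else (5 - r)
    return total * 2.5  # scale the 0-40 sum to 0-100

def average_sus(all_responses):
    """Study-level SUS: the mean of the individual participant scores."""
    scores = [sus_score(r) for r in all_responses]
    return sum(scores) / len(scores)
```

For example, a participant answering 3 to every item scores exactly 50, and a participant answering 5 to all positive items and 1 to all negative items scores 100.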
Time on task
The time it takes participants to complete each journey. Shorter times generally indicate better usability. This metric helps to measure efficiency.
[insert journey name]: users should/are expected to complete this journey within X minutes.
[insert journey name]: users should/are expected to complete this journey within X minutes.
[insert journey name]: users should/are expected to complete this journey within X minutes.
[insert journey name]: users should/are expected to complete this journey within X minutes.
[insert journey name]: users should/are expected to complete this journey within X minutes.
Success rate
[designer to define success for each journey]
[insert journey name]: this journey is considered successful when [specify].
[insert journey name]: this journey is considered successful when [specify].
[insert journey name]: this journey is considered successful when [specify].
[insert journey name]: this journey is considered successful when [specify].
[insert journey name]: this journey is considered successful when [specify].
Results
SUS score
The average SUS score was [insert score], with [brief interpretation: e.g., this score falls within/above/below the industry benchmark].
Analysis: the SUS score reveals [positive/negative] aspects of user satisfaction, particularly related to [insert key findings].
Time on task
The average time to complete key tasks was [insert time].
Analysis: tasks such as [task 1] took longer than anticipated due to [insert reason], indicating potential usability challenges.
Success rate
The overall success rate was [insert percentage]. This indicates that [insert percentage of participants] were able to successfully complete the core tasks without assistance.
Analysis: certain tasks such as [specific task] had a lower success rate due to [insert reasons], while others were completed with high efficiency.
Core journey | SUS score | Time on task | Success rate | Comments |
---|---|---|---|---|
[journey name] | e.g. 60 | 1m 30s | % | Brief key feedback or observation for this journey |
[journey name] | | | | Brief key feedback or observation for this journey |
[journey name] | | | | Brief key feedback or observation for this journey |
[journey name] | | | | Brief key feedback or observation for this journey |
[journey name] | | | | Brief key feedback or observation for this journey |
[journey name] | | | | Brief key feedback or observation for this journey |
[journey name] | | | | Brief key feedback or observation for this journey |
Post-test qualitative questionnaire feedback
Post-test questionnaire:
After completing the tasks, participants will be asked additional targeted questions to gather specific feedback on their experience. These questions will be presented BEFORE the SUS questionnaire, for example:
Was there anything confusing or frustrating about the sign-up process?
How easy or difficult did you find the sign-up process?
Did you encounter any difficulties in reading or understanding the text or labels during the tasks?
Were there any features or elements that stood out as particularly helpful or problematic?
Did you experience any challenges with the colour scheme or visual elements?
Do you have any additional comments or suggestions for improving this experience?
These questions will help uncover specific areas of concern or satisfaction that may not be fully captured by the SUS score alone.
Participants provided additional insights through open-ended post-test questions:
Common pain points | List any recurring issues, frustrations, or confusion that participants mentioned. |
---|---|
Positive feedback | Summarise any positive comments about the product or specific features. |
User suggestions | Insert suggestions for improvement or areas where users struggled but offered solutions. |
Recommendations for improvement
Based on the test findings, the following recommendations are proposed to improve the product’s usability:
Design changes
[Task/feature]: revise [UI component, workflow, etc.] to streamline the user journey and reduce time on task.
[Task/feature]: clarify instructions on [specific screen/interaction] to improve success rates.
Quick wins
Minor adjustments to [insert feature or task] that can be implemented without major development effort.
Long-term recommendations
Consider a more comprehensive redesign of [specific user journey], especially focusing on [insert issue].
Conclusion
Summary
The usability test revealed that while the product performs adequately in most areas, there are specific pain points related to [specific feature]. By focusing on these areas, we can significantly improve the user experience.
Next steps
The design team will implement the recommended changes and prepare for a follow-up usability test to measure improvements in SUS, time on task, and success rate.