Document summary
Insert brief summary here.
Objectives
As part of our ongoing efforts to improve the user experience and align our products with strategic goals, we have identified the System Usability Scale (SUS) score as a key performance indicator (KPI) for user satisfaction. Additionally, we will be tracking two other critical usability metrics: time on task and success rate. These metrics will provide a comprehensive view of how well our product supports primary user goals across web and mobile platforms.
The purpose of this usability test is to establish a usability benchmark for the product, which will serve as a foundation for future improvements.
Success metrics
SUS score
Time on task
Success rate
Test plan
Delete this info panel, this is only for guidance.
Add a link to the test plan, for example: https://docs.google.com/document/d/1IJSsiOfUI6HE8F8_SOF7Xuc0oS7dTuSG-O8YZwqNhb0/edit?usp=sharing
Test plan | Attach the original test plan for reference - insert link |
---|---|
Core user journeys
Delete this info panel, this is only for guidance.
Identify and define core user journeys relevant to different user personas.
Focus on tasks critical to user satisfaction and aligned with the product's value proposition.
Include a rationale for why these journeys were selected and how you selected these.
[insert text here]
Persona | Core journey | Rationale |
---|---|---|
Name of persona + description | Name of the core journey + description | Why this journey was selected + how you selected this journey |
Name of persona + description | Name of the core journey + description | Why this journey was selected + how you selected this journey |
Name of persona + description | Name of the core journey + description | Why this journey was selected + how you selected this journey |
Name of persona + description | Name of the core journey + description | Why this journey was selected + how you selected this journey |
Name of persona + description | Name of the core journey + description | Why this journey was selected + how you selected this journey |
Name of persona + description | Name of the core journey + description | Why this journey was selected + how you selected this journey |
Name of persona + description | Name of the core journey + description | Why this journey was selected + how you selected this journey |
Test environment
Delete this info panel, this is only for guidance.
Outline the approach you took - whether it was a moderated or unmoderated test, remote or in-person, which platform it was performed on (mobile, web, POS), whether you used interactive prototypes or a testing account, etc.
[insert text here]
Tasks
Delete this info panel, this is only for guidance.
Provide details on the tasks participants were asked to complete. Ensure tasks align with real-world scenarios that users of the product would encounter.
[insert text here]
Core journey | Tasks |
---|---|
View history & print receipt | A customer who recently made a transaction of N15,000.00 has come to ask you for a re-print of that receipt. Proceed to give them the receipt. Note: imagine this to be a POS terminal. |
[journey name] | |
[journey name] | |
Participants
Participant information
Number of participants | Indicate the total number of users involved (5-8 participants per journey). |
---|---|
Demographics | Include relevant demographic information such as age range, gender, location, experience level, job role, familiarity with the product, or other factors depending on the target audience. |
Selection criteria | Describe the criteria used for selecting participants. Were they existing users, new users, or a mix of both? |
Participant profiles
Delete this info panel, this is only for guidance.
If applicable (otherwise delete section), provide a few short participant profiles to offer context - see examples below.
Participant A | Chinedu |
---|---|
Age | 35 |
Occupation | SME owner (fashion retail) |
Tech savviness | Moderate |
Fintech usage | Uses mobile banking apps (GTBank, Access Bank) for daily transactions and a POS terminal for customer payments. Prefers platforms that offer quick, reliable transfers and clear transaction histories. |
Primary devices | Android phone (Tecno Camon 16) |
Key insights | Chinedu heavily relies on his POS terminal for business operations and uses fintech apps to track transactions. He is often frustrated by network downtime with POS systems and values transparency in fees. Easy-to-read transaction reports and the ability to quickly reconcile daily sales are critical for his business. |
Participant B | Halima |
---|---|
Age | 27 |
Occupation | Market trader |
Tech savviness | Low |
Fintech usage | Uses mobile money apps like OPay and PalmPay for personal savings, transfers, and accepting payments from customers. Not particularly tech-savvy, but uses these services due to their ease of access and lower transaction fees compared to traditional banks. |
Primary devices | Android phone (Infinix Hot 10) |
Key insights | Halima needs simple, straightforward fintech solutions with minimal steps to complete transactions. She values reliability, especially when receiving payments from customers. She finds features like SMS confirmations and immediate access to funds critical for trust in the service. |
Participant C | Tunde |
---|---|
Age | 40 |
Occupation | Corporate accountant (mid-sized firm) |
Tech savviness | High |
Fintech usage | Uses digital banking platforms and business fintech solutions for managing payroll, supplier payments, and invoicing. Regularly interacts with multiple banking platforms for business transactions. |
Primary devices | Windows laptop, iPhone 13 |
Key insights | Tunde requires fintech solutions that streamline business transactions and offer integration with accounting software. He values features like automated invoicing, bulk payments, and detailed reporting, but he finds it frustrating when banking platforms lack API integrations with their business software. Efficiency and robust security features are critical for business operations. |
Participant D | Amaka |
---|---|
Age | 22 |
Occupation | NYSC corps member (service year) |
Tech savviness | High |
Fintech usage | Primarily uses personal banking apps like Kuda and ALAT for saving, budgeting, and personal transfers. Also utilises fintech apps for peer-to-peer transfers and mobile payments. |
Primary devices | Android phone (Samsung A32) |
Key insights | Amaka prefers fintech apps that offer budgeting tools and savings features. She values gamification and rewards systems, such as cashbacks and referral bonuses, to increase engagement. Her main frustration comes from long transaction processing times or unexpected app downtimes. Security and user-friendly onboarding are important factors for her trust in the app. |
Participant E | Babatunde |
---|---|
Age | 50 |
Occupation | POS operator |
Tech savviness | Moderate |
Fintech usage | Relies heavily on mobile money apps like Paga and Moniepoint to run his POS business, facilitating cash withdrawals, deposits, and transfers for customers. He also uses traditional banking apps for reconciling the daily balance. |
Primary devices | Android phone (Itel A56) |
Key insights | Babatunde’s biggest concern is the reliability of the fintech app or platform, especially in high-traffic areas. He prefers platforms with low transaction failure rates and quick settlement times. Instant customer support is crucial when POS systems encounter issues, and he values platforms with lower transaction fees. |
Recruitment process
Delete this info panel, this is only for guidance.
Participants were selected based on specific demographic criteria to ensure representation of the target audience.
Recruitment channels included [insert recruitment methods: online panels, customer database, etc.].
Screeners were used to ensure participants met the following criteria:
- regularly interact with [specific technology/product type]
- other criteria related to the specific test, such as being a current customer, not being a customer, using a competitor’s product, etc.
[insert text here]
Usability metrics and results
Success rate definition
Delete this info panel, this is only for guidance.
Success criteria:
A task is considered successful if the participant completes it as intended within the defined parameters, without assistance, and within a reasonable time frame.
Define clear success criteria for each task, such as reaching a specific page, completing a transaction, or filling out a form correctly.
Measurement:
Success rate will be calculated as the percentage of participants who successfully complete each task according to the predefined criteria.
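To make the measurement concrete, here is a minimal sketch (in Python) of the success-rate calculation; the task names and pass/fail outcomes are illustrative placeholders, not data from this study:

```python
# Minimal sketch: success rate per task from pass/fail observations.
# Each list entry records whether one participant completed the task
# according to the predefined criteria (unassisted, within the time frame).
# Task names and outcomes are illustrative placeholders.
observations = {
    "View history & print receipt": [True, True, False, True, True],
    "[journey name]": [True, False, True, True, True],
}

for task, outcomes in observations.items():
    passed = sum(outcomes)
    rate = 100 * passed / len(outcomes)
    print(f"{task}: {rate:.0f}% ({passed}/{len(outcomes)} participants)")
```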
SUS score
SUS is a standardised questionnaire used to measure the usability of a product. It uses ten questions, each answered on a 5-point scale, to generate a score between 0 and 100, with higher scores indicating better usability.
The average SUS score is typically 68. A score of 68 or higher is generally considered acceptable or better than average. Above 70 is good, and above 80 is excellent.
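For reference, the standard SUS scoring procedure converts the ten raw 1-5 responses into a 0-100 score: odd-numbered items contribute (response - 1), even-numbered items contribute (5 - response), and the sum is multiplied by 2.5. Below is a minimal sketch of that calculation in Python; the response values are illustrative placeholders:

```python
# Minimal sketch: standard SUS scoring for one participant's 10 responses
# (raw 1-5 answers, item 1 first). Odd items contribute (response - 1),
# even items contribute (5 - response); the sum is multiplied by 2.5.
def sus_score(responses):
    if len(responses) != 10:
        raise ValueError("SUS requires answers to all 10 items")
    total = sum((r - 1) if i % 2 == 1 else (5 - r)
                for i, r in enumerate(responses, start=1))
    return total * 2.5

# Illustrative responses for two participants (not study data)
participants = [
    [4, 2, 5, 1, 4, 2, 4, 2, 5, 1],  # scores 85.0
    [3, 3, 4, 2, 3, 2, 4, 3, 4, 2],  # scores 65.0
]
average = sum(sus_score(p) for p in participants) / len(participants)
print(round(average, 1))  # 75.0
```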
Time on task
The time it takes participants to complete each journey. Shorter times generally indicate better usability. This metric helps to measure efficiency.
[insert journey name]: users should/are expected to complete this journey within X minutes.
[insert journey name]: users should/are expected to complete this journey within X minutes.
[insert journey name]: users should/are expected to complete this journey within X minutes.
[insert journey name]: users should/are expected to complete this journey within X minutes.
[insert journey name]: users should/are expected to complete this journey within X minutes.
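A minimal sketch of how the recorded completion times could be summarised against the targets above; the journey name, times, and target below are illustrative placeholders (in seconds):

```python
# Minimal sketch: summarising time on task per journey against its target.
# Observed times and the target are illustrative placeholders, in seconds.
from statistics import mean, median

times = {"View history & print receipt": [85, 110, 95, 140, 100]}
targets = {"View history & print receipt": 120}  # expected maximum

for journey, observed in times.items():
    within = sum(t <= targets[journey] for t in observed)
    print(f"{journey}: mean {mean(observed):.0f}s, "
          f"median {median(observed):.0f}s, "
          f"{within}/{len(observed)} within the {targets[journey]}s target")
```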
Success rate
[designer to define success for each journey]
[insert journey name]: this journey is considered successful when [specify].
[insert journey name]: this journey is considered successful when [specify].
[insert journey name]: this journey is considered successful when [specify].
[insert journey name]: this journey is considered successful when [specify].
[insert journey name]: this journey is considered successful when [specify].
Results
Delete this info panel, this is only for guidance.
SUS score
The average SUS score was [insert score], with [brief interpretation: e.g., this score falls within/above/below the industry benchmark].
Analysis: the SUS score reveals [positive/negative] aspects of user satisfaction, particularly related to [insert key findings].
Time on task
The average time to complete key tasks was [insert time].
Analysis: tasks such as [task 1] took longer than anticipated due to [insert reason], indicating potential usability challenges.
Success rate
The overall success rate was [insert percentage]. This indicates that [insert percentage of participants] were able to successfully complete the core tasks without assistance.
Analysis: certain tasks such as [specific task] had a lower success rate due to [insert reasons], while others were completed with high efficiency.
Core journey | SUS score | Time on task | Success rate | Comments |
---|---|---|---|---|
[journey name] | e.g. 60 | 1m 30s | % | Brief key feedback or observation for this journey |
[journey name] |  |  |  | Brief key feedback or observation for this journey |
[journey name] |  |  |  | Brief key feedback or observation for this journey |
[journey name] |  |  |  | Brief key feedback or observation for this journey |
[journey name] |  |  |  | Brief key feedback or observation for this journey |
[journey name] |  |  |  | Brief key feedback or observation for this journey |
[journey name] |  |  |  | Brief key feedback or observation for this journey |
Post-test qualitative questionnaire feedback
Delete this info panel, this is only for guidance.
Post-test questionnaire:
After completing the tasks, participants will be asked additional targeted questions to gather specific feedback on their experience. These questions will be presented BEFORE the SUS questionnaire, for example:
Was there anything confusing or frustrating about the sign-up process?
How easy or difficult did you find the sign-up process?
Did you encounter any difficulties in reading or understanding the text or labels during the tasks?
Were there any features or elements that stood out as particularly helpful or problematic?
Did you experience any challenges with the colour scheme or visual elements?
Do you have any additional comments or suggestions for improving this experience?
These questions will help uncover specific areas of concern or satisfaction that may not be fully captured by the SUS score alone.
Participants provided additional insights through open-ended post-test questions:
Common pain points | List any recurring issues, frustrations, or confusion that participants mentioned. |
---|---|
Positive feedback | Summarise any positive comments about the product or specific features. |
User suggestions | Insert suggestions for improvement or areas where users struggled but offered solutions. |
Recommendations for improvement
Based on the test findings, the following recommendations are proposed to improve the product’s usability:
Design changes
[Task/feature]: revise [UI component, workflow, etc.] to streamline the user journey and reduce time on task.
[Task/feature]: clarify instructions on [specific screen/interaction] to improve success rates.
Quick wins
Minor adjustments to [insert feature or task] that can be implemented without major development effort.
Long-term recommendations
Consider a more comprehensive redesign of [specific user journey], especially focusing on [insert issue].
Conclusion
Summary
The usability test revealed that while the product performs adequately in most areas, there are specific pain points related to [specific feature]. By focusing on these areas, we can significantly improve the user experience.
Next Steps
The design team will implement the recommended changes and prepare for a follow-up usability test to measure improvements in SUS, time on task, and success rate.