AI Testing Platform

Designed an AI testing platform that builds trust with Test Managers, extending human capabilities while delivering a 35% efficiency gain.

My contribution
User research, UX/UI design, Prototyping
Team
1 Product Designer (me), 1 Product Manager, 5 Engineers

The problem

Since this project was directly aligned with a critical business objective, I prioritized understanding its purpose and success criteria before beginning any research or design work. GAT was looking for opportunities to reduce operational overhead and free up Test Manager capacity so they could focus on higher-value activities.

User research

We ran weekly user research sessions to observe Test Managers navigating their current processes. From this study we discovered the following insights:

The testing process involved distinct stages: launching tests, moderating results, and delivering outcomes.

Test Managers were using spreadsheets for all data management.

Each stage involved multiple manual tasks, consuming significant time.

AI Testing Platform user research insights
Miro board analysis of user research findings, mapping out the key insights and pain points discovered during our weekly observation sessions with Test Managers.
AI Testing Platform user research insights 2
Spreadsheet used by Test Managers to manually manage their testing process.

The solution

Designing the solution took multiple iterations, each informed by user testing. I first mapped out the high-level logic, then built prototypes. We explored several options, particularly around automating decision-making. The core challenge was presenting complex testing data in a simple, easy-to-understand way while incorporating AI assistance.

Lo-fi Wireframes

AI Testing Platform wireframes
Lo-fi wireframes showing the new platform's interface and how Test Managers would use it.

Hi-fi Designs

AI Testing Platform hi-fidelity designs
Hi-fi designs of the new platform showing how many testers have run each test, along with its status and progress.

User Feedback

We iterated on the prototypes after remote user testing sessions in which participants completed a full testing cycle using the designs. We observed how well they understood the AI-assisted process and followed up to learn how they expected the new system to affect their workflow.

Results

The final design transformed the way Global App Testing Managers run the testing process. The new platform stores and tracks data previously managed in spreadsheets, and uses machine learning and AI to automate most of the Test Managers' decisions and actions.

🏆 Testing efficiency increased by 35%

🏆 User satisfaction improved by 50%

🏆 Test completion rate increased by 28%

Where do we go from here?

First, I would recommend running another round of usability testing with new participants: have the changes made it easier for users to complete their tasks?

If usability testing is not an option, I would recommend tracking the following metrics:

What percentage of Test Managers are currently using the new software?

How has the new software impacted the testing delivery time and results quality?