Microsoft Accessibility Insights

Boosted the efficiency of developers new to accessibility testing by 55%

SUMMARY

Accessibility Insights is Microsoft's open source accessibility testing product. Developers new to accessibility testing often found the tool confusing and hard to navigate. To tackle this, we conducted a user study and revamped the information architecture and UX writing. This targeted effort streamlined navigation and clarified instructions, leading to a significant improvement in task completion time for first-time users.

ROLE

Product Manager
Visual Designer
UX Writer

TEAM

Client POC
UX Researchers
UX Designer

Discovery

Audit

To understand Accessibility Insights' structure and user pain points, we conducted a product teardown, reviewed past research, and analyzed competitor offerings. This deep dive informed our research hypotheses, paving the way for targeted improvements.

Insights

Only 28.6% of professionals report having actively implemented web accessibility for at least 3 years, and fewer than 54% of these professionals ever test their work for mobile users. (Source)

01
Perceived Complexity

Users perceive accessibility as complex and daunting, creating a hesitation to even start learning about it.

02
Jargon

Technical terms and obscure language made navigating the platform confusing, particularly for team members without an accessibility background.

03
Navigation

The way users moved between different accessibility tests felt convoluted and inconsistent with standard design patterns.

How might we improve the learnability of Microsoft’s Accessibility Insights for its novice users?

Research

Learnability Study

We designed 3 tasks based on hypotheses about user behavior and ran 3 trials per task per participant. We measured time on task and error counts, and analyzed participants' note-taking to understand learning curves and how they organized information.
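
As an illustration of how per-trial data like this can be summarized into a learning curve, here is a minimal sketch; the numbers and structure are hypothetical, not the study's raw data:

```python
from statistics import mean

# Hypothetical time-on-task (seconds) for one task: one row per
# participant, one column per trial. The real study data differed.
tot_by_participant = [
    [210, 140, 95],
    [185, 120, 90],
    [240, 150, 100],
]

# Mean TOT per trial across participants; a downward trend across
# trials is the learnability signal we looked for.
for trial, times in enumerate(zip(*tot_by_participant), start=1):
    print(f"Trial {trial}: mean TOT = {mean(times):.0f}s")
```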

Interviews

We conducted interviews to gain an in-depth understanding of participants' thoughts on the tool, and used them to correlate observed behavior with our quantitative metrics: time on task (TOT) and number of errors.

Analysis

Quantitative Findings

The tool received an average SUS score of 78, which corresponds to 'Good' usability.
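
For context, SUS is scored from ten 1–5 Likert items using a standard formula; the sketch below shows the computation (the example responses are hypothetical, not our participants' data):

```python
def sus_score(responses):
    """Score ten 1-5 Likert responses on the System Usability Scale.

    Odd-numbered items are positively worded: contribution = response - 1.
    Even-numbered items are negatively worded: contribution = 5 - response.
    The summed contributions are multiplied by 2.5 for a 0-100 score.
    """
    total = sum((r - 1) if i % 2 == 1 else (5 - r)
                for i, r in enumerate(responses, start=1))
    return total * 2.5

# Hypothetical example: one participant's ten item responses.
print(sus_score([4, 2, 4, 1, 5, 2, 4, 2, 4, 1]))  # 82.5
```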

While the failure-reporting task took slightly longer, deeper qualitative analysis suggests users spent that time on detailed feedback as they familiarized themselves with the tool, which aligns with the zero errors recorded in the final trial. We then used qualitative data to understand initial difficulties and lengthy first attempts.

Visual Analysis & Thematic Grouping

We made sense of think-aloud feedback through affinity mapping.
We also analyzed note-taking documents from the third task to understand how participants documented failures, which guided the design of the overview page.

Prioritization

Based on our learnability study and think-aloud user feedback, we identified 8 major concerns. We scoped our work to solving the first three.

Ideation & Feedback

Extension Launcher

Overall, the expert (a Senior UX Designer at Microsoft) and two MS HCI students preferred the second layout.

Overview Page

Wireframe feedback focused on reducing information overload. Users requested search, filter, and sort controls for tests, along with clearer terminology, such as "Download Assessment" instead of "Save Assessment."
Users were also confused about the difference between the assessment types (FastPass, QuickAssess, etc.) on the Chrome extension launcher.

Navigation Bar

We received feedback to remove the progress circle and instead add a quantitative metric, such as "2/3 done," for each test.

Visual Design & UX Copy

Fluent Design System

Using Microsoft's Fluent design system as a foundation, we introduced tailored components that fit seamlessly into the existing product experience.

Redesigned Extension Launcher

Redesigned Overview Page

Redesigned Navigation Bar

Evaluative Research

We conducted a cognitive walkthrough with 3 experts: a Senior UX Designer from Microsoft and two individuals with visual impairments who have backgrounds in accessibility and HCI.
We conducted usability testing with 4 novice users who had limited knowledge of accessibility and were being introduced to the tool for the first time.
Each expert completed 5 tasks based on the redesigned components.

Results

Overall, the time-on-task data collected during our evaluative user study suggests that the redesigns made users more efficient.
For onboarding, users’ average time on task went down by 93%; this gain in efficiency is also supported by qualitative data in which users described the information provided about each test as comprehensive and clear.
For the task where users were asked to report failures from the overview page, time on task went down by 55%.
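
These efficiency figures are simple relative reductions in mean time on task; a minimal sketch of the arithmetic (the seconds shown are illustrative placeholders, not the raw study data):

```python
def tot_reduction(baseline_s, redesigned_s):
    """Relative reduction in mean time on task, as a percentage."""
    return (baseline_s - redesigned_s) / baseline_s * 100

# Illustrative placeholder values only: a drop from 300s to 21s
# corresponds to the ~93% onboarding improvement reported above.
print(f"{tot_reduction(300, 21):.0f}%")  # -> 93%
```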

Through simple conversations and observations, we gained valuable insights into novice users' pain points. Combining these qualitative observations with quantitative data allowed us to measure the impact of our design efforts.

I spearheaded the visual design and UX copywriting, drawing on user interviews to learn users' own terminology and simplify the information architecture to match their mental models.

Let's explore how my expertise in design and strategic insight can unlock new possibilities for your team.