Why Accessibility Testing Matters

Web accessibility (often abbreviated as a11y) ensures that people with disabilities can perceive, understand, navigate, and interact with websites. This includes people who are blind or have low vision, are deaf or hard of hearing, or have motor disabilities, cognitive disabilities, or temporary impairments.

Accessibility is not optional for QA:

  • Legal requirement — Laws like the ADA (US), EAA (EU), and AODA (Ontario, Canada) mandate web accessibility
  • Business impact — 15-20% of the global population has some form of disability
  • SEO benefit — Many accessibility practices improve search engine optimization
  • Quality indicator — Accessible sites tend to be better-structured and more robust overall

WCAG 2.2 Overview

WCAG (the Web Content Accessibility Guidelines) is organized around four principles, abbreviated POUR: Perceivable, Operable, Understandable, and Robust.

Perceivable

Content must be presentable in ways users can perceive:

  • Text alternatives for images (alt text)
  • Captions for video/audio
  • Content adaptable to different presentations
  • Sufficient color contrast

Operable

Interface must be navigable and operable:

  • All functionality available via keyboard
  • Enough time to read and interact
  • No content that causes seizures
  • Clear navigation and wayfinding

Understandable

Content and interface must be understandable:

  • Readable text
  • Predictable behavior
  • Input assistance (error identification, labels)

Robust

Content must work with current and future technologies:

  • Valid HTML
  • Proper ARIA usage
  • Compatible with assistive technologies

Conformance Levels

Level | Description | Typical Requirement
A | Minimum accessibility | Rarely sufficient
AA | Standard compliance | Most legal requirements
AAA | Enhanced accessibility | Specific contexts only

Most organizations target WCAG 2.2 Level AA.

Manual Testing Techniques

Keyboard Navigation Testing

The most impactful manual test you can perform:

  1. Put your mouse aside. Navigate the entire page using only the keyboard.
  2. Tab through all interactive elements. Can you reach every link, button, form field, and menu?
  3. Check focus visibility. Is there a visible focus indicator on the currently focused element?
  4. Test keyboard traps. Can you Tab into and out of modals, dropdowns, and embedded content?
  5. Verify logical tab order. Does focus move in a logical reading order (left to right, top to bottom)?

Key | Expected Action
Tab | Move to next interactive element
Shift+Tab | Move to previous interactive element
Enter | Activate links and buttons
Space | Toggle checkboxes, activate buttons
Arrow keys | Navigate within menus, radio groups, tabs
Escape | Close modals, dropdowns, popups
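The "logical tab order" check in step 5 can be partially automated. Below is a minimal sketch in plain JavaScript — the `isLogicalReadingOrder` helper is my own, not part of any testing library. Given each focusable element's bounding box in the order focus visits them (collected, for example, by pressing Tab in a browser automation script and recording `document.activeElement`), it verifies that focus moves left to right within a row and top to bottom across rows:

```javascript
// Each record is an element's bounding box, listed in the order that
// pressing Tab visits it. Hypothetical helper, not from any library.
function isLogicalReadingOrder(boxes, rowTolerance = 10) {
  for (let i = 1; i < boxes.length; i++) {
    const prev = boxes[i - 1];
    const curr = boxes[i];
    const sameRow = Math.abs(curr.top - prev.top) <= rowTolerance;
    // Within a row, focus should move left to right;
    // otherwise it should move downward to a later row.
    if (sameRow ? curr.left < prev.left : curr.top < prev.top) {
      return false;
    }
  }
  return true;
}

// Example: header link, then two fields on one row, then a button below.
const tabOrder = [
  { top: 0, left: 0 },
  { top: 50, left: 0 },
  { top: 50, left: 200 },
  { top: 120, left: 0 },
];
console.log(isLogicalReadingOrder(tabOrder)); // true
```

A helper like this catches gross ordering bugs (e.g. a positive `tabindex` pulling a footer button ahead of the main form), but it cannot replace actually tabbing through the page.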

Screen Reader Testing

Test with at least one screen reader:

Platform | Screen Reader | Browser
Windows | NVDA (free) | Firefox
Windows | JAWS | Chrome
macOS | VoiceOver (built-in) | Safari
iOS | VoiceOver (built-in) | Safari
Android | TalkBack (built-in) | Chrome

Basic VoiceOver testing on macOS:

  1. Press Cmd+F5 to enable VoiceOver
  2. Use VO+Right Arrow (Ctrl+Option+Right) to move through content
  3. Listen for: image descriptions, heading levels, link text, form labels
  4. Check that dynamic content updates are announced

Color and Visual Testing

  • Test the page in grayscale — is information conveyed only by color?
  • Check contrast ratios with browser extensions (e.g., WAVE, axe DevTools)
  • Zoom to 200% — does the layout still work? Is content still readable?
  • Test with high contrast mode on Windows
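Contrast ratios can also be checked with a few lines of code. The formula below is the one defined by WCAG 2.x (relative luminance from linearized sRGB channels, then a ratio of the lighter to the darker luminance); the function names are my own:

```javascript
// Relative luminance of an sRGB color, per the WCAG 2.x definition:
// L = 0.2126 R + 0.7152 G + 0.0722 B over gamma-linearized channels.
function relativeLuminance([r, g, b]) {
  const [R, G, B] = [r, g, b].map((v) => {
    const c = v / 255;
    return c <= 0.03928 ? c / 12.92 : Math.pow((c + 0.055) / 1.055, 2.4);
  });
  return 0.2126 * R + 0.7152 * G + 0.0722 * B;
}

// Contrast ratio = (L_lighter + 0.05) / (L_darker + 0.05).
function contrastRatio(colorA, colorB) {
  const la = relativeLuminance(colorA);
  const lb = relativeLuminance(colorB);
  const [lighter, darker] = la >= lb ? [la, lb] : [lb, la];
  return (lighter + 0.05) / (darker + 0.05);
}

// Black on white is the maximum possible ratio, 21:1.
console.log(contrastRatio([0, 0, 0], [255, 255, 255])); // 21
// AA requires >= 4.5:1 for normal text and >= 3:1 for large text.
console.log(contrastRatio([0x77, 0x77, 0x77], [255, 255, 255]).toFixed(2));
```

Browser extensions like WAVE and axe DevTools apply this same formula; a standalone function is handy for spot-checking a design system's color tokens in a unit test.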

Automated Testing Tools

axe-core

The most widely used accessibility testing engine:

# Install axe-core for Playwright
npm install @axe-core/playwright

// a11y.spec.js — Playwright test using the axe-core engine
const { test, expect } = require('@playwright/test');
const AxeBuilder = require('@axe-core/playwright').default;

test('page has no accessibility violations', async ({ page }) => {
  await page.goto('/');
  const results = await new AxeBuilder({ page }).analyze();
  expect(results.violations).toEqual([]);
});

Lighthouse Accessibility Audit

Covered in Lesson 5.18 — the Accessibility category checks ~50 automated rules.

WAVE Browser Extension

Free browser extension that provides visual overlay of accessibility issues directly on the page. Useful for manual review alongside automated checks.

Exercise: Accessibility Audit of a Web Page

Perform a comprehensive accessibility audit combining automated and manual testing.

Part 1: Automated Scan

Run an axe-core scan on a page of your choice:

  1. Install the axe DevTools browser extension
  2. Open the page and run a full scan
  3. Document all findings:
Issue | Impact | Element | WCAG Criterion

Rate each issue's impact as Critical, Serious, Moderate, or Minor.

Part 2: Keyboard Audit

Navigate the same page using only the keyboard:

Check | Pass/Fail | Notes
All interactive elements reachable via Tab | |
Visible focus indicator on every element | |
Logical tab order | |
Can enter and exit modals with keyboard | |
Dropdown menus operable with arrow keys | |
Skip navigation link present | |
No keyboard traps | |

Part 3: Screen Reader Check

Enable VoiceOver (Mac) or NVDA (Windows) and navigate the page:

Check | Pass/Fail | Notes
All images have meaningful alt text | |
Headings form a logical hierarchy | |
Form fields have associated labels | |
Links have descriptive text | |
Dynamic content changes are announced | |
Page language is declared | |
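The heading-hierarchy check above can also be automated. A minimal sketch (the helper is hypothetical): given the heading levels in document order, flag any position where a level is skipped, such as an h2 followed directly by an h4.

```javascript
// Given heading levels in document order (e.g. [1, 2, 3, 2, 3]),
// return the indexes where a level is skipped (h2 -> h4, etc.).
// Hypothetical helper, not from any library.
function skippedHeadingLevels(levels) {
  const problems = [];
  for (let i = 1; i < levels.length; i++) {
    // Going deeper by more than one level skips a heading level.
    if (levels[i] > levels[i - 1] + 1) problems.push(i);
  }
  return problems;
}

console.log(skippedHeadingLevels([1, 2, 3, 2, 3])); // []
console.log(skippedHeadingLevels([1, 3, 4]));       // [1]
```

In a browser you could collect the input with `document.querySelectorAll('h1,h2,h3,h4,h5,h6')` and map each element's tag name to its numeric level.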

Part 4: Visual Checks

Check | Pass/Fail | Notes
All text meets 4.5:1 contrast (normal) | |
All text meets 3:1 contrast (large) | |
Page works at 200% zoom | |
Info not conveyed by color alone | |
Focus indicator visible against all backgrounds | |

Solution: Sample Accessibility Audit Report

Page audited: example-shop.com/products

Automated (axe-core) — 8 issues found:

Issue | Impact | Count | Fix
Images missing alt text | Critical | 4 | Add descriptive alt attributes
Form inputs without labels | Critical | 2 | Associate <label> with each input
Insufficient color contrast | Serious | 3 | Increase contrast to 4.5:1 minimum
Missing page language | Serious | 1 | Add lang="en" to <html>

Keyboard — 3 issues found:

  1. Product filter dropdown cannot be operated with arrow keys — FAIL
  2. Modal has no focus trap — Tab moves behind the modal — FAIL
  3. No skip navigation link — user must tab through 15 nav items — FAIL

Screen Reader — 2 issues found:

  1. Product prices are read as plain numbers without currency — confusing
  2. “Add to cart” confirmation is not announced to screen reader users

Visual — 1 issue found:

  1. Sale badge uses only color (red) to indicate sale status — no text label

Priority fixes:

  1. Add alt text to product images (Critical, easy fix)
  2. Add form labels (Critical, easy fix)
  3. Fix color contrast (Serious, CSS changes)
  4. Add focus trap to modal (Serious, JavaScript fix)
  5. Add lang attribute (Serious, one-line fix)
  6. Announce cart confirmation to screen readers using aria-live region

Integrating A11y into Your Workflow

Shift-Left Approach

  • Design phase: Review mockups for contrast, focus states, heading hierarchy
  • Development: Use axe-core linter in IDE to catch issues before commit
  • Code review: Check for alt text, ARIA attributes, semantic HTML
  • Testing: Automated axe scans + manual keyboard/screen reader testing
  • CI/CD: Fail builds on critical accessibility violations
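Failing a build on critical violations can be as simple as filtering the `violations` array that axe-core returns — each violation carries an `impact` of `minor`, `moderate`, `serious`, or `critical`. A sketch with a mocked results object (the `gateViolations` helper is my own):

```javascript
// axe-core reports violations with an `impact` field:
// "minor" | "moderate" | "serious" | "critical".
function gateViolations(violations, blocking = ['critical', 'serious']) {
  return violations.filter((v) => blocking.includes(v.impact));
}

// Mocked results, shaped like axe-core output.
const results = {
  violations: [
    { id: 'image-alt', impact: 'critical', nodes: [{}, {}] },
    { id: 'color-contrast', impact: 'serious', nodes: [{}] },
    { id: 'region', impact: 'moderate', nodes: [{}] },
  ],
};

const blockers = gateViolations(results.violations);
console.log(blockers.map((v) => v.id)); // ['image-alt', 'color-contrast']
if (blockers.length > 0) {
  // In CI you would fail the job here, e.g. set process.exitCode = 1.
  console.error(`${blockers.length} blocking a11y violations`);
}
```

Gating only on critical and serious impacts lets teams adopt the check without immediately failing on a backlog of moderate findings; the threshold can be tightened over time.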

Common ARIA Patterns to Verify

Pattern | Verify
aria-label | Element has no visible text but needs a name
aria-labelledby | References an existing element ID
aria-describedby | Provides additional descriptive text
aria-expanded | Toggles between true/false for accordions
aria-live="polite" | Dynamic content announces updates
role="alert" | Urgent messages announced immediately
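A broken aria-labelledby — one that points at a nonexistent ID — silently leaves the element unnamed for screen reader users. The check can be sketched in plain JavaScript (the helper name and data shape are mine); in a browser you would build the inputs from the live DOM, e.g. via `document.querySelectorAll('[aria-labelledby]')`:

```javascript
// Verify every aria-labelledby reference points at an existing element ID.
// aria-labelledby may hold a space-separated list of IDs.
// Hypothetical helper, not from any library.
function danglingLabelRefs(ids, elements) {
  const idSet = new Set(ids);
  return elements.filter((el) =>
    el.labelledby.split(/\s+/).some((ref) => !idSet.has(ref))
  );
}

const pageIds = ['billing-title', 'shipping-title'];
const labelled = [
  { selector: '#billing', labelledby: 'billing-title' },
  { selector: '#promo', labelledby: 'promo-title' }, // no such ID
];
console.log(danglingLabelRefs(pageIds, labelled).map((e) => e.selector));
// ['#promo']
```

Automated engines like axe-core include an equivalent rule, but a standalone check like this is useful when reviewing component templates before they ever reach a browser.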

Key Takeaways

  • Accessibility testing requires both automated tools and manual testing — neither alone is sufficient
  • Keyboard testing is the single most impactful manual test for accessibility
  • Target WCAG 2.2 Level AA compliance as the minimum standard
  • Use axe-core in your test automation and CI/CD pipeline for continuous checking
  • Test with at least one screen reader to catch issues automated tools miss
  • Accessibility bugs are often the easiest to fix (adding alt text, labels, contrast) but the most impactful for users