Why Accessibility Testing Matters
Accessibility testing ensures that people with disabilities can use your product. This includes users who are blind or have low vision, are deaf or hard of hearing, or have motor disabilities, cognitive disabilities, or temporary impairments (a broken arm, bright sunlight on a screen).
Roughly 15% of the world’s population — over 1 billion people — experience some form of disability. Beyond the ethical imperative, accessibility is increasingly a legal requirement. The Americans with Disabilities Act (ADA), European Accessibility Act (EAA), and similar laws worldwide mandate accessible digital products. Lawsuits over web accessibility have grown significantly, with over 4,000 ADA-related digital lawsuits filed in the US annually.
For a QA engineer, accessibility testing is not optional: it is a core quality attribute that you must verify.
WCAG 2.1: The Standard
The Web Content Accessibility Guidelines (WCAG) 2.1, published by the W3C, is the global standard for web accessibility. It is organized around four principles known as POUR.
POUR Principles
Perceivable — Information and UI components must be presentable to users in ways they can perceive.
- Text alternatives for images (alt text)
- Captions for video
- Sufficient color contrast
- Content adaptable to different presentations
Operable — UI components and navigation must be operable.
- Keyboard accessible (all functionality via keyboard)
- Enough time to read and use content
- No content that causes seizures (flashing)
- Navigable (clear headings, focus order, skip links)
Understandable — Information and UI operation must be understandable.
- Readable text (language specified, clear writing)
- Predictable behavior (consistent navigation, no unexpected changes)
- Input assistance (error identification, labels, instructions)
Robust — Content must be robust enough to be interpreted by assistive technologies.
- Valid HTML
- Proper use of ARIA attributes
- Compatible with current and future user agents
Conformance Levels
| Level | Description | Requirement |
|---|---|---|
| A | Minimum accessibility | Must-have basics (alt text, keyboard access, form labels) |
| AA | Standard accessibility | Most legal requirements target this level (contrast, resize, captions) |
| AAA | Enhanced accessibility | Highest standard (sign language, enhanced contrast, reading level) |
Most organizations target Level AA compliance. Level AAA is aspirational and not typically required by law.
Common Accessibility Issues
Visual
- Missing alt text: Images without descriptive `alt` attributes
- Poor color contrast: Text-to-background contrast below 4.5:1
- Color-only information: Using color alone to convey meaning (red = error, green = success)
- Missing focus indicators: No visible outline when tabbing through interactive elements
- Text in images: Critical text embedded in images rather than HTML
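The 4.5:1 contrast threshold in the list above comes from WCAG's contrast-ratio formula, which compares the relative luminance of the two colors. A minimal Python sketch of the calculation (the function names and sample colors are illustrative):

```python
def relative_luminance(hex_color: str) -> float:
    """Relative luminance of an sRGB color, per the WCAG 2.1 definition."""
    def channel(value: int) -> float:
        c = value / 255
        # Linearize the sRGB channel value
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
    r, g, b = (int(hex_color.lstrip("#")[i:i + 2], 16) for i in (0, 2, 4))
    return 0.2126 * channel(r) + 0.7152 * channel(g) + 0.0722 * channel(b)

def contrast_ratio(fg: str, bg: str) -> float:
    """(L1 + 0.05) / (L2 + 0.05), with L1 the lighter of the two colors."""
    l1, l2 = sorted((relative_luminance(fg), relative_luminance(bg)), reverse=True)
    return (l1 + 0.05) / (l2 + 0.05)

print(round(contrast_ratio("#000000", "#ffffff"), 2))  # 21.0 — black on white
print(round(contrast_ratio("#767676", "#ffffff"), 2))  # 4.54 — just passes AA
```

Note that #777777 on white comes out at roughly 4.48:1 and fails AA, which is why eyeballing contrast is unreliable and the formula (or a checker built on it) is worth having.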
Keyboard Navigation
- Keyboard traps: Focus gets stuck in a component with no way to tab out
- Illogical tab order: Focus jumps around the page unexpectedly
- Missing skip links: No way to skip past repetitive navigation to main content
- Non-keyboard-accessible controls: Dropdowns, modals, or custom widgets that only work with a mouse
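Some tab-order problems can be caught statically before the manual pass. A small sketch using Python's standard `html.parser` that flags positive `tabindex` values, a common cause of illogical focus order because they override the natural DOM order (the class name is illustrative):

```python
from html.parser import HTMLParser

class TabindexChecker(HTMLParser):
    """Flags elements with a positive tabindex, which overrides DOM focus order."""
    def __init__(self):
        super().__init__()
        self.issues = []

    def handle_starttag(self, tag, attrs):
        value = dict(attrs).get("tabindex")
        # tabindex="0" and tabindex="-1" are legitimate; only positive values reorder focus
        if value is not None and value.lstrip("-").isdigit() and int(value) > 0:
            self.issues.append(f"<{tag} tabindex={value}>: positive tabindex overrides DOM order")

checker = TabindexChecker()
checker.feed('<a href="/" tabindex="3">Home</a><button tabindex="0">OK</button>')
print(checker.issues)  # one issue: the link with tabindex="3"
```

Keyboard traps and mouse-only widgets still require real keyboard testing in a browser; a static check like this only catches the markup-level anti-pattern.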
Screen Reader
- Missing ARIA labels: Interactive elements without accessible names
- Incorrect heading hierarchy: Jumping from `<h1>` to `<h4>` breaks navigation
- Missing form labels: Input fields without associated `<label>` elements
- Dynamic content not announced: AJAX updates that screen readers do not detect
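Heading-hierarchy skips are easy to detect automatically. A minimal sketch using Python's standard `html.parser` (function and class names are illustrative):

```python
from html.parser import HTMLParser

class HeadingCollector(HTMLParser):
    """Collects heading levels (h1-h6) in document order."""
    def __init__(self):
        super().__init__()
        self.levels = []

    def handle_starttag(self, tag, attrs):
        if len(tag) == 2 and tag[0] == "h" and tag[1] in "123456":
            self.levels.append(int(tag[1]))

def heading_skips(html: str) -> list[str]:
    """Returns a message for each place the hierarchy jumps more than one level."""
    collector = HeadingCollector()
    collector.feed(html)
    return [
        f"h{prev} followed by h{cur} (skips h{prev + 1})"
        for prev, cur in zip(collector.levels, collector.levels[1:])
        if cur - prev > 1
    ]

print(heading_skips("<h1>Title</h1><h4>Details</h4>"))  # ['h1 followed by h4 (skips h2)']
print(heading_skips("<h1>Title</h1><h2>Section</h2>"))  # []
```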
Multimedia
- Missing captions: Videos without synchronized text captions
- No audio descriptions: Visual information in video not described audibly
- Auto-playing media: Audio or video that starts automatically without user control
Automated Testing Tools
Automated tools catch approximately 30-40% of accessibility issues; the rest must be found through manual testing.
| Tool | Type | Best For |
|---|---|---|
| axe DevTools | Browser extension | Comprehensive automated scanning |
| Lighthouse | Chrome built-in | Quick accessibility score and audit |
| WAVE | Browser extension | Visual overlay of accessibility issues |
| Pa11y | CLI tool | CI/CD integration |
| axe-core | Library | Integration into automated test suites |
| Colour Contrast Analyser | Desktop app | Manual contrast checking |
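For the Pa11y row above, the CI runner can enforce a threshold on every build. A sketch of a `.pa11yci` configuration file, assuming the pa11y-ci package (the URLs are placeholders):

```json
{
  "defaults": {
    "standard": "WCAG2AA",
    "timeout": 30000
  },
  "urls": [
    "https://example.com/",
    "https://example.com/checkout"
  ]
}
```

Running `pa11y-ci` against this file fails the build when any listed page reports errors, which makes accessibility a gating check rather than an afterthought.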
Using axe DevTools
- Install the axe DevTools Chrome extension
- Open Chrome DevTools (F12)
- Navigate to the “axe DevTools” tab
- Click “Scan ALL of my page”
- Review issues by severity (Critical, Serious, Moderate, Minor)
Using Lighthouse
- Open Chrome DevTools (F12)
- Go to the “Lighthouse” tab
- Check “Accessibility” category
- Click “Analyze page load”
- Review the score (0-100) and specific issues
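Lighthouse can also emit a JSON report (for example from its CLI), which is convenient for tracking the score across releases. A minimal sketch that extracts the accessibility score, assuming the standard report shape where `categories.accessibility.score` is a 0-1 value:

```python
import json

def accessibility_score(report_json: str) -> int:
    """Returns the Lighthouse accessibility score scaled to 0-100."""
    report = json.loads(report_json)
    return round(report["categories"]["accessibility"]["score"] * 100)

# A trimmed-down stand-in for a real Lighthouse report
sample = '{"categories": {"accessibility": {"score": 0.62}}}'
print(accessibility_score(sample))  # 62
```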
Manual Testing Techniques
Keyboard Testing
The most important manual test. Navigate the entire page using only a keyboard:
- Tab: Move forward through interactive elements
- Shift+Tab: Move backward
- Enter/Space: Activate buttons and links
- Arrow keys: Navigate within components (dropdowns, radio buttons)
- Escape: Close modals and popups
Check: Can you reach every interactive element? Is the focus visible? Can you escape every component? Is the tab order logical?
Screen Reader Testing
Test with at least one screen reader:
- NVDA (Windows, free)
- VoiceOver (macOS/iOS, built-in)
- JAWS (Windows, commercial)
- TalkBack (Android, built-in)
Listen for: Are images described? Are form fields labeled? Are headings announced? Do buttons have meaningful names?
Zoom Testing
Zoom the page to 200% and check:
- Is all content still visible?
- Does the layout adapt without horizontal scrolling?
- Is text readable without overlapping?
Exercise: WCAG 2.1 AA Audit
Perform an accessibility audit of a public website targeting WCAG 2.1 Level AA compliance.
Task
Audit the website using both automated tools and manual techniques. Document at least 10 issues.
Steps
- Run axe DevTools scan and document all Critical and Serious issues
- Run Lighthouse accessibility audit and note the score
- Perform keyboard navigation test on the homepage and one form page
- Test with VoiceOver or NVDA on at least 3 pages
- Check color contrast on all text elements
- Verify all images have appropriate alt text
- Check heading hierarchy (h1 > h2 > h3 order)
- Test at 200% zoom
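Step 6 of the list above can be partly automated. A sketch using Python's standard `html.parser` that lists `<img>` elements with no `alt` attribute at all; an empty `alt=""` is deliberately allowed, since it is the correct markup for decorative images (the class name is illustrative):

```python
from html.parser import HTMLParser

class AltTextChecker(HTMLParser):
    """Flags <img> tags that lack an alt attribute entirely.

    An empty alt="" is intentionally permitted: it marks decorative
    images that screen readers should skip."""
    def __init__(self):
        super().__init__()
        self.missing = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "img" and "alt" not in attrs:
            self.missing.append(attrs.get("src", "(no src)"))

checker = AltTextChecker()
checker.feed('<img src="hero.png"><img src="divider.png" alt="">')
print(checker.missing)  # ['hero.png']
```

Whether the alt text that is present is actually *appropriate* still requires human judgment, so this check complements rather than replaces the manual review.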
Hint: Audit Report Template
| # | Issue | WCAG Criterion | Level | Tool/Method | Severity | Location | Recommendation |
|---|---|---|---|---|---|---|---|
| 1 | Missing alt text on hero image | 1.1.1 Non-text Content | A | axe | Critical | Homepage hero | Add descriptive alt text |
| 2 | … | … | … | … | … | … | … |
Group issues by POUR principle for clarity.
Solution: Example Audit Report
Automated Scan Results (axe DevTools):
| # | Issue | Criterion | Level | Severity | Count |
|---|---|---|---|---|---|
| 1 | Images missing alt text | 1.1.1 | A | Critical | 5 |
| 2 | Form inputs without labels | 1.3.1 | A | Critical | 3 |
| 3 | Color contrast insufficient | 1.4.3 | AA | Serious | 8 |
| 4 | Links with no discernible text | 4.1.2 | A | Serious | 2 |
| 5 | Missing document language | 3.1.1 | A | Serious | 1 |
Manual Testing Results:
| # | Issue | Criterion | Level | Method | Severity |
|---|---|---|---|---|---|
| 6 | Keyboard trap in date picker modal | 2.1.2 | A | Keyboard | Critical |
| 7 | No skip navigation link | 2.4.1 | A | Keyboard | Serious |
| 8 | Focus not visible on nav links | 2.4.7 | AA | Keyboard | Serious |
| 9 | Heading hierarchy skips h2 to h4 | 1.3.1 | A | Screen reader | Moderate |
| 10 | Video without captions | 1.2.2 | A | Manual review | Critical |
| 11 | Error messages not associated with fields | 3.3.1 | A | Screen reader | Serious |
| 12 | Horizontal scroll required at 200% zoom | 1.4.10 | AA | Zoom test | Serious |
Summary: Lighthouse score: 62/100. Found 12 issues: 4 Critical, 7 Serious, 1 Moderate. The site fails WCAG 2.1 Level A on multiple criteria and Level AA on contrast and zoom.
Priority fixes: (1) Add alt text to all images, (2) Fix keyboard trap in date picker, (3) Add video captions, (4) Fix color contrast issues, (5) Add form labels.
Pro Tips
- Automated Tests Catch Only 30-40%: Never rely solely on automated tools. Keyboard testing and screen reader testing are essential for catching the most impactful issues that affect real users.
- Integrate axe-core into CI/CD: Add `@axe-core/playwright` or `@axe-core/webdriverio` to your automated test suite so accessibility regressions are caught before deployment.
- Test with Real Users: If possible, include users with disabilities in your usability testing. They will find issues that no tool or sighted tester can identify.
- ARIA is a Last Resort: ARIA attributes should supplement semantic HTML, not replace it. Using a `<button>` element is always better than a `<div role="button">`. Most accessibility issues come from misusing or overusing ARIA.
- Accessibility is Not a One-Time Audit: Build accessibility checks into your regular testing cycle. Every new feature should be tested for keyboard access, screen reader compatibility, and color contrast before release.
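The ARIA point is easiest to see side by side. A native `<button>` gives keyboard focus, Enter/Space activation, and an accessible role for free; a `<div>` has to recreate all of that by hand (the snippet is illustrative, and `save()` is a placeholder handler):

```html
<!-- Accessible by default: focusable, keyboard-activatable, announced as a button -->
<button type="button" onclick="save()">Save</button>

<!-- The same behavior rebuilt by hand: easy to get wrong and more code to maintain -->
<div role="button" tabindex="0" onclick="save()"
     onkeydown="if (event.key === 'Enter' || event.key === ' ') save()">
  Save
</div>
```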