TL;DR
- Automated tools catch 30-40% of accessibility issues—you need both automated and manual testing with assistive technologies
- Structure your reports around WCAG’s POUR principles (Perceivable, Operable, Understandable, Robust) with clear severity levels
- Include reproduction steps, affected user groups, and remediation code samples for each issue to enable fast fixes
**Best for**: QA teams conducting accessibility audits, compliance officers documenting WCAG conformance
**Skip if**: You just need a quick automated scan; use axe DevTools directly instead of building full reports
**Read time**: 15 minutes
Accessibility testing ensures that digital products are usable by everyone, including people with disabilities. With over 1 billion people worldwide living with some form of disability, accessibility is not just a legal requirement but a moral imperative and business opportunity.
This guide provides comprehensive frameworks for documenting accessibility test results, ensuring WCAG 2.2 compliance, and creating inclusive digital experiences.
When to Create Full Accessibility Reports
Create a full report when:
- Preparing for WCAG/ADA compliance audits
- Launching or significantly updating a public-facing application
- Responding to accessibility complaints or legal concerns
- Establishing baseline accessibility metrics for improvement tracking
- Working with external remediation teams who need detailed guidance
Use lighter documentation when:
- Running automated checks in CI/CD (log violations, fail builds)
- Doing quick manual spot-checks during development
- Working within an already-established accessibility testing process
WCAG 2.2 Compliance Levels
Understanding Conformance Levels
| Level | Description | Requirements | Typical Use Case |
|---|---|---|---|
| A | Minimum level | Essential accessibility features | Basic compliance, minimum legal requirement |
| AA | Mid-range level | Recommended standard | Most websites, industry standard, legal requirement in many countries |
| AAA | Highest level | Enhanced accessibility | Government sites, specialized accessibility-focused applications |
WCAG 2.2 additions (released October 2023): Focus appearance, dragging movements, target size, consistent help, redundant entry, and accessible authentication success criteria.
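As an illustration of one of those new criteria, Target Size (Minimum) (2.5.8) expects interactive targets of roughly 24×24 CSS pixels. Here is a rough browser-console sketch for flagging undersized targets; it ignores the criterion's spacing and inline-link exceptions, so treat anything it flags as a candidate for manual review rather than a confirmed failure.

```javascript
// Heuristic for WCAG 2.5.8: flag clickable elements rendered smaller than 24x24 CSS pixels
const MIN_TARGET = 24;
const targets = document.querySelectorAll('a, button, input, select, textarea, [role="button"]');
const tooSmall = [...targets].filter(el => {
  const { width, height } = el.getBoundingClientRect();
  return width > 0 && height > 0 && (width < MIN_TARGET || height < MIN_TARGET);
});
console.table(tooSmall.map(el => ({
  element: el.tagName.toLowerCase(),
  text: (el.textContent || el.value || '').trim().slice(0, 40)
})));
```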
WCAG Principles (POUR)
Perceivable: Information must be presentable in ways users can perceive
- Text alternatives for non-text content
- Captions and transcripts for multimedia
- Adaptable content structure
- Distinguishable visual and audio content
Operable: User interface must be operable
- Keyboard accessibility
- Sufficient time to read and use content
- No content that causes seizures
- Navigable and findable content
Understandable: Interface must be understandable
- Readable text content
- Predictable functionality
- Input assistance and error prevention
Robust: Content must work with current and future assistive technologies
- Valid, semantic HTML
- Proper ARIA usage
- Compatible with screen readers and other tools
Accessibility Test Report Template
Executive Summary Section
# ACCESSIBILITY TEST REPORT
## Executive Summary
**Application**: E-Commerce Web Platform v3.2
**Test Date**: January 12, 2026
**Tester**: Alex Rodriguez (Certified IAAP WAS)
**Target Compliance**: WCAG 2.2 Level AA
**Overall Result**: 78% Compliant (Needs Improvement)
### Key Findings Summary
| Severity | Count | User Impact |
|----------|-------|----------------|
| Critical | 12 | ~20% of disabled users blocked |
| High | 23 | Significant friction |
| Medium | 45 | Degraded experience |
| Low | 18 | Minor issues |
### Compliance by POUR Principle
| Principle | Level A | Level AA | Level AAA |
|-----------|---------|----------|-----------|
| Perceivable | 85% | 72% | 45% |
| Operable | 90% | 78% | 52% |
| Understandable | 88% | 81% | 60% |
| Robust | 82% | 75% | N/A |
### Immediate Actions Required
1. Add alt text to all product images (12 pages affected)
2. Fix keyboard traps in modal dialogs (blocks 15% of motor-impaired users)
3. Update color contrast ratios that fail the 4.5:1 minimum
### Timeline Recommendation
- P0 Critical: Fix within 7 days
- P1 High: Fix within 30 days
- P2/P3: Include in next sprint cycle
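The compliance percentages in a summary like this are typically computed as passed success criteria divided by applicable criteria at each level. A minimal sketch of that arithmetic (the data shape and field names are illustrative, not part of any tool's output):

```javascript
// One entry per WCAG success criterion evaluated in the audit (illustrative shape)
const criterionResults = [
  { id: '1.1.1', level: 'A', status: 'fail' },
  { id: '2.1.2', level: 'A', status: 'pass' },
  { id: '1.4.3', level: 'AA', status: 'fail' },
  { id: '2.4.7', level: 'AA', status: 'not-applicable' }
];

function compliancePercent(results, level) {
  // Criteria marked not-applicable are excluded from the denominator
  const applicable = results.filter(r => r.level === level && r.status !== 'not-applicable');
  const passed = applicable.filter(r => r.status === 'pass');
  return applicable.length ? Math.round((passed.length / applicable.length) * 100) : 100;
}

console.log(`Level A: ${compliancePercent(criterionResults, 'A')}% compliant`);
console.log(`Level AA: ${compliancePercent(criterionResults, 'AA')}% compliant`);
```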
Test Scope Section
## Test Scope
### Pages Tested
1. Homepage (/)
2. Product Listing (/products)
3. Product Detail (/products/[id])
4. Shopping Cart (/cart)
5. Checkout (/checkout)
6. User Account (/account)
7. Search Results (/search)
8. Help Center (/help)
### Testing Tools Used
**Automated:**
- axe DevTools 4.8
- WAVE (WebAIM)
- Lighthouse (Chrome DevTools)
- IBM Equal Access Checker
**Manual:**
- NVDA 2025.1 (Windows)
- JAWS 2025 (Windows)
- VoiceOver (macOS Sonoma, iOS 18)
- TalkBack (Android 15)
- Dragon NaturallySpeaking 16
### Test Coverage Matrix
| Area | Automated | NVDA | VoiceOver | Keyboard | Cognitive |
|------|-----------|------|-----------|----------|-----------|
| Homepage | ✓ | ✓ | ✓ | ✓ | ✓ |
| Products | ✓ | ✓ | ✓ | ✓ | ✓ |
| Checkout | ✓ | ✓ | ✓ | ✓ | ✓ |
| Account | ✓ | ✓ | Partial | ✓ | - |
Detailed Issue Documentation
Critical Issue Template
### Issue #1: Missing Alt Text for Product Images
**WCAG Criterion**: 1.1.1 Non-text Content (Level A)
**Severity**: Critical
**Affected Users**: Screen reader users, users with images disabled
**Business Impact**: Cannot complete core purchase flow
**Location**: `/products` page, all product cards
**Current State (Non-compliant)**:
```html
<img src="/images/product-123.jpg" />
```

**Expected State (Compliant)**:
```html
<img src="/images/product-123.jpg"
     alt="Blue cotton t-shirt with round neck, size medium, $29.99" />
```

**Reproduction Steps**:
1. Navigate to /products using NVDA + Firefox
2. Press G to move to the next graphic
3. NVDA announces: "graphic" (no description)

**User Impact**:
- Screen reader users: Cannot identify products
- Users with images disabled: No product information
- SEO impact: Reduced search visibility

**Remediation Guidance**:
- Add descriptive alt text including: product name, key visual features, price
- For decorative images, use `alt=""`
- For complex images, use `aria-describedby` for extended descriptions

**Verification Test**:
```javascript
// axe-core check scoped to the image-alt rule; expect no violations
const results = await axe.run(document, {
  runOnly: { type: 'rule', values: ['image-alt'] }
});
expect(results.violations).toHaveLength(0);
```

**Priority**: P0 - Critical
**Estimated Effort**: 2-3 days (300+ images)
**Assigned To**: Frontend Team
High Priority Issue Template
### Issue #2: Keyboard Trap in Modal Dialogs
**WCAG Criterion**: 2.1.2 No Keyboard Trap (Level A)
**Severity**: Critical
**Affected Users**: Keyboard-only users, motor disabilities (~15% of disabled users)
**Location**: Add to Cart modal, Login popup
**Description**:
Focus becomes trapped inside modal dialogs. Pressing Escape or Tab does not close the modal or allow navigation outside.
**Video Evidence**: [Link to screen recording showing keyboard trap]
**Reproduction Steps**:
1. Navigate to product page using Tab key
2. Press Enter on "Add to Cart" button
3. Modal opens, focus moves inside
4. Press Escape key → FAILS (modal stays open)
5. Press Tab repeatedly → FAILS (focus cycles within modal)
**Root Cause Analysis**:
Modal component missing:
- Escape key handler
- Focus trap with escape mechanism
- Return focus to trigger element on close
**Remediation Code**:
```javascript
class AccessibleModal {
constructor(modalElement, triggerElement) {
this.modal = modalElement;
this.trigger = triggerElement;
this.focusableElements = this.getFocusableElements();
this.firstFocusable = this.focusableElements[0];
this.lastFocusable = this.focusableElements[this.focusableElements.length - 1];
}
open() {
this.modal.setAttribute('aria-hidden', 'false');
this.modal.style.display = 'block';
this.firstFocusable.focus();
this.addEventListeners();
}
close() {
this.modal.setAttribute('aria-hidden', 'true');
this.modal.style.display = 'none';
this.trigger.focus(); // Return focus
this.removeEventListeners();
}
handleKeydown = (e) => {
if (e.key === 'Escape') {
this.close();
return;
}
if (e.key === 'Tab') {
if (e.shiftKey && document.activeElement === this.firstFocusable) {
e.preventDefault();
this.lastFocusable.focus();
} else if (!e.shiftKey && document.activeElement === this.lastFocusable) {
e.preventDefault();
this.firstFocusable.focus();
}
}
}
  // Helpers referenced above (minimal implementations)
  getFocusableElements() {
    return Array.from(this.modal.querySelectorAll(
      'a[href], button:not([disabled]), input:not([disabled]), select:not([disabled]), textarea:not([disabled]), [tabindex]:not([tabindex="-1"])'
    ));
  }

  addEventListeners() {
    document.addEventListener('keydown', this.handleKeydown);
  }

  removeEventListeners() {
    document.removeEventListener('keydown', this.handleKeydown);
  }
}
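
// Usage sketch: wire the modal to its trigger element
// (the selectors below are illustrative, not taken from the audited application)
const addToCartButton = document.querySelector('#add-to-cart');
const cartModal = new AccessibleModal(document.querySelector('#cart-modal'), addToCartButton);
addToCartButton.addEventListener('click', () => cartModal.open());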
```

**Priority**: P0 - Critical
**Estimated Effort**: 1 day
Color Contrast Issue Template
### Issue #3: Insufficient Color Contrast
**WCAG Criterion**: 1.4.3 Contrast (Minimum) (Level AA)
**Severity**: High
**Affected Users**: Low vision, colorblind, aging users, outdoor mobile users
**Failing Elements**:
| Element | Foreground | Background | Actual | Required | Gap |
|---------|-----------|------------|--------|----------|-----|
| Secondary button | #999999 | #FFFFFF | 2.8:1 | 4.5:1 | -1.7 |
| Form label | #AAAAAA | #FFFFFF | 2.3:1 | 4.5:1 | -2.2 |
| Footer link | #888888 | #F5F5F5 | 3.2:1 | 4.5:1 | -1.3 |
| Placeholder text | #CCCCCC | #FFFFFF | 1.6:1 | 4.5:1 | -2.9 |
**Remediation - Updated Color Palette**:
```css
:root {
/* Old → New (with contrast ratio) */
--text-secondary: #595959; /* Was #999999 → 7.0:1 ✓ */
--text-label: #4A4A4A; /* Was #AAAAAA → 8.9:1 ✓ */
--text-footer: #545454; /* Was #888888 → 7.5:1 ✓ */
--text-placeholder: #767676; /* Was #CCCCCC → 4.5:1 ✓ */
}
```

**Testing Tool**: Use WebAIM Contrast Checker or browser DevTools
**Priority**: P1 - High
**Estimated Effort**: 3 days (design review + implementation + QA)
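If you prefer to verify ratios in code rather than a browser tool, the WCAG 2.x contrast formula is straightforward to implement. A minimal sketch (hex colors only, no alpha handling):

```javascript
// Relative luminance per the WCAG 2.x definition, from a hex color like '#595959'
function relativeLuminance(hex) {
  return [1, 3, 5]
    .map(i => parseInt(hex.slice(i, i + 2), 16) / 255)
    // Linearize each sRGB channel
    .map(c => (c <= 0.03928 ? c / 12.92 : Math.pow((c + 0.055) / 1.055, 2.4)))
    .reduce((lum, c, idx) => lum + c * [0.2126, 0.7152, 0.0722][idx], 0);
}

// Contrast ratio between foreground and background colors
function contrastRatio(foreground, background) {
  const l1 = relativeLuminance(foreground);
  const l2 = relativeLuminance(background);
  const [lighter, darker] = l1 >= l2 ? [l1, l2] : [l2, l1];
  return (lighter + 0.05) / (darker + 0.05);
}

console.log(contrastRatio('#595959', '#FFFFFF').toFixed(2)); // ≈ 7.0, passes 4.5:1
```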
Testing Methodology
Automated Testing Integration
```javascript
// accessibility-tests.js - CI/CD integration
const { AxePuppeteer } = require('@axe-core/puppeteer');
const puppeteer = require('puppeteer');
const WCAG_TAGS = ['wcag2a', 'wcag2aa', 'wcag21a', 'wcag21aa', 'wcag22aa'];
async function runAccessibilityAudit(urls) {
const browser = await puppeteer.launch();
const results = [];
for (const url of urls) {
const page = await browser.newPage();
await page.goto(url, { waitUntil: 'networkidle0' });
const axeResults = await new AxePuppeteer(page)
.withTags(WCAG_TAGS)
.analyze();
results.push({
url,
violations: axeResults.violations,
passes: axeResults.passes.length,
timestamp: new Date().toISOString()
});
// Fail CI if critical violations found
const critical = axeResults.violations.filter(v => v.impact === 'critical');
if (critical.length > 0) {
console.error(`CRITICAL: ${url} has ${critical.length} critical violations`);
process.exitCode = 1;
}
}
await browser.close();
return results;
}
module.exports = { runAccessibilityAudit };

// GitHub Actions integration: a workflow at .github/workflows/accessibility.yml
// can run this module on each push or pull request.
```
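As a sketch of how the audit might be invoked from a CI step (the URLs, file names, and output path below are illustrative, not from the project):

```javascript
// run-a11y.js - hypothetical CI entry point using the module above
const fs = require('fs');
const { runAccessibilityAudit } = require('./accessibility-tests');

runAccessibilityAudit([
  'http://localhost:3000/',
  'http://localhost:3000/products',
  'http://localhost:3000/checkout'
]).then(results => {
  // Persist results so the CI job can upload them as a build artifact
  fs.writeFileSync('a11y-results.json', JSON.stringify(results, null, 2));
});
```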
Manual Testing Checklist
## Manual Accessibility Test Checklist
### Keyboard Navigation
- [ ] All interactive elements accessible via Tab key
- [ ] Logical tab order follows visual layout
- [ ] Focus indicators visible (3:1 contrast minimum)
- [ ] No keyboard traps
- [ ] Skip links present and functional
- [ ] Modal dialogs trap and return focus correctly
### Screen Reader Testing
- [ ] All images have appropriate alt text
- [ ] Form labels properly associated (`for`/`id` or wrapping)
- [ ] Error messages announced with `role="alert"`
- [ ] Dynamic content announced with ARIA live regions
- [ ] ARIA landmarks present (banner, main, navigation, contentinfo)
- [ ] Headings create logical document outline
### Visual Testing
- [ ] Color contrast meets 4.5:1 for normal text, 3:1 for large text
- [ ] Text remains readable at 200% zoom
- [ ] No information conveyed by color alone
- [ ] Content reflows at 320px width without horizontal scroll
- [ ] Focus visible on all interactive elements
### Forms
- [ ] All fields have visible, associated labels (see the sketch after this checklist)
- [ ] Required fields marked (not just with asterisk color)
- [ ] Error messages specific and helpful
- [ ] Autocomplete attributes present where applicable
### Multimedia
- [ ] Videos have accurate captions
- [ ] Audio content has transcripts
- [ ] Auto-playing audio longer than 3 seconds can be paused or stopped
- [ ] No content flashing >3 times per second
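For the form-label item above, a quick console sketch can surface fields with no programmatically associated label. It checks `label[for]`, wrapping labels, `aria-label`, and `aria-labelledby`; it is a heuristic rather than a full accessible-name computation, so anything it flags still needs a human look:

```javascript
// Flag form controls that appear to have no accessible name (heuristic only)
const fields = document.querySelectorAll('input:not([type="hidden"]), select, textarea');
const unlabeled = [...fields].filter(el => {
  const wrapped = el.closest('label');
  const referenced = el.id && document.querySelector(`label[for="${el.id}"]`);
  const ariaNamed = el.hasAttribute('aria-label') || el.hasAttribute('aria-labelledby');
  return !wrapped && !referenced && !ariaNamed;
});
console.log(`${unlabeled.length} potentially unlabeled fields`, unlabeled);
```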
Measuring Success
| Metric | Before | After | How to Track |
|---|---|---|---|
| Automated violations | 98 | 0 critical | axe-core in CI |
| WCAG AA compliance | 65% | 95%+ | Monthly audits |
| Screen reader task completion | 45% | 90% | User testing |
| Keyboard navigation time | 3x mouse | 1.5x mouse | Task timing |
| Accessibility complaints | 12/month | <1/month | Support tickets |
Warning signs it’s not working:
- Automated scan shows 0 issues but user complaints continue (missing manual testing)
- Compliance percentage stuck (team not prioritizing accessibility fixes)
- New features consistently introduce violations (missing accessibility in design/development process)
AI-Assisted Approaches
AI tools have transformed accessibility testing, but with clear limitations. Current AI detection rates by issue type:
- Missing alt text: 95%+
- Color contrast: 90%+
- Missing form labels: 90%+
- Heading hierarchy: 85%+
- Keyboard accessibility: 70%+
- ARIA correctness: 60-80%
- Content quality assessment: 30-50%
What AI does well:
- Scanning large sites quickly for common violations
- Detecting missing attributes (alt, labels, ARIA)
- Checking color contrast ratios mathematically
- Identifying structural issues (heading levels, landmarks)
- Generating remediation code suggestions
What still needs humans:
- Evaluating whether alt text accurately describes images
- Testing actual keyboard navigation flow
- Assessing cognitive load and understandability
- Verifying screen reader announcement quality
- Testing with real assistive technology combinations
Useful prompt for accessibility analysis:
Analyze this HTML for WCAG 2.2 AA compliance issues:
[paste HTML]
For each issue found:
1. Cite the specific WCAG success criterion
2. Explain the user impact
3. Provide compliant code example
4. Estimate severity (Critical/High/Medium/Low)
Best Practices Checklist
| Practice | Why It Matters |
|---|---|
| Test with real assistive technologies | Automated tools miss 60-70% of real user experience issues |
| Include reproduction steps | Developers can’t fix what they can’t reproduce |
| Provide remediation code | Reduces back-and-forth and speeds fixes |
| Document affected user groups | Creates empathy and prioritization clarity |
| Integrate into CI/CD | Prevents regression, catches issues early |
| Test after each major release | Features change, accessibility can break |
| Include business impact | Connects accessibility to revenue and risk |
Conclusion
Accessibility testing is an ongoing process that requires automated tools, manual testing with assistive technologies, and clear documentation. The goal of an accessibility test report isn’t just to list violations—it’s to enable fast, effective remediation.
The best reports combine quantitative data (compliance percentages, violation counts) with qualitative context (user impact, reproduction steps, code fixes). This combination gives development teams everything they need to make their applications truly accessible.
Related articles:
- Mobile Accessibility Testing: WCAG for iOS and Android - Platform-specific accessibility testing
- Test Case Design Best Practices - Structure effective test scenarios
- Exploratory Testing Guide - Discover issues through guided exploration
- Allure Framework Reporting - Rich visual test documentation
- Continuous Testing in DevOps - Integrate accessibility into CI/CD