What Is User Acceptance Testing?
User Acceptance Testing (UAT) is the final level of testing before software is released to production. It answers a fundamentally different question from every other testing level. While unit, integration, system, and E2E tests ask “Does the software work correctly?”, UAT asks “Does the software do what the business actually needs?”
This distinction is critical. A system can pass every technical test — every function works, every API responds correctly, every workflow completes — and still fail UAT because it does not solve the problem the business intended it to solve.
Consider a company that requests a report generator. The development team builds it perfectly according to the specification: it pulls data from the database, applies filters, and generates PDF reports. All tests pass. But during UAT, the finance manager says: “I need to export to Excel, not PDF. And I need the data grouped by quarter, not by month.” The system works correctly but does not meet the actual business need.
UAT catches these misalignments between what was built and what was needed.
Who Performs UAT?
UAT is performed by business users — the people who will actually use the software in their daily work:
- Product owners who defined the requirements
- Business analysts who translated business needs into specifications
- End users or user representatives who will work with the system daily
- Domain experts who understand the business rules
- Compliance officers for regulated industries
QA engineers do not perform UAT. QA validates that the system works correctly (verification). Business users validate that the system does the right thing (validation). These are complementary but distinct activities.
Types of Acceptance Testing
Alpha Testing
Alpha testing is performed internally by people within the organization, but not the development team. It typically happens in a controlled lab or staging environment.
- Who: Internal employees, stakeholders, business analysts
- Where: Internal test environment (not production)
- When: Before beta testing, after system testing
- Goal: Catch major issues before exposing the software to external users
Example: A banking company develops a new mobile banking app. Before releasing it to customers, internal employees at the bank use it for two weeks, performing real banking operations in a sandboxed environment.
Beta Testing
Beta testing is performed externally by real users in real-world conditions. The software is released to a limited audience to gather feedback before the full public launch.
- Who: External users, early adopters, selected customers
- Where: Real environments (users’ own devices, networks)
- When: After alpha testing, before general availability
- Goal: Discover issues that only appear in diverse real-world conditions
Example: The same banking app is released to 1,000 selected customers who opted into the beta program. They use it on their own phones, with their own network conditions, for their actual banking needs.
Contract Acceptance Testing
Contract acceptance testing verifies that the software meets contractual obligations defined in a formal agreement between the customer and vendor.
- Who: Customer and vendor representatives
- What: Testing against specific contractual criteria (deliverables, SLAs, features)
- When: Before payment/sign-off on a contract
- Goal: Legal verification that all contracted requirements have been met
Example: A government agency contracts a vendor to build a citizen portal. The contract specifies 50 features, 99.9% uptime, and WCAG 2.1 AA accessibility compliance. Contract acceptance testing verifies each item.
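An uptime figure like “99.9%” in such a contract translates into a concrete downtime budget that acceptance testing must be measured against. A quick back-of-the-envelope calculation (plain Python, not tied to any monitoring tool):

```python
# Downtime budget implied by an uptime SLA.
# "Three nines" (99.9%) sounds strict, but it still allows
# a measurable amount of downtime per month and per year.

def downtime_budget_minutes(uptime_pct: float, period_hours: float) -> float:
    """Minutes of allowed downtime for a given uptime percentage."""
    return period_hours * 60 * (1 - uptime_pct / 100)

HOURS_PER_YEAR = 365 * 24            # 8760 hours
HOURS_PER_MONTH = HOURS_PER_YEAR / 12

yearly = downtime_budget_minutes(99.9, HOURS_PER_YEAR)    # ~525.6 min
monthly = downtime_budget_minutes(99.9, HOURS_PER_MONTH)  # ~43.8 min

print(f"99.9% uptime allows ~{yearly:.0f} min/year (~{yearly / 60:.1f} h)")
print(f"               and ~{monthly:.1f} min/month")
```

Making this arithmetic explicit up front avoids disputes later about whether, say, 40 minutes of downtime in a month counts as a contract breach.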
Regulation Acceptance Testing
Regulation acceptance testing verifies that the software complies with industry regulations, legal requirements, and standards.
- Who: Compliance officers, auditors, regulatory bodies
- What: Testing against specific regulations (HIPAA, GDPR, SOX, PCI-DSS)
- When: Before deployment in regulated environments
- Goal: Ensure legal and regulatory compliance
Example: A healthcare application must comply with HIPAA. Regulation acceptance testing verifies patient data encryption, access logging, audit trails, and data retention policies.
UAT Planning
A UAT plan ensures the testing is organized and effective:
1. Define Scope
What will be tested? Which features, workflows, and business processes?
2. Write Acceptance Criteria
Each requirement should have clear, measurable acceptance criteria:
- Specific: “User can generate a quarterly revenue report” (not “reporting works”)
- Measurable: “Report generates in under 10 seconds” (not “report is fast”)
- Testable: Clear pass/fail conditions
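To illustrate, a measurable criterion such as “report generates in under 10 seconds” can be expressed as an explicit pass/fail check. A minimal sketch, where `generate_quarterly_report` is a hypothetical stand-in for the real feature:

```python
import time

# Sketch: turning a measurable acceptance criterion
# ("report generates in under 10 seconds") into a pass/fail check.
# generate_quarterly_report is a hypothetical placeholder.

MAX_SECONDS = 10.0

def generate_quarterly_report(quarter: str) -> bytes:
    # Placeholder: the real system would query data and render the report.
    time.sleep(0.1)
    return b"%PDF-..."  # dummy payload

start = time.perf_counter()
report = generate_quarterly_report("2024-Q1")
elapsed = time.perf_counter() - start

passed = elapsed < MAX_SECONDS and len(report) > 0
print(f"Generated in {elapsed:.2f}s -> {'PASS' if passed else 'FAIL'}")
```

The point is not automation for its own sake: a criterion that can be written as a check like this is, by definition, specific, measurable, and testable.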
3. Prepare Test Scenarios
Business users test using realistic scenarios from their daily work:
- “Process a standard customer order from quote to invoice”
- “Apply a 15% discount to a bulk order over $10,000”
- “Generate month-end financial reconciliation report”
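Whether scenarios live in a spreadsheet or a test-management tool, they tend to share the same shape: a title, the business role who runs it, steps, and criteria. A minimal sketch of such a record (field names are illustrative, not any particular tool’s schema):

```python
from dataclasses import dataclass

# Minimal sketch of a UAT scenario record, as it might be tracked
# in a spreadsheet or test-management tool. Names are illustrative.

@dataclass
class UATScenario:
    title: str
    performed_by: str            # business role, not QA
    steps: list[str]
    acceptance_criteria: list[str]
    status: str = "Not Run"      # Not Run / Pass / Fail

order_flow = UATScenario(
    title="Process a standard customer order from quote to invoice",
    performed_by="Sales operations manager",
    steps=["Create quote", "Convert to order", "Fulfil order", "Issue invoice"],
    acceptance_criteria=[
        "Invoice total matches the quoted price",
        "Order appears in the ERP within 5 minutes",
    ],
)
print(order_flow.title, "-", order_flow.status)
```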
4. Prepare the Environment
- Production-like data (anonymized if needed)
- All integrations connected (payment systems, email, third-party APIs)
- User accounts with appropriate roles and permissions
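Anonymization in particular has to strip identifying fields while preserving the business-relevant ones, or users cannot validate real logic. A simplified sketch of that idea (field names are illustrative):

```python
import hashlib

# Sketch: anonymizing production-like records for a UAT environment.
# Personally identifiable fields are replaced, while business-relevant
# fields (amounts, statuses) are kept so users can validate real logic.

PII_FIELDS = {"name", "email", "phone"}

def anonymize(record: dict) -> dict:
    out = {}
    for key, value in record.items():
        if key in PII_FIELDS:
            # Deterministic pseudonym, so relationships between records survive
            digest = hashlib.sha256(str(value).encode()).hexdigest()[:8]
            out[key] = f"{key}_{digest}"
        else:
            out[key] = value
    return out

prod_row = {"name": "Jane Doe", "email": "jane@example.com",
            "order_total": 10499.00, "status": "shipped"}
uat_row = anonymize(prod_row)
print(uat_row)
```

Using a deterministic hash (rather than random values) keeps the same customer recognizable across orders, which matters for scenarios that span multiple records.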
5. Train UAT Participants
Business users are not professional testers. They need:
- Instructions on how to report issues (templates, tools)
- Understanding of what is in scope vs out of scope
- Access to support (QA or dev team) for questions
6. Define Sign-Off Criteria
When is UAT complete? Common criteria:
- All critical scenarios pass
- No showstopper defects remain
- Business stakeholders formally approve
- Documented evidence of testing
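These criteria are easiest to enforce when tracked as explicit yes/no checks rather than prose. A minimal sketch (the values are illustrative):

```python
# Sketch: sign-off criteria as explicit pass/fail checks, so
# "is UAT done?" is a lookup rather than a debate. Illustrative values.

criteria = {
    "all critical scenarios pass": True,
    "no showstopper defects open": True,
    "stakeholder approval recorded": False,   # still waiting on finance
    "testing evidence archived": True,
}

ready_for_signoff = all(criteria.values())
blocking = [name for name, met in criteria.items() if not met]

print("Ready for sign-off:", ready_for_signoff)
print("Blocking items:", blocking)
```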
The Sign-Off Process
UAT sign-off is the formal approval that the software is ready for production. Sign-off typically requires:
- A signed document from business stakeholders
- A list of known issues with agreed severity and workarounds
- Confirmation that all critical business flows work
- Agreement on timeline for fixing remaining non-critical issues
QA’s Role in UAT
While QA does not perform UAT, QA plays a crucial supporting role:
Before UAT:
- Ensure the system passes system testing (no point starting UAT with known critical bugs)
- Help write UAT test scenarios
- Set up the UAT environment
- Create test data
- Prepare issue tracking and reporting templates
During UAT:
- Provide technical support to business users
- Help reproduce and document issues
- Triage reported defects (is it a real bug or user error?)
- Track UAT progress and coverage
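Progress tracking can be as simple as counting scenario outcomes. A minimal sketch with illustrative data:

```python
from collections import Counter

# Sketch: tracking UAT progress by counting scenario results,
# so QA can report coverage at a glance. Data is illustrative.

results = [
    ("Order: quote to invoice", "Pass"),
    ("Bulk discount over $10,000", "Fail"),
    ("Month-end reconciliation", "Pass"),
    ("Refund processing", "Not Run"),
]

counts = Counter(status for _, status in results)
executed = counts["Pass"] + counts["Fail"]
coverage = executed / len(results) * 100

print(f"Executed {executed}/{len(results)} scenarios ({coverage:.0f}% coverage)")
print(f"Pass: {counts['Pass']}, Fail: {counts['Fail']}, Not run: {counts['Not Run']}")
```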
After UAT:
- Coordinate defect fixes with development
- Retest fixed issues
- Compile UAT results report
- Facilitate the sign-off process
Exercise: Create a UAT Plan with Acceptance Criteria
You are QA Lead for a new HR self-service portal. The portal allows employees to:
- View and update personal information (address, phone, emergency contact)
- Submit time-off requests (vacation, sick leave, personal days)
- View pay stubs and tax documents
- Enroll in or change benefits during open enrollment
Create a UAT plan that includes:
- 3 UAT test scenarios for time-off requests (feature 2)
- Acceptance criteria for each scenario
- Who should perform each scenario
- Sign-off criteria for the complete UAT
Hint
Think about the business workflow from the employee’s perspective. What are the happy paths? What rules must be enforced (available balance, manager approval, blackout dates)? Who needs to approve — the employee, their manager, or HR?

Solution
UAT Test Scenarios for Time-Off Requests:
Scenario 1: Standard Vacation Request
- Actor: Regular employee (Sarah, Marketing department)
- Steps:
  - Log into HR portal with company credentials
  - Navigate to “Time Off” section
  - Click “New Request”
  - Select “Vacation” type
  - Choose dates: June 15-19 (5 working days)
  - Verify available balance shows correctly (should show 15 days remaining)
  - Add note: “Family trip”
  - Submit request
  - Verify confirmation message and email notification
  - Verify manager (John, Marketing Director) received approval email
- Acceptance Criteria:
  - Balance calculation is accurate (deducts 5 from 15, shows 10 remaining)
  - Manager notification is received within 5 minutes
  - Request appears in employee’s “My Requests” with status “Pending Approval”
  - Calendar view shows requested dates as “Pending”
Scenario 2: Sick Leave with Insufficient Balance
- Actor: Employee with 2 sick days remaining (Alex, Engineering)
- Steps:
  - Navigate to Time Off
  - Select “Sick Leave” type
  - Request 3 days
  - System should display warning about insufficient balance
  - Attempt to submit anyway
- Acceptance Criteria:
  - Warning message clearly shows: “Requested: 3 days, Available: 2 days”
  - System allows submission but flags it as “Exceeds Balance”
  - Manager notification includes balance warning
  - HR is automatically notified of balance-exceeding request
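The business rule behind this scenario (allow the request through, but flag and escalate it) can be sketched as follows; the function and field names here are hypothetical, not the portal’s actual API:

```python
# Sketch of the balance rule from Scenario 2: a request exceeding the
# available balance is allowed through, but flagged and surfaced to the
# manager and HR rather than silently accepted or hard-blocked.
# Function and field names are hypothetical.

def submit_leave_request(requested_days: int, available_days: int) -> dict:
    request = {"days": requested_days, "flags": [], "notify": ["manager"]}
    if requested_days > available_days:
        request["flags"].append("Exceeds Balance")
        request["notify"].append("hr")
        request["warning"] = (
            f"Requested: {requested_days} days, Available: {available_days} days"
        )
    return request

req = submit_leave_request(requested_days=3, available_days=2)
print(req["warning"])
print(req["flags"], req["notify"])
```

Warn-but-allow is a deliberate design choice here: a hard block would prevent legitimate emergencies (real sick leave), while a silent accept would hide the policy violation from HR.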
Scenario 3: Manager Approval Workflow
- Actor: Manager (John, Marketing Director)
- Precondition: Sarah’s vacation request from Scenario 1 has been submitted
- Steps:
  - Log in as manager
  - See pending approval notification on dashboard
  - Click to review Sarah’s request
  - View team calendar (check for conflicts with other team members)
  - Approve request
  - Verify Sarah receives approval notification
  - Verify HR calendar is updated
- Acceptance Criteria:
  - Manager sees complete request details (dates, type, employee balance)
  - Team calendar shows conflicts if any team members are already off
  - Approval/Rejection requires selecting a reason
  - Employee is notified within 5 minutes of decision
  - HR system of record is updated immediately after approval
UAT Participants:
- 5-10 employees from different departments (representing different roles)
- 3-5 managers (to test approval workflows)
- 1 HR admin (to test admin functions and verify data accuracy)
- 1 payroll specialist (to verify integration with payroll system)
Sign-Off Criteria:
- All 3 critical scenarios pass for all participant groups
- No showstopper defects (data loss, incorrect calculations, security issues)
- Minor UI issues documented with agreed fix timeline
- HR Director formal sign-off required
- Payroll accuracy verified for at least 10 test cases
- All participant groups confirm the system is usable for their needs
Common UAT Pitfalls
Pitfall 1: Using UAT to find bugs. UAT is not a bug-hunting exercise. If business users spend their UAT time reporting crashes and broken features, the system was not ready for UAT. System testing should catch those issues first.
Pitfall 2: No clear acceptance criteria. Without measurable criteria, UAT becomes a subjective exercise where stakeholders say “it does not feel right” without being able to explain specifically what is wrong.
Pitfall 3: Wrong participants. Having developers or QA perform UAT defeats the purpose. UAT must be performed by people who understand the business domain, not the technical implementation.
Pitfall 4: Indefinite UAT cycles. UAT should have a defined duration and clear exit criteria. Without them, UAT can drag on indefinitely as stakeholders continuously find “one more thing” to change.
Pro Tips
Tip 1: Schedule UAT early in the project timeline. UAT often uncovers requirement misunderstandings that require significant rework. Budget time for at least two UAT cycles (initial + retest after fixes).
Tip 2: Provide UAT participants with realistic data. Business users cannot validate business logic with fake data. Use anonymized production data or carefully crafted test data that represents real business scenarios.
Tip 3: Document everything. UAT sign-off is a formal business event. Keep records of who tested what, what passed, what failed, what was accepted with known limitations, and who approved the release.
Key Takeaways
- UAT validates that the system meets business needs, not just technical requirements
- Business users perform UAT — not QA engineers or developers
- Four types: alpha (internal), beta (external), contract (legal), regulation (compliance)
- Clear acceptance criteria with measurable conditions are essential
- QA supports UAT through planning, environment setup, and defect coordination
- The sign-off process formalizes business approval for production release