A Test Design Specification (TDS) is the missing layer between test strategy and test execution that most teams overlook, and then pay for in defect escapes. According to the ISTQB Advanced Level Agile Technical Tester survey 2024, teams with documented test design specifications find 34% more defects in high-complexity features than teams working directly from test plans to test cases. Research from the Software Testing Institute shows that systematic technique selection (the core of a TDS) increases defect detection rates by 28-45% for boundary-heavy business logic compared to ad-hoc test design. The document isn't bureaucracy: it is the specification that makes test coverage systematic rather than random, and coverage criteria measurable rather than subjective.
TL;DR: A Test Design Specification bridges test strategy (Test Plan) and execution (Test Cases) by specifying HOW a feature will be tested: which techniques (equivalence partitioning, BVA, decision tables), what coverage criteria (% paths covered), test data requirements, and pass/fail thresholds. Write TDS for complex, high-risk, or compliance-critical features. IEEE 829 defines the standard format.
A Test Design Specification (TDS) defines how testing will be performed for specific features or test objectives. While a Test Plan provides high-level strategy, the TDS dives deep into test techniques, coverage criteria, data requirements, and environmental needs for individual test items. It serves as a blueprint that guides test case creation and execution.
The TDS connects multiple documentation artifacts: it references requirements traceability (see Test Coverage Reports), specifies test data needs, and defines criteria that feed into Test Summary Reports.
Purpose and Scope
Why Test Design Specifications Matter
- Detailed Planning: Translates strategy into actionable test approaches
- Coverage Assurance: Ensures comprehensive testing across all dimensions
- Reusability: Provides templates for similar features or projects
- Knowledge Transfer: Captures testing expertise for team sharing
- Traceability: Links requirements to test techniques to test cases
When to Create a TDS
Create TDS for:
- Complex features with multiple test dimensions
- High-risk system components
- Features requiring specialized test techniques
- Regulatory compliance testing
Skip TDS for:
- Simple CRUD operations with standard test patterns
- Well-understood, low-risk features
- Quick bug verification tests
IEEE 829 Structure for TDS
1. Test Design Specification Identifier
Document ID: TDS-ECOMM-PAYMENT-v1.2
Feature: Payment Processing Module
Version: 3.5.0
Author: Senior QA Engineer - Alex Martinez
Date: 2024-10-06
Review Status: Approved
Related Requirements: REQ-PAY-001 through REQ-PAY-045
2. Features to be Tested
Define test scope precisely:
## Features Under Test
### In Scope
✅ Credit card payment processing (Visa, MasterCard, Amex, Discover)
✅ Alternative payment methods (PayPal, Apple Pay, Google Pay)
✅ Payment validation and error handling
✅ Transaction authorization and capture flow
✅ Refund and void operations
✅ PCI DSS compliance validation
✅ Payment retry logic for failures
✅ Multi-currency support (USD, EUR, GBP, JPY)
### Out of Scope
❌ Payment gateway provider selection logic (covered in TDS-ECOMM-CONFIG)
❌ Accounting system integration (covered in TDS-ECOMM-ACCOUNTING)
❌ Invoice generation (covered in TDS-ECOMM-BILLING)
3. Approach Refinements
Detail the testing approach for each feature:
## Test Approach by Feature
### 3.1 Credit Card Processing
**Test Technique**: Equivalence Partitioning + Boundary Value Analysis
**Equivalence Classes**:
- Valid card numbers (per Luhn algorithm)
- Invalid card numbers (checksum fail)
- Expired cards (past, current month, future)
- Card types (Visa, MC, Amex, Discover, unsupported)
- CVV validation (3-digit, 4-digit, missing, invalid)
**Boundary Values**:
- Card expiry: current month, next month, five years in the future
- Transaction amounts: $0.01, $0.99, $1.00, $999.99, $1,000.00, $9,999.99
- CVV: 000, 001, 999 (3-digit cards); 0000, 9999 (4-digit, Amex)
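As a sketch of how the valid/invalid card-number partitions map to executable checks, here is a minimal Luhn validator with one representative value per equivalence class. The `luhn_valid` helper is illustrative, not part of the payment module:

```python
def luhn_valid(card_number: str) -> bool:
    """Luhn checksum: double every second digit from the right;
    the digit sum must be divisible by 10."""
    digits = [int(d) for d in card_number if d.isdigit()]
    if len(digits) != len(card_number) or not 13 <= len(digits) <= 19:
        return False  # non-digit characters or implausible length
    total = 0
    for i, d in enumerate(reversed(digits)):
        if i % 2 == 1:          # every second digit from the right
            d *= 2
            if d > 9:
                d -= 9
        total += d
    return total % 10 == 0

# One representative per equivalence class is enough for partition coverage.
assert luhn_valid("4111111111111111")       # valid class: checksum passes
assert not luhn_valid("4111111111111112")   # invalid class: checksum fails
assert not luhn_valid("1234")               # invalid class: too short
```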
**Test Data Requirements**:
- Test credit card numbers from payment gateway sandbox
- Edge case amounts (minimum, maximum, just below/above thresholds)
- Various card brands and issuing countries
**Environmental Needs**:
- Payment gateway sandbox environment
- SSL/TLS certificates for secure transmission
- Network throttling tools for timeout testing
### 3.2 Payment Authorization Flow
**Test Technique**: State Transition Testing
**States**:
1. INITIATED → Payment request created
2. VALIDATING → Input validation in progress
3. AUTHORIZING → Sent to payment gateway
4. AUTHORIZED → Gateway approved
5. CAPTURING → Funds capture requested
6. CAPTURED → Payment complete
7. FAILED → Any stage failure
8. CANCELLED → User-initiated cancellation
**Valid Transitions**:
- INITIATED → VALIDATING → AUTHORIZING → AUTHORIZED → CAPTURING → CAPTURED
- Any state → FAILED (on error)
- INITIATED/VALIDATING → CANCELLED
**Invalid Transitions** (to verify system prevents):
- INITIATED → CAPTURED (skipping intermediate steps)
- FAILED → AUTHORIZED (no retry from failed state without re-initiation)
**Test Cases Cover**:
- Happy path: INITIATED → CAPTURED
- Validation failure: INITIATED → VALIDATING → FAILED
- Gateway decline: AUTHORIZING → FAILED
- Network timeout: AUTHORIZING → (timeout) → FAILED
- User cancellation: VALIDATING → CANCELLED
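The transition rules above can be captured as a data table and enforced by a tiny state machine, which is the essence of state transition testing. This sketch assumes CAPTURED, FAILED, and CANCELLED are terminal states (so "any state → FAILED" is read as any non-terminal state, consistent with the forbidden FAILED → AUTHORIZED transition); the `Payment` class is illustrative:

```python
# Allowed next states per current state; terminal states have none.
VALID_TRANSITIONS = {
    "INITIATED":   {"VALIDATING", "CANCELLED", "FAILED"},
    "VALIDATING":  {"AUTHORIZING", "CANCELLED", "FAILED"},
    "AUTHORIZING": {"AUTHORIZED", "FAILED"},
    "AUTHORIZED":  {"CAPTURING", "FAILED"},
    "CAPTURING":   {"CAPTURED", "FAILED"},
    "CAPTURED":    set(),
    "FAILED":      set(),
    "CANCELLED":   set(),
}

class Payment:
    def __init__(self):
        self.state = "INITIATED"

    def transition(self, new_state: str):
        if new_state not in VALID_TRANSITIONS[self.state]:
            raise ValueError(f"illegal transition {self.state} -> {new_state}")
        self.state = new_state

# Happy path: INITIATED -> ... -> CAPTURED
p = Payment()
for s in ["VALIDATING", "AUTHORIZING", "AUTHORIZED", "CAPTURING", "CAPTURED"]:
    p.transition(s)
assert p.state == "CAPTURED"

# Invalid transition (skipping intermediate steps) must be rejected.
p2 = Payment()
try:
    p2.transition("CAPTURED")
    raise AssertionError("should have raised")
except ValueError:
    pass
```

Deriving test cases from the table guarantees every valid edge is exercised once and every forbidden edge is proven blocked.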
### 3.3 Error Handling and Recovery
**Test Technique**: Error Guessing + Negative Testing
**Error Scenarios**:
- Network failures (timeout, connection refused, DNS failure)
- Gateway errors (500, 502, 503, 504 HTTP codes)
- Invalid responses (malformed JSON, missing required fields)
- Rate limiting (429 Too Many Requests)
- Insufficient funds decline
- Fraud detection triggers
**Expected System Behavior**:
- Clear error messages to user (non-technical language)
- No sensitive data in error logs
- Automatic retry with exponential backoff (for transient failures)
- Graceful degradation (allow order placement without immediate payment if configured)
- Idempotency protection (same payment not charged twice)
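The retry and idempotency behaviors above can be sketched together: retries reuse one idempotency key, so a retried request can never charge twice. The `charge_with_retry` helper, the fake gateway, and the delay values are illustrative assumptions, not the real configuration:

```python
import time
import uuid

class TransientError(Exception):
    """Retryable failure: timeout, 502/503/504, rate limiting."""

def charge_with_retry(gateway_call, max_attempts=4, base_delay=0.5):
    """Retry transient failures with exponential backoff, reusing one
    idempotency key across all attempts."""
    key = str(uuid.uuid4())
    for attempt in range(max_attempts):
        try:
            return gateway_call(key)
        except TransientError:
            if attempt == max_attempts - 1:
                raise                              # exhausted: surface the error
            time.sleep(base_delay * 2 ** attempt)  # 0.5s, 1s, 2s, ...

# Fake gateway: fails twice with a 503, then approves.
calls = []
def flaky_gateway(key):
    calls.append(key)
    if len(calls) < 3:
        raise TransientError("503 Service Unavailable")
    return {"status": "approved", "idempotency_key": key}

result = charge_with_retry(flaky_gateway, base_delay=0.01)
assert result["status"] == "approved"
assert len(set(calls)) == 1   # same key on every attempt: safe to retry
```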
### 3.4 Security Testing
**Test Technique**: OWASP Testing Guide + PCI DSS Requirements
**Security Tests**:
- **Data Encryption**: Verify card data encrypted in transit (TLS 1.2+) and at rest
- **PCI DSS Compliance**: Card data never logged, tokenization used for storage
- **SQL Injection**: Test payment forms with SQL injection payloads
- **XSS Prevention**: Test with XSS payloads in cardholder name, address fields
- **CSRF Protection**: Verify anti-CSRF tokens on payment submission
- **Session Management**: Test session timeout, secure cookie flags
- **Access Control**: Verify only authorized users can initiate payments
**Tools**:
- OWASP ZAP for automated vulnerability scanning
- Burp Suite for manual penetration testing
- SSL Labs for TLS configuration validation
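The "card data never logged" check above can be spot-checked automatically by scanning log output for unmasked PANs. The regex and `mask_pan` helper below are an illustrative sketch, not the application's actual masking code:

```python
import re

# 13-19 digits, optionally separated by spaces or hyphens, is a candidate PAN.
PAN_PATTERN = re.compile(r"\b\d(?:[ -]?\d){12,18}\b")

def mask_pan(text: str) -> str:
    """Replace all but the last four digits of any card-number-like run."""
    def repl(match):
        digits = re.sub(r"\D", "", match.group())
        return "*" * (len(digits) - 4) + digits[-4:]
    return PAN_PATTERN.sub(repl, text)

log_line = "charge ok card=4111111111111111 amount=49.99"
masked = mask_pan(log_line)
assert "4111111111111111" not in masked   # full PAN never reaches the log
assert "1111" in masked                    # last four retained for support
```

Running this scan over captured staging logs after each test run turns a manual compliance review into a repeatable assertion.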
### 3.5 Performance Testing
**Test Technique**: Load Testing + Stress Testing
**Performance Criteria**:
| Metric | Target | Measurement Method |
|--------|--------|-------------------|
| Response Time (95th percentile) | <3 seconds | JMeter, New Relic APM |
| Throughput | 500 transactions/minute | Load testing results |
| Error Rate under load | <0.1% | JMeter aggregate report |
| Database query time | <500ms | Application logs, DB profiling |
| Third-party API latency | <2 seconds | Charles Proxy, API monitoring |
**Load Scenarios**:
- Normal load: 100 concurrent users
- Peak load: 500 concurrent users (Black Friday simulation)
- Stress test: Gradual ramp-up to breaking point
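The 95th-percentile criterion from the table can be asserted directly against collected response times. The nearest-rank percentile helper below is a sketch, and the latency samples are illustrative data, not real measurements:

```python
import math

def percentile(samples, pct):
    """Nearest-rank percentile: the value at position ceil(pct/100 * n)
    in sorted order."""
    ordered = sorted(samples)
    rank = max(1, math.ceil(len(ordered) * pct / 100))
    return ordered[rank - 1]

# Response times in seconds from a load-test run (illustrative data).
latencies = [0.8, 1.1, 1.3, 0.9, 2.4, 1.0, 1.2, 2.9, 1.5, 1.1]
p95 = percentile(latencies, 95)
assert p95 < 3.0, f"95th percentile {p95}s exceeds the 3s target"
```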
### 3.6 Multi-Currency Support
**Test Technique**: Decision Table Testing
**Decision Factors**:
- User location (US, EU, UK, Japan)
- Selected currency (USD, EUR, GBP, JPY)
- Card issuing country
- Merchant account currency
**Sample Decision Table**:
| # | User Location | Currency | Card Country | Conversion Applied | Currency Passed to Gateway |
|---|--------------|----------|--------------|-------------------|---------------------------|
| 1 | US | USD | US | No | USD |
| 2 | US | EUR | US | Yes (USD→EUR) | EUR |
| 3 | EU | EUR | EU | No | EUR |
| 4 | EU | USD | EU | Yes (EUR→USD) | USD |
| 5 | UK | GBP | US | No | GBP |
| 6 | Japan | JPY | Japan | No | JPY |
**Test each combination** for correct currency conversion, exchange rate application, and proper display.
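The decision table translates naturally into data-driven test cases: each row becomes one parametrised check. The `needs_conversion` rule below is inferred from the table (conversion applies when the selected currency differs from the user location's home currency) and is an assumption, as are the field names:

```python
# Each row of the decision table becomes one parametrised test case.
DECISION_TABLE = [
    # (user_loc, currency, card_country, conversion_expected, gateway_currency)
    ("US", "USD", "US", False, "USD"),
    ("US", "EUR", "US", True,  "EUR"),
    ("EU", "EUR", "EU", False, "EUR"),
    ("EU", "USD", "EU", True,  "USD"),
    ("UK", "GBP", "US", False, "GBP"),
    ("JP", "JPY", "JP", False, "JPY"),
]

HOME_CURRENCY = {"US": "USD", "EU": "EUR", "UK": "GBP", "JP": "JPY"}

def needs_conversion(user_loc: str, selected_currency: str) -> bool:
    """Conversion applies when the selected currency differs from the
    home currency of the user's location (rule implied by the table)."""
    return HOME_CURRENCY[user_loc] != selected_currency

for user_loc, currency, card_country, expect_conv, gateway_ccy in DECISION_TABLE:
    assert needs_conversion(user_loc, currency) == expect_conv
    # The gateway is always charged in the user's selected currency.
    assert gateway_ccy == currency
```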
4. Test Identification
Map test techniques to test case IDs:
## Test Case Mapping
| Test Technique | Test Cases | Priority |
|---------------|-----------|----------|
| Equivalence Partitioning (Card Validation) | TC-PAY-001 to TC-PAY-025 | High |
| Boundary Value Analysis (Amounts) | TC-PAY-026 to TC-PAY-040 | High |
| State Transition Testing | TC-PAY-041 to TC-PAY-065 | Critical |
| Error Handling | TC-PAY-066 to TC-PAY-090 | High |
| Security Testing | TC-PAY-091 to TC-PAY-120 | Critical |
| Performance Testing | TC-PAY-121 to TC-PAY-135 | Medium |
| Multi-Currency | TC-PAY-136 to TC-PAY-160 | Medium |
**Total Test Cases**: 160
**Estimated Execution Time**: 24 hours (manual), 4 hours (automated)
**Automation Target**: 70% (112 test cases)
5. Feature Pass/Fail Criteria
Define success metrics:
## Pass/Fail Criteria
### Functional Acceptance Criteria
✅ **PASS** if:
- All critical and high-priority test cases pass
- Zero critical or high-severity defects open
- 95% of all test cases pass
- All security tests pass (no vulnerabilities)
- All PCI DSS compliance checks pass
❌ **FAIL** if:
- Any critical test case fails
- Any critical or high-severity security vulnerability found
- PCI DSS compliance validation fails
- Pass rate <90%
### Performance Acceptance Criteria
✅ **PASS** if:
- 95th percentile response time <3 seconds
- Error rate <0.1% under normal load
- System handles 500 concurrent transactions without degradation
❌ **FAIL** if:
- Response time >5 seconds
- Error rate >1%
- System crashes or becomes unresponsive under load
### Security Acceptance Criteria
✅ **PASS** if:
- Zero critical or high-severity vulnerabilities
- All OWASP Top 10 protections verified
- PCI DSS SAQ-D attestation successful
❌ **FAIL** if:
- Any critical security vulnerability exists
- Card data exposed in logs or unencrypted
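The functional criteria above can be encoded as an automated gate that a CI pipeline evaluates after each run. The result keys and thresholds below mirror the criteria; the dictionary structure itself is an illustrative assumption:

```python
def feature_gate(results: dict):
    """Apply the functional acceptance criteria to a test-run summary.
    Returns ("PASS", []) or ("FAIL", [reasons])."""
    failures = []
    if results["critical_failed"] > 0:
        failures.append("critical test case failed")
    if results["open_critical_defects"] > 0 or results["open_high_defects"] > 0:
        failures.append("critical/high-severity defects open")
    pass_rate = results["passed"] / results["total"]
    if pass_rate < 0.95:
        failures.append(f"pass rate {pass_rate:.0%} below 95%")
    if not results["security_passed"] or not results["pci_dss_passed"]:
        failures.append("security or PCI DSS checks failed")
    return ("PASS", []) if not failures else ("FAIL", failures)

verdict, reasons = feature_gate({
    "critical_failed": 0, "open_critical_defects": 0, "open_high_defects": 0,
    "passed": 156, "total": 160,          # 97.5% pass rate
    "security_passed": True, "pci_dss_passed": True,
})
assert verdict == "PASS"
```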
Test Data Specification
Define data requirements in detail:
## Test Data Requirements
### Credit Card Test Data
**Source**: Payment Gateway Sandbox Test Cards
| Card Type | Test Number | CVV | Expiry | Expected Result |
|-----------|------------|-----|--------|----------------|
| Visa | 4111111111111111 | 123 | 12/25 | Approved |
| Visa | 4000000000000002 | 123 | 12/25 | Declined |
| MasterCard | 5555555555554444 | 123 | 12/25 | Approved |
| Amex | 378282246310005 | 1234 | 12/25 | Approved |
| Discover | 6011111111111117 | 123 | 12/25 | Approved |
### User Test Data
**Profile Variations**:
- New users (first purchase)
- Returning users (saved payment methods)
- Guest checkout users
- Users in different countries (US, UK, Germany, Japan)
### Transaction Amount Test Data
**Boundary Values**:
- Minimum: $0.01
- Standard: $49.99, $99.99, $249.99
- High value: $999.99, $5,000.00
- Maximum: $10,000.00
### Error Simulation Data
**Network Failures**:
- Timeout simulation: 30s+ delay
- Connection refused: Invalid gateway URL
- DNS failure: Non-existent domain
**Gateway Errors**:
- 500 Internal Server Error response
- Malformed JSON response
- Missing required fields in response
Environment and Tool Requirements
## Test Environment Specification
### Environment Configuration
**Payment Gateway**: Stripe Sandbox (test mode)
- API Version: 2024-06-20
- Webhook endpoint configured for async event handling
**Application Under Test**:
- URL: https://staging.ecommerce-app.com
- Version: v3.5.0-rc2
- Database: PostgreSQL 15.3 (test instance)
**Tools Required**:
- **Test Management**: TestRail
- **Automation Framework**: Cypress 13.x
- **API Testing**: Postman, REST Assured
- **Load Testing**: JMeter 5.6
- **Security Testing**: OWASP ZAP 2.14, Burp Suite Pro
- **Monitoring**: New Relic, Datadog
- **Network Simulation**: Charles Proxy, Network Link Conditioner
### Access Requirements
- Stripe sandbox API keys (test mode)
- Admin credentials for test environment
- VPN access for security testing
Risk-Based Testing Strategy
## Risk Analysis and Mitigation
| Risk | Probability | Impact | Test Focus | Mitigation |
|------|-----------|--------|-----------|-----------|
| Payment data breach | Low | Critical | Security testing (30% of effort) | Penetration testing, code review, encryption validation |
| Transaction failures | Medium | High | Error handling, retry logic (25%) | State transition testing, network failure simulation |
| Poor performance | Medium | Medium | Load testing (15%) | Performance benchmarking, scalability tests |
| Currency conversion errors | Low | Medium | Multi-currency testing (10%) | Decision table coverage |
| Integration failures | Medium | High | API integration testing (20%) | Contract testing, mock services |
**Risk Priority**: Security > Functional Correctness > Performance > Usability
Best Practices
1. Keep the TDS a Living Document
Update TDS as requirements change:
- Version control in Git
- Review quarterly or after major feature changes
- Link to latest test case repository
2. Involve Stakeholders in Review
TDS review should include:
- QA team (thoroughness of approach)
- Developers (technical feasibility)
- Product owners (business coverage)
- Security team (compliance validation)
3. Balance Detail and Maintainability
- Too little detail: “Test payment processing” ❌
- Too much detail: individual test steps for 160 test cases ❌
- Just right: test techniques, data classes, acceptance criteria ✅
4. Link TDS to Other Artifacts
Create traceability:
- Requirements → TDS → Test Cases → Defects
- Use reference IDs consistently across documents
“The test design specification is the document that separates strategic testing from random testing. Without it, test case creation is essentially guesswork — you get whatever scenarios the tester thinks of that day. With a TDS, test cases are derived systematically from techniques and coverage criteria. You can measure completeness, defend coverage decisions, and ensure the next engineer picks up exactly where you left off.” — Yuri Kan, Senior QA Lead
Conclusion
A well-crafted Test Design Specification bridges the gap between high-level test strategy and detailed test case execution. By defining test techniques, coverage criteria, data requirements, and pass/fail thresholds, TDS ensures systematic, comprehensive, and repeatable testing.
The key is appropriate detail: enough to guide testers and ensure coverage, but not so much that the document becomes a maintenance burden. Focus on the how and what of testing, leaving the specific steps to individual test cases.
FAQ
What is a Test Design Specification (TDS)?
A TDS defines HOW testing will be performed for specific features — bridging the test plan (strategy) and test cases (execution). It specifies: which test techniques to apply (equivalence partitioning, BVA, decision tables), coverage criteria, test data requirements, entry/exit criteria, and pass/fail thresholds. According to IEEE 829, TDS is a required deliverable for structured testing engagements. Teams with documented TDS find 34% more defects in high-complexity features.
How is TDS different from a Test Plan?
The Test Plan defines WHAT to test and WHY (scope, objectives, strategy, resources, schedule). The TDS defines HOW to test specific features (techniques, coverage criteria, data). A project has one Test Plan; it may have multiple TDS documents — one per major feature. The TDS is more technical than the Test Plan but less granular than individual test cases. See ISTQB Foundation Level for the formal distinction.
What should a TDS include?
A complete TDS includes: feature under test, test objectives, applicable techniques (with justification), coverage criteria and measurements, test data specifications, environmental requirements, entry/exit criteria, pass/fail criteria, risk assessment, and effort estimate. The coverage criteria section is most critical — it defines when testing is “done” for the feature.
When should you write a TDS?
Write a TDS for: complex features with multiple interaction paths, regulatory/compliance requirements (FDA, SOX, ISO 26262), and high-risk areas where systematic coverage is critical. Skip for: simple CRUD features where a standard checklist suffices, and experimental features likely to be removed. TDS effort should be proportional to feature risk and complexity.
Official Resources
- IEEE 829 Test Documentation Standard — formal TDS structure and requirements
- ISTQB Foundation Level — test design techniques classification
See Also
- Test Plan and Strategy Guide — High-level strategy that the TDS implements
- Test Contract Documentation — Testing contracts and SLAs: scope definition, deliverables, timelines, …
- Test Case Design Best Practices — Creating effective test cases from TDS
- Test Coverage Report — Measuring coverage against TDS criteria
- Test Data Documentation — Managing test data specified in TDS
- Testing Metrics and KPIs Guide — Metrics for evaluating TDS effectiveness
