A Test Design Specification (TDS) defines how testing will be performed for specific features or test objectives. While a test plan provides high-level strategy, the TDS dives deep into test techniques, coverage criteria, data requirements, and environmental needs for individual test items. It serves as a blueprint that guides test case creation and execution.
Purpose and Scope
Why Test Design Specifications Matter
- Detailed Planning: Translates strategy into actionable test approaches
- Coverage Assurance: Ensures comprehensive testing across all dimensions
- Reusability: Provides templates for similar features or projects
- Knowledge Transfer: Captures testing expertise for team sharing
- Traceability: Links requirements to test techniques to test cases
When to Create a TDS
Create TDS for:
- Complex features with multiple test dimensions
- High-risk system components
- Features requiring specialized test techniques
- Regulatory compliance testing
Skip TDS for:
- Simple CRUD operations with standard test patterns
- Well-understood, low-risk features
- Quick bug verification tests
IEEE 829 Structure for TDS
1. Test Design Specification Identifier
Document ID: TDS-ECOMM-PAYMENT-v1.2
Feature: Payment Processing Module
Version: 3.5.0
Author: Senior QA Engineer - Alex Martinez
Date: 2024-10-06
Review Status: Approved
Related Requirements: REQ-PAY-001 through REQ-PAY-045
2. Features to be Tested
Define test scope precisely:
## Features Under Test
### In Scope
✅ Credit card payment processing (Visa, MasterCard, Amex, Discover)
✅ Alternative payment methods (PayPal, Apple Pay, Google Pay)
✅ Payment validation and error handling
✅ Transaction authorization and capture flow
✅ Refund and void operations
✅ PCI DSS compliance validation
✅ Payment retry logic for failures
✅ Multi-currency support (USD, EUR, GBP, JPY)
### Out of Scope
❌ Payment gateway provider selection logic (covered in TDS-ECOMM-CONFIG)
❌ Accounting system integration (covered in TDS-ECOMM-ACCOUNTING)
❌ Invoice generation (covered in TDS-ECOMM-BILLING)
3. Approach Refinements
Detail the testing approach for each feature:
## Test Approach by Feature
### 3.1 Credit Card Processing
**Test Technique**: Equivalence Partitioning + Boundary Value Analysis
**Equivalence Classes**:
- Valid card numbers (per Luhn algorithm)
- Invalid card numbers (checksum fail)
- Card expiry status (expired/past, current month, future)
- Card types (Visa, MC, Amex, Discover, unsupported)
- CVV validation (3-digit, 4-digit, missing, invalid)
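The valid/invalid card-number partitions hinge on the Luhn checksum, so a reference implementation makes the partition boundary concrete. A minimal sketch (the sample numbers are standard sandbox test cards, not real PANs):

```python
def luhn_valid(number: str) -> bool:
    """Return True if the digit string passes the Luhn checksum."""
    digits = [int(d) for d in number if d.isdigit()]
    if not digits or len(digits) != len(number):
        return False  # non-digit characters -> invalid partition
    total = 0
    for i, d in enumerate(reversed(digits)):
        if i % 2 == 1:          # double every second digit from the right
            d *= 2
            if d > 9:
                d -= 9          # e.g. 8*2=16 -> 1+6=7
        total += d
    return total % 10 == 0

# Equivalence classes: valid checksum vs. checksum fail
assert luhn_valid("4111111111111111")      # valid class (sandbox Visa)
assert not luhn_valid("4111111111111112")  # invalid class (last digit off by one)
```

Generating the invalid class by perturbing one digit of a valid number gives test data that sits exactly on the checksum boundary.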
**Boundary Values**:
- Card expiry: Current month, next month, 5 years future
- Transaction amounts: $0.01, $0.99, $1.00, $999.99, $1,000.00, $9,999.99, $10,000.00, $10,000.01 (just above the maximum)
- CVV: 000, 001, 999 (3-digit cards); 1000 (4-digit, Amex)
**Test Data Requirements**:
- Test credit card numbers from payment gateway sandbox
- Edge case amounts (minimum, maximum, just below/above thresholds)
- Various card brands and issuing countries
**Environmental Needs**:
- Payment gateway sandbox environment
- SSL/TLS certificates for secure transmission
- Network throttling tools for timeout testing
### 3.2 Payment Authorization Flow
**Test Technique**: State Transition Testing
**States**:
1. INITIATED → Payment request created
2. VALIDATING → Input validation in progress
3. AUTHORIZING → Sent to payment gateway
4. AUTHORIZED → Gateway approved
5. CAPTURING → Funds capture requested
6. CAPTURED → Payment complete
7. FAILED → Any stage failure
8. CANCELLED → User-initiated cancellation
**Valid Transitions**:
- INITIATED → VALIDATING → AUTHORIZING → AUTHORIZED → CAPTURING → CAPTURED
- Any state → FAILED (on error)
- INITIATED/VALIDATING → CANCELLED
**Invalid Transitions** (to verify system prevents):
- INITIATED → CAPTURED (skipping intermediate steps)
- FAILED → AUTHORIZED (no retry from failed state without re-initiation)
**Test Cases Cover**:
- Happy path: INITIATED → CAPTURED
- Validation failure: INITIATED → VALIDATING → FAILED
- Gateway decline: AUTHORIZING → FAILED
- Network timeout: AUTHORIZING → (timeout) → FAILED
- User cancellation: VALIDATING → CANCELLED
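The transition table above can be encoded directly as data, which lets the test harness assert both the legal paths and the forbidden jumps. A minimal sketch (state names transcribed from the list above; terminal-state handling is an assumption):

```python
# Allowed payment-state transitions, per the table above.
# CAPTURED / FAILED / CANCELLED are assumed terminal: a failed
# payment requires re-initiation rather than in-place retry.
TRANSITIONS = {
    "INITIATED":   {"VALIDATING", "CANCELLED", "FAILED"},
    "VALIDATING":  {"AUTHORIZING", "CANCELLED", "FAILED"},
    "AUTHORIZING": {"AUTHORIZED", "FAILED"},
    "AUTHORIZED":  {"CAPTURING", "FAILED"},
    "CAPTURING":   {"CAPTURED", "FAILED"},
    "CAPTURED":    set(),
    "FAILED":      set(),
    "CANCELLED":   set(),
}

def can_transition(src: str, dst: str) -> bool:
    return dst in TRANSITIONS.get(src, set())

# Happy path is legal end to end
path = ["INITIATED", "VALIDATING", "AUTHORIZING",
        "AUTHORIZED", "CAPTURING", "CAPTURED"]
assert all(can_transition(a, b) for a, b in zip(path, path[1:]))

# Invalid transitions the system must reject
assert not can_transition("INITIATED", "CAPTURED")   # skipping steps
assert not can_transition("FAILED", "AUTHORIZED")    # no retry from FAILED
```

Driving the real system through each pair in `TRANSITIONS` (and through a sample of the forbidden pairs) gives measurable state-coverage.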
### 3.3 Error Handling and Recovery
**Test Technique**: Error Guessing + Negative Testing
**Error Scenarios**:
- Network failures (timeout, connection refused, DNS failure)
- Gateway errors (500, 502, 503, 504 HTTP codes)
- Invalid responses (malformed JSON, missing required fields)
- Rate limiting (429 Too Many Requests)
- Insufficient funds decline
- Fraud detection triggers
**Expected System Behavior**:
- Clear error messages to user (non-technical language)
- No sensitive data in error logs
- Automatic retry with exponential backoff (for transient failures)
- Graceful degradation (allow order placement without immediate payment if configured)
- Idempotency protection (same payment not charged twice)
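The retry and idempotency expectations can be pinned down with a small harness: transient failures are retried with exponential backoff, and the same idempotency key is sent on every attempt so the gateway can deduplicate. A minimal sketch (`TransientError` and the `gateway_call` signature are illustrative assumptions, not the app's actual API):

```python
import time

class TransientError(Exception):
    """Assumed marker for retryable failures (timeouts, 502/503s)."""

def charge_with_retry(gateway_call, idempotency_key,
                      max_attempts=3, base_delay=1.0):
    """Retry transient failures with exponential backoff. The same
    idempotency key is reused on every attempt, so even if a timed-out
    request actually succeeded, the customer is never charged twice."""
    for attempt in range(max_attempts):
        try:
            return gateway_call(idempotency_key)
        except TransientError:
            if attempt == max_attempts - 1:
                raise  # retries exhausted: surface the failure
            time.sleep(base_delay * 2 ** attempt)  # 1s, 2s, 4s, ...
```

A test would stub `gateway_call` to fail twice and then succeed, asserting that exactly one idempotency key was used across all attempts.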
### 3.4 Security Testing
**Test Technique**: OWASP Testing Guide + PCI DSS Requirements
**Security Tests**:
- **Data Encryption**: Verify card data encrypted in transit (TLS 1.2+) and at rest
- **PCI DSS Compliance**: Card data never logged, tokenization used for storage
- **SQL Injection**: Test payment forms with SQL injection payloads
- **XSS Prevention**: Test with XSS payloads in cardholder name, address fields
- **CSRF Protection**: Verify anti-CSRF tokens on payment submission
- **Session Management**: Test session timeout, secure cookie flags
- **Access Control**: Verify only authorized users can initiate payments
**Tools**:
- OWASP ZAP for automated vulnerability scanning
- Burp Suite for manual penetration testing
- SSL Labs for TLS configuration validation
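One PCI DSS check, that full card numbers never reach logs, can be automated as a log-scrubbing assertion rather than a manual review. A minimal sketch (the regex and keep-last-four policy are illustrative, not the application's actual masking implementation):

```python
import re

# Card-number-length digit runs (13-16 digits covers the major brands)
PAN_RE = re.compile(r"\b\d{13,16}\b")

def mask_pan(text: str) -> str:
    """Replace any PAN-like digit run, keeping only the last four
    digits, which PCI DSS permits in logs."""
    return PAN_RE.sub(
        lambda m: "*" * (len(m.group()) - 4) + m.group()[-4:], text)

log_line = "payment failed for card 4111111111111111 amount 49.99"
masked = mask_pan(log_line)
assert "4111111111111111" not in masked   # PAN must not survive
assert masked.endswith("amount 49.99")    # non-PAN digits untouched
```

A security test suite can run this check over captured application logs after each test run and fail the build on any unmasked match.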
### 3.5 Performance Testing
**Test Technique**: Load Testing + Stress Testing
**Performance Criteria**:
| Metric | Target | Measurement Method |
|--------|--------|-------------------|
| Response Time (95th percentile) | <3 seconds | JMeter, New Relic APM |
| Throughput | 500 transactions/minute | Load testing results |
| Error Rate under load | <0.1% | JMeter aggregate report |
| Database query time | <500ms | Application logs, DB profiling |
| Third-party API latency | <2 seconds | Charles Proxy, API monitoring |
**Load Scenarios**:
- Normal load: 100 concurrent users
- Peak load: 500 concurrent users (Black Friday simulation)
- Stress test: Gradual ramp-up to breaking point
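The 95th-percentile target is only meaningful if everyone computes percentiles the same way; nearest-rank is one common convention in load-test reports. A minimal sketch with hypothetical sample times (not measured data):

```python
import math

def percentile(samples, pct):
    """Nearest-rank percentile: the smallest sample such that at
    least pct% of all samples are <= it."""
    ordered = sorted(samples)
    k = math.ceil(pct / 100 * len(ordered)) - 1
    return ordered[k]

# Hypothetical response times (seconds) from one load run
run = [0.8, 1.1, 1.2, 1.4, 1.6, 1.9, 2.2, 2.5, 2.8, 4.1]
p95 = percentile(run, 95)
assert p95 > 3.0  # this run would miss the <3 s target despite a ~2 s mean
```

The example also shows why the criterion uses a percentile rather than a mean: a single slow outlier fails the p95 gate while barely moving the average.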
### 3.6 Multi-Currency Support
**Test Technique**: Decision Table Testing
**Decision Factors**:
- User location (US, EU, UK, Japan)
- Selected currency (USD, EUR, GBP, JPY)
- Card issuing country
- Merchant account currency
**Sample Decision Table**:
| # | User Location | Currency | Card Country | Conversion Applied | Currency Passed to Gateway |
|---|--------------|----------|--------------|-------------------|---------------------------|
| 1 | US | USD | US | No | USD |
| 2 | US | EUR | US | Yes (USD→EUR) | EUR |
| 3 | EU | EUR | EU | No | EUR |
| 4 | EU | USD | EU | Yes (EUR→USD) | USD |
| 5 | UK | GBP | US | No | GBP |
| 6 | Japan | JPY | Japan | No | JPY |
**Test each combination** for correct currency conversion, exchange rate application, and proper display.
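The decision table translates naturally into a data-driven test: encode each row as expected output and assert the decision rule against it. A minimal sketch (the rule "convert when the selected currency differs from the user's home-market currency" is inferred from the sample rows; note those rows never vary the outcome by card country):

```python
# Rows from the sample table:
# (user_location, currency, card_country) -> (conversion_applied, gateway_currency)
DECISION_TABLE = {
    ("US", "USD", "US"):       (False, "USD"),
    ("US", "EUR", "US"):       (True,  "EUR"),
    ("EU", "EUR", "EU"):       (False, "EUR"),
    ("EU", "USD", "EU"):       (True,  "USD"),
    ("UK", "GBP", "US"):       (False, "GBP"),
    ("Japan", "JPY", "Japan"): (False, "JPY"),
}

HOME_CURRENCY = {"US": "USD", "EU": "EUR", "UK": "GBP", "Japan": "JPY"}

def expected_behavior(location, currency):
    """Conversion is needed iff the selected currency differs from the
    home-market currency; the gateway receives the selected currency."""
    return (currency != HOME_CURRENCY[location], currency)

for (loc, cur, _card), want in DECISION_TABLE.items():
    assert expected_behavior(loc, cur) == want
```

Extending `DECISION_TABLE` with rows that do vary card country (e.g. a US-issued card paying EUR from the EU) would expose whether that factor actually matters, which the six sample rows cannot show.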
4. Test Identification
Map test techniques to test case IDs:
## Test Case Mapping
| Test Technique | Test Cases | Priority |
|---------------|-----------|----------|
| Equivalence Partitioning (Card Validation) | TC-PAY-001 to TC-PAY-025 | High |
| Boundary Value Analysis (Amounts) | TC-PAY-026 to TC-PAY-040 | High |
| State Transition Testing | TC-PAY-041 to TC-PAY-065 | Critical |
| Error Handling | TC-PAY-066 to TC-PAY-090 | High |
| Security Testing | TC-PAY-091 to TC-PAY-120 | Critical |
| Performance Testing | TC-PAY-121 to TC-PAY-135 | Medium |
| Multi-Currency | TC-PAY-136 to TC-PAY-160 | Medium |
**Total Test Cases**: 160
**Estimated Execution Time**: 24 hours (manual), 4 hours (automated)
**Automation Target**: 70% (112 test cases)
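The ID mapping above is easy to get wrong as test cases are added, so it pays to check it mechanically: ranges should be contiguous, totals should add up, and the automation target should match the percentage. A minimal sketch:

```python
# Technique-to-ID ranges transcribed from the mapping table
RANGES = [
    ("Equivalence Partitioning", 1, 25),
    ("Boundary Value Analysis", 26, 40),
    ("State Transition", 41, 65),
    ("Error Handling", 66, 90),
    ("Security", 91, 120),
    ("Performance", 121, 135),
    ("Multi-Currency", 136, 160),
]

# Total count matches the stated 160 test cases
assert sum(hi - lo + 1 for _, lo, hi in RANGES) == 160

# No gaps or overlaps between adjacent ID ranges
for (_, _, prev_hi), (_, lo, _) in zip(RANGES, RANGES[1:]):
    assert lo == prev_hi + 1

# 70% automation target equals the stated 112 cases
assert round(160 * 0.70) == 112
```

Running a check like this in CI keeps the TDS and the test-case repository from drifting apart.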
5. Feature Pass/Fail Criteria
Define success metrics:
## Pass/Fail Criteria
### Functional Acceptance Criteria
✅ **PASS** if:
- All critical and high-priority test cases pass
- Zero critical or high-severity defects open
- 95% of all test cases pass
- All security tests pass (no vulnerabilities)
- All PCI DSS compliance checks pass
❌ **FAIL** if:
- Any critical test case fails
- Any critical or high-severity security vulnerability found
- PCI DSS compliance validation fails
- Pass rate <95%
### Performance Acceptance Criteria
✅ **PASS** if:
- 95th percentile response time <3 seconds
- Error rate <0.1% under normal load
- System handles 500 concurrent transactions without degradation
❌ **FAIL** if:
- Response time >5 seconds
- Error rate >1%
- System crashes or becomes unresponsive under load
### Security Acceptance Criteria
✅ **PASS** if:
- Zero critical or high-severity vulnerabilities
- All OWASP Top 10 protections verified
- PCI DSS SAQ-D attestation successful
❌ **FAIL** if:
- Any critical security vulnerability exists
- Card data exposed in logs or unencrypted
Test Data Specification
Define data requirements in detail:
## Test Data Requirements
### Credit Card Test Data
**Source**: Payment Gateway Sandbox Test Cards
| Card Type | Test Number | CVV | Expiry | Expected Result |
|-----------|------------|-----|--------|----------------|
| Visa | 4111111111111111 | 123 | 12/25 | Approved |
| Visa | 4000000000000002 | 123 | 12/25 | Declined |
| MasterCard | 5555555555554444 | 123 | 12/25 | Approved |
| Amex | 378282246310005 | 1234 | 12/25 | Approved |
| Discover | 6011111111111117 | 123 | 12/25 | Approved |
### User Test Data
**Profile Variations**:
- New users (first purchase)
- Returning users (saved payment methods)
- Guest checkout users
- Users in different countries (US, UK, Germany, Japan)
### Transaction Amount Test Data
**Boundary Values**:
- Minimum: $0.01
- Standard: $49.99, $99.99, $249.99
- High value: $999.99, $5,000.00
- Maximum: $10,000.00
### Error Simulation Data
**Network Failures**:
- Timeout simulation: 30s+ delay
- Connection refused: Invalid gateway URL
- DNS failure: Non-existent domain
**Gateway Errors**:
- 500 Internal Server Error response
- Malformed JSON response
- Missing required fields in response
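The malformed-response cases define a clear oracle: a response that cannot be parsed, or that lacks required fields, must be treated as a gateway error and never as a successful payment. A minimal sketch (the `transaction_id`/`status` field names are an assumed response contract, not the real gateway schema):

```python
import json

REQUIRED_FIELDS = {"transaction_id", "status"}  # assumed contract

def parse_gateway_response(body: str):
    """Return (ok, result). Any parsing problem maps to a gateway
    error; payment is never considered successful on bad input."""
    try:
        payload = json.loads(body)
    except json.JSONDecodeError:
        return False, "malformed response"
    if not isinstance(payload, dict):
        return False, "unexpected response shape"
    missing = REQUIRED_FIELDS - payload.keys()
    if missing:
        return False, f"missing fields: {sorted(missing)}"
    return True, payload

assert parse_gateway_response("{not json")[0] is False
assert parse_gateway_response('{"status": "ok"}')[0] is False  # no transaction_id
ok, payload = parse_gateway_response('{"transaction_id": "t1", "status": "ok"}')
assert ok and payload["status"] == "ok"
```

Negative tests feed each corruption from the list above through this path and assert the payment state machine ends in FAILED, not CAPTURED.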
Environment and Tool Requirements
## Test Environment Specification
### Environment Configuration
**Payment Gateway**: Stripe Sandbox (test mode)
- API Version: 2024-06-20
- Webhook endpoint configured for async event handling
**Application Under Test**:
- URL: https://staging.ecommerce-app.com
- Version: v3.5.0-rc2
- Database: PostgreSQL 15.3 (test instance)
**Tools Required**:
- **Test Management**: TestRail
- **Automation Framework**: Cypress 13.x
- **API Testing**: Postman, REST Assured
- **Load Testing**: JMeter 5.6
- **Security Testing**: OWASP ZAP 2.14, Burp Suite Pro
- **Monitoring**: New Relic, Datadog
- **Network Simulation**: Charles Proxy, Network Link Conditioner
### Access Requirements
- Stripe sandbox API keys (test mode)
- Admin credentials for test environment
- VPN access for security testing
Risk-Based Testing Strategy
## Risk Analysis and Mitigation
| Risk | Probability | Impact | Test Focus | Mitigation |
|------|-----------|--------|-----------|-----------|
| Payment data breach | Low | Critical | Security testing (30% of effort) | Penetration testing, code review, encryption validation |
| Transaction failures | Medium | High | Error handling, retry logic (25%) | State transition testing, network failure simulation |
| Poor performance | Medium | Medium | Load testing (15%) | Performance benchmarking, scalability tests |
| Currency conversion errors | Low | Medium | Multi-currency testing (10%) | Decision table coverage |
| Integration failures | Medium | High | API integration testing (20%) | Contract testing, mock services |
**Risk Priority**: Security > Functional Correctness > Performance > Usability
Best Practices
1. Keep TDS Living Document
Update TDS as requirements change:
- Version control in Git
- Review quarterly or after major feature changes
- Link to latest test case repository
2. Involve Stakeholders in Review
TDS review should include:
- QA team (thoroughness of approach)
- Developers (technical feasibility)
- Product owners (business coverage)
- Security team (compliance validation)
3. Balance Detail and Maintainability
- **Too Little Detail**: “Test payment processing” ❌
- **Too Much Detail**: Individual test steps for 160 test cases ❌
- **Just Right**: Test techniques, data classes, acceptance criteria ✅
4. Link TDS to Other Artifacts
Create traceability:
- Requirements → TDS → Test Cases → Defects
- Use reference IDs consistently across documents
Conclusion
A well-crafted Test Design Specification bridges the gap between high-level test strategy and detailed test case execution. By defining test techniques, coverage criteria, data requirements, and pass/fail thresholds, TDS ensures systematic, comprehensive, and repeatable testing.
The key is appropriate detail: enough to guide testers and ensure coverage, but not so much that the document becomes a maintenance burden. Focus on the how and what of testing, leaving the specific steps to individual test cases.