Understanding the Software Development Life Cycle (SDLC) and the Software Testing Life Cycle (STLC) is a fundamental competency for QA specialists. Knowing how software is created, at which stages testing begins, and how different methodologies shape a team's work is critical to doing the job well. In this guide, we'll examine both cycles, their relationship, and how they apply across development methodologies.
What is SDLC (Software Development Life Cycle)
SDLC (Software Development Life Cycle) is a structured software development process that describes the stages of creating software, from initial concept to the end of support.
Why SDLC is Needed
- Structure — a clear understanding of what to do at each stage
- Predictability — the ability to plan deadlines and resources
- Quality — built-in quality control mechanisms at each stage
- Risk mitigation — early identification of problems
- Efficiency — process optimization and minimized rework
- Communication — a common language for the entire team
Classic SDLC Stages
Regardless of methodology, SDLC typically includes the following phases (though their names and order may vary):
1. Planning
Goal: Define project scope, resources, deadlines, and budget.
Main activities:
- Business goals and requirements definition
- Feasibility study
- Risk analysis
- Team formation
- Project charter creation
- Cost and timeline estimation
Output artifacts:
- Project plan
- Resource allocation document
- Risk assessment report
- Budget estimation
QA Role:
- Risk assessment participation
- Test resource planning
- Test effort estimation
- Quality requirements input
2. Requirements Analysis
Goal: Detailed understanding of what needs to be developed.
Main activities:
- Business requirements gathering from stakeholders
- Functional requirements documentation
- Non-functional requirements definition (performance, security, usability)
- Use cases and user stories creation
- Requirements prioritization
- Stakeholder approval
Output artifacts:
- Business Requirements Document (BRD)
- Software Requirements Specification (SRS)
- Use cases
- User stories
- Acceptance criteria
QA Role:
- Requirements review for completeness, clarity, and testability
- Requirement walkthrough participation
- Requirement traceability matrix creation
- Acceptance criteria definition
- Early ambiguity and contradiction identification
QA questions at this stage:
- Are requirements testable?
- Are all edge cases covered?
- Are there contradictions between requirements?
- Are non-functional requirements defined?
3. Design
Goal: Define system architecture and component design.
Main activities:
- High-level design (system architecture)
- Low-level design (detailed component design)
- Database design
- UI/UX design
- API design
- Technology stack selection
- Design review
Output artifacts:
- High-Level Design (HLD) document
- Low-Level Design (LLD) document
- Database schema
- API specifications
- UI mockups and wireframes
- Architecture diagrams
QA Role:
- Design documentation review
- Architecture testability verification
- Test strategy planning
- Test environment design
- Test data requirements identification
- Initial test plan draft
Important: at the design stage, QA can identify designs that will be difficult to test and suggest changes while they are still cheap to make.
4. Implementation / Development
Goal: Write code according to design.
Main activities:
- Code writing
- Unit testing
- Code review
- Version control (git commits)
- Component integration
- Continuous integration
Output artifacts:
- Source code
- Unit tests
- Build artifacts
- Technical documentation
- Code review notes
QA Role:
- Test data preparation
- Test cases and test scenarios creation
- Test environments setup
- Automated tests development (if automation exists)
- Early smoke tests on dev builds
- Code reviews participation (in some teams)
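For illustration, here is what the unit testing mentioned above might look like in practice. This is a minimal pytest sketch; `calculate_discount` is a hypothetical function invented for the example, not part of any real codebase.

```python
# Hypothetical example: a unit test a developer might write during
# implementation. `calculate_discount` is an invented function.
import pytest


def calculate_discount(price: float, percent: float) -> float:
    """Return the price after applying a percentage discount."""
    if not 0 <= percent <= 100:
        raise ValueError("percent must be between 0 and 100")
    return round(price * (1 - percent / 100), 2)


def test_applies_discount():
    assert calculate_discount(100.0, 20) == 80.0


def test_zero_discount_keeps_price():
    assert calculate_discount(50.0, 0) == 50.0


def test_invalid_percent_raises():
    with pytest.raises(ValueError):
        calculate_discount(100.0, 150)
```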
5. Testing
Goal: Defect identification and requirements compliance verification.
Main activities:
- Various testing types execution:
  - Functional testing
  - Integration testing
  - System testing
  - Regression testing
  - Performance testing
  - Security testing
  - Usability testing
- Bug reporting and tracking
- Retesting and regression
- Test reporting
Output artifacts:
- Test execution reports
- Bug reports
- Test metrics
- Traceability matrix
- Sign-off for deployment transition
QA Role:
- Main phase for QA
- All testing types execution
- Bug reporting and tracking
- Developer coordination
- Quality metrics provision
6. Deployment / Release
Goal: Product delivery to users.
Main activities:
- Production environment preparation
- Final smoke testing
- Production deployment
- Monitoring
- Rollback plan readiness
- User documentation
Output artifacts:
- Released product
- Release notes
- User documentation
- Deployment checklist
- Rollback procedure
QA Role:
- Final smoke testing before release
- Sanity testing after deployment
- Critical paths verification in production
- Production issues monitoring
- Go/no-go decision participation
7. Maintenance
Goal: Bug fixing, updates, improvements.
Main activities:
- Bug fixing
- Performance optimization
- Security patches
- Feature enhancements
- User support
- Monitoring and analytics
Output artifacts:
- Hotfixes
- Patch releases
- Updated documentation
- Performance reports
QA Role:
- Hotfixes and patches testing
- Regression testing
- User-reported issues monitoring
- Production metrics analysis
- Testing processes continuous improvement
What is STLC (Software Testing Life Cycle)
STLC (Software Testing Life Cycle) is the sequence of steps performed during the software testing process. STLC is a subset of SDLC and focuses exclusively on testing.
STLC vs SDLC: What’s the Difference
| Aspect | SDLC | STLC |
|---|---|---|
| Focus | Software development | Software testing |
| Scope | Entire creation process | Only the testing process |
| Start | From the project idea | After requirement analysis |
| Participants | All project roles | Predominantly QA |
| Goal | Create a working product | Ensure product quality |
| Artifacts | Code, design, docs | Test plans, test cases, bug reports |
Important: STLC doesn't begin only after development is finished. Testing is integrated into all SDLC phases.
STLC Phases
For a comprehensive understanding of how testing integrates at different stages, see our guide on testing levels.
Phase 1: Requirement Analysis
Goal: Understand what will be tested.
Activities:
- Requirements study (BRD, SRS, user stories)
- Testable requirements identification
- Requirement review meetings
- Testing scope definition
- Different testing types identification
- Requirement Traceability Matrix (RTM) preparation
Entry Criteria:
- Requirements documented
- Requirements documents access
- Stakeholders available for clarifications
Exit Criteria:
- RTM created
- Automation feasibility determined
- All requirements questions resolved
Deliverables:
- RTM (Requirement Traceability Matrix)
- Automation feasibility report
- Questions/clarifications list
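As a sketch of what an RTM captures, the following Python fragment models it as a simple mapping from requirement IDs to the test cases that cover them. All IDs are invented for illustration.

```python
# Sketch of an RTM as a plain mapping from requirement IDs to the test
# cases that cover them. All IDs are invented for illustration.
rtm = {
    "REQ-001": ["TC-001", "TC-002"],  # login with valid credentials
    "REQ-002": ["TC-003"],            # password reset
    "REQ-003": [],                    # not covered yet
}

# A requirement with no covering test case is a coverage gap.
uncovered = [req for req, cases in rtm.items() if not cases]
print("Requirements without test coverage:", uncovered)
# -> Requirements without test coverage: ['REQ-003']
```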
Phase 2: Test Planning
Goal: Define testing strategy and resources.
Activities:
- Test Plan preparation
- Test strategy definition
- Effort estimation
- Testing tools selection
- Roles and responsibilities definition
- Testing risk assessment
- Test environment planning
Entry Criteria:
- Requirements analyzed
- RTM ready
Exit Criteria:
- Test Plan approved
- Effort estimation completed
- Resources allocated
Deliverables:
- Test Plan document
- Test Strategy document
- Effort estimation document
Test Plan includes:
- Test scope (what to test, what not to test)
- Test approach (testing types)
- Roles and responsibilities
- Test schedule
- Entry/exit criteria for each phase
- Risk mitigation plan
- Test deliverables
Phase 3: Test Case Development
Goal: Create detailed test cases and test data.
Activities:
- Test cases writing
- Test scripts creation (for automation)
- Test cases review
- Test data creation/acquisition
- Test cases prioritization
- Traceability creation between requirements and test cases
Entry Criteria:
- Test plan approved
- Requirements available and stable
- RTM created
Exit Criteria:
- All test cases written and reviewed
- Test data prepared
- Traceability matrix updated
Deliverables:
- Test cases
- Test scripts (for automation)
- Test data
- Updated RTM
A good test case includes:
- Test case ID
- Test description
- Preconditions
- Test steps
- Test data
- Expected result
- Actual result (filled during execution)
- Status
- Priority
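To make these fields concrete, here is a sketch of a test case represented as a Python dataclass, as one might do in a lightweight in-house test-management script. All names and values are illustrative.

```python
# Sketch: the fields above encoded as a Python dataclass. All names and
# values are illustrative, not from any real test management tool.
from dataclasses import dataclass


@dataclass
class TestCase:
    case_id: str
    description: str
    preconditions: list[str]
    steps: list[str]
    test_data: dict
    expected_result: str
    priority: str
    actual_result: str = ""   # filled in during execution
    status: str = "Not Run"   # e.g. Not Run / Pass / Fail / Blocked


login_tc = TestCase(
    case_id="TC-001",
    description="Login with valid credentials",
    preconditions=["User account exists", "Login page is reachable"],
    steps=["Open login page", "Enter valid credentials", "Click 'Sign in'"],
    test_data={"username": "qa_user", "password": "s3cret!"},
    expected_result="User lands on the dashboard",
    priority="High",
)
```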
Phase 4: Test Environment Setup
Goal: Prepare environment for test execution.
Activities:
- Test environment setup (hardware, software, network)
- Test data preparation in database
- Testing tools setup
- Environment smoke test
- Test builds acquisition
Entry Criteria:
- Test plan ready
- Test environment design document ready
Exit Criteria:
- Test environment ready
- Smoke test passed
- Test data loaded
Deliverables:
- Test environment ready for use
- Smoke test results
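A minimal sketch of such an environment smoke check, assuming a hypothetical test environment that exposes a /health endpoint; `requests` is a third-party library.

```python
# Sketch of an environment smoke check. The base URL and /health endpoint
# are assumptions; `requests` is third-party (pip install requests).
import sys

import requests

BASE_URL = "https://test-env.example.com"  # assumed test environment


def smoke_check() -> bool:
    try:
        response = requests.get(f"{BASE_URL}/health", timeout=5)
    except requests.RequestException as exc:
        print(f"Environment unreachable: {exc}")
        return False
    print(f"/health returned {response.status_code}")
    return response.status_code == 200


if __name__ == "__main__":
    sys.exit(0 if smoke_check() else 1)
```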
Phase 5: Test Execution
Goal: Execute test cases and find defects.
Activities:
- Test cases execution per plan
- Actual results comparison with expected
- Defects logging for failed tests
- Defects retesting after fixing
- Regression testing
- Test case status updating
- Test execution tracking
Entry Criteria:
- Test environment ready
- Test cases ready
- Build ready for testing
- Smoke test passed
Exit Criteria:
- All planned test cases executed
- Critical and high bugs fixed and retested
- Regression testing completed
- Exit criteria from test plan met
Deliverables:
- Test execution reports
- Defect reports
- Updated test cases (if needed)
- Test logs
Testing types at this phase:
- Smoke testing
- Functional testing
- Integration testing
- System testing
- Regression testing
- Exploratory testing
- Non-functional testing (performance, security, usability)
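As a sketch of the actual-vs-expected comparison at the heart of this phase, here is a parametrized pytest example; `apply_discount` is the same hypothetical function used in the earlier unit-test sketch, and a failing assertion here would become the raw material for a bug report.

```python
# Sketch of scripted test execution with pytest: each row is a test case
# whose actual result is compared to the expected one. `apply_discount`
# is a hypothetical function invented for illustration.
import pytest


def apply_discount(price: float, percent: float) -> float:
    return round(price * (1 - percent / 100), 2)


@pytest.mark.parametrize(
    "price, percent, expected",
    [
        (100.0, 10, 90.0),
        (80.0, 25, 60.0),
        (19.99, 0, 19.99),
    ],
)
def test_discount_cases(price, percent, expected):
    actual = apply_discount(price, percent)
    # A failed assertion here becomes the basis of a bug report.
    assert actual == expected, f"Expected {expected}, got {actual}"
```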
Phase 6: Test Cycle Closure
Goal: Assess testing completeness and gather learnings.
Activities:
- Test artifacts collection
- Test metrics analysis
- Test summary report preparation
- Lessons learned meeting
- Best practices identification
- Test artifacts archiving
Entry Criteria:
- Test execution completed
- All critical bugs closed
- Regression testing completed
Exit Criteria:
- Test closure report ready
- Stakeholder sign-off received
Deliverables:
- Test closure report
- Test metrics
- Lessons learned document
- Best practices documentation
Test Metrics include:
- Total test cases vs. executed
- Pass/Fail percentage
- Defect density
- Defect leakage
- Test coverage
- Test execution velocity
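A quick sketch of how some of the metrics above are computed from raw counts. The numbers are invented, and exact definitions (especially for defect leakage) vary between teams.

```python
# Sketch of computing test metrics from raw counts.
# All numbers are invented; definitions vary between teams.
total_cases, executed = 200, 180
passed = 162
defects_internal, defects_production = 45, 5
size_kloc = 12.5  # product size in thousands of lines of code (assumed)

execution_rate = executed / total_cases * 100
pass_rate = passed / executed * 100
defect_density = (defects_internal + defects_production) / size_kloc
# Defect leakage: share of defects that escaped to production.
defect_leakage = defects_production / (defects_internal + defects_production) * 100

print(f"Execution rate: {execution_rate:.1f}%")          # 90.0%
print(f"Pass rate: {pass_rate:.1f}%")                    # 90.0%
print(f"Defect density: {defect_density:.1f} per KLOC")  # 4.0
print(f"Defect leakage: {defect_leakage:.1f}%")          # 10.0%
```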
Development Methodologies and Where Testing Fits
Waterfall Model
Description: a sequential model in which each phase starts only after the previous one completes.
Phases: Requirements → Design → Implementation → Testing → Deployment → Maintenance
Characteristics:
- Strict phase sequence
- No returning to a previous phase
- Testing is a separate phase after development
- Extensive documentation
- Fixed requirements
Where testing fits in Waterfall:
- Testing is a separate, late phase
- QA participates in requirement reviews, but the main work comes after coding
- Long test cycle
- High risk of finding critical bugs at late stages
QA Pros:
- Clear requirements
- Complete documentation
- Sufficient time for thorough testing
- Easy resource planning
QA Cons:
- Late defect discovery (expensive to fix)
- Little flexibility for changes
- If a bug slips through, rolling back is expensive
- Long feedback loop
When used:
- Projects with clearly defined, stable requirements
- Regulated industries (medical, finance)
- Projects with fixed scope and timeline
V-Model (Verification and Validation Model)
Description: an extension of Waterfall in which each development phase has a corresponding testing phase.
Structure:
- Requirements ←→ Acceptance Testing
- System Design ←→ System Testing
- Architecture Design ←→ Integration Testing
- Module Design ←→ Unit Testing
- Implementation (the point of the V)
Characteristics:
- Testing is planned in parallel with development
- Each development stage has a corresponding testing stage
- Verification (Are we building the product right?) and Validation (Are we building the right product?)
- Test artifacts created early
Where testing fits in the V-Model:
- QA is involved from the start
- Test planning starts at the requirements phase
- Test case design runs in parallel with the design phase
- Test execution happens after implementation
Phase mapping:
- Requirements Analysis → Acceptance Test Planning
  - User Acceptance Tests (UAT)
  - Business requirement verification
- System Design → System Test Planning
  - End-to-end testing
  - Functional and non-functional testing
- High-Level Design → Integration Test Planning
  - Module interaction testing
  - API testing (sketched below)
- Low-Level Design → Unit Test Planning
  - Individual component testing
  - Developer-driven testing
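As a sketch of the integration-level API testing mentioned in the mapping above, here is a pytest-style check against a hypothetical REST endpoint; the URL, path, and payload are all assumptions for illustration.

```python
# Sketch of an integration-level API test with `requests` and pytest.
# The service URL, endpoint, and payload are hypothetical.
import requests

BASE_URL = "https://api.example.com"  # assumed service under test


def test_create_user_returns_201_and_id():
    payload = {"name": "Test User", "email": "test.user@example.com"}
    response = requests.post(f"{BASE_URL}/users", json=payload, timeout=5)

    assert response.status_code == 201
    body = response.json()
    assert "id" in body                       # server assigned an ID
    assert body["email"] == payload["email"]  # data round-trips intact
```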
QA Pros:
- Early QA involvement
- Defects found earlier (cheaper to fix)
- Clear correlation between development and testing
- Better test coverage
QA Cons:
- Still inflexible model
- Difficult to adapt to changes
- Requires extensive documentation
When used:
- Safety-critical systems (automotive, aerospace, medical)
- Projects with clear, stable requirements
- Regulated industries
Agile Methodology
Description: an iterative and incremental approach in which development is conducted in short cycles (sprints).
Agile Principles:
- Individuals and interactions over processes and tools
- Working software over comprehensive documentation
- Customer collaboration over contract negotiation
- Responding to change over following a plan
Characteristics:
- Short iterations (typically 1-4 weeks)
- Continuous feedback
- Self-organizing teams
- Close collaboration
- Adaptation to changes
Where testing fits in Agile:
- Testing is a continuous process within each sprint
- QA is a full team member, not a separate phase
- Testers work in parallel with developers
- Continuous testing and continuous integration
Agile Testing Quadrants (Brian Marick):
| | Business-Facing | Technology-Facing |
|---|---|---|
| Support programming | Q2: functional tests (manual/exploratory), user stories, examples/scenarios | Q1: unit tests (automated), component tests, API tests |
| Critique product | Q3: usability tests (manual), user acceptance, A/B testing | Q4: performance tests (automated), load tests, security tests |
QA in Agile Sprint:
Day 1-2 (Sprint Planning):
- Planning participation
- Requirements clarification
- Testing effort estimation
- Acceptance criteria defining
- Testing approach planning
Day 3-8 (Development + Testing):
- Testing user stories as they become ready
- Exploratory testing
- Regression testing
- Bug reporting and retesting
- Automation test creation/update
- Constant communication with devs
Day 9-10 (Sprint End):
- Final regression
- Demo participation
- Retrospective participation
- Test metrics preparation
- Next sprint planning
QA Pros:
- Early and continuous testing
- Fast feedback loop
- Defects fixed in same sprint
- Close team collaboration
- Flexibility and adaptability
QA Cons:
- Less documentation (requirements can be ambiguous)
- Tight timelines create pressure
- Must multitask (testing, automation, communication)
- Regression can accumulate
- Requires high automation level
Agile frameworks:
Scrum:
- Sprints (typically 2 weeks)
- Roles: Product Owner, Scrum Master, Development Team (including QA)
- Ceremonies: Sprint Planning, Daily Standup, Sprint Review, Retrospective
- Artifacts: Product Backlog, Sprint Backlog, Increment
Kanban:
- Continuous flow (no sprints)
- Work In Progress (WIP) limits
- Visual board
- Pull-based system
When Agile is used:
- Startups and dynamic environments
- Projects with changing requirements
- Products requiring fast time-to-market
- Web applications, mobile apps
DevOps and Continuous Testing
DevOps unites Development and Operations to accelerate delivery.
Key practices:
- Continuous Integration (CI)
- Continuous Delivery (CD)
- Infrastructure as Code
- Monitoring and Logging
- Collaboration culture
Where testing fits in DevOps:
Shift-Left Testing means testing moves earlier (further left) in the SDLC:
- Unit tests written by developers
- API tests automated
- Integration tests in CI/CD pipeline
- Static code analysis
CI/CD Pipeline with testing:
Code Commit → Build → Unit Tests → Integration Tests → Deploy to Staging →
Smoke Tests → Regression Tests → Deploy to Production → Monitoring
Automation in DevOps:
- 70-80% of test coverage is automated
- Fast feedback (minutes, not hours)
- Tests run on each commit
- Failed tests block deployment
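A minimal sketch of the "failed tests block deployment" idea from the last point: run test stages in order with pytest and stop the pipeline on the first failure. The directory layout is an assumption, and real pipelines usually express these stages in CI configuration (GitHub Actions, GitLab CI, Jenkins) rather than a Python script.

```python
# Sketch of a "failed tests block deployment" gate: run test stages in
# order and abort on the first failure. Directory names are assumptions.
import subprocess
import sys

STAGES = ["tests/unit", "tests/integration", "tests/smoke"]


def run_pipeline() -> int:
    for stage in STAGES:
        print(f"Running stage: {stage}")
        result = subprocess.run(["pytest", stage, "-q"])  # needs pytest on PATH
        if result.returncode != 0:
            print(f"Stage '{stage}' failed; blocking deployment.")
            return result.returncode
    print("All test stages passed; deployment may proceed.")
    return 0


if __name__ == "__main__":
    sys.exit(run_pipeline())
```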
QA role in DevOps:
- Automated tests creation and maintenance
- CI/CD pipelines setup for testing
- Production monitoring (observability)
- Performance and security testing
- Exploratory testing for new functionality
Methodology Choice: Comparison Table
| Criterion | Waterfall | V-Model | Agile | DevOps |
|---|---|---|---|---|
| Flexibility | Low | Low | High | High |
| Documentation | Extensive | Extensive | Minimal | Minimal |
| QA involvement | Late | Early | From start | Continuous |
| Feedback speed | Slow | Medium | Fast | Very fast |
| Test automation | Optional | Desirable | Critical | Mandatory |
| Risk of late defects | High | Medium | Low | Very low |
| Best for | Stable scope | Critical systems | Dynamic products | Modern web/cloud |
| Team size | Large | Medium | Small-medium | Small |
| Release cycle | Months-years | Months | Weeks | Days-weeks |
QA Best Practices in Different Methodologies
For any methodology:
- Early involvement — participate in requirement analysis
- Requirement review — verify requirements testability
- Test planning — plan ahead
- Risk-based testing — focus on critical areas
- Clear communication — clearly communicate status and risks
- Metrics tracking — track quality metrics
- Continuous learning — learn from mistakes
Specific to Waterfall/V-Model:
- Invest time in detailed documentation
- Comprehensive test coverage planning
- Thorough requirement analysis
- Formal sign-offs at each stage
Specific to Agile/DevOps:
- Automation first — automate everything possible
- Continuous testing — test constantly, not at end
- Shift-left — find bugs as early as possible
- Collaboration — work closely with devs
- Exploratory testing — don’t rely only on scripted tests
- Fast feedback — provide quick feedback
Conclusion
Understanding SDLC and STLC, as well as the various development methodologies, is not just theory. It's practical knowledge that shapes how you'll work every day:
Key Takeaways:
SDLC and STLC are interconnected but not the same. STLC is the testing cycle within SDLC, and it should be integrated into all development phases rather than being a separate stage at the end.
The methodology defines the QA role. In Waterfall you test at the end, in Agile you test constantly, and in DevOps you automate and monitor production.
There is no universal approach. Waterfall may be the right choice for regulated industries, Agile for startups, and the V-Model for critical systems.
Early QA involvement is critically important regardless of methodology. The earlier you find defects, the cheaper they are to fix.
Automation is not optional in the modern world, especially in Agile and DevOps. Manual testing doesn't scale.
Adaptation is a key skill. Methodologies evolve, and hybrid approaches exist (Water-Scrum-Fall, for example). What matters is understanding the principles and adapting them to your team's context.
Testing is not a phase; it's a mindset. Quality is the entire team's responsibility, not just the QA department's.
A successful QA specialist not only knows SDLC and STLC theory but can apply it in real work, adapting to the team's methodology and continuously improving testing processes.
Next steps:
- Study the 7 ISTQB testing principles for a deeper understanding of the fundamentals
- Identify which methodology your team uses
- Assess where testing can start earlier in your SDLC
- Explore automation opportunities in your project