Play 7: Plan for an Evolving Manual & Automated Testing Strategy

Develop test strategy to evolve and mature with digital transformation

As your project matures, so does your ability to incorporate automated testing. Adopt a flexible, continuously improving testing strategy designed to evolve and mature with your digital transformation.

Create an initial test strategy for a new project

A new project that is still in the early stages of maturity might have only a few areas that are tested regularly through unit tests. New additions and adjustments happen frequently, and test coverage should be improving toward a stated goal. Because the user interface may not yet be in its final design at this phase, UI-level tests written now will be difficult to maintain.
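Even at this early phase, a handful of small unit tests can anchor the coverage goal. Below is a minimal sketch using Python's built-in unittest; `slugify` is a hypothetical helper standing in for early-stage application code.

```python
import unittest

def slugify(title):
    """Hypothetical early-stage helper: turn a title into a URL slug."""
    return "-".join(title.lower().split())

class TestSlugify(unittest.TestCase):
    # One behavior per test, so a failure points at exactly one regression.
    def test_lowercases_input(self):
        self.assertEqual(slugify("Hello"), "hello")

    def test_joins_words_with_hyphens(self):
        self.assertEqual(slugify("Hello World"), "hello-world")

suite = unittest.TestLoader().loadTestsFromTestCase(TestSlugify)
result = unittest.TextTestRunner(verbosity=0).run(suite)
```

Small, behavior-named tests like these are cheap to rewrite as the design shifts, which is exactly what the early stage demands.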


Update the test strategy for the project growth stage

As the project attracts more attention from stakeholders, it enters the growth stage. In this stage, the project has many areas that are tested thoroughly and are relatively stable in their design. Projects tend to remain in this stage for most of their life as new features are added and UI details stabilize. The growth stage presents an opportunity to expand the role of test automation, both by scripting UI-level tests for the stable components and by integrating unit tests into the software development workflow. Continuous integration of the unit tests developed in the initial stage builds stability and confidence.
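One common way to wire the unit tests into the development workflow is a CI job that runs on every push. The fragment below is a sketch assuming GitHub Actions and a Python test suite; the job name, runner, Python version, and test command are illustrative, not prescriptive.

```yaml
# Illustrative CI sketch: run the unit test suite on every push and pull request.
name: unit-tests
on: [push, pull_request]
jobs:
  test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: "3.12"
      # Fail the build (and block the merge) if any unit test fails.
      - run: python -m unittest discover -s tests
```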

Finalize testing automation for a mature project

This is the stage where you take a big-picture look at your project to see what is missing from your core testing strategy. Start with the unit tests: they may be outdated at this stage and need rewriting, and some may be better suited for treatment as integration tests. Determine which subset of tests should remain automated via developer actions versus those that should run as a batch process, due to the duration of their execution or the level of integration they cover; this latter category can be grouped with UI-level tests. A mature project should also produce reports about testing: rate of success, coverage, and performance metrics. Individual test reports can be fed as source data for analytics processing and monitored over time.
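The split between developer-run tests and batch-run tests can be expressed directly in the test suite. The sketch below, using Python's unittest, gates long-running tests behind an environment variable (the name `RUN_SLOW_TESTS` is an assumed convention) so developers run only the fast set while the batch job runs everything.

```python
import os
import unittest

# Assumed convention: the nightly batch job sets RUN_SLOW_TESTS=1.
RUN_SLOW = os.environ.get("RUN_SLOW_TESTS") == "1"

class FastUnitTests(unittest.TestCase):
    # Runs on every developer action (save, commit, push).
    def test_pure_logic(self):
        self.assertEqual(sorted([3, 1, 2]), [1, 2, 3])

class SlowIntegrationTests(unittest.TestCase):
    # Long-running or highly integrated; grouped with UI-level tests
    # and executed only in the batch run.
    @unittest.skipUnless(RUN_SLOW, "runs only in the nightly batch")
    def test_full_order_pipeline(self):
        self.assertTrue(True)  # placeholder for a real end-to-end check

loader = unittest.TestLoader()
suite = unittest.TestSuite([
    loader.loadTestsFromTestCase(FastUnitTests),
    loader.loadTestsFromTestCase(SlowIntegrationTests),
])
result = unittest.TextTestRunner(verbosity=0).run(suite)
```

The text runner's output is also the raw material for the reporting the mature stage calls for: success rate, skipped counts, and timings can be collected from each run.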


Considerations for inclusion in the test plan

  • Types of Tests
    • Unit Testing
    • Integration Testing
    • System Testing – Testing of a complete and fully integrated product
      • This type of testing falls into the category of black-box testing, where knowledge of the inner design of the code is not a prerequisite.
    • Smoke Testing – Quick check to verify the software build is stable and can be passed to the QA team for further testing
    • Interface Testing
    • Regression Testing – Complete, time-consuming checks to verify that changes in the code do not affect existing or related functionality
    • Beta/Acceptance Testing
    • Load Testing
    • Security Penetration Testing
    • Cross-Browser Testing
      • Target browser-OS configurations
      • Mobile-first world – include phones and tablets
      • Automate the testing
      • Test before going live
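The smoke test described above can be tiny. In the sketch below, `create_app` is a hypothetical factory standing in for the real application entry point; the checks only confirm that the build starts and its critical routes exist before handing off to deeper QA testing.

```python
import unittest

# Hypothetical application factory; a real smoke test would import
# the deployed build's actual entry point.
def create_app():
    return {"status": "ok", "routes": ["/", "/login"]}

class SmokeTest(unittest.TestCase):
    """Fast checks that the build is stable enough for further testing."""

    def test_app_starts(self):
        self.assertEqual(create_app()["status"], "ok")

    def test_critical_routes_registered(self):
        self.assertIn("/login", create_app()["routes"])

suite = unittest.TestLoader().loadTestsFromTestCase(SmokeTest)
result = unittest.TextTestRunner(verbosity=0).run(suite)
```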


Adopt Agile Testing

Agile testing operates under the principle that testing is integrated directly with the software development lifecycle. Testing happens in the same small increments and in concert with writing code.

Fail fast.

In Agile testing, testing is wired directly into the development process so that defects are discovered early, before merges into the trunk. The resulting confidence in accuracy moves the artifact quickly toward a releasable state.


  • Build unit tests from day one
  • Automate unit testing through workflow hooks and CI pipelines
  • Use mocks and fixtures to ensure data source independence
  • As the project matures, expand the testing to include more test types
  • Collect as much data about failed tests as possible
  • Generate reports from tests
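The mocks-and-fixtures practice above can be illustrated with Python's unittest.mock. `OrderService` and its `db` collaborator are hypothetical stand-ins; the point is that the test never touches a real data source.

```python
import unittest
from unittest.mock import Mock

class OrderService:
    """Hypothetical service; the real one would query a database."""
    def __init__(self, db):
        self.db = db

    def total_for(self, customer_id):
        return sum(order["amount"] for order in self.db.orders_for(customer_id))

class TestOrderService(unittest.TestCase):
    def test_total_is_computed_without_a_real_database(self):
        # The mock stands in for the data source, so the test is
        # deterministic and needs no external infrastructure.
        db = Mock()
        db.orders_for.return_value = [{"amount": 10}, {"amount": 5}]
        service = OrderService(db)
        self.assertEqual(service.total_for("c-1"), 15)
        db.orders_for.assert_called_once_with("c-1")

suite = unittest.TestLoader().loadTestsFromTestCase(TestOrderService)
result = unittest.TextTestRunner(verbosity=0).run(suite)
```

Because the mock records its calls, the test also verifies the interaction with the data source, not just the computed result.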


  • Are tests designed to be independent from one another? 
  • Are tests integrated with the CI/CD pipeline? 
  • Are tests independent of the data source? 
  • Are requirements written so they can be easily converted to tests? 
  • Are the unit tests acting as the foundation of the test suite?  
  • Are the test names self-describing and easy-to-understand?  
  • Is the test code written applying the DRY principle?
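Several of the checklist questions — test independence, self-describing names, and DRY test code — can be seen together in one small sketch. `Cart` is a hypothetical class used only for illustration.

```python
import unittest

class Cart:
    """Hypothetical shopping cart used to illustrate the checklist items."""
    def __init__(self):
        self.items = []

    def add(self, name, price):
        self.items.append((name, price))

    def total(self):
        return sum(price for _, price in self.items)

class TestCart(unittest.TestCase):
    # setUp keeps the tests DRY; each test stays independent because a
    # fresh Cart is built before every test method runs.
    def setUp(self):
        self.cart = Cart()
        self.cart.add("book", 12)

    def test_total_sums_item_prices(self):
        self.cart.add("pen", 3)
        self.assertEqual(self.cart.total(), 15)

    def test_empty_cart_total_is_zero(self):
        self.assertEqual(Cart().total(), 0)

suite = unittest.TestLoader().loadTestsFromTestCase(TestCart)
result = unittest.TextTestRunner(verbosity=0).run(suite)
```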