The world of software as I know it has changed for me. I no longer join projects to prepare for a testing phase that happens at the end; instead, I am around from the start until I am no longer around, building testing that survives when I am gone.
Back in the days when testing was a phase at the end of a project, test strategy was the set of ideas you prepared in order to work through that challenging phase. It gave the tests you would do a frame, guiding their design. It usually ended up written down in a test plan under a heading of "approach", and it was one of the most difficult things to write in a way that was specific to what went down in that particular project.
With agile, with iterations and testing turning continuous, figuring out test strategy did not get easier. But the ideas guiding test design turned into something that stayed around longer and remained in use longer. I talked about the ideas that stuck with me at DEWT5 in 2015, and the same ideas guide my testing to this day.
- Start with the end in mind
- Release with minimal eyes on the system at release time. Rely on TA (test automation) for the release decision.
- TA keeps track of what we know so that it remains known when we change things
- Incremental, incomplete, learning
- Work towards a flow of TA value: small streams become a significant pool over time. Continuously moving towards better is what matters, not starting well or perfectly.
- Something imperfect but executable is better than great ideas and aspirations. Refactor to reveal patterns.
- Feedback nightly, feedback on each change.
- Maintain the ability to run TA on every version supported for customers
- Early agreement
- Design automation visibility and control interfaces at epic kickoffs
- For each epic (feature), add the positive case to TA. Target one; more is allowed, but don't overstretch.
- Unit and software integration tests cover the bulk of the functionality. TA is for system-level scenarios including hardware (as our product is embedded).
- Automate not only regression, but also data, environments, reliability, security and performance.
- Acceptance tests for interfacing teams monitor expected dependencies.
- Save the data. Build on the data. But first learn to run it.
- Invest in skilled TA developers through learning and collaboration
- Require developers to maintain automation for breaking changes
- To facilitate GUI selectors, GUI devs create the first test with keywords
- Allow for a "domain testing expert" who contributes only through pull request reviews on TA
- Suites and tags give two dimensions for selecting tests; use tags for readiness
- Seek to identify reusable data sets and oracles
- Reuse of keywords is supported through reviews and refactoring time
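
The suites-and-tags idea above can be sketched with pytest markers standing in for tags; a keyword-driven tool would offer the same mechanism under different syntax. All names here (the markers, the system stand-ins) are hypothetical, not the project's real ones:

```python
import pytest

# Hypothetical system stand-ins; a real suite would drive the product.
def login(user: str, password: str) -> bool:
    return user == "user" and password == "secret"

def lockout_after(attempts: int) -> bool:
    return attempts >= 3

@pytest.mark.ready  # trusted enough to gate a release
def test_login_positive_case():
    # One positive case per epic, per the strategy above.
    assert login("user", "secret")

@pytest.mark.wip  # still being stabilized; excluded from the release gate
def test_login_lockout_after_retries():
    assert lockout_after(3)
```

Running `pytest -m ready` selects only the release-gating tests by tag, while the suite (file or directory) gives the second selection dimension, so readiness and feature area can be chosen independently.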
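
The keyword reuse point can be illustrated with plain Python helpers acting as keywords: small, named, reusable steps that tests compose, which reviews and refactoring time keep shared rather than duplicated. The connection and command names below are illustrative assumptions, not the project's actual keywords:

```python
# Keyword-style helpers: each one is a reusable, reviewable step.

def open_connection(host: str) -> dict:
    """Keyword: establish a (simulated) connection to the device."""
    return {"host": host, "open": True}

def send_command(conn: dict, command: str) -> str:
    """Keyword: send a command and return the (simulated) response."""
    return f"OK:{command}" if conn["open"] else "ERROR:closed"

def close_connection(conn: dict) -> None:
    """Keyword: release the connection."""
    conn["open"] = False

def test_status_query():
    # The test reads as a sequence of keywords, which makes
    # reuse easy to spot in pull request reviews.
    conn = open_connection("device-1")
    assert send_command(conn, "STATUS") == "OK:STATUS"
    close_connection(conn)
```

Because every test is built from the same small vocabulary, a refactoring pass can extract a new keyword the moment two tests repeat the same sequence of steps.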