I see systems. I guess we see what we like seeing, and I like seeing how the bits and pieces connect, what is clear and what is wrapped in the mystery of promises of more learning in the future. I like seeing value, and users, and flows. Pieces alone are part of that flow, but the promise comes together in the system.
For years, I've tested systems. I've figured out ingenious ways of seeing what changes, learning heuristics of which changes matter, all grounded in knowing why anyone would want to use this. Every moment testing an individual piece, as an exploratory tester, connects somehow to a greater purpose in the context of the system.
When I worked with a team of 10 developers as their only tester, we were doing daily releases without test automation, and it worked great. It worked well enough to let us slowly but steadily introduce test automation. But even without test automation, we contained the size of change. Each change would flow in isolation through the pipeline with its manual steps. Just like coding was manual, testing was too. Think, test, implement, test, think, release - a steady flow of features of value.
But now, the scale is different. Where I had 10 people before, I now have 100. And 100 developers, making non-isolated changes that merge to trunk as soon as they think they're ready, is change at a pace that is just too much for one tester, even one with ingenious ways of seeing things and knowing things. This is where test automation as documentation comes in. With executable documentation, test automation frees my energy to analyze on top of it, not all of it. I no longer need to analyze details, but trends. Clusters of changes. Driving forces for those changes. Risks in the system, and risks in the people creating those systems. Automation catches some of it - quite a lot of it. And what it does not catch is a chance to identify what the automation is missing, and to document that with test automation.
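A minimal sketch of what I mean by tests as executable documentation - the test names and assertions state the expected behavior, so a change that breaks a rule is caught and explained without a tester re-checking the details. All the names here are hypothetical, for illustration only:

```python
# Hypothetical system-under-test: a toy pricing rule.
def apply_discount(price: float, percent: float) -> float:
    """Discount a price by a percentage, rounded to cents."""
    return round(price * (1 - percent / 100), 2)


# Executable documentation: each test name records intent,
# and each assertion states the rule itself.
def test_ten_percent_discount_documents_the_pricing_rule():
    assert apply_discount(100.0, 10) == 90.0


def test_zero_discount_leaves_price_unchanged():
    assert apply_discount(59.99, 0) == 59.99
```

Run under a test runner such as pytest, these read as a living specification: when the pricing rule changes, the failing test name tells you which documented expectation no longer holds.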
I find myself in places where automation at first is more wishful thinking than an actual net of coverage. But with learning, every day, and documenting with automation, it grows every day.
I analyze changes on a backlog visualization. If I can fix and forget, I go there. But some things need bigger focus. And as an exploratory tester and a system tester, I see what we miss. I label it, and ask for it.
I wouldn't know how to connect this stuff with reality if I did not spend time, hands on, with the systems we're building. The product works as external imagination, making my requests for what should be tested more practical. And while I prepare for the automation work, I just so happen to have already tested without the automation, found some problems, and gotten them fixed.
We emphasize automation, for a reason. But in addition to folks who automate, we need folks who care about identifying the things that take us further, that make our automation do real testing. Not end to end, but a web of granular feedback mechanisms, so that we know when things are not right.