Some years into teaching, I mustered the courage to create a hands-on course on exploratory testing. I was terrified, because while teaching through telling stories requires you to have those stories, teaching hands-on requires you to be comfortable with things you don't know yet but will discover. I had people pairing on the courses, and we held regular reflections on observations from the pairs to sync up with the ideas I had guided people to try in the sessions. We moved from "find a bug, any bug" to taking structured notes, or to intertwining testing for information about problems with testing for ideas about future sessions of testing. The pairs could level up to the stronger of the two, and if I moved people around into new pairs, I could distribute ideas a little, but new pairs generally took a lot of time establishing common ground. People liked the course, but it bothered me that while the students got better at testing, they did not reach the levels I had hoped to guide them to.
When I then discovered there can be such a thing as ensemble testing (that is, applying ensemble programming very specifically to the testing domain), a whole new world of teaching opened up. I could learn the level at which each of my students contributed on the testing problems. I could have each of them teach the others from their own perspectives. And I could still level up the entire group on things I had that were not present in the group, by taking the navigator role and modelling what I believed good would look like.
I have now been teaching with ensemble testing for 8 years, and I consider it a core method more teachers would benefit from using. Ensemble testing, combined with nicely layered task assignments that stretch the group to try out different flavours of testing skills, is brilliant. It allows me to teach programming to newbies fairly comfortably, in a shorter timeframe than folklore would let us assume programming can be taught in. And it reinforces the students' learning, as the students become teachers of one another while contributing together on the problems.
There is still a need for trainings that focus on stories from projects. The stories give us ideas, and hope, and we can run far with those too. But there is also a need for the hands-on, skills-oriented learning that ensemble testing has provided me.
In an ensemble, we have a single person at the computer, and this single person is our hands. The hands don't decide what to do; they follow the brains, who make sense of all the voices and call a decision. We rotate the roles regularly. Everyone is learning and contributing. In a training session, we expect everyone to be learning a little more and thus contributing a little less, but growing in their ability to contribute as the work progresses. The real responsibility for getting the task done lies not with the learners but with the teacher.
We have now taught three of the four half-day sessions of our Python for testing course at work in the ensemble testing format, and the feedback reinforces what I have explained.
"The structure is very clear and well prepared. Everything we have done I've thought I knew but I'm picking up lots of new information. Ensemble programming approach is really good and I like that the class progresses at the same pace."
"I've learned a lot of useful things but I am concerned I will forget everything if I don't get to use those things in my everyday work. The course has woken up my coding brain and it has been easier to debug and fix our test automation cases."
"There were few new ideas and a bunch of better ways of doing things I have previously done in a very clumsy way. Definitely worth it for me. It would be good to take time to refactor our TA to use the lessons while they are fresh."
The best feedback on a course run where I work is seeing new test automation emerge, as we have seen, in the weeks between the sessions. The lessons are being used, whether with the woken-up coding brain or with the specific tool and technique we just taught.