Opening Twitter this morning inspired me to write this blog post. Here's the specific inspirational piece:
There is no longer a question of whether testers should learn to write code. Smart, engaged testers are learning and succeeding with code.
— Marlena Compton (@marlenac) August 15, 2014
I feel very divided about this statement. I really agree that "smart, engaged testers are learning and succeeding with code", but I also disagree with it. I have evidence of smart, engaged testers who are learning and succeeding without code, and I feel very strongly about not making those people feel inferior.

Here's a story I want to share about a smart, engaged tester succeeding with code.
I work with a brilliant remote tester from Romania. She joined my projects two years ago with a junior tester title, and quickly built and showed testing skills that, in my experience, many seniors can't match. She pays attention to details, she sees the big picture, and she is great at taking the product one flow at a time, motivated by the context of use. A lot of her time goes into reporting problems that everyone else (the developers) on her team misses - the information she provides continuously surprises people. She's purpose-driven, polite and collaborative, and she drives improvement. And she has the humility to see that, as much as I respect her skills today, I will respect her skills more tomorrow, as every day is about learning more.
I did not contract her for coding skills. I contracted her for exploratory testing, in a setup where developers would team up to provide code whenever useful. I soon learned she has a background in mathematics and is smart to a level I can only admire. I'm still not sure how much of a coding background she has, as she does not emphasize the things she cannot do, but the things she can learn. And code would definitely be one of the things she can learn, no doubt about it.
About a year ago, we introduced the first Selenium automation scripts into the product she is working on. The Selenium work was set up as a developer effort, and her role was focused on helping with the ideas of what flows to automate, leaving the implementation to the developers with (more) coding experience. She introduced ToTest tags as the team agreed on the ideas, but it turned out that the developers' allocation of time for coding wasn't sufficient to make progress. When I encouraged her to take time away from the much-needed hands-on testing work, she learned some basics of Selenium and C# on project hours and started adding automated checks.
The feedback from the developers was quiet and clear. As they refactored the tests, they silently deleted all that she had created. They did not replace her tests with better-working ones; they just deleted them. Later, after I pushed for answers to understand what was going on, I learned that the code was ugly, that it clearly wasn't written by a developer, and that the tests were brittle. Brittleness in execution was the main reason for the deletion.
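For what it's worth, "brittleness" in UI automation often comes down to fixed sleeps and fragile locators. Selenium's explicit waits (WebDriverWait in its language bindings) address the timing side by polling a condition rather than sleeping a fixed amount. Here is a minimal, language-agnostic sketch of that polling idea in Python - the names are illustrative, not taken from the project:

```python
import time

def wait_until(condition, timeout=5.0, poll_interval=0.1):
    """Poll `condition` until it returns a truthy value or `timeout` elapses.

    This is the idea behind Selenium's explicit waits (WebDriverWait):
    instead of sleeping a fixed amount and hoping the page is ready,
    keep re-checking, and fail only when the deadline passes.
    """
    deadline = time.monotonic() + timeout
    while True:
        result = condition()
        if result:
            return result
        if time.monotonic() >= deadline:
            raise TimeoutError(f"condition not met within {timeout}s")
        time.sleep(poll_interval)

# Simulate an element that only "appears" after 0.3 seconds.
appears_at = time.monotonic() + 0.3

def find_element():
    # Returns the "element" once it exists, None before that.
    return "element" if time.monotonic() >= appears_at else None

# A fixed 0.2s sleep would miss it; polling finds it soon after it appears.
print(wait_until(find_element, timeout=2.0))  # -> element
```

A check built this way waits as long as it needs to and no longer, which is exactly the property a fixed `sleep` can't give you.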
At first, I thought the better approach was for her to just contribute the ToTest tags, so there would be no code to delete. But as I watched the developers not making progress on the Selenium tests either, I realized something. I could never, as this tester's test manager, accept the "fact" that she couldn't do the automation. It would be something in my power - our power - to change.
Just some months earlier, the tester needed to start working on a new feature area, and I was told it would be a waste of her time, as "the project will be over before she understands the area, the decision rules are so complex". We were talking about a person who, with her math background, would have no trouble hand-calculating all the decision rules, so setting up for "let's show them how it goes" was a no-brainer. She not only understood the area fast, but also contributed to how the area was developed, working well with the product manager, who was very thankful for her. Exercising the power of prioritization on something that would require learning before results was a success.
So I realized I had not thought the same way about her automation skills. I, as the manager, needed to exercise my power of prioritization to make room for learning. I don't want her - or anyone, for that matter - to be the person referred to as someone who "can't do it" when the truth is she simply hasn't had the time to do it. And of course she can do it; why couldn't she? Should she give up because the first minimal trial wasn't a brilliant success? Of course not.
So as the summer approached, I suggested she take time for another round, and we talked about the experiences from the first one. There was also an official reason - the lack of progress on the automation work by the developers - but a big part of the motivation came from not accepting skills as they are: brilliant, but without coding. After spending some weeks on this while others were on vacation, with support from her Altom colleagues, she came back telling me she had learned to choose "simpler flows" and to focus on the reliability of her scripts, comparing them against the previous round's results - showing she had learned.
There was again some frustration from the team's developers over her inexperience with version control systems (she wanted to introduce only a couple of scripts while she had added more), but this would all get better if she had more chances to practice continually. And now it seems that the scripts are included, running with the build and serving as a platform to continue from. The developers' examples probably helped her find some aspects of acceptable style, so I would hope her scripts don't end up deleted again. At least they have survived a week of action so far.
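The version-control friction - wanting to share only a couple of scripts when more files had changed locally - is exactly what selective staging in Git addresses: you stage and commit only the files you intend to introduce. A minimal sketch in a throwaway repository (the file names here are made up for illustration):

```shell
# Start a throwaway repository to demonstrate selective staging.
mkdir demo-repo && cd demo-repo
git init -q
git config user.email "demo@example.com" && git config user.name "Demo"

# Three changed scripts locally, but only two are ready to share.
echo "login flow check"  > LoginFlow.cs
echo "search flow check" > SearchFlow.cs
echo "half-finished"     > CheckoutFlow.cs

# Stage and commit only the scripts you mean to introduce.
git add LoginFlow.cs SearchFlow.cs
git commit -qm "Add first two reliable Selenium checks"

# CheckoutFlow.cs stays untracked: visible in status, but not shared.
git status --short   # prints: ?? CheckoutFlow.cs
```

Nothing about this is obvious to someone new to version control, but it is a small, learnable habit rather than a reason to keep a tester away from the repository.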
Looking at the big picture for the team, I really would need her not to be doing automation. None of the others in the team see the problems she sees as she uses the product. And she sees more, faster, when she isn't focused on automating. While automating, she also sees problems, but she mentioned two in two weeks, whereas her usual pace seems to be more like two in a day. Information-wise, I get a lot less from her while she automates. And the missing information just stays missing, and we fix things in production; no one else contributes to that area for now, regardless of the efforts I've made to set that up as well.
I wholeheartedly agree with a friend who states: "Test automation in general will take the tester's time right now, but will save them time in the future, when they don't have to do everything all over again and can really focus on things that others don't see". With this experience, however, we have 10 people who can write the automation and only one who knows the work the automation is supposed to help complete, with the results we expect. The simple flows the developers could have implemented - in this case, again, they didn't. But they tell me the autumn will be different.
With the "testers should code" trend, it would be selfish of me not to make her code. I need her other skills way more, but others will look down on her if I don't allow her the coding time that we don't really need from her. And her becoming the real "full-stack developer" - the only one in the team, for that matter - wouldn't be a bad goal either. For people who learn continuously, it's just a matter of choosing where to invest their time.