Friday, September 18, 2015

Pressure or curiosity: Testers learning to program

It's getting to all of us testers. The message that the testers of the future need to be technical, that more and more companies are setting up teams of developers who are also responsible for all the testing, is everywhere.

So you find it in yourself, like so many of my close non-programming tester colleagues have, to learn to program. It could be something completely new to you. Or it could be something you stopped doing 15 years ago, because you found the world of testing from the business and value viewpoint so much more fun. Some of us find the motivation in the negative: feeling the pressure, needing to be 'competitive'. Some of us find it in the positive: curiosity, seeing how programming will expand the selection of things we can do with testing.

I find that recently almost every tester I know, including myself, has started taking steps to become a better programmer. On one hand, I'm happy for the turnout. But on the other, it might mean that the battle over respecting the value of thinking in testing, at least without a self-contained ability to turn that thinking into test automation, has been lost.

My curiosity towards code has led me to learn that I dislike the stuff I was forced to learn at school. The algorithms. The mathematical puzzles. The so-called funny little programs that were geeky in the wrong way (like paper-rock-scissors-lizard). I also dislike what automating does to the work of testing that I love so much. I dislike how it transforms my thinking into something more procedural, how it forces me to spend time on details when there's so much of the world to see that I have to choose to prioritize out, thinking of the long term over the short term. I do it because I believe it's good for our product development. But it makes me want to go work at McDonald's on a regular basis.

Coding from the perspective of solving my own problems is much more fascinating. There's an iPhone app I want to have. I will have it. And I know I need to create it myself. There's the excitement, and I don't mind at all doing great testing (even automated testing) around it.

Could it be that we need to look deeper into our strengths and interests to find what each of us would like to use programming for, instead of thinking that testers are programmers of testing problems? Because with that trend, there will be one tester fewer - with a direct impact on the quality that comes out of our current pipeline.

Another thing about the programming trend that really puzzles me is languages. I'm a strong believer in using the same language for test automation as you use for the rest of your development. If my team works with C# and I bring in Python because it's so much better for testing, I will be alone with my code. And I find from experience that by bringing in Selenium in C#, I get the whole team contributing, and whatever test automation we're doing benefits from the skills of the whole team. The trend still seems to be to look for a language specific to testing, be it a different programming language or a vendor language built into a tool.

For learning, my lesson is this. Pick a language, any language. But pick one that works with problems you find interesting to look into. Pick one where you have people close to you that you can collaborate with. Work with it long enough to get over the starting pains, so that you start to see the structures. Adding more languages once you know one isn't that big a deal. But knowing hello world in 20 languages and nothing deeper isn't that helpful.


  1. It's funny how your posts about dislike of programming tend to pull me out of my hideout to write a comment; perhaps it's because the rest of them seem to fall under either "that's something new I'd like to try out, or at least see" or "that's a great way of putting this idea into words".
    Here, what caught my eye were two sentences -
    1) "not respecting the value of thinking in testing"
    2) "what automating does to the work of testing"

    For the first point, I wonder - what brings you to see this as a battle? Why is there a side that respects thinking and a side that can and does write code?
    At least when I look at my daily routine, I don't feel any pressure to automate each and every test scenario I can come up with (in fact, many times we discuss the question of whether automating something is worth the effort), and even though programming skills were a major part of the required qualifications when I was hired, I feel that most of my contribution to the team is by being some sort of a knowledge repository, and by asking good questions. I know that my manager thinks the same about our work (even though he's a bit more coding oriented than I am), and I strongly believe that if anyone in my team was asked where I contribute to the team, words like "feedback", "questions", "domain knowledge" (and maybe "nagging") would come much before "automation".
    It might be that I lack the perspective of a "non-coding tester" (as I have never been one myself), but my feeling towards this is that programming is part of the skillset I expect a tester to have - the same as I expect a decent level of English and a know-how around *nix systems or SQL. Those skills aren't exactly part of testing, but they enable a tester to perform much better.

    The second part is what I find most intriguing - I can (kind of) understand someone not liking programming tests, seeing it as a battle between one's comfort zone and the current trend. What I really wonder about is the effect you describe - how being involved in automation affects the way you think about tests. When I try to estimate whether this is happening to me as well, I can recall only the (rare) times when I was reviewing unit tests. Usually when I write system-level tests I feel this is actually helping me widen my scope - when I'm forced to pay close attention to some details, I note potential problems I did not see when we were discussing the feature.
    So, I think I have two questions about this:
    1) How does the automation you are involved in look? Is it writing lots of short, one-assertion tests? Is it using a BDD framework (I think this sort of framework can create the feelings you mentioned)? Does it contain long tests simulating a user?
    2) You described the effect of automation as something making you think in a more procedural way - could you try and guess what in automating stuff produces this effect? Could it be the small scope of a test (when compared to a manual test, where the whole application is at your fingertips)? Could it be the necessity of formalizing your test ideas into something a machine could follow? Could it be that you are less comfortable with programming, and this effect will diminish when you are more fluent (I got this idea when I noticed that articulating something in French feels as if it has a stronger effect on my thinking compared to English, in which I am much more comfortable)?

    1. Thanks for coming out of your hideout. I'm really enjoying these little discussions we have. :)

      I think I see it as a battle on various levels. I'm growing very tired of the automation propaganda that leads to employers not looking for non-programming testers. And I have so many friends who feel sad about this, feeling forced into automation and not enjoying it, and not able to do as much about it as I can. I've talked my way into various organizations by showing them that even without me automating, I would be the person to fill the gap they're experiencing. The other thing is really both a fight with my own comfort zone and with what I feel I want to get done with my life.

      Your comment made me reflect back on an article by Lloyd Roden on tester personalities. I'm a "pioneer" by type. This style of tester likes new ideas, change, openness, results/efficiency, involving others, and risks, and dislikes standards, detail, the 'norm', and paper-work. Automation to me is as much detail and paper-work as scripting my tests, but I would choose automation over manual scripting any day.

      I believe our focus on automation is narrowing down the diversity of personalities that will fit our teams. And more diversity would help in performing well as a team.

      I don't find it a problem if people can code. I find it a problem that people who don't want to are forced / coerced into it, or belittled because they don't. I have another blog post in the works to respond to the crazy idea that "technical" means "programming" rather than comfort with using software without fear.

      On your questions:
      1) Tried BDD, did not like it as a style of writing because in my context, it's extra hassle to find a gherkin format for things you can express in many ways. Mostly Selenium, database checks of data consistency, and some unit tests.
      2) This would be just guessing. It seems to be the painful awareness of opportunity cost: I could be doing something else more valuable with my time. I'm working long on one detail, and missing countless other details for it.
      All your guesses are probably part of it. Being less comfortable is the assertion people I care about have suggested, and that's what I'm investigating by forcing myself to practice and become more comfortable. But it does not stop me from fighting for my non-programming friends, who are in my opinion boxed in stupidly.
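      As an aside, a sketch of what one of those database checks of data consistency can look like - the orders/order_lines schema here is made up purely for illustration, using Python's in-memory SQLite so the example is self-contained:

      ```python
      import sqlite3

      # Hypothetical schema: each order stores a total that should always
      # equal the sum of its order lines. The check queries for disagreements.
      db = sqlite3.connect(":memory:")
      db.executescript("""
          CREATE TABLE orders (id INTEGER PRIMARY KEY, total INTEGER);
          CREATE TABLE order_lines (order_id INTEGER, amount INTEGER);
          INSERT INTO orders VALUES (1, 30), (2, 99);
          INSERT INTO order_lines VALUES (1, 10), (1, 20), (2, 50), (2, 40);
      """)

      # Any row returned is an inconsistency worth investigating.
      inconsistent = db.execute("""
          SELECT o.id, o.total, SUM(l.amount) AS line_sum
          FROM orders o JOIN order_lines l ON l.order_id = o.id
          GROUP BY o.id, o.total
          HAVING o.total != SUM(l.amount)
      """).fetchall()

      print(inconsistent)  # order 2 claims 99, its lines sum to 90
      ```

      The point is less the code than the question it encodes: two places in the system claim to know the same fact, and the check asks whether they still agree.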

      On articulating: programming has grown to be just precise articulation. Like I keep hearing from a friend big in programming with intent, my Finnish brain is smarter than my English brain, and my programming brain. There are countless hours more practice in doing things in your native language. Thus we should do the difficult thinking work in our native languages, instead of trying to think in code. But from there, my issue is with use of time.

    2. Thank you, I think I understand your approach a little bit better.
      Your notion about employers not hiring non-coding testers anymore was what stressed the difference in viewpoints for me the most. In my environment, I feel it is the opposite - there are too many open positions for non-coding testers around me, and too many in-between positions.
      The reason I'm saying "too many" is that too often people confuse "writing code" with "being technical", thus hiring less technical testers, which leads to the common misconception that "QA is the easy way into high-tech", stressing too much the idea of testing as a stepping stone instead of as a profession with skills that one should invest effort in mastering.

      A follow-up question does arise - in some of your previous posts you described some changes you brought to your team and some of your achievements and difficulties (I can recall "helping developers get time to refactor", as well as your pairing and mobbing experiments) - wouldn't you say that your programming skills (namely, the ability to participate in coding sessions and feel the impact refactoring makes on the code) contributed to your ability to drive those changes?

    3. I don't think it's programming skill that drives those changes. It's that I deeply care about the value we're able to provide, and I know that quality in a product comes from skilled developers reacting to feedback positively, in collaboration. The thing I need to know about code is whatever is enough to not be afraid of it or mystify it.

      I'm sure people will see that some of the comfort comes from exposure to code. Mob programming is somewhat painful to me, as it forces me to spend days with code and even exposes the level to which I don't know how to write it. But it is also a mechanism that helps anyone quickly capture some of it, enough to be around if they have other value to contribute.

      It's kind of odd thinking back to a session at Agile2015, where we had 6 people mobbing, 2 of whom were complete newbies to programming. And there they were, reading code as if it were English with the help of the team. It's a very empowering experience.

  2. And finally (apparently, there is a size limit to comments), one thing I want to say in favor of the idea of having mostly testers that code:
    I think that everyone agrees that having automated tests is a great enabler for the development process - it provides a cool safety net to crash against.
    However, I am also very reluctant to delegate writing tests to anyone other than another tester in my team.
    Delegating it to a developer is something I found difficult, at least in my team, where some developers openly express their unwillingness to take part in writing system-level automation, and even those who will participate will most of the time lack the inquisitive urge that tries to find those hidden connections within the software.
    Another option I see around is having an "automation team" which, in theory, should be comprised of testers that are also good programmers.
    Such a tester will probably be a better programmer than I am, and she might even be a better tester. But even in this case, the automated tests I write will provide greater value - as the tester that is part of the team, I know the product better, I am more up to date with the bits and bytes, and thus can leverage the design and coding of the test to explore and find problems. I also don't need anyone to define for me what exactly to do, which is always a lossy process (when one side defines and another executes, there will be misunderstandings, and what one gets will never be exactly as intended).
    On the other hand, if the coding tester is familiar enough with the product details and is in a position to independently design a test and has the knowledge to use it effectively as an exploration tool - then this tester can do everything I can, and code as well. What value do I bring? Why not just get another one like her instead?

  3. This comment of yours includes a lot of assumptions about what a non-coding tester cannot do. I know non-coding testers who know the system technologies really well and read code with developers to get ideas for testing, without ever writing a single line. They tend to focus their energies on different things.

    A lot of non-programming testers would go under "product owner" in the modern world.

    I'm (mostly) a non-programming tester. Suggesting I cannot do the stuff you mention a programming tester can do is incorrect. I can use the wonderful developers I work with to turn my ideas into code for system-level automation too. "I want to do this, can you help turning this into code" seems to work well as a discussion opener. People talking to one another learn ways to organize that are not boxed into the idea of becoming programmers.

    There are so many ways of organizing your teams. My take is that to build strong teams, collaboration and diversity of thinking are the key. A tester I respect does not test without discussions with all kinds of relevant stakeholders, even if they would not write code even for test automation purposes.

    1. In that case, perhaps I should rephrase my comment - I did not mean to imply what a non-coding tester can or can't do.
      I was referring to my experience, as a tester who also codes, in delegating the automation writing to another party.
      To summarize my points:
      1) I have not yet found a way to get programmers to write good tests, and I have faced a lot of resistance when trying to approach it. Perhaps it can be done, but I don't think it's easy.
      2) Asking an "automation team" to write the automation for me, while I do more interesting stuff, also led to inferior results - the test was never exactly as I wanted it to be, and the "contractor" automating stuff for me was less familiar with the project and more tied to what I initially asked for, missing opportunities to change the test when stumbling over something important, and missing the chance to notice problems while scrutinizing some small parts.

      One thing you suggested that I overlooked is pairing with a coder (or, in your words, "can you help turning this into code"), only it seems to me almost like programming by proxy - that is, you won't be spared most of the boring stuff you have when writing your own tests, and you will end up being a coding tester. (I feel we talk about different things when we say that - for me, a coding tester is a tester that can code; the more I read what you wrote, the more I get the feeling you mean "a tester that spends a significant amount of time coding".)

      The two points above lead me to think that if we want to have automated tests, the ones who should be writing them are the testers working on the project. Maybe not all testers should write code, and certainly not all of the time, but if the testing team is writing code (which, as I said, I think it should), each tester should be able to code at least to some degree - it's a good thing to have team members specialize in different skills according to their preference, but if a team has a place for a specialist, all other team members (or a sufficient number of them) should have some basic skills in that area.

  4. On 1), I'd love to have a long discussion about what a "good test" is in your context. I find that for UI-level tests, my developers have no particular issues in writing them in a nice page object structure. But there's a lot of groundwork to getting them to do it, including a 1:9 ratio of testers to developers.
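    For readers who haven't met the pattern, a minimal sketch of a page object structure - the page classes, locator names, and login flow here are hypothetical, and the stub stands in for Selenium's real WebDriver just to keep the sketch self-contained and runnable:

    ```python
    # A page object wraps one page's locators and interactions behind an
    # intention-revealing API, so tests read as user goals, not UI mechanics.

    class StubDriver:
        """Stand-in for a real WebDriver, so the sketch runs without a browser."""
        def __init__(self):
            self.fields = {}
            self.current_page = "login"

        def type_into(self, locator, text):
            self.fields[locator] = text

        def click(self, locator):
            # Pretend any filled-in username navigates to the dashboard.
            if locator == "login-button" and self.fields.get("username"):
                self.current_page = "dashboard"


    class LoginPage:
        """One class per page: locators live here, not in the tests."""
        USERNAME = "username"
        PASSWORD = "password"
        SUBMIT = "login-button"

        def __init__(self, driver):
            self.driver = driver

        def log_in_as(self, user, password):
            self.driver.type_into(self.USERNAME, user)
            self.driver.type_into(self.PASSWORD, password)
            self.driver.click(self.SUBMIT)
            return DashboardPage(self.driver)


    class DashboardPage:
        def __init__(self, driver):
            self.driver = driver

        def is_displayed(self):
            return self.driver.current_page == "dashboard"


    # The test itself stays one level above the UI details:
    driver = StubDriver()
    dashboard = LoginPage(driver).log_in_as("maaret", "secret")
    print(dashboard.is_displayed())  # True
    ```

    When the UI changes, only the page class changes; the tests keep reading the same way, which is a big part of why developers tend to find this structure comfortable to contribute to.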

    On 2), I currently work with a team that are all some sort of product experts. Most of us have been around since the new product started. We have two contractors, but we actively pretend they are ours to keep and treat them that way, as if they were really part of our organization.

    The way I'd really do things depends on the people (who are the most important part of context). Not all programmers / programming testers / non-programming testers are the same. Trying to get people to do things they find unmotivating is a rocky road.

    I find that there's a lot of work to be done on the skills of testers in general. And programming is not at the top of the list of things I would be teaching them to become the contributors they might not be right now.