Saturday, November 10, 2018

New Speakers, New Stories - Agile Testing Days USA SpeakEasy Track

Yesterday at TestBash USA, one of the people I've mentored behind the scenes delivered a talk. I woke up today to a delightful message: "...had people saying I was their favorite talk. I wouldn't have reached this point without your help, I can't thank you enough."

Today, at Belgrade Testing Days, lovely people on Twitter delivered me the news that another person I have mentored had a full house and got 4.93/5 in immediate app feedback for her first ever talk.

There is something in common. These people did awesome with their talks. They invited help. But what that really shows is that they have always been awesome, and inviting help was just a small part of them putting effort into making their messages accessible to others. It has been my pleasure to be a small part of their journey, and to get private insights into what they are teaching - me and others.

I believe we all have worthwhile lessons to share. And we are often our own worst enemies, talking ourselves down. There is something you've done. There is something you care for. Your approach, when shared, could help someone else figure out their approach. It could be the same as yours. It could be completely the opposite, yet inspired by you. The conference stages are for us learning together, and we need different perspectives and stories on those stages.

You - yes, YOU - have this in you. And you don't have to take the step of becoming a speaker alone. That is why there is SpeakEasy, a community initiative building productive relationships between speakers, mentors and conferences. I believe in this so much that I've formed a leadership team with 3 lovely colleagues to take the initiative forward from 2018 on. I believe in this so much that I have mentored dozens of people, and keep my calendar open to give time to support people on their speaking journeys.

Right now, I am volunteering with SpeakEasy in collaboration with one of the lovely conferences: Agile Testing Days USA. We are building a full SpeakEasy track to get stories to learn from that wouldn't be available otherwise. We are seeking 6 talks, 2 workshops and 1 keynote. The talks are from new speakers. The workshops would be new speakers pairing with more seasoned speakers. And the keynote would be a seasoned speaker who has not yet had their chances of breaking into the regular keynoting circles.

You have one more week to join this. To join as a new speaker, you need to schedule yourself into my calendar for a 15-minute discussion. We'll figure out what your early idea could look like, and consider it as something that you'll build with support from a mentor if it is the right match for a balanced program. Schedule your session now. If nothing else, you'll get a chance to talk your experiences through with me, and hear my ideas on how you could frame them for other stages.

If you get selected, Agile Testing Days USA pays for the travel (within specified limits) and accommodation, and you get to enjoy the other sessions of the conference too. It's a lot of work to prep a talk, but it is also rewarding to structure your thoughts so that others are able to follow. It is a skill, I find, that makes a difference in your career.

You are awesome. And I want to talk with you. I need you to take the first step. I can't find you when you've not taken that stage yet.

Changing the Discussion around Scope

People have an amazing talent for seeking blame. Blame in themselves for what they did wrong, but also blame in others for what they did wrong. Having a truly blameless retrospective, where we honestly believe that F.A.I.L. means First Attempt In Learning and embrace more attempts in the future, hopefully different ones, is a culture that takes a lot of effort.

I've personally chosen a strategy for working around scope that is heavily reliant on incremental delivery. Instead of asking how long it takes to deliver something, I guide people into asking if we could do something smaller first. It has led my team into doing weekly, even daily releases where each release delivers some value without taking away what was already there. Always turning the discussion towards value in production, and the smallest possible increment of that value, has been helpful. It enables movement within the team. It enables reprioritization. And it means that no one needs to escalate things to find a faster route to get the same thing done; the faster route is always the default.

We work a lot with the idea of being customer-oriented - even obsessed, if that weren't such a negative word. We think a lot in terms of value, empathy and caring, and seek ways to care more directly. We don't have a product owner but a group of smart minds both inside the team and outside it, supporting the team with business intel. The work we all do is supposed to turn into value in customers' hands. Production first helps us prioritize. Value in production, value to production.

We didn't always deliver or work this way. We built the way we work in this team over the last 2.5 years I've had the pleasure of enjoying the company of my brilliant team.

Looking at things from this perspective, I find there is a message I keep on repeating:

If you have a product owner (or product management organization) and they ask you to deliver a feature that customers are asking for, they don't know everything, but they do their best to understand what that would be like. They define the scope in terms of value with the customer.

If they ask you to estimate how much work there is to do that, you need to have some idea of the scope. Odds are, your idea of the scope isn't the same as theirs, and theirs is incomplete to begin with. The bigger the thing asked, the more the work unfolds as we are doing it.

They asked you for value 10. You thought it would take you effort 10. That is already two ways of defining the scope.

In delivery, you need to understand what the expected value really is. Often it is more, in terms of effort, than you first guessed.

Telling folks things like "you did not say the buttons needed to be rounded, like all the other buttons" or "the functionality is there, but the users just won't find it" may mean that it works as specified but not as really expected. I find that those trying to specify and pass the spec along do worse than those trying to learn, collaborate and deliver incrementally.

Scoping is a relationship, not something that is given to me. We discover features and value in collaboration, and delivering incrementally helps keep the discussion concrete. Understanding grows at every step of the way, and we should appreciate that.

** note: "Scope does not creep, understanding grows" is an insight I have learned from Jeff Patton. There are many things where I know where I picked them up, and more where I can no longer pinpoint where the great way of describing my belief systems came from. I'm smiling wryly at the idea of mentioning the source every time I say this at the office - we're counting in the hundreds.

Getting best ideas to win

There's a phrase I keep repeating to myself:
Best ideas win when you care about work over credit. 
A lot of the time, if you care about being attributed for the work you are doing, the strategies for getting the best ideas out there and implemented evade you. If you don't mind other people taking credit for your ideas (and work), you make a lot more progress.

Mob programming is a positive way of caring about work over credit. There we are all mutually credited for what comes out. But on the other hand, it is hard when you know something would not be what it is without you, and the likelihood of anyone recognizing your contribution in particular is low.

At TestBash Australia, we had a hallway conversation about holding on to the credit you deserve, and I shared a strategy I personally resort to when I feel my credit is unfairly assigned elsewhere: extensive positivity about the results, owning the results back by marketing them. People remember who told them the good news.

As a manager in my team, I've now tried going out of my comfort zone by sharing praise in public. With two attempts at it, I am frustrated at feeling corrected. I'm very deliberate about what I choose to say, who I acknowledge and when. I pay a lot of attention to the dynamics of the teams, and see the people who are not seen, generally speaking. What I choose to say is intentional, but what I choose not to say is intentional too.

This time, I chose not to acknowledge the great work of an individual developer when getting a component out was very clearly teamwork. I remember a meeting I called together 5 weeks ago to guide the scope of the release smaller, with the success of "that is ready, tomorrow". I remember facilitating the dedicated tester in designing the scope of testing, to share that there were weeks' worth of testing after that "ready". I remember how nothing worked while "ready", and the great work from the tester in identifying what needed attention, and the strong-headedness of not accepting bad explanations for real experiences. I remember another developer from the side guiding the first developer into creating analytics that would help us continue testing in production. I remember dragging 3rd parties into the discussion, and facilitating things for better understanding amongst many, many stakeholders. It took a village, and the village had fun doing it. I would not thank one for the work of the village.

Just a few hours later, I was feeling joy as one of the things I did acknowledge specifically was unfolding into wider knowledge in a discussion. I had tried getting a particular type of test created where it belonged, and failed, and made space for it to be created in my team. The test developer did a brilliant job implementing it and deserved the praise. Simultaneously, I felt the twitch of the lack of praise for me for finding the way in an organization that was fighting back against doing the right thing and refusing feedback.

I can, in the background, remember to pat myself on the back, and acknowledge that great things happen because I facilitate uncomfortable discussions and practical steps forward. Testing is a great way of doing that. But all too often, it is also a great way of keeping yourself in the shadows, assigning praise where it wouldn't go without you.

Assigning credit is hard. We need to learn to appreciate the whole village.

Wednesday, November 7, 2018

Achievements of a Silo

Once upon a time there was a company, much like many other companies yet unique in many ways. As companies do, they hired some great people into different teams. There was one thing in common: all the people were awesome. But they came from very different backgrounds and brought very different ideas.

In one of the teams where some great people in testing landed, the testers were feeling frustrated. With a new team, no infrastructure for builds and test automation yet, and features flying around being implemented and tested, they found it hard to take the time and focus they felt they needed. So as some great people do, they actively drove forward a solution: they created a new team on the side, focused on just creating the infrastructure, and dropped all the in-team testing work they had managed to get started. Without facilitation, the in-team testing turned tiny, focused on units and components, and perspectives around value and the system vanished in hopes of someone else picking them up, like magic.

With the new team and new focus, the great people made great progress. They set up a fancy pipeline with all sorts of fancy tests, and a lovely set of images and documents to share what a great machinery they had built. Wherever this new team showed up, they remembered to tell how well they had done, and all the awesome stuff the machinery now made available, with sample tests of all the sorts the pipeline should theoretically hold.

The original team focusing on features was handed the great machinery with high hopes of expanding it. The machinery-building team built more machinery, alongside the machinery being used for real projects.

The fun part of this fable arrives after many months have passed. The overall project, with its lost focus on who owns the system perspectives, was struggling a bit, and it became obvious that getting a perspective on readiness wasn't an easy task. So as companies do, a meeting was called.

In the meeting, the machinery team presented all the great things they had built, and great they were. With every example of what had been built into the machinery, the team focusing on features brought up today's reality. That test job - turned off as it broke. Same with the other. And another. Of all the great things the machinery promised, none were realized in practice.

The lesson of this story is: it's not about your team's output, but about the outcome of all the different teams together. You can create the shiniest machinery there is, but if it is not used, and if the relevant parts of it in real use get turned off, your proof of concept running all the shiny things provided very little value. It may have taught the great people in the machinery team some valuable personal lessons from the technical perspective. What it should teach is that the value of whatever we are building comes from the use of it.

I'm a big believer in teams actively participating in building their continuous integration machinery, and I slightly loathe people who believe that learning together while building it and taking it into use isn't needed, because someone else could do the learning for you.

Learning with you is possible; learning for you is not. Achievements in a silo often end up worth little.

Thursday, October 25, 2018

GDPR got my skeleton removed

On September 29th I wrote a blog post about my ex not deleting "relationship materials" on request. I felt, and still strongly feel, that deleting on request is what is morally right. I recognize that normally he wouldn't be legally obligated. Pictures and texts could be considered a gift you cannot reclaim, regardless of how much emotional suffering their existence causes.

On October 10th, he confirmed he had deleted the stuff. The confirmation was short: "Done." No response to me checking: "Really? Thank you."

Today I had my chance of asking for final confirmation in a mediated call. The material is deleted.

But I asked a followup question: what made you change your mind? And the response is simultaneously sad and delightful.

What changed his mind was not the fact that many of his friends as well as internet strangers in the software community reached out to talk on my behalf. It was not that he would have cared that I was struggling with nightmares of him raping me that I couldn't control.

It was that I realized that the address I had used to share those materials is a company address. And it is a company address of his family's company, with a relevant risk of damage. Under GDPR, I have the right to request deletion of those materials, and that is what I did on October 10th, as I woke up in the middle of the night to yet another one of those awful nightmares. I emailed polite requests for deletion to the company he uses for email and to his own privately owned company, and got the "Done." a few hours later.

He expressed that I had threatened his family, but there was nothing threatening in the request to delete the materials. Delete and all is well. There would be nothing threatening in the potential consequences either, unless my request to delete was actually valid - which it was. He was illegally holding private material on company computers. GDPR comes into play.

Lessons learned:

  1. GDPR actually works to get private data deleted. Thank you European Union. And thank you to my testing profession for keeping me well aware of what this piece of legislation means. 
  2. Private materials belong on private computers. The traveling consultant's all-purpose computer for all things private and professional is a bad choice if you want to keep that stuff legally. 
  3. People you care for can really disappoint you big time. But when a door closes, another opens. 

Saturday, October 13, 2018

Finding the work that needs doing in multi-team testing

There's a pattern forming in front of my eyes that I've been trying to see clearly and understand for a good decade. This is a pattern of figuring out how to be a generalist while being a test specialist in an agile team working on a bigger system, meaning multiple teams working on the same code base. What is it that the team, with you coaching, leading and helping them as a test specialist, is actually responsible for testing?

The way work is organized around me is that I work with a lovely team of 12 people. Officially, we are two teams, but we put ourselves all together to have the flexibility to organize into smaller groups around features or goals as we see fit. If there is anything defining where we tend to draw our box - one we are not limited by - it is drawn around what we call clients. These clients are C++ components and anything and everything needed to support those clients' development.

This is not a box my lovely 12 occupy alone. There are plenty of others. The clients include a group of service components that have since the dawn of time been updated more like hourly, and while I know some people working on those service components, there are just too many of them. And for the other components I find us working on, it is not like we'd be the only ones working on them. There are two other clear client product groups in the organization, and we happily share code bases with them while making distinct yet similar products out of them. And to not make it too simple, each of the distinct products comprises a system with another product that is obviously different for all three of us, and that system is the system our customers identify our product with.

So we have:
  • service components
  • components
  • applications
  • features
  • client products
  • system products
When I come in as a tester, I come in to care for the system products from the client products' perspective. That means that to find some of the problems I am seeking, I will need to use something my team isn't developing to find problems that are in the things my team is developing. And as I find something, it no longer really matters who will end up fixing it.

We also work with the principle of an internal open source project. Anyone - including me - in the organization can go make a pull request to any of the codebases. Obviously there are many of them, and they are in a nice variety of languages, meaning what I am allowed to do and what I am able to do can end up being very different.

Working with the testing of a team that has this kind of responsibility isn't always straightforward. The communication patterns are networked, and sometimes finding out what needs doing feels like a puzzle where all the pieces are different but look almost identical. To describe this, I went and identified the different sources of testing tasks within our responsibilities. We have:
  • Code Guardianship (incl. testing) and Maintenance of a set of client product components. This means we own some C++ and C# code and the idea that it works. 
  • Code Guardianship and Maintenance of a set of support components. This means we own some Python code that keeps us running, a lot of it being system test code. 
  • Security Guardianship of a client product and all of its components, including ones we don't own. 
  • Implementing and testing changes to any necessary client product or support components. This means that when a team member in our team goes and changes something others guard, we as a team ensure our changes are tested. The maintenance stays elsewhere, but all the other things we contribute.
  • End to end feature Guardianship and System Testing for a set of features. This means we see in our testing a big chunk of the end users' experience and drive improvements to it cross-team. 
  • Test all features for remote manageability. This means that for each feature, there is a way of using that feature that the other teams won't cover but we will. 
  • Test other teams features in the context of this product to some extent. This is probably the most fuzzy thing we do. 
  • First point of support for all client product maintenance. If it does not work, we figure out how, and who in our ecosystem could get to fixing it. 
  • Releases. When it has all already been tested, we make the selections of what goes out and when, and do all the practicalities around it. 
  • Monitoring in production. We don't stop testing when we release, but continue with monitoring and identifying improvement needs.
To do my work, I follow my developers' RSS feeds in addition to talking with them. But I also follow a good number (60+) of components and the changes going into those. There is no way Jira could any longer provide me the context of the work we're responsible for, and how that flows forward. 
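
As a rough illustration, here is a minimal sketch of what that feed-following could look like, assuming the version control system exposes a commit Atom/RSS feed per component; the URLs and component names below are hypothetical placeholders, not our actual setup:

```python
# A rough sketch of following per-component change feeds outside Jira.
# Assumes the version control system exposes a commit Atom/RSS feed for each
# component; the URLs and component names below are hypothetical.
import feedparser

COMPONENT_FEEDS = {
    "client-core": "https://vcs.example.com/client-core/commits.atom",
    "support-tools": "https://vcs.example.com/support-tools/commits.atom",
    # ...one entry per followed component (60+ in practice)
}

def recent_changes(feeds, per_feed=5):
    """Collect the latest entries from each followed component feed."""
    changes = []
    for component, url in feeds.items():
        parsed = feedparser.parse(url)
        for entry in parsed.entries[:per_feed]:
            changes.append((
                entry.get("updated", ""),   # timestamp string from the feed
                component,
                entry.get("title", ""),     # usually the commit summary
            ))
    # Newest first across all components; ISO timestamps sort lexicographically.
    return sorted(changes, reverse=True)

if __name__ == "__main__":
    for updated, component, title in recent_changes(COMPONENT_FEEDS):
        print(f"{updated}  [{component}]  {title}")
```

The point is not the script but the habit: a daily skim of what changed, across all the codebases we touch, gives the context no single backlog tool provides.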

I see others clinging to Jira with the hope that someone else tells them exactly what to do. And in some teams, someone does. That's what I call my "soul-sucking place". I would be crushed if my work was defined as doing that work identification for others. My good place is where we all know the rules of how to discover the work and volunteer for it. And how to prioritize it, and what of it we can skip as low risk because others are already doing some of it. 

The worst agile testing I did was when we thought the story was all there is. 

Thursday, October 11, 2018

How to Survive in a Fast Paced World Without Being Shallow


As we were completing an exercise in analyzing a tiny application and how we would test it, my pair looked slightly worn out and expressed their concern about going deeper in testing - time. It felt next to impossible to find time to do all the work that needed doing in the fast-paced agile, changes and deliveries, stories swooshing by. Just covering the basics of everything was full-time work!

I recognized the feeling, and we had a small chat on how I had ended up solving it by sharing much of the testing with my team's developers, to an extent where I might not show up for a story enough to even hear it swoosh by. Basic story testing might not be where I choose to spend my time, as I have a choice. And right now I have more choices than ever, being the manager of all the developers.

**Note: the developers I have worked with in the last two places I've worked are amazing testers, and this is because I don't hog the joy of testing from them but allow them to contribute to the full. Using my managerial powers to force testing on them is a joke. Even if it has a little truth in it. 

Even with developers doing all the testing they can do, I still have stuff to test as a specialist in testing. And that stuff is usually the things developers have not (yet) learned to pay attention to.

For browser-based applications, I find myself spending time in browsers other than the developers' favorite, and with browser features set away from the usual defaults.
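
To make that concrete, here is a minimal sketch of the kind of non-default browser setup I mean, using Selenium with Firefox; the URL and the specific preferences are illustrative assumptions, not a prescription:

```python
# A minimal sketch of exercising a web application in a non-default browser
# setup with Selenium, as one way to step off the developers' happy path.
# The URL and preference choices below are illustrative assumptions.
from selenium import webdriver

def non_default_firefox():
    opts = webdriver.FirefoxOptions()
    # Flip settings away from the defaults developers tend to test with.
    opts.set_preference("intl.accept_languages", "fi-FI")   # non-English locale
    opts.set_preference("network.cookie.cookieBehavior", 1)  # block third-party cookies
    opts.set_preference("permissions.default.image", 2)      # block images
    return webdriver.Firefox(options=opts)

if __name__ == "__main__":
    driver = non_default_firefox()
    try:
        driver.get("https://app.example.com")  # hypothetical application under test
        # Explore: do layout, language handling and consent flows still hold up?
        print(driver.title)
    finally:
        driver.quit()
```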

For our code and functionality, I find myself spending time interrogating the other software that could reside in the same environment, competing for attention. Some of my coolest bugs are in this category.

For anything lacking value, I find myself spending time using the application after it has been released, combining analytics and production environment use in my exploration.

To describe my tactic of testing, I explained the overall coverage I am aware of and then how I choose my efforts in a very specific pattern. I first do something simple to show myself that it can work, to make sure I understand what we've built on a shallow level. Then I leave the middle ground of covering stuff to others. Finally, I focus my own efforts on adding the things I find likely that others have missed.

This is patchy testing. It's the way I go deep in a fast-paced world so that I don't have to test everything in a shallow way.

Make a pick and remember: with continuous delivery, you are never really out of time for going deeper to test something. That information is still useful in future cycles of releasing. At least if you care about your users.