I count things, some more relevant than others. I usually know how many times someone called me a 'guy' on a particular day, because counting is my calming mechanism for dealing with that annoyance. I know I delivered 23 external conference sessions in 2020, I know my team is about to do release number 24 before the year ends, I know we've processed a little over 300 Jira tickets (which I consider a few full weeks of wasted time), and I know how many items of actual value they include, since I reverse engineered the value out of the work we did this year.
Numbers are information, but only one type of information. They are, however, information easily used to impress people, and a tool for getting my message across more strongly than most people manage.
For my past few jobs, I have done end-of-year numbers. Since anything I work on right now is somewhat secret, I'll travel back in time to show what kinds of numbers I collected in the two previous years, to run an annual comparison.
- Team size. How many people were delivering the work I was analyzing.
- Monthly Active Devices (MAD). The first number I was curious about was how many customers the team was serving with the same people. Being a DevOps team meant that the team did development and deployment of new changes, but also provided support for an ever-growing customer base we calculated in impressively large numbers. Telemetry was an invaluable source for this information. It was not a measure of money coming in. It was people using the product successfully.
- Work done, represented in Jira tickets. I was trying hard to use Jira only as an inbox for work coming from outside the immediate team, and for the most part I succeeded, ruining my chances of showing all work in Jira ticket numbers (I consider this a success!). About a third of the visible ticket work was maintenance: responding to real customer issues and queries. Two thirds were internally sourced.
- Work coordinated, represented in Jira tickets. Other teams were much stricter about not accepting work from hallway conversations, and we often found ourselves in the role of caring for work that others in the overall ecosystem should do. Funnily enough, the numbers showed that for every 2 tickets we had worked on ourselves, we had created 3 for other teams. The number showed our growing role in ensuring other teams understood what hopes were directed towards them. It was also fascinating to realize that 70% of the work we had identified for others was done within the same year, indicating that it wasn't just empty passing of ideas but a major driving effort.
- Code changes. Working from the idea that for a DevOps team nothing changes unless the code (including configurations) changes, I looked around for numbers on code going into the product. I counted how many people contributed to the codebases and noted it was growing, and I counted how many separate codebases there were and noted that too was growing. I counted the number of changes to the product, and saw it double year over year. I noted that for every 4 changes to the product, we had 3 changes to system-level test automation. I noted code sharing had increased. The year-over-year numbers were a delight: from 16% to 41% (people committing to over N components) and from 22% to 43% (more than M people committing on them) on the two perspectives of sharing I sampled. I checked that my team was a quarter of the people working on the product line, and yet we had contributed 44% of the changes. I compared changes to Jira tickets to learn that for each Jira ticket, we had 6 changes in. Better to use the time on changing code than on managing Jira, I would say.
- Releases. I counted releases, and the combinations included in releases. If I wanted to show a smaller number, I just counted how many times we completed the process: 9, the number published in the NEXTA article we wrote on our test automation experience.
- Features pending on the next team. I counted that while we had 16 of them a year before, we now had none, thanks to the new process of taking full advantage of all code being changeable, including code owned by other teams. Writing code over writing tickets for anything of priority to our customer segment.
- Features delivered. I reverse engineered the features out of the ticket and change numbers, and got to yet another (smaller) number.
- Daily tests run. I counted how many tests we now had running on a daily basis. Again, information that is published: 200,000.
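The ratios in the list above are just simple division over raw counts. Here is a minimal sketch of that arithmetic; every count below is a made-up placeholder I chose for illustration, not one of the actual figures from the post:

```python
# Illustrative sketch of the year-end ratio arithmetic.
# All counts are hypothetical placeholders, not the real numbers.

tickets_done = 200     # Jira tickets the team worked on itself
tickets_created = 300  # tickets the team created for other teams
code_changes = 1200    # code changes the team contributed
all_changes = 2700     # total changes across the product line
team_share = 0.25      # team's share of people on the product line

# For every ticket done, how many were created for others?
created_per_done = tickets_created / tickets_done

# Changes per ticket: is the time going into code or into Jira?
changes_per_ticket = code_changes / tickets_done

# Contribution share, to compare against headcount share.
contribution_share = code_changes / all_changes

print(f"{created_per_done:.1f} tickets created per ticket done")
print(f"{changes_per_ticket:.1f} code changes per Jira ticket")
print(f"{contribution_share:.0%} of changes from {team_share:.0%} of people")
```

The arithmetic is trivial on purpose; the hard part, as the rest of the post argues, is deciding which counts are worth collecting and what question they answer.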
So you see, numbers are everywhere. They give you glimpses into what might be happening, but at the same time they are thoroughly unreliable. If you have a critical mind and a good understanding of their downsides, looking at them can be useful.
Going back in time even more, I find my favorite numbers: how I ended up logging fewer and fewer bugs as the team's tester. Moving from impressive numbers showing I found 8.5 bugs for every day of the year to having almost none logged, as I shifted to fix-and-forget and pairing with developers on fixes, gives me a nice fuzzy feeling that turning my work invisible was a real improvement.
Ask your question first, and numbers may help answer it. Numbers first, or comparing numbers between different teams, usually causes havoc.
So like numbers, as I do. But be careful out there. Even if they say you get what you measure, that is only sometimes true. We can talk about that another time.