Where I work, there is a constant struggle between us (the good guys, the cyber security specialists) and them (the bad guys, in their various forms). The struggle is fascinating to look at from a tester's perspective.
We know software always has bugs. The good ones among us are good at catching them, and at catching some of the relevant ones before we ever deliver our software. And we catch some of them after release, listening carefully to signals and confirming suspicions. As a tester, I've come to the conclusion that what matters most is our ability to change. It matters because we will miss something, regardless of our best (and improving) efforts. But it matters even more because we have an adversary who plays the game from a different angle.
The people who create malware are software developers, in many ways just like us. I find it fascinating to wonder whether they have the same need to invest in testing. Any software with the right kind of bugs can give an attacker a route into a system they shouldn't be in. Does it matter whether the software using the exploit is fine-tuned and tested?
I had a chance to talk with an old tester colleague of mine who now uses his analytical skills in security work after something bad happens. Something bad these days could be a company ending up with ransomware, with lots of critical data inaccessible because it has been encrypted. You could try restoring from backups, but what if your incremental backups don't add up to a full restore? Apparently you can also use your testing skills: testing the malware and finding bugs in it that let you open the encrypted files again. The joy on my colleague's face on finding the bug and using it to open up all the files was the same joy we feel when we catch relevant bugs in our own software. The skills used are the same or similar; only the target of testing is completely different.
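To make that concrete, here is a purely hypothetical sketch of the kind of bug that makes such a recovery possible. Everything in it is invented for illustration, not taken from any real case: a toy XOR "cipher", and a ransomware author who seeds the key from the infection timestamp, which leaves a keyspace small enough to brute-force once you know a file's first few bytes.

```python
import hashlib


def keystream(key: bytes, length: int) -> bytes:
    """Expand a key into a keystream by hashing key || counter (toy cipher, illustration only)."""
    out = bytearray()
    counter = 0
    while len(out) < length:
        out.extend(hashlib.sha256(key + counter.to_bytes(8, "big")).digest())
        counter += 1
    return bytes(out[:length])


def xor_crypt(data: bytes, key: bytes) -> bytes:
    """XOR with the keystream; the same call encrypts and decrypts."""
    return bytes(a ^ b for a, b in zip(data, keystream(key, len(data))))


def buggy_key(timestamp: int) -> bytes:
    """The hypothetical bug: the key depends only on the infection timestamp."""
    return hashlib.sha256(str(timestamp).encode()).digest()


def recover(ciphertext: bytes, known_prefix: bytes, ts_from: int, ts_to: int):
    """Brute-force candidate timestamps; a known plaintext prefix
    (e.g. a file format's magic bytes) confirms the right key."""
    for ts in range(ts_from, ts_to):
        key = buggy_key(ts)
        if xor_crypt(ciphertext[:len(known_prefix)], key) == known_prefix:
            return xor_crypt(ciphertext, key)
    return None


if __name__ == "__main__":
    # Attacker side: encrypt a file with a timestamp-derived key.
    infection_time = 1_700_000_123
    original = b"%PDF-1.7 quarterly report, rather important"
    locked = xor_crypt(original, buggy_key(infection_time))

    # Defender side: search a plausible time window around the incident.
    restored = recover(locked, b"%PDF", 1_700_000_000, 1_700_001_000)
    print(restored == original)  # True
```

Real recoveries are of course far messier than this sketch, but the principle is the same: the malware author made an assumption, and a tester found the case where it breaks.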
Working in a security company makes me wonder about the role of testing. A lot of the bugs I routinely handle have to do with lost time, lost services, and the inability to do what needs doing. But some bugs, the security ones, have to do with lost access and lost data.
We're starting to see the value of security because data is crucial. I wonder if we will ever see the value of people's time in the same way, or whether other ways of freeing up time will win out over delivering well-functioning (tested and fixed) software.
We live in interesting times. And today I stop to appreciate that what I do for *this* software, I could do for any software.