One of my friends asked me this question last week. I thought I'd share my answer here for anyone else who may have the same question.
There are two aspects to this: the technical one, and the people/cultural one.
They are a development team; it's their job and their duty to build secure software. Why does the dev team need convincing? Do they need convincing to build a feature, or to do unit testing? They don't. Why are security issues any different?
If dev teams are reluctant, it points to deeper issues. If they are unwilling to push fixes to production, that doesn't speak well of the team, the company culture, the engineering culture, or their competency as software engineers. They shouldn't be reluctant to push fixes; they should be reluctant to push bugs to production. There is a strong likelihood that the organization blamed them for a security or production incident in the past.
That's the people/cultural side of it.
Now, let’s look at the technical aspect.
Suppose the team is simply worried that the fix may break things. There are multiple ways to address that. They can push the fix to a test environment first to verify it works exactly the way it should, or they can do a blue-green deployment. They can roll the fix out to just 1% of their customer base, monitor for potential issues, and confirm everything is working as it should. If it is, they can gradually roll it out to the remaining 99%.
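The percentage rollout above can be sketched as deterministic user bucketing. This is a minimal illustration, not a prescription; the function name, salt, and bucket scheme are assumptions for the example.

```python
import hashlib

# Hypothetical helper: decide whether a user is in the canary cohort
# for a gradual rollout. Hashing user_id with a fixed salt makes the
# decision deterministic, so a user who sees the fix at 1% keeps
# seeing it as the percentage grows.
def in_canary(user_id: str, rollout_percent: float, salt: str = "fix-rollout") -> bool:
    digest = hashlib.sha256(f"{salt}:{user_id}".encode()).hexdigest()
    bucket = int(digest[:8], 16) % 10000  # bucket in [0, 9999]
    # 1% -> buckets 0..99; 100% -> every bucket qualifies
    return bucket < rollout_percent * 100

# Start at 1%, monitor, then widen the percentage gradually.
small_cohort = in_canary("user-42", 1.0)
full_rollout = in_canary("user-42", 100.0)  # True for every user
```

Because the bucketing is deterministic, widening the rollout never removes a user who already received the fix, which keeps monitoring comparisons clean.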
If the prioritization method is defined and made available to everyone, the business can make the call, but the CTO/CEO should accept that risk. Someone on the leadership team must bear the risk that, if the issue isn't fixed, user data or company information is at stake, depending on the impact. That's part of good corporate governance.
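One common prioritization method (a likelihood-times-impact matrix) can be written down so everyone scores issues the same way. The scales and thresholds below are illustrative assumptions, not a standard.

```python
# Illustrative likelihood x impact scoring; the labels, weights, and
# thresholds are assumptions chosen for the example.
LIKELIHOOD = {"rare": 1, "possible": 2, "likely": 3}
IMPACT = {"low": 1, "moderate": 2, "severe": 3}

def risk_score(likelihood: str, impact: str) -> int:
    return LIKELIHOOD[likelihood] * IMPACT[impact]

def priority(score: int) -> str:
    if score >= 6:
        return "fix now"
    if score >= 3:
        return "schedule fix"
    return "accept or monitor"

print(priority(risk_score("likely", "severe")))  # 9 -> "fix now"
```

Publishing the scoring rules is the point: the business can then make the call on a shared, visible basis rather than ad-hoc urgency.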
Our job is to surface security issues and help the business make the right decisions. Our job is not to get every security bug fixed. Our job is to help the team prioritize the right issues to fix in the right way. Our job is not to force them to do something by a specific date.
Our job is to evangelize, educate, make a solid argument, and propose a solution based on data and metrics to convince them.
Risk management is nothing but being aware of your downsides and managing them. Security issues have a downside, and not fixing an issue can have upsides as well.
Security issues are risks. They go into your risk register, with an entry such as:
“The security team identified and surfaced a risk and decided to accept the risk for so and so reason.”
The risk can be accepted, mitigated, avoided, or transferred:
- Accepted: we accept the risk, for example because our engineering team is reluctant to fix it in production.
- Mitigated: our team decided to fix the issue in production.
- Avoided: we brought down the application because it wasn't being used and wasn't worth the fix.
- Transferred: the fix is too complex or time-consuming; we outsource it instead of building it ourselves, or buy cybersecurity insurance.
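A risk-register entry can be modeled directly from the four treatments above. This is a minimal sketch; the field names and sample values are assumptions for illustration.

```python
from dataclasses import dataclass
from enum import Enum

# The four treatment options from the list above.
class Treatment(Enum):
    ACCEPT = "accepted"
    MITIGATE = "mitigated"
    AVOID = "avoided"
    TRANSFER = "transferred"

@dataclass
class RiskEntry:
    risk_id: str
    description: str
    treatment: Treatment
    rationale: str
    owner: str  # the leadership member who bears the risk

# Hypothetical entry recording an accepted risk with its rationale.
entry = RiskEntry(
    risk_id="RISK-001",
    description="Injection flaw in a legacy internal endpoint",
    treatment=Treatment.ACCEPT,
    rationale="Internal-only exposure; fix scheduled for next quarter",
    owner="CTO",
)
print(entry.treatment.value)  # "accepted"
```

Recording the rationale and a named owner is what turns "we didn't fix it" into a deliberate, governed decision.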