Blazing a trail on the SAST road less traveled by
Recently I talked about how SonarSource is shifting the SAST paradigm from targeting security compliance auditors to developer-led code security. Today I'd like to talk about the most fundamental part of that change: accuracy in the issues we raise.
Most SAST tools target security compliance auditors. Their goal is to raise an issue for anything even remotely suspicious. There's no fear of false positives for those tools because the auditors will figure it out; after all, it's the auditors' job to sort the wheat from the chaff and the signal from the noise. But for years now the rallying cry at SonarSource has been "Kill the noise!" As a developer-first company, we know there's little tolerance among developers for crying wolf. So our guiding principle has been to prefer "reasonable" false negatives over raising false positives.
What does that mean in practical terms? Well, let's play with some numbers. Say you have a codebase with 12 Vulnerabilities. That's 12 things that absolutely need fixing. A typical SAST analysis might raise 500 issues in total, and then the auditors spend weeks sorting through them to bring you, the developer, the audit result, maybe a month or so after you've moved on to other code.
Not an appealing scenario, is it?
So okay, let's eliminate the lag time by pivoting to developer-led code security. Now, instead of taking weeks to sort through the SAST report, the auditors dump it on your desk. They expect you - as the developer - to find and fix the true Vulnerabilities. This scenario's even worse, both for you and for the security of the codebase. Because let's be honest, it won't take many false positives for you to throw up your hands and declare the whole thing a waste of time. Now, nothing gets fixed.
At SonarSource, we're keenly aware of that. That's why we accept reasonable false negatives. Instead of raising 12 real Vulnerabilities that are ultimately lost and ignored in a sea of false positives, we'd rather raise only 10 real Vulnerabilities that actually get fixed and miss the other two.
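The trade-off in those numbers is the classic precision/recall balance. Here's a minimal sketch of the two scenarios; the counts are the article's illustrative numbers, not measurements from any real tool:

```python
def precision(true_positives: int, total_raised: int) -> float:
    """Fraction of raised issues that are real problems (signal, not noise)."""
    return true_positives / total_raised

def recall(true_positives: int, total_real: int) -> float:
    """Fraction of real problems that actually get raised."""
    return true_positives / total_real

# Auditor-oriented tool: 500 issues raised, only 12 of them real.
print(precision(12, 500))  # 0.024 -> 97.6% of the report is noise
print(recall(12, 12))      # 1.0   -> nothing is missed, in theory

# Developer-first tool: 10 issues raised, all real, 2 missed.
print(precision(10, 10))   # 1.0   -> every issue raised is worth fixing
print(recall(10, 12))      # ~0.83 -> a few "reasonable" false negatives
```

In the first scenario the tool is technically complete but practically useless to a developer; in the second, every raised issue pays off, which is the credibility argument made below.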
Don't misunderstand. We're not missing those other two (theoretical!) issues because we're sloppy or lazy. Sometimes in implementing a rule you have to strike a balance: catch every single issue and sweep a few false positives into the net along the way, or tune the rule's sensitivity down to eliminate false positives and miss a few real issues at the same time. SonarSource developer Loic Joly recently gave a rule implementer's perspective on striking that delicate balance. As he explained, when we're faced with this choice at SonarSource, we're going to choose false negatives every time.
It's an issue of credibility. As I said earlier, we know that developers have little patience for false positives. So we make sure that when we raise an issue, there's something to fix. That doesn't mean we never raise false positives. We're human too, and if developers were perfect, you wouldn't need us in the first place. But our mission is to give you an accurate SAST analysis and kill the noise. And that makes all the difference.
This is the second installment in a 4-part series on Developer-led SAST:
- Taking the angst out of SAST analysis
- Blazing a trail on the SAST road less traveled by
- Security Hotspots maintain engagement in developer-led security
- Security auditors - the Cinderella story of SAST
Something to add? Join the discussion in the community.