According to market researcher ReportsnReports, North America is the largest market for security testing services, a market estimated to reach $4.96 billion by 2019. Why, then, are 90 percent of security incidents (per the U.S. Department of Homeland Security) caused by exploits against defects in software?
It’s no secret to most software professionals that poor application security is one of the biggest vulnerabilities in any organization’s security defenses. Companies patch applications constantly. In July 2016, Oracle issued its largest patch bundle ever, fixing 276 security flaws with a single effort.
What many organizations have not yet accepted is that security flaws in production are not a given. They also aren’t addressing the fact that inadequate security testing isn’t the only problem. Developers are engaging in risky behaviors without organizational approval. Testing is happening at the wrong time using ineffective processes. The security team is often not integrated into the development process—or it has an adversarial relationship with the developers it is supposed to assist.
To resolve these issues and meet the market’s expectations for security, organizations must explore the underlying behaviors that are affecting security, why accelerated release cycles are exacerbating them, and how companies can fix the problem.
The SANS Institute 2015 State of Application Security Report provides some solid insight into the disconnect between development and security. The report found:
Information security engineers and software developers don’t understand each other’s jobs, priorities or missions.
Developers are trained to focus on delivering bits of code that perform as intended and on meeting delivery expectations, rather than on ensuring that software is secure.
Only a small percentage of security testing is performed by the development team (21.6 percent) or quality assurance personnel (22 percent). The internal security team handles most (83.2 percent) of the testing.
To these challenges, I would add two more:
Pressure to develop and release software more quickly encourages developers to borrow code from others or use open source libraries that no one has checked for security flaws.
Many organizations continue to focus on testing at the end of the development cycle—right before release to production. (Some focus on eliminating vulnerabilities only after they reach production, which is an even more dangerous practice.)
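The borrowed-code problem, at least, is partly automatable. As a rough illustration (not any particular tool's behavior), a minimal intake gate might refuse dependencies pinned below the version that fixes a known flaw. The package name and advisory data here are entirely hypothetical; a real team would pull advisories from a vulnerability feed rather than hard-coding them.

```python
# Hypothetical advisory data: package name -> first fixed version.
# A real gate would query a vulnerability feed instead of hard-coding this.
KNOWN_FIXED = {"examplelib": (2, 3, 1)}

def parse_pin(line):
    # Accepts only simple "name==X.Y.Z" pins for this sketch.
    name, version = line.strip().split("==")
    return name.lower(), tuple(int(part) for part in version.split("."))

def violations(requirements):
    # Return every pinned dependency that is older than its known fix.
    bad = []
    for line in requirements:
        name, version = parse_pin(line)
        fixed = KNOWN_FIXED.get(name)
        if fixed and version < fixed:
            bad.append((name, version, fixed))
    return bad

# The vulnerable pin is flagged; the unknown package passes through.
print(violations(["examplelib==2.2.0", "otherlib==1.0.0"]))
```

Even a crude check like this moves the policy decision to the moment a library is borrowed, rather than leaving it for the security team to discover at release time.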
The Cost of Being Bad
All of this dysfunction creates an entire ecosystem that encourages security flaws—and causes other problems. Here is a typical scenario:
Developers notify the security team when they arrive at a certain point in the development process, which may be “early” (when code for a particular application or system function—or, with agile, a particular sprint—is complete) or “late” (when the product is ready for release).
The security team performs its tests and finds numerous flaws, all of which must be corrected at some point.
A team evaluates and prioritizes the flaws, and determines which are critical and which can be addressed at a later date.
A developer must then go back and identify the source of each flaw, not only in his or her own code but likely also in related code from other developers that was used to build the vulnerable function or component.
After a considerable amount of work, the critical flaws are addressed and the cycle begins again.
The end result of this approach is that release cycles often slip, and some flaws are knowingly released into the finished product. Budgets are blown, and if the vulnerabilities are discovered externally before the next release gives teams a chance to fix them, customers are angered and corporate reputations are tarnished.
Some companies attempt to resolve this problem by allotting a specific period for security testing. I have worked with customers who engaged in a massive, month-long security testing effort before each release went live. With two releases a year, this meant that two full months were consumed dealing with security. That’s a productivity- and timeline-blowing approach, as well.
Fixing the Problem
There are many detailed improvement suggestions I can offer, but I will leave those for a future discussion. For now, let’s talk broad strokes. The big takeaway is that organizations must build security into the entire software development lifecycle (SDLC), addressing security in the early stages of the SDLC rather than tacking it on at the end.
They also must find a way to unify the development and security functions, both in action and in outlook. From my experience, the most effective way to achieve this goal is to integrate security into the developers’ process and not ask the developer to integrate into the security process. Following are some suggestions:
Focus less on ensuring every release is perfectly secure and more on architecting a process that consistently creates secure code. This effort should include teaching developers how to write secure code initially.
Develop policies (and security testing procedures, if the practice is allowed) for the use of borrowed and open-source code.
Include security architects in the development process at the start of every project.
Invest in tools for secure development that can speed up testing and provide sufficient depth and breadth to security testing efforts.
Manage vulnerabilities like defects, handling them by exception.
If the company feels compelled to take shortcuts and save budget, focus testing efforts on what has changed from the last release.
Use application defense tools, such as Runtime Application Self-Protection (RASP), to monitor applications and identify flaws in production.
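To make the first suggestion concrete: “writing secure code initially” often comes down to habits as simple as never concatenating untrusted input into a query. The sketch below, using Python’s standard `sqlite3` module with a throwaway in-memory table, contrasts the vulnerable habit with the parameterized one (the table and data are illustrative only).

```python
import sqlite3

# Throwaway in-memory database with an illustrative users table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'admin'), ('bob', 'user')")

def find_user_unsafe(name):
    # Vulnerable: attacker-controlled input is concatenated into the SQL,
    # so name = "' OR '1'='1" turns the WHERE clause into a tautology.
    return conn.execute(
        "SELECT name, role FROM users WHERE name = '" + name + "'"
    ).fetchall()

def find_user_safe(name):
    # Secure: a parameterized query treats the input as data, never as SQL.
    return conn.execute(
        "SELECT name, role FROM users WHERE name = ?", (name,)
    ).fetchall()

# The injection string dumps the entire table through the unsafe query,
# but matches nothing when bound as a parameter.
print(find_user_unsafe("' OR '1'='1"))
print(find_user_safe("' OR '1'='1"))
```

Teaching this distinction up front is far cheaper than having the security team find the resulting injection flaw at release time and sending it back through the remediation cycle described above.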
Organizations that make these and other improvements will reap dramatic gains. In addition to significant reductions in security flaws (and resulting exploitation), most firms report team productivity gains of 15 percent or more. Cycle time for vulnerability management and remediation is often cut in half.
As a bonus, taking an integrated approach for security often paves the way for more comprehensive process improvement initiatives. Early defect detection and prevention becomes the norm, and both employee and application user satisfaction soar.