Monthly Archives: September 2014

Do SSDLC Programs Really Work, or: How to Measure Success (1)

When discussing Secure Software Development Lifecycle (SSDLC) programs, I am often asked whether such an investment is actually worth the effort, or whether the available resources would be better invested elsewhere. It is true that actively running an SSDLC program is a big investment, and – as with any business expense – it is very reasonable to evaluate the ROI of these expenses.

As with any engineering program, solid data is a must when communicating results to upper management. In particular, the C-suite will request metrics that actually show that the program works as expected.

While there are dozens of metrics that can be collected when executing the SSDLC, I find four metrics particularly interesting. They deal with the inflow and processing of security defects in an organization’s portfolio. Ultimately, most “relevant” findings from an SSDLC activity that are actually tracked somewhere (instead of being fixed immediately) are either security defects or security requirements. Security requirement findings are much harder to quantify than security defects, because they are bound to a product’s target market, that is, how the product is used and deployed. Two identical products selling into two different target markets may have very different sets of security requirements. To a certain extent, the same is true for security defects. However, a security defect is usually a security defect; only the severity rating attached to it changes with target markets and deployments. This makes security defects a metric that is very easy to quantify and compare.

The four metrics we will discuss in the following are:

  1. The number of qualified defects found per quarter
  2. The number of qualified defects fixed per quarter
  3. The difference between [1] and [2], that is, inflow vs outflow
  4. The overall number of open qualified security defects
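To illustrate with made-up numbers: if 40 qualified defects are found in a quarter (metric 1) and 30 are fixed in the same quarter (metric 2), the net inflow for that quarter is 10 (metric 3), and the overall number of open qualified defects (metric 4) grows by 10.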

During my tenure at HP, I helped several organizations within HP introduce and successfully run SSDLC programs. Such organizations in HP may be business units or groups of product teams. The data provided in this series of posts is constructed from data I collected in an HP R&D organization with a few hundred R&D engineers. For obvious reasons, I cannot reveal which exact organization and which exact product set the data was derived from. To maintain additional confidentiality, I took data that is several years old, changed the numbers in a way that does not alter the overall pattern, and then transformed it onto a relative scale. These transformations do not affect the trends in the graph, which makes the anonymized (and somewhat randomized) data good enough for this series of posts.

Number of Qualified Defects Found Per Quarter

The figure below shows the number of issues found per quarter, broken up into low, medium, high, and critical severity defects. It contains 10 data points (9 quarters plus 1 aggregate): this particular organization had a history of 20 quarters of security activities before the program started, and all the results achieved in these 5 years are represented in the “20 Quarters” bar. The bars for Quarter 1 to Quarter 9 represent the individual quarters after the organization decided to introduce an SSDLC program. Note that all data points are relative to the results of the first 20 quarters, as the absolute numbers are confidential and cannot be published.

[Figure: Security Defects Metrics – Graph 1: qualified security defects found per quarter, by severity]

The graph shows that the program started slowly, with the number of newly reported security defects per quarter in the 10% range of the previous 20 quarters. The program then gained traction, roughly doubling this number to about 25% per quarter relative to what was reported in the 20 quarters before the program was initiated. After Quarter 7, the program really took off and started to show fast growth in newly discovered defects.

This is a very typical pattern. Depending on the size of the organization, it usually takes 6 to 9 months after real activities (i.e. more than PowerPoint activities) have started to see significant results. Once a few big teams or lighthouse projects within the organization are fully on board and see the first encouraging results for their projects, managers start investing more resources and engineers develop a higher level of security awareness. At this point, we often see exponential growth until a saturation level is reached, followed by a slight decline to a stable level.

In this particular example, we see that this organization (which at the time was still in the growing phase) managed to find almost as many (about 75%) qualified security defects in a single quarter as it did in the entire previous 5 years.


Configure PMD in Multi-Module Maven Builds

Similar to Checkstyle, the execution of PMD can be automated both in developer builds and in the CI/CD environment, while at the same time using a centralized configuration artifact for the PMD rules setup.

The Maven PMD Plugin can be used to integrate PMD into Maven builds. The plugin actually performs two tasks: the PMD analysis itself and a “copy and paste” detection, where the plugin tries to detect code reuse by copy and paste rather than through language-specific code reuse mechanisms. The plugin provides separate goals for these checks (“pmd” and “cpd”), and there are variants of these goals that fail the build in case of a rule violation (“check” instead of “pmd” and “cpd-check” instead of “cpd”). By setting up a Maven profile for release builds, developers can create local rule violation reports in their build environment, while the CI/CD system fails the build in case of rule violations.

The configuration below sets up a default configuration of the PMD plugin that allows a developer to execute the PMD checks with the “mvn clean install” command. In case of rule violations, the developer build is configured not to fail, but to create data for reports instead. Some of the configuration settings are parameterized with Maven build variables, but the configuration works the same way when these settings are hardcoded.

In this example, we use a configuration artifact (a jar resource file) for the PMD rules configuration. The “dependency” section pulls in the configuration artifact, and the “ruleset” parameter points to the PMD rules configuration within that artifact. This filename must match the actual rules filename used in the dependency. Note that PMD performs a JDK-specific analysis and therefore requires the JDK version to be specified. This should match the JDK version configured in the Maven Compiler Plugin.

<build>
  ...
  <plugins>
  ...
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-pmd-plugin</artifactId>
      <version>${mavenPmdPluginVersion}</version>
      <configuration>
        <linkXref>true</linkXref>
        <sourceEncoding>${project.build.sourceEncoding}</sourceEncoding>
        <minimumTokens>100</minimumTokens>
        <targetJdk>${javaVersion}</targetJdk>
        <rulesets>
          <ruleset>pmd-rules.xml</ruleset>
        </rulesets>
      </configuration>
      <dependencies>
        <dependency>
          <groupId>${buildToolsGroupId}</groupId>
          <artifactId>${buildToolsArtifactId}</artifactId>
          <version>${buildToolsVersion}</version>
        </dependency>
      </dependencies>
      <executions>
        <execution>
          <phase>validate</phase>
          <goals>
            <goal>pmd</goal>
            <goal>cpd</goal>
          </goals>
        </execution>
      </executions>
    </plugin>
  ...
  </plugins>
  ...
</build>
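
The “ruleset” entry above refers to a pmd-rules.xml file at the root of the configuration artifact. As an illustration, a minimal sketch of such a rules file could look as follows (the referenced rule sets are examples only and assume the PMD 5.x rule reference syntax):

<?xml version="1.0"?>
<ruleset name="pmd-rules"
         xmlns="http://pmd.sourceforge.net/ruleset/2.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://pmd.sourceforge.net/ruleset/2.0.0
                             http://pmd.sourceforge.net/ruleset_2_0_0.xsd">

  <!-- Example description of the centralized rule set -->
  <description>Centralized PMD rules for all modules</description>

  <!-- Example rule references (PMD 5.x syntax); adjust to the project's needs -->
  <rule ref="rulesets/java/basic.xml"/>
  <rule ref="rulesets/java/unusedcode.xml"/>
  <rule ref="rulesets/java/imports.xml"/>
</ruleset>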

For the automated continuous integration (CI) build, we use a very similar variant of the configuration in a profile. In the CI build, the plugin uses the two alternate goals to fail the build in case of a rule violation:

<profiles>
  ...
  <profile>
    <id>release</id>
    ...
    <build>
      <plugins>
      ...
        <plugin>
          <groupId>org.apache.maven.plugins</groupId>
          <artifactId>maven-pmd-plugin</artifactId>
          <version>${mavenPmdPluginVersion}</version>
          <configuration>
            <linkXref>true</linkXref>
            <sourceEncoding>${project.build.sourceEncoding}</sourceEncoding>
            <minimumTokens>100</minimumTokens>
            <targetJdk>${javaVersion}</targetJdk>
            <rulesets>
              <ruleset>pmd-rules.xml</ruleset>
            </rulesets>
          </configuration>
          <dependencies>
            <dependency>
              <groupId>${buildToolsGroupId}</groupId>
              <artifactId>${buildToolsArtifactId}</artifactId>
              <version>${buildToolsVersion}</version>
            </dependency>
          </dependencies>
          <executions>
            <execution>
              <phase>validate</phase>
              <goals>
                <goal>check</goal>
                <goal>cpd-check</goal>
              </goals>
            </execution>
          </executions>
        </plugin>
        ...
      </plugins>
      ...
    </build>
    ...
  </profile>
  ...
</profiles>
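
With this setup, a local developer build (“mvn clean install”) only reports violations, while a CI/CD build that activates the profile shown above (for example with “mvn clean install -P release”) fails as soon as a PMD or CPD rule is violated.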

To create a graphical PMD and CPD report with the “mvn site” command, the Maven PMD plugin can be configured to render the results from the analysis phase into an HTML report. To enable this functionality and automatically include the reports in the project’s site report, add the following code to the “reporting” section of the POM file (in addition to the plugin configuration in the “build” section):

<reporting>
  ...
  <plugins>
  ...
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-pmd-plugin</artifactId>
      <version>${mavenPmdPluginVersion}</version>
      <configuration>
        <linkXref>true</linkXref>
        <sourceEncoding>${project.build.sourceEncoding}</sourceEncoding>
        <minimumTokens>100</minimumTokens>
        <skipEmptyReport>false</skipEmptyReport>
        <targetJdk>${javaVersion}</targetJdk>
        <rulesets>
          <ruleset>pmd-rules.xml</ruleset>
        </rulesets>
      </configuration>
    </plugin>
  ...
  </plugins>
  ...
</reporting>

To create the graphical report, the plugin uses the analysis results that are already present in the “target” directory of an artifact, and then transforms them into a graphical representation. This is why the configuration does not have a dependency on the PMD configuration artifact.

Note that this plugin needs the JXR plugin to be executed beforehand if it is to provide cross-references to the source code in its reports.
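
As a sketch of what this could look like, the JXR plugin can be added to the same “reporting” section (the version property name is an assumption and should be adapted to the project’s conventions):

<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-jxr-plugin</artifactId>
  <version>${mavenJxrPluginVersion}</version>
</plugin>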

Edit: See https://github.com/mbeiter/util for an example on how to configure the Maven PMD Plugin as described in this post for a multi-module build.