SBOM Automation – Your First Step in the Security Journey
What is SBOM Automation?
Challenges with SBOM Automation
SBOM automation must be part of your CI/CD workflow. Adding SBOM generation to a single workflow is fairly straightforward, but adding it to every workflow is a much bigger task. An enterprise with hundreds of workflow files must modify every single one. And what do you get for that effort? An SBOM that sits in the build directory and lets you check off a security box. Many teams would call this busy work. If the data from the SBOM is never consumed or acted upon, is SBOM automation worth it?
Consumption of SBOM Automation Data
The information derived from an SBOM is a critical first step in understanding your software security profile. An SBOM exposes the open-source packages, and their attributes, that you are consuming and delivering to your end users. But generating an SBOM alone does not provide insights; the data must be consumed through SBOM management. To act on SBOM data, you need a method of collection and historical tracking. And that collection and tracking should be continuous, auditing every update to every version and the impact of those changes.
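To make the idea of "consuming" SBOM data concrete, here is a minimal Python sketch (an illustration, not part of DeployHub or Ortelius) that reads a CycloneDX JSON document and lists the package names and versions it declares, which is the raw material for historical tracking and CVE matching:

```python
import json

def list_packages(sbom_json):
    """Extract (name, version) pairs from a CycloneDX JSON SBOM."""
    sbom = json.loads(sbom_json)
    return [(c.get("name"), c.get("version"))
            for c in sbom.get("components", [])]

# A tiny inline CycloneDX document standing in for a real SBOM file.
sample = json.dumps({
    "bomFormat": "CycloneDX",
    "specVersion": "1.4",
    "components": [
        {"type": "library", "name": "log4j-core", "version": "2.14.1"},
        {"type": "library", "name": "requests", "version": "2.31.0"},
    ],
})

for name, version in list_packages(sample):
    print(name, version)
```

In a real pipeline the JSON would come from a file produced at the build step; the point is that the SBOM only becomes useful once something reads it.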
Using DeployHub and the Ortelius Command Line Interface for SBOM Automation
DeployHub is a unified ‘evidence store’ of all your security and DevOps intelligence resulting from your CI/CD process. It continually collects critical information, such as SBOMs, each time a new artifact is pushed across the pipeline. To continuously gather pipeline intelligence, DeployHub must become part of your pipeline. When you update your workflow to generate SBOMs, you can add DeployHub to collect and consume the data. Once centralized, DeployHub leverages the SBOM information to generate real-time CVEs. It can also aggregate your microservice SBOMs up to the ‘logical’ application level, a critical security requirement in a cloud-native architecture.
DeployHub integrates into your CI/CD process using the Ortelius Open-Source Command Line Interface (CLI). The Ortelius CLI gathers supply chain data from a single pipeline workflow at the build and deploy steps. The build step gathers Swagger definitions, the SBOM, the readme, licenses, Git data, the Docker image, and other build output. The deploy step records when a release occurs, what was sent, and where the objects were sent.
The Ortelius CLI is maintained by the Ortelius Open Source Community under the governance of the Linux Foundation’s Continuous Delivery Foundation.
For the most up-to-date information on the Ortelius CLI, visit the Ortelius GitHub Repository. You will find a complete list of parameters for collecting Swagger, SBOM, and other tool reports and results.
DeployHub Team SBOM Automation POC (Whitepaper)
Get the Proof of Concept 4-step guide to implement your SBOM Automation with your CI/CD Pipeline and DeployHub Team.
Using the Ortelius CLI Data Gathering .toml
The Ortelius CLI reads from a .toml file. The .toml file contains non-derived information for each artifact that you create at your build step. In DeployHub, an artifact is referred to as a Component. A Component is a Container, DB Object, or file object (.jar, Lambda Function, Apex file, etc.). The .toml file provides the 'non-derived' data for the Component you are tracking in DeployHub, including the Component name, owner, Component type, and owner contact details. The Ortelius CLI reads the .toml file from the Git Repository associated with your pipeline. If you use a Mono Repository for your entire codebase, you will need a separate .toml file for each Component, managed in its sub-directory.
In a cloud-native microservice architecture, there are many, if not hundreds, of Components. Organizing your Components within DeployHub is done in two ways. They are grouped based on a subject Domain and assigned to a logical Application. Not all Components need to be assigned to an Application, but they should be stored in a subject matter Domain so they can be easily found and reused.
A logical Application is a collection of Components that make up a complete software system consumed by an end user. Applications are composed of shared Components and Application specific Components, and are a logical representation of what Components need to be deployed for the software system to run.
Note: Once created, your .toml file does not need to be updated unless the non-derived information changes or you want to reorganize which Applications or Domains the Component is assigned to. For example, a Component may be reassigned to a new owner and new team, represented by a different Domain or Application.
Perform the following steps to add your Components using the .toml file:
Step 1 – Define Your DeployHub Pipeline Variables
The following variables should be set at the beginning of your Pipeline for your SBOM Automation:
Variable | Value | Description
---------|-------|------------
DHURL | URL to DeployHub Login | The URL used to access DeployHub.
DHUSER | UserID | The ID used to log into DeployHub.
DHPASS | password | The password used to log into DeployHub. This can be encrypted depending on the CI/CD solution.
DOCKERREPO | Name of your Docker Repository | For Components that are Docker Images. Not needed for non-Docker objects.
IMAGE_TAG | Tag for the Docker Image | For Components that are Docker Images. Not needed for non-Docker objects.
Example

export DHURL=https://deployhub.example.com
export DHUSER=Stella99
export DHPASS=chasinghorses
export DOCKERREPO=quay.io/DeployHub/hello-world
export IMAGE_TAG=1.0.0

Step 2 – Create your component.toml file
Cut and paste the following into a component.toml file to perform your SBOM Automation. Update 'your' information, and commit/push the file to your Git Repository.

# Application Name and Version – not required. If not used, the Component will not be associated to an Application
Application = "GLOBAL.your Application Name"
Application_Version = "your Application Version"
# Define Component Name, Variant, and Version – required
Name = "GLOBAL.your Component Name"
Variant = "${GIT_BRANCH}"
Version = "vyour Component Version.${BUILD_NUM}-g${SHORT_SHA}"
# Key/Values to associate to the Component Version
[Attributes]
DockerBuildDate = "${BLDDATE}"
DockerRepo = "${DOCKERREPO}"
DockerSha = "${DIGEST}"
DockerTag = "${IMAGE_TAG}"
DiscordChannel = "your Discord channel" or SlackChannel = "your Slack Channel"
ServiceOwner = "${DHUSER}"
ServiceOwnerEmail = "your Component Owner Email"

Example

# Application Name and Version
Application = "GLOBAL.Santa Fe Software.Online Store Company.Hipster Store.Prod.helloworld app"
Application_Version = "1"
# Define Component Name, Variant, and Version
Name = "GLOBAL.Santa Fe Software.Online Store Company"
Variant = "${GIT_BRANCH}"
Version = "v1.0.0.${BUILD_NUM}-g${SHORT_SHA}"
# Key/Values to associate to the Component Version
[Attributes]
DockerBuildDate = "${BLDDATE}"
DockerRepo = "${DOCKERREPO}"
DockerSha = "${DIGEST}"
DockerTag = "${IMAGE_TAG}"
DiscordChannel = "https://discord.gg/wM4b5yEFzS"
ServiceOwner = "${DHUSER}"
ServiceOwnerEmail = "stella@DeployHub.io"
Note: For SaaS users, you will have a second high-level qualifier that was created as part of your sign-up. This second high-level qualifier must be used as the start of your Application Name and Component Name. For example: GLOBAL.Santa Fe Software.Online Store.
Step 3 – Add a step in your pipeline to run Syft for your SBOM Automation (Optional)
DeployHub can consume any SPDX- or CycloneDX-formatted SBOM. If you are already generating SBOMs, you will pass the name of the SBOM results file to DeployHub in Step 4 below. If you are not generating SBOMs as part of your pipeline process, you will need to add SBOM generation to collect the lower-level dependency data. The following shows how to add Syft to your workflow to collect SBOM data.
Syft generates Software Bill of Materials reports for popular coding languages and package managers, including Docker images.
Example
The following code example scans a Docker Image to generate the SBOM. See Syft Options to scan other objects and coding languages.
# install Syft
curl -sSfL https://raw.githubusercontent.com/anchore/syft/main/install.sh | sh -s -- -b $PWD
# create the SBOM
./syft packages $DOCKERREPO:$IMAGE_TAG --scope all-layers -o cyclonedx-json > cyclonedx.json
# display the SBOM
cat cyclonedx.json
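Before handing the SBOM to the next pipeline step, it can be worth failing fast if the generator produced an empty or malformed file. This is a sketch of such a guard in Python (the check itself is this article's suggestion, not a DeployHub requirement; the file name matches the Syft example above):

```python
import json

def check_cyclonedx(path):
    """Return the component count, or exit if the file is not CycloneDX."""
    with open(path) as f:
        sbom = json.load(f)
    if sbom.get("bomFormat") != "CycloneDX":
        raise SystemExit(path + ": not a CycloneDX document")
    return len(sbom.get("components", []))

# Stand-in for Syft's output; in the pipeline this file comes from the step above.
with open("cyclonedx.json", "w") as f:
    json.dump({"bomFormat": "CycloneDX", "specVersion": "1.4",
               "components": [{"type": "library", "name": "zlib",
                               "version": "1.2.13"}]}, f)

print(check_cyclonedx("cyclonedx.json"), "components found")
```

A component count of zero usually means the scan target was wrong, which is cheaper to catch here than after the data reaches DeployHub.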
Step 4 – Run the Ortelius CLI to add Your Component and Create an Application.
To complete the process, install the Ortelius CLI on the machine where your CI/CD jobs run. Refer to the Ortelius GitHub CLI Documentation for installation instructions. Then execute the following calls to the Ortelius CLI as part of your workflow, after the build and SBOM generation steps:
Example
With CycloneDX SBOM
dh updatecomp --rsp component.toml --deppkg "cyclonedx@name of your SBOM file"
Example:
dh updatecomp --rsp component.toml --deppkg "cyclonedx@cyclonedx.json"
With SPDX SBOM
dh updatecomp --rsp component.toml --deppkg "spdx@name of your SBOM file"
Example:
dh updatecomp --rsp component.toml --deppkg "spdx@spdx.json"
Without SBOM
dh updatecomp --rsp component.toml
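The three variants above can be wrapped in a small helper so the pipeline picks the right --deppkg argument automatically. A Python sketch (the file names cyclonedx.json and spdx.json are assumptions carried over from the examples above) that only builds the command line, leaving execution via subprocess to your workflow:

```python
def build_updatecomp_cmd(sbom_files, toml_path="component.toml"):
    """Assemble the dh updatecomp call, preferring a CycloneDX SBOM,
    then SPDX, and falling back to no --deppkg at all."""
    cmd = ["dh", "updatecomp", "--rsp", toml_path]
    if "cyclonedx.json" in sbom_files:
        cmd += ["--deppkg", "cyclonedx@cyclonedx.json"]
    elif "spdx.json" in sbom_files:
        cmd += ["--deppkg", "spdx@spdx.json"]
    return cmd

# e.g. after a build that produced a CycloneDX SBOM:
print(" ".join(build_updatecomp_cmd(["cyclonedx.json"])))
```

Keeping this decision in one place means a workflow that later switches SBOM formats only changes its generation step, not the DeployHub upload.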
Results
Once you have completed the above steps, you should see the following results: