Federated Application Security Intelligence

Harvesting and Leveraging Application Security Best Practice Data

What is Application Security?

Application security can be defined as building security into your software starting at the earliest point: the code. This practice includes adding logic to implement and test security features and to prevent security vulnerabilities. Application security also includes writing code to fortify user access, protect application input, perform encryption, and support threat modeling.

Application security has been recognized as a set of best practices for developers; however, in recent years, the DevOps community has begun to understand that it is also responsible for implementing application security best practices across the supply chain and the DevOps pipeline.

Putting Application Security Data to Work

Application security tooling is invoked by the DevOps pipeline. These tools generate critical security insights for each component pushed through the pipeline. In a decoupled architecture, one ‘logical application’ could depend on hundreds of components, such as microservices, each built via its own DevOps pipeline. The result: critical logs and insights are siloed per pipeline for each container, and application-level security data becomes fragmented into micro views. According to McKinsey & Company, 65% to 80% of organizations are looking for more visibility into their security and DevOps logs.

Application SBOMs

What is required is the federation of this data to higher organizational levels, providing a sweeping view of the security profiles across the organization, from application levels to environment levels.

Top 5 Application Security Best Practices

So, where does the DevOps and Security data come from? By now, most companies have built DevOps pipelines that address some level of application security. The top 5 most common Application Security Best Practices include:

These best practices create the data that shows the micro-level information on each component pushed through the pipeline. By gathering this data and tracking historical changes, a more comprehensive view of an organization’s application security profile can be derived.

Application Security in a Cloud-Native Environment

A cloud-native decoupled architecture adds complexity to the application security practice. In a decoupled architecture, hundreds of independent updates are moving across the pipeline all day long, creating new versions of ‘logical’ applications. Tracking the versions of each new component and ‘logical’ application being delivered across multiple environments becomes challenging. When the application security practice uncovers a high-risk vulnerability, understanding the vulnerability’s blast radius is critical for rapid response. With data fragmented across hundreds of components, it can take months to contain a single CVE.
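The blast-radius question above is essentially a reverse-dependency lookup. A minimal sketch, using illustrative application and component names (none of these are DeployHub's data structures), shows how an index from components to ‘logical’ applications answers "which applications are affected?" in one query instead of months of log spelunking:

```python
# Hypothetical sketch: map each component to the 'logical' applications
# that depend on it, then look up a vulnerable component's blast radius.
# All application/component names here are illustrative assumptions.

from collections import defaultdict

# Which components make up each 'logical' application
applications = {
    "storefront": ["checkout-svc", "search-svc", "auth-svc"],
    "warehouse": ["inventory-svc", "search-svc"],
}

def blast_radius(vulnerable_component: str) -> list[str]:
    """Return every logical application that includes the component."""
    index = defaultdict(set)
    for app, components in applications.items():
        for component in components:
            index[component].add(app)
    return sorted(index[vulnerable_component])

# A shared microservice shows up in every application that consumes it
print(blast_radius("search-svc"))
```

With the component-to-application index federated in one catalog, containing a CVE starts from a complete list of affected applications rather than a per-pipeline search.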

Federating Component Data to the Application Level

As technology becomes more decoupled, introducing software supply chain management becomes more critical. A software supply chain management catalog can restore the concept of a monolithic release by aggregating component-level details to the logical application level. The monolithic release becomes a ‘logical’ view from within the software supply chain management dashboard. From a single point of truth, a logical application’s security profile can be viewed, including aggregated SBOMs and CVEs. The application security data can also be aggregated to runtime environments and organizational silos. Federating the application security data of all components into a central software supply chain management catalog consolidates the data, showing not just a single component’s security profile but an entire organization’s security profile.

To meet the Biden Administration’s 2022 SBOM order, teams must deliver an SBOM that aggregates the SBOMs of all application component dependencies to the logical application level every time a change is delivered. Achieving this level of SBOM reporting means the DevOps pipeline must automatically track the ‘logical’ application and create an application release version, SBOM, and CVE report for each change.
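Aggregating component SBOMs up to the application level mostly comes down to merging component lists and de-duplicating shared dependencies. The sketch below uses a minimal CycloneDX-like dictionary shape for illustration only; real SBOM merging would key on the full package URL (purl) and preserve licenses, hashes, and other fields:

```python
# Hedged sketch: merge per-component SBOMs into one logical application
# SBOM, de-duplicating dependencies shared by multiple components.
# The minimal CycloneDX-like structure is an assumption for the example.

def aggregate_sboms(component_sboms: list[dict]) -> dict:
    """Merge component SBOMs into a single application-level SBOM."""
    seen = {}
    for sbom in component_sboms:
        for comp in sbom.get("components", []):
            # A real merge would key on the purl, not (name, version)
            key = (comp["name"], comp["version"])
            seen.setdefault(key, comp)
    return {"bomFormat": "CycloneDX", "components": list(seen.values())}

# Two illustrative component SBOMs that share one dependency
checkout = {"components": [{"name": "log4j-core", "version": "2.14.1"},
                           {"name": "guava", "version": "31.0"}]}
search = {"components": [{"name": "log4j-core", "version": "2.14.1"}]}

app_sbom = aggregate_sboms([checkout, search])
print(len(app_sbom["components"]))  # shared log4j-core appears once
```

Run on every pipeline execution, this kind of roll-up is what lets a catalog publish a fresh application-level SBOM for each change.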


Consolidate Comprehensive, End-to-End Insights

Creating tamper-proof software requires generating various types of security insights. Most of those insights, like SBOMs, are generated as part of the DevOps pipeline but left under the hood, sitting in a log or displayed in various dashboards across the environment. In a cloud-native environment, hundreds of these logs and reports are generated for each new build of a decoupled component. The data is essential.

Generating security logs such as SBOMs and CVE results is the first step, but consuming the data and building actionable insights is the basis for building strong security policies, including zero-trust. What is required is a consolidation of the information. The consumption of this information provides a comprehensive, end-to-end understanding of the organization’s security profile.

For example, a software supply chain management catalog provides one place to find where ‘log4j’ is being consumed and running. The catalog also provides a single location to view historical data to determine exposure levels, a CVE’s impact, and which versions of open-source objects are running across production environments. Security starts and ends with knowing what software is running across the enterprise. Consolidating the data enables strong security policies and immediate insights, so teams can take fast, accurate action across all environments.
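The ‘log4j’ example boils down to a simple query over a consolidated inventory: which environments run which components, with which dependency versions? A minimal sketch, with an assumed inventory structure and illustrative names (not a real catalog schema), looks like this:

```python
# Illustrative sketch of the query a consolidated catalog answers:
# where is a given package running, and at what versions?
# The inventory layout and all names are assumptions for the example.

inventory = {
    "prod-us": {"checkout-svc": {"log4j-core": "2.14.1", "guava": "31.0"}},
    "prod-eu": {"search-svc": {"log4j-core": "2.17.2"}},
    "staging": {"auth-svc": {"jackson-databind": "2.13.0"}},
}

def find_package(package: str) -> dict:
    """Map environment -> component -> version for a package name."""
    hits = {}
    for env, components in inventory.items():
        for component, deps in components.items():
            for name, version in deps.items():
                if package in name:
                    hits.setdefault(env, {})[component] = version
    return hits

# One query surfaces every environment exposed to the package
print(find_package("log4j"))
```

Without consolidation, this same answer requires grepping per-pipeline logs for every component; with it, exposure by environment and version falls out of a single lookup.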

Additional Activities Required to Harden the DevOps Pipeline

New security measures across the DevOps pipeline can improve your overall application security. Each phase of the pipeline will require updates to achieve this goal. Looking across the pipeline, five phases need to be updated:

  • Code and Pre-build – Critical security steps include code signing, scanning an entire codebase for vulnerabilities, and scanning individual files for code weaknesses.
  • Build – These actions include generating an image SBOM, image signing, and pre-package verification.
  • Post-Build – If the build step above does not include creating an image SBOM, a post-build step is needed to generate a complete SBOM of the entire build image.
  • Publish – Store and share containers, generate container CVEs, and collect security evidence to show an organization’s security profile.
  • Audit – Beyond adding security to the phases of the pipeline, auditing the pipeline itself further hardens the application life cycle process.
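One of the Build-phase activities above, pre-package verification, can be sketched in a few lines: record an artifact's digest at build time and recompute it before the Publish phase, so a tampered artifact never reaches the registry. File names and paths here are illustrative assumptions; real pipelines typically pair this with cryptographic signing (e.g., image signing) rather than a bare digest:

```python
# Minimal sketch of pre-package verification: fingerprint the artifact
# at build time, then verify the fingerprint before publishing.
# The artifact name and contents are assumptions for the example.

import hashlib
import tempfile
from pathlib import Path

def sha256_of(path: Path) -> str:
    """Digest used as the build-time fingerprint of an artifact."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

# Hypothetical build artifact, written to a temp dir for the example
artifact = Path(tempfile.mkdtemp()) / "app.tar"
artifact.write_bytes(b"build-output")

recorded = sha256_of(artifact)  # stored with the build record

# Publish time: recompute and compare before pushing the package
assert sha256_of(artifact) == recorded, "artifact changed after build"
print("digest verified:", recorded[:12])
```

A signed digest (rather than a plain one) additionally proves who produced the artifact, which is what the image-signing step in the Build phase adds on top of this check.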

Where Does DeployHub Fit In?

DeployHub is a software supply chain management catalog that consumes and aggregates security and DevOps intelligence, providing comprehensive, end-to-end insights. DeployHub leverages this data to give IT teams what they need to respond to issues and vulnerabilities within hours, not months. DeployHub is added to your CI/CD pipeline to automate the collection and aggregation of this data using a simple command-line interface, which can also add SBOM generation to the process if you have not already done so.


Application security best practices generate critical information for hardening digital assets against cyber attacks. The challenge is that this critical data is fragmented across siloed tools or hidden in logs. A federated software supply chain management catalog centralizes and aggregates the data, providing a macro view of an organization’s security profile, including ‘logical’ application SBOM reports, open-source inventory, and CVE blast radius. The aggregated data enables teams to respond to cyber threats in hours, not months. It is also important to note that more application security tooling is becoming available; as more data is created, the need for a federated catalog increases.

Interesting Application Security projects to watch:

  • Ortelius – A federated software supply chain catalog of all security and DevOps results, with data aggregation for comprehensive end-to-end insights and history. Incubating at the Continuous Delivery Foundation (CDF).
  • Pyrsia – Creates a decentralized package network with a build consensus. By building across multiple nodes, Pyrsia can compare results and immediately notify you when a build has been compromised. Incubating at the Continuous Delivery Foundation (CDF).
  • Tekton – An event-based CI/CD engine built for Kubernetes. Also includes Tekton Chains for auditing the pipeline itself. Incubating at the Continuous Delivery Foundation (CDF).
  • CDEvents – A critical piece in the overall pipeline puzzle. CDEvents is a Continuous Delivery Foundation (CDF) community effort to define standards for a CD events framework. CDEvents will simplify and standardize CI/CD workflows, eliminating much of the one-off scripting and creating an audit trail of what is occurring in the pipeline. An event-based CI/CD pipeline will make it easy to add and update pipeline activities without touching hundreds of pipeline workflows.
  • Keptn – An event-based Cloud-native CI/CD engine for orchestrating your application lifecycle. Designed to include observability and remediation with a GitOps approach. Incubating at the Cloud Native Computing Foundation.
  • Alpha-Omega – Its goal is first (Alpha) to work with the most popular open-source projects to find and fix vulnerabilities, and second (Omega) to provide over 10,000 open-source projects with automated security analysis. An Open Source Security Foundation (OpenSSF) community project.
  • Cosign – Container signing, verification, and storage in an OCI registry. Provides a historical record of changes and allows for searching of the record. An Open Source Security Foundation (OpenSSF) community project.
  • Syft – A CLI tool and Go library for generating a Software Bill of Materials (SBOM) from container images and filesystems. Managed by Anchore.
  • Apko – Build and publish OCI container images built from Alpine Package Keeper packages. A safer way to create containers. Managed by ChainGuard.

More Info - DeployHub's APIs for Data Gathering

API Documentation

DeployHub has a full set of APIs for customizing your integrations, allowing you to connect any DevOps or security tool to your data gathering.

Creating Custom Pre and Post Actions with DMScript

Need to go deeper?  DeployHub’s DMScript allows you more control over your customization of the DeployHub Platform.

Demo - Adding Data Gathering to Your DevOps Pipeline

Suggested Whitepaper

Application Security Tooling for your DevOps Pipeline

Application security tooling automates security best practices in the DevOps pipeline. Application security has mainly focused on improving code to fortify user access, protect application input, perform encryption, and support threat modeling. In addition, security enhancements to the DevOps pipeline enforce best practices to harden the application lifecycle. This whitepaper provides a clear understanding of what is needed to harden application security at minimal cost.

Get the Whitepaper


Further Reading on Supply Chain Security:

Sign Up for Free and Get Started Today

Sign up for DeployHub Team, the free SaaS supply chain management catalog. You will need a Company and Project Name to get started. The Company Name you enter will be created as your company’s private domain, referred to as your Global Domain. Your Project Name will be used under your company Domain. You will also receive an email from DeployHub containing your unique user ID and client ID, links, and useful information. Learn More

Get started federating all security and DevOps data with DeployHub Team SaaS Catalog.

Got questions?  Join our Discord channel and start a discussion. Open an issue on GitHub.

DeployHub Team is based on the Ortelius Open-Source project incubating at the Continuous Delivery Foundation.