Argo Workflows provider
Parses Argo API documents (apiVersion: argoproj.io/*) from .yaml
/ .yml files on disk. Text-only static analysis: no argo binary,
no cluster access. Recognized kinds: Workflow, WorkflowTemplate,
ClusterWorkflowTemplate, CronWorkflow. Documents that don't
carry an argoproj.io/* apiVersion are silently skipped.
Producer workflow
pipeline_check --pipeline argo --argo-path workflows/
# A single workflow file works too.
pipeline_check --pipeline argo --argo-path workflows/release.yaml
All other flags (--output, --severity-threshold, --checks,
--standard, …) behave the same as with the other providers.
Argo-specific checks
- ARGO-005: `{{inputs.parameters.X}}` substitution happens before the shell parses the script, so any unquoted use in `script.source` / `container.args` is a command-injection primitive. Pass the parameter via `env:` and reference it quoted.
- ARGO-003: `Workflow` / `CronWorkflow` must set `serviceAccountName`. Workflows that fall back to the namespace's `default` SA inherit whatever role someone later binds to `default`.
What it covers
16 checks · 2 have an autofix patch (--fix).
| Check | Title | Severity | Fix |
|---|---|---|---|
| ARGO-001 | Argo template container image not pinned to a digest | HIGH | |
| ARGO-002 | Argo template container runs privileged or as root | HIGH | |
| ARGO-003 | Argo workflow uses the default ServiceAccount | MEDIUM | |
| ARGO-004 | Argo workflow mounts hostPath or shares host namespaces | CRITICAL | |
| ARGO-005 | Argo input parameter interpolated unsafely in script / args | CRITICAL | |
| ARGO-006 | Literal secret value in Argo template env or parameter default | CRITICAL | 🔧 fix |
| ARGO-007 | Argo workflow has no activeDeadlineSeconds | LOW | |
| ARGO-008 | Argo script source pipes remote install or disables TLS | HIGH | 🔧 fix |
| ARGO-009 | Artifacts not signed (no cosign/sigstore step) | MEDIUM | |
| ARGO-010 | No SBOM generated for build artifacts | MEDIUM | |
| ARGO-011 | No SLSA provenance attestation produced | MEDIUM | |
| ARGO-012 | No vulnerability scanning step | MEDIUM | |
| ARGO-013 | Argo workflow does not opt out of SA token automount | MEDIUM | |
| ARGO-014 | Argo template script runs unpinned package install | MEDIUM | |
| ARGO-015 | Input artifact pulls from an insecure (non-HTTPS) URL | HIGH | |
| TAINT-007 | Untrusted input flows across templates via Argo outputs.parameters | HIGH | |
ARGO-001: Argo template container image not pinned to a digest
Walks spec.templates[].container, spec.templates[].script, and spec.templates[].containerSet.containers[]. The image must contain @sha256: followed by a 64-char hex digest.
Recommended action
Pin every container / script template image to a content-addressable digest (alpine@sha256:<digest>). Tag-only references (alpine:3.18) and rolling tags (alpine:latest) let a compromised registry update redirect the workflow's containers at the next pull, with no audit trail in the WorkflowTemplate.
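A passing shape, sketched (the image name and digest below are placeholders, not a real published image):

```yaml
spec:
  templates:
    - name: build
      container:
        # placeholder digest — pin to the actual content digest of the image you ship
        image: alpine@sha256:82d1e9d7ed48a7523bdebc18cf6290bdb97b82302a8a9c27d4fe885949ea94d1
        command: [sh, -c, echo ok]
```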
ARGO-002: Argo template container runs privileged or as root
Detection fires on securityContext.privileged: true, runAsUser: 0, runAsNonRoot: false, allowPrivilegeEscalation: true, or no securityContext block at all. Also walks spec.podSpecPatch (raw YAML) for an explicit privileged: true token.
Recommended action
Set securityContext.privileged: false, runAsNonRoot: true, and allowPrivilegeEscalation: false on every template container / script. A privileged container shares the node's kernel namespaces; a malicious image then has root on the build node and breaks the boundary between workflow and cluster.
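A hardened template sketch (image reference illustrative; ARGO-001 additionally wants a digest pin):

```yaml
spec:
  templates:
    - name: build
      container:
        image: builder:1.0          # illustrative tag — pin by digest in real use
        securityContext:
          privileged: false
          runAsNonRoot: true
          allowPrivilegeEscalation: false
```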
ARGO-003: Argo workflow uses the default ServiceAccount
Applies to Workflow and CronWorkflow. WorkflowTemplate / ClusterWorkflowTemplate are exempt because the SA is set on the run that references them. An explicit serviceAccountName: default is treated the same as omission.
Recommended action
Set spec.serviceAccountName (or spec.workflowSpec.serviceAccountName for CronWorkflow) to a least-privilege ServiceAccount that carries only the secrets and RBAC the workflow needs. Falling back to the namespace's default SA grants access to whatever cluster-admin or wildcard role someone later binds to default, a privilege-escalation surface that should never be load-bearing for workflow pods.
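A minimal passing sketch; the ServiceAccount name is hypothetical and must exist in the namespace with only the RBAC the workflow needs:

```yaml
apiVersion: argoproj.io/v1alpha1
kind: Workflow
metadata:
  generateName: release-
spec:
  entrypoint: main
  serviceAccountName: release-workflow   # hypothetical least-privilege SA
```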
ARGO-004: Argo workflow mounts hostPath or shares host namespaces
Walks spec.volumes[].hostPath and the raw spec.podSpecPatch string for hostNetwork, hostPID, hostIPC, and hostPath.
Recommended action
Use emptyDir or PVC-backed volumes instead of hostPath. Drop hostNetwork: true / hostPID: true / hostIPC: true from any inline podSpecPatch. A hostPath mount of /var/run/docker.sock or / lets the workflow break out of the pod and act as the underlying node.
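A sketch of the pod-scoped alternative (names illustrative):

```yaml
spec:
  volumes:
    - name: workdir
      emptyDir: {}                  # pod-scoped scratch space instead of hostPath
  templates:
    - name: build
      container:
        image: builder:1.0
        volumeMounts:
          - name: workdir
            mountPath: /work
```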
ARGO-005: Argo input parameter interpolated unsafely in script / args
Fires on any {{inputs.parameters.X}}, {{workflow.parameters.X}}, or {{item.X}} token inside a script.source body or a container.args string that isn't already wrapped in quotes. Doesn't fire on the env-var indirection pattern, which is safe.
Recommended action
Don't interpolate {{inputs.parameters.<name>}} directly into script.source or container.args. Argo substitutes the value before the shell parses it, so a parameter containing ; rm -rf / runs as shell. Pass the parameter via env: (value: '{{inputs.parameters.<name>}}') and reference the env var quoted in the script ("$NAME"); or use inputs.artifacts for file payloads.
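The env-var indirection pattern the check treats as safe, sketched (parameter and template names illustrative):

```yaml
templates:
  - name: greet
    inputs:
      parameters:
        - name: title
    script:
      image: alpine:3.18
      command: [sh]
      env:
        - name: TITLE
          value: '{{inputs.parameters.title}}'   # substitution lands in an env var, not script text
      source: |
        echo "Title: $TITLE"                     # quoted env reference; the shell never parses the raw value
```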
ARGO-006: Literal secret value in Argo template env or parameter default
Strong matches: AWS access keys, GitHub PATs, JWTs. Weak match: env var name suggests a secret (*_TOKEN, *_KEY, *PASSWORD, *SECRET) and the value is a non-empty literal rather than an interpolation.
Recommended action
Mount secrets via env.valueFrom.secretKeyRef (or a volumes: Secret mount) instead of writing the value into env.value or arguments.parameters[].value. Workflow manifests are committed to git and cluster-readable; literal values leak through normal access paths.
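A secretKeyRef sketch (Secret name and key are hypothetical):

```yaml
env:
  - name: API_TOKEN
    valueFrom:
      secretKeyRef:
        name: ci-credentials       # hypothetical Secret, created out of band
        key: api-token
```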
ARGO-007: Argo workflow has no activeDeadlineSeconds
Applies to Workflow, CronWorkflow, WorkflowTemplate, and ClusterWorkflowTemplate. The field can sit at the workflow level or on individual templates.
Recommended action
Set spec.activeDeadlineSeconds (or spec.workflowSpec.activeDeadlineSeconds on a CronWorkflow) so a hung step can't pin the workflow controller's reconcile cycle indefinitely. Pick a value generous enough for the slowest legitimate run (e.g. 3600 for a typical pipeline, 21600 for ML training). Per-template activeDeadlineSeconds is also accepted as evidence of intent.
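A minimal passing sketch:

```yaml
spec:
  entrypoint: main
  activeDeadlineSeconds: 3600      # generous bound for the slowest legitimate run
```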
ARGO-008: Argo script source pipes remote install or disables TLS
Walks script.source and joined container.args text with the cross-provider CURL_PIPE_RE and TLS_BYPASS_RE regexes.
Recommended action
Replace curl ... | sh with a download-then-verify-then-execute pattern. Drop TLS-bypass flags (curl -k, git config http.sslverify false); install the missing CA into the template image instead. Both forms let an attacker controlling DNS / a transparent proxy substitute the script the workflow runs.
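One possible download-then-verify shape inside a script template; the URL and checksum are placeholders for values you control:

```yaml
script:
  image: alpine:3.18
  command: [sh]
  source: |
    # download, verify a pinned checksum, then execute — instead of `curl … | sh`
    wget -q https://example.com/install.sh -O /tmp/install.sh
    echo "EXPECTED_SHA256  /tmp/install.sh" | sha256sum -c -
    sh /tmp/install.sh
```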
ARGO-009: Artifacts not signed (no cosign/sigstore step)
Detection mirrors GHA-006 / TKN-009 / BK-009: the shared signing-token catalog (cosign, sigstore, slsa-github-generator, slsa-framework, notation-sign) is searched across every string in each Argo document. Fires only on artifact-producing Workflows / WorkflowTemplates (those that invoke docker build / docker push / kaniko / helm upgrade / aws s3 sync / etc.), so lint-only Workflows don't trip it.
Recommended action
Add a cosign step to the Workflow. The most common shape is a final sign template that runs cosign sign --yes <repo>@sha256:<digest> after the build. Sign by digest, not tag, so a re-pushed tag can't bypass the signature.
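A sketch of a final sign template; the registry path is illustrative and `<digest>` stands in for the real build output:

```yaml
- name: sign
  container:
    image: ghcr.io/sigstore/cosign/cosign:v2.2.4   # version tag illustrative
    command: [cosign]
    args: [sign, --yes, 'registry.example.com/app@sha256:<digest>']   # sign by digest, not tag
```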
ARGO-010: No SBOM generated for build artifacts
An SBOM (CycloneDX or SPDX) records every component baked into the build. Without one, post-incident triage can't answer did this CVE ship? for a given artifact. Detection uses the shared SBOM-token catalog: syft, cyclonedx, cdxgen, spdx-tools, microsoft/sbom-tool. Fires only on artifact-producing Workflows.
Recommended action
Add an SBOM-generation template. syft <artifact> -o cyclonedx-json > /tmp/sbom.json runs in any standard container; cyclonedx-cli and cdxgen are alternative producers. Persist the SBOM as an output artifact so downstream templates and consumers can read it.
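A syft-based template sketch (image reference and artifact name illustrative):

```yaml
- name: sbom
  container:
    image: anchore/syft:latest     # illustrative; pin by digest in real use
    args: ['registry.example.com/app:1.0', '-o', 'cyclonedx-json=/tmp/sbom.json']
  outputs:
    artifacts:
      - name: sbom
        path: /tmp/sbom.json       # persisted so downstream templates can read it
```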
ARGO-011: No SLSA provenance attestation produced
Provenance generation is distinct from signing. A signed artifact proves who published it; a provenance attestation proves where / how it was built. Detection uses the shared provenance-token catalog (slsa-framework, cosign attest, in-toto, witness run, attest-build-provenance).
Recommended action
Add a cosign attest --predicate slsa.json --type slsaprovenance <ref> step after the build template, or use witness run to record the build environment. Publish the attestation alongside the artifact so consumers can verify how it was built, not just who signed it.
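A sketch of the attestation step; the predicate path, registry, and `<digest>` are placeholders:

```yaml
- name: attest
  container:
    image: ghcr.io/sigstore/cosign/cosign:v2.2.4   # version tag illustrative
    command: [cosign]
    args: [attest, --yes, --predicate, /tmp/slsa.json, --type, slsaprovenance,
           'registry.example.com/app@sha256:<digest>']
```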
ARGO-012: No vulnerability scanning step
Vulnerability scanning sits at a different layer from signing and SBOM. It answers does this artifact ship a known CVE? rather than can we verify what it is?. Detection uses the shared vuln-scan-token catalog: trivy, grype, snyk, npm-audit, pip-audit, osv-scanner, govulncheck, anchore, codeql-action, semgrep, bandit, checkov, tfsec. Walks every Argo document and passes if any document includes a scanner reference.
Recommended action
Add a vulnerability scanner template. trivy fs /workdir for source / filesystem; trivy image <ref> for container images. grype, snyk, npm audit, pip-audit are alternatives. Fail the template on findings above a chosen severity so a regression blocks the merge instead of shipping.
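A trivy template sketch; the image reference and severity cut are illustrative choices:

```yaml
- name: scan
  container:
    image: aquasec/trivy:latest    # illustrative; pin by digest in real use
    args: [image, --exit-code, '1', --severity, 'HIGH,CRITICAL',
           'registry.example.com/app:1.0']   # non-zero exit fails the template on findings
```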
ARGO-013: Argo workflow does not opt out of SA token automount
Companion to ARGO-003 (default ServiceAccount). The default SA only matters when its token is mounted; an explicit automountServiceAccountToken: false removes the token from the pod regardless of which SA the pod is bound to. Detection: workflow passes when the spec sets it to false AND every template either inherits that or sets its own automountServiceAccountToken: false. A template with it explicitly true (or unset against an unset spec-level value) is the failing shape.
Known false-positive modes
- Templates that genuinely need to call the Kubernetes API (GitOps pull,
  `kubectl apply` from inside the workflow). Set `automountServiceAccountToken: true` on that template specifically and bind it to a least-privilege SA; the rule then fires only on the broad spec-level absence, which is the actual gap.
Recommended action
Set spec.automountServiceAccountToken: false on the Workflow / WorkflowTemplate, or per-template (templates[].automountServiceAccountToken: false) on any template that doesn't need to talk to the Kubernetes API. An explicit false keeps a compromised step from using the workflow's SA token to escalate inside the cluster. Even when the SA itself is hardened (ARGO-003), a token automounted into every pod widens the leak surface.
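The passing shape, sketched (template names and images illustrative):

```yaml
spec:
  automountServiceAccountToken: false      # spec-level opt-out, inherited by every template
  templates:
    - name: build
      container:
        image: builder:1.0
    - name: deploy
      automountServiceAccountToken: true   # only the template that needs the API gets a token
      container:
        image: bitnami/kubectl:1.29        # illustrative
```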
ARGO-014: Argo template script runs unpinned package install
Detection reuses the cross-provider primitives PKG_INSECURE_RE and PKG_NO_LOCKFILE_RE from checks/base.py. Same rule pack already exists for GHA (GHA-021 / GHA-022), GitLab (GL-021 / GL-022), Bitbucket / Azure DevOps / Jenkins / CircleCI / Cloud Build / Buildkite / Tekton / Drone. Argo was a gap; this closes it.
Walks script.source plus joined container.args / container.command text per template. Steps and tasks across DAG / steps templates are equally in scope because they all reduce to a container with a shell payload.
Known false-positive modes
- Bootstrap-stage installs that intentionally pull latest (`apt-get install -y curl` for a tooling image rebuild) sometimes legitimately bypass the lockfile. Suppress via ignore-file scoped to the specific template name.
Recommended action
Pin every package install to a lockfile or a checksum-verified version. npm ci (not npm install), yarn install --frozen-lockfile, pip install -r requirements.txt --require-hashes, bundle install --frozen. Don't use --trusted-host / --no-verify / a non-HTTPS index URL — those bypass TLS or trust validation entirely (ARGO-008 covers the TLS subset; this rule covers the lockfile subset).
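A lockfile-pinned script template sketch (image tag illustrative):

```yaml
script:
  image: node:20             # illustrative; pin by digest in real use
  command: [sh]
  source: |
    npm ci                   # installs exactly what package-lock.json pins; fails on drift
```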
ARGO-015: Input artifact pulls from an insecure (non-HTTPS) URL
Argo Workflows resolves input artifacts before the template's container starts. The source can be http, git, s3, gcs, azure, hdfs, oss, or raw. The rule fires when:
- `http.url` starts with `http://` (cleartext fetch)
- `git.repo` starts with `git://` (legacy unauthenticated git protocol, no integrity)
- `s3.endpoint` is set with `insecure: true` (explicit TLS bypass)
Other artifact sources are skipped: an OCI / S3 / GCS pull carries its own integrity / signing posture that lives outside this rule.
Known false-positive modes
- Local-mirror development workflows occasionally use `http://` against an internal registry that's only reachable from a private network. The integrity guarantee then relies on network isolation rather than transport encryption; suppress on the specific template name when this is the deliberate shape.
Recommended action
Pull every input artifact over HTTPS. Replace http:// with https:// in any http.url: block, and use https:// git remote URLs instead of git://, ssh://-without-key-pinning, or anonymous-cleartext access. Plain HTTP fetches let any on-path attacker swap the artifact bytes for a different payload, and Argo will execute whatever bytes arrive without an integrity check unless the artifact source provides one (S3 + checksum, OCI + digest). If the artifact source genuinely doesn't ship over HTTPS (a legacy internal mirror), wrap it in a CDN or proxy that adds TLS, then pin the artifact by checksum on the consuming side.
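A passing input-artifact sketch (URL and artifact name are hypothetical):

```yaml
inputs:
  artifacts:
    - name: config
      path: /tmp/config.tar.gz
      http:
        url: https://artifacts.example.com/config.tar.gz   # HTTPS, illustrative host
```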
TAINT-007: Untrusted input flows across templates via Argo outputs.parameters
Detection walks every workflow document with spec.templates. Pass 1 looks for templates that declare outputs.parameters AND whose inline script.source interpolates {{inputs.parameters.<X>}}, recording the template's outputs as tainted. Pass 2 walks each template's DAG / Steps orchestrator for tasks whose arguments.parameters[*].value is {{tasks.<producer>.outputs.parameters.<X>}} matching a recorded leak. Pass 3 walks the consumer task's referenced template for the matching {{inputs.parameters.<consumer-param>}} reference in its script body and emits one path per match.
v1 limitations: workflowTemplateRef: cross-document references aren't resolved (would need the same machinery as the GHA --resolve-remote flow). onExit: exit handlers aren't yet walked.
Known false-positive modes
- If the producer template runs a sanitiser between the tainted `{{inputs.parameters.X}}` interpolation and the output-path write, the consumer is no longer exploitable but TAINT-007 still fires. Suppress via ignore-file scoped to the consumer template name when this is the deliberate shape; the sanitiser is then load-bearing.
Recommended action
Sanitise the value at the producer template before it lands in an output parameter. The canonical safe pattern is to surface {{inputs.parameters.<X>}} into a quoted shell variable, run a sanitiser (tr -dc 'a-zA-Z0-9 ' for a freeform title), and only then redirect the cleaned value to the output path. The consumer template should still reference {{inputs.parameters.<name>}} quoted ("{{inputs.parameters.title}}") and never inline into a command without re-quoting. Removing the cross-template forwarding is the strongest fix; if the value genuinely needs to flow downstream, validate the sanitiser is doing what you think before relying on it.
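A producer-side sanitiser sketch, assuming a freeform `title` parameter (names illustrative):

```yaml
- name: producer
  inputs:
    parameters:
      - name: title
  script:
    image: alpine:3.18
    command: [sh]
    env:
      - name: TITLE
        value: '{{inputs.parameters.title}}'   # env indirection, not inline interpolation
    source: |
      # sanitise before the value lands in the output parameter
      printf '%s' "$TITLE" | tr -dc 'a-zA-Z0-9 ' > /tmp/title.txt
  outputs:
    parameters:
      - name: clean-title
        valueFrom:
          path: /tmp/title.txt
```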
Adding a new Argo Workflows check
- Create a new module at `pipeline_check/core/checks/argo/rules/argoNNN_<name>.py` exporting a top-level `RULE = Rule(...)` and a `check(ctx: ArgoContext) -> Finding` function. The orchestrator auto-discovers `RULE` and calls `check` with the `ArgoContext`.
- Add a mapping for the new ID in `pipeline_check/core/standards/data/owasp_cicd_top_10.py` (and any other standard that applies).
- Drop unsafe/safe snippets at `tests/fixtures/per_check/argo/ARGO-NNN.{unsafe,safe}.yml` and add a `CheckCase` entry in `tests/test_per_check_real_examples.py::CASES`.
- Regenerate this doc: