The MITRE Security Automation Framework (SAF) Command Line Interface (CLI) brings together applications, techniques, libraries, and tools developed by MITRE and the security community to streamline security automation for systems and DevOps pipelines.
The SAF CLI is the successor to Heimdall Tools and InSpec Tools.
The SAF CLI can be installed and kept up to date using npm, which is included with most versions of NodeJS.
npm install -g @mitre/saf
To update the SAF CLI with npm:
npm update -g @mitre/saf
The SAF CLI can be installed and kept up to date using brew.
brew install mitre/saf/saf-cli
To update the SAF CLI with brew:
brew upgrade mitre/saf/saf-cli
On Linux and Mac:
The docker command below can be used to run the SAF CLI one time, where <arguments> contains the command and flags you want to run, for example --version or view summary -i hdf-results.json.
docker run -it -v$(pwd):/share mitre/saf <arguments>
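For example, assuming an HDF results file named hdf-results.json is in your current directory (and therefore visible inside the container through the /share mount):
docker run -it -v$(pwd):/share mitre/saf view summary -i hdf-results.json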
To run the SAF CLI with a persistent shell for one or more commands, use the following, then run each full command, for example saf --version or saf view summary -i hdf-results.json. You can change the entrypoint you wish to use; for example, run with --entrypoint sh to open in a shell terminal. If the specified entrypoint is not found, try using the full path, such as --entrypoint /bin/bash.
docker run --rm -it --entrypoint bash -v$(pwd):/share mitre/saf
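Inside that shell you can then run full commands against the mounted directory, for example (again assuming hdf-results.json is in your current directory):
saf --version
saf view summary -i hdf-results.json
exit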
On Windows:
The docker command below can be used to run the SAF CLI one time, where <arguments> contains the command and flags you want to run, for example --version or view summary -i hdf-results.json.
docker run -it -v%cd%:/share mitre/saf <arguments>
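For example, assuming hdf-results.json is in your current directory:
docker run -it -v%cd%:/share mitre/saf view summary -i hdf-results.json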
To run the SAF CLI with a persistent shell for one or more commands, use the following, then run each full command, for example saf --version or saf view summary -i hdf-results.json. You can change the entrypoint you wish to use; for example, run with --entrypoint sh to open in a shell terminal. If the specified entrypoint is not found, try using the full path, such as --entrypoint /bin/bash.
docker run --rm -it --entrypoint sh -v%cd%:/share mitre/saf
NOTE: Remember to use Docker CLI flags as necessary to run the various subcommands. For example, to run the emasser configure subcommand, you need to pass in a volume that contains your certificates and where you can store the resultant .env file. Furthermore, you need to pass in the -i and -t flags to enable interactivity and a pseudo-TTY.
docker run -it -v "$(pwd)":/share mitre/saf emasser configure
Other commands might not require the -i or -t flags and instead only need a bind-mounted volume, such as a file-based convert.
docker run --rm -v "$(pwd)":/share mitre/saf convert -i test/sample_data/trivy/sample_input_report/trivy-image_golang-1.12-alpine_sample.json -o test.json
Other flags exist to open up network ports or pass through environment variables, so make sure to use whichever ones are required to successfully run a command.
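As an illustrative sketch (the port mapping and environment variable below are placeholders, not settings the SAF CLI itself requires), such flags are added to the same docker run pattern:
docker run --rm -it -p 3000:3000 -e MY_VAR=value -v "$(pwd)":/share mitre/saf <arguments>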
To update the SAF CLI with docker:
docker pull mitre/saf:latest
To install the latest release of the SAF CLI on Windows, download and run the most recent installer for your system architecture from the Releases page.
To update the SAF CLI on Windows, uninstall any existing version from your system and then download and run the most recent installer for your system architecture from the Releases page.
Attest to 'Not Reviewed' controls: sometimes requirements can't be tested automatically by security tools and therefore require manual review, in which someone interviews people and/or examines the system to confirm (i.e., attest to) whether the control requirements have been satisfied.
attest create Create attestation files for use with `saf attest apply`
USAGE
$ saf attest create -o <attestation-file> [-i <hdf-json> -t <json | xlsx | yml | yaml>]
FLAGS
-h, --help Show CLI help.
-i, --input=<value> (optional) An input HDF file to search for controls
-o, --output=<value> (required) The output filename
-t, --format=<option> [default: json] (optional) The output file type
<options: json|xlsx|yml|yaml>
GLOBAL FLAGS
-L, --logLevel=<option> [default: info] Specify level for logging (if implemented by the CLI command)
<options: info|warn|debug|verbose>
--interactive Collect input tags interactively (not available on all CLI commands)
EXAMPLES
$ saf attest create -o attestation.json -i hdf.json
$ saf attest create -o attestation.xlsx -t xlsx
attest apply Apply one or more attestation files to one or more HDF results sets
USAGE
$ saf attest apply -i <input-hdf-json>... <attestation>... -o <output-hdf-path>
FLAGS
-h, --help Show CLI help.
-i, --input=<value>... (required) Your input HDF and Attestation file(s)
-o, --output=<value> (required) Output file or folder (for multiple executions)
GLOBAL FLAGS
-L, --logLevel=<option> [default: info] Specify level for logging (if implemented by the CLI command)
<options: info|warn|debug|verbose>
--interactive Collect input tags interactively (not available on all CLI commands)
EXAMPLES
$ saf attest apply -i hdf.json attestation.json -o new-hdf.json
$ saf attest apply -i hdf1.json hdf2.json attestation.xlsx -o outputDir
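Taken together, a typical attestation workflow (a minimal sketch reusing the example file names above) is to create an attestation file against a results set, apply it, and then review the updated results:
$ saf attest create -o attestation.json -i hdf.json
$ saf attest apply -i hdf.json attestation.json -o new-hdf.json
$ saf view summary -i new-hdf.json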
Translating your data to and from Heimdall Data Format (HDF) is done using the saf convert command.
Want to Recommend or Help Develop a Converter? See the wiki on how to get started.
convert anchoregrype2hdf Translate an Anchore Grype output file into an HDF results set
USAGE
$ saf convert anchoregrype2hdf -i <anchoregrype-json> -o <hdf-scan-results-json> [-h] [-w]
FLAGS
-h, --help Show CLI help.
-i, --input=<anchoregrype-json> (required) Input Anchore Grype file
-o, --output=<hdf-scan-results-json> (required) Output HDF JSON File
-w, --includeRaw Include raw data from the input Anchore Grype file
GLOBAL FLAGS
-L, --logLevel=<option> [default: info] Specify level for logging (if implemented by the CLI command)
<options: info|warn|debug|verbose>
--interactive Collect input tags interactively (not available on all CLI commands)
EXAMPLES
$ saf convert anchoregrype2hdf -i anchoregrype.json -o output-hdf-name.json
Note: Uploading findings into AWS Security Hub requires configuration of the AWS CLI (see the AWS documentation) or configuration of environment variables via Docker.
convert hdf2asff Translate a Heimdall Data Format JSON file into
AWS Security Findings Format JSON file(s) and/or
upload to AWS Security Hub
USAGE
$ saf convert hdf2asff -a <account-id> -r <region> -i <hdf-scan-results-json> -t <target> [-h] [-R] (-u [-I -C <certificate>] | [-o <asff-output-folder>])
FLAGS
-C, --certificate=<certificate> Trusted signing certificate file
-I, --insecure Disable SSL verification, this is insecure.
-R, --specifyRegionAttribute Manually specify the top-level `Region` attribute - SecurityHub
populates this attribute automatically and prohibits one from
updating it using `BatchImportFindings` or `BatchUpdateFindings`
-a, --accountId=<account-id> (required) AWS Account ID
-h, --help Show CLI help.
-i, --input=<hdf-scan-results-json> (required) Input HDF JSON File
-o, --output=<asff-output-folder> Output ASFF JSON Folder
-r, --region=<region> (required) SecurityHub Region
-t, --target=<target> (required) Unique name for target to track findings across time
-u, --upload Upload findings to AWS Security Hub
GLOBAL FLAGS
-L, --logLevel=<option> [default: info] Specify level for logging (if implemented by the CLI command)
<options: info|warn|debug|verbose>
--interactive Collect input tags interactively (not available on all CLI commands)
EXAMPLES
Send output to local file system
$ saf convert hdf2asff -i rhel7-scan_02032022A.json -a 123456789 -r us-east-1 -t rhel7_example_host -o rhel7.asff
Upload findings to AWS Security Hub
$ saf convert hdf2asff -i rds_mysql_i123456789scan_03042022A.json -a 987654321 -r us-west-1 -t Instance_i123456789 -u
Upload findings to AWS Security Hub and Send output to local file system
$ saf convert hdf2asff -i snyk_acme_project5_hdf_04052022A.json -a 2143658798 -r us-east-1 -t acme_project5 -o snyk_acme_project5 -u
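As noted above, when running the converter through Docker you can supply AWS credentials as environment variables instead of configuring the AWS CLI. A sketch using the standard AWS credential variables with placeholder values (reusing the Security Hub upload example above):
docker run --rm -e AWS_ACCESS_KEY_ID=<your-access-key-id> -e AWS_SECRET_ACCESS_KEY=<your-secret-access-key> -v "$(pwd)":/share mitre/saf convert hdf2asff -i rds_mysql_i123456789scan_03042022A.json -a 987654321 -r us-west-1 -t Instance_i123456789 -u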
Notice: HDF to Splunk requires configuration on the Splunk server. See Splunk Configuration.
convert hdf2splunk Translate and upload a Heimdall Data Format JSON file into a Splunk server
USAGE
$ saf convert hdf2splunk -i <hdf-scan-results-json> -H <host> -I <index> [-h] [-P <port>] [-s http|https] [-u <username> | -t <token>] [-p <password>] [-L info|warn|debug|verbose]
FLAGS
-H, --host=<host> (required) Splunk Hostname or IP
-I, --index=<index> (required) Splunk index to import HDF data into
-P, --port=<port> [default: 8089] Splunk management port (also known as the Universal Forwarder port)
-h, --help Show CLI help.
-i, --input=<hdf-scan-results-json> (required) Input HDF file
-p, --password=<password> Your Splunk password
-s, --scheme=<option> [default: https] HTTP Scheme used for communication with splunk
<options: http|https>
-t, --token=<token> Your Splunk API Token
-u, --username=<username> Your Splunk username
GLOBAL FLAGS
-L, --logLevel=<option> [default: info] Specify level for logging (if implemented by the CLI command)
<options: info|warn|debug|verbose>
--interactive Collect input tags interactively (not available on all CLI commands)
EXAMPLES
User name/password Authentication
$ saf convert hdf2splunk -i rhel7-results.json -H 127.0.0.1 -u admin -p Valid_password! -I hdf
Token Authentication
$ saf convert hdf2splunk -i rhel7-results.json -H 127.0.0.1 -t your.splunk.token -I hdf
For HDF Splunk Schema documentation, visit Heimdall converter schemas.
Previewing HDF Data Within Splunk:
An example of a full raw search query:
index="<<YOUR INDEX>>" meta.subtype=control | stats values(meta.filename) values(meta.filetype) list(meta.profile_sha256) values(meta.hdf_splunk_schema) first(meta.status) list(meta.status) list(meta.is_baseline) values(title) last(code) list(code) values(desc) values(descriptions.*) values(id) values(impact) list(refs{}.*) list(results{}.*) list(source_location{}.*) values(tags.*) by meta.guid id
| join meta.guid
[search index="<<YOUR INDEX>>" meta.subtype=header | stats values(meta.filename) values(meta.filetype) values(meta.hdf_splunk_schema) list(statistics.duration) list(platform.*) list(version) by meta.guid]
| join meta.guid
[search index="<<YOUR INDEX>>" meta.subtype=profile | stats values(meta.filename) values(meta.filetype) values(meta.hdf_splunk_schema) list(meta.profile_sha256) list(meta.is_baseline) last(summary) list(summary) list(sha256) list(supports{}.*) last(name) list(name) list(copyright) list(maintainer) list(copyright_email) last(version) list(version) list(license) list(title) list(parent_profile) list(depends{}.*) list(controls{}.*) list(attributes{}.*) list(status) by meta.guid]
An example of a formatted table search query:
index="<<YOUR INDEX>>" meta.subtype=control | stats values(meta.filename) values(meta.filetype) list(meta.profile_sha256) values(meta.hdf_splunk_schema) first(meta.status) list(meta.status) list(meta.is_baseline) values(title) last(code) list(code) values(desc) values(descriptions.*) values(id) values(impact) list(refs{}.*) list(results{}.*) list(source_location{}.*) values(tags.*) by meta.guid id
| join meta.guid
[search index="<<YOUR INDEX>>" meta.subtype=header | stats values(meta.filename) values(meta.filetype) values(meta.hdf_splunk_schema) list(statistics.duration) list(platform.*) list(version) by meta.guid]
| join meta.guid
[search index="<<YOUR INDEX>>" meta.subtype=profile | stats values(meta.filename) values(meta.filetype) values(meta.hdf_splunk_schema) list(meta.profile_sha256) list(meta.is_baseline) last(summary) list(summary) list(sha256) list(supports{}.*) last(name) list(name) list(copyright) list(maintainer) list(copyright_email) last(version) list(version) list(license) list(title) list(parent_profile) list(depends{}.*) list(controls{}.*) list(attributes{}.*) list(status) by meta.guid]
| rename values(meta.filename) AS "Results Set", values(meta.filetype) AS "Scan Type", list(statistics.duration) AS "Scan Duration", first(meta.status) AS "Control Status", list(results{}.status) AS "Test(s) Status", id AS "ID", values(title) AS "Title", values(desc) AS "Description", values(impact) AS "Impact", last(code) AS Code, values(descriptions.check) AS "Check", values(descriptions.fix) AS "Fix", values(tags.cci{}) AS "CCI IDs", list(results{}.code_desc) AS "Results Description", list(results{}.skip_message) AS "Results Skip Message (if applicable)", values(tags.nist{}) AS "NIST SP 800-53 Controls", last(name) AS "Scan (Profile) Name", last(summary) AS "Scan (Profile) Summary", last(version) AS "Scan (Profile) Version"
| table meta.guid "Results Set" "Scan Type" "Scan (Profile) Name" ID "NIST SP 800-53 Controls" Title "Control Status" "Test(s) Status" "Results Description" "Results Skip Message (if applicable)" Description Impact Severity Check Fix "CCI IDs" Code "Scan Duration" "Scan (Profile) Summary" "Scan (Profile) Version"
convert hdf2xccdf Translate an HDF file into an XCCDF XML
USAGE
$ saf convert hdf2xccdf -i <hdf-scan-results-json> -o <output-xccdf-xml> [-h]
FLAGS
-h, --help Show CLI help.
-i, --input=<hdf-scan-results-json> (required) Input HDF file
-o, --output=<output-xccdf-xml> (required) Output XCCDF XML File
GLOBAL FLAGS
-L, --logLevel=<option> [default: info] Specify level for logging (if implemented by the CLI command)
<options: info|warn|debug|verbose>
--interactive Collect input tags interactively (not available on all CLI commands)
EXAMPLES
$ saf convert hdf2xccdf -i hdf_input.json -o xccdf-results.xml
convert hdf2ckl Translate a Heimdall Data Format JSON file into a
DISA checklist file
USAGE
$ saf convert hdf2ckl -i <hdf-scan-results-json> -o <output-ckl> [-h] [-m <metadata>] [--profilename <value>] [--profiletitle <value>] [--version <value>] [--releasenumber <value>] [--releasedate <value>] [--marking <value>] [-H <value>] [-I <value>] [-M <value>] [-F <value>] [--targetcomment <value>] [--role Domain Controller|Member Server|None|Workstation] [--assettype Computing|Non-Computing] [--techarea |Application Review|Boundary Security|CDS Admin Review|CDS Technical Review|Database Review|Domain Name System (DNS)|Exchange Server|Host Based System Security (HBSS)|Internal Network|Mobility|Other Review|Releasable Networks (REL)|Releaseable Networks (REL)|Traditional Security|UNIX OS|VVOIP Review|Web Review|Windows OS] [--stigguid <value>] [--targetkey <value>] [--webdbsite <value> --webordatabase] [--webdbinstance <value>] [--vulidmapping gid|id]
FLAGS
-h, --help Show CLI help.
-i, --input=<value> (required) Input HDF file
-o, --output=<value> (required) Output CKL file
CHECKLIST METADATA FLAGS
-F, --fqdn=<value> Fully Qualified Domain Name
-H, --hostname=<value> The name assigned to the asset within the network
-I, --ip=<value> IP address
-M, --mac=<value> MAC address
-m, --metadata=<value> Metadata JSON file, generate one with "saf generate ckl_metadata"
--assettype=<option> The category or classification of the asset
<options: Computing|Non-Computing>
--marking=<value> A security classification or designation of the asset, indicating its sensitivity level
--profilename=<value> Profile name
--profiletitle=<value> Profile title
--releasedate=<value> Profile release date
--releasenumber=<value> Profile release number
--role=<option> The primary function or role of the asset within the network or organization
<options: Domain Controller|Member Server|None|Workstation>
--stigguid=<value> A unique identifier associated with the STIG for the asset
--targetcomment=<value> Additional comments or notes about the asset
--targetkey=<value> A unique key or identifier for the asset within the checklist or inventory system
--techarea=<option> The technical area or domain to which the asset belongs
<options: |Application Review|Boundary Security|CDS Admin Review|CDS Technical Review|Database Review|Domain Name System (DNS)|Exchange Server|Host Based System Security (HBSS)|Internal Network|Mobility|Other Review|Releasable Networks (REL)|Releaseable Networks (REL)|Traditional Security|UNIX OS|VVOIP Review|Web Review|Windows OS>
--version=<value> Profile version number
--vulidmapping=<option> Which type of control identifier to map to the checklist ID
<options: gid|id>
--webdbinstance=<value> The specific instance of the web application or database running on the server
--webdbsite=<value> The specific site or application hosted on the web or database server
--webordatabase Indicates whether the STIG is primarily for either a web or database server
DESCRIPTION
Translate a Heimdall Data Format JSON file into a DISA checklist file
EXAMPLES
$ saf convert hdf2ckl -i rhel7-results.json -o rhel7.ckl --fqdn reverseproxy.example.org --hostname reverseproxy --ip 10.0.0.3 --mac 12:34:56:78:90:AB
$ saf convert hdf2ckl -i rhel8-results.json -o rhel8.ckl -m rhel8-metadata.json
convert hdf2csv Translate a Heimdall Data Format JSON file into a
Comma Separated Values (CSV) file
USAGE
$ saf convert hdf2csv -i <hdf-scan-results-json> -o <output-csv> [-h] [-f <csv-fields>] [-t]
FLAGS
-f, --fields=<csv-fields> [default: All Fields] Fields to include in output CSV, separated by commas
-h, --help Show CLI help.
-i, --input=<hdf-scan-results-json> (required) Input HDF file
-o, --output=<output-csv> (required) Output CSV file
-t, --noTruncate Don't truncate fields longer than 32,767 characters (the cell limit in Excel)
GLOBAL FLAGS
-L, --logLevel=<option> [default: info] Specify level for logging (if implemented by the CLI command)
<options: info|warn|debug|verbose>
--interactive Collect input tags interactively (not available on all CLI commands)
EXAMPLES
Running the CLI interactively
$ saf convert hdf2csv --interactive
Providing flags at the command line
$ saf convert hdf2csv -i rhel7-results.json -o rhel7.csv --fields "Results Set,Status,ID,Title,Severity"
convert hdf2condensed Condensed format used by some community members
to pre-process data for elasticsearch and custom dashboards
USAGE
$ saf convert hdf2condensed -i <hdf-scan-results-json> -o <condensed-json> [-h]
FLAGS
-h, --help Show CLI help.
-i, --input=<hdf-scan-results-json> (required) Input HDF file
-o, --output=<condensed-json> (required) Output condensed JSON file
GLOBAL FLAGS
-L, --logLevel=<option> [default: info] Specify level for logging (if implemented by the CLI command)
<options: info|warn|debug|verbose>
--interactive Collect input tags interactively (not available on all CLI commands)
EXAMPLES
$ saf convert hdf2condensed -i rhel7-results.json -o rhel7-condensed.json
| Output | Use | Command |
|---|---|---|
| ASFF json | All the findings that will be fed into the mapper | aws securityhub get-findings > asff.json |
| AWS SecurityHub enabled standards json | Get all the enabled standards so you can get their identifiers | aws securityhub get-enabled-standards > asff_standards.json |
| AWS SecurityHub standard controls json | Get all the controls for a standard that will be fed into the mapper | aws securityhub describe-standards-controls --standards-subscription-arn "arn:aws:securityhub:us-east-1:123456789123:subscription/cis-aws-foundations-benchmark/v/1.2.0" > asff_cis_standard.json |
convert asff2hdf Translate an AWS Security Finding Format JSON into a
Heimdall Data Format JSON file(s)
USAGE
$ saf convert asff2hdf -o <hdf-output-folder> [-h] (-i <asff-json> [--securityhub <standard-json>]... | -a -r <region> [-I | -C <certificate>] [-t <target>]) [-L info|warn|debug|verbose]
FLAGS
-C, --certificate=<certificate> Trusted signing certificate file
-I, --insecure Disable SSL verification, this is insecure
-H, --securityHub=<standard-json> Additional input files to provide context that an ASFF file needs
such as the CIS AWS Foundations or AWS Foundational Security Best
Practices documents (in ASFF compliant JSON form)
-a, --aws Pull findings from AWS Security Hub
-h, --help Show CLI help.
-i, --input=<asff-json> (required if not using AWS) Input ASFF JSON file
-o, --output=<hdf-output-folder> (required) Output HDF JSON folder
-r, --region=<region> Security Hub region to pull findings from
-t, --target=<target>... Target ID(s) to pull from Security Hub (maximum 10), leave blank for non-HDF findings
GLOBAL FLAGS
-L, --logLevel=<option> [default: info] Specify level for logging (if implemented by the CLI command)
<options: info|warn|debug|verbose>
--interactive Collect input tags interactively (not available on all CLI commands)
EXAMPLES
Using ASFF JSON file
$ saf convert asff2hdf -i asff-findings.json -o output-folder-name
Using ASFF JSON file with additional input files
$ saf convert asff2hdf -i asff-findings.json --securityhub <standard-1-json> ... --securityhub <standard-n-json> -o output-folder-name
Using AWS to pull ASFF JSON findings
$ saf convert asff2hdf --aws -o out -r us-west-2 --target rhel7
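Using the files gathered with the AWS CLI commands in the table above (a sketch reusing those example file names):
$ saf convert asff2hdf -i asff.json --securityhub asff_cis_standard.json -o output-folder-name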
Note: Pulling AWS Config results data requires configuration of the AWS CLI (see the AWS documentation) or configuration of environment variables via Docker.
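If the AWS CLI is not yet configured, a minimal interactive setup (assuming you already have an AWS access key pair) is:
aws configure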