The MITRE Security Automation Framework (SAF) Command Line Interface (CLI) brings together applications, techniques, libraries, and tools developed by MITRE and the security community to streamline security automation for systems and DevOps pipelines.
The SAF CLI is the successor to Heimdall Tools and InSpec Tools.
The SAF CLI can be installed and kept up to date using npm, which is included with most versions of NodeJS.
npm install -g @mitre/saf
To update the SAF CLI with npm:
npm update -g @mitre/saf
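After installing or updating, you can confirm the CLI is available on your PATH and check which version you have:

saf --version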
Top
The SAF CLI can be installed and kept up to date using brew.
brew install mitre/saf/saf-cli
To update the SAF CLI with brew:
brew upgrade mitre/saf/saf-cli
Top
On Linux and Mac:
The docker command below can be used to run the SAF CLI once, where `<arguments>` contains the command and flags you want to run. For example: `--version` or `view summary -i hdf-results.json`.
docker run -it -v$(pwd):/share mitre/saf <arguments>
To run the SAF CLI with a persistent shell for one or more commands, use the following, then run each full command. For example: `saf --version` or `saf view summary -i hdf-results.json`. You can change the entrypoint you wish to use. For example, run with `--entrypoint sh` to open in a shell terminal. If the specified entrypoint is not found, try using a path such as `--entrypoint /bin/bash`.
docker run --rm -it --entrypoint bash -v$(pwd):/share mitre/saf
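As a sketch of such a session (assuming an `hdf-results.json` file exists in your current directory, which is bind-mounted at `/share`):

docker run --rm -it --entrypoint bash -v$(pwd):/share mitre/saf
# inside the container, run full commands against files under /share
saf --version
saf view summary -i /share/hdf-results.json
exit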
On Windows:
The docker command below can be used to run the SAF CLI once, where `<arguments>` contains the command and flags you want to run. For example: `--version` or `view summary -i hdf-results.json`.
docker run -it -v%cd%:/share mitre/saf <arguments>
To run the SAF CLI with a persistent shell for one or more commands, use the following, then run each full command. For example: `saf --version` or `saf view summary -i hdf-results.json`. You can change the entrypoint you wish to use. For example, run with `--entrypoint sh` to open in a shell terminal. If the specified entrypoint is not found, try using a path such as `--entrypoint /bin/bash`.
docker run --rm -it --entrypoint sh -v%cd%:/share mitre/saf
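Note that `%cd%` expands to the current directory only in the Windows Command Prompt (cmd.exe); if you are running from PowerShell, use its current-directory variable instead, e.g.:

docker run -it -v ${PWD}:/share mitre/saf <arguments>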
Notes:
Remember to use Docker CLI flags as necessary to run the various subcommands.
For example, to run the `emasser configure` subcommand, you need to pass in a volume that contains your certificates and where the generated .env file can be stored. Additionally, you need to pass in flags to enable a pseudo-TTY and interactivity.
docker run -it -v "$(pwd)":/share mitre/saf emasser configure
Other commands may not require the `-i` or `-t` flags and instead only need a bind-mounted volume, such as a file-based `convert`.
docker run --rm -v "$(pwd)":/share mitre/saf convert -i test/sample_data/trivy/sample_input_report/trivy-image_golang-1.12-alpine_sample.json -o test.json
Other flags exist to open up network ports or pass through environment variables, so make sure to use whichever flags are necessary to successfully run a command.
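As an illustrative sketch only (the environment variable name and port mapping below are hypothetical, not required by any particular subcommand), such flags look like:

docker run --rm -it -v "$(pwd)":/share -e MY_API_KEY -p 3000:3000 mitre/saf <arguments>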
To update the SAF CLI with docker:
docker pull mitre/saf:latest
Top
To install the latest release of the SAF CLI on Windows, download and run the most recent installer for your system architecture from the Releases page.
To update the SAF CLI on Windows, uninstall any existing version from your system, then download and run the most recent installer for your system architecture from the Releases page.
Top
Attest to "Not Reviewed" controls: sometimes a security tool cannot automatically test a requirement, so it must be reviewed manually, whereby someone interviews people and/or examines the system to confirm (i.e., attest to) whether the control requirements have been satisfied.
attest create Create attestation files for use with `saf attest apply`
USAGE
$ saf attest create -o <attestation-file> [-i <hdf-json> -t <json | xlsx | yml | yaml>]
FLAGS
-h, --help Show CLI help.
-i, --input=<value> (optional) An input HDF file to search for controls
-o, --output=<value> (required) The output filename
-t, --format=<option> [default: json] (optional) The output file type
<options: json|xlsx|yml|yaml>
GLOBAL FLAGS
-L, --logLevel=<option> [default: info] Specify level for logging (if implemented by the CLI command)
<options: info|warn|debug|verbose>
--interactive Collect input tags interactively (not available on all CLI commands)
EXAMPLES
$ saf attest create -o attestation.json -i hdf.json
$ saf attest create -o attestation.xlsx -t xlsx
Top
attest apply Apply one or more attestation files to one or more HDF results sets
USAGE
$ saf attest apply -i <input-hdf-json>... <attestation>... -o <output-hdf-path>
FLAGS
-h, --help Show CLI help.
-i, --input=<value>... (required) Your input HDF and Attestation file(s)
-o, --output=<value> (required) Output file or folder (for multiple executions)
GLOBAL FLAGS
-L, --logLevel=<option> [default: info] Specify level for logging (if implemented by the CLI command)
<options: info|warn|debug|verbose>
--interactive Collect input tags interactively (not available on all CLI commands)
EXAMPLES
$ saf attest apply -i hdf.json attestation.json -o new-hdf.json
$ saf attest apply -i hdf1.json hdf2.json attestation.xlsx -o outputDir
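Taken together with `attest create`, a minimal end-to-end workflow (using the example filenames above) is:

saf attest create -o attestation.json -i hdf.json
saf attest apply -i hdf.json attestation.json -o new-hdf.json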
Top
Use the `saf convert` command to convert data to or from the Heimdall Data Format (HDF).
Want to recommend or help develop a converter? See the wiki on how to get started.
Top
convert anchoregrype2hdf Translate an Anchore Grype output file into an HDF results set
USAGE
$ saf convert anchoregrype2hdf -i <anchoregrype-json> -o <hdf-scan-results-json> [-h] [-w]
FLAGS
-h, --help Show CLI help.
-i, --input=<anchoregrype-json> (required) Input Anchore Grype file
-o, --output=<hdf-scan-results-json> (required) Output HDF JSON File
-w, --includeRaw Include raw data from the input Anchore Grype file
GLOBAL FLAGS
-L, --logLevel=<option> [default: info] Specify level for logging (if implemented by the CLI command)
<options: info|warn|debug|verbose>
--interactive Collect input tags interactively (not available on all CLI commands)
EXAMPLES
$ saf convert anchoregrype2hdf -i anchoregrype.json -o output-hdf-name.json
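To produce a suitable input file in the first place, you would typically run Grype with JSON output (a sketch assuming the standalone `grype` CLI is installed; the image name is illustrative):

grype alpine:latest -o json > anchoregrype.json
saf convert anchoregrype2hdf -i anchoregrype.json -o output-hdf-name.json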
Note: Uploading findings to AWS Security Hub requires configuration of the AWS CLI (see the AWS documentation) or configured environment variables if running via Docker.
convert hdf2asff Translate a Heimdall Data Format JSON file into
AWS Security Findings Format JSON file(s) and/or
upload to AWS Security Hub
USAGE
$ saf convert hdf2asff -a <account-id> -r <region> -i <hdf-scan-results-json> -t <target> [-h] [-R] (-u [-I -C <certificate>] | [-o <asff-output-folder>])
FLAGS
-C, --certificate=<certificate> Trusted signing certificate file
-I, --insecure Disable SSL verification, this is insecure.
-R, --specifyRegionAttribute Manually specify the top-level `Region` attribute - SecurityHub
populates this attribute automatically and prohibits one from
updating it using `BatchImportFindings` or `BatchUpdateFindings`
-a, --accountId=<account-id> (required) AWS Account ID
-h, --help Show CLI help.
-i, --input=<hdf-scan-results-json> (required) Input HDF JSON File
-o, --output=<asff-output-folder> Output ASFF JSON Folder
-r, --region=<region> (required) SecurityHub Region
-t, --target=<target> (required) Unique name for target to track findings across time
-u, --upload Upload findings to AWS Security Hub
GLOBAL FLAGS
-L, --logLevel=<option> [default: info] Specify level for logging (if implemented by the CLI command)
<options: info|warn|debug|verbose>
--interactive Collect input tags interactively (not available on all CLI commands)
EXAMPLES
Send output to local file system
$ saf convert hdf2asff -i rhel7-scan_02032022A.json -a 123456789 -r us-east-1 -t rhel7_example_host -o rhel7.asff
Upload findings to AWS Security Hub
$ saf convert hdf2asff -i rds_mysql_i123456789scan_03042022A.json -a 987654321 -r us-west-1 -t Instance_i123456789 -u
Upload findings to AWS Security Hub and Send output to local file system
$ saf convert hdf2asff -i snyk_acme_project5_hdf_04052022A.json -a 2143658798 -r us-east-1 -t acme_project5 -o snyk_acme_project5 -u
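Per the note above, the upload examples assume AWS credentials are already available. One way to provide them (a sketch; the credential values are yours to supply, and the environment variable names are the standard AWS CLI ones) is to run `aws configure` once on the host, or to pass the variables through when running via Docker:

aws configure
docker run --rm -v "$(pwd)":/share -e AWS_ACCESS_KEY_ID -e AWS_SECRET_ACCESS_KEY -e AWS_DEFAULT_REGION mitre/saf convert hdf2asff -i rhel7-scan_02032022A.json -a 123456789 -r us-east-1 -t rhel7_example_host -u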
Top
Note: HDF to Splunk requires configuration on the Splunk server. See Splunk Configuration.
convert hdf2splunk Translate and upload a Heimdall Data Format JSON file into a Splunk server
USAGE
$ saf convert hdf2splunk -i <hdf-scan-results-json> -H <host> -I <index> [-h] [-P <port>] [-s http|https] [-u <username> | -t <token>] [-p <password>] [-L info|warn|debug|verbose]
FLAGS
-H, --host=<host> (required) Splunk Hostname or IP
-I, --index=<index> (required) Splunk index to import HDF data into
-P, --port=<port> [default: 8089] Splunk management port (also known as the Universal Forwarder port)
-h, --help Show CLI help.
-i, --input=<hdf-scan-results-json> (required) Input HDF file
-p, --password=<password> Your Splunk password
-s, --scheme=<option> [default: https] HTTP Scheme used for communication with splunk
<options: http|https>
-t, --token=<token> Your Splunk API Token
-u, --username=<username> Your Splunk username
GLOBAL FLAGS
-L, --logLevel=<option> [default: info] Specify level for logging (if implemented by the CLI command)
<options: info|warn|debug|verbose>
--interactive Collect input tags interactively (not available on all CLI commands)
EXAMPLES
User name/password Authentication
$ saf convert hdf2splunk -i rhel7-results.json -H 127.0.0.1 -u admin -p Valid_password! -I hdf
Token Authentication
$ saf convert hdf2splunk -i rhel7-results.json -H 127.0.0.1 -t your.splunk.token -I hdf
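If your Splunk management endpoint differs from the defaults, the scheme and port flags shown above can be combined with either authentication method (a sketch with an illustrative hostname):

saf convert hdf2splunk -i rhel7-results.json -H splunk.example.org -P 8089 -s https -t your.splunk.token -I hdf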
For HDF Splunk Schema documentation, visit the Heimdall converter schemas.
Previewing HDF data within Splunk:
An example of a full raw search query:
index = " <<YOUR INDEX>> " meta . subtype = control | stats values ( meta . filename ) values ( meta . filetype ) list( meta . profile_sha256 ) values ( meta . hdf_splunk_schema ) first( meta . status ) list( meta . status ) list( meta . is_baseline ) values (title) last(code) list(code) values ( desc ) values (descriptions. * ) values (id) values (impact) list(refs{}. * ) list(results{}. * ) list(source_location{}. * ) values (tags. * ) by meta . guid id
| join meta . guid
[search index = " <<YOUR INDEX>> " meta . subtype = header | stats values ( meta . filename ) values ( meta . filetype ) values ( meta . hdf_splunk_schema ) list( statistics . duration ) list(platform. * ) list(version) by meta . guid ]
| join meta . guid
[search index = " <<YOUR INDEX>> " meta . subtype = profile | stats values ( meta . filename ) values ( meta . filetype ) values ( meta . hdf_splunk_schema ) list( meta . profile_sha256 ) list( meta . is_baseline ) last(summary) list(summary) list(sha256) list(supports{}. * ) last(name) list(name) list(copyright) list(maintainer) list(copyright_email) last(version) list(version) list(license) list(title) list(parent_profile) list(depends{}. * ) list(controls{}. * ) list(attributes{}. * ) list(status) by meta . guid ]
An example of a formatted table search query:
index = " <<YOUR INDEX>> " meta . subtype = control | stats values ( meta . filename ) values ( meta . filetype ) list( meta . profile_sha256 ) values ( meta . hdf_splunk_schema ) first( meta . status ) list( meta . status ) list( meta . is_baseline ) values (title) last(code) list(code) values ( desc ) values (descriptions. * ) values (id) values (impact) list(refs{}. * ) list(results{}. * ) list(source_location{}. * ) values (tags. * ) by meta . guid id
| join meta . guid
[search index = " <<YOUR INDEX>> " meta . subtype = header | stats values ( meta . filename ) values ( meta . filetype ) values ( meta . hdf_splunk_schema ) list( statistics . duration ) list(platform. * ) list(version) by meta . guid ]
| join meta . guid
[search index = " <<YOUR INDEX>> " meta . subtype = profile | stats values ( meta . filename ) values ( meta . filetype ) values ( meta . hdf_splunk_schema ) list( meta . profile_sha256 ) list( meta . is_baseline ) last(summary) list(summary) list(sha256) list(supports{}. * ) last(name) list(name) list(copyright) list(maintainer) list(copyright_email) last(version) list(version) list(license) list(title) list(parent_profile) list(depends{}. * ) list(controls{}. * ) list(attributes{}. * ) list(status) by meta . guid ]
| rename values ( meta . filename ) AS " Results Set " , values ( meta . filetype ) AS " Scan Type " , list( statistics . duration ) AS " Scan Duration " , first( meta . status ) AS " Control Status " , list(results{}.status) AS " Test(s) Status " , id AS " ID " , values (title) AS " Title " , values ( desc ) AS " Description " , values (impact) AS " Impact " , last(code) AS Code, values ( descriptions . check ) AS " Check " , values ( descriptions . fix ) AS " Fix " , values ( tags . cci {}) AS " CCI IDs " , list(results{}.code_desc) AS " Results Description " , list(results{}.skip_message) AS " Results Skip Message (if applicable) " , values ( tags . nist {}) AS " NIST SP 800-53 Controls " , last(name) AS " Scan (Profile) Name " , last(summary) AS " Scan (Profile) Summary " , last(version) AS " Scan (Profile) Version "
| table meta . guid " Results Set " " Scan Type " " Scan (Profile) Name " ID " NIST SP 800-53 Controls " Title " Control Status " " Test(s) Status " " Results Description " " Results Skip Message (if applicable) " Description Impact Severity Check Fix " CCI IDs " Code " Scan Duration " " Scan (Profile) Summary " " Scan (Profile) Version "
Top
convert hdf2xccdf Translate an HDF file into an XCCDF XML
USAGE
$ saf convert hdf2xccdf -i <hdf-scan-results-json> -o <output-xccdf-xml> [-h]
FLAGS
-h, --help Show CLI help.
-i, --input=<hdf-scan-results-json> (required) Input HDF file
-o, --output=<output-xccdf-xml> (required) Output XCCDF XML File
GLOBAL FLAGS
-L, --logLevel=<option> [default: info] Specify level for logging (if implemented by the CLI command)
<options: info|warn|debug|verbose>
--interactive Collect input tags interactively (not available on all CLI commands)
EXAMPLES
$ saf convert hdf2xccdf -i hdf_input.json -o xccdf-results.xml
Top
convert hdf2ckl Translate a Heimdall Data Format JSON file into a
DISA checklist file
USAGE
$ saf convert hdf2ckl -i <hdf-scan-results-json> -o <output-ckl> [-h] [-m <metadata>] [--profilename <value>] [--profiletitle <value>] [--version <value>] [--releasenumber <value>] [--releasedate <value>] [--marking <value>] [-H <value>] [-I <value>] [-M <value>] [-F <value>] [--targetcomment <value>] [--role Domain Controller|Member Server|None|Workstation] [--assettype Computing|Non-Computing] [--techarea |Application Review|Boundary Security|CDS Admin Review|CDS Technical Review|Database Review|Domain Name System (DNS)|Exchange Server|Host Based System Security (HBSS)|Internal Network|Mobility|Other Review|Releasable Networks (REL)|Releaseable Networks (REL)|Traditional Security|UNIX OS|VVOIP Review|Web Review|Windows OS] [--stigguid <value>] [--targetkey <value>] [--webdbsite <value> --webordatabase] [--webdbinstance <value>] [--vulidmapping gid|id]
FLAGS
-h, --help Show CLI help.
-i, --input=<value> (required) Input HDF file
-o, --output=<value> (required) Output CKL file
CHECKLIST METADATA FLAGS
-F, --fqdn=<value> Fully Qualified Domain Name
-H, --hostname=<value> The name assigned to the asset within the network
-I, --ip=<value> IP address
-M, --mac=<value> MAC address
-m, --metadata=<value> Metadata JSON file, generate one with "saf generate ckl_metadata"
--assettype=<option> The category or classification of the asset
<options: Computing|Non-Computing>
--marking=<value> A security classification or designation of the asset, indicating its sensitivity level
--profilename=<value> Profile name
--profiletitle=<value> Profile title
--releasedate=<value> Profile release date
--releasenumber=<value> Profile release number
--role=<option> The primary function or role of the asset within the network or organization
<options: Domain Controller|Member Server|None|Workstation>
--stigguid=<value> A unique identifier associated with the STIG for the asset
--targetcomment=<value> Additional comments or notes about the asset
--targetkey=<value> A unique key or identifier for the asset within the checklist or inventory system
--techarea=<option> The technical area or domain to which the asset belongs
<options: |Application Review|Boundary Security|CDS Admin Review|CDS Technical Review|Database Review|Domain Name System (DNS)|Exchange Server|Host Based System Security (HBSS)|Internal Network|Mobility|Other Review|Releasable Networks (REL)|Releaseable Networks (REL)|Traditional Security|UNIX OS|VVOIP Review|Web Review|Windows OS>
--version=<value> Profile version number
--vulidmapping=<option> Which type of control identifier to map to the checklist ID
<options: gid|id>
--webdbinstance=<value> The specific instance of the web application or database running on the server
--webdbsite=<value> The specific site or application hosted on the web or database server
--webordatabase Indicates whether the STIG is primarily for either a web or database server
DESCRIPTION
Translate a Heimdall Data Format JSON file into a DISA checklist file
EXAMPLES
$ saf convert hdf2ckl -i rhel7-results.json -o rhel7.ckl --fqdn reverseproxy.example.org --hostname reverseproxy --ip 10.0.0.3 --mac 12:34:56:78:90:AB
$ saf convert hdf2ckl -i rhel8-results.json -o rhel8.ckl -m rhel8-metadata.json
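The checklist metadata flags can also be supplied directly on the command line instead of through a metadata file (a sketch; the asset values are illustrative):

saf convert hdf2ckl -i rhel8-results.json -o rhel8.ckl --assettype Computing --role "Member Server" --hostname rhel8 --ip 10.0.0.5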
Top
convert hdf2csv Translate a Heimdall Data Format JSON file into a
Comma Separated Values (CSV) file
USAGE
$ saf convert hdf2csv -i <hdf-scan-results-json> -o <output-csv> [-h] [-f <csv-fields>] [-t]
FLAGS
-f, --fields=<csv-fields> [default: All Fields] Fields to include in output CSV, separated by commas
-h, --help Show CLI help.
-i, --input=<hdf-scan-results-json> (required) Input HDF file
-o, --output=<output-csv> (required) Output CSV file
-t, --noTruncate Don't truncate fields longer than 32,767 characters (the cell limit in Excel)
GLOBAL FLAGS
-L, --logLevel=<option> [default: info] Specify level for logging (if implemented by the CLI command)
<options: info|warn|debug|verbose>
--interactive Collect input tags interactively (not available on all CLI commands)
EXAMPLES
Running the CLI interactively
$ saf convert hdf2csv --interactive
Providing flags at the command line
$ saf convert hdf2csv -i rhel7-results.json -o rhel7.csv --fields "Results Set,Status,ID,Title,Severity"
Top
convert hdf2condensed Condensed format used by some community members
to pre-process data for elasticsearch and custom dashboards
USAGE
$ saf convert hdf2condensed -i <hdf-scan-results-json> -o <condensed-json> [-h]
FLAGS
-h, --help Show CLI help.
-i, --input=<hdf-scan-results-json> (required) Input HDF file
-o, --output=<condensed-json> (required) Output condensed JSON file
GLOBAL FLAGS
-L, --logLevel=<option> [default: info] Specify level for logging (if implemented by the CLI command)
<options: info|warn|debug|verbose>
--interactive Collect input tags interactively (not available on all CLI commands)
EXAMPLES
$ saf convert hdf2condensed -i rhel7-results.json -o rhel7-condensed.json
Top
Output | Use | Command
---|---|---
ASFF json | All findings that will be fed into the mapper | aws securityhub get-findings > asff.json
AWS SecurityHub enabled standards json | Get all enabled standards so you can get their identifiers | aws securityhub get-enabled-standards > asff_standards.json
AWS SecurityHub standard controls json | Get all the controls for a standard that will be fed into the mapper | aws securityhub describe-standards-controls --standards-subscription-arn "arn:aws:securityhub:us-east-1:123456789123:subscription/cis-aws-foundations-benchmark/v/1.2.0" > asff_cis_standard.json
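Putting the table together with the converter below, a plausible end-to-end flow (filenames and ARN as in the table; the output folder name is illustrative) is:

aws securityhub get-findings > asff.json
aws securityhub describe-standards-controls --standards-subscription-arn "arn:aws:securityhub:us-east-1:123456789123:subscription/cis-aws-foundations-benchmark/v/1.2.0" > asff_cis_standard.json
saf convert asff2hdf -i asff.json --securityhub asff_cis_standard.json -o output-folder-name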
convert asff2hdf Translate an AWS Security Finding Format JSON into a
Heimdall Data Format JSON file(s)
USAGE
$ saf convert asff2hdf -o <hdf-output-folder> [-h] (-i <asff-json> [--securityhub <standard-json>]... | -a -r <region> [-I | -C <certificate>] [-t <target>]) [-L info|warn|debug|verbose]
FLAGS
-C, --certificate=<certificate> Trusted signing certificate file
-I, --insecure Disable SSL verification, this is insecure
-H, --securityHub=<standard-json> Additional input files to provide context that an ASFF file needs
such as the CIS AWS Foundations or AWS Foundational Security Best
Practices documents (in ASFF compliant JSON form)
-a, --aws Pull findings from AWS Security Hub
-h, --help Show CLI help.
-i, --input=<asff-json> (required if not using AWS) Input ASFF JSON file
-o, --output=<hdf-output-folder> (required) Output HDF JSON folder
-r, --region=<region> Security Hub region to pull findings from
-t, --target=<target>... Target ID(s) to pull from Security Hub (maximum 10), leave blank for non-HDF findings
GLOBAL FLAGS
-L, --logLevel=<option> [default: info] Specify level for logging (if implemented by the CLI command)
<options: info|warn|debug|verbose>
--interactive Collect input tags interactively (not available on all CLI commands)
EXAMPLES
Using ASFF JSON file
$ saf convert asff2hdf -i asff-findings.json -o output-folder-name
Using ASFF JSON file with additional input files
$ saf convert asff2hdf -i asff-findings.json --securityhub <standard-1-json> ... --securityhub <standard-n-json> -o output-folder-name
Using AWS to pull ASFF JSON findings
$ saf convert asff2hdf --aws -o out -r us-west-2 --target rhel7
Top
Note: Pulling AWS Config results data requires configuration of the AWS CLI (see the AWS documentation) or configured environment variables if running via Docker.