Benchmark Vault performance
Operating Vault in an efficient manner to support your use cases requires that you are able to accurately measure its performance. Ideally, you can benchmark and measure performance in environments which resemble production use cases to produce realistic results.
Challenge
You need to measure Vault performance in a meaningful way, and in an environment that also resembles that of your intended use case with respect to compute resources.
The Vault server under test should have the same auth methods and secrets engines enabled, along with example secrets, leases, and token data present to accurately simulate your use cases.
Solution
HashiCorp provides the open source utility vault-benchmark to help you measure Vault performance at a granular level using several of the available auth methods and secrets engines.
You can use vault-benchmark as a command line interface, as a Docker image, or as a Kubernetes workload to match the infrastructure you're using for Vault.
Personas
The end-to-end scenario and hands-on lab described in this tutorial involves one persona.
The persona is a Vault operator with privileged permissions to enable and disable auth methods and secrets engines. All the tasks in the hands-on scenario are performed as this persona.
Prerequisites
You need the following resources to complete the hands-on scenario based on whether you'll use the Vault community edition with CLI, Docker, Kubernetes, or HCP Vault Dedicated.
To complete the hands-on lab using CLI versions of Vault and vault-benchmark, you need the following:
Vault binary installed and in your system PATH.
vault-benchmark binary installed and in your system PATH.
Launch Terminal
This tutorial includes a free interactive command-line lab that lets you follow along on actual cloud infrastructure.
Versions used for this tutorial
This tutorial was last tested 11 Oct 2023 on macOS using the following software versions.
$ sw_vers --productVersion
13.6
$ vault version
Vault v1.15.0 (b4d07277a6c5318bb50d3b94bbd6135dccb4c601), built 2023-09-22T16:53:10Z
$ vault-benchmark version
vault-benchmark v0.2.0
$ docker version --format '{{.Server.Version}}'
24.0.6
$ curl --version | head -n 1 | awk '{print $2}'
8.1.2
$ jq --version
jq-1.7
$ helm version --short
v3.13.1+g3547a4b
$ minikube version
minikube version: v1.31.2
commit: fd7ecd9c4599bef9f04c0986c4a0187f98a4396e
Lab setup
The setup for this tutorial's hands-on lab is different depending on whether you want to experiment with vault-benchmark in the CLI with a dev mode server, in the CLI with an HCP Vault Dedicated server, in Docker, or in Kubernetes.
Create local hands-on lab home
You can create a temporary directory to hold all the content needed for this hands-on lab and then assign its path to an environment variable for later reference.
Open a terminal, and create the directory /tmp/learn-vault-benchmark.

$ mkdir /tmp/learn-vault-benchmark
Export the hands-on lab directory path as the value of the HC_LEARN_LAB environment variable.

$ export HC_LEARN_LAB=/tmp/learn-vault-benchmark
Now choose the lab setup workflow that matches the environment you want to use for this hands-on lab.
Run a Vault dev mode server as a background process from your terminal session to follow the self-hosted Vault workflow in this hands-on lab.
Open a terminal and start a Vault dev server with root as the initial root token value.

$ vault server \
    -dev \
    -dev-root-token-id root \
    > "$HC_LEARN_LAB"/vault-server.log 2>&1 &
The Vault dev server defaults to running at 127.0.0.1:8200. The server logs to the file vault-server.log in the hands-on lab working directory, and gets automatically initialized and unsealed.

Dev mode is not for production

Do not run a Vault dev server in production. This approach starts a Vault server with an in-memory database, and all contents are lost when the Vault server process is stopped.
Export the environment variable for the vault CLI to address the Vault server.

$ export VAULT_ADDR='http://127.0.0.1:8200'
Export an environment variable for the vault CLI to authenticate with the Vault server.

$ export VAULT_TOKEN=root
Check Vault status.
$ vault status
Example output:
Key             Value
---             -----
Seal Type       shamir
Initialized     true
Sealed          false
Total Shares    1
Threshold       1
Version         1.15.0
Build Date      2023-09-22T16:53:10Z
Storage Type    inmem
Cluster Name    vault-cluster-becc609e
Cluster ID      96babf60-d3e2-0ea0-4879-f78763ac595c
HA Enabled      false
The Vault server is ready for you to proceed with the hands-on lab.
Now that you've established a working lab, you're ready to explore vault-benchmark and its configuration.
Explore vault-benchmark
Your goal for this section is to explore vault-benchmark using a terminal session. Resources for diving deeper are provided at the section's conclusion.
Check your vault-benchmark version.
$ vault-benchmark version
Example output:
vault-benchmark v0.2.0
You can get help to discover available commands.
$ vault-benchmark --help
Example output:
Usage: vault-benchmark <command> [args]

Command list:
    run       Run vault-benchmark test(s)
    review    Review previous test results
This hands-on lab focuses on your use of the run command. Get help for the run command.

$ vault-benchmark run --help

Example output:

Usage: vault-benchmark run [options]

  This command will run a vault-benchmark test.

  Run a vault-benchmark test with a configuration file:

      $ vault-benchmark run -config=/etc/vault-benchmark/test.hcl

  For a full list of examples, please see the documentation.

Command Options:

  -annotate=<string>
     Comma-separated name=value pairs include in bench_running prometheus
     metric. Try name 'testname' for dashboard example.

  -audit_path=<string>
     Path to file for audit log.

  -ca_pem_file=<string>
     Path to PEM encoded CA file to verify external Vault. This can also be
     specified via the VAULT_CACERT environment variable.

  -cleanup
     Cleanup benchmark artifacts after run. The default is false.

  -cluster_json=<string>
     Path to cluster.json file

  -config=<string>
     Path to a vault-benchmark test configuration file.

  -debug
     Run vault-benchmark in Debug mode. The default is false.

  -disable_http2
     Force HTTP/1.1 The default is false.

  -duration=<duration>
     Test Duration. The default is 10s.

  -log_level=<string>
     Level to emit logs. Options are: INFO, WARN, DEBUG, TRACE. The default
     is INFO. This can also be specified via the VAULT_BENCHMARK_LOG_LEVEL
     environment variable.

  -pprof_interval=<duration>
     Collection interval for vault debug pprof profiling.

  -random_mounts
     Use random mount names. The default is true.

  -report_mode=<string>
     Reporting Mode. Options are: terse, verbose, json. The default is terse.

  -rps=<int>
     Requests per second. Setting to 0 means as fast as possible.

  -vault_addr=<string>
     Target Vault API Address. The default is http://127.0.0.1:8200. This can
     also be specified via the VAULT_ADDR environment variable.

  -vault_namespace=<string>
     Vault Namespace to create test mounts. This can also be specified via
     the VAULT_NAMESPACE environment variable.

  -vault_token=<string>
     Vault Token to be used for test setup. This can also be specified via
     the VAULT_TOKEN environment variable.

  -workers=<int>
     Number of workers The default is 10.
Configure benchmark
Your goal for this section is to configure vault-benchmark for a basic benchmark run.
Vault Benchmark is configured with a HashiCorp Configuration Language (HCL) file. You can use the example configuration shown in the Usage documentation in this hands-on lab.
Here is the entire example configuration file.
vault-benchmark-config.hcl
vault_addr = "http://127.0.0.1:8200"
vault_token = "root"
vault_namespace="root"
duration = "30s"
cleanup = true
test "approle_auth" "approle_logins" {
weight = 50
config {
role {
role_name = "benchmark-role"
token_ttl="2m"
}
}
}
test "kvv2_write" "static_secret_writes" {
weight = 50
config {
numkvs = 100
kvsize = 100
}
}
Lines 1-5 are global parameters:
vault_addr: the full URL plus port for the Vault server to benchmark.
vault_token: the literal token value for a token with capabilities to enable and manage secrets engines and auth methods. In this hands-on lab, the initial root token value is used, but you should not use a root token in production this way.
vault_namespace: the name of the Enterprise namespace to use for the benchmark.
duration: the length of time to run the benchmark.
cleanup: whether to remove all resources created by the benchmark run.
Benchmark in production not recommended
You should benchmark Vault servers dedicated to the purpose, but not production servers. The benchmark can adversely affect performance of the server, and does not make guarantees about cleanup of artifacts created during the benchmark run.
Lines 7-15 and 17-23 represent the two tests that make up this benchmark configuration: an approle_auth test and a kvv2_write test, with the requests split evenly between the two.
The first test, beginning at line 7, is for AppRole auth method logins. Note that the first parameter is weight. You can think of weight as a percentage of the entire workload; this benchmark performs AppRole logins 50% of the time throughout the entire run.
You can configure a test in its config stanza. In this test, role_name specifies the auth method role name benchmark-role. Each token Vault issues after an AppRole login with the benchmark-role is configured with a token_ttl value of 2 minutes to specify the token's time-to-live (TTL). You can learn more about available parameters in the AppRole auth method benchmark documentation.
The second test, beginning at line 17, is for Key/Value version 2 secrets engine writes. Its weight setting gives it the remaining 50% of benchmark operations. It also uses numkvs to specify that 100 key/value secrets be written, and kvsize to limit each value to 100 bytes. Available parameters are documented in the KV v1 and KV v2 secret benchmark documentation.
You can learn more about all the tests in the vault-benchmark test documentation.
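The weight and config parameters can be tuned per test. Here is a hypothetical variant of the example configuration that skews the workload toward logins and writes larger secrets. The parameter names match the example above; the specific values are illustrative, not recommendations.

```hcl
# Hypothetical variant: 75% AppRole logins, 25% KV v2 writes.
test "approle_auth" "approle_logins" {
  weight = 75
  config {
    role {
      role_name = "benchmark-role"
      token_ttl="2m"
    }
  }
}

test "kvv2_write" "static_secret_writes" {
  weight = 25
  config {
    numkvs = 500   # write 500 distinct key/value secrets
    kvsize = 1024  # each value is 1024 bytes
  }
}
```

Note that the weight values across all tests must total 100.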
Now that you've reviewed the example configuration, you need to change it a bit to function with the Vault cluster you're using.
In the lab setup section, you set the VAULT_ADDR and VAULT_TOKEN (and VAULT_NAMESPACE for HCP Vault Dedicated) environment variables for communicating with your Vault cluster.

Values from environment variables override anything specified in the vault-benchmark configuration file.

With this in mind, you can set the vault_addr, vault_token, and vault_namespace values to an empty string. Their values come from the environment variables you set.
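As a quick illustration of this precedence, you can confirm what the environment supplies before a run. This sketch only inspects the shell environment; it does not invoke vault-benchmark.

```shell
# These exports mirror the lab setup. With vault_addr and vault_token set to
# empty strings in the config file, vault-benchmark falls back to these
# environment variables for its target address and token.
export VAULT_ADDR='http://127.0.0.1:8200'
export VAULT_TOKEN=root

# Confirm what vault-benchmark will read from the environment.
echo "target: ${VAULT_ADDR}"
echo "token set: $([ -n "$VAULT_TOKEN" ] && echo yes || echo no)"
```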
root namespace by default

If you're using a Vault cluster that is not in HCP, you might recall that you did not export a VAULT_NAMESPACE environment variable during your lab setup.

vault-benchmark uses the root namespace by default when there is no value set in the configuration file. Specifying it is unnecessary unless you're using HCP Vault Dedicated or a Vault Enterprise cluster with a namespace that is not root.
These steps apply equally to the self-hosted, Docker, and HCP Vault Dedicated environments.
Return to your terminal session and write the configuration file to /tmp/learn-vault-benchmark/vault-benchmark-config.hcl.

$ cat > "$HC_LEARN_LAB"/vault-benchmark-config.hcl << EOF
vault_addr = ""
vault_token = ""
vault_namespace=""
duration = "30s"
cleanup = true

test "approle_auth" "approle_logins" {
    weight = 50
    config {
        role {
            role_name = "benchmark-role"
            token_ttl="2m"
        }
    }
}

test "kvv2_write" "static_secret_writes" {
    weight = 50
    config {
        numkvs = 100
        kvsize = 100
    }
}
EOF
You've configured vault-benchmark with two tests, and you're now ready to run them.
Run benchmark
Your goal for this section is to run vault-benchmark with the configuration that you just created.
Run the benchmark.
$ vault-benchmark run \
-config="$HC_LEARN_LAB"/vault-benchmark-config.hcl
Example output:
2023-10-19T12:59:08.675-0400 [INFO] vault-benchmark: setting up targets
2023-10-19T12:59:10.814-0400 [INFO] vault-benchmark: starting benchmarks: duration=30s
2023-10-19T12:59:40.817-0400 [INFO] vault-benchmark: cleaning up targets
2023-10-19T13:00:05.371-0400 [INFO] vault-benchmark: benchmark complete
Target: http://127.0.0.1:8200
op count rate throughput mean 95th% 99th% successRatio
approle_logins 98537 3284.650546 3284.468629 1.683084ms 3.559447ms 6.001427ms 100.00%
static_secret_writes 98507 3283.561593 3283.438647 1.3473ms 2.940248ms 5.218769ms 100.00%
Review the output
The benchmark output is tabular by default. This example is from the self-hosted dev mode server, but the output will be similar for Vault in any environment.
2023-10-19T12:59:08.675-0400 [INFO] vault-benchmark: setting up targets
2023-10-19T12:59:10.814-0400 [INFO] vault-benchmark: starting benchmarks: duration=30s
2023-10-19T12:59:40.817-0400 [INFO] vault-benchmark: cleaning up targets
2023-10-19T13:00:05.371-0400 [INFO] vault-benchmark: benchmark complete
Target: http://127.0.0.1:8200
op count rate throughput mean 95th% 99th% successRatio
approle_logins 98537 3284.650546 3284.468629 1.683084ms 3.559447ms 6.001427ms 100.00%
static_secret_writes 98507 3283.561593 3283.438647 1.3473ms 2.940248ms 5.218769ms 100.00%
The first four lines are log output from vault-benchmark describing its current actions.
The fifth line shows the benchmark target URL, and the sixth line contains the headings for the metric data.
The last two lines hold the metric data, one for each of the two benchmark tests you configured. The metrics are as follows.
op: the test name.
count: the number of tests completed during the benchmark duration.
rate: the true operations per second rate of all tests of this type.
throughput: the operations per second rate of all successful tests of this type.
mean: the mean time in milliseconds per operation.
95th%: the 95th percentile time in milliseconds per operation.
99th%: the 99th percentile time in milliseconds per operation.
successRatio: the percentage of successful tests of this type. When tests are not successful, you should consult the Vault operational and audit logs for details on the unsuccessful tests.
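As a sanity check on these columns, the rate is approximately the count divided by the benchmark duration. Using the sample numbers above (98537 approle_logins over the configured 30 seconds):

```shell
# Approximate the reported rate column from count and duration.
# 98537 and 30 come from the sample output and configuration above.
count=98537
duration=30
rate=$(awk -v c="$count" -v d="$duration" 'BEGIN { printf "%.0f", c / d }')
echo "approximate rate: ${rate} ops/sec"   # roughly 3285, close to the reported 3284.65
```

The small difference from the reported value comes from vault-benchmark measuring the precise elapsed time rather than the nominal duration.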
Tip
You can also output the benchmark results as JSON with the -report_mode=json flag.
Cleanup
Follow the steps for the environment you used in the hands-on lab to clean up.
Stop the Vault dev mode server.
$ pkill vault
Remove the hands-on lab directory.
$ rm -rf "$HC_LEARN_LAB"
Unset environment variables.
$ unset VAULT_ADDR VAULT_TOKEN
Remove cached Vault token.
$ rm -f ~/.vault-token
Next steps
You learned the basics around the Vault Benchmark tool, including how to configure and run a benchmark. You also learned about the default benchmark output, and available documentation resources for Vault Benchmark.
To dive deeper into Vault performance, consider reviewing the performance tuning and production hardening documentation.