Snakemake storage plugin: azure
A Snakemake storage plugin that reads and writes from Azure Blob Storage. It pairs nicely with the Snakemake Executor Plugin for Azure Batch; see “Further details” below for an example configuration in such a scenario.
Authentication defaults to DefaultAzureCredential, a credential chain that inherits the permissions of the caller (for example from service principal environment variables, a managed identity, or an Azure CLI login). This enforces identity credential management best practices.
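For example, the credential chain can pick up an existing Azure CLI login or service principal credentials supplied via environment variables. A minimal sketch of both options (the placeholder values are assumptions, not requirements of this plugin):

# Option 1: sign in with the Azure CLI; DefaultAzureCredential picks up this login
az login

# Option 2: service principal credentials read by DefaultAzureCredential
export AZURE_CLIENT_ID="<client-id>"
export AZURE_TENANT_ID="<tenant-id>"
export AZURE_CLIENT_SECRET="<client-secret>"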
Installation
Install this plugin with pip or mamba, e.g.:
pip install snakemake-storage-plugin-azure
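If you prefer mamba, the plugin is typically available from the conda channels used for Snakemake packages (the channel names here are an assumption):

mamba install -c conda-forge -c bioconda snakemake-storage-plugin-azure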
Usage
Queries
Queries to this storage should have the following format:
Query type | Query | Description
---|---|---
any | `az://<container>/<path>` | A file in an Azure Blob Storage Account Container
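For instance, with the account name configured via --storage-azure-account-name (see below), a blob test.txt in a container named container would be addressed with the query:

az://container/test.txt

The container and blob names here simply mirror the example in “Further details”; substitute your own.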
As default provider
If you want all your input and output (which is not explicitly marked to come from another storage) to be written to and read from this storage, you can use it as a default provider via:
snakemake --default-storage-provider azure --default-storage-prefix ...
with ... being the prefix of a query under which you want to store all your results.
You can also pass custom settings via command line arguments:
snakemake --default-storage-provider azure --default-storage-prefix ... \
    --storage-azure-max-requests-per-second ... \
    --storage-azure-account-name ...
Within the workflow
If you want to use this storage plugin only for specific items, you can register it inside of your workflow:
# register storage provider (not needed if no custom settings are to be defined here)
storage:
    provider="azure",
    # optionally add custom settings here if needed
    # alternatively they can be passed via command line arguments
    # starting with --storage-azure-..., see
    # snakemake --help
    # Maximum number of requests per second for this storage provider.
    # If nothing is specified, the default implemented by the storage plugin is used.
    max_requests_per_second=...,
    # Azure Blob Storage Account name
    account_name=...,

rule example:
    input:
        storage.azure(
            # define query to the storage backend here
            ...
        ),
    output:
        "example.txt"
    shell:
        "..."
Using multiple entities of the same storage plugin
In case you have to use this storage plugin multiple times, but with different settings (e.g. to connect to different storage servers), you can register it multiple times, each time providing a different tag:
# register shared settings
storage:
    provider="azure",
    # optionally add custom settings here if needed
    # alternatively they can be passed via command line arguments
    # starting with --storage-azure-..., see below
    # Maximum number of requests per second for this storage provider.
    # If nothing is specified, the default implemented by the storage plugin is used.
    max_requests_per_second=...,
    # Azure Blob Storage Account name
    account_name=...,

# register multiple tagged entities
storage foo:
    provider="azure",
    # optionally add custom settings here if needed
    # alternatively they can be passed via command line arguments
    # starting with --storage-azure-..., see below.
    # To only pass a setting to this tagged entity, prefix the given value with
    # the tag name, i.e. foo:max_requests_per_second=...
    # Maximum number of requests per second for this storage provider.
    # If nothing is specified, the default implemented by the storage plugin is used.
    max_requests_per_second=...,
    # Azure Blob Storage Account name
    account_name=...,

rule example:
    input:
        storage.foo(
            # define query to the storage backend here
            ...
        ),
    output:
        "example.txt"
    shell:
        "..."
Settings
The storage plugin has the following settings (which can be passed via command line, the workflow or environment variables, if provided in the respective columns):
CLI setting | Workflow setting | Envvar setting | Description | Default | Choices | Required | Type
---|---|---|---|---|---|---|---
--storage-azure-max-requests-per-second | max_requests_per_second | | Maximum number of requests per second for this storage provider. If nothing is specified, the default implemented by the storage plugin is used. | | | ✗ | str
--storage-azure-account-name | account_name | | Azure Blob Storage Account name | | | ✓ | str
Further details
The example Snakefile and command below stream a file, test.txt, containing the text “Hello, World!”, to the Azure blob: https://accountname.blob.core.windows.net/container/test.txt.
rule touch:
    output: "test.txt"
    shell:
        "echo 'Hello, World!' > {output}"
Command:
snakemake -j1 \
    --default-storage-provider azure \
    --default-storage-prefix "az://container" \
    --storage-azure-account-name accountname \
    --verbose