Snakemake storage plugin: azure

Warning

No documentation found in repository https://github.com/snakemake/snakemake-storage-plugin-azure. The plugin should provide a docs/intro.md with some introductory sentences and optionally a docs/further.md file with details beyond the auto-generated usage instructions presented in this catalog.

Installation

Install this plugin with pip or mamba, e.g.:

pip install snakemake-storage-plugin-azure
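
If you prefer mamba, an equivalent install could look like the following (assuming the package is distributed via the conda-forge and bioconda channels, as is usual for Snakemake plugins):

mamba install -c conda-forge -c bioconda snakemake-storage-plugin-azure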

Usage

Queries

Queries to this storage should have the following format:

Query type: any
Query: az://account/container/path/example/file.txt
Description: A file in an Azure Blob Storage Account Container
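
For illustration, such a query can be used directly as a rule input. The following is a minimal sketch; the account, container, and path are placeholders, not real resources:

rule fetch_blob:
    input:
        # placeholder query following the az://account/container/path format
        storage.azure("az://myaccount/mycontainer/path/example/file.txt")
    output:
        "results/file.txt"
    shell:
        "cp {input} {output}"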

As default provider

If you want all of your input and output (everything not explicitly marked as coming from another storage) to be written to and read from this storage, you can use it as a default provider via:

snakemake --default-storage-provider azure --default-storage-prefix ...

with ... being the prefix of a query under which you want to store all your results. You can also pass custom settings via command line arguments:

snakemake --default-storage-provider azure --default-storage-prefix ... \
    --storage-azure-max-requests-per-second ... \
    --storage-azure-endpoint-url ... \
    --storage-azure-access-key ... \
    --storage-azure-sas-token ...
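
As a concrete (hypothetical) illustration, with placeholder account and container names and the SAS token taken from an environment variable of your choosing, an invocation might look like:

snakemake --default-storage-provider azure \
    --default-storage-prefix az://myaccount/mycontainer/results \
    --storage-azure-endpoint-url https://myaccount.blob.core.windows.net \
    --storage-azure-sas-token "$AZURE_SAS_TOKEN"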

Within the workflow

If you want to use this storage plugin only for specific items, you can register it inside of your workflow:

# register storage provider (not needed if no custom settings are to be defined here)
storage:
    provider="azure",
    # optionally add custom settings here if needed
    # alternatively they can be passed via command line arguments
    # starting with --storage-azure-..., see
    # snakemake --help
    # Maximum number of requests per second for this storage provider.
    # If nothing is specified, the default implemented by the storage plugin is used.
    max_requests_per_second=...,
    # Azure Blob Storage Account endpoint url
    endpoint_url=...,
    # Azure Blob Storage Account Access Key Credential.
    # If set, takes precedence over sas_token credential.
    access_key=...,
    # Azure Blob Storage Account SAS Token Credential
    sas_token=...,

rule example:
    input:
        storage.azure(
            # define query to the storage backend here
            ...
        ),
    output:
        "example.txt"
    shell:
        "..."

Using multiple entities of the same storage plugin

In case you have to use this storage plugin multiple times, but with different settings (e.g. to connect to different storage servers), you can register it multiple times, each time providing a different tag:

# register shared settings
storage:
    provider="azure",
    # optionally add custom settings here if needed
    # alternatively they can be passed via command line arguments
    # starting with --storage-azure-..., see below
    # Maximum number of requests per second for this storage provider.
    # If nothing is specified, the default implemented by the storage plugin is used.
    max_requests_per_second=...,
    # Azure Blob Storage Account endpoint url
    endpoint_url=...,
    # Azure Blob Storage Account Access Key Credential.
    # If set, takes precedence over sas_token credential.
    access_key=...,
    # Azure Blob Storage Account SAS Token Credential
    sas_token=...,

# register multiple tagged entities
storage foo:
    provider="azure",
    # optionally add custom settings here if needed
    # alternatively they can be passed via command line arguments
    # starting with --storage-azure-..., see below.
    # To only pass a setting to this tagged entity, prefix the given value with
    # the tag name, i.e. foo:max_requests_per_second=...
    # Maximum number of requests per second for this storage provider.
    # If nothing is specified, the default implemented by the storage plugin is used.
    max_requests_per_second=...,
    # Azure Blob Storage Account endpoint url
    endpoint_url=...,
    # Azure Blob Storage Account Access Key Credential.
    # If set, takes precedence over sas_token credential.
    access_key=...,
    # Azure Blob Storage Account SAS Token Credential
    sas_token=...,

rule example:
    input:
        storage.foo(
            # define query to the storage backend here
            ...
        ),
    output:
        "example.txt"
    shell:
        "..."
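
For example, two tagged registrations could point at different storage accounts. The following is a sketch with made-up account, container, and endpoint names:

# hypothetical: one tagged entity per storage account
storage accounta:
    provider="azure",
    endpoint_url="https://accounta.blob.core.windows.net",

storage accountb:
    provider="azure",
    endpoint_url="https://accountb.blob.core.windows.net",

rule merge:
    input:
        a=storage.accounta("az://accounta/raw/data/a.txt"),
        b=storage.accountb("az://accountb/raw/data/b.txt"),
    output:
        "results/merged.txt"
    shell:
        "cat {input.a} {input.b} > {output}"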

Settings

The storage plugin has the following settings (which can be passed via command line, the workflow or environment variables, if provided in the respective columns):

access_key (CLI: --storage-azure-access-key VALUE)
  • Description: Azure Blob Storage Account Access Key Credential. If set, takes precedence over sas_token credential.
  • Default: None
  • Type: str

sas_token (CLI: --storage-azure-sas-token VALUE)
  • Description: Azure Blob Storage Account SAS Token Credential
  • Default: None
  • Type: str
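
To avoid hard-coding the SAS token in the Snakefile, one option is to read it from the environment when registering the provider. This is only a sketch; the environment variable name AZURE_STORAGE_SAS_TOKEN is an arbitrary choice, not something the plugin requires:

import os

storage:
    provider="azure",
    # the variable name below is only an example; use whatever your environment provides
    sas_token=os.environ["AZURE_STORAGE_SAS_TOKEN"],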