Parameterization best practices with Azure Data Factory
15 Dec 2025
Table of Contents
- What is Azure Data Factory?
- Why use multiple ADF environments
- Multi-ADF-environment example
- Will environment-dependent variables require a lot of work?
- How can ADF Global Parameters help?
- How can Azure Key Vault help in ADF?
- How to use ADF Linked Service Parameters
- How can ADF system assigned managed identities help?
- How to grant an ADF SMI appropriate permissions
If you want to make a “single” Azure Data Factory (“ADF”) git repository that can be seamlessly deployed into both nonproduction and production environments, you’ll have to carefully optimize its parameters and variables, moving hardcoded values as far to the “outside of the onion” as you can.
- (Note: Almost all of this article also applies to Azure Synapse Analytics workspaces – the main concept that doesn’t transfer is that Synapse has no equivalent to ADF Global Parameters, at least not as of late 2025.)
What is Azure Data Factory?
ADF is Microsoft Azure’s point-and-click “extract, transform, load” (“ETL”) data integration service, introduced in 2014.
Why use multiple ADF environments
To deploy variants of the same ADF resource into nonproduction and production environments (e.g. pizza-adf-dev, pizza-adf-stg, pizza-adf-prd), you will need a separate ADF instance for each environment.
- Tip: Even if you do not think you need multiple ADF environments, configure every ADF you write as if you did. It doesn’t make your initial work much harder, and it drastically reduces rework if you later realize you need more environments than you originally planned.
Multi-ADF-environment example
You may find yourself connecting nonproduction ADF resources to nonproduction data sources/targets, and production ADF resources to production ones. For example:
- Your pizza-adf-dev ADF resource consumes a “toppings-dev.example.com” API.
- Your pizza-adf-prd ADF resource consumes a “toppings-prd.example.com” API.
Imagine:
If you hard-coded “toppings-dev.example.com” into ADF pipelines within pizza-adf-dev, then it would be frustrating to change it when you’re ready to publish your work to run on pizza-adf-prd.
But:
Instead, you can configure ADF to look up the API’s domain name from some sort of variable whose actual value will depend on which ADF environment is running (pizza-adf-dev or pizza-adf-prd).
Takeaway:
Careful planning around environment-dependent variables lets you leverage CI/CD for an “edit once, publish anywhere” approach that improves consistency and expedites maintenance.
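For example, suppose the API’s domain lives in a hypothetical ADF global parameter named toppings_api_domain (global parameters are covered below). A pipeline “Web” activity’s URL could then be set to a dynamic-content expression along these lines (the /v1/toppings path is purely illustrative):

@concat('https://', pipeline().globalParameters.toppings_api_domain, '/v1/toppings')

Only the parameter’s value differs between pizza-adf-dev and pizza-adf-prd; the expression itself never changes.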
Will environment-dependent variables require a lot of work?
No. The following two best practices will save time by improving reusability:
- Seek opportunities to reduce the number of values that will need to change from one ADF environment to another. For example, authenticate ADF into data sources using passwordless “managed identity” techniques.
- Imagine that, like layers of an onion, some of ADF’s many parameterization features (e.g. ADF pipeline parameters) are deeper “inside” a given ADF configuration, while others (ADF global parameters & ADF linked service parameters) are closer to its “outside.”
- The “inner” parameterization features in ADF are not as easy to modify with CI/CD platforms as the “outer” ones.
- If a value should change between one ADF environment and another, move it “outward” (ADF global parameters & ADF linked service parameters) or outside of ADF altogether (e.g. Azure Key Vault).
The tips below will help you achieve this goal in ADF.
How can ADF Global Parameters help?
ADF global parameter values are quite easy to change at publish-time with CI/CD.
- Note: Sadly, global parameters are not yet available in Azure Synapse Analytics, which otherwise often overlaps significantly with ADF.
If the environment (e.g. “dev,” “stg,” “prd”) will determine the value that should be used by your ADF, then:
- Store the value into an ADF global parameter (unless it could be in Key Vault – see below).
- The “dynamic content” expression builder wizard can help you reference your global parameters while building components of your ADF.
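As a hedged illustration of changing a global parameter at publish-time: if you check “Include in ARM template” for your global parameters (in the ADF Manage hub), each one surfaces as a parameter in the factory’s generated ARM template that a CI/CD release can override per environment. The parameter name below follows the pattern the export tends to generate, but confirm the exact name against the ARMTemplateParametersForFactory.json in your publish branch:

```json
{
  "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentParameters.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {
    "factoryName": { "value": "pizza-adf-prd" },
    "dataFactory_properties_globalParameters_toppings_api_domain_value": {
      "value": "toppings-prd.example.com"
    }
  }
}
```

Your release pipeline deploys the same ARM template to every environment and simply swaps in a per-environment parameters file (or parameter override) like this.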
How can Azure Key Vault help in ADF?
Most ADF component-building wizards make it easy to reference parameters and variables stored within ADF.
However, some of them, such as ADF pipeline “Web” activities and many of the Linked Service create/edit wizards, also make it easy to reference an Azure Key Vault (“KV”) secret.
Azure Key Vault can help by doing the following:
- Reduce complexity. Whenever KV is an option, consider storing your environment-dependent variable values in KV instead of ADF. Then you won’t need to change their values at ADF publish-time at all.
- To further reduce complexity, reuse KV secret names. That is, if each ADF environment has its own corresponding KV, you might prefer to create an identically-named secret in each KV to reduce the number of edits required at publish-time (a CLI sketch follows this list):
- toppings-api-domain in pizza-kv-dev: “toppings-dev.example.com”
- toppings-api-domain in pizza-kv-stg: “toppings-stg.example.com”
- toppings-api-domain in pizza-kv-prd: “toppings-prd.example.com”
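For illustration, seeding those identically-named secrets could look like the following Azure CLI sketch (the vault and secret names are the article’s running examples):

```bash
# Same secret name in every environment's vault; only the value differs.
az keyvault secret set --vault-name pizza-kv-dev --name toppings-api-domain --value "toppings-dev.example.com"
az keyvault secret set --vault-name pizza-kv-stg --name toppings-api-domain --value "toppings-stg.example.com"
az keyvault secret set --vault-name pizza-kv-prd --name toppings-api-domain --value "toppings-prd.example.com"
```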
How to use ADF Linked Service Parameters
Despite what “global” implies, linked services cannot see the values of ADF global parameters.
However, the values of “linked service parameters” are also easy to change at publish-time with CI/CD.
When creating a linked service:
- Set it up, test the connection, and save your work using the easiest approach possible.
- For Azure resources, it might be labeled “from Azure subscription.”
- Once you have created it, click its name to edit it and change over to manual entry.
- Look through the prepopulated data fields, determine which values are environment-dependent, and parameterize them. For example:
- Observe that the Base URL field of the Key Vault linked service you created says: “https://pizza-kv-dev.vault.azure.net”
- Note that the distinguishing part of the Base URL is: “pizza-kv-dev”
- Expand the Parameters toggle toward the bottom of the editing wizard.
- Create a parameter. Name it something like: “kv_url_name”
- Set the parameter’s default value to the value that you are currently working with. Here, that would be: “pizza-kv-dev”
- Change the Base URL field’s value to an ADF expression such as:
@concat('https://', linkedService().kv_url_name, '.vault.azure.net/')
- Re-test the linked service’s connection and save your work.
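After those steps, the underlying definition of the (hypothetically named) PizzaKeyVault linked service looks roughly like the JSON below – a sketch only, so verify the exact shape against what your factory actually exports:

```json
{
  "name": "PizzaKeyVault",
  "properties": {
    "type": "AzureKeyVault",
    "parameters": {
      "kv_url_name": {
        "type": "String",
        "defaultValue": "pizza-kv-dev"
      }
    },
    "typeProperties": {
      "baseUrl": "@{concat('https://', linkedService().kv_url_name, '.vault.azure.net/')}"
    }
  }
}
```

Because kv_url_name defaults to the dev value, the dev environment keeps working as before, and CI/CD only has to override that one parameter for stg and prd.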
How can ADF system assigned managed identities help?
- When you authenticate ADF into data sources passwordlessly, you reduce the number of secrets that you have to store (and therefore edit per environment) in the first place.
- Your ADF instance’s “System Assigned Managed Identity” (“SMI”) can often be selected as the Authentication Method (or similar label, such as “authentication type”) when setting up a data source for a Linked Service, an ADF pipeline “Web” activity, etc. Tip: if you see this option, take advantage of it (a sketch follows below).
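For the “Web” activity case, here is a hedged sketch of a pipeline activity that reads the toppings-api-domain secret directly from Key Vault using the factory’s SMI – no stored credential anywhere. The vault URL is hardcoded only for brevity; in practice it could itself come from a global or linked service parameter:

```json
{
  "name": "GetToppingsApiDomain",
  "type": "WebActivity",
  "typeProperties": {
    "url": "https://pizza-kv-dev.vault.azure.net/secrets/toppings-api-domain?api-version=7.4",
    "method": "GET",
    "authentication": {
      "type": "MSI",
      "resource": "https://vault.azure.net"
    }
  }
}
```

Downstream activities can then reference the secret with an expression like @activity('GetToppingsApiDomain').output.value.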
How to grant an ADF SMI appropriate permissions
Don’t forget to ensure that each ADF instance’s SMI has been granted appropriate access (authorization / “authZ”) to the data source in question. For example (hedged command sketches follow this list):
- Grant the pizza-adf-dev SMI read access to appropriate secrets in the “pizza-kv-dev” Key Vault.
- Grant the pizza-adf-stg SMI read access to appropriate secrets in the “pizza-kv-stg” Key Vault.
- Grant the pizza-adf-prd SMI read access to appropriate secrets in the “pizza-kv-prd” Key Vault.
- Add the pizza-adf-dev SMI as an Azure SQL Database user with read access to the “completed” view within orders-db-dev.
  - Delete any old password-based database users within orders-db-dev, if applicable.
- Add the pizza-adf-stg SMI as an Azure SQL Database user with read access to the “completed” view within orders-db-stg.
  - Delete any old password-based database users within orders-db-stg, if applicable.
- Add the pizza-adf-prd SMI as an Azure SQL Database user with read access to the “completed” view within orders-db-prd.
  - Delete any old password-based database users within orders-db-prd, if applicable.
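For the Key Vault piece, the sketch below assumes the vault uses the classic access-policy model; if yours uses Azure RBAC instead, assign the built-in “Key Vault Secrets User” role to the SMI rather than setting a policy. The object-ID placeholder is the SMI’s object (principal) ID shown on the ADF resource’s Identity blade:

```bash
# Allow the pizza-adf-dev factory's SMI to read secrets in pizza-kv-dev
az keyvault set-policy \
  --name pizza-kv-dev \
  --object-id <pizza-adf-dev-smi-object-id> \
  --secret-permissions get list
```

For the Azure SQL Database piece, a sketch of the T-SQL you might run against orders-db-dev while connected as a Microsoft Entra admin (the dbo schema for the “completed” view is assumed; adjust as needed):

```sql
-- Create a database user for the factory's SMI and grant read access to the view
CREATE USER [pizza-adf-dev] FROM EXTERNAL PROVIDER;
GRANT SELECT ON OBJECT::dbo.completed TO [pizza-adf-dev];
```

Repeat the equivalent grants for the stg and prd environments’ SMIs against their own vaults and databases.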