In an Azure Pipelines YAML file I can only assign a single value to a variable. This is what I have now:
parameters:
- name: env
  displayName: 'Environment'
  type: string
  default: ''

variables:
- name: sys_file1
  value: 'first_file.xml'
- name: sys_file2
  value: 'second_file.xml'
In the pipeline's bash scripts I need to read the filename from a variable and apply an update to that file. I have already made this change for one file using the variable sys_file1, but now there can be multiple files that all need the same update.
How can I read all the filenames and apply the same change to each of them?
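For context, the existing single-file step looks roughly like the sketch below; the sed expression is only a placeholder for the real update, the point is just how the filename is read from the variable:

steps:
- bash: |
    # Placeholder edit: the actual change is more involved.
    sed -i 's/old-value/new-value/' "$(sys_file1)"
  displayName: 'Update first file'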
Option 1: Can I assign multiple values to a single variable and then iterate over that list of values in the bash script?
Option 2: In the bash script, how can I read the filenames from all variables whose names match the pattern sys_file*?
Can I assign multiple values to a single variable and then iterate over that list of values in the bash script?
Yes. This can be achieved.
Here is an example:
variables:
- name: sys_file
  value: 'first_file.xml second_file.xml third_file.xml'

steps:
- bash: |
    for item in $(sys_file); do
      echo "$item"
    done
Result: the single bash task echoes each filename on its own line.
Alternatively, you can also consider the pipeline's built-in each expression to loop over the variable's values.
Here is an example:
variables:
- name: sys_file
  value: first_file.xml,second_file.xml,third_file.xml

steps:
- ${{ each file in split(variables.sys_file, ',') }}:
  - bash: |
      echo ${{ file }}
In this case, the pipeline adds one bash task per value in the variable, so each filename is handled by its own task.
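Conceptually, the each loop above expands at template-compilation time into the equivalent of writing one bash step per value by hand (a rough sketch of the expanded form, not literal compiler output):

steps:
- bash: |
    echo first_file.xml
- bash: |
    echo second_file.xml
- bash: |
    echo third_file.xml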
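As for Option 2, one possible approach is to enumerate the matching environment variables inside the script. This is only a sketch and assumes that each pipeline variable is exposed to the bash task as an upper-cased environment variable (sys_file1 becomes SYS_FILE1, sys_file2 becomes SYS_FILE2):

steps:
- bash: |
    # List variable names starting with SYS_FILE and read each
    # value via bash indirect expansion.
    for name in $(compgen -A variable | grep '^SYS_FILE'); do
      file="${!name}"
      echo "Would update: $file"
    done

If that mapping does not hold in your setup, you can map the variables explicitly in the task's env section to get the same effect.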