I created an Azure machine image for an Ansible control node using Packer.
I use the shell provisioner to install Ansible from a script as follows:
#!/bin/bash
sudo apt-get update
sudo apt-get install -y python3-pip
pip3 install ansible
#...
This installs Ansible, but the Ansible CLI tools end up in the ~/.local/bin directory of the packer user (i.e. under /home/packer/.local/bin). I subsequently deploy the image to Azure using the Terraform azurerm_linux_virtual_machine resource as follows:
resource "azurerm_linux_virtual_machine" "ansible" {
  # ...
  admin_username  = var.username
  source_image_id = var.source_image_id

  admin_ssh_key {
    username   = var.username
    public_key = var.ssh_public_key
  }
  # ...
}
The username variable is set to ryan. When I SSH into this VM as ryan, the Ansible CLI tools are not under /home/ryan, but under /home/packer.
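As far as I can tell, this happens because running pip3 install as a non-root user falls back to a per-user install: the scripts go into <user-base>/bin, and the user base is derived from the home directory of whichever user runs the provisioner (the packer user during the image build). A quick way to see this:

```shell
# Print the per-user install base that pip uses for user installs.
# Run as the packer user during the build, this reports
# /home/packer/.local, which is why the CLI tools end up in
# /home/packer/.local/bin.
python3 -m site --user-base
```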
How should I deal with this?
Is there a way to get Packer to run as a different user, or should I add a user in my Packer shell provisioner script, e.g. sudo adduser --disabled-password --gecos "" ryan, or is there a way to get Ansible to use a different location for the CLI tools, or something else?
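For reference, one option I'm considering is to make the install user-independent by installing Ansible system-wide in ansible.sh. A minimal sketch, assuming a root-level pip install is acceptable for this image (I believe a root pip install places the entry points in /usr/local/bin, which is on every user's default PATH):

```shell
#!/bin/bash
# Variant of ansible.sh: install Ansible system-wide so the CLI tools
# land in /usr/local/bin instead of the packer user's ~/.local/bin.
set -euo pipefail

sudo apt-get update
sudo apt-get install -y python3-pip

# Running pip as root installs into the system site-packages and puts
# the entry points (ansible, ansible-playbook, ...) in /usr/local/bin.
sudo pip3 install ansible
```

With this, the tools would be visible to ryan (or any other admin user) without copying anything between home directories.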
Full packer template is as follows:
source "azure-arm" "this" {
  client_id                         = var.client_id
  client_secret                     = var.client_secret
  image_publisher                   = var.source_image_publisher
  image_offer                       = var.source_image_offer
  image_sku                         = var.source_image_sku
  location                          = var.location
  managed_image_name                = var.image_name
  managed_image_resource_group_name = var.resource_group_name
  os_type                           = "Linux"
  subscription_id                   = var.subscription_id
  tenant_id                         = var.tenant_id
  vm_size                           = var.vm_size
}
build {
  sources = ["source.azure-arm.this"]

  provisioner "shell" {
    environment_vars = [
      "SSH_PUBLIC_KEY=${var.ssh_public_key}",
      "SSH_PRIVATE_KEY=${var.ssh_private_key}",
      "SSH_PASSPHRASE=${var.ssh_passphrase}",
      "PACKER_CONFIG_DIR=/home/${var.username}"
    ]
    script = "./ansible.sh"
  }
}
^^ note: I tried using the PACKER_CONFIG_DIR environment variable, but this didn't work…