Using DSC to download from Azure file storage to an Azure VM

I’ve been exploring PowerShell Desired State Configuration a lot lately, and one of the things I’ve come up against is the need to move files from a file share in an Azure storage account to the local file system of a VM (usually to install software). Since I’ve been trying to achieve as much as possible using DSC, I decided to explore whether I could do this through DSC – and the news was good.

One of the out of the box resources for DSC is the “File” resource (doco at https://technet.microsoft.com/en-us/library/dn282129.aspx), which is great for copying files around and can copy from network locations – which is essentially what Azure storage file shares are. So the first part was to get my File resource set up:

 File SQLBinaryDownload
 {
     DestinationPath = "C:\SQLInstall"
     Credential = $storageCredential
     Ensure = "Present"
     SourcePath = "\\[StorageAccountName].file.core.windows.net\software\SQL Server 2014"
     Type = "Directory"
     Recurse = $true
 }

This is pretty straightforward – you’re specifying the source path (in my case I had previously created a folder for SQL in my “software” share, so update the path accordingly – if you need help getting content into your file share, check out the doco), the local path you want to copy to, and in this case I’m saying to bring the folder and all of its child items (through the Recurse property). The last part in there is the $storageCredential variable, which I set up in my Configuration as a parameter:

 param(
     [Parameter(Mandatory=$true)]
     [ValidateNotNullOrEmpty()]
     [PSCredential]
     $storageCredential
 )

This allows me to specify it as a variable when I register the DSC extension on the Azure VM. If you try to hard-code a credential in there, DSC will throw an error telling you that you have to either take it out or encrypt the file (because storing credentials in plain text is not smart, so it tries to protect you from that). The next question is what credentials I’m giving it, and how I pass them in – that looks a little like this:

Set-AzureVMDscExtension -VM $vm -ConfigurationArchive MyConfiguration.ps1.zip -ConfigurationName MyConfiguration -ConfigurationArgument @{ storageCredential= (Get-Credential) }

In the above example, $vm is your VM object (retrieved with Get-AzureVM), “MyConfiguration.ps1.zip” is your uploaded configuration archive, and “MyConfiguration” is the name of the specific configuration in the file to apply to the server. You can see the arguments at the end though – this is where we pass in a hashtable of the parameters we declared in the file, in this case my credentials. So this will prompt the user for the required credentials, which we will use for the storage share.
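For context, the surrounding workflow with the classic Azure PowerShell cmdlets looks roughly like this (a sketch – the cloud service and VM names are placeholders):

```powershell
# Package the configuration script and upload it to your default storage account
Publish-AzureVMDscConfiguration -ConfigurationPath .\MyConfiguration.ps1

# Fetch the VM, register the DSC extension against it, then push the change to Azure
$vm = Get-AzureVM -ServiceName "my-cloud-service" -Name "my-vm"
$vm = Set-AzureVMDscExtension -VM $vm -ConfigurationArchive "MyConfiguration.ps1.zip" `
    -ConfigurationName "MyConfiguration" `
    -ConfigurationArgument @{ storageCredential = (Get-Credential) }
$vm | Update-AzureVM
```

Note that Set-AzureVMDscExtension only modifies the local VM object – nothing happens on the Azure side until Update-AzureVM is called.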

What credentials should we use here? This is the easy part. The user name is just the name of your storage account (the same bit at the start of the UNC path in the first snippet), and the password is one of the access keys for the storage account (either the primary or secondary key, whichever you prefer – I tend to use the primary key for everything I interact with manually, and the secondary for automation-related activities, so that if something isn’t going right I can reset each independently). Once you provide the credentials, the VM will use the configuration you have specified, and when the DSC consistency check runs it will check that the required files exist and download them for you if not. Once they have downloaded you can then look to trigger installations or whatever else it is you need to do with the files!
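If you’d rather not be prompted interactively, you can build the credential from the storage key in script (a sketch – the account name is a placeholder):

```powershell
$accountName = "mystorageaccount"
# Grab the secondary key (per the convention above, secondary for automation)
$key = (Get-AzureStorageKey -StorageAccountName $accountName).Secondary
$securePassword = ConvertTo-SecureString $key -AsPlainText -Force
$storageCredential = New-Object System.Management.Automation.PSCredential($accountName, $securePassword)
```

You can then hand $storageCredential straight to -ConfigurationArgument instead of (Get-Credential).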

Update 8 July 2015: After starting to push this approach out a bit more broadly I’ve run into some issues with it – it appears that if two machines hit a File resource like this against the same Azure file share at the same time, one of them will report an error stating that concurrent connections are not allowed. There might be a workaround for this; I didn’t spend a lot of time looking, as I was planning to refactor how I get files out to VMs for provisioning (a topic I plan on blogging about in the near future), and I could also work around it by provisioning my VMs in sequence instead of in parallel. Keep this in mind when planning to use this approach.
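The sequential workaround mentioned above can be as simple as a loop (a sketch – the service and VM names are placeholders, and $storageCredential is assumed to have been built already, e.g. via Get-Credential):

```powershell
# Register the extension on each VM one at a time instead of in parallel,
# to reduce the chance of two machines hitting the file share together
foreach ($vmName in @("vm-01", "vm-02", "vm-03")) {
    $vm = Get-AzureVM -ServiceName "my-cloud-service" -Name $vmName
    $vm = Set-AzureVMDscExtension -VM $vm -ConfigurationArchive "MyConfiguration.ps1.zip" `
        -ConfigurationName "MyConfiguration" `
        -ConfigurationArgument @{ storageCredential = $storageCredential }
    $vm | Update-AzureVM
}
```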

7 Replies to “Using DSC to download from Azure file storage to an Azure VM”

  1. Any ideas why this no longer works? I can't even get as far as pushing the config to the target server! Despite the extremely simple example, I get an error when trying to compile the MOF.

    ConvertTo-MOFInstance : System.InvalidOperationException error processing property 'Credential' OF TYPE 'File': Converting and storing encrypted passwords as plain text is not
    recommended. For more information on securing credentials in MOF file, please refer to MSDN blog: go.microsoft.com/fwlink

    At D:\Google Drive\Scripting\PowerShell\Azure DSC\Lewis\SimpleExample.ps1:20 char:9

    The mentioned article goes into great detail about using certificates to encrypt and decrypt credentials, but fails to comprehend that you can't "get" a certificate from a machine that hasn't been built yet and which you're using DSC to build and configure from scratch.

    Like you, my intention is to bind to an Azure file share to download installation media for some components that I need that are far too large to include in the configuration archive itself. If for example I have a 200MB MSI package and I need to make a quick change to the DSC configuration, I have to update and upload the whole archive, which isn't conducive to rapid development.

    Just for completeness, this is the whole file.

    $ConfigData = @{
        AllNodes = @(
            @{
                NodeName                    = '*'
                PSDscAllowPlainTextPassword = $True
            }
        )
    }

    Configuration SimpleExampleWithCredentials {
        param(
            [Parameter(Mandatory=$true)]
            [ValidateNotNullOrEmpty()]
            [PSCredential]$Credentials
        )

        Import-DscResource -ModuleName PSDesiredStateConfiguration

        Node Localhost {
            File MyData
            {
                DestinationPath = "C:\SQLInstall"
                Credential = $Credentials
                Ensure = "Present"
                SourcePath = "\\myfileshare.file.core.windows.net\software"
                Type = "Directory"
                Recurse = $true
            }
        }
    }

    SimpleExampleWithCredentials -ConfigurationData $ConfigData -Credentials (Get-Credential)

  2. Hi Lewis,

    What you are hitting isn't actually a problem with the Azure side of things here – your problem is to do with how you are specifying to allow the plain text credentials (which I would stress is really not recommended; using certificates here really isn't hard, especially with a self-signed cert). What you need to do is add an entry for the actual machine name to the node data, and then refer to $AllNodes.NodeName in the configuration instead of hard-coding 'Localhost'. My full example below:

    $ConfigData = @{
        AllNodes = @(
            @{
                NodeName                    = '*'
                PSDscAllowPlainTextPassword = $True
            }
            @{
                NodeName = 'dsc-01'
            }
        )
    }

    Configuration SimpleExampleWithCredentials {
        param(
            [Parameter(Mandatory=$true)]
            [ValidateNotNullOrEmpty()]
            [PSCredential]$Credentials
        )

        Import-DscResource -ModuleName PSDesiredStateConfiguration

        Node $AllNodes.NodeName {
            File MyData
            {
                DestinationPath = "C:\SQLInstall"
                Credential = $Credentials
                Ensure = "Present"
                SourcePath = "\\myfileshare.file.core.windows.net\software"
                Type = "Directory"
                Recurse = $true
            }
        }
    }

    SimpleExampleWithCredentials -ConfigurationData $ConfigData -Credentials (Get-Credential)
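    As an aside, the certificate-based alternative mentioned above is driven entirely from the node data – a sketch, assuming you have already exported the target node's DSC encryption certificate; the path and thumbprint here are placeholders:

    ```powershell
    $ConfigData = @{
        AllNodes = @(
            @{
                NodeName        = 'dsc-01'
                # Public key exported from the target node; the LCM on that node
                # holds the private key and decrypts the credential at apply time
                CertificateFile = 'C:\certs\dsc-01.cer'
                Thumbprint      = 'PLACEHOLDER-THUMBPRINT'
            }
        )
    }
    ```

    With that in place you can drop PSDscAllowPlainTextPassword entirely, and the compiled MOF will contain an encrypted credential instead of plain text.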

  3. Hi Azure user,

    That would work as well – another option I wrote about more recently is the Azure Storage DSC module (github.com/…/xAzureStorage). It does require that the Azure PowerShell cmdlets are installed, but it will keep the files in sync nicely.

    – B

      1. @john, Azure Automation DSC will encrypt the credential for you, so you could essentially use the same configuration and config data referenced above and it will automatically be encrypted.
