Azure DevOps: Secrets in Files

Published: Feb 18, 2021 by Isaac Johnson

We’ve discussed AKV and Hashi Vault, but one simple pattern for secrets storage and dissemination is to use encrypted file storage.  While not as elegant, it can be more than sufficient, fast, and readily available for the many cases where the KISS method is all you need.

Storing in AWS S3

First, if you haven’t already, set up an AWS IAM user.

Create a new user in IAM in the AWS Console

/content/images/2021/02/image-20.png

Type in a username and select programmatic access only

/content/images/2021/02/image-21.png

Next we should create a group (we don’t really want this user in the Administrators group)

/content/images/2021/02/image-22.png

And we can set S3 Access only

/content/images/2021/02/image-23.png

After we select the group, we can come back to the create wizard

/content/images/2021/02/image-24.png

And set any tags

/content/images/2021/02/image-25.png

Finally, create the identity

/content/images/2021/02/image-26.png

When created, grab the access key ID and secret access key

/content/images/2021/02/image-27.png
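
If you prefer the CLI to the console, a rough equivalent would look like the following (a minimal sketch, assuming the AWS CLI is already configured with an admin profile; the user and group names here are just examples):

# create the programmatic user and a group scoped to S3
aws iam create-user --user-name azdo-s3-user
aws iam create-group --group-name s3-only
aws iam attach-group-policy --group-name s3-only \
  --policy-arn arn:aws:iam::aws:policy/AmazonS3FullAccess
aws iam add-user-to-group --group-name s3-only --user-name azdo-s3-user

# generate the access key ID and secret used by the service connection
aws iam create-access-key --user-name azdo-s3-user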

Back in Azure DevOps

Create a new AWS Service Connection

/content/images/2021/02/image-28.png

The details refer to the access key ID and secret access key we created in the first section

/content/images/2021/02/image-29.png

Create a bucket

Now in S3, let’s make a bucket with encryption…

/content/images/2021/02/image-30.png

Then, when we create the bucket, let’s set a few things…

/content/images/2021/02/image-31.png

Then enable versioning and encryption

/content/images/2021/02/image-32.png

After we create, we should see confirmation

/content/images/2021/02/image-33.png
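
The same bucket could also be created from the CLI; a minimal sketch, assuming the bucket name and region from the screenshots:

# create the bucket (us-east-1 needs no LocationConstraint)
aws s3api create-bucket --bucket idjazdodemobucket1 --region us-east-1

# enable versioning
aws s3api put-bucket-versioning --bucket idjazdodemobucket1 \
  --versioning-configuration Status=Enabled

# enable default server-side encryption (SSE-S3)
aws s3api put-bucket-encryption --bucket idjazdodemobucket1 \
  --server-side-encryption-configuration '{"Rules":[{"ApplyServerSideEncryptionByDefault":{"SSEAlgorithm":"AES256"}}]}'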

In Azure DevOps

We can then create a ClassicUI pipeline and add the s3 task:

/content/images/2021/02/image-34.png

In a YAML pipeline, that task should look like:

steps:
- task: AmazonWebServices.aws-vsts-tools.S3Upload.S3Upload@1
  displayName: 'S3 Upload: idjazdodemobucket1'
  inputs:
    awsCredentials: AzDOS3
    regionName: 'us-east-1'
    bucketName: idjazdodemobucket1
    sourceFolder: tmp

with the corresponding download running in a different stage/agent:

steps:
- task: AmazonWebServices.aws-vsts-tools.S3Download.S3Download@1
  displayName: 'S3 Download: idjazdodemobucket1'
  inputs:
    awsCredentials: AzDOS3
    regionName: 'us-east-1'
    bucketName: idjazdodemobucket1
    targetFolder: '$(Build.ArtifactStagingDirectory)'

In ClassicUI, that looks like

/content/images/2021/02/image-35.png

Verification

In testing, we can see the data uploaded:

/content/images/2021/02/image-17.png

And subsequently downloaded (this time on a Windows host):

/content/images/2021/02/image-18.png
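
We could also spot-check the bucket contents from a local shell, assuming the AWS CLI is configured with the same access key:

aws s3 ls s3://idjazdodemobucket1/ --recursive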

Azure Blob Storage

Create a new storage account

/content/images/2021/02/image-36.png

Blob store created

/content/images/2021/02/image-37.png

Storage account

/content/images/2021/02/image-38.png

Then set networking. While I’m leaving it public, I am setting the routing preference to Microsoft network routing

/content/images/2021/02/image-39.png

Then set versioning on blobs

/content/images/2021/02/image-40.png

Then create

/content/images/2021/02/image-41.png

Once created

/content/images/2021/02/image-42.png

Verify encryption is enabled

/content/images/2021/02/image-43.png
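
For those who prefer scripting it, a minimal az CLI sketch of the same storage account setup (the resource group name here is hypothetical):

# create the storage account with Microsoft network routing
# (encryption at rest with Microsoft-managed keys is on by default)
az storage account create --name idjstorageacc02 --resource-group idjdemorg \
  --location eastus --sku Standard_LRS --routing-choice MicrosoftRouting

# enable blob versioning
az storage account blob-service-properties update --account-name idjstorageacc02 \
  --resource-group idjdemorg --enable-versioning true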

Now we can create a blob container

/content/images/2021/02/image-44.png

And we can see it listed here

/content/images/2021/02/image-45.png
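
Or, equivalently, from the CLI:

az storage container create --name storagedemo01 --account-name idjstorageacc02 --auth-mode login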

In Azure DevOps

We can now use pipeline steps to access it:

steps:
- task: AzureCLI@2
  displayName: 'Azure CLI'
  inputs:
    azureSubscription: 'Pay-As-You-Go(d955c0ba-13dc-aaaa-aaaa-8fed74cbb22d)'
    scriptType: bash
    scriptLocation: inlineScript
    inlineScript: 'az storage blob upload-batch -d storagedemo01 --account-name idjstorageacc02 -s ./tmp'

And then download as

steps:
- task: AzureCLI@2
  displayName: 'Azure CLI '
  inputs:
    azureSubscription: 'Pay-As-You-Go(d955c0ba-13dc-aaaa-aaaa-8fed74cbb22d)'
    scriptType: bash
    scriptLocation: inlineScript
    inlineScript: 'az storage blob download-batch --destination $(Build.ArtifactStagingDirectory) --source storagedemo01 --account-name idjstorageacc02'

And as a ClassicUI pipeline

/content/images/2021/02/image-46.png

When run, we can see it uploaded

/content/images/2021/02/image-47.png

And then downloaded later on a different host

/content/images/2021/02/image-48.png
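
As with S3, we could spot-check the container from a local shell:

az storage blob list --container-name storagedemo01 --account-name idjstorageacc02 --auth-mode login -o table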

Summary

I showed two easy ways to use encrypted storage in AWS and Azure to store some data in one stage and retrieve it in a completely different one.  While I didn’t dig into passing and parsing key/value pairs, one could use the format of their choosing to accomplish that.  I often use JSON with jq to set and retrieve values, e.g.

$ echo '[{"keyName":"whatever","keyValue":"somevalue"}]' > t.json
$ cat t.json
[{"keyName":"whatever","keyValue":"somevalue"}]

$ cat t.json | jq -r ".[] | .keyValue"
somevalue
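
Setting a value works much the same way; a small sketch using jq’s update-assignment (the new value here is just an example):

$ jq '(.[] | select(.keyName=="whatever") | .keyValue) = "newervalue"' t.json
[
  {
    "keyName": "whatever",
    "keyValue": "newervalue"
  }
]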

Storing less than a GB of JSON data in an LRS account in US-East would likely run me 3 cents. If I added GRS redundancy, I would up that to about 7 cents. AWS S3 is similarly priced.

While not a replacement for AKV or SSM, using Blob and S3 provides a nice KISS method when you just need a reliable and fast way to store some configurations and/or secrets used by various pipelines.
