
Wednesday, October 19, 2016

CustomScriptExtension in ARM Templates and Shared Access Signature (SAS) Tokens

I had some trouble with a custom script extension where the script required a SAS token to download some software. The token was simply truncated after the first '&'.

After some digging I thought I had to put the SAS token in quotes, and looking into C:\Packages\Plugins\Microsoft.Compute.CustomScriptExtension\1.8\RuntimeSettings\0.settings that seemed like a sensible solution. I could even copy the "commandToExecute" from that file, run it manually, and get the expected result. In the variables section I added:


  "variables": {
    "singlequote": "'",

And then put single quotes around parameters('SASToken'). But no dice. The token was still truncated, this time with a ' in front...

So I decided to get rid of the '&', at least temporarily: base64 encoding to the rescue. Luckily there is an ARM template function, base64(), for just that. In the script I then added:

$SASToken = [System.Text.Encoding]::UTF8.GetString([System.Convert]::FromBase64String($SASToken))

Problem solved!

Seems to me that there is something odd in how the custom script extension calls PowerShell in this particular instance.
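The round trip is easy to sanity-check outside ARM. Here is a minimal Python sketch of the same idea (the sample token is made up); the base64 alphabet contains no '&', so the encoded value passes safely through the command line:

```python
import base64

# a made-up SAS-like token containing the problematic '&' characters
token = "sv=2015-04-05&sr=b&sig=abc123"

# what the ARM template function base64() produces
encoded = base64.b64encode(token.encode("utf-8")).decode("ascii")
assert "&" not in encoded  # safe to pass on a command line now

# what the PowerShell snippet above does on the receiving end
decoded = base64.b64decode(encoded).decode("utf-8")
print(decoded)
```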

Wednesday, October 5, 2016

Begin..Process..End and Error Handling

I had to wrap my mind around error handling and the begin..process..end blocks in PowerShell functions. It becomes really fun when you start throwing different ErrorActions at them!

This will be mostly PowerShell snippets and their results. So without further ado, let's dive into some code!

This is a really simple function:

function myfunc
{
    [cmdletbinding()]
    param()

    begin
    {
        # some init code that throws an error
        try
        {
            throw 'some error'
            # code never reaches here
            Write-Output 'begin block'
        }
        catch [System.Exception]
        {
            Write-Error 'begin block'
        }
    }
    process
    {
        Write-Output 'process block'
    }
    end
    {
        Write-Output 'end block'
    }
}
Clear-Host
$VerbosePreference = "Continue"

Write-Host "-ErrorAction SilentlyContinue: the Write-Error in the begin block is suppressed" `
    -ForegroundColor Cyan
myfunc -ErrorAction SilentlyContinue
Write-Host "-ErrorAction Continue: displays the Write-Error in the begin block,
but the process and end blocks are executed" `
    -ForegroundColor Cyan
myfunc -ErrorAction Continue
Write-Host "-ErrorAction Stop: displays the Write-Error in the begin block. 
The Write-Error in the begin block becomes a terminating error. 
The process and end blocks are not executed" `
    -ForegroundColor Cyan
myfunc -ErrorAction Stop

The output is:




We see that for both Continue and SilentlyContinue the process block is executed. With Stop, Write-Error becomes a terminating error and the pipeline is stopped.
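As a loose analogy (in Python rather than PowerShell, and ignoring ErrorAction entirely), the three blocks relate to the pipeline input like this:

```python
def myfunc(items):
    out = []
    out.append('begin block')   # 'begin' runs once, before any input
    for x in items:             # 'process' runs once per pipeline item
        out.append(x)
    out.append('end block')     # 'end' runs once, after all input
    return out

print(myfunc([1, 2]))
```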

Let us not dwell on that and move on to a function with some actual input:

# with input
function myfunc
{
    [cmdletbinding()]
    param(
        [Parameter(
            Position=0, 
            Mandatory=$true, 
            ValueFromPipeline=$true,
            ValueFromPipelineByPropertyName=$true)
        ]
        $x
    )

    begin
    {
        # No errors in the begin block this time
        Write-Output 'begin block'
    }
    process
    {
        if($x -gt 2)
        {
            Write-Error "$x is too big to handle!"
        }
        # echo input
        Write-Output $x
    }
    end
    {
        Write-Output 'end block'
    }
}
Clear-Host
$VerbosePreference = "Continue"

Write-Host "-ErrorAction SilentlyContinue: the Write-Error in the process block is suppressed" `
    -ForegroundColor Cyan
@(1,2,3) | myfunc -ErrorAction SilentlyContinue

Write-Host "-ErrorAction Continue: The Write-Error in the process block is displayed,
but `$x is still echoed" `
    -ForegroundColor Cyan
@(1,2,3) | myfunc -ErrorAction Continue

Write-Host "-ErrorAction Stop: The Write-Error in the process block becomes a terminating error, 
`$x > 2 is NOT echoed" `
    -ForegroundColor Cyan
@(1,2,3) | myfunc -ErrorAction Stop

The output is:



Now we see something unintended happening for both Continue and SilentlyContinue: 3 is still echoed, even though it triggered Write-Error. With Stop the story is as before; Write-Error becomes a terminating error and 3 is not echoed.
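The fall-through is easy to reproduce in any language: reporting an error is not the same as leaving the function. A Python sketch of the same mistake (errors collected in a list instead of written to an error stream):

```python
def process(x, errors):
    if x > 2:
        errors.append(f"{x} is too big to handle!")
        # BUG: no early return, so execution falls through...
    return x  # ...and x is emitted anyway

errors = []
out = [process(x, errors) for x in (1, 2, 3)]
print(out)     # 3 sneaks through despite the error
print(errors)
```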

Now we basically just add a return statement:

# with input
function myfunc
{
    [cmdletbinding()]
    param(
        [Parameter(
            Position=0, 
            Mandatory=$true, 
            ValueFromPipeline=$true,
            ValueFromPipelineByPropertyName=$true)
        ]
        $x
    )

    begin
    {
        # No errors in the begin block this time
        Write-Output 'begin block'
    }
    process
    {
        if($x -gt 2)
        {
            Write-Error "$x is too big to handle!"
            # return moves on to the next pipeline item. NOTE: the 'continue' keyword would NOT do that; it shuts down the pipeline completely
            return
        }
        # echo input
        Write-Output $x
    }
    end
    {
        Write-Output 'end block'
    }
}
Clear-Host
$VerbosePreference = "Continue"

Write-Host "-ErrorAction SilentlyContinue: the Write-Error in the process block is suppressed
(for both 3 and 4), and `$x > 2 is not echoed" `
    -ForegroundColor Cyan
@(1,2,3,4) | myfunc -ErrorAction SilentlyContinue

Write-Host "-ErrorAction Continue: The Write-Error in the process block is displayed
(twice, for both 3 and 4). `$x > 2 is not echoed" `
    -ForegroundColor Cyan
@(1,2,3,4) | myfunc -ErrorAction Continue
Write-Host 'The script keeps running' `
    -ForegroundColor Cyan

Write-Host "-ErrorAction Stop: The Write-Error in the process block becomes a terminating error,
'3' is NOT echoed. return is never executed; the pipeline stops" `
    -ForegroundColor Cyan
@(1,2,3,4) | myfunc -ErrorAction Stop
Write-Host 'this is not reached' `
    -ForegroundColor Cyan

The output is:



We see that in all three cases x greater than 2 is not echoed. Now ErrorAction Stop makes sense: we indicate that if the function fails for any input, we do not wish to continue the script.
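In Python terms, return in the process block corresponds to continue in a loop (skip this item), while ErrorAction Stop corresponds to raising an exception (abort everything). A sketch, assuming items greater than 2 are the bad ones:

```python
def run(items, stop_on_error=False):
    out = []
    for x in items:
        if x > 2:
            if stop_on_error:
                # like ErrorAction Stop: abort the whole pipeline
                raise ValueError(f"{x} is too big to handle!")
            continue  # like 'return' in the process block: skip this item only
        out.append(x)
    return out

print(run([1, 2, 3, 4]))  # bad items are skipped, the rest are echoed
# run([1, 2, 3, 4], stop_on_error=True) would raise on 3 and never reach 4
```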

And we can add some error handling:

# with input
function myfunc
{
    [cmdletbinding()]
    param(
        [Parameter(
            Position=0, 
            Mandatory=$true, 
            ValueFromPipeline=$true,
            ValueFromPipelineByPropertyName=$true)
        ]
        $x
    )

    begin
    {
        # No errors in the begin block this time
        Write-Output 'begin block'
    }
    process
    {
        try
        {
            if($x -gt 2)
            {
                # this puts the error into the $Error variable
                throw "$x is too big to handle!"

            }
            # echo input
            Write-Output $x
        }
        catch [System.Exception]
        {
            Write-Error $Error[0].Exception
            Write-Verbose "continue on the pipeline '$x'"
            return
        }
        Write-Verbose "continue on the pipeline '$x'"
    }
    end
    {
        Write-Output 'end block'
    }
}
Clear-Host
$VerbosePreference = "Continue"

Write-Host "-ErrorAction SilentlyContinue: the Write-Error in the process block is suppressed 
(for both 3 and 4), and `$x > 2 is not echoed" `
    -ForegroundColor Cyan
@(1,2,3,4) | myfunc -ErrorAction SilentlyContinue

Write-Host "-ErrorAction Continue: The Write-Error in the process block is displayed 
(twice, for both 3 and 4). `$x > 2 is not echoed" `
    -ForegroundColor Cyan
@(1,2,3,4) | myfunc -ErrorAction Continue
Write-Host 'The script keeps running' `
    -ForegroundColor Cyan

Write-Host "-ErrorAction Stop: The Write-Error in the process block becomes a terminating error, 
'3' is NOT echoed. return is never executed and the pipeline stops" `
    -ForegroundColor Cyan
@(1,2,3,4) | myfunc -ErrorAction Stop
Write-Host 'this is not reached' `
    -ForegroundColor Cyan

The output is:


I hope this helps in understanding how the begin..process..end blocks work with regard to errors and error handling. I know I will be returning to this time and again :D


Tuesday, October 4, 2016

ARM Template Tip: Names

Naming resources in ARM templates can get quite lengthy. This is an example of naming a network interface:

"name": "[concat(parameters('vmNamePrefix'), '-', padLeft(copyIndex(1), 2, '0'), variables('nicPostfix'), '-', padLeft(copyIndex(1), 2, '0'))]",

And we have to reference this at a later point for the virtual machine resource. If we then change the name, we will have to remember to change this reference also.

What we can do is to define the name in the variables section like this:

    "nic": {
      "name": "[concat(parameters('vmNamePrefix'), '-', padLeft('{0}', 4, '0'), variables('nicPostfix'), '-', padLeft('{0}', 4, '0'))]"
    }

(I like to group variables). And then reference this variable in the resource like:

"name": "[replace(variables('nic').name, '{0}', string(copyIndex(1)))]",

What I have done is to make {0} a placeholder and then replace it with the result from copyIndex(). We now have a central location to change the name if needed with no need to update any resources.
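The placeholder trick is plain string replacement, so it is easy to model outside ARM. A Python sketch of the same idea (zero-padding omitted for brevity; the prefix and postfix values are made up):

```python
vm_name_prefix = "VM"
nic_postfix = "-nic"

# defined once, in one central place (like the 'variables' section)
nic_name_template = vm_name_prefix + "-{0}" + nic_postfix + "-{0}"

# each resource replaces the placeholder with its copy index
names = [nic_name_template.replace("{0}", str(i)) for i in (1, 2)]
print(names)
```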

It would be cool if we had a template function for formatting:

"name": "[format(variables('nic').name, copyIndex(1), '-nic')]"

It would take the string as input and then a variable number of additional arguments. Ex.


"nic": {
   "name": "[concat(parameters('vmNamePrefix'), '0{0}', '{1}')]"
}

would become ({0} replaced with the result from copyIndex(1) and {1} replaced with '-nic'):

"VM01-nic"

And it could be made more advanced, perhaps leaning on the good ol' sprintf.
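The wished-for function exists in most languages. Python's str.format, for example, is close to what is described, including sprintf-style zero-padding (the "VM" prefix here is just an illustration):

```python
# '{0:02d}' pads the copy index to two digits, like padLeft(..., 2, '0')
nic_name = "{prefix}{0:02d}{1}".format(1, "-nic", prefix="VM")
print(nic_name)
```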

Thursday, September 29, 2016

Logging webhooks using Azure Functions and OMS Log Analytics

We recently discussed webhooks internally at work, and the question popped up of how to maintain and log their activity. Webhooks normally have a limited lifespan (though it could be years), and they should generally be kept secret even if they are accompanied by a token that authorizes the caller.

What better way to log the webhook calls than using OMS Log Analytics? Once the data is logged there you have a plethora of options on what to do with it. Go ask my colleague Stanislav.

I also wanted to try out the fairly new Azure Functions, which acts as a relay to the Log Analytics Data Collector API. The webhook itself comes from an Azure Automation runbook.

I documented the entire solution on Github, and you can find the repository here - it takes you from A to Z on how to set up the various moving parts in Azure. I hope you can find some inspiration on how to manage your webhooks.

Monday, September 26, 2016

Hello Azure Functions - Integrating with Github

I had a hard time finding out how to integrate a Github repository with Azure Functions, or rather which files and folder structure to put in the repository so that Azure Functions would pick them up. A very basic setup follows.

This assumes an understanding of Github and Azure Functions. There are plenty of resources out there explaining that better than I can.

Github

Create a fresh repository and create a file, host.json, in the root:
{
 "functions" : [
  "HelloAzureFunctions"
 ],
 "id": "ed5d78e575e14f0481c899532d41f5c0"
}

Now create a folder called HelloAzureFunctions. Inside that create a file, function.json:

{
    "bindings": [
        {
            "type": "httpTrigger",
            "name": "req",
            "direction": "in",
            "methods": [ "get" ]
        },
        {
            "type": "http",
            "name": "res",
            "direction": "out"
        }
    ]
}

And in this case we will use PowerShell; we need a file called run.ps1:
$requestBody = Get-Content $req -Raw | ConvertFrom-Json
$name = $requestBody.name

if ($req_query_name) 
{
    $name = $req_query_name 
}

Out-File -Encoding Ascii -FilePath $res -inputObject "Hello, $name"
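For readers who do not speak PowerShell, the logic of run.ps1 is roughly this (a Python paraphrase for illustration, not part of the actual function):

```python
import json

def run(request_body, query_name=None):
    # the name comes from the JSON body, unless a 'name' query parameter is present
    name = json.loads(request_body).get("name")
    if query_name:
        name = query_name
    return f"Hello, {name}"

print(run('{"name": "Azure"}'))
print(run('{"name": "Azure"}', query_name="World"))
```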
That is it! Commit to Github, go to your Azure Function app, and integrate with the repository. HelloAzureFunctions should appear as a function after a short while.

You can fork my repository if you like: https://github.com/spaelling/hello-azure-functions. There is also a PowerShell script there that can be used for testing (you can also just paste the webhook URI into a browser if you prefer).

Also keep your webhooks a secret. In the aforementioned script I show how to get the webhook URI from an Azure Key Vault.

Friday, September 16, 2016

Analyzing your bank statement using PowerBI

I wanted to figure out what we were spending our money on, but our bank is lagging behind when it comes to finance insight, so what better way than to use PowerBI?

First you need to export your bank statements to CSV. We have multiple accounts, so I only looked at the account we use for everyday shopping (food, etc.). I had some trouble importing the CSV directly into PowerBI, so I imported it into Excel, where you then have to select (all) the data and turn it into a table (Ctrl+T) before you can import it into PowerBI.

I had to sanitize the data: removing transfers from one account to another, and purchases that should have been made on another account. If you spot something later, simply remove the row in Excel and import the file again.
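If you prefer scripting the sanitizing step, it is a simple filter. A Python sketch using the csv module (the column names and delimiter are made up; your bank's export will differ):

```python
import csv
import io

# made-up statement data; the 'Text' and 'Amount' columns are assumptions
raw = """Date;Text;Amount
2016-09-01;GROCERY STORE;-250.00
2016-09-02;Transfer to savings;-1000.00
2016-09-03;BAKERY;-45.50
"""

rows = csv.DictReader(io.StringIO(raw), delimiter=";")
# drop internal transfers before importing into PowerBI
clean = [r for r in rows if "transfer" not in r["Text"].lower()]
print([r["Text"] for r in clean])
```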



You are now ready to create some reports based on the bank statement data. It should look something like this (if there is only a single row in the fields box it means that PowerBI was unable to make sense of the data):



Now check the box next to the Σ and one of the other options, then click the pie-chart icon. My bank statement comes with a category and sub-category for each entry. If you have some sort of categorisation, and checked that, then you will see something like this (without redactions):


Wow! Ok, you could do that in Excel also (though I would spend hours figuring out how in Excel). It simply shows the distribution of purchases across the categories. The big one is grocery shopping, which is the primary purpose of this account.

Now comes the magic. Deselect the graph, then again check the box next to the Σ and whatever translates into an entry description, and select the table icon. That is more or less just what we have in Excel, right?

Select one of the categories in the piechart and see what happens.



It now shows only the entries in the table (a sum of the amounts) that are related to the category you selected. This is just the tip of the iceberg. PowerBI can do much more than that!

Finally you can figure out what your wife is spending all your money on ;)

How Awesome is Docker?

Fully configured Ubuntu server up and running in minutes? On Windows? Impossible you say? It is not!

Start by installing Docker. We will try to run the following Python code in the Docker container.


try:
    from slackclient import SlackClient
    #import os # we need this for the next line
    # print the environment variable we exported in Dockerfile
    print(os.environ.get('SOME_TOKEN'))
except Exception as e:
    print("Error when importing 'SlackClient'")
    print(repr(e))
else:
    print("Success!!!")
finally:
    pass

Copy this snippet to a file and name it somecode.py. Create a file called Dockerfile and paste the following into it.

FROM ubuntu:latest
# update apt-get then install python3.5 and pip3
RUN apt-get -y update && apt-get install -y python3.5 && apt-get install -y python3-pip
# update pip3
RUN pip3 install --upgrade pip
# install the python module slackclient
RUN pip3 install slackclient==1.0.0
# copy source files
COPY somecode.py /src/somecode.py
# export some tokens
ENV SOME_TOKEN='this is a token'
# run the bot
CMD ["python3.5", "/src/somecode.py"]
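The ENV instruction is what makes os.environ.get('SOME_TOKEN') work inside the container. The same mechanism can be illustrated without Docker (here the environment variable is set from Python to simulate what the Dockerfile does):

```python
import os

# simulate what the Dockerfile's ENV instruction does for the container
os.environ.setdefault("SOME_TOKEN", "this is a token")

# what somecode.py reads back (once 'import os' is in place)
print(os.environ.get("SOME_TOKEN"))
```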

Then run these few lines of PowerShell.


cd $PSScriptRoot
# build the image (based on 'Dockerfile' in this folder) - ignore the security warning
docker build -t codebeaver/dockerisawesome --force-rm .
# run a container using the image we just created; --rm removes the container after it exits
docker run --rm codebeaver/dockerisawesome

It may take some time to download the Ubuntu base image (ca. 500 MB).

I intentionally put in an error: we did not import the os library in the Python code. Uncomment import os and run the PowerShell code again. That is it! You can easily install additional Python libraries by editing the Dockerfile.

You can run the container in Azure and there are various services for running Docker containers for you.
