Friday, 30 October 2015

PowerShell Unit Testing with Pester 3.3

I have been working with the Pester unit testing framework recently, so I thought I would share some of the more interesting bits...

PowerShell returns everything from a function!

I think this is a trap every PowerShell programmer has fallen into at some point, probably at the beginning. A function returns everything which is not "captured", not just the value given to the return statement. In my case the answer was to cast the call to my XML manipulation function to [void], so that the return statement would actually return the custom PSObject I was expecting instead of an XmlElement which was being emitted at some point in my code.


   # cast the call to [void] so its output is discarded rather than leaking into the return value
   [void](do-XmlManipulation $myParameter)

This was the explanation which really hit home; it helped me realise the awful truth!

Mocking

Mocking in Pester is fantastically simple: there is a command called Mock.

You simply write Mock, followed by the name of the function you wish to mock, and then a script block describing what it should return.

Mock get-MyObject {
   return [PSCustomObject]@{ Project = "New and Exciting Project" }
}


In this case I have returned a PowerShell object with a string property called "Project". For example, the code in the module may look like this:
## myRealCodeModule.psm1

function get-MyObject {
   $xmlDocument = Get-Content -Path "C:\temp.xml"
   return $xmlDocument
}

$document = get-MyObject

But the unit test will simply bypass the attempt to get the xml file and will return our object, as defined in our unit test, instead.

Unit testing PowerShell modules

I keep all my most important logic in PowerShell modules. Sure, I will unit test the main PowerShell script, but that will mostly be mocking; the real meat should be found in nicely named, well separated modules.

Example of testing mocked functions in the main PowerShell script:

$scriptPath = Split-Path $script:MyInvocation.MyCommand.Path
Import-Module ("{0}\migrate.ps1" -f $scriptPath) -Verbose

Describe 'Migrate' {
    Context "When calling" {
        Mock Restore-Database { return "OK" } -Verifiable

        It 'Should call all mocks' {
            # the script's function under test would be invoked here so that the verifiable mock is exercised
            Assert-VerifiableMocks
        }
    }
}

One of the strengths of modules is that they must be imported. When they are imported, a snapshot is kept in the scope of the importing script. Unfortunately, that means that when you change the module the changes are not immediately reflected, because the unit test script still holds the old snapshot. My particular solution is to use Remove-Module. There is a recommended solution from Pester here; however, for my purposes that is overkill.


Get-Module | Remove-Module
$scriptPath = (Get-Location).Path
Import-Module ("{0}\myRealCodeModule.psm1" -f $scriptPath)


The other strength of modules is that only exported functions are exposed, which means that functions can be internalised or made "private". This is a good thing because I have more scope for writing neat, well named functions which will make my code more readable to future maintainers.

Unfortunately this is a potential barrier to unit testing effectively. Luckily Pester has the answer here as well with "InModuleScope":


   Describe 'When unit testing get-MyObject which is not exported by the module' {
      InModuleScope myRealCodeModule {
         It 'Should have called the mock' {
            
            Mock get-MyObject {
               return [PSCustomObject]@{ Project = "New and Exciting Project" }
            }
            
            my-MainExportedAndThereforePublicFunction

            Assert-MockCalled get-MyObject -Times 1
         }
      }
   }




For a getting-started guide, see this excellent blog.

Monday, 12 October 2015

ES6 (or ES2015) is finally here!

Short post this time. I have been away from the JavaScript space for a little while, but now I have started a new NodeJS project.

I am incredibly excited about the widespread adoption of ES6 this year and I have to say that VS Code has just made me incredibly happy. Not only does VS Code give ES6 intellisense, but by telling the editor that you are targeting ES6 you can also stop it drawing red squiggly lines. Fantastic stuff. Just add a jsconfig.json file with the following:

{
    "compilerOptions": {
        "target": "ES6"
    }
}
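Just to prove the point, here is the flavour of ES6 that VS Code now copes with happily. This is a small illustrative sample rather than anything from my actual project:

// illustrative ES6 sample, not from my actual project
const greeting = 'Hello';                 // block-scoped constant
let names = ['Alice', 'Bob'];             // block-scoped variable

class Greeter {
    constructor(prefix) {
        this.prefix = prefix;
    }
    greet(name) {
        return `${this.prefix}, ${name}!`;    // template literal
    }
}

const greeter = new Greeter(greeting);
names.forEach(name => console.log(greeter.greet(name)));    // arrow function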

The only bad news is that I still have to remember which ES6 features have actually been implemented by the V8 engine.

Well, again I was very happy to discover that there is not a great deal left. The only proviso is that you have to run node --harmony, which is something you can alias in your command-line shell of choice.

The other option is to run Babel. In the end I have settled on doing the following:

  • npm install babel
  • Add require("babel/register") to the server.js file (sketched below)
The advantage of this approach is that I do not have to remember to write a custom build script with the correct parameters.
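For what it is worth, the entry point then ends up looking something like the sketch below; the './app' file name is just an example:

// server.js, kept in plain ES5 so it can run without transpilation
require("babel/register");    // hooks require() so files loaded after this line are transpiled (babel 5.x style)
require("./app");             // the rest of the application, written in ES6 ('./app' is an example name)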


Monday, 3 August 2015

A Basic Gulp Gotcha

Recently I have been incorporating Gulp into my personal website project. For those completely unfamiliar with Gulp, it is a build tool used primarily to amalgamate your CSS or JavaScript into a single file, reducing the number of web requests required to satisfy your various dependencies. Check out this highly amusing slideshow comparing Gulp with Grunt; it also gives further detail on why these sorts of build tools are such a good idea. It is worth noting that bundling files into one may be less necessary if you are utilising HTTP/2 "server push", but there are still many plugins and use cases which you may find useful.

I configured my gulpfile.js to do just three things (a sketch follows below):
  1. Concatenate all files from a set directory into one file and put it in the build folder
  2. Minify the resulting JavaScript
  3. Rename the file, changing the extension to .min.js
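The gulpfile itself ends up looking roughly like the sketch below; the plugin choices and paths are illustrative rather than my exact file:

// gulpfile.js, roughly what the 'app-build' task does (illustrative paths)
var gulp = require('gulp');
var concat = require('gulp-concat');    // 1. concatenate
var uglify = require('gulp-uglify');    // 2. minify
var rename = require('gulp-rename');    // 3. rename to .min.js

gulp.task('app-build', function () {
    return gulp.src('app/js/**/*.js')          // all files from a set directory
        .pipe(concat('app.js'))                // into one file
        .pipe(uglify())                        // minify the resulting JavaScript
        .pipe(rename({ extname: '.min.js' }))  // change the extension to .min.js
        .pipe(gulp.dest('build'));             // put it in the build folder
});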

Everything was working great. By simply running 'gulp app-build' from the command line I could produce a nice small file, perfect for a snappy web experience.

However, I came across some serious issues when trying to deploy the project to production. I tried adding Gulp to my Kudu deployment script, just as I had done to load my npm modules. I was frustrated when, despite my insistent hacking (which went further into the early hours of the night than I would like to admit!), I kept hitting the same error over and over.

When I came back to revisit this problem at a sensible time of day, I discovered my mistake.
This issue on GitHub was a particular giveaway; my thanks to Matthieu "Swiip". I had incorrectly assumed that gulp was a tool that I could make a part of my build.

UPDATE:
Having spoken to a colleague about this, it turns out that you can in fact use gulp as part of the build. However, any packages listed under "devDependencies" in the npm package file are not installed during deployment. So all I need to do is move the ones my build needs into the main "dependencies" section.
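In package.json terms that means something along these lines; this is a trimmed illustration and the version numbers are only examples:

{
    "dependencies": {
        "gulp": "^3.9.0",
        "gulp-concat": "^2.6.0",
        "gulp-uglify": "^1.4.0",
        "gulp-rename": "^1.2.0"
    },
    "devDependencies": {
        "gulp-jshint": "^1.11.0"
    }
}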

I don't think that anybody doubts the reasons for minifying JS files, but in case anybody is interested, this is the difference it made to my AngularJS app: at 938KB unminified and 123KB minified, AngularJS itself is the biggest file both before and after minification.

When I finally got gulp working properly in my production environment and had concatenated and minified my dependencies, things started looking really peachy.




Saturday, 4 April 2015

The full DNS lowdown for your website - A practical guide focusing on Windows Azure

Ok, so what is DNS, or the Domain Name System? I have always thought of it as a human-friendly way of naming computers which are serving content. However, while that is a snappy explanation to serve up to a non-technical person, it does not really get to the nub of the issue.

DNS assigns names to IP addresses, it is hierarchical, and it is a protocol.

The DNS protocol specification simply describes how DNS records should be formatted and exchanged. This means that there is no need for a single centralised database; indeed, that would be very unwise!

Each DNS record has a type. There are a number of types, but the most commonly used are A, CNAME, MX and NS.

For those who really want to dig into the full specification, Microsoft have compiled the current documents for the DNS specification here. For instance, if you want to know what the record types really are, see RFC 1034 (November 1987).


Setting up DNS for your website

So, moving away from theory to real life... I want to set up my website so that when you type mydomain.com into Chrome it goes straight to my website.

First of all I need to buy a domain name. There are a number of providers; Gandi and DNSimple are considered particularly good ones.

Once I have bought a domain name I can get going and set up a CNAME record (Canonical Name, i.e. an alias; see RFC 1034 :-) ). In this first instance I am simply pointing my domain name at the Azure host name, which will be something like myproject.azuresites.net.

Therefore my CNAME record looks like:
host      type        target
www       CNAME       myproject.azuresites.net

So when I go to www.mydomain.com I get the resource hosted at myproject.azuresites.net.

What is actually happening here is that Azure has already set up an A record pointing from myproject.azuresites.net to a virtual IP address. The virtual IP address will then resolve to any number of different boxes in their "Cloud" server farm. I imagine it is probably powered by nerd sweat =P.

DNS changes can take up to 48 hours to propagate, although personally I have never had to wait longer than half an hour. There are a few little tricks which can help to minimise any confusion:

  1. Try checking a DNS propagation tracking site, e.g. whatsmydns.net (or check from code, as sketched after this list)
  2. If it has propagated and you are still not seeing the correct result, try flushing your local DNS cache by going to the cmd prompt and typing: ipconfig /flushdns
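If you happen to have Node to hand, its built-in dns module will also tell you what a record currently resolves to. A quick illustrative check, using the example names from above:

// check-dns.js, a quick look at what the CNAME currently resolves to (illustrative names)
var dns = require('dns');

dns.resolveCname('www.mydomain.com', function (err, addresses) {
    if (err) {
        console.error('Lookup failed:', err.code);
    } else {
        console.log('www.mydomain.com is a CNAME for:', addresses);
    }
});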

Incoming email

What I want to do is push all the emails sent to my domain's email address into my Gmail account. To do this I need to understand a little about the DNS Mail Exchanger (MX) record.

So, the MX record:
host      type        target
@         MX          ? depends ?

The target depends on what you are trying to achieve. You need a "mailbox" which is going to track emails going to your domain. E.g. to email@mydomain.com.

The easiest way is to buy a mailbox from your domain supplier and then configure your preferred email client (e.g. Gmail, Yahoo etc.) to talk to their mail servers. You can then send (SMTP) and receive (POP3/IMAP) mail from the mailbox over an SSL-encrypted connection. Take a look at this page for help on setting this up. I am working on providing you an alternative solution, so watch this space...!

Outgoing Email

This is actually a big subject; many organisations and individuals have got into a lot of trouble over outgoing emails from an app. Spam has always been the issue. The steps put in place to reduce spam have unfortunately produced a world where people often say "Your system/you didn't send the email :("...."Yes I have :/"..."Have you checked your spam filter? :)".



Non-www

Generally people no longer bother typing the full address www.mydomain.com. In fact the average Joe will be completely baffled as to why their browser throws back an error.

Again DNS comes to the rescue, try:
host      type        target
@         CNAME       myproject.azuresites.net

If you use this in conjunction with the 'www' record mentioned above then your users should be able to go to www.mydomain.com or mydomain.com with the same result.... some juicy content. hmmm! (Be aware that some DNS providers will not allow a CNAME on the bare '@' record; in that case you need an A record, or the provider's ALIAS/ANAME equivalent, instead.)

Generally you may think that one or other of the domain names (non-www or www) is the preferable URL to put in front of "the average Joe". What you probably want is for the server to immediately redirect to, say, mydomain.com as soon as somebody goes to www.mydomain.com in their browser.

There are a number of techniques for doing this. With an Apache web server you can use a plain-text configuration file which by convention is named .htaccess (IIS has an equivalent in the URL Rewrite module and web.config). Placing this file in the root directory of your site will cause the web server to apply its rules when serving a resource. You can write something like:

# Redirect www urls to non-www
RewriteEngine on
RewriteCond %{HTTP_HOST} !^mydomain\.com
RewriteRule (.*) http://mydomain.com/$1 [R=301,L]

I am using NodeJS, which has its own way of achieving the same thing:

// redirect for non-www
app.all('/*', function(req, res, next) {
  if (req.headers.host.match(/^www\./) !== null) {
    res.redirect(301, req.protocol + '://' + req.headers.host.replace(/^www\./, '') + req.url);
  } else {
    next();
  }
})

app.set('trust proxy', true); // so req.protocol reflects the original scheme when running behind Azure's load balancer

Sub-domains

Again, these are easy to set up. Simply use the * symbol in your CNAME record:
host      type        target
*         CNAME       myproject.azuresites.net

Then a request to a particular subdomain, e.g. elephants.mydomain.com, can be sniffed for and intercepted by your code, as sketched below.
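In an Express app that sniffing might look something like the sketch below; the subdomain name and the handler are only illustrative:

// route requests for elephants.mydomain.com to a dedicated handler (illustrative)
app.all('/*', function (req, res, next) {
  var host = req.headers.host || '';
  if (host.indexOf('elephants.') === 0) {
    res.send('Elephant-specific content goes here');
  } else {
    next();    // everything else falls through to the normal routes
  }
});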