Irregular Injection of Opinion

 Monday, 03 May 2010
Windows Azure Certificates for Self-management Scenarios

The Windows Azure Management API uses X.509 certificates to authenticate callers. To call the API you need a certificate with both public and private keys on the client, and the public key uploaded into the Azure portal. But if you then want to call the management API from your Windows Azure VMs, you’ll also need to install the cert into the instances by declaring it in the service definition. This post will show you how.

I found it a bit of a pain to get going so here’s my simple guide. I used this to set up the certs for my favourite open source Azure toolkit, Lokad-Cloud. We’ll be creating a self-signed certificate, then uploading that certificate into the Windows Azure management portal. Finally we’ll add the certificate to our service model to ensure that Windows Azure installs the certificate into our VM instances when they start.

Here’s the approach in pictures so you can follow along.

Create a self-signed certificate in IIS7 Manager:

  1. Open IIS7 Manager
  2. Expand the node for your local machine
  3. Double-click Server Certificates
  4. Choose Create Self-Signed Certificate
  5. Give it a friendly name

You’re all done in IIS7 Manager. It’s just created a new certificate and added it into the Trusted Root Certification Authorities store on your machine. This is a highly trusted location for the cert, so do be careful if you ever export it with the private key included.

We need to export it with the public key included so that we can upload it to the Windows Azure Portal.

  1. Run CertMgr.msc – just type it in full into the Start box.
  2. Expand Trusted Root Certification Authorities –> Certificates
  3. Find your cert using the Friendly Name column, then right click it and choose Export
  4. Choose the option to NOT export the private key. You should generally avoid exporting the private key if at all possible*.
  5. Choose to export as a DER encoded binary
  6. Save it somewhere useful – we stick ours into source control
  7. Choose Export again, but this time choose to export the private key
    Leave the PFX options all unchecked
  8. Give it a sensible name.

You’re all done exporting. Now we need to upload it to Azure.

  1. Go to the Windows Azure Portal
  2. Choose the appropriate project
  3. Click the Account tab across the top of the page
  4. Click Manage My API Certificates
  5. Browse to and upload the *.cer file you created earlier
  6. You should now see the certificate listed against the account.
    Note that it’ll be listed using the Subject rather than the friendly name. So you’ll need to identify it by the thumbprint if you have multiple uploaded from the same machine. Azure Team: Can we please have the friendly name listed here?
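If you do end up having to identify a cert by thumbprint, you don’t have to hunt through CertMgr for it: the thumbprint is just the SHA-1 hash of the DER-encoded certificate bytes, so you can compute it straight from the .cer file you exported. A quick sketch (the file name is whatever you saved above):

```python
import hashlib

def cert_thumbprint(der_bytes):
    # The "thumbprint" shown in CertMgr and the Azure portal is simply
    # the SHA-1 hash of the DER-encoded certificate, in upper-case hex
    return hashlib.sha1(der_bytes).hexdigest().upper()

# Point this at the DER (.cer) file you exported earlier:
# with open("mycert.cer", "rb") as f:
#     print(cert_thumbprint(f.read()))
```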

Now we’re ready to use our certificate from the Windows Azure Tools for Visual Studio.

  1. Open your Windows Azure Project – We’re using Lokad-Cloud here
  2. Expand the Cloud Project to show the Roles.
  3. Right click the Web Role and choose properties
    This will open the snazzy Windows Azure graphical UI, which is much nicer than editing the model XML by hand.
  4. Choose the Certificates Tab
  5. In our case there is already a certificate entry defined by the default Lokad Model definition.
    Click the ellipsis (…) at the end of the Thumbprint column to open the certificate chooser dialog.
  6. Choose the cert from the dialog
  7. Repeat for the worker role
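For reference, the GUI is just writing entries into the service model files for you. From memory of the schema, the result looks roughly like this – the certificate name is a placeholder of my own, and you’d substitute your own thumbprint:

```xml
<!-- In ServiceDefinition.csdef, inside the WebRole/WorkerRole element:
     declares that the role expects a certificate in the given store -->
<Certificates>
  <Certificate name="SelfManagementCert"
               storeLocation="LocalMachine"
               storeName="My" />
</Certificates>

<!-- In ServiceConfiguration.cscfg, inside the matching Role element:
     binds that name to an actual certificate by thumbprint -->
<Certificates>
  <Certificate name="SelfManagementCert"
               thumbprint="YOUR_CERT_THUMBPRINT"
               thumbprintAlgorithm="sha1" />
</Certificates>
```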

The last thing we need to do is upload the certificate into our cloud service before we can upload our packages.

  1. Create a new Cloud Service
  2. Scroll to the bottom of the page to find Certificates and choose Manage
  3. Browse for your certificate (including the private key)
  4. Punch in your password and hit upload
  5. Confirm that you’re all uploaded.

You’re all done. Now you can happily deploy the app to Windows Azure and have your certificate deployed into your Azure instances as well. This means that your Azure roles can now call the management API themselves.
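As a taste of what that self-management call looks like, here’s a minimal Python sketch of an authenticated List Hosted Services request. The host name and x-ms-version value are as I remember them at the time of writing – check the current API docs – and you’d first need to convert the PFX into a PEM file containing both the cert and the private key:

```python
import http.client
import ssl

MANAGEMENT_HOST = "management.core.windows.net"
API_VERSION = "2009-10-01"  # management API version current when this was written

def hosted_services_path(subscription_id):
    # Path of the List Hosted Services operation for a subscription
    return "/%s/services/hostedservices" % subscription_id

def list_hosted_services(subscription_id, pem_file):
    # Client-certificate auth: the PEM must contain the certificate
    # *and* its private key
    ctx = ssl.create_default_context()
    ctx.load_cert_chain(pem_file)
    conn = http.client.HTTPSConnection(MANAGEMENT_HOST, context=ctx)
    conn.request("GET", hosted_services_path(subscription_id),
                 headers={"x-ms-version": API_VERSION})
    resp = conn.getresponse()
    return resp.status, resp.read()  # 200 and an XML body on success
```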

*There will be times when you need to export the private key. We’ve had to do that in this scenario as we actually want to install the private key into our Azure instances. Another good example is a shared development environment: you can either share one certificate among everyone or have each developer upload their own public key. Because we’re using the Lokad tool, which explicitly supports just a single cert, we’ve exported our cert (password protected) into our source control system so all developers can install the same private key.

If you have done an export and want to install the cert with its private key onto a new machine, you’ll need to add it into the Local Computer –> Personal store.

  1. Run MMC by typing MMC in the Start/Run box
  2. Add the Certificates snap-in
  3. Choose Computer Account
  4. Choose local computer
  5. Open the Personal Store and choose More Actions>Import
  6. Browse to and import your certificate.

The certificate should now be visible in the Windows Azure Tools for VS.NET cert selector dialog.

Windows Azure|Monday, 03 May 2010 09:18:56 UTC|Comments [0]|    

 Tuesday, 20 April 2010
Windows Azure Pricing Calculator Spreadsheet

I’ve mentioned this at a heap of sessions I’ve presented at and have never managed to get around to posting it.

So. Attached is my Windows Azure Pricing spreadsheet.

Pretty simple – it has three worksheets.

  1. A simple Table storage calculator that determines the cost of table storage.
    This includes overhead calculation – i.e. what is the ratio of name to value in your name/value pairs.
    It also checks that your key lengths fall within the 260 char URI limit, among a few other things.
  2. A model of session state pricing.
    Basically shows that SQL Azure will typically be a cheaper option for Session state storage (or anything with little data on disk and high read/write counts).
  3. A detailed (very detailed) model of PhluffyPhotos.
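The gist of the session-state worksheet boils down to a few lines. Using the launch-era prices as I recall them (roughly US$0.15/GB-month plus $0.01 per 10,000 transactions for Table storage, versus a flat US$9.99/month for a 1GB SQL Azure database – check the current rate card before trusting these), the crossover is driven almost entirely by transaction volume:

```python
# Launch-era prices (assumptions from memory - check the current rate card)
TABLE_STORAGE_PER_GB_MONTH = 0.15   # US$ per GB per month
TABLE_TRANSACTIONS_PER_10K = 0.01   # US$ per 10,000 storage transactions
SQL_AZURE_1GB_PER_MONTH = 9.99      # US$ per month, flat

def table_storage_monthly_cost(gb_stored, transactions):
    # Table storage bills for both bytes at rest and every read/write
    return (gb_stored * TABLE_STORAGE_PER_GB_MONTH
            + (transactions / 10_000) * TABLE_TRANSACTIONS_PER_10K)

def cheaper_option(gb_stored, transactions):
    # SQL Azure's flat fee wins once per-transaction charges pile up
    table = table_storage_monthly_cost(gb_stored, transactions)
    return "SQL Azure" if SQL_AZURE_1GB_PER_MONTH < table else "Table storage"

# Session state is the classic case: tiny data, huge read/write counts.
# At 0.1GB and 10 million transactions/month the flat SQL Azure fee
# already beats Table storage's transaction charges.
```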

For more detail please see my MIX10 session on Storage in the Windows Azure Platform.

Windows Azure|Tuesday, 20 April 2010 21:06:10 UTC|Comments [0]|    

 Thursday, 07 January 2010
Chris and Dave do Denniston

Denniston is a former coal mining area situated on an alpine plateau about 30km north of Westport on the West Coast of New Zealand (Wikipedia here). It was an active mining area right up until the 1960s. In recent years it has fallen under the ambit of the Department of Conservation, which, along with the Buller Cycling Club, has been building out a bunch of mountain bike tracks around the area.

You can pull details on the cycling area from the DoC site and from the Buller Cycling club. 

Dave and I decided we’d try and visit the tracks as part of our summer South Island road trip. We drove in via Murchison as we did some paddling along the way. We had a couple of days in the Murchison region where the water levels were well up. We paddled a few runs that I’ve done before as well as a new run down the Mangles that started with about a 2m waterfall – it pulled quite a crowd when we ran it for some reason… didn’t seem that difficult.
The crew on the Middle Matakitaki river. Whitewater shuttles of doom.

We drove into Westport the evening before we planned to ride and found ourselves a nice motel (Buller Bridge Motel) with free WiFi. We got up early and headed up the hill. This place really is quite the plateau: the hill rises steeply off the ocean to an altitude of about 650m at the carpark. The views were pretty good for us; on a really fine and still day I can imagine you’d be able to see all the way up to Karamea.

Once we got to the top we had a bit of a potter around looking at some of the old mining ruins. Denniston is famous for the Denniston Incline which is a frighteningly steep, two pitch, coal railway that literally goes straight up the side of the hill. Check out this video from the NZ Archives for an idea of what it was like.

Below the carpark. Looking down to the incline top loading yard. Walking among the wagons in the loading yard Looking down the incline!

Then we headed up the hill to the Museum car park, which is the designated starting place for the mountain biking trails. While getting organized we saw the Google car driving around – yes, even in the middle of bloody nowhere there is Google!

Getting organized at the museum. Google car – hopefully we’ll be able to see ourselves!

We started out with the Ropers Hotel Circuit. 
Straight away the riding was quite different from anything we’d ridden before in New Zealand: lots of slick rock and ledges. It was quite fast riding and reasonably hard on the suspension – a 6” trail bike is ideal; we had a Trance and a Mojo. The last part of this track before it hits the road is walking only – for ecological reasons rather than ride-ability. You definitely want to do this track in the predominant direction indicated on the map as it would be a pain to walk up that hill.

Next we rode out on one of the longer trails, Sullivans Circuit, which went off the other side of the plateau back down towards Westport. It’s nominally a 4WD track, but I’d challenge most people to take their 4WD there and get it back in one piece.

Looking up towards Mt Rochfort. More ruins – we think the top of the old aerial ropeway.

This track had plenty of challenging riding – lots of large rocks and ledges. It was all too easy to go far faster than 0.1mm of lycra really should justify.

Just a little off

Finally we rode the Miners and Drill track circuits. Again, lots of hairy riding with a bunch of sketchy single track, some of which we both had to walk.

Gratuitous bike porn. Riding the slick rock. Self portrait.

There was still plenty of sign of the old coal mining operation, and indeed of the coal itself.

Coal mine fire Coal seam

Despite only doing about 30km (in 3.15hr!) we were in need of a beer once we got back to the car.

Post ride beer time. Some amazing rata trees on the West Coast – this is a small one!

Here’s the GPS dump


And the full TCX file from my Garmin 310XT is here:

Adventure Sports | Gettin Fit|Thursday, 07 January 2010 17:42:29 UTC|Comments [0]|    

 Friday, 01 January 2010
Interesting Stats from 2009

Thought it would be interesting to do a blog post of my stats for 2009.

It was a pretty hellish year for travel. It’s the first full year that I’ve used TripIt which has proved to be a really useful tool. TripIt also provides travel stats.

Now not all of this was work travel (I had a couple of overseas holidays) but it was still a pretty full on year.

This is borne out in my exercise stats. Obviously my most exercised location was Wellington, but, I also did a whole heap in Issaquah, Washington and also in Bangalore, India.

Num. activities - Location

I remained pretty steady over the course of the year in terms of the amount of exercise I did. It was still really hard to maintain anything approaching a good training program though.


Once again it was a shitty year for me skiing-wise, though these stats do miss a bit of skiing I had in January. I also did far less mountain biking than I would have liked.

Time - Category

I did do over 300hr of exercise over the year and travelled 2700km. I also burned 65,000 calories.

Will be interesting to see what I can manage this year.

Adventure Sports | Gettin Fit|Friday, 01 January 2010 03:42:44 UTC|Comments [0]|    

 Wednesday, 21 October 2009
Microsoft Office Extensions to the Open XML File Format (ISO29500) Specification

So the question was asked today in my Open XML Development for Office 2010 and Beyond session as to whether the Word 2010 extensions documentation was available anywhere.

I had to take an action item to follow up and find this.

It can be found here:

Office2010 | SPC09|Wednesday, 21 October 2009 23:37:28 UTC|Comments [1]|    

 Monday, 19 October 2009
My Hopeless Gym Experience at the Mandalay Bay Spa

So… I genuinely object to paying US$20/day to use the hotel gym. But needs must sometimes, and so it was that I found myself at the Mandalay Bay Spa, $20 the poorer, this morning. I’ve been working reasonably hard in the build up to the K1 road race in a couple of weeks so needed to get some time in on a stationary trainer.

Well, what an utterly useless experience that was. I got inside the gym to find that:

  1. They had two consumer grade exercycles and that was it as far as bikes went.
  2. None of the cardio equipment was available anyway

So even if I had been able to get on the equipment it wouldn’t have been much good for the interval workout I had planned – I ripped the crank off a consumer grade exercycle in Bangalore this year, so they’re just downright dangerous.

In the end I threw my toys out of the cot, got myself a refund of my fees and went for a run up the strip.


Think I’ll try and find someone friendly at the Luxor for the gym tomorrow, and I will not be staying at the Mandalay Bay for MIX10 next year, that’s for sure.

Gettin Fit | SPC09 | Travel|Monday, 19 October 2009 14:43:18 UTC|Comments [4]|    

 Tuesday, 22 September 2009
Smashed a new PB on my hill repeat ride

Got out for hills this morning on the road bike. It turned into a shorter ride, but I pushed the 2nd interval really hard (100% MHR) and in doing so smashed about 7% off my PB up the short pinch climb I do.


The average power was 450 watts over the 4min 31s. Pity the hill wasn’t a touch longer, as it meant my new PB 5 minute power ended up being only 431 watts – i.e. 30 seconds of my rest break at the top was included. Goal for the next few weeks is to really work on lifting my power profile. I think I’ll probably look at doing peak 1’w and 5’w efforts on Happy Valley Road as that certainly appears to help going up a hill. At the moment my 1’w isn’t even on the chart and I’d like to get my entire profile into the Cat 4 region over the next few weeks in the lead up to the K1 race. That means I need to be targeting 350 watts for 60 minutes, which I think is going to be the hardest bit.
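To see why the 5 minute number got diluted – and what the rest break actually cost – the power average over a window is time-weighted, so you can solve for the power held during the non-effort remainder. A quick back-of-envelope (times rounded to whole seconds):

```python
def implied_rest_power(effort_watts, effort_secs, avg_watts, window_secs):
    # The window average is the time-weighted mean of the effort and
    # the remainder, so solve for the power held during the remainder
    rest_secs = window_secs - effort_secs
    return (avg_watts * window_secs - effort_watts * effort_secs) / rest_secs

# 4min 31s at 450W inside a 5min window that averaged 431W:
rest = implied_rest_power(450, 4 * 60 + 31, 431, 5 * 60)
# -> roughly 253W for the 29s "rest" at the top of the climb
```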


The other interesting thing is to see just how much temperature variation affects my (supposedly temperature compensating) barometric altimeter on the Polar S625X. All 5 of those intervals should be the same height, but there is a definite downward trend over the set.

Adventure Sports | Gettin Fit|Tuesday, 22 September 2009 20:16:12 UTC|Comments [0]|    

 Thursday, 03 September 2009
Essential Tools for SQL Azure Development

Much like Wade Wegner, I think that SQL Azure is the jewel in the cloud for Microsoft. None of the other vendors have anything like it. While it can be a bit sticker-shock-ish given that nominally a gig in SQL Azure is 65 times the price of a gig in Azure storage, once you actually run some real world scenario models it turns out to be really well priced. Think of a SQL Azure instance not as 1GB (or 10GB… though I see no reason to use 10GB partitions) of storage but rather as the cheapest damn fully backed up and HA relational database solution you’ll find anywhere – and yes, FOSSers, I include your stuff in that calculation; no greasy haired, under washed and over WoWed engineers needed here.

Anyway… I digress.

A major PITA in using SQL Azure is that the tooling is tantalizingly close to being OK… but in many ways it just doesn’t work. Dumping a SQL script and then re-creating the DB in SQL Azure is a painful exercise in find and replace – check out the hands-on lab on migrating a DB to the cloud in the Azure training kit for the gory details…

Wade has a blog post up about a freebie tool written by George Huey that automates this process for you. Essential for your Azure kit bag. It will parse out all the unsupported stuff.

I ran it up and gave it a nice brutal challenge… the AdventureWorksLT script that’s used in the aforementioned lab. This includes both schema and data and is a decent effort to parse. The tool churned for a good 3 or 4 minutes… but I got a script out! The original script includes some real curve-balls, like XML indexes, and some tables with data to populate but no clustered index (SQL Azure needs a clustered index before you can insert into a table). I don’t expect it’ll get everything right, but let’s take a look at how it does against my hand crafted script…
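To give a flavour of the rewriting involved – this is my own illustrative sketch, not how George’s tool actually works – a lot of it is stripping out T-SQL that SQL Azure rejects, filegroup placement clauses being the classic example:

```python
import re

def scrub_for_sql_azure(script):
    # Illustrative only: strip a couple of constructs SQL Azure rejects.
    # Remove filegroup placement clauses like ON [PRIMARY]
    script = re.sub(r"\s+ON\s+\[PRIMARY\]", "", script, flags=re.IGNORECASE)
    # Remove TEXTIMAGE_ON filegroup clauses as well
    script = re.sub(r"\s+TEXTIMAGE_ON\s+\[PRIMARY\]", "", script,
                    flags=re.IGNORECASE)
    return script

ddl = "CREATE TABLE T (Id int) ON [PRIMARY]"
# scrub_for_sql_azure(ddl) -> "CREATE TABLE T (Id int)"
```

A real migration tool has to handle much more than this (XML indexes, unsupported types, the missing-clustered-index problem), which is exactly why a hand-rolled find and replace gets painful fast.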


It doesn’t support cut and paste or saving of the script yet, so I’ll need to go and change the source first… More reporting back from me later. This looks really promising and I’m confident it’s going to solve 90% of the pain points I’ve been hitting trying to move complex (hell, even simple) databases to the cloud.

Windows Azure|Thursday, 03 September 2009 11:12:12 UTC|Comments [4]|