Good old Microsoft!

Yes folks, I have been studying for my first Microsoft exam: Microsoft Windows Server 2012 (70-410, with R2 updates)! The first thing in the book is installing Windows Server. One of the useful features in Server 2012, Microsoft claim, is the ease with which you can move between the Core (the new default) and GUI installation options.

Since there’s no substitute for trying it out, I installed Server Core and have spent many hours since using the published commands to try to convert it to a GUI installation, so far without success. Talk about falling at the first hurdle: the book is 400 pages long and this is pretty much the first question in the test. Great.

Along the way I have learnt some useful things, such as how to remotely manage the server (in a workgroup), the key part of which is running

Set-Item WSMan:\localhost\Client\TrustedHosts -Value <YourTargetServerNameHere> -Force

from a Windows 8 client with RSAT installed, having first set the administrator password on the server and allowed remote management via sconfig.
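For anyone repeating this, the whole sequence from the client side looked roughly like the sketch below. The server name and account are placeholders, not my real ones, and this assumes you have already used sconfig on the server to set the administrator password and enable remote management:

```powershell
# On the Windows 8 client (RSAT installed). The server is in a workgroup,
# so there is no Kerberos mutual authentication; we have to trust it explicitly.
Set-Item WSMan:\localhost\Client\TrustedHosts -Value 'SERVERCORE01' -Force

# Then open a remote session using the server's local administrator account
# ('SERVERCORE01' and the account name here are hypothetical examples)
Enter-PSSession -ComputerName 'SERVERCORE01' -Credential 'SERVERCORE01\Administrator'
```

Once the session is open, the RSAT tools (Server Manager, etc.) can also be pointed at the same server name.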

The published way to perform the GUI install is the PowerShell command Install-WindowsFeature. My results are as follows:


A Google of that revealed several suggestions, most of which involved using the -Source switch, adding GPOs, or editing the registry to change the Windows Update servicing behaviour; none of them worked.
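For completeness, the registry edit most of those suggestions point at is, as I understand it, the one behind the “Specify settings for optional component installation and component repair” Group Policy. A hedged sketch, assuming that mapping is right and with an example path from my own media:

```powershell
# Assumption: these values mirror the optional-component servicing GPO.
# LocalSourcePath points the feature install at local media instead of Windows Update.
New-Item -Path 'HKLM:\SOFTWARE\Microsoft\Windows\CurrentVersion\Policies\Servicing' -Force

Set-ItemProperty -Path 'HKLM:\SOFTWARE\Microsoft\Windows\CurrentVersion\Policies\Servicing' `
    -Name LocalSourcePath -Value 'd:\sources\sxs'   # example path, adjust to your media
```

In my case, as noted above, this made no difference.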

Several different forms of syntax are described for the -Source option; I have tried many of them and none worked for me.
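For the record, the -Source variants I understand to be valid point either at the side-by-side component store on mounted install media or at a network copy of it. Paths here are examples from my setup, not gospel:

```powershell
# Side-by-side store on locally mounted install media (D: is my DVD drive)
Install-WindowsFeature Server-Gui-Mgmt-Infra, Server-Gui-Shell -Source d:\sources\sxs

# The same store copied to a network share ('fileserver' is a hypothetical name)
Install-WindowsFeature Server-Gui-Mgmt-Infra, Server-Gui-Shell -Source \\fileserver\share\sxs
```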

The latest, and most promising, blog post suggested specifying a WIM file as the source using the following command:

Install-WindowsFeature Server-Gui-Mgmt-Infra,Server-Gui-Shell -Source wim:d:\sources\install.wim:4

Great, I thought! This time it would be right, it would work, and no one would have to get nailed to anything. Having invoked this command and let it think for five minutes (long past the point where it had failed before), I got:
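One thing worth checking before blaming the command itself is whether index 4 in that WIM really is a GUI edition; the trailing number in the -Source string is the image index, and it varies between media. Get-WindowsImage (from the DISM PowerShell module, available on Windows 8 / Server 2012) will list what each index contains:

```powershell
# List the editions inside install.wim; the GUI conversion needs an index that
# carries a full "Server with a GUI" image, not a Core-only one.
Get-WindowsImage -ImagePath 'd:\sources\install.wim'
```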

Oh well, back to the virtual drawing board.



Recently, I have been getting more familiar with Azure. It’s good. Since its initial launch way back in 2010, Azure has been maturing and has become a leading contender for businesses seeking to take advantage of cloud technologies. Whilst Azure was later to market than its competitors, this has allowed it to be more forward-thinking and to direct itself more towards the service aspects of the cloud, rather than just a new home for old servers and apps.

My initial impression, formed not only from experience but also from the Azure documentation, is that it is targeted at cloud-native applications: an environment for companies to build new apps in, and for, the cloud. This makes sense. Whilst it is entirely possible and valid to lift and shift your existing business processes, as embodied in the applications a business uses, into the cloud, it is certainly not the same as developing new cloud-native apps.

Typically, large companies will have a mixture of bought and internally developed apps. In almost all cases, these applications were designed for a client-server or standalone on-premises environment, with connectivity “out” to the world. From recent experience, moving those apps to the cloud is a task not to be underestimated. It would be a little like taking old-fashioned analogue telephone handsets and installing them on the space station.

The trade-off for a business is the cost of migration versus the cost of re-development, whilst still delivering an overall saving by moving to the cloud.

One final comment: Azure is, in practice, a server-only environment. Clients are not “allowed”. As far as I can tell, without wishing to delve into the mysteries of Microsoft licensing, virtualising Windows 7 or 8 is forbidden under nearly all licence agreement types. There is no technical reason I know of that it would not work, but I’m fairly sure Microsoft intend to stop that particular use case, which is a bugbear if you are using Azure as a lab environment.