VDI on demand – exp.1

My first test was pretty successful! I had exported my old Portege 7200 from my VirtualCenter to an OVF file on my current laptop running VMware Workstation. This I managed to import and run. Cool! The only fly in the ointment was a bug in ovftool, which gives an error when parsing the OVF file. I got round this by upgrading to the latest version of ovftool and running it manually to convert the OVF to a VMX file, then importing that.
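
In case it helps anyone, the manual workaround boils down to a single command: ovftool picks the output format from the file extension, so you just point it at the OVF and name a VMX target. A sketch from memory, with placeholder paths standing in for wherever your exported files live:

    ovftool C:\exports\portege7200.ovf C:\VMs\portege7200\portege7200.vmx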

The next challenge is, of course, to move the VM I used today back to my VirtualCenter, to make it truly portable. Converting formats every time is not practical, so I need to see what I can do with a USB stick, which harks back to my datastore-on-a-USB-stick search of last year (still not supported by VMware). I should still be able to use one to store the VM files, though.

Next experiment.

Desktop Virtualisation Forum (cont.)

So, on to Roy Illsley of Ovum with “Market Trends: Is 2010 the tipping point for Desktop Virtualisation?”. This technology is new and still small: the global market for desktops is 600 million units, and virtualisation has about 0.5% of it. But it’s not just about the technology, it’s about the process too, and this is where there needs to be a change in mindset. There is a convergence in thinking between VMware and Citrix and a growing “ecosystem”, as evidenced by this forum and others.

Last, but not least, Simon Bullers, CEO of RedPixie, presented “Implementing best practice for desktop virtualisation”. His bullet points on how to deliver a successful desktop virtualisation project included:

  • Create a culture of teamwork. Think about whether you need a dedicated team or can do it as BAU (business as usual). Get the Data Centre involved.
  • Create a culture among end users. Build positive PR within the organisation and an appetite for change.
  • Technology – Client. There are various types of client to consider, there are various types of application to consider. Build a demo lab.
  • Technology – Storage. Measure and optimise throughout the course of the project.
  • Technology – Platform. Server hardware, blades v racks. Type 1 or type 2 hypervisor. Blend?
  • Size for the peak users.
  • Process: spend time information gathering and planning. Decide on scheduling.
  • Agree the appetite for risk.
  • Process: difficult decisions, don’t get involved in a blame game.
  • Financials – it’s a minefield! Does it need to show an ROI? User chargebacks?

A quick executive summary: Windows 7 is the best reason to adopt so far, but don’t play the funny numbers game (a reference, I guess, to the potential cost savings).

Many points to consider there, and the forum as a whole was very worthwhile. After this talk there was a panel Q&A session before lunch. I regret not being able to attend the afternoon breakout sessions, where no doubt plenty of discussion took place and many thought-provoking ideas developed from the morning’s themes.

To reiterate what I said yesterday: I think that, to be truly successful, a desktop virtualisation project has to deliver 100%. The overheads of maintaining two or more desktop platforms will quickly kill any efficiencies. That is why I would seek to convince the sceptics first and not start with the users who were already fans.

I was interested in the “Zero Client” device. For me this is the “ultimate” solution, or at least the most evolved of all the solutions currently in play. In some ways it is a direct descendant of the dumb terminals Wyse manufactured 30 years ago: a serial line delivering ASCII characters has been replaced by the LAN or WAN delivering rich media over optimised protocols. I would put my money on this type of device being the most successful as the field develops.

As to the next generation of device after that, I think it will naturally be led by advances in user interfaces in the field of human computer interaction. That field seems to have been quiet in recent years after the revolution in mice, graphics and workstations. Maybe that’s my cue to go and watch some more sci-fi movies… there must be another HCI revolution due soon?

Desktop Virtualisation Forum

This morning I attended the Desktop Virtualisation Forum in London, billed as “how to reduce costs, increase flexibility, and improve security through virtualisation” and organised by Outsourced Events and the BroadGroup. Platinum sponsors were Citrix and WYSE, with AppSense, ThinPrint and Pillar Data Systems also laying out their stalls.

There was a good attendance for a Monday morning. The morning consisted of four plenary sessions; the afternoon was divided into two breakout streams, but unfortunately I was unable to attend those.

Marion Howard Healy, as chair, introduced the speakers and started us off with a couple of statistics: desktop virtualisation has 24% penetration in the market (from which I take it that 24% of companies have some desktop virtualisation), and 59% of companies say that lack of experience is a barrier to adoption.

Patrick Irwin of Citrix gave the keynote, “Making sense of desktop virtualisation”. He started with an Albert Einstein quote, “Insanity is doing the same thing over and over again and expecting different results”, as a segue into the traditional way of deploying desktops, an 8-step loop. His definition of the desktop as three components (OS + apps + profile) which can be decoupled using virtualisation and delivered to the user as a service is a good model, but for me it is missing one vital component: data.

He explained that desktop virtualisation is not VDI; rather, VDI is one of a range of virtual desktop delivery options, from server-side compute (e.g. hosted shared desktops) to client-side compute (e.g. a local VM-based desktop built on a type 1 hypervisor).

The benefits of desktop virtualisation are agility, productivity and cost, although, unlike server virtualisation, it is initially cost neutral.

This excellent introduction was followed by David Angwin of WYSE, “A solution for all? The promise and reality of desktop virtualisation”. He reminded us that we still have to tackle the challenges of managing a desktop, and introduced the idea of an ideal client and how the promise of such a device differs from the reality. The promise is a move towards the cloud, delivering cost benefits in opex, capex and energy, as well as business benefits in terms of security, compliance and manageability.

The reality is that (according to Gartner) the TCO of a PC is $117/month versus $135/month for a PC + VDI solution (all the expensive back-end infrastructure, I guess). Savings start with a Thin Client (TC) and VDI at $72/month and extend with TC + WTS ($42/month) and TC + XA ($38/month). (’Fraid I didn’t catch the last two acronyms – presumably Windows Terminal Services and Citrix XenApp.)
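
A rough back-of-the-envelope using just those figures: moving a seat from a standard PC ($117/month) to TC + VDI ($72/month) saves $45 a month, or $540 a year per seat, so a 1,000-seat estate would save around $540,000 a year on these numbers, before counting the cost of the migration itself.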

He introduced the term “Zero Client” (coined by WYSE), a device which does one thing: connect to some virtual infrastructure. It has no O/S and no disk (so is inherently secure). A thin client, by contrast, has a local embedded O/S.

One example I found particularly interesting was that of Hilton Hotels, who have used zero clients (I believe) along with specialist software to do away with traditional call centres and tap a rich vein of home workers from a totally different demographic to give them a “virtual call centre”.

His takeaways were to break the relationship with the tin, look at the server as well as the client, fund with refresh (out of the existing hardware refresh budget, I take it) and identify IT pioneers.

Again, data was not sufficiently addressed for me, either here or in the remaining talks.

The final point, identifying IT pioneers, was echoed by other speakers, particularly Simon Bullers from RedPixie. However, I think you could fall into a trap here. To be ultimately successful, a virtualisation project has to deliver 100%. Every single desktop in your organisation which does not follow your virtual design pattern takes away from the benefit. If you only manage to get 80% of your desktops virtualised, the remaining 20% will weigh you down enough to negate many of the benefits. The people you need to start with are not the early IT adopters but the IT sceptics. You need to convince your “problem users” first, not last. Address all their niggles, or at least offer them tangible benefits to convince them to adopt, and you have cleared your biggest hurdle. Then you have a much better chance of reaching 100%.

Tomorrow, I hope I will cover the remaining two talks: Roy Illsley from Ovum “Market Trends: Is 2010 the tipping point for Desktop Virtualisation?” and Simon Bullers from RedPixie on “Implementing best practice for desktop virtualisation”.