By Ewen Anderson
If you were a fly on the wall at any IT department’s strategy planning meeting for 2011, chances are that virtual desktop infrastructure (VDI) will be on the agenda. While the financial arguments rage (quite correctly) about the costs and benefits, it is the good old foot soldiers of the IT department who will be rapidly pushed from the trenches into no-man’s land, setting up the new system under fire and trying to ensure that it works on day one, scales up rapidly from day two and then keeps running smoothly thereafter. Why all the interest and pressure? To put it simply, in today’s economic environment there has never been greater emphasis on “getting it right first time”.
Why? Well, many organisations are going to need increased flexibility from their staff in order to meet a combination of cost reductions and changes to working patterns. Savings can be made through hot-desking, which reduces premises costs, and by making staff more efficient through the reliable, secure delivery of applications and data at their point of use. More flexible working applies both to when and where employees work, and to how easily new users can be given the tools they need to do their job.
Technology that empowers users to work remotely (as long as it is simple, quick and secure) will also improve the standing of the IT department. Indeed, the perception of IT itself can be transformed from restrictive to progressive – but only as long as supply keeps one step ahead of demand.
At the risk of generating sympathy for the IT department, the cross-fire is becoming more intense. Having successfully held off upgrading the desktop hardware to Vista, the long-promised desktop refresh is now under fire both from budget reductions and from the growing options around virtualisation and cloud computing. As if that were not enough, the carbon emission reduction targets are still there, governance and information security are still on the chief information officer’s agenda, and the executive team have all taken the consumerisation debate so seriously that they have decided to try out iPads.
There is a temptation to wave a white flag and simply shove everything up into the cloud, but that is a bit like shoving everything up into the attic when selling your house. It all looks neat and tidy at first, but you quickly find that essential items are hard to reach…
As a consequence many organisations are looking for a journey with a number of stages rather than a single destination. So the future might be desktops on demand with streamed subscription apps and centralised data, all re-charged on a usage model, but the reality for most organisations is a desktop and application transformation that simply lifts the computing environment away from the device.
If this can make the user’s working environment less costly and more flexible (and also deliver high-speed WAN performance, multimedia and support for all their peripherals) it might just get our beleaguered troops over that last mile of mud and into the warm. So what do organisations need to do to achieve what the industry increasingly calls “abstraction”?
Virtualising and centralising need to be at the heart of any strategy, allowing the organisation to deliver software and desktops as a service. But effective automation and systems management are required to ensure that every virtual machine is built to a known standard, and that the standard can be updated in one place with changes distributed automatically.
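To make the "known standard" idea concrete, here is a minimal sketch of a compliance check; the manifest fields and golden values are invented purely for illustration, and a real systems-management suite would check full package inventories rather than a three-entry dictionary:

```python
import hashlib
import json

def manifest_fingerprint(manifest):
    """Hash a build manifest so any drift from the standard is detectable.

    Canonical JSON (sorted keys) keeps the fingerprint stable
    regardless of the order in which components were recorded.
    """
    canonical = json.dumps(manifest, sort_keys=True)
    return hashlib.sha256(canonical.encode()).hexdigest()

# Hypothetical golden build and one freshly provisioned VM.
golden = {"os": "windows7-sp1", "office": "2007", "antivirus": "1.42"}
vm = {"os": "windows7-sp1", "office": "2003", "antivirus": "1.42"}

# Which components differ from the standard?
drift = {k for k in golden if vm.get(k) != golden[k]}
if manifest_fingerprint(vm) == manifest_fingerprint(golden):
    print("compliant with golden image")
else:
    print("drift in:", sorted(drift))
```

Updating the golden manifest in one place and re-checking every machine against its fingerprint is, in miniature, what the automation layer does at scale.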
Personalisation is another attractive abstraction target. Separating the individual user’s settings from their device and operating system can have significant performance and end-user experience benefits, particularly when users may access their desktops in a number of ways, such as by laptop, desktop and thin client.
Not surprisingly, storage is one of the most urgent considerations. It is well known that the business case for many of the early VDI projects fell apart when the storage costs of scaling the solution up from pilot to production became known. Desktop virtualisation at any scale requires a different profile of storage, and simply transferring disk images from cheap SATA devices in PCs to expensive SAN fabric is not an option for most organisations.
What is more surprising is the range of answers to the question of how to get this storage cost under control. Thin provisioning, cloning, de-duplication, PAM cards, caching and a range of products and upgrades can all be applied, at a cost. But for most customers, asking the storage vendor which one is right is a little too much like opening your wallet and closing your eyes at the same time. In an emerging market it is important to get advice on the range of choices, implications and costs.
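Why de-duplication in particular suits VDI is worth spelling out: hundreds of virtual desktops share an almost identical operating system image, so most disk blocks are duplicates. A minimal sketch of content-hash de-duplication (illustrative only; real storage arrays add variable block sizes, compression and reference counting):

```python
import hashlib

def dedupe_blocks(images, block_size=4096):
    """Store each unique block once, keyed by its SHA-256 hash.

    Returns (unique_block_store, total_logical_blocks) so the
    saving can be compared. Illustrative only.
    """
    store = {}
    total = 0
    for image in images:
        for i in range(0, len(image), block_size):
            block = image[i:i + block_size]
            store[hashlib.sha256(block).hexdigest()] = block
            total += 1
    return store, total

# Two "disk images" share 100 identical OS blocks and differ only in
# one small user-specific block -- the typical VDI pattern.
os_payload = b"".join(i.to_bytes(2, "big") * 2048 for i in range(100))
image_a = os_payload + b"user-a-settings".ljust(4096, b"\x00")
image_b = os_payload + b"user-b-settings".ljust(4096, b"\x00")

store, total = dedupe_blocks([image_a, image_b])
print(f"{total} logical blocks stored as {len(store)} unique blocks")
```

Here 202 logical blocks collapse to 102 unique ones; in production the ratio is far more dramatic, because an entire desktop estate shares one operating system build.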
Finally, at the client end, customers increasingly turn their attention to the costs and environmental issues of power consumption, cooling, disposal and technology refresh; any move towards a thinner client has significant appeal, particularly as the local multimedia capability of the devices increases.
Only with a good understanding of the available technologies, together with existing investments and strategies, can an organisation choose the right solution with any degree of confidence. Finding the perfect solution for individual companies is not straightforward, although the simple rule of ‘you get what you pay for’ tends to apply.
A range of application virtualisation options is available. Virtualising the applications allows them to be streamed into the virtual machines as they are provisioned, reducing the number of builds needed and cutting resource requirements and overheads. Application virtualisation can also eliminate application conflicts and allow multiple versions of an application (e.g. Office 2003 and 2007) to be delivered to the same device.
Any organisation looking to become more agile needs a robust, scalable personalisation solution – particularly if it is planning a move from Windows XP to Windows 7 and seeking to retain users’ personal settings. Certain vendors, such as Centralis, use AppSense to ensure that the process of migration and co-existence becomes much less painful. From experience, it is much easier to bring users along for the ride if the journey starts with virtualising them and their settings.
Some of the latest thin clients are optimised for desktop virtualisation and could be attractive for any client looking for low-energy solutions to reduce their carbon emissions. These devices consume the equivalent power of a Christmas tree bulb and, having no moving parts, are silent and very durable. Seamlessly combining them with the power of a full XP or Windows 7 desktop is a compelling proposition. The emergence of client-side virtualisation for mobile devices later this year will add a further dimension to strategic virtualisation and may finally cure IT of one of its enduring headaches: the cost and complexity of laptop support.
But I would be guilty of being disingenuous if I did not flag up the potential hazards of virtualisation as an abstraction tool. Without a clear end goal and strategic vision the journey is difficult to plan and success is hard to quantify. Increased pressure and scarce resources mean that planning and preparation must be perfect, or else performance and productivity will suffer. A small budget put aside for analysis can prevent costly mistakes.
Of course there can be a temptation to go fast and cheap, but for many users abstraction will touch every part of their working experience and skimping on product, strategy, analysis or design is likely to deliver some very unhappy abstracted users.
The technology to reduce costs and increase flexibility is currently available on the market. The next step is to convince a financial controller to read from the same menu and invest in that technology. Maybe ask him just after his computer has crashed for the fifth time…