Transcript of a BriefingsDirect podcast on the future of desktop virtualization and how enterprises can benefit from moving to this model.
Listen to the podcast. Find it on iTunes/iPod and Podcast.com. Download the transcript. Learn more. Sponsor: Hewlett-Packard.
Dana Gardner: Hi, this is Dana Gardner, principal analyst at Interarbor Solutions, and you’re listening to BriefingsDirect.
Today, we provide a sponsored podcast discussion on the growing interest and value in PC desktop virtualization strategies and approaches. Recently, a lot has happened technically that has matured the performance and economic benefits of desktop virtualization and the use of thin-client devices.
In desktop virtualization, the workhorse is the server, and the client assists. This allows for easier management, support, upgrades, provisioning, and control of data and applications. Users can also take their unique desktop experience to any supported device, connect, and pick up where they left off. And, there are now new offline benefits too.
At the same time that this functional maturity has improved, we are approaching an inflection point in a market that is accepting of new clients and new client approaches like desktop virtualization.
Indeed, the latest desktop virtualization model empowers enterprises with lower total costs, greater management of software, tighter security, and the ability to exploit low-cost, low-energy thin client devices. It's an offer that more enterprises are going to find hard to refuse.
Here now to help us learn more about the role and outlook for desktop virtualization, we're joined by Jeff Groudan, vice president of Thin Computing Solutions at HP. Welcome to the show, Jeff.
Jeff Groudan: Thanks for having me, Dana.
Gardner: As I mentioned, there's a lot happening in the trends in the market that are supporting more interest in virtualization generally. We see server, storage, network, and now this desktop thing really catching on. I think it's because of the economics.
Groudan: There certainly are some things in the market that are driving a potential inflection point here. Coming out of the recession, a lot of customers are revisiting deployments they may have delayed or specific IT projects they had put on hold.
In addition, there has been an ongoing desire to increase security, along with a lot of new compliance requirements that customers have to address. And, as they look for ways to save on costs, they are constantly looking for more efficient ways to manage their distributed PC environments. All of these things are driving the high level of interest in desktop virtualization.
Gardner: With regards to this pent-up demand issue, we've certainly seen the Windows desktop environment, the operating system, now coming out with a very important upgrade and improvement with Windows 7. We've also seen of course some improvements on the hypervisor market for desktop virtualization. Do you have any sense of where this pent-up demand is really going to lead in terms of growth?
Groudan: In addition to the market drivers, we're seeing technology drivers that also are going to help line up for a real uptick in the size and rate of deployments on client virtualization.
You touched on the operating system trends. I think there has been some pause in operating system upgrades with Vista, as companies wait for Windows 7, and with that coming out in addition to Server 2008 R2 from Microsoft, as well as other updates from other virtualization software providers. You're really seeing a maturing of the client virtualization software in conjunction with the maturing of the next-generation Microsoft operating systems that are a catalyst here.
You're also seeing better performance on the hardware side and the infrastructure side. It's really helping bring the cost per seat of a client virtualization deployment down into ranges that are a lot more interesting for large deployments. Last, and near and dear to my heart, you're seeing more powerful, yet cost-effective, thin clients that you can put on the desk and that really ensure end-users get the experience you want them to get.
Gardner: It seems like enterprises are going to be faced with some major decisions about their client strategies, and if you are going to be facing this inflection point you might as well look at the full panoply of options at your disposal.
Groudan: Absolutely. Just to put it into context, there was recently some data from Gartner. They feel like there are well over 600 million desktop PCs in offices today. Their belief is that over the next five years, upwards of 15 percent of those could be replaced by thin clients. So that's quite a number of redeployments and quite an inflection point for client virtualization.
Gardner: I suppose another motivation for IT departments and enterprises is that they're looking at security, compliance, and regulatory issues that also make them re-evaluate their management approach as to how data and applications are delivered.
Groudan: Absolutely. There are a variety of areas that are relevant for customers to look at right now. On security, you're absolutely right. Every IT manager's nightmare scenario is to have their company on the front page of The Wall Street Journal, talking about a lost laptop, a hack, or some other way that personal data, patient data, or financial data somehow got out of their control into the wrong hands.
One of the key benefits of client virtualization is the ability to keep all the data behind the firewall in the data center and deploy thin clients to the edge of the network. Those thin clients, by design, don't have any local data.
Gardner: I suppose another relevant aspect of this is that it's not necessarily rip-and-replace. You are not going to take 600 million PCs and put in thin clients, but you can start working at the edge to identify certain classes of users, certain application sets, perhaps a call center environment, and start working on this on a graduated basis.
Groudan: You certainly can. Our general coaching to customers is that it's not necessary for everyone, for every user group, or every application set. But it's certainly a fit for environments where you need more manageability and more flexibility.
You need higher degrees of automation to manage a large number of distributed PCs, with the benefits of centralized control, reduced labor costs, and the ability to manage remote or hard-to-reach locations -- things like branches, where you don't have local IT. Those are great targets for early client virtualization deployments.
Gardner: I suppose another big issue in the marketplace now is how to increase automation. When you control the desktop experience from a server or data-center infrastructure, you've got that opportunity to automate these processes and get off that treadmill of trying to deal with each and every end point physically or at least through a labor approach.
Groudan: Exactly. When you think about the cost savings of client virtualization, some come from lower long-term acquisition costs. Because the lifecycle of these solutions is closer to four or five years, you aren't acquiring the same amount of equipment on the same cadence.
But, the big savings come from the people savings. The automation and the manageability mean you need fewer people dedicated to managing distributed PCs and the break-fix and help desk associated with that.
You can do two things with those efficiencies. You can cut some cost, which, at some point, is the right approach. But increasingly, what we see is that rather than just cut cost, people redeploy resources toward value-generating activities, rather than treating PC management purely as a cost center. You can take those resources and focus them on value-added projects that add to the bottom line from a business-efficiency perspective, versus just cost.
Gardner: In other words, it's an interesting point, because the total solution here has to involve the data-center operators, the architects, and then the PC edge-client folks. These may have been separate groups in some organizations, so what's HP's advice? Are you encouraging more collaboration and cooperation to strategize between the client group and the infrastructure-delivery side?
Think beyond technical
Groudan: You really need to. That's been one of the inhibitors to earlier growth on client virtualization -- figuring out the business processes to get the data center guys and the edge of the network guys working on a combined plan. One key to success is clearly to be thinking beyond simply the technical architecture to how the business processes inside a company need to change.
All of a sudden, the data-center guys need to be thinking about the end-user. The end-user guys need to be thinking about the data center. Roles and responsibilities need to be hammered out. How do you charge the capital expense versus operational expense? What gets budgeted where? My advice is: as you're thinking about the technical architecture and all of the savings end-to-end, you need to also be thinking about the internal business processes.
Gardner: What that tells me is that this is not just about buying components and slapping in thin clients. This is really something you need to look at from a total-solutions perspective. Do some planning; the more total the approach you take, the bigger the economic payoff will be.
Groudan: That's absolutely right.
Gardner: Let's go back quickly to security. I remember when I first started hearing about desktop virtualization, somebody mentioned to me that all those agencies in Washington with the three-letter acronyms, the spooky guys, are all using desktop virtualization, because they can lock down the device and close off the USB port.
When that thing is shut off or that user logs out, no data is left behind. Nothing is left on the client. Everything is on the server. That's how you can really manage security. We're talking about taking that same benefit now to your enterprise users, your road warriors, and perhaps even remote branches. Right?
Groudan: That's absolutely correct. One of the beautiful things about a thin client is that when you unplug it from the network, it's basically a paperweight, and, from a security perspective, thin clients are getting pretty small too. People could take that thin client, put it in their briefcase, walk out with it, and they have nothing. They have no IT assets, no personal data, no R&D secrets, or whatever else there may be.
Beyond security, they're very, very low power, designed to be remotely managed, and designed to be plug-and-play replaceable. From a remote IT perspective, on the very rare chance that a thin client breaks, you take one from the storage closet where you keep a couple of spares, plug it in, and you're up and running in five or 10 minutes.
Gardner: So, even if all things were equal in terms of the cost of operating and deploying these, just the savings in securing up your data and application seems like a pretty worthwhile incentive?
Groudan: It really does. Not all customers may have that kind of burning need to secure data, but it's a drop-dead simple way of ensuring that there is no data out on the edge of the network that you don't know about. It really gives you confidence that you know where the data is and that there are limited ways to get at it. If you put the right security processes in place, you know they're going to work independently of whether thousands of end-users follow all the processes, which is hard to mandate.
Gardner: What does HP mean by desktop virtualization? There has been some looseness around the topic. Some people focus on a business to consumer (B2C) approach, highly scaling, perhaps a limited number of apps, and through a telecom provider. Other folks are now in the market with solutions that are business to employee (B2E), that is your employee-focused solutions. Where does HP come down on this? What do you think is the most important approach and how do you define it in the market?
Views of the market
Groudan: We look at this market in two ways: in the context of client virtualization and in the broader context of thin computing. Zeroing in on client virtualization, we call it client virtualization at HP. It's desktop virtualization. It's the same animal.
We look at it as a specific set of technologies and architectures that disaggregate the elements of a PC, which allows customers to more easily manage and secure their environments. What we're really doing is taking advantage of a lot of the new software capabilities that matured on the server side, from a server virtualization and utilization perspective. We're now able to deploy some of those technologies, hypervisors, and protocols on the client side.
We still see it as a fairly B2E-focused paradigm. You can certainly draw up other models for broader audiences on a whiteboard, but today we see most of the attraction and interest in the B2E model. As you touched on earlier, it's generally targeted at specific user groups and specific applications versus everybody in your environment.
Our specific objective is figuring out how to simplify virtualization, so that customers get past the technology, and really start to deliver the full benefit of virtualization, without all the complexity.
Gardner: There is a significant integration aspect of this. We talked about how you've got different groups within IT that are going to be affected, but you've got to be able to integrate component software, hypervisors, and management of data. It's a shift.
Groudan: We were an early entrant in client virtualization, so we've got quite a track record behind us. What we learned led us to focus on a few things.
The first is that you don't want customers to have to figure out how to architect this stuff on their own. If you think about PCs 20-25 years ago, customers didn't know how to architect a distributed PC environment. Over 25 years, everybody has gotten good at it. We're still at the early stages of client virtualization.
So our focus is to deliver more complete, integrated solutions, end to end from the desktop to the data center, to lay it all out, and to provide reference designs so customers can very comfortably understand how to build out a deployment. They certainly may want to customize it, but we want to get them 80-90 percent of the way there just by sharing what we've learned.
The second thing we try to do is give them best-in-class platforms. From a thin-client perspective, this is important, because you need to make sure that end-users actually get the experience they are used to. One of the best ways to stall a deployment is having the end-users say, "Hey, I've got a better experience on my desktop." Having thin clients that are designed from the ground up to deliver a desktop-class experience is really critical.
Last, we need to make sure we've got the right ease-of-use and manageability tools in place, so that IT complexity can be removed. IT staff know they can manage the virtual environments, the physical environments, and the remote thin clients. We don't want to make these things too complex for the IT guys to actually deploy and manage.
Gardner: Now, there has been some trepidation in the market. People say, "Is this ready for prime-time?" Let's focus a little bit on what's been holding people up. I don't think it's necessarily the software.
When I talk to Microsoft people, they seem to be jazzed about desktop virtualization. Of course, you're still getting a license to use that desktop, and perhaps it's even aligned with a lot of the other server-side products and services that Microsoft provides.
So, there is alignment by the software community. What's been holding up people, when they think of this desktop virtualization?
Groudan: There's been a handful of things. In the early days, there were still some gaps in the experience that end-users would get -- multimedia, remoting, USB peripherals, and those kinds of things. HP and the broader industry ecosystem have done a lot in the past year or two to close those gaps with specific pieces of software, high-performing thin clients, etc. We're at a point now where you can feel pretty good that end-users are going to get an experience that compares very well to a desktop.
Second, the solutions are complicated, or we let them be complicated, because we put a lot of components in front of our customers rather than complete solutions. By delivering more reference design models and tools, you take away some of the complexity around the design, setup, and configuration that customers were facing in the early days.
Third, management software. Earlier, you didn't have a single tool that would let you manage both the physical and the virtual elements of the desktop virtualization environment. HP and others have closed those gaps, and we have very powerful management tools that make this easy on IT staff.
Last, it was initially hard to quantify where some of the cost savings would come from. Now, there are total cost of ownership (TCO) analysis tools for understanding where the savings come from and how you can take advantage of them. It's a lot better understood, and customers are more comfortable that they understand the return on investment (ROI).
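To make the per-seat TCO comparison concrete, here is a minimal sketch in Python of the kind of calculation such analysis tools perform: a longer hardware refresh cycle and lower support labor drive the thin-client savings Groudan describes. Every dollar figure below is a hypothetical placeholder for illustration, not HP or Gartner data, and server-side infrastructure costs are deliberately omitted for simplicity.

```python
# Hypothetical per-seat TCO comparison over a five-year horizon.
# All figures are illustrative assumptions, not vendor numbers.

def tco_per_seat(hardware_cost, lifecycle_years, annual_support_cost,
                 horizon_years=5):
    """Per-seat TCO over the horizon: hardware amortized over its
    refresh cycle, plus yearly support and management labor."""
    refresh_fraction = horizon_years / lifecycle_years
    return hardware_cost * refresh_fraction + annual_support_cost * horizon_years

# Traditional PC: roughly 3-year refresh, heavier desk-side support.
pc = tco_per_seat(hardware_cost=800, lifecycle_years=3, annual_support_cost=400)

# Thin client: 4-5-year refresh, centralized management lowers labor.
thin = tco_per_seat(hardware_cost=300, lifecycle_years=5, annual_support_cost=150)

print(f"PC:      ${pc:,.0f} per seat over 5 years")
print(f"Thin:    ${thin:,.0f} per seat over 5 years")
print(f"Savings: ${pc - thin:,.0f} per seat")
```

Under these made-up assumptions the labor line dominates, which matches Groudan's point that the big savings come from people, not hardware acquisition.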
Gardner: Are there certain types of enterprises that should be looking at this? In my mind, if you've already dived into virtualization, you're getting comfortable with that and you're getting some expertise on it. If you're also thinking about IT shared services in a service bureau approach to IT, your culture and organization might be well aligned to this. Are there any other factors that you can think of, Jeff, that might put up a flag that says, "We're a good candidate for this?"
Groudan: There are opportunities for just about every industry. We've seen certain verticals on the cutting edge of this. Financial services, healthcare, education, and public sector are a few examples of industries that have really embraced this quickly. They have two or three themes in common. One is an acute security need. If you think about healthcare, financial services, and government, they all have very acute needs to secure their environments. That led them to client virtualization relatively quickly.
Financial services and education both have some consistency around having large groups of knowledge workers in small locations. That lends itself very well to client virtualization deployments. Education and healthcare both have a need for large, remote, campus-type environments, where they need a lot of PCs or desktop virtualization seats in a mobile campus setting. That's another sort of environment and use case that lends itself very well to these kinds of architectures.
Gardner: As I said earlier, it seems like an offer that's hard to refuse. It's just getting everything lined up. There are so many rationales that support this. But, in this economy, it's the dollars and cents that are the top concern, and will be for a while.
Do you have any examples of companies that have taken the plunge and done some desktop virtualization, perhaps with a certain class of user, perhaps in a call center environment or remote branch? What's been the experience, and what are the paybacks, at least economically?
Groudan: I'll give you two examples. The first is out of the education environment. They were trying to figure out how to increase reliability, while improving student access and increasing the efficiency of their IT staff, because schools are always challenged to have sufficient IT resources.
They deployed desktop virtualization with HP infrastructure and thin clients. They felt they would lower total cost, increase uptime for the students in the classroom, and increase teacher productivity, because teachers are able to teach instead of trying to maintain classroom PCs that weren't necessarily working. They freed up their IT staff to work on other value-added projects.
And, most important for a school, they increased student access and productivity. To make that very real for you, students may only have one or two hours in front of a computer a day in school, and they may be doing many different things. So, they don't get that much time on an application or a project in school.
The solution that this Hudson Falls School deployed let the students access those applications from home. So, they could spend two or three hours a night from home on those applications getting very comfortable with them, getting very productive with them, and finishing their projects. It was a real productivity add for the students.
The second example is with Domino's Pizza. Many of us are familiar with them. They were struggling with the challenges of having a lot of remote sites and a lot of terminals that are shared. Supporting those remote sites, trying to maintain reliability, and keeping customer data secure were their burning needs, and they were looking for an alternative solution.
They deployed client virtualization with HP thin clients, and they found they could lower their costs by $400 per seat annually, and they've gotten much longer life out of the terminals. They increased the uptime of the terminals and, by extension, limited the support required on site.
Then, by using this distributed model, where the data is back in a data center somewhere, they really secured customer data, credit card information, and those kinds of things. They're able to rest easy that that kind of information isn't going to somehow get out into the public domain.
Gardner: A couple of things jump out at me from this. All that data back on the server is really going to benefit your business intelligence (BI), analytics, auditing, reporting, and those sorts of activities, when you don't have it all out on the clients, where you can't easily get to it or manage it.
Value of data mining
Groudan: For any company that has a lot of customer data, the ability to mine that data for trends, information, opportunities, or promotions is incredibly valuable.
Gardner: The other thing that jumped out at me is that this brings up the notion that if this works for PCs and thin clients, what about kiosks? What about public-facing visual interfaces of some kind? Can you give us a hint of what the future holds, if we take this model a step further?
Groudan: Sure, it brings up one of the themes I want to talk about. HP's unique vision is that client virtualization is just one of many ways of using thin computing to enable a lot of different models beyond just replacing the traditional desktop. As you mentioned, anywhere that's hard to get to, hard to maintain, or hard to support is a perfect opportunity to deploy thin computing solutions.
Kiosks and digital signage are generally in remote locations. They can be up on a wall somewhere. The best answer for them is to be connected remotely, so you can manage them from a centralized location.
We certainly see kiosks and signage as a great opportunity for thin computing. We also see opportunities to bring thin computing into the home and into small and medium businesses through some of the cloud trends, applications, and services we've all seen emerging. To me, thin computing ultimately is going to be much broader than the B2E client virtualization models that we're probably most familiar with.
Gardner: Obviously, HP has a lot invested here, a good stake in the future for you. Anything we should expect in the near future in terms of additional innovation on this, particularly on the B2B side?
Groudan: Yeah, well, I can't talk about it too much, but we certainly have some very exciting launches coming up in the next couple of months, where we're really focused on total cost per seat -- how we let people deploy these kinds of solutions and continue to get further economic benefits, delivering better, tighter integration from the desktop to the data center.
Deploying these solutions will get easier and easier, and the ease-of-use and manageability tools will let the IT guys roll out large client virtualization deployments with as little touch and as little complexity as we can possibly manage. We're trying to automate these kinds of solutions, and we're very excited about some of the things we'll be delivering to our customers in the next couple of months.
Gardner: Okay, very good. We've been talking about the growing interest and value in PC desktop virtualization strategies and approaches. I've learned quite a bit. I want to thank our guest today, Jeff Groudan, vice president of Thin Computing Solutions at HP. Thanks for joining, Jeff.
Groudan: My pleasure, Dana. Thanks for having us.
Gardner: This is Dana Gardner, principal analyst at Interarbor Solutions. You've been listening to a sponsored BriefingsDirect podcast. Thanks for listening, and come back next time.
Copyright Interarbor Solutions, LLC, 2005-2010. All rights reserved.