Tuesday, November 08, 2011

Case Study: Southwest Airlines' Productivity Takes Off Using Virtualization and IT as a Service

Transcript of a BriefingsDirect podcast on how travel giant Southwest Airlines is using virtualization to streamline customer service applications.

Listen to the podcast. Find it on iTunes/iPod. Download the transcript. Sponsor: VMware.

Dana Gardner: Hello, and welcome to a special BriefingsDirect podcast series coming to you in conjunction with a recent VMworld 2011 Conference.

I'm Dana Gardner, Principal Analyst at Interarbor Solutions, and I’ll be your host throughout this series of VMware-sponsored BriefingsDirect discussions.

Our next VMware case study interview focuses on Southwest Airlines, one of the best-run companies anywhere, with some 35 straight years of profitability, and how "IT as a service" has been transformative for them in terms of productivity.

Here to tell us more about how Southwest is innovating and adapting with IT as a compelling strategic differentiator is Bob Young, Vice President of Technology and Chief Technology Officer at Southwest Airlines. [Disclosure: VMware is a sponsor of BriefingsDirect podcasts.]

Welcome to BriefingsDirect, Bob.

Bob Young: Well, thank you very much. I appreciate the opportunity to speak with you.

Gardner: We have heard a lot about IT as a service, and unfortunately, a lot of companies face an IT organization that might be perceived as a little less than service-oriented, maybe even for some a roadblock or a hurdle. How have you at Southwest been able to keep IT squarely in the role of enablement?

Young: First off, as everybody should know already, Southwest Airlines is the customer service champ in the industry. Taking excellent care of our customers is just as important as filling our planes with fuel. It’s really what makes us go.

So as we take a look at what travelers want in an airline, we're constantly looking for ways to improve Southwest Airlines and make it better for our customers, and that's really where virtualization and IT as a service come into play. What we want to be able to do is make it so nobody has to say, "Oh, this is IT versus something else."

People want to be able to get on Southwest.com, make a reservation, log on to their Rapid Rewards or our Loyalty Program, and they want to be able to do it when they want to do it, when they need to do it, from wherever they are. And it’s just great to be able to provide that service.

We provide that to them at any point in time that they want in a reliable manner. And that's really what it gets right down to -- to make the functions and the solutions that we provide ubiquitous so people don’t really need to think about anything other than, "I need to do this and I can do it now."

At your fingertips

Gardner: I travel quite a bit and it seems to me that things have changed a lot in the last few years. One of the nice things is that information seems to be at your fingertips more than ever. I never seem to be out of the loop now as a traveler. I can find out changes probably as quickly as the folks at the gate.

So how has this transfer of information been possible? How have you been able to keep up with the demands and the expectations of the travelers?

Young: One of the things that we like to do at Southwest Airlines is listen to our customers, listen to what their wants and desires are, and be flexible enough to be able to provide those solutions.

If we talk about information and the flow of information through applications and services, it's really about separating out the core technical aspects so the customer and our employees don't need to think about them. Whether they want the flight information at the gate, which leg a passenger is on, and so forth, they can go ahead and get that at any moment in time.

Another good example of that is earlier this year we rolled out our new Rapid Rewards 2.0 program. It represents a bold, industry-leading way to look at rewards and give customers what they want. With this program, we've made any seat on any flight available to our Rapid Rewards customers for rewards booking, which is unique in the industry.

The other thing it does is allow our current and potential members flexibility in how they earn miles and points and how they use them for rewards -- being able to plan ahead and save some significant points.

The same is true of how we provide IT as a service. What we want to be able to do is provide it whenever they want it, whenever they need it, at the right cost point, and to meet their needs. We've got some of the best customers in the world and they like to do things for themselves. We want to let them do that for themselves, and to give our employees the same capabilities.

If you've been on a Southwest flight, you've seen our flight crews, our in-flight team, really trying to have fun and trying to make Southwest a fun place to work and to be, and we just want to continue to support that in a number of different ways.

Gardner: You have also had some very significant challenges. You're growing rapidly. Your Southwest.com website is a crucial and growing part of your revenue stream. You've had mergers, acquisitions, and integrations as a result of that, and as we just discussed, the expectations of your consumers, your customers, are ramping up as well -- more data, more mobile devices, more ways to intercept the business processes that support your overall products and services.

So with all those requirements, tell me a little bit about the how. How in IT have you been able to create common infrastructures, reduce redundancy, and then yet still ramp up to meet these challenging requirements?

Significant volume

Young: As you all know, Southwest.com is a very large travel site, one of the largest in the industry -- not just airlines, but the travel industry as a whole. Over 80 percent of our customers and consumers book travel directly on Southwest.com. As you may know, we have fare sales a couple of times a year, and that can drive a significant volume.

What we've been able to do, and how we've been able to meet some of those challenges, is through a number of different VMware products. One of the core products is VMware itself -- vSphere, vMotion, and so on -- to provide that virtualization. You can get a 1-to-10 physical-to-virtual consolidation ratio, depending on which type of servers and blades you're using, which helps us on the infrastructure side of the house to maintain that and to have the storage, physical, and electrical capacity in our data centers.

But it also allows us, as we're moving, consolidating, and expanding these different data centers, to be able to move that virtual machine (VM) seamlessly between points. Then, it doesn’t matter where it’s running.

That gives us the capacity. So if we have a fare sale and I need to add capacity on some of our services, it gives us and the team that runs the infrastructure the ability to bring up new services on new VMs seamlessly. It plugs right into how we're doing things, so that internal cloud allows us not to experience blips.

It's been a great add for us from a capacity management perspective and being able to get the right capacity, with the right applications, at the right time. It allows us to manage that in such a way that it’s transparent to our end-users so they don’t notice any of this is going on in the background, and the experience is not different.
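To make the capacity piece concrete, here is a minimal sketch of what scripting that kind of scale-out against vCenter can look like. It is illustrative only -- not Southwest's tooling -- and uses the open-source pyVmomi SDK; the vCenter host, credentials, template, and resource pool names are hypothetical placeholders.

```python
# Illustrative sketch: clone extra front-end VMs from a template ahead of a
# traffic spike, using pyVmomi (the open-source vSphere Python SDK).
# All host names, credentials, and object names below are hypothetical.
import ssl
from pyVim.connect import SmartConnect, Disconnect
from pyVmomi import vim

def find_by_name(content, vim_type, name):
    # Walk the vCenter inventory and return the first object of this type
    # whose name matches; raises StopIteration if nothing matches.
    view = content.viewManager.CreateContainerView(content.rootFolder, [vim_type], True)
    return next(obj for obj in view.view if obj.name == name)

ctx = ssl._create_unverified_context()  # lab shortcut; verify certificates in production
si = SmartConnect(host="vcenter.example.com", user="svc-deploy",
                  pwd="********", sslContext=ctx)
content = si.RetrieveContent()

template = find_by_name(content, vim.VirtualMachine, "web-frontend-template")
pool = find_by_name(content, vim.ResourcePool, "fare-sale-pool")

# Clone three identical front-end VMs into the target resource pool and power
# them on; each Clone() call returns a vCenter task that runs asynchronously.
spec = vim.vm.CloneSpec(location=vim.vm.RelocateSpec(pool=pool),
                        powerOn=True, template=False)
for i in range(1, 4):
    template.Clone(folder=template.parent, name="web-frontend-extra-%02d" % i, spec=spec)

Disconnect(si)
```

Because every new VM is stamped out from the same template, added capacity looks identical to what is already running, which is what makes the scale-out feel seamless in the sense Young describes.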

Gardner: I understand that you're at a fairly high level of virtualization. Is that a place where you plan to stay? Are you going to pursue higher levels? Where do you expect to go with that?

Young: I'll give you a little bit of background. We started our virtualized environments about 18 months ago. We went from a very small amount of virtualization to what we coined our Server 2.0 strategy, which was really the combination of commodity-based hardware blades with VMware on that.

And that allowed us last year in the first and second quarter to grow from several hundred VMs to over several thousand, which is where we're at today in the production environment. If you talk about production, development, and test, production is just one of those environments.

It has allowed us to scale very rapidly without having to add a thousand physical servers. And it has been a tremendous benefit for us in managing our power, space, and cooling in the data center, along with allowing the engineers who do the day-to-day work to have a single way to manage, deploy, and move things around far more automatically. They don't have to mess with that anymore; the different products that are part of the VMware suite just take care of it.
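As a rough back-of-the-envelope on why that matters for power, space, and cooling: the 10-to-1 ratio echoes the consolidation figure mentioned earlier, while the VM count and per-server wattage below are illustrative assumptions, not Southwest's reported numbers.

```python
# Back-of-the-envelope consolidation math; all inputs are illustrative.
vms = 2000                 # "several thousand" production VMs
consolidation_ratio = 10   # ~10 VMs per physical blade, per the 1-to-10 figure
watts_per_server = 400     # assumed average draw per physical server

physical_hosts = vms // consolidation_ratio       # 200 blades
servers_avoided = vms - physical_hosts            # 1,800 physical boxes not bought
power_saved_kw = servers_avoided * watts_per_server / 1000.0

print(f"{physical_hosts} blades instead of {vms} physical servers")
print(f"~{power_saved_kw:.0f} kW of steady-state server power avoided (before cooling overhead)")
```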

Gardner: And your confidence, has it risen to the level where you're looking at 70, 80, 90, even more percent of virtualization? How do you expect to end that journey?

Ready for the evolution

Young: I would love to be at 100 percent virtualized. That would be fantastic. I think unfortunately we still have some manufacturers and software vendors -- and we call them vendors, because typically we don’t say partners -- who decide they are not going to support their software running in the virtualized environment. That can create problems, especially when you need to keep some of those systems up 24 x 7, 365, with 99.95 percent availability.

We're hoping that changes, but the goal would be to move as much as we can, because if I take a look at virtualization, we are in effect our own internal private cloud. What that's really doing is getting us ready for the evolution that's going to happen over the next 5, 7, or 10 years, where you may have applications and data deployed out in a cloud -- a virtual private cloud, or a public cloud if the security becomes good enough -- and where you've got to bring all that stuff together.

If two applications need huge amounts of capacity and have to talk back and forth but aren't co-located, you've got to be much more efficient on the calls and the communications and make that seamless for the customer.

This is giving us the platform to start learning more and start developing solutions that don't need to be co-located in one or two data centers, but can really be pushed wherever it makes sense. That could be whichever data center is most efficient from a green technology perspective -- the one that uses the least electricity and cooling power, or alternative energy -- or whatever makes sense at that time of the year.

That is a huge add and a huge win for us in the IT community to be able to start utilizing some of that virtualization and even across physical locations.

Gardner: So as you've ramped up on your virtualization, I imagine you've been able to enjoy some benefits in terms of capital expense, hardware, and energy. How about in some of the areas around configuration management and policy management? Is there a centralization feature to this that is also paying dividends?

Young: A huge cornerstone of the suite of tools we've gotten through VMware is being able to deploy custom solutions, and even some of the off-the-shelf solutions, on a standard platform -- standard operating systems, standard configurations, standard containers for the web, and so on. It allows us to deploy that stuff within minutes, whereas it used to take engineers manually configuring each thing separately. That's been a huge savings.

The other thing is, once you get the configuration right and you have it automated, you don't have to worry about human missteps. Those are going to happen, and then you've got to go back and redo something. That elimination of error, and the speed at which we can do it, is helping. As you expand your server footprint and the number of VMs and servers you have, you can actually do more with the same number of staff or fewer, without having to add headcount.
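The principle Young describes -- define the standard once, generate every instance from it, and let software catch drift -- can be sketched in a few lines. This is a generic illustration, not VMware's tooling or Southwest's actual configuration; the settings and field names are hypothetical.

```python
# Minimal sketch: one golden definition, per-host configs rendered from it,
# and automated drift detection instead of hand edits. Values are made up.
import copy

GOLDEN_CONFIG = {
    "os": "rhel-5.6",
    "java_heap_mb": 2048,
    "web_container": "tc-server",
    "ntp": ["ntp1.example.com", "ntp2.example.com"],
}

def render_config(hostname, overrides=None):
    # Every host starts from the same golden settings; only declared
    # overrides differ, so there is nothing to mistype per machine.
    cfg = copy.deepcopy(GOLDEN_CONFIG)
    cfg["hostname"] = hostname
    cfg.update(overrides or {})
    return cfg

def drift(actual, expected):
    # Report any key where a running host no longer matches its rendered config.
    return {k: (actual.get(k), v) for k, v in expected.items() if actual.get(k) != v}

fleet = {f"web{i:02d}": render_config(f"web{i:02d}") for i in range(1, 6)}
running = dict(fleet["web03"], java_heap_mb=1024)   # someone hand-tuned one box
print(drift(running, fleet["web03"]))               # {'java_heap_mb': (1024, 2048)}
```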

Gardner: I wonder how you feel about desktop virtualization. Another feature that we've seen in the field in the marketplace is those that make good use of server virtualization are in a better position to then take that a step further and extend it out through PC-over-IP and other approaches to delivering the whole desktop experience. Is that something that you're involved with or experimenting with? How do you feel about that?

Young: This has come and gone in the IT industry for the past 10-15 years, if you talk about Net PCs and some of the other things. What's really driven us to take a look at it is that in our environment we can control security on virtual desktops very clearly and very quickly, and deliver that as a great service.

New mobile devices

The other thing that's leading to this, beyond the security we just talked about, is the plethora of brand-new mobile devices -- iPhones, iPads, Android devices, the Galaxy. HP has a new device. RIM has a new device. We need to be able to deliver our services in a more ubiquitous manner. The virtual desktop allows us to deliver some of those services without my needing to control the hardware. I just control the interface, which can protect our systems, and it's really pretty neat.

I was on one of my devices the other day and was able to go in via a virtual desktop that was set up to use some of the core systems, without having all that stuff loaded on my machine -- and that was over the Internet. It worked out phenomenally well.

Now, there are some issues you have to work through, depending on your colocation and facilities, but you can easily get through some of that with the right virtualization setup and networking.

Gardner: So you have come an awfully long way. You say 18 months ago you were only embarking on virtualization, but now you're already talking about hybrid clouds, mobile enablement, and wide area network optimization. How is it that you've been able to bite off so much so soon? A lot of people would be intimidated and do more of that crawl-walk-run, with the emphasis on the crawl and walk parts.

Young: Well, I am very fortunate. I might come up with the vision of where we want to go and where IT is going, but I have some phenomenal engineers working on this and working through all the issues and all the little challenges that pop up along the way in order to do it.

It’s what our team would say is pretty cool technology, and it gets them excited about doing something new and different as well. I've got a couple of managers -- Tim Pilson, Mitch Mitchell -- and their teams, and some really good people.

Jason Norman is one of the people, and Doug Rowland also has been very involved with getting this rolled out. It's amazing what a core set of just a few people can do with the right technology, the right attitude, and the passion to get it done. I've just been very impressed with what we call their warrior spirit here at Southwest Airlines -- not giving up, doing what it takes to get it done, and being able to put that to work with some of the VMware products.

It extends beyond that team. Our development teams use Spring and some of the other VMware products as well. If we run into an issue, VMware on both the development side of the house and the product side of the house is really part of our extended team. They take it, they listen, and they come back with a fix and a patch in literally a day or two, rather than some other vendors where you might wait weeks or months and it might never make it to you.

So I really have to give credit to the teams that are working with me, my team who gets it done, and VMware for providing such a great product that the engineers want to use it, can use it, and can understand it, and make huge amounts of progress in a very short period of time.

Gardner: Well, great. It's a very interesting and compelling story. We've been talking with Southwest Airlines about how they continue to innovate and adapt, using IT as a compelling strategic differentiator.

Our guest has been Bob Young, Vice President of Technology and Chief Technology Officer at Southwest Airlines. Thanks so much, Bob.

Young: Well, thank you.

Gardner: I'm Dana Gardner, Principal Analyst at Interarbor Solutions, your host throughout this series of VMware-sponsored BriefingsDirect discussions. Thanks again for listening, and come back next time.

Listen to the podcast. Find it on iTunes/iPod. Download the transcript. Sponsor: VMware.

Transcript of a BriefingsDirect podcast on how travel giant Southwest Airlines is using virtualization to streamline customer service applications. Copyright Interarbor Solutions, LLC, 2005-2011. All rights reserved.

Tuesday, November 01, 2011

OSU Medical Center Gains Savings and Productivity Wins Through Ariba Cloud Procurement System

Transcript of a BriefingsDirect podcast on how The Ohio State University Medical Center has streamlined their procurement process with cloud-based tools from Ariba.

Listen to the podcast. Find it on iTunes/iPod. Download the transcript. Sponsor: Ariba.

Dana Gardner: Hi, this is Dana Gardner, Principal Analyst at Interarbor Solutions, and you’re listening to BriefingsDirect.

Today, we present a sponsored podcast discussion on how businesses are using cloud commerce and automated procurement to dramatically improve how they source and manage the buying process.

In doing so, we'll see how they improve cash management and gain an analytical edge to constantly improve these processes. We will examine how The Ohio State University Medical Center (OSUMC) has moved its procurement activities to the cloud and adopted strategic sourcing, dramatically increasing efficiency across the purchases managed. We'll learn more about how a large medical enterprise is conducting its business better through collaborative cloud commerce.

Please join me now in welcoming our guest, Karen Sherrill, Senior Commodity Manager at The Ohio State University Medical Center in Columbus. Welcome to the show, Karen. [Disclosure: Ariba is a sponsor of BriefingsDirect podcasts.]

Karen Sherrill: Why, thank you.

Gardner: What was wrong before? What did you need to change in how you sourced and procured that led you to adopt more automation and more of these cloud-based services?

Sherrill: The Medical Center is a government agency. So as you can imagine, it's tied down with a lot of bureaucracy and paper. But as we moved into 2010, we were under a lot of pressure to do things faster and more efficiently. The only way to do that was with some type of technology that would allow our current staffing levels to support the needed growth and serve our customers faster.

We were processing about 422 bids a year, and that equated to about the same number of contracts. We had only 26 buyers who were able to support the business that was projected to grow significantly with our medical center expansion and the growth of our ambulatory site.

Some of the limitations that we were running up against were that our resources were spending a lot of time providing technical support for the old legacy system. In addition, the legacy system only supported static documents. So we were posting PDFs, and suppliers were mailing in bids, which was not an efficient way to analyze them on the back end.

We were manually tracking supplier contract terms, conditions, and modifications, and that was taking a significant amount of time in order to execute the final agreement.

We had no repository for contracts. So when people were seeking agreements, we were looking on shared drives and people's desks and having to go to external storage. Contracts were expiring, and we were not aware of them and not renewing them in a timely fashion.

No framework

In addition, we had counterparts with the university and we weren't able to collaborate with them and have visibility into what bids they were doing and what bids we were doing. So there was no framework to support that type of collaboration that was essential for us to be successful going forward.

Those were some of the limitations and concerns that we were trying to address and we therefore implemented technology in order to help us meet or relieve some of these concerns.

Gardner: When you decided that you were no longer able to scale this effectively, and that your existing processes were breaking down under complexity and load, how did you go about seeking a solution? What were the requirements that you had in mind when you were seeking to improve on this situation?

Sherrill: One of the requirements was that it had to be an automated technology and it had to be easy to use. The provider of the technology would need to provide the technical support, primarily not for the buyers, but for the suppliers, who would be interacting with the system and would need training and guidance to navigate the system in order to submit an effective bid result.

Gardner: Tell me a bit about The Ohio State University Medical Center. For those who aren't familiar, let us know the breadth and depth of the organization there.

Sherrill: The Ohio State University Medical Center is located in Columbus, Ohio. We consist of five hospitals. The main hospitals are the James Cancer Hospital and the Ohio State University Hospital on the main campus, and then we have Ohio State University Medical Center Hospital East.

We have about 58,000 inpatient admissions and about one million outpatient admissions, and we do about 15,000 inpatient surgeries, about 19,000 outpatient surgeries, and about 120,137 visits to our emergency department.

We have about 450 beds, and that does not include our one-million-square-foot expansion construction project that we currently have underway, which is expected to be completed in fiscal year 2014.

Gardner: So clearly it's a very big and busy place. Are you managing all the procurement for medical supplies and office supplies? What's the extent of the purchasing and buying that you're involved with?

Divided into two sections

Sherrill: We're divided into two sections. I head up indirect products and services, which covers anything that's non-clinical or not related to patient care. That would be marketing, advertising, linen, landscaping -- anything that's not directly related to patient care.

I have a counterpart who handles all the clinical types of purchases, which would be mammography equipment, needles, drug-eluting stents, all the medical type related supplies and services.

Gardner: So there you were just a few years ago, looking at this problem, knowing where you wanted to go, knowing that you needed to automate and outsource a significant portion of this, and with a very big organization that doesn't just stop in order for you to bring in a new system. You need to keep the plane flying while you change the engine, so to speak.

So, Karen, tell us a little bit about how this unfolded. How did the journey begin, and where have you come in just a fairly short amount of time?

Sherrill: Initially they did an RFP for the e-sourcing technology. Ariba was selected. That was done prior to my coming on board, but when I came on board they said, "This is the technology we have selected and you need to implement this."

As a result, being a new person in the organization, I didn't really know how the organization operated, because I came from the private sector. This was a public sector entity. I was a new person, and no one knew who I was.

The first thing for me was to build relationships internally, and I did that by just taking the time to stop, listen to the way they currently did business, and why they did it that way. Sometimes they were doing things for a good reason, or maybe there was a legal reason. Other things they were doing because they had done them that way for 50 years, and maybe we didn't need to do it that way any longer. So that was the first step.

The second step was that the leadership heads agreed to lay out a significant amount of investment in this particular technology, and my only assignment was to implement it. So we created this analogy that the locomotive is out of the gate. The financial investment had been made. Then you have Karen Sherrill, who is trying to prove something, and my only assignment is to get this implemented.

The rule was that you have three choices: you can get on the locomotive and enjoy the ride, you can step aside and let the train pass you by, or you can try to get in front of the train and stop the progress.

That was the analogy or the vision that we put in place, and they could decide where they wanted to fall. We had individuals who chose three different areas.

Enjoy the ride

Most of them decided to get on the train and enjoy the ride, because they were really itching for change and becoming more efficient. Others were nonbelievers, saying that this is never going to stick, so I am going to let the train pass me by. And there were a few who were afraid of technology changing, afraid of the processes changing, afraid of the shift in power as a result of the processes changing, and they feared visibility. So they tried to stop the train.

Because we had leadership support, whenever we ran across those individuals, we could run it up the chain. We had an endearing term that they would then get the smack down or be smacked back in line so the train can continue on.

We had to do that several times, but in the end we knew what the destination was and the locomotive was going to get there. We were not going to be one of those organizations that bought the technology, and when the subscription expired, we had not had one bid go through the tool.

One of my personal objectives was that a year was not going to go by without one bid going through the tool. One year sounds like a long time, but they didn’t have any processes documented. So we had to step back a few months to document the processes in order to effectively communicate how we wanted this technology to be configured.

But it was not going to take more than a year to get one bid through the tool. Then, when we had the one success, the goal was to build upon that success and continue to get more and more momentum. That’s how we were able to drive this locomotive to conclusion very quickly.

Gardner: It's so important when you're dealing with technology shifts not to underestimate the importance of those people issues and to provide the time for those transitions to take place. So that’s clearly a big deal.

Was there something about the way that the service delivery, the software as a service (SaaS), or cloud delivery helped in that regard? Did people feel like they were getting support? How would you characterize the way that a cloud delivery transition may have affected this people, or behavioral, aspect of things?

Sherrill: One of the key reasons we selected the cloud on-demand version -- versus something hosted behind our firewall, or even integrating it with our systems behind our firewall -- is that those options would have required internal IT support and approval, and that would have delayed the implementation significantly.

We consciously made the decision not to integrate. There was really no need to do it at this point. And they wanted to see if we would use the technology. Why invest in integrating something that’s never going to be adopted?

So one of the benefits of using the on-demand solution in the cloud is that you don’t have to build that interface behind a firewall and you do not have to use internal resources to implement or execute the system. So that was one benefit to using the cloud solution.

Assisting suppliers

One of the problems we were having was that on the old system the buyers were providing technical support to suppliers -- suppliers couldn't remember their email, had chosen the wrong category, or didn't receive the bid -- and the buyers were constantly having to field those issues.

When you go to a technology approach that’s more advanced, they're going to need even more assistance on how to navigate the sites or get their passwords reset. With the on-demand technology we are able to utilize the Ariba help desk. Most of our tickets are supplier based, and within 12 months of implementing there were over 700 tickets issued to the help desk, which were transparent to me and all the buyers.

That's a benefit that is definitely required when you go to a more advanced technology for processing bids. The suppliers are going to be needing more support, but you can't have your resources that are supposed to be focusing on strategic sourcing spending all their time trying to help suppliers to submit a response to a bid.

Gardner: I'd like to hear more about the efficiencies that you've been able to develop with this system and approach, but let's look at the higher-level benefits around strategic sourcing.

Is there something about using the network -- Ariba's Community, their cloud, their Discovery process and services -- that allowed you to increase the number of suppliers interested in doing business with you? Were you able to find additional suppliers when you were seeking them, sort of a matching process I suppose? Was there really a radical or substantial difference in how you sourced, regardless of how you went about sourcing?

Sherrill: There was an additional benefit from using Ariba Discovery. In our legacy system, there were only about 5,000 suppliers. As a public entity we're required to do public notifications of our bid opportunities. With only 5,000 suppliers in our legacy base, the competition could be somewhat limited, because it was only those suppliers who knew about our site and had come there to register to receive our bid notifications.

When we transitioned to Discovery, there were about 350,000 suppliers on Discovery. It's grown to over a half-million at this point. So we've substantially increased the number of suppliers that are aware of our bid opportunities.

When you increase the number of suppliers aware of your bid opportunities, the number of suppliers that participate increases. When you have more suppliers participating, you have increased competition, which then lowers pricing. And we found that to be the case on two high-profile projects.

One project was related to linen. There was a supplier that we were aware of but we couldn’t find their contact information. We put the public bid notification on Discovery, and the supplier popped up. They participated in our bid.

They didn't win, but they were strong competition, and the incumbent felt the need to reduce their pricing by about 16 percent, which resulted in a half-million dollars in savings across year one and year two of the agreement. So if we didn't have that strong competition, those savings probably would not have been generated.
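A quick back-of-the-envelope on those figures: a roughly 16 percent price cut that saves about half a million dollars over the first two years implies a contract on the order of $1.5 million a year. The even split across the two years is an assumption, not something stated in the interview.

```python
# Rough arithmetic behind the linen example; the even two-year split is assumed.
price_cut = 0.16
savings_two_years = 500_000

baseline_spend_two_years = savings_two_years / price_cut   # ~$3.1M
baseline_spend_per_year = baseline_spend_two_years / 2     # ~$1.56M

print(f"Implied baseline spend: ${baseline_spend_two_years:,.0f} over two years "
      f"(~${baseline_spend_per_year:,.0f}/year)")
```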

Driving down prices

The dynamic at work there was that you had the incumbent, who, if they lost the business, would have a lot of explaining to do, working against a new company that was being told, "Do whatever you have to do to get the business." That dynamic drove the pricing down. So that was one benefit.

In several of our categories, we're just getting a lot more suppliers participating, and we're finding suppliers that can source things the buyers may not be subject matter experts in. Because we're joined with the university, they're utilizing the system too, and sometimes they're being asked to source things like hay or dental strips. They're not subject matter experts in that. Where are they going to find suppliers?

Just putting a bid notification out on Discovery gives buyers who are more generalists the tools they need to find suppliers that can compete in categories they're not subject matter experts in, in order to generate the competition that reduces price and gets additional value from the supplier base.

Gardner: You mentioned earlier that one of the requirements was to automate and create more of a solid data repository, perhaps a system of record, for these sorts of activities. In going to Ariba, have you been able to achieve that, and what has that done for you? Is there an analytics benefit and an auditing or tracking benefit? How has being able to satisfy that requirement benefited you as a business?

Sherrill: It has allowed us to standardize on the Seven-Step Strategic Sourcing Process, and within that process there are certain documents that have to be completed. We get audited on that documentation being completed. It's basically around: did we follow the public bidding guidelines? Did we do a public bid? Was the lowest and most responsive bidder selected? And what was the documentation, the score sheet, that supports that contract award?

Now, when an auditor comes in and they select bids, we can run a report of all the bids that were run during a particular period of time. They can just select the bids that they want, and the buyers can just do a search and pull the documents that were requested.

Before, they had to go to some file cabinet and find the bid number -- and what if it was misfiled? It allows us to obtain audited documents much more quickly, and it also standardizes the process to make sure the documents are completed in order to close the project out. So from a visibility standpoint, that has benefited us.

From the reporting aspect, we also can see how many bids are being run by what people and how many contracts are being executed, so we can get visibility into workload. Someone is doing x number of bids, but their work is not very contract related, versus someone is not doing very many bids, but they are doing a lot of contracts, versus someone who is not doing any bids or any contracts -- and that’s kind of an issue. So we can run a report and keep track of productivity, where we may have to shift projects or resources in order to better support our customers. And we can do that by just running a quick report.

Savings is another big initiative. Everybody wants to know how much savings you generated over what period of time. We used to have an Excel spreadsheet where people were loading in their projects. A high-level executive would say, "I want to know what the savings were," and there would be this big initiative around going to the spreadsheet to update your savings. Of course, only one person could go into that spreadsheet at a time, and the accuracy of it was always questionable.

People feel more comfortable

Now we have the technology where people load up their projects and load up their savings. We can run a scheduled monthly report. People are required to review their savings numbers on a monthly basis and make any tweaks or adjustments. When we get those requests for savings numbers, the accuracy is a lot higher, people feel more comfortable with it, and people can update their savings at the same time.

So if we have to do it on short notice, like in two or three hours, we can say, "Go and update your projects." The sourcing analysts can just pull down that report, and we've got our numbers. That is a beautiful thing as far as visibility into savings, tracking the savings, and building more confidence in the accuracy of the numbers we report.

Gardner: How about the productivity of your buyer workforce there? Obviously, you're doing more things, but are you doing it with the same staff? How are they feeling about the workload? What's the sense of what you can do with your resources there?

Sherrill: Since we went live, our resources have only increased by one, but we got additional work to go with that -- the medical center expansion project. So the workload has increased while we've had essentially the same number of people.

Before we implemented Ariba, the construction category had a lot of bids, and that particular buyer was getting a lot of complaints that he wasn't turning the bids around as fast as the end users would like, and that the analysis was taking too long. They wanted to see the bid results. They wanted to see them analyzed quickly, and they were escalating these complaints.

We implemented Ariba. So now he's able to execute his bids faster. Once he loaded one bid up, if another one was similar, he could copy and just tweak, which increased the efficiency on his end.

And because the suppliers were inputting their responses online, it was easier for him to export and do the analysis much quicker, but that particular user base also embraced the technology. We've got them to the point that when the bids come in, they can go in and pull them down themselves and do the analysis and decide who they want to shortlist.

That particular group is much happier. We're not getting any complaints, and the projects are moving off of his plate much faster. In fact, we used to have a lot of projects that were past due, past due, past due. On the last report we ran, the only items left were due out in the future. So we're starting to get ahead of our strategic sourcing pipeline.

Gardner: That sounds very impressive. What would you have in terms of suggestions or recommendations for folks who are examining moving to a cloud-based procurement and strategic sourcing service or activity? Do you have any 20/20 hindsight, something that you could offer in terms of what you learned along the way?

Sherrill: The first thing I would recommend, and most companies may have it, is that you need to have your processes documented. When you buy one of these subscriptions, the clock starts ticking. If you have to stop and document your processes, then you're not using the tool, but you are paying for it.

Document your processes

Have your processes documented, so as soon as your subscription starts, you can have the tool configured and you don't have to use eight months, like we had to do, to figure out how we wanted the tool to be configured.

They also need to keep in mind that the technology is just a tool and it requires people, processes, and the technology to be successful. I like to say that you have to have brains behind the tool or it's not going to work for you.

Some of the people here had the skill set to embrace the tool and the utilization of the tool, but we have other people who don't have the correct skill set in order to effectively utilize the tool. That’s kind of a constraint if you don't have the authority to transform your organization and right size it with the people that have the correct skill sets.

Another important part of implementation is that you've got to have support from the top -- people who will back you up when someone is trying to put up a wall or be an obstacle to the implementation.

The way you get the leadership support is having an individual who has built credibility as far as the transformation actually working. The person leading the process needs to have the right skill set, and from what we found, that was a person who had excellent project management skills.

Also key is that they need to be able to do strategic sourcing. You have more credibility, if you're actually using the tool and actually using the processes that you are implementing.

Number three, the person has to have excellent communication and interpersonal skills to deal with those people who don't want to go along with the process, as well as team building skills. If that particular leader has all those skills, it allows the opportunity for them to do sourcing events and lead by example.

One of the things that made Ohio State University’s implementation successful is that my projects were the ones that went through the tool first. I was able to identify any problems and reconcile them immediately, so that the buyers that were putting in their bids behind mine would never run across those problems.

If there were problems that I encountered that I wasn't able to fix, and a buyer identified it, I could say, "I'm aware of that problem, and here is your workaround," so that I didn't get any "aha" or "got you" moment that they were trying to come up with to stop the implementation.

The second component to that is that you need to have early adopters. These were people who wanted change, who wanted their sourcing projects to be the first ones to go through the tool. That was valuable because, while I as the project leader will say this is great -- of course I'm going to say it's great, because I'm responsible for rolling it out -- I had early adopters who also agreed, who were fans, who were vocal about it, and who had successes of their own. That was very instrumental to rolling the project out successfully.

Adding credibility

Then, once you've got several people using the tool who are actual sourcing professionals, it adds credibility, and everyone else has no choice but to follow along.

The other important thing is that you have to shut down the old way of processing bids. We had a go-live date, and we had individuals who decided to use the old way all the way up until the last day, but when we came to the go-live date, there were no more bids going out the old way. That has led to 100 percent compliance since we went live in May of 2010. Those would be my recommendations.

Gardner: That’s very good insight. I appreciate that. You've been listening to a sponsored podcast discussion on how businesses are using cloud commerce and automated procurement to dramatically improve how they source and manage the buying process.

I want to thank our guest. We've been here with Karen Sherrill, Senior Commodity Manager, The Ohio State University Medical Center in Columbus. Thanks so much, Karen. That was very insightful. I appreciate it.

Sherrill: Thank you for having me.

Gardner: This is Dana Gardner, Principal Analyst at Interarbor Solutions. As always, thanks for listening and come back next time.

Listen to the podcast. Find it on iTunes/iPod. Download the transcript. Sponsor: Ariba.

Transcript of a BriefingsDirect podcast on how The Ohio State University Medical Center has streamlined their procurement process with cloud-based tools from Ariba. Copyright Interarbor Solutions, LLC, 2005-2011. All rights reserved.

Monday, October 31, 2011

Virtualized Desktops Spur Use of 'Bring Your Own Device' in Schools, Allowing Always-On Access to Education Resources

Sponsored podcast discussion on how a community school corporation is moving to desktop virtualization to allow students, faculty, and administrators flexibility in location and devices.

Listen to the podcast. Find it on iTunes/iPod. Download the transcript. Sponsor: VMware.

Dana Gardner: Hi, this is Dana Gardner, Principal Analyst at Interarbor Solutions, and you’re listening to BriefingsDirect.

Today, we present a sponsored podcast discussion on how enterprises are increasing their use of desktop virtualization in the post-PC era. We’ll also learn about the new phenomena of "bring your own device" (BYOD) and explore how IT organizations are enabling users to choose their own client devices, yet still gain access to all the work or learning applications and data they need safely, securely, and with high performance.

The nice thing about BYOD is that you can essentially extend what you do on premises or on a local area network (LAN) to anywhere -- to your home, to your travels, 24×7.

The Avon Community School Corp. in Avon, Indiana has been experimenting with BYOD and desktop virtualization, and has recently embarked on a wider deployment of both for the 2011-2012 school year. We're about to hear their story. [Disclosure: VMware is a sponsor of BriefingsDirect podcasts.]

So please join me now in welcoming our guests -- Jason Brames, Assistant Director of Technology at Avon Community School. Welcome, Jason.

Jason Brames: Hello. Great to be here.

Gardner: We’re also here with Jason Lantz, Network Services Team Leader at Avon. Welcome, Jason Lantz.

Jason Lantz: Hello.

Gardner: Let’s start with you, Jason Brames. It sounds like you've been successful with server virtualization over the past couple of years with roughly 80 percent virtualization rate on those back-end systems. What made it important for you now to extend virtualization to the desktop? Why has this become an end-to-end value for you?

Brames: One of the things we noticed when doing an assessment of our infrastructure is that we have aging endpoints. We needed to extend the refresh cycle of our desktop computers from what is typical -- for a lot of school districts that's about a 5-year refresh -- to getting anywhere from 7 to 10, maybe even 12, years out of a desktop computer.

By going to a thin client model and connecting those machines to a virtual desktop, we're able to achieve high quality results for our end users, while still giving them computing power that they need and allowing us to have the cost savings by negating the need to purchase new equipment every five years.

Gardner: So even though those PCs have 150,000 miles so to speak, you can keep them going and keep them running for another couple of years.

Brames: Yeah, and most importantly, providing that quality of service and computing power that the end user has grown accustomed to.

Gardner: Tell us a little bit, Jason, about Avon Community School Corp., the grades, your size, what sort of organization are you?

Supporting 5,500 computers

Brames: We're located about 12 miles west of Indianapolis, Indiana, and we have 13 instructional buildings. We're a pre-K-to-12 institution and we have approximately 8,700 students, nearing 10,000 end-users in total. We’re currently supporting about 5,500 computers in our district.

Gardner: That’s a large number. What was the problem you needed to solve when you were looking at this large number of devices and a large number of users? I assume that you probably want to get an even higher penetration of device per user.

Brames: Absolutely. By going with a virtual environment, the problem that we were looking to solve was really just that -- how do we provide an extended refresh cycle for all of those devices?

Gardner: What I was driving at was not just the numbers but the ability to manage that. So the complexity and cost, was that part of the equation as well?

Lantz: As you said, with that many devices, getting out there and installing software, even if it’s a push, locally, or what have you, there's a big management overhead there. By using VMware View and having that in our data center, where we can control that, the ability to have your golden image that you can then push out to a number of devices has made it a lot easier to transition to this type of model.

We’re finding that we can get applications out quicker with more quality control, as far as knowing exactly what’s going to happen inside of the virtual machine (VM) when you run that application. So that’s been a big help.

Gardner: And we’re talking about not just productivity apps here, I assume. We’ve got custom apps, educational apps, and I'm going to guess probably a lot of video and rich media.

Lantz: A lot of our applications are Web-based, Education City, some of those. It’s a lot of graphics and video. And we found that we're still able to run those in our View environment and not have issues.

Gardner: Why don’t you tell us a little bit about your environment? What are you running in terms of servers? What is your desktop virtualization platform, and what is it that allows you to move on this so far?

Lantz: On the server side, we're running VMware vSphere 4.1. On the desktop side, we're running View 4.6. Currently in our server production, as we call it, we have three servers. And we're adding a fourth shortly. On the View side of things, we currently have two servers and we’re getting two more in the next month or so. So we’ll have a total of four.

Access from anywhere

Gardner: Now one of the nice things about the desktop virtualization and this BYOD is it allows people to access these activities more freely anywhere. My kids are used to being able to access anything. If you were to tell them you can only do school work in school, they'd look at you like you’re from another planet.

So how do you manage to now take what was once confined to the school network and allow the students and other folks in your community to do what they need to do, regardless of where they are, regardless of the device?

Brames: We’re a fairly affluent community. We have kids who were requesting to bring in their own devices. We felt as though encouraging that model in our district was something that would help students continue to use computers that were familiar to them and help us realize some cost savings long term.

So by connecting to virtual desktops in our environment, they get a familiar resource while they're within our walls in the school district, have access to all of their shared drives, network drives, network applications, all of the typical resources that are an expectation of sitting down in front of a school-owned piece of equipment. And they're seeing the availability of all of those things on their own device.

We're also seeing an influx of more mobile-type devices such as tablets and even smartphones. The percentage of our users who are using tablets and smartphones for powerful computing or as their primary devices right now is fairly low. However, we anticipate over time that the variety of devices we'll have connecting to our network because of virtual desktops is going to increase.

Gardner: Jason Lantz, are you at the point where you're able to extend the same experience for those students who would be in school using a PC -- getting all of the mileage out of that that they can, saving you a few dollars in the process -- but then move over to their own device, let's call it a tablet, and start right into the same session? How is that hand-off happening? Are you able to segue and provide a unified experience yet?

Lantz: That’s part of phase two of our approach that we’re implementing right now. We’ve gotten it out into the classrooms to get the students familiar with it, so that they understand how to use it. The next step in that process is to allow them to use this at home.

We currently have administrators who are using it in this fashion. They have tablets and use the View client to connect in and get the same experience whether they're in school or out of school.

So we’re to that point. Now that our administrators understand the benefits, now that our teachers have seen it in the classrooms, it’s a matter of getting it out there to the community.

One of the other ways that we're making it available is at our public library, where we have a set of machines that students can access as well, because as you know, not every student has access to high-speed Internet. They are able to go to the library, check out these machines, and get into the network that way. Those are some of the ways that we're trying to bridge that gap.

Huge win-win

Gardner: It sounds like a huge win-win, because you’re able to reduce your costs, increase your control, and at the same time give the students a lifecycle of learning across all of the different devices and places that they might be. I think that’s fabulous.

Let's find out a bit more about how far into this you are. Jason Brames, you mentioned that you have about 5,500 endpoint devices. How far into that number are you with desktop virtualization? Then, maybe you can give us a sense of how many BYOD instances you have too.

Brames: We currently have 400 View desktop licenses. We're seeing utilization of that license pool of 20-25 percent right now, and the primary reason is that we're really just beginning that phase. We're really in the second year of our virtual desktop rollout, but the first year of more widespread use.

We're training teachers on how to adequately and effectively use this technology in their classrooms with kids. It's been very well received and is being adopted very well in our classrooms, because people are seeing that we've been able to improve the computing experience for them.

Gardner: I understand that you’ve had a partner involved with this. TIG I believe it is. How did that affect your ability to roll this out so far?

Lantz: Technology Integration Group has resources that allow us to see what other school districts are doing and what are some of the things that they’ve run into. Then, they bring back here and we can discuss how we want to roll it out in our environment. They’ve been very good at giving us ideas of what has worked with other organizations and what hasn’t. That’s where they've come in. They’ve really helped us understand how we can best use this in our environment.

Gardner: Sometimes I hear from organizations, when they move to desktop virtualization, that there are some impacts on things like network or storage that they didn’t fully anticipate. How has that worked for you? How has this roll out movement towards increased desktop virtualization impacted you in terms of what you needed to do with your overall infrastructure?

Lantz: Luckily for us we’ve had a lot of growth in the last two to three years, which has allowed us to get some newer equipment. So our network infrastructure is very sound. We didn’t run into a lot of the issues that commonly you would with network bandwidth and things like that.

On the storage side, we did increase our storage. We went with an EqualLogic box for that, but with View and linked clones, it doesn't take up a ton of storage space, so we haven't seen a huge impact there. As we get further into this, storage requirements will grow, but currently that hasn't been a big issue for us.
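For readers unfamiliar with why linked clones keep the footprint down: each desktop shares one replica image and stores only its own changes. The numbers below are illustrative assumptions (the 400-desktop figure echoes the license count mentioned earlier in the interview), not Avon's measurements.

```python
# Simplified linked-clone storage math; sizes are assumed, and real deltas
# vary with workload and refresh/recompose policy.
desktops = 400                 # matches the View license count cited earlier
full_image_gb = 20             # one golden/replica image
delta_per_desktop_gb = 2       # per-desktop writable delta (assumed)

full_clones_gb = desktops * full_image_gb
linked_clones_gb = full_image_gb + desktops * delta_per_desktop_gb

print(f"Full clones:   ~{full_clones_gb:,} GB")    # ~8,000 GB
print(f"Linked clones: ~{linked_clones_gb:,} GB")  # ~820 GB
```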

Gardner: On the flip side of that, a lot of organizations I talk to who moved to desktop virtualization gained some benefits on things like backup, disaster recovery, security, control over data and assets, and even compliance and regulatory issues. Has there been an upside you could point to in terms of more centralized control of desktop content and assets?

Difficult to monitor

Lantz: When you start talking about students bringing in their own devices, it's difficult to monitor what's on that personally owned device.

We found that by giving them a View desktop, we know what's in our environment and we know what that virtual machine has. That allows us to have more secure access for those students without compromising what's on that student’s machine, or what you may not know about what's on that student’s machine. That’s been a big benefit for us allowing students to bring in their own devices.

Gardner: Otherwise you’re bringing something onto your network without really knowing what's on it, and you lose control. This gives you that best-of-both-worlds flexibility while keeping your risks low.

Lantz: Absolutely.

Gardner: Do we have any metrics of success, either in business -- or, in this case, learning -- terms, and/or IT cost savings? What has this done for you? I know it's a little early, but what are the early results?

Brames: You did mention that it is a little bit early, but we believe that as we begin using virtual desktops more so in our environment, one of the major cost savings that we’re going to see as a result is licensing cost for unique learning applications.

Typically, our district would have purchased a certain number of licenses for each of our instructional buildings, because teachers needed to use an application with students in the classroom. A building may have a certain number of students who need access to that application, for example, but they're not all accessing it at the same time of day, or it's installed on a fat client -- a physical machine somewhere in the building -- that's difficult for students to get to.

By creating these pools of machines that have specialty software on them we’re able to significantly reduce the number of titles we need to license for certain learning applications or certain applications that improve efficiencies for teachers and for students.
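
As an editorial aside, here is a rough sketch of the licensing arithmetic Brames describes. The building names, student counts, seats-per-building rule, and peak-concurrency figure are all hypothetical assumptions, not district data.

```python
# Illustration of per-building licensing versus a pooled, districtwide model.
# Every number here is a hypothetical assumption for the sake of the example.

from math import ceil

# Students per building who need a specialty learning application.
students_needing_app = {"Building A": 120, "Building B": 90, "Building C": 150}

# Old model: license seats for each building's machines separately
# (assume, hypothetically, one installed seat per four students on fat clients).
per_building_seats = sum(ceil(n / 4) for n in students_needing_app.values())

# Pooled model: one districtwide virtual desktop pool sized for peak concurrent use
# (assume, hypothetically, that at most 15 percent of users run the app at once).
peak_concurrency = 0.15
pooled_seats = ceil(sum(students_needing_app.values()) * peak_concurrency)

print(f"Per-building licensing: {per_building_seats} seats")
print(f"Pooled licensing:       {pooled_seats} seats")
```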

So that’s one area in which we know we’re going to see significant return on our investment. We already talked about extending the endpoints, and with energy savings, I think we can prove some results there as well. Anything to add, Jason?

Lantz: One that’s hard to calculate is, as you mentioned, the maintenance and management piece. In technology, as we all know, you’re doing more with less, and this really gives you the ability to do that. How you measure that is sometimes difficult, but there are definitely cost savings there as well.

Gardner: Just to be clear, the folks that are adopting this first in your organization, are these the students, are they folks in a lab, a research environment, faculty? Who are the people that grok this and really jump on it first?

No lab deployment

Brames: The first place we’re deploying is the student computing stations in our classrooms. We’re not deploying to lab environments as much as we are to those locations in our classrooms.

A typical classroom for us contains four student computing stations, and, depending upon the building size, there are also three to five labs available. We’re not focusing our desktop virtualization on those labs. We’re focusing on the classroom computing stations right now. Potentially, we'll also be in labs as we go into the future.

Then, in addition to those student computing stations, our administrative team -- principals and our district-level administrators -- are beginning to use virtual desktops while they’re outside of the district and growing familiar with that, so that whenever we enter the phase where we allow our students to access from outside of our network, we have that support structure in place.

Gardner: That sounds important especially for those later grades and high school grades, because this is probably the type of experience they’re going to be getting should they move onto college, where they are going to each have a device and have this ubiquity. It seems to me that they'll be one step ahead, if they get used to that now in high school. Even junior high school sets them up to be more productive and adapted to what they'll get in a college environment.

Lantz: In a lot of organizations, it would make sense to start there. At the higher grade levels, students are probably going to use it outside the district more than elementary school students will. But for us, it made sense with our older hardware, which is primarily in a lot of our elementary, middle, and intermediate schools. So it made sense that that’s where we would start.

Gardner: I know budgets are really important in just about any school environment. If you were to say, "Listen, the cost it would take for us to make sure each individual student had their own device would be X and the cost of supporting it would be additional each year," you might get some push back. I'm going to make a wild guess on that.

But it sounds to me like you’re able to go with desktop virtualization and increased use of BYOD and say, "Listen, we can get to near one-to-one parity with student to device for a lot less."

Do you have any sense of the delta there between what it would be if you stuck to traditional cost structures, traditional licensing, fat client, to get to that one to one ratio, compared to what you’re going to be able to do over time with this virtualized approach? Any sense of how big a delta that we have there?

Brames: Our finance department has been very supportive of us in this whole endeavor, and the return on investment (ROI) cost calculations and everything is something that our finance team is very good at. We appreciate that they were able to recognize with us that this is something that would be beneficial to the district.

I apologize that I'm not actually prepared to put any numbers on it. Because we're early, putting an actual number is challenging for me right now.

Metrics of success

Gardner: Jason Lantz, I know actual dollar figures are hard, but do you have any sense -- maybe a percentage or even just a generalization -- of the comparison between the old way of getting to one-to-one versus the new way?

Lantz: It's a little bit difficult. In our Advanced Learning Center -- and Jason, you can help me out with this -- as far as school-owned devices versus people bringing in their own devices, do you know what those numbers would be?

Brames: The Advanced Learning Center is the school building that has primarily senior students and advanced placement students. About 600 students attend there.

Last year, 75 percent of those students were using school-owned equipment and 25 percent of them were bringing their own laptops to school. This year, what we have seen is that 43 percent of our students are beginning to bring their own devices to connect to our network and have access to network resources.

If that trend continues, which we think it will, we’ll be looking at certainly over 50 percent next year, hopefully approaching 60-65 percent of our students bringing their own devices. When you consider that that is approximately 400 devices that the school district did not need to invest in, that’s a significant saving for us.
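
As an editorial aside, the arithmetic behind these figures is easy to sketch. Only the 600-student enrollment, the adoption percentages, and the hoped-for 60-65 percent range come from the discussion; the per-device cost is a hypothetical assumption.

```python
# Rough illustration of the BYOD trend described above for a 600-student building.
# The per-device cost is a hypothetical assumption.

ENROLLMENT = 600
DEVICE_COST = 500  # hypothetical cost of a school-purchased device, in dollars

for label, byod_share in [("last year", 0.25), ("this year", 0.43), ("projected", 0.65)]:
    byod_devices = round(ENROLLMENT * byod_share)
    avoided_cost = byod_devices * DEVICE_COST
    print(f"{label:>10}: ~{byod_devices} student-owned devices "
          f"(~${avoided_cost:,} the district does not have to spend)")
```

At the projected 65 percent share, the count works out to roughly 390 student-owned devices, which lines up with the "approximately 400 devices" Brames cites.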

Gardner: That’s a very rapid growth rate, and you've been able to accommodate it. But you’re going from 25 percent to 43 percent, and you’re certainly not seeing that kind of increase in your total cost. So it’s a significant saving.

Brames: It is a little bit of a small snapshot right now. Our senior center has seen this increase, and district-wide we think that our results can be projected to our K-12 grade levels over time.

Gardner: I commend you for being able to anticipate and accommodate these trends, because this is happening so rapidly with these devices.

One last set of questions on advice for others who would be moving towards more desktop virtualization and the enablement of BYOD. If you could do this over again, a little bit of 20/20 hindsight, what might you want to tell them in terms of being prepared?

Lantz: One thing that’s important is that when you explain this to users, the words "virtual desktop" can be a little confusing to teachers and end users. What I've done is take the approach that it’s no different from having a regular machine, and you can set it up so that it looks exactly the same.

No real difference

When you start talking with end users about virtual, it gets into, okay, "So it’s running back here, but what problems am I going to encounter?" and those sort of things. Trying to get that end user to realize that there really isn’t a difference between a virtual desktop and a real desktop has been important for us for getting them on board and making them understand that it’s not going to be a huge change for them.

Gardner: Over time, as it becomes seamless, they wouldn’t really know. They just log in with their ID and password, and then things just work.

Lantz: Yeah.

Brames: Yeah, I think so.

Gardner: Very good. You’ve been listening to a sponsored podcast discussion on how enterprises -- and in this case, a community school corporation -- are increasing their use of desktop virtualization in the post-PC era. And they’re also very much on top of a new phenomenon around "bring your own device."

I’d like to thank our guests. We’ve been here with Jason Brames, Assistant Director of Technology at the Avon Community School Corp. Thank you, Jason.

Brames: You’re welcome. Thank you.

Gardner: And we’ve also been joined by Jason Lantz, Network Services Team Leader there in Avon, Indiana. Thank you, sir.

Lantz: All right. Thank you.

Gardner: This is Dana Gardner, Principal Analyst at Interarbor Solutions. Thanks again to our listeners, and don’t forget to come back next time.

Listen to the podcast. Find it on iTunes/iPod. Download the transcript. Sponsor: VMware.

Sponsored podcast discussion on how a community school corporation is moving to desktop virtualization to allow students, faculty, and administrators flexibility in location and devices. Copyright Interarbor Solutions, LLC, 2005-2011. All rights reserved.

Friday, October 28, 2011

Continuous Improvement And Flexibility Are Keys to Successful Data Center Transformation, Say HP Experts

Transcript of a sponsored podcast in conjunction with an HP video series on how companies can transform data centers productively and efficiently.

Listen to the podcast. Find it on iTunes/iPod. Download the transcript. Sponsor: HP.

For more information on The HUB -- HP's video series on data center transformation, go to www.hp.com/go/thehub.

Dana Gardner: Hi, this is Dana Gardner, Principal Analyst at Interarbor Solutions, and you’re listening to BriefingsDirect.

Today, we present a sponsored podcast discussion on two major pillars of proper and successful data center transformation (DCT) projects. We’ll hear from a panel of HP experts on proven methods that have aided productive and cost-efficient projects to reshape and modernize enterprise data centers.

This is the first in a series of podcasts on DCT best practices and is presented in conjunction with a complementary video series. [Disclosure: HP is a sponsor of BriefingsDirect podcasts.]

Here today, we’ll learn about the latest trends buttressing the need for DCT and then how to do it well and safely. Specifically, we’ll delve into why it's important to fully understand the current state of an organization’s IT landscape and data center composition in order to then properly chart a strategy for transformation.

Secondly, we'll explore how to avoid pitfalls by balancing long-term goals with short-term flexibility. The key is to know how to constantly evaluate based on metrics and to reassess execution plans as DCT projects unfold. This avoids being too rigidly aligned with long-term plans and roadmaps and potentially losing sight of how actual progress is being made -- or not.

With us now to explain why DCT makes sense and how to go about it with lower risk, we are joined by our panel: Helen Tang, Worldwide Data Center Transformation Lead for HP Enterprise Business; Mark Grindle, Master Business Consultant at HP, and Bruce Randall, Director of Product Marketing for Project and Portfolio Management at HP.

Welcome to you all.

My first question goes to Helen. What are the major trends driving the need for DCT? Also, why is now such a good time to embark on such projects?

Helen Tang: We all know that in this day and age, the business demands innovation, and IT is really important -- a racing engine for any business. However, there are a lot of external constraints. The economy is not getting any better. Budgets are very, very tight. IT organizations are dealing with IT sprawl and aging infrastructure, and are very much weighed down by a decade's worth of old assets that they’ve inherited.

So a lot of companies today have been looking to transform, but getting started is not always easy. That's why HP decided to launch The HUB project, which is designed to be a resource engine for IT, featuring a virtual library of videos that showcase the best of HP and, more importantly, ideas for how to address these challenges. We as a team decided to tackle it with a series aligned around some of the ways customers can approach their data centers, transform them, and jump-start their IT agility.

The five steps that we decided on as keys for the series are: the planning process, which is actually what we’re discussing in this podcast; data center consolidation and standardization; virtualization; data center automation; and last but not least, of course, security.

IT superheroes


To make this video series more engaging, we hit on this idea of IT professionals as superheroes, because we’ve all seen people, especially in this day and age -- customers with lean budgets -- whose IT team is really performing superhuman feats.

We thought we’d produce a series that's a bit more light-hearted than is usual for HP. So we added a superhero angle to the series. That’s how we hit upon the name of "IT Superhero Secrets: Five Steps to Jump Start Your IT Agility." Hopefully, this is going to be one of the little things that can contribute to this great process of data center modernizing right now, which is a key trend.

With us today are two of these experts that we’re going to feature in Episode 1. And to find these videos, you go to hp.com/go/thehub.

Gardner: Now we’re going to go to Mark Grindle. Mark, you've been doing this for quite some time and have learned a lot along the way. Tell us why having a solid understanding of where you are in the present puts you in a position to better execute on your plans for the future.

Mark Grindle: Thank you, Dana. There certainly are a lot of great reasons to start transformation now.

But as you said, the key to starting any kind of major initiative -- whether it’s transformation, data center consolidation, or any of these great things like virtualization and technology refresh that will help you improve your environment, improve the service to your customers, and reduce costs, which is what this is all about -- is to understand where you are today.

Most companies out there with the economic pressures and technology changes that have gone on have done a lot to go after the proverbial low-hanging fruit. But now it’s important to understand where you are today, so that you can build the right plan for maximizing value the fastest and in the best way.

When we talk about understanding where you are today, there are a few things that jump to mind. How many servers do I have? How much storage do I have? What are the operating system levels and the versions that I'm at? How many desktops do I have? People really think about that kind of physical inventory and they try to manage it. They try to understand it, sometimes more successfully and other times less successfully.

But there's a lot more to understanding where you are today. Understanding that physical inventory is critical to what you need to know to go forward, and most people already have a lot of tools out there to do that. For those of you who don’t have tools that can capture that physical inventory, it’s important that you get them.

I've found so many times when I go into environments that they think they have a good understanding of what they have physically, and a lot of times they do, but rarely is that accurate. Manual processes just can't keep things as accurate or as current as you really need, when you start trying to baseline your environment so that you can track and measure your progress and value.
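
As an editorial aside, the baseline-and-drift idea Grindle describes can be sketched in a few lines. The server records and fields below are invented examples, not output from any particular HP discovery tool; the point is simply that an automated snapshot-and-diff catches changes that manual tracking tends to miss.

```python
# A minimal sketch of "know your current state": capture a baseline inventory
# and diff it against a later snapshot to spot drift. All records are invented.

from dataclasses import dataclass

@dataclass(frozen=True)
class ServerRecord:
    name: str
    os: str
    cpu_cores: int
    memory_gb: int

def diff_inventory(baseline, current):
    """Return server names that were added, removed, or changed since the baseline."""
    base = {s.name: s for s in baseline}
    curr = {s.name: s for s in current}
    added = [n for n in curr if n not in base]
    removed = [n for n in base if n not in curr]
    changed = [n for n in curr if n in base and curr[n] != base[n]]
    return added, removed, changed

baseline = [ServerRecord("app01", "RHEL 5", 4, 16), ServerRecord("db01", "Windows 2003", 8, 32)]
current  = [ServerRecord("app01", "RHEL 6", 4, 32), ServerRecord("web01", "RHEL 6", 2, 8)]

added, removed, changed = diff_inventory(baseline, current)
print("added:", added, "removed:", removed, "changed:", changed)
```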

Thinking about applications


Of course, beyond the physical portions of your inventory, you'd better start thinking about your applications. What are your applications? What language are they written in? Are they traditional, supportable, commercial-off-the-shelf (COTS) type applications? Are they homegrown? That’s going to make a big difference in how you move forward.

And of course, what does your financial landscape look like? What’s going into operating expense? What’s your capital expense? How is it allocated out and, by the way, is it allocated consistently?

I've run into a lot of issues where a business unit in the United States has put certain items into an operating expense bucket. In another country or a sub-business unit or another business unit, they're tracking things differently in where they put network cost or where they put people cost or where they put services. So it's not only important to understand where your money is allocated, but what’s in those buckets, so that you can track the progress.

Then, you get into things like people. As you start looking at transformation, a big part of transformation is not just the cost savings that may come about through being able to redeploy your people, but it's also from making sure that you have the right skill set.

If you don’t really understand how many people you have today, what roles and what functions they’re performing, it's going to become really challenging to understand what kind of retraining, reeducation, or redeployment you’re going to do in the future as the needs and the requirements and the skills change.

You transform as you level out your application landscape, as you consolidate your databases, as you virtualize your servers, and as you make use of all those great storage technologies. That's going to make a big difference in how your team, your IT organization, runs the operations. You really need to understand where they are, so you can properly prepare them for that future state that they want to get into.

So understanding where you are, understanding all those aspects of it, is going to be the only way to understand what you have to do to get to your end state. As was mentioned earlier, you need the metrics and measurements to track your progress. Are you realizing the value, the savings, the benefit to your company that you initially used to justify transformation?

Gardner: Mark, I had a thought when you were talking. We’re not just going from physical to physical. A lot of DCT projects now are making that leap from largely physical to increasingly virtual. And that is across many different aspects of virtualization, not just server virtualization.

Is there a specific requirement to know your physical landscape better to make that leap successfully? Is there anything about moving toward a more virtualized future that adds an added emphasis to this need to have a really strong sense of your present state?

Grindle: You're absolutely right on with that. A lot of people have server counts -- I've got a thousand of these, a hundred of those, 50 of those types of things. But the more detailed measurements around those -- how much memory is being utilized by each server, how much CPU or processor capacity is being utilized by each server, what the I/Os look like, the network connectivity -- are the kinds of inventory items that are going to allow you to virtualize.

Higher virtualization ratios


I talk to people and they say, "I've got a 5:1 or a 10:1 or a 15:1 virtualization ratio," meaning that you had 15 physical servers and were able to consolidate them onto one. But if you really understand what your environment is today, how it runs, and the performance characteristics of your environment today, there are environments out there that are achieving much higher virtualization ratios -- 30:1, 40:1, 50:1. We’ve seen a couple that are in the 60 and 70:1 range.

Of course, that just says that initially they weren’t really using their assets as well as they could have been. But again, it comes back to understanding your baseline, which allows you to plan out what your end state is going to look like.

If you don’t have that data, if you don’t have that information, naturally you've got to be a little more conservative in your solutions, as you don’t want to negatively impact the business of the customers. If you understand a little bit better, you can achieve greater savings, greater benefits.

Remember, this is all about freeing up money that your business can use elsewhere to help your business grow, to provide better service to those customers, and to make IT more of a partner, rather than just a service purely for the business organization.
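
As an editorial aside, here is a simplified sketch of how measured utilization data changes the consolidation math Grindle describes: with real CPU and memory numbers you can size aggressively, and without them you fall back to conservative rules of thumb. The host capacity, headroom, and utilization figures are all hypothetical assumptions; real sizing would also account for peaks, I/O, and failover capacity.

```python
# Simplified consolidation-ratio estimate driven by measured utilization.
# All capacities and utilization figures are hypothetical assumptions.

def achievable_ratio(measured_utilizations, host_capacity=4.0, headroom=0.30):
    """Estimate how many legacy servers' measured load fits on one new host
    while reserving headroom for peaks. Utilization is a fraction of one
    legacy server's capacity; host_capacity is the new host's capacity in
    legacy-server equivalents."""
    avg_load = sum(measured_utilizations) / len(measured_utilizations)
    usable = host_capacity * (1.0 - headroom)
    return round(usable / avg_load)

# With measurement: most legacy servers idle along at 4-10 percent utilization.
measured = [0.05, 0.08, 0.06, 0.10, 0.04, 0.07]
print("Data-informed estimate: roughly", achievable_ratio(measured), ": 1")

# Without measurement: assume a conservative 15 percent average, just in case.
print("Conservative fallback:  roughly", achievable_ratio([0.15]), ": 1")
```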

Gardner: So it sounds as if measuring your current state isn’t just measuring what you have, but measuring some of the components and services you have physically in order to be able to move meaningfully and efficiently to virtualization. It’s really a different way to measure things, isn’t it?

Grindle: Absolutely. And it’s not a one-time event. To figure out whether transformation is right for you and what your transformation will look like, you can do that one-time inventory, that one-time collection of performance information. But it’s really going to be an ongoing process.

The more data you have, the better you’re going to be able to figure out your end-state solution, and the more benefit you’re going to achieve out of that end state. Plus, as I mentioned earlier, the environment changes, and you’ve got to constantly keep on top of it and track it.

You mentioned that a lot of people are going towards virtualization. That becomes an even bigger problem. At least when you’re standing up a physical server today, people complain about how long it takes in a lot of organizations, but there are a lot of checks and balances. You’ve got to order that physical hardware. You've got to install the hardware. You’ve got to justify it. It's got to be loaded up with software. It’s got to be connected to the network.

A virtualized environment can be stood up in minutes. So if you’re not tracking that on an ongoing basis, that's even worse.

Gardner: Let’s now go to Bruce Randall. Bruce, you’ve been looking at the need for being flexible in order to be successful, even as you've got a long-term roadmap ahead of you. Perhaps you could fill us in on why it’s important to evaluate along the way, not be blinded by long-term goals, and keep balancing and reassessing as you go?

For more information on The HUB -- HP's video series on data center transformation, go to www.hp.com/go/thehub.

Account for changes

Bruce Randall: That goes along with what Mark was just saying about the infrastructure components, how these things are constantly changing, and there has to be a process to account for all of the changes that occur.

If you’re looking at a transformation process, it really is a process. It's not a one-time event that occurs over a length of time. Just like any other big program or project that you may be managing you have to plan not only at the beginning of that transformation, but also in the middle and even sometimes in the end of these big transformation projects.

If you think about the things that may change throughout that transformation, one is people. You have people who come. You have people who leave for whatever reason. You have people who are reassigned to other roles or who take roles they wanted outside of the transformation project. The company strategy may even change, and in fact, in this economy, it probably will within the course of the transformation project.

The money situation will most likely change. Maybe you’ve had a certain amount of budget when you started the transformation. You counted on that budget to be able to use it all, and then things change. Maybe it goes up. Maybe it goes down, but most likely, things do change. The infrastructure as Mark pointed to is constantly in flux.

So even though you might have gotten a good steady state of what the infrastructure looked like when you started your transformation project, that does change as well. And then there's the application portfolio. As we continue to run the business, we continue to add or enhance existing applications. The application portfolio changes and therefore the needs within the transformation.

Because of all of these changes occurring around you, there's a need not only to plan for contingencies at the beginning of the process, but also to continue the planning process and update it as things change, fairly consistently. What I’ve found over time, Dana, with various customers doing these transformation projects, is that when planning isn't just at the beginning, not just at the middle, not just at one point, the planning process goes a lot better and it becomes a lot easier.

In fact, I was speaking with a customer the other day. We went to a baseball game together. It was a customer event, and I was surprised to see this particular customer there, because I knew it was their yearly planning cycle that was going on. I asked them about that, and they talked about the way that they had used our tools. The HP tool sets that they used had allowed them to literally do planning all the time. So they could attend a baseball game instead of attend the planning fire-drill.

So it wasn’t a one-time event, and even if the business wanted a yearly planning view, they were able to produce that very, very easily, because they kept their current state and current plans up to date throughout the process.

Gardner: This reminds me that we've spoken in the past, Bruce, about software development. Successful software development for a lot of folks now involves agile principles. There are these things they call scrum meetings, where people get together and they're constantly reevaluating or adjusting, getting inputs from the team.

Having just a roadmap and then sticking to it turns out not to be just business as usual, but can actually be a path to disaster. Any thoughts about learning from how software is developed in terms of planning for a large project like a DCT?

A lot of similarities

Randall: Absolutely. There are a lot of similarities between the new agile methodologies and what I was just describing in terms of planning at the beginning, in the middle, and the end basically constantly. And when I say the word, plan, I know that evokes in some people a thought of a lot of work, a big thing. In reality, what I am talking about is much smaller than that.

If you’re doing it frequently, the planning needs to be a lot smaller. It's not a huge, involved process. It's very much like the agile methodology, where you’re consistently doing little pieces of work, finishing up sub-segments of the entire thing that you need to do, as opposed to describing it all, having all your requirements written out at the beginning, and then waiting for it to get done sometime later.

You’re actually adapting and changing, as things occur. What's important in the agile methodology, as well as in this transformation, like the planning process I talked about for transformation, is that you still have to give management visibility into what's going on.

Having a planning process, and even a tool set to help you manage that planning process, will also give management the visibility they need into the status of that transformation project. The planning process, like the agile development methodology, also allows collaboration. As you’re going back to the plan, readdressing it, and thinking about the changes that have occurred, you’re collaborating across various groups and silos to make sure that you’re still in tune and still doing the things you need to be doing to make things happen.

One other thing that often is forgotten within the agile development methodology, but it’s still very important, particularly for transformation, is the ability to track the cost of that transformation at any given point in time. Maybe that's because the budget needs to be increased or maybe it's because you're getting some executive mandate that the budget will be decreased, but at least knowing what your costs are, how much you’ve spent, is very, very important.

Gardner: When you say that, it reminds me of something 20 years or more ago in manufacturing, the whole quality revolution, thought leaders like Deming and the Japanese Kaizen concept of constantly measuring, constantly evaluating, not letting things slip. Is there some relationship here to what you’re doing in project management to what we saw during this “quality revolution” several decades ago?

Randall: Absolutely. You see some of the tenets of project management there. Number one, you're tracking what’s going on. You’re measuring what’s going on at every point in time, not only the costs and the time frames, but also the people who are involved. Who's doing what? Are they fulfilling the tasks we’ve asked them to do, and so on? This produces, in the end, just as Deming and others have described, a much higher quality transformation than if you were to just haphazardly try to fulfill the transformation without having a project management tool in place, for example.

Gardner: So we’ve discussed some of these major pillars of good methodological structure and planning for DCT. How do you get started? Are there some resources available to get folks better acquainted with these to begin executing on how to put in place measurements, knowing their current state, creating a planning process that's flexible and dynamic before they even get into a full-fledged DCT? So what resources are available, and I'll open up this to the entire panel.

Randall: One thing that I would start with is to use multiple resources from HP and others to help customers in their transformation process, both to plan out initially what that transformation is going to look like and then to provide a set of tools to automate and manage that program and the changes that occur to it over time.

That planning is important, as we’ve talked about, because it occurs at multiple stages throughout the cycle. If you have an automated system in place, it certainly makes it easier to track the plan and changes to that plan over time.

Gardner: And then you’ve created this video series. You also have a number of workshops. Are those happening fairly regularly at different locations around the globe? How are the workshops available to folks just to start in on this?

A lot of tools


Grindle: We do have a lot of tools, as I was mentioning. One of the ones I want to highlight is the Data Center Transformation Experience workshop, and the reason I want to highlight it is that it really ties into what we’ve been talking about today. It’s an interactive session involving large panels, with very minimal presentation and very minimal speaking by the HP facilitators.

We walk people through all the aspects of transformation and this is targeted at a strategic level. We’re looking at the CIOs, CTOs, and the executive decision makers to understand why HP did what they did as far as transformation goes.

We discuss what we’ve seen out in the industry and what the current trends are, and we pull out of the conversation with these people where their companies are today. At the end of the workshop -- and it's a full-day workshop -- a lot of materials are delivered that not only document the discussions throughout the day, but really provide a step or steps for how to proceed.

So it’s a prioritization. You may have a facility, for example, that's in great shape, but your data warehouses are not. That’s an area you should go after fast, because there's a lot of value in changing it, and it’s going to take you a long time. Or there may be a quick hit in your organization and the way you manage your operation, because we cover all the aspects of program management, governance, and management of change. That’s organizational change for a lot of people. As for the technology, we can help them understand not only where they are, but what the initial strategy and plan should be.

You brought up a little bit earlier, Dana, some of the quality people like Deming, etc. We’ve got to remember that transformation is really a journey. There's a lot you can accomplish very rapidly. We always say that the faster you can achieve transformation, the faster you can realize value and the business can get back to leveraging that value, but transformation never ends. There's always more to do. So it's very analogous to the continuous improvement that comes out of some of the quality people that you mentioned earlier.

Gardner: I'm curious about these workshops. Are they happening relatively frequently? Do they happen in different regions of the globe? Where can you go specifically to learn about where the one for you might be next?

Grindle: The workshops are scheduled with companies individually. So a good touch point would be your HP account manager. He or she can work with you to schedule a workshop and understand how it can be done. They're scheduled as needed.

We do hold hundreds of them around the world every year. It’s been a great workshop. People find it very successful, because it really helps them understand how to approach this and how to get the right momentum within their company to achieve transformation, and there's also a lot of materials on our website.

Gardner: You've been listening to a sponsored BriefingsDirect podcast discussion on two major pillars of proper and successful DCT projects, knowing your true state to start and then also being flexible on the path to long-term milestones and goals.

I’d like to thank our panel, Helen Tang, Worldwide Data Center Transformation Lead for HP Enterprise Business; Mark Grindle, Master Business Consultant at HP, and Bruce Randall, Director of Product Marketing for Project and Portfolio Management at HP. Thank you to you all.

This is Dana Gardner, Principal Analyst at Interarbor Solutions, and thanks again for our audience and their listening and attention, and do come back next time.

For more information on The HUB -- HP's video series on data center transformation, go to www.hp.com/go/thehub.

Listen to the podcast. Find it on iTunes/iPod. Download the transcript. Sponsor: HP.

Transcript of a sponsored podcast in conjunction with an HP video series on how companies can transform data centers productively and efficiently. Copyright Interarbor Solutions, LLC, 2005-2011. All rights reserved.
