Monday, October 31, 2011

Virtualized Desktops Spur Use of 'Bring Your Own Device' in Schools, Allowing Always-On Access to Education Resources

Sponsored podcast discussion on how a community school corporation is moving to desktop virtualization to allow students, faculty, and administrators flexibility in location and devices.

Listen to the podcast. Find it on iTunes/iPod. Download the transcript. Sponsor: VMware.

Dana Gardner: Hi, this is Dana Gardner, Principal Analyst at Interarbor Solutions, and you’re listening to BriefingsDirect.

Today, we present a sponsored podcast discussion on how enterprises are increasing their use of desktop virtualization in the post-PC era. We’ll also learn about the new phenomena of "bring your own device" (BYOD) and explore how IT organizations are enabling users to choose their own client devices, yet still gain access to all the work or learning applications and data they need safely, securely, and with high performance.

The nice thing about BYOD is that you can essentially extend what you do on premises or on a local area network (LAN) to anywhere -- to your home, to your travels, 24×7.

The Avon Community School Corp. in Avon, Indiana has been experimenting with BYOD and desktop virtualization, and has recently embarked on a wider deployment of both for the 2011-2012 school year. We’re about to hear their story. [Disclosure: VMware is a sponsor of BriefingsDirect podcasts.]

So please join me now in welcoming our guests -- Jason Brames, Assistant Director of Technology at Avon Community School. Welcome, Jason.

Jason Brames: Hello. Great to be here.

Gardner: We’re also here with Jason Lantz, Network Services Team Leader at Avon. Welcome, Jason Lantz.

Jason Lantz: Hello.

Gardner: Let’s start with you, Jason Brames. It sounds like you've been successful with server virtualization over the past couple of years with roughly 80 percent virtualization rate on those back-end systems. What made it important for you now to extend virtualization to the desktop? Why has this become an end-to-end value for you?

Brames: One of the things we noticed when doing an assessment of our infrastructure, and which matters to our district, is that we have aging endpoints. We needed to extend the refresh cycle of our desktop computers from what is typical -- for a lot of school districts that's about a five-year refresh rate -- to getting anywhere from 7 to 10, maybe even 12, years out of a desktop computer.

By going to a thin-client model and connecting those machines to a virtual desktop, we're able to achieve high-quality results for our end users, while still giving them the computing power they need and realizing cost savings by eliminating the need to purchase new equipment every five years.

Gardner: So even though those PCs have 150,000 miles on them, so to speak, you can keep them going and running for another couple of years.

Brames: Yeah, and most importantly, providing that quality of service and computing power that the end user has grown accustomed to.

Gardner: Tell us a little bit, Jason, about Avon Community School Corp., the grades, your size, what sort of organization are you?

Supporting 5,500 computers

Brames: We're located about 12 miles west of Indianapolis, Indiana, and we have 13 instructional buildings. We're a pre-K-to-12 institution and we have approximately 8,700 students, nearing 10,000 end-users in total. We’re currently supporting about 5,500 computers in our district.

Gardner: That’s a large number. What was the problem you needed to solve when you were looking at this large number of devices and a large number of users? I assume that you probably want to get an even higher penetration of device per user.

Brames: Absolutely. By going with a virtual environment, the problem we were looking to solve was really just that -- how do we provide an extended refresh rate for all of those devices?

Gardner: What I was driving at was not just the numbers but the ability to manage that. So the complexity and cost, was that part of the equation as well?

Lantz: As you said, with that many devices, getting out there and installing software -- even if it's a local push or what have you -- carries a big management overhead. By using VMware View and having that in our data center, where we can control it, the ability to have a golden image that you can then push out to a number of devices has made it a lot easier to transition to this type of model.

We’re finding that we can get applications out quicker with more quality control, as far as knowing exactly what’s going to happen inside of the virtual machine (VM) when you run that application. So that’s been a big help.
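As a rough, purely hypothetical sketch of the management math behind that golden-image approach: the device count below comes from the interview, but the per-task times are assumptions added for illustration, not figures from Avon.

```python
# Hypothetical illustration of per-device installs vs. one golden-image update.
# DEVICES comes from the interview; the per-task minutes are assumed for the sketch.

DEVICES = 5500                  # endpoints supported by the district
MINUTES_PER_LOCAL_INSTALL = 15  # assumed hands-on or troubleshooting time per endpoint
MINUTES_PER_IMAGE_UPDATE = 90   # assumed time to update and test a single golden image

def per_device_hours(devices, minutes_each):
    """Technician hours if every endpoint is touched individually."""
    return devices * minutes_each / 60

def golden_image_hours(minutes_image):
    """Technician hours if one master image is updated and pushed to the whole pool."""
    return minutes_image / 60

if __name__ == "__main__":
    print(f"Per-device installs: {per_device_hours(DEVICES, MINUTES_PER_LOCAL_INSTALL):,.0f} hours")
    print(f"Golden-image update: {golden_image_hours(MINUTES_PER_IMAGE_UPDATE):.1f} hours")
```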

Gardner: And we’re talking about not just productivity apps here, I assume. We’ve got custom apps, educational apps, and I'm going to guess probably a lot of video and rich media.

Lantz: A lot of our applications are Web-based -- Education City and some of those. It's a lot of graphics and video. And we've found that we're still able to run those in our View environment without issues.

Gardner: Why don’t you tell us a little bit about your environment? What are you running in terms of servers? What is your desktop virtualization platform, and what is it that allows you to move on this so far?

Lantz: On the server side, we're running VMware vSphere 4.1. On the desktop side, we're running View 4.6. Currently in our server production, as we call it, we have three servers. And we're adding a fourth shortly. On the View side of things, we currently have two servers and we’re getting two more in the next month or so. So we’ll have a total of four.

Access from anywhere

Gardner: Now one of the nice things about the desktop virtualization and this BYOD is it allows people to access these activities more freely anywhere. My kids are used to being able to access anything. If you were to tell them you can only do school work in school, they'd look at you like you’re from another planet.

So how do you manage to now take what was once confined to the school network and allow the students and other folks in your community to do what they need to do, regardless of where they are, regardless of the device?

Brames: We’re a fairly affluent community. We have kids who were requesting to bring in their own devices. We felt as though encouraging that model in our district was something that would help students continue to use computers that were familiar to them and help us realize some cost savings long term.

So by connecting to virtual desktops in our environment, they get a familiar resource while they're within our walls in the school district -- access to all of their shared drives, network drives, network applications, all of the typical resources they expect when sitting down in front of a school-owned piece of equipment. And they get all of those things on their own device.

We’re also seeing an influx of more mobile-type devices such as tablets and even smartphones and things like that. The percentage of our users that are using tablets and smartphones right now for powerful computing or their primary devices is fairly low. However, we anticipate over time that the variety of devices we’ll have connecting to our network because of virtual desktops is going to increase.

Gardner: Jason Lantz, are you at the point where you're able to extend the same experience for those students who would be in school using a PC -- getting all of the mileage out of that that they can, saving you guys a few dollars in the process -- but then move over to their own device, let's call it a tablet, and start right into the same session? How is that hand-off happening? Are you able to segue and provide a unified experience yet?

Lantz: That’s part of phase two of our approach that we’re implementing right now. We’ve gotten it out into the classrooms to get the students familiar with it, so that they understand how to use it. The next step in that process is to allow them to use this at home.

We currently have administrators who are using it in this fashion. They have tablets, and using the View client they connect in and get the same experience whether they're in school or out of school.

So we’re to that point. Now that our administrators understand the benefits, now that our teachers have seen it in the classrooms, it’s a matter of getting it out there to the community.

One of the other ways that we're making it available is that at our public library, we have a set of machines that students can access as well, because as you know, not every student has access to high-speed Internet. They are able to go to the library, check out these machines, and get into the network that way. Those are some of the ways that we're trying to bridge that gap.

Huge win-win

Gardner: It sounds like a huge win-win, because you’re able to reduce your costs, increase your control, and at the same time give the students a lifecycle of learning across all of the different devices and places that they might be. I think that’s fabulous.

Let's find out a bit more about how far into this you are. Jason Brames, you mentioned that you have about 5,500 endpoint devices. How far into that number are you with desktop virtualization? Then, maybe you can give us a sense of how many BYOD instances you have too?

Brames: We currently have 400 View desktop licenses. We're seeing utilization of that license pool of 20-25 percent right now, and the primary reason is that we're really just beginning that phase. This is technically the second year of our virtual desktop rollout, but the first year of more widespread use.

We're training teachers on how to adequately and effectively use this technology in their classrooms with kids. It's been very well received and is being adopted widely, because people are seeing that we were able to improve the computing experience for them.

Gardner: I understand that you’ve had a partner involved with this. TIG I believe it is. How did that affect your ability to roll this out so far?

Lantz: Technology Integration Group has resources that allow us to see what other school districts are doing and what some of the things are that they've run into. Then, they bring that back here and we can discuss how we want to roll it out in our environment. They've been very good at giving us ideas of what has worked with other organizations and what hasn't. That's where they've come in. They've really helped us understand how we can best use this in our environment.

Gardner: Sometimes I hear from organizations, when they move to desktop virtualization, that there are some impacts on things like network or storage that they didn’t fully anticipate. How has that worked for you? How has this roll out movement towards increased desktop virtualization impacted you in terms of what you needed to do with your overall infrastructure?

Lantz: Luckily for us we’ve had a lot of growth in the last two to three years, which has allowed us to get some newer equipment. So our network infrastructure is very sound. We didn’t run into a lot of the issues that commonly you would with network bandwidth and things like that.

On the storage side, we did increase our storage. We went with an EqualLogic box for that, but with View, it doesn't take up a ton of storage space, thanks to linked clones and things like that. So we haven't seen a huge impact there. As we get further into this, storage requirements will grow, but currently that hasn't been a big issue for us.
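To make the linked-clone point concrete, here is a back-of-the-envelope sketch. The 400 desktops matches the View license count mentioned earlier in the discussion; the per-image and per-delta disk sizes are assumptions for illustration only.

```python
# Back-of-the-envelope comparison of full clones vs. View linked clones.
# The desktop count comes from the 400 View licenses mentioned earlier;
# the disk sizes are assumed for illustration.

DESKTOPS = 400
FULL_CLONE_GB = 30        # assumed size of one full desktop image
REPLICA_GB = 30           # assumed size of the shared linked-clone replica
DELTA_GB_PER_DESKTOP = 2  # assumed growth of each linked clone's delta disk

full_clones_tb = DESKTOPS * FULL_CLONE_GB / 1024
linked_clones_tb = (REPLICA_GB + DESKTOPS * DELTA_GB_PER_DESKTOP) / 1024

print(f"Full clones:   {full_clones_tb:.1f} TB")
print(f"Linked clones: {linked_clones_tb:.1f} TB")
```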

Gardner: On the flip-side of that, a lot of organizations I talk to, who moved to desktop virtualization, gained some benefits on things like backup, disaster recovery, security, and control over data and assets, and even into compliance and regulatory issues. Has there been an upside that you could point to in terms of being a more centralized control of the desktop content and assets?

Difficult to monitor

Lantz: When you start talking about students bringing in their own devices, it's difficult to monitor what's on that personally owned device.

We found that by giving them a View desktop, we know what's in our environment and we know what that virtual machine has. That allows us to have more secure access for those students without compromising what's on that student’s machine, or what you may not know about what's on that student’s machine. That’s been a big benefit for us allowing students to bring in their own devices.

Gardner: Otherwise you're bringing something onto your network without really knowing what's on it, and you lose control. This allows you to have that best-of-both-worlds flexibility with some assurance of how to keep your risks low.

Lantz: Absolutely.

Gardner: Do we have any metrics of success, either in business terms -- or, in this case, learning terms -- and/or IT cost savings? What has this done for you? I know it's a little early, but what are the early results?

Brames: You did mention that it is a little bit early, but we believe that as we use virtual desktops more in our environment, one of the major cost savings we're going to see is in licensing costs for unique learning applications.

Typically in our district we would have purchased x number of licenses for each of our instructional buildings because they needed to use the application with students in the classroom. They may have a certain number of students who need access to this application, for example, but they're not all accessing it at the same time of day, or it's installed on a fat client -- a physical machine somewhere in the building -- and it's difficult for students to get to it.

By creating these pools of machines that have specialty software on them we’re able to significantly reduce the number of titles we need to license for certain learning applications or certain applications that improve efficiencies for teachers and for students.

So that's one area in which we know we're going to see significant return on our investment. We already talked about extending the life of the endpoints, and with energy savings, I think we can prove some results there as well. Anything to add, Jason?

Lantz: One that's hard to calculate is, as you mentioned, the maintenance and management piece. In technology, as we all know, you're doing more with less, and this really gives you the ability to do that. How you measure that is sometimes difficult, but there are definitely cost savings there as well.

Gardner: Just to be clear, the folks that are adopting this first in your organization, are these the students, are they folks in a lab, a research environment, faculty? Who are the people that grok this and really jump on it first?

No lab deployment

Brames: The first place where we're deploying is the student computing stations in our classrooms. We're not deploying to lab environments as much as we are to those locations in our classrooms.

A typical classroom for us contains four student computing stations, and each building, depending on its size, has three to five labs available. We're not focusing our desktop virtualization on those labs. We're focusing on the classroom computing stations right now. Potentially, we'll also be in labs as we go into the future.

Then, in addition to those student computing stations, we're seeing applications where our administrative team -- our principals and district-level administrators -- are beginning to use virtual desktops while they're outside of the district and growing familiar with them, so that when we enter the phase where we allow our students to access from outside of our network, we have that support structure in place.

Gardner: That sounds important especially for those later grades and high school grades, because this is probably the type of experience they’re going to be getting should they move onto college, where they are going to each have a device and have this ubiquity. It seems to me that they'll be one step ahead, if they get used to that now in high school. Even junior high school sets them up to be more productive and adapted to what they'll get in a college environment.

Lantz: In a lot of organizations, it would make sense to start there. At the higher grade levels, students are going to be able to use it outside the district probably more than the elementary students. But for us, it made sense to follow our older hardware, which is primarily in our elementary schools, middle schools, and intermediates. So that's where we started.

Gardner: I know budgets are really important in just about any school environment. If you were to say, "Listen, the cost it would take for us to make sure each individual student had their own device would be X and the cost of supporting it would be additional each year," you might get some push back. I'm going to make a wild guess on that.

But it sounds to me like you’re able to go with desktop virtualization and increased use of BYOD and say, "Listen, we can get to near one-to-one parity with student to device for a lot less."

Do you have any sense of the delta there between what it would be if you stuck to traditional cost structures, traditional licensing, fat clients, to get to that one-to-one ratio, compared to what you're going to be able to do over time with this virtualized approach? Any sense of how big a delta we have there?

Brames: Our finance department has been very supportive of us in this whole endeavor, and the return on investment (ROI) cost calculations and everything is something that our finance team is very good at. We appreciate that they were able to recognize with us that this is something that would be beneficial to the district.

I apologize that I'm not actually prepared to put any numbers on it. Because we're early, putting an actual number is challenging for me right now.

Metrics of success

Gardner: Jason Lantz, I know actual dollar figures may be hard to come by, but do you have any sense -- maybe a percentage or even just a generalization -- of the comparison between the old way of getting to one-to-one versus the new way?

Lantz: It's a little bit difficult. In our Advanced Learning Center -- and Jason, you can help me out with this -- as far as school-owned devices versus students bringing in their own devices, do you know what those numbers would be?

Brames: The Advanced Learning Center is the school building that has primarily senior students and advanced placement students. About 600 students attend there.

Last year, 75 percent of those students were using school-owned equipment and 25 percent of them were bringing their own laptops to school. This year, what we have seen is that 43 percent of our students are beginning to bring their own devices to connect to our network and have access to network resources.

If that trend continues, which we think it will, we’ll be looking at certainly over 50 percent next year, hopefully approaching 60-65 percent of our students bringing their own devices. When you consider that that is approximately 400 devices that the school district did not need to invest in, that’s a significant saving for us.
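Working through those figures as a quick sanity check -- the only inputs are the roughly 600 students and the percentages Brames cites; the rounding is ours:

```python
# Quick arithmetic on the Advanced Learning Center figures cited above.
students = 600

byod_last_year = 0.25 * students  # 25 percent brought their own laptops last year
byod_this_year = 0.43 * students  # 43 percent this year
byod_projected = 0.65 * students  # upper end of the 60-65 percent projection

print(f"Last year: ~{byod_last_year:.0f} student-owned devices")
print(f"This year: ~{byod_this_year:.0f}")
print(f"Projected: ~{byod_projected:.0f}  (roughly the 400 devices the district avoids buying)")
```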

Gardner: That's a very rapid growth rate, and you've been able to accommodate it. You're going from 25 percent to 43 percent, and you're certainly not seeing that kind of uptick in your total cost. So it's a significant saving.

Brames: It is a small snapshot right now. Our senior center has seen this increase, and we think the results can be projected district-wide across our K-12 grade levels over time.

Gardner: I commend you for being able to anticipate and accommodate these trends, because this is happening so rapidly with these devices.

One last set of questions on advice for others who would be moving towards more desktop virtualization and the enablement of BYOD. If you could do this over again, a little bit of 20/20 hindsight, what might you want to tell them in terms of being prepared?

Lantz: One thing that's important is that when you explain this to users, the words "virtual desktop" can be a little confusing to teachers and your end users. What I've done is take the approach that it's no different from having a regular machine, and you can set it up so that it looks exactly the same.

No real difference

When you start talking with end users about virtual, it gets into, okay, "So it's running back here, but what problems am I going to encounter?" and those sorts of things. Trying to get that end user to realize that there really isn't a difference between a virtual desktop and a real desktop has been important for us for getting them on board and making them understand that it's not going to be a huge change for them.

Gardner: Over time, as it becomes seamless, they wouldn't really know. They just log in with their ID and password, and then things just work.

Lantz: Yeah.

Brames: Yeah, I think so.

Gardner: Very good. You've been listening to a sponsored podcast discussion on how enterprises -- in this case, a school district -- are increasing their use of desktop virtualization in the post-PC era. And they're also very much on top of a new phenomenon around "bring your own device."

I’d like to thank our guests. We’ve been here with Jason Brames, Assistant Director of Technology at the Avon Community School Corp. Thank you, Jason.

Brames: You’re welcome. Thank you.

Gardner: And we’ve also been joined by Jason Lantz, Network Services Team Leader there in Avon, Indiana. Thank you, sir.

Lantz: All right. Thank you.

Gardner: This is Dana Gardner, Principal Analyst at Interarbor Solutions. Thanks again to our listeners, and don’t forget to come back next time.

Listen to the podcast. Find it on iTunes/iPod. Download the transcript. Sponsor: VMware.

Sponsored podcast discussion on how a community school corporation is moving to desktop virtualization to allow students, faculty, and administrators flexibility in location and devices. Copyright Interarbor Solutions, LLC, 2005-2011. All rights reserved.


Monday, January 28, 2019

Who, if Anyone, is in Charge of Multi-Cloud Business Optimization?

Transcript of a discussion on how changes in business organization and culture demand a new approach to leadership over such functions as hybrid and multi-cloud procurement and optimization.

Listen to the podcast. Find it on iTunes. Download the transcript. Sponsor: Hewlett Packard Enterprise.

Dana Gardner: Hello, and welcome to the next edition of the BriefingsDirect Voice of the Analyst podcast series. I’m Dana Gardner, Principal Analyst at Interarbor Solutions, your host and moderator for this ongoing discussion on the latest insights into successful digital transformation.

This composable cloud strategies interview explores how changes in business organization and culture demand a new approach to leadership over such functions as hybrid and multi-cloud procurement and optimization.

We’ll now hear from an IT industry analyst about the forces reshaping the consumption of hybrid cloud services and why the model around procurement must be accompanied by an updated organizational approach -- perhaps even a new office or category of officer in the business.

Here to help us explore who -- or what -- should be in charge of spurring effective change in how companies acquire, use, and refine their new breeds of IT is John Abbott, Vice President of Infrastructure and Co-Founder of The 451 Group. Welcome, John.

John Abbott: Thank you very much for inviting me.


Gardner: What has changed about the way that IT is being consumed in companies? Is there some gulf between how IT was acquired and the way it is being acquired now?

Cloud control controls costs 

Abbott: I think there is, and it’s because of the rate of technology change. The whole cloud model has upended traditional IT and is evolving in a way that we probably didn’t foresee just 10 years ago. So, CAPEX to OPEX, operational agility, complexity, and costs have all been big factors.

But now, it’s not just cloud, it's multi-cloud as well. People are beginning to say, “We can’t rely on one cloud if we are responsible citizens and want to keep our IT up and running.” There may be other reasons for going to multi-cloud as well, such as cost and suitability for particular applications. So that’s added further complexity to the cloud model.

Also, on-premises deployments continue to remain a critical function. You can’t just get rid of your existing infrastructure investments that you have made over many, many years. So, all of that has upended everything. The cloud model is basically simple, but it's getting more complex to implement as we speak.

Gardner: Not surprisingly, costs have run away from organizations that haven’t been able to be on top of a complex mixture of IT infrastructure-as-a-service (IaaS), platform-as-a-service (PaaS), and software-as-a-service (SaaS). So, this is becoming an economic imperative. It seems to me that if you don't control this, your runaway costs will start to control you.

Abbott: Yes. You need to look at the cloud models of consumption, because that really is the way of the future. Cloud models can significantly reduce cost, but only if you control them. Instance sizes, time slices, time increments, and things like that all have a huge effect on the total cost of cloud services.

Also, if you have multiple people in an organization ordering particular services from their credit cards, that gets out of control as well. So you have to gain control over your spending on cloud. And with services complexity -- I think Amazon Web Services (AWS) alone has hundreds of price points -- things are really hard to keep track of.
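As a minimal sketch of what "gaining control over cloud spending" can look like in practice -- assuming you can export a usage or billing report as CSV with provider, service, team, and cost columns (the file name and column names here are hypothetical, not any provider's actual schema) -- you might aggregate spend per team and per service before it gets out of hand:

```python
# Minimal sketch: aggregate a hypothetical multi-cloud billing export by team and service.
# The CSV layout (columns: provider, service, team, cost_usd) is an assumption for illustration.
import csv
from collections import defaultdict

def summarize_spend(path):
    by_team = defaultdict(float)
    by_service = defaultdict(float)
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            cost = float(row["cost_usd"])
            by_team[row["team"]] += cost
            by_service[row["provider"] + "/" + row["service"]] += cost

    print("Spend by team:")
    for team, cost in sorted(by_team.items(), key=lambda kv: -kv[1]):
        print(f"  {team:20s} ${cost:,.2f}")

    print("Top services:")
    for svc, cost in sorted(by_service.items(), key=lambda kv: -kv[1])[:5]:
        print(f"  {svc:30s} ${cost:,.2f}")

if __name__ == "__main__":
    summarize_spend("cloud_billing_export.csv")  # hypothetical export file
```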
Gardner: When we are thinking about who -- or what -- has the chops to know enough about the technology, understand the economic implications, be in a position to forecast cost, budget appropriately, and work with the powers that be who are in charge of enterprise financial functions -- that's not your typical IT director or administrator.

IT Admin role evolves in cloud 

Abbott: No. The new generation of generalist IT administrators – the people who grew up with virtualization -- don't necessarily look at the specifics of a storage platform, or compute platform, or a networking service. They look at it on a much higher level, and those virtualization admins are the ones I see as probably being the key to all of this.

But they need tools that can help them gain command of this. They need, effectively, a single pane of glass -- or at least a single control point -- for these multiple services, both on-premises and in the cloud.

Also, as the data centers become more distributed, going toward the edge, that adds even further complexity. The admins will need new tools to do all of that, even if they don't need to know the specifics of every platform.

Gardner: I have been interested and intrigued by what Hewlett Packard Enterprise (HPE) has been doing with such products as HPE OneSphere, which, to your point, provides more tools, visibility, automation, and composability around infrastructure, cloud, and multi-cloud.

But then, I wonder, who actually will best exploit these tools? Who is the target consumer, either as an individual or a group, in a large enterprise? Or is this person or group yet to be determined?

Abbott: I think they are evolving. There are skill shortages, obviously, for managing specialist equipment, and organizations can’t replace some of those older admin types. So, they are building up a new level of expertise that is more generalist. It’s those newer people coming up, who are used to the mobile world, who are used to consumer products a bit more, that we will see taking over.

We are going toward everything-as-a-service and cloud consumption models. People have greater expectations on what they can get out of a system as well.

Also, you want the right resources to be applied to your application. The best, most cost-effective resources; it might be in the cloud, it might be a particular cloud service from AWS or from Microsoft Azure or from Google Cloud Platform, or it might be a specific in-house platform that you have. No one is likely to have of all that specific knowledge in the future, so it needs to be automated.

We are looking to the developers and the systems architects to pull that together with the help of new automation tools, management consoles, and control planes, such as HPE OneSphere and HPE OneView. That will pull it together so that the admin people don’t need to worry so much. A lot of it will be automated.

Gardner: Are we getting to a point where we will look for an outsourced approach to overall cloud operations, the new IT procurement function? Would a systems integrator, or even a vendor in a neutral position, be able to assert themselves on best making these decisions? What do you think comes next when it comes to companies that can't quite pull this off by themselves?

People and AI partnership prowess

Abbott: The role of partners is very important. A lot of the vertically oriented systems integrators and value-added resellers, as we used to call them, with specific application expertise are probably the people in the best position.

We saw recently at HPE Discover the announced acquisition of BlueData, which allows you to configure a particular pool in your infrastructure for things like big data and analytics applications. And that’s sort of application-led.

The experts in data analysis and in artificial intelligence (AI), the data scientists coming up, are the people that will drive this. And they need partners with expertise in vertical sectors to help them pull it together.

Gardner: In the past when there has been a skills vacuum, not only have we seen a systems integration or a professional services role step up, we have also seen technology try to rise to the occasion and solve complexity.

Where do you think the concept of AIOps, or using AI and machine learning (ML) to help better identify IT inefficiencies, will fit in? Will it help make predictions or recommendations as to how you run your IT?
Abbott: There is a huge potential there. I don’t think we have actually seen that really play out yet. But IT tools are in a great position to gather a huge amount of data from sensors and from usage data, logs, and everything like that and pull that together, see what the patterns are, and recommend and optimize for that in the future.

I have seen some startups doing system tuning, for example. Experts who optimize the performance of a server usually have a particular area of expertise, and they can't really go beyond that because it's huge in itself. There are around 100 “knobs” on a server that you can tweak to up the speed. I think you can only do that in an automated fashion now. And we have seen some startups use AI modeling, for instance, to pull those things together. That will certainly be very important in the future.
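A minimal sketch of the kind of automated tuning Abbott alludes to is random search over a large configuration space against a measured benchmark. The knob names, ranges, and the benchmark function below are placeholders added for illustration; a real system would measure actual server performance under load.

```python
# Minimal sketch of automated server tuning by random search over many "knobs".
# Knob names, ranges, and the benchmark function are placeholders for illustration.
import random

KNOBS = {f"knob_{i}": (0.0, 10.0) for i in range(100)}  # ~100 tunable settings

def benchmark(config):
    """Placeholder for a real measurement, e.g., requests per second under load."""
    return -sum((v - 7.0) ** 2 for v in config.values()) + random.gauss(0, 5)

def random_search(trials=200):
    """Try random configurations and keep the best-scoring one."""
    best_cfg, best_score = None, float("-inf")
    for _ in range(trials):
        cfg = {k: random.uniform(lo, hi) for k, (lo, hi) in KNOBS.items()}
        score = benchmark(cfg)
        if score > best_score:
            best_cfg, best_score = cfg, score
    return best_cfg, best_score

if __name__ == "__main__":
    cfg, score = random_search()
    print(f"Best score after 200 trials: {score:.1f}")
```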

Gardner: It seems to me a case of the cobbler’s children having no shoes. The IT department doesn’t seem to be on the forefront of using big data to solve their problems.

Abbott: I know. It's really surprising because they are the people best able to do that. But we are seeing some AI coming together. Again, at the recent HPE Discover conference, HPE InfoSight made news as a tool that’s starting to do that analysis more. It came from the Nimble acquisition and began as a storage-specific product. Now it’s broadening out, and it seems they are going to be using it quite a lot in the future.

Gardner: Perhaps we have been looking for a new officer or office of leadership to solve multi-cloud IT complexity, but maybe it's going to be a case of the machines running the machines.

Faith in future automation 

Abbott: A lot of automation will be happening in the future, but that takes trust. We have seen AI waves [of interest] over the years, of course, but the new wave of AI still has a trust issue. It takes a bit of faith for users to hand over control.

But as we have talked about, with multi-cloud, the edge, and things like microservices and containers -- where you split up applications into smaller parts -- all of that adds to the complexity and requires a higher level of automation that we haven’t really quite got to yet but are going toward.

Gardner: What recommendations can we conjure for enterprises today to start them on the right path? I’m thinking about the economics of IT consumption, perhaps getting more of a level playing field or a common denominator in terms of how one acquires an operating basis using different finance models. We have heard about the use of these plans by HPE, HPE GreenLake Flex Capacity, for example.

What steps would you recommend that organizations take to at least get them on the path toward finding a better way to procure, run, and optimize their IT?

Abbott: I actually recently wrote a research paper for HPE on the eight essentials of edge-to-cloud and hybrid IT management. The first thing we recommended was a proactive cloud strategy. Think out your cloud strategy, of where to put your workloads and how to distribute them around to different clouds, if that’s what you think is necessary.

Then modernize your existing technology. Try and use automation tools on that traditional stuff and simplify it with hyperconverged and/or composable infrastructure so that you have more flexibility about your resources.

Make the internal stuff more like a cloud. Take out some of that complexity. It has to be quick to implement. You can’t spend six months doing this, or something like that.
Some of these tools we are seeing, like HPE OneView and HPE OneSphere, for example, are a better bet than some of the traditional huge management frameworks that we used to struggle with.

Make sure it's future-proof. You have to be able to use operating system and virtualization advances [like containers] that we are used to now, as well as public cloud and open APIs. This helps accelerate things that are coming into the systems infrastructure space.

Then strive for everything-as-a-service, so use cloud consumption models. You want analytics, as we said earlier, to help understand what's going on and where you can best distribute workloads -- from the cloud to the edge or on-premises, because it's a hybrid world and that’s what we really need.

And then make sure you can control your spending and utilization of those services, because otherwise they will get out of control and you won't save any money at all. Lastly, be ready to extend your control beyond the data center to the edge as things get more distributed. A lot of the computing will increasingly happen close to the edge.

Computing close to the edge

Abbott: Yes. That has to be something you start working on now. If you have software-defined infrastructure, that's going to be easier to distribute than if you are still wedded to particular systems, as the old, traditional model was.

Gardner: We have talked about what companies should do. What about what they shouldn't do? Do you just turn off the spigot and say no more cloud services until you get control?

It seems to me that that would stifle innovation, and developers would be particularly angry or put off by that. Is there a way of finding a balance between creative innovation that uses cloud services, but within the confines of an economic and governance model that provides oversight, cost controls, and security and risk controls?

Abbott: The best way is to use some of these new tools as bridging tools. So, with hybrid management tools, you can keep your existing mission-critical applications running and make sure that they aren't disrupted. Then, gradually you can move over the bits that make sense onto the newer models of cloud and distributed edge.

You don't do it in one big bang. You don’t lift-and-shift from one to another, or have to react, as some people have, by reversing back from cloud if it has not worked out. It's about keeping both worlds going in a controlled way. You must make sure you measure what you are doing, and you know what the consequences are, so it doesn't get out of control.
Gardner: I’m afraid we’ll have to leave it there. We have been exploring how changes in business organization and culture have demanded a new approach to oversight and management of total IT assets, resources, and services. And we have learned about how consumption of hybrid and multi-cloud services is a starting point for regaining control over a highly heterogeneous IT landscape.

Please join me in thanking our guest, John Abbott, Vice President of Infrastructure and Co-Founder of The 451 Group. Thanks so much, John.

Abbott: Thank you very much, indeed. I enjoyed it.

Gardner: And a big thank you to our audience as well for joining this BriefingsDirect Voice of the Analyst hybrid IT management strategies interview. I’m Dana Gardner, Principal Analyst at Interarbor Solutions, your host for this ongoing series of Hewlett Packard Enterprise-sponsored discussions.

Thanks again for listening. Please pass this on to your IT community, and do come back next time.

Listen to the podcast. Find it on iTunes. Download the transcript. Sponsor: Hewlett Packard Enterprise.

Transcript of a discussion on how changes in business organization and culture demand a new approach to leadership over such functions as hybrid and multi-cloud procurement and optimization. Copyright Interarbor Solutions, LLC, 2005-2019. All rights reserved.



Friday, February 22, 2019

Industrial-Strength Wearables Combine with Collaboration Cloud to Bring Anywhere Expertise to Intelligent-Edge Work

Transcript of a discussion on how workers in harsh conditions are gaining ease in accessing and interacting with the best intelligence thanks to cloud-enabled, hands-free, voice-activated, and multimedia wearable computers.

Listen to the podcast. Find it on iTunes. Download the transcript. Sponsor: Hewlett Packard Enterprise.

Dana Gardner: Hello, and welcome to the next edition of the BriefingsDirect Voice of the Customer podcast series. I’m Dana Gardner, Principal Analyst at Interarbor Solutions, your host and moderator for this ongoing discussion on digital transformation success stories.

Our next industrial-edge innovation use-case examines how RealWear, Inc. and Hewlett Packard Enterprise (HPE) MyRoom combine to provide workers in harsh conditions ease in accessing and interacting with the best intelligence. We’ll now learn how a hands-free, voice-activated, and multimedia wearable computer solves the last few feet issue for delivering a business’ best data and visual assets to some of its most critical onsite workers.

Here to describe the new high-water mark for wearable augmented collaboration technologies is Jan Josephson, Sales Director for EMEA at RealWear. Welcome to BriefingsDirect, Jan.

Jan Josephson: Thank you, very much. I’m happy to be here.

Gardner: We’re also joined by John “JT” Thurgood, Director of Sales for UK, Ireland, and Benelux at RealWear. Welcome, JT.

John “JT” Thurgood: Thank you, Dana, it’s good to be here.


Gardner: A variety of technologies have come together to create the RealWear solution. Tell us why nowadays a hands-free, wearable computer needs to support multimedia and collaboration solutions to get the job done.

Hands-free help

Thurgood: Over time, our industrial workers have moved through a digitization journey as they find the best ways to maintain and manage equipment in the field. They need a range of tools and data to do that. So, it could be an engineer wearing personal protective equipment in the field. He may be up on scaffolding. He typically needs a big bundle of paperwork, such as visual schematics, and all kinds of authorization documents. This is typically what an engineer takes into the field. What we are trying to do is make his life easier.

You can imagine it. An engineer gets to an industrial site, gets permission to be near the equipment, and has his schematics and drawings he takes into that often-harsh environment. His hands are full. He’s trying to balance and juggle everything while trying to work his way through that authorization process prior to actually getting on and doing the job – of being an engineer or a technician.

We take that need for physical documentation away from him and put it on an Android device, which is totally voice-controlled and hands-free. A gyroscope built into the device allows specific and appropriate access to all of those documents. He can even freeze at particular points in the document. He can refer to it visually by glancing down, because the screen is just below eye-line.

The information is available but not interfering from a safety perspective, and it’s not stopping him from doing his job. He has that screen access while working with his hands. The speakers in the unit also help guide him via verbal instructions through whatever the process may be, and he doesn’t even have to be looking at documentation.
He can follow work orders and processes. And, if he hits a brick wall -- he gets to a problem where even after following work processes and going through documentation, it still doesn’t look right -- what does he do? Well, he needs to phone a buddy, right? The way he does that is with the visual remote guidance (VRG) MyRoom solution from HPE.

He gets the appropriate expert on the line, and that expert can be thousands of miles away. The expert can see what’s going on through the 16-megapixel camera on the RealWear device. And he can talk him through the problem, even in harsh conditions because there are four noise-canceling microphones on the device. So, the expert can give detailed, real-time guidance as to how to solve the problem.

You know, Dana, typically that would take weeks of waiting for an expert to be available. The cost of getting someone on-site to resolve the issue is high. Now we are enabling that end-technician to get any assistance he needs, once he is in the right place, at the right time.

Gardner: What was the impetus to create the RealWear HMT-1? Was there a specific use case or demand that spurred the design?

Military inspiration, enterprise adoption

Thurgood: Our chief technology officer (CTO), Dr. Chris Parkinson, was working in another organization that was focused on manufacturing military-grade screens. He saw an application opportunity for that in the enterprise environment.

And it now has wide applicability -- whether it’s in the oil and gas industry, automotive, or construction. I’ve even had journalists wanting to use this device, like having a mobile cameraman.

He foresaw a wide range of use-cases, and so worked with a team -- with our chief executive officer (CEO), Andy Lowery -- to pull together a device. That design is IP66-rated, it’s hardened, and it can be used in all weather, from -20C to 50C, to do all sorts of different jobs.

The impetus was that there was nothing in the marketplace that provides these capabilities. People today are using iPads and tablets to do their jobs, but that keeps their hands full. You can’t do the rest of the tasks you need your hands for.

We now have more than 10,000 RealWear devices in the field in all sorts of industrial areas. I have named a few verticals, but we’re discovering new verticals day-by-day.

Gardner: Jan, what were some of the requirements that led you to collaborate with HPE MyRoom and VRG? Why was that such a good fit?

Josephson: There are a couple of things HPE does extremely well in this field. In these remote, expert applications in particular, HPE designed their applications really well from a user experience (UX) perspective.

At the end of the day, we have users out there and many of them are not necessarily engineers. So the UX side of an application is very important. You can’t have a lot of things clogging up your screen and making things too complicated. The interface has to be super simple.

The other thing that is really important for our customers is the way HPE does compression with their networked applications. This is essential because many times -- if you are out on an oil rig or in the middle of nowhere -- you don’t have the luxury of Wi-Fi or a 4G network. You are in the field.

The HPE solution, due to the compression, enables very high-quality video even at very-low bandwidth. This is very important for a lot of our customers. HPE is also taking their platform and enabling it to operate on-premises. That is becoming important because of security requirements. Some of the large users want a complete solution inside of their firewall.

So it’s a very impressive piece of software, and we’re very happy that we are in this partnership with HPE MyRoom.

Gardner: In effect, it’s a cloud application now -- but it can become a hybrid application, too.

Connected from the core to the edge

Thurgood: What’s really unique, too, is that HPE has now built object recognition into the toolset. So imagine you’re wearing the RealWear HMT-1 and you’re looking at a pump, a gas filter, or some other industrial object. The technology is now able to identify that object and provide you with the exact work orders and documentation related to it.
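Conceptually, the recognition-to-documentation flow described here boils down to mapping a recognized object label to its related work orders and manuals. The sketch below assumes a hypothetical recognizer and a hypothetical document index; it is not HPE's or RealWear's actual API.

```python
# Conceptual sketch of "recognize the object, fetch its work orders and docs".
# The recognizer, labels, and document index below are hypothetical, not a real HPE/RealWear API.

DOC_INDEX = {
    "pump_p101": {
        "work_orders": ["WO-2041: replace seal", "WO-2077: quarterly inspection"],
        "documents": ["pump_p101_schematic.pdf", "pump_p101_maintenance.pdf"],
    },
    "gas_filter_f7": {
        "work_orders": ["WO-1980: swap cartridge"],
        "documents": ["gas_filter_f7_datasheet.pdf"],
    },
}

def recognize(image_bytes):
    """Placeholder for an object-recognition call; returns a label for the object in view."""
    return "pump_p101"

def docs_for_view(image_bytes):
    """Look up the work orders and documents for whatever object the camera sees."""
    label = recognize(image_bytes)
    return DOC_INDEX.get(label, {"work_orders": [], "documents": []})

if __name__ == "__main__":
    print(docs_for_view(b"...camera frame..."))
```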

We’re now able to expand out from the historic use-case of expert remote visual guidance support into doing so much more. HPE has really pushed the boundaries out on the solution.

Gardner: It’s a striking example of the newfound power of connecting a core cloud capability with an edge device, and with full interactivity. Ultimately, this model brings the power of artificial intelligence (AI) running on a data center to that edge, and so combines it with the best of human intelligence and dexterity. It’s the best of all worlds.

JT, how is this device going to spur new kinds of edge intelligence?

Thurgood: It’s another great question because 5G is now coming to bear as well as Wi-Fi. So, all of a sudden, almost no matter where you are, you can have devices that are always connected via broadband. The connectivity will become ubiquitous.
Now, what does that do? It means never having an offline device. All of the data, all of your Internet of Things (IoT) analytics and augmented and assisted reality will all be made available to that remote user.

So, we are looking at the superhuman versions of engineers and technicians. Historically you had a guy with paperwork. Now, if he’s always connected, he always has all the right documentation and is able to act and resolve tasks with all of the power and the assistance he needs. And it’s always available right now.

So, yes, we are going to see more intellectual value being moved down to the remote, edge user.

At RealWear, we see ourselves as a knowledge-transfer company. We want the user of this device to be the conduit through which you can feed all cloud-analyzed data. As time goes by, some of the applications will reside in the cloud as well as on the local device. For higher-order analytics there is a hell of a lot of churning of data required to provide the best end results. So, that’s our prediction.


Gardner: When you can extend the best intelligence to any expert around the world, it’s very powerful concept.

For those listening to or reading this podcast, please describe the HMT-1 device. It’s fairly small and resides within a helmet.

Using your headwear

Thurgood: We have a horseshoe-shaped device with a screen out in front. Typically, it’s worn within a hat. Let’s imagine, you have a standard cap on your head. It attaches to the cap with two clips on the sides. You then have a screen that protrudes from the front of the device that is held just below your eye-line. The camera is mounted on the side. It becomes a head-worn tablet computer.

It can be worn in hard hats, bump caps, normal baseball caps, or just with straps (and no hat). It performs regardless of the environment you are in -- be that in wind, rain, gales, such as working out on an offshore oil and gas rig. Or if you are an automotive technician, working in a noisy garage, it simply complements the protective equipment you need to use in the field.

Gardner: When you can bring this level of intelligence and instant access of experts to the edge, wherever it is, you’re talking about new economics. These type of industrial use cases can often involve processes where downtime means huge amounts of money lost. Quickly intercepting a problem and solving it fast can make a huge difference.

Do you have examples that provide a sense of the qualitative and quantitative benefits when this is put to good use?

Thurgood: There are a number of examples. Take automotive to start with. If you have a problem with your vehicle today, you typically take it to a dealership. That dealer will try to resolve the issue as quickly as it can. Let’s say the dealership can’t. There is a fault on the car that needs some expert assistance. Today, the dealership phones the head office and says, “Hey, I need an expert to come down and join us. When can you join us?” And there is typically a long delay.

So, what does that mean? That means my vehicle is off the road. It means I have to have a replacement vehicle. And that expert has to come out from head office to spend time traveling to be on-site to resolve the issue.
What can happen now using the RealWear device in conjunction with the HPE VRG MyRoom is that the technician contacts the expert engineer remotely and gets immediate feedback and assistance on resolving the fault. As you can imagine, the customer experience is vastly improved based on resolving the issue in minutes – and not hours, days, or even weeks.

Josephson: It’s a good example because everyone can relate to a car. Also, nowadays the car manufacturers are pushing a lot more technology into the cars. They are almost computers on wheels. When a car has a problem, chances are very slim you will have the skill-set needed in that local garage.

The whole automotive industry has a big challenge because they have all of these people in the field who need to learn a lot. Doing it the traditional way -- of getting them all into a classroom for six weeks -- just doesn’t cut it. So, it’s now all about incident-based, real-time learning.

Another benefit is that we can record everything in MyRoom. So if I have a session that solves a particular problem, I can take that recording and I have a value of one-to-many rather than one-to-one. I can begin building up my intellectual property, my FAQs, my better customer service. A whole range of values are being put in front here.

Gardner: You’re creating an archive, not just a spot solution. That archive can then be easily accessible at the right time and any place.

Josephson: Right.

Gardner: For those listeners wondering whether RealWear and VRG are applicable to their vertical industry, or their particular problem set, what are a couple of key questions they might ask themselves?

Shared know-how saves time and money

Thurgood: Do your technicians and engineers need to use their hands? Do they need to be hands-free? If so, you need a device like this. It’s voice-controlled, it’s mounted on your head.

Do they wear personal protectant equipment (PPE)? Do they have to wear gloves? If so, it’s really difficult to use a stylus or poke the screen of a tablet. With RealWear, we provide a totally hands-free, eyes-forward, very safe deployment of knowledge-transfer technology in the field.

If you need your hands free in the field, or if you’re working outdoors, up on towers and so on, it’s a good use of the device.

Josephson: Also, if your business includes field engineers who travel, do you have many travel days where someone had to go back because they forgot something, or didn’t have the right skill-set on the first trip?

If instead you can always have someone available via the device to validate what we think is wrong and actually potentially fix it, I mean, it’s a huge savings. Fewer return or duplicate trips.

Gardner: I’m afraid we’ll have to leave it there. We have been exploring how RealWear and HPE MyRoom combine to provide workers in harsh conditions ease in accessing and interacting with the best intelligence.
And we have learned how a hands-free, voice-activated, and multimedia wearable computer is solving the last few feet issue for delivering a business’ best data and visual assets to some of its most critical on-site workers.

Please join me in thanking our guests, Jan Josephson, Sales Director for EMEA at RealWear, and John “JT” Thurgood, Director of Sales for UK, Ireland, and Benelux at RealWear, which is based in Vancouver, Washington.

Thurgood: Anytime, Dana. Thank you, very much.


Gardner: And a big thank you to our audience as well for joining this BriefingsDirect Voice of the Customer digital transformation success story. I’m Dana Gardner, Principal Analyst at Interarbor Solutions, your host for this ongoing series of Hewlett Packard Enterprise-sponsored interviews.

Thanks again for listening. Please pass this on to your IT community, and do come back next time.

Listen to the podcast. Find it on iTunes. Download the transcript. Sponsor: Hewlett Packard Enterprise.

Transcript of a discussion on how workers in harsh conditions are gaining ease in accessing and interacting with the best intelligence thanks to cloud-enabled, hands-free, voice-activated, and multimedia wearable computers. Copyright Interarbor Solutions, LLC, 2005-2019. All rights reserved.
