Monday, December 21, 2015

How INOVVO Delivers Analysis that Leads to Greater User Retention and Loyalty for Mobile Operators

Transcript of a discussion on how advanced analytics drawing on multiple data sources provides wireless operators improved interactions with their subscribers and enhances customer experience through personalized insights.

Listen to the podcast. Find it on iTunes. Get the mobile app. Download the transcript. Sponsor: Hewlett Packard Enterprise.

Dana Gardner: Hello, and welcome to the next edition of the HPE Discover Podcast Series. I'm Dana Gardner, Principal Analyst at Interarbor Solutions, your host and moderator for this ongoing discussion on IT innovation and how it’s making an impact on people’s lives.

Our next big-data case study discussion examines how INOVVO delivers impactful analytical services to mobile operators to help them engender improved end-user loyalty.

We'll see how advanced analytics, drawing on multiple data sources, enables INOVVO’s mobile carrier customers to provide mobile users with faster, more reliable, and relevant services.

To learn more about how INOVVO uses big data to make major impacts on mobile services, please join me in welcoming Joseph Khalil, President and CEO of INOVVO in Reston, Virginia. Welcome, Joseph.
Embed the HPE Big Data Analytics Engines
To Meet Enterprise-Scale Requirements
Get More Information
Joseph Khalil: Thank you, Dana. I'm glad to be here.

Gardner: User experience and quality of service are so essential nowadays. What has been the challenge for you to gain an integrated and comprehensive view of subscribers and networks that they're on in order to uphold that expectation for user experience and quality?

Khalil: As you mentioned in your intro, we cater to the mobile telco industry. Our customers are mobile operators who have customers in North America, Europe, and the Asia-Pacific region. There are a lot of privacy concerns when you start talking about customer data, and we're very sensitive to that.

The challenge is to handle the tremendous volume of data generated by the wireless networks and still adhere to all privacy guidelines. This means we have to deploy our solutions within the firewalls of network operators. This is a big-data solution, and as you know, big data requires a lot of hardware and a big infrastructure.

So our challenge is how we can deploy big data with a small hardware footprint and high storage capacity and performance. That’s what we’ve been working on over the last few years. We have a very compelling offer that we've been delivering to our customers for the past five years. We're leveraging HPE Vertica for our storage technology, and it has allowed us to meet very stringent deployment requirements. HPE has been and still is a great technology partner for us.

Gardner: Tell us a little bit more about how you do that in terms of gathering that data, making sure that you adhere to privacy concerns, and at the same time, because velocity, as we know, is so important, quickly deliver analytics back. How does that work?

User experience

Khalil: We deal with a large number of records that are generated daily within the network. This is data coming from deep packet inspection probes. Almost every operator we talk to has them deployed, because they want to understand the user experience on their networks.

These probes capture large volumes of clickstream data and relay it to us in near real time. This is the velocity component. We leverage open-source technologies, adapted to our needs, that allow us to deal with the influx of streaming data.

We're now in discussion with HPE about their Kafka offering, which deals with streaming data and scalability issues and seems to complement our current solution and enhance our ability to deal with the velocity and volume issues. Then, our challenge is not just dealing with the data velocity, but also how to access the data and render reports in a few seconds.
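As a rough illustration of the streaming intake described here, the following is a minimal sketch using the open-source kafka-python client; the topic name, broker address, record fields, and warehouse loader are hypothetical stand-ins, not INOVVO's actual pipeline.

import json
from kafka import KafkaConsumer  # pip install kafka-python

def load_into_warehouse(rows):
    # Placeholder for a bulk load step (for example, a COPY into HPE Vertica).
    print("loaded", len(rows), "records")

# Consume DPI clickstream records as they stream in (all names are invented).
consumer = KafkaConsumer(
    "dpi-clickstream",
    bootstrap_servers="broker:9092",
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
)

batch = []
for record in consumer:
    batch.append(record.value)  # e.g. {"subscriber": ..., "url": ..., "ts": ...}
    if len(batch) >= 10000:     # micro-batch before loading the analytics store
        load_into_warehouse(batch)
        batch = []

Micro-batching like this is one common way to balance load latency against per-load overhead when the downstream store favors bulk writes.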

One of our offerings is a care product that’s used by care organizations. They want to know what their customers did in the last hour on the network. So there's a near real-time urgency to have this data streamed, loaded, processed, and available for reporting. That’s what our platform offers.

Gardner: Joseph, given that you're global in nature and that there are so many distribution points for the gathering of data, do you bring this all into a single data center? Do you use cloud or other on-demand elements? How do you manage the centralization of that data?

Khalil: We don’t have cloud deployments to date, even though our technology allows for it. We could deploy our software in the cloud, but again, due to privacy concerns with customers' data, we end up deploying our solutions in-network within the operators’ firewalls.

One of the big advantages of our solution is that we can choose to host it locally on customers’ premises. We typically store data for up to 13 months. So our customers can go and see the performance of everything that’s happened on the network for the last 13 months.

We store the data at different levels -- hourly, daily, weekly, monthly -- but to answer your question, we deploy on-site, and that’s where all the data is centralized.
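Those levels can be pictured as successive rollups of one usage feed. Here is a minimal pandas sketch with invented column names, purely to make the idea concrete:

import pandas as pd

# Toy usage feed; in production this would be billions of DPI records.
usage = pd.DataFrame({
    "ts": pd.to_datetime(["2015-11-01 09:10", "2015-11-01 09:40", "2015-11-02 14:05"]),
    "subscriber": ["a", "a", "b"],
    "bytes": [1200, 800, 5000],
})

# Roll the same feed up at several granularities and keep every level.
hourly  = usage.groupby(["subscriber", pd.Grouper(key="ts", freq="H")])["bytes"].sum()
daily   = usage.groupby(["subscriber", pd.Grouper(key="ts", freq="D")])["bytes"].sum()
monthly = usage.groupby(["subscriber", pd.Grouper(key="ts", freq="M")])["bytes"].sum()

Pre-aggregating each level up front is a common way to keep a 13-month window queryable in seconds rather than recomputing from raw records on every request.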

Gardner: Let’s look at why this is so important to your customer, the mobile carrier, the mobile operator. What is it that helps their business and benefits their business by having this data and having that speed of analysis?

Customer care

Khalil: Our customer care module, the Subscriber Analytix Care, is used by care agents. These are the individuals that respond to 611 calls from customers complaining about issues with their devices, coverage, or whatever the case may be.

When they're on the phone with a customer and they put in a phone number to investigate, they want to be able to get the report to render in under five seconds. They don’t want to have the customer waiting while the tool churns trying to retrieve the care dashboard. They want to hit "go," and have the information come up on their screen. They want to be able to quickly determine whether there's an issue. Is there a network issue, is it a device issue, whatever the case may be?

So we give them that speed and simplicity, because the data we are collecting is very complex, and we take all the complexity away. We have our own proprietary data analysis and modeling techniques, and it happens on-the-fly as the data is going through the system. So when the care agent loads that screen, it’s right there at a glance. They can quickly determine what the case may be that’s impacting the customer.
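A lookup like that ultimately reduces to a fast, parameterized query against the pre-modeled store. Here is a schematic sketch using pyodbc; the DSN, table, and column names are invented for illustration, not INOVVO's schema.

import pyodbc

conn = pyodbc.connect("DSN=analytics")  # hypothetical ODBC DSN for the analytics store
cur = conn.cursor()
cur.execute(
    """
    SELECT ts, event_type, device, cell_site
    FROM care_events                      -- invented table of synthesized events
    WHERE subscriber_id = ?
      AND ts >= NOW() - INTERVAL '1 hour'
    ORDER BY ts DESC
    """,
    "15551230100",                        # the number the care agent typed in
)
for row in cur.fetchall():
    print(row)

Because the heavy modeling already happened on the fly at load time, the query only has to scan a narrow, pre-synthesized slice, which is what makes a sub-five-second render plausible.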

Our care module has been demonstrated to reduce the average call handle time, the time care personnel spend with the customer on the phone. For big operators, you could imagine how many calls they get every day. Shaving a few minutes off each call can amount to a lot of savings in terms of dollars for them.

Gardner: So in a sense, there’s a force-multiplier in having this analysis. Not only do you head off problems and fix them before they become evident -- which means a better user experience, so they're happier as customers and stay on the network -- but when there are problems, you can empower the people solving them, those dealing with that customer directly, to have the right information in hand.

Khalil: Exactly. They have everything. We give them all the tools that are available to them to quickly determine on the fly how to resolve the issue that the customer is having. That’s why speed is very important for a module like care.
Embed the HPE Big Data Analytics Engines
To Meet Enterprise-Scale Requirements
Get More Information
For our marketing module, speed is important, but not as critical as for care, because now you don’t have a customer waiting on the line while you run your report to see how subscribers are using the network or how they're using their devices. We still produce reports fairly quickly, in a few seconds, which is also what the platform offers for marketing.

Gardner: So those are some of the immediate and tactical benefits, but I should think that, over time, as you aggregate this data, there is a strategic benefit, where you can predict what demands are going to be on your networks and/or what services will be more in demand than others, perhaps market by market, region by region. How does that work? How do you provide that strategic level of analysis as well?

Khalil: This is on the marketing side of our platform, Subscriber Analytix Marketing. It's used by the CMO organizations, by marketing analysts, to understand how subscribers are using the services. For example, an operator will have different rate plans or tariff plans. They have different devices, tablets, different offerings, different applications that they're promoting.

How are customers using all these services? Before the advent of deep packet inspection probes and before the advent of big data, operators were blind to how customers are using the services offered by the network. Traditional tools couldn’t get anywhere near handling the amount of data that’s generated by the services.

Specific needs

Today, we can look at this data and synthesize it for them, so they can easily look at it and slice and dice it along many dimensions, such as age, gender, device type, location, and time -- you name it. Marketing analysts can then use these dimensions to ask very detailed questions about usage on the network. Based on that, they can target specific customers with specific offers that match their specific needs.
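That slice-and-dice is, in data terms, grouped aggregation over those dimensions. A toy pandas illustration with invented fields:

import pandas as pd

usage = pd.DataFrame({
    "age_band": ["18-24", "18-24", "35-44"],
    "gender":   ["f", "m", "f"],
    "device":   ["tablet", "phone", "phone"],
    "app":      ["video", "social", "video"],
    "mb_used":  [820, 140, 360],
})

# e.g. which age bands drive video traffic, and on which device types
video = usage[usage["app"] == "video"]
print(video.groupby(["age_band", "device"])["mb_used"].sum())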

Gardner: Of course, in a highly competitive environment, where there are multiple carriers vying for that mobile account, the one that’s first to market with those programs can have a significant advantage.

Khalil: Exactly. Operators are competing now based on the services they offer and their related costs. Ten to 15 years ago, radio coverage footprint and voice plans were the driving factors. Today, it's the data services offered and their associated rate plans.

Gardner: Joseph, let’s learn a little bit more about INOVVO. You recently completed the purchase of comScore’s wireless solutions division. Tell us a bit about how you’ve grown as a company, both organically and through acquisition, and maybe the breadth of your services beyond what we've already described.

Khalil: INOVVO is a new company. We started in May 2015, but the business is very mature. My senior managers and I have been in this business since 2005. We started the Subscriber Analytix product line back in 2005. Then, comScore acquired us in 2010, and we stayed with them for about five years, until this past May.

At that time, comScore decided that they wanted to focus more on their core business and they decided to divest the Subscriber Analytix group. My senior management and I executed a management buyout, and that’s how we started INOVVO.

However, comScore is still a key partner for us. A key component of our product is a dictionary for categorizing and classifying websites, devices, and mobile apps. That’s produced by comScore, and comScore is known in this industry as the gold standard for these types of categorizations.

We have exclusive licensing rights to use the dictionary in our platform. So we have a very close partnership with comScore. Today, as far as the services that INOVVO offers, we have a Subscriber Analytix product line, which is for care, marketing, and network.

We talked about care and marketing; we also have a network module. This is for engineers and network planners. We help engineers understand the utilization of their network elements and help them plan and forecast what the utilization is going to be in the near future, given current trends, and help them stay ahead of the curve. Our tool allows them to anticipate when existing network elements will exhaust their current capacity.
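A deliberately naive sketch of that kind of exhaustion forecast, fitting a straight-line trend with numpy; the network module's actual models are not public, so treat this only as an illustration of the idea.

import numpy as np

weeks = np.arange(12)  # the last 12 weeks of observations
util = np.array([41, 43, 44, 47, 48, 51, 53, 55, 56, 59, 61, 63])  # percent utilized

slope, intercept = np.polyfit(weeks, util, 1)  # linear trend, percent per week
capacity = 85.0                                # hypothetical engineering threshold
weeks_left = (capacity - util[-1]) / slope
print(f"~{weeks_left:.0f} weeks until this element nears capacity")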

Gardner: And given that platform and technology providers like HPE are enabling you to handle streaming, real-time, highly voluminous data, where do you see your services going next?

It appears to me that more than just mobile devices will be on these networks. Perhaps we're moving towards the Internet of Things (IoT). We're looking more towards people replacing other networks with their mobile network for entertainment and other aspects of their personal and business lives. At that packet level, where you examine this traffic, it seems to me that you can offer more services to more people in the fairly near future.

Two paths

Khalil: IoT is big and it’s showing up on everybody’s radar. We have two paths that we're pursuing on our roadmap. There is the technology component, and that’s why HPE is a key partner for us. We believe in all the big-data components they offer. The other component for us is data science and data analysis.

The innovation is going to be in the types of modeling techniques used to help, in our case, our customers, the mobile operators. Down the road, there could be other beneficiaries of that data -- for example, the companies that deploy the sensors generating it.

I'm sure they want some feedback on all the data that their sensors are generating. We have all the building blocks now to keep expanding what we have and start getting into those advanced analytics, advanced methodologies, and predictive modeling. These are the areas where we really see our core expertise, because we understand this data.

Today you see a lot of platforms showing up that say, “Give me your data and I'll show you nice-looking reports.” But there is a key component missing, and that is the domain expertise to understand the data. This is our core expertise.

Gardner: Before we finish up, I'd like to ask you about lessons learned that you might share with others. For those organizations grappling with the need for near real-time analytics over massive amounts of data -- maybe it’s on a network, maybe it’s in a different environment -- do you have any 20/20 hindsight to offer on how to make the best use of big data and monetize it?

Khalil: There is a lot of confusion in the industry today about big data. What is big data, and what do I need for it? You hear terms like Hadoop: "I have deployed a Hadoop cluster, so I have solved my big-data needs." You ask people what their big-data strategy is, and they say they have deployed Hadoop. Well then, what are you doing with Hadoop? How are you accessing the data? How are you reporting on the data?

My advice is that it’s a new field, and you need to consider not just the Hadoop storage layer but the other analytical layers that complement it. Everybody is excited about big data. Everybody wants a strategy for using big data, and there are multiple components to it. We offer a key component. We don't pitch ourselves to our customers and say, “We are your big data solution for everything you have.”

There is an underlying framework that they have to deploy, and Hadoop is one option. Then comes our piece. It sits on top of the data-hosting infrastructure and feeds from all the different data types, because in our industry, a typical operator has hundreds, if not thousands, of data silos across the organization.

So you need a framework to host the various data sources, and Hadoop could be one of them. Then, you need a higher-level reporting layer, an analytical layer, that can start combining these data silos, making sense of them, and bringing value to the organization. So it's a complete strategy for how to handle big data.

Gardner: And that analytics layer is what HPE Vertica is doing for you.

Key component

Khalil: Exactly. HPE is a key component of what we do in our analytical layer. There are misconceptions. When we talk to our customers, they say, “Oh, you're using your Vertica platform to replicate our big data store,” and we say that we're not. The big data store is a lower layer, and we're an analytical layer. We're not going to keep everything. We're going to look at all your data, throw away a lot of it, keep just what you really need, and then synthesize it to be modeled and reported on.

Gardner: I'm afraid we'll have to leave it there. We've been exploring how INOVVO delivers impactful analytical services to mobile operators so they can foster improved end-user loyalty, and we've identified how advanced analytics, drawing on multiple data sources, provides a better network quality assurance and, of course, an all-important better user experience.
Embed the HPE Big Data Analytics Engines
To Meet Enterprise-Scale Requirements
Get More Information
So join me in thanking Joseph Khalil, President and CEO of INOVVO in Reston, Virginia. And a big thank you as well to our audience for joining us for this big data innovation case study discussion.

I'm Dana Gardner; Principal Analyst at Interarbor Solutions, your host for this ongoing series of HPE-sponsored discussions. Thanks again for listening, and do come back next time.

Listen to the podcast. Find it on iTunes. Get the mobile app. Download the transcript. Sponsor: Hewlett Packard Enterprise.

Transcript of a discussion on how advanced analytics drawing on multiple data sources provides wireless operators improved interactions with their subscribers and enhances customer experience through personalized insights. Copyright Interarbor Solutions, LLC, 2005-2015. All rights reserved.


Thursday, December 17, 2015

DevOps by Design--A Practical Guide to Effectively Ushering DevOps into Any Organization

Transcript of a BriefingsDirect discussion on powerful best practices for making DevOps an accelerant to broader business goals.

Listen to the podcast. Find it on iTunes. Get the mobile app. Download the transcript. Sponsor: Hewlett Packard Enterprise.

Dana Gardner: Hello, and welcome to the next edition of the HPE Discover Podcast Series. I'm Dana Gardner, Principal Analyst at Interarbor Solutions, your host and moderator for this ongoing discussion on IT innovation -- and how it’s making an impact on people’s lives.

Our next DevOps innovation case study highlights how Cognizant Infrastructure Services has worked with a large telecommunications and Internet services company to make DevOps benefits a practical reality.

We'll learn important ways to successfully usher DevOps into a large, complex enterprise IT environment, and we'll hear best practices on making DevOps an accelerant to broader business goals -- such as adapting to the Internet of Things (IoT) requirements, advancing mobile development, and allowing for successful cloud computing adoption.

To provide a practical guide to effectively ushering DevOps into any organization, we're joined by Sachin Ohal, Manager Consulting at Cognizant Infrastructure Services in Philadelphia. Welcome, Sachin.

Sachin Ohal: Hi. I'm glad to be on the show. How are you?

Gardner: I'm great. Thanks for being here. We're also here with Todd DeCapua, Chief Technology Evangelist at HPE Software. Welcome, Todd.
DevOps Solutions Unify Development,
Accelerate Innovation and Meet Market Demands

Find out More from Hewlett Packard Enterprise
Todd DeCapua: Dana, great to be here.

Gardner: Let's start at a high level and then drill down. When we're talking about DevOps in a large environment, what are the barriers that we're facing these days? What is some of the resistance when we think about ushering it in? I know it's a complex undertaking, but what are the things we need to be thinking about in terms of overcoming that and making DevOps a reality?

DeCapua: Some of the things that I've seen in the last 10 or so years -- thinking about things like agile transformations, then DevOps -- people often say, "What are the tools that I need to buy?" And it's not about the tools. It's starting with the culture of the organization and helping people understand the reasons why a change is needed, and as they would understand why, then how is it that we can start to adopt some of these fundamentals?

Gardner: Sachin, what are some of the problems that we need to overcome?

Different models

Ohal: Fundamentally, industries have come up with many different models that operate in a sending-and-receiving mode rather than a communicating mode.

So either one team is sending to another team, or one organization is sending to another. When a model like DevOps comes in, the IT team starts DevOps without selecting the area where DevOps needs to start, or which team needs to take the lead in starting DevOps in the organization.

Companies are trying to enhance their IT infrastructure. They want to enforce DevOps. On the other hand, when they all start communicating, they're getting lost. This has become a fundamental problem in implementing DevOps.

Gardner: You've been working with a number of companies in bringing DevOps best practices into play. What are some of the bedrock foundation steps companies should take? Is there a common theme, or does it vary from company to company?

Ohal: DevOps is a kind of domain that varies inside a company. We can't compare company to company. It varies company to company, domain to domain, organization to organization, because here we're talking about developing a culture. When we talk about developing a culture, a thought process, understanding those thought processes plays a key role.

And if we fundamentally talk about an application development organization, testing organization, or the IT ops organization, they have their own key performance indicators (KPIs), their own thought process, and their own goals defined.

Many times, we observe that within the IT organization, development, testing, and operations have different goals, objectives, and KPIs. They never cross-functionally define business needs. They mostly define technology as organization-specific. As an example, a functional tester doesn’t know how developers are communicating with each other, or with the security team for security-related issues. An operations engineer has an uptime KPI, but really doesn’t know the various application modules he's supporting.

Suddenly, by enforcing DevOps, we're telling all these organizations to begin communicating, start intersecting, and have cross-communication. This has become a key problem in 21st-century infrastructure, application, testing, or overall DevOps framework implementation. Communication and understanding have become key challenges for organizations.

Gardner: Before we get into the specific use case scenario and case study, what is the relationship between Cognizant and HPE? You're a services provider; they're a technology provider. How does it work?

Strong partner

Ohal: We're a strong partner with HPE. Cognizant is a consulting company, a talent company. On the other hand, HPE is an enterprise-scale product delivery company. There is a very nice synergy between Cognizant and HPE.

When we go to market, we assess the situation, we ask HPE to come on-premises to work with us, have a handshake, form a high-performance team, and deliver an enterprise solution to Cognizant's and HPE's customers.

Gardner: Todd, given the challenges of bringing DevOps to bear in many organizations, the fact that it varies from company to company really sounds like a team sport, not something one can do completely alone. It's an ecosystem play. Is that right?

DeCapua: It absolutely is. When I think about this ecosystem, there are three players. You have your customer first, but then you have an organization like HPE that provides enterprise products and capabilities, and then other partners like Cognizant that can bring in the talent to be able to put it all together.

As we think about kind of this transition and think about what these challenges are that our number one player, our customers, have, there are these foundational pieces that you think about -- things like time-to-market as being a challenge, brand value being a challenge, and, of course, revenue is another challenge.

As we were talking early on, what are those fundamental challenges that our customer, again as a team sport, are being challenged with? We see that this is different for every one of our customers, and starting with some of these fundamentals, what are those challenges?

Understanding that helps with, "We need to make a change. We need to influence the culture. We need to do all these pieces." Before we jump right into that technical solution, let’s sit down as the teams together, with a customer, with someone like HPE, with someone like Cognizant, and really understand what our challenges are.

Gardner: All right. Let's drill down a bit into a specific scenario. Sachin, a large telecommunications, media and Internet services company, tell us about what their goals were and why they were pursuing DevOps and practical improvement in how they have a test/deploy synergy.

Ohal: When we talk about telco, pharma or retail customers, they fundamentally come up with many upstream/downstream revenue-oriented, customer service, workbench platforms -- and it's very hard to establish a synergy between all the platforms, and to make them understand what their end goal is.

Obviously the end goal is customer service, but to achieve that goal you have to go through so many processes, so many handshakes on a business level, on a technology level, on a customer-service level, and even internal customer service level.

Always supporting

In today's world, we are IT for IT. None of the organizations inside a company works as an independent IT group. They work IT for IT. They are always supporting either the business or an internal IT group.

Having this synergy established, having this core value established, we come across many people who don't understand the communication. The right tools are not in place. Once we overcome the tools and the communication process, the major question is how to put that process in place end-to-end in the IT organization.

That, again, becomes a key challenge, because it's not easy to get something new adopted. As Todd said, we're talking about agile development and mobile. Your IT organization becomes your business. You're being asked to inject something new with no proven result. It's like injecting a test subject with a new drug. That's exactly the feeling any IT executive has: "Why am I supposed to be injecting this thing?"

Do I get value out of it or don't I? There is no benchmark available in the industry showing that people succeed in a certain domain or a certain area; there are always bits and pieces. This is a key challenge that we observe across the industry -- a lack of adaptiveness to a new technology or a new process. We're still seeing that.

I have a couple of customers who say, "Oh, I run Windows 2000 server. I run Windows 98. I have no idea how many Java libraries my application is using." They are also unable to explain why they still have so many.

It's similar on the testing side. Somebody says, "I use Load Testing Solution 9," a release that even HPE itself retired three or four years back.

Then, if you come to the operations organization, people say, "I use a very old server." What does it mean? It means that business is just getting IT services. They have to understand that this service needs to be enhanced so that the business will be enhanced.

Technology enhancement doesn’t mean that my data center is flooded with some new technology. Technology enhancement means that my entire end-to-end landscape is upgraded with new technology that will support the next generation, while I'm still struggling with legacy. These are the key challenges we observe in the market.

Gardner: But specifically with this use case, how did you overcome them? Did you enter into the test environment and explain to them how they should do things differently, leverage their data in different ways, or did you go to the developers first? Is there a pattern to how you begin the process of providing a better DevOps outcome?

End-to-end process

Ohal: First of all, we had to define an end-to-end delivery process and then we had to identify end-to-end business value out of that delivery process.

Once we identified the business value, we drew a line between the various organizations so they could understand that they were not cutting across each other, but running in parallel. It is a thin line, one that works but will definitely vary from domain to domain.

In a multi-generational business plan, when we talk about drawing this thin line, there is no playbook that tells us exactly how to draw it in an IT organization, a business organization, or inside IT -- whether we draw it in a testing organization or a development organization.

DevOps can be started in any landscape. We may start with a testing organization and then we decide to pull it into the development and IT organization.
DevOps Solutions Unify Development,
Accelerate Innovation and Meet Market Demands

Find out More from Hewlett Packard Enterprise
In some cases we may start with a development organization, and then testing and operational organizations come into place. Some businesses start DevOps, and they say that they want to do things the way they want.

If you really ask me about a specific case study, rather than give a very narrow answer, I want to tell you that the answer spans a wide area. I don’t want to take our audience in the wrong direction: somebody started in testing, so we'll just start in testing; somebody else started in development, so let’s start in development.

You can start anywhere, but before starting, just stay back, decide where you want to start, why you want to start, how you want to start, and get the right folks and the right tools in the picture.

Gardner: Given that there is a general pattern, but there are also deep specifics, could you walk us through the general methodology that you have been talking about and that you are describing?

Ohal: At one point in time, most users, or most listeners of this podcast, were startup companies. They started up their company around a product or a service and they were struggling with the market.

Then, they shifted to being a product company. When I say product, it doesn’t mean a physical product; it might be a service as a product. Then, they started mergers, acquisitions, and enhancing their portfolio in the market. They've done a couple of the exercises that the industry fundamentally does.

Service companies

Now, more big companies are transforming themselves into service companies. They want to make sure that their existing customers and their new customers are getting the same value, because the challenge, while adding new customers, remains: Are my existing customers still with me? Are they happy and satisfied, and are they willing to continue doing business with my company?

Are they getting equivalent service to what we have committed to them? Are they getting my new technology and business value out of those services?

This creates a lot of pressure on IT and business executives. In mobile computing and cloud computing, suddenly some companies are trying to transform themselves into cloud companies from service companies. There is a lot of pressure on their IT organization to go toward cloud.

They're starting by using cloud web services and cloud authentication at an IT level. We're not talking about a larger landscape, but they're trying. Basically, this is the transformation from startup to product, product to services, and then services to cloud. That is your multi-generational vision with your multi-generational business plan, because your people change, your IT changes, your technology changes, your business models keep changing, your customers change, your revenue changes, and the mode of revenue changes.

Consider the examples of eBay and Google. At one point in time, they didn't exist. We never even thought that these companies would be leading on Wall Street, providing so much employment, or having such a large consumer base.

As a consulting company, Cognizant observes those trends in the market very quickly. We see those changes in the market, we assist with them, and we come in with our own internal teams that understand all of this -- yet the customer's multi-generational vision remains the same.

To run this vision I have a strategic business objective, a strategic business unit. How will this unit communicate with the strategic business objective? That's where your IT plays a key role. Information technology becomes a key strategic business unit in your organization that is driving this whole task force.

While driving this task force, if you don't define your DevOps in a multi-generational business plan, your focus becomes IT-centric. The moment technology changes, you're in trouble. The moment the process changes -- and the moment you think about cross-domain work in your company -- you're in trouble.

As an example, a telco is doing cross-domain business with a retailer. Then, a pharma company is doing cross-domain business with the telco. Do you want to spend double on your IT or your business, or do you want to shut down the existing project and fund a new project?

There are so many questions that come into the picture when we talk about an IT-centric DevOps organization, but when we have business-centric DevOps initiation, we accommodate all the views, and accordingly, IT takes control of your business and they help you to run your business.

Gardner: So business agility is really the payoff, Todd?

Looking at disruptions

DeCapua: Yes. Dana and Sachin, as we look at this challenge and wrapping this around the use case that Cognizant has -- not only the one customer that we are talking about, but really all of them -- and thinking through this multi-generational business plan using DevOps, there are some real fundamentals to think about. But there are disruptions in the world today, and maybe starting there helps to illustrate a little bit better why this concept of a multi-generational business plan is so important.

Consider Uber, Yelp, or Netflix. Each one of them is in a different stage of a multi-generational business plan, but as to this foundational element that Sachin had been explaining -- where some organizations today are stuck in a legacy technology or IT organization -- it’s really about starting at that fundamental level of understanding: What are our strategic business objectives?

Then look at this from whether there's a strategic business unit and where that's focused. Then, build up from there to where you have technology that lives on top of that.

What’s fun for me is when I look at Uber, Yelp, or Netflix, knowing they are all different, but some of them do have a product and some of them don’t. Some of them are an IT organization that has a services layer that connects all of these pieces together.

So whether it's a large telecom or an Internet provider, there are products, but there has really been a focus on services.

What can help is that this organizational, multi-generational vision is going to live through the iterations that every organization goes through. I hate to keep pounding on these three examples, but I think they're great in ways that help illustrate this.

We all remember when something like Uber came in as a startup and was not really well understood. Then, you look again, and it has become productized. It’s probably safe to assume that we've reached a certain level where it's available in most cities that I travel to.

Then, you move into something more like a product, looking at Yelp. That is definitely a product that’s mainstream, and it definitely has a lot of users today. Then you move into the service area; as something matures into a service, it becomes broadly adopted among the majority of its target users.

The fourth I would like to call on is cloud. As you move to something like cloud, that's where Netflix becomes a perfect example. It’s all cloud-based. I'm a subscriber. I know that I can have streaming video on any device, anywhere in the world, at any time, on Netflix, delivered from the cloud.

So these four generational business-plan stages that we are talking about -- startup, product, service, and cloud -- all carry that underlying vision, supported by information technology and a defined strategic business objective, focused on a strategic business unit.

It’s really important to help understand that as I look at somebody like Cognizant as a partner and the approach that they have used with several of their customers.

Gardner: For organizations reading this or listening in that are interested in getting to that multi-generational benefit -- where their investments in IT pay off dividends for quite some time, particularly in their ability to adapt to change rapidly -- any good starting points? Are there proof of concept (POC) places where you start? I know it’s boiling the ocean in some ways, but there must be some good principles to get going with.

Sensing inside

Ohal: Definitely there are. In this 21st Century IT business goal, first you have to sense everything inside of your business, rather than sensing the outside market. Sense all your business thoroughly, in real time. What is it doing?

You have to analyze your business model. Does my business model fit these four fundamental parts? Where am I right now? Am I on the startup side, the product side, the service side, or the cloud side, and where do I want to go? We have to define that, and then, based on that, you have to adopt DevOps. You have to make sure where you are adopting your DevOps.

I was at product and I'm going to services, so I need a DevOps fitting here. Or I'm right now at a well-matured product and I want to go to the cloud. Where am I going? Or I'm right now on the cloud and I want more and more refined services for my customers.

Find that scale and define that scale, rather than getting many IT groups together and just doing a brainstorming session about where you're supposed to stand. No. What is your business vision? What is your customer value? Those values really drive your business, and to drive that business, use DevOps.

It's not just for getting continuous delivery or continuous integration in place. Two IT executives are talking: "You're in my organization doing a great handshake," and the business says, "I don’t want that handshake. I want the uptime."

There are so many various aspects, various views. Todd mentioned that he has all these examples, but if you check other examples as well, they're very focused on their multi-generational business plans, and if you want to succeed, you have to be focused on those aspects as well.

Gardner: Anything else to add, Todd?

DeCapua: As far as getting started and what works and where you go, there are a number of different ways that we've worked with our customers to get started.

One approach that I have seen proven goes after something that has been neglected. For example, there's a maintenance backlog -- items that over six months, a year, or sometimes even two years have just been neglected. If you really want to find some quick value, maybe it’s pulling that maintenance backlog out, prioritizing it with your customer, understanding what's still important and what’s no longer important, and shortening it down to a target list.

Then, focus a few resources on a few of these high-priority items that would otherwise continue to be neglected, and start to adopt some of these practices and capabilities. You can then immediately show value to that business owner, because you've applied a few resources and a little bit of time and gone after the highest-priority items that otherwise would have been neglected.

The second piece that comes in is this analysis capability. How are you tracking the results? What are those metrics that you're using to show back to the business that they have their multi-generational plan and strategy laid out, but how is it that they are incrementally showing this value as they're delivering over and over again?

But start small. Maybe go after that neglected maintenance backlog being a really easy target, and then showing the incremental value over time, again, through the sensing that Sachin has mentioned. Also be able to analyze and predict those results and then be able to adapt over time with speed and accuracy.

Gardner: I'm afraid we'll have to leave it there. We've been learning about how Cognizant Infrastructure Services has worked with HPE to help a large Internet services provider to make DevOps benefits a practical reality.

And we've heard some powerful best practices on making DevOps an accelerant to broader business goals, but at the level of a multi-generational business activity.

So I want to thank our guests, Sachin Ohal, Manager Consulting at Cognizant Infrastructure Services. Thanks, Sachin.
DevOps Solutions Unify Development,
Accelerate Innovation and Meet Market Demands

Find out More from Hewlett Packard Enterprise
Ohal: Thank you very much. Glad to have been on your show.

Gardner: And we have also been talking with Todd DeCapua, Chief Technology Evangelist at HPE Software. Thanks, Todd.

DeCapua: Thank you, and speak with you guys soon.

Gardner: And a big thank you also to our audience for joining us for this special DevOps case study discussion. I'm Dana Gardner, Principal Analyst at Interarbor Solutions, your host for this ongoing series of HPE-sponsored discussions. Thanks again for listening, and come back next time.

Listen to the podcast. Find it on iTunes. Get the mobile app. Download the transcript. Sponsor: Hewlett Packard Enterprise.

Transcript of a BriefingsDirect discussion on powerful best practices for making DevOps an accelerant to broader business goals. Copyright Interarbor Solutions, LLC, 2005-2015. All rights reserved.


Tuesday, December 08, 2015

Need for Fast Analytics in Healthcare Spurs Sogeti Converged Solutions Partnership Model

Transcript of a BriefingsDirect discussion on how a triumvirate of big players have teamed to deliver a rapid and efficient analysis capability for healthcare data.

Listen to the podcast. Find it on iTunes. Get the mobile app. Download the transcript. Sponsor: Hewlett Packard Enterprise.

Dana Gardner: Hello, and welcome to the next edition of the HPE Discover Podcast Series. I'm Dana Gardner, Principal Analyst at Interarbor Solutions, your host and moderator for this ongoing discussion on IT innovation and how it’s making an impact on people’s lives.

Our next big-data discussion explores how a triumvirate of big-data players have teamed to deliver a rapid and efficient analysis capability across disparate data types for the healthcare industry. We'll learn how the drive for better patient outcomes amid economic efficiency imperatives has created a demand for a new type of big-data implementation model.

This model, with support from Hewlett Packard Enterprise, Microsoft, and Sogeti, leverages a nimble big-data platform, converged solutions, hybrid cloud, and deep vertical-industry expertise. The result is innovative and game-changing insights across healthcare ecosystems of providers, patients, and payers.

The ramp-up to these novel insights is rapid, and the cost-per-analysis value is very aggressive. Here to share the story on how the Data-Driven Decisions for Healthcare initiative arose and why it portends more similar vertical industry focused solutions, we're joined by Bob LeRoy, Vice President in the Global Microsoft Practice and Manager of the HPE Alliance at Sogeti USA. He's based in Cincinnati. Welcome Bob.
Converged Systems +
Analytics = Transformation
Learn More from Sogetilabs
Bob LeRoy: Hi, Dana. Thanks for inviting me. Glad to be here.

Gardner: Why the drive for a new model for big data analytics in healthcare? What are some of the drivers, some of the trends, that have made this necessary now?

LeRoy: Everybody is probably very familiar with the Affordable Care Act (ACA), also known as ObamaCare. They've put a lot of changes in place for the healthcare industry, and primarily it's around cost containment. Beyond that, the industry itself understands that they need to improve the quality of care that they're delivering to patients. That's around outcomes, how can we affect the care and the wellness of individuals.

So it’s around cost and the quality of care, but it’s also about how the industry itself is changing, both in how providers are now doing more with payments and in how classic payers are doing more to actually provide care themselves. There is a blurring of the lines between payer and provider.

Some of these people are actually becoming what we call accountable care organizations (ACOs). We see a new one of these ACOs come up each week, where they are both payer and provider.

Gardner: Not only do we have a dynamic economic landscape, but the ability to identify what works and what doesn't work can really be important, especially when dealing with multiple players and multiple data types. This is really not just knowing your own data; this is knowing data across organizational boundaries.

LeRoy: Exactly. And there are a lot of different data models that exist. When you look at things like big data and the volume of data that exists out in the field, you can put that data to use to understand who your critical patients are and how that can affect your operations.

Gardner:  Why do we look to a triangulated solution between players like Hewlett Packard Enterprise, Microsoft, and Sogeti? What is it about the problem that you're trying to solve that has led it to a partnership type of solution?

Long-term partner

LeRoy: Sogeti, a wholly-owned subsidiary of the Capgemini Group, has been a long-term partner with Microsoft. The tools that Microsoft provides are one of the strengths of Sogeti. We've been working with HPE now for almost two years, and it's a great triangulation between the three companies. Microsoft provides the software, HPE provides the hardware, and Sogeti provides the services to deliver innovative solutions to customers and do it in a rapid way. What you're getting is best in class in all three of those categories -- the software, the hardware, and the services.

Gardner: There's another angle to this, too, and it’s about the cloud delivery model. How does that factor into this? When we talked about hardware, it sounds like there's an on-premises aspect to it, but how does the cloud play a role?

LeRoy: Everybody wants to hear about the cloud, and certainly it’s important in this space, too, because of the type of data that we're collecting. You could consider social data or data from third-party software-as-a-service (SaaS) applications, and that data can exist everywhere.

You have your on-premise data and you have your off-premise data. The tools that we're using, in this case from HPE and Microsoft, really lend themselves well to developing a converged environment that delivers best in class across those different environments. They're secured, delivered quickly, and they provide the information and the insights that hospitals and insurance companies really need.

Gardner: So we have a converged solution set from HPE. We have various clouds that we can leverage. We have great software from Microsoft. Tell us a little about Sogeti and what you're bringing to the table. What is it that you've been doing in healthcare that helps solidify this solution and the rapid analysis requirements?

LeRoy: This is one of the things that Sogeti brings to the table. Sogeti is part of the Capgemini Group, a global organization with 150,000 employees, and Sogeti is one of the five strategic business units of the group. Sogeti’s strength is that we're really focused on the technology and the implementation of technology, and we're focused on several different verticals, healthcare being one of them.

We have experts on the technology stacks, but we also have experts in healthcare itself. We have people whom we've pulled from the healthcare industry. We taught them what we do in the IT world, so they can help us apply best practices and technologies to solve real healthcare organizational problems, so that we can get to the quality of care and the cost reduction that the ACA is really looking for. That’s a real strength that's going to add significant value to healthcare organizations.

Gardner: It’s very important to see that one size does not fit all when it comes to these systems. Industry verticalization is required, and you're embarking on a retail equivalent of this model; manufacturing and other sectors might come along as well.

Let's look at why this approach to this problem is so innovative. What have been some of the problems that have held back the ability of large and even mid-sized organizations in the healthcare vertical industry from getting these insights? What are some of the hurdles that they've had to overcome and that perhaps beg for a new and different model and a new approach?

Complexity of data

LeRoy: There are a couple of factors. For sure, it’s the complexity of the data itself. The data is distributed over a wide variety of systems, so it’s hard to get a full picture of a patient or a certain care program, because the systems are spread out all over the place. When the data reaches you from so many different systems in so many different ways, you get part of the data; you don’t get the full picture. We call that poor data quality, and it makes it hard for somebody doing analysis to really understand and gain insight from the data.

Of course, there's also the existing structure that’s in place within organizations. They've been around for a long time. People are sometimes resistant to change. Take all of those things together and you end up with a slow response time to delivering the data that they're looking for.

Access to the data becomes very complex or difficult for an end-user or a business analyst. The cost of changing those structures can be pretty expensive. If you look at all those things together, it really slows down an organization’s ability to understand the data that they've got to gain insights about their business.

Gardner: Just a few years ago, when we used to refer to data warehouses, it was a fairly large undertaking. It would take months to put these into place; they required a data center or some sort of leasing arrangement, and of course a significant amount of upfront cost. How has this new model approached those cost and ramp-up time issues?

LeRoy: Microsoft’s model that they have put in place to support their Analytics Platform System (APS) allows them to license their tools at a lower price. The other thing that's really made a difference is the way HPE has put together their ConvergedSystem that allows us to tie these hybrid environments together to aggregate the data in a very simple solution that provides a lot of performance.

If I have to look at unstructured data and structured data, I often need two different systems. HPE is providing a box that’s going to allow me to put both into a single environment. So that’s going to reduce my cost a lot.

They have also delivered it as an appliance, so I don't need to spend a lot of time buying, provisioning, or configuring servers, setting up software, and all those things. I can just order this ConvergedSystem from HPE, put it in my data center, and I'm almost ready to go. That’s the second thing that really helps save a lot of time.
Converged Systems Help Transform
Healthcare and Financial Services
Learn More from Sogetilabs
The third one is that at Sogeti Services, we have some intellectual property (IP) to help the data integration from these different systems and the aggregation of the data. We've put together some software and some accelerators to help make that integration go faster.

The last piece of that is a data model that structures all this data into a single view that makes it easier for the business people to analyze and understand what they have. Usually, it would take you literally years to come up with these data models. Sogeti has put all the time into it, created these models, and made it something that we can deliver to a customer much faster, because we've already done it. All we have to do is install it in your environment.

It's those three things together -- the software pricing from Microsoft, the appliance model from HPE, and the IP and the accelerators that Sogeti has.

Consumer's view

Gardner: Bob, let’s look at this now through the lens of that consumer, the user. It wasn’t that long ago where most of the people doing analytics were perhaps wearing white lab coats, very accomplished in their particular query languages and technologies. But part of the thinking now for big data is to get it into the hands of more people.

What is it that your model, this triumvirate of organizations coming together for a solution approach, does in terms of making this data more available? What are the outputs, who can query it, and how has that had an impact in the marketplace?

LeRoy: We've been trying to get this to the end users for 30 years. I've been trying to get reports into the hands of users and let them do their own analysis, and every time I get to the point where I think this is the answer -- the users are going to be able to do their own reports, freeing up guys in the IT world like me to go off and do other things -- it doesn’t always work.

This time, though, it's really interesting. I think we've got it. We allow users direct access to the data, using tools that they already know. So I'm not going to create and introduce a new tool to them. We're using tools that are very similar to Excel, pointing to a data source that’s already well organized for them, with data that they're already familiar with.

So if they're using Microsoft Excel-like tools, they can do Power Pivots and pivot tables as they've already been doing, but until now only in an offline manner. Now, I can give them direct access to real-time data.

Instead of waiting until noon to get reports out, they can go and look online and get the data much sooner, so we can accelerate their access time to it, but deliver it in a format that they're comfortable with. That makes it easier for them to do the analysis and gain their insights without the IT people having to hold their hands.

Gardner: Perhaps we have some examples that we can look to that would illustrate some of this. You mentioned social media, the cloud-based content or data. How has that come to bear on some of these ways that your users are delivering value in terms of better healthcare analytics?

LeRoy: The best example I have is the ability to bring in data that’s not in a structured format. We often think of external data, but sometimes it’s internal data, too -- maybe x-rays or people doing queries on the Internet. I can take all of that unstructured data and correlate it to my internal electronic medical records or the health information systems that I have on-premise.

If I'm looking at Google searches, and people are searching for keywords such as "stress," "heart attacks," or "cardiac care," I can map the times when people run those kinds of queries by region. Then I can tie that back to my own systems and ask what the behavior or traffic patterns look like within my facility at those same times. That lets me target certain areas: change my staffing model if there is a big jump in searches, run a campaign asking people to come in for a screening, or encourage people to see their primary-care physicians.

There are a lot of things we can do just by looking at the patterns in the data. It helps us narrow down which areas of our coverage need work, which geographic areas I need to focus on, and how I manage the operations of the organization, simply by tying the different types of data we have together. This is something that we couldn't do before, and it's very exciting to see that we're able to gain such insights and take action against them.
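As a rough illustration of the kind of correlation analysis described here, the following sketch lines up hypothetical regional search-term volume against facility traffic and flags spikes. The data, column names, and thresholds are all invented for illustration; none of this is the actual solution.

```python
# A minimal, hypothetical sketch: correlate regional search volume for a
# health keyword with emergency-department visits, then flag spikes that
# might justify a staffing change or a screening campaign.
import pandas as pd

# Hypothetical daily counts for one region.
searches = pd.DataFrame({
    "date": pd.date_range("2015-01-01", periods=5, freq="D"),
    "region": ["northeast"] * 5,
    "search_volume": [120, 135, 180, 240, 210],
})
visits = pd.DataFrame({
    "date": pd.date_range("2015-01-01", periods=5, freq="D"),
    "region": ["northeast"] * 5,
    "ed_visits": [40, 42, 55, 70, 66],
})

# Join the external and internal data on date and region, then measure
# how closely the two series move together.
merged = searches.merge(visits, on=["date", "region"])
corr = merged["search_volume"].corr(merged["ed_visits"])
print(f"search-to-visit correlation: {corr:.2f}")

# Flag days where searches jump well above the recent trend.
rolling_mean = merged["search_volume"].rolling(3).mean()
merged["spike"] = merged["search_volume"] > rolling_mean * 1.25
print(merged[["date", "search_volume", "spike"]])
```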

Applying data science

Gardner: I can see now why you're calling it Data Driven Decisions for Healthcare, because you're applying data science to areas that would probably never have been considered for it before. People might use intuition or anecdote, or deliver evidence that was perhaps not all that accurate. Maybe you could illustrate a few more ways in which you're using data science and very powerful systems to gather insights in areas where we never thought to apply such high-powered tools before.
Converged Systems +
Analytics = Transformation
Learn More from Sogetilabs
LeRoy: Let's go back to the beginning, when we talked about how we change the quality of care that we're providing. Today, doctors collect diagnosis codes for just about every procedure that's done, but we don't really look at how many times those same procedures are repeated or which doctors are performing which procedures. Let's look at the patients, too, and which patients are getting those procedures. We can tie those diagnosis codes together in a lot of different ways.

The one I probably like best: I want to know which doctors perform those procedures only once per patient and get the best results from the treatments they perform. If I'm a hospital, I then know which doctors perform which procedures best, and I can direct the patients who need those procedures to the doctors who provide the best care.

And the reverse of that might be that if a doctor doesn't perform that procedure well, let's avoid sending him those kinds of patients. Now, my quality of care goes up, the patient has a better experience, and we're going to do it at a lower cost, because we're only doing it once.
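A hypothetical sketch of that diagnosis-code analysis might look like the following -- grouping claims by doctor and procedure to see who performs a procedure once per patient with the best results. All field names and the outcome score are illustrative, not from the actual model.

```python
# A hypothetical sketch: for each doctor and procedure, count repeats per
# patient and summarize outcomes, so patients can be steered toward the
# doctors who do a procedure once and do it well.
import pandas as pd

claims = pd.DataFrame({
    "doctor":    ["smith", "smith", "jones", "jones", "jones"],
    "patient":   ["p1",    "p2",    "p3",    "p3",    "p4"],
    "procedure": ["cabg"] * 5,
    "outcome":   [0.9, 0.85, 0.6, 0.55, 0.7],  # invented quality score
})

# Repeats per patient: performing the same procedure on the same patient
# more than once drives cost up and quality of care down.
repeats = (claims.groupby(["doctor", "procedure", "patient"])
                 .size()
                 .rename("times_performed")
                 .reset_index())

summary = (repeats.merge(claims, on=["doctor", "procedure", "patient"])
                  .groupby(["doctor", "procedure"])
                  .agg(avg_repeats=("times_performed", "mean"),
                       avg_outcome=("outcome", "mean")))

# Fewest repeats and best outcomes first: the doctors to route patients to.
print(summary.sort_values(["avg_repeats", "avg_outcome"],
                          ascending=[True, False]))
```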

Gardner: Let's dive into this solution a bit, because I'm intrigued that this model works -- bringing together a converged-infrastructure provider, a software provider, and field expertise that crosses the chasm between a technology capability and a vertical industry knowledge base. So let's dig in a little. Tell us about the Microsoft APS -- what it includes and why it's powerful and applicable in this situation.

LeRoy: The APS is a solution that combines unstructured data and structured data into a single environment, and it allows the IT guys to run classic SQL queries against both.

On one side, we have what used to be called Parallel Data Warehouse. It's a really fast version of SQL Server -- massively parallel processing that can run queries super fast. That's the important part: I have structured data that I can get to very quickly.

The other half of it is HDInsight, which is Microsoft's implementation of open-source Hadoop. Hadoop is all unstructured data. In between these two sits PolyBase, so I can query the two together and join structured and unstructured data.
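As a rough sketch of what that looks like in practice, a single T-SQL query can join a warehouse table to a PolyBase external table over Hadoop. Everything here -- the connection string, table names, and columns -- is hypothetical, and it assumes the external table has already been defined over HDInsight. The point is that the join crosses the structured/unstructured boundary in one statement.

```python
# A minimal, hypothetical sketch of a PolyBase-style query: one T-SQL
# statement joining structured warehouse data with data stored in Hadoop.
import pyodbc

# Hypothetical connection to an APS endpoint.
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=aps-appliance;DATABASE=clinical;Trusted_Connection=yes"
)

# dbo.Encounters lives in the MPP warehouse (structured); ext.WebSearches
# is assumed to be a PolyBase external table over HDInsight (unstructured
# source). PolyBase lets a single query join across both.
sql = """
SELECT e.Region, COUNT(*) AS Encounters, SUM(s.SearchHits) AS SearchHits
FROM dbo.Encounters AS e
JOIN ext.WebSearches AS s
  ON e.Region = s.Region AND e.VisitDate = s.SearchDate
GROUP BY e.Region;
"""

for row in conn.cursor().execute(sql):
    print(row.Region, row.Encounters, row.SearchHits)
```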

Then, since Microsoft created this APS specification, HPE implemented it in a box that they call the ConvergedSystem 300, and Sogeti has built our IP on top of that. We can consume data from all these different areas, put it into the APS, and deliver that data to an end user through a simple interface like Excel, Power BI, or some other visualization tool.

Significant scale

Gardner: Just to be clear for our audience, sometimes people hear "appliance" and don't necessarily think big scale, but the HPE ConvergedSystem 300 for the Microsoft APS is quite significant, with server, storage, and networking technologies and capacity for large amounts of data -- up to 6 petabytes. So we're talking about fairly significant amounts of data here, not small fry.

LeRoy: And they put everything into that one rack. We think of appliance as something like a toaster that we plug in. That’s pretty close to where they are, not exactly, but you drop this big rack into your data center, give it an IP address, give it some power, and now you can start to take existing data and put it in there. It runs extremely well because they've incorporated the networking and the computing platforms and the storage all within a single environment, which is really effective.

Gardner: Of course, one of the big initiatives at Microsoft has been cloud, with Azure. Is there a way the HPE converged infrastructure in a data center can be used in conjunction with a cloud service like Azure -- or another public cloud, an infrastructure-as-a-service (IaaS) cloud, or even a data-warehousing cloud service -- to accelerate delivery and bring in more types of data from more places? How does the public cloud fit into this?

LeRoy: You can distribute the solution across that space. In fact, we take advantage of the cloud as a delivery model. We use a tool from Microsoft called Power BI that allows you to do visualizations.

The system from HPE is a hybrid solution, so we can distribute it. Some of it can be in the cloud and some of it can be on-prem. It really depends on what your needs are and how your different systems are already configured. It's entirely flexible: we can put all of it on-prem, in a single rack or a single appliance, or we can distribute it out to the cloud.

One of the great things about the solution that Microsoft and HPE put together is it’s very much a converged system that allows us to bridge on-prem and the cloud together.

Gardner: And of course, Bob, those end users who are doing those queries and getting insights probably don't care where the data is coming from, as long as they can access it, it works quickly, and the costs are manageable.

LeRoy: Exactly.

Gardner: Tell me a little bit about where we take this model next -- clearly healthcare, big demand, huge opportunity to improve productivity through insights, improve outcomes, while also cutting costs.

You also have a retail solution approach in that vertical. How does that work, and is it already available? Tell us a little bit about why retail was the next one you went to, and where this might go next in terms of industries.

Four major verticals

LeRoy: Sogeti is focused on four major verticals: healthcare, retail, manufacturing, and life sciences. So we're going across the areas where we have expertise.

The healthcare one has been out now for nine months or so. Retail is in a different place. There are point solutions where people have solved part of this equation, but they haven't really dug deep into how to get it from end to end, which is something Sogeti has now done. From the point a person walks into a store, our analytics would alert us that the person has arrived, and we can take action on that.

We do what we can to increase our traffic and our sales with individuals, and then we aggregate all of that data, looking at things like customers, inventory, or sales across the organization. That end-to-end piece is something I think is unique within the retail space.

After that, we're going to go to manufacturing. Everybody likes to talk about the Internet of Things (IoT) today, and we're looking at some very specific use cases for how IoT can impact manufacturing -- predicting failures right on a manufacturing line, for example. Or, if we have heavy equipment out on a job site or in a mine, we could better predict when that equipment needs to be serviced, so we can maximize manufacturing process time.
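A toy sketch of that predictive-maintenance idea follows: flag a machine for service when recent sensor readings trend past safe limits. The thresholds, field names, and machine IDs are all invented for illustration.

```python
# A hypothetical sketch of the IoT use case: flag equipment for service
# before it fails, based on a window of recent sensor readings.
from dataclasses import dataclass

@dataclass
class Reading:
    machine_id: str
    vibration: float    # mm/s RMS, invented unit choice
    temperature: float  # degrees C

def needs_service(history: list[Reading],
                  vib_limit: float = 7.0,
                  temp_limit: float = 85.0) -> bool:
    """Flag a machine when recent readings trend past safe limits."""
    recent = history[-10:]  # last ten readings
    hot = sum(r.temperature > temp_limit for r in recent)
    rough = sum(r.vibration > vib_limit for r in recent)
    # Persistent excursions, not single spikes, predict failure.
    return hot >= 3 or rough >= 3

# A drifting machine: vibration and temperature both trending upward.
readings = [Reading("press-07", 6.5 + i * 0.3, 80 + i) for i in range(10)]
if needs_service(readings):
    print("press-07: schedule maintenance before the line goes down")
```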

Gardner: Any last thoughts in terms of how people who are interested in this can acquire it? Is it being sold jointly through these three organizations, or through Sogeti directly? How is this going to market in terms of how healthcare organizations can learn more and even experiment with it?

LeRoy: The best way is to search for us online. It's mostly being driven by Sogeti and HPE. Most healthcare providers that are also heavy HPE users may be aware of it already, and talking to an HPE rep or a Sogeti rep is certainly the easiest path forward.

We have a number of videos out on YouTube. If you search for Sogeti Labs and Data Driven Decisions, you'll certainly find my name and a short video that shows it. And of course, sales reps and customers are welcome to contact me or anybody from Sogeti or HPE.
Converged Systems Help Transform
Healthcare and Financial Services
Learn More from Sogetilabs
Gardner: Once again, the official name of this initiative is Data Driven Decisions for Healthcare. I'm afraid we'll have to leave it there. We've been discussing how a triumvirate of big players -- Hewlett Packard Enterprise, Microsoft, and Sogeti -- have teamed to deliver a rapid and efficient analysis capability across disparate data types for the healthcare industry.

And we've learned how this new type of big-data implementation model quickly and affordably delivers innovative, game-changing insights across ecosystems of providers, patients, and payers in healthcare -- and it looks like it will soon be delivering similar productivity benefits for retail, manufacturing, and life sciences as well.

So join me please in thanking our guest, Bob LeRoy, Vice President in the Global Microsoft Practice and Manager of the HPE Alliance at Sogeti USA, based in Cincinnati. Thank you, Bob.

LeRoy: Thanks, Dana.

Gardner: And I'd also like to thank our audience as well for joining us for this big data innovation discussion. I'm Dana Gardner, Principal Analyst at Interarbor Solutions, your host for this ongoing series of Hewlett Packard Enterprise-sponsored discussions. Thanks again for listening, and do come back next time.

Listen to the podcast. Find it on iTunes. Get the mobile app. Download the transcript. Sponsor: Hewlett Packard Enterprise.

Transcript of a BriefingsDirect discussion on how a triumvirate of big players have teamed to deliver a rapid and efficient analysis capability for healthcare data. Copyright Interarbor Solutions, LLC, 2005-2015. All rights reserved.
