
Thursday, December 12, 2013

Healthcare Turns to Big Data Analytics Platforms to Gain Insight and Awareness for Improved Patient Outcomes

Transcript of a BriefingsDirect on the need to tap the potential of big data to improve healthcare delivery and how the technology to do that is currently lagging.

Listen to the podcast. Find it on iTunes. Download the transcript. Sponsor: HP.

Dana Gardner: Hello, and welcome to the next edition of the HP Discover Podcast Series. I'm Dana Gardner, Principal Analyst at Interarbor Solutions, your host and moderator for this ongoing discussion of IT innovation and how it's making an impact on people’s lives.

Once again, we’re focusing on how IT leaders are improving their services to deliver better experiences and payoffs for businesses and end users alike. I’m now joined by our co-host for this sponsored series, Chief Software Evangelist at HP, Paul Muller. Welcome Paul, how are you today?

Paul Muller: Fighting fit, and healthy Dana, yourself?

Gardner: Glad to hear it. I’m doing very well, thanks. We’re going to now examine the impact that big-data technologies and solutions are having on the highly dynamic healthcare industry. We’ll explore how analytics platforms and new healthcare-specific solutions together are offering far greater insight and intelligence into how healthcare providers are managing patient care, cost, and outcomes.

And we’re going to hear firsthand of how these new offerings, announced this week at the HP Discover Conference in Barcelona, are designed specifically to give hospitals and care providers new data-driven advantages as they seek to transform their organizations.

With that, please join me in welcoming our guest, Patrick Kelly, Senior Practice Manager at the Avnet Services Healthcare Practice. Welcome, Patrick. [Disclosure: HP is a sponsor of BriefingsDirect podcasts.]

Patrick Kelly: Thank you, Dana. It's great to be with both you and Paul.

Gardner: Just to put this into some perspective, Paul, as you travel the globe, as I know you do, how closely are you seeing an intersection between big data and the need for analytics in healthcare? Is this a US-specific drive, or is it something that's sweeping many markets as well?

Muller: It's undoubtedly a global trend, Dana. One statistic that sticks in my mind is that in 2012 there was an estimated 500 petabytes of digital healthcare data across the globe. That's expected to reach 25,000 petabytes by the year 2020. So, that's a 50-times increase in the amount of digital healthcare data that we expect to be retaining.

The reason for that is simply that having better data helps us drive better healthcare outcomes. And we can do it in a number of different ways. We move to what we call more evidence-based medicine, rather than subjecting people to a battery of tests, or following a script, if you like.

The tests or the activities that are undertaken with each individual are more clearly tailored, based on the symptoms they're presenting with, and data helps us make some of those decisions.

Basic medical research

The other element of it is that we're now starting to bring in more people and engage more people in basic medical research. For example, in the US, the Veterans Administration has a voluntary program that's using blood samples and health information from various military veterans. Over 150,000 have enrolled to help give us a better understanding of healthcare.

We’ve had similar programs in Iceland and other countries where we were using long-term healthcare and statistical data from the population to help us spot and address healthcare challenges before they become real problems.

The other, of course, is how we better manage healthcare data. A lot of our listeners, I’m sure, live in countries where electronic healthcare records (EHR) are a hot topic. Either there is a project under way or you may already have them, but that whole process of establishing them and making sure that those records are interchangeable is absolutely critical.

Then, of course, we have the opportunity of utilizing publicly available data. We've all heard of Google being used to identify outbreaks of flu in various countries based on the frequency with which people search for flu symptoms.

So, there’s definitely a huge number of opportunities coming from data. The challenge that we’ll find so frequently is that when we talk about big data, it's critical not just to talk about the size of the data we collect, but the variety of data. You’ve got things like structured EHR. You have unstructured clinical notes. If you’ve ever seen a doctor’s scribble, you know what I’m talking about.

You have medical imaging data, genetic data, and epidemiological data. There's a huge array of data that you need to bring together, in addition to just thinking about the size of it. Of course, overarching all of these are the regulatory and privacy issues that we have to deal with. It's a rich and fascinating topic.

Gardner: Patrick Kelly, tell us a little bit about what you see as the technical driving need to get a handle on this vast ocean of healthcare data, and the huge potential for making good use of it.

Kelly: All the points Paul brought up were spot-on. It really is a problem of how to deal with such a deluge of data. Also, there's a great change under way because of the Affordable Care Act (ACA) legislation, and that's impacting not only the business model but also driving the need to switch to electronic medical records.

Capturing data

From an EHR perspective to date, IT has focused on capturing that data. They take what's on a medical record and transpose it into an electronic format. Unfortunately, where we've fallen short in helping the business is in taking that captured data and making it useful and meaningful through analytics, helping the business gain visibility and be able to pivot and change as the pressure to change the business model is brought to bear on the industry.

Gardner: For those of our audience who are not familiar with Avnet, please describe your organization. You’ve been involved with a number of different activities, but healthcare seems to be pretty prominent in the group now. [Learn more about Avnet's Healthcare Analytics Practice.]

Kelly: Avnet has made a pretty significant investment over the last 24 months to bolster the services side of the business. We've brought around 2,000 new personnel on board to focus on everything in the ecosystem, from -- as we're talking about today -- healthcare all the way to hardware, educational services, and supporting partners like HP. We happen to be HP's largest enterprise distributor. We also have a number of critical channel partners.

In the last eight months, we came together and brought on board a number of individuals with deep expertise in healthcare and security. They're focused on building out a healthcare practice that not only provides services, but is also developing a healthcare analytics platform.

Gardner: Paul Muller, you can't buy healthcare analytics in a box. This is really a team sport; an ecosystem approach. Tell me a little bit about what Avnet is, how important they are to HP, and, of course, there are going to be more players as well.

Muller: The listeners would have heard from the HP Discover announcements over the last couple of days that Avnet and HP have come together around what we call the HAVEn platform. HAVEn, as we might have talked about previously on the show, stands for Hadoop, Autonomy, Vertica, Enterprise Security, with the "n" being any number of apps. [Learn more about the HAVEn platform.]

The "n" or any numbers of apps is really where we work together with our partners to utilize the platform, to build better big-data enabled applications. That’s really the critical capability our partners have.

What Avnet brings to the table is the understanding of the HAVEn technology, combined with deep expertise in the area of healthcare and analytics. Combining that, we've created this fantastic new capability that we’re here to talk about now.

Gardner: Back to you, Patrick. Tell me a bit about what you think are the top problems that need to be solved in order to get healthcare information and analytics to the right people in a speedy fashion. What are our hurdles to overcome here?

Kelly: If we pull back the covers and look at some of the problems or challenges around advancing analytics and modernization into healthcare, it’s really in a couple of areas. One of them is that it's a pretty big cultural change.

Significant load

Right now, we have an overtaxed IT department that’s struggling to bring electronic medical records online and to also deal with a lot of different compliance things around ICD-10 and still meet meaningful use. So, that’s a pretty significant load on those guys.

Now, they’re being asked to look at delivering information to the business side of the world. And right now, there's not a good understanding, from an enterprise-wide view, of how to use analytics in healthcare very well.

So, part of the challenge is governance and strategy and looking at an enterprise-wide road map to how you get there. From a technology perspective, there’s a whole problem around industry readiness. There are a lot of legacy systems floating around that can range from 30-year-old mainframes up to more modern systems. So there’s a great deal of work that has to go around modernizing the systems and then tying them together. That all leads to problems with data logistics and fragmentation and really just equals cost and complexity.

The traditional approaches that other industries have followed, enterprise data warehouses and extract, transform, load (ETL), are just too costly, too slow, and too difficult for healthcare systems to leverage. Finally, there are a lot of challenges in the process and the workflow.

Muller: These sound conceptual at a high level, but the impact on patient outcomes is pretty dramatic. One statistic that sticks in my head is that hospitalizations in the U.S. are estimated to account for about 30 percent of the trillions of dollars in annual healthcare costs, with around 20 percent of all hospital admissions occurring within 30 days of a previous discharge.

In other words, we’re potentially letting people go without having completely resolved their issues. Better utilizing big-data technology can have a very real impact, for example, on the healthcare outcomes of your loved ones. Any other thoughts around that, Patrick?

Kelly: Paul, you hit a really critical note around re-admissions, something that, as you mentioned, has a real impact on patient outcomes. It's also a cost driver: reimbursement rates are being reduced when hospitals fail to address the shortfalls, either in education or in follow-up care, that end up landing patients back in the ER.

You're dead on with re-admissions, and from a big-data perspective, there are two stages to look at. There's a retrospective look that is a challenge even though it's not a traditional big-data challenge. There's still a lot of data and a lot of elements to look into just to identify patients who have been readmitted and track them.

But the more exciting and interesting part of this is the predictive side: looking forward and seeing the patient's conditions, their co-morbidities, how sick they are, what kind of treatment they receive, what kind of education they received, and the follow-up care, as well as how they behave in the outside world. Then, it's bringing all that together and building a model to determine whether this person is at risk of readmission. If so, how do we target care to them to help reduce that risk?
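
What Kelly describes is, at its core, a classification problem. As a rough illustration only -- not Avnet's actual platform -- here is a minimal sketch of a 30-day readmission-risk model, assuming a hypothetical historical-discharge file and invented feature names, with scikit-learn's logistic regression standing in for whatever model a production system would use:

```python
# Hypothetical sketch of a 30-day readmission risk model.
# Feature names and "discharges.csv" are invented for illustration;
# this is the general technique, not Avnet's platform.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Each row: one discharge, with clinical and behavioral features
# and a label for whether the patient was readmitted within 30 days.
df = pd.read_csv("discharges.csv")
features = ["age", "num_comorbidities", "length_of_stay",
            "prior_admissions_12mo", "received_discharge_education",
            "followup_visit_scheduled"]
X, y = df[features], df["readmitted_within_30_days"]

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0)

model = LogisticRegression(max_iter=1000).fit(X_train, y_train)

# Score discharges; high-risk patients get targeted follow-up care.
risk = model.predict_proba(X_test)[:, 1]
print("Patients above 0.5 readmission risk:", (risk > 0.5).sum())
```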

Gardner: We certainly have some technology issues to resolve and some cultural shifts to make, but what are the goals in the medical field, in the provider organizations themselves? I'm thinking of such things as cutting costs, but more than that, things like treatments and experience, and even gaining a holistic view of a patient, regardless of where they are in the spectrum.

Waste in the system

Muller: You kind of hit it there, Dana, with cutting cost. I was reading a report today, and it was kind of shocking. There is a tremendous amount of waste in the system, as we know. It said that in the US, of the 17.6 percent of the nation's GDP that goes to healthcare, some $600 billion is potentially being misspent. A lot of that is due to unnecessary procedures and tests, as well as operational inefficiency.

From a provider perspective, it's getting a handle on those unnecessary procedures. I'll give you an example. There's been an increase in the last decade of elective deliveries, where someone comes in and says that they want to have an early delivery for whatever reason. The impact, unfortunately, is additional time in the neonatal intensive care unit (NICU) for the baby.

It drives up a lot of cost and is dangerous for both mother and child. So, getting a handle on where the waste is within their four walls, whether it's operational inefficiency, unnecessary procedures, or tests, and being able to apply Lean Six Sigma and some of these processes, is necessary to help reduce that.

Then, you mentioned treatments and how to improve outcomes. Another shocking statistic is that medical errors are the third leading cause of death in the US. In addition to that, employers end up paying almost $40,000 every time someone receives a surgical site infection.

Those medical errors can be everything from a sponge left in a patient, to a mis-dosed medication, to an infection. They all lead to a lot of unnecessary deaths, as well as driving up costs not only for the hospital but for the insurance payers. These are areas that providers will get visibility into, to understand where variation is happening and eliminate it.

Finally, a new aspect is customer experience. Reimbursements are going to be tied to -- and this is new for the medical field -- how I as a patient enjoy, for lack of a better term, my experience at the hospital or with my provider, and how engaged I become in my own care. Those are critical measures that analytics are going to help provide.

Gardner: We have a big chore ahead of us with the need to change the way IT is conducted in these organizations. Obviously, what you've just described are different ways of doing medicine based on data and analysis, but we also have this change in the way that medicine is being delivered in the US. You mentioned the ACA. We're moving from a pay-by-procedure basis much more to a pay-by-outcomes basis. This shifts and transforms things tremendously too.

Now that we have a sense of this massive challenge ahead of us, what are organizations like Avnet and providers like HP with HAVEn doing that will help us start to get a handle on this? Give us a sense, Patrick, of what you are bringing to market with the announcement in Barcelona.

Kelly: As difficult as it is to reduce complexity in any of these analytic engagements, it's very costly and time consuming to integrate any new system into a hospital. One of the key things is to be able to reduce that time to value from a system that you introduce into the hospital and use to target very specific analytical challenges.

From Avnet’s perspective, we’re bringing a healthcare platform that we’re developing around the HAVEn stack, leveraging some of those great powerful technologies like Vertica and Hadoop, and using those to try to simplify the integration task at the hospitals.

Standardized inputs

We’re building inputs from HL7, which is just a common data format within the hospital, trying to build some standardized inputs from other clinical systems, in order to reduce the heavy lift of integrating a new analytics package in the environment.
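
For readers who haven't seen it, HL7 v2 is a plain-text, pipe-delimited format: each line of a message is a segment (MSH for the header, PID for patient identity, and so on), and each segment is a list of fields. A toy parser, simplified well beyond what a real integration engine would need, gives a feel for the kind of standardized input Kelly means:

```python
# Toy HL7 v2 parser, simplified for illustration only.
# Real HL7 handling needs escape sequences, repetitions, and a full schema.
sample = (
    "MSH|^~\\&|LAB|GENERAL_HOSPITAL|EHR|CLINIC|20131212||ADT^A01|123|P|2.5\r"
    "PID|1||MRN00042^^^HOSP^MR||DOE^JANE||19700101|F\r"
)

def parse_hl7(message):
    """Split a message into segments, then each segment into fields."""
    segments = {}
    for line in message.strip().split("\r"):
        fields = line.split("|")
        segments.setdefault(fields[0], []).append(fields)
    return segments

msg = parse_hl7(sample)
pid = msg["PID"][0]
family, given = pid[5].split("^")[:2]   # PID-5 is the patient name
print("Patient:", given, family, "DOB:", pid[7])
```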

In addition, we're looking to build a unified view of the patient's data. We want to extend that beyond the walls of the hospital and build a unified platform. The idea is to put a number of different tools and modular analytics on top of that to get some very quick wins on targeted things like we've already talked about, from readmissions all the way to some blocking-and-tackling operational work. It will cover everything from patient flow to understanding capacity management.

It will bring a platform that accelerates integration and analytics delivery in the organization. In addition, we're going to wrap that in a number of services that range from early assessment, to road map and strategy, to help with business integration, all the way to continuing to build and support the product with the health system.

The goal is to accelerate delivery of the analytics, get providers the tools they need to gain visibility into the business, and empower them with a complete view of the patient.

Gardner: Paul, it's very impressive when you look at what can be done when an ecosystem comes together. When you look at applications, like what Avnet is delivering, it seems to me they're also changing the game in terms of who can use these analytics. We're seeing visualizations and we're seeing modular approaches like Patrick described. How much of a sea change are we seeing in terms of not just creating better analytics, but getting them to more people, perhaps people who had never really had access to this intelligence before?

Muller: That's a critical element. Simplicity, ease of understanding, and visualizations are an important part of it. The other is simply the ability to turn these sorts of questions around more quickly.

If you think about traditional medical studies, and even something like drug development, getting access to the data in the past, being able to have a conversation with the data, has been very difficult, because sourcing it, scrubbing it, correlating it, and processing it has taken years.

Even simple queries could take days to run. It becomes more complex when you have to do things like look for correlations across longitudinal records or understand unstructured clinical notes written by a doctor or, more importantly, by different doctors. Each of them is writing something similar, but in a different way. Then, there's the massive volume of information involved. Patrick touched on some of the behavioral aspects or lifestyle choices people make.

The ability to take all of that information at one time and have a conversation with it, where you can slice and dice it and interact with it, is another important aspect of usability and of democratizing access to some of that information. Whether it's researchers, or government officials and healthcare workers looking, for example, for potential outbreaks of disease or trying to plan a better healthcare system, it's not just great visualizations that are important. They certainly help, but it's the immediacy of interaction that is going to make the biggest difference.

Gardner: Patrick, when you make these basic infrastructure improvements, when you create a different culture to make the data analysis available fast, you start to get toward that predictive, rather than reactive, approach. Do you have some sense, or even examples, of what good can come of this? Are there some tangible benefits, some soft benefits, to get as a payback -- and fairly quickly, because we probably need to demonstrate value rather soon in this environment?

About visibility

Kelly: Dana, any first step with this is about visibility. It opens eyes around processes in the organization that are problematic, and that can be very basic, around things like scheduling in the operating room and utilization of that time, through to patients' length of stay.

A very quick win is to understand why your patients seem to be continually having problems and staying in the bed longer than they should be. It's being able, while they're filling those beds, to redirect care -- case workers, medical care, and everything necessary -- to help them get out of the hospital sooner and improve their outcomes.

A lot of times, we've seen a look of surprise when we've shown that here is a patient who has been in for 10 days for a procedure that should have been only a two-day stay, really giving visibility there. That's the first step, though a very basic one.
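
That kind of length-of-stay report is conceptually simple. A minimal sketch, with invented data and an assumed table of expected stays per procedure, shows the shape of it:

```python
# Hypothetical length-of-stay outlier report; data and column names invented.
import pandas as pd

stays = pd.DataFrame({
    "patient_id": [101, 102, 103],
    "procedure":  ["knee_replacement", "appendectomy", "knee_replacement"],
    "los_days":   [3, 10, 4],
})
# Expected stay per procedure, e.g. from historical medians or payer benchmarks.
expected = {"knee_replacement": 3, "appendectomy": 2}

stays["expected_days"] = stays["procedure"].map(expected)
stays["excess_days"] = stays["los_days"] - stays["expected_days"]

# Flag patients well past their expected discharge for case-worker review.
flagged = stays[stays["excess_days"] >= 2]
print(flagged[["patient_id", "procedure", "los_days", "expected_days"]])
```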

As we start attacking some of these problems around hospital-based infections -- helping the provider make sure they're covering all their bases, following best practices, and eliminating the variation between each physician and care provider -- you start seeing some real, tangible improvements in outcomes and in saving people's lives.

When you see that for any population -- be it stroke or, as we talked about earlier, re-admissions with heart failure -- and you're able to make sure those patients are avoiding things like pneumonia, you bring visibility.

Then, moving into predictive models and optimizing how the providers and the caregivers are working is really key. There are some quick wins there. Traditionally, we built these master repositories and then built reports on top of them, and it was a year and a half to deliver any value. We're looking instead to focus on very specific use cases and tackle them very quickly, in a 90- to 120-day period.

Gardner: Patrick, do you have any early-adopter examples you can provide for us, so that we have a sense of what types of organizations are putting this into place, what they’ve done first, and what have been the outcomes?

Kelly: We're partnering with a 12-hospital healthcare system, again dealing with some blocking and tackling around better understanding how to utilize their physician network.

A challenge for a hospital that has acquired a number of physicians is how to get visibility into those physician practices. How do you understand the kinds of things we've talked about -- cost, patient experience, outcomes -- out in the wild, in the primary care offices, and in the specialty offices? That data has traditionally just been completely segmented from the hospital systems.

The challenge is building tools that are going to be used by the physicians themselves, as well as by the hospitals at an executive level, and utilizing that information to help optimize how those practices are running. It's a basic problem for most businesses, but it's something very real for hospitals to deal with.

Massive opportunity

Gardner: Paul Muller, this seems to be a massive opportunity, something that will be going on for many years with HP, Vertica, and HAVEn. Trillions of dollars are being spent, and better approaches can give us better patient experiences, better health, and lower mortality rates. So, it's a win, win, win, right? The hospitals win, the insurers win, the governments win, the patients win, the doctors win. What sort of opportunity is this and how is HP going after it?

Muller: You've absolutely nailed the assessment there. It's an all-around benefit. A healthy society is a healthy economy. That's pretty crystal clear to everybody. The opportunity for HP and our partners is to help enable that by putting the right data at the fingertips of the people with the potential to generate lifesaving or lifestyle-improving insights. That could be developing a new drug, improving the inpatient experience, or helping us identify longer-term issues like genetic or other sorts of congenital diseases.

From our perspective, it's about providing the underlying platform technology, HAVEn, as the big-data platform. Within the great partner ecosystem that we've developed, Avnet is a wonderful example of an organization that's taken the powerful platform and very quickly turned it into something that can help not only save money but, as we just talked about, save lives, which I think is fantastic.

Gardner: Patrick, as we wrap up, we can certainly see many ways in which these technologies and this analysis can be used immediately for some very significant benefits. But I'm thinking that it also puts in place a tremendous foundation for what we know is coming in the future -- more sensors, more information coming from the patients, more telemetry coming remotely, maybe from their bodies, while they are out of the hospital.

We know that mobile devices are becoming more and more common, not only in patient environments, but in the hospitals and the care-provider organizations. We know the cloud and hybrid cloud services are becoming available and can distribute this data and integrate it across so many more types of processes.

It seems to me that you not only get a benefit from getting to a big-data analysis capability now, but it puts you in a position to be ready when we have more types of data -- more speed, more end points, and, therefore, more requirements for what your infrastructure, whether on premises or in a cloud, can do. Tell me a little bit about what you think the Avnet and HP solution does to set you up for these future trends.

Kelly: Technology today is just not where it needs to be, especially in healthcare. An EKG spits out 1,000 data points per second. There is no way, without the right technology, that you can actually deal with that.
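
Kelly's EKG figure is worth working out: at 1,000 samples per second, one monitored patient produces roughly 86 million data points a day. A back-of-the-envelope calculation (the sample rate is from the transcript; the bytes-per-sample figure is an assumption):

```python
# Back-of-the-envelope EKG volume, using the 1,000 samples/sec figure
# from the transcript. Bytes per sample is an assumption for illustration.
samples_per_second = 1_000
seconds_per_day = 24 * 60 * 60          # 86,400
bytes_per_sample = 8                    # assumed size of one stored reading

points_per_day = samples_per_second * seconds_per_day
print(f"Data points per patient per day: {points_per_day:,}")      # 86,400,000
print(f"Raw volume per patient per day: "
      f"{points_per_day * bytes_per_sample / 1e9:.2f} GB")         # ~0.69 GB
```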

If we look to a future where providers do less monitoring -- fewer vitals collected in the office, fewer physicals -- and all of that is instead coming from your mobile device and from intelligent machines, there really needs to be an infrastructure in place to deal with that.

I spent a lot of time working with Vertica even before Avnet. Vertica, Hadoop, and leveraging Autonomy in the area of unstructured data form a technology stack that is going to allow the scalability and growth necessary to make the data an asset rather than a challenge, and allow us to transform healthcare.

The key to that is unlocking this tremendous trove of data. In this industry, as you guys have said, it's very much life and death, versus purely a financial incentive.

Targeting big data

Muller: I might jump in on that, Dana. This is an important point that we can't lose sight of. As I said when you and I hosted the previous show, big data is also a big target.

One of the things that every healthcare professional, regulator, and member of the public needs to be mindful of is that this is a large accumulation of sensitive personally identifiable information (PII).

It's not just a governance issue; it's a question of morals and of making sure that we are doing the right thing by the people who are trusting us not just with their physical care, but with how they present in society. Medical information can be sensitive when available not just to criminals, but even to prospective employers, family members, and others.

The other thing we need to be mindful of is that we can't just collect the big data; we've got to secure it. We've got to be really mindful of who's accessing what, when they're accessing it, whether they're accessing it appropriately, and whether they've done something like taking a copy or moving it elsewhere that could indicate malicious intent.
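
One common form of that monitoring is comparing each user's access volume against their own baseline. A hypothetical sketch, with an invented audit-log schema, of the kind of check Muller is describing:

```python
# Hypothetical audit-log check: flag users whose record-access volume
# is far above their own historical baseline. Log schema is invented.
from collections import Counter

# (user, patient_record_id) access events for today, e.g. from an audit log.
events = [("nurse_a", 101), ("nurse_a", 102), ("dr_b", 103)] + \
         [("contractor_x", rid) for rid in range(200, 1200)]  # bulk pull

accesses_today = Counter(user for user, _ in events)
baseline = {"nurse_a": 40, "dr_b": 25, "contractor_x": 10}  # typical daily counts

for user, count in accesses_today.items():
    if count > 10 * baseline.get(user, 1):
        print(f"ALERT: {user} accessed {count} records "
              f"(baseline ~{baseline.get(user, '?')}); possible bulk copy")
```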

It's also critical we think about big data in the context of health from a 360-degree perspective.

Kelly: That's a great point. To step back a little bit on that, one of the things that brings me some comfort is that there are some very clear guidelines, in the form of HIPAA, around how this data is managed, and we look at baking the security in, in everything from the encryption to the auditability.

But it's also training the staff working in these environments and making sure that all of that training is put in place to ensure the safety of that data. One of the things that always leaves me scratching my head is that I can go down the street into the grocery store and buy a bunch of stuff. By the time I get to the register, they seem to know more about me than the hospital does when I go to the hospital.

That's one of the shocking things that makes you say you can't wait until big data gets here. I take a little comfort, too, in that there are at least laws in place to corral that data and make sure everyone is using it correctly.

Gardner: Very good. I’m afraid we’ll have to leave it there. Please join me in thanking our co-host, Paul Muller, Chief Software Evangelist at HP. Thanks so much, Paul.

Muller: Thank you for having me back on the show again, Dana. I really love being here.

Gardner: Of course, and also a thank you to the supporter of this series, HP Software. And a reminder to our audience to carry on the dialog with Paul Muller through the Discover group on LinkedIn. We've been having a discussion about how big data and healthcare are intersecting, and how there's a huge opportunity for far greater insight and intelligence into how healthcare providers are managing their patients' care, the costs, and ultimately the outcomes.

And I’d also like to remind you that you can access this, and other episodes of the HP Discover podcast series on iTunes under BriefingsDirect.

And, of course, a big thank you to our guest. We’ve been talking with Patrick Kelly, Senior Practice Manager at the Avnet Services Healthcare Practice. Thanks so much, Patrick.

Kelly: Thank you, guys.

Gardner: This is Dana Gardner, Principal Analyst at Interarbor Solutions, your co-host for this ongoing series. And lastly, a big thank you to our audience for joining this HP Discover Discussion, and a reminder to come back next time.

Listen to the podcast. Find it on iTunes. Download the transcript. Sponsor: HP.

Transcript of a BriefingsDirect on the need to tap the potential of big data to improve healthcare delivery and how the technology to do that is currently lagging. Copyright Interarbor Solutions, LLC, 2005-2013. All rights reserved.


Wednesday, November 13, 2013

Cardlytics on HP Vertica Powers Millions of Swiftly Tailored Marketing Offers to Bank Card Consumers

Transcript of a BriefingsDirect podcast on how a marketing company uses HP Vertica to match advertisers with potential customers across an ever-growing expanse of data and queries.

Listen to the podcast. Find it on iTunes. Download the transcript. Sponsor: HP.

Dana Gardner: Hello, and welcome to the next edition of the HP Discover Podcast Series. I'm Dana Gardner, Principal Analyst at Interarbor Solutions, your moderator for this ongoing discussion of IT innovation and how it’s making an impact on people’s lives.

Once again, we're focusing on how IT leaders are improving their business performance through better access, use, and analysis of their data and information. This time we're coming to you directly from the recent HP Vertica Big Data Conference in Boston.

Our next innovation case study interview highlights how data-intensive credit- and debit-card marketing services provider Cardlytics provides millions of highly tailored marketing offers to banking consumers across the United States. We'll learn more about how Cardlytics, in adopting a new analytics platform, gained huge data analysis capacity, vastly reduced query times, and now swiftly meets customer demands at massive scale.

So please join me now in welcoming Craig Snodgrass, Senior Vice President for Analytics and Product at Cardlytics Inc., based in Atlanta. Welcome, Craig. [Disclosure: HP is a sponsor of BriefingsDirect podcasts.]

Craig Snodgrass: Thanks for having me.

Gardner: At some point, you must have had a data infrastructure or legacy setup that wasn't meeting your requirements. Tell us a little bit about the journey that you've been on gaining better analytic results for your business.

Snodgrass: As with any other company, our data was growing and growing and growing. Also growing at the same time was the number of advertisers that we were working with. Since our advertisers spanned multiple categories -- they range from automotive, to retail, to restaurants, to quick-serve -- the types of questions they were asking were different.

So we had this intersection of more data and different questions happening at a vertical level. Using our existing platform, we just couldn't answer those questions in a timely manner, and we couldn't iterate around being able to give our advertisers even more insights, because it was just taking too long.

First, we weren't able to even get answers. Then, when there was the back-and-forth of wanting to understand more or get more insight, it just ended up taking longer and longer. So, at the end of the day, it came down to multiple and unstructured questions, and we just couldn't get our old systems to respond fast enough.

Gardner: Tell us a bit about Cardlytics. Who are your customers, and what do you do for them?

Growing the business

Snodgrass: Our customers are essentially anybody who wants to grow their business. That's probably a common answer, but they are advertisers. They're folks who are used to traditional media, where, when they do a TV or radio ad, they're hitting everybody: people who were going to come to their store anyway and people who probably weren't going to come to their store.

We're able to target who they want to bring into their store by looking at both debit-card and credit-card purchase data, all in an anonymized manner. We're able to look at past spending behavior and say, based on those spending behaviors, that these are the types of customers who are most likely to come to your store and, more importantly, most likely to be long-term customers for you.

We can target those customers, and we can deliver the advertising in the form of a reward, meaning the customer actually gets something for the advertising experience. We deliver that through their bank.

The bank is able to do this for their customers as well. The reward comes from the bank, and the advertiser gets a new channel to go bring in business. Then, we can track for them over time what their return on ad-spend is. That’s not an advantage they’ve had before with the traditional advertising they’ve been doing.

Gardner: So it sounds like a win, win, win. As a consumer, I'm going to get offers that are something more than a blanket campaign; it's going to be something targeted to me. The bank that's providing the credit card is going to get loyalty by having a rewards effort that works. Then, of course, those people selling goods and services have a new way of reaching and marketing those goods and services in a way they can measure.

Snodgrass: Yeah, and back to this idea of the multiple verticals. It works inside of retail, just as well as restaurants, subscriptions, and the other categories that are out there as well. So it's not just a one-category type reward.

Gardner: But to make it work, to make that value come across to the consumer, it needs to be a quality targeted effort. Therefore you need to take a lot of data and do a lot of queries.

Snodgrass: You got it. A customer will know quickly when something is not relevant. If you bring in a customer for whom it may not be relevant or they weren’t the right customer, they're not going to return.

The advertiser isn't going to get their return on ad-spend. So it's actually in both our interests to make sure we choose the right customers, because we want to get that return on ad-spend for the advertisers as well.

Gardner: Craig, what sort of volume of data are we talking about here?

Intersecting growth

Snodgrass: We're doing roughly 10 terabytes a year. From a volume standpoint, it's a combination of not just the number of transactions we're bringing in, but the number of requests, queries, and answers that we're having to run against it. That intersection of growth in volume and growth in questions is happening at the same time.

For us right now, our data is structured. I know a lot of companies are working on the unstructured piece. We're in a world where, in the payment and banking systems, the data is relatively structured, and that's what we get, which is great. Our questions are unstructured. They range from corporate real estate types of questions, to loyalty, to just random questions that have never been asked before.

One key thing that we can do for advertisers is, at a minimum, answer two large questions. What is my market share in an area? Typically, advertisers only know about customers when they come into their store and make a transaction. They don't know where that customer goes and, obviously, they don't know when people don't come into their store.

We have that full 360-degree view of what happens at the customer level, so we can answer, for a geographic area or whatever area that an advertiser wants, what is their market share and how is their market share trending week-to-week.

The other piece is targeting. There could be somebody who visits a location three times over a certain time period; you don't know if they're somebody who shops the category 30 times or only those three times. We can actually answer share-of-wallet for a customer, and you can use that in targeting, in designing your campaigns, and, more importantly, in analysis: what's going on with these customers?
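
Both of those metrics reduce to simple ratios over anonymized transactions: market share is one merchant's spend divided by total category spend in an area, and share-of-wallet is the same ratio computed per customer. A minimal sketch with invented data (Cardlytics' real pipeline obviously runs at a very different scale):

```python
# Minimal share-of-wallet / market-share sketch over anonymized transactions.
# Data and column names are invented for illustration.
import pandas as pd

txns = pd.DataFrame({
    "customer": ["c1", "c1", "c1", "c2", "c2"],
    "merchant": ["m_ours", "m_rival", "m_rival", "m_ours", "m_ours"],
    "category": ["coffee"] * 5,
    "amount":   [5.0, 4.0, 6.0, 3.0, 7.0],
})

# Market share: our merchant's spend over total category spend.
ours = txns[txns["merchant"] == "m_ours"]["amount"].sum()
print("Market share:", ours / txns["amount"].sum())          # 15 / 25 = 0.6

# Share-of-wallet: per customer, fraction of category spend we capture.
per_cust = txns.groupby("customer").apply(
    lambda g: g.loc[g["merchant"] == "m_ours", "amount"].sum()
              / g["amount"].sum())
print(per_cust)   # c1: 5/15 = 0.33 (light shopper), c2: 10/10 = 1.0 (loyal)
```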

Gardner: So this is any marketer's dream. This is what people have been trying to do and thinking about doing for decades, and now we're able to get there. One of the characteristics, though, if I understand your challenge from a data-processing perspective, is not only the volume. You're going to have many different queries hitting this at once, because you have so many different verticals and customers. The better job you do, the more queries will be generated.

Snodgrass: It's a self-fulfilling prophecy. For us, with Vertica, one of the key components isn't just the speed, but how quickly we can scale if the number of queries goes up. It's relatively easy to predict what our growth in data volume is going to be. It is not easy for me to predict what the growth in queries is going to be. As advertisers understand what types of questions we can answer, it's a ratio of 10 to 1: once they understand something, there are 10 other questions that come out of it.

We can quickly add nodes to manage the increase in query volumes, and it's cheap. This is not expensive hardware that you have to put in. That was one of the main decision points for us. Most people understand HP Vertica on the speed piece, but that and the quick scalability of the infrastructure were critical for us.

Gardner: Just as your marketing customers want to be able to predict their spend and the return on investment (ROI) from it, do you sense that you can predict and appreciate, when you scale with HP Vertica, what your costs will be? Is there a big question mark, or do you have a sense of, I do this and I have to pay that?

Snodgrass: It is the "I do this and I'll have to pay that" -- the linearity. For those who understand Vertica, that's a bit of a pun, but the linear relationship means that if we need to scale, we know exactly what to do. It's very easy to forecast. I may not know the date when I'll need to add something, but I definitely know what the cost will be when we need to add it.

Compare and contrast

Gardner: How do you measure, in addition to that predictability of cost, your benefits? Are there any speeds and feeds that you can share that compare and contrast and might help us better understand how well this works?

Snodgrass: There are two numbers. During the proof-of-concept (POC) phase, we had a set of 10 to 15 different queries that we used as a baseline. We saw anywhere from 500x to 1,000x or 1,500x speedups in getting that data back. So that's the first bullet point.

The second is that there were queries that we just couldn't get to finish. At some point, when you let it go long enough, you just don't know if it is going to converge. With Vertica, we haven't hit that limit yet.
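
A POC like the one Snodgrass describes often comes down to a small timing harness run against both the old and new systems with the same baseline queries. A generic sketch, with placeholder connection details and queries rather than Cardlytics' actual benchmark:

```python
# Generic query-timing harness of the kind used in a database POC.
# Connection details and queries are placeholders; vertica-python is an
# open-source Vertica driver, but any DB-API-style driver works the same way.
import time
import vertica_python  # assumed installed: pip install vertica-python

conn_info = {"host": "vertica.example.com", "port": 5433,
             "user": "analyst", "password": "...", "database": "marketing"}

baseline_queries = [
    "SELECT COUNT(*) FROM transactions",
    "SELECT merchant, SUM(amount) FROM transactions GROUP BY merchant",
    # ... the rest of the 10-15 representative baseline queries
]

conn = vertica_python.connect(**conn_info)
try:
    cur = conn.cursor()
    for sql in baseline_queries:
        start = time.time()
        cur.execute(sql)
        cur.fetchall()                      # force full materialization
        print(f"{time.time() - start:8.2f}s  {sql[:60]}")
finally:
    conn.close()
```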

Vertica has also allowed us to accommodate varying degrees of analyst capability when it comes to SQL writing. Some are elegant and write fantastic, very efficient queries. Others are still learning the best way to put queries together. Their queries will still always return with Vertica. In the legacy world prior to Vertica, those are the ones that just wouldn't return.

I don't know the exact number for how much more productive they are, but the fact that their queries are always returning, and returning in a timely manner, obviously has dramatically increased their productivity. So it's a hard one to measure, but forget how fast the queries return -- the productivity of our analysts has gone up dramatically.

Gardner: What could an analytics platform do better for you? What would you like to see coming down the pipeline in terms of features, function, and performance?

Snodgrass: If you can do something in SQL, Vertica is fantastic. We'd like more integration with R, more integration with SAS, more integration with these sophisticated analytics tools. If you get all the data into their systems, maybe they can manipulate it in a certain way, but then you're managing two systems.

Vertica is working on a little bit better integration with R through Distributed R, but there's also SAS. In a SAS shop, there are a lot of things that you're going to do in SAS that you are not going to do in SQL. That next level of analytics integration is where we would love to see the product go.

Gardner: Last question. Do you expect that there will be different types of data and information that you could bring to bear on this? Perhaps some sort of camera, sensor of some sort, point-of-sale information, or mobile and geospatial information that could be brought to bear? How important is it for you to have a platform that can accommodate seemingly almost any number of different information types and formats?

Snodgrass: The best way to answer that one is that we don't ever want to tell business development that the reason they can't pursue a path is because we don't have a platform that can support that.

Different paths

Today, I don't know what the future holds among these different paths, but there are so many different paths we can go down. It's not just the Vertica component, but the HP HAVEn components and the fact that they can integrate with a lot of the unstructured data -- I think they call it "the human data versus the machine data."

It's having the human data pathway open to us. We don't want to be the limiting factor for why somebody would want to do something. That's another bullet point for HP Vertica in our camp. If a business model comes out, we can support it.

Gardner: Clearly there's a revolution taking place in retail, and it sounds like you are on the vanguard of that.

Snodgrass: Yeah, I agree.

Gardner: Okay, well I'm afraid we'll have to leave it there. We've been learning how data-intensive credit-card marketing services provider Cardlytics is providing millions of highly tailored marketing offers to banking consumers and customers for marketing activities and sales across the U.S.

And we've also heard how they deployed an HP Vertica Analytics Platform to provide better analytics to deliver those insights to these many customers. So a big thank you to our guest, Craig Snodgrass, Senior Vice President for Analytics and Product at Cardlytics.

Snodgrass: Thank you, Dana.

Gardner: And thank you also to our audience for joining us for this special HP Discover Podcast coming to you directly from the recent HP Vertica Big Data Conference in Boston.

I'm Dana Gardner, Principal Analyst at Interarbor Solutions, your host for this ongoing series of HP sponsored discussions. Thanks again for joining, and come back next time.

Listen to the podcast. Find it on iTunes. Download the transcript. Sponsor: HP.

Transcript of a BriefingsDirect podcast on how a marketing company uses HP Vertica to match advertisers with potential customers across an ever-growing expanse of data and queries.
Copyright Interarbor Solutions, LLC, 2005-2013. All rights reserved.


Thursday, September 19, 2013

MZI HealthCare Identifies Big Data Patient Productivity Gems Using HP Vertica

Transcript of a BriefingsDirect podcast on how a healthcare services provider has harnessed data analytics to help its users better understand complex trends and outcomes.

Listen to the podcast. Find it on iTunes. Download the transcript. Sponsor: HP.

Dana Gardner: Hello, and welcome to the next edition of the HP Discover Performance Podcast Series. I'm Dana Gardner, Principal Analyst at Interarbor Solutions, your moderator for this ongoing discussion of IT innovation and how it’s making an impact on people’s lives.

Once again, we're focusing on how IT leaders are improving their business performance through better access, use, and analysis of their data and information. This time we're coming to you directly from the recent HP Vertica Big Data Conference in Boston.

Our next innovation case study highlights how a healthcare solutions provider leverages big-data capabilities. We'll see how they've deployed the HP Vertica Analytics Platform to help their customers better understand population healthcare trends and identify how well healthcare processes are working.

To learn more about how high performance and cost-effective big data processing forms a foundational element to improving overall healthcare quality and efficiency, please join me now in welcoming our guest, Greg Gootee, Product Manager at MZI Healthcare, based in Orlando. Welcome, Greg. [Disclosure: HP is a sponsor of BriefingsDirect podcasts.]

Greg Gootee: Hi. Thank you, Dana.

Gardner: Tell me a little bit about how important big data is turning out to be for how healthcare is being administered. It seems like there is a lot of change going on in terms of how compensation is going to take place, and information analysis seems perhaps more important than ever.

Gootee: Absolutely. When you talk about change, change in healthcare is really dramatic, maybe more dramatic than in any other industry ever. Other industries have been able to spread that change over time; in healthcare, it's being rapidly accelerated.

In the past, data had been stored in multiple systems and multiple areas on given patients. It's been difficult for providers and organizations to make informed decisions about that patient and their healthcare. So we see a lot of change in being able to bring that data together and understand it better.

Gardner: Tell us about MZI, what you do, who your customers are, and where you're going to be taking this big data ability in the future.

Gootee: MZI Healthcare has predominantly been working on the payer side. We have a product that's been on the market for over 25 years, helping with benefits administration along the lines of payers, different independent physician associations (IPAs), and third-party administrators (TPAs).

Our customers have always had a very tough time bringing in data from different sources. A little over two years ago, MZI decided to look at how we could leverage that data to help our customers better understand their risk and their patients, and ultimately change the outcomes for those patients.

Predictive analysis

Gardner: I think that's how the newer regulatory environment is lining up in terms of compensation. This is about outcomes, rather than procedures. Tell us about your requirements for big data in order to start doing more of that predictive analysis.

Gootee: If you think about how data has been stored in the past for patients across their continuum of care, as they went from facility to facility and physician to physician, it's really been spread apart. It's been difficult even to understand how the treatments are affecting that patient.

I've talked a lot about my aunt in previous interviews. Last year, she went into a coma, not because the doctors weren't doing the right thing, but because they were unable to understand what the other doctors were doing.

She went to many specialists and took medication from each one of them to help with her given problem, but what happened was that there was a medication interaction. They didn't even know if she'd come out of the coma.

These things happen every day. Doctors make informed decisions from their experience and the data that they have. So it's critical that they can actually see all the information that's available to them.

When we look at healthcare and how it's changing, for example with the Affordable Care Act, one of the main focuses is obviously cost. We all know that healthcare costs are growing at a rate that's just unsustainable, and while that's the main focus, it's different this time.

We've done that before. In the Clinton Administration, we had a kind of HMO model, and it really made a dramatic difference on cost. It was working, but it didn't give people a choice. There was no basis in outcomes, and the quality of care wasn't there.

This time around, that's probably the major difference. Not only are we trying to reduce cost, but we are trying to increase the care that's given to those patients. That's really vital to making the healthcare system a better system throughout the United States.

Gardner: Given the size of the data and its disparate nature, more and more human data will be brought to bear. What were your technical requirements, and what was the journey you took in finding the right infrastructure?

Gootee: We had a couple of requirements that were critical. When we work with small- and medium-size organizations (SMBs), they really don't have the funds to put in a large system themselves. So our goal was that we wanted to do something similar to what Apple has done with the iPhone. We wanted to take multiple things, put them into one area, and reduce that price point for our customers.

One of the critical things that we wanted to look at was overall price point. That included how we manage those systems and, when we looked at Vertica, one of the things that we found very appealing was that the management of that system is minimal.

High-end analytics

The other critical thing was speed, being able to deliver high-end analytics at the point of care, instead of two or three months later, and Vertica really produced. In fact, we did a proof of concept with them, and the speed at which some of those queries ran and the data came back to us was almost unbelievable.

You hear things like that and see it throughout the conference, no matter what volume people may have. It's very good. Those were some of our requirements, and we were able to put that in the cloud. We run in the Amazon cloud, and we're able to deliver that content to the people who need it, at the right time, at a really low price point.

Gardner: Let me understand also the requirement for concurrency. If you have this hosted on Amazon Web Services, you're then opening this up to many different organizations and many different queriers. Is there an issue with the volume of queries happening simultaneously -- concurrency? Has that been something you've been able to work through?

Gootee: Absolutely. That's another value add that we get. The ability to expand and scale the Vertica system, along with the scalability that we get with Amazon's services, allows us to deliver that information. No matter what type of queries we're getting, we can expand automatically. We can grow to meet that need, and it really makes a large difference in how we can be competitive in the marketplace.

Gardner: I suppose another dynamic to this on the economic side is the predictability of your cost -- x data volume, x queries. I can predict, perhaps even linearly, what my cost would be. Is that the case with you? I know that in the past, many organizations didn't know what the costs were going to be until they got in, and it was too late.

Gootee: If you look at the traditional ways we've delivered software or content before, you always over-buy, because you don't know what demand is going to be. Then, at some point, you don't have enough resources to deliver. Cloud services take some of that unknown away. They let you scale as you need it and scale back if you don't.

So it's the flexibility for us. We're not a large company, and what's exciting about this is that these technologies help us do the same thing that the big guys do. It really lets our small company compete in a larger marketplace.

Gardner: Going back to the population health equation and the types of data and information, we heard a presentation this morning and we saw some examples of HP HAVEn, bringing together Hadoop, Autonomy, Vertica, Enterprise Security, and then creating applications on top of that. Is this something that's of interest to you? How important is this ability to get at all the information in all the different formats as you move forward?

Gootee: That's very critical for us. The way we interact in America and around the world has changed a lot. The HAVEn platform provides us with some opportunities to improve on what we have, given healthcare's big security concerns and the issue of data mobility. Getting data anywhere is critical to us, as well as better understanding how that data is changing.

We've heard from a lot of companies here that really are driving that user experience. More and more companies are going to be competing on how they deliver things to users in the way they like. That's critical to us, and that platform really gives us the ability to do that.

Gardner: Well great. I'm afraid we'll have to leave it there. We've been learning how a healthcare solutions provider has been leveraging big-data capabilities, and we've seen how they've deployed the HP Vertica Analytics platform to help customers better understand population healthcare trends, and also to identify how well healthcare processes are working.

So a big thank you to our guest, Greg Gootee, Product Manager at MZI Healthcare. Thanks, Greg.

Gootee: Thank you, Dana.

Gardner: And thanks also to our audience for joining us for this special HP Discover Performance Podcast coming to you directly from the recent HP Vertica Big Data Conference in Boston.

I'm Dana Gardner, Principal Analyst at Interarbor Solutions, your host for this ongoing series of HP Sponsored Discussions. Thanks again for joining, and come back next time.

Listen to the podcast. Find it on iTunes. Download the transcript. Sponsor: HP.
Transcript of a BriefingsDirect podcast on how a healthcare services provider has harnessed data analytics to help its users better understand complex trends and outcomes. Copyright Interarbor Solutions, LLC, 2005-2013. All rights reserved.


Tuesday, September 10, 2013

Unum Group Architect Charts a DevOps Course to a Hybrid Cloud Future

Transcript of a BriefingsDirect podcast on how Unum Group has benefited from a better process around application development and deployment using HP tools.

Listen to the podcast. Find it on iTunes. Download the transcript. Sponsor: HP.

Dana Gardner: Hello, and welcome to the next edition of the HP Discover Performance Podcast Series. I'm Dana Gardner, Principal Analyst at Interarbor Solutions, your moderator for this ongoing discussion of IT innovation and how it’s making an impact on people’s lives.

Once again, we're focusing on how IT leaders are improving their services' performance to deliver better experiences and payoffs for businesses and end users alike, and this time we're coming to you directly from the recent HP Discover 2013 Conference in Las Vegas.

Our next innovation case study interview highlights how employee benefits provider Unum Group has been building a DevOps continuum and is exploring the benefits of a better process around applications development and deployment. And we are going to learn more about how they've been using certain tools and approaches to improve their applications delivery.

So join me in thanking our guests for being here. We're joined by Tim Durgan, an Enterprise Application Architect at Unum Group. Welcome, Tim.

Tim Durgan: Thank you, Dana.

Gardner: We're also here with Petri Maanonen, Senior Product Marketing Manager for Application Performance Management at HP Software. Welcome, Petri. [Disclosure: HP is a sponsor of BriefingsDirect podcasts.]

Petri Maanonen: Hello, Dana.

Gardner: Let's talk a little bit about what's important for your company. You're a large insurer. You're in the Fortune 500. You're one of the largest employee benefits providers in the U.S. and you have a big presence in the UK as well. What are some of the imperatives that have driven you to try to improve upon your applications delivery?

Durgan: Even though, as you said, we're one of the largest employee benefits providers in the United States, we began to realize that there were smaller companies starting to chip away in segments of the market.

It became imperative to deliver products more rapidly to the market, because delivery was a multi-year effort, which was unacceptable. If it took that long from concept to delivery, there would be a completely new market dynamic at play.

We started to look at application architectures like service-oriented architecture (SOA) to deliver agility, process automation, and rules automation -- all very mainstream approaches. We discovered pretty quickly that to use those approaches effectively you needed to have a level of governance.

Governance initiative

We had an SOA governance initiative that I led and we brought in technology from HP to aid us with that. It was the Business Service Management (BSM) suite of tools, the Systinet Repository, and some partner products from HP.

What we discovered very quickly is that in enterprise architecture, which is where I come from in the company, bringing in an operational tool like monitoring was not hailed with a "Thanks for helping us." There was organizational pushback. It became very clear to me early on that we were operating in silos. Delivery did its work and threw it over the wall to QA. QA did its job, and then we moved the application out to a production environment, where the operational teams took over.

It really dawned on me early on that we had to try to challenge the status quo around the organization. That's what started to get me focused on this DevOps idea, and HP has a number of products that are really allowing that philosophy to become a reality.

Gardner: Tell me what you think that philosophy is. Does it differ by perspective and position within organizations? As an enterprise architect, you have sort of an über role over some of these groups. How do you define DevOps?

Durgan: I have a couple of principles that I use when I talk about DevOps, and I try to use titles for these principles that are a little disruptive, so people pay attention.

For instance, I'll say "eliminate the monkeys," which essentially means you need to automate as much as possible. In many companies, the development process is filled with committees of people making decisions on criteria that are objective. Machines are very good at objective criteria. Let's save the humans for subjective things.

That's what I mean when I say eliminate the monkeys: get people out of the middle. It's really interesting, because as an architect, I recognized the value of automating business processes. But somehow I missed the fact that we need to automate the IT process, which, in a lot of ways, is what DevOps is about.
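
That automation idea can be made concrete with a small sketch. This is not Unum's actual gate; the metrics and thresholds below are hypothetical stand-ins for whatever objective criteria a committee would otherwise debate.

```python
from dataclasses import dataclass

@dataclass
class BuildMetrics:
    open_critical_defects: int   # from the defect tracker
    test_pass_rate: float        # 0.0 - 1.0, from the CI run
    p95_response_ms: float       # 95th-percentile response time in dev

def may_promote(m: BuildMetrics) -> bool:
    """Apply only objective criteria; subjective calls stay with humans."""
    return (m.open_critical_defects == 0
            and m.test_pass_rate >= 0.98
            and m.p95_response_ms <= 500)

build = BuildMetrics(open_critical_defects=0, test_pass_rate=0.99,
                     p95_response_ms=420)
print("Promote" if may_promote(build) else "Hold for human review")
```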

Another principle is "fail fast." If you're going to deliver software fast, you need to be able to fail fast. An example that I presented here at the conference last year -- which I knew most of the HP people loved -- was Palm. I'm sure they wish they had failed faster, because that was a pretty painful lesson, and a lot of companies struggle with that.

Unum does. We want to put a product out quickly, but if it's going to fail, we would love to know it's going to fail very quickly, not make millions of dollars in investments.

Another one is visibility throughout. I will say monitoring is a team sport. In a lot of companies, there are 50 or 60 monitoring tools. Each team has a monitoring tool. You have to have a secret decoder ring to use each monitoring tool.

While diversity is normally a great thing, it isn't when it comes to monitoring. You can't have the ops guy looking at data that's different from what the developer is looking at. That means you're completely hopeless when it comes to resolving issues.

Working collaboratively

My last one is "Kumbaya." A lot of IT organizations act competitively. Somehow infrastructure believes they can be successful without development and without QA and vice versa. Business sees only IT. We are a complete team and we have to work collaboratively to achieve things.

So those are really the ways I think about DevOps at the company.

Gardner: Petri, when you hear words like "process automation for IT" and a common view of the data across IT groups, it must be music to your ears?

Maanonen: Oh, sure. And Tim has very accurately captured the essence of how DevOps needs to be supported as a function and, of course, shared among the different teams and silos.

If you look at HP, we've been supporting these various teams for 15 years, whether it has been testing the performance of an application or monitoring from the end-user perspective, and so forth. We've been observing our customers -- and Unum is a brilliant example -- grow and develop their internal collaboration to support these DevOps processes. Obviously, the technology is a good supporting factor in that.

Tim mentioned the continuous-delivery demands coming from the business. We have been trying to step up, not only by developing the technology, but by very quickly bringing out supportive software-as-a-service (SaaS) offerings -- Agile Manager and Performance Anywhere, for example. Then customers can quickly adopt the supporting technology and get this collaboration and the DevOps cycle, the continuous-improvement cycle, going.

Gardner: Now, of course, this isn't just a technology discussion. When you said Kumbaya, obviously this is about getting people to see the vision, buy into the vision, and then act on the vision. So tell me a little bit more, Tim, about the politics of DevOps.

Durgan: So you're going to ask me about politics in this public interview. At Unum there is none, first of all, but I hear there is at other companies. The problem that a lot of companies have, and Unum as well, is that unfortunately we all have individual expectations and performance goals. We all have a performance review at the end of the year and things we need to do. So it is, as you mentioned, getting everybody to buy into that holistic vision and having all of these groups sign up for the DevOps vision.

We've had good success in the conversation so far at Unum. I know we've talked to our Chief Technology Officer, and he's very supportive of this. But because we're still on the journey, we want data, metrics, and some evidence to support the philosophy. I think we're making some progress in the political space, but it's still a challenge.

I'm part of the HP BSM CAB (Customer Advisory Board), and in that group, they talk about these other small monitoring products trying to chip away at HP's market. The product managers will ask, "Why is that?" And I say that part of the problem is that BSM is pitching enterprise monitoring.

The assumption is that a lot of organizations sign on to the enterprise monitoring vision. A lot of them don't, because the infrastructure team cares about the server, the application team cares about the app, and the networking team cares about the network. In a lot of ways, that's the same challenge you have in DevOps.

Requests for visibility

But I hear a lot of requests from the infrastructure and application teams for that visibility into each other's jobs, into their spaces, and that's what DevOps is pitching. DevOps is saying, "We want to give you visibility, engineer, so that you can understand what this application needs, and we want to give you visibility, developer, into what's happening in the server environment so you can partner better there."

There is a good grassroots movement on this in a lot of ways, more than a top-down. If you talk about politics, I think in a lot of cases it has to be this “Occupy IT” movement.

Gardner: What are some of the paybacks that are tangible and identifiable when DevOps is done properly, when that data is shared and there is a common view, and the automation processes gets underway?

Maanonen: What we hear from our customers, and obviously Unum is no exception, is that they're able to measure the return on investment (ROI) in the number of downtime hours avoided, or in increased productivity or revenue, simply by avoiding the application hiccups that might have happened without this collaborative approach.

Also, there's the reduction in mean time to resolve issues: they see a problem in production and, with more supporting data than before, provide the fix through their development and testing cycles. That's happening much faster than in the past.

Where it might have been taking days or weeks to get some bugs in the application fixed, this might be happening in hours now because of this collaborative process.

Gardner: Tim, what about some of the initiatives that you're bound to be facing in the future, perhaps more mobile apps, smaller apps, the whole mobile-first mentality, and then more cloud options for you to deploy your apps differently, depending on what the economics and the performance and other requirements dictate. Does DevOps put you in a better position vis-à-vis what we all seem to see coming down the pike?

Durgan: It does, if you think about the movement to the cloud, which Unum is very much looking at now. We're evaluating a cloud-first strategy. My accountability is writing that strategy.

And you start to think about, "I'm going to take this application and run it on a data center I don’t own anymore. So the need for visibility, transparency, and collaboration is even greater."

It's a philosophy that enables all of the new emerging needs, whether it's mobile, cloud, APIs, or the edge of the enterprise -- all those types of phenomena. One other major thing we didn't touch on earlier, which I would contend is a hurdle for organizations: if you think about DevOps and that visibility, data is great, but if you don't have any idea of expectations, it's just data.

What about service-level management (SLM) and ITIL processes, and the processes that predated ITIL -- this idea of defining the expected performance, availability, or whatever else for any aspect of the IT infrastructure or applications? If you don't have a mature process there, it's really hard to make any tangible progress in the DevOps space, the ALM space, or any of those areas. That's an organizational obstacle as well.

Make it real

One of the things we're doing at Unum is trying to establish SLAs beginning in dev, and that's where we take fail fast and make it real. When I came to the conference and presented it, a lot of people looked surprised. So I think it's radical.

If I can't meet that SLA in dev, there's no way I'm going to magically meet it in production without some kind of change. So that's a great enhancement. At first, people say that's an awful lot of burden, but I try to say, "Look, I'm giving you, the developer, an opportunity to fail and resolve your problem Monday through Friday, versus it going to production, failing, and you being here on the weekends, working around the clock."
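
As an illustration of an SLA enforced in dev, here is a minimal sketch; the 0.5-second target and the operation being timed are hypothetical, since the actual Unum SLAs aren't spelled out here.

```python
import time

DEV_SLA_SECONDS = 0.5  # assumed production response-time target

def handle_request():
    """Stand-in for the real operation being measured."""
    time.sleep(0.1)

def test_meets_dev_sla():
    start = time.perf_counter()
    handle_request()
    elapsed = time.perf_counter() - start
    # Failing here, Monday through Friday, beats failing in production.
    assert elapsed <= DEV_SLA_SECONDS, (
        f"{elapsed:.3f}s exceeds the {DEV_SLA_SECONDS}s dev SLA")

test_meets_dev_sla()
print("Dev SLA met")
```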

That, to me, is one of those very simple things at the heart of a DevOps philosophy, a fail-fast philosophy, and a big part of that development cycle. A lot of the DevOps tooling space right now is focused on ALM on the front end -- HP Agile Manager -- and on deployment.

Well, those are great, but as an application architect, I care about design and development. I think HP is well-positioned to do some great things with BSM, which has all that SLA data, and to integrate it with things like the Repository, which has great lifecycle management. You start having these enforcement points, and you say, "This code isn't moving unless it meets an SLA." That decision is made by the tool, on objective criteria, decided by the system. There's no need to have a human involved. It's a great opportunity for HP to do some really cutting-edge, market-leading work.

Maanonen: We see that cloud and mobile, as you mentioned, Dana, are coming into play and are increasing the velocity of the applications and services being provisioned out to end users. This larger focus on the end-user perspective -- the experience of receiving the service, whether it's a mobile or a cloud service -- is something we've been supporting through our technology as a unifying factor.

It's very important when you want to break the silos. If the teams adopt this end-user perspective, focusing on improving the end-user experience in each step of development, testing, and monitoring, it gives the teams a common language and enhances the chances of improved collaboration in the organization.

Durgan: That's a really good point. You start to hear this phrase now, the borderless enterprise, and it's so true. Whether it's mobile, cloud, or providing APIs to your customers, brokers, or third parties, that's the world we now live in. So we need to increase that quality and that speed to market. It's no longer a nice-to-have; you've got to deliver on that stuff.

If you don’t adopt DevOps principles and do some of these things around failing fast and providing holistic visibility and shared data, I just don't see how you change the game, how you move from your quarterly release cycle to a monthly, weekly, or daily release cycle. I don’t see how you do it.

Gardner: Here at HP Discover, we're hearing a lot about HAVEn, a platform that’s inclusive of many data and information types, with scale and speed and provisioning.

We're also hearing about Converged Cloud, an opportunity to play that hybrid continuum in the best way for your organization. And we heard some interesting things about HP Anywhere, going mobile, and enabling those endpoints at an agnostic level.

But after all, it’s still about the applications. If you don't have good apps -- and have a good process and methodology for delivering those apps -- all those other benefits perhaps don't pay back in the way they should.

Strong presence

So what's interesting to me is that HP may be unique in having a very strong presence in applications test, dev, and deployment, and in fostering Agile and DevOps -- something the other competitors presenting options for mobile or cloud don't have. So that's a roundabout way of asking: how essential is it to the future of HP to make people like Tim happy?

Maanonen: Tim has been pointing out that they're coming from a traditional IT environment and are now moving to the cloud very fast. So you can see the breadth of the HP portfolio. Whatever technology area you're looking at, we should be pretty well-equipped to support companies and customers like Unum, and others, in the different phases of their journey and maturity curve as they move into cloud, mobile, and so forth. We're very keen to leverage and share the experiences we've gathered over the years with different customers.

Yesterday, there were customer roundtable events and customer advisory boards, where we encourage customers to share their experiences and the best practices they've learned. Hopefully, this podcast gives other customers an avenue to hear what they should explore next.

But the portfolio breadth is one of the strengths for HP, and we're trying to stay competitive in each area. So I'm happy that you've been observing that at the conference.

Gardner: Last word to you, Tim. What would you like to see differently -- not necessarily just from a product perspective, but in terms of helping you cross the chasm from a siloed development organization and a siloed data center and production organization? What do you need to be able to improve on this DevOps challenge?

Durgan: The biggest thing HP can do for us is to continue to invest in those integrations of that portfolio, because you're right, they absolutely have great breadth of the offerings.

But I think the challenge for HP, being a company the size it is, is that it can have its own silos. You can talk to the Systinet team, then talk to the BSM team, and ask, "Am I still talking to the same company?" So the key is making that integration turnkey. The integrations we're trying to achieve use their SOA Repository, the Systinet product, as the heart of an SOA governance project.

We're integrating with Quality Center so that defects are visible in the repository and we can make an automated decision that this code moves because it has a reasonable number of defects. Zero is what we'd like to say but, let's be honest, sometimes you have to let one go if it's minor. Very minor, for any Unum people reading this.
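
A hedged sketch of what such a defect gate could look like follows; the endpoint, field names, and release identifier are hypothetical stand-ins, not the actual Systinet or Quality Center APIs.

```python
import json
from urllib.request import urlopen

# Hypothetical endpoint standing in for the repository's defect feed.
DEFECT_API = "https://repository.example.com/api/releases/{rel}/defects"

def fetch_open_defects(release: str) -> list:
    with urlopen(DEFECT_API.format(rel=release)) as resp:
        return json.load(resp)

def code_may_move(release: str) -> bool:
    """Zero defects is the goal, but one open minor defect is tolerated."""
    defects = fetch_open_defects(release)
    non_minor = [d for d in defects if d.get("severity") != "minor"]
    return not non_minor and len(defects) <= 1

if code_may_move("2013.09-rc1"):
    print("Promotion approved by policy")
else:
    print("Held: defect threshold exceeded")
```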

Then, we are integrating with BSM, because we want that SLA data and that SLM data, and we are integrating with some of their partner products.

There's great opportunity there. If that integration can be a smoother, easier, turnkey type of operation, it makes the portfolio -- that breadth -- something you can actually use to get significant traction in the DevOps space.

Gardner: Well, great. I'm afraid we will have to leave it there. We've been learning about how Unum Group has been working toward a DevOps benefit and how they've been using HP products to do so.

So join me in thanking our guests, Tim Durgan, Enterprise Application Architect at Unum Group. Thank you, Tim.

Durgan: Thank you, Dana.

Gardner: And also Petri Maanonen, Senior Product Marketing Manager for Application Performance Management at HP Software. Thank you, Petri.

Maanonen: Thank you, Dana.

Gardner: And I'd like to thank our audience as well for joining us for this special HP Discover Performance Podcast coming to you from the recent HP Discover 2013 Conference in Las Vegas.

I'm Dana Gardner, Principal Analyst at Interarbor Solutions, your host for this ongoing series of HP sponsored discussions. Thanks again for joining, and come back next time.

Listen to the podcast. Find it on iTunes. Download the transcript. Sponsor: HP.

Transcript of a BriefingsDirect podcast on how Unum Group has benefited from a better process around application development and deployment using HP tools. Copyright Interarbor Solutions, LLC, 2005-2013. All rights reserved.

You may also be interested in: