Wednesday, July 23, 2014

UK Solutions Developer Systems Mechanics Uses HP HAVEn for BI, Streaming and Data Analysis

Transcript of a sponsored BriefingsDirect podcast on making telcos more responsive to customers and operators by using big-data tools and analysis.

Listen to the podcast. Find it on iTunes. Download the transcript. Sponsor: HP.

Dana Gardner: Hello, and welcome to the next edition of the HP Discover Podcast Series. I’m Dana Gardner, Principal Analyst at Interarbor Solutions, your host and moderator for this ongoing sponsored discussion on IT innovation and how it’s making an impact on people’s lives.

Once again, we’re focusing on how companies are adapting to the new style of IT to improve IT performance and deliver better user experiences, and business results. This time, we’re coming to you directly from the recent HP Discover 2013 Conference in Barcelona.

We’re here to learn directly from IT and business leaders alike how big data, mobile, and cloud -- along with converged infrastructure -- are all supporting their goals.

Our next innovation case study focuses on how Systems Mechanics Limited is using elements of the HP HAVEn portfolio to improve how its products perform business intelligence (BI), streaming analytics, and data analysis.

To learn more, we’re here with Andy Stubley, Vice President of Sales and Marketing at Systems Mechanics, based in London. Welcome, Andy.

Andy Stubley: Hello.

Gardner: So tell us a bit about what you do at Systems Mechanics. It sounds like a very interesting organization. You've been doing a lot with data, and monetizing that in some very compelling ways.

Stubley: Yes, indeed. Systems Mechanics is a UK-based organization. We’re principally a consultancy and a software developer. We’ve been working in the telco space for the last 10-15 years. We also have a history in retail and financial services.

The focus we've had recently and the products we’ve developed into our Zen family are based on big data, particularly in telcos, as they evolve from principally old analog conversations into devices where people have smartphone applications and data becomes ever more important.

All that data and all those people connected to the network cause a lot more events that need to be managed, and that data is both a cost to the business and an opportunity to optimize the business. So we have a cost reduction we apply and a revenue upside we apply as well.

Quick example

Gardner: Can you give us a quick example, just so our listeners who might not be familiar with all of these technical terms and instances of use would understand? What’s a typical way a telco will use Zen, and what would they do with it?

Stubley: For a typical way, let’s take a scenario where you’re looking at a network and you can’t make a phone call. Two major systems are catching that information. One is a fault-management system that’s telling you there is a fault on the network, and it reports that back to the telco itself.

The second one is the performance management system. That doesn’t report faults as such, but it tells you when things like thresholds are being breached, which may have an impact on performance. Either of those can have an impact on your customer, and from a customer’s perspective, you might also be having a problem with the network that isn’t reported by either of the systems.

We’re finding that social media is getting a bigger play in this space. Why is that? Particularly the younger customers of consumer-focused telcos, mobile telcos especially, if they can’t get a signal or they can’t make a phone call, they get onto social media and they trash the brand.

They’re making noise. A trend is combining fault management and performance management, which are logical partners with social media. All of a sudden, rather than having a couple of systems, you have three.

In our world, we can put 25 or 30 different data sources on to a single Zen platform. In fact, there is no theoretical limit to the number we could support, but 20 to 30 is quite typical now. That enables us to manage all the different network elements and different types of mobile technologies: LTE, 3G, and 2G. It could be Ericsson, Nokia, Huawei, ZTE, or Alcatel-Lucent. There is an amazing range of equipment, all currently managed through separate entities. We’re offering a platform to pull it all together in one unit.

The other way I tend to look at it is that we’re trying to make a telco work the way you might view a human. We take humans as the best decision-making platforms in the world, and we probably still could claim that. As humans, we have conscious and unconscious processes running. We don’t think about breathing or pumping our blood around our system, but it’s happening all the time.

We have senses that are pulling in massive amounts of information from the outside world. You’re listening to me now. You’re probably doing a bunch of other things as well, maybe tapping away on a table. Your senses are taking in information: you’re seeing, hearing, feeling, touching, and tasting.

Those all contain information that’s coming into the body, but most of the activity is subconscious. In the world of big data, the Zen goal, and what we’re delivering in a number of places, is to bring as many actions as possible in a telco network environment to that automatic, subconscious state.

Suppose I have a problem on a network. I relate it back to the people who need to know, but I don’t require human intervention. We’re looking at a position where human intervention is reserved for looking at patterns in that information, to decide what can be done intellectually to make the business better.

That probably speaks to another point here. We use a solution with visualization, because in the world of big data, you can’t understand data in numbers. Your human brain isn’t capable of processing enough, but it is capable of identifying patterns of pictures, and that’s where we go with our visualization technology.

Gather and use data

Gardner: So your clients are able to take massive amounts of data and new types of data from a variety of different sources. Rather than be overwhelmed by that, through this analogy to being subconscious, you’re able to gather and use it.

But when something does present a point of information that’s important, you can visualize that and bring that to their attention. It’s a nice way of being the right mix of intelligence, but not overwhelming levels of data.

Stubley: Let me give you an example of that. We’ve got one customer who is one of the largest telcos in EMEA. They’re basically taking in 90,000 alarms a day from the network, across all their subsidiary companies, into one environment. But 90,000 alarms needing manual intervention is a very big number.

Using the Zen technology, we’ve been able to reduce that to 10,000 alarms. We’ve effectively taken 90 percent of the manual processing out of that environment. Now, 10,000 is still a lot of alarms to deal with, but it’s a lot less frightening than 90,000, and that’s a real impact in human terms.
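The kind of alarm reduction Stubley describes is typically achieved by correlating many related raw alarms into a single actionable one. Here is a minimal Python sketch of that idea; the field names and the grouping rule (same element, same cause, same time window) are illustrative assumptions, not the actual Zen logic:

```python
from collections import defaultdict

def correlate(alarms, window_secs=300):
    """Collapse raw alarms into one alarm per (element, cause, time window)."""
    groups = defaultdict(list)
    for a in alarms:
        bucket = a["ts"] // window_secs  # coarse time bucket
        groups[(a["element"], a["cause"], bucket)].append(a)
    return [
        {
            "element": elem,
            "cause": cause,
            "first_ts": min(x["ts"] for x in grp),
            "count": len(grp),  # how many raw alarms this one replaces
        }
        for (elem, cause, _), grp in sorted(groups.items())
    ]

raw = [
    {"ts": 10, "element": "cell-042", "cause": "LINK_DOWN"},
    {"ts": 25, "element": "cell-042", "cause": "LINK_DOWN"},
    {"ts": 40, "element": "cell-042", "cause": "LINK_DOWN"},
    {"ts": 55, "element": "cell-107", "cause": "HIGH_TEMP"},
]
print(correlate(raw))  # 4 raw alarms collapse to 2 correlated alarms
```

At production scale the same grouping idea, applied across 25 or 30 feeds, is what turns 90,000 raw alarms into a far smaller actionable set.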

Gardner: Very good. Now that we understand a bit about what you do, let’s get into how you do it. What’s beneath the covers in your Zen system that allows you to confidently say we can take any volume of data we want?

Stubley: Fundamentally, that comes down to the architecture we built for Zen. The first element is our data-integration layer. We have a technology that we developed over the last 10 years specifically to capture data in telco networks. It’s real-time and rugged and it can deal with any volume. That enables us to take anything from the network and push it into our real-time database, which is HP’s Vertica solution, part of the HP HAVEn family.

Vertica allows us to basically record any amount of data in real time and scale automatically on the HP hardware platform we also use. If we need more processing power, we can add more servers to scale transparently. That enables us to take any amount of data, which we can then process.

We have two processing layers. Referring to our earlier discussion about conscious and subconscious activity, our conscious activity is visualizing that data, and that’s done with Tableau.

We have a number of Tableau reports and dashboards with each of our product solutions. That enables us to visualize what’s happening and allows the organization, the guys running the network, and the guys looking at different elements in the data, to make their own decisions and identify what they might do.

We also have a streaming analytics engine that listens to the data as it comes into the system before it goes to Vertica. If we spot the patterns we’ve identified earlier “subconsciously,” we’ll then act on that data, which may be reducing an alarm count. It may be "actioning" something.

It may be sending someone an email. It may be creating a trouble ticket on a different system. Those all happen transparently and automatically. Simply put, it’s a four-layer solution: data capture, data integration, visualization, and automatic analytics.
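The flow described above, where streaming rules fire "subconscious" actions on events before they land in storage for later visualization, can be sketched minimally like this. The rule shapes, action names, and in-memory stores are illustrative assumptions, not the actual Zen API:

```python
store = []          # stand-in for the Vertica storage layer
actions_fired = []  # stand-in for automatic emails / trouble tickets

RULES = [
    # (pattern predicate,                        automatic action name)
    (lambda e: e["severity"] >= 4,               "open_trouble_ticket"),
    (lambda e: e["source"] == "social",          "notify_brand_team"),
]

def ingest(event):
    """Run streaming rules on an event, then persist it for reporting."""
    for predicate, action in RULES:
        if predicate(event):
            # the "subconscious" layer: act without human intervention
            actions_fired.append((action, event["id"]))
    store.append(event)  # everything still lands in storage for analysis

for ev in [
    {"id": 1, "source": "fault_mgmt", "severity": 5},
    {"id": 2, "source": "perf_mgmt",  "severity": 2},
    {"id": 3, "source": "social",     "severity": 1},
]:
    ingest(ev)

print(len(store), actions_fired)
```

The key design point is that the rule pass happens on the stream, before the database, so routine actions never wait on a human or on a batch query.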

Developing high value

Gardner: And when you have the confidence to scale your underlying architecture and infrastructure, and when you are able to visualize and develop high value for a vertical industry like telco, this allows you to expand into more lines of business in terms of products and services, and also to expand into more verticals. Where have you taken this in terms of the Zen family, and where do you take it now in terms of your market opportunity?

Stubley: We focus on mobile telcos. That’s our heritage. We can take any data source from a telco, but we can actually take any data source from anywhere, in any platform and any company. That ranges from binary to HTML. You name it, and if you’ve got data, we could load it.

That means we can build our processing accordingly. What we do is provide what we call solution packs. A solution pack is a connector to the outside world, to the network, and it grabs the data. We’ve got an element of data modeling there, so we can load the data into Vertica. Then we have pre-built reports in Tableau that allow us to interrogate it automatically. That’s at a component level.
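The solution-pack idea, bundling a source connector, a data-modeling step, and the reports shipped with it, can be sketched as a small structure like the one below. All the names and interfaces here are assumptions for illustration, not Systems Mechanics' actual product code:

```python
from dataclasses import dataclass, field
from typing import Callable, List

@dataclass
class SolutionPack:
    """One per data source: connector + data model + shipped reports."""
    name: str
    connect: Callable[[], list]       # pulls raw records from the source
    to_row: Callable[[dict], dict]    # models a raw record for the warehouse
    reports: List[str] = field(default_factory=list)

    def load(self):
        """Fetch from the source and return warehouse-shaped rows."""
        return [self.to_row(rec) for rec in self.connect()]

# A hypothetical pack for one vendor's 2G equipment:
ericsson_pack = SolutionPack(
    name="ericsson-2g",
    connect=lambda: [{"cell": "A1", "drops": 7}],
    to_row=lambda r: {"element": r["cell"], "dropped_calls": r["drops"]},
    reports=["dropped-call dashboard"],
)
print(ericsson_pack.load())
```

Because each pack presents the same interface, 25 or 30 of them can feed one platform, and cross-source analysis then happens over the common row shape.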

Once you go to a number of components, we can then look horizontally across those different items and look at the behaviors that interact with each other. If you are looking at pure telco terms, we would be looking at different network devices, the end-to-end performance of the network, but the same would apply to a fraud scenario or could apply to someone who is running cable TV.

So multi-play players are interesting, because they want to monitor what’s happening with TV as well, and that fits in exactly the same category. Realistically, anybody with high-volume, real-time data can benefit from Vertica.

Another interesting play in this scenario is social gaming and online advertising. They all have similar data characteristics: very high volumes of data that need to be analyzed and processed automatically.

Gardner: We have quite a few other organizations that are exploring how to use the available technologies to gather and exploit data. Are there any lessons learned, any hindsight perspectives you can offer other organizations, whether they’re using their own technology, third parties, or some combination? What should they keep in mind as they begin this journey?

Stubley: A lot of the lessons have been learned time and time again. Frequently, people fall into the same traps over and over again. Insanity is not learning from previous mistakes, isn’t it? What we see most often, particularly in the big-data world, and I see this in a number of different forms, is people looking for the data to be the solution, rather than solving the business problem.

The very highest level is finding what problem you’re going to solve and then using the data to solve it. You won’t identify the problems just with big data itself. It’s too big a problem and it’s irrational, if you think about it.

One of the great things we have is a number of solutions that sit on big data and enable you to start small, then step incrementally toward a big-data solution that is more encompassing, but proven at every stage of the way. It’s very classic project behavior, with proof at every point, delivery at every point, value at every point. But please, please don’t think big data is the answer in itself.

Why Vertica?

Gardner: One last question delving into the process by which you’ve crafted your architecture and capabilities. How long have you been using Vertica, and what is it that drove you to using it vis-à-vis alternatives?

Stubley: As far as the Zen family goes, we have used other technologies in the past, other relational databases, but we’ve used Vertica now for more than two-and-a-half years. We were looking for a platform that could scale and would give us real-time data. At the volumes we were looking at, nothing could compete with Vertica at a sensible price. You can build yourself almost any solution with enough money, but we haven’t got too many customers who are prepared to make that investment.

So Vertica fits in with the technology of the 21st century. A lot of the relational database appliances are using 1980s thought processes. What’s happened with processing in the last few years is that nobody shares memory anymore, and our environment requires a non-shared-memory solution. Vertica was built on that basis. It scales without limit.

Gardner: And as you mentioned, Andy, Vertica is part of the HAVEn family, and Hadoop, Autonomy, and security and compliance aspects are in there as well. How do you view things going forward, as more types and volumes of data are involved?

You’re still trying to increase your value and reduce that time to delivery for the analysis. Any thoughts about what other aspects of HAVEn might fit well into your organization?

Stubley: One of the areas we’re looking at, which I mentioned earlier, is social media. Social media is a very natural play for Hadoop, and Hadoop is clearly a very cost-effective platform for vast volumes of data, with real-time data load, but it is very slow to analyze.

So the combination of a high-volume, low-cost platform for the bulk of the data and a very high-performing real-time analytics engine is very compelling. The challenge is going to be moving the data between the two environments. That isn’t going to go away. It’s not simple, and there are a number of approaches; HP Vertica is taking some.

There is Flex Zone, and there are any number of other players in that space. The reality is that you probably reach an environment where people are parallel loading into both Hadoop and Vertica. That’s what we plan to do. It gives you much more resilience. So for a lot of the data we’re putting into our system, we’re actually planning to put the raw data files into Hadoop, so we can reload them as necessary and improve the resilience of the overall system too.
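The parallel-loading pattern Stubley describes, keeping the untouched raw feed in a bulk store for replay while the parsed rows go to the analytics store, can be sketched like this. The two stores here are in-memory stand-ins (real targets would be something like an HDFS put and a Vertica COPY), and the feed format is an assumption:

```python
import csv
import io

hadoop_archive = {}   # filename -> raw text (stand-in for HDFS)
vertica_table = []    # parsed rows (stand-in for the Vertica table)

def parallel_load(filename, raw_text):
    """Write the raw feed to the archive AND load parsed rows for analytics."""
    hadoop_archive[filename] = raw_text             # keep raw bytes for replay
    vertica_table.extend(csv.DictReader(io.StringIO(raw_text)))

def replay(filename):
    """Rebuild rows from the raw archive, e.g. after losing the analytics store."""
    return list(csv.DictReader(io.StringIO(hadoop_archive[filename])))

feed = "element,errors\ncell-042,3\ncell-107,0\n"
parallel_load("feed-001.csv", feed)

print(len(vertica_table), replay("feed-001.csv") == vertica_table)
```

The resilience win is exactly the replay path: because the raw file survives in cheap bulk storage, the fast analytics store can always be reloaded from it.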

Gardner: Well good luck with that. I hope perhaps we can hear more about that in the next HP Discover event, but I’m afraid we’ll have to leave it there for this particular venue. I’d like to thank our guest, Andy Stubley, Vice President of Sales and Marketing at Systems Mechanics Limited, based in London. Thank you so much, Andy.

Stubley: Thank you very much.

Gardner: And a big thank you to our audience, as well, for joining us in this special new style of IT discussion coming to you directly from the HP Discover 2013 Conference in Barcelona. I’m Dana Gardner, Principal Analyst at Interarbor Solutions, your host for this ongoing series of HP sponsored discussions. Thanks again for listening, and do come back next time.

Listen to the podcast. Find it on iTunes. Download the transcript. Sponsor: HP.

Transcript of a sponsored BriefingsDirect podcast on making telcos more responsive to customers and operators by using big-data tools and analysis. Copyright Interarbor Solutions, LLC, 2005-2014. All rights reserved.


Tuesday, July 15, 2014

Health Data Deluge Requires Secure Information Flow Via Standards, Says The Open Group’s New Healthcare Director

Transcript of a BriefingsDirect podcast on how new devices and practices have the potential to expand the information available to healthcare providers and facilities.

Listen to the podcast. Find it on iTunes. Download the transcript. Sponsor: The Open Group.

Dana Gardner: Hello, and welcome to a special BriefingsDirect Thought Leadership Interview coming to you in conjunction with The Open Group’s upcoming event, Enabling Boundaryless Information Flow July 21-22, 2014 in Boston.

I'm Dana Gardner, Principal Analyst at Interarbor Solutions and I'll be your host and moderator for the series of discussions from the conference on Boundaryless Information Flow, Open Platform 3.0, Healthcare, and Security issues.

One area of special interest is the healthcare arena, and Boston is a hotbed of innovation and adoption in how technology, enterprise architecture, and standards can improve the communication and collaboration among healthcare ecosystem players.

And so, we're joined by a new Forum Director at The Open Group to learn how an expected continued deluge of data and information about patients, providers, outcomes, and efficiencies is pushing the healthcare industry to rapid change.

With that, please join me now in welcoming our guest, Jason Lee, Healthcare and Security Forums Director at The Open Group. Welcome, Jason.

Jason Lee: Thank you so much, Dana. Good to be here.

Gardner: Great to have you. I'm looking forward to the Boston conference and want to remind our listeners and readers that it's not too late to sign up to attend. You can learn more at www.opengroup.org.

Jason, let’s start by talking about the relationship between Boundaryless Information Flow, which is a major theme of the conference, and healthcare. Healthcare perhaps is the killer application for Boundaryless Information Flow.

Lee: Interesting, I haven’t heard it referred to that way, but healthcare is 17 percent of the US economy. It's upwards of $3 trillion. The costs of healthcare are a problem, not just in the United States, but all over the world, and there are a great number of inefficiencies in the way we practice healthcare.

We don’t necessarily intend to be inefficient, but there are so many places and people involved in healthcare that it's very difficult to get them to speak the same language. It's almost as if you're in a large house with lots of different rooms, and in every room you walk into, they speak a different language. To get information to flow from one room to the other requires some active effort, and that’s what we're undertaking here at The Open Group.

Gardner: What is it about the current collaboration approaches that don’t work? Obviously, healthcare has been around for a long time and there have been different players involved. What are the hurdles? What prevents a nice, seamless, easy flow and collaboration in information that creates better outcomes? What’s the holdup?

Many barriers

Lee: There are many ways to answer that question, because there are many barriers. Perhaps the simplest is the transformation of healthcare from a paper-based industry to a digital industry. Everyone has walked into a medical office, looked behind the people at the front desk, and seen file upon file and row upon row of folders, information that’s kept in a written format.

When there's been movement toward digitizing that information, not everyone has used the same system. It's almost like trains running on different gauge track. Obviously if the track going east to west is a different gauge than going north to south, then trains aren’t going to be able to travel on those same tracks. In the same way, healthcare information does not flow easily from one office to another or from one provider to another.

Gardner: So not only do we have disparate strategies for collecting and communicating health data, but we're also seeing much larger amounts of data coming from a variety of new and different places. Some of them now even involve sensors inside of patients themselves or devices that people will wear. So is the data deluge, the volume, also an issue here?

Lee: Certainly. I heard recently that an integrated health plan, which has multiple hospitals involved, contains more elements of data than the Library of Congress. As information is collected at multiple points in time, over a relatively short period of time, you really do have a data deluge. Figuring out how to find your way through all the data and look at the most relevant [information] for the patient is a great challenge.

Gardner: I suppose the bad news is that there is this deluge of data, but it’s also good news, because more data means more opportunity for analysis, a better ability to predict and determine best practices, and also provide overall lower costs with better patient care.

So it seems like the stakes are rather high here to get this right, to not just crumble under a volume or an avalanche of data, but to master it, because it's perhaps the future. The solution is somewhere in there, too.

Lee: No question about it. At The Open Group, our focus is on solutions. We, like others, put a great deal of effort into describing the problems, but our focus is figuring out how to bring IT technologies to bear on business problems, how to encourage different parts of organizations to speak to one another, and across organizations to speak the same language and operate using common standards. That’s really what we're all about.

And it is, in a large sense, part of the process of helping to bring healthcare into the 21st Century. A number of industries are a couple of decades ahead of healthcare in the way they use large datasets -- big data, some people refer to it as. I'm talking about companies like big department stores and large online retailers. They really have stepped up to the plate and are using that deluge of data in ways that are very beneficial to them -- and healthcare can do the same. We're just not quite at the same level of evolution.

Gardner: And to your point, the stakes are so much higher. Retail is, of course, a big deal in the economy, but as you pointed out, healthcare is such a much larger segment. So just making modest improvements in communication, collaboration, or data analysis can reap huge rewards.

Quality side

Lee: Absolutely true. There is the cost side of things, but there is also the quality side. So there are many ways in which healthcare can improve through standardization and coordinated development, using modern technology that cannot just reduce cost, but improve quality at the same time.

Gardner: I'd like to get into a few of the hotter trends. But before we do, it seems that The Open Group has recognized the importance here by devoting the entire second day of its conference in Boston, July 22, to healthcare.

Maybe you could provide us a brief overview of what participants, and even those who come in online and view recorded sessions of the conference at http://new.livestream.com/opengroup, should expect? What's going to happen on July 22?

Lee: We have a packed day. We're very excited to have Dr. Joe Kvedar, a physician at Partners HealthCare and Founding Director of the Center for Connected Health, as our first plenary speaker. The title of his presentation is “Making Health Additive.”

Dr. Kvedar is a widely respected expert on mobile health, which is currently the Healthcare Forum’s top work priority.  As mobile medical devices become ever more available and diversified, they will enable consumers to know more about their own health and wellness. 

A great deal of potentially useful health data will be generated. How this information can be used -- not just by consumers but also by the healthcare establishment that takes care of them as patients -- will become a question of increasing importance. It will become an area where standards development and The Open Group can be very helpful.

Our second plenary speaker, Proteus Duxbury, Chief Technology Officer at Connect for Health Colorado, will discuss a major feature of the Affordable Care Act -- the health insurance exchanges -- which are designed to bring health insurance to tens of millions of people who previously did not have access to it.

He is going to talk about how enterprise architecture -- which is really about getting to solutions by helping the IT folks talk to the business folks and vice versa -- has helped the State of Colorado develop their health insurance exchange.

After the plenaries, we will break up into three tracks, one of which is healthcare-focused. In this track there will be three presentations, all of which discuss how enterprise architecture and the approach to Boundaryless Information Flow can help healthcare and healthcare decision-makers become more effective and efficient.


Care delivery

One presentation will focus on the transformation of care delivery at the Visiting Nurse Service of New York. Another will address stewarding healthcare transformation using enterprise architecture, focusing on one of our platinum members, Oracle, and a company called Intelligent Medical Objects, and how they're working together in a productive way, bringing IT and healthcare decision-making together.

Then, the final presentation in this track will focus on the development of an enterprise architecture-based solution at an insurance company. The payers, or the insurers -- the big companies that are responsible for paying bills and collecting premiums -- have a very important role in the healthcare system that extends beyond administration of benefits. Yet, payers are not always recognized for their key responsibilities and capabilities in the area of clinical improvements and cost improvements.

With the increase in payer data brought on in large part by the adoption of a new coding system -- the ICD-10 -- which will come online this year, there will be a huge amount of additional data, including clinical data, that becomes available. At The Open Group, we consider payers -- health insurance companies (some of which are integrated with providers) -- very important stakeholders in the big picture.

In the afternoon, we're going to switch gears a bit and have a speaker talk about the challenges, the barriers, the “pain points” in introducing new technology into healthcare systems. The focus will return to remote or mobile medical devices and the predictable but challenging barriers to getting newly generated health information to flow to doctors’ offices and into patients' electronic health records and hospitals' data-keeping and data-sharing systems.

We'll have a panel of experts that responds to these pain points, these challenges, and then we'll draw heavily from the audience, who we believe will be very, very helpful, because they bring a great deal of expertise in guiding us in our work. So we're very much looking forward to the afternoon as well.

Gardner: I'd also like to remind our readers and listeners that they can take part in this by attending the conference, and there is information about that at the opengroup.org website.

It's really interesting. A couple of these different plenaries and discussions in the afternoon come back to this user-generated data. Jason, we really seem to be on the cusp of a whole new level of information that people will be able to develop from themselves through their lifestyle, new devices that are connected.

We hear from folks like Apple, Samsung, Google, and Microsoft. They're all pulling together information and making it easier for people to not only monitor their exercise, but their diet, and maybe even start to use sensors to keep track of blood sugar levels, for example.

In fact, a new Flurry Analytics survey showed a 62 percent increase in the use of health and fitness applications over the last six months on popular mobile devices. This compares to a 33 percent increase in applications in general. So there's an 87 percent faster uptick in the use of health and fitness applications.

Tell me a little bit how you see this factoring in. Is this a mixed blessing? Will so much data generated from people in addition to the electronic medical records, for example, be a bad thing? Is this going to be a garbage in, garbage out, or is this something that could potentially be a game changer in terms of how people react to their own data -- and then bring more data into the interactions they have with healthcare providers?

Challenge to predict

Lee: It's always a challenge to predict what the market is going to do, but I think that’s a remarkable statistic that you cited. My prediction is that the increased volume of person-generated data from mobile health devices is going to be a game changer. This view also reflects how the Healthcare Forum members (which includes members from Capgemini, Philips, IBM, Oracle and HP) view the future.

The commercial demand for mobile medical devices, things that can be worn, embedded, or swallowed, as in pills, as you mentioned, is growing ever greater. The software and the applications that will be developed to be used with these devices are going to grow by leaps and bounds.

As you say, there are big players getting involved. Already some of the pedometer-type devices that measure the number of steps taken in a day have captured the interest of many, many people. Even David Sedaris, serious guy that he is, was writing about it recently in The New Yorker.

What we will find is that many of the health indicators that we used to have to go to the doctor or nurse or lab to get information on will become available to us through these remote devices.

There will be a question, of course, as to the reliability and validity of the information, to your point about garbage in, garbage out, but I think standards development will help here. This, again, is where The Open Group comes in. We might also see the FDA exercising its role in ensuring safety here, as well as other organizations, in determining which devices are reliable.

The Open Group is working in the area of mobile data and the information systems developed around it, and their ability to (a) talk to one another, and (b) talk to the data devices and infrastructure used in doctors’ offices and in hospitals. This is called interoperability, and it's certainly lacking in this country.

There are already problems around interoperability and connectivity of information in the healthcare establishment as it is now. When patients and consumers start collecting their own data, and the patient is put at the center of the nexus of healthcare, then the question becomes how does that information that patients collect get back to the doctor/clinician in ways in which the data can be trusted and where the data are helpful?

After all, if a patient is wearing a medical device, there is the opportunity to collect data, about blood-sugar level let's say, throughout the day. And this is really taking healthcare outside of the four walls of the clinic and bringing information to bear that can be very, very useful to clinicians and beneficial to patients.

In short, the rapid market dynamic in mobile medical devices, and in the software and hardware that facilitates interoperability, begs for standards-based solutions that reduce costs and improve quality, all while putting the patient at the center. This is The Open Group’s Healthcare Forum’s sweet spot.

Game changer

Gardner: It seems to me a real potential game changer as well, one in which something like Boundaryless Information Flow and standards will play an essential role, because one of the big question marks with many of the ailments in modern society has to do with lifestyle and behavior.

So often, the providers of the care only really have the patient’s responses to questions, but imagine having a trove of data at their disposal, a 360-degree view of the patient to then further the cause of understanding what's really going on, on a day-to-day basis.

But then, it's also having a two-way street: being able to deliver, perhaps in an automated fashion, reinforcements, incentives, and information back to the patient in real time about behavior and lifestyle. So it strikes me as something quite promising, and I look forward to hearing more about it at the Boston conference.

Any other thoughts on this issue about patient flow of data, not just among and between providers and payers, for example, or providers in an ecosystem of care, but with the patient as the center of it all, as you said?

Lee: As more mobile medical devices come to market, we'll find that consumers own multiple types of devices, at least some of which collect multiple types of data. So even for the patient, being at the center of their own healthcare information collection, there can be barriers to having one device talk to another. If a patient wants to keep their own personal health record, there may be difficulties in bringing all that information into one place.
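To make that aggregation problem concrete, here is a purely illustrative sketch. The device names, field names, and record formats are invented for this example; they simply stand in for the mismatched exports that interoperability standards aim to eliminate. Normalizing readings from two hypothetical devices into one chronological personal record might look like:

```python
from datetime import datetime

# Hypothetical raw exports from two wearables, each with its own field
# names -- the kind of mismatch patients face when combining devices.
pedometer_export = [
    {"ts": "2014-07-21T08:00:00", "stepCount": 1200},
    {"ts": "2014-07-21T12:00:00", "stepCount": 4300},
]
glucose_export = [
    {"time": "2014-07-21T07:30:00", "mg_dl": 96},
    {"time": "2014-07-21T11:45:00", "mg_dl": 141},
]

def normalize(pedometer, glucose):
    """Map both device formats onto one shared record shape."""
    unified = []
    for r in pedometer:
        unified.append({
            "when": datetime.fromisoformat(r["ts"]),
            "metric": "steps",
            "value": r["stepCount"],
        })
    for r in glucose:
        unified.append({
            "when": datetime.fromisoformat(r["time"]),
            "metric": "blood_glucose_mg_dl",
            "value": r["mg_dl"],
        })
    # One chronological personal health record.
    return sorted(unified, key=lambda r: r["when"])

record = normalize(pedometer_export, glucose_export)
for r in record:
    print(r["when"].isoformat(), r["metric"], r["value"])
```

Each new device format means another hand-written mapping like the ones above, which is exactly the burden that agreed-upon representations of health data would remove.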

So the interoperability issue, the need for standards, guidelines, and voluntary consensus among stakeholders about how information is represented becomes an issue, not just between patients and their providers, but for individual consumers as well.

Gardner: And also the cloud providers. There will be a variety of large organizations with cloud-modeled services, and they are going to need to be, in some fashion, brought together, so that a complete 360-degree view of the patient is available when needed. It's going to be an interesting time.

Of course, we've also looked at many other industries and tried to have a cloud synergy, a cloud-of-clouds approach to data and transactions. So it’s interesting how what's going on across multiple industries is common, but it strikes me that, again, the scale and the impact of the healthcare industry make it a leader now, and perhaps a driver for some of these long-overdue structuring and standardization activities.

Lee: It could become a leader; there is no question about it. Moreover, there is a lot healthcare can learn from other industries: from the mistakes they have made, the lessons they have learned, and the best practices they have developed, both on the content and process side. And there are issues, around security in particular, where healthcare will be at the leading edge in trying to figure out how much is enough, how much is too much, and what kinds of solutions work.

There's a great future ahead here. It's not going to be without bumps in the road, but organizations like The Open Group are designed and experienced to help multiple stakeholders come together and have the conversations that they need to have in order to push forward and solve some of these problems.

The conference

Gardner: Well, great. I'm sure there will be a lot more about how to actually implement some of those concepts at the conference. Again, that’s going to be in Boston, beginning July 21, 2014.

We'll have to leave it there. We've been talking with a new Director at The Open Group to learn how an expected continued deluge of data and information about patients, providers, outcomes, and efficiencies is working to push the healthcare industry toward rapid change. And, as we've heard, that might very well spill over into other industries as well.

So we've seen how innovation and adaptation around technology, enterprise architecture and standards can improve the communication and collaboration among healthcare ecosystem players.

This special BriefingsDirect discussion comes to you in conjunction with The Open Group, Boston, 2014 beginning on July 21. It's not too late to register (http://www.opengroup.org/boston2014) and join the conversation via Twitter #ogchat #ogBOS, where you will be able to learn more about Boundaryless Information Flow, Open Platform 3.0, healthcare and other relevant topics.

So a big thank you to our guest, Jason Lee, Healthcare and Security Forums Director at The Open Group. Thanks so much, Jason.

Lee: Thank you, very much.

Gardner: This is Dana Gardner, Principal Analyst at Interarbor Solutions, your host and moderator throughout these Thought Leadership Interviews. Thanks again for listening, and do come back next time.

Listen to the podcast. Find it on iTunes. Download the transcript. Sponsor: The Open Group.

Transcript of a BriefingsDirect podcast on how new devices and practices have the potential to expand the information available to healthcare providers and facilities. Copyright The Open Group and Interarbor Solutions, LLC, 2005-2014. All rights reserved.

You may also be interested in:

Monday, July 14, 2014

HP Network Management Advances Heighten Performance While Reducing Total Costs for Nordic Telco TDC

Transcript of a BriefingsDirect podcast on how HP network management tools help companies adapt to the new style of IT to deliver better user experiences.

Listen to the podcast. Find it on iTunes. Download the transcript. Sponsor: HP

Dana Gardner: Hello, and welcome to the next edition of the HP Discover Podcast Series. I’m Dana Gardner, Principal Analyst at Interarbor Solutions, your host and moderator for this ongoing sponsored discussion on IT innovation and how it’s making an impact on people’s lives.

Gardner
Once again, we’re focusing on how companies are adapting to the new style of IT to improve IT performance and deliver better user experiences, and business results. This time, we’re coming to you directly from the HP Discover 2013 Conference in Barcelona.

We’re here the week of December 9 to learn directly from IT and business leaders alike how big data, mobile, and cloud, along with converged infrastructure, are all supporting their goals.

Our next innovation case study interview highlights how TDC is using network-management improvements to better manage and get better performance from their networks across several Nordic countries. To explain how they’re doing that, we’re joined by Lars Niklasson, the Senior Consultant at TDC in Stockholm. Welcome, Lars.

Lars Niklasson: Thank you.

Gardner: First of all, tell us about TDC, what you do, and how large your installations are.

Niklasson: TDC is an operator in the Nordic region, with a network covering Norway, Sweden, Finland, and Denmark. In Sweden, we’re also an integrator and play quite a big consulting role.

Gardner: And you have a number of main businesses in your organization. There’s TDC Solutions and mobile. There’s even television and some other hosting. Explain for us how large your organization is. It sounds quite big.

Niklasson
Niklasson: Yeah, in Sweden we’re around 800 people, and the whole TDC group is almost 10,000 people.

Gardner: So it’s obviously a very significant network to support this business and deliver the telecommunication services. Maybe you could define your network for us.

Niklasson: It's quite big, over 50,000 devices, and everything is monitored of course. It’s a state-of-the-art network.

Gardner: When you have so many devices to track, so many types of layers of activity and levels of network operations, how do you approach keeping track of that and making sure that you’re not only performing well, but performing efficiently?

Niklasson: Many years ago, we implemented Network Node Manager (NNM) and we have several network operating centers in all countries using NNM. When HP released different smart plug-ins, we started to implement those too for the different areas that they support, such as quality assurance, traffic, and so on.

Gardner: So you’ve been using HP for your network management and HP Network Management Center for some time, and it has of course evolved over the years. What are some of the chief attributes that you like or requirements that you have for network operations, and why has the HP product been so strong for you?

Quick and easy

Niklasson: One thing is that it has to be quick and easy to manage. We have lots of changes all the time, especially in Sweden, whenever a new customer comes on board. And in Sweden, we’re also monitoring end customers’ networks.

It's also very important to be able to integrate it with the other systems that we have. So we can, for example, tell which service-level agreement (SLA) a particular device has and things like that. NNM makes this quite efficient.

Gardner: One of the things that I’ve heard people struggle with is the amount of data that’s generated from networks that then they need to be able to sift through and discover anomalies. Is there something about visualization or other ways of digesting so much data that appeals to you?

Niklasson: NNM is quite good at finding the root cause. You don’t get very many incidents when something happens. If I look back at other products and older versions, there were lots and lots of incidents and alarms. Now, I find it quite easy to manage and configure NNM so it's monitoring the correct things and listening to the correct traps and so on.

Gardner: TDC uses network-management capabilities itself and also sells them, providing them alongside its telecom services. How has this played out in the field? Do any of your customers manage their own networks, and how has this been for your consumers of network services?

Niklasson: We’re also an HP partner in selling NNM to end customers. Part of my work is helping customers implement this in their own environment. Sometimes a customer doesn’t want to do that. They buy the service from us, and we monitor the network. It’s for different reasons. One could be security, and they don’t allow us to access the network remotely. They prefer to have it in-house, and I help them with these projects.

Gardner: Lars, looking to the future, are there any particular types of technology improvements that you would like to see or have you heard about some of the roadmaps that HP has for the whole Network Management Center Suite? What interests you in terms of what's next?

Niklasson: I would say two things. One is application visibility in the network. We have some of that with the traffic plug-in, but it's still NetFlow-based, so I’m interested in seeing deeper inspection of the traffic. The other is more visibility into the virtual environments that we have.
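The limitation Niklasson points to is that flow-based monitoring works on summarized records rather than packet payloads. As a rough sketch, with record fields simplified and invented for illustration rather than the real NetFlow v5/v9 wire format, aggregating flows into per-port traffic totals might look like:

```python
from collections import defaultdict

# Simplified flow records (real NetFlow uses a binary export format
# with many more fields; this only illustrates the aggregation idea).
flows = [
    {"src": "10.0.0.5", "dst": "10.0.1.9", "app_port": 443, "bytes": 52000},
    {"src": "10.0.0.5", "dst": "10.0.2.3", "app_port": 443, "bytes": 13000},
    {"src": "10.0.0.7", "dst": "10.0.1.9", "app_port": 25,  "bytes": 4000},
]

def bytes_by_port(records):
    """Sum traffic volume per application port -- the flow-level view."""
    totals = defaultdict(int)
    for r in records:
        totals[r["app_port"]] += r["bytes"]
    return dict(totals)

print(bytes_by_port(flows))
```

Because only headers and byte counts are summarized, two very different applications on the same port are indistinguishable here, which is why deeper packet inspection is needed for true application visibility.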

Gardner: All right. I think that’s about all we have time for. I’d really like to thank you for your time. We have been here with Lars Niklasson. He is the Senior Consultant at TDC in Stockholm. Thank you so much.

Niklasson: Thank you.

Gardner: And I’d also like to thank our audience for joining us for this special new style of IT discussion coming to you directly from the HP Discover 2013 Conference in Barcelona.

I’m Dana Gardner, Principal Analyst at Interarbor Solutions, your host for this ongoing series of HP sponsored discussions. Thanks again for joining, and come back next time.

Listen to the podcast. Find it on iTunes. Download the transcript. Sponsor: HP.

Transcript of a BriefingsDirect podcast on how HP network tools help companies adapt to the new style of IT to deliver better user experiences. Copyright Interarbor Solutions, LLC, 2005-2014. All rights reserved.

You may also be interested in: