
Wednesday, July 23, 2014

UK Solutions Developer Systems Mechanics Uses HP HAVEn for BI, Streaming and Data Analysis

Transcript of a sponsored BriefingsDirect podcast on making telcos more responsive to customers and operators by using big-data tools and analysis.

Listen to the podcast. Find it on iTunes. Download the transcript. Sponsor: HP.

Dana Gardner: Hello, and welcome to the next edition of the HP Discover Podcast Series. I’m Dana Gardner, Principal Analyst at Interarbor Solutions, your host and moderator for this ongoing sponsored discussion on IT innovation and how it’s making an impact on people’s lives.

Once again, we’re focusing on how companies are adapting to the new style of IT to improve IT performance and deliver better user experiences and business results. This time, we’re coming to you directly from the recent HP Discover 2013 Conference in Barcelona.

We’re here to learn directly from IT and business leaders alike how big data, mobile, and cloud -- along with converged infrastructure -- are all supporting their goals.

Our next innovation case study focuses on how Systems Mechanics Limited is using elements of the HP HAVEn portfolio to improve how its products perform in processes such as business intelligence (BI), streaming analytics, and data analysis.

To learn more, we’re here with Andy Stubley, Vice President of Sales and Marketing at Systems Mechanics, based in London. Welcome, Andy.

Andy Stubley: Hello.

Gardner: So tell us a bit about what you do at Systems Mechanics. It sounds like a very interesting organization. You've been doing a lot with data, and monetizing that in some very compelling ways.

Stubley: Yes, indeed. Systems Mechanics is a UK-based organization. We’re principally a consultancy and a software developer. We’ve been working in the telco space for the last 10 to 15 years. We also have a history in retail and financial services.

The focus we’ve had recently, and the products we’ve developed into our Zen family, are based on big data, particularly in telcos, as they evolve from carrying principally old analog conversations to supporting devices where people have smartphone applications and data becomes ever more important.

All that data and all those people connected to the network cause a lot more events that need to be managed, and that data is both a cost to the business and an opportunity to optimize the business. So we have a cost reduction we apply and a revenue upside we apply as well.

Quick example

Gardner: Can you give us a quick example, just so our listeners who might not be familiar with all of these technical terms and instances of use would understand? What’s a typical way a telco will use Zen, and what would they do with it?

Stubley: For a typical way, let’s take a scenario where you’re looking at a network and you can’t make a phone call. Two major systems are catching that information. One is a fault-management system that’s telling you there is a fault on the network, and it reports that back to the telco itself.

The second one is the performance-management system. That doesn’t specifically flag faults, but it tells you if things like thresholds are being breached, which may have an impact on performance. Either of those can have an impact on your customer, and from a customer’s perspective, you might also be having a problem with the network that isn’t reported by either of those systems.

We’re finding that social media is getting a bigger play in this space. Why is that? Particularly with younger populations on consumer mobile telcos, if they can’t get a signal or can’t make a phone call, they get onto social media and they trash the brand.

They’re making noise. So a trend is to combine fault management and performance management, which are logical partners, with social media. All of a sudden, rather than having a couple of systems, you have three.

In our world, we can put 25 or 30 different data sources onto a single Zen platform. In fact, there is no theoretical limit to the number we could handle, but 20 to 30 is quite typical now. That enables us to manage all the different network elements and different types of mobile technologies: LTE, 3G, and 2G. It could be Ericsson, Nokia, Huawei, ZTE, or Alcatel-Lucent. There is an amazing range of equipment, all currently managed through separate entities. We’re offering a platform to pull it all together in one unit.

The other way I tend to look at it is that we’re trying to get the telco to work the way you might view a human. We regard humans as the best decision-making platforms in the world, and we can probably still claim that. As humans, we have conscious and unconscious processes running. We don’t think about breathing or pumping our blood around our system, but it’s happening all the time.

We have senses that are pulling in massive amounts of information from the outside world. You’re listening to me now. You’re probably doing a bunch of other things as well, maybe tapping away on a table. Your senses are gathering information as you see, hear, feel, touch, and taste.

Those all contain information that’s coming into the body, but most of the activity is subconscious. In the world of big data, that is the Zen goal: what we’re delivering in a number of places is to make as many actions as possible in a telco or network environment happen in that automatic, subconscious state.

Suppose I have a problem on a network. I relate it back to the people who need to know, but I don’t require human intervention. We’re looking at a position where the human intervention is about looking at patterns in that information to decide what they can do intellectually to make the business better.

That probably speaks to another point here. We use a solution with visualization, because in the world of big data, you can’t understand data in numbers. Your human brain isn’t capable of processing enough, but it is capable of identifying patterns in pictures, and that’s where we go with our visualization technology.

Gather and use data

Gardner: So your clients are able to take massive amounts of data and new types of data from a variety of different sources. Rather than being overwhelmed by that, through this analogy to the subconscious, they’re able to gather and use it.

But when something does present a point of information that’s important, you can visualize that and bring it to their attention. It’s a nice way of getting the right mix of intelligence without overwhelming levels of data.

Stubley: Let me give you an example of that. We’ve got one customer that is one of the largest telcos in EMEA. They’re taking in 90,000 alarms a day from the network, across their subsidiary companies, all into one environment. But 90,000 alarms needing manual intervention is a very big number.

Using the Zen technology, we’ve been able to reduce that to 10,000 alarms. We’ve effectively taken 90 percent of the manual processing out of that environment. Now, 10,000 is still a lot of alarms to deal with, but it’s a lot less frightening than 90,000, and that’s a real impact in human terms.
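
For illustration only, here is a minimal sketch of the kind of alarm de-duplication that underlies a reduction like that: repeated alarms from the same network element are collapsed into one actionable alarm per correlation window. The field names, the 15-minute window, and the rule itself are assumptions made for the sketch, not Systems Mechanics' actual Zen logic.

    from datetime import timedelta

    WINDOW = timedelta(minutes=15)  # assumed correlation window

    def correlate(alarms):
        """alarms: iterable of dicts with 'element', 'fault_code', and 'timestamp' (datetime)."""
        last_seen = {}      # (element, fault_code) -> time of the last alarm we raised
        actionable = []
        for alarm in sorted(alarms, key=lambda a: a["timestamp"]):
            key = (alarm["element"], alarm["fault_code"])
            previous = last_seen.get(key)
            if previous is None or alarm["timestamp"] - previous > WINDOW:
                actionable.append(alarm)            # first occurrence in this window: raise it
                last_seen[key] = alarm["timestamp"]
            # otherwise it repeats an alarm already raised, so it is suppressed
        return actionable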

Gardner: Very good. Now that we understand a bit about what you do, let’s get into how you do it. What’s beneath the covers in your Zen system that allows you to confidently say we can take any volume of data we want?

Stubley: Fundamentally, that comes down to the architecture we built for Zen. The first element is our data-integration layer. We have a technology that we developed over the last 10 years specifically to capture data in telco networks. It’s real-time and rugged and it can deal with any volume. That enables us to take anything from the network and push it into our real-time database, which is HP’s Vertica solution, part of the HP HAVEn family.

Vertica allows us to basically record any amount of data in real time and scale automatically on the HP hardware platform we also use. If we need more processing power, we can add more servers to scale transparently. That enables us to take in any amount of data, which we can then process.

We have two processing layers. Referring to our earlier discussion about conscious and subconscious activity, our conscious activity is visualizing that data, and that’s done with Tableau.

We have a number of Tableau reports and dashboards with each of our product solutions. That enables us to visualize what’s happening and allows the organization, the guys running the network, and the guys looking at different elements in the data to make their own decisions and identify what they might do.

We also have a streaming analytics engine that listens to the data as it comes into the system before it goes to Vertica. If we spot the patterns we’ve identified earlier “subconsciously,” we’ll then act on that data, which may be reducing an alarm count. It may be "actioning" something.

It may be sending someone an email. It may be creating a trouble ticket on a different system. Those all happen transparently and automatically. It’s four layers simplifying the solution: data capture, data integration, visualization, and automatic analytics.
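
Purely as an illustration of that "subconscious" streaming layer, here is a minimal sketch in which each incoming event is checked against pattern rules before it is handed to the database loader, and any match fires an automatic action. The rules, event fields, and actions shown are assumptions, not the actual Zen engine.

    # (predicate, action) pairs; both are illustrative assumptions
    RULES = [
        (lambda e: e.get("severity") == "critical",
         lambda e: print(f"raise trouble ticket for {e['element']}")),
        (lambda e: e.get("kpi") == "call_drop_rate" and e.get("value", 0) > 0.05,
         lambda e: print(f"email on-call team: drop rate {e['value']:.0%}")),
    ]

    def process_stream(events, load_to_db):
        """Inspect each event as it arrives, fire matching actions, then load it."""
        for event in events:
            for predicate, action in RULES:
                if predicate(event):
                    action(event)     # automatic, no human intervention required
            load_to_db(event)         # the raw event still lands in the database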

Developing high value

Gardner: And when you have the confidence to scale your underlying architecture and infrastructure, and when you’re able to visualize and deliver high value to a vertical industry like telco, that allows you to expand into more lines of business, in terms of products and services, and also to expand into more verticals. Where have you taken this in terms of the Zen family, and where do you take it now in terms of your market opportunity?

Stubley: We focus on mobile telcos. That’s our heritage. We can take any data source from a telco, but we can actually take any data source from anywhere, on any platform and in any company. That ranges from binary to HTML. You name it: if you’ve got data, we can load it.

That means we can build our processing accordingly. What we do is position what we call solution packs. A solution pack is a connector to the outside world, to the network, and it grabs the data. We’ve got an element of data modeling there, so we can load the data into Vertica. Then, we have already-built reports in Tableau that allow us to interrogate it automatically. That’s at a component level.
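
As a rough sketch of what a solution pack's load step could look like, the snippet below bulk-loads a connector's CSV output into Vertica with the open-source vertica-python driver. The table name, connection details, and file layout are assumptions; this is not the Zen connector itself.

    import vertica_python

    # Hypothetical connection details for the sketch
    conn_info = {"host": "vertica.example.net", "port": 5433,
                 "user": "zen_loader", "password": "...", "database": "zen"}

    def load_alarms(csv_path):
        """Bulk-load one connector output file into an assumed Vertica fact table."""
        with vertica_python.connect(**conn_info) as connection:
            cursor = connection.cursor()
            with open(csv_path, "rb") as feed:
                cursor.copy(
                    "COPY telco.fault_alarms FROM STDIN DELIMITER ',' ABORT ON ERROR",
                    feed,
                )
            connection.commit()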

Once you go to a number of components, we can then look horizontally across those different items and look at the behaviors as they interact with each other. In pure telco terms, we would be looking at different network devices and the end-to-end performance of the network, but the same would apply to a fraud scenario, or to someone who is running cable TV.

So multi-play players are interesting, because they want to monitor what’s happening with TV as well, and that fits into exactly the same category. Realistically, anybody with high-volume, real-time data can benefit from Vertica.

Another interesting play in this scenario is social gaming and online advertising. They all have similar data characteristics, very high volume and fixed data that needs to be analyzed and processed automatically.

Gardner: We have quite a few other organizations that are exploring how to use the available technologies to gather and exploit data. Are there any lessons learned, any hindsight perspectives, you can provide to other organizations, whether they’re using their own technology, third parties, or some combination? What should they keep in mind as they begin this journey?

Stubley: A lot of the lessons have been learned time and time again. Frequently, people fall into the same traps over and over again. Insanity is not learning from previous mistakes, isn’t it? What we see most often, particularly in the big-data world, and I see this in a number of different forms, is people looking for the data to be the solution, rather than solving the business problem.

The very highest level is finding what problem you’re going to solve and then using the data to solve it. You won’t identify the problems just with big data itself. It’s too big a problem and it’s irrational, if you think about it.

One of the great things we have is a number of solutions that sit on a big-data platform and enable you to make a start, and then move, in small steps, to a big-data solution that is more encompassing, but proven at every stage of the way. It’s very classic project behavior, with proof at every point, delivery at every point, and value at every point. But please, please don’t think big data is the answer.

Why Vertica?

Gardner: One last question delving into the process by which you’ve crafted your architecture and capabilities. How long have you been using Vertica, and what is it that drove you to using it vis-à-vis alternatives?

Stubley: As far as the Zen family goes, we have used other technologies in the past, other relational databases, but we’ve used Vertica now for more than two-and-a-half years. We were looking for a platform that could scale and would give us real-time data. At the volumes we were looking at, nothing could compete with Vertica at a sensible price. You can build yourself a solid solution with enough money, but we haven’t got too many customers who are prepared to make that investment.

So Vertica fits in with the technology of the 21st century. A lot of the relational database appliances are using 1980s thought processes. What’s happened with processing in the last few years is that nobody shares memory anymore, and our environment requires a non-shared-memory solution. Vertica has been built on that basis and can scale without limit.

Gardner: And as you mentioned, Andy, Vertica is part of the HAVEn family, and Hadoop, Autonomy, and the enterprise security and compliance aspects are in there as well. How do you view things going forward, as more types and volumes of data are involved?

You’re still trying to increase your value and reduce that time to delivery for the analysis. Any thoughts about what other aspects of HAVEn might fit well into your organization?

Stubley: One of the areas we’re looking at, which I mentioned earlier, is social media. Social media is a very natural play for Hadoop, and Hadoop is clearly a very cost-effective platform for vast volumes of data, with real-time data loading, but it’s very slow to analyze.

So the combination of a high-volume, low-cost platform for the bulk of the data and a very high-performing, real-time analytics engine is very compelling. The challenge is going to be moving the data between the two environments. That isn’t going to go away. That’s not simple, and there are a number of approaches. HP Vertica is taking some.

There is Flex Zone, and there are any number of other players in that space. The reality is that you probably reach an environment where people are loading into Hadoop and Vertica in parallel. That’s what we plan to do. That gives you much more resilience. So for a lot of the data we’re putting into our system, we’re actually planning to put the raw data files into Hadoop, so we can reload them as necessary to improve the resilience of the overall system too.
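
A minimal sketch of that parallel-loading idea, assuming an HDFS archive directory reachable via the standard hdfs command-line tool and reusing the hypothetical load_alarms() helper from the earlier Vertica sketch:

    import subprocess

    def archive_and_load(csv_path, hdfs_dir="/data/zen/raw"):
        # Keep the raw file in Hadoop so it can be reloaded later if needed.
        subprocess.run(["hdfs", "dfs", "-put", "-f", csv_path, hdfs_dir], check=True)
        # Load the same file into Vertica for real-time analytics.
        load_alarms(csv_path)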

Gardner: Well good luck with that. I hope perhaps we can hear more about that in the next HP Discover event, but I’m afraid we’ll have to leave it there for this particular venue. I’d like to thank our guest, Andy Stubley, Vice President of Sales and Marketing at Systems Mechanics Limited, based in London. Thank you so much, Andy.

Stubley: Thank you very much.

Gardner: And a big thank you to our audience, as well, for joining us in this special new style of IT discussion coming to you directly from the HP Discover 2013 Conference in Barcelona. I’m Dana Gardner, Principal Analyst at Interarbor Solutions, your host for this ongoing series of HP sponsored discussions. Thanks again for listening, and do come back next time.

Listen to the podcast. Find it on iTunes. Download the transcript. Sponsor: HP.

Transcript of a sponsored BriefingsDirect podcast on making telcos more responsive to customers and operators by using big-data tools and analysis. Copyright Interarbor Solutions, LLC, 2005-2014. All rights reserved.
