Tuesday, July 09, 2013

Want a Data-Driven Culture? Start Sorting Out the BI and Big Data Myths Now

Transcript of a BriefingsDirect podcast on current misconceptions about big data and how organizations should best approach a big-data project.

Listen to the podcast. Find it on iTunes. Download the transcript. Sponsor: Dell Software.

Dana Gardner: Hi, this is Dana Gardner, Principal Analyst at Interarbor Solutions and you're listening to BriefingsDirect.

Today, we present a sponsored podcast discussion on debunking some major myths around big data. It used to be that data was the refuse of business applications, a necessary cleanup chore for audit and compliance sake.

But now, as analytics grow in importance for better running businesses and in knowing and predicting dynamic market trends and customer wants in real-time, data itself has become the killer application.

As the volumes and types of valuable data brought to bear on business analytics grow, the means to manage and exploit that sea of data have changed rapidly, too. But that doesn't mean that so-called big data is beyond the scale of mere business mortals, or too costly or complex for mid-size companies to master.

So we're here to pose some questions -- many of them the stuff of myth -- and then find better answers to why making data and big data the source of innovative insight is critical for more companies.

To help identify and debunk the myths around big data so that you can enjoy the value of those analytics better, please join me in welcoming our guest, Darin Bartik, Executive Director of Products in the Information Management Group at Dell Software. Welcome, Darin. [Disclosure: Dell is a sponsor of BriefingsDirect podcasts.]

Darin Bartik: Thanks, Dana. Good to be with you.

Gardner: We seem to be at an elevated level of hype around big data. I guess a good thing about that is it’s a hot topic and it’s of more interest to more people nowadays, but we seem to have veered away from the practical and maybe even the impactful. Are people losing sight of the business value by getting lost in speeds and feeds and technical jargon? Is there some sort of a disconnect between the providers and consumers of big data?

Bartik: I'm sure we're going to get into a couple of different areas today, but you hit the nail on the head with the first question.  We are experiencing a disconnect between the technical side of big data and the business value of big data, and that’s happening because we’re digging too deeply into the technology.

With a term like big data, or any one of the trends that the information technology industry talks about so much, we tend to think about the technical side of it. But with analytics, with the whole conversation around big data, what we've been stressing with many of our customers is that it starts with a business discussion. It starts with the questions that you're trying to answer about the business; not the technology, the tools, or the architecture of solving those problems. It has to start with the business discussion.

That’s a pretty big flip. The traditional approach to business intelligence (BI) and reporting has been one of technology frameworks and a lot of things that were owned more by the IT group. This is part of the reason why a lot of the BI projects of the past struggled, because there was a disconnect between the business goals and the IT methods.

So you're right. There has been a disconnect, and that’s what I've been trying to talk a lot about with customers -- how to refocus on the business issues you need to think about, especially in the mid-market, where you maybe don’t have as many resources at hand. It can be pretty confusing.

Part of the hype cycle

The other thing you asked is, “Are vendors confusing people?" Without disparaging the vendors like us, or anyone else, that’s part of the problem of any hype cycle. Many people jumped on the bandwagon of big data. Just like everyone was talking cloud. Everyone was talking virtualization, bring your own device (BYOD), and so forth.

Everyone jumps on these big trends. So it's very confusing for customers, because there are many different ways to come at the problem. This is why I keep bringing people back to staying focused on what the real opportunity is. It’s a business opportunity, not a technical problem or a technical challenge that we start with.

Gardner: Right. We don’t want to lose track of the ends because the means seem to be so daunting. We want to keep our focus on the ends and then find the means. Before we go into our myths, tell me a little bit, Darin, about your background and how you came to be at Dell.

Bartik: I've been a part of Dell Software since the acquisition of Quest Software. I was a part of that organization for close to 10 years. I've been in technology coming up on 20 years now. I spent a lot of time in enterprise resource planning (ERP), supply chain, and monitoring, performance management, and infrastructure management, especially on the Microsoft side of the world.

Most recently, as part of Quest, I was running the database management area -- a business very well-known for its products around Oracle, especially Toad, as well as our SQL Server management capabilities. We leveraged that expertise when we started to evolve into BI and analytics.

I started working with Hadoop back in 2008-2009, when it was still very foreign to most people. When Dell acquired Quest, I came in and had the opportunity to take over the Products Group in the ever-expanding world of information management. We're part of the Dell Software Group, which is a big piece of the strategy for Dell over all, and I'm excited to be here.

Gardner: Great. Even the name "big data" stirs up myths right from the get-go, with "big" being a very relative term. Should we only be concerned about this when we have more data than we can manage? What is the relative position of big data and what are some of the myths around the size issue?

Bartik: That’s the perfect one to start with. The first word in the term is actually part of the problem. "Big." What does big mean? Is there a certain threshold of petabytes that you have to get to? Or, if you're dealing with petabytes, is it not a problem until you get to exabytes?

It’s not a size issue. When I think about big data, it's really a trend that has happened as a result of digitizing so much more of the information that we all have already and that we all produce. Machine data, sensor data, all the social media activities, and mobile devices are all contributing to the proliferation of data.

It's added a lot more data to our universe, but the real opportunity is to look for small elements of small datasets and look for combinations and patterns within the data that help answer those business questions that I was referencing earlier.

It's not necessarily a scale issue. What is a scale issue is when you get into some of the more complicated analytical processes and you need a certain data volume to make it statistically relevant. But what customers first want to think about is the business problems that they have. Then, they have to think about the datasets that they need in order to address those problems.

Big-data challenge

That may not be huge data volumes. You mentioned mid-market earlier. When we think about some organizations moving from gigabytes to terabytes, or doubling data volumes, that’s a big data challenge in and of itself.

Analyzing big data won't necessarily help you solve your business problems if you're not starting with the right questions. If you're just trying to store more data, that’s not really the problem that we have at hand. That’s something that we can all do quite well with current storage architectures and the evolving landscape of hardware that we have.

We all know that we have growing data, but the exact size, the exact threshold that we may cross, that’s not the relevant issue.

Gardner: I suppose this requires prioritization, which has to come from the business side of the house. As you point out, a statistically relevant sample might be enough. If you have enough data to extrapolate, fine. But there might be other areas where you actually want every possible bit of relevant data, because you don't know what you're looking for: the unknown unknowns. Perhaps there's some mythology about needing all the data. What seems important is having the right data to accomplish what the business wants.

Bartik: Absolutely. If your business challenge is an operational efficiency or a cost problem, where you have too much cost in the business and you're trying to pull out operational expense and not spend as much on capital expense, you can look at your operational data.

Maybe manufacturers are able to do that and analyze all of the sensor, machine, manufacturing line, and operational data. That's a very different type of data and a very different type of approach than looking at it in terms of sales and marketing.

If you're a retailer looking for a new set of customers or new markets to enter in terms of geographies, you're going to want to look at maybe census data and buying-behavior data of the different geographies. Maybe you want datasets that are outside your organization entirely. You may not have the data in your hands today. You may have to pull it in from outside resources. So there's a lot of variability and prioritization that all starts with that business issue that you're trying to address.

Gardner: Perhaps it's better for the business to identify the important data, rather than the IT people saying it’s too big or that big means we need to do something different. It seems like a business term rather than a tech term at this point.

Bartik: I agree with you. The more we can focus on bringing business and IT to the table together to tackle this challenge, the better. And it does start with the executive management in the organization trying to think about things from that business perspective, rather than starting with the IT infrastructure management team. 

Gardner: What’s our second myth?

Bartik: I'd think about the idea of people and the skills needed to address this concept of big data. There is the term "data scientist" that has been thrown out all over the place lately. There’s a lot of discussion about how you need a data scientist to tackle big data. But “big data” isn't necessarily the way you should think about what you’re trying to accomplish. Instead, think about things in terms of being more data driven, and in terms of getting the data you need to address the business challenges that you have. That’s not always going to require the skills of a data scientist.

Data scientists rare

I suspect that a lot of organizations would be happy to hear something like that, because data scientists are very rare today, and they're expensive precisely because they're rare. Only certain geographies and certain industries have groomed the true data scientist: a unique blend of data engineer and applied scientist, someone who can think quite differently than a traditional BI developer or BI programmer.

Don’t get stuck on thinking that, in order to take on a data-driven approach, you have to go out and hire a data scientist. There are other ways to tackle it. That’s where you're going to combine people who can do the programming around your information, around the data management principles, and the people who can ask and answer the open-minded business questions. It doesn’t all have to be encapsulated into that one magical person that’s known now as the data scientist.

Gardner: So rather than thinking we need to push the data and analytics and the ability to visualize and access this through a small keyhole, which would be those scientists, the PhDs, the white lab coats, perhaps there are better ways now to make those visualizations and allow people to craft their own questions against the datasets. That opens the door to more types of people being able to do more types of things. Does that sum it up a bit?

Bartik: I agree with that. There are varying degrees of tackling this problem. You can get into very sophisticated algorithms and computations for which a data scientist may be the one to do that heavy lifting. But for many organizations and customers that we talk to everyday, it’s something where they're taking on their first project and they are just starting to figure out how to address this opportunity.

For that, you can use a lot of the people that you have inside your organization, as well as, potentially, consultants who can help you break through some of the old barriers, such as thinking about intelligence strictly in terms of a report and a structured dashboard format.

That’s not the type of approach we want to take nowadays. Often, a combination of programming and some open-minded thinking, done with a team-oriented approach rather than through that single keyhole person, is more than enough to accomplish your objectives.

Gardner: It seems also that you're identifying confusion on the part of some to equate big data with BI and BI with big data. The data is a resource that the BI can use to offer certain values, but big data can be applied to doing a variety of other things. Perhaps we need to have a sub-debunking within this myth, and that is that big data and BI are different. How would you define them and separate them?

Bartik: That's a common myth. If you think about BI in its traditional, generic sense, it’s about gaining more intelligence about the business, which is still the primary benefit of the opportunity this trend of big data presents to us. Today, I think they're distinct, but over time, they will come together and become synonymous.

I equate it back to one of the more recent trends that came right before big data: cloud. In the beginning, most people thought cloud meant the public-cloud concept. What’s turned out to be true is that it’s more of a private cloud or a hybrid cloud, where not everything moved from a traditional on-premises model to a highly scalable, highly elastic public cloud. It’s very much a mix.

They've kind of come together. So while cloud and traditional data centers are the new infrastructure, it’s all still infrastructure. The same is true for big data and BI, where BI, in the general sense of how can we gain intelligence and make smarter decisions about our business, will include the concept of big data.

Better decisions

So while we'll be using new technologies, which would include Hadoop, predictive analytics, and other things that have been driven so much faster by the trend of big data, we’ll still be working back to that general purpose of making better decisions.

One of the reasons they're still different today is because we’re still breaking some of the traditional mythology and beliefs around BI -- that BI is all about standard reports and standard dashboards, driven by IT. But over time, as people think about business questions first, instead of thinking about standard reports and standard dashboards first, you’ll see that convergence.

Gardner: We probably need to start thinking about BI in terms of a wider audience, because all the studies I've seen don't show all that much confidence and satisfaction in the way BI delivers the analytics or the insights that people are looking for. So I suppose it's a work in progress when it comes to BI as well.

Bartik: Two points on that. There has been a lot of disappointment around BI projects in the past. They've taken too long, for one. They've never really been finished, which of course, is a problem. And for many of the business users who depend on the output of BI -- their reports, their dashboard, their access to data -- it hasn’t answered the questions in the way that they may want it to.

One of the things in front of us today is a way of thinking about it differently. Not only is there so much data, and so much opportunity now to look at that data in different ways, but there is also a requirement to look at it faster and to make decisions faster. So it really does break the old way of thinking.

Slowness is unacceptable. Standard reports don't come close to addressing the opportunity in front us, which is to ask a business question and answer it with the new way of thinking supported by pulling together different datasets. That’s fundamentally different from the way we used to do it.

People are trying to make decisions about moving the business forward, and they're being forced to do it faster. Historical reporting just doesn't cut it. It’s not enough. They need something that’s much closer to real time. It’s more important to think about open-ended questions, rather than just say, "What revenue did I make last month, and what products made that up?" There are new opportunities to go beyond that.

Gardner: I suppose it also requires more discipline in keeping your eye on the ends, rather than getting lost in the means. That also is a segue to our next myth, which is, if I have the technology to do big data, then I'm doing big data, and therefore I'm done.

Bartik: Just last week, I was meeting with a customer and they said, "Okay, we have our Hadoop cluster set up and we've loaded about 10 terabytes of sample data into this Hadoop cluster. So we've started our big data project."

When I hear something like that, I always ask, "What question are you trying to answer? Why did you load that data in there? Why did you start with Hadoop? Why did you do all this?" People are starting with the technology first too often. They're not starting with the questions and the business problems first.

Not the endgame

You mentioned keeping your eye on the endgame. The endgame is not to spin up a new technology or to try a new tool. Hadoop has been one of those things where people start to use it and think they're off and running on a big-data project. It can be part of it, but it isn't where you want to start, and it isn’t the endgame.

The endgame is solving the business problem that you're out there trying to address. It’s either lowering costs inside the business, or it’s finding a new market, figuring out why this customer set loves our products and why some other customer set doesn’t. Answering those questions is the endgame, not starting a new technology initiative.

Gardner: When it comes to these technology issues, do you also find, Darin, that there is a lack of creativity as to where the data and information resides, and thinking not so much about being able to run it, but rather acquire it? Is there a dissonance between the data I have and the data I need? How are people addressing that?

Bartik: There is and there isn’t. When we look at the data that we have, that’s oftentimes a great way to start a project like this, because you can get going faster and it’s data that you understand. But if you think that you have to get data from outside the organization, or you have to get new datasets in order to answer the question that’s in front of us, then, again, you're going in with a predisposition to a myth.

You can start with data that you already have. You just may not have been looking at the data that you already have in the way that’s required to answer the question in front of you. Or you may not have been looking at it all. You may have just been storing it, but not doing anything with it.

Storing data doesn’t help you answer questions. Analyzing it does. It seems kind of simple, but so many people think that big data is a storage problem. I would argue it's not about the storage. It’s like backup and recovery: backing up data is not that important until you need to recover it. Recovery is really the game-changing thing.

Gardner: It’s interesting that, over the years, people without the resources at hand have tended to shoot from the hip and second-guess. People who are good at that, and businesses that have been successful, have depended on some luck and intuition. Big data should free you from educated guesses and give you really clear evidence, but the same principle applies: it's as much about how you get big data in place as how you use the fruits of big data.

It seems like a cultural shift we have to make. Let’s not jump to conclusions. Let’s get the right information and find out where the data takes us.

Bartik: You've hit on one of the biggest things that’s in front of us over the next three to five years -- the cultural shift that the big data concept introduces.

We looked at traditional BI as more of an IT function, where we were reporting back to the business. The business told us exactly what they wanted, and we tried to give that to them from the IT side of the fence.

Data-driven organization

But being successful today is less about intuition and more about being a data-driven organization, and, for that to happen, I can't stress this one enough, you need executives who are ready to make decisions based on data, even if the data may be counter intuitive to what their gut says and what their 25 years of experience have told them.

They're in a position of being an executive primarily because they have a lot of experience and have had a lot of success. But many of our markets are changing so frequently and so fast, because of new customer patterns and behaviors, and because of new ways customers interact with us via different devices. Just think of the different ways that the markets are changing. So much of that historical precedent no longer really matters. You have to look at the data that’s in front of us.

Because things are moving so much faster now, new markets are being penetrated and new regions are open to us. We're so much more of a global economy. Things move so much faster than they used to. If you're depending on gut feeling, you'll be wrong more often than you'll be right. You do have to depend on as much of a data-driven decision as you can. The only way to do that is to rethink the way you're using data.

Historical reports that tell you what happened 30 days ago don't help you make a decision about what's coming out next month, given that your competition just introduced a new product today. It's just a different mindset. So that cultural shift of being data-driven and going out and using data to answer questions, rather than using data to support your gut feeling, is a very big shift that many organizations are going to have to adapt to.

Executives who get that and drive it down into the organization, those are the executives and the teams that will succeed with big data initiatives, as opposed to those that have to do it from the bottom up.

Gardner: Listening to you, Darin, I can tell one thing that isn’t a product of hype is just how important this all is. Getting big data right, making that cultural shift, and recognizing trends based on evidence, in as close to real time as possible, is fundamental to how well many businesses will succeed.

So it's not hype to say that big data is going to be a part of your future and it's important. Let's move toward how you would start to implement, change, or rethink things, so that you don't fall prey to these myths, but actually take advantage of the technologies, the reduced costs of much of the infrastructure, and perhaps extend and exploit BI and big data.

Bartik: It's fair to say that big data is not just a trend; it's a reality. And it's an opportunity for most organizations that want to take advantage of it. It will be a part of your future. It's either going to be part of your future, or it's going to be a part of your competition’s future, and you're going to be struggling as a result of not taking advantage of it.

The first step that I would recommend -- I've said it a few times already, but it can't be said too often -- is pick a project that's going to address a business issue that you've been unable to address in the past.

Ask, "What are the questions that we need to ask and answer about our business that will really move us forward?" Not just, "What data do we want to look at?" That's not the question.

What business issue?

The question is what business issue do we have in front of us that will take us forward the fastest? Is it reducing costs? Is it penetrating a new regional market? Is it penetrating a new vertical industry, or evolving into a new customer set?

These are the kinds of questions we need to ask and the dialogue that we need to have. Then let's take the next step, which is getting data and thinking about the team to analyze it and the technologies to deploy. But that's the first step: deciding what we want to do as a business.

That sets you up for that cultural shift as well. If you start at the technology layer, if you start at the level of let's deploy Hadoop or some type of new technology that may be relevant to the equation, you're starting backwards. Many people do it, because it's easier to do that than it is to start an executive conversation and to start down the path of changing some cultural behavior. But it doesn’t necessarily set you up for success.

Gardner: It sounds as if you know you're going on a road trip and you get yourself a Ferrari, but you haven't really decided where you're going to go yet, so you didn’t know that you actually needed a Ferrari.

Bartik: Yeah. And it's not easy to get a tent inside a Ferrari. So you have to decide where you're going first. It's a very good analogy.

Gardner: What are some of the other ways when it comes to the landscape out there? There are vendors who claim to have it all, everything you need for this sort of thing. It strikes me that this is more of an early period and that you would want to look at a best-of-breed approach or an ecosystem approach.

So are there any words of wisdom in terms of how to think about the assets, tools, approaches, platforms, what have you, or not to limit yourself in a certain way?

Bartik: There are countless vendors that are talking about big data and offering different technology approaches today. Based on the type of questions that you're trying to answer, whether it's more of an operational issue, a sales market issue, HR, or something else, there are going to be different directions that you can go in, in terms of the approaches and the technologies used.

I encourage the executives, both on the line-of-business side as well as the IT side, to go to some of the events that are the "un-conferences," where we talk about the big-data approach and the technologies. Go to the other events in your industry where they're talking about this and learn what your peers are doing. Learn from some of the mistakes that they've been making or some of the successes that they've been having.

There's a lot of success happening around this trend. Some people certainly are falling into the pitfalls, but get smart by going to your peers and going to your industry influencer groups and learning more about how to approach this.

Technical approaches

There are technical approaches that you can take. There are different ways of storing your data. There are different ways of computing and processing your data. Then, of course, there are different analytical approaches that get more to the open-ended investigation of data. There are many tools and many products out there that can help you do that.

Dell has certainly gone down this road and is investing quite heavily in this area, with both structured and unstructured data analysis, as well as the storage of that data. We're happy to engage in those conversations as well, but there are a lot of resources out there that really help companies understand and figure out how to attack this problem.

Gardner: In the past, with many of the technology shifts, we've seen a tension and a need for decision around best-of-breed versus black box, or open versus entirely turnkey, and I'm sure that's going to continue for some time.

But one of the easier ways or best ways to understand how to approach some of those issues is through some examples. Do we have any use cases or examples that you're aware of, of actual organizations that have had some of these problems? What have they put in place, and what has worked for them?

Bartik: I'll give you a couple of examples from two very different types of organizations, neither of which are huge organizations. The first one is a retail organization, Guess Jeans. The business issue they were tackling was, “How do we get more sales in our retail stores? How do we get each individual that's coming into our store to purchase more?”

We sat down and started thinking about the problem. We asked what data we would need to understand what was happening. We needed data that helps us understand buyers' behavior once they come into the store. We didn't necessarily need data about what they do outside the store, so we looked specifically at behaviors that take place once they get into the store.

We helped them capture and analyze video-monitoring information that tracked each person's movement and geospatial location inside the store. We then compared that against questions like: did they buy, what did they buy, and how much did they buy? We were able to help them determine that if you get a customer into a dressing room, you're about 50 percent more likely to close a transaction with them.

So rather than trying to give incentives to come into the store or give discounts once they get into the store, they moved towards helping the store clerks, the people who ran the store and interacted with the customers, focus on getting those customers into a dressing room. That itself is a very different answer than what they might have thought of at first. It seems easy after you think about it, but it really did make a significant business impact for them in rather short order.
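The kind of comparison described here can be sketched very simply. The following is a hypothetical illustration, not Guess's actual analysis; the field names and sample numbers are invented, but the calculation, comparing purchase conversion for shoppers who did and did not use a dressing room, is the general idea.

```python
# Hypothetical sketch: compare purchase conversion rates for shoppers
# who did vs. did not enter a dressing room. Field names are
# illustrative, not from any actual retail dataset.

def conversion_lift(visits):
    """visits: list of dicts with boolean 'used_dressing_room' and 'purchased'.

    Returns (rates, lift): conversion rate per group, and the relative
    lift of dressing-room shoppers over the rest.
    """
    groups = {True: [0, 0], False: [0, 0]}  # [purchases, total] per group
    for v in visits:
        g = groups[v["used_dressing_room"]]
        g[1] += 1
        if v["purchased"]:
            g[0] += 1
    rates = {k: (p / t if t else 0.0) for k, (p, t) in groups.items()}
    lift = ((rates[True] - rates[False]) / rates[False]
            if rates[False] else float("inf"))
    return rates, lift

# Invented sample: 10 shoppers used a dressing room (6 bought),
# 10 did not (4 bought).
sample = (
    [{"used_dressing_room": True, "purchased": True}] * 6
    + [{"used_dressing_room": True, "purchased": False}] * 4
    + [{"used_dressing_room": False, "purchased": True}] * 4
    + [{"used_dressing_room": False, "purchased": False}] * 6
)
rates, lift = conversion_lift(sample)
# With these made-up numbers: 0.6 vs. 0.4 conversion, a 50% relative lift.
```

With real tracking data the input would come from the video-analytics pipeline rather than a hand-built list, but the business question being answered is the same.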

Now, they're also thinking about other business challenges that they have and other ways of analyzing data and other datasets, based on different business challenges, but that’s one example.

Another example is on the higher-education side. In universities, one of the biggest challenges is having students drop out or reduce their class load. The fewer classes they take, or if they drop out entirely, it goes right to the top and bottom line of the organization, because it reduces tuition, as well as the other ancillary expenses that students incur at the university.

Finding indicators

The University of Kentucky went on an effort to reduce students dropping out of classes or dropping out of school entirely. They looked at a series of datasets, such as demographic data, class data, the grades students were receiving, their attendance rates, and so forth. They analyzed many different data points to determine the indicators of a future dropout.

Now, just raising the student retention rate by one percent would in turn mean about $1 million of top-line revenue to the university. So this was pretty important. And in the end, they were able to narrow it down to a couple of variables that strongly indicated which students were at risk, such that they could then proactively intervene with those students to help them succeed.
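The university's approach, scoring each student on a handful of predictive variables so advisers can intervene early, can be sketched roughly as follows. The thresholds, weights, and field names here are invented for illustration; a real model would be fit from the university's historical data:

```python
# Hypothetical at-risk scoring: flag students whose indicators fall below
# thresholds that, in a real project, would be learned from dropout history.
def risk_score(student):
    """Return a 0-3 risk score; higher means more warning signs."""
    score = 0
    if student["attendance_rate"] < 0.75:  # missing more than 25% of classes
        score += 1
    if student["gpa"] < 2.0:               # below good academic standing
        score += 1
    if student["credit_hours"] < 12:       # reduced course load
        score += 1
    return score

def flag_at_risk(students, threshold=2):
    """Students with enough warning signs to warrant proactive outreach."""
    return [s["id"] for s in students if risk_score(s) >= threshold]

students = [
    {"id": "A1", "attendance_rate": 0.95, "gpa": 3.4, "credit_hours": 15},
    {"id": "B2", "attendance_rate": 0.60, "gpa": 1.8, "credit_hours": 9},
    {"id": "C3", "attendance_rate": 0.80, "gpa": 1.9, "credit_hours": 15},
]
print(flag_at_risk(students))  # only B2 trips multiple indicators
```

The point mirrors the transcript: the value is not in the scoring mechanics but in narrowing many candidate variables down to the few that actually predict dropout, so that intervention happens before the student leaves.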

The key is that they started with a very specific problem. They started it from the university's core mission: to make sure that the students stayed in school and got the best education, and that's what they are trying to do with their initiative. It turned out well for them.

These were very different organizations and business types, in two very different verticals, and again, neither is a huge organization with seas of data. But what they did offers much more manageable and tangible examples that many of us can apply to our own businesses.

Gardner: Those really demonstrate how asking the right questions is so important.

Darin, we're almost out of time, but I did want to see if we could develop a little bit more insight into the Dell Software road map. Are there some directions that you can discuss that would indicate how organizations can better approach these problems and develop some of these innovative insights in business?

Bartik: A couple of things. We've been in the business of data management, database management, and managing the infrastructure around data for well over a decade. Dell has assembled a group of companies, along with a lot of organic development, based on its years of expertise in the data center. What we have today is a set of capabilities that help customers take a more data-type-agnostic and vendor-agnostic view of the way they approach and manage data.

You may have 15 tools around BI. You may have tools to look at your Oracle data, maybe new sets of unstructured data, and so forth. And you have different infrastructure environments set up to house and manage that data. But the problem is that none of this helps you bring the data together and cross boundaries between data types and vendor toolsets, and that's the challenge we're trying to help address.

We've introduced tools to help bring data together from any database, regardless of where it may be sitting, whether it's a data warehouse, a traditional database, a new type of database such as Hadoop, or some other type of unstructured data store.

We want to bring that data together and then analyze it. Whether you're looking at more of a traditional structured-data approach and you're exploring data and visualizing datasets that many people may be working with, or doing some of the more advanced things around unstructured data and looking for patterns, we’re focused on giving you the ability to pull data from anywhere.

Using new technologies

We're investing very heavily, Dana, into the Hadoop framework to help customers do a couple of key things. One is helping the people that own data today, the database administrators, data analysts, the people that are the stewards of data inside of IT, advance their skills to start using some of these new technologies, including Hadoop.

It's been something that we have done for a very long time, making your C players B players, and your B players A players. We want to continue to do that, leverage their existing experience with structured data, and move them over into the unstructured data world as well.

The other thing is that we're helping customers manage data in a much more pragmatic way. So if they are starting to use data that is in the cloud, via Salesforce.com or Taleo, but they also have data on-prem sitting in traditional data stores, how do we integrate that data without completely changing their infrastructure requirements? With capabilities that Dell Software has today, we can help integrate data no matter where it sits and then analyze it based on that business problem.

We help customers approach it from a more pragmatic view, taking a stepwise approach. We don't expect customers to pull out their entire BI and data-management infrastructure and rewrite it from scratch on day one. That's not practical, and it's not something we would recommend. Take a stepwise approach. Maybe change the way you're integrating data. Change the way you're storing data. Change, in some respects, the way you're analyzing data between IT and the business, and have those teams collaborate.

But you don't have to do it all at one time. Take that stepwise approach. Tackle it from the business problems that you're trying to address, not just the new technologies we have in front of us.

There's much more to come from Dell in the information-management space. It will be very interesting for us and for our customers to tackle this problem together. We're excited to make it happen.

Gardner: Well, great. I'm afraid we'll have to leave it there. We've been listening to a sponsored BriefingsDirect podcast discussion on debunking some major myths around big-data use and value. We've seen how big data is not necessarily limited by scale, and that the issues around it don't have to overshadow your business goals.

We've also learned more about levels of automation and how Dell is going to be approaching the market. So I appreciate that. With that, we'll have to end it and thank our guest.

We've been here with Darin Bartik, Executive Director of Products in the Information Management Group at Dell Software. Thanks so much, Darin.

Bartik: Thank you, Dana, I appreciate it.

Gardner: This is Dana Gardner, Principal Analyst at Interarbor Solutions. Thanks also to our audience for joining and listening, and don't forget to come back next time.

Listen to the podcast. Find it on iTunes. Download the transcript. Sponsor: Dell Software.

Copyright Interarbor Solutions, LLC, 2005-2013. All rights reserved.

You may also be interested in:

Platform 3.0 Ripe to Give Standard Access to Advanced Intelligence and Automation, Bring Commercial Benefits to Enterprises

Transcript of a BriefingsDirect podcast on how The Open Group is working to stay ahead of the converging challenges organizations face with big data, mobile, cloud, and social.

Listen to the podcast. Find it on iTunes. Download the transcript. Sponsor: The Open Group.

Dana Gardner: Hello, and welcome to a special BriefingsDirect Thought Leadership Interview series, coming to you in conjunction with The Open Group Conference on July 15, in Philadelphia.

I'm Dana Gardner, Principal Analyst at Interarbor Solutions, your host and moderator throughout these discussions on enterprise transformation in the finance, government, and healthcare sectors. Registration for the conference remains open. Follow the conference on Twitter at #ogPHL.

We're here now with a panel of experts to explore the business implications of the current shift to so-called Platform 3.0. Known as the new model through which big data, cloud, and mobile and social -- in combination -- allow for advanced intelligence and automation in business, Platform 3.0 has so far lacked standards or even clear definitions.

The Open Group and its community are poised to change that, and we're here now to learn more about how to leverage Platform 3.0 as more than an IT shift -- as a business game-changer.

With that, please join me in welcoming our panel: Dave Lounsbury, Chief Technical Officer at The Open Group. Welcome, Dave.

Dave Lounsbury: Hi, Dana, happy to be here.

Gardner: We're also here with Chris Harding, Director of Interoperability at The Open Group. Welcome, Chris. [Disclosure: The Open Group is a sponsor of this and other BriefingsDirect podcasts.]

Chris Harding: Thank you, Dana, and it's great to be on this panel.

Gardner: And also Mark Skilton, Global Director in the Strategy Office at Capgemini. Welcome, Mark.

Mark Skilton: Hi, Dana, thanks for inviting us today. I'm very happy to be here.

Gardner: A lot of people are still wrapping their minds around this notion of Platform 3.0, something that is a whole greater than the sum of the parts. Why is this more than an IT conversation or a shift in how things are delivered? Why are the business implications momentous?

Lounsbury: Well, Dana, there are a lot of IT and technical changes going on that are bringing together a lot of factors. They're turning into this sort of supersaturated solution of ideas and possibilities, and this emerging idea that it represents a new platform. I think it's a pretty fundamental change.

If you look at history, not just the history of IT, but all of human history, you see that step changes in societies and organizations are frequently driven by communication or connectedness. Think about the evolution of speech or the invention of the alphabet or movable-type printing. These technical innovations that we’re seeing are bringing together these vast sources of data about the world around us and doing it in real time.

Further, we're starting to see a lot of rapid evolution in how you turn data into information and present that information in a way that people can make decisions on. Given all that, we're starting to realize we're on the cusp of another step change in connectedness and awareness.

Fundamental changes

This really is going to drive some fundamental changes in the way we organize ourselves. Part of what The Open Group is doing in bringing Platform 3.0 together is to get ahead of this and make sure that we understand not just what technical standards are needed, but how businesses will need to adapt and evolve, and what business processes they need to put in place, in order to take maximum advantage of this change in the way that we look at information.

Gardner: Chris Harding, is there a time issue here? Is this something that organizations should sit back on, watch how it unfolds, and then gauge their response? Or is there a benefit to being out in front of this in some way?

Harding: I don't know about in front of it. Enterprises have to keep up with the way things are moving in order to keep their positions in their industries. Enterprises can't afford to be working with yesterday's technology. It's a case of being able to understand the information they're presented with and make the best decisions based on it.

We've always talked about computers being about input, process, and output. Years ago, the input might have been through a teletype, the processing on a computer in the back office, and the output on print-out paper.

Now, we're talking about the input being through a range of sensors and social media, the processing is done on the cloud, and the output goes to your mobile device, so you have it wherever you are when you need it. Enterprises that stick in the past are probably going to suffer.

Gardner: Mark Skilton, the ability to manage data at greater speed and scale, the whole three Vs -- velocity, volume, and value -- on its own could perhaps be a game changing shift in the market. The drive of mobile devices into lives of both consumers and workers is also a very big deal.

Of course, cloud has been an ongoing evolution of emphasis towards agility and efficiency in how workloads are supported. But is there something about the combination of how these are coming together at this particular time that, in your opinion, substantiates The Open Group’s emphasis on this as a literal platform shift?

Skilton: It is exactly that in terms of the workloads. The world we're now into is the multi-workload environment, where you've got mobile workloads, storage and compute workloads, and social networking workloads. There are many different types of data and traffic today in different cloud platforms and devices.

It has to do with more than one solution and one subscription model, because we're now into this subscription-model era -- the subscription economy, as one group describes it. We're looking at not just providing the security and the infrastructure to deliver this kind of capability to a mobile device, as Chris was saying. The question is, how can you do this horizontally across other platforms? How can you integrate these things? This is something that is critical to the new order.

So Platform 3.0 is addressing this point by bringing all this together. Just look at the numbers. Look at the scale we're dealing with -- 1.7 billion mobile devices sold in 2012, and an estimated 6.8 billion mobile subscriptions, according to the International Telecommunication Union (ITU), equivalent to 96 percent of the world population.

Massive growth

We've had massive growth in the scale of mobile data traffic and internet data. Mobile data traffic is predicted to increase 18-fold from 2011 to 2016, reaching 130 exabytes annually. We passed 1 zettabyte of global online data storage back in 2010, and IP data traffic is predicted to pass 1.3 zettabytes by 2016, with internet video accounting for 61 percent of total internet data, according to Cisco studies.

These studies also predict that data-center traffic, combining network- and internet-based storage, will reach 6.6 zettabytes annually, and nearly two thirds of this will be cloud-based by 2016. This is only going to grow, as social networking reaches nearly one in four people around the world, with 1.7 billion using at least one form of social networking in 2013, rising to one in three people -- a 2.55 billion global audience -- by 2017, another extraordinary figure, from an eMarketing.com study.

It is not surprising that many industry analysts see growth in the converging technologies of mobility, social computing, big data, and cloud at 30 to 40 percent, and the shift to B2C commerce, passing $1 trillion in 2012, is just the start of a wider digital transformation.
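The growth figures above imply steep compound annual rates. For instance, an 18-fold increase in mobile data traffic over the five years from 2011 to 2016 works out to roughly a 78 percent annual growth rate, as a quick back-of-the-envelope check shows:

```python
# Compound annual growth rate (CAGR) implied by an N-fold increase over Y years.
def implied_cagr(fold_increase, years):
    return fold_increase ** (1 / years) - 1

# Cisco's figure quoted above: 18x mobile data growth from 2011 to 2016.
print(f"{implied_cagr(18, 5):.0%}")  # prints 78%
```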

These numbers speak volumes in terms of the integration, interoperability, and connection of the new types of business and social realities that we have today.

Gardner: Dave Lounsbury, back to you. Why should IT be thinking about this as a fundamental shift, rather than a step change or a modest change? It seems to me that this combination of factors almost blows the whole IT definition of 10 years ago out of the water. Is it that big a deal for IT? It also has an impact on business, but I'd like to focus on how IT organizations might need to start rethinking things.

Lounsbury: A lot depends on how you define your IT organization. It's useful to separate the plumbing from the water. If we think of the water as the information that’s flowing, it's how we make sure that the water is pure and getting to the places where you need to have the taps, where you need to have the water, etc.

But the plumbing also has to be up to the job. It needs to have the capacity. It needs to have new tools to filter out the impurities from the water. There's no point giving someone data if it's not been properly managed or if there's incorrect information.

What's going to happen in IT is that not only do we have to focus on the mechanics of the plumbing, where we see things like the big databases that have emerged in the open-source world and things of that nature, but there are also the analytics and data-stewardship aspects of it.

We need to bring in mechanisms so the data is valid and kept up to date. We need to indicate its freshness to the decision makers. Furthermore, IT is going to be called upon, whether as part of the enterprise IT function or where end users drive the selection, to provide the analytic and recommendation tools that take the data and turn it into information. One of the things you can't do with business decision makers is overwhelm them with big rafts of data and expect them to figure it out.

You really need to present the information in a way that they can use to quickly make business decisions. That is an addition to the role of IT that may not have been there traditionally -- how you think about the data and the role of what, in the beginning, was called data scientist and things of that nature.

Shift in constituency

Skilton: I'd just like to add to Dave's excellent points about how the shape of data has changed, and about why IT should get involved. We're seeing a shift in the constituency of who is using this data.

We've got the Chief Marketing Officer and the Chief Procurement Officer and other key line of business managers taking more direct control over the uses of information technology that enable their channels and interactions through mobile, social and data analytics. We've got processes that were previously managed just by IT and are now being consumed by significant stakeholders and investors in the organization.

We have to recognize in IT that we are the masters of our own destiny. The information needs to be delivered to new types of mobile devices, with new types of data intelligence and new ways of delivering this kind of service.

I recently read an article in MIT Sloan Management Review that asked what the role of the CIO is. There is still the critical role of managing the security, compliance, and performance of these systems. But there's also a socialization of IT, and this is where positioning cross-platform architectures is key to delivering real value to business users in the IT community.

Gardner: So we have more types of users -- more classes of individuals and resources within an enterprise -- starting to avail themselves of these intelligence capabilities more ubiquitously, vis-à-vis the mobile and cloud delivery opportunity.

How do we prevent this from going off the rails? How do we avoid creating multiple fire hoses of information, or too much data with not enough analysis? Chris Harding, any thoughts about where The Open Group or others can step in to help make this a more fruitful, rather than chaotic, transition?

Harding: This is a very important point. And to add to the difficulties, it's not only that a whole set of different people are getting involved with different kinds of information, but there's also a step change in the speed with which all this is delivered. It's no longer the case that you can say, "Oh well, we need some kind of information system to manage this information. We'll procure it and get a program written," and a year later it would be in place, delivering reports.

Now, people are looking to make sense of this information on the fly if possible. It's really a case of having the standard technology platform, and also the systems and business processes for using it, understood and in place.

Then you can do all these things quickly, build on what people have learned in the past, and not go off into all sorts of new experimental things that might not lead anywhere. It's a case of building up the standard platform and industry best practice. This is where The Open Group can really help things along, by being a recipient and a reflector of best practice and standards.

Lounsbury: I'd like to expand on that a little bit if I could, Dana. I agree with all the points that Chris and Mark just made. We should also mention that it's not just the speed of the analysis on the consumption side. We're going to see a lot of rapid evolution in the input side as well.

New data sources

We're starting to see lot of new data sources come on line. We've touched on the mobile devices and the social networks that those mobile devices enable, but we’re really also on the cusp of this idea of the "Internet of things," where there is a vast globe full of network connected sensors and actuators out there, all of which produce their own data.

Part of the process that Chris alluded to and the best practices Chris alluded to is how you run your business processes so that you keep your feeds up to date, so that you can adapt quickly to new sources of information, as well as adapt quickly to the new demands for information from the lines of business.

Gardner: It seems to be somewhat unprecedented that we have multiple change agents playing off of one another with complexity, scale, and velocity all very much at work. It's one thing to have a vision about how you would want to exploit this, but it's another to have a plan about how to go about that.

Mark Skilton, with your knowledge of Capgemini and the role that they play in the market, it seems to me that there's a tremendous need for some examples or some sense of how to go about managing the ability to exploit Platform 3.0 without getting tripped up and overwhelmed in the process.

Skilton: That's right. Capgemini has been doing work in this area. I break it down into four levels of scalability. The first is platform scalability: understanding what you can do with your current legacy systems when introducing cloud computing or big data, and the infrastructure that gives you what we call multiplexing of resources. We're very much seeing this idea of introducing scalable platform resource management, and you see that a lot with the heritage of virtualization.

The second is network scalability. A lot of customers who have inherited old telecommunications networks are looking to introduce new MPLS-type scalable networks. The reason is that it's all about connectivity in the field. I meet a number of clients who say, "We've got this cloud service," or "This service is in a certain area of my country. If I move to another part of the country, or I'm traveling, I can't get connectivity." That's the big issue of scaling.

The third is application programming interfaces (APIs). What we're seeing now is an explosion of integration and application services using API connectivity, and these are creating huge opportunities for what Chris Anderson of Wired called the "long tail effect." It is now a reality in terms of building the kind of social connectivity and data exchange that Dave was talking about.

Finally, there are the marketplaces. Companies need to think about what online marketplaces they need for digital branding, social branding, social networks, and awareness of their customers, suppliers, and employees. Customers can see that these four levels are where they need to start their IT-strategy thinking, and Platform 3.0 is right on target in trying to work out the strategies for each of these new levels of scalability.

Gardner: Dave Lounsbury, we're coming up on The Open Group Conference in Philadelphia very shortly. What should we expect from that? What is The Open Group doing vis-à-vis Platform 3.0, and how can organizations benefit from seeing a more methodological or standardized approach to rationalizing all of this complexity? [Registration for the conference remains open. Follow the conference on Twitter at #ogPHL.]

Lounsbury: We're still in the formative stages of the "third platform," or Platform 3.0, both for The Open Group and as an industry. To some extent, we're starting pretty much at the ground floor with the Platform 3.0 Forum. We're leveraging a lot of the components built previously by the members of The Open Group in cloud, service-oriented architecture (SOA), and some of the work on the Internet of things.

First step

Our first step is to bring those things together to make sure that we've got a foundation to depart from. The next step is that, through our Platform 3.0 Forum and its Steering Committee, we can ask people to talk about what their scenarios are for adoption of Platform 3.0.

That can range from the technological aspects and what standards are needed, to, taking a cue from our previous cloud working group, the best business practices for understanding and then adopting some of these Platform 3.0 concepts to get your business using them.

What we're really working towards in Philadelphia is to set up an exchange of ideas among the people who can, from the buy side, bring in their use cases and, from the supply side, bring in their ideas about what the technology possibilities are -- and then bring those together and start to shape a set of tracks where we can create business and technical artifacts that will help businesses adopt the Platform 3.0 concept.

Gardner: Anything to offer on that Chris?

Harding: There are some excellent points there. We certainly need to understand the business environment within which Platform 3.0 will be used. We've heard already about new players, new roles of various kinds that are appearing, and the fact that the technology is there and the business is adapting to this to use technology in new ways.

For example, we've heard about the data scientist. The data scientist is a new kind of role, a new kind of person, playing a particular part in all this within enterprises. We're also hearing about marketplaces for services -- new ways in which services are being made available and combined.

We really need to understand the actors in this new kind of business scenario. What are the pain points that people are having? What are the problems that need to be resolved in order to understand what kind of shape the new platform will have? That is one of the key things that the Platform 3.0 Forum members will be getting their teeth into.

Gardner: At the same time, The Open Group is looking to put more vertical-industry emphasis into its activities. At the Philadelphia conference, you've chosen finance, government, and healthcare. Dave or Chris, is there something about these three vertical industries that makes them excellent test cases for Platform 3.0? Is there something about going into a vertical industry that helps with the transition to 3.0, rather than a general, one-size-fits-all approach? What's the impact of a vertical-industry emphasis on this transition?

Lounsbury: First, I'll note that the overarching theme of The Open Group Conferences is about business transformation -- how you adapt and evolve your business to take better advantage of the efficiencies afforded by IT and other developments. So as a horizontal activity, Platform 3.0 fits in very well with that, because I believe these transformational drivers from the evolution of Platform 3.0 are going to affect all industries.

To get back to your question, the benefit of Platform 3.0 will be most immediately and urgently felt in vertical industries that deal with extremely large volumes of data and need to filter very large volumes of data in order to achieve their business objectives and run their businesses efficiently.

For example, one of the things healthcare is struggling with right now is the mass of patient records that need to be managed. How do caregivers or care providers make sense of those, make sure that everybody is up to date, and make sure that everybody is simply working off the same data? It's a core question for them.

Today's problem

That's today's problem, which some of the infrastructure of Platform 3.0 will undoubtedly help with. When you come to looking at care not only as an individual topic, how my doctor or nurse gives care to me, but in terms of the larger trends in healthcare, such as how certain drugs affect certain diseases, it's a perfect example of the use of data and strong analytics to get information. We couldn't have gotten that before, simply because we couldn't bring it together and understand it.

In some sense, the biotech industry has been leading this trend. Genomics have really seeded a lot of the big data capabilities.

That will be a very exciting area for healthcare. If you go into any Apple Store, you'll see a whole retail rack of gadgets that you wear on your body that tell you how fit you are, or how fit you aren’t in some cases. It will tell you what your pulse is, your heart rate, and your body mass index. We're getting very close to a time when we will have things that might even measure and report bits of your blood chemistry. We're very close to that, for example, with blood sugar.

That data might, through the concepts of Platform 3.0, provide a really personalized and much more immediate healthcare loop in the patient care. Again, these are all things a few years out. The Open Group is deliberately choosing to get in early, so we and our members can be informed about these trends, how to take advantage of them and what standards are going to be needed to do it.

We could go on about finance, too, but it's another area with massive data that will need to be correlated and analyzed.

Gardner: You are saying that not only are we facing an internet of things, we're going to be facing an internet of living things as well. So, there's a lot of data to come.

One of the great things about The Open Group that I've observed over the years is that it provides a very important environment for different types of organizations to collaborate and share their stories, and to understand what others are doing, both in their own vertical industries and in other types of business.

I expect that's going to be a huge benefit to organizations as they transition toward Platform 3.0 -- learning how others are doing it, and even how others have stumbled along the way. Do you have any early indicators or use cases that would illustrate just how important this is, and how instrumental it can be in helping companies?

Let's go across our panel. Mark Skilton at Capgemini, any examples we could point to that would indicate that when you do this well -- when you transition, when you take advantage of all these changes in tandem -- you get pragmatic and even measurable benefits?

Skilton: Identifying business value is the key, and it builds on what Dave was talking about in terms of having new types of data, sensors, and capabilities. What we're finding is that clients are dealing with this in eHealth, eGovernment, and eFinance.

Cost of health care

In the health sector, rising costs and the increasing life expectancy and longevity of the population are putting pressure on the cost of health care in many countries. eHealth initiatives -- the use of new technologies such as mobile patient monitoring, improved digital patient-record management, and care planning -- aim to drive down the cost of medical care while improving patients' quality of life.

In the federal government sector, eGov initiatives seek to develop citizen services and value for money in public spending programs. Open-data initiatives aim to develop information sharing and the marketing of services.

What can we do there to accelerate the adoption of services across markets? How can we bring mobile services to customers quickly? How can we drive growth in different vertical and horizontal markets? They're looking for a convergence of Platform 3.0 services where they can offer portal services.

In the finance sector, we see adoption of new technologies to scale to multiple consumer markets, with rapid insight and large-scale data analytics used to profile financial behavior and credit risk, for example.

A recent seminar that I was involved in was about avoiding the future cost of investing in more infrastructure. How can you bring big data and social capabilities together to create new experiences, improve quality of life, and improve the value citizens get from government services? How can you drive new financial processes and services? There are many similar case studies across multiple industries.

Gardner: Chris Harding, being involved with interoperability so deeply, are there any examples or use cases that you can point to where organizations are not only looking internally for better efficiency and productivity gains, but perhaps expanding the capabilities of Platform 3.0 outside their organizations into an ecosystem or even greater? What are some of the visions for extending 3.0 benefits into a wider, collaborative environment?

Harding: If you want a practical, but historical, example of how the collection, analysis, and distribution of shared information can empower a whole industry, you only have to look at the finance industry, where it's been commonplace for some time. Shared information is collected in real time, various companies analyze it, and it's distributed and made available in graphical form. You can probably get it on your mobile phone if you want.

Imagine how that kind of information processing ability could be translated into other areas, such as healthcare, so that on a routine basis, medical people could get up-to-the-minute information on critical patients wherever they are. You can see what possibilities we are looking at.

But it's really early days yet. The idea of Platform 3.0 is only just crystallizing, and the point of it is, to pick up on Mark's point, that enterprises everywhere are constantly under pressure to do more and more with fewer and fewer resources. That’s why some kind of standard platform that will enable industries across the board to take advantage of this kind of possibility is something that we really need.

Lounsbury: We all know the Gartner hype cycle. We get out on the early edge of things. We see the possibilities, and then there is the trough of disillusionment. Chris has touched on something very important that I think is necessary for there to be a successful transition to this Platform 3.0 world we envisioned.

Data growth

One of the big risks here is that we see figures that say the amount of data produced doubles every 1.2 years. Well, the rate of growth of people who can deal with that data, data scientists and whatever, is pretty much a linear growth. Maybe it's 5 percent a year or 10 percent a year, or something like that, but it's not doubling every 1.2 years.
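The gap Lounsbury describes comes down to simple arithmetic. A quick sketch, using the illustrative figures from the discussion (a 1.2-year doubling period for data and roughly 10 percent annual growth in analysts, not measured data):

```python
# Illustrative comparison of exponential data growth versus roughly
# linear growth in the number of people who can analyze that data,
# using the figures cited in the discussion.

def data_multiplier(years, doubling_period=1.2):
    """Factor by which data volume grows over `years`,
    assuming it doubles every `doubling_period` years."""
    return 2 ** (years / doubling_period)

def analyst_multiplier(years, annual_rate=0.10):
    """Factor by which the analyst workforce grows over `years`
    at a modest `annual_rate` (10 percent here)."""
    return (1 + annual_rate) ** years

years = 10
data = data_multiplier(years)        # roughly 320x more data
people = analyst_multiplier(years)   # roughly 2.6x more analysts
print(f"After {years} years: data x{data:.0f}, analysts x{people:.1f}")
print(f"Data per analyst grows about {data / people:.0f}-fold")
```

Even if the exact rates differ, any exponential-versus-linear mismatch widens without bound, which is the argument for common understandings and common approaches.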

One of the reasons that it's very important for people to come in, get engaged, and start bringing in these use cases you've mentioned is that the sooner we get to common understandings and common approaches, the more efficient our industrial base and our use of big data will be.

The biggest challenge to actually attaining the value of Platform 3.0 will be having the human processes and the business processes needed to deal with the volume and velocity that Mark alluded to right at the beginning. To me that's a critical aspect that we've got to bring in -- how we get people aware of this as well.

Gardner: We're getting close to the end, but looking to the future, Dave, we think about the ability of the data to be so powerful when processed properly, when recommendations can be delivered to the right place at the right time, but we also recognize that there are limits to a manual or even human-level approach to that, scientist by scientist, analysis by analysis.

When we think about the implications of automation, it seems there are already some early examples where bringing cloud, data, social, mobile, and granular interactions together has begun to show how a recommendation engine could be brought to bear. I'm thinking about the Siri capability at Apple and even some of the examples of the Watson technology at IBM.

So to our panel, are there unknown unknowns about where this will lead in terms of having extraordinary intelligence -- a supercomputer, or a data center of supercomputers -- brought to bear on almost any problem instantly, with the result then delivered directly to a smart phone or any number of end points?

It seems that the potential here is mind boggling. Mark Skilton, any thoughts?

Skilton: What we're talking about is the next generation of the internet. The advent of IPv6 and the explosion in multimedia services will start to drive it.

I think that in the future, we'll be talking about a multiplicity of information that is not just about services at your location or your personal lifestyle or your working preferences. We'll see a convergence of information and services across multiple devices and new types of “co-presence services” that interact with your needs and social networks to provide predictive augmented information value.

When you start to get much more information about the context of where you are, insight into what's happening, and the predictive nature of these services, this becomes much more embedded in everyday life, in real time, in the context of what you are doing.

I expect to see much more intelligent applications coming forward on mobile devices in the next 5 to 10 years, driven by this interconnected explosion of real-time processing, data, traffic, devices, and social networking that we describe in the scope of Platform 3.0. This will add augmented intelligence, and it's something that's really exciting and a complete game changer. I would call it the next killer app.

First-mover benefits

Gardner: Chris Harding, there's this notion of intelligence brought to bear rapidly, in context, at a manageable cost. This seems to me a big change for businesses. We could, of course, go into the social implications as well, but for businesses, that alone would be an incentive to get thinking and acting on this. So any thoughts about where businesses that do this well would be able to gain significant advantage and first-mover benefits?

Harding: Businesses always are taking stock. They understand their environments. They understand how the world that they live in is changing and they understand what part they play in it. It will be down to individual businesses to look at this new technical possibility and say, "So now this is where we could make a change to our business." It's the vision moment where you see a combination of technical possibility and business advantage that will work for your organization.

It's going to be different for every business, and I'm very happy to say this, it's something that computers aren’t going to be able to do for a very long time yet. It's going to really be down to business people to do this as they have been doing for centuries and millennia, to understand how they can take advantage of these things.

So it's a very exciting time, and we'll see businesses understanding and developing their individual business visions as the starting point for a cycle of business transformation, which is what we'll be very much talking about in Philadelphia. So yes, there will be businesses that gain advantage, but I wouldn’t point to any particular business, or any particular sector and say, "It's going to be them" or "It's going to be them."

Gardner: Dave Lounsbury, a last word to you. In terms of some of the future implications and vision, where could this lead in the not too distant future?

Lounsbury: I'd disagree a bit with my colleagues on this, and this could probably be a podcast on its own, Dana. You mentioned Siri, and I believe IBM just announced the commercial version of its Watson recommendation and analysis engine for use in some customer-facing applications.

I definitely see these as the thin end of the wedge in filling that gap between the growth of data and the analysis of data. I can imagine, not in the next couple of years but in the next couple of technology cycles, that we'll see the concept of recommendations and analysis as a service -- to bring it full circle to cloud. And keep in mind that all of case law is data, and all of the medical textbooks ever written are data. Pick your industry, and there is a huge amount of knowledge base that humans must currently keep on top of.

This approach and these advances in recommendation engines, driven by the availability of big data, are going to produce profound changes in the way knowledge workers perform their jobs. That’s something that businesses, including their IT functions, absolutely need to stay in front of to remain competitive in the next decade or so.

Gardner: Well, great. I'm afraid we'll have to leave it there. There will be lots more to hear at the conference itself. Today we've been talking about the business implications of the shift to Platform 3.0. They're coming about, and we can start to plan for transitions. We've seen how Platform 3.0 provides a potential game-changing opportunity for companies to leverage advanced intelligence and automation and heighten productivity in their businesses.

This special BriefingsDirect discussion comes to you in conjunction with The Open Group Conference this July, in 2013, in Philadelphia. It’s not too late to register or to follow the proceedings online and also via Twitter. You'll hear more about Platform 3.0 as well as enterprise transformation and how that’s impacting specifically the finance, government, and healthcare sectors. [Registration to the conference remains open. Follow the conference on Twitter at #ogPHL.]

I'd like to thank our panel for joining us today. It has been very interesting. Thank you Dave Lounsbury, Chief Technical Officer at The Open Group.

Lounsbury: Thank you, Dana, thank you for hosting the discussion, and we look forward to seeing many of the listeners in Philadelphia.

Gardner: We've also been here with Chris Harding, Director of Interoperability at The Open Group. Thanks so much, Chris.

Harding: Thank you, Dana, it's been a great discussion.

Gardner: And lastly, thanks to Mark Skilton, Global Director in the Strategic Office at Capgemini. Thank you, sir.

Skilton: Thank you, Dana, and to Dave and Chris. It's been an interesting, very topical discussion. Thank you very much.

Gardner: This is Dana Gardner, Principal Analyst at Interarbor Solutions, your host and moderator throughout these thought leader interviews. Thanks again for listening, and come back next time.

Listen to the podcast. Find it on iTunes. Download the transcript. Sponsor: The Open Group.

Transcript of a BriefingsDirect podcast on how The Open Group is working to stay ahead of the converging challenges organizations face with big data, mobile, cloud, and social. Copyright Interarbor Solutions, LLC, 2005-2013. All rights reserved.

Monday, July 08, 2013

The Open Group July Conference Emphasizes Value of Placing Structure and Agility Around Enterprise Risk Reduction Efforts

Transcript of a BriefingsDirect podcast on how to achieve better risk management through better analysis of risk factors.

Listen to the podcast. Find it on iTunes. Download the transcript. Sponsor: The Open Group.

Dana Gardner: Hello, and welcome to a special BriefingsDirect Thought Leadership Interview series, coming to you in conjunction with The Open Group Conference on July 15, in Philadelphia. Registration to the conference remains open. Follow the conference on Twitter at #ogPHL.

I'm Dana Gardner, Principal Analyst at Interarbor Solutions, your host and moderator throughout these discussions on enterprise transformation in the finance, government, and healthcare sector.

We're here now with a panel of experts to explore new trends and solutions in the area of anticipating risk and how to better manage organizations with that knowledge. We'll learn how enterprises are better delivering risk assessments and, one hopes, defenses in the current climate of challenging cybersecurity. And we'll see how predicting risks and potential losses accurately is an essential ingredient in enterprise transformation.

With that, please join me in welcoming our panel. We're here with Jack Freund, Information Security Risk Assessment Manager at TIAA-CREF. Jack has spent over 14 years in enterprise IT, is a visiting professor at DeVry University, and also chairs a Risk-Management Subcommittee for ISACA. Welcome back, Jack.

Jack Freund: Glad to be here, Dana. Thanks for having me.

Gardner: We're also here with Jack Jones, Principal at CXOWARE, who has more than nine years of experience as a Chief Information Security Officer (CISO). He is also the inventor of the FAIR risk-analysis framework. Welcome, Jack.

Jack Jones: Thank you very much.

Gardner: We're also here with Jim Hietala, Vice President, Security, at The Open Group. Welcome, Jim. [Disclosure: The Open Group is a sponsor of BriefingsDirect podcasts.]

Jim Hietala: Thanks, Dana, good to be here.

Gardner: Let’s start with you, Jim. It’s been about six months since we spoke about these issues around risk assessment and understanding risk accurately, and it’s hard to imagine things getting any better in the last six months. There’s been a lot of news and interesting developments in the cyber-security landscape.

So has this heightened interest? What are The Open Group and others doing in this field of risk assessment and accuracy, in determining what your losses might be, and how can that be a useful tool?

Hietala: I would say it has. Certainly, in the cybersecurity world in the past six or nine months, we've seen more and more discussion of the threats that are out there. We’ve got nation-state types of threats that are very concerning, very serious, and that organizations have to consider.

With what’s happening, you've seen the US Administration and President Obama direct the National Institute of Standards and Technology (NIST) to develop a new cybersecurity framework. Certainly on the government side of things, there is an increased focus on what can be done to increase the level of cybersecurity throughout the country in critical infrastructure. So my short answer would be yes, there is more interest in coming up with ways to accurately measure and assess risk so that we can then deal with it.

Perception shift

Gardner: Jack Jones, do you also see a maturity going on, or are we just hearing more in the news and therefore there is a perception shift? How do you see things? How have things changed, in your perception, over the last six to nine months?

Jones: I continue to see growth and maturity, especially in areas of understanding the fundamental nature of risk and exploration of quantitative methods for it. A few years ago, that would have seemed unrealistic at best, and outlandish at worst in many people’s eyes. Now, they're beginning to recognize that it is not only pragmatic, but necessary in order to get a handle on much of what we have to do from a prioritization perspective.

Gardner: Jack Freund, are you seeing an elevation in the attention being paid to risk issues inside companies and larger organizations? Is this something that’s getting the attention of all the people it should?

Freund: We're entering a phase where there is going to be increased regulatory oversight over very nearly everything. When that happens, all eyes are going to turn to IT and IT risk management functions to answer the question of whether we're handling the right things. Without quantifying risk, you're going to have a very hard time saying to your board of directors that you're handling the right things the way a reasonable company should.

As those regulators start to see and compare among other companies, they'll find that these companies over here are doing risk quantification, and you're not. You're putting yourself at a competitive disadvantage by not being able to provide those same sorts of services.

Gardner: So you're saying that the market itself hasn’t been enough to drive this, and that regulation is required?

Freund: It’s probably a stronger driver than market forces at this point. The market is always going to be able to help push that to a more prominent role, but especially in information security. If you're not experiencing primary losses as a result of these sorts of things, then you have to look to economic externalities, which are largely put in play by regulatory forces here in the United States.

Jones: To support Jack’s statement that regulators are becoming more interested in this too, just in the last 60 days, I've spent time training people at two regulatory agencies on FAIR. So they're becoming more aware of these quantitative methods, and their level of interest is rising.

Gardner: Jack Jones, this is probably a good time for us to explain a little bit more about FAIR. For those listeners who might not be that familiar with it, please take a moment to give us the high-level overview of what FAIR is.

Jones: Sure, just a thumbnail sketch of it. It’s, first and foremost, a model for what risk is and how it works. It’s a decomposition of the factors that make up risk. If you can measure or estimate the value of those factors, you can derive risk quantitatively, in dollars and cents.

Risk quantification

You see a lot of “risk quantification” based on ordinal scales -- 1, 2, 3, 4, 5 scales, that sort of thing. But that’s actually not quantitative. If you dig into it, there's no way you could defend a mathematical analysis based on those ordinal approaches. So FAIR is this model for risk that enables true quantitative analysis in a very pragmatic way.
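To make the contrast with ordinal 1-to-5 scales concrete, here is a minimal sketch of the kind of quantitative estimate FAIR enables. This is not the full FAIR taxonomy, which decomposes frequency and magnitude further and uses calibrated estimates; the uniform ranges and dollar figures below are illustrative assumptions only:

```python
import random

def estimate_ale(freq_range, magnitude_range, trials=100_000, seed=42):
    """Monte Carlo estimate of annualized loss exposure (ALE), in dollars.

    freq_range: (min, max) loss events per year
    magnitude_range: (min, max) dollars lost per event
    Draws both factors from uniform ranges; a real FAIR analysis would
    use calibrated estimates and richer distributions.
    """
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        freq = rng.uniform(*freq_range)           # events per year
        magnitude = rng.uniform(*magnitude_range) # dollars per event
        total += freq * magnitude                 # loss for one simulated year
    return total / trials

# Hypothetical scenario: 0.5-2 loss events/year, $50k-$500k per event.
ale = estimate_ale((0.5, 2.0), (50_000, 500_000))
print(f"Estimated annualized loss exposure: ${ale:,.0f}")
```

The point of the exercise is that the output lands in dollars rather than an ordinal bucket, which is what makes comparisons and cost-benefit math defensible.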

Gardner: FAIR stands for Factor Analysis of Information Risk. Is that correct?

Jones: That is correct.

Gardner: Jim Hietala, in addition to a very interesting and dynamic cybersecurity landscape, we also have major trends getting traction in big data, cloud computing, and mobile. There's lots going on in the IT world. Perhaps IT's very nature, its roles and responsibilities, are shifting. Is doing risk assessment and management becoming part and parcel of the core competency of IT, and is that a fairly big departure from the past?

Hietala: As to the first question, it's having to become kind of a standard practice within IT. When you look at outsourcing your IT operations to a cloud-service provider, you have to consider the security risks in that environment. What do they look like and how do we measure them?

It's the same thing for mobile computing. You really have to look at the risks of folks carrying tablets and smart phones and understand the risks associated with those; the same goes for big data. For any of these large-scale changes to our IT infrastructure, you’ve got to understand what it means from a security and risk standpoint.

Gardner: Jack Freund or Jack Jones, any thoughts about the changing role of IT as a service and service-level agreement brokering aspects of IT aligned with risk assessment?

Freund: I read an interesting article this morning about a school district that is doing something they call bring your own technology (BYOT). For anybody who has been involved in these sorts of efforts in the corporate world, that should sound very familiar. But I want to think culturally about this. When you have students doing these sorts of things and becoming accustomed to being able to bring current technology, oh my gosh. When they get to the corporate world and start to work, they're going to expect the same levels of service.

To answer to your earlier question, absolutely. We have to find a way to embed risk assessment, which is really just a way to inform decision making and how we adapt all of these technological changes to increase market position and to make ourselves more competitive. That’s important.

Whether that’s an embedded function within IT or it’s an overarching function that exists across multiple business units, there are different models that work for different size companies and companies of different cultural types. But it has to be there. It’s absolutely critical.

Gardner: Jack Jones, how do you come down on this shifting role of IT in risk assessment? Is it something they see as their responsibility, and are they embracing that, or maybe wishing it away?

Jones: It depends on whom you talk to. Some of them would certainly like to wish it away. I don't think IT’s role in this idea for risk assessment and such has really changed. What is changing is the level of visibility and interest within the organization, the business side of the organization, in the IT risk position.

Board-level interest

Previously, they were more or less tucked away in a dark corner. People just threw money at it and hoped bad things didn't happen. Now, you're getting a lot more board-level interest in IT risk, and with that visibility comes responsibility, but also a certain amount of danger if they're doing it really badly or are incredibly immature in how they approach risk.

They're going to look pretty foolish in front of the board. Unfortunately, I've seen that play out. It’s never pretty and it's never good news for the IT folks. They're realizing that they need to come up to speed a little bit from a risk perspective, so that they won't look the fools when they're in front of these executives.

They're used to seeing quantitative measures of opportunities and operational issues of risk of various natures. If IT comes to the table with a red, yellow, green chart, the board is left to wonder, first how to interpret that, and second, whether these guys really get it. I'm not sure the role has changed, but I think the responsibilities and level of expectations are changing.

Gardner: Part of what FAIR does in risk analysis in general is to identify potential losses and put some dollars on what potential downside there is. That provides IT with the tool, the ability, to rationalize investments that are needed. Are you seeing the knowledge of potential losses to be an incentive for spending on modernization?

Jones: Absolutely. One organization I worked with recently had certain deficiencies from the security perspective that they were aware of, but that were going to be very problematic to fix. They had identified technology and process solutions that they thought would take them a long way towards a better risk position. But it was a very expensive proposition, and they didn't have money in the IT or information security budget for it.

So we did a current-state analysis using FAIR of how much loss exposure they had on an annualized basis. Then we said, "If you plug this solution into place, given how it affects the frequency and magnitude of loss that you'd expect to experience, here's what your new annualized loss exposure would be." It turned out to be a multimillion-dollar reduction in annualized loss exposure for a few hundred thousand dollars in cost.

When they took that business case to management, it was a no-brainer, and management signed the check in a hurry. So they ended up being in a much better position.

If they had gone to executive management saying, "Well, we’ve got a high risk and if we buy this set of stuff we’ll have low or medium risk," it would've been a much less convincing and understandable business case for the executives. There's reason to expect that it would have been challenging to get that sort of funding given how tight their corporate budgets were and that sort of thing. So, yeah, it can be incredibly effective in those business cases.
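The shape of that business case is simple arithmetic. Here is a sketch with hypothetical figures in the spirit of Jones's example (a multimillion-dollar reduction for a few hundred thousand dollars of spend); none of these are the actual client numbers:

```python
def risk_reduction_case(current_ale, projected_ale, solution_cost):
    """Summarize a FAIR-style business case: the annualized loss
    exposure (ALE) a proposed control removes, relative to its cost."""
    reduction = current_ale - projected_ale
    return {
        "ale_reduction": reduction,
        "benefit_cost_ratio": reduction / solution_cost,
    }

# Hypothetical figures, chosen to echo the example's proportions.
case = risk_reduction_case(
    current_ale=4_200_000,    # today's annualized loss exposure
    projected_ale=900_000,    # expected ALE with the solution in place
    solution_cost=300_000,    # cost of the proposed solution
)
print(f"ALE reduced by ${case['ale_reduction']:,} "
      f"({case['benefit_cost_ratio']:.0f}x the cost)")
```

Stated this way, the decision is the no-brainer Jones describes, whereas a "high risk becomes medium risk" framing carries none of that information.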

Gardner: Correct me if I am wrong, but you have a book out since we last spoke. Jack, maybe you could tell a bit about of that and how that comes to bear on these issues?

Freund: Well, the book is currently being written. Jack Jones and I have entered into a contract with Elsevier and we're also going to be preparing the manuscript here over the summer and winter. Probably by second quarter next year, we'll have something that we can share with everybody. It's something that has been a long time coming. For Jack, I know he has wanted to write this for a long time.

Conversational book

We wanted to build a conversational book around how to assess risk using FAIR, and that's an important distinction from other books in the market today, which really dig into a lot of the mathematical stuff. I'm speaking personally here, but I wanted to build a book that gave practitioners the risk tools to handle the common challenges and common opposition they face every day, and to understand how to apply the concepts in FAIR in a very tangible way.

Gardner: Very good. What about the conference itself? We're coming up very rapidly on The Open Group Conference. What should we expect in terms of some of your presentations and training activities?

Jones: I think it will be a good time. People will be pleased with the quality of the presentations and some of the new information that they'll get to see and experience. As you said, we're offering FAIR training as a part of the conference. It's a two-day session, with an opportunity afterwards to take the certification exam.

If history is any indication, people will enjoy the training. We get a lot of very positive remarks about a number of different things. One, they never imagined that risk could be interesting. They're also surprised that it's not, as one friend of mine calls it, "rocket surgery." It's relatively straightforward and intuitive stuff. It's just that, as a profession, we haven't had this framework for reference before, or some of the methods that we apply to make it practical and defensible.

So we've gotten great feedback in the past, and I think people will be pleasantly surprised at what they experienced.

Freund: One of the things I always say about FAIR training is it's a real red pill-blue pill moment -- in reference to the old Matrix movies. I took FAIR training several years ago with Jack. I always tease Jack that it's ruined me for other risk assessment methods. Once you learn how to do it right, it's very obvious which are the wrong methods and why you can't use them to assess risk and why it's problematic.

I'm joking. It's really great and valuable training, and now I use it every day. It really does open your eyes to the problems in the risk-assessment portion of IT today, and it gives you very practical and actionable things to do to fix that and provide value to your organization.

Gardner: Jim Hietala, the emphasis in terms of vertical industries at the conference is on finance, government, and healthcare. They seem to be the right groups to be factoring in more standardization and understanding of risk. Tell me how it comes together. Why is The Open Group looking at vertical industries at this time?

Hietala: Specific to risk, if I can talk about that for a second, the healthcare world, at least here in the US, has new security rules, and one of the first requirements is to perform an annual risk assessment. So it's currently relevant to that industry.

Same with finance

It’s the same thing with finance. One of the regulations around financial organizations tells them that, in terms of information security, they need to do a risk assessment. In government, clearly there has been a lot of emphasis on understanding risk and mitigating it throughout various government sectors.

In terms of The Open Group and verticals, we've done lots of great work in the area of enterprise architecture, security, and all the areas for which we've done work. In terms of our conferences, we've evolved things over the last year or so to start to look at what are the things that are unique in verticals.

It started in the mining industry. We set up a mining, metals, and exploration forum that looked at IT and architecture issues related specifically to that sector. We started that work several years ago, and now we're looking at other industries and starting to assess the unique things in healthcare, for example. We've got a one-day workshop in Philadelphia on the Tuesday of the conference, looking at IT and transformation opportunities in the healthcare sector.

That's how we got to this point, and we'll see more of that from The Open Group in the future.

Gardner: Are there any updates that we should be aware of in terms of activities within The Open Group and other organizations working on standards, taxonomy, and definitions when it comes to risk?

Hietala: I'll take that and dive into that. We at The Open Group originally published a risk taxonomy standard based on FAIR four years ago. Over time, we've seen greater adoption by large companies and we've also seen the need to extend what we're doing there. So we're updating the risk taxonomy standard, and the new version of that should be published by the end of this summer.

We also saw within the industry the need for a certification program for risk analysts, who would be trained in quantitative risk assessment using FAIR. We're working on that program, and we'll be talking more about it in Philadelphia. [Follow the conference on Twitter at #ogPHL.]

Along the way, as we were building the certification program, we realized that there was a missing piece in terms of the body of knowledge. So we created a second standard that is a companion to the taxonomy. It will be called the Risk Analysis Standard, and it looks more at some of the process issues and how to do risk analysis using FAIR. That standard will also be available by the end of the summer, and, combined, those two standards will form the body of knowledge that we'll be testing against in the certification program when it goes live later this year.

Gardner: Jack Freund, it seems that between regulatory developments, the need for maturity in these enterprises, and the standardization being brought to bear by groups such as The Open Group, this is becoming quite a bit more of a science and less of an art.

What does that bring to organizations in terms of a bottom-line effect? I wonder if there's a use case or even an example that you could mention that would help people better understand what they get back when they go through these processes and gain this better maturity around risk?

Risk assessment

Freund: I'm not an attorney, but I have had a lot of lawyers tell me -- I think Jim mentioned this before in his vertical conversation -- that a lot of the regulations start with performing an annual risk assessment and then choosing controls based upon that. They're not very prescriptive that way.

One of the things it drives in organizations is a sense of satisfaction that we've got things covered, more than anything else. When the leadership of these organizations understands that you're doing what a reasonable company would do to manage risk, you have fewer fire drills. Nobody likes to walk into work and have to deal with a hundred different things.

Are we moving hard drives out of printers and fax machines? What are we doing around scanning and vulnerabilities? All of those things can inundate you with worry every single day, as opposed to focusing on the things that matter.

I like a folksy saying that sort of sums things up pretty well -- a dime holding up a dollar. You have all these little bitty squabbly issues that get in the way of really focusing on reducing risk in your organization in meaningful ways and focusing on the things that matter.

Using approaches like FAIR drives a lot of value into your organization, because you're freeing up mind share in your executives to focus on things that really matter.

Gardner: Jack Jones, a similar question, any examples that exemplify the virtues of doing the due diligence and having some of these systems and understanding in place?

Jones: I have an example related to Jack Freund's point about being able to focus and prioritize. One organization I was working with had identified a significant risk issue, and three different options for mitigating it had been proposed. One was "best practice," and the other two were less commonly considered for that particular issue.

The analysis showed with real clarity that option B, one of the non-best-practice options, would reduce risk every bit as effectively as best practice, but at a much lower cost. The organization then got to make an informed decision about whether to be herd followers or to be more cost-effective in managing risk.

Unfortunately, there’s always danger in not following the herd. If something happens downstream, and you didn't follow best practice, you're often asked to explain why you didn't follow the herd.

That was part of the analysis too, but at the end of the day, management got to make a decision on how they wanted to behave. They chose to not follow best practice and be more cost-effective in using their money. When I asked them why they felt comfortable with that, they said, "Because we’re comfortable with the rigor in your analysis."

Best practice

To your question earlier about art versus science, first of all, in most organizations there would have been no question. They would have said, "We must follow best practice." They wouldn't even have examined the options, and management wouldn't have had the opportunity to make that decision.

Furthermore, even if they had "examined" those options using a more subjective, artistic approach -- somebody's wet finger in the air -- management almost certainly would not have felt comfortable with a non-best-practice choice. The more scientific, more rigorous approach that something like FAIR provides gives you all kinds of opportunity to make informed decisions and to feel more comfortable about those decisions.

Gardner: It really sounds as if there's a synergistic relationship between a lot of the big-data and analytics investments that are being made for a variety of reasons, and also this ability to bring more science and discipline to risk analysis.

How do those come together, Jack Jones? Are we seeing the dots being connected in these large organizations? Can they take what they garner from big data and business intelligence (BI) and apply it to these risk-assessment activities? Is that happening yet?

Jones: It's just beginning to. It's very embryonic, and there are probably only a couple of organizations out there that I would argue are doing it with any sort of effectiveness. Imagine that -- they're both using FAIR.

But when you think about BI or any sort of analytics, there are really two halves to the equation. One is data, and the other is models. You can have all the data in the world, but if your models stink, you can't be effective. And, of course, vice versa: if you've got a great model and zero data, you've got challenges as well.

Being able to combine the two -- good data and effective models -- puts you in a much better place. As an industry, we aren't there yet. There are some really interesting things going on, so there's a lot of potential, but people have to leverage that data effectively and make sure they're using a model that makes sense.

There are some models out there that, frankly, are so badly broken that all the data in the world isn't going to help you. The models will grossly misinform you. So people have to be careful, because data is great, but if you're applying it to a bad model, then you're in trouble.
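To make the "data plus model" point concrete, here is a minimal sketch of the kind of quantitative calculation a FAIR-style analysis performs: a Monte Carlo estimate that combines a simple model (annual loss exposure as loss event frequency times loss magnitude) with an analyst's range estimates. The function name, distributions, and numbers are illustrative assumptions for this example, not part of the FAIR standard itself.

```python
import random

def simulate_ale(freq_min, freq_mode, freq_max,
                 mag_min, mag_mode, mag_max,
                 iterations=10_000, seed=42):
    """Monte Carlo estimate of annualized loss exposure.

    Loss event frequency (events/year) and loss magnitude
    (dollars per event) are sampled from triangular
    distributions, the simplest way to encode an analyst's
    min / most-likely / max estimates.
    """
    rng = random.Random(seed)
    losses = []
    for _ in range(iterations):
        # random.triangular takes (low, high, mode)
        freq = rng.triangular(freq_min, freq_max, freq_mode)
        mag = rng.triangular(mag_min, mag_max, mag_mode)
        losses.append(freq * mag)
    losses.sort()
    return {
        "mean": sum(losses) / iterations,
        "p10": losses[int(0.10 * iterations)],
        "p90": losses[int(0.90 * iterations)],
    }

# Hypothetical estimates: 0.1-2 loss events per year (most likely 0.5),
# $50k-$500k per event (most likely $100k).
result = simulate_ale(0.1, 0.5, 2.0, 50_000, 100_000, 500_000)
print(f"Mean annual loss exposure: ${result['mean']:,.0f} "
      f"(10th-90th percentile: ${result['p10']:,.0f}-${result['p90']:,.0f})")
```

A real analysis would use calibrated estimates and richer distributions, but the shape of the calculation is the same: sample frequency and magnitude, combine them through the model, and report a range rather than a single number.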

Gardner: We are coming up near the end of our half hour. Jack Freund, for those organizations that are looking to get started, to get more mature, perhaps start leveraging some of their investments in areas like big data, in addition to attending The Open Group Conference or watching some of the plenary sessions online, what tips do you have for getting started? Are there some basic building blocks that should be in place or ways in which to get the ball rolling when it comes to a better risk analysis?

Freund: A strong personality matters in this. The organization has to have some sort of evangelist who cares enough about it to drive it through to completion. That's a stake in the ground to say, "Here is where we're going to start, and here is the path we're going to take."

Strong commitment

When you start doing that sort of thing, even if leadership changes and other things happen, you have a strong commitment from the organization to keep moving forward on these sorts of things.

I spend a lot of my time integrating FAIR with other methodologies. One of the messaging points that I keep saying all the time is that what we are doing is implementing a discipline around how we choose our risk rankings. That’s one of the great things about FAIR. It's universally compatible with other assessment methodologies, programs, standards, and legislation that allows you to be consistent and precise around how you're connecting to everything else that your organization cares about.

Concerns around operational-risk integration are important as well. Driving that through to completion in the organization has a lot to do with finding sponsorship and then building a program to completion. But absent that high-level sponsorship, because FAIR allows you to build a discipline around how you choose rankings, you can also build it from the bottom up.

You can have groups of people who are FAIR-trained who can build full risk analyses, or simply pick rankings -- 1, 2, 3, 4, or high, medium, low. Then, when questioned, you have the ability to say, "We think this is a medium, because it met the frequency and magnitude criteria we've established using FAIR."
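That idea of defensible rankings can be sketched in a few lines. This is a hypothetical illustration of the approach, not anything prescribed by FAIR; the thresholds and function name are invented for the example, and each organization would document its own criteria.

```python
def qualitative_ranking(annual_frequency, loss_magnitude):
    """Map a frequency/magnitude estimate onto a qualitative
    ranking using explicit, documented thresholds.

    annual_frequency: expected loss events per year
    loss_magnitude:   expected dollars lost per event
    """
    expected_annual_loss = annual_frequency * loss_magnitude
    if expected_annual_loss >= 1_000_000:
        return "high"
    if expected_annual_loss >= 100_000:
        return "medium"
    return "low"

print(qualitative_ranking(0.5, 400_000))    # 0.5 * $400k = $200k -> medium
print(qualitative_ranking(2.0, 1_000_000))  # 2.0 * $1M = $2M -> high
```

The ranking an executive sees is still "high, medium, low," but now there is an auditable, quantitative rationale behind each label instead of a gut call.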

Different organizations culturally are going to have different ways to implement and to structure quantitative risk analysis. In the end it's an interesting and reasonable path to get to risk utopia.

Gardner: Jack Jones, any thoughts from your perspective on a good way to get started, maybe even through the lens of the verticals that The Open Group has targeted for this conference, finance, government and healthcare? Are there any specific important things to consider on the outset for your risk analysis journey from any of the three verticals?

Jones: A good place to start is with the materials that The Open Group has made available on the risk taxonomy and the soon-to-be-published risk-analysis standard.

Another resource I recommend to everybody I talk to is a book called How to Measure Anything by Douglas Hubbard. If someone is even the least bit interested in actually measuring risk in quantitative terms, they owe it to themselves to read that book. It puts into layman's terms some very important concepts and approaches that are tremendously helpful.

Within organizations, some will have a relatively mature enterprise risk-management program at the corporate level, outside of IT. It can be hit-and-miss, but there can be some very good resources there in terms of people and processes the organization has already adopted. You have to be careful, though, because some of those enterprise risk-management programs, even though they may have been in place for years, and thus one would think had matured, have really just dug a deep ditch of bad practices and misconceptions.

So it's worth having the conversation with those folks to gauge how clueful they are, but don't assume that just because they've been in place for a while, or hold some specific title, they really understand risk at that level.

Gardner: Well, very good. I'm afraid we'll have to leave it there. We've been talking with a panel of experts about new trends and solutions in the area of anticipating risk, and how to better manage organizations with that knowledge. We've seen how enterprises are beginning to deliver better risk assessments as they face challenges in cyber security and the larger undertaking of enterprise transformation.

This special BriefingsDirect discussion comes to you in conjunction with The Open Group Conference in July 2013 in Philadelphia. There's more information on The Open Group website about attending the conference, watching the live stream, or downloading resources after the event. Follow the conference on Twitter at #ogPHL.

So with that thanks to our panel. We've been joined by Jack Freund, Information Security Risk Assessment Manager at TIAA-CREF. Thank you so much, Jack.

Freund: Thank you, Dana.

Gardner: And also Jack Jones, Principal at CXOWARE. Thank you, sir.

Jones: It's been my pleasure. Thanks.

Gardner: And then also lastly, Jim Hietala, Vice President, Security at The Open Group. Thank you, Jim.

Hietala: Thank you, Dana.

Gardner: And this is Dana Gardner, Principal Analyst at Interarbor Solutions, your host and moderator throughout this series of thought leadership interviews. Registration for the July 15 conference remains open if you'd like to attend in person. I hope to see you there. We'll also be conducting more BriefingsDirect podcasts from the conference, so watch for those in future posts. Thanks again for listening, and come back next time.

Listen to the podcast. Find it on iTunes. Download the transcript. Sponsor: The Open Group.

Transcript of a BriefingsDirect podcast on how to achieve better risk management through better analysis of risk factors. Copyright Interarbor Solutions, LLC, 2005-2013. All rights reserved.
