Tuesday, July 09, 2013

Platform 3.0 Ripe to Give Standard Access to Advanced Intelligence and Automation, Bring Commercial Benefits to Enterprises

Transcript of a BriefingsDirect podcast on how The Open Group is working to stay ahead of the converging challenges organizations face with big data, mobile, cloud, and social.

Listen to the podcast. Find it on iTunes. Download the transcript. Sponsor: The Open Group.

Dana Gardner: Hello, and welcome to a special BriefingsDirect Thought Leadership Interview series, coming to you in conjunction with The Open Group Conference on July 15, in Philadelphia.

I'm Dana Gardner, Principal Analyst at Interarbor Solutions, your host and moderator throughout these discussions on enterprise transformation in the finance, government, and healthcare sector. Registration to the conference remains open. Follow the conference on Twitter at #ogPHL.

We're here now with a panel of experts to explore the business implications of the current shift to so-called Platform 3.0. Known as the new model through which big data, cloud, mobile, and social -- in combination -- allow for advanced intelligence and automation in business, Platform 3.0 has so far lacked standards or even clear definitions.

The Open Group and its community are poised to change that, and we're here now to learn more about how to leverage Platform 3.0 as more than an IT shift -- as a business game-changer.

With that, please join me in welcoming our panel: Dave Lounsbury, Chief Technical Officer at The Open Group. Welcome, Dave.

Dave Lounsbury: Hi, Dana, happy to be here.

Gardner: We're also here with Chris Harding, Director of Interoperability at The Open Group. Welcome, Chris. [Disclosure: The Open Group is a sponsor of this and other BriefingsDirect podcasts.]

Chris Harding: Thank you, Dana, and it's great to be on this panel.

Gardner: And also Mark Skilton, Global Director in the Strategy Office at Capgemini. Welcome, Mark.

Mark Skilton: Hi, Dana, thanks for inviting us today. I'm very happy to be here.

Gardner: A lot of people are still wrapping their minds around this notion of Platform 3.0, something that is a whole greater than the sum of the parts. Why is this more than an IT conversation or a shift in how things are delivered? Why are the business implications momentous?

Lounsbury: Well, Dana, there are a lot of IT changes, or technical changes, going on that are bringing together a lot of factors. They're turning into a sort of super-saturated solution of ideas and possibilities, and into this emerging idea that this represents a new platform. I think it's a pretty fundamental change.

If you look at history, not just the history of IT, but all of human history, you see that step changes in societies and organizations are frequently driven by communication or connectedness. Think about the evolution of speech or the invention of the alphabet or movable-type printing. These technical innovations that we’re seeing are bringing together these vast sources of data about the world around us and doing it in real time.

Further, we're starting to see a lot of rapid evolution in how you turn data into information and present the information in a way that people can make decisions on it. Given all that, we're starting to realize we're on the cusp of another step change in connectedness and awareness.

Fundamental changes

This really is going to drive some fundamental changes in the way we organize ourselves. Part of what The Open Group is doing in bringing Platform 3.0 together is trying to get ahead of this and make sure that we understand not just what technical standards are needed, but how businesses will need to adapt and what business processes they need to put in place, in order to take maximum advantage of this change in the way we look at information.

Gardner: Chris Harding, is there a time issue here? Is this something that organizations should sit back on, watch how it unfolds, and then gauge their response? Or is there a benefit to being out in front of this in some way?

Harding: I don't know about in front of it. Enterprises have to keep up with the way that things are moving in order to hold their positions in their industries. Enterprises can't afford to be working with yesterday's technology. It's a case of being able to understand the information they're presented with and make the best decisions based on it.

We've always talked about computers being about input, process, and output. Years ago, the input might have been through a teletype, the processing on a computer in the back office, and the output on print-out paper.

Now, we're talking about the input being through a range of sensors and social media, the processing is done on the cloud, and the output goes to your mobile device, so you have it wherever you are when you need it. Enterprises that stick in the past are probably going to suffer.

Gardner: Mark Skilton, the ability to manage data at greater speed and scale -- the whole three Vs of velocity, volume, and value -- on its own could perhaps be a game-changing shift in the market. The drive of mobile devices into the lives of both consumers and workers is also a very big deal.

Of course, cloud has been an ongoing evolution of emphasis towards agility and efficiency in how workloads are supported. But is there something about the combination of how these are coming together at this particular time that, in your opinion, substantiates The Open Group’s emphasis on this as a literal platform shift?

Skilton: It is exactly that in terms of the workloads. The world we're now into is the multi-workload environment, where you've got mobile workloads, storage and compute workloads, and social networking workloads. There are many different types of data and traffic today in different cloud platforms and devices.

It has to do with not just one solution or one subscription model, because we're now into the subscription-model era -- the subscription economy, as one group tends to describe it. We're now looking at not just providing the security and the infrastructure to deliver this kind of capability to a mobile device, as Chris was saying. The question is, how can you do this horizontally across other platforms? How can you integrate these things? This is something that is critical to the new order.

So Platform 3.0 is addressing this point by bringing all this together. Just look at the numbers. Look at the scale that we're dealing with -- 1.7 billion mobile devices sold in 2012, and an estimated 6.8 billion subscriptions, according to the International Telecommunication Union (ITU), equivalent to 96 percent of the world population.

Massive growth

We've had massive growth in the scale of mobile data traffic and internet data expansion. Mobile data traffic is increasing 18-fold from 2011 to 2016, reaching 130 exabytes annually. We passed 1 zettabyte of global online data storage back in 2010, and IP data traffic is predicted to pass 1.3 zettabytes by 2016, with internet video accounting for 61 percent of total internet data, according to Cisco studies.

These studies also predict that data-center traffic, combining network and internet-based storage, will reach 6.6 zettabytes annually, and nearly two-thirds of this will be cloud-based by 2016. This is only going to grow. Social networking is reaching nearly one in four people around the world, with 1.7 billion using at least one form of social networking in 2013, rising to one in three people, a 2.55 billion global audience, by 2017 -- another extraordinary figure, from an eMarketer study.

It's not surprising that many industry analysts are seeing 30 to 40 percent growth in the converging technologies of mobility, social computing, big data, and cloud, or that the shift to B2C commerce, which passed $1 trillion in 2012, is just the start of a wider digital transformation.

These numbers speak volumes in terms of the integration, interoperability, and connection of the new types of business and social realities that we have today.

Gardner: Dave Lounsbury, back to you. Why should IT be thinking about this as a fundamental shift, rather than a step change or a modest change? It seems to me that this combination of factors almost blows the whole IT definition of 10 years ago out of the water. Is it that big a deal for IT? It also has an impact on business, but I'd like to focus on how IT organizations might need to start rethinking things.

Lounsbury: A lot depends on how you define your IT organization. It's useful to separate the plumbing from the water. If we think of the water as the information that’s flowing, it's how we make sure that the water is pure and getting to the places where you need to have the taps, where you need to have the water, etc.

But the plumbing also has to be up to the job. It needs to have the capacity. It needs to have new tools to filter out the impurities from the water. There's no point giving someone data if it's not been properly managed or if there's incorrect information.

What's going to happen in IT is that not only do we have to focus on the mechanics of the plumbing, where we see things like the big databases emerging from the open-source world and things of that nature, but also on the analytics and the data-stewardship aspects of it.

We need to bring in mechanisms so the data is valid and kept up to date. We need to indicate its freshness to the decision makers. Furthermore, IT is going to be called upon, whether as part of the enterprise IT or where end users drive the selection, to choose analytic tools and recommendation tools that take the data and turn it into information. One of the things you can't do with business decision makers is overwhelm them with big rafts of data and expect them to figure it out.

You really need to present the information in a way that they can use to quickly make business decisions. That is an addition to the role of IT that may not have been there traditionally -- how you think about the data and the role of what, in the beginning, was called data scientist and things of that nature.

Shift in constituency

Skilton: I'd just like to add to Dave's excellent points about how the shape of data has changed, and also about why IT should get involved. We're seeing a shift in the constituency of who is using this data.

We've got the Chief Marketing Officer, the Chief Procurement Officer, and other key line-of-business managers taking more direct control over the uses of information technology that enable their channels and interactions through mobile, social, and data analytics. We've got processes that were previously managed just by IT but are now being consumed by significant stakeholders and investors in the organization.

We have to recognize in IT that we are the masters of our own destiny. The information needs to be served to new types of mobile devices, through new types of data intelligence and new ways of delivering this kind of service.

I recently read an article in MIT Sloan Management Review that asked what the role of the CIO is now. There is still the critical role of managing the security, compliance, and performance of these systems. But there's also a socialization of IT, and this is where positioning cross-platform architectures is key to delivering real value to business users and the IT community.

Gardner: So we have more types of users -- more classes of individuals and resources within an enterprise -- starting to avail themselves of these intelligence capabilities more ubiquitously, vis-à-vis the mobile and cloud delivery opportunity.

How do we prevent this from going off the rails? How do we avoid creating multiple fire hoses of information, or too much data with not enough analysis? Chris Harding, any thoughts about where The Open Group or others can step in to help make this a more fruitful, rather than chaotic, transition?

Harding: This is a very important point. And to add to the difficulties, it's not only that a whole set of different people are getting involved with different kinds of information; there's also a step change in the speed with which all this is delivered. It's no longer the case that you can say, "Oh well, we need some kind of information system to manage this information. We'll procure it and get a program written," and a year later it would be in place delivering reports.

Now, people are looking to make sense of this information on the fly, if possible. It's really a case of having the standard technology platform, and also the systems for using it -- the business processes -- understood and in place.

Then you can do all these things quickly and build on what people have learned in the past, rather than go off into all sorts of new experimental things that might not lead anywhere. It's a case of building up the standard platform and industry best practice. This is where The Open Group can really help things along, by being a recipient and a reflector of best practices and standards.

Lounsbury: I'd like to expand on that a little bit if I could, Dana. I agree with all the points that Chris and Mark just made. We should also mention that it's not just the speed of the analysis on the consumption side. We're going to see a lot of rapid evolution in the input side as well.

New data sources

We're starting to see a lot of new data sources come online. We've touched on the mobile devices and the social networks that those mobile devices enable, but we're also on the cusp of this idea of the "Internet of things," where there is a vast globe full of network-connected sensors and actuators out there, all of which produce their own data.

Part of the process and the best practices Chris alluded to is how you run your business processes so that you keep your feeds up to date, so that you can adapt quickly to new sources of information, as well as to new demands for information from the lines of business.

Gardner: It seems to be somewhat unprecedented that we have multiple change agents playing off of one another with complexity, scale, and velocity all very much at work. It's one thing to have a vision about how you would want to exploit this, but it's another to have a plan about how to go about that.

Mark Skilton, with your knowledge of Capgemini and the role that they play in the market, it seems to me that there's a tremendous need for some examples or some sense of how to go about managing the ability to exploit Platform 3.0 without getting tripped up and overwhelmed in the process.

Skilton: That's right. Capgemini has been doing work in this area. I break it down into four levels of scalability. The first is platform scalability: understanding what you can do with your current legacy systems when introducing cloud computing or big data, and the infrastructure that gives you what we call multiplexing of resources. We're very much seeing this idea of introducing scalable platform resource management, and you see that a lot with the heritage of virtualization.

The second is network scalability. A lot of customers who have inherited old telecommunications networks are looking to introduce new MPLS-type scalable networks. The reason for this is that it's all about connectivity in the field. I meet a number of clients who are saying, "We've got this cloud service," or "This service is in a certain area of my country. If I move to another part of the country, or I'm traveling, I can't get connectivity." That's the big issue of scaling.

The third is application programming interfaces (APIs). What we're seeing now is an explosion of integration and application services using API connectivity, and these are creating huge opportunities for what Chris Anderson of Wired called the "long tail effect." It is now a reality in terms of building the kind of social connectivity and data exchange that Dave was talking about.

Finally, there are the marketplaces. Companies need to think about what online marketplaces they need for digital branding, social branding, social networks, and awareness of their customers, suppliers, and employees. Customers can see that these four levels are where they need to start when thinking about IT strategy, and Platform 3.0 is right on target in trying to work out the strategies for each of these new levels of scalability.

Gardner: Dave Lounsbury, we're coming up on The Open Group Conference in Philadelphia very shortly. What should we expect from that? What is The Open Group doing vis-à-vis Platform 3.0, and how can organizations benefit from seeing a more methodological or standardized approach to rationalizing all of this complexity?

Lounsbury: We're still in the formational stages of the "third platform," or Platform 3.0, for The Open Group and for the industry. To some extent, we're starting pretty much at the ground floor with the Platform 3.0 Forum. We're leveraging a lot of the components that have been developed previously by members of The Open Group in cloud, service-oriented architecture (SOA), and some of the work on the Internet of things.

First step

Our first step is to bring those things together to make sure that we've got a foundation to depart from. The next step is that, through our Platform 3.0 Forum and its Steering Committee, we can ask people to talk about what their scenarios are for adoption of Platform 3.0.

That can range from the technological aspects and what standards are needed to, taking a cue from our previous cloud working group, the best business practices for understanding and then adopting some of these Platform 3.0 concepts to get your business using them.

What we're really working towards in Philadelphia is to set up an exchange of ideas among people who can, from the buy side, bring in their use cases and, from the supply side, bring in their ideas about what the technology possibilities are -- and then bring those together and start to shape a set of tracks where we can create business and technical artifacts that will help businesses adopt the Platform 3.0 concept.

Gardner: Anything to offer on that, Chris?

Harding: There are some excellent points there. We certainly need to understand the business environment within which Platform 3.0 will be used. We've heard already about new players and new roles of various kinds that are appearing, and the fact that the technology is there and business is adapting to use it in new ways.

For example, we've heard about the data scientist. The data scientist is a new kind of role, a new kind of person, who is playing a particular part in all of this within enterprises. We're also hearing about marketplaces for services -- new ways in which services are being made available and combined.

We really need to understand the actors in this new kind of business scenario. What are the pain points that people are having? What are the problems that need to be resolved in order to understand what kind of shape the new platform will have? That is one of the key things that the Platform 3.0 Forum members will be getting their teeth into.

Gardner: At the same time, The Open Group is looking to bring more vertical-industry emphasis to its activities. At the Philadelphia conference, you've chosen finance, government, and healthcare. Dave or Chris, is there something about these three vertical industries that makes them excellent test cases for Platform 3.0? Is there something about going into a vertical industry that helps with the transition to 3.0, rather than a general, one-size-fits-all approach? What's the impact of vertical-industry emphasis on this transition?

Lounsbury: First, I'll note that the overarching theme of The Open Group conferences is business transformation -- how you adapt and evolve your business to take better advantage of the efficiencies afforded by IT and other developments. So, as a horizontal activity, Platform 3.0 fits in very well with that, because I believe the transformational drivers from the evolution of Platform 3.0 are going to affect all industries.

To get back to your question, the benefit of Platform 3.0 will be most immediately and urgently felt in vertical industries that deal with extremely large volumes of data and need to filter very large volumes of data in order to achieve their business objectives and run their businesses efficiently.

For example, one of the things that healthcare is struggling with right now is the mass of patient records that need to be managed. How do caregivers or care providers make sense of those, make sure that everybody is up to date, and make sure that everybody is simply working off the same data? It's a core question for them.

Today's problem

That's today's problem, and some of the infrastructure of Platform 3.0 will undoubtedly help with it. When you look at care not only as an individual topic, how my doctor or nurse gives care to me, but in terms of the larger trends in healthcare, such as how certain drugs affect certain diseases, it's a perfect example of using data and strong analytics to get information. We couldn't have gotten that before, simply because we couldn't bring it together and understand it.

In some sense, the biotech industry has been leading this trend. Genomics has really seeded a lot of the big-data capabilities.

That will be a very exciting area for healthcare. If you go into any Apple Store, you'll see a whole retail rack of gadgets that you wear on your body that tell you how fit you are -- or how fit you aren't, in some cases. They'll tell you your pulse, your heart rate, and your body mass index. We're getting very close to a time when we'll have things that might even measure and report bits of your blood chemistry. We're very close to that, for example, with blood sugar.

That data might, through the concepts of Platform 3.0, provide a really personalized and much more immediate healthcare loop in patient care. Again, these are all things a few years out. The Open Group is deliberately choosing to get in early, so we and our members can be informed about these trends, how to take advantage of them, and what standards are going to be needed to do it.

We could go on about finance too, but it's another area with massive data that will need to be correlated and analyzed.

Gardner: You are saying that not only are we facing an internet of things, we're going to be facing an internet of living things as well. So, there's a lot of data to come.

One of the great things about The Open Group that I've observed over the years is that it provides a super important environment for different types of organizations to collaborate, share their stories, and understand what others are doing, both in their own vertical industries and in other types of business.

I expect that's really going to be a huge benefit to organizations as they transition toward Platform 3.0 -- to learn from how others are doing it, and even how others have stumbled along the way. But do you have any early indicators, examples, or use cases that would illustrate just how important this is, and how instrumental it can be in helping companies?

Let's go across our panel. Mark Skilton at Capgemini, any examples we could point to that would indicate that when you do this well -- when you transition, when you take advantage of all these changes in tandem -- you get pragmatic and even measurable benefits?

Skilton: Identifying business value is the key, and that builds on what Dave was talking about in terms of having new types of data, sensors, and capabilities. What we're finding is that clients are dealing with this in eHealth, eGovernment, and eFinance.

Cost of health care

In the health sector, rising costs and the increasing life expectancy and longevity of the population are putting pressure on healthcare budgets in many countries. eHealth initiatives, the use of new technologies such as mobile patient monitoring, and improved digital patient-record management and care planning aim to drive down the cost of medical care while improving patients' quality of life.

In the federal government sector, eGov initiatives seek to develop citizen services and value for money in public spending programs. Open-data initiatives aim to develop information sharing and the marketing of services.

What can we do there to accelerate the adoption of services across markets? How can we bring mobile services to customers quickly? How can we grow different vertical and horizontal markets? They're looking for a convergence of Platform 3.0 services where they can offer portal services.

In the finance sector, we see the adoption of new technologies to scale to multiple consumer markets, with rapid insight and large-scale data analytics to profile financial behavior and credit risk, for example.

A recent seminar I was involved in was about cost avoidance -- avoiding the future cost of investing in more infrastructure. How can you bring big data and social capabilities together, create new experiences and improve quality of life, and improve the value citizens get from their government's services? How can you drive new financial processes and services? There are many similar case studies across multiple industries.

Gardner: Chris Harding, being involved with interoperability so deeply, are there any examples or use cases you can point to where organizations are not only looking internally for better efficiency and productivity gains, but perhaps expanding the capabilities of Platform 3.0 outside of their organizations into an ecosystem or even further? What are some of the visions around extending 3.0 benefits into a wider, collaborative environment?

Harding: If you want a practical, but historical, example of how shared information collection, analytics, and distribution can empower a whole industry, you only have to look at the finance industry, where it's actually been commonplace for some time. Shared information is collected in real time, various companies analyze it, and it's distributed and made available in graphical form. You can probably get it on your mobile phone if you want.

Imagine how that kind of information processing ability could be translated into other areas, such as healthcare, so that on a routine basis, medical people could get up-to-the-minute information on critical patients wherever they are. You can see what possibilities we are looking at.

But it's really early days yet. The idea of Platform 3.0 is only just crystallizing, and the point, to pick up on Mark's comment, is that enterprises everywhere are constantly under pressure to do more and more with fewer and fewer resources. That's why some kind of standard platform that enables industries across the board to take advantage of these possibilities is something we really need.

Lounsbury: We all know the Gartner hype cycle. We get out on the early edge of things, we see the possibilities, and then there's the trough of disillusionment. Chris has touched on something very important that I think is necessary for a successful transition to this Platform 3.0 world we've envisioned.

Data growth

One of the big risks here is that we see figures saying the amount of data produced doubles every 1.2 years, while the number of people who can deal with that data -- data scientists and the like -- grows pretty much linearly. Maybe it's 5 percent a year or 10 percent a year, but it's not doubling every 1.2 years.

One of the reasons it's very important for people to come in, get engaged, and start bringing in these use cases you've mentioned is that the sooner we have common understandings and common approaches, the more efficient our industrial base and our use of big data will be.

The biggest challenge to actually attaining the value of Platform 3.0 will be having the human processes and the business processes needed to deal with the volume and velocity that Mark alluded to right at the beginning. To me, that's a critical aspect we've got to bring in -- how we make people aware of this as well.

Gardner: We're getting close to the end, but looking to the future, Dave, we think about how powerful the data can be when processed properly, when recommendations can be delivered to the right place at the right time. But we also recognize that there are limits to a manual, or even human-level, approach to that -- scientist by scientist, analysis by analysis.

When we think about the implications of automation, there are already some early examples of what happens when cloud, data, social, and mobile come together at a fine granularity of interactions, and we've begun to see how a recommendation engine could be brought to bear. I'm thinking about the Siri capability at Apple, and even some of the examples of the Watson technology at IBM.

So, to our panel: are there unknown unknowns about where this will lead in terms of having extraordinary intelligence -- a supercomputer, or a data center of supercomputers -- brought to bear on almost any problem instantly, with the result delivered directly to a smartphone or any number of endpoints?

It seems that the potential here is mind-boggling. Mark Skilton, any thoughts?

Skilton: What we're talking about is the next generation of the internet. The advent of IPv6 and the explosion in multimedia services will start to drive it.

I think that, in the future, we'll be talking about a multiplicity of information that is not just about services at your location, your personal lifestyle, or your working preferences. We'll see a convergence of information and services across multiple devices, and new types of "co-presence services" that interact with your needs and social networks to provide predictive, augmented information value.

When you start to get much more information about the context of where you are, insight into what's happening, and the predictive nature of these services, it becomes something much more embedded in everyday life, in real time, in the context of what you're doing.

I expect to see much more intelligent applications coming forward on mobile devices in the next 5 to 10 years, driven by this interconnected explosion of real-time processing of data, traffic, devices, and social networking that we describe in the scope of Platform 3.0. This will add augmented intelligence, and it's something that's really exciting -- a complete game changer. I would call it the next killer app.

First-mover benefits

Gardner: Chris Harding, there's this notion of intelligence brought to bear rapidly, in context, at a manageable cost. This seems to me a big change for businesses. We could, of course, go into the social implications as well, but for businesses alone, that to me would be an incentive to get thinking and acting on this. Any thoughts about where businesses that do this well would be able to gain significant advantage and first-mover benefits?

Harding: Businesses are always taking stock. They understand their environments. They understand how the world they live in is changing, and they understand what part they play in it. It will be down to individual businesses to look at this new technical possibility and say, "So now this is where we could make a change to our business." It's the vision moment, where you see a combination of technical possibility and business advantage that will work for your organization.

It's going to be different for every business, and I'm very happy to say this, it's something that computers aren’t going to be able to do for a very long time yet. It's going to really be down to business people to do this as they have been doing for centuries and millennia, to understand how they can take advantage of these things.

So it's a very exciting time, and we'll see businesses understanding and developing their individual business visions as the starting point for a cycle of business transformation, which is what we'll be very much talking about in Philadelphia. So yes, there will be businesses that gain advantage, but I wouldn’t point to any particular business, or any particular sector and say, "It's going to be them" or "It's going to be them."

Gardner: Dave Lounsbury, a last word to you. In terms of some of the future implications and vision, where could this lead in the not-too-distant future?

Lounsbury: I'd disagree a bit with my colleagues on this, and this could probably be a podcast on its own, Dana. You mentioned Siri, and I believe IBM just announced the commercial version of its Watson recommendation and analysis engine for use in some customer-facing applications.

I definitely see these as the thin end of the wedge in filling that gap between the growth of data and the analysis of data. I can imagine, not in the next couple of years, but in the next couple of technology cycles, that we'll see the concept of recommendations and analysis as a service -- to bring it full circle to cloud. And keep in mind that all of case law is data, and all of the medical textbooks ever written are data. Pick your industry, and there is a huge amount of knowledge base that humans must currently keep on top of.

This approach, and these advances in recommendation engines driven by the availability of big data, are going to produce profound changes in the way knowledge workers perform their jobs. That's something that businesses, including their IT functions, absolutely need to stay in front of to remain competitive in the next decade or so.

Gardner: Well, great. I'm afraid we'll have to leave it there. There will be lots more to hear at the conference itself. Today we've been talking about the business implications of the shift to Platform 3.0. They're coming about, and we can start to plan for transitions. We've seen how Platform 3.0 provides a potential game-changing opportunity for companies to leverage advanced intelligence and automation and heighten productivity in their businesses.

This special BriefingsDirect discussion comes to you in conjunction with The Open Group Conference this July 2013 in Philadelphia. It's not too late to register or to follow the proceedings online and via Twitter. You'll hear more about Platform 3.0, as well as enterprise transformation and how that's impacting the finance, government, and healthcare sectors specifically. [Registration to the conference remains open. Follow the conference on Twitter at #ogPHL.]

I'd like to thank our panel for joining us today. It has been very interesting. Thank you Dave Lounsbury, Chief Technical Officer at The Open Group.

Lounsbury: Thank you, Dana, thank you for hosting the discussion, and we look forward to seeing many of the listeners in Philadelphia.

Gardner: We've also been here with Chris Harding, Director of Interoperability at The Open Group. Thanks so much, Chris.

Harding: Thank you, Dana, it's been a great discussion.

Gardner: And lastly, thanks to Mark Skilton, Global Director in the Strategy Office at Capgemini. Thank you, sir.

Skilton: Thank you, Dana, and to Dave and Chris. It's been an interesting, very topical discussion. Thank you very much.

Gardner: This is Dana Gardner, Principal Analyst at Interarbor Solutions, your host and moderator throughout these thought leader interviews. Thanks again for listening, and come back next time.

Listen to the podcast. Find it on iTunes. Download the transcript. Sponsor: The Open Group.

Transcript of a BriefingsDirect podcast on how The Open Group is working to stay ahead of the converging challenges organizations face with big data, mobile, cloud, and social. Copyright Interarbor Solutions, LLC, 2005-2013. All rights reserved.


Monday, July 08, 2013

The Open Group July Conference Emphasizes Value of Placing Structure and Agility Around Enterprise Risk Reduction Efforts

Transcript of a BriefingsDirect podcast on how to achieve better risk management through better analysis of risk factors.

Listen to the podcast. Find it on iTunes. Download the transcript. Sponsor: The Open Group.

Dana Gardner: Hello, and welcome to a special BriefingsDirect Thought Leadership Interview series, coming to you in conjunction with The Open Group Conference on July 15, in Philadelphia. Registration to the conference remains open. Follow the conference on Twitter at #ogPHL.

Gardner
I'm Dana Gardner, Principal Analyst at Interarbor Solutions, your host and moderator throughout these discussions on enterprise transformation in the finance, government, and healthcare sector.

We're here now with a panel of experts to explore new trends and solutions in the area of anticipating risk and how to better manage organizations with that knowledge. We'll learn how enterprises are better delivering risk assessment -- and, one hopes, defenses -- in the current climate of challenging cybersecurity. And we'll see how predicting risks and potential losses accurately is an essential ingredient in enterprise transformation.

With that, please join me in welcoming our panel. We're here with Jack Freund, Information Security Risk Assessment Manager at TIAA-CREF. Jack has spent over 14 years in enterprise IT, is a visiting professor at DeVry University, and also chairs a risk-management subcommittee for ISACA. Welcome back, Jack.

Jack Freund: Glad to be here, Dana. Thanks for having me.

Gardner: We're also here with Jack Jones, Principal at CXOWARE. He has more than nine years of experience as a chief information security officer (CISO) and is the inventor of the FAIR risk analysis framework. Welcome, Jack.

Jack Jones: Thank you very much.

Gardner: We're also here with Jim Hietala, Vice President, Security, at The Open Group. Welcome, Jim. [Disclosure: The Open Group is a sponsor of BriefingsDirect podcasts.]

Jim Hietala: Thanks, Dana, good to be here.

Gardner: Let's start with you, Jim. It's been about six months since we spoke about these issues around risk assessment and understanding risk accurately, and it's hard to imagine things having gotten any better in the last six months. There's been a lot of news and interesting developments in the cybersecurity landscape.

So has this heightened interest? What are The Open Group and others doing in this field of risk assessment -- accurately determining what your losses might be -- and how can that be a useful tool?

Hietala: I would say it has. Certainly, in the cybersecurity world in the past six or nine months, we've seen more and more discussion of the threats that are out there. We’ve got nation-state types of threats that are very concerning, very serious, and that organizations have to consider.

Hietala
With what's happening, you've seen the US Administration and President Obama direct the National Institute of Standards and Technology (NIST) to develop a new cybersecurity framework. Certainly on the government side of things, there is an increased focus on what can be done to raise the level of cybersecurity throughout the country in critical infrastructure. So my short answer would be yes, there is more interest in coming up with ways to accurately measure and assess risk so that we can then deal with it.

Perception shift

Gardner: Jack Jones, do you also see a maturity going on, or are we just hearing more in the news and therefore there is a perception shift? How do you see things? How have things changed, in your perception, over the last six to nine months?

Jones
Jones: I continue to see growth and maturity, especially in areas of understanding the fundamental nature of risk and exploration of quantitative methods for it. A few years ago, that would have seemed unrealistic at best, and outlandish at worst in many people’s eyes. Now, they're beginning to recognize that it is not only pragmatic, but necessary in order to get a handle on much of what we have to do from a prioritization perspective.

Gardner: Jack Freund, are you seeing an elevation in the attention being paid to risk issues inside companies and larger organizations? Is this something that's getting the attention of all the people it should?

Freund: We're entering a phase where there is going to be increased regulatory oversight over very nearly everything. When that happens, all eyes are going to turn to IT and IT risk management functions to answer the question of whether we're handling the right things. Without quantifying risk, you're going to have a very hard time saying to your board of directors that you're handling the right things the way a reasonable company should.

As those regulators start to see and compare among other companies, they'll find that these companies over here are doing risk quantification, and you're not. You're putting yourself at a competitive disadvantage by not being able to provide those same sorts of services.

Gardner: So you're saying that the market itself hasn’t been enough to drive this, and that regulation is required?

Freund
Freund: It’s probably a stronger driver than market forces at this point. The market is always going to be able to help push that to a more prominent role, but especially in information security. If you're not experiencing primary losses as a result of these sorts of things, then you have to look to economic externalities, which are largely put in play by regulatory forces here in the United States.

Jones: To support Jack’s statement that regulators are becoming more interested in this too, just in the last 60 days, I've spent time training people at two regulatory agencies on FAIR. So they're becoming more aware of these quantitative methods, and their level of interest is rising.

Gardner: Jack Jones, this is probably a good time for us to explain a little bit more about FAIR. For those listeners who might not be that familiar with it, please take a moment to give us the high-level overview of what FAIR is.

Jones: Sure, just a thumbnail sketch of it. It's, first and foremost, a model for what risk is and how it works. It's a decomposition of the factors that make up risk. If you can measure or estimate the value of those factors, you can derive risk quantitatively, in dollars and cents.

Risk quantification

You see a lot of “risk quantification” based on ordinal scales -- 1, 2, 3, 4, 5 scales, that sort of thing. But that’s actually not quantitative. If you dig into it, there's no way you could defend a mathematical analysis based on those ordinal approaches. So FAIR is this model for risk that enables true quantitative analysis in a very pragmatic way.
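Jones's description of deriving risk in dollars and cents can be made concrete with a small sketch. The following is not drawn from the FAIR standard itself: the triangular distributions and every parameter value are illustrative assumptions. But it shows the basic shape of a quantitative analysis -- estimate ranges for loss event frequency and loss magnitude, then derive an annualized loss exposure in dollars rather than an ordinal score:

```python
import random

def annualized_loss_exposure(freq_range, magnitude_range, trials=100_000, seed=1):
    """Monte Carlo estimate of annualized loss exposure (ALE), in dollars.

    freq_range and magnitude_range are (low, high, most_likely) estimates
    for loss event frequency (events/year) and loss magnitude ($/event).
    """
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        freq = rng.triangular(*freq_range)           # events per year
        magnitude = rng.triangular(*magnitude_range)  # dollars per event
        total += freq * magnitude
    return total / trials

# Illustrative estimates only: 0.1-2 loss events/year (most likely 0.5),
# $50k-$2M per event (most likely $250k).
ale = annualized_loss_exposure((0.1, 2.0, 0.5), (50_000, 2_000_000, 250_000))
print(f"Estimated ALE: ${ale:,.0f}")
```

A real FAIR analysis decomposes frequency and magnitude further (threat event frequency, vulnerability, primary and secondary loss, and so on), but even this toy version yields a dollar figure that can be defended and compared, which an ordinal 1-to-5 score cannot.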

Gardner: FAIR stands for Factor Analysis of Information Risk. Is that correct?

Jones: That is correct.

Gardner: Jim Hietala, in addition to a very interesting and dynamic cybersecurity landscape, we also have major trends getting traction in big data, cloud computing, and mobile. There's lots going on in the IT world. Perhaps IT's very nature -- its roles and responsibilities -- is shifting. Is doing risk assessment and management becoming part and parcel of the core competency of IT, and is that a fairly big departure from the past?

Hietala: As to the first question, it's having to become kind of a standard practice within IT. When you look at outsourcing your IT operations to a cloud-service provider, you have to consider the security risks in that environment. What do they look like and how do we measure them?

It's the same thing for mobile computing. You really have to look at the risks of folks carrying tablets and smartphones, and understand the risks associated with those. The same goes for big data. For any of these large-scale changes to our IT infrastructure, you've got to understand what it means from a security and risk standpoint.

Gardner: Jack Freund or Jack Jones, any thoughts about the changing role of IT as a service and service-level agreement brokering aspects of IT aligned with risk assessment?

Freund: I read an interesting article this morning about a school district that is doing something they call bring your own technology (BYOT). For anybody who has been involved in these sorts of efforts in the corporate world, that should sound very familiar. But I want to think culturally around this. When you have students becoming accustomed to being able to bring current technology -- oh my gosh -- when they get to the corporate world and start to work, they're going to expect the same levels of service.

To answer your earlier question, absolutely. We have to find a way to embed risk assessment, which is really just a way to inform decision-making, into how we adopt all of these technological changes to increase market position and make ourselves more competitive. That's important.

Whether that’s an embedded function within IT or it’s an overarching function that exists across multiple business units, there are different models that work for different size companies and companies of different cultural types. But it has to be there. It’s absolutely critical.

Gardner: Jack Jones, how do you come down on this shifting role of IT in risk assessment, something that's their responsibility? Are they embracing it, or maybe wishing it away?

Jones: It depends on whom you talk to. Some of them would certainly like to wish it away. I don't think IT’s role in this idea for risk assessment and such has really changed. What is changing is the level of visibility and interest within the organization, the business side of the organization, in the IT risk position.

Board-level interest

Previously, they were more or less tucked away in a dark corner. People just threw money at it and hoped bad things didn't happen. Now, you're getting a lot more board-level interest in IT risk, and with that visibility comes responsibility, but also a certain amount of danger. If they're doing it really badly -- if they're incredibly immature in how they approach risk -- they're going to look pretty foolish in front of the board. Unfortunately, I've seen that play out. It's never pretty, and it's never good news for the IT folks. They're realizing that they need to come up to speed a little bit from a risk perspective, so that they won't look like fools when they're in front of these executives.

They're used to seeing quantitative measures of opportunities and operational issues of risk of various natures. If IT comes to the table with a red, yellow, green chart, the board is left to wonder, first how to interpret that, and second, whether these guys really get it. I'm not sure the role has changed, but I think the responsibilities and level of expectations are changing.

Gardner: Part of what FAIR does in risk analysis in general is to identify potential losses and put some dollars on what potential downside there is. That provides IT with the tool, the ability, to rationalize investments that are needed. Are you seeing the knowledge of potential losses to be an incentive for spending on modernization?

Jones: Absolutely. One organization I worked with recently had certain deficiencies from the security perspective that they were aware of, but that were going to be very problematic to fix. They had identified technology and process solutions that they thought would take them a long way towards a better risk position. But it was a very expensive proposition, and they didn't have money in the IT or information security budget for it.

So we did a current-state analysis using FAIR of how much loss exposure they had on an annualized basis. Then we said, "If you plug this solution into place, given how it affects the frequency and magnitude of loss that you'd expect to experience, here's what your new annualized loss exposure would be." It turned out to be a multimillion-dollar reduction in annualized loss exposure for a cost of a few hundred thousand dollars.

When they took that business case to management, it was a no-brainer, and management signed the check in a hurry. So they ended up being in a much better position.

If they had gone to executive management saying, "Well, we’ve got a high risk and if we buy this set of stuff we’ll have low or medium risk," it would've been a much less convincing and understandable business case for the executives. There's reason to expect that it would have been challenging to get that sort of funding given how tight their corporate budgets were and that sort of thing. So, yeah, it can be incredibly effective in those business cases.
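The arithmetic behind that kind of business case is simple to sketch. The figures below are hypothetical stand-ins, not the actual numbers from the engagement Jones describes:

```python
# Hypothetical figures illustrating the cost-benefit comparison above.
current_ale = 4_200_000   # annualized loss exposure today, $
residual_ale = 900_000    # expected ALE with the proposed controls in place, $
control_cost = 300_000    # annualized cost of the proposed controls, $

risk_reduction = current_ale - residual_ale
net_benefit = risk_reduction - control_cost

print(f"Risk reduction: ${risk_reduction:,}")  # $3,300,000
print(f"Net benefit:    ${net_benefit:,}")     # $3,000,000
```

Expressed this way, the decision is exactly the kind of quantitative trade-off executives already make elsewhere in the business, which is why it lands better than "high risk becomes medium risk."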

Gardner: Correct me if I'm wrong, but you have a book in the works since we last spoke. Jack, maybe you could tell us a bit about that and how it comes to bear on these issues?

Freund: Well, the book is currently being written. Jack Jones and I have entered into a contract with Elsevier, and we'll be preparing the manuscript over the summer and winter. Probably by the second quarter of next year, we'll have something that we can share with everybody. It's been a long time coming. Jack, I know, has wanted to write this for a long time.

Conversational book

We wanted to build a conversational book around how to assess risk using FAIR, and that's an important distinction from other books on the market today, which really dig into a lot of the mathematical stuff. Speaking personally, I wanted to build a book that gave practitioners the tools to handle the common challenges and common opposition to what they're doing every day, and to understand how to apply the concepts in FAIR in a very tangible way.

Gardner: Very good. What about the conference itself? We're coming up very rapidly on The Open Group Conference. What should we expect in terms of some of your presentations and training activities?

Jones: I think it will be a good time. People will be pleased with the quality of the presentations and some of the new information they'll get to see and experience. As you said, we're offering FAIR training as a part of the conference. It's a two-day session, with an opportunity afterwards to take the certification exam.

If history is any indication, people will enjoy going through the training. We get a lot of very positive remarks about a number of different things. One, they never imagined that risk could be interesting. They're also surprised that it's not, as one friend of mine calls it, "rocket surgery." It's relatively straightforward and intuitive stuff. It's just that, as a profession, we haven't before had this framework for reference, or some of the methods we apply to make it practical and defensible.

So we've gotten great feedback in the past, and I think people will be pleasantly surprised at what they experienced.

Freund: One of the things I always say about FAIR training is that it's a real red pill-blue pill moment, in reference to the old Matrix movies. I took FAIR training several years ago with Jack, and I always tease him that it has ruined me for other risk assessment methods. Once you learn how to do it right, it's very obvious which methods are wrong, and why you can't use them to assess risk.

I'm joking; it's really great and valuable training, and now I use it every day. It really does open your eyes to the problems in the risk assessment portion of IT today, and gives you very practical and actionable things to do to fix that and provide value to your organization.

Gardner: Jim Hietala, the emphasis in terms of vertical industries at the conference is on finance, government, and healthcare. They seem to be the right groups to be factoring in more standardization and understanding of risk. Tell me how it comes together. Why is The Open Group looking at vertical industries at this time?

Hietala: Specific to risk, if I can talk about that for a second, the healthcare world, at least here in the US, has new security rules, and one of the first requirements is to perform an annual risk assessment. So it's currently relevant to that industry.

Same with finance

It’s the same thing with finance. One of the regulations around financial organizations tells them that, in terms of information security, they need to do a risk assessment. In government, clearly there has been a lot of emphasis on understanding risk and mitigating it throughout various government sectors.

In terms of The Open Group and verticals, we've done lots of great work in the areas of enterprise architecture, security, and the other areas we work in. In terms of our conferences, we've evolved things over the last year or so to start looking at what's unique in particular verticals.

It started in the mining industry. We set up a mining, metals, and exploration forum that looked at IT and architecture issues related specifically to that sector. We started that work several years ago, and now we're looking at other industries and starting to assess the unique things in healthcare, for example. We've got a one-day workshop in Philadelphia on the Tuesday of the conference, looking at IT and transformation opportunities in the healthcare sector.

That's how we got to this point, and we'll see more of that from The Open Group in the future.

Gardner: Are there any updates that we should be aware of in terms of activities within The Open Group and other organizations working on standards, taxonomy, and definitions when it comes to risk?

Hietala: I'll take that and dive into that. We at The Open Group originally published a risk taxonomy standard based on FAIR four years ago. Over time, we've seen greater adoption by large companies and we've also seen the need to extend what we're doing there. So we're updating the risk taxonomy standard, and the new version of that should be published by the end of this summer.

We also saw within the industry the need for a certification program for risk analysts, so that they'd be trained in quantitative risk assessment using FAIR. We're working on that program, and we'll be talking more about it in Philadelphia.

Along the way, as we were building the certification program, we realized that there was a missing piece in terms of the body of knowledge. So we created a second standard that is a companion to the taxonomy. It will be called the Risk Analysis Standard, and it looks more at some of the process issues and how to do risk analysis using FAIR. That standard will also be available by the end of the summer, and, combined, those two standards will form the body of knowledge that we'll be testing against in the certification program when it goes live later this year.

Gardner: Jack Freund, it seems that between regulatory developments, the need for maturity in these enterprises, and the standardization being brought to bear by groups such as The Open Group, this is becoming quite a bit more of a science and less of an art.

What does that bring to organizations in terms of a bottom-line effect? I wonder if there is a use case, or an example, you could mention and explain that would help people better understand what they get back when they go through these processes and gain this better maturity around risk?

Risk assessment

Freund: I'm not an attorney, but I've had a lot of lawyers tell me -- and I think Jim mentioned this before in his discussion of verticals -- that a lot of the regulations start with performing an annual risk assessment and then choosing controls based upon that. They're not very prescriptive that way.

One of the things it drives in organizations, more than anything else, is a sense of satisfaction that we've got things covered. When the leadership in these organizations understands that you're doing what a reasonable company would do to manage risk this way, you have fewer fire drills. Nobody likes to walk into work and have to deal with a hundred different things.

Are we moving hard drives out of printers and fax machines? What are we doing around scanning and vulnerabilities? All of those various things can inundate you with worry every single day, as opposed to focusing on the things that matter.

I like a folksy saying that sort of sums things up pretty well -- a dime holding up a dollar. You have all these little bitty squabbly issues that get in the way of really focusing on reducing risk in your organization in meaningful ways and focusing on the things that matter.

Using approaches like FAIR drives a lot of value into your organization, because you're freeing up mind share in your executives to focus on the things that really matter.

Gardner: Jack Jones, a similar question, any examples that exemplify the virtues of doing the due diligence and having some of these systems and understanding in place?

Jones: I have an example related to Jack Freund's point about being able to focus and prioritize. One organization I was working with had identified a significant risk issue, and they were considering three different options for risk mitigation that had been proposed. One was "best practice," and the other two were less commonly considered for that particular issue.

An analysis showed with real clarity that option B, one of the non-best-practice options, should reduce risk every bit as effectively as best practice, but at a much lower cost. The organization then got to make an informed decision about whether they were going to be herd followers or whether they were going to be more cost-effective in risk management.

Unfortunately, there’s always danger in not following the herd. If something happens downstream, and you didn't follow best practice, you're often asked to explain why you didn't follow the herd.

That was part of the analysis too, but at the end of the day, management got to make a decision on how they wanted to behave. They chose to not follow best practice and be more cost-effective in using their money. When I asked them why they felt comfortable with that, they said, "Because we’re comfortable with the rigor in your analysis."

Best practice

To your question earlier about art versus science -- first of all, in most organizations there would have been no question. They would have said, "We must follow best practice." They wouldn't even have examined the options, and management wouldn't have had the opportunity to make that decision.

Furthermore, even if they had "examined" those options using a more subjective, artistic approach -- somebody's wet finger in the air -- management almost certainly would not have felt comfortable with a non-best-practice approach. So the more scientific, more rigorous approach that something like FAIR provides gives you all kinds of opportunity to make informed decisions and to feel more comfortable about those decisions.

Gardner: It really sounds as if there's a synergistic relationship between a lot of the big-data and analytics investments that are being made for a variety of reasons, and also this ability to bring more science and discipline to risk analysis.

How do those come together, Jack Jones? Are we seeing the dots being connected in these large organizations, such that they can take more of what they garner from big data and business intelligence (BI) and apply it to these risk assessment activities? Is that happening yet?

Jones: It's just beginning to. It's very embryonic, and there are probably only a couple of organizations out there that I would argue are doing that with any sort of effectiveness. Imagine that -- they're both using FAIR.

But when you think about BI or any sort of analytics, there are really two halves to the equation. One is data, and the other is models. You can have all the data in the world, but if your models stink, then you can't be effective. And, of course, vice versa: if you've got a great model and zero data, then you've got challenges there as well.

Being able to combine the two -- good data and effective models -- puts you in a much better place. As an industry, we aren't there yet. We've got some really interesting things going on, and so there's a lot of potential there, but people have to leverage that data effectively and make sure they're using a model that makes sense.

There are some models out there that frankly are just so badly broken that all the data in the world isn't going to help you. The models will grossly misinform you. So people have to be careful, because data is great, but if you're applying it to a bad model, then you're in trouble.

Gardner: We're coming up near the end of our half hour. Jack Freund, for those organizations looking to get started, to become more mature, or perhaps to start leveraging some of their investments in areas like big data -- in addition to attending The Open Group Conference or watching some of the plenary sessions online -- what tips do you have for getting started? Are there some basic building blocks that should be in place, or ways to get the ball rolling, when it comes to better risk analysis?

Freund: Strong personality matters in this. You have to have some sort of evangelist in the organization who cares enough about it to drive it through to completion. That's a stake in the ground that says, "Here is where we're going to start, and here is the path we're going to follow."

Strong commitment

When you start doing that sort of thing, even if leadership changes and other things happen, you have a strong commitment from the organization to keep moving forward on these sorts of things.

I spend a lot of my time integrating FAIR with other methodologies. One point I keep making is that what we are doing is implementing a discipline around how we choose our risk rankings. That's one of the great things about FAIR. It's universally compatible with other assessment methodologies, programs, standards, and legislation, which allows you to be consistent and precise about how you connect to everything else your organization cares about.

Concerns around operational risk integration are important as well. But driving that through to completion in the organization has a lot to do with finding sponsorship and then just building a program to completion. But absent that high-level sponsorship, because FAIR allows you to build a discipline around how you choose rankings, you can also build it from the bottom up.

You can have groups of people who are FAIR-trained who can build risk analyses or pick ranges -- 1, 2, 3, 4 or high, medium, low. Then, when questioned, you have the ability to say, "We think this is a medium, because it met the frequency and magnitude criteria we've established using FAIR."

Different organizations culturally are going to have different ways to implement and structure quantitative risk analysis. In the end, it's an interesting and reasonable path toward risk utopia.
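The frequency-and-magnitude discipline Freund describes can be sketched as a small Monte Carlo simulation. This is only an illustration of the general idea, not code from the FAIR standard; the distributions, dollar figures, and ranking thresholds below are all hypothetical assumptions. Each simulated year draws a number of loss events and a magnitude for each, and the resulting annualized loss exposure (ALE) is mapped onto a defensible high/medium/low ranking.

```python
import math
import random

def poisson_sample(rng, lam):
    """Draw a Poisson-distributed event count via Knuth's algorithm."""
    threshold = math.exp(-lam)
    k, p = 0, 1.0
    while True:
        p *= rng.random()
        if p <= threshold:
            return k
        k += 1

def simulate_ale(freq_lambda, magnitude_mu, magnitude_sigma, trials=20000, seed=42):
    """Monte Carlo estimate of annualized loss exposure:
    loss event frequency (Poisson) combined with loss magnitude (lognormal)."""
    rng = random.Random(seed)
    total_loss = 0.0
    for _ in range(trials):
        events = poisson_sample(rng, freq_lambda)  # loss events this simulated year
        total_loss += sum(
            rng.lognormvariate(magnitude_mu, magnitude_sigma) for _ in range(events)
        )
    return total_loss / trials

def risk_rank(ale, medium_threshold=50_000, high_threshold=250_000):
    """Map a dollar ALE onto a high/medium/low ranking (hypothetical thresholds)."""
    if ale >= high_threshold:
        return "high"
    if ale >= medium_threshold:
        return "medium"
    return "low"

# Roughly 2 loss events per year, median loss of about $25,000 per event.
ale = simulate_ale(freq_lambda=2.0, magnitude_mu=math.log(25_000), magnitude_sigma=0.6)
print(f"Estimated ALE: ${ale:,.0f} -> {risk_rank(ale)}")
```

The value of the exercise is exactly what Freund notes: when someone questions a "medium," the answer points back to explicit frequency and magnitude criteria rather than gut feel.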

Gardner: Jack Jones, any thoughts from your perspective on a good way to get started, maybe even through the lens of the verticals that The Open Group has targeted for this conference, finance, government and healthcare? Are there any specific important things to consider on the outset for your risk analysis journey from any of the three verticals?

Jones: A good place to start is with the materials that The Open Group has made available on the risk taxonomy and the soon-to-be-published risk-analysis standard.

Another resource I recommend to everybody I talk to is a book called How to Measure Anything by Douglas Hubbard. If someone is even the least bit interested in actually measuring risk in quantitative terms, they owe it to themselves to read that book. It puts into layman's terms some very important concepts and approaches that are tremendously helpful.

As far as within organizations, some will have a relatively mature enterprise risk-management program at the corporate level, outside of IT. Unfortunately, it can be hit-and-miss, but there can be some very good resources in terms of people and processes that the organization has already adopted. You have to be careful there too, though, because even though some of those enterprise risk-management programs may have been in place for years -- and thus, one would think, have matured -- all they have done is dig a really deep ditch of bad practices and misconceptions.

So it's worth having a conversation with those folks to gauge how clueful they are, but don't assume that, just because they've been in place for a while and hold a specific title, they really understand risk at that level.

Gardner: Well, very good. I'm afraid we will have to leave it there. We've been talking with a panel of experts about the new trends and solutions in the area of anticipating risk and how to better manage organizations with that knowledge. We've seen how enterprises are better delivering risk assessments, or beginning to, as they are facing challenges in cyber-security as well as undergoing the larger undertaking of enterprise transformation.

This special BriefingsDirect discussion comes to you in conjunction with The Open Group Conference in July 2013 in Philadelphia. There's more information on The Open Group website about that conference, whether you want to attend in person, watch the live stream, or download the resources that are often available afterward. Follow the conference on Twitter at #ogPHL.

So with that thanks to our panel. We've been joined by Jack Freund, Information Security Risk Assessment Manager at TIAA-CREF. Thank you so much, Jack.

Freund: Thank you, Dana.

Gardner: And also Jack Jones, Principal at CXOWARE. Thank you, sir.

Jones: It's been my pleasure. Thanks.

Gardner: And then also lastly, Jim Hietala, Vice President, Security at The Open Group. Thank you, Jim.

Hietala: Thank you, Dana.

Gardner: And this is Dana Gardner, Principal Analyst at Interarbor Solutions, your host and moderator throughout this thought leadership interview series. Registration to the July 15 conference remains open to attend in person. I hope to see you there. We'll also be conducting more BriefingsDirect podcasts from the conference, so watch for those in future posts. Thanks again for listening, and come back next time.

Listen to the podcast. Find it on iTunes. Download the transcript. Sponsor: The Open Group.

Transcript of a BriefingsDirect podcast on how to achieve better risk management through better analysis of risk factors. Copyright Interarbor Solutions, LLC, 2005-2013. All rights reserved.


Tuesday, July 02, 2013

Cloud Services Help SHI Redefine the Buyer-Seller Dynamic for Huge Efficiency Gains Worldwide

Transcript of a BriefingsDirect podcast on how the networked economy is improving business and sales for an IT provider and its customers.

Listen to the podcast. Find it on iTunes. Download the transcript. Sponsor: Ariba, an SAP Company.

Dana Gardner: Hello, and welcome to a special BriefingsDirect podcast series coming to you from the 2013 Ariba LIVE Conference in Washington, D.C.

We're here to explore the latest in collaborative commerce and to learn how innovative companies are tapping into the networked economy. We'll see how they are improving their business productivity and sales, along with building far-reaching relationships with new partners and customers.

I'm Dana Gardner, Principal Analyst at Interarbor Solutions, and I'll be your host throughout the series of Ariba-sponsored BriefingsDirect discussions.

Our next innovator interview focuses on SHI International, a global provider of IT products, procurement, and related services, with more than $4 billion in annual turnover. We'll learn how SHI teamed with Ariba, an SAP company, to streamline IT product discovery and purchasing processes for large agricultural machinery builder AGCO. [Disclosure: Ariba is a sponsor of BriefingsDirect podcasts.]

To hear how they did it, please join me in welcoming our guest. We're here with John D’Aquila, Applications Support Manager at SHI International Corp. in Somerset, New Jersey. Welcome, John.

John D’Aquila: Welcome, Dana.

Gardner: Good to have you with us. Tell me a little bit about the requirements for buying and selling in this era of "fast is better" and "more data is inevitable." What's different about buying and selling IT products and services now compared with, say, three or four years ago?

D'Aquila: One thing that has really changed is that IT asset management is a hot topic right now. Customers want to track their purchases much more efficiently than in the past, so they know exactly how much they have at all times. They want to know if they're over-licensed or under-licensed on the software side. As far as hardware goes, they want to make sure that they have enough in stock, but not too much. You don't want whole closets and warehouses full of equipment.

Gardner: So it's just as we've heard in a lot of other vertical sectors -- fit for purpose, not too wasteful, just in time, not over-inventory, that sort of thing. You have to be very precise, and therefore, you need to have the data about what’s going on across your supply chain.

D'Aquila: Correct. That's where electronic commerce comes into IT asset management. I always say that it starts with a great PO, because when we receive that purchase order, we want it to carry as much as possible of the information the customer will be looking for us to report on downstream.

Years later, if they come back to us and ask, "How many desktops did we purchase over the last three years, and who were they for?" the only way we can tell them who they were for is if they gave us that information on the purchase order.

Streamlined solution

So the best way to get that is to have a streamlined solution that everyone uses when procuring their desktop PCs, versus a situation where one PO comes over handwritten, another comes over via fax, and the level of information on each differs.

Gardner: How are you doing in terms of getting people to get more digital, more electronic? Is IT a leader or a laggard, or is it all over the map, depending on the individual organization?

D'Aquila: At SHI, as part of every customer quarterly business review (QBR) or RFP demonstration, we definitely focus on the shi.com portal, a standalone website that gives customers the ability to procure products from a customized catalog.

Then we show them how we can leverage our check-out question process to collect that information, so that every request and purchase order comes over with the same level of detail. If a customer has a solution like Ariba, we explain how we can work with that.

Gardner: This would be a good point, I suppose, to learn more about SHI. Tell us about your organization, how it came about, what you're doing, and why this whole notion of being ultra-efficient across your purchasing processes is essential to your business.

D'Aquila: SHI is a global provider of IT products and solutions. We're headquartered in Somerset, New Jersey, and as you mentioned before, we had over $4 billion in revenue last year. This year we expect to surpass $5 billion.

The number of employees has doubled in four years. So there is definitely an investment internally to enhance the backbone of SHI, which is the sales force and the operations departments.

One thing I always like to point out is that as I walk in in the morning -- as all employees do -- above the SHI logo it says "Innovative Solutions and World Class Support." That reminds every employee, as they walk in, that our customers are the reason we're successful, and the way we retain those customers is by providing those innovative solutions and world-class support.

Gardner: Tell me a bit more about how these low-touch orders are executed, and what Ariba's role is. How are we getting people to be more efficient and more data-driven when it comes to procuring their IT services and products?

Customer driven

D'Aquila: The whole Ariba process is typically driven by the customer. In the early stages of evaluating a solution, if they ask us which ones we've worked with and what the benefits of each are, we can tell them, but typically the decision has already been made by the time they come to my team.

We'll explain our capabilities around that, and how we can derive benefits from small pieces of information on either the punch-out setup request or the purchase order.

Gardner: Tell us a bit about this example so we can learn more about how a good way to do this unfolds. AGCO -- who are they, how did they become your customer, what are you doing with them, and how do they exemplify what should be going on here?

D'Aquila: AGCO has been a customer of SHI's for many years. Spend was growing, but it was a slow trend upward. Eric Deese is the contractor working on the project of enabling Ariba throughout AGCO.

We had a conference call to discuss the requirements, his schedule, and his expectations of what we were going to do. From there, we put the resources in place. We did full testing with Eric, from purchase order to invoice, to make sure that everything worked properly. Then I handed it over to Tammy Wagner, the Account Executive for AGCO.

One thing that we really like to focus on with customers is, rather than show them everything we could sell, we show what they actually need and want. So we've tailored a catalog around the requirements that Eric provided to make it easier for his users to find products.

Since we went live, the number of products purchased from SHI across the different product lines has tripled. So it's been a great success story.

Gardner: How do these trends around cloud, big data, and more process-driven efficiency goals translate into actual savings or efficiencies? Can we quantify it? Are there any metrics of success, even for a company like AGCO? What did they gain when they did this better?

D'Aquila: One thing is that they now control their spend. In speaking with Eric, he explained that AGCO users were buying software from everywhere. Some would buy a shrink-wrapped copy of software, which is really not the right way to buy software. They would use their P-Cards and then just file an expense report, so the purchase wouldn't be captured properly within their cost centers and internal accounting.

Now, he said, all the employees of AGCO are going into the Ariba application and procuring their software from SHI. So maverick spend has been controlled.

As far as the cloud, we're not doing anything today with AGCO in that space. SHI does have cloud solutions, backup-as-a-service solutions, and hopefully in the future we can build that out.

Single-point purchasing

Gardner: When they move to a single point for purchasing and a standard operating procedure that everyone lines up behind, can you prove the benefit back to them? You must gather data in that regard that you can feed back to the customer to show what they're saving -- for example, that the P-Card tax is no longer involved. Can you quantify this in dollar terms? Do you have a means to do that?

D'Aquila: We don't know exactly how much they paid in the past. However, we can show Eric the spend with SHI and how it has grown, and say, "Your overall spend has helped you secure better pricing with the manufacturers and with SHI, which in the long term will turn into savings for AGCO."

Gardner: As IT organizations, in particular, look to move toward an operational expenditure (OPEX) approach rather than a capital expenditure (CAPEX) one, they're looking for services, for leasing, and for outsourcing types of services. How is that impacting your business, and how does it impact the buying and selling process?

D'Aquila: There has definitely been a trend toward more operational expense versus capital. We notice that customers are no longer treating a desktop as a commodity. It's more of a rental. You're going to use it for a few years, and it's no longer expected to last for the life of an employee.

So the catalog refresh cycles have changed, as has the number of items in the catalog. There is definitely standardization, making sure that everyone in the organization has the same type of product, so they can get better imaging and so forth.

There is also a trend toward bring your own device (BYOD). Organizations are telling their employees: here are the minimum specifications; you can buy any PC, but it's out of your own pocket. It's up to you to purchase it, and you can bring it to work, whether it's a mobile device or even a laptop.

Gardner: Are you starting to see any hybrid trends with BYOD, where the organization says, "You can buy it, but why don't you buy it through these suppliers, because they get a bulk rate?" That is, the corporation manages the buy and gets the benefit of the bulk sale, but the purchase is made by the end user, the employee, and then managed by them over time?

D'Aquila: When we're involved, that's the BYOD procedure I see in place. The customer picks a standard set of solutions and products and says, "Here are 20 items you can choose from, and you should buy them from SHI, because we have secured discounted pricing through the manufacturer and through SHI." Of course, employees can go to a retail shop on a weekend and maybe get one of the five units that come in on sale, but typically that's not going to meet the specifications.

Although it is BYOD, they're still setting minimum specifications that really require a business-class tool. You're not going to get away with a retail laptop, desktop, or even the smaller mobile devices.

Gardner: John, we've been talking a lot about how the buyer from your organization is benefiting from an Ariba relationship. How about on your acquisition side, your supply chain? Is the Ariba Network coming into play on that side as well?

Net new customers

D'Aquila: We use Ariba as a seller, and we have seen great benefit in growing customers; that's really where we focus. We want to win net-new customers and grow the catalogs and offerings for existing customers.

Today, there may be a customer that only purchases software from SHI. We want to introduce them to the fact that although we were Software House International, we are SHI now, because we sell all IT-related products -- hardware, services, and solutions.

Gardner: And because we're here at Ariba LIVE, what are you hearing that excites you? Perhaps the spot-buying news -- is that something of interest to you?

D'Aquila: Yes. I've used Discovery in the past. There were a lot of empty requests: we would respond, and then our responses wouldn't be viewed. With Spot Buy, because requests will come directly out of the SAP application -- someone keying in a request and looking for bids -- I'm expecting we'll get better leads from the solution. I'm looking forward to seeing what comes of it.

Gardner: I am afraid we will have to leave it there. We've been talking about how SHI has teamed up with Ariba to streamline IT product purchasing processes, especially for the large agricultural machinery maker AGCO.

Thank you so much to our guest, John D’Aquila, Applications Support Manager at SHI International. Thanks so much.

D'Aquila: Thank you, Dana.

Gardner: And thank you to our audience for joining this special podcast coming to you from the 2013 Ariba LIVE Conference in Washington D.C.

I'm Dana Gardner, Principal Analyst at Interarbor Solutions, your host throughout this series of Ariba-sponsored BriefingsDirect discussions. Thanks again for joining, and come back next time.

Listen to the podcast. Find it on iTunes. Download the transcript. Sponsor: Ariba, an SAP Company.

Transcript of a BriefingsDirect podcast on how the networked economy is improving business and sales for an IT provider and its customers. Copyright Interarbor Solutions, LLC, 2005-2013. All rights reserved.
