Wednesday, September 30, 2009

Doing Nothing Can Be Costliest IT Course When Legacy Systems and Applications Are Involved

Transcript of a BriefingsDirect podcast on the risks and drawbacks of not investing wisely in application modernization and data center transformation.

Listen to podcast. Find it on iTunes/iPod and Podcast.com. Download the transcript. Learn more. Sponsor: Hewlett Packard.

Dana Gardner: Hi, this is Dana Gardner, principal analyst at Interarbor Solutions, and you’re listening to BriefingsDirect.

Today, we present a sponsored podcast discussion on the high, and sometimes underappreciated, cost for many enterprises of doing nothing about aging, monolithic applications. Not making a choice about legacy mainframe and poorly used applications is, in effect, making a choice not to transform and modernize the applications and their supporting systems.

Not doing anything is a choice to embrace an ongoing cost structure that may well prevent significant new spending for IT innovations. It’s a choice to suspend applications on, perhaps, ossified platforms and make their reuse and integration difficult, complex, and costly.

Doing nothing is a choice that, in a recession, hurts companies in multiple ways, because successful transformation is the lifeblood of near and long-term productivity improvements.

Here to help us better understand the perils of continuing to do nothing about aging legacy and mainframe applications, we’re joined by four IT transformation experts from Hewlett-Packard (HP). Please join me in welcoming our guests. First, Brad Hipps, product marketer for Application Lifecycle Management (ALM) and Applications Portfolio Software at HP. Welcome, Brad.

Brad Hipps: Thank you.

Gardner: Also, John Pickett from Enterprise Storage and Server Marketing at HP. Hello, John.

John Pickett: Hi. Welcome.

Gardner: Paul Evans, worldwide marketing lead on Applications Transformation at HP. Hello, Paul.

Paul Evans: Hello, Dana.

Gardner: And, Steve Woods, application transformation analyst and distinguished software engineer at EDS, now called HP Enterprise Services. Good to have you with us, Steve.

Steve Woods: Thank you, Dana.

Gardner: Let me start off by going to Paul. The recession has had a number of effects on people, as well as budgets, but I wonder what effect, in particular, the tight cost structures have had on this notion of tolerating mainframe and legacy applications?

Cost hasn't changed

Evans: Dana, what we’re seeing is that the cost of legacy systems and the cost of supporting the mainframe hasn’t changed in 12 months. What has changed is the available cash that companies have to spend on IT, which, over time, has either been frozen or reduced. That puts even more pressure on the IT department and the CIO: how to spend that money, where to spend it, and how to ensure alignment between what the business wants to do and where the technology needs to go.

We already knew that only about 10 percent of an IT budget was spent on innovation before, and the problem is that that share becomes squeezed and squeezed. Our concern is that there is a cost of doing nothing. People eventually end up spending their whole IT budgets on maintenance and upgrades and virtually nothing on innovation.

At a time when competitiveness is needed more than it was a year ago, there has to be a shift in the way we spend our IT dollars and where we spend our IT dollars. That means looking at the legacy software environments and the underpinning infrastructure. It’s absolutely a necessity.

Gardner: So, clearly, there is a shift in the economic impetus. I want to go to Steve Woods. As an analyst looking at these issues, what’s changed technically in terms of reducing something that may have been a hurdle to overcome for application transformation?

Woods: For years, the biggest hurdle was that most customers would say they didn’t really have to make a decision, because the performance wasn't there. The performance and reliability weren't there. They are now. There is really no excuse not to move because of performance or reliability issues.

What's still there, and is changing today, is the ability to look at a legacy application's source code. We now have the tools to look at the code and visualize it in ways that are very compelling. That’s typically one of the biggest obstacles. If you look at a legacy application, the number of lines of code, and the number of people maintaining it, it’s usually obvious that large portions of the application haven’t really changed much. There's a lot of library code and that sort of thing.

That’s really important. We’ve been straight with our customers that we have the ability to help them understand a large terrain of code that they might be afraid to move forward with. Maybe they simply don’t understand it. Maybe the people who originally developed it have moved on, and, because nobody really maintains it, they're afraid to go into those areas of the system.

Also, what has changed is the growth of architectural components, such as extract, transform, and load (ETL) tools, data-integration tools, and reporting tools. When we look at a large body of, say, 10 million lines of COBOL and find that three million lines of that code are doing reporting and maybe two million are doing ETL work, we typically suggest they move those asymmetrically to a new platform that does not use handwritten code.

That’s really risk aversion -- doing it very incrementally with low intrusion, and that’s also where the best return on investment (ROI) picture can be portrayed. You can incrementally get your ROI, as you move the reports and the data transformation jobs over to the new platform. So, that’s really what’s changed. These tools have matured so that we have the performance and we also have the tools to help them understand their legacy systems today.
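As a rough sketch of the kind of functional breakdown Woods describes, you might inventory a legacy codebase by function and see how much of it could move asymmetrically to off-the-shelf tooling rather than be rewritten by hand. The categories and line counts below are hypothetical, chosen to mirror the 10-million-line example above:

```python
# Hypothetical inventory of a 10M-line COBOL portfolio, classified by function.
# The categories and line counts are illustrative, not from a real analysis.
inventory = {
    "reporting": 3_000_000,       # candidates for a reporting tool, not handwritten code
    "etl": 2_000_000,             # candidates for an ETL/data-integration tool
    "business_logic": 4_000_000,  # core logic that may need a handwritten rewrite
    "dead_or_library": 1_000_000, # rarely changed library code; may not need migration
}

total = sum(inventory.values())
# Lines that can move "asymmetrically" to tools instead of being rewritten by hand
tool_replaceable = inventory["reporting"] + inventory["etl"]

print(f"Total lines: {total:,}")
print(f"Tool-replaceable: {tool_replaceable:,} "
      f"({100 * tool_replaceable / total:.0f}% of the portfolio)")
```

In this sketch, half the portfolio never needs to be translated line by line at all, which is where the incremental ROI Woods mentions comes from.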

Gardner: Now, one area where economics and technology come together quite well is the hardware. Let’s go to John with regards to virtualization and reducing the cost of storage. How has that changed the penalty for doing nothing?

Functionality gap

Pickett: Typically, when we take a look at the high-end applications that are going to be moving over, many times they’re sitting on a legacy mainframe platform. One of the things that has changed over the last several years is the functionality gap that existed 5 or 10 years ago between open systems and the mainframe. That gap has not only been closed but, in some cases, open systems now exceed what’s available on the mainframe.

So, just from a functionality standpoint, there is certainly plenty of capability there. But to hit on the cost of doing nothing: staying with what you currently have today involves more than the high cost of the platform. As a matter of fact, one of our customers who had moved from a high-end mainframe environment onto an Integrity Superdome calculated that if you were to take their cost savings and apply them to playing golf at one of the premier golf courses in the world, Pebble Beach, you could golf every day with three friends for 42 years, 10 months, and a couple of days.

It’s not only a matter of cost, but it’s also factoring in the power and cooling as well. Certainly, what we’ve seen is that the cost savings that can be applied on the infrastructure side are then applied back into modernizing the application.

Gardner: I suppose the true cost benefits wouldn’t be realized until after some sort of a transformation. Back to Paul Evans. Are there any indications from folks who have done this transformation as to how substantial their savings can be?

Evans: There are many documented cases that HP can provide and, I think, other vendors can provide as well. In terms of looking at applications and the underpinning infrastructure, as John was talking about, there are so many documented cases pointing people toward the real cost savings to be made here.

There's also a flip side to this. Some research that McKinsey did earlier in the year took a sample of 100 companies as they went into the recession. They were brand leadership companies. Coming out of the recession, only 60 of those companies were still in a leadership position. Forty percent of those companies just dropped by the wayside. It doesn’t mean they went out of business. Some did. Some got acquired, but others just lost their brand leadership.

That is a huge price to pay. Now, not all of that has to do with application transformation, but we firmly believe it is pivotal to improving services and revenue-generation opportunities, which, in tough times, need to be stronger than ever.

What we would say to organizations is, "Take a hard look at this, because doing nothing could be absolutely the wrong thing to do. Maintaining a competitive differentiation -- continuing to exploit it and to provide customers with an improving level of service -- is how you keep those customers in a tough time, which means they’ll still be your customers when you come out of the recession."

Gardner: Let’s go to Brad. I’m also curious, on a strategic level, about flexibility and agility. Are there prices to be paid that we should be considering in terms of lock-in, fragility, or applications that don’t easily lend themselves to a wider process?

'Agility' an overused term

Hipps: This term "agility" is the right term to use, but it gets used so often that people tend to forget what it means. The reality of today’s modern organization -- and this contrasts even with 5, and certainly 10, years ago -- is that when we look at applications, they are everywhere. There has been an application explosion.

When I started in the applications business, we were working on the handful of applications an organization had. That was the extent of applications in the business -- one part of it, but not the whole. Now, in every modern enterprise, applications really are total -- big, small, medium-sized. They are all over the place.

When we start talking about application transformation and we assign that trend to agility, what we’re acknowledging is that, for the business to make any change today in the way it does business -- any new market initiative, any competitive threat it wants to respond to -- there is going to be an application, very likely "applications," plural, that will need to be either built or changed to support that new initiative.

The fact of the matter is that changing or creating the applications to support the business initiative becomes the long pole in realizing that initiative. If that’s the case, you begin to say, "Great. What can I do to shrink that pole that stands between me and getting this initiative realized in the market?"

From an application transformation perspective, we then take that as a context for everything that’s motivating a business with regard to its applications. The decisions you make to transform your applications should all be pointed at, and informed by, shrinking the amount of time it takes you to turn around and realize some business initiative.

So, in 500 words or less, that's what we’re seeking with agility. Following pretty closely behind that, you can begin to see why there is promise in cloud. It saves me a lot of infrastructural headaches. It’s supposed to obviate a lot of the challenges I have around just standing up the application and getting it ready, let alone having to build the application itself. So that is the view of transformation in terms of agility and why we’re seeing things like cloud. These things really start to point the way to greater agility.

Gardner: It sounds as if there is a penalty to be paid or a risk to be incurred by being locked into the past.

Hipps: That’s right, and you then take the reverse of that. You say, "Fine. If I want to keep doing things as is, that means that every day or every month that goes by, I add another application, or I make my current application pool bigger, using older technologies that I know take me longer to make changes in."

In the most dramatic terms, it only gets worse the longer I wait. That pool of dated technology only gets bigger and bigger, the more changes I have coming in and the more changes I'm trying to make. It’s almost as though I’ve got a ball and chain attached to my ankle, and I'm just letting the ball get bigger and bigger. There is a very real agility cost, even setting aside what your competition may be doing.

Gardner: So, the inevitability of transformation goes from a long horizon to a much nearer and dearer issue. Let’s go back to Steve Woods of EDS. What are some misconceptions about starting on this journey? Is this really something that’s going to highly disrupt an organization or are there steps to do it incrementally? What might hold people back that shouldn't?

More than one path

Woods: I think probably one of the biggest misconceptions arises when somebody has a large legacy application written in an older language such as COBOL or perhaps PL/1. They look at the code and imagine a future that still has handwritten code -- maybe in Java, C#, or .NET -- and they don’t take the next step and ask, "If I had to look at this system and rebuild it today, would I do it the same way?" That’s what you're doing if you imagine just one path to modernization.

Some of the code in their business logic might find its way into classes in Java or .NET. What we prefer to do is a functional breakdown of what the code is actually doing, and then try to imagine the options we have going forward. Some of it will become handwritten code, and some of it will move to those other sorts of implementations.

So, we really like to look at what the code is doing and imagine other ways we could implement it. If we do that, we have a much better approach to moving customers forward. The worst thing to do -- and a lot of customers have this impression -- is to automatically translate the code from COBOL into Java.

Java and C# are very efficient languages for generating a function point, which is a measure of functionality. In Java, a function point takes about eight or ten lines of code. In COBOL, it takes about 100 lines.

Typically, when you translate automatically from COBOL to Java, you still get pretty much the same amount of code. In actuality, you’re taking the maintenance headache and making it even larger with an automated translation. So, we prefer to take a much more thoughtful approach, look at what the options are, and put together an incremental modernization strategy.
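The arithmetic behind Woods' point can be sketched as follows. The lines-per-function-point densities are the rough figures quoted above, and the 10-million-line application is a hypothetical, not a measured case:

```python
# Rough lines-of-code-per-function-point densities quoted in the discussion.
LOC_PER_FP_COBOL = 100
LOC_PER_FP_JAVA = 10   # "about eight or ten lines"; using the upper figure

cobol_loc = 10_000_000  # a hypothetical 10M-line COBOL application

function_points = cobol_loc / LOC_PER_FP_COBOL       # ~100,000 FPs of functionality

# A thoughtful rewrite in idiomatic Java: size tracks function points, not COBOL lines
rewritten_java_loc = function_points * LOC_PER_FP_JAVA   # ~1,000,000 lines

# Line-for-line automated translation: roughly the same line count as the COBOL
translated_java_loc = cobol_loc                          # still ~10,000,000 lines

print(f"Function points: {function_points:,.0f}")
print(f"Idiomatic Java rewrite: ~{rewritten_java_loc:,.0f} lines")
print(f"Automated translation: ~{translated_java_loc:,} lines "
      f"({translated_java_loc / rewritten_java_loc:.0f}x the maintenance surface)")
```

Under these assumed densities, an automated line-for-line translation leaves roughly ten times the code volume, and therefore the maintenance surface, of an idiomatic rewrite.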

Gardner: Paul Evans, this really isn’t so much about pulling the plug on the mainframe, which may give people some shivers. After decades with these systems, they might not know what would happen if they pulled the plug.

Evans: We don't profess that people should unplug mainframes. If they want to, they may plug in an HP system in its place. We’d love them to. But, being very pragmatic, which is what we like to be, it comes back to what Steve was talking about. It’s looking at the code, at what you want to do from a business-process standpoint, and at the underlying platform.

It's understanding what quality of service you need to deliver and then understanding the options available. Even in base technologies such as microprocessors, the power that can be delivered these days means we can do all sorts of things at prices, speeds, sizes, power outputs, and CO2 emissions that we could only dream of a few years ago.

The days when there was a walled-off area in the data center that no other technology could match are long gone. Now, the emphasis is on consolidation and virtualization, and there is also a big focus on legacy modernization. CIOs and IT directors, or whatever their titles might be, understand that an awful lot of money is spent on maintaining, as Steve said, handwritten legacy code that today runs the organization and needs to continue providing these business processes.

Bite-size chunks

There are far faster, cheaper, and better ways to do that, but it has to be planned for. It has to be executed flawlessly. There's a long-term view, but you take bite-sized chunks out of it along the way, so that you get the results you need. You feed those good results back into the system, and you get an upward spiral of people seeing what is truly possible with today’s technologies.

Gardner: John Pickett, are there any other misconceptions or perhaps under-appreciated points of information from the enterprise storage and server perspective?

Pickett: Typically, when we see a legacy system, what we hear, in a marketing sense, is that the high-end mainframe -- and I’ll just use that as an example -- could be used as a consolidation platform. What we find is that if you're going to be moving or modernizing applications onto an open-system environment to take advantage of the full gamut of tools and open-system applications out there, you're not going to be doing that on a legacy environment. The more efficient way of going down that path is onto an open-standard server platform.

Also, one of the other misconceptions that we see, again in a marketing sense, is that a mainframe is very efficient. However, compare it to a high-end HP system and just take a look at the heat output, which we know is very important: there is more heat. The difference in heat output between a mainframe and an Integrity Superdome, for example, is enough to power a two-burner Weber gas grill. So, there's some significant heat there.

On the energy side, we see that the Superdome consumes 42 percent less energy. So, it's a very efficient way of handling the operating-system environment when you do modernize these applications.

Gardner: Brad Hipps, when we talk about modernizing, we’re not just modernizing applications. It’s really modernizing the architecture. What benefits, perhaps underappreciated ones, come with that move?

Hipps: I tend to think of application transformation as, in most ways, breaking up and distributing that which was previously self-contained and closed.

Whether you're looking at moving from mainframe processing to distributed processing, or from distributed processing to virtualization; whether you're talking about the application teams themselves, which are now some combination of in-house, near-shore, offshore, and outsourced -- distributed from a single building to all around the world; or the architectures themselves, which have gone from monolithic, fairly brittle things to services-driven things.

You can look at any one of those trends and begin to speak about benefits, whether it’s leveraging a better global cost basis or, on the architectural side, where the fundamental thing we’re trying to do is to say, "Let’s move away from a world in which everything is handcrafted."

Assembly-line model

Let’s get much closer to the assembly-line model, where I have a series of preexisting, trustworthy components -- I know where they are and what they do -- and my work now becomes a matter of assembling them. They can take any variety of shapes, based on my need, because of the components I have created.

We're getting back to this idea of lower cost and increased agility. We can only imagine how certain car manufacturers would be doing if they were handcrafting every car. We moved to the assembly line for a reason, and software has typically lagged other engineering disciplines here. Now we’re finally going to catch up. We're finally going to recognize that we can take an assembly-line approach to the creation of applications as well, with all the intended benefits.

Gardner: And, when you standardize the architecture, instead of having to make sure there is a skillset located where the systems are, you can perhaps bring the systems to where the different skills are?

Hipps: That’s right. You can begin to divorce your resources from the asset they're creating, and that’s another huge thing that we see. It's true whether you're talking about a service, a component of an application, or a test asset. Whatever the case may be, we can envision a series of assets that make an application successful. Now, those can be distributed and geographically divorced from their owners.

Gardner: Where this has been a "nice to have" or "something on the back-burner" activity, we're starting to see a top priority emerge. I’ve heard of some new Forrester research showing that legacy transformation is becoming the number-one priority. Paul, can you offer some more insight on that?

Evans: That’s research that we're seeing as well, Dana, and I don’t know why. ... The point is that this may not be what organizations "want" to do.

They turn to the CIO and say, "If we give you $10 million, what is it that you'd really like to do?" What they're actually saying is that this is what they know they've got to do. So, there is a difference between what they'd like to do and what they've got to do.

That goes back to the current economic situation we started with. The pressure it’s bringing to bear is that time is up for people who just continue to spend their dollars on maintaining the applications, as Steve and Brad talked about, and the infrastructure, as John talked about. They can't just continue to pour money into that.

There has to be a breaking point. Someone has got to say, “Stop. This is crazy. There are better ways to do this.” What the Forrester research is pointing out is that if you go to a worldwide audience and talk to a thousand people in influential positions, they're now saying, "This is what we 'have' to do, not what we 'want' to do. We're going to do this, we're going to take time out, and we're going to do it properly. We're going to take cost out of what we are doing today, and it’s not going to come back."

Flipping the ratio

Consider all the things that Steve and Brad have talked about in terms of handwritten code -- code that is too big for what it needs to be to get the job done. Once we have removed that handwritten code, it’s out, it’s finished with, and we can start looking at economics that are totally different going forward, where we can actually flip this ratio.

Today, we may spend 80 or 90 percent of our IT budget on maintenance and 10 percent on innovation. What we want to do is flip that. We're not going to flip it in a year, or maybe even two, but we have to take steps. If we don’t start taking steps, it will never go away.
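As a back-of-the-envelope sketch of what flipping the ratio looks like -- the starting split is taken from the figures Evans quotes, but the 15-percent-per-year maintenance reduction is an assumed illustrative rate, not from the discussion:

```python
# Illustrative model: each year of modernization trims maintenance spend,
# and every dollar freed goes to innovation. All figures are assumptions.
budget = 100.0          # annual IT budget, normalized to 100
maintenance = 85.0      # ~85% on maintenance today (between the 80-90% quoted)
innovation = budget - maintenance

for year in range(1, 6):
    maintenance *= 0.85  # assume modernization cuts maintenance cost 15% per year
    innovation = budget - maintenance
    print(f"Year {year}: maintenance {maintenance:.0f}%, innovation {innovation:.0f}%")
```

Under these assumptions, the ratio passes 50/50 in the fourth year -- consistent with Evans' point that the flip takes more than a year or two of sustained steps.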

Hipps: I've got just one thing to add to that, in terms of the aura of inevitability that comes with transformation. When you look at IT over the last 30 years, you can see that, fairly consistently -- pick your time frame -- somewhere in the neighborhood of every seven to nine years, there has been an equivalent wave of modernization. The last major one was in the late '90s and early 2000s, with the combination of Y2K and Web 1.0. So, sure enough, here we are, right on time for the next wave.

What’s interesting is that this now-number-one priority has reached the stage of inevitability. I look back and think about the organizations in 2003 that were still saying, "No, I refuse the web. I refuse the networked world. It’s not going to happen. It’s a passing fancy," and whatever the case may be. Inasmuch as there were organizations doing that, I suspect they're not around anymore, or they're much smaller than they were. I do think that’s where we are now.

Cloud is reasonably new, but outsourcing is another component where transformation has been around long enough that most people have been able to look it square in the eye and figure out, "You know what? There is real benefit here. Yes, there are some things I need to do on my side to realize that benefit -- there is no such thing as a free lunch -- but there is a real benefit here, and I am going to suffer, if not next year, then three years from now, if I don’t start getting my act together now."

Gardner: John Pickett, are there any messages from the boosters of mainframes that perhaps are no longer factors or are even misleading?

Pickett: There are certainly a couple of those. In the past, the mainframe was thought to be the standard-bearer for RAS -- reliability, availability, and serviceability. Many of those features exist on open systems today. They're not dedicated just to the high-end mainframe environment. They're out there on open-system platforms that are significantly cheaper and whose RAS, in many cases, far exceeds what you’ll see on the mainframe.

That’s just one piece. Other capabilities that you historically saw only on the mainframe side -- such as driving business-based objectives, prioritizing resources for different applications or different groups of users, and providing backup, recovery, and very high levels of disaster recovery -- have existed for a number of years on the open-system side.

Misleading misconception

The misconception that this is something that can only be done in a mainframe environment is not only misleading; by discouraging the move to an open-system platform, it continues to drive IT budget unnecessarily into infrastructure -- budget that could be applied either to the application modernization we've been talking about here or to skills and people resources within the data center.

Gardner: We seem to have a firm handle on the cost benefits over time. Certainly, we have a total cost picture, comparing older systems to the newer systems. Are there more qualitative, or what we might call "soft benefits," in terms of the competitiveness of an organization? Do we have any examples of that?

Evans: What we have to think about is the target audience out there. More and more people have access to technology. We have a generation coming up that wants it now and wants it off the Web, and that is used to social networking tools. So, this is one of the soft, squidgy areas as people go through this transformation.

I don't think we can put hard dollars -- or pounds or euros -- against this for the moment: the inclusion of Web 2.0 or Enterprise 2.0 capabilities in applications. We have customers who are now trying it, some of it inside the firewall and some beyond. One, this can provide a much richer experience for the user. Secondly, you begin to address an audience that is used to using these things in their day-to-day lives anyway.

Why, when they step into the world of the enterprise, do they have to step back 50 years in terms of capability? You just can’t imagine certain things that people require still being done in batch mode anymore. The real-time enterprise is what people now expect and want.

So, as people go through this transformation, not only can they do the whole plethora of things we've talked about in terms of handwritten code, mainframes, structure, and service-oriented architecture (SOA), but they can also start taking steps toward getting these applications in line and embedding them within an Internet culture.

If they start to take on board some of the newer concepts around cloud and experiment with them, they have to understand that people aren’t going to just make a big leap of faith. At the end of the day, these are enterprise apps. We make things, apply things, and count things -- and people have got to continue to do that. At the same time, they need to take pragmatic steps to introduce these newer technologies, which really can help them not only retain their current customer base, but attract new customers as well.

Gardner: Paul, when organizations go through this transformation, modernize, and go to open systems, does that translate into some sort of business benefit, in terms of making the business itself more agile, maybe in a mergers-and-acquisitions sense? Would somebody resist buying a company because it has a big mainframe as an albatross around its neck?

Fit for purpose

Evans: Definitely. Having your IT fit for purpose is part of the inherent health of the organization. For organizations whose IT is way behind where it needs to be, it's definitely part of the health check.

To some degree, if you don’t want to be taken over, merged, or acquired, maybe you just let your IT sag where it is today, with mainframes and legacy apps, and nobody will want you. But then you’re back to where we were earlier: you become one of those 40 percent of companies that disappear off the face of the planet. So, it’s a sort of double-edged sword. You make yourself attractive, and you could get merged or acquired. On the other hand, if you don’t, you’re going to go out of business. I still think I prefer the former to the latter.

Gardner: Let’s talk more specifically about what HP is bringing to the table. We’ve flushed out this issue quite a bit. Is there a long history at HP of modernization?

Evans: There are two things. There is what we have done internally, within the company. We’ve had to eat our own dog food, in the sense that there were companies that merged and companies that were acquired -- HP, Compaq, Digital, EDS, whatever.

It’s just not acceptable anymore to run these as totally separate IT organizations. You have to quickly understand how to get this to be an integrated enterprise. It’s been well documented what we have done internally, in terms of taking a massive amount of cash out of our IT operations and yet, at the same time, innovating and providing a better service, while reducing our applications portfolio from something like 15,000 to 3,000.

All of these things were going on at the same time, and that has been achieved within HP. Now, you could argue that we don't have mainframes, so maybe it’s easier. Maybe that’s true but, at the same time, modernization has been growing, and now we're right up there at the forefront of what organizations need to do to make themselves cost-effective, agile, and flexible going forward.

Gardner: John Pickett, what about the issue around standards, neutrality, embracing heterogeneity, community and open source? Are these issues that HP has found some benefits from?

Pickett: Without a doubt. When you take a look at the history of what we've been able to do, migrating legacy applications onto an open system platform, we actually have a long history of that. We continue to not only see success, but we’re seeing acceleration in those areas.

A couple of drivers that we ended up seeing are really making the case for customers, beyond the significant cost savings that we talked about earlier. We're talking 50 percent or 70 percent total cost of ownership (TCO) savings moving from a legacy mainframe environment over to an HP environment.

Additional savings

In addition to that, you also have the power savings. Simply by moving, the amount of energy saved is enough to light 80 houses for one year. We’ve already talked about the heat and the space savings. A similar system from HP with similar capabilities takes about a third of what you’re going to see for a high-end mainframe environment.

That’s important because, if customers are running out of data-center room and they’re looking at increasing their compute capacity but don’t have room within their data center, it just makes sense to go with a more efficient, more densely packed system that draws less power and gives off less heat than a legacy environment.

Gardner: Brad Hipps, what about this issue of being able to sell from a fairly neutral perspective, based on a solution's value? Does that bring something to the table?

Hipps: We alluded earlier to the issue of lock-in. If we’re going to fly, as we do, under the banner of bringing flexibility and agility to an organization, it’s tough to wave that banner without being pretty open about who you’re going to play with and where.

Organizations have a very fine eye for what this is going to mean for me not just six months from now, but two years from now, and what it’s going to mean to successors in line in the organization. They don’t want to be painted into a corner. That’s something that HP is very cognizant of, and has been very good about.

This may be a little bit overly optimistic, but you have to be able to check that box. If you’re going to make a credible argument to any enterprise IT organization, you have to show your openness and you have to check the box that says we’re not going to paint you into a corner.

Gardner: Steve Woods, for those folks who need to get going on this, where do you get started? We mentioned that iterative nature, but there must be perhaps low-hanging fruit, demonstrations of value that then set up a longer record of success.

Woods: Absolutely. What we find with our customers is that there are various levels in the processes of understanding their legacy systems. Often, we find some of them are quite mature and have gone down the road quite a bit. We offer assessments based upon single applications and also portfolios of applications. We have a modernization assessment and a portfolio assessment. We also offer a best-shore assessment to ensure that you are using the correct resources.

Often, we find that we walk in, and the customers just don’t know anything about what their options are. They haven’t done any sort of analysis thus far. In those cases, we offer what we’re calling a Modernization Opportunity Workshop.

It's very quick, usually 4-8 hours on-site, and it takes about four weeks to deliver the entire package. We use some tools that I created at HP that look at the clone code within the application. It’s very important to understand the patterns of the clone code and to have visualizations. We have visual intelligence tools that very quickly allow us to see inside the system, see the duplicate source code, and provide them with high-level cost estimates.

We use a tool called COCOMO and we use Monte Carlo simulation. We’re able very quickly to give them a pretty high-level, 30-page report that indicates the size. Often, size is something that is completely misunderstood. We have been into customers who tell us they have four million lines of code, and we actually count the code as only 400,000 lines of code. So, it’s important to start with a stake in the ground and understand exactly where you’re at with the size.

We also do functionality composition support to understand that. That’s all delivered with very little impact. We know the subject matter experts are very busy, and we try to lessen the impact of doing that. That’s one of the places we can start, when the customer just has some uncertainty and they're not even sure where to start.
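[The sizing approach Woods describes, COCOMO estimates combined with Monte Carlo simulation, can be illustrated with a short sketch. This is not HP's actual tooling: the basic organic-mode COCOMO coefficients are the published ones, but the size range, trial count, and percentiles below are invented for illustration.]

```python
import random

# Basic COCOMO, organic mode: effort (person-months) = 2.4 * KLOC ** 1.05.
A, B = 2.4, 1.05

def cocomo_effort(kloc: float) -> float:
    """Estimated effort in person-months for a system of `kloc` thousand lines."""
    return A * kloc ** B

def monte_carlo_effort(low_kloc: float, high_kloc: float,
                       trials: int = 10_000, seed: int = 1) -> dict:
    """Sample the size uncertainty uniformly and return effort percentiles."""
    rng = random.Random(seed)
    efforts = sorted(cocomo_effort(rng.uniform(low_kloc, high_kloc))
                     for _ in range(trials))
    pick = lambda p: efforts[int(p / 100 * (trials - 1))]
    return {"p10": pick(10), "p50": pick(50), "p90": pick(90)}

# The sizing point in the transcript: a system believed to be 4,000 KLOC that
# actually measures ~400 KLOC shifts the estimate by more than a factor of ten,
# because effort grows slightly faster than linearly with size.
print(monte_carlo_effort(350, 450))
```

[Because the exponent is above 1, getting the line count wrong by 10x inflates the effort estimate by more than 10x, which is why the "stake in the ground" on size matters so much.]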

Gardner: We’ve been discussing the high penalties that can come with inaction around applications and legacy systems. We’ve been talking about how that factors into the economy and the technological shifts around the open systems and other choices that offer a path to agility and multiple-sourcing options.

I want to thank our panelists today for our discussion about the high costs and risks inherent in doing nothing around legacy systems. We’ve been joined by Brad Hipps, product marketer for Application Lifecycle Management and Applications Portfolio Software at HP. Thank you, Brad.

Hipps: Thank you.

Gardner: John Pickett, Enterprise Storage and Server Marketing at HP. Thank you, John.

Pickett: Thank you, Dana.

Gardner: Paul Evans, Worldwide Marketing Lead on Applications Transformation at HP. Thank you, Paul.

Evans: Thanks, Dana.

Gardner: And Steve Woods, applications transformation analyst and distinguished software engineer at EDS. Thank you, Steve.

Woods: Thank you, Dana.

Gardner: This is Dana Gardner, principal analyst at Interarbor Solutions. You’ve been listening to a sponsored BriefingsDirect podcast. Thanks for listening and come back next time.

Listen to podcast. Find it on iTunes/iPod and Podcast.com. Download the transcript. Learn more. Sponsor: Hewlett Packard.

Transcript of a BriefingsDirect podcast on the risks and drawbacks of not investing wisely in application modernization and data center transformation. Copyright Interarbor Solutions, LLC, 2005-2009. All rights reserved.

Monday, September 21, 2009

Part 1 of 4: Web Data Services Extend Business Intelligence Depth and Breadth Across Social, Mobile, Web Domains

Transcript of first in a series of sponsored BriefingsDirect podcasts with Kapow Technologies on Web Data Services and how harnessing the explosion of Web-based information inside and outside the enterprise buttresses the value and power of business intelligence.

Listen to the podcast. Find it on iTunes/iPod and Podcast.com. Download the transcript. Learn more. Sponsor: Kapow Technologies.

See popular event speaker Howard Dresner's latest book, Profiles in Performance: Business Intelligence Journeys and the Roadmap for Change, or visit his website.

Dana Gardner: Hi, this is Dana Gardner, principal analyst at Interarbor Solutions, and you’re listening to BriefingsDirect.

Today, we present a sponsored podcast discussion on the future of business intelligence (BI) -- on bringing more information from more sources into an analytic process, and thereby getting more actionable intelligence out.

The explosion of information from across the Web, from mobile devices, inside of social networks, and from the extended business processes that organizations are now employing all provide an opportunity, but they also provide a challenge.

This information can play a critical role in allowing organizations to gather and refine analytics into new market strategies, better buying decisions, and to be the first into new business development opportunities. The challenge is in getting at these Web data services and bringing them into play with existing BI tools and traditional data sets.

This is the first in a series of podcasts, looking at the future of BI and how Web data services can be brought to bear on better business outcomes.

So, what are Web data services and how can they be acquired? Furthermore, what is the future of BI when these extended data sources are made into strong components of the forecasts and analytics that enterprises need to survive the recession and also to best exploit the growth that follows?

Here to help us explain the benefits of Web data services and BI is Howard Dresner, president and founder of Dresner Advisory Services. Welcome to the show, Howard.

Howard Dresner: Thanks, Dana. It's great to be here today.

Gardner: We're also joined by Ron Yu, vice president of marketing at Kapow Technologies. Thanks for joining, Ron.

Ron Yu: Hi, Dana. Great to be with you today.

Gardner: Howard, let me start with you. We've certainly heard a lot about BI over the past several years. There's a very strong trend and lots of investments are being made. How does this, in fact, help companies during the downturn that we are unfortunately still in and then prepare for an upside?

Empowering end users

Dresner: BI is really about empowering end users, as well as their respective organizations, with insight, the ability to develop perspective. In a downturn, what better time is there to have some understanding of some of the forces that are driving the business?

Of course, it's always useful to have the benefit of insight and perspective, even in good times. But, it tends to go from being more outward-focused during good times, focused on markets and acquiring customers and so forth, to being more introspective or internally focused during the bad times, understanding efficiencies and how one can be more productive.

So, BI always has merit and in a downturn it's even more relevant, because we are really less tolerant of being able to make mistakes. We have to execute with even greater precision, and that's really what BI helps us do.

Gardner: Well, if we're looking either internally at our situation or externally at our opportunities, the more information we have at our disposal the stronger our analytical return.

Dresner: Certainly, one would hope so. If you're trying to develop perspective, bringing as much relevant data or information to bear is a valuable thing to do. A lot of organizations focus just on lots of information. I think that you need to focus on the right information to help the organization and individuals carry out the mission of that organization.

Gardner: And that crucial definition of "right information" has changed or is a moving target. How do you keep track of what's the right stuff?

Dresner: It is a moving target, because the world continues to evolve. There are lots of information sources. When I first started covering this beat 20 years ago, the available information was largely just internal stores, corporate stores, or databases of information. Now, a lot of the information that ought to be used, and in many cases, is being used, is not just internal information, but is external as well.

There are syndicated sources, but also the entire World Wide Web, where we can learn about our customers and our competitors, as well as a whole host of sources that ought to be considered, if we want to be effective in pursuing new markets or even serving our existing customers.

Gardner: Ron Yu, we've certainly seen an increase in business processes that are now developed from components beyond just a packaged application set. We've seen a mixture of Web, mobile, and other endpoints being brought to bear on how people interact with their businesses and these processes.

Give me a sense of the extended scope of BI. How do we get at what is now part and parcel of the extended enterprise?

The right data

Yu: I fully agree with Howard. It's all about the right data. Given the current global market conditions, enterprises have cut really deep, not only in the lines of business but also in the IT organizations. However, they're still challenged to drive more efficiencies, while also trying to innovate.

The challenges being presented are monumental. Traditional BI methods and tools provide powerful analytical capabilities, but, at the same time, they're increasingly constrained by limited access to relevant data and by how to get timely access to that data.

What we see are pockets of departmental use cases, where marketing departments and product managers are starting to look outside to public data sources to bring in valuable information, so they can find out how their products and services are doing in the market.

Gardner: Howard, we began this discussion with a lofty goal of defining the future of BI. I wonder if you think that the innovation to come from BI activities is a function of the analytics engine or the tools, or is it a function of getting at more, but relevant, information and bringing that to bear.

Dresner: It's an interesting question. One of the things that I focus on in my second book, which is about to be published next month, is performance-directed culture and its underpinning, or substrate. I won't go into great detail right now, but it has to do with common trust in the information and the availability and currency of the information, as a way to help the organization align with its mission.

The future of BI is not just about the tools and technology. It's great to have tools and technology. I certainly am a fan of technology, being somewhat of a gadget fiend, but that's not going to solve your organization's problems and it's not going to help them align with the mission.

What is going to help them align with the mission is making sure that they have timely, relevant, and complete information, as well as the proper culture to help them support the mission of the enterprise.

Having all the gadgetry is great. Certainly, making the tools more intuitive is a useful and worthwhile thing to do, but it's only as good as the underlying content and insight to support those end users. The future is about focusing on the information and those insights that can empower the individuals, their respective departments, and the enterprise to stay aligned with the mission of that organization.

Other trends afoot

Gardner: The trend and interest in BI is not isolated. There are other complementary, or at least coincidental, mega-trends afoot. One of them, from my perspective, is this whole notion of community, rather than just company, individual, or monolithic thinking. We are expanding into ecosystems.

Cloud computing is becoming a popular notion nowadays. People are thinking about how to cross organizational boundaries and how to access resources, perhaps faster, better, and cheaper, from across those boundaries.

This also brings in this opportunity to start melding, mashing up, and comparing and contrasting data sets across these organizational boundaries. Is there a mega-trend that, from your perspective, Howard, we need to start thinking about BI as a data set-joined function?

Dresner: I fall back on Tom Malone's work, The Future of Work, his book from 2004, where he talks about organizations. Because of the reduced cost of communications, organizations will start to move, and are moving, towards looser bonds, democratized structures, and even market-based structures -- and he cites a number of examples in his book.

The way that you hold together an organization, this loosely bound organization, is through the notion of BI and performance management, which means we certainly have to compare, I wouldn't say data per se, but certainly various measures. We have to share data. We have to combine data and exchange data to get the job done -- whatever that job is. As needs be, we can break those bonds and form new bonds to get the job done.

This doesn’t mean that the future of business is a bunch of small micro-organizations coming together. It really applies to any organization that wants to be agile and entrepreneurial in nature. The underlying foundation of that has to be data and BI in order to function.

Gardner: So, it's about how these organizations relate to one another. Ron, from your perspective, what are some of the essential problems that need to be solved in allowing companies to better understand themselves and then to have this permeability at a process level and at a content, data, and BI level with other players?

Yu: The term I'd like to use is really about inclusive BI. Inclusive BI essentially includes new and external data sources for departmental applications, but that's only the beginning. Inclusive BI is a completely new mindset. For every application that IT or line of business develops, it just creates another data silo and another information silo. You have another place that information is disconnected from others.

Critical decision-making requires, as Howard was saying earlier, that all business information is easily leveraged whenever it's needed. But today, each application is separate and not joined. This makes line-of-business decision-making very difficult, and it's not in real time.

An easier way

As this dynamic business environment continues to grow, it’s completely infeasible for IT to keep updating existing data warehouses or building new data marts. That can't be the solution. There has to be an easier way to access and extract data exactly where it resides, without having to move data back and forth between databases, data marts, and data warehouses, which effectively become snapshots.

When the line of business is working with these data snapshots, they are by definition out of date. Catalytic CIOs and forward-looking information architects understand this dilemma and, given that most enterprises are already Web-enabled, they are turning to Web data services to build bridges across all these data silos.

Gardner: Another trend we mentioned, the permeability of the organization, is this involvement -- people being participants in the social networks, having a great deal of publishing going on, putting content out there that can be very valuable to a company. End users seem to want to tell companies what they want, if the companies are willing to listen. We have this opportunity now to create dialogue and conversation, rather than simply looking at the sales receipts.

Tell me how this whole social phenomenon of community and sharing fits into Web data services.

Yu: There is effectively a new class of BI applications, as we have been discussing, that depends on a completely different set of data sources. Web data services is about agile access and delivery of the right data at the right time.

With different business pressures surfacing every day, this leads to a continuous need for more and more data sources. But, as Howard was saying earlier, how do you handle all of that?

Web data services provides immediate access to and delivery of this critical data into the business user's BI environment, so that the right timely decisions can be made. It effectively takes dashboards, reporting, and analytics to the next level for critical decision-making. When we look deeper into how this is actually playing out, it's all about early and precise predictions.

Let's talk about a few examples. Government agencies are using Web data services to combat terrorism. So, you can be certain that they have all the state-of-the-art analysis tools, spatial mapping, and so on. Web data services effectively turbocharges these analysts' tools, giving them the highest precision in their threat analysis.

These intelligence agencies have access to open-source intelligence, social networks, blogs, forums, even Twitter feeds, and can see exactly what's happening in real time. They can do this predictive analysis and are much better positioned than ever to avert horrible acts of terrorism like 9/11.

Gardner: Howard, do you think, to Ron’s point, that we need to sidestep IT and the traditional purveyors of BI? Is this something that can be done by the end users themselves?

Competency centers

Dresner: It's a very interesting question, and a provocative one too, I might add. But, sidestep IT? Not all IT organizations are inflexible. Some of them certainly are. One of the things that I have advocated for years is the notion of competency centers, certainly in larger organizations. The idea of a competency center is to get the skills in a place, where they can do the most good and where they can really focus on being expedient.

Delivering something to the end user a year after they ask for it really isn't terribly useful. You need to be as agile as possible to respond to ever-changing business needs. There are very few businesses out there that are static, where things aren’t moving very quickly. In most organizations and most markets, things move pretty darn quickly, and you have to be able to respond to them.

If you don't respond to the users quickly, they find a way to solve their problems themselves, and that really has become an issue in many organizations. I’d like to say that it's a minority, but it's not. It's a majority of them, where IT is going down a slightly different path, sometimes a dramatically different path, than the end users.

Surprisingly, there are some IT organizations that are pretty well aligned and they are responsive. So, it's not a situation where the end users need to completely discount IT, but some IT organizations have become pretty inflexible. They are focused myopically on some internal sources and are not being responsive to the end user.

You need to be careful not to suffer from what I call BI myopia, where we are focused just on our internal corporate systems or our financial systems. We need to be responsive. We need to be inclusive of information that can respond to the user's needs as quickly as possible, and sometimes the competency center is the right approach.

There are instances where the users do wrest control and, in my latest book, I have four very interesting case studies. Some focus on organizations where it was more IT-driven. In other instances, it was business operations or finance driven.

Yu: There is, in most cases, a middle ground, and IT certainly isn't looking for more things to do. To the extent that they can find new tools like Web data services to help them be more effective and more efficient, they are totally open to giving line of business self-service capabilities.

Gardner: Ron, whether it's the IT department and a fully sanctioned tool and approach that they are supporting or whether it's self-service, we can't just open up the fire hose and have all of this content dump into our business and analytics activities.

What do you bring to the table in terms of not only getting access to Web data services, but also cleansing them, vetting them, putting them in the right format, and making sure they're secure and privacy requirements are being adhered to? What's the value-add that goes beyond access into a qualitative set of highly valued assets?

Start with the use case

Yu: Sometimes, the problem we face when we talk about BI is that we immediately start talking about the software, the servers, and the things that we need to build. BI really starts with the business use case.

What is it that the line of business is trying to do, and can we develop the right facilities to support that work? If those projects don't become so overbearing that you just create IT project gridlock, then I think we have something new to say.

For example, in leading financial services companies, what they're looking for is along this theme of early and precise predictions. How can you leverage publicly available information sources, like weather information, to assess precipitation, rainfall, and even the water levels of lakes that directly contribute to hydroelectricity?

If we can gather all that information, and develop a BI system that can aggregate all this information and provide the analytical capabilities, then you can make very important decisions about trading on energy commodities and investment decisions.

Web data services effectively automates this access and extraction of the data and metadata and things of that nature, so that IT doesn't have to go and build a brand new separate BI system every time line of business comes up with a new business scenario.
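[The hydroelectricity example above amounts to blending several public readings into one indicator a trading dashboard can watch. As a toy illustration only, with invented plant names, numbers, and a made-up weighting scheme, such an aggregation might look like this:]

```python
from statistics import mean

# Hypothetical readings from public sources (all values invented):
# recent rainfall in mm and reservoir level as a fraction of capacity,
# for each hydro plant feeding the regional grid.
plants = [
    {"name": "lake_a", "rain_mm_7d": 42.0, "reservoir_pct": 0.81},
    {"name": "lake_b", "rain_mm_7d": 12.5, "reservoir_pct": 0.55},
    {"name": "lake_c", "rain_mm_7d": 30.0, "reservoir_pct": 0.73},
]

def hydro_supply_index(plants, rain_norm_mm=50.0):
    """Blend rainfall (future inflow) and reservoir level (current stock)
    into a single 0..1 supply indicator; the 40/60 weighting is arbitrary."""
    scores = [0.4 * min(p["rain_mm_7d"] / rain_norm_mm, 1.0)
              + 0.6 * p["reservoir_pct"] for p in plants]
    return mean(scores)

idx = hydro_supply_index(plants)
# A high index suggests ample hydro supply, i.e. downward pressure on prices.
print(f"hydro supply index: {idx:.2f}")
```

[The point is not the formula but the pipeline: once Web data services deliver the raw readings continuously, the aggregation itself is trivial, and the dashboard stays current instead of working from snapshots.]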

Gardner: Again, to this notion of the fire hose, you are not just opening up the spigot. You're actually adding some value and helping people manage and control this flow, right?

Yu: Exactly. It's about the preciseness of the data source that the line of business already understands. They want to access it, because they're working with that data, they're viewing that data, and they're seeing it through their own applications every single day.

But, that data is buried deep within the application and the database, and the only way they can get at it through the traditional channels is by opening a new IT ticket and asking for their database, data warehouse, or application to be updated. That is just very time-consuming and very expensive for everyone involved.

Gardner: To your point earlier, you end up getting a significant latency, and it's probably precisely the kind of Web services data that you want to get closer to real time in order to analyze what's going on.

Voice of the customer

Yu: That's exactly the case. The voice of the customer provides huge financial and exposure protection for product vendors. For example, if a tire manufacturer had the ability to monitor consumer sentiment, they would be able to investigate and even issue early recalls well before tragic events happen, which would otherwise create even larger financial losses and huge damage to the brand.

Gardner: Ron, help me understand a little bit better what Kapow Technologies brings in terms of this Web data services support. How does that also relate to a larger BI solution that incorporates Web data services?

Yu: We're going a little bit into the technical side of things now. Effectively, Kapow Web Data Server, which is our product, is a platform that gives IT, and line-of-business users who have more technical aptitude, the ability to visually interact with data sources through the Web: the HTML and Ajax front end of an application, Web page, or Web portal.

Effectively, you visually program and give instructions through point-and-click, which gives you precise navigation through all of the forms and as deep as you want to go into that Website or Web application.

As you point and click, you can give instructions about extracting the data and even enriching the data. For example, going to LinkedIn, you see that there are certain images that are assigned to specific data. With our product, you can interpret those graphical images and give them a value.

Our product effectively gives you that precise, surgical navigation and extraction of any data from exactly the application that you're working with, to create an RSS feed, a REST service, or, in the case of traditional BI, even to load it directly into a SQL database with one-button deployment.

There is no programming involved. So, you can imagine how incredibly productive this is for IT. You don't have to waste time writing SQL scripts, application programming interfaces (APIs), and things of that nature. It enables that easy access and moves on to the higher value of what IT can deliver, which is on the application and presentation side.
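[Kapow's product does this visually, with no programming. For readers who want a feel for what "extract fields from a page and load them into SQL" means mechanically, here is a minimal hand-rolled sketch using only Python's standard library. The page, field names, and table schema are invented; this is not Kapow's API, just the underlying idea.]

```python
import sqlite3
from html.parser import HTMLParser

# A stand-in for a page the analyst would otherwise read by hand.
PAGE = """
<table id="offers">
  <tr><td class="product">Widget A</td><td class="price">19.99</td></tr>
  <tr><td class="product">Widget B</td><td class="price">24.50</td></tr>
</table>
"""

class OfferExtractor(HTMLParser):
    """Collect (product, price) pairs from <td class="..."> cells."""
    def __init__(self):
        super().__init__()
        self.rows, self._field, self._current = [], None, {}

    def handle_starttag(self, tag, attrs):
        if tag == "td":
            self._field = dict(attrs).get("class")

    def handle_data(self, data):
        if self._field:
            self._current[self._field] = data.strip()

    def handle_endtag(self, tag):
        if tag == "td":
            self._field = None
        elif tag == "tr" and self._current:
            self.rows.append((self._current["product"],
                              float(self._current["price"])))
            self._current = {}

parser = OfferExtractor()
parser.feed(PAGE)

# Load the extracted rows straight into a SQL table for the BI layer.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE offers (product TEXT, price REAL)")
db.executemany("INSERT INTO offers VALUES (?, ?)", parser.rows)
print(db.execute("SELECT product, price FROM offers").fetchall())
# -> [('Widget A', 19.99), ('Widget B', 24.5)]
```

[Even this toy version shows why a point-and-click tool pays off: the parsing code is brittle and page-specific, which is exactly the work the visual approach eliminates.]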

Gardner: Howard, in your work with your clients and your research for your new book, did you encounter any examples that you can recall where folks have taken this to heart and moved beyond the traditional content types that BI has supported? What sort of experience, paybacks, and benefits have they enjoyed?

Not just internal sources

Dresner: The answer is yes. There are a number of good examples. Obviously, I encourage everybody to order a copy of the new book, which is out next month. But, including other sources than just internal sources gives you a better perspective. It creates a much more interesting and rich tapestry of the business and the market in which it lives.

One of the organizations I dealt with is in the hospitality business. Understanding their market, understanding what their competition is doing and what offers they are providing, means that they have to go to those websites, as well as access some social networking sites.

They have to understand what the customer sentiment is out there and what sort of offers their competition is making on a Sunday night, for example, in order to remain competitive. You have to understand the changing trends, if you want to be a “hip hotel chain.” What does that mean? What's changing socially in the particular geographies and markets you play in that you need to be aware of and respond to?

The same thing is true in other industries. Another organization I worked with is in the healthcare industry. Understanding your patients' requirements is important, if you want to be a more patient-oriented organization. What are their changing needs? What are their desires? What are the things they expect from their service provider? You are not going to get that from your internal database.

Providing access to external content in conjunction with the content from your internal systems gives you a greater perspective. How many times have we heard, "Gee, if I'd only known that, I could have made a better decision or I could have framed the decision-making process more effectively?" That's really where we are in the history of BI right now.

We need to provide a better perspective, more complete and more timely perspective, in order to frame the decision-making processes. Going back to my original point, and really the central point of the book, how do we get everybody in the organization aligned with the mission to make sure that we're all fulfilling our particular role within the organization and using things like BI and the right sorts of data to achieve that purpose?

Yu: I agree, Howard, and I think that's just the tip of the iceberg. If we look at the spirit of what corporate performance management or enterprise performance management is supposed to deliver, BI systems are really dealing with operational data and financial data within the firewall. But, when you look outside the firewall -- and I'm talking about all these public data sources and even partners -- how do you collaborate better with your partners? All of these things are Web enabled.

How do you bring things together from outside the firewall and integrate them with the operational and financial data? Meeting that challenge will deliver a huge payoff, once IT organizations and CIOs can leverage Web data services within the enterprise, whether it's the next generation of BI for business-to-employee (B2E) applications, business-to-business (B2B) with their partners, or even business-to-consumer (B2C) applications.

Gardner: Ron, I wonder if you have any examples, folks that have gone out and gathered these Web data services? What sort of uses have they put them to and what paybacks have they encountered?

Partners and B2B

Yu: We've talked a lot about public Web data sources. Let's talk about partners and B2B. One of the Fortune 500 financial services companies was required by regulatory compliance to report on 10,000 treasury transactions per day.

They had several analysts fully dedicated to logging in to each of their top 100 banking partners, extracting information, loading it into an Excel spreadsheet, and then normalizing and cleansing the data. You know that manual efforts will never be precise about data quality, but that was the best facility they had.

Then, they would take that Excel spreadsheet, load that into a database, and put a BI tool on top of that to provide their transactional dashboard. They spent three years evaluating technologies and trying to build the solution on their own and they failed.

So they came to Kapow Technologies and implemented a proof of concept within three weeks. They were able to bring in three of their top banking partners, develop a BI dashboard to monitor and manage these transactions, and complete the full deployment in three months. Now, they're looking to expand that to other aspects of their business.
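The manual pipeline described here -- extract per-partner data, normalize and cleanse it, load it into a database for a BI tool -- can be sketched in a few lines. This is a hypothetical illustration only, not Kapow's actual product; the partner names, field names, and normalization rules are all assumptions:

```python
import sqlite3

# Hypothetical per-partner feeds: each bank reports transactions in its own
# format, so amounts, currencies, and field names must be normalized first.
bank_feeds = {
    "bank_a": [{"amt": "1,250.00", "ccy": "usd", "ref": "TX-001"}],
    "bank_b": [{"amount_cents": 98500, "currency": "USD", "id": "TX-002"}],
}

def normalize(bank, record):
    """Map a partner-specific record onto one common schema."""
    if bank == "bank_a":
        amount = float(record["amt"].replace(",", ""))
        return (record["ref"], bank, amount, record["ccy"].upper())
    if bank == "bank_b":
        return (record["id"], bank, record["amount_cents"] / 100.0,
                record["currency"].upper())
    raise ValueError(f"unknown partner format: {bank}")

# Load the cleansed rows into one table a BI dashboard can sit on top of.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE treasury (ref TEXT, bank TEXT, amount REAL, ccy TEXT)")
rows = [normalize(bank, r) for bank, feed in bank_feeds.items() for r in feed]
db.executemany("INSERT INTO treasury VALUES (?, ?, ?, ?)", rows)

total = db.execute("SELECT SUM(amount) FROM treasury").fetchone()[0]
print(total)  # combined exposure across partners: 2235.0
```

The point of automating this step is exactly what Yu describes: per-partner quirks are handled once, in code, instead of by analysts hand-cleansing spreadsheets every day.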

Gardner: I think we've learned a lot here about Web data services. Ron, where do you see it going in the future? How does this move beyond the vision that we already have developed here?

Yu: As Howard has been advocating, once you get the data access right -- data that is accurate, noise-free, and timely -- the future of BI will really be about automated decision-making.

We got a taste of that with some of the examples I talked about in financial services and working with partners, but it also applies to investment decisions and things like that. In the same way that we've seen automated, predictive buy/sell decisions in finance, the same opportunity exists across all industries.

Gardner: Howard, do you agree that future BI is increasingly an automated affair?

Dresner: There are certainly places where we ought to be automating BI. Decision automation certainly. But, to my way of thinking, BI is involved in empowering users and making them smarter. There is a tremendous amount of room for improvement there.

As I said, I've been on this beat for 20 years now, and I've certainly seen improvements in the tools across the board, from the bottom of the stack all the way to the top, and we can certainly see increased penetration and use.

The next hurdle is applying the technology a little bit more effectively. That's really where we've fallen far short -- not understanding why we're implementing the technology. Let's give everybody BI and a data warehouse and hope for the best. Not that there hasn't been any goodness associated with it, but certainly not a return commensurate with the investments that have been made.

Going back to what I said, earlier in the broadcast, the focus upon the performance-directed cultures and using the technology as an enabler to support those cultures is really where I think organizations need to apply their thinking.

Gardner: I'm afraid we'll have to leave it there. We've been discussing how Web data services play a critical role in allowing companies to gather and refine their analytics, engage in better market strategies and buying decisions, and explore business development opportunities. Helping us delve into the future of BI and the role of Web data services, we've been joined by Howard Dresner, president and founder of Dresner Advisory Services. Thanks so much, Howard.

Dresner: My pleasure. Thanks for having me.

Gardner: Also, we have been joined by Ron Yu, vice president of marketing at Kapow Technologies. Thank you, Ron.

Yu: Thanks, Dana. I had a great time.

Gardner: This is Dana Gardner, principal analyst at Interarbor Solutions. You've been listening to a sponsored BriefingsDirect podcast. Thanks for listening, and come back next time.

Listen to the podcast. Find it on iTunes/iPod and Podcast.com. Download the transcript. Learn more. Sponsor: Kapow Technologies.

See popular event speaker Howard Dresner's latest book, Profiles in Performance: Business Intelligence Journeys and the Roadmap for Change, or visit his website.

Transcript of first in a series of sponsored BriefingsDirect podcasts with Kapow Technologies on Web Data Services and how harnessing the explosion of Web-based information inside and outside the enterprise buttresses the value and power of business intelligence. In Part Two, Kapow co-founder and CTO Stefan Andreasen and Forrester analyst Jim Kobielus discuss how Web data services provide ease of access to data from a variety of sources. Copyright Interarbor Solutions, LLC, 2005-2009. All rights reserved.

Friday, September 18, 2009

Caught Between Peak and Valley -- How CIOs Can Survive Today, While Positioning for Tomorrow

Transcript of a sponsored BriefingsDirect podcast on what CIOs need to do to survive the current economic downturn, while preparing for the coming upturn.

Listen to the podcast. Find it on iTunes/iPod and Podcast.com. Download the transcript. Download the slides. Sponsor: Hewlett-Packard.

Dana Gardner: Hi, this is Dana Gardner, principal analyst at Interarbor Solutions, and you’re listening to BriefingsDirect.

Today, we present a sponsored podcast discussion on whether CIOs are making the right decisions and adjustments in both strategy and execution, as we face a new era in IT priorities. The combination of the down economy, resetting of IT investment patterns, and the need for agile business processes, along with the arrival of some new technologies, are all combining to force CIOs to reevaluate their plans.

What should CIOs make as priorities in the short, medium, and long terms? How can they reduce total cost, while modernizing and transforming IT? What can they do to better support their business requirements? In a nutshell, how can they best prepare for the new economy?

Here to help us address new questions during a very challenging time, and yet also a time in which opportunity and differentiation for CIOs begins, is Lee Bonham, marketing director for CIO Agenda Programs in Hewlett-Packard's (HP’s) Technology and Solutions Group. Welcome to the show, Lee.

Lee Bonham: Hi. Thanks very much for having me on.

Gardner: We certainly get the sense that CIOs are shifting in their priorities and making real-time adjustments. So much has happened in just the last six months, as a result of the shifting economic landscape. How do you think, from your vantage point, that IT has to adjust, given these new economic realities?

Bonham: We all recognize that we’re in a tough time right now. In a sense, the challenge has become even more difficult over the past six months for CIOs and other decision-makers. Many people have budget challenges and are having to make tough decisions about where to spend their scarce investment dollars. The demand for technology to deliver business value is still strong, and it perhaps has even increased, but the supply of funding resources for many organizations has stayed flat or even gone down.

To cope with that, CIOs have to work smarter, not harder, and have to restructure their IT spending. Looking forward, we see, again, a change in the landscape. So, people who have worked through the past six months may need to readjust now.

Gardner: Is this what you mean by the new economy -- doing more for less -- or is there more to it than that?

Bonham: Doing more for less has been around for a little while, and will continue. As we look ahead, we see a few new questions emerging. From an economic and financial point of view, hopefully we're close now to the bottom of the downturn. The forecasters suggest that the economy will start to improve slowly, but hopefully steadily, over the next 6 to 12 months.

What that means for CIOs is they need to think about how to position themselves and how to position their organizations to be ready when that growth and new opportunity starts to kick in. At the same time, there are some new technologies that CIOs and IT organizations need to think about, position, understand, and start to exploit, if they’re to gain advantage.

Gardner: From HP’s perspective, what sort of factors are important in terms of moving to improve productivity and to gain that agility?

Need to take stock

Bonham: If we think about the priorities and the challenges, as they have been over the past six months, what we’ve been saying is that organizations need to take stock of where they are and implement three strategies:
  • First, standardize, optimize, and automate the technology infrastructure -- make the best use of the systems that are installed and available at the moment. Optimizing infrastructure can lead to some rapid financial savings and improved utilization, giving a good return on investment (ROI).
  • Second, prioritize -- stop doing some of the projects and programs that have been on the plate and focus resources in areas that give the best return.
  • Third, look at new, flexible sourcing options and new ways of financing and funding existing programs to make sure that they are not a drain on capital resources.

We’ve been putting forward strategies to help in these three areas to allow our customers to remain competitive and efficient through the downturn. As I said, those needs will carry on, but there are some other challenges that will emerge in the next few months.
Gardner: So, I suppose looking backward over the past six months or so, it's been very much a cost-cutting and cost-saving mode, but you really can’t save your way out of a transformation. How do they know when to make that switch, if you will, from defense to offense?

Bonham: That’s a really good question. The answer is very dependent on the industry, the geography, and the specific environment that each organization finds itself in. In general, we’re seeing and suggesting that now is the time for CIOs to move their foot nearer the accelerator and maybe a little bit off the brake.

If CIOs are not laying the groundwork now and thinking about their plans for when the economy starts to recover, they are in danger of being too late and missing the opportunities as they emerge. There is no hard-and-fast rule, but, at the same time, people should think and take stock and maybe set some plans—or perhaps drop some plans—so that they can get ready for growth over the next few months.

Gardner: I imagine that having a strong ROI analysis associated with certain projects will allow them to get funded. How do they balance the long-term with that need for a short-term use case or cost-benefit analysis story?

Focus on the short term

Bonham: They still have to balance those two things. What we’ve been saying is that firms need to optimize their cost, focus on the short term, and make sure that they can survive in this current period, but also thrive as the economy recovers.

There are a number of different techniques and ways that customers can achieve that. Clearly, an alignment between business and the IT organization is key. CIOs need to work closely with their line-of-business managers and colleagues and make sure they understand business requirements and priorities of the rest of the organization.

There are some tools and techniques that leading CIOs have been putting in place around project prioritization and portfolio management to make sure that they are making the right choices for their investments. We’re seeing quite a difference for those organizations that are using those tools and techniques. They’re getting very significant benefits and savings.

Gardner: I suppose having that visibility, knowing exactly what you have, what works, what doesn’t work, and how to measure those become really critical, when you’re trying to make the transitions, as we said, from long-term and short-term, offense and defense, or the brake and the accelerator.

Bonham: It’s really important to understand what projects are in progress, what projects are delivering real value, and to optimize the spend. What we’ve seen is that leading organizations are really focusing their resources on projects that are delivering fast ROI, typically within six months or less, so that they get real benefit, savings, and real business value in the short-term.

But, there is also another set of applications and opportunities as people look to grow. Growth may come in emerging markets, in new industry segments, and so on. CIOs need to look at innovation opportunities. Balancing the short-term and the long-term is a really difficult question. There needs to be a standard way of measuring the financial benefit of IT investment that helps bridge that gap.

Gardner: A little earlier, you mentioned new technologies. What is it that organizations can put in place -- not just process, not just people, but actual technologies -- to assist them in these crucial times?

Bonham: Well, if we look at the area of financial cost savings and efficiency improvements, there is a whole range of technologies that people are adopting. Our survey work shows that technologies like consolidation, virtualization, application modernization and data-center automation have really moved up the scale in terms of importance.

Just as an example, server and storage consolidation is being implemented by well over 50 percent of the major organizations around the world as a way of saving cost and improving efficiency. That’s not the only area that’s important. I've already outlined the topic of project prioritization, making sure that you’re spending your scarce investment dollars in the right way.

Optimizing the investment

Tools like Project and Portfolio Management (PPM) software help allocate budget and decide which programs and projects are delivering a good return and should be continued, versus those that are maybe not so important and should be delayed or canceled. Those software tools can help in making sure that organizations optimize their investment.
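As a rough illustration of the kind of triage that PPM software automates, here is a minimal sketch. The project data, thresholds, and decision rules are assumptions made up for this example, not the logic of any particular PPM product; the six-month cutoff echoes the fast-ROI bar described elsewhere in this discussion:

```python
# Hypothetical project portfolio: (name, annual benefit $, annual cost $,
# payback period in months).
projects = [
    ("server consolidation", 400_000, 150_000, 4),
    ("legacy app rewrite", 250_000, 300_000, 30),
    ("data-center automation", 180_000, 90_000, 6),
]

def triage(benefit, cost, payback_months):
    """Classify a project the way a PPM review might: continue, delay, or cancel."""
    if benefit <= cost:
        return "cancel"          # negative return: stop spending
    if payback_months <= 6:
        return "continue"        # fast ROI: fund it now
    return "delay"               # positive but slow return: revisit later

decisions = {name: triage(b, c, p) for name, b, c, p in projects}
print(decisions)
```

Real PPM tools weigh far more dimensions (risk, strategic alignment, resource contention), but the core idea is the same: make the continue/delay/cancel call from explicit financial criteria rather than intuition.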

Gardner: I wonder if you have any examples, either use cases or companies that have moved in this direction. What sort of payoffs do they get?

Bonham: We’ve seen quite a few customers who have really taken a great approach and have been ahead of the game in terms of using consolidation and virtualization tools, standardizing their data centers and other technology components.

As an example, Joanne Cummins, chief information officer of Standard Register, a document and print management services company based in Dayton, Ohio, has really used a whole range of techniques to reduce operating cost, simplify the technical infrastructure, reduce the number of servers and server and storage administration costs, and really get the most from her technology investment.

She's delivering yearly savings of $400,000 for the organization, as well as delivering a number of other benefits, like a flexible allocation of resources and faster application development and deployment. A combination of techniques is often the best approach. We're seeing that we can help customers choose the right solution to meet their needs in many cases.

Gardner: When the CIO leadership individual needs to go back to the business leadership -- the accountants, the bean counters, if you will -- what sort of metrics do they need to describe in order to get the investments to make these new technology improvements?

Bonham: Over the past six months, the financial community is looking for fast return -- projects that are going to deliver quick benefits. CIOs need to make sure that they represent their programs and projects in a clear financial way, much more than they have been before this period. Tools like the PPM software can help define and outline those financial benefits in a way that financial analysts and CFOs can recognize.

Gardner: I’m also curious about the advice you would give for CIOs listening here today. What’s the general advice that we can offer in terms of getting that new-economy bang for the buck?

Bonham: Let’s try and think about this in terms of some metrics. There is a total IT budget metric that CIOs need to think about. Over the past few months, many have been focusing on reducing the total cost of IT and maximizing efficiency, as well as targeting effectiveness. But, there is also a metric of how much you are spending on maintenance and management of existing systems versus innovation and growth. Typically, organizations spend 60-70 percent of their budget on maintenance and management of existing systems.

What CIOs need to think about going forward is how to grow the spend on innovation and applications, so that they can drive real business value and better business outcomes for their organization and be more competitive as the economy emerges.
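The budget metric Bonham describes is simple to make concrete. The figures below are assumed for illustration; the 65 percent share sits in the typical 60-70 percent range he cites:

```python
# Assumed: a $10M IT budget with 65 percent consumed by maintenance and
# management of existing systems.
total_budget = 10_000_000
maintenance_share = 0.65

maintenance_spend = total_budget * maintenance_share
innovation_spend = total_budget - maintenance_spend
print(innovation_spend)  # only $3.5M left for innovation and growth
```

Every point shaved off the maintenance share through consolidation, modernization, and virtualization flows directly into the innovation line.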

Gardner: So, a large chunk of their budget is going to these ongoing operations, maintenance, and support. You can consolidate, retire applications that might not be delivering much value, archive and remove data -- this whole notion of modernization -- and then use virtualization, and I think that can significantly reduce that larger nut of the equation, right?

Transformational approach

Bonham: That’s exactly right. Those organizations that are taking a transformational approach, an end-to-end approach, choosing those technologies that are giving them efficiency are going to lead the way. They are the ones that are going to have investment dollars available to allocate to new projects, which will drive their business in the upturn and give them the growth opportunities they want.

Gardner: So, for a CIO, they want to find that golden strategy that both reduces costs over the long-term, but increases that business agility, and then frees up those funds for those innovations and additional technologies. It's a trifecta, if you will, of consolidate, modernize, and virtualize.

Bonham: Absolutely. We’re seeing many firms on that course.

Gardner: For those folks who are interested in learning more or getting started, where do they go for information, and how do they set up a process?

Bonham: HP has a whole range of services and technologies that can address the specific needs that we’ve talked about -- virtualization requirements, the services to help firms consolidate and standardize their technology, to implement automation tools, to better manage their portfolio projects, and to speed up software development.

Through EDS, an HP company, we have outsourcing, which can take the burden away from CIOs by outsourcing those services. We have HP Financial Services that can help fund through leasing and financing new investment requirements.

These organizations and CIOs will want to think through their next step -- where to start and how to make sure their IT strategy is in line with their business needs. We also have some consulting services and workshops that we call the CIO Agenda that can help people get started and make sure they're on the right course for the next few months to optimize their investment and their business outcomes.

Gardner: We’ve been discussing whether CIOs are making right decisions and how to make adjustments moving forward, both in strategy and execution. We’ve been joined by Lee Bonham. He is the marketing director for the CIO Agenda Programs in HP’s Technology and Solutions Group. I welcome his thoughts and appreciate his input. Thank you for joining us, Lee.

Bonham: Well, thanks very much indeed. Have a great day.

Gardner: This is Dana Gardner, principal analyst at Interarbor Solutions. You’ve been listening to a sponsored BriefingsDirect podcast. Thanks for listening and come back next time.

Listen to the podcast. Find it on iTunes/iPod and Podcast.com. Download the transcript. Download the slides. Sponsor: Hewlett-Packard.

Transcript of a sponsored BriefingsDirect podcast on what CIOs need to do to survive the current economic downturn, while preparing for the coming upturn. Copyright Interarbor Solutions, LLC, 2005-2009. All rights reserved.