Monday, November 16, 2009

BriefingsDirect Analysts Discuss Business Commerce Clouds: Wave of the Future or Old Wine in a New Bottle?

Edited transcript of BriefingsDirect Analyst Insights Edition podcast, Vol. 46 on "business commerce clouds."

Listen to the podcast. Find it on iTunes/iPod and Podcast.com. Download the transcript. Charter Sponsor: Active Endpoints. Also sponsored by TIBCO Software.

Special offer: Download a free, supported 30-day trial of Active Endpoints' ActiveVOS at www.activevos.com/insight.

Dana Gardner: Hello, and welcome to the latest BriefingsDirect Analyst Insights Edition, Vol. 46. I'm your host and moderator Dana Gardner, principal analyst at Interarbor Solutions.

This periodic discussion and dissection of IT infrastructure-related news and events, with a panel of industry analysts and guests, comes to you with the help of our charter sponsor, Active Endpoints, maker of ActiveVOS, a visual orchestration system, and through the support of TIBCO Software.

Our topic this week on BriefingsDirect Analyst Insights Edition, and it is the week of Oct. 26, 2009, centers on "business commerce clouds." As the general notion of cloud computing continues to permeate the collective IT imagination, an offshoot vision holds that multiple business-to-business (B2B) players could use the cloud approach to build extended business process ecosystems.

Under this notion, a gaggle of cloud-enabled partners could effect multiple-party services and complex processes -- all from the Internet. Business commerce clouds could produce efficiencies over traditional e-commerce processes and partnerships, and even do things -- in terms of reach, complexity, numbers of partners, and cost savings -- that had not been possible before.

It's sort of like a marketplace in the cloud on steroids, on someone else's servers, perhaps to advance someone else's business objectives, and maybe even satisfy some customers along the way.

I, for one, can imagine a dynamic, elastic, self-defining, and self-directing business-services environment that wells up around the needs of a business group or niche, and then subsides when lack of demand dictates. It's really a way to make fluid markets adapt at Internet speed, at low cost, to business requirements, as they come and go.

The concept of this business commerce cloud was solidified for me just a few weeks ago, when I spoke to Tim Minahan, chief marketing officer at Ariba. Tim and I were analysts together way back in the '90s. It seems like yesterday on some levels, and then many years ago at another.

So, I've invited Tim to join us to delve into the concept, and the possible attractions, of business commerce clouds. Welcome to the show, Tim.

Tim Minahan: Thank you, Dana. I'm pleased to be here.

Gardner: Please also join me in welcoming our IT industry analyst guests this week. We're joined by Tony Baer, senior analyst at Ovum. Hey, Tony.

Tony Baer: Hey, Dana. I'm here, and Happy Halloween everybody. Please don't make this conversation too scary.

Gardner: Alright. Brad Shimmin, principal analyst at Current Analysis is here. Hey, Brad.

Brad Shimmin: Hi, Dana.

Gardner: Also, Jason Bloomberg, managing partner at ZapThink. Howdy, Jason.

Jason Bloomberg: Hi, how is it going?

Gardner: Good. JP Morgenthal, independent analyst and IT consultant is here. Hey, JP. And making her debut, Sandy Kemsley, independent IT analyst and architect.

Sandy Kemsley: Hi, Dana. It's great to be here.

Gardner: Very good. Nice to hear from you. Let's go back to Tim. What are we really talking about here? I tried to do a setup, but it was obviously vague. "Business commerce clouds" -- what's the concept?

Leveraging cloud

Minahan: You said it nicely. When we talk about business commerce clouds, what we're talking about is leveraging the cloud architecture to go to the next level. When folks traditionally think of the cloud or technology, they think of managing their own business processes. But, as we know, if you are going to buy, sell, or manage cash, you need to do that with at least one, if not more, third parties.

The business commerce cloud leverages cloud computing to deliver three things. It delivers the business process application itself as a cloud-based or a software-as-a-service (SaaS)-based service. It delivers a community of enabled trading partners that can quickly be discovered, connected to, and collaborated with.

And, the third part is around capabilities -- the ability to dial up or dial down, whether it be expertise, resources, or other predefined best practice business processes -- all through the cloud.

Gardner: Tell me why Ariba is interested in this. How does this extend what they have done? And, for those of our listeners that don't know about Ariba, maybe you could give us the quick elevator pitch on it.

Minahan: Certainly. Ariba started out back in 1996 with a common mission in mind to help companies manage spend more effectively. It has since transitioned to deliver those results more efficiently by becoming a SaaS-based provider.

Quite simply, spend management is the holistic approach of helping you control your supply chain cost, minimize risk, and then optimize cash. Along the way, what we found was that we were connecting all these parties through a shared network that we call the Ariba Supplier Network.

We realized we weren't just creating value for the buyers, but we were creating value for the sellers. They were pushing us to develop new ways for them to create new business processes on the shared infrastructure -- things like supply chain financing, working capital management, and a simple way to discover each other and assess who their next trading partners may be.

Gardner: Tony Baer, we've talked a lot about cloud. We've heard a lot about it, often as an abstraction, often about infrastructure and development, test and dev, and storage of data. But, businesses are motivated by applications, processes -- things that get things done. Does this notion of a business commerce cloud work for you, and is it something that might be a catalyst to the whole cloud concept?

History repeats

Baer: Well, this is interesting. History really does go around in cycles. I'd like to direct a question back at Tim. I think there are some very interesting possibilities, and in certain ways this is very much an evolutionary development that began with the introduction of EDI 40 or 45 years ago, or something like that, I forget the exact date.

Actually, if you take a look at supply-chain practices among some of the more innovative sectors, especially consumer electronics, where you deal with an industry that's very volatile both in technology and consumer taste, this whole idea of virtualizing the supply chain, where different partners take on greater and greater roles in enabling each other, is very much a direct follow-on to all that.

Roughly 10 years ago, when we were going through the Internet 1.0 or the dot-com revolution, we started getting into these B2B online trading hubs with the idea that we could use the Internet to dynamically connect with business partners and discover them. Part of this really seemed to go against the trend of supply-chain practice over the previous 20 years, which was really more to consolidate on a known group of partners as opposed to spontaneously connecting with them.

I'm obviously exaggerating there, but, Tim, how does this really differ, in terms of the discovery functions that you were talking about before, from these B2B clouds -- we weren't calling them clouds back then -- but these B2B trading hubs that we were talking about almost 10 years ago?

Minahan: That's a very good question. There are certainly similarities, but the major difference is that back then it was, "If you build it, they will come." The reality today is that they are here and they are looking for more ways to collaborate.

If you look at the Ariba Network that I mentioned before, in the past year, companies have processed $120 billion worth of purchase transactions and invoices over this network. Now, they're looking at new ways to find new trading partners -- particularly as the incidence of business bankruptcies is up -- as well as to extend to new collaborations, whether it be sharing inventory or helping to manage their cash flow.

Gardner: Brad Shimmin, remaining with this "back to the future" notion, there are lots of different commerce environments out there. There has been a platform approach to it. Sometimes that's worked. When we got to the need for integration, we needed to open that up and create standards.

But now the cloud accelerates, or even heightens, this neutrality or standards requirement. Do you think that the cloud is perhaps a catalyst to moving to these ecosystems of business processes and services that will do what we couldn't do with EDI or even standards?

An enabler

Shimmin: That's a great point. I don't look at it as a catalyst; I look at it as an enabler, in a positive way. What the cloud does is allow what Tim was hinting at -- more spontaneity, self-assembly, and visibility into supply chains in particular -- that you didn't really get before with the kind of locked-down approach we had with EDI.

That's why I think you see so many of those pure-play EDI vendors like GXS, Sterling, SEEBURGER, Inovis, etc. not just opening up to the Internet, but opening up to some of the more cloudy standards like cXML and the like, and really doing a better job of behaving like we in the 2009-2010 realm expect a supply chain to behave, which is something that is much more open and much more visible.

Gardner: Sandy Kemsley, again, welcome to the show. How does this strike you as an enterprise IT architect? Is this something that appears like pie in the sky, a little too daunting, or is this something that makes you very interested?

Is it, "I would love to get some business services I can get my hands on and start crafting business processes beyond what's available for my service-oriented architecture (SOA) internally, or what I have used in terms of regular old enterprise software?"

Kemsley: I think it has huge potential, but one of the issues that I see is that so many companies are afraid to start to open up, to use external services as part of their mission-critical businesses, even though there is no evidence that a cloud-based service is any less reliable than their internal services. It's just that the failures that happen in the cloud are so much more publicized than their internal failures that there is this illusion that things in the cloud are not as stable.

There are also security concerns as well. I have been at a number of business process management (BPM) conferences in the last month, since this is conference season, and that is a recurring theme. Some of the BPM vendors are putting their products in the cloud so that you can run your external business processes purely in the cloud, and obviously connect to cloud-based services from those.

A lot of companies still have many, many problems with that from a security standpoint, even though there is no evidence that that's any less secure than what they have internally. So, although I think there is a lot of potential there, there are still some significant cultural barriers to adopting this.

Gardner: Let's go to Tim Minahan on that. Tim, what's the answer to these cultural and other inhibitors? Is there low-hanging fruit -- people who would love to get out and do B2B activities? Even if there is the perception of risk, they are going to do it anyway, because it's so attractive?

Security always an issue

Minahan: First, on the security note, security has always been an issue. That was the rubric even back in the original EDI days: "Am I going to exchange this? It's much more secure when I mail it to them."

Ultimately, when you look at the scale that a cloud or SaaS vendor has -- in many cases those that are processing large transactions right now -- the level of investment they should make around security is quite significant, more significant than what most, though not all, of the participants in that community could make themselves.

So, that's something that continues to come up. Increasingly, and probably because of the current economic situation, more and more companies are looking to what business processes they can put in the cloud, whether it be a commerce process or talent management.

Gardner: Tim, I think I heard you say that basically you get what you pay for. When it comes to security, if you are willing to invest, then you can get the level of security you need to do whatever it is that you need to do.

Minahan: What I'm saying is that the cloud provider, because of the economies of scale they have, oftentimes provides better security and can invest more in security, partitioning, and the like than many enterprises can deliver themselves. It's not just security. It's the other aspects of your architectural performance.

Gardner: I see. So, the cloud provider being centralized and having a methodological approach can look at the whole security picture and actually implement on it. Enterprises that are distributed, scattered, and have been working toward security from a variety of perspectives for 10, 15, or 20 years, don’t always get that opportunity to get the top-down approach?

Minahan: Exactly.

Gardner: Jason Bloomberg, what do you think about this notion of business commerce clouds, and is this something that's going to happen in the near term?

Bloomberg: I must say that I am coming at it from a skeptic's perspective. It doesn’t sound like there's anything new here. As Tony was pointing out, we were talking about this 10 years ago. There just doesn’t seem to be that much that is particularly new or different.

We're using the word "cloud" now, and we were talking about "business webs." I remember business webs were all the rage back when Ariba had their first generation of offerings, as well as Commerce One and some of the other players in that space.

Age-old challenges

The challenges then are still the challenges now. Companies don't necessarily like doing business with other organizations that they don't have established relationships with. The value proposition of the central marketplaces has been hammered out now. If you want to use one, they're already out there and they're already matured. If you don't want to use one, putting the word "cloud" on it is not going to make it any more appealing.

So, really, I'm looking for anything new or different here. It really sounds more like just old wine in new bottles. Vendors are just saying, "Let's do cloud," but, if anything, cloud is introducing more problems than solutions.

Talking about security is a bit of a red herring, because some of the cloud issues are really more broad governance issues than security issues. Two events in the last few weeks have highlighted this fact. One is the Microsoft Sidekick data loss problem, where the Sidekick mobile devices stored their data in the cloud instead of locally on their device. Microsoft dropped the ball, and the data were lost for a while. While that wasn't strictly speaking a security issue, it was a more subtle issue.

The second was where a spammer got into Amazon's EC2, and put the entire EC2 cloud environment on a spam blacklist. So one bad apple basically got all of the IP addresses for the EC2 environment on the blacklist. Again, not strictly speaking security. It's security related, but it's really more of a complex issue than that.

I predict we will have a number of other issues like that, unexpected cloud-based problems that aren't along the lines of traditional issues that we have seen, web-based security issues.

Everybody is familiar with denial of service (DOS) attacks and other issues like that, but we are going to have new and different kinds of issues that are going to slow down adoption of cloud, on the one hand. Then, you will also have the issue that a lot of these marketplaces are nothing new. They are already out there. They are already established, and there isn't necessarily a lot of additional advantage to be gained by buying new gear or moving to a cloud provider.

Gardner: Tim, how about that? Other than injecting the word "cloud" in here, putting some lipstick on something that already exists, what's new?

Minahan: First, I want to address the bastardization of the term "cloud." The Microsoft Sidekick example is a good one, where the cloud bigots rushed to use that as just another example of how cloud is dangerous.

Just because you store your data in a central repository that's hosted, it doesn't necessarily make it a cloud. So, I think there is some misgiving there that folks are lining up on one side of the aisle to try to dispel that.

Creating efficiencies

Second, as it applies to the cloud and the commerce cloud, what's interesting here is the new services that can be available. It's different. It's not just about discovering new trading partners. It's about creating efficiencies and more effective commerce processes with those trading partners.

I'll give you a good example. I mentioned the Ariba Network before, with $111 billion worth of transactions and invoices being transferred over it every year for the past 10 years. That gives us a lot of intelligence, and new companies are coming on board to build on it.

An example would be The Receivables Exchange. Traditionally, sellers, if they wanted to get their cash fast, could factor their receivables at $0.25 on the dollar. This organization recognized the value of the information that was being transacted over this network and was able to create an entirely new service.

They were able to mitigate the risk and provide supply chain financing at a much lower cost -- somewhere between two and four percent -- by using the historical information on those trading relationships, as well as understanding the stability of the buyer.

Because folks are on a shared infrastructure here, new services can be continually introduced and dialed up or dialed down. It's a lot different than a rigid EDI environment or just a discovery marketplace.

Gardner: Tim, isn't there also somewhat of a business model shift here? If those costs come down, as you're projecting, because of the efficiencies of cloud, then the savings can be passed along. Isn't it possible we could see something along the lines of the Apple App Store, where, all of a sudden, volume and participation go up -- some sort of a network effect -- because the cost of the applications has come down quite a bit? Is there something like that going on?

Minahan: Yeah. Cost is one aspect of it. When most people talk about the benefits of the cloud, they talk about the cost discussion. What we're seeing with our customers is that the real benefits of the cloud come in three areas: productivity, agility, and innovation. I'll spend a moment on each.

When you talk about productivity, we have talked to CFOs and CIOs today who just took a lot of cost and headcount out of their operations, thanks to the downturn. All indications are that they're not going to hire them back, even when the economy rebounds. The cloud gives them an opportunity to drive efficiencies and productivity, really without adding infrastructure.

Core competence

The second area is agility, which has become a core competence for a successful business. Many companies got caught flatfooted with the downturn, and they are just gun shy to make that same mistake again. So, the cloud gives them a new way to dial up infrastructure and resources, as needed, and the flexibility to dial them down, when they don't need them.

But the last part is that the greatest benefit of all here is innovation. That's the greatest benefit of the cloud in general, but in the commerce cloud in particular, because companies are sharing their business applications, processes, and infrastructure with their trading partners. They can benefit from the innovation of the entire community.

Your analogy to iTunes is perfect. It's the ability to have the community actually develop or offer best practice process services that can be utilized by other members of the community. That's the type of thing we are beginning to see: New business processes that are built on top of the cloud, because you already have the technology, the community, and the capabilities built in.

Gardner: JP Morgenthal, you're not a cloud bigot, are you?

Morgenthal: A cloud bigot? No. My vision for the cloud is far beyond the basic economic principles, and has yet to be realized. Economic factors are just the start of the groundswell that will bring people there, but the real value won't be seen unless the community comes. Once the community comes, I think we will see some really interesting things occur.

But what I want to address with regard to this is that from 2004 through 2008, I basically had developed a platform as a service around supply chain management and warehouse management for enterprise manufacturing and retail. So, I got some inside view into how this community really works, and a lot of their needs for communicating with each other.

I'm skeptical. I was at XML Solutions as their CTO when we first started doing B2B and building up the first exchanges, and the same problems are still there. They haven't gone away. Emerging technologies haven't changed that. Web services, cloud, nothing really has moved the ball forward from the real problem of two partners coming together, establishing an agreement, and doing work together.

Putting additional information in the cloud and making value out of it adds some overall value relative to the cost of the information or the cost of running the system, so you can derive a few things. But, ultimately, the same problems involved in driving a community to work together, do business together, and exchange product through an exchange are still there.

Gardner: JP, aren’t you describing a great opportunity, though, for some organization to come in and perhaps be neutral enough, where they could play the role that Apple is playing with the App Store, and attract a community of developers, participants, contributors, but also bring together the audience that can consume? It seems to me that Ariba, as well as others, have this in mind. The cloud might be a way in which that opportunity can finally be realized. Is that possible?

Not for the cloud

Morgenthal: I don't see the cloud as being the thing to realize this. This has been a vision, dream, and goal of many of these exchange environments -- WorldWide Retail Exchange, the 1.4 Exodus I believe is the one now. We've had these environments. They exist. It's not a matter of getting developers to come build anything for it.

What's being done through these environments is the exchange of money and goods. And it's the overhead related to doing that which makes this complex. RollStream is another startup in the area that's trying to make waves by simplifying the complexities around exchanging partner agreements and doing trading-partner management using collaborative capabilities. Again, the real complexity is the business itself. It's not even the business processes. The data is there.

I was working with an automotive retail group that contributed their parts and excess inventory into an exchange. Everybody did that. The thing they were contributing to was about exchanging, and the other groups within that same community were looking for those excess inventories and purchasing them.

Even that, which sort of sounds like it should have been fairly simple, was overly complex, because of the underlying business requirements around it and exchanging funds and getting paid. Technology is a means to an end. The end that's got to get fixed here isn't an app fix. It's a community fix. It's a "how business gets done" fix. Those processes are not automated. Those are human tasks.

Gardner: Tim, the issue here seems to be that business is tough. There has to be trust. There have to be contracts. There has got to be the exchange of funds, basically a handshake in the sky. But, that's only as good as the handshake would have been in real physical terms. What's your response to that? Are there other areas that can be automated, where those business trust issues aren’t quite as prominent?

Minahan: I totally agree. If you go back to my original statement as to what's in the cloud, I think there is some mistaking here. When folks talk about cloud, they really think about the infrastructure, and what we are talking about here is a business service cloud.

Gartner calls it the business process utility, which ultimately is a form of technology-enabled business process outsourcing. It's not just the technology. The technology or the workflow is delivered in the cloud or as a web-based service, so there is no software, hardware, etc. for the trading partners to integrate, to deploy or maintain. That was the bane of EDI private VANs.

The second component is the community. Already having an established community of trading partners who are actually conducting business and transactions is key. I agree with the statement that it comes down to the humans and the companies having established agreements. But the point is that it can be built upon a large trading network that already exists.

The last part, which I think is missing here, and which is so interesting about the business service cloud, or in this case the business commerce cloud, is the capabilities. It's the ability for either the solution provider or other third parties to deliver skills, expertise, and resources into the cloud as a web-based service as well.

It's also the information that can be garnered off the community to create new web-based services and capabilities that folks either don't have within their organization or don't have the ability or wherewithal to go out and develop and hire on their own. There is a big difference between cloud computing and these business service clouds that are growing.

Gardner: Tony Baer, it seems that our discussion today automatically went to the enterprise level, the big global 2000 type companies. What about a small to medium-sized business (SMB), an organization that perhaps didn't have the wherewithal, either through technology or budget to engage in an EDI way back when, EAI later on, or business exchanges? Is there an opportunity for the cloud to open up the addressable market for these e-commerce activities, B2B activities, to that smaller kind of company?

Same promises

Baer: It's kind of interesting. I was chuckling as you were mentioning that, because I remember that the same promises were made when the idea of Internet-based EDI came up. "Gee, this is a way to avoid the costs and overhead of proprietary value-added networks (VANs). Now, we can reduce the handshaking process, so we can get all those tier threes and tier fours into electronic commerce," which at that time was defined as EDI.

I agree with you that EDI itself is several generations behind what we are talking about here. There's no question about that. There are certainly possibilities, because obviously, as you go further back up the supply chain, going toward the smaller companies, the security requirements are not always going to be as severe.

On the other hand, if they are part of a trading-hub type network -- in other words, that they are hooked into or tapped into a Toyota or something like that -- the fact they are a small company doesn't mean that they're not going to be subject to Toyota’s requirements, especially when it comes to security and other types of contractual obligations. I'll give you a mixed yes and no answer there.

For small businesses trading amongst themselves, there probably is going to be some modest upswing there, especially in terms of being able to expand themselves to address a wider market. But, there are still some real limits there, especially if they are dealing with large, let's say, tier one trading partners.

Gardner: Brad Shimmin, you've been dealing with both SOA and collaboration issues. Is there an opportunity for these smaller companies, larger companies, or divisions within larger companies to go find themselves some workflow application in the sky? Maybe even something like Google Wave, which is now getting lots of invites. People are now starting to play around with this thing, maybe an ecosystem of contributors, developers.

Is there an opportunity for the point on the arrow to this business commerce cloud to come in the form of workflow and collaboration? Then, when you reach a point within that workflow or collaborative activity where you need some kind of a service, or product, or business partners, this cloud can be there as a resource. Maybe it can be a marketplace, auction, or exchange, where you look for the best price and the best service. What do you think about that?

Shimmin: That's a really great idea. I have a two-part answer for you. The first goes back to what Tim was saying about how this should look like the Apple App Store. I agree, but that's not the full picture. The fuller picture is to look at it as a combination of that and the Amazon marketplace. That's where I think you will see the most success with these commerce clouds -- a very specific community of like-minded suppliers and purchasers that want to get together and open their businesses up to one another.

And what Tim was getting at, which is the great part of this, is that it's unlike the Amazon-only model. I'm not talking about EC2, by the way. I'm just talking about the Amazon store itself.

Gardner: That's right. They are the front-end retail part, where you then can exchange dollars and buy from a variety of other players. So it's a B2B description of an e-commerce cloud, right?

Cost of entry

Shimmin: Right. I wanted to stay away from the whole Amazon Web Services (AWS) side of this and get back to the generic cloud, just talking about a like-minded group or community of companies. They want to be able to come together affordably, so that the SMB can come on board an exchange at an affordable rate. That's really been the problem with most of these large-scale EDI solutions in the past. It's so expensive to bring on the smaller players that they can't play.

Amazon has really solved that problem, if you look at how they run their fulfillment procedures. I see that with a combination of the iTunes idea -- those suppliers themselves contributing to that environment, that ecosystem, by building a business process that does something that's maybe specific to them. Or, maybe it's something that's generalized enough that everyone can make use of it.

That's the widgetized rendition of, "Hey, I want to come on board, and I see that I've got a widget that lets me open up a certain business process and make use of it." That's the key to bringing on these smaller players and letting them actually make money more affordably than before.

The second part of that answer was about the social side of this thing. That's where I think that you really don't want to see a generic, über e-commerce cloud computing site that's supposed to be everything to everybody. It's why you don't see a forum on all topics in the world. You see forums on very specific topics.

Gardner: We don't see a Wal-Mart equivalent in the B2B space, right?

Shimmin: Right. When you have that sort of like-mindedness, you have the wherewithal to collaborate. But, the problem has always been finding the right people, getting to that knowledge that people have, and getting them to open it up. That's where the social networking side of this comes in. That's where I see the big EDI guns I was talking about and the more modernized renditions opening up to this whole Google Wave notion of what collaboration means in a social networking context.

What you are getting at with that kind of solution is this expertise of, "It's midnight, and I am sorry, but I do need to get this widget. Who out here has that? Let me on board you quickly, and let's fulfill my supply chain needs." Boom, presto, we are connected, and we are making money.

Gardner: Sandy Kemsley, we've been fishing around for why a cloud environment will spur on this business commerce activity. Maybe we should be looking at the social networking aspects as well. What, from your perspective, in a social networking environment for business purposes might spur on this sort of exchange-in-the-cloud activity?

Kemsley: Well, Dana, I think there are two interesting sides to that. This is where I see collaboration and social networking coming to play on BPM. One is on the process discovery and modeling side, being able to collaborate with people, usually in different organizations, on what your processes are.

When you're looking at processes that include commerce aspects, if you are doing B2B between two businesses, then definitely you want to get everybody involved in modeling those processes. That's one key area -- being able to have the collaboration and social networking during the modeling of the processes.

The second is during execution. When you are executing a process, whether it's an internal process or one that reaches out to other companies as well, it's being able to collaborate at a step in the process in order to accomplish whatever task is assigned to you at that step. That might include calling out to people who are inside or outside your organization. Having your business processes executing in the cloud usually gives you more latitude to be able to call on people outside your own organization and to collaborate at a point in the business process.

Those are the two main areas that I see social networking coming to play with BPM.

Gardner: Let's bounce it back to Tim Minahan at Ariba. We've mentioned SMBs. Is this something for them? We've mentioned collaboration and workflow. Will those be points in the arrow to adoption? Then, we've addressed the social networking aspect. Maybe, you have some feedback on those three issues?

The community is key

Minahan: I'll start with the last here -- the core component, the community. What Gartner calls the business-service clouds or business process utilities, the core component of that, particularly when you are talking about inter-enterprise collaboration, is indeed the community.

We use the term "community" and not just network or VAN or something like that, because it's not just about the transaction. It's about the exchange of expertise. It's about the ability to develop affinity groups, and the ability to either resell or share best practice business processes.

We're seeing that already through the exchange that we have amongst our customers or around our solutions. We're also seeing that in a lot of the social networking communities that we participate in around the exchange of best practices. The ability to instantiate that into reusable workflows is something that's certainly coming.

Folks are always asking these days, "We hear a lot about this cloud. What business processes or technologies should we put in the cloud?" When you talk about that, the most likely ones are inter-enterprise, whether they be around commerce, talent management, or customer management. It's what happens between enterprises where a shared infrastructure makes the most sense.

Gardner: How about those SMBs? Is this something that's right for them?

Minahan: Absolutely. Every downturn spawns the next area of innovation. In the downturn that we have gone through, look at the advantages SMBs have right now -- not to have to develop information or workflows.

If they can borrow best practices from the commerce cloud, from other large companies, get on board very, very quickly and at a much lower cost, and get engaged at a much lower cost, that's an advantage for them. They can focus on how they create the competitive differentiation instead of managing infrastructure.

Gardner: So, borrowing on a lot of cloud activities, you give away a part of the process in order to then capitalize or monetize on something else, maybe a little further down the process?

Minahan: Exactly.

Gardner: That might be of interest to the small businesses. Jason Bloomberg, going back to you, have you heard anything along the lines of the collaboration in social networking that strikes you as new? We didn't really have this social networking phenomenon 10 years ago or even five years ago. Has that changed the game at all, when it comes to these business process exchange activities?

Social networking

Bloomberg: Clearly, social networking is an important part of the story. It was one of the things that was still too immature back in the late '90s and that we saw really coming to the fore in the early part of this decade. That's the key part of the story, but I wouldn't say that it's necessarily a cloud thing. Social networking is one thing, and cloud is something else.

What I hear happening in this conversation is the word "cloud" just being spread so thin that it's becoming less and less meaningful. It's easy to say, "Oh, well, a hosted-provider model like Sidekick isn't cloud computing," but most people would consider that to be cloud computing.

Now, we were talking about business service clouds and business process clouds, and the word "cloud" is becoming so general. It's like anything that is external to the enterprise is now a cloud. Oh, by the way, something internal to the enterprise is also a cloud. And, oh, it could be software, and maybe it's not software. Maybe it's a business process, or maybe it's something you do. Maybe it's social networking.

It's becoming such a broad term that I think we're risking watering it down to the point that it's nothing but a cliché. I would recommend that if you're going to use the term "cloud computing," come up with a clear definition, where there is a certain distinction between what is cloud computing and what is not.

There's nothing wrong with the business marketplaces and the business web idea from the 90s, but it isn't necessarily the same thing as cloud computing, and extending the word "cloud" just waters it down to the point that it doesn't have any meaning anymore.

Gardner: I think "business commerce cliché" has quite a nice ring to it. JP Morgenthal, is this really Internet or is it not even worth bringing Internet into it? We just want to find better, faster, and cheaper ways to do commerce.

Morgenthal: You know that everybody is looking for efficiencies and economies of scale. The audience for this particular type of requirement is certainly looking for economies of scale, and is very good at it. One of their issues to date has been trust, not reliance on the technologies. You've mentioned social networking. Back at Ikimbo, we had tried to introduce social networking around supply-chain management. We were starting to see some uptake before 9/11.

There probably is some merit to building secure communities of interest that allow people to communicate with their partners more effectively about what's going on in their business and their business needs and to move to a more just-in-time operation. Lay out less capital expenditure. Carry less inventory. Have more vendor ownership of the products and goods until they're sold.

Those are definitely areas of interest, and that can be driven by some technological change around these communities. As I said, we tried to innovate perhaps too early. Maybe now the popularity around Enterprise 2.0 will mesh with that, and business leaders will start to better understand how the two come together, versus having to educate them. Any time you enter into a market that you need to educate, you find resistance.

Frivolous activity

By the same token, social networking also has a downside from the perspective that it's been introduced as a very frivolous activity versus a good, solid business practice. Some of that may have to be undone now. You've got to do some reverse education, so to speak, to remove that frivolity from business leaders' heads, around things like Facebook and Twitter, and how they impact business.

I know that people who are out there helping business leaders understand and use social networking in their organizations are going through a lot of those frustrations.

Gardner: Tim Minahan, you're our guest this week, so we'll give the last word to you. For those organizations and folks listening to the show, what should they be keeping in mind as they consider what business services and processes for pure B2B commerce activities belong in the cloud? What should they keep their eye on and how might they even get started in participating?

Minahan: When you think about the cloud, it's about the shared application instance or infrastructure that's ultimately shared among, in this case, multiple trading partners. As you mentioned before, it goes back to its primordial ooze stage. It probably traces back to object-oriented architectures that became component-based architectures and SOA, and is now moving toward the cloud.

When you get right down to it, it's about assembling the best business practice for your company. CIOs become much more relevant. They become business process architects.

In this case, good business processes to consider are those that go between enterprises. Go back to Willie Sutton, the bank robber. Why did he rob banks? Well, that's where the money was. Well, why do you want to focus on improving your commerce efficiencies and effectiveness? It's because that's what's required to grow your business.

Gardner: Alright. We've been joined by Tim Minahan, the CMO at Ariba. Thank you very much for joining.

Minahan: Thank you for having me.

Gardner: And, we've had our panel of IT analysts this week. Sandy Kemsley, independent IT analyst and architect. Thanks so much for joining.

Kemsley: Thanks, it was a great time.

Gardner: JP Morgenthal, independent analyst and IT consultant. Thank you, sir. Jason Bloomberg, managing partner at ZapThink. I always appreciate your input.

Bloomberg: It's been a pleasure.

Gardner: Brad Shimmin, principal analyst at Current Analysis. Thank you for joining again.

Shimmin: Thank you, Dana, and Happy Halloween everyone.

Gardner: And I hope it wasn't too spooky for you, Tony Baer, senior analyst at Ovum.

Baer: I wasn't too scared, but it was a very fascinating conversation. Thanks, Dana.

Gardner: I want to also thank our sponsors for this BriefingsDirect Analyst Insights Edition podcast, Active Endpoints and TIBCO Software.

This is Dana Gardner, principal analyst at Interarbor Solutions. Thanks for listening and come back next time.

Listen to the podcast. Find it on iTunes/iPod and Podcast.com. Download the transcript. Charter Sponsor: Active Endpoints. Also sponsored by TIBCO Software.

Special offer: Download a free, supported 30-day trial of Active Endpoints' ActiveVOS at www.activevos.com/insight.

Edited transcript of BriefingsDirect Analyst Insights Edition podcast, Vol. 46 on "business commerce clouds." Copyright Interarbor Solutions, LLC, 2005-2009. All rights reserved.

Monday, November 09, 2009

Part 3 of 4: Web Data Services--Here's Why Text-Based Content Access and Management Plays Crucial Role in Real-Time BI

Transcript of a sponsored BriefingsDirect podcast on information management for business intelligence, one of a series on web data services with Kapow Technologies.

Listen to the podcast. Find it on iTunes/iPod and Podcast.com. Download the transcript. Learn more. Sponsor: Kapow Technologies.

Dana Gardner: Hi, this is Dana Gardner, principal analyst at Interarbor Solutions, and you’re listening to BriefingsDirect.

Today we present a sponsored podcast discussion on how text-based content and information from across web properties and activities are growing in importance to businesses. The need to analyze web-based text in real-time is rising to where structured data was in importance just several years ago.

Indeed, for businesses looking to do even more commerce and community building across the Web, text access and analytics forms a new mother lode of valuable insights to mine.

In Part 1 of our series on web data services with Kapow Technologies, we discussed how external data has grown in both volume and importance across the Internet, social networks, portals, and applications.

As the recession forces the need to identify and evaluate new revenue sources, businesses need to capture such web data services for their business intelligence (BI) to work better, deeper, and faster.

In Part 2, we dug even deeper into how to make the most of web data services for BI, along with the need to share those web data services inferences quickly and easily.

Now, in this podcast, Part 3 of the series, we discuss how an ecology of providers and a variety of content and data types come together in several use-case scenarios. We look specifically at how near real-time text analytics fills out a framework of web data services that can form a whole greater than the sum of the parts, and this brings about a whole new generation of BI benefits and payoffs.

Here to help explain the benefits of text analytics and their context in web data services is Seth Grimes, principal consultant at Alta Plana Corp. Thanks for joining, Seth.

Seth Grimes: Thank you, Dana.

Gardner: We're also joined by Stefan Andreasen, co-founder and chief technology officer at Kapow Technologies. Welcome, Stefan.

Stefan Andreasen: Thank you, Dana.

Gardner: We have heard about text analytics for some time, but for many people it's been a bit complex, unwieldy, and difficult to manage in terms of volume and getting to this level of a "noise-free" text-based analytic form. Something is emerging that you can actually work with, and has now become quite important.

Let's go to you first, Seth. Tell us about this concept of noise free. What do we need to do to make text that's coming across the Web in sort of a fire hose something we can actually work with?

Difficult concept

Grimes: Dana, noise free is an interesting concept and a difficult concept, when you're dealing with text, because text is just a form of human communication. Whether it's written materials or spoken materials that have been transcribed into text, human communications are incredibly chaotic.

We have all kinds of irregularities in the way that we speak -- grammar, spelling, syntax. Even putting aside those kinds of irregularities, we have slang, sarcasm, abbreviations, and misspellings. Human communications are chaotic, and they are full of "noise." So really getting to something that's noise-free is very ambitious.

I'm going to tell you straightforwardly, it's not possible with text analytics, if you are dealing with anything resembling the normal kinds of communications that you have with people. That's not to say that you can't aspire to a very high level of accuracy to getting the most out of the textual information that's available to you in your enterprise.

It's become an imperative to try to deal with the great volume of text -- the fire hose, as you said -- of information that's coming out. And, it's coming out in many, many different languages, not just in English, but in other languages. It's coming out 24 hours a day, 7 days a week -- not only when your business analysts are working during your business day. People are posting stuff on the web at all hours. They are sending email at all hours.

Then, the volume of information that's coming out is huge. There are hundreds of millions of people worldwide who are on the Internet, using email, and so on. There are probably even more people who are using cell phones, text messaging, and other forms of communication.

If you want to keep up, if you want to do what business analysts have been referring to as a 360-degree analysis of information, you've got to have automated technologies to do it. You simply can't cope with the flood of information without them.

That's an experience that we went through in the last decades with transactional information from businesses. In order to apply BI or to get BI out of them, you have to apply automated methods with specialized software.

Fortunately, the software is now up to the job in the text analytics world. It's up to the job of making sense of the huge flood of information from all kinds of diverse sources, high volume, 24 hours a day. We're in a good place nowadays to try to make something of it with these technologies.

Gardner: Of course, we're seeing the mainstream media starts behaving more like bloggers and social media producers. We're starting to see that when events happen around the world, the first real credible information about them isn't necessarily from news organizations, but from witnesses. They might be texting. They might be using Twitter. It seems that if you want to get real-time information about what's going on, you need to be able to access those sorts of channels.

Text analytics

Grimes: That's a great point Dana, and it helps introduce the idea of the many different use-cases for text analytics. This is not only on the Web, but within the enterprise as well, and crossing the boundary between the Web and the inside of the enterprise.

Those use-cases can be the early warning of a Swine flu epidemic or other medical issues. You can be sure that there is text analytics going on with Twitter and other instant messaging streams and forums to try to detect what's going on.

You even have Google applying this kind of technology to look at the pattern of the searches that people are putting in. If people are searching on a particular medical issue centered in a particular geographic location, that's a good indicator that there's something unusual going on there.

It's not just medical cases. You also have brand and reputation management. If someone has started posting something very negative about your company or your products, then you want to detect that really quickly. You want early warning, so that you can react to it really quickly.

We have a great use case in the intelligence world. That's one of the earliest adopters of text analytics technology. The idea is that if you are going to do something to prevent a terrorist attack, you need to detect and respond to the signals that are out there, that something is pending really quickly, and you have to have a high degree of certainty that you're looking at the right thing and that you're going to react appropriately.

We have some great challenges out there, but, as I said, we have some great technologies to respond to those challenges in a whole variety of business, government, and other applications.

Gardner: Stefan, I think there are very few people who argue with the fact that there is great information out there on the Web, across these different new channels that have become so prominent, but making that something that you can use is a far different proposition. Seth has been telling us about automated tools. Tell us what you see in terms of web data services and how we can make this information available to automated systems.

Deep data

Andreasen: Thank you, Dana. Let's just look at something like Google. You go there and do a search, and you think that you're searching the entire Internet. But, you're not, because you're probably not going to access data that's hidden behind logins, behind search forms, and so on.

There is a huge amount of what I call "deep web," very valuable information that you have to get to in some other way. That's where we come in and allow you to build robots that can go to the deep web and extract information.

I'd also like to talk a little bit more about the noise-free thing and go to the Google example. Let's say you go to Google and you search for "IBM software." You think that you will be getting an article that has something to do with IBM software.

You often actually find an article that has nothing to do with IBM software, but, because there are some advertisements from IBM, IBM was a hit. There is some other place that links to software, and you will find software there. Basically, you end up with something completely irrelevant.

Eliminating noise is getting rid of all this stuff around the article that is really irrelevant, so you get better results.

The other thing around noise-free is the structure. It would be great if you could say, "I want to search an article about IBM software which was dated after Oct. 7," or whatever, but that means you also need to have that additional structured information in it.

The key here is to get noise-free data and to get full data. It's not only to go to the deep web, but also get access to the data in a noise-free way, and in at least a semi-structured way, so that you can do better text analysis, because text analysis is extremely dependent on the quality of data.

Grimes: I have to agree with you there, Stefan. It's very important to have tools that can not only strip away the ads, but also understand where the content is within a page and what the navigation on that page is.

We might not be interested in navigation elements, the fluff that's on a page. We want to focus on the content. In addition, nowadays on the Web, there's a big problem of duplication of material that's been hosted on multiple sites. If you're dealing with email or forums, then people typically quote previous items in their replies, and you want to detect and strip that kind of stuff away and focus on the real relevant content. That is definitely part of the noise-free equation, getting to the authentic content.
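
To make the "noise-free" step that Stefan and Seth describe more concrete, here is a minimal sketch in Python. It strips scripts, navigation, and other page furniture before keeping the main text, and drops exact duplicates by hashing. It is only an illustration of the idea; real web data services and text-analytics products use far more sophisticated content zoning and near-duplicate detection, and the tag names treated as "noise" here are assumptions about typical page markup.

```python
import hashlib
from bs4 import BeautifulSoup

def extract_main_text(html):
    """Strip obvious non-content elements and return the remaining text."""
    soup = BeautifulSoup(html, "html.parser")
    # Remove navigation, ad containers, scripts, and other page furniture
    # (assumed tag names; real pages need smarter content zoning).
    for tag in soup(["script", "style", "nav", "header", "footer", "aside", "form"]):
        tag.decompose()
    return soup.get_text(separator=" ", strip=True)

def deduplicate(texts):
    """Drop exact duplicates; real systems also catch near-duplicates
    (quoted email threads, syndicated copies) with shingling or similarity."""
    seen, unique = set(), []
    for text in texts:
        digest = hashlib.sha256(text.lower().encode("utf-8")).hexdigest()
        if digest not in seen:
            seen.add(digest)
            unique.append(text)
    return unique
```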

Gardner: Stefan, you refer to the deep web. I imagine this also has a role, when it comes to organizations trying to uncover information inside of their firewalls, perhaps among their many employees and all the different tools that they're using. We used to call it the intranet, but is there an intranet effect here for this ability to gather noise-free text information that we can then start processing?

Extended intranet

Andreasen: Absolutely. I'd even say the extended intranet. If we're looking at a web browser, which is the way that most business analysts or other persons today are accessing business applications, we're accessing three different kinds of applications.

One involves applications inside the firewall. It could be the corporate intranet, etc. Then there are applications where you have to use a login, and this can be your partners. You're logging in to your supplier to see if some item is in stock. Or, it can be some federal reporting site or something.

The sites behind the login are like the extended enterprise. Then, of course, there is everything out on the World Wide Web -- more than 150 million web pages out there -- which has all kinds of data, and a lot of that is behind search forms, and so on.
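
As an illustration of the "robot" idea Stefan mentions for reaching deep-web data behind logins and search forms, here is a minimal Python sketch. The portal URL, form fields, and table markup are hypothetical assumptions made up for the example; a real Kapow robot is built with the vendor's own tooling rather than a hand-written script like this.

```python
import requests
from bs4 import BeautifulSoup

BASE = "https://supplier.example.com"  # hypothetical partner portal

def fetch_stock_levels(username, password, part_number):
    """Log in, run a search hidden behind the login, and return structured rows."""
    session = requests.Session()

    # Authenticate so pages behind the login become reachable.
    session.post(f"{BASE}/login", data={"user": username, "pass": password})

    # Submit the search form that hides the "deep web" data.
    result = session.get(f"{BASE}/inventory", params={"part": part_number})

    # Extract only the structured fields we care about, noise-free.
    soup = BeautifulSoup(result.text, "html.parser")
    items = []
    for row in soup.select("table.stock tr"):  # hypothetical markup
        cells = [td.get_text(strip=True) for td in row.find_all("td")]
        if len(cells) >= 3:
            items.append({"sku": cells[0], "warehouse": cells[1], "qty": cells[2]})
    return items
```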

Gardner: Seth, as a consultant and analyst, you've been focused on text analytics for some time, but perhaps a number of our listeners aren't that familiar with it. Could you maybe give us a brief primer on what it is that happens when you identify some information -- be it Internet, extended web, deep web? How do you go through some basic steps to analyze, cleanse, and then put data into a form that you can then start working with?

Grimes: Dana, I'm going to first give you an extremely short history lesson, a little factoid for you. Text analytics actually predates BI. The basic approaches to analyzing textual sources were defined in the late '50s. Actually, there is a paper from an IBM researcher, dating from 1958, that defines BI as the analysis of textual sources.

What happened is that enterprises computerized their operations, their accounting, their sales, all of that in the 1960s. That numerical data from transactional systems is readily analyzable, where text is much more difficult to analyze. But, now we have come to the point, as I said earlier, where there is software and great methods for analyzing text.

What do they do? The front-end of any text analysis system is going to be information retrieval. Information retrieval is a fancy, academic type of term, meaning essentially the same thing as search. We want to take a subset of all of the information that's out there in the so-called digital universe and bring in only what's relevant to our business problems at hand. Having the infrastructure in place to do that is a very important aspect here.

Once we have that information in hand, we want to analyze it. We want to do what's called information extraction, entity extraction. We want to identify the names of people, geographical location, companies, products, and so on. We want to look for pattern-based entities like dates, telephone numbers, addresses. And, we want to be able to extract that information from the textual sources.

In order to do that, people usually apply a combination of statistical and linguistic methods. They look for language patterns in the text. They look for statistics like the co-occurrence of words in multiple texts. When two words appear next to each other, or close to each other, in many different documents -- which can be web pages or other documents -- that indicates the degree of relationship. People apply so-called machine-learning technologies in order to improve the accuracy of what they are doing.
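
To make that concrete, here is a minimal, illustrative Python sketch of pattern-based entity extraction and word co-occurrence counting. It is not any vendor's engine; the sample sentences, regular expressions, and entity list are invented for the example, and real systems layer statistical and linguistic models on top of this kind of matching.

```python
import re
from collections import Counter
from itertools import combinations

# Toy documents standing in for crawled articles or forum posts (invented examples).
docs = [
    "IBM released new analytics software on 2009-10-07, said a spokesperson in Armonk.",
    "Customers discussed IBM software and Oracle databases at the conference.",
    "Call 555-0123 for details on the Oracle analytics launch.",
]

# Pattern-based entities such as dates and phone numbers are easy regex targets.
date_pattern = re.compile(r"\b\d{4}-\d{2}-\d{2}\b")
phone_pattern = re.compile(r"\b\d{3}-\d{4}\b")

# A crude stand-in for named-entity extraction: a list of names we care about.
known_entities = {"IBM", "Oracle", "Armonk"}

co_occurrence = Counter()
for doc in docs:
    dates = date_pattern.findall(doc)
    phones = phone_pattern.findall(doc)
    entities = [tok.strip(".,") for tok in doc.split() if tok.strip(".,") in known_entities]
    print("entities:", entities, "dates:", dates, "phones:", phones)

    # Count how often two entities appear in the same document; across many
    # documents, high counts suggest a relationship between the two names.
    for a, b in combinations(sorted(set(entities)), 2):
        co_occurrence[(a, b)] += 1

print("co-occurrence counts:", dict(co_occurrence))
```

Across a large corpus, the pairs with the highest counts become candidate relationships, which is the raw signal that the statistical and machine-learning layers then refine.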

Suitable technologies

All of this sounds very scientific and perhaps abstruse -- and it is. But, the good message here is one that I have said already. There are now very good technologies that are suitable for use by business analysts, by people who aren't wearing those white lab coats and all of that kind of stuff. The technologies that are available now focus on usability by people who have business problems to solve and who are not going to spend the time learning the complexities of the algorithms that underlie them.

So, we're at the point now where you can even treat some of these technologies as black boxes. They just work. They produce the results that you need in the form that you need them. That can be in a form that extracts the information into databases, where you can do the same kind of BI that you have been used to for the last 20 years or so with BI tools.

It can be visualizations that allow you to see the interrelationships among the people, the companies, and the products that are identified in the text. If you're working in law enforcement or intelligence, that could be interrelationships among individuals, organizations, and incidents of various types. We have visualization technologies and BI technologies that work on top of this.

Then, we have one other really nice thing that's coming on the horizon, which is semantic web technology -- the ability to use text analytics to support building a web of data that can be queried and navigated by automated software tools. That makes it even easier for individuals to carry out everyday business and personal problems for that matter.

Gardner: I'd like to dig into some use-cases and understand a little bit better how this is being used productively in the field. Before we do that, Stefan, maybe you could explain from Kapow Technologies' perspective, how you relate to this text analytics field that Seth so nicely just described. Where does Kapow begin and end, and how do you play perhaps within an ecosystem of providers that help with text analytics?

Andreasen: Text analytics, exactly as Seth was saying, is really a form of BI. In BI, you are examining some data and drawing some conclusions, maybe even making some automated actions on it.

Obviously, any BI or any text analysis is no better than the data source behind it. There are four extremely important parameters for the data sources. One is that you have the right data sources.

There are so many examples of people making these kinds of BI applications, text analytics applications, while settling for second-tier data sources, because those are the only ones they have. This is one area where Kapow Technologies comes in. We help you get exactly the right data sources you want.

The other thing that's very important is that you have a full picture of the data. So, if there are relevant data sources across all kinds of verticals, all kinds of media, and so on, you really have to be sure you have full coverage of them. Getting full coverage of data sources is another thing that we help with.

Noise-free data

We already talked about the importance of noise-free data to ensure that when you extract data from your data source, you get rid of the advertisements and you try to get the major information in there, because it's very valuable in your text analysis.

Of course, the last thing is the timeliness of the data. We all know that people who do stock research get real-time quotes. They get them for a reason: the newer the quotes are, the better they can look into the crystal ball and predict what will happen in the next few seconds.

The world is really changing around us. Companies need to look into the crystal ball in the nearer and nearer future. If you are predicting what happens in two years, that doesn't really matter. You need to know what's happening tomorrow. So, the timeliness of the data is important.

Let me get to the approach that we're taking. Business analysts work with business applications through their web browser. They often actually cut and paste data out of business applications into a spreadsheet.

You can see our product as a web browser, where you can teach it how to interact with the website, how to only extract the data that's relevant, and how you can structure that data, and then repeat it. Our product can give you automated, real-time, and noise-free access to any data you see in a web browser.

How does that apply to text analytics? Well, it gives you a real-time data source with 100 percent coverage, with all of those qualities that I just explained.
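
As a rough illustration of the kind of job such a robot automates, here is a minimal Python sketch, using the generic requests and BeautifulSoup libraries, that fetches a page, strips the navigation and script noise, and keeps only the article text plus a couple of structured fields. The URL and selectors are placeholders, and this is not the Kapow product itself, which also handles logins, form navigation, scheduling, and repeatability.

```python
import requests
from bs4 import BeautifulSoup

URL = "https://example.com/article/123"  # placeholder; a real robot would also navigate logins and forms

resp = requests.get(URL, timeout=10)
resp.raise_for_status()
soup = BeautifulSoup(resp.text, "html.parser")

# Throw away the obvious noise: scripts, styling, navigation, sidebars, footers.
for tag in soup(["script", "style", "nav", "aside", "footer"]):
    tag.decompose()

# Pull out semi-structured fields; the selectors are assumptions about the page layout.
title_tag = soup.find("h1")
time_tag = soup.find("time")
record = {
    "title": title_tag.get_text(strip=True) if title_tag else None,
    "published": time_tag["datetime"] if time_tag and time_tag.has_attr("datetime") else None,
    "body": " ".join(p.get_text(strip=True) for p in soup.select("article p")),
}

print(record)  # noise-free, semi-structured input for a downstream text-analytics engine
```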

Gardner: I really was intrigued by this notion of the crystal ball, and not two years from now, but tomorrow. It seems to me that so many people are putting up so much information about their lives, their preferences. People in business are doing the same around their occupation. We have this virtual focus group going on around us all the time. If we could just suck out the right information based on our products, we could get that crystal ball polished up.

Let me go back to you, Stefan. Can you give us an example of where a market research, customer satisfaction, or virtual focus group benefit is being derived from these text analytics capabilities?

Knowing the customer

Andreasen: Absolutely. For any company selling services or products, the most important thing for them to know is what the customers think about their product. Are we giving our customers the right customer service? Are we packaging our products the right way? How do we understand the customer's buying behavior, the customer communications, and so on?

Intuit is a customer we share with a text analysis company called Clarabridge. They use a text analysis solution to understand their TurboTax customers.

Before they had a text analysis system, they had people doing one-percent-coverage sampling of forums on the web, their own customer support system, and emails into their contact center to get a rudimentary overview of what customers thought.

We went in, and with Kapow Technologies they can now get to all these data sources -- forums online, their own customer support center, and wherever there are networks of TurboTax users -- and extract all the information in near real-time. Then, they use the text-analysis engine to make much, much better predictions of what the customers think, and they actually have their finger on the pulse.

If a set of customers suddenly starts talking about a feature that doesn't work, or one that is much better in a competitor's product -- thereby looking into the near future of the crystal ball -- they can react early and try to deal with it in the best possible way.

Gardner: Seth Grimes, is this an area where you have seen a lot of the text analytics work focused on these sort of virtual focus groups?

Grimes: Definitely. That's an interesting concept. The idea behind a focus group is that it's a traditional qualitative research tool for market research firms. They get a bunch of people into a room and they have the facilitator lead those people through a conversation to talk about brand names, marketing, positioning, and then get their reactions to it.

With the web, you don't have to get those people together, because they come together on their own and participate in social media forums of various types. There are a whole slew of them. Together they constitute a virtual focus group, as you say.

The important point here is to get at the so-called voice of the customer. In other words, what is the customer saying in his own voice, not in some form where you're forcing that person to tick off number one, two, three, four, or five, in order to rate your product. They can bring up the issues that are of interest to them, whether they are good or bad issues, and they can speak about those issues however they naturally do. That's very important.

I've actually been privileged to share a stage with the analytics manager from Intuit, Chris Jones, a number of times to talk about what he is doing, the technologies, and so on. It's really interesting stuff that amplifies what Stefan had to say.

Broad picture

The idea is that you can use these technologies, both to get a broad picture of the issues, and no longer have to bend those issues into categories that your business analysts have predefined. Now, you can generate the topics of most interest, using automated, statistical methods from what the people are actually saying. In other words, you let them have their own voice.

You also get the effect of not only looking at the aggregate picture, at the mass of the market, but also at the individual cases. If someone posts about a problem with one of the products to an online forum, you can detect that there's an issue there.

You can make sure that the issue gets to the right person, and the company can personally address each issue in order to keep it from escalating and getting a lot of attention that you really don't want it to get. You get the reputation of being a very responsive company. That's a very important thing.

The goal here is not necessarily to make more money. The goal is to boost your customer satisfaction rating, Net Promoter score, or however you choose to measure it. These text technologies are a very important part of the overall package of responding to customer issues and boosting customer satisfaction.

While you're doing it, those people are going to buy more. They're going to reduce your support costs, all of that kind of stuff, and you are going to make more money. So, by doing the right thing, you're also doing something good for your own company.

Gardner: In business, you want to reduce the guesswork to do better by your customers. Stefan, as I understand it, Kapow Technologies has been quite successful in working with a variety of military, government, and intelligence agencies around the world on getting this real-time information as to what's going on, but perhaps with the stakes being a bit higher -- things like terrorism, and even insurrections and uprisings.

Tell us a little bit about a second use case scenario, where text analytics are being used by government agencies and intelligence agencies.

Andreasen: As Seth said, the voice of the customer is a very interesting and very valuable use case for text analysis. I'll add one thing to what Seth said. He was talking about product input, and, of course, we all know that developing products -- maybe not so much a product like TurboTax, but developing a car -- is extremely expensive. So, understanding what kind of product your customers want in the future is an important part of the voice of the customer.

With a lot of our customers in military intelligence, it's similar. Of course, they would like to know what people are writing from a sentiment point of view, an opinion point of view, but another thing that's actually even more important in the intelligence community is what I will call relationships.

Seth mentioned relationships earlier, and also understanding the real influencers and who are the ones that have the most connections in these relationships. Let's say somebody writes an article about how you mix some chemicals together to make an efficient bomb. What you really want to know is who this person knows in all kinds of social networks on the 'Net, and to try to make a network of who are the real influencers and who are the network centers.

Finding relationships

We see a lot of uses of our product, going out to blogs, forums, etc., in all kinds of languages, translating it often into English, and doing this relationship analysis. A very popular product for that, which is a partner of ours, is Palantir Technologies. It has a very cool interactive way of finding relationships. I think this is also very relevant for normal enterprises.
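
To picture the relationship analysis being described, think of a graph of who links to or replies to whom, with the most central nodes surfaced for analysts. The sketch below uses the networkx library on an invented edge list; interactive products such as Palantir obviously go far beyond this.

```python
import networkx as nx

# Invented example: pairs of authors who reply to or link to each other across forums.
edges = [
    ("user_a", "user_b"), ("user_a", "user_c"), ("user_a", "user_d"),
    ("user_b", "user_c"), ("user_e", "user_a"), ("user_f", "user_b"),
]

G = nx.Graph()
G.add_edges_from(edges)

# Degree centrality is the simplest "who has the most connections" measure;
# the highest-scoring nodes are candidate influencers or network centers.
centrality = nx.degree_centrality(G)
for node, score in sorted(centrality.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{node}: {score:.2f}")
```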

Yesterday I met with one of the big record companies, which is also a customer of ours. As soon as I explained this relationship stuff, they said, "We can really use this for anti-piracy, because it is really just a very few people who do the major work when it comes to getting copies of new films out on the 'Net." So, understanding these relationships can be very relevant for this kind of scenario as well.

Grimes: Dana, when you introduced our podcast today, you used the term ecology or ecosystem, and that's a really great concept that we can apply here in a number of dimensions. We do have an ecosystem in at least two dimensions.

Stefan mentioned one of the Kapow partners, Palantir. We earlier mentioned the text analytics partner, Clarabridge. We have the ability now through integration technologies like Kapow to bring together different information sources, very disparate, different information sources with different characteristics, to provide an ecosystem of information that can be analyzed and brought to bear to solve particular business or government problems.

We have a set of software technologies that can similarly be integrated into an ecosystem to help you solve those problems. That might be text analysis technologies. It might be traditional BI or data warehousing technologies. It might be visualization technologies, whatever it takes to handle your particular business problem.

As we've been discussing, we do see applications in a whole variety of business and government issues, whether it's customer or intelligence or many other things that we haven’t even discussed today. So, I find that ecosystem concept to be very useful here in framing the discussions about how the text technologies fit into something that's a much larger picture.

Gardner: So, we are looking at the ecologies. We are looking at some of these use-cases. It seems to me that we also want to be able to gather information from a variety of different players, perhaps in some sort of a supply chain, ecosystem, business process, channel partners, or value added partners. The ecology and ecosystem concept works not only in terms of what we do with this information, but how we can apply that information back out to activities that are multi-player, beyond the borders or boundaries of any one organization.

I'm thinking about product recall, health, and public-health types of issues. Seth, have you worked with any clients or do you have any insights into how text analytics is benefiting an extended supply chain of some sort, and how the ecosystem of insight into the text analytics solves some unique problems there?

Product recall

Grimes: Product recall is an interesting one. Let me give you an example there. This is, like most examples that we are going to discuss, a multifaceted one.

People are all familiar with the problems with Firestone tires back a number of years ago, early in this decade, where the tread was coming off tires. Well, there are a number of parties that are going to be interested in this problem.

Put aside, for a moment, the consumers, who are obviously very badly affected by it. We also have the manufacturers, not only of the tires, but also of the vehicles -- the Ford Explorer in this case.

We have the regulatory bodies in the government, parts of the U.S. Department of Transportation. We have the insurance industry. All of these are stakeholders who have an interest in early detection, early addressing, and early correction of the problem.

You don't want to wait until there are just so many cases here that it's just obvious to everyone, the issues really spill out into the press, and there are questions of negligence, and so on. So, how can you address something like a problem with tires where the tread is coming off?

Well, one way is warranty claims. For example, someone might file a claim through the vehicle manufacturer, Ford in this case, or through the tire manufacturer, claiming a defective product. Sometimes, just an individual tire is defective, but sometimes that's an indication of manufacturing or design issues. So you have warranty claims.

You also have accident reports that are filed by police departments or other government agencies and find their way into databases in the Department of Transportation and other places. Then, you have news reports about particular incidents.

There are multiple sources of information. There are multiple stakeholders here. And, there are multiple ways of getting at this. But, like so many problems, you're going to get at the issue much faster, if you combine information from all of these different sources, rather than relying on a single source.

Again, that's where the importance of building up an ecosystem of different data sources that come to bear on your problem is really important, and that's just a typical use case. I know of other organizations, manufacturing organizations, that are using this technology in conjunction with data-mining technologies for warranty claims, for example. Consumer appliances is another area that I have heard a lot about, but really there is no limitation in where you can apply this.

Gardner: Stefan, from your perspective, for these extended supply chains, public health issues, etc., again we get down to this critical time element -- for example, the Swine flu outbreak last spring. If folks could identify through text analytics where this was starting to crop up, they didn't have to wait for the hospital reports necessarily. Is that an instance where some of these technologies can really play an important role?

Big pitfall

Andreasen: Absolutely. Before I get into some more real examples, I want to emphasize some of the things that Seth was saying. He's talking about getting to multiple data sources. I cannot stress enough that what I have seen out there as one of the biggest pitfalls when people are making a text analysis solution or actually any BI solution is that they look at what data sources they have and they settle for that.

They should have said, "What are the optimal data sources to get the best prediction and get the best outcome out of this text analysis?" They should settle for no less than that.

The example here will actually explain that. I also have a tire example. We actually have two different kinds of customers using our products looking at tires, tire explosions, and tire recalls.

One is a tire company itself. They go to automotive forums and try to monitor whether people are doing exactly what Seth is saying -- filing claims or writing on an automotive blog: "I got this tire, and it exploded." "It's just really bad." "Don't buy it." All those kinds of information from different sources.

If you get enough of the data sources and you get that data in real-time, you can actually go in and contain a potential tire-recall situation before it happens, which of course could be very valuable for your company.

The other use case is stock research. We have a lot of customers doing financial and market research with our technology. One of them is using our product, for example, to go out and check the same forums, but their objective is to predict whether there will be a tire recall. Then, they can predict that the stock is going to take a hit when that happens, and project that beforehand.

Many different players here can use the same kind of information for different purposes, and that makes this really interesting as well.

Gardner: Well, it really seems the age-old part of this is that getting information first has many, many advantages, but the new element is that more and more of that information is out on the web, in a form that analytics can work on.

I wonder if we could cap this discussion -- we are about out of time -- by looking at the future. Seth, you mentioned earlier the semantic web. How automated can this get, and what needs to take place in order for that vision of a semantic web to take place?

Grimes: Well, the semantic web right now is a dream. It's a dream that was first articulated over a decade ago by Tim Berners-Lee, the person who created the World Wide Web, but it is one that is on the fast track to being realized. Being realized in this case means creating meaning.

What Stefan was referring to earlier, when he talked about the date of a published article, the title, and perhaps other metadata fields such as the author, is creating information that describes what's out there on the web and in databases.

Machine processable

Rendering that information into a form that's machine processable, not only in the sense of analysis, but also in the sense of making interconnections among different pieces of information, is what the semantic web is really about. It's about structuring information that's out there on the Web. That can include what Stefan referred to as the deep web, and creating tools that allow people to search and issue other types of queries against that web data.
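
As a toy illustration of what machine-processable structure buys you, recall the earlier wish to find an article about IBM software dated after Oct. 7. Once articles carry metadata fields rather than raw page text alone, that becomes a trivial filter. The records below are invented, and real semantic web work uses shared vocabularies such as RDF and linked data rather than ad hoc dictionaries.

```python
from datetime import date

# Invented records: articles annotated with published or extracted metadata.
articles = [
    {"title": "IBM software update ships", "topic": "IBM software", "published": date(2009, 10, 12)},
    {"title": "IBM quarterly earnings", "topic": "IBM finance", "published": date(2009, 10, 20)},
    {"title": "Older IBM software review", "topic": "IBM software", "published": date(2009, 9, 30)},
]

# "Articles about IBM software dated after Oct. 7" is now a structured query,
# not a keyword search over noisy page text.
hits = [a for a in articles
        if a["topic"] == "IBM software" and a["published"] > date(2009, 10, 7)]

for a in hits:
    print(a["title"], a["published"].isoformat())
```

In semantic-web terms, the same idea is expressed with shared vocabularies and linked data, so that software agents, not just one script, can run the query.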

It's something that people are working hard on now, but I don't think it will really be realized in terms of any broadly usable business applications for a fair number of years. Not next year or the year after, but maybe three to five years out, we will really start to see very broadly useful business applications. There are going to be niche applications in the near term, but later something much broader.

It's a direction that really hits on the themes that we have been talking about today, integrating applications and data from multiple sources and of multiple types in order to create a whole that is much greater than each of the parts.

We need software technologies that can do that nowadays, and fortunately we have them, as we have been discussing. We need a path that will evolve us toward something that creates much greater value for much larger, mass-scale applications in the future, and fortunately the technologies we have now are evolving in that direction.

Gardner: Very good. I think we have to leave it there. I want to thank both of our guests. We have been discussing the role of text analytics and how companies can take advantage of that and bring that into play with their BI and marketing and other activities, and how the mining of this information is now being done by tools and is increasingly being automated.

I want to thank Seth Grimes, principal consultant at Alta Plana Corp., for joining us. Thanks so much, Seth.

Grimes: Again, thank you Dana, and thanks to Kapow for making this possible.

Gardner: Also, Stefan Andreasen, co-founder and CTO at Kapow Technologies. Thanks again for sponsoring and joining us, Stefan.

Andreasen: Well, thank you. That was a great discussion. Thank you.

Gardner: This is Dana Gardner, principal analyst at Interarbor Solutions. This is Part Three of a series from Kapow Technologies on using BI and web data services in unique forms to increase business benefits.

You have been listening to a sponsored BriefingsDirect podcast. Thanks and come back next time.

Listen to the podcast. Find it on iTunes/iPod and Podcast.com. Download the transcript. Learn more. Sponsor: Kapow Technologies.

Transcript of a sponsored BriefingsDirect podcast on information management for business intelligence, one of a series on web data services with Kapow Technologies. Copyright Interarbor Solutions, LLC, 2005-2009. All rights reserved.

Friday, October 30, 2009

Business and Technical Cases Build for Data Center Consolidation and Modernization

Transcript of a sponsored BriefingsDirect podcast on how data center consolidation and modernization helps enterprises reduce cost, cut labor, slash energy use, and become more agile.

Listen to the podcast. Find it on iTunes/iPod and Podcast.com. Download the transcript. Learn more. Sponsor: Akamai Technologies.

Dana Gardner: Hi, this is Dana Gardner, principal analyst at Interarbor Solutions, and you’re listening to BriefingsDirect.

Today, we present a sponsored podcast discussion on how data-center consolidation and modernization of IT systems helps enterprises reduce cost, cut labor, slash energy use, and become more agile.

We'll look at the business and technical cases for reducing the numbers of enterprise data centers. Infrastructure advancements, standardization, performance density, and network services efficiencies are all allowing for bigger and fewer data centers that can carry more of the total IT requirements load.

These strategically architected and located facilities offer the ability to seek out best long-term outcomes for both performance and cost -- a very attractive combination nowadays. But, to gain the big payoffs from fewer, bigger, better data centers, the essential list of user expectations for performance and IT requirements for reliability need to be maintained and even improved.

Network services and Internet performance management need to be brought to bear, along with the latest data-center advancements to produce the full desired effect of topnotch applications and data delivery to enterprises, consumers, partners, and employees.

Here to help us better understand how to get the best of all worlds -- that is high performance and lower total cost from data center consolidation -- we're joined by our panel. Please join me in welcoming James Staten, Principal Analyst at Forrester Research. Welcome, James.

James Staten: Thanks for having me.

Gardner: We're also joined by Andy Rubinson, Senior Product Marketing Manager at Akamai Technologies. Welcome, Andy.

Andy Rubinson: Thank you, Dana. I'm looking forward to it.

Gardner: And, Tom Winston, Vice President of Global Technical Operations at Phase Forward, a provider of integrated data management solutions for clinical trials and drug safety, based in Waltham, Mass. Welcome, Tom.

Tom Winston: Hi, Dana. Thanks very much.

Gardner: Let me start off with James. Let's look at the general rationale for data-center modernization and consolidation. What are the business, technical, and productivity rationales for doing this?

Data-center sprawl

Staten: There is a variety of them, and they typically come down to cost. Oftentimes, the biggest reason to do this is because you've got sprawl in the data center. You're running out of power, you're running out of the ability to cool any more equipment, and you are running out of the ability to add new servers, as your business demands them.

If there are new applications the business wants to roll out, and you can't bring them to market, that's a significant problem. This is something the organizations have been facing for quite some time.

As a result, if they can start consolidating, they can start moving some of these workloads onto fewer systems. This allows them to reduce the amount of equipment they have to manage and the number of software licenses they have to maintain and lower their support costs. In the data center overall, they can lower their energy costs, while reducing some of the cooling required and getting rid of some of those power drops.

Gardner: James, isn't this sort of the equivalent of Moore's Law, but instead of at silicon clock-speed level, it's at a higher infrastructure abstraction? Are we virtualizing our way into a new Moore's Law era?

Staten: Potentially. We've always had this gap between how much performance a new CPU or a new server could provide and how much performance an application could take advantage of. It's partly a factor of how we have designed applications. More importantly, it's a factor of the fact that we, as human beings, can only consume so much at so fast a rate.

Most applications actually end up consuming on average only 15-20 percent of the server. If that's the case, you've got an awful lot of headroom to put other applications on there.

We were isolating applications on their own physical systems, so that they would be protected from any faults or problems with other applications that might be on the same system and take them down. Virtualization is the primary isolating technology that allows us to do that.

Gardner: I suppose there are some other IT industry types of effects here. In the past, we would have had entirely different platforms and technologies to support different types of applications, networks, storage, or telecommunications. It seems as if more of what we consider to be technical services can be supported by a common infrastructure. Is that also at work here?

Unique opportunity

Staten: That's mostly happening as well. The exception to that rule is definitely applications that just can't possibly get enough compute power or enough contiguous compute power. That creates the opportunity for unique products in the market.

More and more applications are being broken down into modules, and, much like the web services and web applications that we see today, they're broken into tiers. Individual logic runs on its own engine, and all of that can be spread across more commoditized, consistent infrastructure. We are learning these lessons from the dot-coms of the world and now the cloud-computing providers of the world, and applying them to the enterprise.

Gardner: I've heard quite a few numbers across a very wide spectrum about the types of payoffs that you can get from consolidating and modernizing your infrastructure and your data centers. Are there any rules of thumb that are typical types of paybacks, either in some sort of a technical or economic metric?

Staten: There's a wide range, because the benefits depend on how bad off you are when you begin and how dramatically you consolidate. On average, across all the enterprises we have spoken to, you can realistically expect to see about a 20 percent cost reduction from doing this. But, as you said, if you've got 5,000 servers, and they're all running at 5 percent utilization, there are big gains to be had.
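
As a back-of-the-envelope illustration of that last point, here is the arithmetic in a short Python sketch; the 60 percent target utilization is an assumption for the example, not a Forrester figure.

```python
# Rough consolidation math, assuming CPU utilization is the binding constraint.
servers = 5000
avg_utilization = 0.05      # 5 percent, as in the example above
target_utilization = 0.60   # a conservative post-virtualization target (assumption)

total_work = servers * avg_utilization            # 250 "server-equivalents" of real work
hosts_needed = total_work / target_utilization    # roughly 417 well-utilized hosts

print(f"Servers today: {servers}")
print(f"Hosts needed at {target_utilization:.0%} utilization: {hosts_needed:.0f}")
print(f"Implied consolidation ratio: {servers / hosts_needed:.1f} : 1")
```

Memory, I/O, and fault-isolation constraints usually keep real ratios well below the CPU-only figure, which is one reason the average saving observed in practice is the more modest 20 percent cited above.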

Gardner: The economic payoff today, of course, is most important. I suppose there is a twofold effect as well. If you're facing a capacity issue and you're thinking about spending $40 or $50 million for an additional data center, and if you can reduce the need to do that or postpone it, you're saving on capital costs. At the same time, you could, perhaps through better utilization, reduce your operating costs as well.

Staten: Absolutely. One of the biggest benefits you get from virtualization is flexibility. It's so much easier to patch a workload and simply keep it running, while you are doing that. Move it to another system, but apply the patch, make sure the patch worked, deploy a clone, and then turn off the old version.

That's much more powerful, and it gives a lot more flexibility to the IT shop to maintain higher service-level agreements (SLAs), to keep the business up and running, to roll out new things faster, and be able to roll them back more easily.

Gardner: Andy Rubinson, this certainly sounds like a no-brainer: Get better performance for less money and postpone large capital expenditures. What are some of the risks that could come into play while we are starting to look at this whole picture? I'm interested in what's holding people back.

Rubinson: I focus mainly on delivery over the Internet. There are definitely some challenges, if you're talking about using the Internet with your data center infrastructure -- things like performance latency, availability challenges from cable cuts, and things of that nature, as well as security threats on the Internet.

It's thinking about how you can do this -- how you can deliver to a global user base from your data center, without necessarily having to build out data centers internationally, and do that from a consolidated standpoint.

Gardner: So, for organizations that are not just focused on employees -- or, if they are, that are global organizations -- they need to be thinking about the widest possible wide area network (WAN). Right?

Rubinson: Absolutely.

Gardner: Let's go to our practitioner, Tom Winston. Tom, what sort of effects were you dealing with at Phase Forward, when you were looking at planning and strategy around data center location, capacity, and utilization?

Early adopter

Winston: Well, we were in a somewhat different position, in that we were actually an early adopter of virtualization technology, and certainly had seen the benefits of using that to help contain our data-center sprawl. But, we were also growing extremely rapidly.

When I joined the organization, it had two different data centers -- one on the East Coast and one on the West Coast. We were facing the challenge of potentially having to expand into a European data center, and even potentially a Pacific Rim data center.

By continuing to expand our virtualization efforts, as well as leveraging some of the technologies that Andy just mentioned, such as Internet acceleration via Akamai, we were able to forgo that data-center expansion. In fact, we were able to consolidate down to one East Coast data center, which is now our primary hosting center for all of our applications.

So, it had a very significant impact for us by being able to leverage both that WAN acceleration, as well as virtualization, within our own four walls of the data center. [Editor's note: WAN here and in subsequent uses refers to public wide area networks and not private.]

Gardner: Tom, just for the edification of our listeners, tell us a little bit about Phase Forward. Where are your users, and where do your applications need to go?

Winston: We run electronic data capture (EDC) software, and pharmacovigilance software for the largest pharmaceutical and clinical device makers in the world. They are truly global organizations in nature. So, we have users throughout the world, with more and more heavy population coming out of the Asia Pacific area.

We have a very large, diverse user base that is accessing our applications 24x7x365, and, as a result, we have performance needs all the time for all of our users.

In an age where, as James mentioned, people are expecting things to be moving extremely quickly and always available, it's very important for us to be able to provide that application all the time, and to perform at a very high level.

One of the things James mentioned from an IT perspective is being able to manage that virtual stack. Another thing that virtualization allows us to do is to provide that stack and to improve performance very quickly. We can add additional compute resources into that virtual environment very quickly to scale to the needs that our users may have.

Gardner: James Staten, back to you. Based on Tom's perspective of the combination of that virtualization and the elasticity that he gets from his data center, and the ability to locate it flexibly, thanks to some network optimization and reliability issues, how important is it for companies now, when they think about data center consolidation, to be flexible in terms of where they can locate?

All over the place

Staten: It's important that they recognize that their users are no longer all in the same headquarters. Their users are all over the place. Whether they are an internal employee, a customer, or a business partner, they need to get access to those applications, and they have a performance expectation that's been set by the Internet. They expect whatever applications they are interacting with will have that sort of local feel.

That's what you have to be careful about in your planning of consolidation. You can consolidate branch offices. You can consolidate down to fewer data centers. In doing so, you gain a lot of operational efficiencies, but you can potentially sacrifice performance.

You have to take the lessons that have been learned by the people who set the performance bar, the providers of Internet-based services, and ask, "How can I optimize the WAN? How can I push out content? How can I leverage solutions and networks that have this kind of intelligence to allow me to deliver that same performance level?" That's really the key thing that you have to keep in mind. Consolidation is great, but it can't be at the sacrifice of the user experience.

Gardner: When you find the means to deliver that user experience, that frees you up to then place your data centers strategically based on things like skills or energy availability or tax breaks, and so forth. Isn't that yet another economic incentive here?

Staten: You want to have fewer data centers, but they have to be in the right location, and the right location has to be optimized for a variety of factors. It has to be optimized for where the appropriate skill sets are, just as you described. It has to be optimized for the geographic constraints that you may be under.

You may be doing business in a country in which all of the citizen information of the people who live in that country must reside in that country. If that's the case, you don't necessarily have to own a data center there, but you absolutely have to have a presence there.

Gardner: Andy, back to you. What are some of the pros and cons for this Internet delivery of these applications? I suppose you have to rearchitect, in order to take advantage of this as well.

Rubinson: There are two main areas on the positive side, the benefits: the cost efficiency of delivering over the Internet and the responsiveness. From the cost perspective, we're able to eliminate unnecessary hardware. We're able to take some of that load off of the servers and do the work in the cloud, which also helps reduce the number of servers needed.

A lot of cost efficiencies

There are a lot of cost efficiencies that we get, even as you look to Tom's statement about being able to actually eliminate a data center and avoid having to build out a new data center. Those are all huge areas, where it can help to use the Internet, rather than having to build out your own infrastructure.

Also, in terms of responsiveness, by using the Internet, you can deploy a lot more quickly. As Tom explained, it's being able to reach the users across the globe, while still consolidating those infrastructures and be able to do that effectively.

This is really important, as we have seen more and more users that are going outside of the corporate WANs. People are connecting to suppliers, to partners, to customers, and to all sorts of things now. So, the private WANs that many people are delivering their apps over are now really not effective in reaching those people.

Gardner: As James said earlier, we've got different workloads and different types of applications. Help me understand what Akamai can do. Do you just accelerate a web app, or is there a bit more in your quiver in terms of dealing with different types of loads of media, content, application types?

Rubinson: There are a variety of things that we are able to deliver over the Internet. It includes both web- and IP-based applications. Whether it's HTTP, HTTPS, or anything that's over TCP/IP, we're able to accelerate.

We also do streaming. One of the things to consider here is that we actually have a global network of servers that kind of makes up the cloud or is an overlay to the cloud. That is helping to not only deliver the content more quickly, but also uses some caching technology and other things that make it more efficient. It allows us to give that same type of performance, availability, and security that you would get from having a private WAN, but doing it over the much less expensive Internet.

Gardner: You're looking at specifics of an application in terms of what's going to be delivered at frequent levels versus more infrequent levels, and you can cache the data and gain the efficiency with that local data store. Is that how it works?

Rubinson: A lot of folks think about Akamai as being a content delivery network (CDN), and that's true. There is caching that we are doing. But, the other key area where we have benefit is through the delivery of dynamic data. By optimizing the cloud, we're able to speed the delivery of information from the origin as well. That's where it's benefiting folks like Tom, where he is able to not only cache information, but the information that is dynamic, that needs to get back from the data center, goes more quickly.
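
One generic way to see that static-versus-dynamic split from the origin side is simply to inspect response headers. The sketch below is plain HTTP with placeholder URLs; it is not an Akamai-specific API, just a way to reason about which responses an edge cache can serve and which must be accelerated back to the origin.

```python
import requests

# Placeholder URLs standing in for a static asset and a dynamic, per-user response.
urls = [
    "https://example.com/static/logo.png",
    "https://example.com/api/case-report?id=123",
]

for url in urls:
    r = requests.get(url, timeout=10)
    cache_control = r.headers.get("Cache-Control", "(none)")
    age = r.headers.get("Age")  # many caches add an Age header when serving a stored copy
    print(f"{url}\n  Cache-Control: {cache_control}  Age: {age}")

# Responses marked "public, max-age=..." can be served from edge caches;
# "no-store" or "private" responses must go back to the origin, which is where
# route and protocol optimization, rather than caching, does the work.
```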

Gardner: Let's check in with Tom. How has that worked out for you? What sort of applications do you use with wide area optimization, and what's been your experience?

Flagship application

Winston: Our primary application, our flagship application, is a product called InForm, which is the main EDC product that our customers use across the Internet. It's accelerated using Akamai technology, and almost 100 percent of our content is dynamic. It has worked extremely well.

Prior to our deployment of Akamai, we had a number of concerns from a performance standpoint. As James mentioned, as you begin to virtualize, you also have to be very conscious of the potential performance hits. Certainly, one of the areas that we were constrained with was performance around the globe.

We had users in China who, due to the amount of traffic that had to traverse the globe, were not happy with the performance of the application. Specifically, we brought in Akamai to start with a very targeted group of users and to be able to accelerate for them the application in that region.

It literally cut the problem right out. It solved it almost immediately. At that point, we then began to spread the rest of that application acceleration product across the rest of our domains, and to continue to use that throughout the product set.

It was extremely successful for us and helped solve performance issues that our end users were having. I think some of the comments that James made are very important. We do live in a world where everybody expects every application across the Internet to perform like Google. You want to search and you expect it to be back in seconds. If it's not, people tend to be unhappy with the performance of the application.

In our application, it's a much more complex application. A lot more is going on behind the scenes -- database calls, whatever it may be. Having an application perform to the level of a Google is something that our end users expect, even though obviously it's a much different application in what it's attempting to solve and what it's attempting to do. So, the benefits that we were able to get from the acceleration servers were very critical for us.

Rubinson: Just to add to that, we recently commissioned a study with Forrester, looking at what is that tolerance threshold [for a page to load]. In the past it had been that people had tolerance for about four seconds. As of this latest study, it's down to two seconds. That's for business to consumer (B2C) users. What we have seen is that the business-to-business (B2B) users are even more intolerant of waiting for things.

It really has gotten to a point where you need that immediate delivery in order to drive the usage of the tools that are out there.

Gardner: I suppose that's just human nature. Our expectations keep going up. They usually don't go down.

Rubinson: True.

Gardner: Back to you, Tom. Tell me a little bit more about this application. Is this a rich Internet application (RIA)? Is this strictly a web interface? Tell us a little bit more about what the technical challenge was in terms of making folks in China get the same experience as those on the East Coast, who were a mile away from your data center.

Everything is dynamic

Winston: The application is one that has a web front-end, but all the information is being sent back to an Oracle database on the back-end. Literally, every button click that you make is making some type of database query or some type of database call, as I mentioned, with almost zero static content. Everything is dynamic.

There is a heavy amount of data that has to go back and forth between the end user and the application. As a result, prior to acceleration, that was very challenging when you were trying to go halfway around the globe. It was almost immediate for us to see the benefits by being able to hop onto the Akamai Global Network and to cut out a number of the steps across the Internet that we had to traverse from one point to our data center.

Gardner: So, it was clearly an important business metric, getting your far-flung customers happy with their response times. How did that however translate back when you reverse engineered from the experience to what your requirement would be within that data center? Was there sort of a meeting of the minds between what you now understand the network is capable of, with what then you had to deliver through your actual servers and infrastructure?

I guess I'm looking for an efficiency metric or response in terms of what the consolidation benefit was.

Winston: As I mentioned, we had already consolidated from a virtualization standpoint within the four walls of the data center. So, we were continuing to expand in that footprint. But, what it allowed us to do was forego having to put a data center in the Pacific Rim or put a data center in Europe to put the application closer to the end user.

Gardner: Let's look to the future a little bit. James, when people think nowadays about cloud computing, that's a very nebulous discussion and topic set. It seems as if what we're talking about here is that more enterprises are going to have to themselves start behaving like what people think of as a cloud.

Staten: Yes, to a degree. There is obviously a positive aspect of cloud and one that can potentially be a negative.

Operating like a cloud is really operating in this more homogeneous, virtualized, abstracted world that we call server virtualization in most enterprises. You want to operate in this mode, so that you can be flexible and you can put applications where they need to be and so forth.

But, one of the things that cloud computing does not deliver is that if you run it in the cloud, you are not suddenly in all geographies. You are just in a shared data center somewhere in the United States or somewhere in your geography. If you want to be global, you still have to be global in the same sense that you were previously.

Cloud not a magic pill

Rubinson: Absolutely. Just putting yourself in the cloud doesn't mean that you're not going to have the same type of latency issues, delivering over the Internet. It's the same thing with availability in trying to reach folks who are far away from that hosted data center. So, the cloud isn't necessarily the answer. It's not a pill that you can take to fix that issue.

Gardner: Andy, I don't think you can mention names, but you are not only accelerating the experience for end users of enterprise applications like a Phase Forward. You're also providing similar services for at least several of the major cloud providers.

Rubinson: It really is anybody who is using the cloud for delivery. Whether it's a high-tech, a pharma company, or even a hosting provider in the cloud, they've all seen the value of ensuring that their end users are having a positive experience, especially folks like software-as-a-service (SaaS) providers.

We've had a lot of interest from SaaS companies that want to ensure that they are not only able to give a positive user experience, but even from a sales perspective, being able to demonstrate their software in other locations and other regions is very valuable.

Gardner: Now, James, when a commercial cloud provider provides an SLA to their customers, they need to meet it, but they also need to keep their costs as low as possible. More and more enterprises are trying to behave like service providers themselves, whether it's through ITIL adoption, IT shared services or service-oriented architecture (SOA). Over time, we're certainly seeing movement toward a provider-supplier, consumer-subscription relationship of some kind.

If we can use this acceleration and the ability to use the network for that requirement of performance to a certain degree, doesn't this then free up the folks who have to meet those SLAs in terms of what they need to provide? I'm getting back to this whole consolidation issue.

Staten: To some degree. Obviously, by using the best practices that we've adopted to have blazing fast websites and applying them to make sure that all of your applications, consumed by everyone, are still blazing fast means that you don't have to reinvent the wheel. Those practices work for your website. You just apply them to more areas.

If you're applying practices you already know, then you can free up your staff to do other things to modernize the infrastructure, such as deploying ITIL more widely than you have so far. You can make sure that you apply virtualization to a larger percentage of your infrastructure and then deal with the next big issue that we see in consolidation, which is virtual machine (VM) sprawl.

Can get out of control

This is where you are allowing your enterprise customers, whether they are enterprise architects, developers, or business units to deploy new VMs much more quickly. Virtualization allows you to do that, but you can quickly get out of control with too many VMs to manage.

Dealing with that issue is what is front and center for a lot of enterprise IT professionals right now. If they haven't applied the best practices or performance to their application sets and to their consolidation practices, that's one more thing on their plate that they need to deal with.

Gardner: So, this also can relate to something that many of us are forecasting. Not much of it is happening yet, but it's this notion of a hybrid approach to cloud and sourcing, where you might use your data center up to a certain utilization and, under certain conditions where there is a spike in demand, offload that to a third-party cloud provider.

If you're assured from the WAN services that the experience is going to be the same, regardless of the sourcing, they are perhaps going to be more likely to pursue such a hybrid approach. Is that fair to say, James?

Staten: This is a really good point that you're bringing up. We wrote about this in a report we called "Hollow Out The MOOSE." MOOSE is Forrester's term for the Maintenance and Ongoing Operations, Systems, and Equipment, which is basically everything you are running in your data center that has been deployed up to this point.

The challenge most enterprises have is that MOOSE consumes 70 or 80 percent of their entire budget, leaving very little for new innovation and other things. They see things like cloud and they say, "This is great. I'll just move this stuff to the cloud, and suddenly it will save me money."

No. The real answer is that you need to choose the right type of solution for the right problem. We call this Strategic Rightsourcing, which says to take the things that others do better than you and have others do them, but know economically whether that's a positive tradeoff for you or not. It doesn't necessarily have to be cash positive, but it has to be an opportunity to be cost positive.

In the case of cloud computing, if I have something that I have to run myself, it's very unique to how I design it, and it's really best that I run it in my data center, you're not saving money by putting that in the cloud.

If it's an application that has a lot of elasticity, and you want it to have the ability to be on two virtual machines during the evening, and scale up to as many as 50 during the day, and then shrink back down to 2, that's an ideal use of cloud, because cloud is all about temporary capacity being turned on.
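
A crude way to picture that elasticity is a scheduler that sizes the pool against demand within fixed bounds. The per-VM capacity figure below is hypothetical, and the function stops short of calling any particular provider's provisioning API.

```python
MIN_VMS, MAX_VMS = 2, 50          # overnight floor and daytime ceiling, per the example
REQUESTS_PER_VM = 200             # assumed capacity of one VM (illustrative)

def desired_pool_size(current_load_rps: int) -> int:
    """Return how many VMs the pool should have for the current request rate."""
    needed = -(-current_load_rps // REQUESTS_PER_VM)   # ceiling division
    return max(MIN_VMS, min(MAX_VMS, needed))

# Evening trough, daytime peak, back to trough.
for load in (150, 9500, 300):
    print(f"load={load} rps -> {desired_pool_size(load)} VMs")
```

The point is simply that capacity tracks demand between the overnight floor and the daytime ceiling, instead of being provisioned for the peak around the clock.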

A lot of people think that it's about performance, and it's not. Sure, load balancing and the ability to spawn new VMs increases the performance of your application, but performance is experienced by the person at the end of the wire, and that's what has to be optimized. That's why those types of networks are still very valuable.

Gardner: Tom Winston, is this vision of a hybrid model, using the cloud to ameliorate spikes and therefore reduce your total cost, appealing to you?

Has to be right

Winston: It is, but I couldn't agree more with what James just said. It has to be for the right situation. Certainly, we've started to look at some of our applications with an eye toward potentially running them in a cloud environment, but right now our critical application, the one that I mentioned earlier, is something that we have to manage ourselves. It's a very complex environment, and we need to hold it very close to the vest.

People have the idea that, "Gee, if I put it in the cloud, my life just got a lot easier." I actually think the reverse might be true, because if you put it into the cloud, you lose some control that you have when it's inside your four walls.

Now, you lose some of the ability to provide the level of service you want for your customers. Cloud needs to be used for the right application and in the right situation, as James mentioned. I really couldn't agree more with that.

Gardner: So, the cloud is not the right hammer for every nail, but when the nail is the right one, that hybrid model can perhaps deliver quite an economic benefit. Andy, at Akamai, are you guys looking at that hybrid model, and is there something there that your services might foster?

Rubinson: This is really something that we are agnostic about. Whether it's in a data center owned by the customer or whether it's in a hosted facility, we are all about the means of delivery. It's delivering applications, websites, and so forth over the public Internet.

It's something we're able to do if there are facilities being used for, say, disaster recovery, where it's the hybrid scenario that you're describing. For Akamai, it's really about how we're able to accelerate that, how we're able to optimize the routing and the other protocols on the Internet so that content gets from wherever it's hosted to a global set of end users.

We don't care where they are. They don't have to be on the corporate, private WANs. It's really about that global reach and delivering the levels of performance needed to actually provide an SLA. Tell me: who else out there provides an SLA for delivery over the Internet? Akamai does.

Gardner: Well, we'll have to leave it there. We've been discussing how data center consolidation and modernization can help enterprises cut costs, reduce labor, slash energy use, and become more agile, while also keeping in mind the requirements for performance across wide area networks.

We've been joined by James Staten, Principal Analyst at Forrester Research. Thank you, James.

Staten: Thank you.

Gardner: We were also joined by Andy Rubinson, Senior Product Marketing Manager at Akamai Technologies. Thank you, Andy.

Rubinson: Thank you very much.

Gardner: Also, I really appreciate your input, Tom Winston, Vice President of Global Technical Operations at Phase Forward.

Winston: Dana, thanks very much. Thanks for having me.

Gardner: This is Dana Gardner, principal analyst at Interarbor Solutions. You've been listening to a sponsored BriefingsDirect podcast. Thanks for listening, and come back next time.

Listen to the podcast. Find it on iTunes/iPod and Podcast.com. Download the transcript. Learn more. Sponsor: Akamai Technologies.

Transcript of a sponsored BriefingsDirect podcast on how data center consolidation and modernization helps enterprises reduce cost, cut labor, slash energy use, and become more agile. Copyright Interarbor Solutions, LLC, 2005-2009. All rights reserved.