Monday, August 11, 2008

WSO2 Data Services Provide Catalyst for SOA and Set Stage for New Cloud-Based Data Models

Transcript of BriefingsDirect podcast on data services, SOA and cloud-based data hosting models.

Listen to the podcast. Sponsor: WSO2.

Dana Gardner: Hi, this is Dana Gardner, principal analyst at Interarbor Solutions, and you’re listening to BriefingsDirect.

Today, a sponsored podcast discussion about data services: how service-oriented architecture (SOA) is enabling an expansive reach for data, particularly enterprise data, and how this will relate to the development of cloud infrastructures and services.

We’ll also examine what technologies and approaches organizations will need to federate and integrate these data sources, services, and hosts without additional risk. That is to say, to free up data, give it more legs, but without losing control or increasing security risk.

We're also going to look at how open-source software relates to this, and how organizations are bridging the risk reduction and larger availability of data using open-source software.

To help us in this discussion, we are joined by a distinguished panel. First, Paul Fremantle, the chief technology officer at WSO2. Welcome, Paul.

Paul Fremantle: Hi, nice to be here.

Gardner: We are also joined by Brad Svee, the IT manager of development at Concur Technologies, a travel and expense management company in Redmond, Wash. Welcome to the show, Brad.

Brad Svee: Glad to be here.

Gardner: We are also joined by James Governor, a principal analyst and founder at RedMonk. Good to have you back, James.

James Governor: Thank you very much. Hello, everyone.

Gardner: Let's set this up by looking at the problem. I suppose the good news of looking into the past is that data has been secure and controlled. There are lots of relational databases with many, many bells and whistles applied over the years. There has also been a lot of development, middleware, and systems administration work to manage that data, keep it available, but also secure. That's the good news.

The bad news is that it's been limited in some respects by that firewall of personnel, technologies, and standards around it. I want to go first to Paul Fremantle. Can you tell us a little bit about why the old model is not going to be sustainable in the world of services and mixed hosting environments?

Fremantle: It's a very interesting question. There are two key issues around the old model. The first is the just-in-time nature of data systems that are being bought today. Typically, customers are coming onto Websites and expecting to see the status of the world as it is right now.

They don't want to know what happened yesterday. They don't want to know what happened a week ago. They want to know exactly what happened right now. So it's very important that we move away from batch-oriented systems and file dumping and move to a real, live connected world, which is what people expect out of the Internet.

That live, connected world needs to be managed properly and it's very, very difficult to build that as a single monolithic system. So, it's really essential to move the ownership of that data to the people who really know it, create it, and own it, and that really plays to SOA. This is the model of keeping the data where it belongs and yet making it available to the rest of the world. That's the first point.

The second point is, of course, the danger inherent in getting it wrong. I have two stories, which I think will shed some interesting light on this. One is, I was working with a government organization, and they were involved in a situation where every day one of the employees had to FTP a data file from a remote system and back-load it into their local system.

The employee fell ill, and, of course, this didn't happen. They had a whole process to find out who had the password, who could do this and solve this problem. They had no one there to back-load this data. As I was investigating, it turned out that the data in the remote system, from the other organization, was actually coming from within their own organization.

There was another employee uploading the data from the main system to the remote system every day, and they had no clue about this. They didn't realize that this process had built up, where the data from organization A was being sent to organization B, and then re-downloaded to organization A again, every single day, taking up two employees' time to do that.

Gardner: This is sort of the medieval approach to data transfer.

Fremantle: This is the medieval approach to data transfer. This was not happening back in 1963. This was actually happening in 2007.

Governor: Medieval or not, the simple fact is that there are vast amounts of exactly that kind of stuff going on out there. Another lovely story told by Martin Fowler talks about a customer -- I believe he was in the U.K. NHS, but I should be a little bit careful there. I should say it was a large organization, and they were freaking out. They said, "We've got to get off Java, because the printer driver is just no good."

He said, "What exactly are you trying to do? Let's have a chat about the situation." "We've got to get off Java. We will just try and work it out." He looked at the work that was involved. Basically, they were getting a document, printing it out, taking it across the room, and then typing it into another system on the other side of the room. He had to tell them, "Well, maybe there is another way of doing it that won't require printer drivers."

Gardner: One of the motivators, it seems, is if nothing dramatic requires you to change your patterns, then you stay with them. It's sort of inertia with people's behavior, and that includes IT. What we're seeing now is an impetus, or an acceleration and automation in services, because they have to, because there are outside organizations involved. A business process is not strictly internal, from one side of the room to the other, but could be across the globe and span many different companies. Does that sound correct, Paul?

Fremantle: Absolutely. I just want to give you a second example, which has been very well publicized in the U.K., where I live, but maybe hasn't been so well known outside the U.K. Revenue and Customs in the U.K. had a significant problem recently, where they sent a CD containing 20 million records, including the birth dates, names, national insurance numbers, and bank account details of 20 million people, to another government department.

And, they lost it. They sent it again, and they lost it again. It would not be going too far to say this had significant ramifications on the government and their ability to stay in government. The payoff of this was that they had policemen out searching rubbish dumps. They had to send a personal letter to each of the 20 million people. Banks had to update their security procedures.

The overall cost of this mistake, I imagine, must be in the millions of pounds. Now, the interesting question is, firstly, they didn't encrypt that data properly, but even if they had, there is a huge difference between encrypting an online system and encrypting a CD. If a hacker gets hold of the CD, he can spend as long as it takes to decrypt that system. If it takes him two years of computing power to do that, he can sit there for two years and break it.

If you have an encrypted online system and someone keeps trying to break it, the administrator sees log messages, knows that something is happening, and can deal with that. So it's not just the lack of encryption and the bulk dumping of data from one department to another that's the problem. The model of sticking it on a CD hugely increases the dangers.

Governor: Well, people should be imprisoned for that, or at least lose the right to trade. Obviously, being government organizations, it's difficult to make that stick, but the U.K. government loves the use of phrase "fit for purpose." Quite frankly, there has been evidence that they are not fit for purpose.

Interestingly enough, one of the things about the importance of data and managing it more effectively is thinking about data in a more rigorous way. I was going to talk on this call about "leaky abstractions." One of the problems with SOA is the notion that, "Oh, we can just take the system as it is and make it available as a service."

Actually, you do want to do some thinking and modeling about your data, your data structures, and how they can be accessible, and so on, because of this notion of leaky abstractions. You can push something in one place and something else is going to pop out in another by just taking a service as it is and putting it online. You may not be doing the work required to use it more effectively.

I think that's the kind of thing that Paul is talking about there. What better example of the leaky abstraction is there than somebody sending a disk and not tracking where it goes? Again, the fact that there wasn't any cryptography used is really shocking, but frankly, this is business as usual.

Fremantle: In fact just to completely confirm what you are saying there, the government department that wanted this data did not want the bank account details, the national insurance numbers, or the ages. They didn't want all that data. What actually happened was the revenue and customs team were not sufficiently IT aware to be able to export just the data that was wanted, so they dumped everything onto the disk.

I think that exactly confirms what you are saying about the leaky abstraction. They just pushed out everything, because that was the simplest possible thing to do, when exporting only what was required is what should have been done.

Gardner: So, it does seem clear that the status quo is not sustainable, that there is inherent risk in the current system, and that simply retrofitting existing data and turning it on as a service is not sufficient. You need to rationalize, think about the data, and generate the ability to slice and dice it a little better, so that in the case of this disk of vast amounts of information, only the small portion that was actually required would be shared.

Let's look at this also through the lens of, "If we need to change, how do we best do that?" Let's look at an example of how someone who needs to view data in a different, more modern sense is adjusting. Let's go to Brad at Concur. Your organization is involved with helping to improve the efficiency and productivity of travel and expense management inside of organizations.

Your data is going to come from a variety of areas that probably could be sensitive data in many organizations. Certainly, people are not happy about having their travel information easily available around the organization, or certainly outside of it. And, of course, there are government and tax implications, as well as compliance implications. Can you give us a little bit of a sense of what your data problem set is and whether it's different from what we have heard on the "medieval" front? What sort of approaches would you like to take and have been taking?

Svee: First, I would like to clarify the distinct separation between our research and development team, which actually works on the product that we sell to clients, and my team, which works internally with our internal data.

I would like to draw a distinct clarification between those two. I am only able to speak to the internal data, but what we have found is exactly that. Our data is trapped in these silos, where each department owns the data, and there is a manual paper process to request a report.

Requesting a customer report takes a long time. What we have been able to do is expose that data through Web services, using mashup-type UI technology and data services, to keep the data in the place it belongs -- without a flat file flying between FTP servers, as you talked about -- and start to show people data that they haven't seen before in an instant, consumable way.

Gardner: So, not even taking that further step of how this data might be used in an extended enterprise environment or across departmental or organizational boundaries, just inside your organization, as you are trying to modernize and free up the data, you are looking at this through the lens of availability, real time, lower cost, and less hands-on effort from IT personnel. What sort of technologies and approaches have you been taking in order to try to achieve that?

Svee: For the last year or so, we have been pushing an SOA initiative, and we have been evaluating the WSO2 product line since maybe November. We have been trying to free up our data, as well as rethink the way all our current systems are integrated. We are growing fairly rapidly, and as we expand globally, it is becoming more and more difficult to expose that data to teams across the globe. So we have to jump in and rethink the complete architecture of our internal systems.

Gardner: What is it about the architecture that has a bearing on the flexibility and agility you are looking for, but that also protects your needs for reduced risk, security, privacy, and access control?

Svee: Most of the data that we are dealing with is fairly sensitive, and therefore almost all of it needs at least per-user access control. As well, when we are transporting data, we have to make sure that it's encrypted or at least digitally signed.

Gardner: Now, it seems to me that this data will need to be available through a browser-based portal or application to the end users, but that the data is also going to play a role with back-office systems, ledgers, and different accounting activities, as this travel and expense content needs to be reconciled across the company's books.

Svee: The browser becomes the ubiquitous consumption point for this data, and we are able to mash up the data, providing a view into several different systems. Before, that was not possible. As for the additional piece of moving files between financial systems, for example, we no longer have to pull files; we can actually use Web services to send only the data that has changed, as opposed to a complete dump of the data, which really decreases our network bandwidth usage.
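The "send only what changed" idea Svee describes is essentially delta synchronization: track the last sync point and ship only the records modified since then. As a rough sketch only -- the record shape, field names, and timestamps here are hypothetical, not Concur's actual schema:

```python
from datetime import datetime, timezone

def changed_since(records, last_sync):
    """Return only the records modified after the last synchronization,
    so the consumer receives a delta rather than a full dump."""
    return [r for r in records if r["modified"] > last_sync]

# Hypothetical expense records with last-modified timestamps.
records = [
    {"id": 1, "amount": 120.00, "modified": datetime(2008, 8, 1, tzinfo=timezone.utc)},
    {"id": 2, "amount": 75.50, "modified": datetime(2008, 8, 10, tzinfo=timezone.utc)},
]
last_sync = datetime(2008, 8, 5, tzinfo=timezone.utc)

# Only record 2 was modified after the last sync, so only it is sent.
delta = changed_since(records, last_sync)
```

Shipping only the delta, rather than a nightly full-file transfer, is what produces the bandwidth saving he mentions.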

Governor: There's even potentially a green argument in there. I mean, all of this batch is just kind of crazy and unnecessary. We see a lot of it. There is so much data duplicated everywhere. It seems like we, as an industry, are very good at just replicating and getting ridiculous redundancy, and not so good at synchronizing and really thinking about what data does need to be transported and working with that accordingly.

That sort of makes a lot of sense to me. It's very good to hear you are taking that approach. I think sometimes we mis-call things SOA, when in fact what you are doing is kind of "suck and play." You take this thing, suck old things out, and then work on the new thing, as opposed to actually thinking about the data structures you need to enable the data to be useful and fit you.

Gardner: Let's go to Paul. Now, here is an instance where the organization has, I think, its feet in both camps. In the old style, there is accounting, the ledgers, and data extension across application sets from a common repository, and how to batch that in such a way that the data is all on the same page, so to speak, across these applications in a time frame.

We also need to take this out through Web applications to individuals and also across applications that are Web services enabled. So, it sounds like what we have here is a situation where the data needs to do many different tricks, not just a couple of old basic tricks.

What is it that WSO2 has done recognizing this kind of need in the market and is able to satisfy this need?

Fremantle: What we have built is what we call WSO2 Data Services, which is a component of our application server. The WSO2 Data Services component allows you to take any data source that is accessible through JDBC -- MySQL databases, Oracle databases, or DB2 -- but, in addition, we also have support for Excel, CSV files, and various other formats, and very simply expose it as XML.

Now, this isn't just exposed as, for example, Web services. In fact, it can also be exposed through REST interfaces. It can be exposed through XML over HTTP, and can even be exposed as JSON. JavaScript Object Notation makes it very easy to build Ajax interfaces. It can also be supported over JMS, a messaging system.

So the fundamental idea here is that the database can be exposed through a simple mapping file into multiple formats and multiple different protocols, without having to write new code or without having to build new systems to do that. What we're really replacing there is, for example, where you might take your database and build an object relational map and then you use multiple different programming toolkits -- one Web services toolkit, one REST toolkit, one JMS toolkit -- to then expose those objects.

We take all that pain away, and say, "All you have to do is a single definition of what your data looks like in a very simple way, and then we can expose that to the rest of the world through multiple formats."
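To make that "single definition" idea concrete, a data services mapping file of the kind Fremantle describes might look roughly like this. This is an illustrative sketch only -- the element names, attributes, and connection properties are hypothetical, not the exact WSO2 descriptor schema:

```xml
<data name="ExpenseDataService">
  <!-- Connection details for the underlying JDBC source (illustrative values) -->
  <config id="default">
    <property name="driverClassName">com.mysql.jdbc.Driver</property>
    <property name="url">jdbc:mysql://localhost:3306/expenses</property>
  </config>
  <!-- One declarative query, mapped to an XML result; no hand-written code -->
  <query id="getExpensesByUser" useConfig="default">
    <sql>SELECT id, amount, status FROM expense WHERE user_id = ?</sql>
    <param name="user_id" sqlType="INTEGER"/>
    <result element="Expenses" rowName="Expense">
      <element name="id" column="id"/>
      <element name="amount" column="amount"/>
      <element name="status" column="status"/>
    </result>
  </query>
  <operation name="getExpensesByUser">
    <call-query href="getExpensesByUser"/>
  </operation>
</data>
```

Given a declarative definition like this, the server, rather than hand-written toolkit code, takes care of rendering the result over SOAP, plain XML over HTTP, JSON, or JMS.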

Gardner: When that data changes on the core database, those changes are then reflected across all of these different avenues, channels, and approaches it's sharing. Is that correct?

Fremantle: Absolutely, because it's being accessed on demand and then exposed as needed in whichever format the client asks for. So, it's not storing those data formats in its own system.

Governor: One of the things that I really like about this story is that we went through a period where there was a view that everything needed to be done with the WS stack, and the only way to do SOA, the only way to do data integration, was to use these large-scale Web standards. But they're not critical in all cases, and it really depends on your requirements for security and so on. Do you really need SOAP and some of the heavier-weight protocols and technology?

I think that the approaches that say, "Let's understand: is this behind the firewall? What are the levels of protection that are required? Can we do this in a simpler fashion?" are very valuable. The point about JSON applies to UI-related stuff, and certainly to REST kinds of interfaces. But, at the end of the day, it's a question of whether you have developers available, in your shop or to hire, who are going to be able to do the work that's required. And some good examples came out of the Web world.

If you look at eBay, they had a SOAP API, but hardly anybody used it. A great number, 80 percent plus, of the calls were using RESTful styles. Understanding the nature of your problem and having more flexibility is very, very important.

Gardner: One of the things that I really like about this is that it's almost like Metcalfe's Law: the more participants there are on the network, the more valuable it is. The more people and systems and approaches to distributing data, the more valuable the data becomes. What's been nice is that we've elevated this distribution value of data at the same time that open source and community-based development have become much more prominent.

That means that the ways in which the data is shared and transferred are not just going to be dependent upon a commercial vendor's decision about which standards to support. We can open this up to a community where even very esoteric users can get community involvement to write and create the means for sharing and transferring.

The data can take on many more different integration points, and standards can evolve in new and different ways. Let's discuss a little bit, first with Paul, about the role of open source, community, and opening up the value of data.

Fremantle: I am just a fanatic about open source and community. I think that open source is absolutely vital to making this work, because fundamentally what we're talking about is breaking down the barriers between different systems. As you say, every time you push a proprietary software solution that isn't based on open standards, doesn't have open APIs, and doesn't have the ability to be improved and contributed back to, you're putting in another barrier.

Everyone has woken up to this idea of collaboration through Web 2.0 websites, whether through Flickr or FaceParty or whatever. What the rest of the world is waking up to is what open-source developers have been discovering over the last five to ten years. Open source is Web 2.0 for developers. It's how do you collaborate, how do I put my input, my piece of the pie? It's user-generated content for developers, and that power is unbelievable. I think we're going to see that grow even more over the next few years.

Governor: I fundamentally agree with that. Open source was an application of a pattern. Open source was the first real use case for this need for a distributed way of working, and we're certainly seeing that broadened out. People are getting a much, much better understanding of some of the values and virtues of open approaches of exposing data to new sources.

Very often, you will not get the insight, but someone else will. That's the power of that sort of openness and transparency, and that's one of the key challenges -- actually just getting organizations to understand some of the value of opening up their data.

It's one thing to have the tools to do that. Another is that we are all now beginning to see organizations kind of get it. Certainly, "How do we syndicate our information?" is a really key question. We are seeing media companies ask themselves exactly that. "Do we have an API? How do we build an API? Where do we get an API, so that people can syndicate the information that we have?"

I suppose I'm just double-clicking on what Paul said -- that passion is something that is becoming more and more widely understood. Reuters is realizing it has to have an API. The Guardian, which is a British newspaper -- and those Americans certainly of the leftward persuasion are very familiar with it -- now has a team that is also presenting at Web conferences and talking about the API. We've got to think about how to make data more available, and open source will just be the first community to really understand this.

Gardner: I'd like to bounce this off of Brad at Concur. Do you feel a little bit less anxious, or more at ease, knowing that whatever data needs that you have for the future, you don't have to wait for a vendor to come up with the solution? You might be able to go and explore what's available in a community, or if it's not available, perhaps write it yourself or have it written and contribute it back to the community. It seems to me that this would be something that would make you sleep better at night -- that an open-source and community-based approach to data services deliverability gives you more options.

Svee: I personally love open source. I think that it is the movement that's going to fix software and all these proprietary systems. I think that my small team, four developers and myself, would not be able to produce the kind of quality products internally that we're essentially asked to do, without being able to stand on the shoulders of a lot of these geniuses out there who are writing amazing code.

Gardner: Do you agree that there is this sense that you can almost future-proof yourself by recognizing, as you embrace open source, that you're not going to get locked in, and that you're going to have flexibility and opportunity in the future?

Svee: Exactly. I find that there are a few products that we have that we've been locked into for quite some time. It's very difficult to try to move forward and evaluate anything new, when we're locked into something that's proprietary and maybe not even supported anymore. With the open-source community out there, we're finding that the answers we get on forums and from mailing lists are every year getting faster and better. More people are collaborating, and we're trying to contribute as much as we can as well.

Gardner: And, of course, over the past several years, we've seen a tremendous uptake in the use of open-source databases such as MySQL, Ingres, and Postgres, among others. Let's bounce this back now to the WSO2 product set. When you are developing your products, Paul, what is it about open source that makes it an enabler, as well as, in a sense, a channel into the market?

Fremantle: What was interesting about us developing this data services solution was what we built it on top of. The data services component that we built actually took us very little time to get to its first incarnation, and obviously we are constantly improving it and adding new capabilities.

We were working on that, and it didn't take long; the very first prototype was just a piece of work by one of our team who went out and did it. What really enabled that was the whole framework on which it was built -- the Axis2 framework and the app server that we built on it -- a framework built on the work of literally hundreds of people around the world.

For example, if we talk about the JMS support, that was a contribution by a developer to that project. The JSON support was a contribution by another developer and relied on the JSON library written by someone else. The fact that we can choose the level of encryption and security from HTTPS all the way up to full digital signatures relies on the work of the Apache XML security guys, who have written the XML security libraries. That's an incredibly complex piece of work, and it's really the pulling together of all these different components that provides a simple, useful facility.

I think it's so amazing, because you really stand on the shoulders of giants. That's the only way you can put it. What I like about this is to hear Brad say that he is doing the same, we are doing the same, and all around there is a value chain of people making small contributions that, when put together, add up to something fantastic. That's just an amazing story.

Gardner: Given that there are many approaches that Brad, as a user organization, is undertaking, and they dovetail somewhat with what you are doing as a supplier, we also have other suppliers that are increasingly embracing open source and building out product sets that originated from technology contributed in project form or under open licenses. How do these pieces come together, if we have a number of different open-source infrastructure projects and products? I'm thinking about perhaps an ESB, your data solution, and some integration middleware. What's the whole that's greater than the sum of the parts?

Governor: I certainly have some pretty strong opinions here. I think we can learn a lot from the ecosystems as well. One of the absolutely key skills in open source, as a business, is packaging. Packaging is very, very important to open source, and pulling things together and then offering support and service is a very powerful model.

It's really nothing new. If we look at personal computers, you go out and you can buy yourself chips from AMD or Intel, you can buy an OEM version of Windows or choose to go with Linux, you can buy RAM from another company, you can buy storage disks from another company, and kind of glom it all together.

But, as that industry has shown us, it really makes a lot more sense to buy it from a specialist packager. That might be Dell, HP, or others. I think that open-source software has really got some similar dynamics. So, if you want an Eclipse IDE, you are likely to be buying it from an IBM, a Genuitec, or a CodeGear -- and a couple of those are our clients; I should disclose that.

In this space we've got the same dynamics. If you are, for example, a Web company, and you don't want to be paying these third parties to do that packaging for you, fine. But, for the great mass of enterprises, it really doesn't make that much sense to be spending all your time there with glue guns, worrying about how pieces fit together -- even in Eclipse, which is a very pluggable architecture.

It makes a great deal of sense to outsource that to a third party, because otherwise it's really a recipe for more confusion, I would argue. So yes, you can do it yourself, but that doesn't necessarily mean you should. The PC example, yes -- for a hobbyist or someone who wants to learn about the thing, absolutely, build your own, roll your own. But, for getting on with things in business, it does make sense to work with a packager that's going to offer you full service and support.

Fremantle: I've got to jump in here and say that's exactly our model. We don't just offer the data services; we offer an ESB, a mashup server, and an SOA registry, and we make sure all those things work together. The reality is that there are a lot of programmers out there who are hobbyists, so there are a lot of people who do like to take individual components and pieces and put them together, and we support both of those equally. But I think your analogy of the PC market and that plug-and-play model is absolutely like open source, and specifically open-source SOA. We all focus very much on interoperability and make sure that our products work together.

Open source drives this market of components, and it's exactly the same thing that happened in the PC market. As soon as there was an open bus that wasn't owned by a single company, the world opened up to people being able to make those components work together in harmony and compete on a level playing field. That's exactly where the open-source market is today.

Gardner: So how about that, Brad? Do you really like the idea that you can have a package approach, but you can also shake and bake it your own way?

Svee: That's exactly the sweet part in my opinion. I can shake and bake, I can code up a bunch of stuff, I can prototype stuff rapidly, and then my boss can sleep well at night, when he knows that he can also buy some support, in case whatever I cook up doesn't quite come out of the oven. I see there's a kind of new model in open source that I think is going to be successful because of that.

Gardner: Okay, now we have seen some very good success with this model: have it your way, if you will, on the infrastructure level. We are moving up into data services now. It seems to me that this also sets us up to move an abstraction higher into the realm of data portability. Just as we are seeing the need in social networks, where the end user wants to be able to take their data from one supplier of a social networking function to another, I think we are going to start to see more of that in business ecologies as well.

A business will not want to be locked into a technology, but it also doesn't want to be locked into a relationship with another supplier, another business. It wants to be able to walk away from that when the time is right and take its data with it. So, maybe we'll close out our discussion with a little blue-sky discussion about this model taking a step further out into the cloud. Any thoughts about that, Paul?

Fremantle: I think that's a really interesting discussion. I was at a conference with Tim O'Reilly about two years ago and we were having exactly this discussion, which is that openness of services needs to be matched by openness of data. We are definitely seeing that in the Web marketplace through back-end systems like Amazon S3 storage, and we are beginning to see a lot of other people start to jump on this and start to build open accessible databases.

I think that's an absolutely fantastic usage for this kind of data service, which is to say, "It's my data. I don't just want to host things in an open fashion. I don't want to write code in an open fashion. I want open services and open data, so I can get it, move it, protect it myself, and relocate it."

So, I think there's a really interesting idea behind this, which is, once we get to the point where your data is no longer tied to a specific system and no longer has to be co-located with a particular MySQL database, we start to free up that processing. If you look at what Amazon did with the Elastic Cloud Service and their storage system, the storage system came first. The data services were a precursor to having an effective cloud-computing platform. You have to have data services before you can start to migrate your processing and scale it up in this fashion.
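Paul's point that data services are a precursor to cloud processing amounts to putting an abstract service contract between applications and any particular store. Here is a minimal sketch of that decoupling; all class and method names are invented for illustration, not drawn from any product discussed here:

```python
# Hypothetical sketch: the application depends on an abstract data service,
# not on a specific co-located database, so the backing store can be
# swapped (local DB, S3-style object store, another provider's service)
# without touching the application logic.

from abc import ABC, abstractmethod


class DataService(ABC):
    """Abstract contract the application codes against."""

    @abstractmethod
    def get(self, key: str) -> dict: ...

    @abstractmethod
    def put(self, key: str, record: dict) -> None: ...


class InMemoryDataService(DataService):
    """Stand-in for any concrete backing store."""

    def __init__(self):
        self._rows = {}

    def get(self, key):
        return self._rows[key]

    def put(self, key, record):
        self._rows[key] = record


def application_logic(store: DataService) -> str:
    # The application never names MySQL, S3, etc. -- only the contract.
    store.put("cust-42", {"name": "Acme", "status": "active"})
    return store.get("cust-42")["status"]


print(application_logic(InMemoryDataService()))  # -> active
```

Because the processing only ever sees the `DataService` contract, migrating it to a different host or cloud provider becomes a matter of supplying a different implementation.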

Gardner: What do you think, James? Is this something that will be in great demand in the market, and there is also a green angle here?

Governor: Yeah, I think undoubtedly it will. Simon Phipps from Sun talks about the freedom to leave. We had a big example recently, Comcast buying Plaxo, and Plaxo lost a lot of its users. A lot of Plaxo users just closed up their accounts there. Interestingly enough, Plaxo had a nice function to do that -- very good for closing the account, not so good for exporting the data. I am not so sure the problems are primarily technical. I think there are a great deal of policy and social problems that we are going to have to deal with.

It's very interesting to me that we call people heroes who are trying to break Facebook's terms of service, in some cases, as with the recent data portability example. We've got some really key challenges about what data ownership means. From my perspective, as I said earlier, I think it's very important that we have the mechanisms whereby we have access to data without necessarily allowing replication of it all over the place.

If it is your data, then yes, by all means, you should have permission to take a copy of it. What about if you're on a network and you want to take all the data and all of the surrounding metadata? Really, the discussion becomes about that metadata. Am I allowed to get anything back from Google about my behaviors and other people's behaviors?

It's really a social question, and we, as a society or a number of different societies, have got to think about this, and what we want from our data, what we want from privacy, and what we want from transparency. We can gain wonderful things, I mean wonderful advantages, but there is also the flip side, and I think it's very important that we keep that in mind.

So, it's going to be a wild ride. It's exciting, and I think that it is important that we get the tools in place, so that once we get the policies well understood, we can actually begin to do things more effectively. So, again, it's very exciting, but there are a lot of threats and a lot of risks that we do need to take account of. Those risks are expanded, as I say, by what I sometimes call "information bulimia" -- this notion that we just keep eating and swallowing more and more information and more data, and we need more information, and if you do that, what you end up doing is puking it all up.

Gardner: Let's close here with the real-world perspective. Brad, aside from the visual image of puking, does this interest you in terms of the idea of third-party, neutral, cloud-based data, and does that have any bearing on your real-world issues?

Svee: Well, I can give you an example of what we were able to do with data services. Within a matter of weeks, not even months, we were able to use the data services in the application server from WSO2 to essentially give a complete client picture to the business by reaching into the ERP system, pulling out invoices and products, and then reaching into the CRM system to pull out open issues, as well as the sales manager -- probably about 50 data points about each customer from the CRM -- and then expose those services through a simple JSON-based UI with a smart type-ahead for the customer name. Quickly, we were able to show a picture of our clients that hadn't previously been available.
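Brad's description -- two backend lookups merged into one JSON "client picture," with a type-ahead on customer name -- can be sketched in miniature. All data, names, and functions below are invented for illustration; the real system exposed these as WSO2 data services rather than in-process dictionaries:

```python
# Hypothetical sketch of the aggregation described above: an ERP lookup
# and a CRM lookup merged into a single JSON document per customer,
# plus a simple prefix match to back a type-ahead on customer name.

import json

# Stand-ins for the ERP and CRM backends (invented sample data).
ERP = {"acme": {"invoices": 12, "products": ["travel", "expense"]}}
CRM = {"acme": {"open_issues": 2, "sales_manager": "J. Doe"}}
CUSTOMERS = ["Acme Corp", "Acme Labs", "Initech"]


def client_picture(customer_id: str) -> str:
    """Merge the ERP and CRM views of one customer into a single JSON document."""
    merged = {"customer": customer_id}
    merged.update(ERP.get(customer_id, {}))
    merged.update(CRM.get(customer_id, {}))
    return json.dumps(merged)


def type_ahead(prefix: str) -> list:
    """Return customer names matching the typed prefix (case-insensitive)."""
    p = prefix.lower()
    return [name for name in CUSTOMERS if name.lower().startswith(p)]


print(type_ahead("Acm"))        # -> ['Acme Corp', 'Acme Labs']
print(client_picture("acme"))
```

The value Brad points to comes from the merge step: neither backend alone holds the full client picture, but a thin aggregation service over both of them does.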

Gardner: That data could have come from any number of different sources if, to James' point, you had the proper permissioning?

Svee: Yeah, and since we are IT and we own the systems, we were able to determine who is who, and we were able to use a Web service -- another data service into our HR system -- to pull out roles to see whether or not you could access that information.
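The access check Brad mentions -- consulting an HR data service for a caller's role before serving the aggregated view -- might look like this in outline. The role names and the lookup table are hypothetical stand-ins for the real HR web-service call:

```python
# Hypothetical sketch: gate access to the aggregated client picture on a
# role pulled from an HR data service. The dictionary below stands in
# for that service; in the real system this would be a web-service call.

HR_ROLES = {"alice": "sales_manager", "bob": "contractor"}
ALLOWED_ROLES = {"sales_manager", "executive"}


def lookup_role(user: str) -> str:
    """Stand-in for the HR data-service call that returns a user's role."""
    return HR_ROLES.get(user, "unknown")


def can_view_client_picture(user: str) -> bool:
    """Only users whose HR role is in the allowed set may see the data."""
    return lookup_role(user) in ALLOWED_ROLES


for user in ("alice", "bob"):
    print(user, can_view_client_picture(user))
# -> alice True
# -> bob False
```

Keeping the role check in a data service of its own means the authorization policy lives beside the HR data it depends on, rather than being copied into every consuming application.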

Gardner: That's highly valuable from a productivity and planning perspective. If you are a business strategist, that's precisely the kind of information you want?

Svee: Exactly, and they were amazed that they had been able to live their lives without it for so long.

Gardner: Paul, do you see much of this "common view" business, when it comes to data services?

Fremantle: Actually, we are working on another project with a health-care provider, which is providing a single patient view. So, it's exactly the same kind of scenario, with significant security, encryption, and data challenges to make sure that you don't provide the wrong information to the wrong person. Obviously, all the same issues need to be solved, and being able to pull together everything that is known about a patient from multiple different systems into a single view once again has huge value to the organization.

Gardner: Well, this has to be our swan song on this particular podcast. We are out of time. I want to thank our guests for helping us get into a nice far-reaching discussion about data services, what the problem set has been, what the opportunity is, and how at least one organization, Concur, is making some good use of these technologies. We have been joined by Paul Fremantle, chief technology officer at WSO2. Thank you, Paul.

Fremantle: Thank you, it has been great fun.

Gardner: I also strongly appreciate your input, Brad Svee, IT manager of development at the Redmond, Wash.-based Concur. Thank you, Brad.

Svee: Well, thank you.

Gardner: And always, thank you, James Governor from RedMonk for joining. We appreciate your input.

Governor: Thank you much. It has been an interesting discussion.

Gardner: This is Dana Gardner, principal analyst at Interarbor Solutions. You have been listening to a sponsored BriefingsDirect Podcast. Thanks and come back next time.

Listen to the podcast. Sponsor: WSO2.

Transcript of BriefingsDirect podcast on data services, SOA and cloud-based data hosting models. Copyright Interarbor Solutions, LLC, 2005-2008. All rights reserved.

Monday, August 04, 2008

SOA Demands Broader Skills and Experience for Enterprise Architects Across Technical and Political Domains, Says Open Group Panel

Transcript of BriefingsDirect podcast recorded at the 19th Annual Open Group Enterprise Architect's Practitioners Conference on July 22, 2008, in Chicago.

Listen to the podcast. Listen on iTunes/iPod. Sponsor: The Open Group.

Dana Gardner: Hi, this is Dana Gardner, principal analyst at Interarbor Solutions, and you're listening to BriefingsDirect. Today, a sponsored podcast discussion coming to you from a live panel at the 19th Annual Open Group Enterprise Architect's Practitioners Conference on July 22, 2008, in Chicago. We're going to discuss the role and impact of skills and experience for Enterprise Architects (EAs) in the context of services-oriented architecture (SOA).

We're going to talk about both the public sector, that is to say, government organizations, as well as the private sector, because these issues cut across all areas of information technology (IT).

First, let me introduce our panel of experts and guests. We are joined by Tony Baer, a senior analyst at Ovum. We're also joined by Eric Knorr, editor-in-chief of InfoWorld, as well as Joe McKendrick, author, consultant, prolific blogger, and IT industry analyst. Andras Szakal, the chief architect at IBM's Federal Software Group is here. And lastly, David Cotterill joins us. He’s head of innovation at the U.K. Government Department for Work and Pensions, which is generally equivalent to the Social Security Administration here in the United States.

We've heard a lot at this conference about the role of enterprise architects, how it's swiftly evolving. There is a need for alignment between business and IT at multiple levels. There is also concern about security and managing complexity as organizations move toward SOA methods and principles.

But when I talk to developers, architects and operations people -- and when we get down to what makes them successful -- they often come back to hiring. They talk about the people that they need to bring into their organizations to succeed, and the correct way to promote and encourage those people within their organizations.

A lot of that, of course, is important to The Open Group and The Open Group Architecture Framework (TOGAF), as well as its certification processes, IT Architect Certification (ITAC). It seems important for us to discuss why certification and frameworks need to take into account the full architect, the full person, not only in terms of their knowledge -- their book knowledge, their a priori knowledge -- but also their experience and skills.

We're asking so much of these people, in the context of politics, collaboration, and transformation. My first question then goes to Andras at IBM. Are the skill sets for EA and for SOA the same, or are they significantly different?

Andras Szakal: That's a good question. I believe that they are significantly different. Obviously, an enterprise architect needs to have a background in an EA method. They have to be able to apply the EA method. There are a whole set of governance requirements that an enterprise architect is involved with that an SOA practitioner may not be involved with.

There is quite a bit of intersection between the SOA architect and the implementation of the business processes that they are responsible for. So that's the intersection between EA and SOA. Certainly, they're not equivalent -- even if you believe some of the media out there.

Some of the SOA advertisements suggest that maybe SOA is equivalent to EA, but it's not. For SOA you need to have a background in understanding how to transform the business processes into actual implementation, versus enterprise architects who are following the EA methods and following the practices for producing the deliverables that are handed over to the EA practitioner.

Gardner: Is there a general rule of thumb about the balance? Is this an 80-20 equation, 60-40, 50-50? From your experience in the government organizations, what's the proper balance, when you're hiring an architect, between being technically savvy, and the other skills around organization and management?

Szakal: Within the government, EA is, I would say, trending more toward the political aspect, to the executive-level practitioner, than it is, say, an integrator who has hired a set of well-trained SOA practitioners.

The enterprise architect is really more focused on trying to bring the organization together around a business strategy or mission. And, certainly, they understand the tooling and how to translate the mission, vision, and strategy into an SOA architecture -- at least in the government. If you look at folks like some of the past enterprise architects and chief architects, like Dick Burk [former Chief Architect and Manager, Federal Enterprise Architecture Program Office of E-Government and Information Technology Office of Management and Budget, Executive Office of the U.S. President] … I would say he was more of an administrator than a practitioner.

If you look at the SOA practitioners who come from the systems integrators, who are the majority of the individuals who implement systems in the U.S. federal government's base, they are focused more on IT and the implementation.

Gardner: Good, thanks. David Cotterill, in the U.K. you are at a very large and substantial agency. How do you view this balancing act between the skills and knowledge? Is this about book skills? Is it about experience?

Of course, we are dealing with the most variable of variables -- people. How are you seeing the breakdown and where do you see the most important skill sets for “new age,” if you will, enterprise architects?

David Cotterill: Well, I think the technical background can be taken as a given for an enterprise architect. We expect that they have a certain level of depth and breadth about them, in terms of the broadest kind of technology platforms. But what we are really looking for are people who can then move into the business space, who have a lot more of the softer skills, things like influencing … How do you build and maintain a relationship with a senior business executive?

Those are kind of the skills sets that we're looking for, and they are hard to find. Typically, you find them in consultants who charge you a lot of money. But they're also skills that can be learned -- provided you have a certain level of experience.

We try to find people who have a good solid technical background and who show the aptitude for being able to develop those softer skills. Then, we place them in an environment and give them the challenge to actually develop those skills and maintain those relationships with the business. When it works, it's a really great thing, because they can become the trusted partner of our business people and unmask all the difficulties of the IT that lies behind.

We also have a more applications-focused space, in terms of a delivery-focused space. We need people who have a greater, more technical understanding of what the IT would be from our perspective, so that they can challenge the integrators and the suppliers -- just to make sure that they are doing the right thing, that we're keeping as open and flexible as we would like to be, and so that we're not tied into any given supplier.

Gardner: A lot of these roles are new and they're unexplored. We are into new territory. What do you look for in terms of the right kind of experience, the right mix of experience, for the people that you want to bring into these roles?

Cotterill: We hire a lot of people from financial services, because working with an organization the size of ours, you have to not be fazed by the scale of what you are trying to do. So, it helps having worked in a large organization, and understanding the complexities of scale. Typically, we're looking for people who have presented at a board level before, so they have demonstrated the ability to engage with board-level executives. That's really what we're trying to get.

Gardner: Let's go to some of our analysts. Tony Baer, you've been covering SOA for some time. We often hear a lot of prescriptives and definitions, methodologies, reference implementations and so forth -- and not a lot of attention is given to this dynamic management at the human level. Do the software vendors need to come out with a different messaging approach to help these organizations as they try to transform themselves, not just in terms of product, but in terms of this business-IT alignment?

Tony Baer: The short answer is "yes," but the long answer is that vendors are having an awfully hard time trying to bridge the gap. I'll just give an example of IBM. I'm not trying to single out IBM here. IBM has probably been one of the companies that has been most honest about this.

Just to give an idea in terms of trying to bridge different silos, I was talking with Danny Sabbah at IBM’s Rational division, and one of the interesting results of IBM's acquisition of Rational was that they have tried to cross-purpose some of the assets from Rational, where it could apply to some of the other product groups.

The obvious place is in the area of concern over how your systems operate at runtime. So they applied elements of the Rational Unified Process to Tivoli, but they also found at the same time that each brand was used to marketing to different entry points within the organization. They had a hard time trying to bridge that, because there was a mutual distrust.

As I said, I'm not singling out IBM here. I think IBM has been more ambitious in trying to tackle this problem, more honest about it. The same thing here applies with SOA and EA, and with trying to define and hopefully raise the consciousness within the EA profession that you need to have more of a business-level sensibility.

So the short of it is, the answer is yes. The long of it is, it's going to be awfully difficult to get there.

Gardner: Well, let's go to IBM for a second and give them a chance. I'm sure that within their own organization they are also dealing with, "Which came first, the chicken or the egg," except it's professional services or software, and you have mentioned, Andras, that you have this discussion. It's a tough problem.

Szakal: When we are engaging with the customer, we try to have one face. It depends on the business problem that they are solving. So, do you have a discussion about how you implement SOA, or do you first kind of try to roll back the discussion to the point where you can say, "What is the business problem?" and then apply some kind of framework and methodology to mete that out effectively, and then begin to map on the technologies that are necessary to be there.

That's the balance my group tries to strike. We are software architects, but we are really trying to solve the business problem. We do this through a framework that we call the SOA Entry Points. We call these people, process, information, connectivity, and reuse. We apply a framework we now call the Smart SOA Approach, which allows you to try to define where you are on the continuum of maturity, and how you apply that with respect to the needs of the business and the business function that you are implementing. An organization that is essentially immature will want to begin by implementing SOA infrastructure, before it starts to integrate business processes or adopt SOA dynamically at a higher level of maturity.

Gardner: I would also like to bounce a question off of you, Andras. When you're hiring, what do you look for in the requirements? Is there any surprise for this audience about what you look for on the resume of the people that you like to bring in to be architects for you at IBM?

Szakal: That's a good question, because I do hire quite a few architects. I would look for people who have deep technical skills, and have had experience in implementing systems successfully. I have seen plenty of database administrators come to me and try to call themselves an architect, and you can't be one-dimensional on the information side, although I am an information bigot too.

So you're looking for a broad experience and somebody who has methodology background in design, but also in enterprise architecture. In that way, they can go from understanding the role of the enterprise architect, and how you take the business problem and slice that out into business processes, and then map the technology layer on to it.

Gardner: Are there any things that you might see on a resume that would disqualify someone in your mind from this role of modern-day enterprise architect focusing more on SOA?

Szakal: I think there are a few things that would disqualify. You have to be very articulate, both in writing and verbally. Obviously, you cannot work with people if you have no people skills. You have to have a broad background in technology, both hardware and software. You have to understand the value of EA and business-process choreography. So if you are simply a database administrator and you are very focused on design, that's not really architecture.

If you are simply one-dimensional, the architecture that you are applying is the architecture that you know, and you are not really acting as an “architect.” If you only implemented an IBM solution, a DB2 solution, or say a Microsoft .NET solution, are you really an architect -- or are you just somebody who is a very good specialist at implementing .NET or an information management solution?

Gardner: Joe McKendrick, it sounds like we are defining a newer super-human role here for someone. Not too stuck in one aspect of their IT experience, but not overly technical either. They need to be a politician, a chef, a gardener, and a house painter. It sounds like a difficult thing for any one person to step up to. Is this all feasible? Is this logical? Should we certify individuals, or is this really something that probably requires more of a team?

Joe McKendrick: Thanks, Dana. It's a great point, and I want to take it back to your previous question about how the vendors are addressing this. IBM is kind of an exception to the rule here, but what we are seeing in the industry is that the vendors are talking very actively about business transformation, which is a very high-level activity. Ultimately, the goal of SOA is a fairly major transformation of the business to achieve more agility and more flexibility. However, most of the vendor community targets the IT department, from the CIO on down.

The big question is whether the vendor community, at this stage in the game, may be asking a little too much of IT departments. Do IT personnel, IT managers, really have the bandwidth or the training, the wherewithal, or the organizational support to go out to the rest of the organization and say, "Okay, folks, we are going to transform you now"? I think our leading IT vendors, in particular, are leaning a little bit too heavily on the IT department, whereas SOA is a community effort.

In fact, the analogy I like to think of in terms of SOA is a condominium association. If you look at a condo association, you don't have one single owner. You have multiple owners. No one really has complete ownership of the building or the community. What happens is that each owner turns over management of the community to this governance group, the condo management association. IT is one of those owners. The condo owners turn over shared services -- be it gardening, landscaping, trash pickup, and so forth -- to the management association.

Gardner: Well, I have to say, I hope that IT departments work a little better than some of the condo associations I have been affiliated with. That brings up another issue, so let's go to Eric Knorr from InfoWorld. This sounds like balancing on so many different levels … group versus individual, command-and-control versus collaboration.

It reminds me of what's taking place in development, on the design-time side of things, around Agile and Scrum. I know that Agile and Scrum might not be germane to an architect, but they have been designed to deal with teams and complexity and managing many moving parts by creating more of a team approach with a leader, sometimes called a master, and lots of iterative-stage meetings. Does this make sense, talking about not so much an individual or a team, but really a new way to manage complexity and innovation in a fast-paced environment?

Eric Knorr: Within the context of SOA, one thing we might not have touched on yet is that often, at least in the most successful examples that we have seen, there has to be some sort of visionary leadership. In case study after case study, you run into a chief architect, or even a chief technology officer sometimes, who has really made that connection, in an SOA context, between not only looking at the business processes, but breaking them down into business services and figuring out how to map a technical infrastructure against that.

That leadership is so important, because SOA is such an elusive concept that it's very easy to fall back into the old habits of enterprise application integration (EAI) -- thinking in terms of point-to-point integration and not in terms of that strategic value.

At the same time, SOA doesn't happen by decree. It happens with a feedback loop from the bottom up. The demands on the leadership skills are really very high. In organizing these teams that you are talking about here, a board of review is an essential part of that. Maybe you start your governance with an interoperability framework, so everybody knows what protocols are being used.

Then, you get deeper into the design-time governance rules, and you don't want to get to a point where you have such a granular level of rules that your best developers feel like they are being strangled … "Oh, here come the governance cops." So, you need to have that sort of reality feedback, and I think that takes a high level of managerial skill to pull off and still keep everybody on the same track, because if you don't exert enough control, you are going to have people building redundant services again. It's a real balancing act.

Gardner: I suppose it often happens, both in development and in IT operations, that dictates will come down, methodologies will be established, lists will be made, matrices will be presented, people will nod their heads, and they'll go off and do whatever they think it takes to get the job done, which gives us a little bit of a problem in terms of maintaining security, and maintaining the expectations and requirements to the business side of the house. Let's go back to David Cotterill. In the real world, in your organization, do you find that people often ignore what becomes the supposed standards of operating procedure?

Cotterill: Not that often, actually, because in the U.K. we've had some recent examples of where people have ignored operating procedures around security, and the message is pretty clear now that we've put in a lot of governance and compliance procedures to try and remove that level of, "I am just going to ignore and do what I feel is right. I'll publish something which I feel is the right thing to do."

We know that the impacts of that are not IT impacts, but they affect the business, the organization, and they affect ministers. Our friends in the press like to get hold of those things. So, we are very, very sensitive about not allowing that kind of "skunkworks" thing to see the light of day.

The challenge, therefore -- and this goes back, I think, to the point about visionary leadership -- is how do you establish the right governance and platforms that allow people to be innovative in terms of the solutions that they bring forward, but also have the right level of control that says, "Okay, you are not going to publish something which opens up the entire department's customer information to attack." It's a real balancing act, and that's where leadership comes in, I think.

Gardner: I am assuming that the Information Technology Infrastructure Library (ITIL) plays a significant role in your organization?

Cotterill: Absolutely, in terms of the operations management.

Gardner: And with ITIL, this moves you toward the shared services approach of IT management, the understanding that the users in your business at large or, in your case, the government department, are the real customers. Perhaps market forces come into play. That is to say, supply and demand. You don't get paid if the bids don't come in. The bids don't come in if the experience isn't good and rewarding over time for the end users. Do you think that whenever we deal with complexity, on the level of we are talking about, that supply and demand, letting the free forces of competition come into play -- does that help?

Cotterill: I think it helps. Government traditionally has a very command-and-control approach to innovation, and that stifles innovation effectively. It's a most effective method of stifling innovation. So, when working with suppliers, what we are always trying to do is introduce that level of competition. It's not necessarily just a vendor-customer relationship that we are talking about, because we are dealing with services that affect citizens and public information that should be available to citizens. We actually have to look at how the citizens want to use the information that government holds.

As an example, in the U.K. the government recently launched a competition called "Show Us A Better Way," which is open to the public to identify uses of U.K. government data. How would they like to see U.K. government data mashed up to present new, innovative solutions for the public? That's kind of an interesting thing, which takes us out of that traditional supplier-customer model.

Gardner: Okay, I'd like to check one of my own premises, and that was that SOA and enterprise architecture is common and similar between large enterprises and the public sector. Let's go to Andras at IBM. Do you find that to be the case from your experience that the way that you are dealing with your federal and government clientele gives you certain advantages or differentiates you from what is going on in the private sector?

Szakal: Obviously, within the U.S. federal government space, EA was adopted as an industry before many other industries, using the Federal Enterprise Architecture (FEA) and the Department of Defense Architecture Framework (DoDAF). IBM is very focused on trying to create a normative model for enterprise architecture.

We started off doing that with DoDAF in the Object Management Group (OMG), and now we've kind of settled on Unified Modeling Language (UML), the next innovative version of that standard which establishes a normative metamodel. You can actually flow artifacts from the EA into the actual IT architecture and design layer, so that there is seamless integration. We can have tool-based round tripping, so that we can actually have a dynamic asset management repository that is pulled in dynamically at the lowest levels into the EA layer that we can then use as part of our normative models.

Then we can push that down into the design layer, and it all connects and makes sense. We can make sure that there is a normalization of the artifacts and not just pretty pictures at the A level being handed over to an IT architect, who then has a process of going through and making their own determinations about the meaning of that picture. Those are some of the challenges that we face as a community.

Gardner: Well, I think clearly what we've heard from our panelists aligns well with many of the discussions today -- how transparency and breaking out of silos is important, not just for technology, but for how people behave and operate, and for understanding the business goals and communicating them appropriately.

Even as many moving parts become essential, it is indeed a talented person who can cross the boundaries of the technology issues, and foster this collaboration, clearly through evangelism, across not only technology, but the business domains. I certainly congratulate those of you who are doing that, and hope that we can find a lot more folks in the field who are willing to step up and take on such a large responsibility.

I'd like to thank our panel of guests as we close today. We've been joined by Tony Baer, senior analyst at Ovum; Eric Knorr, editor-in-chief at InfoWorld; Joe McKendrick author and IT analyst and blogger; Andras Szakal, chief architect, IBM's Federal Software Group; and David Cotterill, head of innovation -- a good title, if you can manage to get it -- at the U.K. Government Department for Work and Pensions.

This is Dana Gardner, principal analyst at Interarbor Solutions. You've been listening to a sponsored BriefingsDirect podcast. I'd like to thank Allen Brown and The Open Group for helping put this together. Thanks, and have a good day.

Listen to the podcast. Listen on iTunes/iPod. Sponsor: The Open Group.

Transcript of BriefingsDirect podcast recorded at the 19th Annual Open Group Enterprise Architect's Practitioners Conference on July 22, 2008, in Chicago.

Thursday, July 10, 2008

HP Information Management Maven Rod Walker Describes How BI Helps Business Leaders Innovate

Transcript of BriefingsDirect podcast recorded at the Hewlett-Packard Software Universe Conference in Las Vegas, Nevada the week of June 16, 2008.

Listen to the podcast here. Sponsor: Hewlett-Packard.

Dana Gardner: Hi, this is Dana Gardner, principal analyst at Interarbor Solutions, and you’re listening to a special BriefingsDirect podcast recorded live at the Hewlett-Packard Software Universe Conference in Las Vegas. We are here in the week of June 16, 2008. This sponsored HP Software Universe live podcast is distributed by BriefingsDirect Network.

We now welcome to the show Rod Walker, the vice president for Information Management in Hewlett-Packard's Consulting and Integration (C&I) group. Welcome to the show, Rod.

Rod Walker: Thank you very much, Dana. It's a pleasure to be here.

Gardner: We are going to talk about some of the high-level business values that are being derived in the field from business intelligence (BI), data warehousing, data integration, and generating quality data from the vast storm of information and content data that's available to companies. This is, I suppose, a real competitive issue. This is what companies use to develop strategy, and to help them figure out where to take their businesses.

First off, let's tell our listeners a little about the information management practice, and a little bit about your background.

Walker: Thank you. I have been in the IT business, consulting business, for 37 years at this point. I came to Hewlett-Packard a year-and-a-half ago with the acquisition of Knightsbridge Solutions. They are one of the pre-eminent consulting firms in the BI and data warehousing space, and I was the CEO at Knightsbridge.

Gardner: All right. First, help us understand the problem out there. What's the issue? Is it that there is just too much data, that it's not good data, there is redundant data, all the above?

Walker: It's all the above and more. What we have are Global 2000 and Fortune 500 companies that are struggling with all kinds of different issues, whether it's increasing market share, increasing the wallet share with their customers, dealing with compliance issues, competitive issues, or growing their business on a global basis, instead of a regional basis.

They've all got different kinds of things that they are doing, and where we come in is we help them optimize different parts of their business. More and more, companies are becoming more fact based, data driven, and analytically focused, in terms of how they are running their businesses. So, they are using that to competitive advantage, to solve all these different types of business problems.

Gardner: So, no more just calling it from the gut?

Walker: Yeah, this is not shoot from the hip. This is, "How do we use the numbers to get ahead?"

Gardner: And, just having numbers isn't enough. It really is about distilling the numbers and finding the gems of information in there.

Walker: Yeah, and actually what we're seeing is that this type of work has evolved from where it's been a small group of analysts sitting in the back room, running models and making recommendations to management, to the point where you now have tiers of people throughout the organization -- from the CEO down to the individuals who interact with the customers -- and what they all need is better information.

Some of them need it in real time, and all this information needs to be provided from a consistent, multi-tiered data infrastructure for the enterprise, so that they all are, in effect, operating off of the same facts, at different levels of detail and different levels of aggregation. But it all needs to be consistent, and the data that is used internally needs to be consistent with the data that's provided externally at the same time.

Gardner: Okay, let's unpack that a little bit by looking at some use-case scenarios. Why don't you tell us a little bit about how certain companies out there are deriving a high business value from these activities? Can you perhaps give us an example from the health-care sector?

Walker: From the health-care sector, one of the hot topics in health care these days -- and it's good for all of us -- is how do you improve patient outcomes? How do you improve the quality of patient care and, ultimately, the degree of success in treatment? What we are seeing is that more and more of the providers of health-care services are trying to use the clinical data they are gathering on a systematic basis -- what was this patient's problem, how did we treat him, and what was the end result?

We have both individual, if you will, hospital chains trying to gather this information and doing it on their own, as well as various consortiums, who have the advantage of bringing in the clinical data literally from hundreds, if not thousands, of hospitals, and putting it into a consistent database. Then, what you can do is hunt through that data for best practices.

I can go into one hospital and say, "For this disease, for this treatment regime, what are my patient outcomes? And how do they compare to the patient outcomes of hundreds of other hospitals?" If I am near the bottom, I've got a problem, and I had better go fix it, for the sake of the quality of the care that I am providing and to avoid lawsuits and so forth. On the other hand, if I am in the top three percent, that is a marketing opportunity.

I can turn that back around and go out in the marketplace and say that for cardiac care or diabetes care, whatever the case may be, I am one of the top 10 best in the country. You hear that as consumers. You hear the stuff out there now where people are actually advertising, that they are really good in some aspect of health care.

Gardner: It's more of a marketplace?

Walker: It's absolutely a marketplace, and the good news is that it's becoming competitive on the dimensions that we care about. First and foremost: what's the success in treating your illness? This is a hallelujah for all of us, and it's all because the data is becoming collectible, presentable, and analyzable -- and people are doing it.

Gardner: And you can analyze that data with a certain level of anonymity for the patients?

Walker: It has to be.

Gardner: Right.

Walker: It has to be. It's required. So the data is anonymized, and that's fine. For the kind of analysis they need to do, you don't need to know who the patient was, and you don't need any identifying information that would allow you to figure out who the patient is. It's readily anonymized.
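The de-identification step Walker describes can be sketched simply: strip the direct identifiers and keep only the clinical fields the outcomes analysis needs. This is an illustrative Python fragment only; the field names and the record are invented, not any real clinical schema.

```python
# Illustrative only: stripping direct identifiers from a patient record
# before it enters a shared outcomes database. Field names are invented,
# not any real clinical schema.

DIRECT_IDENTIFIERS = {"name", "ssn", "address", "phone", "email", "dob"}

def anonymize(record):
    """Keep only the fields that are not direct identifiers."""
    return {k: v for k, v in record.items() if k not in DIRECT_IDENTIFIERS}

record = {
    "name": "Jane Doe",
    "ssn": "123-45-6789",
    "dob": "1950-03-14",
    "diagnosis": "diabetes",
    "treatment": "metformin",
    "outcome": "improved",
}

print(anonymize(record))
# {'diagnosis': 'diabetes', 'treatment': 'metformin', 'outcome': 'improved'}
```

Real de-identification regimes are stricter than this (quasi-identifiers such as birth dates and ZIP codes can re-identify patients in combination), but the principle is the same: the analysis never sees who the patient was.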

Gardner: Right, and the fact that we are here in Las Vegas in a casino, raises a question about analytics. Do you do any analytics for the gaming industry?

Walker: Not yet, although, if you want an example of that, go see the movie "21" or read the book, "Bringing Down the House." There are your analytics for the gaming industry. Of course, it's not the one they want to talk about.

Gardner: All right. Well, speaking of markets, companies are looking for ways of exploiting their IT resources, getting return on their investment from their IT spend, and being able to up-sell and cross-sell the customers that they do have data on is becoming a great way to do that.

Walker: A lot of the work we do with our customers tends to be about how they deal with their customers. And there are a lot of different aspects to this. It starts with, and it's kind of basic, just understanding your customer. Many of the big, complex organizations we deal with are still operating as collections of silos. They are typically product based and geographically based, and both of those things make it difficult for them to really understand all the interactions they have with an individual customer.

They do not know, when a customer walks in the door, calls, or goes on their Website, or whatever the case may be, whether this is one of their best customers. Just because this person doesn't have a lot of money in a bank account doesn't mean they don't have big mortgages, run a big company, and have huge certificates of deposit (CDs) or something else. If you just look at what they are doing on this one transaction in that one account, you completely misread who this customer is. So, trying to really understand your customer and who they are is a piece of the puzzle.

Then, the next piece of the puzzle is how do you increase your wallet share with that customer from the standpoint of how do you make sure they are loyal, particularly the ones that you highly value and are very profitable for you? Then, how do you interact with them and say, "Hey, you've got these services. How about this one?"

And, if they are a big enough customer, you may make them a special offer or give them a better deal, and maybe you add a little bit to that CD rate that you are going to offer them. Then, the next trick that they run into is: when, how, and under what circumstances do you make that offer?

It's one thing to send out a mailing based on some batch review of your customer files overnight, once a month, or once a week. But what we are really finding is that our customers want, more and more, when that customer is there visiting the Website, when they are in the branch, when they are doing that transaction, to hit them at the point of sale with the offer right then. So, if somebody's made that big deposit, maybe that's the time you want to talk to him about a CD. Or it's a market-basket question: they are buying a lot of something, well, how about this accessory or this other thing that tends to go with it? Hit him with it right now.
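The point-of-sale logic Walker describes, reacting to the event in hand rather than to a nightly batch run, can be sketched as a simple rule check against the live event. This is an illustrative fragment only; the event fields, thresholds, and offer names are invented.

```python
# Illustrative only: a rule-based point-of-sale offer picker. The event
# fields, thresholds, and offer names are invented for this sketch.

def pick_offer(event):
    """Return an offer to present right now, or None if nothing fits."""
    # A large deposit is the moment to pitch a CD.
    if event.get("type") == "deposit" and event.get("amount", 0) >= 10_000:
        return "CD rate promotion"
    # A simple market-basket pairing: camera buyers get an accessory offer.
    if event.get("type") == "purchase" and event.get("item") == "camera":
        return "memory card accessory"
    return None

print(pick_offer({"type": "deposit", "amount": 25_000}))   # CD rate promotion
print(pick_offer({"type": "purchase", "item": "camera"}))  # memory card accessory
print(pick_offer({"type": "deposit", "amount": 100}))      # None
```

In practice the rules would come from analytics (propensity models, association mining) rather than being hand-coded, but the runtime shape is the same: event in, offer out, at the moment of contact.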

Gardner: And today, I suppose, more and more companies are interfacing with their customers and prospects, through the Web and through applications. We see self-help portals. We see people actually wanting to do business through the Web, but to have the analytics to then offer them the right path throughout that sales process becomes critical.

Walker: And, not only are the paths multiplying, but what we need to do in terms of how we architect these kinds of solutions and systems, is make sure that you can make that specific offer to that customer, no matter which channel he is contacting you through. It could be the call center. It could be the Web. It could be that kiosk. It could be physically walking into a store or a branch.

Gardner: It could be increasingly your cell phone.

Walker: It could be a cell phone, absolutely. So the answer is that it doesn't matter how they have contacted you. You want to have the same analytics, the same class of offer to be presentable through any channel, anytime, anywhere.

Gardner: So, it's so much more than a single view of the customer. It's really an amalgamated view of what that customer probably would want next?

Walker: Absolutely, and this is all based upon analytics. I don't doubt that there are some businesses who will use this not just to up-sell and cross-sell some customers, but maybe in some cases they drive a few of them away at the same time.

Gardner: If not done properly.

Walker: Well, maybe on purpose. Maybe I don't want you as a customer.

Gardner: I see. Weed them out.

Walker: Weed them out at the same time.

Gardner: Interesting. Let's move on to another use case. Energy is a big topic these days. People are wondering when the price of oil will start coming down, instead of going up. What can analytics and business intelligence bring to those, who are now out there looking for the increase in the oil supply?

Walker: Well, there are a lot of different kinds of things, as you might imagine, that the energy companies are applying analytics to. We have been involved with them on both the retail sales side, in terms of the analytics there, and the energy trading business, in terms of how you swap and trade crude as well as finished products on a worldwide basis. And we have gotten involved in some other things, like centralizing well information.

If you look at how an exploration or production company deals with well information, they may go out and sign up for leases, and so they gather a whole bunch of lease information. They've got an exploration unit that goes out and actually drills the well and collects a lot of information about that well. They've got a production company that collects information around the well as it produces. These are all different functional silos. You've got a legal department that does the negotiations and does the deals. They've got their file cabinets full of paper and information around those wells.

Then, of course, you've got the finance organization that has to take the money that's obtained from selling the products that come out of that well, and then redistribute it back to the owners of the well and the royalty owners. To some degree, each of these different business units keeps its own information around the wells, as opposed to there being one master repository, the data of record, the certified data for that well. So, there is an opportunity for them to be much more efficient, and to make that data available on a consistent, accurate basis to everybody who needs it.

Gardner: A single view of the petroleum.

Walker: A single view of the well, at least.
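The "single view of the well" Walker describes amounts to consolidating each functional silo's attributes into one master record keyed by well. An illustrative sketch, with invented silo data and field names:

```python
# Illustrative only: consolidating per-silo well data into one master
# record, the "data of record" for the well. Silo contents are invented.

def master_record(well_id, silos):
    """Merge each silo's attributes for a well into a single record.
    Later silos in the list win on conflicting fields."""
    merged = {"well_id": well_id}
    for silo in silos:
        merged.update(silo.get(well_id, {}))
    return merged

# Each functional silo keeps its own slice of information about the well.
land = {"W-101": {"lease": "Smith Ranch", "acres": 640}}
drilling = {"W-101": {"depth_ft": 9_800, "spud_date": "2007-05-01"}}
production = {"W-101": {"status": "producing", "bbl_per_day": 420}}

print(master_record("W-101", [land, drilling, production]))
```

A real master-data effort adds survivorship rules, matching of inconsistent well identifiers, and certification workflow; the merge above is just the shape of the end state.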

Gardner: How about one more use-case scenario, risk management? How does an organization reduce its exposure to risk, perhaps shore up its security, and maybe even be mindful of compliance and regulatory issues, vis-à-vis BI?

Walker: If you take the banking industry, for example, banking is slowly going global. You've got these huge banks operating around the world. They've got all kinds of regulatory compliance issues to deal with, both on an international basis with Basel II, as well as on a country-by-country basis. So, of course, you have to feed accurate information, consistent information to all of the different regulatory bodies.

At the same time, part of that is also managing your operational risk on a worldwide basis, and that could be anything from your currency risk to your interest-rate exposure or your customer credit risk. It's one thing to look at your customer credit risk in terms of this subsidiary of that company, but what about the rest of the company, or what about their risk in this country, versus the risk on a global basis?

Do you have that information collected in a way that you can assess all those risks, apply your judgments, and make your operational decisions appropriately? That's just one aspect of risk management that we have been involved with, in that case with numerous banks, but it can also be things like credit card fraud, ultimately analyzing the transactions in real time as they come in. There are just lots of other risk factors out there on an industry-by-industry basis.
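The global roll-up Walker describes, aggregating exposure to a customer across subsidiaries and countries, is at heart a group-by over the parent company. An illustrative sketch, with invented company names and figures:

```python
# Illustrative only: rolling up credit exposure from the subsidiary and
# country level to a global, per-parent view. All figures are invented.

from collections import defaultdict

exposures = [
    {"parent": "AcmeCorp", "subsidiary": "Acme UK", "country": "GB", "exposure": 40_000_000},
    {"parent": "AcmeCorp", "subsidiary": "Acme US", "country": "US", "exposure": 75_000_000},
    {"parent": "Globex",   "subsidiary": "Globex DE", "country": "DE", "exposure": 12_000_000},
]

def global_exposure(rows):
    """Sum exposure across all subsidiaries of each parent company."""
    totals = defaultdict(int)
    for row in rows:
        totals[row["parent"]] += row["exposure"]
    return dict(totals)

print(global_exposure(exposures))  # {'AcmeCorp': 115000000, 'Globex': 12000000}
```

The hard part in practice is not the sum but the entity resolution: recognizing that "Acme UK" and "Acme US" belong to the same parent in the first place, which is exactly the consistent-data problem the banks struggle with.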

Gardner: You raised the issue about real time. Many times we think about analytics as having data that's been sitting around for a while. It will stay, and we can take some time to go in and look it over, but, I think, increasingly, we are finding enterprises seeking to analyze things a bit more on the fly. How does that relate to what you are doing?

Walker: Well, there are some relatively easy examples of that kind of thing. A couple we just alluded to here. One I just mentioned was the credit card fraud aspect of this. There are other people who look at trading opportunities and trading analytics. Whether it's equity markets, energy markets, or whatever kind of markets they trade in, if you can do your analytics just a little bit faster than the next guy, and get your trades in a little bit quicker, that can mean serious money, and we have run into some of those kinds of issues as well.

Then, one of the big emerging areas for real time gets back to this business of customer interaction on a real time basis, if the customer calls the call centers, shows up in the branch, and then goes on the website. You don't want to be looking at yesterday's data, if he's doing all of those things today.

So, you want to see all the transactions he has done, at the same time. You want that complete view of the customer. There's another, less real-time aspect to this. When you're talking about a complete view of the customer, the other thing we are seeing is that it's not just the transaction information. It's not just the structured information that we gather from all these systems.

There are studies out there that say 60 percent of the data you have in your organization is actually not structured data at all. It's in documents, e-mails, and other forms: images, audio, and video, whatever the case may be. One of our challenges first is to get people to have a 360-degree view of the customer.

The next thing is to have them have a complete view of the customer. What is everything we have in our organization that we have about that customer? Can I get at it, when I need to get at it, either when I am dealing with the customer real time, or even if it's not real time? The point is that I've got to be able to get all the relevant data, not just the stuff that's easy, because it's in the systems.

Gardner: Very interesting, I think this certainly shows how IT investment has many new and additional forms of payback. We're really just getting into the icing on the cake, right?

Walker: Absolutely, and as the technology continues to evolve, and we get better and better at this, and as our customers go through the maturity process and mature with the technologies and the business issues, they are getting smarter and smarter about what they can accomplish with this. You actually see them progress from using data, to using information, to streamlining the business, and then getting to the point where they really try to innovate, compete, and alter their strategies based on the information they are now able to bring to the table.

Gardner: It really shows how IT can be a competitive advantage in a very significant way.

Walker: Absolutely, and all the trends and demographics in business come back to kind of where we started, which is that it's all about business becoming more analytic and data driven, and really trying to optimize not just their operations, but their market share, and how they compete against their competition. At the same time, how can I just do a better job serving my customers?

Gardner: Great. We have been talking with Rod Walker; he's the vice president for Information Management in Hewlett-Packard's Consulting and Integration Group. I really appreciate your time.

Walker: Delighted to be here, and happy to talk anytime.

Gardner: This comes to you as a sponsored HP Software Universe live podcast recorded at the Venetian Resort in Las Vegas. Look for other podcasts from this HP event at the hp.com website, under "Software Universe Live Podcasts," as well as through the BriefingsDirect Network. I would like to thank our producers on today’s show, Fred Bals and Kate Whalen, and also our sponsor, Hewlett-Packard.

I'm Dana Gardner, principal analyst at Interarbor Solutions. Thanks for listening, and come back next time for more in-depth podcasts on enterprise software infrastructure and strategies. Bye for now.

Listen to the podcast. Sponsor: Hewlett-Packard.

Transcript of BriefingsDirect podcast recorded at the Hewlett-Packard Software Universe Conference in Las Vegas, Nevada. Copyright Interarbor Solutions, LLC, 2005-2008. All rights reserved.

HP's Adaptive Infrastructure Head Duncan Campbell Discusses Data Center Efficiency and Energy Conservation Best Practices

Transcript of BriefingsDirect podcast recorded at the Hewlett-Packard Software Universe Conference in Las Vegas, Nevada, the week of June 16, 2008.

Listen to the podcast here. Sponsor: Hewlett-Packard.

Dana Gardner: Hi, this is Dana Gardner, principal analyst at Interarbor Solutions, and you’re listening to a special BriefingsDirect podcast recorded live at the Hewlett-Packard Software Universe Conference in Las Vegas. We are here in the week of June 16, 2008. This sponsored HP Software Universe live podcast is distributed by BriefingsDirect Network.

We are joined now by Duncan Campbell, the vice president in charge of the Adaptive Infrastructure program at HP. Welcome to the show, Duncan.

Duncan Campbell: Great. My pleasure to be here, Dana.

Gardner: You know, a lot has been said about data centers and how they are shifting, how people are trying to bring in more capability at higher scale to deal with more complexity, and, of course, cut costs and even labor resources. That is to say, automate whenever possible. Tell us a little bit about how you characterize or describe the data center situation and the challenges the companies are facing right now.

Campbell: It will be my pleasure. In fact, Dana, what we're seeing is almost a perfect storm happening here in the data center right now, and the next generation data center is, in fact, a hot topic. It's a hot topic not just because we're here in Las Vegas and it's 102 degrees, but, in fact, the fundamental design center of the data center is being challenged right now and it's really under siege by a number of different factors.

One of the things you talked about is cost, and another one of the things is energy efficiency. Another key element that we are seeing at this point really has to do with the fundamental challenge that customers, who are striving more and more to deal with automation, have less time. We have an excellent opportunity here to have conversations with customers and partners at Software Universe about the adaptive infrastructure, which is HP's program for the next generation data center.

Gardner: What is different between this next-generation data center, the one that we are working with, or working toward even, and what was described as a very modern, up-to-date data center five years ago?

Campbell: Good question. Fundamentally, what HP has is a strategy that allows IT managers to be much more engaged with lines of business, because we are allowing IT, at this point, to participate in a dialogue as it relates to how IT can be thought of not just as a cost-type agenda, but, in fact, as fundamental to driving the business.

That being the case, Dana, we have six fundamental technology enablers that we work with customers to select from to design their next generation data center, and these six technologies are really critical. One has to do with the type of systems they choose, and more and more it's becoming a reality where they are becoming more dense. These systems are drawing more power, so we need to work with customers on how best to design those solutions.

Second are key enablers around energy-efficiency types of technologies. The third is around virtualization. The fourth is around management, and then we also have security, and finally automation. These are some of the key technologies that are part of the adaptive infrastructure.

Gardner: Now, it seems that the architects and the decision-makers, the specifiers in the operations units of large organizations, have their hands full these days, and, as you've mentioned, have energy issues to contend with. They are also dealing with consolidation in many cases and legacy modernization, bringing more of a services orientation to their applications for purposes of reuse and governance and extending across multiple business processes, assets and resources. So, in an efficiency drive it seems that there is a notion of having to fly the airplane and change the wings at the same time.

I also hear from a lot of enterprises concerned that manual processes are not scaling. When it comes to testing, debugging, change management, and issues around performance management, making a printout, sticking it up on the wall, finding the time-stamps for incidents, and uncovering the root causes that way is not scaling. How does HP come in with products and services to help companies manage these multiple major trends that are impacting them?

Campbell: Well, I think you did nail some of the key needs. So, we would agree entirely. It's not just about a rip-and-replace strategy. It is dealing with those core issues that you spoke of, both in terms of cost, energy, and some other elements around quality of service to be more aligned with the line of business and then speed.

So, to your point about how to get started, most customers understand the basic value proposition around the adaptive infrastructure, which is about this 24/7, lights-out computing environment. It's based on modular building blocks, using off-the-shelf software and comprehensive services.

One thing that we do is unique: we provide specific assessment services for our customers, and this is not just about product. It's really more about understanding their needs, where they set the baseline of their specific needs by business. And it's not just about technology; it's about their governance, management, and organizational types of needs. Then, we design specific recommendations on how to proceed, given their specific environment.

Gardner: Because we're at a user conference and a technology forum, I am assuming that there is some news to be had here, or perhaps you can share some of that with us. Something about blade servers, I believe it was.

Campbell: Exactly, and so I hope you are holding onto your hat there. One of the things about the adaptive infrastructure, people are always looking for proof points. They say, "Yeah, great strategy. I understand the value proposition, Duncan, but it's all about the proof points in making it real."

Last year, we had our blade systems, which were really an adaptive infrastructure in a box. They include virtualization, blade storage and servers, and management capabilities.

One of the areas that people fundamentally love, for rock-solid business and high availability, was our non-stop servers, and they say, "Are you just abandoning that?" No. The news that we are offering here is that we're going to now have brand-new bladed non-stop systems that are going to be a fundamental proof point of our adaptive infrastructure.

We're bringing some of those high-availability features people love, but in more of this adaptive infrastructure type of environment. And it's one thing our software customers love, because as you start to kick off service-oriented architecture (SOA) projects, specific business-continuity projects, or strategy applications, you have to have an adaptive infrastructure that provides that type of value to you.

Gardner: Let's return to the energy issue. I'm also seeing some news coming out of these events this week around dynamic smart cooling. That's a mouthful. What does it really mean?

Campbell: Good question. Dynamic smart cooling. The one thing that you should understand, when we talk about energy efficiency, is that it's about not just the new technology, which is always improving, but also some of the facilities types of capabilities we have. So, we have some fantastic new services from EYP, which HP acquired, and which designs most of the new data centers on Planet Earth at this point. Among the new capabilities we have around the data center, the key one is dynamic smart cooling.

Barclays Bank, for example, recently adopted it across their whole company to save more than 13 percent of their data-center cost. It manages the airflow in your facilities. So, in combination with services, plus this new technology that came from HP Labs, plus the new servers and software elements, this is the type of winning combination customers demand and expect from HP.
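HP's Dynamic Smart Cooling itself is proprietary, but the underlying feedback idea, sensor readings driving cooling output toward a temperature setpoint, can be illustrated with a toy proportional controller. Every number here (setpoint, gain, output percentages) is invented for illustration and is not HP's algorithm.

```python
# Toy illustration only: a proportional control loop that nudges cooling
# output toward a rack-inlet temperature setpoint. Not HP's algorithm;
# all constants are invented.

SETPOINT_C = 25.0   # target rack-inlet temperature, degrees Celsius
GAIN = 0.5          # proportional gain (arbitrary)

def adjust_cooling(current_output, sensor_temp_c):
    """Raise cooling output when the sensor runs hot, lower it when the
    sensor runs cold; clamp to the unit's 0-100 percent range."""
    error = sensor_temp_c - SETPOINT_C
    new_output = current_output + GAIN * error
    return max(0.0, min(100.0, new_output))

print(adjust_cooling(60.0, 28.0))  # 61.5 -- sensor hot, cooling ramps up
print(adjust_cooling(60.0, 23.0))  # 59.0 -- sensor cool, cooling eases off
```

The savings come from the same principle scaled up: instead of running every cooling unit flat out, output follows what the sensor network actually measures.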

Gardner: We are also, I believe, going to see some news later in the week around change management and problem management and resolution. I don't have the details, and we can't pre-announce that, but it does bring to mind the question about hardware/software services, these major trends, methodologies and maturity models, the Information Technology Infrastructure Library (ITIL).

For those folks managing multiple dimensions of IT operational integrity and efficiency, how do you get a handle on a holistic, top-down approach that includes elements of hardware, energy, software, change management, and IT systems management? Is there a whole greater than the sum of the parts here?

Campbell: We are finding that customers are demanding that holistic approach, which is why dealing with the company with the size and the depth and breadth of HP makes a lot of sense. Some of the software attributes that you've mentioned here at HP really do come to bear when you think about the adaptive infrastructure. Some of the fundamental building blocks from Opsware are great examples of that.

When you think about data-center automation, that is a great example, and Forrester recently called HP's Opsware product suite the number-one offering out there. And that's in combination with, as I mentioned before, some of the maturity-model type of assessment that we do with our largest customers. It is a fantastic dialogue and assessment, built on a rich set of data and best practices, where we understand where they are trying to go with their environment, and then work with them on specific recommendations. It's a fantastic process that we engage in with customers.

Gardner: I suppose an important aspect of going holistic is that people don't want iterative payback. They are looking for substantive efficiency and performance improvements. To that element, do you have some examples of companies or a matrix? What is the baseline? Are people looking for 15-20 percent that says "Yeah, I am ready to go holistic?" Is this more up towards 30-40 percent? What are the bottom line elements of what these customers are expecting from these kinds of major activities?

Campbell: Good question. We have a very robust solution called our Data Center Transformation Solution from HP, which is a composite of some concrete specific solutions with specific return-on-investment type of numbers in the range you mentioned for energy-efficiency IT consolidation, and business continuity in data center automation.

As you are saying, though, lots of customers don't have the time or the runway to expect a long-term project with a speculative type of payback. What we do is break it into bite-size chunks, into fundamental progress, with return-on-investment in these concrete solution areas.

Gardner: Let's look to the future a little bit. We are hearing a lot these days about cloud computing. Many people think of that as a greater utility function that someone else does, but for some of the enterprises that I speak to, they actually like the idea of private clouds -- taking the best of the technology and efficiencies at a cloud computing approach.

I believe it is taking the methodology and approach, as well as the technology set, and using that to support their services, their data, and perhaps start doing more platform as a service, integration as service activities, but for their internal constituencies, and then, over time, into their partners and business ecologies. What do you see coming from an adaptive solutions perspective for cloud computing?

Campbell: From my standpoint, I think you've nailed it, because we do not see our major enterprise customers turning their whole IT environment over, lock, stock, and barrel, to a perhaps less secure type of environment with less predictable types of results.

What we see, though, is that customers like attributes of the cloud. So, the private cloud concept that you speak of here is much more near and dear to the heart, as we've heard from some of our advisory-type customers and our lighthouse customers. From that standpoint, they are looking very much to an adaptive infrastructure to provide those types of attributes of a cloud, but still under the control and the security types of requirements that they have for their specific enterprise and their domains.

Gardner: So, when we think of the next generation of data-center architectures and the requirements for them, do you think cloud computing is going to play a significant role in that?

Campbell: That's the hot topic, and it's interesting, because of these specific benefits that we provide with the adaptive infrastructure around speed, cost, quality of service, and energy. It turns out those value propositions still remain true. So, we see this as more of an opportunity for us to provide new technology innovation for our customers through some of the attributes of the cloud. There are a lot of people working on this within HP, but I think it's about providing customer choice while providing those specific benefits in the next generation data center, and that is exactly our plan.

Gardner: Very good, and just to close out our discussion, you announced today the non-stop blade servers. When will those be available in the market?

Campbell: At this point, that news is being transmitted as we speak, and so as our press release comes across the wire we will all know that, and read that with great relish and anticipation.

Gardner: Okay, we can fill that in a little later in a future podcast. But, thank you. We've been speaking with Duncan Campbell. He is the vice president in charge of the Adaptive Infrastructure program here at Hewlett-Packard. Also, you're delivering, I believe, some keynotes and other discussions at the live event throughout the week.

Gardner: This comes to you as a sponsored HP Software Universe live podcast recorded at the Venetian Resort in Las Vegas. Look for other podcasts from this HP event at the hp.com website, under "Software Universe Live Podcasts," as well as through the BriefingsDirect Network. I would like to thank our producers on today’s show, Fred Bals and Kate Whalen, and also our sponsor, Hewlett-Packard.

I'm Dana Gardner, principal analyst at Interarbor Solutions. Thanks for listening, and come back next time for more in-depth podcasts on enterprise software infrastructure and strategies. Bye for now.

Listen to the podcast. Sponsor: Hewlett-Packard.

Transcript of BriefingsDirect podcast recorded at the Hewlett-Packard Software Universe Conference in Las Vegas, Nevada. Copyright Interarbor Solutions, LLC, 2005-2008. All rights reserved.