Tuesday, January 29, 2013

AT&T Cloud Services Built on VMware vCloud Datacenter Meet Evolving Business Demands for Advanced IaaS

Transcript of a BriefingsDirect podcast on how telecom giant AT&T is leveraging its networking and cloud expertise to provide advanced cloud services.

Listen to the podcast. Find it on iTunes. Download the transcript. Sponsor: VMware.

Dana Gardner: Hi, this is Dana Gardner, Principal Analyst at Interarbor Solutions, and you're listening to BriefingsDirect.

Today, we present a sponsored podcast discussion on how global telecommunications giant AT&T has created advanced cloud services for its customers. We'll see how AT&T has developed the ability to provide virtual private clouds and other computing capabilities as integrated services at scale.

Stay with us now to learn more about building the best infrastructure to handle some of the most demanding network and compute services for one of the world's largest service providers. Here to share her story on building top-performing infrastructure is Chris Costello, Assistant Vice President of AT&T Cloud Services. Welcome, Chris. [Disclosure: VMware is a sponsor of BriefingsDirect podcasts.]

Chris Costello: Thank you, Dana.

Gardner: Just to help us understand, because it's such a large company and you provide so many services, what cloud services generally is AT&T providing now, and why is this an important initiative for you?

Costello: AT&T has been in the hosting business for over 15 years, and so it was only a natural extension for us to get into the cloud services business to evolve with customers' changing business demands and technology needs.

We have cloud services in several areas. The first is our AT&T Synaptic Compute as a Service. This is a hybrid cloud that allows VMware clients to extend their private clouds into AT&T's network-based cloud using a virtual private network (VPN). And it melds the security and performance of VPN with the economics and flexibility of a public cloud. So the service is optimized for VMware's more than 350,000 clients.

If you look at customers who have internal clouds today or private data centers, they like the control, the security, and the leverage that they have, but they really want the best of both worlds. There are certain workloads where they want to burst into a service provider’s cloud.

We give them that flexibility, agility, and control, where they can simply point and click, using free downloadable tools from VMware, to instantly turn up workloads into AT&T's cloud.

Another capability that we have in this space is AT&T Platform as a Service. This is targeted primarily to independent software vendors (ISVs), IT leaders, and line-of-business managers. It allows customers to choose from 50 pre-built applications, instantly mobilize those applications, and run them in AT&T's cloud, all without having to write a single line of code.

So we're really starting to get into more of the informal buyers, those line-of-business managers, and IT managers who don't have the budget to build it all themselves, or don't have the budget to buy expensive software licenses for certain application environments.

Examples of some of the applications that we support with our platform as a service (PaaS) are things like salesforce automation, quote and proposal tools, and budget management tools.

Storage space

The third key category of AT&T's Cloud Services is in the storage space. We have our AT&T Synaptic Storage as a Service, and this gives customers control over storage, distribution, and retrieval of their data, on the go, using any web-enabled device. In a little bit, I can get into some detail on use cases of how customers are using our cloud services.

This is a very important initiative for AT&T. We're seeing customer demand of all shapes and sizes. We have a sizable business and effort supporting our small- to medium-sized business (SMB) customers, and we have capabilities that we have tailor-developed just to reach those markets.

As an example, in SMB, it's all about the bundle. It's all about simplicity. It's all about on demand. And it's all about pay per use and having a service provider they can trust.

In the enterprise space, you really start getting into detailed discussions around security. You also start getting into discussions with many customers who already have private networking solutions from AT&T that they trust. When you start talking with clients about the fact that they can run a workload, turn up a server in the cloud, behind their firewall, it really resonates with the CIOs we're speaking with in the enterprise space.

Also in enterprises, it's about having a globally consistent experience. So as these customers are reaching new markets, it's all about not having to stand up an additional data center, compute instance, or what have you, and having a very consistent experience, no matter where they do business, anywhere in the world.

New era for women in tech

Gardner: Let's look into your role, Chris, as an IT executive and also a woman. The fact is that a significant majority of CIOs and IT executives are men, and that's been the case for quite some time. But I'm curious, does cloud computing and the accompanying shift toward IT becoming more of a services-brokering role change that? Do you think that, with consensus building among businesses and partner groups being more important in that brokering role, this might bring in a new era for women in tech?

Costello: I think it is a new era for women in tech. Specifically to my experience in working at AT&T in technology, this company has really provided me with an opportunity to grow both personally and professionally.

I currently lead our Cloud Office at AT&T and, prior to that, ran AT&T’s global managed hosting business across our 38 data centers. I was also lucky enough to be chosen as one of the top women in wireline services.

What drives me as a woman in technology is that I enjoy the challenge of creating offers that meet customer needs, whether they be in the cloud space, things like driving eCommerce, high-performance computing environments, or disaster recovery (DR) solutions.

I love spending time with customers. That's my favorite thing to do. I also like to interact with the many partners and vendors that I work with to stay current on trends and technologies. The key to success of being a woman working in technology is being able to build offers that solve customers' business problems, number one.

Number two is being able to then articulate the value of a lot of the complexity around some of these solutions, and package the value in a way that’s very simple for customers to understand.

Some of the challenge and also opportunity of the future is that, as technology continues to evolve, it’s about reducing complexity for customers and making the service experience seamless. The trend is to deliver more and more finished services, versus complex infrastructure solutions.

Gardner: It’s a very interesting period. Do you have any sense of a future direction in terms of IT roles? Does the actual role, whether it’s a man or woman, shift? The leadership in IT, how is that changing?

Costello: I've been in the technology space for a number of years at AT&T and I've had the opportunity to interact with many women in leadership, whether they be my peer group, managers that work as a part of my team, and/or mentors that I have within AT&T that are senior leaders within the business.

I've worked with several women in leadership. I think that trend is going to continue. I also mentor three women at AT&T, whether they be in technology, sales, or an operations role. So I'm starting to see this trend continue to grow.

Gardner: You have a lot of customers who are already using your network services. It seems a natural extension for them to look to you for cloud, and now you have created these, as I have seen it termed, virtual private clouds.

From what you're describing, that allows folks to take whatever cloud activities they've got and be able to burst those into your cloud, and that gives them that elasticity. I imagine there are probably some good cost-efficiencies as well.

Costello: Absolutely. We've embedded cloud capabilities into the AT&T managed network. It enables us to deliver a mobile cloud as well. That helps customers to transform their businesses. We're delivering cloud services in the same manner as voice and data services, intelligently routed across our highly secure, reliable network.

AT&T's cloud is embedded in our network. It's not sitting on top of or attached to our network, but it's fully integrated to provide customers a seamless, highly secure, low-latency, and high-performing experience.

Gardner: Let’s look into the VMware solution set, and why you chose VMware. Maybe you can explain the process. Was this a data-driven decision? Was this a pure architecture? Were there other technology or business considerations? I'm just trying to better understand the lead-up to using vCloud Datacenter Services as a core to the AT&T Synaptic Compute as a Service. 

Multiple uses

Costello: AT&T uses VMware in several of our hosting application and cloud solutions today. In the case of AT&T Synaptic Compute as a Service, we use that in several ways, both to serve customers in public cloud and hybrid, as well as private cloud solutions.

We've also been using VMware technology for a number of years in AT&T's Synaptic Hosting offer, which is our enterprise-grade utility computing service. And we've been serving customers with server virtualization solutions that are available in AT&T data centers around the world and can also be extended into customer or third-party locations.

Just to drill down on some of the key differentiators of AT&T Synaptic Compute as a Service, it’s two-fold.

One is that we integrate with AT&T private networking solutions. Some of the benefits that customers enjoy as a result of that are orchestration of resources, where we'll take compute, storage, and networking resources and provide the exact amount of resources at the exact right time to customers on demand.

Our solutions offer enterprise-grade security. The fact that we've integrated AT&T Synaptic Compute as a Service with our private networking solutions allows customers to extend their cloud into our network using VPN.

Let me touch upon VMware vCloud Datacenter Services for a minute. We think that’s another key differentiator for us, in that we can allow clients to seamlessly move workloads to our cloud using native VMware toolsets. Essentially, we're taking technical complexity and interoperability challenges off the table.

How this manifests itself in terms of client solutions is that an engineering firm can now perform computationally intensive complex mathematical modeling on the fly and on demand using AT&T Synaptic Compute as a Service.

Medical firms can use our solutions for medical imaging to securely store and access x-rays. Companies that are interested in mobile cloud solutions can use AT&T's Mobile Enterprise Application Platform to offer product catalogs in the cloud with mobile access.

Gardner: It certainly appears to me that we're going to be finding a lot more ways in which the private cloud infrastructure in these organizations can synergistically add value and benefit from public cloud services.

Cloud interaction

Even though we want to distill out the complexities, there’s something about the interactions between the private cloud and the enterprise and the public cloud services, like AT&T, that depend on some sort of a core architecture. How are you looking at making that visible? What are some of the important requirements that you have for making this hybrid cloud capability work?

Costello: One of the requirements for a hybrid cloud solution to be a success, specifically in terms of how AT&T offers the service, is that we have a large base of customers that have private networking solutions with AT&T, and they view their networks as secure and scalable.

Many of the customers we have today have been using these networks for many years. And as customers look to cloud solutions to evolve their data centers and their application environments, they're demanding that the solution be secure and scalable. So the fact that we let customers extend their private cloud and instantly access our cloud environment over their private network is key, especially when it comes to enterprise customers.

Secondly, with the vCloud Datacenter program that we are part of with VMware, letting customers copy and paste workloads, see all of their virtual machines, whether in their own private cloud environment or in a hybrid solution provided by AT&T, and manage those machines through a single interface is key in reducing technical complexity and speeding time to market.

Gardner: I should also think that these concepts around the software-defined datacenter and software-defined networking play a part in that. Is that something that you are focused on?

Costello: Software-defined datacenters and software-defined networks are essentially what we're talking about here, with some uniqueness that AT&T Labs has built within our networking solutions. We essentially take our edge routers and the benefits associated with AT&T networking solutions around redundancy, quality of service, and so on, and extend that into cloud solutions, so customers can extend their cloud into our network using VPN solutions.

Gardner: As you moved toward this really important initiative, what were some of the other requirements you had in terms of functionality for your infrastructure? What were you really looking for?

Costello: In terms of functionality for the infrastructure, if we start with enterprise, the security aspects of the solution had to prove out for the customers that we do business with. When you think about clients in financial services, the federal government, and healthcare, as examples, we really had to prove that the data was secure and private. The certifications and audits and compliance that we were able to provide for our customers were absolutely critical to earning customers’ business.

We're seeing more and more customers, who have had very large IT shops in the past, who are now opening the door and are very open to these discussions, because they're viewing AT&T as a service provider that can really help them to extend the private cloud environment that they have today. So security is absolutely key.

As I mentioned earlier, networking capabilities are very attractive to the enterprise customers that we're talking to. They may think, "I've already invested in this global managed network that I have in multiple points around the world, and I'm simply adding another node on my network. Within minutes I can turn up workloads or I can store data in the cloud and only pay for the resources that I utilize, not only the compute and/or storage resources, but also the network resources."

Added efficiency

Previously, many customers would have to buy a router and try to pull together a solution on their own. That can be costly and time-consuming. There's a whole lot of efficiency that comes with having a service provider manage your compute, storage, and networking capabilities end to end.

Global scale was also very critical to the customers who we've been talking to. The fact that AT&T has localized and distributed resources through a combination of our 38 data centers around the world, as well as central offices, makes it very attractive to do business with AT&T as a service provider.

Also, having that enterprise-grade customer experience is absolutely critical to the customers who do business with AT&T. When they think of our brand, they think of reliability. If there are service degradation or change management issues, they want to know that they've got a resource that is working on their behalf that has technical expertise and is a champion working proactively on their cloud environment.

Gardner: You mentioned that it's a natural extension for those who are using your network services to move towards cloud services. You also mentioned that VMware has somewhere in the order of 350,000 customers with private-cloud installations that can now seamlessly move to your public-cloud offering.

Tell me how that came about and why the VMware platform, as well as their installed base, has become critical for you?

Costello: We've been doing business with VMware for a number of years. We also have a utility-computing platform called AT&T Synaptic Hosting. We learned early on, in working with customers’ managed utility computing environments, that VMware was the virtualization tool of choice for many of our enterprise customers.

As technologies evolved over time and cloud technologies have become more prevalent, it was absolutely paramount for us to pick a virtualization partner that was going to provide the global scale that we needed to serve our enterprise customers, and to be able to handle the large amount of volume that we receive, given the fact that we have been in the hosting business for over 15 years.

As a natural extension of our Synaptic Hosting relationship with VMware for many years, it only made sense that we joined the VMware vCloud Datacenter program. VMware is baked into our Synaptic Compute as a Service capability. And it really lets customers have a simplified hybrid cloud experience. In five simple steps, customers can move workloads from their private environment into AT&T's cloud environment.

Think of yourself as the IT manager coming in to start your workday. All of a sudden, you hit 85 percent utilization in your environment, but you want to very easily access additional resources from AT&T. You can use the same console that you use to perform your daily job for the data center that you run in-house.

In five clicks, you're viewing your in-house private-cloud resources that are VMware based and your AT&T virtual machines (VMs) running in AT&T's cloud, our Synaptic Compute as a Service capability. That all happens in minutes' time.
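[Editor's illustration: the console workflow described above is point-and-click, but the same burst decision can be pictured as a simple threshold rule. The sketch below is hypothetical Python; the threshold, the per-VM capacity figure, and both helper functions are illustrative placeholders, not actual AT&T or VMware APIs.]

```python
# Hypothetical sketch of a threshold-based "cloud burst" decision.
# The helpers are stand-ins for whatever monitoring and provisioning
# interfaces a given shop uses; they are not real AT&T or VMware APIs.

BURST_THRESHOLD = 0.85    # burst once private-cloud utilization hits 85 percent
VM_CAPACITY_SHARE = 0.05  # assumed share of load one additional VM can absorb


def get_private_cloud_utilization() -> float:
    """Placeholder: return current private-cloud utilization as 0.0-1.0."""
    raise NotImplementedError("wire this to your own monitoring system")


def provision_provider_vms(count: int) -> None:
    """Placeholder: request `count` VMs from the provider's cloud."""
    raise NotImplementedError("wire this to your provider's provisioning interface")


def maybe_burst() -> None:
    utilization = get_private_cloud_utilization()
    if utilization < BURST_THRESHOLD:
        return  # enough headroom in-house; nothing to do
    # Estimate how many provider VMs would bring utilization back under the threshold.
    excess = utilization - BURST_THRESHOLD
    vms_needed = max(1, round(excess / VM_CAPACITY_SHARE))
    provision_provider_vms(vms_needed)
```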

Fantastic discussions

I've been in the hosting business and application management business for many years and have seen lots of fantastic discussions with customers. The whole thing falls apart when you start talking about the complexities of interoperability and having to write scripts and code, and not being able to use the tools that clients have already made investments in.

The fact that we're part of the vCloud Datacenter program provides a lot of benefits for our clients, when you talk to customers about the benefit of running that cloud in AT&T's network. Some of the additional benefits are no incremental bandwidth needed at the data center and no investment in a managed-router solution.

We have patented AT&T technology that completely isolates traffic from other cloud traffic. The network and cloud elasticity work in tandem. So all of this happens on the fly, instantaneously. Then, all of the end-to-end class of service prioritization and QoS and DDOS protection capabilities that are inherent in our network are now surrounding the compute experience as well.

Gardner: We've certainly seen a lot of interest in this hybrid capability. I wonder if you could help me identify some of the use cases that this is being employed with now. I'm thinking that if I needed to expand my organization into another country or to another region of the world, given your 38 data centers and your global reach, I would be able to take advantage of this and bring services to that region from my private cloud pretty rapidly.

Is that one of the more popular use cases, or are there some others that are on the forefront of this hybrid uptake?

Costello: I speak with a lot of customers who are looking to virtually expand. They have data-center, systems, and application investments, and they have global headquarters locations, but they don't want to have to stand up another data center or ship staff out to other locations. So certainly one use case that's very popular with customers is, "I can expand my virtual data-center environment and use AT&T as a service provider to help me do that."

Another use case that's very popular with our customers is disaster recovery. We see a lot of customers looking for a more efficient way to have business continuity, the ability to fail over in the event of a disaster, and also to get in and test their plans more frequently than they're doing today.

For many of the solutions that are in place today, clients are saying they're expensive or just not meeting the service-level agreements (SLAs) they owe their business units. One of the solutions that we recently put in place for a client is that we put them in two of AT&T's geographically diverse data centers. We wrapped it with AT&T's private-networking capability and then layered in our AT&T Synaptic Compute as a Service and Storage as a Service.

The customer ended up with a better SLA and a very powerful return on investment (ROI) as well, because they're only paying for the cloud resources when the meter is running. They now have a stable environment, so they can test their plans as often as they'd like to, and they pay only a very small storage fee unless they actually need to invoke the plan in a disaster. So DR plans are very popular.

Another use case that’s very popular among our clients is short-term compute. We work with a lot of customers who have massive mathematical calculations and they do a lot of number crunching.

Data crunching

One customer that comes to mind is one that looks at the probability of natural disasters affecting large structures, such as bridges, tunnels, and nuclear power plants. They came to AT&T, looked at our Synaptic Compute as a Service, and ultimately ran a very large number of VMs in a workload. Because of the large amount of data crunching they had to do, they ran it for two weeks straight on our platform. They finished the report, they were very pleased with the results, and the convenience factor was there.

They didn't have to stand up an environment temporarily for themselves, and now they use us anytime they sign a new client for those bursty, short-term compute workloads.

Certainly test and development is one of the most highly adopted use cases I'm seeing among CIOs, directors of IT, and other functional managers, in that it's lower risk. Over the years, we've gone from, "Will I use the cloud?" to "What workloads are going to fit for me in the cloud?"

For those that are earlier on in their journey, using AT&T's Synaptic Compute as a Service for their test and development environments certainly provides the performance, the global reach, and also the economics of pay per use. And if a client has private networking solutions from AT&T, they can fully integrate with our private networking solutions.

Finally, in the compute space, we're seeing a lot of customers start to hang virtual desktop solutions off of their compute environment. In the past, when I would ask clients about virtual desktop infrastructure (VDI), they'd say, "We're looking at it, but we're not sure. It hasn’t made the budget list." All of a sudden, it’s becoming one of the most highly requested use cases from customers, and AT&T has solutions to cover all those needs.
 
Gardner: I'm particularly interested in the spiky applications, where your workload spikes up, but there's no sense in keeping resources available for it when they're not in use. Do you think that this will extend to some of the big data and analytics crunching that we've heard about, or is the hurdle of getting the data to the cloud still a major issue? And does your unique position as a network service provider help pave the way for more of these big-data, spiky types of uses?

Costello: I don’t think anyone is in a better position than AT&T to be able to help customers to manage their massive amounts of data, given the fact that a lot of this data has to reside on very strong networking solutions. The fact that we have 38 data centers around the world, a global reach from a networking perspective, and all the foundational cloud capabilities makes a whole lot of sense.

Speaking about this type of bursty use case, we host some of the largest brand-name retailers in the world. When you think about it, a lot of these retailers are preparing for the holidays, and their servers go underutilized much of the year. So how attractive is it to be able to look at AT&T, as a service provider, to provide them robust SLAs and a platform that they only have to pay for when they need to utilize it, versus sitting very underutilized much of the year?

We also host many online gaming customers. When you think about the gamers that are out there, there is a big land rush when the buzz occurs right before the launch of a new game. We work very proactively with those gaming customers to help them size their networking needs well in advance of a launch. Also we'll monitor it in real time to ensure that those gamers have a very positive experience when that launch does occur.

Gardner: I suppose one other area that’s top of mind for lots of folks is how to extend the enterprise out to the mobile tier, to those mobile devices. Again, this seems to be an area where having the network services expertise and reach comes to an advantage.

For an enterprise that wanted to extend more of their apps, perhaps the VDI experience, out to their mobile devices, be they smartphones or tablets, what offerings do you have that might help grease the skids toward that kind of value?

Mobility applications

Costello: AT&T has a very successful mobility applications business, and we have a couple of examples of how customers use our cloud services in conjunction with making their mobile applications more productive.

First and foremost, we have a set of experts and consultants who help customers to mobilize their applications. So there might be internal customer proprietary applications, and we can really help them move to an on-demand mobile environment.

Secondly, here are a couple of cloud examples of how customers use our capabilities off the shelf. One is our AT&T Synaptic Storage as a Service capability. We find that many customers are looking for a secure place to collaborate on data sharing. They're looking for a place to access and store their data to enable a worker-on-the-go scenario, or to enable field services applications or technicians.

Our AT&T Synaptic Storage as a Service capability gives the end user the ability to store, distribute, share, and retrieve that data on the go using any web-enabled device. Another example is AT&T's Platform as a Service capability, a great foundational tool for users to go in, use any one of our pre-built applications, and then instantly mobilize that application.

We have a customer who recently used this, because they had a customer meeting and they didn't have a sophisticated way to get surveys out for their customers. They wanted to create a database on the fly and get instantaneous feedback.

So they went into AT&T's Platform as a Service (and this is a marketing person, mind you, not a technical user) and entered the questions that they required of the customers. They sent the quick questionnaire, five simple questions, out to the end users. The clients answered the questions.

Ultimately, that customer had a very sophisticated database with all of that information that they could use for market sensing on how to improve their products, number one. But number two, it made sense to use it as a marketing tool to provide promotional information to those customers in the future.

Gardner: Very good. We've been talking about how global telecommunications giant AT&T has been creating and delivering advanced cloud services for its customers, and we have seen how a VMware-centric infrastructure approach helps provide virtual private clouds and other computing capabilities as integrated services at scale.

So thanks to our guest, Chris Costello, Assistant Vice President of AT&T Cloud Services, really appreciate your input.

Costello: Thank you.

Gardner: This is Dana Gardner, Principal Analyst at Interarbor Solutions. Thanks again to our audience for listening, and do come back next time.

Listen to the podcast. Find it on iTunes. Download the transcript. Sponsor: VMware.

Transcript of a BriefingsDirect podcast on how telecom giant AT&T is leveraging its networking and cloud expertise to provide advanced cloud services. Copyright Interarbor Solutions, LLC, 2005-2013. All rights reserved.


Monday, January 28, 2013

The Open Group Keynoter Sees Big-Data Analytics Bolstering Quality, Manufacturing, Processes

Transcript of a BriefingsDirect podcast on how Ford Motor Company is harnessing multiple big data sources to improve products and operations.
Listen to the podcast. Find it on iTunes. Download the transcript. Sponsor: The Open Group.

Dana Gardner: Hello, and welcome to a special BriefingsDirect thought leadership interview series coming to you in conjunction with The Open Group Conference on Jan. 28 in Newport Beach, California.

I'm Dana Gardner, Principal Analyst at Interarbor Solutions, and I'll be your host and moderator throughout these business transformation discussions. The conference will focus on "Big Data -- The Transformation We Need to Embrace Today."

We are here now with one of the main speakers at the conference, Michael Cavaretta, PhD, Technical Leader of Predictive Analytics for Ford Research and Advanced Engineering in Dearborn, Michigan.

We’ll see how Ford has exploited the strengths of big data analytics by directing them internally to improve business results. In doing so, they scour the metrics from the company’s best processes across myriad manufacturing efforts and through detailed outputs from in-use automobiles, all to improve and help transform their business. [Disclosure: The Open Group is a sponsor of BriefingsDirect podcasts.]

Cavaretta has led multiple data-analytics projects at Ford to break down silos inside the company and to identify Ford's most fruitful data sets. Ford has successfully aggregated customer feedback and extracted internal data to predict how new features and technologies will best improve its cars.

As a lead-in to his Open Group presentation, Michael and I will now explore how big data is fostering business transformation by allowing deeper insights into more types of data efficiently, and thereby improving processes, quality control, and customer satisfaction.

With that, please join me in welcoming Michael Cavaretta. Welcome to BriefingsDirect, Michael.

Michael Cavaretta: Thank you very much.

Gardner: Your upcoming presentation for The Open Group Conference is going to describe some of these new approaches to big data and how they offer valuable insights into internal operations, and therefore help make a better product. To start, what's different now in being able to get at this data and do this type of analysis, compared with, say, five years ago?

Cavaretta: The biggest difference has to do with the cheap availability of storage and processing power, where a few years ago people were very much concentrated on filtering down the datasets that were being stored for long-term analysis. There has been a big sea change with the idea that we should just store as much as we can and take advantage of that storage to improve business processes.

Gardner: That sounds right on the money, but how did we get here? How did we get to the point where we could start turning these technology advances, as you say, better storage, networks, the ability to move big datasets, that sort of thing, into real benefits? What's the process behind the benefit?

Sea change in attitude

Cavaretta: The process behind the benefits has to do with a sea change in the attitude of organizations, particularly IT within large enterprises. There's this idea that you don't need to spend so much time figuring out what data you want to store and worrying about the cost associated with it, and should think more about data as an asset. There is value in being able to store it, and being able to go back and extract different insights from it. This comes from really cheap storage, access to parallel-processing machines, and great software.

Gardner: It seems to me that for a long time, the mindset was that data is simply the output from applications, with applications being primary and the data being almost an afterthought. It seems like we've sort of flipped that. The data now is perhaps as important, or even more important, than the applications. Does that seem to hold true?

Cavaretta: Most definitely, and we’ve had a number of interesting engagements where people have thought about the data that's being collected. When we talk to them about big data, storing everything at the lowest level of transactions, and what could be done with that, their eyes light up and they really begin to get it.

Gardner: I suppose earlier, when cost considerations and technical limitations were at work, we would just go for a tip-of-the-iceberg level. Now, as you say, we can get almost all the data. So, is this a matter of getting at more data, different types of data, bringing in unstructured data, all of the above? How much are you really going after here?

Cavaretta: I like to talk to people about the possibility that big data provides and I always tell them that I have yet to have a circumstance where somebody is giving me too much data. You can pull in all this information and then answer a variety of questions, because you don't have to worry that something has been thrown out. You have everything.

You may have 100 questions, and each one of the questions uses a very small portion of the data. Those questions may use different portions of the data, a very small piece, but they're all different. If you go in thinking, "We’re going to answer the top 20 questions and we’re just going to hold data for that," that leaves so much on the table, and you don't get any value out of it.

Gardner: I suppose too that we can think about small samples or small datasets and aggregate them or join them. We have new software capabilities to do that efficiently, so that we’re able to not just look for big honking, original datasets, but to aggregate, correlate, and look for a lifecycle level of data. Is that fair as well?

Cavaretta: Definitely. We're a big believer in mash-ups and we really believe that there is a lot of value in being able to take even datasets that are not specifically big-data sizes yet, and then not go deep, not get more detailed information, but expand the breadth. So it's being able to augment it with other internal datasets, bridging across different business areas as well as augmenting it with external datasets.

A lot of times you can take something that is maybe a few hundred thousand records or a few million records, and then by the time you’re joining it, and appending different pieces of information onto it, you can get the big dataset sizes.
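[Editor's illustration: a minimal sketch of the kind of mash-up Cavaretta describes, joining a modest internal dataset with an external one to widen its breadth. The file names and columns are hypothetical examples, and pandas is just one convenient tool for this sort of join.]

```python
# Minimal sketch of "expanding the breadth" of a dataset by joining internal
# records with an external source. File and column names are hypothetical.
import pandas as pd

# A few hundred thousand internal records, e.g. warranty claims by date and region.
internal = pd.read_csv("warranty_claims.csv")    # columns: claim_date, region, part_id, ...

# External data keyed the same way, e.g. daily weather by region.
external = pd.read_csv("regional_weather.csv")   # columns: date, region, temp_c, humidity
external = external.rename(columns={"date": "claim_date"})

# Append the external attributes onto each internal record.
merged = internal.merge(external, on=["claim_date", "region"], how="left")

# Each join widens the table; several such joins can push a modest dataset
# toward big-data sizes even though every individual source was small.
print(merged.shape)
```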

Gardner: Just to be clear, you’re unique. The conventional wisdom for big data is to look at what your customers are doing, or just the external data. You’re really looking primarily at internal data, while also availing yourself of what external data might be appropriate. Maybe you could describe a little bit about your organization, what you do, and why this internal focus is so important for you.

Internal consultants

Cavaretta: I'm part of a larger department that is housed over in the research and advanced-engineering area at Ford Motor Company, and we’re about 30 people. We work as internal consultants, kind of like Capgemini or Ernst & Young, but only within Ford Motor Company. We’re responsible for going out and looking for different opportunities from the business perspective to bring advanced technologies. So, we’ve been focused on the area of statistical modeling and machine learning for I’d say about 15 years or so.

And in this time, we've had a number of engagements where we've talked with different business customers, and people have said, "We'd really like to do this." Then, we'd look at the datasets that they have and say, "Wouldn't it be great if we had had this? Now we have to wait six months or a year."

These new technologies are really changing the game from that perspective. We can turn on the complete fire hose, and then say that we don't have to worry about that anymore. Everything is coming in. We can record it all. We don't have to worry about whether the data supports a given analysis, because it's all there. That's really a big benefit of big-data technologies.

Gardner: If you've been doing this for 15 years, you must be demonstrating a return on investment (ROI) or a value proposition back to Ford. Has that value proposition been changing? Do you expect it to change? What might be your real value proposition two or three years from now?

Cavaretta: The real value proposition definitely is changing as things are being pushed down in the company to lower-level analysts who are really interested in looking at things from a data-driven perspective. From when I first came in to now, the biggest change has been when Alan Mulally came into the company, and really pushed the idea of data-driven decisions.

Before, we were getting a lot of interest from people who were very focused on the data that they had internally. After that, they had a lot of questions from their management and from upper-level directors and vice presidents saying, "We've got all these data assets. We should be getting more out of them." This strategic perspective has really changed a lot of what we've done in the last few years.

Gardner: As I listen to you Michael, it occurs to me that you are applying this data-driven mentality more deeply. As you pointed out earlier, you're also going after all the data, all the information, whether that’s internal or external.

In the case of an automobile company, you're looking at the factory, the dealers, what drivers are doing, what the devices within the automobile are telling you, factoring that back into design relatively quickly, and then repeating this process. Are we getting to the point where this sort of Holy Grail notion of a total feedback loop across the lifecycle of a major product like an automobile is really within our grasp? Are we getting there, or is this still kind of theoretical? Can we pull it all together and make it a science?

Cavaretta: The theory is there. The question has more to do with the actual implementation and the practicality of it. We're still talking about a lot of data, and even with new advanced technologies and techniques, that's a lot of data to store, a lot of data to analyze, and a lot of data to make sure that we can mash up appropriately.

And, while I think the potential is there and the theory is there, there is also work in being able to get the data from multiple sources. Everything you can get back from the vehicle, fantastic. Now if you marry that up with internal data, is it survey data, is it manufacturing data, is it quality data? What are the things you want to go after first? We can't do everything all at the same time.

Highest value

Our perspective has been let’s make sure that we identify the highest value, the greatest ROI areas, and then begin to take some of the major datasets that we have and then push them and get more detail. Mash them up appropriately and really prove up the value for the technologists.

Gardner: Clearly, there's a lot more to come in terms of where we can take this, but I suppose it's useful to have a historic perspective and context as well. I was thinking about some of the early quality gurus like Deming and some of the movement towards quality like Six Sigma. Does this fall within that same lineage? Are we talking about a continuum here over that last 50 or 60 years, or is this something different?

Cavaretta: That’s a really interesting question. From the perspective of analyzing data, using data appropriately, I think there is a really good long history, and Ford has been a big follower of Deming and Six Sigma for a number of years now.

The difference, though, is this idea that you don't have to worry so much upfront about getting the data. If you're doing this right, you have the data right there, and this has some great advantages. You don't have to wait until you've built up enough history to look for patterns. Then again, it also has some disadvantages, which is that you've got so much data that it's easy to find things that could be spurious correlations or models that don't make any sense.

The piece that is required is good domain knowledge, in particular when you are talking about making changes in the manufacturing plant. It's very appropriate to look at things and be able to talk with people who have 20 years of experience to say, "This is what we found in the data. Does this match what your intuition is?" Then, take that extra step.

Gardner: Tell me a little about a day in the life of your organization and your team, to let us know what you do. How do you go about making more data available and then reaching some of these higher-level benefits?

Cavaretta: We're very much focused on interacting with the business. Most of all, we work on pilot projects and work with our business customers to bring advanced analytics and big-data technologies to bear against these problems. So we work in what we call a push-and-pull model.

We go out and investigate technologies and say these are technologies that Ford should be interested in. Then, we look internally for business customers who would be interested in that. So, we're kind of pushing the technologies.

From the pull perspective, we’ve had so many successful engagements in such good contacts and good credibility within the organization that we've had people come to us and say, "We’ve got a problem. We know this has been in your domain. Give us some help. We’d love to be able to hear your opinions on this."

So we've pulled from the business side, and then our job is to match up those two pieces. It's best when we're looking at a particular technology, somebody comes to us, and we say, "Oh, this is a perfect match."

Big data

Those types of opportunities have been increasing in the last few years, and we've been very happy with the number of internal customers that have really been very excited about the areas of big data.

Gardner: Because this is The Open Group Conference and an audience that's familiar with the IT side of things, I'm curious how this relates to software and software development. Of course, there are so many millions of lines of code in automobiles these days, with software being more important than just about everything. Are you applying a lot of what you are doing to the software side of the house, or are the agile practices, feedback loops, and performance-management issues a separate domain? Is there crossover here?

Cavaretta: There's some crossover. The biggest area that we've been focused on has been picking up information, whether from internal business processes or from the vehicle, and then being able to bring it back in to derive value. We have very good contacts in the Ford IT group, and they have been fantastic to work with in bringing interesting tools and technology to bear, and then looking at moving those into production and what's the best way to be able to do that.

A fantastic development has been this idea that we’re using some of the more agile techniques in this space and Ford IT has been pushing this for a while. It’s been fantastic to see them work with us and be able to bring these techniques into this new domain. So we're pushing the envelope from two different directions.

Gardner: It sounds like you will be meeting up at some point with a complementary nature to your activities.

Cavaretta: Definitely.

Gardner: Let’s move on to this notion of the "Internet of things," a very interesting concept that lot of people talk about. It seems relevant to what we've been discussing.

We have sensors in these cars, wireless transfer of data, more and more opportunity for location information to be brought to bear, where cars are, how they're driven, speed information, all sorts of metrics, maybe making those available through cloud providers that assimilate this data.

So let’s not go too deep, because this is a multi-hour discussion all on its own, but how is this notion of the Internet of things being brought to bear on your gathering of big data and applying it to the analytics in your organization?

Cavaretta: It is a huge area, and not only from the internal process perspective -- RFID tags within the manufacturing plants, as well as out on the plant floor -- and then all of the information that's being generated by the vehicle itself.

The Ford Energi generates about 25 gigabytes of data per hour. So you can imagine selling a couple of million vehicles in the near future with that amount of data being generated. There are huge opportunities within that, and there are also some interesting opportunities having to do with opening up some of these systems for third-party developers. OpenXC is an initiative that we have going on at Research and Advanced Engineering.

Huge number of sensors

We have a lot of data coming from the vehicle. There's a huge number of sensors and processors being added to vehicles. There's data being generated there, as well as communication between the vehicle and your cell phone and communication between vehicles.

There's a group in Ann Arbor, Michigan, the University of Michigan Transportation Research Institute (UMTRI), that's investigating that, as well as communication between the vehicle and, let's say, a home system. It lets the home know that you're on your way and it's time to increase the temperature, if it's winter outside, or cool it in the summertime.

The amount of data that’s been generated there is invaluable information and could be used for a lot of benefits, both from the corporate perspective, as well as just the very nature of the environment.

Gardner: Just to put a stake in the ground on this, how much data do cars typically generate? Do you have a sense of what now is the case, an average?

Cavaretta: The Energi, according to the latest information that I have, generates about 25 gigabytes per hour. Different vehicles are going to generate different amounts, depending on the number of sensors and processors on the vehicle. But the biggest key has to do with not necessarily where we are right now but where we will be in the near future.

With the amount of information that's being generated from the vehicles, a lot of it is just internal stuff. The question is how much information should be sent back for analysis and to find different patterns? That becomes really interesting as you look at external sensors, temperature, humidity. You can know when the windshield wipers go on, and then take that information and mash it up with other external data sources too. It's a very interesting domain.
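[Editor's illustration: a back-of-the-envelope estimate of what those per-vehicle figures imply at fleet scale. Only the 25 gigabytes per hour rate comes from the interview; the daily driving time and fleet size are assumptions for the sake of the arithmetic.]

```python
# Rough fleet-scale data volume estimate. Only the 25 GB/hour rate is from
# the interview; the other inputs are illustrative assumptions.
GB_PER_VEHICLE_HOUR = 25     # per-vehicle generation rate cited for the Energi
HOURS_DRIVEN_PER_DAY = 1.5   # assumed average daily driving time
FLEET_SIZE = 2_000_000       # "a couple of million vehicles"

gb_per_day = GB_PER_VEHICLE_HOUR * HOURS_DRIVEN_PER_DAY * FLEET_SIZE
print(f"~{gb_per_day / 1e6:.0f} PB of raw vehicle data per day")  # roughly 75 PB/day
```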

Gardner: So clearly, it's multiple gigabytes per hour per vehicle and probably going much higher.

Cavaretta: Easily.

Gardner: Let's move forward now for those folks who have been listening and are interested in bringing this to bear on their organizations and their vertical industries, from the perspective of skills, mindset, and culture. Are there standards, certifications, or professional organizations that you're working with in order to find the right people?

It's a big question. Let's look at what skills you target for your group, and in what ways you think you can improve on that. Then, we'll get into some of those larger issues about culture and mindset.

Cavaretta: The skills that we have in our department, in particular on our team, are in the area of computer science, statistics, and some good old-fashioned engineering domain knowledge. We’ve really gone about this from a training perspective. Aside from a few key hires, it's really been an internally developed group.

Targeted training

The biggest advantage that we have is that we can go out and be very targeted with the amount of training that we have. There are so many good tools out there, especially in the open-source realm, that we can spin things up with relatively low cost and low risk, and do a number of experiments in the area. That's really the way that we push the technologies forward.

Gardner: Why The Open Group? Why is that a good forum for your message, and for your research here?

Cavaretta: The biggest reason is the focus on the enterprise. There are a lot of advantages and a lot of business cases in looking at large enterprises, where there are a lot of systems and where a company can take a relatively small improvement and have it make a large difference on the bottom line.

Talking with The Open Group really gives me an opportunity to bring people on board with the idea that you should be looking at a difference in mindset. It's not, "Here's a way that data is being generated; try to conceive of some questions that we can use it for, and we'll store that too." It's, "Let's just take everything, worry about it later, and then find the value."

Gardner: I'm sure the viewers of your presentation on January 28 will be gathering a lot of great insights. A lot of the people that attend The Open Group conferences are enterprise architects. What do you think those enterprise architects should be taking away from this? Is there something about their mindset that should shift in recognizing the potential that you've been demonstrating?

Cavaretta: It's important for them to be thinking about data as an asset, rather than as a cost. You may even have to spend some money, and it may feel a little bit unsafe without really solid ROI at the beginning. Then, move toward pulling that information in and storing it in a way that allows not just the high-level data scientists to get access and provide value, but anyone who is interested in the data overall. Those are very important pieces.

The last one is how you take a big-data project, something where you're not storing data in the traditional business intelligence (BI) framework that an enterprise develops, and then connect it to the BI systems and look at providing value through those mash-ups. Those are really important areas that still need some work.

Gardner: Another big constituency within The Open Group community are those business architects. Is there something about mindset and culture, getting back to that topic, that those business-level architects should consider? Do you really need to change the way you think about planning and resource allocation in a business setting, based on the fruits of things that you are doing with big data?

Cavaretta: I really think so. The digital asset that you have can be monetized to change the way the business works, and that could be done by creating new assets that then can be sold to customers, as well as improving the efficiencies of the business.

High quality data

The idea that everything is going to be very well-defined, with a lot of work put into making sure the data has high quality, needs to change somewhat. As you're pulling the data in and thinking about long-term storage, the focus is more on access to the information than on the problem of just storing it.

Gardner: Interesting that you brought up that notion that the data becomes a product itself and even a profit center perhaps.

Cavaretta: Exactly. There are many companies, especially large enterprises, that are looking at their data assets and wondering what can they do to monetize this, not only to just pay for the efficiency improvement but as a new revenue stream.

Gardner: We're almost out of time. For those organizations that want to get started on this, are there any 20/20 hindsights or Monday-morning-quarterback insights you can provide? How do you get started? Do you appoint a leader? Do you need a strategic roadmap, getting this culture or mindset shifted, pilot programs? How would you recommend that people begin the process of getting into this?

Cavaretta: We're definitely huge believers in pilot projects and proofs of concept, and we like to develop roadmaps by doing. So get out there. Understand that it's going to be messy. Understand that it may be a little bit more costly and the ROI isn't going to be there at the beginning.

But get your feet wet. Start doing some experiments, and then, as those experiments turn from just experimentation into really providing real business value, that’s the time to start looking at a more formal aspect and more formal IT processes. But you've just got to get going at this point.

Gardner: I would think that the competitive forces are out there. If you are in a competitive industry, and those that you compete against are doing this and you are not, that could spell some trouble.

Cavaretta: Definitely.

Gardner: We’ve been talking with Michael Cavaretta, PhD, Technical Leader of Predictive Analytics at Ford Research and Advanced Engineering in Dearborn, Michigan. Michael and I have been exploring how big data is fostering business transformation by allowing deeper insights into more types of data and all very efficiently. This is improving processes, updating quality control and adding to customer satisfaction.

Our conversation today comes as a lead-in to Michael's upcoming plenary presentation. He is going to be talking on January 28 in Newport Beach, California, as part of The Open Group Conference.

You will hear more from Michael and others, the global leaders on big data who are going to be gathering to talk about business transformation from big data at this conference. So a big thank you to Michael for joining us in this fascinating discussion. I really enjoyed it, and I look forward to your presentation on the 28th.

Cavaretta: Thank you very much.

Gardner: And I would encourage our listeners and readers to attend the conference or follow more of the threads in social media from the event. Again, it’s going to be happening from January 27 to January 30 in Newport Beach, California.

This is Dana Gardner, Principal Analyst at Interarbor Solutions, your host and moderator throughout this thought leadership interview series. Thanks again for listening, and come back next time.
Listen to the podcast. Find it on iTunes. Download the transcript. Sponsor: The Open Group.

Transcript of a BriefingsDirect podcast on how Ford Motor Company is harnessing multiple big data sources to improve products and operations. Copyright The Open Group and Interarbor Solutions, LLC, 2005-2013. All rights reserved.


Monday, January 14, 2013

The Networked Economy Newly Forges Innovation Forces for Collaboration in Business and Commerce, Says Author Zach Tumin

Advanced business networks are driving innovation and social interactions as new technologies and heightened user expectations converge.

Listen to the podcast. Find it on iTunes. Download the transcript. Sponsor: Ariba.

Dana Gardner: Hi, this is Dana Gardner, Principal Analyst at Interarbor Solutions, and you're listening to BriefingsDirect. Today, we present a sponsored podcast discussion on how new levels of collaboration have emerged from an increasingly networked world, and what that now means for business and society.

Gardner
We'll hear from a Harvard Kennedy School researcher and author on how deeper levels of collaboration -- more than ever -- can positively impact how organizations operate. And we'll learn from a global business-commerce network provider how these digital communities are redefining and extending new types of business and collaboration.

To learn more about how new trends in collaboration and business networking are driving innovation and social interactions, please join me now in welcoming Zach Tumin, Senior Researcher at the Science, Technology, and Public Policy Program at the Harvard Kennedy School. Welcome, Zach.

Zach Tumin: Good morning, Dana.

Gardner: Zach, you're also the co-author with William Bratton of this year’s Collaborate or Perish: Reaching Across Boundaries in a Networked World, published by Random House. We welcome you to the show.

Tumin: Thank you.

Gardner: We're also joined today by Tim Minahan, Senior Vice-President of Global Network Strategy and Chief Marketing Officer at Ariba, an SAP company. Welcome back, Tim.

Tim Minahan: Thanks, Dana. Good to be here. [Disclosure: Ariba is a sponsor of BriefingsDirect podcasts.]

Gardner: Gentlemen, let's set the stage here, because we have a really big topic. Zach, in your book "Collaborate or Perish," you're exploring collaboration and you show what it can do when it's fully leveraged. It's very interesting. And Tim, at Ariba you've been showing how a more networked economy is producing efficiencies for business and even extending the bounds of what we would consider commerce to be.

I’d like to start with looking at how these come together. First, we have new types of collaboration and then we have the means to execute on them through these new business networks. What should we expect when these come together? Let's go to you first, Zach.

Tumin: Thanks, Dana. The opportunities for collaboration are expanding even as we speak. The networks around the world are volatile. They're moving fast. The speed of change is coming at managers and executives at a terrific pace. There is an incredible variety of choice, and people are empowered with these great digital devices that we all have in our pockets.
Tumin

That creates a new world, where the possibilities are tremendous for joining forces, whether politically, economically, or socially. Yet it's also a difficult world, where we don't have authority if we have to go outside of our organizations, but where we don't have all the power that we need if we stay within the boundaries of our charters.

So, we're always reaching across boundaries to find people who we can partner with. The key is how we do that. How do we move people to act with us, where we don't have the authority over them? How do we make it pay for people to collaborate?

A lot of change

Gardner: Tim, we've seen lots of change in the last 20 years, and a lot of times we'll see behavioral shifts. Then, at other times, we'll see technology shifts. Today, we seem to be having both come together. Given what Zach has described as an unprecedented level of change and adaptation, where do you see the big payoffs for business in terms of leveraging collaboration in the context of a vast network?

Minahan: Collaboration certainly is the new business imperative. Companies have leaned out their operations over the past couple of years, after spending the previous 30 years focusing on their internal operations and efficiencies, driving greater performance, and getting greater insights.

Minahan
When they look outside their enterprise today, it's still a mess. Most of the transactions still occur offline or through semi-automated processes. They lack transparency into those processes and efficiency in executing them. As a result, there's lots of paper, lots of people, and lots of missed opportunities, whether it's in capitalizing on getting a new product to market or achieving new sales with new potential customers.

What business networks and this new level of collaboration bring is four things. It brings the transparency that’s currently lacking into the process. So you know where your opportunities are. You know where your orders are. You know where your invoices are and what your exposure to payables is.

It brings new levels of efficiencies executing against those processes, much faster than you ever could before, through mostly automated processes. It brings new types of collaboration, which I am sure we will get into later in this segment.

The last part, which I think is most intriguing, is that it brings new levels of insight. We're no longer making decisions blindly. We no longer need to double order because we don’t know whether a shipment is coming in, or stockpile because we can't let the refinery go down. So it brings new levels of insight to make more informed decisions in real time.

Gardner: One of the things I sense, as people grapple with these issues, is a difficulty in deciding where to let creative chaos reign and where to exercise control, where to lock down and exercise traditional IT imperatives around governance, command and control, and systems of record.

Zach, in your book with William Bratton, are there any examples that you can point to that show how some organizations have allowed that creativity of people to extend their habits and behaviors in new ways unfettered and then at the same time retain that all-important IT control?

Tumin: It's a critical question that you’ve raised. We have young people coming into the workforce who are newly empowered. They understand how to do all the things that they need to do without waiting in line and without waiting for authority. Yet they're coming into organizations that have strong cultures and strong command-and-control hierarchies.

There's a clash happening here, and the strong companies are the ones that find the path to embracing the creativity of networked folks within the organization and across their boundaries, while maintaining focus on a set of core deliverables that everyone needs to meet.

Wells Fargo

There are plenty of terrific examples. I will give you one. At Wells Fargo, Steve Ellis was the Executive Vice President responsible for developing the online capability for the wholesale shop. He had to take his group offline to develop the capability, but he had two responsibilities. One was to the bank, which had a history of security and trust. That was its brand. That was its reputation. But he was also looking to the online world, to variability, to choice, and to developing exactly the things that customers want.

Steve Ellis found a way of working with his core group of developers to engage customers in the co-design of Wells Fargo's online presence for the wholesale side. As a result, they were able to develop systems that were so integrated with their customers over time that they could move very, very quickly and adapt as new developments required, and yet they gave free rein to the creativity of the designers, as well as to the customers, in coming to these new ways of doing business.

So here's an example of a pretty staid organization, 150 years old with a reputation for trust and security, making its way into the roiling waters of the networked world and finding a path, through engagement, that helped it prevail in the marketplace over a decade.

Gardner: Tim Minahan, for the benefit of our audience, help us better understand how Ariba is helping to fuel this balance of allowing creativity and new types of collaboration, while at the same time maintaining the important principles of good business.

Minahan: Absolutely, Dana. The problem we solve at Ariba is quite basic, yet one of the biggest impediments to business productivity and performance that still exists. That's around inter-enterprise collaboration or collaboration between businesses.

We talked about the deficits there earlier. Through our cloud-based applications and business network, we eliminate all of the hassles, the paper, the phone calls, and other manual or disjointed activities that companies go through each day to do things like find new suppliers, find new business opportunities as a seller, place or manage orders, collaborate with customers, suppliers, and other partners, or just get paid.

Nearly a million businesses today are digitally connected through the Ariba Network. They're empowered to discover one another in new ways, getting qualifying information from the community, so that they know who that party is even if they haven’t met them before. It's similar to what you see on eBay. When you want to sell your golf clubs, you know that that buyer has a performance history of doing business with others.

They can connect with known trading partners much more efficiently and then automate the processes and the information flows between each other. Then, they can collaborate in new ways, not only to find one another, but also to get access to preferred financing or new insights into market trends that are going on around particular commodities.

That’s the power of bringing a business network to bear in today’s world. It's this convergence of cloud applications, the ability to access and automate a process. Those that share that process share the underlying infrastructure and a digitally connected community of relevant parties, whether that’s customers, suppliers, potential trading partners, banking partners, or other participants involved in the commerce process.

Gardner: Zach, in your book and in your earlier comments, you're basically describing almost a new workforce, and some companies and organizations are recognizing that and embracing it. What’s driving this? What has happened that is basically redefining the workforce and how it relates to itself, to the customer and, in many cases, to businesses across the ecosystem of suppliers, channels, and distribution? What’s behind this fairly massive shift in what workforces are?

It's the demographics

Tumin: It’s in the demographics, Dana. Young people are accustomed to doing things today that were not possible 10 years ago. The digital power is in everyone’s pocket or pocketbook, and markets with digital wallets are ready, willing, and able to deal with them and to welcome them. That means that there’s pressure on organizations to integrate and take advantage of the power that individuals have in the marketplace and bring into the workforce.

Everyone can see what's going on around the world. We're moving to a situation where young people are feeling pretty powerful. They're able to search, find, discover, and become experts all on their own through the use of technologies that 10 years ago weren’t available.

So a lot of the traditional ways of thinking about power, status, and prestige in the workforce are changing as a result, and the organizations that can adapt and adopt these kinds of technologies and turn them to their advantage are the ones that are going to prevail.

Gardner: Tim, with that said, there's this demographic shift, the shift in the mentality of self-started discovery -- recognizing that the information you want is out there, and it’s simply a matter of applying your need to the right data and then executing on some action as a result. Your network seems ready-made for that. I know that you guys have been at this for some time. It seems like these events, these trends, have coalesced in a way that really suits your strengths.

Tell me why you think this vision you had at Ariba a decade or more ago has come about. Is there something fundamental about the Internet, or were you guys just in the right place at the right time?


Minahan: The reality of the community is that it is organic. It takes time to grow. At Ariba, we have more than 15 years of transactional history, relationship history, and community-generated content that we've amassed. In fact, over the past 12 months, those nearly one million connected companies have executed more than $400 billion in purchase, sales, invoice, and payment transactions over the Ariba Network.

Aggregate that over 15 years, and you have some great insights beyond just trading efficiencies for those companies participating there. You can deliver insights to them so that they can make more informed decisions, whether that’s in selecting a new trading partner or determining when or how to pay.

Should I take an early-payment discount in order to reduce my cost basis? From a sales standpoint, or a seller’s standpoint, should I offer an early-payment discount in order to accelerate my cash flow? There are actually a host of examples where companies are taking advantage of this today, and it’s not just the large companies. Let me give you two examples.

From the buyer side, there was a company called Plaid Enterprises. Plaid is a company that, if you have daughters like I do who are interested in hobbies and creating crafts, you are very familiar with. They're one of the leading providers of the do-it-yourself crafts that you would get at your craft store.

Like many other manufacturers, they were a midsized company, but they decided a couple of years ago to offshore their supply. So they went to the low-cost region of China. A few years into it, they realized that labor wages were rising, their quality was declining and, worse than that, it was sometimes taking them five months to get their shipments.

New sources of supply

So they went to the Ariba Network to find new sources of supply. Like many other manufacturers, they thought, "Let’s look in other low-cost regions like Vietnam." They certainly found suppliers there, but what they also found were suppliers here in North America.

They went through a bidding process with the suppliers they found there, armed with the qualifying information on who was doing business with whom and how they had performed in the past, and they wound up selecting a supplier that was 30 miles down the road. They ended up getting a 40 percent cost reduction from what they had previously paid in China, and their lead times were cut from more than 120 days down to 30.

That’s from the buy side. From the sell side, the inverse is true. I'll use an example of a company called Mediafly. It's a fast growing company that provides mobile marketing services to some of the largest companies in the world, large entertainment companies, large consumer products companies.

They were asked to join the Ariba Network to automate their invoicing, and they have gotten some great efficiencies from that. They've gotten transparency into when their invoices are paid, but one other thing was really interesting.

Once they were in the networked environment and had automated those processes, they were able to do what we call dynamic discounting. That means that when they want their cash, they can make offers to the customers they're connected to on the Ariba Network and accelerate their cash flow.

So not only were they able to shrink their quote-to-settle cycle by 84 percent, but they also gained access to new financing and capital through the Ariba Network. So they could go out and hire that new developer to take on that new project, and they were even able to defer their next round of funding, because they have greater control over their cash flow.
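The buy-or-offer decision around those early-payment discounts comes down to simple working-capital arithmetic. As a rough illustration only -- this is not from the podcast or from Ariba's tooling, and the "2/10 net 30" terms below are a hypothetical, standard example -- the following Python sketch shows why such discounts tend to be attractive both to buyers with cash on hand and to sellers who want to accelerate cash flow.

```python
def annualized_discount_rate(discount_pct, discount_days, net_days):
    """Approximate annualized return of taking an early-payment discount.

    Terms like "2/10 net 30" (illustrative assumption, not from the podcast)
    mean a 2 percent discount if paid within 10 days, with the full amount
    otherwise due in 30 days.
    """
    d = discount_pct / 100.0
    days_gained = net_days - discount_days
    # Return earned on the cash paid early, scaled to a 365-day year.
    return (d / (1.0 - d)) * (365.0 / days_gained)


if __name__ == "__main__":
    rate = annualized_discount_rate(2, 10, 30)
    print(f"Paying on 2/10 net 30 terms yields roughly {rate:.1%} annualized")
    # Roughly 37 percent annualized -- far above typical short-term borrowing
    # costs, which is why buyers with cash take the discount and sellers
    # offer it to accelerate their cash flow.
```

Under this standard reading of the terms, the buyer earns roughly a 37 percent annualized return on the cash paid early, which is why networked visibility into invoices and payment timing can be worth real money on both sides of the transaction.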

Gardner: Zach, in listening to Tim, particularly that discovery process, we're really going back to some principles that define being human -- collaboration, word of mouth, sharing information about what you know. It just seems that we have a much greater scale at which we can deploy this. As Tim was saying, you can look to supply chains in China, Vietnam, or in your own neighborhood that you might not have known about, but that you can now discover.

Help me understand why the scale here is important. We can scale up and scale down. How is that fundamentally changing how people are relating in business and society?

Tumin: The scaling means that things can get big in a hurry and they can get fast in a hurry. So you get a lot of volume, things go viral, and you have a velocity of change here. New technologies are introducing themselves to the market. You have extraordinary volatility on your network and that can rumble all the way through, so that you feel it seconds after something halfway around the world has put a glitch in your supply chain. You have enormous variability. You're dealing with many different languages, both computer languages and human languages.

That means that the potential for collaboration really requires coming together in ways that help people see very quickly why it is that they should work together, rather than go it alone. They may not have a choice, but people are still status-quo animals. We're comfortable in the way that we have always done business, and it takes a lot to move us out.

It comes down to people

When crisis hits, it’s not exactly a great time to build those relationships. Speaker of the House Tip O'Neill here in the United States once said, "Make friends before you need them." That’s good advice. We have great technology and we have great networks, but at the end of the day, it’s people that make them work.

People rely on trust, and trust relies on relationships. Technology here is a great enabler, but it’s no silver bullet. It takes leadership to get people together across these networks and then be able to scale and take advantage of what all these networks have to offer.

Gardner: Tim, another big trend today, of course, is the ability to use all of this data that Zach has been describing, and that you've been alluding to, about what’s going on within these networks. With this explosive scale, the amount of that data has likewise exploded.

As we bring more of these coalescing trends together, we have the ability to deal with that scale at a lower cost than ever, and therefore start to create a virtuous, almost viral, benefit effect. What I'm alluding to is that more data means more insight into what’s going on in the network; the more people then avail themselves of that network, the more data they create, and therefore the better the analysis and the more pertinent their efforts are to their goals.

So, am I off in la-la land here, or is there really something that we can point to about a virtuous adoption pattern vis-a-vis the ability to manage this data even as we explode the scale of commerce?

Minahan: We've only begun to scratch the surface on this. When you look at the data that flows through a business commerce network, it’s really three levels. One is the transactional data -- the actual transactions that are going on, knowing what commodities are being purchased and so on. Then there's relationship data, knowing the relationship between a given buyer and seller.

Finally, there's what I would call community data, or community-generated data, and that can take the form of performance ratings -- buyers rating suppliers and suppliers rating buyers. Others in the community can use that to help determine whom to do business with or to help detect risk in their supply chain.

There is also community-generated content, like request for proposal (RFP) templates. A lot of our community members use a "give a template, take a template" type of approach, in which they offer RFP templates that have worked well for them to other members of the community. These can be templates on how to source temp labor or how to source corrugated packaging.

We have dozens and dozens of those. When you aggregate all of this, the last part of the community data is the benchmarking data. It's understanding not just process benchmarking but also spend benchmarking.

One of the reasons we're so excited about getting access to SAP HANA is the ability to offer this information up in real time, at the point of either purchase or sale decision, so that folks can make more informed decisions about who to engage with or what terms to take or how to approach a particular category. That is particularly powerful and something you can’t get in a non-networked model.

Sharing data

Gardner: To that same point, Zach, are there some instances in your book where you can point to this ability to share data across a community -- whether through some sort of cloud apparatus or even a regulatory environment where people are compelled to open up and share -- that is creating new or very substantial benefits?

I am just trying to get at the network effect here, when it comes to exposing the data. I think that we're at a period now where that can happen in ways that just weren’t possible even five years ago.

Tumin: One of the things that we're seeing around the world is that innovation is taking place at the level of individual apps and individual developers. There's a great example in London. London Transport had a data set and a website that people would use to find out where their trains were, what the schedule was, and what was happening on a day-to-day basis.

As we all know, passengers on mass transit like to know what's happening on a minute-to-minute basis. London Transport decided they would open up their data, and the open-data movement is very, very important in that respect. They opened the data and let developers build some apps for folks. A number of app developers did, and put these things out on the system. The demand was so high that, initially, they crashed London Transport's site.

London Transport took their data and put it into the cloud, where they could handle the scale much more effectively. Within a few days, they had gone from a thousand hits per day on the website to 2.3 million in the cloud.

The ability to scale is terribly important. The ability to innovate and turn these open datasets over to communities of developers, to make the data available to people the way they want to use it, is terribly important. And the kinds of industry-government relations that make this possible are critical as well.

So across all those dimensions -- technology, people, politics, and the platform -- the data has to line up. You need governance and support, and people to make it work, to trust each other, and to share information. These are the keys to collaboration today.

Gardner: We're coming up on our time limit, but I wanted to put myself in the place of a listener who might be really jazzed by the potential here, but is still concerned about losing control. How do you take advantage of the extended networks of mobile and social media without losing your basic principles of good business practice and governance?

Is there something that you're seeing, Tim, through your network and the way you're approaching this, that points to a balancing act? What advice can you give to someone who wants to start entering these waters, but not drown or get lost?

Minahan: First, I want to talk about the dynamics that are fueling B2B collaboration. There is certainly the need for more productivity. That's a constant in business, particularly in tight environments, and many times companies are finding they are tapped out within the enterprise.

Becoming more dependent

The second is the leaning out of the enterprise itself, outsourcing more processes, more supply, and more activities to third parties. Companies are becoming more and more dependent on getting insights and collaborating with folks outside their enterprise.

The third is what Zach mentioned before, the changing demographics in the workforce -- the millennials. They're collapsing hierarchical command and control. They don't stand for the sequestering of information among only a given few. They believe in sharing and in the knowledge of crowds. They want more collaboration with their peers, their bosses, and their business partners.

When you take that within a business context and consider how you put controls on it, obviously there needs to be some change, and there is change going on toward this wave of collaboration. Zach said before that it needs a good leader. There is change management involved. Let's not fool ourselves that technology is the only answer.

So policies need to be put down. Just as many businesses put policies down on their social media, there need to be policies put down on how we share information and with whom. But the great thing about technology is that it can enforce those controls. It can help to put in checks and balances and give you full transparency and an audit trail, so you know that these policies are being enforced. You know that there are certain parameters around the security of data.

You don't have those controls in the offline world. When paper leaves the building, you don't know. But when a transaction is shared or when information is shared over a network, you, as a company, have greater control. You have greater insight, and the ability to track and trace.

So there is a balancing act going on: opening the kimono, as we talked about in the '80s, and being able to share more information with your trading partners, but now being able to do it in a controlled environment that is digitized and process-oriented. You have the controls you need to ensure you're protecting your business, while also growing your business.

Gardner: Zach, last word to you. What do we get? What's the payoff if we can balance this correctly -- if we can allow these new wheels of innovation to spin and to scale up, but also apply the right balance, as Tim was describing, with audit trails and access and privilege controls? If we do this right, what's in the offing? Even though it's early in the game, as you pointed out, what's the potential here? When can we expect this payoff?

Tumin: I think you can expect four things, Dana. First, you can expect innovations faster, with ideas that work right away for partners. Partners who collaborate deeply, right from the start, get their products right without too much error built in, and they can get them to market faster.

Second is that you're going to rinse out the cost of rework, whether it's from carrying needless inventory or from handling paper that you shouldn't have to touch, wherever there is cost involved. You're going to be able to rinse that out.

Third is that you're going to be able to build revenues by dealing with risk. You're going to take advantage of customer insight. You're going to make life better and that's going to be good news for you and the marketplace.

Constant learning

The fourth is that you have an opportunity for constant learning, so that insight moves to practice faster. That’s really important, because the world is changing so fast -- you have the volatility, the velocity, the volume, and the variability -- that being able to learn and adapt is critical. That means embracing change, setting out the values that you want to lead by, and helping people understand them.

Great leaders are great teachers. The opportunity of the networked world is to share that insight and loop it across the network, so that people understand how to improve, every day and in every way, the core business processes that they're responsible for.

Gardner: Well, great. I am afraid we'll have to leave it there. I'd like to thank our audience for joining us. We've been discussing new levels of collaboration and how they have emerged within an increasingly networked world and how that's all coming together to impact both business and society.

I’d also like to thank our guests for joining us: Zach Tumin, Senior Researcher at the Science, Technology, and Public Policy Program at the Harvard Kennedy School. He is also the co-author, with William Bratton, of this year's Collaborate or Perish: Reaching Across Boundaries in a Networked World, published by Random House. Thanks so much, Zach.

Tumin: Thank you, Dana.

Gardner: And, of course, Tim Minahan, Senior Vice-President of Global Network Strategy and Chief Marketing Officer at Ariba, an SAP company. Thanks so much, Tim.

Minahan: Thanks, Dana.

Gardner: This is Dana Gardner, Principal Analyst at Interarbor Solutions and you’ve been listening to a sponsored BriefingsDirect broadcast. Thanks again for listening and come back next time.

Listen to the podcast. Find it on iTunes. Download the transcript. Sponsor: Ariba.

Advanced business networks are driving innovation and social interactions as new technologies and heightened user expectations converge. Copyright Interarbor Solutions, LLC, 2005-2013. All rights reserved.
