Wednesday, August 15, 2018

Poor Cloud Utilization and High Complexity Demand a Better Way to Manage and Optimize Multicloud Economics

Transcript of a discussion on how IT leaders face an increasingly complex task of identifying and automating for both the best performance and the best price points across all of their cloud options.

Listen to the podcast. Find it on iTunes. Get the mobile app. Download the transcript. Sponsor: Hewlett Packard Enterprise.

Dana Gardner: Hello, and welcome to the next edition of the BriefingsDirect Voice of the Analyst podcast series.

I’m Dana Gardner, Principal Analyst at Interarbor Solutions, your host and moderator for this ongoing discussion on the latest insights into hybrid IT and multicloud management.

IT architects and operators face an increasingly complex task of identifying and automating for both the best performance and the best price points across their cloud options.

The modern IT services procurement task is made more difficult by the vast choices public cloud providers offer -- literally, hundreds of thousands of service options.

New tools to help optimize cloud economics are arriving, but in the meantime, waste is rampant in the total spend for unchecked cloud computing.

We will now hear from an IT industry analyst about what causes unwieldy cloud use and how new tools, processes, and methods are bringing insights and actionable analysis to gain control over hybrid IT sprawl.

Here to help us explore new breeds of IT management solutions is William Fellows, Founder and Research Vice President at 451 Research. Welcome, William.

William Fellows: Thanks for inviting me.

Gardner: Let’s start at the top. How much waste is really out there when it comes to enterprises buying and using public cloud services?

Fellows: Well, a lot -- and it’s growing daily. Specifically this is because buyers are now spending thousands, tens of thousands, and even in some cases, millions of dollars a month on their cloud services. So, the amount of waste goes up as the bill goes up.

Rein in the cloud waste

As anyone who works in the field can tell you, cost optimization and resource optimization tools can save the average organization about 30 percent on its monthly bill.

If your monthly bill is $100, that’s one amount, but if your monthly bill is a million dollars, then that’s another amount. That’s the kind of wastage, in percentage terms, being seen out there.
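
[In concrete terms, a 30 percent saving on a $100 monthly bill is about $30 a month, while the same rate on a $1 million monthly bill is roughly $300,000 a month.]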

What we are really talking about here is the process, and how it comes to be that there is such waste of cloud resources. These wasteful charges can be reined in fairly easily through better management.

Gardner: What are the top reasons for this lack of efficiency and optimization? Are these just growing pains, that people adopted cloud so rapidly that they lost control over it? Or is there more to it?

Fellows: There are a couple of reasons. At a high level, there is massive organizational dysfunction around cloud and IT. This is driven primarily because cloud, as we know, is usually purchased in a decentralized way at large organizations. That means there is often a variety of different groups and departments using cloud. There is no single, central, and logical way of controlling cost.

Secondly, there is the sheer number of available services, and the resulting complexity of dealing with all of the different nuances -- different image sizes, keeping tabs on who is doing what, and so on. That also underpins this resource wastage.

There isn’t one single reason. And, quite frankly, these things are moving forward so quickly that some users want to get on to the next service advance before they are used to using what they already have.

For organizations fearful of runaway costs, this amounts to a drunken sailor effect, where an individual group within an organization just starts using cloud services without regard to any kind of cost-management or economic insight.
In those cases, cloud costs can spiral dramatically. That, of course, is the fear for the chief information officer (CIO), especially as they are trying to build a business case for accelerating the conversion to cloud at an organization.

Yet the actual mechanisms by which organizations are able to better control and eliminate waste are fairly simple. Even Amazon Web Services (AWS) has a mantra on this: Simply turn things off when they are no longer needed. Make sure you are using the right size of instance, for example, for what you are trying to achieve, and make sure that you work with tools that can turn things off as well as turn things on. In other words, employ services that are flexible.
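
As a concrete illustration of that "turn it off when idle" discipline, the following is a minimal sketch in Python, assuming an AWS environment, the boto3 SDK, and a hypothetical auto-stop=true tag applied to instances that are safe to shut down. The thresholds are arbitrary examples of the practice Fellows describes, not a description of any particular vendor's tool.

# Illustrative sketch only: stop running EC2 instances that carry a
# hypothetical auto-stop=true tag and have been nearly idle for 24 hours.
# Assumes AWS credentials are already configured for boto3.
from datetime import datetime, timedelta, timezone
import boto3

ec2 = boto3.client("ec2")
cloudwatch = boto3.client("cloudwatch")

def idle_tagged_instances(cpu_threshold=5.0, hours=24):
    """Return IDs of running, auto-stop tagged instances whose average CPU
    utilization over the last `hours` is below `cpu_threshold` percent."""
    end = datetime.now(timezone.utc)
    start = end - timedelta(hours=hours)
    idle = []
    pages = ec2.get_paginator("describe_instances").paginate(
        Filters=[
            {"Name": "tag:auto-stop", "Values": ["true"]},
            {"Name": "instance-state-name", "Values": ["running"]},
        ]
    )
    for page in pages:
        for reservation in page["Reservations"]:
            for instance in reservation["Instances"]:
                instance_id = instance["InstanceId"]
                stats = cloudwatch.get_metric_statistics(
                    Namespace="AWS/EC2",
                    MetricName="CPUUtilization",
                    Dimensions=[{"Name": "InstanceId", "Value": instance_id}],
                    StartTime=start,
                    EndTime=end,
                    Period=3600,
                    Statistics=["Average"],
                )
                points = stats["Datapoints"]
                average = sum(p["Average"] for p in points) / len(points) if points else 0.0
                if average < cpu_threshold:
                    idle.append(instance_id)
    return idle

if __name__ == "__main__":
    candidates = idle_tagged_instances()
    if candidates:
        ec2.stop_instances(InstanceIds=candidates)
        print("Stopped idle instances:", candidates)
    else:
        print("No idle tagged instances found.")

The describe and metric calls tell you what you are running and how hard it is working; the stop call is the simplest of the "flexible" actions Fellows mentions, and right-sizing follows the same pattern.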

Gardner: We are also seeing more organizations using multiple clouds in multiple ways. So even if AWS, for example, gives you more insight and clarity into your spend with them, and allows you to know better when to turn things off -- that doesn’t carry across the hybrid environment people are facing. The complexity is ramping up at the same time as spiraling costs.

If there were 30 percent waste occurring in other aspects of the enterprise, the chief financial officer (CFO) would probably get involved. The chief procurement officer (CPO) would be called in to do some centralized purchasing here, right?

Why don’t we see the business side of these enterprises come in and take over when it comes to fixing this cloud use waste problem?

It’s costly not to track cloud costs

Fellows: You are right. In defense of the hyperscale cloud providers, they are now doing a much better job of providing tools for doing cost reporting on their services. But of course, they are only interested in really managing the cost on their own services and not on third-party services. As we transition to a hybrid world, and multicloud, those approaches are deficient.

There has recently been a consolidation around cloud cost reporting and monitoring technologies, leading to a next wave of more forensic resource optimization services that can do this across multiple cloud services.

Coming back to why this isn’t managed centrally, it’s because much of the use and purchasing is so decentralized. There is no single version of the economic truth, if you like, that’s being used to plan, manage, and budget.

Most organizations have one foot in the new world and one still in the old world. They are working with old procurement models and the old ways of accounting, budgeting, and cost reporting, which are unlikely to work in a cloud context.

That’s why we are seeing the rise of new approaches. Collectively these things were called cloud management services or cloud management platforms, but the language the industry is using now is cloud governance. And that implies that it’s not only the optimization of resources, infrastructure, and workloads -- it’s also governance in terms of the economics and the cost. And it’s governance when it comes to security and compliance as well.

Again, this is needed because enterprises want a verifiable return on investment (ROI), and they do want to control these costs. Economics is important, but it’s not the only factor. It’s only one dimension of the problem they face in this conversion to cloud.

Gardner: It seems to me that this problem needs to be solved if the waste continues to grow, and if decentralization proves to be a disadvantage over time. It behooves the cloud providers, the enterprises, and certainly the IT organizations to get control over this. The economics is, as you say, a big part -- not the only part -- but certainly worth focusing on.

Tell me why you created the Digital Economics Unit and the 451 Cloud Price Index at 451 Research. Do you hope to accelerate movement toward a solution to this waste problem?

Carry a cost-efficient basket

Fellows: Yes, thanks for bringing that into the interview. I created the Digital Economics Unit at 451 about five years ago. We produce a range of pricing indicators that help end-users and vendors understand the cost of doing things in different kinds of hosted environments. The first set of indicators is around cloud. So, the Cloud Price Index acts like a Consumer Price Index, which measures the cost of a basket of consumer goods and services over time.

The Cloud Price Index measures the cost of a basket of cloud goods and services over time to determine where prices are going. Of course, five years ago we were just at the beginning of the enormous interest in the relative costs of doing things within AWS versus Azure versus Google, or other places, as firms added services.

We’ve assembled a basket of cloud goods and services and priced that in the market. It provides a real average price per basket of goods. We do that by public cloud, and we do it by private cloud. We do it by commercial code, such as Microsoft and others, as well as via open source offerings such as OpenStack. And we do it across global regions.

That has been used by enterprises to understand whether they are getting a good deal from their suppliers, or whether they are paying over the market rates. For vendors, obviously, this helps them with their pricing and packaging strategies.

In the early days, we saw a big shift [downward] in cloud pricing as the vendors introduced new basic infrastructure services. Recently this has fallen off. Cloud prices are still falling, but they are coming down more slowly.
I just checked, and the basket of goods that we use has fallen this year by about 4 percent in the US. You can still expect Europe and Asia-Pacific, for example, to pay premiums of 10 and 25 percent, respectively, for the same cloud services in those regions.
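
[At those premiums, a service priced at $100 a month in the US would run to roughly $110 a month in Europe and $125 a month in Asia-Pacific.]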

We also provide insight into about a dozen services in those baskets of cloud goods -- not only compute but storage, networking, SQL and NoSQL databases, bandwidth, and all kinds of other things.

Now, if you were to choose the provider that offers the cheapest service in each of those categories -- and you did that across the full basket of goods -- you would actually save 75 percent on the market cost of that basket. It shows that there is an awful lot of headroom in the market in terms of pricing.

Gardner: Let me make sure I understand what that 75 percent represents. That means if you had clarity, and you were able to shop with full optimization on price, you could reduce your cloud bill by 75 percent. Is that right?

Fellows: Correct, yes. If you were to choose the cheapest provider of each one of those services, you would save yourself 75 percent of the cost over the average market price.
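
To see how that kind of gap can arise mechanically, here is a small sketch in Python with entirely hypothetical per-service prices (not Cloud Price Index data). It compares the market-average cost of a basket with the cost of picking the cheapest provider for each service line.

# Illustrative only: hypothetical prices, not Cloud Price Index data.
# Each service maps provider name -> monthly price for a fixed quantity.
basket = {
    "compute": {"provider_a": 120.0, "provider_b": 95.0, "provider_c": 260.0},
    "object_storage": {"provider_a": 30.0, "provider_b": 55.0, "provider_c": 20.0},
    "sql_database": {"provider_a": 210.0, "provider_b": 140.0, "provider_c": 390.0},
    "bandwidth": {"provider_a": 45.0, "provider_b": 25.0, "provider_c": 80.0},
}

# Market-average cost: the mean price of each service across providers.
market_average = sum(
    sum(prices.values()) / len(prices) for prices in basket.values()
)

# Best-of-breed cost: the cheapest provider for every individual service.
cheapest_mix = sum(min(prices.values()) for prices in basket.values())

savings = 1 - cheapest_mix / market_average
print(f"Market-average basket: ${market_average:,.2f}/month")
print(f"Cheapest-per-service basket: ${cheapest_mix:,.2f}/month")
print(f"Savings vs. market average: {savings:.0%}")

With only four services and made-up numbers the gap here is smaller; across the full basket and real market prices, 451's measurement is the 75 percent Fellows cites.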

Gardner: Well, that’s massive. That’s just massive.

Opportunity abounds in cloud space

Fellows: Yes, but by the same token, no one is doing that because it’s way too complex and there is nothing in the market available that allows someone to do that, let alone manage that kind of complexity. The key is that it shows there is a great deal of opportunity and room for innovation in this space.

We feel at 451 Research that the price of cloud compute services may go down further. I think it’s unlikely to reach zero, but what’s much more important now is determining the cost of using basic cloud across all of the vendors as quickly as we can because they are now adding higher-value services on top of the basic infrastructure.

The game is now beyond infrastructure. That’s why we have added 16 managed services to the Cloud Price Index of cloud services. With this you can see what you could expect to be paying in the market for those different services, and by different regions. This is the new battleground and the new opportunity for service providers.

Gardner: Clearly 451 Research has identified a big opportunity for cloud spend improvement. But what’s preventing IT people from doing more on costs? Why is it so difficult to get a handle on the number of cloud services? And what needs to happen next for companies to be able to execute once they have gained more visibility?

Fellows: You are right. One of the things we like to do with the Cloud Price Index is to ask folks, “Just how many different things do you think you can buy from the hyperscale vendors now?” The answer as of last week was more than 500,000 -- there are more than 500,000 SKUs available from AWS, Azure, and Google right now.

How can any human keep up with understanding what combination of these things might be most useful within their organization?

The second wave

You need more than a degree in cloud economics to be able to figure that out. And that’s why I talked earlier about a second wave of cloud cost management tools now coming into view. Specifically, these are around resource optimization, and they deliver a forensic view. This is more than just looking at your monthly bill; it is looking in real time at how the services are performing and then recommending actions on that basis to optimize their use from an economic point of view.

Some of these are already beginning to employ more automation based on machine learning (ML). So, the tools themselves can learn what’s going on and make decisions based on what they learn.
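
As a rough sketch of what that kind of forensic, recommendation-driven optimization looks like at its simplest -- assuming you already collect per-instance CPU utilization samples from your monitoring tooling, and using arbitrary thresholds rather than any vendor's real model -- consider:

# Illustrative only: a naive right-sizing rule, not any vendor's product.
from typing import List

def rightsizing_recommendation(cpu_samples: List[float], current_type: str) -> str:
    """Recommend an action from recent CPU utilization samples (percent).
    Thresholds are arbitrary examples; real tools also weigh memory, I/O,
    burst patterns, reservations, and pricing, often with ML models."""
    if not cpu_samples:
        return current_type + ": no data -- keep as is"
    ordered = sorted(cpu_samples)
    p95 = ordered[int(0.95 * (len(ordered) - 1))]  # rough 95th percentile
    if p95 < 5:
        return current_type + ": idle -- consider stopping or terminating"
    if p95 < 30:
        return current_type + ": underused -- consider a smaller instance size"
    if p95 > 85:
        return current_type + ": saturated -- consider a larger instance size"
    return current_type + ": within normal range -- no change"

# Example usage with made-up samples.
print(rightsizing_recommendation([2.1, 3.4, 1.9, 4.2], "m5.2xlarge"))
print(rightsizing_recommendation([55.0, 61.2, 48.9, 70.3], "m5.large"))

The ML-driven tools Fellows describes replace rules like these with models learned from the environment, but the shape of the output -- a concrete, per-resource recommendation -- is the same.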

There is a whole raft of vendors we are covering within our research here. I fully expect that, like the initial wave of cloud cost-reporting tools, which have largely been acquired, these newest tools will probably go the same way. This is because the IT vendors are trying to build out end-to-end cloud governance portfolios, and they are going to need this kind of introspection and optimization as part of their offerings.

Gardner: As we have seen in IT in the past, oftentimes we have new problems, but they have a lot in common with similar waves of problems and solutions from years before. For example, there used to be a lot of difficulty knowing what you had inside of your own internal data centers. IT vendors came to the rescue with IT management tools, agent-based, agentless, crawling across the network, finding all the devices, recognizing certain platforms, and then creating a map, if you will.

So we have been through this before, William. We have seen how IT management has created the means technically to support centralization, management, and governance over complexity and sprawl. Are the same vendors who were behind IT management traditionally now extending their capabilities to the cloud? And who might be some of the top players that are able to do that?

Return of the incumbents

Fellows: You make a very relevant point because, although it has taken them some time, the incumbents -- the systems management vendors -- are rearchitecting and reengineering. Either through organic, in-house development, through partnership, or through acquisition, they are extending and remodeling their environments for the cloud opportunity.

Many of them have now assembled real and meaningful portfolios, whether that’s Cisco, BMC, CA, HPE, or IBM, and so on. Most of these folks now have a good set of tools for doing this, but it has taken them a long time.

Sometimes some of these firms don’t need to do anything for a number of years and they can still come out on top of this market. One of the questions is whether there is room for long-term, profitable, growing, independent firms in this area. That remains to be seen.

The most likely candidates are not necessarily the independent software vendors (ISVs). We might think of RightScale as one of the longest-serving folks in the market. But, instead, I believe it will be solved by the managed service providers (MSPs).

These are the folks providing ways for enterprises to achieve a meaningful conversion to cloud and to multiple cloud services. In order to be able to do that, of course, they need to manage all those resources in a logical way.

There is a new breed of MSPs coming to the market that are essentially born in the cloud, or cloud-native, in their approach -- rather than the incumbent vendors, who have bolted this [new set of capabilities] onto their environments.

One of the exceptions is HPE, because of what they have done by selling most of their legacy software business to Micro Focus. They have actually come from a cloud-native starting place for the tooling to do this. They have taken a somewhat differentiated approach from the other folks in the market, who have really been assembling things through acquisition.
The other folks in the market are the traditional systems integrators. It’s in their DNA to be working with multiple services. That may be Accenture, Capgemini, and DXC, or any of these folks. But, quite frankly, those organizations are only interested in working with the Global 1000 or 2000 companies. And as we know, the conversion to cloud is happening across all industries. There is a tremendous opportunity for folks to work with all kinds of companies as they are moving to the cloud.

Gardner: Again, going back historically in IT, we have recognized that having multiple management points solves only part of the problem. Organizations quickly tend to want to consolidate their management and have a single view, in this case, of not just the data center or private cloud, but all public clouds, so hybrid and multicloud.

It seems to me that having a single point across all of the hybrid IT continuum is going to be an essential attribute. Is that something you are seeing in the market as well?

More is better

Fellows: Yes, it is, although, I don’t think there is any one company or one approach that has a leadership position yet. That makes this point in time more interesting but somewhat risky for end users. That is why our counsel to enterprises is to work with vendors who can offer a full and a rich set of services.

The more things that you have, the more you are going to be able to undertake and navigate this journey to the cloud -- and then support the digital transformation on top.

Working with vendors that have loosely-coupled approaches allows you to take advantage of a core set of native services -- but then also use your own tools or third-party services via application programming interfaces (APIs). It may be a platform approach or it may be a software-as-a-service (SaaS) approach.

At this point, I don’t think any of the IT vendor firms have sufficiently joined up these approaches to be able to operate across the hybrid IT environment. But it seems to me that HPE is doing a good job here in terms of bringing, or joining, these things together.

On one side of the HPE portfolio is the mature, well-understood HPE OneView environment, which is now being used to provide a software-defined way of provisioning infrastructure. The other piece is the HPE OneSphere environment, which provides API-driven management for applications, services, workloads, and the whole workspace and developer piece as well.

So, one is coming top-down and the other one bottom-up. Once those things become integrated, they will offer a pretty rich way for organizations to manage their hybrid IT environments.

Now, if you are also using HPE’s Synergy composable infrastructure, then you are going to get an exponential benefit from using those other tools. Also, the Cloud Cruiser cost-reporting capability is now embedded into HPE OneSphere. And HPE has a leading position in the new hardware consumption model -- pay-per-use hardware services -- via its HPE GreenLake Hybrid Cloud offering.

So, it seems to me that there is enough here to appeal to many interests within an organization, but crucially it will allow IT to retain control at the same time.

Now, HPE is not unique. It seems to me that all of the vendors are working to head in this general direction. But the HPE offering looks like it's coming together pretty well.

Gardner: So, a great deal of maturity is left to go. Nonetheless, the cloud-governance opportunity appears big enough to drive a truck through. If you can bring together an ecosystem and a platform approach that appeals to those MSPs and systems integrators, works well in the large Global 2000, but also has a direct role for small and medium businesses -- that’s a very big market opportunity.

I think businesses and IT operators should begin to avail themselves of learning more about this market, because there is so much to gain when you do it well. As you say, the competition is going to push the vendors forward, so a huge opportunity is brewing out there.

William, what should IT organizations be doing now to get ready for what the vendors and ecosystems bring out around cloud management and optimization? What should you be doing now to get in a position where you can take advantage of what the marketplace is going to provide?

Get your cloud house in order

Fellows: First and foremost, organizations now need to be moving toward a position of cloud-readiness. And what I mean is understanding to what extent applications and workloads are suitable for moving to the cloud. Next comes undertaking the architecting, refactoring, and modernization. That will allow them to move into the cloud without the complexity, cost, and disruption of the first-generation lift-and-shift approaches.

In other words, get your own house in order, so to speak. Prepare for the move to the cloud. It will become apparent that some applications and workloads are suitable for some kind of services deployment, maybe a public cloud. Other types of apps and workloads are going to be more suited to other kinds of environments, maybe a hosted private environment.

You are then also going to have applications that you want to take advantage of at the edge, for the Internet of Things (IoT), and so on. You are going to want a different set of services for that as well.

The challenge is going to be working with providers that can help you with all of that. One thing we do know is that most organizations are accessing cloud services via partners. In fact, in AWS’s case, 90 percent of Fortune 100 companies that are its customers are accessing its services via a partner.

And this comes back to the role and the rise of the MSP who can deliver value-add by enabling an organization to work and use different kinds of cloud services to meet different needs -- and to manage those as a logical resource.

That’s the way I think organizations need to approach this whole cloud piece. Although we have been doing this for a while now -- AWS has had cloud services for 11 years -- the majority of the opportunity is still ahead of us. Up until now, it has really still only been the early adopters who have converted to cloud. That’s why there is such a land grab underway at present to be able to capture the majority of the opportunity.
Gardner: I’m sure we could go on for another 30 minutes on just one more aspect of this, which is the skills part. It appears to me there will be a huge need for the skills required to manage cloud adoption across economics and procurement best practices -- as well as the technical side. So perhaps a whole new class of people is needed within companies, with backgrounds in economics, procurement, IT optimization and management methods, as well as a deep understanding of the cloud ecosystem.

Develop your skills

Fellows: You are right. 451’s Voice of the Enterprise data shows that the key barrier to accelerating adoption is not technology -- but a skills shortage. Indeed, that’s across operations, architecture, and security.

Again, I think this is another opportunity for the MSPs, to help upskill a customer’s own organization in these areas. That will be a driver for success, because, of course, when we talk about being in the cloud, we are not talking so much about the technology -- we are talking about the operating model. That really is the key here.

That operating model is consumption-based, services-driven, and has the discipline of a retail model. It’s more than a shift from CAPEX to OPEX, and more than a move from hardwired to agile -- it’s all of those things, and that really means the transformation of enterprises and organizations. It’s the most difficult and challenging thing going on here.

Whatever an IT supplier can do to assist end-customers with that, to rotate to that new operating model, is likely to be more successful.

Gardner: I’m afraid we will have to leave it there. We have been exploring how IT architects and operators face an increasingly complex mix of identifying and automating both the best performance and the best price points across their cloud options.

And we have learned how new breeds of hybrid and multicloud management methods and solutions are bringing new insights and actionable analysis to help gain control over hybrid IT sprawl.

So please join me in thanking our guest, William Fellows, Founder and Research Vice President at 451 Research. Thank you so much, William.

Fellows: Thanks, indeed. Thanks, everyone.

Gardner: Yes, a big thank you to our audience for joining this special BriefingsDirect Voice of the Analyst hybrid IT management strategies interview.

I’m Dana Gardner, Principal Analyst at Interarbor Solutions, your host on this ongoing series of Hewlett Packard Enterprise (HPE)-sponsored discussions. Thanks again for listening. Please pass this along to your IT community, and do come back next time.

Listen to the podcast. Find it on iTunes. Get the mobile app. Download the transcript. Sponsor: Hewlett Packard Enterprise.

Transcript of a discussion on how IT leaders face an increasingly complex task of identifying and automating for both the best performance and the best price points across all of their cloud options. Copyright Interarbor Solutions, LLC, 2005-2018. All rights reserved.


Monday, August 13, 2018

GDPR Forces a Rekindling of the People-Centric Approach to Marketing and Business

Transcript of a discussion on how GDPR impacts how customer data can be used, forcing marketers to rethink digital-only approaches to customer outreach and relationships.

Listen to the podcast. Find it on iTunes. Get the mobile app. Download the transcript. Sponsor: SAP Ariba.

Dana Gardner: Hi, this is Dana Gardner, Principal Analyst at Interarbor Solutions, and you’re listening to BriefingsDirect.

Our next digital business innovation discussion explores how modern marketing is impacted by the General Data Protection Regulation (GDPR). Those seeking to know their customers well are finding that this sweeping new European Union (EU) law forces a dramatic shift in how customer data can be gathered, shared, and protected.

And it means that low-touch marketing by mass data analysis and inference alone will likely need to revert to the good old-fashioned handshake and more high-touch trust-building approaches that bind people to people, and people to brands.

Here to help us sort through a more practical approach of rethinking marketing within the requirements of highly protected data is Tifenn Dano Kwan, Chief Marketing Officer at SAP Ariba. Welcome, Tifenn.

Tifenn Dano Kwan: Thank you, Dana. Very glad to be with you.

Gardner: Now that GDPR is fully in place, it seems that we’ve had to embrace the concept that good privacy is good business. In doing so, it seems that marketers had become too dependent on data-driven and digital means of interacting with their customers and prospects.

Has GDPR done us a favor in marketing -- maybe as an unintended consequence -- when it comes to bringing the human relationships aspect of business back to the fore?

Marketing with soul 

Dano Kwan: GDPR is giving us the ability to remember what marketing is, and who we are as marketers. I think it is absolutely critical to go back to the foundation of what marketing is. If you think about the role of marketing in an organization, we are a little bit the Picassos of companies -- we are the creative souls. We bring the soul back into an organization.

Why? Because we control the narrative, we control the storytelling, and we control the brands. Also, in many ways -- especially over the past couple of years -- we control the data because our focus is understanding the audience and our customers.

With the rise of digital over the past couple of years, data has been the center of a lot of what marketing has been driving. But make no mistake, marketers are creative people. Their passion is in creating amazing stories -- to promote and support sales in the selling process, and, frankly, to be the voice of the customer.

The GDPR law is simply bringing back to the forefront what the value of marketing is. It’s not just controlling the data. We have to go back to what marketing really brings to the table. And go back to balancing the data with the art, the science with the art, and ensuring that we continue to add value to represent the voice of the customer.

Gardner: It must have been tempting for marketers, with the data approach, to see a lot of scalability -- that they could reach a lot more people, with perhaps less money spent. The human touch, the high touch, can be more expensive. It doesn’t necessarily scale as well.

Do you think that we need to revisit cost and scale when it comes to this human and creative aspect of marketing?

Balancing high- and low-touch points 

Dano Kwan: It’s a matter of realigning the touch points and how we consider touch points when we drive marketing strategies. I don’t think that there is one thing that is better than the other. It’s a matter of sequencing and orchestrating the efforts when we run marketing initiatives.

If you think about the value of digital, it’s really focused on the inbound marketing engine that we have been hearing about for so many years now. Every company that wants to scale has to build an inbound engine. But in reality, inbound is a long-term strategy; it doesn’t necessarily provide a short-term gain from a marketing or pipeline standpoint. It needs to be built upon a long-term strategy around inbound searches, such as paid media search, and so on. Those very much rely on data.

While we need to focus on these low-touch concepts, we also need to recognize that the high-touch initiatives are equally important.

Sometimes marketing can be accused of being completely disconnected from the customers because we don’t have enough face-to-face interactions. Or of creating large events without an understanding of high-touch. GDPR is an opportunity like never before for marketers to deeply connect with customers.

Gardner: Let’s step back and explain more about GDPR and why the use of data has to be reevaluated.

GDPR is from the EU, but any company that deals with the supply chains that enter the European Union -- one of the largest trading blocs in the world -- is impacted. Penalties can be quite high if you don’t treat data properly, or if you don’t alert your customers if their private data has been compromised in any way.

How does this reduce the amount that marketers can do? What’s the direct connection between what GDPR does and why marketers need to change?

Return to the source 

Dano Kwan: It’s a matter of balancing the origins of a sales pipeline. If you look at the sources of pipeline in an organization, whether it’s marketing-led or sales-led, or even ecosystem- or partner-led, everybody is specifically tracking the sources of pipeline.

What we call the marketing mix includes the source of the pipeline and the channels of those sources. When you look at pure inbound strategies, you can see a lot of them coming out of digital properties versus physical properties.

We need to understand the impact [of GDPR] and acknowledge a drop in the typical outbound flow, whether it’s telemarketing, inside sales, or the good-old events, which are very much outbound-driven.

Over the next couple of months there is going to be a direct impact on all sources of pipeline. At the very least, we are going to have to monitor where the opportunities are coming from. Those who are going to succeed are those who are going to shift the sources of the pipeline and understand over time how to anticipate the timing for that new pipeline that we generate.

We are absolutely going to have to make a shift. Like I said, inbound marketing takes more time, so those sources of pipeline are more elongated in time versus outbound strategies. Some readjustment needs to happen, but we also need new forms of opportunities for business.

That could mean going back to old-fashioned direct mail, believe it or not -- this is back in fashion, and it is going to happen all over again. But it also means new ways of doing marketing, such as influencer marketing.

If you think about the value of social media and blogs, all those digital influencers in the world are going to have a blast, because today if you want to multiply your impact, and if you want to reach out to your audiences, you can’t do it just by yourself. You have to create an ecosystem and a network of influencers that are going to carry your voice and carry the value for you. Once they do that they tap into their own networks, and those networks capture the audiences that you are looking for. Once those audiences are captured through the network of influencers, you have a chance to send them back to your digital properties and dotcom properties.

We are very excited to see how we can balance the impact of GDPR, but also create new routes and techniques, to experiment with new opportunities. Yes, we are going to see a drop in the traditional sources of pipeline. It’s obvious. We are going to have to readjust. But that’s exciting, it’s going to mean more experimentation or thinking outside of the box and reinventing ourselves.

Opportunity knocks, outside the box 

Gardner: And how is this going to be different for business-to-consumer (B2C) and business-to-business (B2B)? We are seeing a lot of influencer marketing that is effective for consumer and some retail; is it just as effective in the B2B space? How should B2B marketers be thinking differently?

Dano Kwan: I don’t know that it’s that different, to be honest with you, Dana. I think it’s the same thing. I think we are going to have to partner a lot more with what I call an ecosystem of influencers, whether it be partners, analysts, press, bloggers or very strong influencers who are extremely well-networked.

In the consumer world, the idea is to multiply the value. You are going to see a lot more partnerships, such as co-branding initiatives, on the rise -- where two brands come together, carrying the power of their message to reach out to joint customers.

Gardner: As an observer of SAP Ariba over the past several years, it’s been very impactful for me to see how the company has embraced the notion of doing good while doing well in terms of the relationship with customers and the perception of the company. I think your customers have received this very well.

Is there a relationship between this new thinking of marketing and the idea of being a company that’s perceived as being a good player, a good custodian in their particular ecosystems?

Purpose-driven pipelines

Dano Kwan: It’s a great question, Dana. I think those two things are happening at the same time. We are moving toward being more purposeful because the world simply is moving toward becoming more purposeful. This is a trend we see among buyers in both the B2C and B2B worlds. They are extremely sensitive to those notions -- especially millennials. They look at the news and they truly worry for their future.

The end-goal here is to remind ourselves that companies are not just here to make a profit -- they are here to make a difference.

GDPR is shifting the focus of marketing within companies to where we are not just seeking data to reach out to audiences -- but to be meaningful and purposeful when we reach out to our customers. We must not only provide content; we have to give them something that aligns with their values and ignites their passions.

So, those two things are connected to each other. I think this is going to accelerate the value of purpose and the value of meaningful conversations with our customers that are truly based -- not just on profit and data -- but on making a difference in the world, and that is a beautiful thing.

Gardner: Do you think, Tifenn, that we are going to see more user conferences -- perhaps smaller ones, more regional, more localized -- rather than just once a year?

Dano Kwan: I think that we are going to see some readjustments. Big conferences used to happen in Europe and North America, but think about the emerging markets -- think about Latin America, Asia-Pacific and Japan, and the Middle East. All of those regions are growing, and they are getting more connected.

In my organization, I am pushing for it. People don’t necessarily want to travel long distances to go to big conferences. They prefer local interaction and messaging. So regionalization and localization -- from messaging to marketing activities -- are going to become a lot more prominent, in my opinion, in the coming years.

Gardner: Another big trend these days is the power that artificial intelligence (AI) and machine learning (ML) can bring to solve many types of problems. While we might be more cautious about what we do with data – and we might not get the same amount of data under a GDPR regime -- the tools for what we can do with the data are much stronger than before.

Is there some way in which we can bring the power of AI and ML into a creative process that allows a better relationship between businesses and consumers and businesses and businesses? How does AI factor into the next few years in a GDPR world?

AI gets customers 

Dano Kwan: AI is going to be a way for us to get more quality control in the understanding of the customer, definitely. I think it is going to allow us to learn about behaviors and do that at scale.

Business technologies and processes are going to be enabled through AI and ML; that is obvious, all of the studies indicate it. It starts with obvious sectors and industries, but it’s going to expand drastically because it informs more curiosity in the understanding of processes and customers.

Gardner: Perhaps a way to look at it would be that aggregated data and anonymized data will be used in an AI environment in order to then allow you to get closer to your customer in that high-touch fashion. Like we are seeing in retail, when somebody walks into a brick-and-mortar environment, a store, you might not know them individually, but you have got enough inference from aggregated data to be able to have a much better user experience.

Dano Kwan: That’s exactly right. I think it’s going to inform the experience in general, whether that experience is communicated through marketing or via face-to-face. At the end of the day, and you are right, the user experience affects everything that we do. Users can get very specific about what they want. They want their experiences to be personal, to be ethical, to be local, and regionalized. They want them to be extremely pointed to their specific needs.

And I do believe that AI is going to allow us to get rapidly attuned to the customer experience and constantly innovate and improve that experience. So in the end, if it’s just the benefit of providing a better experience, then I say, why not? Choose the tools that offer a superior experience for our customers.

I believe that the face-to-face approach, especially when you have complex interactions with customers, is still going to be needed. That real touch point is going to be necessary in complex engagements with customers.

But AI can also help prepare for those types of complex interactions. It really depends on what you sell, what you promote. If you promote a simple solution or thing that can be triggered online, then AI is simply going to accelerate the ability for the customer to click and purchase.

But if you go with very complex sales cycles, for example, that require human interactions, you can use AI to inform a conversation and be prepared for a meeting where you have activated data to present in front of your customer and to support whatever value you want to bring to the customer.

Gardner: We are already seeing that in the help-desk field where people who are fielding calls from customers are much better prepared. It makes the agents themselves far more powerful.

How does this all relate to the vast amount of data and information you have in the Ariba Network, for example? Being in a position of having a lot of data but being aware that you have to be careful about how you use it, seems to me the best of all worlds. How does the Ariba Network and the type of data that you can use safely and appropriately benefit your customers?

Be prepared, stay protected

Dano Kwan: We have done extensive work at the product level within SAP Ariba to prepare for GDPR. In fact, our organization is one of the most prepared -- not only to be compliant from a GDPR standpoint itself, but to offer solutions that enable our customers to become compliant as well.

That’s one of the strengths [that come] not just from the Ariba Network, but also [from] the solutions that we bring to the industry and to our customers.

The Ariba Network has a lot of data that is specific to the customer. GDPR is simply reinforcing the fact that data has to be protected, that all companies, including SAP Ariba -- and all supply chain and procurement organizations in the world -- have to be prepared for it, to work toward respect of privacy, consent, and ensuring that the data is used in the right way. SAP Ariba is absolutely partnering with all the suppliers and buyers in the network and preparing for this.

Gardner: If you’re a marketing executive and you weren’t necessarily thinking about the full impact of GDPR, do you have some advice now that you have thought this through? What should others who are just beginning that process be mindful of?

Dano Kwan: My single biggest piece of advice is to really focus on knowledge transfer within the organization. GDPR is a collective responsibility. It is not just a marketing responsibility; the sales teams, the customer-facing teams -- whether it’s support services, presales, or sales -- everybody has to be prepared. The knowledge transfer is absolutely critical: it has to be clear, it has to be simple, and equipping the field within your organization is essential. So that’s number one, internally.

But the positioning with the external contributors to your business is also critical. Ensuring that GDPR is well understood by external suppliers and agencies, from a marketing standpoint, as well as by all of your partners, is equally important.

Prepare by doing a lot of knowledge transfer on what GDPR is, what its impact is, and what’s in it for each constituent of the business. Also, explore how people can connect and communicate with customers. Learn what they can do, what they can’t do. This has to be explained in a very simple way and has to be explained over and over and over again because what we are seeing is that it’s new for everyone. And one launch is not enough.

Over the next couple of months all companies are going to have to heavily invest in regular knowledge-transfer sessions and training to ensure that all of their customer-facing teams -- inside the organization or outside -- are very well prepared for GDPR.

Gardner: I’m afraid we’ll have to leave it there. You’ve been listening to a sponsored BriefingsDirect discussion on how the sweeping new European Union law GDPR forces a dramatic shift in how customer data can be gathered, shared, and protected.

And we have learned how low-touch marketing will likely revert to include more of the good old-fashioned handshake and personal trust-building methods that bind people to people -- and people to brands.

So, a big thank you to our guest, Tifenn Dano Kwan, Chief Marketing Officer at SAP Ariba. Thank you so much, Tifenn.

Dano Kwan: Thank you very much, Dana.

Gardner: And a big thank you to our audience as well for joining this BriefingsDirect digital business innovation interview.

I’m Dana Gardner, Principal Analyst at Interarbor Solutions, your host throughout this series of SAP Ariba-sponsored BriefingsDirect discussions. Thanks again for listening, and do come back next time.

Listen to the podcast. Find it on iTunes. Get the mobile app. Download the transcript. Sponsor: SAP Ariba.

Transcript of a discussion on how GDPR impacts how customer data can be used, forcing marketers to rethink digital-only approaches to customer outreach and relationships. Copyright Interarbor Solutions, LLC, 2005-2018. All rights reserved.
