Monday, December 07, 2020

How to Industrialize Data Science to Attain Mastery of Repeatable Intelligence Delivery

Transcript of a discussion on the latest methods, tools, and thinking around making data science an integral core function of any business.

Listen to the podcast. Find it on iTunes. Download the transcript. Sponsor: Hewlett Packard Enterprise.

Dana Gardner: Hello, and welcome to the next BriefingsDirect Voice of Analytics Innovation podcast series. I’m Dana Gardner, Principal Analyst at Interarbor Solutions, your host and moderator for this ongoing discussion on the latest insights into data science advances and strategy.


Businesses these days are quick to declare their intention to become data-driven, yet the deployment of analytics and the use of data science remains spotty, isolated, and often uncoordinated. To fully reach their digital business transformation potential, businesses large and small need to make data science more of a repeatable assembly line -- an industrialization, if you will, of end-to-end data exploitation.

Stay with us now as we explore the latest methods, tools, and thinking around making data science an integral core function that both responds to business needs and scales to improve every aspect of productivity.

To learn more about the ways that data and analytics behave more like a factory -- and less like an Ivory Tower -- please join me now in welcoming Doug Cackett, EMEA Field Chief Technology Officer at Hewlett Packard Enterprise. Welcome, Doug.

Doug Cackett:
Thank you so much, Dana.

Gardner: Doug, why is there a lingering gap -- and really a gaping gap -- between the amount of data available and the analytics that should be taking advantage of it?

Data’s potential at edge

Cackett: That’s such a big question to start with, Dana, to be honest. We probably need to accept that we’re not doing things the right way at the moment. Actually, Forrester suggests that something like 40 zettabytes of data are going to be under management by the end of this year, which is quite enormous.


And, significantly, more of that data is being generated at the edge through applications, Internet of Things (IoT), and all sorts of other things. This is where the customer meets your business. This is where you’re going to have to start making decisions as well.

So, the gap is two things. It’s the gap between the amount of data that’s being generated and the amount you can actually comprehend and create value from. In order to leverage that data from a business point of view, you need to make decisions at the edge. 

You will need to operationalize those decisions and move that capability to the edge where your business meets your customer. That’s the challenge we’re all looking for machine learning (ML) -- and the operationalization of all of those ML models into applications -- to make the difference. 

Gardner: Why does HPE think that moving more toward a factory model, industrializing data science, is part of the solution to compressing and removing this gap?

Cackett: It’s a math problem, really, if you think about it. If there is exponential growth in data within your business, if you’re trying to optimize every step in every business process you have, then you’ll want to operationalize those insights by making your applications as smart as they can possibly be. You’ll want to embed ML into those applications. 

Because, correspondingly, there’s exponential growth in the demand for analytics in your business, right? And yet, the number of data scientists you have in your organization -- I mean, growing them exponentially isn’t really an option, is it? And, of course, budgets are also pretty much flat or declining.

There's exponential growth in the demand for analytics in your business. And yet the number of data scientists in your organization, growing them, is not exponential. And budgets are pretty much flat or declining.

So, it’s a math problem because we need to somehow square away that equation. We somehow have to generate exponentially more models for more data, getting to the edge, but doing that with fewer data scientists and lower levels of budget. 

Industrialization, we think, is the only way of doing that. Through industrialization, we can remove waste from the system and improve the quality and control of those models. All of those things are going to be key going forward.

Gardner: When we’re thinking about such industrialization, we shouldn’t necessarily be thinking about an assembly line of 50 years ago -- where there are a lot of warm bodies lined up. I’m thinking about the Lucille Ball assembly line, where all that candy was coming down and she couldn’t keep up with it.

Perhaps we need more of an ultra-modern assembly line, where it’s a series of robots and with a few very capable people involved. Is that a fair analogy?

Industrialization of data science

Cackett: I think that’s right. Industrialization is about manufacturing where we replace manual labor with mechanical mass production. We are not talking about that. Because we’re not talking about replacing the data scientist. The data scientist is key to this. But we want to look more like a modern car plant, yes. We want to make sure that the data scientist is maximizing the value from the data science, if you like.

We don’t want to go hunting around for the right tools to use. We don’t want to wait for the production line to play catch up, or for the supply chain to catch up. In our case, of course, that’s mostly data or waiting for infrastructure or waiting for permission to do something. All of those things are a complete waste of their time. 

As you look at the amount of productive time data scientists spend creating value, that can be pretty small compared to their non-productive time -- and that’s a concern. Part of the non-productive time, of course, has been with those data scientists having to discover a model and optimize it. Then they would do the steps to operationalize it.

But maybe doing the data and operations engineering things to operationalize the model can be much more efficiently done with another team of people who have the skills to do that. We’re talking about specialization here, really.

But there are some other learnings as well. I recently wrote a blog about it. In it, I looked at the modern Toyota production system and started to ask questions around what we could learn about what they have learned, if you like, over the last 70 years or so.

It was not just about automation, but also how they went about doing research and development, how they approached tooling, and how they did continuous improvement. We have a lot to learn in those areas.

For an awful lot of the organizations that I deal with, they haven’t had a lot of experience around such operationalization problems. They haven’t built that part of their assembly line yet. Automating supply chains and mistake-proofing things, what Toyota calls jidoka, are also really important. It’s a really interesting area to be involved with.

Gardner: Right, this is what US manufacturing, in the bricks and mortar sense, went through back in the 1980s when they moved to business process reengineering, adopted kaizen principles, and did what Deming and more quality-emphasis had done for the Japanese auto companies.

And so, back then there was a revolution, if you will, in physical manufacturing. And now it sounds like we’re at a watershed moment in how data and analytics are processed.

Cackett: Yes, that’s exactly right. To extend that analogy a little further, I recently saw a documentary about Morgan cars in the UK. They’re a hand-built kind of car company. Quite expensive, very hand-built, and very specialized.

And I ended up almost throwing things at the TV because they were talking about the skills of this one individual. They only had one guy who could actually bend the metal to create the bonnet, the hood, of the car in the way that it needed to be done. And it took two or three years to train this guy, and I’m thinking, “Well, if you just automated the process, and the robot built it, you wouldn’t need to have that variability.” I mean, it’s just so annoying, right?

In the same way, with data science we’re talking about laying bricks -- not Michelangelo hammering out the figure of David. What I’m really trying to say is a lot of the data science in our customers’ organizations is fairly mundane. To get that through the door, get it done and dusted, and give them time to do the other bits of finesse using more skills -- that’s what we’re trying to achieve. Both [the basics and the finesse] are necessary and they can all be done on the same production line.

Gardner: Doug, if we are going to reinvent and increase the productivity generally of data science, it sounds like technology is going to be a big part of the solution. But technology can also be part of the problem.

What is it about the way that organizations are deploying technology now that needs to shift? How is HPE helping them adjust to the technology that supports a better data science approach?

Define and refine

Cackett: We can probably all agree that most of the tooling around MLOps is relatively young. We see two types of company: those that haven’t yet gotten to the stage where they’re trying to operationalize more models -- in other words, they don’t really understand what the problem is yet -- and those that have started but haven’t yet refined the process.

Forrester research suggests that only 14 percent of organizations that they surveyed said they had a robust and repeatable operationalization process. It’s clear that the other 86 percent of organizations just haven’t refined what they’re doing yet. And that’s often because it’s quite difficult. 

Many of these organizations have only just linked their data science to their big data instances or their data lakes. And they’re using it both for the workloads and to develop the models. And therein lies the problem. Often they get stuck with simple things like trying to have everyone use a uniform environment. All of your data scientists are both sharing the data and sharing the computer environment as well.

Data scientists can be very destructive in what they're doing. Maybe overwriting data, for example. To avoid that, you end up replicating terabytes of data, which can take a long time. That also demands new resources, including new hardware.

And data scientists can often be very destructive in what they’re doing. Maybe overwriting data, for example. To avoid that, you end up replicating the data. And if you’re going to replicate terabytes of data, that can take a long period of time. That also means you need new resources, maybe more compute power, and that means approvals, and it might mean new hardware, too.

Often the biggest challenge is in provisioning the environment for data scientists to work on, the data that they want, and the tools they want. That can all often lead to huge delays in the process. And, as we talked about, this is often a time-sensitive problem. You want to get through more tasks and so every delayed minute, hour, or day that you have becomes a real challenge.

The other thing that is key is that data science is very peaky. You’ll find that data scientists may need no resources or tools on Monday and Tuesday, but then they may burn every GPU you have in the building on Wednesday, Thursday, and Friday. So, managing that as a business is also really important. If you’re going to get the most out of the budget you have, and the infrastructure you have, you need to think differently about all of these things. Does that make sense, Dana?

Gardner: Yes. Doug how is HPE Ezmeral being designed to help give the data scientists more of what they need, how they need it, and that helps close the gap between the ad hoc approach and that right kind of assembly line approach?

Two assembly lines to start

Cackett: Look at it as two assembly lines, at the very minimum. That’s the way we want to look at it. And the first thing the data scientists are doing is the discovery.

The second is the MLOps processes. There will be a range of people operationalizing the models. Imagine that you’re a data scientist, Dana, and I’ve just given you a task. Let’s say there’s a high defection or churn rate from our business, and you need to investigate why.

First you want to find out more about the problem because you might have to break that problem down into a number of steps. And then, in order to do something with the data, you’re going to want an environment to work in. So, in the first step, you may simply want to define the project, determine how long you have, and develop a cost center.

You may next define the environment: Maybe you need CPUs or GPUs. Maybe you need them highly available and maybe not. So you’d select the appropriate-sized environment. You then might next go and open the tools catalog. We’re not forcing you to use a specific tool; we have a range of tools available. You select the tools you want. Maybe you’re going to use Python. I know you’re hardcore, so you’re going to code using Jupyter and Python.

And the next step, you then want to find the right data, maybe through the data catalog. So you locate the data that you want to use and you just want to push a button and get provisioned for that lot. You don’t want to have to wait months for that data. That should be provisioned straight away, right?

You can do your work, save all your work away into a virtual repository, and save the data so it’s reproducible. You can also then check the things like model drift and data drift and those sorts of things. You can save the code and model parameters and those sorts of things away. And then you can put that on the backlog for the MLOps team.
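
The drift checks Doug mentions can be sketched in a few lines of Python. This is only an illustrative example -- it is not HPE Ezmeral's actual API -- using the population stability index (PSI), a common way to compare a feature's training-time distribution against its live distribution; the 0.2 threshold is a widely used rule of thumb, not a standard.

```python
import numpy as np

def population_stability_index(expected, actual, bins=10):
    """Compare a feature's training distribution to its live
    distribution; a common rule of thumb flags PSI > 0.2 as drift."""
    # Bin both samples using the training distribution's quantiles
    edges = np.quantile(expected, np.linspace(0, 1, bins + 1))
    edges[0], edges[-1] = -np.inf, np.inf
    e_pct = np.histogram(expected, bins=edges)[0] / len(expected)
    a_pct = np.histogram(actual, bins=edges)[0] / len(actual)
    # Clip to avoid log(0) / division by zero in empty bins
    e_pct = np.clip(e_pct, 1e-6, None)
    a_pct = np.clip(a_pct, 1e-6, None)
    return float(np.sum((a_pct - e_pct) * np.log(a_pct / e_pct)))

rng = np.random.default_rng(0)
train = rng.normal(0, 1, 10_000)      # a feature as seen at training time
live = rng.normal(0.75, 1, 10_000)    # the same feature in production, shifted
psi = population_stability_index(train, live)
print(f"PSI = {psi:.3f}", "-> drift" if psi > 0.2 else "-> stable")
```

A check like this would run on a schedule against production data, with a breach pushing the model back onto the MLOps backlog for retraining.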

Then the MLOps team picks it up and goes through a similar data science process. They want to create their own production line now, right? And so, they’re going to seek a different set of tools. This time, they need continuous integration and continuous delivery (CICD), plus a whole bunch of data tooling to operationalize your model. They’re going to define the way that that model is going to be deployed. Let’s say, we’re going to use Kubeflow for that. They might decide on, say, an A/B testing process. So they’re going to configure that, do the rest of the work, and press the button again, right?
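
The A/B testing step Doug describes can be pictured as a traffic splitter in front of two model versions. The sketch below is hypothetical -- in practice this logic would live in Kubeflow or the serving layer, and the model stubs are made up -- but it shows the core idea: deterministic assignment so each user consistently sees the same version.

```python
import hashlib

def route_ab(user_id: str, treatment_share: float = 0.1) -> str:
    """Deterministically assign a user to the 'A' (current) or 'B'
    (candidate) model so repeat requests see the same version."""
    # Hash the user id into [0, 1) rather than calling random(), so
    # the assignment is stable across requests and across replicas
    h = int(hashlib.sha256(user_id.encode()).hexdigest(), 16)
    bucket = (h % 10_000) / 10_000
    return "B" if bucket < treatment_share else "A"

# Hypothetical stubs standing in for two deployed churn models
models = {"A": lambda features: 0.30, "B": lambda features: 0.27}

def predict_churn(user_id: str, features: dict) -> float:
    version = route_ab(user_id)
    return models[version](features)

assignments = [route_ab(f"user-{i}") for i in range(10_000)]
share_b = assignments.count("B") / len(assignments)
print(f"share routed to B: {share_b:.3f}")  # close to 0.10
```

With outcomes logged per version, the MLOps team can compare the candidate against the incumbent before promoting it to take all the traffic.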

Clearly, this is an ongoing process. Fundamentally that requires workflow and automatic provisioning of the environment to eliminate wasted time, waiting for stuff to be available. It is fundamentally what we’re doing in our MLOps product.

But in the wider sense, we also have consulting teams helping customers get up to speed, define these processes, and build the skills around the tools. We can also do this as-a-service via our HPE GreenLake proposition as well. Those are the kinds of things that we’re helping customers with.

Gardner: Doug, what you’re describing as needed in data science operations is a lot like what was needed for application development with the advent of DevOps several years ago. Is there commonality between what we’re doing with the flow and nature of the process for data and analytics and what was done not too long ago with application development? Isn’t that also akin to more of a cattle approach than a pet approach?

Operationalize with agility

Cackett: Yes, I completely agree. That’s exactly what this is about and for an MLOps process. It’s exactly that. It’s analogous to the sort of CICD, DevOps, part of the IT business. But a lot of that tool chain is being taken care of by things like Kubeflow and MLflow Project, some of these newer, open source technologies. 

I should say that this is all very new -- both the CICD tools themselves and the ancillary tooling that wraps around them. What we’re also attempting to do is allow you, as a business, to bring these new tools and on-board them so you can evaluate them and see how they might impact what you’re doing as your process settles down.

The way we're doing MLOps and data science is progressing extremely quickly. So you don't want to lock yourself into a corner where you're trapped in a particular workflow. You want to have agility. It's analogous to the DevOps movement.

The idea is to put them in a wrapper and make them available so we get a more dynamic feel to this. The way we’re doing MLOps and data science generally is progressing extremely quickly at the moment. So you don’t want to lock yourself into a corner where you’re trapped into a particular workflow. You want to be able to have agility. Yes, it’s very analogous to the DevOps movement as we seek to operationalize the ML model.

The other thing to pay attention to is the changes that need to happen to your operational applications. You’re going to have to change those so they can call the ML model at the appropriate place, get the result back, and then render that result in whatever way is appropriate. So changes to the operational apps are also important.
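
That pattern -- call the model at the decision point, get the result back, render it appropriately -- can be sketched as follows. The endpoint URL, response field, and threshold here are all invented for illustration; the one real design point is the fallback, so the application keeps working if the model service is slow or down.

```python
import json
from urllib import request

SCORING_URL = "http://models.internal/churn/v1/predict"  # hypothetical endpoint

def churn_score(customer: dict, timeout_s: float = 0.2) -> float:
    """Call the deployed model at the decision point in the app;
    fall back to a neutral score if the service is slow or down."""
    payload = json.dumps({"features": customer}).encode()
    req = request.Request(SCORING_URL, data=payload,
                          headers={"Content-Type": "application/json"})
    try:
        with request.urlopen(req, timeout=timeout_s) as resp:
            return json.load(resp)["churn_probability"]
    except (OSError, KeyError, ValueError):
        return 0.5  # neutral fallback keeps the app responsive

def render_offer(customer: dict) -> str:
    # The app renders the result in whatever way is appropriate --
    # here, deciding whether to show a retention offer
    score = churn_score(customer)
    return "retention-offer" if score > 0.7 else "standard-page"

print(render_offer({"tenure_months": 3, "support_calls": 7}))
```

Keeping the call behind a small function like this also makes it easy to swap model versions or serving infrastructure without touching the rest of the application.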

Gardner: You really couldn’t operationalize ML as a process if you’re only a tools provider. You couldn’t really do it if you’re a cloud services provider alone. You couldn’t just do this if you were a professional services provider.

It seems to me that HPE is actually in a very advantageous place to allow the best-of-breed tools approach where it’s most impactful but to also start putting some standard glue around this -- the industrialization. How is HPE in an advantageous place to have a meaningful impact on this difficult problem?

Cackett: Hopefully, we’re in an advantageous place. As you say, it’s not just a tool, is it? Think about the breadth of decisions that you need to make in your organization, and how many of those could be optimized using some kind of ML model.

You’d understand that it’s very unlikely that it’s going to be a tool. It’s going to be a range of tools, and that range of tools is going to be changing almost constantly over the next 10 and 20 years.

This is much more to do with a platform approach because this area is relatively new. Like any other technology, when it’s new it almost inevitably tends to be very technical in implementation. So using the early tools can be very difficult. Over time, the tools mature, with a mature UI and a well-defined process, and they become simple to use.

But at the moment, we’re way up at the other end. And so I think this is about platforms. And what we’re providing at HPE is the platform through which you can plug in these tools and integrate them together. You have the freedom to use whatever tools you want. But at the same time, you’re inheriting the back-end system. So, that’s Active Directory and Lightweight Directory Access Protocol (LDAP) integrations, and that’s linkage back to the data, your most precious asset in your business. Whether that be in a data lake or a data warehouse, in data marts or even streaming applications. 

This is the melting point of the business at the moment. And HPE has had a lot of experience helping our customers deliver value through information technology investments over many years. And that’s certainly what we’re trying to do right now.

Gardner: It seems that HPE Ezmeral is moving toward industrialization of data science, as well as other essential functions. But is that where you should start, with operationalizing data science? Or is there a certain order by which this becomes more fruitful? Where do you start?

Machine learning leads change

Cackett: This is such a hard question to answer, Dana. It’s so dependent on where you are as a business and what you’re trying to achieve. Typically, to be honest, we find that the engagement is normally with some element of change in our customers. That’s often, for example, where there’s a new digital transformation initiative going on. And you’ll find that the digital transformation is being held back by an inability to do the data science that’s required.

There is another Forrester report that I’m sure you’ll find interesting. It suggests that 98 percent of business leaders feel that ML is key to their competitive advantage. It’s hardly surprising then that ML is so closely related to digital transformation, right? Because that’s about the stage at which organizations are competing after all.

So we often find that that’s the starting point, yes. Why can’t we develop these models and get them into production in time to meet our digital transformation initiative? And then it becomes, “Well, what bits do we have to change? How do we transform our MLOps capability to be able to do this and do this at scale?”

Often this shift is led by an individual in an organization. There develops a momentum in an organization to make these changes. But the changes can be really small at the start, of course. You might start off with just a single ML problem related to digital transformation. 

We acquired MapR some time ago, which is now our HPE Ezmeral Data Fabric. And it underpins a lot of the work that we’re doing. And so, we will often start with the data, to be honest with you, because a lot of the challenges in many of our organizations have to do with the data. And as businesses become more real-time and want to connect more closely to the edge, really that’s where the strengths of the data fabric approach come into play.

So another starting point might be the data. A new application at the edge, for example, has new, very stringent requirements for data and so we start there with building these data systems using our data fabric. And that leads to a requirement to do the analytics and brings us obviously nicely to the HPE Ezmeral MLOps, the data science proposition that we have.

Gardner: Doug, is the COVID-19 pandemic prompting people to bite the bullet and operationalize data science because they need to be fleet and agile and to do things in new ways that they couldn’t have anticipated?

Cackett: Yes, I’m sure it is. We know it’s happening; we’ve seen all the research. McKinsey has pointed out that the pandemic has accelerated a digital transformation journey. And inevitably that means more data science going forward because, as we talked about already with that Forrester research, some 98 percent think that it’s about competitive advantage. And it is, frankly. The research goes back a long way to people like Tom Davenport, of course, in his famous Harvard Business Review article. We know that customers who do more with analytics, or better analytics, outperform their peers on any measure. And ML is the next incarnation of that journey.

Gardner: Do you have any use cases of organizations that have gone to the industrialization approach to data science? What is it done for them?

Financial services benefits

Cackett: I’m afraid names are going to have to be left out. But a good example is in financial services. They have a problem in the form of many regulatory requirements.

When HPE acquired BlueData it gained an underlying technology, which we’ve transformed into our MLOps and container platform. BlueData had a long history of containerizing very difficult, problematic workloads. In this case, this particular financial services organization had a real challenge. They wanted to bring on new data scientists. But the problem is, every time they wanted to bring a new data scientist on, they had to go and acquire a bunch of new hardware, because their process required them to replicate the data and completely isolate the new data scientist from the other ones. This was their process. That’s what they had to do.

So as a result, it took them almost six months to do anything. And there’s no way that was sustainable. It was a well-defined process, but it still involved a six-month wait each time.

So instead we containerized their Cloudera implementation and separated the compute and storage as well. That means we can now create environments on the fly, effectively within minutes. It also means that we can take read-only snapshots of data. A read-only snapshot is just a set of pointers, so it’s instantaneous.
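
The "snapshot as a set of pointers" idea can be illustrated with a toy copy-on-write structure. This is not how the underlying data fabric is actually implemented -- it is just a minimal sketch of why creating a snapshot is instantaneous while a data scientist's destructive edits stay isolated from the shared data.

```python
class CopyOnWriteSnapshot:
    """Toy illustration: a snapshot shares the parent's blocks by
    reference (a set of pointers), so creating one is O(1) regardless
    of data size. Only blocks written to afterwards get private copies."""

    def __init__(self, blocks):
        self._blocks = blocks      # shared references, not copies
        self._overrides = {}       # private copies of blocks written to

    def read(self, i):
        return self._overrides.get(i, self._blocks[i])

    def write(self, i, data):
        self._overrides[i] = data  # copy-on-write: the parent is untouched

dataset = ["block-0", "block-1", "block-2"]  # terabytes in real life
snap = CopyOnWriteSnapshot(dataset)          # instantaneous: just pointers
snap.write(1, "scientist's edit")            # destructive work stays private
print(snap.read(1), "/", dataset[1])
```

Because nothing is copied up front, each new data scientist can get an isolated view of terabytes of data in minutes rather than waiting months for a full replica.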

They scaled out their data science without scaling up their costs or the number of people required. They are now doing that in a hybrid cloud environment. And they only have to change two lines of code to push workloads into AWS, which is pretty magical, right?

They were able to scale-out their data science without scaling up their costs or the number of people required. Interestingly, recently, they’ve moved that on further as well. Now doing all of that in a hybrid cloud environment. And they only have to change two lines of code to allow them to push workloads into AWS, for example, which is pretty magical, right? And that’s where they’re doing the data science.

Another good example that I can name is GM Finance, a fantastic example of how having started in one area for business -- all about risk and compliance -- they’ve been able to extend the value to things like credit risk.

But doing credit risk and risk in terms of insurance also means that they can look at policy pricing based on dynamic risk. For example, for auto insurance based on the way you’re driving. How about you, Dana? I drive like a complete idiot. So I couldn’t possibly afford that, right? But you, I’m sure you drive very safely.

But in this use-case, because they have the data science in place it means they can know how a car is being driven. They are able to look at the value of the car at the end of that lease period and create more value from it.

These are types of detailed business outcomes we’re talking about. This is about giving our customers the means to do more data science. And because the data science becomes better, you’re able to do even more data science and create momentum in the organization, which means you can do increasingly more data science. It’s really a very compelling proposition.

Gardner: Doug, if I were to come to you in three years and ask similarly, “Give me the example of a company that has done this right and has really reshaped itself.” Describe what you think a correctly analytically driven company will be able to do. What is the end state?

A data-science driven future

Cackett: I can answer that in two ways. One relates to talking to an ex-colleague who worked at Facebook. And I’m so taken with what they were doing there. Basically, he said, what originally happened at Facebook, in his very words, is that to create a new product in Facebook they had an engineer and a product owner. They sat together and they created a new product.

Sometime later, they would ask a data scientist to get involved, too. That person would look at the data and tell them the results.

Then they completely changed that around. What they now do is first find the data scientist and bring him or her on board as they’re creating a product. So they’re instrumenting up what they’re doing in a way that best serves the data scientist, which is really interesting.

The data science is built-in from the start. If you ask me what’s going to happen in three years’ time, as we move to this democratization of ML, that’s exactly what’s going to happen. I think we’ll end up genuinely being information-driven as an organization.

That will build the data science into the products and the applications from the start, not tack them on to the end.

Gardner: And when you do that, it seems to me the payoffs are expansive -- and perhaps accelerating.

Cackett: Yes. That’s the competitive advantage and differentiation we started off talking about. But the technology has to underpin that. You can’t deliver the ML without the technology; you won’t get the competitive advantage in your business, and so your digital transformation will also fail.

This is about getting the right technology with the right people in place to deliver these kinds of results.

Gardner: I’m afraid we’ll have to leave it there. You’ve been with us as we explored how businesses can make data science more of a repeatable assembly line -- an industrialization, if you will -- of end-to-end data exploitation. And we’ve learned how HPE is ushering in the latest methods, tools, and thinking around making data science an integral core function that both responds to business needs and scales to improve nearly every aspect of productivity.

So please join me in thanking our guest, Doug Cackett, EMEA Field Chief Technology Officer at HPE. Thank you so much, Doug. It was a great conversation.

Cackett: Yes, thanks everyone. Thanks, Dana.

Gardner: And a big thank you as well to our audience for joining this sponsored BriefingsDirect Voice of Analytics Innovation discussion. I’m Dana Gardner, Principal Analyst at Interarbor Solutions, your host for this ongoing series of Hewlett Packard Enterprise-supported discussions.

Thanks again for listening. Please pass this along to your IT community, and do come back next time.


Transcript of a discussion on the latest methods, tools, and thinking around making data science an integral core function of any business. Copyright Interarbor Solutions, LLC, 2005-2020. All rights reserved.


Monday, November 30, 2020

How Transforming Source-to-Pay Procurement Delivers Agility and Improves Outcomes at Zuellig Pharma

Transcript of a discussion on how to bring agility, resilience, and managed risk to the end-to-end procurement process for significantly better overall business outcomes.

Listen to the podcast. Find it on iTunes. Download the transcript. Sponsor: SAP Ariba.

Dana Gardner: Hi, this is Dana Gardner, Principal Analyst at Interarbor Solutions, and you’re listening to BriefingsDirect.


Our next intelligent procurement discussion explores the rationales and results when companies transform how they acquire goods and services. By injecting intelligence, automation, and standardization into the source-to-pay process, organizations are supporting even larger digital business transformation efforts.

Stay with us now as we hear from two pioneers in how to bring agility, resilience, and managed risk to the end-to-end procurement process for significantly better overall business outcomes. To show how organizations can embark on such a procurement and sourcing transformation journey, please join me now in welcoming our guests.

We’re here with Victoria Folbigg, Vice President of Procurement at Zuellig Pharma Holdings. Welcome, Victoria.

Victoria Folbigg: Thank you for having me, Dana.

We’re also joined by Baber Farooq, Senior Vice President of Product Strategy for Procurement Solutions at SAP. Welcome, Baber.

Baber Farooq: Thank you for having me, Dana.

Gardner: Baber, what are the top procurement trends and adoption patterns you’re seeing globally? Why is now such an important time to transform how you’re doing your sourcing and procurement?

Efficiency over productivity

Farooq: When we talk about trends in procurement, the macroeconomic factors governing the world -- particularly in this COVID-induced economy -- need to be kept in mind. Not only are we in a very dynamic situation, but the current scenario is also reshaping the procurement function and profession. This changing, evolving time presents an opportunity for procurement professionals to impact their businesses company-wide.

Firstly, if you look at the world, some of these trends existed prior to COVID hitting -- but I think they have accelerated. For the past 10 years, we’ve had declining productivity growth across the world. You can slice this by industry or by geography, but in general -- despite the technological advances from cloud computing, mobile technologies, and so on -- organizations are not becoming more productive from a labor perspective.

This trend has existed for 10 to 15 years, but we really started seeing flattening over the past two to three years, particularly in the G7 countries. Now, it’s interesting because that past 10 years or so also correlates with some of the greatest economic expansion that the world has experienced. When things are going well, you can kind of say, “Yeah, productivity may not necessarily be so important.” But now that we’re in this unfortunate recession of remarkable scale, efficiency is going to become more and more important.

The second trend is we know that the digital economy has been expanding in this new millennium. It’s been expanding rapidly, and by all indications that trend will further accelerate in this new COVID-normal that everyone is trying to come to grips with. We have seen this in terms of our daily lives being disrupted and how digital tools have helped us to remain functional. Sometimes circumstances in the world that change everything become fuel for transformation. And to a large extent, I think the expansion of the digital economy will end up continuing and accelerating and procurement will play a significant role in that.

The third trend I see is this concept that The Economist has dubbed slowbalisation: despite the past 30 years of increasing globalization, we saw -- even prior to COVID -- a slowdown in globalization due to trade wars and nationalistic tendencies.

Post-COVID, I think organizations will ask the question, “Hey, this complicated global supply chain that has existed puts me at risk like I never thought of before if there’s something disruptive in the market.”

So, expect more focus on nearshore manufacturing across many industries. It’s going to become more prevalent from a goods perspective when we talk about trade. On the flip side, from a services perspective, digitization will actually allow for more cross-border services to be provided -- including things we never thought we could do cross-border before.

It will be a very interesting shift to see how the world changes with these trends, and how that impacts procurement. It doesn’t take a lot of reflection to see where synergies exist. If organizations are going to operate manufacturing differently than they have before, if supply chains will be structured differently, and if we engage in services procurement differently -- in all of those conversations, procurement is going to play a pivotal and central role.

And how are you going to try to come out of this productivity glut? If the promise of the artificial intelligence (AI) is going to help us come out of this productivity glut, then procurement is going to play a central role in how we use suppliers as key co-innovation partners. It means a very different lens about how you manage a relationship with your supplier base than we’ve done traditionally.

So those are some of the key factors in how procurement is going to evolve over the next five to 10 years. The macroeconomic factors are the driving forces. The more that procurement professionals focus on providing solutions to their organizations around these areas, the more impactful they can be. These are very different from the traditional metrics we’ve had around cost savings. Those are still important, of course, don’t get me wrong. But if I think about how the procurement profession is changing along with those trends, it’s going to be around these areas.

Medicine thinks globally, supplies locally

Gardner: Victoria, at your organization, Zuellig Pharma, are you also seeing these trends? Tell us about your organization and what you’ve been doing with procurement transformation?

Folbigg: Zuellig Pharma is one of the largest healthcare services groups in Asia. And as a pharma services company we distribute medicine in Asia. We are present in at least 13 countries, or what we call markets, in Asia. We also have clinical trials distribution throughout the world.

We realized pretty early on with all of the distribution capabilities and contact with healthcare professionals across Asia that we have a lot of data about drug purchasing preferences, which we are actively monetizing. We also have a significant role to play in ensuring the right medicines go to the market, which means preventing counterfeits and parallel trades.

Zuellig Pharma is not only enabling improved drug distribution, we also do vaccine distribution. In some of the bigger countries, for example, we take flu vaccines and distribute them to the various state hospitals and schools. It’s now very exciting for us to possibly be at the forefront of COVID-19 vaccine distribution. We are very busy figuring out how to make that possible across Asia.

Building on Baber’s points on globalization, which I found very relevant, there is a clear trend in the supply of goods to move away from globalization. We saw that even with the supply of personal protective equipment (PPE) from China, due to concerns about buying from China as well as the many customs issues for goods going in and out. People are naturally now looking for supply sources closer to their countries. We are seeing that as well.

Baber also spoke about the globalization of services. This is fabulous and very exciting, and we are seeing that. For example, when I now negotiate contracts with consulting companies, I begin by telling them there is no need for travel. And so why don’t you put your best team on my project? I don’t even need your team to be in Asia.

And that makes them take a breath and step back and say, “Oh my God. You know, we had these natural differences between regions and different companies in the services industry.” That is breaking down because customers are expecting the best people on the job anywhere. I completely see that in my daily work now.

Gardner: Baber also mentioned the impact of AI, with more data-driven decision-making. While we’re grappling with rapid changes in sourcing and the requirements of rapid vaccination distribution and logistics, for example, how are the latest technologies helping you transform your procurement?

AI for 100 percent reliability

Folbigg: It’s an interesting and complex subject. When I talk to my peers on the manufacturing side -- and again we’re not a manufacturing company – they oversee a lot of direct spend. I see them embracing data-driven, AI-driven procedures and analysis.

With services industries, and with indirect procurement, it is much more difficult, I believe. And hence we are not so much at the forefront of AI thinking. Also, because we’re in Asia, I’m wondering whether there are enough databases and rules to be able to make the right decisions.

For example, if I want to find a different source among suppliers very far away, I would normally rely on a database that would go through millions of sources for a supplier. If the supplier, though, were a local company, I might not find any relevant databases. So the challenges we have in Asia are about getting data that can be analyzed and then drawing insights from it.

Other more established technologies like robotic process automation (RPA) and chatbots are filling holes because people need support. As Baber said, the labor force is getting more expensive even in Asia. So having a robot do a menial task can be much better and more efficient than hiring somebody to do it.

Gardner: Baber, how common are these challenges around data and analytics that Zuellig Pharma is grappling with?

Farooq: What Victoria said is so accurate with respect to the challenges that customers are facing in using these technologies. The challenge we have as a technology provider is to make sure that we provide access to these technologies in the most beneficial fashion. AI is a very broad topic that means a lot of different things these days.

The ultimate goal of AI is to provide insights and eliminate tasks while effectively focusing on actual business outcomes, without so much repetition. When Victoria mentions that we can use a lot of this in the direct materials space, it’s because a lot of those are predictable, repetitive tasks.

In the services space, and for indirect materials purchasing, it’s more difficult to grapple with because it’s not as predictable and it’s not as rule-oriented as in other areas. That gets to the true heart of the problem with AI across any space, right? The last mile of AI is very hard. You can make it 90 percent effective, but who is going to trust their business to a robotic or computational process that’s 90 percent effective? Making it 100 percent effective is the real challenge.

This is why we don’t have self-driving cars right now, right? They work great in laboratories. They work great on test tracks. They are driven around deserts. So much advancement and capabilities have happened, but that last mile is still yet to be achieved. And the amount of data needed to make that last mile work is an order of magnitude greater than it is for the first 90 percent of achieving the outcome.

The onus and the burden, frankly, is on companies like SAP to make sure that we can solve this problem for customers like Zuellig so that they can truly trust their business for insights and for the outcome-driven work that they would want the machines to do before they go ahead and say, “Okay, we’re happy with AI.” There were predictions six to seven years ago that dermatologists would not be diagnosing skin cancer anymore because an app would be doing it by taking a photo. That’s not true. It hasn’t happened yet, right? But the potential is still there.

For us, the focus is on the outcomes that professionals are looking for. Let’s see if we can use the data from across the world to drive these outcomes in a sustainable and predictable fashion. This work is research-oriented. It requires focus from companies such as SAP to say that this is where we’re going to take the initiative and actually drive toward this outcome.

The reason we feel SAP is one of the companies that can do this is that we actually have so much data. I mean, if you look at the SAP Business Network and the fact that, just on spend and sourcing events, we’re carrying $20 to $25 trillion worth of procurement over the past 10 years, we believe we have the data that can start making an impact.

We have to prove it, undoubtedly, especially when it comes to niche economies and emerging markets, like Victoria said. But we have a very strong starting point. And, of course, at the same time, we have to be considerate about privacy concerns and General Data Protection Regulation (GDPR) in all of these things. If you’re going to be mining data and then cross-applying the impacts across customer communities, you have to do it in a responsible manner.

So those are the things that we are grappling with. I clearly see there’s a trend here and you will see AI impacting procurement processes before you see AI driving cars on roads. There’s still a lot of work to be done, and there’s still a lot of data that needs to be mined in order to make sure that we’re building something that’s intelligence- and not just rule-based. You can use RPA, for sure, but that’s still rule-based. That’s not true intelligence, and no business is going to actually go ahead and say, “Hey, we’re happy with the insights that the machine is telling us or we’re happy with the machine doing the work of a human if it’s 90 to 95 percent accurate.” It really, really needs to be 99.9 percent accurate for that to happen.

Gardner: And whether we are doing this with AI or traditional data-driven analytics, what we need to deliver more of now are better agility, resilience, and managed risks.

Victoria, tell us about your journey at Zuellig Pharma and why you’re working toward those fundamental goals. How have you gone about revamping your source-to-pay procurement to attain that agility and resilience, and to manage risks?

Strategic procurement journey

Folbigg: Our real strategic sourcing journey started in 2016. I like to call the company a 100-year-old startup because Zuellig Pharma is truly 100 years old. The company was, it’s fair to say, very decentralized in its early stages. It then moved to become more central-to-edge, given the need in Asia’s emerging economies for general management to act much faster when there was a risk or an opportunity. These principles still apply.

But the chief executive saw the need for more strategic procurement, with transparency, visibility, and control of spend, along with accountability. He sponsored the design and setup of a lean procurement function within the Asia region. The first thing we decided to do was put a system in place to better anchor a new, all-encompassing, yet small procurement team. I have been getting this visibility, control, and data through our all-encompassing procurement system, SAP Ariba.

SAP Ariba has also been different because of its ecosystem and because it’s backed by SAP. It has a support network and already-proven technology across Asia. Because of Asian tax rules and the variety of Asian languages, we found when we looked at the market back in 2015-2016 that you needed a system that would grow with you. We needed something anchored very strongly within Asia. From that, we gained control and visibility in stage one of our journey.

The next stage focused on process improvement. Our old key performance indicator (KPI) was about how long it takes to pay an invoice. You need to make the process easier and more user-friendly, but also have controls in place to ensure there is no funds leakage. So, control and visibility are numbers one and two, and process improvement is number three. Next, we will be seeking agility and then insights.

But COVID-19 has shown the need for traditional procurement, too. For example, when it came time that we needed a PPE supplier -- everyone needed one. And it wasn’t a system that helped us, unfortunately. It was more about people knowing people and finding out where there was capacity. That was not done via data-driven insights.

We had to go off system as well because sometimes we didn’t have time to get the supplies through the system. We also didn’t have time to pay the suppliers through the system because it was a supplier’s market: “You can have the shipment of your general masks. You take it or you leave it.”

So very often we had to make this decision within an hour. And in some cases, I would come back to the supplier and say, “I’m ready to buy,” and they’re saying, “Sorry, somebody else offered me twice the price.” This was the reality of procurement last spring. It certainly brought us to the forefront because we needed to report to the CEO what we were doing to protect our business. We’re delivering the medicines to the hospitals. We probably needed this PPE for the drivers even more than the hospitals, and we needed to negotiate to buy that.

This is where the traditional kind of robust procurement systems were breaking down for us, exacerbated by the fact that we do not yet have the right amount of data on Asia translated into English to make these decisions as we would like to. That newer method may be strong and prevalent, of course, in the US and in Europe.

So that tested us quite a lot and it’s shown that we still needed to be rather creative in how we found the best sources. There are building blocks to what the systems allow you to do. And now we’re saying, “Okay, well, how can you give us insights? How can you give us this agility?” I think the systems need to evolve to be topical and to be able to address all of these use cases that came to the fore due to the COVID-19 pandemic.

Gardner: Listening to you reminds me of what Baber said about self-driving cars. You had to revert back to manual during the pandemic.

Folbigg: Bicycles even.

Pandemic forces procurement pivot

Farooq: It’s such a great point. One thing I’ve learned is that the technology and business processes we have constructed over the past 15 to 20 years kind of broke down. When you look at a pandemic of this magnitude -- it’s the greatest disruption in the world since World War II, and the IMF has just estimated how big. When the global financial crisis happened in 2008, the overall impact -- because the emerging economies were not as affected -- was a reduction of 0.1 to 0.2 percent in global GDP. This year we’re seeing a 5 percent global GDP impact. It’s very, very significant.

The scale of the disruption is huge, and you are having these low-probability, high-impact events. Because they don’t happen for a long time, people presume they won’t happen, and they don’t plan for them.

What I’ve learned is, with technology and business processes, you need to keep in mind that one event that might have a 2 to 3 percent chance of happening. You can’t Pareto-analyze it out of the way and not consider it. So it’s one thing to make sure that, of course, you’re not spending time focused on a problem that has a low chance of happening. But at the same time, you have to keep in mind that, “Hey, if one of these events happens, the result could be a complete breakdown.” You can’t ignore it, right? You need to make sure you have that factored into your technology.

So, emergency payment processes, emergency purchase order (PO) processes -- these capabilities need to be built in. You can’t just presume that a perfect setup is going to be available for all circumstances, and that that’s the only thing you’re designing for, particularly when you talk about industries like life sciences.

Gardner: That’s the very character of agility and resiliency -- being able to have exception management for exceptions that you can’t anticipate. And certainly, we have seen that in the last seven months.

Now that we see how important procurement is for a larger organization during a very tumultuous time -- recognizing that we need to have the agility of working with the manual as well as the automatic -- what does the future portend? What will our systems need to now become in order to provide the new definition of agility and resiliency?

Agile systems preempt problems

Folbigg: We need agile systems, and we need to be able to solve specific use cases in order for these systems to become important, viable, and present within our procurement landscape and our many ways of doing business.

It’s not good enough for us when everything reverts back to the system. When there is an issue like a pandemic -- or something that is not necessarily rule-based -- we then need to go off-system, and that marginalizes the importance of the system. I honestly don’t know how you enable a search for suppliers that is largely relationship-based. But there are elements that come from the availability of data, data that is presented in a form that’s easily consumed, especially if the data has to be translated and normalized. That is definitely something the system suppliers can play a role in.

When I look at the system now as the head of procurement, I am not looking at features and functions. I am looking at the problems that I need to solve through a system to enable us to drive the resiliency that the company needs. And if I look at the challenge we have of enabling the potential vaccine distribution across the world, what we are trying to do is not get stuck in the situation we had at the beginning of the year.

We are proactively looking at certain key suppliers to partner with, to develop the system and design the supply chain -- and this is not transactional. This is a highly strategic activity based on human creativity, human network relationships, and trust between the leadership of different companies. It is a completely different design approach.

Now we are all thinking about preempting. How is the technology going to help me with what I am looking forward to? I need to be able to have the basic explanation at my fingertips fast in order for me and my team to concentrate on the real strategic creative kinds of analysis.

Also, we need systems that can give us a lot of modeling and analysis. If you think about my problem now, I can buy freezers and cold storage for vaccines. But what am I going to do with them in five years’ time? You have supplies for the vaccine distribution. And then what?

I think the vaccine will become part-and-parcel of our cold chain and supply chain going forward because COVID-19 is not going to go away. The vaccines potentially are only going to last for a year or two, and you will have to be re-vaccinated. But with all of these high-cost, complex, energy-thirsty capital purchases, how do you do that? Right now everything is done on the spur of the moment. A system that can holistically bring this all together for me would be a huge benefit.

Gardner: That point about being holistic, Baber, must be very important to you at SAP because you’ve been building out so many different systems, business capabilities, and data capabilities. It sounds like SAP might be actually in very good position to come to the rescue of somebody like Victoria, given that she has these pressing needs and wants to instantiate relationships into digital interactions. How SAP can help?

Supply chain for vaccine delivery

Farooq: It’s a privileged position because it’s a complicated problem, but I believe SAP is one of the few companies that can support Zuellig on it. From our perspective, we want to get companies like Zuellig into a position where they can focus on those strategic and creative elements that only humans can do. This is probably one of the most complicated supply chain problems in recent history; the COVID vaccine distribution problem can only be solved through extensive creativity.

When SAP talks about the intelligent enterprise, that just means two very simple things. It means that I give an organization all of the insights and analytics capabilities at their fingertips so that they have the ability to quickly make decisions and pivot when they need to pivot -- and that truly became evident during this pandemic. From our perspective, we have the ability to deliver that.

If you look at all of the different processes that exist across manufacturing, distribution, sourcing, purchasing, procurement, payment -- all of these processes reside and are impacted by some element of SAP’s footprint. And our perspective is to make sure that all these elements can talk to each other. And by talking to each other, they can actively provide all of the data that’s required by organizations like Zuellig so that they can quickly make the decisions they need and focus on the strategic elements they need to focus on.

We don’t want people at Zuellig to be worried about how the POs are going to get raised and what the different steps required for sourcing are. That is very strictly the direction we want to take our products -- and are taking our products -- so that we can offer these solutions for companies like Zuellig.

The example that Victoria gave is just so close to my heart. When I was talking about the decline in productivity growth that the world has experienced over the past 10 years: if we can make procurement more productive as a function, then procurement organizations can make the entire organization more productive. They can actually focus on supplier relationships and co-innovation partnerships with the suppliers that are critical. That has an impact on the entire business.

And no one is better suited to do that than procurement. We just have to get them out of the day-to-day processes of running reports, figuring out what the data says, and focusing on the transactional events, purchase orders, and payments that take place. We need to get them out of those processes so they can leverage their skills in finding the right suppliers and developing the right relationships that make innovation impactful and have an impact on the top line of organizations -- along with the bottom line.

And it is very clearly the direction that we are trying to take as rapidly as possible because we know that the next 12 months are critical in this space.

Gardner: Victoria, what advice could you give to others who are trying to transform their procurement organizations to take advantage of the agility and resilience that are now required? What advice can you offer for folks who might not be quite as far along as you are in your transformation journey?

Educate around procurement

Folbigg: It’s complex because it depends very much on the specific company and how anchored procurement is. But it’s about making sure you find sponsors of the function who really understand the benefits of procurement. Give your team and yourself the job of showing the benefit that strategic procurement can bring.

In this part of the world, we are just now seeing procurement on the university curriculum. Where I worked before, in Europe and the US, it was an established skillset that we would learn in university, and there were courses on it in MBA programs and so on. It’s just starting to anchor in universities in Asia. Go to your leadership, put procurement on the table, and give a very factual and viable rationale for why the systems investment is very, very important.

As you anchor your procurement with the system, it will put a lot of pressure on you to deliver the benefits that the system’s business cases promise. It also gives you an opportunity to reach for wider buy-in of the system among you and your purchasers. Your training of people on what procurement can provide then becomes part of their evaluation. So, I think this certainly goes hand-in-hand.

Gardner: Baber, anything more to offer?

Farooq: Victoria said something just a few moments ago: “I really don’t care about the feature functionality. I only care about the outcomes.” That should be your North Star. It’s natural when you get into the deployment that you care about all the different little things, but one of the things that organizations often struggle with once the deployment begins is that they stay stuck in those sub-processes and functional elements.

And a lot of the things that were the guiding reasons behind the transformation to begin with get lost, right? I say keep that front and center. That is the basis on which you will get internal buy-in, CEO buy-in, and CFO buy-in -- and it’s also something you should constantly be reminding people of as well.

Of course, you have to deliver on those outcomes, and that’s where companies like SAP need to be held accountable and be a partner to make sure those outcomes are delivered. But those business outcomes are everything we want to be focusing on from a technology perspective -- and everything the procurement organization should focus on from a business perspective.

And COVID-19 will force a recalibration on what those business outcomes should be. The traditional measures of the efficacy of procurement will change -- and should change -- because procurement can make a bigger, deeper impact for organizations.

Supply chain resilience is going to become a much more important factor, and procurement should embrace and show its impact there. Co-innovative partnerships that you deliver for the business will also become a much more important factor. These are not measurements that were traditionally monitored, but they’re going to increase in importance as we encounter the challenges of the next couple of years. This is something procurement organizations should embrace because it will elevate their standing within their organizations.

Gardner: I’m afraid we’ll have to leave it there. You’ve been listening to a sponsored BriefingsDirect discussion on the rationales and results when companies look to intelligent automation and standardization for how they acquire goods and services.

And we’ve learned how organizations are finding -- even during the pandemic -- new lessons and efficiencies in how their source-to-pay processes and purchasing work best.

So please join me in thanking our guests, Victoria Folbigg, Vice President of Procurement at Zuellig Pharma Holdings. Thank you so much, Victoria.

Folbigg: Thank you for having me.

Gardner: And also a big thank you to Baber Farooq, Senior Vice President of Product Strategy for Procurement Solutions at SAP. Thank you, sir.

Farooq: Thank you, Dana, for having me.

Gardner: And a big thank you as well to our audience for joining this BriefingsDirect modern digital business innovation discussion. I’m Dana Gardner, Principal Analyst at Interarbor Solutions, your host throughout this series of SAP-sponsored BriefingsDirect discussions.

Thanks again for listening. Please do come back next time, and feel free to share this information across your IT and business communities.

Listen to the podcast. Find it on iTunes. Download the transcript. Sponsor: SAP Ariba.

Transcript of a discussion on how to bring agility, resilience, and managed risk to the end-to-end procurement process for significantly better overall business outcomes. Copyright Interarbor Solutions, LLC, 2005-2020. All rights reserved.