Tuesday, October 11, 2016

How Propelling Instant Results to the Excel Edge Democratizes Advanced Analytics

Transcript of a discussion on how HTI Labs in London provides the means and governance with their Schematiq tool to bring critical data to the spreadsheet interface users want most.

Listen to the podcast. Find it on iTunes. Get the mobile app. Download the transcript. Sponsor: Hewlett Packard Enterprise.

Dana Gardner: Hello, and welcome to the next edition of the Hewlett Packard Enterprise (HPE) Voice of the Customer podcast series. I'm Dana Gardner, Principal Analyst at Interarbor Solutions, your host and moderator for this ongoing discussion on digital transformation. Stay with us now to learn how agile businesses are fending off disruption in favor of innovation.

Our next case study highlights how powerful and diverse financial information is delivered to the ubiquitous Excel spreadsheet edge. We'll explore how HTI Labs in London provides the means and governance with Schematiq to bring critical data to the interface users want.

By leveraging the best of instant cloud-delivered information with spreadsheets, Schematiq democratizes end-user empowerment while providing powerful new ways to harness and access complex information.

To describe how complex cloud core-to-edge processes and benefits can be managed and exploited, we're joined by Darren Harris, CEO and Co-Founder of HTI Labs in London.

Welcome, Darren.
Darren Harris: Thank you. It's great to be here.

Gardner: We're also here with Jonathan Glass, CTO and Co-Founder of HTI Labs. Welcome, Jonathan.

Jonathan Glass: Hi. Thank you.

Gardner: Let's put some context around this first. What major trends in the financial sector led you to create HTI Labs, and what are the problems you're seeking to solve?

Harris: Obviously, in finance, spreadsheets are widespread and are used for any number of problems. A real issue started a number of years ago, when spreadsheets got out of control. People were using them everywhere, creating lots of operational risk in their processes. Organizations wanted to get their hands around that for governance, and there were loads of Excel-type issues that needed to be eradicated.

That led to the creation of centralized teams that locked down rigid processes and effectively took away a lot of the innovation and discovery process that traders are using to spot opportunities and explore data.

Through this process, we're trying to help with governance, provide the tools to explore, and put the data back in the hands of people, with the right balance between the two.

So by taking the best of regulatory scrutiny around what a person needs, and some innovation that we put into Schematiq, we see an opportunity to take Excel to another level -- but not sacrifice the control that’s needed.

Gardner: Jonathan, are there technology trends that allowed you to be able to do this, whereas it may not have been feasible economically or technically before?

Upstream capabilities

Glass: There are a lot of really great back-end technologies available now, along with the ability to scale compute resources either internally or externally. Essentially, the desktop remains quite similar. Excel has stayed much the same, but the upstream capabilities have really grown.

So there's a challenge. Data that people feel they should have access to is getting bigger, more complex, and less structured. Excel, which is a great front-end for coming to grips with data, is becoming a bit of a bottleneck in keeping up with the data out there that people want.

Gardner: So, we're going to keep Excel. We're not going to throw the baby out with the bathwater, so to speak, but we are going to do something a little bit different and interesting. What is it that we're now putting into Excel and how is that different from what was available in the past?

Harris: Schematiq extends Excel and allows it to access unstructured data. It also reduces the complexity and technical limitations that Excel has as an out-of-the-box product.

We have the notion of a data link, effectively in a single cell, that allows you to reference data held externally on a back-end system. So, where people used to ingest data from another system directly into Excel, effectively divorcing it from its source, we can leave that data where it is.

It's a paradigm of take the question to the data; don't pull the data to the question. That means we can leverage the power of the big-data platforms and the analytic databases on the back-end, while effectively using Excel as the front screen. You ask questions from Excel, but push the query to the back-end. That's very different from the model most people are used to working with in Excel.
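Harris's "take a question to the data" paradigm can be sketched in a few lines. This is a generic illustration, not Schematiq's actual API: the point is that only the small answer set, not the raw data, crosses the wire to the front-end.

```python
import sqlite3

# Hypothetical illustration: instead of pulling every row into the
# spreadsheet, the front-end sends the question (here, an aggregation)
# to the database and receives only the summarized answer.

def pull_data_to_question(conn):
    # Anti-pattern: fetch everything, then aggregate client-side,
    # as a traditional Excel import would.
    rows = conn.execute("SELECT trader, notional FROM trades").fetchall()
    totals = {}
    for trader, notional in rows:
        totals[trader] = totals.get(trader, 0) + notional
    return totals

def take_question_to_data(conn):
    # Push the aggregation down: only the small result set is returned.
    rows = conn.execute(
        "SELECT trader, SUM(notional) FROM trades GROUP BY trader"
    ).fetchall()
    return dict(rows)

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE trades (trader TEXT, notional REAL)")
conn.executemany("INSERT INTO trades VALUES (?, ?)",
                 [("alice", 100.0), ("bob", 50.0), ("alice", 25.0)])

assert pull_data_to_question(conn) == take_question_to_data(conn)
```

With the aggregation pushed down, the database does the heavy lifting, and the spreadsheet-side payload stays small no matter how large the table grows.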

Gardner: This is a two-way street. It's a bit different. And you're also looking at the quality, compliance, and regulatory concerns over that data.

Harris: Absolutely. An end-user is able to break down or decompose any workflow process with data and debug it the same way they can in a spreadsheet. The transparency that we add on top of Excel’s use with Schematiq allows us to monitor what everybody is doing and the function they're using. So, you can give them agility, but still maintain the governance and the control.

In organizations, lots of teams have become disengaged. IT has tried to create some central core platform that’s quite restrictive, and it's not really serving the users. They have gotten disengaged and they've created what Gartner referred to as the Shadow BI Team, with databases under their desk, and stuff like that.

By bringing in Schematiq we add that transparency back, and we allow IT and the users to have an informed discussion -- a very analytic conversation -- around what they're using, how they are using it, where the bottlenecks are. And then, they can work out where the best value is. It's all about agility and control. You just can't give the self-service tools to an organization and not have the transparency for any oversight or governance.

To the edge

Gardner: So we have, in a sense, brought this core to the edge. We've managed it in terms of compliance and security. Now, we can start to think about how creative we can get with what's on that back-end that we deliver. Tell us a little bit about what you go after, what your users want to experiment with, and then how you enable that.

Glass: We try to be as agnostic to that as we can, because it's the creativity of the end-user that really drives value.

We have a variety of different data sources, traditional relational databases, object stores, OLAP cubes, APIs, web queries, and flat files. People want to bring that stuff together. They want some way that they can pull this stuff in from different sources and create something that's unique. This concept of putting together data that hasn't been put together before is where the sparks start to fly and where the value really comes from.

Gardner: And with Schematiq you're enabling that aggregation and cleansing ability to combine, as well as delivering it. Is that right?

Harris: Absolutely. It's that discovery process. It may be very early on in a long chain. This thing may progress to be something more classic, operational, and structured business intelligence (BI), but allowing end-users the ability to cleanse, explore data, and then hand over an artifact that someone in the core team can work with or use as an asset. The iteration curve is so much tighter and the cost of doing that is so much less. Users are able to innovate and put together the scenario of the business case for why this is a good idea.

The only thing I would add to the sources that Jon has just mentioned is with HPE Haven OnDemand, [you gain access to] the unstructured analytics, giving the users the ability to access and leverage all of the HPE IDOL capabilities. That capability is a really powerful and transformational thing for businesses.

They have such a set of unstructured data [services] available in voice and text, and when you allow business users access to that data, the things they come up with, their ideas, are just quite amazing.

Technologists always try to put themselves in the minds of the users, and we've all historically done a bad job of making the data more accessible for them. When you allow them the ability to analyze PDFs without structure, to share that, to analyze sentiment, to include concepts and entities, or even enrich a core proposition, you're really starting to create innovation. You've raised the awareness of all of these analytics that exist in the world today in the back-end, shown end-users what they can do, and then put their brains to work discovering and inventing.

Gardner: Many of these financial organizations are well-established, many of them for hundreds of years perhaps. All are thinking about digital transformation, the journey, and are looking to become more data-driven and to empower more people to take advantage of that. So, it seems to me you're almost an agent of digital transformation, even in a very technical and sophisticated sector like finance.

Making data accessible

Glass: There are a lot of stereotypes about who the business analysts are and who the people are that come up with the ideas and the invention. The true power of democratization is making data more accessible, lowering the technical barrier, and allowing people to explore and innovate. Things always come from where you least expect them.

Gardner: I imagine that Microsoft is pleased with this, because there are some people who are a bit down on Excel. They think that it's manual, that it's by rote, and that it's not the way to go. So, you, in a sense, are helping Excel get a new lease on life.

Glass: I don’t think we're the whole story in that space, but I love Excel. I've used it for years and years at work. I've seen the power of what it can do and what it can deliver, and I have a bit of an understanding of why that is. It’s the live nature of it, the fact that people can look at data in a spreadsheet, see where it’s come from, see where it’s going, they can trust it, and they can believe in it.
That's why we're trying to create these live connections to the upstream data sources. The manual steps, downloading, copying and pasting, moving data around the sheet, are where errors creep in. That's where the bloat, the slowness, and the unreliability happen, but by changing that into a live connection to the data source, it becomes instant and it goes back to being trusted, reliable, and actionable.

Harris: There's something in the DNA, as well, of how people interact with data and so we can lay out effectively the algorithm or the process of understanding a calculation or a data flow. That’s why you see a lot of other systems that are more web-based or web-centric and replicate an Excel-type experience.

The user starts to use it and starts to think, "Wow, it's just like Excel," and it isn't. They hit a barrier, they hit a wall, and then they hit the "export" button. Then, they put it back [into Excel] and create their own way to work with it. So, there's something in the DNA of Excel and the way people lay things out. I think of [Excel] almost like a programming environment for non-programmers. Some people describe it as a functional language, very much like Haskell, and the Excel functions people write are effectively how they work with and navigate through the data.


Gardner: No need to worry that if you build it, will they come; they're already there.

Harris: Absolutely.

Gardner: Tell us a bit about HTI Labs and how your company came about, and where you are on your evolution.

Cutting edge

Harris: HTI Labs was founded in 2012. The core backbone of the team worked for the same tier-1 investment bank, where we built risk and trading systems for front-office teams. We were really at the cutting edge of all the big-data technologies being used at the time: real-time, distributed graphs and cubes, and everything.

As a core team, it was about taking that expertise and bringing it to other industries. Monte Carlo farms for risk calculations, the ability to export data at speed, real-time risk: these things were becoming more central to other organizations, which was an opportunity.

At the moment, we're focusing predominantly on energy trading. Our software is being used across a number of other sectors, and our largest client has installed Schematiq on 120 desktops, which is a great validation of what we're doing. We're also a member of the London Stock Exchange Elite Programme for high-growth companies, based in London.

Glass: Darren and I met when we were working for the same company. I started out as a quant doing the modeling, the math behind pricing, but I found that my interest lay more in the engineering. Rather than doing it once, can I do it a million times? Can I do these things reliably and scale them?

Because I started in a front-office environment, it was very spreadsheet-dominated, very VBA-dominated. There's good and bad in that, and a lot of lessons came out of it. Darren and I met up and crossed the divide together, between the top-down, big IT systems and the bottom-up, end-user-developed spreadsheets. We found a middle ground together, which we feel is quite a powerful combination.

Gardner: Back to where this leads. We're seeing more and more companies using data services like Haven OnDemand and starting to employ machine learning, artificial intelligence (AI), and bots to augment what humans do so well. Is there an opportunity for that to play here, or maybe it already is? The question basically is, how does AI come to bear on what you can deliver out to the Excel edge?

Harris: I think what you see is that, out of the box, you have a base unit of capability. The algorithms are built, but the key to improving them so much more is the feedback loop between your domain users, your business users, and how they can effectively enrich and train these algorithms.

So, we see a future where the self-service BI tools that they use to interact with data and explore would almost become the same mechanism where people will see the results from the algorithms and give feedback to send back to the underlying algorithm.

Gardner: And Jonathan, where do you see the use of bots, particularly perhaps with an API model like Haven OnDemand?

The role of bots

Glass: The concept for bots is replicating an insight or a process that somebody might already be doing manually. People create these data flows and analyses that they maybe run once, because they're quite time-consuming to run. The really exciting possibility is that you make these things run 24x7. Rather than having to pull from the data source, you start receiving notifications from a mailbox that you have created. You look at those, decide whether each is a good insight or a bad insight, and then you can start to train and refine it.

The training and refining is that loop that potentially goes back to IT and through a development loop, and it's about closing that loop and tightening it. That's the thing that really adds value to those opportunities.

Gardner: Perhaps we should unpack Schematiq a bit to understand how one might go back and do that within the context of your tool. Are there several components of the tool, one of which might lend itself to going back and automating?

Glass: Absolutely. You can imagine the spreadsheet has some inputs and some outputs. One of the components within the Schematiq architecture is the ability to take a spreadsheet, to take the logic and the process embedded in that spreadsheet, and turn it into an executable module of code, which you can host on a server, schedule, run as often as you like, and trigger based on events.

It's a way of emitting code from a spreadsheet. Without a business-analysis loop and a development loop, you take the exact thing that the user, the analyst, has programmed, and you make it into something that you can run, commoditize, and scale. That's quite an important way in which we shrink the development loop. We create a cycle that's tight and rapid.
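Glass's idea of turning a spreadsheet's logic into an executable module can be illustrated with a toy compiler. The cell-graph representation and evaluation scheme here are hypothetical stand-ins, not what Schematiq actually emits:

```python
# A toy sketch of "emitting code from a spreadsheet": the cell graph
# becomes a callable that can be hosted on a server and run on a
# schedule, with the workbook itself no longer needed.

def compile_sheet(formulas):
    """formulas maps cell name -> (dependency names, function of dep values)."""
    def run(inputs):
        values = dict(inputs)
        resolved = True
        # Repeatedly evaluate any cell whose dependencies are all known,
        # until no more progress can be made (a simple fixpoint).
        while resolved:
            resolved = False
            for cell, (deps, fn) in formulas.items():
                if cell not in values and all(d in values for d in deps):
                    values[cell] = fn(*(values[d] for d in deps))
                    resolved = True
        return values
    return run

# B1 = A1 * 2, C1 = A1 + B1: the logic an analyst built interactively.
sheet = compile_sheet({
    "B1": (["A1"], lambda a: a * 2),
    "C1": (["A1", "B1"], lambda a, b: a + b),
})
result = sheet({"A1": 10})
# result["C1"] == 30
```

Once the logic is a plain function, scheduling it, triggering it on events, or exposing it as an API becomes ordinary server-side work.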

Gardner: Darren, would you like to explain the other components that make up Schematiq?

Harris: There are four components to the Schematiq architecture. There's the workbench, which extends Excel and enables large-scale structured-data analytics. We have the asset manager, which is really all about governance. You can think of it like source control for Excel, but with a lot more around metadata control, transparency, and analytics on what people are using and how they're using it.

There's a server component that allows you to off-load and scale analytics horizontally if needed, and to build repeatable or overnight processes. The last part is the portal. This is really about allowing end-users to instantly share their insights with other people, picking up from Jon's point about the compound executable defined in Schematiq. That can be off-loaded to a server and exposed as an API to a computer, a mobile device, or even another function.

So, it's very much all about empowering the end-user to connect, create, govern, and share instantly, and then allow consumption by anybody on any device.

Market for data services

Gardner: I imagine, given the sensitive nature of the financial markets and activities, that you have some boundaries that you can’t cross when it comes to examining what’s going on in between the core and the edge.

Tell me about how you, as an organization, can look at what’s going on with the Schematiq and the democratization, and whether that creates another market for data services when you see what the demand entails.

Harris: It’s definitely the case that people have internal datasets they create and that they look after. People are very precious about them because they are hugely valuable, and one of the things that we strive to help people do is to share those things.

Across the trading floor, you might effectively have a dozen or more different IT infrastructures, if you think of what exists on each desk as a miniature infrastructure. So it's about making it easy for people to share these things, to create master datasets that they gain value from, and to see that they gain mutual value from that, rather than feeling closed in and not wanting to share with their neighbors.

If we work together and if we have the tools that enable us to collaborate effectively, then we can all get more done and we can all add more value.

Gardner: It's interesting to me that the more we look at the use of data, the more it opens up new markets and innovation capabilities that we hadn’t even considered before. And, as an analyst, I expect to see more of a marketplace of data services. You strike me as an accelerant to that.

Harris: Absolutely. As the analytics come online and are exposed by APIs, the underlying store becomes almost irrelevant. If you look at what the analytics can do for you, that's how you consume the insight, and you can connect to other sources. You can connect from Twitter, from Facebook, you can connect PDFs; whether it's NoSQL, structured, columnar, or row-based, it doesn't really matter. You don't see that complexity. The fact that you can just create an API key, access it as a consumer, and start to work with it is really powerful.

There was the recent example in the UK of a report on the Iraq War. It’s 2.2 million words, it took seven years to write, and it’s available online, but there's no way any normal person could consume or analyze that. That’s three times the complete works of Shakespeare.

Using these APIs, you can start to pull out mentions, countries, and locations, really get into the data, and give anybody with Excel at home, in our case, or any other tool, the ability to analyze it and share those insights. We're very used to media where we get just the headline, and that spin comes into play. People turn things on their head, and you really never get to delve into the underlying detail.
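The kind of entity analysis Harris describes can be illustrated with a deliberately crude stand-in for the real text-analytics APIs; the country list and regex matching below are illustrative only, not how those services work internally:

```python
import re
from collections import Counter

# Toy stand-in for an entity-extraction API: count country mentions
# in a body of text far too large to read manually.
COUNTRIES = {"Iraq", "UK", "US", "France"}

def count_mentions(text):
    # Tokenize into words and keep only the ones in our entity list.
    words = re.findall(r"[A-Za-z]+", text)
    return Counter(w for w in words if w in COUNTRIES)

report = "The UK entered Iraq ... the UK inquiry examined Iraq and France."
mentions = count_mentions(report)
# UK and Iraq are each mentioned twice; France once.
```

A real service adds sentiment, concepts, and disambiguation on top, but the consumption pattern, push a document in, get structured entities back into a spreadsheet, is the same.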

What’s really interesting is when democratization and sharing of insights and collaboration comes, we can all be informed. We can all really dig deep, and all these people that work there, the great analysts, could start to collaborate and delve and find things and find new discoveries and share that insight.

Gardner: All right, a little light bulb just went off in my head. Whereas before we would go to a headline and a news story and maybe have a hyperlink to a source, now I could take a headline and a news story, open up my Excel spreadsheet, get to the actual data source behind the entire story, and then probe and plumb and analyze it any which way I wanted to.

Harris: Yes, exactly. I think the most savvy consumer now, the analyst, is starting to demand that transparency. We've seen in the UK election messages, quotes, and even financial stats where people just don't believe the headlines. They're demanding transparency in that process, and so governance can only be a good thing.
Gardner: I'm afraid we will have to leave it here. We've been exploring how powerful and diverse financial information is delivered to the ubiquitous Excel spreadsheet edge. And we have learned how HTI Labs in London provides the means and governance with their Schematiq tool to bring critical data to the interface that users want most.

So please join me in thanking our guests, Darren Harris, CEO and Co-Founder of HTI Labs, and Jonathan Glass, CTO and Co-Founder of HTI Labs.

And a big thank you to our audience as well for joining us for this Hewlett Packard Enterprise Voice of the Customer digital transformation discussion.

I'm Dana Gardner, Principal Analyst at Interarbor Solutions, your host and moderator for this ongoing series of HPE-sponsored interviews. Thanks again for listening, and please do come back next time.


Transcript of a discussion on how HTI Labs in London provides the means and governance with their Schematiq tool to bring critical data to the spreadsheet interface that users want most. Copyright Interarbor Solutions, LLC, 2005-2016. All rights reserved.


Wednesday, September 28, 2016

How JetBlue Mobile Applications Quality Assurance Leads to Greater Workforce Productivity

Transcript of a discussion on how JetBlue created a DevOps model by including more performance feedback in the continuous integration process.

Listen to the podcast. Find it on iTunes. Get the mobile app. Download the transcript. Sponsor: Hewlett Packard Enterprise.

Dana Gardner: Hello, and welcome to the next edition of the Hewlett Packard Enterprise (HPE) Voice of the Customer podcast series. I'm Dana Gardner, Principal Analyst at Interarbor Solutions, your host and moderator for this ongoing discussion on technology innovation and how it's making an impact on people's lives.

Our next performance engineering case study discussion examines how JetBlue Airways in New York has used virtual environments to reduce software development costs, centralize performance testing, and create a climate for continuous integration and real-time monitoring of mobile applications.

We'll now hear how JetBlue created a DevOps model by including performance feedback in the continuous integration process to enable greater workforce productivity.

To describe how efficient performance engineering has reduced testing, hardware, and maintenance costs by as much as 60 percent, we're joined by Mohammed Mahmud, the Senior Software Performance Engineer at JetBlue Airways in New York. Welcome, Mohammed.

Mohammed Mahmud: Thank you.
Gardner: Why is mobile user experience so very important to your ability to serve customers well these days?

Mahmud: It's really very important for us to give the customer an option to do check-in, book flights, manage bookings, check flight status, and some other things. On flights, they have an option to watch TV, listen to music, and buy stuff using mobile devices as well. But on board, they have to use Fly-Fi [wireless networks]. This is one of the most important business drivers for JetBlue Airways.

Gardner: What sort of climate or environment have you had to put together in order to make sure that those mobile apps really work well, and that your brand and reputation don’t suffer?

Mahmud: I believe a real-time monitoring solution is the key to success. We use HPE Business Service Management (BSM), integrated with third-party applications for monitoring purposes. We created some synthetic transactions and put them out there on a real device to see ... how it impacts performance. If there are any issues with that, we can fix it before it happens in the production environment.

Also, we have a real-time monitoring solution in place. This solution uses real devices to get the real user experience and to identify potential performance bottlenecks in a live production environment. If anything goes wrong there, we can get alerts from the production environment, and we can mitigate that issue right away.
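A synthetic transaction boils down to a scripted user action that is timed and checked against a threshold. This minimal sketch assumes nothing about BSM's actual interfaces; the transaction, threshold, and alert sink are all illustrative:

```python
import time

# Minimal synthetic-transaction monitor: run a scripted user action,
# record its latency, and raise an alert when it fails or runs slow.

def run_synthetic_check(transaction, threshold_ms, alert):
    start = time.perf_counter()
    ok = True
    try:
        transaction()          # the scripted user action
    except Exception:
        ok = False
    elapsed_ms = (time.perf_counter() - start) * 1000
    if not ok or elapsed_ms > threshold_ms:
        alert(f"check failed: ok={ok}, elapsed={elapsed_ms:.1f}ms")
    return ok, elapsed_ms

alerts = []
ok, elapsed = run_synthetic_check(lambda: time.sleep(0.01),
                                  threshold_ms=5, alert=alerts.append)
# The 10 ms sleep exceeds the 5 ms threshold, so one alert is recorded.
```

Running such checks on a schedule, against real devices and from multiple locations, is what turns this toy into the production monitoring Mahmud describes.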

DevOps benefits

Gardner: How have you been able to connect the development process to the operational environment?

Mahmud
Mahmud: My area is strictly performance engineering, but we're in the process of putting the performance effort into our DevOps model. We're going to be part of the continuous integration (CI) process, so we can take part in the development process and give performance feedback early in the development phase.

In this model, an application-module upgrade kicks off the functional test cases and gives feedback to the developers. Our plan is to be part of that CI process and include performance test cases, to provide performance feedback at a very early stage of the development process.
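A performance gate in a CI pipeline can be as simple as comparing measured response times against a baseline and failing the build on regression. The transaction names and tolerance below are illustrative, not JetBlue's actual configuration:

```python
# Hypothetical CI performance gate: flag any transaction whose measured
# response time regresses beyond a tolerance over its baseline.

def performance_gate(baseline_ms, measured_ms, tolerance=0.20):
    """Return (passed, failures), where failures lists regressed transactions."""
    failures = []
    for name, base in baseline_ms.items():
        measured = measured_ms.get(name)
        # Missing measurements count as failures too.
        if measured is None or measured > base * (1 + tolerance):
            failures.append(name)
    return (not failures, failures)

passed, failures = performance_gate(
    baseline_ms={"check_in": 800, "book_flight": 1200},
    measured_ms={"check_in": 850, "book_flight": 1600},
)
# book_flight regressed by 33%, beyond the 20% tolerance, so the gate fails.
```

Wired into the CI job that already runs the functional tests, a gate like this is what delivers the "performance feedback early in the development phase" described above.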

Gardner: How often are you updating these apps? Are you doing it monthly, quarterly, more frequently?

Mahmud: Most of them on a two- or three-week basis.

Gardner: How are you managing the virtual environment to create as close to the operating environment as you can? How do the virtualized services and networks benefit you?

Mahmud: We're maintaining a complete virtualized environment for our performance testing and performance engineering. Before our developers create any kind of a service, or put it out there, they do mock-ups using third-party applications. The virtual environment they're creating is similar to the production environment, so that when it’s being deployed out there in the actual environment, it works efficiently and perfectly without any issue.
Our developers recently started using the service virtualization technology. Also, we use network virtualization technology to measure the latency for various geographical locations.

Gardner: How has performance engineering changed over the past few years? We've seen a lot of changes in general, in development, mobile of course, DevOps, and the need for more rapid development. But, how have you seen it shift in the past few years in performance engineering?

Mahmud: When I came to JetBlue Airways, LoadRunner was the only product they had. The performance team was responsible for evaluating application performance by running a performance test and delivering the test results, identifying pass/fail based on the requirements provided. It was strictly performance testing.

The statistics they used to provide were pretty straightforward, maybe some transaction response times and some server statistics, but no other analysis or detailed information. Now, it's more than that. We don't just test the application and determine pass/fail. We analyze the logs, the traffic flow, and how users behave in order to design an effective test. This is more performance engineering than performance testing.
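One concrete step from performance testing toward performance engineering is deriving the load-test workload mix from real traffic rather than guessing it. The log format below is an illustrative stand-in for whatever access logs are actually available:

```python
from collections import Counter

# Derive a workload mix from access-log entries, so the load test
# reflects real user behavior instead of an assumed transaction mix.

def workload_mix(log_lines):
    """Return each transaction's share of total traffic, as a fraction."""
    counts = Counter(line.split()[0] for line in log_lines if line.strip())
    total = sum(counts.values())
    return {txn: n / total for txn, n in counts.items()}

logs = [
    "check_in /api/checkin 200",
    "check_in /api/checkin 200",
    "book_flight /api/book 200",
    "flight_status /api/status 200",
]
mix = workload_mix(logs)
# check_in accounts for half the traffic, so it would get half the virtual users.
```

The resulting fractions then drive how virtual users are allocated across scripts in the load-test scenario.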

Early in the cycle

We're getting engaged early in the development cycle to provide performance feedback. We're doing the performance testing, providing the response time in cases where multiple users are using that application or that module, finding out how this is going to impact the performance, and finding bottlenecks before it goes to the integration point.

So, it’s more of coming to the developers' table, sitting together, and figuring out any performance issue.

Gardner: Understanding the trajectory forward, it seems that we're going to be doing more with microservices, APIs, more points of contact, generating more data, trying to bring the analysis of that data back into the application. Where do you see it going now that you've told us where it has come from? What will be some of the next benefits that performance engineering can bring to the overall development process?

Mahmud: Well, as I mentioned earlier, we're planning to be part of the continuous integration; our goal is to become engaged earlier in the development process. That's sitting together with the developers on a one-to-one basis to see what they need to make sure that we have performance-efficient applications in our environment for our customers. Again, this is all about getting involved in the earlier stages. That's number one.

Number two, we're trying to mitigate any kind of volume-related issue. Sometimes, we have yearly sales. We don't know when they're going to happen, but when they do, they put enormous pressure on the system. It's a big thing, and we need to make sure we're prepared for that kind of traffic on our site.

Our applications are mostly JetBlue.com and JetBlue mobile applications. It’s really crucial for us and for our business. We're trying to become engaged in the early stages and be part of the development process as well.

Gardner: Of course it’s important to be able to demonstrate value. Do you have any metrics of success or you can point to ways in which getting in early, getting in deep, has been of significant value? How do you measure your effectiveness?

Mahmud: We did an assessment in our production environment to see, if JetBlue.com goes down for an hour, how much it’s going to cost us? I'm not authorized to discuss any numbers, but I can tell you that it was in the millions of dollars.

So, before it goes to production with any kind of performance-related issue, we make sure that we're solving it before it happens. Right there, we're saving millions of dollars. That’s the value we are adding.

Gardner: Of course, more and more people identify the mobile app with the company. This is how they interact; it becomes the brand. So, it's super important for that as well.

Adding value

Mahmud: Of course, and I can add another thing. When I joined JetBlue three years ago, we were at the bottom of the industry benchmark list. Now, we're within the top five. So, we're adding value to our organization.

Gardner: It pays to get it done right the first time and get it early, almost in any activity these days.

What comes next? Where would you like to extend continuous integration processes, to more types of applications, developing more services? Where do you take the success and extend it?

Mahmud: Right now, we're mostly engaged with JetBlue.com and the mobile applications, but other teams are interested in doing performance testing for their systems as well. So, we're getting engaged with the SAP, DB, HR, and payroll teams too. We're getting engaged more day by day; it’s getting bigger every day.

Gardner: Well, great. I'm afraid we'll have to leave it there. We've learned how JetBlue Airways has created a climate for continuous integration and real-time monitoring of mobile applications to enable greater workforce productivity. And we've heard how JetBlue created a DevOps model by including more performance feedback in the continuous integration process.

So, join me in thanking our guest, Mohammed Mahmud, Senior Software Performance Engineer at JetBlue Airways in New York. Thank you, Mohammed.

Mahmud: Thank you for having me.
Gardner: And I would like to thank our audience as well for joining us for this Hewlett Packard Enterprise Voice of the Customer Podcast. I'm Dana Gardner, Principal Analyst at Interarbor Solutions, your host for this ongoing series of HPE-sponsored discussions. Thanks again for listening, and do come back next time.

Listen to the podcast. Find it on iTunes. Get the mobile app. Download the transcript. Sponsor: Hewlett Packard Enterprise.

Transcript of a discussion on how JetBlue created a DevOps model by including more performance feedback in the continuous integration process. Copyright Interarbor Solutions, LLC, 2005-2016. All rights reserved.

You may also be interested in:

Monday, September 26, 2016

Seven Secrets to Highly Effective Procurement: How Technology, Data and Business Networks Fuel Innovation and Transformation

Transcript of a discussion on how technology, data analytics and digital business networks are transforming procurement and source-to-pay processes as we know them.

Listen to the podcast. Find it on iTunes. Get the mobile app. Download the transcript. Sponsor: SAP Ariba.

Dana Gardner: Hi, this is Dana Gardner, Principal Analyst at Interarbor Solutions, and you're listening to BriefingsDirect.

Our next innovation thought leadership discussion focuses on how technology, data analysis, and digital networks are transforming procurement and the source-to-pay process as we know it. We’ll also discuss what it takes to do procurement well in this new era of business networks.

Far beyond just automating tasks and transactions, procurement today is a strategic function. It demands an integrated, end-to-end approach built on deep insights and intelligence, one that drives informed source-to-pay decisions and actions and enables businesses to adopt a true ecosystem-wide digital strategy.

And according to the findings of a benchmarking survey conducted by SAP Ariba, there are seven essential traits of modern procurement organizations that are driving this innovation and business transformation. To learn more about the survey results, please join me in welcoming our guest, Kay Ree Lee, Director of Value Realization at SAP. Welcome, Kay Ree.

Kay Ree Lee: Thank you, Dana.

Gardner: Procurement seems more complex than ever. Supply chains now stretch around the globe, regulation is on the rise, and risk is heightened on many fronts in terms of supply chain integrity.

Innovative companies, however, have figured out how to overcome these challenges, and at the Value Realization group you have uncovered some of their best practices through your annual benchmarking survey. Tell us about this survey and what you found.


Lee: We have an annual benchmarking program that covers purchasing operations, payables, sourcing, contract management, and working capital. What's unique about it, Dana, is that it combines a traditional survey with data from our procurement applications and business network.

This past year, we looked at more than 200 customers who participated, covering more than $350 billion in spend. We analyzed their quantitative and qualitative responses and identified the intersection between those responses for top performers compared with average performers. Then, we drew correlations between what top performers did well and the practices that drove those achievements.

Gardner: That intersection is itself an example of the power of business networks, because you're able to gather intelligence from your business network environment, or ecosystem, and then apply a survey back into it. It seems to me that there is a whole greater than the sum of the parts between what the Ariba Network can do and what market intelligence is demanding.

Universe of insights

Lee: That’s right. The data from the applications in the Ariba Network contain a universe of insights, intelligence, and transactional data that we've amassed over the last 20-plus years. By looking at the data, we've found that there are specific patterns and trends that can help a lot of companies improve their procurement performance -- either by processing transactions with fewer errors or processing them faster. They can source more effectively by collaborating with more suppliers, having suppliers bid on more events, and working collaboratively with suppliers.

Gardner: And across these 200 companies, you mentioned $350 billion of spend. Do you have any sense of what kind of companies these are, or do they cross a variety of different types of companies in different places doing different vertical industry activities?

Lee: They're actually cross-industry. We have a lot of companies in the services industry and in the manufacturing industry as well.

Gardner: This sounds like a unique, powerful dataset, indicative of what's going on not just in one or two places, but across industries. Before we dig into the detail, let’s look at the big picture, a 100,000-foot view. What would you say are some of the major high-level takeaways that define best-in-class procurement and the organizations that can produce it these days, based on your data?

Lee: There are four key takeaways that define what best-in-class procurement organizations do.

The first one is that a lot of these best-in-class organizations, when they look at source-to-pay or procure-to-pay, manage it as an end-to-end process. They don't just look at a set of discrete tasks; they look at it as a big, broad picture. More often than not, they have an assigned process expert or a process owner that's accountable for the entire end-to-end process. That's key takeaway number one.

Key takeaway number two is that a lot of these best-in-class organizations also have an integrated platform from which they manage all of their spend. And through this platform, procurement organizations provide their internal stakeholders with flexibility, based on what they're trying to purchase.

For example, a company may need to keep track of items that are critical to manufacturing, with inventory visibility and tracking. That's one requirement.

Another requirement is purchasing manufacturing or machine parts that are not stocked; these can be purchased through supply catalogs with pre-negotiated part descriptions and item pricing.
   
Gardner: Are you saying that this same platform can be used in these companies across all the different types of procurement and source-to-pay activities -- internal services, even indirect, perhaps across different parts of a large company? That could be manufacturing or transportation. Is one common platform used for all types of purchasing?

Common platform

Lee: That's right. One common platform for different permutations of what you're trying to buy. This is important.

The third key takeaway was that best-in-class organizations leverage technology to fuel greater collaboration; they don't just automate tasks. One example is providing self-service options.

Perhaps a lot of companies think that self-service options are dangerous, because you're letting the person who is requesting items select on their own, and they could make mistakes. But the way to think about a self-service option is that it's providing an alternative for stakeholders to buy and to have a guided buying experience that is both simple and compliant and that's available 24/7.

You don't need someone there supervising them. They can go on the platform and they can pick the items, because they know the items best -- and they can do this around the clock. That's another way of offering flexibility and fueling greater collaboration and ultimately, adoption.

Gardner: We have technologies like mobile these days that allow that democratization of involvement. That sounds like a powerful approach.

Lee: It is. And it ties to the fourth key takeaway, which is that best-in-class organizations connect to networks. Networks have become very prevalent these days, but best-in-class companies connect to networks to assess intelligence, not just transact. They go out to the network, they collaborate, and they get intelligence. A network really offers scale that organizations would otherwise have to achieve by developing multiple point-to-point connections for transacting across thousands of different suppliers.

You now go on a network and you have access to thousands of suppliers. Years ago, you would have had to develop point-to-point connectivity, which costs money, takes a long time, and you have to test all those connections, etc.
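The scaling argument Lee makes can be put in rough numbers. This is an illustrative sketch, not data from the survey; the buyer and supplier counts are made up to show the shape of the difference:

```python
def point_to_point_links(buyers, suppliers):
    # Every buyer builds, tests, and maintains a separate integration
    # to every supplier it transacts with.
    return buyers * suppliers

def network_links(buyers, suppliers):
    # Through a shared network, each party maintains a single
    # connection to the network itself.
    return buyers + suppliers

print(point_to_point_links(200, 1000))  # 200000 pairwise integrations
print(network_links(200, 1000))         # 1200 connections in total
```

The pairwise count grows multiplicatively while the network count grows additively, which is why point-to-point connectivity "costs money and takes a long time" in the way described above.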

Gardner: I'm old enough to remember Metcalfe's Law, which roughly says that the more participants in a network, the more valuable that network becomes, and I think that's probably the case here. Is there any indication from your data and research that the size and breadth and depth of the business network value works in this same fashion?

Lee: Absolutely. Those three words are key. The size -- you want a lot of suppliers transacting on there. And then the breadth -- you want your network to contain global suppliers, including suppliers that can transact in remote parts of the world, even Nigeria or Angola.

Then, the depth of the network -- the types of suppliers that transact on there. You want to have suppliers that can transact across a plethora of different spend categories -- suppliers that offer services, suppliers that offer parts, and suppliers that offer more mundane items.

But you hit the nail on the head with the size and breadth of the network.

Pretty straightforward

Gardner: So, for industry analysts like myself, these seem pretty straightforward. I see where procurement and business networks are going, and I can certainly agree that these are major and important points.

But I wonder, because we're in such a dynamic world and because companies -- at least in many of the procurement organizations -- are still catching up in technology, how are these findings different than if you had done the survey four or five years ago? What's been a big shift in terms of how this journey is progressing for these large and important companies?

Lee: I don't think that there's a big shift. Over the last two to five years, perhaps priorities have changed, and there are certainly some patterns that we see in the data. For example, while sourcing savings fluctuate from year to year, sourcing continues to be very important to a lot of organizations for delivering cost savings.

The data tells us organizations need to be agile and that they need to continue to do more with less.

One of the key takeaways from this is that the cost structure of procurement organizations has come down. They have fewer people operating certain processes, which means it costs organizations less to operate those processes, because they're leveraging technology even more. They're also able to deliver higher savings, because they're including more and different suppliers as they go to market for certain spend categories.

That's where we're seeing difference. It's not really a shift, but there are some patterns in the data.

Gardner: It seems to me, too, that because we're adding more data and insight through that technology, we can elevate procurement more prominently into the category of spend management. That allows companies to make decisions at a larger level, across entire industries, maybe across the entire company, based on these insights and best practices, and they can save a lot more money.

But then, it seems to me that that elevates procurement to a strategic level, not just a way to save money or to reduce costs, but to actually enable processes and agility, as you pointed out, that haven't been done before.

Before we go to the traits themselves, is there a sense that your findings illustrate this movement of procurement to a more strategic role?

Front and center

Lee: Absolutely. That's another one of the key traits that we have found from the study. Top performing organizations do not view procurement as a back-office function. Procurement is front and center. It plays a strategic role within the organization to manage the organization’s spend.

When you talk about managing spend, you could talk about it at the surface level. But we have a lot of organizations that manage spend to a depth that includes performing strategic supplier relationship management, supplier risk management, and deep spend analysis. The ability to manage at this depth distinguishes top performers from average performers.

Gardner: As we know, Kay Ree, many people most trust their cohorts, people in other companies doing the same function they are, for business acumen. So this information is great, because we're learning from the people that are doing it in the field and doing it well. What are some of the other traits that you uncovered in your research?

Lee: Let me go back to the first trait. The first one that we saw that drove top performing organizations was that top performers play a strategic role within the organization. They manage more spend and they manage that spend at a deep level.

One of the stats that I will share is that top performers see a 36 percent higher spend under management, compared to the average organization. And they do this by playing a strategic role in the organization. They're not just processing transactions. They have a seat at the leadership table. They're a part of the business in making decisions. They're part of the planning, budgeting, and financial process.

They also ensure that they're working collaboratively with their stakeholders to ensure that procurement is viewed as a trusted business adviser, not an administrator or a gatekeeper. That’s really the first trait that we saw that distinguishes top performers.

The second one is that top performers have an integrated platform for all procurement spend, and they conduct regular stakeholder spend reviews -- resulting in higher sourcing savings.

And this is key. They conduct quarterly -- or even more frequent -- meetings with the businesses to review their spend. These reviews serve different purposes. They provide a forum for discussing various sourcing opportunities.

Imagine going to the business unit to talk to them about their spend from the previous year. "Here is who you have spent money with. What is your plan for the upcoming year? What spend categories can we help you source? What's your priority for the upcoming year? Are there any capital projects that we can help out with?"

Sourcing opportunities

It's understanding the business and the requirements from stakeholders that helps procurement identify additional sourcing opportunities. Procurement has to be proactive in collaborating with the businesses and ensuring that it's responsive and agile to stakeholder requirements. That's the second finding that we saw from the survey.

The third one is that top performers manage procure-to-pay as an end-to-end process with a single point of accountability, and this really drives higher purchase order (PO) and invoicing efficiency. This one is quite straightforward. Our quantitative and qualitative research tells us that having a single point of accountability drives a higher transactional efficiency.

Gardner: I can speak to that personally. In too many instances, I work with companies where one hand doesn’t know what the other is doing, and there is finger pointing. Any kind of exception management becomes bogged down, because there isn’t that point of accountability. I think that’s super important.

Lee: We see that as well. Top performers also operationalize savings after they have sourced spend categories and captured negotiated savings. The question then becomes: how do they operationalize negotiated savings so that they become actual savings? The way top performers approach it is to manage compliance for those sourced categories by creating fit-for-purpose purchasing strategies. So, they drive more spend toward contracts and electronic catalogs through a guided buying experience.

You do that by having available to your stakeholders contracts and catalogs that would guide them to the negotiated pricing, so that they don't have to enter pricing, which would then dilute your savings. Top performers also look at working capital, and they look at it closely, with the ability to analyze historical payment trends and then optimize payment instruments resulting in higher discounts.

Sometimes, working capital is not as important to procurement because it's left to the accounts payable (AP) function, but top performing procurement organizations look at it holistically, as another lever that they manage within the sourcing and procure-to-pay process.

So, it's another negotiation point when they are sourcing, to take advantage of opportunities to standardize payment terms, take discounts when they need to, and also look at historical data and really have a strategy, and variations of the strategy, for how we're going to pay strategic suppliers. What’s the payment term for standard suppliers, when do we pay on terms versus discounts, and then when do we pay on a P-Card? They look at working capital holistically as part of their entire procurement process.

Gardner: It really shows where being agile and intelligent can have major benefits in terms of your ability to time and enforce delivery of goods and services -- and also get the best price in the market. That’s very cool.

Lee: And having all of that information and having the ability to transact efficiently is key. Let’s say you have all the information, but you can't transact efficiently. You're slow to make invoice payments, as an example. Then, while you have a strategy and approach, you can’t even make a change there (related to working capital). So, it's important to be able to do both, so that you have the options and the flexibility to be able to operationalize that strategy.

Top performers leverage technology and provide self-service to enable around-the-clock business. This really helps organizations drive down cycle time for PO processing.

Within the oil and gas sector, for example, it's critical for organizations to get the items out to the field, because if they don't, they may jeopardize operations on a large scale. Offering the ability to perform self-service and to enable that 24x7 gives organizations flexibility and offers the users the ability to maneuver themselves around the system quite easily. Systems nowadays are quite user-friendly. Let the users do their work, trust them in doing their work, so that they can purchase the items they need to, when they want to.

User experience

Gardner: Kay Ree, this really points out the importance of the user experience, and not just your end-user customers, but your internal employee users and how younger folks, millennials in particular, expect that self-service capability.

Lee: That’s right. Purchasing shouldn't be any different. We should follow the lead of other industries and other mobile apps and allow users to do self-service. If you want to buy something, you go out there and pick the item; the pricing is out there, and it's negotiated pricing, so you pick the item and go.

Gardner: That’s enabling a lot of productivity. That’s great. Okay, last one.

Lee: The last one is that top performers leverage technology to automate PO and invoice processing to increase administrative efficiency. What we see is best-in-class organizations leverage technology with various features and functionalities within the technology itself to increase administrative efficiency.

An example of this could be the ability to collaborate with suppliers on the requisitioning process. Perhaps you're doing three bids and a buy; during that process, it's no longer about picking up the phone. You list out your requirements for what you're trying to buy and send them out automatically to three suppliers. They provide responses back, you pick a response, and the system converts the requirements to a PO.

So that flexibility by leveraging technology is key.

Gardner: Of course, we expect to get even more technology involved with business processes. We hear things about the Internet of Things (IoT), more data, more measurement, more scientific data analysis being applied to what may have been more gut instinct types of business decision making, now it’s more empirical. So I think we should expect to see even more technology being brought to bear on many of these processes in the next several years. So that’s kind of important to see elevated to a top trait.

All right. What I really like about this, Kay Ree, is that this information is not just academic theory or prediction; this is what organizations are actually doing. Do we have any way of demonstrating what you get in return? If these are best practices as the marketplace defines them, what is the marketplace seeing when they adopt these principles? What do they get for this innovation? Brass tacks: money, productivity, and benefits. What are the real paybacks?

Lee: I'll share stats for top performers. Top performers are able to achieve about 7.8 percent in savings per year as a percent of source spend. That’s a key monetary benefit that most organizations look to. It’s 7.8 percent in savings.

Gardner: And 7.8 percent to someone who's not familiar with what we're talking about might not seem large, but this is a huge amount of money for many companies.

Lee: That's right. Per billion dollars, that’s $78 million.
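The arithmetic behind that exchange is worth making explicit; this is just the 7.8 percent figure applied to the $1 billion example from the discussion:

```python
savings_rate = 0.078           # top performers' savings as a share of sourced spend
sourced_spend = 1_000_000_000  # the $1 billion example used above
annual_savings = savings_rate * sourced_spend
print(f"${annual_savings:,.0f}")  # $78,000,000
```

On $350 billion of benchmarked spend, the same rate would scale proportionally, which is why a single-digit percentage is a very large number in absolute terms.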

Efficient processing

They also manage more than 80 percent of their spend, and they manage that spend to a greater depth by having the right tools: processing transactions efficiently, managing contracts, and managing compliance. And they have data that lets them run deeper spend analysis. That’s a key business benefit for organizations that are looking to transact over the network and leverage more technology.

Top performers also transact and collaborate electronically with suppliers to achieve a 99 percent-plus electronic PO rate. Best-in-class organizations don't even attach a PDF to an email anymore. They create a requisition, it gets approved, it becomes a PO, and it is automatically sent to a supplier. No one is involved in it. So the entire process becomes touch-less.
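The touch-less flow Lee describes, requisition to approval to PO to electronic dispatch with no one in between, can be sketched as a simple stage progression. The stage names and fields here are illustrative assumptions, not the Ariba data model:

```python
from dataclasses import dataclass, field

# Hypothetical stage names for illustration only.
STAGES = ["requisition", "approved", "po_created", "sent_to_supplier"]

@dataclass
class PurchaseDocument:
    items: list
    supplier: str
    stage: str = "requisition"
    history: list = field(default_factory=list)

    def advance(self):
        """Move to the next stage; a real system would attach approval
        rules, catalog pricing checks, and supplier connectivity here."""
        i = STAGES.index(self.stage)
        if i + 1 < len(STAGES):
            self.history.append(self.stage)
            self.stage = STAGES[i + 1]
        return self.stage

# The whole flow runs without a manual step or an emailed PDF.
doc = PurchaseDocument(items=["machine part"], supplier="Example Supplier Ltd")
while doc.stage != "sent_to_supplier":
    doc.advance()
print(doc.stage)    # sent_to_supplier
print(doc.history)  # ['requisition', 'approved', 'po_created']
```

The design point is the one made above: once each transition is automatic, the 99 percent-plus electronic PO rate falls out of the process rather than being enforced by people.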

Gardner: These traits promote the automation that leads to better data, which allows for better processes, and so on. It really is a virtuous cycle that you can get into when you do this.

Lee: That’s right. One leads to another.

Gardner: Are there other ways that we're seeing paybacks?

Lee: The proof of the pudding is in the eating. I'll share a couple of examples from my experience looking at data for specific companies. One organization utilizes the availability of collaboration and sourcing tools to source transportation lanes, to obtain better-negotiated rates, and drive higher sourcing savings.

A lot of organizations use collaboration and sourcing tools, but this one is interesting because, when you think about transportation, there are different ways to source it. Doing it through an eSourcing tool, and generating a high percentage of savings that way, was an eye-opener for me. That’s an example of an organization really using technology to its benefit, going out and sourcing an uncommon spend category.

For another example, I have a customer that was really struggling to get control of their operational costs related to transaction processing, while trying to manage and drive a high degree of compliance. What they were struggling with is that their cost structure was high. They wanted to keep the cost structure lower, but still drive a high degree of compliance.

When we looked at their benchmark data, it helped open the customer's eyes to how to drive improvements: directing transactions to catalogs and contracts where applicable, driving suppliers to create invoice-based contracts on the Ariba Network, and enabling more suppliers to invoice electronically. This helped increase administrative efficiency and reduced the invoice errors that were causing a lot of rework for the AP team.

So, these two examples, in addition to the quantitative benefits, show the tremendous opportunity organizations have to adopt and leverage some of these technologies.

Virtuous cycle

Gardner: So, we're seeing more technology become available, and more data and analytics become available as business networks are built out in terms of size, breadth, and depth, and we've identified that the paybacks can lead to a virtuous cycle of improvement.

Where do you see things going now that you've had a chance to really dig into this data and see these best practices in actual daily occurrence? What would you see happening in the future? How can we extrapolate from what we've learned in the market to what we should expect to see in the market?

Lee: We're still only just scratching the surface with insights. We have a roadmap of advanced insights that we're planning for our customers that will allow us to further leverage the insights and intelligence embedded in our network to help our customers increase efficiency in operations and effectiveness of sourcing.

Gardner: It sounds very exciting, and I think we can also consider bringing artificial intelligence and machine learning capabilities into this as we use cloud computing. And so the information and insights are then shared through a sophisticated infrastructure and services delivery approach. Who knows where we might start seeing the ability to analyze these processes and add all sorts of new value-added benefits and transactional efficiency? It's going to be really exciting in the next several years.

I'm afraid we'll have to leave it there. You've been listening to a sponsored BriefingsDirect discussion focusing on how technology, data analytics and digital business networks are transforming procurement and source-to-pay processes as we know them. And we've heard how new survey results from SAP show seven essential traits of excellence for modern procurement and digital business innovation.

We’ve also seen from this data how these best practices have been proven in the field, how they can deliver significant benefits, and how they further elevate procurement into a strategic role within organizations.

So, please join me in thanking our guest for sharing this great information and insight. We've been here with Kay Ree Lee, Director of Value Realization at SAP. Thank you so much.

Lee: Thanks, Dana.

Gardner: And a big thank you as well to our audience for joining this SAP Ariba-sponsored business innovation thought leadership discussion. I'm Dana Gardner, Principal Analyst at Interarbor Solutions, your host and moderator. Thanks again for listening, and do come back next time.

Listen to the podcast. Find it on iTunes. Get the mobile app. Download the transcript. Sponsor: SAP Ariba.

Transcript of a discussion on how technology, data analytics and digital business networks are transforming procurement and source-to-pay processes as we know them. Copyright Interarbor Solutions, LLC, 2005-2016. All rights reserved.
