Monday, January 18, 2016

Procurement in 2016—The Supply Chain Goes Digital

Transcript of a BriefingsDirect discussion on the role of procurement as a strategic business force.

Listen to the podcast. Find it on iTunes. Get the mobile app. Download the transcript. Sponsor: SAP Ariba.

Dana Gardner: Hi, this is Dana Gardner, Principal Analyst at Interarbor Solutions, and you’re listening to BriefingsDirect.

Our business innovation thought leadership discussion today focuses on the heightened role and impact of procurement as a strategic business force. We'll explore how intelligent procurement is rapidly transforming from an emphasis on cost savings to creating new business value and enabling supplier innovations.

As the so-called digital enterprise adapts to a world of increased collaboration, data access, and business networks, procurement leaders can have a much bigger impact, both inside and outside of their companies.

To learn more about the future of procurement as a focal point of integrated business services we’re joined by Kurt Albertson, Principal of Advisory Services at The Hackett Group in Atlanta. Welcome, Kurt.

Kurt Albertson: Thank you, Dana.

Gardner: We're also joined by Dr. Marcell Vollmer, Chief Operating Officer at SAP Ariba and former Chief Procurement Officer at SAP. Welcome, Marcell.

Dr. Marcell Vollmer: Thanks, Dana, for having me. Great being here.

Gardner: You must be seeing external forces having an impact on your company and on Ariba's customers. We're looking at mobile devices being used more and more for business. We have connected business networks. How are these trends impacting procurement, and why is procurement going to have a bigger impact as time goes on?

Vollmer: Thanks, Dana. That's a really good question. I see a couple of disruptive trends, which are very important and are directly impacting procurement.

We see how smartphones and tablets have changed the way we work on a daily basis, not to forget big data, Internet of Things (IoT), Industry 4.0. So, there are a lot of technology trends out there that are very important.

On the other side, we also see completely new business models taking off. Uber is the largest taxi company without owning a single cab. Airbnb is basically the same: the largest accommodation provider, yet it doesn't own a single bed. We also see companies like WhatsApp, Skype, and WeChat. They no longer own the infrastructure the way companies did in the past.

I could mention a couple more, like Alibaba. Everybody knows it was the highest IPO in history, with a market capitalization of around $200 billion, and they don't even have an inventory. What we're seeing are fundamental changes: new technology on one side and new business models on the other.

We now see the impact here for procurement. When business models are changing, procurement also needs to change. Companies intend to simplify the way they do business today.

Complex processes

We see a lot of complex processes. We have a lot of complex business models. Today it needs to be "Apple easy" and "Google fast." This is simply what millennials expect in the market.

But we also see that procurement, as a function, is itself transforming from a service function into a more strategic one. And this is definitely one trend. We see a different strategic impact: what the lines of business ask of procurement matters more and is now firmly on the procurement agenda.

Let me add one last topic, the evolution of the Chief Procurement Officer (CPO) role. Given the different trends in the market, and the different requirements those trends create for procurement, the role of procurement, and of the CPO, will definitely change in the 21st century.

I believe that the CPO role might evolve and might be a Chief Collaboration Officer role. Or, in the future, as we see the focus is more and more on the business value, a Chief Value Officer role might be the next big step.

Gardner: Kurt, we're hearing a lot from Marcell about virtual enterprises. When we say that a major retailer doesn't have an inventory, or that an accommodation provider doesn't own any beds, we're really talking about relationships. We're talking about knowledge rather than physical goods. Does that map in some way to the new role of the CPO? How has the virtual enterprise impacted the procurement process?

Albertson: Marcell brought up some great points. Hackett is a quantitative-based organization. Let me share with you some of the insights from a very recent Key Issues Study that we did for 2016. This is a study we do each year, looking forward across the market. We're usually talking with the head of procurement about where the focus is, what’s the priority, what’s going to have the biggest impact on success, and what capabilities they're building out.

Let me start at a high level. A lot of things that Marcell talked about in terms of elevating procurement’s role, and more collaboration and driving more value, we saw it quite strongly in 2015 -- and we see it quite strongly in 2016.

In 2015, when we did our Key Issues Study, the number one objective of the procurement executive was to elevate the role of procurement to what we called a trusted adviser, and certainly you've heard that term before.

We actually put a very solid definition around it, but achieving the role of a trusted adviser, in itself, is not the end-game. It does allow you to do other things, like reduce costs, tap suppliers for innovation, and become more agile as an organization, which was in the top five procurement objectives as well.

Trusted adviser

So when we look at this concept of the trusted adviser role of procurement, just as Marcell said, procurement executives across multiple industries are asking, "How do we change the perception of procurement in the eyes of our stakeholders, so that we can do more higher-value activities?"

For example, if you're focusing on cost, we talk a lot about the quantity of spend influence, versus the quality of spend influence. In fact, in our forum in October, we had a very good discussion on that with our client base.

We used to measure success of the procurement organization by cost savings, but one of the key metrics a lot of our clients would look at is percent of spend influenced by procurement. We have a formal definition around that, but when you ask people, you'll get a different definition from them in terms of how they define spend influence.

What we've realized is that world-class organizations are in the 95 percent range overall, and 90 percent-plus on the indirect side. Non-world-class procurement organizations are lagging, in the 70 percent range in terms of influence. Where do we go from here? It has to be about the quality of the spend influence.
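
Hackett's formal definition of spend influence isn't spelled out here, but as a minimal sketch of how such a percentage could be computed -- with hypothetical numbers and field names -- consider:

```python
# Hypothetical sketch: "percent of spend influenced by procurement."
# Assumes the simple approach of dividing procurement-influenced spend
# by total spend; Hackett's actual formal definition may differ.

def percent_spend_influenced(spend_records):
    """spend_records: (amount, influenced) pairs, where 'influenced'
    is True if procurement was involved in that spend."""
    total = sum(amount for amount, _ in spend_records)
    influenced = sum(amount for amount, inf in spend_records if inf)
    return 100.0 * influenced / total if total else 0.0

records = [
    (4_000_000, True),   # direct materials, procurement-led sourcing
    (1_500_000, True),   # IT services, procurement-run RFP
    (500_000, False),    # maverick spend, no procurement involvement
]
print(f"{percent_spend_influenced(records):.1f}% of spend influenced")
# -> 91.7%, near the 90-95 percent range cited for world-class groups
```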

And what our data shows very clearly is that world-class organizations are involved during the requirements and planning stages with their internal stakeholders much more often than non-world-class organizations. The latter are usually involved either once the supplier has been identified, or for the most part, once requirements are identified and the stakeholder already knows what they want.

In both cases, you're influencing. But in the world-class case, you're doing a much better job of quality of influence, and you can open up tremendous amounts of value. It changes the discussion with your internal stakeholders from, "We're here to go out and competitively bid and help you get the best price," to, "Let’s have a conversation with what you're trying to achieve and, with the knowledge, relationships, and tool sets that we have around the supply markets and managing those supply markets, let us help you get more value in terms of what you are trying to achieve."

Some organizations have asked us how to become a trusted adviser, and we've built some frameworks around that. One of the key things is exactly what you just talked about. In fact, we published a forward-looking, 10-year-out procurement 2025 vision piece of research a few months ago, and big data and analytics were key components of it.

When we look at big data, like a lot of the things Marcell already talked about, most procurement groups aren’t very good at doing basic spend analytics, even with all the great solutions and processes that are out there. Still, when we look out in the market, there are a lot of companies that don't have line-item-level detail, or they don't have 90 percent or 95 percent-plus data quality with respect to spend analytics.

We need to move way beyond that for procurement to really elevate its role within the organization. We need to be looking at all the big data that's out there across supply networks and a lot of other sources of information. You have PDAs and all kinds of information.

We need to be constructively pulling that information together in a way that then allows us to marry it up with our internal information, do more analysis with that, synthesize that data, and then turn it over and provide it to our internal stakeholders in a way that's meaningful and insightful for them, so that they can then see how their businesses are going to be impacted by a lot of the trends out in the supply markets.
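
As a rough illustration of that marrying of external and internal data, here is a hedged sketch using pandas; the column names, figures, and join logic are invented for the example, not any specific Hackett or SAP Ariba method.

```python
# Illustrative only: join internal spend data with external supply-market
# data by category, then estimate budget exposure per category.
import pandas as pd

internal_spend = pd.DataFrame({
    "supplier": ["Acme Metals", "Globex Chem", "Initech IT"],
    "category": ["raw materials", "chemicals", "IT services"],
    "annual_spend": [4_000_000, 1_500_000, 500_000],
})

external_market = pd.DataFrame({
    "category": ["raw materials", "chemicals", "IT services"],
    "price_trend_pct": [8.0, -3.0, 1.5],   # forecast market price change
    "supply_risk": ["high", "low", "medium"],
})

merged = internal_spend.merge(external_market, on="category")
merged["exposure"] = merged["annual_spend"] * merged["price_trend_pct"] / 100
print(merged[["supplier", "category", "exposure", "supply_risk"]])
```

Synthesized this way, the combined view is something a category manager can hand to a stakeholder, rather than raw feeds from either side.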

Transformational impact

This year, we asked a question that I thought was interesting: which trends will have the greatest transformational impact on the way procurement performs its job over the next decade? I was shocked. Three of the top five have to do with technology -- predictive analytics and forecasting tools, cloud computing, and mobility -- alongside the global economy and the millennial workforce.

Mobility, predictive analytics, forecasting, and cloud computing are in the top five, along with global economy and the millennial workforce, two other major topics that were in our forward-looking procurement 2025 paper.

When we look at the trend that’s going to have the greatest transformational impact, it's predictive analytics and forecasting tools in terms of how procurement performs its job over the next 10 years. That’s big.

Consider the fact that we aren’t very good at doing the basics around spend analytics right now. We're saying that we need to get a lot better to be able to predict what’s going to happen in the future in terms of budgets, based on what we expect to happen in supply markets and economies.

We need to put in the hands of our stakeholders tool sets that they can then use to look at their business objectives and understand what's happening in the supply market and how that might impact them in two to three years. That way, in some industries, when your revenue gets cut by more than half within a year, you have a plan in place that you can execute to take out cost in a strategic way, as opposed to just taking a broad axe to it.

Gardner: Interesting. So it's being much smarter, much more analytic, and being proactive, rather than reactive, to what’s going on inside your company. It’s interesting, Kurt, because I talk to a lot of IT people, and they're doing more with the data and the analysis and they want to elevate it to more people in the organization, like the CPO.

It seems to me that Ariba is in a position to bridge these groups between what IT or data providers can bring out in terms of analysis and what these CPOs are going to need in order to do their jobs differently.

Marcell, do you see that? Do you see SAP Ariba playing a role as the bridge between technology and business processes that the CPOs are going to be looking to have more insight from?

Vollmer: Absolutely, Dana. I couldn't agree more with what Kurt said about the importance of the top priorities today. It's very important also to ask what you want to do with the data. First of all, you need technology. You need to get access to all the different sources of information that you have in a company.

We see today how difficult that is. I could echo what Kurt said about the challenges. A lot of procurement functions aren't even capable of getting the basic data to drive procurement, to do spend analytics, and then to link that to supply-chain data. In the future, this will definitely change.

Good time to purchase

Think about what you can do with the data through predictive analytics. You can say, "This is a good time to buy, based on the cycles we've seen in this time frame." That tells you when to make a purchase decision and go to the market.

And what do you need to do that? You need the right tools -- spend visibility tools -- and access to the data to drive end-to-end transparency across all the data you have, for the entire source-to-pay process.
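
As a toy illustration of that buy-timing idea, the sketch below derives a naive seasonal index from hypothetical price history; real predictive tooling would be far richer than a monthly average.

```python
# Toy sketch of "this is a good time to buy": find the historically
# cheapest month from (hypothetical) monthly commodity prices.
from statistics import mean

# Three years of monthly prices, January through December per row.
history = [
    [102, 104, 108, 110, 109, 107, 105, 103, 100, 98, 97, 99],
    [103, 106, 109, 112, 111, 108, 106, 104, 101, 99, 98, 100],
    [104, 107, 111, 113, 112, 110, 107, 105, 102, 100, 99, 101],
]

# Seasonal index: average price for each calendar month across years.
seasonal = [mean(year[m] for year in history) for m in range(12)]
cheapest_month = seasonal.index(min(seasonal))  # 0-based index

print(f"Historically cheapest month: {cheapest_month + 1}")  # -> 11
```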

Gardner: Another thing that we're expecting to see more of in 2016 is collaboration between procurement inside an organization and suppliers -- finding new ideas for how to do things, whether it’s marketing or product design.

Kurt, do you have any data that supports this idea that this is not just a transaction, that there is, in fact, collaboration between partners, and that that can have quite an impact on the role and value that the procurement officer and their charges bring back to their companies?

Albertson: Let me tie it into the conversation that we've been having. We just talked about a lot of data and analytics, and about putting that in the hands of procurement folks so that they can go and have conversations and really be advisers who help enable business strategies, as opposed to just looking at historical spend-cost analysis, for example. That helps procurement category managers raise their game and really be perceived as adding more value, becoming this trusted adviser.

Hackett Group works with hundreds of Global 1000 organizations, and probably still one of the most common discussions we have, even in the on-site training support that we do, is around strategic category management. It's switching the game from strategic sourcing, which we view as a process that ends with aggregating spend, running a competitive bid, and awarding a contract, to a more formal category management framework.

That framework provides a whole set of broader value levers you can pull to drive value, including supplier relationship management (SRM), which includes working with suppliers to innovate. Those levers impact a much broader set of the value objectives our stakeholders have -- including spend cost reduction, but not only cost reduction.

We see that level of interest in category management today. In our 2016 Key Issues Study, when we look at the capabilities that organizations are building out, we see this shift from strategic sourcing to category management.

Strategic sourcing as a capability was always number one. It still is, but now number two is this category management framework. Think of those two as bookends, with category management being a much more mature framework than just strategic sourcing.

Category management

Some 80 percent of companies said category management is a key capability that they need to use to drive procurement’s objectives, and that’s because they're impacting a broader set of value objectives.

Now, the value levers they're pulling are around innovation and SRM. In fact, if you look at our 2016 Key Issues Study again, tapping supplier innovation is actually a little further down the list -- ninth, with 55 percent of procurement executives saying it's of critical or major importance.

The interesting thing, though, is that if you compare 2015 with 2016, in 2016 it moves nearly into the top three in terms of where significantly more focus is being placed as a key capability. SRM has been a hot topic for our clients for a long time, but this tells us that it's getting more and more important.

We're seeing a lot of organizations still with very informal SRM and supplier-innovation frameworks in place. The work is done within the organization, but it's done haphazardly by individuals in the business and by key stakeholders. A lot of times, that activity isn't necessarily aligned with where it can drive the most value.

When we work with a company, it's quite common for them to say, "These are our top five suppliers that we want to innovate with." And you ask, "If innovation is your objective, either to drive cost reduction or to help improve the market effectiveness of your products or services and drive greater revenue, whatever the reason you are doing that, are these suppliers going to get you there?"

Probably 7 out of 10 times, people come back to us and say that they picked these suppliers because they were the largest spend impact suppliers. But when you start talking about supplier innovation, they freely admit that there's no way that supplier is going to engage with them in any kind of innovation.

We have to rethink how we look at our supply base and really understand where those suppliers are that can truly move the needle on supplier innovation and engage them through a category-management framework that pulls the value lever of SRM and then track the benefits associated with that.

And as I said, looking at our 2016 Key Issues Study, supplier innovation was the fastest-growing focus objective we saw when we asked procurement executives.

Gardner: Marcell, back to you. It sounds as if the idea of picking a supplier is not just a cost equation, but that there is a qualitative part to that. How would you automate and scale that in a large organization? It sounds to me like you need a business network of some sort where organizations can lay out much more freely what it is that they're providing as a service, and then making those services actually hook up -- a collaboration function.

Is that something you're seeing at SAP Ariba as well -- the business network helping procurement move from a transaction cost equation to a much richer set of services?

Key role

Vollmer: Business networks play a key role in our business strategy, and also in how we help companies simplify their complexity.

When you reach out to a marketplace, you're looking for things. You're probably also starting discussions and getting additional information. In the automotive industry, you're not necessarily looking for paint, or for the color of a car. Why not get an already-painted car as a service at the end?

This is a very simple example, but now think about going to the next level, where you evolve toward a technology partnership. You reach out to new suppliers, getting more and more information, and also asking others who have probably already done similar things.

When you do this on a network, you probably get responses from suppliers you wouldn't even have thought of as having capabilities like that. This is a process that, in the future, will continue to successfully aid the transformation to a more value-focused procurement function, and simplicity is definitely key.

You need to run simple. You need to focus on your business, and you need to get rid of the complexity. You can’t have all the information and do everything on your own. You need to focus on your core competencies and help the business in getting whatever they need to be successful, from the suppliers out in the market to ensure you get the best price for the desired quality, and ensure on-time deliveries.

The magic triangle of procurement is not a big secret in the procurement world. Everybody knows that it's not possible to optimize everything at once. Therefore, you need to find the right mix. You also need to be agile and work with suppliers in a different way, not focusing only on price, as a lot of operational, technical procurement functions are used to doing. You need to focus on what you really want to achieve as a business outcome.
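
One minimal way to picture that magic-triangle trade-off is a weighted supplier score, sketched below with hypothetical weights and suppliers. It isn't a prescribed SAP Ariba method -- just an illustration that favoring one corner of the triangle means explicitly weighting the other two.

```python
# Illustrative multi-criteria supplier scoring across the "magic triangle"
# of price, quality, and on-time delivery. Weights reflect the business
# outcome being pursued; all values here are hypothetical.

weights = {"price": 0.40, "quality": 0.35, "on_time_delivery": 0.25}

suppliers = {
    "Supplier A": {"price": 0.9, "quality": 0.6, "on_time_delivery": 0.7},
    "Supplier B": {"price": 0.7, "quality": 0.9, "on_time_delivery": 0.8},
}

def weighted_score(scores):
    """Each criterion scored 0-1, higher is better (price pre-normalized)."""
    return sum(weights[k] * scores[k] for k in weights)

for name, scores in suppliers.items():
    print(f"{name}: {weighted_score(scores):.3f}")
# Supplier A: 0.745, Supplier B: 0.795 -- the cheaper supplier loses once
# quality and delivery carry real weight.
```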

On a network you can get help from suppliers, from the collaboration side also, in finding the right ones to drive business value for your organization.

Gardner: Another major area where we're expecting significant change in 2016 is the use of procurement as a vehicle for risk reduction. Having this visibility through networks -- elevating the use of data analysis, everything we've talked about, in addition to cost efficiencies and to brokering innovation between suppliers and consumers at industrial scale -- means we're getting deep insight into supply chains and can therefore head off a variety of risks. These risks can be around security, around the ability to keep supply chains healthy and functioning, and even unknown factors that could damage an entire company's reputation.

Kurt, do you have some data, some findings that would illustrate or reinforce this idea that procurement as a function, and CPOs in particular, can play a much greater role in the ability to detect risk and prevent bad things from happening to companies?

Supply continuity risk

Albertson: Again, I'll go back to the 2016 Key Issues Study and talk about objectives. Reducing supply continuity risk is actually number six on the list, and it’s a long list, and that’s pretty important.

A little bit further down, we see things like regulatory noncompliance risk, which is certainly core, and certainly more aligned with some industries than others. So, from our perspective, supply continuity risk is number six on the list of procurement's 2016 objectives, and the question is what to do about it.

There's another objective that I talked about earlier, which is to improve agility. It's actually number four on the list for procurement 2016 objectives.

I look at risk management and procurement agility going hand in hand. The way data helps support that is by getting access to better information, really understanding where those risks are, and then being able to quickly respond and hopefully mitigate those risks. Ideally, we want to mitigate risks and we want to be able to tap the suppliers themselves and the supply network to do it.

In fact, we attacked this idea of supply risk management in our 2025 procurement study. It's really about going beyond a particular supplier and looking at all the suppliers out there in the network -- their suppliers, their suppliers' suppliers, and so on.

But then, it's also tapping all the other partners that are participating in those networks, and using them to help support your understanding and proactively identifying where risk might be occurring, so that you can take action against it.

It’s one of the key cornerstones of our 2025 research. It's about tapping supplier networks and pulling information from those networks and other external sources, pulling that information into some type of solution that can help you manage and analyze that information, and then presenting that to your internal stakeholders in a manner that helps them manage risk better.

And certainly, an organization like SAP Ariba is in a good position to do that. That’s obviously one of the major barriers with this big-data equation. How do we manage and analyze all this data? How do we make sense of it? That's where we see a lot of our clients struggling today.

We've had examples of clients that built out an SRM group inside their procurement organization as a center-of-excellence capability, purely to pull in the information that resides out in the market, whether it's supplier market intelligence or information flowing from networks and other network partners. Marrying that with their internal objectives and plans, and then synthesizing it, lets them put it in the hands of category managers.

Category managers can then sit down with business leaders and offer fact-based opinions about what's going to happen in those markets from a risk perspective. We could be talking about continuity of supply, pricing risks and the impact on profitability, or what have you. Whatever those risks are, you're able to use that information. It goes back to elevating the role of trusted adviser. The more information and insight you can put into their hands, the better.

The indirect side

Obviously, when we look at some of the supply networks, there's a lot of information that can be gleaned out there. Think about different buyers working with the same suppliers and sharing information on supplier risk and performance. To be frank, a lot of organizations still don't do a great job on the indirect side.

There are opportunities -- and we're already seeing it in some of these markets -- for supply networks to start with the supplier-performance piece, tap the network community for insight, and get help from a risk perspective that can be used to identify where risk might be managed better.

But there are a lot of other sources of information, and it's really up to procurement to figure this out across all the sources of big data. Whether it's sensor data, social data, transactional data, operational data, partner data, machine-to-machine (M2M) data, or cloud-services data, there's a lot of information. We have a model that looks at this as three levels of analytics.

The first level of the model is just recording things and generating reports. The second level is understanding: generating information that can then be used for analytics. Third, you're actually anticipating: you have intelligence, and you're moving toward more real-time analytics so that you can respond more quickly to potential risk.
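
As a rough illustration of those three levels, the sketch below walks one hypothetical supplier-delivery data set from reporting to analysis to anticipation; the mapping is an interpretation for illustration, not Hackett's actual model.

```python
# Hypothetical supplier delivery data: days late per recent shipment.
from statistics import mean

late_days = [0, 2, 0, 5, 1, 7, 3, 9]

# Level 1 -- record and report what happened.
print("Shipments:", len(late_days), "| Late:", sum(d > 0 for d in late_days))

# Level 2 -- understand and analyze: is lateness trending worse?
older, recent = late_days[:4], late_days[4:]
print("Avg days late -- older half:", mean(older), "recent half:", mean(recent))

# Level 3 -- anticipate: naive extrapolation as an early-warning flag.
trend = mean(recent) - mean(older)
forecast = mean(recent) + trend
if forecast > 5:
    print(f"Risk flag: projected ~{forecast:.1f} days late next period")
```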

I mentioned this idea of agility as being key on the procurement executive's list. Agility can mean many things, but one thing it means with respect to risk is that you can't avoid every risk event. Some risk events are going to happen, and there's nothing you can do about them, but you can proactively make plans for when those events do occur, so that you have a well-thought-out, analytics-based plan to execute in order to minimize the impact of that risk.

Time and time again, when we look at case studies and at the research that's out there, the organizations that are much more agile in responding to the risks they can't avoid minimize the impact of those risks significantly compared to others.

Gardner: As we look ahead to 2016, we're certainly seeing a lot on the plate for the procurement organization. They're facing more technology issues, they're facing culture change, and they're thinking about being a networked organization. Marcell, how do you recommend that procurement professionals prepare themselves? What would you recommend they do to meet these challenges in 2016? How can they be ready for such a vast amount of change?

Vollmer: Procurement organizations need to ensure that they really help the business as much as possible, and also evolve their own procurement functions to the next level. Number one, procurement functions need to have the right organizational setup in place. That setup needs to fit the overall organization and the lines of business a company has.

The second component, which I think is very important, is to have an end-to-end focus on the process side. Source-to-pay is a clearly defined term, but it looks a little different in every company. When you really want to optimize and streamline your process, you want to use business networks and strategic sourcing tools, as well as run transactions at a highly automated level, to leverage the automation potential in purchase order and invoice automation, for example.
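
To make that invoice-automation idea concrete, here is a minimal sketch of a two-way purchase-order/invoice match. The field names and tolerance rule are illustrative assumptions, not SAP Ariba's actual matching logic.

```python
# Illustrative two-way match: auto-approve invoices that agree with the
# purchase order within a price tolerance; route everything else to a human.

TOLERANCE = 0.02  # accept up to 2% unit-price variance without review

purchase_orders = {
    "PO-1001": {"item": "laptops", "qty": 10, "unit_price": 1200.00},
}

def match_invoice(invoice):
    po = purchase_orders.get(invoice["po_number"])
    if po is None:
        return "reject: unknown PO"
    if invoice["qty"] != po["qty"]:
        return "route to buyer: quantity mismatch"
    variance = abs(invoice["unit_price"] - po["unit_price"]) / po["unit_price"]
    if variance <= TOLERANCE:
        return "auto-approve"
    return "route to buyer: price variance"

print(match_invoice({"po_number": "PO-1001", "qty": 10, "unit_price": 1212.00}))
# -> auto-approve (1% variance is within tolerance)
```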

One defined process

Then, you need to ensure that you have one defined process, with systems covering all the different parts of that process. These need to be highly integrated with one another, as well as integrated into your entire IT landscape.

Finally, you need to consider change management. This is a critically important component: it helps the buyers in your organization transform and evolve to the next level, into a more strategic procurement function.

As Kurt said about the data, if you don't have the basic data, you're very far from driving predictive analytics and prescriptive guidance. Therefore, you need to ensure that you also invest in your talent and drive the change management side.

These are the three components I see for 2016. This sounds easy, but I've talked to a lot of CPOs, and this journey might take a couple of years. Procurement doesn't have a lot of time, though. We need to define the right measures and the right actions now to ensure that we can help the business and also create value.

As was already mentioned, this needs to go beyond just creating procurement savings. I believe that concept is here to stay: in the future, it's the value you can create that counts.

Gardner: I'm afraid we'll have to leave it there. You've been listening to a sponsored BriefingsDirect podcast discussion on the heightened role and impact of procurement as a strategic business force. And we learned how procurement leaders can make a much bigger difference in the organization as procurement itself transforms to become a focal point of integrated business services.

So please join me in thanking our guests, Kurt Albertson, Principal of Advisory Services at The Hackett Group, and Dr. Marcell Vollmer, Chief Operating Officer at SAP Ariba, and the former Chief Procurement Officer at SAP. And also a big thank you to our audience for joining this SAP Ariba-sponsored business innovation thought leadership discussion.

I’m Dana Gardner, Principal Analyst at Interarbor Solutions, your host and moderator. Thanks again for listening, and do come back next time.

Listen to the podcast. Find it on iTunes. Get the mobile app. Download the transcript. Sponsor: SAP Ariba.

Transcript of a BriefingsDirect discussion on the role of procurement as a strategic business force. Copyright Interarbor Solutions, LLC, 2005-2016. All rights reserved.


Steve Nunn, The Open Group President and CEO, Discusses the Inaugural TOGAF User Group Meeting and Practical Role of EA in Business Transformation

Transcript of a BriefingsDirect discussion with President and CEO of The Open Group, Steve Nunn, on what to expect from The Open Group San Francisco 2016, January 25 to 28.
 
Listen to the podcast. Find it on iTunes. Get the mobile app. Download the transcript. Sponsor: The Open Group.

Dana Gardner: Hello, and welcome to a special BriefingsDirect thought leadership interview coming to you in conjunction with The Open Group San Francisco 2016 event on January 25. We'll explore a new user group being formed around TOGAF, The Open Group standard, and how this group will further foster the practical use of the TOGAF enterprise architecture framework for effective and practical business transformation.

I'm Dana Gardner, Principal Analyst at Interarbor Solutions, and I'll be your host as we set the stage for the next chapter in enterprise architecture (EA) for digital business success.

We are now here with the President and CEO of The Open Group, Steve Nunn. Welcome, Steve.

Steve Nunn: Thank you, Dana. Glad to be here.

Gardner: Before we get to the TOGAF User Group news, let’s relate what’s changed in the business world and why EA and frameworks and standards like TOGAF are more practical and more powerful than ever.

Nunn: One of the keys, Dana, is that we're seeing EA increasingly used as a tool in business transformation. In the past, in the early adoptions and implementations of TOGAF, it was more about redesigning systems inside an organization. Nowadays, with the need to transform businesses for the digital world, EA has a more immediate and more obvious appeal.

It's really an enablement tool for companies and organizations to transform their businesses for the digital world -- specifically the worlds of the Internet of Things (IoT), big data, social, and mobile, all of which we at The Open Group lump into something we call Open Platform 3.0. It really is affecting the business place at large and the markets that our member organizations are part of.

Gardner: TOGAF has been around for quite a while. How old is TOGAF now?

Nunn: The first version of TOGAF was published in 1993, so it's been quite some time. For a little while, we published a version every year. Once we got to Version 7.0, the refreshes and the new versions came a bit slower after that.

We're now at Version 9.1, and a new version is being worked on. The key for TOGAF is that we introduced a certification program around it, both for tools that help people implement TOGAF and for the practitioners, the individuals actually using it. We did that with Version 8.0, and then we moved to what we consider, and the marketplace certainly considers, an improved version with TOGAF 9.0, making it an exam-based certification. It has proved very popular indeed, with more than 50,000 certified individuals under that program to date.

Gardner: Now the IT world, the business world, many things about these worlds have changed since 1993. Something that comes to mind, of course, is the need to not just think about architecture within your organization, but how that relates across boundaries of many organizations.

I sometimes tease friends who are Star Trek fans that we have gone from regular chess to 3-D chess, and that’s a leap in complexity. How does this need to better manage Boundaryless Information Flow make EA and standards like TOGAF so important now?

Common vocabulary

Nunn: With the type of change you talked about and that level of complexity, what standards like TOGAF bring is commonality and the ability to make architecting organizations a little easier -- to give it all a bit more structure. One of the things we hear is most valuable about TOGAF, in particular, is the common vocabulary it gives to those involved in a business transformation, which obviously involves multiple parts of an organization, and often multiple partners across a group of organizations.

So, it’s not just for enterprise architects. We're hearing increasingly about a level of training and introductory use of TOGAF at all levels of an organization as a means of communicating and having a common set of terminology. So everyone has the same expectation about what particular terms mean. With added complexity, we need things to help us work through that and divide up the complexity into different layers that we can tackle. EA and TOGAF, in particular, are proving very popular for tackling those levels of complexity.

Gardner: So in the next chapter, these things continue to evolve, react to the market, and adjust. We're hearing that there's news at the January 25 event in San Francisco around this new user group. Tell me, why institute a user group associated with TOGAF at this point?

Nunn: It's going to be the first meeting of a TOGAF User Group, and it's something we've been thinking about for some time, but the time seems to be now. I've alluded to the level of popularity of TOGAF, but it really is becoming very widely used. What users of TOGAF are looking for is how to use it better in their day jobs. How can they make it effective? How can they learn from what others have done, both good and bad -- the things that worked and the things that didn't? That isn't something we've necessarily offered before, apart from a few conference sessions at previous events.

So this is really about engaging a broader community around TOGAF, not just the members of the Architecture Forum, which is our forum that advances the TOGAF standard. It's to engage the wider community, both those who are certified and those who aren't, as a way of learning how to make better and more effective use of TOGAF. There are a lot of possibilities for what we might do at the meeting, and a lot of it will depend on what those who attend would like to cover.

Gardner: Now, to be clear, any standard has a fairly rigorous process by which the standard is amended, changed, or evolves over time. But we're talking about something separate from that. We're talking about perhaps more organic information flow, sharing, bringing points into that standard’s process. Maybe you could clarify the separation, the difference, the relationship between a standard’s adoption and a user group's input.

Nunn: That’s the key point, Dana. The standard will get evolved by the members of The Open Group, specifically the members of The Open Group Architecture Forum. They are the ones who have evolved it this far and are very actively working on a future version. So they will be the ones who will ultimately get to propose what goes in and ultimately vote on what goes in.

Where the role of the user community comes in, both members and non-members -- but specifically the opportunity for non-members -- is being able to give their input and put forward ideas about areas where TOGAF might be strengthened or improved in some way. Nobody pretends it's perfect. It has evolved over time, and it will evolve in the future. But by hearing from those who actually use TOGAF day to day, we might get some new perspectives, and those perspectives will then get passed on through us to the members of the Architecture Forum.

Many of those we expect to attend the event anyway. They might hear it for the first time, but certainly we would spend part of the meeting looking at what that input might be, so that we have something to pass on to them for consideration in the standard.

This is the first time we've offered nonmembers a real opportunity, not necessarily to decide what goes into the standard, but certainly a greater degree of influence.

It's somewhat of a throwback to the days where user groups were very powerful in what came out of vendor organizations. I do hope that this will be something that will enable everyone to get the benefit of a better overall standard.

Past user groups

Gardner: I certainly remember, Steve, the days when vendors would quake in their boots when user meetings and groups came up, because they had such influence and impact. They both benefited each other. The vendors really benefited by hearing from the user groups and the user groups benefited by the standards that could come forth and vendor cooperation that they basically demanded.

I recall, at the last Open Group event, the synergy discussions around Zachman and other EA frameworks. Do you expect that some of these user group activities will allow some of that cross-pollination -- people who might be using other EA tools and want to bring more cooperation and collaboration across them?

Nunn: I would certainly expect that to happen. Our position at The Open Group, and we've said it consistently over the years, is that it's not "TOGAF or," it's "TOGAF and." The reality is that most organizations, the vast majority, are not just going to take TOGAF and let it be everything they use in implementing their EAs.

So the other frameworks are certainly relevant. I expect there to be some interest in tools as well as frameworks. We hear that quite a lot: suggestions for good tools for people at different stages of maturity in their implementation of EA. So I expect a lot of discussion about the other tools in the enterprise architect's toolbox to come up here.

Gardner: So user groups serve to bring more of an ecosystem approach; voices from disparate parties coming together sounds very powerful. Now, this is happening on January 25, and this first meeting is free. Is that correct? And being in San Francisco, it's within a couple of hours' drive for a lot of influential users, start-ups, the VC community, vendors, and service providers. Tell us a little about why people with quick access to the Bay Area might consider coming on January 25.

Nunn: That's another reason -- the location of our next event. We thought this was the right time to do a first TOGAF User Group, because there are a lot of users of TOGAF in the area or within a few hours of it. What people would get out of it is the chance to hear a bit more about how TOGAF is used by others -- case studies, what's worked, what hasn't worked -- and the opportunity to talk directly with people, whether through networking or in the user group sessions themselves.

We're trying not to put too much rigid structure around those particular sessions, because otherwise we wouldn't get the most benefit out of them. It's really what attendees want to get out of it that will shape what's achievable. From The Open Group's point of view, it's about giving attendees a broader perspective: learning useful tips and tricks, learning from the experience of others, and learning a bit more about The Open Group and how TOGAF has evolved.

This is a key point. TOGAF is very widely used globally now. We have quite a few members in The Open Group -- more than 350 organizations participating in some way in the Architecture Forum, and more in The Open Group as a whole.

But there's obviously a much wider community using it. Hearing more about how it has developed, and what the processes are inside The Open Group, might make them feel good about the future of something they clearly have an investment in. Hopefully, it might even persuade a few of those organizations to join and influence it from the inside.

Gardner: Now, there's more information about the user group at www.opengroup.org. You're meeting on January 25 at 9:30 a.m. Pacific Time at the Marriott Union Square right in the heart of San Francisco. But this is happening in association with a larger event. So tell us about the total event that's happening between January 25 and 28.

Quarterly events

Nunn: This is part of one of our quarterly events, which we've been running for a lot of years now. They generally take the form of plenary sessions that are open to anyone, plus member meetings, where the members of the various Open Group forums get together to progress the work they otherwise do virtually -- to really knuckle down and progress some of it face-to-face, which, as we all know, is generally a very productive way of working.

Apart from the TOGAF User Group, we have sessions on the agenda about Digital Business Strategy and Customer Experience, an activity driven inside our Open Platform 3.0 Forum as membership work; the conference is really about opening that up to a wider audience. So we'll have people talking about that.

Open Platform 3.0 is where the convergence of technologies like cloud, social computing, mobile computing, big data, and IoT comes together. As we see it, our goal is for our members to create an Open Platform 3.0 standard -- basically a standard for a digital platform -- so that enterprises can more easily use these technologies and get the benefit of them. There will be quite a bit of focus on Open Platform 3.0.

The other big thing that is proving very popular for us, and which will be featured at the conference, is The Open Group IT4IT Reference Architecture. There's a membership activity around it, the IT4IT Forum, which is working on the standard. We published the first version of that reference architecture at our last quarterly conference, in Edinburgh in October last year.

It's really a standard for running the business of IT. Oftentimes, IT is seen as doing its own thing and not really part of the business. But the reality nowadays is that whoever is running IT, be it the CIO or another individual, has to not just run IT as a business, with the usual business principles of return on investment and so on, but has to be seen to be doing so. This is a reference architecture that isn't specific to any industry and that provides a guide for how to go about doing that.

We're quite excited about it. There has been a lot of interest in it so far, and we are working on a certification program for IT4IT that we will be launching later this year, hopefully at our next quarterly event in London in April.

Gardner: I'll just remind our listeners and readers that we're going to be doing some separate discussions and sharing with them on the IT4IT Reference Architecture. So please look for that coming up.

Getting back to the event, Steve, I've attended many of these over the years and I find a lot of the discussions around security, around specific markets like healthcare and government really powerful and interesting. Is there anything in particular about this conference that you're particularly interested in or looking forward to?

Nunn: The ones I've already spoken to are the ones that I'm personally most looking forward to. We'll be having sessions on health care and security, as you say.

In the security area, it's worth calling out one of the suggestions we've had about TOGAF -- I won't call it criticism, but a suggestion for future versions -- which is that TOGAF is a bit light on security. It could do with beefing up in that particular area.

The approach we've taken this time, which people attending the conference will hear about, is to have the security experts say what we need to cover in the next version of TOGAF from a security point of view. Rather than having the architects include what they know about security, we have some heavyweight security folks working with the Architecture Forum to really beef up the security aspect. We'll hear a bit more about that.

Customer experience

Gardner: I also see that customer experience, which is closely aligned with user experience, is a big part of the event this year. That's such a key topic these days, because it forms a culmination of Platform 3.0. When you can pull together big data, hybrid cloud architectures, and mobile enablement and reach, you can start to do fantastic new things that really couldn't have been done before when it comes to user experience: real-time adaptation to user behaviors, bringing that inference back into a cloud or back-end architecture, and then returning some sort of predictive or actionable result.

Please flesh out for us how user experience and customer experience are such a key part of the output -- the benefit, the value, and the business transformation -- that we get from all these technical issues we've discussed. This is, at heart, a business issue.

Nunn: You're absolutely right. It’s when we start providing a better experience for the customers overall and they can get more out of what the organizations are offering that everybody wins.

The group we have working on this inside The Open Group is coming at it from the point of view that some of these new technologies are actually very scary for organizations, because they are forced to transform. The expectations of customers are now completely different. They expect to be able to get things on their cellphones or tablets, or whatever device they might be using. That's quite a big shift for a lot of organizations, and that's not even getting into some of the areas of IoT, which promises to be huge.

What we're trying to do from the organizational side is focus on what is it that you can do to look at it from the customers’ point of view, meet their expectations, and start to evolve from there.

To me, it's interesting because it's pretty business-driven. The technologies are there to be taken advantage of, or to be very disruptive. So the business needs to know at a fairly early stage what those customer expectations are and take advantage of the new technologies that are there. That's the angle we're coming from inside The Open Group.

Some of the main participants in that group are actually coming from the telco world, where things have obviously changed enormously over the last few years. So that one is going to move quite quickly.

Gardner: It certainly seems that the ability to have a boundaryless architecture is essential to that customer experience benefit. You certainly seem to be in the right place at the right time for that.

But the event in San Francisco also forms a milestone for you, Steve. You're now in your first full event as President and CEO of The Open Group, having taken over from Allen Brown last fall. Tell us a little about your earlier roles within the standards organization, and a bit about yourself for those who are not yet familiar with you.

Quite different

Nunn: Yes, it will be quite different this time around. I've been with The Open Group for 22 years now. I was originally hired as General Counsel, and then fairly quickly moved on to Vice President of Corporate and Legal, and Chief Operating Officer, under Allen Brown as CEO. Allen was CEO for 17 years, and I was with him all of that time. It's going to be quite different to have somebody else running the events, but I'm very much looking forward to it.

From my point of view, it's a great honor to be leading The Open Group and its members into our next phase of evolution. The events that we hold are one small part of it, but they're a very important part, particularly these quarterly ones. It's where a lot of our customers and members come together in one place, and, as we've heard, through the user group there will be some folks who may not have been involved with one of our events before. So it's pretty exciting.

I'm looking forward to building on the very solid foundation that we have and on the great work activities we have ongoing inside The Open Group.

Don't expect great change from The Open Group -- just more of the same good stuff that we've been working on, having regard to the fact that things are changing very rapidly around us and that we need to provide value in that fast-changing world, which we're very confident we can.

Gardner: As an observer of the market, but also of The Open Group, I'm glad to hear that you're continuing on your course, because the world owes you in many ways. Things you were talking about five or 10 years ago have become essential. You were spot on in how you saw the world changing in IT and its influence on business, and vice versa.

More than ever, it seems that IT and EA are destiny for businesses. So I'm glad to hear that you're taking the long view. The future seems very bright for your organization as the tools, approaches, mentality, and philosophy you've been espousing become essential to doing some of the things we've been discussing, like Platform 3.0, customer experience, and IoT.

In closing, let’s remind our audience that you can register for the event at The Open Group website, www.opengroup.org. The first day, January 25, includes that free user group, the inaugural user group for TOGAF, and it all happens at the Marriott Union Square, San Francisco, along with the General Conference, which also runs from January 25 to 28.

Any last thoughts, Steve, as we close out, in terms of where people should expect The Open Group to go, or how they can become involved in ways they hadn't considered before?

Good introduction

Nunn: Attending one of our events is a really good introduction to what goes on in The Open Group. For those who haven’t attended one previously, you might be pleasantly surprised.

If I had to pick one thing, I would say it's the breadth of activities at these events. It's very easy for an organization like The Open Group to be known for one thing or a very small number of things -- UNIX originally, EA more recently -- but there really is a lot going on beyond those.

Getting exposure to that at an event such as this, particularly in a location as important to the industry and as beautiful as San Francisco, is a great chance. So if anyone is on the fence about going, jump over the fence and try us out.

Gardner: We'll have to leave it there, I'm afraid. We've been talking about how a new user group is being formed around TOGAF, an Open Group standard. We've heard how this group will foster practical use of TOGAF, gaining insights from the field, with organic knowledge bubbling up into the standards process around TOGAF. This, of course, is essential for EA to support effective and practical business transformation.

This special BriefingsDirect discussion comes to you in conjunction with The Open Group Event this January in San Francisco. Join me now in thanking our guest. We've been here with Steve Nunn, the President and CEO of The Open Group. Thanks so much, Steve.

Nunn: Thank you very much, Dana, for this opportunity and I hope to see some of your listeners at the event.

Gardner: Very good. Also, a big thank you to The Open Group for sponsoring this discussion. And lastly, a big thank you to our audience for joining us.

This is Dana Gardner, Principal Analyst at Interarbor Solutions, your host and moderator throughout these Enterprise IT Thought Leadership Interviews. Thanks again for listening, and do come back next time.

Listen to the podcast. Find it on iTunes. Get the mobile app. Download the transcript. Sponsor: The Open Group.

Transcript of a podcast with President and CEO of The Open Group, Steve Nunn, on what to expect from The Open Group San Francisco 2016, January 25 to 28. Copyright The Open Group and Interarbor Solutions, LLC, 2005-2016. All rights reserved.


Thursday, January 14, 2016

How SKYPAD and HPE Vertica Enable Luxury Retail Brands to Gain Rapid Insight into Consumer Sales Trends

Transcript of a BriefingsDirect discussion on how Sky I.T. has changed its platform and solved the challenges around variety, velocity, and volume for big data to make better insights available to retail users.

Listen to the podcast. Find it on iTunes. Get the mobile app. Download the transcript. Sponsor: Hewlett Packard Enterprise.

Dana Gardner: Hello, and welcome to the next edition of the HPE Discover Podcast Series. I'm Dana Gardner, Principal Analyst at Interarbor Solutions, your host and moderator for this ongoing discussion on IT innovation and how it’s making an impact on people’s lives.

Our next big-data use case leadership discussion explores how retail luxury goods market analysis provider Sky I.T. Group has upped its game to provide more buyer behavior analysis faster and with more depth. We will see how Sky I.T. changed its data analysis platform infrastructure and why that has helped solve its challenges around data variety, velocity, and volume to make better insights available to its retail users.

Here to share how retail intelligence just got a whole lot smarter, we are joined by Jay Hakami, President of Sky I.T. Group in New York. Welcome, Jay.

Jay Hakami: Thank you very much. Thank you for having us.

Gardner: We're also here with Dane Adcock, Vice President of Business Development at Sky I.T. Group. Welcome Dane.

Dane Adcock: Thank you very much.

Gardner: And we're here with Stephen Czetty, Vice President and Chief Technology Officer at Sky I.T. Group. Welcome to BriefingsDirect, Stephen.

Stephen Czetty: Thank you, Dana, and I'm looking forward to it.

Gardner: What are the top trends that are driving the need for greater and better big-data analysis for retailers? Why do they need to know more, better, faster?

Adcock: Well, customers have more choices. As a result, businesses need to be more agile and responsive and fill the customer's needs more completely or lose the business. That's driving the entire industry into practices that mean shorter times from design to shelf in order to be more responsive.

It has created a great deal of gross marketing pressure, because there's simply more competition and more selections that a consumer can make with their dollar today.

Gardner: Is there anything specific to the retail process around luxury goods that is even more pressing when it comes to this additional speed? Are there more choices and higher expectations of the end user?

Greater penalty

Adcock: Yes. The downside to making mistakes in terms of designing a product and allocating it in the right amounts to locations at the store level carries a much greater penalty, because it has to be liquidated. There's not a chance to simply cut back on the supply chain side, and so margins are more at risk in terms of making the mistake.

Ten years ago, from a fashion perspective, it was about optimizing the return and focusing on winners. Today, you also have to plan to manage and optimize the margins on your losers as well. So, it's a total package.

Gardner: So, clearly, knowing more about what those users are doing, or what they have done, is going to be essential. It seems to me, though, that we're talking about a market-wide look, rather than just one store, one retailer, or one brand.

How does that work, Jay? How do we get to the point where we've been able to gather information at a fairly comprehensive level, rather than cherry-picking or maybe getting a non-representative look based on only one organization’s view into the market?

Hakami: With SKYPAD, what we're doing is collecting data from the supplier and the wholesaler, as well as from their retail stores, their wholesale business, and their dot-com, meaning the whole omnichannel. When we collect that data, we cleanse it to make sure it's meaningful to the user.

Now, we're dealing with a connected world where the retailer, wholesalers, and suppliers have to talk to one another and plan together for the buying season. So the partnerships, and the insight they get into product performance, are extremely important, as Dane mentioned, in terms of the gross margin and in terms of the sell-through information. SKYPAD basically provides that intelligence, that insight, into this retail/wholesale world.

Gardner: Correct me if I'm wrong, but isn’t this also a case where people are opening up their information and making it available for the benefit of a community or recognizing that the more data and the more analysis that’s available, the better it is for all the participants, even if there's an element of competition at some point?

Hakami: Dana, that's correct. The retail business likes to share the information with their suppliers, but they're not sharing it across all the suppliers. They're sharing it with each individual supplier. Then, you have the market research companies who come in and give you aggregation of trends and so on. But the retailers are interested in sell-through. They're interested in telling X supplier, "This is how your products are performing in my stores."

If they're not performing, then there's going to be a mark down. There's going to be less of a margin for you and for us. So, there's a very strong interest between the retailer and a specific supplier to improve the performance of the product and the sell-through of those products on the floor.

Gardner: Before we learn more about the data science and dealing with the technology and business-case issues, tell us a little bit more about Sky I.T. Group, how you came about, and what you're doing with SKYPAD to solve some of these issues across this entire supply chain and retail market space.

Complex history

Hakami: I'll take the beginning. I'll give you a little bit of the history, Dana, and then maybe Dane and Stephen can jump in and tell you what we are doing today, which is extremely complex and interesting at the same time.

We started with SKYPAD about eight years ago. We found a pain point among our customers: they were dealing with so many retailers, as well as their own retail stores, and not getting the information they needed to make sound business decisions on a timely basis.

We started with one customer, which was Theory. We came to them and we said, "We can give you a solution where we're going to take some data from your retailers, from your retail stores, from your dot-com, and bring it all into one dashboard, so you can actually see what’s selling and what’s not selling."

Fast forward, we've been able to take not only EDI transactions, but also retail portals. We're taking information from any format you can imagine -- from Excel, PDF, merchant spreadsheets -- bringing that wealth of data into our data warehouse, cleansing it, and then populating the dashboard.

So today, SKYPAD is giving a wealth of information to the users by the sheer fact that they don’t have to go out by retailer and get the information. That’s what we do, and we give them, on a Monday morning, the information they need to make decisions.
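To make that concrete, here is a minimal sketch of how format-specific ingestion like this can be organized, with a parser registry keyed by inbound format. Everything here -- names, formats, fields -- is an illustrative assumption, not Sky I.T.'s actual code:

```python
import csv
import io

# Hypothetical registry mapping an inbound file format to a parser.
# Each parser normalizes one file into common sales records.
PARSERS = {}

def parser(fmt):
    """Register a parse function for one inbound format."""
    def register(fn):
        PARSERS[fmt] = fn
        return fn
    return register

@parser("csv")
def parse_csv(raw_bytes):
    """Portal exports and merchant spreadsheets saved as CSV."""
    reader = csv.DictReader(io.StringIO(raw_bytes.decode("utf-8")))
    for row in reader:
        yield {
            "retailer": row.get("retailer"),
            "upc": row.get("upc"),
            "store": row.get("store"),
            "units_sold": int(row.get("units_sold", 0) or 0),
        }

@parser("edi_852")
def parse_edi_852(raw_bytes):
    """EDI 852 product-activity data; real parsing elided here."""
    raise NotImplementedError("each EDI flavor needs its own parser")

def ingest(filename, raw_bytes):
    """Route one inbound file to its parser; return normalized rows."""
    fmt = "edi_852" if filename.endswith(".edi") else "csv"
    return list(PARSERS[fmt](raw_bytes))
```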

Dane, can you elaborate more on this as well?

Adcock: This process has evolved from a time when EDI was easy, because it was structured, but it was also limited in the number of metrics that were provided by the mainstream. As these business intelligence (BI) tools have become more popular, the distribution of data coming from the retailers has gotten more ubiquitous and broader in terms of the metrics.

But the challenge has moved from reporting to identification of all these data sources and communication methodologies and different formats. These can change from week to week, because they're being launched by individuals, rather than systems, in terms of Excel spreadsheets and PDF files. Sometimes, they come from multiple sources from the same retailer.

One of our accounts would like to see all of their data together, so they can see trends across categories and different geographies and markets. The challenge is to bring all those data sources together and align them to their own item master file, rather than the retailer’s item master file, and then be able to understand trends, which accounts are generating the most profits, and what strategies are the most profitable.
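A hedged sketch of that alignment step, mapping each retailer's SKU back to the client's own item master so the same style can be compared across retailers (the mapping and field names are invented for illustration):

```python
# Hypothetical alignment of retailer-reported rows to the client's own
# item master, so one style can be analyzed consistently across retailers.
client_item_master = {
    # (retailer, retailer_sku) -> client style number
    ("nordstrom", "NRD-88421"): "STYLE-1001",
    ("saks", "SK-20417"): "STYLE-1001",
}

def align(rows):
    """Rewrite retailer SKUs to client style numbers; flag unknowns."""
    aligned, unmatched = [], []
    for row in rows:
        style = client_item_master.get((row["retailer"], row["sku"]))
        if style is None:
            unmatched.append(row)  # routed back for manual mapping
        else:
            aligned.append({**row, "style": style})
    return aligned, unmatched
```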
It's been a shifting model from the challenge of reporting all this data together, to data collection. And there's a lot more of it today, because more retailers report at the UPC level, size level, and the store level. They're broadcasting some of this data by day. The data pours in, and the quicker they can make a decision, the more money they can make. So, there's a lot of pressure to turn it around.

Gardner: Let me understand, Dane. When you're putting out those reports on Monday morning, do you get queries back? Is this a sort of a conversation, if you will, where not only are you presenting your findings, but people have specific questions about specific things? Do you allow for them to do that, and is the data therefore something that’s subject to query?

Subject to queries

Adcock: It’s subject to queries in the sense that they're able to do their own discovery within the data. In other words, we put it in a BI tool, it’s on the web, and they're doing their own analysis. They're probing to see what their best styles are. They're trying to understand how colors are moving, and they're looking to see where they're low on stock, where they may be able to backfill in the marketplace, and trying to understand what attributes are really driving sales.

But of course, they always have questions about completeness of the data. When things don't look correct, they have questions about it. That drives us to be able to do analysis on the fly, on demand, and deliver responses: "All your stores are there, all of your locations, everything looks normal." Or perhaps there seem to be some flaws or things in the data that don't actually look correct.

Not only do we need to organize it and provide it to them so that they can do their own broad, flexible analysis, but they're coming back to us with questions about how their data was audited. And they're looking for us to do the analysis on the spot and provide them with satisfactory answers.

Gardner: Stephen Czetty, we've heard about the use case, the business case, and how this data challenge has grown in terms of variety as well as volume. What do you need to bring to the table from the architecture and the data platform to sustain this growth and provide for the agility that these market decision makers are demanding?

Czetty: We started out with an abacus, in a sense, but today we collect information from thousands of sources literally every single week. Close to 9,000 files will come across to us, and we'll process them correctly and sort them out -- which client they belong to and so forth -- but the challenge is forever growing.

We needed to go from older technology to newer technology, because our volumes of data are increasing while the amount of time we have to take the data in stays static.

So we're quite aware that we have a time limit. We found in Vertica a platform that lets us collect the data into a coherent structure very rapidly, as opposed to our legacy systems.

It allows us to treat the data in a truly vertical way, although that has nothing to do with the application or the database itself. In the past we had to deal with each client separately. Now we can deal with each retailer separately and just collect their data for every single client that we have. That makes our processes much more pipelined and far faster in performance.

The secret sauce behind that is the ability in our Vertica environment to rapidly sort out the data -- where it belongs, who it belongs to -- calculate it out correctly, put it into the database tables that we need to, and then serve it back to the front end that we're using to represent it.

That's why we've shifted from a traditional database model to a Vertica-type model. It's 100 percent SQL for us, so it looks the same for everybody who is querying it, but under the covers we get tremendous performance and compression and lots of cost savings.
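Because the platform is "100 percent SQL," queries can go through any standard client. Here is a sketch using the open-source vertica-python driver; the connection details, table, and column names are invented for illustration:

```python
import vertica_python  # open-source Vertica client library

conn_info = {
    "host": "vertica.example.com",  # hypothetical host
    "port": 5433,                   # Vertica's default port
    "user": "skypad_reader",
    "password": "secret",
    "database": "retail",
}

# Plain SQL: weekly sell-through by style for one hypothetical client.
QUERY = """
    SELECT style, week, SUM(units_sold) AS units
    FROM sales
    WHERE client_id = 'client_a'
    GROUP BY style, week
    ORDER BY week
"""

with vertica_python.connect(**conn_info) as connection:
    cur = connection.cursor()
    cur.execute(QUERY)
    for style, week, units in cur.fetchall():
        print(style, week, units)
```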

Gardner: For some organizations that are dealing with the different sources and different types of data, cleansing is one problem. Then, the ability to warehouse that and make it available for queries is a separate problem. You've been able to tackle those both at the same time with the same platform. Is that right?

Proprietary parsers

Czetty: That's correct. We get the data, and we have proprietary parsers for every single data type that we get. There are a couple of hundred of them at this point. But all of that data, after parsing, goes into Vertica. From there, we can very rapidly figure out what is going where and what is not going anywhere, because it’s incomplete or it’s not ours, which happens, or it’s not relevant to our processes, which happens.

We can sort out what we've collected very rapidly and then integrate it with the information we already have, or insert new information if it's brand-new. Prior to this, we'd been doing much of this by hand, and that's no longer effective with our number of clients growing.

Gardner: I'd like to hear more about what your actual deployment is, but before we do that, let’s go back to the business case. Dane and Jay, when Vertica came online, when Steve was able to give you some of these more pronounced capabilities, how did that translate into a benefit for your business? How did you bring that out to the market, and what's been the response?

Hakami: I think the first response was "wow." And I think the second response was "Wow, how can we do this fast and move quickly to this platform?"

Let me give you some examples. When Steve did the proof of concept (POC) with the folks from HP, we were very impressed with the statistics we had seen. In other words, going from a processing time of eight or nine hours to minutes was a huge advantage that we saw from the business side, showing our customers that we can load data much faster.

The ability to use less hardware and infrastructure as a result of the architecture of Vertica allowed us to reduce, and to continue to reduce, the cost of infrastructure. Those two are the major benefits that I've seen in the evolution of us moving from our legacy platform to Vertica.

From the business perspective, if we're able to deliver faster and more reliably to the customer, we accomplished one of the major goals that we set for ourselves with SKYPAD.

Adcock: Let me add something there. Jay is exactly right. The real impact, as it translates into the business, is that we have to stop collecting data at a certain point in the morning and start processing it in order to make our service-level agreements (SLAs) on reporting for our clients, because that's when they start their analysis. The retail data comes in staggered over the morning, and it may not all be in by the time we need to cut that collection off.

One of the things that moving to Vertica has allowed us to do is to cut that time off later, and when we cut it off later, we have more data, as a rule, for a customer earlier in the morning to do their analysis. They don’t have to wait until the afternoon. That’s a big benefit. They get a much better view of their business.

Driving more metrics

The other thing it has enabled us to do is drive more metrics into the database and do some processing in the database, rather than in the user tool, which makes the user tool faster and more valuable.

For example, for a metric like age on the floor, we can do the calculation in the background, in the database, and it doesn't impede the response of the front-end engine. We get more metrics calculated in the database rather than in our user tool, and the tool becomes more flexible and more valuable.
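As a concrete illustration of pushing a metric into the database, an "age on the floor" calculation could live in a view the BI front end simply reads. A hedged sketch in Vertica SQL, with table and column names assumed:

```python
# Hypothetical view pushing the "age on the floor" calculation into the
# database: days between a style's first receipt at a store and each sale.
# The BI tool then queries the view instead of computing dates itself.
AGE_ON_FLOOR_VIEW = """
    CREATE VIEW style_age_on_floor AS
    SELECT
        s.style,
        s.store,
        s.sale_date,
        DATEDIFF('day', r.first_receipt_date, s.sale_date) AS age_on_floor
    FROM sales s
    JOIN receipts r
      ON r.style = s.style
     AND r.store = s.store
"""
```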
Gardner: So not only are you doing what you used to do faster, better, cheaper, but you're able to now do things you couldn't have done before in terms of your quality of data and analysis. Is there anything else that is of a business nature that you're able to do vis-à-vis analytics that just wasn't possible before, and might, in fact, be equivalent of a new product line or a new service for you?

Czetty: In the old model, when we got a new client, we had to essentially recreate the processes that we'd built for other clients to match that new client, because we were collecting that data just for that client at that moment.

So 99 percent of it is the same as any other client, but one percent is always different, and it had to be built out. On-boarding a client, as we call it, took us a considerable amount of time -- we are talking weeks.

In the current model, where we're centered on retailers, the only thing that will take us a long time to do in this particular situation is if there's a new retailer that we've never collected data from. We have to understand their methodology of delivery, how it comes, how complex it is and so forth, and then create the logic to load that into the database correctly to match up with what we are collecting for others.

In this scenario, since we've got so many clients, very few new stores or new retailers show up; typically it's just our clients' own retail chains. So our on-boarding is simplified, because if we're getting Nordstrom's data for client A, we're getting the exact same data for clients B, C, D, E, and F.

Now, it comes through a single funnel and it's the Nordstrom funnel. It’s just a lot easier to deal with, and on-boarding comes naturally.

Hakami: In addition to that, since we're adding more significant clients, the ability to increase variety, velocity, and volume is very important to us. We couldn't scale without having Vertica as a foundation for us. We'd be standing still, rather than moving forward and being innovative, if we stayed where we were. So this is a monumental change and a very instrumental change for us going forward.

Gardner: Steve, tell us a little bit about your actual deployment. Is this a single tenant environment? Are you on a single database? What’s your server or data center environment? What's been the impact of that on your storage and compression and costs associated with some of the ancillary issues?

Multi-tenant environment

Czetty: To begin with, we're coming from a multi-tenant environment. Every client had its own private database in the past, because in DB2, we couldn't add all these clients into one database and get the job done. There was not enough horsepower to do the queries and the loads.

We ran a number of databases on a farm of servers, on Rackspace as our hosting system. When we brought in Vertica, we put up a minimal configuration with three nodes, and we're still living with that minimal configuration with three nodes.

We haven't exhausted our capacity on the license by any means whatsoever in loading up this data. The compression is obscenely high for us, because at the end of the day, our data absolutely lends itself to being compressed.

Everything repeats over and over again every single week. In the world of Vertica, that means it only appears once, wherever it lives in the database, and the rest of it is magic. Not to get into the technology underneath it at this point, but from our perspective, it's just very effective in that scenario.
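That weekly repetition is exactly what a column store's run-length encoding (RLE) exploits: a value repeated down a sorted column is stored roughly once. A hedged sketch of how such a table might be declared in Vertica, with all names invented:

```python
# Hypothetical Vertica DDL: repeating values such as retailer, week, and
# store compress extremely well under run-length encoding (RLE) when the
# table's projection is sorted on those columns.
SALES_DDL = """
    CREATE TABLE sales (
        retailer   VARCHAR(64) ENCODING RLE,
        week       DATE        ENCODING RLE,
        store      VARCHAR(32) ENCODING RLE,
        upc        VARCHAR(14),
        units_sold INTEGER
    )
    ORDER BY retailer, week, store
"""
```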

Also, in our DB2 world, we were using quite costly, large SAN configurations with lots of spindles, so that the data could be distributed across the spindles for performance on DB2, and that does improve the performance of that product.

However, in Vertica, we have 600 GB drives and we can just pop more in if we need to expand our capacity. With the three nodes, we've had zero problems with performance. It hasn't been an issue at all. We're just looking back and saying that we wish we had this a little sooner.

Vertica came in and did the install for us initially. Then, we ended up taking those servers down and reinstalling it ourselves. With a little information from the guide, we were able to do it. We wanted to learn it for ourselves. That took us probably a day and a half to two days, as opposed to Vertica doing it in two hours. But other than that, everything is just fine. We’ve had a little training, we’ve gone to the Vertica event to learn how other people are dealing with things, and it's been quite a bit of fun.

Now there is a lot of work we have to do at the back end to transform our processes to this new methodology. There are some restrictions on how we can do things, updates and so forth. So, we had to reengineer that into this new technology, but other than that, no changes. The biggest change is that we went vertical on the retail silos. That's just a big win for us.

Gardner: As you know, Vertica is cloud ready. Is there any benefit to that further down the road where maybe it’s around issues of a spike demand in holiday season, for example, or for backup recovery or business continuity? Any thoughts about where you might leverage that cloud readiness in the future?

Dedicated servers

Czetty: We're already sort of in the cloud with the use of dedicated servers, but in our business, the volume increase in the stores around the holidays doesn't double the volume. It adds 10 percent, 15 percent, maybe 20 percent of the volume for the holiday season. It hasn't been that big a problem in DB2. So, it's certainly not going to be a problem in Vertica.

We've looked at virtualization in the cloud, but with the size of the hardware that we actually want to run, we want to take advantage of the speed and the memory and everything else. We put up pretty robust servers ourselves, and it turns out that in secure cloud environments like the one we're using right now at Rackspace, it's simply less expensive to do it as dedicated equipment. Spinning up another node at Rackspace takes about the same time as setting up and configuring a virtual system -- a day or so. They can give us another node just like this on our rack.

We looked at the cloud financially every single time that somebody came around and said there was a better cloud deal, but so far, owning it seems to be a better financial approach.

Gardner: Before we close out, looking to the future, I suppose the retailers are only going to face more competition. They're going to get more demand from their end users and customers for a better experience and more information.

We're going to see more mobile devices that will be used in a dot-com world or even a retail world. We are going to start to see geolocation data brought to bear. We're going to expect the Internet of Things (IoT) to kick in at some point where there might be more sensors involved either in a retail environment or across the supply chain.

Clearly, there's going to be more demand for more data doing more things faster. Do you feel like you're in a good position to do that? Where do you see your next challenges from the data-architecture perspective?

Czetty: Not to disparage the luxury industry too much, but at this point, they're not on the bleeding edge on the data collection and analysis side, although they are on the bleeding edge on social media and so forth. We've anticipated that. We've got some clients who are collecting information about their web activities, and we've done analysis to identify customers who present different personas through the different channels they use to contact the company.

We're dabbling in that area, and it's going to grow as the interfaces become more tablet- and phone-oriented. In the future, a lot of sales are potentially going to go through social media, not just the official websites.

We'll be capturing that information as well. We've got some experience with that kind of data from work we've done in the past. So this is something I'm looking forward to doing more of, but as of today, we're only doing it for a few clients.

Well positioned

Hakami: In terms of planning, we're very well-positioned as a hub between the wholesaler and the retailer, the wholesaler and their own retail stores, as well as the wholesaler and their dot-coms. One of the things that we are looking into, and this is going to probably get more oxygen next year, is also taking a look at the relationships and the data between the retailer and the consumer.

As you mentioned, this is a growing area, and the retailers are looking to capture more of the consumer information so they can target-market to them, not based on segment but based on individual preferences. This is again a huge amount of data that needs to be cleansed, populated, and then presented to the CMOs of companies to be able to sell more, market more, and be in front of their customers much more than ever before.

Gardner: That's a big trend that we're seeing in many different sectors of the economy, that drive for personalization, and it's really these data technologies that allow it to happen.

Last word to you, Dane. Any other thoughts about where the intersection of computer science capabilities and market intelligence demands are coming together in new and interesting ways?

Adcock: I'm excited about the whole approach of leveraging predictive capabilities alongside the great inventory of data that we've put together for our clients. It's not just about creating better forecasts of demand. It's about optimizing different metrics: using this data to understand when a product should be marked down, which product attributes seem to be favored by stores with similar shopper profiles, and bringing better allocations -- the right quantities, breadth, and depth of product -- to individual locations, to drive a higher percentage of full-price selling and fewer markdowns for our clients.

So it's the predictive side, rather than discovery using a BI tool.
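As a toy illustration of that predictive side (not Sky I.T.'s actual models), a falling weekly sell-through trend could flag a style as a markdown candidate. All numbers and names here are invented:

```python
# Fit a least-squares trend to weekly sell-through and flag styles whose
# trend is falling as candidates for an earlier markdown.
def trend(series):
    """Least-squares slope of a series over its index (0, 1, 2, ...)."""
    n = len(series)
    mean_x = (n - 1) / 2
    mean_y = sum(series) / n
    cov = sum((i - mean_x) * (y - mean_y) for i, y in enumerate(series))
    var = sum((i - mean_x) ** 2 for i in range(n))
    return cov / var

weekly_sell_through = {
    "STYLE-1001": [0.42, 0.38, 0.31, 0.25],  # falling: markdown candidate
    "STYLE-1002": [0.30, 0.33, 0.37, 0.41],  # rising: hold at full price
}

markdown_candidates = [
    style for style, series in weekly_sell_through.items()
    if trend(series) < 0
]
print(markdown_candidates)  # ['STYLE-1001']
```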

Czetty: Just to add to that, there's the margin. When we talked to CEOs and CFOs five or six years ago and told them we could improve business by two, three, or four percent, they laughed at us and said it was meaningless to them. Now, three, four, or five percent, even in the luxury market, is a huge improvement to the business. Companies like Michael Kors, Tory Burch, Marc Jacobs, Giorgio Armani, and Prada are all looking for those margins.

So, how do we become more efficient with product assortment? How do we become more efficient at distributing all of these products to different sales channels, and then how do we increase our margins? How do we avoid over-manufacturing -- not making those blue shirts for Florida, where they aren't selling, and making them instead for Detroit, where they're selling like hotcakes?

These are the things customers are looking at, and they must have the tool or tools in place to manage their merchandising and, by doing so, become a lot more agile and a lot more profitable.

Gardner: Well, great. I'm afraid we will have to leave it there. We've been discussing how retail luxury goods and fashion providers use analysis from Sky I.T. Group, and how Sky I.T. Group has upped its game by using HPE Vertica to provide more buyer-behavior analysis faster, better, and cheaper.

And we’ve seen how Sky I.T. has changed its platform and solved the challenges around variety, velocity, and volume for that data to make those better insights available to those retail users, allowing them to become more data-driven across their entire market.

So please join me in thanking our guests. We have been talking with Jay Hakami, President of Sky I.T. Group in New York. Thank you so much, Jay.

Hakami: Thank you, Dana. I appreciate it very much.

Gardner: And we've also been talking with Dane Adcock, Vice President of Business Development at Sky I.T. Group. Thank you, Dane.

Adcock: It's great to have the conversation. Thank you.

Gardner: I've enjoyed it myself. And lastly, a big thank you to Stephen Czetty, Vice President and Chief Technology Officer at Sky I.T. Group. Thank you, Stephen.

Czetty: You're very welcome, and I enjoyed the conversation. Thank you.

Gardner: And I’d also like to thank our audience as well for joining us for this big-data use case leadership discussion.

I'm Dana Gardner, Principal Analyst at Interarbor Solutions, your host for this ongoing series of Hewlett Packard Enterprise-sponsored discussions. Thanks again for listening, and come back next time.

Listen to the podcast. Find it on iTunes. Get the mobile app. Download the transcript. Sponsor: Hewlett Packard Enterprise.

Transcript of a BriefingsDirect discussion on how Sky I.T. has changed its platform and solved the challenges around variety, velocity, and volume for big data to make better insights available to retail users. Copyright Interarbor Solutions, LLC, 2005-2016. All rights reserved.
