Wednesday, February 10, 2021

How Consistent Data Services Deliver Simplicity, Compatibility, and Lower Cost

A transcript of a discussion on the latest technologies and products delivering common data services across today’s hybrid cloud, distributed data centers, and burgeoning edge landscapes.

Listen to the podcast. Find it on iTunes. Download the transcript. Sponsor: IBM Storage.

Dana Gardner: Hi, this is Dana Gardner, Principal Analyst at Interarbor Solutions and you’re listening to BriefingsDirect. Part 2 in our Data Strategies Insights Discussion Series explores the latest technologies and products that are delivering common data services across today’s hybrid cloud, distributed data centers, and burgeoning edge landscapes.

New advances in storage technologies, standards, and methods have changed the game when it comes to overcoming the obstacles businesses too often face when seeking pervasive analytics across their systems and services.

Stay with us now as we examine how IBM Storage is leveraging containers and the latest storage advances to deliver inclusive, comprehensive, and actionable storage.

To learn more about the future of storage strategies that accelerate digital transformation, please join me in welcoming Denis Kennelly, General Manager, IBM Storage. Welcome back, Denis.

Denis Kennelly: Thank you, Dana. It’s great to be here.

Gardner: In our earlier discussion we learned about the business needs and IBM’s large-scale vision for global, consistent data. Let’s now delve beneath the covers into what enables this new era of data-driven business transformation.

In our last discussion, we talked about containers -- how they had been typically relegated to application development. What should businesses know about the value of containers more broadly within the storage arena as well as across other elements of IT?

Containers for ease, efficiency

Kennelly: Sometimes we talk about containers as being unique to application development, but I think the real business value of containers is in the operational simplicity and cost savings.

When you build applications on containers, they are container-aware. When you look at Kubernetes and the controls you have there as an operations IT person, you can scale up and scale down your applications seamlessly.
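
For readers who want to picture that control concretely, here is a minimal sketch of scaling a containerized application up and down with the standard Kubernetes Python client; the deployment name and namespace are hypothetical placeholders, not details from this discussion.

```python
# Minimal sketch: scale a containerized application with the standard
# Kubernetes Python client. The names "orders-api" and "prod" are
# hypothetical placeholders.
from kubernetes import client, config

def scale_deployment(name: str, namespace: str, replicas: int) -> None:
    config.load_kube_config()                 # use the local kubeconfig
    apps = client.AppsV1Api()
    body = {"spec": {"replicas": replicas}}   # patch only the replica count
    apps.patch_namespaced_deployment_scale(name, namespace, body)

# Scale up for a demand spike, then back down when it passes.
scale_deployment("orders-api", "prod", replicas=10)
scale_deployment("orders-api", "prod", replicas=2)
```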

As we think about that and about storage, we have to include storage under that umbrella. Traditionally, storage did a lot of that work independently. Now we are in a much more integrated environment where you have cloud-like behaviors. And you want to deliver those cloud-like behaviors end-to-end -- be it for the applications, for the data, for the storage, and even for the network -- right across the board. That way you can have a much more seamless, easier, and operationally efficient way of running your environment.

Containers are much more than just an application development tool; they are a key enabler to operational improvement across the board.

Gardner: Because hybrid cloud and multi-cloud environments are essential for digital business transformation, what does this container value bring to bridging the hybrid gap? How do containers lead to a consistent and actionable environment, without integrations and complexity thwarting wider use of assets around the globe?

Kennelly: Let’s talk about what a hybrid cloud is. To me, a hybrid cloud is the ability to run workloads on a public cloud, on a private cloud or traditional data center, and even right out to edge locations in your enterprise where there are no IT people whatsoever.

Being able to do that consistently across that environment -- that’s what containers bring. They allow a layer of abstraction above the target environment, be it a bare-metal server, a virtual machine (VM), or a cloud service -- and you can do that seamlessly across those environments.

That’s what a hybrid cloud platform is, and what enables it is containers -- being able to have a seamless runtime across this entire environment.

And that’s core to digital transformation, because when we start to think about where we are today as an enterprise, we still have assets sitting on the data center. Typically, what you see out there are horizontal business processes, such as human resources or sales, and you might want to move those more to a software as a service (SaaS) capability while still retaining your core, differentiating business processes.

For compliance or regulatory reasons, you may need to keep those assets in the data center. Maybe you can move some pieces. But at the same time, you want to have the level of efficiency you gain from cloud-like economics. You want to be able to respond to business needs, to scale up and scale down the environment, and not design the environment for a worst-case scenario.

That’s why a hybrid cloud platform is so critical. And underneath that, why containers are a key enabler. Then, if you think about the data in storage, you want to seamlessly integrate that into a hybrid environment as well.

Gardner: Of course, the hybrid cloud environment extends these days more broadly with the connected edge included. For many organizations the edge increasingly allows real-time analytics capabilities by taking advantage of having compute in so many more environments and closer to so many more devices.

What is it about the IBM hybrid storage vision that allows for more data to reside at the edge without having to move it into a cloud, analyze it there, and move it back? How are containers enabling more data to stay local and still be part of a coordinated whole greater than the sum of the parts?

Data and analytics at the edge

Kennelly: As an industry, we swing from centralized to decentralized -- what I call a pendulum movement every few years. If you think back to the mainframe era, everything was very centralized. Then we went to distributed systems and decentralized everything.

With cloud we began to recentralize everything again. And now we are moving our clouds back out to the edge for a lot of reasons, largely because of the ingress and egress challenges and the cost of moving more and more of that data.

When I think about edge, I am not necessarily thinking about Internet of things (IoT) devices or sensors, but in a lot of cases this is about branch and remote locations. That’s where a core part of the enterprise operates, but not necessarily with an IT team there. And that part of the enterprise is generating data from what’s happening in that facility, be it a manufacturing plant, a distribution center, or many others.

As you generate that data, you also want to generate the analytics that are key to understanding how the business is reacting and responding. Do you want to move all that data to a central cloud to run analytics, and then take the result back out to that distribution center? You can do that, but it’s highly inefficient -- and very costly.

What our clients are asking for is to keep the data out at these locations and to run the analytics locally. But, of course, with all of the analytics you still want to share some of that data with a central cloud.

So, what’s really important is that you can share across this entire environment, be it from a central data center or a central cloud out to an edge location and provide what we call seamless access across this environment.

With our technology, with things like IBM Spectrum Scale, you gain that seamless access. We abstract the data access as if you are accessing the data locally -- or it could be back in the cloud. The application really doesn’t care. That seamless access is core to what we are doing.

Gardner:  The IBM Storage portfolio is broad and venerable. It includes flash, disk, and tape, which continues to have many viable use cases. So, let’s talk about the products and how they extend the consistency and commonality that we have talked about and how that portfolio then buttresses the larger hybrid storage vision.

Storage supports all environments

Kennelly: One of the key design points of our portfolio, particularly our flash line, is being able to run in all environments. We have one software code base across our entire portfolio. That code runs on our disk subsystems and disk controllers, but it can also run on your platform of choice. We absolutely support all platforms across the board. That’s one design principle.

Secondly, we embrace containers very heavily. And being able to run on containers and provide data services across those containers provides that seamless access that I talked about. That’s a second major design principle.

And as we look at our storage portfolio, we also want to make sure we optimize the storage -- and the customer’s spend -- through tiered storage and the ability to move data across those different tiers of storage.

You mentioned tape storage. For example, at times you may want to move from fast, online, always-on, high-end storage to a lower tier of less expensive storage such as tape, maybe for data retention reasons. Or you may need an air-gap solution and want to move data to what we call cold storage on tape. We support that capability and we can manage your data across that environment.

There are three core design principles to our IBM Storage portfolio. Number one is we can run seamlessly across these environments. Number two, we provide seamless access to the data across those environments. And number three, we support optimization of the storage for the use case needed, such as being able to tier the storage to your economic and workload needs.
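
To make the third principle more tangible, here is a vendor-neutral sketch of the kind of age-based tiering decision such a policy automates. The paths and the 90-day threshold are illustrative assumptions; IBM’s actual tiering is policy-driven inside the storage platform rather than an external script.

```python
# Illustrative sketch only: relocate files untouched for 90 days from a
# hot tier to a cold tier. Paths and threshold are assumptions.
import os
import shutil
import time

HOT_TIER = "/mnt/hot"      # hypothetical flash-backed mount
COLD_TIER = "/mnt/cold"    # hypothetical tape- or object-backed mount
MAX_IDLE_DAYS = 90

def tier_cold_files() -> None:
    cutoff = time.time() - MAX_IDLE_DAYS * 86400
    for root, _dirs, files in os.walk(HOT_TIER):
        for name in files:
            src = os.path.join(root, name)
            if os.path.getatime(src) < cutoff:   # not read recently
                dst = os.path.join(COLD_TIER, os.path.relpath(src, HOT_TIER))
                os.makedirs(os.path.dirname(dst), exist_ok=True)
                shutil.move(src, dst)            # demote to the cold tier

if __name__ == "__main__":
    tier_cold_files()
```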

Gardner: Of course, what people are also interested in these days is the FlashSystem performance. Tell us about some of the latest and greatest when it comes to FlashSystem. You have the new 5200, the high-end 9200, and those also complement some of your other products like ESS 3200.

Flash provides best performance

Kennelly: Yes, we continue to expand the portfolio. With the FlashSystems, and some of our recent launches, some things don’t change. We’re still able to run across these different environments.

But in terms of price-performance, especially with the work we have done around our flash technology, we have optimized our storage subsystems to use standard flash technologies. In terms of price for throughput, when we look at this against our competitors, we offer twice the performance for roughly half the price. And that holds up when we compare against our competitors’ technology.

That’s due to leveraging our innovations around what we call the FlashCore Module, wherein we are able to use standard flash in those disk drives and enable compression on the fly. That’s driving the roadmap in terms of throughput and performance at a very, very competitive price point.

Gardner: Many of our readers and listeners, Denis, are focused on their digital business transformation. They might not be familiar with some of these underlying technological advances, particularly end-to-end Non-Volatile Memory Express (NVMe). So why are these systems doing things that just weren’t possible before?

Kennelly: A lot of it comes down to where the technology is today and the price points that we can get from flash from our vendors. And that’s why we are optimizing our flash roadmap and our flash drives within these systems. It’s really pushing the envelope in terms of performance and throughput across our flash platforms.

Gardner: The desired end-product for many organizations is better and pervasive analytics. And one of the great things about artificial intelligence (AI) and machine learning (ML) is it’s not only an output -- it’s a feature of the process of enhancing storage and IT.

How are IT systems and storage using AI inside these devices and across these solutions? What is AI bringing to enable better storage performance at a lower price point?

Kennelly: We continue to optimize what we can do in our flash technology, as I said. But when you embark on an AI project, something like 70 to 80 percent of the spend is around discovery, gaining access to the data, and finding out where the data assets are. And we have capabilities like IBM Spectrum Discover that help catalog and understand where the data is and how to access that data. It’s a critical piece of our portfolio on that journey to AI.
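
The discovery step Kennelly describes can be pictured with a small, generic sketch -- it does not use the actual IBM Spectrum Discover API -- that indexes file metadata and then queries the index for candidate training data.

```python
# Generic illustration of data discovery and cataloging (not the Spectrum
# Discover API): index basic file metadata, then query it. The root path
# and the query are hypothetical.
import os
import sqlite3

def build_catalog(root: str, db_path: str = "catalog.db") -> None:
    con = sqlite3.connect(db_path)
    con.execute("CREATE TABLE IF NOT EXISTS files(path TEXT, ext TEXT, bytes INT, mtime REAL)")
    for dirpath, _dirs, names in os.walk(root):
        for name in names:
            p = os.path.join(dirpath, name)
            st = os.stat(p)
            con.execute("INSERT INTO files VALUES (?, ?, ?, ?)",
                        (p, os.path.splitext(name)[1], st.st_size, st.st_mtime))
    con.commit()
    con.close()

def find_candidates(db_path: str = "catalog.db"):
    con = sqlite3.connect(db_path)
    # Example query: large CSV files, likely tabular training data.
    return con.execute(
        "SELECT path FROM files WHERE ext = '.csv' AND bytes > 1000000").fetchall()

build_catalog("/data/lake")    # hypothetical data location
print(find_candidates())
```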

We also have out-of-the-box integrations with AI and analytics platforms like Cloudera, so we can seamlessly plug into those platforms and help them differentiate using our Spectrum Scale technology.

But in terms of AI, we have some really key enablers to help accelerate AI projects through discovery and integration with some of the big AI platforms.

Gardner: And these new storage platforms are knocking off some impressive numbers around high availability and low latency. We are also seeing a great deal of consolidation around storage arrays and managing storage as a single pool.

On the economics of the IBM FlashSystem approach, these performance attributes are also being enhanced by reducing operational costs and moving from CapEx to OpEx purchasing.

Storage-as-a-service delivers

Kennelly: Yes, there is no question we are moving toward an OpEx model. When I talked about cloud economics and cloud-like flexibility behavior at a technology level, that’s only one side of the equation.

On the business side, IT is demanding cloud consumption models, OpEx-type models, and pay-as-you-go. It’s not just a pure financial equation, it's also how you consume the technology. And storage is no different. This is why we are doing a lot of innovation around storage-as-a-service. But what does that really mean?

It means you ask for a service. “I need a certain type of storage with this type of availability, this type of performance, and this type of throughput.” Then we as a storage vendor take care of all the details behind that. We get the actual devices on the floor that meet those requirements and manage that.
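
That "ask for a service" step can be sketched as a declarative request the consumer hands to the provider, who then decides which arrays and media satisfy it. The field names below are illustrative assumptions, not an actual IBM ordering API.

```python
# Illustrative only: a declarative storage-as-a-service request. Field
# names and values are assumptions, not an actual IBM ordering API.
from dataclasses import dataclass, asdict
import json

@dataclass
class StorageServiceRequest:
    tier: str              # e.g. "performance" or "balanced"
    capacity_tb: int       # usable capacity requested
    availability: str      # e.g. "99.9999"
    max_latency_ms: float  # required response time
    billing: str           # consumption model, e.g. "pay-as-you-go"

request = StorageServiceRequest(
    tier="performance",
    capacity_tb=200,
    availability="99.9999",
    max_latency_ms=1.0,
    billing="pay-as-you-go",
)

# The provider, not the consumer, maps this to actual devices on the floor.
print(json.dumps(asdict(request), indent=2))
```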

As those assets depreciate over a number of years, we replace and update those assets in a seamless manner to the client.

As the storage sits in the data center, maybe the customer says, “I want to move some of that data to a cloud instance.” We also offer a seamless capability to move the data over to the cloud and run that service on the cloud.

We already have all the technology to do that and the platform support for all of those environments. What we are working on now is making sure we have a seamless consumption model and the business processes of delivering that storage-as-a-service, and how to replace and upgrade that storage over time -- while making it all seamless to the client.

I see storage moving quickly to this new storage consumption model, a pure OpEx model. That’s where we as an industry will go over the next few years.

Gardner: Another big element of reducing your total cost of ownership over time is in how well systems can be managed. When you have a common pool approach, a comprehensive portfolio approach, you also gain visibility, a single pane of glass when it comes to managing these systems.

Intelligent insights via storage

Kennelly: That’s an area we continue to invest in heavily. Our IBM Storage Insights platform provides tremendous insight into how the storage subsystems are running operationally. It also provides insight into the storage itself, in terms of where you have space constraints or where you may need to expand.

But that’s not just a manual dashboard that we present to an operator. We are also infusing AI quite heavily into that platform and using AIOps to integrate with Storage Insights to run storage operations at much lower costs and with more automation.

And we can do that in a consistent manner right across the environments, whether it’s a flash storage array, mainframe attached, or a tape device. It’s all seamless across the environment. You can see those tiers and storage as one platform and so are able to respond quickly to events and understand events as they are happening.
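
A simplified picture of the kind of automated check such a platform runs: project each pool's growth against its capacity and flag the ones that will fill within a planning window. This is a hedged sketch, not IBM Storage Insights code.

```python
# Hedged sketch (not IBM Storage Insights code): flag storage pools that
# will run out of capacity within a planning window, given observed growth.
from dataclasses import dataclass

@dataclass
class Pool:
    name: str
    capacity_tb: float
    used_tb: float
    daily_growth_tb: float   # observed average growth per day

def pools_at_risk(pools, window_days: int = 30):
    at_risk = []
    for p in pools:
        projected = p.used_tb + p.daily_growth_tb * window_days
        if projected >= p.capacity_tb:
            at_risk.append((p.name, round(projected - p.capacity_tb, 1)))
    return at_risk

pools = [Pool("flash-pool-1", 500, 430, 3.2),   # illustrative numbers
         Pool("tape-pool-1", 2000, 600, 1.0)]
print(pools_at_risk(pools))   # -> [('flash-pool-1', 26.0)]
```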

Gardner: As we close out, Denis, for many organizations hybrid cloud means that they don’t always know what’s coming and lack control over predicting their IT requirements. Deciding in advance how things get deployed isn’t always an option.

How do the IBM FlashSystems, and your recent announcements in February 2021, provide a path to a crawl-walk-run adoption approach? How do people begin this journey regardless of the type of organization and the size of the organization?

Kennelly: We are introducing an update to our FlashSystem 5200, which is our entry-point platform. It runs our storage software, IBM Spectrum Virtualize -- the same software as in our high-end arrays at the very top of our pyramid of capabilities.

As part of that announcement, we are also supporting other public cloud vendors. So you can run the software on our arrays, or you can move it out to run on a public cloud. You have tremendous flexibility and choice due to the consistent software platform.

And, as I said, it’s our entry point, so the price is very, very competitive. This is a part of the market where we see tremendous growth. You can experience the best of the IBM Storage platform at a low-cost entry point, but also get tremendous flexibility. You can scale up that environment within your data center and extend the same capabilities right out across the hybrid cloud, however you choose.

There has been tremendous innovation by the IBM team to make sure that our software supports this myriad of platforms, but also at a price point that is the sweet spot of what customers are asking for now.

Gardner: It strikes me that we are on the vanguard of some major new advances in storage, but they are not just relegated to the largest enterprises. Even the smallest enterprises can take advantage and exploit these great technologies and storage benefits.

Kennelly: Absolutely. When we look at the storage market, the fastest growing part is at that lower price point -- where unit costs are below the $50K to $100K range. That’s where we see tremendous growth in the market and we are serving it very well and very efficiently with our platforms. And, of course, as people want to scale and grow, they can do that in a consistent and predictable manner.

Gardner: I’m afraid we will have to leave it there. You have been listening to a sponsored BriefingsDirect discussion on how the latest storage technologies and products are delivering a common data services benefit across today’s hybrid cloud, distributed data centers, and burgeoning edge landscapes. And we have delved beneath the covers to learn about the latest technologies enabling IBM’s vision for the future of data storage.

So please join me now in thanking our guest, Denis Kennelly, General Manager, IBM Storage. Thank you so much, Denis.

Kennelly: Thank you, Dana.

Gardner: Please join me and Denis again soon for our next discussion in this three-part series as we explore the virtues of global access capabilities and the importance of consistent data -- no matter where it’s stored.

Thank you to our audience as well for joining this BriefingsDirect data strategies insights discussion. I’m Dana Gardner, Principal Analyst at Interarbor Solutions, your host throughout this series of IBM Storage-sponsored BriefingsDirect discussions.

Thanks again for listening. Please pass this along to your IT community, and do come back next time.

Listen to the podcast. Find it on iTunes. Download the transcript. Sponsor: IBM Storage.

A transcript of a discussion on the latest technologies and products delivering common data services across today’s hybrid cloud, distributed data centers, and burgeoning edge landscapes. Copyright Interarbor Solutions, LLC, 2005-2021. All rights reserved.


Friday, February 05, 2021

How Storage Can Help You Digitally Transform in a Hybrid Cloud World


A transcript of a discussion on how consistent global storage models best accelerate and enable pervasive analytics that support digital business transformation. 

 

Listen to the podcast. Find it on iTunes. Download the transcript. Sponsor: IBM Storage.

 

Dana Gardner: Hi, this is Dana Gardner, Principal Analyst at Interarbor Solutions and you’re listening to BriefingsDirect.

 

Our next data strategies insights discussion explores how consistent and global storage models can best propel pervasive analytics and support digital business transformation.

 

Decades of disparate and uncoordinated storage solutions have hindered enterprises’ ability to gain common data services across today’s hybrid cloud, distributed data centers, and burgeoning edge landscapes.

 

Yet only a comprehensive data storage model that includes all platforms, data types, and deployment architectures will deliver the rapid insights that businesses need.

 

Stay with us now as we examine how IBM Storage is leveraging containers and the latest storage advances to deliver the holy grail of inclusive, comprehensive, and actionable storage.

 


To learn more about the future promise of the storage strategies that accelerate digital transformation, please join me now in welcoming our guest, Denis Kennelly, General Manager, IBM Storage. Welcome, Denis.

 

Kennelly: Thank you, Dana. It’s great to be here.

 

Gardner: Clearly the world is transforming digitally. And hybrid cloud is helping in that transition. But what role specifically does storage play in allowing hybrid cloud to function in a way that bolsters and even accelerates digital transformation?

 

Kennelly: As you said, the world is undergoing a digital transformation, and that is accelerating in the current climate of a COVID-19 world. And, really, it comes down to having an IT infrastructure that is flexible, agile, has cloud-like attributes, is open, and delivers the economic value that we all need.

 

That is why we at IBM have a common hybrid cloud strategy. A hybrid cloud approach, we now know, is 2.5 times more economical than a public cloud-only strategy. And why is that? Because as customers transform -- and transform their existing systems -- the data and systems sit on-premises for a long time. As you move to the public cloud, you have to overcome the cost of transformation as well as constraints such as data sovereignty and compliance. This is why hybrid cloud is a key enabler.

 

Hybrid cloud for transformation

Kennelly: Now, underpinning that, the core building block of the hybrid cloud platform is containers and Kubernetes, using our OpenShift technology. That’s the key enabler to the hybrid cloud architecture and how we move applications and data within that environment.

 

As customers start to transform and move those applications and workloads to this new world, being able to access the data -- and to keep that access -- is a really important step in the journey. Integrating storage into that world of containers is therefore a key building block, and one we are very focused on today.

 

Storage is where you capture all that state, where all the data is stored. When you think about cloud, hybrid cloud, and containers -- you think stateless. You think about cloud-like economics as you scale up and scale down. Our focus is bridging those two worlds and making sure that they come together seamlessly. To that end, we provide an end-to-end hybrid cloud architecture to help those customers in their transformation journeys.

 

Gardner: So often in this business, we’re standing on the shoulders of the giants of the past 30 years; the legacy. But sometimes legacy can lead to complexity and becomes a hindrance. What is it about the way storage has evolved up until now that people need to rethink? Why do we need something like containers, which seem like a fairly radical departure?

 

Kennelly: It comes back to the existing systems. I think storage, at the end of the day, was all about the applications and the workloads that we ran. It wasn’t storage for storage’s sake. We designed applications, we ran them on servers, and we architected them in a certain fashion.

And, of course, they generated data and we wanted access to that data. That’s just how the world happened. When you get to a hybrid cloud world -- I mean, we talk about cloud-like behavior, cloud-like economics -- it manifests itself in the ability to respond.

 

If you’re in a digitally transformed business, you can respond to needs in your supply chain rapidly, maybe to a surge in demand based on certain events. Your infrastructure needs to respond to those needs versus having the maximum throughput capacity that would ever be needed. That’s the benefit cloud has brought to the industry, and why it’s so critically important.

 

Now, maybe traditionally storage was designed for the worst-case scenario. In this new world, we have to be able to scale up and scale down elastically, as we do with these workloads, in a cloud-like fashion. That’s what has fundamentally changed and what we need to change in those legacy infrastructures. Then we can deliver more of an as-a-service, consumption-type model to meet the needs of the business.

 

Gardner: And on that economic front, digitally transformed organizations need data very rapidly, and in greater volumes -- with that scalability to easily go up and down. How will the hybrid cloud model supported by containers provide faster data in greater volumes, and with a managed and forecastable economic burden?

 

Disparate data delivers insights

Kennelly: In a digitally transformed world, data is the raw material of competitive advantage. Access to data is critical. Based on that data, we can derive insights and unique competitive advantages using artificial intelligence (AI) and other tools. But therein lies the question, right?

 

When we look at things like AI, a lot of our time and effort is spent on getting access to the data and being able to assemble that data and move it to where it is needed to gain those insights.

 


Being able to do that rapidly and at a low cost is critical to the storage world. And so that’s what we are very focused on, being able to provide those data services -- to discover and access the data seamlessly. And, as required, we can then move the data very rapidly to build on those insights and deliver competitive advantage to a digitally transformed enterprise.

 


Gardner: Denis, in order to have comprehensive data access and rapidly deliver analytics at an affordable cost, the storage needs to run consistently across a wide variety of different environments -- bare-metal, virtual machines (VMs), containers -- and then to and from both public and private clouds, as well as the edge.

 

What is it about the way that IBM is advancing storage that affords this common view, even across that great disparity of environments?

 

Kennelly: That’s a key design principle for our storage platform, what we call global access or a global file system. We’re going right back to our roots in IBM Research, decades ago, where we invented a lot of that technology. And that’s the core of what we’re still talking about today -- being able to have seamless access across disparate environments.

Access is one issue, right? You can get read-access to the data, but you need to do that at high performance and at scale. At the same time, we are generating data at a phenomenal rate, so you need to scale out the storage infrastructure seamlessly. That’s another critical piece of it. We do that with products or capabilities we have today in things like IBM Spectrum Scale.

 

But another key design principle in our storage platforms is being able to run in all of those environments -- from bare-metal servers to VMs to containers, and right out to edge footprints. So we are making sure our storage platform is designed and capable of supporting all of those platforms. It has to run on them as well as support the data services -- the access services, the mobility services, and the like -- seamlessly across those environments. That’s what enables the hybrid cloud platform at the core of our transformation strategy.

 

Gardner: In addition to the focus on the data in production environments, we also should consider the development environment. What does your data vision include across a full life-cycle approach to data, if you will?

 

Be upfront with data in DevOps

Kennelly: It’s a great point because the business requirements drive the digital transformation strategy. But a lot of these efforts run into inertia when you have to change. The development teams and processes within the organization have traditionally done things in a certain way. Now, all of a sudden, they’re building applications for a very different target environment -- this hybrid cloud environment, from the public cloud, to the data center, and right out to the edge.

 

The economics we’re trying to drive require flexible platforms across the DevOps tool chain so you can innovate very quickly. That’s because digital transformation is all about how quickly you can innovate via such new services. The next question is about the data.

 

As you develop and build these transformed applications in a modern, cloud-like DevOps development process, you have to integrate your data assets early and make sure the data is available -- both in that development cycle and when you move to production. It’s essential to use things like copy-data-management services to integrate that access into your tool chain in a seamless manner. If you build those applications and ignore the data, then it becomes a shock as you roll it into production.
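
As a rough sketch of what integrating data access into the tool chain can look like, here is a hypothetical pipeline step that provisions a masked copy of production data for testing. The helper functions are placeholders standing in for a copy-data-management service, not a real product API.

```python
# Hypothetical sketch of a CI step that provisions a masked copy of
# production data for integration tests. create_snapshot_clone() and
# mask_sensitive_columns() are placeholder names, not a real product API.
def create_snapshot_clone(source_volume: str, target_env: str) -> str:
    """Stand-in for asking a copy-data-management service for a thin clone."""
    clone_id = f"{source_volume}-clone-{target_env}"
    print(f"cloning {source_volume} -> {clone_id}")
    return clone_id

def mask_sensitive_columns(clone_id: str, columns: list) -> None:
    """Stand-in for anonymizing regulated fields before developers use them."""
    print(f"masking {columns} on {clone_id}")

def provision_test_data() -> str:
    clone = create_snapshot_clone("orders-db-vol", target_env="ci")
    mask_sensitive_columns(clone, ["customer_name", "card_number"])
    return clone   # the CI job mounts this clone instead of production data

if __name__ == "__main__":
    provision_test_data()
```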

 

This is the key issue. A lot of times we can get an application running in one scenario and it looks good, but as you start to extend those services across more environments – and haven’t thought through the data architecture -- a lot of the cracks appear. A lot of the problems happen.

 

You have to design in the data access upfront in your development process and into your tool chains to make sure that’s part of your core development process.

 

Gardner: Denis, over the past several years we’ve learned that containers appear to be the gift that keeps on giving. One of the nice things about this storage transition, as you’ve described, is that containers were at first a facet of the development environment.

 

Developers leveraged containers first to solve many problems for runtimes. So it’s also important to understand the limits that containers had. Stateful and persistent storage hadn’t been part of the earlier container attributes.

 

How technically have we overcome some of the earlier limits of containers?

 

Containers create scalable benefits

Kennelly: You’re right, containers have roots in the open-source world. Developers picked up on containers to gain a layer of abstraction. In an operational context, it gives tremendous power because of that abstraction layer. You can quickly scale up and scale down pods and clusters, and you gain cloud-like behaviors very quickly. Even within IBM, we have containerized software and enabled traditional products to have cloud-like behaviors.

 

We were able to move to a scalable, cloud-like platform very quickly using container technology, which is a tremendous benefit as a developer. We then moved containers to operations to respond to business needs, such as when there’s a spike in demand and you need to scale up the environment. Containers are amazing in how quickly and how simply that happens.

 

Now, with all of that power and the capability to scale up and scale down workloads, you also have a storage system sitting at the back end that has to respond accordingly. That’s because as you scale up more containers, you generate more input/output (IO) demands. How does the storage system respond?

 

Well, we have managed to integrate containers into the storage ecosystem. But, as an industry, we have some work to do. The integration of storage with containers is not just the simple IO channel to the storage. It also needs to be able to scale out accordingly, and to be managed. It’s an area where we at IBM are focused, working closely with our friends at Red Hat, to make sure it’s a very seamless integration that gives you consistent, global behavior.
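
One concrete piece of that storage-and-container integration is the persistent volume claim an application makes against a storage class. Below is a minimal sketch using the standard Kubernetes Python client; the storage class name "ibm-block-gold" is a hypothetical placeholder.

```python
# Minimal sketch: request persistent, stateful storage for a container
# workload through a PersistentVolumeClaim. The storage class name is a
# hypothetical placeholder.
from kubernetes import client, config

def request_volume(name: str, namespace: str, size: str, storage_class: str) -> None:
    config.load_kube_config()
    pvc = {
        "apiVersion": "v1",
        "kind": "PersistentVolumeClaim",
        "metadata": {"name": name},
        "spec": {
            "accessModes": ["ReadWriteOnce"],
            "storageClassName": storage_class,
            "resources": {"requests": {"storage": size}},
        },
    }
    client.CoreV1Api().create_namespaced_persistent_volume_claim(namespace, pvc)

request_volume("orders-data", "prod", size="100Gi", storage_class="ibm-block-gold")
```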

 

Gardner: With security and cyber-attacks being so prominent in people’s minds in early 2021, what impacts do we get with a comprehensive data strategy when it comes to security? In the past, we had disparate silos of data. Sometimes, bad things could happen between the cracks.

 

So as we adopt containers consistently, is there an overarching security benefit when it comes to having a common data strategy across all of your data and storage types?

 

Prevent angles of attack

Kennelly: Yes. It goes back to the hybrid cloud platform and having potentially multiple public clouds, data center workloads, edge workloads, and all of the combinations thereof. The new core is containers, but with applications running across that hybrid environment, we’ve expanded the attack surface beyond the data center.

 

By expanding the attack surface, unfortunately, we’ve created more opportunities for people to do nefarious things, such as interrupt the applications and get access to the data. But when people attack a system, the cybercriminals are really after the data. Those are the crown jewels of any organization. That’s why this is so critical.

 


Data protection then requires understanding when somebody is tampering with the data or gaining access to data and doing something nefarious with that data. As we look at our data protection technologies, and as we protect our backups, we can detect if something is out of the ordinary. Integrating that capability into our backups and data protection processes is critical because that’s when we see at a very granular level what’s happening with the data. We can detect if behavioral attributes have changed from incremental backups or over time.
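
A simplified illustration of that behavioral check: compare the change rate of successive incremental backups and flag a sudden jump of the kind mass encryption by ransomware would cause. This is a hedged sketch, not IBM's detection logic.

```python
# Hedged sketch (not IBM's detection logic): flag an incremental backup
# whose changed-data volume jumps far above the recent baseline, a common
# symptom of mass encryption by ransomware.
from statistics import mean

def suspicious_backups(changed_gb_per_backup, factor: float = 5.0, window: int = 7):
    flagged = []
    for i in range(window, len(changed_gb_per_backup)):
        baseline = mean(changed_gb_per_backup[i - window:i])
        if changed_gb_per_backup[i] > factor * baseline:
            flagged.append(i)   # index of the backup to investigate
    return flagged

history = [12, 10, 14, 11, 13, 12, 15, 240]   # illustrative daily increments (GB)
print(suspicious_backups(history))            # -> [7]
```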

 

We can also integrate that into business process because, unfortunately, we have to plan for somebody attacking us. It’s really about how quickly we can detect and respond to get the systems back online. You have to plan for the worst-case scenario.

 

That’s why we have such a big focus on making sure we can detect in real time when something is happening, as the blocks are literally being written to the disk. We can then also unwind back to a known good copy. That’s a huge focus for us right now.

 

Gardner: When you have a comprehensive data infrastructure and can go global and access data across all of these different environments, it seems to me that you have set yourself up for a pervasive analytics capability, which is the gorilla in the room when it comes to digital business transformation. Denis, how does the IBM Storage vision help bring more pervasive and powerful analytics to better drive a digital business?

 

Climb the AI Ladder

Kennelly: At the end of the day, that’s what this is all about. It’s about transforming businesses, to drive analytics, and provide unique insights that help grow your business and respond to the needs of the marketplace.

 

It’s all about enabling top-line growth. And that’s only possible when you can have seamless access to the data very quickly to generate insights literally in real time so you can respond accordingly to your customer needs and improve customer satisfaction.

 

This platform is all about discovering that data to drive the analytics. We have a phrase within IBM, we call it “The AI Ladder.” The first rung on that AI ladder is about discovering and accessing the data, and then being able to generate models from those analytics that you can use to respond in your business.

 

We’re all in a world based on data. And we’re using it to not only look for new business opportunities but for optimizing and automating what we already have today. AI has a major role to play where we can look at business processes and understand how they are operating and then, based on analytics and AI, drive greater automation. That’s a huge focus for us as well: Not only looking at the new business opportunities but optimizing and automating existing business processes.

 

Gardner: I’m afraid we’ll have to leave it there. You’ve been listening to a sponsored BriefingsDirect discussion on how consistent and global storage models best propel pervasive analytics and support digital business transformation.

 

And we’ve learned how IBM Storage is leveraging containers and the latest storage advances to deliver inclusive, comprehensive, and actionable data. So please join me in thanking our guest, Denis Kennelly, General Manager, IBM Storage. Thank you so much, Denis.

 

Kennelly: Thank you, Dana.

 

Gardner: And please look forward to when Denis joins me again for the next discussion in this three-part series. We’ll delve beneath the covers to learn more about the actual technologies enabling IBM’s vision for the future of data storage.

 


A big thank you as well to our audience for joining these BriefingsDirect data strategies insights discussions. I’m Dana Gardner, Principal Analyst at Interarbor Solutions, your host throughout this series of IBM Storage-sponsored BriefingsDirect discussions.

 

Thanks again for listening. Please pass this along to your IT community, and do come back next time.

 

Listen to the podcast. Find it on iTunes. Download the transcript. Sponsor: IBM Storage.

A transcript of a discussion on how consistent global storage models best accelerate and enable pervasive analytics that support digital business transformation. Copyright Interarbor Solutions, LLC, 2005-2021. All rights reserved.

 
