Tuesday, February 16, 2021

How Global Data Availability Accelerates Collaboration and Delivers Business Insights


A transcript of a discussion that explores how comprehensive and global data storage access delivers the rapid insights businesses need for digital business transformation.

Listen to the podcast. Find it on iTunes. Download the transcript. Sponsor: IBM Storage.

Dana Gardner: Hi, this is Dana Gardner, Principal Analyst at Interarbor Solutions and you’re listening to BriefingsDirect.

Our next data strategy insights discussion explores the payoffs when enterprises overcome the hurdles of disjointed storage to obtain global data access.

By leveraging the latest in container and storage server technologies, the holy grail of inclusive, comprehensive, and actionable storage can be obtained. And such access extends across all deployment models -- from hybrid cloud, to software-as-a-service (SaaS), to distributed data centers, and the edge.

Stay with us now as we examine the role that comprehensive data storage plays in delivering the rapid insights businesses need for digital business transformation. To learn how, we’re joined again by Denis Kennelly, General Manager, IBM Storage. Welcome back, Denis.

Kennelly: Thank you, Dana.

Gardner:  Denis, in our earlier discussions in this three-part series we learned about IBM’s vision for global consistent data, as well as the newest systems forming the foundation for these advances.

But let’s now explore the many value streams gained from obtaining global data access. We hear a lot about the rise of artificial intelligence (AI) adoption needed to support digital businesses. So what role does a modern storage capability -- particularly with a global access function and value -- play in that AI growth?

Kennelly: As enterprises become increasingly digitally transformed, the amount of data they are generating is enormous. IDC predicts something like 42 billion connected Internet of Things (IoT) devices by 2025, and so storage can no longer be centralized in data centers. It needs to be distributed across the entire hybrid cloud environment.

Discover and share AI data

For actionable AI, you want to build models on all of the data that’s been generated across this environment. Being able to discover and understand that data is critical, and that’s why it’s a key part of our storage capabilities. You need to run that storage on all of these highly distributed environments in a seamless fashion. You could be running anywhere -- the data center, the public cloud, and at edge locations. But you want to have the same software and capabilities for all of these locations to allow for that essential seamless access.

That’s critical to enabling an AI journey because AI doesn’t just operate on the data sitting in a public cloud or data center. It needs to operate on all of the data if you want to get the best insights. You must get to the data from all of these locations and bring it together in a seamless manner.

Gardner: When we’re able to attain such global availability of data -- particularly in a consistent context -- how does that accelerate AI adoption? Are there particular use cases, perhaps around DevOps? How do people change their behavior when it comes to AI adoption, thanks to what the storage and data consistency can do for them?

Kennelly: First, it’s about knowing where the data is and doing basic discovery. And that’s a non-trivial task, because data is being generated across the enterprise. We are increasingly collaborating remotely, and that generates a lot of widely distributed data. Being able to access and share that data across environments is a critical requirement. It’s something that’s very important to us.

Then, as you discover and share the data, you can bring it together for use by AI models. You can use it to generate better AI models across the various tiers of storage. But you don’t want to just end up saying, “Okay, I discovered all of the data. I’m going to move it to this certain location and then I’m going to run my analytics on it.”

Instead, you want to do the analytics in real time and in a distributed fashion. And that’s what’s critical about the next level of storage.

Coming back to what’s hindering AI adoption, number one is data discovery, because enterprises spend a huge amount of time just finding the data. And once you find it, you need seamless access. And then, of course, as you build your AI models, you need to infuse those analytics into the applications and capabilities that you’re developing.

And that leads to your question about DevOps: being able to integrate the process of generating and building AI models into the application development process, so that application developers can leverage those insights in the applications they are building.
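As a rough illustration of the discovery step Kennelly describes -- a minimal sketch only, with hypothetical mount points and a simple SQLite catalog standing in for a product like IBM Spectrum Discover -- a metadata crawler might look like this:

```python
import os
import sqlite3
from datetime import datetime, timezone

# Hypothetical mount points spanning data center, cloud, and edge tiers.
MOUNTS = ["/mnt/datacenter", "/mnt/cloud-tier", "/mnt/edge-site-01"]

def catalog_files(db_path="catalog.db"):
    """Walk each mount and record basic file metadata into a local catalog."""
    con = sqlite3.connect(db_path)
    con.execute(
        "CREATE TABLE IF NOT EXISTS files "
        "(path TEXT PRIMARY KEY, size_bytes INTEGER, modified_utc TEXT, mount TEXT)"
    )
    for mount in MOUNTS:
        for root, _dirs, names in os.walk(mount):
            for name in names:
                full = os.path.join(root, name)
                try:
                    st = os.stat(full)
                except OSError:
                    continue  # skip files that vanish or deny access mid-scan
                mtime = datetime.fromtimestamp(st.st_mtime, tz=timezone.utc).isoformat()
                con.execute(
                    "INSERT OR REPLACE INTO files VALUES (?, ?, ?, ?)",
                    (full, st.st_size, mtime, mount),
                )
    con.commit()
    con.close()

if __name__ == "__main__":
    catalog_files()
```

Once such a catalog exists, data sets can be located and shared across tiers without first hauling everything to one place.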

Gardner:  For many organizations, moving to hybrid cloud has been about application portability. But when it comes to the additional data mobility we gain from consistent global data access, there’s a potential greater value. Is there a second shoe to fall, if you will, Denis, when we can apply such data mobility in a hybrid cloud environment?

Access data across hybrid cloud

Kennelly: Yes, and that second shoe is about to fall. The first part of our collective cloud journey was all about moving everything to public clouds and building applications with cloud-based data.

What we discovered in doing that is that life is not so simple. For many reasons, we are really now in a hybrid cloud world, and we need a hybrid cloud approach.

The need for more cloud portability has led to technologies like containers to get portability across all of the environments -- from data centers to clouds. As we roll out containers into production, however, the whole question of data becomes even more critical.

You can now build an application that runs in a certain environment, and containers allow you to move that application to other environments very quickly. But if the data doesn’t follow -- if the data access doesn’t follow that application seamlessly -- then you face some serious challenges and problems.

And that is the next shoe to drop, and it’s dropping right now. As we roll out these sophisticated applications into production, being able to copy data or get access to data across this hybrid cloud environment is the biggest challenge the industry is facing.

Gardner: When we envision such expansive data mobility, we often think about location, but it also impacts the type of data -- be it file, block, or object storage, for example. Why must there be global access geographically -- but also across storage types and the underlying technology platforms?

Kennelly: We really have to hide that layer of complexity -- the storage type and platform -- from the application developer. At the end of the day, the application developer is looking for a consistent API through which to access data services, whether that’s file, block, or object. They shouldn’t have to care about that level of detail.

It’s important that there’s a focus on consistent access via APIs for the developer. Then the storage subsystem takes care of the federated, global access to the data. Also, as data grows, the storage subsystem should scale horizontally.

These are the design principles we have put into the IBM Storage platform. Number one, you get seamless, consistent access -- be it file, object, or block storage. And number two, we can scale horizontally as you generate data across that hybrid cloud environment.
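To make that design principle concrete, here is a minimal sketch of what a consistent data-access API can look like to a developer. The class and method names are hypothetical illustrations of the idea, not IBM’s actual interface, and the object backend assumes an S3-compatible client such as boto3:

```python
from abc import ABC, abstractmethod

class DataService(ABC):
    """One consistent interface, regardless of the backing storage type."""

    @abstractmethod
    def read(self, key: str) -> bytes: ...

    @abstractmethod
    def write(self, key: str, data: bytes) -> None: ...

class FileBackend(DataService):
    """File (or block-device-backed filesystem) storage behind the contract."""
    def __init__(self, mount: str):
        self.mount = mount

    def read(self, key: str) -> bytes:
        with open(f"{self.mount}/{key}", "rb") as f:
            return f.read()

    def write(self, key: str, data: bytes) -> None:
        with open(f"{self.mount}/{key}", "wb") as f:
            f.write(data)

class ObjectBackend(DataService):
    """Same contract; an S3-compatible object store underneath."""
    def __init__(self, client, bucket: str):
        self.client, self.bucket = client, bucket

    def read(self, key: str) -> bytes:
        return self.client.get_object(Bucket=self.bucket, Key=key)["Body"].read()

    def write(self, key: str, data: bytes) -> None:
        self.client.put_object(Bucket=self.bucket, Key=key, Body=data)

def process(svc: DataService, key: str) -> int:
    # Application code never branches on the storage type.
    return len(svc.read(key))
```

Code like `process()` runs unchanged whichever backend it is handed; federation and horizontal scaling remain the storage layer’s problem, which is exactly the separation Kennelly describes.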

Gardner: The good news is that global data access can now be enabled with ease. The bad news is that such access works anywhere, anytime -- with that same ease.

And so we have to also worry about access, security, permissions, and regulatory compliance issues. How do you open the floodgates, in a sense, for common access to distributed data, but at the same time put in the guardrails that allow for the management of that access in a responsible way?

Global data access opens doors

Kennelly: That’s a great question. As we introduce simplicity and ease of data access, we can’t just open it up to everybody. We have to make sure we have good authentication as part of the design, using things like two-factor authentication on the data-access APIs.

But that’s only half of the problem. In the security world, the unfortunate acceptance is that you probably are going to get breached. How you respond is what really differentiates you and determines how quickly you can get the business back on its feet.

And so, when something bad happens, another critical role for the storage subsystem is access control to the persistent storage. At the end of the day, that is what attackers are after. Understanding the typical behavior of those storage systems -- how data is usually stored and accessed -- gives you a baseline against which to recognize when something out of the ordinary is happening.

Clearly, if you’re under a malware or CryptoLocker attack, you see a very different input/output (IO) pattern than you would normally see. We can detect that in real time, understand when it happens, and make sure you have protected copies of the data so you can restore quickly and get the business back online.

Why is all of that important? Because we live in a world where it’s not a case of if it will happen, it’s really when it will happen. How we can respond is critical.
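As a sketch of that IO-pattern idea -- an illustration of the concept only, not IBM’s detection engine -- a rolling baseline over write rates can flag the kind of sudden burst a CryptoLocker-style attack produces:

```python
from collections import deque
from statistics import mean, stdev

class IOAnomalyDetector:
    """Flag write-rate spikes against a rolling baseline.

    A real ransomware detector would also watch entropy, rename storms,
    and read-then-overwrite patterns; this shows only the baseline idea.
    """

    def __init__(self, window: int = 120, threshold_sigmas: float = 4.0):
        self.samples = deque(maxlen=window)  # e.g., one sample per second
        self.threshold = threshold_sigmas

    def observe(self, writes_per_sec: float) -> bool:
        """Return True if this sample looks anomalous versus the baseline."""
        anomalous = False
        if len(self.samples) >= 30:          # wait for a usable baseline
            mu, sigma = mean(self.samples), stdev(self.samples)
            if sigma > 0 and (writes_per_sec - mu) / sigma > self.threshold:
                anomalous = True             # e.g., trigger a protected snapshot
        self.samples.append(writes_per_sec)
        return anomalous

detector = IOAnomalyDetector()
for rate in [100, 110, 95, 105, 98] * 10 + [5000]:  # sudden encryption burst
    if detector.observe(rate):
        print(f"Anomalous IO rate: {rate} writes/sec")
```

The point of keeping protected, immutable copies alongside detection is that the response -- restore and resume -- is already staged when the alarm fires.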

Gardner: Denis, throughout our three-part series we’ve been discussing what we can do, but we haven’t necessarily delved into specific use cases. I know you can’t always name businesses and reference customers, but how can we better understand the benefits of a global data access capability in the context of use cases?

In practice, when the rubber hits the road, how does global data storage access enable business transformation? Is there a key metric you look for to show how well your storage systems support business outcomes? 

Global data storage success

Kennelly: We’re at a point right now when customers are looking to drive new business models and to move much more quickly in their hybrid cloud environments.

There are enabling technologies right now facilitating that. There’s a lot of talk about edge with the advent of 5G networks, which enable a lot of this to happen. When you talk about seamless access and the capability to distribute data across these environments, you need the underlying network infrastructure to make that happen.

As we do that, we’re looking at a number of key business measures and metrics. We have done some independent surveys and analysis looking at the business value that we drive for our clients with a hybrid cloud platform and things like portability, agility, and seamless data access.

In terms of business value, we have four or five measures. For example, we can drive roughly 2.5 times more business value for our clients -- everything from top-line growth to operational savings. And that’s something that we have tested with many clients independently.

One example that’s very relevant in the world we live in today: We have a cloud provider that needed more federated access to their global data, but they also wanted to distribute that data through edge nodes in a consistent manner. That’s an example of this in action.

Gardner: You know, some of the major consumers of analytics in businesses these days are data scientists, and they don’t always want to know what’s going on underneath the covers. On the other hand, what goes on underneath the covers can greatly impact how well they can do their jobs, which are often essential to digital business transformation.

For you to address a data scientist specifically about why global access for data and storage modernization is key, what would you tell them? How do you describe the value that you’re providing to someone like a data scientist who plays such a key role in analytics?

Kennelly: Well, data scientists talk a lot about data sets. They want access to data sets so they can test their hypotheses very quickly. In a nutshell, we surface data sets faster than anybody else, at a price-performance that leads the industry -- and that’s what we do every day to enable data scientists.

Gardner: Throughout our series of three storage strategy discussions, we’ve talked about how we got here and what we’re doing. But we haven’t yet talked about what comes next.

These enabling technologies not only satisfy business imperatives and requirements now, they set up organizations to be even more intelligent over time. Let’s look to the future and the expanding value of doing global data access well across hybrid clouds.

Insight-filled future drives growth

Kennelly: Yes, you get to critically look at current and new business models. At the end of the day, this is about driving business growth. As you start to look at these environments -- and we’ve talked a lot about analytics and data – it becomes about getting competitive advantage through real-time insights about what’s going on in your environments.

You become able to better understand your supply chain, what’s happening in certain products, and in certain manufacturing lines. You’re able to respond accordingly. There’s a big operational benefit in terms of savings. You don’t have to have excess capacity in the environment.

Also, in seeking new business opportunities, you will detect patterns and gain insights you didn’t have before by applying analytics and machine learning to what’s critical in your systems and markets. If you move your IT environment and centralize everything in one cloud, for example, that really hinders such progress.

By being able to do that with all of the data as it’s generated in real time, you get unique insights that provide competitive advantage.

Gardner: And lastly, why IBM? What sets you apart from the competition in the storage market for obtaining these larger goals of distributed analytics, intelligence, and competitiveness?

Kennelly: We have shown over the years that we have been at the forefront of many transformations of businesses and industries. Going back to the electronic typewriter, if we want to go back far enough, or now to our business-to-business (B2B) or business-to-employee (B2E) models in the hybrid cloud -- IBM has helped businesses make these transformations. That includes everything from storage to data and AI through to hybrid cloud platforms, with Red Hat Enterprise Linux, and right out to our business service consulting.

IBM has the end-to-end capabilities to make that all happen. It positions us as an ideal partner who can do so much.

I love to talk about storage and the value of storage, and I spend a lot of time talking with people in our business consulting group to understand the business transformations that clients are trying to drive and the role that storage has in that. Likewise, with our data science and data analytics teams that are enabling those technologies.

The combination of all of those capabilities is a unique differentiator for us in the industry. And it’s why we are developing the leading-edge capabilities, products, and technology to enable the next digital transformations.

Gardner: I’m afraid we’ll have to leave it there. You have been listening to a sponsored BriefingsDirect discussion on the major payoffs when enterprises overcome the hurdles of disjointed storage to obtain global data access.

And we’ve learned about the role and impact of a comprehensive and global data storage model when delivering rapid insights for accomplishing digital business transformation.

So please join me in thanking our guest, Denis Kennelly, General Manager, IBM Storage. Thank you so much, Denis.

Kennelly: Thank you, Dana.

Gardner: And thanks as well to our audience for joining these BriefingsDirect data strategies insights discussions. Please look for the other two discussions in this series on the IBM Storage vision, as well as the newest systems that form the foundation for these advances.

I’m Dana Gardner, principal analyst at Interarbor Solutions, your host throughout the series of IBM Storage sponsored BriefingsDirect discussions. Thanks again for listening. Please pass this along to your IT community, and do come back next time.

Listen to the podcast. Find it on iTunes. Download the transcript. Sponsor: IBM Storage.

A transcript of a discussion that explores how comprehensive and global data storage access delivers the rapid insights businesses need for digital business transformation. Copyright Interarbor Solutions, LLC, 2005-2021. All rights reserved.

Wednesday, February 10, 2021

How Consistent Data Services Deliver Simplicity, Compatibility, and Lower Cost

A transcript of a discussion on the latest technologies and products delivering common data services across today’s hybrid cloud, distributed data centers, and burgeoning edge landscapes.

Listen to the podcast. Find it on iTunes. Download the transcript. Sponsor: IBM Storage.

Dana Gardner: Hi, this is Dana Gardner, Principal Analyst at Interarbor Solutions and you’re listening to BriefingsDirect. Part 2 in our Data Strategies Insights Discussion Series explores the latest technologies and products that are delivering common data services across today’s hybrid cloud, distributed data centers, and burgeoning edge landscapes.

New advances in storage technologies, standards, and methods have changed the game when it comes to overcoming the obstacles businesses too often face when seeking pervasive analytics across their systems and services.

Stay with us now as we examine how IBM Storage is leveraging containers and the latest storage advances to deliver inclusive, comprehensive, and actionable storage.

To learn more about the future of storage strategies that accelerate digital transformation, please join me in welcoming Denis Kennelly, General Manager, IBM Storage. Welcome back, Denis.

Denis Kennelly: Thank you, Dana. It’s great to be here.

Gardner: In our earlier discussion we learned about the business needs and IBM’s large-scale vision for global, consistent data. Let’s now delve beneath the covers into what enables this new era of data-driven business transformation.

In our last discussion, we talked about containers -- how they had been typically relegated to application development. What should businesses know about the value of containers more broadly within the storage arena as well as across other elements of IT?

Containers for ease, efficiency

Kennelly: Sometimes we talk about containers as being unique to application development, but I think the real business value of containers is in the operational simplicity and cost savings.

When you build applications on containers, they are container-aware. When you look at Kubernetes and the controls you have there as an operations IT person, you can scale up and scale down your applications seamlessly.

As we think about that and about storage, we have to include storage under that umbrella. Traditionally, storage did a lot of that work independently. Now we are in a much more integrated environment where you have cloud-like behaviors. And you want to deliver those cloud-like behaviors end-to-end -- for the applications, the data, the storage, and even the network -- right across the board. That way you have a much more seamless, easier, and operationally efficient way of running your environment.

Containers are much more than just an application development tool; they are a key enabler to operational improvement across the board.
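That operational control is scriptable. A minimal sketch using the open-source Kubernetes Python client -- assuming a configured kubeconfig and an existing deployment named `web`, both hypothetical here -- shows how the same scale-up call works against any conformant cluster, on-premises or in a public cloud:

```python
from kubernetes import client, config

def scale_deployment(name: str, namespace: str, replicas: int) -> None:
    """Scale a deployment up or down via the Kubernetes API."""
    config.load_kube_config()  # reads the local kubeconfig
    apps = client.AppsV1Api()
    apps.patch_namespaced_deployment_scale(
        name=name,
        namespace=namespace,
        body={"spec": {"replicas": replicas}},
    )

# Example: scale the (assumed) "web" deployment to five replicas.
scale_deployment("web", "default", 5)
```

Because the call is the same everywhere Kubernetes runs, the scale-up/scale-down behavior Kennelly describes carries across the whole hybrid estate.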

Gardner: Because hybrid cloud and multi-cloud environments are essential for digital business transformation, what does this container value bring to bridging the hybrid gap? How do containers lead to a consistent and actionable environment, without integrations and complexity thwarting wider use of assets around the globe?

Kennelly: Let’s talk about what a hybrid cloud is. To me, a hybrid cloud is the ability to run workloads on a public cloud, on a private cloud or traditional data center, and even right out to edge locations in your enterprise where there are no IT people whatsoever.

Being able to do that consistently across that environment -- that’s what containers bring. They allow a layer of abstraction above the target environment, be it a bare-metal server, a virtual machine (VM), or a cloud service -- and you can do that seamlessly across those environments.

That’s what a hybrid cloud platform is, and what enables it is containers -- a seamless runtime across this entire environment.

And that’s core to digital transformation, because when we start to think about where we are today as an enterprise, we still have assets sitting on the data center. Typically, what you see out there are horizontal business processes, such as human resources or sales, and you might want to move those more to a software as a service (SaaS) capability while still retaining your core, differentiating business processes.

For compliance or regulatory reasons, you may need to keep those assets in the data center. Maybe you can move some pieces. But at the same time, you want to have the level of efficiency you gain from cloud-like economics. You want to be able to respond to business needs, to scale up and scale down the environment, and not design the environment for a worst-case scenario.

That’s why a hybrid cloud platform is so critical. And underneath that, why containers are a key enabler. Then, if you think about the data in storage, you want to seamlessly integrate that into a hybrid environment as well.

Gardner: Of course, the hybrid cloud environment extends these days more broadly with the connected edge included. For many organizations the edge increasingly allows real-time analytics capabilities by taking advantage of having compute in so many more environments and closer to so many more devices.

What is it about the IBM hybrid storage vision that allows for more data to reside at the edge without having to move it into a cloud, analyze it there, and move it back? How are containers enabling more data to stay local and still be part of a coordinated whole greater than the sum of the parts?

Data and analytics at the edge

Kennelly: As an industry, we swing from centralized to decentralized -- a pendulum that moves every few years. If you think back, we were in the mainframe era, where everything was very centralized. Then we went to distributed systems and decentralized everything.

With cloud we began to recentralize everything again. And now we are moving our clouds back out to the edge for a lot of reasons, largely because of egress and ingress challenges and to seek efficiency in moving more and more of that data.

When I think about edge, I am not necessarily thinking about Internet of things (IoT) devices or sensors, but in a lot of cases this is about branch and remote locations. That’s where a core part of the enterprise operates, but not necessarily with an IT team there. And that part of the enterprise is generating data from what’s happening in that facility, be it a manufacturing plant, a distribution center, or many others.

As you generate that data, you also want to generate the analytics that are key to understanding how the business is reacting and responding. Do you want to move all that data to a central cloud to run analytics, and then take the result back out to that distribution center? You can do that, but it’s highly inefficient -- and very costly.

What our clients are asking for is to keep the data out at these locations and to run the analytics locally. But, of course, with all of the analytics you still want to share some of that data with a central cloud.

So, what’s really important is that you can share across this entire environment, be it from a central data center or a central cloud out to an edge location and provide what we call seamless access across this environment.

With our technology, with things like IBM Spectrum Scale, you gain that seamless access. We abstract the data access as if you are accessing the data locally -- or it could be back in the cloud. The application really doesn’t care. That seamless access is core to what we are doing.
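That location transparency can be illustrated with the open-source fsspec library -- an analogy for the idea rather than Spectrum Scale itself, and with hypothetical paths. The same read function works whether the URL points at a local file or a cloud object:

```python
import fsspec

def row_count(url: str) -> int:
    """Count lines in a dataset; the caller doesn't care where it lives."""
    with fsspec.open(url, "rt") as f:
        return sum(1 for _ in f)

# Identical application code for edge-mounted or cloud-resident data.
# (Paths are hypothetical; s3:// access requires the s3fs package.)
print(row_count("file:///mnt/edge-site-01/sensors/today.csv"))
print(row_count("s3://central-analytics/sensors/today.csv"))
```

In a Spectrum Scale environment the abstraction sits even lower, at the filesystem layer, so even plain `open()` calls stay location-agnostic.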

Gardner:  The IBM Storage portfolio is broad and venerable. It includes flash, disk, and tape, which continues to have many viable use cases. So, let’s talk about the products and how they extend the consistency and commonality that we have talked about and how that portfolio then buttresses the larger hybrid storage vision.

Storage supports all environments

Kennelly: One of the key design points of our portfolio, particularly our flash line, is being able to run in all environments. We have one software code base across our entire portfolio. That code runs on our disk subsystems and disk controllers, but it can also run on your platform of choice. We absolutely support all platforms across the board. That’s one design principle.

Secondly, we embrace containers very heavily. And being able to run on containers and provide data services across those containers provides that seamless access that I talked about. That’s a second major design principle.

Yet as we look at our storage portfolio, we also want to make sure we optimize the storage and optimize the spend by the customer by tiered storage and being able to move data across those different tiers of storage.

You mentioned tape storage. At times, for example, you may want to move from fast, online, always-on, high-end storage to a lower tier of less expensive storage, maybe for data retention reasons. Or you may need an air-gap solution and want to move data to what we call cold storage, on tape. We support those capabilities, and we can manage your data across that environment.

There are three core design principles to our IBM Storage portfolio. Number one, we can run seamlessly across these environments. Number two, we provide seamless access to the data across those environments. And number three, we support optimization of the storage for the use case at hand, such as being able to tier the storage to your economic and workload needs.
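As a hedged sketch of that third principle, an age-based tiering pass might look like the following. The tier paths and the simple move are hypothetical stand-ins for what a real policy engine does, which would typically leave a stub behind so access stays transparent:

```python
import os
import shutil
import time

HOT_TIER = "/mnt/flash"   # hypothetical fast tier
COLD_TIER = "/mnt/cold"   # hypothetical capacity or tape-fronting tier
MAX_AGE_DAYS = 90         # retention-driven policy threshold

def tier_by_age() -> None:
    """Move files untouched for MAX_AGE_DAYS from the hot to the cold tier."""
    cutoff = time.time() - MAX_AGE_DAYS * 86400
    for root, _dirs, names in os.walk(HOT_TIER):
        for name in names:
            src = os.path.join(root, name)
            if os.stat(src).st_atime < cutoff:       # last access time
                rel = os.path.relpath(src, HOT_TIER)
                dst = os.path.join(COLD_TIER, rel)
                os.makedirs(os.path.dirname(dst), exist_ok=True)
                shutil.move(src, dst)                # a real system leaves a stub

if __name__ == "__main__":
    tier_by_age()
```

The economic point is the policy, not the mechanics: cold data stops consuming the most expensive tier automatically.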

Gardner: Of course, what people are also interested in these days is the FlashSystem performance. Tell us about some of the latest and greatest when it comes to FlashSystem. You have the new 5200, the high-end 9200, and those also complement some of your other products like ESS 3200.

Flash provides best performance

Kennelly: Yes, we continue to expand the portfolio. With the FlashSystems, and some of our recent launches, some things don’t change. We’re still able to run across these different environments.

But in terms of price-performance, especially with the work we have done on our flash technology, we have optimized our storage subsystems to use standard flash. In terms of price for throughput, when we look at this against our competitors, we offer twice the performance for roughly half the price -- and that has been borne out in comparisons with competitors’ technology.

That’s due to leveraging our innovations around what we call the FlashCore Module, wherein we are able to use standard flash in those disk drives and enable compression on the fly. That’s driving the roadmap in terms of throughput and performance at a very, very competitive price point.

Gardner: Many of our readers and listeners, Denis, are focused on their digital business transformation. They might not be familiar with some of these underlying technological advances, particularly end-to-end Non-Volatile Memory Express (NVMe). So why are these systems doing things that just weren’t possible before?

Kennelly: A lot of it comes down to where the technology is today and the price points that we can get from flash from our vendors. And that’s why we are optimizing our flash roadmap and our flash drives within these systems. It’s really pushing the envelope in terms of performance and throughput across our flash platforms.

Gardner: The desired end-product for many organizations is better and pervasive analytics. And one of the great things about artificial intelligence (AI) and machine learning (ML) is it’s not only an output -- it’s a feature of the process of enhancing storage and IT.

How are IT systems and storage using AI inside these devices and across these solutions? What is AI bringing to enable better storage performance at a lower price point?

Kennelly: We continue to optimize what we can do in our flash technology, as I said. But when you embark on an AI project, something like 70 to 80 percent of the spend is around discovery, gaining access to the data, and finding out where the data assets are. And we have capabilities like IBM Spectrum Discover that help catalog and understand where the data is and how to access that data. It’s a critical piece of our portfolio on that journey to AI.

We also have out-of-the-box integrations with data and AI platforms like Cloudera, so that we can seamlessly integrate with those platforms and help them differentiate using our Spectrum Scale technology.

But in terms of AI, we have some really key enablers to help accelerate AI projects through discovery and integration with some of the big AI platforms.

Gardner: And these new storage platforms are knocking off some impressive numbers around high availability and low latency. We are also seeing a great deal of consolidation around storage arrays and managing storage as a single pool.

On the economics of the IBM FlashSystem approach, these performance attributes are also being enhanced by reducing operational costs and moving from CapEx to OpEx purchasing.

Storage-as-a-service delivers

Kennelly: Yes, there is no question we are moving toward an OpEx model. When I talked about cloud economics and cloud-like flexibility behavior at a technology level, that’s only one side of the equation.

On the business side, IT is demanding cloud consumption models, OpEx-type models, and pay-as-you-go. It’s not just a pure financial equation; it’s also how you consume the technology. And storage is no different. This is why we are doing a lot of innovation around storage-as-a-service. But what does that really mean?

It means you ask for a service. “I need a certain type of storage with this type of availability, this type of performance, and this type of throughput.” Then we as a storage vendor take care of all the details behind that. We get the actual devices on the floor that meet those requirements and manage that.

As those assets depreciate over a number of years, we replace and update those assets in a seamless manner to the client.

As the storage sits in the data center, maybe the customer says, “I want to move some of that data to a cloud instance.” We also offer a seamless capability to move the data over to the cloud and run that service on the cloud.

We already have all the technology to do that and the platform support for all of those environments. What we are working on now is making sure we have a seamless consumption model and the business processes of delivering that storage-as-a-service, and how to replace and upgrade that storage over time -- while making it all seamless to the client.

I see storage moving quickly to this new storage consumption model, a pure OpEx model. That’s where we as an industry will go over the next few years.
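To make the consumption model concrete, a storage-as-a-service request boils down to declaring outcomes and letting the provider map them to hardware. A minimal sketch, with hypothetical field names, tiers, and made-up rates:

```python
from dataclasses import dataclass

@dataclass
class StorageServiceRequest:
    """Declare what you need; the provider picks and manages the hardware."""
    capacity_tb: int
    performance_tier: str    # e.g., "premium", "balanced", "capacity"
    availability_pct: float  # e.g., 99.99
    location: str            # data center, cloud region, or edge site

def monthly_cost(req: StorageServiceRequest) -> float:
    """Illustrative pay-as-you-go pricing -- the rates are made up."""
    rate_per_tb = {"premium": 40.0, "balanced": 25.0, "capacity": 10.0}
    base = req.capacity_tb * rate_per_tb[req.performance_tier]
    uplift = 1.25 if req.availability_pct >= 99.99 else 1.0
    return base * uplift

req = StorageServiceRequest(capacity_tb=200, performance_tier="premium",
                            availability_pct=99.99, location="dallas-dc-2")
print(f"Estimated OpEx: ${monthly_cost(req):,.2f}/month")
```

Everything behind those declared outcomes -- device selection, refresh cycles, even migration to a cloud instance -- stays on the provider’s side of the line, which is the OpEx shift Kennelly describes.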

Gardner: Another big element of reducing your total cost of ownership over time is in how well systems can be managed. When you have a common pool approach, a comprehensive portfolio approach, you also gain visibility, a single pane of glass when it comes to managing these systems.

Intelligent insights via storage

Kennelly: That’s an area we continue to invest in heavily. Our IBM Storage Insights platform provides tremendous insight into how the storage subsystems are running operationally. It also provides insight into the storage itself -- where you have space constraints, for example, or where you may need to expand.

But that’s not just a manual dashboard that we present to an operator. We are also infusing AI quite heavily into that platform and using AIOps to integrate with Storage Insights to run storage operations at much lower costs and with more automation.

And we can do that in a consistent manner right across the environments, whether it’s a flash storage array, mainframe-attached storage, or a tape device. It’s all seamless across the environment. You can see all of those tiers of storage as one platform, and so you are able to respond quickly to events and understand them as they happen.

Gardner: As we close out, Denis, for many organizations hybrid cloud means that they don’t always know what’s coming and lack control over predicting their IT requirements. Deciding in advance how things get deployed isn’t always an option.

How do the IBM FlashSystems, and your recent announcements in February 2021, provide a path to a crawl-walk-run adoption approach? How do people begin this journey, regardless of the type and size of their organization?

Kennelly: We are introducing an update to our FlashSystem 5200 platform, which is our entry-point platform. That consistent platform runs our storage software, IBM Spectrum Virtualize -- the same software as in our high-end arrays at the very top of our pyramid of capabilities.

As part of that announcement, we are also supporting other public cloud vendors. So you can run the software on our arrays, or you can move it out to run on a public cloud. You have tremendous flexibility and choice due to the consistent software platform.

And, as I said, it’s our entry point, so the price is very competitive. This is a part of the market where we see tremendous growth. You can experience the best of the IBM Storage platform at a low-cost entry point, but also get tremendous flexibility. You can scale up that environment within your data center and right out to your choice of how to use the same capabilities across the hybrid cloud.

There has been tremendous innovation by the IBM team to make sure that our software supports this myriad of platforms, but also at a price point that is the sweet spot of what customers are asking for now.

Gardner: It strikes me that we are on the vanguard of some major new advances in storage, but they are not relegated to the largest enterprises. Even the smallest organizations can take advantage of these technologies and storage benefits.

Kennelly: Absolutely. When we look at the storage market, the fastest-growing part is at that lower price point -- where unit costs run below $50K to $100K. That’s where we see tremendous growth in the market, and we are serving it very well and very efficiently with our platforms. And, of course, as people want to scale and grow, they can do that in a consistent and predictable manner.

Gardner: I’m afraid we will have to leave it there. You have been listening to a sponsored BriefingsDirect discussion on how the latest storage technologies and products are delivering a common data services benefit across today’s hybrid cloud, distributed data centers, and burgeoning edge landscapes. And we have delved beneath the covers to learn about the latest technologies enabling IBM’s vision for the future of data storage.

So please join me now in thanking our guest, Denis Kennelly, General Manager, IBM Storage. Thank you so much, Denis.

Kennelly: Thank you, Dana.

Gardner: Please join me and Denis again soon for our next discussion in this three-part series, as we explore the virtues of global access capabilities and the importance of consistent data -- no matter where it’s stored.

Thank you to our audience as well for joining this BriefingsDirect data strategies insights discussion. I’m Dana Gardner, Principal Analyst at Interarbor Solutions, your host throughout this series of IBM Storage-sponsored BriefingsDirect discussions.

Thanks again for listening. Please pass this along to your IT community, and do come back next time.

Listen to the podcast. Find it on iTunes. Download the transcript. Sponsor: IBM Storage.

A transcript of a discussion on the latest technologies and products delivering common data services across today’s hybrid cloud, distributed data centers, and burgeoning edge landscapes. Copyright Interarbor Solutions, LLC, 2005-2021. All rights reserved.
