Friday, June 07, 2019

How Real-Time Data Streaming and Integration Set the Stage for AI-Driven DataOps


Transcript of a discussion on the latest strategies for uniting and governing data wherever it resides to enable rapid and actionable analysis.

Listen to the podcast. Find it on iTunes. Download the transcript. Sponsor: Qlik.

Dana Gardner: Hi, this is Dana Gardner, Principal Analyst at Interarbor Solutions, and you’re listening to BriefingsDirect. Our next business intelligence (BI) trends discussion explores the growing role of data integration in a multi-cloud world.

Just as enterprises seek to gain more insights and value from their copious data, they’re also finding their applications, services, and raw data spread across a continuum of hybrid and public clouds. Raw data is also piling up closer to the edge -- on factory floors, in hospital rooms, and anywhere digital business and consumer activities exist.

Stay with us now as we examine the latest strategies for uniting and governing data wherever it resides. By doing so, businesses are enabling rapid and actionable analysis -- as well as entirely new levels of human-to-augmented-intelligence collaboration.

To learn more about the foundational capabilities that lead to total data access and exploitation, we’re now joined by Dan Potter, Vice President of Product Marketing at Attunity, a Division of Qlik. Welcome, Dan.

Dan Potter: Hey, Dana. Great to be with you.

Gardner: Dan, what are the business trends forcing a new approach to data integration?

Potter: It’s all being driven by analytics. The analytics world has gone through some very interesting phases of late: Internet of Things (IoT), streaming data from operational systems, artificial intelligence (AI) and machine learning (ML), predictive and preventative kinds of analytics, and real-time streaming analytics.

So, it’s analytics driving data integration requirements, and analytics has changed the way in which data is stored and managed. Things like cloud data warehouses, data lakes, and streaming infrastructure like Kafka -- these are all a response to the business demand for a new style of analytics.

As analytics drives data management changes, the way in which the data is integrated and moved needs to change as well. Traditional approaches to data integration -- batch processes, ETL, and script-oriented integration -- are no longer good enough. All of that is changing. It’s all moving to a much more agile, real-time style of integration, driven by the movement to the cloud and by the need to move more data -- in greater volume and greater variety -- into data lakes, and then to shape that data and make it analytics-ready.

With all of these movements, there have been new challenges and new technologies. The pace of innovation is accelerating, and the challenges are growing. The demand for digital transformation and the move to the cloud has changed the landscape dramatically. With that came great opportunities for us as a modern data integration vendor, but also great challenges for companies that are going through this transition.

Gardner: Companies have been doing data integration since the original relational database (RDB) was kicked around. But it seems the core competency of managing the integration of data is more important than ever.

Innovation transforms data integration

Potter: I totally agree, and if done right, in the future, you won’t have to focus on data integration. The goal is to automate as much as possible because the data sources are changing. You have a proliferation of NoSQL databases, graph databases; it’s no longer just an Oracle database or RDB. You have all kinds of different data. You have different technologies being used to transform that data. Things like Spark have emerged along with other transformation technologies that are real-time-oriented. And there are different targets to where this data is being transformed and moved to.
It’s difficult for organizations to maintain the skill sets -- and you don’t want them to. We want to move to an automated process of data integration. The more we can achieve that, the more valuable all of this becomes. You don’t spend time on mundane data integration; you spend time on the analytics -- and that’s where the value comes from.

Gardner: Now that Attunity is part of Qlik, you are an essential component of a larger undertaking, of moving toward DataOps. Tell me why automated data migration and integration translates into a larger strategic value when you combine it with Qlik?


Potter: DataOps resonates well with the pain we’re setting out to address. DataOps is about bringing the same discipline that DevOps has brought to software development. Only now we’re bringing that to data and data integration for analytics.

How do we accelerate and remove the gap between IT, which is charged with providing analytics-ready data to the business, and all of the various business and analytics requirements? That’s where DataOps comes in. DataOps is technology, but that’s just a part of it. It’s as much or more about people and process -- along with enabling technology and modern integration technology like Attunity.

We’re trying to solve a problem that’s been persistent since the first bit of data hit a hard drive. Data integration challenges will always be there, but we’re getting smarter about the technology that you apply and gaining the discipline to not boil the ocean with every initiative.

The new goal is to get more collaboration around what business users need and to automate the delivery of analytics-ready data, knowing full well that the requirements are going to change often. You can be much more responsive to those business changes, bring in additional datasets, and prepare that data in different ways and in different formats so it can be consumed with different analytics technologies.

That’s the big problem we’re trying to solve. And now, being part of Qlik gives us a much broader perspective on these pains as they relate to the analytics world. It gives us a much broader portfolio of data integration technologies. The Qlik Data Catalyst product is a perfect complement to what Attunity does.

Our role in data integration has been to help organizations move data in real-time as that data changes on source systems. We capture those changes and move that data to where it’s needed -- like a cloud, data lake, or data warehouse. We prepare and shape that data for analytics.

Qlik Data Catalyst then comes in to catalog all of this data and make it available to business users so they can discover and govern that data. And it easily allows for that data to be further prepared, enriched, or to create derivative datasets.

So, it’s a perfect marriage in that the data integration world brings together the strength of Attunity with Qlik Data Catalyst. We have the most purpose-fit, modern data integration technology to solve these analytics challenges. And we’re doing it in a way that fits well with a DataOps discipline.

Gardner: We not only have the different data types, we have another level of heterogeneity to contend with and that’s cloud, hybrid cloud, multi-cloud, and edge. We don’t even know what more is going to be coming in two or three years. How does an organization stay agile given that level of dynamic complexity?

Real-time analytics deliver agility 

Potter: You need a different approach -- a different style of integration technology -- to support these topologies, which are themselves very different. And what the ecosystem looks like today is going to be radically different two years from now.

The pace of innovation just within the cloud platform technologies is very rapid. New databases, transformation engines, orchestration engines -- it all just proliferates. And now you have multiple cloud vendors. There are great reasons for organizations to use multiple clouds, to use the best of the technologies or approaches that work for your organization, your workgroup, your division. So you need that. You need to prepare yourself for that, and modern integration approaches definitely help.


One of the interesting technologies to help organizations provide ongoing agility is Apache Kafka. Kafka is a way to move data in real-time and make the data easy to consume even as it’s flowing. We see that as an important piece of the evolving data infrastructure fabric.

At Attunity we create data streams from systems like mainframes, SAP applications, and RDBs. These systems weren’t built to stream data, but we stream-enable that data. We publish it into a Kafka stream, and that provides great flexibility for organizations to, for example, process that data as it flows for real-time analytics such as fraud detection. It’s an efficient way to publish that data to multiple systems. But it also provides the agility to deliver that data widely and have people find and consume that data easily.
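To make that pattern concrete, here is a minimal sketch of a consumer reading such a change stream from Kafka and scoring each event as it arrives. It assumes the change events are published as JSON and uses the kafka-python client; the topic name, field names, and the toy is_suspicious() rule are invented for illustration, not Attunity’s actual output format.

```python
# A sketch, not Attunity's actual format: CDC events assumed to be JSON on a
# Kafka topic, consumed with kafka-python (pip install kafka-python).
import json

from kafka import KafkaConsumer

def is_suspicious(txn):
    # Toy stand-in for a real fraud model or rules engine.
    return txn.get("amount", 0) > 10_000

consumer = KafkaConsumer(
    "mainframe.transactions",               # hypothetical CDC-fed topic
    bootstrap_servers="localhost:9092",
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
    auto_offset_reset="latest",             # act on new changes as they flow
)

for message in consumer:
    txn = message.value
    if is_suspicious(txn):
        # In practice: raise an alert, block the transaction, enrich, etc.
        print(f"possible fraud on change event: {txn}")
```

The same stream can feed any number of such consumers independently, which is the flexibility of publishing changes once and letting many systems subscribe.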

Such new, evolving approaches enable a mentality that says, “I need to make sure that whatever decision I make today is going to future-proof me.” So, setting yourself up right and thinking about that agility and building for agility on day one is absolutely essential.

Gardner: What are the top challenges companies have for becoming masterful at this ongoing challenge -- of getting control of data so that they can then always analyze it properly and get the big business outcomes payoff?

Potter: The most important competency is at the enterprise architecture (EA) level, more than with the people who traditionally build ETL scripts and integration routines. Those are the pieces you want to automate.

The real core competency is to define a modern data architecture and build it for agility so you can embrace the changing technologies and requirements landscape. It may be that you have all of your eggs in one cloud vendor today. But you certainly want to set yourself up so you can evolve and push processing to the most efficient place, and to attain the best technology for the kinds of analytics or operational workloads you want.

That’s the top competency that organizations should be focused on. As an integration vendor, we are trying to reduce the reliance on technical people to do all of this integration work in a manual way. It’s time-consuming, error-prone, and costly. Let’s automate as much as we can and help companies build the right data architecture for the future.

Gardner: What’s fascinating to me, Dan, in this era of AI, ML, and augmented intelligence is that we’re not just creating systems that will get you to that analytic opportunity for intelligence. We are employing that intelligence to get there. It’s tactical and strategic. It’s a process, and it’s a result.

How do AI tools help automate and streamline the process of getting your data lined up properly?

Automated analytics advance automation 

Potter: This is an emerging area for integration technology. Our focus initially has been on preparing data and making it available for ML initiatives. We work with vendors such as Databricks, which is at the forefront of using a high-performance Spark engine to process data for data science, ML, and AI initiatives.

We need to ask, “How do we bring cognitive engines, things like Qlik, to the fore within our own technology and get smarter about the patterns of integration that organizations are deploying so we can further automate?” That’s really the next wave for us.

Gardner: You’re not just the president, you’re a client.

Potter: Yeah, that’s a great way to put it.

Gardner: How should people prepare for such use of intelligence?

Potter: If it’s done right -- and we plan on doing it right -- it should be transparent to the users. This is all about automation done right. It should just be intuitive. Going back 15 years, when we first brought out replication technology at Attunity, the idea was to automate and abstract away all of the complexity. You could literally drag your source, drag your target, and make it happen. The technology does the mapping and the routing, and handles all the errors for you. It’s that same elegance. That’s where the intelligence comes in, to make it so intuitive that you are not seeing all the magic that’s happening under the covers.

We follow that same design principle in our products. As the technologies get more complex, it’s harder for us to do that, so applying ML and AI becomes even more important to us. That’s really the future for us: As we automate more of these processes, you’ll see less and less of what is happening under the covers.

Gardner: Dan, are there any examples of organizations on the bleeding edge? They understand the data integration requirements and core competencies. They see this through the lens of architecture.

Automation ensures insights into data

Potter: Zurich Insurance is one of the early innovators in applying automation to their data warehouse initiatives. Zurich had been moving to a modern data warehouse to better meet the analytics requirements, but they realized they needed a better way to do it than in the past.

Traditional enterprise data warehousing employs a lot of people, building a lot of ETL scripts. It tends to be very brittle. When source systems change you don’t know about it until the scripts break or until the business users complain about holes in their graphs. Zurich turned to Attunity to automate the process of integrating, moving it to real-time, and automatically structuring their data warehouse.
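To illustrate the brittleness Potter describes, here is a minimal sketch of the kind of schema-drift check that warehouse automation performs before a load runs, so a source-system change fails loudly instead of silently punching holes in reports. The table and column names are invented, and real tooling would read both sides from metadata.

```python
# A sketch of an automated schema-drift check; the column catalogs below are
# invented for illustration.
expected = {"order_id": "int", "amount": "decimal", "placed_at": "timestamp"}
actual = {"order_id": "int", "amount": "decimal", "placed_at": "timestamp",
          "channel": "varchar"}  # a column the source team added overnight

added = actual.keys() - expected.keys()
dropped = expected.keys() - actual.keys()
retyped = {c for c in expected.keys() & actual.keys() if expected[c] != actual[c]}

if added or dropped or retyped:
    # Fail the load (or auto-adjust the target) before users see broken graphs.
    raise RuntimeError(
        f"schema drift detected: added={added}, dropped={dropped}, retyped={retyped}"
    )
```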

The time it takes them to respond to business users is now a fraction of what it was. They reduced 45-day cycles to two-day cycles for updating and building out new data marts for users. Their agility is off the charts compared with the traditional way of doing it. They can now better meet the needs of the business users through automation.

As organizations move to the cloud to automate processes, a lot of customers are embracing data lakes. It’s easy to put data into a data lake, but it’s really hard to derive value from the data lake and reconstruct the data to make it analytics-ready.

For example, you can take transactions from a mainframe and dump all of those into a data lake, which is wonderful. But how do I create any analytic insights? How do I ensure all those frequently updated files I’m dumping into the lake can be reconstructed into a queryable dataset? The way people have done it in the past is manually, hand-writing scripts in Pig and other languages to try to reconstruct it. We fully automate that process. For companies using Attunity technology, our big investments in data lakes have had a tremendous impact on demonstrating value.
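As a rough illustration of what that reconstruction involves, here is a minimal PySpark sketch that compacts a pile of change records into a queryable current-state view by keeping the latest change per key and dropping deleted rows. The columns and the I/U/D operation codes are assumptions for illustration, not the actual landed file format.

```python
# A sketch of CDC compaction in Spark; schema and op codes (I/U/D) are invented.
# Requires pyspark (pip install pyspark).
from pyspark.sql import SparkSession, Window
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("cdc-compaction").getOrCreate()

# Change records as they might land in the lake: one row per source change.
changes = spark.createDataFrame(
    [
        (101, "I", 1, "alice", "2019-06-01T10:00:00"),
        (102, "U", 1, "alice_updated", "2019-06-01T11:00:00"),
        (103, "I", 2, "bob", "2019-06-01T12:00:00"),
        (104, "D", 2, None, "2019-06-01T13:00:00"),
    ],
    ["change_id", "op", "key", "value", "ts"],
)

# Keep only the most recent change per business key...
w = Window.partitionBy("key").orderBy(F.col("ts").desc())
latest = (
    changes.withColumn("rn", F.row_number().over(w))
    .filter("rn = 1")
    .filter(F.col("op") != "D")  # ...and drop keys whose last change was a delete
    .drop("rn", "op", "change_id")
)

latest.show()  # a queryable current-state view of the source table
```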

Gardner: Attunity recently became part of Qlik. Are there any clients that demonstrate the two-plus-two-equals-five effect when it comes to combining Attunity and the Qlik Data Catalyst catalog?

DataOps delivers the magic 

Potter: It’s still early days for us. As we look at our installed base -- and there is a lot of overlap in whom we sell to -- the BI teams and the data integration teams in many cases are separate and distinct. DataOps brings them together.

In the future, as we take the Qlik Data Catalyst and make that the nexus of where the business side and the IT side come together, the DataOps approach leverages that catalog and extends it with collaboration. That’s where the magic happens.


So business users can more easily find the data. They can send their requirements back to the data engineering team as they need them. And by, again, applying AI and ML to the patterns we see on the analytics side, we can better match the data that’s required and automate the delivery and preparation of that data for different business users.

That’s the future, and it’s going to be very interesting. A year from now, having been part of the Qlik family, we’ll have brought together the BI and data integration sides for our joint customers. We are going to see some really interesting results.

Gardner: As this next, third generation of BI kicks in, what should organizations be doing to get prepared? What should the data architect, who is starting to think about DataOps, do to be in an advantageous position to exploit this when the market matures?

Potter: First, they should be talking to Attunity. We get engaged early and often in many of these organizations. The hardest job in IT right now is [to be an] enterprise architect, because there are so many moving parts. But we have wonderful conversations, because at Attunity we’ve been doing this for a long time, we speak the same language, and we bring a lot of knowledge and experience from other organizations to bear. It’s one of the reasons we have deep strategic relationships with many of these enterprise architects and others on the IT side of the house.

They should be thinking about what the next wave is and how best to prepare for it. Foundationally, moving to more real-time, streaming integration is an absolute requirement. You can take our word for it, or you can go talk to analysts and other peers about the need for real-time data and streaming architectures, and how important that is going to be in the next wave.

So they should prepare for that, and think about the agility and the automation that will get them the desired results. If they’re not preparing for that now, they are going to be left behind -- and if they are left behind, the business is left behind. It is a very competitive world, and organizations are competing on data and analytics. The faster you can deliver the right data and make it analytics-ready, the faster and better decisions you can make -- and the more successful you’ll be.

So it really is a do-or-die kind of proposition. That’s why data integration is strategic: It unlocks the value of this data. And if you do it right, you’re going to set yourself up for long-term success.


Gardner: I’m afraid we’ll have to leave it there. You’ve been listening to a sponsored BriefingsDirect discussion on the role of data integration in a multi-cloud world. And we have learned how the latest strategies for uniting and governing all of the data, wherever it resides, enable rapid and actionable analysis.

So, a big thank you to our guest, Dan Potter, Vice President of Product Marketing at Attunity, a Division of Qlik.

Potter: Thank you, Dana. Always a pleasure.

Gardner: And a big thank you as well to our audience for joining this BriefingsDirect business intelligence trends discussion. I’m Dana Gardner, Principal Analyst at Interarbor Solutions, your host throughout this series of Qlik-sponsored BriefingsDirect interviews.

Thanks again for listening. Please pass this along to your IT community, and do come back next time.

Listen to the podcast. Find it on iTunes. Download the transcript. Sponsor: Qlik.

Transcript of a discussion on the latest strategies for uniting and governing data wherever it resides to enable rapid and actionable analysis. Copyright Interarbor Solutions, LLC, 2005-2019. All rights reserved.


Thursday, June 06, 2019

How HCI Forms a Simple Foundation for Hybrid Cloud and Composable Infrastructure


A discussion on how IT operators are seeking increased automation, built-in intelligence, and robust security as they look for turnkey hyperconverged appliance approaches for both cloud and traditional workloads.

Listen to the podcast. Find it on iTunes. Download the transcript. Sponsor: Hewlett Packard Enterprise.


Dana Gardner: Hello, and welcome to the next edition of the BriefingsDirect Voice of the Innovator podcast series. I’m Dana Gardner, Principal Analyst at Interarbor Solutions, your host and moderator for this ongoing discussion on the latest insights into hybrid cloud and hyperconverged infrastructure (HCI) strategies.

Speed to business value and simplicity in deployments have been top drivers of the steady growth around HCI solutions. IT operators are now looking for increased automation, built-in intelligence, and robust security as they seek such turnkey appliance approaches for both cloud and traditional workloads.

Stay with us now as we examine the rapidly evolving HCI innovation landscape, which is being shaped just as much by composability, partnerships, and economics as it is by new technology.

Here to help us learn more about the next chapter of automated and integrated IT infrastructure solutions is Thomas Goepel, Chief Technologist for Hyperconverged Infrastructure at Hewlett Packard Enterprise (HPE). Welcome, Thomas.
 

Thomas Goepel: Thank you for having me.

Gardner: Thomas, what are the top drivers now for HCI as a business tool? What’s driving the market now, and how has that changed from a few years ago?

Goepel: HCI has gone through a really big transformation in the last few years. When I look at how it originally started, it was literally people looking for a better way of building virtual desktop infrastructure (VDI) solutions. They wanted to combine servers and storage in a single device and make it easier to operate.

What I am seeing now is HCI spreading throughout datacenters and becoming one of the core elements of a lot of the datacenters around the world. The use cases have expanded significantly. It started out with VDI, but now people are running all kinds of business applications on HCI -- all the way to critical databases like SAP HANA.

Gardner: People are using HCI in new ways. They are innovating in the market, and that often means they do things with HCI that were not necessarily anticipated. Do you see that happening with HCI?

Ease of use encourages HCI expansion

Goepel: Yes, it’s happened with HCI quite a bit. The original use cases were very much focused on VDI and end-user computing. It was just a convenient way of having a platform for all of your virtual desktops and an easy way of managing them.

But people saw that ease of management can be expanded into other use cases. They then began to bring core business applications, such as Microsoft Exchange or SharePoint, onto the platform, saw there were more and more things they could put there, and gained the entire simplicity that hyperconverged brings to operating this environment.
You no longer had to build a separate server farm, separate storage farm, or even manage your network independently. You could now do all of that from a single interface, a single-entry point, and gain a single point of management. Then people said, “Well, this ease makes it so beneficial for me, why don’t we bring the other things in here?” And then we saw it spread out in the data centers.

What we now have is people saying, “Hey, let me take this a step further. If I have remote offices, branch offices, or edge use-cases where I also need compute resources, why not try to take HCI there? Because typically on the edge I don’t even have system administrators, so I can take this entire simplicity down to this point, too.”

And the nice thing with hyperconvergence is that -- at least in the HPE version of hyperconvergence, which is HPE SimpliVity -- it’s not only simple to manage, it also has all of the enterprise features built in, such as high availability and data efficiency, which makes it a really robust solution. It has come a very long way on this journey.

Gardner: Thomas, you mentioned the role of HCI at the edge gaining traction and innovation. What’s a typical use case for this sort of micro datacenter at the edge? How does that work?

Losing weight with HCI wins the race

Goepel: Let me give you a really good example of a super-fast-paced industry: Formula One car racing. It really illustrates how edge is having an impact -- and also how this has a business impact.

One of our customers, Aston Martin Red Bull Racing, has been very successful in Formula One racing. The rules of the International Automobile Federation (FIA), the governing body of Formula One racing, say that each race team can bring only a certain amount of weight to a racetrack during the races.

This is obviously a high-tech race. They are adjusting the car during the race, lap by lap, making adjustments based on the real-time performance of the car to get the last inch possible out of the car to win that race. All of these cars are very close to each other from a performance perspective.

Traditionally, they shipped racks and racks of IT gear to the racetrack to calculate the performance of the car and make adjustments during the race. They have now replaced all of these racks with HPE SimpliVity HCI gear and significantly reduced the amount of gear. It means having significantly less weight to bring to the racetrack.
There are two benefits. First, reducing the weight of the IT gear allows them to bring additional things to the racetrack because what counts is the total weight -- and that includes the car, spare parts, people, equipment -- everything. There is a certain mandated limit.

By taking that weight out, having less IT equipment on the racetrack, the HCI allows them to bring extra personnel and spare parts. They can perform better in the races.

The other benefit is that HCI performs significantly better than traditional IT infrastructure. They can now make adjustments within one lap of the race, versus before, when it took three laps before they could make adjustments to the car.

This is a huge competitive advantage. When you look at the results, they are doing great when it comes to Formula One racing, especially for being a smaller team compared to the big teams out there.

From that perspective, at the edge, HCI is making some big improvements, not only in a high-end industry like Formula One racing, but in all kinds of other industries, including manufacturing and retail. They are seeing similar benefits.

Gardner: I wrote a research paper about four years ago, Thomas, that laid out the case that HCI will become a popular on-ramp to private clouds and ultimately hybrid cloud. Was I ahead of my time?

HCI on-ramp to the clouds

Goepel: Yes, I think you were a little bit ahead of your time. But you were also a visionary to lay out that groundwork. When you look at the industry, hyperconvergence is a fast-growing industry segment. When it comes to server and data center infrastructure, HCI has the highest growth rate across the entire IT industry.

What you were foreseeing four years ago is exactly what we now have, and I don’t see an end anytime soon. HCI continues to grow as people discover new use cases. The edge is one new element, but we are just scratching the surface.

Edge use cases are a fascinating new world in general -- from such distributed environments as smart cities and smart manufacturing. We are just starting to get into this world. There’s a huge opportunity for innovation and this will become an attractive area for hyperconvergence.

Gardner: How does HCI innovation align with other innovations at HPE around automation, composability, and intelligence derived to make IT behave as total solutions? Is there a sense that the whole is greater than the sum of the parts?

HCI innovations prevent problems

Goepel: Absolutely there is. We have leveraged a lot of innovation in the broader HPE ecosystem, including the latest generation of the ProLiant DL380 Server, the most secure server in the industry. All of these elements flowed into the HPE SimpliVity HCI platform, too.

But we are not stopping there. A lot of other innovations in the HPE ecosystem are being brought into hyperconvergence. A perfect example is HPE InfoSight, a management platform that allows you to operate your infrastructure better by understanding what’s going on in a very efficient way. It uses artificial intelligence (AI) to detect when something is going wrong in your IT environment so you can proactively take action and don’t end up with a disaster.
HPE InfoSight originally started out in storage, but we are now taking it into the full HPE SimpliVity HCI ecosystem. It’s not just a support portal, it gives you intelligence to understand what’s going on before you run into problems. Those problems can be solved so your environment keeps running at top performance. You’ll have what you need to run any mission-critical business on HCI.
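As a generic illustration of that idea -- and emphatically not HPE InfoSight’s actual algorithm -- here is a minimal sketch that flags telemetry readings deviating sharply from recent behavior, using a simple rolling z-score. The latency numbers and the threshold are invented for illustration.

```python
# A sketch of proactive anomaly detection on infrastructure telemetry; a toy
# rolling z-score, not HPE InfoSight's actual method.
from collections import deque
from statistics import mean, stdev

def detect_anomalies(samples, window=20, threshold=3.0):
    """Yield (index, value) for samples far outside the recent norm."""
    recent = deque(maxlen=window)
    for i, value in enumerate(samples):
        if len(recent) == window:  # wait until the window is full
            mu, sigma = mean(recent), stdev(recent)
            if sigma > 0 and abs(value - mu) / sigma > threshold:
                yield i, value
        recent.append(value)

# Hypothetical latency telemetry: steady around 5 ms, then a spike.
latencies = [5.0, 5.1, 4.9, 5.2, 5.0] * 5 + [42.0]
for index, value in detect_anomalies(latencies):
    print(f"sample {index}: {value} ms looks anomalous -- investigate proactively")
```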

More and more of these innovations in our ecosystem will be brought into the hyperconverged world. Another example is around composability. We have been developing a lot of platform capabilities around composability and we are now bringing HPE SimpliVity and composability together. This allows customers to actually change the infrastructure’s personality depending on the workload, including bringing on HPE SimpliVity. You can get the best of these two worlds.

This leads to building a private cloud environment that can be easily connected to a public cloud or clouds. You will ultimately build out a hybrid IT environment in such a way that your private cloud, or on-premises, environment runs in the most optimized way for your business and for your specific needs as a company.

Gardner: You are also opening up that HCI ecosystem with new partners. Tell us how innovation around hyperconverged is broadening and making it more ecumenical for the IT operations consumer.

Welcome to the hybrid world

Goepel: HPE has always been an open player. We never believed in locking down an environment or making it proprietary and basically locking out everyone else. We have always been a company that listens to what our customers want, what our customers need, and then give them the best solution.

Now, customers are looking to run their HCI environment on HPE equipment and infrastructure because they know that this is reliable infrastructure. It is working, and they feel comfortable with it, and they trust it. But we also have customers who say, “Hey, you know, I want to run this piece of software or that solution on this HPE environment. Can you make sure this runs and works perfectly?”


We are in a hybrid world. And in a hybrid world there is not a single vendor that can cover the entire hybrid market. We need to innovate in such a way that we allow an ecosystem of partners to all come together and work collaboratively and jointly to provide new solutions.

We have recently announced new partnerships with other software vendors, and we complement that with HPE GreenLake Flex Capacity. With that, instead of making big, upfront investments in equipment, you can pay in a more innovative way financially. It is about bringing the solution that solves the customer’s real problems, rather than locking the customer into certain infrastructure.

Flexibility improves performance 

Gardner: You are broadening the idea of making something consumable when you innovate, not only around the technology and the partnerships, but also the economic model, the consumption model. Tell us more about how HPE GreenLake Flex Capacity and acquiring a turnkey HPE SimpliVity HCI solution can accelerate value when you consume it, not as a capital expense, but as an operating cost affair.

Goepel: No industry is 100 percent predictable -- at least I haven’t seen it, and I haven’t found it. Not even the most conservative government institution with a five-year plan is predictable. There are always factors that will disrupt that predictability, and you have to react to them.
Traditionally, what we have done in the industry is oversize our environments to account for anticipated growth over five years -- and then add another 25 percent on top of it, and then another 10 percent of cover on top of that -- hoping we did not undersize the environment by the time the equipment reached the end of its life.

That is a lot of capital you are investing in something that just sits there, has no value and no use, and basically stands around -- something you then take off of your books from a financial perspective.

Now, HPE GreenLake gives you a flexible-capacity model. You only pay literally for what you consume. If you grow faster than you anticipated, you just use more. If you grow slower, you use less. If you have an extremely successful business -- but then something in the economic model changes and your business doesn’t perform as you have anticipated -- then you can reduce your spending. That flexibility better supports your business.
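A toy back-of-the-envelope comparison makes the difference plain; all of the capacity and price numbers below are invented for illustration.

```python
# A toy comparison of the two sizing models described above, with invented
# numbers: buying up front for a five-year plan (plus the customary 25% + 10%
# headroom) versus paying only for what is actually consumed.
planned_tb = 100                        # capacity the five-year plan calls for
purchased_tb = planned_tb * 1.25 * 1.10 # headroom stacked on headroom
price_per_tb = 500                      # hypothetical cost unit

upfront_cost = purchased_tb * price_per_tb

# Suppose growth comes in slower and actual use peaks at 70 TB.
consumed_tb = 70
flex_cost = consumed_tb * price_per_tb

print(f"up-front purchase: {purchased_tb:.0f} TB -> ${upfront_cost:,.0f}")
print(f"pay-per-use peak:  {consumed_tb} TB -> ${flex_cost:,.0f}")
print(f"idle capacity paid for up front: {purchased_tb - consumed_tb:.1f} TB")
```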

We are ultimately doing IT to help our businesses to perform better. IT shouldn't be a burden that slows you down, it should be an accelerator. By having a flexible financial model, you get exactly that. HPE GreenLake allows you to scale up and scale down your environment based on your business needs with the right financial benefits behind it.

Gardner: There is such a thing as too much of a good thing. And I suppose that also applies to innovation. If you are doing so many new and interesting things -- allowing for hybrid models to accelerate and employing new economic models -- sometimes things can spin out of control.

But you can also innovate around management to prevent that from happening. How does management innovation fit into these other aspects of a solution, to keep it from getting out of control?

Checks and balances extend manageability

Goepel: You bring up a really good point. One of the things we have learned as an industry is that things can spin out of control very quickly. For me, the best example goes back two years, when people said, “I need to go to the cloud because that is going to save my world. It’s going to reduce my costs, and it’s going to be the perfect solution for me.”

What happened is people went all-in for the cloud, and every developer and IT person heard, “Hey, if you need a virtual machine, just get it on whatever your favorite cloud provider is. Go for it.” People very quickly learned that this meant exploding costs. There was no control, no checks and balances.

On both the HCI and general IT side, we have learned from that initial mistake in the public cloud and have put the right checks and balances in place. HPE OneView is our infrastructure management platform that allows the system administrator to operate the infrastructure from a single-entry point or single point of view.
That gives you a very simple way of managing and plays along with the way HCI is operated -- from a single point of view. You don't have five consoles or five screens, you literally have one screen you operate from.

You need to have a common way of managing checks and balances in any environment. You don't want the end user or every developer to go in there and just randomly create virtual machines, because then your HCI environment quickly runs out of resources, too. You need to have the right access controls so that only people that have the right justification can do that, but it still needs to happen quickly. We are in a world where a developer doesn’t want to wait three days to get a virtual machine. If he is working on something, he needs the virtual machine now -- not in a week or in two days.
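A toy illustration of that check-and-balance idea follows: requests within policy are granted instantly, while everything else is stopped. The roles, quotas, and function shape are invented for illustration -- this is not HPE OneView’s actual API.

```python
# A sketch of self-service with guardrails: fast for authorized users within
# quota, denied otherwise. Roles and quotas below are invented.
QUOTAS = {"developer": 5, "admin": 50}

def can_provision_vm(user_role, current_vm_count, requested=1):
    """Approve immediately when within policy; no three-day wait, no free-for-all."""
    limit = QUOTAS.get(user_role, 0)  # unknown roles get nothing
    return current_vm_count + requested <= limit

print(can_provision_vm("developer", 3))  # True: within quota, granted at once
print(can_provision_vm("developer", 5))  # False: quota reached, needs review
print(can_provision_vm("intern", 0))     # False: role not authorized
```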

Similarly, when it comes to a hybrid environment -- when we bring together the private cloud and the public cloud -- we want a consistent view across both worlds. This is where HPE OneSphere comes in. HPE OneSphere is a cloud management platform that manages hybrid clouds -- both private and public.

It allows you to gain a holistic view of what resources you are consuming, what's the cost of these resources, and how you can best distribute workloads between the public and private clouds in the most efficient way. It is about managing performance, availability, and cost. You can put in place the right control mechanisms to curb rogue spending, and control how much is being consumed and where.

Gardner: From all of these advancements, Thomas, have you made any personal observations about the nature of innovation? What is it about innovation that works? What do you need to put in place to prevent it from becoming a negative? What is it about innovation that is a force-multiplier from your vantage point?

Faster is better 

Goepel: The biggest observation I have is that innovation is happening faster and faster. In the past, it took quite a while to get innovation out there. Now it is happening so fast that one innovation comes, then the next one just basically runs over it, and we are taking advantage of it, too. This is just the nature of the world we are living in; everything is moving much faster.

There are obviously some really great benefits from the innovation we are seeing. We have talked about a few of them, like AI and how HCI is being used in edge use-cases. In manufacturing, hospitals, and these kinds of environments, you can now do things in better and more efficient ways. That's also helping on the business side.
But there’s also the human factor, because innovation makes things easier for us or makes it better for us to operate. A perfect example is in hospitals, where we can provide the right compute power and intelligence to make sure patients get the right medication. It is controlled in a good way, rather than just somebody writing on a piece of paper and hoping the next person can read it. You can now do all of these things electronically, with the right digital intelligence to ensure that you are actually curing the patient.

I think we will see more and more of these types of examples happening as we bring compute power to the edge. That is a huge opportunity, and there will be a lot of innovation in the next two to three years, specifically in this segment, and that will impact everyone’s life in a positive way.

Gardner: Speaking of impacting people's lives, I have observed that the IT operator is being greatly impacted by innovation. The very nature of their job is changing. For example, I recently spoke with Gary Thome, CTO for Composable Cloud at HPE, and he said that composability allows for the actual consumers of applications to compose their own supporting infrastructure.

Because of ease, automation, and intelligence, we don’t necessarily need to go to IT to say, “Set up XYZ infrastructure with these requirements.” Using composability, we can move innovation to the very people who are in the most advantageous position to define what it is they need.

Thomas, how do you see innovation impacting the very definition of what IT people do?

No more mundane tasks 

Goepel: This is a very positive impact, and I will give you a really good example. I spend a lot of time talking to customers and to a lot of IT people out there. And I have never encountered a single systems administrator in this industry who comes to work in the morning and says, “You know, I am so happy that I am here this morning so I can do a backup of my environment. It’s going to take me four hours, and I am going to be the happiest person in the world if the backup goes through.” Nobody wants to do this.

Nobody goes to work in the morning and says, “You know, I really hope I get a hard problem to solve, like my network crashes and I am going to be the hero in solving the problem, or by making a configuration change in my virtual environment.”

These are boring tasks that nobody is looking for, but we have to do them because we don’t have the right automation in our environments. We don’t have the right management tools in our environments. We give our administrators a lot of mundane tasks that they don’t really look forward to.
Innovation takes these burdens away from the systems administrator and frees up their time to do things that are not only more interesting, but also add to the bottom line of the company. They can better help drive the businesses and spend IT resources on something that makes the difference for the company’s bottom line.

Ultimately, you don’t want to be the one watching backups going through or restoring files. You want this to be automatic, with a couple of clicks, and then you spend your time on something more interesting.

Every systems administrator I talk to really likes the new ways. I haven't seen anyone coming back to me and saying, “Hey, can you take this automation away and all this hyperconvergence away? I want to go back to the old way and do things manually so I know how to spend my eight hours of the day.” People have much more to do with the hours they have. This is just freeing them up to focus on the things that add value.

HCI makes IT life easier and easier 

Gardner: Before we close out, Thomas, how about some forward-looking thoughts about what innovation is going to bring next to HCI? We talked about the edge and intelligence, but is there more? What are we going to be talking about when it comes to innovation in two years in the HCI space?

Goepel: I touched on the edge. I think there will be a lot of things happening across the entire edge space, where HCI will clearly be able to make a difference. We will take advantage of the capabilities that HCI brings in all these segments -- and that will actually drive innovation outside of the hyperconverged world, enabled by HCI.

But there are a couple of other things to look at. Self-healing using AI in IT troubleshooting, I think, will become a big innovation point in the HCI industry. What we are doing with HPE InfoSight is a start, but there is much more to come. This will continue to make the life of the systems administrator easier.

Ideally, we want HCI as a platform to be almost invisible to the end user because they shouldn't care about the infrastructure. It will behave like a cloud, but just be on-premises and private, and in a better, more controlled way.

The next element of innovation you will see is HCI acting very similarly to a cloud environment. Some of the first steps toward that are what we are doing around composability. This will drive forward to where you change the personality of the infrastructure depending on the workload needed. It becomes a huge pool of resources. And if you need it to look like a bare-metal server or a virtual server -- a big one or a small one -- you can just change it, and this will all be software controlled. I think that innovation element will then enable a lot of other innovations on top of it.

If you take these three elements -- AI, composability of the infrastructure, and driving that into the edge use cases -- that will enable a lot of business innovation. It’s like the three legs of a stool. And that will help us drive even further innovation.

Gardner: I’m afraid we will have to leave it there. You have been exploring the speed to business value and simplicity benefits from the latest HCI solutions. And we have learned how built-in intelligence, flexible economic models, and a drive to the edge are advancing the nature and value of composable IT infrastructure and hyperconvergence as well.
So please join me in thanking our guest, Thomas Goepel, Chief Technologist for Hyperconverged Infrastructure at Hewlett Packard Enterprise. Thank you so much, Thomas.

And a big thank you as well to our audience for joining this sponsored BriefingsDirect Voice of the Innovator hybrid IT and composable infrastructure strategies interview.


I’m Dana Gardner, Principal Analyst at Interarbor Solutions, your host for this ongoing series of Hewlett Packard Enterprise-sponsored discussions. Thanks again for listening. Please pass this along to your IT community, and do come back next time.

Listen to the podcast. Find it on iTunes. Download the transcript. Sponsor: Hewlett Packard Enterprise.

A discussion on how IT operators are seeking increased automation, built-in intelligence, and robust security as they seek turnkey appliance approaches for both cloud and traditional workloads. Copyright Interarbor Solutions, LLC, 2005-2019. All rights reserved.
