Wednesday, July 22, 2015

Zynga Builds Big Data Innovation Culture by Making Analytics Open to All

Transcript of a BriefingsDirect discussion on how data-driven companies can gain a competitive advantage in making as much analysis available to as many people in their organizations as possible.

Listen to the podcast. Find it on iTunes. Get the mobile app for iOS or Android. Download the transcript. Sponsor: HP.

Dana Gardner: Hello, and welcome to the next edition of the HP Discover Podcast Series. I'm Dana Gardner, Principal Analyst at Interarbor Solutions, your host and moderator for this ongoing discussion on IT innovation and how it’s making an impact on people’s lives.

Our next big data case study interview highlights how Zynga in San Francisco depends on big-data analytics to improve its business via a culture of pervasive analytics and experimentation.

To learn more about how big data impacts Zynga in the fast-changing and highly competitive mobile gaming industry, please welcome Joanne Ho, a Senior Engineering Manager at Zynga. Welcome, Joanne.

Joanne Ho: Hi.

Gardner: And also, Yuko Yamazaki, Head of Analytics at Zynga. Welcome, Yuko.
Become a member of myVertica today
Register now
Gain access to the free HP Vertica Community Edition
Yuko Yamazaki: Thank you.

Gardner: How important is big data analytics to you as an organization?

Ho: To Zynga, big data is very important. It's a central piece of the company. As part of the analytics department, big data serves the entire company as a source for understanding our players: what they like and what they don't like about our games. We use this data to analyze player behavior, and we also personalize different game modes to fit each player's pattern.

Gardner: What's interesting to me about games is that people not only download them, but the games themselves are upgradable and changeable, and people can easily move on. So the feedback loop between the inferences, information, and analysis you gain from your users' actions is quite compressed compared to many other industries.

What is it that you're able to do in this rapid-fire development-and-release process? How is that responsiveness important to you?

Real-time analysis

Ho: Real-time analysis, of course, is critical, and we have a streaming system that can do it. We have a monitoring and alerting system that alerts us whenever we see any drops in user install rates or in daily active users (DAU). The game studio is then alerted and takes appropriate action.
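A drop alert of the kind Ho describes can be sketched as a simple threshold check against a trailing baseline. The metric values and the 20 percent threshold below are hypothetical illustrations, not Zynga's actual rules:

```python
def check_metric_drop(history, current, drop_threshold=0.2):
    """Alert if the current value falls more than drop_threshold
    below the trailing average of recent values."""
    if not history:
        return False
    baseline = sum(history) / len(history)
    return current < baseline * (1 - drop_threshold)

# Example: DAU has hovered around 1M; a reading of 700K trips the alert.
recent_dau = [1_000_000, 980_000, 1_020_000]
print(check_metric_drop(recent_dau, 700_000))  # True: more than a 20% drop
print(check_metric_drop(recent_dau, 950_000))  # False: within normal range
```

A production system would also smooth out day-of-week seasonality before comparing, but the core check is this comparison against a recent baseline.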

Gardner: Yuko, what sort of datasets are we talking about? If we're going into the social realm, we can get some very large datasets. What's the volume and scale here?

Yamazaki: We get data on everything that happens in our games. Almost every single play gets tracked into our system. We're talking about 40 billion to 60 billion rows a day, and our game product managers and development engineers decide ahead of time what they want to analyze later. So the data is already structured and compressed as it comes into our database.
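The idea of structuring events before they land in the database can be illustrated with a minimal sketch. The event fields and names below are hypothetical, not Zynga's actual schema:

```python
from dataclasses import dataclass, asdict
import json

@dataclass
class GameEvent:
    # Fields agreed on by the product manager before launch,
    # so every row arrives already structured and analyzable.
    game: str
    player_id: int
    action: str
    ts: float

def serialize(event: GameEvent) -> str:
    # Compact JSON (no extra whitespace) keeps tens of billions
    # of daily rows small on the wire before database compression.
    return json.dumps(asdict(event), separators=(",", ":"))

e = GameEvent("poker", 42, "hand_played", 0.0)
print(serialize(e))  # {"game":"poker","player_id":42,"action":"hand_played","ts":0.0}
```

Deciding the schema up front, as Yamazaki describes, is what lets the load path skip an expensive parse-and-restructure step later.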

Gardner: That’s very impressive scale. It’s one thing to have a lot of data, but it’s another to be able to make that actionable. What do you do once that data is assembled?

Yamazaki: The biggest success story I normally tell about Zynga is that we make data available to all employees. From day one, as soon as you join Zynga, you get to see all the data through our visualization tools. Even if you're a FarmVille product manager, you get to see what Poker is doing, which makes everything more transparent. There's an account report where you can just click and see how many people have done a particular game action, for example. That's how we were able to create this data-driven culture at Zynga.

Gardner: And Zynga is not all that old. Is this data capability something that you’ve had right from the start, or did you come into it over time? 

Yamazaki: Since we began with Poker and Words With Friends, our cluster has scaled 70-fold.

Ho: It started off with three nodes, and we've grown to a 230-node cluster.

Gardner: So you're performing the gathering of the data and analysis in your own data centers?

Yamazaki: Yes.

Gardner: When you realized the scale and the nature of your task, what were some of the top requirements you had for your cluster, your database, and your analytics engine? How did you make some technology choices?

Biggest points

Yamazaki: When Zynga was growing, our main focus was to build something that was going to be able to scale and provide the data as fast as possible. Those were the two biggest points that we had in mind when we decided to create our analytics infrastructure.

Gardner: And any other more detailed requirements in terms of the type of database or the type of analytics engine?
Yamazaki: Those are the two big ones. As I mentioned, we also wanted everyone to be able to access the data, so SQL was a great technology to have. It's much easier to train PMs on SQL than on engineering-side tools such as MapReduce on Hadoop. Those were the three key points as we selected our database.

Gardner: What are the future directions and requirements that you have? Are there things that you’d like to see from HP, for example, in order to continue to be able do what you do at increasing scale?

Ho: We're interested in real-time analytics. There's a feature, live aggregate projections, that we're interested in trying. Flex Tables [in HP Vertica] also sound like a very interesting feature that we'll attempt to try. And cloud analytics is the third one we're interested in. We hope HP will mature it, so that we can test it out in the future.

Gardner: So your analytics capability has been with you right from the start. Were you early in using Vertica?

Ho: Yes.

Gardner: Now that we've established how important it is, do you have any metrics on what this does for you? Other organizations might say they don't have as much of a data-driven culture as Zynga but would like to, and they realize the technology can now ramp up to such incredible volume and velocity. What do you get back? How do you measure success when you do big-data analytics correctly?

Yamazaki: Internally, we look at adoption of our systems. We have 2,000 employees, and at least 1,000 are using our visualization tool on a daily basis. That's how we measure adoption internally.

Externally, the biggest metric is retention. Are players coming back and, if so, was that through the data that we collect? Were we able to do personalization so that they're coming back because of the experience they've had?
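The retention metric Yamazaki mentions boils down to a cohort intersection: of the players who installed on some day, how many are seen again N days later. A minimal sketch, with invented player IDs:

```python
def day_n_retention(installed, returned_on_day_n):
    """Fraction of an install cohort that is active again N days later."""
    cohort = set(installed)
    if not cohort:
        return 0.0
    return len(cohort & set(returned_on_day_n)) / len(cohort)

installs = [1, 2, 3, 4, 5]   # player IDs who installed on day 0
day7 = [2, 3, 9]             # player IDs active on day 7 (9 is a non-cohort player)
print(day_n_retention(installs, day7))  # 0.4
```

At Zynga's scale this intersection would of course be a SQL join over billions of event rows rather than Python sets, but the metric itself is the same ratio.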

Gardner: These are very important to your business, obviously, and I'm curious about that buy-in. As the saying goes, you can lead a horse to water, but you can't make him drink. You can provide data analysis and visualization to employees, but if they don't find it useful and impactful, they won't use it. So it's interesting that adoption serves as a key performance indicator for you.

Any words of advice for other organizations who are trying to become more data-driven, to use analytics more strategically? Is this about people, process, culture, technology, all the above? What advice might you have for those seeking to better avail themselves of big data analytics?

Visualization

Yamazaki: A couple of things. One is to provide end-to-end analytics: not just data storage, but also visualization. We also have an experimentation system, where I think we have about 400 to 600 experiments running as we speak. We have a report that shows: you ran this experiment, these metrics moved because of it, and A is better than B.

We run this other experiment, and there's a visualization you can use to see that data. So providing that end-to-end data and analytics to all employees is one of the biggest pieces of advice I would give to any company.
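A report that concludes "A is better than B" typically rests on a significance test over the two variants. Below is a simplified sketch using a two-proportion z-test; the conversion counts are invented, and a production experiment system would handle many metrics, guardrails, and multiple-comparison corrections:

```python
from math import sqrt

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """z-score for the difference between two conversion rates.

    conv_a / conv_b are converted-player counts; n_a / n_b are the
    sizes of each experiment arm."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p = (conv_a + conv_b) / (n_a + n_b)           # pooled conversion rate
    se = sqrt(p * (1 - p) * (1 / n_a + 1 / n_b))  # standard error under H0
    return (p_a - p_b) / se

# 5.2% vs 4.8% conversion on 10,000 players per arm.
z = two_proportion_z(520, 10_000, 480, 10_000)
print(abs(z) > 1.96)  # significant at the 5% level?
```

With these numbers z is about 1.3, so the apparent lift would not clear the conventional 1.96 bar; the experiment would need more players or a larger true effect.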

One more thing is to try to get one good win. If you focus too much on technology or scalability, you might be building a battleship when you actually don't need it yet. Incremental improvement is probably what will take you to the place you need to get to. Just try to get a good big win, such as increasing installs or active users in one particular game or product, and see where it goes.

Gardner: And just to revisit the idea that you've got so many employees and so many innovations going on, how do you encourage your employees to interact with the data? Do you give them total flexibility in terms of experiments? How do they start the process of some of those proof-of-concept type of activities?

Yamazaki: It's all freestyle. They can log whatever they want, they can see whatever they want, except revenue-type data, and they can create any experiments they want. [Joanne's] team owns this part, but we also make the data available. Some of the games can hit it in real time, so we can do real-time personalization using the data that was logged. It's almost 360-degree data availability for our product teams.

Gardner: It's really impressive that this data mentality is so ingrained in the company, from the start and across all employees. How do you see that in terms of your competitive edge? Do you think other gaming companies are doing the same thing, or do you have an advantage in that you've created a data culture?

Yamazaki: Definitely. In online gaming, you have to have big data to succeed. A lot of companies, though, just collect whatever they can, then structure it and make it analyzable afterward. One of the things we've done well was to structure the data to start with. So the data is already structured.

Product managers are already thinking about what they want to analyze beforehand. It's not as if they just pull everything in and then see what happens. They think right away about, "Is this analyzable? Is this something we want to store?" We're a lot smarter about what we store, and cost-wise, it's a lot more optimized.

Gardner: We'll have to leave it there. We've been hearing how Zynga in San Francisco has, right from its inception, created a very strong culture around big data, as it grabs as much data as it can from the massive volumes created by its games. It then goes further, using HP Vertica, to make the results of that data acquisition available to its employees.
I'd like to thank our guests, Joanne Ho, Senior Engineering Manager at Zynga, and Yuko Yamazaki, Head of Analytics at Zynga. And a big thank you to our audience as well, for joining us for this special new style of IT discussion.

I'm Dana Gardner, Principal Analyst at Interarbor Solutions, your host for this ongoing series of HP-sponsored discussions. Thanks again for listening, and come back next time.

Listen to the podcast. Find it on iTunes. Get the mobile app for iOS or Android. Download the transcript. Sponsor: HP.

Transcript of a BriefingsDirect discussion on how data-driven companies can gain a competitive advantage in making as much analysis available to as many people in their organizations as possible. Copyright Interarbor Solutions, LLC, 2005-2015. All rights reserved.


Monday, July 20, 2015

How Big Data Powers GameStop to Gain Retail Advantage and Deep Insights into its Markets

Transcript of a BriefingsDirect discussion on how a gaming retailer uses big data to gather insights into sales trends and customer wants and needs.

Listen to the podcast. Find it on iTunes. Get the mobile app for iOS or Android. Download the transcript. Sponsor: HP.

Dana Gardner: Hello, and welcome to the next edition of the HP Discover Podcast Series. I'm Dana Gardner, Principal Analyst at Interarbor Solutions, your host and moderator for this ongoing sponsored discussion on IT innovation and how it’s making an impact on people’s lives.

Once again, we're focusing on how companies are adapting to the new style of IT to improve IT performance and deliver better user experiences, as well as better business results.

Our next innovation case study interview highlights how GameStop, based in Grapevine, Texas uses big data to improve how it conducts its business and serve its customers. To learn more about how they deploy big data and use the resulting analytics, we are joined by John Crossen, Data Warehouse Lead at GameStop. Welcome, John.

John Crossen: Thank you for having me.
Become a member of myVertica today
Register now
Access the FREE HP Vertica Community Edition
Gardner: Tell us a little bit about GameStop. Most people are probably familiar with the retail outlets that they see, where you can buy, rent, trade games, and learn more about games. Why is big data important to your organization?

Crossen: We wanted to get a better idea of who our customers are, how we can better serve them, and what types of needs they may have. With prior reporting, we would get good overall views of how the company was doing or how a particular game series was selling, but we weren't able to tie that to the activities of individual customers, or to the possible future activity of future customers, using a more traditional SQL-based platform that just delivered flat reports.

So our goal was to get a more 360-degree view of our customers, and we realized pretty quickly that, with our existing toolsets and methodologies, that wasn't going to be possible. That's where Vertica ended up coming into play to drive us in that direction.

Gardner: Just so we have a sense of this scale here, how many retail outlets does GameStop support and where are you located?

Crossen: We're international. There are approximately 4,200 stores in the US and another 2,200 internationally.

Gardner: And in terms of the type of data you're acquiring, is it all internal data, or do you go to external data sources? How do you bring that together?

Internal data

Crossen: It's primarily internal data. We get data from our website. We have the PowerUp Rewards program that customers can choose to join, and we have data from individual cash registers and all those stores.

Gardner: I know from experience in my own family that gaming is a very fast-moving industry. We’ve quickly gone from different platforms to different game types and different technologies when we're interacting with the games.

It's a very dynamic changeable landscape for the users, as well as, of course, the providers of games. You are sort of in the middle. You're right between the users and the vendors. You must be very important to the whole ecosystem.

Crossen: Most definitely, and there aren’t really many game retailers left anymore. GameStop is certainly the preeminent one. So a lot of customers come not just to purchase a game, but get information from store associates. We have Game Informer Magazine that people like to read and we have content on the website as well.

Gardner: Now that you know where to get the data and you have the data, how big is it? How difficult is it to manage? Are you looking for real-time or batch? How do you then move forward from that data to some business outcome?

Crossen: It's primarily batch at this point. The registers close at night, and we get the data from the registers and load it into HP Vertica. When we started approximately two years ago, we didn't have a single byte in Vertica. Now we have close to 24 terabytes of data. It's primarily data on individual customers, as well as weblogs and mobile-application data.
Gardner: I should think that when you analyze which games are being bought, which ones are being traded, which ones are price-sensitive and move at a certain price or not, you're really at the vanguard of knowing the trends in the gaming industry -- even perhaps before anyone else. How has that worked for you, and what are you finding?

Crossen: A lot of it is just based on determining who is likely to buy which series of games. You won't market the next Call of Duty 3 or something like that to somebody who's buying children's games. We're not going to ask people to buy Call of Duty 3 rather than My Little Pony 6.

The interesting thing, at least with games and video game systems, is that when we sell them new, there's no price movement. Every game is the same price in any store. So we have to rely on other things like customer service and getting information to the customer to drive game sales. Used games are a bit of a different story.

Gardner: Now back to Vertica. Given that you've been using this for a few years and you have such a substantial data lake, what is it about Vertica that works for you? What are learning here at the conference that intrigues you about the future?

Quick reports

Crossen: The initial push with HP Vertica was just to get reports fast. We had processes that literally took a day to run to accumulate data. Now, in Vertica, we can pull that same data out in five minutes. I think that if we spend a little bit more time, we could probably get it faster than half of that.

The first big push was just speed. The second wave after that was bringing in data sources that were unattainable before, like web-click data, a tremendous amount of data, loading it in, and then being able to query it out with SQL. That wasn't doable before, and Vertica made it possible. At first it was faster data, then acquiring new data, and then finding ways to tie different data elements together that we hadn't tied before.

Gardner: How about visualization of these reports? How do you serve up those reports and do you make your inference and analytics outputs available to all your employees? How do you distribute it? Is there sort of an innovation curve that you're following in terms of what they do with that data?

Crossen: As far as platforms go, we use Tableau as our visualization tool. We've also used a kind of ad-hoc environment to write direct SQL queries to pull data out, but Tableau serves as the primary tool.

Gardner: In that data input area, what integration technologies are you interested in? What would you like to see HP do differently? Are you happy with the way SQL, Vertica, Hadoop, and other technologies are coming together? Where would you like to see that go?

Crossen: A lot of our source systems are either SQL Server-based or just flat files. For flat files, we use the COPY command to bring data in, and that's very fast. With Vertica 7, they released the Microsoft SQL Connector.

So we're able to use our existing SQL Server Integration Services (SSIS) data flows and change the output from another SQL table to point directly into Vertica. It uses the COPY command under the covers, and that's been a major improvement. Before that, we had to stage the data somewhere else and then use the COPY command to bring it in, or try to use Open Database Connectivity (ODBC), which wasn't very efficient.
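For reference, a Vertica bulk load of a flat file uses a COPY statement of roughly this shape. The sketch below only builds the statement text (the table and file names are hypothetical), since an actual load would execute it over a Vertica connection:

```python
def build_copy_statement(table, path, delimiter="|"):
    """Build a Vertica COPY statement for a flat-file bulk load.

    FROM LOCAL streams the file from the client machine; DIRECT
    writes straight to disk storage, the usual choice for large
    batch loads like the nightly register feed described above."""
    return (
        f"COPY {table} FROM LOCAL '{path}' "
        f"DELIMITER '{delimiter}' DIRECT"
    )

stmt = build_copy_statement("sales.register_txns", "/data/txns.dat")
print(stmt)  # COPY sales.register_txns FROM LOCAL '/data/txns.dat' DELIMITER '|' DIRECT
```

This is the command the SSIS connector issues under the covers, which is why it is so much faster than row-at-a-time ODBC inserts.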

20/20 hindsight

Gardner: How about words of wisdom from your 20/20 hindsight? Others are also thinking about moving from a standard relational database environment towards big data stores for analytics and speed and velocity of their reports. Any advice you might offer organizations as they're making that transition, now that you’ve done it?

Crossen: Get a solid understanding of how a column-store database works and how that differs from a traditional row-based database. It's a different mindset in everything, from data modeling on down to how you lay out the data.
For example, in a row database you would tend to freak out if you had a 700-column table. In a column store, that doesn't really matter. So get into the right mindset of how a column-store database works, and don't try to duplicate a row-based system in the column-store system.
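The row-versus-column mindset Crossen describes can be illustrated with a toy example: an aggregate over one field touches every whole record in a row layout, but only a single array in a columnar layout, which is why wide tables stop being scary.

```python
# Row store: each record is kept together, so scanning one field
# still drags every other field through memory.
rows = [
    {"sku": "A1", "store": 17, "units": 3},
    {"sku": "B2", "store": 17, "units": 5},
    {"sku": "A1", "store": 42, "units": 2},
]

# Column store: each field lives in its own array, so an aggregate
# over one column never reads the others (and each array of
# same-typed values compresses far better).
columns = {
    "sku":   ["A1", "B2", "A1"],
    "store": [17, 17, 42],
    "units": [3, 5, 2],
}

print(sum(r["units"] for r in rows))  # 10, after reading every full row
print(sum(columns["units"]))          # 10, after reading one array only
```

The toy is Python lists rather than a real storage engine, but the access-pattern difference is the same one that makes a 700-column table cheap to query in a column store.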

Gardner: Great. I am afraid we’ll have to leave it there. I’d like to thank our guest, John Crossen, the Data Warehouse Lead at GameStop in Grapevine, Texas. I appreciate your input.

Crossen: Thank you.

Gardner: And also thanks to our audience for joining us for this special new style of IT discussion. I'm Dana Gardner, Principal Analyst at Interarbor Solutions, your host for this ongoing series of HP-sponsored discussions. Thanks again for listening, and come back next time.

Listen to the podcast. Find it on iTunes. Get the mobile app for iOS or Android. Download the transcript. Sponsor: HP.

Transcript of a BriefingsDirect discussion on how a gaming retailer uses big data to gather insights into sales trends and customer wants and needs. Copyright Interarbor Solutions, LLC, 2005-2015. All rights reserved.


Monday, July 13, 2015

A Tale of Two IT Departments, or How Cloud Governance is Essential in the Bimodal IT Era

Transcript of a BriefingsDirect discussion on the role of cloud governance and enterprise architecture and how they work together in the era of increasingly fragmented IT.
Listen to the podcast. Find it on iTunes. Get the mobile app for iOS or Android. Download the transcript. Sponsor: The Open Group.

Dana Gardner: Hello, and welcome to a special BriefingsDirect thought leadership panel discussion, coming to you in conjunction with The Open Group's upcoming conference on July 20, 2015 in Baltimore.

I'm Dana Gardner, Principal Analyst at Interarbor Solutions, and I'll be your host and moderator as we examine the role that cloud governance and enterprise architecture play in an era of increasingly fragmented IT.

Not only are IT organizations dealing with so-called shadow IT and myriad proof-of-concept affairs, there is now a strong rationale for fostering what Gartner calls Bimodal IT. There's a strong case to be made for exploiting the strengths of several different flavors of IT, except that -- at the same time -- businesses are asking IT in total to be faster, better, and cheaper.

The topic before us is how to allow for the benefits of Bimodal IT or even multimodal IT, but without IT fragmentation leading to fractured and even broken businesses.
Attend The Open Group Baltimore 2015
July 20-23, 2015
Register Here
Here to update us on the work of The Open Group Cloud Governance initiatives and working groups and to further explore the ways that companies can better manage and thrive with hybrid IT are our guests. We're here with Dr. Chris Harding, Director for Interoperability and Cloud Computing Forum Director at The Open Group. Welcome, Chris.

Dr. Chris Harding: Thank you, Dana. It’s great to be here.

Gardner: We're also here with David Janson, Executive IT Architect and Business Solutions Professional with the IBM Industry Solutions Team for Central and Eastern Europe and a leading contributor to The Open Group Cloud Governance Project. Welcome, David.

David Janson: Thank you. Glad to be here.

Gardner: Lastly, we're here with Nadhan, HP Distinguished Technologist and Cloud Adviser and Co-Chairman of The Open Group Cloud Governance Project. Welcome, Nadhan.

Nadhan: Thank you, Dana. It’s a pleasure to be here.

IT trends

Gardner: Before we get into an update on The Open Group Cloud Governance initiatives, let's set some context. In many ways, over the past decades, IT has always been somewhat fragmented. Very few companies have been able to keep all their IT oars rowing in the same direction, if you will. But today things seem to be changing so rapidly that some degree of disparate IT methods is necessary. We might even think of old IT and new IT, and this may even be desirable.

But what are the trends that are driving this need for a multimodal IT? What's accelerating the need for different types of IT, and how can we think about retaining a common governance, and even a frameworks-driven enterprise architecture umbrella, over these IT elements?

Nadhan: Basically, the change that we're going through is really driven by the business. Business today has much more rapid access to the services that IT has traditionally provided. Business has a need to react to its own customers in a much more agile manner than they were traditionally used to.

We now have to react to demands where we're talking days and weeks instead of months and years. Businesses today have a choice. Business units are no longer dependent on the traditional IT to avail themselves of the services provided. Instead, they can go out and use the services that are available external to the enterprise.

To a great extent, the advent of social media has also resulted in direct customer feedback on the sentiment from the external customer that businesses need to react to. That is actually changing the timelines. It is requiring IT to be delivered at the pace of business. And the very definition of IT is undergoing a change, where we need to have the right paradigm, the right technology, and the right solution for the right business function and therefore the right application.

Since the choices have increased with the new style of IT, the manner in which you pair solutions with problems has also changed significantly. With more choices come more such pairings of which solution is right for which problem. That's really what has caused the change we're going through.

A change of this magnitude requires governance that builds on the traditional governance that was always in play, while adding governance specific to cloud solutions across the whole lifecycle of cloud solution deployment.

Gardner: David, do you agree that this seems to be a natural evolution, based on business requirements, that we basically spin out different types of IT within the same organization to address some of these issues around agility? Or is this perhaps a bad thing, something that’s unnatural and should be avoided?

Janson: In many ways, this follows a repeating pattern we've seen with other kinds of transformations in business and IT. Not to diminish the specifics about what we're looking at today, but I think there are some repeating patterns here.

There are new disruptive events that compete with the status quo. Those things that have been optimized, proven, and settled into sort of a consistent groove can compete with each other. Excitement about the new value that can be produced by new approaches generates momentum, and so far this actually sounds like a healthy state of vitality.

Good governance

However, one of the challenges is that the excitement potentially can lead to overlooking other important factors, and that’s where I think good governance practices can help.

For example, governance helps remind people about important durable principles that should be guiding their decisions, important considerations that we don’t want to forget or under-appreciate as we roll through stages of change and transformation.

At the same time, governance practices need to evolve so that they can adapt to new things that fit into the governance framework. What are those things, and how do we govern them? So governance needs to evolve at the same time.

There is a pattern here with some specific things that are new today, but there is a repeating pattern as well, something we can learn from.

Gardner: Chris Harding, is there a built-in capability with cloud governance that anticipates some of these issues around different styles or flavors or even velocity of IT innovation that can then allow for that innovation and experimentation, but then keep it all under the same umbrella with a common management and visibility?

Harding: There are a number of forces at play here, and there are three separate trends that we've seen, or at least that I have observed, in discussions with members within The Open Group that relate to this.

The first is one that Nadhan mentioned, the possibility of outsourcing IT. I remember a member’s meeting a few years ago, when one of our members who worked for a company that was starting a cloud brokerage activity happened to mention that two major clients were going to do away with their IT departments completely and just go for cloud brokerage. You could see the jaws drop around the table, particularly with the representatives who were from company corporate IT departments.

Of course, cloud brokers haven’t taken over from corporate IT, but there has been that trend toward things moving out of the enterprise to bring in IT services from elsewhere.

That’s all very well to do that, but from a governance perspective, you may have an easy life if you outsource all of your IT to a broker somewhere, but if you fail to comply with regulations, the broker won’t go to jail; you will go to jail.

So you need to make sure that you retain control at the governance level over what is happening from the point of view of compliance. You probably also want to make sure that your architecture principles are followed and retain governance control to enable that to happen. That’s the first trend and the governance implication of it.

In response to that, a second trend that we see is that IT departments have reacted often by becoming quite like brokers themselves -- providing services, maybe providing hybrid cloud services or private cloud services within the enterprise, or maybe sourcing cloud services from outside. So that’s a way that IT has moved in the past and maybe still is moving.

Third trend

The third trend that we're seeing in some cases is that multi-discipline teams within line of business divisions, including both business people and technical people, address the business problems. This is the way that some companies are addressing the need to be on top of the technology in order to innovate at a business level. That is an interesting and, I think, a very healthy development.

So maybe, yes, we are seeing a bimodal splitting in IT between the traditional IT and the more flexible and agile IT, but maybe you could say that that second part belongs really in the line of business departments -- rather than in the IT departments. That's at least how I see it.

Nadhan: I'd like to build on a point that David made earlier about repeating patterns. I can relate to that very well within The Open Group, speaking about the Cloud Governance Project. Truth be told, as we continue to evolve the content in cloud governance, some of the seeding content actually came from the SOA Governance Project that The Open Group worked on a few years back. So the point David made about the repeating patterns resonates very well with that particular case in mind.

Gardner: So we've been through this before. When there is change and disruption, sometimes it’s required for a new version of methodologies and best practices to emerge, perhaps even associated with specific technologies. Then, over time, we see that folded back in to IT in general, or maybe it’s pushed back out into the business, as Chris alluded to.

My question, though, is how we make sure that these don’t become disruptive and negative influences over time. Maybe governance and enterprise architecture principles can prevent that. So is there something about the cloud governance, which I think really anticipates a hybrid model, particularly a cloud hybrid model, that would be germane and appropriate for a hybrid IT environment?

David Janson, is there a cloud governance benefit in managing hybrid IT?

Janson: There most definitely is. I tend to think that hybrid IT is probably where we're headed; I don’t think it's avoidable. Part of the reason I say that is I think there's a repeating pattern here of new approaches, new ways of doing things, coming into the picture.

And then some balancing act goes on, where people look at more traditional ways versus the new approaches people are talking about, and eventually they weigh the strengths and weaknesses of both.

There's going to be some disruption, but that’s not necessarily bad. That’s how we drive change and transformation. What we're really talking about is making sure the amount of disruption is not so counterproductive that it actually moves things backward instead of forward.

I don’t mind a little bit of disruption. The governance processes that we're talking about, good governance practices, have an overall life cycle that things move through. If there is a way to apply governance, as you work through that life cycle, at each point, you're looking at the particular decision points and actions that are going to happen, and make sure that those decisions and actions are well-informed.

We sometimes say that governance helps us do the right things right. So governance helps people know what the right things are, and then the right way to do those things.

Bimodal IT

Also, we can measure how well people are actually adapting to those “right things” to do. What’s “right” can vary over time, because we have disruptive change. What we're talking about with Bimodal IT is one example.

Within a narrower time frame in the process lifecycle, there are points across that time frame that carry particular decisions and actions. Governance makes sure that, as people roll through it, they're well informed about the important things they shouldn’t forget. It’s very easy to forget key things and optimize for only one factor, and governance helps people remember that.

Governance also checks to see whether we're getting the benefits that people expected. It comes back around afterward to see if we accomplished what we thought we would, or whether we got off in the wrong direction. So it’s a bit like a steering or feedback mechanism, in that it helps keep the car on the road rather than drifting onto the soft shoulder. Did we overlook something important? Governance is key to making this all successful.

Gardner: Let’s return to The Open Group’s upcoming conference on July 20 in Baltimore and also learn a bit more about what the Cloud Governance Project has been up to. I think that will help us better understand how cloud governance relates to these hybrid IT issues that we've been discussing.

Nadhan, you are the co-chairman of the Cloud Governance Project. Tell us about what to expect in Baltimore with the concepts of Boundaryless Information Flow, and then also perhaps an update on what the Cloud Governance Project has been up to.
Attend The Open Group Baltimore 2015
July 20-23, 2015
Register Here
Nadhan: When the Cloud Governance Project started, the first question we challenged ourselves with was, what is it and why do we need it, especially given that SOA governance, architecture governance, IT governance, and enterprise governance in general are all out there with frameworks. We actually detailed out the landscape with different standards and then identified the niche, or the domain, that cloud governance addresses.

After that, we went through and identified the top five principles that matter for cloud governance to be done right. Some of the obvious ones being that cloud is a business decision, and the governance exercise should keep in mind whether it is the right business decision to go to the cloud rather than just jumping on the bandwagon. Those are just some examples of the foundational principles that drive how cloud governance must be established and exercised.

Subsequent to that, we have a lifecycle for cloud governance defined and then we have gone through the process of detailing it out by identifying and decoupling the governance process and the process that is actually governed.

So there is this concept of process pairs that we have going, where we've identified key processes, key process pairs, whether it be the planning, the architecture, reusing cloud service, subscribing to it, unsubscribing, retiring, and so on. These are some of the defining milestones in the life cycle.

We've actually put together a template for identifying and detailing these process pairs, and the template has an outline of the process that is being governed, the key phases that the governance goes through, the desirable business outcomes that we would expect because of the cloud governance, as well as the associated metrics and the key roles.

Real-life solution

The Cloud Governance Framework is actually detailing each one. Where we are right now is looking at a real-life solution. It could be a hypothetical or an actual business scenario, but the idea is to help the reader digest the concepts outlined in the context of a scenario where such governance is exercised. That’s where we are on the Cloud Governance Project.

Let me take the opportunity to invite everyone to be part of the project to continue it by subscribing to the right mailing list for cloud governance within The Open Group.

Gardner: Just for the benefit of our readers and listeners who might not be that familiar with The Open Group, perhaps you could give us a very quick overview -- its mission, its charter, what we could expect at the Baltimore conference, and why people should get involved, either directly by attending, or following it on social media or the other avenues that The Open Group provides on its website?

Harding: The Open Group is a vendor-neutral consortium whose vision is Boundaryless Information Flow. That is to say the idea that information should be available to people within an enterprise, or indeed within an ecosystem of enterprises, as and when needed, not locked away into silos.

We hold main conferences, quarterly conferences, four times a year and also regional conferences in various parts of the world in between those, and we discuss a variety of topics.

In fact, the main topics for the conference that we will be holding in July in Baltimore are enterprise architecture and risk and security. Architecture and security are two of the key things for which The Open Group is known. Enterprise architecture, particularly with its TOGAF Framework, is perhaps what The Open Group is best known for.

We've been active in a number of other areas, and risk and security is one. We also have started a new vertical activity on healthcare, and there will be a track on that at the Baltimore conference.

There will be tracks on other topics too, including four sessions on Open Platform 3.0. Open Platform 3.0 is The Open Group initiative to address how enterprises can gain value from new technologies, including cloud computing, social computing, mobile computing, big data analysis, and the Internet of Things.

We'll have a number of presentations related to that. These will include, in fact, a perspective on cloud governance, although that will not necessarily reflect what is happening in the Cloud Governance Project. Until an Open Group standard is published, there is no official Open Group position on the topic, and members will present their views at conferences. So we're including a presentation on that.

Lifecycle governance

There is also a presentation on another interesting governance topic, which is on Information Lifecycle Governance. We have a panel session on the business context for Open Platform 3.0 and a number of other presentations on particular topics, for example, relating to the new technologies that Open Platform 3.0 will help enterprises to use.

There's always a lot going on at Open Group conferences, and that’s a brief flavor of what will happen at this one.

Gardner: Thank you. And I'd just add that there is more available at The Open Group website, opengroup.org.

Coming back to one thing you mentioned about a standard and publishing that standard, is there a roadmap that we could look to in order to anticipate the next steps or milestones in the Cloud Governance Project? When would such a standard emerge, and when might we expect it?

Nadhan: As I said earlier, the next step is to identify the business scenario and apply it. I'm expecting, with the right level of participation, that it will take another quarter, after which it would go through the internal review with The Open Group and the company reviews for the publication of the standard. Assuming we have that in another quarter, Chris, could you please weigh in on what it usually takes, on average, for those reviews before it gets published?

Harding: You could add on another quarter. It shouldn’t actually take that long, but we do have a thorough review process. All members of The Open Group are invited to participate. The document is posted for comment for, I would think, four weeks, after which we review the comments and decide what action actually needs to be taken.

Certainly, it could take only two months to complete the overall publication of the standard from the draft being completed, but it’s safer to say about a quarter.

Gardner: So a real important working document could be available in the second half of 2015. Let’s now go back to why a cloud governance document and approach is important when we consider the implications of Bimodal or multimodal IT.

One of things that Gartner says is that Bimodal IT projects require new project management styles. They didn’t say project management products. They didn’t say, downloads or services from a cloud provider. We're talking about styles.

So it seems to me that, in order to keep the good aspects of Bimodal IT from being overridden by the negative impacts of chaos and lack of coordination, we're talking not about a product or a download, but about something that a working group and a standards approach like the Cloud Governance Project can accommodate.

David, why is it that you can’t buy this in a box or download it as a product? What do we need to look at in terms of governance across Bimodal IT, and why is a style the appropriate framing? Maybe IT people need to think differently, rather than trying to accomplish this through technology alone?

First question

Janson: When I think of anything like a tool or a piece of software, the first question I tend to have is what is that helping me do, because the tool itself generally is not the be-all and end-all of this. What process is this going to help me carry out?

So, before I would think about tools, I want to step back and think about what changes to project-related processes the new approaches require. Then, secondly, think about how tools can help me speed those up, automate them, or make them a little bit more reliable.

It’s an easy thing to think about a tool that may have some process-related aspects embedded in it as sort of some kind of a magic wand that's going to automatically make everything work well, but it’s the processes that the tool could enable that are really the important decision. Then, the tools simply help to carry that out more effectively, more reliably, and more consistently.

We've always seen an evolution in the processes we use in developing solutions, as well as in the tools. As technology changes, tools have to adapt. And as our processes get more agile, as we want to be more incremental and see rapid turnarounds in how we develop things, tools need to evolve with that.

But I'd really start out from a governance standpoint, thinking about challenging the idea that if we're going to make a change, how do we know that it's really an appropriate one and asking some questions about how we differentiate this change from just reinventing the wheel. Is this an innovation that really makes a difference and isn't just change for the sake of change?

Governance helps people challenge their thinking and make sure that it’s actually a worthwhile step to take to make those adaptations in project-related processes.

Once you've settled on some decisions about evolving those processes, then we'll start looking for tools that help you automate, accelerate, and make consistent and more reliable what those processes are.

I tend to start with the process and think of the technology second, rather than the other way around. That’s where governance can help remind people of the principles we want to think about. Are you putting the cart before the horse? It helps people challenge their thinking a little bit to be sure they're really going in the right direction.

Gardner: Of course, a lot of what you just mentioned pertains to enterprise architecture generally as well.

Nadhan, when we think about Bimodal or multimodal IT, this to me is going to be very variable from company to company, given their legacy, given their existing style, the rate of adoption of cloud or other software as a service (SaaS), agile, or DevOps types of methods. So this isn’t something that’s going to be a cookie-cutter. It really needs to be looked at company by company and timeline by timeline.

Is this a vehicle for professional services, for management consulting, more than for IT and product? What is the relationship between cloud governance, Bimodal IT, and professional services?

Delineating systems

Nadhan: It’s a great question, Dana. Let me characterize Bimodal IT slightly differently before answering it. Another way to look at Bimodal IT, where we are today, is delineating systems of record and systems of engagement.

In traditional IT, typically, we're looking at the systems of record. The systems of engagement, with social media and so on, are the live interaction: continuously evolving, growing by the second, which results in the need for big data, security, and definitely the cloud.

The coexistence of both of these paradigms requires the right move to the cloud for the right reason. So even though they are systems of record, some, if not most, do need to move to the cloud, but that doesn’t mean all systems of engagement eventually move to the cloud.

There are good reasons why you may actually want to leave certain systems of engagement the way they are. The art really is in combining the historical data that the systems of record have with the continual influx of data that we get through the live channels of social media, and then, using the right level of predictive analytics to get information.

I said a lot in there just to characterize Bimodal IT slightly differently, making the point that what is really at play, Dana, is a new style of thinking. It's a new style of addressing problems that have been around for a while.

It's a new way to address the same problems: new solutions, new ways of coming up with the solution models that address the business problems at hand. That requires an external perspective. That requires service providers and consulting professionals who have worked with multiple customers, perhaps other customers in the same industry and in other industries, with a healthy dose of innovation.

That's where there is a new opportunity for professional services to work with the CxOs, the enterprise architects, and the CIOs to exercise the right business decision with the right level of governance.

Because of the challenges with the coexistence of both systems of record and systems of engagement and harvesting the right information to make the right business decision, there is a significant opportunity for consulting services to be provided to enterprises today.

Drilling down

Gardner: Before we close off I wanted to just drill down on one thing, Nadhan, that you brought up, which is that ability to measure and know and then analyze and compare.

One of the things that we've seen with IT developing over the past several years as well is that the big data capabilities have been applied to all the information coming out of IT systems so that we can develop a steady state and understand those systems of record, how they are performing, and compare and contrast in ways that we couldn’t have before.

So on our last topic for today, David Janson, how important is it for that measuring capability in a governance context, and for organizations that want to pursue Bimodal IT, but keep it governed and keep it from spinning out of control? What should they be thinking about putting in place, the proper big data and analytics and measurement and visibility apparatus and capabilities?

Janson: That’s a really good question. One aspect of this is that, when I talk with people about the ideas around governance, it's not unusual that the first idea people have about governance is the compliance or policing role it can play. That sounds like interference, sand in the gears, but it really should be the other way around.

A governance framework should actually make it very clear how people should be doing things, what’s expected as the result at the end, and how things are checked and measured over time, at early stages and later stages, so that people are very clear about how things are carried out and what they're expected to do. So, if someone does use a governance-compliance process to see whether things are working right, there is no surprise and no slowdown. People actually know how to move quickly through it.

Good governance has communicated that well enough, so that people should actually move faster rather than slower. In other words, there should be no surprises.

Measuring things is very important, because if you haven’t established the objectives you're after, and some metrics to help you determine whether you're meeting them, then governance is kind of an empty suit, so to speak. You express some ideas that you want to achieve, but you have no way of answering the question of how we know whether this is doing what we want it to do. Metrics are very important around this.

We capture metrics within processes. Then, for the end result, is it actually producing the effects people want? That’s pretty important.

One of the things that we have built into the Cloud Governance Framework is some idea about what are the outcomes and the metrics that each of these process pairs should have in mind. It helps to answer the question, how do we know? How do we know if something is doing what we expect? That’s very, very essential.

Gardner: I am afraid we'll have to leave it there. We've been examining the role of cloud governance and enterprise architecture and how they work together in the era of increasingly fragmented IT. And we've seen how The Open Group Cloud Governance Initiatives and Working Groups can help allow for the benefits of Bimodal IT without IT fragmentation necessarily leading to a fractured or broken business process around technology and innovation.
This special BriefingsDirect thought leadership panel discussion comes to you in conjunction with The Open Group’s upcoming conference on July 20, 2015 in Baltimore. And it’s not too late to register on The Open Group’s website or to follow the proceedings online and via social media such as Twitter and LinkedIn.

So, thank you to our guests: Dr. Chris Harding, Director for Interoperability and Cloud Computing Forum Director at The Open Group; David Janson, Executive IT Architect and Business Solutions Professional with the IBM Industry Solutions Team for Central and Eastern Europe and a leading contributor to The Open Group Cloud Governance Project, and Nadhan, HP Distinguished Technologist and Cloud Advisor and Co-Chairman of The Open Group Cloud Governance Project.

And a big thank you, too, to our audience for joining this special Open Group-sponsored discussion. This is Dana Gardner, Principal Analyst at Interarbor Solutions, your host and moderator for this thought leadership panel discussion series. Thanks again for listening, and do come back next time.

Listen to the podcast. Find it on iTunes. Get the mobile app for iOS or Android. Download the transcript. Sponsor: The Open Group.

Transcript of a BriefingsDirect discussion on the role of cloud governance and enterprise architecture and how they work together in the era of increasingly fragmented IT. Copyright The Open Group and Interarbor Solutions, LLC, 2005-2015. All rights reserved.
