
Friday, January 06, 2017

How Lastminute.com Uses Machine Learning to Improve Real-Time Travel Bookings

Transcript of a discussion on how lastminute.com manages massive volumes of data to support a cutting-edge machine-learning algorithmic approach to matching the best experience in travel with end user requirements.

Listen to the podcast. Find it on iTunes. Get the mobile app. Download the transcript. Sponsor: Hewlett Packard Enterprise.

Dana Gardner: Hello, and welcome to the next edition of the Hewlett Packard Enterprise (HPE) Voice of the Customer podcast series. I’m Dana Gardner, Principal Analyst at Interarbor Solutions, your host and moderator for this ongoing discussion on digital transformation. Stay with us now to learn how agile businesses are fending off disruption -- in favor of innovation.

Our next case study highlights how online travel and events pioneer lastminute.com leverages big-data analytics with speed at scale to provide business advantages to travel bookings. We'll explore how lastminute.com manages massive volumes of data to support cutting-edge machine-learning algorithms to allow for speed and automation while buying such services online.

So join us as we learn how a culture of IT innovation helps make highly dynamic customer interactions for online travel a major differentiator for lastminute.com. To describe how machine learning is improving travel services fulfillment, we're joined by Filippo Onorato, Chief Information Officer at lastminute.com group in Chiasso, Switzerland. Welcome, Filippo.

Filippo Onorato: Thank you very much.

Gardner: Most people these days are trying to do more things more quickly amid higher complexity. What is it that you're trying to accomplish in terms of moving beyond disruption and being competitive in a highly complex area?
Onorato: The travel market -- and in particular the online travel market -- is a very fast-moving market, and the habits and behaviors of the customers are changing so rapidly that we have to move fast.

Disruption is coming every day from different actors ... [requiring] a different way of constructing the customer experience. In order to do that, you have to rely on very large amounts of data -- just to study the evolution of customers and their behaviors.

Gardner: And customers are more savvy; they really know how to use data and look for deals. They're expecting real-time advantages. How is the sophistication of the end user impacting how you work at the core, in your data center, and in your data analysis, to improve your competitive position?

Onorato: Once again, customers are normally looking for information, and providing the right information at the right time is a key to our success. The brands we came from were Bravofly and, in Italy, Volagratis; that name means "free flight." The competitive advantage we have is to provide a comparison among all the different airline tickets, at a time when the market is changing rapidly from the standard airlines to the low-cost ones. Customers are eager to find the best deal, the best price for their travel requirements.

So, the ability to construct their customer experience in order to find the right information at the right time, comparing hundreds of different airlines, was the competitive advantage we made our fortune on.

Gardner: Let’s edify our listeners and readers a bit about lastminute.com. You're global. Tell us about the company and perhaps your size, employees, and the number of customers you deal with each day.

Most famous brand

Onorato: We are 1,200 employees worldwide. Lastminute.com, the most famous brand worldwide, was acquired by the Bravofly Rumbo Group two years ago from Sabre. We own Bravofly, which was the original brand. We own Rumbo, which is very popular in Spanish-speaking markets. We own Volagratis, the original brand in Italy. And we own Jetcost, which is very popular in France; that is actually a metasearch -- a combination of search and competitive comparison across all the online travel agencies (OTAs) in the market.

We span across 40 countries, we support 17 languages, and we help almost 10 million people fly every year.

Gardner: Let’s dig into the data issues here, because this is a really compelling use-case. There's so much data changing so quickly, and sifting through it is an immense task, but you want to bring the best information to the right end user at the right time. Tell us a little about your big-data architecture, and then we'll talk a little bit about bots, algorithms, and artificial intelligence.

Onorato: The architecture of our system is pretty complex. On one side, we have to react almost instantly to the searches that customers are doing. We have a real-time platform that grabs information from all the providers -- airlines, other OTAs, hotel providers, bed banks, and so on.

We concentrate all this information in a huge real-time database, using a lot of caching mechanisms, because the speed of the search -- the speed of giving results to the customer -- is a competitive advantage. That's the real-time part of our development, and it constitutes the core business of our industry.
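The caching layer Onorato describes can be sketched minimally as a time-to-live (TTL) cache: a fare result is served from memory while fresh and dropped once stale. This is an illustrative sketch only -- the class name, the TTL value, and the injectable clock are assumptions, not lastminute.com's actual implementation.

```python
import time

class TTLCache:
    """Minimal time-to-live cache for search results (illustrative sketch)."""

    def __init__(self, ttl_seconds=60, clock=time.monotonic):
        self.ttl = ttl_seconds
        self.clock = clock          # injectable, so tests can control time
        self._store = {}            # key -> (expires_at, value)

    def get(self, key):
        """Return the cached value, or None if absent or expired."""
        entry = self._store.get(key)
        if entry is None:
            return None
        expires_at, value = entry
        if self.clock() >= expires_at:   # stale fare data: drop it
            del self._store[key]
            return None
        return value

    def put(self, key, value):
        """Cache a value, valid for the next ttl_seconds."""
        self._store[key] = (self.clock() + self.ttl, value)
```

A short TTL keeps fares close to what providers are actually quoting while still absorbing the bulk of repeated searches.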

Gardner: And this core of yours, these are your own data centers? How have you constructed them and how do you manage them in terms of on-premises, cloud, or hybrid?

Onorato: It's all on-premises, and this is our core infrastructure. On the other hand, the data gathered from interactions with customers is only partially captured today. This is the big challenge for the future -- having all that data stored in a data warehouse. That data is captured in order to build our internal knowledge. That would be the sales funnel.

So, we track the behavior of the customer -- the percentage of conversion at each and every step the customer takes, from the search to the actual booking. That data is gathered together in a data warehouse based on HPE Vertica and then analyzed in order to find the best placement and to optimize conversion. That's the main usage of the data warehouse.

On the other hand, what we're implementing on top of all this enormous amount of data is session-related data. You can imagine how much data a single interaction of a customer can generate. Right now, we're storing a short history of that data, but the goal is to have two years' worth of session data. That would be an enormous amount of data.
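The per-step conversion tracking described here can be sketched as a small funnel computation. The step names below are hypothetical, chosen only to illustrate measuring how many sessions survive each stage from search to booking.

```python
from collections import Counter

# Hypothetical funnel stages, from first search to completed booking.
FUNNEL = ["search", "select", "checkout", "booking"]

def step_conversion(sessions):
    """Given per-session lists of funnel events, return the step-to-step
    conversion rate: the share of sessions reaching the previous step
    that also reach this one."""
    reached = Counter()
    for events in sessions:
        for step in FUNNEL:
            if step in events:
                reached[step] += 1
    rates = {}
    prev = len(sessions)
    for step in FUNNEL:
        rates[step] = reached[step] / prev if prev else 0.0
        prev = reached[step]
    return rates
```

In a warehouse like the one described, the same computation would run as SQL over session events; the drop-off between any two steps is exactly where optimization effort pays off.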

Gardner: And when we talk about data, often we're concerned about velocity and volume. You've just addressed volume, but velocity must be a real issue, because any change in a weather issue in Europe, for example, or a glitch in a computer system at one airline in North America changes all of these travel data points instantly.

Unpredictable events

Onorato: That’s also pretty typical in the tourism industry. It's a very delicate business, because we have to react to unpredictable events that are happening all over the world. In order to do a better optimization of margins, of search results, etc., we're also applying some machine-learning algorithms, because a human can't react so fast to the ever-changing market or situation.

In those cases, we use optimization algorithms to fine-tune our search results, to better deal with a customer request, and to propose the best deal at the right time. In very simple terms, that's our core business right now.

Gardner: And Filippo, only your organization can do this, because the people with the data on the back side can’t apply the algorithm; they have only their own data. It’s not something the end user can do on the edge, because they need to receive the results of the analysis and the machine learning. So you're in a unique, important position. You're the only one who can really apply the intelligence, the AI, and the bots to make this happen. Tell us a little bit about how you approached that problem and solved it.
Onorato: I perfectly agree. We are the collector of an enormous amount of product-related information on one side. On the other side, what we're collecting are the customer behaviors. Matching the two is unique for our industry. It's definitely a competitive advantage to have that data.

Then, what you do with all that data is something that pushes us to do continuous innovation and continuous analysis. By the way, I don't think something like this can be implemented without a lot of training and a lot of understanding of the data.

Just to give you an example: the machine-learning algorithm we're implementing, called multi-armed bandit, is a kind of parallel testing of different configurations of parameters that are presented to the final user. The algorithm reacts to a specific set of conditions and proposes the best combination of ordering, visibility, pricing, and so on to the customer in order to satisfy their search.

What we really do in that case is to grab information, build our experience into the algorithm, and then optimize this algorithm every day, by changing parameters, by also changing the type of data that we're inputting into the algorithm itself.
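One common multi-armed bandit strategy is epsilon-greedy: mostly serve the configuration with the best observed conversion rate, but occasionally try another at random so the system keeps learning. The sketch below illustrates that general technique only -- the configuration statistics, the 0.1 exploration rate, and the feedback function are assumptions, not details of lastminute.com's actual system.

```python
import random

def choose_config(stats, epsilon=0.1, rng=random):
    """Epsilon-greedy bandit: with probability epsilon pick a random page
    configuration (explore); otherwise serve the configuration with the
    highest observed conversion rate (exploit)."""
    if rng.random() < epsilon:
        return rng.choice(sorted(stats))
    return max(sorted(stats),
               key=lambda c: stats[c]["conversions"] / max(stats[c]["shows"], 1))

def record_outcome(stats, config, converted):
    """Feed the result of one customer session back into the statistics,
    so the next choice reflects what actually happened."""
    stats[config]["shows"] += 1
    stats[config]["conversions"] += int(converted)
```

With epsilon at zero the function always exploits; raising it trades a little short-term conversion for knowledge about the other configurations, which is the "ongoing experience" Onorato describes.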

So, it’s an ongoing experience; it’s an ongoing study. It's endless, because the market conditions are changing and the actors in the market are changing as well, coming from the two operators in the past, the airline and now the OTA. We're also a metasearch, aggregating products from different OTAs. So, there are new players coming in and they're always coming closer and closer to the customer in order to grab information on customer behavior.

Gardner: It sounds like you have a really intense culture of innovation, and that's super important these days, of course. As we were hearing at the HPE Big Data Conference 2016, the feedback loop element of big data is now really taking precedence. We have the ability to manage the data, to find the data, to put the data in a useful form, but we're finding new ways. It seems to me that the more people use our websites, the better that algorithm gets, the better the insight to the end user, therefore the better the result and user experience. And it never ends; it always improves.

How does this extend? Do you take it to now beyond hotels, to events or transportation? It seems to me that this would be highly extensible and the data and insights would be very valuable.

Core business

Onorato: Correct. The core business was initially the flight business. We were born by selling flight tickets. Hotels and pre-packaged holidays were the second step. Then, we provided information about lifestyle. For example, in London we have an extensive offering of theater, events, and shows that are aggregated.

Also, we have a smaller brand for restaurants. We're offering car rental. We're also giving value-added services to the customer, because the journey of the customer doesn't end with the booking. It continues throughout the trip, and we're providing information regarding check-in; web check-in is a service that we provide. There are a lot of ancillary businesses that are making the overall travel experience better, and that's the goal for the future.

Gardner: I can even envision where you play a real-time concierge, where you're able to follow the person through the trip and be available to them as a bot or a chat. This edge-to-core capability is so important, and that big data feedback, analysis, and algorithms are all coming together very powerfully.

Tell us a bit about metrics of success. How can you measure this? Obviously a lot of it is going to be qualitative. If I'm a traveler and I get what I want, when I want it, at the right price, that's a success story, but you're also filling every seat on the aircraft or you're filling more rooms in the hotels. How do we measure the success of this across your ecosystem?

Onorato: In that sense, we're probably a little bit farther away from the real product, because we're an aggregator. We don’t have the risk of running a physical hotel, and that's where we're actually very flexible. We can jump from one location to another very easily, and that's one of the competitive advantages of being an OTA.

But the success overall right now is giving the best information at the right time to the final customer. What we're measuring right now is definitely the voice of the customer, the voice of the final customer, who is asking for more and more information, more and more flexibility, and the ability to live an experience in the best way possible.

So, we're also providing a brand that is associated with wonderful holidays, having fun, etc. 

Gardner: The last question, for those who are still working on building out their big-data infrastructure, trying to attain this cutting-edge capability and start to take advantage of machine learning, artificial intelligence, and so forth: if you could do it all over again, what would you tell them? What would be your advice to somebody who is more in the early stages of their big-data journey?

Onorato: It is definitely based on two factors -- having the best technology and not always trying to build your own technology, because there are a lot of products in the market that can speed up your development.

And also, it's having the best people. The best people are one of the competitive advantages of any company that is running this kind of business. You have to rely on fast learners, because market conditions are changing, technology is changing, and people need to retrain themselves very fast. So, you have to invest in people and invest in the best technology available.

Gardner: I'm afraid we will have to leave it there. We've been exploring how online travel and events pioneer lastminute.com group leverages big-data analytics with incredible speed and at huge scale to provide business advantages in travel and other bookings.

And we've learned how the OTA manages these massive volumes of data to support a cutting-edge machine-learning algorithmic approach to matching the best experience in travel with end user requirements.

So, please join me in thanking our guest, Filippo Onorato, Chief Information Officer at lastminute.com group in Chiasso, Switzerland. Thank you, sir.
Onorato: Thank you very much.

Gardner: And thanks to our audience as well for joining us for this Hewlett Packard Enterprise Voice of the Customer digital transformation discussion. I'm Dana Gardner, Principal Analyst at Interarbor Solutions, your host for this ongoing series of HPE-sponsored interviews. Thanks again for listening, and please do come back next time.

Listen to the podcast. Find it on iTunes. Get the mobile app. Download the transcript. Sponsor: Hewlett Packard Enterprise.

Transcript of a discussion on how lastminute.com manages massive volumes of data to support a cutting-edge machine-learning algorithmic approach to matching the best experience in travel with end user requirements. Copyright Interarbor Solutions, LLC, 2005-2017. All rights reserved.


Monday, December 19, 2016

Veikkaus Digitally Transforms as it Emerges as New Combined Finnish National Gaming Company

Transcript of a discussion on how a culture of IT innovation is helping to establish a single wholly nationally owned company to operate gaming in Finland.

Listen to the podcast. Find it on iTunes. Get the mobile app. Download the transcript. Sponsor: Hewlett Packard Enterprise.

Dana Gardner: Welcome to the next edition of the Hewlett Packard Enterprise (HPE) Voice of the Customer podcast series. I’m Dana Gardner, Principal Analyst at Interarbor Solutions, your host and moderator for this ongoing discussion on digital transformation. Stay with us now to learn how agile businesses are fending off disruption in favor of innovation.

Our next case study highlights how combined Finnish national gaming company, Veikkaus, manages a complex merger process while bringing a digital advantage to both its operations and business model. We'll explore how Veikkaus uses a powerful big-data analytics platform to respond rapidly to the challenges of digitization.

Learn how a culture of IT innovation has established a single nationally owned company to operate gaming and gambling in Finland. To describe how Veikkaus is transforming itself for better customer experience from the computing core to the edge, we're joined by Harri Räsänen, Information and Communications Technology Architect at Veikkaus in Helsinki.

Welcome, Harri.

Harri Räsänen: Thank you. It’s nice to be here.
Gardner: Why has Veikkaus reinvented its data infrastructure technology?

Räsänen: Our data warehouse solution was a traditional data warehouse and had been around for 10 years. Over that time, different things had gone wrong. One of the key issues we faced was that our data wasn’t real time. It was far from real time -- it was data that was one or two days old.

We decided that we needed to get data quicker and in more detail, because until then we had only aggregated data.

Gardner: What were some of your top requirements technically in order to accomplish that?

Real-time data

Räsänen: As I said, we had quite an old-fashioned data warehouse. Initially, we needed our game-service provider to feed us data closer to real time. They needed to build a mechanism to deliver complete data, and we needed to build out capabilities to gather it. We needed to rethink the information structure -- totally from scratch.

Gardner: When we think about gambling, gaming, or lotteries, in many cases this is an awful lot of data, a very big undertaking. Give us a sense of the size of the data and the disparity of the three organizations that came together in the Finnish national gaming reorganization.

Räsänen: I'll talk about our current situation, and about the new combined company we're starting up in 2017.

We have a big company from a customer point of view. We have 1.8 million consumers, and Finland has a population of 5.5 million. So, we have quite a lot of Finnish consumers. When it comes to transactions, we get one to three million transactions per day. So it’s quite large, if you think about the transactional data.

In addition to that, we gather different kinds of information about our web store; it’s one of the biggest retail web stores in Finland.

Gardner: It’s one thing to put in a new platform, but it’s another to then change the culture and the organization -- and transform into a digital business. How is the implementation of your new data environment aiding in your cultural shift?

Räsänen: Luckily, Veikkaus has a background of doing things quite analytically. If you think about a web store, there's a culture of monitoring what we're doing -- if we're running some changes in our web store, whether they work or not. That’s a good thing.

But we are redoing our whole data technology. We added Apache Kafka as the integration point, and then Cloudera, the Hadoop system. Then we added an ETL tool that was new for us, Pentaho, and last but not least, HPE Vertica. It's been really challenging for us, with lots of different things to consider and learn.

Luckily, we've been able to use good external consultants to help us out, but as you said, we can always make the technology work better. In transforming the culture of doing things, we're still definitely in the middle of our journey.

Gardner: I imagine you'll want to better analyze what takes place within your organization so it’s not just serving the data and managing the transactions. There's an opportunity to have a secondary benefit, which is more control of your data. The more insight you have allows you to adapt and improve your customer experience and customer service. Have you been able to start down that path of that secondary analysis of what goes on internally?

New level of data

Räsänen: Some of our key data was even out of our hands, in our service providers' environments. We wanted to get all the relevant data with us, and now we've been working on that new level of data access. We have analysts -- both IT and business people -- working on that, browsing the data. They already have findings on things that previously they couldn't have asked about or even thought of. So, we've been getting our information up to date.

Gardner: Can you give us more specific examples of how you've been able to benefit from this new digital environment?

Räsänen: Yeah, consumer communication and CRM are among the key successes, things we needed to have in place. We've been able to constantly improve on that. Before, we had data that was too old; now, we have near-real-time data. We get one-minute-old data, so we can communicate with the consumers better. We know whether they've been playing their lotteries or betting on football matches.

We can say, "It’s time for football today, and you haven’t yet placed a bet." We can communicate, and on the other hand, we can avoid disturbing customers by sending out e-mails or SMS messages about things they've already done.
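The targeting Räsänen describes -- nudge customers who follow football but have not yet bet today, while avoiding messages to those who already have -- reduces to a simple filter over near-real-time data. The field names below are hypothetical, for illustration only.

```python
def pick_reminders(customers, bets_today):
    """Return IDs of customers worth a 'match today' reminder: they follow
    football and have not yet placed a bet today. Customers who already
    acted are skipped, so they are not disturbed with redundant messages."""
    already_bet = set(bets_today)
    return [c["id"] for c in customers
            if c.get("follows_football") and c["id"] not in already_bet]
```

The value of one-minute-old data is exactly in the second condition: with day-old data, the "already bet" set would be wrong for anyone who acted since yesterday.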
Gardner: Yes, less spam, but more help. It’s important, of course, with any organization like this in a government environment, for trust and safety to be involved. I should think that there's some analysis to help keep people from overdoing it and managing the gaming for a positive outcome.

Räsänen: Definitely. That’s one of the key metrics we're measuring with our consumers, so that gaming stays responsible. We need to see that everything they do can be considered good, because, as you said, we're a national company, it’s a very regulated market, and that kind of thing.

Gardner: But a great deal of good comes from this. I understand that more than 1 billion euros a year go to the common good of people living in Finland. So, there are a lot of benefits when this is done properly.

Now that you've gone quite a ways into this, and you're going to need to be going to the new form and new organization the first of 2017, what advice would you be able to give to someone who is beginning a big data consolidation and modernization journey? What lessons have you learned that you might share?

Out of the box

Räsänen: If you're experimenting, you need to start to think a little bit out of the box. Integration is one crucial part, and you should avoid direct point-to-point integration as much as possible.

We're utilizing Apache Kafka as an integration point, and that’s one of the crucial things, because then you can "appify" everything. You provide an application interface for integrating systems, and that will help those of us in gaming companies.

Gardner: A lot of services-orientation?

Räsänen: That’s one of the components of our data architecture. We've been using our Cloudera Hadoop system for archiving, and we're building our capabilities on top of that. In addition, of course, we have HPE Vertica. It’s one of the most crucial things in our data ecosystem, because it’s a traditional enterprise data warehouse in the sense that it's a SQL database. Users can take benefit of that, and it’s lightning-fast. You need to design all the components and make each work in the role it's best suited for.

Gardner: And of course SQL is very commonly understood as the query language. There's no great change there, but it's really putting it into the hands of more people.

Räsänen: I've been writing and talking in SQL since the beginning of the ’90s, and it’s actually a pretty easy language to communicate in, even between business and IT, because, at least at some level, it’s self-explanatory. That’s where the communication matters.
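To illustrate that self-explanatory quality, here is the kind of warehouse query both business and IT can read. The snippet uses SQLite purely so the SQL is runnable in isolation; in production a query in this style would run against the Vertica warehouse, and the table and column names are hypothetical.

```python
import sqlite3

# In-memory stand-in for the warehouse; the SQL itself is the point.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE wagers (game TEXT, day TEXT, stake REAL)")
conn.executemany("INSERT INTO wagers VALUES (?, ?, ?)", [
    ("lotto",    "2016-12-19", 5.0),
    ("lotto",    "2016-12-19", 2.5),
    ("football", "2016-12-19", 10.0),
])

# Daily stakes by game: readable by business and IT alike.
rows = conn.execute("""
    SELECT game, COUNT(*) AS bets, SUM(stake) AS total_stake
    FROM wagers
    WHERE day = '2016-12-19'
    GROUP BY game
    ORDER BY total_stake DESC
""").fetchall()
```

Even someone who has never written SQL can read off what the query asks: bets and total stake per game for one day, biggest game first.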

Gardner: Just a much better engine under the hood, right?

Räsänen: Yeah, exactly.

Gardner: I am afraid we'll have to leave it there. We've been exploring how combined Finnish state gaming company, Veikkaus, is managing a complex merger process, while also bringing more of a digital advantage to its operations. And we've learned how a culture of IT innovation is helping to establish a state company to operate gaming in Finland.

Please join me in thanking our guest, Harri Räsänen, Information and Communications Technology Architect at Veikkaus in Helsinki. Thank you, Harri.
Räsänen: Thank you.

Gardner: And a big thank you as well to our audience for joining us for this Hewlett Packard Enterprise Voice of the Customer digital transformation discussion.

I'm Dana Gardner, Principal Analyst at Interarbor Solutions, your host for this ongoing series of HPE-sponsored interviews. Thanks again for listening, and please do come back next time.

Listen to the podcast. Find it on iTunes. Get the mobile app. Download the transcript. Sponsor: Hewlett Packard Enterprise.

Transcript of a discussion on how a culture of IT innovation is helping to establish a single wholly owned state company to operate gaming in Finland. Copyright Interarbor Solutions, LLC, 2005-2016. All rights reserved.


Friday, November 18, 2016

Strategic View Across More Data Delivers Digital Business Boost for AmeriPride

Transcript of a discussion on how improved data allows for more types of work in an improved organization to become even more intelligent, and to find new efficiency benefits.

Listen to the podcast. Find it on iTunes. Get the mobile app. Download the transcript. Sponsor: Hewlett Packard Enterprise.

Dana Gardner: Hello, and welcome to the next edition of the Hewlett Packard Enterprise (HPE) Voice of the Customer podcast series. I’m Dana Gardner, Principal Analyst at Interarbor Solutions, your host and moderator for this ongoing discussion on digital transformation. Stay with us now to learn how agile companies are fending off disruption -- in favor of innovation.

Our next case study explores how linen services industry leader AmeriPride Services uses big data to gain a competitive and comprehensive overview of its operations, finances and culture.

We’ll explore how improved data analytics allows for disparate company divisions and organizations to come under a single umbrella, to become more aligned, and to act as a whole greater than the sum of the parts. This is truly the path to a digital business.

Here to describe how digital transformation has been supported by innovations at the big data core, we’re joined by Steven John, CIO at AmeriPride Services in Minnetonka, Minnesota. Welcome, Steven.

Steven John: I’m glad to be here.

Gardner: We’re also joined by Tony Ordner, Information Team Manager at AmeriPride Services. Welcome, Tony.

Tony Ordner: Thank you. I’m happy to be here, too.
Gardner: Let’s discuss your path to being a more digitally transformed organization. What were the requirements that led you to become more data-driven, more comprehensive, and more inclusive in managing your large, complex organization?

John: One of the key business drivers for us was that we're a company in transition -- from a very diverse organization to a very centralized organization. Before, it wasn't necessarily important for us to speak the same data language, but now it's critical. We’re developing the lexicon, the Rosetta Stone, that we can all rely on and use to make sure that we're aligned and heading in the same direction.

Gardner: And Tony, when we say “data,” are we talking about just databases and data within applications? Or are we being even more comprehensive -- across as many information types as we can?

Ordner: It’s across all of the different information types. When we embarked on this journey, we discovered that data itself is great to have, but you also have to have processes that are defined in a similar fashion. You really have to drive business change in order to be able to effectively utilize that data, analyze where you're going, and then use that to drive the business. We're trying to institute into this organization an iterative process of learning.

Gardner: For those who are not familiar with AmeriPride Services, tell us about the company. It’s been around for quite a while. What do you do, and how big of an umbrella organization are we talking about?

Long-term investments

John: The company is over 125 years old. It’s family-owned, which is nice, because we're not driven by the quarter. We can make longer-term investments through the family. We can have more of a future view and have ambition to drive change in different ways than a quarter-by-quarter corporation does.

We're in the laundry business. We're in the textiles and linen business. What that means is that for food and beverage, we handle tablecloths, napkins, chef coats, aprons, and those types of things. In oil and gas, we provide the safety garments that are required. We also provide the mats you cross as you walk in the door of various restaurants or retail stores. We're in healthcare facilities and meet the various needs of providing and cleansing the garments and linens coming out of those institutions. We're very diverse. We're the largest company of our kind in Canada, probably about fourth in the US, and growing.

Gardner: And this is a function that many companies don't view as core and they're very happy to outsource it. However, you need to remain competitive in a dynamic world. There's a lot of innovation going on. We've seen disruption in the taxicab industry and the hospitality industry. Many companies are saying, “We don’t want to be a deer in the headlights; we need to get out in front of this.”

Tony, how do you continue to get in front of this, not just at the data level, but also at the cultural level?

Ordner: Part of what we're doing is defining those standards across the company. And we're coming up with new programs and new ways to get in front and to partner with the customers.

As part of our initiative, we're installing a lot of different technology pieces that we can use to be right there with the customers, to make changes with them as partners, and maybe better understand their business and the products that they aren't buying from us today that we can provide. We’re really trying to build that partnership with customers, provide them more ways to access our products, and devise other ways they might not have thought of for using our products and services.

With all of those data points, it allows us to do a much better job.

Gardner: And we have heard from Hewlett Packard Enterprise (HPE) the concept that it's the “analytics that are at the core of the organization,” that then drive innovation and drive better operations. Is that something you subscribe to, and is that part of your thinking?

John: For me, you have to extend it a little bit further. In the past, our company was driven by the experience and judgment of the leadership. But what we discovered is that we really wanted to be more data-driven in our decision-making.

Data creates a context for conversation. In the context of their judgment and experience, our leaders can leverage that data to make better decisions. The data, in and of itself, doesn’t drive the decisions -- it's that experience and judgment of the leadership that's that final filter.

We often forget the human element at the end of that and think that everything is being driven by analytics, when analytics is a tool and will remain a tool that helps leaders lead great companies.

Gardner: Steven, tell us about your background. You were at a startup, a very successful one, on the leading edge of how to do things differently when it comes to apps, data, and cloud delivery.

New ways to innovate

John: Yes, you're referring to Workday. I was actually Workday’s 33rd customer, the first to go global with their product. Then, I joined Workday in two roles: as their Strategic CIO, working very closely with the sales force, helping CIOs understand the cloud and how to manage software as a service (SaaS); and also as their VP of Mid-Market Services, where we were developing new ways to innovate, to implement in different ways and much more rapidly.

And it was a great experience. I've done two things in my life, startups and turnarounds, and I thought that I was kind of stepping back and taking a relaxing job with AmeriPride. But in many ways, it's both; AmeriPride’s both a turnaround and a startup, and I'm really enjoying the experience.

Gardner: Let’s hear about how you translate technology advancement into business advancement. And the reason I ask it in that fashion is that it seems a bit of a chicken-and-egg problem: they need to be done in parallel -- strategy, ops, culture, as well as technology. How are you balancing that difficult equation?

John: Let me give you an example. Again, it goes back to that idea of, if you just have the human element, they may not know what to ask, but when you add the analytics, then you suddenly create a set of questions that drive to a truth.
We're a route-based business. We have over 1,000 trucks out there delivering our products every day. When we started looking at margin, we discovered that our greatest margin was from those customers that were within a mile of another customer.

So factoring that in changes how we sell, that changes how we don't sell, or how we might actually let some customers go -- and it helps drive up our margin. You have that piece of data, and suddenly we as leaders knew some different questions to ask and different ways to orchestrate programs to drive higher margin.

Gardner: Another trend we've seen is that putting data and analytics, very powerful tools, in the hands of more people can have unintended, often very positive, consequences. A knowledge worker isn't just in a cube and in front of a computer screen. They're often in the trenches doing the real physical work, and so can have real process insights. Has that kicked in yet at AmeriPride, and are you democratizing analytics?

Ordner: That’s a really great question. We've been trying to build a power-user base and bring some of these capabilities into the business segments to allow them to explore the data.

You always have to keep an eye on knowledge workers, because sometimes they can come to the wrong conclusions, as well as the right ones. So it's trying to make sure that we maintain that business layer, that final check. It's asking: the data is telling me this, but is that really what's happening?

I liken it to having a flashlight in a dark room. That’s what we are really doing with visualizing this data and allowing them to eliminate certain things, and that's how they can raise the questions, what's in this room? Well, let me look over here, let me look over there. That’s how I see that.

Too much information

John: One of the things I worry about is that if you give people too much information or unstructured information, then they really get caught up in the academics of the information -- and it doesn’t necessarily drive a business process or drive a business result. It can cause people to get lost in the weeds of all that data.

You still have to orchestrate it, you still have to manage it, and you have to guide it. But you have to let people go off and play and innovate using the data. We actually have a competition among our power-users where they go out and create something, and there are judges and prizes. So we do try to encourage the innovation, but we also want to hold the reins in just a little bit.

Gardner: And that gets to the point of having a tight association between what goes on in the core and what goes on at the edge. Is that something that you're dabbling in as well?

John: It gets back to that idea of a common lexicon. If you think about evolution, you don't want a Madagascar or a Tasmania, where groups get cut off and then they develop their own truth, or a different truth, or they interpret data in a different way -- where they create their own definition of revenue, or they create their own definition of customer.

If you think about it as orbits, you have to have a balance. Maybe you only need to touch certain people in the outer orbit once a month, but you have to touch them once a month to make sure they're connected. The thing about orbits and keeping people in the proper orbits is that if you don't, then one of two things happens, based on gravity. They either spin out of orbit or they come crashing in. The idea is to figure out what's the right balance for the right groups to keep them aligned with where we are going, what the data means, and how we're using it, and how often.

Gardner: Let’s get back to the ability to pull together the data from disparate environments. I imagine, like many organizations, that you have SaaS apps. Maybe it’s for human capital management or maybe it’s for sales management. How does that data then get brought to bear with internal apps, some of which may still be on a mainframe, or with virtualized apps from older code bases and so forth? What’s the hurdle, and what words of wisdom might you impart to others who are earlier in this journey about how to make all that data common and usable?

Ordner: That tends to be a hurdle. As to the data-acquisition piece, as you set these things up in the cloud, a lot of the time the business units themselves are doing these things or making the agreements. They don't put into place the data access that we've always needed. That’s been our biggest hurdle. They'll sign the contracts, not getting us involved until they say, "Oh my gosh, now we need the data." We look at it and we say, "Well, it’s not in our contracts, and now it’s going to cost more to access the data." That’s been our biggest hurdle for the cloud services that we've done.

Once you get past that, web services have been a great thing. Once you get the licensing and the contract in place, it becomes a very simple process, and it becomes a lot more seamless.

Gardner: So, maybe something to keep in mind is always think about the data before, during, and after your involvement with any acquisition, any contract, and any vendor?

Ordner: Absolutely.

You own three things

John: With SaaS, at the end of the day, you own three things: the process design, the data, and the integration points. When we construct a contract, one of the things I always insist upon is what I refer to as the “prenuptial agreement.”

What that simply means is, before the relationship begins, you understand how it can end. The key thing in how it ends is that you can take your data with you, that it has a migration path, and that they haven't created a stickiness that traps you there and you don't have the ability to migrate your data to somebody else, whether that’s somebody else in the cloud or on-premise.

Gardner: All right, let’s talk about lessons learned in infrastructure. Clearly, you've had an opportunity to look at a variety of different platforms, different requirements that you have had, that you have tested and required for your vendors. What is it about HPE Vertica, for example, that is appealing to you, and how does that factor into some of these digital transformation issues?

Ordner: There are two things that come to mind right away for me. One is there were some performance implications. We were struggling with our old world and certain processes that ran 36 hours. We did a proof of concept with HPE and Vertica and that ran in something like 17 minutes. So, right there, we were sold on performance changes.

As we got into it and negotiated with them, the other big advantage we discovered is the licensing model, which is priced on the amount of data rather than on CPU cores, the model everyone else runs on. We're able to scale this and provide that service at high speed, so we can maintain that performance without taking penalties against licensing. Those are a couple of things I see. Anything from your end, Steven?

John: No, I think that was just brilliant.

Gardner: How about on that acquisition and integration of data. Is there an issue with that that you have been able to solve?

Ordner: With acquisition and integration, we're still early in that process. We're still learning about how to put data into HPE Vertica in the most effective manner. So, we're really at our first source of data and we're looking forward to those additional pieces. We have a number of different telematics pieces that we want to include; wash aisle telematics as well as in-vehicle telematics. We're looking forward to that.

There's also scan data that I think will soon be on the horizon. All of our garments and our mats have chips in them. We scan them in and out, so we can see the activity and where they flow through the system. Those are some of our next targets to bring that data in and take a look at that and analyze it, but we're still a little bit early in that process as far as multiple sources. We're looking forward to some of the different ways that Vertica will allow us to connect to those data sources.

Gardner: I suppose another important consideration when you are picking and choosing systems and platforms is that extensibility. RFID tags are important now; we're expecting even more sensors, more data coming from the edge, the information from the Internet of Things (IoT). You need to feel that the systems you're putting in place now will scale out and up. Any thoughts about the IoT impact on what you're up to?

Overcoming past sins

John: We have had several conversations just this week with HPE and their teams, and they are coming out to visit with us on that exact topic. Being about a year into our journey, we've been doing two things. We've been forming the foundation with HPE Vertica and we've been getting our own house in order. So, there's a fair amount of cleanup and overcoming the sins of the past as we go through that process.

But Vertica is a platform; it's a platform where we have only tapped a small percentage of its capability. And in my personal opinion, even HPE is only aware of a portion of its capability. There are a whole set of things that it can do, and I don’t believe that we have discovered all of them.

With that said, we're going to do what you and Tony just described; we're going to use the telematics coming out of our trucks. We're going to track safety and seat belts. We're going to track green initiatives, routes, and the analytics around our routes and fuel consumption. We're going to make the place safer, we're going to make it more efficient, and we're going to get proactive about being able to tell when a machine is going to fail and when to bring in our vendor partners to get it fixed before it disrupts production.

Gardner: It really sounds like there is virtually no part of your business in the laundry services industry that won't be in some way beneficially impacted by more data, better analytics delivered to more people. Is that fair?

Ordner: I think that’s a very fair statement. As I prepared for this conference, one of the things I learned, and I have been with the company for 17 years, is that we've made a lot of technology changes, and technology has taken on added significance within our company. When you think of laundry, you certainly don't think of technology, but we've been at the leading edge of implementing technology to get closer to our customers, closer to understanding our products.

[Data technology] has become really ingrained within the industry, at least at our company.

John: It is one of those few projects where everyone is united, everybody believes that success is possible, and everybody is willing to pay the price to make it happen.

Gardner: I’m afraid we’ll have to leave it there. We’ve been exploring how linen services industry leader AmeriPride Services uses big data to gain a common and comprehensive overview of its operations, finance, and culture. And we've learned how improved data allows an organization to take on more types of work, become even more intelligent, and find new efficiencies and benefits -- even those you probably hadn't thought of before.
So, please join me in thanking our guests, Steven John, CIO at AmeriPride, and Tony Ordner, Information Team Manager at AmeriPride. And a big thank you to our audience as well for joining us for this Hewlett Packard Enterprise Voice of the Customer digital transformation discussion.

I’m Dana Gardner, Principal Analyst at Interarbor Solutions, your host for this ongoing series of HPE-sponsored interviews. Thanks again for listening, and do come back next time.

Listen to the podcast. Find it on iTunes. Get the mobile app. Download the transcript. Sponsor: Hewlett Packard Enterprise.

Transcript of a discussion on how improved data allows for more types of work in an improved organization to become even more intelligent, and to find new efficiency benefits. Copyright Interarbor Solutions, LLC, 2005-2016. All rights reserved.

You may also be interested in:

Tuesday, November 01, 2016

2016 Campaigners Look to Deep Big Data Analysis and Querying to Gain an Edge in Reaching Voters

Transcript of a discussion on how data analysis services startup BlueLabs in Washington helps presidential campaigns better know and engage with potential voters.

Listen to the podcast. Find it on iTunes. Get the mobile app. Download the transcript. Sponsor: Hewlett Packard Enterprise.

Dana Gardner: Welcome to the next edition of the Hewlett Packard Enterprise (HPE) Voice of the Customer podcast series. I’m Dana Gardner, Principal Analyst at Interarbor Solutions, your host and moderator for this ongoing discussion on business digital transformation. Stay with us now to learn how agile companies are fending off disruption in favor of innovation.

Our next case study explores how data-analysis services startup BlueLabs in Washington, D.C. helps presidential campaigns better know and engage with potential voters.

We'll learn how BlueLabs relies on analytics platforms that allow a democratization of querying, opening the value of vast big-data resources to more of those who need to know.

In this example of helping organizations work smarter by leveraging innovative statistical methods and technology, we'll discover how specific types of voters can be identified and reached.

Here to describe how big data is being used creatively by contemporary political organizations for two-way voter engagement, we're joined by Erek Dyskant, Co-Founder and Vice President of Impact at BlueLabs Analytics in Washington. Welcome, Erek.
Erek Dyskant: I'm so happy to be here, thanks for having me.

Gardner: Obviously, this is a busy season for the analytics people who are focused on politics and campaigns. What are some of the trends that are different in 2016 from just four years ago? It’s a fast-changing technology set, and it's also a fast-changing methodology. And of course, the trends in how voters think, react, use social media, and engage are also dynamic. So what's different this cycle?

Dyskant: From a voter-engagement perspective, in 2012, we could reach most of our voters online through a relatively small set of social media channels -- Facebook, Twitter, and a little bit on the Instagram side. Moving into 2016, we see a fragmentation of the online and offline media consumption landscape and many more folks moving toward purpose-built social media platforms.

If I'm at the HPE Conference and I want my colleagues back in D.C. to see what I'm seeing, then maybe I'll use Periscope, maybe Facebook Live, but probably Periscope. If I see something that I think one of my friends will think is really funny, I'll send that to them on Snapchat.

Where political campaigns have traditionally broadcast messages out through the news-feed style social-media strategies, now we need to consider how it is that one-to-one social media is acting as a force multiplier for our events and for the ideas of our candidates, filtered through our campaign’s champions.

Gardner: So, perhaps a way to look at that is that you're no longer focused on precincts physically and you're no longer able to use broadcast through social media. It’s much more of an influence within communities and identifying those communities in a new way through these apps, perhaps more than platforms.

Social media

Dyskant: That's exactly right. Campaigns have always organized voters at the door and on the phone. Now, we think of one more way. If you want to be a champion for a candidate, you can be a champion by knocking on doors for us, by making phone calls, or by making phone calls through online platforms.

You can also use one-to-one social media channels to let your friends know why the election matters so much to you and why they should turn out and vote, or vote for the issues that really matter to you.

Gardner: So, we're talking about retail campaigning, but it's a bit more virtual. What’s interesting though is that you can get a lot more data through the interaction than you might if you were physically knocking on someone's door.

Dyskant: The data is different. We're starting to see a shift from demographic targeting. In 2000, we were targeting on precincts. A little bit later, we were targeting on combinations of demographics, on soccer moms, on single women, on single men, on rural, urban, or suburban communities separately.

Moving to 2012, we looked at everything that we knew about a person and built individual-level predictive models, so that we knew how each person's individual set of characteristics made that person more or less likely to be someone with whom our candidate would have an engaging conversation through a volunteer.

Now, what we're starting to see is behavioral characteristics trumping demographic or even consumer data. You can put whiskey drinkers in your model, you can put cat owners in your model, but isn't it a lot more interesting to put in your model that fact that this person has an online profile on our website and this is their clickstream? Isn't it much more interesting to put into a model that this person is likely to consume media via TV, is likely to be a cord-cutter, is likely to be a social media trendsetter, is likely to view multiple channels, or to use both Facebook and media on TV?

That lets us have a really broad reach or really broad set of interested voters, rather than just creating an echo chamber where we're talking to the same voters across different platforms.

Gardner: So, over time, the analytics tools have gone from semi-blunt instruments to much more precise, and you're also able to better target what you think would be the right voter for you to get the right message out to.

One of the things you mentioned that struck me is the word "predictive." I suppose I think of campaigning as looking to influence people, and that polling then tries to predict what will happen as a result. Is there somewhat less daylight between these two than I am thinking, that being predictive and campaigning are much more closely associated, and how would that work?

Predictive modeling

Dyskant: When I think of predictive modeling, what I think of is predicting something that the campaign doesn't know. That may be something that will happen in the future or it may be something that already exists today, but that we don't have an observation for it.

In the case of the role of polling, what I really see about that is understanding what issues matter the most to voters and how it is that we can craft messages that resonate with those issues. When I think of predictive analytics, I think of how is it that we allocate our resources to persuade and activate voters.

Over the course of elections, what we've seen is an exponential trajectory in the amount of data that is considered by predictive models. Even more important than that is an exponential growth in the use cases for models. Today, every time a predictive model is used, it’s used in a million and one ways, whereas in 2012 it might have been used in 50, 20, or 100 decisions about each voter contact.

Gardner: It’s a fascinating use case to see how analytics and data can be brought to bear on the democratic process and to help you get messages out, probably in a way that's better received by the voter or the prospective voter, like in a retail or commercial environment. You don’t want to hear things that aren’t relevant to you, and when people do make an effort to provide you with information that's useful or that helps you make a decision, you benefit and you respect and even admire and enjoy it.

Dyskant: What I really want is for the voter experience to be as transparent and easy as possible, so that campaigns reach out to me around the same time that I'm seeking information about who I'm going to vote for in November. I know who I'm voting for in 2016, but in some local elections, I may not have made that decision yet. So, I want a steady stream of information to be reaching voters, as they're at those key decision points, with messaging that really is relevant to their lives.

I also want to listen to what voters tell me. If a voter has a conversation with a volunteer at the door, that should inform future communications. If somebody has told me that they're definitely voting for the candidate, then the next conversation should be different from someone who says, "I work in energy. I really want to know more about the Secretary’s energy policies."

Gardner: Just as when a salesperson engages with a sales process, they use customer relationship management (CRM), and that data is captured, analyzed, and shared. That becomes a much better process for both the buyer and the seller. It's the same thing in a campaign, right? The better information you have, the more likely you're going to be able to serve that user, that voter.

Dyskant: There definitely are parallels to marketing, and that’s why we at BlueLabs decided to found the company and work across industries. We work with Fortune 100 retail organizations that are interested in how, once someone buys one item, we can bring them back into the store to buy the follow-on item, or maybe to buy it through that same store’s online portal, and in how we can provide relevant messaging as users engage in complex processes online. All those things are driven from our lessons in politics.

Politics is fundamentally different from retail, though. It's a civic decision, rather than an individual-level decision. I always want to be mindful that I have a duty to voters to provide extremely relevant information to them, so that they can be engaged in the civic decision that they need to make.

Gardner: Suffice it to say that good quality comparison shopping is still good quality comparison decision-making.

Dyskant: Yes, I would agree with you.

Relevant and speedy

Gardner: Now that we've established how really relevant, important, and powerful this type of analysis can be in the context of the 2016 campaign, I'd like to learn more about how you go about getting that analysis and making it relevant and speedy across large variety of data sets and content sets. But first, let’s hear more about BlueLabs. Tell me about your company, how it started, why you started it, maybe a bit about yourself as well.

Dyskant: Of the four of us who started BlueLabs, some of us met in the 2008 elections and some of us met during the 2010 midterms working at the Democratic National Committee (DNC). Throughout that pre-2012 experience, we had the opportunity as practitioners to try a lot of things, sometimes just once or twice, sometimes things that we operationalized within those cycles.

Jumping forward to 2012, we had the opportunity to scale all that research and development, to say that we did this one thing that was a different way of building models, and it worked in this congressional race. We decided to make this three people’s full-time jobs and scale that up.

Moving past 2012, we got to build potentially one of the fastest-growing startups, one of the most data-driven organizations, and we knew that we built a special team. We wanted to continue working together with ourselves and the folks who we worked with and who made all this possible. We also wanted to apply the same types of techniques to other areas of social impact and other areas of commerce. This individual-level approach to identifying conversations is something that we found unique in the marketplace. We wanted to expand on that.
Increasingly, what we're working on is this segmentation-of-media problem. It's this idea that some people watch only TV, and you can't ignore a TV. It has lots of eyeballs. Some people watch only digital and some people consume a mix of media. How is it that you can build media plans that are aware of people's cross-channel media preferences and reach the right audience with their preferred means of communications?

Gardner: That’s fascinating. You start with the rigors and demands of a political campaign, but then you can apply it in so many ways, answering and anticipating the types of questions that more verticals, more sectors, and charitable organizations would want to be involved with. That’s very cool.

Let’s go back to the data science. You have this vast pool of data. You have a snappy analytics platform to work with. But one of the things that I am interested in is how you get more people, whether in your organization, in a campaign like the Hillary Clinton campaign, or at the DNC, to be able to utilize that data to get to the inferences and insights that you want.

What is it that you look for and what is it that you've been able to do in that form of getting more people able to query and utilize the data?

Dyskant: Data science happens when individuals have direct access to ask complex questions of a large, gnarly, but well-integrated data set. Say I have 30 terabytes of data across online contacts, offline contacts, and maybe a sample of clickstream data, and I want to ask things like: of all the people who went to my online platform, clicked password reset because they couldn't remember their password, and then never followed up with the e-mail, how many of them showed up at a retail location within the next five days? They tried to engage online, and it didn't work out for them. I want to know whether we're losing them, or are they showing up in person.

That type of question maybe would make it into a business-intelligence (BI) report a few months from now, but people who are thinking about what we do every day would say, "I wonder about this," turn it into a query, and say, "I think I found something. If we give these customers phone calls, maybe we can reset their passwords over the phone and re-engage them."
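A question like that stays simple when all of those events live in one well-integrated store. Here is a minimal sketch of the idea; the table names, event labels, and the SQLite backend are all invented for illustration (a production version would run against the warehouse itself):

```python
import sqlite3

# Toy stand-ins for two event streams in the warehouse.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE online_events (customer_id INT, event TEXT, event_date TEXT);
CREATE TABLE store_visits  (customer_id INT, visit_date TEXT);

-- Customer 1 clicked password reset, never re-engaged online, then visited a store.
INSERT INTO online_events VALUES (1, 'password_reset_click', '2016-09-01');
INSERT INTO store_visits  VALUES (1, '2016-09-03');
-- Customer 2 clicked reset but managed to log back in online.
INSERT INTO online_events VALUES (2, 'password_reset_click', '2016-09-01');
INSERT INTO online_events VALUES (2, 'login',                '2016-09-02');
""")

# Of everyone who clicked password reset and never logged in afterward,
# how many showed up at a retail location within the next five days?
row = conn.execute("""
SELECT COUNT(DISTINCT r.customer_id)
FROM online_events r
JOIN store_visits v
  ON v.customer_id = r.customer_id
 AND julianday(v.visit_date) - julianday(r.event_date) BETWEEN 0 AND 5
WHERE r.event = 'password_reset_click'
  AND NOT EXISTS (
      SELECT 1 FROM online_events e
      WHERE e.customer_id = r.customer_id
        AND e.event = 'login'
        AND e.event_date > r.event_date)
""").fetchone()
print(row[0])  # → 1
```

The point isn't the particular syntax; it's that a hypothesis born from a personal story becomes one query against one integrated data set.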

Human intensive

That's just one tiny micro-example, which is why data science is truly a human-intensive exercise. You get 50-100 people working at an enterprise solving problems like that, and what you ultimately get is a positive feedback loop of self-correcting systems. Every time there's a problem, somebody is thinking about how that problem is represented in the data, how to quantify it, and, if it’s significant enough, how the organization can improve in that one specific area.

All of that can be done with business logic, which is the interesting piece. You need very granular data that's accessible via query, and you need reasonably fast query times, because you can’t ask questions like that if you have to go get coffee every time you run a query.

Layering predictive modeling allows you to understand the opportunity for impact if you fix that problem. That one hypothesis with those users who cannot reset their passwords is that maybe those users aren't that engaged in the first place. You fix their password but it doesn’t move the needle.

The other hypothesis is that it's people who are actively trying to engage with your service and are unsuccessful because of this one very specific barrier. If you have a model of user engagement at an individual level, you can say that these are really high-value users that are having this problem, or maybe they aren’t. So you take data science, align it with really smart individual-level business analysis, and what you get is an organization that continues to improve without needing an executive-level decision for each one of those things.

Gardner: So a great deal of inquiry, experimentation, iterative improvement, and feedback loops can all come together very powerfully. I'm all for the data-scientist full-employment movement, but we need to do more than have people go through a data scientist to use and access these data sets and develop these feedback insights. What is it about SQL, natural language, or APIs? What is it that you like to see that allows more people to directly relate to and engage with these powerful data sets?

Dyskant: One of the things is the product management of data schemas. So whenever we build an analytics database for a large-scale organization I think a lot about an analyst who is 22, knows VLOOKUP, took some statistics classes in college, and has some personal stories about the industry that they're working in. They know, "My grandmother isn't a native English speaker, and this is how she would use this website."

So it's taking that hypothesis that’s driven from personal stories, and being able to, through a relatively simple query, translate that into a database query, and find out if that hypothesis proves true at scale.

Then, potentially take the results of that query, dump them into a statistical-analysis language, or use in-database analytics to answer that in a more robust way. What that means is that we favor very wide schemas, because I want someone to be able to write a three-line SQL statement, no joins, that answers a business question that I wouldn't have thought to put in a report. So the first layer is analyst-friendly schemas that are accessed via SQL.
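To make the wide-schema point concrete, here is a toy sketch; the `customer_wide` table and its columns are invented for illustration, and SQLite stands in for the analytics database. Because each row already carries the customer's attributes and model score side by side, the business question needs no joins:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# One wide, denormalized row per customer: raw attributes and model
# scores sit side by side, so no joins are needed at query time.
conn.executescript("""
CREATE TABLE customer_wide (
    customer_id INT, preferred_language TEXT,
    used_password_reset INT, engagement_score REAL);
INSERT INTO customer_wide VALUES
    (1, 'es', 1, 0.91),
    (2, 'en', 1, 0.12),
    (3, 'es', 0, 0.77);
""")

# An analyst's ad-hoc question fits in a few lines of SQL:
rows = conn.execute("""
SELECT preferred_language, AVG(engagement_score)
FROM customer_wide
WHERE used_password_reset = 1
GROUP BY preferred_language
ORDER BY preferred_language
""").fetchall()
print(rows)  # → [('en', 0.12), ('es', 0.91)]
```

The trade-off of a wide schema is storage redundancy, but the payoff is that a 22-year-old analyst who knows basic SQL can test a hypothesis without waiting on a data engineer to build the joins.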

The next layer is deep key performance indicators (KPIs). Once we step out of the analytics database, we drop into the wider organization that’s consuming data at a different level. I always want reporting to report on opportunity for impact, to report on whether we're reaching our most valuable customers, not on how many customers we're reaching.

"Are we reaching our most valuable customers?" is much more easily addressable; you just talk to different people. Whereas when you ask, "Are we reaching enough customers?" I don’t know how to find out. I can go over to the sales team and yell at them to work harder, but ultimately I want our reporting to facilitate smarter working, which means incorporating model scores and predictive analytics into our KPIs.
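That distinction, raw reach versus reach among the most valuable customers, can be expressed as a score-weighted KPI. A toy sketch, with all names and numbers invented for illustration:

```python
# Each contacted customer carries a predicted-value score from a model.
contacts = [
    {"customer_id": 1, "reached": True,  "value_score": 0.9},
    {"customer_id": 2, "reached": True,  "value_score": 0.2},
    {"customer_id": 3, "reached": False, "value_score": 0.8},
    {"customer_id": 4, "reached": False, "value_score": 0.1},
]

# Naive KPI: how many customers are we reaching?
raw_reach = sum(c["reached"] for c in contacts) / len(contacts)

# Model-aware KPI: what share of predicted value are we reaching?
total_value = sum(c["value_score"] for c in contacts)
reached_value = sum(c["value_score"] for c in contacts if c["reached"])
value_reach = reached_value / total_value

print(raw_reach)              # → 0.5
print(round(value_reach, 2))  # → 0.55
```

The two numbers can diverge sharply: a team could hit its raw-reach target while missing most of its high-value customers, which is exactly the blind spot a model score in the KPI is meant to expose.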

Getting to the core

Gardner: Let’s step back from the edge, where we engage the analysts, to the core, where we need to provide the ability for them to do what they want and which gets them those great results.

It seems to me that when you're dealing in a campaign cycle that is very spiky, you have a short period of time where there's a need for a tremendous amount of data, but that could quickly go down between cycles of an election, or in a retail environment, be very intensive leading up to a holiday season.

Do you therefore take advantage of the cloud models for your analytics that make a fit-for-purpose approach to data and analytics pay as you go? Tell us a little bit about your strategy for the data and the analytics engine.

Dyskant: All of our customers have a cyclical nature to them. I think that almost every business is cyclical, just some more than others. Horizontal scaling is incredibly important to us. It would be very difficult for us to do what we do without using a cloud model such as Amazon Web Services (AWS).

Also, one of the things that works well for us with HPE Vertica is the licensing model, where we can add additional performance at only the cost of hardware, or of hardware provisioned through the cloud. That allows us to scale up capacity during the busy season. We'll sometimes even scale back down during slower periods, so that we can have those 150 analysts asking their own questions about the areas of the program they're responsible for during busy cycles, and then, during less busy cycles, scale down the footprint of the operation.

Gardner: Is there anything else about the HPE Vertica OnDemand platform that benefits your particular need for analysis? I'm thinking about the scale and the rows. You must have so many variables when it comes to a retail situation, a commercial situation, where you're trying to really understand that consumer?

Dyskant: I do everything I can to avoid aggregation. I want my analysts to be looking at the data at the interaction-by-interaction level. If it’s a website, I want them to be looking at clickstream data. If it's a retail organization, I want them to be looking at point-of-sale data. In order to do that, we build data sets that are very frequently in the billions of rows. They're also very frequently incredibly wide, because we don't just want to know every transaction with this dollar amount. We want to know things like what the variables were, and where that store was located.

Getting back to the idea that we want our queries to be dead-simple, that means that we very frequently append additional columns on to our transaction tables. We’re okay that the table is big, because in a columnar model, we can pick out just the columns that we want for that particular query.
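Why a very wide table stays cheap to query in a columnar store can be illustrated with a minimal column-oriented layout. This is a deliberate simplification of the idea, not Vertica's actual storage format: each column lives in its own array, so a query naming two columns out of hundreds reads only those two arrays.

```python
# Column-oriented storage: one array per column, rather than one
# record per row. Appending another column never slows down queries
# that don't touch it.
table = {
    "amount":      [12.50, 3.99, 40.00, 7.25],
    "store_state": ["OH", "OH", "FL", "OH"],
    # ...in practice, hundreds more appended columns...
}

def column_scan(table, select, where_col, predicate):
    """Read only the columns the query names, never whole rows."""
    mask = [predicate(v) for v in table[where_col]]
    return [v for v, keep in zip(table[select], mask) if keep]

# Equivalent of: SELECT amount FROM sales WHERE store_state = 'OH'
oh_amounts = column_scan(table, "amount", "store_state", lambda s: s == "OH")
print(oh_amounts)
```

The scan touches `amount` and `store_state` only; every other appended column is simply never read, which is why wide transaction tables carry little query-time penalty in a columnar model.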

Then, moving into some of the in-database machine-learning algorithms allows us to perform more of the higher-order computation within the database, with less data shipping.

Gardner: We're almost out of time, but I wanted to do some predictive analysis ourselves. Thinking about the next election cycle, midterms, only two years away, what might change between now and then? We hear so much about machine learning, bots, and advanced algorithms. How do you predict, Erek, the way that big data will come to bear on the next election cycle?

Behavioral targeting

Dyskant: I think that a big piece of the next election will be around moving even more away from demographic targeting, toward even more behavioral targeting. How is it that we reach every voter based on what they're telling us about them and what matters to them, how that matters to them? That will increasingly drive our models.

To do that probably involves another 10X scale in the data, because that type of data is generally at the clickstream level, at the interaction-by-interaction level, incorporating things like Twitter feeds, which adds an additional layer of complexity and computational demand to the data.

Gardner: It almost sounds like you're shooting for sentiment analysis on an issue-by-issue basis, a very complex undertaking, but it could be very powerful.

Dyskant: I think that it's heading in that direction, yes.

Gardner: I am afraid we'll have to leave it there. We've been exploring how data analysis services startup BlueLabs in Washington, DC helps presidential campaigns better know and engage with potential voters. And we've learned how organizations are working smarter by leveraging innovative statistical methods and technologies, and in this case, looking at two-way voter engagement in entirely new ways -- in this and in future election cycles.
So, please join me in thanking our guest, Erek Dyskant, Co-Founder and Vice President of Impact at BlueLabs in Washington. Thank you, Erek.

Dyskant: Thank you.

Gardner: And a big thank you as well to our audience for joining us for this Hewlett Packard Enterprise Voice of the Customer digital transformation discussion.

I'm Dana Gardner, Principal Analyst at Interarbor Solutions, your host for this ongoing series of HPE-sponsored interviews. Thanks again for listening, and please come back next time.

Listen to the podcast. Find it on iTunes. Get the mobile app. Download the transcript. Sponsor: Hewlett Packard Enterprise.

Transcript of a discussion on how data analysis services startup BlueLabs in Washington helps presidential campaigns better know and engage with potential voters. Copyright Interarbor Solutions, LLC, 2005-2016. All rights reserved.
