
Webinar: Introducing Native Analytics

 

Thinking about introducing data views as a part of your product?

Native Analytics is a suite of tools that empower you to build and ship white-labeled analytics in your product, in record time.

Peter Nachbaur, our Director of Product, hosted this 30-minute webinar walking attendees through examples of Native Analytics, its features, and how to get it deployed.

Why Native Analytics?

A few reasons our customers choose to build Native Analytics into their products:

  • To quantify the impact and value of their product to their customers
  • To drive more sales by selling Native Analytics as a premium feature
  • To differentiate their product from competitors (or keep up with them!)
  • To deliver analytics features that customers have been requesting forever
  • To increase user engagement and get customers hooked

Do you have questions about getting started with Native Analytics? Get in touch!


TRANSCRIPT:

Peter:
All right, we’re going to get started with the webinar today about Native Analytics with Keen IO. We’re going to be talking about how to build and ship analytics features for your users, that’s what Native Analytics is all about. I’m Peter Nachbaur. I’m the Director of Product at Keen IO. Been with the company for a long time. I started out as the first engineer, then moved into an architecture role. Today I’m really focused on making use cases like Native Analytics as enjoyable as possible.

Today we’re going to kind of reiterate what Native Analytics is and make sure that you understand what we’re talking about. We’re going to go through some customer examples to highlight some exciting things the folks we work with have done, then walk through a more specific example to understand the actual features that support Native Analytics, and then we will get to questions at the end. Through the GoToWebinar interface you can put text questions in there and we’ll get to those at the end of the webinar.

As we start, what is interesting here is that for years Keen IO has built this really scalable and very flexible analytics platform. We can collect data from anywhere and we can help you query that data from anywhere. We’ve supported a number of use cases over the years, across a number of different industries, for people working with their data both internally and externally, but today’s focus is on the Native Analytics use case, which is all about putting your analytics directly in front of your own customers.

This is something that is a little bit different from the internal analytics use case that most folks are familiar with. There are different latency needs and different privacy needs, but there is also a really, really huge payoff. Rather than working with internal teams, you are able to directly increase the value of the product to your customers, and as part of that you are going to be controlling the experience natively, which increases the perception of your brand. You’re going to do all of this in a very quick, iterative fashion without having to worry about the operational overhead. Keen’s Native Analytics is all about letting you focus on the product design space without having to worry about the broader data infrastructure challenges that we take care of behind the scenes.

Bluecore is one of our most interesting examples. They’re a customer experience platform. They help manage marketing automation campaigns for some really, really major brands; retailers like Staples and Tommy Hilfiger. What they are able to do is have really fine-grained tracking of the products that they share. When Bluecore is working with you, they can say that they do something for you, but it is immediately more valuable when they can prove what it does for you. They can get these numbers in front of you and show you the open rate, show you the conversion rate. That’s actually just the beginning. You can then take that data that they show you and more intelligently control the campaigns that you are running and get more out of the automation tools. So it really goes beyond proving what the product does and actually makes the value that much higher.

Pixlee TurnTo is another interesting example. They work with major brands to get significant conversions from newer social media like Instagram. They help people track conversions that are otherwise hard to understand, and they are able to drive really significant traffic for these brands in a way that is really valuable to them. But again, being able to actually prove that this is happening is a huge part of it. Being able to understand which pieces of content are most valuable helps these brands come back again and again, to really get more value out of Pixlee TurnTo.

SketchUp is someone I started working with recently. I just got off a call with them and they do some really fascinating stuff. They are a marketplace for 3D sketches and designs for architects, for manufacturers, for homeowners who are looking to improve things. They allow these designers to understand who is viewing their models and sketches, to begin creating new sketches that will work even better with their audience, and to follow up with people who are engaging heavily with the content that is working well for them.

Finally, Mic.com is perhaps one of the most interesting examples on this list because they built this really fantastic tool to show the performance of the media that they are producing, both the long-form articles and the videos. The primary audience for one of the tools they built is not just the publishers they work with but actually the internal editorial team, who don’t have to worry about the complexities of an analytics platform or a visualization tool, or have to know a whole lot about how to dig into data. All they have to do is go into this system that was built for them, and they can consume the information that they need to do their job better, and that’s natively built into the internal product that the Mic team maintains.

So we are going to now peek under the hood a little bit at the pieces of functionality that we have built to enable this use case. As we do so, we are going to talk about an upcoming company you may have heard about called Walruspring. It’s a pretty crazy thing. These ascendant, intelligent walruses are coming out with really interesting fashion designs that fit the walrus body type, and this is all coming out of my mind, of course. I haven’t cloned any walruses and given them superpowers, but you can start to understand the features we’re talking about within the framework of a marketplace where clothing designers are coming in on one end and walrus consumers are coming in on the other end, trying to figure out how to match jackets to their tusks.

When a company wants to do Native Analytics they have some shared requirements, things they need to understand. They need to have fast and reliable query performance, because consumers or partners or publishers are not necessarily going to be patient in the way that internal analysts will be. The team that’s delivering this needs to easily manage it for each customer that comes on. In the past, when people would try to showcase data to their own users, it would be a very manual process: for each user they added, they would have to configure everything and really customize it, with a lot of overhead and headaches. A lot of the ways that they would end up doing it would be very janky; it wouldn’t match the brand and wouldn’t be a seamless part of the experience. So when people are looking for Native Analytics, they are looking to have all of these capabilities.

The cool thing is that this is a lot of functionality that we’ve designed and built on top of the core horizontal platform. So as you think about some of these things that we’ve built, I’d be interested to hear how you might want to use them in different use cases beyond just adding analytics into your own application. The main components here are being able to programmatically create accounts, to manage access, to introduce caching to get good performance, to embed the charts, and then to fully customize and control the visualizations.

So for Walruspring, when they are onboarding new designers they’ve got a flow where you sign in, you’re introduced to the platform, you get a sense of the features, and behind the scenes they start managing your account information. As part of that, they can make a simple API call to Keen that says, “Hey, we had this new designer sign up, here is the information about Mark, and we are going to create this project that is totally separate for him.” His data will be isolated, both for privacy and for performance concerns. So when they then move on to the next stage, picking out what access Mark should have into the data, it’s really going to be only the data that his consumers are seeing. Once you have those specific projects you are able to have custom API keys, and these keys can have a lot of functionality baked into them: filters, if you want to control the exploration and experience that they get, and different roles, so maybe within the Walruspring environment there is a basic tier for designers and a more premium tier where they have additional access. You can control that through a customer-specific key. You can say, “Well okay, they have now unlocked this additional potential.”
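
To make that concrete, here is a rough sketch of what creating one of those scoped keys could look like, assuming Keen’s Access Keys API; the project ID, master key, key name, and the designer_id property are all placeholders, not Walruspring’s (or Keen’s) actual values:

    const PROJECT_ID = 'YOUR_PROJECT_ID';     // placeholder
    const MASTER_KEY = 'YOUR_MASTER_KEY';     // placeholder

    // Hypothetical sketch: create a read-side key for the designer "Mark" whose
    // queries are automatically filtered down to his own data.
    fetch(`https://api.keen.io/3.0/projects/${PROJECT_ID}/keys`, {
      method: 'POST',
      headers: {
        'Authorization': MASTER_KEY,
        'Content-Type': 'application/json'
      },
      body: JSON.stringify({
        name: 'designer-mark-basic-tier',
        is_active: true,
        permitted: ['queries'],
        options: {
          queries: {
            filters: [{
              property_name: 'designer_id',
              operator: 'eq',
              property_value: 'mark'
            }]
          }
        }
      })
    })
      .then(res => res.json())
      .then(key => console.log('Created scoped key:', key));

A premium-tier key could look the same but with broader permissions or fewer filters, which is one way to model the basic and premium roles Peter describes.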

Additionally, on the flip side, as we’re collecting data as part of a platform, these custom keys can actually automatically add customer-specific data and enrichment. The application itself only worries about the core data model that is shared across all customers, and then, depending on the key that is used to collect data, one designer’s data or another designer’s data will be added. That then seamlessly flows into how you think about the data model. You’re able to have the information about the product, or about the behavior of viewing different pieces of clothing, as well as the specific customer properties that let you both do the internal analysis that you have done historically and also present that in a way that is meaningful to each individual designer.
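
A similarly hedged sketch of the write side, assuming the autofill option on Keen’s Access Keys API, so every event recorded with this key is automatically stamped with the designer’s properties (all names are again placeholders):

    const PROJECT_ID = 'YOUR_PROJECT_ID';     // placeholder
    const MASTER_KEY = 'YOUR_MASTER_KEY';     // placeholder

    // Hypothetical sketch: a write-only key that enriches every incoming event
    // with customer-specific properties, so the shared tracking code never has
    // to know which designer it is recording for.
    fetch(`https://api.keen.io/3.0/projects/${PROJECT_ID}/keys`, {
      method: 'POST',
      headers: { 'Authorization': MASTER_KEY, 'Content-Type': 'application/json' },
      body: JSON.stringify({
        name: 'designer-mark-write-key',
        is_active: true,
        permitted: ['writes'],
        options: {
          writes: {
            autofill: {
              designer_id: 'mark',          // placeholder customer properties
              designer_tier: 'basic'
            }
          }
        }
      })
    })
      .then(res => res.json())
      .then(key => console.log('Created write key:', key));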

So, the Walruspring team is at a stage where they’re creating projects on the fly as new designers sign up. They’re managing the privacy and performance of that. They’re collecting the data. Now they’re able to sit back and say, from a product design perspective, “What do our walruses need to make sure that they’re selling the most clothing?” (…) make sure they are understanding their audiences, so they can go through this exploratory phase and get to that “Eureka” moment of thinking, “Well, these are a few queries that will really help the designers get the most out of the Walruspring platform.”

Once those queries have been identified, they can be optimized. They can be cached. These high-value questions are then going to respond very, very quickly, in a very predictable fashion, so that the designers, when they are logging in, are not seeing really long load times or having a bad experience. They can jump around, and their own exploration is not slowed down by the challenges of the data infrastructure. So caching is a really huge tool. Once you’ve done that initial design-level exploration to figure out what people care about, you can then productionalize individual queries or full dashboards to make sure that experience is perfect.
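
As a sketch of what productionalizing one of those queries could look like, assuming Keen’s Saved and Cached Queries API; the query name, refresh rate, and event properties are illustrative only:

    const PROJECT_ID = 'YOUR_PROJECT_ID';     // placeholder
    const MASTER_KEY = 'YOUR_MASTER_KEY';     // placeholder

    // Hypothetical sketch: save a high-value query and cache it on a refresh
    // schedule so designer dashboards read a precomputed result instead of
    // running the analysis on every page load.
    fetch(`https://api.keen.io/3.0/projects/${PROJECT_ID}/queries/saved/purchases-by-product-7-days`, {
      method: 'PUT',
      headers: { 'Authorization': MASTER_KEY, 'Content-Type': 'application/json' },
      body: JSON.stringify({
        refresh_rate: 14400,                  // assumed: refresh roughly every 4 hours
        query: {
          analysis_type: 'count',
          event_collection: 'purchases',
          group_by: ['product.name'],
          timeframe: 'this_7_days'
        }
      })
    })
      .then(res => res.json())
      .then(saved => console.log('Cached query created:', saved));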

Once you’ve got those queries, it really becomes a matter of embedding the analytics, and this is really what makes it a native experience. We’re not talking about really heavyweight iframes or stuff that has someone else’s logo or brand; you have complete flexibility with these charts. Out of the box you’ve got a pretty decent design sense, a color palette that you think is pretty attractive, but the thing is that it has nothing to do with your branding. So as you move beyond that internal use case, where it’s really easy on the eyes but a little generic, you are able to control, in the JavaScript itself, what visualization library you’re using, the specific chart types, the size, and the colors. We’ve actually seen folks really go the extra mile here and produce visualizations that would be worthy of the front page of The New York Times. It’s really impressive stuff.
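
Here is a minimal embedding sketch using Keen’s JavaScript libraries, keen-analysis and keen-dataviz; the project ID, read key, collection name, container, and colors are placeholders standing in for a real customer’s values:

    import KeenAnalysis from 'keen-analysis';
    import KeenDataviz from 'keen-dataviz';

    // Placeholders: in practice, use the designer-specific project and scoped read key
    const client = new KeenAnalysis({
      projectId: 'DESIGNER_PROJECT_ID',
      readKey: 'DESIGNER_SCOPED_READ_KEY'
    });

    // Chart styling is fully controllable: type, title, colors, and so on
    const chart = new KeenDataviz({
      container: '#purchases-chart',          // an element in Walruspring's own UI
      type: 'area',
      title: 'Purchases over the last 30 days',
      colors: ['#00557a']                     // match the product's brand palette
    });

    client
      .query('count', {
        event_collection: 'purchases',
        timeframe: 'this_30_days',
        interval: 'daily'
      })
      .then(res => chart.render(res))
      .catch(err => chart.message(err.message));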

We have a link that we can share later with an example of all the different ways that the data can be visualized and the library can be used.

Beyond some of these core things about onboarding your customers, controlling access, and managing performance, you can actually then expose some of the more raw functionality and let the customers dig into CSV files, so they can pass those reports around and plug them into systems that they already have. We see use cases where people are regularly sending exports to their users so that they can wire those into existing systems, answer their own questions, and really get the maximum value from that. And again, because of the custom keys that we discussed earlier, you can rest assured that the data they have access to is only the data that belongs to them. They are then able to do complex modeling on their own, or you can, downstream, use the raw data that we’ve collected to start adding more predictive analytics or recognition functionality, to keep making sure that they get as much value out of the data as possible.
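
A hedged sketch of that export path, assuming Keen’s extraction analysis type and its email-delivery option; the collection, timeframe, and recipient address are placeholders:

    import KeenAnalysis from 'keen-analysis';

    // Placeholders: the customer's project and scoped read key
    const client = new KeenAnalysis({
      projectId: 'DESIGNER_PROJECT_ID',
      readKey: 'DESIGNER_SCOPED_READ_KEY'
    });

    // Hypothetical sketch: request an extraction of the designer's raw events.
    // With an email address supplied, the full result set is delivered to that
    // address rather than returned inline, which suits large CSV-style exports.
    client
      .query('extraction', {
        event_collection: 'purchases',
        timeframe: 'previous_30_days',
        email: 'mark@example.com'             // placeholder recipient
      })
      .then(res => console.log('Extraction requested:', res));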

A final aside is that, along this theme of natively and seamlessly matching your brand, you can make sure that the API endpoint itself takes advantage of a custom CNAME so that, in this case, you would be able to send data to api.walruspring.com and there would just be no external way to know that this is powered by Keen. We’re thrilled to be able to help folks with this use case in a white-label capacity.
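
To make the white-labeling concrete, here is a hedged sketch of recording an event straight against that custom domain; api.walruspring.com is of course hypothetical, and the path simply mirrors Keen’s standard events endpoint:

    const PROJECT_ID = 'YOUR_PROJECT_ID';     // placeholder
    const WRITE_KEY = 'DESIGNER_WRITE_KEY';   // placeholder

    // Hypothetical sketch: the request goes to api.walruspring.com, a CNAME
    // pointing at Keen, so nothing visible in the network traffic names the vendor.
    fetch(`https://api.walruspring.com/3.0/projects/${PROJECT_ID}/events/purchases`, {
      method: 'POST',
      headers: { 'Authorization': WRITE_KEY, 'Content-Type': 'application/json' },
      body: JSON.stringify({
        product: { id: 'pjs-042', name: 'Fuzzy Star Pajamas', price: 49 }   // placeholder event
      })
    })
      .then(res => res.json())
      .then(ack => console.log('Event recorded:', ack));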

We’ll briefly just talk over this little demo site that we’ve got that shows the Walruspring experience. This first page is what the consumer walruses would be seeing, and they’re able to take a look at all this ridiculously creative clothing that the walrus designers have put together. Of these, I’m really feeling the pajamas right now. It’s a little chilly in the office and I wish I had some nice fuzzy stars.

To reiterate, this is the same data that you are probably already collecting today: you’re thinking about the flow, about what the product ID is, the product name, all that sort of core information; and as the consumer progresses through the site, different events are being sent and all that information is being tracked. As they continue down the flow and check out, more data is collected. When they purchase, that’s a really big piece of data and an important part of the flow.
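
For reference, a minimal sketch of what that tracking could look like with the keen-tracking library; the collection and property names here are illustrative, not Walruspring’s actual schema:

    import KeenTracking from 'keen-tracking';

    // Placeholders: in practice this is the designer-specific write key created
    // during onboarding, so customer properties are autofilled on every event.
    const client = new KeenTracking({
      projectId: 'DESIGNER_PROJECT_ID',
      writeKey: 'DESIGNER_WRITE_KEY'
    });

    // Record a product view as the walrus browses the catalog
    client.recordEvent('product_views', {
      product: { id: 'pjs-042', name: 'Fuzzy Star Pajamas', price: 49 }
    });

    // Later in the flow, record the purchase itself
    client.recordEvent('purchases', {
      product: { id: 'pjs-042', name: 'Fuzzy Star Pajamas', price: 49 },
      quantity: 1
    });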

As far as the consumer is concerned, that is just the experience. The analytics is not detracting from it. They’ve gotten these awesome, super comfy button-up pajamas for a really reasonable price, and they can go on with their lives. But then, internal to the product experience, a Walruspring designer is going to be able to log in and have this analytics tab, which shows them what their conversion rate is, what their repeat purchase rate is, and really helps them understand, “Okay, for the different products that I’m offering I’m seeing heavy conversion in this area, but pajamas are somehow really popular and that’s a bit unexpected, so we can actually look into why that might be.” As a design team you think, “Okay, well we should make a couple different types of pajamas and really capitalize on that.” That’s the kind of decision making the customers can feel really good about. So not only are they getting value from Walruspring because it connects them to these consumers, and they can put a product up on the website and get sales from it, but they can actually then use this additional information to drive even more sales.

So, to reiterate, Native Analytics is all about full control while still having a very simple integration, an integration where you don’t have to worry about the challenges of data collection or the challenge of scaling. You can focus on the really fascinating questions of what makes your customers happier, what makes your customers smarter. These are often very aligned with the metrics that help you grow your product internally, as a product team or as a marketing team. You’re already looking at some of this information in aggregate, but you can slide around a little bit and put yourself in their shoes, or wear their tusks for a day, to understand, as they’re doing their job and using your product, what will help them get the most out of it and what is a natural progression. They’re logging in and they’re going to check on the functionality that you’ve already got, whether that’s marketing automation, or social media conversions, or walrus clothing. What’s going to be a natural part of their flow? Is that a full-on dashboard? Is that just small pieces of data that are wired directly into other parts of their experience? It’s a pretty open-ended question. It’s something that I encourage you all to think about, and it gives you a lot of flexibility to iterate quickly.

So, I’d be happy to take some questions, both about the specific features and details that we talked about and about the use cases that we’ve seen.

Author’s note: apologies for the echoes and inaudible moments in the recording during the Q&A, and the impact this had on our transcript!

Sarah-Jane:
All right. This is Sarah-Jane. We’ve got a couple of questions in through our question interface and we’re going to leave that open a little while, so if you have any questions feel free to drop those in. Just give me one second while I bring those up.

All right. So the first question is what other charting libraries can it support and can I use my own visualizations?

Peter:
Yes, absolutely. So, on that previous slide where we were looking at the visualization code, you can very easily toggle between charting libraries: Google Charts, C3, D3. Those are all powered by our API. The raw JSON that is returned can be visualized by any system that you’re more comfortable with.
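
For context, the raw response for a simple daily-interval count looks roughly like the following (the numbers are made up), and that structure can be handed to D3, C3, Google Charts, or any in-house charting code:

    // Roughly the shape of an interval query result returned by the API
    const exampleResponse = {
      result: [
        { timeframe: { start: '2017-05-01T00:00:00.000Z', end: '2017-05-02T00:00:00.000Z' }, value: 42 },
        { timeframe: { start: '2017-05-02T00:00:00.000Z', end: '2017-05-03T00:00:00.000Z' }, value: 57 }
      ]
    };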

Sarah-Jane:
Great. Another question here. Can I update the metrics that I show my customers without coding once Keen is implemented?

Peter:
Yes and no. There’s a really cool way to do that today, and we’ve actually got some products available to make that even easier. We have saved queries, where, as you’re doing exploration of the data and thinking about what’s valuable in terms of that “Eureka” moment I was mentioning, you can save the actual query. You can then tie that to a custom API key and say, “This key for Tommy Hilfiger only has access to these saved queries.” Maybe you have a Tommy Hilfiger Average Purchase query. If you then make a chart for that saved query and put that metric into your product, you can go back later and, using our interface, change the saved query, and everywhere you have it embedded just naturally picks up the changes that you’ve made.
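
A hedged sketch of what that looks like on the product side: the embedded code references the saved query by name only, so editing the query definition in Keen’s interface changes what the chart shows without touching the product code (the query name and key are hypothetical):

    const PROJECT_ID = 'YOUR_PROJECT_ID';                   // placeholder
    const CUSTOMER_READ_KEY = 'TOMMY_HILFIGER_SCOPED_KEY';  // placeholder

    // Hypothetical sketch: fetch the latest result of a saved query by name.
    fetch(`https://api.keen.io/3.0/projects/${PROJECT_ID}/queries/saved/average-purchase/result`, {
      headers: { 'Authorization': CUSTOMER_READ_KEY }
    })
      .then(res => res.json())
      .then(data => console.log('Average purchase:', data.result));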

Sarah-Jane:
Great. Got a question here. Is it possible to track location natively with Keen? Would you just send an event with the locational information?

Peter:
Oh yeah. You have a handful of different tools, and you can put those in as add-ons (…) these custom API keys, and geolocation-wise (…) it lets you do mapping views of the data to figure out, by state, by zip code, by country, where people are buying all this clothing. You can take (…) patterns, you can take an IP address and expand that out into geographic information. We can give you latitude and longitude so you can really map that info. We have some stuff in the background there that will make the geographic side of things even more powerful.
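
For anyone curious what that enrichment looks like on an event, here is a hedged sketch using the keen:ip_to_geo add-on; the collection, properties, and IP address are placeholders:

    import KeenTracking from 'keen-tracking';

    const client = new KeenTracking({
      projectId: 'DESIGNER_PROJECT_ID',       // placeholders
      writeKey: 'DESIGNER_WRITE_KEY'
    });

    // Hypothetical sketch: the keen:ip_to_geo add-on expands an IP address into
    // geographic properties (country, region, city, coordinates) on the stored event.
    client.recordEvent('purchases', {
      ip_address: '203.0.113.42',             // in practice, the buyer's IP
      product: { id: 'pjs-042', name: 'Fuzzy Star Pajamas' },
      keen: {
        addons: [{
          name: 'keen:ip_to_geo',
          input: { ip: 'ip_address' },
          output: 'ip_geo_info'
        }]
      }
    });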

Sarah-Jane:
Great. One more question here. What is the current timeline to get Native Analytics implemented?

Peter:
Oh, that’s a great question. So, Bluecore, which was one of the examples we talked about earlier, is probably the most impressive example. They were able to get that dashboard we looked at briefly up and running in less than 24 hours, and it was actually just built by an internal team that they had. I know that the Mic.com team was able to get that information to their editors as part of their workflow in a matter of days. Generally the answer that I give is tied to an individual company’s development workflow. So, if your team is pushing code to production in a matter of hours, then you can absolutely start making these changes very quickly. (…)

Sarah-Jane:
Great. I believe we’re going to be answering questions for a little bit longer, so if there are any more… We’re going to be sharing the recording of this webinar, so if you have any further questions, go through one of the channels listed on this final slide. For our Slack chat, go to keen.chat and join us there with any questions you may have. Maybe check out our GitHub repos and, of course, tweet us any time or reach us through our website with any further questions. Any closing thoughts, Peter? I will hand it over to you… Oh, we have some more questions. One second, here… All right, what’s the pricing model? Is it by license or SaaS subscription?

Peter:
Yeah, that’s a good question. So, we look to understand your usage. Unlike products like a marketing automation platform, for example, where you’d be sending out a very discrete, understandable number of emails, with Keen the volume of data that you’re tracking can get very large, and the impact on usage varies dramatically depending on whether you’re doing a couple of queries a minute, or an hour, or a day for that internal use case that some of you are familiar with. On the Native Analytics side, what we like to understand to get a sense of usage is how many customers you’re looking to serve today and the number of queries you’re going to be looking to optimize and keep running so that they get that seamless cached experience. Because our core offering is a very flexible and very customizable API, we’ll match your product needs to let you build exactly the analytics experience that you want. There isn’t really a one-size-fits-all: if you’re going to be querying across hundreds of millions of events and you’re going to be doing that thousands of times an hour, then it’s going to be more expensive than if you only have a few million events and fewer customers asking questions across that data.

Sarah-Jane:
Great. Another question is, “Do you offer machine learning algos on the data, and if not, how can one run a custom routine on your data?”

Peter:
So, we don’t have API-level functionality around running machine learning today, although it is definitely part of our design and roadmap. What we do to meet that use case is have our solutions architects sit down, take a look at the raw data that’s available, and help put together an approach that makes sense. Generally what folks want to be able to do is build offline classifiers and then make online checks against them. We’ve got a handful of folks on the team, myself included, who have done that a number of times. What they ended up building was largely downstream of the data collection and analysis and the Native Analytics features that we were talking about today.

Sarah-Jane:
All right. So that’s pretty much it. No more questions. Thanks so much Peter and thanks to everyone on the webinar today. As I mentioned, we’re sharing the recording and feel free to connect with us on our community channels. Have a great day.

Peter:
Looking forward to hearing more and let me know how I can help.