Roles & Responsibilities of a Successful Analytics Team

After years of working with hundreds of companies, I’ve learned a thing or two about what makes analytics projects successful. I’ve also watched many projects fail.

The most common reason for failure might surprise you. It’s not a lack of data expertise or an integration mistake. It’s simply that the organization forgot to make it anyone’s responsibility to make use of their data. Far too many companies collect months and months of interesting data only to have it sitting in the corner collecting dust. That’s what compelled me to write this guide and hopefully inspire you to add a little bit of structure to your next project. Your analytics implementation should be a wellspring of business value that you tap again and again.

If you can find at least one person who is excited to take on each of the roles I’ve outlined below, I promise your project will be 1000x* more likely to succeed. Not all of these roles need to be fulltime, and one person could certainly play more than one part (or even all parts)! More detailed role descriptions & tips follow below.

*not scientifically proven!

Photo credit to the wonderful folks at #WOCinTechChat

Roles & Their Outputs

Role                  Deliverables
Project Lead          Project plan with scope & timeline
Data Architect        Data model and queries
Product Developer     Implementation of tracking
Analyst(s)            Generation of new business questions
Reporting Developer   Reports for your business

Project Lead

A single individual should be responsible for the delivery of your initial analytics implementation. You probably already know what an effective project manager does:

  • Identifies the project’s stakeholders and figures out what they need. They ask, “What are the specific business questions we want to answer?”
  • Sets & communicates the goals, scope, and timeline for the project to everyone involved.
  • Manages dependencies and identifies roadblocks to delivery.
  • Ensures that the project is actually delivered and achieves its goals (e.g. that the data answers the questions that are important to your business).
  • Makes sure everyone involved, from the engineers to the product managers, is in sync and understands what will (and won’t) be delivered as a part of the project. This part is important as it’s also very common for people to under- and over-estimate what your data will be able to do.

Tips for the Project Lead:

  • You’ll get the most immediate payback on your analytics project if you focus on questions whose answers will lead to a direct change in your product or business strategy. An example question might be: Are customers from our new campaigns converting to paid (and should we keep investing in this channel)? Or: We’re thinking about nixing this feature - can you find out if any paying customers use it?
  • Keep the scope of your project as small as possible. Start with tracking only a few key actions that are important to your business so you can quickly answer the most pressing business questions (e.g. what is our retention like for customers who use this feature?). Your immediate, helpful results will get your organization hooked and they’ll soon come up with more questions that you can tackle in the next sprint. In other words: analytics work should be done in an agile fashion, becoming a little bit more in-depth with each iteration. If the scope of your analytics project gets too big (e.g. taking two weeks of engineer time), you risk the entire thing getting put on hold in favor of a pressing product feature or some other business priority.

Data Architect

This is a fancy title but it just means that your team needs someone somewhat technical to create your data model and understand how querying the data works. The data model can be as simple as an email that lists what key actions you’ll track and what properties you’ll include on them. The model helps define and communicate the scope of your project. The data architect helps the team assess which business questions are possible to answer versus those that are not. This person is typically not a data science PhD. This role is most commonly filled by one of your app developers or someone adept at modeling things in spreadsheets.
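For example, a first-pass model might be nothing more than a map from key actions to the properties you’ll track on them (all of the event and property names here are invented for illustration):

```javascript
// A hypothetical data model: each key action your team agreed to
// track, mapped to the properties it should carry.
var dataModel = {
  signup: ['user_id', 'plan', 'referrer'],
  purchase: ['user_id', 'amount_usd', 'product_id'],
  upload: ['user_id', 'file_type', 'size_bytes']
};

// The model doubles as lightweight documentation: given an event
// name, it answers "what did we agree to track on this?"
function propertiesFor(eventName) {
  return dataModel[eventName] || [];
}
```

Something this small is still enough to scope the project and to reason about which business questions the data can answer.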

Tips for the Data Architect:

  • Take the time to have your data model reviewed by someone who has built one before using the same toolset. For example, if you are using Keen, talk to a developer who has used Keen before. You can also ask your analytics provider to review your data model with you. No matter what toolset you are using, there will be tradeoffs and parts of the solution that don’t work quite like you’d expect. Save yourself some time and talk through your plans with someone who has enough experience to see around corners for you.
  • When modeling the data, use the vocabulary of the users and the business instead of the language of the application architecture. For example, you wouldn’t track an event like “state change” since this doesn’t have any meaning to the user of the product, and won’t have any meaning to other people in your company. If you keep the language of your data business-oriented, it helps other people in the organization understand how to query and make use of it.
  • Have at least one other person in your organization review your data model and confirm it makes sense to them too. You might find that some labels that make sense to you are unclear to others. For example, something like “uuid” might mean something different to different folks in your organization.
  • Don’t reinvent the wheel. If you’re building on Keen, please check out our Data Modeling Guide and Best Practices to learn some tricks and avoid the common pitfalls. If you’re starting a business-critical implementation, pair with one of our data architects.

Product Developer

At least one of your product developers has some pretty straightforward responsibilities at the beginning of your project: get tracking set up. They will add bits of code here and there so that every time a login, purchase, upload, or other action happens, you are able to collect data. If you have many sources of events, say your mobile app and your website, then this work might be done by multiple developers (e.g. a web dev and a mobile dev). In a small organization, the developer who sets up tracking often also plays the role of Data Architect. In larger groups, your developers should work closely with a Data Architect to make sure they model data optimally and that things are tracked and labeled in a consistent format (e.g. a value like “23cv42343jk88” always sent under the same key name everywhere). Setting up tracking is a relatively straightforward process as most analytics services provide drop-in client libraries to greatly simplify the effort, but your team still has to do the work of deciding what to track and how to name things.

Tips for the developer implementing tracking:

  • Make sure you’re implementing tracking according to a data model that makes sense to your organization. If your team doesn’t have a data architect, take on that role yourself and sketch out a model before you get started. This will clarify your thinking and make it easier to communicate your plan with others.
  • Implement separate repositories, with separate keys, for dev, test, and prod, so that you don’t get testing and production data mixed up.
  • Once you have tracking set up in development, get someone to peer review the incoming data before you go live with it. Your analytics implementation should go through a QA process just like any other feature of your product. It’s easy to make mistakes like sending numbers as strings, naming something in a confusing way, improperly formatting your JSON, or having typos in your labels.
  • Here is an inventory of tracking SDKs that work with Keen IO.
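That QA pass can be as lightweight as a script that sanity-checks a sample event before you go live. A hypothetical example (the property names and checks are made up for illustration):

```javascript
// A minimal, hypothetical sanity check for one incoming event:
// catches common mistakes like numbers sent as strings or
// missing identifiers.
function validateEvent(event) {
  var errors = [];
  if (typeof event.amount_usd !== 'number') {
    errors.push('amount_usd should be a number, not a string');
  }
  if (!event.user_id) {
    errors.push('user_id is missing');
  }
  return errors;
}
```

Running a check like this against a handful of dev events is a cheap way to catch type and naming mistakes before they pollute production data.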


Analyst(s)

You’re going to be collecting lots of interesting data, but it won’t be very valuable unless someone uses it! You’ll need at least one person on your team who is very curious about what that data might reveal. I call these people analysts. Very often the analyst is a developer, product manager, or someone on the product or marketing team. Not only will these folks be dying to see the results of the business questions they set out to answer, they will be continuously thinking up new questions. Analysts love digging into the data you collected in the first phase of the project and will have a lot of ideas of what new things you can collect in the next phase. In other words, you need people on your team who enjoy the practice of analytics. Don’t worry, there are lots of people out there who do :). Having a technical background will be a huge asset for this person as they will quickly learn how to build queries to get the results they need. This role is absolutely critical for your success because if you don’t have people who want to learn from your data, you won’t be able to extract any value from it.

Tips for the Analyst:

  • The results of your analysis may be super meaningful and obvious to you, but they won’t be to anyone else. That’s because you know what questions you were looking to answer when you set out to do the analysis in the first place. You know exactly what data the dataset includes and excludes. Plus you wrote the queries that ultimately produced the visualization or report you’re looking at. That’s a lot of context that you need to share in order for other people to understand what the numbers mean.
  • When sharing results of your analysis, write out the conclusions you are drawing from the data and what business actions you think should be taken as a result of the analysis (e.g. our conversion decreased with this latest release and we should roll back). Not only do other folks perhaps not have the context to interpret the data correctly, they probably don’t find it as fascinating as you do and may not have the time to derive meaning from the data.
  • Not to hammer on it too much, but communication skills are so important for this role. Around half of the analyst’s time needs to be spent on communications. It takes quite a bit of time to explain and summarize the results and conclusions you’ll draw from your data. If the results of your analysis are sleeping in people’s inboxes, you’re not doing it right. Sometimes you may be the only person in the organization who knows about a problem or opportunity, and it’s your responsibility to make sure the organization is responding appropriately to what you’ve learned. Sometimes you gotta be the squeaky wheel. Don’t underestimate the value of your work.
  • If analysis work is something you repeatedly run out of time to do, try getting it added to your official job description and dedicating a certain number of hours per week or per month to it. Block it off on your calendar.

Reporting Developer

This role is optional, but you may want to build out some reports that can be easily accessed across the team or by other stakeholders. You can greatly increase the utility of your data by incorporating it more closely into your business workflows, rather than leaving it trapped in a database where people have to remember to log in to check it. A front-end developer will be able to turn queries into reports for product managers and others across the business.

Tips for the developer creating reports:

  • Get the most value out of your work by making sure the consumers of your reports understand the data. One way to do this is to ask them “When you see that conversion is 5.2%, what does that mean to you? How would you guess it is calculated?”.
  • Another way to increase report literacy is to put a guide (e.g. tooltip or footnote) that explains where the data comes from and how it is calculated. For example, does the data include users from the website and your mobile app, or just one of those? Does it include test users and internal users from your company, or have those been filtered out?
  • Have fun! Watching someone’s eyes light up when they discover something new is the best part of the whole analytics project, and you’re often the one helping bring that realization to life.

Starting an important new implementation and need some help? Our team of data architects has done this lots of times and can answer any questions you have! Just drop us a note using Keen’s support channels.

Michelle Wetzler

Chief Data Scientist, Human.

High Fives at API Strategy & Practice

Greetings from Austin, Texas

Next week, I’m super excited that the API Strategy & Practice Conference is coming to Austin. I could write a novel-sized blog post about how much I love Austin, but I’ll just settle for saying: It’s a pretty great city, but I might be slightly biased since I’m a local and Austin is home.

Austin has a spectacular developer community. I started participating in the Austin developer community while studying Computer Science at The University of Texas at Austin. It has a small town feel for such a large and growing city. I would not be where I am today without it. I’m glad we can share a bit of it with the APIStrat community.

The API Strategy & Practice Conference and its organizers have a long history in the space and have become leaders in the field of APIs. From business models to API design and evangelism, there’s a wealth of knowledge being shared and collaborated on throughout the conference. Before I joined Keen IO, I remember watching a fireside chat from APIStrat 2014 with one of Keen’s founders, Kyle Wild. For me, the stage at APIStrat had a strong sense of authenticity.

A few of my favorites

I’d love to be a resource to those visiting for the conference, so here’s a few of my favorites:

Please let me know if you need anything else or other local favorites on Twitter or via here.

How to find me

If you’ll be around the conference and want to meet up, feel free to reach out to me if you want to chat about Keen IO, event data, analytics, community building, hackathons, or anything else. I will also have some sweet limited edition Keen IO Field Notes and stickers to share. I’d love to meet up!

You can also find me talking about “Developer Love” during the Developer Experience breakout at 2:45pm on Thursday, 19th and on the API Consumption Panel with a bunch of other great people at 10:00am on Friday, 20th. A bunch of developer evangelists and advocates, including myself, will also be meeting up on Thursday from 6:30-9pm at Waller Creek Pub House for a meetup organized by the Austin Developer Evangelists group. You can RSVP and get more info here. Feel free to join!

I hope to see y'all in Austin soon! 😃

Taylor Barnett

developer, community builder, and huge fan of tacos

Analytics in 60 Seconds

I was talking with a co-worker, Eric, about how quickly we might be able to get totally custom analytics onto a page. 

He told me he thought we could do it in under a minute. I know that I can’t type fast enough to actually get it done in that amount of time, but Eric was pretty confident… Challenge accepted!

Anyone else up for the challenge? Get a metric on a page in less than 60 seconds or challenge yourself by trying this with another API. We used Screeny for the screen capture and Nice Timer for the countdown, but there are plenty of tools out there if you don’t like those.

Definitely send us your videos on Slack or share on Twitter! #analyticsin60seconds

Justin Johnson

community guy, hacker, music nut. i like to help people build stuff.

Introducing Saved and Cached Queries

We’ve been working hard to bring you some new features in the Data Explorer. We’re super excited to announce TWO new features today: Saved Queries and Cached Queries.

Check out these new features!

Saved Queries

Saved queries are a super user-friendly way to revisit your favorite metrics. Rather than entering the same query parameters over and over again, queries can be easily saved, edited, and shared with your teammates using the Data Explorer or Keen’s API, so everyone can keep up to date with the most important metrics.

Saved queries are fully supported by the Keen API, meaning you can access your saved queries programmatically from anything you build on top of Keen’s platform. We’ve also streamlined our UI to make it easier for you to customize and revisit your queries again and again.

Cached Queries

If you need to access a query’s results instantly, you can choose to cache up to three saved queries right from the Data Explorer. This means that the query is automatically run on our side at a specific interval that you configure so you can get your results much faster whenever you need them.

This is really useful for important dashboard queries that need to be run often or for really large queries that take a long time to run. It’s also a great way to make sure your customer-facing dashboards load ultra-fast every time. We did the math, and cached queries load 100x faster than uncached queries. See for yourself! To cache your important queries just click the “Enable Caching” checkbox and set your refresh rate.
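Conceptually, a cached query is just a query result that’s refreshed on a timer so reads never have to wait on a recompute. A rough sketch of the idea (this is an illustration, not Keen’s actual implementation):

```javascript
// Conceptual sketch of query caching: re-run the query on a
// configured interval, and serve the stored result instantly.
function makeCachedQuery(runQuery, refreshMs) {
  var cached = runQuery();               // warm the cache up front
  var timer = setInterval(function () {  // refresh at your configured rate
    cached = runQuery();
  }, refreshMs);
  return {
    read: function () { return cached; }, // instant: no recompute on read
    stop: function () { clearInterval(timer); }
  };
}
```

This is why caching pays off most for dashboards: the expensive work happens in the background on a schedule, while every page load gets the precomputed answer.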

The Explorer will also let you know when the query was last updated:

To start playing with Saved and Cached queries, log in to your Keen account or sign up to try it out for free! Need more than three cached queries? Just reach out to us to learn more about our Keen Pro Plan.

We hope these features improve your experience using Keen! If you have any questions or feedback, please reach out to us anytime or join us on Slack!

Happy Exploring!

Joanne Cheng

mountain-person and computer nerd

Announcing a Particle Integration

Today we’re excited to announce that we have an integration with Particle!

We’re big fans (and users) of Particle’s IoT platform. Just like Keen IO replaces your analytics backend, or Stripe facilitates your billing, Particle does all the heavy lifting for your connected devices infrastructure.

Don’t you love how APIs are making it easier than ever to build all the things? Us too =)


So what does this integration mean to you?

Simply put, it allows developers to more easily and quickly send data from any Particle device to Keen IO. Once we have it, as always, you can run analysis and visualize the results via API or our Explorer.

We believe that analytics is an integral ‘part’ of every business and want to make sure that every builder has access to scalable and easy-to-implement tools. Collaborating with the fine people and products at Particle takes us one step closer to accomplishing just that.

Join us on Hackster and tell us what you build!

Justin Johnson

community guy, hacker, music nut. i like to help people build stuff.

Introducing the Community Projects Page

A few weeks ago one of my teammates asked if I had any examples of projects using a specific piece of hardware. Between hanging out on Keen Community Slack and Twitter, I had seen a project using it and sent my teammate a link. Although a lot of the Keen team is heavily involved in our own community, some of us see a lot more than others, so I decided to start a page of all the projects I’ve seen that use Keen or help others use it (like a helper library), to help out the rest of the team.

While making it, I realized how awesome all these projects were. I wanted to share it with everyone as a helpful resource, not just the Keen team.

So I made this file in our Community team repository: Community Projects

Currently, there are 20+ projects on the page. I hope you find something interesting, helpful, or cool. I know I missed some projects. This is where I need your help.

the help kitty

Please send pull requests to add your projects, your coworkers’ projects, your friends’ projects, and any other projects you’ve seen using Keen. Blog posts are great but so is just sharing the source code in a repository. Even the smallest projects can help other community members, so don’t let the coolness or size of the project deter you from pull requesting it.

We will be sharing the page with new and current users, so it is a great way to get some exposure to your projects.

A couple of the projects I thought were especially helpful were:

Making a dashboard with Keen IO and SendGrid’s Event Webhook data, by Heitor Tashiro Sergent

Example SendGrid email dashboard

Cohort Visualization Tool built with Meteor and Keen IO, by Ry Walker

Example chart

High five!

Also, I would like to give a shout out to everyone who has worked on these projects. Thanks for making the Keen IO community awesome!

Can’t wait for your pull request! 😃

Taylor Barnett

developer, community builder, and huge fan of tacos

No Dogma Podcast: A Different Way of Doing Business

A very nice person named Bryan recently interviewed a few Keenies for his podcast, and the ensuing conversation turned out to be pretty darn interesting. So, we thought we’d share it here on the blog for your listening pleasure.

Lisa & Dan

Bryan spoke with Lisa and Dan on a variety of topics, providing a glimpse into Keen as a company, a culture, and a collection of humans – e.g., the experiment that is our organizational/operational model (is it holacracy? who knows!), our “less traditional” business strategies, and some triumphs, trials, and tribulations from along the way.


Bryan’s podcast lives alongside the No Dogma Blog, which collectively showcase “discussions [and] ideas on software development.” You can follow him on Github or listen to other podcasts about indie game development, machine learning, and the like.

If you have any questions or comments about the conversation within the interview, we’d love to hear ‘em - fire away!

We’re totally open to discussing how we run our company and the unique challenges/opportunities that accompany a “different” approach to building a business.

And we love to learn from others, too! So, if you’re doing the self-organized, management-free, quasi-holocracy thing, and all of this sounds eerily familiar, let’s compare notes… =)

Leave a comment below, Tweet at us, visit our FB page, chat us up on Slack, or swing by our SF HQ for a high-five and a sparkling water.

Tim Falls

community crafter and scarf enthusiast

Data-Driven Product Design

Hi, I’m Maggie, a Data Engineer at Keen IO. I presented this talk on “Data-Driven Product Design” at The Industry Product Conference back in September. It was a great event that brought together product leaders from around the world.

The talk provides a breakdown of the type of data and knowledge you can collect about your users, some sample analyses, and information on what makes a good metric.

To really drive those points home, I’ve included real-world examples of how companies have used data to drive product decisions and pointers on common pitfalls to avoid. Hope you enjoy! Feel free to reach out to me with questions on twitter or ping us on Slack!

Maggie Jan

ALL THE (internet of) THINGS!



Keen IO was born in 2012. From my perspective (call me a youngster), it seems as though we’ve kinda grown up with the Internet of Things (IoT), maturing through our toddler years and into mutual adolescence – I think that’s what four real-life years equates to in “startup years” …?

But did you know that “IoT” has been a thing since 1999? Yep, back when RFID was an emerging technology considered to be a prerequisite for the Internet of Things as we know it today…


All this fun and interesting history has led us to exciting times in 2015. And it has us especially stoked for our new partnership with SMARTRAC - the leading global company in the (still emerging) field of RFID technology - as part of their SMART COSMOS program. This platform aims to bring together “system integrators, app developers, and users of RFID and NFC technologies” in a tight-knit, collaborative ecosystem.


But wait, there’s more!

The fine folks at SMARTRAC have also launched a few other bits of fun, by which we think our community may be intrigued:

  1. Lessons - “an educational training portal” for those interested in learning the technical aspects of a connected RFID-centric IoT ecosystem
  2. A coding competition for software developers - read all about it! They are awarding various prizes that add up to lots of $$. And, in case our fellow data dorks were wondering, yes, there is a prize category ($5k USD) for the Best Analytics Integration =)

IoTo the MAX

If this kinda stuff floats your boat, puts wind in your sails, greases your gears, etc, then we (you and Keen, that is) should be besties (if we’re not already) - cus who doesn’t LOVE hacking hardware and building IoT analytics TOGETHER?!

To be continued…

And if you find yourself on the fence, wondering if your curiosity in IoT could amount to fame and glory, just ask one of your fellow community members, Alex Swan, how it felt to be featured in the WSJ thanks to his work building an analytics/fitness tracker for his pet hamster, Jessica.

Hamster GIF

If you’re as excited as we are about being part of the promising future of IoT, grab a free Keen IO account today and start hacking. And never hesitate to reach out to us for help and/or to share your amazing creations!


Tim Falls

community crafter and scarf enthusiast

Visualize activity at your next event, with lasers. Lasers!

A little while back we sat down with the good people over at Particle and figured out a way to show off how to use Keen IO and Particle at an in-person event called DevGuild. Particle has some seriously cool hardware and software tools for prototyping, scaling, and managing IoT products.

We ended up deciding to put some trip wires around the room that were connected to WiFi via a Particle board. When a wire was tripped, it sent an event to Keen with a unique ID for the laser that was tripped.

laser pic

We got some hardware to attach to our Particle board. It included this laser, with this mount, and this photoresistor.

Here was the code we wrote to get the laser working with Particle:

int laser = 0;
unsigned int lastPublish = 0;
bool lastState = false;

void setup() {
    pinMode(A0, INPUT_PULLDOWN);
    pinMode(D7, OUTPUT);
}

void loop() {
    int val = analogRead(A0);
    unsigned int now = millis();
    unsigned int elapsed = now - lastPublish;

    bool broken = val < 4000;
    if (broken == lastState) {
        return; // no change since the last reading, nothing to publish
    }

    digitalWrite(D7, (broken) ? HIGH : LOW);

    if (elapsed >= 1000) {
        String topic = ((broken) ? "laser/alarm" : "laser/reset");
        String message = String(val);
        Particle.publish(topic, message);
        lastState = broken;
        lastPublish = millis();
    }
}
Once we had the laser working we configured a Particle webhook to post to a Keen IO endpoint.

The webhook code looks a little something like this!

{
   "eventName": "laser",
   "url": "https://api.keen.io/3.0/projects/YOUR_PROJECT_ID/events/lasers",
   "requestType": "POST",
   "headers": {
     "Authorization": "YOUR_AUTH_KEY"
   },
   "json": {
     "lasers": {
       "test": "1"
     }
   }
}
Then events started rolling in!

The next thing we had to do was make some sense out of the data, so we made a real-time interactive visualization. We used SVG to make a very rough view of the room’s layout, which happened to just be a rectangle, and then popped two dots connected by a line onto the page when a new laser ID showed up in Keen. Once the dots are on the page you can move them around and place them where they physically are in the room.

The last step was to have the line blink to represent the laser getting tripped, in real time. We also added a fun little feature: if a trip wire is getting a ton of activity, the line gets thicker to represent heavy traffic in that area of the venue.
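That thickness feature boils down to a simple mapping from a laser’s recent event count to a stroke width. A sketch of the idea (the specific numbers and the function name are made up for illustration):

```javascript
// Hypothetical mapping from per-laser event counts to line
// thickness, so busier trip wires draw as heavier lines.
// Width grows with activity but is capped so the line stays legible.
function strokeWidthFor(count) {
  var base = 5;   // quiet wires get the default stroke
  var max = 25;   // cap so one busy wire doesn't fill the room
  return Math.min(base + count * 2, max);
}
```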

viz pic

We used Firebase for persistence, Node + ReactJS for the interface, and Keen IO for the rest. Here’s the app code:

var React = require('react'),
    Keen = require('keen-js'),
    Firebase = require('firebase'); // used below but missing from the original snippet

var config = require('../config');

// Configure Firebase
var firebaseRef = new Firebase(config['firebase']);

// Configure Keen
var client = new Keen(config['keen']);
var gates = new Keen.Query('count', {
  eventCollection: 'lasers',
  groupBy: 'coreid',
  timeframe: 'this_1_minute'
});
var totals = new Keen.Query('count', {
  eventCollection: 'lasers',
  groupBy: 'coreid'
});

var Gate = React.createClass({

  getInitialState: function(){
    var rando = Math.round(Math.random()*300) + 50;
    return {
      a: {
        active: false,
        radius: 15,
        x: rando,
        y: rando - 50
      },
      b: {
        active: false,
        radius: 15,
        x: rando,
        y: rando + 50
      },
      weight: 0, // assumed initial value; tick() below reads this
      stroke: '#808080',
      strokeWidth: 5
    };
  },

  handleMove: function(i, e){
    var state = this.state[i];
    if (!state.active) return;
    state.x = this.props.trace[0];
    state.y = this.props.trace[1];
    // persist the new position (the exact path and payload were
    // lost from the original post)
    firebaseRef.child('/gates/' + this.props.id).update(this.state);
  },

  handleDown: function(i, e){
    var state = this.state[i];
    state.active = true;
    state.radius = 25;
    firebaseRef.child('/gates/' + this.props.id).update(this.state);
  },

  handleUp: function(i, e){
    var state = this.state[i];
    state.active = false;
    state.radius = 15;
    firebaseRef.child('/gates/' + this.props.id).update(this.state);
  },

  tick: function(){
    var self = this;
    var diff = self.props.weight - self.state.weight;
    var interval = setInterval(function(){
      if (diff > 0) {
        // blink: toggle the line between 0 and the current weight
        self.setState({
          weight: self.state.weight ? 0 : self.props.weight
        });
      }
      else {
        if (!self.state.weight) {
          self.setState({
            weight: self.props.weight
          });
        }
        clearInterval(interval); // presumably cleared here in the original
      }
    }, 500);
  },

  componentDidMount: function(){
    setInterval(this.tick, 1000);
  },

  render: function(){
    // (the SVG markup for the two endpoints and connecting line was
    // lost from the original post; it rendered labels along the lines of:)
    //   {'ID: ' + this.props.id}
    //   {'Total: ' + this.props.total}
  }

});

var Stage = React.createClass({

  getDefaultProps: function(){
    return {
      gates: []
    };
  },

  getInitialState: function() {
    return {
      ix: 100,
      iy: 100,
      height: window.innerHeight,
      width: window.innerWidth,
      position: {
        x: window.innerWidth / 2,
        y: window.innerHeight / 2
      }
    };
  },

  getKeenResults: function(){
    client.run([gates, totals], function(err, res){
      res[0].result.forEach(function(record, i){
        var id = record.coreid;
        if (id) {
          firebaseRef.child('/gates/' + id).update({
            id: id,
            weight: record.result,
            total: res[1].result[i].result
          });
        }
      });
    });
  },

  componentDidMount: function() {
    var self = this;

    setInterval(this.getKeenResults, 1000 * 5);

    firebaseRef.child('/gates').on('child_added', function(data){
      // (handler body lost from the original post; presumably adds the new gate)
      self.props.gates.push(data.val());
    });

    firebaseRef.child('/gates').on('child_changed', function(data){
      self.props.gates.forEach(function(gate, i){
        if (gate.id === data.key()) {
          self.props.gates[i] = data.val();
        }
      });
    });

    firebaseRef.child('/gates').on('child_removed', function(data){
      self.props.gates.forEach(function(gate, i){
        if (gate.id === data.key()) {
          self.props.gates.splice(i, 1);
        }
      });
    });

    window.addEventListener('resize', this.handleResize);
    document.addEventListener('touchmove', this.preventBehavior, false);
  },

  handleResize: function(){
    this.setState({
      'height': window.innerHeight,
      'width': window.innerWidth
    });
  },

  preventBehavior: function(e){
    e.preventDefault();
  },

  handleMove: function(e){
    var touch = (e.touches) ? e.touches[0] : false;
    this.setState({
      ix: touch ? touch.pageX : e.clientX,
      iy: touch ? touch.pageY : e.clientY
    });
  },

  render: function(){
    var gates = this.props.gates.map(function(gate, index){
      // (the <Gate /> element markup was lost from the original post)
    }, this);
    // (the SVG stage markup was lost from the original post)
  }

});

React.render( <Stage /> , document.getElementById('stage') );

module.exports = Stage;

The config.json file has the Keen API keys and the address of our Firebase app.
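For illustration, the config might look something like this (the key names here are hypothetical; the real values belong to your own Keen project and Firebase app):

```javascript
// Hypothetical shape of config.json, loaded above via
// require('../config'): Keen credentials plus the Firebase URL.
var config = {
  keen: {
    projectId: 'YOUR_PROJECT_ID',
    readKey: 'YOUR_READ_KEY'
  },
  firebase: 'https://your-app.firebaseio.com/'
};
```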

The index file puts our page on the internet and has a div placeholder with an id of ‘stage’ where all the things are injected:

<!DOCTYPE html>
<html>
  <head>
    <style scoped="scoped">
      body { margin: 0; }
      svg circle { cursor: move; }
    </style>
  </head>
  <body>
    <div id="stage"></div>
    <script src=""></script><script src="./app/index.js"></script>
  </body>
</html>

The result was a very fun interactive visual that people really seemed to enjoy. Plus we got to talk about lasers. Double win!

laser pic

Check it out and hook it up at your next event, and/or let us know what you come up with! Big ups to Dustin and Zach for collaborating on this with me. High-Five

laser gif

Justin Johnson

community guy, hacker, music nut. i like to help people build stuff.