I am a designer and prototyper drawn to challenges within uncharted domains. My approach leans on a hybrid background in user interaction, data visualization, visual effects and sound design. I enjoy bringing concepts to life and further evolving them into functional applications and experiences. Historically I have used Max/MSP, Processing, OpenFrameworks, Cinder, Three.js, D3.js and Unity for prototyping.

I currently work at the mixed reality startup Magic Leap as a Senior Interaction Prototyper on the UI/UX team. Previous endeavors include creating award-winning data visualizations and interfaces at the IBM Watson Cognitive Visualization Lab, prototyping patented augmented reality heads-up display applications for Honda, building a gestural music sequencer for Oblong, programming visual effects on award-winning commercial campaigns for Motion Theory, and sound editing on various console game titles. Along the way I've consulted for artists including Robert Hodgin, Rachel Mayeri and Electroland.

Education-wise, I earned a Master's degree in Interactive Telecommunications (ITP) from New York University and a Bachelor of Fine Arts degree in Music Technology from California Institute of the Arts (CalArts), after originally studying music composition and jazz bass at the University of Miami, Florida.

Outside of work I am a husband, father, wild fermented beer enthusiast, non-dedicated yogi, music maker and storied world traveler. I currently reside in Oakland Park, Florida, hometown of fretless bass legend, Jaco Pastorius.
Company: IBM
Provided: design, programming, communication
Date: 2015 - 2016

News Explorer combines the Alchemy News API with the D3.js data visualization library to automatically construct a news information network and present large volumes of news results in an understandable fashion. News Explorer is created by the Watson Cognitive Visualization Lab, of which I am a part. I led the design of the application, in addition to co-developing the data visualizations and user interface with Steve Ross.

Learn more on the IBM Watson Developer blog: https://developer.ibm.com/watson/blog/2016/01/04/exciting-updates-for-news-explorer/

Launch the application in browser: http://news-explorer.mybluemix.net/
Client: Honda Research Institute USA
Provided: design, programming, sound
Date: 2013

I worked with Honda Research Institute to create a prototype HUD application that addressed a number of defensive driving scenarios focused on situational awareness, specifically objects moving in front of the car, a speeding vehicle approaching from the rear, and a vehicle braking in front of the car. I also designed a morphing rear fan to help the driver become aware of situations in the rear proximity. All of these concepts assume the following technology: the car's built-in IR sensor/camera tracks objects moving in the vehicle's path, and a micro-projector overlays infographics onto the windshield, augmented-reality style, aligned to the driver's perspective.

The resulting application provided a unique way for the client to tweak the look and feel of the graphical animations for each scenario in real time, subbing real driving footage into the background for the imagined scenarios. I programmed the application in openFrameworks (C++/OpenGL) and relied heavily on the add-ons ofxUI (for the user interface) and ofxFenster (for multi-screen support). The majority of the graphics were drawn in OpenGL, allowing elements to be easily incorporated into the actual engine once they were prototyped. I eventually added sound design, synthesized in Logic Audio, to provide an additional mode of situational awareness.

Special thanks to the team that I worked with at Honda Research Institute USA: Karlin Bark, Cuong Tran, and Victor Ng-Thow-Hing. We've since been issued a patent for part of this work: http://patents.justia.com/patent/9378644

pictured: wireframes and prototype HUD application for situational awareness scenarios.
Client: EA (through B-Reel)
Provided: design, programming
Date: 2012

I worked with B-Reel on 'creative coding' visual concepts for a multi-screen social ribbon at the EA E3 space in Los Angeles. The first two videos use illustrated artwork as input to algorithmic processes that I designed and developed in openFrameworks, followed by images of code art and data visualization explorations that I generated. I also coded the MIDI communication protocol between the social ribbon and the surrounding experience.

Client: Oblong Industries
Provided: design, programming, sound
Date: 2012

Airborne Beats is a music sequencer application that is operable entirely by gesturing mid-air in front of a screen. Users are able to drift around the app, grabbing, dragging, and releasing audio samples onto a grid. They can also draw automation curves across a track to control parameters like volume over time. A number of gestures for play, pause, and volume control can be performed at any point in the experience. The application is performative, yet also compositional.

Airborne Beats was programmed in C++ using Oblong's g-speak platform and the creative coding library Cinder. As for hardware, we pulled together a screen equipped with an IR sensor, a "commodity" computer (with g-speak installed), and speakers.

Airborne Beats had its public debut in a solo show at the Catalyst event, hosted at Bar Mitte in Barcelona.

pictured: various shots of the application in action, as well as an early wireframe.
Client: Bestiario
Provided: data visualization, communication
Date: 2012

In early 2012 I began working with Bestiario to help shape the unique learning experience of Quadrigram, a browser-based application that allows users to create custom data visualizations in an intuitive way with the flexibility of a visual programming language. It enables rapid prototyping and sharing of ideas, as well as producing compelling solutions with data in the form of interactive visualizations, animations, or dashboards. While there I led various initiatives: I helped rename, describe, and organize all modules, created several instructional tutorials, and designed new sections and features for the application.

Over the course of my engagement, I made several tutorial videos explaining Quadrigram. The above "How" video takes an exploratory data analysis approach and features voiceover by yours truly.

A selection of diagrams and data visualizations that I produced to help users gain a better understanding of data structures.