Tim Stutts here. I am a designer and developer drawn to challenges in uncharted domains, where the scarcity of pre-existing templates demands visionary thinking and reinvention. My approach leans on a hybrid background in user interaction, data visualization, visual effects, and sound design. I currently work at Magic Leap as a Senior Interaction Prototyper in the HCI group within the User Experience team. Previously I worked on compelling data visualizations and interfaces at the IBM Watson Cognitive Visualization Lab. A prototyper since my Lego days, I enjoy bringing concepts to life and evolving them into functional applications and experiences. Outside of work I spend time with my wife and baby daughter, am a non-dedicated yogi, indulge a wild-fermented beer obsession, and compose electronic music under the moniker Lordx.
I can't say what I'm up to specifically just yet, but here's some exciting press to hold you over in the meantime: https://www.wired.com/2016/04/magic-leap-vr/
Company: IBM
Provided: design, programming, communication
Date: 2015 - 2016

News Explorer combines the Alchemy News API with the d3.js data visualization library to automatically construct a news information network, presenting large volumes of news results in an understandable fashion. News Explorer was created by the Watson Cognitive Visualization Lab, of which I am a part. I led the design of the application, in addition to co-developing the data visualizations and user interface with Steve Ross.

Learn more on the IBM Watson Developer blog: https://developer.ibm.com/watson/blog/2016/01/04/exciting-updates-for-news-explorer/

Launch the application in browser: http://news-explorer.mybluemix.net/
Company: IBM
Provided: design, programming
Date: 2014

X-Ray is an application that helps users understand the natural-language inner workings and cognitive-computing aspects of Watson. It begins with a sea of general-knowledge questions asked of Watson's pipeline, then lets the user expand an individual question and from there explore candidate answer generation, evidence passages, text parsing, and more. Its surreal design resulted from art direction that called for bringing a sketch to life, avoiding a typical grid layout, and employing 3D techniques wherever possible. On this project I worked with Watson Innovation Labs to define the user experience, led the design effort, produced numerous concept sketches and visual designs, and was directly involved in the front-end development of the final application, which ran in the browser as a touch-enabled 3D Three.js experience. The application debuted at the launch of the Watson Experience Center in NYC, where it was installed on several touch screens.

Client: Honda Research Institute USA
Provided: design, programming, sound
Date: 2013

I worked with Honda Research Institute USA to create a prototype HUD application addressing a number of defensive-driving scenarios focused on situational awareness: objects moving in front of the car, a speeding vehicle approaching from the rear, and a vehicle braking ahead. I also designed a morphing rear fan to help the driver become aware of situations in the rear proximity. All of these concepts assume the following technology: the car's built-in IR sensor/camera tracks objects moving in the vehicle's path, and a micro-projector then displays overlaid infographics, augmented-reality style, onto the windshield, aligned to the driver's perspective. The resulting application gave the client a unique way to tweak the look and feel of the scenarios' graphical animations in real time, subbing in real driving footage as the background for the imagined scenarios. I programmed the application in openFrameworks (C++/OpenGL) and relied heavily on the add-ons ofxUI (for user interface) and ofxFenster (for multi-screen support). The majority of the graphics were OpenGL, allowing elements to be easily incorporated into the actual engine once they were prototyped. I eventually added sound design, synthesized in Logic Audio, to provide an additional mode of situational awareness. Special thanks to the team that I worked with at Honda Research Institute USA: Karlin Bark, Cuong Tran, and Victor Ng-Thow-Hing. We've since been issued a patent for part of this work: http://patents.justia.com/patent/9378644
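
To give a feel for the real-time tweaking setup, here's a minimal openFrameworks-style sketch of the idea. The parameter names, slider ranges, and background clip are stand-ins rather than the actual application's values, and the real tool exposed far more ofxUI controls:

    #include "ofMain.h"
    #include "ofxUI.h"

    // Minimal sketch: ofxUI sliders adjust a warning overlay drawn on
    // top of driving footage. Names and values here are assumptions.
    class ofApp : public ofBaseApp {
    public:
        ofVideoPlayer footage;
        ofxUICanvas *gui;
        float warnRadius = 120;
        float warnAlpha = 180;

        void setup() {
            footage.load("driving.mov"); // hypothetical background clip
            footage.play();
            gui = new ofxUICanvas();
            // sliders bind directly to the overlay parameters
            gui->addSlider("WARN RADIUS", 20.0, 400.0, &warnRadius);
            gui->addSlider("WARN ALPHA", 0.0, 255.0, &warnAlpha);
        }
        void update() { footage.update(); }
        void draw() {
            footage.draw(0, 0, ofGetWidth(), ofGetHeight());
            ofEnableAlphaBlending();
            ofNoFill();
            ofSetColor(255, 80, 0, warnAlpha);
            // warning ring around a (here fixed) tracked object position
            ofDrawCircle(ofGetWidth() * 0.5f, ofGetHeight() * 0.6f, warnRadius);
        }
        void exit() { delete gui; }
    };

    int main() {
        ofSetupOpenGL(1280, 720, OF_WINDOW);
        ofRunApp(new ofApp());
    }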

Pictured: wireframes and the prototype HUD application for situational-awareness scenarios.
Client: EA (through B-Reel)
Provided: design, programming
Date: 2012

I worked with B-Reel on "creative coding" visual concepts, as well as data visualizations, that inspired the development direction of a multi-panel social ribbon for the EA E3 space in Los Angeles, 2013. Special thanks to B-Reel Los Angeles for their art direction and B-Reel Barcelona for their technical guidance and collaborative programming on the final social ribbon element. The first two videos use illustrated artwork as input to algorithmic processes that I designed and developed in openFrameworks, followed by images of code art and data visualization explorations that I generated. I also coded the MIDI communication protocol between the social ribbon and the surrounding booth experience.
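
For a sense of what that MIDI layer looked like, here's a hedged C++ sketch assuming the ofxMidi add-on; the channel, note, and CC mappings are invented stand-ins for whatever the booth rig actually listened for:

    #include "ofMain.h"
    #include "ofxMidi.h"

    // Hypothetical glue between the ribbon animation and the booth,
    // assuming ofxMidi for output; the mappings are illustrative only.
    class RibbonMidiLink {
    public:
        void setup() {
            midiOut.openPort(0); // first available MIDI output
        }
        // called when a social post lands on one of the ribbon's panels
        void postLanded(int panel) {
            int note = 60 + panel; // one note per panel
            midiOut.sendNoteOn(1, note, 100);
            midiOut.sendNoteOff(1, note, 0);
        }
        // called every frame with overall ribbon activity in 0..1
        void sendActivity(float amount) {
            int value = (int)ofClamp(amount * 127.0f, 0, 127);
            midiOut.sendControlChange(1, 1, value); // CC1 = activity level
        }
    private:
        ofxMidiOut midiOut;
    };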

Client: Oblong Industries
Provided: design, programming, sound
Date: 2012

Airborne Beats is a music sequencer application that is operable entirely by gesturing mid-air in front of a screen. Users are able to drift around the app, grabbing, dragging, and releasing audio samples onto a grid. They can also draw automation curves across a track to control parameters like volume over time. There are a number of gestures for play, pause, and volume control that can be performed at any point in the experience. The application is performative, yet also compositional.

Airborne Beats was programmed in C++ using Oblong's g-speak platform and the creative coding library, Cinder. As for hardware, we pulled together one screen equipped with an IR sensor, a "commodity" computer (with g-speak installed), and speakers.
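
The drawn automation curves are the most sequencer-like part of the interaction. Here's a minimal, hedged C++ sketch of one way such a curve can drive volume; the data layout is an assumption, not Airborne Beats' actual internals:

    #include <algorithm>
    #include <utility>
    #include <vector>

    // A drawn automation curve stored as (beat, value) points,
    // sampled with linear interpolation at the playhead.
    struct AutomationCurve {
        std::vector<std::pair<float, float>> points; // kept sorted by beat

        void addPoint(float beat, float value) {
            points.emplace_back(beat, value);
            std::sort(points.begin(), points.end());
        }

        float sample(float beat) const {
            if (points.empty()) return 1.0f; // default: full volume
            if (beat <= points.front().first) return points.front().second;
            if (beat >= points.back().first) return points.back().second;
            for (size_t i = 1; i < points.size(); ++i) {
                if (beat <= points[i].first) {
                    float t = (beat - points[i - 1].first) /
                              (points[i].first - points[i - 1].first);
                    return points[i - 1].second +
                           t * (points[i].second - points[i - 1].second);
                }
            }
            return points.back().second;
        }
    };

    // Per audio block: gain = curve.sample(playheadBeat);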

Airborne Beats had its public debut in a solo show at the Catalyst event, hosted at Bar Mitte in Barcelona.

Various shots of the application in action, as well as an early wireframe.
Client: Bestiario
Provided: data visualization, communication
Date: 2012

In early 2012 I began working with Bestiario to help shape the unique learning experience of Quadrigram, a browser-based application that lets users create custom data visualizations in an intuitive way, with the flexibility of a visual programming language. It enables rapid prototyping and sharing of ideas, as well as the production of compelling solutions with data in the form of interactive visualizations, animations, or dashboards. While there I led various initiatives; helped rename, describe, and organize all modules; created several instructional tutorials; and designed new sections and features for the application.

Over the course of my engagement, I made several tutorial videos explaining Quadrigram. The above "How" video takes an exploratory data analysis approach and features voiceover by yours truly.

A selection of diagrams and data visualizations that I produced to help users gain a better understanding of data structures.
Client: Google, through LeftFieldLabs
Provided: design, programming
Date: 2012

LeftFieldLabs brought me on to do a design and programming exploration for a microsite surrounding the new Nexus Q device, featuring an in-browser 3D experience and real-time FFT music data visualization. I worked closely with Eric Lee and CJ Cenizal on this effort. The following examples of my work are not representative of the current Google brand; rather, they exist as early-stage concepts showcasing visualization and interaction approaches.

FFT analysis of the accompanying music track modulates the size of ellipses surrounding the device; one of several music data visualization approaches that were proposed. This prototype was programmed in openFrameworks (C++).
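
As a rough illustration of the mechanics, here's a minimal openFrameworks sketch of the same idea; the file name, band count, and smoothing factor are assumptions:

    #include <algorithm>
    #include "ofMain.h"

    // FFT bands from the playing track scale concentric ellipses.
    class ofApp : public ofBaseApp {
    public:
        ofSoundPlayer track;
        static const int kBands = 32;
        float smoothed[kBands] = {0};

        void setup() {
            track.load("track.mp3"); // hypothetical audio file
            track.setLoop(true);
            track.play();
        }
        void update() {
            ofSoundUpdate();
            float *spectrum = ofSoundGetSpectrum(kBands);
            for (int i = 0; i < kBands; i++) {
                // decay slowly so the rings breathe instead of flickering
                smoothed[i] = std::max(spectrum[i], smoothed[i] * 0.96f);
            }
        }
        void draw() {
            ofBackground(0);
            ofNoFill();
            ofPushMatrix();
            ofTranslate(ofGetWidth() / 2, ofGetHeight() / 2);
            for (int i = 0; i < kBands; i++) {
                float r = 40 + i * 12 + smoothed[i] * 200; // FFT drives radius
                ofDrawEllipse(0, 0, r * 2.2f, r * 2.0f);
            }
            ofPopMatrix();
        }
    };

    int main() {
        ofSetupOpenGL(1024, 768, OF_WINDOW);
        ofRunApp(new ofApp());
    }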

Prototype of an in-browser experience using 3D forms and primitive lighting. Audio data (track amplitude over time) swirls around the device. Rotation about the Y-axis is accomplished via mouse position. Programmed in HTML5/Processing.js.

The same effect as in the previous example, viewed from a different vantage point, this time with speakers in the background that appear to be projecting music into the space.
Client: IBM, through Motion Theory
Provided: design, programming
Date: 2010

I was fortunate to have the opportunity to do visual code artistry on the IBM Smarter Planet campaign at Motion Theory, specifically for the "Energy" spot. In this campaign, traditional 3D animators using Maya and Cinema 4D worked alongside a team of code artists designing visual elements in Processing and openFrameworks. The coder team was led by Josh Nimoy and Keith Pasko, and consisted of Jeremy Rotsztain, CJ Cenizal, Ryan Alexander, Elise Co, Ekene Ijeoma, and myself.

"Energy" embraces generative images drawn from sources like windmills, transformers and homes as dimensionalized expressions of data flowing from energy, in chaotic yet elegant ways.

View the final commercial.

Final Graphic Elements for Commercial

I created these elements to visualize energy for the power lines shot. Each band is composed of hundreds of hair-like lines, modulated intermittently by Perlin noise and sine waves, and anchored to invisible points along 3D models of telephone poles. Programmed in C++ with openFrameworks. Click to see the elements incorporated into the final spot.
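
Here's a hedged reconstruction of the basic technique in openFrameworks, with all constants invented for illustration: each "hair" is a polyline whose vertices are displaced by Perlin noise for slow wander and by a sine wave for the intermittent ripple.

    #include "ofMain.h"

    // Draws one band of hair-like lines extending from an anchor point
    // (in the spot, anchors sat along 3D models of telephone poles).
    void drawHairBand(ofVec3f anchor, int hairs, float length) {
        float t = ofGetElapsedTimef();
        for (int h = 0; h < hairs; h++) {
            ofPolyline line;
            for (int i = 0; i < 60; i++) {
                float u = i / 59.0f; // 0..1 along the hair
                float x = anchor.x + u * length;
                // Perlin noise gives each hair its own slow wander...
                float n = ofNoise(h * 0.05f, u * 2.0f, t * 0.3f) - 0.5f;
                // ...while a sine wave adds the intermittent ripple
                float s = sinf(u * 6.0f + t * 2.0f + h * 0.2f);
                float y = anchor.y + n * 60.0f + s * 8.0f * u;
                line.addVertex(x, y, anchor.z);
            }
            line.draw();
        }
    }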

This visualization captures the flow of energy and inductance along the charge nozzle of an electric Tesla vehicle. Energy undulates around the surface of an invisible 3D model of a tube, later superimposed onto live-action footage. Programmed in C++ with openFrameworks. Click to see the elements incorporated into the final spot.

In a render for the overhead city shot, wires swerve along roads and congest at the intersections. Occasionally a straggler breaks free and whips around a building or two before returning to the streets. Programmed in Java with Processing. Click to see elements incorporated into final spot.
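
The original was written in Processing, but the traversal logic translates to a loose C++ sketch; the grid size and turn behavior here are assumptions:

    #include <cstdlib>
    #include <utility>
    #include <vector>

    // Wire agents walk an axis-aligned street grid, picking a new
    // direction at each intersection, so trails pile up where roads cross.
    struct WireAgent {
        int x = 0, y = 0;   // grid position
        int dx = 1, dy = 0; // direction along the current road
        std::vector<std::pair<int, int>> trail;

        void step(int block) {
            trail.push_back({x, y});
            x += dx;
            y += dy;
            if (x % block == 0 && y % block == 0) { // reached an intersection
                int turn = std::rand() % 4; // 0:east 1:west 2:north 3:south
                dx = (turn == 0) - (turn == 1);
                dy = (turn == 2) - (turn == 3);
            }
        }
    };

    // A small fraction of agents could skip the intersection test for a
    // few steps: the "stragglers" that whip around a building and return.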

Client: Wired Magazine, with Robert Hodgin
Provided: music, sound design
Date: 2010

Screenshot of the completed illustration running on the device.

I recently completed sound for a generative graphics piece created by Robert Hodgin for issue 18.08 of the iPad edition of Wired Magazine. The piece was used as part of a feature illustration. I'm a huge fan of Hodgin's programmatic work: he's recently worked with Peter Gabriel, and in the past with Apple on the iTunes Visualizer. Being a code artist myself, I've been interested in exploring audio accompaniments to these types of pieces, be it the assemblage of imagined sounds emitted by the objects and environment or, in the case of these two videos, a musical score that attempts to complement vector-based graphics.

Earlier render with music track. This piece had multiple shots, as opposed to a single movement, so I composed ultra-brief sections: a glitchy, stripped-down beat, a whistling background, some wandering melodies.

Final render with music track. I chose to complement the glowing, solar graininess of the particles with granulated, ascending tones, cello drones, and a trip-hop-like beat.

A picture of my Logic Audio session in progress, in which I am generating the sound design for an earlier animation.

Client: Cooper-Hewitt Design Museum, with Electroland
Event: Design Triennial, 2006 New York City
Provided: sound design, programming (Max/MSP)
Date: 2007

I contracted with the LA-based art installation team Electroland to do sound design and Max/MSP development for an interactive piece installed in a stairwell for the 2006 Cooper-Hewitt Design Triennial. As people make their way through the installation, overhead motion-tracking cameras establish their locations. Light and sound are projected around the stairwell to signify collisions between people, motion onto certain steps, and prolonged periods of standing in one place.
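
That mapping logic lived in the Max patch, but the gist is easy to show in C++; the positions, distances, and thresholds below are invented for illustration:

    #include <cmath>
    #include <vector>

    // Camera-tracked visitors classified into the cues that drove
    // light and sound (all values here are assumptions).
    struct Visitor {
        float x, y;      // overhead camera position, in meters
        float dwellTime; // seconds spent roughly in one place
    };

    enum class Cue { None, Collision, StepMotion, Dwell };

    Cue classify(const Visitor &v, const std::vector<Visitor> &others,
                 bool onTrackedStep) {
        for (const Visitor &o : others) {
            if (std::hypot(v.x - o.x, v.y - o.y) < 0.5f)
                return Cue::Collision; // two people meet in the stairwell
        }
        if (v.dwellTime > 5.0f) return Cue::Dwell; // standing in one place
        if (onTrackedStep) return Cue::StepMotion; // moved onto a cue step
        return Cue::None;
    }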

Watch a video with sound here.
Installation in action and Max patch that I developed for applying custom settings to the sound.

Exhibition: Sony Wonder Lab, NYC
Provided: hardware design, interaction design, and creative coding in Processing/Arduino
Date: 2007

Malleable Electronic Lawn is a hyper-tangible joystick interface for controlling 3D graphics in a computer projection. It features an onboard mini Arduino connected to multiplexers, with a USB jack for data transmission. For its debut at the ITP Spring Show, NYC, 2007, I connected Malleable Electronic Lawn to Zach Layton's Sonic Topology visualization. Together we mapped the joysticks to control different 3D parameters, such as rotation and morphing behavior. Zach and I had a second showing at the Sony Wonder Lab, NYC, in the summer of 2007.
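
Here's a hedged Arduino-style C++ sketch of that data path; the pin numbers and channel count are assumptions rather than the original wiring. Joystick axes are read through an 8-channel analog multiplexer and streamed over USB serial to the visualization.

    // Read joystick axes through an 8-channel analog multiplexer and
    // stream them over serial (pins and channel count are assumptions).
    const int SELECT_PINS[3] = {2, 3, 4}; // mux address lines S0..S2
    const int MUX_OUT = A0;               // mux common output
    const int NUM_CHANNELS = 8;           // e.g., four joysticks x two axes

    void setup() {
        for (int i = 0; i < 3; i++) pinMode(SELECT_PINS[i], OUTPUT);
        Serial.begin(9600);
    }

    void loop() {
        for (int ch = 0; ch < NUM_CHANNELS; ch++) {
            // select the multiplexer channel bit by bit
            for (int bit = 0; bit < 3; bit++) {
                digitalWrite(SELECT_PINS[bit], (ch >> bit) & 1);
            }
            int value = analogRead(MUX_OUT); // 0..1023 joystick deflection
            Serial.print(value);
            Serial.print(ch < NUM_CHANNELS - 1 ? ',' : '\n');
        }
    }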

M.E.L. at various stages of development up until debut