Nate is a creative technologist, technical director, and digital artist.
He specializes in creating custom hardware and software systems for interactive audio-visual
installations, digital event activations, and live performance. He holds a Bachelor of Science
in Electrical and Computer Engineering from the University of Colorado.
He creates work with Unity3D, WebGL, TouchDesigner, Max/MSP, and other tools.
When he's not on a computer, he's exploring the mountains by ski touring or rock climbing.
Update: Congratulations to the 9to5 team for winning the competition!
Project Roles
Resident Hacker
Collaborators
Andy Pruett, Pablo Gnecco
Project Description
I was asked to participate in a piece of experimental programming for 9to5, an entry
into The Field Experiment art grant competition hosted by The Goat Farm Arts Center and
The Hambidge Center. The piece that I created in collaboration with Andy Pruett
reflected on the control we allow online bots to have in our lives, be it through
algorithmic sorting of Facebook posts or the "matches" sent to our online dating
profiles. Specifically, it asked what happens as these bots break away from the net and
we invite them into our homes in the form of products like the Amazon Echo.
The installation began with hacking the chat window of 9to5.tv
with injected JavaScript to forward messages to a Slack channel. In the Slack channel,
bots listened for key phrases and commands that altered the physical
space we were hacking from. The bots then started a dialogue with each
other, creating feedback loops that continuously altered the physical space.
The whole hack was broadcast live via YouTube and on the website, where users were
invited to chat and participate in changing the space in dialogue with the bots.
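The bots' phrase matching can be sketched roughly as below; the trigger phrases and command names here are hypothetical stand-ins for whatever the real Slack bots listened for:

```javascript
// Minimal sketch of a chat bot's phrase-to-command matching, assuming
// hypothetical phrases and commands; the real bots ran against Slack.
const triggers = [
  { phrase: /lights?\s+(on|off)/i, command: m => `LIGHTS_${m[1].toUpperCase()}` },
  { phrase: /play\s+(\w+)/i,       command: m => `PLAY_${m[1].toUpperCase()}` },
];

// Scan an incoming chat message for key phrases and return the
// physical-space commands they map to.
function commandsFor(message) {
  const commands = [];
  for (const { phrase, command } of triggers) {
    const match = message.match(phrase);
    if (match) commands.push(command(match));
  }
  return commands;
}
```

In a setup like this, each matched command would be forwarded to whatever controller drives the physical space, and the bots' own replies would feed back through the same matcher.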
For "The Big Show" in 2014, I helped develop a micro retail experience which came to be
known as RZR[shop] with the Emerging Experiences team at Razorfish. The aim was
to provide massive impact with a small footprint. The entire experience is housed in a
custom-built shipping container designed to be easily transported and
deployed in scenarios such as SXSW or Times Square. Attached to the side of the
shipping container is a vending machine, which was the focus of my work leading up to the
event.
I developed a custom Arduino shield that interfaces with the control board of the
vending machine and recreates the key presses normally executed by the keypad. The Arduino
receives serial messages from a Raspberry Pi running a Node.js server,
allowing us to vend products personalized to the outcome of the experience. In
this case, the experience is a multiscreen soccer game.
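The Node.js side of a vend can be sketched as composing a keypress sequence for the Arduino to replay; the byte values and framing below are illustrative assumptions, not the actual protocol:

```javascript
// Sketch of the server composing a vend command, assuming a hypothetical
// one-byte-per-keypress serial protocol; the shield replays these
// presses on the vending machine's keypad.
const KEYS = { A: 0x41, B: 0x42, C: 0x43,
               1: 0x31, 2: 0x32, 3: 0x33, 4: 0x34 };

// Translate a slot like "B3" into the byte frame sent over serial.
function vendFrame(slot) {
  const presses = [...slot.toUpperCase()].map(k => {
    if (!(k in KEYS)) throw new Error(`unknown key: ${k}`);
    return KEYS[k];
  });
  // Frame: start byte, key count, key bytes. (Framing is illustrative.)
  return Buffer.from([0x02, presses.length, ...presses]);
}
```

The resulting buffer would be written to the Pi's serial port with any serial library; the shield then pulses the corresponding keypad lines.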
In addition, the Arduino drives a set of custom-built LEDs installed in
the machine, which respond to the gameplay. When a shot is blocked, the lights flash
green; when a goal is scored, they flash red, giving the goalie competitor
exciting visual feedback offscreen.
The Human Kaleidoscope is an interactive, audio-reactive installation created for Future Night,
held at Mammal Gallery in Atlanta, GA. It uses a camera input to mix the user's image
into the scene and a MIDI controller to let the user control the scene. The scene
also responds to audio created by the user or by surrounding installations.
Nate Turley
May 2015
Animation toolkit for volumetric LED display
For a major retail client, I was asked to develop a way to create animations for a
volumetric LED display. Traditional motion graphics tools such as After Effects have no
built-in modes for creating animations for this type of display. For this reason, I
created custom software with openFrameworks to enable our motion design team to
create content for the display.
I wanted to create a tool that had a familiar keyframe approach to animation. I chose
to leverage ofxTimeline
created by James George for the UI and timeline
management. From there, I developed a set of modes that the designer would be able to
use to create the animations:
Waves (Sine, Saw, Square, Tri) with modulations
Particle System
Mesh Intersection
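The wave mode can be sketched as evaluating a normalized waveform per voxel; the exact modulation scheme in the tool was richer than this simplified stand-in:

```javascript
// Sketch of the wave animation mode: evaluate a normalized waveform
// (0..1) at phase t. Mode names mirror the list above; modulation of
// the phase/amplitude is left out for brevity.
function wave(mode, t) {
  t -= Math.floor(t); // wrap phase into [0, 1)
  switch (mode) {
    case 'sine':   return 0.5 + 0.5 * Math.sin(2 * Math.PI * t);
    case 'saw':    return t;
    case 'square': return t < 0.5 ? 1 : 0;
    case 'tri':    return t < 0.5 ? 2 * t : 2 - 2 * t;
    default: throw new Error(`unknown mode: ${mode}`);
  }
}
```

Driving each voxel's phase by its position in the display is what turns these scalar waveforms into waves sweeping through the volume.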
The software outputs the animations as PNG sequences, which are converted to .mov files
that can then be played back by the content delivery system that drives many displays
in the store. In addition, the software provides a 3D rendering of the display, so the
motion design team has a reference for what designs will look like on the final
display. (Note: this content is from preliminary tests and does not reflect the content
to be played on the final version.)
T-Mobile Signature Stores
Project Roles
Software Engineering, Install
Client
T-Mobile
Project Description
Times Square, NYC
For the launch of the T-Mobile Signature Store in Times Square, I helped develop an
interactive experience surrounding the carrier's "Simple Global" campaign. The
experience included a "Kinect greenscreen" photo op that placed guests in one of many destinations.
To accomplish this, I worked on a Kinect client/server system which streams a guest's
image with the background subtracted to another device running the photo booth. The
images are then sent to the massive billboard above the store.
Miami, FL
For the Miami store, I created a content development tool for the volumetric LED
chandelier featured in the middle of the store. Details of this project are in another
section of this page.
Sound Journeys is a collaboration with The
Principals commissioned by Ford and B&O Play. It includes three distinct but
connected zones of immersive light, sound, and architecture.
I was asked to create a 24 channel spatial audio system and program an audio reactive
DMX lighting system. To achieve this, I built two lightweight tools in openFrameworks,
which I will be releasing as ofxDBAP
and ofxOscDMXBouncer, and an
audio mixer in Max/MSP.
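The core of distance-based amplitude panning, the technique behind ofxDBAP, can be sketched as follows (a simplified 2D version with an assumed rolloff exponent, not the addon's actual API):

```javascript
// Sketch of distance-based amplitude panning (DBAP): each speaker's
// gain falls off with its distance from the virtual source, and the
// gains are normalized to be power-preserving (sum of squares = 1).
// The blur term keeps the math stable when the source sits on a speaker.
function dbapGains(source, speakers, rolloffExp = 1, blur = 0.1) {
  const dists = speakers.map(s =>
    Math.hypot(s.x - source.x, s.y - source.y, blur));
  const inv = dists.map(d => 1 / Math.pow(d, rolloffExp));
  const k = 1 / Math.sqrt(inv.reduce((sum, v) => sum + v * v, 0));
  return inv.map(v => k * v);
}
```

Unlike vector-base panning, DBAP makes no assumption about speaker layout, which suits an irregular 24-channel installation rig.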
This work was created for UNCOMPRESSED Special Event Night Vol. 4, held at 3-Legged Dog
in NYC.
Via 3LD: We are excited to announce that the groundbreaking media art group
exhibition .zip will be open to the public in 3LD Studio B.
In 2015, the “Future of Today: Imaginary Future” exhibition, organized by Today Art
Museum (TAM), opened to widespread critical acclaim in Beijing, China. In July 2017,
“Future of Today" developed into a secondary project around the theme “.zip.”
The piece manifested as an audio-visual real-time projection mapping experiment. A 3D
MIDI-reactive particle system built in Unity is rendered onto the three massive walls at the
venue, giving viewers a uniquely immersive view into another world.
The audio track was produced and performed by Reese Donohue. Visual art and performance
by Nate Turley.
Cloud of Petals VR is a series of 6 virtual reality vignettes created with artist Sarah Meyohas as part of Sarah's solo
exhibition at Red Bull Arts NY. The
project features massive interactive particle systems made of 100,000 rose petals which
were photographed at Bell Labs last year, as well as images created by a generative
adversarial network trained on those images. Each scene is tied together with a
meditative spatial audio accompaniment, and viewed on Oculus Rift headsets. The show
runs from October 12th to December 10th, 2017 at Red Bull Arts, 220 W 18th St,
New York, NY.
Hy·per·sub·tle is an audio-reactive mixed reality installation where digital dancers
guide you to a new reality in the universe. The installation translates your movements
to generate a dynamic landscape unique to each personality. Hy·per·sub·tle is best
experienced by ~ dancing a lot ~ with friends.
You can also download the hy·per·sub·tle mobile augmented reality app for iOS and
Android, which is part of a curatorial project that merges artificial intelligence with
motion capture of dance moves. The project was choreographed by professional dancers
emphasizing magical moments in pop culture. Users were encouraged to experience and
record
these dances around the Panorama festival.
The Day The World Changed
In partnership with The International Campaign to Abolish Nuclear Weapons
In partnership with Nobel Media and Nobel Peace Prize Laureate, International Campaign
to Abolish Nuclear Weapons, The Day the World Changed brings to viewers the harrowing
impressions of the victims and survivors of atomic bombings and nuclear arms testing
through first-hand testimonies, data visualizations, and innovative use of 3-D scanning
and photogrammetry.
The experience premiered at the Tribeca Film Festival in NYC. It is an interactive social VR experience created for the HTC Vive with Unity3D.
I was responsible for creating the look of the avatars, which were meant to evoke the Human Shadow Etched in Stone.
To achieve this feel, I created a special offscreen render pipeline that produces a ghostly effect. In the final scene, the avatars begin emitting blue particles, which together come to encompass a burning Earth, suggesting to viewers
that there may still be hope.
ZIKR: A SUFI REVIVAL is an interactive social VR experience that uses song and dance to
transport four participants into ecstatic Sufi rituals. It also explores the
motivations behind followers of this mystical Islamic tradition, still observed by
millions around the world.
For this project, I was tasked with creating a cohesive visual language to tie together
a 3D world made up of holographic captures made with DepthKit
and a 360 degree immersive video world.
To accomplish this task, I created a custom GPU-based particle system pipeline in
Unity3D which renders 360 video on a sphere made of particles. This allows a
seamless transition between DepthKit characters (also rendered as particles) and the
360 video world. An example of this transition, recorded in VR, can be seen below:
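The heart of rendering 360 video onto a particle sphere is an equirectangular lookup, which can be sketched like this (a CPU-side stand-in for what the actual shader computes per particle):

```javascript
// Sketch of the equirectangular lookup used to color a particle from a
// 360 video frame: convert the particle's direction on the sphere into
// (u, v) texture coordinates. Axis conventions are an assumption here.
function equirectUV(dir) {
  const len = Math.hypot(dir.x, dir.y, dir.z);
  const x = dir.x / len, y = dir.y / len, z = dir.z / len;
  const u = 0.5 + Math.atan2(x, z) / (2 * Math.PI); // longitude
  const v = 0.5 - Math.asin(y) / Math.PI;           // latitude
  return { u, v };
}
```

Because both the DepthKit characters and the video sphere are particles fed through the same pipeline, blending their positions and colors gives the seamless transition.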
Virtually Dating (#virtuallydating) is a multi-user VR experience. We used Perception
Neuron motion capture suits to bring two people on a blind date into the virtual realm.
The results of the shoot were produced into an episodic web series by Facebook and
Condé Nast, which pairs up real people on blind dates in VR to see if their VR
personalities make for a real-life match. Because the technology used is essentially in
beta, the experience was designed to play to these limitations.
I was brought on to build out interactivity in the scenes, as well as operate the
experience during the filming of the show.
Celestial Bodies is a multi-user VR experience set to "Set It Off" by Diplo. It is
fully immersive, transporting users from the moment they enter by leveraging IoT
devices, custom costuming, props, and actors.
Nate was brought on specifically to create custom interactive particle systems which
make up the users' bodies in VR. He created a system by which the parameters of the
particle physics and appearance could be changed by painting on textures dynamically
loaded by custom DirectX shaders. This allowed the artistic director to customize the
look as he saw fit without being bound to making changes in code.
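Such a texture-driven setup can be sketched as remapping a painted pixel's channels into particle parameters; the channel assignments and ranges below are hypothetical, and the production version sampled the textures inside the DirectX shaders themselves:

```javascript
// Sketch of driving particle behavior from a painted control texture,
// assuming a hypothetical channel mapping: each pixel's RGBA values
// are remapped into physics/appearance parameters.
function paramsFromPixel([r, g, b, a]) {
  const n = c => c / 255; // normalize a 0..255 channel to 0..1
  return {
    drag:       n(r),              // red   -> drag
    turbulence: n(g) * 4,          // green -> turbulence strength
    size:       0.01 + n(b) * 0.1, // blue  -> particle size
    brightness: n(a),              // alpha -> brightness
  };
}
```

The appeal of this approach is exactly what the paragraph above describes: the art director repaints a texture instead of requesting a code change.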
Hermès went all out for the launch of their spring Men's line with a massive fashion
show and an even bigger party in a warehouse in DTLA. Nate was hired by Mash
Studios to produce two interactive experiences for the event.
Say Something
Say Something is a mobile driven interactive projection where users are invited to send
a message via SMS to be projected on a wall.
Technologies Used: openFrameworks, Node.js, Twilio
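The SMS path can be sketched as parsing Twilio's form-encoded webhook into a message for the projection. `From` and `Body` are Twilio's actual webhook field names; the moderation list here is a hypothetical stand-in:

```javascript
// Sketch of the Node.js side of Say Something: parse the form-encoded
// body Twilio POSTs for an incoming SMS and hand the text to the
// projection layer. The profanity filter is a simplified placeholder.
const BANNED = ['badword']; // placeholder moderation list

function messageFromWebhook(formBody) {
  const params = new URLSearchParams(formBody);
  const text = (params.get('Body') || '').trim();
  if (!text) return null;
  if (BANNED.some(w => text.toLowerCase().includes(w))) return null;
  return { from: params.get('From'), text };
}
```

A returned message object would then be pushed to the openFrameworks projection client for display on the wall.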
Silk Records
I built a system to play digital files using a Serato timecode record. The file was
triggered by placing an RFID enabled record sleeve on to a shelf.
Technologies Used: Node.js, Raspberry Pi, xwax
Razorshop is a BLE-enabled multiscreen shopping platform I helped develop while
at Razorfish. It was picked up by Adobe and used heavily in their marketing efforts in
2015. I personally took the demo to many events, including Adobe Summit in London and
DMEXCO in Cologne. In both locations, I helped train local Razorfish employees on the
platform so that they could take demos all over EMEA.
It was one of the projects that led Adobe to name Publicis their digital marketing
partner of the year in 2016.
In partnership with Refinery29 and Reify, we
brought an "augmented reality" GIF photo booth to R29's School
of Self Expression at SXSW in 2016. After picking out some pieces of Neiman
Marcus clothing and getting a demo of the Reify AR shopping platform, users were
welcomed to step into our photo booth, where their clothing choices created custom
overlays while a GIF was captured. Users were then able to share the capture on Twitter
and @ themselves directly from the booth. Brand ambassadors were also standing by with a
tablet to share via email or Instagram.
My role on the project was building the software for the booth and integrating it
with a Node.js server running locally on the booth, which takes care of uploading to an
event-specific microsite and triggering the tweets.
Created as a demonstration for the Razorfish Client Summit in the summer of 2014.
To help people talk about and experience IoT, I developed a set of acrylic
cubes using Pinoccio. The cubes
are outfitted with LEDs and an MPU (Motion Processing Unit). The LEDs are mounted on a 3D-printed
frame, and the batteries and electronics are mounted in the middle.
I made a custom Pinoccio backpack to hold the peripherals. It contains an MPU6050
breakout board, as well as pin headers to connect the LEDs (NeoPixel Sticks from
Adafruit). I used the battery that comes with Pinoccio to power the microcontroller, and added
another, larger LiPo specifically to power the lights. When a cube is picked up, it
turns on and sends a pulse through the mesh, and every cube blinks in acknowledgement. The
color of the held cube can then be changed by the values read from the MPU (yaw, pitch,
roll). When the held cube is shaken, it triggers an animation in all the other cubes.
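The orientation-to-color mapping can be sketched as scaling each angle into an RGB channel; the exact scaling on the real firmware is an assumption here:

```javascript
// Sketch of the cube's orientation-to-color mapping, assuming yaw,
// pitch, and roll arrive in degrees and each drives one RGB channel.
// The real version runs on the Pinoccio's ATmega, not in JavaScript.
function colorFromMpu(yaw, pitch, roll) {
  // wrap each angle into [0, 360), then scale to a 0..255 byte
  const byte = deg => Math.round((((deg % 360) + 360) % 360) / 360 * 255);
  return { r: byte(yaw), g: byte(pitch), b: byte(roll) };
}
```

Rotating the held cube therefore sweeps it smoothly through the color space, and the resulting color can be broadcast over the mesh along with the shake animation trigger.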
When the cubes are turned on, they looked a bit like this:
HoloLens R&D
I have been extremely lucky to be among the first developers in the world to get my
hands on a Microsoft HoloLens device. I have been spending a good portion of my time
researching the possibilities and potential of this groundbreaking device. Here is the
progress I have made. Demos are made using Unity3D.
Razorfish Emerging Experiences June 2013 - August 2016
Work as a member of a highly creative, cross-disciplinary team primarily focused on
developing interactive experiences for retail, trade show, and museum environments
Perform in high-stress trade show situations: installing and maintaining experiences,
and telling the story of the project
Participate in ideation sessions designed to explore uses for new and emerging
media technologies such as Kinect, Leap Motion, Myo, and others
Rapidly create prototypes for demonstration at pitches and iterate towards a final
project
Perform onsite installation and QA testing of projects at client sites
Develop new tools and reusable code as needed by a project
Post relevant news to official twitter handle
Spectra Logic May 2012 - August 2012
Develop user interface for large scale magnetic tape libraries
Develop and test cross platform API for legacy MFC code (C++)
CU Environmental Center Jan 2012 - May 2013
Design and develop a smartphone app aimed at encouraging use of sustainable
transportation
Extensive use of Google Maps API
Send It! Apps Sept 2011 - Dec 2012
Develop two Android apps currently selling on the Android Market
Port existing iPhone code for use on Android platform
Education
University of Colorado Class of 2013
B.S. Electrical and Computer Engineering
Skills
Development
Advanced or better in most object-oriented languages, including C/C++, C#, Java
Proficient in application frameworks such as Cinder, Unity3D, openFrameworks,
Processing, Android, WPF
Working knowledge of graphics APIs including OpenGL, DirectX, GLSL
Proficient in standard web technologies including Node.js, HTML, XML, JavaScript
Hardware
Advanced or better knowledge of ATmega microcontrollers, including the Arduino IDE
Experience with IEEE 802.15.4 mesh networks
Proficient with basic circuit design and preparation for fabrication
Software
Proficient with technical software including SolidWorks, LTspice, MATLAB, Wolfram
Mathematica, Blender, Autodesk Inventor
Proficient with creative software including Adobe Photoshop, Illustrator, After
Effects, Ableton Live
Exploring other media software including TouchDesigner