Update: Congratulations to the 9to5 team for winning the competition!
Andy Pruett, Pablo Gnecco
I was asked to participate in a piece of experimental programming for 9to5, an entry into The Field Experiment art grant competition hosted by The Goat Farm Arts Center and The Hambidge Center. The piece that I created in collaboration with Andy Pruett reflected on the control we allow online bots to have over our lives, be it through the algorithmic sorting of Facebook posts or the "matches" sent to our online dating profiles. Specifically, it asked what happens as these bots start to break away from the net and we invite them into our homes in the form of products like Amazon Echo.
The whole hack was broadcast live via YouTube and on the website, where users were invited to chat and participate in changing the space in dialogue with the bots.
Press: The Creator's Project
For "The Big Show" in 2014, I helped develop a micro retail experience, which came to be known as RZR[shop], with the Emerging Experiences team at Razorfish. The aim was to deliver massive impact with a small footprint. The entire experience is housed in a custom-built shipping container designed to be easily transported and delivered to a venue such as SXSW or Times Square. Attached to the side of the shipping container is a vending machine, which was the focus of my work leading up to the event.
I developed a custom Arduino shield that interfaces with the control board of the vending machine and recreates the key presses normally executed by the keypad. The Arduino receives serial messages from a Raspberry Pi running a Node.js server, allowing us to vend products personalized based on the outcome of the experience. In this case, the experience is a multiscreen soccer game.
In addition, the Arduino drives a set of custom-built LEDs installed in the machine, which respond to the gameplay. When a shot is blocked, the lights flash green; when a goal is scored, they flash red, giving the goalie competitor exciting visual feedback offscreen.
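The Pi-to-Arduino link boils down to framing small commands over serial. The sketch below illustrates the idea in Python; the actual server ran Node.js, and the byte layout shown here is a hypothetical stand-in, not the real protocol.

```python
# Hypothetical framing for a "vend slot N" serial command.
# Layout (assumed): start byte 0x02, command 'V', slot number,
# XOR checksum of the payload, end byte 0x03.

def build_vend_command(slot: int) -> bytes:
    """Frame a vend command for the Arduino shield to replay as key presses."""
    if not 0 <= slot <= 99:
        raise ValueError("slot out of range")
    payload = bytes([ord("V"), slot])
    checksum = payload[0] ^ payload[1]
    return bytes([0x02]) + payload + bytes([checksum, 0x03])

# On the Pi, this frame would be written to the serial port, e.g. with
# pyserial: serial.Serial("/dev/ttyACM0", 9600).write(build_vend_command(7))
```

The checksum lets the Arduino reject frames corrupted in transit rather than vending the wrong product.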
Razorfish Emerging Experiences
The Human Kaleidoscope is an interactive, audio-reactive installation created for Future Night, held at Mammal Gallery in Atlanta, GA. It uses a camera input to mix the user's image into the scene and a MIDI controller to let the user control the scene. The scene also responds to audio created by the user or by surrounding installations.
For a major retail client, I was asked to develop a way to create animations for a volumetric LED display. Traditional motion graphics tools such as After Effects have no built-in modes for creating animations for this type of display. For this reason, I created custom software with OpenFrameworks to enable our motion design team to create content for the display.
I wanted to create a tool that had a familiar keyframe approach to animation. I chose to leverage ofxTimeline, created by James George, for the UI and timeline management. From there, I developed a set of modes the designer could use to create the animations.
The software outputs the animations as PNG sequences, which are converted to .mov files that can then be played back by the content delivery system that drives many displays in the store. In addition, the software provides a 3D rendering of the display, so the motion design team has a reference for what designs will look like on the final display. (Note: this content is from preliminary tests and does not reflect the content to be played on the final version.)
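Driving a volumetric display from flat rendered frames means deciding how 2D pixels map to voxels. The Python sketch below shows one common convention, treating each frame as side-by-side depth slices; the real tool was OpenFrameworks (C++), and the actual layout and dimensions are assumptions here.

```python
# Illustrative mapping from a flat rendered frame to voxel colors.
# Assumption: the frame is (width * depth) pixels wide and `height`
# pixels tall, with the display's depth slices laid out side by side.

def frame_to_voxels(pixels, width, height, depth):
    """Slice a flat pixel list into colors indexed as voxels[z][y][x]."""
    stride = width * depth  # pixels per image row
    voxels = []
    for z in range(depth):
        layer = []
        for y in range(height):
            row = [pixels[y * stride + z * width + x] for x in range(width)]
            layer.append(row)
        voxels.append(layer)
    return voxels
```

A 3D preview like the one described above can then draw each `voxels[z][y][x]` entry at its physical LED position.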
Software Engineering, Install
For the launch of the T-Mobile Signature Store in Times Square, I helped develop an interactive experience surrounding the carrier's "Simple Global" campaign. The experience included a "Kinect green screen" photo op set in one of many destinations.
To accomplish this, I worked on a Kinect client/server system that streams a guest's image, with the background subtracted, to another device running the photo booth. The images are then sent to the massive billboard above the store.
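The core of a Kinect "green screen" is compositing with the sensor's player-index mask rather than chroma keying. A minimal Python sketch of the per-pixel idea (the production system did this on the GPU and streamed the result over the network):

```python
# Minimal sketch of player-mask compositing, the idea behind a
# "Kinect green screen": pixels flagged as belonging to a tracked
# player keep their color; everything else becomes transparent.

def apply_player_mask(rgb_pixels, mask):
    """Return RGBA pixels: alpha 255 where the Kinect player-index
    mask is set, alpha 0 (transparent background) elsewhere."""
    out = []
    for (r, g, b), is_player in zip(rgb_pixels, mask):
        out.append((r, g, b, 255 if is_player else 0))
    return out
```

The transparent output can then be layered over any destination backdrop before being sent on to the billboard.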
For the Miami store, I created a content development tool for the volumetric LED chandelier featured in the middle of the store. Details of this project are in another section of this page.
Awards: DSE Apex Gold in Retail
Technical direction, lead development
Sound Journeys is a collaboration with The Principals commissioned by Ford and B&O Play. It includes three distinct but connected zones of immersive light, sound, and architecture.
I was asked to create a 24-channel spatial audio system and program an audio-reactive DMX lighting system. To achieve this, I built two lightweight tools in OpenFrameworks, which I will be releasing as ofxDBAP and ofxOscDMXBouncer, and an audio mixer in Max/MSP.
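DBAP (Distance-Based Amplitude Panning) weights every speaker by its distance to a virtual source, with no sweet spot, which suits walk-through installations. Below is the standard DBAP formulation in Python; the internals of ofxDBAP may differ, and the rolloff and blur values are illustrative.

```python
import math

def dbap_gains(src, speakers, rolloff_db=6.0, blur=0.2):
    """Standard Distance-Based Amplitude Panning: per-speaker gains
    for a 2D virtual source, normalized to constant total power.

    src       -- (x, y) position of the virtual source
    speakers  -- list of (x, y) speaker positions
    rolloff_db -- attenuation per doubling of distance
    blur      -- keeps gains finite when the source sits on a speaker
    """
    a = rolloff_db / (20.0 * math.log10(2.0))  # exponent from rolloff
    dists = [math.sqrt((sx - src[0]) ** 2 + (sy - src[1]) ** 2 + blur ** 2)
             for sx, sy in speakers]
    raw = [1.0 / d ** a for d in dists]
    k = 1.0 / math.sqrt(sum(g * g for g in raw))  # power normalization
    return [k * g for g in raw]
```

Moving the source point over time and re-evaluating the gains per audio block is enough to pan sound smoothly through a 24-speaker array.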
This work was created for UNCOMPRESSED Special Event Night Vol. 4, held at 3-Legged Dog in NYC.
Via 3LD: We are excited to announce that the groundbreaking media art group exhibition .zip will be open to the public in 3LD Studio B. In 2015, the “Future of Today: Imaginary Future” exhibition, organized by Today Art Museum (TAM), opened to widespread critical acclaim in Beijing, China. In July 2017, “Future of Today" developed into a secondary project around the theme “.zip.”
The piece manifested as a real-time, audiovisual projection-mapping experiment. A MIDI-reactive 3D particle system built in Unity is rendered across the three massive walls of the venue, giving viewers a uniquely immersive view into another world.
The audio track was produced and performed by Reese Donohue. Visual art and performance by Nate Turley.
Video Coming Soon
Sarah Meyohas, Red Bull Arts
Cloud of Petals VR is a series of 6 virtual reality vignettes created with artist Sarah Meyohas as part of Sarah's solo exhibition at Red Bull Arts NY. The project features massive interactive particle systems made of 100,000 rose petals, which were photographed at Bell Labs last year, as well as images created by a generative adversarial network trained on those photographs. Each scene is tied together with a meditative spatial audio accompaniment and viewed on Oculus Rift headsets. The show runs from October 12 to December 10, 2017 at Red Bull Arts, 220 W 18th St, New York, NY.
Lead Graphics Developer
ZIKR: A SUFI REVIVAL is an interactive social VR experience that uses song and dance to transport four participants into ecstatic Sufi rituals. It also explores the motivations behind followers of this mystical Islamic tradition, still observed by millions around the world.
For this project, I was tasked with creating a cohesive visual language to tie together a 3D world made up of holographic captures created with DepthKit and a 360-degree immersive video world.
To accomplish this, I created a custom GPU-based particle system pipeline in Unity3D that renders 360 video on a sphere made of particles. This allows a seamless transition between DepthKit characters (also rendered as particles) and the 360 video world. An example of this transition, recorded in VR, can be seen below:
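Rendering 360 video on a sphere of particles comes down to giving each particle the right texture coordinate: its direction on the sphere maps to a UV in the equirectangular video frame. A Python sketch of that mapping (the production version lived in a Unity shader, whose exact UV convention may differ):

```python
import math

def equirect_uv(x, y, z):
    """Map a unit direction on the particle sphere to equirectangular
    UV coordinates, so each particle samples the matching 360-video texel.

    Convention assumed: u wraps with longitude (atan2 of z, x),
    v runs 0 at the top of the sphere to 1 at the bottom.
    """
    u = 0.5 + math.atan2(z, x) / (2.0 * math.pi)
    v = 0.5 - math.asin(y) / math.pi
    return u, v
```

Because each particle samples the video independently, the same particles can blend between the video texture and a DepthKit character's colors, which is what makes the transition seamless.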
Conde Nast & Facebook
Virtually Dating (#virtuallydating) is a multi-user VR experience. We used Perception Neuron motion capture suits to bring two people on a blind date into the virtual realm. The results of the shoot have been produced into an episodic web series by Facebook and Conde Nast, which pairs up real people on blind dates in VR to see if their VR personalities make for a real-life match. Because the technology used is essentially in beta, the experience was designed to play to these limitations.
Nate was brought on to build out interactivity in the scenes, as well as operate the experience during the filming of the show.
Museum of Sex
Celestial Bodies is a multi-user VR experience set to "Set It Off" by Diplo. Upon entering, users are transported into a fully immersive world built from IoT devices, custom costuming, props, and actors.
Nate was brought on specifically to create custom interactive particle systems which make up the users' bodies in VR. He created a system by which the parameters of the particle physics and appearance could be changed by painting on textures dynamically loaded by custom DirectX shaders. This allowed the artistic director to customize the look as he saw fit without being bound to making changes in code.
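Driving shader parameters from painted textures means sampling a control map and remapping the pixel value into a parameter range. A Python sketch of that lookup (the real implementation sampled these maps in custom DirectX shaders; the nearest-neighbor sampling and 8-bit grayscale format here are assumptions):

```python
# Illustrative "painted parameter" lookup: an artist paints a grayscale
# control texture, and each particle samples it to get a physics value.

def sample_param(texture, u, v, lo, hi):
    """Nearest-neighbor sample of a grayscale control texture
    (rows of 0-255 values), remapped to the range [lo, hi]."""
    h = len(texture)
    w = len(texture[0])
    x = min(int(u * w), w - 1)
    y = min(int(v * h), h - 1)
    t = texture[y][x] / 255.0
    return lo + t * (hi - lo)
```

Because the textures are loaded dynamically, repainting a map changes the look immediately, with no code changes required from the artistic director.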
Technical Direction, Lead Development
Hermès went all out for the launch of their spring Men's line with a massive fashion show and an even bigger party in a warehouse in DTLA. Nate was hired by Mash Studios to produce two interactive experiences for the event.
Say Something is a mobile-driven interactive projection where users are invited to send a message via SMS to be projected on a wall.
Technologies Used: OpenFrameworks, Node.js, Twilio
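When Twilio receives an SMS, it POSTs the message to a webhook with standard form fields such as `Body` and `From`, and expects a TwiML `<Response>` back. A Python sketch of the handler's core (the real server was Node.js; the length cap and reply text are illustrative assumptions):

```python
# Illustrative handler for a Twilio incoming-SMS webhook: pull out the
# text to project and build a minimal TwiML acknowledgement.

def handle_incoming_sms(form):
    """form is the webhook's POST parameters (Twilio's standard fields).
    Returns (text_for_projection, twiml_reply)."""
    text = form.get("Body", "").strip()[:140]  # cap length for the wall
    twiml = ('<?xml version="1.0" encoding="UTF-8"?>'
             "<Response><Message>Look up!</Message></Response>")
    return text, twiml
```

The extracted text can then be pushed over a socket to the OpenFrameworks app that renders the projection.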
I built a system to play digital files using a Serato timecode record. Playback was triggered by placing an RFID-enabled record sleeve onto a shelf.
Technologies Used: Node.js, Raspberry Pi, xwax
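The interesting part of the sleeve trigger is debouncing: an RFID reader reports the same tag continuously while the sleeve sits on the shelf, but playback should start only when a new sleeve lands. A Python sketch of that logic (the real system ran Node.js on the Pi, driving xwax; the tag-to-track mapping is hypothetical):

```python
# Illustrative sleeve-trigger logic: turn a stream of RFID reads into
# play events, firing only when a different sleeve hits the shelf.

class SleeveTrigger:
    def __init__(self, library):
        self.library = library   # tag id -> audio file path (assumed mapping)
        self.current = None      # tag currently on the shelf

    def read(self, tag_id):
        """Return the file to play, or None if nothing should change."""
        if tag_id == self.current:
            return None          # same sleeve still on the shelf
        self.current = tag_id
        return self.library.get(tag_id)
```

The returned path is then handed to the timecode playback system, so the Serato record scratches and cues that file.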
Software Development, Demo Training
Razorshop is a BLE-enabled multiscreen shopping platform I helped develop while at Razorfish. It was picked up by Adobe and used heavily in their marketing efforts in 2015. I personally took the demo to many events, including Adobe Summit in London and DMEXCO in Cologne. In both locations, I helped train local Razorfish employees on the platform so that they could take demos all over EMEA.
It was one of the projects that led Adobe to name Publicis their digital marketing partner of the year in 2016.
Technical Direction, Lead Development, Install
In partnership with Refinery29 and Reify, we brought an "augmented reality" GIF photo booth to R29's School of Self Expression at SXSW in 2016. After picking out pieces of Neiman Marcus clothing and getting a demo of the Reify AR shopping platform, users were welcomed to step into our photo booth, where their clothing choices created custom overlays while a GIF was captured. Users were then able to share the capture on Twitter and @ themselves directly from the booth. Brand ambassadors were also standing by with a tablet to share via email or Instagram.
My role on the project was building the software for the booth and integrating it with a Node.js server running locally on the booth, which handles uploading to an event-specific microsite and triggering the tweets.
Created as a demonstration for Razorfish Client Summit in the summer of 2014
To help people talk about, and experience, IoT, I developed a set of acrylic cubes using Pinoccio boards. The cubes are outfitted with LEDs and an MPU (Motion Processing Unit). The LEDs are mounted on a 3D printed frame, and the batteries and electronics are mounted in the middle.
I made a custom Pinoccio backpack to hold the peripherals. It contains an MPU6050 breakout board, as well as pin headers to connect the LEDs (NeoPixel Sticks from Adafruit). I used the battery that comes with the Pinoccio to power the micro, and added another, larger LiPo specifically to power the lights. When a cube is picked up, it turns on and sends a pulse through the mesh; every cube blinks in acknowledgement. The color of the held cube can then be changed by the values read from the MPU (yaw, pitch, roll). When the held cube is shaken, it triggers an animation in all the other cubes. When the cubes are turned on, they look a bit like this:
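Mapping orientation to color just means remapping each MPU angle onto one color channel. A Python sketch of one such mapping (the actual firmware ran on the Arduino-compatible Pinoccio, and the axis-to-channel assignment here is an illustrative assumption):

```python
# Illustrative yaw/pitch/roll -> RGB mapping for the held cube's
# NeoPixels. Angle ranges assumed: yaw 0..360, pitch -90..90,
# roll -180..180 degrees.

def ypr_to_rgb(yaw, pitch, roll):
    """Each orientation axis drives one 0-255 color channel."""
    def chan(angle, span):
        return int(255 * ((angle % span) / span))
    return (chan(yaw, 360.0),          # red follows yaw
            chan(pitch + 90.0, 180.0), # green follows pitch
            chan(roll + 180.0, 360.0)) # blue follows roll
```

Tilting or spinning the held cube therefore sweeps it smoothly through the color space, and the resulting color can be broadcast over the mesh to the other cubes.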
I have been extremely lucky to be among the first developers in the world to get my hands on a Microsoft HoloLens device. I have been spending a good portion of my time researching the possibilities and potential of this groundbreaking device. Here is the progress I have made. Demos are made using Unity3D.
For my most updated PDF version, click here
B.S. Electrical and Computer Engineering