Follow Me Dragon AR

high poly

preliminary material iris control with random gaze

diffuse, normal, metalness, roughness, emissive, ambient occlusion

simple animation test

I worked with a medium-sized team to create The Virtual Reality Company's first VR project featuring virtual embodiment. The Martian VR Experience received a Silver Digital Craft Lion at Cannes in 2016.


-Blueprint Logic

-Level / Asset Optimizations

-Maya Tools (e.g. camera exporter)

-Post Process - Polygon Penetration Solutions

-Blutility Scripting

-Additional Level Design (Blast Off to Hermes)

-Interaction Design (Blast Off to Hermes)

-Physics Interaction

-Additional Lighting

-Materials / Shaders

-HISM (Hierarchical Instanced Static Mesh Component) Research and Setup

-Integrated 3rd-Party Spatial Sound Solution

-Troubleshoot / Debug Editor Crashes

-Technical Documentation

Homebrewed Vive / iOS mocap solution
Kiteboy head with XGen hair groom, attached to mannequin body

WPO used here to sculpt the character's brow, nose, and jaw.

Homebrewed Vive / iOS mocap solution applied to Boffo: early testing of both the hair systems and the mocap.

WPO used here for character 'mass', combined with normal-map blending to fake muscle groups and veins.

fXTweaker - an outliner tool that shows only particle-related nodes and information. Used for node optimizations and quick selections in scenes containing thousands of nodes.

Written in MEL / PySide.
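The core idea behind fXTweaker, filtering a scene's full node list down to only the particle-related nodes, can be sketched in plain Python. This is an illustrative stand-in, not the tool's actual code: in Maya the lookup would go through `maya.cmds.ls(type=...)`, and the node names and type set below are hypothetical.

```python
# Minimal sketch of fXTweaker-style filtering: given a flat list of
# (node_name, node_type) pairs, keep only the particle-related nodes.
# The type set and scene contents are illustrative examples.

PARTICLE_TYPES = {"particle", "nParticle", "emitter", "field", "nucleus"}

def filter_particle_nodes(scene_nodes):
    """Return the names of nodes whose type is particle-related."""
    return [name for name, node_type in scene_nodes
            if node_type in PARTICLE_TYPES]

scene = [
    ("pSphere1", "mesh"),
    ("emitter1", "emitter"),
    ("nParticle1", "nParticle"),
    ("lambert1", "shader"),
    ("turbulenceField1", "field"),
]
print(filter_particle_nodes(scene))  # ['emitter1', 'nParticle1', 'turbulenceField1']
```

In a scene with thousands of nodes, an outliner fed by a filter like this shows only the handful of FX nodes an artist actually needs to select or optimize.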

World Position Offset

Procedural vs. none - preliminary test

Using a Vive HMD, controllers, and pucks, as well as an iPhone, I built a relatively inexpensive mocap rig to drive an animated character in realtime. Initial testing was done using stock Unreal assets: the Kiteboy head and mannequin body were used for initial setup, and a MetaHuman followed shortly after.

The final goal was to drive a furry host in realtime using easy-to-access mocap equipment.
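One small but essential piece of a rig like this is calibration: a tracker (e.g. a Vive puck) strapped to a limb does not sit exactly at the bone it drives, so you record its position once while the performer holds a known rest pose, store the offset, and apply it every frame. The sketch below shows only that positional idea with plain tuples; the function names and numbers are illustrative, and a real setup would also handle rotation.

```python
# Sketch of the calibration step for a tracker strapped to a limb:
# capture the tracker-to-bone offset in a rest pose, then have the
# bone follow the tracker, shifted by that offset, every frame.

def calibrate(tracker_pos_at_rest, bone_rest_pos):
    """Offset from tracker to bone, captured once during a rest pose."""
    return tuple(b - t for t, b in zip(tracker_pos_at_rest, bone_rest_pos))

def drive_bone(tracker_pos, offset):
    """Per-frame: bone follows the tracker, shifted by the stored offset."""
    return tuple(t + o for t, o in zip(tracker_pos, offset))

# Calibrate once (tracker sits slightly below the bone here)...
offset = calibrate((0.0, 1.0, 0.0), (0.0, 1.25, 0.0))
# ...then drive the bone from live tracker data each frame.
print(drive_bone((0.5, 1.0, 0.25), offset))  # (0.5, 1.25, 0.25)
```

The same offset is reused for every subsequent frame, so calibration only needs to happen once per session unless the puck slips.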

Three facial blendshapes (happy, sad, amazed), procedurally blended on top of a body animation to breathe life into the character.
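The "procedural" part can be sketched as a cross-fade over time between the three emotion weights. This is an assumed illustration of the technique, not the actual implementation: the easing curve and cycle timing below are invented, and in Maya or Unreal the resulting weights would feed blendshape (morph target) channels layered over the body animation.

```python
import math

# Sketch of procedural emotion: cycle through three blendshape weights
# (happy, sad, amazed), easing out of one and into the next, so the
# face is never static while the body animation plays underneath.

EMOTIONS = ("happy", "sad", "amazed")

def smoothstep(t):
    """Ease-in/ease-out curve on [0, 1]."""
    t = max(0.0, min(1.0, t))
    return t * t * (3.0 - 2.0 * t)

def emotion_weights(time_s, cycle_s=6.0):
    """Blendshape weights at a given time; exactly two are ever nonzero."""
    phase = (time_s % (cycle_s * len(EMOTIONS))) / cycle_s  # 0..3
    index = int(phase)
    blend = smoothstep(phase - index)          # fade toward the next emotion
    weights = {name: 0.0 for name in EMOTIONS}
    weights[EMOTIONS[index]] = 1.0 - blend
    weights[EMOTIONS[(index + 1) % len(EMOTIONS)]] = blend
    return weights

print(emotion_weights(0.0))  # {'happy': 1.0, 'sad': 0.0, 'amazed': 0.0}
```

Because the weights always sum to one, the blended face stays inside the shapes' valid range no matter where in the cycle it is sampled.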


sfxToy - a realtime FX tool to streamline the creation and manipulation of Maya-generated particles inside the existing Sony game-art pipeline.

Written in MEL / PySide.

Art Tools

Diffuse, Specular, Normal Maps

Substance Painter's Physically Based Rendering is an amazing tool.   The more I use it, the more I love it.

Homebrewed Vive / iOS Mocap (Cheap) Solution for Realtime Broadcast

a more technical example

Sample Python Scripts

Homebrewed Vive / IOS mocap solution applied to MetaHuman 

Unreal 4.22, in-editor realtime raytracing


Character Process / Non-PBR Material

rigged and weighted


Raising A Rukus

Procedural Emotion

I enjoy working with Maya for asset creation, but I also love making tools that streamline my (or the team's) processes. Although I am not very fond of building tools in 3ds Max, I have experience doing so.

PBR material in Substance

high poly

The Martian VR Experience

Diffuse, Specular, Normal Maps

World Position Offset is a very powerful technique for manipulating vertex positions inside the material. From a single base mesh, unique variants can be generated.
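The idea can be shown outside the engine. In Unreal this math lives in the material's vertex stage, but the CPU sketch below captures it: hash each instance's ID into a phase, then push vertices along a sine pattern, so one base mesh produces per-instance variants. The hash constant, amplitude, and frequencies are illustrative choices, not values from the actual materials.

```python
import math

# Conceptual sketch of World Position Offset: offset each vertex as a
# function of its position plus a per-instance phase, so identical
# meshes render as unique variants. All constants are illustrative.

def wpo_offset(vertex_pos, instance_id, amplitude=0.05):
    """Offset a vertex along Z based on position and a per-instance phase."""
    x, y, z = vertex_pos
    phase = (instance_id * 0.618) % 1.0 * 2.0 * math.pi  # cheap per-instance hash
    return (x, y, z + amplitude * math.sin(x * 4.0 + y * 4.0 + phase))

# One base vertex, three instances -> three distinct results.
base_vertex = (1.0, 2.0, 0.0)
variants = [wpo_offset(base_vertex, i) for i in range(3)]
```

Because the offset is a pure function of position and ID, it is stable frame to frame; animating it (e.g. adding time into the sine) gives the breathing/mass effects described above.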

AR Prototyping


Ribbon Particle Tool

Gary Brunetti

Technical . Artistic . AR . VR . Realtime

rigged and weighted

low poly

I built this particle-based ribbon simulator for a graphics test.

Made in Xcode 2013
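A particle-based ribbon boils down to a trail of recent emitter positions turned into tapered segments. The original was built in Xcode; the Python sketch below is an assumed reconstruction of the data flow only (no rendering), with the class and parameter names invented for illustration.

```python
from collections import deque

# Sketch of a particle-based ribbon: the emitter leaves a trail of its
# recent positions; consecutive pairs become ribbon segments whose
# width tapers from the head toward the tail.

class Ribbon:
    def __init__(self, max_points=4, width=1.0):
        self.points = deque(maxlen=max_points)  # oldest points fall off
        self.width = width

    def emit(self, position):
        """Record the emitter's position for this frame."""
        self.points.append(position)

    def segments(self):
        """(start, end, width) triples; widest at the head, thinnest at the tail."""
        pts = list(self.points)
        n = len(pts) - 1
        return [(pts[i], pts[i + 1], self.width * (i + 1) / n)
                for i in range(n)]

ribbon = Ribbon(max_points=4)
for x in range(6):                     # move the emitter along +X
    ribbon.emit((float(x), 0.0))
# Only the last four positions survive, giving three tapered segments.
```

The fixed-length deque is what makes it a "trail": old particles expire automatically as new ones are emitted, so the ribbon follows the emitter at a constant length.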

I worked with a very small production team on Follow Me Dragon; most of us had to wear multiple hats. In addition to my TAD tasks, I was responsible for technical direction and for managing the engineering team, who were fairly green to production. My biggest contribution was conceiving and integrating the realtime navmesh functionality, which allows Drake to locate navigable areas in realtime and interact with them dynamically. This package of art, design, and boundary-pushing tech caught the eye of Apple - enough so that a special store demo version of Follow Me Dragon was playable at Apple Stores worldwide. It also led to an Apple Best of 2017 award (Tech and Innovation - AR).
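The realtime navigation idea can be sketched at its simplest: AR plane detection yields a set of walkable cells, and the character may only move within the region connected to where it stands. The flood fill below is a hypothetical stand-in for the engine's navmesh query, with the grid and cell values invented for illustration.

```python
from collections import deque

# Sketch of realtime navigable-area lookup: a BFS flood fill over a
# grid of walkable cells stands in for the engine's navmesh. The
# character can reach exactly the cells connected to its start cell.

def reachable_cells(walkable, start):
    """All walkable cells connected to `start` (4-neighbour flood fill)."""
    frontier = deque([start])
    seen = {start}
    while frontier:
        x, y = frontier.popleft()
        for nx, ny in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if (nx, ny) in walkable and (nx, ny) not in seen:
                seen.add((nx, ny))
                frontier.append((nx, ny))
    return seen

# Two detected surfaces: a 2x2 patch and a separate, disconnected cell.
walkable = {(0, 0), (0, 1), (1, 0), (1, 1), (5, 5)}
print(sorted(reachable_cells(walkable, (0, 0))))  # [(0, 0), (0, 1), (1, 0), (1, 1)]
```

Re-running the query as plane detection adds or removes cells is what makes the behaviour "realtime": the character's reachable region updates as the tracked environment changes.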


-Realtime Navigation

-Blueprint Logic

-Material Creation

-Shadow Solution (due to technical limitations, the shadow is created in a non-standard way)

-Technical Direction

-Engineering Management

-Engineering Advisor

-Realtime Reflection Material / Shader (Chrome)

-Material Optimizations

-Character Costume Creation (3D / 2D)

-UI (Tech)

-Maya Tools (e.g. character exporter)

-Technical Documentation

low poly

Copyright 2019 Gary Brunetti. All rights reserved.

I worked with a large team of mostly movie-industry professionals and a small group from the previous Martian VR project. Toward the end of the project, I teamed up with an engineer to research and create a workable solution for 360 capture inside UE4. The engineer handled the engine side, and I handled the editor side. Setup was also required on the particle side, both in the engine and in the materials and particle systems, with logic to compensate for the non-realtime capture.

The intent of this feature is to capture a super-high-resolution 360 snapshot at the camera's location each frame. This 360 snapshot is actually made of many individual screenshots that get stitched back together, with lens correction, into a 360 video format. Additional functionality allowed specific non-buffer-visualization channels, like Emissive, to be re-routed into a channel that could be visualized and captured, like Metal.
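The core of the stitching math is a mapping from each output pixel to a direction on the sphere, which in turn decides which individual screenshot supplies the color. The sketch below is a simplified illustration, not the production code: it idealizes the captures as six cube faces and omits lens correction and blending entirely.

```python
import math

# Sketch of 360-stitch addressing: an equirectangular pixel maps to a
# unit direction, and that direction picks which capture (idealized
# here as a cube face) covers it. Lens correction/blending omitted.

def pixel_to_direction(u, v):
    """u, v in [0, 1) -> unit direction (longitude/latitude mapping)."""
    lon = (u - 0.5) * 2.0 * math.pi      # -pi .. pi across the image
    lat = (0.5 - v) * math.pi            # +pi/2 at top, -pi/2 at bottom
    return (math.cos(lat) * math.sin(lon),
            math.sin(lat),
            math.cos(lat) * math.cos(lon))

def dominant_face(direction):
    """Which cube-face capture covers this direction."""
    x, y, z = direction
    axis = max(("x", abs(x)), ("y", abs(y)), ("z", abs(z)),
               key=lambda a: a[1])[0]
    sign = {"x": x, "y": y, "z": z}[axis] >= 0
    return ("+" if sign else "-") + axis

print(dominant_face(pixel_to_direction(0.5, 0.5)))  # center pixel -> "+z"
```

Running this mapping for every output pixel (and sampling the chosen screenshot instead of just naming it) is, in essence, the stitch; the per-frame capture then advances the sequence one frame at a time, which is why the particle and material logic had to compensate for non-realtime playback.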

-360 Capture Editor Integration (Blueprints, Materials)

-Anim Notify-Based Footprint FX System

-Blueprint Logic

-Fluid Systems R&D

-Materials / Shaders

-Level / Asset Optimizations

-Maya Tools (e.g. level and prop exporters)

-Blutility Scripting

-Level Debugging

-Troubleshoot / Debug Editor Crashes

-Technical Documentation