I worked with a medium-sized team to create The Virtual Reality Company's first VR project with virtual embodiment. The Martian VR Experience received a Silver Digital Craft Lion at Cannes Lions 2016.


Contributions:

-Blueprint Logic

-Level / Asset Optimizations

-Maya Tools ( e.g. camera exporter )

-Post Process - Polygon Penetration Solutions

-Blutility scripting

-Additional Level Design ( Blast Off to Hermes )

-Interaction Design ( Blast Off to Hermes )

-Physics Interaction

-Additional Lighting

-Materials / Shaders

-Modeling 

-HISM (Hierarchical Instanced Static Mesh Component) Research and Setup

-Integrated 3rd Party Spatial Sound Solution

-Troubleshoot / Debug editor crashes

-Technical Documentation

simple animation test


Diffuse, Specular, Normal Maps


Ribbon Particle Tool

The Martian VR Experience



On this project I worked directly with our Lead Virtual / Augmented Reality Designer.  We created two AR demos that explore both tech and design in AR on mobile, with PC-quality graphics, all while running at frame rate for a massive IP.


Contributions:

-AR Character - User look-at behavior and interaction

-User Location - Character Animation Blending Logic

-Blueprint Logic

-Character Migration / ARKit Integration

-UI

-FX

-Material Creation

-Audio Integration


Character Process / non-PBR material

concept




AR Tech and Design Demos

PC Quality Graphics on a Mobile Device

Character AI Believability / Presence


render

Follow Me Dragon AR

a more technical example

Sample Python Scripts

I built this particle based ribbon simulator for a graphics test.

Made in Xcode 2013
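The original simulator lived in that Xcode project, but the core idea is compact enough to sketch: a chain of particles is integrated with Verlet steps and relaxed with distance constraints, and the resulting positions become the ribbon's spine. The Python below is only an illustrative rewrite of that idea; the particle count, gravity, and iteration values are placeholder assumptions, not the original implementation.

# Illustrative particle-chain ribbon: Verlet integration plus distance constraints.
# All values here (particle count, gravity, iterations) are placeholder assumptions.
class RibbonSim:
    def __init__(self, num_points=20, segment_length=0.5, gravity=(0.0, -9.8, 0.0)):
        self.segment_length = segment_length
        self.gravity = gravity
        # Lay the chain out along +X to start; index 0 is the pinned root.
        self.points = [[i * segment_length, 0.0, 0.0] for i in range(num_points)]
        self.prev = [p[:] for p in self.points]

    def step(self, dt, root_position, iterations=8):
        # Verlet integration: next position from current and previous positions plus gravity.
        for i in range(1, len(self.points)):
            p, q = self.points[i], self.prev[i]
            moved = [p[k] + (p[k] - q[k]) + self.gravity[k] * dt * dt for k in range(3)]
            self.prev[i] = p
            self.points[i] = moved

        # Pin the root to whatever emits the ribbon (an animated locator, a hand, etc.).
        self.points[0] = list(root_position)
        self.prev[0] = list(root_position)

        # Relax distance constraints so neighboring particles stay one segment apart.
        for _ in range(iterations):
            for i in range(len(self.points) - 1):
                a, b = self.points[i], self.points[i + 1]
                delta = [b[k] - a[k] for k in range(3)]
                dist = max(1e-6, sum(c * c for c in delta) ** 0.5)
                correction = (dist - self.segment_length) / dist * 0.5
                for k in range(3):
                    if i > 0:
                        a[k] += delta[k] * correction
                        b[k] -= delta[k] * correction
                    else:
                        # The root is pinned, so the second point absorbs the full correction.
                        b[k] -= delta[k] * correction * 2.0
        return self.points

Each step returns the chain of positions; the renderer then extrudes a camera-facing strip along that spine to draw the ribbon.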

sfxToy - a real-time FX tool that streamlines the creation and manipulation of Maya-generated particles inside the existing Sony game art pipeline.


Written in MEL / PySide.
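The shipped tool is Sony pipeline code, so the snippet below is only a stripped-down sketch of the pattern it followed: a PySide panel wrapped around Maya's particle commands, so an artist can create and tweak emitters without leaving their workflow. The class and widget names are invented for illustration.

# Hypothetical sfxToy-style helper (not the production tool): creates an
# emitter + particle pair and exposes the emission rate on a slider.
import maya.cmds as cmds

try:
    from PySide2 import QtCore, QtWidgets
except ImportError:  # the original tool targeted PySide / Qt4
    from PySide import QtCore
    from PySide import QtGui as QtWidgets


class SfxToySketch(QtWidgets.QDialog):
    def __init__(self, parent=None):
        super(SfxToySketch, self).__init__(parent)
        self.setWindowTitle("sfxToy (sketch)")
        self.emitter = None

        layout = QtWidgets.QVBoxLayout(self)
        create_btn = QtWidgets.QPushButton("Create Emitter + Particles")
        create_btn.clicked.connect(self.create_particles)
        layout.addWidget(create_btn)

        self.rate_slider = QtWidgets.QSlider(QtCore.Qt.Horizontal)
        self.rate_slider.setRange(1, 1000)
        self.rate_slider.setValue(100)
        self.rate_slider.valueChanged.connect(self.set_rate)
        layout.addWidget(self.rate_slider)

    def create_particles(self):
        # emitter() returns the new emitter node; particle() returns [transform, shape].
        self.emitter = cmds.emitter(type="omni", rate=100)[0]
        particle_shape = cmds.particle()[1]
        cmds.connectDynamic(particle_shape, emitters=self.emitter)

    def set_rate(self, value):
        # Drive the emitter's rate attribute straight from the slider.
        if self.emitter:
            cmds.setAttr(self.emitter + ".rate", float(value))


# Inside Maya's Script Editor: SfxToySketch().show()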

Unreal 4.22 in-editor - real-time ray tracing

concept

VR Prototyping

high poly

fXTweaker - an outliner tool that shows only particle-related nodes and information.  Used for node optimizations and quick selections in scenes that contain thousands of nodes.

Written in MEL / PySide.
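At its core, an outliner like this is just a filtered node listing with quick-select helpers. A minimal Maya Python sketch of that idea (the node-type list and function names are assumptions, not the production logic):

# Minimal idea behind an fx-only outliner: list just the particle-related nodes
# so they can be inspected or selected without digging through the full scene.
import maya.cmds as cmds

# Node types of interest; trimmed and assumed for illustration.
FX_NODE_TYPES = ["particle", "nParticle", "pointEmitter", "nucleus", "field"]


def list_fx_nodes():
    """Return particle-related nodes in the scene, grouped by node type."""
    fx_nodes = {}
    for node_type in FX_NODE_TYPES:
        try:
            found = cmds.ls(type=node_type) or []
        except RuntimeError:
            continue  # skip types that do not exist in this Maya version
        if found:
            fx_nodes[node_type] = found
    return fx_nodes


def select_fx_nodes(node_type=None):
    """Select every fx node, or only one type, for quick edits in heavy scenes."""
    grouped = list_fx_nodes()
    targets = grouped.get(node_type, []) if node_type else [n for ns in grouped.values() for n in ns]
    if targets:
        cmds.select(targets, replace=True)
    return targets

The production tool presented this kind of listing in a PySide panel; the filtering above is the part that keeps selections quick in scenes with thousands of nodes.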

Working with an engineer, we explored using front-facing cameras and other iOS sensors as a cheap facial mocap solution.  The engineer created the Objective-C app that samples and saves the mocap data, while I worked on the Maya side, using that data to drive rigs.  This was inspired by the facial capture technique created by Kite and Lightning.
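As a simplified sketch of the Maya side of that workflow (the real capture format and node names are not shown here; this assumes the app saved per-frame blend shape weights as JSON and that the rig's blendShape targets use matching names):

# Hypothetical example: drive a blendShape node from recorded facial-capture weights.
# Assumed JSON layout: [{"frame": 1, "weights": {"jawOpen": 0.42, ...}}, ...]
import json
import maya.cmds as cmds


def apply_facial_capture(json_path, blendshape_node="face_blendShapes", frame_offset=0):
    """Read captured weights and key them onto the rig's blend shape targets."""
    with open(json_path, "r") as f:
        samples = json.load(f)

    for sample in samples:
        frame = sample["frame"] + frame_offset
        for target, weight in sample["weights"].items():
            attr = "{0}.{1}".format(blendshape_node, target)
            if cmds.objExists(attr):  # skip targets the rig does not have
                cmds.setKeyframe(attr, time=frame, value=weight)


# apply_facial_capture("/path/to/capture_session.json")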


Contributions:
-Maya Python script to read and process mocap data

-Created the 51 blend shapes needed by the application

-Rig cleanup / setup

-Controller rig setup

-Technical Documentation




diffuse, normal, metalness, roughness, emissive, ambient occlusion


Substance Painter's physically based rendering workflow is an amazing tool.  The more I use it, the more I love it.

low poly

AR Prototyping


render

low poly

rigged and weighted


Art Tools

I enjoy working with Maya for asset creation, but I also love making tools that streamline my (or the team's) processes.  Although I am not very fond of building tools in 3ds Max, I do have experience doing so.

high poly

I worked with a very small production team on Follow Me Dragon, so most of the team had to wear multiple hats.  I had my TAD tasks and was also responsible for Technical Direction and managing the Engineering team, as they were fairly green to production.  My biggest contribution was conceiving and integrating the real-time NavMesh functionality, which allows Drake to locate navigable areas in real time and interact with them dynamically (sketched below).  This package of Art, Design, and boundary-pushing Tech caught the eye of Apple; a special store demo version of Follow Me Dragon was playable at Apple Stores worldwide, and the project earned an Apple Best of 2017 Award for Tech and Innovation - AR.
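The shipped feature was built on UE4's runtime navigation rebuilding fed by ARKit plane detection, so the Python below is only a conceptual sketch of the idea rather than the actual implementation: detected planes become walkable cells, and the character simply re-paths across whatever surface currently exists.  All names and sizes are invented for illustration.

# Conceptual sketch only: detected AR planes -> walkable grid -> path for the character.
# The real feature used UE4's dynamic navmesh; names and values here are made up.
from collections import deque

CELL = 0.25  # grid cell size in meters (illustrative)


def build_walkable_cells(detected_planes):
    """Rasterize plane rectangles (min_x, min_z, max_x, max_z) into walkable grid cells."""
    cells = set()
    for min_x, min_z, max_x, max_z in detected_planes:
        x = min_x
        while x <= max_x:
            z = min_z
            while z <= max_z:
                cells.add((round(x / CELL), round(z / CELL)))
                z += CELL
            x += CELL
    return cells


def find_path(cells, start, goal):
    """Breadth-first search over walkable cells; returns a list of cells or None."""
    frontier = deque([start])
    came_from = {start: None}
    while frontier:
        current = frontier.popleft()
        if current == goal:
            path = []
            while current is not None:
                path.append(current)
                current = came_from[current]
            return list(reversed(path))
        for dx, dz in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nxt = (current[0] + dx, current[1] + dz)
            if nxt in cells and nxt not in came_from:
                came_from[nxt] = current
                frontier.append(nxt)
    return None

Each time new or grown planes are reported, the walkable set is rebuilt and the character re-paths, which is what makes the interaction feel dynamic.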


Contributions:

-Realtime Navigation

-Blueprint Logic

-FX

-Material Creation

-Shadow Solution ( due to technical limitations the shadow is created in a non-standard way )

-Technical Direction

-Engineering Management

-Engineering Advisor

-Real-time reflection Material / Shader ( Chrome )

-Material Optimizations

-Character costume creation ( 3D / 2D )

-UI ( Tech )

-Maya Tools ( e.g. character exporter )

-Technical Documentation





PBR material in Substance

Raising A Rukus

I worked with a large team of mostly movie industry professionals and a small group from the previous Martian VR project.  Toward the end of the project I teamed up with an engineer to research and create a workable solution for 360 capture inside of UE4.  The engineer took care of the engine side, and I took care of the editor side.  Setup was also required on the particle side, both in the engine and in the materials and particle systems themselves, to compensate for the non-real-time capture.

The intent of this feature is to capture a super-high-resolution 360 snapshot at the camera's location each frame.  This 360 snapshot is actually made of many individual screenshots that get stitched back together, with lens correction, into a 360 video format.  Additional functionality allowed channels that are not normally exposed as buffer visualizations, like 'Emissive', to be masked and rerouted into a channel that could be visualized and captured, like 'Metal'.
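The capture and compositing lived inside UE4 ( Blueprints and materials ), but the stitching math is easy to illustrate outside the engine.  The sketch below assumes a standard cube-map convention and placeholder naming: it maps each pixel of the equirectangular output to one of six per-face screenshots and a UV inside it, which is the heart of rebuilding a 360 frame from individual captures.

# Sketch of the stitching math only; the production path ran inside UE4 with lens
# correction and buffer-channel rerouting on top of this.
import math


def latlong_pixel_direction(px, py, width, height):
    """View direction for one pixel of the equirectangular (latlong) output frame."""
    lon = (px + 0.5) / width * 2.0 * math.pi - math.pi   # longitude: -pi .. pi
    lat = math.pi / 2.0 - (py + 0.5) / height * math.pi  # latitude: +pi/2 .. -pi/2
    return math.cos(lat) * math.sin(lon), math.sin(lat), math.cos(lat) * math.cos(lon)


def direction_to_face_uv(x, y, z):
    """Return (face_name, u, v) for a unit direction, with u and v in [0, 1]."""
    ax, ay, az = abs(x), abs(y), abs(z)
    if ax >= ay and ax >= az:      # +X / -X faces
        face, sc, tc, ma = ("pos_x", -z, -y, ax) if x > 0 else ("neg_x", z, -y, ax)
    elif ay >= ax and ay >= az:    # +Y / -Y faces
        face, sc, tc, ma = ("pos_y", x, z, ay) if y > 0 else ("neg_y", x, -z, ay)
    else:                          # +Z / -Z faces
        face, sc, tc, ma = ("pos_z", x, -y, az) if z > 0 else ("neg_z", -x, -y, az)
    return face, (sc / ma + 1.0) * 0.5, (tc / ma + 1.0) * 0.5


# For every output pixel, direction_to_face_uv(*latlong_pixel_direction(px, py, W, H))
# says which captured screenshot to sample and where; sampling all pixels that way
# stitches the individual captures into one 360 frame.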


Contributions:

-360 Capture Editor integration ( Blueprints, Materials )

-Anim Notify-based Footprint FX System

-Blueprint Logic

-Fluid Systems R & D

-Materials / Shaders

-Level / Asset Optimizations

-Maya Tools ( e.g. level and prop exporters )

-Blutility scripting

-Level Debugging

-Troubleshoot / Debug editor crashes

-Technical Documentation



The Prototype team consisted of three people in the beginning: me ( Technical Art Director ), Brandon Biggs ( Lead VR / AR Designer ), and Craig McPherson ( Creative Director ).  A few months later we gained our fourth and final member, Sam Wey ( Senior Designer ).  Being a tiny team, everyone had to come to the table with previous experience across many fields of software creation and production.  We all believe efficiency and iteration are keys to strong software development.  Our previous experience in software production showed us which pipelines and methods would and would not work, and that insight showed in our quality, agility, and speed of production.  With many years of real-time experience, every team member was focused on getting the best visual quality, creating compelling user interaction, and maintaining user interest and awe, all while holding 90 FPS.

Our main responsibility was to push the boundaries of VR Design and Tech to create working prototype demos to be used by the company.



Contributions across multiple Demos:

-Created a suite of VR tools and functionality inside UE4 to facilitate Rapid Prototyping

-Locomotion Solutions

-Post Process Nausea Mitigation Solutions

-Post Process Camera Clipping Solutions

-User Interaction Systems / Design

-Object Interaction Systems / Design

-Vehicle Interaction Solutions

-Vehicle Setup

-Physics Setup

-Modifiable NavMesh Systems

-Weapon / Ammo Class System

-Foliage Physics Solutions

-Maya Support Tools

-Technical Documentation

-Convert existing Blueprints and Systems across multiple UE4 engine versions

-Troubleshoot / Debug editor crashes

-Revamp, Optimize, and Rebuild assets for a 3rd party VR project to run at 90 fps ( was running at ~45 fps in key areas )

-FX

-Modeling

-Texturing

-Material Creation

-UV Layout / Cleanup

-Asset Optimization

-Level Optimization

-Material Optimization

-Blueprint Optimization

-Lighting

-Animation

-UI

