hugefere.blogg.se

Faceware vs faceshift
The important thing was to start working out a new performance capture pipeline.

1) PERCEPTION NEURON 3
My PN3 system arrived while I was finishing up "The Soot Man" so it's been sitting in its box for almost a month. I finally got to test it out and was really impressed by how smooth the capture was. I'll have to get used to Axis Studio and its tools, but so far so good. I also really like the fit of the new strap system (although they collect sweat a lot easier than my PN Pro). Thank heavens for gloves! Next on my Noitom wishlist is simple prop capture for the PN system. I would be overjoyed if I could place markers on a small object or weapon.

2) FACEWARE FOR iCLONE
I decided to get Faceware for iClone for a couple of reasons. It allows me to attach a lighter GoPro to my DIY helmet, and I don't have to stream the facial data to my computer at the same time I'm streaming from the PN3. Now I can record to the SD card and apply the animation in iClone later using a PNG sequence. I found it to work pretty well! You still need to tweak the data, but it's a great foundation. Note: I mask out my lips when I record the facial data in iClone because I like the control of Acculips.

3) METAHUMANS
I used the first Metahumans in my "A Job to Die For" short when it was just the demo. Now that the feature has officially launched, what can I say? They look amazing, especially compared to the characters I've been using thus far.
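The record-to-SD-card workflow in (2) boils down to turning a GoPro clip into a numbered PNG sequence before the facial data is applied. A minimal sketch of that step, assuming ffmpeg is available; the file names here are hypothetical examples, not the actual project files:

```python
from pathlib import Path

def gopro_to_png_sequence_cmd(video: str, out_dir: str, fps: int = 30):
    """Return the ffmpeg argument list that extracts a numbered PNG
    sequence from a clip; run it with subprocess.run(cmd) after checking.
    'video' and 'out_dir' are hypothetical placeholder names."""
    pattern = Path(out_dir) / "frame_%05d.png"   # zero-padded frame names
    return [
        "ffmpeg",
        "-i", video,            # source clip copied off the SD card
        "-vf", f"fps={fps}",    # resample to a fixed frame rate
        str(pattern),
    ]

# Inspect the command before executing it:
print(gopro_to_png_sequence_cmd("GX010042.MP4", "face_frames"))
```

Building the argument list first makes it easy to log or tweak the frame rate per take before committing to a long extraction.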


"Jackie's fueled by rage and in too deep. Hellbent on revenge, he turns the tables on Paulie by making his worst fear come true." I had a couple of tech things I needed to test and thought it would be fun to do it with a short GTA-style cutscene. There's a lot of weird stuff in this test, but I wasn't going to obsess about making it perfect.


Created by: Jason Cuadrado | Cinemonster Cinematics

I used the PBR workflow, with Base Color, Metalness, Roughness and Normal maps for each material. For the creation of all the textures, I used Substance Painter. In some cases I also used an AO map. All of them were hand-painted, with some textures applied with blending. For the skin, I used Unreal Engine's Subsurface Scattering shader. I'll tell more about the setup of the materials inside the engine in the second part of the breakdown.
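A texture set like the one described (Base Color, Metalness, Roughness, Normal, optional AO) is easy to sanity-check before import. A small sketch, assuming a hypothetical `part_MapName.png` export naming convention rather than the author's actual files:

```python
# Required PBR maps per material, as listed in the breakdown; AO is optional.
REQUIRED = {"BaseColor", "Metalness", "Roughness", "Normal"}
OPTIONAL = {"AO"}

def missing_maps(filenames):
    """Return which required PBR maps are absent from an export folder,
    assuming files are named like 'body_BaseColor.png' (hypothetical)."""
    present = set()
    for name in filenames:
        stem = name.rsplit(".", 1)[0]          # drop the extension
        suffix = stem.rsplit("_", 1)[-1]       # take the map-name suffix
        if suffix in REQUIRED | OPTIONAL:
            present.add(suffix)
    return REQUIRED - present

print(missing_maps(["body_BaseColor.png", "body_Normal.png"]))
# Metalness and Roughness are reported as missing here.
```

Running this over each material's export folder catches a forgotten map before it shows up as a flat-looking surface in the engine.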


Faceware Live has plugins to work with Unity and Unreal Engine. I will talk more about the implementation in the next parts of the breakdown.

For the textures and colors, I used the concept from the client as a reference. He wears a green and yellow shirt (the colors of Brazil), blue shorts, red sneakers, and has three-colored hair. In the case of this real-time application, there was no need to worry about the size of the textures, because we were just creating one character in a very small setting. So, I created one material for each part of the character: body, hair, shirt, shorts, socks and sneakers.
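The remark about texture size can be made concrete: the uncompressed GPU footprint of a square texture is just resolution squared times the channel count. A quick sketch; the 2048px resolution below is a hypothetical example, not a figure from the project:

```python
def texture_bytes(resolution: int, channels: int = 4, mipmaps: bool = True) -> int:
    """Uncompressed size of a square texture in bytes.
    A full mip chain adds roughly one third on top of the base level."""
    base = resolution * resolution * channels
    return base * 4 // 3 if mipmaps else base

# One hypothetical 2K RGBA map, base level only:
per_map = texture_bytes(2048, mipmaps=False)
print(per_map // (1024 * 1024), "MiB per 2048x2048 RGBA map")  # 16 MiB
```

With six materials on one character in a small scene, even several uncompressed 2K sets stay modest, which matches the "no need to worry" call above; block compression in-engine shrinks this further.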


All other animations that didn't include the facial expressions and head position/rotation would be controlled by joystick input. The first interesting option we found that would serve us well was Faceshift. However, this software was bought by Apple, and I think it was discontinued. The site is no longer available (read here). By the time of the project, our first option was no longer an option. We then went with Faceware Live: this great software solution can capture the actor's facial expressions live with an easy setup.


Our first step was to do intensive research into tools and methods for creating a Digital Puppet. The web was the main research pool, as it can provide many answers quickly. So many answers led us to a lot of information; the possibilities were many.

Due mainly to the costs, we decided to rely on a software solution. We would capture the actor's face through a camera and transfer the data to the character's face.
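The camera-to-character transfer described here is typically done with blendshapes (morph targets): the tracker streams a weight per expression, and each weight scales a per-vertex offset added to the neutral face. A minimal sketch under that assumption, with a toy one-vertex mesh and a hypothetical "jaw_open" shape, not tied to any particular tracker:

```python
def apply_blendshapes(neutral, deltas, weights):
    """neutral: list of (x, y, z) vertices; deltas: {shape: list of
    (dx, dy, dz) per vertex}; weights: {shape: float in [0, 1]} as
    streamed from the face tracker. Returns the deformed vertices."""
    result = [list(v) for v in neutral]
    for name, w in weights.items():
        for i, (dx, dy, dz) in enumerate(deltas[name]):
            # Each tracked expression adds its weighted vertex offsets.
            result[i][0] += w * dx
            result[i][1] += w * dy
            result[i][2] += w * dz
    return [tuple(v) for v in result]

# Toy face: a half-strength "jaw_open" moves the vertex down by 0.5.
neutral = [(0.0, 0.0, 0.0)]
deltas = {"jaw_open": [(0.0, -1.0, 0.0)]}
print(apply_blendshapes(neutral, deltas, {"jaw_open": 0.5}))
# → [(0.0, -0.5, 0.0)]
```

Head position/rotation rides alongside as a simple transform, which is why the joystick could own the rest of the body without touching this channel.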