Machine render – week 10

After completing the model, texturing, UV unwrapping, animating and creating the lights in the scene, I took some time adjusting the lights and seeing how variously placed shadow mattes (as planes) would affect the object's appearance in the scene. I also adjusted the intensity, colour and exposure of my three main light sources and rendered out single frames to see how they would fit in the final composition. Here are a few of my test renders.

Test 1 – too dark and harsh textures
Test 2 – better light, but textures are too harsh. The light source on the right is too yellow.
Test 3 – textures are better, light as well. Still feels a bit too yellow.
Test 4 – slightly dark, but better tone to the source light.

Whilst this was the final render, I noticed that I was still missing a roto on the rock: for the first few frames the machine stands on top of it, rather than being blocked out by it. During the last few seconds the roto around the lamp goes slightly weird, so I would need to adjust it, along with the roto of the inside of the lamp, so that the machine bits can be seen better through it. I am happy with the colouring and lighting of the shot. Looking at the node script, I noticed that the ReGrain node is missing for the shadows, so that would need to be adjusted as well.

Nuke node graph

Green screen extraction – week 9

Continuing on from last week, we proceeded to adjust the despill, i.e. light reflected off the green screen that illuminates skin or clothes with a greenish tone. As there were subtle differences across the green screen itself, I evened it out to a single colour using the IBK keyer, together with Merge nodes set to the “average”, “minus” and “plus” operations on the original footage. I then created the core, base and hair mattes, whose alphas were piped into the despilled footage. As the woman’s head sits over the blue mountains, I created a separate graded edge alpha with blue undertones, whilst for the body I used a more green-targeted colour correction.
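The idea behind the despill can be sketched in plain Python. This is only the underlying maths of an average-based green limit, not Nuke's actual node internals, and the pixel values are made up for illustration:

```python
def despill_average(r, g, b):
    """Limit green to the average of red and blue, a classic despill approach.

    Returns the corrected pixel plus the amount of spill that was removed
    (which can later drive a spill-replacement grade).
    """
    limit = (r + b) / 2.0
    spill = max(g - limit, 0.0)   # only pull green DOWN, never push it up
    return r, g - spill, b, spill

# A greenish skin pixel: green above the red/blue average gets clamped back.
r, g, b, spill = despill_average(0.8, 0.7, 0.5)
```

The removed `spill` channel is also why the mattes' alphas get piped into the despilled footage: the correction is applied only where the key says it should be.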

After copying in the alpha, I used a rough roto to cut her out of the footage, colour graded her to match the background, transformed the image to match the proportions, premultiplied it and added grain to the clip.
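The premultiply step is just per-pixel maths, and it is what makes the final merge behave. A minimal sketch in plain Python, with made-up pixel values (Nuke's Premult and Merge nodes do this per channel across the whole image):

```python
def premultiply(r, g, b, a):
    """Multiply RGB by alpha so the cut-out composites correctly."""
    return r * a, g * a, b * a, a

def over(fg, bg):
    """The standard 'over' merge for a premultiplied foreground:
    fg + bg * (1 - fg.alpha)."""
    fr, fgreen, fb, fa = fg
    br, bgreen, bb, _ = bg
    return (fr + br * (1 - fa),
            fgreen + bgreen * (1 - fa),
            fb + bb * (1 - fa),
            1.0)

fg = premultiply(1.0, 0.5, 0.2, 0.5)    # half-transparent foreground pixel
result = over(fg, (0.1, 0.2, 0.3, 1.0)) # composited over an opaque background
```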

Nuke script

Machine model (rough renders) – week 9

This week I focused on animating the machine parts and adding texture to its bits, so it would not look like a perfect, brand-new machine straight out of the factory. Before exporting the EXR files from Maya to Nuke, I played around with lighting, placing shadow planes into the scene as well as some walls to block out the lights. Here are a few test renders.

Blue: 1 and 2 are the planes that act as shadow mattes. I decided to remove a shadow matte wall from the right side of the room, as it was blocking out too much light. 3 is a blocking wall with a simple Lambert material assigned to it.
Red: 1 is a more yellow area light; 2 is an orange-based softer area light.

Scene planes and light placement

I then rendered out the EXR files from the scene in Maya and imported them into Nuke. First, I realised that I wasn't too happy with the lighting from the skydome. Second, I had forgotten to remove the background image plane, so the machine was rendered out with it as well. Hence the background is visible, and due to the aspect ratio and format difference (HD_540 for the render) there is a slight mix-up with the image. The roto of the hole in the wall also requires some work, as in some frames the machine is not clearly visible.
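The format mismatch can be put in numbers. Assuming the render used Nuke's HD_540 preset (960x540) and the plate is full HD (the actual plate format is an assumption here):

```python
# Assumption: render at Nuke's HD_540 preset, plate at HD_1080.
render_w, render_h = 960, 540
plate_w, plate_h = 1920, 1080

# A Reformat/Transform would need to upscale the render 2x to cover
# the plate, which softens the CG and can shift the framing if the
# two formats don't line up exactly.
scale_x = plate_w / render_w
scale_y = plate_h / render_h
```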

Nuke node graph

Nuke: all exercises

Week 3: Beauty and marker removal

Week 3: Projection

Week 4: 3D Camera Tracking

Week 5: CGI asset placement

Week 6: CGI asset placement to a point

Week 7: 3D projections – cleanup

Week 8: Green screen extraction (basics)

Week 9: Green screen extraction

Project progress

During the first week of March we focused on finishing the tracking in 3DEqualiser and setting up the scene in Maya. The three of us attended another few sessions with Dom, where we tried perfecting the track of the shot. After reviewing the footage and taking Dom's advice, we decided to cut the shot down by a thousand frames, so it lasts 44 seconds rather than 68. This way the audience's attention would not drift and the shot would be more interesting.

The tracking information was then exported into Maya, after which we placed cards to represent the wall and the floor, as well as a few cubes to represent the black stands. We also added lights, positioned according to the light pattern falling on the walls and the floor. Test objects, such as a cube and a few copies of a torus, were placed around the scene for a render run.

Scene recreation in Maya.
Render of the placed objects.

Earlier, during the project planning stage, Jane found an Egyptian statue. When we met last week, we decided to add sequences of dancing characters (Keith Haring inspired) to the texture of the statue. Proceeding with the task, Jane created a PNG sequence of those with a transparent background. She also re-did the animated painting of her earlier work with the newer characters.

Updated version of the animated video (by Jane)

As the clip was cut down, we had to adjust the plan for object placement and question whether all the original pieces should stay in the scene. Since we all equally liked the Egyptian statue, it was decided to keep it, but we did cut the number of pictures from 3 to 2. The first is Jane's painted version of Van Gogh's famous painting “Bedroom in Arles”, whilst the second is the animated painting. The classical statue was kept in the same place, and the display case with the fire is to be placed between the two paintings.

Giulia collected most of the free assets for the plants, as well as modelling the pyramid-shaped display case. Furthermore, she put herself forward to model the neon signs, which will be positioned around the classical statue by the black stands. She carried out various tests for the quality of the neon lights, which we all discussed during our recent call, coming to a conclusion on which look we preferred.

For the particle work, I organised a session with Mehdi, who guided me through using an object as a boundary for the fire simulation. After achieving the fire look that I wanted, I also played around with its material and colours to find what we liked the most.

Fire: Node Network

One of the difficulties I came across was how to read the VDB files inside Maya, and later how to export the volume material from Houdini and read that in Maya. Still in the process of finding a solution, my next steps are to try the methods I found on various forums, contact Mehdi and, if all else fails, export the render from Houdini rather than Maya. For that, I would need to re-export the tracking information from 3DEqualiser to Houdini and bring over the objects and lights that were set up in Maya, so the Houdini scene matches exactly. The display case would also need to be imported into Houdini and rendered there, as the light interacts with the glass properties of the case.

Material properties window

Other than figuring out the proper way to render the fire, my next steps involve finding references for the particle movement around the classical statue stand and setting up the node network in Houdini. For those, I believe the material placement will be simpler, so they could be rendered inside Maya.

Green screen extraction (basics) – week 8

There are various nodes one can use inside Nuke to extract a green screen. Whilst the Image-Based Keyer (IBK) nodes were less complicated and more straightforward in their application, the Keylight node required more attention and experimentation with the screen colour and screen matte attributes.

On my first attempt at the shot, the constant problem was how bright the red top was and how much contrast it created with the background. After finding a plate which suited my vision for the shot, I focused on using an imported node called ColorPickerID, which allowed me to adjust a specific colour without affecting the others as much. It took me some time to figure out where in the script to place my adjustments, but it worked out in the end.
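The principle behind that kind of colour-targeted adjustment can be sketched in plain Python: only pixels whose hue sits near a chosen target get corrected. This is just an illustration of the idea with made-up values, not what the ColorPickerID gizmo actually does internally:

```python
import colorsys

def correct_near_hue(r, g, b, target_hue, tolerance=0.08, sat_scale=0.5):
    """Scale saturation only for pixels whose hue lies within `tolerance`
    of `target_hue` (both in the 0-1 range); all other pixels pass through.
    """
    h, l, s = colorsys.rgb_to_hls(r, g, b)
    # Hue is circular, so measure distance the short way around the wheel.
    dist = min(abs(h - target_hue), 1.0 - abs(h - target_hue))
    if dist <= tolerance:
        s *= sat_scale
    return colorsys.hls_to_rgb(h, l, s)

# Knock back a bright red (hue 0.0) without touching a blue pixel.
red_fixed = correct_near_hue(0.9, 0.1, 0.1, target_hue=0.0)
blue_kept = correct_near_hue(0.1, 0.1, 0.9, target_hue=0.0)
```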

Looking at the footage, it still doesn't suit the shot well enough, as the lighting it was filmed under differs from the lighting of the background.

Further particle simulations – week 6

Now we got to the exciting part, where we were shown how to combine all the previously obtained knowledge into something that would generate much cooler effects.

The first exercise was the creation of a disintegration effect, where a physical object breaks into smaller pieces that fly apart whilst decreasing in size. I tried it on a torus shape, then applied it to the human model, varying some of the attributes, such as the rate at which the pieces disappear.

Torus disintegration
Disintegration effects geo network
DOP Network

For the man example I wanted the pieces to last slightly longer, hence I changed the scale value from 0.92 to 0.96 in the Primitive node in the SOP Solver inside the DOP Network.
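It's worth seeing how much difference that small change makes. The scale multiplies every frame, so the pieces shrink geometrically; assuming they effectively vanish once they fall below 10% of their original size (an arbitrary threshold for this sketch):

```python
import math

def frames_until(scale_per_frame, threshold=0.1):
    """Frames until a piece, multiplied by `scale_per_frame` each frame,
    drops below `threshold` of its original size: solve s**n < t for n."""
    return math.ceil(math.log(threshold) / math.log(scale_per_frame))

fast = frames_until(0.92)  # original setting
slow = frames_until(0.96)  # adjusted setting
```

At 0.92 the pieces are effectively gone in about 28 frames, whilst at 0.96 they linger for about 57, roughly twice as long, which matches the slower dissolve I was after.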

POP solver
VOP network for randomization of which particles get de-attached.
Man disintegration effect

We then moved on to using smoke to direct a path for the particles, trying to create a more magical effect, similar to the aura from the second-week exercise.

I wanted the smoke not to be as grouped as Mehdi's, so I changed a few settings, making it slightly wider in the simulation and dissolution. I also added some colour for the density inside the volume visualization node. I tried various ways of adding colour to the particles that were translated into points with the CopyToPoints node, but haven't managed to figure it out yet.

In the second half of the lesson we moved on to a bigger project: a large building destruction. The final objective of this example is to have a meteor colliding with the building, with explosions at the points of entry and exit.

The first part was finding all the problems with the geometry, fixing them and creating the pieces the building will break into. We noted which parts had defects but would not be involved in the collision, so they didn't necessarily have to be fixed (e.g. the air conditioning unit at the top).

Then we created a proxy version of the meteor, as the original was quite a heavy object with a lot of detail and polygons. We were shown various ways to subdivide the polygons.

We then started creating the RBD geometry, inside which there will be a large graph describing how to break the various parts of the building into pieces. For the outside and inside walls we set it up the following way:

Quick review of the building break down.

The plan is to continue creating the destruction of the building's various parts, such as the glass and the inside bits like the floor and inner walls. For those, the example from week 3, when we destroyed the cabin, could be used.

Modelling and texturizing (machine) – week 7

Continuing on from last time, I modelled the final parts of the machine that will be rigged and animated, as well as a few smaller details, like screws of different appearances and any extra required cogs, beams and circular bits.

Then, thinking about adding textures and seeing which UVs I would need to fix, I added a couple of metal-based textures to the bigger parts. For the screws (I counted 41 of the basic shape), it is easier to unwrap and correct the UV map of one screw, export that object, paint it in Mudbox, export the new texture and apply it to the other screws. But that also means the other screws would need to be copied from the corrected one and placed back in their previous positions.

Camera perspective, scene placement

When adding the planes for the walls as shadows and adjusting the intensity of the skydome light (to 10), I still noticed a blue-ish reflection on the machine base and highlights around the column, meaning I need to correct the walls and block out more light, as well as place an area light in front of the object, to the right side. Looking at the floor, one can see harsh shadows falling from the rocks in a specific direction, so I need to recreate that lighting for this room, as the HDRI used was actually taken from the first room.

Last frame rendered out.

Noting down the rest of the work: some parts still need to be modelled or re-modelled, UV maps fixed and textures placed. Given that I have all the moving parts, I need to rig and animate them correctly, which will be my main focus this week. I would also need to rotoscope out the part of the wall in front of the machine, so that when placed in Nuke it would block parts of the machine and look like it was there originally.

Animation plan (red – movement, black and blue – the parts)

Looking at the reference model and what I have learned in other classes, I had an idea of additionally modelling a see-through water tank for the first room: as the pump moves, water travels from a pre-modelled tube into the tank. The water simulation would have to be completed in Houdini and later brought into Maya for rendering, but I will see whether the workload and time limit allow for it.