Final assessment

For the final assessment we had to apply all the nodes and techniques we’ve learnt so far in the term. These included rotoscoping, colour grading, tracking, planar tracking and 2D clean-up. Having chosen this shot I took in a museum a while back, I planned out what I wanted to clean up, what to replace and what to add.

Firstly, I denoised and then stabilised the shot. This stage made the subsequent tracking easier and more reliable. Then I cleaned away the picture on the left and a golden decoration on the middle wall.

Clean-up node graph
Denoising and stabilising node graph
Plan for the clean-up

Having had difficulty grading the patches and creating keys to animate the Grade node, I found that the order of the nodes was important, i.e. the Grade node had to be placed after the FrameHold and before the Premult.
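To convince myself why that order matters, I wrote a tiny illustration in plain Python (my own sketch, not Nuke code): a Grade with an offset applied after premultiplication leaks colour into fully transparent pixels, while grading before the Premult keeps them black.

```python
# Minimal sketch of why a Grade with an offset must sit before the Premult.

def grade(rgb, gain=1.0, offset=0.0):
    """Simplified Grade: out = in * gain + offset, per channel."""
    return tuple(c * gain + offset for c in rgb)

def premult(rgb, alpha):
    """Premultiply RGB by the alpha channel."""
    return tuple(c * alpha for c in rgb)

rgb, alpha = (0.5, 0.4, 0.3), 0.0   # a fully transparent pixel

# Wrong order: premult first, then grade -> the offset pollutes transparency.
wrong = grade(premult(rgb, alpha), gain=1.2, offset=0.1)

# Correct order: grade first, then premult -> transparent stays black.
right = premult(grade(rgb, gain=1.2, offset=0.1), alpha)

print(wrong)  # (0.1, 0.1, 0.1) - a grey haze where there should be nothing
print(right)  # (0.0, 0.0, 0.0)
```

The same logic is why Nuke convention is to unpremultiply before grading and premultiply again afterwards.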

Then I focused on replacing the art. Here the main source of problems was the planar tracking, as the picture’s corners would jump around randomly on a few frames.

Plan for the drawings replacement
Node graph for planar tracking and insertion of the assets.
Close up on the node graph for middle painting replacement

After completing the tracking and colour grading the paintings to match the original shot, I found some footage of moving people that I wanted to bring into my video. This is where I focused on the rotoscoping part of the assessment. Having decided to place the woman where she would be visible at full size, I had to think about her shadow and how it would be formed given the lighting in the shot, arriving at the final stage by trial and error. The roto of the man on the right side was especially tough because of his hair, as there were a lot of baby hairs and gaps between strands at the top of his head. Overall, I like how he is facing the crowd; it makes it feel as if he is talking to the woman in front of him.

Node graph for the rotoscoping
Close up on the node graph for woman roto
The whole project node graph

Demon mask (final) ~ Assessment

For the assessment we had to make a simple camera pan around the mask that we’ve been working on. This is my final result.

Rendered over about a day and a half, this is a 120-frame shot panning around the mask. For the Arnold sky dome light I used the HDRI I liked the most, taken from the website below.

https://hdrihaven.com/hdri/?c=urban&h=concrete_tunnel_02

Coming across various technical difficulties, one of which involved updating my version of Maya, I realised only after the final render was completed that I wasn’t too happy with the physical properties of the mask: it is too rubbery and hence not as reflective as I wanted. The colour of the main part of the mask also seems too light, even burnt out.

Research – week 1

This is the research I conducted into the machinery, one piece of which I will be creating as a 3D model in Maya. I looked at various oil pumps, steam engines and generators.

Oil pumps
https://sketchfab.com/3d-models/twin-steam-engine-146d9e79c8d94a74852f29d05235db33
https://sketchfab.com/3d-models/steam-engine-41eb85c28c594767bd6a21833a886070
https://sketchfab.com/3d-models/steam-engine-39d09212f57e482a87302f348102d2d2
https://sketchfab.com/3d-models/beam-engine-889a748f9a284598ae56febcffbc4dd0
https://sketchfab.com/3d-models/steam-engine-202f0f55120248d99d66144435d4f833

The final two pictures of the steam engines are the ones I like the most. Although the last one has a lot of little details and is not animated in the original, I could use it as a challenge, trying to work out which parts should be moving by using the other references.

Shader Development, Creature, Pipeline TDs and R&D Programmer ~ week 8

Shader Development TD

A shader developer is someone who designs and writes shaders to create a photorealistic look for the movie, TV series or whichever project they are working on. They apply procedural shading and scripting techniques, develop rendering tools for creatures, environments, props and VFX, and write shaders for surface, volume, light and displacement. They use lighting and rendering techniques, including HDRI and global illumination, and work in conjunction with the lighting TDs to ensure shots can be delivered to the very highest standard. A shader developer must be proficient in C++.

Snowball render, shader created by Thomas Harle
Normal maps for the shader
Normal masks for the shader

To be hired as a shader developer, candidates are required to be fluent in RSL, show an excellent understanding of the technologies and techniques relating to shading, lighting and rendering (with a supporting demo reel), and have a good understanding of the whole VFX pipeline, i.e. from shooting to modelling to texturing, animation, lighting and rendering. The software they should know includes RenderMan, Maya and Nuke. Due to the technicality of the position, a shader developer should have a deep understanding of linear algebra and of rendering and lighting algorithms; they must also have strong communication skills and be able to remain calm under pressure.
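As a rough idea of the kind of maths a surface shader evaluates per pixel, here is a toy Lambert diffuse term in plain Python (a hypothetical sketch of mine; production shaders would be written in RSL, OSL or C++):

```python
import math

# Toy Lambertian surface "shader": albedo * max(0, N . L) * light intensity.

def normalize(v):
    length = math.sqrt(sum(c * c for c in v))
    return tuple(c / length for c in v)

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def lambert(albedo, normal, light_dir, intensity=1.0):
    # Clamp the cosine term so surfaces facing away from the light go black.
    n_dot_l = max(0.0, dot(normalize(normal), normalize(light_dir)))
    return tuple(c * n_dot_l * intensity for c in albedo)

# Surface facing straight up, light directly overhead -> full albedo.
print(lambert((0.8, 0.6, 0.4), (0, 1, 0), (0, 1, 0)))   # (0.8, 0.6, 0.4)
# Light behind the surface -> clamped to black.
print(lambert((0.8, 0.6, 0.4), (0, 1, 0), (0, -1, 0)))  # (0.0, 0.0, 0.0)
```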

https://80.lv/articles/vfx-shaders-with-thomas-harle/

Creature TD

A creature Technical Director is the one who works out the mechanics of a CG character’s movement, simulates the character’s hair and clothing, and creates muscle and skin deformation. They oversee the building of the software used to create believable attributes for the creatures. They help to develop and program the digital tools for the artists working on digital dinosaurs, animals or a grand beast. Depending on the size of the studio, the scope of the job can vary a lot: while in some studios the role includes rigging, i.e. the creation of skeletons and muscles, in larger studios a creature TD would focus on the creation of fur, hair, skin and feathers.

A creature TD has to have good problem-solving skills and use cutting-edge technology to find new ways to achieve a creative vision; be collaborative and communicative, as they will be working closely with other VFX artists and will have to give constructive feedback; have programming and coding skills; be familiar with software such as Houdini, Maya and Ziva Dynamics; be able to work on Linux or Unix operating systems; and keep up with deadlines. Within the VFX pipeline, they report to the CG and VFX supervisors while working closely with modelling artists, riggers and animators, effects artists and lighting TDs.

Creature TD Demo reel

Pipeline TD

A pipeline TD’s job is to lead the charge on designing and developing custom tools that help everyone else get their work done faster and more efficiently. They provide technical troubleshooting support for the end users of the VFX pipeline, implementing bug fixes or suggesting solutions and workarounds; assist the pipeline lead with all aspects of the technical components; maintain and develop libraries and APIs for the core pipeline tools to meet the current projects’ requirements; and test and document new software and tools along with their adaptability. They work closely with various development teams, such as research & development (R&D), infrastructure and feature animation, to make sure standard work practices are followed and to report common issues.

C++ coding example

Given the tasks of the job, a pipeline TD has to be able to work well within a team to develop solutions; communicate with a variety of staff at different levels; think analytically to identify problems and come up with creative and effective solutions; understand the jobs and their requirements within the pipeline; have advanced knowledge of programming in Python and C++, while knowing their way around software such as Maya, Houdini and Nuke; be collaborative; and maintain a positive attitude.
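As a flavour of the small tools a pipeline TD writes, here is a hypothetical Python helper that bumps the version token in an asset filename so artists never hand-edit versions (the `vNNN` naming convention is my own assumption):

```python
import re

# Hypothetical pipeline utility: increment the "vNNN" version token
# embedded in an asset filename, preserving its zero padding.

VERSION_RE = re.compile(r"v(\d+)")

def version_up(filename):
    """Return the filename with its version token incremented by one."""
    match = VERSION_RE.search(filename)
    if match is None:
        raise ValueError(f"no version token in {filename!r}")
    number = int(match.group(1)) + 1
    padded = str(number).zfill(len(match.group(1)))
    return filename[:match.start()] + "v" + padded + filename[match.end():]

print(version_up("shot010_comp_v003.nk"))  # shot010_comp_v004.nk
print(version_up("mask_model_v09.ma"))     # mask_model_v10.ma
```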

R&D Programmer

A research and development (R&D) programmer is responsible for the development of software tools and technologies inside the studio. They create systems for use by TDs and modify them to suit the specific needs of the VFX artists. They design new digital tools and make sure they fit into existing software systems, thus enabling the efficient passing of assets from one VFX process to the next. As this role involves working out ways to improve how well the digital processes work, they have to stay informed about software and technology relevant to their field and beyond.

Bifrost node set-up (from Autodesk)

The R&D programmer must know C/C++ and Python; have experience in desktop application development, user interface design, image processing and computer vision algorithms; be an excellent communicator, as they have to talk to both technical developers and non-technical artists; have a strong sense of organisation and planning; know all the parts of the pipeline; and use analytical thinking to come up with creative solutions. They work closely with all the technical directors across the whole VFX pipeline.

Demon mask (progress) ~ week 8

Having worked on the model, texture and paint, this week the task was to transfer the files and information from Mudbox back to Maya and set up a small scene inside Maya. I went back over the mask, changed a bit of the topology, re-did the paint work, and this was the result that I imported into Maya.

Imported mask with the material assigned accordingly

It was hard to export the displacement map from Mudbox as my laptop would never finish the task, so instead I exported the normal maps as well as the paint information.

In Maya I played around with the presets for the material properties, choosing rubber at first. Then, looking through the Arnold render, I adjusted the roughness and IOR to achieve what I wanted. Thinking about setting up a scene, I decided to make a simple object, a wall, and use textures from the free resource website Texture Haven. This was the result.

Close-up
Mask against the wall render

Playing around with the bump map properties for the wall, I decided to go with the Bump option at a depth of 0.1, rather than Tangent Space Normals at a depth of 1.
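To get a feel for what the bump depth does, here is a small sketch (my own illustration, not Maya’s internal implementation) of how the depth value scales the normal perturbation derived from a height map:

```python
import math

# Sketch of bump mapping: perturb a flat normal (0, 0, 1) by the height-map
# gradients, scaled by a "depth" factor; larger depth tilts the normal more.

def bump_normal(dh_dx, dh_dy, depth):
    """Return the normalized perturbed normal for the given height gradients."""
    n = (-depth * dh_dx, -depth * dh_dy, 1.0)
    length = math.sqrt(sum(c * c for c in n))
    return tuple(c / length for c in n)

grad = (0.5, 0.0)  # some slope in x from the wall's height map
shallow = bump_normal(*grad, depth=0.1)  # stays close to (0, 0, 1): subtle bump
steep = bump_normal(*grad, depth=1.0)    # tilts much further: exaggerated bump
print(shallow, steep)
```

This is why dropping the depth from 1 to 0.1 tames an overly harsh bump: the normals stay closer to the true surface normal.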

Render with bump map as Tangent Space Normal

Finally, I exported a shot of the rendered scene from Maya as an EXR file to be worked on in Nuke.

2D Clean-up ~ week 8

This week we practised cleaning up a shot. In the shot provided, I removed the papers on the board, cleaned the side of the lockers and removed a pipe in the background.

Clean up
Nuke script

In the next shot I removed a sand pattern, a house and a car.


Nuke script

Demon mask (progress) ~ week 7

This week I worked on the texture of the mask. As I had originally used a simpler version of a mask, which did not have a lot of detail or texture, I had to search for other masks to use as a starting point for texturing and colouring.

Original inspiration for the mask
Nose textures
Top side of the mask textures
Final version with the painting of the mask
Mask used as reference for texture details

Looking back at the mask, I don’t feel too happy with its overall topology, and it should be worked on a bit more. For instance, defining the brow area and the lips and adding more texture to the teeth are things I want to work on before exporting back to Maya and setting up the scene to be rendered for week 8. The overall paint of the mask could also be better, specifically the main colour of the mask and the painting of the teeth and mouth.

Planar tracking ~ week 7

We had to focus on understanding corner pinning and planar tracking with the Tracker and Roto nodes. First, I did the exercise where we had to replace the posters in the original shot.

Original
With the planar track

Looking back at the result, it feels that some additional work removing the lens distortion before adding the assets, and re-applying it afterwards, would have produced a better outcome. Here is the Nuke script with close-ups.
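That workflow can be sketched with a simple one-parameter radial distortion model (an assumed model, not Nuke’s exact LensDistortion maths): undistort the plate, comp in the assets, then re-apply the distortion.

```python
# One-parameter radial lens distortion model and its numerical inverse.

def distort(x, y, k):
    """Apply radial distortion: r_d = r_u * (1 + k * r_u^2)."""
    r2 = x * x + y * y
    scale = 1.0 + k * r2
    return x * scale, y * scale

def undistort(x, y, k, iterations=10):
    """Invert the distortion model by fixed-point iteration."""
    xu, yu = x, y
    for _ in range(iterations):
        r2 = xu * xu + yu * yu
        xu, yu = x / (1.0 + k * r2), y / (1.0 + k * r2)
    return xu, yu

# Round trip: undistorting a distorted point recovers the original position,
# which is why assets tracked on an undistorted plate line up after
# re-applying the distortion.
xd, yd = distort(0.4, 0.3, k=0.1)
xu, yu = undistort(xd, yd, k=0.1)
print(round(xu, 6), round(yu, 6))  # 0.4 0.3
```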

Whole Nuke script
Close up of a part of the script

The second shot I worked on was some footage I shot of a camera moving towards a map of a shopping mall. Trying to stabilise the shot with the 2D tracker before applying the planar track and adding the asset produced a result I was not too happy with. The image gets warped on some frames, so using stabilisation techniques together with 3D tracking would probably have produced a better result.

Creating sequences and stabilising the shot
Planar tracking and merging with the OP

Production manager, producer, animator, FX TD ~ week 7

Production manager

In the production pipeline of a project, the production manager is the one who acts on the decisions made by the VFX producer. It is their job to create a detailed and precise schedule for the project as well as to look after the budget. A production manager also oversees the production coordinators’ work; they will be involved in team management and the training of coordinators, and can participate in casting or hiring artists and drafting contracts. They serve as an important point of contact between the VFX artists and the technical directors from all parts of the VFX pipeline, making sure the work is being completed on time. The production manager should also communicate with the producer of the company shooting the live-action footage and producing the film or TV programme.

Given the job specification listed above, it is only logical that the manager should be a good communicator with the ability to explain themselves well. They should be organised, be aware of what is happening, listen and stay one step ahead; have leadership skills; have a strong knowledge of VFX and be able to understand all aspects of the VFX pipeline; and, finally, have the problem-solving ability to successfully anticipate any issues occurring during the project and to adapt to changing timescales and technical issues.

Producer

The VFX producer is someone who manages the entire process of creating the visual effects for a film or a TV show. It is part of their job to ensure the client is happy with the results of the studio’s work. VFX producers are the ones who write the bid, i.e. the document used to persuade the film or TV series producer to take their VFX studio on for the required visual effects work in the project. They put together the team of VFX artists and other technical staff, create and set the schedules for the work, manage the budget, distribute the footage to the appropriate artists, assess the shots and give feedback internally. During the filming stage it is important that the VFX producer works closely with the live-action production crew, and in the post-production stage with the editor. How much interaction there will be, weekly or daily, is agreed between the client and the producer.

VFX producer Hasraf Dullul

The producer has to have leadership skills, the confidence to give direction and the ability to communicate well with every team member; be organised, as they have to plan effectively and manage the project and its budget; know the VFX pipeline and how everything works inside one; have problem-solving skills; and be able to build and maintain good relationships with the clients. They often communicate with the producer or director of the company making the film, but also work closely with the VFX supervisor.

Animator

In the post-production stage, after the rigging of a character or an object, the animator brings it to life, taking the motionless 3D character and making it appear alive. They create animation ‘frames’ using the ‘rig’; the animation is formed when all the frames are put together in a sequence. Depending on the company, the type of animation may be spread out across different jobs. Inside the VFX industry, the task of the animator is to produce work to be integrated into the live-action footage of a film or a TV programme. Therefore, they must animate the 3D objects as dictated by the background film plates, meaning there is footage and a set camera position that they must work to.

The animator has to be good at drawing and at conveying attitude, emotions and mood through a character’s movement; have spatial awareness and a feel for movement over time; have knowledge of animation and understand its principles and mechanics; know how to use relevant software such as Arnold, Blender, Maya, Mental Ray, Photoshop, RenderMan, Substance Painter, V-Ray, ZBrush and 3ds Max; and be organised and collaborative. VFX animators work from the overall brief given by the film’s director, picking up after the matchmove artist, who would create the rig for the character to be animated.

Animation Showreel

Effects TD

An effects technical director oversees designing and creating effects such as explosions, wind, smoke, water, fire, clouds, dust, debris and many others. They write the computer scripts that generate the effects, as well as build and test software tools for the VFX artists to use, which are then incorporated into a VFX studio’s production pipeline. With each project presenting its own complex problems, effects TDs are good problem-solvers. They have to stay up to date with the latest research and techniques and push software and technological boundaries to find ways for the creative vision of the director to be brought to life on the screen.

An effects TD needs to have a good eye for detail and know how to make a sequence look good; they have to understand science and be able to create accurate and believable particle movement; they need to be aware of the VFX production pipeline and understand the other roles inside the VFX studio; and they have to have coding skills and communicate well with the team of VFX artists. They must know programmes such as Houdini and RealFlow. They will work closely with the lighting and rendering teams to ensure the effects look correct in any reflections, shadows, etc.

Technological emerging practice ~ weeks 5-7

“Lion King” (2019) was one of the first movies to involve virtual reality in the creation process, completely reshaping the production pipeline. It all started when the director Jon Favreau was working on “The Jungle Book” in 2016, the shooting of which was done in just one warehouse using SimulCam, so that the combination of real-life and CG characters could be seen on the screen. Taking it a step further, a set-up for working in virtual reality was created and implemented, so that the shooting team could see low-resolution versions of the final assets and trigger long clips of animation on command. This meant the assets had to be worked on before the shooting stage, with details added and refined only after the scenes were shot. That work was done by MPC, working on the master scenes in London while developing an “asset management system” so that the scenes could be translated into something compatible with Unity, the game engine.

Jon Favreau on use of VR during the pre-production

Using HTC Vive headsets, Jon Favreau, Caleb Deschanel and Robert Legato, along with other key crew members, could immerse themselves in the virtual world and explore it like a real set. Within the virtual world they had access to a lot of settings, such as the lighting, the placement of the sun, the choice of sky (out of 350), the position of the camera and a variety of lenses.

Director Jon Favreau (far left), Deschanel (green VR rig), production designer James Chinlund (blue), Legato (red) and animation supervisor Andy Jones (white) study the previsualized world in preparation for a virtual shoot.

The team would make suggestions ‘based on what they thought it was going to look like when they filmed it’, be it changing the way an animal walks or the position of assets in the shot. Despite the expense of virtual reality equipment, the ability to pre-determine key characteristics that make up a big part of the movie, all in virtual reality, meant that less crew and equipment were required, and choices could be made and applied instantaneously, which is more convenient infrastructure-wise. In this way it brings more flexibility to the set. But one still has to have a great artistic and visual sense to be able to use all the various instruments to advantage.

MPC’s lead lighting artist, Samuel Maniscalco, had to light the world according to several factors, such as the time of day, the location of the scene and its placement within the larger story. The lighting was determined in Unity, so it was possible to know the direction it was falling and how it would affect the drama when finalised. However, a downside of VR production is that it was only possible to see how properly a scene was lit after it had been shot; thus it is important for the filmmaker to understand the process in advance. Having someone on the filming crew who knows how to use the visual tools straight away and can adjust lighting, colour and editing on the spot is a big must, as that way they can show the director immediately what they intend to do. By the time the film was being shot, a big digital file had been created containing the performances of the characters and the settings in 360 degrees, with all the dialogue and songs pre-recorded and synced with the characters in the virtual world. The only thing left to do was to record with the virtual camera and apply the virtual lighting.

Virtual Capture
At the wheels, Deschanel operates a virtual shot.

The actual shooting of the movie was held in a 25-foot-square area called the Volume. Despite it being physically restrictive, the technology let them change the scale of their movement relative to the scene; for example, moving a metre in real life could translate to moving two or four metres inside the virtual world. Using a camera, or something that represented the camera, Favreau and his crew would physically move it around the stage. Its three-dimensional flight path was tracked and reflected inside the virtual world, enabling the team to ‘shoot’ inside Unity. With a custom setup combining the Vive with an OptiTrack sensor system and US Digital encoders, the crew could use and track conventional camera equipment, be it Steadicam stabilisers, cranes or wheeled dollies. Furthermore, those filmmaking tools could be repurposed for other uses, such as an encoded pan or tilt on a dolly head, or a tilt up lifting the camera into the air like the arm of a Fisher dolly. A drone pilot was hired to operate a virtual quadcopter so that aerial shots could be captured in a believable manner. One of the difficulties that arises with the virtual reality toolset is making the film appear to have a human touch, which means understanding how to imitate how the movie would have been filmed conventionally in real life. One such example was the need for a drone for a wide aerial shot of Zazu: following the pre-set virtual path, an operator with great real-life experience of drone-like movement would know how to move it so that it felt natural to the audience’s eyes.
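The movement scaling described above can be sketched in a few lines of Python (hypothetical numbers, just to illustrate the mapping from the physical stage to the virtual set):

```python
# Sketch of mapping a physical stage position into the virtual world:
# a movement scale lets a small stage like the Volume cover a much larger
# virtual set, e.g. one real metre becoming four virtual metres.

def to_virtual(physical_pos, origin, scale):
    """Map a physical stage position (metres) to virtual-world space."""
    return tuple(o + scale * p for p, o in zip(physical_pos, origin))

# The operator walks 1 m in x on the stage; with scale 4 the virtual
# camera moves 4 m from its origin in the scene.
print(to_virtual((1.0, 0.0, 0.0), origin=(100.0, 0.0, 50.0), scale=4.0))
# (104.0, 0.0, 50.0)
```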

Use of virtual production meant that the animation inside the master scenes was always repeatable and identical, so the filming crew could shoot the action as many times as they needed to. It also removed the need to use multiple cameras, allowing the operator to focus on a single camera. After the shoot, the so-called “3D scene files” were sent to MPC, who in turn translated the camerawork into a version of the movie with final production-quality assets and animation. This was a very new workflow for the company, but it had its advantages, such as eliminating the need to track the camera, as that had already been done. However, this sort of production increased the amount of work done by the VFX company, as MPC did all the visual effects: animation, compositing and lighting.

Large, detailed environments help you to visualize the map.

Overall, “Lion King” brought a new technological step and advancement to the world of cinematography. While on the more expensive side, virtual reality has some impressive applications and brings the ability to use countless instruments, all accessible via one headset and a pair of hand tools. Both an advantage and a disadvantage, it reduces the need for a big crew during the shooting stage but would probably require more people on the post-production as well as the pre-production side. Nowadays, more projects are bringing virtual reality into production, and it is becoming more and more convenient and available.

References:

https://ascmag.com/articles/making-the-lion-king

https://www.wired.com/story/disney-new-lion-king-vr-fueled-future-cinema/

http://engadget.com/2019-07-29-lion-king-remake-vfx-mpc-interview.html