Search Results for “DP1703” – DIGITAL PRODUCTION (digitalproduction.com – Magazine for Digital Media Production)

Creation Effects – Kreative Templates!
https://digitalproduction.com/2023/03/17/creation-effects-kreative-templates/ – Fri, 17 Mar 2023
Creation Effects at DP reader price: As announced in the article, Noel has sent us a code for a 25% discount on the templates presented in the test!

If you have already read the article in the magazine and are only looking for the code:

DigitalProductionCFX

To stand out on social media, everyone wants Hollywood, of course. That’s why the web is flooded with programmes and templates that supposedly allow you to generate “amazing” videos with just a few clicks. However, the variation options are usually limited and leave little room for creativity. That’s why you quickly realise which “awesome” videos have their origin in a template. If you want to create something really interesting, you have to show a little more “skill”. But you don’t always have to start from scratch.

In DP 01:21 (Inception in After Effects) we already presented some templates by Noel Powell. His latest collection of templates is called “Landscaper” and the name says it all. You can use his Trippy Effects for weird eye-catchers on social media and if you’re looking for interesting title animations that you can’t create with the usual title tools, you’ll be well served with the recently revised Text Effects. The Beasts feature various animated animals, Flags generates quite realistic flags, and Glitch helps you shred videos. If you want to take a look right away: creationeffects.com
The basic structure of Noel’s templates is always the same: everything is packed into an After Effects composition and built using only on-board tools, without any external plug-ins, albeit with the help of scripting. That alone is a lesson in the potential of After Effects.

Landscaper

Noel’s latest template collection is called Landscaper and the name says it all. This template can be used to create animated landscapes. With the help of the 3D layers staggered in the Z-axis, very beautiful parallax effects can be achieved. In addition to the 42 templates that can be modified down to the last detail, there is also a landscape construction kit, as is usual with Creation Effects. Noel has collected countless photos from free image portals and meticulously cut them out. These can be searched in the web browser and then dragged into the project from the local footage library. Noel has produced very clear tutorials to go with them. The included footage library alone, with around 1,000 objects consisting of landscapes, mountains, plants and objects, which can of course also be used in other projects, would be worth the purchase. The compositions are very clearly laid out and come with many tips on the workflow. Of course, you shouldn’t expect photorealism. But the templates are well suited for animated illustrations for magazine programmes, backgrounds for virtual studios or video conferences, or to spice up PowerPoint presentations.

Flags

Animating waving cloth is not actually possible in After Effects, but Noel has managed it very realistically here with a few tricks. All the parameters of the “waving” can be modified down to the last detail and the flags are truly three-dimensional. There is an extensive collection of flag textures from all countries and of course there is also a pirate flag. The realistic representation is further refined by means of fabric textures. In addition, videos and any other graphic elements can also be used as textures. The appropriate wind and fluttering noises are also included. Due to the many 3D layers, the rendering times are quite high, but even with a lot of expertise in 3D software, such results are difficult to achieve. And the template’s low price of 25 dollars is quickly offset by the labour savings.
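Noel’s exact rig is not documented in the article, but the basic idea behind a waving flag, a travelling wave whose fixed edge sits at the flagpole while the free edge flutters most, can be sketched in a few lines of Python. All names and parameters here are illustrative, not taken from the template:

```python
import math

def flag_height(x, t, amplitude=0.1, wavelength=1.0, speed=2.0, damping=True):
    """Height of a waving flag surface at horizontal position x and time t.

    The wave travels away from the flagpole at x = 0; with damping on,
    the amplitude grows with distance from the pole, so the fixed edge
    stays still and the free edge flutters the most.
    """
    k = 2.0 * math.pi / wavelength      # spatial frequency of the wave
    envelope = x if damping else 1.0    # pole edge (x = 0) is pinned
    return amplitude * envelope * math.sin(k * x - speed * t)

# Sampling this over an x/t grid and feeding the heights into a
# displacement map is one way to fake the cloth in a 2D compositor.
print(flag_height(0.25, 0.0))  # crest near the pole
```

In a real setup the same function would drive the Z-offset of the staggered 3D layers mentioned above; the damping envelope is what keeps the flag attached to the pole.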

Creation Glitch Effects

In DP 03:18 “Creating a glitch title template for Premiere in After Effects” we had a workshop on creating a glitch template. The basis at that time was the extension of a tutorial by Andrew Kramer on Videocopilot.net. In the template, a similar technique was used to some extent, but the topic of glitches is covered here down to the smallest detail. Firstly, there are lots of pre-rendered glitch clips in the footage folder, which are also suitable for keying over in Premiere and are used here in some of the templates. Some of these are actually recorded, but Noel has also created them with the integrated “Glitch Generator”. Here, the fractal noise effect is used to generate glitches in greyscale and colour, which are superimposed in several layers using blending modes. There are also plenty of disturbing sound effects to match.
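The Glitch Generator itself is built from After Effects’ fractal noise effect, but the layering principle, greyscale noise layers superimposed with blending modes, translates to a short Python sketch. The blocky noise below is a crude stand-in for fractal noise, and all function names are hypothetical:

```python
import random

def screen(a, b):
    """'Screen' blend mode on normalised [0, 1] values: brightens, never clips."""
    return 1.0 - (1.0 - a) * (1.0 - b)

def noise_layer(width, seed, block=4):
    """A blocky horizontal noise strip - a crude stand-in for fractal noise."""
    rng = random.Random(seed)
    values = []
    while len(values) < width:
        values.extend([rng.random()] * block)  # repeat each value -> glitch bars
    return values[:width]

def glitch_strip(width, seeds):
    """Superimpose several noise layers using the screen blend mode."""
    result = [0.0] * width
    for seed in seeds:
        layer = noise_layer(width, seed)
        result = [screen(r, v) for r, v in zip(result, layer)]
    return result

strip = glitch_strip(16, [1, 2, 3])
```

Because screen blending only ever brightens, stacking more layers pushes more of the strip towards white, which is roughly what the layered fractal noise does in the template.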

Creation Title Effects

This somewhat different title generator offers a lot that some expensive plugins cannot. There is something for every taste among the 200 templates. Even if the colour scheme is often a little “American colourful”, everything can be changed to your own taste, including the text, the font and the colours. The headings all have an initial animation. The spectrum ranges from “normal” graphic titles to those that cannot be created with a title programme. These include, for example, texts generated from sand, bubbles or other particles. 3D texts, metal texts with reflections, chalk, chrome, clay, clouds, ice, stone, fire, smoke and much more. For an overview, there is an online video and a PDF showing all 200 title templates.

Creation Trippy Effects

If you want to produce wacky music videos and social media clips, this is exactly the right place for you. There are not only colourful motion graphics backgrounds, but also very unusual effects. These are particularly suitable for music videos. Colours are played with, videos are melted and covered with jelly. They are also distorted and frozen line by line. However, these spectacular effects, which are created using time remapping and displacement maps, are very computationally intensive.
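The “frozen line by line” look comes from time remapping driven by a displacement map: each scanline of the output samples a progressively older frame. A stripped-down sketch of that idea, using plain nested lists instead of real frames and hypothetical names throughout:

```python
def line_displace(frames, t, rows_per_frame=1):
    """Build an output frame where scanline y samples frame t - y * offset.

    frames: a list of frames, each frame a list of rows (scanlines).
    t: index of the current frame. Lower rows lag further into the past,
    which produces the smeared, line-by-line frozen look.
    """
    current = frames[t]
    out = []
    for y in range(len(current)):
        src = max(0, t - y * rows_per_frame)  # clamp at the first frame
        out.append(frames[src][y])
    return out

# Three tiny 3-row "frames" whose pixel values equal their frame index:
frames = [[[i, i] for _ in range(3)] for i in range(3)]
result = line_displace(frames, 2)  # rows come from frames 2, 1, 0
```

This per-line lookup into the frame history is also why the effect is so computationally intensive: the compositor has to keep many frames in memory at once.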

Conclusion

The tested templates were again a lot of fun. And thanks to Noel’s detailed tutorials, you always learn something new, because he makes no secret of how the templates are built and how they work. What’s more, Noel gives us a 25 per cent discount for our readers on the templates presented.

The code is: DigitalProductionCFX
The open libraries of Hollywood
https://digitalproduction.com/2017/10/01/the-open-libraries-of-hollywood/ – Sun, 01 Oct 2017

At this year’s FMX there was a talk „Open Source Software in the Motion Picture Industry – An Investigation by the Science and Technology Council of the Academy of Motion Picture Arts and Sciences”. In a full room, David Morin presented the results of a survey about Open Source libraries currently in use in the Motion Picture Industry, conducted with big studios, software vendors, developers and CTOs – including a live Q&A with the FMX audience.

David Morin is chairman of the Joint Technology Subcommittee on Virtual Production, past co-chair of the ASC-ADG-VES Joint Technology Subcommittee on Previsualization and he organized the first Academy Summit on Open Source Software on behalf of the Science and Technology Council of the Academy of Motion Picture Arts and Sciences.

David earned a B.Sc.A. in computer science from Laval University (Quebec City, Canada) and has participated in the development of motion capture and 3D software at companies such as Softimage, Microsoft, Avid Technology and Autodesk. Today he is president of David Morin, LLC, a consultancy specialising in immersive production, working with the Academy of Motion Picture Arts and Sciences. He works from Los Angeles, California.

DP: When you did the survey for the Academy of Motion Picture Arts and Sciences, what was its goal?
David Morin: This survey was initiated by the Science and Technology Council of the Academy of Motion Picture Arts and Sciences under the leadership of Rob Bredow, an Academy member who is also CTO of Lucasfilm. It was, basically, a survey on Open Source Software (OSS) libraries. The reason we felt this had become necessary was that libraries such as OpenEXR, Alembic and OpenSubdiv are used a lot in the production of movies, either inside products like Maya and Nuke or directly by production facilities in their various pipelines and in-house tools. The number of libraries has been growing, as well as the number of versions that are in use for each of the libraries. There have been some problems in production with OSS libraries, and the Academy wants to see if it could and should help – therefore the survey. As we did that, we discovered more about the vibrant OSS community and the large number of libraries used in today’s pipelines, and it has been very helpful to gather information to inform the decision on how the Academy should react. This decision hasn’t been made; we are just gathering data so far.

DP: What could the Academy do?
David Morin: The range of possible actions after the survey basically runs the gamut between „Do nothing, open source is fine“ at one extreme and „There are problems the Academy can solve, and it should put together an organization that will solve those problems“ at the other.
Other suggestions were that the Academy could host a forum for all VFX- and Movie-People to exchange information, links, and resources. Or we should work together with an outside Foundation that has experience and a community that can help with the day-to-day challenges of Open Source Software.

DP: If you would work together with foundations for open source software, would the „special requirements“ of the VFX- and Studio-Industry be compatible with general IT-Solutions?
David Morin: I think so. For one, the VFX studios already use a lot of „general” IT open source systems – databases, file handling, programming languages and so forth. What we are talking about are the libraries that are developed specifically for the Motion Picture Business. And there are a number of examples of big foundations – such as the Linux Foundation, the Apache Foundation and the Eclipse Foundation – that provide customised environments for certain markets. For example, the Linux Foundation has a big initiative around the car industry – they provide an open source environment for developments in that industry, called Automotive Grade Linux (AGL).

DP: One big problem with the Motion Picture industry is constant change – but Open Source Libraries often „die“ – in the sense that they are no longer maintained, the owner or developer abandons the project and the last update is a decade ago. How could the Academy handle something like that?
David Morin: That is one thing to be solved – some of the most important libraries could be transferred to an organization that would own them and have the responsibility to manage them. This problem of ownership – either losing an owner, or having an owner who is busy with other projects – is one of the many problems we are facing. Thankfully, so far the departing owner of the most important libraries has been replaced by someone else – but we could do better. One solution could be a “Badge of Honour” system to incentivize companies to support OSS libraries. But I think the better incentive is money. It is important to get an understanding of the unique challenges of “free” Open Source Software: the cost of development – the work hours of your programmers, user testing, QA and so forth – makes time the currency. Fixing problems, enhancing libraries, upgrading – all this requires time. Any solution for those problems will require an investment from somewhere. When a company assigns someone to use or develop an OSS library, it takes this person’s time away from something closer to the company’s revenue stream. So we often found that shining a light on the value of Open Source Software and positive reinforcement gets the company to say „Ok, we are very busy, but we will do something about the update of this library“. And this process of the Academy doing a survey has in fact revived some OSS libraries that might have been lagging behind a bit – the simple fact of asking questions about them raised the priority of the work. We don‘t know where this will lead, but having a discussion was a great first step to rally everyone and share the understanding – and the value – of Open Source Software.

DP: In your talk, you showed the preliminary data about the most used libraries – in fifth place came an open standard that the Academy developed itself: ACES, the Academy Color Encoding System. Do you think that the Academy logo and name in the title could help promote libraries, and therefore guide further development?
David Morin: Great question. ACES was developed by the Science and Technology Council of the Academy because there is a need for a standard to manage colour through the life cycle of movie and TV production. The goal was to develop a standard for the industry to use – and to see ACES coming up in this survey was interesting. If you think about it: ACES version 1.0 was released just a couple of years ago, and it is already almost in mainstream use. People and companies use ACES because it is useful, but the attached label does not hurt. As for mentioning or endorsing other libraries through the Academy: that is a possibility, but we are far from any decision.

DP: Obviously during the survey, you talked – both in the forms and in person – to a lot of different studios, vendors, producers, CTOs and freelancers all over the industry. With that in mind, could you give us some tips for implementing OSS libraries?
David Morin: Glad to! Firstly: Make sure you consult these two important resources and keep them close at hand: opensourcevfx.org – an excellent place to see what already exists – and the great VFX Reference Platform from the Visual Effects Society at vfxplatform.com, which lists most of the OSS libraries used in movie production today, and which versions are recommended right now.
Secondly: Read the license agreement in its entirety. Don‘t skip it or just skim over it – every software, commercial or open source, has a license agreement, and make sure you understand what you are signing. If you are only starting out, do your research – that invested time will help you down the line.
Thirdly: If you are a vendor who is releasing an application, a studio who is building a pipeline or anything else in between: make sure that you have a well-defined and structured development process – when you commit to an open source library, approach it as a living project, and plan to stay on top of future developments, upgrades, and changes. Adhere to good software development practice, and make sure that you are not limiting your future developments by using too many workarounds and getting locked into any particular version. If you are in charge of software development, make it easy for your engineers to use OSS libraries so that they don‘t do that in secret – if you are developing software, you probably are using OSS libraries already, so getting acquainted with the process and the licensing system isn’t that big of a stretch.
And a fourth tip: If you are developing with OSS, embrace the spirit – pretty much all libraries are on Github and have dedicated forums and message boards, where you can find answers to most of your questions – the community helping itself and each other is one of the major benefits of OSS.

DP: And with the survey itself: What will happen next?
David Morin: There was an Academy summit on Open Source Software in February, where some of the people who answered the survey got together and discussed the results and the potential actions to come. And during Siggraph 2017 we will hold a second meeting, where the state of the data – including the answers we gathered here at FMX – is going to be discussed. We are hoping to hear from existing foundations about the type of service they could provide to help with Open Source in the Motion Pictures. The investigation continues.

OpenEXR: OpenEXR is a high dynamic-range (HDR) image file format developed by Industrial Light & Magic for use in computer imaging applications. openexr.com

Alembic: Alembic is an open computer graphics interchange framework – it distills animated scenes into a non-procedural, application-independent set of baked geometric results and is focused on efficiently storing the computed results of complex procedural geometric constructions. Alembic 1.0 was released in 2011 by Lucasfilm and Sony Pictures Imageworks with support from Autodesk, Side Effects Software, The Foundry, Luxology, Pixar’s RenderMan and NVIDIA. alembic.io

OpenColorIO: OCIO is a complete color management solution geared towards motion picture production with an emphasis on visual effects and computer animation. OCIO is compatible with the Academy Color Encoding Specification (ACES) and is LUT-format agnostic, supporting many popular formats. OpenColorIO is released as version 1.0 and has been in development since 2003. OpenColorIO is free and is one of several open source projects actively sponsored by Sony Imageworks. opencolorio.org

OpenVDB: OpenVDB is an open source C++ library comprising a new hierarchical data structure (and toolset) for the storage and manipulation of sparse volumetric data on 3d-grids. Developed and maintained by DreamWorks. openvdb.org

ACES: The Academy Color Encoding System (ACES) is a standard for managing color throughout production, from capture through editing, VFX, mastering, presentation, archiving and remastering. ACES focuses on consistent colors and solves many of the problems that came along with digital workflows.
www.oscars.org/science-technology/sci-tech-projects/aces

PTEX: Ptex is a texture mapping system developed by Walt Disney Animation for rendering without UV assignment. The files can store thousands of texture images. ptex.us

OpenSubdiv: OpenSubdiv is an open source library that implements subdivision surface (subdiv) evaluation on CPU and GPU. It is optimized for drawing deforming subdivs with static topology at interactive framerates. The source code for OpenSubdiv is in open beta (current status as of June 2017) and is located on GitHub. graphics.pixar.com/opensubdiv

So much cattle stuff!
https://digitalproduction.com/2017/04/27/so-viel-viehzeugs-retro-artikel/ – Thu, 27 Apr 2017

Review: In DP 03:2017, Warner Bros. reached for “Fantastic Beasts and Where to Find Them” because J.K. Rowling was finished with Harry Potter after volume seven. Off we went to the beasts (wherever they are to be found)!

Harry Potter’s school years in book and film form were incredibly successful. But when author J.K. Rowling put an end to the adventures of the sorcerer’s apprentice after volume 7, the franchise’s reliably lucrative box-office results dried up. So the studio reached into a new corner, and the fictional encyclopaedia of mythical creatures became the story basis for a new film series.

Rowling had already published the two small volumes “Fantastic Beasts and Where to Find Them” and “Quidditch Through the Ages” in 2001; she wrote the animal encyclopaedia under the pseudonym Newt Scamander. Both books are standard reading at Hogwarts School in the Harry Potter novels. Rowling wrote the story of the creation of the encyclopaedia, about Newt Scamander and his magical creatures and set 70 years before Harry’s lifetime, in screenplay form. The first “Fantastic Beasts” part was released in cinemas in Germany on 16 November 2016; the film was released on DVD and Blu-ray in April 2017.

For the spin-off, numerous full CG creatures had to be created that deviated visually from the norm. Double Negative, Framestore, Rodeo FX, Milk VFX, Image Engine and Cinesite were the VFX studios involved in the project.

Creatures at MPC

MPC realised over 220 VFX shots for the film; the VFX supervisor for the team was Ferran Domenech (“Legend of Tarzan”, “Godzilla”). MPC is already experienced in creating magical effects, as the studio was part of the VFX crew for all 8 previous Harry Potter films. For “Fantastic Beasts”, MPC created the titles including the Warner Bros. logo, various crowd and environment extensions as well as the Manhattan environment when Newt arrives in NY by ferry. The most complex task, however, was to bring the three creatures Demiguise, Billiwig and Occamy to life – including, in Occamy’s case, all the destruction that its enormous ability to grow brings with it.

The film was mainly shot at Leavesden Studios in north-west London, where the other parts of the Harry Potter saga were also created. Some other original plates, such as the one in the shopping centre where Demiguise and Occamy are hiding, were filmed in Birmingham.

Monkey with silver curls

Demiguise – a small monkey-like creature that can turn invisible and read the future – was the first creature MPC worked on for the project. The team used the specially developed FurtilityGroom technology combined with simulated cloth geometry strands to create his long, silver-coloured hair. These allowed for natural hair movement and interaction with the creature’s limbs and the environment. To create the effect of Demiguise becoming invisible, the Furtility team developed a new texture projection tool that allowed the background images to be painted over the fur and moved realistically. Demiguise’s facial and body movements were animated with keyframes.
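MPC’s texture projection tool is proprietary, but the core compositing step, blending each fur pixel toward the background image projected behind it, reduces to a linear interpolation. This is a sketch of that one step only, with hypothetical names; the real tool also keeps the projection tracking realistically as the creature moves:

```python
def project_background(fur_px, bg_px, invisibility):
    """Blend a fur pixel toward the projected background pixel.

    invisibility = 0.0 -> fully visible fur; 1.0 -> background only,
    i.e. the creature has vanished. Pixels are (r, g, b) in [0, 1].
    """
    k = max(0.0, min(1.0, invisibility))  # clamp the blend factor
    return tuple((1.0 - k) * f + k * b for f, b in zip(fur_px, bg_px))

# Halfway through the vanishing act, a red fur pixel over a blue wall:
mid = project_background((1.0, 0.0, 0.0), (0.0, 0.0, 1.0), 0.5)
```

Animating the `invisibility` factor per strand of fur rather than per frame is what gives the shimmering, partial-transparency look described above.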

Feathered giant snake

The most complicated creature, however, was the feathered serpent Occamy, which has wings and a dragon-like face. Occamy is always as big as the space that surrounds it – in the case of the hall-like department store, the magical creature had to wrap itself around the roof beams and look incredibly long. To achieve this, the asset team created five different variations of the body. The enormous body was also divided into different parts so that the space could be completely filled and the carefully designed composition adjusted in each shot.
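The underlying “as big as the surrounding space” rule can be expressed as a simple fit computation. This is an illustrative sketch of the geometric idea, not MPC’s actual Furtility customisation:

```python
def fit_scale(creature_size, space_size):
    """Uniform scale factor so the creature just fills the enclosing space.

    Both arguments are (width, height, depth) extents. Taking the
    smallest per-axis ratio keeps the creature inside the space on
    every axis; scaling by it makes the tightest axis touch the walls.
    """
    return min(s / c for c, s in zip(creature_size, space_size))

# A 1 x 2 x 1 creature in a 10 x 10 x 10 hall scales up 5x (limited
# by its height); the same creature in a mouse-hole scales far below 1.
hall_scale = fit_scale((1.0, 2.0, 1.0), (10.0, 10.0, 10.0))
```

Evaluating such a factor interactively per shot, rather than baking one size into the asset, is essentially what the interactive-scaling customisation described above makes possible.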

For the realisation of Occamy, the team refined the SnakeRigging technology previously developed for the Harry Potter films, with a snake body covered in feathers providing an additional challenge that was overcome using MPC’s Furtility tool. The creature’s complex transformation performance, ranging in size from a house to a mouse, was achieved by customising the Furtility tool to allow Occamy to scale interactively.

In the sequence, Newt scares Occamy, causing it to get caught in the roof structure of the department store and destroy the building as it tries to free itself. For this scene, MPC built a detailed set extension of the attic including beams, screws and nails, wood panelling and an outer layer of shingles. The team realised the destruction effects of the CG set using the studio’s own destruction technology Kali.

Dinner at Cinesite

Cinesite realised around 100 VFX shots for “Fantastic Beasts” – including the scene in which Newt enters the magical world with the help of his suitcase, various New York CG environments and the entire dinner sequence in the Goldstein apartment. The project kicked off at the beginning of 2016, after which a small but permanent team worked continuously until the end of September 2016 to complete the shots. Like MPC, Cinesite was also on board as a VFX service provider for all previous Harry Potter films; the studio has already worked on a total of 2,000 shots for the series.

Cinesite supervisor on the project was Andrew (aka Andy) Morley, who has been working in the digital film industry since early 2000. He was involved in “Harry Potter and the Chamber of Secrets” as technical supervisor. His other projects include “Batman Begins”, “Transformers”, “Avatar” and “Gods of Egypt”.

DP: Post-production for the first “Harry Potter” instalments took place more than 15 years ago. How has working on the film series changed for you in the meantime?

Andy Morley: The technology and artist skill base that we were able to draw on for “Fantastic Beasts” is much more mature and reliable compared to the earlier Potter days. Today it can deliver any effect imaginable, so the challenge is increasingly to create visually compelling VFX work that audiences around the world have never seen before. The expertise and skillset available in the UK guarantees that the films will look great across the entire franchise.

DP: How did you realise the scenes where Newt disappears into the magical world through the suitcase?

Andy Morley: One of the key sequences involving the suitcase took place inside the Goldstein flat. In it, Jacob hesitates to jump into the suitcase after Newt – partly because, due to his girth, it is much more difficult for him than for the slender Newt. As was to be expected, he gets stuck and then tries to slide out by moving up and down. The original plate for this was shot using a real suitcase with Dan Fogler’s legs sticking down through the floor. As the prop suitcase on set was a slightly different size to the one Newt jumps through, it had to be replaced with a full CG suitcase for this scene.

A lot of our work involved removing Dan’s legs via painting and cleaning, as well as rebuilding the floor when the suitcase bounces into the air on top of it. Additional cleaning was required to bend and manipulate Jacob’s arms to convincingly close the scaled-down edges of the CG suitcase. To enhance the realistic look, we added interactive shadows to the environment. The final result required a lot of back and forth between the colour grading, compositing and animation departments. The animations were ultimately driven by the actor’s movements on set, while the compositing artists placed the CG suitcase in the shots. By giving the suitcase more dynamic movements, we gave it the impression of having a will of its own. As there were relatively few suitcase shots, warping and deforming effects in the compositing, supported by some 2D adjustments, allowed us to achieve the final results. Some subtle dust effects with each bounce on the floor gave the weight and impact of the suitcase a convincing feel.

DP: How was the NY environment created?

Andy Morley: A key sequence of our work in this regard was the view from the window of the Goldstein flat over New York City. We created set extensions for this, which consisted of a mix of rendered 3D buildings, projected 3D building details and matte paintings. We created the background using references (texture photos, lidar scans and building photographs) from New York in the 1920s and 1930s. Another CG environment of this type was created for a later situation in the film, in which Newt and Tina stand on the edge of a New York rooftop. The realistic dark night lighting for this sequence, in which the action taking place is still legible, had to be finely balanced. As the scene was filmed entirely in green screen, it took many hours of work before the night-time full CG cityscape behind the actors looked believable.

DP: You were responsible for the entire dinner sequence in the Goldstein flat. How did you realise the self-drying clothes?

Andy Morley: In this sequence, the injured Jacob is brought into the Goldstein flat at the beginning. There, he is startled by clothes on a drying rack that rotate automatically. This was originally filmed as a live-action scene with real clothes on wires. However, the production later decided that the scenes did not look fluid enough and that they should be replaced with CG objects. The design of the clothes horse was also changed to vertical rods. The garments were realised in eight shots using a mix of animation and cloth simulation with Maya and nCloth.

DP: And Queenie’s magic dress that wraps itself around her?

Andy Morley: The full CG dress had to be seamlessly integrated into the environment shots, while the actress wears the real version of the dress in some shots. A believable implementation was complicated because cloth software is usually used to replicate the behaviour of a real-world cloth material. However, this world is a magical one, so Queenie’s dress had to behave unconventionally. Actress Alison Sudol played the scene with the full CG dress in her underwear – the digital dress was animated with a complex Maya rig that allowed for adjustments to the dress to match the movement and realistic deformations in the Cloth simulation. Shape problems could be solved using extensive geometry sculpting. As the real fabric material of the dress did not react particularly well to the real lighting, it was optimised for the final look with a more interesting finish in Nuke.

DP: Then dinner is served – all the ingredients fly through the air, prepare themselves and land on the table. How did you proceed for this sequence?

Andy Morley: An exact choreography was defined for the numerous flying CG objects such as bowls, plates, apples, napkins, cutlery and glass jars. All the objects on the laid table are also full CG. In one shot you can see a jug of cloudy lemonade, the contents of which were not animated with a fluid simulation, but with a deforming effect of the surface. Even the candles are CG, we have complemented them with a manipulated flame element. We also changed the real lighting a little. In particular the one on Jacob to reduce the harshness of the initial lighting on his face – this created an interactive lighting effect. The highlight of the dinner is the apple strudel: all the ingredients swirl around in front of Jacob’s face, the fruit wraps itself in layers of dough before everything is baked to a crispy brown and the cake sinks to the centre of the table, ready to be eaten. We created this shot with customised FX and used Houdini for stronger deformations. The animation was created with Maya, shading was done with Arnold. As we wanted to give the whirlpool a photorealistic look, the team wrote new shaders for it and developed various render-related sequences for the animation of the surface baking in the air. We turned all render settings to 11 for this.

DP: How did the compositing work with the numerous CG elements?

Andy Morley: We realised it with Nuke 9.0v5. Each shot in this sequence required individual 3D models and shaders as well as complex animated textures and displacements. Additional BlendShapes ensured that the overall shape of the pastry could shrink slightly during the baking process; heat distortion effects were added in Nuke. The different objects – some with transparent surfaces with a refraction effect – and the lighting situation on set with many different light sources that had to be recreated in the 3D scene made working on the sequence extremely complex.

DP: How was the collaboration with the other studios involved?

Andy Morley: We shared some shots of the sequence where Newt and Tina are talking on the roof of a New York building with Framestore. Tim Burke was the supervisor for this scene, which was filmed entirely in front of a green screen, and for us it was the last shots for the project in the pipeline. Framestore put Newt’s full CG pet Bowtruckle, called Pickett, on his shoulder, we in turn handed over the 3D layout and lighting setup to Framestore for about half of the shots. Lighting, look and grading were crucial in the edit. We compensated for the lighting in the original plate; the basis for the CG city was a single 3D scene, which was later also used as a digital matte painting and adjusted by the compositors for different camera angles. Double Negative provided us with building assets, which we further developed for the respective shots.

DP: Will you be part of the VFX team again for the next “Fantastic Beasts” instalment?

Andy Morley: I really hope so, especially as Cinesite has been involved in all the films based on J.K. Rowling’s books so far. We have a good relationship with the creatives involved in the realisation of the franchise. We would love to help bring more magical effects to the big screen.

Links

“Fantastic Beasts” trailer
youtu.be/Vso5o11LuGU

MPC website
www.moving-picture.com

Cinesite website
www.cinesite.com

Behind the Scenes “Fantastic Beasts”
youtu.be/v00xz7oB3MY

BaseGrade and the evolution of colour grading https://digitalproduction.com/2017/04/01/basegrade-and-the-evolution-of-colour-grading/ Sat, 01 Apr 2017 13:07:00 +0000 https://digitalproduction.com/?p=148698
At NAB 2016, colour grading specialist Filmlight caused quite a stir in the colourist scene. BaseGrade - a completely newly developed grading operator for Baselight - was presented, which is intended to replace classic tools such as Lift, Gamma and Gain. That sounds like a small revolution. Filmlight promises more consistent results and a more natural way of working. Reason enough, and not only for Baselight colourists, to take a detailed look at it.

A rough understanding of the evolution of colour correction helps in evaluating BaseGrade. The origins of colour grading as we know it today go back a long way, to the early days of television and cinema. For TV, video signals from television cameras or from a film scanner have always had to be levelled and corrected. The profession of the colourist originated in the telecine suite, where film material was converted into a pleasing video signal.

VideoGrade

The technicians provided the colourist with four basic technical parameters with which he could process the video signal: Lift, Gamma, Gain and Saturation. These are still among the most popular grading tools today. In Baselight, they can be found in the VideoGrade Operator. With Lift, which is sometimes also called Pedestal, the colourist adjusts the black level and with Gain the white level. Experienced Photoshop users will find it easy to visualise the resulting gradation curve: Lift sets the starting point (bottom left) and Gain sets the end point (top right). When working with VideoGrade, these are the most important reference points. Gamma is technically a power function whose only parameter is the exponent. The gamma function determines the curvature of the gradation curve between the two end points.
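As a hedged sketch, the three parameters can be written as a simple transfer function on normalised video values. Parameterisations and the order of operations differ between products, so this is one common illustrative form, not Baselight's actual implementation:

```python
def video_grade(x, lift=0.0, gamma=1.0, gain=1.0):
    """Classic lift/gamma/gain on a normalised [0, 1] video value.

    Gamma is a pure power function that bends the curve; lift then sets
    the starting point (black) and gain the end point (white), matching
    the gradation-curve picture described in the text. Real products
    differ in conventions; this is one common form for illustration.
    """
    y = max(x, 0.0) ** (1.0 / gamma)   # gamma: curvature between the endpoints
    y = y * (gain - lift) + lift       # lift/gain: black and white levels
    return min(max(y, 0.0), 1.0)

print(video_grade(0.0, lift=0.1))    # 0.1 — black level raised
print(video_grade(1.0, gain=0.9))    # 0.9 — white level lowered
print(video_grade(0.5, gamma=2.0))   # ~0.707 — mid-tones brightened
```

Note that in this form gamma leaves the endpoints untouched, which is exactly why colourists can set "clean" black and white first and then shape the mid-tones.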

Light determination – The exposure tool in FilmGrade not only simulates working with copy lights, but also indicates the strength of the correction in printer points.

These three parameters can be adjusted not only for brightness, but for all three colour channels of the video signal (RGB). However, the operator is not usually presented with individual controls for red, green and blue, but one for the brightness signal and a two-dimensional one for the colour component. This results in the basic structure of all grading panels: three spheres, which adjust the colour in two dimensions, and a rotating ring around or next to them for the luma setting. Lift is on the left, gain on the right and gamma in the centre. Colloquially, these three parameters are often referred to as shadows, mid-tones and highlights.

A very popular and solid working method used by video style colourists is to first adjust the black level and white point of the image. This involves balancing all three channels in both black and white just before the clipping points. The brightest point in the image is then pure white, i.e. without a colour cast and with maximum brightness, and the darkest point is pure black. This is often referred to as “clean” black and white. The gamma parameter is then set. This regulates the “airiness” and “heaviness” of the image. In other words, a combination of brightness and contrast. If you want to colour the image, for example, you often do this via the gamma, as black and white then remain “clean”.

Look and feel – The BaseGrade user interface. The developers have orientated themselves on the existing tools in Baselight. Users can create their own layouts on new pages as usual.

Baselight provides two modes for VideoGrade. The standard mode is RGB. In YCbCr mode, the luma channel is processed in isolation. Changes in brightness then have no effect on the colour and saturation of the image. Over time, VideoGrade has also become the most important tool for telecine-style colourists. The colourist manually converts an image from a colour space with a high contrast range, such as log coding, into the output colour space. VideoGrade is therefore now not only applied to images in a video colour space, as originally intended, but also to images in a log colour space.

Lift, Gamma, Gain is probably the most frequently used grading operator in the video sector at present. However, it originates from a time when the handling of specular highlights, for example, was of secondary importance. Burned-out windows or overbright skies were tolerated as long as the faces were recognisable. A soft clip, i.e. a smoother roll-off into the overshoot range, is not possible with VideoGrade alone. Over time, video colourists therefore developed various techniques to meet this aesthetic requirement: gradation curves, luma keys, blend modes or dedicated soft-clip operators, for example.
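One such workaround, a dedicated soft clip, can be sketched as a simple knee function that rolls overshoots smoothly towards a ceiling instead of clipping them hard. This is an illustrative form, not any particular product's operator:

```python
import math

def soft_clip(x, knee=0.8, ceiling=1.0):
    """Roll off values above `knee` asymptotically towards `ceiling`.

    Below the knee the signal passes through unchanged; above it an
    exponential shoulder compresses overshoots instead of clipping.
    The shoulder's slope is 1.0 at the knee, so the join is smooth.
    """
    if x <= knee:
        return x
    span = ceiling - knee
    # Exponential shoulder: approaches `ceiling` but never exceeds it.
    return knee + span * (1.0 - math.exp(-(x - knee) / span))

print(soft_clip(0.5))   # 0.5 — below the knee: unchanged
print(soft_clip(1.5))   # ~0.994 — overshoot compressed below the ceiling
```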

Another disadvantage becomes apparent in VFX workflows. Contemporary compositing works most realistically in a scene-linear colour space. The pixel values are proportional to the photons on set. Pregrading is nevertheless often helpful so that the basic brightness and white balance are correct and the individual shots in a sequence match each other. Unfortunately, lift, gamma and gain destroy the scene linearity and make VideoGrade unusable for this type of VFX pregrading.
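The problem is easy to demonstrate: in a scene-linear encoding, pixel values are proportional to light, so the ratio between two pixels must survive the grade. A pure gain (multiplication) preserves those ratios, while a lift (additive offset) does not. A minimal sketch with made-up values:

```python
a, b = 0.125, 0.5        # two scene-linear pixel values; b is four times brighter

# Gain: pure multiplication preserves the 4:1 ratio (scene-linear safe)
ga, gb = a * 2.0, b * 2.0
print(gb / ga)           # 4.0 — ratio unchanged

# Lift: an additive offset changes the ratio (scene linearity destroyed)
la, lb = a + 0.125, b + 0.125
print(lb / la)           # 2.5 — no longer 4:1
```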

FilmGrade

Long before the telecine colourists, the profession of the film light setter emerged. They influenced the look of a cinema film via the intensity of the copy lights and the chemical processes. With the introduction of the digital intermediate process around the year 2000, the new profession of DI colourist emerged. This person processes cinema images digitally before they are exposed on film material and copied onto print material.

The source material was also usually a film scan. The digital intermediate gradually replaced the analogue intermediate process and thus the creative part of determining the light in the copying plant. Film is the all-determining factor in the DI process, as you can only produce colours that can also be reproduced on film material. The analogue process up to the cinema copy is therefore simulated live in grading using a so-called Film Print Emulation LUT. The obligatory film LUT, which is used in the preview output, means that the image reacts differently to the colourist’s inputs than in telecine or video mode.

The manufacturers have developed new grading tools for DI processing, which are based on analogue light determination with copy lights. In Baselight, this tool is called FilmGrade. FilmGrade is designed for processing images in the Cineon log colour space. After the colour correction, a conversion to a display colour space takes place, for example classically via a LUT or, since Baselight 4.4, via shaders with Truelight colour spaces.

FilmGrade consists of a total of six tools, which are divided into two tabs. The main page consists of Exposure, Contrast and Saturation. The second page consists of Shadows, Midtones and Highlights. All tools offer adjustment options via a ball and a rotating ring. The most important tool is Exposure, which can also be adjusted in RGB copy light steps, so-called printer points, via the blackboard panel. The colourist uses Exposure, also known as Offset in other grading programs, to adjust the brightness and the sphere to adjust the colour of the image. Film-style purists try to work with exposure as much as possible, as this simulates an analogue light setting and the image remains very natural. The contrast in all colour channels and the scene linearity are retained when changing the exposure, for example. Shadows, midtones and highlights do not correspond to lift, gamma and gain, as the individual areas are limited by pivot points. If, for example, the shadows are lifted or coloured using Shadows, this is only done up to a defined point in the curve.
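The printer-point scale relates to exposure through a simple convention: in the widespread Kodak scheme one point corresponds to 0.025 in log exposure, so roughly twelve points make one stop (0.30 logE). A sketch of that conversion — the exact step size varied between labs, so treat the constant as an assumption:

```python
import math

LOG_E_PER_POINT = 0.025            # common Kodak convention (varies by lab)
LOG_E_PER_STOP = math.log10(2.0)   # one f-stop doubles exposure, ~0.301 logE

def points_to_stops(points):
    """Convert printer-light points to an approximate f-stop change."""
    return points * LOG_E_PER_POINT / LOG_E_PER_STOP

print(points_to_stops(12))   # ~1.0 — twelve points is about one stop
print(points_to_stops(6))    # ~0.5 — six points is about half a stop
```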

With Lift, on the other hand, the entire image is processed, but the shadows are the strongest. In film workflows, the LAD grey test field approximately in the middle of the curve is the anchor point both for calibration and for the FilmGrade colourist. It was defined by Kodak and is a little darker than 18% medium grey. Full black and white are difficult to determine on film because the curve is very flat. This is why a film-style colourist, unlike a video-style colourist, is not so concerned with achieving 100% white or black. With this concept, a soft clip is already active via the simulation of the print material. The visual impression takes centre stage. This grading concept is therefore more natural than VideoGrade.

There are also fundamental problems with this concept. Although controlling the brightness via Exposure works quite naturally, it is not exactly the same as changing the camera aperture or the ISO value. This is due to the log coding used, which does not define the black level exactly to zero, for example. The detailed reasons are beyond the scope of this article.

HDR ready – BaseGrade was developed with HDR formats in mind, but working with regular dynamic range is also easier.

FilmGrade is now used not only on Cineon log data, but also on any type of log coding such as LogC from Arri. On the output side, not only film emulations but also modern approaches such as ACES are used. This brings us to future-proofing: log coding can only store a limited dynamic range. The Cineon curve, for example, was no longer sufficient for the high contrast range of the Alexa camera. This is why Arri developed the LogC curve. Future camera generations and HDR displays will require further adjustments, which may involve compromises. This also applies to the VideoGrade operator, which was originally only designed for video signals with a standard dynamic range.

Unity – Below was pushed by one f-stop in the raw settings and above via BaseGrade. The result is identical.

BaseGrade – under the bonnet

The time is ripe for a next-generation colour grading operator. Filmlight uses neither a video signal nor film material as its foundation, but algorithms modelled on human perception. The dynamic range is not limited by a technical format such as Rec. 709 or Cineon coding, but is ready for a future characterised by HDR.

BaseGrade works identically in every working colour space and therefore always feels the same to the colourist, regardless of the camera used. However, in addition to a sensible colour setup in the scene settings, this also requires the correct keywording of the material, which is usually done automatically. BaseGrade autonomously converts the image into a linear colour space in which the original brightness ratios of the scene on set prevail, as in linear compositing. The user is not aware of the colour space conversion; the next operator in the stack receives the image in the defined working colour space again.

BaseGrade uses an internal colour model based on Lab, which consists of a pure lightness component L and two colour components a and b. The colour plane spanned by a and b has been distorted by the developers with regard to colour correction so that colour and saturation changes have the same visual effect in all colour areas and work feels more intuitive. Four parameters affect the entire image: Flare, Balance, Contrast and Saturation. In addition to these global parameters, BaseGrade divides the image into brightness zones. This is immediately reminiscent of Ansel Adams’ legendary zone system, which, according to the developers, also served as a source of inspiration.

Stray light – Using the Flare parameter, the colourist adjusts scenes with an increased black level.


At first glance, there are few parallels to existing tools in Baselight or comparable video grading software such as Resolve. Raw development in Adobe Lightroom comes closest to working with BaseGrade. However, BaseGrade not only offers brightness controls for each zone, but also colour and saturation. In addition, each zone area can be fine-tuned via pivot and falloff. All exposure settings and pivot points are specified in f-stops. This also comes close to human perception and helps photographers and cameramen to understand them. Stops are also a widely recognised and established unit.

Baselight 5.0 

Users are provided with new builds every few weeks, but now version 5 is finally in the starting blocks. The significance of this release becomes clear when you consider that the last full version jump, from 3.0 to 4.0, was more than seven years ago (2009).

Although BaseGrade is the most significant innovation in the upcoming version, it is of course not the only one. For example, the software will be greatly expanded in the finishing area with a dedicated blue/green-screen keyer, a perspective tracker and warper, a grid warper and a paint tool. In the plug-in area, support for the Autodesk Flame Matchbox shader format and GPU acceleration for OFX will be added.

CGI renderings with WSP and normal maps can be relit in Baselight 5 or individual objects can be graded separately. And the powerful colour management has been further developed with a special focus on HDR and has been structured more clearly for the user. DP will be looking at further new features in a future issue. 

The colourist can now give the camera operator direct, meaningful feedback, such as: “I have made the entire image half a stop brighter” or “I have lowered the highlights by one stop”. If a dailies colourist works with BaseGrade, it is even conceivable that such feedback – like the copy-light report in the past – will help the cameraman when working on set. In any case, communication is simplified, especially when the cameraman cannot be physically present during grading.

BaseGrade’s reference point is medium grey, as found on 18% grey cards. The brightness zones are defined from there in f-stops. A correction of three f-stops up or down is the maximum in standard mode, which is usually sufficient. For extreme cases, up to six f-stops can be corrected in “Extended mode”.
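In scene-linear terms those zone boundaries are straightforward: each stop doubles or halves the light relative to the 18% grey anchor. A small sketch of the resulting linear values, assuming 0.18 as the reference per the grey-card convention described above:

```python
MID_GREY = 0.18   # 18% grey card, BaseGrade's stated reference point

def stops_from_grey(stops):
    """Scene-linear value that sits `stops` f-stops away from mid grey."""
    return MID_GREY * 2.0 ** stops

# Standard mode covers +/-3 stops around the anchor
for s in range(-3, 4):
    print(s, stops_from_grey(s))
# -3 stops -> 0.0225; +3 stops -> 1.44, i.e. above diffuse white (HDR territory)
```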

Purist – If necessary, the user first corrects the flare and then limits his work to the Balance parameter. The scene linearity of shots is then retained, for example in VFX pregrading.

User interface

The user interface is based on VideoGrade and FilmGrade. Three main parameters at the top, which are mapped to the three sphere-ring combinations on the blackboard panel. Below this is a visualisation of the current grade as a gradation curve, framed by other parameters such as pivot points. The developers have divided BaseGrade into two tabs. The first page is called Dim/Balance/Light and the second Dark/Balance/Bright. The most important parameter, Balance, is permanently visible and mapped to the centre sphere/ring combination on the blackboard. Flare, Contrast and Saturation are also visible on both sides and can be adjusted via potentiometers. As with all other tools, the colourist can of course adjust everything in the user interface and on the blackboard to suit their individual requirements.

The developers have come up with something special for the visualisation as a gradation curve: they superimpose a luma waveform display of the current image over the curve. The colourist can therefore always see which parts of the image are currently being worked on and how the pivot points should be readjusted if necessary.

Practical test

A little familiarisation time is needed to find your way around. However, the curiosity of having a revolutionary tool at your fingertips makes it easy to get started, and the first results motivate you to continue.

First, we put the promise of an exact aperture scale to the test. And indeed: increasing Balance by one f-stop is exactly the same as doubling the ISO value in the raw developer, and this holds for all tested cameras from Arri, RED and Sony. Best of all, no raw material is required. For example, if you are working with a mixed ProRes and ArriRAW timeline and all raw shots have been pushed by one f-stop, you can now apply exactly the same correction to all shots. A cumbersome switch to the de-Bayer settings is no longer necessary and brings no qualitative advantage over working with BaseGrade.

Adjusting from shot to shot seems to be another ideal task for the tool. Many colourists divide their grade into a base correction per shot and the creative look. Used as the first layer in the stack, BaseGrade’s scene-linear functionality makes it easier to compensate for exposure differences. Shots with diffuse lens flares, which appear in the image as raised black, are a common problem when matching shots. Even technically high-quality lenses such as the Master Primes show this effect, especially in scenes with light-coloured backgrounds. If you tried to remove the flare using Lift, for example, the entire image would change and the grade would no longer fit. The Flare parameter helps in these cases, as it corresponds to optical stray light. If the exposure is basically correct, differences in the black level can be corrected well using Flare.


When copying corrections from shot to shot, the colourist should also pay attention to the Flare value and readjust it if necessary. For example, if you copy from a scene with strong stray light, the black in the new shot may be completely washed out. No control other than Flare can bring it back to a decent level. Conversely, if Flare is set correctly, the black can practically never be crushed: in BaseGrade, the shadows roll off into a pleasant-looking compression instead of being clipped hard.

The Saturation slider is also pleasant to use. It is remarkable that something as widespread as colour saturation could be improved further. In direct comparison, BaseGrade behaves visually more evenly than existing implementations. Previously, primary colours such as red quickly became overpowering in the image when saturation was increased. Not so with BaseGrade: the strength of the effect is distributed more evenly across the colour wheel. Best of all, it works the same way when desaturating images.
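The contrast with the classic approach can be illustrated: a common RGB saturation simply interpolates each channel towards luma, which is exactly what lets primaries drift at different rates. A simplified sketch of that classic variant — BaseGrade's own Lab-based model is not public, so this only shows the baseline being improved upon:

```python
# Rec. 709 luma weights
W_R, W_G, W_B = 0.2126, 0.7152, 0.0722

def rgb_saturation(r, g, b, sat):
    """Classic saturation: interpolate each channel towards the luma value.

    sat=0 gives monochrome, sat=1 is identity, sat>1 boosts colour.
    Pure primaries move towards very different grey values (red ~0.21,
    green ~0.72), which is the unevenness a perceptual model avoids.
    """
    luma = W_R * r + W_G * g + W_B * b
    return tuple(luma + sat * (c - luma) for c in (r, g, b))

print(rgb_saturation(1.0, 0.0, 0.0, 0.0))  # pure red collapses to ~0.21 grey
print(rgb_saturation(0.0, 1.0, 0.0, 0.0))  # pure green collapses to ~0.72 grey
```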

BaseGrade shows its particular strengths through the zone model. The gradation of an image can now be modelled in great detail with just one operator. Previously, this required detailed and sometimes quite fiddly tweaking of the curves in CurveGrade or Keying, but now there is a more intuitive alternative. Extracting detail from a sky normally requires a luminance key on the highlights. With BaseGrade you can get surprisingly far without any secondary correction. Bright is lowered and by raising Light you can tease out the last details until just before clipping. If you initially set the correction too high, it is easier to find the right pivot points. You can then reduce to a realistic level.

If the colourist opts for extremely strong corrections, there is a risk of unsightly artefacts, a preliminary stage of solarisation, so to speak. Although BaseGrade prevents true solarisation, i.e. negative gradients in the gradation curve, the colourist can flatten the curve so much that detail is destroyed in certain brightness zones. In these cases a larger falloff, the transition area of the zone, usually provides a remedy, but at the same time reduces the effect slightly.

BaseGrade can also be convincing with a common technique in colour correction. Cross process toning is a popular stylistic device that makes images more interesting in terms of colour. This involves colouring the highlights and shadows in different shades. Complementary colours are often chosen for this: the shadows, for example, in cold turquoise blue and the highlights in warm orange. Colourists often use HLS keys so that they have better control over the effect. With BaseGrade and its zone model, you can precisely control the strength and colour tone of the effect. This will probably save a few layers in the timeline in the future.

On the subject of VFX pregrading: Flare is an artefact that occurs in the camera optics and distorts the scene linearity. With the help of BaseGrade, the real brightness conditions on set can be reconstructed using a correctly set flare value. Afterwards, Balance is used for complete scene-linear colour correction. The flare correction makes BaseGrade more suitable for scene-linear pre-grading than Exposure/Printer-Lights in FilmGrade.
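The underlying arithmetic is simple: flare adds a constant offset in scene-linear space, so the correction is an equal subtraction, after which pixel ratios again reflect the scene and a pure multiply (Balance) stays scene-linear. A sketch with deliberately exaggerated values:

```python
def remove_flare(linear_value, flare):
    """Subtract a constant flare offset in scene-linear space."""
    return max(linear_value - flare, 0.0)

# Two pixels that were 4:1 on set, lifted by a (deliberately exaggerated)
# constant flare offset of 0.25 in the camera optics
a, b = 0.25 + 0.25, 1.0 + 0.25
print(b / a)                                   # 2.5 — ratio distorted by flare
ca, cb = remove_flare(a, 0.25), remove_flare(b, 0.25)
print(cb / ca)                                 # 4.0 — scene linearity restored
```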

Conclusion

BaseGrade is powerful, but also complex. A properly set up project is the be-all and end-all for correct functioning. If the colour spaces are not correct, it will not work properly. It is advisable to familiarise yourself with the tool slowly at first and then integrate it into your daily work bit by bit. A good way to start, for example, would be to make all saturation corrections with BaseGrade. The next step would be to use it for basic corrections from shot to shot, and so on.

The new grading concept could also be a good introduction to moving-image colour correction for photographers with Lightroom experience and, thanks to the intuitive, aperture-based approach, for cameramen as well.

The first few days with BaseGrade were very promising: the tool has the potential to create both more natural and possibly completely new looks, and it also makes day-to-day work easier. The decisive factor will be how well it is accepted by users.

Andy Minuth is a graduate of Stuttgart Media University. He then spent several years at CinePost in Munich, working his way up from junior to senior colourist. He is a beta tester for Baselight and currently lead colourist at 1000 Volt in Istanbul. His work there focuses on commercials and cinema films. www.specular.xyz
