400 MPH – https://digitalproduction.com/2022/06/08/400-mph-retro-artikel/ – Wed, 08 Jun 2022
Blast from the past: In DP 06 : 2019 we went apeshit for the Planet of the Apes movies – and then the animated short "400 MPH" from students of "Supinfocom" came along. Ready for a chimpanzee aping Icarus?

It seems that chimpanzees have made an impression – and six students from Supinfocom Rubika took that one step further.

DP: Just to satisfy my curiosity: Did the ape at least know he reached 700 mph in the final seconds?
Team 400mph: Maybe. No matter how fast he goes, to him it will always feel like he hasn’t managed to exceed 400 mph.

DP: Considering the basic story line: How did you get the idea for that movie?
Team 400mph: The story was inspired by the attempts to break the sound barrier in the 1950s. The pilots at that time were taking incredible risks to beat this record. The Bonneville salt lake quickly became an iconic place for speed records, and it made us think of the Moebius comics, such as “Arzak Rhapsody” and “40 Days Dans Le Desert B”. Having a chimpanzee as the main character reminded us of the space race and reinforced the impression of danger for our character.
In addition, we could accentuate his emotions while still keeping him very animal. He could sometimes behave in a human way and his emotions would be clear to the viewer but he also could show a primitive rage in his moments of anger.

DP: How big was the team and how long have you worked on this movie?
Team 400mph: We were a team of six and worked on it for a year and a half – even a bit longer for the student who came up with the original idea.

DP: How did your team get together?
Team 400mph: Each student presents a project, then the teachers make a selection to keep the films that seem the most distinctive but also feasible and interesting with the knowledge we have acquired in school. Then each student decides which stories they most want to work on and makes a list of the projects that interest them and the role they want to play. We all try to shape complementary teams based on each other’s strengths and weaknesses. This helps us form a balanced group that can work on all the different aspects of creating a 3D animated movie.

DP: Can you tell us about your pipeline?
Team 400mph: We mainly worked in Maya, the rendering was done with Redshift and for the FX we used Houdini. The monkey was sculpted in ZBrush, his hair was done with Yeti, and for his textures we used the Substance suite and Mari; the modeling and simulation of his clothes was done in Marvelous Designer. The compositing was done with Nuke.
Finally, for editing, sound design and grading we used Avid Media Composer, Pro Tools and Resolve. We used a pipeline tool called Pipou, developed by Fabien Meyran for their film “028” (see page 80) – it managed the shader and animation exports as well as the whole file hierarchy. As hardware, we had six workstations, plus access to three machines with more RAM for the Houdini and Marvelous simulations. For rendering, we had six more computers equipped with GTX 1080 Ti cards as well as a server with four Quadros.

DP: Any special tools, plug-ins or scripts besides those?
Team 400mph: We created several more scripts, in the beginning for the animatic and the previs, in order to update the edit more easily. Then for the rig and the clothing simulations, especially for baking simulations onto the low-poly model. For the animation, we used AnimBot and Studio Library to make the general process easier and more time-efficient. We also had some homemade camera-shake scripts to make the shots inside the vehicles as intense as possible.
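
The team’s scripts are in-house, but the basic idea behind such a camera-shake tool – keying small random rotation offsets onto a cockpit camera – can be sketched in a few lines of Maya Python. This is only a minimal sketch; the camera name, frame range and amplitude are placeholder assumptions, not the team’s actual values:

```python
# Minimal camera-shake sketch (assumes Maya; names and values are placeholders).
# Keys small pseudo-random rotation offsets directly onto an otherwise static
# cockpit camera. If the camera already carries animation, the same offsets
# would go onto an animation layer or a parent offset group instead.
import random
import maya.cmds as cmds

def add_camera_shake(camera="cockpitCam", start=1001, end=1100,
                     amplitude=0.4, step=2, seed=42):
    random.seed(seed)
    for axis in ("rotateX", "rotateY"):
        base = cmds.getAttr(camera + "." + axis)   # rest value, read once
        for frame in range(start, end + 1, step):
            jitter = random.uniform(-amplitude, amplitude)
            cmds.setKeyframe(camera, attribute=axis, time=frame,
                             value=base + jitter)

# add_camera_shake("shot_120_cam", start=1001, end=1180, amplitude=0.6)
```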

DP: And with that setup, how long did the render take?
Team 400mph: It obviously depends on the shots, but generally speaking the exterior shots rendered in a few minutes per frame. For the monkey shots it ranged from 15 minutes to an hour per frame. Except for a few very close-up shots, we tried never to exceed one hour of rendering.

DP: The protagonist’s face is awesome and on par with the work done by Weta and ILM. How did you do that, and could you describe your design approach and the underlying rigs, textures, and muscle systems?
Team 400mph: Thank you for the comparison! The making of Icarus was a huge process that involved a big part of the team throughout the concept phase of the movie. For the design of Icarus, we really focused on the chimpanzee’s characteristics by studying many documentaries and photos dealing with the ape’s appearance and anatomy. The only difference from reality is that we wanted our character to be able to express his feelings the way humans do, while still having his animal side (for example by moving his eyebrows more than a regular chimpanzee would).
After drawing the 2D concepts, Julia used Maya to model Icarus’s skeleton to get the correct proportions and make the whole structure credible. Then she used ZBrush to add the main details, such as wrinkles, to the model. For the facial setup, she created many blendshapes for the principal expressions. Then Lorraine, who was working on the rig, added a joint system to allow the animators Alice and Quentin to deform our monkey’s face. Meanwhile, Paul-Eugene worked on texturing Icarus in Mari to give the skin the particular qualities that make the chimpanzee realistic, and Natacha worked on his eyes to get a chimpanzee/human look that was as realistic as possible. We really wanted a chimpanzee with a lot of character.
For the character FX, Lorraine handled the modeling and simulation in Marvelous Designer of the clothes Natacha had designed and textured, while Julia created Icarus’s fur with Yeti.

DP: The rig for that face must have been detailed and extremely fine-tuned. Could you describe that setup?
Team 400mph: We first focused on creating a generic yet detailed deformation system, close to what you would expect on a common human face, with jaw/lip and eyelid/eyebrow controls. The biggest difficulty was handling the deformations and their impact on the other controllers, especially around the mouth. That was built in two steps: the first was a complete skin-weight-based deformation with multiple bones around the lips and cheeks, driven by the jaw control to redistribute the skin weights when the chimp opens its mouth wide enough, and complemented with corrective blendshapes. This system gave the animators rig-based controls on top, working on the same skin-weight distribution. Above that we added a sticky-lips setup, specific blendshape deformations and sticky clusters.
The eye regions had to be especially expressive, so we worked on them specifically, based on the animated shots and the rendering requirements. We weren’t sure yet whether the chimp was going to stay faithful to a real chimp’s muscular movements or whether adding human-type expressions would deliver a better impression. We finally did add them, step by step, adapting the deformation blendshapes for nice rendered shapes and expressions, based on render feedback from the closest shots.

In the end, it’s a rig-based setup augmented with corrective blendshapes and sticky clusters, with maximum control over each stage of deformation to get all the subtle movements right.
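
The rig itself is the team’s own, but the heart of the first step described above – a corrective blendshape whose weight is driven by how far the jaw opens – can be illustrated with a small Maya Python sketch. The mesh, target and control names below are hypothetical, as are the driven values:

```python
# Sketch of a jaw-driven corrective blendshape (assumes a Maya scene with a
# skinned head mesh, a sculpted corrective target and a jaw control rotating
# on Z; all names and the -30 degree "fully open" value are hypothetical).
import maya.cmds as cmds

def hook_up_jaw_corrective(head="icarus_head",
                           corrective="jawOpen_corrective",
                           jaw_ctrl="jaw_ctrl",
                           open_angle=-30.0):
    # Add the corrective on top of the existing skinCluster deformation
    # (where it sits in the deformation order depends on how it was sculpted).
    bs_node = cmds.blendShape(corrective, head, name="jaw_corrective_BS")[0]
    weight_attr = bs_node + "." + corrective       # weight alias = target name

    # Driven keys: weight 0 with the mouth closed, 1 when the jaw is fully open.
    cmds.setDrivenKeyframe(weight_attr, currentDriver=jaw_ctrl + ".rotateZ",
                           driverValue=0.0, value=0.0)
    cmds.setDrivenKeyframe(weight_attr, currentDriver=jaw_ctrl + ".rotateZ",
                           driverValue=open_angle, value=1.0)
    return bs_node
```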

DP: And how much research did you need to do to get it just right?
Team 400mph: For the rig: it’s hard to get specific information on a chimp’s muscular motions, so the research mostly involved studying the specifics of human anatomy and translating them to chimpanzee proportions. Drawing helped a lot in the process. For the animation, we watched a lot of documentaries and scientific videos, and even films like “Planet of the Apes”. We also shot a lot of reference in order to see exactly how we wanted Icarus to move and react.

DP: The cars in the beginning: Were they based on the salt-lake speed trials of the ’60s? If so, how did you put them together, and how did you make them look iconic rather than just a copy?
Team 400mph: The cars are actually based on real iconic models, which are, in order of appearance: Burt Munro’s Indian Scout, George Eyston’s Thunderbolt, Mickey Thompson’s Challenger 1 and Craig Breedlove’s Spirit of America. Our intention was to stay as close to reality as possible, but we also needed to adapt a little – for example, adapting the cockpits to the shape of a chimpanzee driver. We also chose camera angles that 3D techniques make possible but that would generally not be possible in reality. For example, the last car is more like a rocket than a car, so our originality lies in our camera choices.

DP: And what have you been working on since?
Team 400mph: We’ve all started working in different studios, some in Paris, some in Brussels and even some in Sydney.

After the social collapse: Making-of Tom Clancy’s The Division – https://digitalproduction.com/2019/07/15/nach-dem-gesellschaftlichen-kollaps-making-of-tom-clancys-the-division/ – Mon, 15 Jul 2019
the "Best Motion Design" category was added to the animago AWARD in 2016. At the time, it was important to the animago team to introduce a separate category for this, as motion design work differs from the other submissions in terms of a different storytelling approach and a special look. Last year, the motion design trailer for the second part of the game "Tom Clancy's The Division" won the coveted prize in this special category. The design studio Antibody, which created the trailer for Ubisoft, tells us what was important for the design.

The first part of the shooter “Tom Clancy’s The Division”, developed by Massive Entertainment and Red Storm Entertainment and published by Ubisoft, was released in 2016. In this game, an epidemic breaks out in New York City, the so-called dollar flu. Many people die from it, life in the city comes to a standstill after just a few days and chaos reigns. The Strategic Homeland Division, or the Division for short, is activated to deal with the situation. This secret special unit, consisting of sleeper agents, becomes active when all other state institutions and measures fail.

Part 2 begins seven months after the action of the first part: Despite the antidote found for the dollar flu, social order is still threatened with collapse. In Washington, D.C., numerous factions such as the Hyenas and the True Sons are fighting for power in the capital. Only the Division agents are well enough equipped to counter this threat and protect the remaining civilians.

Two trailers, two targets

Tom Clancy’s The Division 2 was released in March 2019 for Windows, Xbox One and PlayStation 4. Ubisoft had two trailers developed for the announcement of the game: Platige Image created the official cinematic trailer, and Antibody realised the “Washington D.C. Aftermath” trailer.

Mass panic designed with motion graphics: The various symbols represent police, civilians and aggressive rioters

The latter uses motion graphics to give viewers a wealth of information about the events that took place before the gameplay begins. The two trailers have different goals – Eric Moutardier, Brand Director at Massive Entertainment, explains them: “The goal of the film trailer is to create emotion by giving viewers an incentive to want to fight for our game heroes. In the trailer, we focused on the changed situation and conditions of the game world and placed its unique atmosphere at the centre. The motion design trailer, on the other hand, is embedded in the context of the game and conveys facts about Washington, D.C. It should also create a mysterious feeling around the Division story.” Creating an immersive gaming experience is hugely important to Ubisoft. To achieve this, it is crucial to take the logic of the game world seriously, and the information and statistics shown in the trailer transfer this logic to the players. It is also a narrative trademark of the Division games: Ubisoft already released a trailer of this kind for the first instalment.

Survival – a goal that cannot be left out when playing “The Division 2”

It all depends on the right artist

The design studio Antibody was responsible for the motion graphics trailer. The team worked on the realisation for a total of four months – from the first call about the job to delivery of the final film. The main software used was Cinema 4D for the animation and After Effects for the compositing. However, the team also built the 2D and 3D assets using other programs such as Maya, ZBrush and more. The crowd simulations were created with X-Particles, and the rendering was realised with Octane on a solid GPU rig.

Red is an important colour in the trailer. Even though the different shades of red are very similar, they create a different emotional response in the viewer

When it comes to the pipeline, Antibody does not focus on the technical tools. Instead, the team follows the maxim: the right artist for each element. Once that artist has been found, they can work with whatever tool they feel most comfortable with.

Lots of red and smooth transitions

The previs phase was very thorough: the artists spent several months working on the storyboards, which they drew entirely by hand. In these, the team defined the transitions and the tempo. “In this way, the ideas lead the technology and not the other way round,” says Patrick Clair, Creative Director at Antibody. Antibody then developed the design frames and the boardomatics. Finally came the CGI previs, in which the previs shots were progressively replaced by final shots.

The trailer is dominated by the colour orange, often combined with shades of pink and red

The reference for the look was the “Division” game itself. Clair and his team studied the rich colours of the game world and together repainted American icons such as Washington, D.C., the presidential statues, cemeteries, etc. “I’m obsessed with the colour red. That’s why there are many different shades of red in the trailer, which are technically very close to each other, but trigger completely different emotional reactions in the audience.” Because motion design films rely on excellent colour design to create an impressive whole, the team invested a lot of time in it: “In the design scheme, we let a complex orange dominate, which we combined with pink and red tones to give the orange and amber more depth and interest.” For the bespoke look, the team took normal and reflection maps and coloured them in a specific way. They also added a lot of simple colour mattes and used depth passes to create fog effects in some areas.
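
Antibody’s grading lives in its comp packages, but the depth-pass trick mentioned above is easy to illustrate in isolation: normalise the depth pass into a matte and mix a fog colour over the beauty. The sketch below uses plain NumPy purely as a stand-in for the actual After Effects setup; the colour and depth values are invented:

```python
# Schematic depth-fog mix (NumPy stand-in for a comp setup; not Antibody's
# actual After Effects graph). `beauty` is an HxWx3 float image, `depth` an
# HxW depth pass in scene units.
import numpy as np

def add_depth_fog(beauty, depth, fog_color=(1.0, 0.55, 0.2),
                  near=50.0, far=400.0, density=0.8):
    # Remap depth so `near` maps to 0 (no fog) and `far` to 1 (full fog).
    matte = np.clip((depth - near) / (far - near), 0.0, 1.0) * density
    matte = matte[..., None]                        # HxWx1 for broadcasting
    fog = np.asarray(fog_color, dtype=beauty.dtype)
    return beauty * (1.0 - matte) + fog * matte     # simple linear mix

# fogged = add_depth_fog(render_rgb, z_pass, fog_color=(1.0, 0.5, 0.25))
```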

Washington from an aerial view

Because the narrative pace of the trailer had to remain moderate and the communication clear, the script was the biggest challenge in the development process. To find the right pace, the team imagined the trailer as a conversation with the audience. “The pace of the narrative voice determines EVERYTHING. In this way, we engage the audience through the spoken language and the information on the screen becomes secondary,” says the Creative Director. The transitions between the various images and scenes in particular give the trailer its aesthetic signature. According to Patrick, this style emerged very naturally: “We stuck closely to the script and the ideas and developed the transitions so that they match the flow of the narrative and don’t hinder it. Because story is king!”

Simulated collapse

The scenes in the trailer are based on 3D objects with numerous 2D details. The team found the design of the collapsing city to be the most interesting part of the project: “We developed crowd behaviour that simulates how a riot could break out. The finished video doesn’t focus on this, but there are layers over the map of Washington, D.C. that show how fear spreads through a crowd,” explains Patrick.

Visualisation of the high death toll

The symbols represent police, civilians and aggressive rioters. In the simulated riot, a conflict breaks out in a certain zone, the police react, and the crowd around them panics. Some icons then “die” and turn into X symbols, indicating the civilian victims. “It was interesting to think about how to visualise the mass panic and violence that can occur during an epidemic in a crowded public space,” says the Creative Director.
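
The riot itself was simulated with X-Particles in Cinema 4D; as a purely illustrative stand-in, an agent-based version of the behaviour described above – panic starting in a conflict zone, spreading between neighbours and occasionally turning an icon into an “X” – might look like this (every state, radius and probability here is invented for the sketch):

```python
# Toy agent-based sketch of the described crowd behaviour (illustrative only;
# the trailer's version was built with X-Particles in Cinema 4D).
import random

CALM, PANICKED, DEAD = "calm", "panicked", "dead"

def dist(p, q):
    return ((p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2) ** 0.5

def step(agents, riot_zone, riot_radius=5.0, spread_radius=2.0,
         death_chance=0.02):
    """agents: list of dicts with a 'pos' (x, y) tuple and a 'state'."""
    for a in agents:
        if a["state"] != CALM:
            continue
        # Panic starts around the conflict zone and spreads between neighbours.
        near_riot = dist(a["pos"], riot_zone) < riot_radius
        near_panic = any(n["state"] == PANICKED and
                         dist(a["pos"], n["pos"]) < spread_radius
                         for n in agents)
        if near_riot or near_panic:
            a["state"] = PANICKED
    for a in agents:
        # A small fraction of panicked civilians become casualties ("X" icons).
        if a["state"] == PANICKED and random.random() < death_chance:
            a["state"] = DEAD
    return agents

# crowd = [{"pos": (random.uniform(0, 50), random.uniform(0, 50)), "state": CALM}
#          for _ in range(500)]
# for _ in range(100):
#     step(crowd, riot_zone=(25.0, 25.0))
```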

The motion design category will once again be part of this year’s animago AWARD. We will know who will receive the trophy on 2 November. That’s when the 2019 AWARDS ceremony will take place at the Alte Kongresshalle in Munich.

Shadow of the Tomb Raider – https://digitalproduction.com/2019/10/09/shadow-of-the-tomb-raider-retro-artikel/ – Wed, 09 Oct 2019
Blast from the past: Who is the most iconic video game heroine ever? Lara Croft, of course. In this interview, we take a deep dive into the lost city of Paititi – and the final installment of the reboot trilogy.

What is the first video game hero(ine) that comes to mind when you hear adventure? For us it is Lara Croft. The reboot has been doing very well – so say critics and gamers. The technical finesse is, even 18 months after its release, still the yardstick to measure all others by.

DP: Obviously the first question: How much fun is it to work on one of the major icons of gaming history: Lara Croft?

Jonathan Dahan (Producer): It was more of a humbling experience I’d say, like a dream come true that also comes with homework. We were given the mandate of completing the origin trilogy of one of the most recognizable protagonists/franchises in the industry. That’s a tall order!

We knew we couldn’t mess it up, so we took it very seriously and tried to make the best game we could. The fun comes more in the day-to-day, I think. Everyone on the team is very passionate about their work (I mean, making video games – living the dream) and the Tomb Raider franchise. When we’re able to actually play and try out a new feature for the first time, see the first storyboard of an intense cinematic moment, or see Paititi running on an artist’s computer for the first time, it always makes us proud and motivated to keep doing better. That love of games translated well into the story, its realization and the gameplay.

DP: With the incredible amount of assets, styles, FX and characters: How long did you work on that?

Frédéric Chappart (Art Technical Director): The origin story was always planned as a trilogy, so we knew that a third game was going to be made after “Rise of the Tomb Raider”. The conception of “Shadow of the Tomb Raider” started before the end of production on “Rise of the Tomb Raider”: the creative teams at Crystal Dynamics and Eidos-Montréal began brainstorming on what we could bring to the game and how to end this origin story. It took approximately three years from the start to the release of “Shadow of the Tomb Raider”, and you can add several more months of development for the seven post-launch DLC releases.

DP: Since its release in 2018, what did you update?

Michel Leduc St-Arnaud (Lead Game Designer): The main updates came in the form of our seven DLCs, each of which added a new narrative side mission, a new challenge tomb, additional outfits and weapons, and co-op play functionality. In total, the DLCs added nearly 10 hours of gameplay – and probably more, given the replayability that co-op and our Score Attack and Time Attack modes delivered. Keeping part of the development team active after launch also allowed us to tweak the game balancing and continue polishing the main experience. It also allowed us to fix bugs. Unfortunately, there are always a few issues that slip through our quality assurance checks and can only be found once millions of players experience the game in their own way. That said, we are always listening to our community and try to fix things as fast as possible.

DP: Considering the flawless mechanics and execution: What was the groundwork and engine for “Shadow of the Tomb Raider”?

Jonathan Dahan: Several updates were implemented to our tech in order to deliver “Shadow of the Tomb Raider”. The game was developed on the Foundation engine (the internal engine created by Crystal Dynamics), which also powered “Tomb Raider 2013” and “Rise of the Tomb Raider”. With “Shadow of the Tomb Raider”, we wanted to push some aspects even further, with two big areas of focus being our lighting and vegetation systems. We wanted the jungle to feel as realistic as possible: dense and oppressive, full of life but also full of dangers. We invested a lot into making our vegetation look as good as possible and feel part of the world. That meant that they needed to properly react to the wind, Lara, other NPCs, and animals.

DP: A big part of the charm of the game is the interaction design. How did you get it to be as understandable as it is?

Frédéric Chappart: The main objective is to make the interaction as natural as possible for the player. We want the system to behave the way the player expects without prior learning. Once that is achieved, we want to find the balance between reactivity and visual quality to avoid frustration through repetition. QA will organically test interactions and spot the biggest visual quality offenders. The way we make sure quality is at its highest is by including QA in the process and not as a last check. Every iteration is tested to find weaknesses and areas of improvement in order to execute the necessary fixes along the way.

DP: Towards the final look: Could you talk about your rendering and finishing?

Michel Leduc St-Arnaud: We redid our lighting pipeline completely in order to support the highest HDR quality possible, which meant reworking all of our post-processes, such as color grading, to reach a high HDR range. We reworked our material pipeline to support the latest shading models, and implemented a new way of rendering water volumes with caustics, volumetric lighting and volumetric fog, all in real time for maximum realism. We also added subsurface scattering of light in vegetation as well as in character skin. Finally, we worked with Nvidia on real-time ray-traced shadows, which allowed us to support soft shadows, contact shadows, and omnidirectional shadows on the latest RTX video cards.

DP: And what have you been working on since?

Jonathan Dahan: Parts of the team have moved on to work on other projects in the studio including Marvel’s “Avengers” in collaboration with Crystal Dynamics.

FMX 2019: Real-time ray tracing for final frame rendering – https://digitalproduction.com/2019/05/15/fmx-2019-echtzeit-raytracing-fuer-final-frame-rendering/ – Wed, 15 May 2019
It's Wednesday afternoon in the König Karl Halle in the Haus der Wirtschaft. The hall is darkened and filled with people eagerly awaiting the lecture "Troll: Real Time Raytracing and The VFX Pipeline" - and the DP is there too.

The first speaker is Anton Palmqvist, Head of Real-Time at Goodbye Kansas. He starts the talk with the project “Troll”, which is presented as a cinematic trailer on the one hand and serves as a tech demo for the real-time ray-tracing features of the Unreal Engine on the other. The half-hour presentation focuses on the use of real-time ray tracing in a modern VFX pipeline and on how game tech can be combined with a classic VFX pipeline. Palmqvist also answered DP’s questions in an interview, going into further detail as well as into possible future prospects for the technology, which is still in its infancy.

The demo project

As you can see from the pictures in the article, it revolves around a young lady called Bianca, who is sitting with a crown in the middle of a forest by a small lake. Frogs croak and the wind rustles the leaves. Three fairies appear below the surface of the water. They circle at first, glow, light up the area around them and rise out of the water. They dance, play, and fly around the crown. Soon muffled noises are heard and leaves rustle. The camera jumps back into the forest and a dark silhouette rises. Startled, the fairies disappear, Bianca’s face fills with displeasure, darkness sets in, the crown falls into the princess’s hands and her gaze is directed towards the dark forest and the silhouette – fade to black. The audience applauds.

The underlying story was directed by Björne Larson, Head of Studio at Deep Forest Films in Los Angeles, with a musical score by Academy Award winner Ludwig Göransson and Joseph Shirley. To ensure that Princess Bianca’s emotions are as convincing as her movements, 3Lateral was involved in the project. 3Lateral is now part of Epic Games and specialises in the natural and realistic representation of digital creatures.

Game tech in VFX

Just under a minute and a half of film footage shows the latest real-time ray tracing features of Unreal Engine version 4.22 in action, and the quality is very convincing. The differences to final frame rendering with sophisticated path tracers are recognisable to professionals, but from the perspective of real-time computer graphics, the trailer was a visual treat and definitely forward-looking. Especially because the trailer was played at a constant 24 fps with an Nvidia RTX 2080Ti. The highlight is when Palmqvist pauses the trailer and zooms in on the respective areas to show the features.

“Troll” is a joint project between Goodbye Kansas, Epic Games – including 3Lateral – and Deep Forest Films. The VFX studio Goodbye Kansas, based in Stockholm, is known for creating high-quality game trailers, films, TV shows and adverts. Palmqvist says that Goodbye Kansas covers both areas, games and film. The studio has an offshoot in Los Angeles that is dedicated purely to real-time work. As a result, it has experience in creating products in both areas, and it was an obvious step to examine real-time ray tracing more closely for production suitability.

As Palmqvist is at home in the games field and had a great deal of prior knowledge of the Unreal Engine, he was a natural fit for the role of Head of Real-Time, which he took up in April 2018. He worked on “Star Wars Battlefront” and contributed his skills as a 3D and lighting artist to “Mirror’s Edge Catalyst”. The “Troll” project was intended to explore the potential of the Unreal Engine and compare it with the classic VFX pipeline in order to identify possible intersections. According to Palmqvist, a general review was planned, and the real-time ray-tracing features were the cherry on top.

Game-tech workflows, as we all know, differ from offline rendering workflows. In order to create an intersection between the two worlds, attention was focused on the real-time ray-tracing features as early as the preview version of Unreal Engine 4.22. According to Palmqvist, one major advantage is the support for real-time reflections. In game tech there are ways to achieve visually appealing reflections one way or another, usually using special tricks or hacks, and there are common features such as planar reflections and screen-space reflections. However, the latter creates a certain type of artefact that detracts from the overall picture – especially here, where the large water surface reflects the forest and all the surrounding objects, Bianca with her crown and the fairies. The same applies to the crown – an asset predestined for real-time reflection.

Especially in relation to the water in the trailer, it was also a relief that real-time translucency is supported. This gave the shot in which the fairies move in a circle beneath the surface of the water in front of Bianca a whole new dynamic. Translucent objects such as water or glass surfaces simply behave more realistically. In conjunction with the lighting features that are also supported, working methods in game tech are slowly converging with the offline rendering VFX pipeline.

We are talking about area lights, which can be used in the Unreal Engine to bring more dynamics into the scene lighting and create soft shadows. To back up the example with visual material, Palmqvist repeatedly shows certain excerpts from the trailer. He presses the pause button, so to speak, and zooms in on certain parts, such as Bianca’s face, which is shown in close-up while the area light moves. The real-time reflection is shown by zooming in on the crown, in which the silhouettes of the glowing fairies can be seen.
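
For orientation, the ray-tracing features discussed so far map onto publicly documented Unreal console variables; a minimal editor-Python sketch for toggling them could look like the following. This assumes an RTX-capable 4.22-class build with the Python plug-in enabled and is not a description of Goodbye Kansas’s actual setup:

```python
# Illustrative only: switching on Unreal's documented ray-tracing cvars from
# the editor Python plug-in (assumes ray tracing is enabled in the project
# settings; not Goodbye Kansas's actual tooling).
import unreal

RT_CVARS = (
    "r.RayTracing.Reflections 1",    # mirror-like water and crown reflections
    "r.RayTracing.Translucency 1",   # fairies seen through the water surface
    "r.RayTracing.Shadows 1",        # soft shadows from area lights
)

def enable_raytracing_features():
    world = unreal.EditorLevelLibrary.get_editor_world()
    for cvar in RT_CVARS:
        unreal.SystemLibrary.execute_console_command(world, cvar)
```

Real-time global illumination, discussed in the next section, has its own cvar of the same family.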

Global Illumination

In addition, real-time global illumination could be used for the lighting, which offers artists a completely new degree of freedom. Palmqvist described a standard working method from the game-tech sector, in which UV channels have to be created for all environment objects as part of the light-baking process, with sensible UV layouts so that each light map gets the most economical size during baking. Then the light baking itself begins, which takes time and, depending on the resolution, approaches offline rendering times. With real-time global illumination these work steps are virtually eliminated, creating new freedom. He added that there may be areas that do not benefit from real-time ray tracing and that can still be given a light-map channel with the Unreal Engine’s semi-automatic tools for light baking, so as not to waste samples unnecessarily. Nevertheless, the real-time ray-tracing features in the area of lighting are another reason to combine game tech with the classic VFX pipeline. After all, these features allow artists to adjust atmospheres in real time without having to worry about asset properties and re-baking. The DP editorial team was curious and questioned the real need for real-time ray tracing.

Palmqvist grinned and said quite frankly that the real-time teams at Goodbye Kansas in Stockholm and Los Angeles would have been able to create visually similar effects, but only with the aforementioned tricks and hacks, and with a significantly higher time investment compared to the real-time ray-tracing features of Unreal Engine 4.22. And not only would the time expenditure be a problem, but also the abundance of interfaces that would still have to be implemented to dock the classic VFX pipeline onto the game tech.

Anton Palmqvist (Goodbye Kansas) during his talk at FMX

Palmqvist also briefly touched on the Unreal Engine’s Alembic support and praised the feature, as Goodbye Kansas, an established VFX studio, uses the Alembic format as the exchange format for all assets. In the production of the “Troll” trailer, for example, Princess Bianca’s dress, which has around 100,000 polygons, was simulated in Houdini and transferred directly to the Unreal Engine via an Alembic cache, which was a great help. Support for industry data formats is essential.
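
The article only states that the cache was brought into the engine directly; as a rough illustration of what a scripted Alembic import can look like through Unreal’s editor Python API, consider the following sketch. The file and content paths are placeholders:

```python
# Rough sketch of a scripted Alembic import through Unreal's editor Python API
# (file path and destination are placeholders; the article only says the
# Houdini cloth cache was brought into the engine directly).
import unreal

def import_alembic(abc_file, destination="/Game/Troll/Characters/Bianca"):
    task = unreal.AssetImportTask()
    task.filename = abc_file            # e.g. the cache exported from Houdini
    task.destination_path = destination
    task.automated = True               # suppress the interactive import dialog
    task.save = True
    unreal.AssetToolsHelpers.get_asset_tools().import_asset_tasks([task])
    return task.imported_object_paths

# import_alembic("D:/caches/bianca_dress.abc")
```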

In order to bridge the gap between game tech and the classic VFX pipeline, additional tools for artists had to be created in the Unreal Engine. The first was an uber-material: a large shader builder that allows artists to easily create a variety of different surfaces without any technical fiddling. Palmqvist explains that it is important to keep the material system as close to the classic VFX pipeline as possible in order to make it easier for new artists to work with game tech and to flatten learning curves. The teams therefore attached great importance to having UDIM support available so that assets from production can be imported directly into the Unreal Engine.

To make lighting and look development more standardised and intuitive, a scene was developed for Look Dev that artists can use to test their assets. A major advantage is that lighting and material adjustments can be made live. In addition, a build of the scene can be generated and exchanged with the customer. This allows the customer to interactively inspect the assets and test different lighting conditions and surfaces.

Is Game Tech with real-time ray tracing capabilities ready for use in a classic VFX pipeline?

Palmqvist assessed the new possibilities positively and spoke of a rethink in production planning: “The classic VFX pipeline model naturally has iteration cycles at one point or another, but under the bonnet it is a linear production model – even if modern renderers are equipped with interactive rendering. Modern denoisers are usually added to approximate a specific pixel colour. When modern game tech is integrated into VFX pipelines, the linear production model is replaced by one that works in parallel.” As an example, he pointed out that adjustments to the light setup can be made very early in production, while fine adjustments, nuances and intricate lighting details can be worked out until shortly before the end of production. This results in a new dynamic in production, and the customer can be involved right from the initial look dev.

Conclusion

We still need to investigate where the limitations of the new features lie. The “Troll” tech demo ran at 24 fps on an RTX 2080 Ti and there were no performance lags that could have jeopardised the project. It also remains to be seen how studios can develop USPs using game tech. If a customer wants a project for which classic VFX with offline rendering is too expensive, it can be realised with game tech instead. However, the advantages of asset exchange within the studio also need to be analysed more closely. It may be necessary to develop an overarching pipeline that allows a raw data set to be routed into offline or real-time rendering in just a few steps. A similar situation already exists with the character pipeline at Goodbye Kansas. The full potential still needs to be uncovered through numerous tests, trials and further projects.
