Virtual production with just one LED screen?
https://digitalproduction.com/2024/09/03/virtual-production-with-just-one-led-screen/ – DIGITAL PRODUCTION, 3 September 2024
Let’s be honest: large virtual production stages are cool, but they’re also damn unwieldy. Do you really need that many screens?

They have to be big enough – but if we build displays into a classic backdrop, we might get the best of both worlds. Samsung’s LED display “The Wall” already covers a huge area – and it was already used in the film “Die Traumnovelle” (is.gd/traumnovelle_imdb)! That’s why we spoke to two people who did just that: Florian Frerichs and Sven Lehmann.

Florian Frerichs (is.gd/frerichs) founded the production company Warnuts Entertainment in 2008 and has since been involved as a producer in numerous short films, music videos and commercials. He was production manager on the thriller “Gefällt Mir” (2014). As a producer and director, Frerichs shot “Alex” (2013), a fake trailer for a kind of remake of Stanley Kubrick’s “A Clockwork Orange”. His dystopian science fiction short film “Phoenix” (2014) and the horror short film “In the Ruins” (2016) were screened at several international genre festivals. Frerichs tackled a completely different subject in his feature film debut “The Last Supper” (2018), for which he also wrote the screenplay together with Stephan Warnatsch. Sven Lehmann (is.gd/sven_lehmann), a Summer of Unreal graduate, was the VFX supervisor and was responsible for setting up the screens and for the communication with the Unreal Engine.

DP: Hello! “Virtual production” usually means huge LED volumes and caves, doesn’t it – what did you do there?

Florian Frerichs, director: Die Traumnovelle is a modern adaptation of Arthur Schnitzler’s novel of the same name and is set in Berlin. We are a small, close-knit team of film and technology enthusiasts and have been making films together for years. We have always been interested in technical innovations and achievements. In fact, I can say that we were the first to use one of the first RED cameras in Europe beyond mere testing in 2007. We have also been working on the topic of virtual production for some time. Then, over the course of the year, we familiarised ourselves with Samsung’s solutions – which came at just the right time.

Sven Lehmann, VFX: We shot one scene of the film using three Samsung micro-LED walls called The Wall. It was a suburban railway scene in which the protagonist gets on, gazes dreamily outside and meets another person. The opportunities to film in the real S-Bahn are limited: unrestricted filming licences are difficult to obtain, disruptions to ongoing passenger services are inevitable, and re-shoots are very time-consuming. That’s why we came up with the idea of using virtual production to create the scene’s background and lighting. So we shot at the S-Bahn dummy in Babelsberg, played footage of an S-Bahn journey on the LED walls and positioned them behind the windows of the train.

DP: And why did you use the Samsung walls for this?

Sven Lehmann: We were already in contact with Samsung beforehand, as we had already experimented with The Premiere short-throw projector. When the film project came up, we found out more about The Wall and realised that it was ideal for our plans. The decisive factor was the image quality with a low pixel pitch, which is essential for shooting details. The displays can also run special studio frame rates, which is important for video synchronisation. Samsung’s LED walls are also easy to operate – basically like a normal Samsung TV; the user interface is pretty much the same. That makes them much easier to use on set.

DP: How many “The Walls” did you use for the set?

Sven Lehmann: We used three different displays. The decisive factor was that the panels work straight out of the box. That means: unpack, connect and you’re ready to go. The scope of delivery also included the corresponding frames on which the screens were installed. This allowed us to roll them and move them up and down electrically. We were therefore able to use them very flexibly and place them wherever they were needed.

DP: How did the shoot go? How did you set up, position and control the screens?

Sven Lehmann: The challenge when setting up the screens was that we had to position two of the three displays on the platform of the S-Bahn dummy, and there was very little space around it. We were able to solve this with millimetre precision using an assembly lift. The aforementioned packaging system for the walls helped us a lot when moving them: raised briefly, they could be rolled, put down again in a different position and then adjusted to the correct height. In terms of realism, the scene was very forgiving, as it was rather dark.

DP: The scene itself is “theoretically” simple enough – did you have to follow the camera?

Florian Frerichs: The scene is a relatively calm shot. Accordingly, the camera work was comparatively static, without major pans or fast movements.

DP: After the first take, did you make any adjustments?

Sven Lehmann: The size of the shots had to be in the right proportion to the set and the actors. You have to align it precisely so that the overall picture is harmonious and realistic.

Florian Frerichs: We also readjusted the brightness. The Samsung panels are very bright, so they quickly became brighter than the real environment.

DP: And during the lunch break you turned the screens into a staff cinema?

Florian Frerichs: A display like The Wall naturally invites you to experiment – in many ways. On the first day, I even had a Playstation 5 with me in the car so I could utilise its gaming power. But I only had four controllers – that would have been unfair to the rest of the team (laughs). In fact, I would like to continue using the screens in the future and integrate them directly into studio buildings, for example as windows. That would make us even more flexible when working in the studio.

Sven Lehmann: My personal impression of The Wall was very positive. In the size and format in which we used it, it was as easy to handle as a “giant television”. For low-budget shoots where the camera doesn’t move much, you can do a lot with it. For example, the setup makes it easier to reshoot if a set is forgotten. In principle, The Wall allows even more, as the specifications can be customised so that ceiling installations or concave designs are also possible. With a fully equipped stage, even more is possible.

DP: If you were to shoot this scene again, what would you do differently? More or fewer screens?

Sven Lehmann: If I could, I would install a huge LED wall on both sides that covers everything, similar to the film “Bullet Train”. Samsung also offers options for this with The Wall. More displays with the same features and sizes as the ones we used would of course have made things easier, but the scene was too small for that. So we planned with a manageable amount of effort, and that worked really well for our case. Here and there we could have avoided small software problems by testing beforehand, but that was manageable.

Florian Frerichs: Our virtual production specialists Sven Lehmann and Victor Manske did a great job and were very committed to realising the scene in this style. In the end, the work paid off: you rarely get to see shots of a completely empty train. Viewers may not consciously notice it, but they will feel it and come away with the sense that they have seen something special.

DP: And when it’s all over, what’s the next project – if you can talk about it?

Florian Frerichs: I’ve been dreaming of making a science fiction film for some time now. And I think that Samsung’s virtual production technology would be perfect for this – especially as “The Wall for Virtual Production”, a model specially tailored to the requirements of the film industry, was recently introduced.

VERTAGT – VFX for a dystopian satire
https://digitalproduction.com/2022/12/29/vertagt-vfx-for-a-dystopian-satire/ – DIGITAL PRODUCTION, 29 December 2022
What does the cultural programme actually look like in a dystopia? The short film Vertagt depicts the end-time scenario of a humanity that has not achieved its climate goals. What could our future look like – if human life still exists? And above all: how will we look back on our mistakes of today? In Vertagt (“Adjourned”), our contemporary climate summits are already a historical relic, a classic theatre play from the past. Having failed to fulfil their actual task, they now serve only to amuse a few survivors. In the dystopia, our present-day climate theatre becomes a perfect source of satire. This is how we realised it in our second film at the HFF Munich under the supervision of Prof. Jürgen Schopper.

In the second year of our training, as visual effects students at HFF Munich, we had the opportunity for the first time to develop a short film idea together with other departments at the university, such as production, camera and screenplay, which had to be realised with the help of visual effects. The advantage of this co-operation was that the effort and feasibility of potential visual effects shots could already be planned down to the smallest detail during script development. We tried to work as closely as possible to the principle of virtual production, for which close co-operation with the other departments was a prerequisite. This collaboration was extended by VFX industry professionals, who supported us again and again during the course of the production. Petra Hereth was responsible for team coordination.

by Christian Geßner, Nicolas Schwarz, Chris Kühn,
Malte Pell, Tobias Sodeikat and Jonas Potthoff

From the first script version in October 2021 to the start of shooting in April 2022, we were involved in the preparation of the film in many ways: in parallel with the script development, we created the first concept art, which we used to design the look for our world together with author Larissa Dold and director Matthias Zentner.
At the same time, from the start of the script work we constantly updated the VFX estimate with the latest findings from the joint meetings in order to guarantee the feasibility of the visual effects. In this way, we were able to ensure during script development that the VFX effort stayed within the time frame available to us. It also gave us the opportunity to prioritise during development which parts of the story would benefit from visual effects and where they were not absolutely necessary. We accompanied this process with storyboarding and concept art, supported by Luis Guggenberger. With the help of 3D mentor Berter Orpak, we created a 3D pre-visualisation in a very short time, which not only showed roughly blocked scenes, but already included tracking shots, animated characters and a rough sound layout.

Production: University of Television and Film Munich 
Duration: 07:51 min (25fps)
Resolution: 2K (Aspect Ratio: 2.39:1)
Render Engine: Redshift, Mantra
Texturing Software: Substance Painter
3D Software: Blender, Houdini
Compositing Software: Nuke

Once the script was largely finalised in February 2022, we were able to focus on the actual filming preparations. As we already had a very concrete idea of the shots to be filmed based on the 3D previs and the entire shoot was also very VFX-heavy, the shooting schedule was developed directly in the VFX department in close consultation with the producers Luisa Eichler and Michaela Mederer. In various production meetings and shooting schedule discussions, we also planned the concrete realisation of each shot with the other trades – for example, we were able to determine exactly where the real studio set would merge into the digital world with the art department under the direction of Katja Severin and how we could make these transitions as unobtrusive as possible. It also took some coordination with the costume (costume designer: Katharina Ost) and make-up (key make-up: Sabeth Kelwing) departments in order to be able to shoot VFX-compatible costumes and SFX make-up, such as burning hands, as sensibly as possible.

During filming (camera: HFF graduate Teresa Renn), we as VFX students were represented in various positions on set, with the role of visual effects supervisor rotating daily so that every student had the opportunity to experience this function for themselves. We were supported by professionals from the industry (Jan Stoltz and Dietrich Hasse). But we were also kept very busy outside of this position: both the set and the actors had to be scanned, and data also had to be collected from the set after each take.

After the end of the shooting week, it only took another week until we could pull the first VFX shots from a rough cut (editing: Michael Dervenski & Matthias Zentner) and start working on them. To do this, we conformed the latest cut version in DaVinci Resolve with the online material and exported the shot plates and references as 16-bit linear EXR sequences.

Order in the chaos of assets – the modelling

Hopefully it will still be a few years before it looks like the Vertagt universe here in Munich. As the only thing that already existed during the production period was an empty stage, the world around it had to be created from scratch in the computer. We used a combination of specially created, purchased and scanned assets.
This resulted in a total of over 50 different models, some in up to six different versions. The largest of these was a broken container ship. We reworked the purchased 3D models with our own textures for our film in order to adjust the level of detail for the project and the individual shots. The realistic replica of a whale skeleton, shipping containers and oil drums are some of the many assets that were added to the interior and exterior of the wreck.

Image references of possible shipwrecks, damaged cars, aeroplanes, harbours, pipeline systems and materials affected by sand and weather formed the basis of our research. From the references, summarised in mood boards, we were able to pick out new ideas throughout the project and recreate them in detail. The interior of the shipwreck was continuously filled with first large, then smaller and smaller objects.

The 3D models were modelled and “UV unwrapped” in Blender. In order to cope with the large number of differently coloured and dented containers, we created various textures and models, which were assigned procedurally in Houdini. To ensure that accessories such as cables, ladders and rubbish could still be placed on the containers and that the AO map could be calculated correctly in Substance Painter, the scene was initially blocked with cuboids.
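The deterministic core of such a procedural assignment can be illustrated outside Houdini. The following is a hypothetical sketch (the variant counts, the id scheme and the function names are illustrative assumptions, not taken from the production setup): each container id is hashed to a stable model/texture variant pair, so the scene rebuilds identically on every cook.

```python
import hashlib

# Assumed, illustrative variant counts – the production used "various
# textures and models", the exact numbers are not documented.
MODEL_VARIANTS = 6      # dent/shape variations
TEXTURE_VARIANTS = 8    # colour/rust textures

def assign_variant(container_id: str) -> tuple[int, int]:
    """Map a container id to a stable (model, texture) variant pair.

    Hashing the id (rather than calling a random generator) makes the
    assignment deterministic: the same container always gets the same
    look, no matter in which order the scene is rebuilt.
    """
    digest = hashlib.md5(container_id.encode()).digest()
    return digest[0] % MODEL_VARIANTS, digest[1] % TEXTURE_VARIANTS

# Same id always yields the same pair, so re-cooking is reproducible.
print(assign_variant("container_042"))
```

In Houdini the same idea would typically live in a wrangle or Python SOP keyed on a point or primitive attribute.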

In addition, we modelled a second interior, which is revealed to us in the long return journey as the ruins of an engine room, gradually adding information about the location and the dystopian world. The centrepiece is an old, oversized ship’s engine. The difficulty here was to combine the realistic depiction of the working environment in an engine room with as open an environment as possible to show the size of the wreck. Some elements run through both areas of the ship in order to connect them as seamlessly as possible. Both the rust shader and the cables hanging from the ceiling serve as a link here.
“Among other things, a model looks realistic when the eye and brain cannot grasp all the elements at once, but discover new things bit by bit”. To follow this advice from Dirk Mauche (our active support in the modelling area), a tangle of cables, chains and rope ladders provides the necessary details (and disorder) in the picture.

A crawler learns to fly – rigging & animation

The main CG character, a cockroach called “Despair”, was delivered with various animation cycles and a rig, but first had to be fed into the pipeline. To do this, we created a motion cycle system in Houdini that could be used to efficiently fade between the various animation cycles. With the help of several controllers, it was possible to run the CG cockroach along a spline curve at different speeds. It only became problematic when more complex movements were required. A rotation around its own axis, when the cockroach has to perform a somersault to escape from a tricky situation, became an endless search for the right pivot point. The problem was solved using the good old “brute force” method: a second locator served as a new anchor. Despite using ready-made animation presets, we were once again lucky enough to be supported by Prof. Melanie Beisswenger as an animation lecturer. We learnt that even the smallest changes in the running speed of a cockroach can make a big difference to how it affects the viewer.
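The idea behind such a motion cycle system – advancing a cycle phase by the current speed and crossfading between cycles – can be sketched in a few lines. This is a minimal illustration, not the actual Houdini setup; the pose is reduced to a single float, and all names and numbers are assumptions:

```python
# Hypothetical sketch of a motion cycle system: the cycle phase is
# advanced by the current speed each frame, and a blend weight
# crossfades between two animation cycles.

def advance_phase(phase: float, speed: float, cycle_length: float) -> float:
    """Advance the cycle phase by 'speed' frames, wrapping at cycle end."""
    return (phase + speed) % cycle_length

def blend(pose_a: float, pose_b: float, weight: float) -> float:
    """Linear crossfade between two cycles; 0 = all A, 1 = all B.

    A real rig blends whole transforms per joint; a single float stands
    in for the pose here.
    """
    return pose_a * (1.0 - weight) + pose_b * weight

phase = 0.0
for frame in range(5):
    phase = advance_phase(phase, speed=1.5, cycle_length=24.0)
print(phase)  # 7.5 – five frames at 1.5x speed into a 24-frame cycle
```

Driving the phase by speed (instead of by the frame number) is what lets the cockroach slow down or accelerate along the spline without the footfalls sliding.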

Continuity of light and shadow

We realised early on that we needed low-key lighting on set. The only source of light was to be the extremely bright light beams that would shine through small holes in the outer wall of our shipwreck, but from whose scorching heat the ship’s inhabitants were to hide. The darkness surrounding them was only to be discreetly brightened up by their bounce light. However, as is so often the case, pre-production showed that reality had other plans: we couldn’t completely do without a few bright accents in the form of practical lamps on set in order to capture the mood and create enough detail in the blacks for the camera. This resulted in an interplay of light beams, which have a daylight and therefore cool colour temperature, and practicals, which emit artificial light and therefore have a warmer colour temperature.

In order to get as close as possible to the properties of real sunlight in our studio environment, our head lighting technician Torsten Baier organised a large PAR spotlight, which was directed where it was needed using mirrors and whose light cone was suitably shaped. Thanks to the parallel beam path of this spotlight, it hardly loses any brightness over distance and thus corresponds to our perception of sunlight. A lot of artificial fog on set also ensured that the beam of light was visible in the air and that our surroundings looked appropriately dusty. Now it was time to digitally match the lighting from the set.

The long shots were particularly challenging, in which we had to extend our shot set outwards in order to convey the interior of the shipwreck credibly. The transition between the filmed stage and the digital set was concealed by 3D-scanned scrap parts and models, but the lighting mood had to match exactly for both. The light beams also cross the “boundary” between the filmed and digital plate, which is why they had to be partially supplemented, but also completely retouched and digitally replaced. The aim was to recreate the correct angle, diameter, intensity and colour temperature of the real light beam. This was done in Houdini and then rendered with Redshift. In compositing, we added further details such as animated fog and particles floating in the air to the light beams, further blurring the line between real and computer-generated footage. We received a lot of support from our lecturer Frank Dürschinger, who provided us with the final food for thought to create a realistic image.

Other shots had a smaller CG component, but this had to be all the more seamless. As a completely digital character, our cockroach needed lighting that was as true to the original as possible, including refraction, reflection and shadows, in order to fit exactly onto the plate being shot and integrate itself into the real world. And that in every setting size – whether it appears large in the picture or only takes up a few pixels. To achieve this, we used a 360° camera to take exposure series of the respective location and its lighting during the shoot and then stitched them together to create HDRIs. The HDRIs served as the basis for the lighting of the respective 3D scenes, but were often supported by additional spot and area lights. Finding the right balance between too much and too little additional lighting was a challenging fine-tuning task, but it was also a lot of fun.
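The core of merging an exposure series into an HDRI can be shown with a single radiance estimate. This is a hedged sketch of the standard idea (the actual stitching was done with dedicated tools): each bracket contributes pixel value divided by exposure time, weighted by a hat function that distrusts clipped or near-black values.

```python
# Illustrative HDR merge for one scene point – the bracket values and
# exposure times below are made up for the example.

def hat_weight(v: float) -> float:
    """Triangle weight: trust mid-tones, distrust near-black/near-white."""
    return max(0.0, 1.0 - abs(2.0 * v - 1.0))

def merge_brackets(samples: list[tuple[float, float]]) -> float:
    """samples: (pixel_value 0..1, exposure_time_s) per bracket.

    pixel/exposure estimates scene radiance; the weighted average
    suppresses brackets where the sensor clipped.
    """
    num = sum(hat_weight(v) * v / t for v, t in samples)
    den = sum(hat_weight(v) for v, _ in samples)
    return num / den if den else 0.0

# Brackets at 1/1000 s, 1/250 s and 1/100 s of a point whose true
# radiance is 100: the clipped third bracket gets zero weight.
radiance = merge_brackets([(0.1, 0.001), (0.4, 0.004), (1.0, 0.01)])
print(round(radiance, 1))  # 100.0
```

The same weighting logic, applied per pixel across the stitched panorama, is what recovers the huge dynamic range a single exposure cannot hold.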

FX simulations: Sand and bone dust

The requirements for the FX simulations ranged from simple setups like the main title crumbling into dust and a cockroach kicking up sand, to big challenges like the skull shattering on the ground and a photorealistic desert that had to work in both close-up shots and a super long shot. We were once again lucky enough to have Felix Hörlein on board as a lecturer for our second project at the HFF and were therefore able to draw from the full range of possibilities.

In the final shot of the film, we establish a vast, open desert landscape swept by sand and wind. The biggest challenge in realising this shot was the enormous size of the terrain. With an area of three square kilometres, a simulation approach was needed that could use different resolutions for different areas. Thanks to the help of Felix Hörlein, an LOD system was created based on the distance to the camera. In the first step, a point cloud was generated at three different points in time using Mantra’s ray tracer, which subsequently produced a representation of the entire visible geometry.

The LOD system

In combination with the LOD system, this resulted in a significantly reduced footprint that could be used to generate the volume. All parameters that have an influence on the volume must also be controlled globally in order to ensure that the edges of the individual LOD areas blend into each other as much as possible. To save computing power, time and storage space, the simulations on the individual wedges only start shortly before they are visible in the camera. In addition to the volumes, we used a particle simulation in the foremost areas of the field of view to get as close as possible to the “sandstorm” target. This was controlled by the Vel field from the volume simulation.
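The distance-based LOD idea can be sketched as a simple mapping from camera distance to voxel size. The parameters below (base voxel size, step distance, doubling rule) are hypothetical placeholders, not the values used in the production:

```python
# Sketch of a distance-based LOD rule: the further a simulation wedge is
# from the camera, the coarser its voxels, so far-field areas cost only
# a fraction of the near-field resolution.

def voxel_size(distance_m: float, base_size: float = 0.02,
               lod_step: float = 50.0) -> float:
    """Double the voxel size every 'lod_step' metres from the camera.

    base_size and lod_step are assumed example values; a real setup
    would also blend parameters at the LOD seams, as described above.
    """
    level = int(distance_m // lod_step)
    return base_size * (2 ** level)

for d in (10, 120, 400):
    print(d, voxel_size(d))
```

Because simulation cost grows with the cube of the voxel count per axis, even one doubling step cuts a wedge's cost by roughly a factor of eight, which is what makes a three-square-kilometre volume tractable at all.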

Despite all the cost-saving measures, one workstation per version took an average of around six days to simulate all the individual parts, generating a cache size of 5 TB. This is largely due – at 1250 frames – to the length of the shot. Mantra became our render engine for all simulations. Rendering was always done with mattes, IDs and cryptomattes to give the compositing specific access to the individual simulations.
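A quick back-of-the-envelope check of the quoted numbers: 5 TB of cache spread over the 1250 frames of the shot works out to roughly 4 GB per frame.

```python
# Sanity check of the cache figures quoted above (5 TB, 1250 frames).
total_cache_tb = 5
frames = 1250

gb_per_frame = total_cache_tb * 1024 / frames
print(round(gb_per_frame, 1))  # 4.1 GB of volume/particle data per frame
```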

The skull crashing to the ground and disintegrating is the dramaturgical highlight of the film and demanded a lot from us due to the size of the shot. The basis of the effect was an RBD simulation with custom constraints for the three areas “skull”, “jaw” and “teeth”. To make the breaking more interesting, the option “switch constraints when broken” was also active for a selected area that contains the main features of the skull. Using a second constraint type, the selected geometry is held together for longer before it finally breaks. The skull model was purchased and prepared for the simulation. Three separate RBD fracture nodes were used to divide it into fragments in order to get as close as possible to the real properties of the different bone thicknesses.

To save memory space and optimise performance, the RBD simulation was only cached as points. This allowed us to remesh the geometry of the skull only after the simulation and thus determine the final resolution of the mesh. A growth solver, which is initiated by changing the distance between neighbouring points when breaking, starts the “disintegration” process. All infected polygons are passed on to another solver, which freezes them at their position and feeds their centre as a single point into a POP network. After the simulation of the POP solver, the polygons are linked to the animation of the respective points using an ID. This allows the dissolving parts to be simulated and saved in a very computationally and data-efficient manner. The individual points are controlled by a velocity field that was generated with the RBD simulation.
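The growth-solver part of that setup – an "infection" that spreads from point to point by proximity – can be illustrated with a toy example. This is a conceptual sketch, not the Houdini implementation; points are 1D and the radius and step count are made up:

```python
# Sketch of the growth/infection idea: starting from seed points, the
# "disintegration" spreads each step to neighbours within a radius.

def grow(points: list[float], seeds: set[int],
         radius: float, steps: int) -> set[int]:
    """Return the indices infected after 'steps' spreading iterations.

    A point becomes infected when it lies within 'radius' of an already
    infected point – the same neighbour-distance trigger described above.
    """
    infected = set(seeds)
    for _ in range(steps):
        newly = {
            i for i, p in enumerate(points)
            if i not in infected and any(
                abs(p - points[j]) <= radius for j in infected)
        }
        infected |= newly
    return infected

# 1D toy example: points along a line, infection seeded at index 0.
pts = [0.0, 1.0, 2.0, 3.0, 10.0]
print(sorted(grow(pts, {0}, radius=1.5, steps=2)))  # [0, 1, 2] – the far point stays intact
```

In the production pipeline the infected polygons are then frozen and their centres handed to the POP solver, so only single points (rather than full geometry) have to be simulated and cached.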


In order to intensify the shattering, we also added two further particle simulations, which on the one hand provide more volume, but more importantly create the appearance of a dust-covered floor – and thus match the filmed plate. We were also able to use a similar but significantly simplified technique for two shots in which the hand and arm of an actor are burnt by the sun and disintegrate into dust.

Digital film set

The Faro laser scanner provided great technical support for post-production. Purchased especially for the Visual Effects degree programme, it enabled us to create 3D scans of the entire set, which later allowed us to read off accurate distances and create precise match moves. Our initial expectation that tracking would not be a problem with smaller camera pans was quickly dashed by the bitter realisation of how important a good match move is. Thanks to matchmove lecturer Ando Avila, however, we were able to overcome this challenge as well. The scans also formed the basis for the digital replica of the film set. An initial rough reconstruction provided information on the position and tilt of the camera, and details such as wooden struts, chairs and lamps were subsequently added. The footage of the actors was rotoscoped and projected onto the geometry. This projection of the filmed stage onto 3D geometry was used several times and was an important tool, especially for credibly conveying subtle camera movements with parallax shifts. Even rudimentary geometry was helpful here.

Pipeline

In the previous year and during the course of the project, we learnt a lot about effectiveness and workflow. Many shots with the same environment led to the decision to integrate Houdini digital assets into our workflow. This allowed us to tackle steps such as modelling and animation at the same time. But that alone is not enough to keep a cool head and keep things organised. Our Pipeline TD Jonas Kluger, who supported us in the workflow with Shotgrid Studio and modified the pipeline according to our needs, made sure of this. This enabled us to keep a better overview of the tasks and stick to our production schedule.

Compositing

The final step in post-production was compositing. As the production offered everything from simple retouching to elaborate full-CG shots, the six students had to finalise the more than 40 visual effects shots in a short space of time. Deadlines are deadlines, and they don’t wait any longer than our climate does. The comp work took place exclusively in Nuke, and for some of us, working on the project was a good introduction to the software. Fortunately, we were able to rely on the support of Christoph Zapletal and Jens Schneider, who taught us better Nuke structures and advanced techniques. This was the only way we could work on many different shots at the same time.
At the end of the film, there were also several shots in which we had to digitally add ash flying through the air. The software “Das Element” helped us enormously here, as plates can be organised and found quickly with it. These plates were created on the last day of filming and supplemented the library of purchased stock footage: from a large ladder we threw down dust particles and torn paper, or blew mist through a hose into a rubber glove. As everything was filmed against a black background, we could simply use these elements in Nuke and add them to the shot plates from the stage.

Nice and dirty – the colour grading

Once shooting, editing and visual effects were complete, the film was given its finishing touches. Andreas Lautil from Pharos created an individual look to match the eccentric style of the dystopian theatre. Although the grading is based on fairly classic teal-and-orange lighting on set, it was pushed in a much dirtier direction here. The dark areas were made dirtier and greener so that the audience could immerse themselves in the run-down world and better empathise with the society in the film. Once again, it became clear how much influence each department has on the final project. Originally, a rather desaturated style was planned – clearly recognisable in our concept drawings. However, this was varied considerably in the grading. Now the contrasting cold tones in the shadows harmonise with the warmer highlights and thus also with the absurd events in front of and behind the stage.

The sound design and funny background conversations

The quiet murmuring, the sound echoing from all directions and the creaking of the rusting steel transported us into the gigantic wreck in the middle of the desert for the first time during the making of the film. Thanks to Dr Rodolfo Silveira’s work in sound design, the silent shots that had accompanied us for months became an atmospheric experience. In several takes, vocal registers and positions, we students were even able to lend our voices to the digital audience ourselves and push our vocal cords to their limits. What do old, injured people who are on the verge of dying of thirst sound like when they nevertheless enter a bizarre theatre? The HFF’s in-house recording studio demanded a lot from us in terms of acting, but also brought to light a hidden passion in some of us. As the film was shot entirely in a studio, all the metallic sounds of a ship had to be placed from the very first shot of the film.


Thus, the entire opening sequence, which takes place in a corridor to the side of the theatre stage, was given a full reverb treatment to convey the echoes, vibrations and thus the dimensions and weight of an abandoned oil tanker. Through this strategy, the sound creates a new level of understanding that complements and enlarges the film for the audience and prepares them for the climax and the central conflict of the story. The sound design was supported by an atmospheric composition by musician Meredi, who captured a unique mood for the film using specially created sounds.

Outlook

Above all, working with industry professionals showed us that even a little insight and practical experience in the other departments helps to build mutual understanding. The mentality that visual effects is just a part of post-production is a thing of the past, because for visual effects, film-making begins during script development. Virtual production is becoming increasingly relevant for well-planned productions with a visual effects component. The first internal presentation took place as part of the “VFX-Reel” 2022 exhibition in the Audimax of the HFF Munich.

Team

  • Script: Larissa Dold
  • Director: Matthias Zentner
  • Producer: Michaela Mederer and Luisa Eichler
  • Director of Photography: Teresa Renn
  • Gaffer: Torsten Baier
  • Production Designer: Katja Severin
  • Costume Designer: Katharina Ost
  • Key Make-Up: Sabeth Kelwing
  • Composer: Ina Meredi Arakelian
  • Editor: Michael Dervenski, Matthias Zentner
  • Colour Grading: Andreas Lautil
  • VFX Supervisor: Jan Stoltz and Dietrich Hasse
  • Project Supervision: Prof. Jürgen Schopper
  • Project Consultant: Dr Rodolfo Anes Silveira
  • VFX Pipeline TD: Jonas Kluger
  • 3D Mentor: Berter Orpak
  • Team Assistant: Petra Hereth
  • Technical Tutor: Moritz Rautenberg
  • VFX Producer: Chris Kühn, Jonas Potthoff
  • Modellers: Christian Gessner, Chris Kühn, Malte Pell, Tobias Sodeikat
  • Texture Paint Artists: Christian Gessner, Chris Kühn, Malte Pell, Tobias Sodeikat
  • Simulation Artist: Nicolas Schwarz
  • Animators: Chris Kühn, Tobias Sodeikat, Malte Pell
  • Compositors: Jonas Potthoff, Chris Kühn, Nicolas Schwarz, Malte Pell, Tobias Sodeikat, Christian Gessner
  • Visual Effects Editor: Jonas Potthoff
  • PreViz Artists: Christian Gessner, Chris Kühn
  • Sound Design/Re-Recording Mixer: Dr Rodolfo Anes Silveira
    • VFX Mentors
  • Concept: Luis Guggenberger
  • Modelling/Texturing: Dirk “Superdirk” Mauche
  • Matchmove: Ando Avila
  • FX: Felix Hörlein
  • Mattepainting: Jens Schneider
  • Animation: Prof. Melanie Beisswenger
  • Compositing: Christoph Zapletal
  • Lighting: Frank Dürschinger
    • Cast
  • A Chancellor: Viola von der Burg
  • A Conservative Party Member: Claus Peter Seifert
  • Green Party Member: Ronja Katharina Brusa
  • A Prompter: Patrick Bimazubute
  • The Choir Singers: Emil Vorbrugg
