Filmakademie Baden-Württemberg – DIGITAL PRODUCTION
https://digitalproduction.com – Magazine for Digital Media Production

The Beauty | Student Academy Award
https://digitalproduction.com/2020/09/23/the-beauty-student-academy-award/ – Wed, 23 Sep 2020
The award goes to Animago and Visual Effects Society winners.

Student Academy Award

From the Animago Awards 2019 to the VES Awards 2020 – and now to the Student Academy Award? Causation or just correlation? In any case, the diploma film The Beauty is on an impressive winning streak. Created at the Animation Institute of Filmakademie Baden-Württemberg, The Beauty has now won the Student Academy Award 2020 in the international competition in the animation category. The winners will be honoured in a virtual ceremony on 21 October in Los Angeles.

The Beauty

Topic of the diploma film: The pollution of the world’s oceans by plastic waste. The short film dives into an underwater world in which plastic and nature come together. For a breath, worries and feelings of guilt dissolve between coral reefs and the deep sea. Then the viewer is fished out of escapist images – and man’s ecological responsibility forces its way to the surface…

Baden-Württemberg Film Academy

The Beauty marks the sixth time that the Student Academy Award has gone to a production from the Filmakademie Baden-Württemberg. The previous winning years: 1998 (Rochade), 2007 (NimmerMeer), 2012 (Von Hunden und Pferden), 2015 (Erledigung einer Sache) and 2017 (Galamsey – Für eine Handvoll Gold) – and now 2020 (The Beauty). In total, seventeen student projects have made it to the final round of the young talent competition since the Film Academy was founded.

Teaser trailer and making-of below:

The Beauty wins VES Award
https://digitalproduction.com/2020/01/31/the-beauty-gewinnt-ves-award/ – Fri, 31 Jan 2020
Deep Dive with "The Beauty" - the short film from the Animation Institute of the Filmakademie Baden-Württemberg now also wins the VES Award - read the making-of from the DP.

In addition to our own animago “Jury’s Prize” 2019, the diploma film The Beauty, created at the Animation Institute of the Filmakademie Baden-Württemberg, is going on a trophy dive: The short film was recently honoured in Los Angeles with a VES Award in the category “Outstanding Visual Effects in a Student Project”. The team directed by Pascal Schelbli (VFX Supervisor: Marc Angele, Animation: Noel Winzen, Producers: Tina Vest, Aleksandra Todorovic) beat three other nominated student productions. You can find all the winners of this year’s VES Awards here.

The Beauty Making-of in DP 19:06

In The Beauty, the Animationsinstitut team takes us on a ‘deep dive’ into a rather uncomfortable part of the world. Beautiful images and sounds from the depths of the oceans can nevertheless be seen and heard – shoals of flip-flops, coral reefs made of plastic cutlery and straws, plastic jellyfish and whales made of bottles. You can read all the details about the realisation in our article from DP 19:06 here for free.

Further information: the website of the Animationsinstitut

Into space with the Film Academy
https://digitalproduction.com/2019/03/05/mit-der-filmakademie-im-weltraum/ – Tue, 05 Mar 2019
With their diploma project "Asperity", the Filmakademie Baden-Württemberg team, in collaboration with a total of 30 students, transformed a visit to FMX 2018 into an interactive space journey.
Asperity

Strapped into a cockpit chair with VR goggles, headphones and other accessories, participants were able to slip into the role of a co-pilot in the space shuttle. Together with virtual pilot Charles Overmyer, the shuttle is supposed to dock with the ISS, but the mission goes differently than planned (asperity-tec.com). The team has come up with many details for the interactive project: In addition to the actual VR experience, a fictitious company called Asperity Technologies Corporation has also been created – including a corporate design (even with postcards), website and image film. The trade fair stand at FMX 2018 was also elaborately designed – and anyone who didn’t run into the FMX beaver could continue the selfie cascade with an astronaut from Asperity. We spoke to the team to find out details about the implementation in Unity, the tools used, the workflows employed and more.

Artwork shuttle with a view of the moon
DP: For those who didn’t have the chance to try out your project at FMX: What can users expect to find in Asperity?

Lena Lohfink: Asperity is an interactive, cinematic virtual reality experience in which the viewer embarks on an adventurous space journey. At our Asperity Technologies Corporation stand, users can not only learn a lot about the company, but also take a seat in a real replica of a cockpit chair. Then it’s time to put on the VR goggles and headphones, put on the controller glove, grab the joystick and off you go! This interactive 360-degree room installation simulates as real a space flight as possible by exposing the user to additional external, physical stimuli and thus actively involving them in the experience.

Rendering of the hull of the space shuttle
DP: What goals have you set yourselves for the project?

Sebastian Ingenfeld: Personally, I’ve always been interested in space travel – both the scientific and the fictional side of science fiction. The iconic space shuttle has always been my favourite, and when I stood in front of the decommissioned “Atlantis” in Florida in 2016, I couldn’t be stopped. The main aim was always to entertain. However, it was also important to me to strike a balance between captivating entertainment and a credible scientific background. The end result should be a piece of VR entertainment that I would have liked to experience myself as a viewer and that wouldn’t leave every scientist scratching their head.

Rendering of the iconic engines of the Asperity Shuttle
DP: How big was your core team and what additional people did you work with?

Lena Lohfink: The core team consisted of the director, lead technical director and producer. In total, however, around 30 people were involved in the project, including US and Canadian voice actors, German and Slovenian actors and many great artists (programming, animation, sound design, music etc.) from the Baden-Württemberg area.

DP: How much experience did you have with VR & interactivity beforehand?

Sebastian Ingenfeld: I mainly have experience with classic short films. However, I had already worked on an interactive installation at the Film Academy and was able to quickly learn the necessary basics. I still had to adapt to many new workflows and working methods during creative pre-production – but that was also a lot of fun.

Section of the cockpit controls as a screenshot of the Unity Engine
DP: Over what period of time was “Asperity” created?

Lena Lohfink: The pre-production and production of “Asperity” began in October 2017 and ended with the graduation in May 2018. However, the idea and the script were created much earlier, at the end of 2016, and some preparations were already made in spring 2017.

DP: What did your project management look like? Did you use any special tools?

Lena Lohfink: As this was our first VR project, I gathered a lot of information from and with the artists at the beginning and looked for comparable projects. In the resulting production plan, we divided the seven-month production time, including pre-production and project completion, into five major milestones – in our case, user tests. Certain elements of the film/game had to be in production at each of these milestones, so our weekly targets and deadlines were derived from them. Following agile project management methods, we defined the various task packages from week to week, distributed them to those responsible and worked through them. The production plan, weekly logs with targets and so on were mainly created in Excel, Google Spreadsheets and InDesign. We also worked with game and flow charts, used handwritten to-do lists to track the task packages and tried to save time through direct communication channels.

This is what the cockpit looks like before and after the final shading in the Unity Engine
DP: How did the planning and work on this differ from previous projects?

Lena Lohfink: In comparison, the task packages of previous projects were much easier to organise into almost linear processes whose individual tasks were interdependent: a given task could only start once other tasks had reached a certain point or been completed. With “Asperity”, user testing was the linchpin of almost all task packages and areas, which meant an incredible number of individual, independent work streams running at the same time. Monitoring the work processes was therefore much more complex, and the various game elements had to be adjusted and re-evaluated on a weekly basis against the engine’s capacities and the results from user testing.

DP: Which hardware and software tools were mainly used in the respective project phases? What did your pipeline look like?

Sebastian Ingenfeld: I mainly work in Cinema 4D and modelled or assembled the first assets there – I also created and revised all the UVs there. In general, the interior of the cockpit – which you have in front of you for almost the entire experience – consists of a lot of geometry. The basic textures were created with Substance Painter, but I then mostly reworked them in Photoshop to break up the procedural look. We assembled the experience in Unity 3D and used the SteamVR plug-in to address the HTC Vive. The cockpit including displays, astronaut, effects and sunlight runs in real time. The intercom or video chat connection to Mission Control and the phenomenal view of the Earth are pre-rendered video textures – they now run really well in Unity.

The raw model of the cockpit with grey shading
DP: Which file formats did you work with?

Sebastian Ingenfeld: Basically, “Asperity” is a patchwork of many assets from different sources – various artists modelled for us and we bought in assets. For the import to Unity, I have to say that .fbx is wonderful for models and animations. Our video textures run solidly as .mp4/ H.264. We even use a 360-degree video that we map onto a sphere interior.
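The sphere-interior mapping mentioned here boils down to an equirectangular projection: each view direction corresponds to one (u, v) coordinate in the 360-degree video frame. A minimal Python sketch of that lookup, purely for illustration – in the project this would have been handled by a Unity material, and the function name is hypothetical:

```python
import math

def equirect_uv(direction):
    """Map a unit view direction (x, y, z) to (u, v) in an
    equirectangular 360-degree video frame, as when texturing the
    inside of a sphere. u wraps horizontally; v runs from the top
    of the frame (0) to the bottom (1)."""
    x, y, z = direction
    u = 0.5 + math.atan2(x, z) / (2.0 * math.pi)
    v = 0.5 - math.asin(max(-1.0, min(1.0, y))) / math.pi
    return u, v
```

Looking straight ahead at the horizon lands exactly in the centre of the frame; looking straight up samples the top edge.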

https://www.youtube.com/watch?v=uCQYhermUmM

DP: How many assets did you create for the project and which templates did you use?

Sebastian Ingenfeld: The cockpit alone as a set consists of countless components with thousands of switches. It’s particularly helpful that NASA provides a large number of plans and drawings for free; we mainly stuck to these and essentially copied them. However, we had to modify the design of the original NASA space shuttle cockpit slightly for our story and add a little science fiction here and there. Apart from the pilot, we only had to model a bit of space debris and build a believable Earth as a moving 2D matte painting – after all, the shuttle is flying into the sunset. Then there are our “floating props” – basically everyday astronaut objects: a torch, a logbook, chewing gum. These props float through the cockpit and can be touched by the user.

One of several animated interface elements for the cockpit
DP: Was it possible to use existing assets, e.g. assets provided online by NASA?

Sebastian Ingenfeld: NASA provides free 3D models for download, but these didn’t make it into the game in the end. Due to time constraints, we bought the model of the ISS – NASA also offers a finished model free of charge, but the retopo would have taken a lot of additional time, which we wanted to save. Philipp Maas derived and textured a Unity-compatible asset for us. For our matte paintings, we naturally used existing shots from orbit. We also drew on the NASA archive for our intro and placed our own designs using classic VFX workflows.

Overview of the interaction options in the self-programmed flow chart
DP: What did the realisation of the cockpit and the pilot look like in detail?

Sebastian Ingenfeld: During pre-production, I designed a corporate identity and corporate design for our fictitious company Asperity Technologies Corporation at an early stage. Inspired by this, our costume designer Marika Moritz created the astronaut suit in Marvelous Designer and sewed it together virtually – only a little retopo was needed here. Alexander Frey simultaneously sculpted the helmet in ZBrush and then modelled it game-ready in Maya. Our Technical Director Seyed Ahmad Housseini was mainly responsible for a clean character animation pipeline. We had roughly recorded all the necessary movements for the astronaut using motion capture, but our animator Maike Koller still had to animate many details on top. In Maya, the mocap rig was finally merged with the animation rig to create a game rig and then pushed to Unity via .fbx – there we were able to fade from take to take or send the character to idle using our event system.
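At its core, fading from take to take is a linear crossfade of animation weights over a short window. A hedged Python sketch of that idea – the actual system was a node-based event tool inside Unity, and the function name here is hypothetical:

```python
def crossfade_weights(t, fade_duration):
    """Blend weights while fading from the current animation take to
    the next; t is seconds since the fade started. Returns
    (weight of old take, weight of new take), always summing to 1."""
    a = min(1.0, max(0.0, t / fade_duration))
    return 1.0 - a, a
```

Halfway through the fade both takes contribute equally; once t exceeds the fade duration, only the new take plays.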

In-game VR screenshot of the cockpit in the Unity Engine (real time)
DP: What interaction options does the user have in the cockpit? How could these be realised with Unity?

Matthias Heim: The user wears a glove with a Vive tracker to control their hand as a space tourist in the cockpit, which was realised with inverse kinematics. This allows them to operate switches to progress through the story. In addition, objects float in zero gravity with which the player can interact; simulated rigid bodies were used for this. To dock with the ISS, the user controls the rotation of the shuttle with a real joystick, whose input Unity was able to read as a normal USB controller.
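Feeding the joystick into the shuttle's rotation amounts to integrating the axis values every frame. A minimal sketch of that mapping in Python – the axis order, sensitivity value and function name are assumptions for illustration; in the project this would have been a Unity script:

```python
def update_shuttle_rotation(rotation, axes, sensitivity=15.0, dt=1.0 / 60.0):
    """Integrate joystick axes (pitch, yaw, roll, each in [-1, 1])
    into the shuttle's Euler rotation, in degrees, for one frame
    of length dt seconds."""
    return tuple(r + a * sensitivity * dt for r, a in zip(rotation, axes))
```

Scaling by dt keeps the rotation speed independent of the frame rate, which matters in VR where the frame rate can fluctuate.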

DP: What animations had to be created for docking with the ISS?

Matthias Heim: For the docking of the space shuttle, the position was animated by hand, while the player controls the rotation with the joystick. In addition, the displays in the cockpit reflect the position and rotation of the shuttle; they were procedurally animated depending on these values.
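Driving a display procedurally from the shuttle state can be as simple as deriving a readout from the difference to the target attitude. A hedged sketch – the article only says the displays were animated depending on position and rotation, so this particular readout is an assumption:

```python
def attitude_readout(shuttle_rot, target_rot):
    """Per-axis angular offset (degrees) still to correct before the
    docking attitude is reached - the kind of value a cockpit
    display could render anew each frame."""
    return tuple(round(t - s, 1) for s, t in zip(shuttle_rot, target_rot))
```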

Asperity’s corporate design: In addition to the website and logo, postcards were also designed
DP: Why did you decide in favour of the Unity Engine?

Matthias Heim: As Unity is very widespread, most of the project team had already had contact with this game engine. In addition, the two programmers already had experience from previous Unity projects. The software offered us the opportunity to quickly develop customised tools, such as a node-based event system. This made it easy for non-programmers to create content and edit the storyline. Even though other engines such as Unreal Engine offer better graphics out of the box than Unity, this is not necessarily an advantage for VR applications: since every image has to be rendered twice and at a very high frequency, performance matters most, and Unity has already taken steps to improve it for these applications.
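The point about rendering twice at very high frequency is easy to quantify: the headset's refresh rate fixes the per-frame time budget, and both eye renders must fit inside it. A trivial illustration:

```python
def frame_budget_ms(refresh_hz):
    """Milliseconds available per frame at a given refresh rate; in
    VR the scene is drawn once per eye within this single budget."""
    return 1000.0 / refresh_hz
```

At the HTC Vive's 90 Hz this leaves roughly 11.1 ms for both eye images, versus 16.7 ms for a conventional 60 fps game rendering a single view.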

The helmet from multiple perspectives: The sculpting was done in ZBrush
DP: How did the exchange and integration of the sound design go – which middleware was used for this in Unity?

Pablo Knupfer: We worked very iteratively, defining the important sound events in advance. A large part of the sound design was also created at this stage – in snippets, so to speak – and was then integrated into the sound engine and thus into Unity. Binaural sound was necessary to create the most immersive experience possible and a particularly realistic soundscape – after all, sound sources needed to be localisable. This was implemented using the Wwise sound engine together with the spatializer from Oculus.
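What makes a source "localisable" is, at minimum, distance attenuation plus a direction-dependent level difference between the ears. A binaural spatializer like the one used here does far more (HRTF filtering per ear), but a crude stand-in conveys the idea; all names below are hypothetical:

```python
import math

def spatialize_gains(listener_pos, source_pos, listener_yaw=0.0):
    """Crude localisation cue: inverse-distance attenuation plus a
    left/right gain split derived from the source azimuth.
    Positions are (x, z) on the horizontal plane; yaw in radians."""
    dx = source_pos[0] - listener_pos[0]
    dz = source_pos[1] - listener_pos[1]
    dist = max(1.0, math.hypot(dx, dz))
    pan = math.sin(math.atan2(dx, dz) - listener_yaw)  # -1 left .. +1 right
    gain = 1.0 / dist
    return gain * (1.0 - pan) / 2.0, gain * (1.0 + pan) / 2.0
```

A source directly to the right ends up entirely in the right channel; a source straight ahead is split evenly and quieter with distance.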

DP: How did you bring conventional VFX elements into the real-time environment of the game engine?

Sebastian Ingenfeld: I used to do a lot of 2D compositing for live-action film and feel at home in classic VFX workflows. So my original plan was to pre-render elements such as explosions, impacts or space debris and integrate them as video textures. However, when all the hard-surface 3D assets and video textures for the backgrounds were integrated into the Unity scene and the performance and frame rate were still good, we reconsidered this decision. We now use a particle-based library for most effects such as fire, smoke and the visible rocket bursts. Our space debris is driven by Unity’s physics system, which also controls our floating props and things like glass shards inside the cockpit. The last remnants of VFX can be found in the view of Earth – a moving matte painting of real footage from Earth orbit. We also retouched a lot of licensed archive footage for our intro – a fictitious image film for Asperity Technologies Corporation – and integrated our own actors into our cockpit.
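The floating props essentially need gravity switched off and a little drag, so that a nudged object settles instead of drifting forever. A one-step sketch of that behaviour in Python – in Unity terms this roughly corresponds to a Rigidbody with gravity disabled and a small drag value, though those exact settings are assumptions here:

```python
def step_floating_prop(pos, vel, dt, drag=0.1):
    """One integration step for a zero-gravity prop: no gravity
    term, only linear drag damping the velocity before the
    position is advanced."""
    vel = tuple(v * (1.0 - drag * dt) for v in vel)
    pos = tuple(p + v * dt for p, v in zip(pos, vel))
    return pos, vel
```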

DP: What other means did you use to create the most immersive atmosphere possible for the user?

Sebastian Ingenfeld: The experience begins even before the user puts on the VR goggles. You experience “Asperity” exclusively on a replica of a space shuttle pilot’s chair, where you are initially strapped in tightly. Using a glove, the user can not only see their own hand in the virtual world, but also interact with it. In the finale, a joystick mounted on the chair becomes important. Within the story, it is very important that the user’s journey begins with the experienced pilot Charles Overmyer at his side – the astronaut sits right next to the player and confidently steers the shuttle.

Stefan Ingenfeld as astronaut Charles Overmyer in the pilot’s chair at FMX

The moment in which Charles Overmyer suddenly dies, leaving the inexperienced user virtually alone, is all the more shocking. At the same time, the lights in the cockpit go out and all the air escapes from the cockpit; the lighting and sound mood changes from one second to the next. Very early in the production phase, we experimented with the lighting of the instruments and the cockpit, so that the lights also adapt to the story and the mood. Outside the cockpit windows, the Earth can be seen slowly turning from its sunny side to its dark side. On the sound level, the experience dispenses with audible music almost entirely. Instead, Pablo Knupfer makes the seat vibrate via a bass channel and mixes an atmospheric ambience from a variation of rocket and ventilation noises, as you might find inside a shuttle.

DP: What major technical problems arose during the course of the project and how did you solve them? Were there any ideas that you weren’t able to realise?

Sebastian Ingenfeld: There were always minor difficulties when exchanging assets, especially with our pilot. We had a complex pipeline here. Our TD had many more ideas for technical features that we unfortunately didn’t manage to realise. For example, we toyed with the idea of implementing additional communication with Mission Control using AI-controlled voice commands. Personally, I would now like to realise the project outside of VR – with a functioning, haptic replica of the shuttle cockpit and with stereoscopic projections outside the windows. That would be a dream!

Fully kitted out: this is what the complete space suit outfit looks like
DP: What resolutions did you work with to achieve the appropriate sharpness and image quality?

Sebastian Ingenfeld: The experience runs at 60 fps, the same applies to our video textures. The resolution of the VR goggles is still problematic – we had to enlarge the labelling and even some displays within the cockpit so that the letters and symbols could be deciphered at all. Unfortunately, we are limited by the hardware and its resolution. The cockpit textures are correspondingly high-resolution in the direction of flight – however, if you were to use the roomscale and stand up from your seat, you would quickly realise that we have saved a lot of resolution in unimportant places.

Design and model of the suit in Marvelous Designer by Marika Moritz
DP: What did you particularly like about the project?

Matthias Heim: “Asperity” was my first big project in virtual reality, a technology that really fascinates me.
Lena Lohfink: Sebastian Ingenfeld developed a complete experience with a background story (fictional company, image film, website, exhibition stand with spaceship and much more) from his diploma project “Asperity”. Producing “Asperity” in its entirety was exciting, very varied and constantly presented me with new challenges and tasks that required all my previous skills to create something new (several voiceover recordings, real film shooting with set construction, VFX production, binaural sound recordings, game and app programming, etc.). I’ve grown from that.
Sebastian Ingenfeld: I particularly liked the subject matter and the opportunity to break out of my personal comfort zone of linear film. 360 degrees combined with the interaction really challenged me. That was great!
Pablo Knupfer: It was great for me to gain experience in the production of sound design for VR and spatial audio.

Breakdown of the individual project phases from October 2017 to May 2018
DP: Is it possible for interested parties to try out or view “Asperity” somewhere (e.g. on the website in the future)?

Sebastian Ingenfeld: “Asperity” is designed as an installation and can only be fully enjoyed as such. The pilot’s chair and our entire setup are interwoven with the experience, so the application will not be available online for the end user. We had our premiere at FMX 2018 and were fully booked every day of the trade fair. We are currently in talks with various customers to exhibit “Asperity” on a seasonal or permanent basis. Until then, you can find out when and where the experience will be available for a short time on our website.

https://www.youtube.com/watch?v=O5AGq9fVI0Y

DP: What are your hopes for the future of VR experience design? What technical issues do you think need to be fixed?

Sebastian Ingenfeld: For me, the biggest problem is the headset itself: the physical resolution and the limited field of view. With our HTC Vive from 2017, the pixel grid is still very clearly recognisable. Being forced to use the SteamVR software also limited us in some features and – in my opinion – it is not particularly user-friendly. A discreet background client with fewer adverts, no login and account requirements and no update anxiety would be much more pleasant.

The “Asperity” team on set for the recording of the radio videos
DP: What’s next for you after the diploma project?

Lena Lohfink: Now that we’ve completed the project, we’re starting to commercialise it and are looking for a buyer for our product. At the same time, I’ll be looking for a job and continuing to produce great projects that inspire people.
Sebastian Ingenfeld: Together with Lena, I’m now looking after the exploitation and possible further development of “Asperity” – perhaps we’ll soon have a co-operation that makes everyone involved happy.

Filmakademie VR film “Sonar” now on Google Daydream
https://digitalproduction.com/2017/09/21/filmakademie-vr-film-sonar-jetzt-auf-google-daydream/ – Thu, 21 Sep 2017
We already presented the interesting sci-fi horror experience "Sonar" in our VR focus in DP 02:17. Here is an extract from the interview with Philipp Maas and all the latest information.
VR-Film Sonar

Until now, “Sonar” could only be viewed on the Oculus-powered Gear VR; now Philipp Maas, director and developer of the project, has also brought the 360-degree film to the Google headset (€1.99) – because with every new VR headset, the number of users who have not yet seen “Sonar” grows.

In addition to this second platform, the film has also been given a spatial audio update: the VR audio specialists DELTA Soundworks from Heidelberg have given “Sonar” a new, spatial sound experience. In 2014, this was not yet feasible for the students at the Film Academy, nor was the visual impression of depth through omnidirectional stereo 3D.

The dearVR Spatial Connect tool from Dear Reality was used, which allows the film to be mixed while you are inside virtual reality. There is even a mix in third-order Ambisonics, which the team will roll out as soon as the platforms support it. The viewer is now drawn into the eerie atmosphere of the space scenario even more than before.
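For orientation: an Ambisonic mix stores a directional sound field rather than fixed speaker channels, so it can be rotated with the listener's head before being decoded binaurally. The project's mix is third order (16 channels); the much simpler first-order encoding equations, written here in ACN channel order, give the idea. This is an illustrative sketch, not the dearVR implementation:

```python
import math

def encode_first_order(sample, azimuth, elevation):
    """Encode a mono sample into first-order Ambisonics (ACN channel
    order W, Y, Z, X; angles in radians, azimuth measured
    counter-clockwise from straight ahead)."""
    ce = math.cos(elevation)
    return (sample,                            # W: omnidirectional
            sample * math.sin(azimuth) * ce,   # Y: left-right
            sample * math.sin(elevation),      # Z: up-down
            sample * math.cos(azimuth) * ce)   # X: front-back
```

A source straight ahead excites only the W and X channels; rotating the listener's head is then just a matrix operation on these channels.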

DP: Many people want to make full CG 360-degree films, but very few know exactly how. How did you approach your first VR project?
Philipp Maas: “Sonar” began as a classic CG animation film in Cinemascope format. In order to be able to produce as independently as possible, I set up a 4TB network storage and two of my own workstations in our project room so that we had a self-sufficient LAN and enough rendering power available. I also decided early on to work with the GPU renderer Redshift. Autodesk Maya and ZBrush, as well as two additional workstations, were provided by the Film Academy. By chance, however, we got our hands on the Oculus Development Kit 1 via my brother Alexander Maas, who also wrote the music for “Sonar”. After the decision in favour of 360 degrees was made, we spent a week researching on the internet to answer questions such as: What is possible, what software should we use, and much more. We were very apprehensive about familiarising ourselves with a game engine like Unity in the short time we had left for the project, as none of us had any experience with interactivity. On the other hand, our goal was to create a passive, very cinematic experience anyway. Unfortunately, we had to do without some features in the process that would have been available to us in a game engine, such as stereoscopy and spatial audio. Fortunately, we were able to add these things later. It is also advantageous to only distribute a video, i.e. a pre-rendered VR experience, as the distribution of software with constantly new drivers, operating systems and headsets would have been much more time-consuming for us.

DP: As you have the difficulty of not being able to actively direct the viewer’s gaze in 360 degrees, how did you conceptualise the storytelling?
Philipp Maas: We gave this a lot of thought and the decision to move the camera was made very early on. A moving camera automatically indicates a direction of movement (nobody wants to sit in the passenger seat with their back to the direction of travel for a long time…). Even very clear points of interest, such as a moving spaceship with bright lights, naturally attract the viewer’s attention. In the design, we also made sure to use classic compositional elements such as converging lines and frames. The most important thing in every shot is to define a focus and give the viewer time to find it.

DP: How did you go about avoiding motion sickness for the viewer? How were you able to test it before the release?
Philipp Maas: We tried to show our animation colleagues animatics on the Oculus as early as possible to observe their behaviour and reactions. This gave us incredibly valuable feedback. We knew from the beginning that we couldn’t do without camera movement altogether. Slow movements are fine and we always have static shots in the film that give the viewer time to orientate and recover. However, the ability to rotate the head should not be taken away from the viewer, so only “rolling” the camera and thus tilting the horizon is acceptable because we are not constantly making this movement with our heads ourselves. Motion sickness can also be used intentionally as a stylistic device – namely when it makes sense dramaturgically and the film is supposed to be really physical.

DP: How long did you work on “Sonar” in total? With how many people?
Philipp Maas: The core production time was between April 2014 and July 2014. The team consisted of only three people: Dominik and I were responsible for the visuals and split the 3D and 2D work, while my brother Alexander realised the music and sound design. In 2015, we then spent another three months, between June and September, working on the technical update. The app was developed in December 2015. We are currently mixing a new soundtrack in Ambisonic format in cooperation with Delta SoundWorks. The project keeps giving us opportunities to learn new things: the basic concept of the film works, and it is always worthwhile to take it further technically.

“Sonar” was created as a student project at the Filmakademie Baden-Württemberg. More about the university here.

Filmakademie Baden-Württemberg sets up new degree programmes
https://digitalproduction.com/2006/09/07/filmakademie-baden-wuerttemberg-richtet-neue-studiengaenge-ein/ – Wed, 06 Sep 2006

“Interactive Media” and “Education and Science”

The Filmakademie Baden-Württemberg in Ludwigsburg will expand its range of courses in 2007 to include two new degree programmes: “Education and Science” in cooperation with ZDF and “Interactive Media”, which will also be organised in cooperation with an external partner. According to a press release, the Film Academy is following its philosophy of practice orientation by expanding its offering. The application deadline is 15 February 2007.

With the “Interactive Media” degree programme, the Film Academy is responding to the growing importance of new media as future markets for cinematic storytelling. Given the ever-expanding range of media platforms alongside the classic areas of cinema and television, the new course will attempt to fill mobile phones and the Internet with sophisticated content.

The second innovation is the conversion of the previous degree programme “Business and Science Film” into the degree programme “Education and Science”. The name of the new programme reflects the current structures of many television newsrooms. The head of culture at ZDF, Peter Arens, has been recruited as a senior lecturer and will redesign the department for the 2007/2008 academic year.

The degree programme at the Filmakademie Baden-Württemberg normally lasts four years and ends with a final film and a diploma in the respective subjects. In addition, the Academy enables its graduates to make a smooth transition into professional life after graduation through co-operation with the Medien- und Filmgesellschaft Baden-Württemberg and a whole range of television stations such as Südwestrundfunk, Hessischer Rundfunk, Bayerischer Rundfunk, ARTE, ZDF, ProSieben/Sat.1 and RTL.
