VR Deep Diving | Retro-Article

A look back: In DP 06:2017, our author Mirja Fürst immersed herself in a VR experience for the “Ghost in the Shell” film adaptation.

In the “Ghost in the Shell” VR experience, the viewer is transported into Major’s world via the VR headset. Using the controllers, they can trigger a glitch effect or slow down time; they watch Major’s fight against robot geishas, experience a deep dive and fall from a rooftop. The experience was realised by the English studio REWIND in cooperation with the production studio Here Be Dragons, Oculus, Paramount and DreamWorks. We tested it at this year’s FMX and then met REWIND founder and CEO Solomon Rogers for an interview.

REWIND was founded six years ago as an animation and VFX studio. The team has been focussing on virtual and mixed reality as well as 360-degree projects for around three years, but as a content studio it still takes on post-production jobs of all kinds. The “Ghost in the Shell” experience was realised at REWIND by a team of 15 people in a very tight time frame of seven weeks, which also included the motion capture shoots.

DP: How did REWIND’s focus on VR come about?

Solomon Rogers: We started working on VR projects and experiences shortly after the launch of the Xbox One. When the first Oculus Rift DK1 was launched, we quickly realised that we could use this technology for all kinds of projects in the architecture, education and entertainment sectors. From that point on, we focussed on VR productions.

DP: What did your software pipeline for the VR project “Ghost in the Shell” look like?

Solomon Rogers: We used everything (laughs): 3ds Max and Maya were used for the modelling, Houdini was used for all the particle effects, fluid dynamics and the holograms, plus a bit of ZBrush and Mudbox. The Unreal Engine was the core tool for the delivery, and we used Unity for the Gear VR version.

DP: Why did you use two different engines?

Solomon Rogers: During testing, we found that the Unreal Engine delivered the best results on the Rift. But we couldn’t run a real-time engine on the Gear VR, so we layered interactive elements on top of a stereoscopic 360-degree render. This approach worked very well in Unity because it kept the material really lightweight.

DP: How many assets did you create for the “Ghost in the Shell” experience?

Solomon Rogers: I can’t give you an exact number, but there are four main characters and three key scenes: the one on the roof terrace, the tea room and the shell sequence at the end.

DP: Did you have to build the assets from scratch or did you have models from the film to work from?

Solomon Rogers: We received numerous references from the trailer and the initial artwork, which provided a good basis. We also received film assets and 3D scans from some of the VFX vendors, but these were far too high-resolution to use in the engine, so they served only as reference.

DP: There’s this huge city backdrop that you can see from the roof terrace. How did you create it?

Solomon Rogers: We tried to get as close as possible to the look of the film. We built the nearby buildings entirely by hand in Maya and Houdini, the more distant cityscape was created with lots of matte paintings, and the compositing was done in Nuke. The hologram figures between the buildings come from the film; we received them as Houdini files. But it was a very complicated process to get them working in proper 3D form in Unreal. We rendered them out of Houdini as animated flipbooks and then used the depth data to generate a displacement map that we run in the engine. So the 3D object is actually just a texture.
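To illustrate the idea, here is a minimal Python sketch of the depth-to-displacement step. The file names, the normalisation and the 8-bit output are our assumptions for illustration; REWIND’s actual Houdini-to-Unreal pipeline is not documented here.

```python
import numpy as np
from PIL import Image

# Hypothetical input: a grayscale depth pass exported alongside each
# Houdini flipbook frame (the file name is illustrative).
depth = np.asarray(Image.open("hologram_depth_0001.png"), dtype=np.float32)

# Normalise the depth to the 0..1 range so the engine can sample the
# map as a displacement value.
near, far = depth.min(), depth.max()
disp = (depth - near) / max(far - near, 1e-6)

# Write an 8-bit displacement texture; in the engine, a flat card's
# vertices are pushed along their normals by this value, so the
# "3D object" is really just a textured plane.
Image.fromarray((disp * 255).astype(np.uint8)).save("hologram_disp_0001.png")
```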

DP: The experience of falling from the rooftop was very interesting. How did you achieve that?

Solomon Rogers: It’s actually just a very small rotation of 30 degrees. As soon as you play with the horizon in VR, people feel shaky, and we used this effect as a narrative device in the experience. We ran tests with volunteers, many of whom actually fell over (laughs). Because we were working in a real-time engine, we could keep testing what felt best, and we settled on 30 degrees: enough to feel a little of the fall, but not too much.
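The trick Rogers describes amounts to rolling the virtual camera around its view axis so the horizon tilts. A minimal numpy sketch of such a roll follows; the axis conventions and the helper function are ours, not REWIND’s code.

```python
import numpy as np

def roll_camera(up: np.ndarray, forward: np.ndarray, degrees: float) -> np.ndarray:
    """Rotate the camera's up vector around its forward (view) axis,
    tilting the horizon without changing where the viewer looks."""
    axis = forward / np.linalg.norm(forward)
    theta = np.radians(degrees)
    # Rodrigues' rotation formula.
    return (up * np.cos(theta)
            + np.cross(axis, up) * np.sin(theta)
            + axis * np.dot(axis, up) * (1.0 - np.cos(theta)))

# A 30-degree roll, as in the falling scene: enough to unsettle the
# viewer's sense of balance without tipping everyone over.
tilted_up = roll_camera(np.array([0.0, 1.0, 0.0]),
                        np.array([0.0, 0.0, 1.0]), 30.0)
print(tilted_up)  # -> approximately [-0.5, 0.866, 0.0]
```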

DP: How much freedom did you have in the design, for example with the time slowdown or the glitch effect?

Solomon Rogers: The design was worked out between us and Here Be Dragons; we collaborated closely with their creative directors throughout the whole process. They were very good at steering great ideas, so not many approvals from the production companies were needed. We also didn’t have to stick too closely to the ideas of the film, because it’s an autonomous VR setting: a small part of the fantasy of the whole “Ghost in the Shell” universe that stands on its own.

DP: Your original concept envisaged more interactivity. What was planned?

Solomon Rogers: Originally, the viewer was supposed to pick up weapons and fight their way through the scenery. In the time-freeze scenes, they could have worked out the best way to take out the bad guys.

DP: What was the output like for the different platforms?

Solomon Rogers: The Rift version has a tougher minimum hardware specification, which caused difficulties because the experience had to run smoothly in Unreal on that baseline. Nevertheless, the final version wasn’t too big in the end, at a few gigabytes. The Gear VR version had UHD resolution and came in at 600 megabytes. For Facebook, we created a 6K x 6K stereo video with spatial audio.

DP: At what resolution did the material come out of the engine?

Solomon Rogers: We exported the experience from Unreal at 16K x 16K. We chose that resolution so we could downscale to 8K and 4K for the Gear VR, and starting that large allowed us to achieve very high quality.
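As a rough illustration of that mastering step, here is a short Pillow sketch. The file names, the square frame layout and the choice of Lanczos filtering are our assumptions, not details from REWIND.

```python
from PIL import Image

# Raise Pillow's decompression-bomb limit: a 16K x 16K frame exceeds
# the default pixel ceiling.
Image.MAX_IMAGE_PIXELS = None

# Hypothetical 16K x 16K master frame exported from Unreal.
master = Image.open("gits_master_16k.png")

# Downscale the master to the delivery resolutions; Lanczos is a
# common high-quality filter for large reductions.
for size, name in [(8192, "gits_8k.png"), (4096, "gits_4k.png")]:
    master.resize((size, size), Image.LANCZOS).save(name)
```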

DP: Other VR experiences often look very blurred and pixelated, but this problem wasn’t apparent in the “Ghost in the Shell” experience. How did you manage that?

Solomon Rogers: Because the headset shows two stereo images, the picture unfortunately often ends up somewhat soft, so we sharpened the images slightly in post-production. It also helped that many scenes in the “Ghost in the Shell” experience are very dark: we could concentrate on getting higher quality into the brightly lit areas, although that also means some very beautiful details unfortunately remain hidden in the darkness.
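A sharpening pass like this can be approximated with a standard unsharp mask; here is a minimal Pillow sketch. The file name and the filter parameters are illustrative guesses, not REWIND’s actual settings.

```python
from PIL import Image, ImageFilter

# Hypothetical frame from the experience (the file name is illustrative).
frame = Image.open("gits_frame.png")

# A mild unsharp mask to counteract the softness introduced by the
# stereo headset optics; radius and strength are guesses.
sharpened = frame.filter(ImageFilter.UnsharpMask(radius=2, percent=80, threshold=3))
sharpened.save("gits_frame_sharpened.png")
```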

DP: Overall, what was the biggest challenge of the project?

Solomon Rogers: Definitely the time. But also getting the motion capture to work and look smooth with so many characters in one scene at the same time. This project taught us a lot about motion capture in VR.

DP: Why was the project schedule of seven weeks so extremely short?

Solomon Rogers: I wish I knew. At first we turned the project down because we felt the amount of work couldn’t be done in that time frame. But our team loves “Ghost in the Shell”: they really wanted to do the project and were prepared to put in long working days for it. So we decided to take it on after all and called back.

DP: Which experience was more complicated to realise: “Ghost in the Shell” or the BBC spacewalk “Home”?

Solomon Rogers: “Ghost in the Shell” was challenging because of the time pressure and the extensive motion capture. But it runs linearly; the viewer is simply shown a story, which made it easier than “Home” in terms of interactivity. The BBC spacewalk was a ground-breaking VR project that had never been realised in this way before. The story had to put the user at the centre and suggest that they were free to make their own decisions and had full control. We had to operate in the grey area between pure storytelling and a game.
