The craft of filmmaking differs substantially from the art of game design. A great game is not easily turned into a successful movie, and film adaptations of video games often perform terribly and receive dreadful reviews. At the same time, however, the filmmaking process is becoming increasingly intertwined with the gaming industry. One of the main drivers of this convergence is the growing importance of game engines in the technical side of filmmaking. Game engines are bringing real-time production techniques to film sets, with significant consequences for the workflow of directors, writers and VFX artists.
Game engines have the potential to revolutionize filmmaking in several ways. At the core are real-time production techniques, originating in the game industry, that can be applied to filmmaking. Real-time filmmaking offers clear-cut advantages both to the creative process and to the cost side of stagecraft (i.e. the technical aspect of filmmaking).
To start with the latter, game engines have the potential to radically cut costs across the filmmaking value chain. Modern movies rely heavily on CGI (computer-generated imagery): a blockbuster can easily contain 1,000-2,000 VFX shots, and the current process for producing them is very time-consuming. If a director wants to change the lighting conditions and see the difference, it can easily take an hour or more; for Avatar (2009), every frame reportedly took several hours to render. Real-time rendering is the next phase in the evolution of this pipeline, and it affects VFX, previsualization and CG animation alike. Take VFX: normally, visual effects are added in post-production, with expensive render farms supplying the computing power. A render farm is a high-performance cluster of computers built to render computer-generated images. For Big Hero 6 (2014), Disney connected 4,600 computers with 55,000 processor cores into a giant supercomputer to handle its rendering needs. With real-time tools, filmmakers get an approximation of the final shot much earlier, which can drastically reduce the number of iterations, speed up the workflow and shrink rendering needs. Real-time rendering on massive LED screens could eventually make expensive render farms superfluous and replace green screens as the film industry standard. Another cost advantage of real-time 3D rendering is that filmmakers no longer need to bring an entire set to an exotic filming location. One person capturing footage of the environment will do the trick: the photorealistic CG environment can be brought back and projected in real time on the LED screens. This allows environments to be simulated without building an entire set in a foreign location, which is especially useful when only one or two scenes are involved.
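The iteration savings can be sketched with a back-of-envelope calculation. The numbers below are illustrative assumptions, not production figures: a hypothetical five-second shot and a rough four-hours-per-frame offline render time, in the spirit of the Avatar example above.

```python
# Back-of-envelope: how long before a director sees the result of one
# lighting change, offline rendering vs. real-time? Numbers are illustrative.

def feedback_seconds(frames: int, seconds_per_frame: float) -> float:
    """Total time until the updated shot can be reviewed."""
    return frames * seconds_per_frame

# A single 5-second shot at 24 frames per second.
frames = 5 * 24  # 120 frames

# Offline pipeline: assume roughly 4 hours of render time per frame.
offline = feedback_seconds(frames, 4 * 3600)

# Real-time engine: each frame rendered in at most 1/24 s (i.e. at playback speed).
realtime = feedback_seconds(frames, 1 / 24)

print(f"offline:   {offline / 3600:.0f} hours")   # 480 hours
print(f"real-time: {realtime:.0f} seconds")        # 5 seconds
```

Even granting that offline frames render in parallel on a farm, the contrast illustrates why fewer, faster iterations can translate directly into lower rendering costs.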
Besides cutting costs, real-time filmmaking also fundamentally rearranges the creative workflow. The key takeaway is that visual effects come into play earlier in the process: the project is “live” from the moment production starts. Under the traditional “sequential” method, editors often work on post-production VFX in isolation, and directors watch the edited frames a day later. Real-time production tools bring filmmakers and VFX artists together: they begin collaborating immediately and continue to do so throughout the entire process. As soon as the project is “live”, they can jointly experiment with light, shadow, characters and camera positioning. Furthermore, by embracing game engines, movie studios gain access to a new pool of talent. It might still be hard to attract hardcore game developers to filmmaking, but familiar tools and new workflows might lure them over.
In addition to bringing real-time production tools to filmmaking, game engines have other valuable assets that might entice the movie industry. For example, for the CG environments of The Jungle Book (2016) and The Lion King (2019), Disney leveraged the digital asset library of Quixel Megascans (acquired by Epic last year) to achieve high levels of photorealism. As with real-time techniques, this mostly speeds up the workflow, because prefabricated assets can be used instead of movie studios having to build the CGI themselves. Furthermore, game engines are spurring new forms of media content at the cutting edge of movies and gaming. Last year, ILMxLAB – the immersive entertainment studio of Lucasfilm – teamed up with Epic to develop Vader Immortal: A Star Wars VR Series. It received mixed reviews but is an interesting showcase of what can be expected from these innovative studios in the coming years.
There are also drawbacks to adopting game engines. Despite the clear improvement over green screens, it is questionable whether real-time rendering will ultimately deliver better output for consumers and whether it can match the visual splendor of on-location filming. It also remains to be seen how comfortable filmmakers will be replacing established workflows with ones that originate in the game industry. Consequently, the adoption of game engines will remain a balancing act between the demands of filmmakers, VFX artists, budget managers and consumers.