Game engines are revolutionizing filmmaking

Sebastiaan Crul
January 20, 2020
The craft of filmmaking differs substantially from the art of game design. A great game is not easily turned into a successful movie, and film adaptations of video games often perform poorly and receive dreadful reviews. At the same time, however, the filmmaking process is becoming ever more intertwined with the gaming industry. One of the main drivers of this entanglement is the growing importance of game engines in the technical side of filmmaking. Game engines are bringing real-time production techniques to film sets, with significant consequences for the workflow of directors, writers and VFX artists.

Our observations

  • In its flagship series The Mandalorian, Disney used Epic’s Unreal Engine to render projected sets in real time. At SIGGRAPH 2019, writer and director Jon Favreau highlighted two important features of the game engine. The first was the Virtual Camera Plugin, which enables filmmakers to render mock-up environments in real time, so that editors and filmmakers can explore a virtual environment before they start shooting. This could potentially replace time-consuming previsualization methods such as painted storyboards.
  • The second tool highlighted by Favreau was the projection of CG (computer-generated) environments onto massive screens through real-time rendering, a technique Disney also used to render live CG backgrounds in Solo. Real-time rendering originates in game development and is now entering the filmmaking process. Normally, when VFX (visual effects) are required, actors are asked to act against a green screen. With real-time rendering, actors walk into a CG environment displayed live on massive LED screens, such as the Millennium Falcon cockpit, and are therefore better able to interact with their surroundings.
  • In November, Epic Games announced the acquisition of Quixel. Among other things, the company maintains a library of more than 10,000 digital photogrammetry assets that can be used in games and VFX. In the past, Disney leveraged this Megascans library to achieve high levels of photorealistic CGI in The Jungle Book (2016) and The Lion King (2019).
  • Epic’s big competitor Unity acquired Digital Monarch Media (DMM) in 2018, a company entirely focused on bringing real-time techniques to filmmaking. The company was founded by two veterans of the movie and gaming industries. DMM develops real-time filmmaking tools accessed through an iPad or a VR interface, which give the director more control while filming and the ability to see changes immediately.
  • Steven Spielberg, a VFX pioneer, used Unity for the visual effects in Ready Player One (2018). Like Favreau, he employed it mostly for real-time techniques. For Spielberg, the in-VR editor built by Unity offered a non-technical, intuitive way to move freely through 3D digital environments and frame shots.
  • Nickelodeon has announced that it will rely entirely on a game engine to produce the upcoming TV series Meet the Voxels, but has not revealed which engine it will be using. In the past, it has worked with Epic’s Unreal Engine on some of its VR projects.

Connecting the dots

Game engines have the potential to revolutionize filmmaking in several ways. At the core are real-time production techniques that originate in the game industry and can be applied to filmmaking. Real-time filmmaking offers clear-cut advantages to the creative process as well as to the cost side of stagecraft (i.e. the technical aspect of filmmaking).

To start with the latter, game engines bear the potential to radically cut costs in the filmmaking value chain. Movies nowadays rely heavily on CGI (computer-generated imagery): blockbusters can easily contain 1,000-2,000 VFX shots, and the current process for producing them is very time-consuming. If a director wants to change the lighting conditions and see the difference, for example, it can easily take up to an hour (every frame in the movie Avatar (2009) took several hours to render). Real-time is the next phase in the evolution of rendering, and real-time filmmaking affects VFX, previsualization and CG animation. Take VFX: normally, visual effects are added in post-production, with expensive render farms supplying the computing power. A render farm is a high-performance cluster of computers built to render computer-generated images; for the movie Big Hero 6 (2014), Disney connected 4,600 computers with 55,000 processor cores into a giant supercomputer to handle its rendering needs. With real-time techniques, filmmakers get an approximation of the final shot earlier, which can drastically reduce the number of iterations needed, speeding up the workflow and shrinking rendering requirements. Potentially, real-time-rendered 3D LED screens could even make expensive render farms superfluous and replace green screens as the film industry standard. Another cost advantage of real-time 3D rendering is that filmmakers don’t need to bring an entire production to an exotic filming location: one person capturing footage of the environment will do the trick, as the photorealistic CG environment can be brought back and projected in real time on the massive LED screens. The technique allows for the simulation of environments without having to build an entire set in a foreign location, which is especially useful when only one or two scenes are involved.
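To put the render-farm numbers above into perspective, the back-of-envelope sketch below compares how long one full pass over a feature film would take with offline versus real-time rendering. The runtime, frame rate and hours-per-frame figures are rough assumptions chosen for illustration only (they echo the Avatar anecdote above), not production data.

```python
# Back-of-envelope comparison of offline vs. real-time rendering.
# All figures are rough assumptions for illustration, not production data.

FPS = 24                     # standard cinema frame rate
RUNTIME_MIN = 120            # assumed feature-film runtime in minutes
HOURS_PER_OFFLINE_FRAME = 4  # "several hours" per frame, as cited for Avatar (2009)

total_frames = RUNTIME_MIN * 60 * FPS  # 172,800 frames for a two-hour film

# Offline rendering: every change that touches a shot means re-rendering its
# frames at hours per frame, which is why studios build large render farms.
offline_hours_sequential = total_frames * HOURS_PER_OFFLINE_FRAME

# Real-time rendering: the engine produces at least 24 frames per second,
# so one full iteration over the film takes roughly its own runtime.
realtime_hours = RUNTIME_MIN / 60

print(f"Frames in the film:               {total_frames:,}")
print(f"Offline, one frame after another: {offline_hours_sequential:,} hours "
      f"(~{offline_hours_sequential / 24 / 365:.0f} years without a farm)")
print(f"Real-time pass through the film:  {realtime_hours:.0f} hours")
```

The gap of several orders of magnitude is what render farms exist to close, and it is also why a shift to real-time iteration changes the economics of each creative revision.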

Besides cutting costs, real-time filmmaking also fundamentally rearranges the workflow of the creative process. The key takeaway is that visual effects play a role earlier in the process: the project is “live” from the moment production starts. When movie studios operate with the conventional sequential method, editors often work on post-production VFX in isolation and directors watch the edited frames the day after. Real-time production tools bring filmmakers and VFX artists together; they begin to collaborate immediately and do so throughout the entire process. As soon as the project is “live”, they can jointly start experimenting with light, shadow, characters and camera positioning. Furthermore, by embracing game engines, movie studios gain access to a new pool of talent. While it might still be hard to attract hardcore game developers to filmmaking, familiar tools and new workflows might lure them in.

In addition to bringing real-time production tools to filmmaking, game engines have other valuable assets that might entice the movie industry. For example, in the CG environments of The Jungle Book (2016) and The Lion King (2019), Disney leveraged the digital asset library of Quixel Megascans (acquired by Epic last year) to achieve high levels of photorealism. Comparable to real-time techniques, this mostly speeds up the workflow, because prefabricated assets can be used instead of movie studios having to build CGI themselves. Furthermore, game engines are spurring new forms of media content on the cutting edge of movies and gaming. Last year, ILMxLAB, the immersive entertainment studio of Lucasfilm, teamed up with Epic to develop Vader Immortal: A Star Wars VR Series. It received mixed reviews but is an interesting showcase of what can be expected from these innovative studios in the upcoming years.

There are also some drawbacks to game engines. Despite the clear improvement over green screens, it is questionable whether real-time rendering will eventually lead to better output for consumers and whether it can match the visual splendor of real filming. Furthermore, it remains to be seen how comfortable filmmakers will be replacing existing workflows with ones that originate from the game industry. Consequently, the adoption of game engines will remain a balancing act between the demands of filmmakers, VFX artists, budget managers and consumers.

Implications

  • Digital assets such as photorealistic landscapes, everyday items and reusable digital characters will grow in importance, as they speed up the workflow and are cost-effective. The possibility of reusing digital assets can be particularly useful to studios focusing on a franchise strategy. As this is currently the dominant strategy among the top studios, we can expect these companies to considerably expand their libraries of digital assets, with a strong focus on digital characters and animation. Game engine companies, on the other hand, will probably specialize further in digital assets such as CG environmental backgrounds and standard items.
  • Movie studios such as Disney are also building proprietary real-time engines, for two important reasons. The first is that it allows their digital assets to be transferred more easily to new projects. The second is that they want to retain autonomy over the filmmaking process. Consequently, interoperability between game engines and proprietary technology will remain a key focus for movie studios in the upcoming years.

About the author(s)
Economist and philosopher Sebastiaan Crul writes articles on a wide range of topics, including rule of law in digital societies, the virtualization of the lifeworld and internet culture. He is currently working on his doctoral degree on the influence of digitalization on mental health and virtue ethics, having previously published dissertations on the philosophy of play and systemic risks in the finance industry.