Engines of creation

Published on: June 18, 2016
JUNE 22nd is the 20th birthday of “Quake”. Its release, by a Texan firm called id Software, was a milestone in the history of video games. “Quake”, a grim and gory fantasy “shoot-’em-up”, pioneered many now-commonplace features of computerised play. Its most striking innovation was its fully three-dimensional world. This was drawn by a piece of software, called a game engine, regarded at the time as jaw-dropping.

These days “Quake” looks like a muddy brown mess. Two decades of advances in processing power, allied with cut-throat competition between games designers, have advanced the art tremendously. Game engines are now a product in their own right. Besides drawing the graphics, they handle tasks like simulating physics (such as gravity, say, or object collisions) and connecting players to each other online. They are, in other words, the platforms upon which games are built. Most games companies buy them pre-made, off the shelf. And not just games companies. Game engines have become so good at creating high-quality facsimiles of reality that they are attracting the attention of firms that, until now, have had nothing to do with video gaming at all.
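To make the "platform" idea concrete, here is a toy sketch of the per-frame loop that an engine supplies so that game studios need not build it themselves. This is illustrative only: Unity and Unreal are large C++ systems, and every name below is invented for this example rather than taken from either engine's actual API.

    # Minimal sketch of an engine-style frame loop: simulate physics, then draw.
    # Names and numbers are illustrative, not drawn from any real engine.
    import time

    class Ball:
        def __init__(self, height, velocity=0.0):
            self.height = height
            self.velocity = velocity

    GRAVITY = -9.81          # metres per second squared
    FRAME_TIME = 1.0 / 30.0  # budget for one frame at 30 frames per second

    def simulate_physics(ball, dt):
        """Apply gravity and a simple collision with the floor."""
        ball.velocity += GRAVITY * dt
        ball.height += ball.velocity * dt
        if ball.height <= 0.0:                   # collision with the ground
            ball.height = 0.0
            ball.velocity = -ball.velocity * 0.6  # lose some energy on the bounce

    def render(ball):
        """Stand-in for the graphics step: just report the state."""
        print(f"ball at {ball.height:5.2f} m")

    ball = Ball(height=10.0)
    for frame in range(90):                      # three seconds of simulated play
        simulate_physics(ball, FRAME_TIME)
        render(ball)
        time.sleep(FRAME_TIME)                   # in a real engine, drawing fills this budget

A game built on an off-the-shelf engine supplies only the content and the rules; the loop itself, the physics and the networking come ready-made.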

One such outsider is PLP Architecture, a big London partnership. PLP has been experimenting with two leading game engines, Unity (made by Unity Technologies, of San Francisco) and Unreal (made by Epic Games, of Cary, North Carolina). Architecture businesses have long used graphics to give their customers virtual tours of as-yet-unbuilt edifices. But, says Richard Woolsgrove, who is in charge of “visualisation” at PLP, these were often just pre-cooked animations. Game engines, by contrast, let clients wander wherever they like. Mr Woolsgrove’s group has created virtual versions of proposed buildings using one or other of the engines it is testing, and invited people to walk around and inside them, using a video-game controller to do so. The ability to explore a virtual building in this way, Mr Woolsgrove says, gets clients much more excited than they were by the old approach.

Game on
Architects are not the only non-gamers interested in extending the uses of game engines. NASA, America’s space agency, is a fan. It is experimenting with a virtual-reality (VR) system based on Unreal to train astronauts for stints on the International Space Station. And this year’s Game Developers’ Conference, an industry shindig held every March in San Francisco, featured an eclectic range of firms, from McLaren, a British sports-car company, to Disney, an American entertainment giant, talking about how they were using game engines either to sell products or to help design those products in the first place.

According to Clive Downie, Unity Technologies’ chief marketing officer, the main advantage game engines give organisations is the ability to do instantaneously what used to take minutes or even hours. Before such engines were applied to the task, creating high-quality renderings required computers to crunch tediously through the calculations needed to simulate how light rays bounce around rooms and interact with objects. Some individual frames of “Toy Story”, the first fully computer-animated film, took 30 hours to produce. Game engines avoid all that by employing a host of mathematical shortcuts to make images 30 times a second or more. The price is a lower-quality image. But as computing power has grown, the trade-off between speed and quality has become less and less noticeable (see picture: the game-engine version is above, the photograph from life is below).
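The scale of that trade-off is easy to put in numbers. Taking the figures quoted above at face value, a 30-hour offline frame and a game engine's 30-frames-per-second budget differ by a factor of several million, as the short calculation below shows.

    # Rough comparison using the figures quoted in the article.
    offline_seconds_per_frame = 30 * 60 * 60   # 30 hours for one "Toy Story" frame
    engine_seconds_per_frame = 1 / 30          # a game engine's budget at 30 fps

    speedup = offline_seconds_per_frame / engine_seconds_per_frame
    print(f"offline frame:       {offline_seconds_per_frame} s")
    print(f"engine frame budget: {engine_seconds_per_frame:.4f} s")
    print(f"ratio:               {speedup:,.0f}x")   # roughly 3.2 million times faster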

Such speed also lets artists tweak their creations on the fly. If the lighting is not quite right, or a piece of virtual furniture is made of the wrong material, that can be changed without waiting while the scene is laboriously redrawn. This dramatically speeds up the production process.

One way to think of a video game is as a primitive sort of virtual reality, in which a consistent, computerised world is generated and then presented to the player through a screen. “Proper” virtual reality, in which the illusion is made all-consuming by being supplied through a headset that blocks out the real world, is all the rage this year. Two retail headsets, one from Facebook and one from HTC, have already been launched; a third, from Sony, is expected before the end of the year. For now, VR is aimed mostly at gamers. But Tim Sweeney, Epic’s founder, points out that even non-gaming VR applications—such as a relaxing beach simulation or a shared virtual workspace—require slick, fast, computer-generated imagery of exactly the sort that his company sells.

The same is true of VR’s cousin, augmented reality (AR), in which computer-generated imagery is painted on top of the real world. Again, big firms are cooking up consumer products. Google is working on a new version of its delayed Glass headset, and Microsoft is preparing for the release of an AR product dubbed the HoloLens. Game engines could become to VR and AR what Windows is to the PC—the base layer on which other products are built.

Nor need those products be intended only for retail consumers. Ncam is a special-effects firm based in Soho, an arty district of London. It makes its living developing game-engine-based technology that lets film and TV directors drop virtual objects straight into scenes in real time. A recent demonstration involved Nic Hatch, Ncam’s boss, setting up one of the firm’s special cameras in the lobby of its office and pointing the lens at the empty middle of the room. A TV connected to the camera showed the same lobby, but with a convincing-looking McLaren sports car sitting in it. This was generated by Unreal from computer models supplied by McLaren’s designers. The firm also has clips of commentators walking around other virtual vehicles, explaining the finer points to viewers, and of weather forecasters sharing studios with computer-generated tornadoes that are, apparently, crossing the American Midwest.

The killer app of this sort of technology, though, will probably come in the film industry, on the “green screens” in front of which actors have to perform when computer-generated scenes are to be added later in a process called post-production. Green-screening requires actors to move around obstacles that are not there, and to interact with empty space where computer-generated characters will eventually stand. This is hard. Done badly, the results can look wooden and artificial. Technology like Ncam’s lets directors see what the special effects will look like while scenes are being filmed. They can thus manage the actors sensibly, telling them exactly where to look and how to behave.

I’m the king of the swingers

Ncam’s products have already been employed in big-budget films such as “White House Down”. A remake of “The Jungle Book”, released this year by Disney, used Unity. Mr Downie points to “Adam”, a short sci-fi movie shot entirely in Unity, and speculates that the first feature film made from start to finish in a game engine may not be far away. Mr Hatch thinks game engines may one day make conventional post-production obsolete.

Game engines may arrive on TV screens even quicker, though, if Future Universe, a small company based in Oslo, has its way. Future Universe plans to use game engines to merge video games with live television. According to Bard-Anders Kasin, one of the firm’s founders, its first endeavour will be a green-screened game show, with a game engine drawing a virtual world around the contestants. When the result is broadcast, viewers with tablets or smartphones will be able to jump into the action—such as a car race—and play alongside those in the studio.

Future Universe’s approach has attracted interest from TV networks. Mr Hatch says he knows of at least ten big TV companies that are actively experimenting with game engines. He speculates about using the engines to do everything from training car mechanics to building theme parks. “Imagine,” he posits, “if your kids could drop into a scene with Olaf and Elsa [a snowman and a princess from “Frozen”, a Disney film released in 2013].” Parents, worried about the costs of film spin-offs, may be less than delighted by that particular augmentation of reality. The prospect of a virtual sunlounger on a Caribbean island of their choice may help to ease the pain.
 
Copyright © The Economist Newspaper Limited 2016. All rights reserved.