The Human Race: prototyping the future of film through the auto industry

The Human Race is a ground-breaking short film that pushes the boundaries of interactive visuals. Created to launch the Chevrolet Camaro ZL1, it premiered in 2017 at the Game Developers Conference (GDC) in San Francisco, the world’s largest professional game industry event. A co-production between car manufacturer Chevrolet, London-based VFX and creative content studio The Mill, and Epic Games, it blurs the lines between the worlds of gaming and advertising.

“The Human Race blends cinematic storytelling and real-time visual effects to define a new era of narrative possibilities.”
— Angus Kneale, Chief Creative Officer, The Mill

Utilising three unique innovations – Unreal Engine, the game engine developed by Epic Games; Cyclops, a revolutionary virtual production toolkit devised by The Mill; and the Blackbird, The Mill’s adjustable car rig that captures environment and motion data – the film features a race between the Chevy Camaro ZL1 and the Chevrolet FNR self-driving concept car. Essentially pitting man against machine as a human driver races an AI-controlled car, the film was part of a multi-platform campaign to celebrate the 50th anniversary of the Camaro.

It came about in response to a basic problem encountered across several sectors for years – cars are often unavailable for photoshoots and filming, due to cost constraints or secrecy issues.

“There has always been a need for CG cars but a few years ago car manufacturers were concerned that these didn’t look real enough,” explains Alistair Thompson, Executive Vice President International at The Mill. “We developed a method that brought a level of authenticity that hadn’t been seen before, by using a dummy car that gave the director a reference point for filming, with the car added in later.”

However, he admits it was a bit of a clunky process. “We had to digitally remove parts of the car we didn’t want to show and all of the data had to be captured manually.” And it didn’t answer the two fundamental questions that advertising clients faced when shooting their products: how do you see the vehicle you want on set while filming, and how do you create a CG car in a live action film that the client can personalise?

To answer those challenges, The Mill created the Blackbird, an adjustable car rig that can transform its chassis to match almost any car for a live action shoot. “Blackbird was an incredible innovation. It let directors have greater flexibility, but at the end of the day when you’re on set, you’re still not looking at the final car – so we started thinking about how our real-time engineers could work with The Mill to make it even better,” explains Kim Libreri, CTO of Epic Games.

The Mill created a software package called Cyclops and used Unreal Engine’s real-time rendering capabilities to build an augmented reality process that allows film-makers to instantly visualise a CG car model in live action shots. On a shoot, Cyclops stitches 360° camera footage and transmits it to Unreal Engine, which produces an AR image of the virtual car, tracked and composited seamlessly into the scene. This means the director can frame and sync a virtual car on location, reacting live to lighting and environment changes.
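To make that flow concrete, here is a minimal sketch of the kind of per-frame loop such a process implies. The names and types below are purely illustrative assumptions, not Cyclops or Unreal Engine APIs: each captured frame carries the stitched plate, the solved camera pose and the on-set lighting; a stand-in renderer draws the configured car from that pose; and the result is composited over the plate for the director’s monitor.

```python
# Hypothetical sketch of an on-set AR preview loop in the spirit of Cyclops.
# None of these names are real Cyclops or Unreal Engine APIs.
from dataclasses import dataclass
from typing import Iterable, Iterator

@dataclass
class CameraPose:
    position: tuple   # (x, y, z) of the camera on set
    rotation: tuple   # (pitch, yaw, roll)

@dataclass
class Frame:
    plate: bytes       # stitched 360-degree live-action image for this frame
    pose: CameraPose   # solved camera track for this frame
    lighting: dict     # environment lighting captured on set

def render_virtual_car(config: dict, frame: Frame) -> bytes:
    """Stand-in for the real-time engine: draw the configured car from the
    tracked camera pose under the captured lighting."""
    return b"<cg car render>"

def composite(plate: bytes, car_render: bytes) -> bytes:
    """Layer the CG render over the live-action plate (stub)."""
    return car_render + plate

def on_set_preview(frames: Iterable[Frame], car_config: dict) -> Iterator[bytes]:
    """What the director sees on the monitor: the plate plus a tracked CG car,
    updated every frame as lighting and environment change."""
    for frame in frames:
        car = render_virtual_car(car_config, frame)
        yield composite(frame.plate, car)
```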

The Human Race was the first ever interactive short film to blend live action film-making with real-time game engine processing. All of the cars and visual effects in the film are generated live every 42 milliseconds using the Unreal Engine. Using a configurator app, the team personalised the film as it played live on the screen. Then they took it one step further, giving the launch audience full control of the configurator to create their own film.
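For context, a 42 millisecond frame budget works out to roughly 24 frames per second (1000 ms ÷ 24 ≈ 41.7 ms), so any change made in the configurator has to be picked up and rendered within a single frame window. A rough sketch of that idea, using purely hypothetical function names rather than anything from the actual production:

```python
import time

FRAME_BUDGET_MS = 1000 / 24   # ~41.7 ms per frame at 24 fps

def play_interactive_film(render_frame, get_live_config, frame_count):
    """Render each frame with whatever configuration is current at that moment,
    flagging any frame that overruns the real-time budget (sketch only)."""
    for i in range(frame_count):
        start = time.perf_counter()
        config = get_live_config()        # e.g. a colour just picked on a tablet
        frame = render_frame(i, config)   # stand-in for the real-time engine
        elapsed_ms = (time.perf_counter() - start) * 1000
        if elapsed_ms > FRAME_BUDGET_MS:
            print(f"frame {i} missed the {FRAME_BUDGET_MS:.1f} ms budget")
        yield frame
```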

“The audience got to choose what type of Camaro they wanted to race, right down to the colour and other specific details,” says Alistair Thompson. “Altogether they had 400 different options, including the chance to drive a 1968 model. It was possible to configure it on screen or on a mobile tablet and see the results immediately. It was the first time this was possible.”
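One way to picture that option space (purely illustrative – the real configurator’s data model isn’t described here) is as a small structured record that the engine reads before rendering the next frame:

```python
from dataclasses import dataclass

@dataclass
class CamaroConfig:
    model_year: int = 2017        # e.g. the ZL1, or the 1968 classic
    exterior_colour: str = "red"
    wheels: str = "standard"
    stripes: bool = False

# Switching to the 1968 car mid-playback is just a new record;
# the next rendered frame picks it up.
classic = CamaroConfig(model_year=1968, exterior_colour="matte black")
```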

Since launch, The Human Race has picked up several prestigious awards, including the Best Real-Time Graphics & Interactivity Award at SIGGRAPH 2017 and two LIA awards, and received eight nominations at Detroit’s D Show. It quickly gained 2 million online views and achieved the longest average viewing duration of any Chevy online advertising. The project had a huge global reach, with close to 100 pieces of international press coverage.

“This presentation demonstrates what’s possible when combining live action with the power of Unreal Engine’s real-time photorealistic rendering capabilities – essentially taking the ‘post’ out of post-production.”
— Kim Libreri, CTO, Epic Games

“The effects on the sector are still nascent,” says Alistair. “Until now, the car industry has been segmented across different areas of its business, with manufacturing, dealership, marketing and advertising teams all working separately. Now there is a case for creating high-quality content and assets that can be used across all of these areas, to sell to people across all mediums and traverse different areas of the business. This is a major change. Like any major change, it takes a while for people to accept it and for it to gather pace. But it feels like we’re at the tipping point.”

And not only that: the project shows what’s possible across multiple sectors, with a reach far beyond car commercials. “This is a pivotal moment for film VFX and the coming era of augmented reality production,” says Angus Kneale. There has already been plenty of interest in the technology from a variety of brands, confirms Alistair Thompson. Architectural firms, for example, are keen to use it to let their clients personalise build specifications well in advance of project completion. The technology also makes it possible for brands to visualise products or objects that are not yet available or don’t yet exist.

This is only the beginning – “we’re only scratching the surface,” says Alistair. Using this technology, film-makers can see a virtual object or character rendered in real-time in any live action environment. Consumers can personalise their viewing experience by configuring any object on any screen. They have more flexibility when shopping for products, tailoring what they want to see in real-time. Brands now have the ability to create and deliver truly responsive advertising content. And in creative storytelling, audiences can dictate the path of what they are watching.

“People now have the opportunity to engage with content on multiple levels and to alter it,” explains Alistair. “This could mean having control over a character in an animated film. Or choosing your own car to race in an on-screen car chase – why not put yourself and the car you want to drive into the film sequence? The technology exists to do that.”

The challenge when you do something that’s never been done before, he reveals, is working it out as you go. And the challenge for brands across all sectors is the speed at which the technology is moving. But it is essential that brands are not afraid to try things out.

“We’re not seeing many examples yet of clients using real-time technology with AR, but I think we will see this change quickly. Real-time technology, along with AR, VR and MR content requiring immersive graphics, will start to become the norm for brands. Visualising content in new ways and creating interactivity within it will play a major part in our future world.”

Words by Bernadette Fallon

