Why Your Video Game Movie Can't Win an Oscar

1993's 'Super Mario Bros.' was the first real video game movie, and it began a legacy of misuse and mediocrity. Glixel

In trying to adapt games, moviemakers take all the wrong cues

Maybe we should have known from the beginning. If Bob Hoskins can’t make a good movie out of a video game, then what hope does anyone else have? We’re talking about an Oscar-nominated actor who made commercials for a UK phone company seem like Palme d’Or-worthy snippets of warming social realism. A plumber rescuing a princess from an army of devolved lizards should've been easy.

Hoskins, of course, played Mario in the first ever Hollywood video game movie, 1993’s Super Mario Bros. It’s a terrible film, but influential in that it set the tone for everything that has come since – almost without exception, every other video game movie has been terrible too, right up to and including last year’s Assassin’s Creed.

This is more than bad luck. In the years since Hoskins' brave but awful Mario, games have changed almost beyond recognition. They're varied, sophisticated, and an ever-larger slice of the population has grown up playing them. And yet the best that can be said of Hollywood's many attempts to translate games into film is that, sometimes, they turn out to be modest non-disasters. Not that this has stopped anyone trying. As LA studio execs cast around for something – anything – that might resonate with the Marvel crowd, there are several game adaptations currently filming or in pre-production, including big series like Uncharted, The Last Of Us, and Tomb Raider. Clearly, they are not going to stop, and none of them are in danger of winning an Oscar anytime soon. So why do they suck so much, and how can they get better?

Here's a handy five-point explainer of why things go wrong when you try to make a video game movie.

1. You don't know why you're making it
Making movies is expensive, and marketing is king. That’s why Hollywood is filled with sequels, remakes and tie-ins. The term they use is 'pre-sold' – if an idea or even just a title is familiar to audiences, you're halfway to getting them to see it. That's why there are eight Fast and Furious movies. That's why Transformers came back, and has stayed back every other summer since. That's why they "adapted" Battleship, which isn't really even a game so much as a set of rules about arguing in boats. The film had ships in it. They battled.

Video games have cultural currency. The big ones are pre-sold. But the problem that video games have when it comes to adaptations is that they have a unique relationship with cinema, a feedback loop of influences and borrowed understandings that sets them apart from toys, comic books or even TV shows. Director Paul WS Anderson once told me that he was asked to helm a movie version of Dead Space, and in turning the offer down told the producers "Have you seen Event Horizon? I already did." Be like Paul. When you're adapting your video game into a film and you want it to be worth seeing, ask yourself – is there something unique about the game that you're setting out to capture? Because if not, there's a chance your film might simply disappear into the soup of generic influences traded from one medium to the other.

Take 2014's adaptation of Need For Speed. It was made, presumably, because the street racing series is among Electronic Arts' biggest-selling properties, and because Fast and Furious proves there is an audience who will pay to see fun crews drive fast cars at the movies. But what does the Need For Speed movie do that any of the recent Fasts and Furiouses don't? It's hard to identify anything in particular. Is there something unique about the game that merits adaptation, that suggests it could have become a great and unique movie? Well, no. So how can you stop your Need For Speed movie from becoming a generic 90 minutes of loud engines and willful blindness to stop signs? People who have seen the film know the answer to this question, and it is a sad one.


Games have been borrowing from cinema for decades. Genre, narrative grammar, action set-pieces – the building blocks of storytelling erected around the interactive mechanics of games are all stolen from Hollywood’s backlot. So when you're moving away from that interactive medium and back to cinema, you need to make sure there's something worth preserving in your game that isn't the experience of playing it, because that's the first thing you will lose.

And don't think you can get away with being "that video game movie", either. While games have been borrowing from movies for decades, movies are now starting to borrow back. The feedback loop is two-way, with the rules and trappings of games finding their way into movies – sometimes obvious, as in Scott Pilgrim Vs The World or Wreck-It Ralph, but also more subtly in films like Source Code and Edge Of Tomorrow. To invoke Fast and Furious just one more time, the series has an easy, repeat-with-variation approach to reality, a fan-pleasing fluidity of logic that surely comes from games. You like The Rock's bad guy? He's good now. You like Michelle Rodriguez? Her character was never really dead. You like Paul Walker? He's not really dead, either.

The bottom line: there is no value to making a movie with a vague video game aesthetic. There is no value to adapting a game whose identifiable features dissolve on contact with another medium. Have a good reason to adapt your game.

2. You're being too faithful to the source material
This is an issue of quality control. Have you made sure your game contains something unique? Excellent. Is that thing lame in a way that will make people familiar with the game ashamed when they watch it in public? This is worth double-checking.

Yes, this is about Assassin’s Creed – which is the latest game-related flop to hit our screens. In many ways Assassin’s Creed is a new breed of video game adaptation. It has a strong, notable cast, and a lead in Michael Fassbender who commits admirably to the hokum. It also comes with a sophisticated, minimalist visual design – all angular logos and cool motifs – that helps sell the idea of games as an industry of refined creativity. And yes, it has something unique to it, a conspiracy theory science fiction based on genetic time travel and the technology of the Animus – the in-game simulator system.

Actually, the big screen Animus is done rather well, transformed from a magical dentist's chair into a suitably cinematic super-VR simulation, part Holodeck, part mega-epidural. But elsewhere the film is scuppered by dogmatic faithfulness. Everyone knows that the modern day sections of the Assassin's Creed games, which bookend the historical bloodletting, are bland time-fillers. And yet the film adopts this same structure, of vague white interiors giving way only occasionally to lusty period stabbing. Worse yet, the film engineers a scenario in which Michael Fassbender's genetically time travelling modern day nobody jumps off a high ledge, and the people watching in the vague white interior gasp and say "The leap of faith!" This is a signature concept from the games, not established in the film to this point, and apparently included to appease players who might have come to the film just to see a specific mechanic writ large on the big screen, regardless of plot or motivation. It is a Bad Moment.


The error, really, is down to a fundamental misreading of the reality put forward by video games. Games are often metaphors for experience or action, encapsulations of a situation that offer an extra dimension of involvement. And not just video games; one thing that happens to me regularly while playing the Star Wars board game Imperial Assault is that, as a mission's mechanics and risk/reward proposition are revealed, I realize they provide a brilliant, engaging interpretation of a trope or scenario – that a hero's special ability to command a friend to take a last-ditch shot is a retelling, through the rules of the game, of stories that I've seen play out on the big screen.

In other words, games will often seem stilted or straight up ludicrous if viewed through the lens of regular reality. Players can make characters dance or walk everywhere backwards, hedges and invisible walls block pathways, bandages provide an instant fix for multiple bullet wounds. The internet at large has transformed these idiosyncrasies into "video game logic" memes, but players tend to be forgiving and intuitive, understanding and accepting systems of representation, and enjoying games regardless.

That's why the game you're adapting might be full of loads of weird, figurative business that the fan base loves. But that doesn't mean you should carry them over into your film. Leave the leap of faith on PlayStation. Calibrate your moment-to-moment reality carefully, and only stay true to your source as far as logic can sustain it.

3. You've not learned a damn thing from Marvel
I mean, obviously.

In many ways, Marvel has it easy. Comic books are further removed from films than games are. They do not move in real time, with music and visual effects and performances, and so they do not compete with the versions of themselves they might become. In fact, they are more like storyboards, more like the blueprints and raw materials of cinema.

But a quick comparison with DC's output shows that Marvel's films are still extraordinary examples of adaptation. One key reason among many is how they manage tone – building a believable reality capable of containing the wide spread of powers and characters. Video game movies, which from the sour-mouthed self-importance of Max Payne to the impenetrable lore of Warcraft are so often serious and self-referential, have much to learn from Marvel’s lightness of tone.

It's humor that binds Marvel's films together. A single line in Joss Whedon's Avengers successfully joins the worlds of a contemporary spy agency, a recently thawed Nazi-fighting super soldier, and a magical Asgardian: "There's only one God, ma'am, and I'm pretty sure he doesn't dress like that." Video games being hauled to the big screen are crying out for this kind of agility, a fleet-footed irreverence that lets the audience know that this isn't the version of the world or characters on show, but a version, an instance of the source in another, equally fun form.

4. You didn't navigate the prestige gap
Video games have matured into a varied, sophisticated medium, capable of sustaining everything from enormous blockbuster thrillers to bedroom-coded abstract puzzlers. But, once run through the process of big screen adaptation, all these differences disappear. When it comes to the movies, no matter what genre any given game adaptation falls into – military action, science fiction, animated avian revenge – it is, above all else, regarded as trash.

Let's blame some people for this. Lovely Bob Hoskins has to take his fair share, because the direction of travel was set by Super Mario Bros and has barely deviated since. Uwe Boll, the German director who built an implausible career on barely competent murderings of mid-tier games (and the German tax break system) also earns a mention, for relentlessly associating video game adaptations with ugliness and the budget DVD circuit. And we can't discount the not-historically-incorrect assumption that games are good at movement, noise and violence, and less able when it comes to quiet, reflection and meaning.


This is more of a problem for some games than others. Let's take two proposed movies based on games made by Naughty Dog: Uncharted and The Last Of Us. With the caveat that they are both long in development and may never be made, if they were to be filmed, Uncharted is a fairly simple proposition – irreverent, action-heavy, the essence of a thousand blockbuster weekends returning to the source, by way of Indiana Jones. But The Last Of Us is something else, a successful if self-aware attempt at "quality" gaming. With The Last Of Us, Naughty Dog was the Miramax or Coen brothers of games. But The Last Of Us: The Movie is the Sam Raimi-produced Sony Screen Gems of film.

Literally. The Last Of Us adaptation was set up with exploitation expert Raimi overseeing, based at the movie studio behind the Resident Evil franchise, Underworld, and Pride And Prejudice And Zombies. If The Last Of Us – memorably and lazily described as "the Citizen Kane of video games" by one hyperbolic reviewer – actually turns out to be the Resident Evil 2 of adaptations, can anything bridge the prestige gap that exists between the two mediums?

I'm not sure that Naughty Dog has thought particularly hard about this question. When I asked Naughty Dog co-president Evan Wells why The Last Of Us needed to be a movie, just after the adaptation was announced, he mentioned two things. First, that as a PlayStation exclusive, The Last Of Us becoming a movie was a way of reaching a bigger audience, and second, that the viewing figures of let's play videos on YouTube provided evidence of an untapped market. These are both good business reasons for making a movie of The Last Of Us, but not reasons it would make a good movie.

Which brings us to:

5. You're not great at making movies
If this list were one entry long, this would be the entry.

As Hollywood is in the slow process of proving definitively, there isn’t a game in existence that can't be made into a bad film. There is, very clearly, nothing inherent to the medium of video games that makes for good film adaptations. So maybe don't make one. And if you do, understand that making your video game adaptation good will have very little to do with your choice of source material, and everything to do with you, and how good you are at making films.