Why Many Video Game Films Are Bad - Raindance

Why are there no good video game movies? This is one of those questions that gamers have stopped asking, and for good reason. Every time a video game adaptation is announced with big fanfare, usually during a big conference such as E3, people automatically assume it’s going to be bad. And, most of the time, it is. From the 1993 Super Mario film, which set the tone for many adaptations to come, to Alone in the Dark, Far Cry, Bloodrayne and Max Payne, video game adaptations have been either ludicrously bad or, in the best-case scenario, mediocre and forgettable.

So, what gives? Why aren’t the big studios, with their vast financial resources and talent pool, capable of delivering a decent adaptation? Sadly, there’s no definitive answer to this question, so let’s take it step by step.

The Nature of the Mediums

The most obvious reason for why video game adaptations don’t work is the nature of the two mediums. While video games are an interactive medium where players have more or less control over how the story unfolds, films are a more passive experience. 

This brings us to another, albeit slightly tangential, problem which is symptomatic of both industries: films needlessly attempting to replicate the experience of playing a video game, and video games trying to replicate the experience of watching a big-budget blockbuster. While there’s nothing inherently wrong with attempting to transpose the spirit of a video game into film form*, the efforts are more often than not misguided.

Instead of taking full advantage of the core elements of the medium, such as editing, sound and acting, to bring the spirit, story and universe of a game to life, filmmakers focus on recreating what they believe brings people to video games – the action sequences. This results in a watered-down product that, apart from lacking the interactivity that attracts people to video games in the first place, is devoid of what made the source material good.

* Doom (2005) with the first-person scene and Prince of Persia (2010) with the great parkour choreography are good examples in this respect. 

The Mediums Are Already Intersecting

This might seem contradictory to my first point, but hear me out: many games are already, in essence, films. If you take a look at the best-selling AAA games of 2018, you will see that many of them have integrated cinematic elements into their storytelling. Examples in this respect are critically and commercially acclaimed franchises such as Mass Effect, The Last of Us, Red Dead Redemption, and Grand Theft Auto. If you’re familiar with the industry, you probably know that “it’s just like a movie!” became a buzzword thrown around by AAA publishers to promote upcoming games.

So, since video games are already interactive films, why would anybody spend their money on a watered-down, two-hour version, when the real thing is a vastly superior experience?

And there’s another aspect to this issue that, quite bafflingly, film producers choose to ignore. In the beginning, the emphasis was on the gameplay and mechanics, and the story was more often than not an afterthought slapped together to give the 1s and 0s some context. 

Only later, during the 90s – which some call the transitional phase – after realising the storytelling potential of the medium, did developers start paying more attention to narrative. Naturally, in order to achieve mainstream credibility – remember, this was a time when gamers were still considered basement-dwelling slackers – games started borrowing heavily from films. One example in this respect is Tomb Raider, which was directly inspired by Raiders of the Lost Ark and, indirectly, by 1930s adventure films.

Dubious Casting Choices 

This is not necessarily related to the talent of the actors, but rather to the casting process and the “artistic liberties” that filmmakers take when it comes to the characters. This is not to say a video game film has to follow the source material religiously, as that would defeat the purpose of an adaptation in the first place.

But it becomes a problem when the casting choices and the changes made to the characters deviate entirely from the source material – or rather, its spirit, to reiterate a previous point. And when casting and characters are not at least partly in line with their video game counterparts, the whole project falls apart.

One evocative example in this sense is the critically panned film Max Payne, starring Mark Wahlberg as the protagonist, and Mila Kunis as Mona Sax. This is in no way a condemnation of Mark Wahlberg’s acting talent, as his work in The Departed and The Fighter proves that he’s capable of playing three-dimensional characters. 

The issue is that Wahlberg doesn’t have the gravitas or the innate broodiness necessary to portray a character such as Max Payne, a cynical cop with a knack for dark metaphors and self-deprecating humour. Not to mention the questionable casting of Mila Kunis for the role of Mona Sax, or Ludacris as Commissioner Jim Bravura, who in the game is twice his age, balding, overweight and white. And actually, this brings us to another problem…

Generational Lag 

According to the Entertainment Software Association’s 2018 annual report, gamers aged 18 or older make up 70% of the video game-playing population, and the average gamer is roughly 34 years old. Now, take a look at this list of the richest Hollywood producers. Notice anything strange? Yes, most of them are either Baby Boomers or Gen Xers.

So, what’s the catch here? To make these projects happen, studios are required (nay, forced) to hire bankable talent in order to secure financing. Since many games are set in sci-fi or fantasy settings, we’re talking about hundreds of millions of dollars just to get these projects off the ground.

With a few notable exceptions such as Henry Cavill, who is an avid gamer, most A-list Hollywood talent – from actors, screenwriters and directors to producers and editors – is not part of a generation that grew up with video games. Consequently, most projects end up in the hands of people who either don’t understand the medium of video games and the source material, or who try to overcompensate for their lack of knowledge and experience by catering to the lowest common denominator – often with bad results.

I’d argue that this generational gap between the Hollywood upper ranks and the gaming population is the biggest reason why we haven’t seen a truly good video game adaptation. Until the current generation of gamers (which I’m a part of) or, more likely, the next one, reaches the top echelons of Hollywood, I doubt we will see a good video game film, apart from the rare oddity.


These are the main reasons why a truly good video game film has yet to bless the big screen. While there are many other factors involved, the dubious casting choices, the intersection of the two mediums, their different natures – one’s interactive, the other passive – and the generational gap between the people creating these films and gamers are the main culprits. To end things on a positive note, Netflix’s upcoming The Witcher series looks pretty good. Maybe it will provide the push that the industry needs to start producing good video game films.



Midwestern emo by night, video game journalist by day. Big fan and self-declared connoisseur of pop culture. If I like something, you’ll probably hear me talk about it ad nauseam, to the exasperation of everyone around. Lead editor at UnleashTheGamer; I sometimes tweet
