More of the worst things about modern video games

A couple of months ago I took a look at some of the trends I hate the most in the modern games industry. But one list wasn’t comprehensive enough, apparently, because I’ve found ten more of the worst things to look at today!

Gaming as a hobby has come a long way since I first owned a Super Nintendo. Games have evolved from being little more than electronic toys to being a legitimate artistic and storytelling medium in their own right, and many of my favourite entertainment experiences of all time are in the gaming realm. Games can equal, and in some cases surpass, film and television.

Mass Effect 2 has to be one of the best stories I’ve ever experienced.

But not everything about gaming is fun! There are annoyances and problems with games today, some of which didn’t exist a few years ago, and others which have dogged the medium since its inception. As always, this list is entirely subjective, so if I criticise something you like, or ignore something you hate, please keep in mind that all of this is just the opinion of one person. If you want to check out my previous list, you can find it by clicking or tapping here.

With all that out of the way, let’s get started!

Number 1: Checkpoints

Cal Kestis at a checkpoint in Star Wars Jedi: Fallen Order.

Is it 1996? No? Then let’s stop using checkpoints and allow players the freedom to save their game whenever and wherever they need to! With relatively few sensible exceptions – like in the middle of a boss fight or during a cut-scene – there’s no reason why modern games can’t incorporate a free save system.

Checkpoints were a limitation of older hardware and software; games and consoles weren’t always able to offer players the ability to save the game anywhere, so designated save zones – or checkpoints – had to be incorporated. This was already a step up from passwords that you had to write down (remember those?) but checkpoints are simply unnecessary and out-of-date in modern games.

Control also uses a checkpoint system.

Gaming has grown enormously in the years since checkpoints were the only way to manage save files, and more people from different backgrounds are getting into the hobby – including many more adults, working-age people, and folks with less free time. Having to replay a lengthy section because a game didn’t offer the freedom to save when you needed to is incredibly frustrating, and considering that there’s no technical reason not to implement a proper save system, in my opinion there’s no excuse.

Whine all you want about “vision” and “integrity” and telling players to “git gud,” but a lot of folks simply want to play through a fun and entertaining narrative – and to play through it once, not multiple times because of the lack of a convenient save function. Checkpoints seemed to have largely disappeared until the likes of Dark Souls brought them back as part of its “extreme difficulty” shtick. But there’s a difference between a challenge and a frustration; checkpoints fall squarely into the latter category.

Number 2: Boring and/or repetitive side-missions

“Another settlement needs our help.”

It’s no good bragging about the number of quests or missions in your game if 80% of them are the same – or equally bad! Open-world games tend to fall victim to this, but it’s a phenomenon that can plague all manner of different titles.

These kinds of missions follow one of a couple of different formulae: “go to location X and pick up item Y” or “go to location X and kill Y number of enemies.” Then that’s it. Mission over, receive a few experience points or a random, usually-not-worth-it item, and repeat. Such quests are nothing but padding for a game that should’ve been shorter and more focused.

The Mako in Mass Effect: Legendary Edition.

Even otherwise good games can end up going down this route, and Mass Effect 1 is a case in point. The main story missions are phenomenal, and the stories setting up the side-missions often sound potentially interesting – but each one basically consists of “drive vehicle to location, kill enemies, press button.” Because 90% of the side-missions use near-identical maps and environments, this gets old fast, however intriguing the setup.

If you can’t make a good side-mission, skip it. I’d rather play a game that isn’t as long but doesn’t have this unnecessary fluff padding it out and, frankly, wasting my time.

Number 3: Collect-a-thons

Another feather. Yay.

On a related note, many open-world games are increasingly padded out with miscellaneous items to collect. Pick up a feather, for example, and the game will tell you that you’ve discovered 1/100 – only 99 more to go! These items almost always have no impact on a title’s plot or gameplay, and often the only reward for tracking down every last one is a trophy or achievement.

At least boring side-missions usually have some kind of setup. A villager needs you to kill the rats in his basement, an admiral needs you to shut down all four computer cores, etc. Though the missions themselves are junk, a modicum of thought went into their creation. Collect-a-thons have no such redeeming feature. Often the items to be collected are so random that they have no link whatsoever to the plot or character.

Pigeons in Grand Theft Auto IV are another example.

Why does my grizzled war veteran on a mission to save the world need to spend his time hunting down 100 feathers or 50 leaves? If the items did something – anything – like if they could be used for crafting or if they were notes or recordings containing lore and info about the game world, well at least there’d be a point. It wouldn’t necessarily be a good point, but still.

These items are added into games – often in obscure or hard-to-reach places – purely to pad out the experience and extend its runtime. They serve no purpose, either narratively or in terms of gameplay, and while I have no doubt that some players find collecting every single in-game item fun, I’d rather the effort and attention wasted on features like this were refocused elsewhere. One side-mission, even an average one, would be better than 100 random pieces of shit to collect.

Number 4: Online cheating

An aimbot for popular game Fortnite.

If you have a single-player game and want to turn on god mode or assisted aiming, go for it. Cheats can sometimes be accessibility features, offering a route through a game for players with disabilities, as well as providing a way to skip the grind for players who don’t have much time. But when you go online and play against real people, you damn well better leave the cheats behind!

There are so many examples of cheating players getting caught and banned that it can be kind of funny. Even some professional and wannabe-professional players have been caught out and learned the hard way that the internet never forgets. But no one should be doing this in the first place.

Some losers even cheated at Fall Guys, for heaven’s sake…

Trying to take away the most fundamental tenet of competition – fairness – is so phenomenally selfish that I don’t even know what to say. If there were a financial incentive – like winning the prize money at a big tournament – I could at least recognise that some folks would be tempted to try to take the easy route to payday. But in a game like Fall Guys where it’s supposed to be fun… I just don’t get why someone would feel the need to cheat.

Some games have a bigger problem with cheating than others, and games that don’t get a handle on a cheating problem fast can find themselves in serious jeopardy. It’s unfortunate that the anonymity of the internet means that a lot of players simply get away with it, with some even going so far as to use “disposable” accounts, so that if one gets banned they can just hop to another and keep right on cheating.

Number 5: Overly large, confusing levels

Looks like fun…

We kind of touched on this last time when considering empty open worlds, but some games have poorly-designed levels that are too large and almost maze-like. Getting lost or running in circles – especially if no map is provided – can become frustrating very quickly. These kinds of levels are often repetitive and bland with little going on.

Some games have levels which are simply not well laid-out, making it difficult to find the right path forward. I’ve lost count of the times I thought I was investigating a side-area, only to find it was the main path forward – and vice versa. Advancements in technology – particularly in terms of file sizes – mean that levels and worlds can be physically larger. Sometimes that’s a good thing, but sometimes it isn’t!

This also applies to featureless open worlds or maps without landmarks for ease of navigation.

If a game has a map, or if a level is well-signposted (either literally or figuratively), then it shouldn’t matter how large it is. Players will be able to figure out where to explore and where to go to proceed with the story or quest. But too often that isn’t the case, and getting lost, backtracking, or not knowing where to go are all annoyances! Not every level has to be massive. Some work far better when kept concise, especially if the number of things to find or do in the level is limited.

Obviously I’m not including mazes or levels which are deliberately designed to be puzzling. Some games make clever use of such levels, where exploring and figuring out the right path is all part of the fun. Others just screw up their level design and leave players wandering around, confused.

Number 6: Orphaned franchises/unfinished stories

I’m not even going to say it…

Though the phenomenon of a story being abandoned partway through is hardly new – nor even unique to gaming – the rise of more cinematic, story-driven games since the turn of the millennium has brought this issue to the fore. The first encounter I had with this was in 2001 when Shenmue II dropped off the face of the earth (following abysmal sales in Japan and elsewhere) meaning that the saga was never finished.

But it isn’t just financial failures that go without sequels. The lack of a third game in the Half-Life series has become a joke at this point, more than fifteen years after the last mainline entry. Fans have been clamouring for Half-Life 3 for a long time, and the recent success of VR title Half-Life: Alyx proves there’s a market and that the game’s audience is still here.

Will there ever be a Bully 2?

Sometimes a studio gets busy with other projects. There hasn’t been a new Elder Scrolls game, for example, in part because Bethesda has worked on the Fallout franchise and Starfield in the years since Skyrim was released. But there are also plenty of cases where a developer or publisher finds a cash-cow and abandons all pretence at making any new game so they can milk it dry.

Look at Rockstar with Grand Theft Auto V’s online mode, or Valve with its Steam digital shop and the success of online games like Dota 2 and Counter-Strike: Global Offensive. Those studios could make new games or sequels to existing games, but instead choose to focus on older titles. Similarly, studios like Bethesda found success by porting existing games to new and different hardware, as well as releasing new or updated versions of older games.

Number 7: Ultra Special Super Extreme Deluxe Editions

How many different “editions” does a game need?!

I’m not talking about so-called “collector’s editions” of games, which are often simply the game plus a statue or other memorabilia. Those can be fine, because if someone is willing to part with silly money to get a resin statue of an in-game character who am I to judge? What I greatly dislike are games that are sold with multiple “editions” – i.e. a “basic” version with missing features, then several progressively more expensive versions with those missing features added back in.

Some games take this to silly extremes, with a “basic” version retailing for full price (£55/$60) and the most expensive “deluxe” edition costing far more for the sake of in-game content (extra skins, missions, etc.) that was developed alongside the main game and then cut out. Some of these ultra extreme special editions can retail for £80, £90, or even £100, and that’s just deceptive.

Sports games, like the FIFA series, do this a lot.

This is an evolution of the “day-one DLC” phenomenon of a few years ago. In the case of Mass Effect 3, for example, an entire main character, a mission to recruit them, and all of their scenes and dialogue were developed along with the game – perfectly integrated and designed to be part of it – then cut out and sold as downloadable content on the very day the game launched.

In multiplayer titles, these extreme special supreme editions can come with in-game advantages, making them effectively pay-to-win. In free-to-play games, perhaps a degree of paying for an advantage is to be expected – but some of these games charge full price, then give a competitive advantage to players who pay even more on top.

Number 8: Unrepresentative trailers/marketing material

Anthem made a fake trailer… and look what happened to the game.

I used to work in video games marketing, and I thought I’d seen every shady trick in the book! But some of the trailers and marketing material that publishers show off in the run-up to the launch of a new game can be downright deceptive. Some games, like notorious failure Anthem, even went so far as to create fake “in-game” footage to be shown off at marketing events, which is incredibly bad form.

Cyberpunk 2077 is another example. That game was developed to run on high-end PCs and next-gen consoles, and the Xbox One/PlayStation 4 version was so poorly-optimised when it launched that many folks considered it to be literally “unplayable.” The trailers and marketing material hid this fact, and developer CD Projekt Red deliberately kept those versions of the game away from reviewers. The result was that no one realised how broken the game was until it was too late.

CD Projekt Red didn’t show things like this in the Cyberpunk 2077 trailer…

Mobile games are notorious for putting out trailers that are entirely unrepresentative of the games they’re selling. Many mobile games are samey, basic tap-a-thons with unimpressive graphics and mediocre gameplay, yet their trailers make them seem like big-budget, console-quality titles. In a way this isn’t new; games in the 8-bit era were often marketed with cartoon artwork and fancy box art that made them look far better than they were!

The thing is, unrepresentative marketing always comes back to bite a company. Just ask CD Projekt Red, whose implosion in the aftermath of Cyberpunk 2077’s abysmal launch will go down in gaming history.

Number 9: Massive patches and updates

Yikes.

Last time I criticised ridiculously huge file sizes for games, and this time I want to pick on updates and patches in particular. There’s no feeling more disappointing than sitting down to play a game you’ve been looking forward to all day only to find that either the game or the console needs to download a stupidly large update before you can jump in.

Some updates can be dozens of gigabytes, and if you’re on a slow internet connection (like I am) or have a capped data allowance, updating a game can take forever – or be outright impossible. Once again, folks with limited time for gaming lose out here; even on a reasonably fast connection, a massive update can cut into or erase the time someone set aside to play.

After buying a brand-new console, downloading patches and updates can be a time-consuming task.

The stupid thing is that many of these updates appear to change absolutely nothing! I’ve lost track of how many times Steam has updated itself on my PC only to look exactly the same as before. It’s good that games companies can roll out bug fixes, patch out glitches, and even tackle cheating remotely – but these updates always seem to arrive at the most inconvenient times!

In the run-up to Christmas it’s now commonplace, even in mainstream news outlets, to see advice given to update new consoles and games before giving them out as presents. Little Timmy’s Christmas would be ruined if he had to spend all of Christmas Day waiting around for his new PlayStation to update before he could use it!

Number 10: We’re drowning in sequels, remakes, and spin-offs

The Final Fantasy series is up to its fifteenth mainline title…

It’s increasingly rare for a games company to produce a new game that isn’t based on an existing franchise or property. Don’t get me wrong, this isn’t an issue unique to gaming – it’s happening on television and in cinema too. We’re 100% in the era of the franchise.

As great as it is to play a sequel to a much-loved title, it’s also great fun to get stuck into a completely new story with new characters and a new world. Unfortunately, as is the case in television and cinema, companies are increasingly viewing brand-new stories as risky – if fans don’t respond well then their investment will have been wasted!

How many Call of Duty games have there been by now?

Sooner or later, I think this franchise and sequel mania has to break. It can’t go on forever, not least because existing franchises will run out of material and fans will lose interest. But right now it shows absolutely no signs of abating, and some video game franchises have become annual or almost-annual fixtures. The Call of Duty series is a case in point – there’s been a new game every year since 2005.

I appreciate studios willing to stick their necks out and take a risk. Control is a good recent example of a successful new IP, and Starfield will be Bethesda’s first wholly new property in decades when it’s finally ready. But there’s certainly less storytelling innovation than there used to be, and fewer new games in favour of sequels, franchises, and spin-offs.

So that’s it. Ten more things that bug me about modern gaming!

I’m sure I’ll be able to think of more later!

Although we’ve now found twenty annoying trends in modern gaming, the hobby is generally in a good place. Technological improvements mean games look better than ever, the increase in gaming’s popularity has brought more money into the industry, and quality standards are generally rising rather than falling. There are problems, of course, but the industry as a whole isn’t in a terrible place.

At the end of the day, it’s fun to complain and have a bit of a rant! The last list I published seemed to be well-read, so I hope this one has been a bit of fun as well! Now if only someone would make a Star Trek video game… perhaps the lack of one warrants a place on my next list!

You can find my first list of the worst things about modern video games by clicking or tapping here.

All titles mentioned above are the copyright of their studio, developer, and/or publisher. Some screenshots and promotional art courtesy of press kits on IGDB. This article contains the thoughts and opinions of one person only and is not intended to cause any offence.

The worst things about modern video games

The first home console I owned – after saving up my hard-earned pocket money and pestering my parents for ages – was a Super Nintendo. Gaming has changed a lot since then, and while many of those changes have been fantastic and introduced us to new genres, not every change has been for the better! In this list I’m going to cover some of my biggest pet peeves with video games in 2021.

As always, this list is entirely subjective. If I criticise something you like, or exclude something you hate, just keep in mind that this is only one person’s opinion. Gaming is a huge hobby that includes many people with many different perspectives. If yours and mine don’t align, that’s okay!

Number 1: No difficulty options.

Some people play video games because they love the challenge of a punishingly-difficult title, and the reward of finally overcoming an impossible level after hours of perseverance. I am not one of those people! In most cases, I play video games for escapism and entertainment – I want to see a story unfold or just switch off from other aspects of my life for a while. Excessive difficulty is frustrating and off-putting for me.

As someone with health issues, I would argue that difficulty settings are a form of accessibility. Some people don’t have the ability to hit keys or buttons in rapid succession, and in some titles the lack of a difficulty setting – particularly if the game is not well-balanced – can mean those games are unavailable to folks with disabilities.

While many games are too difficult, the reverse can also be true. Some titles are just too easy for some people – I’m almost never in that category, but still! Games that have no difficulty settings where the base game is incredibly easy can be unenjoyable for some folks, particularly if the challenge was what got them interested in the first place.

In 2021, most games include difficulty options as standard. Difficulty settings have been part of games for decades, and in my opinion there’s no technical reason to exclude them – nor, really, a “creative” one. Some developers talk in grandiose terms about their “vision” for a title as the reason they didn’t implement difficulty options, but as I’ve said before, the inclusion of an easier (or harder) mode does not impact the game at all; it only affects those who choose to turn it on. Considering how easy it is to implement, I find it incredibly annoying when a game deliberately ships without any difficulty options.

Number 2: Excessive difficulty as a game’s only selling point.

While we’re on the subject of difficulty, another pet peeve of mine is games whose entire identity is based on their difficulty (or perceived difficulty). Think about it for a moment: would Dark Souls – an otherwise bland, uninspired hack-and-slash game – still be talked about ten years after its release were it not for its reputation for being impossibly difficult? How many other late-2000s or early-’10s hack-and-slash games remain in the cultural conversation? The only thing keeping Dark Souls there is its difficulty.

A challenge is all well and good, and I don’t begrudge players who seek that out. But for me, a game has to offer something more than that. If there’s a story worth telling under the difficult gameplay I’m impressed. If the difficult, punishing gameplay is all there is, then that’s boring!

Difficulty can also be used by developers as cover for a short or uninteresting game. Forcing players to replay long sections over and over and over can massively pad out a game’s runtime, and if that’s a concern then cranking the difficulty to ridiculous levels – and offering no way to turn it down – can turn a short game into a long one artificially.

I’m all for games that offer replay value, but being forced to replay the same level or checkpoint – or battle the same boss over and over – purely because of how frustratingly hard the developers chose to make things simply isn’t fun for me.

Number 3: Ridiculous file sizes.

Hey Call of Duty? Your crappy multiplayer mode does not need to be 200 gigabytes. Nor does any game, for that matter. It’s great that modern technology allows developers to create realistic-looking worlds, but some studios are far better than others when it comes to making the best use of space! Some modern games do need to be large to incorporate everything, but even so there’s “large” and then there’s “too large.”

For a lot of folks this is an issue for two main reasons: data caps and download speeds. On my current connection I’m lucky to get a download speed of 7 Mbps, and downloading huge game files can quite literally take several days – days in which doing anything else online would be impossibly slow! But I’m fortunate compared to some people, because I’m not limited in the amount of data I can download by my ISP.

In many parts of the world, and on cheaper broadband connections, data caps are very much still a thing. Large game files can take up an entire month’s worth of data – or even more in some cases – making games with huge files totally inaccessible to a large number of people.

This one doesn’t seem like it’s going away any time soon, though. In fact, we’re likely to see file sizes continue to get larger as games push for higher resolutions, larger environments, and more detail.

Number 4: Empty open worlds.

Let’s call this one “the Fallout 76 problem.” Open worlds became a trend in gaming at some point in the last decade, to the point that many franchises pursued this style even when it didn’t suit their gameplay. Read the marketing material of many modern titles and you’ll see bragging about the size of the game world: 50km², 100km², 1,000km², and so on. But many of these open worlds are just empty and boring, with much of the map taken up by vast expanses of nothing.

It is simply not much fun to have to travel across a boring environment – or even a decently pretty one – for ages just to get to the next mission or part of the story. Level design used to be concise and clever; modern open worlds, especially those which brag about their size, tend to be too large, with too little going on.

The reason Fallout 76 encapsulates this for me is twofold. Firstly, Bethesda droned on and on in the weeks before the game’s release about how the world they’d created was the “biggest ever!” And secondly, at launch the game had literally zero non-player characters. That huge open world was populated by a handful of other players, non-sentient monsters, and nothing else. It was one of the worst games of the last few years as a result.

Open worlds can work well in games that are suited for that style of gameplay. But too many studios have been pushed into creating an open world simply to fit in with a current trend, and those open worlds tend to just flat-out suck because of it. Even when developers have tried to throw players a bone by adding in collect-a-thons, those get boring fast.

Number 5: Pixel graphics as a selling point.

There are some great modern games that use a deliberately 8-bit look. But for every modern classic there are fifty shades of shit; games that think pixel graphics and the word “retro” are cover for creating a mediocre or just plain bad title.

It may be hard to remember, but there was a time when the idea of using a deliberately “old-school” aesthetic would have been laughed at. The first few console generations were all about improvements, and I’m old enough to remember when 3D was a huge deal. It seemed like nobody would ever want to go back to playing a SNES game after trying the Nintendo 64, and while there are still plenty of gamers who love the retro feel, I’m generally not one of them.

That isn’t to say that realistic graphics should be the only thing a game strives for. This point applies to modern graphics and visual styles in general: bragging about how detailed the graphics are, or how unique a title’s art style is, means nothing if the game itself is shit. But it likewise applies to pixel-graphics games – an outdated art style does not compensate for, or cover up, a fundamentally flawed and unenjoyable experience.

Games with pixel graphics can be good, and many titles have surprised me by how good they are. I’ve written before about how Minecraft surprised me by being so much more than I expected, and that’s one example. But I guess what I’d say is this: if your game looks like it should have been released in 1991, you’ve got more of an uphill battle to win me over – or even convince me to try it in the first place – than you would if your game looked new.

Number 6: Unnecessary remakes.

We called one of the entries above “the Fallout 76 problem,” so let’s call this one “the Mass Effect: Legendary Edition problem.” In short, games from even ten or fifteen years ago still look pretty good and play well. There’s far less of a difference between games from 2011 and 2021 than there was between games from 1991 and 2001 – the pace of technological change, at least in gaming, has slowed.

“Updating” or “remaking” a game from ten years ago serves no real purpose, and in the case of Mass Effect: Legendary Edition I’ve struggled at times to tell which version of the game is the new one when looking at pre-release marketing material. There’s no compelling reason to remake games that aren’t very old. Re-release them or give them a renewed marketing push if you want to drum up sales or draw attention to a series, but don’t bill your minor upgrade as a “remake.”

There are some games that have benefitted hugely from being remade. I’d point to Crash Bandicoot and Resident Evil 2 as two great examples. But those games were both over twenty years old at the time they were remade, and having been released in the PlayStation 1 era, both saw massive upgrades such that they were truly worthy of the “remake” label.

I’ve put together two lists of games that I’d love to see remade, but when I did so I deliberately excluded titles from the last two console generations. Those games, as I said at the time, are too recent to see any substantial benefits from a remake. In another decade or so, assuming sufficient technological progress has been made, we can talk about remaking PlayStation 3 or PlayStation 4 games – but not now!

Number 7: Fake “remakes.”

On a related note to the point above, if a title is billed as a “remake,” I expect to see substantial changes and improvements. If all that’s happened is a developer has run an old title through an upscaler and added widescreen support, that’s not a remake!

A lot of titles that acquire the “HD” suffix seem to suffer from this problem. Shenmue I & II on PC contained a number of bugs and glitches – some of which existed in the Dreamcast version! When Sega decided to “remake” these two amazing games, they couldn’t even be bothered to patch out bugs that were over fifteen years old. That has to be some of the sloppiest, laziest work I’ve ever seen.

There are other examples of this, where a project may have started out with good intentions but was scaled back and scaled back some more to the point that it ended up being little more than an upscaled re-release. Kingdoms of Amalur: Re-Reckoning springs to mind as an example from just last year.

Remakes are an opportunity to go back to the drawing board, fix issues, update a title, and bring it into the modern world. Too many “remakes” fail to address issues with the original version of the game. We could even point to Mass Effect: Legendary Edition’s refusal to address criticism of the ending of Mass Effect 3 as yet another example of a missed opportunity.

Number 8: The “release now, fix later” business model.

This isn’t the first time I’ve criticised the “release now, fix later” approach taken by too many modern games – and it likely won’t be the last! Often dressed up as “live services,” games that go down this route almost always underperform and draw criticism, and they absolutely deserve it. The addition of internet connectivity to home consoles has let games companies take a “good enough” approach, releasing titles before they’re ready with the intention of patching out bugs and adding more content later.

Cyberpunk 2077 is one of the most recent and most egregious examples of this phenomenon, being released on Xbox One and PlayStation 4 in a state so appallingly bad that many considered it “unplayable.” But there are hundreds of other examples going back to the early part of the last decade. Fortunately, out of all the entries on this list, this is the one that shows at least some signs of going away!

The fundamental flaw in this approach, of course, is that games with potential end up having launches that are mediocre at best, and when they naturally underperform due to bad reviews and word-of-mouth, companies panic! Planned updates are scrapped to avoid pumping more money into a failed product, and a game that could have been decent ends up being forgotten.

For every No Man’s Sky that manages to claw its way to success, there are a dozen Anthems or Mass Effect: Andromedas which fail. Time will tell if Cyberpunk 2077 can rebuild itself and its reputation, but it’s an uphill struggle – and a totally unnecessary one; a self-inflicted wound. If publishers would just wait and delay clearly-unfinished games instead of forcing them to meet arbitrary deadlines, gaming would be a much more enjoyable hobby. Remember, everyone: NO PRE-ORDERS!

Number 9: Forcing games to be multiplayer and/or scrapping single-player modes

Some games are built from the ground up with multiplayer in mind – but many others are not, and have multiplayer modes tacked on for no reason. The Last of Us had an unnecessary multiplayer mode, as did Mass Effect 3. Did you even know about those modes, or notice them when you booted up those story-focused games?

Some games and even whole genres are just not well-suited to multiplayer – and even those that are can still support single-player stories. Many gamers associate the first-person shooter genre with multiplayer, and it’s true that multiplayer games work well in the first-person shooter space. But so do single-player titles – yet aside from 2016’s Doom and the newer Wolfenstein titles, I can’t think of many recent single-player first-person shooters, or even shooters whose single-player modes felt anything other than tacked-on.

Anthem is one of the biggest failures of the last few years, despite BioWare wanting it to be the video game equivalent of Bob Dylan. But if Anthem hadn’t been multiplayer and had instead maintained BioWare’s usual single-player focus, who knows what it could have been. There was potential in its Iron Man-esque flying suits, but that potential was wasted on a mediocre-at-best multiplayer shooter.

I started playing games before the internet, when “multiplayer” meant buying a second controller and plugging it into the console’s only other available port! So I know I’m biased because of that. But just a few short years ago it felt as though there were many more single-player titles, and fewer games that felt as though multiplayer modes had been artificially forced in. In the wake of huge financial successes such as Grand Theft Auto V, Fortnite, and the like, publishers see multiplayer as a cash cow – but I wish they didn’t!

Number 10: Early access

How many times have you been excited to see that a game you’ve been waiting for is finally available to buy… only to see the two most awful words in the entire gaming lexicon: “Early Access”? Early access billed itself as a way for indie developers to get feedback on their games before going ahead with a full release, and I want to be clear on this point: I don’t begrudge indie games using it for that purpose. Indies get a pass!

But recently there’s been a trend for huge game studios to use early access as free labour; a cheap replacement for paying the wages of a quality assurance department. When I worked for a large games company in the past, I knew a number of QA testers, and the job is not an easy one. It certainly isn’t one that studios should be pushing off onto players, yet that’s exactly what a number of them have been doing. Early access, if it exists at all, should be a way for small studios to hone and polish their game, and maybe add fan-requested extras, not for big companies to save money on testers.

Then there are the perpetual early access games. You know the ones: they entered early access in 2015 and are still there today. Platforms like Steam which offer early access need to set time limits, because unfortunately some games are just taking the piss. If your game has been out since 2015, then it’s out. It’s not in early access, you’ve released it.

Unlike most of the entries on this list, early access started out with genuinely good intentions. When used appropriately by indie developers, it’s fine and I don’t have any issue with it. But big companies should know better, and games that enter early access and never leave should be booted out!

Bonus: Online harassment

Though this problem afflicts the entire internet regardless of where you go, it’s significant in the gaming realm. Developers, publishers, even individual employees of games studios can find themselves subjected to campaigns of online harassment by so-called “fans” who’ve decided to take issue with something in a recent title.

Let’s be clear: there is never any excuse for this. No game, no matter how bad it is, is worth harassing someone over. It’s possible to criticise games and their companies in a constructive way, or at least in a way that doesn’t get personal. There’s never any need to go after a developer personally, and especially not to send someone death threats.

We’ve seen this happen when games are delayed. We’ve seen it happen when games release too early in a broken state. In the case of Cyberpunk 2077, we’ve seen both. Toxic people will always find a reason to be toxic, unfortunately, and in many ways the anonymity of the internet has brought out the worst in human nature.

No developer or anyone who works in the games industry deserves to be threatened or harassed. It’s awful, it needs to stop, and the petty, toxic people who engage in this scummy activity do not deserve to be called “fans.”

So that’s it. Ten of my pet peeves with modern gaming.

This was a rant, but it was just for fun so I hope you don’t mind! There are some truly annoying things – and some truly annoying people – involved in gaming in 2021, and as much fun as playing games can be, it can be a frustrating experience as well. Some of these things are fads – short-term trends that will evaporate as the industry moves on. But others, like the move away from single-player games toward ongoing multiplayer experiences, seem like they’re here to stay.

Gaming has changed an awful lot since I first picked up a control pad. And it will continue to evolve and adapt – the games industry may be unrecognisable in fifteen or twenty years’ time! We’ll have to keep our fingers crossed for positive changes to come.

All titles mentioned above are the copyright of their respective developer, publisher, and/or studio. Some stock images courtesy of pixabay. Some screenshots and promotional artwork courtesy of IGDB. This article contains the thoughts and opinions of one person only and is not intended to cause any offence.

The odd criticism of Six Days In Fallujah

This article discusses the Iraq War and the Second Battle of Fallujah and may be uncomfortable for some readers.

One of the bloodiest and most controversial battles of the Iraq War was the Second Battle of Fallujah, which took place in November 2004. The battle saw coalition forces – most of them American, though a number of Iraqi and British troops took part as well – capture the city from al-Qaeda and other insurgent forces. The Iraq War is controversial and its history complicated, and I’m simplifying the events of the battle and the war to avoid making this article about a video game too long. Suffice it to say that even now, eighteen years since the United States led a coalition to defeat Saddam Hussein, and more than sixteen years since the Battle of Fallujah, the events are controversial and disputed, and the consequences of military action are still being felt in Iraq, the wider Middle East, and indeed the whole world.

Six Days In Fallujah is a video game depicting the battle from the American side, and when it was initially in development in the late 2000s it became incredibly controversial in the United States, with politicians and Iraq War veterans’ groups expressing opposition and disgust. The idea of recreating for fun any aspect of one of the most divisive conflicts of the last few decades was considered obscene, and the idea of encouraging gamers to play through a battle that took place, at that time, a mere five years earlier was too much for many people to countenance.

After the controversy boiled over and saw media personalities and politicians get involved in 2009, Six Days In Fallujah disappeared, and by 2010 or 2011 the project was effectively shelved. The critics moved on, the developers moved on, and that appeared to be the end of the matter.

Last month, however, there came the announcement from a studio called Highwire Games – which is said to consist of developers who worked on games in the Halo and Destiny franchises at Bungie – that Six Days In Fallujah was back. The game is now scheduled for a late 2021 release date, and plans to retain the original focus that was the cause of such controversy a decade ago. Cue outrage from the expected sources.

What took me by surprise was not the strength of feeling expressed by some veterans of the battle, nor the criticism by largely self-serving politicians. That was to be expected, and the announcement of Six Days In Fallujah went out of its way to highlight how Highwire Games has worked with veterans in particular – clearly anticipating this kind of reaction and trying to pre-empt some of the criticism. Instead what genuinely surprised me was the reaction from some games industry insiders and commentators, who appear to be taking an equally aggressive stance in opposition to Six Days In Fallujah.

Politicians, particularly those to the right of centre, have long campaigned against video gaming as a hobby. Initially games were derided as being wastes of time or childish, but some time in the 1990s the tactic switched to accusing games of inspiring or encouraging violence; equating in-game actions with real-world events. Numerous studies have looked into this claim, by the way, and found it to be without merit. But we’re off-topic.

Advocates of video gaming as a hobby – in which category I must include myself, both as someone who used to work in the industry and as an independent media critic who frequently discusses gaming – have long tried to push back against this narrative and these attacks. “Video games can be art” is a frequently heard refrain from those of us who support the idea of interactive media having merit that extends beyond simple entertainment, and there are many games to which I would direct an opponent to see for themselves that games can be just as valid as works of cinema and literature.

It was disappointing, then, to see folks I would consider allies in the fight for gaming to be taken more seriously calling out Six Days In Fallujah because of its controversial subject matter. Art, particularly art that deals with controversial current and historical events, can be difficult and challenging for its audience – and it’s meant to be. A painting, photograph, novel, or film depicting something like war is sometimes going to challenge our preconceptions and ask us to consider different points of view. That’s what makes art of this kind worthwhile. It’s what makes everything from war photography to protest songs to the entire genre of war cinema incredibly important.

Documentaries and news reports only cover events in one way. The way we as a society come to understand events is partly factual, but it is also informed in part by the art those events inspire. The First World War is covered very well in history textbooks and in newsreels produced at the time, but another side of the conflict – a more intimate, personal side – is seen in the poetry of people like Siegfried Sassoon and Wilfred Owen. The poems they wrote about their wartime experiences were not pure depictions of fact; they were written to both inform and entertain – and perhaps to inform through entertainment.

If we relegate the Iraq War to contemporary news broadcasts and documentaries by the likes of Michael Moore we will miss something important, and so will future generations who want to look back and understand what happened. There are many works of fiction and non-fiction which attempt to show the big picture of what happened in Iraq, from the lies about “weapons of mass destruction” through to the use of banned weapons. Those works absolutely need to exist. But in a way, so does Six Days In Fallujah. It aims to depict, in as realistic a manner as game engines in 2021 will allow, one of America’s most controversial battles of recent decades – an event which will be seen in future, perhaps, as one of the American military’s darkest hours of the entire 21st Century due to their alleged use of illegal white phosphorus.

Getting as many perspectives as possible across as broad an array of media as possible about such an important event seems worthwhile, at least to me. Six Days In Fallujah may ultimately turn out to depict the event poorly, or be a game plagued by technical issues. It might be flat-out crap. But it really does surprise me to hear serious commentators and critics suggest that it shouldn’t be made at all, perhaps because of their own biases and preconceptions about the war and the game’s possible depiction of it.

There is value in art, and if video games are to ever be taken seriously as artistic expression, we need to make sure we allow difficult and challenging works of art to exist in the medium. That doesn’t mean we support them or the messages they want to convey, but rather that we should wait and judge them on merit when they’ve been made. As I said, Six Days In Fallujah may be a dud; an easily-forgotten piece of fluff not worth the energy of all this controversy. But maybe it will be a significant work that aids our understanding of the history of this battle, and the entire Iraq War.

It feels odd, as someone who lived through the Iraq War and all its controversy, to be considering it as an historical event, especially considering its continued relevance. I actually attended a huge anti-war march in London that took place a few weeks before British forces joined the US-led coalition and attacked Iraq. But the beginning of the Iraq War is now almost two decades in the past, and even as the world struggles with the aftermath of those events, we need to create works like Six Days In Fallujah if we’re ever to come to terms with what happened and begin to understand it. We also need to consider future generations – are we leaving them enough information and enough art to understand the mistakes our leaders made in 2003? If we don’t leave that legacy, we risk a future George W. Bush or Tony Blair making the same kind of mistake. I don’t know if Six Days In Fallujah will even be relevant to the conversation, but it’s incredibly important that we find out.

Six Days In Fallujah is the copyright of Highwire Games and Victura. This article contains the thoughts and opinions of one person only and is not intended to cause any offence.

On the subject of gaming addiction

This column deals with the sensitive topic of addiction, and may be uncomfortable for some readers.

In 2018 the World Health Organisation surprised and upset a number of fans of video games when it formally designated “gaming disorder” as a distinct clinical condition. The reaction was, sadly, predictable, and boiled down to some variant of the following argument: “I’m not addicted to video games! Therefore video games can’t possibly be addictive!” Many commentators and outlets that focus on video gaming piled on with complaints and criticism, and the result is that the subject is still controversial even today, almost two years on from the WHO’s initial decision.

I’m not a doctor or psychologist, but I wanted to take a moment to defend the decision to categorise gaming disorder/video game addiction as a separate condition, because I feel that too many people who don’t really understand the topic had a knee-jerk reaction and attacked it. To them it felt like an attack on their hobby, and perhaps what we can glean from that is that the messaging surrounding the decision could have been better and clearer.

Firstly, the commentators who criticised the decision, even those who work for major publications, are universally not medical professionals. Their knowledge of the subject is limited at best, nonexistent at worst, and quite frankly a crowd of uninformed people criticising doctors over a medical decision is comparable to conspiracy theories like the anti-vaccine movement or flat-Earth belief. The people who made the decision to categorise video game addiction in this way are qualified to do so, and they will have made it on the basis of peer-reviewed investigations and evidence. The people who took offence at the decision simply aren’t on that level.

The biggest problem some people seemed to have was that the decision felt like an attack on gaming as a hobby. Many people have long derided games, dismissing them as children’s toys and even blaming gaming for criminal and violent acts, so I can understand why, to some people, this felt like just another attack in a long line. But it isn’t, because the designation of gaming disorder in no way says that all video games are a problem or that all gamers are addicts. The classification of alcoholism as a disease doesn’t mean that the vast majority of drinkers are alcoholics; no sensible person would even dream of making that argument. Alcoholism affects a small minority of drinkers, just as gaming disorder affects a small minority of gamers – and no one is trying to say otherwise.

Something that can become a problem for one person isn’t going to be a problem for everyone. Many gamers – by far the majority – play games in a sensible and responsible way, enjoying their hobby without allowing it to dominate their life. But some people will take it too far, and will allow it to take over, perhaps as an expression of other mental health issues but perhaps simply because they allowed it to get out of hand.

Choosing to classify gaming disorder as a separate and distinct condition means that more studies can be performed in the field and more information disseminated to psychiatrists and other healthcare professionals. The result is that those people who do suffer will have access to better, more specialised help, tailored to their specific problem. That can only be a good thing.

The bar for even suspecting that an individual has gaming disorder is actually quite high. The most important factor is that their gaming is having a detrimental effect on their life – something which could manifest in many ways, and which will vary from person to person.

When I was a student at university many years ago, I witnessed gaming disorder firsthand. I was living in a rented apartment which I shared with just one other person, and this person (who will of course remain nameless) became addicted to video games. The individual in question was, like me, an exchange student, which is how we met and how we came to share an apartment. He had friends back home who he liked to play games with, and this was around the time that online gaming was just taking off.

He would spend endless hours playing an online game, often late into the night, and over the span of a few weeks it began to have a huge impact on his life. He stopped attending classes, which saw him end up in a mess of trouble with the university as he failed every class that semester. His parents found out, which caused personal problems for him with his family, and his failure to pay rent – despite promising me he’d paid his share – almost wound up getting the pair of us evicted.

This was in addition to the weight he lost from not eating properly, the destroyed social relationships with other exchange students at the university, and the missed opportunities to have the once-in-a-lifetime experience of living in another country. Ever since then I’ve used his story as a warning, because his addiction to gaming had serious and lasting consequences.

There is a happy ending to this individual’s story, however, and that is that he did eventually get his life back on track and scale down his gaming. When we parted ways we didn’t keep in touch, so I can’t be certain he’s still living his best life, but as of the last time we were together it definitely seemed that he was moving in the right direction. It took an intervention from his family – who flew halfway around the world to see him after he failed all of his classes – and a twice-weekly therapy appointment to get him to that point, though.

Any time someone tells me that they know loads of people who play games who aren’t addicted, I tell them the story of my ex-roommate, and make the same point: “just because it hasn’t happened to you or someone you care about doesn’t mean it hasn’t happened to anyone.”

I hope that nobody tries to use the designation of gaming disorder to attack what is for most people a fun and innocent hobby. That would be counterproductive, and would lead to people who genuinely have issues with gaming addiction finding it harder to get help. But so far, that doesn’t seem to have happened. The designation is just that: a clinical classification designed to help that small minority of people who have a problem.

It’s worth noting that some games, especially in recent years, have gone out of their way to introduce potentially addictive elements to their gameplay. In particular we can look at lootboxes and randomised rewards, which in many games are little more than gambling – often using real-world money. There are frequent news stories, some of which end up in the mainstream media, of individuals who end up spending hundreds or thousands of pounds on these in-game “micro” transactions. In one case last year here in the UK, a child inadvertently spent his parents’ entire monthly wages in a game.

Putting a warning label of some kind on games that have in-game “micro” transactions is definitely a good idea, but in an era where physical sales of games in boxes (where such a label would be affixed) are in terminal decline, that probably won’t be good enough. And as I noted from my former roommate’s experience, which came long before such in-game transactions were commonplace, gaming addiction doesn’t always manifest with titles that have such systems in place.

We also have to be careful how we use the terminology of addiction – and of mental health in general, but that’s a separate point. When reading reviews of new titles, I often see the word “addictive” thrown around as if it were a positive thing: “this new game is incredibly addictive!” That kind of normalisation and misuse of the term can be problematic, as affected people may simply brush off their addiction by thinking that’s how everyone plays the game. I feel that writers have a certain responsibility to try to avoid this kind of language. Presenting addictiveness as a positive aspect could indirectly contribute to real harm. I’m sure I’ve made this mistake myself on occasion, but it’s something I hope to avoid in future.

Gaming addiction, like other addictions, is a complex problem that is not easily solved. It’s no easier for someone suffering from some form of gaming disorder to “just turn off the console” than it is for an alcoholic to “just stop drinking vodka”. The temptation is always present and it can be overwhelming. Anyone suggesting that it’s a simple case of “just stopping”, as if it were that easy, doesn’t know what they’re talking about. Again, it comes back to the point I made earlier: just because it might be that easy for you doesn’t mean it’s that easy for everybody. One person’s subjective experience is not a complete worldview; many people find it impossible to break the cycle of addiction without help. This classification has the potential to make more specialised help available, which is the primary reason I support it.

So that’s my take on the subject. Gaming can be addictive, and for a small number of people, that addiction can cause real harm and create lasting problems for themselves and their families. Recognising this reality is a good first step: it means more research can be conducted into the subject, which will hopefully lead to better and more effective treatments for people whose gaming addiction requires outside intervention. I’ve seen firsthand how this can happen, and I have absolutely no time for the argument that goes: “well I don’t have a problem with gaming addiction, so it must be fine for everyone!” That is a blinkered and selfish way to look at the subject.

For anyone reading this who thinks they may be affected by gaming disorder or video game addiction, I’ve prepared a quick checklist of questions you can ask yourself. If you find yourself answering “yes” to any of the points below, I would suggest you reach out to someone who can help – talking to a friend, family member, or someone you trust could be a great first step, and of course professional medical help is always available.

Question #1: Do you find yourself thinking about video games all the time, and planning ways to get back to your game as quickly as possible if interrupted?

Question #2: Have you missed important events – such as work, school, meetings, or other appointments – because you couldn’t tear yourself away from gaming?

Question #3: Do you find yourself unhappy, depressed, angry, or irritated while not gaming? And/or would you say that your happiness is inextricably tied to gaming?

Question #4: Have you ever lied about how much time you spend gaming to cover it up? And/or do you break rules or limits set by others on how much time you may spend gaming?

Question #5: Have you tried to spend less time gaming but failed?

Question #6: Do your friends, family members, or people close to you ever tell you that you spend too much time gaming? And/or do you feel that you have neglected your relationship(s) as a result of gaming?

Question #7: Do you forget to eat or skip meals because of gaming? Do you skip showering or fail to take care of basic hygiene and grooming because of gaming?

While not everyone who answers “yes” to the above questions will be an addict, these points do indicate that something may be amiss with your relationship with gaming.

At the end of the day, if you’re happy with your life and gaming is a hobby, that’s okay. If it isn’t causing any harm to yourself or other people, there is no problem. But for some people gaming can get to a point where it stops being a harmless bit of fun and becomes something more sinister: an addiction. Missing important events, skipping school, neglecting friends, skipping meals, skipping showers, etc. are all points which can indicate an individual’s relationship with gaming is becoming unhealthy, and if you recognise these signs in yourself, I encourage you to reach out and get help.

Yes, gaming disorder or gaming addiction is a real phenomenon. The World Health Organisation did not invent it, all they have done is classify it and formally recognise what many people have known for a long time – that it is real. Far from being an attack on gaming as a hobby, this should be seen as a positive thing, as it has the potential to help affected individuals get better and more appropriate help.

This article contains the thoughts and opinions of one person only and is not intended to cause any offence.