The first home console I owned – after saving up my hard-earned pocket money and pestering my parents for ages – was a Super Nintendo. Gaming has changed a lot since then, and while many of those changes have been fantastic and introduced us to new genres, not every change has been for the better! In this list I’m going to cover some of my biggest pet peeves with video games in 2021.
As always, this list is entirely subjective. If I criticise something you like, or exclude something you hate, just keep in mind that this is only one person’s opinion. Gaming is a huge hobby that includes many people with many different perspectives. If yours and mine don’t align, that’s okay!
Number 1: No difficulty options.
Some people play video games because they love the challenge of a punishingly difficult title, and the reward of finally overcoming an impossible level after hours of perseverance. I am not one of those people! In most cases, I play video games for escapism and entertainment – I want to see a story unfold or just switch off from other aspects of my life for a while. Excessive difficulty is frustrating and off-putting for me.
As someone with health issues, I would argue that difficulty settings are a form of accessibility. Some people don’t have the ability to hit keys or buttons in rapid succession, and the lack of a difficulty setting – particularly in a game that isn’t well-balanced – can make a title completely inaccessible to folks with disabilities.
While many games are too difficult, the reverse can also be true. Some titles are just too easy for some people – I’m almost never in that category, but still! A game with no difficulty settings whose baseline is incredibly easy can be unenjoyable for some folks, particularly if the challenge was what got them interested in the first place.
In 2021, most games include difficulty options as a standard feature. Difficulty settings have been part of games for decades, and in my opinion there’s no technical reason why they shouldn’t be included. There isn’t really a “creative” reason, either. Some developers talk in grandiose terms about their “vision” for a title being the reason they didn’t implement difficulty options, but as I’ve said before – the inclusion of an easier (or harder) mode doesn’t affect anyone who chooses not to use it. It only impacts those who turn it on, and considering how simple it is to implement, I find it incredibly annoying when a game is deliberately shipped without any difficulty options.
Number 2: Excessive difficulty as a game’s only selling point.
While we’re on the subject of difficulty, another pet peeve of mine is games whose entire identity is built on their difficulty (or perceived difficulty). Think about this for a moment: would Dark Souls – an otherwise bland, uninspired hack-and-slash game – still be talked about ten years after its release were it not for its reputation for being impossibly difficult? How many other late-2000s or early-2010s hack-and-slash games have dropped out of the cultural conversation? The only thing keeping Dark Souls there is its difficulty.
A challenge is all well and good, and I don’t begrudge players who seek that out. But for me, a game has to offer something more than that. If there’s a story worth telling underneath the difficult gameplay, I’m impressed. If the difficult, punishing gameplay is all there is, then that’s boring!
Difficulty can also be used by developers as cover for a short or uninteresting game. Forcing players to replay long sections over and over and over can massively pad out a game’s runtime, and if runtime is a concern, cranking the difficulty to ridiculous levels – and offering no way to turn it down – can artificially turn a short game into a long one.
I’m all for games that offer replay value, but being forced to replay the same level or checkpoint – or battle the same boss over and over – purely because of how frustratingly hard the developers chose to make things simply isn’t fun for me.
Number 3: Ridiculous file sizes.
Hey, Call of Duty: your crappy multiplayer mode does not need to be 200 gigabytes. Nor does any game, for that matter. It’s great that modern technology allows developers to create realistic-looking worlds, but some studios are far better than others when it comes to making the best use of space! Some modern games do need to be large to incorporate everything, but even so there’s “large” and then there’s “too large.”
For a lot of folks this is an issue for two main reasons: data caps and download speeds. On my current connection I’m lucky to get a download speed of 7 Mbps, and downloading huge game files can quite literally take several days – days in which doing anything else online would be impossibly slow! But I’m fortunate compared to some people, because I’m not limited in the amount of data I can download by my ISP.
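To put that in perspective, here’s a quick back-of-the-envelope sketch – purely illustrative, written in Python, using the figures above (a 200-gigabyte download on a steady 7 Mbps connection):

```python
# Rough estimate: how long does a 200 GB game take to download at 7 Mbps?
# (Best case – assumes a steady connection with nothing else using the line.)

game_size_gb = 200                       # download size in gigabytes
speed_mbps = 7                           # connection speed in megabits per second

size_megabits = game_size_gb * 1000 * 8  # 1 GB = 8,000 megabits
seconds = size_megabits / speed_mbps     # ~228,571 seconds
days = seconds / 86_400                  # seconds in a day

print(f"About {days:.1f} days of non-stop downloading")  # ~2.6 days
```

Roughly two and a half days of uninterrupted downloading at full speed – and that’s the best case. Factor in real-world slowdowns and everything else competing for the connection, and “several days” is no exaggeration.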
In many parts of the world, and on cheaper broadband connections, data caps are very much still a thing. Large game files can take up an entire month’s worth of data – or even more in some cases – making games with huge files totally inaccessible to a large number of people.
This one doesn’t seem like it’s going away any time soon, though. In fact, we’re likely to see file sizes continue to get larger as games push for higher resolutions, larger environments, and more detail.
Number 4: Empty open worlds.
Let’s call this one “the Fallout 76 problem.” Open worlds became a trend in gaming at some point in the last decade, such that many franchises pursued this style even when it didn’t suit their gameplay. Read the marketing material of many modern titles and you’ll see bragging about the size of the game world: 50 km², 100 km², 1,000 km², and so on. But many of these open worlds are just empty and boring, with much of the map taken up by vast expanses of nothing.
It is simply not much fun to have to travel across a boring environment – or even a decently pretty one – for ages just to get to the next mission or part of the story. Level design used to be concise and clever; modern open worlds, especially those which brag about their size, tend to be too large, with too little going on.
The reason Fallout 76 encapsulates this for me is twofold. Firstly, Bethesda droned on and on in the weeks before the game’s release about how the world they’d created was the “biggest ever!” And secondly, the game launched with no human non-player characters at all. That huge open world was populated by a handful of other players, non-sentient monsters, and nothing else. It was one of the worst games of the last few years as a result.
Open worlds can work well in games that are suited for that style of gameplay. But too many studios have been pushed into creating an open world simply to fit in with a current trend, and those open worlds tend to just flat-out suck because of it. Even when developers have tried to throw players a bone by adding in collect-a-thons, those get boring fast.
Number 5: Pixel graphics as a selling point.
There are some great modern games that use a deliberately 8-bit look. But for every modern classic there are fifty shades of shit: games that think pixel graphics and the word “retro” are cover for creating a mediocre or just plain bad title.
It may be hard to remember, but there was a time when the idea of using a deliberately “old-school” aesthetic would have been laughed at. The first few console generations were all about improvements, and I’m old enough to remember when 3D was a huge deal. It seemed like nobody would ever want to go back to playing a SNES game after trying the Nintendo 64, and while there are still plenty of gamers who love the retro feel, I’m generally not one of them.
That isn’t to say that realistic graphics should be the only thing a game strives for. This point applies to modern graphics and visual styles in general – bragging about how detailed the graphics are, or how unique a title’s art style is, means nothing if the game itself is shit. But it applies just as much to pixel-graphics games: an outdated art style does not compensate for or cover up a fundamentally flawed, unenjoyable experience.
Games with pixel graphics can be good, and many titles have surprised me by how good they are. I’ve written before about how Minecraft turned out to be so much more than I expected, and that’s just one example. But I guess what I’d say is this: if your game looks like it should have been released in 1991, you’ve got more of an uphill battle to win me over – or even to convince me to try it in the first place – than you would if your game looked new.
Number 6: Unnecessary remakes.
We called one of the entries above “the Fallout 76 problem,” so let’s call this one “the Mass Effect: Legendary Edition problem.” In short, games from even ten or fifteen years ago still look pretty good and play well. There’s far less of a difference between games from 2011 and 2021 than there was between games from 1991 and 2001 – the pace of technological change, at least in gaming, has slowed.
“Updating” or “remaking” a game from ten years ago serves no real purpose, and in the case of Mass Effect: Legendary Edition I’ve struggled at times to tell which version of the game is the new one when looking at pre-release marketing material. There’s no compelling reason to remake games that aren’t very old. Re-release them or give them a renewed marketing push if you want to drum up sales or draw attention to a series, but don’t bill your minor upgrade as a “remake.”
There are some games that have benefitted hugely from being remade. I’d point to Crash Bandicoot and Resident Evil 2 as two great examples. But those games were both over twenty years old at the time they were remade, and having been released in the PlayStation 1 era, both saw massive upgrades such that they were truly worthy of the “remake” label.
I’ve put together two lists of games that I’d love to see remade, but when I did so I deliberately excluded titles from the last two console generations. Those games, as I said at the time, are too recent to see any substantial benefits from a remake. In another decade or so, assuming sufficient technological progress has been made, we can talk about remaking PlayStation 3 or PlayStation 4 games – but not now!
Number 7: Fake “remakes.”
Related to the point above: if a title is billed as a “remake,” I expect to see substantial changes and improvements. If all that’s happened is that a developer has run an old title through an upscaler and added widescreen support, that’s not a remake!
A lot of titles that acquire the “HD” suffix seem to suffer from this problem. Shenmue I & II on PC contained a number of bugs and glitches – some of which existed in the Dreamcast version! When Sega decided to “remake” these two amazing games, they couldn’t even be bothered to patch out bugs that were over fifteen years old. That has to be some of the sloppiest, laziest work I’ve ever seen.
There are other examples of this, where a project may have started out with good intentions but was scaled back and scaled back some more to the point that it ended up being little more than an upscaled re-release. Kingdoms of Amalur: Re-Reckoning springs to mind as an example from just last year.
Remakes are an opportunity to go back to the drawing board, fix issues, update a title, and bring it into the modern world. Too many “remakes” fail to address issues with the original version of the game. We could even point to Mass Effect: Legendary Edition’s refusal to address criticism of the ending of Mass Effect 3 as yet another example of a missed opportunity.
Number 8: The “release now, fix later” business model.
This isn’t the first time I’ve criticised the “release now, fix later” approach taken by too many modern games – and it likely won’t be the last! Games that go down this route – often dressed up as “live services” – almost always underperform and draw criticism, and they absolutely deserve it. The addition of internet connectivity to home consoles has meant that games companies have taken a “good enough” approach, releasing titles before they’re ready with the intention of patching out bugs, adding more content, and so on at a later date.
Cyberpunk 2077 is one of the most recent and most egregious examples of this phenomenon, being released on Xbox One and PlayStation 4 in a state so appallingly bad that many considered it “unplayable.” But there are hundreds of other examples going back to the early part of the last decade. Fortunately, out of all the entries on this list, this is the one that shows at least some signs of going away!
The fundamental flaw in this approach, of course, is that games with potential end up having launches that are mediocre at best, and when they naturally underperform due to bad reviews and word-of-mouth, companies panic! Planned updates are scrapped to avoid pumping more money into a failed product, and a game that could have been decent ends up being forgotten.
For every No Man’s Sky that manages to claw its way to success, there are a dozen Anthems or Mass Effect: Andromedas that fail. Time will tell if Cyberpunk 2077 can rebuild itself and its reputation, but it’s an uphill struggle – and a totally unnecessary one; a self-inflicted wound. If publishers would just delay clearly unfinished games instead of forcing them to meet arbitrary deadlines, gaming would be a much more enjoyable hobby. Remember, everyone: NO PRE-ORDERS!
Number 9: Forcing games to be multiplayer and/or scrapping single-player modes.
Some games are built from the ground up with multiplayer in mind – but many others are not, and have multiplayer modes tacked on for no reason. The Last of Us had an unnecessary multiplayer mode, as did Mass Effect 3. Did you even know that, or notice those modes when you booted up those story-focused games?
Some games – and even whole genres – are just not well-suited to multiplayer. And even those that are still have room for single-player stories. Many gamers associate the first-person shooter genre with multiplayer, and it’s true that multiplayer games thrive in that space. But so do single-player titles, and aside from 2016’s Doom and the newer Wolfenstein titles, I can’t think of many recent single-player first-person shooters – or even shooters with single-player modes that felt like anything other than an afterthought.
Anthem is one of the biggest failures of the last few years, despite BioWare wanting it to be the video game equivalent of Bob Dylan. But if Anthem hadn’t been multiplayer and had instead maintained BioWare’s usual single-player focus, who knows what it could have been. There was potential in its Iron Man-esque flying suits, but that potential was wasted on a mediocre-at-best multiplayer shooter.
I started playing games before the internet, when “multiplayer” meant buying a second controller and plugging it into the console’s only other available port! So I know I’m biased because of that. But just a few short years ago it felt as though there were many more single-player titles, and fewer games where a multiplayer mode had been artificially forced in. In the wake of huge financial successes such as Grand Theft Auto V, Fortnite, and the like, publishers see multiplayer as a cash cow – but I wish they didn’t!
Number 10: Early access.
How many times have you been excited to see that a game you’ve been waiting for is finally available to buy… only to see the two most awful words in the entire gaming lexicon: “Early Access”? Early access billed itself as a way for indie developers to get feedback on their games before going ahead with a full release, and I want to be clear on this point: I don’t begrudge indie games using it for that purpose. Indies get a pass!
But recently there’s been a trend for huge game studios to use early access as free labour – a cheap replacement for paying the wages of a quality assurance department. When I worked for a large games company, I knew a number of QA testers, and the job is not an easy one. It certainly isn’t one that studios should be pushing off onto players, yet that’s exactly what a number of them have been doing. Early access, if it exists at all, should be a way for small studios to hone and polish their games and maybe add fan-requested extras – not a way for big companies to save money on testers.
Then there are the perpetual early access games. You know the ones: they entered early access in 2015 and are still there today. Platforms like Steam which offer early access need to set time limits, because unfortunately some games are just taking the piss. If your game has been out since 2015, then it’s out. It’s not in early access – you’ve released it.
Unlike most of the entries on this list, early access started out with genuinely good intentions. When used appropriately by indie developers, it’s fine and I don’t have any issue with it. But big companies should know better, and games that enter early access and never leave should be booted out!
Bonus: Online harassment.
Though this problem afflicts the entire internet, it’s particularly significant in the gaming realm. Developers, publishers, and even individual employees of games studios can find themselves subjected to campaigns of online harassment by so-called “fans” who’ve decided to take issue with something in a recent title.
Let’s be clear: there is never any excuse for this. No game, no matter how bad it is, is worth harassing someone over. It’s possible to criticise games and their companies in a constructive way, or at least in a way that doesn’t get personal. There’s never any need to go after a developer personally, and especially not to send someone death threats.
We’ve seen this happen when games are delayed. We’ve seen it happen when games release too early in a broken state. In the case of Cyberpunk 2077, we’ve seen both. Toxic people will always find a reason to be toxic, unfortunately, and in many ways the anonymity of the internet has brought out the worst in human nature.
No developer or anyone who works in the games industry deserves to be threatened or harassed. It’s awful, it needs to stop, and the petty, toxic people who engage in this scummy activity do not deserve to be called “fans.”
So that’s it. Ten of my pet peeves with modern gaming.
This was a rant, but it was just for fun, so I hope you don’t mind! There are some truly annoying things – and some truly annoying people – involved in gaming in 2021, and as much fun as playing games can be, it can be a frustrating experience as well. Some of these things are fads – short-term trends that will evaporate as the industry moves on. But others, like the move away from single-player games toward ongoing multiplayer experiences, seem like they’re here to stay.
Gaming has changed an awful lot since I first picked up a control pad. And it will continue to evolve and adapt – the games industry may be unrecognisable in fifteen or twenty years’ time! We’ll have to keep our fingers crossed for positive changes to come.
All titles mentioned above are the copyright of their respective developer, publisher, and/or studio. Some stock images courtesy of pixabay. Some screenshots and promotional artwork courtesy of IGDB. This article contains the thoughts and opinions of one person only and is not intended to cause any offence.