Ten of my gaming pet peeves

A couple of years ago, I put together two lists of things I really dislike about modern video games – but somehow I’ve managed to find even more! Although there’s lots to enjoy when it comes to the hobby of gaming, there are still plenty of annoyances and dislikes that can detract from even the most pleasant of gaming experiences. So today, I thought it could be a bit of fun to take a look at ten of them!

Several of these points could (and perhaps one day will) be full articles or essays all on their own. Big corporations in the video games industry all too often try to get away with egregious and even malicious business practices – and we should all do our best to call out misbehaviour. While today’s list is somewhat tongue-in-cheek, there are major issues with the way big corporations in the gaming realm behave… as indeed there are with billion-dollar corporations in every other industry, too.

Gaming is great fun… but it has its annoyances!

That being said, this is supposed to be a bit of fun. And as always with a piece like this, everything we’re going to talk about is nothing more than one person’s subjective take on the topic! If you disagree with everything I have to say, if you like, enjoy, or don’t care about these issues, or if I miss something that seems like an obvious inclusion to you, that’s okay – there’s always room for differences of opinion. As gamers, we all have different preferences and tolerance levels.

If you’d like to check out my earlier lists of gaming annoyances, you can find the first one by clicking or tapping here, and the follow-up by clicking or tapping here. In some ways, this list is “part three,” so if you like what you see, you might also enjoy those older lists as well!

With all of that out of the way, let’s jump into the list – which is in no particular order.

Number 1:
Motion blur and film grain.

Film grain and motion blur options in Ghostwire Tokyo.

Whenever I boot up a new game, I jump straight into the options menu and disable both motion blur and film grain – settings that are almost always inexplicably enabled by default. Film grain is nothing more than a crappy Snapchat filter: the kind of thing twelve-year-olds love to play with to make their photos look “retro.” It adds nothing to a game and actively detracts from the graphical fidelity of modern titles.

Motion blur is in the same category. Why would anyone want this motion sickness-inducing setting enabled? It smears and smudges even the best-looking titles for basically no reason at all. Maybe on particularly underpowered systems these settings might hide some graphical jankiness, but on new consoles and even moderately good PCs, they’re unnecessary. They make games look significantly worse – and I can’t understand why anyone would choose to play a title with them enabled.

Number 2:
In-game currencies that have deliberately awkward exchange rates.

Show-Bucks bundles in Fall Guys.

In-game currencies are already pretty shady – a psychological trick designed to get players to spend more real money. But what’s far worse is when in-game currencies have deliberately awkward exchange rates. For example: most items on the storefront cost 200 in-game dollars, but I can only buy in-game dollars in bundles of 250 or 500. If I buy 250, I’ll have a few left over that I can’t spend; if I buy 500, I’ll have spent more than I needed to.

This is something publishers do deliberately. They know that if you have 50 in-game dollars left over, there’ll be a temptation to buy even more to make up the difference – and they know players will be forced to overspend on currency they have no need for. Some of these schemes verge on being scams, but all of them are annoying.
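To make the maths concrete, here’s a quick sketch in Python, using the purely illustrative 200/250/500 figures from the example above, showing how these bundle sizes guarantee either leftover currency or overspending:

```python
# Illustrative sketch of the "awkward exchange rate" problem: items cost
# 200 in-game dollars, but currency is only sold in bundles of 250 or 500.
# (Hypothetical numbers - no real storefront is being modelled here.)
from itertools import product

ITEM_COST = 200
BUNDLES = [250, 500]  # purchasable currency amounts


def cheapest_way_to_afford(cost, bundles, max_each=4):
    """Find the smallest total currency purchase that covers `cost`.

    Tries every combination of up to `max_each` of each bundle size and
    returns (amount bought, unspendable leftover). Assumes `cost` is
    affordable within those limits.
    """
    best = None
    for counts in product(range(max_each + 1), repeat=len(bundles)):
        total = sum(c * b for c, b in zip(counts, bundles))
        if total >= cost and (best is None or total < best):
            best = total
    return best, best - cost


bought, leftover = cheapest_way_to_afford(ITEM_COST, BUNDLES)
# bought == 250, leftover == 50: one 250 bundle covers a 200-dollar item,
# but strands 50 in-game dollars that can't buy anything on their own.
```

However you combine the bundles, a 200-dollar item always leaves currency stranded – which is exactly the temptation the pricing is designed to create.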

Number 3:
Fully-priced games with microtransactions.

The in-game shop in Diablo IV.

If a game is free – like Fortnite or Fall Guys – then microtransactions feel a lot more reasonable. Offering a game for free to fund it through in-game purchases is a viable business model, and while it needs to be monitored to make sure the in-game prices aren’t unreasonable, it can be an acceptable way for a game to make money. But if a game costs me £65 up-front, there’s no way it should include microtransactions.

We need to differentiate expansion packs from microtransactions, because DLC that massively expands a game and adds new missions and the like is usually acceptable. But if I’ve paid full price for a game, I shouldn’t find an in-game shop offering me new costumes, weapon upgrades, and things like that. Some titles absolutely take the piss with this, too, even including microtransactions in single-player campaigns, or having so many individual items for sale that the true cost of the game – including purchasing all in-game items – can run into four or even five figures.

Number 4:
Patches as big as (or bigger than) the actual game.

No patch should ever need to be this large.

This one kills me because of my slow internet! And it’s come to the fore recently as a number of big releases have been buggy and broken at launch. Jedi: Survivor, for example, has had patches that were as big as the game’s original 120GB download size – meaning a single patch would take me more than a day to download. Surely it must be possible to patch or fix individual files without requiring players to download the entire game all over again – in some cases more than once.

I’m not a developer or technical expert, and I concede that I don’t know enough about this topic to say with certainty that it should never happen. But as a player, I know how damnably annoying it is to press “play” only to be told I need to wait hours and hours for a massive, unwieldy patch. Especially if that patch, once fully downloaded, doesn’t appear to have actually done anything!
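For what it’s worth, the basic idea of patching only what changed isn’t exotic. As a purely illustrative sketch – not a claim about how any real launcher or patcher actually works – a client could compare per-file checksums against the latest version’s manifest and fetch only the files that differ:

```python
# Toy illustration of delta patching: hash every installed file, compare
# against the new version's manifest, and download only what changed.
# (A sketch only - real patchers are far more sophisticated than this.)
import hashlib
from pathlib import Path


def file_hashes(root: Path) -> dict[str, str]:
    """Map each file's path (relative to `root`) to a SHA-256 digest."""
    return {
        str(p.relative_to(root)): hashlib.sha256(p.read_bytes()).hexdigest()
        for p in sorted(root.rglob("*"))
        if p.is_file()
    }


def files_to_download(installed: dict[str, str],
                      latest: dict[str, str]) -> list[str]:
    """Return only the files whose hashes differ, or which are new."""
    return [path for path, digest in latest.items()
            if installed.get(path) != digest]
```

If a patch touches 2GB of a 120GB install, a scheme like this would only ever need to fetch that 2GB – which is why full re-downloads feel so inexcusable.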

Number 5:
Broken PC ports.

This is supposed to be Joel from The Last Of Us Part 1.

As I said when I took a longer look at this topic, I had hoped that broken PC ports were becoming a thing of the past. Not so, however! A number of recent releases – including massive AAA titles – have landed on PC in broken or even outright unplayable states, plagued by issues that are not present on PlayStation or Xbox.

PC is a massive platform, one that shouldn’t be neglected in this way. At the very least, publishers should have the decency to delay a PC port if it’s clearly lagging behind the console versions – but given the resources that many of the games industry’s biggest corporations have at their disposal, I don’t see why we should accept even that. Develop your game properly and don’t try to launch it before it’s ready! I’m not willing to pay for the “privilege” of doing the job of a QA tester.

Number 6:
Recent price hikes.

It must be some kind of visual metaphor…

Inflation and a cost-of-living crisis are really punching all of us in the face right now – so the last thing we need is price hikes from massive corporations. Sony really pissed me off last year when they bragged to their investors about record profits before turning around, literally a matter of weeks later, and announcing that the price of the PlayStation 5 was going up. That’s practically unprecedented; the cost of a console usually falls as its generation progresses.

But Sony is far from the only culprit. Nintendo, Xbox, Activision Blizzard, Take-Two, Electronic Arts, and practically every major corporation in the games industry have jacked up their prices over the last few years, raising the basic price of a new game – and that’s before we look at DLC, special editions, and the like. These companies are making record-breaking profits, yet they use the excuse of “inflation” to rip us off even more. Profiteering wankers.

Number 7:
The “release now, fix later” business model is still here.

The player character falling through the map in Star Wars Jedi: Survivor.

I had hoped that some recent catastrophic game launches would have been the death knell for the “release now, fix later” business model – but alas. Cyberpunk 2077 failed so hard that it got pulled from sale and tanked the share price of CD Projekt Red… but even so, this appalling way of making and launching games has persisted. Just in the first half of 2023 we’ve had titles like Hogwarts Legacy, Redfall, Jedi: Survivor, Forspoken, and The Lord of the Rings: Gollum that arrived broken, buggy, and unplayable.

With every disaster that causes trouble for a corporation, I cross my fingers and hope that lessons will be learned. But it seems as if the “release now, fix later” approach is here to stay. Or at least it will be as long as players keep putting up with it – and even defending it in some cases.

Number 8:
Day-one DLC/paywalled day-one content.

An example of a “digital deluxe edition” and its paywalled content.

It irks me no end when content that was clearly developed at the same time as the “base version” of a game is paywalled off and sold separately for an additional fee. The most egregious example that comes to mind is Mass Effect 3’s From Ashes DLC, which launched alongside the game. It included a character and missions that were completely integrated into the game – yet had been carved out to be sold separately.

This practice continues, unfortunately, and many modern titles release with content paywalled off, even if that content was developed right along with the rest of the game. Sometimes these things are designed to be sold as part of a “special edition,” but that doesn’t excuse it either. Even if all we’re talking about are character skins and cosmetic content, it still feels like those things should be included in the price – especially in single-player titles. Some of this content can be massively overpriced, too, with packs of two or three character skins often retailing for £10 or more.

Number 9:
Platform-exclusive content and missions.

Spider-Man was a PlayStation-only character in Marvel’s Avengers.

Some titles are released with content locked to a single platform. Hogwarts Legacy and Marvel’s Avengers are two examples that come to mind – and in both cases, missions and characters that should have been part of the main game were unavailable to players on PC and Xbox thanks to deals with Sony. While I can understand the incentive to do this… it’s a pretty shit way of making money for a publisher, and a pretty scummy way for a platform to try to attract sales.

Again, this leaves games incomplete, and players who’ve paid full price end up getting a worse experience or an experience with less to do depending on their platform of choice. That’s unfair – and it’s something that shouldn’t be happening.

Number 10:
Pre-orders.

Cartman from South Park said it best:
“You know what you get for pre-ordering a game? A big dick in your mouth.”

Pre-ordering made sense – when games were sold in brick-and-mortar shops on cartridges or discs. You wanted to guarantee your copy of the latest big release, and one way to make sure you’d get the game before it sold out was to pre-order it. But that doesn’t apply any more; not only are more and more games being sold digitally, but even if you’re a console player who wants to get a game on disc, there isn’t the same danger of scarcity that there once was.

With so many games being released broken – or else failing to live up to expectations – pre-ordering in 2023 is nothing short of stupidity, and any player who still does it is an idiot. It actively harms the industry and other players by letting corporations get away with more misbehaviour and nonsense. If we could all be patient and wait a day or two for reviews, fewer games could be launched in unplayable states. Games companies bank on a significant number of players pre-ordering and not cancelling or refunding if things go wrong. It’s free money for them – and utterly unnecessary in an age of digital downloads.

So that’s it!

A PlayStation 5 console.

We’ve gone through ten of my pet peeves when it comes to gaming. I hope this was a bit of fun – and not something to get too upset over!

The gaming landscape has changed massively since I first started playing. Among the earliest titles I can remember trying my hand at are Antarctic Adventure and the Commodore 64 title International Soccer, and the first home console I was able to get was a Super Nintendo. Gaming has grown massively since those days, and the kinds of games that can be created with modern technology, game engines, and artificial intelligence can be truly breathtaking.

But it isn’t all good, and we’ve talked about a few things today that I find irritating or annoying. The continued push from publishers to release games too early and promise patches and fixes is particularly disappointing, and too many publishers and corporations take their greed to unnecessary extremes. But that’s the way the games industry is… and as cathartic as it was to get it off my chest, I don’t see those things disappearing any time soon!

All titles mentioned above are the copyright of their respective developer, studio, and/or publisher. This article contains the thoughts and opinions of one person only and is not intended to cause any offence.

The state of PC ports

The dreaded “release now, fix later” model that has been adopted by corporations across the games industry has shown up constantly in 2023. Although a number of console titles have been affected, by far the worst impact has been felt on PC. As PC is my primary gaming platform these days, this is something that hits me personally. Today, I wanted to talk a little about the absolute state of many recent PC releases.

Jedi: Survivor, Redfall, Forspoken, Hogwarts Legacy, and The Last Of Us Part 1 should have all been among the biggest PC releases in the first half of 2023. I was genuinely looking forward to several of these games myself. But all of them, despite being massive games with huge budgets backed up by major corporate publishers, have been released in broken, unfinished, and in some cases borderline unplayable states.

It’s Joel from The Last Of Us… apparently.

As a rule, I don’t pre-order games. I’ve been burned in the past, and as someone who doesn’t have money to piss away, pre-ordering just doesn’t feel like a good idea any more. But many folks still do, lured in by pre-order exclusive bonuses and the like, and many of these folks – as well as those who picked up titles shortly after launch – have been left severely disappointed in the first half of 2023.

I had hoped, particularly after the Cyberpunk 2077 debacle a couple of years back, that the games industry was beginning to learn its lesson. Just because it’s technically feasible to launch a title in an unfinished state and patch it up later, that doesn’t mean it’s a good idea; the damage done by a rocky launch can be difficult – if not outright impossible – to overcome. For every success story like No Man’s Sky, there are dozens of titles like Anthem, Aliens: Colonial Marines, or Assassin’s Creed Unity that are too far gone to be salvaged. And even titles that manage to continue development, like Cyberpunk 2077, are forever tainted by the way they launched.

A hollow character model in Redfall.

Who knows how many more sales Cyberpunk 2077 might’ve made had it been released six months later? The damage that game did to CD Projekt Red has set back the company immeasurably, damaging its share price and tanking its reputation with players. It’s an expensive lesson in how not to release a video game… so why have none of the other corporations in the games industry taken notice?

I didn’t buy Jedi: Survivor this month, even though I’d gone out of my way to save up for it and allocate money for it in my budget. Why? The reason is simple: I read the reviews, saw breakdowns of the PC port of the game, and decided to put my wallet away and wait. Electronic Arts lost what should have been a guaranteed sale because I’m not willing to buy an unfinished product. And make no mistake, that’s what Jedi: Survivor and all the other games listed above are: unfinished.

Cal falls through the map in Jedi: Survivor.

Compared with developing for a console, developing for PC can be a challenge. Take it from someone who built their own PC last year: there’s a huge range of internal components, from CPUs to GPUs, RAM to solid-state drives, and beyond. Ensuring smooth compatibility across an almost infinite variety of potential PCs isn’t as easy as getting a game to run on an Xbox Series X or PlayStation 5, which don’t have this problem of varied hardware. And I get that, I really do.

But that isn’t a good enough excuse. I’d rather a corporation delay the PC port of a game than release it in a broken state – and I won’t be alone in saying so. It isn’t ideal to split a title’s release by platform, and it’s something to be avoided if at all possible, but under some circumstances it can be forgiven – especially where smaller, independent studios are concerned.

Characters clipping through each other in Hogwarts Legacy.

I used to work in the games industry, and I know or knew dozens of developers at both small and large companies. Developers are great, passionate people who put a lot of energy and love into their work – those working on franchises like Star Wars, for instance, are almost always passionate fans who want to bring the story to life as best they can. These bad releases are not a reflection on developers, nor should anyone try to harass or attack developers because of these broken games.

The fault here lies with games publishers: corporations like Electronic Arts, Microsoft, Sony, and Warner Bros. Games. They’re the ones who hold the cards, and developers are forced to work to often unreasonable timelines. Even intense periods of “crunch” are often not enough to salvage a project in time, and a premature launch is almost always forced on a developer by a publisher. That’s undoubtedly what happened in each of these cases.

The fault lies with corporations like EA.

Crappy PC ports used to be fairly commonplace, but as the platform has grown and become more lucrative, that games industry stereotype seemed to be fading away. 2023 has brought it right back, and I’m now in a position where every PC game release is treated with scepticism. As players and fans, we shouldn’t be in the position of assuming a PC release will automatically be buggy, laggy, and an overall worse experience – yet here we are.

I’m not prepared to accept this as being “just” one of the downsides of PC gaming, either. Corporations need to allocate as much time and energy to their PC ports as they do to the console versions – and if they can’t guarantee that a game will be in a playable state, the only option is to delay it. Ideally a game would be delayed on every platform, but in some cases it might be okay to go ahead with a console release and delay only the PC port.

Promo art for Jedi: Survivor.

As consumers in this marketplace, all we can do is refuse to participate. It’s on us to tell corporations that we aren’t willing to pay their inflated prices to do the job of their quality assurance team, and that releasing games before they’re finished and before they’re basically playable is not acceptable.

One of the disappointing trends that I’ve seen, not just with PC games in 2023 but with a whole host of “release now, fix later” titles, is players and fans covering for and continuing to support these faceless, greedy corporations. Too many people seem willing to make excuses on behalf of big publishers, essentially doing the job of a marketing team for them. Some games, like Jedi: Survivor, have even received positive reviews on platforms like Steam and Metacritic from players who admit, in the very same review, that the game is in a poor state and isn’t a great experience to play. Why say that? What benefit is there?

A couple of examples of positive Steam reviews for Jedi: Survivor.

I’m also deeply disappointed in some professional outlets. Practically all of the titles above received positive reviews from professional critics, reviews which in some cases glossed over or outright ignored bugs, glitches, and other issues with the titles in question. There’s a stinking rot at the core of the relationships between some games corporations and certain media outlets – and while I would never accuse anyone of writing a paid-for review, there are clearly incentives given and threats made to keep review scores higher than they deserve to be in some cases.

I also don’t buy the excuse of “pandemic-related disruption” – not any more. That might’ve worked three years ago, but as the World Health Organisation downgrades covid and society gets back on track across the globe, it’s beginning to stretch credulity to blame any and all problems on the pandemic. It’s a cheap excuse from corporations who don’t want us to know the truth: they’re greedily publishing unfinished games to grab as much cash as possible for as little work and investment as possible. That’s always been the case, but it’s been turned up to eleven in recent years.

At the end of the day, this is all about money.

Unfortunately, I don’t see this trend disappearing any time soon. For me, all PC releases are now suspect, and I will be checking out multiple reviews and tech breakdowns of the latest titles before I even consider parting with my money. I’d advise all PC players to take the same approach – and not to shy away from calling out games corporations that misbehave. No other industry, in entertainment or any other sector, could get away with this. We wouldn’t take this kind of behaviour from other companies – so why should we put up with it from the games industry?

It is infinitely better to delay a game, continue to work on the issues it may have, and only release it when it’s ready. This is a lesson that the games industry really ought to have learned by now – but I guess we’ll have to do whatever we can to hammer the point home. Why should we accept low-quality, broken, unfinished games with promises of fixes and patches to come? We shouldn’t – and this awful trend of crappy PC ports has to stop.

All titles discussed above are the copyright of their respective developer, studio, and/or publisher. Some screenshots and promo images courtesy of IGDB. This article contains the thoughts and opinions of one person only and is not intended to cause any offence.

The worst things about modern video games

The first home console I owned – after saving up my hard-earned pocket money and pestering my parents for ages – was a Super Nintendo. Gaming has changed a lot since then, and while many of those changes have been fantastic and introduced us to new genres, not every change has been for the better! In this list I’m going to cover some of my biggest pet peeves with video games in 2021.

As always, this list is entirely subjective. If I criticise something you like, or exclude something you hate, just keep in mind that this is only one person’s opinion. Gaming is a huge hobby that includes many people with many different perspectives. If yours and mine don’t align, that’s okay!

Number 1: No difficulty options.

Some people play video games because they love the challenge of a punishingly difficult title, and the reward of finally overcoming an impossible level after hours of perseverance. I am not one of those people! In most cases, I play video games for escapism and entertainment – I want to see a story unfold or just switch off from other aspects of my life for a while. Excessive difficulty is frustrating and off-putting for me.

As someone with health issues, I would argue that difficulty settings are a form of accessibility. Some people don’t have the ability to hit keys or buttons in rapid succession, and in some titles the lack of a difficulty setting – particularly if the game is not well-balanced – can mean those games are unavailable to folks with disabilities.

While many games are too difficult, the reverse can also be true: some titles are just too easy. I’m almost never in that category, but still! A game with no difficulty settings and an incredibly easy base experience can be unenjoyable for some folks, particularly if the challenge was what got them interested in the first place.

In 2021, most games have difficulty options as a standard feature. Difficulty settings have been part of games going back decades, and in my opinion there’s no technical reason why they shouldn’t be included. There’s also not really a “creative” reason, either. Some developers talk in grandiose terms about their “vision” for a title being the reason why they didn’t implement difficulty options, but as I’ve said before – the inclusion of an easier (or harder) mode does not impact the game at all. It only impacts those who choose to turn it on, and considering how easy it is to implement, I find it incredibly annoying when a game is deliberately shipped without any difficulty options.

Number 2: Excessive difficulty as a game’s only selling point.

While we’re on the subject of difficulty, another pet peeve of mine is games whose entire identity is based on their difficulty (or perceived difficulty). Think about it for a moment: would Dark Souls – an otherwise bland, uninspired hack-and-slash game – still be talked about ten years after its release were it not for its reputation for being impossibly difficult? How many other late-2000s or early-’10s hack-and-slash games have dropped out of the cultural conversation? The only thing keeping Dark Souls there is its difficulty.

A challenge is all well and good, and I don’t begrudge players who seek that out. But for me, a game has to offer something more than that. If there’s a story worth telling under the difficult gameplay I’m impressed. If the difficult, punishing gameplay is all there is, then that’s boring!

Difficulty can also be used by developers as cover for a short or uninteresting game. Forcing players to replay long sections over and over can massively pad out a game’s runtime – so cranking the difficulty to ridiculous levels, and offering no way to turn it down, can artificially turn a short game into a long one.

I’m all for games that offer replay value, but being forced to replay the same level or checkpoint – or battle the same boss over and over – purely because of how frustratingly hard the developers chose to make things simply isn’t fun for me.

Number 3: Ridiculous file sizes.

Hey Call of Duty? Your crappy multiplayer mode does not need to be 200 gigabytes. Nor does any game, for that matter. It’s great that modern technology allows developers to create realistic-looking worlds, but some studios are far better than others when it comes to making the best use of space! Some modern games do need to be large to incorporate everything, but even so there’s “large” and then there’s “too large.”

For a lot of folks this is an issue for two main reasons: data caps and download speeds. On my current connection I’m lucky to get a download speed of 7 Mbps, and downloading huge game files can quite literally take several days – days in which doing anything else online would be impossibly slow! But I’m fortunate compared to some people, because I’m not limited in the amount of data I can download by my ISP.
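To put some rough numbers on that – using my 7 Mbps figure and the 200-gigabyte install mentioned above, and remembering that file sizes are measured in bytes while connection speeds are measured in bits:

```python
# Rough download-time arithmetic. File sizes are quoted in giga*bytes*,
# connection speeds in mega*bits* per second, so convert (1 byte = 8 bits).

def download_hours(size_gb: float, speed_mbps: float) -> float:
    """Hours to download `size_gb` gigabytes at `speed_mbps` megabits/sec."""
    megabits = size_gb * 1000 * 8        # decimal GB -> megabits
    return megabits / speed_mbps / 3600  # seconds -> hours


# A 200GB download at 7 Mbps: roughly 63 hours, i.e. over two and a half days
hours = download_hours(200, 7)
```

Real-world speeds fluctuate and this ignores protocol overhead, but the order of magnitude is the point: at speeds like mine, a single huge download really can monopolise a connection for days.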

In many parts of the world, and on cheaper broadband connections, data caps are very much still a thing. Large game files can eat up an entire month’s worth of data – or even more in some cases – making games with huge files totally inaccessible to a large number of people.

This one doesn’t seem like it’s going away any time soon, though. In fact, we’re likely to see file sizes continue to get larger as games push for higher resolutions, larger environments, and more detail.

Number 4: Empty open worlds.

Let’s call this one “the Fallout 76 problem.” Open worlds became a trend in gaming at some point in the last decade, to the point that many franchises pursued the style even when it didn’t suit their gameplay. Read the marketing material of many modern titles and you’ll see bragging about the size of the game world: 50km², 100km², 1,000km², and so on. But many of these open worlds are just empty and boring, with much of the map taken up by vast expanses of nothing.

It is simply not much fun to have to travel across a boring environment – or even a decently pretty one – for ages just to get to the next mission or part of the story. Level design used to be concise and clever; modern open worlds, especially those which brag about their size, tend to be too large, with too little going on.

The reason Fallout 76 encapsulates this for me is twofold. Firstly, Bethesda droned on and on in the weeks before the game’s release about how the world they’d created was the “biggest ever!” And secondly, the game launched with literally zero non-player characters. That huge open world was populated by a handful of other players, non-sentient monsters, and nothing else – and it was one of the worst games of the last few years as a result.

Open worlds can work well in games that are suited for that style of gameplay. But too many studios have been pushed into creating an open world simply to fit in with a current trend, and those open worlds tend to just flat-out suck because of it. Even when developers have tried to throw players a bone by adding in collect-a-thons, those get boring fast.

Number 5: Pixel graphics as a selling point.

There are some great modern games that use a deliberately 8-bit look. But for every modern classic there are fifty shades of shit; games that think pixel graphics and the word “retro” are cover for creating a mediocre or just plain bad title.

It may be hard to remember, but there was a time when the idea of using a deliberately “old-school” aesthetic would have been laughed at. The first few console generations were all about improvements, and I’m old enough to remember when 3D was a huge deal. It seemed like nobody would ever want to go back to playing a SNES game after trying the Nintendo 64, and while there are still plenty of gamers who love the retro feel, I’m generally not one of them.

That isn’t to say that realistic graphics should be the only thing a game strives for. And this point works for modern graphics or visual styles in general – bragging about how detailed the graphics are, or how unique a title’s art style is, means nothing if the game itself is shit. But it likewise works for pixel-graphics games – an outdated art style does not compensate for or cover up a fundamentally flawed, unenjoyable experience.

Games with pixel graphics can be good, and many titles have surprised me by how good they are. I’ve written before about how Minecraft surprised me by being so much more than I expected, and that’s one example. But I guess what I’d say is this: if your game looks like it should have been released in 1991, you’ve got more of an uphill battle to win me over – or even convince me to try it in the first place – than you would if your game looked new.

Number 6: Unnecessary remakes.

We called one of the entries above “the Fallout 76 problem,” so let’s call this one “the Mass Effect: Legendary Edition problem.” In short, games from even ten or fifteen years ago still look pretty good and play well. There’s far less of a difference between games from 2011 and 2021 than there was between games from 1991 and 2001 – the pace of technological change, at least in gaming, has slowed.

“Updating” or “remaking” a game from ten years ago serves no real purpose, and in the case of Mass Effect: Legendary Edition I’ve struggled at times to tell which version of the game is the new one when looking at pre-release marketing material. There’s no compelling reason to remake games that aren’t very old. Re-release them or give them a renewed marketing push if you want to drum up sales or draw attention to a series, but don’t bill your minor upgrade as a “remake.”

There are some games that have benefitted hugely from being remade. I’d point to Crash Bandicoot and Resident Evil 2 as two great examples. But those games were both over twenty years old at the time they were remade, and having been released in the PlayStation 1 era, both saw massive upgrades such that they were truly worthy of the “remake” label.

I’ve put together two lists of games that I’d love to see remade, but when I did so I deliberately excluded titles from the last two console generations. Those games, as I said at the time, are too recent to see any substantial benefits from a remake. In another decade or so, assuming sufficient technological progress has been made, we can talk about remaking PlayStation 3 or PlayStation 4 games – but not now!

Number 7:
Fake “remakes.”

On a related note to the point above, if a title is billed as a “remake,” I expect to see substantial changes and improvements. If all that’s happened is a developer has run an old title through an upscaler and added widescreen support, that’s not a remake!

A lot of titles that acquire the “HD” suffix seem to suffer from this problem. Shenmue I & II on PC contained a number of bugs and glitches – some of which existed in the Dreamcast version! When Sega decided to “remake” these two amazing games, they couldn’t even be bothered to patch out bugs that were over fifteen years old. That has to be some of the sloppiest, laziest work I’ve ever seen.

There are other examples of this, where a project may have started out with good intentions but was scaled back and scaled back some more to the point that it ended up being little more than an upscaled re-release. Kingdoms of Amalur: Re-Reckoning springs to mind as an example from just last year.

Remakes are an opportunity to go back to the drawing board, fix issues, update a title, and bring it into the modern world. Too many “remakes” fail to address issues with the original version of the game. We could even point to Mass Effect: Legendary Edition’s refusal to address criticism of the ending of Mass Effect 3 as yet another example of a missed opportunity.

Number 8:
The “release now, fix later” business model.

This isn’t the first time I’ve criticised the “release now, fix later” approach taken by too many modern games – and it likely won’t be the last! Also known as “live services,” games that go down this route almost always underperform and draw criticism, and they absolutely deserve it. The addition of internet connectivity to home consoles has meant that games companies have taken a “good enough” approach to games, releasing them before they’re ready with the intention to patch out bugs, add more content, and so on at a later time.

Cyberpunk 2077 is one of the most recent and most egregious examples of this phenomenon, being released on Xbox One and PlayStation 4 in a state so appallingly bad that many considered it “unplayable.” But there are hundreds of other examples going back to the early part of the last decade. Fortunately, out of all the entries on this list, this is the one that shows at least some signs of going away!

The fundamental flaw in this approach, of course, is that games with potential end up having launches that are mediocre at best, and when they naturally underperform due to bad reviews and word-of-mouth, companies panic! Planned updates are scrapped to avoid pumping more money into a failed product, and a game that could have been decent ends up being forgotten.

For every No Man’s Sky that manages to claw its way to success, there are a dozen Anthems or Mass Effect: Andromedas which fail. Time will tell if Cyberpunk 2077 can rebuild itself and its reputation, but it’s an uphill struggle – and a totally unnecessary one; a self-inflicted wound. If publishers would just delay clearly-unfinished games instead of forcing them to meet arbitrary deadlines, gaming would be a much more enjoyable hobby. Remember, everyone: NO PRE-ORDERS!

Number 9:
Forcing games to be multiplayer and/or scrapping single-player modes.

Some games are built from the ground up with multiplayer in mind – but many others are not, and have multiplayer modes tacked on for no reason. The Last of Us had an unnecessary multiplayer mode, as did Mass Effect 3. Did you even know those modes existed, or notice them when you booted up those story-focused games?

Some games and even whole genres are just not well-suited to multiplayer – and even those that are still have room for single-player stories. Many gamers associate the first-person shooter genre with multiplayer, and it’s true that multiplayer games work well in that space. But so do single-player titles, and aside from 2016’s Doom and the newer Wolfenstein games, I can’t think of many recent single-player first-person shooters – or even shooters whose single-player modes felt anything other than tacked-on.

Anthem is one of the biggest failures of the last few years, despite BioWare wanting it to be the video game equivalent of Bob Dylan. But if Anthem hadn’t been multiplayer and had instead maintained BioWare’s usual single-player focus, who knows what it could have been. There was potential in its Iron Man-esque flying suits, but that potential was wasted on a mediocre-at-best multiplayer shooter.

I started playing games before the internet, when “multiplayer” meant buying a second controller and plugging it into the console’s only other available port! So I know I’m biased because of that. But just a few short years ago it felt as though there were many more single-player titles, and fewer games that felt as though multiplayer modes had been artificially forced in. In the wake of huge financial successes such as Grand Theft Auto V, Fortnite, and the like, publishers see multiplayer as a cash cow – but I wish they didn’t!

Number 10:
Early access.

How many times have you been excited to see that a game you’ve been waiting for is finally available to buy… only to see the two most awful words in the entire gaming lexicon: “Early Access?” Early access billed itself as a way for indie developers to get feedback on their games before going ahead with a full release, and I want to be clear on this point: I don’t begrudge indie games using it for that purpose. Indies get a pass!

But recently there’s been a trend for huge game studios to use early access as free labour; a cheap replacement for paying the wages of a quality assurance department. When I worked for a large games company in the past, I knew a number of QA testers, and the job is not an easy one. It certainly isn’t one that studios should be pushing off onto players, yet that’s exactly what a number of them have been doing. Early access, if it exists at all, should be a way for small studios to hone and polish their game, and maybe add fan-requested extras, not for big companies to save money on testers.

Then there are the perpetual early access games. You know the ones: they entered early access in 2015 and are still there today. Platforms like Steam which offer early access need to set time limits, because unfortunately some games are just taking the piss. If your game has been out since 2015, then it’s out. It’s not in early access, you’ve released it.

Unlike most of the entries on this list, early access started out with genuinely good intentions. When used appropriately by indie developers, it’s fine and I don’t have any issue with it. But big companies should know better, and games that enter early access and never leave should be booted out!

Bonus:
Online harassment.

Though this problem afflicts the entire internet regardless of where you go, it’s significant in the gaming realm. Developers, publishers, even individual employees of games studios can find themselves subjected to campaigns of online harassment by so-called “fans” who’ve decided to take issue with something in a recent title.

Let’s be clear: there is never any excuse for this. No game, no matter how bad it is, is worth harassing someone over. It’s possible to criticise games and their companies in a constructive way, or at least in a way that doesn’t get personal. There’s never any need to go after a developer personally, and especially not to send someone death threats.

We’ve seen this happen when games are delayed. We’ve seen it happen when games release too early in a broken state. In the case of Cyberpunk 2077, we’ve seen both. Toxic people will always find a reason to be toxic, unfortunately, and in many ways the anonymity of the internet has brought out the worst in human nature.

No developer or anyone who works in the games industry deserves to be threatened or harassed. It’s awful, it needs to stop, and the petty, toxic people who engage in this scummy activity do not deserve to be called “fans.”

So that’s it. Ten of my pet peeves with modern gaming.

This was a rant, but it was just for fun so I hope you don’t mind! There are some truly annoying things – and some truly annoying people – involved in gaming in 2021, and as much fun as playing games can be, it can be a frustrating experience as well. Some of these things are fads – short-term trends that will evaporate as the industry moves on. But others, like the move away from single-player games toward ongoing multiplayer experiences, seem like they’re here to stay.

Gaming has changed an awful lot since I first picked up a control pad. And it will continue to evolve and adapt – the games industry may be unrecognisable in fifteen or twenty years’ time! We’ll have to keep our fingers crossed for positive changes to come.

All titles mentioned above are the copyright of their respective developer, publisher, and/or studio. Some stock images courtesy of pixabay. Some screenshots and promotional artwork courtesy of IGDB. This article contains the thoughts and opinions of one person only and is not intended to cause any offence.