Gaming “Hot Takes”

I’m back with another edition of my infamous Gaming “Hot Takes!” I’ve officially given up on numbering these; I think this might be piece number four or five, but I’ve made several other posts over the last few years in which I share a few of my “hot takes” on gaming and the games industry in general. As I’ve said before, it’s never long before something in the world of gaming comes along to prompt another “hot take,” so that’s what we’re gonna look at today!

Video games are… well, they’re pretty darn good, to be honest with you. And I always like to make sure you know that I’m not some kind of “hater;” I like playing video games, and there are some titles that I genuinely believe eclipse films and TV shows in terms of their world-building, storytelling, or just pure entertainment value. We’re going to tackle some controversial topics today, though!

Atari Jaguar logo + console on a black background.
Let’s get into some gaming “hot takes!”

Before we take a look at the “hot takes,” I have a couple of important caveats. Firstly, I’m well aware that some or all of these points are the minority position, or at least contentious. That’s why they’re called “hot takes” and not “very obvious takes that everyone will surely agree with!” Secondly, this isn’t intended to be taken too seriously, so if I criticise a game or company you like, just bear that in mind. Finally, all of this is the entirely subjective opinion of just one person – not objective fact.

Although gamers can be a cantankerous bunch, I still like to believe that there’s enough room – and enough maturity – in the wider community for respectful discussion and polite disagreement that doesn’t descend into name-calling and toxicity! So let’s all try to keep that in mind as we jump into the “hot takes,” eh?

“Hot Take” #1:
If your game is still in “early access,” you shouldn’t be allowed to sell DLC.

Steam pre-order info showing early access.
Pre-purchase to play early!

“Early access” means a game hasn’t been released yet, right? That’s what it’s supposed to mean, anyway – though some titles take the absolute piss by remaining in early access for a decade or more. But if you haven’t officially released your game, your focus ought to be on, y’know, finishing the game instead of working on DLC. Paid-for downloadable content for games that are still officially in “early access” is just awful.

Star Citizen is arguably the most egregious example of this. The game – from what I’ve seen – would barely qualify as an “alpha” version, yet reams of overpriced downloadable content are offered for sale. Some of it exists in-game, but a lot of it is really just a promise; an I.O.U. from the developers, promising to build a ridiculously expensive spaceship if and when time permits.

Several DLC spaceships from Star Citizen's webstore.
Expensive DLC ships in Star Citizen.

Early access has a place in gaming, and I don’t want to see it disappear. But that place is with smaller independent projects seeking feedback, not massive studios abusing the model. Selling DLC that doesn’t exist for a game that also doesn’t fully exist feels like a total piss-take, and given how often these things go horribly wrong, I’m surprised to see people still being lured in and falling for what can, at times, feel like a scam.

There have been some fantastic expansion packs going back decades, and I don’t object to DLC – even if it’s what I would usually call a pack of overpriced cosmetic items. But when the main game isn’t even out, and is supposedly still being worked on, I don’t think it’s unreasonable to say that charging money for DLC is wrong – these things should either be free updates or, if they’re definitely going to be sold separately, held in reserve until the game is launched.

“Hot Take” #2:
Bethesda Game Studios has basically made four good games… ever.

Four Bethesda role-playing/action games.
Yup, you heard me.

Morrowind, Oblivion, Fallout 3, and Skyrim. That’s it. That’s the list. From 2002 to 2011 – less than a decade – Bethesda Game Studios managed to develop and release four genuinely good games… but hasn’t reached that bar since. Bethesda has spent longer as a declining, outdated, and thoroughly mediocre developer than it ever did as a good developer. The studio is like the games industry equivalent of The Simpsons: fantastic in its prime, but what followed has been a long period of stagnation, decay, and mediocrity as they’ve been completely overtaken and eclipsed by competitors. To be blunt… I don’t see Starfield’s next (and probably last) expansion pack, or The Elder Scrolls VI, changing that.

There is a retro charm to the likes of Arena and Daggerfall, and I won’t pretend that Fallout 4 didn’t have its moments. Even Starfield, with all of its limitations and issues, still had interesting elements, and the ship-builder was genuinely fun to use… at least at first. But since Skyrim in 2011, I would argue that Bethesda has been in decline. In fact, I believe Skyrim’s unprecedented success broke something fundamental in the way Bethesda’s executives and directors think about games. Gone was the idea of games as one-and-done things to be created and released. Replacing it was the concept I’ve called the “single-player live service,” where titles were transformed into “ten-year experiences” that could be monetised every step of the way.

Screenshot of Starfield's microtransaction store.
Starfield has an in-game marketplace comparable to even the worst free-to-play mobile games.

As I said recently, I don’t have a lot of faith in The Elder Scrolls VI any more. It seems all but certain to contain another disgusting in-game marketplace for skins, items, and even entire questlines and factions. When there are so many other games to play that aren’t hideously over-monetised… why should I bother getting excited for The Elder Scrolls VI? Even worse, it’s being made in Bethesda’s “Creation Engine;” the zombified remains of software from thirty years ago that clearly isn’t up to the task and hasn’t been for a while.

Bethesda’s decline has been slow, and folks who skipped titles like Starfield and Fallout 76 might not be aware of just how bad things have gotten. Maybe I’m wrong, and maybe The Elder Scrolls VI will be a miraculous return to form. I hope so – I never want to root for a game to fail. But with so many other role-playing games out now or on the horizon… I just don’t see it measuring up as things stand. And in a way, I can’t help but feel it would be better in the long run if another studio were to take on the project.

“Hot Take” #3:
There won’t ever be another 1983-style “crash.”

Black-and-white image of video games on shop shelves with a red "downwards" arrow superimposed on top.
It ain’t gonna happen.

Given the absolute state of modern gaming – at least insofar as many of the industry’s biggest corporations are concerned – I genuinely get where this feeling is coming from. But I think the people making this argument either don’t fully understand the 1983 crash, or don’t appreciate how massive gaming as a whole has become in the decades since then.

In short: in 1983, video games weren’t much more than pretty expensive digital toys. The home console market was relatively small, and like so many products over the years, it was genuinely possible that video games themselves could’ve been a flash in the pan; something comparable to LaserDisc, the hovercraft, or, to pick a more modern example, Google Glass. All of these technologies threatened to change the world… but didn’t. They ended up being temporary fads that were quickly forgotten.

Photo of discarded and buried Atari game cartridges from the 1983 crash.
Atari dumped unsold games in a New Mexico landfill during the crash.
Photo: taylorhatmaker, CC BY 2.0 https://creativecommons.org/licenses/by/2.0, via Wikimedia Commons

Fast-forward to 2025. The games industry is massive. So many people play games in some form or another that the idea of a total market collapse or “crash” is beyond far-fetched. That isn’t to say there won’t be changes and shake-ups – whole companies could disappear, including brands that seem massive and unassailable right now. Overpriced games and hardware are going to be challenges, too. Changing technology – like generative A.I. – could also prove to be hugely disruptive, and there could be new hardware, virtual reality, and all sorts.

But a 1983-style crash? Gaming as a whole on the brink of disappearing altogether? It ain’t gonna happen! There is still innovation in the industry, though these days a lot of it is being driven by independent studios. Some of these companies, which are small outfits right now, could be the big corporations of tomorrow, and some of the biggest names in the industry today will almost certainly fall by the wayside. Just ask the likes of Interplay, Spectrum HoloByte, and Atari. But whatever may happen, there will still be games, there will still be big-budget games, and there will still be hardware to play those games on. Changes are coming, of that I have no doubt. But there won’t be another industry crash that comes close to what happened in ’83.

“Hot Take” #4:
Nintendo’s die-hard fans give the company way too much leniency and support – even for horribly anti-consumer shenanigans.

Stock photo of a "riot."
If you dare to criticise Nintendo, fans are going to riot!

I consider myself a fan of Nintendo’s games… some of them, at least. I’ve owned every Nintendo console from the SNES to the first Switch, and unless something major comes along to dissuade me, I daresay I’ll eventually shell out for a Switch 2, too. But I’m not a Nintendo super-fan, buying every game without question… and some of those folks, in my opinion at least, are far too quick to defend the practices of a greedy corporation that doesn’t care about them in the slightest.

Nintendo isn’t much different from the likes of Ubisoft, Activision, Electronic Arts, Sony, Sega, and other massive publishers in terms of its business practices and its approach to the industry. But none of those companies have such a well-trained legion of die-hard apologists, ready to cover for them no matter how badly they screw up. Nintendo fans will happily leap to the defence of their favourite multi-billion dollar corporation for things they’d rightly criticise any other gaming company for. Price hikes, bad-value DLC, lawsuits against competitors or fans, underbaked and incomplete games… Nintendo is guilty of all of these things, yet if you bring up these points, at least in some corners of the internet, there are thousands of Nintendo fans piling on, shouting you down.

Still frame from the Nintendo Switch 2 broadcast showing Welcome Tour.
Welcome Tour.

Obviously the recent launch of the Switch 2 has driven this point home for me. The console comes with a very high price tag, expensive add-ons, a paid-for title that should’ve been bundled with the system, an eShop full of low-quality shovelware, literally only one exclusive launch title, and over-inflated prices for its first-party games. But all of these points have been defended to the death by Nintendo’s super-fans; criticising even the shitty, overpriced non-entity Welcome Tour draws as much vitriol and hate as if you’d personally shat in their mother’s handbag.

Very few other corporations in the games industry enjoy this level of protection from a legion of well-trained – and pretty toxic – super-fans. And it’s just… odd. Nintendo has made its share of genuinely bad games. Nintendo has made plenty of poor decisions over the years. Nintendo prioritises profit over everything else, including its own fans and employees. Nintendo is overly litigious, suing everyone from competitors to its own fans. And Nintendo has taken actions that are damaging to players, families, and the industry as a whole. Gamers criticise other companies when they behave this way; Electronic Arts is routinely named as one of America’s “most-hated companies,” for instance. But Nintendo fans are content to give the corporation cover, even for its worst and most egregious sins. They seem to behave like fans of a sports team, insistent that “team red” can do no wrong. I just don’t understand it.

“Hot Take” #5:
“Woke” is not synonymous with “bad.”
(And many of the people crying about games being “woke” can’t even define the word.)

Screenshot of a famous YouTube video/meme of a commentator screaming at the camera about "pronouns" in Starfield.
He seems like a reasonable man…

In some weird corners of social media, a game (or film or TV show) is decreed “woke” if a character happens to be LGBT+ or from a minority ethnic group. And if such a character is featured prominently in pre-release marketing material… that can be enough to start the hate and review-bombing before anyone has even picked up a control pad. The expression “go woke, go broke” does the rounds a lot… but there are many, many counter-examples that completely disprove this point.

Baldur’s Gate 3 is a game where the player character can be any gender – and their gender is not defined by their genitals. Players can choose to engage in same-sex relationships, practically all of the companion NPCs are pansexual, and there are different races and ethnicities represented throughout the game world. But Baldur’s Gate 3 sold incredibly well, and will undoubtedly be remembered as one of the best games of the decade. So… is it “woke?” If so, why didn’t it “go broke?”

Screenshot of a nude character from Baldur's Gate 3.
The famously not-woke-at-all Baldur’s Gate 3.

Many “anti-wokers” claim that they aren’t really mad about women in leading roles, minority ethnic characters, or LGBT+ representation, but “bad writing.” And I will absolutely agree that there are some games out there that are genuinely poorly-written, or which have stories I just did not care for in the least. The Last Of Us Part II is a great example of this – the game’s entire narrative was based on an attempt to be creative and subversive, but it hacked away at too many of the fundamentals of storytelling to be satisfying and enjoyable. But you know what wasn’t the problem with The Last Of Us Part II? The fact that one of its secondary characters was trans and another female character was muscular.

Good games can be “woke” and “woke” games can be good. “Woke” games can also be bad, either for totally unrelated reasons or, in some cases, because they got too preachy. But to dismiss a game out of hand – often without playing it or before it’s even launched – because some armchair critic on YouTube declared it to be “woke” is just silly. Not only that, but there are many games that contain themes, storylines, or characters that could be reasonably described as “woke” that seem to be completely overlooked by the very folks who claim it’s their mission to “end wokeness.” The so-called culture war is just a very odd thing, and it’s sad to see how it’s impacted gaming. I would never tell anyone they “must” play or not play certain games, but I think it’s a shame if people miss out on genuinely fun experiences because of a perception of some ill-defined political concept that, in most cases, doesn’t have much to do with the game at all.

So that’s it!

Screenshot of Mario in the castle in Super Mario 64.
It’s a-me, Mario!

We’ve looked at a few more of my infamous “hot takes!” I hope it’s been a bit of fun… and not something to get too upset about! It’s totally okay to disagree, and one of the great things about gaming nowadays is that there’s plenty of choice. If you like a game that I don’t, or I enjoy a genre you find boring… that’s okay. If you’re a super-fan of something that I’m not interested in… we can still be friends. Even if we don’t agree politically, we ought to be able to have a civil and reasonable conversation without screaming, yelling, or name-calling!

Be sure to check out some of my other “hot takes.” I’ve linked a few other pieces below. And I daresay there’ll be more of these one day soon… I keep finding things in gaming to disagree with, for some reason. It must be because I’m getting grumpy in my old age; I’m just a big ol’ sourpuss!

Have fun out there, and happy gaming!


All titles mentioned above are the copyright of their respective publisher, developer, and/or studio. This article contains the thoughts and opinions of one person only and is not intended to cause any offence.


Links to other gaming “hot takes:”

Games Industry “Hot Takes”

A few months ago, I put together a list of “hot takes” about video games. As much as I enjoy gaming as a hobby, there are things that annoy me and things to criticise! There were a few other things that I considered including, but they didn’t really fit with that list. These “hot takes” have less to do with games themselves and more to do with the games industry, development, and gaming as a whole – so that’s what we’re going to discuss today!

If you’re interested in checking out that earlier list, by the way, you can find part one by clicking or tapping here, and part two by clicking or tapping here.

Whenever I use the term “hot take” it’s because I’m acutely aware that we’re talking about something contentious! So before we get started, let’s re-emphasise that: these are all topics of debate among players and critics, and mine may well be the minority position. I don’t pretend to be 100% right, and I welcome disagreements and differences of opinion.

A stock photo of a crying girl.
Let’s not throw a tantrum if we disagree, okay?

I worked in the games industry for close to a decade, and I worked with large and small games companies in that time. I’ve got a bit of a feel for how development works from the time I spent “on the inside,” and I know that developers are passionate people who care deeply about their art. But that doesn’t mean games get a free pass; a bad game is a bad game, no matter how well-intentioned it may have been!

As I always like to say: all of this is just the subjective opinion of one player, and I believe that there should be enough room in the community for differences of opinion and respectful disagreement. The topics we’re going to get into today are the subject of discussion and debate, and there isn’t a right answer – just opinions.

If you aren’t in the right headspace to see some potentially controversial games industry opinions, this is your final chance to nope out – because we’re about to jump into the list!

“Hot Take” #1:
“Game development is hard” isn’t an excuse for selling a sub-par title.

Stock photo of a woman working at a computer with two monitors.
A lot of people work really hard on some absolutely shite games…

Speaking as both a player and as someone who used to work in the industry, believe me when I say that I get it. Game development is undeniably difficult, it isn’t straightforward, and there are many, many reasons why a game may not be as good, enjoyable, or polished as we’d like it to be. There can be problems getting an engine to work, fixing one bug might cause ten more to pop up elsewhere, and the more complex and in-depth a title is, the greater the chance of these kinds of issues occurring. Publishers and corporations also meddle, moving the goalposts and pushing developers to hit unreasonable deadlines. So I get it. But that doesn’t make “development is hard” a good enough excuse.

Here’s a helpful analogy: suppose I buy a house, move in, and every time I turn on the washing machine, the electric goes off. Then when I ring the electrician, he basically says “wiring a house is really hard. You wouldn’t get it because you aren’t an electrician.” That’s not an excuse. If I go to a bakery and the bread is stale and mouldy, I likewise wouldn’t accept the excuses that “baking is really difficult,” or “running a business and keeping track of sell-by dates is hard.” The same basic principle applies to video games.

Stock photo of loaves of bread in a bakery.
You wouldn’t accept sub-par bread from a baker, so why should you accept a sub-par game from a developer?

I will acknowledge and agree that game development is hard, and that bigger games are harder to make; it’s an almost exponential scale of difficulty. But trying your best and failing is still failing, and in a competitive marketplace where most games aren’t free, if you release a sub-par, broken, uninspired, or inferior game, you’re gonna get called out for it. Media criticism exists for this purpose, and the fact that a critic has never worked in the games industry, or has no experience with development, doesn’t invalidate their criticism.

When a game is listed for sale, even if it’s discounted or at a low price, players still have expectations – and those expectations aren’t “wrong” just because they didn’t see how hard the game was to create. If you’re a brand-new developer releasing your first-ever game for free and asking for feedback, then maybe some of the harshest words should be held back. But this asinine argument is too often made by publishers and executives who work for massive companies. When a game underperforms, they trot out the trusty old “game development is hard” argument as a rebuttal to critics.

Screenshot of The Lord of the Rings: Gollum showing a serious bug.
The Lord of the Rings: Gollum was widely criticised upon its release for being riddled with bugs and glitches.

In no other business or industry would customers be told that “my job is hard, you should be grateful for what you got” as a response to genuine criticism. Selling a game that’s outdated, riddled with glitches, or just not fun can’t be excused in this way, and developers – no matter how hard they may have worked and no matter what programming hurdles they may have had to overcome – have to accept that. Criticism is inevitable in entertainment and media, and even if a developer had created an impossibly perfect game, there’d still be players who didn’t like it in whole or in part, or who just weren’t interested in its narrative or its gameplay. That’s unavoidable.

Some developers and studios actively make things worse for themselves by trying to respond to criticism in this way. It never works, it never succeeds at garnering sympathy, and practically zero players come away from this conversation having more positive thoughts about the game. It’s an argument that needs to go away, and developers and publishers should think long and hard before reacting to genuine criticism with this irritating whine.

“Hot Take” #2:
Subscriptions are happening and physical discs and cartridges are dying out.

A stock photo of Mega Drive games.
A selection of Sega Mega Drive game cartridges.

This is a subject I’ve tackled before in a longer column here on the website. In that piece I took a look at the media landscape in general, talking about how the move away from physical media started with music, then moved to film and TV, and is now belatedly arriving in gaming, too. You can find that piece by clicking or tapping here, if you’re interested! But for the games industry specifically, a move away from discs and cartridges has been happening for a long time – and the rise of subscriptions could well be the final nail in the coffin.

In the very early days, no one owned a video game outright. If you wanted to play a game, you had to go to where the games were: an arcade. It was only with the growth of home consoles in the ’80s that physically owning a video game became possible for a mainstream audience, and even then, renting games or even whole systems was still a big deal. Many of the SNES, Nintendo 64, and Dreamcast games that I played through the ’90s and into the new millennium were rented, not purchased outright. The idea of owning a massive media library is, when you think about it, a relatively new phenomenon that was kicked into a higher gear when DVD box sets became a thing in the mid-2000s.

Concept art for Wreck-It Ralph showing the arcade.
Arcades (like this one from Wreck-It Ralph) used to be the only place to play video games.

In that sense, we could argue that subscriptions aren’t “changing” the way people engage with media, they’re just a return to the 20th Century status quo. For much of the history of film, television, music, and gaming, audiences have had a temporary or impermanent relationship with media… and to me, that’s absolutely fine. It’s a trade-off I and many other players are happy to make.

I could probably count on my fingers the number of games I’d want a permanent hard copy of… because most games aren’t gonna be played on a loop forever or returned to every few months. Just like when I used to rent SNES and N64 games in the ’90s, I’m totally okay with not having a huge library of titles gathering dust on a shelf (or metaphorical dust in a digital library), because once I’ve beaten a title like Donkey Kong 64 or Bioshock, I’m in no rush to play them again.

Promo screenshot of Red Dead Redemption II.
Red Dead Redemption II is one of just a handful of games I might conceivably want a hard copy of.

Speaking as someone on a low income, subscription services like Netflix and Xbox Game Pass open up a huge library of titles to me – allowing me to play more games than I’d ever be able to afford if I had to buy or even rent them individually. I’ve played dozens of games over the past couple of years that I’d never have bought for myself, and some of them have become personal favourites. Subscriptions like Game Pass are a great way into gaming for players on a budget – because for a single monthly fee, a huge library of titles becomes available.

If the trade-off for that is that titles are occasionally removed from the platform and become unplayable… well, I’m okay with that. And for once-in-a-generation masterpieces like Red Dead Redemption II or Baldur’s Gate 3, I’m happy to splash out. When you consider that an annual subscription to Game Pass is more or less the same price as buying one or two games… you start to see why people are choosing to sign up. I wouldn’t be surprised at all if Xbox, PlayStation, or both choose to go all-digital later in the decade when their next-generation machines are ready.

“Hot Take” #3:
Microtransactions have no place in single-player games.

A screenshot of part of Starfield's in-game shop.
*cough* Starfield *cough*

I’m not wild about microtransactions in general – but in online multiplayer games and especially free-to-play titles, I accept that they’re an established funding model. They should still be regulated and prevented from being exploitative, but in those genres the microtransaction model seems to work well enough. But in a single-player game? Microtransactions need to GTFO.

Going back decades, games have released expansion packs – and large pieces of content that add new maps, quests, characters, and so on are usually okay. Look at something like Morrowind’s expansion Bloodmoon, or a more recent example like Phantom Liberty for Cyberpunk 2077. These are the kinds of expansion packs that have always been okay. Some are better than others, sure, and some expansions offer much more in terms of value. But as a general rule, I’m okay with expansion packs.

A still frame from the trailer for Cyberpunk 2077: Phantom Liberty showing Johnny Silverhand in a helicopter.
Phantom Liberty is a great example of an expansion pack that offers good value.

But in a single-player game, I shouldn’t be asked to purchase a “premium currency,” weapon skins, cosmetic items, and so forth. These microtransactions have no place in a single-player title, and there’s no excuse for adding them in other than pure, unadulterated greed. If a game like No Man’s Sky can remain profitable for Hello Games for close to a decade without charging for a single additional piece of content, there’s no excuse for the disgusting in-game marketplace in a title like Starfield.

I love a game with cosmetic customisation. Making my character feel personal to me goes a long way to enhancing the experience and making my playthrough feel like “mine,” so I enjoy having the option to change a hairstyle, outfit, or do things like re-paint a vehicle. But these things are an integral part of the game experience – not something to charge extra for. Exploiting players by locking basic items behind a paywall is despicable – and that’s before we say anything about “XP boosters,” damage multipliers, and other pay-to-win or pay-to-skip-the-grind items.

Steam page for No Man's Sky showing that the game has no DLC.
Oh look, it’s all of the DLC available for No Man’s Sky

I’ll also include in this category “super premium deluxe editions” of games that come with exclusive content. You might think that Han Solo’s vest in Star Wars Outlaws is okay to lock behind a paywall, but some games do this with whole quests. Hogwarts Legacy infamously locked an entire mission behind a paywall, and it’s far from the only game to have done so in recent years. Offering an in-game item as a pre-order bonus is one thing, locking a whole chest full of items and even pieces of gameplay behind an expensive “luxury edition” that can easily run to $100 or more is just scummy.

If I’m paying full price for a game, I don’t expect that game to reach into my wallet and try to grab even more cash every time I want to use a consumable item or change my character’s appearance. I tend to avoid online multiplayer games, where this phenomenon primarily exists, but inserting a microtransaction marketplace into a single-player game where it has absolutely no business being is enough to make me uninstall that title and never return to it. I’ll even refund it if I can. Some studios have even taken to concealing in-game marketplaces at launch, hoping to garner better reviews and more sales, before adding them in a few weeks or months later. Truly disgusting stuff.

“Hot Take” #4:
You aren’t paying for “early access,” you’re being charged an additional fee to play the game on its real release date.

Early access info for Indiana Jones and the Great Circle.
An example of what I’m talking about.

“Early access” is controversial in general, but let me just say before we start that I’m generally supportive of smaller studios and indie developers using early access as a way to get feedback and even to keep the lights on during what can be a difficult process. I very rarely touch an early access title, but independent devs should always feel free to use whatever tools are available to them, including launching an early access version of their game. But that’s where my patience with early access ends.

Recently we’ve seen two pretty shitty trends in the games industry: firstly, massive studios backed up by big publishers have been abusing early access, sometimes leaving a game officially unreleased for four, five, or six years, charging almost full price for it all the while. And secondly, the issue we’re looking at today: “early” access for an extra charge.

Promo graphic for Star Wars Outlaws showing the different versions of the game.
Ubisoft wanted to charge players an extortionate amount of money to play Star Wars Outlaws on its real release date.

This kind of “early” access usually grants players access to a game a few days or maybe a week ahead of its official release date, but by that point the game is finished and should be ready to go. The “early” version that players get is usually no different from the launch version, and there’s no time for a studio to act on player feedback or patch bugs. This is a scam, plain and simple, and an excuse for wringing even more money out of players.

If a game launches on the 1st of September for players who pay £100, and the 6th of September for players who “only” pay £65, then the release date is the 1st of September. They’ve just charged more to players who want to play on release day – or, if you flip things around, deliberately penalised players who didn’t splash the extra cash. These versions of games – which I think we should call “real release date” versions – are often $20, $30, or $40 more expensive than their delayed counterparts.

A stock photo of a hand holding burning dollar bills.
And who has that kind of money to waste these days?

Buying a game on day one is a risk nowadays. So many games – even those that go on to be hailed as masterpieces – arrive on launch day with bugs, glitches, and other problems. So paying extra to play what is almost always a demonstrably shittier version of a game just feels… stupid. I’ve been burned by this before, and just as with pre-orders, I’ve sworn to never again pay for so-called “early” access.

I’d like to see digital storefronts like Steam and the Epic Games Store – and ideally Xbox and PlayStation too – clamp down on this practice. Early access should be reserved for studios that need it, and charging players extra to play a game on release day is something that should be banned outright.

“Hot Take” #5:
Players’ expectations aren’t “too high.”

A stock photo of an angry man holding a PlayStation control pad.
It isn’t the players that are wrong…

There have been some fantastic games released over the last few years. Red Dead Redemption II, Baldur’s Gate 3, and Kena: Bridge of Spirits all come to mind in the single-player space, but I’m sure you have your own favourite. These games are, in a word, masterpieces; titles that did everything right and are rightly considered to be at the very pinnacle of not only their genres but video games as an art form in general. So… if your game doesn’t get that kind of glowing reception, whose fault is it?

Some developers think it’s the fault of players, and that we’ve had our expectations set “too high.” They argue that it was unrealistic to expect their game to be as engaging or entertaining as others in the genre, and we should be grateful for what we got. They worked hard on it, after all.

A screenshot from Starfield showing a first-person perspective and three NPCs.
I wonder which game might’ve prompted this “hot take.”

The tl;dr is this: it isn’t the fault of players if they don’t like your game – it’s yours. Complaining about high expectations makes no sense when other titles have demonstrably been able to meet and even exceed those expectations, so if you learned nothing from your competition, once again that isn’t anyone else’s fault but yours! That’s to say nothing of the out-of-control and frequently dishonest marketing that promises players way more than the game can deliver. Studios and publishers are responsible for reining in hype and keeping their marketing honest. That, more than anything else, will help players set appropriate expectations.

I get it: it isn’t fun to be criticised or see your work picked apart. It’s even less fun to see a game you worked hard on for a long time compared negatively to another title in the same space. But to lash out at players – the people who are supposed to be your customers and the people it’s your job to entertain – just doesn’t make any sense to me. Not only is it wrong, but it also risks building up resentment and ill-will, so the next time you work on a game and get it ready for launch, players will be even more sceptical and perhaps even quicker to criticise.

A stock photo of a smartphone showing social media apps.
This is a problem exacerbated by social media.

Thankfully, not all developers say this – at least not in public! I heard complaints like this from time to time when I worked in the industry, but most developers I worked with were smart enough to keep such thoughts to themselves. Fortunately, it’s only a minority who take this argument into the public square.

Some developers need to get off social media. Social media is a great tool, don’t get me wrong, and being able to communicate directly with players can be useful in some situations. But if a developer is so thin-skinned that they feel the need to react in real-time and respond to every armchair critic and Twitter troll… that can’t be good for them, and it certainly isn’t good for the company they work for. For their own good, some developers need to shut down their social media profiles!

So that’s it… for now!

A promo graphic of an Xbox Series control pad.
I hope this wasn’t too controversial!

I’m always finding more “hot takes” and things to criticise in the games industry, so I daresay this won’t be the last time I put together a piece like this one! Despite what I’ve said today, I still really enjoy gaming as a hobby and I find there are far more positives than negatives. And if you hated all of my points, just remember that all of this is the entirely subjective opinion of a single old gamer.

So I hope this has been a bit of fun… and maybe a little thought-provoking in places, too. If you don’t agree with any of my points, that’s totally okay! I tried my best to present my arguments as articulately as possible, but these are “hot takes,” so I’m sure plenty of people can and will disagree with all of them. If I gave you a chuckle or you found this discussion interesting in some way, then I reckon I’ve done my job!

Until next time… and happy gaming!


All titles discussed above are the copyright of their respective publisher, studio, and/or developer. This article contains the thoughts and opinions of one person only and is not intended to cause any offence.

The worst things about modern video games

The first home console I owned – after saving up my hard-earned pocket money and pestering my parents for ages – was a Super Nintendo. Gaming has changed a lot since then, and while many of those changes have been fantastic and introduced us to new genres, not every change has been for the better! In this list I’m going to cover some of my biggest pet peeves with video games in 2021.

As always, this list is entirely subjective. If I criticise something you like, or exclude something you hate, just keep in mind that this is only one person’s opinion. Gaming is a huge hobby that includes many people with many different perspectives. If yours and mine don’t align, that’s okay!

Number 1: No difficulty options.

Some people play video games because they love the challenge of a punishingly difficult title, and the reward of finally overcoming an impossible level after hours of perseverance. I am not one of those people! In most cases, I play video games for escapism and entertainment – I want to see a story unfold or just switch off from other aspects of my life for a while. Excessive difficulty is frustrating and off-putting for me.

As someone with health issues, I would argue that difficulty settings are a form of accessibility. Some people don’t have the ability to hit keys or buttons in rapid succession, and in some titles the lack of a difficulty setting – particularly if the game is not well-balanced – can mean those games are unavailable to folks with disabilities.

While many games are too difficult, the reverse can also be true. Some titles are just too easy for some people – I’m almost never in that category, but still! Games that have no difficulty settings where the base game is incredibly easy can be unenjoyable for some folks, particularly if the challenge was what got them interested in the first place.

In 2021, most games have difficulty options as a standard feature. Difficulty settings have been part of games going back decades, and in my opinion there’s no technical reason why they shouldn’t be included. There’s also not really a “creative” reason, either. Some developers talk in grandiose terms about their “vision” for a title being the reason why they didn’t implement difficulty options, but as I’ve said before – the inclusion of an easier (or harder) mode does not impact the game at all. It only impacts those who choose to turn it on, and considering how easy it is to implement, I find it incredibly annoying when a game is deliberately shipped without any difficulty options.

Number 2: Excessive difficulty as a game’s only selling point.

While we’re on the subject of difficulty, another pet peeve of mine is games whose entire identity is based on their difficulty (or perceived difficulty). Think about this for a moment: would Dark Souls – an otherwise bland, uninspired hack-and-slash game – still be talked about ten years after its release were it not for its reputation as impossibly difficult? How many late 2000s or early ’10s hack-and-slash games have dropped out of the cultural conversation? The only thing keeping Dark Souls there is its difficulty.

A challenge is all well and good, and I don’t begrudge players who seek that out. But for me, a game has to offer something more than that. If there’s a story worth telling under the difficult gameplay I’m impressed. If the difficult, punishing gameplay is all there is, then that’s boring!

Difficulty can also be used by developers as cover for a short or uninteresting game. Forcing players to replay long sections over and over and over can massively pad out a game’s runtime, and if that’s a concern then cranking the difficulty to ridiculous levels – and offering no way to turn it down – can turn a short game into a long one artificially.

I’m all for games that offer replay value, but being forced to replay the same level or checkpoint – or battle the same boss over and over – purely because of how frustratingly hard the developers chose to make things simply isn’t fun for me.

Number 3: Ridiculous file sizes.

Hey, Call of Duty: your crappy multiplayer mode does not need to be 200 gigabytes. Nor does any game, for that matter. It’s great that modern technology allows developers to create realistic-looking worlds, but some studios are far better than others when it comes to making the best use of space! Some modern games do need to be large to incorporate everything, but even so there’s “large” and then there’s “too large.”

For a lot of folks this is an issue for two main reasons: data caps and download speeds. On my current connection I’m lucky to get a download speed of 7 Mbps, and downloading huge game files can quite literally take several days – days in which doing anything else online would be impossibly slow! But I’m fortunate compared to some people, because I’m not limited in the amount of data I can download by my ISP.
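To put rough numbers on that, here’s a back-of-the-envelope sketch. The figures are illustrative rather than exact – real downloads are slower still thanks to protocol overhead and throttling:

```python
def download_time_days(size_gb: float, speed_mbps: float) -> float:
    """Rough time to download a file, in days.

    Uses decimal units (1 GB = 8,000 megabits) and assumes the
    connection sustains its full speed the whole time -- real-world
    downloads will be slower due to overhead and congestion.
    """
    megabits = size_gb * 8 * 1000      # convert gigabytes to megabits
    seconds = megabits / speed_mbps    # transfer time at line speed
    return seconds / 86400             # seconds -> days

# A 200 GB game on a 7 Mbps connection:
print(round(download_time_days(200, 7), 1))  # ~2.6 days
```

So a single Call of Duty-sized download really can monopolise a slow connection for the best part of three days.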

In many parts of the world, and on cheaper broadband connections, data caps are very much still a thing. Large game files can take up an entire month’s worth of data – or even more in some cases – making games with huge files totally inaccessible to a large number of people.

This one doesn’t seem like it’s going away any time soon, though. In fact, we’re likely to see file sizes continue to get larger as games push for higher resolutions, larger environments, and more detail.

Number 4: Empty open worlds.

Let’s call this one “the Fallout 76 problem.” Open worlds became a trend in gaming at some point in the last decade, such that many franchises pursued this style even when it didn’t suit their gameplay. Read the marketing material of many modern titles and you’ll see bragging about the size of the game world: 50km², 100km², 1,000km², and so on. But many of these open worlds are just empty and boring, with much of the map taken up with vast expanses of nothing.

It is simply not much fun to have to travel across a boring environment – or even a decently pretty one – for ages just to get to the next mission or part of the story. Level design used to be concise and clever; modern open worlds, especially those which brag about their size, tend to be too large, with too little going on.

The reason Fallout 76 encapsulates this for me is twofold. Firstly, Bethesda droned on and on in the weeks before the game’s release about how the world they’d created was the “biggest ever!” And secondly, the game launched with literally zero non-player characters. That huge open world was populated by a handful of other players, non-sentient monsters, and nothing else. It was one of the worst games of the last few years as a result.

Open worlds can work well in games that are suited for that style of gameplay. But too many studios have been pushed into creating an open world simply to fit in with a current trend, and those open worlds tend to just flat-out suck because of it. Even when developers have tried to throw players a bone by adding in collect-a-thons, those get boring fast.

Number 5: Pixel graphics as a selling point.

There are some great modern games that use a deliberately 8-bit look. But for every modern classic there are fifty shades of shit; games that think pixel graphics and the word “retro” are cover for creating a mediocre or just plain bad title.

It may be hard to remember, but there was a time when the idea of using a deliberately “old-school” aesthetic would have been laughed at. The first few console generations were all about improvements, and I’m old enough to remember when 3D was a huge deal. It seemed like nobody would ever want to go back to playing a SNES game after trying the Nintendo 64, and while there are still plenty of gamers who love the retro feel, I’m generally not one of them.

That isn’t to say that realistic graphics should be the only thing a game strives for. And this point works for modern graphics or visual styles in general – bragging about how detailed the graphics are, or how unique a title’s art style is, means nothing if the game itself is shit. But it likewise works for pixel-graphics games – an outdated art style does not compensate for or cover up a fundamentally flawed, unenjoyable experience.

Games with pixel graphics can be good, and many titles have surprised me by how good they are. I’ve written before about how Minecraft surprised me by being so much more than I expected, and that’s one example. But I guess what I’d say is this: if your game looks like it should have been released in 1991, you’ve got more of an uphill battle to win me over – or even convince me to try it in the first place – than you would if your game looked new.

Number 6: Unnecessary remakes.

We called one of the entries above “the Fallout 76 problem,” so let’s call this one “the Mass Effect: Legendary Edition problem.” In short, games from even ten or fifteen years ago still look pretty good and play well. There’s far less of a difference between games from 2011 and 2021 than there was between games from 1991 and 2001 – the pace of technological change, at least in gaming, has slowed.

“Updating” or “remaking” a game from ten years ago serves no real purpose, and in the case of Mass Effect: Legendary Edition I’ve struggled at times to tell which version of the game is the new one when looking at pre-release marketing material. There’s no compelling reason to remake games that aren’t very old. Re-release them or give them a renewed marketing push if you want to drum up sales or draw attention to a series, but don’t bill your minor upgrade as a “remake.”

There are some games that have benefitted hugely from being remade. I’d point to Crash Bandicoot and Resident Evil 2 as two great examples. But those games were both over twenty years old at the time they were remade, and having been released in the PlayStation 1 era, both saw massive upgrades such that they were truly worthy of the “remake” label.

I’ve put together two lists of games that I’d love to see remade, but when I did so I deliberately excluded titles from the last two console generations. Those games, as I said at the time, are too recent to see any substantial benefits from a remake. In another decade or so, assuming sufficient technological progress has been made, we can talk about remaking PlayStation 3 or PlayStation 4 games – but not now!

Number 7: Fake “remakes.”

On a related note to the point above, if a title is billed as a “remake,” I expect to see substantial changes and improvements. If all that’s happened is a developer has run an old title through an upscaler and added widescreen support, that’s not a remake!

A lot of titles that acquire the “HD” suffix seem to suffer from this problem. Shenmue I & II on PC contained a number of bugs and glitches – some of which existed in the Dreamcast version! When Sega decided to “remake” these two amazing games, they couldn’t even be bothered to patch out bugs that were over fifteen years old. That has to be some of the sloppiest, laziest work I’ve ever seen.

There are other examples of this, where a project may have started out with good intentions but was scaled back and scaled back some more to the point that it ended up being little more than an upscaled re-release. Kingdoms of Amalur: Re-Reckoning springs to mind as an example from just last year.

Remakes are an opportunity to go back to the drawing board, fix issues, update a title, and bring it into the modern world. Too many “remakes” fail to address issues with the original version of the game. We could even point to Mass Effect: Legendary Edition’s refusal to address criticism of the ending of Mass Effect 3 as yet another example of a missed opportunity.

Number 8: The “release now, fix later” business model.

This isn’t the first time I’ve criticised the “release now, fix later” approach taken by too many modern games – and it likely won’t be the last! Often branded as “live services,” games that go down this route almost always underperform and draw criticism, and they absolutely deserve it. The addition of internet connectivity to home consoles has meant that companies take a “good enough” approach, releasing games before they’re ready with the intention of patching out bugs and adding more content later.

Cyberpunk 2077 is one of the most recent and most egregious examples of this phenomenon, being released on Xbox One and PlayStation 4 in a state so appallingly bad that many considered it “unplayable.” But there are hundreds of other examples going back to the early part of the last decade. Fortunately, out of all the entries on this list, this is the one that shows at least some signs of going away!

The fundamental flaw in this approach, of course, is that games with potential end up having launches that are mediocre at best, and when they naturally underperform due to bad reviews and word-of-mouth, companies panic! Planned updates are scrapped to avoid pumping more money into a failed product, and a game that could have been decent ends up being forgotten.

For every No Man’s Sky that manages to claw its way to success, there are a dozen Anthems or Mass Effect: Andromedas which fail. Time will tell if Cyberpunk 2077 can rebuild itself and its reputation, but it’s an uphill struggle – and a totally unnecessary one; a self-inflicted wound. If publishers would just delay clearly-unfinished games instead of forcing them to meet arbitrary deadlines, gaming would be a much more enjoyable hobby. Remember, everyone: NO PRE-ORDERS!

Number 9: Forcing games to be multiplayer and/or scrapping single-player modes.

Some games are built from the ground up with multiplayer in mind – but many others are not, and have multiplayer modes tacked on for no reason. The Last of Us had an unnecessary multiplayer mode, as did Mass Effect 3. Did you even know that, or notice those modes when you booted up those story-focused games?

Some games – and even whole genres – are just not well-suited to multiplayer, while others that do suit it still have the potential for single-player stories too. Many gamers associate the first-person shooter genre with multiplayer, and it’s true that multiplayer games work well in that space. But so do single-player titles, and aside from 2016’s Doom and the newer Wolfenstein titles, I can’t think of many recent single-player first-person shooters – or even shooters whose single-player modes felt anything other than tacked-on.

Anthem is one of the biggest failures of the last few years, despite BioWare wanting it to be the video game equivalent of Bob Dylan. But if Anthem hadn’t been multiplayer and had instead maintained BioWare’s usual single-player focus, who knows what it could have been. There was potential in its Iron Man-esque flying suits, but that potential was wasted on a mediocre-at-best multiplayer shooter.

I started playing games before the internet, when “multiplayer” meant buying a second controller and plugging it into the console’s only other available port! So I know I’m biased because of that. But just a few short years ago it felt as though there were many more single-player titles, and fewer games that felt as though multiplayer modes had been artificially forced in. In the wake of huge financial successes such as Grand Theft Auto V, Fortnite, and the like, publishers see multiplayer as a cash cow – but I wish they didn’t!

Number 10: Early access.

How many times have you been excited to see that a game you’ve been waiting for is finally available to buy… only to see the two most awful words in the entire gaming lexicon: “Early Access?” Early access billed itself as a way for indie developers to get feedback on their games before going ahead with a full release, and I want to be clear on this point: I don’t begrudge indie games using it for that purpose. Indies get a pass!

But recently there’s been a trend for huge game studios to use early access as free labour; a cheap replacement for paying the wages of a quality assurance department. When I worked for a large games company in the past, I knew a number of QA testers, and the job is not an easy one. It certainly isn’t one that studios should be pushing off onto players, yet that’s exactly what a number of them have been doing. Early access, if it exists at all, should be a way for small studios to hone and polish their game, and maybe add fan-requested extras, not for big companies to save money on testers.

Then there are the perpetual early access games. You know the ones: they entered early access in 2015 and are still there today. Platforms like Steam that offer early access need to set time limits, because unfortunately some games are just taking the piss. If your game has been out since 2015, then it’s out. It’s not in early access; you’ve released it.

Unlike most of the entries on this list, early access started out with genuinely good intentions. When used appropriately by indie developers, it’s fine and I don’t have any issue with it. But big companies should know better, and games that enter early access and never leave should be booted out!

Bonus: Online harassment.

Though this problem afflicts the entire internet regardless of where you go, it’s significant in the gaming realm. Developers, publishers, even individual employees of games studios can find themselves subjected to campaigns of online harassment by so-called “fans” who’ve decided to take issue with something in a recent title.

Let’s be clear: there is never any excuse for this. No game, no matter how bad it is, is worth harassing someone over. It’s possible to criticise games and their companies in a constructive way, or at least in a way that doesn’t get personal. There’s never any need to go after a developer personally, and especially not to send someone death threats.

We’ve seen this happen when games are delayed. We’ve seen it happen when games release too early in a broken state. In the case of Cyberpunk 2077, we’ve seen both. Toxic people will always find a reason to be toxic, unfortunately, and in many ways the anonymity of the internet has brought out the worst in human nature.

No developer or anyone who works in the games industry deserves to be threatened or harassed. It’s awful, it needs to stop, and the petty, toxic people who engage in this scummy activity do not deserve to be called “fans.”

So that’s it. Ten of my pet peeves with modern gaming.

This was a rant, but it was just for fun so I hope you don’t mind! There are some truly annoying things – and some truly annoying people – involved in gaming in 2021, and as much fun as playing games can be, it can be a frustrating experience as well. Some of these things are fads – short-term trends that will evaporate as the industry moves on. But others, like the move away from single-player games toward ongoing multiplayer experiences, seem like they’re here to stay.

Gaming has changed an awful lot since I first picked up a control pad. And it will continue to evolve and adapt – the games industry may be unrecognisable in fifteen or twenty years’ time! We’ll have to keep our fingers crossed for positive changes to come.

All titles mentioned above are the copyright of their respective developer, publisher, and/or studio. Some stock images courtesy of pixabay. Some screenshots and promotional artwork courtesy of IGDB. This article contains the thoughts and opinions of one person only and is not intended to cause any offence.