Gaming “Hot Takes”

I’m back with another edition of my infamous Gaming “Hot Takes!” I’ve officially given up on numbering these – I think this might be piece number four or five – but I’ve made several other posts over the last few years sharing my “hot takes” on gaming and the games industry in general. As I’ve said before, it’s never long before something in the world of gaming comes along to prompt another “hot take,” so that’s what we’re gonna look at today!

Video games are… well, they’re pretty darn good, to be honest with you. And I always like to make sure you know that I’m not some kind of “hater;” I like playing video games, and there are some titles that I genuinely believe eclipse films and TV shows in terms of their world-building, storytelling, or just pure entertainment value. We’re going to tackle some controversial topics today, though!

Atari Jaguar logo + console on a black background.
Let’s get into some gaming “hot takes!”

Before we take a look at the “hot takes,” I have a few important caveats. Firstly, I’m well aware that some or all of these points are the minority position, or at least contentious. That’s why they’re called “hot takes” and not “very obvious takes that everyone will surely agree with!” Secondly, this isn’t intended to be taken too seriously, so if I criticise a game or company you like, just bear that in mind. Finally, all of this is the entirely subjective opinion of just one person – not objective fact.

Although gamers can be a cantankerous bunch, I still like to believe that there’s enough room – and enough maturity – in the wider community for respectful discussion and polite disagreement that doesn’t descend into name-calling and toxicity! So let’s all try to keep that in mind as we jump into the “hot takes,” eh?

“Hot Take” #1:
If your game is still in “early access,” you shouldn’t be allowed to sell DLC.

Steam pre-order info showing early access.
Pre-purchase to play early!

“Early access” means a game hasn’t been released yet, right? That’s what it’s supposed to mean, anyway – though some titles take the absolute piss by remaining in early access for a decade or more. But if you haven’t officially released your game, your focus ought to be on, y’know, finishing the game instead of working on DLC. Paid-for downloadable content for games that are still officially in “early access” is just awful.

Star Citizen is arguably the most egregious example of this. The game – from what I’ve seen – would barely qualify as an “alpha” version, yet reams of overpriced downloadable content are offered for sale. Some of it exists in-game, but a lot of it is really just a promise; an I.O.U. from the developers, pledging to build a ridiculously expensive spaceship if and when time permits.

Several DLC spaceships from Star Citizen's webstore.
Expensive DLC ships in Star Citizen.

Early access has a place in gaming, and I don’t want to see it disappear. But that place is with smaller independent projects seeking feedback, not massive studios abusing the model. Selling DLC that doesn’t exist for a game that also doesn’t fully exist feels like a total piss-take, and given how often these things go horribly wrong, I’m surprised to see people still being lured in and falling for what can, at times, feel like a scam.

There have been some fantastic expansion packs going back decades, and I don’t object to DLC – even if it’s what I would usually call a pack of overpriced cosmetic items. But when the main game isn’t even out, and is supposedly still being worked on, I don’t think it’s unreasonable to say that charging money for DLC is wrong – these things should either be free updates or, if they’re definitely going to be sold separately, held in reserve until the game is launched.

“Hot Take” #2:
Bethesda Game Studios has basically made four good games… ever.

Four Bethesda role-playing/action games.
Yup, you heard me.

Morrowind, Oblivion, Fallout 3, and Skyrim. That’s it. That’s the list. From 2002 to 2011 – less than a decade – Bethesda Game Studios managed to develop and release four genuinely good games… but hasn’t reached that bar since. Bethesda has spent longer as a declining, outdated, and thoroughly mediocre developer than it ever did as a good one. The studio is the games industry’s equivalent of The Simpsons: fantastic in its prime, but what followed has been a long period of stagnation, decay, and mediocrity as it’s been completely overtaken and eclipsed by competitors. To be blunt… I don’t see Starfield’s next (and probably last) expansion pack, or The Elder Scrolls VI, changing that.

There is a retro charm to the likes of Arena and Daggerfall, and I won’t pretend that Fallout 4 didn’t have its moments. Even Starfield, with all of its limitations and issues, still had interesting elements, and the ship-builder was genuinely fun to use… at least at first. But since Skyrim in 2011, I would argue that Bethesda has been in decline. In fact, I believe Skyrim’s unprecedented success broke something fundamental in the way Bethesda’s executives and directors think about games. Gone was the idea of games as one-and-done things to be created and released. Replacing it was the concept I’ve called the “single-player live service,” where titles were transformed into “ten-year experiences” that could be monetised every step of the way.

Screenshot of Starfield's microtransaction store.
Starfield has an in-game marketplace comparable to even the worst free-to-play mobile games.

As I said recently, I don’t have a lot of faith in The Elder Scrolls VI any more. It seems all but certain to contain another disgusting in-game marketplace for skins, items, and even entire questlines and factions. When there are so many other games to play that aren’t hideously over-monetised… why should I bother getting excited for The Elder Scrolls VI? Even worse, it’s being made in Bethesda’s “Creation Engine;” the zombified remains of software from thirty years ago that clearly isn’t up to the task and hasn’t been for a while.

Bethesda’s decline has been slow, and folks who skipped titles like Starfield and Fallout 76 might not be aware of just how bad things have gotten. Maybe I’m wrong, and maybe The Elder Scrolls VI will be a miraculous return to form. I hope so – I never want to root for a game to fail. But with so many other role-playing games out now or on the horizon… I just don’t see it measuring up as things stand. And in a way, I can’t help but feel it would be better in the long run if another studio were to take on the project.

“Hot Take” #3:
There won’t ever be another 1983-style “crash.”

Black-and-white image of video games on shop shelves with a red "downwards" arrow superimposed on top.
It ain’t gonna happen.

Given the absolute state of modern gaming – at least insofar as many of the industry’s biggest corporations are concerned – I genuinely get where this feeling is coming from. But I think the people making this argument either don’t fully understand the 1983 crash, or don’t appreciate how massive gaming as a whole has become in the decades since then.

In short: in 1983, video games weren’t much more than pretty expensive digital toys. The home console market was relatively small, and like so many products over the years, video games themselves could genuinely have been a flash in the pan; something comparable to LaserDisc, the hovercraft, or, to pick a more modern example, Google Glass. All of these technologies threatened to change the world… but didn’t. They ended up being temporary fads that were quickly forgotten.

Photo of discarded and buried Atari game cartridges from the 1983 crash.
Atari dumped unsold games in a New Mexico landfill during the crash.
Photo: taylorhatmaker, CC BY 2.0 https://creativecommons.org/licenses/by/2.0, via Wikimedia Commons

Fast-forward to 2025. The games industry is massive. So many people play games in some form or another that the idea of a total market collapse or “crash” is beyond far-fetched. That isn’t to say there won’t be changes and shake-ups – whole companies could disappear, including brands that seem massive and unassailable right now. Overpriced games and hardware are going to be challenges, too. Changing technology – like generative A.I. – could also prove to be hugely disruptive, and there could be new hardware, virtual reality, and all sorts.

But a 1983-style crash? Gaming as a whole on the brink of disappearing altogether? It ain’t gonna happen! There is still innovation in the industry, though these days a lot of it is being driven by independent studios. Some of these companies, which are small outfits right now, could be the big corporations of tomorrow, and some of the biggest names in the industry today will almost certainly fall by the wayside. Just ask the likes of Interplay, Spectrum HoloByte, and Atari. But whatever may happen, there will still be games, there will still be big-budget games, and there will still be hardware to play those games on. Changes are coming, of that I have no doubt. But there won’t be another industry crash that comes close to what happened in ’83.

“Hot Take” #4:
Nintendo’s die-hard fans give the company way too much leniency and support – even for horribly anti-consumer shenanigans.

Stock photo of a "riot."
If you dare to criticise Nintendo, fans are going to riot!

I consider myself a fan of Nintendo’s games… some of them, at least. I’ve owned every Nintendo console from the SNES to the first Switch, and unless something major comes along to dissuade me, I daresay I’ll eventually shell out for a Switch 2, too. But I’m not a Nintendo super-fan, buying every game without question… and some of those folks, in my opinion at least, are far too quick to defend the practices of a greedy corporation that doesn’t care about them in the slightest.

Nintendo isn’t much different from the likes of Ubisoft, Activision, Electronic Arts, Sony, Sega, and other massive publishers in terms of its business practices and its approach to the industry. But none of those companies have such a well-trained legion of die-hard apologists, ready to cover for them no matter how badly they screw up. Nintendo fans will happily leap to the defence of their favourite multi-billion dollar corporation for things they’d rightly criticise any other gaming company for. Price hikes, bad-value DLC, lawsuits against competitors or fans, underbaked and incomplete games… Nintendo is guilty of all of these things, yet if you bring up these points, at least in some corners of the internet, there are thousands of Nintendo fans piling on, shouting you down.

Still frame from the Nintendo Switch 2 broadcast showing Welcome Tour.
Welcome Tour.

Obviously the recent launch of the Switch 2 has driven this point home for me. The console comes with a very high price tag, expensive add-ons, a paid-for title that should’ve been bundled with the system, an eShop full of low-quality shovelware, literally only one exclusive launch title, and over-inflated prices for its first-party games. But all of these points have been defended to the death by Nintendo’s super-fans; criticising even the shitty, overpriced non-entity Welcome Tour draws as much vitriol and hate as if you’d personally shat in their mother’s handbag.

Very few other corporations in the games industry enjoy this level of protection from a legion of well-trained – and pretty toxic – super-fans. And it’s just… odd. Nintendo has made its share of genuinely bad games. Nintendo has made plenty of poor decisions over the years. Nintendo prioritises profit over everything else, including its own fans and employees. Nintendo is overly litigious, suing everyone from competitors to its own fans. And Nintendo has taken actions that are damaging to players, families, and the industry as a whole. Gamers criticise other companies when they behave this way; Electronic Arts is routinely named as one of America’s “most-hated companies,” for instance. But Nintendo fans are content to give the corporation cover, even for its worst and most egregious sins. They seem to behave like fans of a sports team, insistent that “team red” can do no wrong. I just don’t understand it.

“Hot Take” #5:
“Woke” is not synonymous with “bad.”
(And many of the people crying about games being “woke” can’t even define the word.)

Screenshot of a famous YouTube video/meme of a commentator screaming at the camera about "pronouns" in Starfield.
He seems like a reasonable man…

In some weird corners of social media, a game (or film or TV show) is decreed “woke” if a character happens to be LGBT+ or from a minority ethnic group. And if such a character is featured prominently in pre-release marketing material… that can be enough to start the hate and review-bombing before anyone has even picked up a control pad. The expression “go woke, go broke” does the rounds a lot… but there are many, many counter-examples that completely disprove this point.

Baldur’s Gate 3 is a game where the player character can be any gender – and their gender is not defined by their genitals. Players can choose to engage in same-sex relationships, practically all of the companion NPCs are pansexual, and there are different races and ethnicities represented throughout the game world. But Baldur’s Gate 3 sold incredibly well, and will undoubtedly be remembered as one of the best games of the decade. So… is it “woke?” If so, why didn’t it “go broke?”

Screenshot of a nude character from Baldur's Gate 3.
The famously not-woke-at-all Baldur’s Gate 3.

Many “anti-wokers” claim that they aren’t really mad about women in leading roles, minority ethnic characters, or LGBT+ representation, but “bad writing.” And I will absolutely agree that there are some games out there that are genuinely poorly-written, or which have stories I just did not care for in the least. The Last of Us Part II is a great example of this – the game’s entire narrative was based on an attempt to be creative and subversive, but it hacked away at too many of the fundamentals of storytelling to be satisfying and enjoyable. But you know what wasn’t the problem with The Last of Us Part II? The fact that one of its secondary characters was trans and another female character was muscular.

Good games can be “woke” and “woke” games can be good. “Woke” games can also be bad, either for totally unrelated reasons or, in some cases, because they got too preachy. But to dismiss a game out of hand – often without playing it or before it’s even launched – because some armchair critic on YouTube declared it to be “woke” is just silly. Not only that, but there are many games that contain themes, storylines, or characters that could be reasonably described as “woke” that seem to be completely overlooked by the very folks who claim it’s their mission to “end wokeness.” The so-called culture war is just a very odd thing, and it’s sad to see how it’s impacted gaming. I would never tell anyone they “must” play or not play certain games, but I think it’s a shame if people miss out on genuinely fun experiences because of a perception of some ill-defined political concept that, in most cases, doesn’t have much to do with the game at all.

So that’s it!

Screenshot of Mario in the castle in Super Mario 64.
It’s a-me, Mario!

We’ve looked at a few more of my infamous “hot takes!” I hope it’s been a bit of fun… and not something to get too upset about! It’s totally okay to disagree, and one of the great things about gaming nowadays is that there’s plenty of choice. If you like a game that I don’t, or I enjoy a genre you find boring… that’s okay. If you’re a super-fan of something that I’m not interested in… we can still be friends. Even if we don’t agree politically, we ought to be able to have a civil and reasonable conversation without screaming, yelling, or name-calling!

Be sure to check out some of my other “hot takes.” I’ve linked a few other pieces below. And I daresay there’ll be more of these one day soon… I keep finding things in gaming to disagree with, for some reason. It must be because I’m getting grumpy in my old age; I’m just a big ol’ sourpuss!

Have fun out there, and happy gaming!


All titles mentioned above are the copyright of their respective publisher, developer, and/or studio. This article contains the thoughts and opinions of one person only and is not intended to cause any offence.


Links to other gaming “hot takes:”

Games Industry “Hot Takes”

A few months ago, I put together a list of “hot takes” about video games. As much as I enjoy gaming as a hobby, there are things that annoy me and things to criticise! There were a few other things that I considered including, but they didn’t really fit with that list. These “hot takes” have less to do with games themselves and more to do with the games industry, development, and gaming as a whole – so that’s what we’re going to discuss today!

If you’re interested in checking out that earlier list, by the way, you can find part one by clicking or tapping here, and part two by clicking or tapping here.

Whenever I use the term “hot take” it’s because I’m acutely aware that we’re talking about something contentious! So before we get started, let’s re-emphasise that: these are all topics of debate among players and critics, and mine may well be the minority position. I don’t pretend to be 100% right, and I welcome disagreements and differences of opinion.

A stock photo of a crying girl.
Let’s not throw a tantrum if we disagree, okay?

I worked in the games industry for close to a decade, and in that time I worked with games companies both large and small. I’ve got a bit of a feel for how development works from the time I spent “on the inside,” and I know that developers are passionate people who care deeply about their art. But that doesn’t mean games get a free pass; a bad game is a bad game, no matter how well-intentioned it may have been!

As I always like to say: all of this is just the subjective opinion of one player, and I believe that there should be enough room in the community for differences of opinion and respectful disagreement. The topics we’re going to get into today are the subject of discussion and debate, and there isn’t a right answer – just opinions.

If you aren’t in the right headspace to see some potentially controversial games industry opinions, this is your final chance to nope out – because we’re about to jump into the list!

“Hot Take” #1:
“Game development is hard” isn’t an excuse for selling a sub-par title.

Stock photo of a woman working at a computer with two monitors.
A lot of people work really hard on some absolutely shite games…

Speaking as both a player and as someone who used to work in the industry, believe me when I say that I get it. Game development is undeniably difficult, and there are many, many reasons why a game may not be as good, enjoyable, or polished as we’d like it to be. There can be problems getting an engine to work, fixing one bug might cause ten more to pop up elsewhere, and the more complex and in-depth a title is, the greater the chance of these kinds of issues occurring. Publishers and corporations also meddle, moving the goalposts and pushing developers to hit unreasonable deadlines. So I get it. But that doesn’t make “development is hard” a good enough excuse.

Here’s a helpful analogy: suppose I buy a house, move in, and every time I turn on the washing machine, the electric goes off. Then when I ring the electrician, he basically says “wiring a house is really hard. You wouldn’t get it because you aren’t an electrician.” That’s not an excuse. If I go to a bakery and the bread is stale and mouldy, I likewise wouldn’t accept the excuses that “baking is really difficult,” or “running a business and keeping track of sell-by dates is hard.” The same basic principle applies to video games.

Stock photo of loaves of bread in a bakery.
You wouldn’t accept sub-par bread from a baker, so why should you accept a sub-par game from a developer?

I will acknowledge and agree that game development is hard, and that bigger games are harder to make; it’s an almost exponential scale of difficulty. But trying your best and failing is still failing, and in a competitive marketplace where most games aren’t free, if you release a sub-par, broken, uninspired, or inferior game, you’re gonna get called out for it. Media criticism exists for this purpose, and the fact that a critic has never worked in the games industry or has no experience with development doesn’t invalidate their criticism.

When a game is listed for sale, even if it’s discounted or at a low price, players still have expectations – and those expectations aren’t “wrong” just because they didn’t see how hard the game was to create. If you’re a brand-new developer releasing your first-ever game for free and asking for feedback, then maybe some of the harshest words should be held back. But this asinine argument is too often made by publishers and executives who work for massive companies. When a game underperforms, they trot out the trusty old “game development is hard” argument as a rebuttal to critics.

Screenshot of The Lord of the Rings: Gollum showing a serious bug.
The Lord of the Rings: Gollum was widely criticised upon its release for being riddled with bugs and glitches.

In no other business or industry would customers be told that “my job is hard, you should be grateful for what you got” as a response to genuine criticism. Selling a game that’s outdated, riddled with glitches, or just not fun can’t be excused in this way, and developers – no matter how hard they may have worked and no matter what programming hurdles they may have had to overcome – have to accept that. Criticism is inevitable in entertainment and media, and even if a developer had created an impossibly perfect game, there’d still be players who didn’t like it in whole or in part, or who just weren’t interested in its narrative or its gameplay. That’s unavoidable.

Some developers and studios actively make things worse for themselves by trying to respond to criticism in this way. It never works, it never succeeds at garnering sympathy, and practically zero players come away from this conversation having more positive thoughts about the game. It’s an argument that needs to go away, and developers and publishers should think long and hard before reacting to genuine criticism with this irritating whine.

“Hot Take” #2:
Subscriptions are happening and physical discs and cartridges are dying out.

A stock photo of Mega Drive games.
A selection of Sega Mega Drive game cartridges.

This is a subject I’ve tackled before in a longer column here on the website. In that piece I took a look at the media landscape in general, talking about how the move away from physical media started with music, then moved to film and TV, and is now belatedly arriving in gaming, too. You can find that piece by clicking or tapping here, if you’re interested! But for the games industry specifically, a move away from discs and cartridges has been happening for a long time – and the rise of subscriptions could well be the final nail in the coffin.

In the very early days, no one owned a video game outright. If you wanted to play a game, you had to go to where the games were: an arcade. It was only with the growth of home consoles in the ’80s that physically owning a video game became possible for a mainstream audience, and even then, renting games or even whole systems was still a big deal. Many of the SNES, Nintendo 64, and Dreamcast games that I played through the ’90s and into the new millennium were rented, not purchased outright. The idea of owning a massive media library is, when you think about it, a relatively new phenomenon that was kicked into a higher gear when DVD box sets became a thing in the mid-2000s.

Concept art for Wreck-It Ralph showing the arcade.
Arcades (like this one from Wreck-It Ralph) used to be the only place to play video games.

In that sense, we could argue that subscriptions aren’t “changing” the way people engage with media, they’re just a return to the 20th Century status quo. For much of the history of film, television, music, and gaming, audiences have had a temporary or impermanent relationship with media… and to me, that’s absolutely fine. It’s a trade-off I and many other players are happy to make.

I could probably count on my fingers the number of games I’d want a permanent hard copy of… because most games aren’t gonna be played on a loop forever or returned to every few months. Just like when I used to rent SNES and N64 games in the ’90s, I’m totally okay with not having a huge library of titles gathering dust on a shelf (or metaphorical dust in a digital library), because once I’ve beaten a title like Donkey Kong 64 or Bioshock, I’m in no rush to play them again.

Promo screenshot of Red Dead Redemption II.
Red Dead Redemption II is one of just a handful of games I might conceivably want a hard copy of.

Speaking as someone on a low income, subscription services like Netflix and Xbox Game Pass open up a huge library of titles to me – allowing me to play more games than I’d ever be able to afford if I had to buy or even rent them individually. I’ve played dozens of games over the past couple of years that I’d never have bought for myself, and some of them have become personal favourites. Subscriptions like Game Pass are a great way into gaming for players on a budget – because for a single monthly fee, a huge library of titles becomes available.

If the trade-off for that is that titles are occasionally removed from the platform and become unplayable… well, I’m okay with that. And for once-in-a-generation masterpieces like Red Dead Redemption II or Baldur’s Gate 3, I’m happy to splash out. When you consider that an annual subscription to Game Pass is more or less the same price as buying one or two games… you start to see why people are choosing to sign up. I wouldn’t be surprised at all if Xbox, PlayStation, or both choose to go all-digital later in the decade when their next-generation machines are ready.

“Hot Take” #3:
Microtransactions have no place in single-player games.

A screenshot of part of Starfield's in-game shop.
*cough* Starfield *cough*

I’m not wild about microtransactions in general – but in online multiplayer games and especially free-to-play titles, I accept that they’re an established funding model. They should still be regulated and prevented from being exploitative, but in those genres the microtransaction model seems to work well enough. But in a single-player game? Microtransactions need to GTFO.

Expansion packs go back decades, and large pieces of content that add new maps, quests, characters, and so on are usually fine. Look at something like Morrowind’s Bloodmoon expansion, or a more recent example like Phantom Liberty for Cyberpunk 2077 – these are the kinds of expansions that have always had a place. Some are better than others, sure, and some offer much more in terms of value. But as a general rule, I’m okay with expansion packs.

A still frame from the trailer for Cyberpunk 2077: Phantom Liberty showing Johnny Silverhand in a helicopter.
Phantom Liberty is a great example of an expansion pack that offers good value.

But in a single-player game, I shouldn’t be asked to purchase a “premium currency,” weapon skins, cosmetic items, and so forth. These microtransactions have no place in a single-player title, and there’s no excuse for adding them in other than pure, unadulterated greed. If a game like No Man’s Sky can remain profitable for Hello Games for close to a decade without charging for a single additional piece of content, there’s no excuse for the disgusting in-game marketplace in a title like Starfield.

I love a game with cosmetic customisation. Making my character feel personal to me goes a long way to enhancing the experience and making my playthrough feel like “mine,” so I enjoy having the option to change a hairstyle, outfit, or do things like re-paint a vehicle. But these things are an integral part of the game experience – not something to charge extra for. Exploiting players by locking basic items behind a paywall is despicable – and that’s before we say anything about “XP boosters,” damage multipliers, and other pay-to-win or pay-to-skip-the-grind items.

Steam page for No Man's Sky showing that the game has no DLC.
Oh look, it’s all of the DLC available for No Man’s Sky

I’ll also include in this category “super premium deluxe editions” of games that come with exclusive content. You might think that Han Solo’s vest in Star Wars Outlaws is okay to lock behind a paywall, but some games do this with whole quests. Hogwarts Legacy infamously locked an entire mission behind a paywall, and it’s far from the only game to have done so in recent years. Offering an in-game item as a pre-order bonus is one thing; locking a whole chest full of items and even pieces of gameplay behind an expensive “luxury edition” that can easily run to $100 or more is just scummy.

If I’m paying full price for a game, I don’t expect that game to reach into my wallet and try to grab even more cash every time I want to use a consumable item or change my character’s appearance. I tend to avoid online multiplayer games, where this phenomenon primarily exists, but inserting a microtransaction marketplace into a single-player game where it has absolutely no business being is enough to make me uninstall that title and never return to it. I’ll even refund it if I can. Some studios have even taken to concealing in-game marketplaces at launch, hoping to garner better reviews and more sales, before adding them in a few weeks or months later. Truly disgusting stuff.

“Hot Take” #4:
You aren’t paying for “early access,” you’re being charged an additional fee to play the game on its real release date.

Early access info for Indiana Jones and the Great Circle.
An example of what I’m talking about.

“Early access” is controversial in general, but let me just say before we start that I’m generally supportive of smaller studios and indie developers using early access as a way to get feedback and even to keep the lights on during what can be a difficult process. I very rarely touch an early access title, but independent devs should always feel free to use whatever tools are available to them, including launching an early access version of their game. But that’s where my patience with early access ends.

Recently we’ve seen two pretty shitty trends in the games industry: firstly, massive studios backed up by big publishers have been abusing early access, sometimes leaving a game officially unreleased for four, five, or six years, charging almost full price for it all the while. And secondly, the issue we’re looking at today: “early” access for an extra charge.

Promo graphic for Star Wars Outlaws showing the different versions of the game.
Ubisoft wanted to charge players an extortionate amount of money to play Star Wars Outlaws on its real release date.

This kind of “early” access usually grants players access to a game a few days or maybe a week ahead of its official release date, but by that point the game is finished and should be ready to go. The “early” version that players get is usually no different from the launch version, and there’s no time for a studio to act on player feedback or patch bugs. This is a scam, plain and simple, and an excuse for wringing even more money out of players.

If a game launches on the 1st of September for players who pay £100, and the 6th of September for players who “only” pay £65, then the release date is the 1st of September. They’ve just charged more to players who want to play on release day – or, if you flip things around, deliberately penalised players who didn’t splash the extra cash. These versions of games – which I think we should call “real release date” versions – are often £20, £30, or £40 more expensive than their delayed counterparts.

A stock photo of a hand holding burning dollar bills.
And who has that kind of money to waste these days?

Buying a game on day one is a risk nowadays. So many games – even those that go on to be hailed as masterpieces – arrive on launch day with bugs, glitches, and other problems. So paying extra to play what is almost always a demonstrably shittier version of a game just feels… stupid. I’ve been burned by this before, and just as with pre-orders, I’ve sworn to never again pay for so-called “early” access.

I’d like to see digital storefronts like Steam and the Epic Games Store – and ideally Xbox and PlayStation, too – clamp down on this practice. Early access should be reserved for studios that need it, and charging players extra to play a game on release day is something that should be banned outright.

“Hot Take” #5:
Players’ expectations aren’t “too high.”

A stock photo of an angry man holding a PlayStation control pad.
It isn’t the players that are wrong…

There have been some fantastic games released over the last few years. Red Dead Redemption II, Baldur’s Gate 3, and Kena: Bridge of Spirits all come to mind in the single-player space, but I’m sure you have your own favourite. These games are, in a word, masterpieces; titles that did everything right and are rightly considered to be at the very pinnacle of not only their genres but video games as an art form in general. So… if your game doesn’t get that kind of glowing reception, whose fault is it?

Some developers think it’s the fault of players, and that we’ve had our expectations set “too high.” They argue that it was unrealistic to expect their game to be as engaging or entertaining as others in the genre, and we should be grateful for what we got. They worked hard on it, after all.

A screenshot from Starfield showing a first-person perspective and three NPCs.
I wonder which game might’ve prompted this “hot take.”

The tl;dr is this: it isn’t the fault of players if they don’t like your game – it’s yours. Complaining about high expectations makes no sense when other titles have demonstrably been able to meet and even exceed those expectations, so if you learned nothing from your competition, once again that isn’t anyone else’s fault but yours! That’s to say nothing of the out-of-control and frequently dishonest marketing that promises players way more than the game can deliver. Studios and publishers are responsible for reining in hype and keeping their marketing honest. That, more than anything else, will help players set appropriate expectations.

I get it: it isn’t fun to be criticised or see your work picked apart. It’s even less fun to see a game you worked hard on for a long time compared negatively to another title in the same space. But to lash out at players – the people who are supposed to be your customers and the people it’s your job to entertain – just doesn’t make any sense to me. Not only is it wrong, but it also risks building up resentment and ill-will, so the next time you work on a game and get it ready for launch, players will be even more sceptical and perhaps even quicker to criticise.

A stock photo of a smartphone showing social media apps.
This is a problem exacerbated by social media.

Thankfully, not all developers say this – at least not in public! I heard complaints like this from time to time when I worked in the industry, but most developers I worked with were smart enough to keep such thoughts to themselves if they had them. So we’re fortunate that only a minority of developers take this argument into the public square.

Some developers need to get off social media. Social media is a great tool, don’t get me wrong, and being able to communicate directly with players can be useful in some situations. But if a developer is so thin-skinned that they feel the need to react in real-time and respond to every armchair critic and Twitter troll… that can’t be good for them, and it certainly isn’t good for the company they work for. For their own good, some developers need to shut down their social media profiles!

So that’s it… for now!

A promo graphic of an Xbox Series control pad.
I hope this wasn’t too controversial!

I’m always finding more “hot takes” and things to criticise in the games industry, so I daresay this won’t be the last time I put together a piece like this one! Despite what I’ve said today, I still really enjoy gaming as a hobby and I find there are far more positives than negatives. And if you hated all of my points, just remember that all of this is the entirely subjective opinion of a single old gamer.

So I hope this has been a bit of fun… and maybe a little thought-provoking in places, too. If you don’t agree with any of my points, that’s totally okay! I tried my best to present my arguments as articulately as possible, but these are “hot takes,” so I’m sure plenty of people can and will disagree with all of them. If I gave you a chuckle or you found this discussion interesting in some way, then I reckon I’ve done my job!

Until next time… and happy gaming!


All titles discussed above are the copyright of their respective publisher, studio, and/or developer. This article contains the thoughts and opinions of one person only and is not intended to cause any offence.

Ten Gaming “Hot Takes” (Part 2)

A few days ago, I shared the first of my gaming “hot takes,” and today we’re going to finish the job. I’ve got five more “hot takes” to round out this list, and I think we’ve got some spicy ones in the mix!

As I said last time, this isn’t clickbait! These are opinions that I genuinely hold, and I’m not inventing things for the sake of being controversial or to score “internet points.” I’m also keenly aware that I’m in the minority, and that plenty of folks can and will disagree. That’s okay – there should be enough room in the gaming community for differences of opinion and friendly discussion of these topics. This is all subjective, at the end of the day!

So if you missed the first part of the list, you can find it by clicking or tapping here. Otherwise, it’s time to get started!

“Hot Take” #6:
Story matters more than gameplay (in most cases).

Starfield (2023).

When discussing Starfield a few weeks ago, I said something rather telling. I didn’t really appreciate it in the moment, but looking back, I think it sums up my relationship with video games as a hobby quite well: “I’m someone who’ll happily play through some absolutely bog-standard gameplay if I’m enjoying a story or getting lost in a fictional world…” If you want to see the full quote in context, by the way, you can find my piece on Starfield by clicking or tapping here.

That line pretty much sums up how I relate to most games I play – and almost all single-player and action/adventure titles. There are some exceptions: Mario Kart 8 Deluxe springs to mind, as does Fall Guys, and some turn-based strategy games, too. But when I look at the games I’ve enjoyed the most since at least the second half of the ’90s, it’s story more than gameplay that appeals to me.

There are some exceptions!

It was a solid story and great world-building that convinced me to stick with Cyberpunk 2077, even when I felt its gameplay was nothing special. And on the flip side, it was a mediocre story set in a boring, empty world that led me to give up on Starfield after less than thirty hours. When I fire up a single-player game, I’m looking for a story that grabs me, and a world I can lose myself in.

It doesn’t feel controversial to say “I want a game to have a good story,” but that isn’t really the point I’m trying to make. For me, story almost always trumps gameplay. There can be exceptions – games with gameplay so innovative that the narrative matters less, or games so mechanically poor or bug-riddled that even the best story couldn’t salvage them – but for the most part, that’s what I’m looking for in a new release.

I stuck with Cyberpunk 2077 because of its story.

Shenmue, around the turn of the millennium, stands out to me as an early example of this. It was the first game I’d played where the story seemed like it would be right at home on the big screen, and I absolutely adored that. Many games have come along in the years since with compelling characters, wonderful worlds, or magnificent mysteries… and I think that’s part of why I still love playing video games after more than thirty years.

If games had stuck to being glorified toys; story-less arcade boxes where the only objective was either “kill everything on the screen” or “keep walking to the right,” then I think I’d probably have drifted away from the hobby. But I was fortunate enough to play some absolutely phenomenal titles as gaming made that transition and many incredible stories were written.

“Hot Take” #7:
More complexity and additional gameplay elements do not make a game “better.”

Darn young’uns.

Some modern games try to cram in too many features and gameplay mechanics that add nothing to the experience – and in some cases actively detract from it. I know this probably comes across as “old man yells at cloud;” an out-of-touch dinosaur whining about how modern games are too convoluted! And if this was something that only happened in a handful of titles, I guess I’d be okay with it. But it seems to happen all the time!

Strategy and “tycoon” games seem to fall victim to this very easily. I adored RollerCoaster Tycoon when it launched in 1999; it felt like a game that was simple to get started with but difficult to master. In contrast, when I tried 2016’s Planet Coaster… I was hit with such a huge wall of options and features that it was off-putting. I didn’t know where to start.

Games used to be simpler…

There’s a balance that games have to find between challenge and complexity, and some titles get it wrong. I don’t have the time (or the energy) to spend tens or hundreds of hours becoming a literal rollercoaster engineer; I want something I can pick up and play, where I’m able to throw down a few theme park attractions without too much complexity. If the game had those more complex engineering sim elements in addition – as optional extras for players who wanted them – that could be okay. But when booting up a new game for the first time, I don’t want to encounter a dense wall of features and content.

This doesn’t just apply to strategy games, either. An increasing number of shooters and action/adventure games are incorporating full-bodied role-playing systems, and again it just feels wholly unnecessary. Look at a game from the early 2000s like Halo: Combat Evolved. It was a shooter – your character had a handful of weapons to choose from, and you blasted away at aliens. There was no need for levelling up, for choosing traits or skills, or anything like that. But more and more modern games, even in the first-person shooter or stealth genres, are going for these kinds of role-playing mechanics.

Skill points and levelling up in Assassin’s Creed: Mirage.

Don’t get me wrong: I love a good role-playing game. But when I boot up something like Assassin’s Creed or Destiny, the last thing I want or expect is to spend ages in menus micromanaging a character who, to be blunt, doesn’t need that level of engagement. Partly this is about balance, and in some cases it can be fun to level up and gain access to new equipment, for instance. But in others it really is a question of simplicity over complexity, and what kind of game I’m playing. Not every game can or should be a role-playing experience with a complex set of stats and skills.

Some titles really emphasise these elements, too, seeking to win praise for including a convoluted levelling-up system and skill tree. And a lot of the time, I find myself rolling my eyes at that. Leave the role-playing to RPGs, leave the overly-complicated systems to simulators, and let me pick up and play a fun game!

“Hot Take” #8:
I hate VR.

Promo image of the HTC Vive Pro 2 headset.

Is “hate” too strong a word to use in this context? I’m going to go with “no,” because I genuinely hate VR. When the first VR headsets started being released, I was worried that the games industry was going to go all-in on VR – because I felt that, if that happened, I wouldn’t be able to keep up. But thankfully VR remains a relatively niche part of gaming, and even if that were to change, it doesn’t seem like it’s going to replace regular old video games any time soon!

In the ’80s and ’90s, it seemed as if VR was something tech companies were working towards. It was a futuristic goal that was just out of reach… so when VR headsets first started cropping up, I really thought that they were going to be “the next big thing.”

TV shows like VR Troopers hinted at VR being the direction of travel for video games as far back as the ’90s.

But I’ve never found a VR system that I could actually use. I could barely manage playing tennis on the Wii – and even then I had to remain seated! I’m disabled, in case you didn’t know, and the move toward VR headsets and motion-tracking devices felt a bit threatening to me; these technologies seemed like they had the potential to lock me out of gaming.

There haven’t been many VR titles that interested me, though. One of the few that did – Star Trek: Bridge Crew – was pretty quickly ported to PC without the VR requirement. While the influence of VR is still clearly present in that title, I think it demonstrates that at least some VR games can work without the expensive equipment.

Star Trek: Bridge Crew was quickly ported to non-VR systems.

There’s plenty of room for innovation in gaming, and for companies to try out different kinds of screens, controllers, and methods of interactivity. But for me personally, VR felt like a step too far. I’m biased, of course, because between vision problems and mobility restrictions I don’t feel capable of using any of the current VR systems – not to anything like their full capabilities, at any rate. But even with that caveat, I just don’t think VR has turned out to be anything more than a gimmick.

It’s possible, I suppose, that a VR system will come along one day that I’ll feel compelled to invest in. But it would have to be something I could use with ease, and none of the VR devices currently on the market fit the bill. So I won’t be jumping on the VR bandwagon any time soon!

“Hot Take” #9:
We need fewer sequels and more original games.

I’ve lost count of the number of entries in the Call of Duty franchise at this point…

Across the world of entertainment in general, we’re firmly in an era of franchises, sequels, spin-offs, and connected “universes.” This trend has been going on for well over a decade at this point… but it’s been to the detriment of a lot of stories. There’s always going to be room for sequels to successful titles… but too many video game publishers have gone all-in on franchises and a handful of ongoing series at the expense of creating anything original.

And unfortunately, some original titles that have come along in recent years haven’t found success. I mentioned Starfield above, which seems to be seeing a precipitous drop in its player count, but we could also point to games like Anthem, Forspoken, or Babylon’s Fall – all of which were new settings featuring new characters that struggled to get off the ground.

Forspoken didn’t exactly light up the board, unfortunately.

I consider this one to be a “hot take” simply because of how many players seem content to go back to the same handful of franchises or series over and over again. Some folks have even gotten genuinely angry with developers for sidelining their favourite series in order to work on something new, as if a studio should only ever be allowed to work on a single series in perpetuity. Sequels, prequels, and spin-offs are all more popular and attract more attention than brand-new experiences, and I think that’s short-sighted on the part of publishers and narrow-minded on the part of at least some players.

And I have to hold up my hands here: I can be guilty of this, too. I’ve written articles here on the website looking ahead to the next Mass Effect game, for instance, while it seems clear that at least some of the folks at BioWare wanted to branch out and create something different. And I have to admit that a sequel to a game I enjoyed or a new entry in a franchise I’m invested in is exciting – more so, arguably, than the announcement of a brand-new project.

Lots of people are eagerly anticipating the next Mass Effect game.

It’s more difficult and more expensive to get people to pay attention to a brand-new game. New titles are also comparatively risky propositions from a corporate point of view; a ton of people will turn up for a game with a well-known name attached, even if it’s not all that good. But a brand-new world has to be something truly special to attract players in the first place – let alone retain a huge playerbase and make a profit.

But it’s a shame that that’s the situation we’re in, because when developers are restricted to sequels and the same handful of franchises, creativity is stifled. Where’s the next breakthrough going to come from if the only games a studio is able to make are sequels and spin-offs to earlier titles? And when audiences get tired of the decreasing number of surviving franchises… what will happen?

“Hot Take” #10:
Graphics actually do matter.

Kena: Bridge of Spirits (2021).

This is perhaps the most contentious point on this list! I’ve lost track of the number of times I’ve heard some variant of the expression “graphics don’t matter” when discussing video games. But you know what? If you showed me two similar games in the same genre, with the key difference between them being that one was a ray-traced Unreal Engine 5 beauty and the other looked like a Nintendo 64 game that had been sneezed on… I know which one I’d choose to play.

When I was really getting into gaming as a hobby in the 1990s, it seemed like the push for better and better graphical fidelity was never-ending. Games used their visuals as a selling-point, and that trend continued into the 2000s with consoles like the Xbox and PlayStation 2. It would’ve seemed wild in those days for a game to not only take a backwards step in graphical terms, but to celebrate doing so.

Grand Theft Auto: Vice City looked great in 2002.

We need to separate “graphics” from “art style,” because they’re really two different things. Some games can do wonderful things with cel-shading, for example, or a deliberately cartoony aesthetic. When I say that “graphics actually do matter,” I don’t mean that photorealism is the be-all and end-all; the only art style that games should pursue. What I mean is that games that prioritise looking great – within their chosen style – are going to grab my attention.

I think an interesting example here is South Park: The Stick of Truth. No one would argue that that game is “realistic” in its art style – but that’s the point. Developers Obsidian Entertainment worked overtime to recreate the look and feel of the South Park cartoon – and what resulted was a genuinely fun and interesting visual presentation. Playing that game really felt like taking part in an extended episode of the show. Compare the way The Stick of Truth and its sequel look to the upcoming South Park: Snow Day. I know which one I’d rather play!

South Park: The Stick of Truth stands out because of its visual style.

When a developer wants to go down the photorealism route, though, it’s great to see just how far they can push modern hardware. There were moments in games like Red Dead Redemption II where the environment felt genuinely real – and that feeling is one that games have been chasing since the inception of the medium. I really can’t wait to see how graphics continue to improve, and how realistic some games might be able to look fifteen or twenty years from now… if I live that long!

At any rate, visually beautiful games are always going to catch my eye, and games that don’t prioritise graphical fidelity will always have an extra hurdle to overcome. Gameplay and story are important, of course, but graphics aren’t irrelevant. The way a game looks really does matter.

So that’s it!

A Sega Dreamcast console. I had one circa 2000.

We’ve come to the end of the list – for now! I’m sure I’ll have more “hot takes” and controversial opinions about video games that I’ll be able to share before too long.

I hope that this has been interesting – and not something to get too worked up over! As I said at the beginning, I know that I’m in the minority and that a lot of folks can and will disagree. Although some people take gaming a bit too seriously sometimes, I like to think that there’s room in the community for polite discussions and disagreements.

Have fun out there – and happy gaming!

All titles discussed above are the copyright of their respective studio, developer, and/or publisher. Some images used above courtesy of IGDB and Unsplash. This article contains the thoughts and opinions of one person only and is not intended to cause any offence.

Ten Gaming “Hot Takes” (Part 1)

Today I thought we could have a bit of fun and talk about some of my more controversial gaming opinions! This is the first part of a two-part list, so be sure to stay tuned in the days ahead for five more gaming “hot takes.” There were too many to fit into a single piece this time around!

Although this is intended to be lighthearted and somewhat tongue-in-cheek, these are opinions that I genuinely hold; I’m not making things up for the sake of clickbait. I’ll always give the caveat that I’m a fan of video games and an advocate for gaming as a hobby… but that doesn’t mean that there aren’t things to criticise from time to time!

A Sega Mega Drive console.
Let’s share some controversial gaming opinions!

Gaming has changed a lot since I first picked up a joystick at a kids’ club in the ’80s, and I’ve seen the games industry and games themselves evolve dramatically! Most of those changes have been for the better… but perhaps not every last one.

As I always say when we talk about potentially controversial topics: these are my wholly subjective opinions! I’m not trying to claim that I’m right and that’s the end of the affair – on the contrary: I’m acutely aware that I’m in the minority here! I share these “hot takes” in the spirit of thought-provoking fun, and you are free to disagree wholeheartedly.

With all of that out of the way, let’s take a look at some “hot takes!”

“Hot Take” #1:
An open world isn’t the right choice for a lot of games.

A screenshot of Jedi: Survivor showing protagonist Cal Kestis outside of a saloon.
Jedi: Survivor is a recent game that employed an open world style.

Open worlds became a gaming trend sometime in the early 2010s, and too many publishers nowadays insist on forcing the formula onto titles that are entirely unsuited to it. Some open worlds are great… but I’d argue that relatively few manage to hit the golden combo of being both a well-constructed open world and one that suits the game in question. There have been some fantastic open worlds saddled with stories that didn’t fit them, and some potentially wonderful games undone by the fetishisation of the open world formula in certain corporate boardrooms.

In many, many cases, having distinct levels or separate sections of a larger map just… works. It allows the game’s narrative to create an often-necessary sense of physical distance between locations – something that even the best open world maps are usually unable to manage. And for an awful lot of stories – even in games that we might consider to be masterpieces – that can be important to the immersion.

Ryo Hazuki, protagonist of Shenmue, encounters a man dressed as Santa Claus.
An early open world pioneer was Shenmue on the Dreamcast.

Take Red Dead Redemption II as an example. That game is one of the very best that I’ve ever played… but there were several points in its single-player story where the open world formula came close to being a problem. After escaping the town of Blackwater by the skin of their teeth in the game’s prologue, Arthur Morgan and the gang roam around in the mountains for a while, before eventually finding a new place to make camp… literally five minutes away from Blackwater. And this would happen again later in the game, when the gang would escape the town of Valentine only to settle at a new campsite just up the road.

The game’s narrative presented these locations as if they were far apart, but the open world of Red Dead Redemption II, for all of the content that it was filled with, didn’t always gel with that. It’s a scaled-down representation of part of the United States, and I get that. But narratively, it might’ve worked even better if the game’s main acts took place in separate, smaller open maps instead of merging them all into one larger open world.

Arthur Morgan, the protagonist of Red Dead Redemption II.
Red Dead Redemption II is a masterpiece.

Red Dead Redemption II is, without a doubt, one of the best games that I’ve ever played. So if the open world could be a problem there… well, you don’t need to think too hard to find examples of the open world formula tripping up worse and far less enjoyable titles! There’s absolutely nothing wrong with creating separate levels for a game – as has been done since practically the beginning of narrative video games. Doing so often allows for more diversity in locations, environments, and terrain – and it’s something more titles should take advantage of.

I could probably count on my fingers the number of games that have genuinely made good use of the open world formula. And when I think about modern games that I’ve really enjoyed – such as The Last of Us, Jedi: Fallen Order, or the Mass Effect trilogy – they don’t use open worlds, and they’re much better for it.

“Hot Take” #2:
Every game should have a robust easy mode – it’s an accessibility feature.

The Skyrim options menu with difficulty settings highlighted.
Difficulty options in Skyrim.

I’m a big believer in making games accessible to as many players as possible. That can mean including accessibility features like colourblindness settings, the option to disable quick-time events, or making sure that subtitles are available. But it also means that players need to be able to tone down the difficulty – yes, even in your precious Dark Souls!

I suffer from arthritis, including in my hands and fingers. I don’t have the ability to pull off complicated multi-button combos any more – if I ever possessed such an ability! And as with any skill or set of skills, gaming abilities vary from person to person; even someone who isn’t suffering from a health condition may simply not be blessed with the reflexes or hand-eye coordination necessary to progress through some of the industry’s more punishing titles. Not to mention that many folks don’t have the free time to dedicate to learning precise button combos or the intricate details of specific boss battles.

A promotional screenshot of Kingdom Come: Deliverance.
Kingdom Come: Deliverance was a title I found too difficult to play, despite wanting to enjoy it.

And that’s a real shame – because there are some outstanding games that everyone should be able to experience. The stories in some games are truly awe-inspiring, and can in some cases be better than films or television shows. For those stories to be denied to people with disabilities, or to people who don’t have the time to repeat the same boss fight or level over and over again, is just… sad.

I absolutely detest the expression “not every game is made for every player” when this debate rolls around. It’s absolutely true that people like different things, so if I’m not into online multiplayer shooters then I’m probably not going to enjoy the next Call of Duty title. But that doesn’t apply to difficulty, or to making a game that millions of potential players are locked out of because of a skill mismatch or health condition. That kind of gatekeeping is honestly just pathetic.

A toddler or young child playing a racing game.
Gaming should be accessible to as many people as possible.

I’d also add that the reverse is true: certain games can be too easy for some players, and offering the option to increase the difficulty is likewise something that developers should seek to include.

Difficulty settings have been a part of games going back decades, and they aren’t all that difficult to implement. At the very least, giving players the option to skip a level or boss battle after failing it multiple times should be achievable for every developer – and I can’t think of a good reason why a studio that cares about its audience wouldn’t want to implement something so incredibly basic. It doesn’t “hurt” the game to include an easy mode, nor does it damage the developers’ “artistic vision.” An easy mode only impacts players who choose to turn it on – and in a single-player game, why should anyone be judgemental about that?
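To illustrate just how basic the underlying logic can be, here’s a minimal sketch in Python – purely hypothetical, with made-up names and numbers, and nothing like a real engine’s systems – of a difficulty-based damage multiplier and a “skip this encounter” offer after repeated failures:

```python
# A purely illustrative sketch of two basic accessibility mechanisms:
# a difficulty-based damage multiplier and a "skip" offer after repeated
# failures. All names and numbers here are made up for demonstration.

DAMAGE_MULTIPLIER = {
    "story": 0.25,   # player takes a quarter of the usual damage
    "easy": 0.5,
    "normal": 1.0,
    "hard": 1.5,     # player takes 50% more damage
}

SKIP_OFFER_AFTER = 3  # offer to skip an encounter after this many failures


def damage_to_player(base_damage: float, difficulty: str) -> float:
    """Scale incoming damage according to the chosen difficulty setting."""
    return base_damage * DAMAGE_MULTIPLIER[difficulty]


def should_offer_skip(failed_attempts: int) -> bool:
    """Decide whether to offer the player a way past a boss or level."""
    return failed_attempts >= SKIP_OFFER_AFTER


if __name__ == "__main__":
    print(damage_to_player(40.0, "easy"))  # prints 20.0
    print(should_offer_skip(3))            # prints True
```

A real game’s systems are vastly more involved, of course – but the core mechanism of an easy mode really can be this mundane, which is why the “artistic vision” argument has never convinced me.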

“Hot Take” #3:
Artificial intelligence isn’t “coming soon,” it’s already here – and the games industry will have to adapt.

Still frame from the film Terminator (1984).
Are you ready for the “rise of the machines?”

One of the hottest topics of 2023 has been the arrival of easily-accessible generative AI software. It seems that anyone can now create an article like this one, a photorealistic image of just about anything, an audio recording of a celebrity… or even code for a video game. This technology has well and truly landed, and I don’t see any practical way to prohibit or ban it – so the games industry is going to have to adapt to that reality.

I can see a lot of potential positives to AI. Modding, for instance, can now get a lot more creative, and we’ve already seen mods featuring AI voices that are basically seamless and can add a lot to a character or story. For smaller developers and indie studios, too, AI has the potential to be a massively useful tool – doing things that a single developer or small team wouldn’t otherwise be able to achieve.

"Matrix code" from the 2021 film The Matrix: Resurrections.
AI is already here – and could prove incredibly useful to game developers.

But there are unquestionably massive downsides. The games industry has seen significant layoffs this year – despite most of the big corporations making record profits. Corporations in all kinds of industries are looking to replace as many real humans as possible with AI software… and for an all-digital product like a video game, the potential for divisions or even entire studios being shut down is firmly on the table.

The arrival of generative AI is going to shake things up, and I can absolutely see there being less creativity in the games industry if too many big corporations go down that road. Because of the way these AI programmes work, they aren’t capable of truly creating – only reworking things that already exist and generating more content within the same parameters. If major video games start leaning on AI in a big way, you can say goodbye to a lot of innovation and creativity.

An example of AI-generated art.
An example of AI-generated art that was created (in less than ten seconds) from a prompt I entered.
Image Credit: Hotpot Art Generator

Whichever company cracks AI first is, in all likelihood, going to be rewarded – so there may even be a kind of “AI arms race” within the games industry, as some of the biggest corporations duke it out to be the first one to strike the right balance between AI and human-created content. What that might mean for games in the short-to-medium term… I can’t really say.

Generative AI is here to stay, though, and I don’t see a way around that. Some folks have suggested boycotting AI-heavy titles, but these consumer boycotts seldom succeed. If a new game that relied on AI during its creation ends up being fun to play, I daresay it’ll get played. Most players don’t follow the ins and outs of the industry, and may never even know the extent to which their favourite game was created using AI. I hope you’re ready for AI… because I’m not sure that I am!

“Hot Take” #4:
Sonic the Hedgehog doesn’t work in 3D.

Promotional screenshot from 2014's Sonic Boom: Rise of Lyric.
3D Sonic.

We’re going franchise-specific for this one! I adored the first Sonic the Hedgehog games on the Sega Mega Drive. I didn’t have a Mega Drive at the time, but a friend of mine did and we played a lot of Sonic in the early ’90s! Along with Super Mario, Sonic was one of the characters who scaled the mountain and was at the absolute peak of gaming… for a time.

But Sonic’s sole gimmick meant that the character struggled to make the transition from 2D side-scrolling games to fully 3D titles. Extreme speed works well in a 2D title, but it’s hard to code – and even harder to play – in a 3D environment.

Cropped box art for the re-release of Sonic the Hedgehog.
Sonic’s “gotta go fast” gimmick works in 2D games… but not in 3D.

The most successful Sonic game this side of the millennium has been Sonic Mania… a 2017 title developed by long-time fans of the series whom Sega brought on board to make an official game. Sonic Mania is an old-school 2D platformer in the style of the original Mega Drive games. It’s great fun, and a real return to form for Sega’s mascot after years of mediocrity.

Sonic’s fundamental problem begins with his sole superpower: speed. Extreme speed felt wonderful in 2D – and incredibly innovative, too! But in 3D, it’s just so much more difficult to build worlds suited to moving so quickly, and trickier still for players to control a character moving at such speed.

Promotional screenshot for 2017's Sonic Mania.
Sonic Mania has been the most successful Sonic game in decades.

There have been 3D Sonic games that tried to innovate, but even the best of them feel like they’re missing something. I remember playing Sonic Adventure on the Dreamcast and barely having to push any buttons; in order to make Sonic work in 3D, much of the interactivity had to be stripped out. That made for a far less enjoyable gaming experience.

When Sonic shows up in other titles – such as alongside Mario for an arcadey sports game, or in Sega’s Mario Kart competitor – then the character can be made to work. But those games almost always rob Sonic of his one defining trait: his speed. I’ve never played a 3D Sonic game that felt anywhere near as good as those original 2D titles.

“Hot Take” #5:
Google Stadia was a good idea (in more ways than one).

Promo image featuring the Stadia control pad.
Promo image of the Stadia control pad (right) next to a laptop.

The history of video gaming is littered with failed consoles and devices; machines that didn’t quite make it for one reason or another. 2019’s Stadia – Google’s attempt to break into the games industry – has become the latest example, being fully shut down after little more than three years. There were myriad problems with Stadia, and Google has a track record of neither backing its projects and investments nor giving them enough time to deliver – so in that sense, its failure is understandable. But I know I’m out on a limb when I say that it’s disappointing – and potentially even bad for the games industry as a whole.

Firstly, Stadia offered a relatively inexpensive way to get started with gaming by relying on streaming. Gone was the need for an expensive console or PC; players could jump in using only an existing screen and a Stadia controller. Lowering the cost of entry to gaming is a good thing, and we should be looking for more ways to do that!

Promo screenshot of Stadia-exclusive title Gylt.
Gylt was one of the only Stadia-exclusive games.

Secondly, Stadia represented the first potential shake-up of a pretty stagnant industry in nigh-on twenty years. Since Microsoft entered the video game market and Sega dropped out, there have been three major hardware manufacturers and three main gaming platforms. Disrupting that status quo is, again, not a bad thing in theory. Stadia, with Google’s support and financial resources, seemed well-positioned to be the kind of disruptive force that often leads to positive change.

Stadia won’t be remembered – except as the answer to an obscure pub quiz question in a few years’ time, perhaps. But it had potential when it was announced, both in terms of the way it could have brought console-quality games to people who couldn’t necessarily pay for a current-generation machine up-front, and in the way Google could’ve disrupted the industry, leading to competition and innovation.

A Google Chromecast device.
Stadia was designed to be compatible with Google’s Chromecast devices – as well as other platforms.

I didn’t buy into Stadia on day one. As someone who has a gaming PC, I didn’t really feel it was necessary. And there were limitations to Stadia: a lack of exclusive games, the need to buy games individually at full price rather than access them through a Netflix-style subscription library, and Google’s well-known history of prematurely shutting down underperforming products and services. All of these things put me off – and undoubtedly put off a lot of other folks, too.

But in a way, I regret the demise of Stadia. Its short, unsuccessful life will surely serve as a warning to any other company that might’ve considered launching a new console or comparable streaming device – and if there’s one thing I think we can all agree on, it’s this: the games industry needs a shake-up from time to time! Stadia couldn’t deliver that, unfortunately… but I hope that another device will.

So that’s it… for now!

Screenshot of Starfield.
Starfield (2023).

Stay tuned, because I have five more “hot takes” that I’m currently in the process of writing up.

As I said at the beginning, none of these things should be taken too seriously – this is just intended to be a bit of thought-provoking fun, at the end of the day.

There’s a lot to love about gaming as a hobby, and the quality of video games in general is way higher today than I could’ve imagined even just a few years ago. There are some incredible games out there; masterpieces in every sense of the word that have given me some of the best entertainment experiences I’ve ever had. And there are some games that I didn’t enjoy, too! I hope this look at a few of my “hot takes” hasn’t gotten anyone too upset!

All titles discussed above are the copyright of their respective studio, developer, and/or publisher. Some images used above courtesy of IGDB and Unsplash. This article contains the thoughts and opinions of one person only and is not intended to cause any offence.