Ten Gaming “Hot Takes” (Part 2)

A few days ago, I shared the first of my gaming “hot takes,” and today we’re going to finish the job. I’ve got five more “hot takes” to round out this list, and I think we’ve got some spicy ones in the mix!

As I said last time, this isn’t clickbait! These are opinions that I genuinely hold, and I’m not inventing things for the sake of being controversial or to score “internet points.” I’m also keenly aware that I’m in the minority, and that plenty of folks can and will disagree. That’s okay – there should be enough room in the gaming community for differences of opinion and friendly discussion of these topics. This is all subjective, at the end of the day!

So if you missed the first part of the list, you can find it by clicking or tapping here. Otherwise, it’s time to get started!

“Hot Take” #6:
Story matters more than gameplay (in most cases).

Starfield (2023).

When discussing Starfield a few weeks ago, I said something rather telling. I didn’t really appreciate it in the moment, but looking back, I think it sums up my relationship with video games as a hobby quite well: “I’m someone who’ll happily play through some absolutely bog-standard gameplay if I’m enjoying a story or getting lost in a fictional world…” If you want to see the full quote in context, by the way, you can find my piece on Starfield by clicking or tapping here.

That line captures how I relate to most of the games I play – and to almost all single-player and action/adventure titles. There are some exceptions: Mario Kart 8 Deluxe springs to mind, as does Fall Guys, and some turn-based strategy games, too. But when I look at the games I’ve enjoyed the most since at least the second half of the ’90s, it’s story more than gameplay that appeals to me.

There are some exceptions!

It was a solid story and great world-building that convinced me to stick with Cyberpunk 2077, even when I felt its gameplay was nothing special. And on the flip side, it was a mediocre story set in a boring, empty world that led to me giving up on Starfield after less than thirty hours. When I fire up a single-player game, I’m looking for a story that grabs me, and a world I can lose myself in.

It doesn’t feel controversial to say “I want a game to have a good story,” but that isn’t really the point I’m trying to make. For me, story almost always trumps gameplay. There can be exceptions – games with gameplay so innovative that the narrative matters less, or games so mechanically poor or bug-riddled that even the best story couldn’t salvage them – but for the most part, a strong story is what I’m looking for in a new release.

I stuck with Cyberpunk 2077 because of its story.

Shenmue, around the turn of the millennium, stands out to me as an early example of this. It was the first game I’d played where the story seemed like it would be right at home on the big screen, and I absolutely adored that. Many games have come along in the years since with compelling characters, wonderful worlds, or magnificent mysteries… and I think that’s part of why I still love playing video games after more than thirty years.

If games had stuck to being glorified toys – story-less arcade boxes where the only objective was either “kill everything on the screen” or “keep walking to the right” – then I think I’d probably have drifted away from the hobby. But I was fortunate enough to play some absolutely phenomenal titles as gaming made that transition, and many incredible stories were written.

“Hot Take” #7:
More complexity and additional gameplay elements do not make a game “better.”

Darn young’ins.

Some modern games try to cram in too many features and gameplay mechanics that add nothing to the experience – and in some cases actively detract from it. I know this probably comes across as “old man yells at cloud” – an out-of-touch dinosaur whining about how modern games are too convoluted! And if this were something that only happened in a handful of titles, I guess I’d be okay with it. But it seems to happen all the time!

Strategy and “tycoon” games seem to fall victim to this very easily. I adored RollerCoaster Tycoon when it launched in 1999; it felt like a game that was simple to get started with but difficult to master. In contrast, when I tried 2016’s Planet Coaster… I was hit with such a huge wall of options and features that it was off-putting. I didn’t know where to start.

Games used to be simpler…

There’s a balance that games have to find between challenge and complexity, and some titles get it wrong. I don’t have the time (or the energy) to spend tens or hundreds of hours becoming a literal rollercoaster engineer; I want something I can pick up and play, where I’m able to throw down a few theme park attractions without too much complexity. If the game had those more complex engineering sim elements in addition – as optional extras for players who wanted them – that could be okay. But when booting up a new game for the first time, I don’t want to encounter a dense wall of features and content.

This doesn’t just apply to strategy games, either. An increasing number of shooters and action/adventure games are incorporating fully-fledged role-playing systems, and it just feels wholly unnecessary. Look at a game from the early 2000s like Halo: Combat Evolved. It was a shooter – your character had a handful of weapons to choose from, and you blasted away at aliens. There was no need for levelling up, choosing traits or skills, or anything like that. But more and more modern games, even in the first-person shooter and stealth genres, are adopting these kinds of role-playing mechanics.

Skill points and levelling up in Assassin’s Creed: Mirage.

Don’t get me wrong: I love a good role-playing game. But when I boot up something like Assassin’s Creed or Destiny, the last thing I want or expect is to spend ages in menus micromanaging a character who, to be blunt, doesn’t need that level of engagement. Part of this is about balance; in some cases it can be fun to level up and gain access to new equipment. But in others it really is a question of simplicity over complexity, and of what kind of game I’m playing. Not every game can or should be a role-playing experience with a complex set of stats and skills.

Some titles really emphasise these elements, too, seeking to win praise for including a convoluted levelling-up system and skill tree. A lot of the time, I find myself rolling my eyes at that. Leave the role-playing to RPGs, leave the overly-complicated systems to simulators, and let me pick up and play a fun game!

“Hot Take” #8:
I hate VR.

Promo image of the HTC Vive Pro 2 headset.

Is “hate” too strong a word to use in this context? I’m going to go with “no,” because I genuinely hate VR. When the first VR headsets started being released, I worried that the video games industry was going to go all-in on VR – because if that happened, I wouldn’t be able to keep up. But thankfully VR remains a relatively niche part of gaming, and even if that were to change, it doesn’t seem like it’s going to replace regular old video games any time soon!

In the ’80s and ’90s, it seemed as if VR was something tech companies were working towards. It was a futuristic goal that was just out of reach… so when VR headsets first started cropping up, I really thought that they were going to be “the next big thing.”

TV shows like VR Troopers hinted at VR being the direction of travel for video games as far back as the ’90s.

But I’ve never found a VR system that I could actually use. I could barely manage playing tennis on the Wii – and even then I had to remain seated! I’m disabled, in case you didn’t know, and the move toward VR headsets and motion-tracking devices felt a bit threatening to me; these technologies seemed like they had the potential to lock me out of gaming.

There haven’t been many VR titles that have interested me, though. One of the only VR titles that did – Star Trek: Bridge Crew – was pretty quickly ported to PC without the VR requirement. While the influence of VR is still clearly present in that title, I think it demonstrates that at least some VR games can work without the expensive equipment.

Star Trek: Bridge Crew was quickly ported to non-VR systems.

There’s plenty of room for innovation in gaming, and for companies to try out different kinds of screens, controllers, and methods of interactivity. But for me personally, VR felt like a step too far. I’m biased, of course, because between vision problems and mobility restrictions I don’t feel capable of using any of the current VR systems – not to anything like their full capabilities, at any rate. But even with that caveat, I just don’t think VR has turned out to be anything more than a gimmick.

It’s possible, I suppose, that a VR system will come along one day that I’ll feel compelled to invest in. But it would have to be something I could use with ease, and none of the VR devices currently on the market fit the bill. So I won’t be jumping on the VR bandwagon any time soon!

“Hot Take” #9:
We need fewer sequels and more original games.

I’ve lost count of the number of entries in the Call of Duty franchise at this point…

Across the world of entertainment in general, we’re firmly in an era of franchises, sequels, spin-offs, and connected “universes.” This trend has been going on for well over a decade at this point… but it’s been to the detriment of a lot of stories. There’s always going to be room for sequels to successful titles… but too many video game publishers have gone all-in on franchises and a handful of ongoing series at the expense of creating anything original.

And unfortunately, some original titles that have come along in recent years haven’t found success. I mentioned Starfield above, which seems to be seeing a precipitous drop in its player count, but we could also point to games like Anthem, Forspoken, or Babylon’s Fall – all of which were new settings featuring new characters that struggled to get off the ground.

Forspoken didn’t exactly light up the sales charts, unfortunately.

I consider this one a “hot take” simply because of how many players seem content to go back to the same handful of franchises or series over and over again. Some folks have even gotten genuinely angry with developers for sidelining their favourite series in order to work on something new, as if a studio should only ever be allowed to work on a single series in perpetuity. Sequels, prequels, and spin-offs are more popular and attract more attention than brand-new experiences, and I think that’s short-sighted on the part of publishers and narrow-minded on the part of at least some players.

And I have to hold up my hands here: I can be guilty of this, too. I’ve written articles here on the website looking ahead to the next Mass Effect game, for instance, while it seems clear that at least some of the folks at BioWare wanted to branch out and create something different. And I have to admit that a sequel to a game I enjoyed or a new entry in a franchise I’m invested in is exciting – more so, arguably, than the announcement of a brand-new project.

Lots of people are eagerly anticipating the next Mass Effect game.

Getting people to pay attention to a brand-new game is more difficult and more expensive. New games are also comparatively risky propositions from a corporate point of view; a ton of people will turn up for a game with a well-known name attached, even if it’s not all that good. But a brand-new world has to be something truly special just to attract players in the first place – let alone retain a huge playerbase and turn a profit.

But it’s a shame that that’s the situation we’re in, because when developers are restricted to sequels and the same handful of franchises, creativity is stifled. Where’s the next breakthrough going to come from if the only games a studio is able to make are sequels and spin-offs to earlier titles? And when audiences get tired of the decreasing number of surviving franchises… what will happen?

“Hot Take” #10:
Graphics actually do matter.

Kena: Bridge of Spirits (2021).

This is perhaps the most contentious point on this list! I’ve lost track of the number of times I’ve heard some variant of the expression “graphics don’t matter” when discussing video games. But you know what? If you showed me two similar games in the same genre, where the key difference between them was that one was a ray-traced Unreal Engine 5 beauty and the other looked like a Nintendo 64 game that had been sneezed on… I know which one I’d choose to play.

When I was really getting into gaming as a hobby in the 1990s, it seemed like the push for better and better graphical fidelity was never-ending. Games used their visuals as a selling-point, and that trend continued into the 2000s with consoles like the Xbox and PlayStation 2. It would’ve seemed wild in those days for a game to not only take a backwards step in graphical terms, but to celebrate doing so.

Grand Theft Auto: Vice City looked great in 2002.

We need to separate “graphics” from “art style,” because they’re really two different things. Some games can do wonderful things with cel-shading, for example, or a deliberately cartoony aesthetic. When I say that “graphics actually do matter,” I don’t mean that photorealism is the be-all and end-all – the only art style that games should pursue. What I mean is that games that prioritise looking great – within their chosen style – are going to grab my attention.

I think an interesting example here is South Park: The Stick of Truth. No one would argue that that game is “realistic” in its art style – but that’s the point. Developers Obsidian Entertainment worked overtime to recreate the look and feel of the South Park cartoon – and what resulted was a genuinely fun and interesting visual presentation. Playing that game really felt like taking part in an extended episode of the show. Compare the way The Stick of Truth and its sequel look to the upcoming South Park: Snow Day. I know which one I’d rather play!

South Park: The Stick of Truth stands out because of its visual style.

When a developer wants to go down the photorealism route, though, it’s great to see just how far they can push modern hardware. There were moments in games like Red Dead Redemption II where the environment felt genuinely real – and that feeling is one that games have been chasing since the inception of the medium. I really can’t wait to see how graphics continue to improve, and how realistic some games might look fifteen or twenty years from now… if I live that long!

At any rate, visually beautiful games are always going to catch my eye, and games that don’t prioritise graphical fidelity will always have an extra hurdle to overcome. Gameplay and story are important, of course, but graphics aren’t irrelevant. The way a game looks really does matter.

So that’s it!

A Sega Dreamcast console. I had one circa 2000.

We’ve come to the end of the list – for now! I’m sure I’ll have more “hot takes” and controversial opinions about video games that I’ll be able to share before too long.

I hope that this has been interesting – and not something to get too worked up over! As I said at the beginning, I know that I’m in the minority and that a lot of folks can and will disagree. Some people take gaming a bit too seriously at times, but I like to think that there’s room in the community for polite discussions and disagreements.

Have fun out there – and happy gaming!

All titles discussed above are the copyright of their respective studio, developer, and/or publisher. Some images used above courtesy of IGDB and Unsplash. This article contains the thoughts and opinions of one person only and is not intended to cause any offence.

Ten Gaming “Hot Takes” (Part 1)

Today I thought we could have a bit of fun and talk about some of my more controversial gaming opinions! This is the first part of a two-part list, so be sure to stay tuned in the days ahead for five more gaming “hot takes.” There were too many to fit into a single piece this time around!

Although this is intended to be lighthearted and somewhat tongue-in-cheek, these are opinions that I genuinely hold; I’m not making things up for the sake of clickbait. I’ll always give the caveat that I’m a fan of video games and an advocate for gaming as a hobby… but that doesn’t mean that there aren’t things to criticise from time to time!

A Sega Mega Drive console.
Let’s share some controversial gaming opinions!

Gaming has changed a lot since I first picked up a joystick at a kids’ club in the ’80s, and I’ve seen the games industry and games themselves evolve dramatically! Most of those changes have been for the better… but perhaps not every last one.

As I always say when we talk about potentially controversial topics: these are my wholly subjective opinions! I’m not trying to claim that I’m right and that’s the end of the affair – on the contrary: I’m acutely aware that I’m in the minority here! I share these “hot takes” in the spirit of thought-provoking fun, and you are free to disagree wholeheartedly.

With all of that out of the way, let’s take a look at some “hot takes!”

“Hot Take” #1:
An open world isn’t the right choice for a lot of games.

A screenshot of Jedi: Survivor showing protagonist Cal Kestis outside of a saloon.
Jedi: Survivor is a recent game that employed an open world style.

Open worlds became a gaming trend sometime in the early 2010s, and too many publishers nowadays insist on forcing the formula onto titles that are entirely unsuited to it. Some open worlds are great… but I’d argue that relatively few manage to hit the golden combo of being both a well-constructed open world and one that suits the game in question. There have been some fantastic open worlds housing stories that didn’t fit them, and some potentially wonderful games undone by the fetishisation of the open world formula in some corporate boardrooms.

In many, many cases, having distinct levels or separate sections of a larger map just… works. It allows the game’s narrative to create an often-necessary sense of physical distance between locations – something that even the best open world maps usually can’t manage. And for an awful lot of stories – even in games that we might consider masterpieces – that can be important to the immersion.

Ryo Hazuki, protagonist of Shenmue, encounters a man dressed as Santa Claus.
An early open world pioneer was Shenmue on the Dreamcast.

Take Red Dead Redemption II as an example. That game is one of the very best that I’ve ever played… but there were several points in its single-player story where the open world formula came close to being a problem. After escaping the town of Blackwater by the skin of their teeth in the game’s prologue, Arthur Morgan and the gang roam around in the mountains for a while, before eventually finding a new place to make camp… literally five minutes away from Blackwater. And this would happen again later in the game, when the gang would escape the town of Valentine only to settle at a new campsite just up the road.

The game’s narrative presented these locations as if they were far apart, but the open world of Red Dead Redemption II, for all of the content that it was filled with, didn’t always gel with that. It’s a scaled-down representation of part of the United States, and I get that. But narratively, it might’ve worked even better if the game’s main acts took place in separate, smaller open maps instead of merging them all into one larger open world.

Arthur Morgan, the protagonist of Red Dead Redemption II.
Red Dead Redemption II is a masterpiece.

Red Dead Redemption II is, without a doubt, one of the best games that I’ve ever played. So if the open world could be a problem there… well, you don’t need to think too hard to find examples of the formula tripping up worse and far less enjoyable titles! There’s absolutely nothing wrong with creating separate levels for a game – as has been done since practically the beginning of narrative video games. Doing so often allows for more diversity in locations, environments, and terrain – and it’s something more titles should consider taking advantage of.

I could probably count on my fingers the number of games that have genuinely made good use of an open world. And when I think about modern games that I’ve really enjoyed, such as The Last of Us, Jedi: Fallen Order, or the Mass Effect trilogy, they don’t use open worlds – and they’re much better for it.

“Hot Take” #2:
Every game should have a robust easy mode – it’s an accessibility feature.

The Skyrim options menu with difficulty settings highlighted.
Difficulty options in Skyrim.

I’m a big believer in making games accessible to as many players as possible. That can mean including accessibility features like colourblindness settings, the option to disable quick-time events, or guaranteed subtitles. But it also means that players need to be able to tone down the difficulty – yes, even in your precious Dark Souls!

I suffer from arthritis, including in my hands and fingers. I don’t have the ability to pull off complicated multi-button combos any more – if I ever possessed such an ability! And as with any skill or set of skills, gaming abilities vary from person to person; even someone who isn’t suffering from a health condition may simply not be blessed with the reflexes or hand-eye coordination necessary to progress through some of the industry’s more punishing titles. Not to mention that many folks don’t have the free time to dedicate to learning precise button combos or the intricate details of specific boss battles.

A promotional screenshot of Kingdom Come: Deliverance.
Kingdom Come: Deliverance was a title I found too difficult to play, despite wanting to enjoy it.

And that’s a real shame – because there are some outstanding games that everyone should be able to experience. Stories in some games are truly awe-inspiring, and can be better in some cases than films or television shows. For those stories to be denied to people with disabilities or people who may not have the time to repeat the same boss fight or level over and over again is just… sad.

I absolutely detest the expression “not every game is made for every player” when this debate rolls around. It’s absolutely true that people like different things, so if I’m not into online multiplayer shooters then I’m probably not going to enjoy the next Call of Duty title. But that doesn’t apply to difficulty, or to making a game that millions of potential players are locked out of because of a skill mismatch or health condition. That kind of gatekeeping is honestly just pathetic.

A toddler or young child playing a racing game.
Gaming should be accessible to as many people as possible.

I’d also add that the reverse is true here: certain games can be too easy for some players, and including the option to increase the difficulty in that case is likewise a good thing and something that developers should seek to include.

Difficulty settings have been a part of games going back decades, and they aren’t all that difficult to implement. At the very least, giving players the option to skip a level or boss battle after failing it multiple times should be achievable for every developer – and I can’t think of a good reason why a studio that cares about its audience wouldn’t want to implement something so incredibly basic. It doesn’t “hurt” the game to include an easy mode, nor does it damage the developers’ “artistic vision.” An easy mode only impacts players who choose to turn it on – and in a single-player game, why should anyone be judgemental about that?
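In fact, a basic version of the “skip after repeated failures” idea is almost trivially simple. Here’s a minimal sketch in Python – every name in it is hypothetical, and a real engine would wire this into its own encounter and UI systems – just to show how little logic is actually involved:

    # A minimal sketch of a "skip after repeated failures" assist.
    # All names here are hypothetical; a real game would hook this into
    # its own encounter loop and menu system.

    def run_encounter(play_encounter, offer_skip_prompt, max_failures=3):
        """Run an encounter, offering a skip once the player has failed enough times.

        play_encounter: callable returning True on success, False on failure.
        offer_skip_prompt: callable returning True if the player chooses to skip.
        """
        failures = 0
        while True:
            if play_encounter():
                return "cleared"
            failures += 1
            # Only players who keep failing ever see the prompt - everyone
            # else's experience is untouched, which is the whole point.
            if failures >= max_failures and offer_skip_prompt():
                return "skipped"

Crucially, a mechanism like this costs nothing for players who never trigger it – which is exactly why I can’t see a good argument against it in a single-player game.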

“Hot Take” #3:
Artificial intelligence isn’t “coming soon,” it’s already here – and the games industry will have to adapt.

Still frame from the film Terminator (1984).
Are you ready for the “rise of the machines?”

One of the hottest topics of 2023 has been the arrival of easily-accessible generative AI software. It seems that anyone can now create an article like this one, a photorealistic image of just about anything, an audio recording of a celebrity… or even code for a video game. This technology has well and truly landed, and I don’t see any practical way to prohibit or ban it – so the games industry is going to have to adapt to that reality.

I can see a lot of potential positives to AI. Modding, for instance, can now get a lot more creative, and we’ve already seen mods featuring AI voices that are basically seamless and can add a lot to a character or story. For smaller developers and indie studios, too, AI has the potential to be a massively useful tool – doing things that a single developer or small team wouldn’t otherwise be able to achieve.

"Matrix code" from the 2021 film The Matrix: Resurrections.
AI is already here – and could prove incredibly useful to game developers.

But there are unquestionably massive downsides. The games industry has seen significant layoffs this year – despite most of the big corporations making record profits. Corporations in all kinds of industries are looking to replace as many real humans as possible with AI software… and for an all-digital product like a video game, the potential for divisions or even entire studios being shut down is firmly on the table.

The arrival of generative AI is going to shake things up, and I can absolutely see there being less creativity in the games industry if too many big corporations go down that road. Because of the way these AI programmes work, they aren’t capable of truly creating – only reworking things that already exist and generating something within the same parameters. If major video games start using AI in a big way, you can say goodbye to innovation and creativity.

An example of AI-generated art.
An example of AI-generated art that was created (in less than ten seconds) from a prompt I entered.
Image Credit: Hotpot Art Generator

Whichever company cracks AI first is, in all likelihood, going to be rewarded – so there may even be a kind of “AI arms race” within the games industry, as some of the biggest corporations duke it out to be the first one to strike the right balance between AI and human-created content. What that might mean for games in the short-to-medium term… I can’t really say.

Generative AI is here to stay, though, and I don’t see a way around that. Some folks have suggested boycotting AI-heavy titles, but these consumer boycotts seldom succeed. If a new game that relied on AI during its creation ends up being fun to play, I daresay it’ll get played. Most players don’t follow the ins and outs of the industry, and may never even know the extent to which their favourite game was created using AI. I hope you’re ready for AI… because I’m not sure that I am!

“Hot Take” #4:
Sonic the Hedgehog doesn’t work in 3D.

Promotional screenshot from 2014's Sonic Boom: Rise of Lyric.
3D Sonic.

We’re going franchise-specific for this one! I adored the first Sonic the Hedgehog games on the Sega Mega Drive. I didn’t have a Mega Drive at the time, but a friend of mine did and we played a lot of Sonic in the early ’90s! Along with Super Mario, Sonic was one of the characters who scaled the mountain and was at the absolute peak of gaming… for a time.

But Sonic’s sole gimmick meant that the character struggled to successfully make the transition from 2D side-scrolling games to fully 3D titles.

Cropped box art for the re-release of Sonic the Hedgehog.
Sonic’s “gotta go fast” gimmick works in 2D games… but not in 3D.

The most successful Sonic game this side of the millennium has been Sonic Mania… a 2017 title developed by long-time fans of the series whom Sega brought on board. Sonic Mania is an old-school 2D platformer in the style of the original Mega Drive games. It’s great fun, and a real return to form for Sega’s mascot after years of mediocrity.

Sonic’s fundamental problem begins with his sole superpower: speed. Extreme speed felt wonderful in 2D – and it was incredibly innovative at the time! But in 3D, it’s far more difficult to build worlds suited to moving so quickly, and trickier still for players to control a character at that pace.

Promotional screenshot for 2017's Sonic Mania.
Sonic Mania has been the most successful Sonic game in decades.

There have been 3D Sonic games that tried to innovate, but even the best of them feel like they’re missing something. I remember playing Sonic Adventure on the Dreamcast and barely having to push any buttons; in order to make Sonic work in 3D, much of the interactivity had to be stripped out. That made for a far less enjoyable gaming experience.

When Sonic shows up in other titles – such as alongside Mario for an arcadey sports game, or in Sega’s Mario Kart competitor – then the character can be made to work. But those games almost always rob Sonic of his one defining trait: his speed. I’ve never played a 3D Sonic game that felt anywhere near as good as those original 2D titles.

“Hot Take” #5:
Google Stadia was a good idea (in more ways than one).

Promo image featuring the Stadia control pad.
Promo image of the Stadia control pad (right) next to a laptop.

The history of video gaming is littered with failed consoles and devices; machines that didn’t quite make it for one reason or another. 2019’s Stadia – Google’s attempt to break into the games industry – has become the latest example, being fully shut down after a little over three years. There were myriad problems with Stadia, and Google has a track record of neither backing its projects and investments nor giving them enough time to deliver. So in that sense its failure is understandable. But I know I’m out on a limb when I say that it’s disappointing – and potentially even bad for the games industry as a whole.

For one thing, Stadia offered a relatively inexpensive way to get started with gaming by relying on streaming. Gone was the need for an expensive console or PC; players could jump in using only their existing screen and a Stadia controller. Lowering the cost of entry to gaming is a good thing, and we should be looking for more ways to do that!

Promo screenshot of Stadia-exclusive title Gylt.
Gylt was one of the only Stadia-exclusive games.

For another, Stadia represented the first potential shake-up of a pretty stagnant industry in nigh-on twenty years. Since Microsoft entered the video game market and Sega dropped out, there have been three major hardware manufacturers and three main gaming platforms. Disrupting that status quo is, again, not a bad thing in theory. Stadia, with Google’s support and financial resources, seemed well-positioned to be the kind of disruptive force that often leads to positive change.

Stadia won’t be remembered – except as the answer to an obscure pub quiz question in a few years’ time, perhaps. But it had potential when it was announced, both in terms of the way it could have brought console-quality games to people who couldn’t necessarily pay for a current-generation machine up-front, and in the way Google could’ve disrupted the industry, leading to competition and innovation.

A Google Chromecast device.
Stadia was designed to be compatible with Google’s Chromecast devices – as well as other platforms.

I didn’t buy into Stadia on day one. As someone who has a gaming PC, I didn’t really feel it was necessary. And there were limitations to Stadia: a lack of exclusive games, no Netflix-style all-you-can-play subscription, and Google’s well-known history of prematurely shutting down underperforming products and services. All of these things put me off – and undoubtedly put off a lot of other folks, too.

But in a way, I regret the demise of Stadia. Its short, unsuccessful life will surely serve as a warning to any other company that might’ve considered launching a new console or a comparable streaming device. And if there’s one thing I think we can all agree on, it’s this: the games industry needs a shake-up from time to time! Stadia couldn’t do it, unfortunately… but I hope that another device will.

So that’s it… for now!

Screenshot of Starfield.
Starfield (2023).

Stay tuned, because I have five more “hot takes” that I’m currently in the process of writing up.

As I said at the beginning, none of these things should be taken too seriously – this is just intended to be a bit of thought-provoking fun, at the end of the day.

There’s a lot to love about gaming as a hobby, and the quality of video games in general is way higher today than I could’ve imagined even just a few years ago. There are some incredible games out there; masterpieces in every sense of the word that have given me some of the best entertainment experiences I’ve ever had. And there are some games that I didn’t enjoy, too! I hope this look at a few of my “hot takes” hasn’t gotten anyone too upset!

All titles discussed above are the copyright of their respective studio, developer, and/or publisher. Some images used above courtesy of IGDB and Unsplash. This article contains the thoughts and opinions of one person only and is not intended to cause any offence.

Ten of my gaming pet peeves

A couple of years ago, I put together two lists of things I really dislike about modern video games – but somehow I’ve managed to find even more! Although there’s lots to enjoy when it comes to the hobby of gaming, there are still plenty of annoyances and dislikes that can detract from even the most pleasant of gaming experiences. So today, I thought it could be a bit of fun to take a look at ten of them!

Several of these points could (and perhaps one day will) be full articles or essays all on their own. Big corporations in the video games industry all too often try to get away with egregious and even malicious business practices – and we should all do our best to call out misbehaviour. While today’s list is somewhat tongue-in-cheek, there are major issues with the way big corporations in the gaming realm behave… as indeed there are with billion-dollar corporations in every other industry, too.

Gaming is great fun… but it has its annoyances!

That being said, this is supposed to be a bit of fun. And as always, I like to caveat any piece like this by saying that everything we’re going to be talking about is nothing more than one person’s subjective take! If you disagree with everything I have to say, if you like, enjoy, or simply don’t care about these issues, or if I’ve missed something that seems like an obvious inclusion, please keep in mind that all of this is just the opinion of one person! There’s always room for differences of opinion; as gamers we all have different preferences and tolerance levels.

If you’d like to check out my earlier lists of gaming annoyances, you can find the first one by clicking or tapping here, and the follow-up by clicking or tapping here. In some ways, this list is “part three,” so if you like what you see, you might also enjoy those older lists as well!

With all of that out of the way, let’s jump into the list – which is in no particular order.

Number 1:
Motion blur and film grain.

Film grain and motion blur options in Ghostwire Tokyo.

Whenever I boot up a new game, I jump straight into the options menu and disable both motion blur and film grain – settings that are almost always inexplicably enabled by default. Film grain is nothing more than a crappy Snapchat filter; something twelve-year-olds love to play with to make their photos look “retro.” It adds nothing to a game and actively detracts from the graphical fidelity of modern titles.

Motion blur is in the same category. Why would anyone want this motion sickness-inducing setting enabled? It smears and smudges even the best-looking titles for basically no reason at all. Maybe on particularly underpowered systems these settings might hide some graphical jankiness, but on new consoles and even moderately good PCs, they’re unnecessary. They make games look significantly worse – and I can’t understand why anyone would choose to play a title with them enabled.

Number 2:
In-game currencies that have deliberately awkward exchange rates.

Show-Bucks bundles in Fall Guys.

In-game currencies are already pretty shady; a psychological manipulation to trick players into spending more real money. But what’s far worse is when in-game currencies have deliberately awkward exchange rates. For example: most items on the storefront cost 200 in-game dollars, but I can only buy in-game dollars in bundles of 250 or 500. If I buy 250 in-game dollars, I’ll have a few left over that I can’t spend; if I buy 500, I’ll have spent more than I need to.

This is something publishers do deliberately. They know that if you have 50 in-game dollars left over, there’ll be a temptation to buy even more to make up the difference, and they know players will be forced to over-spend on currency they have no real need for. Some of these schemes verge on being scams – but all of them are annoying.
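To make the arithmetic concrete, here’s a quick throwaway Python sketch using the numbers from the example above:

    # The "awkward exchange rate" maths: items cost 200, but currency
    # only comes in bundles of 250 or 500.
    ITEM_PRICE = 200

    for bundle in (250, 500):
        affordable = bundle // ITEM_PRICE   # items you can actually buy
        stranded = bundle % ITEM_PRICE      # coins left over that buy nothing
        print(f"Buy {bundle}: {affordable} item(s), {stranded} coins stranded")

    # Output:
    # Buy 250: 1 item(s), 50 coins stranded
    # Buy 500: 2 item(s), 100 coins stranded

Either way, the player ends up holding currency they can’t cleanly spend – which is exactly the temptation publishers are counting on.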

Number 3:
Fully-priced games with microtransactions.

The in-game shop in Diablo IV.

If a game is free – like Fortnite or Fall Guys – then microtransactions feel a lot more reasonable. Offering a game for free to fund it through in-game purchases is a viable business model, and while it needs to be monitored to make sure the in-game prices aren’t unreasonable, it can be an acceptable way for a game to make money. But if a game costs me £65 up-front, there’s no way it should include microtransactions.

We need to differentiate expansion packs from microtransactions, because DLC that massively expands a game and adds new missions and the like is usually acceptable. But if I’ve paid full price for a game, I shouldn’t find an in-game shop offering me new costumes, weapon upgrades, and things like that. Some titles absolutely take the piss with this, too, even including microtransactions in single-player campaigns, or having so many individual items for sale that the true cost of the game – including purchasing all in-game items – can run into four or even five figures.

Number 4:
Patches as big as (or bigger than) the actual game.

No patch should ever need to be this large.

This one kills me because of my slow internet! And it’s come to the fore recently as a number of big releases have been buggy and broken at launch. Jedi: Survivor, for example, has had patches that were as big as the game’s original 120GB download size – meaning a single patch would take me more than a day to download. Surely it must be possible to patch or fix individual files without requiring players to download the entire game all over again – in some cases more than once.

I’m not a developer or technical expert, and I concede that I don’t know enough about this topic on a technical level to be able to say with certainty that it’s something that should never happen. But as a player, I know how damnably annoying it is to press “play” only to be told I need to wait hours and hours for a massive, unwieldy patch. Especially if that patch, when fully downloaded, doesn’t appear to have actually done anything!
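That said, the basic idea – comparing what’s installed against a manifest of what should be installed, and downloading only the differences – isn’t exotic. Here’s a rough sketch of the concept; the manifest format and every name in it are made up for illustration, and real launchers use far more sophisticated delta-patching schemes:

    # A toy illustration of manifest-based patching: hash the local files,
    # compare against the server's manifest, and flag only what changed.
    import hashlib
    from pathlib import Path

    def file_hash(path: Path) -> str:
        """Return the SHA-256 digest of a file's contents."""
        return hashlib.sha256(path.read_bytes()).hexdigest()

    def files_needing_update(install_dir: Path, manifest: dict) -> list:
        """manifest maps relative file paths to their expected SHA-256 digests."""
        stale = []
        for rel_path, expected_digest in manifest.items():
            local_file = install_dir / rel_path
            if not local_file.exists() or file_hash(local_file) != expected_digest:
                stale.append(rel_path)
        return stale  # download only these files, not the whole game

With a scheme along these lines, a patch download is proportional to what actually changed – not to the full 120GB install.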

Number 5:
Broken PC ports.

This is supposed to be Joel from The Last of Us Part I.

As I said when I took a longer look at this topic, I had hoped that broken PC ports were becoming a thing of the past. Not so, however! A number of recent releases – including massive AAA titles – have landed on PC in broken or even outright unplayable states, plagued by issues that are not present on PlayStation or Xbox.

PC is a massive platform, one that shouldn’t be neglected in this way. At the very least, publishers should have the decency to delay a PC port if it’s clearly lagging behind the console versions – but given the resources that many of the games industry’s biggest corporations have at their disposal, I don’t see why we should accept even that. Develop your game properly and don’t try to launch it before it’s ready! I’m not willing to pay for the “privilege” of doing the job of a QA tester.

Number 6:
Recent price hikes.

It must be some kind of visual metaphor…

Inflation and a cost-of-living crisis are really punching all of us in the face right now – so the last thing we need is price hikes from massive corporations. Sony really pissed me off last year when they bragged to their investors about record profits, then turned around literally a matter of weeks later and announced that the price of PlayStation 5 consoles was going up. This was unprecedented – the cost of consoles usually falls as a generation progresses.

But Sony is far from the only culprit. Nintendo, Xbox, Activision Blizzard, Take-Two, Electronic Arts, and practically every other major corporation in the games industry have jacked up their prices over the last few years, raising the basic price of a new game – and that’s before we look at DLC, special editions, and the like. These companies are making record-breaking profits, and yet they use the excuse of “inflation” to rip us off even more. Profiteering wankers.

Number 7:
The “release now, fix later” business model is still here.

The player character falling through the map in Star Wars Jedi: Survivor.

I had hoped that some recent catastrophic game launches would have been the death knell for the “release now, fix later” business model – but alas. Cyberpunk 2077 failed so hard that it got pulled from the PlayStation Store and tanked CD Projekt Red’s share price… but even so, this appalling way of making and launching games has persisted. Just in the first half of 2023 we’ve had titles like Hogwarts Legacy, Redfall, Jedi: Survivor, Forspoken, and The Lord of the Rings: Gollum arriving broken, buggy, or borderline unplayable.

With every disaster that causes trouble for a corporation, I cross my fingers and hope that lessons will be learned. But it seems as if the “release now, fix later” approach is here to stay. Or at least it will be as long as players keep putting up with it – and even defending it in some cases.

Number 8:
Day-one DLC/paywalled day-one content.

An example of a “digital deluxe edition” and its paywalled content.

It irks me no end when content that was clearly developed at the same time as the “base version” of a game is paywalled off and sold separately for an additional fee. The most egregious example of this that comes to mind is Mass Effect 3’s From Ashes DLC, which was launched alongside the game. This DLC included a character and missions that were completely integrated into the game – yet had been carved out to be sold separately.

This practice continues, unfortunately, and many modern titles release with content paywalled off, even if that content was developed right along with the rest of the game. Sometimes these things are designed to be sold as part of a “special edition,” but that doesn’t excuse it either. Even if all we’re talking about are character skins and cosmetic content, it still feels like those things should be included in the price – especially in single-player titles. Some of this content can be massively overpriced, too, with packs of two or three character skins often retailing for £10 or more.

Number 9:
Platform-exclusive content and missions.

Spider-Man was a PlayStation-only character in Marvel’s Avengers.

Some titles are released with content locked to a single platform. Hogwarts Legacy and Marvel’s Avengers are two examples that come to mind – and in both cases, missions and characters that should have been part of the main game were unavailable to players on PC and Xbox thanks to deals with Sony. While I can understand the incentive to do this… it’s a pretty shit way of making money for a publisher, and a pretty scummy way for a platform to try to attract sales.

Again, this leaves games incomplete, and players who’ve paid full price end up getting a worse experience or an experience with less to do depending on their platform of choice. That’s unfair – and it’s something that shouldn’t be happening.

Number 10:
Pre-orders.

Cartman from South Park said it best:
“You know what you get for pre-ordering a game? A big dick in your mouth.”

Pre-ordering made sense when games were sold in brick-and-mortar shops on cartridges or discs. You wanted to guarantee your copy of the latest big release, and one way to make sure you’d get the game before it sold out was to pre-order it. But that doesn’t apply any more; not only are more and more games sold digitally, but even if you’re a console player who wants a game on disc, there isn’t the same danger of scarcity that there once was.

With so many games being released broken – or else failing to live up to expectations – pre-ordering in 2023 is nothing short of stupidity, and any player who still does it is an idiot. It actively harms the industry and other players by letting corporations get away with more misbehaviour and nonsense. If we could all be patient and wait a day or two for reviews, fewer games could launch in unplayable states. Games companies bank on a significant number of players pre-ordering and not cancelling or refunding if things go wrong. It’s free money for them – and utterly unnecessary in an age of digital downloads.

So that’s it!

A PlayStation 5 console.

We’ve gone through ten of my pet peeves when it comes to gaming. I hope this was a bit of fun – and not something to get too upset over!

The gaming landscape has changed massively since I first started playing. Among the earliest titles I can remember trying my hand at are Antarctic Adventure and the Commodore 64 title International Soccer, and the first home console I was able to get was a Super Nintendo. Gaming has grown massively since those days, and the kinds of games that can be created with modern technology, game engines, and artificial intelligence can be truly breathtaking.

But it isn’t all good, and we’ve talked about a few things today that I find irritating or annoying. The continued push from publishers to release games too early and promise patches and fixes is particularly disappointing, and too many publishers and corporations take their greed to unnecessary extremes. But that’s the way the games industry is… and as cathartic as it was to get it off my chest, I don’t see those things disappearing any time soon!

All titles mentioned above are the copyright of their respective developer, studio, and/or publisher. This article contains the thoughts and opinions of one person only and is not intended to cause any offence.

More of the worst things about modern video games

A couple of months ago I took a look at some of the trends I hate the most in the modern games industry. But one list wasn’t comprehensive enough, apparently, because I’ve found ten more of the worst things to look at today!

Gaming as a hobby has come a long way since I first owned a Super Nintendo. Games have evolved from being little more than electronic toys to being a legitimate artistic and storytelling medium in their own right, and many of my favourite entertainment experiences of all time are in the gaming realm. Games can equal, and in some cases surpass, film and television.

Mass Effect 2 has to be one of the best stories I’ve ever experienced.

But not everything about gaming is fun! There are annoyances and problems with games today, some of which didn’t exist a few years ago, and others which have dogged the medium since its inception. As always, this list is entirely subjective, so if I criticise something you like, or ignore something you hate, please keep in mind that all of this is just the opinion of one person. If you want to check out my previous list, you can find it by clicking or tapping here.

With all that out of the way, let’s get started!

Number 1: Checkpoints

Cal Kestis at a checkpoint in Star Wars Jedi: Fallen Order.

Is it 1996? No? Then let’s stop using checkpoints and allow players the freedom to save their game whenever and wherever they need to! With relatively few sensible exceptions – like in the middle of a boss fight or during a cut-scene – there’s no reason why modern games can’t incorporate a free save system.

Checkpoints were a limitation of older hardware and software; games and consoles weren’t always able to offer players the ability to save anywhere, so designated save zones – or checkpoints – had to be incorporated. This was already a step up from passwords that you had to write down (remember those?), but checkpoints are simply unnecessary and out-of-date in modern games.

Control also uses a checkpoint system.

With gaming having grown in the years since checkpoints were the only way to manage save files, more people from different backgrounds are getting into the hobby – including many more adults, working-age people, and folks with less free time. Having to replay a lengthy section of a game because it didn’t offer the freedom to save when you needed to is incredibly frustrating – and since there’s no technical reason not to implement a proper save system, in my opinion there’s no excuse.

Whine all you want about “vision” and “integrity,” or insist that players should “git gud” – but a lot of folks simply want to play through a fun and entertaining narrative. We also want to play through it once, not multiple times because of the lack of a convenient save function. Checkpoints seemed to have largely disappeared until the likes of Dark Souls brought them back as part of its “extreme difficulty” shtick. But there’s a difference between a challenge and a frustration – and checkpoints are definitely in the latter category.
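And to underline that this isn’t technically demanding: at its simplest, a free save system is just “serialise the current state whenever the player asks.” A toy Python sketch – with a made-up, drastically simplified game state – might look like this:

    # A toy "save anywhere" system: serialise whatever state matters whenever
    # the player asks. Real games track far more state, but the principle holds.
    import json
    from pathlib import Path

    def save_game(state: dict, slot: Path) -> None:
        """Write the current game state to a save slot, at any moment."""
        slot.write_text(json.dumps(state, indent=2))

    def load_game(slot: Path) -> dict:
        """Restore a previously saved state."""
        return json.loads(slot.read_text())

    # Example: save mid-level, at whatever moment the player chooses.
    state = {"level": "docks", "position": [412.5, 88.0], "health": 73,
             "inventory": ["medkit", "keycard"]}
    save_game(state, Path("slot1.json"))
    assert load_game(Path("slot1.json")) == state

Real games obviously have far more state to capture, but the point stands: “where can the player save?” is a design decision, not a technical constraint.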

Number 2: Boring and/or repetitive side-missions

“Another settlement needs our help.”

It’s no good bragging about the number of quests or missions in your game if 80% of them are the same – or all equally bad! Open-world games tend to fall victim to this, but it’s a phenomenon that can plague all manner of different titles.

These kinds of missions follow one of a couple of different formulae: “go to location X and pick up item Y” or “go to location X and kill Y number of enemies.” Then that’s it. Mission over, receive a few experience points or a random, usually-not-worth-it item, and repeat. Such quests are nothing but padding for a game that should’ve been shorter and more focused.

The Mako in Mass Effect: Legendary Edition.

Even otherwise good games can end up going down this route. Mass Effect 1 is a case in point. The main story missions are phenomenal, and while the setups for some of the side-missions sound potentially interesting, each one basically consists of “drive vehicle to location, kill enemies, press button.” Because 90% of the side-missions use basically identical maps and environments, this gets old fast – even if the storyline setting up the mission seems superficially interesting.

If you can’t make a good side-mission, skip it. I’d rather play a game that isn’t as long but doesn’t have this unnecessary fluff padding it out and, frankly, wasting my time.

Number 3: Collect-a-thons

Another feather. Yay.

On a related note, many open-world games have recently begun being padded out with miscellaneous items to collect. Upon picking up a feather, for example, the game will tell you that you’ve discovered 1/100 – only 99 more to go! These items almost always have no impact on the plot or gameplay of a title, and often don’t even give out a reward for finding the full set. At most, you might get a trophy or achievement for your trouble.

At least boring side-missions usually have some kind of setup. A villager needs you to kill the rats in his basement, an admiral needs you to shut down all four computer cores, etc. Though the missions themselves are junk, a modicum of thought went into their creation. Collect-a-thons have no such redeeming feature. Often the items to be collected are so random that they have no link whatsoever to the plot or character.

Pigeons in Grand Theft Auto IV are another example.

Why does my grizzled war veteran on a mission to save the world need to spend his time hunting down 100 feathers or 50 leaves? If the items did something – anything – like being usable for crafting, or containing notes or recordings with lore and info about the game world, well, at least there’d be a point. It wouldn’t necessarily be a good point, but still.

These items are added into games – often in obscure or hard-to-reach places – purely to pad out the game and extend its runtime. They serve no purpose, either narratively or in terms of gameplay, and while I have no doubt that some players find hunting down every single in-game item fun, I’d rather the effort and attention wasted on features like this were refocused elsewhere. One side-mission, even an average one, would be better than 100 random pieces of shit to collect.

Number 4: Online cheating

An aimbot for popular game Fortnite.

If you have a single-player game and want to turn on god mode or assisted aiming, go for it. Cheats can sometimes be accessibility features, offering a route through a game for players with disabilities, as well as providing a way to skip the grind for players who don’t have much time. But when you go online and play against real people, you damn well better leave the cheats behind!

There are so many examples of cheating players getting caught and banned that it can be kind of funny. Even some professional and wannabe-professional players have been caught out and learned the hard way that the internet never forgets. But no one should be doing this in the first place.

Some losers even cheated at Fall Guys, for heaven’s sake…

Trying to take away the most fundamental tenet of competition – fairness – is so phenomenally selfish that I don’t even know what to say. If there were a financial incentive – like winning the prize money at a big tournament – I could at least recognise that some folks would be tempted to try to take the easy route to payday. But in a game like Fall Guys where it’s supposed to be fun… I just don’t get why someone would feel the need to cheat.

Some games have a bigger problem with cheating than others, and games that don’t get a handle on a cheating problem fast can find themselves in serious jeopardy. It’s unfortunate that the anonymity of the internet means that a lot of players simply get away with it, with some even going so far as to use “disposable” accounts, so that if one gets banned they can just hop to another and keep right on cheating.

Number 5: Overly large, confusing levels

Looks like fun…

We kind of touched on this last time when considering empty open worlds, but some games have poorly-designed levels that are too large and almost maze-like. Getting lost or running in circles – especially if no map is provided – can become frustrating very quickly. These kinds of levels are often repetitive and bland with little going on.

Some games have levels which are simply not well laid-out, making it difficult to find the right path forward. I’ve lost count of the number of times I was trying to explore, thinking I was investigating a side-area, only to find it was the main path forward, and vice versa. Advancements in technology – particularly as far as file sizes go – have meant that levels and worlds can be physically larger. Sometimes that’s a good thing, but sometimes it isn’t!

This also applies to featureless open worlds or maps without landmarks for ease of navigation.

If a game has a map, or if a level is well-signposted (either literally or figuratively), then it shouldn’t matter how large it is. Players will be able to figure out where to explore and where to go to proceed with the story or quest. But too often that isn’t the case, and getting lost, backtracking, or not knowing where to go are all annoyances! Not every level has to be massive. Some work far better when kept concise, especially if the number of things to find or do in the level is limited.

Obviously, I don’t include in this category mazes or levels that are deliberately designed to be puzzling. Some games make clever use of such levels, where exploring and figuring out the right path is all part of the fun. Others just screw up their level design and leave players wandering around, confused.

Number 6: Orphaned franchises/unfinished stories

I’m not even going to say it…

Though the phenomenon of a story being abandoned partway through is hardly new – nor even unique to gaming – the rise of more cinematic, story-driven games since the turn of the millennium has brought this issue to the fore. My first encounter with it was in 2001, when Shenmue II dropped off the face of the earth following abysmal sales in Japan and elsewhere, meaning that the saga was never finished.

But it isn’t just commercial failures that miss out on sequels. The lack of a third game in the Half-Life series has become a joke at this point, more than fifteen years after the last mainline entry. Fans have been clamouring for Half-Life 3 for a long time, and the recent success of VR title Half-Life: Alyx proves there’s a market – and that the series’ audience is still here.

Will there ever be a Bully 2?

Sometimes a studio gets busy with other projects. There hasn’t been a new Elder Scrolls game, for example, in part because Bethesda has worked on the Fallout franchise and Starfield in the years since Skyrim was released. But there are also plenty of cases where a developer or publisher finds a cash-cow and abandons all pretence of making new games so they can milk it dry.

Look at Rockstar with Grand Theft Auto V’s online mode, or Valve with its Steam digital shop and the success of online games like Dota 2 and Counter-Strike: Global Offensive. Those studios could make new games or sequels to existing games, but instead choose to focus on older titles. Similarly, studios like Bethesda found success by porting existing games to new and different hardware, as well as releasing new or updated versions of older games.

Number 7: Ultra Special Super Extreme Deluxe Editions

How many different “editions” does a game need?!

I’m not talking about so-called “collector’s editions” of games, which are often simply the game plus a statue or other memorabilia. Those can be fine, because if someone is willing to part with silly money to get a resin statue of an in-game character, who am I to judge? What I greatly dislike are games that are sold with multiple “editions” – i.e. a “basic” version with missing features, then several progressively more expensive versions with those missing features added back in.

Some games take this to silly extremes, with a “basic” version retailing for full price (£55/$60) and the most expensive “deluxe” edition costing far more for the sake of in-game content (extra skins, missions, etc.) that was literally developed alongside the main game and then cut out. Some of these ultra extreme special editions can retail for £80, £90, or even £100 in some cases, and that’s just deceptive.

Sports games, like the FIFA series, do this a lot.

This is an evolution of the “day-one DLC” phenomenon that was prevalent a few years ago. In the case of Mass Effect 3, for example, an entire main character, a mission to recruit them, and all of their scenes and dialogue were developed along with the game – perfectly integrated and designed to be part of it – then cut out and sold as downloadable content on the very day the game launched.

In multiplayer titles, the extreme special supreme editions can come with in-game advantages, making them literally pay-to-win. In free-to-play games, perhaps a degree of paying for an advantage is to be expected – but some of these games are asking full price, then giving a competitive advantage to players who pay above full price.

Number 8: Unrepresentative trailers/marketing material

Anthem made a fake trailer… and look what happened to the game.

I used to work in video games marketing, and I thought I’d seen every shady trick in the book! But some of the trailers and marketing material that publishers show off in the run-up to the launch of a new game can be downright deceptive. Some games, like notorious failure Anthem, even went so far as to create fake “in-game” footage to be shown off at marketing events, which is incredibly bad form.

Cyberpunk 2077 is another example. That game was developed to run on high-end PCs and next-gen consoles, and the Xbox One/PlayStation 4 version was so poorly-optimised when it launched that many folks considered it “unplayable.” The trailers and marketing material hid this fact, and developer CD Projekt Red deliberately kept those versions of the game away from reviewers. The result was that no one realised how broken the game was until it was too late.

CD Projekt Red didn’t show things like this in the Cyberpunk 2077 trailer…

Mobile games are notorious for putting out trailers that are entirely unrepresentative of the games they’re selling. Many mobile games are samey, basic tap-a-thons with unimpressive graphics and mediocre gameplay, yet the trailers make them seem like big-budget console-quality games. In a way this isn’t new; 2D games in the 8-bit era were often marketed with cartoons and fancy graphics that made them look far better than they were!

The thing is, unrepresentative marketing always comes back to bite a company. Just ask CD Projekt Red, whose implosion in the aftermath of Cyberpunk 2077’s abysmal launch will enter gaming history.

Number 9: Massive patches and updates

Yikes.

Last time I criticised ridiculously huge file sizes for games, and this time I want to pick on updates and patches in particular. There’s no feeling more disappointing than sitting down to play a game you’ve been looking forward to all day only to find that either the game or the console needs to download a stupidly large update before you can jump in.

Some updates can be dozens of gigabytes, and if you’re on a slow internet connection (like I am) or have a capped download allowance, it can take forever to update the game – or be outright impossible. Once again, folks with limited time for gaming are in trouble here; even on a reasonably fast connection, a massive update can cut into or erase the time someone set aside for gaming.

After buying a brand-new console, downloading patches and updates can be a time-consuming task.

The stupid thing is that many of these updates appear to change absolutely nothing! I’ve lost track of how many times Steam has updated itself on my PC, for example, only to look exactly the same every time. While it’s good that games companies can roll out bug fixes, patch out glitches, and even fix cheating issues remotely, these things can happen at the most inconvenient times!

In the run-up to Christmas it’s now commonplace, even in mainstream news outlets, to see advice about updating new consoles and games before they’re given as presents. Little Timmy’s Christmas would be ruined if he had to spend all of Christmas Day waiting around for his new PlayStation to update before he could use it!

Number 10: We’re drowning in sequels, remakes, and spin-offs

The Final Fantasy series is up to its fifteenth mainline title…

It’s increasingly rare for a games company to produce a new game that isn’t based on an existing franchise or property. Don’t get me wrong, this isn’t an issue unique to gaming – it’s happening on television and in cinema too. We’re 100% in the era of the franchise.

As great as it is to play a sequel to a much-loved title, it’s also great fun to get stuck into a completely new story with new characters and a new world. Unfortunately, as is the case in television and cinema, companies are increasingly viewing brand-new stories as risky – if fans don’t respond well then their investment will have been wasted!

How many Call of Duty games have there been by now?

Sooner or later, I think this franchise and sequel mania has to break. It can’t go on forever, not least because existing franchises will run out of material and fans will lose interest. But right now it shows absolutely no signs of abating, and some video game franchises have become annual or almost-annual fixtures. The Call of Duty series is a case in point – there’s been a new game every year since 2005.

I appreciate studios willing to stick their necks out and take a risk. Control is a good recent example of a successful new IP, and Starfield will be Bethesda’s first wholly new property in decades when it’s finally ready. But there’s certainly less storytelling innovation than there used to be, and fewer original titles as sequels, franchises, and spin-offs crowd them out.

So that’s it. Ten more things that bug me about modern gaming!

I’m sure I’ll be able to think of more later!

Although we’ve now found twenty annoying trends in modern gaming, the hobby is generally in a good place. Technological improvements mean games look better than ever, and the increase in gaming’s popularity has seen more money enter the industry, as well as quality standards generally rising rather than falling. There are problems, of course, but the industry as a whole isn’t in a terrible place.

At the end of the day, it’s fun to complain and have a bit of a rant! The last list I published seemed to be well-read, so I hope this one has been a bit of fun as well! Now if only someone would make a Star Trek video game… perhaps the lack of one warrants a place on my next list!

You can find my first list of the worst things about modern video games by clicking or tapping here.

All titles mentioned above are the copyright of their studio, developer, and/or publisher. Some screenshots and promotional art courtesy of press kits on IGDB. This article contains the thoughts and opinions of one person only and is not intended to cause any offence.

The worst things about modern video games

The first home console I owned – after saving up my hard-earned pocket money and pestering my parents for ages – was a Super Nintendo. Gaming has changed a lot since then, and while many of those changes have been fantastic and introduced us to new genres, not every change has been for the better! In this list I’m going to cover some of my biggest pet peeves with video games in 2021.

As always, this list is entirely subjective. If I criticise something you like, or exclude something you hate, just keep in mind that this is only one person’s opinion. Gaming is a huge hobby that includes many people with many different perspectives. If yours and mine don’t align, that’s okay!

Number 1: No difficulty options.

Some people play video games because they love the challenge of a punishingly difficult title, and the reward of finally overcoming an impossible level after hours of perseverance. I am not one of those people! In most cases, I play video games for escapism and entertainment – I want to see a story unfold or just switch off from other aspects of my life for a while. Excessive difficulty is frustrating and off-putting for me.

As someone with health issues, I would argue that difficulty settings are a form of accessibility. Some people don’t have the ability to hit keys or buttons in rapid succession, and in some titles the lack of a difficulty setting – particularly if the game is not well-balanced – can mean those games are unavailable to folks with disabilities.

While many games are too difficult, the reverse can also be true. Some titles are just too easy for some people – I’m almost never in that category, but still! A game with no difficulty settings whose base experience is incredibly easy can be unenjoyable for some folks, particularly if the challenge was what got them interested in the first place.

In 2021, most games have difficulty options as a standard feature. Difficulty settings have been part of games going back decades, and in my opinion there’s no technical reason why they shouldn’t be included. There’s not really a “creative” reason, either. Some developers talk in grandiose terms about their “vision” for a title as the reason why they didn’t implement difficulty options, but as I’ve said before, an easier (or harder) mode doesn’t impact anyone who chooses not to use it. Considering how easy such options are to implement, I find it incredibly annoying when a game is deliberately shipped without them.

Number 2: Excessive difficulty as a game’s only selling point.

While we’re on the subject of difficulty, another pet peeve of mine is games whose entire identity is based on their difficulty (or perceived difficulty). Think about this for a moment: would Dark Souls – an otherwise bland, uninspired hack-and-slash game – still be talked about ten years after its release were it not for its reputation for being impossibly difficult? How many late-2000s or early-’10s hack-and-slash games have dropped out of the cultural conversation? The only thing keeping Dark Souls there is its difficulty.

A challenge is all well and good, and I don’t begrudge players who seek that out. But for me, a game has to offer something more than that. If there’s a story worth telling under the difficult gameplay I’m impressed. If the difficult, punishing gameplay is all there is, then that’s boring!

Difficulty can also be used by developers as cover for a short or uninteresting game. Forcing players to replay long sections over and over and over can massively pad out a game’s runtime, and if that’s a concern then cranking the difficulty to ridiculous levels – and offering no way to turn it down – can turn a short game into a long one artificially.

I’m all for games that offer replay value, but being forced to replay the same level or checkpoint – or battle the same boss over and over – purely because of how frustratingly hard the developers chose to make things simply isn’t fun for me.

Number 3: Ridiculous file sizes.

Hey, Call of Duty: your crappy multiplayer mode does not need to be 200 gigabytes. Nor does any game, for that matter. It’s great that modern technology allows developers to create realistic-looking worlds, but some studios are far better than others when it comes to making the best use of space! Some modern games do need to be large to incorporate everything, but even so there’s “large” and then there’s “too large.”

For a lot of folks this is an issue for two main reasons: data caps and download speeds. On my current connection I’m lucky to get a download speed of 7 Mbps, and downloading huge game files can quite literally take several days – days in which doing anything else online would be impossibly slow! But I’m fortunate compared to some people, because I’m not limited in the amount of data I can download by my ISP.
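To put some rough numbers on that – a back-of-the-envelope sketch, assuming a steady 7 Mbps with no overheads, which real-world connections rarely manage – a 200-gigabyte download like the Call of Duty example above works out to 1,600,000 megabits. Divide by 7 and you get around 228,000 seconds: roughly 63 hours, or well over two and a half days of non-stop downloading.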

In many parts of the world, and on cheaper broadband connections, data caps are very much still a thing. Large game files can take up an entire month’s worth of data – or even more in some cases – making games with huge files totally inaccessible to a large number of people.

This one doesn’t seem like it’s going away any time soon, though. In fact, we’re likely to see file sizes continue to get larger as games push for higher resolutions, larger environments, and more detail.

Number 4: Empty open worlds.

Let’s call this one “the Fallout 76 problem.” Open worlds became a trend in gaming at some point in the last decade, such that many franchises pursued this style even when it didn’t suit their gameplay. Read the marketing material of many modern titles and you’ll see bragging about the size of the game world: 50km², 100km², 1,000km², and so on. But many of these open worlds are just empty and boring, with much of the map taken up with vast expanses of nothing.

It is simply not much fun to have to travel across a boring environment – or even a decently pretty one – for ages just to get to the next mission or part of the story. Level design used to be concise and clever; modern open worlds, especially those which brag about their size, tend to be too large, with too little going on.

The reason Fallout 76 encapsulates this for me is twofold. Firstly, Bethesda droned on and on in the weeks before the game’s release about how the world they’d created was the “biggest ever!” And secondly, at launch the game had literally zero non-player characters. That huge open world was populated by a handful of other players, non-sentient monsters, and nothing else. It was one of the worst games of the last few years as a result.

Open worlds can work well in games that are suited for that style of gameplay. But too many studios have been pushed into creating an open world simply to fit in with a current trend, and those open worlds tend to just flat-out suck because of it. Even when developers have tried to throw players a bone by adding in collect-a-thons, those get boring fast.

Number 5: Pixel graphics as a selling point.

There are some great modern games that use a deliberately 8-bit look. But for every modern classic there are fifty shades of shit: games that think pixel graphics and the word “retro” are cover for creating a mediocre or just plain bad title.

It may be hard to remember, but there was a time when the idea of using a deliberately “old-school” aesthetic would have been laughed at. The first few console generations were all about improvements, and I’m old enough to remember when 3D was a huge deal. It seemed like nobody would ever want to go back to playing a SNES game after trying the Nintendo 64, and while there are still plenty of gamers who love the retro feel, I’m generally not one of them.

That isn’t to say that realistic graphics should be the only thing a game strives for. And this point works for modern graphics or visual styles in general – bragging about how detailed the graphics are, or how unique a title’s art style is, means nothing if the game itself is shit. But it likewise works for pixel-graphics games – an outdated art style does not compensate for or cover up a fundamentally flawed, unenjoyable experience.

Games with pixel graphics can be good, and many titles have surprised me by how good they are. I’ve written before about how Minecraft surprised me by being so much more than I expected, and that’s one example. But I guess what I’d say is this: if your game looks like it should have been released in 1991, you’ve got more of an uphill battle to win me over – or even convince me to try it in the first place – than you would if your game looked new.

Number 6: Unnecessary remakes.

We called one of the entries above “the Fallout 76 problem,” so let’s call this one “the Mass Effect: Legendary Edition problem.” In short, games from even ten or fifteen years ago still look pretty good and play well. There’s far less of a difference between games from 2011 and 2021 than there was between games from 1991 and 2001 – the pace of technological change, at least in gaming, has slowed.

“Updating” or “remaking” a game from ten years ago serves no real purpose, and in the case of Mass Effect: Legendary Edition I’ve struggled at times to tell which version of the game is the new one when looking at pre-release marketing material. There’s no compelling reason to remake games that aren’t very old. Re-release them or give them a renewed marketing push if you want to drum up sales or draw attention to a series, but don’t bill your minor upgrade as a “remake.”

There are some games that have benefitted hugely from being remade. I’d point to Crash Bandicoot and Resident Evil 2 as two great examples. But those games were both over twenty years old at the time they were remade, and having been released in the PlayStation 1 era, both saw massive upgrades such that they were truly worthy of the “remake” label.

I’ve put together two lists of games that I’d love to see remade, but when I did so I deliberately excluded titles from the last two console generations. Those games, as I said at the time, are too recent to see any substantial benefits from a remake. In another decade or so, assuming sufficient technological progress has been made, we can talk about remaking PlayStation 3 or PlayStation 4 games – but not now!

Number 7: Fake “remakes.”

On a related note to the point above, if a title is billed as a “remake,” I expect to see substantial changes and improvements. If all that’s happened is a developer has run an old title through an upscaler and added widescreen support, that’s not a remake!

A lot of titles that acquire the “HD” suffix seem to suffer from this problem. Shenmue I & II on PC contained a number of bugs and glitches – some of which existed in the Dreamcast version! When Sega decided to “remake” these two amazing games, they couldn’t even be bothered to patch out bugs that were over fifteen years old. That has to be some of the sloppiest, laziest work I’ve ever seen.

There are other examples of this, where a project may have started out with good intentions but was scaled back and scaled back some more to the point that it ended up being little more than an upscaled re-release. Kingdoms of Amalur: Re-Reckoning springs to mind as an example from just last year.

Remakes are an opportunity to go back to the drawing board, fix issues, update a title, and bring it into the modern world. Too many “remakes” fail to address issues with the original version of the game. We could even point to Mass Effect: Legendary Edition’s refusal to address criticism of the ending of Mass Effect 3 as yet another example of a missed opportunity.

Number 8: The “release now, fix later” business model.

This isn’t the first time I’ve criticised the “release now, fix later” approach taken by too many modern games – and it likely won’t be the last! Games that go down this route – often branded as “live services” – almost always underperform and draw criticism, and they absolutely deserve it. The addition of internet connectivity to home consoles has meant that games companies have taken a “good enough” approach, releasing games before they’re ready with the intention of patching out bugs, adding more content, and so on at a later date.

Cyberpunk 2077 is one of the most recent and most egregious examples of this phenomenon, being released on Xbox One and PlayStation 4 in a state so appallingly bad that many considered it “unplayable.” But there are hundreds of other examples going back to the early part of the last decade. Fortunately, out of all the entries on this list, this is the one that shows at least some signs of going away!

The fundamental flaw in this approach, of course, is that games with potential end up having launches that are mediocre at best, and when they naturally underperform due to bad reviews and word-of-mouth, companies panic! Planned updates are scrapped to avoid pumping more money into a failed product, and a game that could have been decent ends up being forgotten.

For every No Man’s Sky that manages to claw its way to success, there are a dozen Anthems or Mass Effect: Andromedas which fail. Time will tell if Cyberpunk 2077 can rebuild itself and its reputation, but it’s an uphill struggle – and a totally unnecessary one; a self-inflicted wound. If publishers would just wait and delay clearly-unfinished games instead of forcing them to meet arbitrary deadlines, gaming would be a much more enjoyable hobby. Remember, everyone: NO PRE-ORDERS!

Number 9: Forcing games to be multiplayer and/or scrapping single-player modes.

Some games are built from the ground up with multiplayer in mind – but many others are not, and have multiplayer modes tacked on for no reason. The Last Of Us had an unnecessary multiplayer mode, as did Mass Effect 3. Did you even know that, or notice those modes when you booted up those story-focused games?

Some games and even whole genres are just not well-suited to multiplayer. And even those that are still have room for single-player stories too. Many gamers associate the first-person shooter genre with multiplayer, and it’s true that multiplayer games work well in that space. But so do single-player titles, and aside from 2016’s Doom and the newer Wolfenstein titles, I can’t think of many recent single-player first-person shooters – or even shooters with single-player modes that felt anything other than tacked-on.

Anthem is one of the biggest failures of the last few years, despite BioWare wanting it to be the video game equivalent of Bob Dylan. But if Anthem hadn’t been multiplayer and had instead maintained BioWare’s usual single-player focus, who knows what it could have been. There was potential in its Iron Man-esque flying suits, but that potential was wasted on a mediocre-at-best multiplayer shooter.

I started playing games before the internet, when “multiplayer” meant buying a second controller and plugging it into the console’s only other available port! So I know I’m biased because of that. But just a few short years ago it felt as though there were many more single-player titles, and fewer games that felt as though multiplayer modes had been artificially forced in. In the wake of huge financial successes such as Grand Theft Auto V, Fortnite, and the like, publishers see multiplayer as a cash cow – but I wish they didn’t!

Number 10: Early access.

How many times have you been excited to see that a game you’ve been waiting for is finally available to buy… only to see the two most awful words in the entire gaming lexicon: “Early Access”? Early access was billed as a way for indie developers to get feedback on their games before going ahead with a full release, and I want to be clear on this point: I don’t begrudge indie games using it for that purpose. Indies get a pass!

But recently there’s been a trend for huge game studios to use early access as free labour – a cheap replacement for paying the wages of a quality assurance department. When I worked for a large games company in the past, I knew a number of QA testers, and the job is not an easy one. It certainly isn’t one that studios should be pushing off onto players, yet that’s exactly what a number of them have been doing. Early access, if it exists at all, should be a way for small studios to hone and polish their games, and maybe add fan-requested extras – not a way for big companies to save money on testers.

Then there are the perpetual early access games. You know the ones: they entered early access in 2015 and are still there today. Platforms like Steam which offer early access need to set time limits, because unfortunately some games are just taking the piss. If your game has been out since 2015, then it’s out. It’s not in early access, you’ve released it.

Unlike most of the entries on this list, early access started out with genuinely good intentions. When used appropriately by indie developers, it’s fine and I don’t have any issue with it. But big companies should know better, and games that enter early access and never leave should be booted out!

Bonus: Online harassment.

Though this problem afflicts the entire internet regardless of where you go, it’s significant in the gaming realm. Developers, publishers, even individual employees of games studios can find themselves subjected to campaigns of online harassment by so-called “fans” who’ve decided to take issue with something in a recent title.

Let’s be clear: there is never any excuse for this. No game, no matter how bad it is, is worth harassing someone over. It’s possible to criticise games and their companies in a constructive way, or at least in a way that doesn’t get personal. There’s never any need to go after a developer personally, and especially not to send someone death threats.

We’ve seen this happen when games are delayed. We’ve seen it happen when games release too early in a broken state. In the case of Cyberpunk 2077, we’ve seen both. Toxic people will always find a reason to be toxic, unfortunately, and in many ways the anonymity of the internet has brought out the worst in human nature.

No developer or anyone who works in the games industry deserves to be threatened or harassed. It’s awful, it needs to stop, and the petty, toxic people who engage in this scummy activity do not deserve to be called “fans.”

So that’s it. Ten of my pet peeves with modern gaming.

This was a rant, but it was just for fun so I hope you don’t mind! There are some truly annoying things – and some truly annoying people – involved in gaming in 2021, and as much fun as playing games can be, it can be a frustrating experience as well. Some of these things are fads – short-term trends that will evaporate as the industry moves on. But others, like the move away from single-player games toward ongoing multiplayer experiences, seem like they’re here to stay.

Gaming has changed an awful lot since I first picked up a control pad. And it will continue to evolve and adapt – the games industry may be unrecognisable in fifteen or twenty years’ time! We’ll have to keep our fingers crossed for positive changes to come.

All titles mentioned above are the copyright of their respective developer, publisher, and/or studio. Some stock images courtesy of pixabay. Some screenshots and promotional artwork courtesy of IGDB. This article contains the thoughts and opinions of one person only and is not intended to cause any offence.

The odd criticism of Six Days In Fallujah

This article discusses the Iraq War and the Second Battle of Fallujah and may be uncomfortable for some readers.

One of the bloodiest and most controversial battles of the Iraq War was the Second Battle of Fallujah, which took place in November 2004. The battle saw coalition forces – most of them American, though a number of Iraqi and British troops took part as well – capture the city from al-Qaeda and other insurgent forces. The Iraq War is controversial and its history complicated, and I’m simplifying the events of the battle and the war to avoid making this article about a video game too long. Suffice it to say that even now – eighteen years since the United States led a coalition to defeat Saddam Hussein, and more than sixteen years since the Battle of Fallujah – the events remain controversial and disputed, and the consequences of military action are still being felt in Iraq, the wider Middle East, and indeed the whole world.

Six Days In Fallujah is a video game depicting the battle from the American side, and when it was initially in development in the late 2000s it became incredibly controversial in the United States, with politicians and Iraq War veterans’ groups expressing opposition and disgust. The idea of recreating for fun any aspect of one of the most divisive conflicts of the last few decades was considered obscene, and the idea of encouraging gamers to play through a battle that took place, at that time, a mere five years earlier was too much for many people to countenance.

After the controversy boiled over and saw media personalities and politicians get involved in 2009, Six Days In Fallujah disappeared, and by 2010 or 2011 the project was effectively shelved. The critics moved on, the developers moved on, and that appeared to be the end of the matter.

Last month, however, there came the announcement from a studio called Highwire Games – which is said to consist of developers who worked on games in the Halo and Destiny franchises at Bungie – that Six Days In Fallujah was back. The game is now scheduled for a late 2021 release, and is planned to retain the original focus that was the cause of such controversy a decade ago. Cue outrage from the expected sources.

What took me by surprise was not the strength of feeling expressed by some veterans of the battle, nor the criticism by largely self-serving politicians. That was to be expected, and the announcement of Six Days In Fallujah went out of its way to highlight how Highwire Games has worked with veterans in particular – clearly anticipating this kind of reaction and trying to pre-empt some of the criticism. Instead what genuinely surprised me was the reaction from some games industry insiders and commentators, who appear to be taking an equally aggressive stance in opposition to Six Days In Fallujah.

Politicians, particularly those to the right of centre, have long campaigned against video gaming as a hobby. Initially games were derided as wastes of time or childish, but some time in the 1990s the tactic switched to accusing games of inspiring or encouraging violence; equating in-game actions with real-world events. Numerous studies have looked into this accusation, by the way, and found it to be without merit. But we’re off-topic.

Advocates of video gaming as a hobby – in which category I must include myself, both as someone who used to work in the industry and as an independent media critic who frequently discusses gaming – have long tried to push back against this narrative and these attacks. “Video games can be art” is a frequently heard refrain from those of us who support the idea of interactive media having merit that extends beyond simple entertainment, and there are many games to which I would direct an opponent to see for themselves that games can be just as valid as works of cinema and literature.

It was disappointing to see folks I’d consider allies in the fight for gaming to be taken more seriously calling out Six Days In Fallujah because of its controversial subject matter. Art, particularly art that deals with controversial current and historical events, can be difficult and challenging for its audience – and it’s meant to be. A painting, photograph, novel, or film depicting something like war is sometimes going to challenge our preconceptions and ask us to consider different points of view. That’s what makes art of this kind worthwhile. It’s what makes everything from war photography to protest songs to the entire genre of war in cinema incredibly important.

Documentaries and news reports only cover events in one way. The way we as a society come to understand events is partly factual, but it is also informed, in part, by the art those events inspire. The First World War is covered very well in history textbooks and newsreels produced at the time, but another side of the conflict – a more intimate, personal side – is seen in the poetry of people like Siegfried Sassoon and Wilfred Owen. The poems that they wrote about their wartime experiences were not pure depictions of fact; they were written to both inform and entertain – and perhaps to inform through entertainment.

If we relegate the Iraq War to contemporary news broadcasts and documentaries by the likes of Michael Moore, we will miss something important, and so will future generations who want to look back and understand what happened. There are many works of fiction and non-fiction which attempt to show the big picture of what happened in Iraq, from the lies about “weapons of mass destruction” through to the use of banned weapons. Those works absolutely need to exist. But in a way, so does Six Days In Fallujah. It aims to depict, in as realistic a manner as game engines in 2021 will allow, one of America’s most controversial battles of recent decades – an event which will perhaps be seen in future as one of the American military’s darkest hours of the entire 21st Century, due to the alleged use of illegal white phosphorus.

Getting as many perspectives as possible across as broad an array of media as possible about such an important event seems worthwhile, at least to me. Six Days In Fallujah may ultimately turn out to depict the event poorly, or be a game plagued by technical issues. It might be flat-out crap. But it really does surprise me to hear serious commentators and critics suggest that it shouldn’t be made at all, perhaps because of their own biases and preconceptions about the war and the game’s possible depiction of it.

There is value in art, and if video games are ever to be taken seriously as artistic expression, we need to make sure we allow difficult and challenging works of art to exist in the medium. That doesn’t mean we support them or the messages they want to convey, but rather that we should wait and judge them on their merits once they’ve been made. As I said, Six Days In Fallujah may be a dud: an easily-forgotten piece of fluff not worth the energy of all this controversy. But maybe it will be a significant work that aids our understanding of the history of this battle, and the entire Iraq War.

It feels odd, as someone who lived through the Iraq War and all its controversy, to be considering it as an historical event, especially considering its continued relevance. I actually attended a huge anti-war march in London that took place a few weeks before British forces joined the US-led coalition and attacked Iraq. But the beginning of the Iraq War is now almost two decades in the past, and even as the world struggles with the aftermath of those events, we need to create works like Six Days In Fallujah if we’re ever to come to terms with what happened and begin to understand it. We also need to consider future generations – are we leaving them enough information and enough art to understand the mistakes our leaders made in 2003? If we don’t leave that legacy, we risk a future George W. Bush or Tony Blair making the same kind of mistake. I don’t know if Six Days In Fallujah will even be relevant to the conversation, but it’s incredibly important that we find out.

Six Days In Fallujah is the copyright of Highwire Games and Victura. This article contains the thoughts and opinions of one person only and is not intended to cause any offence.

On the subject of gaming addiction

This column deals with the sensitive topic of addiction, and may be uncomfortable for some readers.

In 2018 the World Health Organisation surprised and upset a number of fans of video games when it formally designated “gaming disorder” as a distinct clinical condition. The reaction was, sadly, predictable, and boiled down to some variant of the following argument: “I’m not addicted to video games! Therefore video games can’t possibly be addictive!” Many commentators and outlets that focus on video gaming piled on with complaints and criticism, and the result is that the subject is still controversial even today, almost two years on from the WHO’s initial decision.

I’m not a doctor or psychologist, but I wanted to take a moment to defend the decision to categorise gaming disorder/video game addiction as a separate condition, because I feel that too many people who don’t really understand the topic had a knee-jerk reaction to attack it. To them it felt like an attack on their hobby, and perhaps what we can glean from that is that the messaging surrounding the decision could have been better and clearer.

Firstly, the commentators who criticised the decision, even those who work for major publications, are universally not medical professionals. Their knowledge of the subject is limited at best, nonexistent at worst, and quite frankly, having a bunch of uninformed people criticising doctors over a medical decision is comparable to conspiracy theories like the anti-vaccine movement or flat-Earth belief. The people who made the decision to categorise video game addiction in this way are qualified to do so, and they will have made their decision on the basis of investigations and evidence, all of which has been peer-reviewed. The people who took offence at the decision simply aren’t on that level.

The biggest problem some people seemed to have is that the decision felt like an attack on gaming as a hobby. Many people have long derided games, dismissing them as children’s toys and even blaming gaming for criminal and violent acts, so I can understand why, to some people, this felt like just another attack in a long line. But it isn’t, because the designation of gaming disorder in no way says that all video games are a problem or that all gamers are addicts. The classification of alcoholism as a disease doesn’t mean that the vast majority of drinkers are alcoholics; no sensible person would even dream of making that argument. Alcoholism affects a small minority of drinkers, just as gaming disorder affects a small minority of gamers. And no one is trying to say otherwise.

Something that can become a problem for one person isn’t going to be a problem for everyone. Many gamers – by far the majority – play games in a sensible and responsible way, enjoying their hobby without allowing it to dominate their life. But some people will take it too far, and will allow it to take over, perhaps as an expression of other mental health issues but perhaps simply because they allowed it to get out of hand.

Choosing to classify gaming disorder as a separate and distinct condition means that more studies can be performed in the field and more information disseminated to psychiatrists and other healthcare professionals. The result is that people who do suffer will have access to better help – help more tailored to their specific problem. That can only be a good thing.

The bar for even suspecting that an individual has gaming disorder is actually quite high. The most important factor is that their gaming is having a detrimental effect on their life. This could manifest in many ways, which will vary from person to person.

When I was a student at university many years ago, I witnessed gaming disorder firsthand. I was living in a rented apartment which I shared with just one other person, and this person (who will of course remain nameless) became addicted to video games. The individual in question was, like me, an exchange student, which is how we met and how we came to share an apartment. He had friends back home who he liked to play games with, and this was around the time that online gaming was just taking off.

He would spend endless hours playing an online game, often late into the night, and over the span of a few weeks it began to have a huge impact on his life. He stopped attending classes, which saw him end up in a mess of trouble with the university as he failed every class that semester. His parents found out, which caused personal problems for him with his family, and his failure to pay rent – despite promising me he’d paid his share – almost wound up getting the pair of us evicted.

This was in addition to the weight he lost from not eating properly, the destroyed social relationships with other exchange students at the university, and the missed opportunities to have the once-in-a-lifetime experience of living in another country. Ever since then I’ve used his story as a warning, because his addiction to gaming had serious and lasting consequences.

There is a happy ending to this individual’s story, however, and that is that he did eventually get his life back on track and scale down his gaming. When we parted ways we didn’t keep in touch, so I can’t be certain he’s still living his best life, but as of the last time we were together it definitely seemed that he was moving in the right direction. It took an intervention from his family – who flew halfway around the world to see him after he failed all of his classes – and a twice-weekly therapy appointment to get him to that point, though.

Any time someone tells me that they know loads of people who play games who aren’t addicted, I tell them the story of my ex-roommate, and make the same point: “just because it hasn’t happened to you or someone you care about doesn’t mean it hasn’t happened to anyone.”

I hope that nobody tries to use the designation of gaming disorder to attack what is for most people a fun and innocent hobby. That would be counterproductive, and would lead to people who genuinely have issues with gaming addiction finding it harder to get help. But so far, that doesn’t seem to have happened. The designation is just that: a clinical classification designed to help that small minority of people who have a problem.

It’s worth noting that some games, especially in recent years, have gone out of their way to introduce potentially addictive elements to their gameplay. In particular we can look at lootboxes and randomised rewards, which in many games are little more than gambling – often using real-world money. There are frequent news stories, some of which end up in the mainstream media, of individuals who end up spending hundreds or thousands of pounds on these in-game “micro” transactions. In one case last year here in the UK, a child inadvertently spent his parents’ entire monthly wages in a game.

Putting a warning label of some kind on games that have in-game “micro” transactions is definitely a good idea, but in an era where physical sales of games in boxes (where such a label would be affixed) are in terminal decline, that probably won’t be good enough. And as I noted from my former roommate’s experience, which came long before such in-game transactions were commonplace, gaming addiction doesn’t only manifest in titles that have such systems in place.

We also have to be careful how we use the terminology of addiction – and of mental health in general, but that’s a separate point. When reading reviews of new titles, I often see the word “addictive” thrown around as if it were a positive thing: “this new game is incredibly addictive!” That kind of normalisation and misuse of the term can be problematic, as affected people may simply brush off their addiction by thinking that’s how everyone plays the game. I feel that writers have a certain responsibility to try to avoid this kind of language. Presenting addictiveness as a positive aspect could indirectly contribute to real harm. I’m sure I’ve made this mistake myself on occasion, but it’s something I hope to avoid in future.

Gaming addiction, like other addictions, is a complex problem that is not easily solved. It’s no easier for someone suffering from some form of gaming disorder to “just turn off the console” than it is for an alcoholic to “just stop drinking vodka”. The temptation is always present and it can be overwhelming. Anyone suggesting that it’s a simple case of “just stopping”, as if it were that easy, doesn’t know what they’re talking about. Again, it comes back to the point I made earlier: just because it might be that easy for you doesn’t mean it is that easy for everybody. One person’s subjective experience is not a complete worldview; many people find it impossible to break the cycle of addiction without help. This classification has the potential to make more specialised help available, which is the primary reason I support it.

So that’s my take on the subject. Gaming can be addictive, and for a small number of people, that addiction can cause real harm and create lasting problems for themselves and their families. Recognising this reality is a good first step if it means more research can be conducted into the subject, as that will hopefully lead to better and more effective treatments for people whose gaming addiction requires outside intervention. I’ve seen firsthand how this can happen, and I have absolutely no time for the argument that goes: “well I don’t have a problem with gaming addiction, so it must be fine for everyone!” That is a blinkered and selfish way to look at the subject.

For anyone reading this who thinks they may be affected by gaming disorder or video game addiction, I’ve prepared a quick checklist of questions you can ask yourself. If you find yourself answering “yes” to any of the points below, I would suggest you reach out to someone who can help – talking to a friend, family member, or someone you trust could be a great first step, and of course professional medical help is always available.

Question #1: Do you find yourself thinking about video games all the time, and planning ways to get back to your game as quickly as possible if interrupted?

Question #2: Have you missed important events – such as work, school, meetings, or other appointments – because you couldn’t tear yourself away from gaming?

Question #3: Do you find yourself unhappy, depressed, angry, or irritated while not gaming? And/or would you say that your happiness is inextricably tied to gaming?

Question #4: Have you ever lied about how much time you spend gaming to cover it up? And/or do you break rules or limits set by others on how much time you may spend gaming?

Question #5: Have you tried to spend less time gaming but failed?

Question #6: Do your friends, family members, or people close to you ever tell you that you spend too much time gaming? And/or do you feel that you have neglected your relationship(s) as a result of gaming?

Question #7: Do you forget to eat or skip meals because of gaming? Do you skip showering or fail to take care of basic hygiene and grooming because of gaming?

While not everyone who answers “yes” to the above questions will be an addict, these points do indicate that something may be amiss with your relationship with gaming.

At the end of the day, if you’re happy with your life and gaming is a hobby, that’s okay. If it isn’t causing any harm to yourself or other people, there is no problem. But for some people gaming can get to a point where it stops being a harmless bit of fun and becomes something more sinister: an addiction. Missing important events, skipping school, neglecting friends, skipping meals, skipping showers, etc. are all points which can indicate an individual’s relationship with gaming is becoming unhealthy, and if you recognise these signs in yourself, I encourage you to reach out and get help.

Yes, gaming disorder or gaming addiction is a real phenomenon. The World Health Organisation did not invent it, all they have done is classify it and formally recognise what many people have known for a long time – that it is real. Far from being an attack on gaming as a hobby, this should be seen as a positive thing, as it has the potential to help affected individuals get better and more appropriate help.

This article contains the thoughts and opinions of one person only and is not intended to cause any offence.