Enthusiasts look at gaming as a sort of country. Firing up an old game and finding no players is like walking through an abandoned city that was once filled with celebration, cool explosions and tits. Other forms of media that could also be symbolized as countries change over time through simple cultural shifts or changes of leadership. Gaming, on the other hand, is a country that changes only on the off chance that the landscape gets devastated by a giant meteorite. The question is, will the meteorite invasion ship be filled with evil rabid wizard ninjas who want to decapitate everything, or friendly viking lumberjacks who sing show tunes?
Stupid metaphors aside, gaming goes through drastic shifts much faster than most media, in both good and bad ways. Today I'm discussing the bad ones specifically. Everyone has their own idea about the worst influences and changes in gaming history, but what if you had to pick just a handful? This series covers what I consider to be the five worst things to happen to gaming (in no particular order).
How exploration died and was resurrected
If your definition of exploration is straying off the main path to collect some kind of intel for an achievement, you don’t understand the concept.
There was a period of gaming that started around 2007 and peaked in 2009 when you had to swim through an ocean of linear shooters (mostly sparked by the major success of Call of Duty 4) in order to find anything else within certain genres. This was likely the result of developers shifting their emphasis from gameplay to graphics when the PS3 and Xbox 360 showed up. Gamers were content with this model of gameplay for a while, but once enough titles came out that didn't take any risks, the shooter genre began to feel less like precision-based action and more like the spiritual sequel to Rambo 4. This was a vast departure from the origins of shooters, which had you seeking out hidden rooms and secrets everywhere just to find extra health or loot. Shooters were just the center of the issue, though.
Depending on the genre you look at, exploration is either the most important part of a game or the least. Nonetheless, it serves an enormous purpose for many, and the mysterious or ambiguous parts of a game are what allow gamers to project their imagination into it. Take away that opportunity and all you have is a novelty spectacle. Exploring your surroundings without being told what to expect is key to a good adventure game, or any game that features adventuring elements. Gaming as a medium has been slowly forgetting the importance of the unknown. The gradual decline of the adventure game genre is testament to that. We would never see games built entirely around exploration like Myst or The Secret of Monkey Island these days unless they were created by an indie developer with a tiny budget.
What likely shook gamers out of their corridor comas was the advent of Minecraft.
Obviously it wasn't the greatest game of all time, but it didn't have to be to influence gamers this way. All it provided was something the assembly line of clockwork achievement-hunting games didn't. It reintroduced the elements of curiosity, discovery and danger, which helped drive its major success. It seemed so different compared to traditional games that gamers flocked to it like bears on steak. A new game with a mainstream audience that let you go wherever you wanted in a randomly generated world? The concept was considered ancient or unheard of by the internet.
Exploration hasn't returned to being the dominant design factor it was back in the day, but it's definitely rising in popularity among both developers and gamers, old and new. Then again, you could say it never really fell in popularity with them. The blame lies with the publishers in control, who never wanted to take risks.