Well, it’s the end of the year, which means all of the major gaming outlets are picking their games of the year in every category, plus one overall winner. One game that came out of nowhere this year is The Walking Dead, which has prompted a healthy debate about gameplay, storytelling, and what separates games from other forms of media.
All of this raises an important prior question: what makes a game a game, and what makes a game good? To answer both questions, it is necessary to look back to the early days of gaming.
Part One: In the Beginning…
Way back in the day, video games were a lot simpler than they are now. Things like story and graphics were limited by the technology of the time, and as a result, games were a fun hobby, but nowhere near the massive sensation they are today. This also meant that a game was defined almost solely by its gameplay: a good game was fun and easy to learn. When controllers consisted of a single joystick and maybe a button or two, programmers’ options were limited in terms of variety and depth.
The 1980s and the era of the arcade saw a lot of evolution in the gaming industry. The most successful game was the one everyone was telling their friends about, which benefited games with a simple premise and a memorable character. Pac-Man and Donkey Kong were hits because they not only had satisfying gameplay, but also introduced memorable heroes and villains. While games like Galaga and Frogger found success through satisfying gameplay, their franchises failed to stand the test of time because gameplay innovation couldn’t keep up with consumer demand for fresh content.
The transition to console gaming allowed unprecedented growth in the gaming industry, despite massive hurdles. Cartridges had very limited storage space (who even remembers what a megabit is anymore?), which limited the scope of the games that could be developed. As a result, games were often quite a bit harder in an attempt to deliver greater value. After all, who could justify spending $50 of 1990 money on a game that you’d be done with in a couple of hours? These systems also had limited musical output, only capable of producing a handful of unique notes.
Despite these technical limitations, the reintroduction of video games into the household propelled the industry to new heights. Unlike at an arcade, where you played until you ran out of lives (or quarters), console games could be played for as long as you wanted, allowing the scope and style of games to change dramatically. Games like The Legend of Zelda or Final Fantasy could never have been arcade hits because they featured complicated controls and took many hours to complete, but they paved the way for complex storytelling in games. With the introduction of the ability to save your progress, players could engage in dozens of hours’ worth of gameplay without having to stay glued to their TVs.
Over the next decade, game developers would come to see the potential for this reinvigorated medium to serve not only as a fun pastime, but as a new avenue for artistic expression and storytelling…
Part Two: From Games to Experiences
For a while, there were two types of games: those with little or no story, and RPGs. But as gaming evolved, more involved storytelling became necessary to give actions context and to make different titles distinct. Back in the day, so many games could get away with the “princess in danger, go rescue her” formula, and it worked because games were simple enough that your actions didn’t need much explanation.
Advances in technology also paved the way for more innovative story elements. Part of the reason in-depth stories were typically reserved for RPGs was that there was no mechanism to deliver story information to the player other than text.