Wednesday, December 17, 2014

Why Console Gaming is Dead

The writing is on the wall ... television-based video game units are already anachronisms.


Friends. Family. The love of Christ. For many people, these sorts of things represent the holiday season. For me, however, Christmas has always meant but one thing: new console time, bitches.

As a kid, I looked forward to Dec. 25 all year long, because it meant, in some form or fashion, I was going to get my grubby little paws on some kind of newfangled gaming machine. Whether it was a plug-and-play device from Nintendo, Sega, Atari or Sony or a poorly received handheld from SNK, I knew my day was going to be filled with all kinds of new and improved interactive experiences. Indeed, it’s the type of wonder and awe and sheer consumer bliss that makes us yearn for yesteryear, gloriously blind to the fact that everything around our Sega Saturns and Game Boy Colors at the time pretty much sucked.

To me, console gaming hit its zenith in the 128-bit era. With the Dreamcast and Xbox, we had pretty much gone as far as we could with our gaming devices in terms of graphical horsepower. The units were more advanced than even the most high-tech arcade cabinets, which pretty much spelled the death of that quarter-fueled industry. With online play and DVD functionality, our units had become true multimedia devices, which, in a way, marked the end of console gaming altogether.

The last console cycle -- the one with the 360, PS3 and Wii -- was ultimately the start of a slow, terminal decline for TV-based gaming. With a greater emphasis on Internet-based multifunctionality, the hardware itself became less about gaming than about peripheral interactions; hell, from 2010 onward, the Wii might as well have been called “the Nintendo Netflix.”

With the costs of game development skyrocketing -- in the face of a global recession, no less -- a lot of the great third-party developers either went extinct or got gobbled up, and even the biggest heavy hitters of the gaming market decided to eschew innovation for proven capital generators. In short, that left us with fewer game developers, fewer game publishers and, really, a lot fewer games worth a good goddamn. And then the smartphone and tablet gaming market exploded, with casual gamers turning away from mass-marketed console and handheld units to get their “Angry Birds” and “Minecraft” fix via Android. Sure, indie developers could still put their little microbrew games on Xbox Live and the PlayStation Store and whatnot, but when you’re selling software at $5 a pop instead of $60 … well, it’s pretty easy to see how that’s a less-than-enviable prospect for would-be investors.

It’s a perfect storm of shit for the console manufacturers. From a marketing standpoint, how do you convince people to spend $400 on a new gaming rig when you can play “Farmville” and “Clash of Clans” on the iPhone you have superglued to your hand at all times?

Back in the 1990s, the hardware selling point was always the proprietary software. If you wanted to play Mario and Zelda, you got an SNES, and if you wanted to play Sonic and “Mortal Kombat” with the blood in it, you bought a Genesis. That’s why so many hardware manufacturers faltered in the heyday of “NBA Jam” and “Street Fighter II” -- with Sega and Nintendo, you knew you were getting a certain, exclusive brand quality. Meanwhile, who in the fuck knew what you were going to get with something called an “Atari Jaguar” or a “CD-i” or a “3DO”?

All of the would-be Nintendo and Sega usurpers faltered because they had nothing to market to gamers other than the technology itself. The same fate would have befallen the PlayStation, had it not been for Nintendo and Sega’s own inability to woo the masses with 3D technologies; Sony won the next two console generations simply because it was able to equate its hardware brand with versatile software quality, and all it had to do to shake consumer confidence in its competitors was point out how they had tried to sell you tech sans applications just one model earlier.

Which brings us to today, and the PS4/XBONE/Wii U era. Looking at the Metacritic scores for each hardware unit, I noticed something fairly surprising -- a near-total lack of proprietary brand differentiators. Yeah, Nintendo still has its “Mario Kart” and “Smash Bros.,” but its top ten ranked games for the calendar year consist primarily of not just multi-platform games, but multi-device titles: offerings like “Guacamelee!” and “Child of Light” and “SteamWorld Dig.” There’s even less proprietary software diversity on the Sony and Microsoft units: the Xbox One is glutted with PC ports like “Dragon Age” and “Minecraft,” while the top three highest-ranking PS4 games of 2014 are actually re-releases of games that came out in 2013.

So, uh, remind me: what’s the appeal for consumers in shelling out all that dough to play games they can probably already play on their home computers or iPads? Almost brazenly ignorant of the hardware failings of the past, the big three today are once again trotting out “technology for technology’s sake” as their central argument for user adoption.

Thankfully, the market seems to have gotten over the “peripheral madness” kickstarted by the Wiimote and “Guitar Hero.” The fact that Nintendo released a version of the 3DS sans any kind of 3D tech pretty much tells you that gamers’ infatuation with novelty controllers has gone the way of Mitt Romney’s presidential ambitions. Alas, while Microsoft and Sony have lost tons of money on their motion-sensing add-ons, they no longer seem sure what to market to consumer audiences; and outside of bilking parents into paying for its Skylanders-style amiibo figures, Nintendo ain’t doing much of shit with its little tablet-connectivity set-up, neither.

Call me crazy, but I don’t think anyone in their right mind is going to buy “being able to play ‘Super Metroid’ or ‘Shovel Knight’ in HD” as a reason to plunk down an entire week’s paycheck on today’s uber-super-duper home units. Sure, there are “new” games being released, but most of them are perennial updates (Madden, FIFA, etc.), rehashes (Mario this, Halo that, Call of Duty the other, etc.) or brown-and-grey war shooters that, to me at least, are utterly indistinguishable from one another. Seriously, if you showed me screenshots of “Destiny” and “Titanfall” side by side, I’d have no idea which was which.

So outside of the gratuitous ADHD exploitation of middle schoolers (hey, a new Pokemon game!) and high-school mass-shooter fantasy fodder, gamers today are left with precious little new to experience on their hardware. And don’t even think about trying to sell me on that “Walking Dead” and “The Last of Us” serious-conscientious-gaming-as-philosophy nonsense; if I wanted to mull the peculiarities of man and the inherent sadness of existence, I’d read some Popper or Sartre, not play “Bioshock” and try to wring some sort of political-statement gobbledygook out of fighting Scooby Doo monsters.

Kids today have migrated to phone- and tablet-based games for a reason: the simplicity and instant gratification. No load times, no bullshit self-congratulatory, highfalutin plotlines trying to mimic summertime box-office fare and, best of all, no television units required. All you need is steady Wi-Fi, a full battery and blood circulating to your thumbs -- who has the patience nowadays to find a television remote, anyway?

Call it perception bias if you must, but there’s simply no way I can imagine any kid in 2014 having as much fun with his or her Wii U or PS4 as I did when I got my Sega Genesis in 1993. All of the applications and online ranked-match modes in the world don’t compare with the fundamental joy of being sucked into the experiential zone of new-and-improved software. The jump from “Super Mario Bros. 3” to “Sonic 2” wasn’t just a graphical leap, but an entire sensorial step forward; sadly, I doubt many youths feel anything similar when upgrading from “Forza Horizon” to “Forza Horizon 2.”

That, my friends, is the real appeal of console gaming -- the ability to pull the gamer into a complete state of concentration and focus for hours on end. Shit, I vividly recall playing game after game of “NHLPA Hockey ‘93,” with the only worldly influence being the Dr. Dreadful gummy treats I was chewing on for sustenance. Today’s hardware is built for multifunctional use, so you’re never really absorbed into the game; you’ve got online messages popping up while you’re playing and all sorts of shit going on with your controller, plus all the additional stuff going on with your phone and your headset and all that jazz. Instead of plunging headfirst into the software world, you’re just plugging yourself into various hardware features, with the software serving as a central hub for your technological interaction.

Simply put, that’s why gamers are abandoning “Grand Theft Auto” for “Fruit Ninja.” The same way all of the old-school classics demanded your undivided, obsessive focus to reach a new high score, these endless runners are physically and mentally challenging consumers, which is something console games really haven’t done since the 3D transition of the mid-1990s.

It’s not that today’s games are too complex; it’s that they’re needlessly, ostentatiously complex. Put “Diablo III” in front of me and I’d get bored with its huge-ass game world in five minutes, but if I encountered an old “Raiden” cabinet, I’d probably play that sumbitch for a solid hour.

Truthfully, console games haven’t really evolved since “Super Mario 64,” when the technology first allowed for a sense of depth, immersion and genuine exploration that simply couldn’t be replicated in 2D space. Yeah, we’ve made prettier and deeper game worlds, but we’re still doing the same shit in “Skyrim” that we were doing in “Shenmue” and “Mega Man Legends.” After fiddling with analog nubs and running around trees for 20 years, perhaps it’s not all that surprising that hardcore gamers want to return to the twitchy, linear madness of “Gunstar Heroes” and “Soldier Blade.”

The problem is, you don’t need brand-new, super-expensive hardware to create that kind of gaming experience. Shit, my laptop is utter garbage, but it can still flawlessly play just about every neo-old-school game that’s received praise over the last few years. Outside of mildly more refined graphics and smoother online deathmatches, there’s really nothing the PS4 offers over the PS3, other than less variety and higher consumer costs. With the Wii U, you’re just getting slight retoolings of the games you’ve played a million times before, except now with a battery-devouring tablet that makes games less engaging than they were on the Gamecube. And the following are the reasons why the Xbox One is a worthy investment over an Xbox 360: none.

We’re a generation of hyper-mobile consumers who prefer our multimedia in the cloud; you really, really have to give us a good reason to plunk our asses down in front of a TV screen for more than twenty minutes.

With nothing but bland, warmed-over retreads and experiences ported from more accessible platforms, the appeal of console ownership is nothing like it used to be. Multimedia purchasers today prefer to downsize their entertainment, with a growing demand for completely atomized content, no less; next to that, these hulking, wire-tethered monstrosities in our living rooms, with their oh-so-passé disc trays and power supplies the size of stereo speakers, are instantly outmoded.

Console gaming, as a pastime, isn’t on the verge of becoming obsolete -- it already is. Sure, sure, there are some purists out there who cling to their “Street Fighters” and their “Gears of War,” but their ranks are ever-diminishing compared to all of the new “Dota 2” and “Flappy Bird” recruits.

With an ever-expanding mobile base, there’s nowhere for the console market to go except downward; and considering the general consumer ennui toward these latest home units, I think we may have just hit terminal velocity on the freefall into antiquity, folks.
