Wednesday, August 18, 2010

I was 1337. But then they changed what 1337 was.

I picked up StarCraft II a few weeks ago, and am leisurely making my way through the single-player campaign. I am in no hurry to play people online, though. I’m a decent RTS player, but even if I were great at this, past experience with playing StarCraft online has taught me that I’ll still spend most of my time frantically trying to churn out units – anything at all – while some Korean kid is leisurely humping my leg with a few Hydralisks. I’m perfectly happy with my illusory bubble of PC game proficiency, especially considering that this is the first PC game I’ve spent any time with since… well, since the last Blizzard game.

I feel increasingly guilty about that, for some reason. I cut my gaming teeth on the Atari 2600 and the NES, but for many years, I was solely a PC gamer. At some point, though, I strayed off that path. The benefits of playing on specific types of hardware got lost somewhere in the avalanche of legal tender necessary to reap those benefits.

Take shooters, for instance. I play a lot of Team Fortress 2, and while I’m not the best player around, I can hold my own. Ask the guy I played with last weekend. He started bugging me on mic about how bad of a sniper I was until the teams were eventually reassigned, whereupon I gave him repeated opportunities to reassess his opinion, until he finally ragequit after the sixth or seventh headshot. The thing is, I play it on Xbox 360. I would get chewed up and spit out by people playing on the PC, because a mouse is simply better for twitch than a controller. Not to mention the fact that the PC players get content updates and I don’t. Grumble mumble I never get new hats grumble.

These are things I already know. I don’t need to be lectured by some seventeen-year-old ball of forehead grease who thinks he’s the first to figure that out. I remember the Wolfenstein 3D shareware, son. I made Doom WADs. I am not impressed by how much Modern Warfare 2 has taught you about the first-person shooter genre.

Back then, though, the effort needed to get these games to run was spent in DOS, not on the motherboard. I remember saving up to buy an 8MB RAM SIMM (I wanted to upgrade my 486DX rig. Sixteen whole megs of memory! I could rule the world with that kind of computing power!), but that was the extent of the hardware barriers I had to cross. Any time a game wouldn’t run, I would spend hours in DOS tweaking settings and rewriting config files, which became almost as fun as the game itself. Especially when the payoff came, because I’ll tell you something: I could get that bastard to run nine times out of ten.
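For anyone who never lived through it, the ritual usually meant squeezing every last kilobyte of conventional memory out of CONFIG.SYS. A typical tweak looked something like this (an illustrative sketch of the era's memory-manager incantations, not any particular game's fix):

```
REM Load the memory managers so DOS and drivers can move above 640K
DEVICE=C:\DOS\HIMEM.SYS
DEVICE=C:\DOS\EMM386.EXE NOEMS
DOS=HIGH,UMB
REM Push drivers into upper memory blocks instead of conventional memory
DEVICEHIGH=C:\DOS\SETVER.EXE
FILES=30
BUFFERS=20
```

The endless back-and-forth was the hobby: this game wants EMS, so swap NOEMS for RAM; that one chokes on EMM386 entirely, so build it a boot disk. Freeing another 20K of conventional memory was often the difference between a game running and a cryptic crash back to the C:\> prompt.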

Around the time Quake was released and my beloved adventure games started to die out, though, something changed. That stuff didn’t work anymore. DOS went away and Windows, an operating system I hated from day one, took its place. Game files became more cryptic and harder to access. And the solutions for running games became increasingly homogeneous: spend a couple hundred dollars on a new video card, because holy shit, water effects! Pfft.

It eventually got to the point where I had built a brand new computer out of top-of-the-line components that wouldn’t run a new game I wanted, because it needed the newest version of some Radeon card that had been released a month prior. Screw that business. I started playing Ocarina of Time on the Nintendo 64, instead.

Three prebuilt PCs later, I’ve never really looked back. I like my consoles. The games I want to play will always work on them, and work the way that the developers intended. Considering that I have a full-time job, a kid on the way, and a mortgage to pay, that kind of security is very attractive to me. Never mind the extra money spent on upgrading every few months; I don’t have that kind of frigging time to invest in my entertainment.

But I do miss the superiority of PC gaming. I sank months into Diablo II. I’ve dipped my toes back in with Warcraft III and Guild Wars. I figured that StarCraft would be a safe bet, and worth the potential headache. My laptop is fairly beefy, and I assumed that I wouldn’t have any problems at all unless I tried to run it at top settings. Naturally, I realized my folly when I played it for the first time, and the game politely suggested that I turn all of the graphics settings down to the lowest possible level, with a barely concealed snicker at the sheer foolishness of trying to run it on an integrated video card. Even after an hour or two of experimenting with the settings and twiddling with the config file (old habits die hard), I managed to get it both playable and somewhat attractive, but it still occasionally asserts that I would be much happier in the newb section, where the blue squares fight the purple squares and I won’t have to worry my silly little head over things like “textures” and “shadows.”

I completely expected this, and I can still play around on the backend until it works. And it does work, which is a marked improvement from the way things used to be, where a game would simply not install or run if it didn’t meet specs, didn’t have twine and chewing gum applied to the config file, and/or didn’t have a boot disk to ensure that it didn’t have to compete with any other programs. Even though I can’t get advanced creep lighting (whatever the hell that means), I can get to the core gameplay. And that’s the important part. This game doesn’t disappoint; it has everything that made the first game so addictive, along with the nice storytelling and even some of the gimmicky stuff that made Warcraft III so fun (though, truthfully, I’ve always preferred Warcraft to StarCraft). I’m having a blast, even though I apparently don’t get the eye candy that the “preferred” player does.

No, that’s not what bothers me. What bothers me is the fact that the default solution for the gamer in the know is now “buy more crap” instead of “learn how to tweak your system.” I mean, you can still play around with the software. I did. But there doesn’t seem to be much pride in that anymore, because it’s nowhere near as important for being a top-notch PC gamer as having a perpetually updated rig, on which you have spent over four thousand dollars of your Applebee’s paychecks. It’s the curse of technological advancement, I suppose. I remember a time when people would case-mod and overclock their computers just to show off, kind of like the mooks who put a purple light kit and rims on a family sedan. These days, though, it’s like you need to invest in and install a better engine every time you want to drive to the store.

I just don’t want to do that anymore. I’ll be getting Diablo III and Guild Wars II at full retail price, but will probably have to settle for dumbed-down graphics on those, too. In the meantime, I’ll still be playing console games with no extra assembly required.

Even so, there’s a part of me that really wants to make sure my laptop can run anything. Or to go all Original Digital Gangster and build and mod my own rig, instead of settling for a prebuilt machine that will always be underpowered somehow. I like to joke about how sixteen-year-old grunge kid me would snarl at me now for living in the suburbs and listening to NPR; it’s a safe assumption that thirteen-year-old me, who once spoofed his way into an online gaming network for free, would have a similar reaction to settling for grandma’s computer and playing with the Madden meatheads’ game toys.

Too bad I pay the bills, and he doesn’t.
