An Analytical Discussion of the Industry, Culture, Progress and Nature of Video Games.
Do Graphics Matter?
Nicholas G. Carr, in the May 2003 edition of the Harvard Business Review, published an essay entitled "IT Doesn't Matter." The main thesis of the essay, in my weak understanding, was that the constant battle most companies and firms were engaged in to have the best, fastest, and most efficient technology, hardware, and software was, due to the nature of emerging infrastructures, becoming entirely frivolous. Tech, in his view, had evolved from being a business tool, used to gain an advantage over competitors, into a matter of basic business expense. Tech started as a godsend, but would end up like railroads, electricity, or phone lines: necessary to do business, and a disaster when they go down. Consider this: in the early nineties, Ford or Sara Lee having a website meant that they, at least in theory, had one up on their competition, much as networked office computers, high-speed internet, or the latest and speediest software do right now. But, says Mr. Carr, if the pattern holds, at some point everyone will have these tools and everyone will be using them, essentially providing the advantage to no one.
"IT Doesn't Matter" was, as you can imagine, controversial, and many articles were written in response supporting his thesis, debunking it, decrying it, rebunking it, etc. In short, it stirred up a great deal of discussion. I myself largely agree with this idea; everyone and everything has a website now, and if you or whatever you represent doesn't have some sort of tech aspect or representation you're basically a dinosaur. But, I introduce this concept for an entirely different reason; so that I may draw a parallel within the game industry, or more specifically within video games themselves. As we have progressed up the mountain of technological advancement in video games we have reached a certain number of plateaus, which have provided us with, to continue a metaphor, spectacular views. Sprites in systems up to the NES were very basic formations of colored blocks which, in theory, resembled the fantastic monsters found in the instruction manuals. Octorocs in the first Zelda are particularly apt; the thrashing monster turned into a simple square squid on screen. The capabilities of a system like the Super Nintendo fleshed out that kind of gameplay nicely; compare the old Zelda with the new one and you'll see facial expression, shading, depth, and with Link climbing ladders and falling off cliffs, it even had the illusion of 3-D.
But on that same Super Nintendo system came Star Fox, hailed as the most advanced and first real 3-D game for . . . whatever. While it was fun to play, it also looked like crap, with the simplest polygons possible and virtually texture-free environments. Basically we were back to early Nintendo graphics, except with a third dimension. Essentially Super Mario with a lot of clipping. This was one of the early, awkward steps in bringing realism to gaming.
Graphics in video games have been inexorably lurching toward realism for the past twenty or so years. (Note that by realism I mean a game's ability to visually depict a setting in an "accurate" fashion, not to recreate dull real life in a game. The guns and men and monsters in Gears of War look fantastical, but they also look real.) Unfortunately, no one seems to realize that realism is a finite and relatively empty goal. Every jump made in graphics reduces the distance between what we have now and true realism, but every jump also gets smaller. The jump in realism between the original Final Fantasy VII and Final Fantasy X was greater than that between FFX and any later entry in the series, or even the vaunted realism of a game like Crysis. Admittedly, those comparisons span console generations, but over six generations the general trend has been a movement toward realism that began as a mad rush and has now slowed to a crawl. So what is the industry going to do when it reaches that definite point beyond which it can't go any further? Well, as far as I can tell, it will either grind to a halt, or be freed like no other advancement has freed it before.
Consider Mr. Carr's argument: once everyone had tech (or electricity or phone lines or any other infrastructure), it wasn't a matter of advantage anymore, so logically businesses had to look to other areas to gain an edge. In the games industry, the same thing will have to happen, because with graphics there won't be anywhere else to go. Graphics tools can now be wielded the way a master painter wields a brush, with no difference between what the imagination sees and what the tools can create. Hence Nintendo's continuous insistence, and the agreement of a number of critics, that their new system doesn't have the epic graphics power of the 360 or PS3, but that, by this point, it doesn't matter, so they decided to go for the wand controller. By the end of the new generation's life, with maxed-out graphical realism pushed even further by the consoles' age and game makers' experience with the hardware, this will probably be apparent to all but the graphics whores. So then, finally, we may be able to pay attention to something more substantial. Like the way we play games.
One perk this week was the arrival of Electronic Gaming Monthly in my mailbox. In it, along with the usual snarky photo captions and transcribed interviews, was a feature story on Call of Duty 3. This latest installment in the series follows Call of Duty 2, the best-selling Xbox 360 game (75% of all Xbox 360 owners have a copy of COD2, according to the article), and retreads the same WWII first-person-shooter action that's been done over and over again. What's different this time, according to the article: better graphics. Foliage that flattens when you walk on it. Smoke that billows when you drive through it. High-res textures and shaders.
This is it? This is “next-gen”? I’m going to pay $600 for this?
Hopefully not. Hopefully, somewhere in California, people are ingesting Carr’s article and are coming to the same conclusion as Aaron: There needs to be something more than graphics to keep people interested. There needs to be attention paid to things called “gameplay” and “artistic design”. The developers of COD3 maintain that the tweaks are more than graphical. I’m skeptical.
Some companies have paid attention to these facets of game development for years, while others seem to concentrate on one to the exclusion of the other. Still other developers are content to push polygons and textures, paste them onto a 21-year-old game, and call it a day. In the days of the SNES and Genesis, this behavior was appropriate. Aaron makes a good point with Star Fox – which, admittedly, sold boatloads.
But it’s over. Developers, you can’t impress us with graphics anymore. People are just as excited about Super Smash Brothers Brawl (in which some characters get little to no graphical improvement over the Gamecube version) as they are about games like Mass Effect, Bioshock, and Gears of War. Why? Artistic style and gameplay.
Still don’t think style and gameplay matter? Then I steer you toward the topic about which I was originally going to post: The Quest for the Rest, Samorost, and Samorost 2. The graphics are fine, but the artistic style, music, and simple adventure-style gameplay set these games apart from much of the Flash-generated junk on the interweb. They are also more memorable than many next-gen experiences I’ve had.
I’m particularly fond of Samorost 2, in which the main character endeavors to reunite with his dog.
April 1, 1993 - June 13, 2006
Copyright © 2006 Stev Weidlich and Aaron Weiner