Tuesday, June 14, 2005

The HD Era: Not Entirely Bogus?

When J Allard gave the Microsoft keynote at GDC this year, he spoke about the HD Era, which is basically Microsoft's strategy for the next generation of consoles. Though he defined "HD" as covering more than just the "better" visuals of HDTV, the hi-def future certainly seemed to start with more resolution, greater color range, and the ever-popular widescreen. Microsoft put its money where its mouth is and gave away 1,000 Samsung HDTVs at the end of the talk, which, in my mind, was probably a worthwhile endeavor.

You see, I happened to win one of those TVs. I wouldn't bring it up except that, as a PR move, it seems to have worked. I've finally got my TV set up with PS2 and Xbox running through component cables, and all my audio is piped by fiber optic to my 5.1 surround system. I'm not an A/V nut by any stretch of the imagination, so experiencing this sort of rig is pretty cool for me.

The thing is, I probably would have been one of the last people to get an HDTV. I'm one of the naysayers who always figured that the difference couldn't be that big a deal. How much better could HD really look than a regular TV? And given the durability of old-school TVs, I wouldn't have had to replace mine for a good long time. In essence, I'm exactly the kind of person Microsoft is trying to convince.

And, to be honest, I'm pretty impressed. Playing God of War in progressive scan widescreen on a big, bright LCD TV is pretty cool.

However, there are problems with the HD Era strategy. The main one is that, in some ways, HDTV is a fairly important evolutionary step for the medium of video. It's all still video, but as Clive Thompson points out, hi-def gives us a lot more detail, so much so that some live-action stars actually look worse as a result: "Watching a show in high definition is thus rather like being Gulliver in the land of Brobdingnag -- where every pore on the giants' faces looms like a shell-blasted crater."

What does that mean for videogames? Well, as many have pointed out, art assets are going to have to be of much higher fidelity. I tried playing Halo 2 on the new HDTV, and although it was much crisper, it also looked a lot chunkier. All the imperfections that were hidden on a low-def TV became much more evident. More detailed art not only means more work (and more money) but also pushes game visuals farther into Uncanny Valley territory. The animation will need to be much more naturalistic, and behavior will have to have a certain level of plausibility to match.

Playing Breakdown last night, I was struck by the inability of my NPC partner to maintain eye contact. She would deliver a line, stop, turn to face me, and then deliver another line. Often she seemed to be looking over my shoulder or off to one side. Sometimes this was OK; she looked like she was gazing off in thought. Other times it was irritatingly distracting, because it was obvious she was supposed to be looking me in the eye. These sorts of situations will be even more evident in next-gen games as the characters look more and more real.
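
(For what it's worth, the basic machinery here isn't exotic. Here's a rough sketch of the kind of look-at logic I mean -- the names and numbers are invented for illustration, not anything from Breakdown itself: while a line is playing, ease the NPC's head yaw toward the player's eye position each frame, clamped to a plausible neck range, instead of snapping to a canned facing between lines. Making it actually look natural is, of course, the hard part.)

```cpp
// Hypothetical sketch of a per-frame "look-at" update for an NPC's head yaw.
#include <cmath>

struct Vec3 { float x, y, z; };

static const float kPi = 3.14159265f;

// Wrap an angle into [-pi, pi] so blending always takes the short way around.
static float WrapAngle(float a) {
    while (a >  kPi) a -= 2.0f * kPi;
    while (a < -kPi) a += 2.0f * kPi;
    return a;
}

// Yaw (radians) the NPC would need in order to face the target position.
static float YawToTarget(const Vec3& npcPos, const Vec3& targetPos) {
    return std::atan2(targetPos.x - npcPos.x, targetPos.z - npcPos.z);
}

// Returns the new head yaw for this frame.
float UpdateGazeYaw(float headYaw, float bodyYaw,
                    const Vec3& npcPos, const Vec3& playerEyePos,
                    float maxNeckYaw,   // e.g. ~70 degrees, in radians
                    float turnRate,     // radians per second
                    float dt)           // frame time in seconds
{
    // Where the head wants to point, expressed relative to the body...
    float desiredRel = WrapAngle(YawToTarget(npcPos, playerEyePos) - bodyYaw);

    // ...but never twisted past what a neck can plausibly do.
    if (desiredRel >  maxNeckYaw) desiredRel =  maxNeckYaw;
    if (desiredRel < -maxNeckYaw) desiredRel = -maxNeckYaw;
    float desiredYaw = bodyYaw + desiredRel;

    // Ease toward the desired yaw at a capped rate so the motion reads as
    // a natural glance rather than a robotic snap.
    float delta = WrapAngle(desiredYaw - headYaw);
    float step  = turnRate * dt;
    if (delta >  step) delta =  step;
    if (delta < -step) delta = -step;
    return headYaw + delta;
}
```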

Jurie over at Intelligent Artifice thinks that this "will be the first generation shift where the diminishing returns [of better graphics] should become pretty obvious to everyone." The shift to 3D was a huge jump, but the shift to hi-def won't have the same impact, especially in terms of gameplay. I'm hoping that the better visual quality will force developers to confront issues of behavior. Work in animation and AI could certainly have a huge effect on gameplay. So perhaps HDTV could have an indirect impact on game design.

So what about the HD Era? I don't think it's all it's cracked up to be, but I also don't think it's completely without merit. The pressures that HD will put on developers will be enormous, but at the same time, developers who rise to the challenge may innovate in unexpected ways. HD may turn out to be a revolution, but if it does, it will probably be a different kind of revolution than most people expected. We'll just have to see.

1 Comment:

At 2:51 PM, Blogger Clubberjack said...

Foopy, you're absolutely right about PC games having many of the qualities of HD era console games. Of course, in some ways the PC has been a platform that supported the sorts of innovation I'm talking about, Doom 3 notwithstanding. For instance, The Sims, a game centered on behavior, originated on the PC. Half-Life 2 raised the graphical bar but also did a great job of getting the fidelity of the behavior and animation to a similar level. The PC seems to me to have a much wider array of gameplay genres (strategy, political, etc., in addition to shooters and action games), although that's probably due to interface more than anything else.

The other factor is the audience. I think the console audience (i.e., the more casual, mass-market types) will be more easily turned off by games whose character animation and behavior don't match up with the visual quality of the HD image. My guess is that a casual gamer will more quickly put down a game that doesn't live up to expectations. This will definitely drive up production costs (and hopefully production values as well), but it may also result in innovations in AI and animation. At least, that's my hope.
