This is something I've complained about for years, but it seems the big players are only just realising it. It won't stop, though, as I'm sure AMD is bending over backwards trying to win contracts for next-generation console GPUs/APUs - and then we wonder why PC gaming is suffering... on with the story! AMD's worldwide developer relations manager, Richard Huddy, blames Microsoft, and more precisely DirectX, for the lack of great-looking games on PC.
It's funny, we often have at least ten times as much horsepower as an Xbox 360 or a PS3 in a high-end graphics card, yet it's very clear that the games don't look ten times as good. To a significant extent, that's because, one way or another, for good reasons and bad - mostly good, DirectX is getting in the way.
Huddy sounds like he's pissed off, and rightly so. There is far more horsepower in a single mid-range GPU than in the current-gen consoles, yet the graphics we get on PC are barely any better. Should that mean we pay less for our games? PCs get higher resolutions and AA/AF options, but at the end of the day we're simply not seeing the leaps and bounds we saw in the 90s, when the PC was the lead platform for gaming innovation.
Another quote from Huddy:
I certainly hear this in my conversations with games developers, and I guess it was actually the primary appeal of Larrabee to developers - not the hardware, which was hot and slow and unimpressive, but the software - being able to have total control over the machine, which is what the very best games developers want. By giving you access to the hardware at the very low level, you give games developers a chance to innovate, and that's going to put pressure on Microsoft - no doubt at all. Wrapping it up in a software layer gives you safety and security but it unfortunately tends to rob you of quite a lot of the performance, and most importantly it robs you of the opportunity to innovate.