WHERE TO FROM HERE?
Battlefield 3 marked a change in the game development landscape - it was built for PC first and then ported down to consoles. It was built on a foundation of next-generation graphics capabilities, all wrapped up with a pretty DirectX 11 bow. Halfway through development DICE switched to consoles as the lead platform, and yet we still ended up with a great-looking game that scales surprisingly well on high-end, multi-GPU setups.
This was a change the PC platform needed, a way to show off "this is what you can have if you have a PC," and it worked surprisingly well. Battlefield 3 looks absolutely luscious on the PC, while the console versions don't actually look too shabby for five-year-old (or older, in the Xbox 360's case) technology. The problem is, where to from here? We must already be pushing against the boundaries of what current-generation consoles are capable of.
The question now is, will more developers take the same leap DICE did and start development of their games on PC first? Nearly every game I can think of, apart from Diablo III, is built on console first, then ported up to PC. DICE took a bold step and did the reverse, starting development on the most powerful platform and porting down to consoles (even if it ended up switching to consoles midway through development).
If developers don't do this, we're looking at a very dangerous future for the desktop PC. How is it dangerous, you ask? Well, Sandy Bridge-E just launched, and despite the arrival of a next-generation CPU architecture, games have no real need for it. Where does Intel go from here? How much more powerful will CPUs get over the next two to five years? Will we see a 100 to 200-percent increase in performance that games simply won't benefit from?
WILL BETTER HARDWARE HELP?
Then we have AMD's new Radeon HD 7000-series GPUs. The cards are virtually twice as fast as the 6900-series, but is there any point? Almost any $250 GPU can nearly max out an AAA title right now (anti-aliasing aside). What about in two to five years? We'll see two or three entire generational changes in GPUs, but games will still be stuck at 1080p, and stuck again at the refresh rate of virtually all displays: 60Hz.
Is there perfect scaling across multi-GPU setups right now in most games? There are a few good examples, but it's another case of hardware turning into a cycle of "release it, then firmware, drivers, drivers, drivers" - then a refresh of GPUs comes out, gaming goes flat for a while, and then next-gen hardware is announced or at least leaked. We don't get a chance to enjoy the hardware; it's almost to the point where high-end users are beta testers.