It's almost an impossible question. No one can deny that the PC market is shrinking, but the "death of PC gaming" bell has been rung for quite a while now, and with every new console we hear it again.
First, let's rewind to the mid-90s. Weren't those the days? PC gaming was at its peak, with developer gods such as Carmack and Sweeney controlling the two titan gaming engines, Quake and Unreal. Each new title in the respective series demanded a huge bump in graphical power requirements.
Each game that came out featured new graphical abilities, such as colored lighting, shadowing systems, bigger levels, A.I. - the list is virtually endless. Meanwhile, tablets and smartphones have gotten to the point where they are comparable to basic PCs, with quad-core tablets hitting shelves just before Christmas. Where does this leave the desktop PC gaming market?
THE GOOD OLD DAYS:
Years ago, games such as Quake III required SLI'd 3dfx cards to get the best performance (or higher resolutions). To get then-cinema-like graphics, you'd spend the cash, and every dollar spent made a visible difference on screen. These days, a $250 GPU will run games at 1080p and 60 frames per second. Today, there's little reason to own a high-end desktop, and this is where companies like NVIDIA, AMD and Intel should be scared.
Add to that list companies such as Corsair, ASUS, GIGABYTE - again, the list is endless. Without the high-end PC industry, those companies would be selling quite a lot less, and it would hurt their bottom line, as cheaper competitors could sell their products at the same prices and still operate without an issue. Has anyone noticed just how much these companies have "thought outside the square", releasing products like peripherals, tablets, notebooks, and more?
Back then, starting from simple single-core CPUs and barely-there graphics cards, we saw exponential leaps in games and their graphics. We upgraded from Doom to Quake, from Wing Commander to Descent and its sequels; first-person shooters were the "show-off" genre for new graphical prowess, and graphics engines fought David vs. Goliath battles. Each year saw a new engine, and every six or so months brought new technology in the form of graphics cards, processors and sound cards.
We didn't need constant patches and firmware updates, an always-on Internet connection, DRM, pay-to-play models - none of it. There was the occasional game patch, but nowhere near the same level or number of fixes that current games require. Games were released "when they were done", versus today's method of "release, patch, patch, patch", and it feels as though before a game is even solid, its sequel pops its head over the horizon.