Last night, I played a couple of hours of the just-released Assassin's Creed: Unity, and was absolutely disgusted at the performance of the game, even on my powerful PC.
I'm running a Core i7-4930K, 16GB of Corsair 2400MHz DDR3 RAM, SSDs galore from SanDisk, and two NVIDIA GeForce GTX 980s in SLI - yes, GTX 980s, two of them. All of this is sprinkled over the awesome ASUS ROG Swift PG278Q monitor, which has a native resolution of 2560x1440 and a 144Hz refresh rate. I went into the game at 1440p, at its highest possible settings (with anti-aliasing disabled), and was getting an abysmal frame rate.
When I quit out of it, the game crashed - and then I couldn't get back into it. Disabling SLI and rebooting didn't help either, so I downloaded and installed the new GeForce 344.65 Game Ready drivers, which did. Back in the game, I reduced the resolution all the way down to 1280x720 at medium detail, which provided around 40-45FPS... way, way under what I should be getting. I quit out and jumped into Alien: Isolation, which runs at 2560x1440 with everything cranked (again, sans AA), and I was getting 100FPS+. Battlefield 4? Same thing.
Vincent Pontbriand, Senior Producer on the game, even said: "Technically we're CPU-bound. The GPUs are really powerful, obviously the graphics look pretty good, but it's the CPU [that] has to process the AI, the number of NPCs we have on screen, all these systems running in parallel". Now, see that CPU-bound comment there - look at the screenshot from the game above (taken on my PC) and tell me how those copy/pasted NPCs would take up precious CPU resources.
This is just quick testing, and I'll be playing around with it more between now and the end of the weekend - but AC: U is a pile of crap on PC right now. I've done some digging and it's not just me - the Steam forums are packed full of complaints. The weird thing is, Ubisoft lists a GeForce GTX 680 as the minimum requirement to play it - and bloody hell, now we know why. Considering the game was built from the ground up for next-gen consoles, we should be seeing a game that not only looks absolutely incredible, but runs really well too - and neither is happening.
So I'll say it again - Ubisoft, feel free to fly me to your studio, and I'll kick the ass of whoever let this happen. We can't blame NVIDIA, or AMD, or PC gamers, or piracy for this. Did Ubisoft not see the huge performance problems on PC? Did the company not test the game on PC at all? If it did, it would've noticed the massive problems - so why release it? Then again, considering the developer doesn't even provide its staff with 4K monitors to test on, maybe that explains it.