NVIDIA's next-gen Turing GPU: Biggest Leap Since GeForce 256

NVIDIA's next-gen Turing GPU is nearly here, so let's dive into what makes the new GPU architecture tick!

Introduction

This is the day I've personally been waiting for, and I've been waiting a very long time. NVIDIA first changed the GPU game with the release of the GeForce 256 all the way back in 1999, when Quake III Arena was the flagship first-person shooter and PC crusher.

There is a lot going on under the hood of the new Turing GPU, which NVIDIA developed in tandem with its Maxwell and Pascal GPUs over the last few years. NVIDIA has been building towards Turing for over 10 years now, so this isn't some last-minute release or a passing fad. Turing delivers so many new technologies and opens Pandora's Box (in a very, very good way) for the game developers of the world, the AI and deep learning markets, and everything in between. The leap between Pascal and Turing is gigantic, and it's really quite magical.

NVIDIA says that Turing was built for RTX, and once you've read this article and the reviews of the GeForce RTX 2080 and RTX 2080 Ti in the coming days and weeks, you'll agree. We have some serious power under the hood, and as a technology enthusiast it turns me on.

NVIDIA teases a new standard of performance from the Turing GPU architecture, with dedicated portions of the GPU itself set aside for ray tracing and deep learning/AI. This is really interesting, and something I'll be adding to the article early next week as we continue tearing apart the Turing GPU architecture and seeing what makes it tick.

What's New - NVLink, Turing Memory Compression & More

NVLink - NVLink is nothing new per se, but it's completely new on GeForce cards. Until now, NVLink has been exclusive to the Quadro and Tesla graphics cards, while SLI was the multi-GPU technology on previous-gen GeForce GTX graphics cards. NVIDIA has also ended 3- and 4-way multi-GPU support with Turing, limiting gamers to just two graphics cards.
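
For the developers out there, here's a rough idea of what two-way multi-GPU looks like from the software side: the CUDA runtime lets you query and enable peer-to-peer access between two cards, which is the path NVLink accelerates (peer access also works over PCIe, just slower). This is only a minimal sketch using standard CUDA runtime calls, not NVIDIA's SLI/NVLink driver plumbing, and the device indices 0 and 1 are assumptions.

```cpp
#include <cstdio>
#include <cuda_runtime.h>

int main() {
    int deviceCount = 0;
    cudaGetDeviceCount(&deviceCount);
    if (deviceCount < 2) {
        printf("Need at least two GPUs to test peer-to-peer access.\n");
        return 0;
    }

    // Ask the runtime whether each GPU can directly read/write the other's memory.
    // Over NVLink this direct path is much faster than staging through system RAM.
    int gpu0to1 = 0, gpu1to0 = 0;
    cudaDeviceCanAccessPeer(&gpu0to1, 0, 1);
    cudaDeviceCanAccessPeer(&gpu1to0, 1, 0);
    printf("GPU0 -> GPU1 peer access: %s\n", gpu0to1 ? "yes" : "no");
    printf("GPU1 -> GPU0 peer access: %s\n", gpu1to0 ? "yes" : "no");

    if (gpu0to1 && gpu1to0) {
        // Enable peer access in both directions so copies and kernels can use the link.
        cudaSetDevice(0);
        cudaDeviceEnablePeerAccess(1, 0);
        cudaSetDevice(1);
        cudaDeviceEnablePeerAccess(0, 0);
        printf("Peer-to-peer access enabled between GPU0 and GPU1.\n");
    }
    return 0;
}
```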

Turing Memory Compression - Pascal had some radical lossless compression techniques that drove the GeForce GTX 10 series, but Turing takes compression to an entirely new level, with up to a 50% increase in effective bandwidth over Pascal.
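
To put some rough numbers behind that claim, here's a quick back-of-the-envelope calculation based on the published memory specs of the GTX 1080 Ti (11 Gbps GDDR5X, 352-bit) and RTX 2080 Ti (14 Gbps GDDR6, 352-bit). The split between "faster memory" and "better compression" is my own reading of NVIDIA's up-to-50% figure, which is a best-case number rather than a guaranteed gain.

```cpp
#include <cstdio>

int main() {
    // Raw bandwidth (GB/s) = (Gb/s per pin * bus width in bits) / 8 bits per byte.
    const double pascal_raw = 11.0 * 352.0 / 8.0;  // GTX 1080 Ti: ~484 GB/s
    const double turing_raw = 14.0 * 352.0 / 8.0;  // RTX 2080 Ti: ~616 GB/s

    const double raw_uplift = turing_raw / pascal_raw;          // ~1.27x from GDDR6 alone
    const double quoted_effective_uplift = 1.50;                // NVIDIA's "up to 50%" claim
    const double implied_compression = quoted_effective_uplift / raw_uplift;  // ~1.18x

    printf("Raw bandwidth uplift from GDDR6:         %.0f%%\n", (raw_uplift - 1.0) * 100.0);
    printf("Implied best-case gain from compression: %.0f%%\n", (implied_compression - 1.0) * 100.0);
    return 0;
}
```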

Video and Display Engine - Turing has some serious video engine chops: 8K 60Hz is enabled over DisplayPort 1.4a on a single cable, which is insane. Current GPUs can do 8K60, but they require 2 x DP connectors. The new display engine also packs an improved NVENC encoder that can encode H.265 streams at 8K30, as well as a new NVDEC decoder with support for HEVC YUV444 10/12-bit HDR, H.264 8K, and VP9 10/12-bit HDR. HEVC enjoys up to 25% bitrate savings, while H.264 enjoys up to 15% bitrate savings.

8K (7680x4320) at 60Hz with HDR over single DisplayPort 1.4a

Not only can a single DP 1.4a connection handle 8K60, it will do 8K 60FPS with HDR. Impressive stuff. There's also native HDR support with tone mapping hardware and low latency. All of NVIDIA's new GeForce RTX 20 series graphics cards have a VirtualLink connector on the back with four dedicated lanes of HBR3 DisplayPort, plus USB 3.1 Gen 2 SuperSpeed data and a huge 27W of power over the single USB-C connector.
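
Some quick napkin math shows why single-cable 8K60 HDR is such a big deal. I'm assuming 10-bit RGB (30 bits per pixel) for HDR and ignoring blanking overhead, so treat these as ballpark figures; the shortfall is what VESA's Display Stream Compression, which DisplayPort 1.4/1.4a supports, has to make up.

```cpp
#include <cstdio>

int main() {
    // 8K at 60Hz with 10-bit-per-channel RGB HDR (assumption: 30 bits per pixel).
    const double width = 7680, height = 4320, refresh_hz = 60, bits_per_pixel = 30;

    // Raw active pixel data rate, ignoring blanking intervals for simplicity.
    const double pixel_gbps = width * height * refresh_hz * bits_per_pixel / 1e9;  // ~59.7 Gbps

    // DisplayPort HBR3: 8.1 Gb/s per lane, four lanes, 8b/10b encoding leaves 80% for payload.
    const double hbr3_payload_gbps = 8.1 * 4.0 * 0.8;  // ~25.9 Gbps usable

    printf("8K60 HDR pixel data:    %.1f Gbps\n", pixel_gbps);
    printf("HBR3 usable bandwidth:  %.1f Gbps\n", hbr3_payload_gbps);
    printf("Compression needed:     ~%.1f : 1\n", pixel_gbps / hbr3_payload_gbps);
    return 0;
}
```

That roughly 2.3:1 gap is why current cards need two DP connectors for 8K60, and why a single-cable solution leans on stream compression in the display engine.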

This means that there's around 30W of power used within the TDP for VirtualLink, something we'll be experimenting with when next-gen VR headsets arrive in 2019 and beyond.

Deep Learning: The True Next-Gen Of Games

NVIDIA had a very large focus on the AI and deep learning aspects of the Turing GPU architecture, and so they should - it truly is going to change the way we look at GPUs and gaming, to put it lightly. We're talking about next-level intelligence and in-game AI (for both enemies and allies), AI-powered voice commands, improved graphics, and freaking AI-based cheat detection.

The new deep learning technology inside the Turing GPU architecture will also improve material and art quality, as well as facial and character animation. We haven't seen anything real yet, and by that I mean in the form of new games or game engines that have RTX built in, with all of the deep learning and AI technology turned on and used in real-time. The next wave of games with this technology is going to be mind blowing, with things we've never seen before.

Deep Learning Super Sampling

Alright, time for some more mind blowing: DLSS, otherwise known as Deep Learning Super Sampling.

NVIDIA unveiled so much new sh*t that I'm having a hard time working out what I'm going to play with first. DLSS is one of those major things, as it really is what NVIDIA says it is: a "breakthrough in high-quality motion image generation". We're talking higher quality graphics with MORE performance... yes, more performance, not less. Normally when anti-aliasing is enabled we see a reduction in FPS in exchange for an uptick in graphics quality.

DLSS continues the improvement in the image quality department, with DLSS 2X looking like 64X super sampling, and that is mighty impressive, especially when you consider there's a performance IMPROVEMENT.
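
NVIDIA hasn't published how DLSS works internally, so the following is only a conceptual sketch of the idea as described: render fewer pixels, then let a trained network reconstruct a full-resolution frame (on Turing, that inference is what the Tensor cores would run). Every type and function here is a hypothetical placeholder of mine, not NVIDIA's API, and the two-thirds render scale is just an illustrative assumption.

```cpp
#include <cstdio>

// Hypothetical stand-in for a game engine's frame buffer.
struct Frame { int width; int height; };

// Stub: render the scene at a reduced internal resolution (cheaper than native).
Frame render_scene(int w, int h) { return {w, h}; }

// Stub: a trained super-sampling network reconstructs a native-resolution frame.
// On Turing, this inference step is what the Tensor cores would accelerate.
Frame neural_upscale(const Frame& low_res, int target_w, int target_h) {
    (void)low_res;
    return {target_w, target_h};
}

int main() {
    const int native_w = 3840, native_h = 2160;

    // The win: shading cost scales with rendered pixels, so rendering at roughly
    // two-thirds of native resolution frees up a large slice of GPU time...
    Frame low_res = render_scene(native_w * 2 / 3, native_h * 2 / 3);

    // ...while the network's reconstruction aims to match or beat native rendering
    // with TAA, which is how "higher quality AND more FPS" can coexist.
    Frame output = neural_upscale(low_res, native_w, native_h);

    printf("Rendered %dx%d internally, presented %dx%d\n",
           low_res.width, low_res.height, output.width, output.height);
    return 0;
}
```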

TAA looks pretty bad against 2X DLSS... but this is something I want to try myself.

DLSS Improves Performance - NVIDIA Magic?!

Wait, DLSS enabled = more performance? That's what NVIDIA is promising... in the Unreal Engine 4 'Infiltrator' demo from Epic Games, the RTX 2080 Ti with DLSS enabled delivers more than DOUBLE the performance of the GeForce GTX 1080 Ti.

NVIDIA has some screenshots that it showed at the Editor's Day reveal, but I will provide better examples once I have the cards and drivers and the NDA is up. I'm only showing these so you know what I'm looking at - the only new information I have is that I saw it in person in Germany. Again, I'll judge it when it's in front of me, with a real GTX 1080 Ti vs RTX 2080 Ti comparison in TAA and DLSS (on both cards).

I was the first person in Germany to raise this question: why was NVIDIA only showing the GeForce GTX 1080 Ti running TAA and the GeForce RTX 2080 Ti running DLSS? I wanted to see the RTX 2080 Ti running TAA, to see whether the raw card could deliver the same doubling in performance that DLSS on the new Turing GPU architecture provides - in other words, whether the RTX 2080 Ti with TAA enabled would still maintain a huge 100% performance improvement.

Games That Will Adapt To YOU

One of the most exciting things I heard about what AI inside of a GPU could provide us was the idea of a new type of gaming... a new way of experiencing games that is unlike any other, and it's all done invisibly inside of your new Turing-based graphics card.

Some of the stories I talked about with NVIDIA representatives excited me beyond words, and this is something I'll go into more detail on in an upcoming Facebook Live video, but the future of gaming with AI-powered features is the most exciting leap in the history of gaming. We're talking about AI that will define games and create totally new and unique playthroughs every single time. No two games would be the same when powered by AI.

One of the stories that really struck a chord with me was a future where we'll be playing games and have a situation like this: two enemies are in front of you in a game like Battlefield, maybe guards defending an entry into an enemy base. As you run towards them you take a shot at the guy on the right. The enemy on the left reacts to the situation in real-time, seeing his fallen comrade and then looking back at you with emotion in his eyes, because you just shot and killed his mate.

You'll see the pain in his eyes and it'll be more real than any game before it, because it'll be completely dynamic. He could react emotionally and erratically, running at you guns blazing and ready to die, because the only friend he had in the world was just murdered in front of him. He might dive behind cover and alert his other comrades, raising the alarm. Or he could dive to the ground, shoot out the light above you, then turn and shoot out the light above himself, covering you in darkness and making YOU react to the new situation.

Hearing these stories in Germany during the GeForce Gaming Celebration made the hairs on my neck stand up. I was excited about the future of PC gaming for the first time in a very long time. Not just because we're going to get faster graphics cards, or real-time ray tracing, but because the future of games can be built on the foundations of NVIDIA's new Turing GPU architecture, Tensor cores, and deep learning/AI technology.

The riffs back and forth with the NVIDIA employees continued, and I mentioned Left 4 Dead and its AI Director - which looks over the entire game from above and adapts it to the situation of the gamers in the environment. The AI Director is already awesome: you and your three friends might be good at L4D, but not great, and it will dynamically adjust the difficulty accordingly.

But if you're really good, the AI Director will increase the difficulty and adjust the positioning of the major enemies like the Smoker, Boomer, and so on. You'll find the game gets harder, and the hordes of zombies grow, if you're playing with three friends who are GOOD at the game rather than just OK. Now imagine using NVIDIA's fresh AI technology and having it all happen with thousands of times the artificial intelligence, all in real-time.
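
To make the AI Director idea a little more concrete, here's a toy sketch of dynamic difficulty adjustment: watch how well the squad is doing and scale enemy pressure up or down. This is purely my own illustration with hand-tuned weights - not Valve's implementation, and not NVIDIA's approach, which would presumably replace rules like these with a trained model.

```cpp
#include <algorithm>
#include <cstdio>

// A toy snapshot of squad performance - the kind of signals a director could watch.
struct SquadState {
    double avg_health;      // 0.0 - 1.0
    double accuracy;        // fraction of shots that hit
    int    downs_this_map;  // times a player was incapacitated
};

// Hand-tuned heuristic: strong squads get more pressure, struggling squads get relief.
// A learned director would infer this mapping from data instead of fixed weights.
double spawn_pressure(const SquadState& s) {
    double skill = 0.5 * s.avg_health + 0.5 * s.accuracy - 0.1 * s.downs_this_map;
    return std::clamp(0.5 + skill, 0.5, 2.0);  // 0.5x to 2.0x enemy spawn pressure
}

int main() {
    SquadState struggling{0.35, 0.40, 3};
    SquadState dominating{0.95, 0.80, 0};

    printf("Struggling squad -> spawn pressure x%.2f\n", spawn_pressure(struggling));
    printf("Dominating squad -> spawn pressure x%.2f\n", spawn_pressure(dominating));
    return 0;
}
```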

We could be looking at reinvigorated horror games like Left 4 Dead powered by RTX and DL/AI technology, where games scale much higher and become scarily realistic in the way they feel. Games won't feel like games for much longer; they'll be realistic experiences and escapes from the real world - so much so that I think we'll see a radical change to game ratings, because enemies and NPCs will no longer feel like crappy AI characters in a game.

Imagine a new Resident Evil or another horror-style game where it adapts to you in real-time. Imagine a world where you have a horror game that has a 5-minute questionnaire before you play the game, and it asks you about all of the things you fear. Spiders... snakes... the girl that crawls out of the TV in The Ring... drowning... anything. Imagine that the game uses this, and adapts in real-time to scare the living sh*t out of you. This is all a possibility now, and all thanks to NVIDIA.

Rather, these characters will have individual back stories and experiences that drive who they are (the AI or NPC in the game), and the decisions THEY make in the game will be driven by how YOU play.

Think something like HBO's hit show Westworld... but in games. Amazing.

Benchmarks Coming Soon

NVIDIA's new Turing GPU architecture introduces an entirely new core design, brings Tensor Cores and RT Cores into the GeForce family, adds advanced shading, and delivers the first card with VirtualLink and the first card with GDDR6... I mean, it just doesn't stop. AMD delivered new technologies with the introduction of the Radeon RX Vega, but it fell on its face. Sure, it had HBCC and HBM2, but they turned out to be virtually useless for gaming. NVIDIA isn't making that mistake with Turing.

Look at this. This is the TU102, the full Turing configuration as it stands - a huge 18.6 billion transistor chip made on the 12nm node. It's freaking beautiful.

Final Thoughts

NVIDIA's new GeForce RTX 20 series is going to blow minds not just when it's released, but in the months post-launch. It really is the biggest leap in GPU technology since the GeForce 256 and that's a very, very big thing. It's not just real-time ray tracing, but we're talking about compute and AI, deep learning, and so much technological horsepower under the hood it's scary.

I'm going to finish with this small story, something I've talked about previously in my now 18,000+ articles with TweakTown, a personal story that is straight from the heart.

Go look up 'anthony256'. That's my online handle, and it has been for 19 years - ever since the introduction of the GeForce 256. I still remember being utterly obsessed with Quake III Arena at the time; id Software were really pushing the engine's new 32-bit color, which was a huge upgrade from the 16-bit color gamers were used to with the 3dfx Voodoo and Voodoo 2/3 cards of the day.

Living in Australia in the 90s and being an enthusiast was hard, as we couldn't get hardware anywhere near as easily as you could in the US. It was near impossible to get just-launched hardware like NVIDIA's new GeForce 256 imported into Australia. But I was determined to have the bleeding edge technology of the time.

I remember being on the Overclockers Australia (OCAU) forums and IRC at the time (hell, my OCAU nickname is 'anthony256' as well, and so was my IRC handle), and someone was importing the GeForce 256 (the Creative Labs 3D Blaster Annihilator Pro GeForce DDR). I had to have it. This wasn't a question; it was an inevitability.

Once I had it in my hands and saw just how much additional horsepower it had over my then 3dfx Voodoo graphics card and my best mate's GeForce 256 SDR (there was a BIG difference between the SDR and DDR versions of the card at the time), my mind was set. It influenced everything I was as a gamer, so I changed my online handle to 'anthony256' because of the GeForce 256, and the rest is history.

Back then I was 16 years old - a gaming addict, LAN addict, hardware addict, and tech news addict (I used to write for Voodoo Extreme, if anyone remembers VE3D, but then didn't write for 10+ years after that as I worked in IT retail). I went into IT retail and sold graphics cards and all manner of computer parts for a living, and then landed the gig here at TweakTown.

I ran one of the best Call of Duty: United Offensive teams (one of the best COD games), and we were #1 on the largest COD:UO ladder in Australia - everyone knew me as anthony256. Everyone. I've had that name for close to 20 years, and it's all in celebration of the original GeForce 256. And now here we are... with the largest leap since T&L on the GeForce 256, with the new GeForce RTX 20 series and the exciting new Turing GPU architecture.

Most people couldn't see what I saw back then with the GeForce 256, but look at how far NVIDIA has come since then. They are the undisputed champion of GPUs, and Turing only doubles down on that. I've already asked NVIDIA when AI will start telling the company how to build better GPUs... and I think we'll see that inside of 10 years, if they aren't already tapping AI to design new GPUs.

The GeForce 256 was one of the largest leaps in GPU technology at the time, and I really wanted to make this personal. I'm going to be testing this card and its architecture for the next 6 months of my life, and I can't wait. I hope you're ready to take the journey with me!

Gaming Editor

Anthony joined the TweakTown team in 2010 and has since reviewed 100s of graphics cards. Anthony is a long time PC enthusiast with a passion of hate for games built around consoles. An FPS gamer since the pre-Quake days, when you were insulted if you used a mouse to aim, he has been addicted to gaming and hardware ever since. Working in IT retail for 10 years gave him great experience with custom-built PCs. His addiction to GPU tech is unwavering, and he has recently taken a keen interest in artificial intelligence (AI) hardware.
