NVIDIA has released some interesting stats on its GPU adoption, with new numbers released during its recent Investor Day event detailing Turing, Pascal, and older-gen GPU adoption.
The company says Turing GPUs sold 45% more than Pascal GPUs did in their first eight weeks on the market; even the harsh reception and high prices of the GeForce RTX cards haven't stopped NVIDIA from selling a bunch of them. But how much penetration has Turing had into gamers' PCs?
According to NVIDIA's own graph, just 2% of NVIDIA's consumers have upgraded to a Turing GPU, while 50% are using Pascal (GTX 10 series) and 48% are using older-gen GeForce cards. Close to 90% of GeForce GTX 10 series owners that are upgrading are grabbing higher-end Turing-based GPU offerings, too. You can see that most people that have a GTX 1080 or GTX 1080 Ti are upgrading to a new RTX 2080 or RTX 2080 Ti, which is interesting to see.
GTC 2019 - NVIDIA CEO and founder Jensen Huang had some big things to talk about during his opening keynote at GTC 2019, with one of them being the announcement of the latest RTX servers that will be hot pieces of tech for Hollywood studios working on VFX.
The new RTX server pods pack a huge 1,280 Turing GPUs across 32 RTX blade servers, 40 GPUs per server, with each server consuming 8U of rack space. The GPUs inside are NVIDIA's new Quadro RTX 4000 and 6000 graphics cards, depending on the customer's configuration and budget.
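The pod numbers above check out with a quick back-of-envelope calculation (the variable names here are just for illustration):

```python
# Sanity-check the RTX server pod numbers quoted above.
servers_per_pod = 32
gpus_per_server = 40
rack_units_per_server = 8  # each blade server occupies 8U

gpus_per_pod = servers_per_pod * gpus_per_server
rack_units_per_pod = servers_per_pod * rack_units_per_server

print(gpus_per_pod)       # 1280 GPUs, matching NVIDIA's quoted total
print(rack_units_per_pod) # 256U of rack space across the pod
```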
NVIDIA said in its announcement: "NVIDIA RTX Servers - which include fully optimized software stacks available for Optix RTX rendering, gaming, VR and AR, and professional visualization applications - can now deliver cinematic-quality graphics enhanced by ray tracing for far less than just the cost of electricity for a CPU-based rendering cluster with the same performance".
GTC 2019 - NVIDIA unveiled a cute little AI computer today at its own GPU Technology Conference, with the introduction of the Jetson Nano.
The new Jetson Nano is an entry-level AI computer that NVIDIA will sell to developers for just $99. Inside, NVIDIA is using a 128-core Maxwell GPU and a quad-core ARM Cortex-A57 processor, good for 472 gigaflops of processing power for neural networks, high-res sensors, and other robotics features.
It's a power-sipping machine requiring just 5W, runs Linux out of the box, and supports a bunch of different AI frameworks. It also houses 4GB of RAM, a 1GbE port, and the I/O you'll need to attach cameras and other devices.
NVIDIA's new Jetson Nano is $99 for individuals or $129 for the "production-ready" units that companies would use and deploy.
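The quoted 472-gigaflop figure lines up with the 128-core Maxwell GPU if one assumes half-precision FMA throughput at roughly a 921 MHz clock; note the clock speed and FP16 assumptions are mine, not stated above:

```python
# Rough back-of-envelope for the quoted 472 GFLOPS figure.
# Assumptions (not from the article): FP16 throughput at two ops
# per core per cycle, and a ~921 MHz GPU clock.
cuda_cores = 128
flops_per_fma = 2       # a fused multiply-add counts as two FLOPs
fp16_ops_per_core = 2   # assumed FP16 rate per core per cycle
clock_ghz = 0.921       # assumed GPU clock

gflops = cuda_cores * flops_per_fma * fp16_ops_per_core * clock_ghz
print(round(gflops))  # ~472, matching NVIDIA's quoted number
```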
GTC 2019 - NVIDIA showed off something very close to my heart at GTC 2019 this year, demoing a real-time ray traced version of Quake II.
The company has created a demo of Quake II that uses NVIDIA's RTX technology as well as HDR, something else NVIDIA inserted into id Software's iconic shooter. NVIDIA explained that the original lighting in Quake II was baked and static, but when ray tracing is enabled the entire environment comes alive and looks completely different from the work id Software did in the 90s.
We have beautiful dynamic lighting that creates realistic reflections and shadows, all of which are visible across the environment in the game, be it walls, floors, or windows. NVIDIA explains: "We've introduced real-time, controllable time of day lighting, with accurate sunlight and indirect illumination; refraction on water and glass; emissive, reflective and transparent surfaces; normal and roughness maps for added surface detail; particle and laser effects for weapons; procedural environment maps featuring mountains, sky and clouds, which are updated when the time of day is changed; a flare gun for illuminating dark corners where enemies lurk; an improved denoiser; SLI support (hands-up if you rolled with Voodoo 2 SLI back in the day); Quake 2 XP high-detail weapons, models and textures; optional NVIDIA Flow fire, smoke and particle effects, and much more!"
NVIDIA is unleashing the world of ray tracing onto millions of graphics cards on the market, with the company announcing 'basic RT effects' and 'low ray count' abilities for a bunch of GeForce GTX series graphics cards.
You'll need a GeForce GTX 1060 6GB at minimum, with all Pascal GPUs up through the TITAN Xp supporting the basic RT abilities, as well as the new Turing-based GeForce GTX 1660 and GTX 1660 Ti graphics cards. For complex and multiple RT effects with high ray counts, the Turing RTX cards come into play: the GeForce RTX 2060, 2070, 2080, 2080 Ti, and TITAN RTX.
NVIDIA will push out an April GeForce driver that will unlock the new ray tracing support in the GTX 10 and GTX 16 series graphics cards. This is huge news, as it shows the adoption of GeForce RTX has been low enough that NVIDIA is unlocking ray tracing abilities on its previous-gen Pascal GPU graphics cards which, some of them at least, are nearly three years old now.
Unity has just announced today at NVIDIA's own GTC 2019 that it has teamed up with the GPU leader to integrate NVIDIA's RTX real-time ray tracing technology into its future platforms.
The company announced that NVIDIA's beautiful RTX real-time ray tracing technology is officially arriving in Unity's High Definition Render Pipeline (HDRP) today in preview form. It will be officially deployed in a more optimized state to Unity customers in 2H 2019, so we should begin to see the flow-on effects later this year and more so in 2020 and beyond.
Natalya Tatarchuk, Vice President of Graphics at Unity Technologies explained in a statement: "As part of our commitment to best-in-class visual fidelity graphics, we rolled out the preview of the High Definition Render Pipeline (HDRP) last year - a highly-optimized, state-of-the-art raster-based solution capable of achieving stunning graphics in real-time on consumer hardware. We built HDRP with the future in mind and today we're excited to announce that we are working with NVIDIA to adopt its RTX real-time ray tracing capabilities so we could bring this technology to all. Real-time ray tracing moves real-time graphics significantly closer to realism, opening the gates to global rendering effects never before possible in the real-time domain".
NVIDIA still owns the crown for the fastest gaming graphics card in the world with its GeForce RTX 2080 Ti, and the card has plenty of overclocking headroom; the company places some major restrictions on it, or we would see the card running far faster than it already does.
Well, now we have legendary overclocker 'KINGPIN' overclocking his custom-made EVGA GeForce RTX 2080 Ti KINGPIN Hybrid graphics card, pushing the Turing GPU up to the dizzying heights of 2.7GHz. The 11GB of GDDR6 memory was also cranked up from its stock 14Gbps to 17Gbps, offering up 750GB/sec of memory bandwidth.
KINGPIN ran the EVGA GeForce RTX 2080 Ti KINGPIN Hybrid graphics card at its max power target (+7%) and used LN2 cooling to keep the OC at its limits. He paired it with an Intel Core i9-9980XE, also cooled with LN2 and overclocked to 5.6GHz across all 18 cores, on an EVGA X299 DARK motherboard with 32GB of G.SKILL TridentZ DDR4 RAM at 4000MHz. The result? A world record score in 3DMark's new Port Royal benchmark.
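The ~750GB/sec figure follows from the overclocked per-pin data rate and the RTX 2080 Ti's memory bus width; note the 352-bit bus width is my assumption, as it isn't stated above:

```python
# Rough check of the quoted memory bandwidth after the overclock.
# Assumption (not from the article): the RTX 2080 Ti's 352-bit
# GDDR6 memory bus.
data_rate_gbps = 17    # per-pin data rate after the 14 -> 17Gbps OC
bus_width_bits = 352   # assumed memory bus width

bandwidth_gb_s = data_rate_gbps * bus_width_bits / 8
print(bandwidth_gb_s)  # 748.0, i.e. roughly the quoted 750GB/sec
```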
If you remember my exclusive report from April 2, 2018, you'll remember that I said NVIDIA would be launching its Turing GPU later in the year, which it did, and then it would be launching Ampere "sometime in the future". Well, that future could be now.
NVIDIA is hosting its annual GPU Technology Conference (GTC) next week in California, but what will they be showing off? Volta is old news in the high-performance computing (HPC) world, and Turing has been cut into countless variants throughout the Quadro RTX, GeForce RTX, and new GeForce GTX families of products. So, what's next then?
The next-generation GPU architecture after Turing will be Ampere, which we should expect to arrive on the 7nm node. Right now all of the Turing GPUs are made on 12nm, while I expect Ampere to move down to 7nm.
Intel is ramping up towards its industry event at GDC 2019 next week, with the official Twitter page for Intel Graphics recently teasing the new control panel coming to Intel and its future discrete GPUs.
The new control panel looks very Steam-ish, and that's not a bad thing for me, as it's a vast upgrade on the aging look of NVIDIA's GeForce control panel. As good as GFE looks, the NVIDIA control panel looks beyond Plain Jane compared to the tease of Intel's upcoming control panel. Lots of work has gone into it by the looks of things, with large buttons and various toggles we'll get to play with.
Hopefully we see more of this during The Odyssey event next week, as the hype builds for Intel's first discrete graphics card since the i740 (how good was Trespasser on it!), which arrives in 2020.
Intel has recently teased its graphics-related Odyssey event for March 20, which will take place amid the energy of the Game Developers Conference next week in San Francisco. The company has invited gamers, content creators, and others to join its new Odyssey toward discrete graphics cards and a renewed focus on the gaming audience in general.
You can sign up for The Odyssey right here, with Intel teasing it's "built around a passionate community, focused on improving graphics and visual computing for everyone, from gamers to content creators". At the event, you'll "get closer to the inner workings of visual technology development than ever before".
NVIDIA is expected to unveil its new GeForce GTX 1660 any day now, with a bunch of AIB partner cards having leaked. Today we're starting off with the EVGA GeForce GTX 1660 and its XC series in both BLACK and ULTRA variants.
The always-great VideoCardz is on point with the leaks, with the GeForce GTX 1660 expected to launch on March 14 starting at $219 with 6GB of GDDR5 memory (not the GDDR6 found on its higher-end GTX 1660 Ti sibling).
Starting with the EVGA GeForce GTX 1660 XC ULTRA, which is a dual-slot card with a dual-fan cooler.