NVIDIA is positioning itself for a huge show at CES 2017, teasing that it will have major announcements on gaming and VR products, as well as the usual development updates in AI and autonomous car technology.
NVIDIA CEO Jen-Hsun Huang will be on stage for a keynote at the Consumer Electronics Show in January, with NVIDIA teasing: "When Huang takes the stage, you can be sure he'll break news in some of the areas we're focused on: artificial intelligence, self-driving cars, virtual reality, and gaming".
What could NVIDIA unveil? A big reveal of its next-gen Volta architecture is possible, but I think that's something we'll see at NVIDIA's own GPU Technology Conference a few months after CES 2017. What about VR and gaming, then? NVIDIA could unveil the GeForce GTX 1080 Ti, a new Shield, or the Shield VR - I've been saying for years that NVIDIA is working on a Shield-branded VR headset.
Samsung has been using various Qualcomm processors in its Galaxy-branded smartphones over the years, while slowly shifting its reliance to the Exynos processor, a chip that Samsung designed and manufactures itself.
There are now reports surfacing from SamMobile that tease a possible future where Samsung could be using GPU technology from either AMD or NVIDIA, which could make for an interesting change in the smartphone industry. NVIDIA has its impressive Pascal architecture on the 16nm FinFET process made by TSMC, but it has also just signed a deal for Samsung to manufacture its second wave of Pascal GPUs on Samsung's 14nm FinFET process.
AMD has launched its new Polaris architecture, made on the 14nm FinFET process over at GlobalFoundries, and has just scored major semi-custom contract wins with the new Xbox One S and upcoming Xbox Scorpio consoles from Microsoft, as well as the new PS4 Slim and PS4 Pro consoles from Sony.
AMD's Radeon Software Crimson Edition 16.9.1 driver is available now, and with it comes some much appreciated game optimizations and bug fixes, as usual.
Deus Ex: Mankind Divided supports DirectX 12 today from the game developer end, and with this driver, from AMD's end, too, so you should have no trouble running the game with buttery smoothness.
The other game to get some TLC is DOTA 2, which has been bestowed with a DirectX 11 CrossFire profile.
AMD has the lower- and mid-range GPU market tied up with its Radeon RX 460 and RX 470 graphics cards, but NVIDIA is looking to enter the mid-range GPU game with its Pascal architecture in the purported GeForce GTX 1050.
Reports surfaced over the weekend of a GP107-powered GeForce GTX 1050, with its GPU featuring 768 CUDA cores, and 4GB of GDDR5 on a 128-bit memory bus with 112GB/sec memory bandwidth. It looks like the GPU will be clocked at 1316MHz with a boost clock of 1752MHz.
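The reported 112GB/sec figure lines up with standard 7Gbps GDDR5 on that bus width. A quick sanity check (the 7Gbps per-pin rate is my assumption - a common GDDR5 speed grade, not something the report confirms):

```python
# Rough memory-bandwidth check for the rumored GTX 1050 specs.
bus_width_bits = 128      # reported memory bus width
data_rate_gbps = 7        # assumed GDDR5 effective per-pin rate (common speed grade)

# bandwidth = (bus width in bytes) x (per-pin data rate)
bandwidth_gb_s = bus_width_bits / 8 * data_rate_gbps
print(bandwidth_gb_s)     # 112.0 GB/sec, matching the reported figure
```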
NVIDIA's upcoming GeForce GTX 1050 will reportedly have a TDP of just 75W, so we could expect GTX 1050s to arrive without an additional PCIe power connector. As for the price, considering the GeForce GTX 1060 is priced at $299 for the Founders Edition, the GTX 1050 will need to be priced at $150-$200 to better compete against the Polaris-based offerings from AMD.
GDDR5X debuted on NVIDIA's GeForce GTX 1080 earlier this year, and then we saw 12GB of GDDR5X powering the super-powerful Pascal-based Titan X. Now we have GDDR6 being prepped for a debut sometime in 2018.
GDDR6 will increase the per-pin data rate to over 14Gbps, up from the already generous 10Gbps offered by GDDR5X, and up greatly from the current 8Gbps of new GDDR5-based cards - before then, it was 7Gbps for GDDR5. GDDR6 is also more power efficient, coming in at around 20% more efficient than GDDR5.
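To put those per-pin rates into card-level terms, here's what each memory type would deliver on a hypothetical 256-bit bus (the bus width is my assumption for illustration; only the per-pin rates come from the report):

```python
# Card-level bandwidth at the per-pin rates mentioned above,
# on an assumed 256-bit memory bus.
bus_width_bits = 256

for name, rate_gbps in [("GDDR5", 8), ("GDDR5X", 10), ("GDDR6", 14)]:
    bandwidth = bus_width_bits / 8 * rate_gbps
    print(f"{name}: {bandwidth:.0f} GB/sec")
# GDDR5: 256 GB/sec, GDDR5X: 320 GB/sec, GDDR6: 448 GB/sec
```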
I'd expect to see GDDR6 in the cards for AMD and NVIDIA for 2018, with NVIDIA set to use GDDR5X on its flagship graphics cards into 2017 alongside HBM2 on the upcoming Volta architecture. AMD has Vega planned for the first half of 2017, which will utilize HBM2 memory - but only if HBM2 supply isn't ridiculously expensive at the time, and is available in high volume.
GDDR6 will most likely be used in the refreshes of Volta, and then I'd wager we'll see Navi using it - although AMD has teased "next-gen" memory on its GPU roadmap, it hasn't elaborated on what that "next-gen" memory will be.
You might be too busy enjoying the Battlefield 1 open beta right now to realize AMD has released new Radeon Software Crimson Edition drivers that are ready for both Deus Ex: Mankind Divided and Battlefield 1. You can grab the new drivers right here.
AMD says it has also added new DX11 CrossFire profiles for both games, with the new 16.8.3 hotfix solving some of the issues with random blank or colored screens when gaming on Radeon RX 400 series cards.
NVIDIA's new 372.70 driver is out now, and it comes with some major additions.
First up is optimizations for World of Warcraft: Legion, the newly launched Battlefield 1 beta (check your e-mail), Deus Ex: Mankind Divided, and Quantum Break (Steam version, due September 14 with DirectX 11 support).
Beyond that, you get Fast Sync support for Maxwell GPUs and for the Extended, Clone, and Surround multi-monitor configurations, plus various bug fixes, most notably one for the high deferred procedure call (DPC) latency experienced upon upgrading to the GTX 1080.
AMD has confirmed it will be launching its next-generation Vega architecture in the first half of 2017, saying Vega-based graphics cards for the "enthusiast market" will arrive in 1H 2017. The last we heard, Vega-based graphics cards were launching in March 2017.
If we look at Polaris, it was announced in December 2015 and released to market with the Radeon RX 480 graphics card in the last days of June 2016. If Vega follows a similar timeframe, we should expect an unveiling in December and a launch sometime in March-April - an earlier release would make a bigger impact on the market, especially against NVIDIA's formidable Pascal-powered graphics cards.
AMD's next-gen Vega architecture will be an interesting upgrade over Polaris - I've been hearing from industry sources that Vega will be a superior architecture in many ways. Vega will be using HBM2 technology, so we can expect much more VRAM than the HBM1-based Radeon R9 Fury X offered, which featured just 4GB of HBM1. We will probably see 8GB and 12GB models, but I'd like to see a higher-end Vega graphics card with 16GB of HBM2 - ok, AMD?
I reported on HBM3 a few days ago, but all of the details weren't clear - until the Hot Chips conference in Cupertino this week, where Samsung and SK Hynix shared some more details on the next leap in HBM technology.
HBM3 will offer improvements over HBM1 and HBM2 in nearly all areas, starting with taller RAM stacks. HBM3 will feature stacks of 8 or more dies connected via through-silicon vias (TSVs), up from the 2/4/8-high stacks of HBM2. The upgraded tech will also see individual memory dies of up to 16Gb, up from the 8Gb of HBM2, meaning 64GB of VRAM on next-gen graphics cards will become a reality.
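Here's how the 64GB figure falls out of those numbers. The four-stack count is my assumption (typical of current HBM2 designs like the Tesla P100); the die density and stack height are from the report:

```python
# Sketch of the 64GB VRAM figure from the reported HBM3 numbers.
die_capacity_gbit = 16   # reported max HBM3 die density
dies_per_stack = 8       # reported stack height
stacks_per_card = 4      # assumed, as on current HBM2 designs

# 16Gb = 2GB per die; 8 dies per stack = 16GB; 4 stacks = 64GB
total_gb = die_capacity_gbit / 8 * dies_per_stack * stacks_per_card
print(total_gb)          # 64.0 GB of VRAM
```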
Lower core voltage and twice the peak bandwidth will be offered on HBM3, which is another great thing to see, but HBM3 won't be arriving until sometime in 2019-2020.
NVIDIA revealed its Tesla P100 graphics card at its GPU Technology Conference earlier this year - the first Pascal-based graphics card, and the first HBM2-powered card from NVIDIA. It was a compute monster, and it was only today, during the annual Hot Chips symposium, that NVIDIA revealed its first die shot of the 610mm2 GPU.
The company released the GP100 die shot as part of its presentation on Pascal and NVLink 1.0. Die shots have not been frequent from either NVIDIA or AMD, so it's nice to see the GP100 die out in the wild. GP100 is NVIDIA's first part to feature HBM2 and NVLink - an important moment in the company's history, as this exciting technology isn't available on consumer GeForce graphics cards... yet.
NVIDIA's new GP100 die shot teases the HBM2 interfaces at the top and bottom of the picture, with the 4096-bit memory bus able to transfer information at over 1TB/sec.
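That 1TB/sec figure works out if you assume HBM2's peak per-pin rate of 2Gbps (my assumption - the shipping Tesla P100 actually clocks its memory below this spec maximum):

```python
# How a 4096-bit HBM2 interface reaches ~1TB/sec peak bandwidth.
bus_width_bits = 4096    # GP100's HBM2 memory bus width
pin_rate_gbps = 2.0      # assumed HBM2 spec peak per-pin rate

bandwidth_gb_s = bus_width_bits / 8 * pin_rate_gbps
print(bandwidth_gb_s)    # 1024.0 GB/sec, i.e. just over 1TB/sec
```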