We know that NVIDIA are set to unleash the GK104-powered GEFORCE GTX 660, which should arrive during Computex in Taipei next week. Until then, we have to make do with a photoshopped picture of the card itself.
VR-Zone reports that they've checked with their sources, and the product is indeed legit. The top left of the picture has a photoshopped 6-pin PCI-e power connector highlighted, indicating that the card draws up to 150W and no more, which is a great sign for power-conscious gamers.
If the GTX 660 pulls just 150W of power, NVIDIA have done a great thing with Kepler. Gone are the days when Fermi splashed out on your wallet, sucking down all that juice in your rig; less than 150W on a mid-range card like this is great.
We should have more news, in-person, at Computex next week.
Extreme cooling is a must for any sort of extreme overclocking. Kingpin Cooling looks to provide the solution with liquid nitrogen heatsinks designed for extreme overclocking. Using said heatsinks, overclocking expert TiN managed to take a GTX 690 all the way up to 1547MHz, which I believe is a new world record.
Originally he had tried for 1600MHz but it wasn't stable so he backed it off to the final speed of 1547MHz. He also managed to take the effective RAM speed all the way up to 7336MHz. Of course, he couldn't just achieve the clocks without running some benchmarks, so that's what he did. After a run of 3DMark 11, we saw some impressive results.
With the help of a 3960X running at 4.5GHz, TiN managed a whopping 20962 points. This is basically double the score of a stock clocked card. To match this score, one would need to use two AMD HD 7970s clocked at 1260/7400 MHz or two GTX 680s at 1527/7132. What an incredible card. It'll be interesting to see how the upcoming dual-Tahiti chip will perform in comparison.
It would seem that PowerColor is the first GPU manufacturer to spill the beans on the upcoming 7970 X2. The new GPU will be based upon two 7970 Tahiti cores, which is the top chip made by AMD. This card is being designed to compete with the NVIDIA GTX 690, which is the current king of the video cards.
The new GPU will be cooled by PowerColor's new monster Vortex III. This new cooler was designed specifically for the new Devil 13 7970 X2 and is huge, as you can see in the picture above. It features three fans blowing onto a massive heatsink and takes up three slots, as seen in the picture of the backplate.
The new GPU is adorned in red and black and looks beautiful. But it's not all about looks, as the card appears to have some monster stats to match. It will feature a total of 4096 stream processors across the two chips and have a total of 6GB of memory available. The cores will be clocked at 1GHz and require a ton of power; the card is outfitted with three 8-pin PCIe power connectors and has a TDP of 525 watts. The card should make an appearance at Computex early next month. More should be known then.
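That 525-watt TDP lines up exactly with the maximum in-spec power delivery of the connector setup. A quick sanity check (the 75W slot and 150W 8-pin limits come from the PCIe specification, not from this report):

```python
# Sanity check on the Devil 13 7970 X2's 525W TDP: within the PCIe spec,
# the slot itself delivers up to 75W and each 8-pin connector up to 150W.
PCIE_SLOT_W = 75   # power available from the PCIe slot
EIGHT_PIN_W = 150  # power per 8-pin PCIe connector

def max_board_power(num_8pin: int) -> int:
    """Upper bound on in-spec power draw for a card with the given connectors."""
    return PCIE_SLOT_W + num_8pin * EIGHT_PIN_W

print(max_board_power(3))  # 525, matching the quoted TDP
```

In other words, the card is specced right up to the ceiling of what three 8-pin connectors plus the slot can legally deliver.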
We talked about NVIDIA prepping two more GPUs, but at the time we thought they'd be GK104-based. New reports have surfaced suggesting that the GEFORCE GTX 660 could be a GK106-based part.
The GTX 660 based on the GK106 architecture would sport 768 CUDA cores, 4 SMXs, 64 texture units, and 2GB of memory on a 256-bit interface. Pricing isn't too bad, where we should expect the GTX 660 to hit $349 or so. But in this market, with this much competition, is that too much to pay?
Would you grab one of these GPUs at $349? Or would you be looking at something from Team Red if you were getting into the $350-ish territory? Thoughts?
ASUS and ZOTAC have already been here, whizzing by with triple-slot, triple-fan cooled cards. PowerColor are the next ones out of the gate, showing off a triple-slot, triple-fan GPU cooling solution in the form of Vortex III.
Vortex III will be slapped onto the company's upcoming Radeon HD 7970 GHz Edition. The teaser picture gives us an idea of what to expect, where we can see the black and red color scheme and black-colored PCB; the card takes up three slots, sports three fans, and has a large aluminum fin heatsink below them.
PowerColor will most likely show off Vortex III-powered GPUs at Computex in just over a week's time, where we will be on-hand with our new competition winner, Roshan.
It appears that TSMC and NVIDIA disagree over whether NVIDIA's new Kepler architecture has a fatal flaw. On one hand, we are hearing reports from TSMC that the new Kepler video card "chips may be suffering from serious performance degradation over long periods of heavy load" and could be the cause of a future recall by NVIDIA.
On the other hand, NVIDIA is saying all is fine. Bryan Del Rizzo, spokesman for NVIDIA stated, "There is no truth to this." NVIDIA denies that there will be any sort of recall over this report from TSMC. NVIDIA has, however, not provided any further details other than the previous statement denying that there is an issue.
EVGA has already had to recall all of their GTX 670 SC cards due to a hardware issue, so the idea of a recall could be correct. Is this the hardware fault that has caused the recall? Additionally, this could be the reason that the 670/680/690 are not in stock anywhere, even after TSMC promised NVIDIA a majority of its manufacturing resources.
A company can't keep a major hardware flaw secret for long as a mass amount of users would take to the internet with stories of the problems they are having. I imagine there is at least some truth to this report and it will come out soon enough. Let's not forget the bumpgate scandal where NVIDIA held off for as long as possible from admitting a design flaw to avoid compensating affected customers.
VideoCardz.com is reporting that NVIDIA is set to launch their flagship mobile GPU based on the Kepler architecture during Computex in Taipei next month. The NVIDIA GEFORCE GTX 680M is not a full GK104 Kepler GPU, nor does it even sport half of the CUDA cores of its desktop version.
The GEFORCE GTX 680M features just 744 CUDA cores, though some listings show the GPU as having 768 cores, so this should be confirmed during Computex itself. Are you wearing socks? You might want to take them off in advance so they don't get blown off: the GPU has a much higher GDDR5 memory capacity of... 4096MB! 4GB of RAM on a notebook-based GPU!
It shares the desktop model's 256-bit memory interface, and rumors swirling around put its power consumption at 100W. The chip is a second revision of the N13E-GTX 680M chip - A2 silicon. The card will support SLI (!) and, of course, DirectX 11.1. Performance numbers, that's what we all want, right? We're looking at it being 37 percent faster than the GEFORCE GTX 670M, with the first leaked benchmark coming from a Chinese website. The GPU hits 4,905 points in 3DMark 11's Performance preset.
Today NVIDIA has announced a new line of graphics cards for professional applications. The new Tesla GPUs are called the K10 and K20 and are designed to perform exceptionally well in high performance computing (HPC) scientific and technical applications that use GPU-accelerated computing. These Teslas were designed to be high performance and extremely power efficient.
Kepler is three times as efficient as the Fermi architecture, which had established itself as a new standard for computing when it was released two years ago. The Tesla K10 features two GK104 chips which produce 4.58 teraflops of performance and 320GB/s of memory bandwidth. The K10 is optimized for oil and gas exploration and the defense industry.
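The quoted 320GB/s figure is consistent with two GK104s, each on a 256-bit bus, if we assume a 5GT/s effective GDDR5 data rate (the memory clock isn't stated in the announcement, so treat that rate as an assumption):

```python
# Aggregate memory bandwidth across the Tesla K10's two GK104 chips.
# Assumption (not given in the announcement): GDDR5 at 5 GT/s effective.
BUS_WIDTH_BITS = 256       # per-chip memory interface
EFFECTIVE_RATE_GTPS = 5.0  # assumed effective transfer rate
CHIPS = 2

per_chip_gbps = (BUS_WIDTH_BITS / 8) * EFFECTIVE_RATE_GTPS  # bytes/transfer x GT/s
total_gbps = per_chip_gbps * CHIPS
print(total_gbps)  # 320.0, matching the quoted 320GB/s
```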
The Tesla K20 is where the GK110 makes its first appearance. The K20 is the new flagship product for the Tesla line of GPUs and provides three times the compute performance of any Fermi-based Tesla product and supports the Hyper-Q and dynamic parallelism capabilities. The K20 is expected to be available in the fourth quarter of 2012.
NVIDIA right now hold the performance crown with their GEFORCE GTX 680 and GTX 690 GPUs, but AMD aren't just going to lie down and take it. The latest rumor spinning onto the Internet is that we should expect AMD to ramp up the Radeon HD 7970 reference core clock from 925MHz to 1GHz so that they can reclaim the single-GPU performance spot.
We already have cards clocking in at over 1GHz on the core from various partners, but a reference design from AMD would make this much easier. The rumored cards would launch as "GHz Edition" cards, which we already see in the 7800-series range. Why are AMD doing this now, and not at launch?
AtomicMPC had AMD explain that "yields are now better, their average voltage required to hit 925MHz is much lower than it was on early ES revisions, and most chips are happily hitting 1250MHz now". Is this enough for AMD to win back the performance crown? Or would NVIDIA just do the same thing and crank up their clock speeds once AMD do it? The competition is about to heat up, peeps.
The GEFORCE GTX 670 action over the weekend has been entertaining, to say the least. I was busy for most of the weekend and only had my smartphone to check the goings-on of the Internet, our site, our Facebook page, and to keep up with news.
If you didn't already know, Shane has posted up a glorious preview of the GTX 670's performance, Cameron smashed out an awesomely written piece on why we didn't receive the GTX 690 for review, and now we have news that GALAXY has a single-slot GTX 670 that will launch shortly.
Now we're staring down the barrel of a single-slot GTX 670 which sports the NVIDIA reference design PCB. The VRM area looks to be located near the front of the card, and the PCB appears to cut off at two-thirds the length of the card. Since the GTX 670's PCB is shorter, the cooler extending past it has to mean something, right?
Some ships are more leaky than others, just like some companies are more leaky than others. NVIDIA, at least with the Kepler launch, has been one of the more leaky companies of late. Pictures and benchmarks of the GTX 680 were surfacing weeks before the product launched and it seems like the same is holding true for the GTX 670.
That picture above is claimed to be the upcoming GTX 670. Accurate specifications such as CUDA core count, frequency, etc. are still unknown. Based upon the picture, the GTX 670 will still require two 6-pin PCIe power connectors, which seems a bit excessive. It also features two SLI connectors. The board itself is small, but still appears to use a dual-slot cooler.
The new card produces some respectable numbers in both 3DMark Vantage and 3DMark 11. The leaked benchmarks show the GTX 670, GTX 670 OC, GTX 570 OC, and HD 7950. Vantage performance sees the 670 earn 29471 whereas the 7950 earns only a paltry 24035. Moving to 3DMark 11, the 670 earns 7353 and the 7950 gets only 6418.
AnandTech described the beastly and impressive new NVIDIA GEFORCE GTX 690 dual-GPU video card rather well with just three adjectives in its review yesterday - Expensive, Rare and Fast. They, along with a bunch of other websites, got a review sample directly from NVIDIA. We didn't, but that was no surprise to us at all.
We have attempted for years to work with NVIDIA, but it hasn't worked. Some years ago we started breaking NVIDIA GEFORCE launch dates and posting our reviews early, on purpose, because NVIDIA would not support us properly. We didn't expect any more or less than the treatment that other media get, yet they would never send us review samples of new video cards. Our response was simple - you don't play nice with us, we won't play nice with you. We posted many GPU reviews well ahead of the launch time. Some will say we broke the NDA, but let's make it very clear - you have to have an NDA to begin with to break it. We haven't signed an NVIDIA NDA for a very long time. They'd have to communicate with us first for that to even happen...
Overclocking guru K|ngp|n has managed another incredible feat. It wasn't too long ago that he managed to take an NVIDIA GTX 680 up to 1957MHz with the help of some extreme cooling. This time he has managed to push an EVGA GTX 680 up to 1442MHz on basic air cooling alone. That is quite the feat and proves just how great the Kepler architecture is.
K|ngp|n used an EVGA GTX 680 SC, which features a default frequency of 1058MHz (1124MHz at Boost). He pushed 1.212V through the core of the GPU to achieve this feat. That voltage is at the very top of the voltage limit set by NVIDIA. The memory was also overclocked to 1812MHz which is also a big improvement.
Of course, when you are running a card like that, especially at those clocks, you need a strong system behind it so that it doesn't bottleneck. In this case, K|ngp|n used an i7-3960X CPU (overclocked to 4.98GHz) and memory clocked at 1245.7MHz (2490MHz effective). This setup achieved a 3DMark 11 score of P12745.
NVIDIA are on a roll, nothing can stop them right now it seems and now we have the first pictures of MSI's GEFORCE GTX 670. The GTX 670 will be NVIDIA's third SKU based on the GK104 GPU, and is set to compete directly with Team Red's Radeon HD 7950 GPU.
The card looks to be pure reference design, and only sports two 6-pin PCI-e connectors. I know you want to get into the specifications of the card itself, so let's do that, shall we? The GTX 670 sports 1344 CUDA cores, 112 TMUs, 32 ROPs, a 256-bit wide GDDR5 memory interface with 2GB of RAM, a core clock of 900MHz, and memory at 1250MHz, or 5GHz effective.
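From those quoted specs, peak memory bandwidth falls out of a one-line calculation, bus width in bytes times the effective transfer rate:

```python
# Peak memory bandwidth implied by the rumored GTX 670 specs:
# a 256-bit GDDR5 interface at 5 GT/s effective.
BUS_WIDTH_BITS = 256
EFFECTIVE_RATE_GTPS = 5.0

bandwidth_gbps = (BUS_WIDTH_BITS / 8) * EFFECTIVE_RATE_GTPS
print(bandwidth_gbps)  # 160.0 GB/s
```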
Non-reference cards that come through in the next few months should sport much shorter PCBs thanks to slim VRM requirements, and just 8 memory chips. Display outputs include two dual-link DVIs, one DisplayPort and one HDMI. The box does state "OC Edition", which means it should come out of the box with overclocked speeds. Also, I'm interested to see what this new technology "DispalyPort" can do, you'll know what I mean when you see it.
Word on the street is that NVIDIA is extremely happy with the 28nm yields of the Kepler architecture, and so they decided to launch two high-end cards before stripping down the chip and releasing more value-priced cards. Even with the good yields on the 28nm process, the GTX 680 is out of stock almost everywhere.
Apparently, even with the good yields, there is quite the selection of chips that aren't performing up to snuff. NVIDIA is, according to sources, preparing to launch two new, cut-down versions of the GK104. The GTX 670 (670Ti) will be powered by the GK104-335-A2, whereas the GTX 660 (660Ti) could be powered by a different revision.
The 670 is said to feature 1344 CUDA cores with a 256-bit memory bus and 2GB of GDDR5 memory. Clocks for the chip should be somewhere around 915-950MHz for the core and 1.25GHz for the memory. The 660 should feature a fully disabled GPC (Graphics Processing Cluster). This means it will feature 1152 CUDA cores with a cut-down 192-bit memory bus. This memory bus would force 768MB or 1.5GB of memory.
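The odd-sounding 768MB/1.5GB pairing falls straight out of the bus width: a 192-bit bus breaks down into six 32-bit channels with one GDDR5 chip per channel. A quick sketch (the 1Gbit/2Gbit chip densities are my assumption based on common parts of the time, not from the report):

```python
# Why a 192-bit bus "forces" 768MB or 1.5GB: six 32-bit channels,
# one GDDR5 chip each, at the era's common chip densities.
BUS_WIDTH_BITS = 192
CHANNEL_WIDTH_BITS = 32

channels = BUS_WIDTH_BITS // CHANNEL_WIDTH_BITS  # 6 channels

for chip_mb in (128, 256):  # assumed 1Gbit and 2Gbit GDDR5 chips
    print(f"{chip_mb}MB chips -> {channels * chip_mb}MB total")
# 128MB chips -> 768MB total
# 256MB chips -> 1536MB total, i.e. 1.5GB
```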
VR-Zone expects pricing to look like:
- $999 - GTX 690 4GB
- $579 - GTX 680 4GB OC (Preferred AIB Pricing)
- $499 - GTX 680 2GB
- $379 - GTX 670 4GB (Preferred AIB Pricing)
- $399 - GTX 670 2GB
- $249 - GTX 660 (Ti?) 1.5GB
Cards are expected to be announced sometime next week with wide availability by Computex in early June.
NVIDIA GTX 690 shows up in wooden crate, confirms what reviewers were thinking when crowbar showed up
Last week, NVIDIA sent out boxes that contained crowbars to reviewers around the web. The crowbar had the words "for use in case of zombies or..." and the NVIDIA logo. Nothing more and nothing less. LegitReviews hypothesized that it could be used to open a wooden crate and they were right: a wooden crate was delivered to their office this morning.
As you can see in the picture, the crate warns of "weapons grade gaming power" and has more writing on the side. "0b1010110010", which is binary for "690", is one of the lines on the side, but I can't decipher what the other two lines mean. Of course, you can already guess that the GEFORCE GTX 690 was inside.
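For the curious, the binary line really does decode to the card number; Python's built-in `int` with base 2 confirms it:

```python
# Decode the line printed on the crate: "0b1010110010" as base 2.
value = int("1010110010", 2)
print(value)  # 690, i.e. the GEFORCE GTX 690
```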
With the pry bar that was sent out last week employed, the top of the crate was no match. Inside sat the GTX 690 in all of its $999 glory. Drivers, as of yet, are unavailable and the GTX 690 is set to be available in limited supply May 3. It looks like reviewers will have fewer than 3 days to do their magic before the new card goes on sale.
"The GTX 690 is truly a work of art-gorgeous on the outside with amazing performance on the inside," said Brian Kelleher, senior vice president of GPU engineering at NVIDIA. "Gamers will love playing on multiple screens at high resolutions with all the eye candy turned on. And they'll relish showing their friends how beautiful the cards look inside their systems."
We've all been salivating over it, well, maybe not you, but I sure have. What are we salivating for? NVIDIA's answer to the question "who is your Daddy?" Well, their answer? The GEFORCE GTX 690.
Kepler has had an interesting launch, where NVIDIA just dumped the GTX 680 onto the market and pretty much said "have at it, everyone" and it was a great contender to the already fast Radeon HD 7970. But how good does the dual-Kepler GTX 690 need to be? Usually we get two decent cards with built-in SLI, but NVIDIA have opted for dual GTX 680s for the GTX 690.
This time round, we get two fully enabled GK104 cores bursting at the proverbial seams. NVIDIA are also setting some high targets for performance per watt, where NVIDIA are able to leverage two GK104 cores onto a single card without having to worry about it requiring super-cooling, or sucking down serious power. The GEFORCE GTX 690 also has something else high-end up its sleeve: its price. It will launch at $999, but did you really expect this beast to be cheap? Didn't think so.
Ah, RumorTT, it's my favorite part of the day. Recent rumbles are pointing toward NVIDIA launching its GEFORCE GTX 670 alongside their dual-Kepler-based GTX 690 graphics card. This comes from Fudzilla, who notes their sources are talking of a May 10 launch date for both GPUs.
We're just hours away from an official announcement of some sort at the GeForce LAN/NVIDIA Gaming Festival (NGF) in Shanghai, China, on April 28. This is where we should see NVIDIA's dual GK104 GPU, the GTX 690. Fudzilla's sources are saying that the official release date is the 10th of May, and that the GTX 690 should see a partner in its launch in the form of the GTX 670.
Availability, that's always an important question with GPU launches. GTX 690 availability is said to be not great at all, with some partners receiving cards next week. More news as it happens.
It's hard to have enough screen real estate to do everything that we want. And gaming across multiple monitors helps to engage the player all the more. This is why AMD created Eyefinity. Moreover, this is why today PowerColor released the only HD 7870 graphics card that supports up to 6 monitors using Eyefinity technology.
The HD 7870 Eyefinity 6 supports up to six monitors via its six onboard mini DisplayPort outputs. The graphics card comes clocked at 1000MHz on the core with a 1200MHz memory clock. It sports 2GB of memory on a 256-bit bus. And being a 7000 series card, it supports DirectX 11.1 so that all of the eye candy is available.
AMD have now announced that they won't be supporting the Radeon HD 4000 series and below in Windows 8. Most people will read this and be shocked, but the GPUs are old now, and by the time Windows 8 comes out, it will effectively be a 2013 release.
Windows should ship with some form of support for legacy Radeon cards, but AMD themselves won't be providing future driver updates for those GPUs. AMD have made this quite clear in their press release:
Also with regards to Windows 8 support for the AMD Radeon™ HD 2000, 3000, 4000 Series of products; the In-the-box AMD Graphics driver that ships with Windows 8 will include support for the AMD Radeon HD 2000, 3000, and 4000 Series, and it will support the WDDM 1.1 driver level features. The AMD Catalyst driver for Windows 8 will only include support for WDDM 1.2 support products (AMD Radeon HD 5000 and later).
We know Team Green are ready to launch their dual-GPU Kepler card, the GEFORCE GTX 690, but where are Team Red and their Radeon HD 7990? If current rumblings are to be believed, we should expect AMD to unveil the Radeon HD 7990 at Computex 2012 in Taipei.
This is only six weeks away, which means if NVIDIA drop the GTX 690 between now and then, they will have weeks of people talking about their product, building hype and what not. But it also gives AMD that same time to re-tweak their GPU and have it ready to open up a can of red whoop-ass on NVIDIA, hopefully.
The HD 7990 is expected to sport two full HD 7970 GPUs on a single PCB. It should also have 6GB of GDDR5 baked into it, as well as 4096 GCN cores, and the ability to run 6-screen Eyefinity setups right out of the box. We will be at Computex in force this year, and will have as much news as we can on this new Red Beast. Hopefully it'll punch all other GPUs in the nuts, again.