TweakTown News
Currently under investigation over at HardOCP: it appears that the anti-aliased screenshots of NVIDIA's GeForce FX don't do it justice, because the AA filters are applied both before the frame buffer and after the frame to be painted on screen has left it. Traditional screen-capturing utilities grab images straight from the frame buffer, so they miss the effect of the latter image-quality-enhancing filters. I'll let the last line of the quote speak for itself. Of course, NVIDIA and the rest of us would like to get to the bottom of this issue.
The GeForce FX's technology applies filters that effect anti-aliasing and anisotropic filtering before the frame buffer and after the frame has left the frame buffer. In short, this means that all of our screenshots do not accurately represent the true in-game visual quality that the GFFX can and will produce, as the screenshots were pulled from the frame buffer (in the "middle" of the AA process). We have come to conclusions about the GFFX IQ (Image Quality) that may be simply wrong.

More information @ HardOCP
I was recently asked in our forums what I thought of the whole GeForce FX fiasco. I gave my honest opinion and wanted to share it with our wider audience - here is my stance on the whole situation. Remember, it is just my opinion, unbiased toward ATI or nVidia - as you'll notice, neither of them advertise with us, so frankly I don't care what either company thinks of my thoughts, unlike possibly some other web media.
I think everyone should wait at least another month to see what third party companies come up with before passing judgment on the GeForce FX - don't lose sight of the fact that the 5 or 6 online reviews we have seen to date are of a reference design, and as always, third party manufacturers go ahead and improve on it to compete with other companies doing the exact same thing. And not just performance-wise this time - cooling is the clear case with the GeForce FX. Let me assure you, after talking to many third party companies today, they are all just as concerned with the noise issues as you and I, and are all working hard to come up with more noise-tolerable solutions.
I tend to agree with the latest post made over at nV News by Typedef Enum saying that many websites hyped the NV30 so much (we are probably even guilty of it) that everyone was expecting the McLaren F1 of video cards. Obviously that did not happen, and it seems the entire tech community is suddenly bitter towards nVidia for it. Let me lay it on the line for you all - nVidia had the lead over ATI and every other GPU maker on the market and have enjoyed much industry support for the past 4 or 5 years. Now watch ATI soak it all up for 2003 and possibly beyond, as they should, while nVidia sit back and scratch their heads for an answer. This is the key reason we've seen companies such as Gigabyte and Creative switch to ATI - they aren't stupid by any means, they know ATI and their future plans and goals. Hell, if anything it makes things better for us as both companies compete to be king - let's just sit back and soak it up ourselves.
There comes a point in time where frames per second are just that - simple frames per second. They've gotten so high now that there is no need to see anything faster, at least in my opinion. Instead of looking for an outright speed king, why don't some of you folks enjoy more of the eye candy on offer and let the truly talented programmers of our world blow us away with unprecedented life-like detail, instead of worrying about how high your 3DMark 2001 SE score is? Seriously guys, Dawn (the sexy nVidia elf chick) was just a start - wait till programmers and developers around the world make use of Cg (ATI will also support it in their R350) and continue to perfect their art, and we'll be back to seeing frame rates in the 60-80 range, as told to us by nVidia - then and only then will frame rates matter and come back into play like they've mattered so much over the past few years.
Frames per second are just numbers, guys - remember that. Why drive your car at such extreme speeds that you miss the beautiful scenery on the side of the road?
- All of your opinions and comments and flames can be made in our Video Card Forums
As predicted by many, today would be the day nVidia open the curtains on their highly anticipated GeForce FX GPU. Those who prophesied were indeed correct, as all the big boys post their first looks at nVidia's new flagship model, the GeForce FX 5800 Ultra.
You can read GeForce FX related content at the following sites, here they are in no particular order:
The folks at nVidia neglected to allocate us a sample under NDA, which we cannot help but be highly disappointed by - maybe next time, hey guys?
We have just posted our GeForce FX Q&A interview with nVidia where they were kind enough to answer our questions regarding their upcoming GeForce FX.
TT - Over the last few days several companies released specifications and system requirements for the GeForce FX. Most notable is the 350 watt power supply requirement - will the GeForce FX actually require a minimum of 350 watts to operate in a stable environment, or is this just a precaution?
nV - You hit the nail on the head. The card itself needs less than 75W, but when combined in a typical enthusiast system with high performance CPUs, multiple hard drives or DVD/CD players, the total system requirements will tend to need a 350W supply. There may be individual configurations that could get away with less, but when you buy your Ferrari you don't want to forget to fill up the gas tank.
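nVidia's reasoning can be sanity-checked with a rough power budget. The sketch below is purely illustrative - the per-component wattages (apart from the card's stated sub-75W draw) are my own guesses for a 2003-era enthusiast system, not figures from nVidia:

```python
# Rough system power budget (illustrative figures, not official numbers).
budget_watts = {
    "GeForce FX card": 75,     # nVidia's stated card draw (under 75W)
    "high-end CPU": 70,        # assumed
    "motherboard + RAM": 40,   # assumed
    "2 hard drives": 25,       # assumed
    "DVD/CD drives": 25,       # assumed
    "fans, USB, misc": 25,     # assumed
}
total = sum(budget_watts.values())
print(f"Estimated draw: {total}W")

# PSUs are least efficient and least stable near their rated limit,
# so a healthy margin pushes the recommendation up toward 350W.
headroom = 350 - total
print(f"Headroom on a 350W supply: {headroom}W")
```

On those assumed numbers the system lands around 260W, which makes the 350W recommendation look like sensible headroom rather than the card alone demanding it.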
It looks like Brian Burke from nVidia finally decided to send out the official specifications of the GeForce FX in its two confirmed flavours - 5800 and 5800 Ultra - to several nVidia fan sites. Australian website GeForce Zone have specs on both flavours of the card and two pictures posted on their front page of what will likely be the final reference design by nVidia, as cards prepare for store shelf release any day now.
Hey Brian, wanna add us to your e-mail list for next time, buddy?
More information @ GeForce Zone
We expect to receive answers from Hazel and the rest of our friends at nVidia Singapore first thing next week regarding the GeForce FX interview questions. Earlier in the week we gave you the opportunity to submit your own questions to nVidia, with the aim of getting some straight, to-the-point facts on the subject matter.
The questions and answers had to be approved by nVidia over in the United States, and it seems this process took a little longer than expected. We've been told to expect our GeForce FX Q&A returned first thing next week, at which point you guys will be the first to know about it once it is posted up here at TweakTown - keep your ears and eyes peeled!
NVIDIA has announced their workstation Quadro FX lineup, based upon the infamously delayed GeForce FX chip. The page provides a wealth of information about the card, as well as a few benchmarks of the two Quadro FX variants, the FX 1000 & FX 2000. Be sure to take the board tour, as it demonstrates that the cooler on Quadro FX boards is of the more traditional design (i.e. doesn't take up another PCI slot).
The NVIDIA Quadro® FX family delivers the fastest application performance and the highest quality workstation graphics. But raw performance and quality are only the beginning - NVIDIA Quadro FX takes the leading computer-aided design (CAD) and digital content creation (DCC) applications to a new level of interactivity by enabling unprecedented capabilities in programmability and precision. For the first time, styling and production rendering become integral functions of the design workflow, shortening the production process and enabling faster time to market.

More information @ NVIDIA
Hazel Heng, Marketing Manager, Asia Pacific of nVidia, has agreed to answer a list of questions regarding their upcoming GeForce FX with the aim to get some definite answers once and for all!
If there are any specific questions you want answered, please list them at the following link and if suitable, I will add them to the list of interview questions I send off to nVidia Singapore later today.
More information in our Video Cards Forum
Rancho over at Warp2Search discovered a list of specifications for BFG Technologies' upcoming GeForce FX. You can pre-order the 128MB version for a pocket-hitting $400 US, with the 256MB version expected to cost around $500 US according to many sources.
Just remember, if you want to run one of these powerhouses, you are going to require a minimum 350 watt power supply unit.
Specifications (more information @ Warp2Search):
Controller: NVIDIA GeForce FX
Bus type: AGP
Memory: 128MB DDR-II
Memory Bandwidth: 16 GB/second (32 GB/second with compression)
Core clock: 500MHz
Memory clock: 500MHz (1000 DDR-II)
RAMDAC: 2 @ 400MHz each
Connectors: VGA, VIVO, DVI-I
200 million triangles/sec
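The 16 GB/second figure follows from the memory clock in the list above; a quick back-of-the-envelope check, assuming the GeForce FX's 128-bit memory bus (the bus width is not stated in the specs):

```python
# Sanity-check the quoted memory bandwidth from the spec list.
mem_clock_mhz = 500        # memory clock from the specs
ddr_factor = 2             # DDR-II transfers data on both clock edges
bus_width_bits = 128       # assumed bus width, not in the list above
bytes_per_transfer = bus_width_bits // 8

bandwidth_gb_s = mem_clock_mhz * 1e6 * ddr_factor * bytes_per_transfer / 1e9
print(f"{bandwidth_gb_s:.0f} GB/s")  # prints "16 GB/s"
```

That matches the quoted raw figure; the 32 GB/second number relies on nVidia's colour compression, so it is an effective rate rather than raw bus throughput.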
CompUSA expect shipment of the PNY Verto GeForce FX graphics card on February 05, 2003 - that is only 18 days away. You can pre-order one of these cards for $399.99 US, and I'd hop to it straight away if you want half a chance at getting one anytime soon - if you aren't already too late.
And if you haven't had enough of your nVidia dosage today, you will find a bunch of new GeForce FX card shots here from Comdex Nordic 2003 in Gothenburg, Sweden, thanks to our friends over at nV News.
More information @ CompUSA