Venny Yu from Albatron in Taiwan let us know about their Gigi GeForce FX5800 series of graphics cards. We have been promised testing samples on the 20th of March; we suspect Albatron will start shipping into retail channels at that time as well.
Feb. 2003 - Albatron Technology announced its latest graphics accelerator, the Gigi GeForce FX5800, featuring nVIDIA's highly touted GeForce FX5800 chipset, otherwise known as the NV30. Albatron rolls out this card with an impressive lineup of lightning fast components, including DDR II memory and AGP 8X. The DDR II memory interface comes with 128 MB and can clock up to 800 MHz. The GPU core clock is rated at 400 MHz, and the chip is built on a 0.13-micron process, which boosts performance. Beyond the performance factors, Albatron has designed a brand new butterfly-girl mascot named 'Gigi', whose image will appear on the product's color box and all other promotional material.
More information in our Video Cards Forum
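As a quick sanity check on those numbers - and assuming the NV30's 128-bit memory interface, which the press release doesn't state - the peak theoretical memory bandwidth works out like this:

```c
#include <stdio.h>

int main(void)
{
    /* Figures from the press release above; the 128-bit bus
       width is an assumption about the NV30, not from Albatron. */
    const double effective_mhz  = 800.0;  /* DDR II effective clock */
    const double bus_width_bits = 128.0;

    /* Peak bandwidth = effective clock * bus width in bytes */
    double gb_per_sec = effective_mhz * 1e6 * (bus_width_bits / 8.0) / 1e9;

    printf("Peak memory bandwidth: %.1f GB/s\n", gb_per_sec);
    return 0;
}
```

That comes out to 12.8 GB/s of raw memory bandwidth feeding the 400 MHz core.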
Grace from Prolink just sent us through the ad for their GeForce FX 5800 Ultra which includes specifications and a small picture of the card itself.
View the larger version of the image in our Video Card Forums
Currently under investigation over at HardOCP: it appears that the Anti-Aliased pictures of NVIDIA's GeForce FX don't do it justice, due to the manner in which the AA filters are applied both before and after the frame leaves the frame-buffer. Traditional screen-capturing utilities typically capture images from the frame-buffer, thus neglecting the effect of the latter image-quality-enhancing filters. I'll let the last line of the quote speak for itself. Of course, NVIDIA and the rest of us would like to get to the bottom of this issue.
The GeForceFX's technology applies filters that affect AntiAliasing and Anisotropic filtering before the frame buffer and after the frame has left the frame buffer. In short, this means that all of our screenshots do not accurately represent the true in-game visual quality that the GFFX can and will produce, as the screen shots were pulled from the frame buffer (in the "middle" of the AA process). We have come to conclusions about the GFFX IQ (Image Quality) that may be simply wrong.
More information @ HardOCP
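To make the capture problem concrete, here is a minimal sketch - assuming an OpenGL/GLUT environment, not any tool HardOCP actually used - of how a typical screenshot utility reads pixels straight out of the frame-buffer. Anything a card applies after this stage of the display pipeline never makes it into the captured image:

```c
#include <GL/glut.h>
#include <stdio.h>
#include <stdlib.h>

#define WIDTH  640
#define HEIGHT 480

static void display(void)
{
    unsigned char *pixels = malloc(WIDTH * HEIGHT * 3);

    glClear(GL_COLOR_BUFFER_BIT);
    /* ... the scene would be rendered here ... */
    glutSwapBuffers();

    /* Grab the front buffer - the point most capture tools read
       from, BEFORE any post-frame-buffer filtering is applied. */
    glReadBuffer(GL_FRONT);
    glReadPixels(0, 0, WIDTH, HEIGHT, GL_RGB, GL_UNSIGNED_BYTE, pixels);

    printf("Captured %dx%d of raw frame-buffer contents\n", WIDTH, HEIGHT);
    free(pixels);
}

int main(int argc, char **argv)
{
    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGB);
    glutInitWindowSize(WIDTH, HEIGHT);
    glutCreateWindow("frame-buffer capture sketch");
    glutDisplayFunc(display);
    glutMainLoop();
    return 0;
}
```

If the GeForce FX's AA filtering really does happen partly after the frame leaves the frame-buffer, a capture like this would miss it entirely, which is exactly HardOCP's point.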
I was just recently asked in our forums what I thought of the whole GeForce FX fiasco. I gave my honest opinion and wanted to share it with our wider audience. Here is my stance on the whole situation - and remember, it is just my opinion, unbiased toward either ATI or nVidia. As you'll notice, neither of them advertises with us, so frankly I don't care what either company thinks of my thoughts, unlike perhaps some other web media.
I think everyone should wait at least another month to see what third party companies come up with before making a judgment on the GeForce FX. Don't lose sight of the fact that the 5 or 6 online reviews we have seen to date are of a reference design, and as always, third parties go ahead and improve on it to compete with other companies doing the exact same thing. And not just performance-wise this time - cooling too, as is clearly the case with the GeForce FX. Let me assure you, after talking to many third party companies today, they are all just as concerned with the noise issues as you and I, and are all working hard to come up with quieter solutions.
I tend to agree with the latest post made over at nV News by Typedef Enum, saying that many websites hyped the NV30 so much (we are probably even guilty of it) that everyone was expecting the McLaren F1 of video cards. Obviously it did not happen, and it seems the entire tech community is suddenly bitter towards nVidia for it. Let me lay it on the line for you all - nVidia has had the lead over ATI and every other GPU maker on the market, and has enjoyed huge industry support for the past 4 or 5 years. Now watch ATI soak it all up for 2003 and possibly beyond, as they should, while nVidia sits back and scratches its head for an answer. This is the key reason we've seen companies such as Gigabyte and Creative switch to ATI - they aren't stupid by any means; they know ATI and their future plans and goals. Hell, if anything it makes things better for us, as both companies compete to be king - let's just sit back and soak it up ourselves.
There comes a point in time where frames per second are just that - simple frames per second. They've gotten so high now that there is no need to see anything faster, at least in my opinion. Instead of looking for an outright speed king, why don't some of you folks enjoy more of the eye candy on offer and let the truly talented programmers of our world blow us away with unprecedented life-like detail, instead of worrying about how high your 3DMark 2001 SE score is? Seriously guys, Dawn (the sexy nVidia elf chick) was just a start - wait till programmers and developers around the world make use of Cg (ATI will also support it in their R350) and continue to perfect their art, and we'll be back to seeing frame rates in the 60-80 range, as told to us by nVidia. Then, and only then, will frame rates matter and come back into play like they've mattered so much over the past few years.
Frames per second are just numbers, so remember that. Why drive your car at such extreme speeds that you miss the beautiful scenery on the side of the road?
- All of your opinions and comments and flames can be made in our Video Card Forums
As predicted by many, today was the day nVidia opened the curtains on their highly anticipated GeForce FX GPU. Those who prophesied were indeed correct, as all the big boys posted their first looks at nVidia's new flagship model, the GeForce FX 5800 Ultra.
You can read GeForce FX related content at the following sites, here they are in no particular order:
The folks at nVidia neglected to allocate us a sample under NDA, which we cannot help but be highly disappointed by - maybe next time, hey guys?
We have just posted our GeForce FX Q&A interview with nVidia where they were kind enough to answer our questions regarding their upcoming GeForce FX.
TT - Over the last few days several companies released specifications and system requirements for the GeForce FX. Most notable is the 350 watt power supply requirement - will the GeForce FX actually require a minimum of 350 watts to operate in a stable environment, or is this just a precaution?
nV - You hit the nail on the head. The card itself needs less than 75W, but when combined in a typical enthusiast system with a high performance CPU, multiple hard drives or DVD/CD drives, the total system requirements will tend to need a 350W supply. There may be individual configurations that could get away with less, but when you buy your Ferrari you don't want to forget to fill up the gas tank.
More information in our Video Card Forums
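To put that answer in perspective, here's a rough power budget for the kind of enthusiast system nVidia describes. The component figures below are illustrative assumptions, not measured values - only the sub-75W card draw comes from nVidia's answer:

```c
#include <stdio.h>

/* Assumed peak draws for a typical 2003 enthusiast system -
   illustrative figures only, except the GeForce FX number,
   which nVidia puts at under 75W. */
struct part { const char *name; int watts; };

int main(void)
{
    struct part parts[] = {
        { "GeForce FX graphics card",     75 },
        { "High performance CPU",         70 },
        { "Motherboard, chipset and RAM", 35 },
        { "Two hard drives",              30 },
        { "DVD and CD-RW drives",         25 },
        { "Fans, USB devices, misc.",     20 },
    };
    int n = sizeof(parts) / sizeof(parts[0]);
    int i, total = 0;

    for (i = 0; i < n; i++) {
        printf("%-32s %3d W\n", parts[i].name, parts[i].watts);
        total += parts[i].watts;
    }
    printf("%-32s %3d W\n", "Total peak draw", total);

    /* Power supplies split their rating unevenly across voltage
       rails and shouldn't run near their limit, so a ~255W peak
       draw is exactly why a 350W unit gets recommended. */
    return 0;
}
```

The parts here only sum to around 255W at peak, which lines up with nVidia's point that some leaner configurations could get away with a smaller supply - the 350W figure simply builds in headroom.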
It looks like Brian Burke from nVidia finally decided to send out the official specifications of the GeForce FX in its two confirmed flavours - 5800 and 5800 Ultra - to several nVidia fan sites. Australian website GeForce Zone have specs on both flavors of the card and two pictures posted on their front page of what will likely be the final reference design by nVidia, as cards prepare for release to store shelves any day now.
Hey Brian, wanna add us to your e-mail list for next time, buddy?
More information @ GeForce Zone
We expect to receive answers from Hazel and the rest of our friends at nVidia Singapore first thing next week regarding the GeForce FX interview questions. Earlier in the week we gave you the opportunity to submit your own questions to nVidia, with the aim of getting some straight, to the point facts on the subject matter.
The questions and answers had to be approved by nVidia over in the United States, and it seems this process took a little longer than expected. We've been told to expect our GeForce FX Q&A returned first thing next week, at which point you guys will be the first to know about it once it is posted up here at TweakTown - keep your ears and eyes peeled!
NVIDIA has announced their workstation Quadro FX lineup, based upon the infamously delayed GeForce FX chip. NVIDIA's product page provides a wealth of information about the cards, as well as a few benchmarks of the two Quadro FX variants, the FX 1000 and FX 2000. Be sure to take the board tour, as it demonstrates that the cooler on Quadro FX boards is of the more traditional design (i.e. it doesn't take up another PCI slot).
The NVIDIA Quadro® FX family delivers the fastest application performance and the highest quality workstation graphics. But raw performance and quality are only the beginning - NVIDIA Quadro FX takes the leading computer-aided design (CAD) and digital content creation (DCC) applications to a new level of interactivity by enabling unprecedented capabilities in programmability and precision. For the first time, styling and production rendering become integral functions of the design workflow, shortening the production process and enabling faster time to market.
More information @ NVIDIA
Hazel Heng, Marketing Manager, Asia Pacific at nVidia, has agreed to answer a list of questions regarding their upcoming GeForce FX, with the aim of getting some definite answers once and for all!
If there are any specific questions you want answered, please list them at the following link and if suitable, I will add them to the list of interview questions I send off to nVidia Singapore later today.
More information in our Video Cards Forum