
NV30




weta
10-01-2002, 08:45 AM
Has anyone seen any new information on the NV30?
Will this card be as good as claimed, and will it be available this year?

Wiggo
10-01-2002, 11:07 AM
Nothin' new that I've seen lately (though I've been a bit preoccupied since the missus has been in hospital), but I bet that quite a few ppl who bought Radeon 9700s are goin' to be kickin' themselves for not waiting till November. ;)
:cheers:

saidin
10-01-2002, 11:16 AM
I think the NV30 will be a great card, but I do not think that the 9700 owners will be kicking themselves in the arse....
both cards will be amazing!!!

weta
10-02-2002, 07:53 AM
Sorry to hear about your wife, I hope it's nothing too serious.
I want to upgrade my present card (Hercules Prophet II 64MB GTS), but I'm not sure whether to wait for the NV30 or buy a Ti4600 8x now.

bigjackusa
10-02-2002, 08:30 AM
I'd say buy the Radeon now, or wait for the NV30. If you're on a budget, the gf4600s should drop in price with the introduction of the new Nvidia cards. Do you have a fast processor? You'll want a pretty fast one to take advantage of the hot cards. Does your mobo support AGP 8X?

weta
10-02-2002, 09:03 AM
My current motherboard, an Abit KR7-133, doesn't support 8x, but it should still run either the 4600 or the NV30 at 4x. I understand the additional bandwidth makes very little difference to the card's overall performance.
My intention is to either buy a 4600 now, or an NV30 when available, and then upgrade to an Abit/AMD K8 motherboard/processor next year.

bigjackusa
10-02-2002, 09:17 AM
Then I'd say wait for the NV30. If vid cards keep up with their current design cycle, the 4600 is going to look kind of dated in 6-8 months.

saidin
10-02-2002, 11:06 AM
With computer life cycles... EVERYTHING looks dated in 6-8 months. lol :laugh:

weta
10-03-2002, 08:31 AM
Guess it would have helped if I had posted this thread in the correct forum, sorry about that.
Several sites are saying that Nvidia will officially announce the NV30 at Comdex (November 18-22), so I can't see the card being available before late Jan or early Feb next year.

Wiggo
10-03-2002, 01:02 PM
nVidia will want these cards on shelves before Xmas or they will lose a lot of Xmas dollars. Pre-Xmas sales are usually the best profit-making time of the year for these companies and they don't want to miss it, but because nVidia are cuttin' it fine you'll want to be quick to get on a pre-order list if you can, as these will go fast once in. ;)
:cheers:

pga1234
10-03-2002, 04:06 PM
Yeah, they will be going fast, but how much are they going to cost when they come out? Like $385 US or higher? I would like to wait, but if they come out around the first few weeks of December that is already a 2 1/2 month wait, plus having to wait till they drop in price a little and become widely available. So I am probably just going to go with the Radeon.

Wiggo
10-03-2002, 05:31 PM
Most likely it'll be their usual $399 US (unless manufacturing costs have gone up substantially), which somehow equals $975 AUS here, but don't ask how they work that one out. :confused:
:cheers:

weta
10-03-2002, 07:28 PM
Several sites are saying that Derek Perez (PR Manager) has confirmed that Nvidia will launch the NV30 at Comdex in November.
Now we will have to wait and see what all the fuss is about.

weta
10-04-2002, 12:46 AM
Spotted this article over at the Inquirer: it's not about clock speed, it's all about yields.
http://www.theinquirer.net/?article=5665

Wiggo
10-04-2002, 12:55 AM
It's because of poor yields that this card is so late to market; it should have been here before the new Radeons. ;)
:cheers:

weta
10-04-2002, 02:40 AM
From what I've read the yields could be extremely low, initially around 10-15% because of the 0.13 micron process.

Wiggo
10-04-2002, 11:54 AM
Yields have been a prob for nVidia since just before the GF4 launch (which was late too), and that's why nVidia is slowly getting later and later with their products. ;)
:cheers:

ReSpAwN DeMoN
10-04-2002, 12:18 PM
And because of this delay Nvidia is releasing more GeForce 2 and GeForce 3 based video cards under the GeForce 4 name, just so they can stay current before the NV30 blows the world away.

pga1234
10-04-2002, 02:31 PM
OK, about how much better is the NV30 going to be than the Radeon 9700? But here's the better question: when will games support such high graphics? I mean, there are no real games out now that a GeForce 3 can't play quite well. Except in terms of FPS, how is the NV30 going to be able to "blow away" the Radeon? And in terms of FPS, how many more is it going to get over the Radeon? 15-40, or what?

ReSpAwN DeMoN
10-04-2002, 09:27 PM
Well, I saw some "leaked" benchmarks and by the looks of it the NV30 has about 3 or 4 times the power. That is what I mean by "blow away" or "serious thrashing". Anyway, the only real way to tell is to wait for them to come out. Knowing Nvidia, it will be a serious card.

Wiggo
10-04-2002, 09:43 PM
Some leaked benches have the NV30 as far in front of the 9700 as the 9700 is in front of a GF4 Ti4600. How accurate they are I can't really say, but the same site was pretty accurate for the 9700. ;)
:cheers:

drpeterbright
10-05-2002, 04:24 AM
As long as it makes everything else cheaper, it's great. It looks like more overkill, but I want it anyway.

bigjackusa
10-05-2002, 04:47 AM
No matter what the PR says, I personally think that Nvidia was a little surprised at the performance and acceptance of the 9700, and they probably decided they had to extend the features and capabilities of the NV30 after they thought they were done with the design. And if they are having any driver issues with the new stuff, I doubt they will release until those are resolved, considering all the flak ATI caught over the 8500 drivers.

weta
10-05-2002, 08:39 AM
The NV30 pushes beyond DX9, which is why Microsoft are releasing DX9.1 in early 2003.
Will Rockstar design the PC version of GTA Vice City around DX9? That would make one serious game.
But you're right, I don't think there are any games due that can push either the 9700 or the NV30 to their limits. (UT2003?)

pga1234
10-05-2002, 01:48 PM
GTA and Doom 3 are the only games that will be able to even scrape their limits. I think that graphics cards are developing faster than the current video game manufacturers can handle...

weta
10-06-2002, 08:10 AM
What will the NV30 be called?
Geforce 5, Videoforce, Rampage, Annihilator, any ideas?

FlipMoeJack
10-06-2002, 12:40 PM
GTA and Doom 3 are the only games that will be able to even scrape their limits. I think that graphics cards are developing faster than the current video game manufacturers can handle...



I dunno. So far Morrowind has been extremely tough to play with a good framerate, that of course with all the options on. Turning shadows and view distance up really kills most systems.

weta
10-07-2002, 08:48 AM
With the NV30 launch confirmed for next month, what memory will Nvidia be using? DDR 2 isn't available yet, is it?
ATI are considering the use of GDDR 3 on their future cards; this is a high-bandwidth memory designed for graphics applications.

weta
10-09-2002, 09:09 AM
According to PC Games, a German website, after the NV30 is launched Epic will be releasing a patch that allows UT2003 to be played in "Ultra High Detail".

Nick
10-14-2002, 08:39 PM
I dunno. So far Morrowind has been extremely tough to play with a good framerate, that of course with all the options on. Turning shadows and view distance up really kills most systems.

Game makers would have NO trouble bringing the fastest PC in the world to its knees... Remember, they have to make games that the majority of people can actually play. If everyone had R9700s, then games would be far more detailed, and far more demanding on system resources.

Ehhe Yebeb
10-14-2002, 09:21 PM
Game makers would have NO trouble bringing the fastest PC in the world to its knees... Remember, they have to make games that the majority of people can actually play. If everyone had R9700s, then games would be far more detailed, and far more demanding on system resources.

I think that's going a bit far, and could probably only really be done with a whole bunch of extremely sloppy coding.

ReSpAwN DeMoN
10-14-2002, 09:22 PM
So you're basically saying that id Software has a bunch of sloppy coding?

pga1234
10-14-2002, 09:49 PM
:rolleyes2, I dunno. But anyway I think that I will be happy with my Radeon for a while.

weta
10-15-2002, 05:21 AM
Another article on the NV30: 400MHz VPU and 128-bit memory, according to The Inquirer. http://www.theinquirer.net/?article=5794

saidin
10-15-2002, 09:27 AM
Yeah, you will be happy with the Radeon for a while. But I think the NV30 will blow the Radeon out of the water... but you won't notice the difference until games come out that can take advantage of the hardware (which might be a while!)

But 400MHz... that is sick, sick I say. Absolutely sick. I still remember the day I got my P2 300MHz and thought it was badass. hehe

Ehhe Yebeb
10-15-2002, 09:30 AM
So you're basically saying that id Software has a bunch of sloppy coding?

No, but it's also not going to follow Nick's statement.

ReSpAwN DeMoN
10-15-2002, 11:05 AM
Well, yeah, but you gotta understand something. There are companies out there that do have the capability of bringing current computer hardware to its knees. id is the number one example of this. Every release they have ever come out with has brought PC hardware to a crawling state, except maybe Return to Castle Wolfenstein, as it is based on the Quake 3 engine, but it does have better graphics, I will give it that. The original Doom series bogged down the x86 family (286, 386, 486), and the entire Quake series did it as well. As for Quake 3, which is now a more than 3 year old game, it is STILL to this very day used for benchmarking and system stability tests. Manufacturers all over are STILL using the Quake 3 engine as a development platform for their games. EA did it with MOH. Just think, a more than 3 year old game has all this power. You can't even begin to remotely imagine what Doom 3 will have the power to do. It's supposed to replace the Quake 3 engine, and Quake 4 will be built upon its technologies. I guess that's if id still owns the Quake rights, as I heard it was sold to Raven Software. IMHO Doom 3 will probably give a Pentium 4 2.8GHz and an NV30 a good run.

weta
10-19-2002, 06:35 AM
According to The Inquirer, the NV30 will use DDR II memory clocked at 1000MHz.
This has to confirm that the NV30 will be a Jan/Feb 2003 part, with the possibility of a very small number of cards being available for Christmas.

ReSpAwN DeMoN
10-19-2002, 06:37 AM
I didn't know about the DDR2 part, but yeah, a timeframe that far out is exactly right. That's why Nvidia is releasing more GeForce 2 and GeForce 3 based cards as GeForce 4 cards by the holidays, just to stay in the competition.

pga1234
10-21-2002, 12:49 PM
According to The Inquirer, the NV30 will use DDR II memory clocked at 1000MHz.
This has to confirm that the NV30 will be a Jan/Feb 2003 part, with the possibility of a very small number of cards being available for Christmas.

That would be terrible marketing... For some reason I have a hard time believing that.

weta
10-23-2002, 06:56 AM
You're right, they must have at least one version of the NV30 on the shelves at Christmas, otherwise they'll look like muppets.
But I can't see it being the 400MHz version with the 1000MHz DDR II memory. It might be the 350MHz version with 800MHz memory.
I hope I'm wrong; I'm desperate to upgrade my existing Hercules Prophet II GTS. (I could always buy a 3D Prophet 9700 Pro instead.)

weta
10-23-2002, 08:00 AM
In an interview with Florence Shih, Sales and Marketing Director of Abit, Virtual Zone asked her when they (Abit) would be launching their NV30 card.
Her answer: Nvidia is expected to announce the NV30 at Comdex, and we will announce ours at the same time. An Abit NV30 with OTES 2 cooling,
that's gotta fly... http://www.vr-zone.com/Interviews/ABIT/

weta
10-23-2002, 10:44 AM
There are a couple of articles at PC Rave that I think relate directly to the NV30.
1. Yesterday ATI demonstrated a Radeon 9700pro running on DDR 2 memory.
2. Microsoft are now saying that DirectX9 may not come out until next year.
Microsoft/Nvidia have worked very closely on this program, so Nvidia wouldn't be too happy if it
was released before they had a product ready to support it, especially when ATI has.

weta
10-24-2002, 06:21 AM
Of course, in order to keep the word that was given to investors, the company may sell 500 to 1500...

http://www.xbitlabs.com/news/story.html?id=1035331876

weta
10-27-2002, 06:02 AM
Are you ready? Link (http://www.nvidia.com/content/areyouready/index.html)

weta
10-30-2002, 02:57 AM
Apparently samples of the NV30 chip were passed on to some of Nvidia's board partners last week.
Also, Beyond 3D has a very good article comparing the R300 and NV30; it goes into great depth. Link (http://www.beyond3d.com/articles/nv30r300)

weta
10-30-2002, 08:29 PM
Nvidia's ARE YOU READY? links have been updated again.

Are you ready? now includes a pretty cool screensaver Link (http://www.nvidia.com/content/areyouready/sdesktop_logon.html)
Re the PASSWORD, here's a clue: what does Nvidia's processor name, TNT, stand for?
Are you ready? (Movie) Link (http://www.nvidia.com/content/areyouready/player_select.html#)

Geforce, Geforce 2, Geforce 3, Nforce, Geforce 4, Nforce 2, Vforce FX, just a thought.

weta
11-05-2002, 08:30 AM
Nvidia Geforce Mania's Day

More NV30 facts

1. 125m transistors (10m more than the 9700pro)
2. 51,000,000,000 floating point operations per second
3. Increased performance, more than 2x the Geforce 4 4600ti
4. 3D real-time cinematic rendering

Technoa have loads of slides on their site, taken from the presentation
Link (http://www.technoa.co.kr/) see news, click on Nvidia Mania's Day

John L
11-06-2002, 02:56 AM
I heard as of late that the NV30 will not be sold retail until early 2003. The waiting is killing me! :crazy:

weta
11-06-2002, 04:15 AM
You're right; whilst sample chips have been passed on to OEMs already, production chips won't be available until January 2003.

Anandtech (Anand goes West)

Nvidia was over today showing Matthew and I something very impressive that you'll be able to read about in the
coming week(s) Link (http://www.anandtech.com/showdoc.html?i=1741&p=4) (This has to be the NV30)

Win an NV30

There's a great competition at IGN.com: the first prize is a trip for you and a friend to see the NV30 launch, an NV30 graphics card,
and computer games for a year. You have to register with IGN in order to enter.

weta
11-08-2002, 09:10 AM
NV30 to ship in January 2003

Wiggo
11-08-2002, 10:16 AM
You deserve a few :beer: :beer: :beer: now. :thumb:

:cheers: :cheers: :cheers:

E^vol
11-09-2002, 04:00 PM
I want one dammit !

weta
11-09-2002, 11:06 PM
Extraordinary Engineering Link (http://www.nvidia.com/content/areyouready/story.html)

In the middle of the article there's a shot of a processor; is that the NV30 VPU?

weta
11-11-2002, 03:20 AM
See what you've been missing

weta
11-13-2002, 06:16 AM
If you believe the email Cameron received from Nvidia, it would
appear that they intend to continue using the GeForce name.

"NVIDIA invites you to join us at the Bellagio Hotel in Las Vegas
as we take the GeForce experience to a whole new level. NVIDIA
will redefine the limits of graphics engineering, performance, and
visual quality - right before your very eyes."

Nvidia's CEO is now saying that the NV30 will not be available
until January/February 2003. He wants improved yields in order
to keep costs down; after all, he has to recover a $400 million
investment and then return a profit.

pga1234
11-14-2002, 06:51 PM
Well I am glad I got my Radeon. I would have to wait till Feb to get my hands on an NV30 :(.

weta
11-15-2002, 03:09 AM
Seems to me you made a smart move buying the 9700; by the time we can purchase an NV30, you'll be able to get the new Radeon 10000.

GeForceFX (http://www.geforcefx.com)

weta
11-16-2002, 07:25 AM
Here's a couple of links that might interest some of you.

1. The Inquirer (http://www.theinquirer.net/?article=6260) has yet more facts and figures on the NV30.

2. Nvidia's Are You Ready? (http://www.nvidia.com/content/areyouready/index.html) site has a new video entitled 'The Break In'

To enter, click on the N, log in as Guest, current password CineFX (this will change), click on Network, see Video 'The Break In'.

In it, you will see a monitor with images being rendered at a pretty impressive rate, a tray containing what could be the NV30
chipset, and lastly, what appears to be the NV30 videocard.


weta
11-17-2002, 06:38 AM
Found this Nvidia roadmap (http://www.watch.impress.co.jp/pc/docs/2002/0905/kaigai02.jpg) on a Japanese site.

NV30 Q1 2003 DX9,
NV35 Q4 2003 DX9.1
NV40 Q3 2004 DX10, 200m transistors, 0.11u/90nm

Just another two days to go.

Wiggo
11-19-2002, 01:13 AM
Here ya go (http://www.guru3d.com/tech/geforcefx/?PHPSESSID=cd9febff14f14930a7a24a16f52ba498) :devil win

:beer: :beer: :beer:

weta
11-19-2002, 01:40 PM
Cheers Wiggo

http://www.beyond3d.com/previews/nvidia/nv30launch/focus2.gif

Can't wait to get my hands on one of these. The first versions, the GeForce FX 5800 and the 5800 Ultra, should be available
by the end of January 2003, but no word on the cost.

Preview NV News (http://www.nvnews.net/previews/geforce_fx/index.shtml) (link updated)

Wiggo
11-19-2002, 01:51 PM
Speculation atm is $500us. :eek:

:beer: :beer: :beer:

BigPapaPuff
11-19-2002, 05:10 PM
The NV30 made an appearance at Comdex. The one at the show had a HUGE copper heatsink on the RAM and it also took up 2 slots on the motherboard, the reason being that the cooler blows air out the back of one of the PCI slots. This is one heavy duty video card!

weta
11-19-2002, 08:24 PM
This setup is similar to Abit's, except OTES draws air from within the case and expels it through an external port in
the mounting bracket.
Nvidia's version draws and expels air through two external ports, which are also located in the card's mounting bracket.
Personally I prefer Abit's cooler, and with OTES 2 just around the corner, there will be further improvement.

weta
11-19-2002, 08:38 PM
Mine's a Hercules 3D Prophet FX 5800 Ultra, what make and version will you buy?

BigPapaPuff
11-19-2002, 08:57 PM
I am not too sure which version I will get. I think I will wait a bit to see all the different ones and then decide.

weta
11-21-2002, 05:26 PM
Nvidia chipset releases

23.03.1998 Riva TNT
15.03.1999 Riva TNT2
31.08.1999 GeForce 256
26.04.2000 GeForce 2
14.08.2000 Ultra update
27.02.2001 GeForce 3
01.10.2001 Titanium update
06.02.2002 GeForce 4
25.09.2002 AGP 8x update
18.11.2002 GeForce FX (paper release)

Based on these facts, I'd say the original release date was Feb/Mar 2003, so assuming it's available in January, then really it's early.

Wiggo
11-21-2002, 10:25 PM
The actual planned release date was August/September this year, but due to manufacturing problems it has been greatly delayed (it was meant to be out to greet the 8x AGP motherboards coming to market, but sadly it went the other way). ;)
:cheers:

weta
11-21-2002, 10:48 PM
Just did this pic to see what an FX would look like with a standard cooler, not suggesting for
one minute that this is enough to keep the 500MHz chipset within its temp range.
(Sorry but I've had to cut down the image size and quality for it to load at a sensible speed)

JediAgent
11-21-2002, 11:02 PM
31.08.1999 GeForce 256
26.04.2000 GeForce 2
14.08.2000 Ultra update
27.02.2001 GeForce 3

Where you say "Ultra" update, you actually mean MX update. The GeForce2 Pro and the GeForce2 Ultra were released simultaneously; the update was the release of the GeForce2 MX200/400 and the name change of Pro to GTS. About the same time the Ultra was exceptionally hard to find because nVidia moved some of the chip-producing plants that made the Ultra over to the MX. Where did you get that table anyway?

It doesn't matter when the FX comes out; the only date in my mind will be the release of the FX update, thus lowering the price of "old" technology GFFXs.


[EDIT]


not suggesting for
one minute that this is enough to keep the 500MHz chipset within its temp range.

And to that I say: .13 micron fab process. If done correctly it will be much easier to negotiate heat with, and the possible overclock will be much better too. Let's just hope nVidia keep the same HSF mounting design in the GFFX that they had on the GF4. That way the aftermarket HSFs used for the GF4 can be used on the new core, making it easier to upgrade and OC, and allowing the tried and true HSFs to be used rather than going to newer HSFs that may not work as well. (Like there's really that much of a science to it that a manufacturer could screw it up.)

weta
11-22-2002, 07:16 AM
The reference card has mounting holes for this HSF, so the production cards should too.
Nvidia's manufacturing partners generally stay pretty close to their original reference design.
My favourite version of this cooler is made by Thermaltake (see below)

weta
11-22-2002, 08:18 AM
Prices

$399.00 (USD) GeforceFX 5800
$499.00 (USD) GeforceFX 5800 Ultra

Time to start saving

weta
11-22-2002, 07:50 PM
More FX previews

Anandtech (http://anandtech.com/video/showdoc.html?i=1749)
Beyond 3D (http://www.beyond3d.com/previews/nvidia/nv30gfx)
Toms Hardware (http://www17.tomshardware.com/graphic/02q4/021118/index.html)

weta
11-25-2002, 01:49 AM
ATI R350 (update)

Name, Radeon 9900 (originally 10000)
Process, 0.13 (originally 0.15)
Memory, DDR2
Release, Feb/Mar 2003

If this is true, ATI could soon have a faster chipset than Nvidia's FX, bring on the NV35.

weta
12-06-2002, 04:25 AM
More FX info,

Digit-Life has a good article on the Geforce FX (http://www.digit-life.com/articles2/gffx/index.html)
Tests carried out in Nvidia's Labs have seen FX chipsets from sample batches clocked up to 600MHz
and in some cases even higher (source: PC Rave)
Nvidia are planning to give the first public performance of their Geforce FX card during the CPL Winter Event
in Dallas, Texas, 18/22 Dec (source: NV News)
Production chipsets should be shipped to Nvidia's Launch Partners at the beginning of January, cards
should start appearing in the following 2/3 weeks.

weta
12-07-2002, 06:37 AM
Nvidia's NV30 graphics chip has begun small-volume test production at TSMC's Fab 6. (Digitimes)
A second revision of the NV30 is just around the corner, and it will be this chipset that is put into mass production. (NV News)

weta
12-13-2002, 05:02 AM
Nvidia to demo the NV30 during a joint press conference with TSMC on the 12/12/02. (Digitimes)
Geforce FX Regular 400/900MHz core/memory with standard cooler.
Geforce FX Ultra 500/1000MHz core/memory with Flow Thermal Management cooler. (Reactor Critical)
Nvidia may allow add-in manufacturers to release a limited number of FX cards in January, with the
card becoming more widely available during February.

weta
12-15-2002, 06:36 AM
The soon to be released Detonator 41.34 drivers have been designed with the NV30 (Geforce FX) in mind. (NVP)

weta
12-17-2002, 03:10 AM
This is part of an article over at Bjorn 3D.

"Looking ahead, R350 and 400 are expected to be out this spring. ATi is betting the R400 will be an FX killer.
So where does this leave Nvidia? Since they work with 3 teams working on an 18 month schedule leapfrogging
each other, there should be an NV35 that is done if you look at the time lines, and NV40 should not be that
far behind. So they could release FX, and then if need be, release NV35 fast if ATi does grab the lead with the
R400. This could make them king again, but you now have a short lived card that will not make the board makers
happy again. And the PC guys will not be happy with that move either. I would not like to be the guys who have
to make the decisions at Nvidia on what to do about all this"

To read the full article click here. (http://www.bjorn3d.com/column.php?tid=20)

weta
12-22-2002, 09:16 PM
Part of an article over at the Virtual Zone (http://www.vr-zone.com/#2802), originally reported by Japanese site Impress

0.13 micron on 300mm wafers
Die size of 200mm2 (estimated)
Consumes up to 35 watts

The initial cost of the NV30 is expected to be high due to the cost of the 10-layer PCB and the use of 500MHz GDDR-II, but its performance
is expected to be much better than the current R300. The high core clock of 500MHz is attained thanks to the new process technology
(0.13 micron using copper wiring) as well as new packaging technology (flip-chip package). The NV30 GPU requires a special
copper heat pipe cooling unit to cool it effectively. The current memory frequency of 1GHz could improve by another 40-50%
(700-750MHz GDDR-II) at some point next year. The cost of an NV30 card can be lowered further when it is shifted to an 8-layer PCB in
the future. The earliest retail versions of NV30 cards (GeForce FX 5800 and 5800 Ultra) will be out by January 2003 from
MSI and Leadtek.

weta
12-27-2002, 03:44 AM
UK site Special Reserve (http://uk.special.reserve.co.uk/reviews/info.php?code=GG2765) is taking orders for Sparkle's GeForce FX graphics card at a discounted price of
£329.99 inc (RRP £349.99 inc)

Sparkle GeForce FX

AGP GeForce graphics cards for PC

• nVidia NV30 (GeForce FX) chipset
• AGP 8x bus connection
• Microsoft DirectX 9.0 Compatible
• 256 MB DDR2 Memory (>1Ghz Data Rate)
• 500 MHz core clock speed
• 1000 MHz (1.0GHz) memory clock speed
• 8 rendering pipelines
• 16 texture units per pipeline
• 48.0 GB/sec memory bandwidth
• TV out

If this is correct, it seems good value for what is a GeForce FX Ultra shipping with 256mb of DDR2 memory.

weta
01-04-2003, 01:56 AM
It won't just be Creative that will have Nvidia GeForce FX (NV30) graphics cards out by the end of the month, according
to reliable sources in the distribution channel.
By the third week of January, distributors told the INQUIRER that a number of different vendors will have units up for sale,
suggesting the silicon is stable.

Read the article (http://www.theinquirer.net/?article=7021)

weta
01-05-2003, 02:31 AM
The following preview is of an early GeForce FX sample that was hand-delivered to the Maximum PC Lab by an Alienware representative.
Our full preview of Alienware’s new prototype machine and the GeForce FX can be found in the February issue of Maximum PC

Read more (http://www.maximumpc.com/features/feature_2003-01-03.html)

Wiggo
01-05-2003, 02:35 AM
You are determined to chew this to the end, aren't ya? :D

:beer: :beer: :beer: :beer: :beer:

weta
01-05-2003, 03:40 AM
Hey, I've just posted the information as I've found it, blame Nvidia for dragging out this launch, not me.
Anyway, all the best with the move, and I hope your wife and son are both on their way to a full recovery, good luck for 2003.

Wiggo
01-05-2003, 03:52 AM
Thanx m8 and the same 4u in '03 as well. :beer:

Oh btw I have a private bet goin' on when ya'll stop. ;)

It's on whether ya stop when they hit the shelves or when one's in ya rig. :devil:
:cheers:

FLaCo
01-05-2003, 04:06 AM
Thanx m8 and the same 4u in '03 as well. :beer:

Oh btw I have a private bet goin' on when ya'll stop. ;)

It's on whether ya stop when they hit the shelves or when one's in ya rig. :devil:
:cheers:

Thats a funny bet...:laugh:

Wiggo
01-05-2003, 02:14 PM
Hey I'm bettin' he won't till he does the benches himself and reports the facts then that will bring this thread to a climax. ;)

Damn there are some sick perverts around here. :D
:cheers:

weta
01-06-2003, 02:25 AM
(Wiggo, I wasn't intending to go quite that far, but then I wouldn't want you to lose your bet mate) :laugh:

weta
01-07-2003, 05:54 AM
Dutch site Computerboot is taking orders for the Club3D Geforce FX 256DDR2 DVI/TVO graphics card, the price 650 Euros,
not sure if this is inclusive or subject to sales tax.

Club3D NV30 PDF brochure (http://www.sallandautomatisering.nl/Club3D%20GeforceFX.pdf)

Wiggo
01-07-2003, 08:37 AM
Hey, send The__tweaker a PM about that, as he's always lookin' for the "latest & greatest thing out", plus I'm sure that he will thoroughly test it out and give a full report on matters. :devil win

:beer: :beer: :beer: :beer: :beer:

Ehhe Yebeb
01-07-2003, 02:44 PM
Dutch site Computerboot is taking orders for the Club3D Geforce FX 256DDR2 DVI/TVO graphics card, the price 650 Euros,
not sure if this is inclusive or subject to sales tax.

Club3D NV30 PDF brochure (http://www.sallandautomatisering.nl/Club3D%20GeforceFX.pdf)

650 euros :eek:

I guess that means we can expect it to be around the $1500 mark here. No matter how good any video card is, it ain't worth $1500. I'll consider buying it when the price drops to $200. And eventually it will.

Wiggo
01-07-2003, 04:03 PM
I'm not so sure, but $1295 AUS was hinted at, and I really hope that is for the top model and not the base one. This payin' almost 3x for top vid cards sometimes is gettin' a bit much when there are cards like the Gigabyte MAYA R9000 that are practically bein' given away. :confused:
:cheers:

The__tweaker
01-07-2003, 06:29 PM
Like Wiggo said, we don't HAVE to buy the freakin' GeekForce FX, but we damn sure want to, I can tell ya... :geek:

Those damn computer toys are so expensive and they don't last too long either... :knife:

I remember when my Voodoo 2 with 12 MB of mem rocked the ass off me, that was one HELL of a rig I can tell ya... :shoot:

Never forget when I powered up the monster the first time and grabbed hold of my furniture and played like in a frenzy...! :shoot3:

Naa, I'm not gonna bother with this **** no more, it's just too expensive, no more video cards for this Swede...

Or maybe just this last card, but then no more... :cackle:

:cheers:

Wiggo
01-07-2003, 06:37 PM
Hey, wasn't meanin' to be nasty m8. :peace2:

But yes, I remember how impressed I was not long ago when I went from a 512KB ISA vid card to a 1MB PCI card. :D
:cheers:


Damn! I wasn't goin' to post anymore tonite :(

The__tweaker
01-07-2003, 06:37 PM
Btw, I guess it's you Wiggo who owns the most computers and who has surely spent the most $ of us all here on the forum through the years...

I have a couple of P2's & P3's lying around, but as they aren't complete I do not count them, so I only have two comps up'n running atm... How many comps are making noise over at your place right now... 5-6? :)

Just wondering, as in my opinion there can never be too many...

:thumb:

Wiggo
01-07-2003, 06:39 PM
Only if I can do the same my friend. ;)

:beer: :beer: :beer: :beer: :beer:


but I do seem to remember all the links that I posted when someone asked me for the "ultimate PC, no cost to be spared, without overclockin'", which was about the question at the time? :devil win

The__tweaker
01-07-2003, 06:55 PM
Hmm, I have never used ISA-based graphics... or yeah, btw, in my first comp I did. But that was only a 25 MHz 486 with a 333 MB HDD
and 8 MB of EDO memory, not too fancy, just a good workin' machine which I unfortunately had to put to sleep due to its bad mental health... :cry:

In your case the biggest difference should have been made by the major speed difference between the ISA and PCI bus, I assume, more than the actual memory upgrade, right...?

Talking about huge steps, I went from that old 25 MHz 486 to a 166 MHz Pentium, how about that? :)

Wiggo
01-07-2003, 07:35 PM
What if I told ya that mine was a 33MHz 386 with 2MB of memory (4 x 512KB sticks which cost $100's back then) and had a 42MB hard drive? :?:

I still have the case with a much changed interior, as it now houses my first ever upgrade and is now an AMD 586 133MHz (P75) @ 150MHz (and no heatsink fan), CDROM, 128MB memory, 4GB HDD, TNT2 M64 PCI, SB16 ISA, etc... :D

:beer: :beer: :beer: :beer: :beer:

The__tweaker
01-07-2003, 07:41 PM
I'll be damned... Not bad Wiggo. Well, the thing I liked best about those old machines is that they were QUIET... lol

No heavy-ass cooling solutions needed... :)

Wiggo
01-07-2003, 08:07 PM
Very true, but when my fastest PC does a SETI w/u in an average time of 2hrs and 45mins and that old thing still takes 222hrs, I think that's tryin' to tell me something. :?:

Anyhow the kids just use it for playin' games that won't run on anything else, and it will eventually join the other in the girls' room as it's also handy just for web surfin' and homework. ;)

The eldest son won't be movin' with us (he's 23 and can do as he pleases now) and the boys' room PC will stay with him, but I have something in mind here for the remainin' boy as there is one case here that would make an interesting project: http://forums.tweaktown.com/showthread.php?s=&threadid=4426&perpage=20&pagenumber=8 down near the bottom of the page. It'll look more retro than the rest of my ancient relics (a PII 350 - PIII 450 on an AT mobo, BX chipset usually, can be picked up here 2nd hand for about $50-$70 AUS). :D
:cheers:

FLaCo
01-08-2003, 01:17 AM
You would think a monster like the GFFX would have a power connector like the 9700 ...it prolly does...just the pic doesn't show it.

weta
01-08-2003, 03:56 AM
Whilst there still isn't much information on prices, the RSP for a Sparkle GeForce FX 5800 Ultra
(NV30) with 256MB DDR2 memory in the UK is £349.00 inc VAT (17.5% sales tax).

This converts to,

$975.00 AUS
$873.00 CAN
€538.00 EU
$560.00 US

There's a 128MB version of the GeForce FX 5800 Ultra (NV30) which will be cheaper.
A slower GeForce FX 5800 (NV30) (approx 400MHz core/800MHz memory) will be cheaper still.
Just around the corner there is the NV31 with 4 pipelines (the NV30 has 8), which will be even cheaper.

For FlaCo (http://forums.tweaktown.com/attachment.php?s=&postid=109622)
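
The conversions above are just the VAT-inclusive price multiplied by the exchange rate of the day; here's a quick Python sketch of that arithmetic using rough early-2003 rates (the rate values are assumptions for illustration, not figures from any listing):

```python
# Rough sketch of the conversion above. The exchange rates are approximate
# early-2003 values assumed for illustration, not figures from the listing.
RATES_PER_GBP = {"AUD": 2.79, "CAD": 2.50, "EUR": 1.54, "USD": 1.60}

price_gbp = 349.00  # Sparkle GeForce FX 5800 Ultra RSP, inc. 17.5% VAT

for currency, rate in RATES_PER_GBP.items():
    print(f"{currency}: {price_gbp * rate:,.2f}")

# Ex-VAT price, for comparing against US prices quoted without sales tax
print(f"GBP ex VAT: {price_gbp / 1.175:,.2f}")
```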

Ehhe Yebeb
01-09-2003, 05:26 AM
What about the NV35? How ridiculously expensive are they going to be? IMO anyone that pays this kind of price for a video card is either a fool or a very rich fool. All video cards go obsolete, so why splurge a grand on one? Makes no sense to me. I'll stick with my Gigabyte Radeon 9000 Pro that I paid $195 for.

Wiggo
01-09-2003, 05:32 AM
Those little R9Ks aren't bad value at all, are they? (provided ya don't get the ones w/ the crippled memory, that is) :D
:cheers:

weta
01-10-2003, 03:32 AM
The AGP 3.0 specification provides a smooth upgrade path to AGP 8X. The mechanical bus specification remains
the same. AGP 8X speeds and capabilities are achieved by taking advantage of some previously unused pins, but
in a manner that facilitates the support of AGP 8X cards in existing AGP 2X and 4X systems, as well as new
systems that fully leverage the 8X interface. NVIDIA AGP 8X graphics solutions will be able to detect the AGP level
of the host system, and automatically configure the AGP interface to run in 3.0 mode (at 4X or 8X speeds), or in
2.0 mode (at 2X or 4X speeds). Therefore, a new NVIDIA graphics solution will be fully capable of 8X speeds, and
will be completely compatible with 2X, 4X, and 8X systems. The NVIDIA-based cards will automatically deliver the
maximum speed supported by the host system.

Nvidia
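
In other words, the card works out the fastest mode it has in common with the host at start-up. A toy sketch of that negotiation logic (purely illustrative, the function and values below are made up, not Nvidia driver code):

```python
# Toy sketch of the auto-configuration described above: pick the signalling
# mode the host supports, then the fastest common transfer rate. Rates are in
# multiples of the base 66MHz AGP clock.
AGP2_RATES = (1, 2, 4)   # AGP 2.0 signalling
AGP3_RATES = (4, 8)      # AGP 3.0 signalling

def negotiate_agp(host_supports_agp3: bool, host_max_rate: int) -> str:
    rates = AGP3_RATES if host_supports_agp3 else AGP2_RATES
    rate = max(r for r in rates if r <= host_max_rate)
    mode = "3.0" if host_supports_agp3 else "2.0"
    return f"AGP {mode} mode at {rate}x"

print(negotiate_agp(False, 4))  # older 4x board -> 'AGP 2.0 mode at 4x'
print(negotiate_agp(True, 8))   # AGP 8x board   -> 'AGP 3.0 mode at 8x'
```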

weta
01-10-2003, 04:13 AM
The NV35 will be an evolutionary card, and should be available in the latter half of this year.

Possible improvements could be,

Change from 128-Bit to 256-Bit memory bus.
Increased GPU core, 600-650MHz
Increased DDR2 memory, 1200-1400MHz (Effective)
Small increase in transistor count
Support for Direct X9.1

In my opinion we'll have to wait for the NV40 (2004) to see a real improvement over the Geforce FX.

DOA
01-11-2003, 12:14 AM
weta
Let me know when nVidia beats the 9700 in Sandra and 3DMark. Until then it is all smoke and mirrors. I will buy what is proven to be fastest, not what is planned or rumored to be fastest.

Ehhe Yebeb
Another month and I will have $500 for a new card. Perhaps foolish, but considering sky diving drops $100 from my income, and movies are $10, it is a good buy. I easily spend 50 hours on the computer for every movie I watch in a theatre. But then again I do not watch TV, so I have more computer time.

I would feel foolish if I bought a 4600Ti, but that is because it is way over priced for its performance. The 9700 is the one to buy for now if you are heavily into the gaming scene. And I will not feel foolish doing so.

:clap:

Wiggo
01-11-2003, 12:30 AM
I have a sort of a prob (http://forums.tweaktown.com/showthread.php?s=&threadid=7473) myself but hey if it feels good then do it. ;)
But my prob may only be solved by buildin' 2 PCs, though I'm 1 video card short. So for my heart's desire (and my wallet's likin') I'm thinkin' of gettin' an R9500P for it, which has the 9700's features with a little less speed, but I've had good results with the R9K so why not? If I can build myself a P4 setup (bet ya's thought that ya'd never hear me say that?) then why not? :beer:
(attachment: http://forums.tweaktown.com/attachment.php?s=&postid=114416)
:cheers:

BTW from what I have seen leaked so far ya can add the GFX at the top about 1K clear. ;)

weta
01-11-2003, 02:36 AM
Here's the initial sample of the MSI MS-8904 card based on the GeForce FX GPU, using Samsung K4N26323AE-GC20
GDDR-II chips. Although GDDR-II runs on a 1.8V VDDQ, VDD is still at 2.5V and the data rate is 1Gb/s, so the
heat generated by the memory chips can get very high. Because of this, NVIDIA has designed the reference cooler so
that the memory chips are covered with a copper heat spreader, not for looks, but because GDDR-II really needs it to
dissipate heat well. Sources revealed that GDDR-II may not be ready for mobile platforms yet, as the heat issues are
still not resolved, so we should not see a mobile version of the GeForce FX any time soon. Also, retail cards
based on the GeForce FX may be as late as March-April.

nv news

weta
01-12-2003, 06:42 AM
Part of an interview with Scott Thirwell, Marketing Director, ABIT TAIWAN

Our OTES II, designed for nVIDIA's NV30 is a totally different design than the OTES designed for the Ti4200 line.
The OTES II was specifically designed for the NV30 and takes a radically different form in terms of air intake and
outflow. ABIT's OTES brought real and effective heat pipe technology to VGA Cards and the OTES II takes this
original OTES heat pipe technology and transforms the way we think about air flow and heat pipes on a VGA Card.

OC workbench interview (http://www.ocworkbench.com/2002/abit/interview/interviewp2.htm)

weta
01-13-2003, 05:55 AM
The rumour that Hercules and Nvidia were getting back together was just that, a rumour; with the recent leak of a water-cooled R350
card, it would appear that they will continue to partner with ATi.
The FX will be the first Nvidia powered card I will have owned not made by Hercules (American or French owned).
This picture is what a Hercules 3D Prophet FX 5800 Ultra may well have looked like, and yes I know the PCB should be blue.

Wiggo
01-13-2003, 06:03 AM
Sorry to interrupt ya here while ya on a roll, but do ya google (www.google.com) hourly? :?:

:beer: :beer: :beer: :beer: :beer:

FLaCo
01-13-2003, 07:29 AM
*DROOOOOOOOOOOOOOLLLLLLLLLLLLLLL*:beer:

BUT WHAT IS UP WITH MARCH AND APRIL....more and more wait just like AMD!

weta
01-14-2003, 02:45 AM
Nvidia is said to have decided to handle all the design and production of new GeForce FX (NV30) graphics cards to
ensure product stability and quality. According to industry sources, Nvidia will only release the more simplified NV31
and NV34 chips to card makers for product design.

Requiring a twelve-layer PCB due to the complexity of its higher frequency design, Nvidia’s top-end GeForce FX card
is expected to hit the market after the Chinese New Year in late February. Instead of allowing graphics card
manufacturers to design and produce their own products, Nvidia is said to have decided to place orders at certain EMS
(electronics manufacturing service) companies itself and then sell the finished products to card makers to better control
the quality.

Digitimes

weta
01-15-2003, 03:48 AM
Nvidia's FX Flow Thermal Management consists of a copper heat spreader, heat pipes, and an air flow system.
Outside air is pulled inward and is cooled as it passes over the heat pipe and circulates over the heat spreader.
The heated air is eventually blown out of the case. The temperature of the GPU is monitored and adjustments
are made to the amount of air flow as needed. The GeForce FX contains an auxiliary Molex four-pin power
connector on the right side of the graphics card and is used to supply the additional power needed to operate
at maximum processor clock speeds. The combination of the cooling and power features takes up enough space
to occupy two PCI slots.
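
As a rough illustration of that temperature-driven airflow adjustment (a conceptual sketch only; the thresholds and duty-cycle figures are invented, not anything from Nvidia):

```python
# Conceptual sketch of temperature-driven blower control like the FX Flow idea
# described above: sample the GPU temperature and scale the fan duty cycle so
# it only spins up (and gets loud) under 3D load. All numbers are invented.
def fan_duty_percent(gpu_temp_c: float,
                     idle_temp: float = 45.0,
                     max_temp: float = 85.0) -> float:
    """Map GPU temperature to a blower duty cycle between 20% and 100%."""
    if gpu_temp_c <= idle_temp:
        return 20.0                 # quiet 2D/idle speed
    if gpu_temp_c >= max_temp:
        return 100.0                # full blast under heavy load
    span = (gpu_temp_c - idle_temp) / (max_temp - idle_temp)
    return 20.0 + span * 80.0

for temp in (40, 60, 85):
    print(f"{temp}C -> {fan_duty_percent(temp):.0f}% duty")
```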

JediAgent
01-15-2003, 03:14 PM
Gee, nVidia taking over production now too, that'll help pricing :rolleyes:

weta
01-16-2003, 03:44 AM
BFG Technologies Asylum Geforce FX 128mb DDR2 graphics card can be pre-ordered from
Bestbuy.com for $399.99 US, and should start shipping from the 9th of March.
They're also offering a bonus gift package to all customers who pre-order a GFFX from them.

Asylum Geforce FX pre-order packaging (http://www.hardocp.com/image.html?image=MTA0MjY2MDM4NXRTbWdiRWR3QlBfMV80X 2wuanBn)
More information (http://www.bfgtech.com/fx_presell.html)

BigPapaPuff
01-16-2003, 04:17 AM
So did you pre-order??

weta
01-16-2003, 05:45 AM
Found this over at 3D Chipset

One of my friends sent word that a GeForce FX was being demoed at CES and it was running 3DMark2001.
The score it produced was around 17,000 3DMarks. When asked which settings were checked, the fella
running the demo had no idea. I suspect that the default settings were used. The resolution was the
standard 1024x768x32bit though.

System info, Athlon XP2400, Nforce2 motherboard, 512mb Cas 2.5 DDR ram

Whilst I'm the first to admit that this information is extremely vague, it does give us some idea of what
to expect from the GFFX.

weta
01-16-2003, 06:11 AM
Sorry I haven't responded to your individual posts before now.

DOA

As soon as I know, I'll let you know, it won't be too long now.

Wiggo

No, I don't really use Google to find this information.

JediAgent

Yes, with Nvidia controlling production, they'll be controlling the price as well, but at the
same time they have to compete with ATi, so their pricing will have to be competitive.

BigPapaPuff

No, I'll be waiting until I've seen all the cards, complete with their coolers, software bundles etc.

weta
01-17-2003, 04:47 AM
eVGA's Geforce FX will be available from the 10th of March according to their website.

eVGA.com (http://www.evga.com)

Mr.Tweak
01-17-2003, 07:06 PM
Weta, I just read every single post in this thread and I've got to say you've got some good reporting skills - welcome to my world, but any money says my list of news-sourcing tech sites is bigger and better than yours! :D

Nevertheless, thanks for keeping us posted on the GeForce FX, though you forgot to add that nVidia themselves said, 8 or so months back, that the GeForce name would not be used ever again... :p

JediAgent
01-17-2003, 11:16 PM
He seems pretty good for a TT member I don't remember seeing before October. You should give him work to use those reporting skills on, Tweak. :cool:

weta
01-18-2003, 03:03 AM
Not sure about this one, but I thought I should post it anyway.
This was originally posted by Dark Crow, but I picked it up at NV News. I've never heard of this company,
so I'm not sure it even exists; it could be a start-up, or even a wind-up, I don't know, judge for yourselves.

Fragtools sales pitch

We know where you buy your hardware...
We know what games you are playing...
We even know what servers you play them on, but more importantly...
WE KNOW WHERE YOU ARE CAMPING!!!!
See you soon.....
Nighty night.........

Fragtools GFFX box art

weta
01-18-2003, 03:18 AM
Here's a picture of the forthcoming Quadro FX 1000 (source NV/forums)

weta
01-18-2003, 03:41 AM
Mr Tweak and JediAgent, thanks for your comments about this thread.

weta
01-18-2003, 04:07 AM
PNY Technologies Verto Geforce FX 128mb DDR2 graphics card can be pre-ordered from
CompUSA.com for $399.99 US now, expected shipping date, Wednesday the 5th of February.

More information (http://www.compusa.com/products/product_info.asp?product_code=300548)

weta
01-19-2003, 02:04 AM
Apparently Leadtek intend to offer three Geforce FX graphics cards, each having a different memory spec.

NV30.1 = "WinFast A300(0300)"
NV30.2 = "WinFast A300(0301)"
NV30.3 = "WinFast A300(0302)"

Card 1 will have 128mb/500MHz DDR 2 memory (1000MHz effective)
Card 2 will have 256mb/500MHz DDR 2 memory (1000MHz effective)
Card 3 will have 256mb/600MHz DDR 2 memory (1200MHz effective) (This could be the Ultra version)

weta
01-19-2003, 04:03 AM
Verto Geforce FX 128mb DDR 2, location USA, price $399.99, available 05/02/2003 info/order here (http://www.compusa.com/products/product_info.asp?product_code=300548)
Sparkle Geforce FX 256mb DDR 2, location UK, price £329.99, available 12/02/2003 info/order here (http://uk.special.reserve.co.uk/reviews/info.php?code=GG2765)
Asylum Geforce FX 128mb DDR 2, location USA, price $399.99, available 09/03/2003 info/order here (http://www.bestbuy.com/detail.asp?e=11205639&m=488&cat=540&scat=1574&cmp=IL13687)
Club3D Geforce FX 256mb DDR 2, location NL, price €692.00, available Feb/Mar 2003 info/order here (http://www.computerboot.com/nl/dept_81.html)
Gainward GeForce FX 5800 128MB DDR2, location UK, price £329.42, available 07/03/2003 info/order here (http://www.komplett.co.uk/k/k.asp?ck=1&r=1&action=info&s=pl&p=31989&AvdID=1&CatID=24&GrpID=1)

weta
01-19-2003, 08:07 AM
Complete this survey (http://www.demographix.com/surveys/TWHI-SO67/KJXETT89/) for your chance to win a brand new GeForce FX graphics card!

(At the end of the survey you are asked for your full name, address, and e-mail address)

weta
01-19-2003, 05:32 PM
These are the system requirements being quoted by PNY for their Verto GFFX 128mb DDR graphics card.

Intel Pentium® III, AMD Duron or Athlon™ class processor or higher
128MB system RAM
A 350W system power supply
An available 4-pin 12-volt power connector from the internal power supply
An AGP compliant motherboard with an AGP 2.0 slot.
A vacant PCI slot adjacent to the AGP slot. The GeForce FX card occupies 2 slots: AGP and 1 PCI
10 MB of available hard disk space (50 MB for full installation)
CD-ROM or DVD-ROM drive
Windows® 95 OSR2, 98 or higher, ME, 2000, XP, or Windows® NT4.0. (Service Pack 5 or 6)
VGA or DVI-I compatible monitor

Wiggo
01-20-2003, 04:52 AM
Has Nvidia bitten off more than it can chew? (http://www.theinquirer.net/?article=7285) :?:

:beer: :beer: :beer:

weta
01-20-2003, 07:27 AM
They've taken on a lot recently: the Xbox, the nForce 2, the Crush, and of course the GFFX. They may be
behind, but it's far from over.
Much is being made of the (dustbuster) fan fitted to the GFFX card. I believe they're using it to get every
last drop of performance out of the chipset, in order to establish a lead over ATi's R300.
I'm certain that a quicker NV35 with a 256-bit memory bus will be ready within a few months of the R350,
and that the NV40 will be ready on time to take the R400 head on. ATi have established a good lead, but
they'll have to fight to keep it.

JediAgent
01-20-2003, 08:04 AM
What's the Crush?

weta
01-21-2003, 02:33 AM
Crush is Nvidia's K8 (Hammer) chipset.

More details (http://www.theinquirer.net/?article=7247%20)

JediAgent
01-21-2003, 02:46 AM
Ah, thx...

weta
01-21-2003, 05:06 AM
Q&A from Per Hansson's interview with Andrew Humber and Adam Foat of Nvidia Europe, held at last week's
Comdex Show.

[TS]: Lastly we would like to know if the benchmark-results from Maximum PC are real?
[Nv]: Yes, they are real; however, they are based on a board and drivers which are far from final and thus
do not represent the performance of the final product.

Read interview (http://www.techspot.com/vb/showthread.php?s=&threadid=3978) [includes some nice photos]

Imaginary FX (updated)

I've updated my imaginary 3DProphetFX card, added copper heatsinks and a background, hope you like it.

weta
01-21-2003, 05:51 AM
PEGGY, IN TAIWAN, TELLS ME that Nvidia anticipates it will ship around 100,000 GeForce FX boards worldwide
between now and May.
That's the period it has arranged for a third party contractor to make the boards to keep the quality high, she said.

And she whispered marchitecture names that Nvidia has hit on – the Ultra 5400 for the NV31 and the Ultra 5600
for the NV34 are the favorites round about now.

Read more (http://www.theinquirer.net/?article=7303)

weta
01-21-2003, 06:28 AM
Looks like ATi's and Nvidia's chipset design teams are going to be mighty busy for the next couple of years.

While ATI are known to be working on the 130nm process for the upcoming RV350 chip, they will not utilise
the 130nm process in a high end part until R400, which is scheduled for the latter half of 2003. This being
the case its likely that R500, which is probably being developed by the same team that produced R300
(Radeon 9500/9700), will be targetted at the 90nm process being discussed here and is likely due for release
within 18 to 24 months.

beyond3D

weta
01-21-2003, 06:46 AM
According to their own press release, MSI will be the first Launch Partner to offer a Geforce FX graphics card.

Press release (http://www.msi.com.tw/html/newsrelease/MSI_news/2003_0120_graphic.htm)

weta
01-25-2003, 06:32 AM
Leadtek will be offering two Quadro FX cards, the 1000 and the 2000, here's some info on the latter,
originally posted by Korean site Dark Crow.

FX 2000 spec's

Full 128-bit floating-point precision pipeline
12-bit subpixel precision
8 pixels per clock rendering engine
Hardware accelerated antialiased points & lines
Hardware OpenGL overlay planes
Hardware accelerated two-sided lighting
Hardware accelerated clipping planes
3rd-generation occlusion culling
16 textures per pixel
OpenGL quad-buffered stereo (3-pin sync connector)
AGP 8x with Fast Writes and sideband addressing
High-speed 128MB DDR2 memory
Fully programmable GPU (OpenGL 2.0/DirectX 9.0 class)
Optimized compiler for Cg and Microsoft HLSL
16x Full-Scene Antialiasing (FSAA) up to 2048x1536 per display or 3840x2400 for single digital display
Dual DVI output
Drives two independent digital displays at 1600x1200, or one at 3840x2400
Dual-link TMDS - Drives one digital display up to 2048x1536 and another at 1600x1200 simultaneously
Dual 400MHz RAMDACs
NVIDIA Unified Driver Architecture (UDA)
Fully compliant with professional OpenGL 2.0 and DirectX 9.0

FX 2000 image

weta
01-25-2003, 07:15 AM
Asus's 128MB GFFX card will be called the V9900; the specs are the same as PNY's Verto and BFG's Asylum GeForce FX cards.

V9900 box shot

weta
01-26-2003, 02:18 AM
GeforceFX 5800 Ultra
500MHz core
1GHz memory clock

GeforceFX 5800
400MHz core
800MHz memory clock

Nvidia Geforce FX image
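
Using the 8 rendering pipelines and 128-bit memory bus quoted elsewhere in this thread, those clocks translate into the following theoretical peaks (a quick sketch of the arithmetic only; real-world performance will differ):

```python
# Theoretical peak figures from the clocks above, assuming the 8 pipelines and
# 128-bit memory bus quoted earlier in the thread.
def theoretical_specs(core_mhz: int, mem_effective_mhz: int,
                      pipelines: int = 8, bus_bits: int = 128):
    fill_rate_mpix = core_mhz * pipelines                     # Mpixels/s
    bandwidth_gb = bus_bits / 8 * mem_effective_mhz / 1000    # GB/s
    return fill_rate_mpix, bandwidth_gb

print(theoretical_specs(500, 1000))  # 5800 Ultra -> (4000, 16.0): 4000 Mpix/s, 16.0 GB/s
print(theoretical_specs(400, 800))   # 5800       -> (3200, 12.8): 3200 Mpix/s, 12.8 GB/s
```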

weta
01-26-2003, 02:45 AM
AOpen announces the next generation of VGA cards named Aeolus FX based on the nVIDIA Geforce FX chip.
AOpen will launch its Aeolus FX in February 2003. Aeolus FX is the next generation VGA card with tremendous
2/3D graphic power, a variety of functions and cinematic effects.

Press release (http://www.boogletech.com/modules.php?name=PR&op=read&id=127)

weta
01-26-2003, 05:03 AM
Q&A taken from an interview with Nvidia's Product Line Manager, Geoff Ballew.

Q: We all know by now that the GeForceFX is equipped with 128 bit DDR II. What exactly are the technical benefits of
running 128 bit DDR II over 256 bit DDR?

A: From a technical design complexity point of view, fewer pins are better. The wider the bus, the more pins are required
on the GPU, the more traces you have to route across the board and the more often you have a memory granularity issue.
A 128-bit bus requires fewer connections than a 256-bit bus. Of course, bandwidth is important too. For situations where
you cannot raise your clock rates or cannot improve your data compression to get more effective bandwidth, going to a
wider bus is a clear method to increase bandwidth. However, if you can run a narrower bus at a faster clock rate, you
can get just as much raw bandwidth. This is exactly what we did with GeForce FX. We chose to use DDR2 because we
could run it at 500MHz! Here’s a pop quiz question --Which is faster….half the width at twice the speed or twice the width
at half the clock rate? Mathematically the raw bandwidth is the same for those two cases.

Geoff Ballew interview (http://www.elite*******s.com/page.php?pageid=858&head=1&comments=1)
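
To make Ballew's pop quiz concrete, here's a minimal sketch of the raw-bandwidth arithmetic; the 256-bit/500MHz case is just the hypothetical "twice the width at half the clock" from the quote, not a real product spec:

```python
def raw_bandwidth_gb_s(bus_width_bits: int, effective_mhz: float) -> float:
    """Raw memory bandwidth: bytes per transfer x effective transfer rate."""
    return bus_width_bits / 8 * effective_mhz / 1000  # GB/s

# GeForce FX 5800 Ultra: 128-bit bus, 500MHz DDR2 = 1000MHz effective data rate
print(raw_bandwidth_gb_s(128, 1000))  # 16.0 GB/s

# Hypothetical "twice the width at half the clock" case from the quote
print(raw_bandwidth_gb_s(256, 500))   # 16.0 GB/s, i.e. the same raw bandwidth
```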

Mr.Tweak
01-26-2003, 08:05 AM
Hey Weta - check your Private Messages ;)

weta
01-26-2003, 08:02 PM
NV30 Q1/2003
High end/gamer
Geforce FX 5800
Geforce FX 5800 Ultra [128mb/256mb]

NV30GL Q1/2003
Professional
Quadro FX 1000
Quadro FX 2000

NV31 - Q2/2003
Essentially an FX chipset with 4 pipelines
Similar performance to 4600ti
Will compete against Radeon 9500pro/9700

NV34 -2H/2003
1, Next generation mainstream solution
2, Mobile solution
3, Integrated into the next generation of nforce core logic sets.

weta
01-26-2003, 08:45 PM
This is good news, Nvidia are finally taking on ATi's AIW cards, the one area ATi have had pretty
much to themselves.

"We plan to offer a full line of Personal Cinema products at all price points to align with all our GPUs.
Our first new revision was targeted for a mainstream audience, so that dictated the configuration.
Thanks for your patience and rest assured we are working on it." (Geoff Ballew, Nvidia)

Personal Cinema (http://www.nvidia.com/view.asp?PAGE=personalcinema)

weta
01-26-2003, 10:04 PM
Thilo Bayer, hardware editor at PC Games Germany has received a GeForce FX for testing.
More interestingly he has posted some early findings in the 3DCenter forums.

E^vol
01-26-2003, 10:36 PM
Hey, any final specs on what the required power supply wattage will be ? I have an Antec Truepower 330W PSU, and I'm not going to buy a more powerful one, just so that I can spend every last cent I have to get a GeForce FX...
Will it be enough ? :?: :rolleyes2 Or will I be buying an R350 ?

weta
01-27-2003, 01:21 AM
Every spec I've seen so far recommends a 350W power supply, so you should get away with your Antec PSU.
The R350 should prove to be slightly faster than the FX; however, I don't see it matching the graphics quality
we can expect from the FX. Check out the detail on Nvidia's elf Dawn, then imagine her being rendered in real time.

weta
01-27-2003, 03:59 AM
German site tecchannel.de has posted what appear to be the first FX
benchmarks; whilst the review is in German, the graphs speak for themselves.

Geforce FX 5800 Ultra review (http://www.tecchannel.de/hardware/1109/)

Benchmarks

3DMark2001SE Pro 1024 x 768 x 32 (http://www.tecchannel.de/hardware/1109/images/0011958_PIC.gif)
3DMark2001SE Pro 1280 x 1024 x 32 (http://www.tecchannel.de/hardware/1109/images/0011959_PIC.gif)
Quake 3 Arena 1280 x 1024 x 32 HQ (http://www.tecchannel.de/hardware/1109/images/0011964_PIC.gif)
Quake 3 Arena 4x FSAA 1280 x 1024 x 32 HQ (http://www.tecchannel.de/hardware/1109/images/0011963_PIC.gif)
UT2003 Botmatch 1280 x 960 x 32 (http://www.tecchannel.de/hardware/1109/images/0011966_PIC.gif)
UT2002 Botmatch 1600 x 1200 x 32 (http://www.tecchannel.de/hardware/1109/images/0011967_PIC.gif)

STILLLIFE
01-28-2003, 02:04 AM
Man, the GF FX is a total letdown. I'm no ATI lover, I get the most for my money, and yes, the FX won in 3DMark, but only by 500 points, and it lost in UT2003, a real game. Who would pay that much money for a card that is no big improvement over the 9700 Pro, runs hotter, and is now the noisiest card out? Even with the high core clock it doesn't overclock that well; just overclock a 9700 and you have a better card. I was looking forward to getting this card, but it's not worth it at all, and if ATI comes out with the R350 soon, man, this is gonna be funny. Just my opinion.

weta
01-28-2003, 02:20 AM
These are the first FX reviews/comments that I've seen, I'll add more as they become available.

3DGPU (http://www.3dgpu.com/previews/gffxbenches.php?PHPSESSID=6f12bca34c88dd5621f2cd77 c455fcbe)
Extremetech (http://www.extremetech.com/article2/0,3973,846356,00.asp)
Onethumb (http://www.onethumb.com/index.mg?EntryID=12)
Hothardware (http://www.hothardware.com/hh_files/S&V/gffxshowcase.shtml)
HardOCP (http://www.hardocp.com/article.html?art=NDIx)
Hexus (http://www.hexus.co.uk/review.php?review=497&page=1)
Anandtech (http://www.anandtech.com/video/showdoc.html?i=1779&p=1)
Sudhian Media (http://www.sudhian.com/showdocs.cfm?aid=315)
Gamespot (http://gamespot.com/gamespot/features/pc/nvidiageforcefx/01.html)
IGN (http://gear.ign.com/articles/384/384022p1.html)
Hardware Analysis (http://www.hardwareanalysis.com/content/article/1585.7/)
A1 Electronics (http://www.a1-electronics.co.uk/Graphics_Cards/GeForceFX/GeForceFX_5800Ultra.shtml)
PCpop (http://www.pcpop.com/read.asp?id=627&page=1)
Gamespy (http://gamespy.com/hardware/february03/geforcefx/)

STILLLIFE
01-28-2003, 02:34 AM
Take a look at this review: http://www.hardocp.com/article.html?art=NDIx
There's another at Tom's Hardware. Just go read all the forums; it's like one big flame war online today, like over at MadOnion. A lot of people are not happy with the FX after waiting all this time.

weta
01-28-2003, 03:00 AM
Conclusion

As we’ve seen, the GeForce FX is no slouch in the 3D accelerator world, however it is not the "9700-killer" many have
expected. It is, at best, mildly faster in most games, and the same or slightly worse in a few. Had it arrived when most
of us thought it should, there is no doubt it would be much better received.

However, relative to the 9700 Pro its best performance was in 3DMark2001, a synthetic benchmark, and its worst
in UT2003, an actual game. This is strangely reminiscent of the Radeon 8500 release, where the Radeon won in 3DMark
but lost in games to the GeForce 3. This anomaly, plus the strange texture problems that popped up, all point to
immature drivers.

As for anti-aliasing, the GeForce FX gained more of a lead in 3DMark2001 when it was enabled, exactly the opposite
of what was expected given the limitations on memory bandwidth. In Unreal Tournament, however, as the AA level
was increased and the resolution rose, the GeForce FX dropped behind the 9700 Pro, possibly due to bandwidth
limitations. When testing anisotropic filtering, the GeForce FX was faster across the board than the Radeon
9700 Pro, an advantage of the 500MHz GPU, which gives a fantastic fill rate. The GeForce FX 5800 Ultra
certainly does have a sweet spot, and that is 2xAA / 8xAF at 1280 resolution.

Overall, the GeForce FX seems to be a capable card, and is a step up from the GeForce 4, to which it is the successor.
Should you buy one? If you have a GeForce 4 class card, and want to stay on top with the latest and greatest, then
yes. However, if you are currently using a Radeon 9500/9700 level card, then there is no reason to spend another $400
to get a slight boost and if you use AA all the time, then you may actually decrease your performance depending on the
game and resolution.


--------------------------------------------------------------------------------

The Bottom Line: The GeForceFX 5800 Ultra is a very hot and noisy beast that may give you a bit of an edge over the
current king of the hill, the ATI 9700 Pro, in some applications. If you are an NVIDIA fanboy, this of course has your name
all over it. At the current US$400.00 price point, the GFFX simply does not seem worth it to us. If NVIDIA can work some
driver magic and pull an extra 20% increase in frame rate out of the bag like we have seen in the past, they had best
start pulling. Either that or pull out the NV35 chipset, and quick.

This year will be interesting as both ATI and NVIDIA know it is all about having the best VidCard on the market when DOOM
hits.
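
As an aside on the AA observation above (the FX dropping behind as the AA level and resolution climb), here's a rough back-of-the-envelope on why high AA modes lean so hard on memory bandwidth. The figures below are my own assumptions for illustration, not numbers from either review quoted here:

# Very rough framebuffer-traffic sketch. Assumptions (not measured): 32-bit colour,
# 32-bit Z/stencil, a Z read + Z write + colour write per sample (12 bytes), an
# overdraw factor of 3, no compression. Real chips compress and cull, so treat this
# purely as an illustration of how traffic scales with the AA sample count.
def framebuffer_traffic_gb_s(width, height, aa_samples, fps=60,
                             bytes_per_sample=12, overdraw=3.0):
    """Approximate colour+Z bandwidth in GB/s for a given mode."""
    return width * height * aa_samples * bytes_per_sample * overdraw * fps / 1e9

for aa in (1, 2, 4):
    gbs = framebuffer_traffic_gb_s(1600, 1200, aa)
    print(f"1600x1200 @ 60 fps, {aa}x AA: ~{gbs:4.1f} GB/s of framebuffer traffic")

# Traffic scales linearly with the sample count, so against roughly 16 GB/s of raw
# memory bandwidth on the FX 5800 Ultra versus ~19.8 GB/s on the 9700 Pro (before any
# texture reads), the narrower 128-bit bus is the first thing to run out of headroom at 4x AA.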

weta
01-28-2003, 05:33 AM
Final Words

So there you have it, NVIDIA's response to ATI's Radeon 9700 Pro - but does anyone else feel unfulfilled by the
GeForce FX? A card that is several months late, that is able to outperform the Radeon 9700 Pro by 10% at best
but in most cases manages to fall behind by a factor much greater than that. Granted, the problems that plagued
the launch of the FX weren't all within NVIDIA's control; after all, the decision to go 0.13-micron was made 1 - 2
years ago based on data that was available at the time. ATI took a gamble on producing a 0.15-micron part and
NVIDIA did the same on their 0.13-micron NV30, and it looks like ATI guessed right.

While we were reviewing the FX, looking at its performance and investigating its image quality, we found ourselves
reminiscing of ATI's launch of the Radeon 8500. A card that was long overdue, but in the end unable to outshine
the top performer at the time. Although the current state of the GeForce FX is much better than what we had with
the first Radeon 8500, the word impressive isn't what we'd use to describe it. The performance is an improvement
over the Ti 4600, without a doubt, but it does not place NVIDIA back in a position of dominance, which is what
everyone was expecting from NV30. This isn't the end of NVIDIA, the company is quite healthy and they've got a
number of products in the pipeline with great potential (GeForce FX included) but it does mean that the road to
regaining dominance in the market will be an even more difficult one to traverse.

ATI has not been sitting idle all this time, and progress on the R350 core has been coming along quite well. We
proved early on that the 0.15-micron R300 core could reach speeds of up to 400MHz, and with the GeForce FX
NVIDIA has established that shipping cards with 800MHz - 1GHz memory is feasible. If ATI can put together an R350
with specs close to what we're implying, then even a driver-tweaked FX will not stand a chance. NVIDIA has told
us that the GeForce FX will be in stores next month, and we'd expect R350 in about a month following that. It will
be a close race, but what ATI has going for them right now is a much more mature driver set than NVIDIA for their
flagship GPU. The 3+ month advantage ATI had in bringing the R300 into production and to market gave ATI a much
bigger advantage than just being the king of the hill for a while, it gave them quite a bit of time to fine-tune and
optimize their drivers for this very occasion; this is a luxury that ATI has not had previously but they have made
excellent use of it today.

NVIDIA's focus at this point is NV31 and NV34, after all, that's where the money is. The small percentage of the
market that will go after the NV30 will not make or break NVIDIA, but should ATI compete like this in other market
segments then there will be cause for worry. As we mentioned at the start of our GeForce FX Preview - "Kudos to ATI."

weta
01-28-2003, 06:11 AM
All of us have been very busy last night and today. Everyone was hunting for a review, hoping that the work NVIDIA
has put into this design would produce a killer card... not so. We found quite a few reviews this morning. Some were good,
some decent, and some seemed to come out of the blue. I'd just like to say that the GeForce FX (NV30) is not what most of
us wished for. It's more of a prototype for the future (NV35). It's disappointing for a lot of people, and nVidiots in
particular, that this card simply doesn't cut it (based on review samples).
Whether the retail card is going to be much better, I doubt it. It might be, by an inch, but it won't matter because most
of us will not buy a $400 card with a huge fan just for those few FPS. I won't mention the "free" anti-aliasing modes and
anisotropic performance that were in the plans (at least AA). It could be because of the drivers, but in my opinion NVIDIA
had plenty of time to compile a nice set of Detonators for the occasion. Again, retail cards may (or may not) perform
better, but it's hard to achieve the quality ATI has shown us with their 256-bit bus compared to NVIDIA's 128-bit. Some say
"bandwidth is not everything". It's not, but we need to realize that switching from the .15-micron to the .13-micron
process is a tough job. I can certainly understand that (and I hope you all will).

Based on the reviews, the card does perform well, but it's not a killer. Without that huge fan (aka nDustbuster aka Vacuum-
Cleaner aka LeafBlower) and the huge power consumption we would certainly appreciate it more. Let us hope NVIDIA can at
least work on those drivers.

weta
01-29-2003, 02:28 AM
While we are still looking into this, it seems that the in-game screen shots posted on the Net yesterday showing off IQ
produced by the GeForceFX 5800 Ultra are "wrong".

There is no doubt that we criticized the GFFX for its anti-aliasing, and now it seems that we may not have had the proper
evidence to base our conclusions on. To quote ourselves from this page:

With no AA you can see the aliasing is quite predominant. 2X AA and Quincunx don't seem to do much on the GeForceFX
visually, but the FPS are affected when comparing the shots to the original with no AA enabled.

Of course all of this left us a bit puzzled, and wondering about the AA abilities of the drivers, but are the "facts"
in fact correct?

We have been working with NVIDIA on this to get an answer and it seems that now we have the preliminary information to
give us a bit more insight on the question.

The GeForceFX's technology applies the filters that affect anti-aliasing and anisotropic filtering after the frame has left the
frame buffer. In short, this means that our screenshots do not accurately represent the true in-game visual quality
that the GFFX can and will produce, as the screenshots were pulled from the frame buffer. We have come to conclusions
about the GFFX IQ (Image Quality) that are simply wrong.

While we cannot answer for other reviews of the GeForceFX it is very possible this is an issue with those articles as well,
if they were in fact thorough enough to cover IQ.

We are currently working on a way to capture the images properly and will be revisiting the GeForceFX 5800 Preview by
covering the IQ portion of our preview with proper screen shot comparison or further addressing the truth surrounding
this situation.

Certainly this is a huge issue, one it seems that NVIDIA was not even aware of when they issued us the review units. Having 48
hours to preview the card over Super Bowl weekend compounded this, and while that is no excuse for improper evaluation on
our part, it certainly did impact our ability to do a better evaluation. We are sorry for any incorrect evaluations we have
made and are working now to remedy the situation. Any new information will be posted here. [H]ardOCP

STILLLIFE
01-29-2003, 02:56 AM
They can candy coat it all they want, but
1 = it's already an overclocked card that puts out way too much heat
2 = noise; they might come out with one not as bad, but that thing needs big-time cooling
3 = performance to price; they will have to mark it down a lot
4 = the software underclocks the card if it gets too hot (that tells ya it's already overclocked)
5 = ???? R350 around the corner

weta
01-29-2003, 06:00 AM
Terratec will be offering two FX cards, the Mystify 5800, and the Mystify 5800 Ultra, both cards
will be supplied with 128mb of DDR2 memory, and should be available from the middle of February.

Mystify 5800 box shot (http://www.terratec.de/images/bilderpool/Verpackung_Mystify5800_S.jpg)
Mystify 5800 Ultra box shot (http://www.terratec.de/images/bilderpool/Verpackung_Mystify5800ultra_S.jpg)

JediAgent
01-29-2003, 02:19 PM
They can candy coat it all they want, but
1 = it's already an overclocked card that puts out way too much heat
2 = noise; they might come out with one not as bad, but that thing needs big-time cooling
3 = performance to price; they will have to mark it down a lot
4 = the software underclocks the card if it gets too hot (that tells ya it's already overclocked)
5 = ???? R350 around the corner

6 = nVidia cards always look better on the monitor, regardless of GPU power, in comparison to ATI. Period.

I've seen and played games on over 3 dozen PCs with cards ranging from a GF2 (most common) to GF4 4600s and ATI 9000s and 9700s. ATI push the FPS, but at the same res, test the two cards... there is a noticeable difference in image quality. I don't know what difference in tech makes it do it... but I can tell, everyone at the shop can tell, and everyone I've talked to that has owned both a GF4 and a 9700 has appreciated the GF4 more.

STILLLIFE
01-29-2003, 07:51 PM
Like I said, I'm not an ATi lover. I've had the Voodoo 1500/3500/4500/5500, GF2, GF2 Ultra, GF3 and the GF4 Ti4600.
When I got my 9700 Pro I still had the 4600; it's a damn good card, but my 9700 is better hands down, faster with better AA.
Now the FX is just not worth the money at all. I for one am a big-time overclocker, and the FX is already overclocked, so that's not for me. I'm not saying the FX is out; if they come out with a card that runs cooler, is quiet and has a little more speed, I would get it.
All I was saying is look at it for what it is NOW, not later. It's not worth the money to me, BUT that's me, MY OP.

RDR
01-29-2003, 09:41 PM
6 = nVidia cards always look better on the monitor, regardless of GPU power, in comparison to ATI. Period.



hmmmm....I've generally found the opposite to be true

are you using ATi built cards - or 3rd party cards?
I have seen some crappy "powered by" cards - but the IQ on the "built by" cards I've owned has always been superior

this is based on owning the following:

eVGA GF2MX
ATi Radeon ViVo
Asus GF2 Ultra
ATi Radeon 8500
Powercolor GF3 Ti200
Powercolor GF4 Ti4200
ATi 9700Pro

IQ is a little subjective at best - but that's my opinion

BigPapaPuff
01-30-2003, 03:01 AM
hmm pretty funny stuff

weta
01-30-2003, 05:02 AM
RDR, I bet this takes a lot longer with your ATi Radeon 9700pro

weta
01-30-2003, 05:55 AM
Check out these ATi release dates; you have just got to wonder how long the FX will remain Nvidia's top-spec card.

Nevertheless, I can inform you that the R350-based cards will appear in April this year; RV350 ones will come even a
little earlier. Of course, this is just a plan and it can be corrected any time. The R400 is due to be released after July.

Event review (http://www.digit-life.com/articles2/ati-jan2k3/index.html)

JediAgent
01-30-2003, 01:06 PM
hmmmm....I've generally found the opposite to be true


If you're talking about 2D... ATI is supposedly better, and I myself have never seen much of a difference, so I can't argue. But set your specs to max on both a 4600 and a 9700 and play the same game on the same system. Personal opinion is that the nVidia looks better. And I've heard the same from other people that deal with both at the PC store I work at (not customers, other geeks), and all the clients I deal with through the Warehouse agree, too. And I'm not asking "GF4 4600s look better than 9700s, huh?" I'm asking which looks better of the two... without telling them my tests or what my computer friends think. All this is specific to 3D... I believe the 2D is better because of some RAMDAC thing... and yes, ATI built... not 3rd party... though I would like to see a Hercules :)

RDR
01-30-2003, 09:47 PM
If you're talking about 2D... ATI is supposedly better, and I myself have never seen much of a difference, so I can't argue. But set your specs to max on both a 4600 and a 9700 and play the same game on the same system. Personal opinion is that the nVidia looks better. And I've heard the same from other people that deal with both at the PC store I work at (not customers, other geeks), and all the clients I deal with through the Warehouse agree, too. And I'm not asking "GF4 4600s look better than 9700s, huh?" I'm asking which looks better of the two... without telling them my tests or what my computer friends think. All this is specific to 3D... I believe the 2D is better because of some RAMDAC thing... and yes, ATI built... not 3rd party... though I would like to see a Hercules :)

no - I mean 3D

even the 2 systems I have listed in my sig - if I load the same game up on both of them, I definitely like the way it looks on the ATi better - some of that may be due to the monitor - but I used to have the GF4 in my AMD box and a Radeon 8500 in the P4 & I still liked the ATi better

Maybe I have a poor example of a GF4 :confused:

STILLLIFE
01-30-2003, 10:55 PM
What? The 9700 has way better AA than any GF, how can you say that? What are you showing people, high-performance settings at 800x600? I run UT2003 at 1280x1024 with 4x AA and 16x AF. Try that with a 4600 and see what's better. And look at this, one of the tests of the FX at 2x AA / 8x AF; now look at the CPU speed for the 9700 and the FX, hmmmmm.
http://www.hardocp.com/image.html?image=MTA0MzYyMDg1OTVjVVNkMzFISXhfMl8zX2wuZ2lm

weta
01-31-2003, 06:34 AM
Gainward Co., Ltd, the leading manufacturer of high performance 3D graphics and home entertainment accelerators,
announces that it will introduce the world’s most powerful series of 128MB 3D graphics processor boards based on
NVIDIA’s GeForce FX 5800 technology. The most powerful graphics board of the series; the Gainward FX PowerPack!
Model Ultra/1000 Plus “Golden Sample”TM features NVIDIA’s fastest GeForce FX 5800 Ultra and is priced at €649 incl.
VAT (approx £449 UK SRP). The Gainward FX PowerPack! Model Ultra/800 Plus “Golden Sample”TM is based on the
NVIDIA GeForce FX 5800 priced at €549 incl. VAT (approx £389 UK SRP).

Press release (http://www.hexus.co.uk/pr.php?pr=452)

weta
01-31-2003, 06:52 AM
Here's a picture of PNY's Quadro FX 2000. Hopefully their Geforce FX 5800 Ultra will use a
similar cooler; not all of us want to hear a 70db fan when gaming, despite what Nvidia may think.

More pictures (http://www.gforcex.com/frame.pl?m=/artiklar/ovrigt/QuadroFX/)

BigPapaPuff
01-31-2003, 08:33 PM
The cooler might not have an exhaust fan or intake in the back, but it's still a really big cooler, and it looks to me like it still takes up two PCI slots. I think I will wait a while until they can size this monster down a bit.

weta
02-01-2003, 02:41 AM
Beyond3D have posted a technical Q&A on the FX

There's been lots of talk about Displacement Mapping support (although, we know this isn't a key feature
to determine 'Compliancy' for DirectX9), can you clarify what level of support GeForce FX has for it?

GeForce FX supports Geometry Displacement Mapping.

What percentage of the NV30 would you consider to be totally new?

All of it had to be redesigned for the .13 micron process technology. Of course, the Vertex Shader, Pixel
Shader and memory interface were areas of revolutionary changes and enhancements over previous GPUs.

Read article (http://www.beyond3d.com/interviews/gffxqa/)

weta
02-01-2003, 05:57 AM
If the NV30 happens to be the fastest graphics chipset on the market, it may or may not be a good
thing (more on that later). But the NV35 is where NVIDIA's high-end performance hopes lie. Remember
the improvements the GeForce 2 (NV15) had over the GeForce and the improvements the GeForce 4
had over the GeForce 3? Six months isn't a long time. NV30 was primarily delayed due to, once again,
the move to a new manufacturing process. The same was true of the GeForce 256 and GeForce 3.
And remember the GeForce2 Ultra? It was basically a placeholder for the GeForce3, which was pushed
back from 2000 until early 2001.

This is part of an article written by Tim Burton and Mike Chambers for NV News, it's well worth a read.

The article (http://www.nvnews.net/articles/geforce_fx_commentary/index.shtml)

weta
02-01-2003, 06:33 AM
Abit Siluro FX (Limited) graphics card and box shot; in a word, superb.

weta
02-01-2003, 05:43 PM
Whilst I can't confirm the authenticity of this story, it's well worth a read anyway

Guys.. I don't work at nvidia, and I don't have *that* kind of information. And anything I tell you is never going
to get validated in any kind of official way. I simply had a conversation with someone who works at another
company, whose identity will remain unrevealed. It was basically a long exchange regarding the true nature of the
delays behind the GFFX, based on the obvious connections all IHVs have with TSMC. The gist of it is that the .13u
process was *NOT* all botched up at TSMC, but there were definitely problems getting the low-k dielectrics working
towards the end of the process.

Read more (http://www.beyond3d.com/forum/viewtopic.php?t=4072)

RDR, not long now

RDR
02-01-2003, 08:13 PM
RDR, not long now


hey...come on now - I'm not an ATi fanboy, I almost always have both an nVidia product and an ATi product in my boxes

there's a very good chance I will own an R350 product at some point - but there's also a very good chance I will have a GF FX derivative at some point too.

I've always maintained that both companies' cards have their individual merits

weta
02-02-2003, 01:58 AM
Check out Visiontek's website, and if that's a reproduction of the FX's fan, then there's no way I'll be fitting one to
any PCs I build or own; it really does sound like a jet engine. If Nvidia seriously believe that people want to hear that
racket, then I'm stunned.

Visiontek (http://www.visiontek.com)

RDR, Chill, that was just a bit of fun, we all realise that as the forum moderator you're unbiased, but as a forum member
you're allowed to express your opinion like any of us.

More FX fun

weta
02-02-2003, 05:27 AM
NVIDIA probably could have designed a NV-30 class part on the 0.15u process, but there would have been some
very significant tradeoffs to do so. They would have had to cut back from the 125 million plus transistor design
and go with something a bit more simple. Other tradeoffs would have been things like clock speed, or making a
hybrid 96/128 bit floating point pixel pipeline (much like what ATI has) to save on transistor costs. This part would
have probably been around the same speed as the Radeon 9700 Pro for the high end, but NVIDIA wanted a chip
that was going to outrun anything ATI had at the time, so they chose the riskier 0.13u process. Of course it is nice
to have a chip that can run at 500 MHz, but it isn’t nice when you can’t have that chip on the market and the
competition has passed you by.

Read article (http://www.penstarsys.com/editor/Today/nvidia3/index.html)

weta
02-02-2003, 08:40 PM
Gainward's 5800, (£329.42 inc) and 5800 Ultra (£431.34 inc) can now be pre-ordered
from UK site Komplett, their unconfirmed delivery date is the 7th of March.

Gainward Geforce FX 5800 128mb DDR2 AGP, "Ultra/800 Plus GS", Retail (http://www.komplett.co.uk/k/ki.asp?action=info&p=31989&t=00&l=1&AvdID=1&CatID=24&GrpID=1&s=pl)
Gainward Geforce FX 5800 Ultra 128mb DDR2 AGP, "Ultra/1000 Plus GS" m/lyd & 1394 (http://www.komplett.co.uk/k/ki.asp?action=info&p=31987&t=00&l=1&AvdID=1&CatID=24&GrpID=1&s=pl)

Reportedly Gainward's Ultra model will ship with a 7db cooler.
If this proves to be true, why couldn't Nvidia manage to do this?

Gainward FX 5800 Ultra box shot

weta
02-05-2003, 07:14 AM
Nvidia has updated its Geforce FX 5800 Ultra cooling system: the fan now remains inactive (silent) in 2D mode
and only kicks in when running 3D applications. They say the revised card is around 5db quieter than the original. [H]
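
For a bit of perspective on that 5db figure, decibels are logarithmic, so here's a quick sketch of what a given reduction means in raw sound-power terms (the numbers and the perceived-loudness note are just the usual rules of thumb, nothing measured from the card):

# Decibel differences map to sound-power ratios via 10^(dB/10).
def power_ratio(db_reduction):
    """How many times less sound power a given dB reduction represents."""
    return 10 ** (db_reduction / 10)

for db in (5, 8, 10, 20):
    print(f"a {db:2d} dB reduction is about {power_ratio(db):5.1f}x less sound power")

# 5 dB is roughly 3x less radiated power, but perceived loudness only changes
# noticeably around 10 dB, which most people judge as "about half as loud".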

E^vol
02-05-2003, 07:20 AM
5 db ?? that's it ? how about 50 db guys ? come on......

Mr.Tweak
02-05-2003, 07:53 AM
5 db ?? that's it ? how about 50 db guys ? come on......

It's a start...

Beefy
02-05-2003, 09:02 AM
I wanna know what the sound card is that ships with that Gainward Ultra 1000.

weta
02-06-2003, 04:17 AM
NVIDIA UK HAS REFUSED to comment on a posting on a 3D bulletin board claiming that the GeForce FX is for
the chop. According to the rumour, posted at x-3DFX, Nvidia is canning the GeForce FX and has told its foundry,
Taiwan Semiconductor Manufacturing Corp (TSMC), to stop once 100,000 NV30s have been produced. The same posting
claimed that Nvidia will now concentrate on NV31, NV34, and NV35. warp2search

"Never in the history of computing has so much bull**** been written by so many with such little truth"
– a friend close to Nvidia (the Inquirer)

snm
02-06-2003, 09:19 AM
Cameron,

I registered to post re: your front page GFFX thoughts. I agree with waiting until retail level products & drivers are available before passing final judgement. I don't think you quite understand the Nvidia/NV30 situation, however...

It's most unlikely that AIBs/OEMs bother with a redesign of the reference board. The complexity of the 12-layer PCB & QC have necessitated Nvidia choosing 1 or 2 select AIB partners which will then supply all other OEMs with a finished SKU, sans cooler (if desired). If we look @ the GF4 series, all retail products are of reference design inc. the rev2 PCB of the Ti4200.

Your reference to CG & that "ATI will also support it in their R350" is completely wrong. CG is Nvidia's version of the MS HLSL & is designed to expose Nvidia hardware features. Why would ATI support this given the DX9 HLSL offers all their functionality? There's no point in them writing a back-end for an Nvidia optimising solution... Sure, CG offers some nice features for developers such as porting DX-OGL, but Nvidia needs this level of functionality as evidenced by Carmack's recent GFFX performance comments.

Your comments re: enjoying the scenery are also peculiar given the unfortunate state of FSAA & the 1st order approximations of AF (unless the super slow fallback GF4 AF is selected). I agree that the v2.0 shaders certainly look good, though! Ultimately the GFFX in its present form is more a developer's board than an end-user's/gamer's, due to its greater flexibility & design trade-offs...

JediAgent
02-07-2003, 04:48 AM
huh?

weta
02-07-2003, 05:24 AM
You've seen OTES I, now prepare yourself for OTES-The Next Generation. Only from
ABIT, the next OTES will blow away the competition. After 2 years of development on
the OTES line, we used our R and D expertise as well as customer feedback to create
the best ABIT Engineered cooling solution for the GeForceFX series.
ABIT will announce more details on the GeForceFX series in late February. Stay tuned!

More info >> (http://www.abit.com.tw/abitweb/webjsp/english/news/otes0130/otes0130.htm)

"BLOW AWAY", An unwise phrase to use given current circumstances

Abit Siluro FX image

weta
02-07-2003, 05:40 AM
JediAgent

He's referring to the article "My GeForce FX Thoughts" posted by Cameron on the homepage (last month).

weta
02-08-2003, 03:46 AM
Why the Geforce FX doesn't have a 256bit memory bus.

1. Memory too expensive
2. Nvidia's memory controller not up to it

WE ASKED AROUND the industry why the nVidia GeForce FX does not use a 256-bit bus with DDR II memory.
We learned that nVidia wanted to go for DDR II simply because DDR I stops at the magic limit of 400 MHz,
and that was the reason for making the card with DDR II memory.
We wondered why they didn't use a 256-bit memory bus, as that memory working at 500 MHz would give them
32GB/s of raw bandwidth. It would definitely have been a Radeon 9700 PRO killer; it would have had almost
13GB/s greater bandwidth.

Read more >> (http://www.theinquirer.net/?article=7642)
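
For a rough sanity check on those figures (clock speeds assumed from the usual published specs, so treat this as illustrative), peak memory bandwidth is just bus width times effective data rate:

# Peak-bandwidth arithmetic: (bus width in bytes) x (effective DDR data rate).
def peak_bandwidth_gb_s(bus_width_bits, effective_mhz):
    """Bus width in bits x effective (DDR) clock in MHz -> GB/s."""
    return (bus_width_bits / 8) * effective_mhz * 1e6 / 1e9

gffx_5800_ultra = peak_bandwidth_gb_s(128, 1000)   # 128-bit, 500 MHz DDR-II (1 GHz effective)
radeon_9700pro  = peak_bandwidth_gb_s(256, 620)    # 256-bit, 310 MHz DDR (620 MHz effective)
hypothetical    = peak_bandwidth_gb_s(256, 1000)   # the 256-bit DDR-II card the article wishes for

print(f"GeForce FX 5800 Ultra  : {gffx_5800_ultra:5.1f} GB/s")
print(f"Radeon 9700 Pro        : {radeon_9700pro:5.1f} GB/s")
print(f"256-bit DDR-II (hypo.) : {hypothetical:5.1f} GB/s, "
      f"~{hypothetical - radeon_9700pro:.0f} GB/s more than the 9700 Pro")

The gap works out to roughly 12GB/s with these clocks, in the same ballpark as the article's "almost 13GB/s" claim.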

weta
02-08-2003, 03:55 AM
SOURCES CLOSE to Nvidia tell the INQUIRER that the graphics firm has told its partners that the
GeForce FX is likely to be discontinued, with the firm instead concentrating on the NV35 platform.

Read more >> (http://www.theinquirer.net/?article=7658)

weta
02-08-2003, 04:39 AM
Box shot of Leadteks new WinFast A300 Ultra TD graphics card.

weta
02-08-2003, 06:30 PM
As we noted here earlier this week, the GeForceFX 5800 Ultra will never make it to retail.
Those of you that PreBuy the cards will still get an Ultra model with the FX Flow cooling unit.
Those who don't will have the opportunity to get the non-Ultra version (400/800) off the
retail shelves for a price of US$300.00. This information is unconfirmed at this time, but has
been what we have been told repeatedly by different sources since Tuesday of this week.

[H]ardOCP/warp2search

weta
02-11-2003, 05:45 AM
Apogee FX81

Super conducting heatpipe technology
Dual gas turbine technology fans
Cyclone TSC
Sapphire lighting (http://www.ixbt.com/short/2k3-02/ch30-1.jpg)

Mr.Tweak
02-11-2003, 03:56 PM
We went to an nVidia conference up here in Sydney today with David Kirk (nVidia Chief Scientist) and talked about the GeForce FX and its future.

They refused to comment on the rumours surrounding the cancellation of the 5800 Ultra as they never comment on stuff which hasn't been announced - although I did try my hardest to milk them for info but David was too switched on to let anything slip.

We heard the GeForce FX Ultra in action with the case sides open, and to us it didn't really sound like the jumbo jet people are making it out to be - they have modified the cooling system already and reduced the noise by about 8db.

Other than that, everything else we heard was just recapping what we already know with some predictions for the future.

weta
02-13-2003, 06:36 AM
If the Geforce FX 5800 Ultra never sees the light of day, where will Nvidia go from here?

In fact, Nvidia has at last contacted the INQ and told us the following: "The rumours are untrue,
we are full speed ahead on FX 5800 Ultra. Like every new high end GPU we have built in the last
five years, initial demand will outstrip supply so it will be on allocation."

Which particular rumours Nvidia is talking about remains unclear.

Read more >> (http://www.theinquirer.net/?article=7766)

DOA
02-13-2003, 10:00 AM
I gave up and bought a 9700 today, with the new 3.1 drivers it is fast and stable.

I will upgrade to the FX if:
1) they fix the cooling - it will overheat in any of my systems because it sucks and blows through almost the same hole.
Anyone know what a brachiopod is?
A prehistoric animal that eats and excretes through the same orifice. I guess nVidia sent some engineers back in time.
2) they get more performance - I run AMD cuz it is cheaper and just as fast as the P4 I would have bought, even though it runs a lower clock speed. I am not fooled by clock speed as a measure of performance.
3) ATI really screws up the drivers

:thumb:

JediAgent
02-13-2003, 01:01 PM
3) ATI really screws up the drivers

The man knows why i dont buy ATI no mo':angryfire :snip:

Wiggo
02-15-2003, 06:12 PM
http://forums.tweaktown.com/showthread.php?s=&threadid=8528 ;)

weta
02-15-2003, 06:21 PM
Prolink have released an ad displaying the usual FX specs, a box shot (it's triangular), and a small board shot.
tweaktown

Please note, this image is the same as the one posted on the home page.

weta
02-16-2003, 02:19 AM
Wiggo,
:flames:You know RDR will do the right thing :flames:

Geforce FX uncovered

Wiggo
02-16-2003, 02:32 AM
Sorry m8 but I thought that ya should know about it but. ;)

:beer: :beer: :beer:

weta
02-17-2003, 06:20 AM
Here's a better picture of Chaintech's Apogee FX81; the first one doesn't really do it justice.

revenant
02-17-2003, 11:54 AM
That Chaintech card looks sweet-ass-sweet. :)

weta
02-19-2003, 03:09 AM
Nvidia does have some serious plans for the mainstream market. It plans to introduce, as soon as possible
(CeBIT is suggested), at least 3 new cards based on the GeForce FX core.

The first is a GeForce FX 5600 Ultra card, aimed at perhaps the highest end of the mainstream market.
A GeForce FX 5600 non-Ultra might also be introduced, but that still remains unclear at this time.
The second card, the GeForce FX 5200 Ultra, will be aimed at the higher end of the low-end market or the middle/lower part of
the mainstream market. Finally, the GeForce FX 5200 non-Ultra model will be aimed at the low/entry level of the market.

Read more >> (http://www.warp2search.net/article.php?sid=10722&mode=thread&order=0&thold=0)

weta
02-20-2003, 04:06 AM
Another superb Geforce FX card, this time it's Abit's offering, the Siluro FX 5800 Ultra,
Abit claim their OTES III cooler is significantly quieter than Nvidia's reference version.

weta
02-20-2003, 04:22 AM
"Of all the chips we sampled last year, NVIDIA's is clearly the fastest and most sophisticated," said Peter Glaskowsky,
a senior analyst with In-Stat/MDR. "All of us will eventually benefit from the impressive technology being developed by
3D-chip companies. Gamers eager to realize the full potential of titles such as Doom III and Unreal II--and developers
creating even more advanced software--will plunk down the big bucks for these boards right away."

Press release >> (http://www.nvidia.com/view.asp?IO=IO_20030218_9274)

jamie_horwood
02-21-2003, 05:34 AM
Hi Weta. If you can "AFFORD" to wait for the FX range and save all your hard-earned cash, then do so. But if you are looking for a high-end card now, I would recommend the Ti 4600 128MB 8x. I have the 4x Ti 4600, and now that I've corrected my 2100+ Athlon XP fault, it runs like a dream with 2700 DDR mem. The Ti 4600 will not, however, future-proof your setup for a year or so. Software technology at present is barely keeping up with hardware technology anyway... we should be playing next year's games now, though this is bound to evolve soon.

EG465P-VE(FCA) in T3007 Task International Case
Abit KG7 Raid (AMD761/VIA 686B)
Infineon cas 2.5 2700 ddr non ecc
(166mhz running at 133mhz)
Asus V8460 Ultra 128ddr
Relysis Tl - 570 TFT rgb input
Audigy 2 Dolby 6.1
Liteon 16x DvD Rom
Liteon 48x12x48 Cd-Rw
AMD 2100+ Quantispeed Athlon XP
(Running at 1735mhz)
Thermaltake Volcano 7+ HSF type: 70:70:25mm
Western Digital 120Gb Caviar 8mb cache :7200rpm
2x 80mm Ys Tech case fans » Speed: 3000RPM, Output: 45.2CFM
1x 80mm Ys Tech exhaust fan » Decibels: 34.2dBA, Dimensions: 80x80x25

weta
02-22-2003, 05:11 AM
Not 8 x 1 as previously suggested

IN A MARATHON INVESTIGATION that The Inquirer launched a couple of days ago, we have some solid stuff to present to you.
From the time when Nvidia first questioned the validity of 3DMark03 and its single texturing as a method of rendering that is rarely
used in games, we knew that something was rotten in Denmark, sorry, Santa Clara.
An Nvidia technical marketing manager confirmed to us that the Geforce FX has 4 pipelines with 2 texture units (TMUs) each,
which can produce 8 textures per clock, but only when multitexturing.

However, Nvidia did say that there were some cases where its chip can turn out 8 pixels per clock. Here is a quote, "GeForce FX
5800 and 5800 Ultra run at 8 pixels per clock for all of the following, a) z-rendering b) stencil operations c) texture operations
d) shader operations and only color+Z rendering is done at 4 pixels per clock"

We talked with many developers and they told us that all games these days use Color + Z rendering. So all this Nvidia talk about
the possibility of rendering 8 pixels in special cases becomes irrelevant.

The bottom line is that when it comes to Color + Z rendering, the GeForce FX is only half as powerful as the older Radeon 9700.

We can assure you that this is just the start, we have more to present to you in the next few days.

Read more >> (http://www.theinquirer.net/?article=7920)
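
To put that Color + Z claim in perspective, here's a quick sketch of the theoretical per-clock and absolute pixel rates, assuming the stock 500MHz and 325MHz core clocks (illustrative arithmetic only, not benchmark data):

# Theoretical colour+Z fill rate = pixels written per clock x core clock.
def colour_z_mpixels_s(pixels_per_clock, core_mhz):
    """Pixels per clock x core clock in MHz -> Mpixels/s."""
    return pixels_per_clock * core_mhz

gffx = colour_z_mpixels_s(4, 500)   # GeForce FX 5800 Ultra: 4 colour+Z pixels/clock at 500 MHz
r300 = colour_z_mpixels_s(8, 325)   # Radeon 9700 Pro: 8 pixels/clock at 325 MHz

print(f"GeForce FX 5800 Ultra: {gffx} Mpixels/s colour+Z (4 per clock)")
print(f"Radeon 9700 Pro      : {r300} Mpixels/s colour+Z (8 per clock)")

# Per clock the FX really is half the 9700 Pro for colour+Z work; the 500 MHz core
# claws some of that back, leaving roughly 2000 vs 2600 Mpixels/s on paper.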

weta
02-26-2003, 05:25 AM
Looks like BFG are following Nvidia's reference design with their forthcoming Asylum FX 5800 Ultra card.

weta
02-26-2003, 05:49 AM
Thermaltake's fanless cooling solution for the NV30.
(Not all that different from Zalman's R300 model)

weta
02-26-2003, 06:20 AM
Albatron have announced two Geforce FX cards, the Gigi FX 5800 and the Gigi FX 5800V (Vivo).

weta
03-01-2003, 05:49 AM
Here's a picture of MSI's new Geforce FX 5800 Ultra board.

weta
03-05-2003, 06:10 AM
Motherboards.org has posted a short preview of MSI's Geforce FX 5800 Ultra graphics card. This card has been
available in Japan for several days now, so hopefully it should start appearing in other parts of the world very soon.

Preview >> (http://www.motherboards.org/articlesd/hardware-reviews/1236_1.html)

STILLLIFE
03-05-2003, 06:31 AM
Well, looks like that might be the last from MSI.
[H]
Keith Cyr points out that there is big news for ATi, and bad news for NVIDIA, according to this Yahoo Finance report. The report says that NVIDIA has lost one of its biggest card-making partners, MSI, to rival ATi. Here is the clip.



Yahoo
8:34AM NVIDIA estimates trimmed at FBR on large design loss; target $11 (NVDA) 12.57: Friedman, Billings, Ramsey has learned that NVDA has lost a large design win at Micro-Star (one of its largest customers) to ATYT; firm now believes that other designs at Micro-Star are at risk, given that they had been an NVDA-exclusive customer; additionally, firm believes NVDA's new GeForce FX line will ramp more slowly than expected due to low yields on both discrete and board-level products; trims FY04 rev/EPS ests to $1.80 bln/$0.61 from $1.83 bln/$0.63 (consensus $1.82 bln/$0.59). Price target is $11.

weta
03-05-2003, 03:44 PM
The MSI/Medion story is not material to our business. We have factored in design wins months in advance.
We are talking about a minor amount - tiny % of our business. We have 100's of design wins each year, this
is only one. We ship 10's of millions of parts each year and this will not impact our business.

We want to win every design win, but for each one we have to do what makes the best business sense for
Nvidia. In this case all the products that Medion was looking at were on allocation.

This has no bearing on our relationship with Microstar. (Nvidia)

weta
03-06-2003, 05:55 AM
Here are the first pictures of Nvidia's new Geforce FX 5600 (NV31) reference card.
X-bit

weta
03-07-2003, 07:27 AM
Here are some pictures of Leadtek's new WinFast A300TD graphics card.

weta
03-08-2003, 06:32 AM
The most powerful graphics board of the series; the Gainward FX PowerPack! Model Ultra/1000 Plus “Golden Sample”TM features NVIDIA’s fastest GeForce FX 5800 Ultra and is priced at €649 incl. VAT. The Gainward FX PowerPack! Model Ultra/800 Plus “Golden Sample”TM is based on the NVIDIA GeForce FX 5800 priced at €549 incl. VAT.

The Gainward FX PowerPack! Model Ultra/1000 Plus “Golden Sample”TM takes the graphics environment to a completely new level. Running at 500 (Plus) MHz core clock speed and 1 (Plus) GHz DDR-II memory speed, the Gainward FX PowerPack! Model Ultra/1000 Plus “Golden Sample”TM is powered by pure adrenaline, delivering unprecedented 3D graphics performance to the PC platform. To further improve the versatility and gaming experience the Gainward FX PowerPack! Model Ultra/1000 Plus “Golden Sample”TM is bundled with a digital video-in IEEE 1394 FireWire daughter card, a high quality 5.1 sound card with both stereo output and optical SP/DIF output connectors, plus a variety of hot 3D games. All of Gainward’s FX PowerPack! products feature a radical new design of cooling system delivering maximum performance and incredibly low noise levels.

The Gainward FX PowerPack! Model Ultra/780 XP “Golden Sample”TM features the FX 5600 Ultra GPU and 256MB of 2.5ns DDR. The board offers a complete set of added value features including video-in/out, two DVI connectors and two VGA connectors for any dual monitor application. The street price is expected to be €499 with 256MB and € 449 with 128MB.

The Gainward FX PowerPack! Model Pro/660 TV-DVI makes cinematic computing available for the mainstream market. Featuring the FX 5200 GPU and 128MB of fast DDR the street price is expected to be less than € 100. The board offers an appealing set of added value features including video-out, a DVI connector with a DVI/VGA adapter plus an additional VGA connector for dual monitor application.

Gainward press release

revenant
03-08-2003, 08:19 AM
From the back, the Leadtek looks almost like a thin CD-ROM drive. ;) I still like the looks of the Chaintech one the best. The Apogee, if memory serves. :)

0R1()N
03-08-2003, 11:44 AM
:wow: Now that's some extreme cooling on that WinFast, sheesh, two fans; now that must sound like a vacuum.:rofl:

STILLLIFE
03-08-2003, 03:28 PM
It looks good, too bad the price just doesn't equal the performance. I was looking forward to the FX, but there's no way I'm paying 400 dollars for a card that is bigger, puts out more noise and only gives a little performance gain. They had better do well with the next card; I'll be waiting for it, but I'm gonna have to say I'm sticking with my ATi. AND NO, I'M NOT AN ATI FANBOY, I had all the Voodoos and it's been NV from there, that's why I want them to push the next card. Well, that's just me.

revenant
03-08-2003, 03:53 PM
Well, you get what you pay for in this world. Pick up a Radeon 9800 when they hit the streets, it's gonna be "the one". :) IMHO.
:cheers:

STILLLIFE
03-08-2003, 04:01 PM
I might, but I might wait for the price to go down a bit; my 9700 Pro is good for now.

weta
03-08-2003, 05:07 PM
If you're using a 9700 Pro now, then there's clearly no point in upgrading to the NV30, but I would suggest that you wait
until more information is available on the NV35 before upgrading to the 9800 Pro. Early details suggest that the NV35 chipset
is in the late stages of development, and offers 2x the efficiency of the NV30 GPU (not 2x faster). For example where the
NV30 can render one Dawn in realtime, the NV35 can render four (see attachment).

weta
03-10-2003, 01:36 AM
Here's an artist's image of the new Triplex Millenium Silver Geforce FX 5800 Ultra. Nothing new here: Nvidia's reference design,
including the FX Flow cooler, Triplex's trademark silver PCB, and packaging. (Please don't ask me why it says Geforce 4 on the case.)

weta
03-10-2003, 02:08 AM
Here's a picture of Gainward's Geforce FX 5800 Ultra. Is it just me, or does this card look bloody awful?
I was expecting better than this from Gainward, and before I forget, that 7db cooler is actually 40db.

E^vol
03-10-2003, 04:00 AM
Originally posted by weta
...and before I forget, that 7db cooler is actually 40db.
Hahaha...That figures...:rofl:

weta
03-11-2003, 05:10 AM
Here's a picture of Leadtek's Winfast A300 (Geforce FX 5800 Ultra) graphics card uncovered.

weta
03-11-2003, 05:39 AM
A couple of sites are reporting that BFG and PNY have recalled some of their Geforce FX cards. Apparently some
customers are being told that they will have to wait a further two weeks for their pre-order cards, because they've
been recalled for an update. Whilst I have no idea if this story is true, you have to wonder if anything will go right
with this card.

E^vol
03-11-2003, 05:44 AM
...you have to wonder if anything will go right with this card.
No kidding ! :rofl: oops :shh:

This must be what the nVidia guys are doing right now ==> : omg:

weta
03-13-2003, 02:22 AM
Here's a picture of Gainward's watercooled Geforce FX 5800 Ultra prototype; the silver FX heatsink does look pretty cool.

E^vol
03-13-2003, 07:08 AM
The hoses look messy, but those waterblocks look very sweet !
Maybe they'll release their NV35 with a finalized version of this setup.

weta
03-14-2003, 07:35 AM
Here's a picture of MSI's new Geforce FX 5800.

weta
03-14-2003, 09:51 PM
MSI Geforce FX 5800 Ultra (http://forums.tweaktown.com/attachment.php?s=&postid=133517)
Gainward Ultra/1000 Plus (http://forums.tweaktown.com/attachment.php?s=&postid=134845)
Gainward Ultra/1000 Plus (wc'd prototype) (http://forums.tweaktown.com/attachment.php?s=&postid=135592)
Triplex Geforce FX 5800 Ultra (http://forums.tweaktown.com/attachment.php?s=&postid=134838)
Leadtek WinFast A300TD (http://forums.tweaktown.com/attachment.php?s=&postid=135119)
Abit Siluro FX 5800 Ultra (http://forums.tweaktown.com/attachment.php?s=&postid=128396)
BFG Asylum Geforce FX 5800 Ultra (http://forums.tweaktown.com/attachment.php?s=&postid=130882)
Nvidia Geforce FX 5800 Ultra reference (http://forums.tweaktown.com/attachment.php?s=&postid=123689)
Chaintech Apogee FX81 (http://forums.tweaktown.com/attachment.php?s=&postid=127163)

weta
03-14-2003, 10:02 PM
MSI FX5800 Ultra TD8X (http://www.hardwarezone.com/articles/articles.hwz?cid=3&aid=657&page=1)

weta
03-15-2003, 03:48 AM
Sparkle SP8830 (Geforce FX5800) Ultra photo

weta
03-15-2003, 03:51 AM
PNY Verto Geforce FX5800 Ultra photo

weta
03-15-2003, 03:57 AM
Asus V9900 (Geforce FX5800) Ultra photo

weta
03-15-2003, 04:01 AM
AOpen Aeolus (Geforce) FX5800 Ultra photo

weta
03-15-2003, 04:13 AM
Prolink PixelView Geforce FX5800 Ultra photo

weta
03-19-2003, 07:24 AM
Gainward Ultra/1000 Plus (Geforce FX5800 Ultra) watercooled photo

weta
03-31-2003, 03:59 AM
X-bit Labs has posted a huge (23-page) review of Nvidia's reference GeforceFX 5800 Ultra graphics card.
It's an extremely good review which goes into considerable depth; by the time you've finished reading it,
the NV35 should be available.

NV30 review >> (http://www.xbitlabs.com/articles/video/display/geforce-fx.html)

jamie_horwood
03-31-2003, 04:30 AM
The mobo in that test setup looks like the A7N8X.

JediAgent
03-31-2003, 12:35 PM
Definitely an Asus northbridge fan, that's for sure.

revenant
03-31-2003, 03:11 PM
Well, my MSI K7N2-L has a similar sink on its northbridge nForce2 chip, and the PCB looks exactly like the typical Asus colour/style/design. I would also say it's an A7N8X! :)