PDA

View Full Version : NV35



weta
02-17-2003, 12:02 AM
NV35 details coming soon

The competition (R350 picture?)

rugbydude
02-17-2003, 12:40 AM
That looks just like the radeon 9500 and 9700 series. Maybe this is just a better version of a 9700 rather than a new card:cry:

FLaCo
02-17-2003, 01:36 AM
do all these card have the caps marked off with a Sharpie Marker?...

negomike
02-17-2003, 10:50 AM
do all these card have the caps marked off with a Sharpie Marker?...

Yeah, it's like racing stripes for video cards. It makes it feel faster. :D

Wow, that was my 100th post. Man, and it wasn't even a very good one. Sigh.

FLaCo
02-17-2003, 11:00 AM
100th? How can you tell...

My bud's 9700 Pro has the check marks... I guess they're just a checkpoint mark on all the caps..

RDR
02-17-2003, 09:24 PM
That looks just like the radeon 9500 and 9700 series. Maybe this is just a better version of a 9700 rather than a new card:cry:

The most common rumour is that the R350 will be an R300 with a faster clock (375 or 400MHz, depending on who you want to believe)... and that the R400 will be the next card to have a significant change in architecture.

negomike
02-17-2003, 10:47 PM
theinquirer.net said it would be announced on the 11th.

:thumb:

weta
03-02-2003, 08:23 PM
In a nutshell this is all the NV35 is, well according to Hellbinder that is.

-600mhz
-8 (true) pipelines
-256bit bus
-500mhz DDR-II (or perhaps DDR-I at 256mb)
-All the Nv30 hardware bugs fixed (in theory)

I'm sure there are one or two surprises in there. However, with the revelation that the low-k process is screwed
up at TSMC, it is not very likely that Nvidia will be able to get 600mhz without the use of ANOTHER dustbuster.
Currently the Nv35 is based on copper, and they have done a few tweaks to make it run a little cooler and more
stable at faster speeds. However, given that the Nv30 is already pushing it at 500mhz,
it would seem the changes they have made would simply make the Nv35 run like a *normal* chip at 500mhz.

Now with them fixing their pixel output, and jumping to around 30GB bandwidth, even at 500mhz it will be pretty
darn fast. However I have not heard that they have changed their AA modes, nor have they addressed some
issues with their AF that will become apparent when the Nv30 is public.

Well if this turns out to be true, like the R350/R300, the NV35 is simply a polished NV30, and with the R400
rumoured to be delayed until next year, ATi should continue to lead (just) with their R350.
As I've stated previously, Nvidia's NV40 will need to be something very special indeed if they're to get
back into the game.

E^vol
03-02-2003, 10:18 PM
weta, I've got to ask ! Does the NV30 / NV35 really do 128bit color ? What's the deal ? Also, do you know if the R350 does it ?
If ANYONE knows.....

weta
03-02-2003, 11:31 PM
Yes the NV30 offers 128bit colour, so the NV35 will too, not sure about the R350. Where's RDR when you need him?

The real world is filled with dramatic contrasts between lights, shadows and colors, from the brightest, harshest white
to the deepest, darkest black. For any game world to convincingly depict a 3D environment, it needs to simulate this
seemingly infinite range in a decidedly finite space. With each added "bit" of light or color information afforded a game,
the quality and accuracy of the resulting image grow exponentially. 32-bit color only gives each of the red, green, blue
and alpha channels 256 choices. 128-bit color provides the developer with literally millions of choices for each channel.
For film-quality real-time animated visuals, there's simply no substitute. The GeForce FX delivers this demanding level
of excellence in real time.
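The arithmetic behind that marketing copy is easy to see in a toy sketch (my own illustration, nothing from the actual GeForce FX pipeline): 32-bit colour gives each channel 8 bits, i.e. 256 levels, so two subtly different shades can collapse into the same level (banding), while 128-bit colour stores each channel as a 32-bit float and keeps them distinct.

```python
# Illustrative only: why 8 bits per channel can band where float doesn't.
# 32-bit colour = 4 channels x 8 bits (256 levels each);
# 128-bit colour = 4 channels x 32-bit float.

def quantize_8bit(value):
    """Map a 0.0-1.0 channel value to one of the 256 integer levels."""
    return round(value * 255)

# Two very close shades in a gradient:
a, b = 0.5000, 0.5010

# As floats they are distinct...
assert a != b

# ...but 8-bit quantization merges them into the same level (banding).
print(quantize_8bit(a), quantize_8bit(b))  # both 128
```

That merging, repeated across a smooth gradient, is exactly the banding Richard Huddy mentions further down the thread.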

E^vol
03-02-2003, 11:42 PM
Thanks for the clarification !
As I remember, the jump from 16bit to 32bit had a performance penalty in first gen cards.
What kind of performance hit will the jump to 128bit color cost us ? Is it worth it yet ??:?: :rolleyes2

weta
03-03-2003, 12:01 AM
Sorry but I can't help you there, we will have to wait for retail products to be tested, and comparisons made with
other cards before that question can be truly answered.
But I would guess that there must be some sort of performance hit with 128bit colour. Having said that, if we could
see a game running with that depth of colour, well can you imagine the level of realism?

E^vol
03-03-2003, 12:04 AM
Not if it's choppy as all hell !
:shh: But it would be sweet ! :shh:

SmokeyTheBalrog
03-04-2003, 01:29 AM
I read somewhere that the R300 and up have floating point color to the 96th bit or something like that. Will go looking, it was in some interview.

SmokeyTheBalrog
03-04-2003, 02:26 AM
The full interview is here. (http://www.driverheaven.net/display.php?page=richard_interview)

Here we are:

[Zardon] You've almost got gamma correction completely covered; can we look forward to any further advances? Also in conjunction with that, will 10-bit per component colour output be supported soon, are developers looking to address the rare cases where gamma correction and 8-bit per component color can still cause noticeable banding?

[Richard Huddy] That's a potential future feature for our hardware - but given that we have already added support for floating point colour it's probably not the main thrust of development? Ten bits per channel sounds like a crude approximation compared to 96 bit floating point colour.

E^vol
03-05-2003, 08:39 AM
Do we know of any games coming out that use 64bit / 96bit / 128bit color ?
Doom3 ?

rugbydude
03-05-2003, 01:58 PM
I've heard rumours that it supports 64 bit colour:thumb:

SmokeyTheBalrog
03-05-2003, 04:07 PM
r-dude.. uh mg?

milligrams?

:shh:

rugbydude
03-06-2003, 12:17 AM
Wow sorry bout that i think i had one or two too many :beer: last night.:beer: :beer: :beer: :beer: :beer:

SmokeyTheBalrog
03-06-2003, 02:26 AM
And here I thought it was some new point systems that some corp. introduced to tell us all how good their paper launched products are.
:laugh:

weta
03-12-2003, 03:57 AM
Based on the latest roadmap of quite a big graphics card manufacturer we have seen, I can tell you that Santa Clara, California-based NVIDIA Corporation plans to launch its next-generation graphics processors for enthusiasts, game developers and hardcore gamers in the second quarter this year.

The novelty we presently know as the code-named NV35 GPU is going to be a considerably improved GeForce FX 5800 (NV30) graphics processor with several modifications made in order to boost performance and yield, and to remove other obstacles that presently do not let NVIDIA start selling its GeForce FX 5800 solutions in mass quantities. Basically speaking, do not expect too much from the newer GPU: NV35 reportedly implements 130 million transistors, just 5 million more compared to NV30.

At the moment there is not really a lot of information concerning the part, though; expect it to support up to 256MB of DDR-II SDRAM memory, DirectX 9.0 and above features, and so on. Also, the actual graphics cards may have a rather fancy look, just like their predecessors, but, at least, let us hope that NVIDIA will give up its Flow FX cooling system that brings great annoyance whenever the GPU accelerates 3D graphics.

Keeping in mind that some stores expect the GeForce FX 5800 and GeForce FX 5800 Ultra to come only in April, I would assume the official launch of the NV35 is not likely to happen until May this year, while the actual cards will appear even later. Nevertheless, we can be sure that the GeForce FX 5800-series is going to have an extremely short time of availability as the top-of-the-line graphics card.

X-bit

SmokeyTheBalrog
03-12-2003, 06:10 PM
*only* 5 -million- more... sigh... some things are just so good, now if only we could stop blowing each other up.

weta
03-13-2003, 02:30 AM
Apparently the NV35 chipset has taped out, if this proves to be correct, then we should be looking at a June launch.

E^vol
03-13-2003, 06:56 AM
I've read that the NV35 is supposed to be twice as fast as the NV30....I'll believe that when we see some reviews and benchies...

weta
03-14-2003, 07:26 AM
French Hardware reports that a working NV35 sample, running at 250Mhz, is already as fast as a Geforce FX5800U.
If this turns out to be true, then it's an impressive start for Nvidia's new graphics processor.

weta
03-16-2003, 01:23 AM
This time the company decided not to use the brand new DDR-II and to return to DDR-I memory. NV35 will have a 256bit memory bus, and the working frequencies of the chip and memory will be lower than those of the current NV30. However, the performance nevertheless promises to be considerably higher. For example, in the Quake3 Arena game (timedemo1, 1280x1024) the performance difference between NV35 and NV30 was over 70%! Hm.. If we add here an enhanced anisotropic filtering unit and improved anti-aliasing quality, we will get a real treat. The only thing which will remain the same is the size of the card with the cooling system installed: the graphics cards based on this chip will still occupy the room of two PCI slots. x-bit

weta
03-20-2003, 07:08 AM
More interesting stuff on the upcoming NV35.

256 Bit Memory Bus
500MHz DDR I (Effective 1000MHz)
500MHz GPU
Low noise cooling solution

Here are the results of a Quake3, 1600x1200, 4XAA and 8xAF benchmark/comparison run by Nvidia at CeBIT 2003.

The NV35 got 111 FPS
The NV30 got 48 FPS (Geforce FX5800 Ultra)

Both cards were clocked at 250MHz because the NV35 used was a prototype, and they wanted to make an exact comparison to the NV30.

chip.de/nv news
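Since both cards ran at the same 250MHz core clock, the clock-for-clock gain in this one Quake3 run is easy to put a number on (a quick back-of-the-envelope sketch, using only the two FPS figures reported above):

```python
# Clock-for-clock comparison from the CeBIT Quake3 numbers above,
# both chips at 250MHz.
nv35_fps = 111
nv30_fps = 48

speedup = nv35_fps / nv30_fps            # ~2.31x per clock
percent_faster = (speedup - 1) * 100     # ~131% faster

print(f"{speedup:.2f}x, {percent_faster:.0f}% faster")
```

Note this is a single benchmark at one resolution; the earlier x-bit figure of "over 70%" came from a different test (1280x1024), so the two numbers aren't directly comparable.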

SmokeyTheBalrog
03-20-2003, 09:19 AM
The 9700 got around 90 fps at something close to the same settings I think.

Though if they double the clock speed of the NV35 sweet!

Hmm, now I have to decide if I really want to blow $500 on a 256MB 9800 Pro...

Though nVidia needs to deal with image quality.

So I guess I probably will still go for the 9800. I mean I can't wait 3 more months anyways.

rugbydude
03-20-2003, 02:05 PM
I'm still waiting to try an NV30:laugh: :laugh:

SmokeyTheBalrog
03-20-2003, 03:14 PM
Sorry to say r-dude that I ain't waiting for NV30 anymore. It's either 9700/9800 or NV35 for me.


I have a way strong feeling that the NV30 is going to be the much talked about black sheep of the nVidia family. Poor card, poor drivers, etc...

revenant
03-20-2003, 03:50 PM
...or the R400 for me. Mmmmm... it does go well with the chicken. ;)

rugbydude
03-20-2003, 11:51 PM
I aint waitin for one i just want to test one.:D But yer 9600 for me i think not quite enough money for a top end card:rolleyes:

SmokeyTheBalrog
03-21-2003, 05:31 AM
There are some rumors that the 9600 will be slower than the 9500. But again, thats just rumors.

revenant
03-21-2003, 07:26 AM
Smokey is right. According to speculation, it's going to be about the speed of a 9000, prolly a tad faster. I guess the real benefit then, would be DX9 support in hardware, and other more advanced visual engine features... peace...

SmokeyTheBalrog
03-21-2003, 02:52 PM
Doesn't 9500/9700 already have DX9 in hardware?

E^vol
03-21-2003, 07:12 PM
...or the R400 for me. Mmmmm... it does go well with the chicken. ;)
:rofl: :rofl: :cheers:

rugbydude
03-22-2003, 05:38 AM
Yep the 9700 and 9500 have the same features as the 9800 and 9600, they're just meant to be faster. But the 9600 is lacking the 8 pipeline architecture of the 9500 so wot the true outcome will be is unknown:confused:

weta
03-27-2003, 04:52 AM
IBM and NVIDIA today announced the two companies have formed a multi-year strategic alliance under which IBM will manufacture NVIDIA's next-generation GeForce graphics processor units (GPUs).

As part of the agreement, NVIDIA will gain access to IBM's comprehensive suite of foundry services and leading-edge manufacturing technologies, including power-efficient copper wiring, and a roadmap that leads to 65nm (nanometer; a billionth of a meter) in the next several years, giving the company the necessary tools to advance its state-of-the-art GPUs.

IBM plans to begin manufacturing the next-generation GeForce graphics processor this summer at IBM's state-of-the-art 300mm plant in East Fishkill, N.Y. The new IBM $2.5 billion chip-making facility combines, for the first time anywhere, IBM chip-making breakthroughs such as copper interconnects, silicon-on-insulator (SOI) transistors and low-k dielectric insulation on 300mm wafers. The new facility began operation last year, and will ramp up in capacity throughout 2003.
Press Release

At last, some good news from Nvidia. With access to these advanced manufacturing technologies, the NV35 should be free of the overheating problems that have plagued the NV30.

revenant
03-27-2003, 05:25 AM
Ahhhh.. the plot thickens! Good move on Nvidia's part, very good move. :)

rugbydude
03-27-2003, 02:00 PM
Looks like nvidia is having to throw some of its money around to try and keep up with ATI. Let the battles commence:thumb:

weta
03-29-2003, 02:48 AM
"Something you should take note of, look at the configuration in which the test was run (16x12, 4X AA, 8X AF); that should tell you a bit about NV35 already."
"You won't hear anything from NVIDIA until the chip is ready to go, they're keen on not making the same mistakes over again. I'm not too surprised this test info got leaked, I saw it a couple of months ago although I would expect tighter control from NVIDIA surrounding NV35".
Anand Lal Shimpi, Anandtech

weta
03-29-2003, 04:17 AM
1. It's already up and running in Nvidia's labs!
2. It will use DDR - not DDR2 (!) memory because of the lower price of memory and PCB, more stability and availability in high volumes.
3. It will have 256 bit 400 MHz (may be more for ultra) DDR1 memory interface. And there will be cards with 256 MB of memory.
4. It will have the same features set and probably near as damnit the same core clock.
5. It will be twice, and sometimes a bit more than that, faster in raw shader power for 1.4 and 2.0 pixel shaders, whereas the NV30 was not such a strong performer as we wished - so this aspect is pretty tuned up, lads!
6. It shows from 1.5 up to 2.0 times performance of NV30 depending on the tasks and the final frequency specs.
7. It will be definitely faster than RADEON 9800 PRO.
8. Other side specs like RAMDAC freq are the same as for NV30.
Inq

Wouldn't it be great if all these facts turn out to be true, pity about the (G)DDR2 memory though.

weta
03-29-2003, 07:08 AM
The following information comes from an interview between Josh Walrath (Penstarsys) and Mike Hara, Nvidia's VP of Investor Relations and Communications. It's a long interview but well worth a read, all Nvidia's products are covered.

Chip Production

This then leads to the low-k dielectrics question. TSMC was initially supposed to implement low-k in their 0.13u Cu process in Q3 2002, but it didn’t turn out to be the case. NVIDIA initially designed the NV-30 to utilize a low-k design, but that was dropped very shortly due to the increased risk that the low-k process will not be available. This turned out to be a good decision, but one that was not important enough to get the NV-30 out in a timely manner. Due to these process technologies, the NV-30 was delayed time and again, and only now are we seeing these products trickle into the market. Going with IBM as a primary partner will help to alleviate these problems, as the IBM fab at East Fishkill is begging for customers to use their advanced processes on 300 mm wafers. IBM already has available low-k and SOI features for bulk 0.13u Cu products. NVIDIA thinks that this is perfect for their next round of products.

Engineering and Product Development

One of the main questions asked of NVIDIA is how they do it all? NVIDIA has grown from three engineers in the early 90's to over 400+ engineers this year. How exactly is this workforce divided up? Any company producing processors makes "teams" of engineers that are focused on one product or architecture. Here is the basic breakup of teams: there are three distinct GPU teams that work on a variety of parts, two platform processor teams (nForce and derivatives), and additional teams for the mobile market. All in all there are approximately 6+ teams at any one time at NVIDIA.

NV35

NVIDIA has also shown off working silicon of the NV-35 chip, and it appears to be significantly more powerful than the NV-30. Even though the NV-30 was delayed, the other projects continued on schedule, and the NV-35 was part of that schedule. Perhaps the NV-35 was originally positioned for a Fall 2003 release, but due to pressure from ATI as well as the disappointing launch of the NV-30, we can probably expect the NV-35 to be introduced within the next four months. While Mike couldn’t verify this, he did mention that this 2nd quarter was going to be VERY interesting.

Nforce2

Core logic is an area where there is a huge amount of innovation just waiting to happen. I originally asked Mike about improvements upon the audio portion of the nForce 2, and he basically gave me a much broader picture about what NVIDIA plans to do. The entire platform needs to be examined to get a good concept on what is coming up. The MCP is the area where the most amount of innovation will occur. No longer is the MCP really a Southbridge, but rather a multi-media controller. Currently the MCP-T provides high quality audio, router and multistream capabilities, USB 2.0, 1394, and a host of typical Southbridge features. Future products will include wireless support, Gig-E, Serial ATA, and other technologies. NVIDIA tries to maintain their balance here by watching technology trends.

Final Thoughts

Mike could not stress more that the 2nd quarter of this year will be very, very interesting to watch. We will see a greater variety of FX products hit the streets, as well as the possible introduction of the NV-35. Put on top of that the possible mobile refresh, as well as the possible next generation of nForce parts. NVIDIA is still a company to keep your eye on.
Penstarsys

Full interview here >> (http://www.penstarsys.com/Interviews/nvidia/m_hara/index.html)

weta
04-03-2003, 04:22 AM
You may remember I wrote a mini-rant about NVIDIA's "The Way It's Meant To Be Played" logo campaign and how it relates to S.T.A.L.K.E.R. when they said the game engine was built using a Radeon 9700. I emailed the guys who are developing the game, asked for clarifications, and got a response from Oles V. Shishkovtsov. Here's what he had to say.

We use all hardware for development of Stalker, although most are NVIDIA. We will produce a game with the best quality, compatibility, and performance on all supported hardware (T&L or better). Here are the general things. As a programmer, I need to get access to the latest hardware and talk to its manufacturers, otherwise we may get way behind the competition. I want to give credit to NVIDIA for agreeing to be our technical partner and render us this kind of assistance (we contacted NVIDIA and ATI for several months, but ATI did not respond). NVIDIA offers me early hardware and very good support. Prior to GeForceFX I worked with Radeon 9700 but I am currently developing the Stalker engine on NV35. Naturally, such close work with NVIDIA engineers allows me to come up with better optimizations and support the new technologies of NVIDIA boards.

In Stalker you'll be able to play fine on NVIDIA and ATI hardware - the gameplay and run stability should be the same. On ATI boards Stalker will run fast, but on NVIDIA boards it will run even faster, plus gamers will get a set of unique effects, namely due to close work with the company engineers and support of NVIDIA hardware features.

That helps clear things up, and I want to thank Oles for taking the time to respond to my email.
3dgpu

weta
04-07-2003, 04:43 AM
There is a very good possibility that the NV35 will be officially announced at E3 (13-16th of May).
Geforce FX6800 Ultra and Geforce FX11600 Ultra are the favourite names for Nvidia's new high end card.

E^vol
04-08-2003, 08:06 AM
Geforce FX6800 Ultra and Geforce FX11600 Ultra are the favourite names for Nvidia's new high end card.

Geforce FX11600 Ultra ?? LMAO !!! nVidia wants to "better" ATI in the name game too ? hahaha, ATI's R400 is supposed to be the Radeon 10000, so nVidia might name their card the 11600 ? :rofl: :rofl:

weta
05-03-2003, 04:58 AM
Geforce FX 5900

0.13 micron chipset
256bit memory controller
425/850MHz core/memory
128mb DDR memory
28GB/sec bandwidth

399.00 USD

Geforce FX 5900 Ultra
Same spec as FX 5900 except for the memory

256mb DDR memory

499.00 USD

Reportedly the overheating problems have been solved, so it should be back to a quieter and more conventional cooler.
The Inq
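The quoted 28GB/sec figure checks out against the other specs in that list; a quick sketch of the standard memory-bandwidth arithmetic (DDR transfers twice per clock, and the bus width in bits divided by 8 gives bytes per transfer):

```python
# Sanity check of the FX 5900's quoted 28GB/sec memory bandwidth.
# 425MHz memory clock, double data rate -> 850MHz effective.
bus_width_bits = 256
effective_mhz = 850

bytes_per_transfer = bus_width_bits // 8                      # 32 bytes
bandwidth_gb_s = effective_mhz * 1e6 * bytes_per_transfer / 1e9

print(f"{bandwidth_gb_s:.1f} GB/sec")  # 27.2 GB/sec, ~28 as quoted
```

The same formula on the NV30's 128-bit bus at 500MHz DDR gives 16 GB/sec, which is where most of the NV35's raw advantage comes from.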

weta
05-10-2003, 04:59 AM
Nvidia are going to launch a budget version of their Geforce FX 5900, the card should be available in July, and will cost $299 USD.

weta
05-10-2003, 05:17 AM
Leaked Geforce FX 5900 Ultra photo

rugbydude
05-10-2003, 05:22 AM
Thats one good looking fan:drool: :drool: :drool: :drool:

t00lb0x
05-10-2003, 10:39 AM
yeah, if that can destroy the 9800P it'll be my card ;)

Tomi_s
05-10-2003, 04:48 PM
With that small fan the noise should be quite nasty! :rolleyes: :rolleyes: :D :D :D

Mr.Tweak
05-10-2003, 04:51 PM
I'll let this out of the bag, it won't do any harm.

nVidia told us in a conference call that the NV35 cooling system will be 100% quieter than the FX Flow system of the NV30.

t00lb0x
05-11-2003, 02:11 AM
DAMN!!!!! you were on a conference call w/ nVIDIA??? LoL if you guys review that chip...omg you would be our gods LOL

E^vol
05-11-2003, 10:34 PM
Originally posted by Mr.Tweak
I'll let this out of the bag, it wont do any harm.

nVidia told us in a conference call that the NV35 cooling system will be 100% quieter then the FX Flow system of the NV30.

That's still pretty loud.... :rofl:

E^vol
05-11-2003, 10:35 PM
DAMN!!!!! you were on a conference call w/ nVIDIA??? LoL if you guys review that chip...omg you would be our gods LOL

The TweakTown crew are gods....:group:

Edit: How come there isn't a "brown-noser" smily ? :rofl:

weta
05-14-2003, 06:27 AM
Here's a photo of Gainward's Geforce FX 5900 Ultra 256mb board

t00lb0x
05-14-2003, 07:33 AM
Damn, that looks pretty good (besides the ugly red IMHO) but i like hercules based boards, they always seem to be able to o/c a lot

E^vol
05-14-2003, 08:10 AM
Yup, that looks pretty sweet !!! :thumb:

Mr.Tweak
05-14-2003, 10:39 AM
Here are some benchmarks in Doom III taken from FX 5900 Ultra conference which we are in now.

t00lb0x
05-14-2003, 10:53 AM
Well according to nVIDIA it stomps the 9800 but who knows...

Beefy
05-14-2003, 11:54 AM
woah.. it's a DOOM III performance chart.. What's that? :)

Looks nice though. I'm just sick of the rate at which video cards are becoming obsolete now. It's too hard to keep up.

rugbydude
05-14-2003, 02:05 PM
Can we believe these benchmarks though. I mean doom 3 hasn't even been released and neither has the board so how can they have reliable benchmarks:confused:

Mr.Tweak
05-14-2003, 02:19 PM
Can we believe these benchmarks though. I mean doom 3 hasn't even been released and neither has the board so how can they have reliable benchmarks:confused:

Stick with the times - nVidia works closely with ID Software and a number of other game companies.

Beefy
05-14-2003, 02:30 PM
Can we believe these benchmarks though. I mean doom 3 hasn't even been released and neither has the board so how can they have reliable benchmarks:confused:

This time of year for the gaming / hardware people is like Christmas come early. The time around E3 is always bursting with new technology and software, games and gadgets, so companies pull together beforehand and organise some things to show the public.

Yes, we can believe these benchmarks, as many people are now getting access, albeit limited, to the new FX cards and to a specially made Doom 3 benchmark. If you had a look around other hardware sites, you'll see more of the same.

rugbydude
05-14-2003, 02:34 PM
ok now we know the benchmarks are reliable does this mean that nvidia don't want to show benchmarks for other games?? maybe open gl or direct 3d games favour the 9800 pro. Anyone have any info on this:confused:

Mr.Tweak
05-14-2003, 02:56 PM
ok now we know the benchmarks are reliable does this mean that nvidia don't want to show benchmarks for other games?? maybe open gl or direct 3d games favour the 9800 pro. Anyone have any info on this:confused:

Maybe, maybe not.

They would have chosen Doom III because many people are looking forward to it.

Beefy
05-14-2003, 03:43 PM
That and the fact that Doom II is going to be pushing the boundaries of current day gaming, and will push hardware more than current games do. Why not show how 'future-proof' the card is by showing how much better it works with future games?

Mr.Tweak
05-14-2003, 03:49 PM
That and the fact that Doom II is going to be pushing the boundaries of current day gaming, and will push hardware more than current games do. Why not show how 'future-proof' the card is by showing how much better it works with future games?

Exactly, but I think you mean Doom III ;)

Beefy
05-14-2003, 05:04 PM
No, I meant Doom 2. That game is top of the range now. I can't wait til they release them voodoo cards which apparently make 3D stuff look smooth. It's just a shame that it needs to run beside my existing video card....

:rolleyes2 Ok, you win...

weta
05-15-2003, 10:25 PM
Here's a photo of MSI's Geforce FX 5900 128mb board

Tomi_s
05-15-2003, 10:31 PM
Hmmm...nice fan...how much does it weigh? :?: :rolleyes: :D

t00lb0x
05-15-2003, 11:43 PM
The fans look really weird on all the new NV35 boards i've seen...but they look sooo originally cool

rugbydude
05-16-2003, 01:00 AM
The designers have been hard at work:thumb:

weta
05-16-2003, 05:47 PM
Leadtek Geforce FX 5900 Ultra 256mb box shot

weta
05-16-2003, 05:50 PM
Aopen Geforce FX 5900 box shot

weta
05-16-2003, 05:53 PM
Asus Geforce FX 5900 Ultra 256mb box shot

weta
05-16-2003, 07:53 PM
MSI® TWIN FLOW COOLING ... BLAZING SPEEDS WITHOUT SWEAT

Continuing the MSI tradition of incorporating only the best engineering processes and design techniques, the GeForce FX 5900 GPUs take advantage of the most advanced and sophisticated TWIN FLOW COOLING technology. This twin ventilation cooling mechanism enables higher performance through faster clock rates, while still remaining at a comparably low GPU temperature -- a perfect craft for perfect graphics from MSI.

rugbydude
05-17-2003, 02:45 AM
Thanx for the pics wetta. As always ure first with tha latest pics and news. And well done on being awarded a new title:thumb:

Tomi_s
05-17-2003, 04:02 AM
Yeah, congrats weta! :cheers: :cheers:

weta
05-17-2003, 05:41 AM
Thanks lads, a little overwhelmed by the title to be honest, but I'll do my best to live up to it.

t00lb0x
05-17-2003, 06:55 AM
Yep, always with the newest news on VID-cards, now just to test them :D hehe. But congrats (I should be offically TweakTown's Beloved noob :P)

rugbydude
05-17-2003, 11:02 PM
Yep, always with the newest news on VID-cards, now just to test them :D hehe. But congrats (I should be offically TweakTown's Beloved noob :P)

Err u mean anoying noob :laugh:

t00lb0x
05-17-2003, 11:05 PM
Rugby you should be spammin' outcast...so leave! and no beloved noob, you noob go play your ghei sport with no pads

E^vol
05-18-2003, 04:09 AM
Rugby you should be spammin' outcast...so leave! and no beloved noob, you noob go play your ghei sport with no pads

Rugby players don't need padding...They're not wimpy like football players.

rugbydude
05-18-2003, 04:51 AM
Rugby players don't need padding...They're not wimpy like football players.
Well said well said:D And TB i was just joking i have a dry sense of humour like most british. U really are the beloved noob but honestly u shud play some rugby. U wudn't think it was gay then:laugh: probs get ure arsed kicked like most americans that try to play the sport for their first time:laugh: they get the hang of passing the ball backwards eventually:rofl:

t00lb0x
05-18-2003, 05:02 AM
Yeah i know rugby is leet, always wanted to play it. :D Just wanna hit kids...seriousily though i think i should get the title beloved noob or somethin..anyways I love watching rugby but its onlike Fox sports 300...even LAX gets more views in the US. I have the ultimate ultimatum...are british girls really ugly or is that just bull****?
--I think we should delete the posts above before we get kicked for being spam mastas.

rugbydude
05-18-2003, 05:07 AM
Well some are ugly and some aint its the same with every country. Anyway leave the posts mate becuz the lead spammers on this site aren't us. Its Wiggo and Beefy:laugh: They hijack all the threads to go on about spam:laugh:

t00lb0x
05-18-2003, 05:18 AM
True, true..whateva i r spam masta *gets banned*

Beefy
05-18-2003, 10:52 AM
If you guys don't shut up, you'll all get new titles. Plus the S.P.A.M. wiggo and I are talking about has nothing to do with spam posts. It's a little organisation we've formed, and you guys are shaping up to be new targets. So drop it all. I don't wanna see another post on it. If ya wanna keep it up, go to the beer garden.

t00lb0x
05-18-2003, 01:20 PM
Imma try a post like Wiggo or you:
:cry:
--Fin

rugbydude
05-18-2003, 05:39 PM
Ok i'm sorry beefy i was out of line last night. Right now lets get back to the topic at hand the nv35. Now does anyone (Weta) know when the first cards will be released and for how much:confused:

weta
05-18-2003, 07:26 PM
Nvidia's European Sales Director Roy Taylor confirmed on Friday at the System Builders Summit, that the Geforce FX 5900,
and FX 5900 Ultra will start shipping in June, with the FX 5900 value model following in July. Guideline prices are $299 (value),
$399 (standard), and $499 (ultra). (US Dollars)

t00lb0x
05-19-2003, 12:54 AM
Who is going to use the reference board and tweak it? I mean who else besides MSI is gonna make a 5900U FX?

E^vol
05-19-2003, 03:07 AM
Who is going to use the reference board and tweak it? I mean who else besides MSI is gonna make a 5900U FX?

Judging by the response that nVidia received from their showing at E3...There'll be quite a few ! :group:

t00lb0x
05-19-2003, 06:55 AM
Ok good, E^vol look at my new tag i'm not just a member anymore!

E^vol
05-19-2003, 07:16 AM
Ok good, E^vol look at my new tag i'm not just a member anymore!

Yeah, but it's not a positive status tag. :no:

Beefy
05-19-2003, 08:06 AM
toolbox, do you ever stay on topic?

t00lb0x
05-19-2003, 08:38 AM
No i stopped after people just stopped caring mister ''Beefy
Not-so-nice Adm'' and I hope I get smacktard!

E^vol
05-19-2003, 09:07 AM
weta, have you come across any performance numbers for the FX 5900 value model ?

homeworld1031tx
05-19-2003, 10:20 AM
okay, well since you just posted those prices Weta, im feeling kind of bad. I was thinking the Ultra was going to be 400 dollars:cry: :cry: :cry: :cry: . Well will the standard be able to still beat the 9800 PRO, or should i just go ahead and buy the 9800 PRO like i was planning to anyways?

weta
05-19-2003, 06:33 PM
E^vol. These are the basic specs for all three cards.

GeForce FX 5900 Ultra, 256Mb memory, 450/425MHz core/memory, analog, DVI, and VIVO.
GeForce FX 5900, same as Ultra version except, 128Mb memory, 425/425MHz core/memory.
GeForce FX 5900 (value) 128Mb memory, 400/400MHz core/memory, no VIVO.

homeworld1031tx. Haven't seen much information on the standard version so I'm not sure, but it must be pretty close.

E^vol
05-19-2003, 09:39 PM
Thanks for the info weta !
I think we all have something to think about now.
Last year, it was clear....ATI !
But from what we've seen, it's a race again.
Either way you look at it, we the consumers, win ! Even if I can't afford the 9800Pro or FX 5900 Ultra, when they appear in stores, the price of the 9700Pro will drop quite a bit, and that's still a great card !!! :2cents:

weta
05-20-2003, 08:11 AM
Here's a photo of Nvidia's Geforce FX 5900 Ultra 256mb reference board

The production version will be approx 1" shorter

t00lb0x
05-20-2003, 11:20 AM
That looks incredible! Thx for the info

E^vol
05-21-2003, 08:20 AM
GeForce FX 5900 Ultra : 450/850Mhz core/mem, 256MB, 256-bit memory bus, DDR/DDR-II support, 3rd Week of June, US$499
GeForce FX 5900 : 425/850Mhz core/mem, 128MB, 256-bit memory bus, DDR support, 3rd Week of June, US$399
GeForce FX 5900 Value : 400/800Mhz? core/mem, 128MB, 256-bit memory bus, DDR support, 3rd Week of June, US$299

Source : VR-Zone (http://www.vr-zone.com)

rugbydude
05-21-2003, 02:09 PM
In my opinion the reference cooler looks like one of the best. But looks aren't everything, we're gonna have to wait to find out if any of these nice looking coolers work.:D

E^vol
05-21-2003, 11:45 PM
...And just how loud it really is too !

rugbydude
05-25-2003, 04:21 PM
Thats true, i mean they don't look loud but they could be just as loud or maybe even louder:eek:

Tomi_s
05-25-2003, 04:29 PM
What's so special about that GeForce cooling system, compared to ATI's cooling systems? :rolleyes2 :rolleyes2 :rolleyes2

rugbydude
05-25-2003, 04:44 PM
I'm not sure at the moment but the fact the ram is under the heat sink as well is a very good attribute. And Weta i just saw ure name that u added to the card above. very sneaky:laugh:

Tomi_s
05-25-2003, 05:59 PM
BTW, you have posted quite a lot of stuff in the forums rugby! Almost 1100 posts!!! :wow: :wow: :wow: :thumb:

rugbydude
05-25-2003, 08:38 PM
:laugh: a bit off topic but thnx :laugh: and heres me thinking i had 300 :rofl: Anyway have any benchmarks come out from these cards yet?? or news on how loud or effective these nice looking fans are:confused:

weta
05-25-2003, 10:27 PM
Rugbydude: ExtremeTech has retested the Radeon 9800 Pro, Geforce FX 5800 Ultra, and Geforce FX 5900 Ultra boards
with 3DMark03 (build 330) "cheats disabled", and compared the results to those achieved in their earlier (build 320) tests
(see attachment). Some results show that ATI has (had) "optimized" drivers as well.

Wiggo
05-25-2003, 11:18 PM
BTW, you have posted quite a lot of stuff in the forums rugby! Almost 1100 posts!!! :wow: :wow: :wow: :thumb: And most of it pointless. :tears:

Tomi_s
05-26-2003, 01:23 AM
:laugh: :laugh: :laugh:

homeworld1031tx
05-26-2003, 06:31 AM
Do u guys think i should go out and get a 9800 PRO 128 off of eBay for $350.00, or wait for the GF FX 5900 Standard. (the only difference between the Ultra and Standard is the Memory size, right?):confused:

weta
05-26-2003, 08:11 AM
Homeworld1031tx: 1. Wait, 2. Yes :thumb:

rugbydude
05-26-2003, 04:06 PM
And most of it pointless. :tears:
I sure hope ure joking:hmph: And if ure not the irony is that that post was pointless in itself:laugh:
Anyway thnx for the graphs Weta. But are you sure this patch stops them from cheating:confused: I mean by the looks of things it just lowers the scores a little:o

weta
05-26-2003, 06:29 PM
Game Play Testing

A time consuming part of this preview was spent on measuring performance during game play. Using FRAPS to record average and minimum frame rates, I played a variety of games that included first person shooters, flight and race sims, and role playing titles. Most of the testing involved playing in three to five minute sessions with various high quality graphics settings enabled in order to get a "feel" for the performance of the GeForce FX 5900 Ultra.

Although game play results are not as accurate as playing back a fixed-path benchmark, I've invested a great deal of time repeating game play scenarios in order to increase their accuracy during subsequent runs. There are times when a game play scenario can't be repeated, such as a bot match in Quake or Unreal Tournament. But results from subsequent matches should be close to those obtained from the initial match when the same graphics settings are used. While variations in performance will occur, they tend to have a greater impact on the minimum frame rate. I am confident that these results will be representative of the performance one would obtain with a GeForce FX 5900 Ultra on a similarly configured system.
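For anyone curious how those FRAPS numbers reduce to an average and a minimum, here's a rough Python sketch. The per-frame timestamp layout is an assumption for illustration, not FRAPS's exact log format:

```python
# Sketch: derive average and minimum FPS from a list of per-frame
# timestamps in milliseconds, the kind of data a FRAPS frametimes
# log contains. The input layout here is an assumption.

def fps_stats(timestamps_ms, window_ms=1000.0):
    """Return (average_fps, minimum_fps) over whole-second windows."""
    total_s = (timestamps_ms[-1] - timestamps_ms[0]) / 1000.0
    avg_fps = (len(timestamps_ms) - 1) / total_s
    # Minimum FPS: count frames landing in each one-second bucket.
    counts = {}
    for t in timestamps_ms[:-1]:
        bucket = int((t - timestamps_ms[0]) // window_ms)
        counts[bucket] = counts.get(bucket, 0) + 1
    return avg_fps, min(counts.values())

# Synthetic 3-second run: one second each at 50, 25, and 40 FPS.
times, t = [], 0.0
for fps in (50, 25, 40):
    for _ in range(fps):
        times.append(t)
        t += 1000.0 / fps
times.append(t)  # closing timestamp at the 3-second mark

avg, low = fps_stats(times)
print(avg, low)  # average ~38.3 FPS, minimum one-second window 25 FPS
```

The average alone hides stutter; the per-window minimum is the number that drops first when settings like 4X antialiasing are enabled, which is why both figures are recorded.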

Unfortunately, due to time constraints, I was unable to provide comments on game play results. However, with the GeForce FX 5900 Ultra, 4X antialiasing and 8X anisotropic filtering can be enabled at 1024x768 or higher in all of the games I tested and a fluid frame rate obtained.

Finally, game play testing can reveal issues that can be system, driver, or hardware related. With the exception of antialiasing not functioning in Age of Mythology and Morrowind, the GeForce FX 5900 Ultra was rock solid.

Games Tested

Age Of Mythology - v1.05
Dungeon Siege - v1.11
Enclave Demo
IL2-Sturmovik - v1.2
Morrowind - v1.2.0722
Need For Speed 2 Hot Pursuit Demo
Quake 3 Arena - v1.30
Raven Shield Single Player Demo - v1.0
Return To Castle Wolfenstein Single Player Demo - v1.0.1
Star Trek Elite Force 2 Single Player Demo - v1.0
Unreal Tournament 2003 Demo - v2206
Unreal 2 The Awakening Demo - v.1000
Warcraft 3 - v1.05

Conclusion

While the Unreal 2 screenshots I provided are high quality PNG format, they don't represent the excellent image quality you can achieve with the NV35. Frankly, I find the continued bickering over image quality comparisons between graphics cards amusing. While one might be able to spot marginal differences when viewing static images, it's a different story when comparing images that are being rendered at 30 or more frames per second. NV News

Game play test results and screenshots (http://www.nvnews.net/previews/geforce_fx_5900_ultra/page_2.shtml#mythology)

weta
05-27-2003, 12:17 AM
Here's a photo of the new Asus V9950 videocard

Chaos
05-27-2003, 01:03 AM
I so love ASUS. Everything they do looks cool.


Chaos

weta
05-27-2003, 02:38 AM
Read Hot Hardware's "Radeon 9800 Pro 256MB Vs. GeForce FX 5900 Ultra" head to head review here (http://www.hothardware.com/hh_files/S&V/r9800256mb_gffx5900upd(4).shtml).

Tomi_s
05-27-2003, 03:52 AM
Wow, that Asus cooling system should do the job...two fans and a full copper heatsink! It's also quite heavy, isn't it? :D :D ATI and nVidia will have to invent new cooling systems for their cards. Soon they won't be able to fit the actual graphics components on the board cause the cooling systems take all the space! :hammer: :laugh: :laugh: :laugh:

homeworld1031tx
05-27-2003, 03:59 AM
Yeah it does look good..and heavy. Any pics on the Leadtek card Weta?

homeworld1031tx
05-27-2003, 04:01 AM
After looking at that review, it looks to me like nVidia finally decided to do well in high AA + AF scenes :afro: :afro:

E^vol
05-27-2003, 05:23 AM
Any other companies besides Leadtek that offer VIVO cards ? Any TIVO cards out there ?
Does anyone know if there'll be a Personal Cinema GeForce FX 5900 (Ultra or nonUltra) ?

rugbydude
05-27-2003, 05:42 AM
Well i have to say that asus one is the one i would go for so far. Although the heat sink doesn't look the nicest it looks as though it would be the most effective cooling option so far:D

homeworld1031tx
05-27-2003, 06:01 AM
Yeah ur right, cause it looks like only flat copper shims have direct contact with the core before the active cooling comes in. The aluminum cooler on my Leadtek probably outperforms it.:cantfocus :cantfocus :cantfocus :cantfocus :cantfocus
The card also looks a lot shorter than the production/test card that nVidia has.

homeworld1031tx
06-04-2003, 01:34 AM
The inquirer just posted an article saying that a Japanese site got a glimpse of the Asus cards, and that they jacked up the price !!!!!! they say $480 for the Ultra, 128 MB version.

weta
06-12-2003, 03:58 AM
Albatron FX5900UV

Nvidia GeForce FX5900 Ultra GPU (Clock 450 MHz)
256 MB/ 850 MHz/ 256-bit (4MBx32bit-2.2ns) DDR Memory
Pure copper fans using AB-Fan technology provide maximum cooling for your GPU
AGP 8X with D-Sub/ TV-Out/ DVI/ VIVO ports
CineFX 2.0 engine supports Microsoft DirectX 9.0 & OpenGL 1.4
Intellisample HCT, UltraShadow technology, Nview multi-display
Bundled DVD player software and two retail games

Here's a photo of the new Albatron FX5900UV videocard
(http://forums.tweaktown.com/attachment.php?s=&postid=154560)

rugbydude
06-12-2003, 04:50 AM
That is one beast of a cooling solution:thumb:

homeworld1031tx
06-12-2003, 05:59 AM
And i thought i had a good cooler on my Leadtek..........
How many watts does the 5900 Ultra put out???

E^vol
06-12-2003, 06:50 AM
DAMN !!! That's pretty sweet ! Probably heavy too though...

Chaos
06-13-2003, 01:06 AM
WOW

weta
06-14-2003, 02:45 AM
Here's a photo of Leadtek's new WinFast A350 TDH 256mb videocard

rugbydude
06-14-2003, 03:16 AM
BLOODY HELL!!!!!!I'M GETTING MORE AND MORE IMPRESSED BY EACH CARD AS IT COMES ALONG AND I HAVE TO SAY THAT IS NOW THE BEST ONE!!!! IT COOLS EVERYTHING:drool:

weta
06-14-2003, 03:16 AM
Here's a photo of eVGA's new Geforce FX5900 Ultra videocard (Now on sale in Japan)

weta
06-14-2003, 03:42 AM
Here's a photo of the Aopen's Aeolus FX 5900 videocard

E^vol
06-14-2003, 06:12 AM
BLOODY HELL!!!!!!I'M GETTING MORE AND MORE IMPRESSED BY EACH CARD AS IT COMES ALONG AND I HAVE TO SAY THAT IS NOW THE BEST ONE!!!! IT COOLS EVERYTHING:drool: Have you thought that it might trap the heat in ?

The__tweaker
06-17-2003, 03:42 AM
Anf i thought i had a good cooler on my Leadtek..........
How many watts does the 5900 Ultra put out???

Actually less heat than the 9800 Pro.

The Asus 5900 Ultra is what I'm waiting for, the amazingly cool look is also great. IMO :)

Terratec is offering VIVO on their 5900 Ultra too.

Here's a pic:

:beer: :beer: :beer:

The__tweaker
06-17-2003, 03:46 AM
This is something great too:

"Here are the default 3D and 2D clocks of the GeForceFX 5900 Ultra. Interesting that with Coolbits, we will now be able to OC the 3D side of things separately from the 2D side. Certainly this will go a long way in giving you a more stable system for everyday applications if you are an OCer with other things to do besides game. Great idea."




:cheers:

weta
06-24-2003, 04:46 AM
In celebration of our 4th Anniversary we are giving away one FREE card a day - every day - for one month in July and one grand prize on the 31st! That's 30 cards up for grabs + $1,000! Here is your chance to get yours. No purchase is necessary and we will even cover the shipping. Simply sign up on our website and our system will randomly pick the winners every day.*

Here is our list of cards that we will randomly be giving away throughout the month.

Five e-GeForce FX 5200, 128MB DDR, w/ TV-Out and DVI
Five e-GeForce FX 5200 Ultra, 128MB, w/ TV-Out and DVI
Five e-GeForce FX 5600, 256MB DDR, w/ VIVO and DVI
Five e-GeForce FX 5600 Ultra, 128MB DDR, DVI, VIVO, TV-out
Four e-GeForce FX 5900, 128MB DDR, DVI, TV-out
Two e-GeForce FX 5900 Ultra, 256MB DDR, DVI, VIVO, TV-out
Four Nvidia Personal Cinema
One Grand Prize of $1000.00!

* Look for entry information and further details on July 1, 2003! Remember: sign up early to have more chances to WIN! You only need to sign up once and you are entered all month long. You must check back daily to see if you are the winner; winners will be responsible for emailing our webmaster to claim the prize.

Enter eVGA's July giveaway (http://www.evga.com/articles/public.asp?AID=139) (Starts July 1st)

American and Canadian Residents Only

weta
06-26-2003, 02:01 AM
It's rumoured that Nvidia intends to up the speeds on its FX5900 GPUs to 450MHz for the standard, and 500MHz
for the ultra version.
This is most likely a reaction to ATI's upcoming R360 chipset (Radeon 9900 Pro) due to be announced next month.

E^vol
06-26-2003, 09:40 AM
It's rumoured that Nvidia intends to up the speeds on its FX5900 GPUs to 450MHz for the standard, and 500MHz
for the ultra version.
This is most likely a reaction to ATI's upcoming R350 chipset (Radeon 9900 Pro) due to be announced next month. R350 or R360 ?

weta
06-26-2003, 12:37 PM
E^vol: Cheers mate

E^vol
06-27-2003, 03:08 AM
LOL, it wasn't supposed to sound like an attack...Sorry, I was just trying to clarify.: peace2: :group:

weta
06-27-2003, 03:28 AM
E^vol: It wasn't taken as one, I was just thanking you for pointing out the mistake.

weta
06-27-2003, 06:09 PM
These are the first details for Chaintech's new FX 5900 cards.

Chaintech FX 5900, 400/850MHz core/memory, 128mb DDR memory
Chaintech FX 5900 Ultra, 450/850MHz core/memory, 256mb DDR memory

As with most manufacturers Chaintech appears to have followed Nvidia's reference design, with the exception of its "Gas Turbine" cooling system.

rugbydude
06-27-2003, 11:46 PM
U got any info on this "gas turbine" cooling system??:confused:

homeworld1031tx
06-28-2003, 01:52 AM
Actually less heat than the 9800 Pro.

The Asus 5900 Ultra is what I'm waiting for, the amazingly cool look is also great. IMO :)

Terratec is offering VIVO on their 5900 Ultra to.

Heres a pic:

:beer: :beer: :beer:
Really?!?!? I would think that the card puts out A LOT more, especially cause of that huge HSF on the 5900.

weta
06-30-2003, 06:02 PM
Gainward FX PowerPack 1600XP Ultra "Golden Sample"

450/850MHz core/memory (Standard)
460/870MHz core/memory (Gainward Expertool)

homeworld1031tx
07-01-2003, 03:12 AM
Gainward FX PowerPack 1600XP Ultra "Golden Sample"

450/850MHz core/memory (Standard)
460/870MHz core/memory (Gainward Expertool)

When you say "(Standard)", do you mean that nVidia decided to up the speed of the standard 5900 to 450\850, or that the standard of this card is 450\850 and you can also get it in the 460\870 flavor? Cause i think you (or maybe it was the inquirer:?: :?: ) said something about them changing the speeds cause of the 9900

weta
07-01-2003, 03:50 AM
homeworld1031tx: The standard speed for Nvidia's Ultra version is 450/850MHz, with Expertool, Gainward FX 5900 Ultra owners can increase the core/memory to 460/870MHz.

weta
07-02-2003, 06:44 PM
ELSA Gladiac FX 935 128MB graphics card: with the exception of the heatsink/fan (60mm fan, 33dB), ELSA has followed Nvidia's reference design.

weta
07-03-2003, 01:30 AM
MSI FX 5900 Ultra
It appears that nearly all Nvidia's board partners are following the reference design for their Ultra version.

homeworld1031tx
07-03-2003, 02:35 AM
ELSA Gladiac FX 935 128MB graphics card, with the exception of the heatsink/fan (60mm fan, 33dB), ELSA has followed Nvidia's reference design.


And the power connector is facing backwards instead of vertical, which is good news.:D :D

And is ELSA a new company? Cause i have never heard of them in any computer parts-related stuff.

weta
07-03-2003, 07:47 PM
Suma has announced two FX 5900 boards, here's a picture of their Platinum GeForce FX 5900 SE 256mb version.
Once again it's very similar to Nvidia's reference design, the obvious exceptions being the rearward facing molex
connector and the PCB extension carrying the board's title.

weta
07-03-2003, 07:54 PM
When you're asked to pay $499 US for a graphics card, I guess it's nice when it comes well packaged.
As you can see in the attachment, Suma have done a great job here when compared to some other manufacturers.

The__tweaker
07-04-2003, 02:41 AM
wow nice box.. :)

weta
07-05-2003, 02:21 AM
Now that Martin Haufschild (PNY Marketing) has confirmed that Nvidia's FX 5900 Value card will have a 256-bit memory interface,
I'm guessing that the current FX 5900 Standard will become the future FX 5900 Value, the current FX 5900 Ultra will become the future FX 5900 Standard, and the yet to be released 500MHz-core NV35 will become the future FX 5900 Ultra. Of course this would only apply to the GPU and not the memory.

Now
FX 5900 Standard = 400MHz core
FX 5900 Ultra = 450MHz core

Future
FX 5900 Value = 400MHz core
FX 5900 Standard = 450MHz core
FX 5900 Ultra = 500MHz core

weta
07-14-2003, 08:07 PM
Abit has announced its new Siluro FX 5900 OTES, more details are available here (http://forums.tweaktown.com/showthread.php?s=&threadid=11881).

The New Standard in Silent Computing: 25 Decibels
In the interests of raising the standards of silent computing, ABIT has been working to establish a benchmark with which to judge silent computing. The ABIT Engineers have determined that 25 decibels is the optimum noise level for a system. On the ABIT Siluro FX5900 OTES, the ABIT Engineers have reduced the sound of the VGA card to about 25 decibels. How quiet is 25 decibels? Consider that a garbage truck operates at about 100 decibels, you speak at around 60 decibels, your living room is approximately 40 decibels and a library is at 30 decibels. Sounds that most people cannot hear start at 10 decibels. With OTES, your VGA card will operate at the coolest levels and your gaming and 3D performance at the highest levels.
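Since the decibel scale is logarithmic, those comparisons are bigger than they look. A quick sketch of the arithmetic (the 25 dB figure is ABIT's claim; the rest is just the standard dB-to-intensity formula):

```python
# Decibels are logarithmic: a 10 dB step is a 10x change in sound
# intensity, so the ratio between two levels a and b is 10**((a - b) / 10).

def intensity_ratio(a_db, b_db):
    return 10 ** ((a_db - b_db) / 10)

CARD = 25  # ABIT's quoted level for the Siluro FX5900 OTES

print(intensity_ratio(30, CARD))   # library: ~3.2x the card's intensity
print(intensity_ratio(60, CARD))   # speech: ~3,162x
print(intensity_ratio(100, CARD))  # garbage truck: ~31.6 million times
```

So even the 5 dB gap between a library and the quoted card is roughly a threefold difference in sound intensity, which is why a few decibels either way matter so much when comparing coolers.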

Abit Siluro FX 5900 OTES image (http://forums.tweaktown.com/attachment.php?s=&postid=161206)

weta
07-17-2003, 08:24 PM
Spotted this on my travels

I thought I'd post this here as Google will do its thing and hopefully help anybody else who encounters this bug, as I've heard of a few who have.

If you're using a mouse, specifically a Microsoft mouse, and you hear a strange noise when using the scroll wheel in Internet Explorer and you have a GeForceFX 5900 or an Ultra model, the fix is to turn off smooth scrolling in Internet Options and the noise should disappear.

This bug is very random and seems to happen with MS mice and the Intellipoint software being installed and it may not happen even if you have the above.

So I hope that helps anyone from tearing their hair out like I have been for the last few days and hopefully this can be fixed with a software update from MS or Nvidia.

Edit: Apparently it's not limited to MS mice, it's anything that uses a scroll wheel, and it happens in Explorer and other programs etc. Also, according to eVGA it has something to do with the voltage levels of the card, and Nvidia has been notified and is looking into it.

weta
08-06-2003, 03:20 AM
2D flickering of GeForce FX 5900 – a bug or lame PCB design?

Andrew Vorobyew, our video section editor, answering forum questions about the screen flickering of NVIDIA GeForce FX 5900-based cards, said that the apparent cause of the 2D flickering during scrolling of large, picture-rich pages was the 2D clock speed not being reduced, but left set at the 3D nominal. The same flickering occurs in a number of menus and intermediate screens between tests (i.e. in 3DMark03).

This problem was examined on 4 different graphics cards with different drivers. FX5800 doesn't produce such artefacts, but one of the old 5900 Ultra cards on different PCB caused the flickering as well.

Andrew believes the flickering is caused by a hardware GPU bug or PCB circuitry (repeated in the new design) that produces crosstalk and/or raises 2D clock.

Digit-Life

clubsport
08-06-2003, 06:28 PM
2D flickering of GeForce FX 5900 – a bug or lame PCB design?

Andrew Vorobyew, our video section editor, answering these forum questions related to screen flickering of NVIDIA GeForce FX 5900-based cards, informed that it seemed the reason of 2D flickering during scrolling of large and picture-rich texts was the 2D clock speed not reduced but set to the 3D nominal. The same flickering occurs in a number of menus or intermediate screens between tests (i.e. in 3DMark03).

This problem was examined on 4 different graphics cards with different drivers. FX5800 doesn't produce such artefacts, but one of the old 5900 Ultra cards on different PCB caused the flickering as well.

Andrew believes the flickering is caused by a hardware GPU bug or PCB circuitry (repeated in the new design) that produces crosstalk and/or raises 2D clock.



Hi,

I think it looks like the new ATI 9900 Pro will be an improved version of the
9800 Pro.


weta
08-07-2003, 03:04 AM
We have become aware that both NVIDIA and competitor’s cards are experiencing similar problems that are described as “flickering”, “rolling lines” or “quivering” in different forum posts.

NVIDIA has been investigating these problems; we are having a difficult time reproducing the issue. We are under the impression it is a noise issue, not a graphics card issue and is system specific.

In an effort to find a solution, we ask that anyone in the Bay area who is experiencing this issue on an NVIDIA GPU and would like to visit NVIDIA with the system in tow contact us. We can use the system to identify the problem and see if we can resolve the issue. You will be doing a favor to NVIDIA and a service to the community. We will also make sure you leave with an arm full of NVIDIA swag!

First come first served! We can not look at every system in the Bay area, sorry.

Contact [email protected]

NV News

weta
08-25-2003, 08:27 PM
Here's a picture of Gainward's new FX 5900 SE (Value) video card.

weta
08-26-2003, 12:26 AM
Here's a picture of Leadtek's GeForce FX 5900 (Value) card.

Soulburner
08-26-2003, 10:24 AM
Looks like mine...but mine's not the weak version...

Mine was the $350-380 one with ViVo...

Runs as fast as the Ultra...shown by my 3DMark03 score...:flames:

weta
08-27-2003, 02:40 AM
Asus V9950SE (GeForce FX 5900 Value)

Nvidia GeForce FX 5900SE GPU
Core/memory 400/700MHz
128MB DDR memory (2.8ns)
400MHz RAMDAC
VGA, DVI and television out

weta
08-27-2003, 08:34 PM
Here's a picture of Sparkle's GeForce FX 5900 SE (Value) graphics card

ZEROKOOL
08-31-2003, 05:31 AM
I so love ASUS. Everything they do looks cool.


Chaos

omg, are you BLIND? i love asus, but they're about as BLAND and BORING as a company can get with products, unless of course you just LOVE green PCB...


It's rumoured that Nvidia intends to up the speeds on its FX5900 GPUs to 450MHz for the standard, and 500MHz

well i guess that rumour was wrong...

weta
08-31-2003, 06:09 AM
ZEROKOOL: Nvidia is now releasing the NV38 instead, in response to ATI's soon to be announced R360. Whilst figures aren't available yet, the NV38 is believed to be around (500-550MHz) core, and (1000MHz) memory. The highly anticipated NV40 PCI-Express/AGP8X card won't be available until next year.

JSR
09-01-2003, 01:53 PM
hehe.......showin' ati a carrot,........... let them bite, ...........then the el kabong :hammer:

weta
09-10-2003, 05:32 PM
Nvidia was made aware of an issue involving the GeForce FX 5900 series of graphics cards where some users have reported a slight "flickering" of the image on their screen, visible on a light color background in certain 3D scenes.

In examining the information available, Nvidia has determined that the issue may involve an interaction between select combinations of the graphics card, system, monitor, other electronic components and the application which creates noise or signal interference.

If you are experiencing this issue on a GeForce FX 5900 or GeForce FX 5900 Ultra card, you can download this driver fix.

If you experience any other issues - please email us at [email protected]

Download Fix (http://bjorn3d.com/files/nvidia/45.33_xp.zip) Windows XP

Bjorn3D

weta
09-17-2003, 11:45 PM
Gainward CoolFX PowerPack! Ultra/1600 XP "Golden Sample"

The Gainward CoolFX PowerPack! Ultra/1600 XP “Golden Sample” features a hand selected GeForce FX 5900 Ultra GPU running at 500Plus MHz, 256 MB of carefully qualified 900 MHz DDR, video-in/out, DVI, plus integrated Gainward CoolFX water cooling. The Gainward CoolFX PowerPack! Ultra/1600 XP “Golden Sample” is immediately available through Gainward’s worldwide distribution and retail network and priced at €899 incl. VAT.

Gainward CoolFX is based on a revolutionary design featuring the world’s most reliable Eheim water pump with an average 10 years life span, while conventional lower cost solutions are using lower quality water fountain pumps which may not even survive the mandatory European warranty period of two years. Equipped with a 400 Watt radiator and a 12 cm low noise fan, Gainward CoolFX can be extended to also cool down the CPU (Intel or AMD) and the North Bridge to almost room temperature. In order to avoid any leakages and water evaporation through the pipe system Gainward is using special plastic pipes and fittings to connect the three components into a sealed water circulation system.

Conclusion
So what do I give it for an award? Well it definitely gets the MODFATHER AWARD, as the potential of this card is so much more than just a card. This can be the starting block of a complete watercooled system if you wish, or just a stunning contribution to your whole system. As to its main rating, that is so much harder, as this is so much more than just a graphics card. Without the watercooling it is one hell of a product; with it, it is truly something special. This card has created a class of its own and there is in reality nothing to compare it to. Therefore with some reluctance I will give it a GOLD AWARD, the reluctance purely because this is as high as our awards go and this card really should be on a pedestal of its own.

The Modfathers

Read the full review (http://www.themodfathers.com/reviews/coolfx.php)

weta
09-24-2003, 03:50 AM
XFX Personal Cinema FX 5900

This should interest quite a few people, XFX is displaying a Personal Cinema Edition of Nvidia's GeForce FX 5900 at Computex 2003.

weta
09-27-2003, 05:50 AM
Nvidia's new graphics card partner

GigaByte used Computex 2003 to show off its new GeForce FX 5900 card.

weta
10-18-2003, 01:13 AM
<center>DirectX 9.1 and GeForce FX cards update</center>

French website PC Inpact claims that DirectX 9.1 opens up a new bag of tricks in favour of Nvidia's GeForce FX. As you know, Shader 2.0 performance under DX9 has been the Achilles heel of the GFX series bigtime. The site figures that shader performance can be boosted by up to 60% in favour of the GeForce FX series. Read that again, up to 60% better shader performance, not overall performance !

guru of 3D

From a major news source, Microsoft is, with the assistance of Nvidia, about to release a new version of the great and essential DirectX 9, and it will be DirectX 9.1. What's so amazing about this? Well, this new DX will favour Nvidia's GPUs, especially regarding pixel shaders 2.0. When accompanied by a regular driver update, the expected performance would live up to the highest hopes for the Californian company. Performance in DX9 (for example HL2) would then be similar to that of the competition in terms of quality and especially in terms of speed. And all in 32-bit precision mode in PS 2.0.

Maybe it comes from better management of PS 2.0 instructions (improved handling of streaming instructions). To quantify, the gains could reach 60%. A percentage that seems confusing and amazing, but one we can't verify before it is released. All that we can say is that if this occurs exactly as announced, it will be an enormous relief for all the purchasers of GeForce FX cards.

PC Inpact