
View Full Version : NV40



weta
11-18-2002, 03:34 AM
90nm chipset
200m transistors
DirectX 10 support

See you Q3, 2004

Sepherum
11-18-2002, 05:19 AM
what happened to the new NV30?

weta
11-18-2002, 05:44 AM
Nothing, it's being launched tomorrow at Comdex.

Wiggo
11-18-2002, 08:07 AM
Already startin' on the next one I see. :D

:beer: :beer: :beer: :beer: :beer:

E^vol
11-19-2002, 05:25 AM
YAA BABY !!! hehehe:laugh:

Valkyrie
11-19-2002, 08:14 PM
Some more NV40 Specs

300-350 Million Transistors on 90-nm process
750-800 MHz Core
16MB Embedded DRAM (134M trans.)
1.4 GHz 256-512MB DDR-II Memory
8 Pixel Rendering Pipelines (4 texels each)
16 Vertex Shader Engines
204.8 GB/sec Bandwidth (eDRAM)
44.8 GB/sec Bandwidth (DDR-II)
25.6 GigaTexels per Second
3 Billion Vertices per Second
DirectX 10
Release H2 2004
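For what it's worth, most of those throughput figures follow arithmetically from the bus widths and clocks. A quick sanity check (the 1024-bit eDRAM bus and the 1.6GHz effective eDRAM clock are my own assumptions, not part of the posted spec):

```python
# Back-of-envelope check of the rumoured NV40 throughput figures.
# Bus widths and effective clocks marked "assumed" are guesses, not leaked data.

def bandwidth_gb_s(bus_bits, effective_mhz):
    """Peak bandwidth in GB/s: bytes per transfer times transfers per second."""
    return (bus_bits / 8) * (effective_mhz * 1e6) / 1e9

# Assumed: 1024-bit embedded DRAM bus (Bitboys-style) at 1.6GHz effective
print(bandwidth_gb_s(1024, 1600))  # 204.8 -- matches the quoted eDRAM bandwidth

# Assumed: 256-bit DDR-II bus at the quoted 1.4GHz effective
print(bandwidth_gb_s(256, 1400))   # 44.8 -- matches the quoted DDR-II bandwidth

# 8 pixel pipelines x 4 texels each x 800MHz core
print(8 * 4 * 800e6 / 1e9)         # 25.6 -- GigaTexels per second
```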

JediAgent
11-19-2002, 09:42 PM
We've waited how long now for NV30, and you guys expect to see it H2 or Q3 of 2004... right... let's start placing bets. I bet nVidia won't even press release it till Q4 2004, and we won't see stock on shelves till H1 2005. Just because they have their self-imposed 6-month product cycle doesn't mean A: they can't break it, and B: it doesn't consist of halfway ****, like NV32, NV35, and NV38.

I do like the numbers though... except that US$500 one.

weta
11-19-2002, 09:50 PM
Believe me, if you're impressed with the GeForceFX, then the NV40's going to blow you away.

JediAgent
11-19-2002, 10:09 PM
Blow me away? Or blow a gaping hole in my pocketbook?

weta
11-20-2002, 01:48 AM
Both

Fact, in the 60's one transistor cost $30.00, so an NV40 would have set you back a cool $6 billion (USD), OUCH!
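weta's arithmetic checks out, if you take the 200 million transistor figure from earlier in the thread:

```python
# $30 per transistor (1960s discrete-transistor pricing) times the
# rumoured 200 million transistor NV40 from earlier in the thread.
cost_per_transistor = 30            # US$
transistors = 200_000_000           # rumoured count
print(cost_per_transistor * transistors)  # 6000000000 -> $6 billion
```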

JediAgent
11-21-2002, 12:59 AM
*flips over couch and looks for change*

Oh, look, i have just enough...

E^vol
11-21-2002, 11:09 PM
yeah, "just enough" (for ONE transistor) ;)

JediAgent
11-21-2002, 11:18 PM
ok maybe not "just enough" might have to flip the love seat over too.

E^vol
11-22-2002, 02:23 AM
So, when is the NV35 supposed to be released? I heard "spring time", but who knows these days... Think it will have the 256-bit memory? Does it even matter all that much?

weta
11-22-2002, 08:11 AM
The NV35 should be released in H2 2003 with support for DirectX 9.1.

Possible improvements could be,

Change from 128-bit to 256-bit memory bus, as you've mentioned.
Increased GPU core speed, 600-650MHz
Increased DDR2 memory speed, 1200-1400 (Effective)
Small increase in transistor count

You'll have to wait for the NV40 to see real improvements over the Geforce FX.

jolle
11-22-2002, 09:16 AM
Listen, if you're gonna post rumors like NV40, please post a link to where you heard about them...
I'm not convinced at all by those specs..
And I'm not about to take anyone's word on it.
Where did they come from? How much speculation is in that?
Even though they bought Bitboys, and Bitboys were big on embedded RAM, so it makes sense that they should make use of that.
Bitboys had a 1024-bit memory bus to its embedded RAM.

Aren't Microsoft skipping DX9 and going straight to 9.1?
Thought they did that because the GeForceFX shaders were so far above the DX9 specs..

Anyhow, it might not be possible for Nvidia to label the GFfx as a DX9 card, since it doesn't support displacement mapping.
http://www.reactorcritical.com/

alex777
11-22-2002, 12:26 PM
That's a good point: what's the source of that information?

By the way, I have the spec of NV50 here...

weta
11-22-2002, 05:49 PM
Jolle, I'm assuming your question is directed more at Valkyrie than myself, given the detailed specs he's posted.
For my part, this information was found on a Korean and a Japanese site; I didn't include any links because they
were in Korean and Japanese, sorry about that.
Clearly any information posted this early in a chipset's life shouldn't be taken as final; specs change all the time,
we've seen this only recently with the FX.
Regarding DirectX 9, I thought Microsoft were launching DX9 in January 2003, and DX9.1 later that year.

Wiggo
11-22-2002, 06:26 PM
Yes, that's what M$ is still sayin' about DX9's releases, and DX9b1 has been out to M$'s partners and beta testers since the end of May this year, so it should be about to go gold. ;)
<center>:cheers:</center>

Valkyrie
11-23-2002, 08:34 AM
Am I dreaming? These are specs people around the world reckon the next-gen chips will have. There's no proof of an NV40 yet, but I've read the NV35 is in discussion.

Wiggo
11-23-2002, 04:13 PM
No, no real proof yet, but most of these rumours start from ppl at or around nVIDIA and their partners, the same as with previous models (other manufacturers also have the same probs). Most of us who travel the net in search of info will come across lots of references, speculation and actually reliably leaked information on many things, plus over the years you get to know which sources are reliable and which are not. The specs will have slight alterations while testing goes on, but everything here so far is what's being reported by the better sources on the web. If something steps out of line from that then I'll be askin' for a link, believe me. Have a Google around and you'll find plenty to go thru. :devil win
<center>:cheers:</center>

weta
11-23-2002, 05:40 PM
Spot on Wiggo, I've just found more than 50 articles/posts that mention the NV40; are all these people dreaming?
There's a design team in Santa Clara already working on this chipset, I'm convinced of that. It may be early days, but
they've started.
In my view, the NV35 is an evolutionary chip, a faster, more refined version of the NV30, positioned to take on
the R400 in H2 2003. The NV40 will be the next truly amazing offering from Nvidia.

Wiggo
11-23-2002, 05:50 PM
Yep, ya can do that with any item, and it's just experience that separates the BS from fact, plus getting to know the more reliable ones. :devil:
<center>:cheers:</center>

jolle
11-23-2002, 08:21 PM
Well, there isn't necessarily any truth in any of these rumors..
They could all be based on the same malicious rumor from some evil dude.. hehe, I'm not saying that's the case, but it's possible..
I remember early rumors about NV30: that it had a hole in the middle for extra cooling, supported a new version of Glide (3dfx-buy based, I assume) and used embedded RAM (Bitboys buy-up)..
None of it was true as far as I know hehe..

Anyway, when posting very loosely based info (well, goes for pretty much all unannounced hardware/software I guess),
post a link to the source, and stress the fact that it's unconfirmed rumors..
Mostly in case someone who doesn't know better reads it, believes it to be true fact and spreads the word on these "new facts".
That way we can all do our lil part in preventing rumors messing up the stock market and such.. hehe :crazy:

Reasonably speculated (my own ideas) specs on what an NV35 chip could be:
Faster DDR-II memory (QDR if it's not too far ahead?)
256-bit memory bus
Updated LMA and that kind of stuff
Maybe even updated vertex & pixel shaders.
Possibly more textures per clock..
Seems a bit hard to make the GPU clock much higher though..


Thoughts on what NV40 could possibly bring (or would be cool to see):
Embedded RAM, since the Bitboys buy makes it possible I guess
(they had 12MB embedded with a 1024-bit bus on a card that never survived to see the light of day)
Possibly DX10 support, depending on when NV40 is worked on and the DX10 specs are drawn up.
Maybe a 512-bit GPU and a 512-bit memory bus?
QDR memory, possibly, depending on when.
A new AGP standard with PCI Express or HyperTransport support?
(Got no idea if AGP will benefit from those though)

hehe.. its so sweet to speculate on what might be :)

JediAgent
11-24-2002, 03:21 AM
hehe.. its so sweet to speculate on what might be :)

Speculation leads to the dark side... dont ask me why... it just does...

jolle
11-24-2002, 09:45 AM
It also leads to disappointment, when you speculate away and the reality isn't as fantastic as you imagined...
Feels a bit like that at the GeForceFX launch, or maybe it's just the empty void where all the reviews and tests should have been..

Wiggo
11-24-2002, 01:34 PM
Maybe ya shouldn't be in this thread, jolle, if the speculation is too heavy for ya. :devil win

After all where would we be now without it? :?:
<center>:cheers:</center>

jolle
11-24-2002, 05:16 PM
Hehe.. and I'm not speculating enough for ya? Just look at my post above, man..
I just agreed with JediAgent that speculating has its drawbacks, but I just can't help myself; if they won't tell me, I will speculate!!! :bounce:

alex777
11-27-2002, 06:51 AM
And disappointment leads to anger!!!

(sorry, I had to say that...)

JediAgent
11-27-2002, 01:02 PM
When speculation there is, much suffering i see...

... suffering ...

... suffering ...

... suffering, without new hardware you will be, if pocketbook is smaller than speculation...

Wiggo
11-27-2002, 01:10 PM
Yes new hardware. :D

I'm just speculating on an EPoX KT400A motherboard to hit the market to see if I need anymore hardware. :devil win

Late January/early February accordin' to rumours. :devil:
<center>:cheers:</center>

weta
01-12-2003, 07:06 PM
Just a thought, but the NV40 could be one of the first major graphics cards to be available with a PCI Express bus.

weta
02-16-2003, 10:55 PM
The latest (leaked) NV40 spec

0.09u process
300-350 Million Transistors
750-800 MHz Core
16MB Embedded DRAM (134M trans.)
1.4 GHz 256-512MB DDR-II Memory
8 Pixel Rendering Pipelines (4 texels each)
16 Vertex Shader Engines
204.8 GB/sec Bandwidth (eDRAM)
44.8 GB/sec Bandwidth (DDR-II)
25.6 GigaTexels per Second
3 Billion Vertices per Second
DirectX 10 (or extended 9.1)
Release H2 2004

This data is not official, and shouldn't be treated as such.

FLaCo
02-17-2003, 01:58 AM
PCI express ...is?

0R1()N
02-17-2003, 02:24 AM
Where did you get all this info/speculation on the nv 40?

weta
02-17-2003, 04:40 AM
PCI Express (PCI-X) will eventually replace the current PCI/AGP bus architecture.

PCI Express architecture is a state-of-the-art serial interconnect technology that keeps pace with recent
advances in processor and memory subsystems. From its initial release at 0.8V, 2.5GHz, the PCI Express
technology roadmap will continue to evolve, while maintaining backward compatibility, well into the next
decade with enhancements to its protocol, signaling, electromechanical and other specifications. The PCI
Express architecture retains the PCI usage model and software interfaces for investment protection and
smooth development migration. The technology is aimed at multiple market segments in the computing and
communication industries, and supports chip-to-chip, board-to-board and adapter solutions at an equivalent
or lower cost structure than existing PCI designs. PCI Express currently runs at 2.5GT/s, or 250MBps per
lane in each direction, providing a total bandwidth of 16GBps in a 32-lane configuration. Future frequency
increases will scale up total bandwidth to the limits of copper and significantly beyond that via other media
without impacting any layers above the Physical Layer in the protocol stack. PCI Express provides I/O attach
points for high-performance graphics, 1394b, USB 2.0, InfiniBand(tm) Architecture, Gigabit networking and so
on.

PCI-X 1.0

PCI-X 66
PCI-X 133

PCI-X 2.0

PCI-X 266 (2.1 Gigabytes per second bandwidth)
PCI-X 533 (4.3 Gigabytes per second bandwidth)

PCI-X 3.0

PCI-X 1066 (8.5 Gigabytes per second bandwidth)
PCI-X 2133
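A note on how the per-lane numbers above fit together: PCI Express 1.0 signals at 2.5GT/s per lane and uses 8b/10b encoding, so only 8 of every 10 bits on the wire carry data. A quick sketch:

```python
# PCI Express 1.0 lane arithmetic: 2.5GT/s signalling with 8b/10b encoding.
signal_rate = 2.5e9                 # transfers per second, per lane
efficiency = 8 / 10                 # 8b/10b line code: 8 data bits per 10 wire bits
lane_mb_s = signal_rate * efficiency / 8 / 1e6
print(lane_mb_s)                    # 250.0 -- MB/s per lane, per direction

# 32 lanes, both directions: the 16GB/s aggregate quoted above
total_gb_s = lane_mb_s * 32 * 2 / 1000
print(total_gb_s)                   # 16.0 -- GB/s
```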

0R1()N
02-17-2003, 10:50 AM
Where are you getting all of this info from? Tell me I'd love to know:wave: :?:
Please don't leave me out in the dark:knife: lol

jamie_horwood
02-21-2003, 04:24 AM
Well futureproofed there. Now all you need is the purr-fect memory for the mobo. I was running a 2100+, but now I am running a 2700+ and it runs like a dream.

weta
02-22-2003, 06:38 PM
FlaCo,

Here you are mate, a picture of ATi's Radeon 9700pro with PCI Express interface.

negomike
02-23-2003, 12:48 AM
notice all the sleek racing marker lines on all the capacitors. Sweet!

E^vol
03-05-2003, 09:16 AM
PCI Express (PCI-X) will eventually replace the current PCI/AGP bus architecture.

PCI-X 1.0

PCI-X 66
PCI-X 133

PCI-X 2.0

PCI-X 266 (2.1 Gigabytes per second bandwidth)
PCI-X 533 (4.3 Gigabytes per second bandwidth)

PCI-X 3.0

PCI-X 1066 (8.5 Gigabytes per second bandwidth)
PCI-X 2133
They are NOT the same thing.
PCI Express is a serial I/O interconnect while PCI-X is a parallel bus.
PCI Express used to be called 3GIO.
PCI-X was designed to enhance the PCI bus.
PCI Express was designed to replace it.

weta
03-06-2003, 04:22 AM
E^vol, point taken. I was simply using PCI-X as an abbreviation for PCI Express, but as you have correctly pointed out,
PCI-X is the name used for the parallel bus. When I have a chance I'll edit the post.

E^vol
03-06-2003, 07:38 AM
No prob. I wasn't trying to criticize your info, just trying to clarify for others. :peace2:

jamie_horwood
03-06-2003, 07:51 AM
I'm interested to see what system setup you have. I want to know what sorts of hardware problems you have or haven't managed to run into with the KG7 RAID mobo. I think it's you who has this mobo, with the Athlon XP 2100+? Please send a personal message or post here.

weta
03-24-2003, 04:16 AM
These are only rumours, and differ considerably from the figures posted a few months back.

0.13u process
150m transistors
120W power consumption (vs. 75W for the NV30)
Could be available as early as Christmas

Reportedly, Nvidia were actively researching SOI and even Germanium oxide to reduce power requirements, finally
choosing to concentrate on more powerful cooling solutions, rather than invest heavily in advanced process technologies.

SmokeyTheBalrog
03-24-2003, 04:34 AM
Originally posted by weta
These are only rumours, and differ considerably from the figures posted a few months back.

0.13u process
150m transistors
120W power consumption (vs. 75W for the NV30)
Could be available as early as Christmas

Reportedly, Nvidia were actively researching SOI and even Germanium oxide to reduce power requirements, finally
choosing to concentrate on more powerful cooling solutions, rather than invest heavily in advanced process technologies.

Invest in powerful cooling my a$$. In desperation turned to powerful cooling.

E^vol
03-25-2003, 06:31 AM
Invest in powerful cooling my a$$. In desperation turned to powerful cooling.
LMAO !!! :rofl: :rofl: :rofl:

revenant
03-25-2003, 09:49 AM
...and I think Nvidia will take the crown formerly bestowed to Cyrix, for hottest running processor. ;) *golf clap*

kenny83
03-25-2003, 12:36 PM
Hope Nvidia doesn't stuff up again; we need competition to keep prices down. But all this NV40 **** is just speculation. DirectX 10 won't be released for at least another 2 years, as Microsoft have no plans to make any more extensions to DirectX until after Longhorn. Nvidia better not take 2 years to get the NV40 out, or they'll be out like 3DFX.

:smokin:

rugbydude
03-25-2003, 02:14 PM
Nvidia have truly ****ed us about. I still haven't been able to even look at an NV30. The NV35 is still just a bunch of rumours, and they're already telling us they'll have the NV40 near Christmas. WTF? Why they don't just make one decent chipset rather than ****ing us about, I don't know :mad: :mad: :mad:

E^vol
03-26-2003, 08:34 AM
Temper, temper...:group:
I will be getting an ATI 9800 Pro 256MB! And then I'll wait for nVidia to get their act together... after that... we'll see... If they can and do produce a quality card in the near future, then I might go to nVidia for my next video card (after this upcoming one).

revenant
03-26-2003, 02:47 PM
It's kind of a shame really... I was so looking forward to the FX Ultra being my next super-cool-a$$ video card. And with all the crap that Nvidia has been pulling with this release, it's like 3DFX all over again, a little. I really hope they do pull their heads out of their poopshoots, but in the meantime, I will sit back and watch with a kick-butt ATI card in my rig. They have enough money, but they need to start thinking "outside of the box" again to pull ahead of ATI at this point, IMHO. This will only be good for the consumer, though. We're going to see killer toys come out as a result of this race for the best high-end graphics card. :)

weta
05-19-2003, 07:12 PM
The NV40 will use second generation GDDR2 memory.
As the NV40 will be available from Jan 2004, I can't see Nvidia using PCI Express on this release.

E^vol
05-19-2003, 09:56 PM
The NV40 will use second generation GDDR2 memory.
As the NV40 will be available from Jan 2004, I can't see Nvidia using PCI Express on this release.

Really? Why not? Isn't PCI Express supposed to be released this coming fall on the new mobos? People who buy the new mobos will need a video card for sure... :confused:

weta
05-20-2003, 12:31 AM
Mainly because of ATI's recent decision to delay the release of their R400/500 chipset, which I believe was partly due
to this new technology.
If motherboards are selling with a PCI-Express bus when the NV40 is launched, then I'm confident Nvidia will support it.

Here's a picture to outline the physical differences between the two interfaces.

rugbydude
05-20-2003, 02:27 AM
so wot will this new technology bring us??:confused:

weta
05-20-2003, 03:16 AM
Key features

Compatible with current PCI enumeration and software device driver model
Layered architecture enabling physical layer attachment to copper, optical, or emerging physical signaling media to allow for future encoding schemes
Maximum bandwidth per pin for enabling unique and small form factors, reducing cost, simplifying board design and routing, and reducing signal integrity issues
Embedded clocking scheme enables superior frequency scalability versus source synchronous clocking
Bandwidth scalability with frequency and/or interconnect width
Predictable low latency suitable for applications requiring isochronous data delivery
Quality of Service (QoS) attributes
Mechanisms to support embedded and communications applications
Hot Plug and Hot Swap capability
Power Management capabilities

Benefit of serial technology over parallel architecture

Serial technology does away with the limitations of parallel bus architectures by delivering high bandwidth in the fewest number of signals. This allows higher frequency scaling while maintaining cost effectiveness.

Tomi_s
05-20-2003, 04:35 AM
Veeeery NICE...!!! :thumb: :cheers: :cheers:

ZEROKOOL
05-20-2003, 09:50 PM
Isn't that "PCI Express" just a backwards PCI slot? The short tab being toward the case edge instead of toward the inside? (:crazy: if u can understand that, talk with my shrink :crazy:) Anyway... is this PCI Express different from that "blue magic" PCI stuff I saw/heard about? OK, well, back to the NV40... when's this due, 2004? But it's DX10 compatible? DX10 isn't officially released by MS till 2005ish (Longhorn release), so is it the NV45 that's gonna be DX10 compatible, or is the NV40 coming out with the Longhorn RTM? :spam: Curious to see what y'all think :spam:

E^vol
05-21-2003, 03:34 AM
DX10 ?? We barely have software that'll use DX9....

rugbydude
05-21-2003, 02:12 PM
Hmmm, there's GTA Vice City that uses DirectX 9... hmmm, errr, anything else? :confused:

E^vol
05-21-2003, 10:02 PM
Basically, there's still not a huge demand for DX9 yet.
There are a few titles out there, but the choice is limited for now.

rugbydude
05-21-2003, 11:57 PM
Which is why DX10 ain't out till 2004 :D

homeworld1031tx
05-23-2003, 09:19 AM
Really? I haven't heard of any DX9 title shipping. Is PlanetSide DX9? What are some of the DX9 titles?

rugbydude
05-23-2003, 02:03 PM
Errr i only know of GTA:VC.....:o

E^vol
05-24-2003, 12:58 AM
Doom3 / Half-Life 2, I believe! (not that they're out yet)

weta
05-27-2003, 01:02 AM
Graphics card technology roadmap

t00lb0x
05-27-2003, 02:21 AM
Any pics yet?

weta
05-27-2003, 02:31 AM
t00lb0x: Six months away.

rugbydude
05-27-2003, 05:37 AM
Whoa that graph aint that clear. But at least u get a basic idea from it. Thnx Weta:thumb:

homeworld1031tx
05-27-2003, 08:38 AM
Yeah Thanks Weta:thumb: :thumb: but why didn't u put the Transistor count for us AMD fans:o :o :o :cry: :cry: :cry: :cry:

weta
05-27-2003, 06:21 PM
rugbydude: Click on the expand button in the bottom right hand corner of the graph.

rugbydude
05-27-2003, 06:35 PM
Hehe, thnx Weta, but I found that earlier :laugh:

weta
05-27-2003, 08:32 PM
From what I've seen lately it would appear that both ATI and Nvidia are almost mirroring each other, here are the latest
(rumoured) specs.

NV40: 0.13u process, 200m+ transistors, DX9.1 support, Q3/2003-Q1/2004 release
R400: 0.13u process, 200m+ transistors, DX9.1 support, Q3/2003-Q1/2004 release*
NV50: 0.09u process, 300m+ transistors, DX10 support, 2005 release
R500: 0.09u process, 300m+ transistors, DX10 support, 2005 release

Reportedly both ATI and Nvidia want to skip the 0.11u process, and move straight on to a 90 nanometer core.

*AKA the R420

E^vol
05-30-2003, 08:47 AM
From what I've seen lately it would appear that both ATI and Nvidia are almost mirroring each other, here are the latest
(rumoured) specs.
Reportedly both ATI and Nvidia want to skip the 0.11u process, and move straight on to a 90 nanometer core.
Neither wants to be viewed as having inferior technology ! :shh:

weta
06-08-2003, 11:11 PM
Nvidia's NV40 should be an entirely new core, and may be up to twice as fast as the NV35.

Representatives from nVidia were on hand for AMD's "Tech Tour 2003," and I had the opportunity to discuss nVidia's plans with them. The nVidia rep on hand claimed that nVidia was still on a 6 month development cycle, and that the delays encountered with the NV30 were only an anomaly. He conceded that the GeForce FX 5800 was a failure, and that nVidia had "temporarily" ceded the performance crown to ATI. The rep further claimed that the NV40 core would be out by the end of this year, and that it would be an entirely new core. He was confident that the new part would be up to twice as fast as the NV35. The NV35 should become available within the next 6 weeks, and the rep claimed that the NV35 "spanked" ATI's Radeon Pro 9800 in most benchmarks (although these claims are dubious given nVidia's recent benchmark scandals). He claimed that nVidia's "The Way It's Meant To Be Played" campaign had two ramifications. First, all participating game developers used only nVidia Quadro cards when designing games, and second, many games would have certain features that only nVidia users would be able to take advantage of. The rep showed a demo system running the "Dawn in black leather" demo on an NV35, and claimed that Nforce3 cards for the Opteron would be available in July. He thought that an Opteron/Nforce3/GeForce FX 5900 computer would be an ideal combination for gaming.

"Dawn in black leather" = Dusk

Wiggo
06-08-2003, 11:34 PM
ATM I'd rather have faster HDDs and other subsystem hardware buses. These are the biggest bottlenecks in the PC and need the most work to improve, but HDDs are in need of the biggest overhaul most of all, and likely need a whole new technology, as the current one seems to be gettin' close to the ceiling. :peace2:

homeworld1031tx
06-09-2003, 02:58 AM
OK, this is really bugging me, so I have to ask: what the hell does "ATM" mean?? :confused: :confused: :confused:

theyneverknew
06-09-2003, 03:33 AM
At The Moment :thumb:

E^vol
06-09-2003, 08:45 AM
I definitely agree with Wiggo! SATA & PCI Express show signs of hope, but they still lack the raw speed and power that other components (i.e. CPUs & video cards) have. And there are a lot of other areas that still need help, and lots of it! :2cents:

Of course, faster video cards are still pretty fun.... :p

weta
06-14-2003, 06:24 AM
Roadmap

NV36 AGP
NV36X PCI-Express
NV40 AGP and PCI Express
NV45 PCI Express only

E^vol
06-14-2003, 07:42 AM
Originally posted by weta
Roadmap

NV36 AGP
NV36X PCI-Express
NV40 AGP and PCI Express
NV45 PCI Express only
Now, would it be safe to assume that the NV36X will be a low end card when the NV40 or NV45 are released ?
Kind of like the GF4 ti4800 vs. the GF FX series...

t00lb0x
06-14-2003, 09:42 AM
Seems nice :P I can't wait until, like, 4 years from now to see what cards there are. Like, what GHz will we be up to? Will we hit 10GHz?

weta
06-21-2003, 12:47 AM
Nvidia noted that where the cycle time (the time from a design spin to silicon) was less than 10 weeks for 150nm processes, the cycle time for 130nm is presently at, or above, 14 weeks - this would mean that a part that requires two respins would be in the order of 6 months from initial tape-out. They also said that a product intended for release later in the year has taped out at IBM and they are expecting to receive the first qualification sample in the next few weeks.

Update: This has turned out to be the NV36 and not the NV40.
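The respin arithmetic Nvidia is describing works out roughly like this (the weeks-per-month conversion is my own):

```python
# Each 130nm design-spin-to-silicon cycle is ~14 weeks, per Nvidia, so a part
# needing two respins after initial tape-out spends about half a year iterating.
cycle_weeks_130nm = 14
respins = 2
months = respins * cycle_weeks_130nm / 4.345  # ~4.345 weeks per month
print(round(months, 1))                       # 6.4 -- "in the order of 6 months"
```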

weta
06-23-2003, 05:10 PM
These are Nvidia's core/memory targets for its next generation NV40 graphics chipset.

550MHz-600MHz Core
700MHz-800MHz Memory

homeworld1031tx
06-24-2003, 01:52 AM
Now, would it be safe to assume that the NV36X will be a low end card when the NV40 or NV45 are released ?
Kind of like the GF4 ti4800 vs. the GF FX series...

I would say that the FX is the lower line compared to the TI4800 :D :D

ZEROKOOL
06-24-2003, 06:53 AM
I would say that the FX is the lower line compared to the TI4800

I would have to agree, considering the crown jewel of the FX line (the 5800 Ultra) was a dud. The FX 5900 & Ultra own ATI on all the benchmarks I've seen (not by a lot, it's still close), hehe, but the 5900 was an act of desperation on nVidia's part... it was pushed way ahead of schedule.

Soulburner
09-02-2003, 06:23 AM
Roadmap

NV36 AGP
NV36X PCI-Express
NV40 AGP and PCI Express
NV45 PCI Express only
Looks like the NV40 will be the ticket for me.

:hammer:

weta
09-09-2003, 12:01 AM
In order of decreasing reliability

Supports FP32, FP16 and FX16 natively. Whether there is any performance difference between FP16 and FX16 is unknown, and whether there are any truly non-FP32 units is also unknown.
175M transistors, 600Mhz core clock, 1.5Ghz effective GDDR2 (Samples already shipped to nV - 16 memory chips per board)
Not taped out yet, or if it has, the tape-out failed. To tape out sometime this month.
8 pipelines, Speculation: probably 8x2 and no 16 zixel trick (not worth it with 4x+ AA, which is really a minimum with 48GB/s of bandwidth) - Maybe such a bypass path for low-end models (NV42/NV43)

This means Christmas availability has IMO become absolutely out of the question. The best we can hope for is for-developer documents at Comdex, or around then. And that's not a certainty.

NFI

Soulburner
09-09-2003, 11:27 AM
What are FP32, FP16, FX16 and FX32?

weta
09-18-2003, 04:27 PM
Samsung Unveils 1600MHz GDDR2 Memory Chips

Samsung Electronics has announced production of its 256Mb Graphics Double Data Rate2 (GDDR2) SDRAM. The 256Mb capacity will enable building graphics cards with up to 512MB of memory using PCBs of typical sizes. Besides, the 1600MHz DRAM chips will allow extremely fast graphics cards.

Samsung claims that the new devices had already been supplied to leading graphics card manufacturers, suggesting that the actual products with such amazing memory are not too far away. I believe that we will hardly see any graphics cards with 512MB of mind-blowing fast memory in late 2003 or early 2004, but a year from now there definitely will be high-end graphics products with 512MB of memory. Such graphics cards may be based on the successors of the NV40 and R420 designs, for example.

Samsung’s GDDR2 devices incorporate a number of new technical features, such as On-Die-Termination (ODT), Off-Chip driver-Calibration (OCD) and posted CAS to boost performance by up to 50% over graphics DDR SDRAM.

GDDR2 SDRAM at 1600MHz in 128-bit configuration is capable of providing of up to 25.6GB/s peak bandwidth, while in 256-bit configurations, the new type of memory will bring up to astonishing 51.2GB/s of throughput.

Samsung has introduced GDDR2 SDRAM ahead of competition by developing the industry’s first 1000MHz 128Mb GDDR2 SDRAM in July 2002. The chips were adopted by NVIDIA Corporation in its unsuccessful GeForce FX 5800 graphics product. Samsung’s full-scale production of the next generation graphics DDR SDRAM in the second half of 2003 will further accelerate its advanced graphics memory business.

Xbit

Update: Samsung has delivered 10000 of these memory chips to Nvidia for NV40 test samples.
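The article's peak-bandwidth claims follow directly from the effective data rate and the bus width; a minimal recomputation:

```python
# GDDR2 peak bandwidth: effective data rate (MHz) x bus width (bits) / 8 bits-per-byte.
def peak_gb_s(effective_mhz, bus_bits):
    return effective_mhz * 1e6 * bus_bits / 8 / 1e9

print(peak_gb_s(1600, 128))  # 25.6 -- GB/s on a 128-bit bus, as claimed
print(peak_gb_s(1600, 256))  # 51.2 -- GB/s on a 256-bit bus, as claimed
```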

AsianBatman
09-19-2003, 04:39 AM
I would say that the FX is the lower line compared to the TI4800 :D :D
TI4800 is just a ti4400 with 8x agp i believe :confused:

Soulburner
09-19-2003, 04:40 AM
Yes and the 4600 is the fastest GF4 card.....

rugbydude
09-19-2003, 02:04 PM
TI4800 is just a ti4400 with 8x agp i believe :confused:
I think you'll find it's a Ti4600 with 8x AGP :thumb:

AsianBatman
09-19-2003, 02:18 PM
I doubt the Ti4800 is a 4600 with 8x AGP..... I believe Soul is right, the 4600 is the fastest Ti.

Soulburner
09-19-2003, 04:05 PM
I think you'll find it's a Ti4600 with 8x AGP :thumb:
What is it clocked at?

The__tweaker
09-19-2003, 07:39 PM
A 4800 is a 4600 with 8X agp. (300Mhz gpu)
A 4800se is a 4400 with 8x agp. (275Mhz gpu)

:geek:

Soulburner
09-20-2003, 06:35 AM
Ah, so the SE is the slower one....

A 4600-based card will always be the better choice, as it comes with faster ns-rated RAM and will overclock much higher than the 4400 or 4200. I have seen 4600s with RAM taken from the stock 600MHz or so to 700-750MHz.

weta
09-21-2003, 07:11 AM
<center>NV40 AA</center>

The NV40 supports "Stochastic organised grid" while randomising fetches for AA blocks. Should this turn out to be true, then the NV40 will probably have an edge in MSAA IQ against the R420. Please note that this information was reportedly leaked and originally posted on a Russian site, so it should be treated as a rumour and not fact.

More information (http://www.glhint.de/pub/data/VMV01Water.pdf)

iXBT/NFI

weta
09-24-2003, 07:20 PM
The Inquirer has posted a photo of what it claims could be Nvidia's next generation NV40 graphics card.

<img src="http://images.tweaktown.com/news/news_nv40.jpg">

The card uses ultra high speed GDDR3 memory modules from Micron Technology. A Micron official confirmed that it is GDDR3 running
at 800MHz, giving a data rate of 1.6Gbps and that it will be in full production Q1 2004. Micron suggests no extreme thermal cooling of these memory modules will be necessary.

The Inquirer
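The 800MHz-to-1.6Gbps step is just GDDR3's double data rate at work, two transfers per clock per pin:

```python
# GDDR3 transfers data on both clock edges, so the per-pin data rate
# is twice the memory clock.
clock_mhz = 800
data_rate_gbps = clock_mhz * 2 / 1000
print(data_rate_gbps)  # 1.6 -- Gbps per pin, as Micron quoted
```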

Soulburner
09-25-2003, 05:19 AM
I know it would just be a prototype, but why would Nvidia get rid of their heatspreader design on the GPU? That looks really weird, like the way ATI does it.

weta
09-25-2003, 05:33 AM
<center>NVIDIA News and Insights</center>

The one thing we can bet on is that the NV40 will not have the problems the NV3x series has now. PS 2.0 performance problems will be a thing of the past with the NV40, as it is supposed to have PS/VS 3.0 hardware. Memory bandwidth will also not be a problem as GDDR3 will be available in late Q1 2004, and that promises around 51 GB/sec bandwidth on a 256 bit bus. The NV40 is starting to look like the answer to NVIDIA's problems as of late. ATI is also very far along with the R420, but it may not be as advanced architecturally as the NV40. Still, not enough is known about either part to give further insight. Needless to say, 2004 looks to be more competitive than 2003 was.

Penstar Sys

weta
09-25-2003, 05:39 AM
Soulburner: I'm not convinced that this is the NV40; too many things don't add up about this card. Unfortunately I can't get hold of anyone to check it out because they're all at the show.

weta
09-26-2003, 02:24 AM
Soulburner: Unfortunately I've been unable to obtain any information on the card (http://images.tweaktown.com/news/news_nv40.jpg). My guess is that it's either a Personal Cinema or Quadro version of the soon to be released NV36, but as I say it is a guess.

<center>NVIDIA CEO Jen-Hsun Huang made it quite clear that the graphics industry would no longer be the primary focus point of the company.</center>

Soulburner
09-26-2003, 05:35 PM
Whats with the system specs in your sig?

weta
09-26-2003, 06:00 PM
Soulburner: That was just a bit of fun, I was trying to think of a good AMD based system to play Half-Life 2 on.

Soulburner
09-27-2003, 04:58 AM
Cool, except the only problem is that memory would be severely underclocked to DDR400...kind of a waste :cool:.

weta
11-10-2003, 06:23 PM
Nvidia conference call
More NV40 details emerge

NVIDIA officials confirmed a “whole-new” NV4x family to be launched next year (probably calendar year - Ed.) to bring “programmable shading and cinematic computing to a new level”.
In Q1 some NV4x parts will be in production. By Q2 many NV4x parts will be in production, to bring the margins up. By Q3 “almost the entire” NV4x family will be ramping up in production. No indication whether these are “calendar” or “fiscal” quarters, though.
When asked about the NV40 tapeout, management did not comment, saying “We do not comment on tapeouts; we just comment on production ramps and design wins. There will be time to talk about NV40; today is our time to talk about 5950 and 5700…”
NVIDIA will offer a top-to-bottom lineup of GPUs and MCPs with support for PCI Express.
NVIDIA is now sampling customers with its “first PCI Express technology”.
NVIDIA is excited about PCI Express and will surely be on track with it, but declined to talk about the speed of transition, etc.
NVIDIA anticipates “something like” 10 different versions of GPU products next year, the majority of which will be PCI Express, as I understood from Jen-Hsun Huang, NVIDIA’s CEO.

xbit

weta
01-14-2004, 09:09 AM
Nvidia's NV40 tapes out
On shelves in April, May

We can finally confirm that Nvidia's NV40 taped out in the last days of December of 2003 and that Nvidia is working hard to finish everything up before boards using the technology go on sale.
Nvidia's partners are still without chips and it seems that they will get their sample NV40s to do the necessary PCB adjustments no earlier than late February 2004.

It's going to be a close call whether Nvidia partners manage to show some products at CeBIT, and it will be a great loss for them if they don’t, since ATI will show off the R420 at the Hannover show.

It takes around 120 days from tape out to final shipping products so it seems that we might see cards based on NV40 shipping in April or May.

We do believe that Nvidia will use CeBIT to trumpet its NV40 technology, however.

The fight between NV40 and R420 will be very interesting, but we don’t believe that either chip will be significantly better than the other, with both having similar performance.

The Inquirer

weta
01-28-2004, 05:00 PM
Nvidia NV40
Latest rumoured specs

175 million transistors, manufactured on a 130nm process
8x2 architecture, but 16 Z/stencil tests per clock
DirectX 9.0 architecture, supports Shader 3.0
Pixel shaders roughly twice as efficient as the NV38's
Supports DDR1, GDDR2 and GDDR3
Internal AGP 8x interface
Exact clock rates unknown; estimates are 500 to 600MHz core and 600 to 800MHz memory
Anti-aliasing improvements: (at least) one new mode, though its exact sample pattern is unknown
Anisotropic filtering improvements: unknown
Presentation: GDC or CeBIT at the end of March 2004
Market entrance: at the end of April or May 2004
Sales name: GeForceFX 6XXX

3D Center / Warp2Search

weta
02-03-2004, 05:38 PM
NV40 and R420 memory secrets revealed
GDDR3 won't be ready in time

It's not often that rival graphics chip firms Nvidia and ATI use the same marchitectural tactics. But, this time around, it seems they don’t have any other choice. Nvidia’s upcoming NV40 and ATI’s R420 both support DDR 1, GDDR 2 and GDDR 3 memory types, but both companies will be sticking with GDDR 2, at least at first.
The reason is simple: DDR 1 is just too slow to support the latest-generation graphics chippery in high resolutions, with fancy FSAA and Anisotropic filtering. Also, DDR 1 has a clock limit of 1GHz which is very hard to crank up further. DDR 2, of course, is nothing more than DDR 1 that can run at more than 1GHz, given a set of different commands.

Since both companies’ current chips use frequencies that are very close to that 1000MHz barrier, this means that neither has any choice other than to move to DDR II, or GDDR 2, as the suits would have us call it.

GDDR 2 was sampled in Q3 2003 by Samsung and rated at 600 to 800MHz -- effectively 1200MHz to 1600MHz. Insiders have told us that Nvidia received 10,000 memory chips back in Q4 last year to prepare prototypes of its NV40 boards. We also learned that NV40 has 16 memory chips on board. Nvidia is aiming at a frequency of 750MHz -- or 1500MHz effectively -- but this depends on PCB quality and the number of layers. The first NV40 silicon-powered prototypes are currently meandering through the offices of special, beloved Nvidia partners, we are given to understand.

GDDR 3 may, in theory, be one of the options on the market but, if you ask around in knowledgeable circles, you will learn that this memory is in early sample stage and so neither Nvidia nor ATI could get enough chips for Q2 retail availability of the cards, however big their muscles.

It is expected that GDDR 3 will be ready by Q3 2004 so you might expect that the planned NV45 and the next ATI chip (R450 - R480?) will use this memory.

Micron is the only signed up member of the Dramurai to have GDDR3 memory specifications on their site. There, the company suggests that Q1 will be a good time for sampling and my guess is that they won't be ready for production before Q3. Clockspeeds for both GDDR 2 and 3 will be set in the range from 600 to 800MHz - effectively 1200 to 1600 MHz.

It’s interesting to see that 800MHz GDDR2 SDRAM has a cycle time of an incredible 1.25ns.

Both the NV40 and R420 memory interfaces are 256-bit and, by current estimates, this means that a card using 600 to 800MHz GDDR 2 memory would have between a majestic 37.5 GB/s and a magnificent 50 GB/s of raw bandwidth.

We await their appearance with unabated breath.

The Inquirer

Soulburner
02-03-2004, 10:36 PM
Sounds right to me: 600MHz = 1200MHz effective, so 1200M transfers/sec x 256 bits / 8 = 38.4GB/s.

For 50GB/s, about 781MHz memory (or DDR 1562.5, to be exact) would be needed. That's just crazy.
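The back-of-the-envelope maths in these posts can be sketched in a few lines (a minimal example; the 256-bit bus and the clock figures are the rumoured ones from this thread, not confirmed specs):

```python
def gddr_bandwidth_gb_s(clock_mhz: float, bus_width_bits: int = 256) -> float:
    """Raw bandwidth in GB/s for double-data-rate graphics memory.

    clock_mhz is the real clock; DDR transfers data twice per cycle, so
    the effective rate doubles. Divide by 8 to turn bits into bytes and
    by 1000 to go from MB/s to GB/s (decimal, as these articles use).
    """
    effective_mhz = clock_mhz * 2
    return effective_mhz * bus_width_bits / 8 / 1000

# 600MHz GDDR2 (1200MHz effective) on a 256-bit bus, as rumoured for NV40:
print(gddr_bandwidth_gb_s(600))     # 38.4 GB/s
# A round 50GB/s needs ~781MHz real clock (DDR 1562.5, to be exact):
print(gddr_bandwidth_gb_s(781.25))  # 50.0 GB/s
```

The same formula gives 51.2GB/s at the 800MHz top of Samsung's rated range, which is where the headline "~50GB/s" figures come from.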

weta
02-10-2004, 12:53 AM
More NV40 details emerge
Samples this month, hype building

Nvidia's upcoming graphics chip, NV40, supports the Shader Model 3.0 that Nvidia claims is in DirectX 9.1 and which, in the real world, is in DirectX 9.0c. Still, Nvidia's fearless marketeers are also claiming that the chip has three times the performance and eight times faster shaders than NV38—the GeForce FX 5950 Ultra. Hmm.
It's even more interesting that Nvidia claims that its card will end up four times faster in Doom 3 and seven times faster in Half Life 2. Well, even if those numbers are ridiculously high, you might expect that NV40 will have much more efficient shaders than NV38 had. NV40 is still AGP but we know something about their PCI Express chip and we will tell you about this later. In the meantime, it's NV40's day of glory.

NV40 will use GDDR 2 or 3 memory, as its memory controller is capable of both, and Nvidia aims to hit the 600MHz milestone - or should we say 1200MHz effectively. We heard before that it might end up faster but, in our view, that's simply impossible, since the PCB design would end up too expensive and complex to produce. Another reason might be that it's not that easy to clock your card at more than 1200MHz. The card will have eight memory chips, 8x32 BGA-144 GDDR memory.

There will be two implementations of the NV40 chip - as always, NV40 U implies Ultra, and the Ultra will end up faster. It's still too early to know the clock speeds of this solution, but more than 500MHz is what Nvidia hopes for. We mentioned that they had silicon up and running at 300MHz, but you can be sure the final stuff will end up faster.

This time Nvidia wants to make cards more silent than NV30 and we're not being too ingenious when we guess this is what the phrase 'Silent Running' is all about. Even Nvidia now knows that people like quiet graphic cards.

Nvidia calls its NV40 board an "enthusiast" board since it will come at a high-end price. Boards will be sampled this month, February, with availability in April. The card will have two monitor outputs, with DVI/VGA combinations possible [two DVI, two VGA, or VGA and DVI] and, of course, TV out.

The launch date so far remains CeBIT's Thursday, the 18th of March. It's not the first and obviously not the last time that we will write about NV40. Stay tuned, there will be more.

The Inquirer

The__tweaker
02-10-2004, 01:11 AM
Can't wait to see some benchmarks. :)

weta
02-10-2004, 07:04 PM
NV45 is Nvidia's PCI Express X16
Catching the buses

Nvidia is beavering away creating the NV45, a chip which uses PCI Express and marks a break of tradition for the graphics gnome.
Nvidia will break the habit of a short lifetime to call the next rev the Nx5. In a way, this is an update since it ushers in the dawning of the Age of PCI Express, backed by Chipzilla, Chimpzilla, Marmosetzilla and even the legion of mobo vendors.

Nvidia again will claim that NV45 is DirectX 9.1 compliant, but let's clear things up by saying that there is no such thing as DX 9.1, only 9.0c. Let me stop right here and say that, performance wise, it will equal the NV40 -- the only difference is the PCI Express interface.

There will be two implementations of the NV45 chip, as always. NV45 U implies Ultra, and the Ultra will end up clocked faster than 500MHz, which is what Graphzilla is aiming for. Memory characteristics will be the same as the NV40: eight memory chips, 8x32 BGA-144 GDDR 3 memory running at up to 600MHz, 1200MHz effectively. NV45 comes equipped with 256MB of 256Mbit GDDR3 memory, and not GDDR 2 like we suggested before. What's the difference? Even the vendors don't seem to know yet.

Nvidia will call its NV45 board the PCI Express Enthusiast board and this baby will cost you $499. As the $ is to the €nron, so the €nron is to ster£ing.

Production starts in April, not long after the Poisson d'Avril emerges.

Nvidia validated PCI Express marchitecture back in December and is working closely with Intel labs. Intel actually co-validates Nvidia cards and so far both parties are happy. PCI Express x16 is already tested on Gruntsdale and Tumwater, and it's all fine with D3D and OpenGL on these cards. Gruntsdale and Alterationwood will be called the 925X2 and 915P2, and Tumwater will be a server chipset. This chipset will be named E7515 and will cost $€£100 apiece when available.

Nvidia has already sent over 500 boards to its customers to be able to prepare for the dawning of the Age of PCI Express.

Nvidia has put its PCI Express generation of cards under a Q2 hat but we suspect that NV45 might be announced together with NV40 so let's say it will be launched at SnoBIT in Hangover in March.

What's NV19, NV37, NV41 and NV43? We will tell you that later because Graphzilla has exhausted us yet again, but it has a lot to do with PCI Express 16.

The Inquirer

Radicus
02-11-2004, 12:36 AM
I can't wait to see the price tag:barf:

Soulburner
02-11-2004, 01:53 AM
I can't wait to see the price tag:barf:


9

Radicus
02-11-2004, 06:44 AM
$499 is not bad:woot: wonder how much the 9590u will go down in price:D

Soulburner
02-11-2004, 10:34 AM
I think you mean 5950U.

Radicus
02-11-2004, 12:31 PM
DOH!:snip:

SmokeyTheBalrog
02-11-2004, 02:07 PM
9 is not bad:woot: wonder how much the 9590u will go down in price:D

Maybe Radicus was thinking of the nVidia/ATI cross breeding project.

But he just got the names wrong:

The 9590 and the 9590u Pro

weta
02-27-2004, 02:29 AM
Nvidia NV40 specifications confirmed
Too late for ATI to respond

NVIDIA really needed to pull a rabbit out of the hat in the next round of its increasingly epic battle with ATI Technologies, and from where we’re sitting we’d be more than surprised if the next round isn’t, now, a foregone conclusion.
US sources close to Nvidia have confirmed that its next generation GPU will feature a full sixteen pipelines – not, as previously speculated, an 8x2 arrangement – and this is reflected in the increased transistor count of circa 205-210 million, up from the previously speculated 175 million.

In very recent times ATI has been extremely confident in saying that its R420 – set to launch not long after Nvidia shows its hand – would, irrespective of architecture, thoroughly outgun the NV40.

Indeed, today Richard Huddy, ATI's European Developer Relations Manager, maintained that ATI's expectation is for the R420 to be faster than Nvidia's NV40, on balance, across a suite of 10 common games.

However we think that ATI's earlier confidence was based on internal intelligence that Nvidia’s NV40 feature set would be limited to an architecture built on 175 million transistors and that Nvidia would deliver, as anticipated, on the first day of CeBIT in March.

Nvidia's launch date for NV40 seems to have shifted backwards, but even a month or so is not going to give ATi enough time for a hardware response.

Knowing what we know now about NV40 having a full sixteen pipelines, we sense that, despite some fighting talk and indeed some intelligent and rationalised arguments, ATI is now hoping, at best, for a close fight…

The Inquirer

SmokeyTheBalrog
03-05-2004, 10:58 AM
NV4X generation has MPEG 1,2,4 encode/decode
Next generation video support

NVIDIA'S NV40 is not going to be just fast in shaders and pixels, it will have some additional features that will be interesting to anybody that messes with Video.

The video capabilities of NV40 are quite something.

It seems that all NV4X generation of cards will feature very attractive video options.

Nvidia wants to promote NV4X generation of chips as the ones with high quality video, complete and ready for HDTV and PVR.

High quality video will bring motion-adaptive de-interlacing, high quality scaling and filtering, good old video de-blocking and an integrated TV encoder.

As for HDTV, Nvidia claims Transport stream handling, HDTV output (720p, 1080p, 480p CGMS) and HDCP - High-bandwidth Digital Content Protection as well as HDMI High-Definition Multimedia Interface support.

The PVR part is the most interesting, as Nvidia claims that the NV4X generation will have support for no more and no less than MPEG 1/2/4 encode and decode, as well as WMV9 decode acceleration.

This means that all decoding and encoding operations previously done in software, and very CPU dependent, will be able to be processed on NV4X chips. We are not aware specifically of how Nvidia plans to do that, but it sounds promising.

NV40's introduction is just weeks away.

The Inquirer


------------------------------
You know, if these features take up enough transistors, perhaps once all is said and done only 160 million or so transistors might actually be devoted to 3D acceleration. Just guessing here though; if so, ATI’s R420 might still have a shot at the performance crown. Though perhaps they could put it on an external chip?

Well that’s enough of my baseless conjecturing.

SmokeyTheBalrog
03-07-2004, 10:40 AM
More details leak on NV40 Ultra and non Ultra
Faster but more expensive, slower but cheaper

We can now confirm that Nvidia will have two versions of the NV40 chip once it is ready to show it to the world.

As is now traditional, Nvidia will have one extremely expensive card that will cost about €/$499. This card is expected to be faster clocked and it will have faster memory as well.

The memory target sits close to the 600MHz range, but it's still being tested for the right speed. As previously suggested, the card will use GDDR 3 memory, which consumes less power and runs cooler than DDR 2.

The NV40 non-Ultra, amateur version is going to be clocked lower, but we don’t have any details on how much slower yet. The price will be around €/$299.

Both boards will use similar memory configurations but we suspect that Nvidia might offer a 128 bit version of the card.

Nvidia will use similar PCBs (printed circuit boards) for desktop and workstation cards as before.

Production of NV40/NV40 Ultra is scheduled for April.

The Inquirer

weta
03-08-2004, 08:16 PM
NV40/NV40 Ultra are the CeBIT NDA kids
Graphics firms against product promiscuity

Sources close to Nvidia have confirmed that its spanking brand new NV40 marchitecture will only be shown to people who sign a comprehensive non disclosure agreement (NDA) with the graphics company.
It's for their eyes only, and not for the folk drifting through the massive halls at the mammoth trade show this month.

If you were hoping to catch a glimpse of the tech at the Nvidia partners' booths, then you're going to be bitterly disappointed.

Nvidia partners are not too thrilled about this repeat of last year's show when the ladies and gentlemen of the press were shown the technology. Famously, the INQUIRER was turfed off the Nvidia booth.

Why are the Nvidia partners not reeling in ecstasy about this? They feel they need to show off a handsome Nvidia steed at CeBIT, and everyone will have bridged PCI Express cards based on NV3X or the later marchitecture: NV19, NV37, NV39 and the card without a proper codename, NV38 + BR2, covered up by the PCX 5950 brand name. All these cards are old NV17, NV34, NV36 and NV38 chips with AGP to PCI Express bridge solutions. Nothing more.

NV40 Ultra and possibly even NV45 will be shown under NDA in heavily guarded rooms to press loyal to Nvidia's ambitions.

NV40 will most likely be launched after CeBIT we understand, and sometime in April sounds good to us but it could be later. Another delay for Nvidia and ATI chips wouldn't surprise us at all.

Being called a "partner" of Nvidia is a slightly misleading term these days, because some of them have flirted and even indulged in French kissing and heavy petting with ATI. Only a few can claim to be faithful to Nvidia and not snogging ATI at the same time. Those faithless souls would very much like to show off NV40, or should we say NV4x, as Nvidia has prepared NV41 and NV45, all spawn of the PCI Express devilry.

No doubt the graphics police will be watching the "partners" carefully at CeBIT and applying virtual "chastity belts". Where new products are concerned, Nvidia and ATI are like Victorian fathers - partners should be seen and not heard.

The Inquirer

weta
03-16-2004, 07:21 PM
NV40 is a 16 pipelines part
12 vs. 16 pipes at 210 million transistors

After days, nay weeks of inquiring, we can now confirm that NV40 is not the 8x2 marchitecture part we previously suggested. Nvidia has been behaving as if the NV40 were the Crown Jewels of Her Britannic Majesty Queen Elizabeth II. Fish can fly, but it's been spreading flying red herrings.
Nvidia is telling "selected people" that NV40 is indeed a 210 million transistor chip with 16 pipelines, as we reported a few weeks ago.

The other side of this NV40 coin is that the real McCoy, the real NV40 card that taped out quite some time ago, is actually going to be KIA [Killed In Action]. Very knowledgeable friends told us recently in the Vienna Opera House that the NV40 with 16 pipes and 210 million transistors is a completely different chip than the original NV40.

What actually happened is that Nvidia recently learned about the R420 marchitecture and this entire 12x1 story, realised that it would eventually end up in second place, and decided to can the NV40 project and go immediately with NV45.

Nvidia is a very egotistic company.

NV45 is the name for PCI Express NV40 but apparently this new chip is a rushed new project that Nvidia wanted to save for later.

Still, as a consequence, our opera-loving friend suggested that there is no possible way Nvidia might have a working version of the chip at CeBIT, or if they have one, it's early silicon for showing off. You can forget about a launch party at CeBIT or anytime soon, we are given to understand. Even if the company launches it in April, it will be a pure paper launch, as NVDA cannot deliver it so soon.

It's not easy to make dramatic changes in silicon, and as a consequence you have to tape it out again and then hope that all will be fine, in order to be ready to produce it in six weeks minimum.

As the case is altered, Nvidia might win the performance crown again, but it's absolutely certain that ATI's R420XT will be the first next generation card in the shops.

Nvidia and its partners desperately need this fuel, possibly with Doom 3 as a rocket, to launch it into RetailSpace.

The Inquirer

weta
03-20-2004, 01:04 AM
First hands-on look at NV40
CeBIT 2004 Shocking power requirements

The Inquirer has managed to get hold of one NV40 sample.
We agreed not to photograph what we saw, but can confirm the following.

When removing the heatsink, we noted that the NV40 GPU itself was, in comparison to what we have seen before, quite large.

However, whilst our earlier information indicated that it comprised 205-210 million transistors, our sources currently say that what we were holding was a 175 million transistor part.

With the heatsink removed, we could see that a small moulded surround was employed to assist levelling of the heatsink.

Unlike the ill fated GeForce FX 5800 Ultra (NV30) where a heatsink on the back of the PCB could also interface with a small section of the PCB directly behind the GPU as well as the memory modules, additional circuitry in this area on the NV40 prohibits this.

Whilst a heatsink still seems necessary to assist thermal control of the new lower voltage memory modules, the heatsink on the front of the GPU will have to do all the work.

Eight 32MB video memory modules are installed – four on each side of the PCB and as we first reported from Computex in Taiwan – these are of the new GDDR3 type.

However these were not from Micron, which at that time seemed to be both ATi and Nvidia’s GDDR3 development partner. Instead, Samsung seems to be the preferred supplier of the 256MB we saw installed.

The primary shocker was that the board requires two large 4-pin power connectors as opposed to the single power connector on current high-end products.

Perhaps a lot of end-users who have been thinking that they might wait and upgrade to NV40 when it becomes available should start saving their shekels to upgrade to a new power supply unit as well.

One final thing, contrary to previous speculation, we think that the ‘NV40’ that will come to market may be what was originally planned, as NV45 is, we’re told, the PCI Express part.

Whether that has an AGP to PCI Express bridge chip remains to be seen.

The Inquirer

weta
03-20-2004, 07:56 PM
NV40 A2 revision

NV40 revision A2 clocked at 475MHz core and 1.2GHz memory (GDDR-3)
16 pipelines - 3DMark2001 single-texture fillrate of 7,010 MPixels/sec and multi-texture fillrate of 7,234 MPixels/sec
Pixel shader performance ranges from 2.5X to 5X over GeForce FX 5950 Ultra
Antialiasing in screenshots is 4X RGMS (Rotated Grid Multi-Sampling)
Max anisotropic filtering at 8X with tested driver, but might see 16X
Gameplay performance 2.5X to 3X faster than GeForce FX 5950 Ultra using high resolutions with AA and AF
Image quality is "far far better"

Halo - 1600x1200 - no AA/8X AF - 51.1fps
Far Cry - 1600x1200 - no AA/no AF - 53.4fps
UT2004 Botmatch - 1600x1200 - 4X AA/4X AF - 71.9fps, 1280x960 - 82.9fps
Star Wars Kotor - 1600x1200 - 4X AA/8X AF - 49.3fps

System Specs - Athlon 64 3200+, Gigabyte GA-K8VT800 (VIA KT880) Mainboard, 1GB PC4000 RAM, 160GB SATA HDD.

NV News / Beyond 3D Forums
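As a quick sanity check on the fillrate figures above: peak theoretical fillrate is simply pipelines x core clock, one pixel per pipeline per clock (a rough sketch using the rumoured A2 numbers; measured results normally land a little under the theoretical peak):

```python
def peak_fillrate_mpixels(pipelines: int, core_mhz: int) -> int:
    """Theoretical peak fillrate in MPixels/sec: one pixel per pipeline per clock."""
    return pipelines * core_mhz

# Rumoured NV40 A2: 16 pipelines at a 475MHz core clock
print(peak_fillrate_mpixels(16, 475))  # 7600 MPixels/sec theoretical peak
```

The quoted 3DMark2001 results (7,010 single-texture and 7,234 multi-texture) sit plausibly just below that ceiling.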

weta
03-22-2004, 11:05 PM
NV40 pic?

http://www.tweaktown.com/weta/nvidia/nv40.jpg

Black Rain
03-23-2004, 12:29 AM
Interesting, its looks like the same sort of setup from the Matrox Parhelia.
Is that 2 power connectors? Looks like it to me, I wonder what sort of powerdrain/heat output it'll have?

weta
03-23-2004, 01:00 AM
NVIDIA's NV40 mimics a 16x1-pipeline architecture with just eight pipelines
Exclusive: Vertex Shader 3.0 workaround

Investigative journalism is not an easy thing. But we do what we have to do, just as Nvidia and ATI need to make their chips. Our job is to post news as exclusively as possible, occasionally at the risk of jeopardising communications with the companies we write about. Still, we have to do it.
Anyhow, we have something that you all have been waiting for.

It was easy enough to state that NVIDIA's NV40 was an eight-times-two pipeline architecture before and now claim it's 16x1, but it was very hard to find out how this chip actually works without being able to speak with the company's David Kirk and ask him all the nasty questions. We don’t sign NDAs, that’s our problem. If we did, we'd know, but we wouldn't be able to tell you. As it is, we can post this stuff as soon as we get it.

So, the claim that we made more than once that NV40 is a 16x1 chip is partially true, but the claim that I made previously that the chip is 8x2 is also true. How's that possible?

NV40 will score a magnificent 6500 Mtexels per second in 3DMark01, whereas ATI's Radeon 9800PRO will give you around half that, but there is a catch here.

We understand that the NV40 will actually only have eight physical pipelines, but these will appear to act like 16 in certain games. Indeed, in 3DMark 2001 Nvidia was telling people how 3DMark01 is a very nice benchmark, since they can render 16 textures per pass in it using only eight pipelines. What Nvidia is using is the ability of the Shader Model 3.0 vertex shader (VS 3.0) to fetch textures, acting as a virtual pipeline, but it can only fetch them without filtering.

But, in modern games you always apply bilinear, trilinear or anisotropic filtering to the textures that you render, otherwise they will have some visual malformations. This is why, in normal games, eight pipelines will be able to render eight textures in a single texturing pass, and possibly even 16 if you render both textures in the same pass using an 8x2 architecture.

What we understand Nvidia is doing is processing eight textures in the pixel pipelines while the other eight textures get processed by the Vertex Shader 3.0. If they decide to use this in modern games, it would roughly mean that they can only render every second texture properly, which would mean that you would be able to notice the difference. However, this can be tweaked in the driver if they decide to go that way, but image quality would have to be sacrificed for it.

To get into more details, the Vertex Shader 3.0 model has the ability to fetch textures, but you could use VS 3.0 this way only where you don’t have any geometry - when you have to process just pixels. This cannot be used in modern games, as you process just point-sampled textures. In 3DMark2001 SE you get such high numbers because you deliver 16 pixels per clock.

We asked Nvidia to comment but, since they haven’t announced NV40, they said they can't comment on "unannounced products". But we expect they will be able to tell us more when they launch on the 14th on the old continent.

ATI's R420 offering remains a 12x1 architecture, which makes this games-running business very complicated, since you cannot make any final judgement about who will win the chip speed race.

However, we expect NV40 to deliver 16x1-like performance in Doom 3 (or should we say 16x0, which is what people call NVIDIA's approach), where it renders only Z and stencil information, with no colour. This will be used by Doom III and games based on its engine.

It's likely that Nvidia will launch up to two weeks prior to R420's introduction.

The Inquirer
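The per-pass texture arithmetic The Inquirer describes can be modelled crudely (a hypothetical sketch of the rumour as written, not a description of how NV40 actually works):

```python
def textures_per_clock(pipelines: int, tmus_per_pipe: int,
                       vs_texture_fetches: int = 0) -> int:
    """Textures sampled per clock: each pipeline samples tmus_per_pipe
    filtered textures; vs_texture_fetches models the article's claim that
    VS 3.0 can contribute extra, unfiltered (point-sampled) fetches."""
    return pipelines * tmus_per_pipe + vs_texture_fetches

print(textures_per_clock(8, 2))     # classic 8x2: 16 filtered textures
print(textures_per_clock(8, 1, 8))  # rumoured trick: 8 filtered + 8 point-sampled
```

Both configurations hit 16 textures per clock in a synthetic fillrate test, which is exactly why the article argues the benchmark can't distinguish the two while a filtered game workload could.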

weta
03-23-2004, 01:14 AM
Going by the trace lines on the PCB, it appears that the NV40's cooler could be very similar to the current 9800 XT one.
Although unconfirmed, I understand that the primary task of the second molex is to power the card's fan.

JSN
03-27-2004, 07:52 PM
is there any place that monitors the launch dates of these new cards?

weta
03-28-2004, 02:04 AM
JSN: Not really, The NV40 will be launched on the 13th of April at a NVIDIA sponsored LAN in San Francisco. The latest rumours suggest that the R420 will be launched two weeks after it.

benny_mankin
03-29-2004, 03:42 AM
Oh really? I thought R420 was gonna be released earlier than NV40....

oh, well... 2 weeks to go... :drool: :drool: :drool:

SmokeyTheBalrog
03-29-2004, 04:00 AM
No, ATI wants to hold off as long as possible before launching R420 because the margins on current cards are much higher.

Since they supposedly have their new card ready to ship, unlike nVidia, they can wait till nVidia launches: launch their cards after nVidia, BUT have their cards on the shelves first.

I believe, correct me if I'm wrong, that ATI generally try to have the products on the shelves within two weeks after their launch, unlike nVidia.

weta
03-29-2004, 07:28 AM
SmokeyTheBalrog: No you are quite correct, even if NVIDIA launches first, ATI will have their cards ready in greater numbers and in the stores quicker. If ATI launch the R420 now, they are simply competing against their own R360. It's better for them to maximise profits and continue selling the R360 until NVIDIA's NV40 hits the shelves. Only then will ATI launch the R420.

benny_mankin
03-29-2004, 11:23 AM
mmmm, that's interesting, makes a whole lot of sense tho.
But r u suggesting R420 will hit the shelves later too?

weta
03-29-2004, 05:39 PM
benny_mankin: Not really, I'm just suggesting that ATI will continue to sell their R360 cards for as long as possible. NVIDIA has a history of doing paper launches, so ATI will want to know the NV40 is actually available before launching their new card. ATI's R420 should be available (in quantity) first.

benny_mankin
03-29-2004, 10:26 PM
Sounds good to me :D
btw, check this (http://www.popcultureshock.com/news.php?id=688) and this (http://www.warp2search.net/modules....17166&71068) out guys.

One written by ATIfanatic, the other by NVidiot :roll:

weta
03-29-2004, 10:44 PM
benny_mankin: Your second link isn't working, but my guess is that you are referring to this story here (http://www.warp2search.net/modules.php?name=News&file=article&sid=17166) which has already been dismissed as utter rubbish by Epic's Mark Rein.

benny_mankin
03-29-2004, 10:55 PM
What the ****, now ur link isn't working either, hahaha... I guess it was being deleted while we were writing this :lol:

weta
03-30-2004, 12:10 AM
benny_mankin: The link's working ok for me. :?

benny_mankin
03-30-2004, 12:15 AM
???? :?: :?: :?:

Even the home page is not working, well maybe they're updating it or something... :)

JSN
04-03-2004, 11:06 PM
updates? I just ebayed my 9800XT (I know, i'm a *****) for this or the R420 :D

weta
04-04-2004, 06:51 PM
NV40 video clips

We have some exciting new SDK samples in the works for our next release. In case you missed them running live at GDC 2004, we created several educational video clips we think you'll find interesting. These clips are meant to explain some of the techniques that will be possible with our next-generation hardware. Full source code for these samples and more will be available in the next release of our SDK.

WARNING!: SDK Installer required to run videos (199 MB)

More details and download (http://developer.nvidia.com/object/sdk_home.html)

NVIDIA Developer

weta
04-05-2004, 06:03 PM
NV40 3DMark 2003 scores revealed

12,535: If people thought we were overconfident in this article (http://www.theinquirer.net/?article=14373), estimating the huge performance jump Nvidia were going to make when introducing its sixteen pipeline NV40, then they're going to have to think again.

Numerous sources have confirmed that a Futuremark 3DMark 2003 score of 12,535 has apparently been demonstrated by Nvidia on its pre-production engineering samples.

More information (http://www.theinquirer.net/?article=15169)

The Inquirer

JSN
04-05-2004, 06:29 PM
http://www.posterboys.net/homer-drool.gif

weta
04-05-2004, 06:47 PM
According to The Inquirer, two NV40 versions will be available, and they will be called the GeForce 6800 and GeForce 6800 Ultra. The FX title appears to have been dumped.

Cataclysm
04-06-2004, 03:44 PM
I dont particularly mind the scores of the NV40, since I'll never be able to afford one.
All it means is I'll finaly be able to afford a 9600xt or maybe a 9800xt :lol:

weta
04-07-2004, 07:12 PM
NVIDIA's NV40 unchallenged in the power race
Power to the people

In the near future, we are pretty sure that NVIDIA is going to win the graphics crown back from ATI, but not by a huge amount. This will, as usual, last for about two weeks until the next product cycle kicks in, and the brass ring is once again up for grabs. If you think ATI will sit still, think again, it will fight tooth and nail.

NVIDIA will win one race hands down, the power race. I am sure you have seen the pictures of the NV40 boards floating around here and there, and saw the two molex connectors sticking off the back. If this didn’t clue you in, let me tell you right out, NV will be unchallenged in the power race.

Not only will the card lie closer to Prescott than Tejas on the power scale, but it will also have some really nasty asterisks after the two molex connectors on the requirements sheet. Those asterisks are the killer.

It seems each of those connectors will need to be plugged into a separate power rail, and nothing else can be plugged into that rail. OK, they will let you put a fan into it, but anything more is strictly forbidden. You can just hear the good folk at Antec doing the happy dance. Power supply upgrade time people.

Now, before you go off and start criticising NVIDIA the same way you are rapping Intel, let me say there is a big difference. Intel was *****-slapped by the press over Prescott because it didn’t perform and it sucked power. The NV40, by all accounts, will perform, and perform well. The juice sucking is secondary.

What I mean is that if you deliver, you are effectively dangling a shiny thing in front of the public, and they overlook the occasional wart. NVIDIA has a whopper of a shiny thing here, what was that about power again?

The Inquirer

benny_mankin
04-07-2004, 07:44 PM
Seriously, guys... don't you think The Inquirer sounds like an NVIDIA fan? :)

Well, about the news: if it's true, then we will surely need a new PSU. That means after spending $500, or more, on the GPU, we'll have to spend around $70-100 on a good PSU with more power rails. My current Enermax EG-465 has one extra power rail (compared to my old generic PSU) and still.... not enough :roll:
Well, maybe it's worth it, who knows... (and I wonder about the R420's power demand, as the 9800XT is the most power-demanding card atm)

Soyoman
04-07-2004, 10:53 PM
One of the reviews I read said that you need a 300W PSU or higher, depending on what you're currently running for hardware. But I'm guessing 350W minimum.

JSN
04-08-2004, 12:55 AM
great...so the sparkle 400W I just bought is useless seeing how i'm planning on upgrading to a NV40 and keeping 3 hard drives....

I need a new hobby :(

tacos4me
04-08-2004, 04:49 AM
Surely you can split one rail for both of the connectors. :roll: But it doesn't matter to me anyway because ATI is the only way to go.. 8)

weta
04-12-2004, 07:04 PM
Geforce 6800 Ultra has 222 million transistors

Next week is when Nvidia's hammer strikes the forge and NV40 sparks fly.
We already know a fair amount of what Nvidia will reveal on the 13th in San Francisco and on the 14th in the land of chocolate and cuckoos.

16×1 and 32×0 (Z-only) is what Nvidia wants to tell people, and a 3DMark03 score of 12,353 for the chip that has an amazing 222 million transistors.

As we said before Nvidia will push hard the fact that it is the only firm that will have a Pixel Shader 3.0 model on the market and it will focus its marketing efforts there. This means that apart from full support for Shader Model 3.0 it will support Vertex Texture Fetch, long programs, Pixel Shader flow control and full speed fp32 shading. Let's hope everyone understands what this means.

There are plenty of other features: the chip can do 16 pixels per clock Color & Z or 32 pixels per clock Z-only, 64-bit FP Frame Buffer Blending & Display, Lossless Color & Z-Compression, a new antialiasing approach called High Quality AA - Rotated Grid, full MTR (multi target rendering, I guess), and accelerated shadow rendering.

The chip is built by IBM on a 0.13-micron process and it will use the 60 series of drivers - currently 60.70.

The Geforce 6800 Ultra has two power connectors and, surprisingly, the reference card is a single-slot design.

It uses GDDR3 memory clocked at 550MHz but some partners might go even higher.

The Inquirer
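For anyone wanting to sanity-check those quoted specs, here's the back-of-the-envelope arithmetic (a rough illustration using only the numbers reported above - 16 pipelines, 400MHz core, 550MHz GDDR3 on a 256-bit bus - not official NVIDIA figures):

```python
# Rough theoretical peaks for the GeForce 6800 Ultra, from the specs quoted above.

core_mhz = 400
pipelines = 16
# One pixel per pipeline per clock:
fill_rate_gpixels = pipelines * core_mhz / 1000  # -> 6.4 GPixels/s

mem_mhz = 550            # physical clock; DDR transfers data twice per clock
bus_bits = 256           # 256-bit bus = 32 bytes per transfer
bandwidth_gbs = mem_mhz * 2 * (bus_bits / 8) / 1000  # -> 35.2 GB/s

print(f"Peak fill rate: {fill_rate_gpixels} GPixels/s")
print(f"Peak memory bandwidth: {bandwidth_gbs} GB/s")
```

Partner cards with faster memory would scale the bandwidth figure accordingly.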

weta
04-12-2004, 07:10 PM
http://www.tweaktown.com/weta/nvidia/nv40_ref.jpg

NV40: NVIDIA GeForce 6800 Ultra

tacos4me
04-12-2004, 07:11 PM
I hope they include a DVI to analog adapter..

JSN
04-12-2004, 11:27 PM
I hope they include a DVI to analog adapter..

my first thought as well. one more day till the launch...just take my f'in wallet NVidia!

weta
04-13-2004, 07:35 AM
http://www.tweaktown.com/weta/nvidia/nvidia_gf6800u-sample.jpg

Here's a better picture of NVIDIA's GeForce 6800 Ultra reference card which according to one website will cost $399.

AsianBatman
04-13-2004, 07:47 AM
I hope they include a DVI to analog adapter..

Most cards now come with a dvi adapter. I know all ati cards come with it.

benny_mankin
04-13-2004, 08:56 PM
.......according to one website will cost $399.


Whoa, sounds too good to be true..... but hey, if it's true, I think it's a great marketing strategy by NVidia. I mean, considering ATI's releasing the R420 2 weeks later, or even more than a month later for the XT series (supposedly faster than the NV40 :evil: ), a lot of enthusiasts will be tempted to buy the NV40 instead of waiting for the X800 XT.

But then again, maybe..... :roll:

:)

JSN
04-14-2004, 11:43 AM
i'll probably buy a NV40...return it in 30 days and get a X800 XT

werley_123
04-14-2004, 01:38 PM
Do they have an exact release date yet? I'm about to build a new computer and would be pretty pissed if I dump $260 on an FX 5900 right now and the price plummets two months later :evil:

AsianBatman
04-14-2004, 03:49 PM
That will always happen... no matter what. I bought my 9800 for $200, then 2 weeks later it was going for $100, but it was sold out the minute the store opened.

Soyoman
04-14-2004, 10:52 PM
They just announced the GeForce 6800 today, check the mainsite.

www.nvidia.com

JSN
04-15-2004, 01:21 AM
http://images.anandtech.com/reviews/video/NVIDIA/GeForce6800U/angle1.jpg

so much for the non-power-hogging, somewhat cooler, single-slot sensible design.

NVIDIA indicated (in the reviewers guide with which we were supplied) that we should use a 480W power supply in conjunction with the 6800 Ultra.

No, it's not one slot; yes, it has 2 molex connectors; and generally it's very loud.

have they learned nothing from the NV30 disaster? Wake me when the X800 arrives http://www.bimmerforums.com/vb3images/smilies/redface.gif

tacos4me
04-15-2004, 03:34 AM
Wake me when the X800 arrives

I agree :) Ati makes much better cards and cooling designs..

Soyoman
04-15-2004, 03:41 AM
:wow: I believe that's the Personal Cinema version. There will no doubt be monitor ports on others. Although I am disappointed in the size and likely noisiness of the cooling unit. Let's hope they have it rigged with a stealth fan (crosses fingers...).

AsianBatman
04-15-2004, 06:31 AM
Wake me when the X800 arrives

I agree :) Ati makes much better cards and cooling designs..

Where were you when the GF4 was dominating. :D

werley_123
04-15-2004, 06:46 AM
They just announced the GeForce 6800 today, check the mainsite.

www.nvidia.com


So it will take a while for cards with the 6-series chipset to come out, if they just now announced that it has been picked up by the major card makers... right?

tacos4me
04-15-2004, 07:22 AM
Where were you when the g4 were dominating

With Ati :wink:

Soyoman
04-15-2004, 08:05 AM

They just announced the GeForce 6800 today, check the mainsite.

www.nvidia.com


So it will take awhile for cards with the 6 series chip set to come out if they just now announced that it had been picked up by the major card makers...right?

No doubt :roll:

AsianBatman
04-15-2004, 08:48 AM
Where were you when the g4 were dominating

With Ati :wink:
With what card..i am quite curious.

werley_123
04-15-2004, 09:59 AM
So if it is going to take them time to pump out these new cards with the 6 series chip, would you think it would be okay for me to go ahead and buy my new card?

I'm looking at an ASUS nVIDIA GeForce FX5900 Video Card, 128MB DDR, for $260. I'm just hopin that it won't drop in price the second I buy it, and I know they prices will go down when these new 6 series cards come out.

AsianBatman
04-15-2004, 10:04 AM
Go with a 9800... the 5900 has problems with its shaders... so don't waste your money, go with the 9800.

werley_123
04-15-2004, 10:50 AM
Go with a 9800...5900 have problems with there shaders...so dont waste your money..go with the 9800.

I can't find a 9800 that can even compare to the 5900 in the same price range.

I have seen nothing but good reviews for the 5900, plus ASUS is known for making good products, and this particular card comes with a fairly nice bundle.

But if you have any specific suggestions of what cards I should look at, I would gladly look at them.

The ASUS 5900 has a core speed of 400MHz and a memory speed of 850MHz and, once again, a nice bundle. If you know of anything in that ballpark, once again, I would check it out.

AsianBatman
04-15-2004, 11:08 AM
Core speed doesn't matter... well, it does, but not when you're comparing ATI and Nvidia.
http://www.newegg.com/app/ViewProductDesc.asp?description=14-102-299&depa=0
When you compare the 9800 and the 5900, the 9800 is the way to go because of its shaders and readiness for the coding future games like Half-Life 2 are going to use... the 5900 is a great card only for games that don't use the advanced shaders. So I would go with the Sapphire 9800 Pro, which has a nice bundle and is $243.

werley_123
04-15-2004, 12:00 PM
Interesting (and cheaper)... I noticed that most people say the 9800 Pro runs 3DMark03 at about 6000 and the 5900 at 5200 (I assume this is a better indicator than specs alone). I have only two more questions: what's the big deal with the shaders, and how would this card run with an AMD 64 3200+ and 1GB of DDR400? (Yeah, I'm planning on building a monster.)

JSN
04-15-2004, 12:03 PM
1. if you're planning on building a monster....why settle for a 9800 pro?
2. quit yer hijackin'

werley_123
04-15-2004, 12:24 PM
1. if you're planning on building a monster....why settle for a 9800 pro?
2. quit yer hijackin'

cuz monsters cost money...and i'm not about to dump 400-500 on a card when i could buy a cheaper one that will be usable for just as long.


and what is hijacking? if i knew what it was, odds are i wouldn't do it.

JSN
04-15-2004, 12:33 PM
if you're going to be gaming, the vid card is the only thing you go all out on. my machine has a 3000+ XP bought for $150 - and soon to have an R420. all i'm saying is your machine will boot in like 15 seconds and load up real quick, but it will lag during play. not a good thing to experience after blowing a few grand.

and hijacking is changing the subject of a thread. we WERE discussing the colossal failure that is modern day nVidia

werley_123
04-15-2004, 12:59 PM
if you're going to be gaming, the vid card is the only thing you go all out on. my machine has a 3000xp bought for 150 - and soon to have a R420. all i'm saying is your machine will boot in like 15 seconds, it will load up real quick and lag during play. not a good thing to experience after blowing a few grand.

and hijacking is changing the subject of a thread. we WERE discussing the colossal failure that is modern day nVidia


Duly noted, but all the same, I can only afford to spend about $250 on a card. I'm not looking for a gaming system as much as I am something that can handle all sorts of projects at once... my current computer can't even run Word XP and Kazaa at the same time; they both suck up the memory and my processor can't keep up. Gaming is definitely a plus, I love computer gaming and that's why I'm looking for the best card I can get at this time (and I think the FX 5900 and 9800 Pro are both good chips), but my main priority is to get a computer that can keep up with all the XP software, which my little Athlon 1.05 ain't doing.

By the way, I believe this does tie into the dismal state nVidia is in; we are sitting here arguing which company produces the best chip. I have personally always been a nVidia person, but the ATI people have brought up some valid points that check out (i.e. ATI's shading ability, though I still don't fully comprehend what the big, fat deal with shading is anyway). So come on, keep the nVidia vs. ATI talk comin.

AsianBatman
04-15-2004, 03:47 PM
1. if you're planning on building a monster....why settle for a 9800 pro?
2. quit yer hijackin'
Sorry :omg:
Start a thread and i will help you, assuming JSN and weta dont put a whooping on me. :roll:

ZEROKOOL
04-19-2004, 08:33 AM
I hear it's supposed to have up to 512MB of memory...
the reference card only has 256MB, and with chips on one side only!
So, when are we gonna see the 512MB version?

AsianBatman
04-19-2004, 08:49 AM
Current games are barely utilizing 256MB... so I doubt 512MB will come anytime soon.

JSN
04-19-2004, 09:18 AM
Current games are barely utilizing 256mb....so i doubt 512 will be anytime soon.

current games aren't even coming close to 256MB... much less 512

same goes with AGP 8X bus speed... but technology moves forward I guess..?

werley_123
04-19-2004, 10:25 AM
But of course, they will do anything to jack up the price of cards to ungodly prices because they know there are crazy crazy people out there who will pay those crazy crazy prices for technology they won't be using for another five years.

But thats good for the rest of us cuz the prices of what WERE the ungodly priced cards will fall into our price range. It's the circle of life!

weta
04-19-2004, 08:23 PM
512MB of GDDR3 is more about marketing than anything else.

ZEROKOOL
04-20-2004, 07:24 AM
well, thats marketing thats working well for me.
man im getting all wet just thinking about 512mb of gddr3...*OOOHHHHH*....*SPLAT!* man 512mb...*MOOOAAANNN*

werley_123
04-20-2004, 08:07 AM
It's like I said...there will always be someone to buy it.

Another marketing scheme that has worked to perfection.

Now all they have to do is worry about ATI and their new chip.

JSN
04-20-2004, 08:28 AM
if I were Nvidia I wouldn't be worrying about memory and clock speeds. On paper the 9500 KILLED the 9800.

now that the R420 is rumored at 600Mhz while the NV40 is at 400...I wonder if the roles have switched

AsianBatman
04-20-2004, 11:51 AM
if I was Nvidia I wouldnt be worring about memory and clock speeds. On paper the 9500 KILLED the 9800.

now that the R420 is rumored at 600Mhz while the NV40 is at 400...I wonder if the roles have switched
I am confused... the 9500 killed the 9800? Not trying to start anything... but I am seriously clueless.

JSN
04-20-2004, 02:03 PM
yes, on paper. it had a higher clock speed and a higher memory clock.

but in real life the 9800 was clearly better

weta
04-20-2004, 05:24 PM
1. The current NV40 GPU (A1 revision) is 400MHz, a second GPU (A2 revision) 475MHz is on the way.
2. Don't forget the 9800 has more pipelines, a 256bit memory interface and faster memory.

weta
04-20-2004, 05:41 PM
As I understand it, ATI's new (R420) Radeon X800 XT is the equivalent of two 9800XT (R360) GPUs. Further tweaks and the 0.13-micron process have enabled engineers to achieve even higher GPU clock speeds, and finally there's the new GDDR3 memory. Add all this together and you've got an extremely fast card.
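To put rough numbers on the "two 9800XTs" comparison: doubling the pipelines and raising the core clock compound. A quick sketch, using the clock speeds being reported at the time (treat them as approximate until the cards actually ship):

```python
# Illustrative theoretical pixel fill rate: R360 (Radeon 9800 XT) vs R420 (X800 XT).
# Clocks are the figures reported at the time, so treat them as approximate.

def fill_rate_gpixels(pipelines: int, core_mhz: int) -> float:
    """Theoretical peak, assuming one pixel per pipeline per clock."""
    return pipelines * core_mhz / 1000

r360 = fill_rate_gpixels(8, 412)    # 9800 XT: 8 pipes @ 412 MHz -> ~3.3 GPixels/s
r420 = fill_rate_gpixels(16, 520)   # X800 XT: 16 pipes @ 520 MHz -> ~8.3 GPixels/s

print(f"9800 XT: {r360:.2f} GPixels/s")
print(f"X800 XT: {r420:.2f} GPixels/s ({r420 / r360:.1f}x)")
```

On paper that's roughly a 2.5x jump, which is why "two 9800XTs plus a clock bump" is a fair mental model - real game performance will of course depend on memory bandwidth and drivers too.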

Josh M.
04-21-2004, 10:04 AM
Does anyone know the date that these cards will be available to the general public?

weta
04-21-2004, 05:09 PM
Josh.M: May/June.

JSN
04-25-2004, 02:59 PM
any updates on the release dates?

weta
04-26-2004, 04:36 AM
JSN: Nothing concrete, still looks like late May early June.

JSN
04-26-2004, 03:19 PM
maybe selling my 9800XT ahead of the curve wasn't the best idea

I have nothing to play new games on until the NV40/R420 launches.

I vote for myself as most anxious http://www.bimmerforums.com/vb3images/smilies2/eyecrazy.gif

weta
04-26-2004, 09:43 PM
TR: When will the GeForce 6800 Ultra arrive in stores?
Tamasi (NVIDIA): By Memorial Day the 6800 Ultra will be available, and by July 4th, the full line of the 6800 series will be broadly available.

Luckily I've still got my Radeon 9800 XT.

JSN
04-27-2004, 10:39 AM
damn....

does anyone know a place that gives full refunds on vid cards within 30 days?

and I thought I was so smart getting $380 for my 9800XT :(

tacos4me
04-27-2004, 07:10 PM
The NV40 is available in Alienware systems ATM..

JSN
04-28-2004, 04:31 AM
they say they are...but when you click the link

Intel Pentium 4 3.2GHz Extreme
1GB DDR Corsair XMS PC-3200
160GB Seagate Serial ATA 8MB Cache
NVIDIA GeForce FX 5950 Ultra 256MB
Creative Sound Blaster Audigy 2 ZS

when you select the 6800...

Estimated Ship Date: 6/10/04

tacos4me
04-28-2004, 09:07 AM
when you select the 6800...

Estimated Ship Date: 6/10/04

No, it doesn't. When I select the 6800 Ultra, it says limited availability and says I need the upgraded PSU...

JSN
04-28-2004, 09:20 AM
when you select the 6800...

Estimated Ship Date: 6/10/04

No, it doesnt. When i select the 6800 Ultra, it says limited avaibility and says i need the upgraded PSU...

goto checkout, dumbass

weta
04-28-2004, 05:59 PM
Albatron Technology’s latest graphics cards, the GeForce 6800 and GeForce 6800UV, will hit the shelves in Taiwan in early May, according to company spokesman Kevin Lu.

More information (http://www.albatron.com.tw/english/news/news_detail.asp?news_id=77)

tacos4me
04-28-2004, 07:11 PM
goto checkout, dumbass

I did, dumb****. It says nothing of the sort at the checkout. Limited Availability would mean they have the card but it is LIMITED. :wink:

AsianBatman
04-29-2004, 03:10 AM
goto checkout, dumbass

I did, dumb****. Says nothing of the sort at the checkout. Limited Avaibility would be they have the card but it is LIMITED. :wink:
...tacos, you sure don't get along with a lot of people... but can we all watch the language here? :D

JSN
04-29-2004, 05:24 AM
goto checkout, dumbass

I did, dumb****. Says nothing of the sort at the checkout. Limited Avaibility would be they have the card but it is LIMITED. :wink:

:roll:

http://jsn.prohosters.com/media/proof.jpg

wayout44
04-29-2004, 05:40 AM
tacos4me seems to have a lot of attitude problems, as I found out in another thread, but maybe he should direct his language at himself instead of others, as then he would be far more accurate.

tacos4me
04-29-2004, 08:16 AM
Ok, I'm proven wrong..


tacos4me seems to have a lot of attitude problems as I found out in another thread but maybe he should direct his language at himself instead of others as then he would be far more accurate.

direct it at myself.. how about at you? And where were you in this feud? I didn't ever show an attitude before JSN here..


goto checkout, dumbass

after which.. I acquired mine..

So yes JSN, you win, hope you feel wonderful.. would you like your slippers?

JSN
04-30-2004, 09:32 AM
Ok, I'm proven wrong..


tacos4me seems to have a lot of attitude problems as I found out in another thread but maybe he should direct his language at himself instead of others as then he would be far more accurate.

direct it at myself..how about at you? and where were u in this feud?, but i didnt ever show an attitude before JSN here..


goto checkout, dumbass

after which..i aqquired mine..

So yes JSN, you win, hope you feel wonderful.. would you like your slippers?

win? win what? just putting a tard in his place

now can we get back to NV40 news...

JSN
06-04-2004, 05:45 PM
by the time ps 3.0 and PCI express actually matter, this card will be too slow to run anything.

/waiting on Asus with their X800 XT

weta
06-05-2004, 07:22 AM
JSR: Welcome back

JSR
06-11-2004, 03:04 PM
Weta............my responses are being yanked, so it's pointless to proceed.......I noticed that the NV40 is a dumbed-down NV45 (w/ a bridge)....can you imagine this rocket when they take off the governor and use PCI Express......anyway, been nice knowin' ya............late

weta
06-13-2004, 03:13 AM
JSR: Don't know who's doing that mate, or why?

Yawgm0th
06-13-2004, 05:57 AM
I imagine it's Darthtanion's doing (only active mod or admin here that I know of other than you, Weta). He seems to post inaccurate information and bad advice every once in a while, so I imagine Darthtanion doesn't want it sitting there for every newbie in the world to come see. :2cents: :shrug:

01001101
06-13-2004, 10:25 AM
I know I'm new here and don't know a damn thing about JSR (or anyone else), but if all of my posts started disappearing I think I would just take the hint and leave. I wouldn't bother dealing with that kind of bs; there are plenty of other places to spend one's time. It seems clear to me that someone with authority doesn't want JSR here, for whatever reason. :shrug:

weta
06-13-2004, 08:22 PM
01001101: Welcome to TweakTown, I hope you enjoy your stay here.
Yawgm0th: Unfortunately I can't comment on the accuracy of JSR's latest posts because they've been deleted.

01001101
06-14-2004, 02:56 AM
thanks weta :)
But I've got to admit, I'm already having my doubts about a lengthy stay...

weta
06-14-2004, 06:22 PM
That's understandable - it's a little quiet around here of late. :)

JSR
06-15-2004, 01:04 AM
o come on, stick around, there's always me to pound on...............that has to be fun :D

01001101
06-17-2004, 11:09 AM
weta, the slowness I can deal with. It's the content (or should I say 'lack of quality') and the MIA leadership that bother me. There's actually a small list I could post, but I don't want to offend the long-term loyal members here. However, I do see some potential just under the surface, so I'm not quite ready to abandon ship.
Actually, I have one question...

If all the regular admins and mods left, why haven't some members been promoted to fill their places? Has any effort been made to get some of the departed members back? A little order wouldn't hurt this place at all.

I know this isn't the proper place for such a post, but I felt compelled to say it anyway.

JSR, I didn't really mean to 'pound' on you as such, but I simply couldn't resist responding to something I thought was obvious. No permanent hard feelings, I hope.

JSR
06-17-2004, 01:38 PM
not at all, as you say mate....I'm a bit rude at times, usually only to get my point across.......but I genuinely do not mean to offend anyone, it's probably more self-defence.....regardless, hey, if we're here, it's gotta be good!.........just a bit of proper attitude and we're off.......

weta
06-18-2004, 05:59 AM
It's very unlikely that any of the original mods will return, which is a great shame. With regards to new mods, I have no idea what's going to happen. Personally I would like to see some attempt made to get the forums buzzing again.

JSR
06-18-2004, 10:46 AM
10/4 good buddy..........i'm a one man marching band so we can take it from here........hehe......anyway we can't miss.....the future so bright .............i gotta wear shades