
Thread: GeForce FX Q&A with nVidia




  1. #1
    Join Date: Nov 2001 | Location: Taipei, Taiwan | Posts: 4,308

    nVidia was kind enough to answer a few of the questions we had for them regarding their upcoming GeForce FX product (including some of the ones which YOU guys wanted answered).


    From the outset, the aim of the interview was to get some clear-cut answers once and for all, so we all know what to expect from nVidia's next GPU:


    TT - When will we see retail cards on the market (in Australia, Europe and the United States), in what sort of quantity and at what cost?

    nV - We are sampling now. Expect to see systems and boards based on the GeForce FX GPUs in February; prices will start at around $399 US.



    TT - We understand there may be several different versions of the GeForce FX. What are they called, how do they differ from each other and will they all be released at the same time - or will faster, enhanced versions be kept hidden under covers to combat future products from your competitors?

    nV - We will have two versions of GeForce FX - the NVIDIA GeForce FX 5800 Ultra and the NVIDIA GeForce FX 5800. The GeForce FX 5800 Ultra is the faster of the two GPUs.



    TT - Is it true that all GeForce FX cards will be manufactured by nVidia, much like what ATi has done in the past?

    nV - The NVIDIA business model has not changed. We are not producing boards.



    TT - Which manufacturers have licenses to manufacture their own cards, differentiating them from your own reference design?

    nV - All of our add-in-card vendors have the ability to produce their own cards.



    TT - Speaking of the reference design, the pictures we’ve been seeing on hardware sites (including ours, from the Australian Game Developers Conference which we saw in Melbourne last month - http://forums.tweaktown.com/showthread.php?s=&threadid=6811) - will this be similar to the final release or have you added any changes such as ViVo, for example?

    [Images: GeForce FX reference board photos from AGDC 2002 - http://images.tweaktown.com/imagebank/agdc02_04.jpg, agdc02_05.jpg, agdc02_06.jpg]

    nV - Yes - the pictures that are on the web are a good indication of what the final board will look like.



    TT - The cooling devices we’ve seen on GeForce FX cards are quite big and complex - why? Are you experiencing heat problems with the 0.13 micron process based NV30 core and/or DDR-II memory, and if so, are they responsible for the production delays?

    nV - Enthusiasts are accustomed to elaborate cooling systems, and view them as a badge of honor. Our silent running feature built into the GeForce FX is truly innovative. GeForce FX senses the activity in the 3D pipeline and adjusts the cooling system accordingly. The fan may not even come on during normal use, as the heat-pipe and heat-spreader may provide the needed heat dissipation.

    The move to 0.13u did introduce some delays. However, the transition to 0.13u was critical to keeping the heart and soul of GeForce FX. The manufacturing times for 0.13u processing and flip-chip packaging are much longer than for 0.15u and wire-bond packaging, and while these technologies are critical to making GeForce FX run as fast as it does, they are also the biggest reasons for delaying our availability date to February.
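
    As an illustration of the "silent running" behaviour described above - the fan staying off until 3D pipeline activity heats things up - here's a rough sketch of load-sensing fan control with hysteresis. The thresholds and names are our own assumptions, not nVidia's actual firmware logic:

    [code]
    # Illustrative sketch of activity-based fan control: the fan stays off
    # while the heat-pipe and heat-spreader can cope, and only spins up when
    # the core temperature climbs. Thresholds are hypothetical, not GeForce
    # FX specifications.

    FAN_ON_TEMP_C = 70.0   # assumed point where passive cooling gives out
    FAN_OFF_TEMP_C = 60.0  # assumed point where the fan can stop again

    def update_fan(core_temp_c: float, fan_running: bool) -> bool:
        """Return whether the fan should run. Two separate thresholds give
        hysteresis, so the fan doesn't rapidly toggle around one value."""
        if fan_running:
            return core_temp_c > FAN_OFF_TEMP_C
        return core_temp_c > FAN_ON_TEMP_C
    [/code]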




    TT - Will all GeForce FX cards require such extravagant cooling devices on release, devouring precious PCI space like we’ve seen so far?

    nV - No - not all GeForce FX cards will require the cooling device.



    TT - With heat seeming to be a bit of an issue for the GeForce FX, what plans does nVidia have for a mobile version?

    nV - We cannot discuss unannounced products, but history will tell you that NVIDIA likes to leverage our graphics core technologies in many different spaces. We have already announced the Quadro FX products for the workstation space. I do not think it will surprise anyone when the CineFX architecture finds its way into the mobile space.



    TT - Rumor has it you will be using a complex 12-layer PCB, compared to ATI’s Radeon 9700 series which uses only 8. Is this a requirement of DDR-II memory operating at 1GHz for stability, or something else - possibly for yielding faster clock speeds in the future?

    nV - As you know - we don’t comment on rumors.

    TT - Too bad then...



    TT - Over the last few days several companies released specifications and system requirements for the GeForce FX. Most notable is the 350 watt power supply requirement - will the GeForce FX actually require a minimum of a 350 watt supply to operate in a stable environment, or is this just a precaution?

    nV - You hit the nail on the head. The card itself needs less than 75W, but when combined in a typical enthusiast system with high performance CPUs, multiple hard drives or DVD/CD players, the total system will tend to need a 350W supply. There may be individual configurations that could get away with less, but when you buy your Ferrari you don't want to forget to fill up the gas tank.
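
    If you want to sanity-check that 350W figure against your own box, the arithmetic is just adding up worst-case component draws and leaving some headroom. A rough sketch - every figure besides the card's sub-75W draw is a ballpark assumption, not an official spec:

    [code]
    # Rough system power budget, showing why a sub-75W card can still push a
    # typical enthusiast system toward a 350W supply. All figures except the
    # card's stated upper bound are ballpark assumptions.
    components_w = {
        "GeForce FX card": 75,    # nVidia's stated upper bound
        "high-end CPU": 70,       # assumed
        "motherboard + RAM": 50,  # assumed
        "hard drives (x2)": 25,   # assumed
        "DVD/CD drives": 25,      # assumed
        "fans, USB, misc.": 20,   # assumed
    }

    total = sum(components_w.values())
    headroom = 1.3  # don't run a PSU flat-out; ~30% margin is a common rule of thumb
    print(f"Estimated draw: {total} W, suggested supply: {total * headroom:.0f} W")
    # -> Estimated draw: 265 W, suggested supply: 345 W, so a 350W unit fits
    [/code]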



    TT - For our final two questions, we’ll stray away from the tech side of things. Where did the FX name come from? An educated guess would say it came from the 3DFX part of nVidia; maybe you can give us a bit more detail on the naming scheme and why something a little more creative wasn’t used instead.

    nV - GeForce FX is a combination of technology invented at NVIDIA and technology invented at 3dfx. When NVIDIA purchased the 3dfx technology assets, over 100 engineers from 3dfx were offered positions at NVIDIA and those engineers were immediately deployed across all of the products in development at that time. The 3dfx engineers were starting a product called Fusion at the same time that NVIDIA engineers were starting their new GPU. The team formed from the combined group took the best elements of each architecture and created GeForce FX. We feel that it's a nice way to acknowledge the passion and technical contributions that the former 3dfx employees have brought to NVIDIA.



    TT - And finally, the benchmarks which Maximum PC published earlier this month - can you confirm these are accurate and correct?

    nV - They were accurate several months ago when the system was tested. That was very early in the development cycle. We have done a lot of work since then.


    Thanks for your time, Hazel. I hope this has cleared up some of your questions regarding the GeForce FX - feel free to add your comments to this thread.
    Cameron "Mr.Tweak" Wilmot
    Managing Director
    Tweak Town Pty Ltd

  2. #2

    Nice! A couple of questions I wanted answers to were covered:
    one was the cooling system,
    the other was the mobile FX :)
    Great work :cheers:
    AMD Athlon 2400+ @ 2.26Ghz @1.9Core Cooled With Thermalright AX-7 | EPoX 8K5A2+ | 768Mb PC-2700DDR Corsair XMS | Leadtek GForce4 Ti4400 | Hercules Fortissimo II | 80Gb Seagate HDD | Liteon 12x DVD | Aopen 40x12x48 | Samsung SyncMaster 957p 1600X1200@75Hz | Modded Aopen HX-08 with 300W Aopen PSU | Windows XP Pro SP1

    ICQ :159735580 | MSN : nutty_33@hotmail.com | Email : nutty_33@iinet.com.au

  3. #3
    Join Date: Nov 2001 | Location: New England Highlands, Australia | Posts: 21,907

    More ppl should take notice of this bit though... :D
    Quote Originally Posted by nVidia
    There may be individual configurations that could get away with less, but when you buy your Ferrari you don't want to forget to fill up the gas tank.
    :cheers:

  4. #4
    Join Date: Jul 2002 | Posts: 13

    Yeah, but with all the benchmarks coming out Monday morning at 8am, what will NVIDIA say when they show the GFFX can barely beat the 9700 Pro?

    http://www.tecchannel.de/hardware/1109/


    For those of us wondering why NVIDIA hasn't done their normal crap talking ahead of time (they have never been shy in the past) about how badly their card will beat everyone else's, and their precious 6 month product cycles....

    NVIDIA fans are in for a severe slap in the face, courtesy of the company they all love...NVIDIA.

    <ouch>

  5. #5
    Join Date: Jan 2003 | Posts: 1

    Would that be like the slap in the face ATI gave its customers? How do you ship a card that supports 8x AGP and then tell everyone it doesn't work and you need time to fix it? ATI is still second class.

  6. #6
    Join Date: Nov 2002 | Posts: 7

    I'm gonna reserve my opinion on this topic until solid results are available... Not that it matters, I can't afford one anyway. :cry:
    It isss mine... my precioussss...

  7. #7
    Join Date: Jan 2003 | Posts: 251

    Actually, I think a lot of people I know would have bought, and would still buy, the 9700 card, but I want a certain amount of assurance in the area of driver support.

    I want to upgrade when I want to, not because my card is no longer supported.

    Also, the hardware doesn't matter if the software doesn't work.

    When ATI starts showing that its driver teams are getting close to nVidia's, then I will happily put one of their boards in my computer.

    As far as hardware goes, it does seem that at the moment ATI has better hardware. The 9700 is in the same ballpark of "power" as the FX, but you can run it almost silently with the aid of a Zalman ZM80A-HP VGA cooler.

    In fact, if the FX is truly un-silenceable and really noisy, I may get a GF4 or a 9700 (though that is less likely because I don't want to deal with driver issues).


  8. #8
    Join Date: Jul 2002 | Posts: 13

    Quote Originally Posted by ICUBB
    Would that be like the slap in the face ATI gave its customers? How do you ship a card that supports 8x AGP and then tell everyone it doesn't work and you need time to fix it? ATI is still second class.
    works well enough to beat the GFFX....

    duh.

  9. #9
    Join Date: Jan 2003 | Posts: 251

    Now I really don't know which video card to get.

    The FX had a 45% lead in 3ds Max-01; I will be using Maya PLE and XSI - does this mean it will also have a 45% lead there?

    Do I go for the vacuum cleaner with reliable software, or do I go for a silence-able, great piece of hardware with less trustworthy software?

    I'm going to work on this thing as well as play; I need to be able to hear myself think!



    Add in the price difference of $300 vs. $400
    Or just go for a GF4 4200?!?!
    Or a 9500 and try and make it a 9700?!?!

    Oh, boy. Guess I'll wait for some more benchmarks.
    :(

    If ATI can deliver on drivers...
    But then there is that 45% in Max...

    Man, I was so hoping for a 9700 killer - much easier decision then.
    Sigh, sorry for the rant.
    :scream:

  10. #10
    Join Date: Nov 2001 | Location: Taipei, Taiwan | Posts: 4,308

    Anand made a good point in his review - ATI have a 3 month advantage in driver development. With some driver work, I think we'll see the GeForce FX 5800 Ultra performing closer to what we all would have hoped.
    Cameron "Mr.Tweak" Wilmot
    Managing Director
    Tweak Town Pty Ltd
