Nvidia GeForceFX (NV30) chip

Discussion in 'Archived Threads 2001-2004' started by MikeAlletto, Nov 18, 2002.

  1. MikeAlletto

    MikeAlletto Cinematographer

    Joined:
    Mar 11, 2000
    Messages:
    2,369
    Likes Received:
    0
    Trophy Points:
    0
    Over at:
    http://www.anandtech.com/video/showdoc.html?i=1749
    0.13-micron GPU
    125 million transistors
    8 pixel rendering pipelines, 1 texture unit per pipeline, can do 16 textures per pass
    1 massively parallel vertex rendering engine
    4 x 32-bit "DDR2" memory controllers running at ~500MHz DDR
    Up to 48GB/s of memory bandwidth using compression (quick math below)
    AGP 8X support
    Full DX9 Pixel & Vertex Shader support
    Supposedly not until Feb of 2003 though.
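    Quick sanity check on those memory numbers (my own back-of-the-envelope arithmetic, not a figure from the AnandTech piece): four 32-bit controllers make a 128-bit bus, and ~500MHz DDR works out to roughly a 1GHz effective transfer rate, so the raw bandwidth is about 16GB/s; the quoted 48GB/s only appears once you assume their compression kicks in. Something like:

        # Rough check of the quoted specs (illustrative only, not NVidia's math).
        bus_width_bits = 4 * 32             # four 32-bit DDR2 controllers = 128-bit bus
        effective_rate = 1_000_000_000      # ~500MHz clock, double data rate ~= 1GHz transfers/s

        raw_gbps = bus_width_bits / 8 * effective_rate / 1e9
        print(f"raw bandwidth ~= {raw_gbps:.0f} GB/s")              # ~16 GB/s

        # The 48GB/s figure is an effective number that assumes compression;
        # 48 / 16 implies roughly a 3:1 best-case ratio.
        print(f"implied compression ratio ~= {48 / raw_gbps:.0f}:1")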
     
  2. AaronMg

    AaronMg Stunt Coordinator

    Joined:
    Mar 20, 2002
    Messages:
    247
    Likes Received:
    0
    Trophy Points:
    0
    It should be interesting... but it has been delayed for far too long. I am just going to get the 9700 Pro instead. We'll see how it performs at Comdex.
     
  3. Joel Mack

    Joel Mack Cinematographer

    Joined:
    Jun 29, 1999
    Messages:
    2,317
    Likes Received:
    0
    Trophy Points:
    0
    I'd wait until next Fall rather than soil my gaming rig with an ATI part...
     
  4. Camp

    Camp Cinematographer

    Joined:
    Dec 3, 1999
    Messages:
    2,301
    Likes Received:
    0
    Trophy Points:
    0
     
  5. Kelley_B

    Kelley_B Cinematographer

    Joined:
    Feb 27, 2001
    Messages:
    2,324
    Likes Received:
    0
    Trophy Points:
    0
     
  6. Steve_Com

    Steve_Com Extra

    Joined:
    Jan 1, 2002
    Messages:
    21
    Likes Received:
    0
    Trophy Points:
    0
    here is more...

    Graphics king NVidia has announced the GeForceFX at Comdex, a 500-MHz graphics processor the company says will render near-movie-quality real-time graphics and win back the graphics crown from rival ATI and its 9700 chip.

    In addition to the core clock speed of 500 MHz, new boards based on the GeForceFX chip will feature 1-GHz memory speeds thanks to DDR2 memory. The chip, based on an efficient 0.13-micron process, also features 125 million transistors and eight pixel pipelines. Drawing on 3dfx

    Despite NVidia's announcement Monday at the Comdex trade show, boards that use the GeForceFX chip won't begin shipping until mid- to late January 2003, according to the company. The biggest reason for the delay--which puts NVidia well behind rival ATI and its 9700 Pro--is the company's transition to the new process, as well as its implementation of new technologies.


    For example, with that many transistors packed into such a small space, NVidia had to develop a new cooling system for the GeForceFX. The FXFlow thermal management system keeps the board from overheating by channeling warm air directly out of a PC's chassis. Despite the new cooling system, NVidia claims the chip will run silently when performing normal office tasks.


    NVidia executives say the chip's name resulted from combining its signature GeForce moniker with the FX from 3dfx, the graphics company NVidia purchased nearly two years ago.


    "It symbolizes the fusion of these two companies," says Jen-Hsun Huang, president and chief executive officer of NVidia. "The 3dfx acquisition that we made was all about bringing together the finest talent in the world."

    Interactive Movies

    Just as noteworthy as the physical specifications of the GeForceFX is the software architecture it supports. Based on NVidia's CineFX architecture and supporting NVidia's graphics programming language Cg, company executives say the new graphics processing unit will let game developers create graphics that rival those in motion pictures.


    "I've been through a lot of product launches, but this one gives me goose bumps," says Dan Vivoli, senior vice president of marketing at NVidia. "To be able to get this level of realism and deliver it in real time is just staggering."


    NVidia showed numerous PCs using the GeForceFX, including demonstrations such as an emoting fairy and a dancing ogre. The pixel instruction capabilities of the new chip, which determine how many effects it can pack into an individual pixel, allow for life-like hair and skin, as well as advanced lighting effects.
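    To make "pixel instruction capabilities" a bit more concrete, here is a tiny conceptual sketch of the kind of per-pixel program this class of hardware runs for lighting effects (plain illustrative Python rather than NVidia's Cg or CineFX code; all names here are made up):

        import math

        def dot(a, b):
            return sum(x * y for x, y in zip(a, b))

        def normalize(v):
            length = math.sqrt(dot(v, v)) or 1.0
            return [x / length for x in v]

        def shade_pixel(base_color, normal, light_dir, view_dir, shininess=32.0):
            """Conceptual per-pixel program: diffuse + specular lighting for one pixel."""
            # Diffuse term: how directly this pixel's surface faces the light.
            diffuse = max(0.0, dot(normal, light_dir))
            # Specular term: a simple Blinn-Phong style highlight.
            half_vec = normalize([l + v for l, v in zip(light_dir, view_dir)])
            specular = max(0.0, dot(normal, half_vec)) ** shininess
            return tuple(min(1.0, c * diffuse + specular) for c in base_color)

        # One pixel of a skin-toned surface facing the viewer, lit from the upper right:
        print(shade_pixel(base_color=(0.8, 0.6, 0.5),
                          normal=[0.0, 0.0, 1.0],
                          light_dir=normalize([1.0, 0.0, 1.0]),
                          view_dir=[0.0, 0.0, 1.0]))

    The hardware evaluates a routine like this for every pixel on screen, every frame, which is why the number of instructions it can execute per pixel matters for effects like skin, hair, and lighting.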


    In one demo, the paint on a 1950s mint-condition pickup was rapidly aged 50 years, until the vehicle finally rusted away.

    Game developers also took the stage to show off their work with the GeForceFX. In one demonstration, Mark Skaggs, general manager of Electronic Arts Pacific, displayed a battle scene from the company's upcoming real-time strategy game Command & Conquer: Generals. The demonstration culminated in a massive flood and a nuclear explosion full of impressively rendered special effects.

    Building Interest

    It's effects like those that have game developers backing the new chip, says NVidia's Vivoli. "The momentum that the developers have for the GeForceFX shows that we have given them far more choices," he says.

    Just how much it will cost consumers to join the fun remains unclear, as NVidia representatives decline to offer price estimates for the new chip or boards that use it. CEO Huang says, however, the price will be fair, and serious graphics users will be happy to pay it.

    "We will try to sell it as affordably as we can," he says. "Irrespective of its price, I'm sure you're going to love it."
     
  7. Joel Mack

    Joel Mack Cinematographer

    Joined:
    Jun 29, 1999
    Messages:
    2,317
    Likes Received:
    0
    Trophy Points:
    0
    Leo LaPorte showed one on today's "The Screen Savers" (hosted by yummy Morgan and Megan as part of an "all-female TSS")...
    The cooling system for the card alone takes up an additional slot. It looked, uh, cool...
     
  8. Jason Harbaugh

    Jason Harbaugh Cinematographer

    Joined:
    Jul 30, 2001
    Messages:
    2,968
    Likes Received:
    0
    Trophy Points:
    0
    I'm glad they delayed it to get it working properly and fine-tuned rather than putting out a half-assed card just to compete with ATI and stick to their self-imposed six-month product cycle. ATI gets to stay king of the hill a little longer, but consumers end up with one heck of a jump in graphics power in the new year.
    I can't believe they ripped off my name though. I was the first FX!!
     
  9. Kelley_B

    Kelley_B Cinematographer

    Joined:
    Feb 27, 2001
    Messages:
    2,324
    Likes Received:
    0
    Trophy Points:
    0
    Makes you wonder, though, who came up with that cooling system first, Abit or nVidia... as Abit has been selling GeForce4 cards with a similar dual-slot cooling system for several months now.
     
