
GeForce FX 5200

 
After the release of its top cards and the failure in the high-end sector, nVidia urgently turned toward the mainstream video card market, releasing products intended to compete with the ATi RADEON 9500/Pro and RADEON 9200.

The new GeForce FX 5200 chips, despite their novelty, had even lower memory bandwidth than the budget GeForce4 MX440-8x. On the other hand, the GeForce FX 5200 had twice as many texture units, so in applications with multitexturing the FX 5200 should have been faster, since it avoided the performance drop caused by laying down multiple textures per pixel.
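To put that bandwidth gap into numbers, here is a minimal Python sketch. The FX 5200 figures (128-bit bus, 400 MHz effective DDR) come from the specification table below; the GeForce4 MX440-8x figures (128-bit bus, roughly 500 MHz effective DDR) are assumptions added here for illustration.

```python
# Back-of-the-envelope peak memory bandwidth comparison.
# FX 5200 values are taken from the spec table below; the
# GeForce4 MX440-8x clocks are assumptions for illustration.

def bandwidth_gb_s(bus_width_bits: int, effective_mhz: float) -> float:
    """Peak bandwidth: transfers/s * bits per transfer, expressed in GB/s."""
    return bus_width_bits * effective_mhz * 1e6 / 8 / 1e9

print(f"GeForce FX 5200:   {bandwidth_gb_s(128, 400):.1f} GB/s")  # 6.4
print(f"GeForce4 MX440-8x: {bandwidth_gb_s(128, 500):.1f} GB/s")  # 8.0 (assumed clocks)
```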


NV34 (GeForce FX 5200) brought plenty of innovations. First of all there is DirectX 9 support, which nVidia had been dreaming of since the release of the Radeon 9700/9500 and which everyone had been waiting for since the release of 3DMark03. nVidia was still a little late, but it managed to bring support for pixel and vertex shaders version 2.0 to the mainstream market. The new video chip also inherited a few things from the GeForce4 MX: for example, a built-in MPEG-2 decoder that allows DVD movies to be played back in hardware. A TMDS transmitter was built into the NV34 core for output to DVI panels, which made video cards based on the GeForce FX 5200 even cheaper to produce. A TV encoder was also built into the NV34 core, so the chip was an almost completely ready-made solution.

But, once again, nVidia made a bit of a mess. Users are accustomed to calling video chips by their code names: before a chip's release and the official announcement of its name, all talk revolves around the marking - NV30, NV28, and so on. And the larger the number in the chip marking, the larger the number in the official name. It was expected that NV31 would be called GeForce FX 5200 and NV34 would be called GeForce FX 5600, but in fact it turned out the other way around: the lower-numbered NV31 received the "higher" name, GeForce FX 5600, while NV34 became the GeForce FX 5200.

[Image: GeForce FX 5200 retail box]

Making the chip and the cards cheaper required sacrifices. Here are the main features of the older NV30 that the NV34 lacks:
0.13-micron process technology - allowed more semiconductor elements to be placed on the chip and the frequency of the 256-bit core to be raised. The FX 5200 series uses a 0.15-micron process.
Intellisample technology - a new anti-aliasing technology that eliminates jaggies, ladders and combs in an image 50% better than before. It also allowed color-gamut adjustment that accounts for the difference between how the eye perceives light and color and how the monitor reproduces them. In addition, it included a new, improved anisotropic filtering that reduced texture distortion by making dynamic adjustments to the texture image. The FX 5200 lacked that technology's Z-compression and color compression - and it could not have had them, since the chip simply was not powerful enough to implement them.
8 pixel pipelines - output of up to 8 pixels per clock. In our case (the 5200), only 4.
400 MHz RAMDAC - in the 5200 series, the digital-to-analog converter that drives the monitor ran at 350 MHz.
DDR II memory - instead of the progressive DDR II memory, the FX 5200 carried regular DDR.

Specifications of the NVIDIA GeForce FX 5200

Name                         GeForce FX 5200
Core                         NV34
Process technology (µm)      0.15
Transistors (million)        47
Core frequency (MHz)         250
Memory frequency (MHz, DDR)  200 (400 effective)
Memory bus and type          128-bit DDR
Bandwidth (GB/s)             6.4
Pixel pipelines              4 (2)
TMUs per pipeline            1 (2)
Textures per clock           4
Textures per pass            16
Vertex pipelines             1
Pixel shaders                2.0
Vertex shaders               2.0
Fill rate (Mpix/s)           1000
Fill rate (Mtex/s)           1000
DirectX                      9.0
Anti-aliasing (max)          SS & MS, 4x
Anisotropic filtering (max)  8x
Memory                       128 / 256 MB
Interface                    AGP 8x / PCI
RAMDAC                       2 × 350 MHz
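The headline numbers in the table follow directly from the clocks and widths. A quick sanity check in Python, using only the table values above:

```python
# Derive the FX 5200's headline figures from the table values above.

core_mhz = 250           # core frequency (MHz)
mem_mhz_effective = 400  # 200 MHz DDR -> 400 MT/s effective
bus_bits = 128           # memory bus width
pipelines = 4            # pixel pipelines
tmus_per_pipe = 1        # texture units per pipeline

# Peak bandwidth: transfers/s * bytes per transfer.
bandwidth_gb_s = mem_mhz_effective * 1e6 * (bus_bits / 8) / 1e9
print(f"Bandwidth:       {bandwidth_gb_s:.1f} GB/s")                      # 6.4

# Pixel fill rate: pixels emitted per clock * clock rate.
print(f"Pixel fill rate: {core_mhz * pipelines} Mpix/s")                  # 1000

# Texel fill rate: texture samples per clock * clock rate.
print(f"Texel fill rate: {core_mhz * pipelines * tmus_per_pipe} Mtex/s")  # 1000
```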

This chip could fairly be called a GeForce4 MX440-8x with DirectX 9 support. It was indeed a good update to the nVidia lineup - just a useless one at the time: games supporting DirectX 8 could be counted on one hand, and games with DirectX 9 support came out much later. By the time they began to appear, the GeForce FX 5200 series had become obsolete due to its low performance, and these video cards dropped heavily in price.