In my previous vintage low-end gaming system test, I benchmarked cheap graphics cards up to the end of 1999. In this test, I am going to build the typical low-end gaming rig of the early 2000s, up to 2004. This test will include graphics cards with low price tags up to 2004, but only the common and typical ones. I will not benchmark the pre-2000 cards again, as this time far newer games will be benchmarked. I will include one of the earlier game tests as a reference, since one of the important aspects of this test is to see how the newer cards behave compared to the older ones.
The changing game industry
The programmer-artists of the late 90s were slowly replaced by people who really wanted to develop games but lacked the basic technical knowledge of how to do it. The 2-3 person teams quickly grew into groups of dozens of people who licensed engines and content and relied heavily on scripting. The AAA games of this era typically ignored backwards compatibility and were only optimized for high-end machines. The casual gaming category became 2D-only, and the B class of gaming slowly disappeared once and for all. The community was replaced with more socially inept people, and the programmers left the field in droves for other industries, such as the banking sector and engineering fields.
Bankruptcy of 3dfx
3dfx cards were slowly becoming obsolete compared to the competition. The Voodoo3, released in 1999, still didn't support 24 or 32 bit (true color) rendering, and the maximum texture size on their cards was 256x256. In contrast, even the low-end cards from other manufacturers from the previous years supported at least 512x512 textures and at least 24 bit rendering. The former king of the 3D industry was dying. In 2000, the shareholders of 3dfx decided to liquidate the company. By 2000, nVidia was already one of the market leaders of the industry. They had the money to buy 3dfx, and by December 2000, nVidia was the new owner of 3dfx. The Voodoo cards were quickly removed from the market, and nVidia became the new king of the graphics card industry.
New graphics cards
nVidia, Matrox and ATi continued to release newer graphics cards. VIA bought up S3, and since then they have been making integrated graphics chips. SiS released the new Xabre and Mirage lines, but didn't deliver them in big quantities; they started to use them in SiS-based laptops instead, as integrated solutions. Trident was bought up by SiS. Alliance Semiconductor left the industry after the fiasco of the AT3D. Cirrus Logic left the industry as well. Intel also used its 3D accelerator line solely as an integrated chip on its motherboards. PowerVR left the PC industry for mobile phones. 3DLabs left the consumer graphics card market and started to sell CAD graphics cards. As nVidia, ATi and Matrox became the only surviving corporations, they no longer had to compete that hard, and there was plenty of room left behind by the departed corporations. We will see whether this also caused a rapid decrease in product quality at these corporations, regarding build quality, driver quality, scalability, and compatibility.
New processors arrive
In the high-end CPU segment, the Pentium3 processors slowly took over the lead from the previous generations. Intel didn't release the Pentium3 to the low end. The last low-end CPUs of AMD were the members of the K6/2 family for the Super Socket7 platform. The Athlon and Duron lines of AMD were released as high-end and midrange solutions, requiring new motherboards and new sockets. Intel released a low-end family called Celeron. The first Celeron processors lacked L2 cache entirely; the later models gained a small on-die L2 cache and were usually clocked at around 366-400 MHz. The Celerons were good overclockers, 466 MHz was easily achievable - sometimes even 500 MHz was stable. The Cyrix and WinChip lines were bought up by VIA and were mostly used as integrated solutions, but VIA later also released the C3 processor for Intel's Socket 370.
The low-end processor rigs
The AMD K6/2 chips were cheap, and a drop-in replacement for the older P1-based Socket7/Super Socket7 computers. The fastest models were 500 MHz in practice, but models with L2 cache and higher clock speeds were also available, for a premium price. The biggest problem of these Socket7-based computers is the very archaic AGP socket, which means little more than electrical compatibility with the AGP 1x/2x standard.
The Intel Celeron based solutions are more modern; if they have AGP, they can usually profit from it, but do not expect more than 2-3 extra FPS. Intel released several socket and slot types of Celerons. The original Celeron was released as a Slot-1 card, similarly to the early Pentium3 chips. They quickly released the Celeron for the new PGA370 socket. (There were socket-to-slot converters available as well, allowing these CPUs to be used in the earlier slot-based motherboards.) Later on, they released FCPGA Celerons alongside the new FCPGA-based Pentium3 processors, however in reality everyone had the cheapest PGA370-based solutions, unless they were relatively rich. After a good overclock, these Celeron-based systems were also able to reach 500 MHz. These chips were a little bit faster than the K6/2, but not significantly, and these motherboards and chipsets were also a tiny bit faster than the Super Socket7 based platforms of AMD. The VIA Cyrix3/C3 CPUs were also usable in the Celeron-compatible motherboards, and they scaled up to 700 MHz, however at 700 MHz these chips are barely faster than a 400 MHz Celeron.
The high-end processors were just not available
On paper, you were able to buy a 1+ GHz AMD Athlon or Pentium3 processor in every store by 2001. In reality, nobody had these processors, because they were too expensive. The Athlon used a new socket called Socket A, and even if someone had gathered the money to buy a fast Pentium3 CPU, he also had to buy a new motherboard, as despite using the same Socket370 pin-out, newer Pentium3 processors were not compatible with earlier motherboard revisions. Second-hand Athlon processors were basically not available in shops, and even the early, backwards-compatible, low-clockspeed Pentium3 CPUs were so rare that people had to win bids against dozens of others to be able to buy one. This forced low-end gamers to use the old 400-500 MHz K6/2 and Celeron processors, or sometimes the VIA C3 processors.
Performance of these low-end rigs
These 400-500 MHz K6/2, Celeron, and C3 processors were about two times faster than the previous Pentium1 MMX and Cyrix 6x86MX based solutions. They were cheap, so it was a logical choice to upgrade the old computer with them. However, some older programs, such as games, didn't like the newer processors and refused to run on anything other than a P1 MMX or Cyrix 6x86MX. Examples include various versions of Bleem and F1 1997. Some early graphics drivers were also glitchy, particularly on the K6/2. Windows 95 itself refused to boot on the higher-clocked versions of these CPUs, so users had to upgrade to Windows 98SE instead. Despite the two times higher performance, the games of this era suddenly required more than that, as they were now made by large teams of scripters and artists instead of programmers. Video card manufacturers were also targeting high-end and mid-range computers with their drivers. This made for a 4-year-long period without any ideal solution for low-end gaming. In this article, however, I have built a typical ghetto gaming rig of the early 2000s, and we will see how it performs.
The test machine
I have used the same Asus P5A-B Super Socket 7 motherboard, as that was still a typical motherboard of this era.
As a CPU, I replaced the Cyrix 6x86MX with a 500 MHz AMD K6/2 chip (which was a present from Yutani, a friend of mine).
I attempted to overclock the K6/2 to 550 MHz, but the system was not able to boot at that clock speed. When I raised the FSB from 100 MHz to 105 MHz, the system booted, but crashed after a minute. It seems AMD released these higher-clocked K6/2 processors with pretty much no performance reserves.
The memory I used was 768 MByte (3x256 MByte modules). This was not typical in the 2000-2004 time range, as even mid-range computers back then barely had more than 384 MByte of RAM. (Yes, if you are wondering, mid-range computers in the early 2000s typically used not 2 or 4, but 3 memory slots. That was normal back then.)
As a sound card, I left my old ISA OPTi card in the motherboard. I used a 6 GByte hard disk, which was typical in that era. A network card was also installed in the machine, along with a slow DVD drive.
nVidia TNT1
The nVidia TNT1 card was a huge success story for nVidia in 1998. The card was basically a nuclear preemptive strike against 3dfx, ATi and S3. This card - with the early drivers - runs very well even in early 166-400 MHz computers. The drivers require MMX though, so the card is not usable with a non-MMX CPU. The later drivers are crappy, so the early drivers were used. The driver I used is the first driver from the card's manufacturer, made specifically for this unit. This card has several iterations, but none of them was ever a cheap low-end card.
It was aimed at the mid-range and high-end, and no real low-end variant was ever produced from this card. nVidia certainly charged for the performance of this card. The chip was a significantly redesigned version of the Riva128. Despite being a card from 1998, it supports 32 bit rendering without too many slowdowns. Perfect D3D and OpenGL support, and it has 16 MByte of video memory. Every game from its era is fluid (when using the early drivers). Even after the end of this product's life cycle, the card didn't appear on second-hand markets in masses. The only reason I include it in the test as a reference is basically to have a comparison point of a properly working pre-2000 3D card.
ATi Rage 128 Pro
The Rage 128 Pro is the only card in this test which required a fan. Passively cooled Rage 128 cards are indeed prone to early failure. The fan on this card makes a disgusting noise (similarly to pretty much any other card from the era), but it's not too loud. This card has 32 MByte of SDRAM on it, and it was announced at the end of 1999. These cards were mostly produced in 2000, before they got replaced by the Radeon line. There are various generations of cards labeled Rage 128, and it can be tricky to find the right driver. The first driver I tried didn't work.
A driver released in October 2001 with the code name WME_R128_4_13_7192 worked properly, so I used that. Despite its name, this card has nothing to do with the Rage2 and Rage Pro chips from 1997 - those were 4 MByte cards for early AGP and PCI, while this card is even capable of AGP 4x if it is inserted into a motherboard with AGP 4x capability.
nVidia TNT2 M64/Vanta
After the success of the TNT1, nVidia released the TNT2 in 1999. This time, they made a low-end card from it as well, which was in production for a few more years. The TNT2 was advertised to be twice as fast as a TNT1, and the low-end variants were advertised to be just as fast as the TNT1. We will see if this claim is true or not. When the card renders multi-textured surfaces, it can render both texture layers within the same cycle, as it has two texture pipelines. In reality, the real TNT2 cards can barely ever be seen; instead, only these M64-based cards can be found everywhere. I have several M64 cards, but the card I tested had the best build quality. The other cards I had were designs with one or two caps, without even a passive cooler, sometimes with only two memory chips.
The memory interface on these is 64 bit. There is even a TNT2 Pro family with 32 MByte of RAM, still using the crippled 64 bit memory interface. It's a mystery why they called it a Pro, but it's probably intentionally misleading. nVidia started doing shady practices at this point in time: they began to artificially slow down older cards with newer driver upgrades. The card I had identified itself as a Vanta with 8 MByte of memory, and of course it only had a 64 bit memory channel. Originally, I thought this card had at least 16 MByte, but I was horribly wrong - they used some super old memory chips instead. I used the nVidia driver 2.08 for this test, as with later drivers the card becomes less usable. The sad thing is that these cards have no TV out. At least mine had a passive heatsink.
Matrox G450
The Matrox G200 card was very fast. It mostly beat the S3 Savage4, the Voodoo1 and even the Riva128 when it appeared. When the card was released, it was on par with the competition in the high-end and mid-range, and had very decent drivers with good compatibility. After the success of the Matrox G200, Matrox released two more chip revisions based on the same core. These products were the G400 and the G450. Matrox didn't upgrade the chips significantly; the main point of the G450 was making the card cheaper than the previous ones. The G450 had an AGP 4x interface (which is of course backwards compatible).
It has dual monitor output, but lacks a TV out. These cards appeared in masses on second-hand markets in the early 2000s. The G200 indeed ran very nicely even in computers below 200 MHz. We will see how well the G450 drivers perform. The driver I used is version w9x_682, which was basically the only driver I found. Interestingly, the card has a grounding strap on the passive heatsink, shielding the chip from external interference. Matrox was advertised to have very nice 2D quality, but regarding the RAMDAC, this card was not notably different from the other cards in this test.
nVidia GeForce4 MX440
Despite its name, this card is basically a GeForce2 with an AGP 8x capable interface on it. The card has nothing to do with the GeForce4 Titanium, which was a high-end gaming card. nVidia hadn't released any low-end cards from the proper GeForce4 family, so they decided to make a low-end GeForce4 from modified GeForce2 chips. As this card is GeForce2-based, it obviously does not support pixel shaders. The card I have has 128 MByte of RAM on it, which was a very large amount of memory (even the generic GeForce4 Titanium cards only had 64 MByte of memory on them). Luckily the card is passive, and it needs driver version 45.x or above to be recognized by the OS.
nVidia stepped up its shady practices when it introduced this mutant GeForce2 - a chip two generations old by that time - under the GeForce4 brand. The people who bought it were clearly surprised when they realized this card does not support shaders. Despite all this, the card has relatively decent build quality, and it was relatively popular. As it is only compatible with a relatively modern driver package, the performance on vintage computers is not expected to be fantastic.
ATi Radeon 9250
The market share of ATi was slowly growing, and nVidia started to lose ground. ATi released its new DirectX 9 compatible chip in 2002, which supported Shader Model 2.0. The Radeon 9250, however, does not support DirectX 9 in hardware, as it is a DirectX 8 card. ATi did a similar trick to what nVidia did with the GeForce4 MX: they filled the low end with older cards. ATi, however, only went back one generation, essentially renaming the Radeon 8500 to 9250 after adding support for the new AGP 8x standard as well. This caused a smaller backlash than nVidia's idea to rename the GeForce2. The memory bus was cut to 64 bit, and most of these cards were released with 128 MByte of memory. At least my model has a passive heatsink, and it supports TV out as well.
Half of the components originally planned for the card were cost-cut later on; the build quality is cheap overall, but there were no issues with signal quality or stability. By this time, ATi had switched to a unified driver model (similarly to nVidia), so they started to support all the Radeons with the same driver packages. The driver I used for this card was 4.2, which recognized the card as a Radeon 9200. Back then, the Radeon cards I tried in my low-end computers performed terribly. Let's see if my memories about this card were correct.
GeForce FX 5200 AGP
nVidia was unprepared for the release of the ATi Radeon 9700 and 9500 cards. Their product was far from finished. nVidia was working on their DirectX 9 capable hardware, based on their previous GeForce4 Titanium cards. nVidia was about to ditch TSMC and was preparing to partner up with IBM to manufacture the chips. As the corporation was not really focusing on optimizing the drivers, especially for low-end computers, they suddenly had to do a lot of work very quickly. The FX was barely ready: they basically tossed the new DX9-capable shader units into the GeForce4 Titanium core and started to manufacture the chips prematurely on IBM's manufacturing lines; nVidia later regretted this decision and went back to TSMC. To see the contrast, the ATi Radeon 9700 and 9800 cards used a 256 bit memory bus, while the nVidia GeForce FX 5800 only used a 128 bit bus, so they had to overclock the memory and the chip significantly to catch up with ATi's performance.
But those cards are not discussed in this article, as this test only focuses on low-end cards in low-end computers. And indeed, the FX5200 was a low-end card, mostly shipped with a 64 bit memory bus and 64 MByte of memory. The GeForce FX 5200 is a direct competitor of the Radeon 9200 line, but it supports DirectX 9 in hardware. This card was also usually built with passive cooling. The initial driver I tried was too unstable, so I ended up using driver version 55.56 (but 53.03 could also be usable).
GeForce FX 5200 PCI
I was able to get an FX 5200 PCI edition card, which has a PCI connector instead of an AGP connector. I have down-clocked this card a little bit to preserve its life (but in this test, I ran it at full clock speeds for comparison). The PCI edition is rare, and it will not run in 486-class computers.
This card is also a 64 bit card, and it has 64 MByte of RAM on it. I was curious to measure it against the AGP variant, to see whether there is any difference in practice in these low-end computers. The build quality of this card is very high: the card is heavy, and it has good quality solid (dry) aluminum caps. We will find out what happened if the computer had no AGP slot and someone bought this card.
Unreal Tournament
Unreal Tournament was a popular FPS game. The game offers compatibility with several 3D APIs, such as D3D, OpenGL, 3dfx Glide and S3 MeTaL, and it even has a software renderer. The game should run well on anything above a 400 MHz CPU. The game was popular, and it was played at LAN parties and in schools. As we can see from the results, the newer graphics cards scaled so badly in this game that they fell behind the TNT1, except the Matrox G450.
Unreal Tournament 2003
The new version of Unreal Tournament didn't become that popular. I remember playing it at LAN parties for a while, but I never tried to play it alone. We can see that the creators of this game were more like level designers and scripters than actual programmers.
The game is about 4 times slower than its predecessor, and the levels are well designed, but at the same time quite boring. We can see that the game engine struggles on older cards, but it is also heavily CPU-limited, so the game never becomes playable on this computer, regardless of the graphics card used.
F1 2002
Formula1 2002 was released by EA. This game shows what happens when professionally inadequate people keep scripting on top of a random 3D engine until it requires a $4000 PC to run the game above 15 fps. Despite looking like something from 1993, I remember this game stuttering below playability even on a 600 MHz Pentium3 computer.
My favorite part is when it shows a Formula1 race car in the main menu spinning at 1 fps, and you have to press the mouse button for a few seconds for the click to be registered. A 1.1+ GHz CPU is needed for this game to be playable. It takes 2-3 minutes to load a race.
Warcraft 3
This game was released by Blizzard. It's the sequel to the popular Warcraft2. They started to add some RPG elements to the game, and a complex story. This game was not bad, but I was disappointed when I first tried it. It lacked the atmosphere of the Warcraft2 games, the music and sounds are boring and unfitting, and the 3D is more an issue that bothers you than a feature that makes it look nicer.
Despite the game only using a few dozen models and textures, it takes an eternity to load. Once loaded, the game runs on the edge of playability. The game favors the newest nVidia cards, and with a slightly newer sound card and a little fine-tuning in the BIOS, it could reach 15 FPS, which would be a very playable experience.
Jedi Academy
The game was scripted together by a random script-juggler game designer studio on top of the Quake3 engine. The only problem happened when they hired the graphics designers: they forgot to explain to them that they should not use too many polygons, otherwise the game will run at 4-5 fps due to the limitations of the AGP port bandwidth and the CPU's vertex transformation performance.
The result is a solid 4 fps when more than one character walks around on the screen, and it does not matter what graphics card you use, the game is unplayable. It is a similar mistake to F1 2002: you need a computer above 1 GHz with AGP 4x to play this game. After releasing these Star Wars titles, the series was put on hold by the studio, probably after they learned that their snowflake game wouldn't run on 99% of computers.
Colin McRae Rally 2
This was a very good car racing game, released in 2000. The game was written by real programmers, so it is small and scales properly. Newer video cards in low-end computers, however, have a problem running it. The old cards will run this game without a problem, but the newer cards will require at least a 600 MHz Pentium3 to deal with this game.
When using an older TNT1, Riva128 or Matrox G200 video card, the game runs fluidly even on a 250 MHz Cyrix 6x86MX; otherwise it's a total disaster. The game also runs on 3dfx cards, but the tree textures are sometimes buggy on those.
Colin McRae Rally 3
It seems the real developers were kicked out and replaced by a javascript fan club, because this game does not even start unless you have the latest shiny graphics card in your computer. Well, actually it started on some of the other cards as well, only to crash to the desktop even while navigating the menu.
When the game runs, it runs at a miraculously unplayable 6 fps. In this respect, it is worse than F1 2002 and Jedi Academy combined, as those at least start. The game was so bad that after this title was released, the WRC (which licenses the rally championship) immediately revoked the license from this group.
Prince of Persia: The Sands of Time
This game was released in 2003 and it simply refuses to run on anything that's not DirectX 9 compliant. I don't know why the developers decided not to support older cards, as the game doesn't even look that nice. Once it starts, it runs pretty fluidly at the lowest settings, though.
If someone bought a Sound Blaster Live! 5.1 sound card, that would free up some extra CPU time, allowing this game to run more playably, averaging above 18 fps. The game would indeed run very nicely on a Pentium3 without too much trickery.
Croc2
Croc2 was actually released in 1999. It's the sequel to Croc1, which is a very good 3D platformer game. Croc2 is not as good as the first one, but it is still usable and worth trying. I included this game to have a test with an older game as well. This game will tell us how these new cards scale when we use them to run older games. And indeed, the newer the card, the slower this old game runs.
After a certain nVidia driver version, the game does not even run any more, it simply crashes to the desktop. We must note that the 11 fps the GeForce FX produces in this game is similar to the performance of a 4 MByte S3 Virge DX from 1996 in this game at 400x300, while even a Matrox G200 can run it above 20 fps. This is of course the failure of nVidia and ATi, and not a problem with the developers of the Croc series.
The results
We can already see the problems of low-end gaming in the early 2000s. Video chip developers decided not to support low-end systems, and the cards only run properly in super expensive systems. The era of games written by programmers ended, which resulted in a new era of games where loading a scene takes 2-3 minutes, and the games run at only 6-7 fps on non-high-end computers, if they run at all. This meant that if someone wanted to play new games in the early 2000s, he was not able to do so on a low-end machine. Only high-end machines were capable of running the new generation of games, regardless of the settings, and there was no technical reason behind this; it was the fault of the software developers and the video card developers together. If someone wanted to play older games, he had to buy an older graphics card, such as the TNT1, Riva128, 3DLabs Permedia2, or 3dfx Voodoo. If someone wanted to play newer games, he was usually not able to do so, as only a few games ran properly even on mid-range computers, regardless of the video card.
nVidia TNT1
The nVidia TNT1 is able to run pre-2000 games fluidly when paired with a 500 MHz K6/2 CPU. In more modern games, it can usually keep up with far newer cards from other manufacturers as well. In a Super Socket 7 machine, using the TNT1 is probably the best idea when the goal is to play games from 1998-1999. The card can't run more modern titles properly, but even the far more modern cards fail this task miserably, therefore there is not much sense in replacing the TNT1. The only problem is that the TNT1 is very rare, as it was an expensive high-end card, and even nowadays you can't simply find one easily.
Rage 128 Pro
Where the TNT1 was just able to reach playability (20 fps and above), the Rage 128 Pro always fell behind it by at least 3-4 FPS. This few percent performance deficit is not significant in itself, but it always happens in exactly those situations where the FPS drops from fluid playability to stuttering. ATi didn't really optimize the drivers of this card to perform well in low-end computers, but it still outperforms some cards, like the MX440. The biggest problem of this card is that it wanted to be smarter than the user, and fixed the refresh rate at 75 Hz during gaming. My monitor does not support this, and there was no option in the driver to force the card to 60 Hz. The drivers are as basic as if it were 1993 and Windows 3.1; ATi was certainly focusing on useless things and ignored critical problems like this. The Rage 128 Pro is not a good card, but it isn't a bad card either; shuffling around the drivers could give better results than this. Which is sad, because the age of the card would justify more competent performance on a K6/2.
nVidia TNT2 Vanta/M64
The TNT2 M64, which was supposed to be a good and fast alternative to the TNT1, is actually 30% slower than the TNT1 it was supposed to replace. In some cases the deficit of the TNT2 M64 is almost 50%, which makes this card really only ideal as a cheap movie playback accelerator card for OEMs. I have serious doubts whether a model with more memory would help, as the card is easily outperformed even by Riva128 cards with 4 MByte of VRAM. This card was indeed not a good experience to use, but at least some older games can run on it at near-playable frame rates.
ATi Radeon 9250
ATi didn't optimize the drivers of the 9250 for low-end computers. Judging from these numbers, you were better off buying a TNT2 M64 for your low-end computer than using the Radeon 9250. The card is basically totally useless: the old game titles will not run properly, and the new ones need far stronger computers than a low-end 500ish MHz machine. This makes the 9250 far from an ideal product. Although it was faster than the FX5200 in modern games on high-end computers, it's hard to imagine someone buying this cheap OEM card for his 1.4 GHz Athlon flagship for gaming.
nVidia GeForce4 MX440
nVidia scammed you out of your money when they sold you this rebranded GeForce2 card. The card is the worst in the test. The drivers produce glitches in most of the games. The card had to be re-seated about 10 times in the motherboard to even be detected. The driver simply ignored the refresh rate settings and always wanted to output 85 Hz, despite the monitor not even supporting that refresh rate. nVidia released this card without even really writing a driver for it; they just used the GeForce2 backend in a driver set which was optimized for $2000 high-end computers. The results of this 128 MByte low-end gaming card are almost identical to those of a 4 MByte S3 VirgeDX performance-wise, but it supports even fewer games than the Virge. Old games won't run at proper FPS, and new games will not even start. This card caused an exodus of users from nVidia, right into the arms of ATi when the Radeon 9500 was released. If there is a worst card in the history of nVidia, then clearly the MX440 is the winner of that competition. Did I mention it can run fewer games than the original GeForce2, and that I had to try two drivers before I was able to run at least half of the games without crashing the computer?
Matrox G450
When running older games, the G450 is slower than the TNT1 or the previous G200 (not included in this test, but I tested it previously). The performance of the G450 is not bad regardless: it can run older games almost as well as the TNT1, so this card, similarly to the G200 or G400, was totally fine for a low-end early 2000s gaming PC if the goal was mostly to run older games. Sadly, this card can't offer good performance in newer games, therefore someone who bought this card to play early 2000s games was not really able to get a good experience, but at least the old games ran properly, so Matrox didn't scam him like nVidia did with the MX440.
FX5200 AGP
The performance of the FX5200 is very bad with the older games, but it can run the newer titles quite well. "Quite well", of course, translates to 10ish FPS, which is certainly not much, but the other cards are either far behind that, or they can't even run the newer titles. This means that the FX5200 AGP was actually a quite good choice for an early 2000s low-end gaming computer if a person wanted to play the newer games, as the card will either outperform the ATi cards, or it will run titles which the ATi cards (or earlier TNT cards) can't even run.
To gather a few extra FPS in games, the user should replace the sound card with a Sound Blaster Live! 5.1 card (a model which supports hardware accelerated sound playback), which gives the system enough stamina to gain an additional 2-3 fps in games, by relieving some burden from the IO system and sparing some extra CPU time for the video card drivers and for the games. After doing this, games like Prince of Persia and Warcraft 3 will go from barely playable to almost good playability. And of course various other games, which weren't measured right now, will also rise from 11-12 fps to 15+ fps. If you are lucky, you can set your FSB from 100 to 110, which gives you another FPS for free, and disabling unused ports and interrupts in the BIOS (COM ports, LPT port) gives you yet another extra FPS.
FX5200 PCI
If your computer had no AGP port, but the PCI version worked in your low-end computer, you are very lucky, because the performance is the same as the speed of the AGP model. The PCI models are usually passive; nowadays they can be tricky to find, as the AGP models are far more common. The PCI models were not manufactured in masses, so only a few manufacturers even attempted to produce such cards. The PCI model will also scale similarly to the AGP model if you buy a more modern sound card, start playing with the FSB, and disable unused IO devices in the BIOS. In some situations, the PCI card even outperformed the AGP card for me, but that was only because the ALi chipset used on this motherboard has very sluggish AGP support. In a Celeron or P2/P3 based system, the AGP version would be just a tiny bit faster than the PCI version.
Combined results
The chart shows the average FPS: the sum of the results of all the tested games, divided by the number of games. The TNT1 won, and only the GeForce FX offers comparable performance (by making the newer generation of games run more properly). The MX440 is probably the biggest disaster of nVidia. The TNT2 M64, which was supposed to replace the TNT1, is about 30% slower in reality, at least in these low-end computers. The numbers would be significantly different if a more modern Pentium3 had been used to perform these tests, but the drivers are so unoptimized that low-end Celerons and early P3s would still show the same results.
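To make the averaging method explicit, here is a minimal sketch in Python of how such a combined score is computed - the per-game FPS values of each card are summed and divided by the number of games. The card names and numbers below are placeholders purely for illustration, not the measured results:

# Minimal sketch of the combined-score calculation described above.
# The FPS values are placeholders for illustration, not measured data.
results = {
    "Card A": [30, 4, 3, 9, 12, 25, 2, 14, 22],
    "Card B": [18, 6, 7, 11, 14, 16, 6, 18, 11],
}

for card, fps_per_game in results.items():
    average = sum(fps_per_game) / len(fps_per_game)
    print(f"{card}: {average:.1f} average FPS over {len(fps_per_game)} games")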
TL;DR
In the early 2000s, nVidia and ATi stopped doing the math and started doing the meth. They released fake $100 low-end gamer cards, which ran in low-end computers two times slower than a 3dfx Voodoo1 that was about 8 years old by then. Talentless script-juggler modders wrote games that ran about 20 times slower than the games of the 1990s while looking worse. Fanboys clapped as if everything was fine, and they still do to this day.
Beautiful