Low end 3D cards from the 90s

Written by Geri
3 years ago
Topics: Review, Programming

In this article, I will showcase and test the low-end 3D video cards of the late 90s. The cards will be used in a period-correct computer, and only cards that were available at low-end prices are tested. The market of that era was quite different from the modern one: there were far more manufacturers making 3D chips, and the market for these early chips was much more fragmented. This article only discusses the low-end graphics cards. The technology was still very new, 3D chips evolved rapidly, and the new AGP bus had just been released. The goal of this article is to explain what low-end video cards you could actually buy in the late 90s, not to list every video card that was manufactured at the time. It is just like today: even though the nVidia GeForce RTX 3080 exists, if you want to play games, the chances that you will buy some $50 video card that is several generations old are far higher than the chances of buying a high-end card released a few months ago.

The story begins in 1996

Various 3D cards were released in 1996 and 1997. These early consumer 3D cards quickly became obsolete, and they became the new low end. It's important to note that this article covers the end of the 90s, up to about 1999. In 1999 you still had to use the low-end cards originally designed in 1996 and 1997, and there were multiple reasons for this. Even if you could, in theory, buy a more modern 3D card of the era, several factors kept you from using more modern cards in your computer.

Your computer was still a 486

Unless you were rich, you only had a 486 or an early Socket 4/5/7 machine around the end of the century, sometimes even in the first years of the 2000s. You were lucky to even have a PCI slot and enough memory sockets to upgrade the RAM. Even if you had an AGP port, it was a first-generation AGP port, which means that only the very first generation of AGP cards would work in it properly. So, on paper, fantastic AGP video cards already existed in 1999, but they were either very expensive or you simply could not fit them into your computer.

Getting a 3D card was tricky

Production of the 3D cards of the late 90s was focused on high-end and mid-range computers, so every solution from that era targeted the newer Celeron and Pentium III lines. The new Athlon generation was also appearing on the market. Most of this new generation of 3D cards did not work properly in late 486 boards, or even in Socket 7 and Super Socket 7 motherboards, and the ones that did work properly in older computers were very expensive. This means that owners of low-end computers were forced to go with low-end video cards from the previous years.

S3 Virge

The S3 Virge was the world's first real 3D video card. It was released in 1995 and was manufactured for years. The original S3 Virge was a very popular card, and it used the PCI slot, which made it a very generic solution. The cards usually shipped with 4 MBytes of video memory, but the cards with only 2 MBytes were upgradeable to 4 MBytes, as they had blank memory sockets. To get usable 3D, the owner had to upgrade the card to 4 MBytes. The original price of the card was about $200, which dropped to $160 in the following months, and the prices kept falling quickly. The card was available for about $10-15 around 1999 on the street markets, but you could still get one brand new for $20-25 by then.
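To see why 2 MBytes was not enough for usable 3D, here is a rough sketch of the frame buffer arithmetic. It assumes double buffering with a 16-bit color buffer and a 16-bit z-buffer, which is a typical setup for these cards; real drivers divide the memory somewhat differently, so treat the numbers as ballpark figures only.

/* Rough VRAM budget sketch: front + back color buffers plus a z-buffer,
 * all at 16 bits per pixel. Whatever remains of the card's memory is what
 * is left for textures. Figures are illustrative, not measured. */
#include <stdio.h>

static unsigned long buffer_bytes(unsigned long w, unsigned long h)
{
    unsigned long color = w * h * 2;          /* one 16 bpp color buffer */
    return 2 * color + w * h * 2;             /* front + back + 16-bit z */
}

int main(void)
{
    const unsigned long res[][2] = { {320, 240}, {512, 384}, {640, 480} };
    const double vram_mb = 4.0;               /* a 4 MByte Virge */
    size_t i;

    for (i = 0; i < sizeof res / sizeof res[0]; i++) {
        double used = buffer_bytes(res[i][0], res[i][1]) / (1024.0 * 1024.0);
        printf("%lux%lu: %.2f MB for buffers, ~%.2f MB left for textures\n",
               res[i][0], res[i][1], used, vram_mb - used);
    }
    return 0;
}

On a 2 MByte card, the 640x480 buffers alone would eat almost everything, which is why the upgrade to 4 MBytes was practically mandatory for 3D.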

(picture: S3 Virge)

Do not forget that we are talking about 1999-ish prices here, and $10 in 1999 was worth far more than it is now, but it was still affordable. The card was designed to run 3D games in 320x240, and the speed quickly starts to fall above that resolution. In exchange, it supports up to 1024x768 on the Windows desktop in 16-bit mode, or up to 800x600 in 24-bit mode. To keep costs low, S3 designed the Virge to be pin-compatible with their previous Trio64 chipset, which meant manufacturers didn't have to come up with a new circuit design to build Virge cards. The design of the card is very simple: just the chip, the video RAM, and a few capacitors. Basically, that's it. Early, non-standard 3D cards for arcade systems used multiple chips; S3 followed a different approach. When they designed this card, they decided to create an affordable, 3D-capable chip for the masses. This was a massive difference from other cards. S3 wanted to sell the cards in volume, so they designed the card to be just as cheap as the previous 2D-only cards. By 1995 the technology was ready to stuff the required number of transistors into one chip. The release of the card shocked the video card market. S3 single-handedly bankrupted some of the earlier video chip manufacturers, such as BOCA, Tseng Labs, Everex, and other big names of the era. Nearly all of the 20 biggest manufacturers of the era bit the dust, and only a handful survived, such as ATi, which was also just about to release its first 3D chip.

S3 Virge DX

The Virge DX is a little bit faster than the original Virge. The chip was designed to run 3D workloads up to 400x300. The 2D capabilities were also fine-tuned a bit, allowing 1024x768 in 24-bit mode. It's important to note, however, that low-end computers had monitors only capable of up to 640x480, and 800x600 was already the territory of mid-range machines. The Virge DX used the same PCB as the previous Virge and had the same pin-out.

(picture: S3 Virge DX)

The card is about 30% faster than the first Virge, but we will measure in the tests exactly how much faster it is. This card was also available in PCI only. It runs a tiny bit warmer, but still requires no heatsink. S3 noticed some performance bottlenecks when using texture filtering and alpha blending, and they were also able to speed up the perspective correction calculations. The refined card was also released with 2 MB or 4 MB; the 2 MB variant was upgradeable to 4 MB, which you had to do if you wanted to use it for 3D. The build quality of DX cards is notably cheaper, as these are later cards. They have smaller capacitors and electrical components, but of course this depends on the card's manufacturer, not on S3 itself.

S3 Trio3D PCI

The Trio3D PCI, or more precisely the Trio3D/2X family, was released in 1999. This card came out in the era of the first GeForce, which is about 10 times faster, but there was logic behind releasing it. The Trio3D PCI was available very cheaply to OEMs, and it quickly became available on second-hand markets for approximately the same price as the Virge.

(picture: S3 Trio3D PCI)

Despite the name, the card is actually a Virge on steroids: it's the same chip with a small redesign. It was released in 4 and 8 MByte variants. Under 3D, only 4 MBytes are usable; the rest is reserved for 2D desktop objects. The card is about 30% faster than the previous Virge chips, and it supports 24-bit rendering in 3D. The card requires no heatsink. It was designed to run 3D games up to 512x384.

S3 Trio3D AGP

This is the AGP variant of the previous card, meant for cheap OEM computers. Similarly to the PCI edition, it quickly became available in volume, for the same $10-15 price tag on second-hand street markets by the end of 1999, while the brand new version was available for about $25. The AGP variant is compatible with first-gen AGP motherboards, such as the first Super Socket 7 motherboards, Socket 370 motherboards, and Slot 1 boards.

(picture: S3 Trio3D AGP)

The drivers work properly even in DOS, with decent backwards compatibility with DOS games as well. Similarly to the PCI variant, it's available in 4 and 8 MByte versions. I am curious to see if there will be any difference in performance compared to the PCI version. As I already mentioned for the PCI variant, the Trio3D supports 24-bit rendering as well. I don't know which part of the memory segment stores the frame buffer, so this feature will not be tested or discussed for the Trio3D; that would probably be worth a whole separate article.

S3 Savage4

I hesitated for a while about whether to add the Savage4 to the list. The S3 Savage4 was manufactured in 1999, and it was meant to compete with the cheap nVidia cards of that time. The Savage4 is not optimized to work well on vintage computers, but it does in fact work. The smallest and cheapest Savage4 variant is the 8 MByte AGP edition, which was available for $50-60 brand new at the end of 1999, which convinced me to include it in the test even if it's a little bit out of era. S3 had been working on the Savage products in secret for a while; the first iteration was actually the Savage3D, which I was not able to find. The Savage4 is not much faster than it, so it does not really matter anyway.

(picture: S3 Savage4 AGP)

The card is also available with 16 and 32 MBytes of video memory. The bigger models are far more expensive, but the smallest model fits this test. The card was optimized for more modern systems, but there are no issues with Socket 7 systems - at least, no issues with booting up. However, this card was optimized for Pentium II-class computers and above, so I don't know yet how it will perform. Despite these concerns, the card will be tested, as I happen to have an 8 MByte AGP version, which can even be jumpered between AGP 2x and 4x mode, which is a very neat feature for backwards compatibility. The card does not belong to the Virge/Trio family; it's brand new technology from S3. They added OpenGL support as well, and the card needs a passive heatsink. PCI versions of the card exist, and the card reportedly works in 486, 5x86, and Socket 3/5 based computers as well, so it can be plugged into vintage computers of all sorts with a PCI slot. The card is more than 5 times faster than the previous Trio3D, but due to the limitations of the drivers, it could be problematic to unleash this performance on low-end pre-1999 computers.

ATi Rage2+ PCI

The ATi Rage2+ is a competitor of the Virge graphics cards. It was originally released in 1996. The original Rage was unusable for 3D, as it usually had 2 MBytes of VRAM and was not capable of z-buffering in hardware. ATi fixed this issue in the Rage2 and Rage2+ family. The Rage2+ usually had the required 4 MBytes of memory, and it was originally designed to allow gaming in 640x480. ATi didn't let S3 harvest the whole low-end 3D market for itself. They had been working on their 3D chip for a while, and whenever S3 released something new, ATi also released newer and faster models; they kept pace with each other. The card indeed tolerates 640x480 better than the Virge, but its vertex processing is weaker. This puts it in direct competition with the Virge, which has slightly stronger vertex performance but scales very badly with resolution. The speed of the two cards is approximately identical.

(picture: ATi Rage2+ PCI 4 MByte)

The Rage2+ has Windows NT drivers with OpenGL support, but under Win98 only D3D is available. The PCI variant was available for about $10-20 second-hand at the end of 1999, and it requires no heatsink. The card coughs in some PCI slots; the plating of the connector is somehow not perfect, which means the owner sometimes has to wiggle it a bit to make the system boot. Also, the screw holes seem to be half a millimeter off, and you have to force the card into the slot somehow. Once it starts up, the card is stable. I don't know if the card supports 32-bit mode in 3D gaming, but with 4 MBytes of memory it makes no sense to try using it in 3D even if it does support it; it can at least do it on the Windows desktop. ATi followed a similar philosophy to S3: they aimed for cheap cards. The 4 MByte model uses eight half-MByte EDO memory chips, similarly to the Virge. This was very high-end for 1996, and the card somehow feels very high-end. My model also has the chip revision with DVD playback acceleration - whatever that means; it probably either has an MPEG-2 (H.262) decoder built in, or it can just copy the decoded frames very efficiently into its frame buffer to display the movies. Of course this feature needs ATi's own video player program, and probably nobody ever used it. Instead of this decoder, the card really could have used those transistors to boost the 3D performance, but it already has 5 million transistors, which is already overkill. These early GPU designs were indeed inefficient with their transistor counts.

ATi Rage2c AGP 4 MByte

The Rage2c is the AGP version of the Rage2 family. The card performs essentially the same as the PCI Rage2+, but I was curious about the difference. The card was available for $10-15 on the second-hand market, and brand new it was priced about the same as the Trio3D AGP, around $25-30. The Rage2c and Rage2+ chips are essentially identical.

The card is capable of running in early motherboards with AGP as well. It requires no heatsink. Unlike the Rage2+ PCI, it has better quality connector plating, so it usually works on the first try. This was ATi's first AGP card, and unlike the later ones, this era of Rage cards rarely dies, so all of the ones I currently have still work without any problem.

ATi Rage2c AGP 8 MByte

This is similar to the previous card, but it has 8 MBytes of memory, which would make it possible to try 32-bit mode. It was $5 more expensive than the 4 MByte model and sold for about $35-40. Sadly, the extra RAM in reality barely makes any difference... Or does it? We will see, as I have gathered both the 4 MByte and the 8 MByte versions of this card.

The 8 MByte model looks similar to the 4 MByte model; the only difference is the number of memory chips soldered onto the PCB. The 8 MByte model does signal higher quality, but the chip is the same as on the 4 MByte model.

SiS 6326

The SiS 6326 is the first 3D card made by SiS. The card was released in 4 MByte and 8 MByte variants. I only have the 8 MByte version, so I will test that. It was released for both the AGP and PCI buses. The AGP version is a little flaky on some vintage motherboards; it needs the AGP drivers to be installed, otherwise instability can be observed on some motherboards, such as the ones with ALi chipsets.

(picture: SiS 6326 8 MByte AGP)

The card was about $40 brand new in 1999. It was designed to run games in 512x384, and in some cases even in 640x480. The card was released to compete with the ATi Rage2 and S3 Virge series. It had a long life span, as it was integrated onto a lot of motherboards. Later on, SiS released drivers even for Windows 2000 and XP, and they released an OpenGL 1.0 capable driver for Windows 9x. The card requires no heatsink, and it's a very simple design. It supports 16 and 24-bit graphics modes both in 2D and 3D. Besides this, the card has very good DOS support; its integrated VBE can run most DOS games as well. 24-bit rendering will not be tested in this article, but outside of this test, I have checked it.
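As an aside, "good VBE support" simply means the card's video BIOS answers the standard VESA calls that DOS games use to switch video modes. Below is a minimal sketch of such a call, assuming a 16-bit DOS compiler that provides int86() in dos.h (Turbo C or Open Watcom style); mode 0x101 (640x480 in 256 colors) is just an example.

/* Minimal sketch of a VESA (VBE) mode switch, the kind of call DOS games
 * make against the card's video BIOS. Assumes a 16-bit DOS compiler that
 * provides int86() in dos.h (e.g. Turbo C or Open Watcom). */
#include <dos.h>
#include <stdio.h>

static int set_vbe_mode(unsigned mode)
{
    union REGS r;
    r.x.ax = 0x4F02;          /* VBE function 02h: set video mode */
    r.x.bx = mode;            /* 0x101 = 640x480 in 256 colors */
    int86(0x10, &r, &r);      /* call the video BIOS */
    return r.x.ax == 0x004F;  /* VBE reports success as 004Fh in AX */
}

int main(void)
{
    if (!set_vbe_mode(0x101))
        printf("VBE mode 101h is not supported by this BIOS.\n");
    return 0;
}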

Cirrus Logic Laguna3D

The Cirrus Logic Laguna3D was meant to be a high-end card, but it quickly turned out to be a failed product. It was the first and last 3D card from Cirrus Logic, and it caused their video card business to fold. The card was too weak for its initial price tag, so there were barely any buyers. Due to this, it was not manufactured in large quantities, and the price eventually had to be torn down to the low end.

(picture: Cirrus Logic Laguna3D 4 MByte AGP)

I have included it in the test because the remaining supplies of the card were dumped onto second-hand markets for a few months in the late 90s, so it fits this test, even if it otherwise stays a rare find. Cirrus Logic decided to use Rambus memory on the cards, 4 MBytes of it. The chip in theory supports 6 MBytes of video memory, but no such card was ever released. The card supports both 16 and 32-bit rendering. AGP and PCI versions were released; I only have the AGP variant, so the test will include that. The chip was released to beat the Virge and Rage cards... Can it do it? We will find out. The card requires no heatsink, but the build quality feels very high-end anyway.

nVidia Riva128

The Riva128 is the second card from nVidia. The first one was removed from the market very quickly and almost caused nVidia to go bankrupt. The Riva128 was released as a mid-range card, priced at about $200 at launch. The prices only fell a few years after release, by the time the card was about to disappear from the market, but there were at least a few months when the Riva128 was available brand new for $40-50-ish to clear the stocks, which makes it fit this test very well.

(picture: nVidia Riva128 PCI 4 MByte)

The original card was released in 1997 for PCI, and AGP versions were released later as well. The original card supported 4 MBytes of RAM; the AGP cards were released with support for up to 8 MBytes of memory. I haven't seen any of these 8 MByte models in person yet. Some manufacturers released it with 2 MBytes plus an expansion module, which is of course the useless version. The Riva128 is nVidia's only backwards-compatible card that works in low-end 90s machines, as the later ones will not work properly with a 486 or an early P1-class CPU, because the Windows drivers need MMX. The early drivers for the Riva128 are terrifying, and the raw hardware is not even that impressive, but nVidia did a good job squeezing the performance out of the card with strict driver maintenance policies. VIVO cards with video in and out were also released, and the card supports proper OpenGL at version 1.1+. The card has a passive heatsink, but some models lack it. This card is not meant for DOS; even the fonts in DOS look like a child drew them in MS Paint when you look at the analog video output. This could probably be fixed with a video BIOS upgrade, but I will not bother doing it, to preserve the original state of the card. nVidia pushed the other manufacturers to their limits with this card; we will see in the test how it compares to the others. After the fiasco of nVidia's first card, the NV1, they only had one more chance, and they pulled the ace. The first triangle-based chip from nVidia was brutal by the standards of the era, and the prices were not even too high.
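The MMX requirement is essentially a CPU feature check: a driver or installer asks the CPU via the CPUID instruction whether MMX is present, and bails out on a 486, a non-MMX Pentium, or a K5. Below is a sketch of such a check, assuming a GCC/Clang-style x86 compiler; it only illustrates the mechanism and is not nVidia's actual code.

/* Sketch of the kind of feature check that locks late-90s drivers to MMX
 * CPUs: CPUID leaf 1 reports MMX support in bit 23 of EDX.
 * Assumes a GCC/Clang-style x86 compiler providing <cpuid.h>. */
#include <cpuid.h>
#include <stdio.h>

int main(void)
{
    unsigned int eax, ebx, ecx, edx;

    if (!__get_cpuid(1, &eax, &ebx, &ecx, &edx)) {
        printf("CPUID leaf 1 not available (very old CPU)\n");
        return 1;
    }
    printf("MMX is %s on this CPU\n",
           (edx & (1u << 23)) ? "present" : "missing");
    return 0;
}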

3Dlabs Permedia2

The Permedia2 card was meant for workstations and CAD/CAM applications. Despite this, the card appeared on the second-hand market for cheap, and the Direct3D drivers are good for gaming as well. However, this card was not manufactured in large quantities, and initially it was quite expensive. What makes it a lawful entry in this test is that it was available on the second-hand market for a $50-ish price tag around Christmas 1999, when workstation users finally decommissioned it in favor of more modern cards.

(picture: 3DLabs Permedia2 PCI 4 MByte VIVO)

In 1998 the card still traded for about $300, but by the end of 1999 the prices fell into the affordable range on second-hand markets, before the cards suddenly vanished. Being workstation equipment, fewer people noticed it, which helped the prices stay relatively affordable on second-hand markets. Some cards have video in and video out (VIVO). AGP and PCI variants are also available. The cards were manufactured in 4 and 8 MByte models, and some models are upgradeable to 8 MBytes via a memory expansion slot. The card has fine OpenGL drivers, ideal not just for CAD applications but also for gaming. Rumors say the card is CPU-hungry for its era. Is it? We will find out in the test.

Matrox G200

The G200 was released in 1998 for about $200, but the second-hand price only fell to the $40-50-ish range in late 1999, after the release of its successor. The G200 was available for PCI and AGP slots, and it came with 8 MBytes of memory. In some cases it would be possible to upgrade it to 16 MBytes, but I have never seen a memory expansion for this model.

(picture: Matrox G200 AGP 8 MByte)

My card is an AGP variant with 8 MBytes of video RAM. The card usually requires no heatsink, but some models have one; obviously no active cooling is required on this one. My model sadly has no TV-out, but some models do have this feature. The card and the drivers - especially the early drivers - were optimized properly for Socket 4/5/7 systems. Getting the card to boot in 486 systems, however, can be a little bit tricky: a video BIOS upgrade, a motherboard BIOS upgrade, and some luck are needed. This is the second 3D card from Matrox, and their goal was to compete with the nVidia and 3dfx cards, such as the TNT. The difference is that in the case of Matrox, the prices of the 8 MByte cards fell rapidly after the introduction of the new models, which didn't happen for certain other brands. The card supports OpenGL, but it's not great at it. In non-AAA titles, OpenGL support can be a bit problematic and erratic (missing textures and such), but there are no issues with AAA titles whatsoever, and the D3D drivers are almost perfect. Unlike 3dfx and nVidia, Matrox optimized the drivers for low-end computers. Let's see how successful they were at this task!

3dfx Voodoo Rush

The only 3dfx card included in this test is the Voodoo Rush. I got it from Yutani, a friend of mine, who gave me the card just so a 3dfx product could be included here as well. Besides this, the Voodoo Rush is the only 3dfx card that became available for cheap at the end of the 90s, after the product failed on the market and the remaining stocks were sold at bargain prices. The Voodoo1 was a high-end video accelerator card, released in 1996. It was much faster than the Virge, but its price was also much higher.

(picture: 3dfx Voodoo Rush 6 MByte PCI)

That made the Voodoo1 a high-end solution. Also, the Voodoo1 was not a video card, just an external accelerator card. To use the Voodoo1, a normal video card was also required in the system. The two cards were connected with an external cable, and when 3D mode was initialized, the Voodoo took over the display. This was not optimal, so 3dfx decided to release a video card with 3D acceleration capabilities. To achieve this, they integrated a separate 2D core onto the board. This card was called the Voodoo Rush, and it was released in 1997. The 3D part of the Voodoo Rush was a redesigned Voodoo1 core, with separate texturing and frame buffer chips. This design was a bit more cost effective, but slower than its predecessor, the Voodoo1. The Voodoo Rush card had a total of three chips to process 2D and 3D graphics, which made it too expensive for its performance. The 2D chip was an Alliance AT25/AT3D, used only for its 2D functionality; later on they switched to a Macronix MX chip as the 2D unit. The 3D performance was not the only weak point: according to rumors, the 2D image quality is also terrifying - even in 640x480 the screen is blurry. The test will show whether this is true or not. 3dfx canceled this product line after they realized the card would never meet their expectations of conquering part of the OEM market. This was the only 3dfx card sold in the low-end segment after the failure of the product; other 3dfx cards, such as the Voodoo1, Voodoo2, Voodoo3, and above, never circulated on the market at low-end prices before the year 2000. The remaining stocks of components were still used in 1998 to manufacture Rush cards, but most of the cards were made in 1997.

The Voodoo Rush, due to the product's failure, is a lawful member of this test, and it will be included. The card is very complex, and it has a lot of separate memory chips. One group of memory chips belongs to the AT25 chip and is shared with the 3dfx frame buffer chip; the second group is wired directly to the Voodoo texturing unit and is used only for storing textures. The Voodoo Rush usually has 6 MBytes of memory, where 4 MBytes are used for the 2D/frame buffer and 2 MBytes for textures. There are 4 MByte versions with a 2+2 MByte setup, where the 2D chip only has 2 MBytes as well, limiting the resolution to 640x480; otherwise, under 3D applications, the 6 MByte models support 800x600 as well (see the sketch below). There are also 8 MByte versions with a 4+4 MByte configuration, and some models are upgradeable thanks to standard unpopulated memory sockets. The chips require no heatsinks, but some later models include them, and the later models sometimes use higher clocks to achieve greater speeds. The drivers are a little bit funky: different driver packages exist for different Rush models, depending on which 2D chip they have. It's interesting to note that the AT25/AT3D chip is capable of 3D acceleration on its own as well, but it is disabled on the Voodoo Rush, and the Voodoo is supposed to handle all 3D operations. Despite this, the Rush supports windowed 3D rendering, at least on paper, but in reality it's hard to find any game that can do this. The previous Voodoo1 had issues running non-AAA games, and it was incapable of working with windowed applications. Was the Rush able to solve this issue? We will see in this test. Also, the Rush supports 3dfx's own graphics API, Glide, which will be used for the Rush wherever it is available.
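The resolution limits above follow from the same buffer arithmetic as with the Virge, only split across the two memory pools. Here is a rough sketch, assuming front, back, and z buffers at 16 bits per pixel all live in the frame buffer pool while textures use the separate pool; the figures are illustrative only.

/* Rough sketch of the Voodoo Rush memory split: the frame buffer pool
 * (shared with the 2D chip) must hold front, back and z buffers at 16 bpp,
 * while textures live in their own pool. Illustrative figures only. */
#include <stdio.h>

static double buffers_mb(unsigned long w, unsigned long h)
{
    return (3.0 * w * h * 2) / (1024.0 * 1024.0); /* front + back + z, 16 bpp */
}

int main(void)
{
    printf("640x480 needs %.2f MB of frame buffer memory (fits in a 2 MB pool)\n",
           buffers_mb(640, 480));
    printf("800x600 needs %.2f MB of frame buffer memory (needs the 4 MB pool)\n",
           buffers_mb(800, 600));
    printf("Textures sit in the separate 2 MB (or 4 MB) pool in either case.\n");
    return 0;
}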

The cards I would like to include... but I don't have them

Matrox Mystique

I sold my last one years ago. It's not a rare card, I just can't seem to find any right now, and I would rather not pay a premium for shipping, so I can't include this card at the moment. The card was a direct competitor of the Virge. It was released after the Virge, and it has some issues with texture filtering when blending or alpha is enabled. The cards were available in PCI form for prices similar to the Virge and the Rage2 in the late 90s. The card shipped with 4 MBytes of video memory. The card looks a little bit fragile, but in reality it's no different from the competitors. It required no heatsink. It rarely dies and can run most of the games similarly to the Virge and the Rage, but we will not be able to observe that card right now.

Trident 3Dimage

This card was popular in Britain, but not so much elsewhere. It's certainly not a rarity, but I don't have any, so I will not include it in the test. It is certainly not a good 3D card: it was released to compete with the Virge and Rage2, which it was barely able to do because of its bugs. I haven't done my homework on this card, so I don't know how much VRAM it has or how it behaves in various circumstances and computers, as this brand was not available where I live.

Alliance AT3D

This chip was a failed 3D chip from Alliance. It was designed to compete with the Virge. It's buggy and does not support blending properly. The 4 MByte variants are indeed 3D capable, but the performance is slower than the Virge: polygons and textures jump around, and the picture quality is dirty. The chip was later used, with its 3D features disabled, as the 2D engine on the Voodoo Rush cards. The Alliance AT3D card was available in PCI, usually with 4 MBytes of RAM, but I have no information about the pricing. This card didn't make it to my country in notable quantities, and I don't have any. It required no heatsink.

Intel i740

I had one specimen, but I sold it, and I was not able to find another in the last few months. The card came later to the game, to compete against the new era of 3D accelerators. However, due to its weak performance, it fell back to the low-end range. The price fell to $35-ish brand new by the end of 1999, and it was a surprisingly good card for the price, even though this was Intel's first 3D chip. There is OpenGL support, but it's buggy and crashes can occur; the D3D drivers are more tame. As I don't have one right now, I am forced to skip it in this test. The card was available in 4 MByte and 8 MByte models, both for PCI and AGP; the AGP card was more popular. It required a passive heatsink.

SiS 300

This is the second video card from SiS. It's five to ten times faster than the SiS 6326, and it's available in PCI and AGP versions. It was released in 1999. The problem is, I don't have any, so I can't test it. I had one once, but it was integrated onto a motherboard. The card was designed to be optimal on more high-end machines; it requires at least a Pentium II/Celeron based system to unleash its potential. It would have been a nice experience to observe how well the drivers behave on low-end Socket 7 era computers, but this will not happen, as I was not able to get my hands on a card. These cards, if someone gets one, come in 16 and 32 MByte versions, they have OpenGL 1.1 compatible drivers, and they carry passive heatsinks.

The cards I have, but I will not include them

nVidia TNT1

The TNT1 was a $400 graphics card, and this is a low-end test, so obviously it is not going to be included. It was nVidia's first heavyweight 3D graphics card. Also, the TNT1 has compatibility issues with various early computers, because the drivers are optimized for CPUs clocked at 433 MHz and above, such as the Celeron 433. This 16 MByte AGP 2x monster can be made to work properly on a Socket 7 computer using old dinosauric drivers and a little overclocking of the system, but overall it's a no-go for this test. Also, the price of these cards was above $300, so there is no point featuring them here.

nVidia TNT2/TNT2 M64/GeForce

The same applies as for the TNT1, but the drivers are even worse. Crashes and incompatibility issues can occur with ALi and VIA chipsets from the era, and drivers are not available for 486 and early Pentium 1/AMD K5 non-MMX computers. These cards were simply designed for high-end gaming; in a Socket 5/7 motherboard, even if the drivers can be set up correctly, the speed is capped at 6-7 fps in certain situations, as the card cannot properly interface with the I/O system of the computer when using the newer drivers optimized for Slot 1 computers. Another problem is that even though these cards were released before 2000, the prices were around $300-600, which puts them outside the low-end card league.

Rage 128

The Rage 128 from ATi behaves similarly to the TNT2 cards. These cards are designed for newer computers, and the drivers will potato out if you don't give them at least a Pentium II or Pentium III. The prices were also higher than the low-end budget of the late 90s, so even if some of these cards would otherwise fit into this test, the price range makes them unsuitable. These cards also overheat and die easily; this was indeed the beginning of the era where graphics cards just boil themselves to death. These 16 and 32 MByte cards are the beginning of a totally different era than the one this test is about.

The cards I don't have, so I can't include them (but I would not include them anyway)

3dfx Voodoo1/3dfx Voodoo2/3dfx Voodoo3/3dfx Voodoo Banshee

Oh, how can I leave out the Voodoo from this comparison? It's very simple: I sold mine a few years ago, and I haven't found another since (except the Voodoo Rush, which I got just for the sake of this test). But even if I had a Voodoo right now, I would not include it in the test, and the reason is very simple: the price of the Voodoo cards didn't fall below $100 before the early 2000s. The price of a Voodoo3 was about $150-200 all the way, and its little brother, the Banshee, was above $100 as well. The Voodoo1 was available for $100, which almost qualifies it for this test, but that is still a mid-range/high-end price range (don't forget: 1999-ish prices!). The Voodoo1 price only fell to $50 after the millennium, and it's also not a video card, just a 3D expansion card, optimized to run AAA game titles and nothing else. This means that non-AAA games would likely still use your primary video card, which you had to choose regardless of having the Voodoo1. Even in 2002, the second-hand price of a Voodoo1 was about $20. And the Voodoo2 is basically an overclocked Voodoo1 with two modified Voodoo1 chipsets integrated onto the card, for double the price, and its prices only fell after 2000.

3dfx cards were indeed the holy grail for a lot of gamers, but 3dfx never released an affordable budget card for legacy computers during its lifetime, which makes them unsuitable for this test. And as I currently don't have one, it's not possible to include one even as a reference. Also, 3dfx never really cared about low-end computers; for example, they didn't release a driver capable of running on low-end computers for years. One notorious example is the incompatibility with the AMD K6-2 series, which plagued the adoption of Voodoo1 and Voodoo2 cards. 3dfx cards were strictly mid-range solutions, where they competed with the TNT/TNT2 and early GeForce cards. The 3dfx Voodoo3 was expensive despite lacking 24/32-bit rendering. Every card from the other manufacturers released in 1997 and later supported 24 or 32-bit rendering, except 3dfx's. Also, 3dfx didn't support textures bigger than 256x256 pixels, while the other manufacturers already supported 1024x1024, or at least 512x512. This made text and HUD elements blurry in some games on Voodoo cards. 3dfx tried to downplay these problems, but instead slowly became a laughing stock among high-end gamers for these flaws. Still, the prices didn't fall, as the company could not afford to sell the cards for less. Cards released after 2000 are not the topic of this test, and they are not going to be discussed.

Others

There are other manufacturers as well that are not included in this test. The main reason is that I don't have their cards. Another reason is that they were available only from decommissioned server rigs, they were too rare or expensive, or their prices only fell after 2000. Such 3D cards are the various PowerVR based cards, Rendition, the Number Nine Revolution and Ticket to Ride, Neomagic, Intergraph, and random noname products from the Far East. Some of them would be interesting to test, and some of them are not even that rare... but I will probably not redo this test later just for the sake of those cards, even if I find one.

The rest of the computer

The test computer is a Cyrix 6x86MX based Socket7 machine.

This was a cheap processor generation for the Socket 5/Socket 7 platform in the late 90s. The Cyrix 6x86MX cost about 40% less than a similar AMD or Intel chip. It supports MMX, and it offers similar compatibility to a Pentium MMX chip. The earlier 6x86 and 6x86L chips were competitors of the first-generation Pentium 1 chips, and they were later replaced by the far more robust 6x86MX line.

The 6x86MX was a cheap CPU family meant for office work and casual multimedia consumption. The 6x86MX used in this test is a 200 MHz model, which I overclocked to 250 MHz. At 250 MHz it's still slower than a 233 MHz Pentium 1 MMX CPU, but only by a few percent.

The motherboard is an Asus P5A-B. The performance of an older Socket 5 or Socket 7 motherboard without AGP and SDRAM would be roughly identical; I chose this board because it has an AGP port, making the test easier for all of the cards in the queue.

For storage, a 6 GByte hard drive was added. In the late 90s, hard disks above 2 and 4 GBytes were hard to find, as they were expensive. 486 motherboards usually support HDDs up to 2 or 4 GBytes, while Socket 5 and Socket 7 motherboards support them up to 8 or 10 GBytes. Some motherboards can see even 100 GByte hard drives after a BIOS upgrade, though.
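These size ceilings mostly come from the classic BIOS drive geometry limits rather than from the drives themselves. Here is a quick sketch of where the usual textbook numbers come from; which limit a particular board actually hits depends on its BIOS, so the figures the article quotes are rounded real-world experience rather than exact thresholds.

/* Where the classic BIOS hard disk ceilings come from: the INT 13h CHS
 * interface multiplies cylinders x heads x sectors x 512-byte sectors.
 * These are the textbook limits; which one a given board hits depends
 * on its BIOS translation support. */
#include <stdio.h>

static void limit(const char *name, unsigned long c, unsigned long h,
                  unsigned long s)
{
    double gb = (double)c * h * s * 512.0 / 1e9;
    printf("%-26s %4lu x %3lu x %2lu sectors = %6.2f GB\n", name, c, h, s, gb);
}

int main(void)
{
    limit("No CHS translation", 1024, 16, 63);         /* ~0.5 GB */
    limit("4096-cylinder BIOS limit", 4096, 16, 63);   /* ~2.1 GB */
    limit("CHS translation (int 13h)", 1024, 255, 63); /* ~8.4 GB */
    printf("28-bit ATA LBA addressing tops out around %.0f GB\n",
           268435456.0 * 512.0 / 1e9);                 /* ~137 GB */
    return 0;
}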

DVD burners were used to copy and install the game demos I was trying. Tests were run on Windows 98 SE; the sound card was an ISA card from OPTi. Besides this, a 10 megabit network card was installed in a PCI slot, and a USB mouse and floppy drives were also present.

Let's see the games

Croc1

Croc1 was a popular 3D platformer game. It offered almost a year's worth of play time if someone really wanted to finish it. It was originally released with support for various native 3D APIs, but that version got corrupted on the 22-year-old CD I had. So I had to use the newer version, which is available for D3D as well but is otherwise identical to the first one. Croc1 was tested in 512x384. The game occupies about 200 MBytes, and it needs the CD to be inserted for the original music to play. The version I tested can also render in software. This game was really well optimized; even the software renderer can produce almost fluid frame rates on a 400 MHz P2 CPU. I played this game for almost a year, 20-30 minutes daily. It was very fun, and it has low demands.

Croc2

Croc2 is the sequel to Croc1, and it is very similar. The game is a little bit more demanding, but the Cyrix should be enough to run it. This game was also tested in 512x384. The demo has two levels to play; the jungle demo was chosen for the benchmark. The graphics settings were kept on medium and the textures on low. The 3D visual quality can be changed up or down to suit your system better. I don't like this sequel: Croc1 had better camera management and controls, while the second one is a bit shady in this regard. Do not fix something which is not broken! I feel like Croc2 was lego-built in some map editor, while the first one was really a well-designed masterpiece. It also needs a somewhat stronger computer to be playable.

Motoracer

Motoracer is a game designed for early 3D accelerators. If 3D acceleration is not present, it can also work with software rendering. The game runs in 640x480, and it's optimized for early 3D accelerators. Sadly, the game will crash on newer 3D cards and drivers, so it can only be used on vintage hardware with vintage drivers. The game is very entertaining, but it needs more than 15 FPS to be really enjoyable.

Hype the Time Quest

This is a game with a Lego person with a sword. It's an adventure game, where this Lego person goes around finding treasure in barrels. There is nothing to configure in this game besides texture quality. Oh, there is one more thing to configure after all: whether to install the game in D3D mode or in Glide mode. You basically have to reinstall the game to switch between graphics APIs!

Revolt

Revolt is a toy car racing simulator. It was meant to run on at least Pentium II based computers, so it will stutter a bit no matter what you do. It's still tested to see what maximum can be achieved on this system, as it was a popular game. The game was run in 512x384 and 16 bit. Revolt was not able to reach fluid frame rates on this computer, but when I pressed ESC to exit to the menu, as the physics switched off, the frame rate became fluid for a second. This means the physics and animation engine is too demanding for the CPU, and no matter what GPU you use, it will always stutter on the Cyrix.

Lego Racers

Another racing game, where Lego characters can race against each other. The game is sort of optimized for vintage graphics cards. The game is very simple, and it handles the input very well, even on low-end graphics cards. I didn't know this game before; I just tried it for this test. The game is well optimized, but it is quite boring to play. It's probably still more entertaining than any racing game from the previous decade, but I don't think I would play it a lot even though it runs very well on most of the cards.

Tomb Raider Chronicles

This game is a newer incarnation of Tomb Raider; luckily it supports Direct3D and older graphics cards. The game uses nice colors and textures, and most effects are pre-rendered into the textures, so it does not cause too much stuttering even on vintage graphics cards. The game was run in 640x480. I dislike this game, just like all the earlier Tomb Raider sequels. The controls are choppy, the game dynamics are non-existent, and I don't get aroused from watching Lara Croft's booty, which was probably the most important selling factor of this crappy game - but if that's your fetish, you will be happy, as it runs fluidly on every 3D potato PC.

Frogger

Frogger is a game where a frog must jump through the scene upwards to reach the other frogs on the screen. It is optimized for low-end graphics cards as well, and it's a relatively small game compared to other games from this era, requiring only 20 MBytes of space. At first I found this game quite irritating, but then I started to feel like I was playing some vintage Cat Mario iteration. Certainly not bad, but I don't know if the game is just capped in FPS, or if it is supposed to run this badly everywhere. When hopping around the levels, there are places where the speed drops to 14-15 fps regardless of the 3D card used. The graphics are not that demanding, and there is barely anything for the CPU to calculate, so I don't really understand the reason behind this choppy performance.

Drakan

Drakan is a fantasy game where we control a woman and a dragon. These two creatures live in symbiosis; basically, the dragon is a pet. It can fight monsters by breathing fire. The game is not optimized for low-end systems, but it was entertaining. Nowadays I would find something like this extremely boring and pointless, but back then even the first level was tricky, and it was unique to play something like this. Controlling a dragon? You must be some kind of wizard! Sadly, on the Cyrix, the game vomits its guts regardless of the graphics chip. I remember playing this game more fluidly on an overclocked K6-2, though, but as this is a low-end test, the Cyrix fits better.

Rollcage

This is a car racing game. The cars can flip upside down, enter tunnels, and compete against each other. The game is not really optimized for very old machines, but the Cyrix should have no problem running it. And indeed, the game was able to pump out usable frame rates with a strong graphics card; even if it was not able to reach a steady 25 fps, it was playable. I don't think I would keep playing this game; I would rather go with the F1 game (discussed later).

Freespace 2

Freespace 2 is a space simulator about a war fought by humans against an alien race. The game is well optimized; after all, it's just a few polygons flying around in the void, so it should not be too demanding. The game was released on 4 CDs, and I gave mine to someone who ghosted me and hasn't returned it since. To be honest, this is not a big problem, as the game is boring. The story evolves and there are some interesting turns, but no sane person should play this in 2021, even for retro purposes.

Tomb Raider 2

Tomb Raider 2 is an iconic game, and it should run well on vintage computers. I set 512x384 for this game, but I was too lazy to figure out the controls just for this test. To be honest, I never really liked this game, and I still don't like it. Despite this, it's indeed a very good game for those who have an affinity for polygon boobs and asses. Too bad, because this game engine, character animation engine, and gameplay system could have been used to create a more entertaining game - but we are in the 90s, when even rotating a colored triangle required superhuman abilities and knowledge, and this game runs literally even on a rotting bean shat out by a cockroach. Unless an enemy appears, then the performance falls a little bit.

Final Fantasy 8

Final Fantasy 8 was a very popular Japanese 3D game. It's a 3D RPG offering a year's worth of play time. The game revolves around a war fought by high school anime kids against each other, who travel in a floating sky fortress. Final Fantasy 8 was a cult game, released on 4 CD-ROMs. The cost of the game was about equal to a monthly salary here. It offered a year of entertainment if someone played one or two hours every day. The game has awesome music, but the battle system is a tiny bit over-complicated; Final Fantasy 7 probably had a slightly better system. That game will not be tested here, as it only runs on one or two early 3D accelerators, and not on newer ones. Final Fantasy 8 has a D3D and a software renderer as well. Sadly, it produces glitches with almost every card. The most glitch-free graphics were produced by the Savage4, the G200, the Permedia2, and the Riva128 cards; other graphics cards had more or less discoloration and artifacts. The game is very CPU-limited; it's a mystery why.

Formula One

This is one of the first 3D accelerated F1 games for the PC, and it has both a simulation and an arcade-oriented mode. It should run fine on older computers, and it's a very small game. It's recommended for every retro fan to try. The controls are a bit funky, so it's hard to play this with one hand. The game supports an arcade-like and a simulator-like playing method; I like the first one, but back then the simulator method was probably more popular. Interestingly, the game is a bit CPU demanding, and if there are a lot of cars on the screen, none of the graphics cards can render the game fluidly on the Cyrix.

Star Wars Racer

This is a Star Wars racing game, where you can race against other characters from the Star Wars universe on strange floating contraptions. It's basically the go-kart of the future, where aliens can compete against each other in not-so-friendly ways. The game, by the way, sucks hard, and I was too lazy to figure out the controls for the first few tests. Later I realized that you accelerate with the Enter key. What the hell? It's a huge inadequacy of the developers to put acceleration on a random key on the keyboard which you will never find unless you read the manual. Besides this, it's not even a bad game. I could imagine playing this even nowadays with a friend.

Tachyon

Tachyon: The Fringe - or more likely, Tachyon, the cringe. Tachyon is similar to Freespace 2. The game is about a space war, where we control a mercenary. It's another simulation where you float around in the nothingness. It's a bit more demanding than Freespace 2, as it was released later. The Tachyon developers messed this up a bit, as the game requires a strong CPU to render the scene; nothing achieved playable frame rates on the Cyrix. The ships have slightly better and more realistic controls, and it's fun to play - or at least it was, 20 years ago. Despite this, it runs on almost every potato you can come up with. Tachyon is a bit laggy at low FPS; unlike some of the other games, this one cannot be considered playable if the FPS is too low.

F22 Lightning3

This was a typical NovaLogic aircraft simulator. It's easy to play, and you can use various rockets and bombs to destroy the enemy. You fly the F-22, which was, and still is, the top dog of the American air force. The game supports D3D and software rendering as well; of course, for the sake of the test, the D3D acceleration is used. It also had an online mode, which I played for a few months; that is sadly already shut down. There were some light flight simulators I liked, and this was one of them. Nowadays I would not play it, as it's too boring and oversimplified - in contrast to IL-2 Sturmovik, which is over-complicated AND boring.

Nintendo64 emulator

Gaming was not just about playing AAA titles. I dug up two vintage N64 emulators to see how this computer can play the games. For D3D based cards, the emulator called 1964 was used. For cards with better drivers, UltraHLE was used (which supports Glide and OpenGL). Mario64 was the payload tested in the emulators, because it is the most compatible game. It's a very entertaining game; everyone has to try it at least once. I accidentally used a different one for the Virge and Rage cards, which was probably not 3D accelerated, so ignore those results. If a card supported OpenGL, the emulator was tested with that; otherwise, it was tested with D3D. The Voodoo Rush was tested in Glide mode in UltraHLE. As you can see, different emulators and plugins had to be used for different cards, so do not take these results too seriously. But still take them into consideration, as AAA gaming was not everything on these machines. In fact, you can see that some cards perform better in AAA-ish titles, while others perform well across the board, including these home-brew emulators.

The results

S3 Virge 4 MByte PCI

Windows had a built-in driver for this card, and that was used. The S3 Virge PCI had surprisingly crisp 2D graphics quality, even compared to modern video cards. In 3D it only supports 16 bit, but on the Windows desktop it supports 24 bit as well. The card lacks raw pixel processing power: for example, Croc goes from 9 fps to 12 when switching back to 400x300, and Croc2 goes from 5 fps to 6 fps when selecting 400x300. Motoracer ran at about 11 fps in 640x480, which is almost playable; maybe tweaking the card a little would push it to better playability. Hype the Time Quest ran at only 5 fps, which is maybe playable for an adventure game by the early standards of gaming, but certainly not really comfortable. Revolt was running at 4 fps with glitches, and it was not playable. Lego Racers ran at 7 fps, but it was playable thanks to the good game engine that handled the input very well. Tomb Raider Chronicles reached 8 fps, which was almost playable, but not a good experience. Frogger was running at 8 fps and was playable. Drakan refused to start. Rollcage was running at 6 fps, and texture glitches were present in the menu, which made the game very hard to handle. Freespace 2 was running at 15 fps, but it always crashed after a while, and the computer had to be rebooted. Tomb Raider 2 was running at 7 fps and was barely playable; the speed fell especially in areas with a lot of fights. Final Fantasy 8 produced glitches making the game non-playable; otherwise, it was running at 4 fps. Formula One was running at only 5 fps, and I don't consider that playable. Star Wars Racer produced 4 fps and some glitches on the road. Tachyon was running at 4 fps and was not playable. F22 Lightning 3 produced 3 fps, which is not really playable. Mario64 was running at 2 fps, even on the logo screen, so this was not playable. Indeed, the first version of the Virge is only able to play the earliest 3D games, and it can't really do anything with the newer ones. Let's see the bigger brothers!

S3 Virge DX 4 MByte PCI

To compete with the upcoming line of ATi Rage chips, S3 released a newer version. The DX is indeed about 25% faster than the initial Virge, which usually means one extra frame per second. The card successfully keeps up with the Rage2 chip line, but produces less accurate pictures: alpha-blended objects usually have artifacts in most games. The operating system recognized this card out of the box as well. The chip runs a little warmer than the previous version, but it's barely noticeable. I noticed that this card had slightly worse 2D output quality than the original Virge. The card has smaller and worse capacitors; this probably varies by manufacturer, but that's how manufacturers started to save money, by using weaker components on the cards. The first Virge had a slightly cleaner image. Otherwise, the card behaves identically to the previous model. In 3D, this one should have better picture quality, as bilinear filtering can be enabled without losing too much speed compared to the first Virge. But at the small resolutions the card is usually used at, it will not make much difference, especially as texture filtering is mostly turned off by default in the early games.

Is the Virge a 3D decelerator?

Early rumors floating around the forums called the Virge family a 3D decelerator, propagating the idea that D3D acceleration on the Virge is slower than software rendering. Some heavily optimized DOS game engines can indeed produce very high frame rates, comparable to 3D rendering on Windows. However, when we compare software-rendered games on the Cyrix 6x86MX with the 3D rendering performance of the Virge - for example in Croc, Motoracer, or Final Fantasy - we can see that the Virge and Virge DX are at least 2x-4x faster than software rendering in these games. Of course, on a modern Pentium 4 the software rendering would easily outperform these early cards, but that's a totally different era and price range of computer. Guess what, a software renderer on an 8-core i7 would also outperform every card measured in this test... In a Pentium 4 you would likely use a GeForce, not a Virge from 1995. There is no point in putting an out-of-era video card into a computer (either too old or too new), as the performance will obviously not be ideal.

S3 Trio3D/2x PCI

The card produced crashes with both the newest 10037 and the 10028 drivers. The card thought of itself as an AGP model, but it isn't one. After blocking AGP in DirectX, the 3D contexts initialized, but still crashed after one frame. I tried to remove the AGP drivers from the system, but the card remained unstable in some situations; even DXDIAG crashed. I moved the card to a different PCI slot, but it didn't become stable. I reinstalled the latest drivers, and some games started to work at that point. 2D has no issues; the screen is crisp. 24-bit modes are available in 3D, but only 16 bit was used for this test. Unlike with the previous S3 cards, some games now had their text appearing, such as Rollcage Stage, where the menus now had proper text. If someone knows why the PCI version crashes, please leave a comment. I am legitimately curious about this, as otherwise this card seems very usable.

S3 Trio3D AGP

The AGP model was free of the strange errors of the PCI unit. The net is full of clueless people going around saying that the Trio3D is slower. In fact, the Trio3D is about twice as fast as the previous model - sometimes barely 20% faster, but in other cases almost three times faster. Despite being just a new revision of the old technology, the Trio3D successfully keeps up with the new generation of low-end 3D cards in most games. The card did not produce too much heat; both the chip and the RAM were relatively cool while rendering. Final Fantasy refused to start; the other games ran on the card. The N64 emulator was useless, just like on every previous S3 card. Hype the Time Quest needed somewhat longer loading times than usual. I am quite satisfied with the performance of this card.

S3 Savage4 AGP

The Savage4 is a beast. It almost won this test. The card is simple, but fast. Most of the games ran fine. Shadows in Croc2 were actually grey boxes instead of textures. Revolt almost reaches playability; however, that game has a brutal CPU limit which even the Savage4 can't fix. The Savage4 ran quite hot compared to the Trio3D and Virge based cards. The Savage4 also had a jumper to set AGP 2x or 2x/4x mode, which means the manufacturer of the card prepared for good vintage compatibility. And indeed, the card shows fantastic scaling even with the Cyrix 6x86MX. Whoever was able to buy this card was very lucky; even some early-2000s games will run fine on it. I haven't noticed any instability or any serious bugs. The PCI version would probably also be a worthy catch, but it probably costs a kidney. S3 did a great job with this card, and they were back in the race again. I am a bit disappointed only because this card probably should have arrived a year earlier, and S3 should not have released so many iterations of these early Savage chips, to keep its portfolio simple. The card is indeed fast, but the 32 MByte variants seem like overkill. At this time, users were already aware of the marketing power of RAM size, and they were thinking: okay, this is not a bad card, but the 32 MByte version is just a trick to get into the pockets of less-informed buyers. The 16 MByte model would be the best choice nowadays if you want to play retro games in 32-bit color depth; otherwise, the 8 MByte model will be totally fine.

ATi Rage2+

Windows 98 SE offered a driver for this card. The latest downloadable driver seemed to be older than the OS build date, so the built-in driver was used. The card is usually a little faster than the Virge; the biggest remaining plus for the Virge is its compatibility. Blended effects usually have fewer artifacts here than on the Virge. ATi was indeed a worthy competitor of the Virge, and from the battle of the two, ATi eventually arose victorious. The card scales better when the resolution is raised: the Virge instantly bottlenecks at 640x480, but the Rage2 barely loses performance. The build quality looks very high. I would position the 2D signal quality of the Rage2+ between the Virge and the Virge DX, but this probably varies a lot by card manufacturer. ATi trolled S3 with this card: the two companies raised the performance of their cards at the same pace, so the Rage2+ ended up being just as fast as the Virge. It's hard to declare a clear winner, but the Rage2+ was probably the better solution due to the lower number of graphics glitches. There were no problems with the drivers; everything worked out of the box.

ATi Rage 2C AGP (4 Mbyte)

I was expecting a little speed-up from this card. Indeed, this card is faster than the Rage2+, but this usually translates to one or two extra FPS in games (4 vs 6 and so on). The card is AGP, but it probably does not support AGP signaling: in DXDIAG, AGP texturing was greyed out, so the card did not support this feature. This card is just a Rage2+ for the AGP bus, with maybe a slightly increased clock speed. The card sometimes produced a BSOD when exiting games, and sometimes I got a BSOD when trying to reboot the computer. The games themselves were not unstable, but sometimes, when I exited them, I got these crashes. Revolt had a notable perspective correction error, a little bit more noticeable than on other accelerators.

ATi Rage 2C AGP (8 Mbyte)

The 8 MByte version gains no speed compared to the 4 MByte version in gaming. I feel I wasted my time by testing the 8 MByte variant; in fact, the 8 MByte version is even notably slower in one of the tests. ATi had nothing to hurry for in early 1997 when the AGP port was released, so they put the chip onto the new bus type with minimal modifications. The 8 MByte video memory size was probably good for advertisement, but it does nothing useful, it just wastes the earth's mineral resources, at least in the games I have tested. Maybe it could be useful when someone tries to use 32 bit rendering, but the performance is just not there to do it anyway. It may make a difference in newer games - however, those will likely be below playability. Of course, this model also had no AGP texturing acceleration. At least this card produced no strange crashes like the 4 MByte model, so due to this, I recommend it above the 4 MByte variant. But for retro purposes, the previously discussed Rage2+ model is better due to its PCI port. The Rage Pro family also exists, which increases the speed significantly compared to the 2+/2C chips, but I have none of them to test.
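Whether the extra memory can help at all is mostly framebuffer arithmetic. Below is a rough back-of-the-envelope sketch of my own (not anything from ATi documentation) that adds up the front buffer, back buffer, and a 16 bit Z-buffer for a given resolution and color depth; whatever is left is what the card has for textures. It shows why 4 MByte is already tight for 32 bit rendering in 640x480, and why 8 MByte only starts to matter there:

```c
/* Rough video memory budget for double-buffered rendering with a Z-buffer.
   Back-of-the-envelope illustration only, not vendor documentation. */
#include <stdio.h>

static void budget(int width, int height, int color_bytes, int card_mb)
{
    long front = (long)width * height * color_bytes;  /* visible frame buffer */
    long back  = front;                               /* back buffer for double buffering */
    long zbuf  = (long)width * height * 2;            /* 16 bit Z-buffer, typical for the era */
    long used  = front + back + zbuf;
    long card  = (long)card_mb * 1024 * 1024;

    printf("%dx%d at %d bit: buffers take %ld KB, %ld KB left for textures on a %d MB card\n",
           width, height, color_bytes * 8, used / 1024, (card - used) / 1024, card_mb);
}

int main(void)
{
    budget(640, 480, 2, 4);   /* 16 bit color on the 4 MByte model */
    budget(640, 480, 4, 4);   /* 32 bit color on the 4 MByte model */
    budget(640, 480, 4, 8);   /* 32 bit color on the 8 MByte model */
    return 0;
}
```

At 16 bit the 4 MByte card still has a couple of megabytes left for textures, but at 32 bit the buffers alone eat about 3 MByte, so the 8 MByte model is the only one of the two where 32 bit rendering would leave any real room for textures - if the chip were fast enough to use it.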

nVidia Riva128

The tiny but horridly strong nVidia Riva128 surprised ATi and S3, especially the latter, because ATi had the Rage Pro to compete with it, but S3 had nothing to compete against this performance. The card screams quality. The high-quality SGRAM is strong enough to feed the high-performance chip. The Riva128 is barely larger than its competitors, but it produces more heat. The video signal varies by card; it is approximately identical to the Virge DX, totally fine in 640x480 and usable in 800x600 too. The Rage2 and Virge DX cards were not in danger just yet, as nVidia indeed asked a price for the card, which is about two to three times faster than these. ATi and S3 had to move instantly to act against the long-term danger posed by the Riva128. ATi was able to do so, and 3dfx and 3Dlabs also aimed a product against it. S3 needed at least one more year to finish their new generation product, so they fell out of the game for a year. The Riva128 initially had crappy drivers, which crashed with just about everything, but nVidia quickly fixed these problems. From the tested games, only Drakan and Tachyon failed to start; the rest produced good frame rates and image quality. The chip almost won the test, as it was able to keep up with much more modern video cards, like the Savage4.

3Dlabs Permedia2

The Permedia2 is an interesting chip. While 3Dlabs was less known than nVidia or 3dfx, people understood that the Permedia2 chip offers good performance. 3Dlabs focused on CAD graphics cards, but the drivers were of course able to run games as well. After the price of the Permedia2 cards fell on the second-hand market, these cards indeed gave good gaming performance. The main rivals of this chip are the Riva128 and the early 3dfx Voodoo cards. The Permedia2 was not able to start F22 L3 and SWRacer. In other games, the performance was usually faster than the Riva128. On the other hand, the Permedia2 generated more glitches with alpha blending. This resulted in funky text in some games, and sometimes the effects had strange discoloration. Tachyon had a missing FPS counter with Fraps. I was not able to use the 512x384 resolution, so 640x400 was used instead. The Permedia2 wanted to output a strange monitor frequency in 512x384, which my monitor did not support, and I was not able to change this frequency despite trying various methods. But even in bigger resolutions the card performs very well, so there is not much sense in using 512x384 on this card; 640x480 is where it belongs. I have not tested 24 bit rendering, as it would make no sense on the 4 MByte model, and sadly I was not able to find a memory expansion for this beauty. (Previously I owned an 8 MByte Permedia2 model for AGP, and 24 bit rendering was fine on it.)

Cirrus Logic Laguna3D

This card surprised me. It had good compatibility and good drivers, and the performance was okay. Online tests usually diss this card a bit, but I have noticed no issues when running it. Every game except Freespace2 was running on it; the card is in the mid-range. After releasing this product, the Laguna3D line was canceled, and Cirrus Logic left the graphics chip market. This makes no sense to me, as the card easily keeps up with the competitors, and it does not even look too expensive to manufacture. Maybe the RAMBUS RAM was problematic, but in that case they could have just gone with EDO or SDRAM instead. It is a mystery to me why this product failed on the market. Probably bad marketing led to the demise of this card, because the image quality and performance were okay. Other tests online suggest that there are some issues with perspective correction. All early cards have small issues with perspective correction, and the Laguna3D had them as well; the quality of its perspective correction was not different from the other cards. The card is halfway between the S3 Virge and the Riva128: not too slow, but not too fast either. The card can even start later games. The only notable drawback is the lack of OpenGL, which would be nice to have at this level of performance. On the Windows desktop, it can do 16, 24 and 32 bit modes. I haven't tried true color modes in 3D, so I don't know if the card supports them or not. But with 4 MByte of RAM, 32 bit would be out of the question anyway.

Matrox G200

This card won the test. The G200 only became cheap enough on the second-hand market in the last months of '99 to be included in this ghetto 3D accelerator test. The card had no notable errors or glitches. Both the Direct3D and OpenGL drivers were fine, the 2D quality was also good, and the RAMDAC produced a good quality video signal. Driver version 682 was used to perform the test. Only one minor problem was found: the refresh rate was funky at 512x384, similar to the problem of the Permedia2. In this case, however, I was able to use the drivers to set the output frequency to 60 Hz for 512x384. This made the resolution available for me, but I guess it would have retained the victory even if it had to run the Croc games in 640x480. Of course, this chip allows gaming even in 800x600, but those resolutions were not used in this test. The card looks very cheap, but despite this, it won easily. Matrox built a notable fanbase with this chip. (By the way, I also have a G450, but that is a far later model, and in the Socket7 it runs like crap, doing just half the FPS of the G200.) If you got a G200, you were extremely lucky. The card will also do 32 bit rendering without any issue.

SiS 6326

The SiS 6326 was the laughing stock of low-end cards, but actually the performance was not even that bad. In fact, the card sometimes reaches or exceeds the speed of the Riva128, but in other cases, the performance is half of the Riva128. This uneven performance makes the SiS 6326 less ideal than the Permedia2 or the Riva128, but its price was far lower, and the card was available for cheap. The card was mostly sold as an entry level 3D solution for OEMs, but bundled versions were also available. The card was able to start everything, and it is a decent performer in 512x384, and sometimes in 640x480 as well. It supports 16 and 24 bit rendering. The downside is that the card produced some glitches in some games. One notable example is Hype the Time Quest, which took a long time to load and produced garbled blue textures. The game can be made to run with different drivers though, but in this test, the newest, OpenGL capable beta driver was used. Tomb Raider Chronicles started in some potato resolution and had some color artifacts around the character, but otherwise it was playable. The FPS counting in some games, such as Lego Racers or SWRacer, produced unrealistic numbers: SWRacer was noticeably stuttering while Fraps was showing 100ish FPS, for example. Therefore, I had to use my eyes to estimate the performance in these two games. In the rest, Fraps was working okay. The card had okay-ish 2D signal output quality in 640x480, but in 800x600 it was a little more blurry than other late cards. PCI models and 4 MByte models of this card also exist, which I haven't tested, but if someone wants to use this card in 24 bit, then it is probably recommended to get the 8 MByte model.

Voodoo Rush

When I installed the Rush for the first time, the drivers I downloaded caused desktop corruption. The card of course had no issue; the driver was the problem. In fact, the driver overclocked the 2D chip from 60 MHz to 72 MHz, and this caused artifacts on the desktop. To fix this, I searched for a method to downclock the 2D chip, and I edited the registry entry responsible for the clock. This helped. D3D was also working; however, Glide caused crashes. I gave up on this driver and started to search for another one. Some drivers lacked D3D, some did not work at all. I realized that there are no official drivers for my model, and on the internet, only random Rush driver packages from self-proclaimed gamer geeks were available (mega ultra quake edition potato 2000). Finally I was able to find one that had both the D3D and Glide portions working properly on my single-planar board. I edited the ini and replaced the texts in the name section with Innovision 2DXRush, which is the official name of this card.
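For reference, the registry edit itself could also be done from a small program instead of regedit. The sketch below uses the standard Win32 registry calls, but the key path and value name in it are made-up placeholders - the real entry depends on the exact Rush driver package, so it has to be located with regedit first:

```c
/* Sketch: write a driver clock value into the registry from code instead of regedit.
   The key path and value name are PLACEHOLDERS; the real entry depends on the
   particular Rush driver package, so find it with regedit before using anything
   like this. */
#include <windows.h>
#include <stdio.h>

int main(void)
{
    const char *key_path   = "SOFTWARE\\YourRushDriver\\Settings"; /* hypothetical path */
    const char *value_name = "2DClockMHz";                         /* hypothetical value name */
    DWORD clock_mhz = 60;   /* stock 2D clock instead of the overclocked 72 MHz */
    HKEY key;

    if (RegOpenKeyExA(HKEY_LOCAL_MACHINE, key_path, 0, KEY_SET_VALUE, &key) != ERROR_SUCCESS) {
        printf("could not open the driver key - check the path with regedit\n");
        return 1;
    }
    if (RegSetValueExA(key, value_name, 0, REG_DWORD,
                       (const BYTE *)&clock_mhz, sizeof(clock_mhz)) != ERROR_SUCCESS) {
        printf("could not write the clock value\n");
        RegCloseKey(key);
        return 1;
    }
    RegCloseKey(key);
    printf("2D clock set to %lu MHz, reboot so the driver picks it up\n", (unsigned long)clock_mhz);
    return 0;
}
```

Assuming the driver really reads its 2D clock from such an entry, writing back the stock 60 MHz value and rebooting should make the desktop corruption caused by the 72 MHz overclock disappear.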

I have uploaded this Voodoo Rush driver package, and it is available here if someone is interested: LINK

Initially, the video signal output of the Rush was extremely crappy. The output signal is padded left and right to meet the internal frequency signals, which results in a blurry picture. To fix this, the refresh rate has to be forced to 60 Hz, and then the signal output becomes normal. I also forced the refresh rate to 60 Hz under games; you can do this in the video card's menu. The card can also do 32 bit on the desktop (but under 3D, only 16 bit is supported, so it is recommended to leave it on 16 bit). The video signal quality is also acceptable in 800x600 16 bit with the 60 Hz output (otherwise it is blurry). Above this, the signal becomes too ugly, so the Rush can be used in 640x480 or in 800x600. The Rush chipset supports the 2D resolutions 1024x768 and 1280x1024 as well, but the screen output would be blurry, so it would make no sense to use them.

After I finally had the proper drivers and setup, I was able to start the benchmarking. Croc was running in Glide (3dfx's own API) at a rock-steady 30 fps. In D3D, it managed an 18+ frame rate. I was able to start Croc in Glide in 800x600 as well; however, after a few seconds, the game always crashed in that resolution. The D3D mode had no issues with 800x600 whatsoever. Tomb Raider Chronicles failed to start, complaining about being unable to create a D3D context. Hype the Time Quest had a Glide mode too; I wanted to test it, but it caused the computer to hang. Rollcage also caused crashes sometimes. Other games had no stability issues. Glide-based N64 emulators were working very well on the Rush, but only in 640x480; in 800x600 the crash reappeared. Basically, everything Glide-based crashed in 800x600 for some reason. This was not a problem in D3D, where 800x600 was working properly.

Final words

The Matrox G200 won this test, and the Savage4 is the second fastest card in vintage computers. The Riva128 comes in third place, and the 3Dlabs Permedia2 is fourth, losing only by one or two points. The Voodoo Rush, the Cirrus Logic Laguna3D, and the SiS 6326 are in the middle of the field. The rest of the cards from S3 and ATi represent the low-end. The summary graph of the video cards can be viewed here:

Notes:

edit 2021-01-25:

I have tested the Rage2+ PCI and the S3 Virge DX PCI with a stronger CPU. The logic behind this is the following: the Virge and the Rage2+ have no hardware acceleration for triangle setup calculations. This means that the whole projection, clipping, and transformation process, with every trigonometric calculation of the triangles, ends up on the CPU; the graphics chip only does the rasterization for us. Unlike more modern and stronger cards, which accelerate this in hardware, the Rage2 and Virge will hog the CPU if the polygon count on the screen exceeds a few thousand triangles.
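To give an idea of what this means in practice, below is a minimal sketch of the per-vertex work that lands on the CPU with these cards. It is my own illustration of the principle, not code from any real driver or game: the CPU rotates every vertex into camera space and perspective-projects it to screen coordinates, and only after that can the Virge or Rage2 rasterize the triangle.

```c
/* Minimal sketch of the per-vertex math the CPU must do for cards without
   triangle setup / transform hardware (Virge, Rage2). A real engine of the era
   also did clipping, lighting and sorting on top of this, every frame. */
#include <math.h>
#include <stdio.h>

typedef struct { float x, y, z; } vec3;

/* rotate around the Y axis and translate - a stand-in for the full view matrix */
static vec3 to_camera_space(vec3 v, float angle, vec3 camera_pos)
{
    float c = cosf(angle), s = sinf(angle);
    vec3 out;
    out.x =  c * (v.x - camera_pos.x) + s * (v.z - camera_pos.z);
    out.y =  v.y - camera_pos.y;
    out.z = -s * (v.x - camera_pos.x) + c * (v.z - camera_pos.z);
    return out;
}

/* perspective projection onto a 640x480 screen */
static void project(vec3 v, float focal, float *sx, float *sy)
{
    *sx = 320.0f + focal * v.x / v.z;
    *sy = 240.0f - focal * v.y / v.z;
}

int main(void)
{
    vec3 vertex = { 1.0f, 0.5f, 5.0f };
    vec3 camera = { 0.0f, 0.0f, 0.0f };
    float sx, sy;

    vec3 cam = to_camera_space(vertex, 0.1f, camera);
    project(cam, 256.0f, &sx, &sy);
    printf("screen position: %.1f %.1f\n", sx, sy);
    return 0;
}
```

Multiply this by a few thousand triangles per frame and by 30 frames per second, and it becomes clear why a 250 MHz Cyrix is the bottleneck long before the rasterizer is.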

I was curious whether there would be any notable speed-up with these cards if I replaced the CPU with a stronger one that can process more triangles for the graphics cards to render. I replaced the Cyrix 6x86MX 250 MHz with an AMD K6/2 500 MHz model, which is about twice as fast as the Cyrix.

The built-in Windows drivers of the Virge DX caused crashes in 3D with the AMD CPU, so I had to install the official drivers from S3, which made the card stable. I benchmarked a few of the games with this setup as well. Then I inserted the Rage2+. The built-in drivers of the Rage2+ were also glitching heavily with the AMD CPU, so I installed the official ATi drivers for the Rage2+, which fixed the problems. I ran the tests with this card as well.

The results with the more powerful CPU, compared with the old CPU, were the following:

Croc1:

Virge DX: CROC1 512x384 (6x86MX 250 MHz): 9 fps

Virge DX: CROC1 512x384 (AMD K6/2 500MHz): 12 fps

Virge DX: CROC1 640x480 (AMD K6/2 500MHz): 10 fps

Rage 2+: CROC1 512x384 (6x86MX 250 MHz): 9 fps

Rage 2+: CROC1 512x384 (AMD K6/2 500MHz): 12 fps

Rage 2+: CROC1 640x480 (AMD K6/2 500MHz): 11 fps

As we can see, both of the cards gained the same speed-up, but even with the two times faster CPU, the rendering speed is not able to climb that much in Croc1. Basically we get a good 30% speed-up from the twice as fast CPU (9 fps to 12 fps is a 33% gain).

Croc2:

Virge DX: CROC2 512x384 (6x86MX 250 MHz): 6 fps

Virge DX: CROC2 512x384 (AMD K6/2 500MHz): 8 fps

Virge DX: CROC2 640x480 (AMD K6/2 500MHz): 7 fps

Rage 2+: CROC2 512x384 (6x86MX 250 MHz): 5 fps

Rage 2+: CROC2 512x384 (AMD K6/2 500MHz): 9 fps

Rage 2+: CROC2 640x480 (AMD K6/2 500MHz): 7 fps

We get a good 50-90% speed-up in Croc2 from the more powerful CPU - but it is futile, as the game still stays below 10 fps regardless of the giant speed-up. The Rage2+ flies from 5 fps to 9 fps in 512x384 with the stronger AMD CPU, which almost doubles the performance; it scales a little better than the Virge, which only goes from 6 fps to 8 fps.

Frogger: it did not scale on the Rage2+ with the stronger CPU, but the FPS went from 7 fps to 10 fps on the Virge DX.

Tomb Raider Chronicles:

Virge DX: Tomb Raider Chronicles (6x86MX 250MHz): 10 fps

Virge DX: Tomb Raider Chronicles (AMD K6/2 500MHz): 12 fps

Rage 2+: Tomb Raider Chronicles (6x86MX 250MHz): 11 fps

Rage 2+: Tomb Raider Chronicles (AMD K6/2 500MHz): 15 fps

This game became 20% faster on the Virge when using the more powerful CPU. At the same time, it became about 35% faster on the Rage2+ (11 fps to 15 fps), almost reaching fully fluid gameplay.

Motoracer:

Virge DX: Motoracer (6x86MX 250MHz): 11 fps

Virge DX: Motoracer (AMD K6/2 500MHz): 18 fps

Rage 2+: Motoracer (6x86MX 250MHz): 10 fps

Rage 2+: Motoracer (AMD K6/2 500MHz): 17 fps

The speed-up in Motoracer is about 65-70%, so we can observe close to linear scaling. The game became playable on both cards with the stronger CPU.

Hype: the Virge went from 6 fps to 8 fps, and the Rage2+ went from 6 fps to 9 fps with the faster CPU.

Indeed, some games will benefit from a stronger CPU when using Rage2 and Virge based cards, as these video cards rely on the CPU to process the trigonometric calculations, while high-end cards do this in hardware. If someone was lucky enough to swap the Cyrix 6x86MX for a Pentium MMX and even overclock it a little bit, he was indeed able to see better results from the first 3D cards of ATi and S3. A small overclock can also be performed on the Rage2 and Virge chips themselves, which would add another 10-20% extra performance, enabling an almost fluid gaming experience in some games even on these cards.


Comments

Reading this post brought back some nice memories of the cards I had back in those years. I had an S3 ViRGE, Savage4, and also a Permedia2. Then around 2002 I got my first nVidia card, an Inno3D GeForce 2MX which I used for a few years until my aunt gave me her GeForce 4MX 440 8x when she upgraded to a GeForce FX 6200. Anyway, I'm new here and I am just about to write my first post.

2 years ago

Thank you and good luck. I also got my first nVidia card in 2002; it was a GeForce2 MX400 as well.

2 years ago

Great job, you put a lot of effort into this.

2 years ago

Wow this piece was wonderful, so much information in one place. This was really informative, keep it up... I even subscribed to you. You are doing well

3 years ago

It's inspired me to write a new article with more effort.

3 years ago