Was the GeForce2 MX a scam?

Written by Geri, 2 years ago

When the year 2000 arrived, nVidia ramped up its advertising campaign to new levels. The new, upcoming GeForce2 line of graphics cards was arriving on the market. Game magazines and PC magazines were full of news about the GeForce2. The GeForce2 was fully compatible with the new DirectX 7 standard and promised a new era of graphics performance thanks to a paradigm shift in rendering. The magazines featured the card, and the benchmarks showed superior performance to everything before it. And when users bought the card, they were disappointed by the performance. Something went wrong, and in this article I will explain what.

DirectX 7 is arriving

Previous versions of the DirectX graphics API were designed for early graphics cards. Those cards required the processor to do all of the calculations for triangle transformation and lighting. This was about to change when Microsoft released the new DirectX 7 graphics API. The new feature set helped increase polygon counts further, while putting less load on the system bus and the processor for the graphics calculations. DirectX 7 continued to work on older hardware as well; in that case, the transformation and lighting operations ran on the CPU.
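
To make the difference concrete, here is a minimal, simplified sketch (not taken from any real game; DirectDraw setup, viewport creation and error handling are omitted) of how a Direct3D 7 title could request the hardware T&L device and fall back to CPU transform and lighting on older cards:

```cpp
// Minimal sketch, assuming pD3D and pBackBuffer were already created elsewhere.
#include <d3d.h>

IDirect3DDevice7* CreateRenderDevice(IDirect3D7* pD3D,
                                     IDirectDrawSurface7* pBackBuffer)
{
    IDirect3DDevice7* pDevice = NULL;

    // DX7-class cards (GeForce256, GeForce2) expose the T&L HAL device:
    // vertex transform and lighting run on the GPU instead of the host CPU.
    if (SUCCEEDED(pD3D->CreateDevice(IID_IDirect3DTnLHalDevice,
                                     pBackBuffer, &pDevice)))
        return pDevice;

    // Pre-DX7 cards (TNT2, Savage4, Rage 128, ...) only offer the plain HAL:
    // rasterization in hardware, transform and lighting done by the CPU.
    pD3D->CreateDevice(IID_IDirect3DHALDevice, pBackBuffer, &pDevice);
    return pDevice;
}
```

The rest of the rendering code stays the same either way, which is exactly why DX7 games kept working on pre-DX7 cards, just with the CPU doing the vertex work.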

The performance benefits

The typical triangle performance of graphics cards before 2000 was around 20-30,000 polygons per frame at around 30 fps. This was due to the limited bandwidth of the AGP 2x bus, the limitations of the CPU, and the early graphics chips themselves. With the new DirectX 7 cards and DirectX 7 based video games, 50,000-100,000 polygons per frame were to be expected. The GeForce2 line was also expected to run earlier games faster, and nVidia also promised a large increase in performance under OpenGL.
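
A rough back-of-envelope calculation shows why the old numbers sat where they did. The per-vertex cycle cost below is an assumption for illustration, not a measured value:

```cpp
// Back-of-envelope estimate of the software T&L ceiling on a late-90s CPU.
#include <cstdio>

int main()
{
    const double cpu_hz          = 450e6;   // a fast K6-2 / Celeron of the era
    const double cycles_per_vert = 400.0;   // assumed cost of software transform + lighting
    const double fps             = 30.0;

    const double verts_per_sec  = cpu_hz / cycles_per_vert;      // ~1.1 million vertices/s
    const double tris_per_frame = (verts_per_sec / 3.0) / fps;   // ~12,500 with no vertex reuse

    std::printf("software T&L budget: ~%.0f triangles per frame at %.0f fps\n",
                tris_per_frame, fps);
    // With indexed meshes and vertex reuse this lands roughly in the 20-30,000
    // polygons per frame range that pre-DX7 games actually targeted. Moving
    // transform and lighting onto the GPU removes this CPU-side ceiling.
    return 0;
}
```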

The GeForce2 was actually not the first DX7 accelerator

The GeForce2 was not the first DirectX 7 compliant graphics card, though. That was the GeForce256, released in 1999, and it was a rather expensive card. No low-end GeForce256 models were made, and without them, it remained a rare card. The arrival of the GeForce2 was supposed to change this and make the DirectX 7 revolution happen. S3 also released a DirectX 7 compliant video card, the Savage2000. It turned out that the hardware lighting engine inside that chip was somewhat broken, and S3 later disabled the hardware transform and lighting functionality in its drivers to avoid rendering glitches. Of course, the DirectX 7 API is backwards compatible, so games built on top of DX7 also work with earlier graphics cards.

Game magazines and the GeForce2

Game magazines advertised GeForce2 based video cards and praised them for their superior performance. It is important to notice that there were usually no benchmarks, and where there were benchmarks, there were no comparisons against older mid-range video cards. The GeForce2 MX cards were supposed to be the new mid-range, and the GeForce2 GTS and PRO cards the new high-end. Yet no matter how hard you look, somehow only the GeForce2 GTS and PRO cards were ever reviewed.

Picture: Hungarian game magazines from 2000 and 2001, featuring the GeForce2 (I have smudged the text)

Suspicious lack of GeForce2 MX tests

The GeForce2 MX cards were released a few months after the original GeForce2 line, as the mid-range gaming solution. Before the GeForce2, there was the short-lived GeForce256, which had no real low-end or mid-range models; the GeForce2 was supposed to cover both the new high-end and the new mid-range. After the original GeForce2 MX, models with the MX200 and MX400 tags were released as well. These cards were supposed to beat the previous generation, but even the naming scheme is scammy. The drivers can't tell the MX200 and the MX400 apart, and shady manufacturers showed up with questionable cards under the MX400 label.

The GeForce2 MX in reality

The GeForce2 MX cards are cut-down models of the GeForce2 line. They have only a 64 bit DDR memory bus, compared to the 128 bit bus of the GTS line. Some MX models are equipped with SDRAM instead; those are supposed to be 128 bit wide, but manufacturers cut corners there too and offered cards with 64 bit SDRAM as well. The GeForce2 MX also comes with fewer pixel pipelines and lower clock speeds. Just from the specifications, we can tell that the GeForce2 MX is around 50% slower than the GeForce2 GTS, depending on the workload. The GeForce2 MX typically has 32 MByte of memory and an AGP 4x capable connector (more on that later). PCI models were available, but they were very rare and expensive.
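
The memory bus alone tells most of the story. The clocks below are the commonly quoted figures for these boards; treat them as assumptions rather than measurements of the cards in this test:

```cpp
// Theoretical peak memory bandwidth from bus width, clock and DDR/SDR.
#include <cstdio>

static double bandwidth_gb(double clock_mhz, int bus_bits, int transfers_per_clock)
{
    // bytes/s = clock * (bus width in bytes) * transfers per clock (2 for DDR)
    return clock_mhz * 1e6 * (bus_bits / 8.0) * transfers_per_clock / 1e9;
}

int main()
{
    std::printf("GeForce2 GTS   (128-bit DDR): %.1f GB/s\n", bandwidth_gb(166, 128, 2));
    std::printf("GeForce2 MX    ( 64-bit DDR): %.1f GB/s\n", bandwidth_gb(166,  64, 2));
    // 128-bit SDR MX variants land at the same ~2.7 GB/s figure.
    std::printf("GeForce2 MX200 ( 64-bit SDR): %.1f GB/s\n", bandwidth_gb(166,  64, 1));
    return 0;
}
```

So the MX starts with roughly half the GTS's memory bandwidth, and the MX200 with roughly a quarter, before the lower core clock and the missing pixel pipelines are even counted.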

The GeForce2 MX is surprisingly cheap

The price of a cheap GeForce2 MX card was around $70, which caught other manufacturers off guard. The arrival of the GeForce2 MX forced other manufacturers, such as SiS and S3, to seriously cut their prices. Matrox moved upmarket, trying to offer professional solutions for multi-monitor owners, without much luck; eventually they had to reposition their hardware. ATi released a DirectX 7 compatible product called the Radeon (later renamed to the Radeon 7000 family) and did a similar thing to nVidia, cutting corners on products to fill the new DirectX 7 mid-range, while the old Rage 128 was repositioned as low-end. This test will not feature a Radeon 7000, because for some reason none of my Radeon 7000 series cards survived to this day (except for one in a late Pentium4 laptop). But believe me, those weren't fantastic cards either.

The disappointment

When I got my GeForce2 MX, I replaced an 8 MByte SiS 6326 card with it. Based on the reviews, I expected at least 10x the performance of my old card. That performance, however, failed to materialize. The PC only had 32 MByte of memory, which also had to be upgraded. The card was very unstable until I found proper AGP drivers, which fixed the stability issues. The performance was far from the advertised one, but the card still outperformed the SiS, so I wasn't too sad. Anyone who already had a mid-range video card and upgraded to the MX was probably disappointed, as the low-end version of the card simply could not deliver the promised speed.

Typical low-end system in 2000

A typical low-end computer in 2000-2001 used the Super Socket 7 platform, with a K6/2 or an overclocked Pentium MMX processor. The K6/2 chips usually came at 350 MHz, and they were roughly on par with the low-end 300-333 MHz Celeron offerings from Intel. Overclocked Pentium MMX processors also reached 300 MHz, with speed comparable to the K6/2 and Celeron chips. The K6/2 and Celeron processors also had good overclocking potential, at least the slower models: a 350 MHz K6/2 could be pushed to 400 or even 450 MHz with a small voltage increase, and 400 MHz Celerons could reach 466 MHz relatively easily. The high-end Pentium3 and Athlon systems already scaled from around 600 MHz to 1 GHz, and the GeForce2 line mainly targeted those, but the owners of such systems could afford the GTS or Pro cards and were not interested in the GeForce2 MX. The GeForce2 MX typically found its place in 300-500 MHz Celeron, Pentium MMX and K6/2 builds.

Hardware bugs in the GeForce2

The backwards compatibility of the GeForce2 was very bad, and all early motherboards were affected. The card liked to throw blue screens of death on ALi chipsets if the AGP drivers were not installed, or if the installed driver was not fully compatible with the buggy GeForce2 hardware. Even some Intel BX chipset motherboards were unstable with the GeForce2, and AGP had to be disabled to avoid crashes. Some motherboards with VIA chipsets simply refused to boot with the card. AGP 4x era motherboards fixed this; the older ones were plagued with compatibility problems when running the GeForce2.

Building the test-system

To build an era-correct low-end system, I decided to use a K6/2 processor. I got one from a friend of mine (Yutani) about two years ago, rated for 500 MHz. I decided to use my Socket 7 system, so I evicted the Cyrix processor from it. The K6/2 is supposed to run at a 100 MHz FSB with a 5.0 multiplier. I decided to overclock this chip, as the K6/2 has no on-die L2 cache (similarly to the early Celeron and the Pentium MMX). My motherboard supports FSB settings above 100 MHz, at 105 and 110 MHz. The computer booted at 525 MHz, but was not stable. The K6/2 chip I use (500AFX) is rated for 2.2V, which I raised to 2.3V. With that the OS ran, but programs did not run properly (gcc crashed after a few seconds). I raised the voltage to 2.4V, which gave more stability, but full stability required 2.5V. Just to achieve a 5% overclock, this chip needs roughly a 14% increase in core voltage. Normally I wouldn't bother with such a small overclock, but by increasing the FSB by 5%, the entire system gets a little extra horsepower, as the memory, the AGP port and the PCI bus also run faster. This 5% should give 1-2 extra FPS, and since some games of this era will hover around 20-25 fps, that can be enough to push the frame rate over the edge of playability.
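
The overclock in numbers, using only the values given above:

```cpp
// Worked numbers for the K6/2 overclock described in the text.
#include <cstdio>

int main()
{
    const double fsb_mhz     = 105.0;   // motherboard's next step above 100 MHz
    const double multiplier  = 5.0;
    const double stock_mhz   = 100.0 * multiplier;
    const double oc_mhz      = fsb_mhz * multiplier;

    const double stock_volt  = 2.2;     // K6/2 500AFX rated core voltage
    const double needed_volt = 2.5;     // lowest fully stable setting found

    std::printf("core clock: %.0f -> %.0f MHz (+%.0f%%)\n",
                stock_mhz, oc_mhz, (oc_mhz / stock_mhz - 1.0) * 100.0);
    std::printf("core voltage: %.1f -> %.1f V (+%.0f%%)\n",
                stock_volt, needed_volt, (needed_volt / stock_volt - 1.0) * 100.0);
    return 0;
}
```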

Other components

The memory is 3x256 MByte SDRAM sticks, adding up to a total of 768 MByte of RAM. This is a lot for this PC, and the motherboard's cache can't even properly cover all of this memory (that's why I overclocked the FSB a little). Most systems of this time had 256 MByte, but some people already went with 384 or even more, so this quantity of memory is not overkill for the era. The motherboard is an Asus P5A-B, paired with an ISA sound card, an IDE optical drive and a hard disk to house the OS.

The test setup:

-Asus P5A-B motherboard

-768 MByte SD RAM

-K6/2 500 MHz @ 525 MHz (2.2 V raised to 2.5 V)

-GeForce2 MX200

-80 GB HDD

GeForce2 MX200/MX400

The video card in this test is a cheap GeForce2 MX200. The card has an extra analog video output on a small daughter PCB attached to it. Luckily, the card has no cooling to worry about: GeForce2 MX200 cards usually had no fans, and some came even without a heatsink. The card has an AGP 4x connector and 32 MByte of memory. It feels and looks very cheap, but by the standards of 2000 this was normal for a mid-range video card.

The latest drivers for this card are borderline useless: they are unstable, fail to run some games, or simply run at around 3-4 fps on such an old computer. Therefore, driver 8.05 from 2001 will be used. This driver is supposed to contain heavy optimizations over the 6.49 driver, while still performing well in Socket 7 systems.

ATi Rage 128

The ATi Rage 128 was a mid-range video card, supporting DirectX 6 in hardware. ATi kept this line of cards updated: this particular card has 32 MByte of memory, and it is even equipped with a fan to preserve the life of the chip. That is needed, because Rage 128 cards like to cook themselves without active cooling. There is nothing special about the Rage 128; it was a typical $20 used card after 2000. Not too slow, but not too usable either. Its two pixel pipelines and 64 bit memory interface are not impressive, but at least they are comparable to the GeForce2 MX.

It is included not because it is from the same era - the Rage 128 is a year older and doesn't support hardware transform and lighting - but because it was easily available, and people probably considered upgrading from this card to the GeForce2 MX. The driver used with the card is from 2001, version 4.13.7192.

3dfx Voodoo3 2000

The 3dfx Voodoo3 was a wannabe high-end card from 3dfx. Once the holder of the 3D crown with the Voodoo1, 3dfx quickly lost ground to its competitors. The Voodoo3 was released in 1999, yet it was not even capable of true-color rendering. The buyers of this card were mostly hardcore 3dfx fans, as it was priced around $250 at release. Cheaper cards with more features crushed it from every direction: S3, SiS, ATi, Matrox and nVidia cards excelled under DirectX and OpenGL in every market segment.

The only thing that saved the Voodoo3 for a while was its Glide API support. Glide was 3dfx's own graphics API, and as it was designed specifically for 3dfx hardware, it gave the card an advantage in the games that used it. Earlier, there had been 3dfx-exclusive games and software supporting only Glide and software rendering. That era was over, however, and in early 2000 people who had a Voodoo3 were already considering buying a GeForce2. The card in this test is a 16 MByte variant (128 bit) and uses the PCI bus instead of AGP. The previous owner attached a cooler to it, which was probably a wise decision, as this card runs very hot otherwise. The driver used for the Voodoo3 was 1.05.00 (August 2000). Driver 1.10, which I initially planned to use, crashed in every game after a few seconds.

Matrox G450

In the previous Socket 7 low-end GPU test, we learned that the Matrox G200 is the king of this platform when using a Cyrix 6x86MX CPU. The G400 is the successor of the G200, released in 1999, and the G450 is the cheaper, low-end version of the G400, released in 2000. With the Cyrix CPU it gave far worse performance than the G200, probably due to driver-related issues (or it might simply be slower than the G200 altogether). The 525 MHz K6/2 will hopefully be enough to feed it properly. The G450 also has 32 MByte of video memory, but its memory bus is only 64 bit.

The card only supports DirectX 6 in hardware, but similarly to the other competitors in this test, it has two pixel pipelines. It requires no active cooling. It seems Matrox was still trying to sell this video card for $125 in 2001, so I am really curious to see how it performs against the GeForce2 MX. Driver 682 was used. This driver is from 2002; an older driver might have given one or two extra fps.

nVidia TNT2 Pro

The TNT2 Pro was a low-end card in 1999, one of nVidia's older mid-range DirectX 6 capable designs. The Pro label is misleading: it is actually a cut-back 64 bit version compared to the real 128 bit TNT2. At least the card is passively cooled. This model is equipped with 32 MByte of RAM. The driver I will use for this card is from the 2.x family, as nVidia slowed these cards down with later drivers to make more room in the market for its GeForce2 cards.

Yeah, nVidia was this shoddy and scammy in the 2000s, deal with it. TNT2 based solutions (mostly the TNT2-M64) were repositioned as the new low-end offerings in 2000. The TNT2 Pro was somewhat slower than the TNT1, at least when it was paired with the Cyrix. This time, the 525 MHz K6/2 will hopefully have the raw performance to push it beyond slide-show territory. The nVidia driver I used was 3.68, from December 1999.

S3 Savage4

The S3 Savage4 was a good low-end/mid-range card. Before anyone laughs at this claim: in the previous retro video card test with the Cyrix processor, this card simply beat all the other competitors, even those from 3dfx, ATi and nVidia. Despite being a single pipeline card with a 64 bit video memory bus, S3 wasn't kidding with this card. They did their homework, and the drivers are properly optimized. The card was a favorite of OEMs as well and was cheaply available around 2000 - I remember these cards circulating at around $15-20. When the card was brand new in 1999, the price was about $100, which quickly dropped after the release of the GeForce2 MX. These cards were also built with 16 MByte of RAM, but unfortunately my model is the 8 MByte version.

A lot of Savage4 owners thought it was time to upgrade to the GeForce2 MX, for various reasons. They expected better compatibility and performance from it. After 2000, the Savage4 started to struggle with newer games, and the company was also bought by VIA. People thought it was time to move to something bigger. We will see whether this was a good idea or not. The driver I used was v82038 from January 2000. Newer drivers caused instability with the configuration I used.

The games

I will mostly test games from 1999 to 2003. Some of these games only require a 200-300 MHz CPU, but some of them will be seriously CPU-limited on the K6/2. I mostly remember the titles I wanted to play on the GeForce2 when I got it, so I will test those games along with a few recommendations. Racing games, strategy, simulation, adventure and FPS games will all get benchmarked.

Colin McRae Rally 2

This was a popular rally game around 2000. The game starts on almost every video card; only the earliest 3D accelerators have serious problems running it. It requires a 300-400 MHz CPU to reach smooth frame rates. The game will be tested at 640x480 in 16 bit color; increasing the resolution usually has a notable impact on performance due to the forests around the road, which eat up the fill rate.

In the first test, the GeForce2 MX200 got massacred by the older graphics cards. The Savage4 was the fastest of all, and the GeForce2 was the slowest. There is barely any difference between the other cards in this test, though; they were all on the edge of playability.

Crusaders of Might and Magic

I have never played this game before. A friend of mine recommended it; I liked the gameplay videos, so I decided to try it out. The graphics look simple enough to run even on first generation 3D accelerators, but of course I can't be sure about the hardware requirements. Even though I have seen one of the episodes, I know nothing about Might and Magic. This is the first one that caught my attention, as it reminds me of an old game I liked. It features a hero who roams an open-world style map, fighting monsters and such. The game was released in 1999.

I got a lot of ideas from this game today. It was an interesting experience to play, and I might continue playing it later on. The Voodoo3 and the Savage4 were the fastest cards, yet none of the cards reached fluid frame rates. This is interesting, because the specifications say a Pentium MMX at 166 MHz should be enough to run it smoothly. In contrast, the game ran between 10 and 20 fps the whole way. The GeForce2 MX200 and the TNT2 were the slowest cards.

GTA3

This was a fantastic game. 3D open-world crime simulators like this were unknown before. The prequel to this game was 2D only, and the new one moved the genre into the 3D era, with great success. I remember playing this game for weeks. It's not perfect, though. The giant world requires areas to be constantly streamed in while driving around the city. This was problematic in the Windows 98 era: Windows 98 does not support multiprocessor systems, every consumer-grade processor was single core anyway, and games of that time rarely streamed data on a background thread. Disk access therefore stalls the video processing, which means small pauses and stutters whenever the scene is being reloaded. This was quite annoying: the slower the hardware, the bigger the stutters. The game on its own is very demanding already and requires a strong graphics card; the minimum CPU requirement is rated at 700 MHz. I have only ever played this game on 450-500 MHz K6/2 and Celeron processors, so I was never able to escape the stuttering.
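
To show why the stutter is so hard to avoid, here is an illustrative sketch (all function names are invented; nothing here is taken from GTA3's actual code) of synchronous streaming versus handing the load to a worker thread:

```cpp
// Illustrative only - just the shape of the streaming problem.
#include <atomic>
#include <chrono>
#include <thread>

static void LoadChunkFromDisk(int /*chunkId*/)        // stand-in for a slow blocking read
{
    std::this_thread::sleep_for(std::chrono::milliseconds(200));
}

static void RenderFrame() { /* draw the currently resident world data */ }

// Variant A: synchronous streaming. Crossing a chunk boundary stalls the whole
// frame for the duration of the disk read - this is the visible stutter.
static void FrameSynchronous(int neededChunk)
{
    static int resident = -1;
    if (neededChunk != resident) {
        LoadChunkFromDisk(neededChunk);                // the frame waits here
        resident = neededChunk;
    }
    RenderFrame();
}

// Variant B: hand the read to a worker thread and keep drawing the old data.
// The hitch shrinks, but on a single-CPU Windows 98 machine the loader still
// steals cycles from the renderer, so some stutter remains.
static std::atomic<int>  g_resident{-1};
static std::atomic<bool> g_loadInFlight{false};

static void FrameAsync(int neededChunk)
{
    if (neededChunk != g_resident && !g_loadInFlight.exchange(true)) {
        std::thread([neededChunk] {
            LoadChunkFromDisk(neededChunk);
            g_resident     = neededChunk;
            g_loadInFlight = false;
        }).detach();
    }
    RenderFrame();                                     // keeps running while the load completes
}

int main()
{
    for (int frame = 0; frame < 4; ++frame) FrameSynchronous(frame / 2);
    for (int frame = 0; frame < 4; ++frame) FrameAsync(frame / 2);
    std::this_thread::sleep_for(std::chrono::milliseconds(500)); // let the detached load finish
}
```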

The game was very slow; no video card reached playable frame rates. The stuttering alone wasn't the problem: the gameplay also ran in slow motion, making the game unplayable. A strong Pentium3 CPU is required by this game, and the hardware transform and lighting pipeline in the GeForce2 couldn't help much either. This time the GeForce2 MX200 came out on top with 7 fps, which still resulted in unenjoyable frame rates, but was approximately 60% faster than the other cards. The Savage4 failed to start the game, as it demanded a video card with at least 12 MByte of free video memory, which my 8 MByte Savage4 obviously doesn't have. The other video cards ran the game at around 4 fps. This is the first game in the test where the GeForce2 MX is clearly faster than the other cards and would make sense to use (if a stronger CPU were present).

MotoGP 2

This game was designed for the Pentium3, and it likes to stutter on everything below 700 MHz. The official readme says it needs a Pentium2 at 450 MHz to run, but it tends to be unplayable on such a computer, especially when there is action on the screen; the CPU really struggles with this title. The game is from 2003, and I remember trying it for the first time around its release. The graphics were very nice, but as it was a stuttering mess, I quickly stopped playing it. The controls are a bit messed up as well; a game shouldn't be like that.

This game was a stuttering mess here too. This time, every card ran at a steady 3 fps. The game simply requires a fast Pentium3 chip, and even the GeForce2 MX can't make any difference. It is unplayable by any means.

NFS Hot Pursuit 2

This game is about escaping from the police. I expected more from it, but it is totally boring; the gameplay is almost nonexistent. It is probably the worst Need for Speed game ever released. Despite being a game from 2002, it at least has relatively modest hardware demands and can run on basically everything. The specifications say it needs a 500 MHz processor, 128 MByte of RAM, and 16 MByte of video RAM. That might put my 8 MByte Savage4 outside of the compatibility list.

The game started on all video cards except the GeForce2 MX. This time, the Rage 128 was the fastest card in the test, yet it was still only able to reach 6 fps. It seems the creators of this game had a Pentium3 in mind, as the 500 MHz K6/2 was not enough to move things around. Another giant fail for the GeForce2 MX.

Unreal Tournament

A popular FPS game, the big competitor of Quake3 - and Unreal Tournament won against Q3, taking almost all of the players. The game was released in 1999, and a lot of sequels followed later. It is another game which can use D3D, OpenGL and Glide. OpenGL is usually buggy, so I will run this game in D3D unless I run into a problem with a particular card. This game was indeed more entertaining than Quake3: better engine, better gameplay, and a very good multiplayer mode. It can even run with software rendering if you have a strong enough CPU. I played it for weeks, and played it with my classmates on school PCs for about a year around 2002.

This time, the GeForce2 MX200 was the fastest card, but it barely exceeded the other cards by 10%. The Voodoo3 was just as bad in Glide as in D3D. The Matrox G450 had less luck with this game than with the others and was nearly the slowest card in the test for some reason. The real loser of this test is the Savage4, though: the S3 MeTaL API didn't work and caused the computer to hang (despite trying two drivers), and the card only reached 12 fps, which is probably due to its small amount of video memory. Throttling down the texture sizes, or using a 16 MByte Savage4, might have fixed these problems.

Quake3

A popular OpenGL based game, which is used more as a benchmark than as an actual game. I remember playing it for a while, but the multiplayer mode was not as interesting as Unreal Tournament's. The game faded very quickly as a result, although a few companies licensed the engine for all kinds of games. The engine wasn't outstanding, and the games using it disappeared within a few years.

Most cards struggled at around 20 fps. It was not a good experience by any means, but it was playable. The Savage4 was the fastest card in the test; the GeForce2 MX only managed second place, together with the Voodoo3. The Rage 128 got very bad results - ATi was not strong in OpenGL at this time.

F1-2002

F1 2001 required a strong CPU, and this is probably true for the sequel as well. The game looks identical to the 2001 version, but the cars were updated for the 2002 season. It wasn't a bad game; I remember playing it for a few days. But it never ran well on the low-end computers of its era, so I only played it years after its release. It will probably be a stuttering mess on the K6/2. The system requirements say it needs at least 32 MByte of video memory.

The game was indeed too slow. Even loading it took long minutes. It refused to start at all on the Savage4, and it also failed to run on the Voodoo3, complaining about DirectX compatibility. The GeForce2 reached 3 fps; the other cards topped out at 2 fps.

Tony Hawk's Pro Skater 3

The sequel in the popular Tony Hawk's series, released in 2001. I have played earlier and later iterations of this franchise, but not the third episode. As it was released after the arrival of the GeForce2 MX, it will be interesting to see how efficiently it can utilize the new DX7 API. I remember playing these games for hours each day, for weeks. The game will run at 640x480 in 16 bit color.

The Matrox G450 won this test at 12 fps. The GeForce2 was the slowest card, only able to pump out 6 fps; the other cards landed in between. The Voodoo3 failed to render most of the textures in this game. It is important to note that the game runs at higher frame rates when less of the level is visible: it basically ran at twice the measured speeds when skating in half-pipes and similar sections. Another big disappointment for the GeForce2 MX200.

Warcraft3

The sequel of the popular Warcraft2 was a huge disappointment. The races are not balanced, the enemy AI doesn't have to bother with human reflexes and attacks you long before you have a chance to properly build your base, and the story and maps are boring. The graphics are not bad, but nothing fancy either. The original 2002 release requires a 400 MHz processor (600 MHz recommended), but this type of graphics and gameplay should have been possible on a Pentium1 as well - and actually it was, barely exceeding 20 fps in most cases. A Pentium3 processor is strong enough for this game; I don't yet know how it will scale with this highly clocked K6/2. The GeForce2 is expected to reach smooth frame rates even in this system, though.

The game was not really playable. Even though it is an RTS, which doesn't need high frame rates, it was not a good experience, as the user interface was laggy. This game really needs at least a Pentium3 to be fluid. All cards managed to reach 6 fps, but the GeForce2 MX reached 8 fps, which means the hardware transform and lighting engine did a good job this time, giving a roughly 33% boost in frame rates. Unfortunately, the game still remains a stuttering mess.

Croc2

The sequel of the famous Croc. The gameplay is less fantastic, and the camera and controls are even worse than in the original. It requires a somewhat stronger computer to run fluidly, but a CPU above 350 MHz should probably do the trick. The game was released for PC in 2000. I don't like its graphics as much as the first one's, even though it seems to use bigger textures and more polygons. In the previous test with the Cyrix, only one video card was able to reach fluid gameplay - the K6/2 will hopefully help the cards exceed 25 fps, if they are fast enough.

The Voodoo3 won this test with 27 fps, but the game was perfectly playable on the other graphics cards as well. The Rage 128 and the GeForce2 MX200 were the slowest at 23 fps, which is another blunder for the MX200.

Screamer 4x4

I have never played this game, but I will give it a try. It was somewhat popular, but back then I felt I was fine with other kinds of car games. I was never a big fan of the genre, but I always wanted to try this one, so on the occasion of this test I will include it and play around with it a bit. The game was released in 2000 and can use OpenGL and Glide as well. It requires a 233 MHz CPU, so the system I am about to throw at it should be fast enough. I don't know what to expect, but I think it will run fine on most of the cards.

I disliked this game. The camera is bad and the maps are boring. The Glide mode didn't work on the Voodoo3. The GeForce2 MX200 was the fastest, reaching 11 fps; the Savage4 came second with 9 fps. The Rage 128 was the slowest with 5 fps - its very weak OpenGL performance strikes again. This time, the hardware transform engine helped the GeForce2 gain 10-20% over its competitors.

Mortal Kombat 4

The first 3D Mortal Kombat game was released in 1998, and the graphics are quite nice. It was not as entertaining as the previous episodes, but the engine seems well optimized and the game is minimalistic. It only requires a 133 MHz CPU with a suitable 3D accelerator; interestingly, it even supports 2 MByte 3D cards. You may ask what the purpose of including such an old game in this test is - the reason is that I was curious how well the GeForce2 MX would behave with such a game, in terms of compatibility and performance.

The 3D engine in Mortal Kombat 4 is superb; the programmers were among the best of their time. It ran on all graphics cards at a steady 60 fps, without any stuttering or bugs.

3DMark 2001

This was a very popular benchmark program, focused on measuring the 3D performance of a system. I normally don't care about benchmark programs, as they don't say anything about the real world performance of a system. Yet this time I will use it, to see how accurately it can measure the performance of the GeForce2 MX compared to the other cards. As I write these lines, I don't know what to expect, or how accurate this program will be on this system. It was designed to measure a wide range of graphics cards; in theory it should even work on very old DX5 compliant graphics cards, and it should also be able to accurately measure newer video cards from the early 2000s.

The benchmark always crashed on the Savage4, so I was not able to get a score for that card. The rest of the cards scored around 400 points, and some got identical ratings (the G450 and the Rage 128 both got 443 points). The GeForce2 MX200 got the worst rating: it only managed 399 points, despite being able to run more of the game tests than the older cards.

Conclusion:

The GeForce2 MX was a scam. Of course it was a working video card that could play games, so it was not a scam in that sense, and it was able to outperform the integrated graphics of the era as well. But the marketing of the GeForce2 MX series involved fake reviews and organized scam campaigns. The GeForce2 MX200, an $80 card in 2001, is on par with mid-range video cards from 1998. The $80 GeForce2 MX is in reality competing with $10 Savage4 cards, and nobody seems to mind.

The GeForce2 MX would need a strong Pentium3 CPU to unleash its performance, but even then it only outperforms these older cards by a few tens of percent. The compatibility of the card is also far from perfect; some games refuse to start. Early drivers are unstable, and the much newer drivers that replace them are slower than the early ones.

Those who had enough money for a high-end Pentium3 or Athlon CPU could likely also afford a high-end GeForce2 card (such as the GeForce2 GTS). The GeForce2 MX is not a good card by any means, though there are worse cards in this list as well. Its biggest problem is that it is usually 1-2 FPS short of reaching 25 fps in games where the other cards can push out 26-27 fps.

The Rage 128 is held back mainly by its very weak OpenGL drivers; otherwise it trades blows with the GeForce2 MX, sometimes faster and sometimes slower.

The Savage4 (at least the 8 MByte edition) failed to start some of the games, but where it works, it usually outperforms the GeForce2 MX.

The Voodoo3 is the fastest card overall, but it simply refuses to start a few titles, and it doesn't support 24 or 32 bit rendering.

The TNT2 usually performs on par with the GeForce2 MX200 in this test. Of course, this is a mid-range TNT2 card; a TNT2-M64 wouldn't have done so well.

The Matrox G450 is the winner of this test. It started everything, usually outperformed the GeForce2 MX200, and where the other cards hovered around 22-23 FPS, the G450 usually managed to reach or exceed 24 fps.

There are, however, no real winners here, only losers. The K6/2 is not strong enough, and 640x480 in 16 bit is not a high enough resolution to unleash the potential of cards with more memory and memory bandwidth - but in the early 2000s, low-end PCs still had monitors running at 640x480, or 800x600 at most.

The GeForce2 MX200 made no sense for a gamer, and it had nothing to offer for someone using low-end hardware.

PS: The GeForce2 died after I finished the benchmarks. It started displaying white lines and corruption on boot. The GeForce2 generation arrived after manufacturers started using new BGA chip packaging and soldering technologies - these cards die easily.
