
Gamer graphics cards for $10

Written by Geri, 6 months ago

I decided to compare $10 gamer graphics cards. The point of this test is to see the potential of cheap second-hand graphics cards: to observe whether they can run relatively modern games, and to measure what speeds they can achieve. For this test, I collected various cards on second-hand markets. I was not searching for anything special, or for some unique solution that performs decently; I simply bought a few cards from various generations. This should show which video card generations are even compatible with relatively modern video games, as I have collected cards from the DX9 era through the DX10 era, up to DirectX 11.

To collect these cards, I simply bought up the cheapest video card offers in my circles. I didn't want to buy the cards one by one, as the postage fees would have been bigger than the price of the cards themselves. I tried to buy at least 3-4 cards from each person; if that was not possible, I also bought other things from the seller, if he had something I needed. After a few months of hunting, I had gathered enough video cards to do this test.

Challenges when starting up modern games

Modern video games frequently use the DirectX 11 graphics API, or sometimes OpenGL 3. DirectX 11 is backwards compatible with DirectX 10 and DirectX 9 capable cards; however, various features will be disabled, and very modern, long pixel shader code is not supported on older cards. This means the game developer must carefully adjust the code to be compatible with older video cards.
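The fallback behavior described above boils down to asking for the newest feature set first and walking down until the hardware accepts one. Here is a schematic sketch of that selection logic in Python; the function and level names are illustrative only, while real code would pass such an array to D3D11CreateDevice:

```python
# Schematic sketch of DirectX 11 feature-level fallback: try levels
# from newest to oldest and take the first one the hardware supports.
# (Illustrative; real code hands a feature-level array to D3D11CreateDevice.)
PREFERRED_LEVELS = ["11_0", "10_1", "10_0", "9_3", "9_2", "9_1"]

def pick_feature_level(supported_levels):
    for level in PREFERRED_LEVELS:
        if level in supported_levels:
            return level
    return None  # nothing usable: the game refuses to start

# A DirectX 9 card with Shader Model 3 tops out at level 9_3, so long
# SM4/SM5 shaders must be swapped for simpler variants by the developer:
assert pick_feature_level({"9_1", "9_2", "9_3"}) == "9_3"
assert pick_feature_level(set()) is None
```

A game that skips the lower entries of such a list is exactly the kind that will refuse to start on the older cards in this test.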

In the case of OpenGL, there is the traditional OpenGL context, which is based on the ancient OpenGL 1.0, and the newer features of the OpenGL 2 and OpenGL 3 versions are exposed as extensions. It is possible to write an OpenGL based game which scales back even to OpenGL 1.0 and 1.1 capable video cards, while also supporting OpenGL 2.0, 3.x, and even 4.x features. This, however, requires careful design of the game engine, to detect and use only the extensions that are compatible with the given card.
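The extension detection mentioned above amounts to scanning the space-separated list the driver reports via glGetString(GL_EXTENSIONS). A minimal sketch of the matching step (in Python for brevity; the extension names are real, the helper function is my own illustration):

```python
def has_extension(extension_string, name):
    """True if `name` appears as a whole token in the space-separated
    string a GL driver reports via glGetString(GL_EXTENSIONS)."""
    return name in extension_string.split()

# Naive substring search is a classic bug here: it would report
# GL_ARB_texture_compression as present when only the longer
# GL_ARB_texture_compression_bptc token is advertised.
exts = "GL_ARB_vertex_buffer_object GL_ARB_texture_compression_bptc"
assert has_extension(exts, "GL_ARB_vertex_buffer_object")
assert not has_extension(exts, "GL_ARB_texture_compression")
```

An engine that does this check per feature, with a non-extension fallback path, is what lets one binary scale from OpenGL 1.1 up to 4.x hardware.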

Modern graphics cards

Right now, modern video cards typically support DirectX 12 and OpenGL 4. AAA-class game engine developers sometimes use these modern graphics interfaces. However, developers sometimes create fallbacks to support older cards with older APIs, such as DirectX 9. I will test games I like; I haven't researched beforehand how they will behave, or whether they are even compatible with the video cards I will throw at them - so the test will be a huge surprise even for me.


Older DirectX 9 based video cards use a different rendering methodology: they have separate vertex and pixel shader units to display the geometry. nVidia, however, released its first DirectX 10 compatible video card family, the GeForce 8, with a new unified shader technology, where the shader units are no longer separated by role, and all of them are capable of performing all types of operations. Supporting the older DirectX 9 based cards alongside this type of rendering requires a lot of effort, and it will be very interesting to see how many games will be compatible with these pre-2006 cards.

Hardware incompatibility

After 2005, manufacturers switched from AGP to PCI-E. This test, therefore, will not feature any AGP cards - most would have been too slow to do anything with modern games anyway. The video cards used in this test mostly use the PCI-E slot; however, there is one PCI card which I will also test, and an integrated Intel chip will be tested as well. PCI-E cards are mostly backwards (and forwards) compatible, so they can be used in a wide range of motherboards without too many compatibility issues.

The test system

I was wondering for a while what kind of motherboard and CPU I should use. Originally, I planned to do the test with a Core2Quad based system, but I accidentally broke the clip on the cooler, and I couldn't find a screw small enough to hold the cooler in place. So I decided on an Intel i3 based test configuration, which already had a cooler and memory installed, and which offers more modern instruction set extensions - that will convince some of the more modern games to even start up on the system.

The configuration

Motherboard: Asus P7H55

CPU: Intel i3-530

Memory: 4 GByte DDR3

HDD: 500 GByte SATA

The video cards

nVidia GeForce FX5200 PCI

That's right, it's a PCI card, not a PCI-E model. I have had this card in my collection for more than a decade. The AGP version was more popular, but of course, nowadays we don't use AGP ports any more. The motherboard I used still has a PCI slot, so I decided to give this card a run. The FX5200 PCI has 64 MByte of video memory and a dual-pipeline DirectX 9 based core. I have downclocked this card by 20% to preserve its life. Normally, this card requires ForceWare drivers in the 40.x range to work without problems; newer drivers produce a lot of glitches with this card in most motherboards, probably due to the PCI bus. Unfortunately, Windows 7 64 bit is too new for this card, so even the oldest available drivers are quite new.

After installing the card, Windows didn't want to boot up; I had to start the computer in VGA mode. For some reason, the card worked without problems after that. I used the latest Vista 64 bit drivers to make the magic happen. Even when released, this was a quite weak card; it's good for retro gaming in a P3 system, but certainly useless by modern standards. In theory, however, even DirectX 11 games should be able to initialize with this card (as DX11 is backwards compatible), and they should work (without the modern features). The card supports OpenGL 2.1 with the newest drivers.

ATi Radeon x740XL

This card is a PCI-E clone of the Radeon 9700/9500. It has a first generation PCI-E slot and 8 pipelines compatible with DirectX 9 (Shader Model 2). The memory size is 128 MByte. The 128 bit memory interface offers less bandwidth than a 9800PRO, but at least it requires no extra power connector. The cooler on the card is nice and big, which signals good quality. Even though this is a PCI-E card, it's still just an incarnation of the early ATi cards from around 2003.

The 9500/9600/9700/9800 cards were very famous and impressive back when they were released, but they became obsolete within just a few years. I'm curious to see how well this one holds up with modern titles, if it can even start them. The card supports OpenGL 2.1 with the newest drivers. I got this card for $2 a couple of years ago; nowadays, it would be more expensive.

ATi Radeon x1550

This card was released around the time when AMD acquired ATi, who by then barely did anything beyond renaming their old xXXX line. The X1550 has only 4 pixel pipelines (half of the x740), which doesn't promise much in terms of performance. At least this is a 128 bit card, paired with 256 MByte of DDR2 memory, so the specifications aren't too terrible. It can be imagined as a PCI-E version of the Radeon 9600, which is still just a DirectX 9 card.

As a little fine-tuning, this card supports Shader Model 3.0, which can give a little extra compatibility with some games; we will see if any game is willing to run on this but not on its predecessors. The card looks good in pictures, but in reality the cooler feels very cheap. The heatsink moves and wobbles, so I strengthened it by adding some strings. Unfortunately, this card is dying: sometimes it produces artifacts and crashes under heavy 3D, even though I downclocked it by 30% to try to extend its life. The card supports OpenGL 2.1 with the newest drivers. The card was $2 about 3 years ago.

nVidia GeForce 8500GT

The nVidia GeForce 8500GT was meant to be a midrange member of the GeForce 8000 line, nVidia's first DirectX 10 capable GPU lineage. With the 8xxx line, nVidia entered the era of unified shaders, where there are no dedicated pixel or vertex shaders any more - the card is built from small processors able to handle rendering more efficiently, and the video cards from that time can also work on general purpose algorithms. Thanks to this, the 8800 cards achieved a lot of success, and became the de facto standard of gaming upon their release.

The same is not true for the 8500GT, which is more of a low-end card than a high-end one. My model looks very similar to the x1550, but at that time this was simply Gigabyte's main style, and the two cards have nothing to do with each other (except for the manufacturer). The card got 256 MByte of DDR2 memory attached to a 128 bit memory interface. This should have been enough for the era of the card; however, the number of shader units and texture processing units was also cut by more than 75% compared to the high-end models, which screams super low-end. The heatsink of this card is less wobbly, so Gigabyte indeed fixed that problem, but the cooler still seems too cheap. I got this card for about $4 or $5 a few months ago.

Intel HD Graphics (x3xxx/i3)

After the release of the first DirectX 10 cards by nVidia, an unexpected competitor appeared on the scene. Intel wasn't a new kid on the block, as it had integrated video chips on the market since the 90s. In the early 2000s, they slowly followed the trends with usually very weak graphics chips. They lacked features, typically running one or two API generations behind. They used the system memory, which back then was single channel SDRAM, capable of delivering only 100-200 MByte/sec, which was not enough for any fancy real time 3D. When the LGA775 era arrived (Pentium D), they released an integrated DirectX 9 compliant chip family, the GMA900/950. These chips were only on par with the first generation of low-end DirectX 9 chips from ATi and nVidia, released years before, and no one expected Intel to release a usable DirectX 10 chip in the near future. They were wrong. Within just two years, they came up with the DirectX 10 compliant GMA 3000 family, which they quickly redesigned as the 4500/x4500 graphics line for their newer motherboards. These IGPs were usually part of the motherboard chipset; later on, they moved them into the CPU itself (first gen Intel i3). There are probably not too many differences between the chipset-based and the later on-chip solutions; both should perform nearly identically.

The GPU is simply called Intel HD Graphics; in reality it is an on-chip version of the x4500 with 10 shader units and 4 texture units. It supports DirectX 10 and OpenGL 2.1. It uses the memory sticks of the system, so if two sticks run in dual channel, the GPU effectively sees a 128 bit DDR3 memory interface, which at that time was on par with mid-range graphics cards. The drawback is that the CPU also uses the memory, so the GPU has to share the bandwidth. I haven't tried any of the first generations of Intel's DirectX 10 IGPs, so I don't know what to expect. At least this chip is free, as it's integrated into the CPU itself. To use the integrated graphics of the i3 processors, a compatible motherboard is needed, where the monitor connector is wired to the IGP. Not all Intel chips of this era have an integrated GPU, so don't be surprised if your fancy new i7 fails to boot in an LGA 1156 motherboard without a graphics card.
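The shared-bandwidth point can be put into rough numbers. Here is a peak-bandwidth calculation, assuming DDR3-1333 sticks (the fastest speed the first-generation i3 memory controller officially supports; actual kits and achieved bandwidth will differ):

```python
# Peak bandwidth of dual-channel DDR3-1333, shared between CPU and IGP.
# Assumption: DDR3-1333 modules; adjust transfers_per_sec for other kits.
transfers_per_sec = 1333 * 10**6   # effective transfer rate (1333 MT/s)
bytes_per_transfer = 8             # one 64-bit channel moves 8 bytes
channels = 2                       # dual channel = 128 bits in total
peak = transfers_per_sec * bytes_per_transfer * channels
print(round(peak / 10**9, 1))      # ~21.3 GB/s peak, before CPU contention
```

On paper that is mid-range territory for the era, but every byte the CPU reads comes out of the same budget, which is why IGPs of this kind rarely reach their theoretical numbers in games.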

nVidia GeForce 9500GT

nVidia quickly tried to increase the performance of the mid-range, so they released the 9500GT before AMD was able to react and fill the market gap. The chip got a die shrink, and the number of shader units was doubled (from 16 to 32). The card still has a 128 bit memory interface, but at least it supports both DDR2 and DDR3 memory chips.

My card has 1024 MByte of DDR2 memory (so it's the cheaper version), and it no longer has a VGA connector, which is a little bit of an annoyance for me. The card feels like a quality build after all; nothing wobbles or tries to fall off, and luckily it has no external power connector. The cooling seems good quality, and the card is heavy. The card is advertised to have a good 60-70% performance uplift over the 8500GT; we will see whether that is true. This card was about $3 this spring, so it can't be a bad buy.

nVidia GeForce 8800GT

The legendary GeForce 8800GT is the die-shrunk version of the original 8800GTS/GS line. The variation among 8800 cards is quite big: there are cards with 1024, 768, 640, 512, 384, 320 and 256 MByte of video RAM, built with 160, 192, 256, 320, and 384 bit memory interfaces. The unified shader cards were a market success regardless, but manufacturing was too complex and costly, and the 8800GT was released to fix that. As nVidia settled on the 8800GT and scrapped the rest, they finally decreased the power consumption, switched to a single extra power connector, and single slot cards based on this chip became available. My card was originally a dual-slot ASUS card; I replaced the cooler with an XFX cooler to mod it to single slot. I have also downvolted and downclocked the card by 30% to preserve its life.

Not that I bought it for a lot: I got it for about $10, but nowadays this card is not widely available and can be more expensive (it still fits the test, as I technically got it for $10), so I am lucky I modded it. Unfortunately, the card has some sort of bug with some DVI to VGA converters, so I either have to use it through DVI, or use a converter and get a strange signal and colors - but this is not a hardware issue, just a compatibility problem. The card feels very high quality and cheap at the same time: there is a cooler on the VRM which flexes when I simply touch it, which feels very cheap, but otherwise the components on the card signal good quality to me. This card has a 256 bit memory interface (the only 256 bit card in this test) and 512 MByte of memory. It also finally supports the new PCI-E 2.0 standard. nVidia initially messed up their PCI-E 2.0 implementation, so the video BIOS on the card has to be upgraded if someone wants to use it in older motherboards; otherwise the computer won't even POST. ASUS probably patched this card out of the box, limiting it to PCI-E 1.0 with a GEN1 BIOS to avoid RMAs. There is something in this card that, even 17 years after its release, tells you there is maybe some performance still left in this toy. The card (similarly to other models in the 8000 and 9000 lines) initially supported OpenGL 2.1, which was upgraded to OpenGL 3.2 and then 3.3 with the last drivers.

AMD Radeon HD4350

AMD's answer to the GeForce 8000 family was the Radeon HD2000 family, which quickly turned out to be a giant failure, even though AMD instantly switched to unified shaders and DX10 itself. After the short-lived HD3000 line (which was also released for the AGP port to gather some sales there as well), they released the HD4xxx family, which was supposed to be a proper competitor for the GeForce 8000 and 9000 lines (and the newly released GeForce 200 as well). The HD4350 was designed to be a low-end, low-power card with passive cooling.

No extra power cords are needed, and the cards are very small and compact. The card can be built with DDR2 or DDR3 memory; sadly, the memory interface is only 64 bit, which ultimately exiles this card to the low-end. Some motherboard manufacturers used a stripped version of the chip under the name Radeon 4200 as an integrated solution (the desktop version had 80 shader units, the integrated one only 40), which I remember as being too slow to be playable even at 800x600 in any game. Hopefully, the dedicated video card will be able to meet my expectations, which aren't too high for this card. The features are okay on paper: DirectX 10 and OpenGL 2.1 (OpenGL 3.3 with newer drivers) should make this a card that can run some older titles, despite the 64 bit memory interface. My card is equipped with 1024 MByte of memory, but models with 256 MByte or 512 MByte also exist. The 1 GB model seems pointless regardless, as the memory bandwidth of this card is barely bigger than the bandwidth of the PCI-E bus. AMD probably meant this card for multimedia PCs and low-end desktop solutions, but the build quality is not even that bad; they didn't spare too much on components. I don't remember how much I paid for this card, but it was around $5 this April. The card has a video-out plug, and a jumper to set the video signal to PAL or NTSC - that's something you don't frequently see even on 90s graphics cards.

AMD FirePro v5700

Another card of this generation is the FirePro V5700, a professional card for usage in CAD, CAM, modeling, and the healthcare industry as well. It's based on the AMD Radeon HD 4650, but AMD released more modern drivers for it as time went by. In reality, these cards don't differ from their desktop equivalents besides the drivers, as CAD software stopped differing from game-type and office-type software in the mid 2000s. The V5700 at least offers a 128 bit memory interface and a temperature-controlled active fan. The V5700 was meant to use DDR3 memory chips, so I'm hoping my model also has those. The number of shader processing units is more than 5 times higher than on the previously mentioned model, and the number of texturing units is twice as high.

The overall memory bandwidth is also more than twice as high, so this card should be a more potent solution. Besides a single DVI port, the card features two DisplayPort connectors, which was AMD's favorite connector type at that time, but it was not widely used, and was supplanted by HDMI not much later. The card has a good quality, copper based heatsink, and the fan rotates at a low speed once the system boots up. Suspiciously low. The card feels heavy due to this heatsink, and I am wondering what I can expect from it. AMD had plenty of time to fine-tune the performance in this era; the specifications are not terrible, but not outstanding for a card like this. This was also a $5 card I got a couple of months ago. I am not sure whether the card will perform around the much older 9500GT, around the 8800GT, or around the HD4350; a lot will probably depend on the microarchitecture and the drivers.

nVidia GT610

This is the only DirectX 11 capable card in the test. There could be some DirectX 11 games which are not compatible with DirectX 10 video cards. DirectX 11 is designed to be backwards compatible; however, some game developers refuse to support older video cards, and if some features are missing, they refuse to support the card at all. If I encounter such games, the GT610 will be able to run them, while the other cards in this test will fail. The GT610 was also a $3 card, which is suspicious for a DirectX 11 card. If we look at the specifications, we can see why: the card only has a 64 bit DDR3 memory interface. It is still being sold brand new in webshops, in both PCI-E and PCI versions (the more powerful models of the 600 line are long gone).

The card has both the new HDMI and the old VGA connector, which makes the life of the user more comfortable. The memory bandwidth is only 15 GByte/sec, which is even slower than the CPU memory bandwidth of a 2010s PC. The card is available in 1 GByte and 2 GByte versions. I have the 1 GByte version, but I don't think it makes much of a difference with this weak memory bandwidth. Besides DirectX 11 support, the card also offers OpenGL 4.6 support. When I first touched the card, it felt super low quality: the heatsink is cheap plastic, making the card very light. The card is a half-height model, so it can be put into media players and home theaters. That's one of the intended usages of the card, as it has built-in hardware decoding acceleration for Blu-Ray video playback. This is more of a multimedia accelerator than a 3D accelerator card, for sure. I am not sure what to expect from it. The chip has only 48 shader units and 8 texturing units, which is even somewhat comparable to the former flagship 8800GT, but the memory system is so weak that it will not be able to unleash this potential performance under any circumstances. Or will it? After holding it in my hand, I am sceptical of this card.
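To see why the memory system caps the chip, compare peak bandwidths derived from bus width and transfer rate. The clocks below are the common reference values (1800 MT/s memory on both the GT610 and the 8800GT); board vendors vary, so treat the results as ballpark figures:

```python
# Peak memory bandwidth in GB/s from bus width and effective rate.
# Assumed reference clocks: GT 610 64-bit DDR3 at 1800 MT/s,
# 8800 GT 256-bit GDDR3 at 1800 MT/s (vendor boards may differ).
def bandwidth_gb_s(bus_bits, mtransfers_per_s):
    return bus_bits / 8 * mtransfers_per_s * 10**6 / 10**9

gt610 = bandwidth_gb_s(64, 1800)      # ~14.4 GB/s
gf8800gt = bandwidth_gb_s(256, 1800)  # ~57.6 GB/s
print(gt610, gf8800gt)
```

Same transfer rate, a quarter of the bus width: the GT610 feeds a comparable shader count through roughly a quarter of the 8800GT's bandwidth, which is exactly the bottleneck suspected above.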

Data summary

Cards I would have loved to test

There are other $10 video cards on the market from other manufacturers; I was just not able to find any of them right now. The western market is dominated by nVidia, AMD and Intel, but Eastern Europe and Asia have far more than these. I will mention them here, and in the future I might test those as well.

One of the main competitors of these is S3. S3 was creating graphics cards till the early 2000s, then switched to integrated graphics. VIA acquired them, and since then they have been making video chips for VIA. They released a few unremarkable and rare desktop graphics cards as well, but the built-in chips are more popular. They switched to DX10-compatible chips after the Core2Duo era.

Matrox is a professional card manufacturer known for their P models. The original Parhelia was released for the AGP port, the later P models for the PCI-E port, and they got OpenGL 2.1 support afterwards. When I was about to buy these cards, they were available for $10, but I was too slow, and someone else bought them instead.

3DLabs is a former CAD card manufacturer that sometimes also released video cards for end users and OEMs. They had a small market share in the past decade; nowadays they have stopped making graphics cards and switched to processors instead. Their latest video cards are OpenGL 2.0 compatible.

SiS also switched to integrated graphics in the 2000s. They went on till DirectX 9, then suddenly cancelled their DirectX 10 based chipsets and switched to the ARM and MIPS market instead.

PowerVR is an IGP manufacturer which licenses its cores to other corporations. PowerVR cores can be found in some cell phones, some Intel Atom processors, and also in some Chinese and Russian processors. They support OpenGL 3, and DirectX 10/11 on desktop.

Zhaoxin is a Chinese CPU company which bought its architecture from VIA. They redesigned the core later on, and added their own DirectX 10 compatible video chip inside the processor. Rumors say it's a very slow chip, but as it's mostly used in office-type desktop computers, laptops, and NAS devices, where graphics acceleration is barely used, they don't have to worry too much about the performance.

The games

I decided to test games based on how much I liked them. I don't care what games others would deem more test-worthy, or what others usually play; I chose only games I like. There are newer and older games in this package. All games in this list are newer than 2010, and the newest are from 2022. I haven't checked what graphics APIs they use, or what the minimal GPU they are officially willing to start on is. I am throwing these graphics cards at these games for the first time.

Yandere Simulator (build 2022 March 1)

Yandere Simulator is a well known indie game. The developer is hated for his success by envious people who have never achieved anything themselves. Some of the criticism he receives is legit though: instead of learning the basics, he decided to use the Unity joke game development system, and the code quality is generally low and unoptimized. To be fair, this is basically the only mention-worthy game ever made in Unity. Performance issues are nothing new for Unity based games. Let's see how this game runs.

Yandere Simulator is indeed capable of running on every video card in this test, including the DX9 era cards. On those cards, it's totally unplayable, hovering around 1 fps all the time. The first surprise is the integrated Intel HD Graphics in the i3, which keeps up with the 8500 GT easily; both of them are borderline unplayable at 5-7 fps. The 9500 GT can indeed push almost twice as many frames as the 8500 GT, but it's still not enough for this game. The HD4350 scores lower than the 9500GT, but higher than the 8500GT. The GT610, the newest card in the test, offers a disappointing performance, being barely faster than the 9500GT. The GeForce 8800GT (despite being downclocked) and the FirePro (Radeon) V5700 finally achieve playable frame rates, but there are places where both cards sweat heavily to maintain playability. It is worth noting that the nVidia cards needed the latest drivers; otherwise, they produced no picture in this game.

Hyperdimension Neptunia U

Hyperdimension Neptunia U is more like an interactive Japanese cartoon than a traditional video game. The game plays like a visual novel, where you have to read a lot of text, then organize a team and do battles in hack-and-slash style. The first Neptunia had multiple bugfix releases; it was generally the Trojan horse of this genre. The graphics are very good, and the game is not as bad as it sounds. There are giant bugs they were not able to fix: for example, I was not able to force it to recognize my joystick (the menu element was greyed out due to a compatibility bug), which is a huge problem, as this game is meant to be played with a gamepad, not a keyboard.

Neptunia was totally fluid on every graphics chip - except on those where it refused to work at all. With the FX5200 there was simply no picture, and Neptunia crashed outright with the x740XL and x1550 cards. So once again, a big no for DirectX 9 cards. The HD4350 did not max out the VSYNC limit of the system (VSYNC cannot be disabled in Neptunia), which could indicate a weaker card, or a buggy driver.

Fairy Fencer

This game is built on the Neptunia engine as well. The battle system got more modern and was refocused to be more round-based. It's similar to Neptunia, but uses a newer incarnation of the engine, and this time it detects the joystick properly. The rendering engine uses a more modern implementation as well. The game is bigger than Hyperdimension Neptunia U, and it requires a more robust graphics card - but it actually looks worse.

This game refuses to even start on DirectX 9 graphics cards. The i3 HD Graphics and the HD4350 give results between 10 and 20 fps. The game is still playable like this, as this type of game doesn't require high frame rates, but it's not fun to play at that speed. The 8500 GT barely makes it above 20 fps; the rest of the cards are okay. The winner is the 8800GT again, which is the only card to get vsynced at 60 fps (as this is Neptunia based, VSYNC kicks in regardless of the settings).

Mad Riders

This is a racing game where people ride quads on extreme tracks. I wouldn't say it's a super interesting game, but it's not bad either. The graphics are okay, but nothing too fancy. I could imagine playing it if I were really bored.

The FX5200 and the x740XL are SM2 based DX9 compatible graphics cards, and they are unable to start this game at all. The x1550 is able to start it, but only reaches 5 fps, so it's totally unplayable. The i3, the 8500GT, and the HD4350 are between 10 and 20 fps again, and in this game that is a big problem - it is not playable on these cards unless you are willing to pull the details and resolution down to very ugly levels. The 9500GT outperforms the 8500GT by almost 2x, reaching about 30 fps. The GT610 is around the performance level of the 9500GT, also keeping the game around 30 fps. The V5700 exceeds 40 fps, and the absolute king is the 8800GT, which pushes this game to 80 fps (it would be far more, but the card is downclocked).

Burnout Paradise

I don't even know why I chose this game for this test. It's not a good game, so probably some ad video made me download and try it. As I already had the game, I felt like I would measure it anyway. The results were surprisingly good.

The DirectX 9 cards were not able to run this game. The FX5200 and the x740XL crashed due to missing SM3 capabilities. The x1550 should have been able to run it, but the game caused a system crash; not even the num lock reacted any more. The i3 HD Graphics was not able to render the game fluidly. The other cards passed the test, and the experience was good. The HD4350 was the weakest among the newer graphics cards, and the GT610 this time only rivaled the 8500GT. The 8800GT, the 9500GT, and the V5700 vsynced out at 60 fps.

Little Witch Nobeta

A friend recommended this game to me. If I had known this is another Unity game, I wouldn't have bothered to get it. This looks like some sort of magic platformer, but there are not too many things that can be done, just walking around the scenes with WASD (quite typical). The game needs a strong machine to run, and the cards did not perform well with this title.

Pupuya Games - the fake-Japanese company name that made me cringe a bit

The game crashed with the x740XL card, and the FX5200 produced a nice BSOD when attempting to run it. The x1550 was able to start the game, but the performance was only 2 fps. The Intel i3 HD Graphics also only achieved 2 fps, and the Radeon HD4350 reached three whole fps. The game simply crashed on the 8500GT with a strange cryptic error ("oops, the game crashed"). The 9500 GT reached 5 whole fps. The remaining cards were between 10 and 20 fps: the GT 610 finally performed well, outperforming the 9500GT by 2x; the V5700 reached 13 fps; and the underclocked 8800GT was around 18 fps. The game was not fun to play on any of the GPUs. There is no reason for it to be this slow, as the graphics are not too nice:

Interestingly, the game was full of bugs: the camera and the animations jumped twice as far on every odd frame, causing a strange seizure-like effect. Developers, please stop using joke game scripting environments, and learn the basics. There are no words for this.

IL-2 1946

This game was initially made around 2006-2008; this is a re-release from 2012. The game is a very famous Russian combat flight simulator, featuring Russian, German, Japanese, and American airplanes from the era. This new release defaults to OpenGL, so I used that, but in theory the game also supports DirectX.

The game started and worked well on all of the cards - except the FX5200, which crashed after a while due to a driver bug. The FX5200 PCI works well only with 44.x drivers for Win9x/XP, but those super old drivers don't exist for Windows 7, which makes this game problematic on that card. The rest of the cards achieved at least 30 fps. The i3 HD Graphics was the slowest, with the HD4350 following. The x1550 and x740XL were also able to run the game at nice frame rates, as this is a very old graphics engine. The rest of the cards seemed to hit the limits of the CPU and system bus, topping out around 70 to 110 fps.

Neptunia Rebirth 3

Compared to the first Neptunia U, this is more like a traditional game, focused more on battle than on story. The battle system is quite similar to the one used in Fairy Fencer. The joystick bug is fixed here as well; however, I was not satisfied with the default button settings. This game is also far bigger than its predecessor, but looks far worse at the same time. It's more fun to play, but the graphics are boring and lifeless. This time, the performance falls dramatically, and the cards struggle to run this game.

The integrated Intel HD Graphics and the HD4350 didn't give out any picture; it's a mystery why. The FX5200 simply crashed. Funnily enough, they have fixed the problems with some DirectX 9 cards, and now the x740XL is able to start the game - at 1 fps. The x1550 achieves 4 fps, probably thanks to its more modern SM3 architecture. Still anemic. The 8500GT can pump out 9 fps, but the rest of the cards performed very well: the 9500GT, the GT610, and the V5700 ran at around 30 fps, and the 8800GT ran the game at 66 fps (I was able to turn off VSYNC this time - they have finally fixed it).

Nier Automata

That's a game I decided to try only because I keep getting memes about it. I expected robot girls and such. Maybe that would be the case if someone played the game long enough, but after 5 minutes I was still at this screen. This is supposed to be the intro. Yes, it looks like this all the way:

Modern game development is indeed a joke, and I got very angry at this point. Also, the game refused to run on anything except the GT610. On the rest of the cards, it returned a DirectX error. On the 610, it was able to achieve 9 fps.

Other games i attempted to try

Lacrimosa of Dana: As the newer iteration (YS IX) simply refused to start (it displayed an error code, and nothing else), I decided to put YS VIII to the test. The installer literally took more than one hour to finish, which already says a lot about the likely quality of the game (quite clearly something that belongs in the trash).

I am not in a panic, I just don't think you guys should write games. I closed the process and shift-deleted the game.

Flatout Carnage: Live initialization failed. Live is some type of networking manager by Microsoft. I can imagine how the development of this game went: oh, I am too lazy to spend 40 minutes learning how to use connect, send, and receive. I'll go with some proprietary joke library that will be removed from the market within a year, and requires 10x the effort to maintain the code instead!

Blue Reflection is a game that crashed on every video card. On pre-DirectX 11 cards, it just crashed; on the DX11-capable 610, it displayed a DirectX error, ironically.

At this point I ended the test. Modern games have some fundamental design flaws when it comes to game dynamics and controls, but there are a few games on this list I would still want to play in the future.

Verdict of the cards

After doing these tests, I have finally accumulated all the answers for the $10 graphics card test. There are no big surprises, except for one. Most of the cards turned out to be quite useless for running modern AAA and C-class titles, but some can perform quite decently if the expectations are low enough.

nVidia GeForce FX 5200

The oldest card in the test is the 20-year-old FX 5200. This card was not able to play any of the games in this test. When it was able to start something, the performance was only around 1-2 fps. This card is unusable for every modern purpose; it can only qualify as a display adapter. On the screen, it created these box-like artifacts:

The card is not damaged, by the way - the drivers are the problem. Even when it was new, the PCI variant only worked properly with the 40.x drivers, which are not available for Windows 7. The card is designed to run older titles under Windows 98 and XP; do not drag it back from its pension.

PROS:
1. Decent card for retro games on a 9x system with proper drivers.
2. Uses the PCI slot, so it can run in a wide range of computers, from a P1-MMX to an i7.

CONS:
1. Compatibility issues with very old 3D games from the 90s.
2. The drivers require MMX.
3. Not really a con, but it's unable to run games 20 years newer than the card. Not a big surprise.

ATi Radeon x740XL

The first PCI-E card family from ATi is not recommended for modern gaming; basically the only game that was usable on it is a rebundled version of a 15-year-old Russian game. There is one reason someone might buy this card: Windows 98SE support, with Direct3D and OpenGL. It is a PCI-E card which can be operated in modern motherboards, and yet it supports that ancient operating system. If your computer has no PCI slots any more, but you want a retro experience, this card can help you.

PROS:
1. The card runs cool and is not noisy.
2. Supports Win9x.

CONS:
1. Unusable for modern titles.

ATi Radeon x1550

Compared to the x740XL, it doesn't support Win9x any more, but it doesn't support newer games either. This card doesn't make much sense for modern games. It can barely run more games than its predecessor, and it's too weak to run them at playable speeds. Despite being a DX9-only card, some games using more modern versions of DirectX will run, but most of them will obviously not work. This card model is very flaky: people report the cards suddenly dying due to the bad quality of the cooling. Not a recommended card.

PROS: I can't think of any.

CONS:
1. The card dies after a few months if actively used.
2. Unusable for modern titles.
3. The card runs super hot all the time.
4. No Win9x support.

nVidia GeForce 8500GT

The 8500GT offered good compatibility, but due to some driver-related issues, some games refused to start (meanwhile they ran well on its bigger brothers). As it uses the first generation of DX10 shader units, this problem could be due to limitations or bugs in the hardware. Later revisions didn't suffer from this problem. The 8500GT was able to outperform the integrated i3 HD Graphics, but the extra performance is barely measurable in most games. It ran some games, and crashed with others. Overall, I was not satisfied with the performance of this hardware. nVidia really made a mistake with this card when they unintentionally cut the chip back far behind the mid-range. I don't recommend this card even over the integrated Intel.

PROS:
1. It has a standard VGA connector.

CONS:
1. The card runs too hot, and will boil itself within a year.
2. The chip and the drivers are a little more buggy than its counterparts.
3. It's very slow and sluggish. It was never a mid-range card. Games are unplayable.

AMD Radeon HD4350

This card is disappointing. Due to the 64-bit memory interface, the card is even slower than the 8500GT. It suffered from similar driver bugs, and some games simply refused to give any picture, despite other cards of this era working without issue. The card is overall 20% slower than the 8500GT, despite being multiple generations newer. This is, however, not a big surprise given the 64-bit memory interface. This card looks like a mistake, and I expected far more from it.

PROS:
1. None - besides that it can start up more titles than the DX9 cards, even if they are usually unplayable.

CONS:
1. No VGA connector.
2. Too slow, even slower than the 8500GT. Bad user experience when running games on it.
3. Buggy drivers which refuse to co-operate with some games, even when those games should run.
4. The card runs very hot, and it will kill itself within a year under heavy usage.

Intel HD Graphics (i3-520)

AMD and nVidia released low-end video card chips for multimedia and office use in all of their generations. This worked for them as a cash cow, and as a simple method to sell defective chips, with disabled execution units and memory lanes, at the low end. This was fine until Intel unexpectedly released its new Core2Duo chipsets and the i3 CPUs with integrated DX10-capable video. The HD Graphics turned out to be a potent low-end performer. No driver issues can be observed, except in one game, even with titles using newer DirectX versions. I can imagine the shock on the faces of nVidia and AMD when they realized how compatible, powerful, and potent the Intel chip is. Suddenly, the low-end nVidia and AMD cards became less viable on the market if the given computer already had an integrated HD Graphics chip on board. Intel not only successfully competed with the low-end AMD and nVidia cards of the era, but also set a new standard of quality for integrated graphics, which was unknown before. It even outperforms the integrated nVidia and AMD solutions (which are not tested in this article, but Intel's lead over those is very significant as well).

PROS:
1. It's integrated.
2. It's free, if it's in the CPU or in the chipset.
3. Acceptable driver compatibility.

CONS:
1. Still not fast enough for today's titles.
2. Dual-channel memory has to be used, otherwise the speed is going to be terrible.
3. Clearly a low-end solution.

nVidia GeForce 9500GT

This card is the first acceptable solution in this test. There are no compatibility issues to be observed if the newest drivers are used. The absolute newest games unfortunately will struggle on this card, but the refined DX10-capable execution units offer acceptable performance even in some newer titles. nVidia really managed to position this card for the low end, and as it requires no extra power connector, it's an ideal solution for most computers.

PROS:
1. Acceptable performance even in some modern titles.
2. The card runs cool and silent; the cooling is really good quality.

CONS:
1. Low frame rates in some titles from 2022.
2. No VGA connector.

nVidia GeForce GT610

The newest card in the test is not impressive. Despite its native DX11 support and 1 GByte of memory, the card is unable to beat even the 9500GT. It can start up one extra game, but even then, the frame rate is unbearable. If someone has to choose between the 9500GT and this, overall this one is probably the better choice, due to the more modern DirectX support and the modern hardware-accelerated codecs inside the chip. The card is widely available, and it even exists in a PCI version. The 3D performance is usually above stuttering levels, but very far from the standards of its era.

PROS:
1. It has a VGA connector.
2. The native DX11 support lets more games start up.
3. The card is small and compact.
4. Easily available in every shop. Also available for the PCI slot.

CONS:
1. The performance is not totally terrible, but not good either.
2. The card runs too hot; even though it has active cooling, it is unable to cool the chip. The card will quickly die.

AMD FirePro V5700

This is a card I was satisfied with. The V5700 outperformed the Radeon 4350 and the 9500GT in every title, and had very good driver compatibility. The 128-bit memory interface, paired with DDR3, gives enough memory bandwidth to keep pace even with high-end cards of the era in some titles. The GPU has the raw muscle to reach stutter-free frame rates in most of the tested titles. As the card requires no external power plug, it can be a handy replacement in most computers, without having to worry about exceeding the power limits of the PSU.

PROS:
1. The performance is good.
2. No external power cord for this performance range.

CONS:
1. No VGA or AV connectors.
2. The card has a hardware bug: the fan is not ramped up even under heavy load. The card runs very hot, and the fan stays too slow all the way, except when booting.

nVidia GeForce 8800GT

One of the oldest competitors in this article, yet still able to rock the games thanks to its DirectX 10 compatibility. The card offers monstrous performance even by today's standards, and even though I had underclocked it previously, it reached fluid or near-fluid frame rates in all of the tested games. This card beat every other card in the test; the 256-bit memory interface puts the other cards in their place. The performance has a cost though: an extra power plug is needed. The card originally came as a dual-slot solution, so I had to replace the cooling with a single-slot version.

PROS:
1. The performance is very good; it's the winner of this test.

CONS:
1. No VGA connector.
2. The dual-slot models are more common. Changing the cooler takes extra money, time, and effort.
3. Extra power plug, which requires a power supply with such a plug, or a converter.
4. Runs very hot; downclocking and undervolting with a BIOS mod is required to preserve the life of the card.
5. The cooling on the VRM is flaky and wobbly.
6. Requires a strong power supply.
7. The card is noisier than the other models in the test.


DirectX 9-era cards are unusable; they will not be able to run modern video games. DirectX 10-, DirectX 11-, and DirectX 12-based cards have good forward and backward compatibility with the current era of games, so those are a better choice. Older high-end gaming and high-end CAD solutions are a better choice than modern low-end cards in most cases.

High-end cards of the DX10 era are becoming rare, and their prices can be expected to grow. These former high-end gaming cards (including the earliest ones) easily outperform newer generations of low-end cards released even 10-15 years later. There could be some games which refuse to start on a DX10 chip, but this is the case with maybe 1-2 out of 10 games.



Why don't you try a GNU/Linux OS like Debian? We have used Debian since 2000.


Thanks. I use Debian as well. For the gaming-related tests, I use Windows, as most people use that for gaming. I have mentioned on the graphs that all of the cards above are compatible with Linux. I might do a Linux-related graphics card benchmark in the future too, with open-source graphics drivers.


True. Most commercial games are not available for Linux. I enjoyed reading your detailed article. I appreciate your dedication to writing such an extensive test report.


Older DX9 games tend to run well on Linux with Wine (and of course the OpenGL-based ones as well). People, however, have a less smooth ride with DX10 and DX11 titles. There is now a DirectX-to-Vulkan wrapper for Linux, which people report gives a better experience; however, Vulkan relies on closed-source drivers, which I am not too happy to use.


Our son plays some games with Wine because we don't use Windows in our home. Thanks for the details about playing games on Linux.
