In this test, the final GPU from Matrox will be showcased. When it comes to graphics cards and chips for personal computers, most people are aware of the solutions from nVidia, AMD and Intel, but there are lesser-known GPU manufacturers out there as well. In the late 90s and early 2000s, most graphics chip manufacturers moved on to offering graphics designs for phones, tablets and embedded systems, while some were bought up by CPU companies. Of the ten or so PC 3D graphics chip manufacturers, Matrox was one of the most prominent.
Matrox is a chip designer company from Canada, which has produced PC-compatible graphics cards since the XT era. When S3 and ATi released their first 3D chips, Matrox also released its 3D-capable MGA and Mystique graphics cards, which competed in the low-end market segment. The cards were relatively successful, and Matrox released the G200 in 1998. The card was a success: it was the fastest chip on the Socket 7 (Pentium 1/WinChip/K6/Cyrix) platform, and its performance was on par with the top-tier offerings of 3Dfx and nVidia on the Pentium 2 platform. The Matrox G200 cards became popular, and the company was loved by gamers.
Competitors were developing new technologies, and Matrox released its G400 line, which was supposed to replace the G200. It was a good card, but nVidia and ATi released their native DirectX 7 chips (the Radeon, and the GeForce 256 and GeForce2) within a few months. The G400 was a DirectX 6 card only, incapable of accelerating triangle transformations in hardware (hardware T&L). This was not an issue in 1999, but it quickly became a problem as newer games grew heavier on polygon count. The new G450 and G5xx cards were not able to close the gap, and internally they were still only DirectX 6 capable. Gamers and professional users started to ditch Matrox cards, and to avoid the fate of 3dfx, Matrox started to develop a new DirectX 8 capable graphics chip.
The Matrox Parhelia chip, released in 2002, was fully compatible with the new DirectX 8 standard. The chip itself was not bad, but the drivers weren't great either. The card's main competitor was the GeForce4 Ti4200, which was about 30-40% faster. The Parhelia was not a bad chip, but it was not a good one either; Matrox's market share fell below 1%, which was enough to sustain the company, but was not a healthy situation. The initial Parhelia was released for the AGP and PCI-X ports, and it was usually equipped with 128 MByte of video memory.
The Matrox Parhelia got renamed to Matrox P in 2003, and the cards were redesigned to be cheaper. The P650 got a 128 bit memory bus, 64 MByte of video RAM (usually), usually passive cooling, and a smaller PCB to reduce costs. The P650 featured two pixel pipelines, two pixel shaders, two vertex shaders, and 8 TMUs. After the release of the PCI-E specification, the P650 was redesigned for the PCI-E standard, with shipments starting in 2005. Other variations of the P series also saw the light of day in these years, such as the P750.
It's important to note that the Matrox P series still lacked native DirectX 9 support. ATi and nVidia switched to DirectX 9 in 2003, and programs and games started to use the new DirectX 9 standard heavily in 2005. If a game relied on the new SM 2.0 type of shaders, it refused to run on DX8 era hardware, so Matrox fell out of favor with gamers once again.
Matrox tried to find new markets for itself. They were traditionally good at multi-monitor environments, so they started to add multiple DVI ports to their video cards, sometimes up to four. Unluckily for them, Radeon and GeForce cards already had at least two outputs, and the pixel quality of those cards - especially the nVidia cards - was equal to or better than Matrox's.
In 2006, nVidia released the 8800 family, the first DirectX 10 cards with unified shaders, which changed the entire industry. Windows Vista was released, which required at least a DirectX 9 compatible graphics card to display its on-screen window decorations. Matrox was still using a DirectX 8 chip, so designing a new graphics chip was urgent. The Matrox M finally got released in 2008, at last offering a fully DirectX 9 and OpenGL 2.0 compatible core.
There are virtually no specifications of the Matrox M available. I googled it to try to find actual reviews of these cards. No luck. Despite the card not being extremely rare, no benchmarks or reviews exist on any site. There are no review videos, and even the specifications are shady: the number of pixel shaders and vertex shaders is not known, and there is no info about the number of texturing units. Searching for any information about the Matrox M barely produces results. Luckily, the drivers are still available on the official website.
The Matrox M9120 card was released for $259, about 10 times the price of an equivalent DirectX 9 based card from nVidia or ATi (AMD). Matrox kept the card on the market for a few years (you can still buy Matrox M cards brand new), but scrapped its own GPU line in 2014 in favor of using AMD or nVidia chips. Matrox M cards usually appear on marketplaces for 200-300ish USD, but sometimes sellers just want to get rid of them for $10. I patiently waited a year for one to appear, and I bought it.
The card I got is the Matrox M9120, the more consumer-targeted version, which is also the cheapest one. It has two normal DVI outputs, and it ships with two DVI to VGA adapters. A PCI-E 1x version is available as well; my card is the PCI-E 16x version. The video card has 512 MByte of DDR2 memory, with half of the memory chips mounted on the back side. The card requires no external power plug.
There is no information on the power consumption, but it should be around 20-30 W. The passive heatsink is enough to keep the chip cool, but barely. It seems the card has a fan header in case someone plans to attach one. The card feels like a cheap OEM card, but it resembles the usual look of a late entry-level DirectX 9 card. The manufacturing date on my card says 2008, which may or may not be true.
I decided to use my desktop PC, which is equipped with a server motherboard, to carry out this test. The MSI Speedster2 motherboard supports two Socket F CPUs, and with two AMD Opteron 2423HE processors installed, it should be more than enough for this card. Linux (64 bit) and Win7 64 bit will be used to check if the card even boots and works. My computer has 32 GByte of memory, which is a bit overkill for this test, but as it's already installed in the system, I was not willing to remove it.
The card turned on without a problem. Under Linux, there are no open-source drivers available; the kernel has no idea about the card, and it runs in VESA mode. There are closed-source drivers for certain kernels and X11 versions, which I will not bother with, as they only support older kernels. In theory, they include OpenGL 2.0 drivers as well, so if someone needs 3D acceleration from a Matrox M card under Linux, it's available. Windows 7 64 bit is supported by Matrox; in fact, every operating system from XP to Win10 is supported. But I struggled to make it work. After unpacking the installer, it contained multiple directories and installers. The one I initially started, from the Matrox M series folder, refused to work, complaining about a missing signature. I tried to disable the signature check in Windows with multiple methods and reinstall the driver, but no luck. Then I discovered a signed driver inf and tried to install that manually, but it refused to install for some reason. In the end, I ran the installer file in the main directory of the driver package, and behold: it worked. After wasting an hour of my life, the Matrox M9120 came alive, but the first impressions weren't great.
After rebooting, Aero came alive, which means this really is a DirectX 9 capable card. My monitor is just 1360x768, but the Matrox M9120 stuttered on the desktop even at this resolution. When dragging windows around the screen, the frame rate was visibly below 25 fps, probably around 20. This stuttering is always present, regardless of the size of the window. Matrox installs a tool which allows setting up multiple monitors, but there are no settings to control 3D or to observe the status of the video card itself. Very disappointing.
The video card will be compared against an x1550, which was released in 2007. It's a very late release for a DirectX 9 card, but low-end systems only switched to DirectX 10 in the following years. With 4 pipelines, 256 MByte of DDR2 memory, and a PCI-E 16x interface, the specifications are very close to the Matrox M9120's, and due to the close release dates, the cards are direct competitors.
I tried to start up common programs to find out the specifications of the card. This is how it went:
GPU-Z was not able to provide any information about the card. A program called GPU Caps Viewer had no clue either, but it has some built-in OpenGL test tools, which were indeed able to function normally, confirming that the card really supports OpenGL 2.0, plus pixel and vertex shaders through GLSL.
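For illustration, this is roughly how such tools decide what a driver supports: they query the GL_VERSION string from the driver and parse the leading major.minor number. The sketch below is a hypothetical helper, not code from GPU Caps Viewer, and the Matrox-style version string in the example is made up:

```python
def parse_gl_version(version_string):
    """Extract (major, minor) from an OpenGL GL_VERSION string.

    The string starts with "major.minor", optionally followed by a
    release number and vendor-specific text,
    e.g. "2.0.5936 WinXP Release" (a made-up example).
    """
    head = version_string.split()[0]   # e.g. "2.0.5936"
    parts = head.split(".")
    return int(parts[0]), int(parts[1])

def supports_opengl(version_string, required=(2, 0)):
    """Return True if the reported version meets the requirement."""
    return parse_gl_version(version_string) >= required

print(supports_opengl("2.0.5936 WinXP Release"))   # True
print(supports_opengl("1.5.0 - Build 4.14.10"))    # False
```

Tuple comparison handles the major/minor ordering correctly, so a "1.5" driver fails a (2, 0) requirement even though 5 is greater than 0.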
To confirm that Direct3D acceleration was working, I opened dxdiag, where I was able to verify that D3D acceleration was present. After finding out that both Direct3D and OpenGL work, I started to run game demos.
Hyperdimension Neptunia R3 is a fantastic RPG/adventure video game. The game requires OpenGL 2.0 to run. I enjoy this game quite a lot, but it can sometimes be a little heavy on slower video cards; for good performance, it is recommended to run it on cards released after 2010.
It was able to start on the x1550, producing only 3-4 fps. It also started on the Matrox M9120, but it always crashed after a few frames, regardless of what I did. The screen was also garbled on the M9120, as if the OpenGL push and pop commands had failed on the card for some reason.
Prey is an OpenGL 1.x/2.0 game based on the Doom 3 engine. It features an alien invasion of a Native American reservation. I remember playing this game in the late 2000s, but it wasn't amusing; it focused more on graphics than on actual gameplay. It needs a stronger card than the original Doom 3, but it usually starts up on everything. Here is how the test went:
Prey worked without glitches on the M9120, but the frame rate wasn't fantastic. The x1550 was more than 3x faster in this title, but I wouldn't call this game a good experience on either card. At least I was able to verify that the M9120 can run OpenGL games.
UT2003 is an old game which I didn't like that much. It can't deliver the old Unreal Tournament (1999) vibes, but it can still be fun to play, and for a retro gamer it can be handy. Compared to the original UT, it's as if they designed the maps to be colorful and nice looking rather than designing them in a way that makes for good action. UT2003 is not that demanding and can even be run on older video cards, so in theory it should work without problems.
And indeed, UT2003 was playable on both cards. The x1550 was about 3x faster once again, but the game was totally okay on the M9120 as well.
I tested the 2015 build of Yandere Simulator (the newer builds require DirectX 10 capable video cards to avoid being a stuttery mess). Most Yandere Simulator builds are still available on the internet, and this was a very early build of the game. The game ran on both video cards; however, the M9120 produced crashes, and the game was unplayable.
Effects and settings can be turned down, which might solve the issues, but I didn't want to spend too much time with this particular title.
This was a popular race car game; I remember playing it for days. Nowadays no one would play such a game, but back then it was considered very good, mostly due to its graphics effects. It could be interesting for a retro gamer to test it. Let's see if the M9120 is even capable of running this game.
The M9120 successfully ran the game, at a barely playable frame rate. The x1550 was more than 6x faster in this game.
This game is a wannabe successor of the DOS game Stunts. The player can create strange tracks with loops and other entertaining objects and play on them. The graphics are good but nothing special; a video card of this era should be able to run it. I haven't looked at this game that deeply, but it seems less exciting than the original Stunts, that's for sure.
Once again, the M9120 was able to run the game, but it rarely exceeded 20 fps. The x1550 was more than 2x faster again. I wouldn't call it unplayable on the M9120 either, as the game sometimes reached and exceeded 25 fps.
FlatOut is an arcade-style rally simulator where people can produce fancy crashes and jumps. I remember playing this game a bit; it wasn't bad, and the graphics can be considered acceptable even by modern standards.
The M9120 runs the game at around 20 fps, while the x1550 manages 60 fps. In all the car racing games, the M9120 hovers around 20 fps, which I found quite interesting: maybe the card doesn't really like the way roads are typically rendered in these games, as it always seems to fall to this magical 20 fps.
Far Cry was one of the first titles to actively utilize the new shader models in DirectX 9; basically, it was the Crysis of its era. The game is backwards compatible with DirectX 8 cards, and it also supports OpenGL. The old Parhelia cards were not able to work properly with Far Cry, so let's see how the M9120 behaves.
The M9120 was able to run the game; however, some shading bugs occurred with the plants and other types of vegetation. I haven't played around with the settings; the card might be able to run without bugs if someone fine-tunes the quality-related settings. The performance wasn't amazing either, but the game was fluid at around 24 fps. This is still less than a fifth of what the x1550 was capable of.
A former friend recommended this game for this test. It uses DirectX 9, and it looks quite acceptable even by today's standards. The game reminds me a little of Final Fantasy Tactics. I haven't played it much; I checked the performance in the initial war battle. It might be worth playing with later on.
The M9120 usually managed 30ish fps, which sometimes increased above 40. This time, the x1550 was only twice as fast. There were no bugs or glitches.
This is a Russian World War II aircraft simulator. It's a very good game, but by today's standards I'd only recommend it to hardcore airplane simulator fans. The game supports both OpenGL and Direct3D rendering; both APIs worked on both cards, and there was no difference in speed regardless of which API was used.
The M9120 fell to 14 fps under heavy action, but otherwise usually managed above 20 fps. It's not unplayable, but not great either. The x1550 managed about 3x the FPS in this title, and I haven't seen any scene where it fell below 40 fps.
This is a newer iteration of the Alice game. The first one used the Quake 3 engine; this one uses one of the Unreal engines. As the game uses DirectX 9, I was quite excited to try it, as it looks very interesting and entertaining. Unfortunately, despite using DirectX 9, the game requires newer graphics cards to run properly.
The x1550 was able to start the game, but it only ran at around 4-8 fps, which was totally unusable. The M9120 was not able to run the game at all; it simply crashed. This game indeed requires a newer, DX10-era graphics card.
The Final Fantasy 7 remaster uses DirectX 9 to be able to run on modern graphics hardware. I recommend this game to anyone who missed out on the original Final Fantasy release - this game is the same; they just modernized the DirectX engine around it. And probably a lot of people missed the original, as it only ran properly on G200 and Permedia2 cards.
Both cards managed a steady 35 fps during gameplay. The game is probably capped at 35 fps for some reason, and that's why there is no difference in performance. There were no bugs present on either card.
This is a benchmark program capable of using DirectX 10, DirectX 9 and OpenGL to measure the performance of the cards. It's quite a good benchmark, but the graphics card manufacturers used application-specific optimizations for it, so it doesn't necessarily reflect the true performance of the video cards.
Unfortunately, the benchmark refused to start on the M9120 with either OpenGL or DirectX 9. It was missing an OpenGL extension, and it simply showed an error message when I tried to run it in DirectX 9 mode, complaining about missing hardware support. Even on the x1550 it only managed one fps.
I wouldn't call this card a tragedy, but it's far from impressive. A DirectX 9 card that is still being sold for $300 brand new should at least be able to run all DirectX 9 titles without exception, and the Windows desktop shouldn't stutter on it. It's no surprise this chip was a financial disaster for Matrox, as far cheaper cards were able to outperform it. Anyone would be far better off with a low-end passive GeForce 8xxx or a Radeon 2xxx, which also go for approx $10. Besides this, the card has various pros and cons.
Pros:
- Acceptable performance for its era, approximately identical to the GeForce 6200 or FX5200
- Good for retro gaming
- Can run software and video games released before 2010
- Drivers are stable, and available for Windows 2000, XP, 2003, Win7, 8 and 10, in both 32 and 64 bit
Cons:
- The actual DirectX and OpenGL driver has not been updated since 2013
- No driver for Windows 98
- No driver for modern Linux
- Significantly slower than its direct competitors
- Useless if you want to play DX10/11/OpenGL 3.x titles
- Even the window manager in Windows stutters with this card
- The brand new price contains two extra zeros compared to what the card is actually worth
- The driver package contains a lot of installers, and only one works, which is confusing
All in all, it's a nice and collectible item for $10, if you don't expect too much from it.