Video game size and system requirements

It surprises me how the smartest people on the face of the Earth (video game developers) can at the same time be the stupidest people, at least when it comes to calibrating the system requirements of a game and targeting its total installation size (if they even bother to do that). I am sure I am not alone in my frustration when observing video games above 100 GB, requiring the newest graphics cards and expensive processors, or suffering from long loading times.

This is especially a problem of the AAA-type "video games" (aka glorified WASD model viewers), where developers always seem to target cutting-edge technology, manipulating their potential victims on social media to experience the new... experiences. This trend, unfortunately, sometimes leaks into other game types, and you can observe indie, B-class, and Z-class games also consuming 100 GB of disk space and 8 GB of video memory, despite looking like something from 2008.

The reasons

When you start to reason with these developers, their explanations are very vague. They will come up with explanations like "a new 2 TB SSD is only $100", forgetting the fact that their game is not the only game people want to store on their machines. These types of people are unable to tell apart the current cutting-edge technology on the market from the actual computers of the users, which will be multiple years old, or even a decade old, because people will not run out and buy a brand new computer every year. And even if they do, they will have issues: the game will simply not fit on their laptop, and they will have to delete half of their data to squeeze the game in.

Lack of market research

These developers are unable to look beyond their own imagination, where everyone has the same computer as them. At most, they check with their close friends, who are also super geeks, and they come away with the wrong impression that everyone has a 4K monitor, an 8 TB SSD, and a gigabit internet connection. This will maybe be true within 20 years. But not today, when a random user will hang from a connection of a few tens of megabits, and it will take him the better part of a day, or days, to download a large game. That is, if he is willing to run his computer for that long without stopping it, hoping that if his connection dies for some reason he will be able to resume the download, and that his hard disk will not exhale its soul in the process of unzipping the game.
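
Just to put numbers on this, here is a back-of-the-envelope calculation of the download time. The 100 GB game size and the 10 Mbit/s connection are example assumptions, and a real connection rarely sustains its advertised speed:

```python
# Back-of-the-envelope download time. The 100 GB game size and the
# 10 Mbit/s connection speed are example assumptions.
game_size_gb = 100                          # gigabytes
speed_mbit_per_s = 10                       # megabits per second

size_bits = game_size_gb * 8 * 1000**3      # decimal units, as ISPs advertise
seconds = size_bits / (speed_mbit_per_s * 1000**2)
print(f"{seconds / 3600:.1f} hours")        # about 22 hours at full, uninterrupted speed
```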

Lack of common sense

These people, like phonebook autists, wrote a game, but didn't bother to understand the basics of technology. For example, they don't understand that they can scale the compression ratio of a JPG file and make it much smaller without notable quality loss in most cases. They will use high-polygon geometry on everything, not understanding how slow it will be when every piece of grass is high-poly. To counterweight this lack of common sense, they try to use some new technologies to hide part of the performance deficit caused by their decisions.
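
As a small illustration of the JPG point, this is roughly all it takes to re-encode a texture at a lower quality setting. A minimal sketch using Python with the Pillow library; the file names and the quality value of 75 are arbitrary examples, not values taken from any particular game:

```python
# Minimal sketch: re-encode a texture with a lower JPEG quality setting.
# Requires Pillow (pip install Pillow). "texture.jpg" and quality=75 are
# hypothetical example values.
import os
from PIL import Image

src = "texture.jpg"
dst = "texture_recompressed.jpg"

img = Image.open(src)
# In Pillow, quality goes from 1 (worst) to 95 (best); around 75 is usually
# visually close to the original at a fraction of the file size.
img.save(dst, "JPEG", quality=75, optimize=True)

print(os.path.getsize(src), "->", os.path.getsize(dst), "bytes")
```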

Tech bros

Geometry shaders? GPU-assisted instancing? FSR? XeSS? Random three- and four-letter nonsense is here to fix the issues you made for yourself! Instead of using their common sense to fix their fundamental issues, they try to cover them up with military-grade techno autism and experimental technologies, features which are non-standard and may or may not be removed in the next driver release, making the game even more unreliable than before.

The root reason

When discussing this issue with fellow game engine developers, we came to the following conclusion. Nowadays, as is widely known, the typical user of an engine, the one who makes the game, has no technical understanding of what he is doing. For example, a modeler who creates a model has no regard for, or understanding of, technical limitations and possibilities. Even if the engine itself is rated for, let's say, 500k polygons per scene at the absolute most, the developers of the game might put tens of millions of polygons and tens of thousands of textures into the game. Even if the developers of the engine are desperate to scale and optimize everything, introducing complicated acceleration structures and so on, no threading strategy will save the game.

The solutions

Clear documentation of the engine, which lists the recommendations and limitations explicitly, and clear and transparent communication to the creators of the assets, could be a key factor in avoiding bloated video games. Most engines lack this (or lack proper documentation at all), so even if the game developers would like to know the technical limitations of the engine, they have no way of knowing them beforehand.
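
One way to make such limits more than a page in the manual is to ship them as a machine-readable budget and check the assets against it during the build. A minimal sketch of that idea; the 500k scene budget mentioned above, the 50k per-model budget, the assets/models directory and the use of Wavefront .obj files are all hypothetical examples:

```python
# Minimal sketch: check models against the engine's documented polygon budget.
# The budget values, the directory layout and the use of .obj files are
# hypothetical examples, not the limits of any real engine.
import os, sys

PER_MODEL_BUDGET = 50_000     # faces per model (example value)
SCENE_BUDGET = 500_000        # faces per scene (example value)
MODEL_DIR = "assets/models"

def count_faces(path):
    # Wavefront .obj: every line starting with "f " is one face.
    with open(path, "r", errors="ignore") as fh:
        return sum(1 for line in fh if line.startswith("f "))

total = 0
errors = []
for root, _, files in os.walk(MODEL_DIR):
    for name in files:
        if not name.lower().endswith(".obj"):
            continue
        faces = count_faces(os.path.join(root, name))
        total += faces
        if faces > PER_MODEL_BUDGET:
            errors.append(f"{name}: {faces} faces, over the per-model budget")

if total > SCENE_BUDGET:
    errors.append(f"all models together: {total} faces, over the scene budget")

for e in errors:
    print("BUDGET VIOLATION:", e)
sys.exit(1 if errors else 0)
```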

Proper testing

Instead of assuming it will work, get some old components, like an 8800GT with a Core 2 Duo and Windows 7, or a second generation i7 with an integrated graphics chip, and then test and fine-tune the game and the engine until it shows signs of an acceptable frame rate. It is better to be safe than sorry and have some extra headroom, than to try to precisely match some new 4-core Ryzen processor and in the end fail to meet even the minimal targets.

I also faced this problem

When I was less experienced, every piece of code I ever wrote only worked properly on the cutting edge of its time. This caused a good 60% degradation in my profits from certain software. And sometimes, the huge system requirements also caused outrage or disappointment among some of the users. Only after losing this much money on certain of my software did I start to think strategically about system requirements, and I was able to fully solve this general issue in my newer software - but in retrospect, I was not able to fix what was already broken. Don't be such an idiot like me. Learn from my mistakes, and think before you code!

Thinking in standard units

This problem also existed, and was rooted, long in the past. When the Commodore 64 got released and the 5.25 inch floppy disk drives got popular, each side of the disk held roughly 170 KByte. This was plenty for the C64, as it took minutes to load this much data from the floppy. Eventually, the developers had no regard for this, and started to release games on multiple floppies. It was not uncommon to have something on FOUR floppy disks, and the game took almost 15 minutes to fully load each time you started it.

This total nonsense continued into the PC era as well. Early PC computers also used 5.25 inch floppy disks, with 360 KB of space, which eventually got replaced by 3.5 inch floppies. The 3.5 inch disks started at 720 KB, but very quickly 1.44 MB disks got released. This was plenty of space back then, and should have been enough for the games of that time. But it seems, for some unknown reason, it wasn't. Game developers started to release their titles on two, and then four floppies once again, and other types of software also started to outgrow the floppy, even though CD drives were not widely available and almost only existed on paper. The people back then were just as angry as people are today when they see these 100+ GB games.

And when we finally had CDs, and games started to arrive on CD discs, it is no surprise that shortly afterwards some games already required 2 or more CDs. Final Fantasy 8, PC edition, required 5 discs in 1999 (just because the developers were unable to find a proper jpg library at the time, and the assets used some shady compression system). Then we had DVD...

The conclusion we can draw is to use the standard units of the information storage medium of the given era. If the floppy has 360 KByte, then DO NOT use more than 360 KByte for the game. If the floppy has 1.44 MB, then be sure it fits on 1.44 MB. If the game is released on a CD, be sure it is smaller than 700 MByte. This is what common sense dictates.
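
This rule is also trivial to enforce automatically at packaging time, instead of discovering the overflow at release. A minimal sketch; the build directory name and the 700 MB CD target are example assumptions:

```python
# Minimal sketch: fail the packaging step if the build outgrows the target medium.
# The "build" directory and the 700 MB CD target are hypothetical example values.
import os, sys

TARGET_BYTES = 700 * 1024 * 1024   # one CD
BUILD_DIR = "build"

total = sum(
    os.path.getsize(os.path.join(root, f))
    for root, _, files in os.walk(BUILD_DIR)
    for f in files
)

print(f"build size: {total / 1024**2:.1f} MB of {TARGET_BYTES / 1024**2:.0f} MB target")
if total > TARGET_BYTES:
    print("ERROR: the game no longer fits on the target medium")
    sys.exit(1)
```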

Then suddenly, the internet

Nowadays, software and games are almost exclusively sold and transferred online to the users. This means the user must download them and store them somehow. On the Android platform, Google Play imposed a 100 MB file size limit on software and video games as the absolute ceiling. This should have been totally enough for games and other applications - especially because phones only had a few GB of storage space - but it turns out it wasn't. Some games, after being started, began to download multiple gigabytes of assets from the developer's web server, because the incompetent developers still have no common sense. Google, two years ago, was forced to raise this to 200 MB for bundles, and now modern apps can be 1 GB in size, and again, that's still not enough. Not to mention the 30-200ish GB PC games of nowadays.

So what to do

Games for phones really shouldn't be bigger than 100 MB, and a PC game shouldn't be bigger than 1-2 GB either. If a random computer has 8 GB of RAM, then your game shouldn't use more than 1-2 GB under any circumstances. Don't use experimental and non-standard technologies to cover up your incompetence. Have backwards compatibility. Use your common sense. Don't be a tech bro who develops an engine and restarts it once a month because a new three-letter technology arrives and suddenly your code is obsolete and has to be rewritten - instead, write code that works everywhere. And kick the ass of content creators who create textures and models bigger than a few hundred kbytes.
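
The memory part of this can be checked during testing instead of being discovered in user reviews, simply by sampling the game process's resident memory while a test session runs. A minimal sketch using the psutil library; the 2 GB budget and the process name are hypothetical examples:

```python
# Minimal sketch: watch a running game process and complain when its resident
# memory goes over the budget. Requires psutil (pip install psutil).
# The 2 GB budget and the process name "mygame" are hypothetical examples.
import time
import psutil

BUDGET_BYTES = 2 * 1024**3
PROCESS_NAME = "mygame"

# Find the game process by name.
game = next(
    (p for p in psutil.process_iter(["name"]) if p.info["name"] == PROCESS_NAME),
    None,
)
if game is None:
    raise SystemExit(f"process '{PROCESS_NAME}' is not running")

peak = 0
try:
    while game.is_running():
        rss = game.memory_info().rss    # resident set size, in bytes
        peak = max(peak, rss)
        if rss > BUDGET_BYTES:
            print(f"OVER BUDGET: {rss / 1024**2:.0f} MB resident")
        time.sleep(1.0)
except psutil.NoSuchProcess:
    pass                                # the game exited between two samples

print(f"peak resident memory: {peak / 1024**2:.0f} MB")
```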

Comments

I agree with you. It seems that these game developers force their players to upgrade their computers before they can play the latest games. Somehow, I think these developers are being funded by the hardware manufacturers to push people into buying their latest and outrageously priced products.

In some cases it could be, but in most cases it's just down to their stupidity.
