Helldivers 2 devs slash install size from 154GB to 23GB (tomshardware.com)
296 points by doener 6 hours ago | 206 comments
  • snet05 hours ago

    > With their latest data measurements specific to the game, the developers have confirmed the small number of players (11% last week) using mechanical hard drives will witness mission load times increase by only a few seconds in worst cases. Additionally, the post reads, “the majority of the loading time in Helldivers 2 is due to level-generation rather than asset loading. This level generation happens in parallel with loading assets from the disk and so is the main determining factor of the loading time.”

    It seems bizarre to me that they'd have accepted such a high cost (150GB+ installation size!) without entirely verifying that it was necessary!

    I expect it's a story that'll never get told in enough detail to satisfy curiosity, but it certainly seems strange from the outside for this optimisation to be both possible and acceptable.

    • afavour5 hours ago |parent

      > It seems bizarre to me that they'd have accepted such a high cost

      They’re not the ones bearing the cost. Customers are. And I’d wager very few check the hard disk requirements for a game before buying it. So the effect on their bottom line is negligible while the dev effort to fix it has a cost… so it remains unfixed until someone with pride in their work finally carves out the time to do it.

      If they were on the hook for 150GB of cloud storage per player this would have been solved immediately.

      • jeroenhd4 hours ago |parent

        The problem they fixed is that they removed a common optimization used to get 5x faster loading speeds on HDDs.

        That's why they did the performance analysis and referred to their telemetry before pushing the fix. The impact is minimal because their game is already spending an equivalent time doing other loading work, and the 5x I/O slowdown only affects 11% of players (perhaps less now that the game fits on a cheap consumer SSD).

        If someone "takes pride in their work" and makes my game load five times longer, I'd rather they go find something else to take pride in.

        • PunchyHamster3 hours ago |parent

          > The problem they fixed is that they removed a common optimization used to get 5x faster loading speeds on HDDs.

          Not what happened. They removed an optimization that, in *some other games* that are not their game, gave a 5x speed boost.

          And they are changing it now coz it turned out all of that was bogus: the speed boost wasn't as high for loading the data itself, and a good part of the level loading wasn't even waiting on the disk, but on terrain generation.

          • hinkley2 hours ago |parent

            5x space is going to be hard to beat, but one should always be careful about hiding behind a tall tent pole like this. IO isn’t free, it’s cheap. So if they could generate terrain with no data loading it would likely be a little faster. But someone might find a way to speed up generation and then think it’s pointless/not get the credit they deserve because then loading is the tall tent pole.

            I’ve worked with far too many people who have done the equivalent in non game software and it leads to unhappy customers and salespeople. I’ve come to think of it as a kind of learned helplessness.

        • account423 hours ago |parent

          23 GiB can be cached entirely in RAM on higher-end gaming rigs these days. 154 GiB probably does not fit into many players' RAM when you still want something left for the OS and the game itself. Reducing how much needs to be loaded from slow storage is itself an I/O speedup, and HDDs are not so bad at seeking that you need to go to extreme lengths to avoid it entirely. The only place where such duplication to ensure linear reads may be warranted is optical media.

          • jeroenhd3 hours ago |parent

            They used "industry data" to make performance estimations: https://store.steampowered.com/news/app/553850/view/49158394...

            > These loading time projections were based on industry data - comparing the loading times between SSD and HDD users where data duplication was and was not used. In the worst cases, a 5x difference was reported between instances that used duplication and those that did not.

            • PunchyHamster3 hours ago |parent

              Instead of, y'know, running their own game on an HDD.

              It's literally "instead of profiling our own app, we profiled the competition's app and made decisions based on that".

            • hinkley2 hours ago |parent

              If I’m being charitable, I’m hoping that means the decision was made early in the development process when concrete numbers were not available. However the article linked above kinda says they assumed the problem would be twice as bad as the industry numbers and that’s… that’s not how these things work.

              That’s the sort of mistake that leads to announcing a 4x reduction in install size.

            • ghurtado3 hours ago |parent

              >In the worst cases, a 5x difference was reported between instances that used duplication and those that did not.

              Never trust a report that highlights the outliers before even discussing the mean. Never trust someone who thinks that is a sane way to use statistics. At best they are not very sharp, and at worst they are manipulating you.

              > We were being very conservative and doubled that projection again to account for unknown unknowns.

              Ok, now that's absolutely ridiculous and treating the reader like a complete idiot. "We took the absolute best case scenario reported by something we read somewhere, and doubled it without giving it a second thought, because WTF not? Since this took us 5 seconds to do, we went with that until you started complaining".

              Making up completely random numbers on the fly would have made exactly the same amount of sense.

              Trying to spin this whole thing into "look at how smart we are that we reverted our own completely brain-dead decision" is the cherry on top.

              • JohnBooty2 hours ago |parent

                Are you a working software engineer?

                I'm sure that whatever project you're assigned to has a lot of optimization stuff in the backlog that you'd love to work on but haven't had a chance to get to because of bugfixes, new features, etc. I'm sure the process at Arrowhead is not much different.

                For sure, duplicating those assets on PC installs turned out to be the wrong call.

                But install sizes were still pretty reasonable for the first 12+ months or so. I think it was ~40-60GB at launch. Not great but not a huge deal and they had mountains of other stuff to focus on.

                • hinkley2 hours ago |parent

                  I’m a working software developer, and when people who make statements like the one GP quoted prove they cannot do better, I get them demoted from the decision-making process, because they aren’t trustworthy and they’re embarrassing the entire team with their lack of critical thinking skills.

                  When the documented worst case is 5x you prepare for the potential bad news that you will hit 2.5x to 5x in your own code. Not assume it will be 10x and preemptively act, keeping your users from installing three other games.

                  • JohnBootyan hour ago |parent

                    Well, then I'd like to work where you work. Hard to find shops that take performance seriously. You hiring?

                    In my experience it's always been quite a battle to spend time on perf.

                    I'll happily take a demotion if I make a 10x performance goof like that. As long as I can get promoted eventually if I make enough 10x wins.

                    • hinkley38 minutes ago |parent

                      I would classify my work as “shouting into the tempest” about 70% of the time.

                      People are more likely to thank me after the fact than cheer me on. My point, if I have one, is that gaming has generally been better about this but I don’t really want to work on games. Not the way the industry is. But since we are in fact discussing a game, I’m doing a lot of head scratching on this one.

            • the_af3 hours ago |parent

              But if I read it correctly (and I may be mistaken), in actual practice any improvement in load times was completely hidden by the level generation happening in parallel, making this optimization not worth it.
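
              (A toy model makes the parallel-loading point concrete; all of the numbers below are made up for illustration, since the actual timings aren't public.)

                  # Hypothetical timings, not Arrowhead's real numbers.
                  level_generation = 40.0              # seconds; dominates, per the dev post
                  asset_load_ssd = 9.0                 # seconds to stream assets from an SSD
                  asset_load_hdd = asset_load_ssd * 5  # assume the worst-case 5x HDD penalty

                  def total_load(level_gen_s, asset_load_s):
                      # Assets stream in parallel with level generation, so the slower
                      # of the two stages determines the overall loading time.
                      return max(level_gen_s, asset_load_s)

                  print(total_load(level_generation, asset_load_ssd))  # -> 40.0
                  print(total_load(level_generation, asset_load_hdd))  # -> 45.0, only seconds slower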

          • crest3 hours ago |parent

            Which describes the PS2, PS3, PS4, Dreamcast, GameCube, Wii, and Xbox 360. The PS4 had a 2.5" SATA slot, but the idiots didn't hook it up to the chipset's existing SATA port and instead added a slow USB2.0<->SATA chip. So since the sunset of the N64, all stationary gaming consoles have been held back by slow (optical) storage with even worse seek times.

            So many game design crimes have a storage limitation at their core, e.g. levels that are just a few rooms connected by tunnels or elevators.

        • afavour4 hours ago |parent

          > If someone "takes pride in their work" and makes my game load five times longer, I'd rather they go find something else to take pride in.

          And what about the others, who wish a single game didn't waste 130GB of their disk space? Is it fine to ignore their opinions?

          They used up a ton more disk space to apply an ill-advised optimization that didn't have much effect. I don't really understand why you'd consider that a positive thing.

          • jeroenhd3 hours ago |parent

            By their own industry data (https://store.steampowered.com/news/app/553850/view/49158394...), data duplication gives up to a 5x loading-speed increase on HDDs. There's a reason so many games are huge, and it's not because they're mining your HDD for HDDCoin.

            The "problem" is a feature. The "so it remains unfixed until someone with pride in their work finally carves out the time to do it" mindset suggests that they were simply too lazy to ever run fdupes over their install directory, which is simply not the case. The duplication was intentional, and is still intentional in many other games that could but likely won't apply the same data minimization.

            I'll gladly take this update because considerable effort was spent on measuring the impact, but it's not one of those "everyone around me is so lazy, I'll just be the noble hero and sacrifice my time to deduplicate the game files" updates.

            • hinkley2 hours ago |parent

              > In the worst cases, a 5x difference was reported between instances that used duplication and those that did not. We were being very conservative and doubled that projection again to account for unknown unknowns.

              That makes no goddamn sense. I’ve read it three times and to paraphrase Babbage, I cannot apprehend the confusion of thought that would lead to such a conclusion.

              5x gets resources to investigate, not assumed to be correct and then doubled. Orders of magnitude change implementations, as we see here. And it sounds like they just manufactured one out of thin air.

            • hinkley2 hours ago |parent

              Seems to me that most of these situations have an 80/20 rule and it would be worth someone’s time to figure out what that is.

              Getting rid of 80% of that duplication for a 2x instead of a 5x slowdown would be something.

            • JohnBooty3 hours ago |parent

              I expect better from HN, where most of us are engineers or engineer-adjacent. It's fair to question Arrowhead's priorities but...

                  too lazy
              
              Really? I think the PC install size probably should have been addressed sooner too, but... which do you think is more likely?

              Arrowhead is a whole company full of "lazy" developers who just don't like to work very hard?

              Or do you think they had their hands full with other optimizations, bug fixes, and a large amount of new content while running a complex multiplatform live service game for millions of players? (Also consider that management was probably deciding priorities there and not the developers)

              I put hundreds of hours into HD2 and had a tremendous amount of fun. It's not the product of "lazy" people...

          • devmor2 hours ago |parent

            > They used up a ton more disk space to apply an ill-advised optimization that didn't have much effect.

            The optimization was not ill-advised. It is, in fact, an industry standard and is strongly advised. Their own internal testing revealed that they are one of the supposedly rare cases where this optimization did not have a positive effect noticeable enough to be worth the costs.

        • nearbuy4 hours ago |parent

          According to the post, "the change in the file size will result in minimal changes to load times - seconds at most."

          It didn't help their game load noticeably faster. They just hadn't checked if the optimization actually helped.

          • jeroenhd3 hours ago |parent

            The actual source (https://store.steampowered.com/news/app/553850/view/49158394...) says:

            > Only a few seconds difference?

            > Further good news: the change in the file size will result in minimal changes to load times - seconds at most. “Wait a minute,” I hear you ask - “didn’t you just tell us all that you duplicate data because the loading times on HDDs could be 10 times worse?”. I am pleased to say that our worst case projections did not come to pass. These loading time projections were based on industry data - comparing the loading times between SSD and HDD users where data duplication was and was not used. In the worst cases, a 5x difference was reported between instances that used duplication and those that did not. We were being very conservative and doubled that projection again to account for unknown unknowns.

            > Now things are different. We have real measurements specific to our game instead of industry data. We now know that the true number of players actively playing HD2 on a mechanical HDD was around 11% during the last week (seems our estimates were not so bad after all). We now know that, contrary to most games, the majority of the loading time in HELLDIVERS 2 is due to level-generation rather than asset loading. This level generation happens in parallel with loading assets from the disk and so is the main determining factor of the loading time. We now know that this is true even for users with mechanical HDDs.

            They measured first, accepted the minimal impact, and then changed their game.

            • ghurtado3 hours ago |parent

              > They measured first,

              No, they measured it now, not first. The very text you pasted is very clear about that, so I'm not sure why you're contradicting it.

              If they had measured it first, this post would not exist.

            • the_af3 hours ago |parent

              But this means that, before, they blindly trusted some stats without actually testing how their game performed with and without the optimization?

              • johnmaguire3 hours ago |parent

                Yes, but I think maybe people in this thread are painting it unfairly? Another way to frame it is that they used industry best practices and their intuition to develop the game, then revisited their decisions to see if they still made sense. When they didn't, they updated the game. It's normal for any product to be imperfect on initial release. It's part of actually getting to market.

                • the_af2 hours ago |parent

                  To be clear, I don't think it's a huge sin. It's the kind of mistake all of us make from time to time. And it got corrected, so all's well that ends well.

              • JohnBooty2 hours ago |parent

                FWIW, the PC install size was reasonable at launch. It just crept up slowly over time.

                    But this means that before they blindly trusted 
                    some stats without actually testing how their 
                    game performed with and without it?
                
                Maybe they didn't test it with their game because their game didn't exist yet, because this was a decision made fairly early in the development process. In hindsight, yeah... it was the wrong call.

                I'm just a little baffled by people harping on this decision and deciding that the developers must be stupid or lazy.

                I mean, seriously, I do not understand. Like, what do you get out of that? Would that make you happy or satisfied somehow?

                • the_af2 hours ago |parent

                  Go figure: people are downvoting me but I never once said developers must be stupid or lazy. This is a very common kind of mistake developers make: premature optimization without considering the actual bottlenecks, and without testing whether theoretical optimizations actually make any difference. I know I'm guilty of this!

                  I never called anyone lazy or stupid, I just wondered whether they blindly trusted some stats without actually testing them.

                  > FWIW, the PC install size was reasonable at launch. It just crept up slowly over time

                  Wouldn't this mean their optimization mattered even less back then?

                  • JohnBootyan hour ago |parent

                        premature optimization
                    
                    One of those absolutely true statements that can obscure a bigger reality.

                    It's certainly true that a lot of optimization can and should be done after a software project is largely complete. You can see where the hotspots are, optimize the most common SQL queries, whatever. This is especially true for CRUD apps where you're not even really making fundamental architecture decisions at all, because those have already been made by your framework of choice.

                    Other sorts of projects (like games or "big data" processing) can be a different beast. You do have to make some of those big, architecture-level performance decisions up front.

                    Remember, for a game... you are trying to process player inputs, do physics, and render a complex graphical scene in 16.7 milliseconds or less. You need to make some big decisions early on; performance can't entirely just be sprinkled on at the end. Some of those decisions don't pan out.

                        > FWIW, the PC install size was reasonable at launch. It just crept up slowly over time
                    
                        Wouldn't this mean their optimization mattered even less back then?
                    
                    I don't see a reason to think this. What are you thinking?

                    • the_afan hour ago |parent

                      > One of those absolutely true statements that can obscure a bigger reality.

                      To be clear, I'm not misquoting Knuth if that's what you mean. I'm arguing that in this case, specifically, this optimization was premature, as evidenced by the fact it didn't really have an impact (they explain other processes that run in parallel dominated the load times) and it caused trouble down the line.

                      > Some of those decisions don't pan out.

                      Indeed, some premature optimizations will and some won't. I'm not arguing otherwise! In this case, it was a bad call. It happens to all of us.

                      > I don't see a reason to think this. What are you thinking?

                      You're right, I got this backwards. While the time savings would have been minimal, the data duplication wasn't that big so the cost (for something that didn't pan out) wasn't that bad either.

        • jjmarr3 hours ago |parent

          If this is a common issue in the industry, why don't game devs make a user-visible slider to control dedup?

          I have friends who play one or two games and want them to load fast. Others have dozens and want storage space.

          • ThrowawayR22 hours ago |parent

            Any developer could tell you that it's because that would be extra code, extra UI, extra localization, extra QA, etc. for something nonessential that could be ignored in favor of adding something that increases the chance of the game managing to break even.

      • WreckVenom3 hours ago |parent

        It is a trade-off. The game was developed on a discontinued engine, it has had numerous problems with balance and performance, and generally there were IMO far more important bugs. Super Helldive difficulty wasn't available because of performance issues.

        I've racked up 700 hours in the game and I didn't care about the storage requirements.

      • weavejester3 hours ago |parent

        > They’re not the ones bearing the cost.

        I'm not sure that's necessarily true... Customers have limited space for games; it's a lot easier to justify keeping a 23GB game around for occasional play than it is for a 154GB game, so they likely lost some small fraction of their playerbase they could have retained.

        • Tostino3 hours ago |parent

          That is a feature for franchise games like CoD.

      • oersted2 hours ago |parent

        Gamers are quite vocal about such things, people end up hearing about it even if they don’t check directly.

        And this being primarily a live-service game drawing revenues from micro-transactions, especially a while after launch, and the fact that base console drives are still quite small to encourage an upgrade (does this change apply to consoles too?), there’s probably quite an incentive to make it easy for users to keep the game installed.

      • scruple4 hours ago |parent

        Studios store a lot of builds for a lot of different reasons. And generally speaking, in AAA I see PlayStation being the biggest pig, so I would wager their PS builds are at least the same size if not larger. People knew and probably raised alarm bells that fell by the wayside, because it's easier/cheaper to throw money at storage solutions than it is at engineering.

      • runningRicky4 hours ago |parent

        > I’d wager very few check the hard disk requirements

        I have to check. Your assumption is correct. I am one of very few.

        I don't know the numbers and I'm gonna check in a sec but I'm wondering whether the suppliers (publishers or whoever is pinning the price) haven't screwed up big time by driving prices and requirements without thinking about the potential customers that they are going to scare away terminally. Theoretically, I have to assume that their sales teams account for these potentials but I've seen so much dumb shit in practice over the past 10 years that I have serious doubts that most of these suits are worth anything at all, given that grown up working class kids--with up to 400+ hours overtime per year, 1.3 kids on average and approx. -0.5 books and news read per any unit of time--can come up with the same big tech, big media, economic and political agendas as have been in practice in both parts of the world for the better part of our lives--if you play "game master" for half a weekend where you become best friends with all the kiosks in your proximity.

        > the effect on their bottom line is negligible

        Is it, though? My bold, exaggerated assumption is that they would have had 10% more sales AND players.

        And the thing is that at any point in time when I, and a few people I know, had the time and desire to play, we would have had to either clean up our drives or invest the game price + SSD price for about 100 hours of fun over the course of months. We would gladly have gotten a taste for it, but no industry promises can compensate for demanding even more effort from us than enough of us already put in at work. As a result, at least 5 buyers and players lost, and at work and elsewhere you hear, "yeah, I would, if I had some guys to play with" ...

        • JohnBooty2 hours ago |parent

          I do not think the initial decision-making process was "hey, screw working-class people... let's have a 120GB install size on PC."

          My best recollection is that the PC install size was a lot more reasonable at launch. It just crept up over time as they added more content over the last ~2 years.

          Should they have addressed this problem sooner? Yes.

      • zelphirkalt3 hours ago |parent

        Which goes to show that they don't care about the user, only about the user's money.

        • horsawlarway3 hours ago |parent

          No - because most users also don't check install size on games, and unlike renting overpriced storage from a cloud provider, users paid a fixed price for storage up front and aren't getting price gouged nearly as badly. So it's a trade that makes sense.

          Both entrants in the market are telling you that "install size isn't that important".

          If you asked the player base of this game whether they'd prefer a smaller size, or more content - the vast majority would vote content.

          If anything, I'd wager this decision was still driven by internal goals for the company, because producing a 154GB artifact and storing it for things like CI/CD is still quite expensive if you have a decent number of builds/engineers. Both in time and money.

          • zelphirkalt2 hours ago |parent

            So guide me through this thought process:

            You are saying that most users don't check the install size of their games. Which I am not convinced of, but it might even be true. Let's assume it to be true for the moment. How does this contradict what I stated? How does users being uninformed or unaware of technical details make it so that cramming the user's disk full is suddenly "caring" instead of "not caring"? To me this does not compute. Users will simply have a problem later, when their TBs of disk space have been filled with multiple such disk space wasters. Wasting this much space is user-hostile.

            Next you are talking about _content_, which most likely doesn't factor in that much at all. Most of that stuff is high resolution textures, not content. It's not like people are getting significantly more content in bigger games. It is a graphics craze that many people don't even need. I am still running around with 2 full-HD screens, and I don't give a damn about 4k textures. I suspect that a big number of users don't even have the hardware to run modern games fluently at 4k.

          • ajsnigrutin3 hours ago |parent

            154GB is A LOT still.

            I mean... A few years ago, 1TB SSDs were still the best buy and many people still haven't upgraded, and wasting 15% of your total storage on just one game is still a pain for many.

    • clusterhacks4 hours ago |parent

      I started my career as a software performance engineer. We measured everything across different code implementations, multiple OS, hardware systems, and in various network configurations.

      It was amazing how often people wanted to optimize stuff that wasn't a bottleneck in overall performance. Real bottlenecks were often easy to see when you measured and usually simple to fix.

      But it was also tough work in the org. It was tedious, time-consuming, and involved a lot of experimental comp sci work. Plus, it was a cost center (teams had to give up some of their budget for perf engineering support) and even though we had racks and racks of gear for building and testing end-to-end systems, what most dev teams wanted from us was to give them all our scripts and measurement tools to "do it themselves" so they didn't have to give up the budget.

      • mikepurvis4 hours ago |parent

        That sounds like fascinating work, but also kind of a case study in how a manager's role is to "clear the road" and handle the lion's share of that internal advocacy and politicking so that ICs don't have to deal with it.

      • PunchyHamster3 hours ago |parent

        It's because patting yourself on the back for getting a 5x performance increase in a microbenchmark feels good and looks good on a yearly review.

        > But it was also tough work in the org. It was tedious, time-consuming, and involved a lot of experimental comp sci work. Plus, it was a cost center (teams had to give up some of their budget for perf engineering support) and even though we had racks and racks of gear for building and testing end-to-end systems, what most dev teams wanted from us was to give them all our scripts and measurement tools to "do it themselves" so they didn't have to give up the budget.

        Misaligned budgeting and goals are the bane of good engineering. I've seen some absolutely stupid stuff, like a client outsourcing the hosting of a simple site to us, because they would rather hire a 3rd party to buy a domain and put a simple site there (some advertising) than deal with their own security guys and host it on their own infrastructure.

        "It's a cost center" "So is fucking HR, why you don't fire them ?" "Uh, I'll ignore that, pls just invoice anything you do to other teams" ... "Hey, they bought cloud solution that doesn't work/they can't figure it out, can you help them" "But we HAVE stuff doing that cheaper and easier, why they didn't come to us" "Oh they thought cloud will be cheaper and just work after 5 min setup"

      • loeg3 hours ago |parent

        In an online services company, a perf team can be net profitable rather than a "cost center." The one at my work routinely finds quantifiable savings that more than justify their cost.

        There will be huge mistakes occasionally, but mostly it is death by a thousand cuts -- it's easy to commit a 0.1% regression here or there, and there are hundreds of other engineers per performance engineer. Clawing back those 0.1% losses a couple times per week over a large deployed fleet is worthwhile.
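
        Rough math on how those small cuts compound; the rates here are assumptions for illustration, not real fleet numbers:

            # Suppose a couple of 0.1% regressions slip in per week across the fleet
            # and nobody claws them back. Compounded over a year:
            regressions_per_week = 2
            yearly_factor = (1.001 ** regressions_per_week) ** 52
            print(f"{yearly_factor - 1:.1%}")  # ~11% extra capacity needed after one year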

        • dogleashan hour ago |parent

          > The one at my work routinely finds quantifiable savings that more than justify their cost.

          That doesn't mean manager so-and-so gets $X of engineering budget reimbursed. And pointing out that some companies would, or that some managers eat the spend anyway... none of that fixes the parent poster's anecdote. Let alone shortsighted management in general.

    • deng5 hours ago |parent

      11% still play HD2 with a spinning drive? I would've never guessed that. There's probably some vicious circle thing going on: because the install size is so big, people need to install it on their secondary, spinning drive...

      • amlib5 hours ago |parent

        Even though I have two SSDs in my main machine I still use a hard drive as an overflow for games that I judge are not SSD worthy.

        Because it's a recent 20TB HDD, the read speeds approach 250MB/s. I've also specifically partitioned the beginning of the disk just for games, so that it can sustain full transfer speeds without files falling onto the slower tracks; the rest of the disk is then partitioned for media files that won't care much about the speed loss. It's honestly fine for the vast majority of games.

        • deng5 hours ago |parent

          > It's honestly fine for the vast majority of games.

          Yes, because they apparently still duplicate data so that the terrible IOPS of spinning disks does not factor as much. You people need to stop with this so that we can all have smaller games again! ;-) <--- (IT'S A JOKE)

        • Pet_Ant4 hours ago |parent

          I install all my games on an HDD but then use PrimoCache to cache them in RAM.

          https://www.romexsoftware.com/en-us/primo-cache/

          • jquery3 hours ago |parent

            PrimoCache is awesome, highly recommended. I’d only say to make sure your computer is rock stable before installing it, in my limited experience it exponentially increases the risk of filesystem corruption if your computer is unstable.

      • superkuh3 hours ago |parent

        It is no surprise to me that people still have to use HDD for storage. SSD stopped getting bigger a decade plus ago.

        SSD sizes are still only equal to the HDD sizes available and common in 2010 (a couple of TB or so). SSD size increases (availability + price decreases) in consumer form factors have entirely stopped. There is no more progress for SSDs because quad-level cells are as far as the charge-trap tech can be pushed, and most people no longer own computers. They have tablets or phones, or if they have a laptop it has 256GB of storage and everything is done in the cloud or with an octopus of (small) externals.

        • loeg2 hours ago |parent

          SSDs did not "stop getting bigger a decade plus ago." The largest SSD announced in 2015 was 16TB. You can get 128-256TB SSDs today.

          You can buy 16-32TB consumer SSDs on NewEgg today. Or 8TB in M.2 form factor. In 2015, the largest M.2 SSDs were like 1TB. That's merely a decade. At a decade "plus," SSDs were tiny as recently as 15 years ago.

          • __david__an hour ago |parent

            Perhaps my searching skills aren’t great, but I don’t see any consumer SSDs over 8TB. Can you share a link? It was my understanding that SSDs have plateaued due to wattage restrictions across SATA and M.2 connections. I’ve only seen larger SSDs in U.3 and E[13].[SL] form factors, which I would not call consumer.

          • numpad02 hours ago |parent

            But the mainstream is still at 500GB-2TB ranges, so...

            • jandresean hour ago |parent

              The mainstream drives are heavily focused on lowering the price. Back in the 2010s SSDs in the TB range were hundreds of dollars, today you can find them for $80 without breaking a sweat[1]. If you're willing to still spend $500 you can get 8TB drives[2].

              [1] https://www.microcenter.com/product/659879/inland-platinum-1...

              [2] https://www.microcenter.com/product/700777/inland-platinum-8...

        • everdrive2 hours ago |parent

          I read that SSDs aren't actually guaranteed to keep your data if powered off for an extended period of time, so I still do my backups on HDDs. Someone please correct me if this is wrong.

        • PunchyHamster3 hours ago |parent

          I bought 4x the storage (1TB -> 4TB) for half the price after my SSD died after 5 years (thanks Samsung). What do you mean they 'stopped getting bigger'?

          Sure, there are some limitations in the form factor, you can only shove so many chips onto an M.2 card, but you can get U.2 ones that are bigger than the biggest HDD (tho the price is pretty eye-watering).

          • superkuh2 hours ago |parent

            By stopped getting bigger I mean people still think 4TB is big in 2025, just like in 2010 when 3-4TB was the max size for consumer storage devices. U.2/U.3 is not consumer yet, unfortunately. I have to use M.2 NVMe to U.2 adapters, which are not great. And as you say, the low number of consumer CPU+mobo PCIe lanes has been a restriction on the number of disks until just recently. At least in 2025 we can have more than 2 NVMe storage disks again without disabling a PCIe slot.

      • robin_reala4 hours ago |parent

        Presumably that's only 11% of PC players; approximately 100% of console players will be on SSDs.

        • vardump4 hours ago |parent

          Rather, 89% of PC players have an SSD. For console players, much fewer do. People (read: hordes of kids) are still using the PS3, PS4, etc.

          • Narishma3 hours ago |parent

            I think they were talking specifically about this game, which is only available on PS5 and PC.

            Edit: Forgot it was released recently on Xbox Series consoles but those also have SSDs.

          • ZekeSulastin4 hours ago |parent

            Which doesn’t matter at all in the case of Helldivers 2 as it’s only available for PC, PS5, and XBS/X. That’s a good part of why PC players were so irritated, actually: when all this blew up a few months ago, the PC install size was ~133 GB vs the consoles’ 36 GB.

          • theoldgreybeard3 hours ago |parent

            And only a very small portion of that 11% have ONLY an HDD, but now that it’s only 23GB it’s easier to justify using your precious SSD space for it.

          • jsheard4 hours ago |parent

            Helldivers 2 is only on current gen consoles so older ones are beside the point, the current ones use NVMe SSDs exclusively. PC is the only platform where HDDs or SATA SSDs might still come up.

    • PoignardAzur5 hours ago |parent

      I don't find it surprising at all. A ton of developers do optimizations based on vibes and very rarely check if they're actually getting a real benefit from it.

      • bombcar5 hours ago |parent

        This is the moral behind "premature optimization is the root of all evil" - you could say preconceived just as easily.

        • embedding-shape5 hours ago |parent

          > you could say preconceived just as easily

          Would have saved us from all the people who reject any sort of optimization work because for them it is always "too early" since some product team wanted their new feature in production yesterday, and users waiting 5 seconds for a page load isn't considered bad enough just yet.

          • Capricorn24813 hours ago |parent

            Premature optimization doesn't mean "We have an obvious fix sitting in front of us that will definitely improve things."

            It means "We think we have something that could help performance based on a dubiously applicable idea, but we have no real workload to measure it on. But we're going to do it anyway."

            So it doesn't save us from anything, it potentially delays launching and gives us the same result that product team would have given us, but more expensive.

            • embedding-shape3 hours ago |parent

              Yes, you and I understand that quote, probably mostly because we've both read all the text around the quote too, not just the quote itself. But there are a lot of people who dogmatically follow things others write without first digging deeper, and it's those people I was talking about before. Lots of people seemingly run on whatever soundbites they can remember.

              • cratermoon2 hours ago |parent

                While I know the paper pretty well, I still tend to phrase my objections by asking something along the lines of "do you have any benchmarks for the effects of that change?"

            • PunchyHamster3 hours ago |parent

              > It means "We think we have something that could help performance based on a dubiously applicable idea, but we have no real workload to measure it on. But we're going to do it anyway."

              The problem is that it doesn't say that directly, so people without experience take it at face value.

        • TeMPOraL4 hours ago |parent

          Counterpoint: data-driven development often leads to optimizations like this not being made, because the developers aren't the ones affected, their customers are. And the software market is weird this way: low barriers to entry, yet almost nothing is a commodity, so there's no competitive pressure to help here either.

        • PunchyHamster3 hours ago |parent

          Honestly, looking at it over time, I think that phrase did more harm than good.

          Yes, of course you shouldn't optimize before you get your critical path stable and benchmark which parts take too much.

          But many, many times it is used as an excuse to delay optimisation so far that it is now hard to do, because it would require rewriting parts that "work just fine", or it is skipped because the slowness stays at a just-tolerable level.

          I have a feeling that just spending 10-20% more time on a piece of code, giving it a glance over whether it couldn't be more optimal, would pay for itself very quickly compared to a bigger rewrite months after the code was written.

    • JohnBooty3 hours ago |parent

          I expect it's a story that'll never get told in 
          enough detail to satisfy curiosity, but it certainly 
          seems strange from the outside for this optimisation 
          to be both possible and acceptable.
      
      From a technical perspective, the key thing to know is that the console install size for HD2 was always that small -- their build process assumed SSD on console so it didn't duplicate stuff.

      154GB was the product of massive asset duplication, as opposed to 23GB being the product of an optimization miracle. :)

      How did it get so bad on PC?

      Well, it wasn't always so crazy. I remember it being reasonable closer to launch (almost 2 years ago) and more like ~40-60GB. Since then, the devs have been busy. There has been a LOT of reworking and a lot of new content, and the PC install size grew gradually rather than suddenly.

      This was probably impacted to some extent by the discontinued game engine they're using. Bitsquid/Stingray was discontinued partway through HD2 development and they continued on with it rather than restarting production entirely.

      https://en.wikipedia.org/wiki/Bitsquid

    • jeffwask5 hours ago |parent

      Game companies these days barely optimize engine graphical performance before release, never mind the package size or patching speed. They just stamp higher minimum system requirements on the package.

    • jsheard5 hours ago |parent

      From a business perspective the disk footprint is only a high cost if it results in fewer sales, which I doubt it does to any significant degree. It is wasteful, but I can see why optimization efforts would get focused elsewhere.

      • code_for_monkey5 hours ago |parent

        I think certain games don't even bother to optimize the install size so that you can't fit other games on the hard drive; I think COD games are regularly hundreds of gigs.

        • KeplerBoy5 hours ago |parent

          Having a humongous game might be a competitive advantage in the era of live-service games.

          Users might be more hesitant to switch to another game if it means uninstalling yours and reinstalling is a big pain in the backside due to long download times.

          • mitthrowaway24 hours ago |parent

            More likely once they uninstall it, they never reinstall it because they'd have to clear out so much other stuff to fit it back in.

            • KeplerBoy4 hours ago |parent

              I guess it's a lock-in effect of sorts.

        • snet05 hours ago |parent

          I've often seen people mention that one reason for games like Call of Duty being so enormous is optimising for performance over storage. You'd rather decompress textures/audio files at install time than at run time, because you download/install so infrequently.
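
          A rough sketch of that trade-off; the sizes and throughput numbers are guesses for illustration, not measurements from any particular game:

              # Loading a 2 GB asset bundle stored either compressed (say 1 GB on disk)
              # or uncompressed, assuming the read and decompress steps don't overlap.
              disk_mb_s = 2000    # assumed NVMe-ish sequential read speed
              decomp_mb_s = 800   # assumed decompression throughput for a heavy codec

              compressed = 1024 / disk_mb_s + 2048 / decomp_mb_s  # read, then inflate
              uncompressed = 2048 / disk_mb_s                     # read only

              print(f"compressed:   {compressed:.1f} s")    # ~3.1 s
              print(f"uncompressed: {uncompressed:.1f} s")  # ~1.0 s, at 2x the install size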

          • justsomehnguyan hour ago |parent

            It's amusing that we had 2x-10x compression and a negligible performance hit on a 486 with DBLSPACE, yet in 2020+...

        • jsheard4 hours ago |parent

          > I think COD games are regularly hundreds of gigs

          I looked up the size of the latest one, and Sony's investment in RAD Kraken seems to be paying dividends:

          Xbox: 214 GB

          PC: 162 GB

          PS5: 96 GB

      • Ekaros5 hours ago |parent

        Also the cost is often offloaded to the "greedy" Valve... So there is less pressure to optimize their own CDN use.

        • jsheard5 hours ago |parent

          Yeah, I don't think any of the stores charge developers in proportion to how much bandwidth they use. If that changed then the priorities could shift pretty quickly.

          Publishers do have to care somewhat on the Switch since Nintendo does charge them more for higher capacity physical carts, but a lot of the time they just sidestep that by only putting part (or none) of the game on the cart and requiring the player to download the rest.

          • PunchyHamster3 hours ago |parent

            It's not really all that big of a cost; serving a few hundred GBs costs pennies, despite what the prices of S3 storage and bandwidth might lead some people to believe.

      • bombcar5 hours ago |parent

        Given how many Steam games are bought but never even installed, it would seem not terribly worth optimizing for.

        On phones, I bet you see some more effort.

        • georgeecollins5 hours ago |parent

          Both things are sort of true. It's not sales where size can hurt you but retention, which is why it tended to matter more on phones. When you need space on your device, the apps are listed from largest to smallest.

          On both phones and PCs storage has just grown, so it's less of an issue. The one thing I have noticed is that Apple does its price windowing around memory, so you pay an absurd amount for an extra 128 GB. The ultra-competitive Chinese phone market crams high-end phones with a ton of memory and battery. So some popular Chinese phone games are huge compared to ones made for the iPhone.

      • LtWorf5 hours ago |parent

        It might but they have no way of measuring it so they won't take it into account.

    • jjk1663 hours ago |parent

      I'd bet any amount of money that a demo ran slow on one stakeholder's computer, which happened to have a mechanical hard drive; they attributed the slowness to the hard drive without a real investigation, and optimizing for mechanical hard drive performance became standard practice. The demo may not have even been for this game, just a case of once bitten, twice shy.

    • pdntspa2 hours ago |parent

      I have heard that in many scenarios it is faster to load uncompressed assets directly rather than load+decompress. Load time is prioritized over hard drive space so you end up with the current situation.

    • NBJack4 hours ago |parent

      The game is released on both PC and PS5, the latter of which was designed (and marketed) to take advantage of SSD speeds for streaming game content near real time.

      The latest Ratchet and Clank, the poster child used in part to advertise the SSD speed advantage, suffers on traditional hard drives as well in the PC port. Returnal is in the same boat. Both were originally PS5 exclusives.

      • mikepurvis4 hours ago |parent

        Noting in particular that the PS5's internal storage isn't just "an ssd", it's a gen 4 drive that can sequential-read at up to 5500 MB/s.

        By comparison, a SATA III port caps out at 6Gbps (around 600 MB/s usable), and first-generation NVMe drives ("gen 3") were limited to about 3500 MB/s.

        • PunchyHamster3 hours ago |parent

          the speed is one thing but seek times are just orders of magnitude different too.

          An SSD on SATA is still "not bad" for most games, but an HDD can be awful if the game doesn't do mostly sequential reads.

          • mikepurvisan hour ago |parent

            My understanding is that optimizing for sequential read is a big reason for historical game install bloat; if you include the common assets multiple times in the archive, then loading a level/zone becomes one big continuous slurp rather than jumping all over the place to pick up the stuff that's common to everything. Obviously this didn't matter with optical media where the user wasn't impacted, but it's annoying on PC where we've had a long period of users who invested in expensive, high-performance storage having to use more of it than needed due to optimizations geared at legacy players still on spinning rust.

            I expect that low-latency seek time is also pretty key to making stuff like nanite work, where all the LODs for a single object are mixed together and you need to be able to quickly pick off the disk the parts that are needed for the current draw task.
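
            Back-of-envelope version of that seek-versus-sequential argument, using assumed HDD characteristics rather than anything measured from this game:

                # Assumed HDD figures: ~10 ms per seek, ~150 MB/s sequential throughput.
                seek_s, seq_mb_s = 0.010, 150
                level_mb, scattered_files = 2048, 3000  # 2 GB of assets shared across the install

                scattered = scattered_files * seek_s + level_mb / seq_mb_s  # hop all over the disk
                duplicated = level_mb / seq_mb_s                            # one contiguous slurp

                print(f"scattered:  {scattered:.0f} s")   # ~44 s
                print(f"duplicated: {duplicated:.0f} s")  # ~14 s, roughly the reported 3-5x ballpark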

      • shantara4 hours ago |parent

        The HDD performance suffers very much during the portal loading sequences in Ratchet and Clank, but even the entry level SSD performs fine, with little visible difference compared to the PS5 one. It’s more about random access speed than pure throughput

      • garaetjjte4 hours ago |parent

        I played Rift Apart from an HDD and, apart from extra loading time during looped animations, it was fine. On the other hand, Indiana Jones: The Great Circle was barely playable, with textures and models popping in everywhere.

    • bee_rider4 hours ago |parent

      IIRC this has been the “done thing” forever. I’m not in game development, but I think I recall hearing about it in the Xbox 360 era. Conventional options are picked by default, benchmarks are needed to overturn that. Looking at my hard drive, massive game installations are still very much the industry standard…

    • hinkley2 hours ago |parent

      It’s the same sort of apathy/arrogance that made new Windows versions run like dogshit on old machines. Gates really should have had stock in PC makers. He sold enough of them.

      I don’t think it’s always still the case but for more than ten years every release of OSX ran better on old hardware, not worse.

      Some people think the problem was MS investing too eagerly into upgrading developer machines routinely, giving them a false sense of what “fast enough” looked like. But the public rhetoric was so dismissive that I find that pretty unlikely. They just didn’t care. Institutionally.

      I’m not really into the idea of Helldivers in the first place but I’m not going to install a 150GB game this side of 2040. That’s just fucking stupid.

    • nerdjon5 hours ago |parent

      High cost to whom, though? We see the same thing when it comes to RAM and CPU usage: the developer is not the one paying for the hardware, and many gamers have shown that they will spend money on hardware to play a game they want.

      Sure, they may lose some sales, but I have never seen many numbers on how much it really impacted sales.

      Also on the disk side, I can't say I have ever looked at how much space is required for a game before buying it. If I need to clear out some stuff I will. Especially with it not being uncommon for a game to be in the 100gb realm already.

      That all being said, I am actually surprised by the 11% using mechanical hard drives. I figured that NVMe would be a lower percentage and many are using SSDs... but I figured the percentage of machines capable of running modern games in the first place that have mechanical drives would be far lower.

      I do wonder how long it will be until we see games just saying they are not compatible with mechanical drives.

      • onli5 hours ago |parent

        That already happened :) Starfield claimed not to support HDDs and really ran badly on them. And I think I've seen SSDs as a requirement for a few other games now, in the requirement listings on Steam.

        • embedding-shape5 hours ago |parent

          > Starfield claimed to not support HDDs and really ran bad with them.

          To be fair, at launch Starfield had pretty shit loading times even with blazing fast SSDs, and the game has a lot of loading screens, so makes sense they'll nip that one in the bud and just say it's unsupported with the slower type of disks.

      • literallywho5 hours ago |parent

        The latest Ratchet and Clank game relies heavily on the PS5’s NVMe drive. Its PC port states that an SSD is required. And IIRC, the experience on mechanical drives is quite terrible, to the point of being unplayable.

    • root_axis4 hours ago |parent

      Optimizing for disk space is very low on the priority list for pretty much every game, and this makes sense since it's very low on the list of customer concerns relative to things like in-game performance, netcode, tweaking game mechanics, balancing, etc.

      • Cthulhu_4 hours ago |parent

        Apparently, in-game performance is not more important than pretty visuals. But that's based on hearsay / what I remember reading ages ago, I have no recent sources. The tl;dr was that apparently enough people are OK with a 30 fps game if the visuals are good.

        I believe this led to a huge wave of 'laziness' in game development, where framerate wasn't too high up in the list of requirements. And it ended up with some games where neither graphics fidelity nor frame rate was a priority (one of the recent Pokemon games... which is really disappointing for one of the biggest multimedia franchises of all time).

        • Narishma3 hours ago |parent

          That used to be the case, but this current generation the vast majority of games have a 60 fps performance mode. On PS5 at least, I can't speak about other consoles.

    • PunchyHamster3 hours ago |parent

      It's not a cost to them. The cost is paid by consumers and platforms.

      Also, if the goal was to improve things for a small minority, they could've just spun it off into a free DLC, like some games do with 4K texture packs.

      • oreally3 hours ago |parent

        It would be ironic if incidents like this made Valve start charging companies for large file sizes of their games. It would go to show that good things get abused to no end if limits aren't set.

    • aeve8905 hours ago |parent

      >It seems bizarre to me that they'd have accepted such a high cost (150GB+ installation size!) without entirely verifying that it was necessary!

      You should look at COD install sizes and the almost weekly, ridiculously huge "updates". 150GB for a first install is almost generous considering most AAA games.

    • kasabali4 hours ago |parent

      You missed the most bizarre quote:

      > These loading time projections were based on industry data - comparing the loading times between SSD and HDD users where data duplication was and was not used. In the worst cases, a 5x difference was reported between instances that used duplication and those that did not. We were being very conservative and doubled that projection again to account for unknown unknowns

      Unfortunately it's not only game development; all of modern society seems to operate like this.

    • londons_explore4 hours ago |parent

      Twenty years ago I bought a 1TB hard drive... It wasn't very expensive either.

      Twenty years on, and somehow that's still 'big'.

      Computing progress disappoints me.

      • neuroelectron4 hours ago |parent

        They cite some kind of physical limit, but I think it's really market manipulation in order to bring about the cloud and centralization and control.

    • whalesalad4 hours ago |parent

      > It seems bizarre to me that they'd have accepted such a high cost

      Wait till you find out what engine this game is made in. https://80.lv/articles/helldivers-ii-was-built-on-an-archaic...

    • behringer4 hours ago |parent

      I think smaller game sizes would hurt sales. Your first thought on a 23GB game when other games are 100GB plus is: why is there so little content?

  • canucker20162 minutes ago

    Back in 2014, Titanfall's disk space was 75% UNCOMPRESSED audio (35GB of 48GB) for the gamers with only dual-core CPUs.

    from https://www.escapistmagazine.com/titanfall-dev-explains-the-...

      “On a higher PC it wouldn’t be an issue. On a medium or moderate PC, it wouldn’t be an issue, it’s that on a two-core [machine] with where our min spec is, we couldn’t dedicate those resources to audio.”

  • fleabitdev5 hours ago

    Back of the envelope, in the two years since the game was released, this single bug has wasted at least US$10,000,000 of hardware resources. That's a conservative estimate (20% of people who own the game keep it installed, the marginal cost of wasted SSD storage in a gaming PC is US$2.50 per TB per month, the install base grew linearly over time), so the true number is probably several times higher.
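
    For anyone who wants to redo the envelope, here is the arithmetic under those assumptions; the copies-sold figure is my own guess, not a published number:

        copies_sold = 13_000_000     # assumed; the studio hasn't published an exact figure
        installed_fraction = 0.20    # owners keeping the game installed
        wasted_tb = 0.131            # 154 GB shipped minus 23 GB actually needed
        usd_per_tb_month = 2.50      # marginal cost of SSD space in a gaming PC
        months = 24

        # Linear growth of the install base means the average over the period
        # is roughly half the final figure.
        avg_installs = copies_sold * installed_fraction / 2
        waste_usd = avg_installs * wasted_tb * usd_per_tb_month * months
        print(f"${waste_usd:,.0f}")  # ~$10.2M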

    In other words, the game studio externalised an eight-figure hardware cost onto their users, to avoid a five-to-six-figure engineering cost on their side.

    Data duplication can't just be banned by Steam, because it's a legitimate optimisation in some cases. The only safeguard against this sort of waste is a company culture which values software quality. I'm glad the developers fixed this bug, but it should never have been released to users in the first place.

    • ga2mer5 hours ago |parent

      >Data duplication can't just be banned by Steam

      Steam compresses games as much as possible, so in the case of Helldivers 2, you had to download between ~30 and ~40 GB, which was then unpacked to 150 GB (according to SteamDB[0])

      [0] https://steamdb.info/app/553850/depots/

      • deng4 hours ago |parent

        You are missing that each update takes AGES while it tortures your disk patching the files (on my machine it takes 15 min or so, and that's on an SSD). So I agree that this is careless, and it reminds me of the GTA5 startup time that was fixed by a dedicated player who finally had enough and reverse-engineered the problem (see https://nee.lv/2021/02/28/How-I-cut-GTA-Online-loading-times...). I still find these things hard to accept.

        • jeroenhd4 hours ago |parent

          Steam update durations depend on compression + CPU performance + SSD I/O. Things will be harder when the disk is almost full and live defragmentation kicks in to get free space for contiguous files. Some SSDs are fast enough to keep up with such a load, but a lot of them will quickly hit their DRAM limits and suddenly that advertised gigabyte per second write speed isn't all that fast. Bonus points for when your SSD doesn't have a heatsink and moving air over it, making the controller throttle hard.

          Patching 150GiB with a compressed 15GiB download just takes a lot of I/O. The alternative is downloading a fresh copy of the 150GiB install file, but those playing on DSL will probably let their SSD whizz a few minutes longer than spend another day downloading updates.

          If your SSD is slower than your internet capacity, deleting install files and re-downloading the entire game will probably save you some time.

        • butlike2 hours ago |parent

          The update to that post, where the author got a $10k reward from R*, brought a smile to my face

      • fleabitdev5 hours ago |parent

        In this case, the bug was 131 GB of wasted disk space after installation. Because the waste came from duplicate files, it should have had little impact on download size (unless there's a separate bug in the installer...)

        This is why the cost of the bug was so easy for the studio to ignore. An extra 131 GB of bandwidth per download would have cost Steam several million dollars over the last two years, so they might have asked the game studio to look into it.

        • rvnx4 hours ago |parent

          This article presents it as a big success, but it could be read the opposite way: "Developers of Helldivers 2 wasted 130 GB for years and didn't care because it was on other people's computers"

        • dotwaffle4 hours ago |parent

          > An extra 131 GB of bandwidth per download would have cost Steam several million dollars over the last two years

          Nah, not even close. Let's guess and say there were about 15 million copies sold. 15M * 131GB is about 2M TB (2000 PB / 2 EB). At 30% mean utilisation, a 100Gb/s port will do 10 PB in a month, and at most IXPs that costs $2000-$3000/month. That makes it about $400k in bandwidth charges (I imagine 90%+ is peered or hosted inside ISPs, not via transit), and you could quite easily build a server that would push 100Gb/s of static objects for under $10k a pop.
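
          Spelling that arithmetic out (same rough inputs as above):

            copies   = 15_000_000
            extra_gb = 131
            total_pb = copies * extra_gb / 1e6                   # ~1965 PB, i.e. ~2 EB

            gb_per_sec     = 100 * 0.30 / 8                      # 100Gb/s port at 30% utilisation
            pb_per_port_mo = gb_per_sec * 30 * 24 * 3600 / 1e6   # ~9.7 PB per port per month

            port_months = total_pb / pb_per_port_mo              # ~200 port-months
            print(port_months * 2000, port_months * 3000)        # ~$400k-$600k in port fees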

          It would surprise me if the total additional costs were over $1M, considering they already have their own CDN setup. One of the big cloud vendors would charge $100M just for the bandwidth, let alone the infrastructure to serve it, based on some quick calculation I've done (probably incorrectly) -- though interestingly, HN's fave non-cloud vendor Hetzner would only charge $2M :P

          • fleabitdev3 hours ago |parent

            Isn't it a little reductive to look at basic infrastructure costs? I used Hetzner as a surrogate for the raw cost of bandwidth, plus overheads. If you need to serve data outside Europe, the budget tier of BunnyCDN is four times more expensive than Hetzner.

            But you might be right - in a market where the price of the same good varies by two orders of magnitude, I could believe that even the nice vendors are charging a 400% markup.

          • icecube1232 hours ago |parent

            Yea, I always laugh when folks talk about how expensive they claim bandwidth is for companies. Large “internet” companies are just paying a small monthly cost for transit at an IX. They aren't paying $xx/gig ($1/gig) like the average consumer is. If you buy a 100gig port for $2k, it costs the same whether you're pushing 5 GB a day or 8 PB a day.

          • stanac4 hours ago |parent

            Off topic question.

            > I imagine 90%+ is peered or hosted inside ISPs, not via transit

            How does hosting inside ISPs work? Does the ISP have to MITM? I heard similar claims for Netflix and other streaming media, like ISPs host/cache the data themselves. Do they have to have some agreement with Steam/Netflix?

            • icecube1232 hours ago |parent

              Yea, Netflix will ship a server to an ISP (Cox, Comcast, Starlink, Rogers, Telus etc.) so the customers of that ISP can access that server directly. It improves performance for those users and reduces the load on the ISP’s backbone/transit. I'm guessing other large companies do this as well.

              A lot of people are using large distributed DNS resolvers like 8.8.8.8 or 1.1.1.1, and these can sometimes direct users to incorrect CDN servers, so EDNS Client Subnet (ECS) was created to help with it. I always use 9.9.9.11 instead of 9.9.9.9 to hopefully help improve performance.

            • detaro3 hours ago |parent

              The CDN/content provider ships servers to the ISP, which puts them into its network. The ISP just provides connectivity and isn't involved at the content level, so no MITM etc. is needed.

              • bauruine3 hours ago |parent

                More details here https://as32590.net/steamcache/

      • embedding-shape5 hours ago |parent

        Makes sense; the initial claim was that HD2's size was mainly because of duplicated assets, and any compression worth its salt would de-duplicate those effectively.

    • WreckVenom5 hours ago |parent

      From the story:

      > Originally, the game’s large install size was attributed to optimization for mechanical hard drives since duplicating data is used to reduce loading times on older storage media. However, it turns out that Arrowhead’s estimates for load times on HDDs, based on industry data, were incorrect.

      It wasn't a bug. They made a decision on what to optimise which was based on incomplete / incorrect data and performed the wrong optimisation as a result.

      As a player of the game, I didn't really care that it took up so much space on my PC. I have 2TB dedicated for gaming.

      • zelphirkalt3 hours ago |parent

        Why not offer two versions for download and let the user choose whether they want to block their whole disk with a single game or accept slightly longer loading times? Or let the user make an informed decision at installation time by explaining the supposed optimization? Or let the user decide before downloading what resolution (ergo textures) they want as the highest resolution they will play the game at, and only download the textures they need up to that resolution?

        Questions, questions, questions.

        • WreckVenoman hour ago |parent

          Because all of these suggestions require developer resources. A quick web search suggests they have ~150 employees, while a lot of triple-A studios have thousands or tens of thousands. So they are a relatively small game studio.

          Also note that they are adding more game modes and more warbonds, and the game is multi-platform and multiplayer. The dev team is relatively small compared to other game studios.

          The engine the game is built on is discontinued and, I believe, unsupported. IIRC they are rewriting the game in UE5 because of the issues with the unsupported engine.

          A lot of people have problems with Arrowhead (there has been some drama between Arrowhead and the community). The install size of the game, while a problem, wasn't the top problem. Bigger issues in my mind, as someone that plays the game regularly, are:

          e.g.

          - The newest updates to the game added some new enemy types which are quite unfair to fight against IMO (Dragon Roach and the War Strider).

          - The other complaint was that performance/stability of the game was causing issues with streamers' PCs. Some people claimed the game was breaking their PCs (I think this was BS and their PCs were just broken anyway). However, there was a real performance problem in the game, which was resolved with a patch a few weeks ago. That greatly improved the game IMO.

      • ThrowawayTestr4 hours ago |parent

        SSD or HDD?

        • WreckVenom4 hours ago |parent

          SSD. Prices got reasonable sometime last year for 2TB NVME/SSD

    • pavel_lishin3 hours ago |parent

      > the marginal cost of wasted SSD storage in a gaming PC is US$2.50 per TB per month

      Out of curiosity, how do you come up with a number for this? I would have zero idea of how to even start estimating such a thing, or even be able to tell you whether "marginal cost of wasted hard drive storage" is even a thing for consumers.

      • fleabitdev2 hours ago |parent

        I'd be very interested in hearing alternative estimates, but here's my working:

        The lowest cost I could find to rent a server SSD was US$5 per TB-month, and it's often much higher. If we assume that markets are efficient (or inefficient in a way that disadvantages gaming PCs), we could stop thinking there and just use US$2.50 as a conservative lower bound.

        I checked the cost of buying a machine with a 2 TB rather than 1 TB SSD; it varied a lot by manufacturer, but it seemed to line up with $2.50 to $5 per TB-month on a two-to-five-year upgrade cycle.
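
        In other words (the price spread for an extra terabyte is my own reading of retail/prebuilt pricing, not a hard number):

          # implied $/TB-month if an extra 1 TB costs `extra_usd` and the machine
          # is replaced every `years` years
          for extra_usd in (60, 150, 300):
              for years in (2, 5):
                  print(extra_usd, years, round(extra_usd / (years * 12), 2))
          # the middle of that grid lands around $2.50-$5 per TB-month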

        One reason I halved the number is because some users (say, a teenager who only plays one game) might have lots of unused space in their SSD, so wasting that space doesn't directly cost them anything. However, unused storage costs money, and the "default" or "safe" size of the SSD in a gaming PC is mostly determined by the size of games - so install size bloat may explain why that "free" space was purchased in the first place.

        > whether "marginal cost of wasted hard drive storage" is even a thing for consumers

        As long as storage has a price, use of storage will have a price :-)

    • PunchyHamster3 hours ago |parent

      "having one less game installed on your SSD" isn't exactly same as cost per TB, it's just slight wasted convenience at worst

    • mrec3 hours ago |parent

      > the marginal cost of wasted SSD storage in a gaming PC is US$2.50 per TB per month

      Where are you getting this number from? Not necessarily arguing with it, just curious.

    • zelphirkalt4 hours ago |parent

      I should probably look up the company that made the game or the publisher and avoid games they make in the future.

      • WreckVenom3 hours ago |parent

        That would be a shame because the game is honestly very good despite its flaws, is a lot of fun and has a decent community.

  • rincebrain6 hours ago

    I've been really curious precisely what changed, and what sort of optimization might have been involved here.

    Because offhand, I know you could do things like cute optimizations of redundant data to minimize seek time on optical media, but with HDDs, you get no promises about layout to optimize around...

    The only thing I can think of is if it was literally something as inane as checking the "store deduplicated by hash" option in the build, on a tree with copies of assets scattered everywhere, and nobody had ever checked whether the fear around that option was borne out in practice.

    (I know they said in the original blog post that it was based around fears of client performance impact, but the whole reason I'm staring at that is that if it's just a deduplication table at storage time, the client shouldn't...care? It's not writing to the game data archives, it's just looking stuff up either way...)
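
    For what it's worth, "store deduplicated by hash" at build time usually amounts to something like this toy sketch (not the engine's or Arrowhead's actual tooling), and the client-side lookup is the same either way: resolve a path to a hash, read the blob.

      import hashlib, json, os, shutil

      def pack_dedup(asset_dir, out_dir):
          """Toy content-addressed packer: identical asset contents are stored once
          as a blob named by its hash; an index maps logical paths to hashes."""
          os.makedirs(os.path.join(out_dir, "blobs"), exist_ok=True)
          index = {}
          for root, _, files in os.walk(asset_dir):
              for name in files:
                  path = os.path.join(root, name)
                  with open(path, "rb") as f:
                      digest = hashlib.sha256(f.read()).hexdigest()
                  blob = os.path.join(out_dir, "blobs", digest)
                  if not os.path.exists(blob):          # duplicate contents land here only once
                      shutil.copyfile(path, blob)
                  index[os.path.relpath(path, asset_dir)] = digest
          with open(os.path.join(out_dir, "index.json"), "w") as f:
              json.dump(index, f)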

    • alias_neo5 hours ago |parent

      I'm not entirely clear what you're trying to say, but, my understanding is that they simply put lots of copies of files in lots of places like games have done for a long time, in the hopes it would lower seek times on HDDs for those players who use them.

      They realised, after a lot of players asking, that it wasn't necessary, and probably had less of an impact than they thought.

      They removed the duplicates, and drastically cut the install size. I updated last night, and the update alone was larger than the entire game after this deduplication run, so I'll be opting in to the Beta ASAP.

      It's been almost a decade since I ran spinning rust in a desktop, and while I admire their efforts to support shitty hardware, who's playing this on a machine good enough to play but can't afford £60 for a basic SSD for their game storage?

    • eska5 hours ago |parent

      HDDs also have a spinning medium and a read head, so the optimization is similar to optical media like CDs.

      Let’s say you have UI textures that you always need, common player models and textures, the battle music, but world geometry and monsters change per stage. Create an archive file (pak, wad, …) for each stage, duplicating UI, player and battle music assets into each archive. This makes it so that you fully utilize HDD pages (some small config file won’t fill 4kb filesystem pages or even the smaller disk sectors). All the data of one stage will be read into disk cache in one fell swoop as well.

      On optical media like CDs one would even put some data closer to the middle or on the outer edge of the disc because the reading speed is different due to the linear velocity.

      This is an optimization for bandwidth at the cost of size (which often wasn’t a problem because the medium wasn’t filled anyway)
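
      A toy version of that packing step might look like the sketch below (file names and the archive layout are made up; real engines have their own formats):

        import json, os

        SHARED = ["ui/hud.tex", "audio/battle_theme.ogg", "models/player.mesh"]  # needed by every stage

        def build_stage_pak(stage_dir, out_path):
            """Toy packer: shared assets are duplicated into every stage archive,
            so loading a level becomes one mostly-sequential read of a single file."""
            entries, offset = {}, 0
            stage_files = [os.path.join(stage_dir, f) for f in sorted(os.listdir(stage_dir))]
            with open(out_path, "wb") as pak:
                for path in SHARED + stage_files:        # the duplication happens here
                    with open(path, "rb") as f:
                        data = f.read()
                    entries[os.path.basename(path)] = (offset, len(data))
                    pak.write(data)
                    offset += len(data)
            with open(out_path + ".idx", "w") as f:      # name -> (offset, size)
                json.dump(entries, f)

      Dropping the SHARED part and having every stage reference one common archive instead is, presumably, roughly what the 154GB-to-23GB change amounts to, at the cost of opening and seeking into a second file.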

      • swiftcoder5 hours ago |parent

        > HDDs also have a spinning medium and a read head , so the optimization is similar to optical media like CDs.

        HDDs also have to deal with fragmentation. I wonder what the odds are that you get to write 150 GB (and then regular updates in the 30GB range) without it breaking into fragments...

      • everforward4 hours ago |parent

        The game installer can't control the layout on an HDD without doing some very questionable things like defragging and moving existing user files around the disk. It probably _could_ but the risk of irrecoverable user data loss or accidentally corrupting a boot partition via a bug would make it completely not worth it.

        Even if you pack those, there's no guarantee they don't get fragmented by the filesystem.

        CDs are different not because of media, but because of who owns the storage media layout.

        • Karliss3 hours ago |parent

          It's less about ensuring perfect layout as it is about avoiding almost guaranteed terrible layout. Unless your filesystem is fully fragmented already it won't intentionally shuffle and split big files without a good reason.

          A single large file is still more likely to be mostly sequential compared to 10,000 tiny files. With a large number of individual files, the filesystem is more likely to opportunistically use the small files to fill previously-left holes. Individual files more or less guarantee multiple syscalls per file just to open and read it, plus potentially more indirection and jumping around on the OS side to read each file's metadata. Individual files also increase the chance of accidentally introducing random seeks, due to mismatches between the order the updater writes files, the way the filesystem lays things out, and the order in which level description files list and read assets.

  • geerlingguy5 hours ago

    Possibly a similar process to when you go into an AWS account, and find dozens of orphaned VMs, a few thousand orphaned disk volumes, etc., saving like $10k/month just deleting unused resources.

    • perching_aix4 hours ago |parent

      It's not a case of forgotten data, it's duplicated for access time reasons, like with optical media.

      It follows in the footsteps of trading in storage for less compute and/or better performance.

      An opposite approach, in the form of a mod for Monster Hunter: Wilds, recently made it possible [0] for end-users to decompress all the game textures ahead of time. This was beneficial there because GPU decompression was causing stalls; trading compute for less storage had resulted in significantly worse performance.

      [0] https://youtu.be/AOxLV2US4Ac

    • alias_neo5 hours ago |parent

      We've all been there Jeff.

      In this case, I don't think it was forgetfulness; unlike us, they have an excuse: they were trying to optimise for disk seek times.

      Anyway, I've got a half-dozen cloud accounts I need to go check for unused resources *waves*.

  • haritha-j4 hours ago

    I recently downloaded Hunt: Showdown. I think it was around 70 gigs. About a month later, I had to update it. The download was the same size. I think they literally just overwrote the entire game because they were too lazy to update it properly.

    • deinonychus3 hours ago |parent

      The opposite happens in some games where the update will be a gigabyte or two, but the patching takes so long, that some players with fast enough download speeds will swear by wholly uninstalling and reinstalling the game.

    • zelphirkalt3 hours ago |parent

      I have observed this with some other games too. Really annoying when you need to redownload tens of gigabytes because they cannot be arsed to ship a proper updater. These are things most games had solved way before Steam even became big.

      • darknavi2 hours ago |parent

        It's true for most of the modern web. Hardware and internet connections have become reliable and fast enough for many people not to care.

        Unfortunately there are still people on metered/slow connections that really do care still.

  • miohtama5 hours ago

    AFAIK Helldivers 2 runs some really old engine which was discontinued many years ago. Not "state of the art."

    It's also a title that shows you can have a really good game without the latest tech.

    • panja3 hours ago |parent

      To be fair, they started working on the game before it was discontinued. I'm sure someone made a decision that it made more sense to continue on without support rather than start from scratch.

    • thatguy09004 hours ago |parent

      You can, but on the other hand they've been battling bugs from it with every release. The game is notorious for breaking things constantly. I played for quite a while and the sound engine was always awful, with things not in your line of sight frequently not making any noise at all. Every month or two a new major bug relating to host/client desyncing is found by the community, who then has to run big campaigns badgering the devs to notice and fix it. Very fun game still, but if they had started with a supported engine a lot of stuff would probably work way better.

  • djmips6 hours ago

    I did similar work on a game a long time ago and it took over a month to slim it down to 1/4 of the size but in this case 'at runtime' - the producer wasn't impressed. It looked exactly the same. I wonder if they had any pushback.

  • unixnight3 hours ago

    It was legit faster to delete and redownload this game than update it since steam considered my SSD too full (WITH 200 GIGS FREE) to download the files to said SSD, instead opting to use my SLOWEST HDD as the cache drive for the download.

    It would then proceed to download the update in 5 minutes and spend 8 HOURS UPDATING.

    A full download of the game? 10 minutes.

    Glad to see this update. I hope more games follow suit

    • PunchyHamster3 hours ago |parent

      I remember how pre-loading some games turned out to be slower for me: downloading at launch meant decrypting the data straight off the network, while decrypting pre-loaded files that were already sitting on the drive encrypted doubled the I/O, and that was too much random access for my hard drive.

  • easyThrowaway6 hours ago

    Were the duplicated files even used on PC? Like, do you even have such low-level access to the file system that you can deduce which duplicated instance has a faster access time on a mechanical hard drive?

    • tehbeard5 hours ago |parent

      It's not which duplicated instance....

      Think of it as I have two packs for levels.

      Creek.level and roboplanet.level

      Both use the cyborg enemies. By duplicating the cyborg enemy model and texture data across both files, only the level file needs to be opened to get all the necessary data for a match.

      Because a modern OS will allow you to preallocate contiguous segments and has auto-defrag, you can have it read this level file at max speed, rather than having to stop and seek to go find the cyborg.model file because it was referenced by the spawn pool. Engine limitations may prevent other optimisations you think up as a thought exercise after reading this.
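
      The loading side of that idea, assuming a trivial side-car index of name -> (offset, size) entries like the toy packer earlier in the thread (again a sketch, not the actual engine format):

        import json

        class LevelPak:
            """Toy reader: one open handle per level archive; assets are located
            by (offset, size) from the index, with no per-asset open() calls."""
            def __init__(self, path):
                with open(path + ".idx") as f:
                    self.index = json.load(f)
                self.pak = open(path, "rb")

            def read_asset(self, name):
                offset, size = self.index[name]   # e.g. "cyborg.model"
                self.pak.seek(offset)             # short seek within one file
                return self.pak.read(size)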

      It's similar to how crash bandicoot packed their level data to handle the slow speed of the ps1 disc drive.

      As to why they had a HDD optimisation in 2024... Shrugs

      • embedding-shape5 hours ago |parent

        > As to why they had a HDD optimisation in 2024... Shrugs

        Sadly, Valve doesn't include/publish HDD vs SSD in their surveys (https://store.steampowered.com/hwsurvey/?platform=combined), but considering the most popular combo seems to be 16GB RAM, 8GB VRAM, and a 2.3 GHz to 2.69 GHz CPU, I'm getting the impression that the average gaming PC isn't actually that beefy. If someone told me the most common setup paired with those specs was a small SSD for the OS and a medium/large HDD for everything else, I would have believed them.

        I think we as (software/developer/technology) professionals with disposable income to spend on our hobbies forget how things are for the average person out there in the world.

        • phatfish4 hours ago |parent

          Steam has so many users I'm not sure the average says a lot? If you are just playing Hentai games like most Steam users (j/k, probably) you can do that on any device from the last 10 years.

          More interesting would be to see the specs for users who bought COD (add other popular franchises as you wish) in the last 2 years. That would at least trim the sample set to those who expect to play recent graphics heavy titles.

    • arghwhat6 hours ago |parent

      Not sure if this is what they did, but you can just put all the things you need together sequentially into a single file and rely on the filesystem to allocate contiguous blocks where possible (using the appropriate size hints to help). It's trivial to unpack at loading time without any performance impact.

      A filesystem is by itself just one big "file" acting like a file archive.
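
      On Unix, the "appropriate size hints" part can be as simple as preallocating the archive before streaming data into it (a sketch; posix_fallocate is Unix-only, and Windows has its own equivalents):

        import os

        def write_preallocated(path, chunks, total_size):
            """Reserve the full file size up front so the filesystem can try to hand
            back one contiguous extent, then stream the data into it."""
            fd = os.open(path, os.O_WRONLY | os.O_CREAT | os.O_TRUNC, 0o644)
            try:
                os.posix_fallocate(fd, 0, total_size)   # the size hint
                for chunk in chunks:
                    os.write(fd, chunk)
            finally:
                os.close(fd)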

  • gethly4 hours ago

    if your game takes 154 GB of space, you should never be able to touch a computer ever again.

    • panja3 hours ago |parent

      I agree more can be done to shave game sizes but space is cheap these days so that's probably why it hasn't been a focus.

      • Sohcahtoa82an hour ago |parent

        The same argument gets used about CPU performance: software gets written with 10+ levels of abstraction, and now your CPU gets pegged just sitting in a semi-active Discord channel.

        Who cares that it takes a million clock cycles to process a single message? People are running 5 GHz CPUs, so that's only 0.2 ms, right?

  • l33tfr4gg3r4 hours ago

    Warframe also did this relatively recently, though perhaps not quite as aggressive a reduction as Helldivers, but still.

    https://www.pcgamer.com/call-of-duty-take-note-warframe-is-r...

  • nopcode3 hours ago

    Nixxes has such a good reputation in my book that their name immediately removes any porting-fears when I see they are responsible for a release.

  • Artoooooor2 hours ago

    154GB? That's why I almost never "buy" AAA games. That's a much bigger cost than the money for the game.

    • maccard2 hours ago |parent

      > This is much bigger cost than money for that game

      Is it? A 1TB SSD costs about the same as the game does...

    • Forgeties792 hours ago |parent

      I don’t disagree that it’s too large at 154GB, but the decision (at least to me) also made more sense at a time when storage, especially external storage, had never been cheaper. That only changed like 6-8 weeks ago. Prior to that, getting a 4TB external SSD was like what, $150-$200? $120 for a 2TB NVMe? You could get 1TB SSDs for like $65 if you looked for them.

  • Havoc2 hours ago

    They had the size at 7x duplication to save load times for hdds? WTH

  • Thev00d003 hours ago

    On size-limited platforms like the Steam Deck and friends, this is a huge W

  • rwmj6 hours ago

    23GB is supposed to be "slim"?!

    • onli6 hours ago |parent

      Yes. High resolution textures take up a lot of space. Have a look at HD texture mods for skyrim for example. 23GB is more in line with a game from a few years ago, so this really is slim for a modern game with modern graphics.

    • throw0101c5 hours ago |parent

      Back in the day:

      > 3-D Hardware Accelerator (with 16MB VRAM with full OpenGL® support; Pentium® II 400 Mhz processor or Athlon® processor; English version of Windows® 2000/XP Operating System; 128 MB RAM; 16-bit high color video mode; 800 MB of uncompressed hard disk space for game files (Minimum Install), plus 300 MB for the Windows swap file […]

      * https://store.steampowered.com/app/9010/Return_to_Castle_Wol...

      * https://en.wikipedia.org/wiki/Return_to_Castle_Wolfenstein

      Even older games would be even smaller:

      * https://www.oldgames.sk/en/game/ultima-vi-the-false-prophet/...

      * https://en.wikipedia.org/wiki/Ultima_VI:_The_False_Prophet

      • jasomill3 hours ago |parent

        For gaming, this doesn't bother me much, given that, even at today's prices, the cost of maintaining a midrange gaming PC with ample storage and "recommended" specs for new releases is probably no more than $200-$300/year.

        The ever-increasing system requirements of productivity software, however, never cease to amaze me:

        Acrobat Exchange 1.0 for Windows (1993) required 4 MB RAM and 6 MB free disk space.

        Rough feature parity with the most-used features of modern Acrobat also required Acrobat Distiller, which required 8 MB RAM and another 10 MB or so of disk space.

        Acrobat for Windows (2025) requires 2,000 MB RAM and 4,500 MB free disk space.

        • throw0101c2 hours ago |parent

          Furthermore, there is PDF software (read, write) that does much the same things and is much less heavy.

      • filleduchaos5 hours ago |parent

        I for one simply cannot believe that a game with 4K+ textures and high poly count models is bigger than a game that uses billboard sprites which aren't even HD. Whatever could be the reason? A complete mystery...

    • bilekas6 hours ago |parent

      In this day and age it's a gift to only be ~23GB. I'm reminded of the old days when you literally didn't have the space, so you had to get creative; now any kind of space optimization isn't even considered.

      • 0cf8612b2e1e4 hours ago |parent

        Not true at all on my PlayStation. Just a few games with 100GB+ install size can quickly put you into a space juggling scenario.

    • mfro6 hours ago |parent

      Have you played a big budget video game released in the last 10 years? It’s pretty standard to reach upwards of 60GB.

      • jwagenet5 hours ago |parent

        GTAV had a 60GB install size over a decade ago.

    • mghackerlady5 hours ago |parent

      It can fit on a standard blu ray, so I'm inclined to say so

    • phoronixrly5 hours ago |parent

      I do love rich soundtracks with high quality compression, and textures that look crisp on 4k. And also games with 100+ hours of single-player campaign.

    • dontlaugh3 hours ago |parent

      It's tiny, compared to most games of similar graphical detail.

    • alias_neo5 hours ago |parent

      I mean yes, it's a very nice looking game with fairly sizeable worlds and lots of different enemies, biomes, etc.

      It's currently over 100GB because of duplicated assets, so this is a game-changer (pun intended).

      EDIT: Just checked; 157GB on my SSD.

      EDIT2: 26GB after updating with a 9.7GB download, 5.18GB of shaders before and after.

  • donatj4 hours ago

    > "It's no surprise to see modern AAA games occupying hundreds of gigabytes of storage these days"

    Is it not? I've genuinely never understood it!

    I used to do a little bit of level building for IdTech3 games back in the day but it's been 20 years. I'm not totally ignorant of what's involved, just mostly ignorant. I really want to know though, what is all that data!? Textures?

    In particular I find the massive disparity between decently similar games interesting. Indiana Jones and the Great Circle takes something like 130gb on my Xbox, whereas Robocop: Rogue City takes something like 8gb. They have similar visual fidelity, I would say Robocop might have a little bit of a lead, but Indiana Jones has fancier dynamic lighting.

    At 130GB though, I almost could have streamed my entire playthrough of the game at 4K and come out ahead.

    • Sohcahtoa82an hour ago |parent

      > I really want to know though, what is all that data!? Textures?

      Yeah.

      A single 4096x4096 texture is 16 megapixels. At 8 bits per channel with potentially 4 channels, that's 64 MB for a single texture uncompressed.
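
      The arithmetic, for reference (the last line is an aside: block-compressed formats like BC7 store 8 bits per texel, which is part of why on-disk numbers vary so much):

        pixels = 4096 * 4096          # 16,777,216 texels
        print(pixels * 4 / 2**20)     # 64.0 MiB uncompressed RGBA, 8 bits per channel
        print(pixels * 1 / 2**20)     # 16.0 MiB with BC7-style block compression (8 bpp)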

    • zelphirkalt4 hours ago |parent

      These days I just assume it is textures: textures for graphics quality that I don't need on my merely-full-HD screens. That usually makes me not even bother buying, let alone installing, such a game. Even 23 GB is still a ton. When I compare that with how much fun I can have playing games that focus on gameplay instead of graphics, I would rather save the money than spend it and then have to go through my files deleting things because I need more space for a game.

  • kakacik4 hours ago

    Devs went full fitgirl (repack site which reduces sizes of cracked releases significantly via similar approaches)

  • ChrisArchitect4 hours ago

    Source: https://store.steampowered.com/news/app/553850/view/49158394... (https://news.ycombinator.com/item?id=46131518)

  • ChrisArchitect4 hours ago

    [dupe] https://news.ycombinator.com/item?id=46131518

    • Kiro4 hours ago |parent

      Never a dupe when you say it is. You need another word.

  • CafeRacer5 hours ago

    In other news - "Call of Duty installer now takes additional 131GB of space on the disk"

    • dv_dt4 hours ago |parent

      I stopped playing CoD mainly because I was tired of juggling disk space to try to play it even casually. It's surprising to me that game publishers haven't made this a checklist item with a size budget to stay below.

      • PunchyHamster3 hours ago |parent

        They have a big player base that plays just that and not much more

  • cabirum4 hours ago

    What if.. the management made a request to make the game take more space than the previous release? So everyone could see just how much content there is and how much better everything is.

    I mean, the developers cannot be that incompetent while being able to ship a high quality product.

  • marknutter3 hours ago

    Now all they need to do is remove the kernel-level anti-cheat