RAM is so expensive, Samsung won't even sell it to Samsung (pcworld.com)
322 points by sethops1 9 hours ago | 296 comments
  • OhMeadhbh 12 minutes ago

    Or we could go back to using software that didn't require 1 GB to run an OS / browser combo just so the browser can load "too much" JavaScript to enable a webmail interface.

    In the 80s I ran an early SMTP / POP email client that functioned as a DOS TSR, with code and data under 64 KB. Granted, it was pretty crappy and text-only (non-MIME). But there's got to be a middle ground between 64 KB for a craptastic text-only email client and a 1 GB OS / browser / webmail combo that could probably run that DOS TSR in an emulator as an attachment to a short email.

  • nickjj 8 hours ago

    I'm running a box I put together in 2014 with an i5-4460 (3.2 GHz), 16 GB of RAM, a GeForce 750 Ti, a first-gen SSD, and an ASRock H97M Pro4 motherboard, plus a reasonable PSU, case, and a number of fans. All of that, parted out at the time, was $700.

    I've never been more fearful of components breaking than current day. With GPU and now memory prices being crazy, I hope I never have to upgrade.

    I don't know how, but the box is still great for everyday web development with heavy Docker usage, and for video recording / editing with a 4K monitor and a second 1440p monitor hooked up. Minor gaming is OK too; for example, I picked up Silksong last week and it runs very well at 2560x1440.

    For general computer usage, SSDs really were a once in a generation "holy shit, this upgrade makes a real difference" thing.

    • Cerium 7 hours ago | parent

      Don't worry, if you are happy with those specs you can get corporate ewaste Dell towers on eBay for low prices. Search "Dell Precision tower"; I just saw a listing with 32 GB RAM and a 3.6 GHz Xeon for about 300 USD.

      Personally, at work I use the latest hardware; at home I use ewaste.

      • silverquiet 6 hours ago | parent

        I got a junk Precision workstation last year as a "polite" home server (it's quiet and doesn't look like industrial equipment, but still has some server-like qualities, particularly the use of ECC RAM). I liked it so much that it ended up becoming my main desktop.

      • trollbridge 3 hours ago | parent

        I have some Dell server with dual Xeons and 192GB RAM. It is NUMA but that’s fine for Docker workloads where you can just associate them with a CPU.

        The RAM for that is basically ewaste at this point, yet it runs the workloads it needs to do just fine.
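
        For anyone curious what that association looks like in practice, a minimal sketch using Docker's cpuset flags (the container name is a placeholder, and the core/node numbers depend on your machine; check the layout with lscpu first):

            # run the container on the cores and memory of NUMA node 0 only
            docker run --cpuset-cpus="0-15" --cpuset-mems="0" my-workload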

      • vondur 6 hours ago | parent

        Ha, I bought one of those for $500 from eBay. It's a dual Xeon Silver workstation with an Nvidia Quadro P400 8GB, 128 GB RAM, and a 256 GB SSD. I threw in a 1 TB SSD and it's been working pretty well.

        • Forgeties79 4 hours ago | parent

          What are the limitations of machines like these?

          • snerbles 4 hours ago | parent

            I too have a crippling dual CPU workstation hoarding habit. Single thread performance is usually worse than enthusiast consumer desktops, and gaming performance will suffer if the game isn't constrained to a single NUMA domain that also happens to have the GPU being used by that game.

            On the other hand, seeing >1TiB RAM in htop always makes my day happier.
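
            If anyone wants to try the single-domain trick, a rough sketch (node numbers depend on your topology, and the game path is a placeholder):

                nvidia-smi topo -m                            # shows which NUMA node the GPU hangs off
                numactl --cpunodebind=0 --membind=0 ./game    # keep the CPU threads and RAM on that node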

          • t0mas88 4 hours ago | parent

            Power usage is the main limitation of using these as a home server. They have a high idle power use.

            • trollbridge 2 hours ago | parent

              One of the reasons I use these is that it's cold half the year, and it's not hard to use them to supplement the heat.

          • bri3d 2 hours ago | parent

            Very bad performance per watt and higher maintenance needs. Bad performance per watt generally means a larger form factor and more noise as well.

          • arprocter 4 hours ago | parent

            On Dells you'll probably be stuck with the original mobo, and their SFFs don't take standard PSUs.

            • sevensor 2 hours ago | parent

              In favor of their SFFs, they get retired 10k at a time, so you might as well pick up a second one for spares.

              • arprocter an hour ago | parent

                Not a bad call, although you'll probably need to upgrade the PSU to add a GPU (if you can find one small enough to fit the SFF case)

          • MrVitaliy 4 hours ago | parent

            Just performance when compared to current-generation hardware. Not significantly worse, but things like DDR4 RAM and single-thread performance show the signs of aging. Frankly, for similar $$$ you can get new hardware from Beelink or equivalent.

            • Forgeties79 3 hours ago | parent

              Got it, so basically it's one of those things you do if 1) the project interests you and/or 2) you get one dirt cheap and don't have high expectations for certain tasks.

      • cestith 6 hours ago | parent

        At home some of my systems are ewaste from former employers who would just give it to employees rather than paying for disposal. A couple are eBay finds. I do have one highish-end system at a time specifically for games. Some of my systems are my old hardware reassembled after all the parts for gaming have been upgraded over the years.

      • ge96 5 hours ago | parent

        OptiPlexes used to be my go-to for SFF. I had a 1050 Ti in there; not crazy, but it worked for basic gaming.

      • gpderetta 6 hours ago | parent

        Surely these will soon be scavenged for RAM? Arbitrage opportunity?

        • legobmw99 6 hours ago | parent

          If they’re DDR4 (or even DDR3), it has no value to e.g. OpenAI so it shouldn’t really matter

          • noboostforyou 3 hours ago | parent

            But it's a cascading effect: OpenAI gobbled up all of DDR5 production, to the point that consumers are choosing to upgrade their older DDR4 systems instead of paying even more for a new system that uses DDR5. As a result, DDR4 RAM is at a new all-time high - https://pcpartpicker.com/trends/price/memory/

          • auspiv 5 hours ago | parent

            DDR4 prices are up 2-6x in the last couple of months, depending on frequency. High-end, high-speed modules (e.g. 128 GB 3200 MHz LRDIMMs) are super expensive.

            • legobmw99 4 hours ago | parent

              Isn’t that due to different reasons (like the end of production for older standards)? I recall the same happening shortly after manufacturing for DDR3 ceased, before eventually demand essentially went to 0

          • jl6 4 hours ago | parent

            Demand spills over to substitutes.

          • gpderetta 5 hours ago | parent

            The price of DDR4 is also going up!

      • zzzeek 4 hours ago | parent

        I've dealt a bit with ewaste kinds of machines, old Dells and such, and have two still running here; the issue is they use a crapton of power. I had one such ewaste Dell machine that I just had to take to the dump, it was so underperforming while using 3x more power than my other two Dells combined.

    • kube-system 7 hours ago | parent

      > I've never been more fearful of components breaking than current day.

      The mid-90s were pretty scary too. Minimum wage was $4.25 and a new Pentium 133 was $935 in bulk.

      • tossandthrow 7 hours ago | parent

        If you were on minimum wage in the 90s, your livelihood likely didn't rely on Pentium processors.

        Also, it is frightening how close that is to current day minimum wage.

        • kube-system 7 hours ago | parent

          I was an unemployed student then -- a generous family member gifted me my first Windows PC, and it cost about the same as a used car.

        • silisili 5 hours ago | parent

          1990-1997 averaged >4% yearly compounded minimum wage hikes, which is probably about where it should have been. The late 90s to today has been <1.25%.
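
          A quick check of those rates, assuming the federal minimum went from $3.80 (1990) to $5.15 (1997) and has sat at $7.25 since 2009 (values from memory, worth double-checking):

              awk 'BEGIN {
                printf "1990-1997: %.1f%%/yr\n", (exp(log(5.15/3.80)/7)  - 1) * 100
                printf "1997-2025: %.1f%%/yr\n", (exp(log(7.25/5.15)/28) - 1) * 100
              }'
              # 1990-1997: 4.4%/yr; 1997-2025: 1.2%/yr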

        • briffle 6 hours ago | parent

          Yep, I had a Cyrix processor in mine during that time. Slackware didn't care.

          • pixl97 5 hours ago | parent

            It also worked as a very good space heater.

        • immibis 4 hours ago | parent

          If you account for inflation it's actually higher than current minimum wage.

        • adventured 6 hours ago | parent

          Except nobody earns the minimum wage today; it's less than 1/2 of 1% of US labor.

          The median full-time wage is now $62,000. You can start at $13 at almost any national retailer, and $15 or above at CVS / Walgreens / Costco. The cashier positions require zero work background, zero skill, zero education. You can make $11-$13 at what are considered bad jobs, like flipping pizzas at Little Caesars.

          • jfindper 5 hours ago | parent

            >You can make $11-$13 at what are considered bad jobs, like flipping pizzas at Little Caesars.

            Holy moly! 11 whole dollars an hour!?

            Okay, so we went from $4.25 to $11.00. That's a 159% change. Awesome!

            Now, let's look at... school, perhaps? So I can maybe skill up out of Little Caesars and start building a slightly more comfortable life.

            Median in-state tuition in 1995: $2,681. Median in-state tuition in 2025: $11,610. Wait a second! That's a 333% change. Uh oh.

            Should we do the same calculation with housing...? Sure, I love making myself more depressed. 1995: $114,600. 2025: $522,200. 356% change. Fuck.
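
            For anyone who wants to rerun that arithmetic with their own numbers, it's just percent change over the quoted figures:

                awk 'BEGIN {
                  printf "wage:    +%.0f%%\n", (11.00  - 4.25)   / 4.25   * 100
                  printf "tuition: +%.0f%%\n", (11610  - 2681)   / 2681   * 100
                  printf "housing: +%.0f%%\n", (522200 - 114600) / 114600 * 100
                }'
                # wage: +159%  tuition: +333%  housing: +356%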

            • reissbaker an hour ago | parent

              This will probably be an unpopular reply, but "real median household income" — aka, inflation-adjusted median income — has steadily risen since the 90s and is currently at an all-time high in the United States. [1] Inflation includes the cost of housing (by measuring the cost of rent).

              However, we are living through a housing supply crisis, and while overall cost of living hasn't gone up, housing's share of that has massively multiplied. We would all be living much richer lives if we could bring down the cost of housing — or at least have it flatline, and let inflation take care of the rest.

              Education is interesting, since most people don't actually pay the list price. The list price has gone up a lot, but the percentage of people paying list price has similarly gone down a lot: from over 50% in the 90s for state schools to 26% today, thanks to a large increase in subsidy programs (student aid). While real education costs have still gone up somewhat, they've gone up much less than the prices you're quoting would lead you to believe: those are essentially a tax on the rich who don't qualify for student aid. [2]

              1: https://fred.stlouisfed.org/series/MEHOINUSA672N

              2: https://econofact.org/how-much-does-college-really-cost

              • jfindper 7 minutes ago | parent

                I have several qualms with how the real median household income is calculated, specifically the consumer price index.

                But I agree that tackling housing alone would be significant.

            • AnthonyMouse 3 hours ago | parent

              You're identifying the right problem (school and housing costs are completely out of hand) but then resorting to an ineffective solution (minimum wage) when what you actually need is to get those costs back down.

              The easy way to realize this is to notice that the median wage has increased proportionally less than the federal minimum wage has. The people in the middle can't afford school or housing either. And what happens if you increase the minimum wage faster than overall wages? Costs go up even more, and so does unemployment, when small businesses that are also paying those high real estate costs now also have to pay a higher minimum wage. You're basically requesting the annihilation of the middle class.

              Whereas you make housing cost less and that helps the people at the bottom and the people in the middle.

              • jfindper 3 hours ago | parent

                >resorting to an ineffective solution (minimum wage) when what you actually need is to get those costs back down.

                I'm not really resorting to any solution.

                My comment is pointing out that when you only do one side of the equation (income) without considering the other side (expenses), it's worthless. Especially when you are trying to make a comparison across years.

                How we go about fixing the problem, if we ever do, is another conversation. But my original comment doesn't attempt to suggest any solution, especially not one that "requests the annihilation of the middle class". It's solely to point out that adventured's comment is a bunch of meaningless numbers.

                • AnthonyMouse 3 hours ago | parent

                  > It's solely to point out that adventured's comment is a bunch of meaningless numbers.

                  The point of that comment was to point out that minimum wage is irrelevant because basically nobody makes that anyway; even the entry-level jobs pay more than the federal minimum wage.

                  In that context, arguing that the higher-than-minimum wages people are actually getting still aren't sufficient implies an argument that the minimum wage should be higher than that. And people could read it that way even if it's not what you intended.

                  So what I'm pointing out is that that's the wrong solution and doing that rather than addressing the real issue (high costs) is the thing that destroys the middle class.

                  • jfindper 3 hours ago | parent

                    >implies an argument that the minimum wage should be higher than that.

                    It can also imply that expenses should come down, you just picked the implication you want to argue against.

                    • AnthonyMouse 3 hours ago | parent

                      Exactly. When it's ambiguous at best it's important that people not try to follow the bad fork.

            • genewitch 4 hours ago | parent

              1980 Mustang vs 2025 Mustang is what I usually use. In the past 12 years my per-kWh electricity cost has doubled.

              In the mid-90s you could open a CD (certificate of deposit at a bank or credit union) and get 9% or more APY. Savings accounts had ~4% interest.

              In the mid-90s a gallon of gasoline in Los Angeles County was $0.899 in the summer and less than that any other time. It's closer to $4.50 now.

            • mrits 4 hours ago | parent

              The BBQ place across the street from me pays $19/hour to be a cashier in Austin. Or the sign says it does, anyway.

              • mossTechnician 3 hours ago | parent

                Does the sign happen to have the words "up to" before the dollar amount?

              • jfindper 4 hours ago | parent

                Sweet! According to austintexas.gov, that's only $2.63 below the 2024 living wage. $5.55 below, if you use the MIT numbers for 2025.

                As long as you don't run into anything unforeseen like medical expenses, car breakdowns, etc., you can almost afford a bare-bones, mediocre life with no retirement savings.

                • hylaride an hour ago | parent

                  I don't disagree that there has been a huge issue with stagnant wages, but not everybody who works minimum wage needs to make a living wage. Some are teenagers, people just looking for part-time work, etc. Pushing the minimum wage too high risks destroying jobs that are uneconomical at that level but could have been better than nothing for many people.

                  That being said, there's been an enormous push by various business groups to do everything they can to keep wages low.

                  It's a complicated issue and one can't propose solutions without acknowledging that there's a LOT of nuance...

                  • jfindper an hour ago | parent

                    >but not everybody who works minimum wage needs to make a living wage

                    I think this is a distraction that is usually rolled out to derail conversations about living wages. Not saying that you're doing that here, but it's often the case when the "teenager flipping burgers" argument is brought up.

                    Typically in conversations about living wages, people are talking about financially independent adults trying to make their way through life without starving while working 40 hours per week. I don't think anyone is seriously promoting a living wage for the benefit of financially dependent minors.

                    >It's a complicated issue and one can't propose solutions without acknowledging that there's a LOT of nuance...

                    That's for sure! I know it's not getting solved on the hacker news comment section, at least.

          • GoatInGrey 6 hours ago | parent

            Counterpoint: affording average rent for a 1-bedroom apartment (~$1,675) requires that exact median full-time wage. $15 an hour affords you about $740 for monthly housing expenses. One can suggest getting two roommates for a one-bedroom apartment, but they would be missing the fact that this is very unusual for the last century. It's more in line with housing economics from the early-to-mid 19th century.
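
            Roughly how a figure like that falls out, assuming full-time hours and the common 30%-of-gross housing guideline (the $740 above presumably also nets out payroll taxes):

                awk 'BEGIN {
                  gross = 15 * 40 * 52 / 12              # ~$2600/mo at $15/hr, full time
                  printf "30%% of gross: $%.0f/mo\n", gross * 0.30
                }'
                # 30% of gross: $780/mo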

          • mossTechnician 5 hours ago | parent

            In addition to the other comments, I presume the big box retailers do not hire for full-time positions when they don't have to, and gig economy work is rapidly replacing jobs that used to be minimum wage.

          • yndoendo 5 hours ago | parent

            My uncle was running a number of fast food restaurants for a franchise owner making millions. His statement about this topic is simple, "they are not living wage jobs ... go into manufacturing if you want a living wage".

            I don't like my uncle at all and find him and people like him to be terrible human beings.

            • The-Bus 5 hours ago | parent

              If a business can't pay a living wage, it's not really a successful business. I, too, could become fabulously wealthy selling shoes if someone just gave me shoes for $1 so I could resell them for $50.

              • AnthonyMouse 3 hours ago | parent

                > If a business can't pay a living wage, it's not really a successful business.

                Let's consider the implications of this. We take an existing successful business, change absolutely nothing about it, but separately and for unrelated reasons the local population increases and the government prohibits the construction of new housing.

                Now real estate is more scarce and the business has to pay higher rent, so they're making even less than before and there is nothing there for them to increase wages with. Meanwhile the wages they were paying before are now "not a living wage" because housing costs went way up.

                Is it this business who is morally culpable for this result, or the zoning board?

                • FireBeyond 30 minutes ago | parent

                  There are certainly elements of this. And there are also elements like my city, where some of the more notable local business owners and developers are all _way too cozy_ with the City Council and Planning/Zoning Boards (not just rubbing shoulders at community events and fundraisers, but in the "our families rent Airbnbs together and go on vacation together" way), which gives them greater influence.

                  All that being said, though, Robert Heinlein said once:

                  > There has grown up in the minds of certain groups in this country the notion that because a man or corporation has made a profit out of the public for a number of years, the government and the courts are charged with the duty of guaranteeing such profit in the future, even in the face of changing circumstances and contrary to the public interest. This strange doctrine is not supported by statute or common law. Neither individuals nor corporations have any right to come into court and ask that the clock of history be stopped, or turned back.

              • raw_anon_1111 4 hours ago | parent

                Can we use the same argument for all of the businesses that are only surviving because of VC money?

                I find it rich how many tech people are working for money-losing companies, using technology from money-losing companies, and/or trying to start a money-losing company and get funding from a VC.

                Not every job is meant to support a single person living on their own raising a family.

                • dpkirchner 3 hours ago | parent

                  That's what VC money is for. When it comes to paying below a living wage, we typically expect the government to provide support to make up the difference (so workers aren't literally homeless). Businesses that rely on the government to pay their employees should not exist.

                  • raw_anon_1111 3 hours ago | parent

                    That’s kind of the point, a mom and pop restaurant or a McDonald’s franchise owner doesn’t have the luxury of burning $10 for every $1 in revenue for years and being backed by VC funding.

                    Oh, and the average franchise owner is not getting rich. They are making $100K to $150K a year depending on how many franchises they own.

                    Also tech companies can afford to pay a tech worker more money because you don’t have to increase the number of workers when you get more customers.

                    YC is not going to give the aspiring fast food owner $250K to start their business like they are going to give “pets.ai - AI for dog walkers”

                    • dpkirchner an hour ago | parent

                      In that case they probably shouldn't be running a McDonald's. They aren't owed that and they shouldn't depend on their workers getting government support just so the owners can "earn" their own living wage.

                      • raw_anon_1111 an hour ago | parent

                        Yet tech workers are "owed" making money because they are in an industry where their employers "deserve" to survive despite losing money, because they can get VC funding - funded by, among others, government pension plans?

                        I find it slightly hypocritical that people can clutch their pearls at small businesses who risk their own money while yet another BS “AI” company’s founders can play founder using other people’s money.

              • CamperBob2 4 hours ago | parent

                Classically, not all jobs are considered "living wage" jobs. That whole notion is something some people made up very recently.

                A teenager in his/her first job at McDonald's doesn't need a "living wage." As a result of forcing the issue, now the job doesn't exist at all in many instances... and if it does, the owner has a strong incentive to automate it away.

                • autoexec 3 hours ago | parent

                  > A teenager in his/her first job at McDonald's doesn't need a "living wage." As a result of forcing the issue, now the job doesn't exist at all in many instances

                  The majority of minimum wage workers are adults, not teenagers. This is also true for McDonald's employees. The idea that these jobs are staffed by children working summer jobs is simply not reality.

                  Anyone working for someone else, doing literally anything for 40 hours a week, should be entitled to enough compensation to support themselves at a minimum. Any employer offering less than that is either a failed business that should die off and make room for one that's better managed or a corporation that is just using public taxpayer money to subsidize their private labor expenses.

                • kube-system 2 hours ago | parent

                  A teenager is presumably also going to school full time and works their job part time, not ~2000 hours per year.

                  If we build a society where someone working a full time job is not able to afford to reasonably survive, we are setting ourselves up for a society of crime, poverty, and disease.

                • swiftcoder 3 hours ago | parent

                  > A teenager in his/her first job at McDonald's doesn't need a "living wage."

                  Turns out our supply of underage workers is neither infinite, nor even sufficient to staff all fast food jobs in the nation

                • jfindper 3 hours ago | parent

                  >A teenager in his/her first job at McDonald's doesn't need a "living wage."

                  Wow, a completely bad-faith argument.

                  Can you try again, but this time, try "steelman" instead of "strawman"?

          • zzzeek 4 hours ago | parent

            In that case it should be completely uncontroversial to raise the minimum wage and help that 0.5% of labor out. Yet somehow, it's a non-starter. (BTW, googling says the number is more like 1.1%. In 1979, 13.4% of the labor force made minimum wage. This only shows how obsolete the current minimum wage level is.)

      • trollbridge 2 hours ago | parent

        In the mid-90s, mere mortals ran a 486DX2 or DX4.

        Pentium 60/66s were in the same price tier as expensive Alpha or SPARC workstations.

      • microtonal 6 hours ago | parent

        That's kinda like saying the mid-20s were pretty scary too: minimum wage was AMOUNT and a MacBook M4 Max was $3000.

        In the mid-90s my brother and I were around 14 and 10, earning nothing but a small amount of monthly pocket money. We were fighting so much over our family PC that we decided to save up and put together a machine from second-hand parts we could get our hands on. We built him a 386DX-40 or 486SX2-50 or something like that, and it was fine enough for him to play most DOS games. Heck, you could even run Linux (I know because I ran Linux in 1994 on a 386SX-25, with 5MB RAM and 20MB disk space).

        • sejje an hour ago | parent

          Linux notoriously runs on worse hardware than almost anything, especially in the 90s

        • kube-system 5 hours ago | parent

          > That's kinda like saying the mid-20s were pretty scary too: minimum wage was AMOUNT and a MacBook M4 Max was $3000.

          A PowerBook 5300 was $6500 in 1995, which is $13,853 today.

          • kergonath 2 hours ago | parent

            > A PowerBook 5300 was $6500 in 1995

            The TCO was much higher, considering how terrible and flimsy this laptop was. The power plug would break if you looked at it funny and the hinge was stiff and brittle. I know that’s not the point you are making but I am still bitter about that computer.

      • nickjj 7 hours ago | parent

        > The mid-90s were pretty scary too.

        If you fast forward just a few years though, it wasn't too bad.

        You could put together a decent, fully parted-out machine in the late 90s and early 00s for around $600-650. These were machines good enough to get a solid 120 FPS playing Quake 3.

      • morsch 7 hours ago | parent

        Are you sure? From what I can tell it's more like 500 USD RRP on release, boxed.

        Either way, it was the 90s: two years later that was a budget CPU because the top end was two to three times the speed.

    • Barathkanna 8 hours ago | parent

      I agree with you on SSDs, that was the last upgrade that felt like flipping the “modern computer” switch overnight. Everything since has been incremental unless you’re doing ML or high-end gaming.

      • asenna 7 hours ago | parent

        I know it's not the same, but I think a lot of people had a similar feeling going from Intel MacBooks to Apple Silicon. An insane upgrade that I still can't believe.

        • crazygringo 6 hours ago | parent

          This. My M1 MacBook felt like a similarly shocking upgrade -- probably not quite as much as my first SSD did, but still the only other time when I've thought, "holy sh*t, this is a whole different thing".

        • wongarsu 6 hours ago | parent

          The M1 was great. But the jump felt particularly great because Intel MacBooks had fallen behind in performance per dollar. Great build quality, great trackpad, but if you were after performance they were not exactly the best thing to get.

          • skylurk 4 hours ago | parent

            For as long as I can remember before the M1, Macs were always behind in the CPU department. PCs had much better value if you cared about CPU performance.

            After the M1, my casual home laptop started outperforming my top-spec work laptops.

            • kergonath 2 hours ago | parent

              > For as long as I can remember before the M1, Macs were always behind in the CPU department. PCs had much better value if you cared about CPU performance.

              But not if you cared about battery life, because that was the tradeoff Apple was making. Which worked great until about 2015-2016. The parts they were using were not Intel’s priority and it went south basically after Broadwell, IIRC. I also suppose that Apple stopped investing heavily into a dead-end platform while they were working on the M1 generation some time before it was announced.

        • redwall_hp 5 hours ago | parent

          I usually use an M2 Mac at work, and haven't really touched Windows since 2008. Recently I had to get an additional Windows laptop (Lenovo P series) for a project my team is working on, and it is such a piece of shit. It's unfathomable that people are tolerating Windows or Intel (and then still have the gall to talk shit about Macs).

          It's like time travelling back to 2004. Slow, loud fans, random brief freezes of the whole system, a shell that still feels like a toy, a proprietary 170W power supply and mediocre battery life, subpar display. The keyboard is okay, at least. What a joke.

          Meanwhile, my personal M3 Max system can render DaVinci Resolve timelines with complex Fusion compositions in real time and handle whole stacks of VSTs in a DAW. Compare that to the Lenovo choking on an IDE.

          • ponector 2 hours ago | parent

            There won't be such a big difference if you compare laptops in the same price bracket. Cheap PCs are crap.

        • bigyabai 5 hours ago | parent

          It's a lot more believable if you tried some of the other Wintel machines at the time. Those MacBook chassis were the hottest of the bunch; it's no surprise the MacBook Pro was among the first to be redesigned.

      • simlevesque 7 hours ago | parent

        I've had this with gen5 PCIe SSDs recently. My T710 is so fast it's hard to believe. But you need to have a lot of data to make it worth it.

        Example:

            > time du -sh .
            737G .
            ________________________
            Executed in   24.63 secs
        
        And on my laptop that has a gen3, lower-spec NVMe:

            > time du -sh .
            304G .
            ________________________
            Executed in   80.86 secs
        
        
        It's almost 10 times faster. The CPU must have something to do with it too but they're both Ryzen 9.
        • adgjlsfhk1 6 hours ago | parent

          To me that reads 3x, not "almost 10x". The main difference here is probably power. A desktop/server is happy to send 15W to the SSD and hundreds of watts to the CPU, while a laptop wants the SSD running in the ~1 watt range and the CPU in the tens of watts range.

          • simlevesque 6 hours ago | parent

            There's over twice as much content in the first test. It's around 3.8 GB/s vs 30 GB/s if you divide each folder size by its du duration. That makes it 7.9 times faster, and I'm comfortable calling that "almost 10 times".
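
            Taking the quoted sizes and durations at face value, the normalization works out like this:

                awk 'BEGIN {
                  d = 737/24.63; l = 304/80.86
                  printf "desktop: %.1f GB/s, laptop: %.1f GB/s, ratio: %.2fx\n", d, l, d/l
                }'
                # desktop: 29.9 GB/s, laptop: 3.8 GB/s, ratio: 7.96x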

            • ls65536 5 hours ago | parent

              The total size isn't what matters in this case but rather the total number of files/directories that need to be traversed (and their file sizes summed).

              • simlevesque 4 hours ago | parent

                I responded here, it's essentially the same content: https://news.ycombinator.com/item?id=46150030

            • adgjlsfhk1 5 hours ago | parent

              Oops, I missed the size diff. That's a solid 8x. That's cool!

        • taneliv 5 hours ago | parent

          I believe you, but your benchmark is not very useful. I get this on two 5400rpm 3T HDDs in a mirror:

              $ time du -sh .
              935G    .

              real    0m1.154s
          
          Simply because there are fewer than 20 directories and the files are large.
          • simlevesque 5 hours ago | parent

            I should have been clearer: it's the HTTP cache for my crawling jobs. Lots of files in many shapes.

            My new setup, gen5 SSD in desktop:

                > time find . -type f | wc -l
                5645741
                ________________________
                Executed in    4.77 secs
            
            My old setup, gen3 SSD in laptop:

                > time find . -type f | wc -l
                2944648
                ________________________
                Executed in   27.53 secs
            
            Both are running pretty much non-stop, very slowly.
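
            Normalizing those the same way, per file rather than per byte:

                awk 'BEGIN {
                  d = 5645741/4.77; l = 2944648/27.53
                  printf "desktop: %.0f files/s, laptop: %.0f files/s, ratio: %.1fx\n", d, l, d/l
                }'
                # roughly 1.18M vs 107k files/s, ~11x on metadata traversal
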
      • pstadler 7 hours ago | parent

        This and high-resolution displays, for me at least.

      • jug 4 hours ago | parent

        I thought so too on my mini PC. Then I got my current Mac mini M4 and I have to give it to Apple, or maybe in part to ARM... it was like another SSD moment. It still hasn't spun up the fan and runs literally lukewarm at most through my office, coding, and photo work.

      • wdfx 7 hours ago | parent

        The only time I had this other than changing to SSD was when I got my first multi-core system, a Q6600 (confusingly labeled a Core 2 Quad). Had a great time with that machine.

        • genewitch 4 hours ago | parent

          "Core" was/is like "PowerPC" or "Ryzen", just a name. Intel Core i9, for instance, as opposed to Intel Pentium D: both x86-64, different chip features.

    • prmoustache 7 hours ago | parent

      As others mentioned, there's plenty of refurbished stuff and second-hand parts out there, so there isn't any risk of finding yourself having to buy something at insane prices if your computer were to die today.

      If you don't need a GPU for gaming, you can get a decent computer with an i5, 16 GB of RAM, and an NVMe drive for USD 50. I bought one a few weeks ago.

    • snickerbockers 2 hours ago | parent

      >For general computer usage, SSDs really were a once in a generation "holy shit, this upgrade makes a real difference" thing.

      I only ever noticed it on my Windows partition. IIRC on my Linux partition it was hardly noticeable, because Linux is far better at caching disk contents than Windows, and Linux in general can boot surprisingly fast even on HDDs if you only install the modules you actually need, so autoconfiguration doesn't waste time probing dozens of modules in search of the best one.
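
      If you want to see that caching at work, free shows how much RAM Linux is holding as page cache under buff/cache (the numbers below are illustrative, not from a real box):

          $ free -h
                    total   used   free   shared   buff/cache   available
          Mem:       15Gi   2.1Gi  1.1Gi    310Mi         12Gi        13Gi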

    • forinti 7 hours ago | parent

      You can still get brand new generic motherboards for old CPUs.

      I swapped out old ASUS MBs for an i3-540 and an Athlon II X4 with brand new motherboards.

      They are quite a bit cheaper than getting a new kit, so I guess that's the market they cater to: people who don't need an upgrade but whose MBs gave in.

      You can get these for US$20-US$30.

    • davely 5 hours ago | parent

      About a month ago, the mobo for my 5950x decided to give up the ghost. I decided to just rebuild the whole thing and update from scratch.

      So I went crazy and bought a 9800X3D, plus a ridiculous amount of DDR5 RAM (96GB, which matches my old machine’s DDR4 RAM quantity). At the time, it was about $400 USD or so.

      I’ve been living in blissful ignorance since then. Seeing this post, I decided to check Amazon. The same amount of RAM is currently $1200!!!

      • VHRanger 5 hours ago | parent

        Same; I got 96GB of high-end 6000MHz DDR5 this summer for $600 CAD, and now it's nearly triple that at $1500 CAD.

      • genewitch 4 hours ago | parent

        What are you doing with that old 5950X?

    • mikepurvis 7 hours ago | parent

      For a DDR3-era machine, you'd be buying RAM on eBay, not Newegg.

      I have an industrial Mini-ITX motherboard of similar vintage that I use with an i5-4570 as my Unraid machine. It doesn't natively support NVMe, but I was able to get a dual-M.2 expansion card with its own splitter (no motherboard bifurcation required), and that let me get a pretty modern-feeling setup with nice fast cache disks.

    • phantasmish 4 hours ago | parent

      I’m worried about the Valve mini PC coming out next year.

      Instant buy at $700 or under. Probably buy up to $850. At, like, $1,100, though… solid no. And I'm counting on that thing to take the power-hog giant older Windows tower, so bulky it's unplugged and in a closet half the time, out of my house.

    • acters 5 hours ago | parent

      I am still running an i5-4690K; really, all I need is a better GPU, but those prices are criminal. I wish I'd gotten a 4090 when I had the chance, RIP.

      • genewitch 4 hours ago | parent

        The Intel Arc B580 (I think that's the latest one) isn't obnoxiously priced, but you're going to have to face the fact that your PCIe is really very slow. It should work, though.

        If you want to save even more money, get the older Arc Battlemage GPUs. I used one; it was comparable with an RTX 3060. I returned it because the machine I was running it in had a bug that was fixed two days before I returned it, but I didn't know that.

        I was seriously considering getting a B580, or waiting until the B*70 came out with more memory, although at this point I doubt it will be very affordable considering VRAM prices are going up as well. A friend is supposedly going to ship me a few GTX 1080 Ti cards so I can delay buying newer cards for a bit.

        • TheAmazingRace 4 minutes ago | parent

          By older Arc, I presume you're referring to Alchemist and not Battlemage in this case.

          One of my brothers has a PC I built for him, specced out with an Intel Core i5-13400F CPU and an Intel Arc A770 GPU, and it still works great for his needs in 2025.

          Surely, Battlemage is more efficient and more compatible in some ways over Alchemist. But if you keep your expectations in check, it will do just fine in many scenarios. Just avoid any games using Unreal Engine 5.

        • interloxia an hour ago | parent

          Note that some tinkering may be required for modern cards on old systems.

          - A UEFI DXE driver to enable Resizable BAR on systems which don't support it officially. This provides performance benefits and is even required for Intel Arc GPUs to function optimally.

          List of working motherboards:

          https://github.com/xCuri0/ReBarUEFI/issues/11
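
          Once the DXE driver is in place, you can confirm from Linux that the large BAR actually took effect (01:00.0 is an example address; find your GPU's with lspci | grep -i vga):

              sudo lspci -vv -s 01:00.0 | grep -i -A3 "resizable bar"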

    • aposm 7 hours ago | parent

      A few years later but similarly - I am still running a machine built spur-of-the-moment in a single trip to Micro Center for about $500 in late 2019 (little did we know what was coming in a few months!). I made one small upgrade in probably ~2022 to a Ryzen 5800X w/ 64GB of RAM, but it's otherwise untouched. It still flies through basically anything and does everything I need, but I'm dreading the day any of the major parts go and I have to fork out double or triple the original cost for replacements...

    • hnu0847 6 hours ago | parent

      Don't all RAM manufacturers offer a lifetime warranty?

      That said, if the shortage gets bad enough then maybe they could find themselves in a situation where they were unable/unwilling to honor warranty claims?

      • SoftTalker 5 hours ago | parent

        I've never heard of a lifetime warranty on anything in the enterprise space. Maybe consumer stuff, where it's just a marketing gimmick.

        • chihuahua 5 hours ago | parent

          Oh, your RAM died? That means its lifetime ended at that moment, and so did the lifetime warranty. Is there anything else we can help you with today?

    • the__alchemist 7 hours ago | parent

      Man, it was just GPUs for a while, but same boat. I regret not getting the 4090 for $1600 direct from Nvidia. "That's too much for a video card," I thought, and got the 4080 instead. I dread the day when I need to replace it.

      • jakogut 6 hours ago | parent

        The Radeon RX 9070 XT performs at a similar level to the RTX 5070, and is retailing around $600 right now.

        • the__alchemist 6 hours ago | parent

          No CUDA means not an option for me.

          • the__alchemist 6 hours ago | parent

            > What kinds of applications do you use that require CUDA?

            Molecular dynamics simulations, and related structural bio tasks.

            • vlovich123 3 hours ago | parent

              Is the CUDA compatibility layer AMD has, which transparently compiles existing CUDA just fine, insufficient or buggy somehow? Or are you just stuck in the mindshare game and haven't reevaluated whether the AMD situation has changed this year?

              • the__alchemist 2 hours ago | parent

                I haven't checked out AMD's compatibility layer and know nothing about it. I tried to get vkFFT working in addition to cuFFT for a specific computation, but couldn't get it working right; crickets on the GH issue I posted.

                I use Vulkan for graphics, but Vulkan compute is a mess.

                I'm not stuck in mindshare, and this isn't a political thing. I am just trying to get the job done, and have observed that no alternative has stepped up to Nvidia's CUDA from a usability perspective.

                • vlovich123 an hour ago | parent

                  I didn’t talk about Vulkan compute.

                  > have observed that no alternative has stepped up to Nvidia's CUDA from a usability perspective.

                  I’m saying this is a mindshare thing if you haven’t evaluated ROCm / HIP. HIPify can convert CUDA source to HIP automatically and HIP is very similar syntax to CUDA.

                  • the__alchemist 43 minutes ago | parent

                    TY; will check those out.

              • jakogut 35 minutes ago | parent

                There's also ZLUDA, which can run llama.cpp and some other CUDA workloads already without any modification, but it's still maturing.

          • jakogut 6 hours ago | parent

            What kinds of applications do you use that require CUDA?

    • KronisLV 3 hours ago | parent

      If I needed a budget build, I'd probably look in the direction of used parts on AliExpress; you can sometimes find good deals on AM4 CPUs (that platform had a lot of longevity - even now my main PC has a Ryzen 7 5800X), and for whatever reason RX 580 GPUs were really, really widespread (though typically the 2048SP units). Not amazing by any means, but a significant upgrade from your current setup, and if you don't get particularly unlucky, it might last for years with no issues.

      Ofc there's also the alternate strategy of going for a mid/high end rig and hoping it lasts a decade, but the current DDR5 prices make me depressed so yeah maybe not.

      I genuinely hope that at some point the market will again get flooded with good, long-lived components at reasonable prices in the next gens: like the AM4 CPUs, that RX 580, or the GTX 1080 Ti. But I fear that Nvidia has learnt its lesson and now releases stuff that pushes you toward incremental upgrades rather than making something really good for the time; same with Intel's LGA1851 being basically dead on arrival after the reviews started rolling in (who knows, maybe at least mobos and Core Ultra chips will eventually be cheap as old stock). On the other hand, at least the Arc B580 GPUs were a step in the right direction - competent and not horribly overpriced (at least when it came to MSRP; unfortunately the merchants were scumbags and often ignored it).

    • square_usual 6 hours ago | parent

      You can still buy DDR4 for pretty cheap, and if you're replacing a computer that old any system built around DDR4 will still be a massive jump in performance.

    • ls612 6 hours ago | parent

      GPU prices are actually at MSRP now for most cards other than the 5090.

      • bilegeek an hour ago | parent

        The problem is that MSRP is also inflated, and Covid locked that in. Arc Battlemage is the only exception I see.

        • ls612 22 minutes ago | parent

          You’ve never been able to buy more GPU performance per dollar than you can today.

    • adventured 6 hours ago | parent

      You could still easily build an $800-$900 system that would be a dramatic jump forward from that machine.

      $700 in 2014 is now $971 inflation adjusted (BLS calculator).

      RTX 3060 12 GB: $180 (eBay). Sub-$200 CPU (~5-7 times faster than yours). 16 GB DDR4: $100-$120. PSU: $90. Motherboard: $100. WD Black 1 TB SSD: $120. Roughly $800 (which, inflation adjusted, beats your $700).

      Right now is a rather amazing time for CPUs, even though RAM prices have gone crazy.

      Assuming you find some deals somewhere in there, you could do slightly better on either pricing or components.
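
      Summing that parts list at the top of each range:

          awk 'BEGIN { print 180 + 200 + 120 + 90 + 100 + 120 }'    # 810, i.e. the "roughly $800"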

    • TacticalCoder 7 hours ago | parent

      > For general computer usage, SSDs really were a once in a generation "holy shit, this upgrade makes a real difference" thing.

      The last time I really remember seeing a huge speed bump was going from a regular SSD to an NVMe M.2 PCIe SSD... Around 2015 I bought one of the very first consumer motherboards with an NVMe M.2 slot and put a Samsung 950 Pro in it: that was quite something (though I was upgrading the entire machine, not just the SSD, so there's that too). Before that, I don't remember when I switched from a SATA HDD to a SATA SSD.

      I'm now running one of those WD Black SN850X NVMe SSDs, but my good old trusty, now ten-year-old Samsung 950 Pro is still kicking (in the wife's PC). There's likely even better out there, and they're easy to find: they're still reasonably priced.

      As for my 2015 Core i7-6700K: it's happily running Proxmox and Docker (but not always on).

      Even consumer parts are exceptionally reliable: the only failures I remember, in 15 years (and I've got lots of machines running), are a desktop PSU (replaced by a Be Quiet! one), a no-name NVMe SSD, and a laptop's battery.

      Oh, and my MacBook Air M1's screen died overnight for no reason after precisely 13 months, when I had a warranty of 12 months (some refer to it as "bendgate"), but that's because first-gen MacBook Air M1s were indescribable pieces of fragile shit. I think Apple got their act together and came up with better screens in later models.

      Don't worry too much: PCs are quite reliable things. And used parts for your PC from 2014 wouldn't be expensive on eBay anyway. You're not forced to upgrade to a last gen PC with DDR5 (atm 3x overpriced) and a 5090 GPU.

      • genewitch 4 hours ago | parent

        FYI, someone or something is downvoting your recent posts to oblivion, and I didn't see any obvious reason.

    • testing22321 5 hours ago | parent

      I got a used M1 MacBook Air a year ago.

      By far the fastest computer I’ve ever used. It felt like the SSD leap of years earlier.

    • sneak 8 hours ago | parent

      Am I crazy for thinking that anyone using computers for doing their job and making their income should have a $5k/year computer hardware budget at a minimum? I’m not saying to do what I do and buy a $7k laptop and a $15k desktop every year but compared to revenue it seems silly to be worrying about a few thousand dollars per year delta.

      I buy the best phones and desktops money can buy, and upgrade them often, because, why take even the tiniest risk that my old or outdated hardware slows down my revenue generation which is orders of magnitude greater than their cost to replace?

      Even if you don’t go the overkill route like me, we’re talking about maybe $250/month to have an absolutely top spec machine which you can then use to go and earn 100x that.

      Spend at least 1% of your gross revenue on your tools used to make that revenue.

      • macNchz 7 hours ago | parent

        What is the actual return on that investment, though? This is self-indulgence justified as « investment ». I built a pretty beefy PC in 2020 and have made a couple of upgrades since (Ryzen 5950X, 64GB RAM, Radeon 6900XT, a few TB of NVMe) for like $2k all-in. Less than $40/month over that time. It was a game-changing upgrade from an aging laptop for my purposes of being able to run multiple VMs and a complex dev environment, but I really don't know what I would have gotten out of replacing it every year since. It's still blazing fast.

        Even recreating it entirely with newer parts every single year would have cost less than $250/mo. Honestly it would probably be negative ROI just dealing with the logistics of replacing it that many times.

        • crazygringo 6 hours ago | parent

          > This is self indulgence justified as « investment ».

          Exactly that. There's zero way that level of spending is paying for itself in increased productivity, considering they'll still be 99% as productive spending something like a tenth of that.

          It's their luxury spending. Fine. Just don't pretend it's something else, or tell others they ought to be doing the same, right?

        • londons_explore 7 hours ago | parent

          Every hardware update for me involves hours or sometimes days of faffing with drivers and config and working round new bugs.

          Nobody is paying for that time.

          And whilst it is 'training', my training time is better spent elsewhere than battling with why CUDA won't work on my GPU upgrade.

          Therefore, I avoid hardware and software changes merely because a tiny bit more speed isn't worth the hours I'll put in.

        • mikepurvis 7 hours ago | parent

          My main workstation is similar, basically a top-end AM4 build. I recently bumped from a 6600 XT to a 9070 XT to get more frames in Arc Raiders, but looking at what the cost would be to go to the current-gen platform (AM5 mobo + CPU + DDR5 RAM) I find myself having very little appetite for that upgrade.

      • Clent 7 hours ago | parent

        This is a crazy out of touch perspective.

        Depending on salary, 2 magnitudes at $5k is $500k.

        That amount of money for the vast majority of humans across the planet is unfathomable.

        No one is worried about whether the top 5% can afford DRAM. Literally zero people.

      • jfindper 7 hours ago | parent

        >I’m not saying to do what I do and buy a $7k laptop and a $15k desktop every year

        >I buy the best phones and desktops money can buy

        Sick man! Awesome, you spend 1/3 of the median US salary on a laptop and desktop every year. That's super fucking cool! Love that for you.

        Anyways, please go brag somewhere else. You're rich, you shouldn't need extra validation from an online forum.

      • mitthrowaway2 7 hours ago | parent

        Yes? I think that's crazy. I just maxed out my new Thinkpad with 96 GB of RAM and a 4 TB SSD and even at today's prices, it still came in at just about $2k and should run smoothly for many years.

        Prices are high but they're not that high, unless you're buying the really big GPUs.

        • sgerenser 7 hours ago | parent

          Where can you buy a new Thinkpad with 96GB and 4TB SSD for $2K? Prices are looking quite a bit higher than that for the P Series, at least on Lenovo.com in the U.S. And I don't see anything other than the P Series that lets you get 96GB of RAM.

          • mitthrowaway2 7 hours ago | parent

            You have to configure it with the lowest-spec SSD and then replace that with an aftermarket 4 TB SSD at around $215. The P14s I bought last week, with that and the 8 GB Nvidia GPU, came to a total of USD $2150 after taxes, including the SSD. Their sale price today is not quite as good as it was last week, but it's still in that ballpark; with the 255H CPU, iGPU, and a decent screen, you can get the Intel P14s for $2086 USD. That actually becomes $1976 because you get $110 taken off at checkout. Then throw in the aftermarket SSD and it'll be around $2190. And if you log in as a business customer you'll get another couple percent off as well.

            The AMD model P14s, with 96 GB, an upgraded CPU, the nice screen, and Linux, still goes for under $1600 at checkout, which becomes $1815 when you add the aftermarket SSD upgrade.

            It's still certainly a lot to spend on a laptop if you don't need it, but it's a far cry from $5k/year.

          • lionkor 7 hours ago | parent

            Typing this on a similar-spec P16s that was around 2.6k or so. So if you call anything under 3k simply 2k, then it was 2k.

            That's in Germany, from a corporate supplier.

      • gr4vityWall 7 hours ago | parent

        > maybe $250/month (...) which you can then use to go and earn 100x that.

        25k/month? Most people will never come close to earning that much. Most developers in the third world don't make that in a full year, but they are affected by rises in PC part prices.

        I agree with the general principle of having savings for emergencies. For a Software Engineer, that should probably include buying a good enough computer for them, in case they need a new one. But the figures themselves seem skewed towards the reality of very well-paid SV engineers.

        • Dibby053 6 hours ago | parent

          >Most developers in the third world don't make that in a full year

          And many in the first world haha

        • londons_explore 7 hours ago | parent

          > But the figures themselves seem skewed towards the reality of very well-paid SV engineers.

          The soon-to-be-unemployed SV engineers, when LLMs mean anyone can design an app and backend with no coding knowledge.

          • genewitch 4 hours ago | parent

            And you can code from an RPi / cellphone and use a cloud computer to run it, so you actually don't really need an expensive PC at all.

      • ceejayoz 7 hours ago | parent

        > Am I crazy for thinking that anyone using computers for doing their job and making their income should have a $5k/year computer hardware budget at a minimum?

        Yes. This is how we get websites and apps that don't run on a normal person's computer, because the devs never noticed their performance issues on their monster machines.

        Modern computing would be a lot better if devs had to use old phones, basic computers, and poor internet connections more often.

      • vultour 7 hours ago | parent

        Yes, that's an absolutely deranged opinion. Most tech jobs can be done on a $500 laptop. You realise some people don't even make your computer budget in net income every year, right?

        • sneak 7 hours ago | parent

          Most tech jobs could be done on a $25 ten year old smartphone with a cracked screen and bulging battery.

          That’s exactly my point. Underspending on your tools is a misallocation of resources.

          • pqtyw 5 hours ago | parent

            That's a bizarrely extreme position. For almost everyone, a ~$2000-3000 PC from several years ago is indistinguishable from one they can buy now from a productivity standpoint. Nobody is talking about $25 ten-year-old smartphones. Of course, claiming that a $500 laptop is sufficient is also a severe exaggeration; a used desktop, perhaps...

          • jermaustin1 6 hours ago | parent

            Overspending on your tools is a misallocation of resources. An annual $22k spend on computing is around a 10-20x overspend for a wealthy individual. I'm in the $200-300k/year, self-employed, buys-my-own-shit camp, and I can't imagine spending 1% of my income on computing needs, let alone close to 10%. There is no way to make that make sense.

          • antiframe 6 hours ago | parent

            Yes, you don't want to underspend on your tools to the point where you suffer. But I think you are missing the flip side. I can do my work comfortably with 32GB RAM; my 1%-a-year budget could get me more, but why not pocket it?

            The goal is the right tool for the job, not the best tool you can afford.

      • kube-system 7 hours ago | parent

        I agree with the general sentiment - that you shouldn't pinch pennies on tools that you use every day. But at the same time, someone who makes their money writing with a pen shouldn't need to spend thousands on pens. Once you have adequate professional-grade tools, you don't need to throw more money at the problem.

      • dghlsakjg5 hours ago |parent

        If you are consistently maxing out your computer's performance in a way that limits your ability to earn money at a rate greater than the cost of upgrades, and you can't offload that work to the cloud, then I guess it might make sense.

        If you are like every developer I have ever met, the constraint is your own time, motivation, and skills, and spending $22k per year is a pretty interesting waste of resources.

        Does it make sense to buy good tools for your job? Yes. Does it make sense to buy the most expensive version of a tool when you already own last year's most expensive version? Rarely.

      • hansvm6 hours ago |parent

        Most people who use computers for the main part of their jobs literally can't spend that much if they don't want to be homeless.

        Most of the rest arguably shouldn't. If you have $10k/yr in effective pay after taxes, healthcare, rent, food, transportation to your job, etc, then a $5k/yr purchase is insane, especially if you haven't built up an emergency fund yet.

        Of the rest (people who can relatively easily afford it), most still probably shouldn't. Unless the net present value of your post-tax future incremental gains (raises, promotions, etc) derived from that expenditure exceeds $5k/yr, you're better off financially doing almost anything else with that cash. That's doubly true when you consider that truly amazing computers cost $2k total nowadays without substantial improvements year-to-year. Contrasting buying one of those every 2yrs vs your proposal, you'd need a $4k/yr net expenditure to pay off, somehow making use of the incremental CPU/RAM/etc to achieve that value. If it doesn't pay off then it's just a toy you're buying for personal enjoyment, not something that you should nebulously tie to revenue generation potential with an arbitrary 1% rule. Still maybe buy it, but be honest about the reason.
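
        As a rough sketch of that break-even logic (a toy calculation; the 5% discount rate and 5-year horizon are assumptions of mine, only the $4k/yr net spend comes from above):

          def npv(cashflows, rate=0.05):
              # Discount each year-end cashflow back to the present.
              return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows, start=1))

          # $5k/yr proposal minus the ~$1k/yr "great $2k box every 2 years" baseline.
          hurdle = npv([4000] * 5)
          print(f"incremental post-tax gains must clear ~${hurdle:,.0f} (NPV over 5 years)")
          # -> about $17,300; anything less and the hardware is a toy, not an investment.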

        So, we're left with people who can afford such a thing and whose earning potential actually does increase enough with that hardware compared to a cheaper option for it to be worth it. I'm imagining that's an extremely small set. I certainly use computers heavily for work and could drop $5k/yr without batting an eye, but I literally have no idea what I could do with that extra hardware to make it pay off. If I could spend $5k/yr on internet worth a damn I'd do that in a heartbeat (moving soon I hope, which should fix that), but the rest of my setup handily does everything I want it to.

        Don't get me wrong, I've bought hardware for work before (e.g., nobody seems to want to procure Linux machines for devs even when they're working on driver code and whatnot), and it's paid off, but at the scale of $5k/yr I don't think many people do something where that would have positive ROI.

      • ChromaticPanic7 hours ago |parent

        That's crazy spend for anyone making sub 100K

        • jermaustin16 hours ago |parent

          It is crazy for anyone making any amount. A $15k desktop is overkill for anything but the most demanding ML or 3D workloads, and the majority of the cost will be in GPUs or dedicated specialty hardware and software.

          A developer using even the clunkiest IDE (Visual Studio - I'm still a fan and daily user, it's just the "least efficient") can get away without a dedicated graphics card and with only 32GB of RAM.

        • red-iron-pine6 hours ago |parent

          that's a crazy spend for sub-200k or even sub-500k

          you're just building a gaming rig with a flimsy work-related justification.

      • neogodless6 hours ago |parent

        Have you ever heard of the term "efficiency"?

        It's when you find ways to spend the minimum amount of resources in order to get the maximum return on that spend.

        With computer hardware, buying one-year-old hardware and/or the second-best often costs a tiny fraction of the price of the bleeding edge, while providing very nearly 100% of the performance you'll actually utilize.

        That, and your employer should pay for your hardware in many cases.

      • iberator2 hours ago |parent

        Extremist point of view, and NOT optimal. Diminishing performance per $...

        The proper calculation is the cost/performance ratio. Then buy the second one on the list :)

      • nickjj7 hours ago |parent

        I try to come at it with a pragmatic approach. If I feel pain, I upgrade and don't skimp out.

        ======== COMPUTER ========

        I feel no pain yet.

        Browsing the web is fast enough where I'm not waiting around for pages to load. I never feel bound by limited tabs or anything like that.

        My Rails / Flask + background worker + Postgres + Redis + esbuild + Tailwind based web apps start in a few seconds with Docker Compose. When I make code changes, I see the results in less than 1 second in my browser. Tests run fast enough (seconds to tens of seconds) for the size of apps I develop.

        Programs open very quickly. Scripts I run within WSL 2 also run quickly. There's no input delay when typing or performance related nonsense that bugs me all day. Neovim runs buttery smooth with a bunch of plugins through the Windows Terminal.

        I have no lag when I'm editing 1080p videos even with a 4k display showing a very wide timeline. I also record my screen with OBS to make screencasts with a webcam and have live streamed without perceivable dropped frames, all while running programming workloads in the background.

        I can mostly play the games I want, but this is by far the weakest link. If I were more into gaming I would upgrade, no doubt about it.

        ======== PHONE ========

        I had a Pixel 4a until Google busted the battery. It runs all of the apps (no games) I care about and Google Maps is fast. The camera was great.

        I recently upgraded to a Pixel 9a because the repair center that broke my 4a in a number of ways gave me $350, and the 9a was $400 a few months ago. It also runs everything well and the camera is great. In my day-to-day it makes no difference from the 4a, literally none. It even has the same storage, of which I have around 50% left with around 4,500 photos saved locally.

        ======== ASIDE ========

        I have a pretty decked-out M4 MBP laptop issued by my employer for work. I use it every day, and for most tasks I feel no real difference vs my machine. The only thing it does noticeably faster is heavily CPU-bound tasks that can be parallelized. It also loads the web version of Slack about 250ms faster; that's the impact of a $2,500+ upgrade for general web usage.

        I'm really sensitive to skips, hitches and performance related things. For real, as long as you have a decent machine with an SSD using a computer feels really good, even for development workloads where you're not constantly compiling something.

      • crote2 hours ago |parent

        Sorry, but that's delusional.

        For starters, hardware doesn't advance quickly enough to justify buying a new generation every year. There was a 2-year gap between Ryzen 7000 and Ryzen 9000, for example, and a similar 2-year gap between Ryzen 5000 and Ryzen 7000. On top of that, most of the parts can be reused, so you're at best dropping in a new CPU and some new RAM sticks.

        Second, the performance improvement just isn't there. Sure, there's a 10% performance increase in benchmarks, but that does not translate to a 10% productivity improvement for software development. Even a 1% increase is unlikely, as very few tasks are compute-bound for any significant amount of time.
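
        Back-of-the-envelope (the 5% compute-bound share below is an assumed number, not a measurement):

          # Amdahl-style estimate: only the compute-bound slice of the
          # workday shrinks when the machine gets faster.
          compute_fraction = 0.05  # assumed share of dev time actually waiting on the CPU
          speedup = 1.10           # the ~10% benchmark gain of a new generation

          time_saved = compute_fraction * (1 - 1 / speedup)
          print(f"workday saved: {time_saved:.2%}")  # ~0.45%, nowhere near 10%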

        You can only get to $15k by doing something stupid like buying a Threadripper, or putting an RTX 4090 into it. There are genuine use-cases for that kind of hardware - but it isn't in software development. It's like buying a Ferrari to do groceries: at a certain point you've got to admit that you're just doing it to show off your wealth.

        You do you, but in all honesty you'd probably get a better result spending that money on a butler to bring your coffee to your desk instead of wasting time by walking to the coffee machine.

      • Krssst7 hours ago |parent

        One concern I'd have: if the short-term supply of RAM is fixed anyway, then even if all daily computer users increased their budgets to match the new pricing, demand would exceed supply again and prices would simply keep rising until they get unreasonable enough that demand falls back to supply.

      • ambicapter7 hours ago |parent

        I don't spend money on my computers from a work or "revenue-generating" perspective because my work buys me a computer to work on. Different story if you freelance/consult ofc.

      • pharrington6 hours ago |parent

        are you paid by the FLOP?

      • kotaKat7 hours ago |parent

        I mean, as a frontline underpaid rural IT employee with no way to move outward from where I currently live, show me where I’m gonna put $5k a year into this budget out of my barren $55k/year salary. (And, mind you - this apparently is “more” than the local average by only around $10-15k.)

        I’m struggling to buy hardware already as it is, and all these prices have basically fucked me out of everything. I’m riding rigs with 8 and 16GB of RAM and I have no way to go up from here. The AI boom has basically forced me out of the entire industry at this point. I can’t get hardware to learn, subscriptions to use, anything.

        Big Tech has made it unaffordable for everyone.

        • zozbot2346 hours ago |parent

          8GB or 16GB of RAM is absolutely a usable machine for many software development and IT tasks, especially if you set up compressed swap to stretch it further. Of course, you need to run something other than Windows or macOS. It's only very niche use cases, such as media production or running local LLMs, that absolutely require more RAM.
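
          For example, a minimal zram setup on Linux looks roughly like this (a sketch, assuming root and a kernel with the zram module and zstd support; the 8G size is an illustrative choice, not a recommendation):

            import subprocess

            # Create /dev/zram0, pick a compressor, size it, and swap onto it.
            subprocess.run(["modprobe", "zram"], check=True)
            with open("/sys/block/zram0/comp_algorithm", "w") as f:
                f.write("zstd")  # must be chosen before setting the disksize
            with open("/sys/block/zram0/disksize", "w") as f:
                f.write("8G")    # uncompressed capacity of the swap device
            subprocess.run(["mkswap", "/dev/zram0"], check=True)
            subprocess.run(["swapon", "--priority", "100", "/dev/zram0"], check=True)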

          • pqtyw5 hours ago |parent

            > something other than Windows or macOS

            > 8GB

            No modern IDE either, nor a modern Linux desktop environment (they are not that much more memory-efficient than macOS or Windows). Yes, you can work with not much more than a text editor. But why?

        • ecshafer7 hours ago |parent

          The bright side is that the bust is going to create a glut of cheap used parts.

  • browningstreet24 minutes ago

    Apple has the opportunity to do something really funny and radically increase the base RAM configurations of all their unified memory/CPU/GPU chips. Intel/AMD builders would struggle to meet the price/capacity points.

    • koolala4 minutes ago |parent

      Don't they have to pay the same crazy prices? What gives them the opportunity?

  • jsheard8 hours ago

    To be fair, Samsung's divisions having guns pointed at each other is nothing new. This is the same conglomerate that makes their own chip division fight for placement in their own phones, constantly flip-flopping between using Samsung or Qualcomm chips at the high end, Samsung or Mediatek chips at the low end, or even a combination of first-party and third-party chips in different variants of ostensibly the same device.

    • lkramer8 hours ago |parent

      To be honest, this actually sounds kinda healthy.

      • dgemm7 hours ago |parent

        It's a forcing function that ensures the middle layers of a vertically integrated stack remain market competitive and don't stagnate because they are the default/only option

      • _aavaa_8 hours ago |parent

        Sears would like to have a word about how healthy intra-company competition is.

        • marcosdumay7 hours ago |parent

          Sears had a horizontal market where all of it did basically the same thing. Samsung is a huge conglomerate of several completely different verticals with lots of redundant components.

          It makes absolutely no sense to apply the lessons from one to the other.

          • StableAlkyne7 hours ago |parent

            I think what the GP was referring to was the "new" owner of Sears, who reorganized the company into dozens of independent business units in the early 2010s (IT, HR, apparel, electronics, etc). Not departments, either; full-on internal businesses intended as a microcosm of the free market.

            Each of these units was then given access to an internal "market" and directed to compete with the others for funding.

            The idea was likely to try and improve efficiency... But what ended up happening is siloing increased, BUs started infighting for a dwindling set of resources (beyond normal politics you'd expect at an organization that size; actively trying to fuck each other over), and cohesion decreased.

            It's often pointed to as one of the reasons for their decline, and worked out so badly that it's commonly believed their owner (who also owns the company holding their debt and stands to immensely profit if they go bankrupt) desired this outcome... to the point that he got sued a few years ago by investors over the conflict of interest and, let's say "creative" organizational decisions.

            • silisili5 hours ago |parent

              This happened at a place where I worked years ago, but not as 'on purpose.' We were a large company where most pieces depended on other pieces, and everything was fine - until a new CEO came in who started holding the numbers of each BU under a microscope. This led to each department trying to bill other departments as an enterprise customer, who then retaliated, which then led to internal departments threatening to go to competitors who charged less for the same service. Kinda stupid how that all works - on paper it would have made a few departments look better if they used a bottom barrel competitor, but in reality the company would have bled millions of dollars as a whole...all because one rather large BU wanted to goose its numbers.

            • red-iron-pine6 hours ago |parent

              to put a finer point on it, it wasn't just competition or rewarding-the-successful, the CEO straight up set them at odds with each other and told them directly to battle it out.

              basically "coffee is for closers... and if you don't sell you're fired" as a large scale corporate policy.

            • _aavaa_6 hours ago |parent

              Yes, this is what I was referring to. I should have provided more context, thanks for doing so.

            • marcosdumay6 hours ago |parent

              That was a bullshit separation of a single horizontal cut of the market (all of those segments did consumer retail sales) without overlap.

              The part about no overlaps already made it impossible for them to compete. The only "competition" they had was in the sense of TV gameshow competition where candidates do worthless tasks, judged by some arbitrary rules.

              That has absolutely no similarity to how Samsung is organized.

          • reaperducer7 hours ago |parent

            > Sears had a horizontal market where all of it did basically the same thing. Samsung is a huge conglomerate of several completely different verticals with lots of redundant components.

            Sears was hardly horizontal. It was also Allstate insurance and Discover credit cards, among other things.

            • marcosdumay6 hours ago |parent

              Ok. And if it did divide on the borders of insurance and payment services, the reorganization wouldn't have been complete bullshit and may even have been somewhat successful.

        • HugoTea6 hours ago |parent

          Nokia too

      • fransje267 hours ago |parent

        Yeah, makes absolute sense.

        A bit like Toyota putting a GM engine in their car because the Toyota engine division is too self-centered, focusing too much on efficiency.

        • cobalt607 hours ago |parent

          You mean Toyota putting a BMW engine in the Supra. Your statement is contradictory, as Toyota has TRD, which focuses on track performance. They just couldn't keep up with the BMW straight-six's performance and reliability compared to their own 2JZ.

          • Der_Einzige6 hours ago |parent

            Buying a Supra is stupid. Either buy a proper BMW with the B58/ZF 8-speed and get a proper interior, or stop being poor and buy an LC500.

            Better yet, get a C8 corvette and gap all of the above for a far better value. You can get 20% off msrp on factory orders with C8 corvettes if you know where to look.

      • itsastrawman7 hours ago |parent

        The opposite, nepotism, is very unhealthy, so I think you're correct.

        • hammock6 hours ago |parent

          Not sure that the opposite of transfer pricing is nepotism. As far as I know, it’s far more common for someone who owns a lake house to assign four weeks a year to each grandkid than to make them bid real money on it and put that in a maintenance fund or something. Though it’s an interesting idea, it’s not very family friendly.

      • zoeysmithe7 hours ago |parent

        n/a

        • crazygringo6 hours ago |parent

          I genuinely can't tell if this is sarcasm? Or do you live somewhere where this is taught?

    • EMM_3866 hours ago |parent

      Isn't this how South Korean chaebols work?

      They operate with tension. They're supposed to have unified strategic direction from the top, but individual subsidiaries are also expected to be profit centers that compete in the market.

    • DavidPeiffer6 hours ago |parent

      I worked with some supply chain consultants who mentioned "internal suppliers are often worse suppliers than external".

      Their point was that service levels are often not as stringently tracked and SLAs become internal money shuffling, while the company as a whole pays the price in lower output/profit. The internal partner being the default allows a degree of complacency, and if you shop around for a comparable level of service, you can often find it for a better price.

    • morcus8 hours ago |parent

      > two versions of the same phone with different processors

      That's hilarious, which phone is this?

      • petcat8 hours ago |parent

        Basically every Galaxy phone comes in two versions: one with Exynos and one with Snapdragon. It's regional, though. The US always gets the Snapdragon phones, while Europe and most of Asia get the Exynos version.

        My understanding is that the Exynos is inferior in a lot of ways, but also cheaper.

        • sgerenser7 hours ago |parent

          In the past using Snapdragon CPUs for the U.S. made sense due to Qualcomm having much better support for the CDMA frequencies needed by Verizon. Probably no longer relevant since the 5G transition though.

      • muvlon8 hours ago |parent

        Not one phone, they did this all over the place. Their flagship line did this starting with the Galaxy S7 all the way up to Galaxy S24. Only the most recent Galaxy S25 is Qualcomm Snapdragon only, supposedly because their own Exynos couldn't hit volume production fast enough.

        • numpad07 hours ago |parent

          "Galaxy S II" and its aesthetics was already a mere branding shared across at least four different phones with different SoCs, before counting in sub-variants that share same SoCs. This isn't unique to Samsung, nor is it a new phenomenon, just how consumer products are made and sold.

          1: https://en.wikipedia.org/wiki/Samsung_Galaxy_S_II

        • noisem4ker8 hours ago |parent

          The S23 too was Snapdragon only, allegedly to let the Exynos team catch some breath and come up with something competitive for the following generation. Which they partly did, as the Exynos S24 is almost on par with its Snapdragon brother. A bit worse on photo and gaming performance, a bit better in web browsing, from the benchmarks I remember.

        • magicalhippo7 hours ago |parent

          The S23 was also Snapdragon-only as far as I know[1]. The S24 had the dual chips again, while as you say S25 is Qualcomm only once more.

          [1]: https://www.androidauthority.com/samsung-exynos-versus-snapd...

      • grincek8 hours ago |parent

        This is the case as recently as the S24: phones can come with Exynos or Snapdragon, with Exynos usually featuring worse performance and battery life.

      • intrikate5 hours ago |parent

        I might be out of date, but last I knew, it was "most of them."

        International models tended to use Samsung's Exynos processors, while the ones for the North American market used Snapdragons or whatever.

      • namibj8 hours ago |parent

        Several high end Galaxy S's AFAIK.

    • MagicMoonlight6 hours ago |parent

      That’s really good business. Everyone is pushing to be the best rather than accepting mediocrity.

  • rafaelmn7 hours ago

    Apple is going to be even more profitable in the consumer space because of RAM prices? I feel like they are the only player with the supply chain locked down enough to not get caught off guard, with good prices locked in far enough in advance, and with suppliers unwilling to antagonize such a big customer by backing out of a deal.

    • londons_explore7 hours ago |parent

      Apple software typically seems to give a better user experience in less RAM in both desktop and mobile.

      For the last 10+ years, Apple's iPhones have shipped with about half the RAM of a flagship Android, for example.

      • Miraste5 hours ago |parent

        They used to, but they've caught up. The flagship iPhone 17 has 12GB RAM, the same as the Galaxy S25. Only the most expensive Z Fold has more, with 16GB.

        RAM pricing segmentation makes Apple a lot of money, but I think they scared themselves when AI took off and they had millions of 4GB and 8GB products out in the world. The Mac minimum RAM specs have gone up too; they're trying to get out of the hole they dug.

      • fennecbutt3 hours ago |parent

        People always make this argument. But could you please expand on what you think is actually in memory?

        Code vs. data: by and large, I bet the content held in RAM takes up the majority of the space. And people have complained about the lack of RAM in iPhones for ages now, particularly with how it affects browsers.

    • Dibby0536 hours ago |parent

      >the only player to have the supply chain locked down enough to not get caught off guard

      What?

      • Miraste6 hours ago |parent

        Tim Cook is the Supply Chain Guy. He has been for decades, before he ever worked at Apple. He does everything he can to make sure that Apple directly controls as much of the supply chain as possible, and uses the full extent of their influence to get favorable long-term deals on what they don't make themselves.

        In the past this has resulted in stuff like Samsung Display sending their best displays to Apple instead of Samsung Mobile.

  • khannn6 hours ago

    "The price of eggs has nothing on the price of computer memory right now.". A dozen eggs went to ~$5. They are eggs and most people use what, max 12 eggs a month? Get out of here with that trite garbage. Everyone knew that the egg shortage was due to the extreme step the US does of culling herds infected with avian flu and that they were transitory.

    • 5424586 hours ago |parent

      Surprisingly, Americans apparently average 279 eggs per person per year, or about 23 per month.

      https://www.washingtonpost.com/business/2019/02/28/why-ameri...

      (This is not a comment making any judgements about cost or the state of the economy, I was just surprised to find it that high)

      • red-iron-pine6 hours ago |parent

        cuz eggs are in breakfast sandwiches, are ingredients in pastries, act as binders in things like meatloaf or fried chicken, etc. etc.

      • silisili4 hours ago |parent

        That sounded high to me as well (probably because I rarely eat eggs), but then I remembered my parents, who each eat two per day, which isn't that uncommon I guess.

      • baud1472585 hours ago |parent

        Maybe if you include all the eggs in processed food like cookies or cakes and in restaurants or other catering operations you reach that number? And eggs consumed at home could still be around 12 per person?

    • xboxnolifes5 hours ago |parent

      The average person buys, what, 0 RAM per month? Who cares.

      • khannn3 hours ago |parent

        The average person buys a phone amortized over 36 months, minus trade-in value. So they do indeed buy RAM every month; it's just a line item on a phone bill.

        • xboxnolifesan hour ago |parent

          Assuming an 8GB phone on average and 2x16GB DDR5 desktop sticks being ~$400, the average person then buys 0.25GB RAM per month at $3.125.

          If you want, you can add in a 16GB laptop every 36 months, tripling the total to 0.75GB and ~$10 a month. Still, that's multiple times less than the increase in egg price compared to the average consumption.
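
          Spelling the amortization out (same assumed prices and 36-month cycles as above; the exact figures land a bit under the rounded 0.25GB/0.75GB ones):

            PRICE_PER_GB = 400 / 32  # ~$12.50/GB, from the 2x16GB DDR5 figure

            def per_month(gb, months=36):
                # Amortize a device's RAM over its replacement cycle.
                return gb / months, gb / months * PRICE_PER_GB

            print(per_month(8))       # phone only: ~0.22 GB, ~$2.78 per month
            print(per_month(8 + 16))  # phone + laptop: ~0.67 GB, ~$8.33 per month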

          • khannn18 minutes ago |parent

            Apples and oranges comparison. RAM works forever, while eggs only keep someone full for 4 to 6 hours. I'd honestly like to see the amount of time someone is full from eating eggs vs. the average daily screen time vs. the cost of both; let's say the service life of the phone is 36 months, with the cost of the eggs averaged out over that three-year period.

    • n8cpdx5 hours ago |parent

      Eggs have traditionally been an extremely cheap protein staple.

      A typical pattern might be to have two eggs for breakfast (a whopping 120 calories), boiled eggs for lunch/snack (another 60-120 calories), and of course baking, but I will pretend that people don’t bake.

      A more typical serving for an adult breakfast might be 3 eggs if not supplemented.

      For mom and dad and the little one, you’re now at 35 ((2+2+1+2)x5) eggs per week. When your weekly cost goes from $6 (two 18-count cartons at $3 each) to $16 (at $8 each), you notice.

      Obviously the political discourse around this was not healthy. But eggs suddenly becoming a cost you have to notice is a big deal, and a symbol for all of the other grocery prices that went up simultaneously.

      If you’re a typical HN user in the US you might be out of touch with the reality that costs going up $10/week can be a real hardship when you’re raising a family on limited income.

      The peak was actually closer to $8/dozen; my math has been conservative at every step, so the situation is worse than I describe.
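
      Sanity-checking that weekly math (using the 18-count cartons and the $3 vs $8 per-carton prices from above):

        import math

        eggs_per_week = (2 + 2 + 1 + 2) * 5      # 35: parents, kid, snacks, 5 days
        cartons = math.ceil(eggs_per_week / 18)  # two 18-count cartons

        for price in (3, 8):                     # $/carton, before vs. at the peak
            print(f"${cartons * price}/week at ${price}/carton")
        # -> $6/week before, $16/week at the peak: a $10/week jump.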

      • khannn4 hours ago |parent

        Parents in the US don't feed their kids eggs for breakfast; it's mostly cereal or breakfast bars. Maybe some yogurt, but that's almost always upper middle class or above.

        "If you’re a typical HN user in the US you might be out of touch with the reality that costs going up $10/week can be a real hardship when you’re raising a family on limited income.".

        Skill issue. Oatmeal is very cheap and filling. The aforementioned yogurt. Nothing, yeah nothing, because the average person is obese here and nothing is exactly what they need for breakfast. A piece of fruit like the perennial classic banana for breakfast. Complaining about egg prices comes from the camp of "I tried nothing and nothing worked".

        • et-al41 minutes ago |parent

          Aside from yoghurt, you’ve only listed carbs. Sure oatmeal has protein (and fiber), but not as much as eggs.

        • hombre_fatal3 hours ago |parent

          I agree, but for some reason there's huge mental inertia to the foods we eat day to day.

          Paying more for staples that you've eaten your whole life (especially in a boiled-frog way) is much cheaper in time, energy, and attention than experimenting with how you and your kids might like a bowl of oatmeal prepared.

          That said, if you're having trouble making ends meet and you have kids, you don't have much of a choice.

        • stuffn12 minutes ago |parent

          > Complaining about egg prices comes from the camp of "I tried nothing and nothing worked".

          Eggs are one of the highest protein-per-calorie, nutrient dense foods you can purchase. Up until recently it was cheaper than almost any other staple. When I was growing up (admittedly during a time everything was relatively cheap) my family ate a lot of eggs. We had spreads, we had eggs for breakfast, and eggs were incorporated into dinners in one way or another. I'm not the only one. I don't know anyone born in my cohort that didn't eat eggs regularly.

          > Oatmeal is very cheap and filling

          Also largely devoid of the nutrition eggs provide, and it requires supplementation.

          > it's majority cereal or breakfast bars.

          While true, this is an education issue, not a cost issue. We still have at least 3 generations of people having children who were raised in the "eggs are horrible for you" times, including myself.

          > Nothing, yeah nothing, because the average person is obese here and nothing is exactly what they need for breakfast.

          The average person is obese because of the relative ease of cheap, high-calorie fillers and good options being more expensive. The price of eggs increasing compounds this. However, I would wager most adults are obese because of the high-calorie Starbucks, fast food, and snacks, not because of cereal for breakfast.

          > A piece of fruit like the perennial classic banana for breakfast.

          Demonstrably worse for you than both cereal and eggs. Once again defeating your point, and STILL demonstrating that more expensive eggs make nutritionally worse options the only option.

        • bobsmoothan hour ago |parent

          The quintessential out of touch HN comment.

          • khannn16 minutes ago |parent

            I have friends with kids, have siblings with kids, and indeed did grow up in the US. Ate cereal growing up with maybe some eggs on the weekend. My siblings feed their kids exactly what I described. My friends feed their kids the same. I have no idea how that is out of touch, but I grew up lower-middle class and that's my lived experience.

    • th0ma55 hours ago |parent

      There was also a lot of profiteering going on? This was talked about quite a bit? And it's still going on in other markets with other things like cars??

      • khannn4 hours ago |parent

        "Profiteering"? Truth is... the game was rigged from the start

      • venturecruelty3 hours ago |parent

        Sorry, we have to starve so the two dairy distributors can have another good quarter. I hear gruel is cheap, for now.

  • Shank5 hours ago

    This is going to be a serious problem. We’ve had smart devices percolate through all consumer electronics, from washing machines to fridges. That’s all fine and dandy, but they all need RAM. At what point does this become a national security issue? People need these things, they all require RAM, and now they will presumably cost more as raw chip costs increase significantly or the supply chains for lower quantities dry up altogether.

  • fennecbutt3 hours ago

    Well well well. From an anti-monopoly standpoint, isn't it interesting that each business is doing what it should for its own best interests, rather than cutting special deals because they're under the same umbrella?

    • venturecruelty3 hours ago |parent

      I love how I've seen a bunch of responses that amount to "do you really need that much RAM anyway?" Unreal.

  • monster_truck7 hours ago

    Based on my time working for Samsung this does not surprise me. The silos within fight against one another more than they ever bother to compete with anyone else

  • qwertox6 hours ago

    I had planned to build a new workstation this fall; all the parts were in the list. But seeing the RAM go from 300€ (96 GB) to 820€, in stock for 999€, in under a month made me decide that I will continue using my laptop from 2019 for maybe another 1.5 years.

    It's a ridiculous situation and these companies, whoever they are, should be somewhat ashamed of themselves for the situation they're putting us in.

    That goes especially for those MFs at OpenAI who apparently grabbed 40% of worldwide DRAM production, as well as the chips sold in stores.

  • time4tea6 hours ago

    Dec 2023:

    96GB (2x48) DDR5-5x00: £260, today £1050

    128GB (4x32) DDR5-5x00: £350, today £1500

    Wut?

    Edit: formatting

    • bpye2 hours ago |parent

      Kind of wish I went for 2x48GB last year, not 2x32GB. Oh well.

    • dehrmann5 hours ago |parent

      ECC memory has been one of my better investments in the past two years, and not just because of the crashes it might have prevented.

  • tippa1234 hours ago

    This is to be expected from any large corporation. In my experience, this sort of infighting leads to low morale and wastes a significant amount of energy that could be directed somewhere far more productive.

  • aceazzameen5 hours ago

    I really wanted to build a new PC this year, which is obviously not happening anymore. But I do have 2x16GB DDR5 SODIMMs from my laptop that I'm not using, after I upgraded to 64GB a while back. Now I wonder if I can build a tiny PC around those? Does anyone make motherboards that support DDR5 laptop memory?

    • max-leo3 hours ago |parent

      Minisforum offers a Mini-ITX board with a 16-core Zen 4 AMD CPU soldered on for under $400. The AM5-socket version of that same CPU alone is over $500. It uses SO-DIMM DDR5, so it might be an interesting option in your case. (Yes, it is a mobile CPU, but it has the same amount of L2/L3 cache as the AM5 chip, just clocked 300MHz lower.)

      https://store.minisforum.com/products/minisforum-motherboard

    • ineedasername5 hours ago |parent

      A bunch of the NUC models use laptop RAM, and often have barebones kits. ASUS has a decent range of kits and prebuilts, but you may be able to find bare boards too. If you want something expandable, look for the "Pro" and "Extreme" ranges. I had one of the first gaming-oriented NUCs a while back, the Hades Canyon; highly capable.

    • ThatPlayeran hour ago |parent

      There are adapters that convert the laptop memory for desktop motherboards. So that's an option too.

    • bakugo5 hours ago |parent

      https://www.asrock.com/mb/AMD/X600TM-ITX/index.asp

      Can't find it for sale, though. There's also a barebones mini-PC:

      https://www.asrock.com/nettop/AMD/DeskMini%20X600%20Series/i...

  • awongh8 hours ago

    This seems to be for chips put in phones in 2026? I thought these orders were booked further in advance, or is that only for processors?

  • me551ah6 hours ago

    It is absolutely the worst time to be a gamer. First GPU prices went up as NVIDIA focused more and more on their enterprise cards, and now RAM prices are rising too. I don't think I've ever seen the price of computer components go up so much.

  • Barathkanna8 hours ago

    When RAM gets so expensive that even Samsung won’t buy Samsung from Samsung, you know the market has officially entered comic mode. At this rate their next quarterly report is just going to be one division sending the other an IOU.

    • Ericson23148 hours ago |parent

      Overleverage / debt, and refusing to sell at a certain price, are actually very different things though. OpenAI might be a tire fire, but Samsung is the gold pan seller here, and presumably has an excellent balance sheet.

  • itopaloglu838 hours ago

    The manufacturers are willing to quadruple prices for the foreseeable future, but not to change their manufacturing quotas a bit.

    So much for open markets; somebody should check their books and manufacturing schedules.

    • dgacmu8 hours ago |parent

      In their defense, how many $20 billion fabs do you want to build in response to the AI ... (revolution|bubble|other words)? It seems very, very difficult to predict how long DRAM demand will remain this elevated.

      It's dangerous for them in both directions: Overbuilding capacity if the boom busts vs. leaving themselves vulnerable to a competitor who builds out if the boom is sustained. Glad I don't have to make that decision. :)

      • itopaloglu837 hours ago |parent

          I don’t think they’re working at 100% capacity, or that they don’t have any other fab they could utilize for other low-profit stuff.

        Let’s check their books and manufacturing schedule to see if they’re artificially constraining the supply to jack up the prices on purpose.

        • dgacmu7 hours ago |parent

          I'd take the opposite bet on this. They're diverting wafer capacity from lower-profit items to things like HBM, but all indications are that wafer starts are up a bit. Just not up enough.

          For example: https://chipsandwafers.substack.com/p/mainstream-recovery

          "Sequentially, DRAM revenue increased 15% with bit shipments increasing over 20% and prices decreasing in the low single-digit percentage range, primarily due to a higher consumer-oriented revenue mix"

            (from June of this year).

          The problem is that the DRAM market is pretty tight - supply or demand shocks tend to produce big swings. And right now we're seeing both an expected supply shock (transition to new processes/products) as well as a very sudden demand shock.
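
            Those quoted figures hang together, roughly, since revenue ≈ bits × price (the -3% below is a stand-in for "low single-digit"):

              bits = 1.20   # bit shipments up "over 20%"
              price = 0.97  # prices down ~3% (assumed)
              print(f"implied revenue change: {bits * price - 1:+.1%}")  # ~+16%, near the reported +15%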

        • fullstop2 hours ago |parent

          > I don’t think they’re working at 100% capacity or don’t have any other FAB that they can utilize for other low profit stuff.

          I have a family member who works in a field related to memory and storage fabrication. At the moment Micron, etc, are running these money printers full time and forgoing routine maintenance to keep the money flowing.

    • filloooo4 hours ago |parent

      Memory chips have always been a very cyclical business; that's why their stock prices remain relatively low despite the current windfall.

    • davey480167 hours ago |parent

      Most of the things people say about efficient markets assume low barriers to entry. When it takes years and tens of billions of dollars to add capacity, it makes more sense to sit back and enjoy the margins. Especially if you think there's a non-trivial possibility that the AI build out is a bubble.

    • arijun8 hours ago |parent

      If it’s an AI bubble, it would be stupid to open new manufacturing capacity right now. Spend years and billions spinning up a new fab, only to have the bottom of the market drop out as soon as it comes online.

  • potato37328427 hours ago

    You make more money selling the good stuff. It's like this in just about every industry.

    • venturecruelty2 hours ago |parent

      Why bother selling to regular consumers at all then? One or two big companies can have everything, and the rest of us can have nothing. And we will like it.

  • blindriver5 hours ago

    I bought 64 GB DDR4 RAM for $189 in 2022. The exact same memory is now almost $600 on Amazon. How can this not impact PC sales and the sale of other electronics?

    • layer84 hours ago |parent

      It will. Manufacturers who didn’t get good supply contracts in time might be forced to leave the market.

  • rwyinuse3 hours ago

    I'm now glad I bought 128GB of DDR4 when building a new dual purpose server-gaming PC two years ago. The RAM is now worth way more than the rest of the parts combined.

    I wonder how this will impact phone prices.

  • Night_Thastus3 hours ago

    I am sooooooooooooooooooooooo glad I bought a 6000MHz 2x16 kit before all this nonsense started.

    I'll be honest, I have 0 confidence that this is a transient event. Once the AI hype cools off, Nvidia will just come up with something else that suddenly needs all their highest end products. Tech companies will all hype it up, and suddenly hardware will be expensive again.

    The hardware manufacturers and chip designers have gotten a taste of inflated prices and they are NOT going to let it go. Do not expect a 'return to normal'.

    Even if demand goes back to exactly what it was, expect prices to somehow stay >30% higher than before - or, as they would call it, 'market conditions'.

  • dcchambers25 minutes ago

    Seriously concerned for the future of consumer electronics right now.

    Next up: Nvidia exits the consumer hardware space and shifts fully to datacenter chips.

  • DustinBrett6 hours ago

    Ironically, that site was eating up my RAM. PC World has some issues in both Chrome and Firefox.

  • meindnoch6 hours ago

    I bought 2x16GB of Samsung ECC RAM last week for $150.

  • SanjayMehta7 hours ago

    In the 90s, Motorola Mobile used Cypress SRAMs and not Motorola SRAMs.

    Pricing.

  • dreamcompiler5 hours ago

    Once the AI bubble pops there will be smoking deals on RAM (and everything else).

  • DocTomoe8 hours ago

    I feel like we have a RAM price surge every four years. The excuses change, but it always seems to coincide with the switch to the next generation of DDR. Which makes me believe it's not AI, or graphics cards, or crypto, or gaming, or one of the billion other conceivable reasons, but price-gouging when new standards emerge and production capacity is still limited. Which would be much harder to justify than 'the AI/crypto/gaming folks (who no one likes) are sweeping the market...'

    • muvlon8 hours ago |parent

      But we're not currently switching to a next gen of DDR. DDR5 has been around for several years, DDR6 won't be here before 2027. We're right in the middle of DDR5's life cycle.

      That is not to say there is no price-fixing going on, just that I really can't see a correlation with DDR generations.

    • JKCalhoun8 hours ago |parent

      Regardless of whether it is Crypto/AI/etc., this would seem to be wake-up call #2. We're finding the strangle-points in our "economy"—will we do anything about it? A single fab in Phoenix would seem inadequate?

      • jacquesm8 hours ago |parent

        If 'the West' were half as smart as it claims to be, there would be many more fabs in friendly territory. Stick a couple in Australia and NZ too for good measure; it is just too critical a resource now.

        • jack_tripper32 minutes ago |parent

          The west is only smart at financial engineering (printing money to inflate stocks and housing). Anything related to non-military manufacturing should be outsourced to the cheapest bidder to increase shareholder value.

      • fullstop8 hours ago |parent

        Micron is bringing up one in Boise Idaho as well.

      • baiwl8 hours ago |parent

        What will we do with that fab in two years when nobody needs that excess RAM?

        • jacquesm8 hours ago |parent

          There has never been 'an excess of RAM'; the market has always absorbed what was available.

          • jack_tripper31 minutes ago |parent

            Yeah right, tell that to Qimonda.

        • Ericson23148 hours ago |parent

          Sell it at lower prices. Demand is a function of price, not a scalar.

          • h2zizzle7 hours ago |parent

            Tax write-off donations to schools and non-profits, too.

        • JKCalhoun5 hours ago |parent

          I suspect there will be a shortage of something else then…

          And regardless, you could flip it around and ask, what will we do in x years when the next shortage comes along and we have no fabs? (And that shortage of course could well be an imposed one from an unfriendly nation.)

      • xzjis7 hours ago |parent

        It's a political problem: do we, the people, have a choice in what gets prioritized? I think it's clear that the majority of people don't give a damn about minor improvements in AI and would rather have a better computer, smartphone, or something else for their daily lives than fuel the follies of OpenAI and its competitors. At worst, they can build more fabs simultaneously to have the necessary production for AI within a few years, but reallocating it right now is detrimental and nobody wants that, except for a few members of the crazy elite like Sam Altman or Elon Musk.

    • jacquesm8 hours ago |parent

      Why is this downvoted, this is not the first time I've heard that opinion expressed and every time it happens there is more evidence that maybe there is something to it. I've been following the DRAM market since the 4164 was the hot new thing and it cost - not kidding - $300 for 8 of these which would give you all of 64K RAM. Over the years I've seen the price surge multiple times and usually there was some kind of hard to verify reason attached to it. From flooded factories to problems with new nodes and a whole slew of other issues.

      RAM being a staple of the computing industry, you have to wonder if there aren't people cleaning up on this; it would be super easy to create an artificial shortage given the low number of players in this market. In contrast, the price of, say, gasoline has been remarkably steady, with one notable outlier with a very easy-to-verify and direct cause.

      • zorked8 hours ago |parent

        This industry has a history of forming cartels.

        https://en.wikipedia.org/wiki/DRAM_price_fixing_scandal

      • sharpshadow6 hours ago |parent

        There is also the side effect of limiting people's ability to run powerful models themselves. Could very well be part of a strategy.

  • maxglute7 hours ago

    Kdrama on this when?

  • venturecruelty3 hours ago

    It's unfortunate that we will soon not have computers because it is not profitable enough. Alas. Too bad the market is so efficient.

    • dmix2 hours ago |parent

      I'd much rather be in a country where the odd temporary shortage happens because a massive new market appeared than in one where supply and demand are always fixed and static because nothing new gets built without extremely careful planning.

      AI is not going away, but there will be a correction and things will plateau to a new higher level of demand for chips and go back to normal as always. There's too much money involved for this not to scale up.

      Markets can't adapt overnight to tons of data centers being built all of a sudden, but they will adapt.

      • venturecruelty2 hours ago |parent

        >AI is not going away, but there will be a correction and things will plateau to a new higher level of demand for chips and go back to normal as always. There's too much money involved for this not to scale up.

        What will they do when people continue to not pay for this crap and investors demand their pound of flesh? Because uh, nobody's paying for this, and when people are gambling with trillions of dollars...

        >Markets can't adapt overnight to tons of data centers being built all of a sudden but it will adapt.

        Which data centers?

        • dmix2 hours ago |parent

          > Because uh, nobody's paying for this, and when people are gambling with trillions of dollars...

          I pay for 3 different AI products and every person on my team is paying for at least one. Just because some enterprise sales teams rushed to oversell some half-baked AI products they duct taped together doesn't mean there isn't a huge market.

          > Which data centers?

          Microsoft: https://blogs.microsoft.com/blog/2025/09/18/inside-the-world...

          Anthropic: https://www.datacenterknowledge.com/data-center-construction...

          Twitter: https://www.datacenterdynamics.com/en/news/elon-musks-twitte...

          CoreSite: https://www.coresite.com/news/coresite-launches-ny3-data-cen...

          Meta, Google, and Oracle are scaling theirs up too

  • shevy-java8 hours ago

    AI companies must compensate us for this outrage.

    A few hours ago I looked at RAM prices. I bought some DDR4, 32GB only, about a year or two ago. I kid you not: the local price here is now 2.5 times what it was back in 2023 or so, give or take.

    I want my money back, OpenAI!

    • h2zizzle8 hours ago |parent

      This is important to point out. All the talk about AI companies underpricing is mistaken. The costs to consumers have just been externalized; the AI venture as a whole is so large that it simply distorts other markets in order to keep its economic reality intact. See also: the people whose electric bills have jumped due to increased demand from data centers.

      I think we're going to regret this.

      • amarcheschi7 hours ago |parent

        Americans are subsidizing AI by paying more for their electricity so the rest of the world can use ChatGPT (I'm not counting the data centers of Chinese models and a few European ones, though).

    • Uvix7 hours ago |parent

      DDR4 manufacturing is being spun down due to lack of demand. The prices on it would be going up regardless of what's happening with DDR5.

    • Forgeties798 hours ago |parent

      I am so glad I built my PC back in April. My 2x16GB DDR5 sticks cost $105 all-in then; now it's $480 on Amazon. That is ridiculous!

      • basscomm4 hours ago |parent

        I'm also glad I overbought RAM when I did my last PC upgrade in January, because who knows when I'll be able to do that again.

        The 96GB kit I bought (which was more than I needed) was $165. I ended up buying another 96GB kit in June to max out my machine when I saw the price had gone up to $180; I didn't really need it, but I was concerned about where prices were going.

        That same kit was $600 a month ago, and is $930 today. The entire rest of the computer didn't cost that much.

        • Forgeties793 hours ago |parent

          Yeah, I do regret not going 64GB when it was so cheap, but honestly? 32 has been fine. I had already pushed the budget to future-proof critical things (mobo, PSU, CPU, etc.), and RAM will hopefully drop to sane prices again one day. I doubt I'll feel the strain for 3-5 years, if at all. It's mainly a gaming rig right now.

    • toss18 hours ago |parent

      Yup.

      And even more outrageous is the power grid upgrades they are demanding.

      If they need the power grid upgraded to handle the load for their data centers, they should pay 100% of the cost for EVERY part of every upgrade needed for the whole grid, just as a new building typically pays to upgrade the town road accessing it.

      Making ordinary ratepayers pay even a cent for their upgrades is outrageous. I do not know why the regulators even allow it (yeah, we all do, but it is wrong).

      • moregrist7 hours ago |parent

        Usually the narrative for externalizing these kinds of costs is that the investment will result in lots of jobs in the upgrade area.

        Sometimes that materializes.

        Here the narrative is almost the opposite: pay for our expensive infrastructure and we’ll take all your jobs.

        It’s a bit mind boggling. One wonders how many friends our SV AI barons will have at the end of the day.

      • fullstop7 hours ago |parent

        I bought 2x16 (32GB) DDR4 in June for $50. It is now ~$150.

        I'm kicking myself for not buying the mini PC that I was looking at over the summer. The cost nearly doubled from what it was then.

        My state keeps trying to add Data Centers in residential areas, but the public seems to be very against it. It will succeed somewhere and I'm sure that there will be a fee on my electric bill for "modernization" or some other bullshit.

    • bell-cot7 hours ago |parent

      The problem is further upstream. Capitalism is nice in theory, but...

      "The trouble with capitalism is capitalists; they're too damn greedy." - Herbert Hoover, U.S. President, 1929-1933

      And the past half-century has seen both enormous reductions in the regulations enacted in Hoover's era (when out-of-control financial markets and capitalism resulted in the https://en.wikipedia.org/wiki/Great_Depression), and the growth of a class of grimly narcissistic/sociopathic techno-billionaires - who control way too many resources, and seem to share some techno-dystopian fever dream that the first one of them to grasp the https://en.wikipedia.org/wiki/Artificial_general_intelligenc... trophy will somehow become the God-Emperor of Earth.

      • nyeahan hour ago |parent

        It'll be fine.