Hacker Times | new | past | comments | ask | show | jobs | submit | login

Yeah, but all of this is pointless when RAM by itself is as expensive as two CPUs - if it's even in stock. AMD/Intel should focus on that first if they want to save their DIY business at all - which I'm starting to doubt they do



The RAM shortage is... a shortage. It's temporary by nature. RAM didn't suddenly get 4x more expensive to produce, it's just in high demand right now. Supply will eventually catch up even if it takes a few years.

Most of us are not really doing detailed half-decade hardware purchasing plans; for most of us, "the next few years" is really the only relevant time frame.

I have a 9 year cycle for my main machine. Not sure what yours is?

Right now I'm on a 2021 MacBook Pro, I'm debating whether to upgrade to the new M5 Pro laptops or to wait until next year; so that's 5-6 years. My desktop gets upgraded over time, so it's not really on a "cycle"; it's more that every few years, I may decide to get a more powerful CPU, or more RAM, or a new GPU, or more storage, or whatever else. Though I recently (as in, during the past year) upgraded from an AM4 CPU to an AM5 CPU, so that meant I replaced more than usual; everything other than the GPU, storage, power supply and peripherals. (I switched from ITX to mATX for an extra PCI slot so the case had to go; though it still lives on in the form of a lab PC of sorts, along with my old i7 6700k)

But common among all these replacements is that they're not really planned in detail years before. I may weigh up factors like "I don't really need an upgrade right now" against factors like "the market looks like it'll probably get really shitty later this year", or "I could really really use an upgrade right now" against "but the market is shitty now and we're right before a product launch which will shake things up". But I always react to my current or near-future needs/wants and current or near-future market conditions.

So hearing "the RAM market will be good again in 5 years" is completely irrelevant to me. My decisions are entirely based around how the RAM market is right now and how I believe it will look throughout the year or the next.


When someone is in the market for a new computer, they rarely look more than months ahead, no matter how long your cycle is.

> Supply will eventually catch up even if it takes a few years.

I am wondering if this is so true. What resources and time are needed to increase supply by N times to catch up with demand?


Also, supply will only increase if producers believe the current level of demand is going to be sustainable in the long term. Which I don't think anyone really believes.

The same way nobody wanted to invest in mask supply in the US or Western Europe during Covid: producers knew the demand spike wouldn't last and they'd be left with useless equipment to pay off after the crisis passed.


> demand is going to be sustainable in the long term. Which I don't think anyone really believes.

I am one who thinks there is a chance it will sustain. AI is a useful tool; Opus has unlocked an N-times productivity gain for devs since Opus 4.5, which has been available for just 3 months.

This means adoption has just started; it could expand into all kinds of use cases, niches, problems to solve, products, etc., which could mean N times more demand for compute than we have now.


Correct, but if they're right in that belief then it's still just a matter of time before demand goes down and the shortage is solved.

If they're wrong and this is actually a permanent spike in demand, then it'll take the industry a while to realize it but eventually they'll collectively figure it out and increase supply. The ones who figure it out soonest and increase supply fastest will profit the most. The ones who figure it out slowest will lose market share.


The resources are certainly there; the high prices are providing them. It's just a matter of time.

Supply will catch up to demand when demand goes down.

Or they'll just enjoy the high margins and not invest significantly more than they normally would in new production capacity.

Or they're not convinced that betting on continued hypergrowth of AI is a good idea.

And thereby leave a bunch of profits on the table while simultaneously losing market share to competitors who do invest in more capacity?

Which competitor is going to invest in increasing their capacity when everyone expects the demand to decline sooner rather than later?

If demand declines then the shortage is still solved regardless.

If it doesn't decline, then anyone who took that risk and increased their production capacity will benefit greatly, and those who didn't will lose market share.


They're in business to make money not to solve the shortage. If they invest in new fabs and the bubble pops they're sitting on idle fabs.

And if they invest in new fabs and the bubble doesn't pop then they make a whole lot of that money they're in business to make.

The incentives here are naturally very well aligned with solving the shortage. If doing nothing is likely to solve the shortage, then they'll do nothing. If increasing supply is likely to solve the shortage, then they'll increase supply. If there's a 50/50 chance of both, then some will increase supply and some will do nothing, and the market will reward whichever group was right and punish the other.


Building a new factory would cost $20 billion and take 3 to 4 years [0,1]. With that chip output capacity and AI-boom profit margins, it would take just under a decade to break even. If the bubble bursts and chip prices return to pre-boom levels, then it would take over 30 years to break even.
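Those break-even claims are just a capex-over-margin division. A back-of-envelope sketch, where the annual profit figures are assumptions chosen only to reproduce the "~10 years vs ~30 years" spread, not data from the linked sources:

```python
# Rough fab payback periods using the $20B build cost cited above.
# Annual profit figures are illustrative assumptions, not real data.
CAPEX = 20e9                      # $20B to build the factory

def payback_years(capex: float, annual_profit: float) -> float:
    """Years of profit needed to recoup the build cost."""
    return capex / annual_profit

boom_margin = 2.1e9               # assumed yearly profit at AI-boom prices
pre_boom_margin = 0.65e9          # assumed yearly profit at pre-boom prices

print(f"boom:     {payback_years(CAPEX, boom_margin):.1f} years")     # ~9.5
print(f"pre-boom: {payback_years(CAPEX, pre_boom_margin):.1f} years") # ~30.8
```

The point of the arithmetic is that the downside scenario stretches the payback past the useful life of the equipment, which is why "do nothing" can look rational.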

Ford had almost $20 billion in EV manufacturing investments planned for the mid-to-late 2020s; the abrupt end of the EV subsidies cost Ford billions of dollars and they have abandoned multiple investments.

If you do nothing, you are still rewarded because you are making pure profit in either scenario. If you invest billions then you are digging out of that for years regardless, and could be in the hole for decades if you bet wrong.

[0] https://www.construction-physics.com/p/how-to-build-a-20-bil...

[1] https://techovedas.com/what-does-it-take-to-build-a-semicond...


Correct, if you invest billions then you are taking a risk which could either pay off or not depending on whether you're right about that investment being an efficient use of society's resources.

Existing fabs are currently reaping the rewards of their previous wise investments in building out this capacity in the first place right before a shortage. Without those past investments, the shortage today would be even worse. Future rewards will be allocated based on who correctly predicts the best use of current resources to meet future demand (whether that includes huge investments to build even more fabs or not is yet to be discovered).


You forgot to say “Amen” at the end of your gospel.

I'm always baffled how seriously some people take that “market and incentives always lead to the greater good” religion despite plenty of empirical evidence against it.

But hey, there are creationists out there too so it's not too surprising I guess.


> “market and incentives always lead to the greater good”

I never said anything of the sort, I'm just explaining basic economic facts. Feel free to pretend they aren't true if you wish, but if you do then for all our sakes please stay as far away from the levers of economic policy as possible.


These aren't “basic economic facts”, these are basic pieces of economic liturgy, it's not grounded in any factual reality. It only works in microeconomics lalaland.

In the real world, executives simply aren't being incentivized for all-or-nothing risk taking and the shareholders of public industrial companies don't want the executives to make gambles in an attempt to make a big payout, they want steady yields with limited risks. And the financial actors who would be ready to take this kind of “high risk high reward” bets aren't operating in these kinds of markets.

Just look out the window: semiconductor and electronics fabs aren't rushing to expand their capacity. They are very familiar with the problem of oversupply and are always very cautious before making capacity investments, because they know it could very well make them go under very quickly should the market reverse.

The more time passes, the more convinced I am that nothing did more damage to the broader public understanding of economic dynamics than the average Econ 101 class.


The real world is indeed more complicated, but not in a way which renders the basic rules of economics untrue. What you're doing is the economic equivalent of pointing at an airplane and saying "Look at that! And you really still believe gravity exists?!? What an idiot!" I'm just pointing out some simple facts which tell us the plane will eventually come back down.

Except the “rules” in question are more akin to “the earth is the center of the universe” than they are to gravity. There's a strong religious aspect to them that makes them stick no matter how many empirical refutations have been made.

China is coming online :(

Why the sad smiley? I'm pretty happy Chinese tech is stepping up to break the current memory cartel.

They can't fix stupid.

Let me describe this in the simplest terms possible: You have speculators speculating about AI products. The speculators are not very smart when it comes to technology, and think RAM is RAM. There are at least three kinds of RAM that are important to this: DDR for system RAM, GDDR for GPUs, and HBM for high density enterprise products, and they are not interchangeable, there is no one-die-fits-all solution.

So, these speculators are like "oh no, more GPUs require more RAM!", and then just start speculating on all RAM. Which of these RAMs is the one they need to worry about? Exclusively HBM, which is a minority of production; DDR and GDDR dominate production.

If you're into inference, and have older machines, you're buying Hxxx or Bxxx cards that use HBM, fit into dual slot x16 configurations, and you're jamming (optimally) 8 of them in. If you're into hardware that is newer, somewhere in the middle of the inference boom, you're using MXM cards. In either situation, the host machine has DDR, but if you're OpenAI, Anthropic, Microsoft, or Google, you're not building (more) inference machines like this.

The first two are buying Nvidia's all in one SBC solution: unified HBM, onboard ARM CPU to babysit the dual GPUs, has its own dual QSFP network controller that can RDMA, etc. No DDR or GDDR involved. Any machines built before this platform are being phased out entirely.

Microsoft is doing the same, but with AMD's products, the MI series that co-locates Epyc-grade Zen 4/5 CCDs with CDNA compute chiplets, running the entire thing off HBM, thus also unified and no DDR/GDDR needed. They, too, are phasing out machines older than this.

Google has a mix: they offer Nvidia all-in-one SBCs as part of GCP for legacy inference tasks (so your stack that can't run on AMD yet still can run), they offer the same MI products that Microsoft offers via Azure's inference product, and they also have their own TPUs that some of Gemini runs on; the TPUs run on HBM afaict. No DDR or GDDR here.

So, what does AMD or Intel do here? Let's say they waste fab time to make their own dies on the wrong process (TSMC and Intel Foundry do not have RAM-optimized processes)... they would be producing DDR and GDDR for a market that almost has its entire demand met. Intel lacks the die-stacking technology required to build HBM, and TSMC I think can't do it for that many layers (HBM has 8 to 16 layers in current-gen stuff, iirc).

Micron, for example, is already bringing two large factories online here in the US to meet the projected growth in demand for the next 20+ years. When these factories finally start producing, it will not change the minds of speculators: they still seem to think AI datacenters need RAM, of any kind, and refuse to understand even the most basic nuance. Also, when they come online, HBM will be a minority product; the AI inference boom is still just a bump in the road for them.

Nvidia kinda screwed their consumer partners, btw: they no longer bundle the GDDR required for the card with the purchase of the die. There is a slight short-term bump in GDDR spot prices as partners build up war chests to push series 60 GPUs into production; once that is done, spot prices will return to normal (outside of the wild speculative manipulation).

One last thing: what about LPDDR, used by AMD Strix Halo and Apple stuff? Speculation seems to have not actually affected it. I consider it a sub-category of DDR (and some dies seem to work as either DDR or LPDDR as of DDR5, due to the merger of the specs by JEDEC), but since it isn't something you find in datacenters, it seems to have avoided speculation.

The Ryzen Max CPUs mentioned in the linked article? Uses LPDDR. Doubling down on the Ryzen Max product line might be a brilliant move.


> The speculators are not very smart when it comes to technology, and think RAM is RAM. There are at least three kinds of RAM that are important to this: DDR for system RAM, GDDR for GPUs, and HBM for high density enterprise products, and they are not interchangeable, there is no one-die-fits-all solution.

The commenter is also not very smart and does not realize companies making the RAM can trade capacity of one for another, and any re-tooling at current prices is still profitable.

The commenter also does not realize that this is also true for lines currently making SSDs.


They can trade capacity, but they generally aren't. The huge storage-only fabs owned by Samsung and Micron do runs that go for 9 months to 12 months.

Flash chips haven't been speculated on nearly as hard, and are suffering from the same sort of weird lack-of-nuance. Samsung, for example, isn't reassigning capacity to meet some sort of phantom datacenter demand that isn't already there, generically, across all datacenters, AI or not.

A lot of the SSD price skyrocketing is largely "SSDs have RAM on them for cache", not "SSDs have flash chips, and they're both made at the same fabs"... which oddly affects low-end SSDs that don't have external cache.

To make it worse, for the speculators who do understand this (because it isn't some universal homogeneous group), the flash chips that go into enterprise SSDs aren't the same ones that go into consumer SSDs.

The Big Three still aren't doing any major re-tasking of capacity, as actual global demand isn't outstripping supply any more than normal. There is no short-term problem to fix; speculators are just gonna have to stop hoarding toilet paper like it's the start of Covid.

Edit: Oh, and if you want to ask how AMD/TSMC or Intel solve this? They can't, for the same reason making their own in-house HBM isn't happening.


Both Western Digital and Kioxia have reported their 2026 Flash/Hard drive production capacity is sold out.

Micron killed Crucial to focus on AI.


I'm glad Kioxia (formerly Toshiba) have been able to do that. However, I also know they've been having problems meeting demand for quite some time, and haven't been able to scale up nearly as fast as the big three have. There was an incident in 2019 and another in 2022 that killed entire runs of chips and screwed them during the Covid datacenter rush.

Micron killed Crucial because Crucial was a weird offering that competed with their own partners. This was always a weird problem, and it just didn't make financial sense to continue with it. One of the analyses I read said Crucial was less than 12% of sales.

Like, don't get me wrong, I've liked many Crucial products over the years, and even recommended some of them, but it was always weird they were trying to out-compete companies like Adata and other major ODMs.

The counterexample to this is Nvidia, absolutely trying to kill their partners and going to first-party assembly and sales of products. Nvidia isn't even going to PNY anymore for ODM needs, but directly to Foxconn.

Micron execs claiming it's because of AI is a bit weird and revisionist, because they've been working on exiting the Crucial brand since long before they publicly announced it. The public didn't learn of any such plans until right before the Ballistix brand sunsetting was announced in 2021, but it started years before that. Like, I know they're just playing to their shareholders, but it's still a bit weird.


When did Nvidia drop PNY as the ODM for their reference cards? I recall my A5000 (now 2 gens old) was made by PNY.

As far as I know, in the current lineup PNY still makes the workstation cards, possibly also the x16 server cards, but Foxconn is doing the Blackwell SBCs and MXMs, and those SBCs are a pretty big chunk of Nvidia's income right now. I also believe they have moved to Foxconn for the Founders Edition consumer cards.

Also, with the FEs, their partners are disallowed from making equivalents of their own, even if they make their own PCB from scratch, not based on any existing Nvidia design. It doesn't matter who makes the FE; it immediately puts partners at a great disadvantage if they can't make one too.


Samsung and SK Hynix have moved all of their capacity over as well IIRC.

Reminds me of "false sharing" effect: hidden common dependency and bottleneck for what looks like independent variables on the surface.

> So, these speculators are like "oh no, more GPUs require more RAM!", and then just start speculating on all RAM.

Are you claiming that these speculators are buying DDR5 RAM and warehousing it somewhere? Or what exactly is the mechanism you are proposing here?

To me it seems much simpler: AI companies want HBM, and HBM and DDR5 share the same wafer production process and facilities, but the HBM process is much more fragile and takes three times the wafer production.

There isn't enough DDR5 RAM being produced, so prices go up.


Those micron factories won’t even be targeted at consumer-grade RAM though, right?

There is no such thing as "consumer grade RAM". Servers still take DIMMs; ECC DIMMs just have more chips on them (previously 9 instead of 8, now 10 instead of 8 as of DDR5; you'll see some DDR5 DIMMs with 5 instead of 4 because they're double-die packages).

Micron, Samsung, and Hynix just basically sell you chips that comply with the JEDEC spec, and the DIMM manufacturers further bin them according to purpose. The highest end chips (that are stable at high clocks and acceptable voltages) end up in enthusiast performance products, the ones that don't work well at all but still meet JEDEC spec are sold to Dell/HP/Lenovo/etc for Grandma's Facebook machine, and the ones that are exceptionally stable at thermal design limits are plunked onto ECC DIMMs and sold to servers.

Also, as others have mentioned, it's just a fab, and it can make any of the dies they're able to make. Whatever needs to be made to meet demand, they make; they just can't turn on a dime and react to quarterly concerns, and are locked into cycles that may range from 6 months to 18 months.

Side note that is also worth mentioning: sometimes you can order special bins of parts with features that wouldn't normally be available, if you're willing to order enough. A recent example is Nvidia buying overclocked GDDR6 chips from Micron with additional features enabled; Micron was more than happy to become Nvidia's exclusive supplier for the custom GDDR chip if Nvidia was willing to buy out the entire run. Stuff like this happens every so often, but isn't the norm.


If there’s no such thing as consumer grade RAM, then what did Micron’s announcement mean?

https://investors.micron.com/news-releases/news-release-deta...


Re-read the previous comment again.

You just need an additional chip to move from "consumer grade" (ie no parity) to "server grade" (ie with parity). ECC support is actually in the memory controller, which has been in the CPU for the last 15 years. No magic.


The announcement means that they're closing Crucial - just like it says in the title and the first paragraph. The rest of that press release is outlining the mechanics of how that works + some fluff. Micron is going to continue producing the exact same memory chips in the exact same fabs. They're just not soldering it to a board, slapping the Crucial logo on it, and selling it directly to consumers. There's nothing stopping downstream vendors from buying Micron chips, soldering them to a board, and selling them to consumers as Micron was doing previously.

There's nothing in that press release that implies that the memory was somehow different (or "consumer-grade"). The _only_ thing they're saying is that they're ending their B2C business and focusing on B2B.


Didn’t you just describe the literal difference between consumer-grade RAM soldered to a consumer-format board vs a memory chip sold to a company to be soldered onto that other company's product?

Calling it "consumer-grade RAM" is inaccurate - RAM is RAM. When you solder it to a board, you now have a DIMM that is carrying RAM chips. It's a semantic difference, but it's important.

So where are all these speculators storing DDR5, flash, and even spinning hard disks? Asking for a friend.

As a small buyer of all of those things supply at nearly any price has gotten very difficult to reliably predict week to week. When a lot of 100 64GB DDR5 sticks shows up available at a vendor, it’s at a take it or leave it price good for a couple hours. If I don’t pull the trigger they have another buyer for it and I might be waiting another month before anything becomes available again. We can no longer JIT for even failure replacement on our edge nodes.

Then you have the NVMe and even SATA SSD shortages. There's still a bunch of very useful hardware out there; I would love to find a decent deal on 8TB SATA so I could repurpose it. It just doesn't make any sense right now at current pricing and availability. Good luck trying to even find a batch of 12 of these disks at a time.

This goes for both enterprise and even prosumer gear I was willing to take for some of these uses.


It's mixed. Some of it really is Covid toilet paper behavior.

Datacenter customers, for example, have repair parts on hand: boxes of hard drives/SSDs waiting to be put in, boxes of consumable parts, DIMMs waiting to replace ones that went faulty, entire machines already racked and waiting to take over for their fallen siblings, etc. Some of these customers added more to the spare parts pile. The big clouds manage their elastic demand for any sort of consumable or repair part in volumes described in terms of cargo trucks per quarter, and they've already compensated.

Now, otoh, you have the truly psychotic people that fill their basements with toilet paper, hoarding more than they could ever use in their entire lives. We've all seen that story where a guy was going to lose his house because he blew his mortgage money on toilet paper, and was selling it at a loss just to stay afloat. People like this exist in every crisis, and there's gonna be a headline in the near future where someone loses their house because they had like a hundred trays of DIMMs in their basement.

A few people I know who scrape eBay for electronics like it's their job are just waiting for people to start fire-selling the DIMMs and SSDs that got hoarded and couldn't be scalped; they're expecting half of MSRP or better sometime later this year.


> what about LPDDR, used by AMD Strix Halo and Apple stuff? Speculation seems to have not actually affected it

Good luck actually finding them in stock with 128GB+ RAM. I got a Strix laptop a while ago; now the price in the EU is technically the same, but there's no stock. Maybe a month or three.

There is also the claw hype. And large Qwen3.5 models can run very well on DDR5 CPUs or Mac minis...


I find the panic over RAM prices to be overblown. 32GB DDR5 RAM is around $500 which is comparable to the 9800x3D. Sure, it sucks that it increased by around 4x, but when you factor in the overall price of a top-end PC at around $1000-2000, with the lion's share going to the GPU, the increase is marginal.

This only affects a very narrow slice of highly budget-conscious consumers trying to build high-end PCs at razor-thin margins.


$500 for 32GB is about $15/GB which is a high we haven't seen since the mid-2000s. This is a big deal, it turns RAM and to some extent storage (especially fast storage) into a massive economic bottleneck.

> since the mid-2000s.

Did you adjust for inflation ?


Adjusted for inflation, the last time prices (/GB) were this high was May 2011; the tail end of the 2009/2010 shortage. Aside from a brief glut in 2008, it wasn't really cheaper before (than it is now) though. Of course RAM is much faster these days, but also in 2011 most people had no more than 4 GB of system memory and 512 MB VRAM.

https://web.archive.org/web/20240805053759/https://jcmit.net...

https://thememoryguy.com/dram-prices-hit-historic-low/

Inflation applied manually; https://www.bls.gov/cpi/

https://www.neowin.net/forum/topic/983036-latest-steam-hardw...

Steam hardware survey GPU history: https://www.youtube.com/watch?v=wHTdnIviZTE
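The inflation adjustment being argued about here is just a multiplication by a CPI factor. A minimal sketch, where the 2011 nominal price and the ~1.4x factor are illustrative assumptions rather than values from the linked sources (real factors can be looked up via the BLS CPI data above):

```python
# Inflation-adjusting a historical $/GB price into today's dollars.
# The CPI factor and 2011 nominal price below are rough assumptions
# for illustration only; consult bls.gov/cpi for real values.
def to_todays_dollars(nominal_price: float, cpi_factor: float) -> float:
    """Scale a historical nominal price by cumulative CPI inflation."""
    return nominal_price * cpi_factor

price_per_gb_now = 500 / 32            # ~$15.6/GB at current spot prices
nominal_2011 = 11.0                    # assumed nominal $/GB in May 2011
adjusted_2011 = to_todays_dollars(nominal_2011, 1.4)  # assumed ~1.4x CPI

print(f"now:  ${price_per_gb_now:.2f}/GB")
print(f"2011: ${adjusted_2011:.2f}/GB in today's dollars")
```

Whatever the exact factor, the comparison only becomes meaningful once both prices are in the same year's dollars, which is the point being made.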


Inflation since the 2000s cannot possibly make up the difference in price we’ve seen in just the last 6 months.

That was not entirely my point; my point is that citing prices from the 2000s and comparing them with modern ones (with an inflation index of about 2x), regardless of the underlying reason, is either a demonstration of laziness or innumeracy, or even worse, an attempt to manipulate.

It’s not laziness, innumeracy, or manipulation when it can be taken at face value that the cost increase vastly outstrips anything that could be attributed to inflation. You don’t even need to look it up to know that.

> when it can be taken at face value that the cost increase vastly outstrips anything that could be attributed to inflation

But that was not my point _whatsoever_. What I said is: every time you bring up explicit numbers (like the GP's "$500 for 32GB is about $15/GB which is a high we haven't seen since the mid-2000s") you _absolutely_ have to adjust for inflation to have a meaningful conversation. That is it.


Ram is clearly way more expensive now, yes?

Did you adjust for the technological improvements that pump out more chips per wafer compared to the mid-2000s due to node-size shrink?

That was not entirely my point; my point is that citing prices from the 2000s and comparing them with modern ones (with an inflation index of about 2x), regardless of the underlying reason, is either a demonstration of laziness or innumeracy, or even worse, an attempt to manipulate.

No, I didn't adjust for the huge inflation in average RAM requirements since the mid-2000s.

Not the point - my point is purely arithmetical in nature: the dollar has had substantial inflation, and any comparison more than 5 years apart, let alone 20, warrants adjusting prices, as the error is substantial - about 2x in the case of the 2000s.

You demand specific data points but respond with vague handwaving and general statements about the importance of calculating inflation, as if it represents more than a small fraction of the overall increase in RAM cost.

> more than a small fraction of the overall increase in ram cost

There is nothing vague about the question of whether prices were scaled or not (and in this case the coefficient between USD in 2000 and 2026 is a pretty unambiguous ~2x); otherwise there is no point in comparing these numbers, just as there is no point in comparing inches and cm without declaring beforehand which number is which.


You are perfectly capable of looking at the rate of inflation since the mid 2000s and seeing that it only tells a small portion of the story.

You cannot possibly look at the price of RAM now compared to six months ago and be so fixated on including inflation. Obviously inflation occurred, and obviously after 20 years it has an impact on price. But we are all on HN and all know what inflation is, so forcing people to drill down on its contribution in order to advance the conversation, when it clearly only accounts for a small portion and we all know it's a factor, is absolutely ridiculous. You know this, we know this, and yet here we are still talking about it. I may as well explain what RAM is if we want to get this elementary about things.

True or false: RAM has become substantially more expensive in the last 6 months in a way that cannot be meaningfully explained by inflation.

There is a very clear, very obvious answer here. Inflation or not.


> You are perfectly capable of looking at the rate of inflation since the mid 2000s and seeing that it only tells a small portion of the story.

Have no idea why you are keeping arguing about something which was not my point to begin with.


$500 is 5x what it cost less than a year ago, just for context. It turns a $1600 computer build into a $2000 one. That’s a huge difference.

Edit: I don’t get your math. If we’re using a very generous definition of “top end,” even neglecting Nvidia and going AMD - which some would argue makes it not top end - you’re talking conservatively $600 for a GPU, $500 for 32GB of RAM, and $500 for a CPU. That's $1600 before PSU, case, SSD, fan(s), mobo…there’s no world in which you’re coming in under $2k. The SSD and board will put you over immediately.
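Tallying that out: the GPU, RAM, and CPU prices are the commenter's figures, while the remaining line items are assumed typical prices (not from the thread), just to show how quickly the total clears $2k:

```python
# Rough tally of the hypothetical AMD build described above.
parts = {
    "GPU (AMD)":   600,   # commenter's estimate
    "32GB DDR5":   500,   # commenter's estimate
    "CPU":         500,   # commenter's estimate
    "motherboard": 180,   # assumed
    "SSD":         150,   # assumed
    "PSU":         100,   # assumed
    "case + fans": 120,   # assumed
}

core = parts["GPU (AMD)"] + parts["32GB DDR5"] + parts["CPU"]
total = sum(parts.values())

print(core)   # 1600 -- before PSU, case, SSD, mobo
print(total)  # 2150 -- already over the $2k mark
```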

You’re talking 3/2025 prices, not 3/2026. A compromise mid-range computer is $1500 to build now.


A 5080 is 1.5K, a 5090 is even more. $1600 to $2000 is not really a large difference at the price band where you are spending that much money, especially since you would be heavily compromising on other components if you want to keep that budget, in which case you don't need 32GB of RAM.

That is to say, if you want a system that keeps up with 32GB of RAM, you'd already be willing to spend a lot, what with options for Noctua fans, water cooling, higher-end mobos, premium cases, OLEDs, etc. If you can't afford that then you won't be buying expensive DDR5 RAM either.


A 9060 is like $450, an XT is like $550. Depending on what you’re using that computer for it could be more than enough firepower. There are tons of people not paying the Nvidia tax because they have a plenty viable build with AMD.

I built my current PC (9800X3D, 9060, 32GB DDR5) last April for about $1800. It would cost almost $3000 now between storage and RAM increases. The economics have completely shifted. Everything is more expensive except basically the PSU and case.


The 9060 is mid-tier; buying a 9800X3D and DDR5 RAM is overkill because the GPU won't keep up with their performance.

AMD has no equivalent to Nvidia at the high end; it isn't a tax so much as a functional monopoly.


We aren’t debating AMD vs. Nvidia and I shouldn’t have gotten distracted with it tbh. I am talking about what it takes to build a computer now.

RAM and storage have ballooned PC costs. That’s the issue. Whether you are buying an AMD GPU or an Nvidia GPU, they are still substantially increasing build costs. Nobody is spending $1500 on an Nvidia GPU and then going “well, nothing else matters now.” The RAM and storage have gone from $200-$300 to $800-$1000. That’s still a huge portion of the budget. They’ve gone from near-line-item status to 1/3rd (or more) of the cost. Affordable builds have become incredibly difficult to achieve.


Also just randomly realized I kept saying 9060 when I meant 9070.

Look at my comment above and see what I said about my build. I was unwilling to pay that for a moderate build that can sustain my computer use. Utilize bundles that can save you money on the RAM, or on the chipset/CPU, to skip some of the costs.

You aren't specific in your comment. Where are these bundles? What do you do with all the parts you don't need/end up swapping out? How much are you actually saving?

I went to Micro Center and they usually have decently priced bundles. The RAM was G.Skill Flare 32GB sticks. I'm not being specific in the previous post because there are many ways to save depending on whom you're willing to trust: Best Buy open box/new, Micro Center open box/new; Walmart has pretty good prices depending on what you're looking for. eBay is iffy depending on the product; GPUs are expensive at the moment.

Microcenter is not available for the vast majority of us - they don’t ship. The nearest one to me is an almost 8hr drive and I live in a major city. I’m not spending 2 days and $200-$300 on gas/food/lodging to get there and back.

Best Buy is selling RAM and storage at the same cost as everyone else. I imagine Walmart is not much better. I’m also not sure what you do with all the bundle parts that you don’t need. Do you sell them? Where do you sell them?

What deals did you take advantage of? What did everything cost you in the end, and when did you build? If you don’t feel like answering that’s fine, but it’s valid to remain skeptical given all the evidence to the contrary. Perhaps you’re just really good at finding deals, but you can look around this thread and see that we are all telling the same story. Building a computer has gone up $600+ for common builds over the last 4-5 months, on top of the already inflated GPU prices we’ve been experiencing for years. If you put my exact build from last April into PCPartPicker, it is an additional $500+ to build now, and that’s with an AMD GPU to keep costs down.

It’s strange times when Mac minis are a budget-friendly computer. Building a half-decent PC for less than $1500 is a serious challenge now. Things are so volatile that Valve still hasn’t released, or even set a price for, the new Steam Machine.


Micro Center is useful if it's near you; otherwise you're out of luck. Deals at other stores depend on your region and location; the same store gives different deals and discounts based on regional selling trends.

I'm not sure what you're expecting to hear. What do I do with parts I'm not gonna use? What are you talking about? Don't get the bundle if you're not going to use what the bundle comes with, simple as that. Have you not shopped before?

Currently, computer components are not cheap, and it does not look like it's getting any better.

I currently have two moderately good laptops that I'll either sell or keep as backups.


I disagree with you. The issue does not only affect a “very narrow slice” of consumers. https://www.techspot.com/news/111472-hp-warns-ram-now-makes-... A major brand is now suggesting that this is a “new normal” and that one solution is to just offer systems with less RAM. This is an issue when lots of modern software seems to expect an unending supply.

That is an insane amount of money for just 32GB of RAM! That's what we were paying back when it was hard to use more than 32-64GB in a desktop setting. These days, with all the Electron and Node bloatware, containers everywhere, and AI, 32GB doesn't get you far.

> 32GB DDR5 RAM is around $500 which is comparable to the 9800x3D.

Apples to oranges. Why are you comparing RAM prices to CPU prices? It's different hardware.

$500 for 32 GB is insane. Just 18 months ago, I bought 128 GB of DDR5 for only $480.


>"overall price of a top end PC at around 1000-2000"

All 4 of my "top end PCs" have 128GB RAM. My server (I self-host everything) has 512GB. Lucky for me, all were bought before this insanity.



