
It's a paramount imperative for Europe to wean itself off fossil fuels, regardless of environmental arguments (which are still extremely relevant). Finding a safe, unfettered supplier of fossil fuels is becoming a basically unsolvable problem. China is trying to build as much solar and nuclear capacity as humanly possible; we should do the same. We've been having these energy shocks since the Yom Kippur war, basically; it's a cycle of instability and crisis we never seem to break. There's no shale to be found in Europe; we only have wind, sun and nuclear to fall back on. And maybe geothermal pretty soon?

I don't understand leadership thinking. Surely spending €250B on a continental-scale renewable energy project would have a relatively short payoff time (on a country scale), given the instability of relying on foreign energy sources. I mean, how long does oil have to sit above $100/barrel before it costs everyone that much anyway?
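
To put a very rough number on that (back-of-envelope only; the import volume below is my own order-of-magnitude assumption, not a sourced figure):

    # Back-of-envelope: how fast does 250B euros disappear into oil imports?
    # ASSUMPTION: EU oil imports on the order of 10 million barrels/day,
    # and dollars and euros treated as roughly 1:1 for simplicity.
    barrels_per_day = 10_000_000
    price_per_barrel = 100                           # the $100/barrel from above
    daily_bill = barrels_per_day * price_per_barrel  # ~$1B per day
    print(250e9 / daily_bill)                        # 250.0 -> about 250 days

By that crude math, €250B is well under a year of import spending at crisis prices.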

The recent "Offshore Wind Investment Pact" announcement was aiming for €1 trillion of investment in North Sea wind by 2040.

Plus there's lots of other stuff happening. Also lots of pushback from those clinging to fossil fuels.


> spending €250B on a continental scale renewable energy project

Let me stop you there: the EU budget for 2026 was €193B. https://eur-lex.europa.eu/budget/www/index-en.htm

Basically, Europe doesn't have political leadership, nor does the EU itself have a budget that dwarfs the member states' the way the US federal budget does. On top of that, the EU (primarily at Germany's insistence) has imposed "fiscal discipline" rules which prevent running a large short-term deficit in order to make this kind of capital investment.

Also, two hundred billion Euro is a lot of money for anyone who isn't an AI startup.


What is the value of the EU if it's not coordinating multi-national scale efforts?

This would need to be a joint venture, as some places are really good for wind and some places are really good for solar, but not every country has access to those locations on its own. The EU budget doesn't matter here, because this project would be a separate line item with its own funding.

Energy independence is extremely valuable. Way, way more valuable than $250B, or even $500B or $750B for that matter. Society runs on energy, and if it's not fully yours, you are always one rug pull away from social collapse.

If 2022 had been a cold winter, and America had had a cold leader, this project would probably have breezed through the bureaucracy in a week.


> What is the value of the EU if it's not coordinating multi-national scale efforts?

Remember the EU is just a fancy self-updating free trade agreement, not a nation.

The coordination that the member states have thus far allowed the EU to take responsibility for is ~ "make all our rules be equivalent so everyone's degrees are accepted everywhere, everyone's food is accepted everywhere, we all agree what counts as a safe consumer product, limited range for tax shenanigans, etc."

(And for this, they get denounced as "complex" and "bureaucratic").

Actual direct investments do exist (I just missed out on one for startups 20 years back, apparently due to a rules change), but they're peanuts compared to what member state governments do directly.


>> Energy independence is extremely valuable.

Not if your top politicians are on Putin's payroll, like Orbán and Merkel.


The EU is not funding projects directly; it's setting the rules. Individual governments pay the bill.

Is this perhaps changing, now that Macron has indicated the €300B Europe has been investing in the US annually will be kept and invested in Europe?

Macron says €300B in EU savings sent to the US every year will be invested in EU - https://hackertimes.com/item?id=46722594 - January 2026 (207 comments)

Europe can go fast when it wants to.

How Europe Ditched Russian Fossil Fuels With Spectacular Speed - https://www.bloomberg.com/news/features/2023-02-21/ukraine-n... | https://archive.today/yxGp2 - February 21st, 2023

> But what the past year has shown is that it’s possible to go harder and faster in deploying solar panels and batteries, reducing energy use, and permanently swapping out entrenched sources of fossil fuel.

> Solar installations across Europe increased by a record 40 gigawatts last year, up 35% compared with 2021, just shy of the most optimistic scenario from researchers at BloombergNEF. That jump was driven primarily by consumers who saw cheap solar panels as a way to cut their own energy bills. It essentially pushed the solar rollout ahead by a few years, hitting a level that will be sustained by EU policies.


This is not entirely correct. The EU actually does both — it sets regulatory frameworks and funds projects directly through several mechanisms.

It's a mix of decades of brainwashing, the fossil lobby having the bigger paychecks, unstable times, and leaders' fear of investing in something new and uncertain. In every industry and organization there's the old saying that nobody ever gets fired for using the established solution. This is the same situation: there is more motivation to stay on the known paths, especially when there is strong propaganda against the new ones.

The problem is that 30-50% of voters would just look at that and say:

Why are you spending €250B on corporate subsidies instead of giving us €250B?!


I think the lesson of the UK winter fuel subsidy payment is that while it feels great in year one, it doesn't actually solve any of the problems, and then the voters get incredibly mad if you try to take it away again.

Everything of importance ever done in the EU was done in response to a major crisis, in no small part because these exact voter emotions are dampened in such times.

This can be viewed as a feature not a bug. The defining feature of a republic is stability. Orderly and lawful transfers of power. Not rocking the boat. Deliberative processes. If the people are enjoying prosperity and peace why make drastic changes? So yes when the situation is extreme that's when big shifts in policy happen.

"Never let a good crisis go to waste."

>The problem is that 30-50% of voters would just look at that and say: Why are you spending €250B on corporate subsidies instead of giving us €250B?!

Why is it a "problem" for voters (aka the taxpayers) to ask their leaders to justify how their tax money is being spent? To me this feels like the basic transparency that keeps democracy in check.

To me, the problem is politicians who can't or won't answer those questions, because then either they're grifting or they're incompetent.

It's not like we don't have a laundry list of mismanagement, *cough* corruption *cough*, of governments spending money on bullshit with nothing to show for it, while stuff like healthcare keeps being underfunded.

So yeah, if you spend my money, you better have an answer.


Wind, sun and geothermal we have, albeit the technology to harvest them seldom comes from Europe. But finding a safe, unfettered supplier of nuclear fuel risks being just another unsolvable problem.

[flagged]


You've been told multiple times that this is a lie by omission. Why do you persist with it?

Honest question: why would anyone use Vim and not Neovim nowadays? I switched, what, 12 years ago? And I've never had to look back. Just curious, to be honest. Especially since Neovim is full of new features, while the Vim9 scripting language kind of tanked.

I'll field this one as someone who has used regular ol' Vim for ~18 years and never switched. Why switch if your tool is working fine? I use vim literally every day all day long and it does everything I need it to do. Switching has a cost and there's no reason to pay it if it's working fine.

I originally switched because neovim was more polished, had better plugins and Lua config files. I then never had a reason to go back

I think I first switched because neovim supported the XDG config location, I could have ~/.config/nvim/init.vim instead of ~/.vimrc.

I ended up switching for plugin support. Other than that, unless you want to use Lua for your config files, I don't see a reason to switch either.

Because I don't choose what tools are available on every server at work, and it's guaranteed that at the very least old-school vi is installed on every linux server, and often vim. Maintaining that muscle memory is useful.

I used to think this too, but I've been routinely switching back and forth between neovim and vim for close to a decade now, and I've never noticed. In fact I often don't even notice which one I'm using unless I explicitly check. Once you add neovim-only plugins that can change, of course, but if you can't choose what tools are available on the server then I would imagine you're not installing plugins anyway.

One reason might be how off-putting the Neovim community is, hijacking Vim discussions to denigrate an all-time-great, beloved work of technology and its creator (who did decades of work for free, gave it to the world, and gave any money to actual orphans) all for Neovim users'/devs' own egos, promotion, and obsession. Almost all of Neovim was made by Moolenaar, from concept to execution, and I don't know that I've ever seen any gratitude.

I've never seen Vim users do that. If I had to choose, I'd use Vim.


Just want to say that although I don't use either Vim or Neovim, I feel grateful for what Vim has done. Vim keybindings can be used in a multitude of editors, and you can even bring the keybinding concept into browsers and other software.

It's truly revolutionary, when one thinks about it, how much impact Vim has had on terminal users.

(Neovim's plugin system is nice, but I agree with ya that, from my observation, some parts of the community don't appreciate Bram because of the Vim vs Neovim thing.) It's best if, instead of treating it as Vim vs Neovim, we use the tools we prefer, appreciate the tools others are using, and recognize the contribution of one to the other. Appreciating Vim doesn't make your appreciation for Neovim any lesser; appreciating both can be great. Something which is hard within the editor space in general.

Rest in peace Bram.


Can't say I really interact with the "community", I installed the program and I use it a lot. I am grateful for the existence of vi and vim. I now use neovim where I can. vim or vi as needed.

I use both gvim on linux and macvim on mac for a lot of things--not 'real' coding, typically, but opening and editing scripts and config files, writing in markdown, etc; I'm usually opening these from dolphin or finder. In the terminal, working on real code bases and not scripts, I use neovim. My configs for these have diverged a bit over the years but since the use cases are different, it doesn't bother me.

I didn't switch because there was no reason to. And there is still none.

Have you ever called neovim inside a venv? It didn't work for me (or maybe I'm too lazy to jump through hoops when vim works out of the box).

muscle memory mainly, I guess?

Sure, switching might not be that troublesome, but I can tell you the first 48 hours or so will be painful, you'll insert stray ":" and "i" characters everywhere :)


I barely use vim these days, and I still do that in every text editor.

gvim?

Isn't this like telling the world you ate a full meal by eating samples at Costco? Meta is raking in billions as we speak; they should ensure the FOSS projects they rely on are properly funded instead of shovelling cash into bullshit datacentre developments. Otherwise we're basically guaranteed to end up with another XZ fiasco once again, when some tired, unpaid FOSS maintainer ends up trusting a random Jia Tan in their desperation.

This post is all about how they upstreamed their improvements!

If you get mad when a company makes good use of open source and contributes to a project’s betterment, you do not understand the point of open source, you’re just fumbling for a pitchfork.


I'd say this post reads more like them beating their chest about how great their improvements are.

>Isn't this like telling the world you ate a full meal by eating samples at Costco?

The analogy fails because free samples cost Costco (or whatever the vendor is) money. Raking Meta over the coals for using ffmpeg instead of paying for something proprietary makes as much sense as raking every tech company over the coals for using Linux. Or maybe you'd do that too, I can't tell.


Meta is the sole reason PHP is still alive. Also a big reason we're not in MVC hell.

They bet on open source and they open source a lot of technology.

It's one of the best companies when it comes to open source.

I don't know how much total they donate, but I've seen tons of grants given to projects from them.


I think that WordPress is still big enough to keep PHP alive. Furthermore, the sheer number of developers who started coding web apps with PHP in the year 2000, plus or minus 5 years, is large enough to give PHP critical mass for the next 20 years.

Is Automattic contributing back to PHP? I think that WordPress benefits because PHP is available, but does not significantly contribute to PHP development.

WordPress is keeping PHP alive now

But PHP wouldn't be here today if it wasn't for Meta and its support.


WordPress is from 2003 and has been very successful since the beginning. Facebook is from 2004. Both were PHP apps because the late 90s and early 2000s were the years of PHP CMSes and ecommerce platforms. Even if Facebook had not happened, PHP would have been one of the top 5 languages of that age. PHP was popular because of web hosts and the simplicity of Apache + mod_php. It was not big in hype because it was a really bad language until about version 7, and few people would admit to liking it.

Actually, Facebook worked against WordPress and the adoption of PHP, because a number of people who could have used a WP instance to blog or to market a product started using a FB page instead. Ecommerce went from self-hosted (Magento, WooCommerce, PrestaShop) to hosted, or to Amazon and also FB.


FB actually made PHP faster (HHVM) and contributed to it upstream with bug fixes (like they are doing with ffmpeg here)

WordPress did nothing to help further PHP other than adoption (which is still important, but not as important)


> Meta is the sole reason PHP is still alive.

This could not be more wrong. Meta is still using PHP AFAIK, but I'm not sure it's modern. They created the Hack programming language ~10 years ago, but it doesn't look like it's been updated in several years. Most of the improvements they touted were included in PHP 7 years ago.


I never said they were still using it (they are in some cases)

But when the backend world was either Java or ASP, FB chose PHP and helped us other small companies out.

They eventually went Hack, the rest went Node for the most part.

But during those PHP years they gave us HHVM and many PHP improvements to get us through.


HHVM was not a contribution to PHP. It resulted in PHP 7 being sped up and releasing with a bunch of long-awaited features. But AFAIK very little of HHVM made it back to PHP core.

It made PHP 5-7 usable in production, otherwise it would have died before we got to modern PHP.

Of course it wasn't merged in; it was a separate compiler. It certainly inspired future optimizations though.

But the point is, it was a very useful stop-gap solution for the community.

Also would like to highlight that they have contributed a lot to PHP upstream in addition to that.


Yeah we’re in React SPA hell instead. I’d rather be in MVC hell.

> Yeah we’re in React SPA hell instead. I’d rather be in MVC hell.

I am guessing the world moved to React because the developer community in general does not feel the same way.


No, they moved to React because it was evangelized as the only framework available. There are plenty of people who hate React, don't worry.

As a react hater, I share DHH's opinion that React was driven by ZIRP. So many giant, slow, react apps out there that are super slow to develop with. IMO HTMX is a 10x dev time reducer over React.

That's a common take here but I'd take React any day.

Been doing this for 20 years. React/JSX is the easiest (for me)


Yeah, same. Not sure if everyone is as traumatized as us when it comes to dealing with 100K LOC large Backbone.js codebases though, or before that where we kept state in the DOM itself and tried to wrangle it all with jQuery.

React and JSX really did help a lot compared to how it used to be, which was pretty unmanageable already.


>It's one of the best companies when it comes to open source.

If you, for some inexplicable reason, judge companies "the best" only based on their open source software and totally ignore everything else they do to society, while totally ignoring all the other companies who support open source software so much better, without doing all the evil shit that Facebook does (like React).

The rest of us don't bend over backwards so far and blindfold ourselves to harsh reality just to lick Zuckerberg's boots.


I said "one of" when it comes to "open source". Nothing more. (you know the topic at hand)

Not defending the company in any other regard nor do I even like social media platforms, would rather have forums only again as a society.

Feel free to continue to follow me around and perform bad takes, it's funny.


I mean, they contributed their fixes upstream. That's the most important thing they could do here.

> What would native containers bring over Linux ones?

What would a Phillips screwdriver bring over a flathead screwdriver? Sometimes you don't want/need the flathead screwdriver, simple as that. There are macOS-specific jobs you need to run on macOS, such as Xcode toolchains etc. You can try cross-compiling, but it's a pain, and ridiculous given that every other OS supports containers natively (including Windows). It's clear to me that Apple is trying to make the jobs-per-Mac-mini ratio as small as possible.


> gws doesn't ship a static list of commands. It reads Google's own Discovery Service at runtime and builds its entire command surface dynamically

You're not exactly describing rocket science. This is basically how websites work; there's never been anything stopping anyone from doing dynamic UI in TUIs except the fact that TUI frameworks were dog poop until a few years ago (and there was no Windows Terminal, so no Windows support). Try doing that in ncurses instead of Ratatui or whatever; it's horrendous.
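
For the curious, the pattern looks roughly like this. A toy sketch of my own, not gws's actual code; it builds a flat argparse CLI rather than a TUI, and the Admin SDK URL is just one example of Google's documented discovery URL scheme:

    # Fetch an API's discovery document at runtime and turn every method it
    # declares into a CLI subcommand. Stdlib only; actually building and
    # sending the HTTP request is left out.
    import argparse
    import json
    import urllib.request

    DISCOVERY_URL = "https://www.googleapis.com/discovery/v1/apis/admin/directory_v1/rest"

    def walk_methods(resources, prefix=()):
        """Yield (command path, method spec) pairs from the nested resource tree."""
        for name, resource in resources.items():
            for method_name, spec in resource.get("methods", {}).items():
                yield prefix + (name, method_name), spec
            yield from walk_methods(resource.get("resources", {}), prefix + (name,))

    def build_cli(doc):
        """Build the whole command surface from whatever the discovery doc says."""
        parser = argparse.ArgumentParser(prog=doc["name"])
        subs = parser.add_subparsers(dest="command", required=True)
        for path, spec in walk_methods(doc.get("resources", {})):
            cmd = subs.add_parser(".".join(path), help=spec.get("description", ""))
            for pname, pspec in spec.get("parameters", {}).items():
                cmd.add_argument(f"--{pname}", required=pspec.get("required", False),
                                 help=pspec.get("description", ""))
        return parser

    if __name__ == "__main__":
        with urllib.request.urlopen(DISCOVERY_URL) as resp:
            doc = json.load(resp)
        args = build_cli(doc).parse_args()
        print(args)  # a real tool would now build and send the HTTP request

The whole command tree comes from the server, so when Google adds a method, the tool grows a subcommand for free. The same idea works in a TUI; argparse is just the cheapest way to show it.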


> all CPU/RAM capacity is being sold to LLM companies, and as a result we can't get the hardware needed for good local LLMs.

yeah... Ironic, I guess. It's as if they've realised that it's only a matter of time until we get a "good enough" FOSS model that runs on consumer hardware. The fact that such a thing would demolish their entire business of getting VC hype while giving their service away at a loss is surely lost on them. Surely they and Nvidia have not realised that the only thing that could stop this is to make good hardware unreachable for anything smaller than a massive corp.

Mark my words: in less than a year, we'll probably get a FOSS model akin to Opus 4.6. China is putting as much money into that as it can, because they know it would crash the US economy, which is in the green only thanks to big tech pumping up AI. China wants Trump either gone or neutered as soon as possible, and they know they can do that by making Republicans as unelectable as possible, something a crashed economy and a recession would probably accomplish.


In Italy I pay €9.99 for 250GB plus unlimited calls and SMS (never sent one in 10+ years, but it's nice to have, I guess), which for me is basically akin to infinite traffic, given that unless I start downloading torrents over 5G I'll never, ever run out.

Yeah, but all of this is pointless when the RAM by itself is as expensive as two CPUs, if it's even in stock. AMD/Intel should focus on that first if they want to save their DIY business at all, which I'm starting to doubt they do.

The RAM shortage is... a shortage. It's temporary by nature. RAM didn't suddenly get 4x more expensive to produce, it's just in high demand right now. Supply will eventually catch up even if it takes a few years.

Most of us are not really doing detailed half-decade hardware purchasing plans, for most of us "the next few years" is really the only relevant time frame.

I have a 9-year cycle for my main machine. What's yours?

Right now I'm on a 2021 MacBook Pro, I'm debating whether to upgrade to the new M5 Pro laptops or to wait until next year; so that's 5-6 years. My desktop gets upgraded over time, so it's not really on a "cycle"; it's more that every few years, I may decide to get a more powerful CPU, or more RAM, or a new GPU, or more storage, or whatever else. Though I recently (as in, during the past year) upgraded from an AM4 CPU to an AM5 CPU, so that meant I replaced more than usual; everything other than the GPU, storage, power supply and peripherals. (I switched from ITX to mATX for an extra PCI slot so the case had to go; though it still lives on in the form of a lab PC of sorts, along with my old i7 6700k)

But common among all these replacements is that they're not really planned in detail years before. I may weigh up factors like "I don't really need an upgrade right now" against factors like "the market looks like it'll probably get really shitty later this year", or "I could really really use an upgrade right now" against "but the market is shitty now and we're right before a product launch which will shake things up". But I always react to my current or near-future needs/wants and current or near-future market conditions.

So hearing "the RAM market will be good again in 5 years" is completely irrelevant to me. My decisions are entirely based around how the RAM market is right now and how I believe it will look throughout the year or the next.


When someone is in the market for a new computer, they rarely look more than months ahead, no matter how long your cycle is.

> Supply will eventually catch up even if it takes a few years.

I am wondering if this is so true. What resources and how much time are needed to increase supply by N times to catch up with demand?


Also supply will only increase if producers believe the current level of demand is going to be sustainable in the long term. Which I don't think anyone really believes.

It's the same way nobody wanted to invest in mask supply in the US or Western Europe during Covid: producers knew the demand spike wouldn't last and they'd be left with useless equipment to pay for after the crisis passed.


> demand is going to be sustainable in the long term. Which I don't think anyone really believes.

I am one who thinks there is a chance it will sustain. AI is a useful tool; Opus has unlocked an N-times productivity gain for devs since Opus 4.5, which has been available for just 3 months.

This means adoption has just started; it could expand to all kinds of use cases, niches, problems and products, which could mean N times more demand for compute than we have now.


Correct, but if they're right in that belief then it's still just a matter of time before demand goes down and the shortage is solved.

If they're wrong and this is actually a permanent spike in demand, then it'll take the industry a while to realize it but eventually they'll collectively figure it out and increase supply. The ones who figure it out soonest and increase supply fastest will profit the most. The ones who figure it out slowest will lose market share.


The resources are certainly there; the high prices are providing them. It's just a matter of time.

Supply will catch up to demand when demand goes down.

Or they'll just enjoy the high margins and not invest significantly more than they normally would in new production capacity.

Or they're not convinced that betting on continued hypergrowth of AI is a good idea.

And thereby leave a bunch of profits on the table while simultaneously losing market share to competitors who do invest in more capacity?

Which competitor is going to invest in increasing their capacity when everyone expects the demand to decline sooner rather than later?

If demand declines then the shortage is still solved regardless.

If it doesn't decline, then anyone who took that risk and increased their production capacity will benefit greatly, and those who didn't will lose market share.


They're in business to make money not to solve the shortage. If they invest in new fabs and the bubble pops they're sitting on idle fabs.

And if they invest in new fabs and the bubble doesn't pop then they make a whole lot of that money they're in business to make.

The incentives here are naturally very well aligned with solving the shortage. If doing nothing is likely to solve the shortage, then they'll do nothing. If increasing supply is likely to solve the shortage, then they'll increase supply. If there's a 50/50 chance of both, then some will increase supply and some will do nothing, and the market will reward whichever group was right and punish the other.


Building a new factory would cost $20 billion and take 3 to 4 years [0,1]. With chip output capacity and AI boom profit margins, it would take just under a decade to break even. If the bubble bursts and chips return to pre-boom levels, then it would take over 30 years to break even.

Ford had almost $20 billion in EV manufacturing investments planned for the mid-to-late 2020s; the abrupt end of the EV subsidies cost Ford billions of dollars, and they have abandoned multiple investments.

If you do nothing, you are still rewarded, because you are making pure profit in either scenario. If you invest billions, then you are digging out of that for years regardless, and could be in the hole for decades if you bet wrong.

[0] https://www.construction-physics.com/p/how-to-build-a-20-bil...

[1] https://techovedas.com/what-does-it-take-to-build-a-semicond...
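
To make the arithmetic explicit (the $20B capex is from [0]; the per-year profit figures below are illustrative assumptions I picked to reproduce the break-even horizons claimed above, not numbers from the sources):

    # Break-even horizons for a new fab under two margin regimes.
    # ASSUMED annual profits, chosen only to match the claim above.
    capex = 20e9                     # new fab cost, per [0]
    boom_profit_per_year = 2.5e9     # assumed margin at AI-boom prices
    normal_profit_per_year = 0.6e9   # assumed margin at pre-boom prices

    print(capex / boom_profit_per_year)    # 8.0  -> "just under a decade"
    print(capex / normal_profit_per_year)  # 33.3 -> "over 30 years"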


Correct, if you invest billions then you are taking a risk which could either pay off or not depending on whether you're right about that investment being an efficient use of society's resources.

Existing fabs are currently reaping the rewards of their previous wise investments in building out this capacity in the first place right before a shortage. Without those past investments, the shortage today would be even worse. Future rewards will be allocated based on who correctly predicts the best use of current resources to meet future demand (whether that includes huge investments to build even more fabs or not is yet to be discovered).


You forgot to say “Amen” at the end of your gospel.

I'm always baffled how seriously some people take that “market and incentives always lead to the greater good” religion despite plenty of empirical evidence against it.

But hey, there are creationists out there too so it's not too surprising I guess.


> “market and incentives always lead to the greater good”

I never said anything of the sort, I'm just explaining basic economic facts. Feel free to pretend they aren't true if you wish, but if you do then for all our sakes please stay as far away from the levers of economic policy as possible.


These aren't "basic economic facts"; these are basic pieces of economic liturgy, not grounded in any factual reality. They only work in microeconomics la-la land.

In the real world, executives simply aren't incentivized for all-or-nothing risk taking, and the shareholders of public industrial companies don't want the executives making gambles in an attempt at a big payout; they want steady yields with limited risks. And the financial actors who would be ready to take these kinds of "high risk, high reward" bets aren't operating in these kinds of markets.

Just look out of the window: semiconductor and electronics fabs aren't rushing to expand their capacity. They are very familiar with the problem of oversupply and are always very cautious before making capacity investments, because they know it could very well make them go under very quickly should the market reverse.

The more time passes, the more convinced I am that nothing has done more damage to the broader public's understanding of economic dynamics than the average Econ 101 class.


The real world is indeed more complicated, but not in a way which renders the basic rules of economics untrue. What you're doing is the economic equivalent of pointing at an airplane and saying "Look at that! And you really still believe gravity exists?!? What an idiot!" I'm just pointing out some simple facts which tell us the plane will eventually come back down.

Except the "rules" in question are more akin to "the earth is the center of the universe" than they are to gravity. There's a strong religious aspect to them that makes them stick no matter how many empirical refutations have been made.

China is coming online :(

Why the sad smiley? I'm pretty happy Chinese tech is stepping up to break the current memory cartel.

They can't fix stupid.

Let me describe this in the simplest terms possible: you have speculators speculating about AI products. The speculators are not very smart when it comes to technology, and think RAM is RAM. There are at least three kinds of RAM that matter here: DDR for system RAM, GDDR for GPUs, and HBM for high-density enterprise products. They are not interchangeable; there is no one-die-fits-all solution.

So, these speculators are like "oh no, more GPUs require more RAM!", and then just start speculating on all RAM. Which of these RAMs are the ones they need to worry about? Exclusively HBM, which is a minority of production; DDR and GDDR dominate production.

If you're into inference, and have older machines, you're buying Hxxx or Bxxx cards that use HBM, fit into dual slot x16 configurations, and you're jamming (optimally) 8 of them in. If you're into hardware that is newer, somewhere in the middle of the inference boom, you're using MXM cards. In either situation, the host machine has DDR, but if you're OpenAI, Anthropic, Microsoft, or Google, you're not building (more) inference machines like this.

The first two are buying Nvidia's all in one SBC solution: unified HBM, onboard ARM CPU to babysit the dual GPUs, has its own dual QSFP network controller that can RDMA, etc. No DDR or GDDR involved. Any machines built before this platform are being phased out entirely.

Microsoft is doing the same, but with AMD's products, the MI series that co-locates Epyc-grade Zen 4/5 CCDs with CDNA compute chiplets, running the entire thing off HBM, thus also unified and no DDR/GDDR needed. They, too, are phasing out machines older than this.

Google has a mix: they offer Nvidia all-in-one SBCs as part of GCP for legacy inference tasks (so your stack that can't run on AMD yet still can run), but also offer the same MI products that Microsoft offers via Azure's inference product, and also have their own TPUs that some of Gemini runs on; the TPUs run on HBM AFAICT. No DDR or GDDR here.

So, what do AMD or Intel do here? Let's say they waste fab time to make their own dies on the wrong process (TSMC and Intel Foundry do not have processes optimized for RAM)... they would be producing DDR and GDDR for a market whose demand is almost entirely met. Intel lacks the die-stacking technology required to build HBM, and TSMC I think can't do it for that many layers (HBM has 8 to 16 layers in current-gen stuff IIRC).

Micron, for example, is already bringing two large factories online here in the US to meet the projected growth in demand for the next 20+ years. When these factories finally start producing, it will not change the minds of speculators: they still seem to think AI datacenters need RAM of any kind, and refuse to understand even the most basic nuance. Also, when they come online, HBM will be a minority product; the AI inference boom is still just a bump in the road for them.

Nvidia kinda screwed their consumer partners, btw: they no longer bundle the GDDR required for the card with the purchase of the die. There is a slight short-term bump in GDDR spot prices as partners build up war chests to push series-60 GPUs into production, and once that is done, spot prices will return to normal (outside of the wild speculative manipulation).

One last thing: what about LPDDR, used by AMD Strix Halo and Apple stuff? Speculation seems to have not actually affected it. I consider it a sub-category of DDR (and some dies seem to work as either DDR or LPDDR as of DDR5, due to the merger of the specs by JEDEC), but since it isn't something you find in datacenters, it seems to have avoided speculation.

The Ryzen Max CPUs mentioned in the linked article? Uses LPDDR. Doubling down on the Ryzen Max product line might be a brilliant move.


> The speculators are not very smart when it comes to technology, and think RAM is RAM. There is at least three kinds of RAM that are important to this: DDR for system RAM, GDDR for GPUs, and HBM for high density enterprise products, and they are not interchangeable, there is no one-die-fits-all solution.

The commenter is also not very smart, and does not realize that the companies making the RAM can trade capacity of one kind for another, and that any re-tooling at current prices is still profitable.

The commenter also does not realize that this is also true for lines currently making SSDs.


They can trade capacity, but they generally aren't. The huge storage-only fabs owned by Samsung and Micron do runs that go for 9 months to 12 months.

Flash chips haven't been speculated on nearly as hard, and are suffering from the same sort of weird lack-of-nuance. Samsung, for example, isn't reassigning capacity to meet some sort of phantom datacenter demand that isn't already there, generically, across all datacenters, AI or not.

A lot of the SSD price skyrocketing is largely "SSDs have RAM on them for cache", not "SSDs have flash chips, and they're both made at the same fabs"... which oddly affects low-end SSDs that don't have external cache.

To make it worse, for the speculators who do understand this, because it isn't some universal homogeneous group, the flash chips that go into enterprise SSDs aren't the same that go into consumer SSDs.

The Big Three still aren't doing some major re-tasking of capacity, as actual global demand isn't outstripping supply any more than normal. There is no short-term problem to fix; speculators are just gonna have to stop hoarding toilet paper like it's the start of Covid.

Edit: Oh, and if you want to ask how AMD/TSMC or Intel solve this? They can't, for the same reason making their own in-house HBM isn't happening.


Both Western Digital and Kioxia have reported their 2026 Flash/Hard drive production capacity is sold out.

Micron killed Crucial to focus on AI.


I'm glad Kioxia (formerly Toshiba) have been able to do that. However, I also know they've been having problems meeting demand for quite some time, and haven't been able to scale up nearly as fast as the big three have. There was an incident in 2019 and another in 2022 that killed entire runs of chips and screwed them during the Covid datacenter rush.

Micron killed Crucial because Crucial was a weird offering that competed with their own partners. This was always a weird problem, and it just didn't make financial sense to continue with it. One of the analyses I read was Crucial was less than 12% of sales.

Like, don't get me wrong, I've liked many Crucial products over the years, and even recommended some of them, but it was always weird they were trying to out-compete companies like Adata and other major ODMs.

The counterexample of this is Nvidia absolutely trying to kill their partners, and going to first party assembly and sales of products. Nvidia isn't even going to PNY anymore for ODM needs, but going directly to Foxconn.

Micron execs claiming it's because of AI is a bit weird and revisionist, because they had been working on exiting the Crucial brand since long before they publicly announced it. The public didn't learn of any such plans until right before the Ballistix brand sunsetting was announced in 2021, but it started years before that. Like, I know they're just playing to their shareholders, but it's still a bit weird.


When did Nvidia drop PNY as the ODM for their reference cards? I recall my A5000 (now 2 gens old) was made by PNY.

As far as I know, the current lineup is PNY still makes the workstation cards, possibly also the x16 server cards, but Foxconn is doing the Blackwell SBCs and MXMs, and those SBCs are a pretty big chunk of Nvidia's income right now. I also believe they have moved to Foxconn for the Founders Edition consumer cards.

Also, with the FEs, their partners are disallowed from making their own FEs, even if they make their own PCB from scratch and not based on any existing Nvidia design. Doesn't matter who makes the FE, it immediately puts partners at a great disadvantage if they can't make one too.


Samsung and SK Hynix have moved all of their capacity over as well IIRC.

Reminds me of "false sharing" effect: hidden common dependency and bottleneck for what looks like independent variables on the surface.

> So, these speculators are like "oh no, more GPUs requires more RAM!", and then just start speculating on all RAM.

Are you claiming that these speculators are buying DDR5 RAM and warehousing it somewhere? Or what exactly is the mechanism you are proposing here?

To me it seems much simpler: AI companies want HBM, and HBM and DDR5 share the same wafer production process and facilities, but the HBM process is much more fragile and takes three times the wafer production.

There isn't enough DDR5 RAM being produced, so prices go up.


Those micron factories won’t even be targeted at consumer-grade RAM though, right?

There is no such thing as "consumer grade RAM". Servers still take DIMMs; ECC DIMMs just have more chips on them (previously 9 instead of 8, but now 10 instead of 8 as of DDR5; you'll see some DDR5 DIMMs with 5 instead of 4 because they use double-die packages).

Micron, Samsung, and Hynix just basically sell you chips that comply with the JEDEC spec, and the DIMM manufacturers further bin them according to purpose. The highest end chips (that are stable at high clocks and acceptable voltages) end up in enthusiast performance products, the ones that don't work well at all but still meet JEDEC spec are sold to Dell/HP/Lenovo/etc for Grandma's Facebook machine, and the ones that are exceptionally stable at thermal design limits are plunked onto ECC DIMMs and sold to servers.

Also, as others have mentioned, it's just a fab, and it can make any of the dies they're able to make. Whatever needs to be made to meet demand, they make; they just can't turn on a dime and react to quarterly concerns, and are locked into cycles that may range from 6 months to 18 months.

Side note that is also worth mentioning, sometimes you can order special bins of parts with features that wouldn't normally be available if you're willing to order enough. Recent example being Nvidia buying overclocked GDDR6 chips from Micron with additional features enabled; Micron was more than happy to become Nvidia's exclusive supplier for the custom GDDR chip if Nvidia was willing to buy out the entire run. Stuff like this happens every so often, but isn't the norm.


If there’s no such thing as consumer grade RAM, then what did Micron’s announcement mean?

https://investors.micron.com/news-releases/news-release-deta...


Re-read the previous comment again.

You just need an additional chip to move from "consumer grade" (i.e. no parity) to "server grade" (i.e. with parity). ECC support is actually in the memory controller, which has been in the CPU for the last 15 years. No magic.


The announcement means that they're closing Crucial - just like it says in the title and the first paragraph. The rest of that press release is outlining the mechanics of how that works + some fluff. Micron is going to continue producing the exact same memory chips in the exact same fabs. They're just not soldering it to a board, slapping the Crucial logo on it, and selling it directly to consumers. There's nothing stopping downstream vendors from buying Micron chips, soldering them to a board, and selling them to consumers as Micron was doing previously.

There's nothing in that press release that implies that the memory was somehow different (or "consumer-grade"). The _only_ thing they're saying is that they're ending their B2C business and focusing on B2B.


Didn't you just describe the literal difference between consumer-grade RAM soldered to a consumer-format board and a memory chip sold to a company to be soldered onto that company's product?

Calling it "consumer-grade RAM" is inaccurate - RAM is RAM. When you solder it to a board, you now have a DIMM that is carrying RAM chips. It's a semantic difference, but it's important.

So where are all these speculators storing DDR5, flash, and even spinning hard disks? Asking for a friend.

As a small buyer of all of those things supply at nearly any price has gotten very difficult to reliably predict week to week. When a lot of 100 64GB DDR5 sticks shows up available at a vendor, it’s at a take it or leave it price good for a couple hours. If I don’t pull the trigger they have another buyer for it and I might be waiting another month before anything becomes available again. We can no longer JIT for even failure replacement on our edge nodes.

Then you have the NVMe and even SATA SSD shortages. Still a bunch of very useful hardware out there I would love to find a decent deal on 8TB sata so I could repurpose it. Just doesn’t make any sense right now at current pricing and availability. Good luck trying to even find a batch of 12 of these disks at a time.

This goes for both enterprise and even the prosumer gear I was willing to take for some of these uses.


It's mixed. Some of it really is Covid toilet-paper behavior.

Datacenter customers, for example, have repair parts on hand; boxes of harddrives/ssds waiting to be put in, boxes of consumable parts, DIMMs waiting to replace ones that went faulty, entire machines already racked and waiting to take over for their fallen siblings, etc. Some of these customers added more to the spare parts pile. The big clouds manage their elastic demand of any sort of consumable or repair parts in volumes that are described in terms that fit cargo trucks in a quarterly basis, and they've already compensated.

Now, otoh, you have the truly psychotic people, that fill their basements with toilet paper, just hoarding more than they could ever use in their entire life. We've all seen that story where a guy was going to lose his house because he blew his mortgage money on toilet paper, and was selling it at a loss just to stay afloat. People like this exist in every crisis, and there's gonna be a headline in the near future where someone is gonna lose their house because they had like a hundred trays of DIMMs in their basement.

A few people I know who scrape eBay like its their job for electronics are just waiting for people to start fire-selling DIMMs and SSDs that got hoarded and they couldn't scalp people over; they're expecting half of MSRP or better sometime later this year.


> what about LPDDR, used by AMD Strix Halo and Apple stuff? Speculation seems to have not actually effected it

Good luck actually finding them in stock with 128GB+ RAM. I got a Strix laptop a while ago; now the price in the EU is technically the same, but there's no stock. Maybe in a month or three.

There is also the claw hype. And large Qwen3.5 models can run very well on DDR5 CPUs or Mac minis...


I find the panic over RAM prices to be overblown. 32GB of DDR5 RAM is around $500, which is comparable to the 9800X3D. Sure, it sucks that it increased by around 4x, but when you factor in the overall price of a top-end PC at around $1000-2000, especially the lion's share going to the GPU, the increase is marginal.

This only affects a very narrow slice of highly budget-conscious consumers trying to build high-end PCs on razor-thin margins.


$500 for 32GB is about $15/GB which is a high we haven't seen since the mid-2000s. This is a big deal, it turns RAM and to some extent storage (especially fast storage) into a massive economic bottleneck.

> since the mid-2000s.

Did you adjust for inflation?


Adjusted for inflation, the last time prices (/GB) were this high was May 2011; the tail end of the 2009/2010 shortage. Aside from a brief glut in 2008, it wasn't really cheaper before (than it is now) though. Of course RAM is much faster these days, but also in 2011 most people had no more than 4 GB of system memory and 512 MB VRAM.

https://web.archive.org/web/20240805053759/https://jcmit.net...

https://thememoryguy.com/dram-prices-hit-historic-low/

Inflation applied manually; https://www.bls.gov/cpi/

https://www.neowin.net/forum/topic/983036-latest-steam-hardw...

Steam hardware survey GPU history: https://www.youtube.com/watch?v=wHTdnIviZTE


Inflation since the 2000s cannot possibly make up the difference in price we’ve seen in just the last 6 months.

That was not entirely my point; my point is that citing prices from the 2000s and comparing them with modern ones (the inflation factor is about 2x), regardless of the underlying reason, is either a demonstration of laziness or innumeracy, or even worse, an attempt to manipulate.

It’s not laziness, innumeracy, or manipulation when it can be taken at face value that the cost increase vastly outstrips anything that could be attributed to inflation. You don’t even need to look it up to know that.

> when it can be taken at face value that the cost increase vastly outstrips anything that could be attributed to inflation

But that was not my point _whatsoever_. What I said is: every time you bring up explicit numbers (like the GP's "$500 for 32GB is about $15/GB which is a high we haven't seen since the mid-2000s") you _absolutely_ have to adjust for inflation to have a meaningful conversation. That is it.


Ram is clearly way more expensive now, yes?

Did you adjust for the technological improvements that pump out more chips per wafer compared to the mid-2000s, due to node-size shrinks?

That was not entirely my point; my point is that citing prices from the 2000s and comparing them with modern ones (the inflation factor is about 2x), regardless of the underlying reason, is either a demonstration of laziness or innumeracy, or even worse, an attempt to manipulate.

No, I didn't adjust for the huge inflation in average RAM requirements since the mid-2000s.

Not the point; my point is utterly arithmetical in nature: the dollar has had substantial inflation, and any comparison more than 5 years apart, let alone 20, warrants adjusting the prices, as the error is substantial: 2x in the case of the 2000s.

You demand specific data points but respond with vague handwaving and general statements about the importance of calculating inflation in this discussion, as if it represents more than a small fraction of the overall increase in RAM cost.

> more than a small fraction of the overall increase in ram cost

There is nothing vague about the question of whether prices were adjusted or not (and in this case the pretty unvague coefficient of ~2x between USD in 2000 and 2026); otherwise there is no point in comparing these numbers, just as there is no point in comparing inches and centimetres without declaring beforehand which number is which.


You are perfectly capable of looking at the rate of inflation since the mid 2000s and seeing that it only tells a small portion of the story.

You cannot possibly look at the price of ram now compared to six months ago and be so fixated on including inflation. Obviously inflation occurred and obviously after 20 years it has an impact on price. But we are all on HN and all know what inflation is, so forcing people to drill down on its contribution in order to advance the conversation when it clearly only accounts for a small portion and we all know it’s a factor is absolutely ridiculous. You know this, we know this, and yet here we are still talking about it. I may as well explain what ram is if we want to get this elementary about things.

True or false: ram has become substantially more expensive in the last 6mo in a way that cannot be meaningfully explained by inflation.

There is a very clear, very obvious answer here. Inflation or not.


> You are perfectly capable of looking at the rate of inflation since the mid 2000s and seeing that it only tells a small portion of the story.

I have no idea why you keep arguing about something which was not my point to begin with.


$500 is 5x what it cost less than a year ago, just for context. It turns a $1600 computer build into a $2000 one. That’s a huge difference.

Edit: I don’t get your math. If we’re using a very generous definition of “top end,” even neglecting Nvidia and going AMD - which some would argue makes it not top end - you’re talking conservatively: $600 for a GPU, $500 for 32gb of ram, and $500 for a CPU. $1600 before PSU, case, SSD, fan(s), mobo…there’s no world in which you’re coming in under $2k. The SSD and board will put you over immediately.

You’re talking 3/2025 prices, not 3/2026. A compromise, mid-range computer is $1500 to build now.


A 5080 is $1.5K; a 5090 is even more. $1600 to $2000 is not really a large difference at a price band where you are spending that much money, especially since you would be heavily compromising on other components to keep that budget, in which case you don't need 32GB of RAM either.

That is to say, if you want a system that keeps up with 32GB of RAM, you'd already be willing to spend a lot, what with options for Noctua fans, water cooling, higher-end mobos, premium cases, OLEDs, etc. If you can't afford that, then you won't be buying expensive DDR5 RAM either.


A 9060 is like $450, an XT is like $550. Depending on what you’re using that computer for it could be more than enough firepower. There are tons of people not paying the Nvidia tax because they have a plenty viable build with AMD.

I built my current PC (9800X3D, 9060, 32GB DDR5) last April for about $1800. It would cost almost $3000 now between the storage and RAM increases. The economics have completely shifted. Everything is more expensive except basically the PSU and case.


The 9060 is mid-tier; buying a 9800X3D and DDR5 RAM is overkill, because the GPU won't keep up with their performance.

AMD has no equivalent to Nvidia at the high end; it isn't a tax so much as a functional monopoly.


We aren’t debating AMD vs. Nvidia and I shouldn’t have gotten distracted with it tbh. I am talking about what it takes to build a computer now.

RAM and storage have ballooned PC costs. That's the issue. Whether you are buying an AMD GPU or an Nvidia GPU, it is still substantially increasing build costs. Nobody is spending $1500 on an Nvidia GPU and then going "well, nothing else matters now." The RAM and storage have gone from $200-$300 to $800-$1000. That's still a huge portion of the budget. They've gone from near-line-item status to 1/3rd (or more) of the cost. Affordable builds have become incredibly difficult to achieve.


Also just randomly realized I kept saying 9060 when I meant 9070.

Look at my comment above and see what I said about my build. I was unwilling to pay that much for a moderate build that can sustain my computer use. Utilize bundles that can save you money on RAM or the chipset/CPU to skip some of the costs.

You aren't specific in your comment. Where are these bundles? What do you do with all the parts you don't need/end up swapping out? How much are you actually saving?

I went to Micro Center and they usually have decently priced bundles. The RAM was G.Skill Flare 32GB sticks. I'm not being specific in the previous post because there are many ways to save depending on whom you're willing to trust: Best Buy open box/new, Micro Center open box/new, and Walmart has pretty good prices depending on what you're looking for. eBay is iffy depending on the product; GPUs are expensive at the moment.

Microcenter is not available for the vast majority of us - they don’t ship. The nearest one to me is an almost 8hr drive and I live in a major city. I’m not spending 2 days and $200-$300 on gas/food/lodging to get there and back.

Best Buy is selling RAM and storage at the same cost as everyone else. I imagine Walmart is not much better. I'm also not sure what you do with all the bundle parts that you don't need. Do you sell them? Where do you sell them?

What deals did you take advantage of? What did everything cost you in the end/when did you build? If you don’t feel like answering that’s fine but it’s valid to remain skeptical given all the evidence to the contrary. Perhaps you’re just really good at finding deals but you can look around this thread and see that we are all telling the same story. Building a computer has gone up $600+ for common builds over the last 4-5mo on top of the already inflated GPU prices we’ve been experiencing for years. If you put my exact build I did last April into PC part picker it is an additional $500+ to build now, and that’s with an AMD GPU to keep costs down.

It's strange times when Mac minis are the budget-friendly computer. Building a half-decent PC for less than $1500 is a serious challenge now. Things are so volatile that Valve still hasn't released, or even set a price for, the new Steam Machine.


Micro Center is useful if it's near you; otherwise you're out of luck. Other stores depend on your region and location; the same store gives different deals and discounts based on regional selling trends.

I'm not sure what you're expecting to hear. What do I do with parts I'm not gonna use? What are you talking about? Don't get the bundle if you're not going to use what the bundle comes with, simple as that. Have you not shopped before?

Currently computer components are not cheap and it does not look like it's getting any better.

I currently have two moderately good laptops that I'll either sell or keep as backups.


I disagree with you. The issue does not affect only a "very narrow slice" of consumers. https://www.techspot.com/news/111472-hp-warns-ram-now-makes-... A major brand is now suggesting that this is a "new normal" and that one solution is to just offer systems with less RAM. This is an issue when lots of modern software seems to expect an unending supply.

That is an insane amount of money for just 32GB of RAM! That's what we were paying back when it was hard to use more than 32-64GB in a desktop setting. These days with all the electron and node bloatware, containers everywhere and AI - 32GB doesn't get you far.

> 32GB DDR5 RAM is around $500 which is comparable to to the 9800x3D.

Apples to oranges. Why are you comparing RAM prices to CPU prices? It's different hardware.

$500 for 32 GB is insane. Just 18 months ago, I bought 128 GB of DDR5 for only $480.


>"overall price of a top end PC at around 1000-2000"

All 4 of my "top end PCs" have 128GB RAM. My server (I self-host everything) has 512GB. Luckily for me, all were bought before that insanity.


That's precisely why we had forums back in the '00s. There were forums for basically everything under the sun; unfortunately they mostly died out. HN is basically just a forum with a single board that got enough recognition; your average forum had more boards than users and was so fragmented it mostly ended up in instadeath after a few months, maybe years.

I think people don't really realise that compilers are "difficult" projects in the same way as an appendicectomy is for a skilled surgeon, i.e. the surgery is "routine" only because the surgeons spent decades honing their skills to do these routinely. The hard part was training someone to be able to do that.

Writing a compiler/interpreter is _extremely_ straightforward; a lexer -> parser -> AST -> semantic analysis -> {codegen -> linker | evaluator} pipeline is a very widely understood and tested way to write a compiler in any language, regardless of what language you are trying to compile. The hard part is _learning_ how it works, but after that, implementing a compiler is a kind of mechanical activity. That's why LLMs are so good at writing parsers: they can just read the source of any compiler (and they've probably read all of them) and apply the same stuff mechanically, with almost 100% accuracy. We even have formal languages for defining parsers, and RTL and the like; that's how "mechanical" the whole process can be.

I'm pretty sure that any skilled compiler dev with the ISO C standard and a few packs of Red Bulls can apecode a working C compiler in a few days, give or take. The hard part isn't doing that, the hard part is the decades of iterative improvements to make it generate extremely performant yet correct code as fast as possible.
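
To show how mechanical the shape is, here's a toy sketch in Python for arithmetic expressions only (my own illustration, obviously nowhere near a real C compiler; it swaps the codegen/linker stage for an evaluator):

    # Toy lexer -> parser -> AST -> evaluator pipeline for "+"/"*" arithmetic.
    # Every stage has the same shape you'd find in a real compiler, just tiny.
    import re

    TOKEN = re.compile(r"\s*(?:(\d+)|(.))")  # integers, or any single-char operator

    def lex(src):
        """Source text -> flat list of tokens."""
        return [("num", int(num)) if num else ("op", op)
                for num, op in TOKEN.findall(src)]

    def parse(tokens):
        """Recursive descent: expr := term ("+" term)*, term := factor ("*" factor)*."""
        pos = 0

        def peek():
            return tokens[pos] if pos < len(tokens) else (None, None)

        def factor():
            nonlocal pos
            kind, value = tokens[pos]
            pos += 1
            if kind == "num":
                return ("lit", value)
            if value == "(":
                node = expr()
                pos += 1  # consume ")"
                return node
            raise SyntaxError(f"unexpected token {value!r}")

        def term():
            nonlocal pos
            node = factor()
            while peek() == ("op", "*"):
                pos += 1
                node = ("mul", node, factor())
            return node

        def expr():
            nonlocal pos
            node = term()
            while peek() == ("op", "+"):
                pos += 1
                node = ("add", node, term())
            return node

        return expr()

    def evaluate(node):
        """Tree walk; swap this stage for codegen and you have a compiler."""
        if node[0] == "lit":
            return node[1]
        lhs, rhs = evaluate(node[1]), evaluate(node[2])
        return lhs + rhs if node[0] == "add" else lhs * rhs

    print(evaluate(parse(lex("2 + 3 * (4 + 1)"))))  # -> 17

Everything past this toy (types, scopes, a real IR, optimization passes) is where the decades go.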


This is the first time I've heard the term 'apecode', and I will make sure to use it at every opportunity.

It's not mine, unfortunately; it comes from a very funny write-up by rsaksida: https://rsaksida.com/blog/ape-coding/

That's true for any software project: on average only 5% goes into developing the first version, while 95% goes into continuous development, support, and maintenance.

