
Wrong conclusion, IMO. This makes inference more cost-effective, which means self-hosting suddenly becomes more attractive to a wider share of the market.

GPUs will continue to be bought up as fast as fabs can spit them out.



The number of people interested in self-hosting AI at the moment is a tiny, tiny percentage of enthusiast computer users, who indeed get to play with self-hosted LLMs on consumer hardware now... but the promise of these AI companies is that LLMs will be the "next internet", or even the "next electricity" according to Sam Altman, all of which will run exclusively on Nvidia chips in mega-datacenters. That promise was priced into Nvidia's share price as of last Friday, and it appears to be on shaky ground now.


I'm not talking about enthusiast computer users. To be frank, they're rather irrelevant here. I'm talking about companies.



