Skip to here[1] if you want to get to the part actually about Altman and his recent fundraising pitch. The beginning is structured as if it is necessary context for that section, but it really is not.
Some interesting thoughts, but this piece is way too "stream of consciousness" for me to get much out of it. Maybe long-time readers of Mowshowitz are accustomed to this format.
I thought maybe I was just too dumb, but looking back, the lack of coherence or structure definitely contributed to my failure to form a clear model of everything he wrote.
Sam Altman realizes that the AI era is not about software (a few hundred lines per model) or about data (it's everywhere), but about hardware (hence Nvidia rocketing to the moon). He wants to compete with Nvidia using easy money from the UAE. It's not really trillions of dollars, but that number grabs the attention of Emiratis who like to flex their money. I don't think it's much more complicated than that.
The page definitely reads like a collection of journal thoughts/summarizing notes and citations but I appreciated the sections covering the lack of humanity in considerations, with a choice quote from a former Google researcher saying that even extinction from AI is irrelevant if intelligent life can continue (relevant pull quotes[1]).
> It is not what OpenAI’s non-profit mission looks like.
> The entire justification for OpenAI’s strategy is invalidated by this move.
I thought we were long past the point where we would compare OpenAI's strategy to its "non-profit mission" and where it could have been "invalidated" on that basis.
It's all jokes in this thread, but I wonder if there's any substance to such high praise. Is this just that, a praise? Or are there some traits of his that would make this plausible? Can I learn these traits?
I mean, surely I could say something like this about some of my acquaintances. Probably I have said that. But this doesn't really mean anything, and you shouldn't quote these jokes of mine as if they actually do.
My point is, there's that myth about "special" people. You know, like, there are NPCs, and there is Napoleon. But even though there are people I consider more capable and less capable, more interesting and less interesting, I don't know anybody I would truly believe to be special. And in the real world, everybody would die on that island just the same.
But then, I'm surely less well connected than Graham, and I don't even know many billionaires personally. So I cannot help but wonder.
There's no need to invoke "NPCs". Some people are talented at painting, some people are talented at math, and some people are talented at bending large numbers of people to their will. It just happens that someone talented in gaining and holding power is more likely to be highly impactful and visible than someone talented in something that, well, doesn't explicitly revolve around impact and visibility.
Napoleon kind of did, he just expressed it in more traditional religious language about destiny and the divine, as opposed to the second-rate materialist knockoff version.
This is a backhanded compliment. Nothing pg says about Altman is straightforwardly positive, although he deliberately phrases things in a way that lots of Americans struggle to read negatively.
I think PG's comments about Sam (at least since Sam left YC) are not really compliments, when you read between the lines... King of the cannibals? What do you have to be willing to do to take that crown?
But yes, knowing some CEOs, I'd say there are rare (at the population level) traits that predispose you to success in that role.
An ability to sell a vision, which can become pathological detachment from reality long before this trait becomes detrimental to your ability to fundraise for a startup.
An ability to focus on the mission and not get bogged down in details, which can become pathological sociopathy/psychopathy long before this trait becomes detrimental to the company.
Gift of the gab / eloquence is very helpful.
A willingness to sacrifice a lot to achieve one's goals, which is usually detrimental to any kind of life outside of the business.
Basically, CEOs are weirdos, the job is way harder than non-founders will ever know, and it definitely has selection pressures and feedback loops that can concentrate some extremely toxic traits if you're not careful. Most people don't want to be CEOs if they understand what it actually means, and most people wouldn't want to self-modify their personality to make them a lot more CEO-like.
So why AI and not climate change? Don't climate scientists and economists estimate it will cost several trillion to decarbonize the economy and pull CO2 out of the air to avoid the bad climate scenarios?
The AI, given the right prompt, will develop plans, conduct research, command robotic factories, and give us decarbonization and carbon-neutral industry, faster than 8B humans would manage!
(Just never mention paperclips. Forget this word.)
I can't put my finger on it, but it feels like Mark Zuckerberg is doing something similar, spending tens of billions without a coherent vision or immediate demand for a specific use. At least Sama has a strategy for maybe-big things that could change the world(tm). My concern is that spending tens of billions or trillions without a specific, profitable need inherently risks large piles of cash that would be more ethically given away than wasted.
Well, the money doesn't go up in smoke. They end up paying it to their staff, suppliers, and so on. If the project doesn't pan out, it's actually just a voluntary redistribution of wealth, and society is better off than if Sam's investors hoarded that money or used it to buy up real estate or make other fairly safe and boring bets.
An argument can be made that there is some specific and deliberate way of giving out $10T that's more equitable - say, spend it on education, public transport, or healthcare. But if we could all agree on what to do, nothing is stopping us from doing it without Sam's (theoretical) money.
>If the project doesn't pan out, it's actually just voluntary redistribution of wealth and the society is better off than if Sam's investors hoarded that money or used it to buy up real estate or make other fairly safe and boring bets.
And yet, still dramatically worse off than if they had decided to "change the world" by funding access to clean water in rural India / Africa, or something along those lines.
Investing in renewables and nuclear is really where the money should be going. It should have been done 30 years ago. It's hard to argue with "but AI will make better panels"; maybe it's true, but we need to survive the next 10 years to realize the benefits.
Your turn of phrase highlighted an interesting point for me: a lot of that money will pay for a lot of electricity and other energy. Much of it literally goes up in smoke!
I still don't intuitively factor in the real, physical, non-renewable materials that go into these projects, but I really should.
I don’t care who it is; nobody is capable or ethical enough to allocate that kind of capital more effectively for society’s benefit than leaving it in the hands of a billion people.
So yes, it would be better for everyone if these billionaires literally allocated 99.9% of their Class A stock to millions of employees and customers (plus cash for the gift tax, or lobbying the USG to waive it).
I’m trying to figure out how you think that the “capital class” lives off the debt of “the poor” when the bottom 50% of US households account for just 2.6% of total income.
The people doing the actual work in the world (a.k.a. the global labor force) are having their value stolen from them, because the structure of the modern firm alienates workers from their capital by assuming it was never theirs; the “proprietor” is assumed to own everything simply because they fronted the cash to get it started.
They hide behind coded socio-political and economic language to externalize their real needs on others, inflate their buying power, burn more fossil fuels.
The fiat wealth model is politically contrived misdirection. Physics makes no mention of a fundamental currency policy or sacred economic model. It’s 100% socialized propaganda that there’s a communal upside to capitalism as we know it, not immutable physics.
If an employer makes a transfer to an employee specifically because they’re an employee (as opposed to gifting something to every customer, employee or not), that’s overwhelmingly likely to be treated as income (disallowed gift treatment), for fairly obvious reasons.
Yes, but it isn't used on its own like that so it doesn't apply. It's used as part of compounds like 'o-kyaku-sama' (valued customer) or after someone's name.
>>He wants to build new chip factories in the decidedly unsafe and unfriendly UAE.
Great. Just what we need.
Another "tech genius" who is entirely forking clueless about geopolitics and the need to avoid helping authoritarians who are globally warring on democracy.
It's all ultimately about what helps them, today, nevermind anyone else or the future.
I'm using his product right now in the other window, and want it to get better, but this kind of inanity is discouraging.
You seem "forking clueless" about geopolitics. The middle east has been ruled by authoritarianism since the beginning of civilization. America, after the fall of the USSR, at the height of its power, with more relative global influence than any country in history, went on to spend decades and many trillions of dollars trying to create democracy in the middle east only to utterly fail. I think you need to come to terms with the fact that in some places, realistically, no amount of external influence will cause democracy to exist.
The above comment would be more complete if it would mention the history of US-backed coups of middle east (and other) governments throughout the 20th century any time a democratically-elected leader began leaning towards the Soviet sphere of influence. For example:
Yes, I understand that democracy building was only our nominal reason for being in the Middle East. But when it comes to Syria, Iraq, and Afghanistan, we did, as a matter of fact, both directly and covertly, try to systematically eradicate authoritarian regimes and general extremism. We attempted social engineering in hostile countries like the ones I mentioned previously, or through movements like the Arab Spring, or through economic sanctions (i.e. Iran, Russia, North Korea), and achieved nothing. While the genuineness of America's intentions is debatable, I don't think it's debatable that America carried out the largest, most expansive democracy-building effort in the Middle East that has been or will likely ever be attempted.
In one sense it's not good for American interests to massively outsource manufacturing like this. But we live in a world order where Russia and China have created a pole of power welcoming to any countries being treated poorly by the US. Economically sanctioning every dictatorship is not going to be a winning long-term strategy.
Are we talking billions or trillions? I'll agree that it's not good to be spending trillions in the UAE, making it the future central manufacturing hub of silicon. Though saying that we shouldn't be making billion-dollar deals with certain countries is tantamount to economic sanctions.
Any kind of semiconductor investment is at the very least on the order of tens of billions - fab infrastructure doesn't come cheaper than that. That's too much, to say nothing of trillions.
>> I think you need to come to terms with the fact that in some places, realistically, no amount of external influence will cause democracy to exist.
I have long ago come to terms with that.
That conclusion does NOT mean that we need to support those authoritarian regimes (except possibly for tactical advantage when necessary). This is especially true when the major authoritarian regimes are already waging war against the democracies, because international rule of law threatens their kleptocracies. In case you doubt it, Putin's own Presidential Press Secretary Dmitry Peskov specifically used the word "war" in describing Russia's actions against the West. Specifically:
“A special military operation began against Ukraine. Over time, it took the form of a war against the collective West. This is a war when the countries of the collective West, led by the United States, are directly involved in the conflict." [via Google Translate]
Of course Russia is allied strongly with Iran and North Korea, and being aided by China and others. Just because any particular authoritarian state does not join the alliance any particular week, does not mean we should be supporting them.
Plus, as someone else pointed out, if he's going to spend $7 TRILLION, the initial conditions of the location hardly matter. There are plenty of countries which would better benefit and be more cooperative. But obviously doing synergistic or even incidental good for others does not figure into these wannabe-tycoons' plans.
Friends, coders, business men and women,
Lend me your ears;[1]
I come to bury Altman, not to praise him.
The carbon footprint that men leave lives after them;
The good is oft archived with their code;
So let it be with Altman.
The noble Mowshowitz
Has told you Altman was ambitious:
If it were so, it was a grievous fault,
And grievously has Altman answered it.
Here, under the post of Mowshowitz and the rest–
For Mowshowitz is an honorable man;
So are they all, all honorable men–
Come I to speak at Altman’s burial.
He was our friend, faithful and just to us:
But Mowshowitz says he was ambitious;
And Mowshowitz is an honorable man.
He has brought many models forth to roam
Whose free tier outputs did the public's coffers fill:[2]
Did this in Altman seem ambitious?
When that the poor have cried, Altman has wept:[3]
Ambition should be made of sterner stuff:
Yet Mowshowitz says he was ambitious;
And Mowshowitz is an honorable man.
You all did see that Yishan Wong resigned [4]
And presented Sam Altman a kingly crown,
Which Altman did hand off to Steve Huffman: was this ambition?
Yet Mowshowitz says he was ambitious;
And, sure, he is an honorable man.
I speak not to disprove what Mowshowitz spoke,
But here I am to speak what I do know.
We all did love him once, not without cause:
What cause withholds us then, to mourn for him?
O judgment! thou art fled to brutish beasts,
And men have lost their reason. Bear with me;
My heart is in the coffin there with Altman,
And I must pause till it come back to me."
[3] "Altman is a supporter of universal basic income" (Wikipedia)
[4] "For eight days in 2014, Altman was the CEO of Reddit, a social media company after CEO Yishan Wong resigned. He announced the return of Steve Huffman as CEO on July 10, 2015." (Wikipedia)
> [3] "Altman is a supporter of universal basic income" (Wikipedia)
I thought universal basic income was just giving people money, but Sam's vision is apparently different. Here in Kenya his WorldCoin had desperate people lining up to scan their eyeballs and give over all their personal information in exchange for some useless cryptocurrency. The Kenyan government eventually told them to stop doing that.
I can't see the article on a quick scroll of the wordpress and I'm not sure where the LessWrong posts go. I really doubt it's AI. He just writes like that.
When I was a young man, I believed in computers. No more, and every time there is news about Altman, I get a fresh reminder that computers are probably going to be the end of us (because we are clever enough to create powerful AI but not yet clever enough to create safe powerful AI).
I don't think there's much to figure out. The original mission is never coming back. The goal now is to make as much $$$ as possible. Sam & MSFT would never have agreed to it otherwise.
Unfortunately this article missed the mark, with a great bait title followed by a disorganized dumpster tire fire of congealed mystery proto goo that leaves the reader wanting.
Seven trillion dollars... who is this, Dr. Evil? Give me a break. Giving any single human, especially Sam Altman, access to that level of capital would be a disaster; it's around 25% of the USA's total GDP.
Sam's supposed to be an AI doomer, only pressing ahead to take up "overhang" created by GPUs. That's clearly not what's going on if he wants to build custom hardware.
He's explicitly not an AI doomer. He says that he disagrees with Yudkowsky and thinks superintelligence will be fine if we're careful. Curiously he never submits his reasoning for critique, it's always just an assertion given in softball interviews.
If you compare to Yudkowsky, not many people are. Nick Land? Gwern? Come to think of it, if someone thought there was a 4% chance of a comet killing every single person on Earth before the year 2100, it would be very hard to claim they were an optimist, yet that's lower than the p(doom) numbers "AI optimists" give. It's only in relative terms that you can say people like Hinton, Christiano, Altman, and Amodei are not doomers.
Hinton I would say is more in the doomer camp. He is really concerned and says he has no idea what to do aside from raise the alarm for more people to work in AI risk.
Christiano puts AI takeover at something like a 20 percent chance, if I remember correctly.
Sam might have similar probabilities but has never verbalized them. It's more of a "I disagree with Eliezer but trust me bro, I think we can get it right" attitude in his public-facing statements so far. He is more toward the doomer end than Zuck, though. At least he recognizes there's a difficult problem there.
So from this article I learned that Scott Alexander started a new blog on substack after notorious incident with slatestarcodex. Cool, but what's the point to change the platform, if all the older posts are linked on About page anyway (via archive.org!), and the new blog is pretty much exactly the same thing, only less readable and runs on JS-heavy platform? Is it even actually the same person, or is astralcodexten AI-generated content too?
[1] https://thezvi.substack.com/p/ai-51-altmans-ambition#%C2%A7a...