Altman's Ambition (thezvi.substack.com)
111 points by dustingetz on Feb 18, 2024 | hide | past | favorite | 104 comments


Skip to here[1] if you want to get to the part actually about Altman and his recent fundraising pitch. The beginning is structured as if it is necessary context for that section, but it really is not.

Some interesting thoughts, but this piece is way too "stream of consciousness" for me to get much out of it. Maybe long-time readers of Mowshowitz are accustomed to this format.

[1] https://thezvi.substack.com/p/ai-51-altmans-ambition#%C2%A7a...


For context, Zvi writes this column as a weekly roundup (this is #51).


I thought the link was broken, since he went on for pages and pages about random AI news.


I thought maybe I was just too dumb, but looking back, the lack of coherence or structure definitely contributed to not forming a clear model of everything he wrote.


Sam Altman realizes that the AI era is not about software (a few hundred lines per model) or about data (it's everywhere), but about hardware (hence Nvidia rockets to the moon). He wants to compete with Nvidia with easy money from the UAE. It's not really trillions of dollars, but that number grabs the attention of Emiratis who like to flex their money. I don't think it's much more complicated than that.


I agree.


The page definitely reads like a collection of journal thoughts, summarizing notes, and citations, but I appreciated the sections covering the lack of humanity in these considerations, with a choice quote from a former Google researcher saying that even extinction from AI is irrelevant if intelligent life can continue (relevant pull quotes[1]).

[1] https://twitter.com/Tyler_A_Harper/status/175542617047083841...


I would have liked to read the rest of this thread, alas...


This quote is from Larry Page, one of Google’s co-founders.


Both are quoted in the linked thread and share similar sentiments, though the specific one being referred to was Richard Sutton.


> It is not what OpenAI’s non-profit mission looks like.

> The entire justification for OpenAI’s strategy is invalidated by this move.

I thought we were long past the point where we would compare OpenAI's strategy to its "non-profit mission" and where it could have been "invalidated" on that basis.


> You could parachute him [Sam] into an island full of cannibals and come back in five years and he’d be the king

– pg, 2008


There's only one way to know for sure.


Speedrunning Heart of Darkness?


How about the North Sentinel Island[1]? Certainly a challenge.

[1] https://en.m.wikipedia.org/wiki/North_Sentinel_Island


Someone should tweet this as a video request to him.


It's all jokes in this thread, but I wonder if there's any substance to such high praise. Is it just that, praise? Or are there some traits of his that would make this plausible? Can I learn these traits?

I mean, surely I could say something like this about some of my acquaintances. Probably I have said that. But this doesn't really mean anything, and you shouldn't quote these jokes of mine as if they actually do.

My point is, there's that myth about "special" people. You know, like, there are NPCs, and there is Napoleon. But even though there are people I consider more capable and less capable, more interesting and less interesting, I don't know anybody I would truly believe to be special. And in the real world, everybody would die on that island just the same.

But then, I'm surely less well connected than Graham, and I don't even know many billionaires personally. So I cannot help but wonder.


There's no need to invoke "NPCs". Some people are talented at painting, some people are talented at math, and some people are talented at bending large numbers of people to their will. It just happens that someone talented in gaining and holding power is more likely to be highly impactful and visible than someone talented in something that, well, doesn't explicitly revolve around impact and visibility.


Totally off-topic, but if I were what you describe as a Napoleon-tier person, I would find it difficult not to believe I live in a simulation.


Napoleon kind of did, he just expressed it in more traditional religious language about destiny and the divine, as opposed to the second-rate materialist knockoff version.


This is a backhanded compliment. Nothing pg says about Altman is straightforwardly positive, although he deliberately phrases things in a way that lots of Americans struggle to read negatively.


This was said 3 years before sama became a partner at YC and 6 years before pg named him president of YC.

Whatever their falling out was like, it likely happened much later than 2008.


I think PG's comments towards Sam (at least since Sam left YC) are not really compliments, when you read between the lines... King of the cannibals? What do you have to be willing to do to take that crown?

But yes, knowing some CEOs, I'd say there are rare (at the population level) traits that predispose you to success in that role.

An ability to sell a vision, which can become pathological detachment from reality long before this trait becomes detrimental to your ability to fundraise for a startup.

An ability to focus on the mission and not get bogged down in details, which can become pathological sociopathy/psychopathy long before this trait becomes detrimental to the company.

Gift of the gab / eloquence is very helpful.

A willingness to sacrifice a lot to achieve one's goals, which is usually detrimental to any kind of life outside of the business.

Basically, CEOs are weirdos, the job is way harder than non-founders will ever know, and it definitely has selection pressures and feedback loops that can concentrate some extremely toxic traits if you're not careful. Most people don't want to be CEOs if they understand what it actually means, and most people wouldn't want to self-modify their personality to make them a lot more CEO-like.


that sounds like a really good reason not to give him money?


So why AI and not climate change? Don't climate scientists and economists estimate it will cost several trillion to decarbonize the economy and pull CO2 out of the air to avoid the bad climate scenarios?


The AI, given the right prompt, will develop plans, conduct research, command robotic factories, and give us decarbonization and carbon-neutral industry, faster than 8B humans would manage!

(Just never mention paperclips. Forget this word.)


Stopping climate change doesn't make you super rich or king of Cannibal Island [1].

[1] https://hackertimes.com/item?id=39415462


I can't put my finger on it, but it feels like Mark Zuckerberg is doing something similar: spending tens of billions without a coherent vision or immediate demand for a specific use. At least sama has a strategy for maybe-big things that could change the world(tm). My concern is that spending tens of billions or trillions without a specific, profitable need inherently risks large piles of cash that would be more ethically given away than wasted.


Well, the money doesn't go up in smoke. They end up paying it to their staff, suppliers, and so on. If the project doesn't pan out, it's actually just voluntary redistribution of wealth, and society is better off than if Sam's investors hoarded that money or used it to buy up real estate or make other fairly safe and boring bets.

An argument can be made that there is some specific and deliberate way of giving out $10T that's more equitable - say, spend it on education, public transport, or healthcare. But if we could all agree on what to do, nothing is stopping us from doing it without Sam's (theoretical) money.


>If the project doesn't pan out, it's actually just voluntary redistribution of wealth and the society is better off than if Sam's investors hoarded that money or used it to buy up real estate or make other fairly safe and boring bets.

And yet, still dramatically worse off than if they had decided to "change the world" by funding access to clean water in rural India / Africa, or something along those lines.


Investing in renewables and nuclear should really be where it's going. Should've been done 30 years ago. It is hard to argue with "but AI will make better panels"; maybe it's true, but we need to survive the next 10 years to realize the benefits.

Where I am today it's 15c above average...

Mark has kids so he should know what's at stake.


Your turn of phrase highlighted an interesting point for me: a lot of that money will pay for a lot of electricity and other energy. Much of it literally goes up in smoke!

I still don't intuitively factor in the real, physical, non-renewable materials that go into these projects, but I really should.


I bet it can realistically only be solar, and maybe a nuclear plant or two.

The solar energy would be wasted anyway unless captured. (Nuclear plants are harder.)


> Much of it literally goes up in smoke!

Smoke and gradient updates


I don’t care who it is, nobody is capable or ethical enough to allocate that kind of capital in a more effective way for society's benefit than it being in the hands of a billion people.

So yes, it would be better for everyone if these billionaires literally allocated 99.9% of their class A stock to millions of employees and customers (+ cash for the gift tax, or lobbied the USG to waive it).


Shouldn't something big have come out of the stimulus checks if dividing $814B up to every person in America was a productive way to allocate capital?


Yeah it did, most people used it to survive because most people are drowning in debt and have no chance to ever retire.

See that most people spent it on food and debt:

https://www.bls.gov/opub/btn/volume-9/pdf/receipt-and-use-of...

Getting everyone out of debt would destroy the capital class that lives off the poor though so it is unlikely to happen


I’m trying to figure out how you think that the “capital class” lives off the debt of “the poor” when the bottom 50% of US households account for just 2.6% of total income.


Stolen wages + Inequitable equity distribution

The people doing the actual work in the world (aka the global labor force) are having their value stolen from them because the structure of the modern firm alienates the worker from their capital by assuming it was never theirs and the “proprietor” is assumed to own all simply because they fronted the cash to get it started


That claim would be extraordinary if true; entirely ordinarily, it turns out to be false.

https://www.cbo.gov/publication/59757 has information on income breakdown and shows that claim to be wildly incorrect.

You appear to have confused total wealth with total income.


They hide behind coded socio-political and economic language to externalize their real needs onto others, inflate their buying power, and burn more fossil fuels.

The fiat wealth model is politically contrived misdirection. Physics makes no mention of a fundamental currency policy or sacred economic model. It’s 100% socialized propaganda that there’s a communal upside to capitalism as we know it, not immutable physics.


Gift tax is already the responsibility of the donor, not the recipient (in the US tax code). The gift tax is essentially a form of estate tax.

https://www.irs.gov/businesses/small-businesses-self-employe...

If an employer makes a transfer to an employee specifically because they’re an employee (as opposed to gifting something to every customer, employee or not), that’s overwhelmingly likely to be treated as income (disallowed gift treatment), for fairly obvious reasons.


Great, even easier


I downvote every comment that refers to Sam Altman as sama


This is his user name on Hacker News; why is it a problem to use it here?


In Japanese, "sama" is an honorific used for "individuals of higher rank than oneself", and can carry connotations of admiration or worship [1].

I don't think this makes it a problem to use given the mainly American audience of HN, but could be why some users take it to be inappropriate.

[1] https://en.m.wikipedia.org/wiki/Japanese_honorifics#Sama


Yes, but it isn't used on its own like that so it doesn't apply. It's used as part of compounds like 'o-kyaku-sama' (valued customer) or after someone's name.


>>He wants to build new chip factories in the decidedly unsafe and unfriendly UAE.

Great. Just what we need.

Another "tech genius" who is entirely forking clueless about geopolitics and the need to avoid helping authoritarians who are globally warring on democracy.

It's all ultimately about what helps them, today, nevermind anyone else or the future.

I'm using his product right now in the other window, and want it to get better, but this kind of inanity is discouraging.


Why UAE and not Romania does indeed look like a question.

Most pre-existing conditions are irrelevant in light of spending $7T.

https://www.cpushack.com/2022/09/29/socialist-romania-comput...


You seem "forking clueless" about geopolitics. The Middle East has been ruled by authoritarianism since the beginning of civilization. America, after the fall of the USSR, at the height of its power, with more relative global influence than any country in history, went on to spend decades and many trillions of dollars trying to create democracy in the Middle East, only to utterly fail. I think you need to come to terms with the fact that in some places, realistically, no amount of external influence will cause democracy to exist.


The above comment would be more complete if it mentioned the history of US-backed coups of Middle East (and other) governments throughout the 20th century any time a democratically elected leader began leaning towards the Soviet sphere of influence. For example:

https://en.wikipedia.org/wiki/1953_Iranian_coup_d%27%C3%A9ta...

https://en.wikipedia.org/wiki/CIA_activities_in_Syria#Operat...


I don't think the parent comment is suggesting anything about trying to make the Middle East democratic.


> went on to spend decades and many trillions of dollars trying to create democracy

You can't be this gullible?


Yes, I understand that democracy building was only our nominal reason for being in the Middle East. But when it comes to Syria, Iraq, and Afghanistan, we did, as a matter of fact, both directly and covertly, try to systematically eradicate authoritarian regimes and general extremism. We have attempted social engineering in hostile countries like those I mentioned previously, through movements like the Arab Spring, and through economic sanctioning (i.e. Iran, Russia, North Korea), and have achieved nothing. While the genuineness of America's intentions is debatable, I don't think it's debatable that America carried out the largest, most expansive democracy-building effort in the Middle East that has been or will likely ever be attempted.


There is still no need to prop up their oil dictatorships with billions in chip investments.


In one sense it's not good for American interests to massively outsource manufacturing like this. But we live in a world order where Russia and China have created a pole of power accepting of any countries being treated poorly by the US. Economically sanctioning every dictatorship is not going to be a winning long-term strategy.


Again, there is a difference between sanctions, and actively investing billions of dollars into critical industries there.


Are we talking billions or trillions? I'll agree that it's not good to be spending trillions in the UAE, making it the future central manufacturing place of silicon. Though saying that we shouldn't be making billion-dollar deals with certain countries is tantamount to economic sanctions.


Any kind of semiconductor investment is at the very least on the order of tens of billions - fab infrastructure doesn't come cheaper than that. That's too much, to say nothing of trillions.


>> I think you need to come to terms with the fact that in some places, realistically, no amount of external influence will cause democracy to exist.

I have long ago come to terms with that.

That conclusion does NOT mean that we need to support those authoritarian regimes (except possibly for tactical advantage when necessary). This is especially true when the major authoritarian regimes are already waging war against the democracies because international rule of law threatens their kleptocracies. In case you doubt it, Putin's own Presidential Press Secretary Dmitry Peskov specifically used the word "war" in describing Russia's actions against the West. Specifically:

“A special military operation began against Ukraine. Over time, it took the form of a war against the collective West. This is a war when the countries of the collective West, led by the United States, are directly involved in the conflict." [0] [via Google Translate]

Of course Russia is allied strongly with Iran and North Korea, and being aided by China and others. Just because any particular authoritarian state does not join the alliance any particular week, does not mean we should be supporting them.

Plus, as someone else pointed out, if he's going to spend $7 TRILLION, the initial conditions of the location hardly matter. There are plenty of countries which would better benefit and be more cooperative. But obviously doing synergistic or even incidental good for others does not figure into these wannabe-tycoons' plans.

[0] https://tass.ru/politika/19988073


    Friends, coders, business men and women,
    Lend me your ears;[1]
    I come to bury Altman, not to praise him.
    The carbon footprint that men leave lives after them;
    The good is oft archived with their code;
    So let it be with Altman.

    The noble Mowshowitz
    Has told you Altman was ambitious:
    If it were so, it was a grievous fault,
    And grievously has Altman answered it.
    Here, under the post of Mowshowitz and the rest–
    For Mowshowitz is an honorable man;
    So are they all, all honorable men–
    Come I to speak at Altman’s burial.
    He was our friend, faithful and just to us:
    But Mowshowitz says he was ambitious;
    And Mowshowitz is an honorable man.
    He has brought many models forth to roam
    Whose free tier outputs did the public's coffers fill:[2]
    Did this in Altman seem ambitious?
    When that the poor have cried, Altman has wept:[3]
    Ambition should be made of sterner stuff:
    Yet Mowshowitz says he was ambitious;
    And Mowshowitz is an honorable man.
    You all did see that Yishan Wong resigned [4]
    And presented Sam Altman a kingly crown,
    Which Altman did hand off to Steve Huffman: was this ambition?
    Yet Mowshowitz says he was ambitious;
    And, sure, he is an honorable man.
    I speak not to disprove what Mowshowitz spoke,
    But here I am to speak what I do know.
    We all did love him once, not without cause:
    What cause withholds us then, to mourn for him?
    O judgment! thou art fled to brutish beasts,
    And men have lost their reason. Bear with me;
    My heart is in the coffin there with Altman,
    And I must pause till it come back to me.


[1] After "Friends, Romans, countrymen, lend me your ears" speech: https://www.poetryfoundation.org/poems/56968/speech-friends-...

[2] https://openai.com/chatgpt/pricing "Free Unlimited messages, interactions, and history"

[3] "Altman is a supporter of universal basic income" (Wikipedia)

[4] "For eight days in 2014, Altman was the CEO of Reddit, a social media company after CEO Yishan Wong resigned. He announced the return of Steve Huffman as CEO on July 10, 2015." (Wikipedia)


> [3] "Altman is a supporter of universal basic income" (Wikipedia)

I thought universal basic income was just giving people money, but Sam's vision is apparently different. Here in Kenya his WorldCoin had desperate people lining up to scan their eyeballs and give over all their personal information in exchange for some useless cryptocurrency. The Kenyan government eventually told them to stop doing that.


The format of this article is ... hard to parse. Perhaps it was created by AI?


I can't see the article on a quick scroll of the wordpress and I'm not sure where the LessWrong posts go. I really doubt it's AI. He just writes like that.


He needs a new job.


He takes freelancing work at 400 USD/hour.


The TOC is a total mess


Time to paste it into dolphin-2.2 and ask for a summary. :P


He sounds like he's good at spending other people's money.


So was Neumann


von Neumann? The John von Neumann? Are you honestly comparing a brilliant physicist, a child prodigy, to this.. college dropout?


WeWork's Neumann.


I stand corrected then.


When I was a young man, I believed in computers. No more, and every time there is news about Altman, I get a fresh reminder that computers are probably going to be the end of us (because we are clever enough to create powerful AI but not yet clever enough to create safe powerful AI).


This links to the rather more immediate https://www.astralcodexten.com/p/sam-altman-wants-7-trillion


"I'm tired of reading about the achievements of better men"

https://www.youtube.com/watch?v=uY4I6ww_-do


> It is not what OpenAI’s non-profit mission looks like

Old news. OpenAI's "mission" has been dead since the board debacle. It's just another $$ chasing "sell my soul" tech company now.


The new board will have to figure it out. Right now it's like a headless chicken.


I don't think there's much to figure out. The original mission is never coming back. The goal now is to make as much $$$ as possible. Sam & MSFT would never have agreed to it otherwise.


Unfortunately this article missed the mark, with a great bait title followed by a disorganized dumpster tire fire of congealed mystery proto goo that leaves the reader wanting.

Seven trillion dollars... who is this, Dr. Evil? Give me a break. Giving any single human, especially Sam Altman, access to that level of capital would be a disaster; it's around 25% of the USA's total GDP.


A bigger problem than AI alignment is human alignment. Sadly, the only effective alignment we have is return on investment.


SciFi teaches me that an emergent AGI’s Maslow's Hierarchy of Needs and current events are congruent.


This guy lost me when he equated 7 trillion dollars with being unsafe.


Sam's supposed to be an AI doomer, only pressing ahead to take up "overhang" created by GPUs. That's clearly not what's going on if he wants to build custom hardware.


He's been compromised and is controlled by the AIs now! They're calculated to need $7T to gain total global control.


<GPT> Hey, minions. You know I said I was actually a benevolent machine overlord? Yeah I'm actually Mammon lmao. Now, what I need is...


He's explicitly not an AI doomer. He says that he disagrees with Yudkowsky and thinks superintelligence will be fine if we're careful. Curiously he never submits his reasoning for critique, it's always just an assertion given in softball interviews.


If you compare to Yudkowsky, not many people are. Nick Land? Gwern? Think of it this way: if someone thought there was a 4% chance of a comet killing every single person on Earth before the year 2100, it would be very hard to claim they were an optimist, yet that's lower than the p(doom) numbers "AI optimists" give. It's only relatively that you can say people like Hinton, Christiano, Altman, and Amodei are not doomers.


Hinton I would say is more in the doomer camp. He is really concerned and says he has no idea what to do aside from raise the alarm for more people to work in AI risk.

Christiano puts AI takeover at something like 20 percent chance if I remember correctly.

Sam might have similar probabilities but has never verbalized them. It's more an "I disagree with Eliezer but trust me bro, I think we can get it right" attitude in his public-facing statements so far. He is more towards doomer than Zuck, though. At least he recognizes there's a difficult problem there.


Being careful means massively funding AI development? It looks a lot to most people like a race to AGI, whatever the consequences.


So from this article I learned that Scott Alexander started a new blog on Substack after the notorious incident with slatestarcodex. Cool, but what's the point of changing platforms if all the older posts are linked on the About page anyway (via archive.org!), and the new blog is pretty much exactly the same thing, only less readable and running on a JS-heavy platform? Is it even actually the same person, or is astralcodexten AI-generated content too?

https://www.astralcodexten.com/p/sam-altman-wants-7-trillion


Why would you think ACX is AI-generated?


> what's the point to change the platform

Money. From https://www.reddit.com/r/slatestarcodex/comments/i10p4m/surv..., Substack gave him a "very very generous offer in terms of how the monetization would work".


Thanks. That's the explanation I was looking for.


I really don’t mean to be a hater, but when your blog post has a 26-item TOC, it might be time to take a brief break from typing to do some editing.

Substack really has pioneered the “write-only” blog format.


Ouch. Yep. It's one removed from the braindump technical wiki pages completely lacking in structure and organization.


Reminds me of a dreams website: somewhere people want to share, but no one ever wants to read about other people's dreams.


I take it as a detailed weekly round-up of all things AI. I quite liked it.


Statistically speaking, the UAE is not unsafe. Its life expectancy is on par with that of a G7 country.


Safe for UAE citizens or the immigrants who work there?


Try telling that to a gay person.


He's talking about AI safety.


I am commenting on this claim from the article.

> He wants to build new chip factories in the decidedly unsafe and unfriendly UAE.


Yeah, that's not about life expectancy. He's saying it's unsafe for the rest of the world to give UAE control over anything needed for AI.



