
This is a great point regarding what we ought to consider when adapting our lifestyle to reduce negative environmental impact:

> In deciding what to cut, we need to factor in both how much an activity is emitting and how useful and beneficial the activity is to our lives.

Although I would extend “our lives” to “society”. His own example of a hospital emitting more than a cruise ship is a good illustration of this; as a more absurd example, we could drastically cut emissions by removing all humans and replacing them with LLMs (which obviously defeats the entire point, because the LLMs would no longer be needed).

Continuing this line of thought: when considering your use of an LLM, you ought to weigh not merely its emissions and water usage, but also the larger picture of how it benefits human society.
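To make that weighing concrete, here is a rough sketch in Python. The activity names and every number below are purely illustrative assumptions, not real emissions or benefit data; the point is only the shape of the comparison, not the figures:

```python
# Rank activities by societal benefit per tonne of CO2 emitted,
# so the weakest benefit-per-emission candidates are cut first.
# All figures are invented for illustration only.
activities = {
    "hospital":    {"tco2_per_year": 5000, "benefit": 100},
    "cruise ship": {"tco2_per_year": 4000, "benefit": 10},
    "commuting":   {"tco2_per_year": 1200, "benefit": 40},
}

def benefit_per_tonne(item):
    """Benefit score divided by annual tonnes of CO2."""
    _name, a = item
    return a["benefit"] / a["tco2_per_year"]

# Sort ascending: the first entries are the best candidates to cut.
ranked = sorted(activities.items(), key=benefit_per_tonne)
for name, a in ranked:
    print(f"{name}: {benefit_per_tonne((name, a)):.4f} benefit per tCO2")
```

With these made-up numbers the cruise ship ranks first for cutting despite emitting less than the hospital, which is exactly the point of weighing benefit and not just raw emissions.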

For example: Is it based on ethically sound approaches? (If it is more like “the ends justify the means”, do we even know what those ends are?) What are its foreseeable long-term effects on human flourishing? Will it (unless regulated) harm the livelihoods of many people while widening the wealth gap with the tech elites? Does it negatively impact open information sharing (the willingness to run self-hosted original-content websites or communities open to the public, or even the feasibility of doing so[0][1]), the motivation and capability to learn, or creativity? And so forth.

[0] https://hackertimes.com/item?id=42486481

[1] https://hackertimes.com/item?id=42549624



> more like “ends justify the means at any cost”

Everyone is crystal clear on this: the goal is to replace expensive humans to increase profits.


...how do you think you got your job? Ever see those old movies with rows of people manually balancing spreadsheets with pen, paper, and calculators? We are the automators. We replaced thousands of formerly good-paying jobs with computers to increase profits, just like replacing horses with cars or blacksmiths with factories.

The reality, if AI succeeds in replacing programmers (and there's reason to be skeptical of that), is that it will simply be a "move up the value chain". Instead of developing highly technical skills, former programmers will develop new ones: either helping to build models that meet new goals, or guiding those models to produce things that meet requirements. It will not mean all programmers are automatically unemployable, but we will need to change.


A few questions popped into my head. Can you retain the knowledge to evaluate model output, required to effectively help and guide models, if you do not do it yourself anymore? For humans to flourish, does it simply mean “do as little as possible”? Once you have automated everything, where would one find meaningful activity that makes one feel needed by other humans? By definition, automation is about scaling, and the higher up the chain you go, the fewer people are needed to manage the bots; what do you do with the rest? (Do you believe the people who run the models for profit and benefit the most would volunteer to redistribute their wealth and enact some sort of post-scarcity, communist-like equality?)


> Can you retain the knowledge to evaluate model output required to effectively help and guide models to do something if you do not do it yourself anymore?

I mean, education will have to change. In the early years of computer science, the focus was on building entire systems from scratch. Now programming is mainly about writing glue between different libraries to suit our particular use case. This means we need to understand far less about the theoretical underpinnings of computing (hence all the griping along the lines of “programmers don't need to write their own sorting algorithms, so why does every interview ask for one?”).

It's not gone as a skill, it's just different.
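As a miniature of that shift, compare gluing a domain ordering onto a library sort with writing the sort yourself. A hedged Python sketch; the release data is invented for illustration:

```python
# "Glue" programming: compose a library sort with a key function
# instead of hand-writing quicksort for a domain-specific ordering.
releases = [
    {"version": "1.10.0", "downloads": 120},
    {"version": "1.2.3",  "downloads": 450},
    {"version": "2.0.0",  "downloads": 300},
]

# The theory (comparison sorts, O(n log n)) lives inside sorted();
# our job is only to express the domain ordering via `key`.
by_popularity = sorted(releases, key=lambda r: r["downloads"], reverse=True)
print([r["version"] for r in by_popularity])  # most downloaded first
```

The sorting knowledge hasn't vanished; it has moved into the library, and the remaining skill is knowing which primitive to reach for and how to parameterize it.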

>For humans to flourish, does it mean simply “do as little as possible”? Once you automated everything, where would one find meaningful activity that makes one feel needed by other humans?

So I had a eureka moment with AI programming a few weeks ago. I described a basic domain problem in clear English, and it was revealing not just because of all the time it saved, but because it fundamentally changed how programming worked for me. Instead of writing code, I was able to focus my mind completely on the one domain problem. My experiences with AI programming have been much worse since then, but I think it highlights how AI has the potential to remove drudgery from our work: tasks that are easy to automate are, almost by definition, rote. I instead get to focus on the more fun parts. The fulfilling parts.

>By definition automation is about scaling and the higher up the chain you go the fewer people are needed to manage the bots; what do you do with the rest? (Do you believe the people who run the models for profit and benefit the most would volunteer to redistribute their wealth and enact some sort of post-scarcity communist-like equality?)

I think the best precedent here is the start of the 20th century. In that period, elites were absolutely entrenched against ideas like increasing worker pay, granting workers more rights, or raising taxes. However, I believe one of the major turning points in this struggle worldwide was the revolution in Russia. Not because of the communist ideals it espoused, but because of the violence and chaos it caused. People, including economic elites, aren't Marxist-style unthinking bots: they could tell that if they didn't do something about the desperation and poverty they had created, they would be next. So, out of a combination of self-interest and, yes, their own moral compasses, they made compromises with the radicals to improve the standard of living for the poor and common workers, who were mostly happy to accept those compromises.

Now, it's MUCH more complicated than I've laid out here. The shift away from the Gilded Age had been happening for nearly twenty years at that point. But I think it illustrates that concentrating economic power that doesn't trickle down is dangerous: elites who create constant social destruction with no reward for the many will eventually destroy themselves. And they will be smart enough to realize this.


> AI has the potential to remove drudgery from our work - tasks that are easy to automate, are almost by definition, rote.

I like to think that the best kind of automation, when it comes to writing code, is writing less code: code built around strategic abstractions that embody your best understanding of the subject matter and your architectural vision.


Exactly. The whole goal is to remove the "skill premium" from the labor markets, as far as I can tell.

So I would argue that this whole thing is a net negative. We are spending tons of energy for net negative gains.


Maybe it's a negative for you if you already have marketable skills, but a positive for others who want to get in.


> Maybe it's a negative for you if you already have marketable skills, but a positive for others who want to get in.

I am not fully clear: to get in on what? A skill that is valued less and less? Or on being an LLM prompter? How much would rational management be willing to pay a prompt writer (assuming they cannot automate that as well in the first place)?


Yeah, but if there is no requirement or barrier to entry for a job, it becomes, pretty much by definition, a minimum-wage job.

Is this the great future that AI promises? Almost everyone working for minimum wage while a few tech oligarchs rule over all?


Yeah, but we don't have time to analyse this for years and years while upping our power consumption. In the end, we consume too much dirty power and have to change that, and quickly. AI is worth nothing if the world is burning.


> yeah but we don't have time to analyse this for years and years while upping our power consumption

This is 100% true. We also don't have time to debate the morality and necessity of each specific activity for years. If AI energy use is indeed as small as some comments here suggest, ignoring it and focusing on improving things like heating, cooling, and transportation could be a better course of action.



