
Attempts to litigate any license violation are going to get precisely nowhere I bet, but I find the actual license violation argument persuasive.

This is an excellent example of how the AI singularity/revolution/whatever is a total distraction, and how a much bigger and more serious issue is that AI is becoming so effective at turning the output of cheap/free human mental labour into capital. If AI keeps getting better and better and the status quo socio-economic structures don't change, trillions in capital will be captured by the 0.01%.

It would be quite a turn-up for the books if this AI co-pilot gets suddenly and dramatically better in 2030 and negatively impacts the software engineering profession. "Hey, that's our code you used to replace us!" we will cry out, too late.



I was somewhat worried about that until I saw this: https://twitter.com/nickjshearer/status/1409902649625956361?...

I think programming is one of the many domains (including driving) that will never be totally solved by AI unless/until it's full AGI. The long tail of contextual understanding and messy edge-cases is intractable otherwise.

Will that happen one day? Maybe. Will some kinds of labor get fully automated before then? Probably. But I think the overall time-scale is longer than it seems.


64-bit floats should be fine; I think that tweet is only sort-of correct.

The problem with floats-storing-money is (a) you have to know how many digits of precision you want (e.g. cents, dollars, a tenth of a cent), and (b) you need to watch out if you're adding values together.

Even if certain values can't be represented exactly, that's ok, because you'd want to round to two decimal places before doing anything.

Is there a monetary value that you can't represent with a 64-bit float? E.g. some specific example where quantization ends up throwing off the value by at least 1/100th of whatever currency you're using?
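To make the precision behavior both comments are describing concrete, here is a minimal Python sketch (the specific values are just illustrative): the classic 0.1 + 0.2 artifact, and the accumulation drift that rounding to two decimal places recovers.

```python
# Binary floats cannot represent most decimal fractions exactly.
naive = 0.1 + 0.2          # 0.30000000000000004, not 0.3
rounded = round(naive, 2)  # rounding to cents recovers 0.3

# Drift shows up when adding many values: one cent, a million times.
float_sum = sum([0.01] * 1_000_000)    # slightly off from 10000.0
cent_sum = sum([1] * 1_000_000) / 100  # integer cents stay exact

print(naive, float_sum, cent_sum)
```

So for a single value the quantization error is far below 1/100th of a unit, but it compounds under repeated arithmetic, which is why the usual advice is to store integer cents (or use a decimal type) rather than round after every operation.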


Storing money as a float is always a bad decision. Source: I've worked for several banks and faced many such bugs.


Pretty common in financial modeling, which I'm told is all done in Excel.


> "'Hey, that's our code you used to replace us!' we will cry out too late."

Are we in the software community not the ones who have frequently told other industries we have been disrupting to "adapt or die" along with smug remarks about others acting like buggy whip makers? Time to live up to our own words ... if we can.


>Are we in the software community not the ones who

No.

I'll politely clarify that for over a decade I - and many others - have been asking not to be lumped in with the lukewarm takes of west coast software bubble asshats. We do not live there, we do not like them, and I wish they would quit pretending to speak for us.

The idea that there is anything approaching a cohesive software "community" is a con people play on themselves.


To go on a bit of a tangent, I’m somewhat pessimistic: I expect western societies to plateau and hit a “technofeudalism” in the next century or two. Combine what you mention with other aspects of capital efficiency. It’s not a unique idea, and it’s played out in a lot of “classic” sci-fi like Diamond Age.

Now, it’s also not necessarily that bad of a state. That depends on ensuring a few ground elements are in place, like people being able to grow their own food (or supplemental food) or still being free to design and build things on their own. If corporations restrict that, then people will be at their mercy for all the essentials of life. My take from history is that I’d prefer to have been a peasant during much of the Middle Ages than a factory worker during the industrial revolution. [1] Then again, Chinese people have been (seemingly) willing to leave farms in droves for the last few decades to accept the modern version of factory life, so perhaps peasant farming life isn’t as idyllic as it sounds. [2]

1: https://www.lovemoney.com/galleries/84600/how-many-hours-did... 2: https://www.csmonitor.com/2004/0123/p08s01-woap.html


But the rate at which machinery will produce products and services means that even a small tax on corporations producing everything autonomously will be enough to feed everyone and provide a good quality of life through a UBI or part-time jobs.

You really want to push for high productivity across all industries, even if that means sacrificing jobs in the short term, because history demonstrates that, afterward, new and more humane jobs emerge.


Every decade was supposed to see fewer hours worked for higher pay and quality of life. It didn't happen, because business owners pocketed the gains (not just the 1% fat cats; the owners of mom-and-pop shops are at least as guilty as anyone, they just sucked at scaling their avarice).

So the claim that this technological revolution will be different and that it will result in a broad social safety net, universal basic income, and substantive, well-paid part-time work is a joke but not a very good one. It will be more of the same - massive concentration of wealth among those who already hold enough capital to wield it effectively. A few lucky ones who manage to create their own wealth. And those left behind working more hours for less.


You are right that this won't happen by itself. We need another economic system, and not just hope that this time things will magically fix themselves.


This new economic system you want has been in use since the 70s. Everything about the economy is practically socially managed these days.

What part of printing trillions of dollars to stimulate economic productivity is somehow a free market system?


I wasn't talking about the free market, but about the state of the present economy. Unfortunately, those trillions of dollars aren't being distributed to the people, but are instead concentrated in the hands of the richest.


I'm pretty sure people got $5,000 on average in stimulus checks, to the tune of 9 trillion dollars, these last few months.


I'd agree that many business owners are blameworthy (specifically the ones who have sought monopolies for their product or monopsonies for their labour supply), but we shouldn't forget landlords. A huge fraction of people's income goes to paying rent, especially in urban areas, yet the property tax is relatively low. This leaves a fat profit margin for landlords, even subtracting off the capital cost of the building. The proliferation of "single family house" zoning hasn't helped either. Preventing the construction of high density housing drives up rents, and benefits landlords at the cost of everyone else.


> those left behind working more hours for less

Doing what? Isn't the concern here that automation will push many people out of the workforce entirely?


Well, as long as humans are more energy-efficient to deploy than robots, you will always have a job. It might mean conditions for most humans will be like they were a century ago.


> as long as humans are more energy-efficient to deploy than robots

Energy efficiency isn't relevant. When switchboard operators were replaced by automatic telephone exchanges, it wasn't to reduce energy consumption.

The question is whether an automated solution can perform satisfactorily while offering upfront and ongoing costs that make them an economically viable replacement for human workers (i.e. paid employees).


Who debugs the software when there's a problem?


Professional software developers, i.e. members of one of the well-paid professions that is not under immediate threat from automation.


Automated debugging software of course


Yeah, for sure, the corporations that already pay effectively $0 in tax today are going to suddenly decide in the future to be benevolent and usher in the era of UBI and prosperity for all of humankind. They definitely won't continue to accumulate capital at the expense of everything else and use that to solidify their grasp of the future.

It would be a lot easier if more people on this website would just be honest with themselves and everyone else and simply admit they think feudalism is good and that serfs shouldn't be so uppity. But not me, of course; I won't be a serf. Now if you'll excuse me, someone gave me a really good deal on a bridge that I'm going to go buy...


Have fun being a hairdresser or prostitute for the 0.01% then.

New jobs in academic fields will not emerge. Already now a significant percentage of degree holders are forced into bullshit jobs.


Would the implication be that we are stagnating as a species then?


Not stagnating but moving into an "Elysium" (as in the film) type of society.


The problem with this is that you increasingly have to put your trust in the hands of a shrinking group of owners (people who have the rights to the automated productivity). At some point, those owners are just going to stop supporting everyone else (will probably happen when they have the ability to create everything they could ever want with automation - think robot farms, robot security forces, all encompassing automated monitoring, robot construction, etc.)


So we give away the world to the 1% and are supposed to be satisfied with the "privilege" of being able to eat?


Just look at autocratic countries. That top 1% still needs something like 3-4% of the population to staff the bureaucracy and 3-5% for the armed and police forces. And there are always family connections and relatives of relatives who want better living. So fortunately, no AI will ever replace corruption and the other flaws of human society.

But yeah, the remaining 80-90% of the population will have that quality of life and bullshit jobs, because that's how the world already works outside of the western-countries bubble.


If AI can replace us at difficult tasks, it can repress us. How are you going to agitate for a UBI when AI has identified you as a likely agitator and sends in the robots to arrest you?


The current state of most wealthy countries does not show any hint of significant corporate tax. Wealth will continue to accrue in the hands of the few.


Indeed, even here on HN, it's a pretty regular talking point in the comments that the only fair corporate tax rate is 0%.


> trillions in capital will be captured by the 0.01%.

How is that different from the current situation?


In the current arrangement, capital by itself is useless - you need workers to utilize it to generate wealth. Owners of capital can then collect economic rent from that generated wealth, but they have to leave enough for the workers to sustain themselves. This is an unfair arrangement, obviously; but at least the workers get something out of it, so it can be fairly stable.

In the hypothetical fully-automated future, there's no need for workers anymore; automated capital can generate wealth directly, and its owners can trade the output between each other to fully satisfy all their needs. The only reason to give anything to the 99.99% at that point would be to keep them content enough to prevent a revolution, and that's less than you need to pay people to actually come and work for you.


It is very similar to the current situation, but intensified. Technology tends to be an intensifier for existing power structures.


Except some random nobody can become a disruptor.


I was debating bringing up disruptors when I made the grandparent comment. My 2 cents: they can shift the balance of power at the very small scale (e.g. "some random nobody" getting rich, or some rich person going bankrupt), but the large scale power structures almost always remain largely intact. For instance, that "random nobody" may well get rich through the sale of shares in their company - now the company is owned by the owner class, who were previously at the top of the power hierarchy.


> but the large scale power structures almost always remain largely intact

Is that anything new? That seems to be a repeating fact of life throughout history.


Nothing new, certainly, but still worth examining. If we are not content with the current power structures, then we should be wary of changes that further intensify them.

We need not totally avoid such changes (i.e. shun technological advancements entirely because of their social ramifications), but we need to be mindful of their effects if we want to improve our current situation regarding the distribution/concentration of wealth and power in the world.


Uber vs taxi companies, Google vs Yahoo, or Facebook vs MySpace, Amazon versus all retailers ...


Exactly, in all cases the disruption was localized, and the broader power structures were largely unaffected. The richest among us - the owner class - were not significantly affected by all of these disruptions. They owned diversified portfolios, weathered the changes, and came out with an even greater share of wealth and power. Those who were most affected by the disruptions you listed were the employees of those companies/industries - not the owners/investors.


A random nobody whose parents just happened to be millionaires, and/or who lives, works, and studies in one of the top capitals of the world.


> If AI keeps getting better and better and status quo socio-economic structure don't change, trillions in capital will be captured by the 0.01%.

This is absolutely one of the things that keeps me up at night.

Much of the structure of the modern world hinges on the balance between forces towards consolidation and forces towards fragmentation. We need organizations (by this I mean corporations, governments, unions, etc.) big enough to do big things (like fix climate change) but small enough to not become totalitarian or decrepit.

The forces of consolidation have been winning basically since the 50s with the rise of the military-industrial complex, death of unions, unlimited corporate funding of elections (!), regulatory capture, etc. A short linear extrapolation of the current corporate/government environment in the US is pretty close to Demolition Man's dystopian, "After the franchise wars, all restaurants are Taco Bell."

Big data is a huge force towards consolidation. It's essentially a new form of real estate that can be farmed to grow useful information crops. But it's a strange form of soil that is only productive if you have enough acres of it and whose yield scales superlinearly with the size of your farm.

Imagine doing a self-funded AI startup with just you and a few friends. The idea is nearly unthinkable. How do you bootstrap a data corporation that needs terabytes of information to produce anything of value?

If we don't figure out a "data socialism" movement where people have ownership over the data derived from their life, we will keep careening towards an eventuality where a few giant corporations own the world.


I expect nothing less. The 0.01% will be super rich.

You could call it the endgame.


They need to defend their capital from the remaining 99.99%. Expect huge investments in combat robots and the expansion of private armies.

And, of course, total surveillance helps to prevent any kind of unionization of those 99.99%.


Unions (and striking) become rather impotent when the means of production run by themselves and you no longer need workers.


Yep; so unions become militias.


Today's hyper-militarized police forces are their state-provisioned security to protect the capital of the 1%.


> The 0.01% will be super rich.

By definition, that has always been true.

We have been in the endgame for a very long time.


A percentile doesn't dictate the shape of the bell curve. The parent comment could be suggesting the tail is getting longer.


That’s fair.


> It would be quite a turn-up for the books if this AI co-pilot gets suddenly and dramatically better in 2030 and negatively impacts the software engineering profession. "Hey, that's our code you used to replace us!" we will cry out, too late.

And that's why I won't be using it, why give it intelligence so it can work me out of a job?


> This is an excellent example of how the AI singularity/revolution/whatever is a total distraction [...]

Umm, no it's not. It's possible we just have two problems - the economic problem you mention might be correct, but also that people who believe in the problems of the singularity are right as well. The existence of a certain problem doesn't negate the existence of the other problem.


The difference between this model and a human developer is quantitative rather than qualitative. Human developers also synthesize vast amounts of code and can't reference most of it when they use the derived knowledge. The scales are different, but it is the same principle.


Is this the direct result of Microsoft owning GitHub or would they have been able to do it anyway?


> I find the actual license violation argument persuasive.

I'm curious as to why it seems persuasive. Open source licenses largely hinge on restrictions tied to distribution of the software, and training a model does not constitute distribution.


Do we need an update of free software licenses to specifically address this?


Unlikely. If this use counts as a derivative work, then it's already a violation, and no update is needed.

OTOH if laundering through machine learning is a fair use, then licenses can't do anything about this. Licenses can't override the copyright law, so the law would have to change.


Could this disincentivize open source? If I build black boxes that just work, no AI will "incorporate" my efforts into its repertoire, and I will still have made something valuable.


  1. Programmers will become teachers of the co-pilot through IDE / API feedback
  2. Expect CI like services for automated refactoring


Shit... yeah, we should make hay while the sun is shining, and maybe become preppers to brace for the inevitable revolution by the 99.99%.


It seems like the risk exposure would be more to the end user or their employer, doesn't it?


First it was land, then the other means of production, and for the past 150 years capitalists have turned many types of intellectual creation into exclusively owned capital (art, inventions). Now some want to turn personal data into capital (the “right to monetize” personal data advertised by some is nothing else), and this aims to turn publicly available code into capital. This is simply the history of capitalism going on: the appropriation of the commons.


Marx called this subsumption


Can the same argument/concerns be applied to all text generation AI?


I always assumed that one of the reasons Google et al work on AI is because software engineers are too expensive.


Google has the opposite problem. They make infinite money from ad platforms and hire people just for fun so nobody else can have them. They're working on AI because they need to stop them from getting bored.


So Google pays the highest salaries but still thinks engineers cost too much? Why not pay them less - they set the high tier.

For Google, even support employees cost too much.


They don’t pay the highest. And if they paid a lot less everyone would leave.


21st century alchemy!


I don't feel it's morally right to keep a profession around that is automated. Why should software be different?





