
This is great. I sort of feel a lack of fora for discussing technical books over a longer lifetime than, say, the HN front page.

While there is a very good selection of readings, it's unfortunate that both LinkedIn and Google are being used here, especially if the discussion is text-only.


Isn't it obvious? They aren't using AI to automate their business. That's why GM is doing this.

Rem acu tango.

> Everything people have figured out needs to be in living form to be carried on.

It would appear that LLMs are invalidating this claim. Things can live in synthetic form and carry on just fine. Instead of cultivating a population of learned minds we are just feeding a few dozen egregores of models and training corpuses.


They are not invalidating this claim, and cannot, unless we'd actually try it out for a few generations. Which we shouldn't and won't.

LLMs are quite good at simulating life and living intelligence (in the short term), but they aren't any of that. That's why we call it artificial intelligence. It's true that we can't put our finger on what exactly the difference is, but it's not like reality has ever felt encumbered by our limited understanding.


All LLMs do is launder other people's IP. So I don't think you invalidated that claim.

Ten years ago, I would have kowtowed to someone elite enough to build something like this.

Today, I just think, "how long would LLMs have taken to write this?"

I mourn the death of a human artform.


It's far more exciting than sad.

Got an idea that you'd need assembly language for - now you can do it instead of..... never doing it because it would have been impossible for you in any practical way.

Look to the positive instead of lamenting something that never would have happened.

It's unbelievably exciting that you can now program a computer virtually without the limitation of your ability to hand code it.


The result is unimpressive either way -- it's the journey that is exciting for these kinds of projects.

I understand that for some people it's the display of human wizardry that matters.

For me it's about making the computer do awesome things - I do not care how I get there I just want it to do whatever I can conjure in my head.


As much as I enjoy the novelty of asking chatGPT for anime pictures, I do not, for a single moment, consider myself a doer of anime pictures.

And as a fair aside, the result will be a "good enough" approximation of what I conjured in my head, but never the thing itself. For me to make the exact thing I conjured, it would require me to pick up the mouse and draw the rest of the owl. I don't know whether that's more telling of my imagination being demanding or of my standards.


True if you use only chatGPT to do something and accept the generated stuff as the final output.

Probably not the case for anime pictures, but in other domains you can use chatGPT as a first pass and then go on to improve it from there. To make a parallel: if you draw with a pencil on a piece of paper, you would still think of yourself as a doer even if you did not manufacture your pencil or paper.


There's still personal skill expression in driving cars and using a pencil for drawing, that makes the difference between drivers and artists visible enough to justify hiring one over another.

So far I can't say the same for leveraging LLMs, and, on the off chance that there is, we have an entire software development industry that doesn't even know how to filter for "it".


It's usually not even the display.

When I go on a trek, the end of trek landmark is nowhere nearly as significant as the experience of reaching it.

If I were to be magically transported there without the lived experience, it would take almost all of the joy out of it. Some people get a kick out of doing hard things that are interesting but seemingly beyond one's ability. Making it an easy commodity spoils the fun.

As for teleportation: if it were, say, a trip to the moons of Saturn, I could make exceptions.


Nah. I'm not going to yearn for the days of hitting steel on an anvil when we can have steel produced in a factory.

Have you ever done blacksmithing? It’s tremendously satisfying.

Sure, if you want 300,000 spoons, it’s far better to use a factory process and get essentially identical results. But if you only want a few spoons and accept (or even value) that the spoons will all be a little different, hand-forging them is quite enjoyable.

I’ve written enough assembly and done enough blacksmithing to know that the metaphor isn’t quite apt. But there’s tremendous effort and satisfaction involved in both.


Hobbies are great. Making a living is usually a separate endeavor. I don’t want to pay for an artisan spoon. There will be a limited market for artisan software. But make no mistake, we are entering the era of software mass production. Say goodbye to your chisels and rasps. If you want to make money you will need to operate the machine that builds the machine.

You won’t be able to enjoy your free time playing with computers if anthropic et al make you jobless.

The “you” doesn’t necessarily refer to you. I’m addressing 90% of the developers out there. We love playing around with technology… but I doubt we will be thinking the same once we become unemployable. But here we are, having fun with the tools of companies that want to finish us. How ironic.


Fiddling while our home burns has been a beloved pastime for many people, from emperors to passengers on ocean liners to prisoners in camps. What else is there to do anyway? Sometimes history must take its course before humanity as a whole recognizes its folly.

This is silly.

While I work in IT today, that wasn’t always true. I am certain I spent more free time playing with computers when my work did not involve computers at all. While I enjoy working with computers at a variety of different levels, when I do it all day, I don’t typically wanna do it when I get home. If Anthropic means there are no more IT jobs, software jobs, etc. (which I think is highly unlikely), then I guess I will have to do a non-tech job just like 99% of other human beings. If that comes to pass, I expect in my spare time I will suddenly reacquire a love for tinkering with computers.


Also consider the great fortune of dicking around with computers ever being so lucrative in the first place even if the gravy train eventually stops. We were lucky. Most hobbies aren’t anything like that.

> Got an idea that you'd need assembly language for - now you can do it instead of.....

Nobody actually needs a web server built in assembly language, it serves no practical purpose. And I say that as someone who learned to program 6502 assembly language in 1983 and has sporadically used assembly of various architectures since.

The absurdity of building it would have been the curiosity draw pre-LLMs, but when its existence is just a series of prompts away, it really loses all of its meaning.

But yeah... hooray for AI. Can't wait until we learn to harness it to supercharge the most important and valuable thing we do as a human society in modern times: stuff increasingly intrusive ads in front of everyone at all times.


> Can't wait until we learn to harness it to supercharge the most important and valuable thing we do as a human society in modern times: stuff increasingly intrusive ads in front of everyone at all times.

Wasn’t it used for that before anything else? Google invented transformers and had LLMs internally before chatgpt got released. Presumably they were using them for ads, because their public demos were insane things like talking to the moon.


> Wasn’t it used for that before anything else? Google invented transformers and had LLMs internally before chatgpt got released.

According to friends who worked at Google (no direct knowledge myself, so don't know exactly how true it is), they mostly sat on the tech. Google News had internal prototypes of using them to expand/contract/summarise and/or add details/context to news articles and translate them to different languages, but it was never fully productised.

Then after ChatGPT got popular, sudden panic to start using them in products company-wide.


It has always been possible to do it. LLMs are not a particular enabler for that.

The difference is that now it is worthless: there is no learning, no person caring about the result, nothing aspirational for the public to look towards... we used to enjoy those challenges, used to be proud of solving complex problems... now? Yeah, whatever, execute execute commit push, let another LLM "review" and call it a day.


The difference is not that it’s “worthless”. The difference is that now it’s “practical” to implement given the low effort.

I wouldn’t be sad about defeating lower-complexity challenges. There are always higher-complexity challenges that arise once we start operating in a world where you can do more. The bar rises.


No: increasing the supply of something always decreases its value. It does not necessarily increase demand. That is a basic economic rule. Note that I say "value", not "cost". The distinction matters.

Yesterday I went to a bookstore, saw an interesting book cover, and then thought "ah, looks like AI"... all excitement went away. There won't be a "new complexity frontier" for the artists who used to draw book covers. Or for writers, actors, etc.

AI is currently not enabling any use case that was previously "too hard". It is just reducing the value of stuff by increasing the supply, and making people delulu about what they can achieve without proper knowledge.

Making good stuff requires paying attention to a lot of details. Even "simple" stuff can become incredibly complex once you actually learn how it must be done. Most of what we humans do is work in that space, not chasing Manhattan Projects.

What do we get if the population is disconnected from the true complexity of creating stuff? Perceived value decreases, and if everything is perceived as equally bad, people will stop caring about quality. That is why fascism likes uneducated people.

So, that is about the AI contribution to "value" itself.

Now, is it true that AI will allow us to create more complex stuff that is not practical now? I would strongly disagree. The reason is Kolmogorov complexity: it is not possible to find the shortest program that describes a task. Describing it in natural language will not magically give us permission to avoid describing that complexity. What is the point of switching from C to English if I still have to specify every little detail in a far more ambiguous and verbose language? Programming languages are not the challenge; they are the solution to the problem of having to specify complex tasks in a reproducible way.

Now, gathering everything together: that is why I think generative AI makes things worthless: value reduction; reduced perception of complexity (which reduces value); a population ignorant of the complexity choosing subpar options because "they are all the same garbage"; and no superior engineering capability to show for it anyway.


The point is the death of the celebration of excellence and technical mastery.

Once insurmountable challenges are now trivial to implement with, as you say, "low effort."

For those who were attracted to computing by the grind and the grand narrative that you, too, with sufficient effort, discipline, and merit, could become a revered craftsman, LLMs trivialize an entire lifetime of practice. I can't think of anything more demoralizing.


If your goals were fame, then yes. But you can still pursue excellence even if there is an alternative “easy” path.

The equivalent is something like hand tool woodworking - it’s still a thing despite the advent of machines, but more of a niche. You can still aim to become excellent, but maybe you won’t be famous.


> but maybe you won’t be famous.

Or employable. Which sucks if you're over 50.


That also sucks if you are nowhere close to retirement or to having a beefy bank account, and depend on regular monthly payments.

Did hammers obviate the technical mastery of finding a suitable rock? Or did they elevate the definition of “technical mastery”?

llms are nothing like hammers or other tools.

They are factories that produce goods on a whim. There is nothing to compare them to, as we have never had anything like that. This is not an industrial revolution; this is the obliteration of work at its core.


I look at them as lab grown bacteria. We’re in the early days and still have a lot of contamination we still don’t understand. They don’t always produce a viable result, and sometimes they break test rigs.

Just because they’re not a pure extension of our bodies or minds like a hammer or pencil doesn’t mean they will magically break the concept of work.


Would you apply the same reasoning to the building of horse drawn carriages and mass produced motor vehicles? A hand built PDP-11 to a Thinkpad?

> The difference is that now it is worthless

Writing whole software projects in assembly has been worthless and pointless for a couple of decades now. Even the projects who can put together a solid case will limit assembly to very specific components executed only in specific bits of a hot path. Perhaps the most performance-sensitive code we have today is high frequency trading and that field is dominated by C++.

Also, virtually all mainstream compiler suites have flags that output assembly, and that feature is largely ignored and unused.


That's just not true... the flags to get preprocessed output and assembly are quite useful and used a fair bit, in fact. Multiple reasons: sanity-checking your code, finding bugs, or even finding compiler errors.

The point is that these projects had worth because of what the programmer got out of the learning process, not because of the end result.

A lot of FFmpeg is written in assembly, and a lot of things are using FFmpeg in the backend.

Yep, another humane thing going to get killed, because people are naive, gullible and basically idiots handing out their expertise on a platter to faceless corpo entities.

What's next, human-to-human contact abstracted away by brain stimulation?

And the transhumanist arsewipes gonna have a field day.

Never too late to ignite the nukes...


> What's next, human-to-human contact abstracted away by brain stimulation?

Of course! Corona/junta/scarecrowvirus don't transmit over the wire, while ads, taxes and surveillance do alright!


If you've got an idea that you need assembly language for, you can use a compiler to create that assembly language. It'll probably do a better job than an LLM. Assembly projects are interesting because they're written in assembly, not because they contain assembly.

You'd be surprised, again.... most compilers don't generate very good code, mostly because

1. the time for optimisation is limited

2. the constraints are overlapping and just completely intractable beyond a single function (do you want to inline this, saving on the call and increasing binary size, or not do it because it's cold?)

3. they don't have domain-specific knowledge about your code, and even with PGO they might incorrectly decide what's hot and what's not - a typical example is program settings. You didn't enable a setting during PGO instrumentation, the compiler sees you didn't call that path, and shoves it out of line. Now your PGO-optimised code is worse than -O2. And compilers have different levels of adherence to manual branch hinting - on MSVC you get a reorder at best, while Clang and GCC try much harder with [[likely]] and [[unlikely]].

4. There's still quite a bit of low-hanging fruit left, mostly because progress is jagged ;) For example our calling conventions generally suck - this is actually why inlining is so helpful - and the inertia makes everyone emit the default calling convention and that's it.
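The branch-hinting mention in point 3 can be sketched in a few lines (a minimal C++20 example; the function name and the digit-parsing scenario are illustrative, and how aggressively the hint is honoured varies by compiler, as noted):

```cpp
// Parse a decimal digit; returns -1 for non-digits.
// The [[likely]] attribute (C++20) hints that the digit case is the hot,
// fall-through path, so the error return can be pushed out of line.
// GCC and Clang act on these hints more readily than MSVC does.
int parse_digit(char c) {
    if (c >= '0' && c <= '9') [[likely]] {
        return c - '0';
    }
    return -1; // cold error path
}
```

Here parse_digit('7') yields 7 and parse_digit('x') yields -1; whether the hint actually changed block placement is worth verifying against the emitted assembly rather than taken on faith.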

For example, did you know that compilers have very inconsistent support for struct unpacking? It can be much faster to write

  int32_t meow(int64_t a, int64_t b);
than

  struct mytype {
    int64_t x;
    int64_t y;
  };

  int32_t meow(mytype a);
because the first one goes through registers on the MSVC ABI, while the second gets lowered to the caller passing a pointer to the stack. Before someone says "oh, this just means MS sucks" - fair, but for std::unique_ptr the situation is the other way around... on the MSVC ABI the callee cleans it up, so it's truly zero-cost, but on the Itanium ABI using it is worse than using a raw T*... see the GCC codegen :)

These examples might seem a bit cherry-picked, but this is only scratching the surface, not to mention the codegen in higher-level languages, which is even more dreadful. Manually optimising your code can often get you a magnitude's worth of free performance, which is just tragic.

I wouldn't even rule out LLM codegen in the future - although they're quite unreliable today so you'd get miscompiles like crazy - but there's just so much low-hanging fruit left on the table that it wouldn't be too out of step...


Expanding the struct to two arguments does not take longer than rewriting your whole project in assembly.

I've never said that, but using assembly in certain places can certainly be justified, especially for the performance-intensive parts.

This idea that LLMs are going to enable us to do things that we wouldn't have done before, and that therefore overall productivity and value are going to increase "exponentially", seems naive about basic economics.

If LLMs are good for doing things we aren't already doing, it indicates the overall addressable "value" that LLMs could provide for such things is actually quite low. If the task has necessary prerequisites that you don't currently possess, but you haven't spent the effort to jump that hurdle yet, then it's a good indication the value of completing that task is very low. Even if, maybe especially if, we're talking about personal projects where the value proposition is personal and not monetary, it indicates the person already feels in their bones that the return on doing this thing is not worth the effort.

I'm struggling with this with my leadership at work. We have developed a thing that is going to remove the need to hire temps [0] for data entry when we get clients who send us large amounts of their "data," aka "a thousand 30-slide PowerPoints, each with one line graph of interest sitting in the corner of one slide somewhere." It is an ask that comes up a lot, and it's always very expensive for the client on both the time and money axes, but overall it's just a small part of the contract budget. I'm all for using what we've built to cut down the time cost for our clients, but my leadership, thinking it's going to lead to massive cost savings for our clients, seems to forget just how much time we spend in meetings and planning and documentation and testing and reevaluation and more meetings versus actually executing on things.

It's also bad business. To me, giving results faster should be a premium offering. We should be charging more, not less.

[0] We don't actually hire temps; we turn our junior data analysts into temps by burning them out with tons of unpaid overtime. They then leave, and we have to backfill them at rather extreme hiring overhead compared to the cost of the overtime pay we didn't provide.


I think for programmers the joy is in writing it on their own, not in just having a toy. If I just wanted a web server in asm, the easiest path would be to decompile an existing one into assembly and call it a day.

Only exciting if you've already got a lot of programming under your belt, like Carmack, or if you're a product guy.


> without the limitation of your ability to hand code it.

Isn't that kind of view pathetic and sad, though? Why would anyone pick up a guitar or play a piano if they could just listen to the same song already made by someone else? I struggle to understand people who pretend not to understand why being an expert at some skill is perceived as valuable by some people. This also belies the next problem with this line of thinking, which is that it says "we don't need to learn X to do Y because we have AI" but misses that the same AI could easily replace the need to have you think to do Y in the first place. I don't know.


In my experience people that did not hand code enough imagine that the hard part is coding, and not clearly defining all the possible edge cases and use cases.

So, in my view, more people will (or should) understand now what is hard when building complex things, if they pass the stage of "I have a nice POC that works for this one case".


I don’t know. The edge cases become very clear when coding, because there are explicit and precise guarantees of how the code will behave that you can reason about (or when there aren’t, you know that you are in trouble and code defensively around it, and you can reason about that defensive code), which isn’t the case when vibe coding. When coding, you can prove the code correct in your head, and the edge cases are revealed by that process. But that process isn’t possible with vibe coding, because what exactly a prompt will produce isn’t predictable, and doesn’t have guarantees attached that you can reason about.

Completely agree. My point is that they will start realising it not when vibe coding (as you say, nobody will ask them for clear specifications) but as soon as they try to use it for more than one happy-path demo. Then they will "patch" one case and break another, and so on. And then they will probably complain that LLMs are crap at coding, the same way some lazy product managers complained before when asked questions they had not thought about...

I see. Some people have the opinion that the things that are hard about hand-coding remain essentially the same when LLM-coding. They argue that this ensures their employment despite the shift from hand-coding to LLM-coding, and that the necessary expertise and diligence (e.g. appropriate architecture and test coverage) remains the same, and in consequence quality doesn’t suffer. I thought that you were arguing in that direction.

> Got an idea that you'd need assembly language for - now you can do it instead of..... never doing

But you're not doing it. The ai is doing it.

If the op can write a web server in assembly language then I'm pretty sure they could have done it in a higher-level language. But they did what they did for the journey and the learning along the way. Vibe coding it omits all that, and misses the point of the exercise.


I do believe this is just a next step in languages. We've come this far trying to make code NLP, now we have the closest thing to a translator in our generation. It's an exciting time, just don't pay attention to talking heads.

the biggest issue with llms is that they make those who have no idea what they are doing seem like they know something.

>> without the limitation of your ability to hand code it.

yeah, it's nice, though this in 100% of cases results in software of even lower quality than we had before.

so it's hard to tell where the win is here. The fact that you can generate some code does not make it a win, just a curious fact.


Which is why now companies can happily reduce head count.

> Got an idea that you'd need assembly language for - now you can do it instead of..... never doing it because it would have been impossible for you in any practical way

If you are having an LLM generate the assembly language for you, that is not even remotely close to writing the assembly language yourself.

I don't find it exciting even in the slightest. I can think of nothing more boring and unsatisfying than having an LLM generate all of your code for you.

I mean, I understand why some think this could be exciting from a "I can get something done fast because the LLM generates it for me" standpoint -- because their excitement stems from something getting done at all instead of just sitting in the pool of ideas forever. However, you will never know the code generated by an LLM like you know the code you wrote yourself. Also you will never gain the same satisfaction of finishing a project where the code was written by an LLM that you gain from finishing a project where you wrote the code yourself.

If you are a person that doesn't care about coding or doesn't like to code at all, I could totally see why you'd find this exciting - to you it's all about avoiding work you don't care for or want to do yourself anyway. Also, a high percentage of people who do love coding have zero interest in writing assembly language, so if they were required to write some for a project, I could also see them being happy with having an LLM generate that part of the project for them.

However, I think for people who genuinely love to write code, the situation is the opposite of what you said -- it is far more sad than it is exciting. In fact, for many of them it has already reached the point of depressing for many reasons. I don't think it is primarily because the LLMs have gotten significantly better at generating code (which they have). I think some of the bigger reasons are that so many people who now pay people to produce code have:

1) got a very short-sighted, rose-colored-glasses view of what LLM-produced code will do for their company, and

2) deeply under-appreciate the value of having a person or team of people who understand their business, the hardware and software required to support it, and the work required both to keep things running and to handle new requirements as they come along. Because of that under-appreciation, many have already punted (and/or are preparing to punt) those people to the curb because they think they can just have an LLM do their job and save a ton of money.

In the long run I think most (if not close to all) of those businesses are going to be sorry if they over-indulge in replacing human-produced code with LLM-produced code. I think the ones who lean too heavily on the LLM side will eventually collapse into a heap of unmanageable dumpster-fire code that they can't understand or maintain. A whole new world of incidental complexity will consume every project, and in the long run it will just eat them alive (figuratively speaking, of course :-D ).


I think that the analogy of recorded music best captures your feeling. Not the exact technological and economic transformation that is happening, but the feeling.

Some 120 years ago, before recordings, music was a living phenomenon produced in the moment. Musicians worked at restaurants and coffee shops everywhere, being useful without being superstars.

Music didn’t disappear with recordings, but the work is certainly different.


The analogy is AI music produced without effort in seconds, not recorded music.

It doesn't diminish the art form, though. If anything, I value these kinds of hand-written projects even more now that so many people are pulled in by AI doing their projects for them in a fraction of the time and effort. I love doing these kinds of projects, and I love writing assembly, but I must admit that the temptation of just copy-pasting generated code is big sometimes, because it's _right there_. In this context, seeing someone write something awesome by hand is even more valuable to me.

I think the parent's point is: a couple of years ago handwriting was the only option. You'd see a post like this and know it was something special.

With LLMs, we can't tell anymore if something is a labor of love with hundreds of hours of work behind it, or half a dozen prompts to Claude Code.


Oh yeah I totally get that, and I feel the same, but the author of this project specifically says that it's hand written (first paragraph of the readme in the repo).

The answer is "no time at all." I used Gemini Ultra earlier this year to see how well it would do with some really gnarly assembler. I asked it to write a whole flat-shaded 3D engine in 8086 assembler that would run in CGA on an original XT and it one-shotted it in a couple of minutes.

https://imgur.com/a/Dy5rUku


Look at the bright side, it's much more feasible now :)

Yes, it's not deterministic; if you were using it commercially, the ROI would be terrible; and it's certainly not reliable. But for a hobby project... why not?

Encouraging people to understand the layer of abstractions they're building on is helpful, doesn't matter if they do it by hand or with clankers.

LLMs lower the barrier for execution - they make you faster. The unstated question is: faster at what - they can make you faster at something clever, or faster at the entirely wrong thing...

Your point is correct if we're looking at it through a scarcity lens - the effort to make it certainly decreased a lot - but that doesn't mean that anything is now worthless. We can just move onto doing bigger, better things now, until we hit the next limits...


> Encouraging people to understand the layer of abstractions they're building on is helpful, doesn't matter if they do it by hand or with clankers.

But we know that long term use of LLMs does not lead to better understanding, it leads to reliance on the LLM for the person to be able to function at their job.


You also depend on {computers, the internet, electricity, groceries, etc.}, that's not the discriminating factor.... the question is, are you actually better at your job using it, and are you stunting your own growth?

I'm not claiming to have a definite answer to either, but I think the right question to ask is - are you going to benefit from using it in the long run? If yes, carry on, if no, re-evaluate what you're doing :)


Early in my career, I had a consulting job where the client wanted me to work on their system using their tools. They had nothing and would not provide anything. No compilers, no runtimes, nothing but a regular, run of the mill, desktop office workstation. Instead of performing the task in Visual Studio with .NET like we had planned, I did it in Excel with VBA macros.

If you come to rely on specific tools such that you can't do your job without them, you're no longer doing the job you think you're doing, you're now just a tool operator. If I don't have a computer of some kind, I'm not writing software, I'm just a manager. If my team can't work because GitHub is down, then we've done a bad job of being software developers.

The experience taught me that there are two kinds of tools: those that are necessary and those that are nice to have. Yes, that job sucked, and yes, I quit that job, but more because they refused to pay me the overtime I needed to get it done with such crappy tools.

I had another job where we were one client of many on site with a vendor for a training event, and the vendor couldn't get a system configured correctly for our cohort to continue. Part of the problem was that their configuration system was garbage: very easy to get wrong, very time-consuming to manually edit. While they were dicking around throwing edits against the wall, I wrote an HTML page with JavaScript to make a UI for editing the data in a much more natural way and then generating the config file. It took 10 minutes and saved us hours of waiting so that we could continue with the training. Perhaps the takeaway from that experience was that Valve's engineers write shitty config systems. All I know is we got the training event done, and I walked away with a piece of paper that said I was now a Certified Hardware Vendor, which we then used to sell more contracts on our own.


> I mourn the death of a human artform

Well, look at it this way: the needs of commerce are going to resolve the conflict between the practical and the beautiful. I think those of us who value the beautiful aspects of coding will find new avenues of expression. For example, I'm about to get back into C programming to build a play.date game engine for an MMO.


I haven't used LLMs for assembly yet, but I did try to use them on some DSLs with few docs; the results were much less impressive than with the popular, higher-level languages AI companies scraped a gazillion repos for.

It was an artform and a necessity. It's even more of an artform now.

I appreciate the effort. But why not put the skills into something that would be genuinely useful to others or solve a pain point? Open source the results, or hand them over to someone who would like to maintain them.

There’s a lot of things people do for fun that don’t have an obvious purpose. Life isn’t about spending your every waking hour doing something useful in my opinion.

I can dig a ditch and cover it up innumerable times. If you are doing something out of boredom, at least make it useful to someone. Probably would give you a sense of purpose in solving a problem.

Asked myself the same question. Just guessing... probably wanted to learn about web servers as well as assembly.

I used to LOVE getting into a good flow state while programming and was very proud of clever code I made. Now I just think how much faster an LLM could have generated the code.

> Ten years ago, I would have kowtowed to someone elite enough to build something like this.

I'm afraid it's an elite skill in the sense that juggling is also an elite skill. It's impressive for the first few seconds you gaze at it, but once the novelty wears off you understand that it's wasted effort leading to a project that suffers from a massive maintainability problem, is limited in which platforms it can run on, and brings no advantage whatsoever. It's a gimmick with no practical use.

This is the software development equivalent of an amateur guitarist posting shredding videos on YouTube.


What an odd take. It is often titled "software craftsmanship". Is the craftsman not allowed to practice? Not everything needs an immediate real-world application. Not everything needs to be enterprise-grade, bulletproof, web-scale or whatever. It needs to work for the creator, and sometimes not even that.

In the same way we appreciate Japanese wood joinery, why not just appreciate this? Someone might even learn a trick or two reading it.


> What an odd take. It is often titled "software craftsmanship".

No, not really. This is exactly the opposite example of software craftsmanship. Software craftsmanship involves things like technical excellence in delivering maintainable software that is adaptable to change.

Picking assembly, of all things, for a web server represents a complete failure in the analysis of both the problem and solution domain.

https://en.wikipedia.org/wiki/Software_craftsmanship

This sort of project is more in line with parlour tricks, juggling, and stunt shows. Trying to frame this sort of project as software craftsmanship is like discussing the whole Jackass series as cinema next to Hitchcock and Scorsese. It may take skill and practice to be punched in the nuts, but that doesn't make it a craft.


> Software craftsmanship involves things like technical excellence in delivering maintainable software that is adaptable to change.

To which change, exactly?


Would a craftsman not become a craftsman by honing his or her skills on seemingly pointless projects?

But if it was written with an LLM, then it's not really written in assembly.

Would we have considered a server written in C to be written in assembly? No.


You could say the same about developing photographs by hand, LP collections, or text written on a typewriter.

It's not the end of the world. It's a change.


It's not just art though. It's human thinking and problem solving and learning that is dying.

I have a (relatively well informed) view that people who allege that "if you used an LLM you did no thinking or problem solving" have never in fact used an LLM to generate anything particularly complex.

You do indeed need to do quite a bit of thinking and problem solving to build things with an LLM.

If you disagree, repeat this project, so you can share with us how little thinking it required.


Keep telling yourself that if it makes you feel better.

> I mourn the death of a human artform.

So what art form can a human make with an LLM assisting?

I get what you mean, but I feel this newly profound yearning for "hand-crafted" code is getting a bit out of hand. Software engineers have taken shortcuts whenever possible since software was a thing. Do you also mourn that we don't code airplanes by hand anymore (i.e. the death of the "craft of coding")?

We need to stop thinking of software developers as carpenters, where the magic is some physical skill and that is the "CRAFT WE MUST PROTECT".

And at least your comment was grounded in reality; a lot of people I talk to (who are not coders) seem to think a good software engineer writes every line and every word with thoughtful genius while AI just spams code, so one is better than the other. And they are convinced it's some nuanced, smart take and that they understand software development on an inner level or whatever.

And the base assumption still holds true (pure AI-generated code is garbage), but that's mostly because it's badly designed and AI is still a pretty poor architect. There is a need to push back against slop, but why do we need to elevate typing code as if it's some sacred activity? Most of the work a good coder does is in their mind, with little connection to the physical reality of the world.


In this case you could even replace "LLM" with "C compiler" and it would change nothing.

Look, I still got my physical copy of Michael Abrash's Graphics Programming Black Book with its genius content about hand-optimizing cycle count on 486 and Pentium processors, beating compilers at that time.

It was an absolute artform, but it's completely obsolete today.


> I mourn the death of a human artform.

The artform only dies if you let it. Even if your employer is so idiotically myopic as to forbid you to ever write your own code, you can still continue the art on your own time. I for one don't care how "good enough" any AI-labeled technology gets at writing code. I will continue to hone my craft until the day I either die, become too unwell to do it, or some other creative endeavor consumes all of my personal time.


Human artform is still alive and well as evidenced by this post.

Yes, an LLM can write it, it’ll probably work. Yet, it’ll remain meaningless slop while this is not.


The problem, at least for me, is that by now I'm so desensitized that I won't even bother looking at something, because it could potentially be the product of a few prompts. The LLM noise is drowning out the human signal, so to speak. Same for articles, blog posts, etc. It only takes a few em-dashes, a few "it's not this, it's that" to lose faith in the text's authenticity, and with that, any interest in its content.

Very naive for you to say that.

As the article states, it's an older practice so this maybe goes without saying - canary traps are also useful for tracking the flow of information throughout a population.

A well crafted, bespoke whisper passed into one ear that returns to you from another direction is a very strong signal.


> I've come to the conclusion in the last couple years that being the guy who understands how the abstraction works under the hood is treated by companies is more of a liability than a virtue.

This is one of the most alienating things about the modern software engineering industry. Someone who grew up just fucking around with computers since they were 5 is supposedly now on even footing with someone who took a 16 week bootcamp and a Claude subscription and has never seen a terminal before.

I was at a drum and bass show recently and talked to one of the other people there. It was obvious I didn't really listen to that much drum and bass as I couldn't name anybody except the most popular artists. You see peoples' reactions change slightly when they discover you are not really part of their music scene - you're an outsider, or a tourist, or even a poser. That's not even a problem, that's just the way subcultures are - you've either lived and breathed that way of life, or not.

What LLMs are doing is they are automating the manufacture of posers and cultural appropriators at scale - you don't really understand the nooks and crannies of this territory, you never actually lived on IRC or in the bash terminal - but you can sure wave around these oversimplified maps of the territory with all the back alleys and laneways missing, and use your pocket book of translated phrases to pose as a native.

> My general sense is that nobody understands how React works under the hood. The answer I get when I ask questions is generally just "don't worry about it".

The problem in software is it seems that we are losing the ability to distinguish between appropriators of computer geek culture and those who do "speak" programming languages natively. The bar has fallen so low that I can't even expect people to understand the difference between runtime and compile time. Anybody who brings up such advanced and esoteric (read: high school level computing) topics is viewed with scorn, as if their ability to expose ignorance on foundational topics presents an existential (or career) threat.

There's been a rise of anti-intellectualism in software from people with non-STEM backgrounds who actually disdain seeking out and possessing such knowledge. It's utterly useless to study - just like math. I find it harder and harder to locate hobbyists, especially here in Toronto, who bother to go below the abstractions not just because they want to, but because they are compelled to understand.


Your words resonate with me. Even before LLMs, I’ve been disappointed with the general direction the software industry took in the 2010s. Today’s software industry is not the industry of Licklider, Engelbart, Bob Taylor, Alan Kay, Woz, Stallman, Ritchie, Thompson, Pike, Joy, and many others whom I admire, who helped establish an ethos of computing that fostered a sense of freedom, creativity, and wonder.

Instead, what we have today is a computing ecosystem dominated by powerful players who care about money and control. Speaking from the standpoint of a Bay Area resident, since roughly 2012, the field has been increasingly taken over by people who are in it for the money. Combine that with Alan Kay’s observation that computer science is a “pop culture” that often lives in the moment and has little regard for the past, and also combine that with the “move fast and break things” attitude that permeates modern software development, and this has created an environment that seems hostile to the types of nerdy pursuits that the industry once encouraged. The working environments of many major software companies and the products they release are a reflection of the values of the companies’ executives, managers, and shareholders.

While I’m not anti-AI, I see agentic coding as another step in the direction that the software industry was already heading towards, where it can move even faster and break even more things.

There is still wonder, joy, and freedom in computing, but I feel this is increasingly confined to the hobbyist world and certain niches in research environments.


I can confidently say that I know little to no people truly interested in understanding technology, except for strangers online.

sounds like you're working at the wrong place. detailed computing knowledge and maths are essential in some industries and, like you said, scorned in others. i couldn't think of anything worse to do with my time than spend all day with MBAs or webdevs (lol, i'm sorry, that's unfair; web development is complex with all the callbacks and sync issues).

Thank you, I was starting to wonder.

I guess because I’m in game dev maybe, but in all my jobs knowing about the underlying stack has either been necessary knowledge or highly regarded.

I can’t think of any time in my career where knowing about the internals of the stack was ever frowned upon or where it’s been anything other than an advantage (especially when hunting bugs). I must have been lucky.


> Anybody who brings up such advanced and esoteric (read: high school level computing) topics is viewed with scorn.

Design time, code time, compile time, run time. Why all that potentially wasteful upfront work?

The next step are shipped applications whose help menu is a chat interface that responds to all user questions of the form "How do I ...", with a short pause to add a new hack to the existing pile, and then some upbeat instructions.

In theory this should be nirvana. No more vibe coding! Everyone is a power user. Zero dependencies. But there will be much weeping.


> In theory this should be nirvana. No more vibe coding! Everyone is a power user. Zero dependencies. But there will be much weeping.

If I had to sum up the zeitgeist of '90s techno-optimism, it would be this persistent, confident prediction that once people just learned _how_ to use computers, and everyone became a power user, everything would be fine! Despite the mounting evidence that actually, no, like everything else in reality, the distribution of skill is a bell curve with the median sitting uncomfortably low for those who, to quote OP, "lived on IRC or in the bash terminal".

Free universal education didn't fix this problem, LLMs won't fix this problem. Man's natural paucity is no longer in the availability or accessibility of knowledge. The liberal ideal that all we must do is empower the individual turns out to not have been the solution to everything forever.

But hey, being self-aware enough to make productive use of this new technology is probably _some_ kind of edge.

May as many as possible survive.


> '90s techno-optimism

Yep: It's a cheap tiny factory you control that's big enough for your unique priorities! Every person would own the means of production!

I don't think the assumption was that everyone would write software, but that they'd at least they'd choose in a way that favored future independence of choice. It's depressing to see the arc from "tool that grants autonomy" to "tool for delivery of product from megacorp."


This has been done on purpose, look at the way governments all over pushed children to do STEM (or pretend to do STEM). The plan was never to get millions more genuine enthusiasts, this is impossible. It's just to create good enough workers.

It's a method that worked out for the companies who needed it. There are still people like you; they're just 'hidden'. This isn't unique to the tech industry: the devaluing of workers happens everywhere. Look at the Luddites, or those working in older industries in America and Europe. At some point government and industry believed those workers had too much bargaining power, or that there was a labour shortage, or that it made economic sense to replace them with workers in Korea, China, and so on. It's the same thing; in the tech industry it just kept growing within the US.


There is a lens through which this looks like assembly programmers bemoaning the advent of the compiler, and all these posers who only know high level languages and have no idea what’s going on inside the machine, have less control over memory and other low-level stuff.

Not saying it’s wrong, but plenty of folks successfully program in high level languages, never knowing a single ISA at all.


people will accuse you of "gatekeeping" because you shouldn't need to have any knowledge or skill to do stuff. those things are unimportant, even bad, because anything requiring those is inherently exclusionary. lmao.

> There's a lot going on upstairs, higher mind stuff. I am older now, and I no longer experience this phenomena. Have I lost it to age, or have I integrated it somehow into my conscious mind?

It's similar to what Jaynes described in his "bicameral mind." Man of antiquity "heard" disembodied wisdom dispensed to him, seemingly at random, from an incorporeal source: "gods." Today we simply regard such pseudo-auditory phenomena as "thought," which may throw light on the Cartesian-style equation of "the soul" with "the mind," and of enduring mathematical truths with divinity.

Following the Bronze Age collapse and the "breakdown of the bicameral mind," human culture is replete with examples of people trying to hear the voices of gods, who were now being crowded out by the conscious, egoic, individualistic mental chatter of the newly developed default mode network - the crying out of the Psalms, elaborate rituals and procedures for invoking divine inspiration in the oracles, various forms of divination, augury, etc.

Tarot, properly understood, is not a means for divining the future, but a debugger or reverse engineering tool for probing the internal psychological state of the querent, and hopefully coaxing out these moments of unconscious, unbidden inspiration.

Much of modern esotericism is about trying to steer the brain into states of mind where these vestigial, intuitive, subconscious, nonlinear, pattern matching, Kahneman System 1 facilities of thinking, become once again accessible to conscious prompting and dialogue. Jaynes calls this "the induction," the Romans called it "the genius," Thelemites know it as "the knowledge & conversation," and it may be most broadly described as "union with God."


I view our existence as something like a fractal.

World history is a scrambled mess of lies and amnesia (from repeated collective concussions, heh). Who knows what is truth and what is the victor writing the history books?

One's life is untraceable - how did we get here? Literally too much went into that story, majority unseen, and none of us can fully say.

And so at the personal level, are thoughts borne out of a chain (or DAG??) of memories that cannot ever be fully traced?

Was my homunculus voice who gave me detailed clues/answers just returning the highest probable solution gleaned from thousands of simulations in the problem space I presented? Of course I should not be privy to such musings, I wouldn't have the patience for it - so it seems to me to be "out of nowhere".

I do sometimes wonder though with all my weird experiences if I am merely the "doer in the body" whereas I have a higher self who is the real "thinker" running things in the background and who has access to the big picture.


> I do sometimes wonder though with all my weird experiences if I am merely the "doer in the body" whereas I have a higher self who is the real "thinker" running things in the background and who has access to the big picture.

Yes, precisely.

There is a classic initiatory text in the Thelemic tradition, Liber LXV, that personifies these different parts of the self. The "doer in the body" is the scribe that wrote the work, which is a dialogue in the scribe's mind between his egoic awareness (V.V.V.V.V, the namesake of the titular character from V for Vendetta) and the background "thinker," Adonai.

There is a lot of vocabulary in this space used to describe the self at very fine levels of detail.


Can I ask, and this is not judgement but anthropological curiosity, did you recently decide (or were you recently forced) to leave tech?

I interview people about this kind of thing and have noticed a trend.


Fascinating, thank you for wisdom and references!

Shoutout for Jaynes! I used to call it my "buck twenty-five" book because if anyone ever tried to get pretentious with me I'd steer the conversation to an opening to bring up "The Origin of Consciousness in the Breakdown of the Bicameral Mind" and shut them down :-)

Also, I got my copy signed by Jaynes back in the '80s


> The reality in the way information is used, I believe, is the opposite from what we think of. We believe that if there is sufficient information, we can use it to form an accurate model of reality.

You should read Yuval Noah Harari's Nexus. He calls this "the naive view of information," which is ignorant of the existence of what he astutely identifies as "intersubjective" realities (see also Angela Cooper-White's entry on "intersubjectivism" in The Encyclopedia of Psychology and Religion):

    [...] its deepest and most complex
    usage is related to the postmodern
    philosophical concept of
    constructivism or, in social
    psychology, social constructionism –
    the notion that reality is co-
    constructed by participants in a
    relationship and in society.
This is the endgame of postmodernist and constructivist thinking that exalts narrative and story as the ground source of truth. In some ways what we are seeing is a return to religious and superstitious thinking where sufficient belief in a dogma or a pantheon is enough to reify those narratives into consensus reality.

Historically Jungian psychology and indeed religion (a form of proto-psychology, from which Jung inherits by way of the alchemical tradition; see Jung's Psychology and Alchemy) was humanity's collective storehouse of wisdom and techniques for managing intersubjective realities and group "information hygiene." Such techniques are now being lost to antiquity with the late 20th and 21st century focus on only objectively verifiable, quantitative measurements (as opposed to the private subjective, qualitative phenomena experienced as the inner ruminations, contemplations, and dream life of the individual).

    White Rose: Do you ever think
    that if you imagined or
    believed in something, it
    could come true... Simply by
    will?

    Angela: Yes. Actually, I did
    believe that. But I'm slowly
    having to admit that's just
    not the real world... Even if
    I want it to be.

    White Rose: Well, I guess it
    all depends on what your
    definition of real is.
https://vimeo.com/387207936

> This is the endgame of postmodernist and constructivist thinking that exalts narrative and story as the ground source of truth. In some ways what we are seeing is a return to religious and superstitious thinking where sufficient belief in a dogma or a pantheon is enough to reify those narratives into consensus reality.

I suspect the mistake here is imagining a past era in which humanity formed "consensus reality" out of evidence and reason. It can certainly appear that way to us today due to some super-strong publication bias effects since the Enlightenment era. But I think we can add this to the list of our poorly-grounded narratives.

There has never been a prior time in which a greater percentage of humanity had the means and the inclination to build a well-founded knowledge base and use it to critically assess incoming information.


> I suspect the mistake here is imagining a past era in which humanity formed "consensus reality" out of evidence and reason. It can certainly appear that way to us today due to some super-strong publication bias effects since the Enlightenment era.

At least from the Newtonian perspective, reality definitely unfolds either one way or the other, and it's not a matter of opinion.

> There has never been a prior time in which a greater percentage of humanity had the means and the inclination to build a well-founded knowledge base and use it to critically assess incoming information.

This is definitionally Harari's naive view of information, which "says that information leads to truth, and knowing the truth helps people to gain both power and wisdom." You miss the point of the root comment.


> from the Newtonian perspective, reality definitely unfolds either one way or the other, and it's not a matter of opinion.

You won't hear me claim otherwise.

> This is definitionally Harari's naive view of information

That seems unlikely to me as I didn't say anything about "power" or "wisdom".

