
Since most devs won't actually deal with fintech (I don't know the stats on HN, but I'm talking about devs as one industry), your first "a" example might actually be better than your first "b" example, depending on the complexity of the software. In lots (probably most) of industries, having a "good codebase" means the architecture decisions were solid, even if the domain/service layer is bad. Maybe my experiences don't match most of the HN crowd, but usually I get handed very detailed domain/service rules, while the architecture is the problem: too much memory or CPU being used just to abstract away the actual rules of the application (its purpose). Usually when I've been brought in to rebuild an application, the client is fine with the functionality, but upset over performance and/or the cost to run the application. For anything of actual complexity, it's usually the supporting code that is the biggest failure, because complex apps usually have decent requirements. Now, if the requirements were bad, the architecture was bad, AND the domain/service layer is bad, I don't know if there's anything that can fix that.

I mean, hasn't it learned from reading others' code? I don't think it can be any better than the common patterns and practices it was trained on. Some outlier of amazing code is probably not going to make much of a difference, unless I'm completely misunderstanding LLMs (which I very well may be, and I'd gladly take any criticism on my take here).

Yes, but that's a low bar, no? I mean, when they first started talking about AI, were you envisioning a mediocre AI that is just average, or were you imagining an expert?

To me, it instead sounds like you care about the code you produce. You judge it more harshly than you probably judge other code. It sounds like you are also meeting deadlines, so I'd call that a success, and more than what a lot of people manage to put out into the world.

I often have a lot of time between projects, and am able to really think about things and write code that I'm happy with. Even then, I do some more research, or work on another project, and immediately I'm picking apart sections of the code I really took the time to "get right." Sometimes it can be worse when you're given vast amounts of time to build your solution, because a deadline would have forced decisions you were instead able to put off. At least that's my perspective on it: if you love writing software, you're going to keep improving nearly constantly, and you'll look back at what you've done and be able to pick it apart.

To keep myself from getting too distressed when looking at past code now, I tend to look at the overall architecture and success of the project (in terms of performing what it was supposed to, not necessarily monetarily). If I see a piece of code that I feel could have been written far better, I look at how it fits into the rest. I tend to work on very small teams, often making architecture decisions that touch large areas of the code, so this may just be the perspective of someone who hasn't worked on a large team. I still think that if you care about your craft, you will be harsh on yourself, more than you deserve.


Fresh out of college, I had an interview for a job working with COBOL. There were classes being held to teach people development, as well as how to maintain existing COBOL software. It came down to myself and one other recruit, and that recruit already had COBOL experience. Naturally, I was passed over for someone who already had knowledge of the language being used.

Although I probably don't make nearly as much as that COBOL developer over 20 years later, I would be willing to bet that I am happier, and I haven't locked myself into a specific technology the way that developer probably has. Money is great, but if you actually care about what you do, I expect that being stuck on the same codebase for years isn't too satisfying (at least on code you didn't have a hand in creating from the very start). Too many people translate money into happiness, and I guess there is a balance there, but it's usually not possible to sustain happiness on money alone when you do the same thing day in and day out.


I think a large number of people forget the trust that companies have built in IBM over decades. The mainframe market is IBM's, and has been for a long time. People want to believe that dropping such a large company could be done with a rewrite, but as long as IBM is there to support what they already have in place, companies are unlikely to move away. Obviously a team that has experience moving from IBM technology to something more "modern" could go with another platform running on different hardware, but you don't hear about those migrations much because they are rare, and for a reason: IBM offers support that companies love to cling to.

I don't blame companies that are already tied up in IBM tech for sticking with what they have. As boring and dated as IBM tech might be, it's still running a ton of infrastructure, and you don't get to be that kind of company without being solid and reliable. That's what companies want, even if a development team wants to flex their skills on something new and not tied to IBM.


When I was younger, I went into some places that I shouldn't have (legally speaking). For me, something scarier is walking through offices and tunnels that look completely deserted, and then coming upon an office or room where it's clear someone has recently been there. Whether it was someone homeless looking for a place to stay, or an employee who's still on the payroll, both would freak me out far more than a completely empty space.

I started House of Leaves a couple of times, but I always end up spending more time online than reading fiction. I need to actually sit down and read it. I used to read lots of fiction before my addiction to technology. Every five years or so, I go back and read Herman Hesse's Steppenwolf, but that's a fairly small book to get through. I bought the version of House of Leaves that has the color coding and bizarre layout (not sure if that was always in every edition). I suppose I spend enough time watching horror movies and playing odd games that I could afford the time to read House of Leaves.

I do watch a lot of amateur found-footage films on YouTube, along with analog horror. I remember when The Blair Witch Project first came out; it reminded me of strange dreams and nightmares I've had, and I think that's part of where the attraction to liminal spaces comes from. It's something humans can relate to, but it's hard to put a specific label on the feeling you get when consuming this type of content.


House of Leaves is meant to be read with the weird layout and coloring (e.g. the word "house" always in blue).

It's a fun book! I first read it as a kid at my grandparents' little condo in Mexico. I read some on the plane there and back, but the most scared I got was when I read one particular scene alone at night in the kitchen while everyone else was asleep. I think it's the only time my heart rate has ever truly jumped like that while reading a book.


My copy states that the coloring is in the newest version. I suppose I should go back and take a look at the copyright page, in case mine is the newest edition, although it's fairly old.

I enjoy the general feeling of the book, like claustrophobia inside something that grows infinitely. That's the best way I can describe it, but I haven't gotten more than a quarter of the way through. I do need to go back and fully read it.


It's more to do with the standard library being so barren of common application needs, and wanting a solution the community has gotten behind. Axios has been a common dependency in many codebases because it's a solid solution that many have already used. Every developer could try building all the libraries they reach for themselves, but then each company has taken on the task of ensuring their own (much larger) codebase is free from security issues, on top of fixing their own bugs.
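To make that concrete, here's a minimal sketch (a hypothetical helper of my own, assuming a modern runtime with a global fetch) of what replacing even one axios.get call looks like once you want JSON parsing, a timeout, and an error on non-2xx responses:

    // One line with axios: const { data } = await axios.get(url, { timeout: 5000 });
    // A hand-rolled stand-in (hypothetical) built on fetch:
    async function getJson(url: string, timeoutMs = 5000): Promise<unknown> {
      const controller = new AbortController();
      const timer = setTimeout(() => controller.abort(), timeoutMs);
      try {
        const res = await fetch(url, { signal: controller.signal });
        if (!res.ok) throw new Error(`HTTP ${res.status} for ${url}`); // axios rejects here too
        return await res.json();
      } finally {
        clearTimeout(timer); // don't leave the timer running after success
      }
    }

Axios gives you that behavior out of the box, plus things like interceptors, in one dependency that thousands of other teams are also exercising and auditing.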

Isn't the issue that a lot of these devices have vulnerabilities and aren't updated often enough, rather than the devices being of Chinese origin? Look at hardware for the home market: most of it hasn't received an update in years, if not a decade. Widely deployed hardware running out-of-date software seems like it's only a script away from being exploited, something crawling home IP address space with the equivalent of a Metasploit module, no?

Maybe I'm misunderstanding the link to Chinese vs. non-Chinese router vendors?


Sometimes a client that isn't too difficult is worth keeping if they come at you with projects that expand your knowledge.

Squid caching takes me back. I was dealing with the network for a large car dealership (2006), and they were having issues with pages appearing out of date, as well as salespeople who couldn't stop themselves from looking at adult websites. I had to figure out the entire network (it was put in place before I ever showed up to provide support), which included both the physical and software layers. Not only was I on ladders in the service area with a network tone device (for those who don't know: you connect a generator to a cable that pushes a tone down the line, then run a probe along the wires at the other end and listen for that tone to find the correct one), but I also had to figure out a server running a Squid cache that stood in front of everything.

Eventually I got all the devices traced from origin to their patch cables in the server room, and I started looking into the Squid cache. It turned out they were caching everything, as well as blocking websites. I figured out which websites they needed to do their jobs and turned off caching for those, while also learning the ACLs used for blocking websites. Anything else was allowed, but the Squid cache would hold a copy for some set amount of time (I think it was 24 hours, so if a page was legitimate they only had to wait a day, and it saved quite a bit of bandwidth, although I think it was used more to monitor user activity).
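For anyone curious, that setup roughly boils down to a handful of directives. This is just a sketch in modern Squid syntax, with hypothetical ACL names and file paths rather than the dealership's actual config:

    # Hypothetical ACL names and file paths.
    acl localnet src 192.168.0.0/16
    acl blocked_sites dstdomain "/etc/squid/blocked_sites.txt"
    acl fresh_sites dstdomain "/etc/squid/fresh_sites.txt"

    # Block the problem sites for everyone, allow the LAN, deny the rest.
    http_access deny blocked_sites
    http_access allow localnet
    http_access deny all

    # Never serve the must-be-current business sites from cache.
    cache deny fresh_sites

    # Cache everything else for up to a day (values are in minutes).
    refresh_pattern . 0 20% 1440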

It was frustrating as someone new to large LANs, as well as to in-house caching, but I had been using Linux since an early version of Slackware in the late 1990s. Even to this day, as someone who writes software and does DevOps, that knowledge has helped my debugging skills tremendously. Dealing with caching is a skill I feel you need to be burned by in order to finally understand it and recognize when it's occurring. I cut my teeth on Linux through a teacher who set up a web server in 1997, and not only gave students access to upload their web files, but also a terminal to message each other and see who was online.

