Hacker News new | past | comments | ask | show | jobs | submit | BirAdam's comments

This is a great idea. I do play Civ1 on my XT-class machine (NEC V20 @ 10MHz, 1MB RAM, 64MB IDE, 256K Trident VGA, NE2000, AdLib), but the turn times are horrendous, as this is a 1991 game being run on a 1982 CPU. Realistically, most people would have been playing on either a 286 or a 386. With the game available on modern hardware, I imagine it'd be far more enjoyable. I'll give it a go.

Emulators are ok

Now, if only macOS still had the ability to drop to a Darwin shell without a GUI at all… we could just have a nice UNIX with something like KDE or COSMIC, brew as our package manager… what a dream.

But why macOS then? If you take away the interface, what differentiates Darwin from FreeBSD or GNU?

That it will actually run on Apple Silicon.

TBH, I would love to install GNU or BSD on my M4 Max Mac Studio. What I really want is a modern UNIX workstation. My Studio's price/performance was the best available, so that's what I bought. I am happy with that purchase except for the constant diminution in software quality from Apple.


If I could buy modern apple hardware and run the Linux of my choice, I probably would. I have 0 interest in the apple ecosystem.

I wish I could get an Apple SoC in a 2013 Thinkpad chassis.

Performance on apple chipsets!

Yeah, why stick to the inferior kernel used by Macs, with a worse package manager? Something like Nix is just superior in every sense.

On intel macs there used to be single user mode, but even then I don't think you ever had control over the framebuffer.

Reading the comments, I don’t think people are being completely fair here. To approach what Apple has accomplished, Intel and AMD are making many of the same compromises with Panther Lake and Ryzen AI Max. Apple chose to put disk controllers on their SoP rather than having them on the storage module. This shaves a tiny bit of latency. Worth it? No idea. I’m shit at hardware design.

As for not having a Pro or otherwise expandable system? It’s shit. They make several variations of their chips, and I don’t think it would hurt them to make an SoP for a socket, put a giant cooling system on it, and give it 10 or 12 PCIe slots. As for what would go in those slots? Make this beast rack mountable and people would toss in better network cards, sound/video output or capture, storage controllers, and all kinds of other things. A key here would be to not charge so much just because they can. Make the price reasonable.


They have tried variations of this since time immemorial (we can argue about "price reasonable"), but there's just not much you can do with it that you can't do much cheaper or more simply in other ways.

The Xserve has been dead for 15 years now, and it was never tremendously amazing (though it was nice kit).

Apple apparently has some sort of "in-house" Xserve-like thing they don't sell, but turning that into a product would likely be more useful than a Mac Pro, unless they add NUMA or some other way of allowing an M5 to access racks and racks of DIMMs.


The WSJ recently did a piece about it, but details were rather light.

Jobs worked on NeXT and Jean-Louis Gassée was working on Be. Gassée had brought the world the Macintosh Portable and the IIfx, and he started the Newton project, which had the effect of keeping ARM alive.

When Gassée left Apple, he took many of Apple's best with him. If we want to know what Apple would have looked like under Gassée, I think it's easiest to look at how many products he killed. Much of Apple's leadership was pushing for budget computers like those the PC industry was building. Gassée would have none of it. He was focused on exceptionally good hardware married to exceptionally good software, knew handheld devices would be vital in the future, and he didn't like boring things. I imagine that an Apple built around Be would have delivered many of the same things, but wouldn't have become just plain brushed aluminum everywhere.

The curious part would have been the OS. BeOS and NeXTSTEP are wildly different.


Sway.

I was a Palm guy, not a Blackberry guy, so I went from a Palm Treo to webOS. After that, though, I went to iPhone. I did consider Windows Phone. The tiles and text orientation were so amazing. I am, however, glad that I never went down that road, not just because Windows Phone died, but also seeing what has happened to Windows more recently.

webOS is still around -- sorta! https://www.webosarchive.org

I use webOS every day (LG television)

Recently?

Windows was first released in 1985. Windows 10 and 11 are therefore "recent".

Yeah, make /home, /var/log, and /usr/local rw and everything else ro. That makes a great "immutable" system that's not as annoying as truly "immutable" systems.
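As a sketch, on a classic Linux system that split could be expressed in /etc/fstab (the device names and filesystem types below are illustrative placeholders, not a specific distro's layout):

```
# /etc/fstab — read-only root with a few writable mount points
# (devices and fs types are placeholder assumptions)
/dev/sda2   /           ext4   ro,defaults   0 1
/dev/sda3   /home       ext4   rw,defaults   0 2
/dev/sda4   /var/log    ext4   rw,defaults   0 2
/dev/sda5   /usr/local  ext4   rw,defaults   0 2
```

For updates, the root can be flipped temporarily with `mount -o remount,rw /` and remounted read-only afterwards, which is what makes this less annoying than a fully sealed image-based system.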

The speed on a constrained device isn't entirely the point. Two years ago, LLMs failed at answering coherently. Now...

You're absolutely right. Now, LLMs are too slow to be useful on handheld devices, and the future of LLMs is brighter than ever.

LLMs can be useful, but quite often the responses are about as painful as LinkedIn posts. Will they get better? Maybe. Will they get worse? Maybe.


> Will they get better? Maybe. Will they get worse? Maybe.

I find it hard to understand your uncertainty; how could they not keep getting better when we've been seeing qualitative improvements literally every second week for months on end? These improvements are eminently public and apply across multiple relevant dimensions: raw inference speed (https://github.com/ggml-org/llama.cpp/releases), external-facing capabilities (https://github.com/open-webui/open-webui/releases), and performance against established benchmarks (https://unsloth.ai/docs/models/qwen3.5/gguf-benchmarks).


There are many metrics for “better” and “worse”. It is entirely possible for an AI system to hallucinate less while also being of less utility. An arrogant prick who’s always correct isn’t always a good person to have on your team, right?

I wish this were available for more browsers...

I've never understood having an office when it isn't absolutely required. Why spend money on something you do not actually need?

If people can't make remote collaboration work, perhaps they should study how gaming groups achieve this.


> I've never understood having an office when it isn't absolutely required

A number of jurisdictions require some amount of office usage for subsidies, it's harder for managers to justify not offshoring if everyone is 100% WFH, and some employees just suck (e.g. overemployment, exfiltrating data, quiet quitting).

