Hacker News | new | past | comments | ask | show | jobs | submit | rafram's comments | login

This is incredibly software engineer-brained. The law doesn't work like software. The only thing that matters is how the judiciary interprets the text, and if you try to use LLM "test" output to argue for a specific interpretation, you'll be laughed out of court.

Everyone in government knows what Track Changes is. The standard format of a piece of legislation in British-influenced systems is a diff. The tech field does not have secret knowledge that the rest of humanity lacks.

I was thinking the same thing. I feel like people in software tech always think they have secret powers that experts in other fields can't imagine.

This isn’t a part of macOS 26 that bothers me, honestly. I don’t spend a lot of time stacking windows and measuring their corners.

In other words, macOS is fine as long as you're undiscerning and not at all detail-oriented. Imagine telling Steve Jobs that this was the prevailing attitude needed to make using a Mac bearable.

These inconsistent corner radii are actually intentional, FWIW - the radius depends on the window’s function (main, utility, etc.). I don’t think it looks great, but there’s no lack of attention to detail.

I use an auto-layout tool, so having windows stacked on top of each other is super common for me, and the fact that they all peek through each other (like the screenshot in the blog), looking absolutely terrible, drives me crazy.

To me it's a little like the situation with charging the Magic Mouse. It's become a meme to post a picture of it on its side being charged, but if you own one it doesn't really matter, as you charge it once a month for 15 minutes while you're at lunch.

There are things which definitely do bother me, like the Liquid Glass, but the window corners really don't. And I'm into design and constantly inspect parts of the UI with the Digital Color Meter app.


You have a truly Magic Mouse if yours charges in 15 minutes. In my experience, it takes hours to charge from zero, which, until I put an always-running battery monitor in the menu bar, is the state you're guaranteed to end up in, since there is no other indicator of the mouse's battery level.

I used to roll my eyes at the complaints until I actually had one of these, and it is appallingly bad engineering. Especially since the previous design, which was functionally identical, just needed a 10-second battery swap.


I truly hate it, so so much. Mentally I'm already planning out what OS I'm going to migrate to.

Try both GNOME and KDE if you come over to Linux, and remember that on KDE you can customize anything you don't like.

I don't either; the only thing that annoys me is that it's much harder to resize windows, so the usability is worse.

But where are you going to find an Nvidia GPU with 128+ GB of memory at an enthusiast-compatible price?

You don’t need it if you use llama.cpp on Windows, or if you compile it on Linux with CUDA 13 and the correct kernel HMM support, and you’re only using MoE models (which, tbh, you should be doing anyway).

What does MoE have to do with it? Aside from Flash-MoE, which supports exactly one model and only on macOS, you still need to load the entire model into memory. You also don't know which experts are going to be activated, so it's not like you can predict which ones need to be loaded.

With proper mmap support you don't really need the entire model in memory. It can be streamed from a fast SSD, and this is especially useful for MoE models, where not all expert layers are uniformly used. Of course, the more data you stream from the SSD, the slower this is; caching hot data in RAM is still relevant to good performance.
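For anyone curious how this works, a minimal sketch using Python's stdlib mmap (the file name and sizes here are made up for illustration; real weight files like GGUF are multi-GB, and llama.cpp does something along these lines in C):

```python
import mmap
import os
import tempfile

# Write a dummy "weights" file. In reality this would be a multi-GB
# model file; 1 MB here is just for illustration.
path = os.path.join(tempfile.mkdtemp(), "weights.bin")
with open(path, "wb") as f:
    f.write(b"\x00" * 1024 * 1024)

with open(path, "rb") as f:
    # Map the whole file without reading it up front: the OS pages data
    # in lazily when a region is actually touched, and can evict cold
    # pages under memory pressure. This is why sparsely activated MoE
    # experts can stream tolerably well from a fast SSD.
    mm = mmap.mmap(f.fileno(), 0, access=mmap.ACCESS_READ)
    expert_slice = mm[4096:8192]  # touch only one "expert's" region
    print(len(expert_slice))      # 4096
    mm.close()
```

The key point is that the mapping costs address space, not RAM; only the pages you slice into get faulted in, and the page cache naturally keeps the hot experts resident.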

Some Chinese sources sell modded Nvidia GPUs with extra VRAM. They're quite affordable in comparison to even a Mac Pro.

Any links to them? Never heard of this...

It’s been going on for a while. Search YouTube or the web for "48GB 4090" (this is one of the most popular modded Nvidia cards); Nvidia of course never officially made a 4090 with this much memory.

There are some on sale via eBay right now. The memory controllers on some Nvidia GPUs support well beyond the 16-24GB they shipped with as standard, and enterprising folks in China desolder the original memory chips and fit higher-capacity ones.


I've seen a guy who sells modded 2080 Tis with 22GB for $500.

https://www.tomshardware.com/pc-components/gpus/chinese-work...

There's also unreleased Nvidia engineering samples of cards with doubled VRAM like this - https://www.reddit.com/r/nvidia/comments/1rczghu/update_unre...


Go on eBay and search for "RTX 4090 48GB". There are plenty of them at prices around $3.5k.

And how much do you trust Chinese hardware?

Given that most of mine, and probably yours, and probably most of the world's computers are in fact made in China one way or another, some to a higher percentage than others, I'm guessing most of us trust our hardware enough to continue using it.

True. I was specifically referring to "modded Chinese hardware" from some unknown, unvetted third party versus say through a well-known brand that hopefully has its own rigorous QA and security processes in place.

At this point I trust them more than US or Israeli tech

When there's no one left to trust, maybe you need to re-evaluate your criteria.

I wouldn't say that's true or even likely. It's completely possible to be in a pit of vipers where every single snake is venomous, and that is pretty much what we are seeing: With technological advances, there is a certain subset of people that will use them primarily to solidify their power and control over others. There is no utopian society right now whose government doesn't look to spy through technology, which of course is best set up at time of manufacture.

Agreed. Unless you have full control over the production chain to fully produce a device, you are subject to the whims and desires of those who preside over such technological feats that we take for granted in our daily lives.

To the original point, it's safe to say that highlighting a nationality with regards to trust is baseless and without merit, as would be for any other topic (men/women from x are y, z food is better here, etc..). Real life is much more complicated and nuanced past nationalities. Some might call it FUD (fear, uncertainty and doubt) but there's always a deeper rationale at the individual level as well.


Rather than people being wary of Chinese in general, it's more that there is a high degree of government control exercised in China and they are known to be very strategic with long-term planning in regards to technology control both for spying and actual remote control of devices. We are all just looking for the least bad option. It's not like devices from other countries are immune, but they are often less organized so there is a better chance of avoiding the Chinese level of planned access.

It does seem like pretty low risk in this specific case, so I agree OP's comment was a bit over the top, but I would have no way to make anything resembling even an educated guess as to how far their programs go.


Yes, this is really what I was referring to. And the fact that the original comment I was replying to mentioned "modded Chinese hardware" from some unspecified, unvetted 3rd party which doesn't exactly fill me with confidence.

The Mac is also Chinese hardware.

It would be hilarious if you are using a Lenovo device right now.

I mean it's pretty funny that probably 90% of the things in our homes are made in China.

That might even be true, but how large is the TAM for such machines?

to say nothing of competing on energy consumption!

The Nvidia DGX Spark is exactly this and in the same price and performance bracket.

Sadly, memory bandwidth is abysmal compared to Apple chips - 273 GB/s vs 614 GB/s on the M5 Max for a similar price. Even though fp4 compute is faster, it doesn't help with all the decode-heavy agentic workflows.

You can still buy used 3090 cards on ebay. 5 of them will give you 120GB of memory and will blow away any mac in terms of performance on LLM workloads. They have gone up in price lately and are now about $1100 each, but at one point they were $700-800 each.

I don't see how 5x 3090s is a better option than an M3 Ultra Mac Studio.

The Mac will just work for models as large as 100B, and can go higher with quantized models. And power draw will be 1/5th as much as the 3090 setup.

You can certainly daisy-chain several 3090s together, but it doesn't work seamlessly.


> You can certainly daisy chain several 3090's together

It's not "daisy chaining" 3090 has NVLink.


FWIW I have never used NVLink, and I’m not sure why people are bringing up “daisy chaining” because as far as I’m aware that is not a thing with modern GPUs at all.

Really? How would you NVLink more than two 3090s?

> The mac will just work for models as large as 100B, can go higher with quantized models. And power draw will be 1/5th as much as the 3090 setup.

This setup will work for 100B models as well. And yes, the Mac will draw less power, but the Nvidia machine will be many times faster. So depending on your specific Mac and your specific Nvidia setup, the performance per watt will be in the same ballpark. And higher absolute performance is certainly a nice perk.

> You can certainly daisy chain several 3090's together but it doesn't work seamlessly.

Citation needed; there's no "daisy chaining" in the setup I describe, and low-level libraries like PyTorch as well as higher-level tools like Ollama all seamlessly support multiple GPUs.
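To make the point concrete, here's a toy sketch of the pipeline-style split the tooling does for you: each layer lives on its own device and activations hop between them. The layer sizes are made up, and the "cpu" fallback is just so the sketch runs anywhere; with real GPUs the device list would be ["cuda:0", "cuda:1", ...]:

```python
import torch
import torch.nn as nn

# Use real GPUs if present, otherwise pretend with two CPU "devices".
devices = [f"cuda:{i}" for i in range(torch.cuda.device_count())] or ["cpu", "cpu"]
if len(devices) == 1:
    devices = devices * 2

# Two toy layers, each placed on a (possibly different) device.
layers = [
    nn.Linear(64, 64).to(devices[0]),
    nn.Linear(64, 8).to(devices[-1]),
]

def forward(x):
    for layer in layers:
        # Move the activation to whatever device holds this layer's weights.
        x = x.to(next(layer.parameters()).device)
        x = layer(x)
    return x

out = forward(torch.randn(4, 64))
print(out.shape)  # torch.Size([4, 8])
```

This is all tools like Ollama or llama.cpp's multi-GPU split are doing under the hood for inference: weights sharded across cards over PCIe, activations shuttled between them. No NVLink or "daisy chaining" required, just slower inter-card transfers than a single big card would have.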


I think it's bad form to say "citation needed" when your original claim didn't include citations.

Regardless - there's a difference between training and inference. And PyTorch doesn't magically make 5 GPUs behave like 1 GPU.


> I think it's bad form to say "citation needed" when your original claim didn't include citations.

I apologize, but using multiple GPUs for inference (without any sort of “daisy chaining”) is something that’s been supported in most LLM tooling for a long time.

> Regardless - there's a difference between training and inference.

No one brought up training vs. inference to my knowledge, besides you — I was assuming the machine was for inference, because my experience building a machine like the one I described was in order to do inference. If you want to train models, I know less about that, but I’m pretty sure the tooling does easily support multiple GPUs.

> And pytorch doesn't magically make 5 gpus behave like 1 gpu.

I never said it was magic, I just said it was supported, which it is.


How much does it cost to have an electrician wire up a 240V circuit just to power the thing?

The machine I’m describing works just fine on a dedicated 15A 120V circuit.

5x3090 in 1600W?

1800W is the max on a 15A circuit, but yes, it’s usually under 1600W. For LLM inference, limiting the TDP to 225W or so per card saves a lot of power, for a 5% drop in performance.

Where are you gonna find Apple hardware with 128GB of memory at an enthusiast-compatible price?

The cheapest Apple desktop with 128GB of memory shows up as costing $3499 for me, which isn't very "enthusiast-compatible"; it's about 3x the minimum salary in my country!


Apple is not catering to minimum salaries in poor countries. Does this really need to be explained?

$3499 is definitely enthusiast compatible. That's beefy gaming PC tier, which is possibly the canonical example of an enthusiast market.

This isn't tens of thousands of dollars for top tier Nvidia chips we're talking about.


Seems I misunderstood what an "enthusiast" is. I thought it was someone "excited about something", but it seems the typical definition includes having a lot of money too. My bad.

I'm an immigrant to Canada, and yes, English has both literal meanings and colloquial meanings.

In the most literal meaning, absolutely, "Enthusiast" just means a person who likes something, is excited about something.

When it comes to markets and products though, typically you'll see the word "Enthusiast" as a mid-tier - something like: Consumer --> Enthusiast --> Professional (with words like "Prosumer" in there as well, etc.)

In that context, which is typically the one people will use when discussing product pricing and placement, an "Enthusiast" is somebody who, yes, enjoys something, but engages with it enough to be discerning and willing to purchase mid-tier or above hardware.

So while a consumer photographer may use their phone or a compact or all-in-one camera, an enthusiast photographer will probably spend $3000 - $5000 on camera gear. Equivalently, there are myriad gamers out there (on phones, consoles, GeForce Now, whatever :), but an enthusiast gamer is assumed to have a dedicated gaming computer, probably a tower, with a dedicated video card, likely say a 5070 Ti or above, probably 32GB+ RAM, a couple of SSDs which are not entry-level, etc.

Again, this is not to say a person with a limited budget is "not a real enthusiast", no gatekeeping is intended here; simply, if it helps, this is what the word means when it comes to market segmentation and product pricing :)


Additionally, "enthusiasts"/"hobbyists" tend to be willing to spend beyond practical utility, while professionals are more interested in pragmatism, especially in photography from what I can tell.

If you're an actual pro, you need your stuff to work properly, efficiently, reliably, when it's called for. When you're a hobbyist, it's sometimes almost the goal to waste money and time on stuff that really doesn't matter beyond your interest in it; working on the thing is the point, not the value it generates. Pros should spend money on good tools and research and knowledge, but it usually needs to be an investment, sometimes crossing over with hobbyist opinions.

A friend of mine who's a computer hobbyist and retail IT tech, making far far less than I do, spends comically more than me on hardware to play basically one game. He keeps up to date with the latest processors and all that stuff, he knows hardware in terms of gaming. I meanwhile—despite having more money available—have a fairly budget gaming PC that I did build myself, but contains entirely old/used components, some of which he just needed to get rid of and gave me for free, and I upgrade my main mac every 5 years or something. I only upgrade when hardware is really getting in my way.


>> So while a consumer photographer, may use their phone or compact or all-in-one camera, enthusiast photographer will probably spend $3000 - $5000 in camera gear.

It's interesting that you chose photographers as the example here. In many cases that I've seen, enthusiast photographers spend much more than professional photographers on their gear, because the professionals make their money with their gear and therefore need to justify it, while the enthusiasts are often tech people, successful doctors, etc., who spend lots and lots of money on their hobbies...

In any case, your point stands, that "enthusiast" computer users would easily spend $3-4K or more on gear to play games, train models, etc.


$3.5k is a lot of money, but not a ton by American hobby standards. It's easy to spend multiples, even orders of magnitude more than that on hobbies like fishing, wine, sports tickets, concerts, scuba, travel, being a foodie, golf, marathons, collectibles, etc.

It's out of reach for lots of people, even in developed countries. But it's easily within reach for loads of people that care more about computing than other stuff.


I live in America, I am very well compensated. Have been for 15 years now. $3500 is a lot of money. A lot. There is a tiny bubble of us tech folks who think it is accessible to most people. It is not. It is also the same reason Macs are still a niche. Don't take your circles to be the standard, it is very very far from it, especially if you think $3500 is not a lot of money.

It is easy to confirm this, just look at the sales number of these $3500 devices. It is definitely not an enthusiast price point, even in the US.


It's not nothing for most people... it's more than a month of rent/mortgage for a significant number of Americans even. But if it's your primary hobby, it's not completely out of reach, and it's not something you necessarily spend every year. A lot of people will upgrade to a new computer every 3-5 years and maybe upgrade something in between those complete system upgrades.

I know plenty of people who don't make a lot of money (say top 25% or so) that will have a boat or RV that costs more than a $3500 computer, and balk at the thought of spending that much on a computer. It just depends on where your interests are.


The first words I said: "$3.5k is a lot of money..."

There are tens of millions of top 10% income adults in America. So something can be both unaffordable to most people, and also easily accessible to very many people.


It’s a midrange to upper expense in the US if it’s your hobby. Most people don’t have a serious computer hobby but they golf, trade ATVs, travel, drink, etc.

There are something like 24 million millionaires in the United States... Estimates are that Americans spent $157 billion on pets in 2025.

There are a lot of people who could easily choose to spend $3,500 on a computer.


There is no Apple device priced above $3k that has done 1 million units in annual sales. The US population is >300M, so that's <0.3% of the population. Don't take your bubble to be representative of society. $3500 is a lot of money, even in the US.

$3500 would have been 3–4 months' discretionary spending as a PhD student in Finland 15 years ago. A sum you might choose to spend once a year on something you find genuinely interesting.

Some people succumb to lifestyle creep or choose it deliberately. Others choose to live below their means when their income grows. The latter have a lot more money to spend on extras, or to save if that's what they prefer.


In June 1977, the base Apple II model with 4 KB of RAM was $1,298 (equivalent to about $6,900 in 2025), and with the maximum 48 KB of RAM it was $2,638 (equivalent to about $14,000 in 2025).

(Source: Wikipedia via Claude Opus)


Wow, 48k for $14000. Now you can get a MBP with a million times more memory for $3500 or so. Whereas that CPU was clocked at 1 MHz, so CPUs are only several thousand times faster, maybe something like 30,000 times faster if you can make use of multi-core.
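Sanity-checking those ratios quickly (the 48GB MacBook Pro configuration and a ~4 GHz modern clock are my assumptions for the comparison):

```python
# Memory: 48 KB Apple II vs a 48 GB MacBook Pro.
apple_ii_bytes = 48 * 1024
mbp_bytes = 48 * 1024**3
print(mbp_bytes // apple_ii_bytes)  # 1048576, i.e. ~a million times

# Clock: 1 MHz 6502 vs a ~4 GHz modern core. Only ~4000x by raw clock;
# per-clock work and core counts push the real gap far higher.
print(4_000_000_000 // 1_000_000)  # 4000
```

Which matches the comment: memory scaled roughly a million-fold while raw clock speed scaled only thousands-fold, the rest of the CPU gap coming from IPC and parallelism.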

I'd argue that some of those are more consumption and activity than hobby depending on how they're engaged with, and that people use the word "hobby" too loosely, but would agree that Americans in-particular consume at obscene rates.

Golf equipment, mountaineering equipment, skiing and snowboarding lift tickets and gear, a single excessive graphics card that's only used for increasing frame rates marginally, or basically a single extra feature on a car, are all things that accumulate quite quickly. Some are clearly more superfluous than others and cater to whales, while some are just expensive by nature and aren't attempting to be anything else.


Those are the prices for just buying equipment, which at least retain some kind of value. 3 million+ American kids are enrolled in competitive soccer with annual clubs dues between $1K and $5K, and that money is just gone at the end of the year. Basically none of those kids are going to have a career in soccer, so it's clearly a hobby, and everyone knows it. And soccer isn't even the most popular sport!

Ya, I guess that's another category entirely. The cost of enrolling a kid in anything, potential travel involved etc..

An enthusiast in the hobby space is by definition someone willing to pour in much more money than someone who isn't that enthusiastic about whichever hobby we are talking about.

Well, and also has a bunch of money, not just the willingness. I guess locally we don't really make that distinction, going by two other commenters here, which is why I had to update my local understanding of "enthusiast". Usually we use it for how engaged/interested a person is, regardless of how much money they can or are willing to spend.

Learned something new today at least, so that's cool :)


Yes, when tech gear is sold as 'enthusiast' gear, it is almost invariably the most expensive non-professional tier of equipment. That is roughly the common understanding: Expensive and focused on features more than security required for public use; while remaining within reach of at least some individuals, not only corporations.

In a hobby where there are (strong) HW requirements, it mostly takes for granted you have money to shell out for your hobby, indeed.

For an individual making median income in the US, it would cost 2% of your income to get a machine like this every 4-5 years. That's a matter of enthusiasm, not a matter of having a lot of money. Sorry that income is less where you are, but the people talking about the product tier are using American standards.
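The arithmetic behind that ~2% figure, for the record (the ~$40k median individual income is my rough assumption; exact medians vary by year and source):

```python
# A $3,500 machine amortized over a 4-5 year upgrade cycle,
# as a share of income at roughly the US individual median.
price = 3500
median_income = 40_000  # assumed, per year
years = 4.5

share = price / (median_income * years)
print(round(share * 100, 1))  # 1.9 (percent of income over the cycle)
```

So amortized over the upgrade cycle it's on the order of 2% of income, which is well within what many people spend on a primary hobby, even if the sticker price feels large.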

$1,200 as the minimum salary covers probably 70% of Europe by population?

The Neo has enough power to do small LLM testing and pretty much anything else a bit slowly, and costs $600?

Maybe, but that does not mean that the Mac Studio is not very expensive hardware even for rich first world countries.

Neo tops at 8GB RAM. What LLM are you going to run there? Functiongemma?

It can absolutely do some ML inference on it, but not much in terms of LLMs.


Did you need to add "poor"? Unless Apple isn't catering to the US.

I spent around that on my current personal desktop... 9950X, 2x48GB DDR5-6000, RX 9070 XT, 4TB Gen 5 NVMe + 4TB Gen 4 NVMe. I could have cut the CPU to a 9800X3D and the RAM to 32GB with a different GPU if my needs/usage were different. I'm running Linux and don't game too much.

That said, a higher end gaming setup is going to cost that much and is absolutely in the enthusiast realm. "enthusiast" doesn't mean compatible with "minimum wage"


The original Mac with 128KB of memory cost $2,495 when Apple released it in 1984. It would be about 3x that in today's money.

I came here to say the same. Even with my student discount price of $1000, that's over 3K in today's dollars.

We are so freaking spoiled by the cheap cost of compute now.


> it's about 3x the minimum salary in my country!

Enthusiast compute hardware doesn't cater to the people on the minimum salary in any country, let alone developing nations. When Ferrari makes a car they don't ask themselves if people on minimum salary will be able to afford them.

I'm in one of the two poorest EU member states, and Apple and Microsoft's Xbox don't even bother to have a direct-to-customer store presence here; you buy them from third-party retailers.

Why? Probably because their metrics show people here are too poor to afford their products en masse for it to be worth operating a dedicated sales entity. Even though plenty of people do own top-of-the-line MacBooks here, it's just the wealthy enthusiast niche, and it's still a niche at the volumes they (wish to) operate at. Why do you think Apple launched the Mac Neo?


Right, I think maybe we're then talking about "upper-class enthusiasts" or something in reality then? I understood it to just be about the person, not what economic class they were in; maybe I misunderstood.

Yes, it's a different definition.

Enthusiast in this context more or less means you are excited enough about something to get a level above what normal people would get, just below professional pricing. An enthusiast camera body can be 2000 euros.

I would say an enthusiast computer is 2-4k.

It really depends on what you meant by minimum salary (yearly?), because paying 3 months of salary for a computer like that isn't far-fetched. You're not using this to generate recipes for cookies. An enthusiast-level car is expensive as well.


Being an enthusiast in computer hardware assumes enthusiasm about hardware, not about "hardware on a budget". It doesn't matter if it's affordable or not.

>Right, I think maybe we're then talking about "upper class enthusiasts" or something in reality then?

Why? Enthusiasts are by definition people for whom value for money is not the main driver, but rather top performance and cutting-edge novelty at any cost. Affording enthusiast computer hardware is not a human right, just as affording a Lamborghini or a McMansion isn't.

But you don't need to buy a Lamborghini to do your grocery shopping or drive your kids to school, just as you don't need an Nvidia 5090 or a MacBook Pro Max to do your taxes or your schoolwork.

So the definition is fine as it is. It's hardware for people with very deep pockets, often called whales.


Languages like Swift do manage to make it much simpler. The culture guiding Rust design pretty clearly treats complexity as a goal.

I’m not a fan of comparing a cruel, genocidal government to a site that jacks up the prices on concert tickets, personally.

You can have a debit card in your own name when you're under 18, but not a credit card, meaning credit is a proxy for age but debit isn't. It's the same in the UK and the US.

(They also accept an ID scan.)


Passing a law that damages privacy is a negative. Complying with the law is an imperative. So mitigating harm from the law seems like a positive.

AGI (Agentic AI Infrastructure) is joining CSS (Compute Subsystems) in their lineup, apparently. Who’s naming this stuff?

The same people who abbreviate "generative" AI in a way that misleadingly conflates it with "general" AI.

Fraud is just the default lifestyle of marketers.


So Artificial General Intelligence and Cascading Style Sheets are not joining forces?

If there's ever a singularity as a result of AGI, it will likely look at CSS and decide that extermination is simply too good for the human race.

Always have been :)

Not really. The UK uses imperial units for most of the things you use units for in daily life (roads, cooking, drink sizes, body weight, utilities, land area...), even though they theoretically converted to metric. Canada is similar.

> The UK uses imperial units for most of the things you use units for in daily life (roads, cooking, drink sizes, body weight, utilities, land area...)

Not really. Old people might cook with funny old temperatures/measures and weigh themselves in stones, but it's fading out, contemporary cookbooks and gym culture are all metric. I've literally never seen a utility bill in anything other than metric (even if it's slightly weird metric like kWh or cubic metres of gas).


_Human_ body weight. I grew up measuring everything in kilos apart from people, who get what I guess amounts to their own wholly idiosyncratic unit, the stone, which no one I've since met outside of the UK has heard of.

I don't know why really, it's just 14lb; why do the US/Canada just stick with very large numbers of pounds instead of breaking it up as others do?

Kilograms seem more and more common for human weight too though, largely driven by fitness apps & communities I think. I doubt children in school today are accustomed to stone; only pounds and ounces for birth weight perhaps, but even that is metric medically and converted for the parents' familiarity these days I believe.


> _Human_ body weight.

Fraid not.

No medical professional in Blighty weighs people using imperial measurements. The only people who really use them are the elderly and (bizarrely) the type of crappy slimming magazine seen at supermarket checkouts... the kind satirised by Viz as titled "Less Cake, More Exercise".

