Most-upvoted comments of the last 48 hours. You can change the number of hours like this: bestcomments?h=24.

If this makes people develop stuff under the assumption that the user only has 8 GB of memory, I am happy for where we are going :-)

I had an interview question. What would you do if two different people were emailing a spreadsheet back and forth to track something?

I said I’d move them to Google Sheets. There was about five minutes of awkwardness after that, as I was interviewing for a software developer role. I was supposed to talk about what kind of tool I’d build.

I found it kind of eye opening but I’m still not sure what the right lesson to learn was.


List of differences from the MacBook Air:

* Only supports 8 GB of unified memory

* No MagSafe

* One of the two USB-C ports is limited to USB 2.0 speeds of just 480 Mb/s

* No Thunderbolt support means the Neo cannot drive either of Apple’s new Studio Displays. However, it can push a 4K display with 60Hz refresh rate over USB-C.

* “Just” 16 hours of battery life, compared to the 18 hours quoted for the 13-inch MacBook Air

* Display supports sRGB, but not P3 Wide Color

* No True Tone

* 1080p webcam doesn’t support Center Stage

* No camera notch

* Dual side-firing speakers, down from four speakers on the Air

* Does not support Spatial Audio with dynamic head tracking on AirPods

* Dual-mic system, down from a three-mic system on the Air

* The 3.5 mm headphone jack does not have support for high-impedance headphones

* No keyboard backlighting

* Touch ID not included on base model

* Trackpad does not support Force Touch

* Supports Wi-Fi 6E, not 7

* No fast charging

* The Apple on the lid isn’t shiny

https://512pixels.net/2026/03/the-differences-between-the-ma...


I was sitting in a room the other day with a young adult; we were searching for additional algorithm learning materials. They searched on Google and accepted the cookies. They clicked on a website and accepted those cookies too. They then started entering their email address to access another service. I was completely taken aback.

I'm the sort of person that either rejects the cookies, or will use another site entirely to avoid some weird dark-pattern cookie trickery. I don't like the idea of any particular service getting more information than they should.

Sitting there, I realized we were not the real target. It is the young people who are growing up conditioned to press accept, enter any details asked of them, and not value their personal data. Sadly, the damage is already done.


This is a major challenge to Microsoft. A 13-inch Surface Laptop costs $899 [1]; that's 50% more than an equivalent MacBook! And even at that higher price the Surface Laptop doesn't have a good screen: it uses 150% scaling (as opposed to the ideal 200%), which means you get subtle display artifacts.

Other than Microsoft, nobody even makes decent laptops in the Windows world. I am typing this on a Lenovo Yoga; it has a decent screen and keyboard, but the touchpad is horrible. Samsung makes good laptops, but my keyboard gave out after just 2 years. Most other laptop makers have horrible industrial design. The Dell XPS 17 was pretty good, but now it has a weird keyboard.

The best laptop is now significantly cheaper than the horrible ones. Incredible achievement by Apple, and a major challenge to Windows laptop makers.

[1] https://www.microsoft.com/en-us/store/configure/surface-lapt...


A couple years back John Reilly posted on HN "How I ruined my SEO" and I helped him fix it for free. He wrote about the whole thing here: https://johnnyreilly.com/how-we-fixed-my-seo

Happy to do the same for you if you want.

The quickest win in your case: map all the backlinks the .net site got (happy to pull this for you), then email every publication that linked to it. "Hey, you covered NanoClaw but linked to a fake site, here's the real one." You'd be surprised how many will actually swap the link. That alone could flip things.

Beyond that there's some technical SEO stuff on nanoclaw.dev that would help - structured data, schema, signals for search engines and LLMs. Happy to walk you through it.

update: ok this is getting more traction than I expected so let me give some practical stuff.

1. Google Search Console - did you add and verify nanoclaw.dev there? If not, do it now and submit your sitemap. Basic but critical.

2. I checked the fake site and it actually doesn't have that many backlinks, so the situation is more winnable than it looks.

3. Your GitHub repo has tons of high quality backlinks which is great. Outreach to those places, tell the story. I'm sure a few will add a link to your actual site. That alone makes you way more resilient to fakers going forward. This is only happening because everything is so new. Here's a list with all the backlinks pointing to your repo:

https://docs.google.com/spreadsheets/d/1bBrYsppQuVrktL1lPfNm...

4. Open social profiles for the project - Twitter/X, LinkedIn page if you want. This helps search engines build a knowledge graph around NanoClaw. Then add Organization and sameAs schema markup to nanoclaw.dev connecting all the dots (your site, the GitHub repo, the social profiles). This is how you tell Google "these all belong to the same entity."

5. One more thing - you had a chance to link to nanoclaw.dev from this HN thread, but you linked to your tweet instead. Totally get it, but a strong link from a front page HN post with all this traffic and engagement would do real work for your site's authority. If it's not crossing any rule (specific use case here so maybe check with the mods haha) drop a comment here with a link to nanoclaw.dev. I don't think anyone here would mind if it gets you a few steps closer to winning against that fake site.
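As a sketch of point 4 above, the Organization/sameAs markup could look like the JSON-LD below (generated here with Python for readability). The repo and profile URLs are placeholders, not NanoClaw's real accounts:

```python
import json

# Hypothetical JSON-LD tying the site, the GitHub repo, and social
# profiles to a single Organization entity. The sameAs URLs below are
# illustrative placeholders, not verified accounts.
org_schema = {
    "@context": "https://schema.org",
    "@type": "Organization",
    "name": "NanoClaw",
    "url": "https://nanoclaw.dev",
    "sameAs": [
        "https://github.com/example/nanoclaw",  # placeholder repo URL
        "https://x.com/example_nanoclaw",       # placeholder profile
    ],
}

# This would go in the page head inside:
#   <script type="application/ld+json"> ... </script>
print(json.dumps(org_schema, indent=2))
```

The point of `sameAs` is exactly what the comment describes: telling search engines that the site, repo, and profiles all belong to the same entity.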


"it has no annoying fans"

I beg to differ ;)


I would suggest adding the /r/ProgrammerHumor version too: https://www.reddit.com/r/ProgrammerHumor/comments/1p204nx/ac...

The AI crank always cracks me up.


This is kind of a misleading title. While they "ended" the 30-percent cut, they are keeping a 20-percent cut.

In an alternative timeline, Firefox makes their context menu really short and someone writes a blog post ranting about how it deprives functionality from power users.

In fact, I've read several such rants about Firefox removing functionality from other parts of their UI.

It's sure hard to make everyone happy.


This was pointed out humorously by Douglas Adams:

> "..am I alone in finding the expression 'it turns out' to be incredibly useful? It allows you to make swift, succinct, and authoritative connections between otherwise randomly unconnected statements without the trouble of explaining what your source or authority actually is. It's great. It's hugely better than its predecessors 'I read somewhere that...' or the craven 'they say that...' because it suggests not only that whatever flimsy bit of urban mythology you are passing on is actually based on brand new, ground breaking research, but that it's research in which you yourself were intimately involved. But again, with no actual authority anywhere in sight."


I love the following section of their copy:

> Even More Value for Upgraders

> The new 14- and 16-inch MacBook Pro with M5 Pro and M5 Max mark a major leap for pro users. There’s never been a better time for customers to upgrade from a previous generation of MacBook Pro with Apple silicon or an Intel-based Mac.

I read as "Whoops we made the M1 Macbook Pro too good, please upgrade!"

I think I will get another 2-5 years out of mine.

Apple: If you document the hardware enough for the Asahi team to deliver a polished Linux experience, I'll buy one this year!


Having been both the interviewer and the candidate in this kind of situation, this is really a big interviewer training failure.

The general way to handle this as an interviewer is really simple: acknowledge that the interviewee gave a good answer, but ask that for the purposes of evaluating their technical design skills that you'd like for them to design a new system/code a new implementation to solve this problem.

If the candidate isn't willing to suspend disbelief for the exercise, then you can consider that alongside all of the other signals your interview team gets about the candidate. I generally take it as a negative signal, not because I need conformance, but because I need someone who can work through honest technical disagreements.

As a candidate, what's worked for me before was to ask the interviewer if they'd prefer that I pretend ____ doesn't exist and come up with a new design, but it makes me question whether I want to join that team. IMO it's the systems design equivalent of the interviewer arguing with you about your valid algorithm because it's not the one the interviewer expects.


Whenever I hear German companies mention digitalisation, I get reminded that they still use pen and paper in production environments to log data, pass those sheets to secretaries who enter the data into legacy systems so data analysts can enter it into another system that then has an integration with SAP. Data from SAP then flows onwards to some buzzword-filled Azure product that costs a few million a month, from which someone downloads an .xls file and uploads it to Tableau, where they run some simple calculations. Someone else downloads it as an .xls and manually writes (not copy-pastes) the numbers into a PowerPoint presentation and makes graphs by drawing shapes. This is then presented at some bi-monthly meeting.

I wish I was making this stuff up.


> macOS 15 uses ~5GB on startup without any app open

Sort of? macOS very aggressively caches things in RAM; it should be using all of your RAM on startup. That's why they changed Activity Monitor to say "memory pressure" instead of something like "memory usage."

I'm typing this on an 8 GB MacBook Air and it works just fine. I've got ChatGPT, VSCode, XCode, Blender, and PrusaSlicer minimized and I'm not feeling any lag. If I open any of them it'll take half a second or so as they're loaded from swap, but when they're not in the foreground they're not using up any memory.


We're going to do it again, aren't we? We're going to take something simple and sensible ("write tests first", "small composable modules", etc.), give it a fancy complicated name ("Behavior-Constrained Implementation Lifecycle pattern", "Boundary-Scoped Processing Constructs pattern", etc.), and create an entire industry of consultants and experts selling books and enterprise coaching around it, each swearing they have the secret sauce and the right incantations.

The damn thing _talks_. You can just _speak_ to it. You can just ask it to do what you want.


You forgot an important difference: the MacBook Neo has the A18 Pro chip (2 performance cores + 4 efficiency cores), whereas the MacBook Air has the M5 chip (4 performance cores + 6 efficiency cores).

Also the A18 Pro chip has a 5-core GPU whereas the M5 chip has 8 or 10.

Personally, the only dealbreaker in the list you posted is the amount of RAM. macOS 15 uses ~5GB on startup without any app open. I'd be swapping all the time on 8GB of RAM.


ICE has been detaining Chinese people in my area (and going door to door in at least one neighborhood where a lot of Chinese and Indians live). I was hearing about this just last week as word spread amongst the Chinese community here (Ohio) to make sure you have some legal documentation beyond just your driver's license on you at all times for protection. People will hear about this through the grapevine, and it has a massive (and rightly so) chilling effect. US labs can try, but with the US government behaving like it is, I don't think they will have much luck.

*edit: not that it matters, but since MAGA can't help but assume, these are all US citizens and green card holders that I am referring to.


I cannot be alone in feeling that titles (within "tech" in particular) are almost completely arbitrary? What constitutes a "senior", "lead", "principal" and "staff" X, respectively, has so much overlap that it really depends on the organisation. I myself have been called all of those things, but have honestly not been able to tell the difference: in some cases, I have had much more responsibility as a "senior backend developer" than a "staff engineer". I have recently interviewed for a number of roles with titles like CTO, engineering manager, tech lead etc and there is so much overlap that they seem to be one and the same. Have worked at companies on three continents, in organisations ranging from 6 people to 10k+, so have seen a few titles.

When @sama announced within hours that OAI was replacing Anthropic under the "same conditions," it was clear that either the DoW or OAI (or both) were fudging. The DoW balked at Anthropic's conditions, so OAI's agreement must have made the "conditions" basically unenforceable.

And sure enough, my reading of it left the impression the OAI conditions were basically "DoW won't do anything which violates the rules DoW sets for itself."


These sorts of core-density increases are how I win cloud debates in an org.

* Identify the workloads that haven't scaled in a year. Your ERPs, your HRIS, your dev/stage/test environments, DBs, Microsoft estate, core infrastructure, etc. (EDIT, from zbentley: also identify any cross-system processing where data will transfer from the cloud back to your private estate to be excluded, so you don't get murdered with egress charges)

* Run the cost analysis of reserved instances in AWS/Azure/GCP for those workloads over three years

* Do the same for one of these high-core "pizza boxes", but amortized over seven years

* Realize the savings to be had moving "fixed infra" back on-premises or into a colo versus sticking with a public cloud provider

Seriously, what took a full rack or two of 2U dual-socket servers just a decade ago can be replaced with three 2U boxes with full HA/clustering. It's insane.
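The cost comparison in the steps above can be sketched as back-of-the-envelope arithmetic. All dollar figures below are made-up placeholders for illustration, not quotes from any provider:

```python
# Hypothetical comparison: 3-year reserved cloud instances vs. one
# high-core 2U "pizza box" amortized over 7 years. All prices are
# illustrative placeholders.

def cloud_cost(monthly_reserved_rate: float, years: int = 3) -> float:
    """Total spend on one reserved instance over the reservation term."""
    return monthly_reserved_rate * 12 * years

def onprem_annualized(box_price: float, yearly_opex: float,
                      amortization_years: int = 7) -> float:
    """Yearly cost of a server amortized over its service life,
    including power, colo space, and support."""
    return box_price / amortization_years + yearly_opex

# Say a non-scaling workload needs 10 reserved instances at $250/mo
# each, vs. one dense box at $45k up front plus $6k/yr in opex.
cloud_per_year = cloud_cost(250) / 3 * 10           # $30,000/yr
onprem_per_year = onprem_annualized(45_000, 6_000)  # ~$12,429/yr

print(f"cloud:   ${cloud_per_year:,.0f}/yr")
print(f"on-prem: ${onprem_per_year:,.0f}/yr")
```

With these toy numbers the on-prem box wins by more than 2x per year; the real exercise is substituting your own reserved-instance quotes, hardware pricing, and (per the edit above) egress charges.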

Back in the late '10s, I made a case to my org at the time that a global hypervisor hardware refresh and accompanying VMware licenses would have an ROI of 2.5yrs versus comparable AWS infrastructure, even assuming a 50% YoY rate of license inflation (this was pre-Broadcom; nowadays, I'd be eyeballing Nutanix, Virtuozzo, Apache Cloudstack, or yes, even Proxmox, assuming we weren't already a Microsoft shop w/ Hyper-V) - and give us an additional 20% headroom to boot. The only thing giving me pause on that argument today is the current RAM/NAND shortage, but even that's (hopefully) temporary - and doesn't hurt the orgs who built around a longer timeline with the option for an additional support runway (like the three-year extended support contracts available through VARs).

If we can't bill a customer for it, and it's not scaling regularly, then it shouldn't be in the public cloud. That's my take, anyway. It sucks the wind from the sails of folks gung-ho on the "fringe benefits" of public cloud spend (box seats, junkets, conference tickets, etc...), but the finance teams tend to love such clear numbers.


This is the best laptop for the general consumer around $1k.

  - it has no annoying fans, it is completely silent
  - a high res display with no PWM flickering and reasonable response times, no burn-in issues, enough brightness for outdoor use
  - best-in-class hardware, very very efficient, amazing single thread performance, good multi thread, very good GPU
  - no Microsoft Windows annoyances, ads, bloatware, broken stuff all the time
  - much better real world performance on battery than x64 processors (!). you can get reasonable perf by setting Intel/AMD CPUs to high perf, but then goodbye battery life and get ready for very loud fans. this is simply a point not emphasized enough, the real world battery perf of Intel/AMD laptops is very sluggish on default power modes and despite that, they consume more battery than the M5
  - amazing battery life
  - good workmanship, no creaking, good hardware overall (mics, webcam, keyboard, touchpad!)
  - very good speakers
There is simply nothing comparable in the Windows laptop world. You can maybe get a cheaper Windows laptop but it will be terrible in almost everything - the new Apple budget MacBooks will probably be a much better choice. And around $1000, there is no comparison. I wish it was different.

The single biggest issue for me with ChatGPT right now is how absolutely awful it sounds in every answer. "Why it matters", "the big picture", "it's not just you", the awful emphasis, the quotations with rhetorical questions, etc. I don't know if it's intentional so you can easily spot ChatGPT-generated content on the web? The very first GPT-5 version was good, but they ruined it immediately afterwards by "making the personality warmer" and repeating the mistakes of 4o. I see now that they even ruined Japanese, even though it was one of the best languages supported by ChatGPT (under "Limitations" at the end). I don't use it anymore; immensely disappointed.

Dianna got better sometime last year as well, just in time to fly home to Hawaii for her father's funeral (yeah ...), but she got a lot worse again later. I really hope things will keep going well for Dianna now.

Props to her husband, who's been incredible at taking care of her.


I made Flash Games back in the day. Here's my old profile on Newgrounds: https://cableshaft.newgrounds.com/

One thing Flash had that nothing else has really seemed to replicate as well since, is an environment that both coders and artists could use. I'd collaborate with an artist, they'd make their animations within an FLA, send it to me, and then I'd copy+paste into the project file, and it'd just work. I could even tweak their animations if need be to remove a frame here or there to tighten the animations and make it feel more fluid, etc.

That being said, I'm not sure I could go back to it now. I've been working with Love2D lately, and I prefer that (especially for the version control). FLA version control was always me going 'GameName-1.fla', 'GameName-2.fla', or when I got a little smarter 'GameName-Date.fla'. Eventually they let you split out the actionscript files into its own files, and that was better for version control, but you still had the binary mess of the FLA file.

But all these sprite-based game editors just can't handle the crazy intricate animations that vector-based Flash games could handle. Porting one of my old games (Clock Legends) that had hundreds of frames of hand drawn animation for a boss that filled the screen would be ridiculously huge nowadays, but the FLA for that was like 23MB, I believe (I'll need to hunt it down, I have it somewhere), and several MB of that were for the songs in the game.

Excited for this project though. It deserves to come back in some form.


> Why do all of the above have ...? No clue.

The "..." convention is used when menu options open a dialog box rather than just immediately doing the action.


I am in my mid forties, been working as a professional software developer for over 20 years.

I click “accept the cookies” almost every time. I just personally don’t feel it’s worth the effort and cost to try to avoid it.

What “dark pattern cookie trick” are you worried about? I just can’t come up with a scenario where it will actually harm me in any way. All the examples I have heard are either completely implausible, don’t actually seem that bad to me, or are things that are trivially easy to do even without any cookies.

Now, I am not going around giving my real email out to random sites, though even that doesn’t strike me as particularly dangerous. I already get infinite spam, and I am sure there are millions of other ways to get my email address… it is supposed to be something you give out, after all.

I just don’t think it is something that is worth stressing out about and fighting against. Maybe I am actually naive, but I just have not yet been convinced I should actually care.


"Education customers can purchase it for $499."

That is insane pricing for a brand new Apple product. They will sell so many of these!


Brilliant. They're repackaging the argument governments have long made about E2EE being dangerous to children.

$599, 8 GB RAM, 256 GB, *No* Touch ID

$699, 8 GB RAM, 512 GB, Touch ID

Honestly pretty fantastic product and price.

This is clearly targeted towards education, but I think I will happily replace my MacBook Air M1 with this :)

