Hacker Times

> Running NPM Update ranges from doing not much at all, to breaking my entire system making deleting all of node_modules the only realistic fix.

Why not just pin your dependencies to specific versions, i.e. "somepackage": "4.2.9"?
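For reference, exact pinning just drops the range operator from the version string. A minimal package.json sketch (both package names are hypothetical):

```json
{
  "dependencies": {
    "pinned-exactly": "4.2.9",
    "caret-range": "^4.2.9"
  }
}
```

The first entry only ever resolves to 4.2.9; the second accepts any 4.x.y at or above 4.2.9. You can also get exact pins written for you with `npm install somepackage@4.2.9 --save-exact`.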



Having to delete node_modules to "fix" it is indicative of a poor package manager. Although not entirely the same, it reminds me of cabal hell in Haskell many years ago.


I've had to do that with every package manager though.


Having worked with pip and Gradle, I've never encountered that issue.


I'm borderline infuriated that this isn't the default way that npm works. Maybe if people didn't break things willy-nilly and followed semver, it'd be okay, but I waste too much time bisecting why some package that used to work is blowing up my build or failing in less obvious ways.

But I'm a crusty almost-30 .NET developer used to NuGet.


npm does maintain a lockfile for you by default since npm 5, if that's what you mean.

There really is no reason to be encountering surprise sub-dependency changes with npm.


Happens all the time when package.json defaults to foo: "^1.2.3", and some bozo does breaking changes in 1.2.5.
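To make concrete why a caret default exposes you to that bozo's patch release: a caret range permits any update that doesn't change the leftmost non-zero version component. Here's a simplified sketch of that rule (not npm's actual resolver, and it ignores the special handling of 0.x versions and prerelease tags):

```javascript
// Simplified illustration of semver caret-range matching:
// "^1.2.3" accepts any version >= 1.2.3 with the same major version.
function caretSatisfies(range, version) {
  const base = range.slice(1).split('.').map(Number); // "^1.2.3" -> [1, 2, 3]
  const v = version.split('.').map(Number);
  if (v[0] !== base[0]) return false; // major version must match
  // otherwise the version must be >= the base version
  for (let i = 0; i < 3; i++) {
    if (v[i] > base[i]) return true;
    if (v[i] < base[i]) return false;
  }
  return true; // exactly equal
}

console.log(caretSatisfies('^1.2.3', '1.2.5')); // true: the breaking patch release is in range
console.log(caretSatisfies('^1.2.3', '2.0.0')); // false: a new major is excluded
```

So with the `"^1.2.3"` default, nothing in the range syntax itself protects you from a mis-versioned 1.2.5; only the lockfile does.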


If you have a package lock, even caret in package.json won't automatically install it.


> Happens all the time when package.json defaults to foo: "^1.2.3", and some bozo does breaking changes in 1.2.5.

No it doesn't. Since npm 5, npm is lockfile-by-default; you don't get updates unless you ask for them. Whether one particular package correctly respects semver is irrelevant.
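A rough sketch of the commands involved, as I understand the post-npm-5 behavior:

```shell
npm install    # respects package-lock.json; only resolves fresh for entries not in the lock
npm ci         # installs strictly from package-lock.json; errors if it disagrees with package.json
npm update     # the explicit opt-in: moves versions within their ranges and rewrites the lock
```

In other words, versions only move when you run something that is allowed to rewrite the lockfile.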


Because that's not how it works. Subdependencies are still liable to shift slightly. Of course there are the shrink-wrap-ish package-lock and npm ci, but they do some weird things I don't want to elaborate on here.

Npm is more fragile than it should be.



