Author here, happy to answer any questions.


Do you have any internal guidelines for ensuring these sorts of public announcements have a neutral, compassionate tone? To me, it feels like it would be easy to read this article as being more empathetic to the tools than to the users, and to interpret it as blaming the users for using the tools wrong: authors for adding a new distribution to a release post hoc, instead of creating a new release; or consumers for not carefully vetting packages to ensure there's a matching distribution for their platform and for not using source distributions.

I feel like, when the author is a representative of the tool, it's easy for readers to default to reading any criticism of user behaviour (even the slightest), with no accompanying discussion of tool behaviour, as the author being defensive of the tools.

I know the Python packaging tools community values its users and would never dream of suggesting that they're using the tools the wrong way. How important do you feel it is to recognise/appreciate those users in user-facing messaging?


Why isn't the obvious answer to do this change? https://github.com/pypa/packaging.python.org/issues/564#issu...


Because right now you can only upload individual distributions to PyPI. A release is implicitly created when you upload the first one. If PyPI implemented that change, you could only have one distribution per release.

The proper fix would be to make publishing a release a separate operation. But that breaks all existing tooling and workflows.
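
To make the current model concrete, here's a minimal sketch of the implicit-release behaviour; the class and function names are illustrative, not PyPI's actual internals:

    # Sketch of PyPI's implicit-release behaviour; names are
    # illustrative, not PyPI's actual internals.
    class Release:
        def __init__(self, version):
            self.version = version
            self.files = []  # sdists and wheels uploaded so far

    releases = {}  # version -> Release

    def upload_file(version, filename):
        # There is no explicit "create release" call: the release springs
        # into existence (publicly visible) with the first file upload.
        release = releases.setdefault(version, Release(version))
        release.files.append(filename)

    upload_file("1.2.0", "pkg-1.2.0.tar.gz")            # release goes live here
    upload_file("1.2.0", "pkg-1.2.0-py3-none-any.whl")  # added to the live release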


Make it a configuration option for the package on PyPI; then tooling can migrate slowly, and at some point it can become the default for new packages. If someone then uses old tooling, the downside would be that they might need to do a manual publishing step to actually publish their package, but at least you don't have the problem of publishing before you are ready.
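
Sketched out, it could look something like this (hypothetical, extending the sketch above; none of these names exist in PyPI):

    # Hypothetical opt-in staging; names are made up to illustrate the idea.
    class Release:
        def __init__(self, version):
            self.version = version
            self.files = []
            self.published = False  # hidden from installers until published

    class Project:
        def __init__(self, staged_releases=False):
            self.staged_releases = staged_releases  # the per-package option
            self.releases = {}

        def upload_file(self, version, filename):
            release = self.releases.setdefault(version, Release(version))
            release.files.append(filename)
            if not self.staged_releases:  # legacy behaviour: live immediately
                release.published = True

        def publish(self, version):
            # The new explicit step; users of old tooling would trigger it
            # manually, e.g. through the web UI.
            self.releases[version].published = True

    p = Project(staged_releases=True)
    p.upload_file("1.2.0", "pkg-1.2.0.tar.gz")
    p.upload_file("1.2.0", "pkg-1.2.0-py3-none-any.whl")
    p.publish("1.2.0")  # only now do installers see 1.2.0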


What is the key advantage of PyDist over Artifactory/Bintray? That's the system most companies I know of use, and I was wondering what makes PyDist compelling.


PyDist is specifically for Python packages, so it's 1) a better fit in this niche (for example, it mirrors PyPI, so you can install public and private packages through a single index and you won't be broken by packages deleted from PyPI), and 2) much cheaper if you don't need everything else Artifactory offers.
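
Concretely, the single-index setup is a one-line pip config; the URL below is a placeholder, not a real account:

    # ~/.config/pip/pip.conf (or ~/.pip/pip.conf); URL is a placeholder
    [global]
    index-url = https://example-account.pydist.com/simple/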


I believe Artifactory proxies and caches PyPI as well via remote repos.


Ah, looks like you can set that up. But it's not as convenient, because 1) you have to provision the remote repository and 2) instead of transparently installing from PyPI, you have to specify the remote repository each time you install from it.


It is exactly as convenient (if not more). You set up a set of three repositories: a local one for your modules, a remote one to proxy PyPI, and a virtual one, which unifies them under a single URL (which solves the #2 you mentioned).

So, both points are incorrect.
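
Once the virtual repository exists, pip only ever sees one URL, e.g. (hostname and repository key are made up for illustration):

    # pip.conf; hostname and repository key are made up
    [global]
    index-url = https://artifactory.example.com/artifactory/api/pypi/pypi-virtual/simple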


Honest question, is pydist just a hosted version of devpi (https://github.com/devpi/devpi)? What features distinguish pydist from devpi?


Node with npm must have the same problem. How do they handle it, and why isn't Python doing the same? If it should, of course.


NPM doesn't have the same problem, because they only allow a single file per release (which also means there is no way to distribute platform-specific builds).


There is a way. It seems node-gyp and node-pre-gyp are used for that. Here is a project that has precompiled binaries: http://sharp.pixelplumbing.com/en/stable/install/


It looks like they don't include the binary in the package at all, and instead the package has custom install code that downloads it during install. Which works, but it completely bypasses the package manager and would certainly take me by surprise as a user.
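
For reference, the node-pre-gyp pattern looks roughly like this in package.json (adapted from node-pre-gyp's docs; the module names and bucket URL are placeholders):

    {
      "scripts": {
        "install": "node-pre-gyp install --fallback-to-build"
      },
      "binary": {
        "module_name": "my_native_module",
        "module_path": "./lib/binding/",
        "host": "https://example-bucket.s3.amazonaws.com"
      }
    }

The "install" script runs at npm install time, fetches a prebuilt binary from "host" if one matches the platform, and otherwise compiles from source.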


That's my point - it works. What's wrong with this approach? I haven't investigated either the Python or the Node way (while I use both), and as a user I don't care (maybe I should, I don't know).


Expanding on jep42's answer, there are a few problems:

- The package manager can't tell whether the package supports your system or not, so if this is e.g. an optional dependency, your install will probably break instead of the package being skipped (see the sketch after this list).

- This results in an unexpected and hard-to-foresee dependency on some other non-npm server, which could disappear, be blocked by a firewall, etc.

- Standard tools will not understand what this package is doing and their assumptions may be broken, e.g. when trying to make builds repeatable.
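
On the first point: wheels encode compatibility in the filename, so a Python installer can check support statically, without downloading or running anything. A minimal sketch using the packaging library (pip's own dependency; the wheel filename is made up):

    # Check wheel compatibility statically; the filename is made up.
    from packaging.tags import sys_tags
    from packaging.utils import parse_wheel_filename

    name, version, build, tags = parse_wheel_filename(
        "example_pkg-1.0-cp37-cp37m-manylinux1_x86_64.whl")
    supported = any(tag in set(sys_tags()) for tag in tags)
    print(name, version, "supported here:", supported)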


Looked into this a bit deeper. There are multiple Node packages that handle prebuilt binaries; e.g. sharp uses https://github.com/prebuild/prebuild-install. So there are multiple solutions.

Now, answering your points:

- From a practical point of view, it is not very different in Python. I understand that in theory it should be different, but in practice it is not. E.g., when I worked on Windows, I ran into situations where a package was supported but there was no binary package. In such situations pip falls back to compiling from source, and for some packages you need to perform an "ancient Zulu dance" to get them to compile on Windows. Sometimes I was in the mood for that, sometimes I was not. So from the user's perspective, even if the package manager can tell that a package is supported, that does not help much. In practice, both Python and Node packages compile successfully quite often, and in the end a package is almost always supported.

Node supports optional dependencies, and a failure to install a package (whether prebuilt or built from source) will simply skip it rather than fail the whole install.

- In practice, Node packages usually use GitHub and sometimes Amazon S3 for prebuilts (at least based on a quick analysis). That is still at least two systems, but GitHub and S3 both seem to be good enough.

- I don't see a problem with repeatable builds in Node's case. Could you elaborate?

I see one problem with Node's approach, however. If you want to host your own prebuilts (e.g. on your own server), you end up with one big problem: you have to override the prebuilt location for each package separately.

Overall, I think Node did a good thing by not trying to do everything and letting the community figure out a prebuilt solution. The problem is that there are multiple solutions. Python's approach will most probably work out better in the long run. In the end, both will probably be about equally good and will look quite similar from the user's perspective.


Yep. It can cause hard-to-trace errors and makes reproducible builds more difficult.



