As a first step, what about trying pip's `-f` option, combined with dumping your wheels in a dumb folder served by apache/nginx:
pip --help
[...]
Install Options:
[...]
  -f, --find-links <url>  If a url or path to an html file, then parse for
                          links to archives. If a local path or file:// url
                          that's a directory, then look for archives in the
                          directory listing.
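To make that concrete, here's a minimal sketch of the "dumb folder" approach. The directory name `wheelhouse` and the package names are just placeholders; the only thing pip needs is a directory (or any static URL) listing the wheel files:

```shell
# a plain directory of wheels is the whole "index"
mkdir -p wheelhouse

# build wheels for your dependencies into it, e.g.:
#   pip wheel --wheel-dir wheelhouse -r requirements.txt

# for a quick test, the stdlib server is enough; in production,
# apache/nginx serving the same directory behaves identically:
#   python -m http.server 8080 --directory wheelhouse

# install straight from it (a local path works too, no server needed):
#   pip install --find-links ./wheelhouse --no-index pkg_a==1.2.3
```

`--no-index` is optional but makes the build fully self-contained: pip resolves everything from your folder and never touches pypi.org.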
EDIT: in the context of deploying some app at work, what does a full-blown hosted cheeseshop actually buy you? Users of these solutions, what value does it add over a simple `pip install -f INTERNAL_PKG_URL pkg_a==1.2.3 pkg_b==0.1.2`? Which features do you frequently use?
We migrated off of localshop and onto devpi. devpi is a much better product and is much more actively maintained; localshop was nothing but headaches and was constantly breaking.
Author here: I created it to solve an issue I was running into a couple of years ago. I've only recently started using it again myself. I think the development version (not on pypi) is in much better shape with things like multiple repositories and better user management (teams).
Maybe you use more esoteric features. The only thing I've done in the last 18 months is patch a bug that prevented uploads from Python 2.7.4-2.7.10. We just run it under circus with chaussette and front it with an ELB.
Do I understand this correctly? It only mirrors the packages that are requested from it? So I won't need to download 100GB+ of packages that I am not interested in?
Correct. My team has two main use cases: private packages and guaranteed access to the packages we've built against. It's extremely frustrating to come into a codebase after several months or years to find it using a library that no longer seems to exist on the public internet.
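For anyone wanting to try that setup, a minimal sketch (the `devpi.internal` host below is a placeholder): devpi's built-in `root/pypi` index is a lazy-caching mirror, so it only fetches a package from PyPI the first time a client asks for it, then serves the cached copy forever after:

```shell
# assumes a devpi-server instance is already running on the internal host;
# this just points pip at its lazy PyPI mirror via a config file
cat > pip.conf <<'EOF'
[global]
index-url = http://devpi.internal:3141/root/pypi/+simple/
EOF
```

Drop that in `~/.pip/pip.conf` (or `/etc/pip.conf` machine-wide) and every plain `pip install` goes through the cache, which is exactly what protects you from packages vanishing upstream.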