
+1 rsync is pretty darn good at any scale -- I'm not sure why the simplest possible solution doesn't beat out Docker as a suggestion in this thread.

I've been bundling libs and software into a single virtualenv-like package that I distribute with rsync for a long time -- it solves loads of problems, makes it easy to bootstrap a new system, and incremental updates are super fast. Combine that with rsync distribution of your source and a good tool for automating all of it (ansible, salt, chef, puppet, et al) and you have a pretty fool-proof deployment system.

And a rollback is just a git revert and another push away -- no need to keep build artifacts lying around if you believe your build is deterministic.
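A hedged sketch of that rollback, using a throwaway repo; in practice the push would trigger the rsync deploy rather than stopping at a local commit:

```shell
# Throwaway repo to illustrate revert-based rollback.
set -eu
REPO="$(mktemp -d)"; cd "$REPO"
git init -q
git config user.email dev@example.com
git config user.name dev

echo "v1" > app.conf; git add app.conf; git commit -qm "good release"
echo "v2" > app.conf; git commit -qam "bad release"

# Rollback: revert the bad commit, then push and redeploy as usual.
git revert --no-edit HEAD
cat app.conf    # prints "v1" again
```

Since the build is (assumed) deterministic, rebuilding from the reverted commit reproduces the old artifact, so nothing needs to be archived.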



Rsync is good for simple things. But it will fail with more complicated apps:

- how do you know which version you're running right now?

- how do you deploy to two environments where different deps are needed?

- how do you tell when your included dependencies need security patches?


rsync isn't the complete system -- you're going to need git (or another VCS) and some other tools, of course.

#1 is git -- dump and log the git HEAD on a deploy.

#2 don't do that -- keep a single consistent environment.

#3 use the system openssl; monitor other software components for security updates -- you need to do this anyway in any of these systems.
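A sketch of #1: stamp the deployed tree with the commit hash so any host can answer "which version am I running?" (throwaway repo and file names are illustrative):

```shell
# Throwaway repo; in a real deploy you'd run this from your checkout.
set -eu
REPO="$(mktemp -d)"; cd "$REPO"
git init -q
git config user.email dev@example.com
git config user.name dev
echo "app" > main.py; git add main.py; git commit -qm "initial"

# Ship RELEASE alongside the code via rsync; keep a local audit trail.
git rev-parse HEAD > RELEASE
git log -1 --format='%h %ci %s' >> deploy.log
```

On any host, `cat /opt/myapp/RELEASE` (hypothetical path) then tells you exactly what's live.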


> #2 don't do that

I wish everyone had easy deployments where environments, OS versions, and everything else are always consistent. :)

> #3 monitor other software components for security updates -- you need to do this anyway in any of these systems.

Sure. But having multiple virtualenvs means you need to monitor all of them on all deployed hosts. Having everything packaged separately means you can do audits much more easily and without location-specific checks.



